A stable systemic risk ranking in China's banking sector: Based on principal component analysis
NASA Astrophysics Data System (ADS)
Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing
2018-02-01
In this paper, we compare five popular systemic risk rankings and apply a principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that the five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to the factor loadings of the first component, the PCA combined ranking is based mainly on fundamentals rather than market price data. We find that price-based rankings are less practical than fundamentals-based ones. The PCA combined ranking directly shows the systemic risk contribution of each bank for banking supervision purposes and helps banks prepare for and cope with financial crises in advance.
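As a rough sketch of the combining step described above, the first principal component of the standardized measures can serve as weights for a single ranking score. The bank count, measure count, and data below are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical data: five systemic risk scores (columns) for six banks (rows).
rng = np.random.default_rng(0)
scores = rng.normal(size=(6, 5)) + np.linspace(0, 2, 6)[:, None]  # banks differ in level

# Standardize each measure, then take the first principal component
# of the correlation matrix as the combined ranking weights.
Z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)   # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                      # loadings of the first component
if pc1.sum() < 0:                         # fix the sign so higher = riskier
    pc1 = -pc1
combined = Z @ pc1                        # one combined score per bank
ranking = np.argsort(-combined)           # bank indices, riskiest first
print(ranking)
```

Inspecting `pc1` here plays the role of the paper's factor-loading check: large loadings reveal which underlying measures dominate the combined ranking.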
77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
A Risk-Analysis Approach to Implementing Web-Based Assessment
ERIC Educational Resources Information Center
Ricketts, Chris; Zakrzewski, Stan
2005-01-01
Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…
System Theoretic Frameworks for Mitigating Risk Complexity in the Nuclear Fuel Cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Adam David; Mohagheghi, Amir H.; Cohn, Brian
In response to the expansion of nuclear fuel cycle (NFC) activities -- and the associated suite of risks -- around the world, this project evaluated systems-based solutions for managing such risk complexity in multimodal and multi-jurisdictional international spent nuclear fuel (SNF) transportation. By better understanding systemic risks in SNF transportation, developing SNF transportation risk assessment frameworks, and evaluating these systems-based risk assessment frameworks, this research illustrated that interdependency between safety, security, and safeguards risks is inherent in NFC activities and can go unidentified when each "S" is evaluated independently. Two novel system-theoretic analysis techniques -- dynamic probabilistic risk assessment (DPRA) and system-theoretic process analysis (STPA) -- provide integrated "3S" analysis to address these interdependencies, and the research results suggest a need -- and provide a way -- to reprioritize United States engagement efforts to reduce global nuclear risks. Lastly, this research identifies areas where Sandia National Laboratories can spearhead technical advances to reduce global nuclear dangers.
[Reliability theory based on quality risk network analysis for Chinese medicine injection].
Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui
2014-08-01
A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. This transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes that are most critical to system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a use case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were then defined to reduce the risk and improve product quality.
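The FMEA-to-Bayesian-network transformation described above can be illustrated with a minimal enumeration over a tiny assumed network; the events, priors, and conditional probability table below are hypothetical and are not drawn from the Shengmai case study:

```python
# Two root causes (FMEA cause events) feeding one effect node (failure mode).
p_cause = {"seal_wear": 0.05, "temp_drift": 0.02}     # assumed root-node priors

# P(contamination | seal_wear, temp_drift): assumed conditional probability table
cpt = {(True, True): 0.90, (True, False): 0.40,
       (False, True): 0.25, (False, False): 0.01}

def p_contamination():
    """Marginalize the effect node over both root causes by full enumeration."""
    total = 0.0
    for sw in (True, False):
        for td in (True, False):
            p = (p_cause["seal_wear"] if sw else 1 - p_cause["seal_wear"]) \
              * (p_cause["temp_drift"] if td else 1 - p_cause["temp_drift"])
            total += p * cpt[(sw, td)]
    return total

print(round(p_contamination(), 5))   # -> 0.03456
```

Tools such as GeNIe and AgenaRisk perform this kind of inference at scale; node importance can then be probed by perturbing one prior and recomputing the top-event probability.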
Expert systems in civil engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostem, C.N.; Maher, M.L.
1986-01-01
This book presents the papers given at a symposium on expert systems in civil engineering. Topics considered at the symposium included problem solving using expert system techniques, construction schedule analysis, decision making and risk analysis, seismic risk analysis systems, an expert system for inactive hazardous waste site characterization, an expert system for site selection, knowledge engineering, and knowledge-based expert systems in seismic analysis.
SADA: Ecological Risk Based Decision Support System for Selective Remediation
Spatial Analysis and Decision Assistance (SADA) is freeware that implements terrestrial ecological risk assessment and yields a selective remediation design using its integral geographical information system, based on ecological and risk assessment inputs. Selective remediation ...
A Simplified Approach to Risk Assessment Based on System Dynamics: An Industrial Case Study.
Garbolino, Emmanuel; Chery, Jean-Pierre; Guarnieri, Franck
2016-01-01
Seveso plants are complex sociotechnical systems, which makes it appropriate to support any risk assessment with a model of the system. However, more often than not, this step is only partially addressed, simplified, or avoided in safety reports. At the same time, investigations have shown that the complexity of industrial systems is frequently a factor in accidents, due to interactions between their technical, human, and organizational dimensions. In order to handle both this complexity and changes in the system over time, this article proposes an original and simplified qualitative risk evaluation method based on the system dynamics theory developed by Forrester in the early 1960s. The methodology supports the development of a dynamic risk assessment framework dedicated to industrial activities. It consists of 10 complementary steps grouped into two main activities: system dynamics modeling of the sociotechnical system and risk analysis. This system dynamics risk analysis is applied to a case study of a chemical plant and provides a way to assess the technological and organizational components of safety. © 2016 Society for Risk Analysis.
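A minimal stock-and-flow sketch in the spirit of Forrester's system dynamics, with all coefficients assumed for illustration (not drawn from the Seveso case study):

```python
def simulate(steps=100, dt=0.1, degradation=0.8, maintenance_rate=0.5):
    """Euler-integrate a 'hazard level' stock with one inflow and one outflow."""
    hazard = 1.0                                   # initial stock (assumed)
    trajectory = [hazard]
    for _ in range(steps):
        inflow = degradation                       # constant degradation flow
        outflow = maintenance_rate * hazard        # maintenance drains in proportion
        hazard += dt * (inflow - outflow)
        trajectory.append(hazard)
    return trajectory

traj = simulate()
print(round(traj[-1], 3))   # settles near degradation / maintenance_rate = 1.6
```

Running such a model over time is what lets a dynamic risk assessment capture changes in the sociotechnical system rather than a single static snapshot.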
Risk Based Reliability Centered Maintenance of DOD Fire Protection Systems
1999-01-01
2.2.3 Failure Mode and Effect Analysis (FMEA) ... 2.2.4 Failure Mode Risk Characterization ... Step 2: system functions and functional failures definition; Step 3: failure mode and effect analysis (FMEA); Step 4: failure mode risk ... The Interface Location column identifies the location where the FMEA of the fire protection system began or stopped. For example, for the fire...
Study of a risk-based piping inspection guideline system.
Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung
2007-02-01
A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts: the building of a risk-based inspection model for piping and the construction of a risk-based piping inspection guideline model. Field visits at the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the Rational Unified Process was applied to reduce discrepancies in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections and hence to enable effective prediction of potential piping risks and to enhance the degree of safety that petrochemical plants can be expected to achieve in their operations. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risks resulted from a small number of pipelines.
Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance
NASA Technical Reports Server (NTRS)
Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.
2016-01-01
Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given modules chosen to represent optical elements in an optical system and the minimum detectable molecular contamination levels for a chosen inspection and analysis method, and the effect of contamination on the system is then determined. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades among different inspection and analysis methods and determine whether a planned method is adequate to meet system requirements and manage contamination risk.
NASA Astrophysics Data System (ADS)
Ahn, Junkeon; Noh, Yeelyong; Park, Sung Ho; Choi, Byung Il; Chang, Daejun
2017-10-01
This study proposes a fuzzy-based FMEA (failure mode and effect analysis) for a hybrid molten carbonate fuel cell and gas turbine system for liquefied hydrogen tankers. An FMEA-based regulatory framework is adopted to analyze the non-conventional propulsion system and to understand the risk picture of the system. Since the participants in an FMEA rely on their subjective and qualitative experience, the conventional FMEA used for identifying failures that affect system performance inevitably involves inherent uncertainties. A fuzzy-based FMEA is introduced to express such uncertainties appropriately and to provide flexible access to a risk picture for a new system using fuzzy modeling. The hybrid system comprises 35 components with 70 potential failure modes. Significant failure modes occur in the fuel cell stack and the rotary machinery. The fuzzy risk priority number is used to validate the crisp risk priority number in the FMEA.
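A fuzzy RPN of the kind described can be sketched by treating each severity/occurrence/detection rating as a triangular fuzzy number, multiplying vertex-wise, and defuzzifying by centroid; the ratings below are assumed for illustration, not the paper's:

```python
def fuzzy_mul(x, y):
    """Approximate product of triangular fuzzy numbers, vertex-wise."""
    return tuple(a * b for a, b in zip(x, y))

def centroid(t):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(t) / 3.0

# Assumed (low, mode, high) expert ratings for one failure mode.
severity   = (6, 7, 8)
occurrence = (3, 4, 5)
detection  = (2, 3, 4)

fuzzy_rpn = fuzzy_mul(fuzzy_mul(severity, occurrence), detection)
crisp_rpn = 7 * 4 * 3                       # classic RPN from the modes only
print(fuzzy_rpn, round(centroid(fuzzy_rpn), 2), crisp_rpn)
```

Comparing the defuzzified value against the crisp product mirrors the paper's use of the fuzzy RPN to validate the crisp one: the fuzzy number also carries the spread of the expert uncertainty, not just a point estimate.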
Volatility and correlation-based systemic risk measures in the US market
NASA Astrophysics Data System (ADS)
Civitarese, Jamil
2016-10-01
This paper deals with the problem of how to use simple systemic risk measures to assess portfolio risk characteristics. Using three simple examples taken from previous literature, one based on raw and partial correlations, another based on the eigenvalue decomposition of the covariance matrix, and the last based on an eigenvalue entropy, a Granger causality analysis revealed that some of them are not always a good measure of risk in the S&P 500 and in the VIX. The selected measures do not Granger-cause the VIX index in all windows selected; therefore, in the sense of risk as volatility, the indicators are not always suitable. Nevertheless, their results with respect to returns are similar to previous works that accept them. A deeper analysis showed that any symmetric measure based on the eigenvalue decomposition of correlation matrices, however, is not useful as a measure of "correlation" risk. The accompanying empirical analysis showed that negative correlations are usually small and therefore do not heavily distort the behavior of the indicator.
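An eigenvalue-entropy indicator of the kind discussed above can be sketched as follows, on simulated returns (not S&P 500 data):

```python
import numpy as np

# Simulated returns: 250 days, 10 independent assets (illustration only).
rng = np.random.default_rng(1)
returns = rng.normal(size=(250, 10))
corr = np.corrcoef(returns, rowvar=False)
lam = np.linalg.eigvalsh(corr)            # eigenvalues of the correlation matrix
p = lam / lam.sum()                       # normalize: eigenvalues sum to the trace
entropy = -float(np.sum(p * np.log(p)))   # eigenvalue entropy of the spectrum
print(round(entropy, 3))                  # near log(10) when correlations are weak
```

Because the spectrum of a correlation matrix is unchanged when correlation signs flip symmetrically, this indicator cannot distinguish the sign structure of dependence, which is the symmetry limitation the paper points out.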
Impact of systemic risk in the real estate sector on banking return.
Li, Shouwei; Pan, Qing; He, Jianmin
2016-01-01
In this paper, we measure systemic risk in the real estate sector based on contingent claims analysis and then investigate its impact on banking return. Using data from China, we find that systemic risk in the real estate sector has a negative but temporary effect on banking return, and that banking risk aversion and implicit interest expense have considerable impact on banking return.
Risk Interfaces to Support Integrated Systems Analysis and Development
NASA Technical Reports Server (NTRS)
Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria
2016-01-01
Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimizing mission resources such as mass, power, and crew time), and to support development of tools for autonomy, a need for exploration (assessing and maintaining resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, including defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
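The MCDM evaluation step can be sketched as a simple weighted-sum trade study; the criteria weights and scores below are invented for illustration and do not reflect the study's actual model or results:

```python
# Assumed criteria weights (must sum to 1) for the trade study.
weights = {"mission_performance": 0.5, "risk": 0.3, "cost": 0.2}

# Assumed scores on a 0-1 scale, higher = better (risk and cost pre-inverted).
alternatives = {
    "distributed": {"mission_performance": 0.8, "risk": 0.9, "cost": 0.5},
    "monolithic":  {"mission_performance": 0.7, "risk": 0.4, "cost": 0.8},
}

# Weighted-sum score per architecture, then pick the best alternative.
scores = {name: sum(weights[c] * v for c, v in crit.items())
          for name, crit in alternatives.items()}
best = max(scores, key=scores.get)
print(scores, best)
```

Real MCDM workflows often add sensitivity analysis on the weights; the ranking is only as defensible as the weight elicitation behind it.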
Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.
Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip
2018-02-01
Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
Train integrity detection risk analysis based on PRISM
NASA Astrophysics Data System (ADS)
Wen, Yuan
2018-04-01
GNSS-based Train Integrity Monitoring Systems (TIMS) are an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
A Risk-Based Approach for Aerothermal/TPS Analysis and Testing
2007-07-01
RTO-EN-AVT-142: A Risk-Based Approach for Aerothermal/TPS Analysis and Testing. Michael J. Wright and Jay H. Grinstead, NASA Ames. ... The purpose of the thermal protection system (TPS) is to protect the payload (crew, cargo, or science) from this entry heating environment. The performance of the TPS is determined by the efficiency and reliability of this system, typically measured...
NASA Astrophysics Data System (ADS)
Zeng, Yajun; Skibniewski, Miroslaw J.
2013-08-01
Enterprise resource planning (ERP) system implementations are often characterised by large capital outlay, long implementation duration, and high risk of failure. In order to avoid ERP implementation failure and realise the benefits of the system, sound risk management is key. This paper proposes a probabilistic risk assessment approach for ERP system implementation projects based on fault tree analysis, which models the relationship between ERP system components and specific risk factors. Unlike traditional risk management approaches, which have mostly focused on meeting project budget and schedule objectives, the proposed approach addresses the risks that may cause ERP system usage failure. The approach can be used to identify the root causes of ERP system usage failure and to quantify the impact of critical component failures or critical risk events in the implementation process.
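The fault-tree evaluation underlying such an approach can be sketched with the standard AND/OR gate probability formulas for independent basic events; the gates and probabilities below are hypothetical and are not the paper's model:

```python
def p_and(*ps):
    """AND gate: all independent basic events must occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """OR gate: one minus the probability that no input event occurs."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

# Assumed intermediate events feeding the top event "ERP usage failure".
p_data_migration = p_and(0.10, 0.30)   # bad legacy data AND no validation step
p_user_adoption  = p_and(0.20, 0.25)   # poor training AND no go-live support
p_top = p_or(p_data_migration, p_user_adoption)
print(round(p_top, 4))                 # -> 0.0785
```

Re-running the tree with one basic-event probability set to zero is a quick way to quantify that event's contribution to the top event, which is the "impact of critical risk events" idea in the abstract.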
On some recent definitions and analysis frameworks for risk, vulnerability, and resilience.
Aven, Terje
2011-04-01
Recently, considerable attention has been paid to a systems-based approach to risk, vulnerability, and resilience analysis. It is argued that risk, vulnerability, and resilience are inherently and fundamentally functions of the states of the system and its environment. Vulnerability is defined as the manifestation of the inherent states of the system that can be subjected to a natural hazard or be exploited to adversely affect that system, whereas resilience is defined as the ability of the system to withstand a major disruption within acceptable degradation parameters and to recover within an acceptable time and at acceptable composite costs and risks. Risk, on the other hand, is probability based, defined by the probability and severity of adverse effects (i.e., the consequences). In this article, we look more closely into this approach. It is observed that the key concepts are inconsistent in the sense that the uncertainty (probability) dimension is included in the risk definition but not in those of vulnerability and resilience. In the article, we question the rationale for this inconsistency. The suggested approach is compared with an alternative framework that provides a logically defined structure for risk, vulnerability, and resilience, in which all three concepts incorporate the uncertainty (probability) dimension. © 2010 Society for Risk Analysis.
Risk management of key issues of FPSO
NASA Astrophysics Data System (ADS)
Sun, Liping; Sun, Hai
2012-12-01
Risk analysis of key systems has become a growing topic of late because of the development of offshore structures. Equipment failures of the offloading system and fire accidents were analyzed based on the features of floating production, storage and offloading (FPSO) units. Fault tree analysis (FTA) and failure modes and effects analysis (FMEA) methods were examined based on information already implemented in modules of Relex Reliability Studio (RRS). Given the shortage of failure cases and statistical data, equipment failures were also analyzed qualitatively by establishing a fault tree and its Boolean structure function, and risk control measures were examined. Failure modes of fire accidents were classified according to the different areas of fire occurrence during the FMEA process, using risk priority number (RPN) methods to evaluate their severity rank. The qualitative FTA gave basic insight into how the failure modes of FPSO offloading form, and the fire FMEA gave priorities and suggested processes. The research has practical importance for the security analysis of FPSO units.
NASA Astrophysics Data System (ADS)
Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua
2017-05-01
With the purpose of reinforcing correlation analysis of risk assessment threat factors, a dynamic assessment method for safety risks based on particle filtering is proposed, which takes threat analysis as its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weight of the threat indicators, and determines information system risk levels by combining this with state estimation theory. In order to improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: by clustering all particles and operating on each cluster centroid as its representative, the amount of calculation is reduced. Empirical results indicate that the method can reasonably embody the relations of mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management control strategy.
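A bootstrap-style particle filter of the kind described can be sketched as follows; the latent risk-level model, noise parameters, and observations are all assumed for illustration, and the k-means centroid speed-up from the paper is omitted for brevity:

```python
import math
import random

random.seed(7)
N = 1000
# Prior: the latent "risk level" is anywhere in [0, 10].
particles = [random.uniform(0.0, 10.0) for _ in range(N)]

def likelihood(obs, x, sigma=1.0):
    """Gaussian measurement model for a noisy threat-indicator reading."""
    return math.exp(-((obs - x) ** 2) / (2.0 * sigma ** 2))

for obs in [4.1, 3.8, 4.3]:               # assumed indicator observations
    # Predict: propagate each particle through assumed process noise.
    particles = [x + random.gauss(0.0, 0.2) for x in particles]
    # Update: weight particles by the observation likelihood.
    weights = [likelihood(obs, x) for x in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a fresh particle set proportional to the weights.
    particles = random.choices(particles, weights=weights, k=N)

estimate = sum(particles) / N             # posterior-mean risk level
print(round(estimate, 2))
```

The posterior mean (or a quantile of the particle cloud) is what would then be mapped onto discrete information-system risk levels via state estimation.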
NASA Technical Reports Server (NTRS)
Tompkins, F. G.
1984-01-01
Guidance is presented to NASA Computer Security Officials for determining the acceptability or unacceptability of ADP security risks based on the technical, operational and economic feasibility of potential safeguards. The risk management process is reviewed as a specialized application of the systems approach to problem solving and information systems analysis and design. Reporting the results of the risk reduction analysis to management is considered. Report formats for the risk reduction study are provided.
Software for occupational health and safety risk analysis based on a fuzzy model.
Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan
2012-01-01
Risk and safety management are very important issues in healthcare systems. These are complex systems with many entities, hazards, and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyzed different types of hazards in healthcare systems and introduced a new fuzzy model for evaluating and ranking them. Finally, we presented a developed software solution, based on the suggested fuzzy model, for evaluating and monitoring risk.
Røssvoll, Elin Halbach; Ueland, Øydis; Hagtvedt, Therese; Jacobsen, Eivind; Lavik, Randi; Langsrud, Solveig
2012-09-01
Traditionally, consumer food safety survey responses have been classified as either "right" or "wrong," and food handling practices associated with a high risk of infection have been treated in the same way as practices with lower risks. In this study, a risk-based method for consumer food safety surveys was developed, and HACCP (hazard analysis and critical control point) methodology was used for selecting relevant questions. We conducted a nationally representative Web-based survey (n = 2,008), and to fit the self-reported answers we adjusted a risk-based grading system originally developed for observational studies. The results of the survey were analyzed both with the traditional "right" and "wrong" classification and with the risk-based grading system. The results of the two methods were very different. Only 5 of the 10 most frequent food handling violations were among the 10 practices associated with the highest risk. These 10 practices dealt with different aspects of heat treatment (lacking or insufficient), whereas the majority of the most frequent violations involved storing food at room temperature for too long. Use of the risk-based grading system for survey responses gave a more realistic picture of the risks associated with domestic food handling practices. The method distinguished important violations from minor errors that are made by most people and are not associated with significant risk. Surveys built on a HACCP-based approach with risk-based grading will contribute to a better understanding of domestic food handling practices and will be of great value for targeted information and educational activities.
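The contrast between right/wrong scoring and risk-based grading can be sketched as follows, with hypothetical practices and risk weights (not the survey's actual grading system):

```python
# Self-reported answers: True = the unsafe practice was reported.
answers = {"undercooked_chicken": True,
           "room_temp_storage": True,
           "unwashed_hands": False}

# Assumed risk weights: heat-treatment errors weigh far more than
# frequent-but-minor violations, mirroring the study's finding.
risk_weight = {"undercooked_chicken": 10,
               "room_temp_storage": 3,
               "unwashed_hands": 5}

right_wrong = sum(answers.values())   # classic score: count of "wrong" answers
risk_score = sum(risk_weight[k] for k, v in answers.items() if v)
print(right_wrong, risk_score)        # -> 2 13
```

Two respondents with the same right/wrong count can thus receive very different risk scores, which is exactly why the two analyses in the study diverged.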
Ontology-based specification, identification and analysis of perioperative risks.
Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich
2017-09-06
Medical personnel in hospitals often work under great physical and mental strain. In medical decision making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention, or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system that can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by medical staff and usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called the Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.
Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.
Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi
2015-10-01
In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
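When explicit expressions are unavailable, the finite-time failure probability described above can also be estimated by Monte Carlo first-passage simulation; the random-walk risk process and all parameters below are assumed for illustration:

```python
import random

def failure_probability(level=5.0, horizon=50, drift=0.05, sigma=1.0, n_paths=20000):
    """Estimate P(risk process first crosses `level` within `horizon` steps)."""
    random.seed(42)
    failures = 0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(horizon):
            x += drift + sigma * random.gauss(0.0, 1.0)
            if x >= level:          # first passage over the critical risk level
                failures += 1
                break               # failure time recorded; stop this path
    return failures / n_paths

p_fail = failure_probability()
print(round(p_fail, 3))
```

Recording the crossing step and the overshoot `x - level` at the break would likewise estimate the joint distribution of the failure time and the excess over the risk level that the article derives analytically.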
Game Theory and Risk-Based Levee System Design
NASA Astrophysics Data System (ADS)
Hui, R.; Lund, J. R.; Madani, K.
2014-12-01
Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, landowners on each riverbank tend to optimize their levees independently with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the landowners on each riverbank develop their design strategies using risk-based economic optimization. For each landowner, the annual expected total cost includes the expected annual damage cost and the annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.
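Best-response iteration on assumed cost functions illustrates how a non-cooperative Nash equilibrium of the two-landowner game can be located numerically; the cost model below is invented for illustration and is not the study's:

```python
def total_cost(h_own, h_other):
    """Assumed annual cost for one landowner given both levee heights (m)."""
    construction = 2.0 * h_own                    # annualized construction cost
    # Damage falls with your own height but rises when the neighbor's
    # higher levee transfers flood risk onto your bank.
    expected_damage = 100.0 / (1.0 + h_own) + 5.0 * max(0.0, h_other - h_own)
    return construction + expected_damage

def best_response(h_other, grid):
    """Height on the grid minimizing cost against the neighbor's height."""
    return min(grid, key=lambda h: total_cost(h, h_other))

grid = [i * 0.25 for i in range(41)]              # candidate heights 0..10 m
h1 = h2 = 0.0
for _ in range(50):                               # iterate to a fixed point
    h1, h2 = best_response(h2, grid), best_response(h1, grid)
print(h1, h2)                                     # symmetric Nash equilibrium
```

With this symmetric cost model the best responses converge to equal heights; the study's point is that such an equilibrium need not match the social planner's cost-minimizing allocation once risk transfer between banks is priced in.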
Advanced uncertainty modelling for container port risk analysis.
Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin
2016-08-13
Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of a CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support for a CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
Derailment-based Fault Tree Analysis on Risk Management of Railway Turnout Systems
NASA Astrophysics Data System (ADS)
Dindar, Serdar; Kaewunruen, Sakdirat; An, Min; Gigante-Barrera, Ángel
2017-10-01
Railway turnouts are fundamental mechanical infrastructures that allow rolling stock to divert from one track to another. Because they comprise a large number of engineering subsystems, e.g. track, signalling, and earthworks, these subsystems can fail through various mechanisms, any of which could trigger a catastrophic event. A derailment, one of the undesirable events in railway operation, although rare, often damages rolling stock and railway infrastructure, disrupts service, and has the potential to cause casualties and even loss of life. It is therefore important that a well-designed risk analysis is performed to create awareness of hazards and to identify which parts of the system may be at risk. This study focuses on all types of environment-based failures arising from the numerous contributing factors noted officially in accident reports. The risk analysis is designed to help industry minimise the occurrence of accidents at railway turnouts. The methodology relies on accurate assessment of derailment likelihood and is based on statistical, multiple-factor-integrated accident rate analysis. The study establishes product risks and faults and shows the impact of potential processes by means of Boolean algebra.
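The Boolean-algebra evaluation of a fault tree that the abstract mentions reduces to combining basic-event probabilities through AND/OR gates. A minimal sketch with hypothetical turnout failure events and made-up probabilities (not the study's data):

```python
# Hypothetical basic-event probabilities for a derailment fault tree.
basic = {
    "worn_switch_blade": 0.002,
    "signalling_fault": 0.001,
    "excessive_speed": 0.004,
    "track_misalignment": 0.003,
}

def p_and(*ps):
    # AND gate: all independent events occur, P = prod(p)
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    # OR gate: at least one event occurs, P = 1 - prod(1 - p)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Top event: turnout fails mechanically OR the diverging route is taken
# too fast while the track geometry is degraded.
p_turnout_failure = p_or(basic["worn_switch_blade"], basic["signalling_fault"])
p_geometry_event = p_and(basic["excessive_speed"], basic["track_misalignment"])
p_derailment = p_or(p_turnout_failure, p_geometry_event)
```

Real fault trees are far larger, but every gate reduces to these two formulas under the independence assumption.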
Hu, Chenggong; Zhou, Yongfang; Liu, Chang; Kang, Yan
2018-01-01
Gastric cancer (GC) is the fifth most common cancer and the third leading cause of cancer-associated mortality worldwide. In the current study, comprehensive bioinformatic analyses were performed to develop a novel scoring system for GC risk assessment based on CAP-Gly domain containing linker protein family member 4 (CLIP4) DNA methylation status. Two GC datasets with methylation sequencing information and mRNA expression profiling were downloaded from The Cancer Genome Atlas and Gene Expression Omnibus databases. Differentially expressed genes (DEGs) between the CLIP4 hypermethylation and CLIP4 hypomethylation groups were screened using the limma package in R 3.3.1, and survival analysis of these DEGs was performed using the survival package. A risk scoring system was established via regression factor-weighted gene expression based on linear combination to screen the most important genes associated with CLIP4 methylation and prognosis. Genes associated with high/low risk values were selected using the limma package. Functional enrichment analysis of the top 500 DEGs that positively and negatively associated with risk values was performed using DAVID 6.8 online and the gene set enrichment analysis (GSEA) software. In total, 35 genes were identified as significantly associated with prognosis and CLIP4 DNA methylation, and three prognostic signature genes, claudin-11 (CLDN11), apolipoprotein D (APOD), and chordin like 1 (CHRDL1), were used to establish a risk assessment system. The prognostic scoring system was effective in classifying patients with different prognoses, with the low-risk groups having significantly longer overall survival times than the high-risk groups. CLDN11, APOD and CHRDL1 exhibited reduced expression in the hypermethylation and low-risk groups compared with the hypomethylation and high-risk groups, respectively. Multivariate Cox analysis indicated that risk value could be used as an independent prognostic factor.
In functional analysis, six functional gene ontology terms and five GSEA pathways were associated with CLDN11, APOD and CHRDL1. The results established the credibility of the scoring system in this study. Additionally, these three genes, which were significantly associated with CLIP4 DNA methylation and GC risk assessment, were identified as potential prognostic biomarkers. PMID:29901187
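The regression factor-weighted risk score described above is a linear combination of signature-gene expression. A minimal sketch, with hypothetical coefficients and expression values standing in for the study's fitted Cox weights:

```python
# Hypothetical regression weights for the three signature genes.
coefficients = {"CLDN11": 0.42, "APOD": 0.31, "CHRDL1": 0.27}

def risk_score(expression):
    # Linear combination of signature-gene expression, weighted by the
    # (here invented) regression factors.
    return sum(coefficients[g] * expression[g] for g in coefficients)

patients = {
    "P1": {"CLDN11": 2.1, "APOD": 1.8, "CHRDL1": 0.9},   # higher expression
    "P2": {"CLDN11": 0.4, "APOD": 0.6, "CHRDL1": 0.2},   # lower expression
}
scores = {p: risk_score(e) for p, e in patients.items()}

# Dichotomize at the cohort mean score into high-/low-risk groups.
cutoff = sum(scores.values()) / len(scores)
groups = {p: ("high" if s > cutoff else "low") for p, s in scores.items()}
```

The actual study derives the weights from Cox regression and splits the cohort for survival comparison; the sketch only shows the scoring arithmetic.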
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in existing methodologies. This prevents a quantitative assessment of the impact of safety measures on risk control. We have attempted to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system indicates the extent to which risk is reduced by each successive safety measure. It also indicates, based on maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.
Sun, F; Chen, J; Tong, Q; Zeng, S
2007-01-01
Management of drinking water safety is changing towards an integrated risk assessment and risk management approach that includes all processes in a water supply system from catchment to consumers. However, given the large number of water supply systems in China and the cost of implementing such a risk assessment procedure, there is a necessity to first conduct a strategic screening analysis at a national level. An integrated methodology of risk assessment and screening analysis is thus proposed to evaluate drinking water safety of a conventional water supply system. The violation probability, indicating drinking water safety, is estimated at different locations of a water supply system in terms of permanganate index, ammonia nitrogen, turbidity, residual chlorine and trihalomethanes. Critical parameters with respect to drinking water safety are then identified, based on which an index system is developed to prioritize conventional water supply systems in implementing a detailed risk assessment procedure. The evaluation results are represented as graphic check matrices for the concerned hazards in drinking water, from which the vulnerability of a conventional water supply system is characterized.
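The violation probability the abstract defines can be estimated by Monte Carlo sampling of concentration distributions against the relevant standard. A sketch with assumed lognormal distributions and illustrative limits, not the study's data:

```python
import random

random.seed(0)

# Hypothetical standards and lognormal parameters (mu, sigma of ln-concentration)
# for two of the parameters named in the abstract.
limits = {"turbidity_NTU": 1.0, "ammonia_mg_L": 0.5}
lognormal_params = {
    "turbidity_NTU": (-0.7, 0.5),
    "ammonia_mg_L": (-1.6, 0.6),
}

def violation_probability(param, n=100_000):
    # Fraction of sampled concentrations exceeding the standard.
    mu, sigma = lognormal_params[param]
    limit = limits[param]
    exceed = sum(1 for _ in range(n) if random.lognormvariate(mu, sigma) > limit)
    return exceed / n

probs = {p: violation_probability(p) for p in limits}

# Prioritize parameters with the highest violation probability for the
# detailed risk assessment, as in the screening step described.
priority = sorted(probs, key=probs.get, reverse=True)
```

The screening logic then ranks water supply systems (here, just parameters) by these probabilities before committing to a full source-to-tap assessment.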
NASA Astrophysics Data System (ADS)
Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.
2015-10-01
The article presents research results on knowledge base development for an intellectual information system for enterprise bankruptcy risk assessment. The process of developing the knowledge base is described; the main stages, problems encountered, and their solutions are given. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of industrial enterprise financial accounts. The basis for this connectionist model is a three-layer perceptron trained with the error back-propagation algorithm. The knowledge base for the intellectual information system consists of processed information and the processing method, represented as the connectionist model. The article presents the structure of the intellectual information system, the knowledge base, and the information processing algorithm for neural network training. The paper shows mean values of 10 indexes for industrial enterprises, with which it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results are given for neural network testing on data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
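The described classifier, a three-layer perceptron trained by error back-propagation, can be sketched in a few dozen lines. The two toy "indexes" and labels below are fabricated; the real system uses 10 financial indexes:

```python
import math, random

random.seed(1)

# Toy data: [index1, index2] -> 1 = bankrupt, 0 = financially sound.
data = [([0.1, 0.2], 1), ([0.2, 0.1], 1), ([0.8, 0.9], 0), ([0.9, 0.7], 0)]

H = 3  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(w1[j], x)) + b1[j]) for j in range(H)]
    return h, sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)

def loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in data)

initial = loss()
lr = 0.5
for _ in range(2000):                # plain gradient descent, back-propagated
    for x, y in data:
        h, out = forward(x)
        d_out = (out - y) * out * (1 - out)           # dE/d(net) at output
        for j in range(H):
            d_h = d_out * w2[j] * h[j] * (1 - h[j])   # dE/d(net) at hidden j
            w2[j] -= lr * d_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * d_h * x[i]
            b1[j] -= lr * d_h
        b2 -= lr * d_out

assert loss() < initial   # training reduces the squared error
```

A production version would standardize the 10 indexes and hold out a validation set, as the article's testing on unseen enterprises implies.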
The role of risk-based prioritization in total quality management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, C.T.
1994-10-01
The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper argues for the requirement to institutionalize an evolutionary, decision-theoretic approach to the policy analysis of the Department of Energy's Program Budget.
Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R
2011-01-01
Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, also the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
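The combination of fault-tree risk reduction with cost-effectiveness analysis (CEA) amounts to ranking measures by cost per unit of risk reduction. A sketch with invented alternatives and figures, not the paper's case data:

```python
# Baseline system risk, e.g. expected days/year the water safety target is
# not met (hypothetical figure, as produced by a fault-tree model).
baseline_risk = 12.0

alternatives = {
    # name: (annual cost, residual risk with the measure in place)
    "extra_UV_disinfection": (400_000.0, 4.0),
    "new_raw_water_source": (900_000.0, 3.0),
    "enhanced_monitoring": (150_000.0, 8.0),
}

def cost_effectiveness(cost, residual_risk):
    # Cost per unit of risk reduction; lower is more cost-effective.
    reduction = baseline_risk - residual_risk
    return cost / reduction

ratios = {name: cost_effectiveness(*v) for name, v in alternatives.items()}
ranking = sorted(ratios, key=ratios.get)   # most cost-effective first
```

In the paper's source-to-tap approach, the residual risks come from the probabilistic fault-tree model of the entire system rather than from point estimates as here.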
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco
Environmental auditing is a key issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent safety research on socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work through the application of a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to dynamically define the system structure. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cut sets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
Gigrich, James; Sarkani, Shahryar; Holzer, Thomas
2017-03-01
There is an increasing backlog of potentially toxic compounds that cannot be evaluated with current animal-based approaches in a cost-effective and expeditious manner, thus putting human health at risk. Extrapolation of animal-based test results for human risk assessment often leads to different physiological outcomes. This article introduces the use of quantitative tools and methods from systems engineering to evaluate the risk of toxic compounds by analyzing the amount of stress that human hepatocytes undergo in vitro when metabolizing GW7647 over extended times and concentrations. Hepatocytes are exceedingly connected systems, which makes it challenging to interpret the high-dimensional genomics data needed to determine the risk of exposure. Gene expression data for peroxisome proliferator-activated receptor-α (PPARα) binding were measured over multiple concentrations and exposure times of GW7647, leveraging the Mahalanobis distance to establish toxicity risk threshold levels. The application of these novel systems engineering tools provides new insight into the intricate workings of human hepatocytes to determine risk threshold levels from exposure. This approach is beneficial to decision makers and scientists, and it can help reduce the backlog of untested chemical compounds due to the high cost and inefficiency of animal-based models.
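The Mahalanobis-distance thresholding idea can be sketched as follows: measure how far a treated expression profile lies from the control population, scaled by the controls' covariance, and flag risk beyond a threshold. The control profiles, treated sample, and threshold below are all synthetic:

```python
import math

# Control population: two hypothetical target genes (rows = samples).
controls = [(1.0, 2.0), (1.2, 1.9), (0.9, 2.2), (1.1, 2.1), (1.0, 1.8)]

n = len(controls)
mean = [sum(c[i] for c in controls) / n for i in range(2)]
# 2x2 sample covariance matrix of the control profiles.
cov = [[sum((c[i] - mean[i]) * (c[j] - mean[j]) for c in controls) / (n - 1)
        for j in range(2)] for i in range(2)]
det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
inv = [[cov[1][1] / det, -cov[0][1] / det],
       [-cov[1][0] / det, cov[0][0] / det]]

def mahalanobis(x):
    # sqrt of (x - mean)^T * inv(cov) * (x - mean)
    d = [x[0] - mean[0], x[1] - mean[1]]
    return math.sqrt(sum(d[i] * inv[i][j] * d[j]
                         for i in range(2) for j in range(2)))

threshold = 3.0                      # assumed risk threshold
treated = (2.5, 0.9)                 # strongly shifted expression after exposure
at_risk = mahalanobis(treated) > threshold
```

The real analysis works in far more dimensions, where the covariance scaling is exactly what makes the distance meaningful.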
The -765G>C polymorphism in the cyclooxygenase-2 gene and digestive system cancer: a meta-analysis.
Zhao, Fen; Cao, Yue; Zhu, Hong; Huang, Min; Yi, Cheng; Huang, Ying
2014-01-01
Published data regarding associations between the -765G>C polymorphism in the cyclooxygenase-2 (COX-2) gene and digestive system cancer risk have been inconclusive. The aim of this study was to comprehensively evaluate the genetic risk of the -765G>C polymorphism in the COX-2 gene for digestive system cancer. A search was performed in the Pubmed, Medline (Ovid), Embase, CNKI, Weipu, Wanfang and CBM databases, covering all studies until Feb 10, 2014. Statistical analysis was performed using RevMan 5.2. A total of 10,814 cases and 16,174 controls in 38 case-control studies were included in this meta-analysis. The results indicated that C allele carriers (GC+CC) had a 20% increased risk of digestive system cancer when compared with the homozygote GG (odds ratio (OR)=1.20, 95% confidence interval (CI), 1.00-1.44 for GC+CC vs GG). In the subgroup analysis by ethnicity, significantly elevated risks were associated with C allele carriers (GC+CC) in Asians (OR=1.46, 95% CI=1.07-2.01, p=0.02) and Africans (OR=2.12, 95% CI=1.57-2.87, p<0.00001), but not among Caucasians, Americans and mixed groups. In the subgroup analysis by cancer type (GC+CC vs GG), significant associations were found between the -765G>C polymorphism and higher risk of gastric cancer (OR=1.64, 95% CI=1.03-2.61, p=0.04), but not of colorectal cancer, oral cancer, esophageal cancer, or others. Regarding study design (GC+CC vs GG), no significant associations were found in the population-based case-control (PCC), hospital-based case-control (HCC) or family-based case-control (FCC) studies. This meta-analysis suggests that the -765G>C polymorphism of the COX-2 gene is a potential risk factor for digestive system cancer in Asians and Africans, and for gastric cancer overall.
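Pooled ORs and CIs in a meta-analysis like this typically come from inverse-variance weighting of study-level log odds ratios. A sketch with three fabricated 2x2 tables, not the included studies:

```python
import math

# Per study: (cases exposed, cases unexposed, controls exposed, controls unexposed).
studies = [(60, 140, 40, 160), (35, 65, 25, 75), (80, 120, 55, 145)]

log_ors, weights = [], []
for a, b, c, d in studies:
    log_or = math.log((a * d) / (b * c))     # log odds ratio of a 2x2 table
    var = 1 / a + 1 / b + 1 / c + 1 / d      # Woolf's approximate variance
    log_ors.append(log_or)
    weights.append(1 / var)                  # inverse-variance weight

# Fixed-effect pooled estimate and 95% CI on the log scale, then exponentiate.
pooled_log = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
pooled_or = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * se), math.exp(pooled_log + 1.96 * se))
```

Subgroup results such as those by ethnicity or cancer type are the same computation restricted to the relevant studies; RevMan also offers random-effects pooling when heterogeneity is present.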
U.K. Foot and Mouth Disease: A Systemic Risk Assessment of Existing Controls.
Delgado, João; Pollard, Simon; Pearn, Kerry; Snary, Emma L; Black, Edgar; Prpich, George; Longhurst, Phil
2017-09-01
This article details a systemic analysis of the controls in place and possible interventions available to further reduce the risk of a foot and mouth disease (FMD) outbreak in the United Kingdom. Using a research-based network analysis tool, we identify vulnerabilities within the multibarrier control system and their corresponding critical control points (CCPs). CCPs represent opportunities for active intervention that produce the greatest improvement to United Kingdom's resilience to future FMD outbreaks. Using an adapted 'features, events, and processes' (FEPs) methodology and network analysis, our results suggest that movements of animals and goods associated with legal activities significantly influence the system's behavior due to their higher frequency and ability to combine and create scenarios of exposure similar in origin to the U.K. FMD outbreaks of 1967/8 and 2001. The systemic risk assessment highlights areas outside of disease control that are relevant to disease spread. Further, it proves to be a powerful tool for demonstrating the need for implementing disease controls that have not previously been part of the system. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Case Study on Project Risk Management Planning Based on Soft System Methodology
NASA Astrophysics Data System (ADS)
Lifang, Xie; Jun, Li
This paper analyzes the soft-system characteristics of construction projects and the applicability of Soft System Methodology (SSM) to risk analysis, after a brief review of SSM. Taking a hydropower project as an example, it constructs a general framework for project risk management planning (PRMP) and establishes the risk management planning (RMP) system from the perspective of coordinating stakeholder interests. The paper provides ideas and methods for construction RMP under a win-win situation through the practice of SSM.
Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar
2017-03-01
In the context of the underground coal mining industry, the increasing economic stakes of implementing additional safety measure systems, along with growing public awareness of the need to ensure a high level of worker safety, have put great pressure on managers to find the best solution that ensures safe as well as economically viable alternative selection. A risk-based decision support system plays an important role in finding such solutions amongst candidate alternatives with respect to multiple decision criteria. Therefore, in this paper, a unified risk-based decision-making methodology is proposed for selecting an appropriate safety measure system for an underground coal mining operation with respect to multiple risk criteria such as financial risk, operating risk, and maintenance risk. The proposed methodology uses interval-valued fuzzy set theory to model vagueness and subjectivity in the estimates of fuzzy risk ratings for making appropriate decisions. The methodology is based on aggregative fuzzy risk analysis and multi-criteria decision making. The selection decisions are made within the context of understanding the total integrated risk likely to be incurred while adopting a particular safety system alternative. The effectiveness of the proposed methodology has been validated through a real-time case study. The resulting final priority ranking appears fairly consistent.
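The aggregation step of such a methodology can be sketched with plain interval arithmetic: interval-valued ratings on each risk criterion are combined by a weighted average, and alternatives are ranked by interval midpoint. The criteria weights, ratings, and alternatives below are invented, and this deliberately omits the fuzzy-set machinery of the actual method:

```python
# Hypothetical criteria weights (sum to 1).
weights = {"financial": 0.5, "operating": 0.3, "maintenance": 0.2}

# Each rating is an interval [lower, upper] on a 0-10 risk scale.
alternatives = {
    "ventilation_upgrade": {"financial": (3, 5), "operating": (2, 4), "maintenance": (4, 6)},
    "roof_bolting_system": {"financial": (5, 7), "operating": (4, 5), "maintenance": (2, 3)},
}

def aggregate(ratings):
    # Weighted interval average: combine lower and upper bounds separately.
    lo = sum(weights[c] * ratings[c][0] for c in weights)
    hi = sum(weights[c] * ratings[c][1] for c in weights)
    return lo, hi

def midpoint(interval):
    return (interval[0] + interval[1]) / 2

agg = {name: aggregate(r) for name, r in alternatives.items()}
ranking = sorted(agg, key=lambda n: midpoint(agg[n]))   # lowest total risk first
```

Interval-valued fuzzy sets generalize this by attaching membership functions to the bounds; the weighted aggregation and final ranking follow the same pattern.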
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real case studies, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Program risk analysis handbook
NASA Technical Reports Server (NTRS)
Batson, R. G.
1987-01-01
NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selecting the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. A bibliography (150 entries) and a program risk analysis checklist are provided.
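One common way to encode an expert's subjective estimate, in the spirit of the encoding techniques such handbooks catalogue, is to turn a low/most-likely/high elicitation into a triangular distribution and sample it in a simple cost-risk roll-up. The work packages and figures below are invented:

```python
import random

random.seed(42)

# Expert estimates (low, most likely, high) for three work-package costs, $M.
elicited = {
    "propulsion": (10, 12, 18),
    "avionics": (4, 5, 9),
    "structure": (7, 8, 10),
}

def sample_total():
    # One Monte Carlo draw of the total program cost.
    return sum(random.triangular(lo, hi, mode)   # note: (low, high, mode)
               for lo, mode, hi in elicited.values())

samples = sorted(sample_total() for _ in range(20_000))
mean_cost = sum(samples) / len(samples)
p80 = samples[int(0.8 * len(samples))]   # 80th-percentile budget level
```

Simple roll-ups like this underlie the network-based simulation techniques at the complex end of the handbook's range; a PERT beta distribution is a frequent alternative to the triangular.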
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches for risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk.
Validation of any event or component on the critical path is relatively more important in a risk-informed environment. Significance of multi-hazard risk is also illustrated for uncorrelated hazards of earthquakes and high winds which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
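The advantage of a Bayesian network over an independence-assuming fault tree can be shown with a two-hazard toy example: when one hazard raises the probability of the other (e.g. seismically induced flooding), enumeration over the joint distribution yields a higher failure probability than the naive independent-marginals estimate. All probabilities below are invented:

```python
p_eq = 0.01                                  # P(earthquake)
p_flood_given = {True: 0.30, False: 0.02}    # P(flood | earthquake)

# P(component failure | earthquake, flood): a conditional probability table.
p_fail = {(True, True): 0.90, (True, False): 0.40,
          (False, True): 0.25, (False, False): 0.001}

# Exact inference by enumeration over the joint distribution (the BN way).
p_failure = 0.0
for eq in (True, False):
    p_e = p_eq if eq else 1 - p_eq
    for fl in (True, False):
        p_f = p_flood_given[eq] if fl else 1 - p_flood_given[eq]
        p_failure += p_e * p_f * p_fail[(eq, fl)]

# Naive estimate treating the hazards as independent, using the marginal
# flood probability (the fault-tree-style shortcut).
p_flood = p_eq * p_flood_given[True] + (1 - p_eq) * p_flood_given[False]
p_naive = 0.0
for eq in (True, False):
    for fl in (True, False):
        p_naive += ((p_eq if eq else 1 - p_eq)
                    * (p_flood if fl else 1 - p_flood)
                    * p_fail[(eq, fl)])

assert p_failure > p_naive   # ignoring the dependence understates the risk
```

Real networks have many nodes and use message passing rather than brute-force enumeration, but the modeling of dependence through conditional probability tables is exactly this.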
Conceptual design study of Fusion Experimental Reactor (FY86 FER): Safety
NASA Astrophysics Data System (ADS)
Seki, Yasushi; Iida, Hiromasa; Honda, Tsutomu
1987-08-01
This report describes the safety study for the FER (Fusion Experimental Reactor), which has been designed as the next-step machine after the JT-60. Although the final purpose of this study is to form a picture of design-basis accidents and maximum credible accidents and to assess their risk and probability for the FER plant system as a whole, the emphasis of this year's study is placed on the fuel-gas circulation system, where the tritium inventory is largest. The report consists of two chapters. The first chapter summarizes the FER system and describes FMEA (Failure Mode and Effects Analysis) and related accident progression sequences for the FER plant system as a whole. The second chapter focuses on the fuel-gas circulation system, including purification, isotope separation, and storage. Risk probability is assessed by a probabilistic risk analysis (PRA) procedure based on FMEA, ETA and FTA.
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distributions. Thus, this approach makes the methodology particularly useful in cases where the available data for quantification of hazardous events probabilities are scarce or nonexistent, there is dependence among events, or when nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
A Risk-Based Approach for Aerothermal/TPS Analysis and Testing
NASA Technical Reports Server (NTRS)
Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak
2007-01-01
The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
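The Monte-Carlo uncertainty methodology described can be sketched by propagating uncertain inputs through a stagnation-point heating correlation and reading off percentiles for margin. The correlation constant, trajectory values, and uncertainty ranges below are illustrative assumptions, not mission data:

```python
import math, random

random.seed(7)

def heat_flux(k, rho, v, rn):
    # Sutton-Graves-like form: q = k * sqrt(rho / rn) * v^3  (W/m^2)
    return k * math.sqrt(rho / rn) * v ** 3

N = 50_000
nose_radius = 1.0                                 # m, assumed
samples = sorted(
    heat_flux(random.gauss(1.74e-4, 1.74e-5),     # ~10% correlation uncertainty
              random.gauss(3.0e-4, 1.5e-5),       # density at peak heating, kg/m^3
              random.gauss(7500.0, 50.0),         # velocity at peak heating, m/s
              nose_radius)
    for _ in range(N)
)
median = samples[N // 2]
p97_5 = samples[int(0.975 * N)]
margin_factor = p97_5 / median   # design-to heating margin over the nominal
```

A physics-based version would sample the uncertain parameters inside the CFD and material-response models themselves; the percentile-based margin logic is the same.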
NASA Technical Reports Server (NTRS)
Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis
2010-01-01
Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission-based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the moon; lunar surface activities; ascent from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components/systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.
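The phased-mission event-tree/fault-tree logic can be illustrated with simple gate calculations over independent basic events. All failure probabilities below are hypothetical placeholders, not actual CEV data; a real PRA would also model dependencies and common-cause failures.

```python
def or_gate(probs):
    """Probability that at least one independent basic event occurs."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(probs):
    """Probability that all independent basic events occur."""
    out = 1.0
    for p in probs:
        out *= p
    return out

# Illustrative per-phase failure probabilities, each the OR of a few
# independent basic events from a notional fault tree.
phases = {
    "launch":     or_gate([1e-3, 5e-4]),
    "rendezvous": or_gate([2e-4, 1e-4]),
    "lunar_ops":  or_gate([3e-4]),
    "return":     or_gate([8e-4, 2e-4]),
}
# Phased mission: loss of mission if any phase fails.
p_loss = or_gate(phases.values())
```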
An Accident Precursor Analysis Process Tailored for NASA Space Systems
NASA Technical Reports Server (NTRS)
Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare
2010-01-01
Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.
A clinical economics workstation for risk-adjusted health care cost management.
Eisenstein, E. L.; Hales, J. W.
1995-01-01
This paper describes a healthcare cost accounting system which is under development at Duke University Medical Center. Our approach differs from current practice in that this system will dynamically adjust its resource usage estimates to compensate for variations in patient risk levels. This adjustment is made possible by introducing a new cost accounting concept, Risk-Adjusted Quantity (RQ). RQ divides case-level resource usage variances into their risk-based component (resource consumption differences attributable to differences in patient risk levels) and their non-risk-based component (resource consumption differences which cannot be attributed to differences in patient risk levels). Because patient risk level is a factor in estimating resource usage, this system is able to simultaneously address the financial and quality dimensions of case cost management. In effect, cost-effectiveness analysis is incorporated into health care cost management. PMID:8563361
NASA Astrophysics Data System (ADS)
Sari, D. A. P.; Innaqa, S.; Safrilah
2017-06-01
This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability and capacity, using maps to represent the results; Miles and Huberman analytical techniques were then used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-low risk of landslides. Of the 14 villages, three have a moderate risk level, namely Hambalang, Tajur, and Tangkil, covering 49.58% of the total land area. Eleven villages have a low risk level, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem, and Puspanegara, covering 48.68% of the total land area; high-risk areas account for only around 1.74%, in part of Hambalang village. The analysis using a Geographic Information System (GIS) proves that areas with high potential hazard do not necessarily have a high level of risk. The capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy consists of creating safe conditions and intensifying the disaster risk reduction movement.
NASA Astrophysics Data System (ADS)
Masure, P.
2003-04-01
The GEMITIS method has been implemented since 1995 as part of a global and integrated risk reduction strategy for improving seismic risk-assessment effectiveness in urban areas, including the generation of crisis scenarios and mid- to long-term seismic impact assessment. GEMITIS required us to provide more precise definitions of notions in common use by natural-hazard specialists, such as elements at risk and vulnerability. Until then, only the physical and human elements had been considered, and analysis of their vulnerability referred to their fragility in the face of aggression by nature. We have completed this approach by also characterizing the social and cultural vulnerability of a city and its inhabitants, and, with a wider scope, the functional vulnerability of the "urban system". This functional vulnerability depends upon the relations between the system elements (weak links in chains, functional relays, and defense systems) and upon the city's relations with the outside world (interdependence). Though well developed in methods for evaluating industrial risk (fault-tree analysis, event-tree analysis, multiple defense barriers, etc.), this aspect had until now been ignored by the "hard-science" specialists working on natural hazards. Based on the implementation of an Urban System Exposure methodology, we were able to identify specific human, institutional, or functional vulnerability factors for each urban system, which until then had been very little discussed by risk-analysis and civil-protection specialists. In addition, we have defined the new concept of "main stakes" of the urban system, ranked by order of social value (or collective utility). Obviously, vital or strategic stakes must be made more resistant or better protected against natural hazards than stakes of secondary importance. The ranking of the exposed elements of a city in terms of "main stakes" provides a very useful guide for adapting vulnerability studies and for orienting preventive actions.
For this, GEMITIS is based on a systemic approach to the city and on value analysis of its exposed elements. It facilitates collective expertise for the definition of a preventive action plan based on the participation of the main urban actors (crisis preparedness, construction, land-use, etc.).
Ayyub, Bilal M
2014-02-01
The United Nations Office for Disaster Risk Reduction reported that the 2011 natural disasters, including the earthquake and tsunami that struck Japan, resulted in $366 billion in direct damages and 29,782 fatalities worldwide. Storms and floods accounted for up to 70% of the 302 natural disasters worldwide in 2011, with earthquakes producing the greatest number of fatalities. Average annual losses in the United States amount to about $55 billion. Enhancing community and system resilience could lead to massive savings through risk reduction and expeditious recovery. The rational management of such reduction and recovery is facilitated by an appropriate definition of resilience and associated metrics. In this article, a resilience definition is provided that meets a set of requirements with clear relationships to the metrics of the relevant abstract notions of reliability and risk. Those metrics also meet logically consistent requirements drawn from measure theory, and provide a sound basis for the development of effective decision-making tools for multihazard environments. Improving the resiliency of a system to meet target levels requires the examination of system enhancement alternatives in economic terms, within a decision-making framework. Relevant decision analysis methods would typically require the examination of resilience based on its valuation by society at large. The article provides methods for valuation and benefit-cost analysis based on concepts from risk analysis and management. © 2013 Society for Risk Analysis.
Systems Analysis of NASA Aviation Safety Program: Final Report
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen
2013-01-01
A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe program was conducted by using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.
An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.
2002-01-01
Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.
Risk Management Using Dependency Structure Matrix
NASA Astrophysics Data System (ADS)
Petković, Ivan
2011-09-01
An efficient method based on dependency structure matrix (DSM) analysis is given for ranking risks in a complex system or process whose entities are mutually dependent. The rank is determined from the elements of the unique positive eigenvector corresponding to the spectral radius of the matrix modeling the considered engineering system. For demonstration, the risk problem of NASA's robotic spacecraft is analyzed.
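A minimal sketch of the eigenvector ranking described above, assuming a small hypothetical DSM: the positive (Perron) eigenvector of the nonnegative dependency matrix is computed by power iteration, and entities are ranked by their component values.

```python
def risk_ranking(dsm, iters=200):
    """Rank entities by the Perron eigenvector of a nonnegative
    dependency structure matrix, computed by power iteration.

    Returns (order, scores): indices sorted from highest to lowest
    risk contribution, and the normalized eigenvector (sums to 1).
    """
    n = len(dsm)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(dsm[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(w)
        v = [x / norm for x in w]
    return sorted(range(n), key=lambda i: -v[i]), v

# Hypothetical 4-entity dependency matrix (entry [i][j] = strength of
# entity i's dependence on entity j); not from the NASA case study.
dsm = [[0, 1, 0, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [0, 0, 1, 0]]
order, scores = risk_ranking(dsm)
```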
Small numbers, disclosure risk, security, and reliability issues in Web-based data query systems.
Rudolph, Barbara A; Shah, Gulzar H; Love, Denise
2006-01-01
This article describes the process for developing consensus guidelines and tools for releasing public health data via the Web and highlights approaches leading agencies have taken to balance disclosure risk with public dissemination of reliable health statistics. An agency's choice of statistical methods for improving the reliability of released data for Web-based query systems is based upon a number of factors, including query system design (dynamic analysis vs preaggregated data and tables), population size, cell size, data use, and how data will be supplied to users. The article also describes those efforts that are necessary to reduce the risk of disclosure of an individual's protected health information.
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
NASA Astrophysics Data System (ADS)
Faramarzi, Farhad; Mansouri, Hamid; Farsangi, Mohammad Ali Ebrahimi
2014-07-01
The environmental effects of blasting must be controlled in order to comply with regulatory limits. Because of safety concerns, the risk of damage to infrastructure, equipment, and property, and the need to maintain good fragmentation, flyrock control is crucial in blasting operations. If measures to decrease flyrock are taken, then the flyrock distance is limited and, in return, the risk of damage can be reduced or eliminated. This paper deals with modeling the level of risk associated with flyrock, as well as flyrock distance prediction, based on the rock engineering systems (RES) methodology. In the proposed models, 13 parameters affecting flyrock due to blasting are considered as inputs, and the flyrock distance and associated level of risk as outputs. In selecting the input parameters, the simplicity of their measurement was also taken into account. The data for 47 blasts, carried out at the Sungun copper mine, western Iran, were used to predict the level of risk and flyrock distance corresponding to each blast. The obtained results showed that, for the 47 blasts, the levels of estimated risk are mostly in accordance with the measured flyrock distances. Furthermore, a comparison was made between the results of the flyrock distance predictive RES-based model, the multivariate regression analysis model (MVRM), and the dimensional analysis model. For the RES-based model, R² and root mean square error (RMSE) are equal to 0.86 and 10.01, respectively, whereas for the MVRM and dimensional analysis, R² and RMSE are equal to (0.84 and 12.20) and (0.76 and 13.75), respectively. These results confirm the better performance of the RES-based model over the other proposed models.
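The R² and RMSE figures used to compare the models can be reproduced with their standard definitions. The measured and predicted flyrock distances below are hypothetical values for illustration, not the Sungun mine data.

```python
import math

def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def rmse(observed, predicted):
    """Root mean square error of predictions."""
    return math.sqrt(
        sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed))

# Hypothetical measured vs. predicted flyrock distances (m).
measured  = [85.0, 120.0, 60.0, 150.0, 95.0]
predicted = [90.0, 110.0, 70.0, 140.0, 100.0]
r2 = r_squared(measured, predicted)
err = rmse(measured, predicted)
```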
Adopting exergy analysis for use in aerospace
NASA Astrophysics Data System (ADS)
Hayes, David; Lone, Mudassir; Whidborne, James F.; Camberos, José; Coetzee, Etienne
2017-08-01
Thermodynamic analysis methods based on an exergy metric have been developed to improve the system efficiency of traditional heat-driven systems such as ground-based power plants and aircraft propulsion systems. In more recent years, however, interest in the topic has broadened to include applying these second-law methods to the field of aerodynamics and to complete aerospace vehicles. Work to date is based on highly simplified structures, but such a method could benefit the highly conservative and risk-averse commercial aerospace sector. This review argues that thermodynamic exergy analysis has the potential to facilitate a breakthrough in the optimization of aerospace vehicles, viewed as a system of energy systems, through the exergy-based multidisciplinary design of future flight vehicles.
Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun
2017-12-01
The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems-engineering task, involving the growing-base environment, seeds and seedlings, harvesting, processing and multiple other steps, so accurate identification of the factors in the TCM production process that may induce quality risk, as well as reasonable quality control measures, is very important. At present, the concept of quality risk is mainly discussed in terms of management and regulations; there has been no comprehensive analysis of possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on a TCM quality tree was proposed in this study. This system effectively combines the process analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and is convenient for users to predict, evaluate and control the quality of TCM. In application, the system can identify related quality factors such as base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to an enterprise's own production conditions, and provide different enterprises with their own quality systems, achieving personalized service. As a new quality management model, this work can provide a reference for improving the quality of Chinese medicine production and for quality standardization. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Jing, Wenjun; Zhao, Yan
2018-02-01
Stability is an important part of geotechnical engineering research. Operating experience with underground storage caverns in salt rock around the world shows that cavern stability is the key problem of safe operation. Currently, a combination of theoretical analysis and numerical simulation is the mainly adopted method of cavern stability analysis. This paper introduces the concept of risk into the stability analysis of underground geotechnical structures and studies the instability of underground storage caverns in salt rock from the perspective of risk analysis. Firstly, the definition and classification of cavern instability risk are proposed, and the damage mechanism is analyzed from a mechanical angle. Then the main evaluating indicators of cavern instability risk are proposed, and an evaluation method for cavern instability risk is put forward. Finally, the established cavern instability risk assessment system is applied to the analysis and prediction of instability risk after 30 years of operation in a proposed storage cavern group in the Huai’an salt mine. This research can provide a useful theoretical basis for the safe operation and management of underground storage caverns in salt rock.
Risk, Robustness and Water Resources Planning Under Uncertainty
NASA Astrophysics Data System (ADS)
Borgomeo, Edoardo; Mortazavi-Naeini, Mohammad; Hall, Jim W.; Guillod, Benoit P.
2018-03-01
Risk-based water resources planning is based on the premise that water managers should invest up to the point where the marginal benefit of risk reduction equals the marginal cost of achieving that benefit. However, this cost-benefit approach may not guarantee robustness under uncertain future conditions, for instance under climatic changes. In this paper, we expand risk-based decision analysis to explore possible ways of enhancing robustness in engineered water resources systems under different risk attitudes. Risk is measured as the expected annual cost of water use restrictions, while robustness is interpreted in the decision-theoretic sense as the ability of a water resource system to maintain performance—expressed as a tolerable risk of water use restrictions—under a wide range of possible future conditions. Linking risk attitudes with robustness allows stakeholders to explicitly trade-off incremental increases in robustness with investment costs for a given level of risk. We illustrate the framework through a case study of London's water supply system using state-of-the-art regional climate simulations to inform the estimation of risk and robustness.
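The risk metric described above, the expected annual cost of water use restrictions, can be sketched as a probability-weighted sum over restriction levels. The level names, probabilities, and costs below are invented for illustration and are not the London case-study values.

```python
def expected_annual_cost(levels):
    """Risk metric: expected annual cost of water-use restrictions,
    summed over restriction levels as
    (annual probability of imposing the level) x (annual cost if imposed)."""
    return sum(p * c for p, c in levels.values())

# Hypothetical restriction levels: (annual probability, cost in currency
# units of millions per year).
levels = {
    "hosepipe_ban":      (0.08, 50.0),
    "non_essential_ban": (0.02, 200.0),
    "rota_cuts":         (0.005, 1500.0),
}
risk = expected_annual_cost(levels)  # = 4 + 4 + 7.5 = 15.5
```

A robustness check would then recompute this metric over many simulated climate futures and ask whether it stays below a tolerable threshold.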
Integrated Safety Risk Reduction Approach to Enhancing Human-Rated Spaceflight Safety
NASA Astrophysics Data System (ADS)
Mikula, J. F. Kip
2005-12-01
This paper explores and defines the current accepted concept and philosophy of safety improvement based on a Reliability enhancement (called here Reliability Enhancement Based Safety Theory [REBST]). In this theory a Reliability calculation is used as a measure of the safety achieved on the program. This calculation may be based on a math model or a Fault Tree Analysis (FTA) of the system, or on an Event Tree Analysis (ETA) of the system's operational mission sequence. In each case, the numbers used in this calculation are hardware failure rates gleaned from past similar programs. As part of this paper, a fictional but representative case study is provided that helps to illustrate the problems and inaccuracies of this approach to safety determination. Then a safety determination and enhancement approach based on hazard, worst case analysis, and safety risk determination (called here Worst Case Based Safety Theory [WCBST]) is included. This approach is defined and detailed using the same example case study as shown in the REBST case study. In the end it is concluded that an approach combining the two theories works best to reduce Safety Risk.
DTREEv2, a computer-based support system for the risk assessment of genetically modified plants.
Pertry, Ine; Nothegger, Clemens; Sweet, Jeremy; Kuiper, Harry; Davies, Howard; Iserentant, Dirk; Hull, Roger; Mezzetti, Bruno; Messens, Kathy; De Loose, Marc; de Oliveira, Dulce; Burssens, Sylvia; Gheysen, Godelieve; Tzotzos, George
2014-03-25
Risk assessment of genetically modified organisms (GMOs) remains a contentious area and a major factor influencing the adoption of agricultural biotech. Methodologically, in many countries, risk assessment is conducted by expert committees with little or no recourse to databases and expert systems that can facilitate the risk assessment process. In this paper we describe DTREEv2, a computer-based decision support system for the identification of hazards related to the introduction of GM-crops into the environment. DTREEv2 structures hazard identification and evaluation by means of an Event-Tree type of analysis. The system produces an output flagging identified hazards and potential risks. It is intended to be used for the preparation and evaluation of biosafety dossiers and, as such, its usefulness extends to researchers, risk assessors and regulators in government and industry. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is covered by application standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially in what concerns completeness of the hazards, formalization, and timely analysis that can influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights on the fundamentals of CSM-RA, based on and complemented with hazard analysis. When and how to apply these processes, and their relations and similarities with industry standards and system life cycles, are highlighted. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
Assessing risk factors in the organic control system: evidence from inspection data in Italy.
Zanoli, Raffaele; Gambelli, Danilo; Solfanelli, Francesco
2014-12-01
Certification is an essential feature in organic farming, and it is based on inspections to verify compliance with European Council Regulation EC No 834/2007. A risk-based approach to noncompliance that alerts the control bodies when planning inspections would contribute to a more efficient and cost-effective certification system. An analysis of the factors that can affect the probability of noncompliance in organic farming has thus been developed. This article examines the application of zero-inflated count data models to farm-level panel data from inspection results and sanctions obtained from the Ethical and Environmental Certification Institute, one of the main control bodies in Italy. We tested many a priori hypotheses related to the risk of noncompliance. We find evidence of an important role for past noncompliant behavior in predicting future noncompliance, while farm size and the presence of livestock also play roles in an increased probability of noncompliance. We conclude the article by proposing that an efficient risk-based inspection system should be designed by weighting the known probability of occurrence of a given noncompliance according to the severity of its impact. © 2014 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha
2016-04-01
This paper shows a method to assess the vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping considers multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high-vulnerability areas are situated in the east, at the Monika and Sablette beaches. This approach relies on the efficiency of the GIS tool combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
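Full FAHP works with fuzzy pairwise judgments; as a simplified stand-in, crisp AHP criteria weights can be computed with the row geometric-mean method. The three criteria and the comparison matrix below are a hypothetical example, not the study's actual judgments.

```python
import math

def ahp_weights(pairwise):
    """Criteria weights from a reciprocal pairwise comparison matrix
    using the row geometric-mean approximation common in AHP."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison on Saaty's 1-9 scale:
# sea-level rise vs. wave height vs. distance to urban area.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 0.5, 1.0],
]
weights = ahp_weights(pairwise)  # sums to 1; first criterion dominant
```

In a GIS workflow, each raster layer is scored per cell and the weighted sum gives the vulnerability index that is then classed into moderate/high/very high.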
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly; these sources often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to influence the determination of whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system-level test data or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach provides a set of guidelines that may be useful for arriving at a more realistic quantification of risk prior to acceptance by a program.
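A handbook-style prediction of the kind the paper cautions against can be sketched as a series-system exponential model; the component failure rates and the crude 3x integration-risk margin below are assumptions for illustration only, not values from any handbook.

```python
import math

def series_reliability(failure_rates_per_hour, mission_hours):
    """Predicted reliability of a series system over a mission, from
    summed constant component failure rates (exponential model):
    R(t) = exp(-(sum of lambdas) * t)."""
    return math.exp(-sum(failure_rates_per_hour) * mission_hours)

# Hypothetical component rates (failures/hour). The component-only
# prediction ignores manufacturing/assembly integration risk, so a
# margin factor gives a crude sensitivity check on that optimism.
rates = [2e-6, 5e-7, 1e-6]
r_pred = series_reliability(rates, 500.0)
r_with_margin = series_reliability([3.0 * lam for lam in rates], 500.0)
```

The gap between `r_pred` and `r_with_margin` illustrates how strongly a risk-acceptance decision can hinge on unvalidated predicted rates.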
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.
2001-01-19
The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.
NASA Astrophysics Data System (ADS)
Ndu, Obibobi Kamtochukwu
To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human cognitive, ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
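Kriging over the pseudo-space requires fitting a variogram; as a minimal stand-in, inverse-distance weighting illustrates the core idea of inferring a new design's probabilistic measure from spatially arranged precedents, with nearer (more similar) systems weighted more heavily. The coordinates and reliability values below are hypothetical.

```python
def idw_predict(coords, values, target, power=2.0):
    """Infer a probabilistic measure (e.g. reliability) at a new design's
    location in a 2-D pseudo-space from precedent designs, using
    inverse-distance weighting as a simple stand-in for Kriging."""
    num = den = 0.0
    for (x, y), v in zip(coords, values):
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v  # target coincides with a precedent design
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical precedent systems placed by multidimensional scaling of
# expert similarity judgments, each with an observed reliability.
coords = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
relia  = [0.95, 0.90, 0.92, 0.85]
estimate = idw_predict(coords, relia, (0.5, 0.25))
```

Kriging proper would replace the fixed inverse-distance weights with weights derived from the fitted spatial covariance, and would also yield a prediction variance.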
Application of a risk management system to improve drinking water safety.
Jayaratne, Asoka
2008-12-01
The use of a comprehensive risk management framework is considered a very effective means of managing water quality risks. There are many risk-based systems available to water utilities such as ISO 9001 and Hazard Analysis and Critical Control Point (HACCP). In 2004, the World Health Organization's (WHO) Guidelines for Drinking Water Quality recommended the use of preventive risk management approaches to manage water quality risks. This paper describes the framework adopted by Yarra Valley Water for the development of its Drinking Water Quality Risk Management Plan incorporating HACCP and ISO 9001 systems and demonstrates benefits of Water Safety Plans such as HACCP. Copyright IWA Publishing 2008.
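Preventive frameworks such as HACCP-based Water Safety Plans typically rank hazards on a likelihood-consequence matrix before assigning control points. A minimal sketch; the hazards, rating scales, and band cut-offs below are all hypothetical.

```python
# Illustrative risk-matrix scoring as used in water safety planning.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
CONSEQUENCE = {"insignificant": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

def risk_score(likelihood, consequence):
    """Score = likelihood rating x consequence rating (1..25)."""
    return LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]

def risk_band(score):
    """Map a score to a qualitative band; cut-offs are illustrative."""
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

hazards = [
    ("pathogen intrusion at reservoir", "possible", "major"),
    ("taste/odour complaint", "likely", "minor"),
]
# Rank hazards so the highest-risk ones get control measures first.
for name, l, c in sorted(hazards, key=lambda h: -risk_score(h[1], h[2])):
    s = risk_score(l, c)
    print(name, s, risk_band(s))
```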
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-29
...The Food and Drug Administration (FDA) is proposing regulations for domestic and foreign facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to establish requirements for current good manufacturing practice in manufacturing, processing, packing, and holding of animal food. FDA also is proposing regulations to require that certain facilities establish and implement hazard analysis and risk-based preventive controls for food for animals. FDA is taking this action to provide greater assurance that animal food is safe and will not cause illness or injury to animals or humans and is intended to build an animal food safety system for the future that makes modern, science and risk-based preventive controls the norm across all sectors of the animal food system.
A method for scenario-based risk assessment for robust aerospace systems
NASA Astrophysics Data System (ADS)
Thomas, Victoria Katherine
In years past, aircraft conceptual design centered on creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, recognizing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created.
The four steps involved in completing the modeling and simulation are: Alternative Solution Modeling, Uncertainty Quantification, Risk Assessment, and Risk Mitigation. Focus area three consists of Decision Support. In this area a decision support interface is created that allows for game playing between solution alternatives and risk mitigation. A multi-attribute decision making process is also implemented to aid in decision making. A demonstration problem inspired by Airbus's mid-1980s decision to break into the widebody long-range market was developed to illustrate the use of this method. The results showed that the method is able to capture more types of risk than previous analysis methods, particularly at the early stages of aircraft design. It was also shown that the method can be used to help create a system that is robust to external environmental factors. The addition of an external environment risk analysis in the early stages of conceptual design can add another dimension to the analysis of feasibility and viability. The ability to take risk into account during the early stages of the design process can allow for the elimination of potentially feasible and viable but too-risky alternatives. The addition of a scenario-based analysis instead of a traditional probabilistic analysis enabled uncertainty to be effectively bound and examined over a variety of potential futures instead of only a single future. There is also potential for a product to be groomed for a specific future that one believes is likely to happen, or for a product to be steered during design as the future unfolds.
Risk-Based Prioritization of Research for Aviation Security Using Logic-Evolved Decision Analysis
NASA Technical Reports Server (NTRS)
Eisenhawer, S. W.; Bott, T. F.; Sorokach, M. R.; Jones, F. P.; Foggia, J. R.
2004-01-01
The National Aeronautics and Space Administration is developing advanced technologies to reduce terrorist risk for the air transportation system. Decision support tools are needed to help allocate assets to the most promising research. An approach to rank ordering technologies (using logic-evolved decision analysis), with risk reduction as the metric, is presented. The development of a spanning set of scenarios using a logic-gate tree is described. Baseline risk for these scenarios is evaluated with an approximate reasoning model. Illustrative risk and risk reduction results are presented.
NASA Astrophysics Data System (ADS)
Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai
2018-05-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with univariate and bivariate hydrological risk analyses. Two catchments located on the Loess plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The univariate and bivariate return periods, risk and reliability in the context of uncertainty for the purposes of flood control and sediment transport are assessed for the study regions. The results indicate that, in the UCX and UCH, sedimentation poses a higher risk to the safety of local flood control systems than the event that the annual maximum flood (AMF) exceeds the design flood of downstream hydraulic structures. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges measures of future flood mitigation. In addition, results also confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios aiming for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
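Copula-based bivariate frameworks of this kind rest on standard joint return-period formulas. A minimal sketch using a Gumbel-Hougaard copula; the abstract does not specify the copula family, and the marginal probabilities and dependence parameter below are illustrative.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_periods(u, v, theta, mu=1.0):
    """OR/AND joint return periods for annual maxima (mu = mean interarrival, years).

    u, v are the non-exceedance probabilities of the two variables
    (e.g. flood peak and sediment load) at their design thresholds.
    """
    c = gumbel_copula(u, v, theta)
    t_or = mu / (1.0 - c)                 # either variable exceeds its threshold
    t_and = mu / (1.0 - u - v + c)        # both variables exceed their thresholds
    return t_or, t_and

# Two 100-year marginal events (u = v = 0.99) with moderate dependence.
t_or, t_and = joint_return_periods(0.99, 0.99, theta=2.0)
print(round(t_or, 1), round(t_and, 1))
```

As expected, the AND-event is rarer than the OR-event, and both bracket the 100-year marginal return period.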
NASA Astrophysics Data System (ADS)
Hu, Xiaojing; Li, Qiang; Zhang, Hao; Guo, Ziming; Zhao, Kun; Li, Xinpeng
2018-06-01
Based on the Monte Carlo method, an improved risk assessment method for hybrid AC/DC power system with VSC station considering the operation status of generators, converter stations, AC lines and DC lines is proposed. According to the sequential AC/DC power flow algorithm, node voltage and line active power are solved, and then the operation risk indices of node voltage over-limit and line active power over-limit are calculated. Finally, an improved two-area IEEE RTS-96 system is taken as a case to analyze and assess its operation risk. The results show that the proposed model and method can intuitively and directly reflect the weak nodes and weak lines of the system, which can provide some reference for the dispatching department.
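Monte Carlo risk indices of this type combine the probability of an over-limit event with its severity. A toy sketch in which a linear voltage proxy stands in for the paper's sequential AC/DC power flow; the component availabilities, voltage limits, and drop figures are all hypothetical.

```python
import random

random.seed(42)

# Hypothetical two-line feed: each line is available with the given probability.
P_LINE_UP = [0.98, 0.95]
BASE_VOLTAGE = 1.0
DROP_PER_OUTAGE = 0.06          # p.u. voltage drop per lost line (illustrative)
V_MIN = 0.92                    # lower voltage limit

def sample_voltage():
    """Sample line states and return the (proxy) node voltage."""
    outages = sum(random.random() > p for p in P_LINE_UP)
    return BASE_VOLTAGE - DROP_PER_OUTAGE * outages

def over_limit_risk(n_samples=100_000):
    """Risk index = probability of a violation x mean severity of violation."""
    violations = []
    for _ in range(n_samples):
        v = sample_voltage()
        if v < V_MIN:
            violations.append(V_MIN - v)
    prob = len(violations) / n_samples
    severity = sum(violations) / len(violations) if violations else 0.0
    return prob * severity

print(over_limit_risk())
```

In a real study the voltage proxy would be replaced by a power-flow solution, and the same sampling loop would also accumulate line active-power over-limit indices.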
Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems
NASA Technical Reports Server (NTRS)
Fitz, Rhonda
2017-01-01
As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance (OSMA) defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency.
This paper discusses the development of an improved workflow process utilizing this database for adaptive, risk-informed FM assurance that critical software systems will safely and securely protect against faults and respond to ACs in order to achieve successful missions.
Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems
Hedda, Monica; Malin, Bradley A.; Yan, Chao; Fabbri, Daniel
2017-01-01
Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and the minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one week of data shows that specific high-risk flags occur more frequently than theoretically expected, and that the rate at which accesses can be explained away with five simple reasons is 16-43%. PMID:29854153
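The evaluation hinges on flagging accesses with a priori rules and then measuring how many flags an appropriate clinical reason can explain away. A toy sketch; the records, rules, and explanation below are all hypothetical, not the paper's rule set.

```python
# Toy evaluation of rule-based audit flags: what fraction of flagged EHR
# accesses can be "explained away" by an appropriate clinical reason?
accesses = [
    {"user_dept": "cardiology", "patient_dept": "cardiology", "same_last_name": False, "had_appointment": True},
    {"user_dept": "billing",    "patient_dept": "cardiology", "same_last_name": True,  "had_appointment": False},
    {"user_dept": "oncology",   "patient_dept": "cardiology", "same_last_name": False, "had_appointment": False},
    {"user_dept": "cardiology", "patient_dept": "cardiology", "same_last_name": True,  "had_appointment": True},
]

def high_risk(a):
    """A priori rules: possible family member, or cross-department access."""
    return a["same_last_name"] or a["user_dept"] != a["patient_dept"]

def explained(a):
    """Simple clinical explanation: the patient had an appointment with the user."""
    return a["had_appointment"]

flagged = [a for a in accesses if high_risk(a)]
rate = sum(explained(a) for a in flagged) / len(flagged)
print(len(flagged), rate)
```

Rules whose flags are mostly explained away contribute little audit value, which is the ranking signal the paper proposes.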
Banchhor, Sumit K; Londhe, Narendra D; Araki, Tadashi; Saba, Luca; Radeva, Petia; Laird, John R; Suri, Jasjit S
2017-12-01
Planning of percutaneous interventional procedures involves a pre-screening and risk stratification of the coronary artery disease. Current screening tools use stand-alone plaque texture-based features and therefore lack the ability to stratify the risk. This IRB approved study presents a novel strategy for coronary artery disease risk stratification using an amalgamation of IVUS plaque texture-based and wall-based measurement features. Due to common genetic plaque makeup, carotid plaque burden was chosen as a gold standard for risk labels during the training phase of the machine learning (ML) paradigm. A cross-validation protocol was adopted to compute the accuracy of the ML framework. A set of 59 plaque texture-based features was padded with six wall-based measurement features to show the improvement in stratification accuracy. The ML system was executed using a principal component analysis-based framework for dimensionality reduction and uses a support vector machine classifier for the training and testing phases. The ML system produced a stratification accuracy of 91.28%, demonstrating an improvement of 5.69% when wall-based measurement features were combined with plaque texture-based features. The fused system showed an improvement in mean sensitivity, specificity, positive predictive value, and area under the curve by: 6.39%, 4.59%, 3.31% and 5.48%, respectively when compared to the stand-alone system. While meeting the stability criteria of 5%, the ML system also showed a high average feature retaining power and mean reliability of 89.32% and 98.24%, respectively. The ML system showed an improvement in risk stratification accuracy when the wall-based measurement features were fused with the plaque texture-based features. Copyright © 2017 Elsevier Ltd. All rights reserved.
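The pipeline couples PCA-based dimensionality reduction with a supervised classifier. A minimal sketch on synthetic data; the paper's IVUS features are not reproduced here, and a nearest-centroid classifier stands in for its support vector machine.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 59 texture + 6 wall-based features, two risk classes.
n, d = 200, 65
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, d))
X[:, :3] += 2.0 * y[:, None]        # class signal placed in the first 3 features

# PCA by SVD on centered data, keeping k principal components.
k = 5
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                   # reduced representation

# Nearest-centroid classifier as a lightweight stand-in for the paper's SVM.
c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
pred = (np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)).astype(int)
accuracy = (pred == y).mean()
print(round(accuracy, 3))
```

The class-separating direction carries extra variance, so PCA retains it among the top components and the classifier recovers most labels; the paper's accuracy figures come from a proper cross-validation, not this in-sample check.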
Likelihood ratio-based integrated personal risk assessment of type 2 diabetes.
Sato, Noriko; Htun, Nay Chi; Daimon, Makoto; Tamiya, Gen; Kato, Takeo; Kubota, Isao; Ueno, Yoshiyuki; Yamashita, Hidetoshi; Fukao, Akira; Kayama, Takamasa; Muramatsu, Masaaki
2014-01-01
To facilitate personalized health care for multifactorial diseases, risks of genetic and clinical/environmental factors should be assessed together for each individual in an integrated fashion. This approach is possible with the likelihood ratio (LR)-based risk assessment system, as this system can incorporate a wide range of tests. We examined the usefulness of this system for assessing type 2 diabetes (T2D). Our system employed 29 genetic susceptibility variants, body mass index (BMI), and hypertension as risk factors whose LRs can be estimated from openly available T2D association data for the Japanese population. The pretest probability was set at a sex- and age-appropriate population average of diabetes prevalence. The classification performance of our LR-based risk assessment was compared to that of a non-invasive screening test for diabetes called TOPICS (with score based on age, sex, family history, smoking, BMI, and hypertension) using receiver operating characteristic analysis with a community cohort (n = 1263). The area under the receiver operating characteristic curve (AUC) for the LR-based assessment and TOPICS was 0.707 (95% CI 0.665-0.750) and 0.719 (0.675-0.762), respectively. These AUCs were much higher than that of a genetic risk score constructed using the same genetic susceptibility variants, 0.624 (0.574-0.674). The use of ethnically matched LRs is necessary for proper personal risk assessment. In conclusion, although LR-based integrated risk assessment for T2D still requires additional tests that evaluate other factors, such as risks involved in missing heritability, our results indicate the potential usability of the LR-based assessment system and stress the importance of stratified epidemiological investigations in personalized medicine.
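An LR-based system of this kind multiplies pretest odds by the likelihood ratio of each factor and converts back to a probability. A minimal sketch; the prevalence and LR values below are illustrative, not the paper's estimates.

```python
def posttest_probability(pretest_prob, likelihood_ratios):
    """Combine a pretest probability with independent test LRs via odds.

    odds = p / (1 - p); posttest odds = pretest odds x product of LRs.
    """
    odds = pretest_prob / (1.0 - pretest_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical example: population prevalence 8%, with LRs from three factors
# (e.g. a risk genotype, elevated BMI, hypertension). Values are illustrative.
p = posttest_probability(0.08, [1.3, 1.8, 1.5])
print(round(p, 3))
```

Because each factor enters as a multiplicative LR, new tests can be appended without refitting the whole model, which is the system's stated advantage.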
On the complex quantification of risk: systems-based perspective on terrorism.
Haimes, Yacov Y
2011-08-01
This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.
Inventory Control System for a Healthcare Apparel Service Centre with Stockout Risk: A Case Analysis
Pan, An; Hui, Chi-Leung
2017-01-01
Based on the real-world inventory control problem of a capacitated healthcare apparel service centre in Hong Kong which provides tailor-made apparel-making services for the elderly and disabled people, this paper studies a partial backordered continuous review inventory control problem in which the product demand follows a Poisson process with a constant lead time. The system is controlled by a (Q, r) inventory policy which incorporates the stockout risk, storage capacity, and partial backlog. The healthcare apparel service centre, under the capacity constraint, aims to minimize the inventory cost while achieving a low stockout risk. To address this challenge, an optimization problem is constructed. A real case-based data analysis is conducted, and the result shows that the expected total cost on an order cycle is reduced substantially, at around 20%, with our proposed optimal inventory control policy. An extensive sensitivity analysis is conducted to generate additional insights. PMID:29527283
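The stockout-risk side of such a (Q, r) policy can be estimated by simulating Poisson lead-time demand against the reorder point. A stdlib-only sketch with hypothetical demand figures; the centre's actual optimization also trades this risk off against holding, ordering, and capacity costs.

```python
import math
import random

random.seed(7)

def poisson_sample(lam):
    """Knuth's method for a Poisson draw (stdlib only)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def stockout_risk(r, demand_rate, lead_time, n_cycles=50_000):
    """Estimate P(lead-time demand > r) for Poisson demand; r is the reorder point."""
    lam = demand_rate * lead_time
    return sum(poisson_sample(lam) > r for _ in range(n_cycles)) / n_cycles

# Hypothetical figures: 2 orders/day, 3-day constant lead time (lam = 6).
for r in (6, 8, 10):
    print(r, stockout_risk(r, demand_rate=2.0, lead_time=3.0))
```

Raising the reorder point r drives the stockout risk down, at the cost of holding more stock against the storage-capacity constraint.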
Fews-Risk: A step towards risk-based flood forecasting
NASA Astrophysics Data System (ADS)
Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline
2015-04-01
Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to pure hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood prone areas and assets at risk in the protected areas are rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two dimensional flood spreading simulation and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as: • Location, timing and probability of failure of defined sections of the flood defence line; • Flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow • Impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damages to critical infrastructure or economy. In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as: • Where should I reinforce my flood defence system? • What type of action can I take to mend a weak spot in my flood defences? • What are the consequences of a breach? • Which areas should I evacuate first? 
This presentation outlines the additional required workflows towards risk-based flood forecasting systems. In a cooperation between HR Wallingford and Deltares, the extended workflows are being integrated into the Delft-FEWS software system. Delft-FEWS provides modules for managing the data handling and forecasting process. Results of a pilot study that demonstrates the new tools are presented. The value of the newly generated information for decision support during a flood event is discussed.
Multi-Mission System Analysis for Planetary Entry (M-SAPE) Version 1
NASA Technical Reports Server (NTRS)
Samareh, Jamshid; Glaab, Louis; Winski, Richard G.; Maddock, Robert W.; Emmett, Anjie L.; Munk, Michelle M.; Agrawal, Parul; Sepka, Steve; Aliaga, Jose; Zarchi, Kerry;
2014-01-01
This report describes an integrated system for Multi-mission System Analysis for Planetary Entry (M-SAPE). The system in its current form is capable of performing system analysis and design for an Earth entry vehicle suitable for sample return missions. The system includes geometry, mass sizing, impact analysis, structural analysis, flight mechanics, TPS, and a web portal for user access. The report includes details of M-SAPE modules and provides sample results. The current M-SAPE vehicle design concept is based on the Mars sample return (MSR) Earth entry vehicle design, which is driven by minimizing risk associated with sample containment (no parachute and passive aerodynamic stability). Because M-SAPE exploits a common design concept, any sample return mission, particularly MSR, will benefit from significant risk and development cost reductions. The design provides a platform by which technologies and design elements can be evaluated rapidly prior to any costly investment commitment.
Kohara, Norihito; Kaneko, Masayuki; Narukawa, Mamoru
2018-01-01
The concept of the risk-based approach has been introduced as an effort to secure the quality of clinical trials. In the risk-based approach, identification and evaluation of risk in advance are considered important. For recently completed clinical trials, we investigated the relationship between study characteristics and protocol deviations leading to the exclusion of subjects from Per Protocol Set (PPS) efficacy analysis. New drugs approved in Japan in the fiscal years 2014-2015 were targeted in the research. The reasons for excluding subjects from the PPS efficacy analysis were described in 102 trials out of 492 in the summary of new drug application documents, which was publicly disclosed after the drug's regulatory approval. The authors extracted these reasons along with the numbers of the cases and the study characteristics of each clinical trial. Then, a direct comparison, univariate regression analysis, and multivariate regression analysis were carried out based on the exclusion rate. The study characteristics for which exclusion of subjects from the PPS efficacy analysis was frequently observed were: multiregional clinical trials in study region; inhalant and external use in administration route; and Anti-infectives for systemic use, Respiratory system, Dermatologicals, and Nervous system in therapeutic drug class under the Anatomical Therapeutic Chemical Classification. In the multivariate regression analysis, the clinical trial variables of inhalant, Respiratory system, or Dermatologicals were selected as study characteristics leading to a higher exclusion rate. Study characteristics likely to cause protocol deviations affecting the efficacy analysis were thus identified.
Such studies should receive specific attention and priority observation in the trial protocol and its monitoring plan and execution, with risk-alleviating measures such as a clear description of inclusion/exclusion criteria in the protocol and the development of training materials for site staff and/or trial subjects.
Risk analysis with a fuzzy-logic approach of a complex installation
NASA Astrophysics Data System (ADS)
Peikert, Tim; Garbe, Heyno; Potthast, Stefan
2016-09-01
This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN) and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle the uncertainty with probability functions and linguistic terms. The linguistic terms add expert knowledge of the investigated system or environment to the risk analysis.
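The fuzzy-logic step maps crisp inputs through membership functions and combines linguistic rules. A minimal Mamdani-style sketch; the membership functions and the two-rule base are illustrative, not the paper's.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_degree(field_strength, coupling):
    """Tiny Mamdani-style rule base (illustrative only).

    Inputs in [0, 1]; output is a crisp risk degree in [0, 1] via a
    weighted average of rule outputs (AND = min).
    """
    fs_high = tri(field_strength, 0.4, 1.0, 1.6)   # "high field strength"
    cp_high = tri(coupling, 0.4, 1.0, 1.6)         # "strong coupling path"
    fs_low = tri(field_strength, -0.6, 0.0, 0.6)   # "low field strength"
    rules = [
        (min(fs_high, cp_high), 0.9),   # IF fs high AND coupling high THEN risk 0.9
        (fs_low, 0.1),                  # IF fs low THEN risk 0.1
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(round(risk_degree(0.8, 0.7), 3))
```

Expert statements such as "strong coupling path" enter only through the membership functions, which is how linguistic terms carry domain knowledge into the quantitative analysis.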
Model-Based Engineering for Supply Chain Risk Management
2015-09-30
Expanded use of commercial components has increased the complexity of system assurance and verification. Model-based engineering (MBE) offers a means to design, develop, analyze, and maintain a complex system architecture.
A modeling framework for exposing risks in complex systems.
Sharit, J
2000-08-01
This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
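Perrow's two axes can be read as a simple quadrant classification. A sketch with invented numeric ratings; the example placements follow Perrow's well-known 1984 discussion, but the scores themselves are illustrative.

```python
# Perrow's framework places systems on two axes: interactive complexity
# and coupling. Ratings here are illustrative values in [0, 1].

def perrow_quadrant(complexity, coupling, cutoff=0.5):
    """Classify a system into one of the four Perrow quadrants."""
    ix = "complex" if complexity >= cutoff else "linear"
    cp = "tight" if coupling >= cutoff else "loose"
    return ix, cp

systems = {
    "nuclear plant": (0.9, 0.9),   # interactively complex, tightly coupled
    "assembly line": (0.2, 0.8),   # linear, tightly coupled
    "university": (0.8, 0.2),      # complex, loosely coupled
    "post office": (0.2, 0.2),     # linear, loosely coupled
}
for name, (cx, cp) in systems.items():
    print(name, perrow_quadrant(cx, cp))
```

In the article's framework, the quadrant a system falls into shapes which error-exposing perspectives matter most, with complex, tightly coupled systems being the hardest to make safe.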
Almekhlafi, M A; Hill, M D; Wiebe, S; Goyal, M; Yavin, D; Wong, J H; Clement, F M
2014-02-01
Carotid revascularization procedures can be complicated by stroke. Additional disability adds to the already high costs of the procedure. To weigh the costs and benefits, we estimated the cost-utility of carotid angioplasty and stenting compared with carotid endarterectomy among patients with symptomatic carotid stenosis, with special emphasis on scenario analyses that would yield carotid angioplasty and stenting as the cost-effective alternative relative to carotid endarterectomy. A cost-utility analysis from the perspective of the health system payer was performed by using a Markov analytic model. Clinical estimates were based on a meta-analysis. The procedural costs were derived from a microcosting database. The costs for hospitalization and rehabilitation of patients with stroke were based on a Canadian multicenter study. Utilities were based on a randomized controlled trial. In the base case analysis, carotid angioplasty and stenting were more expensive (incremental cost of $6107) and had a lower utility (-0.12 quality-adjusted life years) than carotid endarterectomy. The results are sensitive to changes in the risk of clinical events and the relative risk of death and stroke. Carotid angioplasty and stenting were more economically attractive among high-risk surgical patients. For carotid angioplasty and stenting to become the preferred option, their costs would need to fall from more than $7300 to $4350 or less, and the risks of periprocedural and annual minor strokes would have to be equivalent to those of carotid endarterectomy. In the base case analysis, carotid angioplasty and stenting were associated with higher costs and lower utility compared with carotid endarterectomy for patients with symptomatic carotid stenosis. Carotid angioplasty and stenting were cost-effective for patients with high surgical risk.
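The Markov cost-utility comparison can be sketched in a few lines. All transition probabilities, annual care costs, utilities, and the horizon below are illustrative assumptions, not the study's inputs; the point is only the mechanics of accumulating discounted costs and QALYs per strategy.

```python
def markov_cost_utility(p_stroke, p_death, cost_proc, years=10, disc=0.03):
    """Discounted (cost, QALYs) for one strategy over a short horizon.
    States: well (utility 0.8) and post-stroke (utility 0.5, absorbing);
    death occurs only from the well state, a deliberate simplification."""
    well, stroke = 1.0, 0.0
    cost, qaly = cost_proc, 0.0
    for t in range(years):
        d = 1.0 / (1.0 + disc) ** t
        qaly += d * (well * 0.8 + stroke * 0.5)
        cost += d * stroke * 20000.0  # assumed annual post-stroke care cost
        well, stroke = well * (1.0 - p_stroke - p_death), stroke + well * p_stroke
    return cost, qaly

# Illustrative parameters: stenting (CAS) has a higher procedure cost and
# a higher annual stroke probability than endarterectomy (CEA).
cas = markov_cost_utility(p_stroke=0.04, p_death=0.02, cost_proc=7300.0)
cea = markov_cost_utility(p_stroke=0.02, p_death=0.02, cost_proc=6000.0)
incremental_cost = cas[0] - cea[0]   # positive: CAS costs more
incremental_qaly = cas[1] - cea[1]   # negative: CAS yields fewer QALYs
```

Under these assumed inputs the sketch reproduces the qualitative base-case finding: the stenting arm is dominated (more expensive, lower utility).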
Simple Scoring System to Predict In-Hospital Mortality After Surgery for Infective Endocarditis.
Gatti, Giuseppe; Perrotti, Andrea; Obadia, Jean-François; Duval, Xavier; Iung, Bernard; Alla, François; Chirouze, Catherine; Selton-Suty, Christine; Hoen, Bruno; Sinagra, Gianfranco; Delahaye, François; Tattevin, Pierre; Le Moing, Vincent; Pappalardo, Aniello; Chocron, Sidney
2017-07-20
Aspecific scoring systems are used to predict the risk of death postsurgery in patients with infective endocarditis (IE). The purpose of the present study was both to analyze the risk factors for in-hospital death, which complicates surgery for IE, and to create a mortality risk score based on the results of this analysis. Outcomes of 361 consecutive patients (mean age, 59.1±15.4 years) who had undergone surgery for IE in 8 European centers of cardiac surgery were recorded prospectively, and a risk factor analysis (multivariable logistic regression) for in-hospital death was performed. The discriminatory power of the new predictive scoring system was assessed with receiver operating characteristic curve analysis. Score validation procedures were carried out. Fifty-six (15.5%) patients died postsurgery. BMI >27 kg/m² (odds ratio [OR], 1.79; P=0.049), estimated glomerular filtration rate <50 mL/min (OR, 3.52; P<0.0001), New York Heart Association class IV (OR, 2.11; P=0.024), systolic pulmonary artery pressure >55 mm Hg (OR, 1.78; P=0.032), and critical state (OR, 2.37; P=0.017) were independent predictors of in-hospital death. A scoring system was devised to predict in-hospital death postsurgery for IE (area under the receiver operating characteristic curve, 0.780; 95% CI, 0.734-0.822). The score performed better than 5 of 6 scoring systems for in-hospital death after cardiac surgery that were considered. A simple scoring system based on risk factors for in-hospital death was specifically created to predict mortality risk postsurgery in patients with IE. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
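One common way to turn regression odds ratios like those above into a simple additive point score is to scale the log-odds-ratios and round. The sketch below uses the paper's reported ORs but an assumed scaling convention (smallest OR = 1 point); it does not reproduce the paper's actual published weights.

```python
import math

# Reported odds ratios for in-hospital death (from the abstract).
ODDS_RATIOS = {
    "bmi_gt_27":      1.79,
    "egfr_lt_50":     3.52,
    "nyha_iv":        2.11,
    "spap_gt_55":     1.78,
    "critical_state": 2.37,
}

# Assumed convention: scale ln(OR) so the smallest OR (1.78) maps to 1 point.
POINTS = {k: round(math.log(v) / math.log(1.78)) for k, v in ODDS_RATIOS.items()}

def score(patient):
    """Sum the points for each risk factor present (patient = set of keys)."""
    return sum(POINTS[f] for f in patient)
```

With this convention, renal impairment carries double weight, so a patient with low eGFR and critical state scores 3 points.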
Wang, Lihong; Gong, Zaiwu
2017-10-10
As meteorological disaster systems are large complex systems, disaster reduction programs must be based on risk analysis. Consequently, judgment by an expert based on his or her experience (also known as qualitative evaluation) is an important link in meteorological disaster risk assessment. In some complex and non-procedural meteorological disaster risk assessments, a hesitant fuzzy linguistic preference relation (HFLPR) is often used to deal with a situation in which experts may be hesitant while providing preference information of a pairwise comparison of alternatives, that is, the degree of preference of one alternative over another. This study explores hesitation from the perspective of statistical distributions, and obtains an optimal ranking of an HFLPR based on chance-restricted programming, which provides a new approach for hesitant fuzzy optimisation of decision-making in meteorological disaster risk assessments.
Using software security analysis to verify the secure socket layer (SSL) protocol
NASA Technical Reports Server (NTRS)
Powell, John D.
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative "Reducing Software Security Risk (RSSR) Through an Integrated Approach" offers, among its capabilities, formal verification of software security properties, through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper discusses: the need for formal analysis to assure software systems with respect to software security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.
Default contagion risks in Russian interbank market
NASA Astrophysics Data System (ADS)
Leonidov, A. V.; Rumyantsev, E. L.
2016-06-01
Systemic risks of default contagion in the Russian interbank market are investigated. The analysis is based on the bow-tie structure of the weighted oriented graph describing the structure of interbank loans. A probabilistic model of interbank contagion is developed that explicitly takes into account the empirical bow-tie structure reflecting the functionality of the corresponding nodes (borrowers, lenders, and simultaneous borrowers and lenders), the degree distributions, and the disassortativity of the interbank network under consideration, based on empirical data. The characteristics of contagion-related systemic risk calculated with this model are shown to be in agreement with those of explicit stress tests.
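A minimal sketch of the default-cascade mechanics underlying such contagion models: a lender fails once losses on loans to already-failed counterparties exceed its capital buffer. The toy exposures and capital figures are illustrative assumptions; the paper's model additionally conditions on the empirical bow-tie structure and degree distributions.

```python
def cascade(exposures, capital, initially_failed):
    """exposures[lender][borrower] = loan amount. A surviving bank fails
    once its losses from already-failed borrowers exceed its capital."""
    failed = set(initially_failed)
    changed = True
    while changed:
        changed = False
        for bank, loans in exposures.items():
            if bank in failed:
                continue
            loss = sum(a for borrower, a in loans.items() if borrower in failed)
            if loss > capital[bank]:
                failed.add(bank)
                changed = True
    return failed

# Toy three-bank chain: A lends to B, B lends to C.
exposures = {"A": {"B": 5}, "B": {"C": 8}, "C": {}}
capital = {"A": 6, "B": 4, "C": 2}
failed = cascade(exposures, capital, {"C"})  # C's default topples B; A survives
```

Running the cascade from C's default shows one round of contagion (B fails) before the system stabilizes, the basic quantity a stress test aggregates over many shock scenarios.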
1993-03-01
values themselves. The Wools perform risk-adjusted present-value comparisons and compute the ROI using discount factors. The assessment of risk in a...developed X Window system, the de facto industry standard window system in the UNIX environment. An X- terminal’s use is limited to display. It has no...2.1 IT HARDWARE The DOS-based PC used in this analysis costs $2,060. It includes an ASL 486DX-33 Industry Standard Architecture (ISA) computer with 8
Saba, Luca; Jain, Pankaj K; Suri, Harman S; Ikeda, Nobutaka; Araki, Tadashi; Singh, Bikesh K; Nicolaides, Andrew; Shafique, Shoaib; Gupta, Ajay; Laird, John R; Suri, Jasjit S
2017-06-01
Severe atherosclerosis disease in carotid arteries causes stenosis, which in turn leads to stroke. Machine learning systems have previously been developed for plaque wall risk assessment using morphology-based characterization. The fundamental assumption in such systems is the extraction of the grayscale features of the plaque region. Even though these systems have the ability to perform risk stratification, they lack the ability to achieve higher performance due to their inability to select and retain dominant features. This paper introduces a polling-based principal component analysis (PCA) strategy embedded in the machine learning framework to select and retain dominant features, resulting in superior performance. This leads to more stability and reliability. The automated system uses offline image data along with the ground truth labels to generate the parameters, which are then used to transform the online grayscale features to predict the risk of stroke. A set of sixteen grayscale plaque features is computed. Utilizing the cross-validation protocol (K = 10) and the PCA cutoff of 0.995, the machine learning system is able to achieve an accuracy of 98.55% and 98.83% corresponding to the carotid far-wall and near-wall plaques, respectively. The corresponding reliability of the system was 94.56% and 95.63%, respectively. The automated system was validated against the manual risk assessment system, and the precision of merit for the same cross-validation settings and PCA cutoffs is 98.28% and 93.92% for the far and the near wall, respectively. PCA-embedded morphology-based plaque characterization shows a powerful strategy for risk assessment and can be adapted in clinical settings.
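The core PCA step with a cumulative explained-variance cutoff (the paper's 0.995 setting) can be sketched as below. The random matrix is a synthetic stand-in for the sixteen grayscale plaque features; the polling strategy and classifier are not reproduced here.

```python
import numpy as np

def pca_retain(X, cutoff=0.995):
    """Project X onto the fewest principal components whose cumulative
    explained-variance ratio reaches `cutoff`."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]            # sort by descending variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    ratio = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(ratio, cutoff)) + 1  # smallest k reaching cutoff
    return Xc @ eigvecs[:, :k], k

# Synthetic feature matrix with one strongly dominant direction.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X[:, 0] *= 100.0
Z, k = pca_retain(X)
```

Because one direction carries almost all the variance, a single component already clears the 0.995 cutoff, which is exactly the dominant-feature retention effect the paper exploits.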
An Integrated Web-based Decision Support System in Disaster Risk Management
NASA Astrophysics Data System (ADS)
Aye, Z. C.; Jaboyedoff, M.; Derron, M. H.
2012-04-01
Nowadays, web-based decision support systems (DSS) play an essential role in disaster risk management because of their ability to help decision makers improve their performance and make better decisions, without needing to solve complex problems themselves, while reducing human resources and time. Since the decision-making process is one of the main factors which highly influence the damages and losses of society, it is extremely important to make the right decisions at the right time by combining available risk information with advanced web technology for Geographic Information Systems (GIS) and decision support. This paper presents an integrated web-based DSS showing how to use risk information in risk management efficiently and effectively, while highlighting the importance of a decision support system in the field of risk reduction. Beyond conventional systems, it allows users to define their own strategies, from risk identification through to risk reduction, which leads to an integrated approach to risk management. In addition, it also considers the complexity of a changing environment from different perspectives and sectors, with diverse stakeholders involved in the development process. The aim of this platform is to contribute to the natural hazards and geosciences community by developing an open-source web platform where users can analyze risk profiles and make decisions by performing cost-benefit analysis, Environmental Impact Assessment (EIA) and Strategic Environmental Assessment (SEA), with the support of the other tools and resources provided. There are different access rights to the system depending on user profiles and their responsibilities. The system is still under development, and the current version provides map viewing, basic GIS functionality, assessment of important infrastructures (e.g. bridge, hospital, etc.)
affected by landslides and visualization of the impact-probability matrix in terms of socio-economic dimension.
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces a new trade analysis software tool called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increasing the probability of success is to add redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
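The effect of correlated failures on redundancy can be illustrated with a standard bivariate-Bernoulli correlation formula; this is a generic sketch, not SMART's internal model, and the probabilities are assumptions.

```python
def p_system_fail(p, rho):
    """P(both of two redundant units fail) when each fails with probability p
    and the failures have correlation rho (equal marginals):
    P(A and B) = p**2 + rho * p * (1 - p)."""
    return p * p + rho * p * (1 - p)

independent = p_system_fail(0.1, 0.0)  # classic redundancy payoff: 0.01
correlated = p_system_fail(0.1, 0.5)   # common-cause coupling erodes it: 0.055
```

With even moderate correlation, the redundant pair is over five times more likely to fail jointly than the independence assumption suggests, which is why a quantitative trade tool must model the coupling explicitly.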
NASA Astrophysics Data System (ADS)
He, Fang; Chen, Xi
2016-11-01
The accelerating accumulation and risk concentration of Chinese local financing platform debts have attracted wide attention throughout the world. Due to the network of financial exposures among institutions, the failure of several platforms or regions of systemic importance will probably trigger systemic risk and destabilize the financial system. However, the complex network of credit relationships in Chinese local financing platforms at the state level remains unknown. To fill this gap, we presented the first complex network and hierarchical cluster analysis of the credit market of Chinese local financing platforms using the "bottom up" method from firm-level data. Based on the balance-sheet channel, we analyzed the topology and taxonomy by applying the analysis paradigm of subdominant ultrametric space to empirical data from 2013. It is remarked that we chose to extract the network of co-financed financing platforms in order to evaluate the effect of risk contagion from platforms to the banking system. We used a new credit similarity measure, combining the factors of connectivity and size, to extract minimal spanning trees (MSTs) and hierarchical trees (HTs). We found that: (1) the degree distributions of the credit correlation backbone structure of Chinese local financing platforms are fat tailed, and the structure is unstable with respect to targeted failures; (2) the backbone is highly hierarchical, and largely explained by geographic region; (3) the credit correlation backbone structure based on connectivity and size is significantly heterogeneous; (4) key platforms and regions of systemic importance, and the contagion path of systemic risk, are obtained, which contribute to preventing systemic risk and regional risk of Chinese local financing platforms and preserving financial stability under the framework of macroprudential supervision. Our approach of credit similarity measurement provides a means of recognizing "systemically important" institutions and regions for a targeted policy with risk minimization, which gives flexible and comprehensive consideration to both aspects of "too big to fail" and "too central to fail".
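The MST-extraction step behind such backbone analyses can be sketched with the standard similarity-to-distance mapping d = sqrt(2(1 - c)) and Kruskal's algorithm. The 4x4 matrix below is a toy assumption standing in for the platforms' credit correlations, not the paper's data or its connectivity-and-size measure.

```python
import math

def mst_edges(corr, labels):
    """Kruskal's algorithm on the metric d_ij = sqrt(2 * (1 - c_ij))."""
    n = len(labels)
    edges = sorted(
        (math.sqrt(2 * (1 - corr[i][j])), i, j)
        for i in range(n) for j in range(i + 1, n)
    )
    parent = list(range(n))
    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((labels[i], labels[j]))
    return tree

# Toy correlations: two tight clusters {P1, P2} and {P3, P4}.
corr = [[1.0, 0.9, 0.2, 0.1],
        [0.9, 1.0, 0.3, 0.2],
        [0.2, 0.3, 1.0, 0.8],
        [0.1, 0.2, 0.8, 1.0]]
tree = mst_edges(corr, ["P1", "P2", "P3", "P4"])
```

The resulting tree keeps the strongest links inside each cluster and one bridge between them, which is the backbone from which hierarchical trees and "central" nodes are then read off.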
Lee, Hunjoo; Lee, Kiyoung; Park, Ji Young; Min, Sung-Gi
2017-05-01
With support from the Korean Ministry of the Environment (ME), our interdisciplinary research staff developed the COnsumer Product Exposure and Risk assessment system (COPER). This system includes various databases and features that enable the calculation of exposure and determination of risk caused by consumer product use. COPER is divided into three tiers: the integrated database layer (IDL), the domain-specific service layer (DSSL), and the exposure and risk assessment layer (ERAL). IDL is organized by the form of the raw data (mostly non-aggregated data) and includes four sub-databases: a toxicity profile, an inventory of Korean consumer products, the weight fractions of chemical substances in the consumer products determined by chemical analysis, and national representative exposure factors. DSSL provides web-based information services corresponding to each database within IDL. Finally, ERAL enables risk assessors to perform various exposure and risk assessments, including exposure scenario design via either inhalation or dermal contact, by using or organizing each database in an intuitive manner. This paper outlines the overall architecture of the system and highlights some of the unique features of COPER, based on a visual and dynamic rendering engine for web-based exposure assessment models.
Afuakwah, Charles; Welbury, Richard
2015-11-01
Clinical guidelines recommend that an individual is given a caries risk status, based on analysis of defined clinical and social criteria, before a tailored preventive plan is implemented. The aims were to improve documentation of caries risk assessment (CRA) in a general dental practice setting using a systems-based approach to quality improvement; to investigate the impact of quality improvement efforts on the subsequent design and delivery of preventive care; and to identify barriers to delivery of CRA and provision of preventive care. Data for patients aged 0-16 years were collected over two cycles using standard audit methodology. The first cycle was a retrospective analysis (n = 400) using random sampling; the second cycle was a prospective analysis (n = 513) using consecutive sampling over a 15-week period. Five staff meetings with feedback occurred between cycles. In cycle one, no specific CRA system was identified: CRA status was not widely stated, risk factors were not analysed, and there was variation in the prescription and delivery of preventive strategies. These discrepancies were demonstrable for all four participating dentists and at all ages. In cycle two, 100% recorded CRA, all risk factors were analysed, and individual caries risk was correctly annotated. There was 100% compliance with the protocol for preventive plans. The use of CRA improved documentation of caries risk status, which in turn improved the prescription of age-specific, evidence-based preventive care appropriate to the risk status of the individual. Barriers to the delivery of CRA and the provision of comprehensive preventive care by dentists and other healthcare professionals were identified.
NASA Astrophysics Data System (ADS)
Wang, Y.; Chang, J.; Guo, A.
2017-12-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on flood control systems. Given this focus, a univariate and copula-based bivariate hydrological risk framework focusing on flood control and sediment transport is proposed in the current work. Additionally, the conditional probabilities of occurrence of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula model. Moreover, a Monte Carlo-based algorithm is used to evaluate the uncertainties of univariate and bivariate hydrological risk. Two catchments located on the Loess plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The results indicate that (1) 2-day and 3-day consecutive rainfall are highly correlated with the annual maximum flood discharge (AMF) in UCX and UCH, respectively; and (2) univariate and bivariate return periods, risk and reliability for the purposes of flood control and sediment transport are successfully estimated. Sedimentation triggers higher risks of damaging the safety of local flood control systems compared with the AMF, exceeding the design flood of downstream hydraulic structures in the UCX and UCH. Most importantly, there was considerable sampling uncertainty in the univariate and bivariate hydrologic risk analysis, which would greatly challenge measures of future flood mitigation. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
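The copula-based "AND" return period at the heart of such bivariate frameworks can be sketched with a Gumbel copula. The marginal non-exceedance probabilities u, v and the dependence parameter theta below are illustrative assumptions, not values fitted to the Xianyang or Huaxian data.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v) = exp(-((-ln u)**theta + (-ln v)**theta)**(1/theta));
    theta = 1 recovers independence, larger theta means stronger dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def and_return_period(u, v, theta, mu=1.0):
    """Return period of the joint event {X > x AND Y > y};
    mu is the mean interarrival time in years.
    P(X > x, Y > y) = 1 - u - v + C(u, v)."""
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_and
```

For two 100-year marginals (u = v = 0.99), positive dependence (theta = 2) makes the joint exceedance far more likely than the 10,000-year return period independence would imply, which is why ignoring the flood-sediment dependence understates the compound risk.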
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1993-01-01
Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.
Vulnerability Analysis and Evaluation of Urban Road System in Tianjin
NASA Astrophysics Data System (ADS)
Liu, Y. Q.; Wu, X.
In recent years, with rapid economic development, road construction in China has entered a period of rapid growth. The road transportation network has been expanding and the risk of disasters is increasing. In this paper we study the vulnerability of the urban road system in Tianjin. After analyzing the risk factors affecting urban road system security, including road construction, road traffic and the natural environment, we propose an evaluation index of urban road system vulnerability and establish a corresponding evaluation index system. Based on the results of the analysis and a comprehensive evaluation, improvement measures and suggestions which may reduce the vulnerability of the road system and improve its safety and reliability are proposed.
Zhang, Yan; Zhong, Ming
2013-01-01
Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process and fuzzy comprehensive evaluation. Firstly, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Due to the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weights of each factor, and fuzzy set theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, which provides a more flexible and reliable way to deal with the linguistic uncertainty and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
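The weight-determination step of the analytic hierarchy process can be sketched with the common geometric-mean approximation to the principal eigenvector. The 3x3 pairwise-comparison matrix below (e.g., for three DRASTIC-style factors) is an illustrative assumption, not the paper's elicited judgments.

```python
import math

# Assumed pairwise comparisons: factor 1 is moderately more important than
# factor 2 (3) and strongly more important than factor 3 (5), etc.
A = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]

def ahp_weights(A):
    """Geometric-mean approximation of the AHP priority vector."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in A]
    s = sum(gm)
    return [g / s for g in gm]

w = ahp_weights(A)
```

The resulting weights sum to one and preserve the elicited ordering; in the full method they would then feed the fuzzy comprehensive evaluation.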
NASA Astrophysics Data System (ADS)
Rodak, C. M.; McHugh, R.; Wei, X.
2016-12-01
The development and combination of horizontal drilling and hydraulic fracturing has unlocked unconventional hydrocarbon reserves around the globe. These advances have triggered a number of concerns regarding aquifer contamination and over-exploitation, leading to scientific studies investigating potential risks posed by directional hydraulic fracturing activities. These studies, balanced with potential economic benefits of energy production, are a crucial source of information for communities considering the development of unconventional reservoirs. However, probabilistic quantification of the overall risk posed by hydraulic fracturing at the system level is rare. Here we present the use of fault tree analysis to determine the overall probability of groundwater contamination or over-exploitation, broadly referred to as the probability of failure. The potential utility of fault tree analysis for the quantification and communication of risks is demonstrated with a general application. However, the fault tree design is robust and can handle various combinations of region-specific data pertaining to relevant spatial scales, geological conditions, and industry practices where available. All available data are grouped into quantity- and quality-based impacts and sub-divided based on the stage of the hydraulic fracturing process in which the data are relevant, as described by the USEPA. Each stage is broken down into the unique basic events required for failure; for example, to quantify the risk of an on-site spill we must consider the likelihood, magnitude, composition, and subsurface transport of the spill. The structure of the fault tree described above can be used to render a highly complex system of variables into a straightforward equation for risk calculation based on Boolean logic. This project shows the utility of fault tree analysis for the visual communication of the potential risks of hydraulic fracturing activities on groundwater resources.
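The Boolean-logic collapse of a fault tree into a single failure probability can be sketched as follows. The gate structure loosely mirrors the spill example above, but every probability is an illustrative assumption and the events are treated as independent.

```python
def p_or(*ps):
    """P(at least one of several independent events occurs)."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """P(all independent events occur)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Top event: groundwater impact = (spill AND transport to aquifer) OR over-use.
p_spill_reaches_aquifer = p_and(0.05, 0.2)  # assumed spill and transport probs
p_over_exploitation = 0.01                   # assumed
p_failure = p_or(p_spill_reaches_aquifer, p_over_exploitation)
```

Each basic event's probability can be swapped for region-specific data without changing the tree, which is the flexibility the abstract emphasizes.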
The NASA Space Radiobiology Risk Assessment Project
NASA Astrophysics Data System (ADS)
Cucinotta, Francis A.; Huff, Janice; Ponomarev, Artem; Patel, Zarana; Kim, Myung-Hee
The current first phase (2006-2011) has the three major goals of: 1) optimizing the conventional cancer risk models currently used, based on the double-detriment life table and radiation quality functions; 2) the integration of biophysical models of acute radiation syndromes; and 3) the development of new systems radiation biology models of cancer processes. The first phase also includes continued uncertainty assessment of space radiation environmental models and transport codes, and relative biological effectiveness (RBE) factors, based on flight data and NSRL results, respectively. The second phase (2012-2016) will: 1) develop biophysical models of central nervous system (CNS) risks; 2) achieve comprehensive systems biology models of cancer processes using data from proton and heavy ion studies performed at NSRL; and 3) begin to identify computational models of biological countermeasures. Goals for the third phase (2017-2021) include: 1) the development of a systems biology model of cancer risks for operational use at NASA; 2) development of models of degenerative risks; 3) quantitative models of countermeasure impacts on cancer risks; and 4) individual-based risk assessments. Finally, we will support a decision point to continue NSRL research in support of NASA's exploration goals beyond 2021, and create an archive of NSRL research results for continued analysis. Details on near-term goals, plans for a Web-based data resource of NSRL results, and a space radiation Wikipedia are described.
10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.
Code of Federal Regulations, 2010 CFR
2010-01-01
... analysis of the structures, systems, and components of the reactor to be manufactured, with emphasis upon... assumed for this evaluation should be based upon a major accident, hypothesized for purposes of site... structures, systems, and components with the objective of assessing the risk to public health and safety...
An Online Risk Monitor System (ORMS) to Increase Safety and Security Levels in Industry
NASA Astrophysics Data System (ADS)
Zubair, M.; Rahman, Khalil Ur; Hassan, Mehmood Ul
2013-12-01
The main idea of this research is to develop an Online Risk Monitor System (ORMS) based on Living Probabilistic Safety Assessment (LPSA). The article highlights the essential features and functions of ORMS. The basic models and modules such as, Reliability Data Update Model (RDUM), running time update, redundant system unavailability update, Engineered Safety Features (ESF) unavailability update and general system update have been described in this study. ORMS not only provides quantitative analysis but also highlights qualitative aspects of risk measures. ORMS is capable of automatically updating the online risk models and reliability parameters of equipment. ORMS can support in the decision making process of operators and managers in Nuclear Power Plants.
Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini
The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
Neuromorphic Computing for Very Large Test and Evaluation Data Analysis
2014-05-01
analysis and utilization of newly available hardware-based artificial neural network chips. These two aspects of the program are complementary. The...neuromorphic architectures research focused on long-term disruptive technologies with high risk but revolutionary potential. The hardware-based neural...today. Overall, hardware-based neural processing research allows us to study the fundamental system and architectural issues relevant for employing
NASA Astrophysics Data System (ADS)
Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao
2017-05-01
Cable overheating reduces the cable insulation level, speeds up cable insulation aging, and can even cause short-circuit faults. Cable overheating risk identification and warning are necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method, based on data from the distribution SCADA system, to improve impedance parameter estimation accuracy. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from future forecast data from the distribution SCADA system. Thirdly, a rules library for cable overheating risk warning is established; the cable impedance forecast value is calculated, the rate of change of impedance is analysed, and the overheating risk of the cable line is then flagged according to the warning rules library, based on the relationship between impedance variation and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in a distribution network. The overheating risk warnings can provide a decision basis for operation, maintenance and repair.
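The least-squares impedance-estimation step can be sketched with a simplified real-valued model fitted to synthetic SCADA-style samples. The resistance value, noise level, and warning threshold are all illustrative assumptions, and the single-parameter model v = R·I stands in for the paper's full impedance model.

```python
import numpy as np

# Synthetic current/voltage-drop samples for one cable section.
rng = np.random.default_rng(1)
true_r = 0.12                                 # ohms (assumed "true" resistance)
current = rng.uniform(50.0, 200.0, size=200)  # amperes
v_drop = true_r * current + rng.normal(0.0, 0.05, size=200)  # measurement noise

# Least-squares fit of the linear model v_drop = R * I.
A = current.reshape(-1, 1)
r_hat, *_ = np.linalg.lstsq(A, v_drop, rcond=None)

# Rule-library style check: flag overheating if the estimated impedance
# exceeds an assumed threshold derived from historical data.
overheating_risk = r_hat[0] > 0.15
```

The estimate recovers the assumed resistance closely, and the threshold comparison is the simplest form of the rule-based warning the abstract describes.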
USEPA’s Land‐Based Materials Management Exposure and Risk Assessment Tool System
It is recognized that some kinds of 'waste' materials can in fact be reused as input materials for making safe products that benefit society. RIMM (Risk-Informed Materials Management) provides an integrated data gathering and analysis capability to enable scientifically rigorous ...
Fatality Reduction by Air Bags: Analyses of Accident Data through Early 1996
DOT National Transportation Integrated Search
1996-08-01
The fatality risk of front-seat occupants of passenger cars and light trucks equipped with air bags is compared to the corresponding risk in similar vehicles without air bags, based on statistical analysis of Fatal Accident Reporting System (FARS) dat...
Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H
2017-10-01
Recent studies have shown that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model that accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, in our case study an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
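As a rough illustration of how the first two decision models can diverge, the following sketch compares a risk-neutral expected-utility agent with a prospect-theory agent (Tversky-Kahneman loss aversion and probability weighting). All parameter values are invented for demonstration, not taken from the article:

```python
# Illustrative sketch (not the authors' model): compare how an expected
# utility (EU) agent and a prospect theory (PT) agent value investing in
# a flood-proofing measure. All parameter values are assumptions.
p_flood = 0.01          # annual flood probability
damage = 100_000        # loss if flooded, unprotected
reduction = 0.8         # fraction of damage avoided by the measure
cost = 1_000            # annual cost of the measure

# EU agent with risk-neutral utility: invest if expected avoided loss > cost.
eu_gain = p_flood * damage * reduction - cost
eu_invests = eu_gain > 0

# PT agent: losses loom larger (loss aversion lambda) and small
# probabilities are overweighted via a probability weighting function.
lam = 2.25
gamma = 0.69

def weight(p):  # Tversky-Kahneman probability weighting
    return p**gamma / (p**gamma + (1 - p)**gamma)**(1 / gamma)

pt_value_no_invest = -lam * weight(p_flood) * damage
pt_value_invest = -lam * weight(p_flood) * damage * (1 - reduction) - cost
pt_invests = pt_value_invest > pt_value_no_invest
print(eu_invests, pt_invests)
```

With these numbers the EU agent declines the investment (expected avoided loss of 800 is below the 1,000 cost) while the PT agent, overweighting the small flood probability, accepts it, which is the kind of behavioral divergence the abstract describes.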
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
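A minimal example of the sampling-based (Monte Carlo) approach mentioned above, applied to the standard ingestion-dose risk form Risk = C·IR·EF·ED·SF/(BW·AT); the distributions and slope factor below are illustrative assumptions, not values from the review:

```python
import numpy as np

# Minimal Monte Carlo sketch of DBP health-risk uncertainty. All
# distribution choices and the slope factor are illustrative assumptions.
rng = np.random.default_rng(42)
n = 100_000
C = rng.lognormal(mean=np.log(50), sigma=0.4, size=n)  # ug/L, DBP concentration
IR = rng.normal(2.0, 0.3, size=n).clip(min=0.5)        # L/day, water intake
BW = rng.normal(70, 10, size=n).clip(min=40)           # kg, body weight
SF = 6.1e-3                                            # (mg/kg-day)^-1, slope factor (assumed)
EF, ED, AT = 365, 70, 70 * 365                         # days/yr, yr, days

dose = C * 1e-3 * IR * EF * ED / (BW * AT)             # mg/kg-day
risk = dose * SF
print(f"median risk = {np.median(risk):.2e}, "
      f"95th pct = {np.percentile(risk, 95):.2e}")
```

The non-probabilistic alternatives the review lists (interval analysis, fuzzy sets) would instead propagate bounds or membership functions through the same dose equation rather than random samples.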
Some considerations on the definition of risk based on concepts of systems theory and probability.
Andretta, Massimo
2014-07-01
The concept of risk has been applied in many modern science and technology fields. Despite its successes in many applicative fields, there is still not a well-established vision and universally accepted definition of the principles and fundamental concepts of the risk assessment discipline. As emphasized recently, the risk fields suffer from a lack of clarity on their scientific bases that can define, in a unique theoretical framework, the general concepts in the different areas of application. The aim of this article is to make suggestions for another perspective of risk definition that could be applied and, in a certain sense, generalize some of the previously known definitions (at least in the fields of technical and scientific applications). By drawing on my experience of risk assessment in different applicative situations (particularly in the risk estimation for major industrial accidents, and in the health and ecological risk assessment for contaminated sites), I would like to revise some general and foundational concepts of risk analysis in as consistent a manner as possible from the axiomatic/deductive point of view. My proposal is based on the fundamental concepts of the systems theory and of the probability. In this way, I try to frame, in a single, broad, and general theoretical context some fundamental concepts and principles applicable in many different fields of risk assessment. I hope that this article will contribute to the revitalization and stimulation of useful discussions and new insights into the key issues and theoretical foundations of risk assessment disciplines. © 2013 Society for Risk Analysis.
Rong, Hao; Tian, Jin
2015-05-01
The study contributes to human reliability analysis (HRA) by proposing a method that focuses more on human error causality within a sociotechnical system, illustrating its rationality and feasibility with a case study of a Minuteman (MM) III missile accident. Due to the complexity and dynamics of sociotechnical systems, previous analyses of accidents involving human and organizational factors clearly demonstrated that methods using a sequential accident model are inadequate for analyzing human error within a sociotechnical system. The system-theoretic accident model and processes (STAMP) was used to develop a universal framework for human error causal analysis. To elaborate the causal relationships and demonstrate the dynamics of human error, system dynamics (SD) modeling was conducted based on the framework. A total of 41 contributing factors, categorized into four types of human error, were identified through the STAMP-based analysis. All factors relate to a broad view of sociotechnical systems and are more comprehensive than the causes presented in the officially issued accident investigation report. Recommendations for both technical and managerial improvements to lower the risk of the accident are proposed. The interdisciplinary approach provides complementary support between system safety and human factors. The integrated method based on STAMP and the SD model contributes to HRA effectively. The proposed method will be beneficial to HRA, risk assessment, and control of the MM III operating process, as well as other sociotechnical systems. © 2014, Human Factors and Ergonomics Society.
Fielding, L M; Ellis, L; Beveridge, C; Peters, A C
2005-04-01
To reduce foodborne illnesses, hazard- and risk-based quality management systems are essential. Small and medium-sized companies (SMEs) tend to have a poor understanding of such systems and limited adoption of the Hazard Analysis Critical Control Point (HACCP) system. The requirement for full HACCP implementation by 2006 will place an even greater burden on these businesses. The aim of this project was to assess the current levels of understanding of hazards and risks in SMEs in the manufacturing sector. A questionnaire survey of 850 SMEs, including microbusinesses, was conducted. This determined the industry sector and the processes carried out, whether the company operated hazard-based quality management, and the knowledge of the technical manager regarding the associated hazards and risks. Follow-up visits to the manufacturing plants observed the processes and the operatives to determine their level of understanding. A benchmarking audit was carried out and each company was rated. The results show that the majority of respondents stated that they operated hazard analysis-based quality management. The ability of the respondents to correctly define a hazard or risk, or to identify different types of hazard, was, however, poor. There was no correlation between business type and audit score. The microbusinesses did, however, perform significantly less well than the larger SMEs.
3MRA UNCERTAINTY AND SENSITIVITY ANALYSIS
This presentation discusses the Multimedia, Multipathway, Multireceptor Risk Assessment (3MRA) modeling system. The outline of the presentation is: modeling system overview - 3MRA versions; 3MRA version 1.0; national-scale assessment dimensionality; SuperMUSE: windows-based super...
NASA Technical Reports Server (NTRS)
2002-01-01
Under a Phase II SBIR contract, Kennedy and Lumina Decision Systems, Inc., jointly developed the Schedule and Cost Risk Analysis Modeling (SCRAM) system, based on a version of Lumina's flagship software product, Analytica(R). Acclaimed as "the best single decision-analysis program yet produced" by MacWorld magazine, Analytica is a "visual" tool used in decision-making environments worldwide to build, revise, and present business models, minus the time-consuming difficulty commonly associated with spreadsheets. With Analytica as their platform, Kennedy and Lumina created the SCRAM system in response to NASA's need to identify the importance of major delays in Shuttle ground processing, a critical function in project management and process improvement. As part of the SCRAM development project, Lumina designed a version of Analytica called the Analytica Design Engine (ADE) that can be easily incorporated into larger software systems. ADE was commercialized and utilized in many other developments, including web-based decision support.
Staccini, P; Quaranta, J F; Staccini-Myx, A; Veyres, P; Jambou, P
2003-09-01
Nowadays, the information system is recognised as one of the key elements of management strategy. An information system can be conceptualised as a means of linking three aspects of a firm (structure, organisational rules, and staff). Its design and implementation have to meet the objectives of medical and economic evaluation, especially risk management objectives. In order to identify, analyse, reduce, and prevent the occurrence of adverse events, and also to measure the efficacy and efficiency of the production of care services, the design of information systems should be based on a process analysis that describes and classifies all working practices within the hospital. According to various methodologies (usually top-down analysis), each process can be divided into activities. Each activity (especially each care activity) can be described according to its potential risks and expected results. For care professionals performing a task, access to official or internal guidelines and to adverse event reporting forms also has to be defined. Putting together all the elements of such a process analysis will help integrate risk management into daily practice, supported by the information system.
Morag, Ido; Luria, Gil
2018-04-01
Most studies concerned with participative ergonomic (PE) interventions focus on organizational rather than group-level analysis. By implementing an intervention at a manufacturing plant, the current study, utilizing advanced information systems, measured the effect of line-supervisor leadership on employee exposure to risks. The study evaluated which PE dimensions (i.e., extent of workforce involvement, diversity of reporter role types, and scope of analysis) are related to such exposure at the group level. The data for the study were extracted from two separate computerized systems (workforce medical records of 791 employees and an intranet reporting system) over a two-year period. While the results did not confirm the effect of line-supervisor leadership on subordinates' exposure to risks, they did demonstrate relationships between PE dimensions and employees' exposure to risks. The results support the suggested level of analysis and demonstrate that group-based analysis facilitates the assimilation of preventive interventions. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia
2015-04-01
Climate uncertainty can affect water resources availability and management decisions. Sustainable water resources management therefore requires evaluation of policy and management decisions under a wide range of possible future water supply conditions. This study proposes a risk-based framework to integrate water supply uncertainty into a forward-looking decision making context. To apply this framework, a stochastic reconstruction scheme is used to generate a large ensemble of flow series. For the Rocky Mountain basins considered here, two key characteristics of the annual hydrograph are its annual flow volume and the timing of the seasonal flood peak. These are perturbed to represent natural randomness and potential changes due to future climate. 30-year series of perturbed flows are used as input to the SWAMP model - an integrated water resources model that simulates regional water supply-demand system and estimates economic productivity of water and other sustainability indicators, including system vulnerability and resilience. The simulation results are used to construct 2D-maps of net revenue of a particular water sector; e.g., hydropower, or for all sectors combined. Each map cell represents a risk scenario of net revenue based on a particular annual flow volume, timing of the peak flow, and 200 stochastic realizations of flow series. This framework is demonstrated for a water resources system in the Saskatchewan River Basin (SaskRB) in Saskatchewan, Canada. Critical historical drought sequences, derived from tree-ring reconstructions of several hundred years of annual river flows, are used to evaluate the system's performance (net revenue risk) under extremely low flow conditions and also to locate them on the previously produced 2D risk maps. 
This simulation and analysis framework is repeated under various reservoir operation strategies (e.g., maximizing flood protection or maximizing water supply security); development proposals, such as irrigation expansion; and changes in energy prices. Such risk-based analysis demonstrates the relative reduction or increase of risk associated with management and policy decisions and allows decision makers to explore the relative importance of policy versus natural water supply change in a water resources system.
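The 2D risk-map construction described above can be sketched as follows, with a toy revenue function standing in for the SWAMP model and invented perturbation ranges; each map cell aggregates net revenue over stochastic flow realizations:

```python
import numpy as np

# Hedged sketch of the 2D risk-map idea: each cell pairs an annual flow
# volume multiplier with a peak-timing shift, and aggregates net revenue
# over stochastic realizations. The revenue function is a toy stand-in
# for the integrated water resources model.
rng = np.random.default_rng(7)
volume_mults = np.linspace(0.6, 1.4, 9)  # scaling of annual flow volume
peak_shifts = np.arange(-6, 7, 2)        # weeks, shift of seasonal peak
n_real = 200                             # stochastic realizations per cell

def net_revenue(vol_mult, shift, noise):
    # Toy response: revenue grows with volume, penalized by peak shift.
    return 100 * vol_mult - 2 * abs(shift) + noise

risk_map = np.empty((len(volume_mults), len(peak_shifts)))
for i, vm in enumerate(volume_mults):
    for j, sh in enumerate(peak_shifts):
        noise = rng.normal(0, 5, size=n_real)
        risk_map[i, j] = net_revenue(vm, sh, noise).mean()
print(risk_map.shape)
```

Historical drought sequences would then be located on this map by finding the cell whose volume and timing best match the reconstructed flows.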
[Analysis and research on cleaning points of HVAC systems in public places].
Yang, Jiaolan; Han, Xu; Chen, Dongqing; Jin, Xin; Dai, Zizhu
2010-03-01
To analyze the cleaning points of HVAC systems in public places, and to provide a scientific basis for regulating the cleaning of HVAC systems. Based on survey results on the cleaning of HVAC systems around China over the past three years, we analyze the cleaning points of HVAC systems from various aspects: the major health risk factors of HVAC systems; the strategy for formulating HVAC system cleaning; cleaning methods and acceptance points for the air ducts and components of HVAC systems; on-site and individual protection; waste treatment and the cleaning of removed equipment; inspection of the cleaning results; video recording; and the final acceptance of the cleaning. An analysis of the major health risk factors of HVAC systems and of the strategy for formulating their cleaning is given. Specific methods for cleaning the air ducts, machine units, air ports, coil pipes, and water cooling towers of HVAC systems are proposed, together with the acceptance points of HVAC systems and the requirements of the report on the final acceptance of the cleaning. By analyzing the cleaning points of HVAC systems and proposing corresponding measures, this study provides a basis for the scientific and regular conduct of HVAC system cleaning, a novel technical service, and lays a foundation for revising the existing cleaning regulations, which may generate technical and social benefits.
Approach to proliferation risk assessment based on multiple objective analysis framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrianov, A.; Kuptsov, I.; Studgorodok 1, Obninsk, Kaluga region, 249030
2013-07-01
The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
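A possible minimal form of the multi-criteria scoring behind such attractiveness rankings is a weighted sum over normalized indicators; the criteria, weights, and scenario scores below are invented for illustration and do not reflect the paper's actual indicators:

```python
# Illustrative multi-criteria scoring sketch: rank proliferation
# scenarios by a weighted sum of normalized attractiveness indicators.
# Criteria, weights, and scores are invented for demonstration only.
weights = {"material_quality": 0.4, "technical_difficulty": 0.35,
           "detectability": 0.25}

# Each indicator normalized to [0, 1]; higher = more attractive to a
# proliferator (i.e., higher risk).
scenarios = {
    "clandestine_enrichment": {"material_quality": 0.9,
                               "technical_difficulty": 0.3,
                               "detectability": 0.5},
    "diversion_from_fuel_cycle": {"material_quality": 0.6,
                                  "technical_difficulty": 0.7,
                                  "detectability": 0.4},
}

def attractiveness(indicators):
    return sum(weights[k] * indicators[k] for k in weights)

ranking = sorted(scenarios, key=lambda s: attractiveness(scenarios[s]),
                 reverse=True)
print(ranking)
```

A full multi-objective treatment would keep the criteria separate and look for Pareto-dominant scenarios instead of collapsing them into one weighted score.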
NASA Technical Reports Server (NTRS)
1972-01-01
Nuclear safety analysis as applied to a space base mission is presented. The nuclear safety analysis document summarizes the mission and the credible accidents/events which may lead to nuclear hazards to the general public. The radiological effects and associated consequences of the hazards are discussed in detail. The probability of occurrence is combined with the potential number of individuals exposed to or above guideline values to provide a measure of accident and total mission risk. The overall mission risk has been determined to be low with the potential exposure to or above 25 rem limited to less than 4 individuals per every 1000 missions performed. No radiological risk to the general public occurs during the prelaunch phase at KSC. The most significant risks occur from prolonged exposure to reactor debris following land impact generally associated with the disposal phase of the mission where fission product inventories can be high.
The safer clinical systems project in renal care.
Weale, Andy R
2013-09-01
Current systems in place in healthcare are designed to detect harm after it has happened (e.g. critical incident reports) and make recommendations based on an assessment of that event. Safer Clinical Systems, a Health Foundation funded project, is designed to proactively search for risk within systems, rather than reacting to harm. The aim of the Safer Clinical Systems project in Renal Care was to reduce the risks associated with shared care for patients who are undergoing surgery but are looked after peri-operatively by nephrology teams on nephrology wards. This report details our findings from the diagnostic phase of Safer Clinical Systems: the proactive search for risk. We evaluated the current system of care using a set of risk evaluation and process mapping tools (Failure Modes and Effects Analysis (FMEA) and Hierarchical Task Analysis (HTA)). We engaged staff with the process mapping and risk assessment tools. We now understand our system and where the highest-risk tasks are undertaken during a renal in-patient stay in which a patient has an operation. These key tasks occur across the perioperative period and are not confined to one aspect of care. A measurement strategy and intervention plan have been designed around these tasks. Safer Clinical Systems has identified high-risk, low-reliability tasks in our system. We look forward to fully reporting these data in 2014. © 2013 European Dialysis and Transplant Nurses Association/European Renal Care Association.
Hu, Zhongkai; Jin, Bo; Shin, Andrew Y; Zhu, Chunqing; Zhao, Yifan; Hao, Shiying; Zheng, Le; Fu, Changlin; Wen, Qiaojun; Ji, Jun; Li, Zhen; Wang, Yong; Zheng, Xiaolin; Dai, Dorothy; Culver, Devore S; Alfreds, Shaun T; Rogow, Todd; Stearns, Frank; Sylvester, Karl G; Widen, Eric; Ling, Xuefeng B
2015-01-13
An easily accessible real-time Web-based utility to assess patient risks of future emergency department (ED) visits can help health care providers guide the allocation of resources to better manage higher-risk patient populations and thereby reduce unnecessary use of EDs. Our main objective was to develop a Health Information Exchange-based, next-6-month ED risk surveillance system in the state of Maine. Data on electronic medical record (EMR) encounters integrated by HealthInfoNet (HIN), Maine's Health Information Exchange, were used to develop the Web-based surveillance system for population-level prediction of future 6-month ED risk. For model development, a retrospective cohort of 829,641 patients with comprehensive clinical histories from January 1 to December 31, 2012 was used for training; the model was then tested on a prospective cohort of 875,979 patients from July 1, 2012, to June 30, 2013. The multivariate statistical analysis identified 101 variables predictive of the defined future 6-month risk of an ED visit: 4 age groups, history of 8 different encounter types, history of 17 primary and 8 secondary diagnoses, 8 specific chronic diseases, 28 laboratory test results, history of 3 radiographic tests, and history of 25 outpatient prescription medications. The c-statistics for the retrospective and prospective cohorts were 0.739 and 0.732, respectively. Integration of our method into the HIN secure statewide data system in real time prospectively validated its performance. Cluster analysis in both the retrospective and prospective analyses revealed discrete subpopulations of high-risk patients, grouped around multiple "anchoring" demographics and chronic conditions. With the Web-based population risk-monitoring enterprise dashboards, the effectiveness of the active case finding algorithm has been validated by clinicians and caregivers in Maine.
The active case finding model and associated real-time Web-based app were designed to track the evolving nature of total population risk, in a longitudinal manner, for ED visits across all payers, all diseases, and all age groups. Therefore, providers can implement targeted care management strategies to the patient subgroups with similar patterns of clinical histories, driving the delivery of more efficient and effective health care interventions. To the best of our knowledge, this prospectively validated EMR-based, Web-based tool is the first one to allow real-time total population risk assessment for statewide ED visits.
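The c-statistic reported above is the probability that a randomly chosen patient who goes on to visit the ED is ranked above a randomly chosen patient who does not. A self-contained sketch with toy scores and labels (not the HIN cohort data):

```python
# Sketch of the c-statistic (area under the ROC curve) used to evaluate
# a risk model: the fraction of positive/negative pairs in which the
# positive case receives the higher score (ties count as half).
def c_statistic(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted risks and observed outcomes (1 = ED visit within 6 months).
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,    0,   1,   0]
print(round(c_statistic(scores, labels), 3))  # → 0.75
```

A value of 0.5 would mean the model ranks patients no better than chance; the reported 0.73-0.74 indicates useful, though imperfect, discrimination.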
Adeniyi, D A; Wei, Z; Yang, Y
2018-01-30
A wealth of data are available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model by suitably combining the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using a Chi-square case-based reasoning (χ²CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up the execution process through the use of a critical χ² distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item based rule (FIBR) method. This FIBR method is used for selecting the best feature for the proposed χ²CBR model at the preprocessing stage of the predictive procedures. The implementation of the proposed risk calculator is achieved through an in-house developed PHP program hosted on a XAMPP/Apache HTTP server. The process of data acquisition and case base development is implemented using the MySQL application. Performance comparison between our system and the NBY, ED-KNN, ANN, SVM, Random Forest and traditional CBR techniques shows that the quality of predictions produced by our system outperformed the baseline methods studied. The results of our experiments show that the precision rate and predictive quality of our system in most cases are equal to or greater than 70%. Our results also show that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, faster, accurate and efficient risk level prediction to both patients and physicians at any time, online and on a real-time basis.
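The chi-square distance retrieval at the core of a χ²CBR-style model can be sketched as follows; the feature vectors, labels, and nearest-case decision rule are illustrative assumptions, not the authors' implementation:

```python
# Toy sketch of chi-square distance case retrieval: classify a new
# patient record by the risk label of the most similar stored case.
# Feature vectors are invented normalized clinical-attribute histograms.
def chi2_distance(a, b, eps=1e-10):
    return sum((x - y) ** 2 / (x + y + eps) for x, y in zip(a, b))

case_base = [
    ([0.2, 0.5, 0.3], "low risk"),
    ([0.6, 0.3, 0.1], "high risk"),
    ([0.4, 0.4, 0.2], "medium risk"),
]

def predict(query):
    best_case = min(case_base, key=lambda c: chi2_distance(query, c[0]))
    return best_case[1]

print(predict([0.55, 0.35, 0.10]))  # nearest stored case decides the label
```

The paper's speed-up idea of cutting off retrieval at a critical χ² value would correspond here to skipping any stored case whose running distance already exceeds that threshold.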
Brasil, Albert Vincent Berthier; Teles, Alisson R; Roxo, Marcelo Ricardo; Schuster, Marcelo Neutzling; Zauk, Eduardo Ballverdu; Barcellos, Gabriel da Costa; Costa, Pablo Ramon Fruett da; Ferreira, Nelson Pires; Kraemer, Jorge Luiz; Ferreira, Marcelo Paglioli; Gobbato, Pedro Luis; Worm, Paulo Valdeci
2016-10-01
To analyze the cumulative effect of risk factors associated with early major complications in postoperative spine surgery. Retrospective analysis of 583 surgically-treated patients. Early "major" complications were defined as those that may lead to permanent detrimental effects or require further significant intervention. A balanced risk score was built using multiple logistic regression. Ninety-two early major complications occurred in 76 patients (13%). Age > 60 years and surgery of three or more levels proved to be significant independent risk factors in the multivariate analysis. The balanced scoring system was defined as: 0 points (no risk factor), 2 points (1 factor) or 4 points (2 factors). The incidence of early major complications in each category was 7% (0 points), 15% (2 points) and 29% (4 points) respectively. This balanced scoring system, based on two risk factors, represents an important tool for both surgical indication and for patient counseling before surgery.
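The balanced scoring system described above maps directly to a small function; the observed complication rates are taken from the abstract, while the function itself is an illustrative reconstruction:

```python
# Balanced risk score for early major complications after spine surgery:
# 2 points per risk factor present (age > 60 years, surgery of 3+ levels),
# giving scores of 0, 2, or 4. Observed complication incidences per score
# category are those reported in the abstract.
RATES = {0: 0.07, 2: 0.15, 4: 0.29}

def balanced_score(age, n_levels):
    score = 0
    if age > 60:
        score += 2
    if n_levels >= 3:
        score += 2
    return score

def complication_rate(age, n_levels):
    return RATES[balanced_score(age, n_levels)]

print(complication_rate(67, 4))  # both factors present → 0.29
```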
Bahouth, George; Digges, Kennerly; Schulman, Carl
2012-01-01
This paper presents methods to estimate crash injury risk based on crash characteristics captured by some passenger vehicles equipped with Advanced Automatic Crash Notification technology. The resulting injury risk estimates could be used within an algorithm to optimize rescue care. Regression analysis was applied to the National Automotive Sampling System / Crashworthiness Data System (NASS/CDS) to determine how variations in a specific injury risk threshold would influence the accuracy of predicting crashes with serious injuries. The recommended thresholds for classifying crashes with severe injuries are 0.10 for frontal crashes and 0.05 for side crashes. The regression analysis of NASS/CDS indicates that these thresholds will provide sensitivity above 0.67 while maintaining a positive predictive value in the range of 0.20. PMID:23169132
Multi-hazard risk analysis using the FP7 RASOR Platform
NASA Astrophysics Data System (ADS)
Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew
2014-10-01
Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.
NASA Astrophysics Data System (ADS)
Hansen, Christian; Schlichting, Stefan; Zidowitz, Stephan; Köhn, Alexander; Hindennach, Milo; Kleemann, Markus; Peitgen, Heinz-Otto
2008-03-01
Tumor resections from the liver are complex surgical interventions. With recent planning software, risk analyses based on individual liver anatomy can be carried out preoperatively. However, additional tumors within the liver are frequently detected during oncological interventions using intraoperative ultrasound. These tumors are not visible in preoperative data and their existence may require changes to the resection strategy. We propose a novel method that allows an intraoperative risk analysis adaptation by merging newly detected tumors with a preoperative risk analysis. To determine the exact positions and sizes of these tumors we make use of a navigated ultrasound-system. A fast communication protocol enables our application to exchange crucial data with this navigation system during an intervention. A further motivation for our work is to improve the visual presentation of a moving ultrasound plane within a complex 3D planning model including vascular systems, tumors, and organ surfaces. In case the ultrasound plane is located inside the liver, occlusion of the ultrasound plane by the planning model is an inevitable problem for the applied visualization technique. Our system allows the surgeon to focus on the ultrasound image while perceiving context-relevant planning information. To improve orientation ability and distance perception, we include additional depth cues by applying new illustrative visualization algorithms. Preliminary evaluations confirm that in case of intraoperatively detected tumors a risk analysis adaptation is beneficial for precise liver surgery. Our new GPU-based visualization approach provides the surgeon with a simultaneous visualization of planning models and navigated 2D ultrasound data while minimizing occlusion problems.
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng
2015-01-01
Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that was given a lot of emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanism and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps understand failure mechanism and improve component and system design. 
PRA is a system scenario based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail. It discusses their differences and similarities and how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.
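The PRA triplet the abstract describes (what can go wrong, how likely, how severe) can be sketched as a minimal event-tree walk over successive lines of defense. This is an illustrative sketch only: the initiating-event frequency, safeguard names, failure probabilities, and consequence labels are invented assumptions, not figures from any NASA analysis.

```python
def scenario_frequencies(init_freq, safeguards, worst="loss of crew"):
    """Walk a chain of safeguards: each either succeeds (ending the event
    with its stated consequence) or fails, passing the event on. The path
    on which every safeguard fails carries the worst consequence."""
    out = {}
    p_reach = init_freq
    for name, p_fail, consequence in safeguards:
        out["mitigated by " + name] = (p_reach * (1.0 - p_fail), consequence)
        p_reach *= p_fail
    out["all safeguards fail"] = (p_reach, worst)
    return out

# Assumed numbers: initiating-event frequency and per-safeguard failure odds.
scenarios = scenario_frequencies(
    1e-2,
    [("engine shutdown", 1e-2, "degraded mission"),
     ("crew abort", 1e-3, "loss of mission, crew safe")],
)
```

Summing all end-state frequencies recovers the initiating-event frequency, a useful sanity check on any event-tree implementation.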
PRA and Risk Informed Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernsen, Sidney A.; Simonen, Fredric A.; Balkey, Kenneth R.
2006-01-01
The Boiler and Pressure Vessel Code (BPVC) of the American Society of Mechanical Engineers (ASME) has introduced a risk based approach into Section XI that covers Rules for Inservice Inspection of Nuclear Power Plant Components. The risk based approach requires application of probabilistic risk assessments (PRAs). Because no industry consensus standard existed for PRAs, ASME has developed a standard to evaluate the quality level of an available PRA needed to support a given risk based application. The paper describes the PRA standard, Section XI application of PRAs, and plans for broader applications of PRAs to other ASME nuclear codes and standards. The paper addresses several specific topics of interest to Section XI. Important considerations are the special methods (surrogate components) used to overcome the lack of treatment of passive components in PRAs. The approach allows calculation of conditional core damage probabilities both for component failures that cause initiating events and for failures in standby systems that decrease the availability of these systems. The paper relates the explicit risk based methods of the new Section XI code cases to the implicit consideration of risk used in the development of Section XI. Other topics include the needed interactions of ISI engineers, plant operating staff, PRA specialists, and members of expert panels that review the risk based programs.
NASA Technical Reports Server (NTRS)
Smith, H. E.
1990-01-01
Present software development accomplishments are indicative of the emerging interest in and increasing efforts to provide risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.
XPA A23G polymorphism and risk of digestive system cancers: a meta-analysis.
He, Lei; Deng, Tao; Luo, Hesheng
2015-01-01
Several studies have reported an association between the A23G polymorphism (rs1800975) in the xeroderma pigmentosum group A (XPA) gene and risk of digestive system cancers. However, the results are inconsistent. In this study, we performed a meta-analysis to assess the association between XPA A23G polymorphism and the risk of digestive system cancers. Relevant studies were identified using the PubMed, Web of Science, China National Knowledge Infrastructure, WanFang, and VIP databases up to August 30, 2014. The pooled odds ratio (OR) with a 95% confidence interval (CI) was calculated using the fixed or random effects model. A total of 18 case-control studies from 16 publications with 4,170 patients and 6,929 controls were included. Overall, no significant association was found between XPA A23G polymorphism and the risk of digestive system cancers (dominant model: GA + AA versus GG, OR 0.89, 95% CI 0.74-1.08; recessive model: AA versus GA + GG, OR 0.94, 95% CI 0.74-1.20; GA versus GG, OR 0.89, 95% CI 0.77-1.03; and AA versus GG, OR 0.87, 95% CI 0.64-1.19). When the analysis was stratified by ethnicity, similar results were observed among Asians and Caucasians in all genetic models. In stratified analysis based on tumor type, we also failed to detect any association between XPA A23G polymorphism and the risk of esophageal, gastric, or colorectal cancers. This meta-analysis indicates that the XPA A23G polymorphism is not associated with a risk of digestive system cancers.
Bønes, Erlend; Hasvold, Per; Henriksen, Eva; Strandenaes, Thomas
2007-09-01
Instant messaging (IM) is suited for immediate communication because messages are delivered almost in real time. Results from studies of IM use in enterprise work settings suggest that IM-based services may also prove useful within the healthcare sector. However, today's public instant messaging services do not have the level of information security required for adoption of IM in healthcare. We proposed MedIMob, our own architecture for a secure enterprise IM service for use in healthcare. MedIMob supports IM clients on mobile devices in addition to desktop-based clients. Security threats were identified in a risk analysis of the MedIMob architecture. The risk analysis process consists of context identification, threat identification, analysis of consequences and likelihood, risk evaluation, and proposals for risk treatment. The risk analysis revealed a number of potential threats to the information security of a service like this. Many of the identified threats are general when dealing with mobile devices and sensitive data; others are threats which are more specific to our service and architecture. Individual threats identified in the risk analysis are discussed and possible countermeasures presented. The risk analysis showed that most of the proposed risk treatment measures must be implemented to obtain an acceptable risk level, among them blocking much of the additional functionality of the smartphone. To conclude on the usefulness of this IM service, it will be evaluated in a trial study of the human-computer interaction. Further work also includes an improved design of the proposed MedIMob architecture. © 2006 Elsevier Ireland Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, K.M.; Holsten, E.H.; Werner, R.A.
1995-03-01
SBexpert version 1.0 is a knowledge-based decision-support system for management of spruce beetle, developed for use in Microsoft Windows. The user's guide provides detailed instructions on the use of all SBexpert features. SBexpert has four main subprograms: introduction, analysis, textbook, and literature. The introduction is the first of the five subtopics in the SBexpert help system. The analysis topic, the main analytical topic in SBexpert, is an advisory system for spruce beetle management that provides recommendations for reducing spruce beetle hazard and risk to spruce stands. The textbook and literature topics provide complementary decision support for the analysis.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption relates to the likelihood values of input events, and the second concerns interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by, and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
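A minimal sketch of the fuzzy-arithmetic idea the article applies: represent each imprecise input-event likelihood as a triangular fuzzy number (lo, mid, hi) and propagate it through independent AND/OR gates component-wise (both gate functions are monotone, so the bounds are preserved). The basic events and their values are illustrative assumptions; the article's dependency-coefficient treatment is not reproduced here.

```python
def f_and(a, b):
    """Independent AND gate on triangular fuzzy probabilities (lo, mid, hi)."""
    return tuple(x * y for x, y in zip(a, b))

def f_or(a, b):
    """Independent OR gate: 1 - (1 - p)(1 - q), applied component-wise."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

# Assumed triangular fuzzy likelihoods for three basic events.
pump_fail = (0.01, 0.02, 0.04)
valve_fail = (0.005, 0.01, 0.02)
power_loss = (0.001, 0.002, 0.005)

# Top event: (pump fails AND valve fails) OR power loss.
top = f_or(f_and(pump_fail, valve_fail), power_loss)
```

The resulting top-event triple carries the input imprecision through to the output instead of collapsing it to a single crisp probability.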
A Risk-based Assessment And Management Framework For Multipollutant Air Quality
Frey, H. Christopher; Hubbell, Bryan
2010-01-01
The National Research Council recommended both a risk- and performance-based multipollutant approach to air quality management. Specifically, management decisions should be based on minimizing the exposure to, and risk of adverse effects from, multiple sources of air pollution and that the success of these decisions should be measured by how well they achieved this objective. We briefly describe risk analysis and its application within the current approach to air quality management. Recommendations are made as to how current practice could evolve to support a fully risk- and performance-based multipollutant air quality management system. The ability to implement a risk assessment framework in a credible and policy-relevant manner depends on the availability of component models and data which are scientifically sound and developed with an understanding of their application in integrated assessments. The same can be said about accountability assessments used to evaluate the outcomes of decisions made using such frameworks. The existing risk analysis framework, although typically applied to individual pollutants, is conceptually well suited for analyzing multipollutant management actions. Many elements of this framework, such as emissions and air quality modeling, already exist with multipollutant characteristics. However, the framework needs to be supported with information on exposure and concentration response relationships that result from multipollutant health studies. Because the causal chain that links management actions to emission reductions, air quality improvements, exposure reductions and health outcomes is parallel between prospective risk analyses and retrospective accountability assessments, both types of assessment should be placed within a single framework with common metrics and indicators where possible. 
Improvements in risk reductions can be obtained by adopting a multipollutant risk analysis framework within the current air quality management system, e.g. focused on standards for individual pollutants and with separate goals for air toxics and ambient pollutants. However, additional improvements may be possible if goals and actions are defined in terms of risk metrics that are comparable across criteria pollutants and air toxics (hazardous air pollutants), and that encompass both human health and ecological risks. PMID:21209847
Ajisegiri, Whenayon Simeon; Chughtai, Abrar Ahmad; MacIntyre, C Raina
2018-03-01
The 2014 Ebola virus disease (EVD) outbreak affected several countries worldwide, including six West African countries. It was the largest Ebola epidemic in history and the first to affect multiple countries simultaneously. Significant national and international delays in response to the epidemic resulted in 28,652 cases and 11,325 deaths. The aim of this study was to develop a risk analysis framework to prioritize rapid response for situations of high risk. Based on findings from the literature, sociodemographic features of the affected countries, and documented epidemic data, a risk scoring framework using 18 criteria was developed. The framework includes measures of socioeconomics, health systems, geographical factors, cultural beliefs, and traditional practices. The three worst affected West African countries (Guinea, Sierra Leone, and Liberia) had the highest risk scores. The scores were much lower in developed countries that experienced Ebola compared to West African countries. A more complex risk analysis framework using 18 measures was compared with a simpler one with 10 measures, and both predicted risk equally well. A simple risk scoring system can incorporate measures of hazard and impact that may otherwise be neglected in prioritizing outbreak response. This framework can be used by public health personnel as a tool to prioritize outbreak investigation and flag outbreaks with potentially catastrophic outcomes for urgent response. Such a tool could mitigate costly delays in epidemic response. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
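The scoring idea can be sketched as a simple additive rating over criteria with an urgency cut-off. The criteria names, the 0-2 rating scale, and the "urgent" threshold below are hypothetical stand-ins, not the 18 published measures.

```python
def risk_score(ratings):
    """ratings: {criterion: 0, 1, or 2} -- a simple additive score."""
    return sum(ratings.values())

def priority(score, n_criteria):
    """Flag for urgent response when the score exceeds 60% of the maximum."""
    return "urgent" if score / (2 * n_criteria) >= 0.6 else "standard"

# Hypothetical ratings for a Guinea-like setting (2 = highest risk).
guinea_like = {
    "health_system_capacity": 2,
    "border_porosity": 2,
    "burial_practices_contact": 2,
    "urban_density": 1,
    "gdp_per_capita_low": 2,
}
```

The additive form mirrors the abstract's finding that a simpler criteria set predicted risk about as well as a richer one.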
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. Integrated quality risk management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, their frequency, and their detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
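The RPN arithmetic described above is a product of three ordinal scales, used only to rank hazards. The hazards and scale values below are illustrative assumptions, not the blood center's data.

```python
def rpn(severity, occurrence, detectability):
    """Each factor on a 1-10 ordinal scale; a higher detectability score
    means the hazard is harder to detect, so higher is worse on all axes."""
    return severity * occurrence * detectability

# Invented hazards and ratings for illustration only.
hazards = {
    "loss of dose": rpn(9, 4, 6),
    "tracking error": rpn(8, 3, 7),
    "microbial contamination": rpn(10, 2, 4),
}
ranked = sorted(hazards, key=hazards.get, reverse=True)
```

Ranking by RPN directs mitigation effort toward the highest-scoring hazards first, matching the abstract's prioritization of loss of dose and tracking.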
Nouri.Gharahasanlou, Ali; Mokhtarei, Ashkan; Khodayarei, Aliasqar; Ataei, Mohammad
2014-01-01
Evaluating and analyzing risk in the mining industry is a new approach for improving machinery performance. Reliability, safety, and maintenance management based on risk analysis can enhance the overall availability and utilization of mining technological systems. This study investigates the failure occurrence probability of the crushing and mixing bed hall department at the Azarabadegan Khoy cement plant by using the fault tree analysis (FTA) method. The results of the analysis for a 200 h operating interval show that the probability of failure occurrence for the crushing system, the conveyor system, and the crushing and mixing bed hall department is 73%, 64%, and 95%, respectively, and the conveyor belt subsystem was found to be the most failure-prone subsystem. Finally, maintenance is proposed as a method to control and prevent the occurrence of failures. PMID:26779433
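The subsystem-to-department arithmetic behind figures like these can be sketched with a constant-failure-rate (exponential) model and independent subsystems in series. The failure rates below are assumptions tuned only to land near the quoted 73% and 64% values, not the plant's data.

```python
import math

def failure_prob(rate_per_hour, hours):
    """Constant-rate (exponential) model: P(fail by t) = 1 - exp(-lambda*t)."""
    return 1.0 - math.exp(-rate_per_hour * hours)

def series_failure(probs):
    """Independent subsystems in series: the system fails if any one fails."""
    p_survive = 1.0
    for p in probs:
        p_survive *= 1.0 - p
    return 1.0 - p_survive

# Assumed failure rates, tuned only to land near the quoted 73% and 64%.
p_crusher = failure_prob(0.0065, 200)    # ~0.73 over 200 h
p_conveyor = failure_prob(0.0051, 200)   # ~0.64 over 200 h
p_dept = series_failure([p_crusher, p_conveyor])
```

Note how the series combination pushes the department-level probability above either subsystem alone, consistent with the 95% department figure exceeding both subsystem figures.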
Albéniz, Eduardo; Fraile, María; Ibáñez, Berta; Alonso-Aguirre, Pedro; Martínez-Ares, David; Soto, Santiago; Gargallo, Carla Jerusalén; Ramos Zabala, Felipe; Álvarez, Marco Antonio; Rodríguez-Sánchez, Joaquín; Múgica, Fernando; Nogales, Óscar; Herreros de Tejada, Alberto; Redondo, Eduardo; Guarner-Argente, Carlos; Pin, Noel; León-Brito, Helena; Pardeiro, Remedios; López-Roses, Leopoldo; Rodríguez-Téllez, Manuel; Jiménez, Alejandra; Martínez-Alcalá, Felipe; García, Orlando; de la Peña, Joaquín; Ono, Akiko; Alberca de Las Parras, Fernando; Pellisé, María; Rivero, Liseth; Saperas, Esteban; Pérez-Roldán, Francisco; Pueyo Royo, Antonio; Eguaras Ros, Javier; Zúñiga Ripa, Alba; Concepción-Martín, Mar; Huelin-Álvarez, Patricia; Colán-Hernández, Juan; Cubiella, Joaquín; Remedios, David; Bessa I Caserras, Xavier; López-Viedma, Bartolomé; Cobian, Julyssa; González-Haba, Mariano; Santiago, José; Martínez-Cara, Juan Gabriel; Valdivielso, Eduardo
2016-08-01
After endoscopic mucosal resection (EMR) of colorectal lesions, delayed bleeding is the most common serious complication, but there are no guidelines for its prevention. We aimed to identify risk factors associated with delayed bleeding that required medical attention after discharge until day 15 and develop a scoring system to identify patients at risk. We performed a prospective study of 1214 consecutive patients with nonpedunculated colorectal lesions 20 mm or larger treated by EMR (n = 1255) at 23 hospitals in Spain, from February 2013 through February 2015. Patients were examined 15 days after the procedure, and medical data were collected. We used the data to create a delayed bleeding scoring system, and assigned a weight to each risk factor based on the β parameter from multivariate logistic regression analysis. Patients were classified as being at low, average, or high risk for delayed bleeding. Delayed bleeding occurred in 46 cases (3.7%; 95% confidence interval, 2.7%-4.9%). In multivariate analysis, factors associated with delayed bleeding included age ≥75 years (odds ratio [OR], 2.36; P < .01), American Society of Anesthesiologists classification scores of III or IV (OR, 1.90; P ≤ .05), aspirin use during EMR (OR, 3.16; P < .05), right-sided lesions (OR, 4.86; P < .01), lesion size ≥40 mm (OR, 1.91; P ≤ .05), and a mucosal gap not closed by hemoclips (OR, 3.63; P ≤ .01). We developed a risk scoring system based on these 6 variables that assigned patients to the low-risk (score, 0-3), average-risk (score, 4-7), or high-risk (score, 8-10) categories, with an area under the receiver operating characteristic curve of 0.77 (95% confidence interval, 0.70-0.83). In these groups, the probabilities of delayed bleeding were 0.6%, 5.5%, and 40%, respectively. The risk of delayed bleeding after EMR of large colorectal lesions is 3.7%. We developed a risk scoring system based on 6 factors that determined the risk for delayed bleeding (area under the receiver operating characteristic curve, 0.77).
The factors most strongly associated with delayed bleeding were right-sided lesions, aspirin use, and mucosal defects not closed by hemoclips. Patients considered to be high risk (score, 8-10) had a 40% probability of delayed bleeding. Copyright © 2016 AGA Institute. Published by Elsevier Inc. All rights reserved.
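A sketch of how such an additive score could be applied at discharge. The six factors and the 0-3/4-7/8-10 bands come from the abstract, but the per-factor point weights below are assumptions chosen only so the maximum total is 10; they are not the published β-derived weights.

```python
POINTS = {                     # assumed integer weights, max total = 10
    "age_ge_75": 1,
    "asa_III_IV": 1,
    "aspirin_during_emr": 2,
    "right_sided_lesion": 3,
    "size_ge_40mm": 1,
    "mucosal_gap_not_clipped": 2,
}

def bleeding_score(factors):
    """factors: {factor_name: bool}; unknown names are rejected by the lookup."""
    return sum(POINTS[f] for f, present in factors.items() if present)

def risk_category(score):
    if score <= 3:
        return "low"      # published delayed-bleeding probability ~0.6%
    if score <= 7:
        return "average"  # ~5.5%
    return "high"         # ~40%
```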
An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring
Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi
2018-01-01
Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, though they still see limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs with the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple domain (heart rate turbulence (HRT)) and a complex domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts of different intensity and nature, as well as for the grouping and scrutiny of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring, using standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497
Berry, Colin; Norrie, John; McMurray, John J V
2005-03-01
The active control trials, SPORTIF III and SPORTIF V, compared the direct thrombin inhibitor ximelagatran to warfarin, where each was given as a treatment to prevent systemic embolism and stroke in patients with atrial fibrillation. Because warfarin has previously been compared to placebo in similar patients and ximelagatran has now been compared to warfarin, an indirect comparison between ximelagatran and placebo is possible (imputed placebo analysis). In this analysis, ximelagatran reduces the risk of stroke and systemic embolism by 66% (hazard ratio 0.338; 95% confidence interval [CI] 0.204-0.560). Ximelagatran preserves 102% (95% CI 72-132%) of the benefit of warfarin. Based on these data, ximelagatran may be an effective alternative to warfarin for the prevention of stroke and systemic embolism in high-risk patients with atrial fibrillation.
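The imputed-placebo arithmetic works on the log scale: the indirect hazard ratio is the product of the two direct ones, and for independent trials the variances of the log-hazard ratios add. The numbers used in the test are illustrative, not the SPORTIF or meta-analysis estimates.

```python
import math

def indirect_hr(hr_active_vs_ref, hr_ref_vs_placebo):
    """HR(active vs placebo) imputed via a common reference (warfarin) arm."""
    return hr_active_vs_ref * hr_ref_vs_placebo

def indirect_ci(hr1, se1, hr2, se2, z=1.96):
    """95% CI: log-HRs add and, for independent trials, their variances add.
    se1, se2 are standard errors of the log hazard ratios."""
    log_hr = math.log(hr1) + math.log(hr2)
    se = math.sqrt(se1 ** 2 + se2 ** 2)
    return (math.exp(log_hr - z * se), math.exp(log_hr + z * se))
```

The widened CI of the indirect estimate reflects the stacked uncertainty of both comparisons, which is why imputed-placebo analyses are treated as supportive rather than definitive evidence.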
Development of a GIS-based spill management information system.
Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D
2004-08-30
Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.
A near real time scenario at regional scale for the hydrogeological risk
NASA Astrophysics Data System (ADS)
Ponziani, F.; Stelluti, M.; Zauri, R.; Berni, N.; Brocca, L.; Moramarco, T.; Salciarini, D.; Tamagnini, C.
2012-04-01
The early warning systems dedicated to landslides and floods represent the Umbria Region Civil Protection Service's new-generation tools for hydraulic and hydrogeological risk reduction. Following past analyses performed by the Functional Centre (the part of the civil protection service dedicated to the monitoring and evaluation of natural hazards) on the relationship between saturated soil conditions and rainfall thresholds, we have developed an automated early warning system for landslide risk, called LANDWARN, which generates a daily and 72 h forecast risk matrix on a dense mesh of 100 x 100 m throughout the region. The system is based on: (a) the 20-day observed and 72 h predicted rainfall, provided by the local meteorological network and the local-scale meteorological model COSMO ME, (b) the assessment of soil saturation through daily extraction of ASCAT satellite data, data from a network of 16 TDR sensors, and a water balance model (developed by the Research Institute for Geo-Hydrological Protection, CNR, Perugia, Italy) that allows for the prediction of a saturation index for each point of the analysis grid over a window of up to 72 h, (c) a Web-GIS platform that combines the data grids of calculated hazard indicators with layers of landslide susceptibility and vulnerability of the territory, in order to produce dynamic risk scenarios. The system is still under development and is implemented at different scales: the entire region, and a set of known high-risk landslides in Umbria. The system is monitored and regularly reviewed through back analysis of landslide reports for which the activation date is available. Up to now, the development of the system involves: a) the improvement of the reliability of the assessment of soil saturation conditions, a key parameter which is used to dynamically adjust the values of the rainfall thresholds used for the declaration of landslide hazard levels.
For this purpose, a procedure was created for the daily download of ASCAT satellite data, used for the derivation of a soil water index (SWI): these data are compared with instrumental data from the TDR stations and with the results of the water balance model, which evaluates the contributions of water infiltration, percolation, evapotranspiration, etc., using physically based parameters obtained through a long process of characterization of soil and rock types for each grid point; b) the assessment of the contribution due to snow melt; c) the coupling of a physically based, GIS-based slope stability analysis model, developed by the Department of Civil and Environmental Engineering, University of Perugia, with the aim of also introducing the actual mechanical and physical characteristics of slopes into the analysis. The output of the system is the daily creation of near-real-time and 24, 48, and 72 h forecast risk scenarios which, as intended by the Civil Protection Service, will be used by the Functional Centre for its institutional tasks of hydrogeological risk evaluation and management, and also by local administrations involved in the monitoring and assessment of landslide risk, in order to receive feedback on the effectiveness of the scenarios produced.
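The core LANDWARN idea of dynamically adjusting rainfall thresholds by soil saturation might look like this in miniature. The linear scaling rule, base threshold, and alert fractions are invented assumptions, not the operational calibration.

```python
def adjusted_threshold(base_mm, swi):
    """swi: soil water index in [0, 1]; wetter soil lowers the rainfall
    threshold, here linearly down to half the base value at saturation."""
    return base_mm * (1.0 - 0.5 * swi)

def hazard_level(rain_mm, base_mm, swi):
    """Classify forecast rainfall against the saturation-adjusted threshold."""
    t = adjusted_threshold(base_mm, swi)
    if rain_mm >= t:
        return "alarm"
    if rain_mm >= 0.7 * t:
        return "alert"
    return "ordinary"
```

The same forecast rainfall can thus yield an "alarm" on saturated soil but only "ordinary" conditions on dry soil, which is the point of coupling the SWI to the thresholds.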
Risk evaluation of highway engineering project based on the fuzzy-AHP
NASA Astrophysics Data System (ADS)
Yang, Qian; Wei, Yajun
2011-10-01
Engineering projects are social activities that integrate technology, economy, management, and organization. There are uncertainties in every aspect of an engineering project, so risk management urgently needs to be strengthened. Based on an analysis of the characteristics of highway engineering and a study of the basic theory of risk evaluation, the paper builds an index system for highway project risk evaluation. In addition, based on fuzzy mathematics principles, the analytic hierarchy process (AHP) was applied, and a combined fuzzy-AHP comprehensive appraisal model was set up for the risk evaluation of expressway concession projects. The validity and practicability of the model were verified by applying it to an actual project.
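The AHP step of such a fuzzy-AHP evaluation derives criterion weights from a pairwise comparison matrix; the geometric-mean method is one standard way to approximate the principal eigenvector. The 3x3 matrix below (three hypothetical risk criteria) is an illustrative assumption, not the paper's index system.

```python
import math

def ahp_weights(matrix):
    """Geometric-mean method: each criterion's weight is the geometric mean
    of its comparison row, normalised so the weights sum to one."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical pairwise matrix: technical, economic, organizational risk.
# matrix[i][j] = how much more important criterion i is than criterion j.
pairwise = [
    [1.0,     3.0,     5.0],
    [1.0 / 3, 1.0,     2.0],
    [1.0 / 5, 1.0 / 2, 1.0],
]
weights = ahp_weights(pairwise)
```

In a full fuzzy-AHP model these crisp weights would then combine with fuzzy membership scores of each risk factor to produce the comprehensive appraisal.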
Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk
NASA Technical Reports Server (NTRS)
Gee, Ken; Lawrence, Scott
2013-01-01
For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. The debris field due to destruction of the launch vehicle is one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and velocity distributions on the strike probability and risk.
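A strike-probability calculation of this kind can be sketched as a Monte Carlo over an assumed imparted-velocity distribution and a drastically simplified ballistic model. Everything here (distribution parameters, fixed 45-degree ejection angle, flat-ground range formula, protected zone) is invented for illustration and far cruder than the shock-physics modeling described above.

```python
import random

def strike_probability(n, mean_v=200.0, sd_v=60.0, zone=(3000.0, 5000.0), seed=1):
    """Fraction of debris pieces whose simplified ballistic range lands in
    `zone` (metres): flat ground, ejection angle 45 deg, so range = v^2 / g."""
    rng = random.Random(seed)
    g = 9.81
    hits = 0
    for _ in range(n):
        v = max(0.0, rng.gauss(mean_v, sd_v))  # sampled imparted velocity, m/s
        if zone[0] <= v * v / g <= zone[1]:
            hits += 1
    return hits / n
```

Sweeping `mean_v`, `sd_v`, or the piece count in such a loop is one cheap way to study how the velocity distribution and debris count drive the strike probability.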
Belgium: risk adjustment and financial responsibility in a centralised system.
Schokkaert, Erik; Van de Voorde, Carine
2003-07-01
Since 1995, Belgian sickness funds have been partially financed through a risk adjustment system and have been held partially financially responsible for the difference between their actual and their risk-adjusted expenditures. However, they have not been given the necessary instruments for exerting a real influence on expenditures, and the health insurance market has not been opened to new entrants. At the same time, the sickness funds have powerful tools for risk selection, because they also dominate the market for supplementary health insurance. The present risk-adjustment system is based on the results of a regression analysis with aggregate data. The main proclaimed purpose of this system is to guarantee fair treatment of all the sickness funds. Until now, the danger of risk selection has not been taken seriously. Consumer mobility has remained rather low. However, since the degree of financial responsibility is programmed to increase in the near future, the potential profits from cream skimming will increase.
Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins
Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.
2010-12-14
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter
2014-07-01
Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.
Bio-Terrorism: Steps to Effective Public Health Risk Communication and Fear Management
2004-06-01
outline the challenges of communicating risk prior to, during and following a bio-terrorism event as well as the relationship between the content of...particularly challenging for a system based on thorough research and data analysis. Risk communication in a bio-terrorism event will involve...Ultimately, the Anthrax events confirmed the difficulty in communicating risk when scientific data is not available. Adding to the challenges imposed by an
Central nervous system infections and stroke -- a population-based analysis.
Chien, L-N; Chi, N-F; Hu, C-J; Chiou, H-Y
2013-10-01
Chronic central nervous system (CNS) infections have been found to be associated with cerebrovascular complications. Acute CNS infections are more common than chronic CNS infections, but whether they can increase the risk of vascular diseases has not been studied. The study cohort comprised all adult patients diagnosed with CNS infections in the Taiwan National Health Insurance Research Database during 2000-2009 (n = 533). The comparison group was matched by age, sex, urbanization, diagnostic year, and vascular risk factors of cases (cases and controls = 1:5). Patients were tracked for at least 1 year. Kaplan-Meier analysis was used to compare the risk of stroke and acute myocardial infarction (AMI) after accounting for censored subjects. After adjusting for patients' demographic characteristics and comorbidities, the risk of developing stroke among patients with CNS infections was 2.75-3.44 times greater than in the comparison group. More than 70% of the stroke events occurred within 1 year after CNS infection. No increased risk of AMI was found when comparing patients with and without CNS infections. This population-based cohort study suggests that adult patients with CNS infections have a higher risk of developing stroke, but not AMI, and that the risk is most marked within a year after infection. © 2013 John Wiley & Sons A/S.
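The survival comparison described above rests on the Kaplan-Meier estimator. As a minimal sketch (the follow-up data below are illustrative, not taken from the Taiwan database, and the simple implementation assumes untied event times):

```python
# Minimal Kaplan-Meier estimator sketch: step the survival curve down at each
# observed event; censored subjects only shrink the risk set.

def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = stroke, 0 = censored.
    Returns [(event time, survival probability)] pairs (assumes untied times)."""
    at_risk = len(times)
    survival = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e == 1:  # an observed stroke: step the survival curve down
            survival *= (at_risk - 1) / at_risk
            curve.append((t, survival))
        at_risk -= 1  # subject leaves the risk set (event or censoring)
    return curve

times = [0.3, 0.5, 1.0, 2.0, 3.0, 4.0]   # illustrative follow-up data
events = [1, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
for t, s in curve:
    print(f"t = {t} y, S(t) = {s:.3f}")
```

In practice a library implementation (with handling of ties and confidence intervals) would be used; the sketch only shows the estimator's logic.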
Landscape ecological risk assessment study in arid land
NASA Astrophysics Data System (ADS)
Gong, Lu; Amut, Aniwaer; Shi, Qingdong; Wang, Gary Z.
2007-09-01
Ecosystem risk assessment is an essential decision-making tool for predicting the reconstruction and recovery of an ecosystem damaged by intensive human activities. The sustainability of the environment and resources of lake ecosystems in arid districts has received close attention from the international community as well as from numerous experts and scholars. Ecological risk assessment offers a scientific foundation for the formulation and execution of ecological risk management. Bosten Lake, the largest inland freshwater lake in China, is the main water source for industrial and agricultural production as well as for local residents in the Yanqi basin, Kuara city and Yuri County in southern Xinjiang. Bosten Lake also provides a direct water source for emergency transfers to the Lower Reaches of the Tarim River. However, with the intensive utilization of water and soil resources, the environmental condition of Bosten Lake has become increasingly serious. In this study, the theory and method of landscape ecological risk assessment were applied using 3S technologies (remote sensing, GIS and GPS) combined with frontier theories of landscape ecology. The main risk sources, including flood, drought, water pollution and eutrophication of the water body, were identified and evaluated within the ecosystem risk assessment system. The main process includes five stages: regional natural resources analysis, risk receptor selection, risk source evaluation, exposure and hazard analysis, and integrated risk assessment. Based on the risk assessment results, environmental risk management countermeasures were determined.
Decision analysis and risk models for land development affecting infrastructure systems.
Thekdi, Shital A; Lambert, James H
2012-07-01
Coordination and layering of models to identify risks in complex systems such as large-scale infrastructure of energy, water, and transportation is of current interest across application domains. Such infrastructures are increasingly vulnerable to adjacent commercial and residential land development. Land development can compromise the performance of essential infrastructure systems and increase the costs of maintaining or increasing performance. A risk-informed approach to this topic would be useful to avoid surprise, regret, and the need for costly remedies. This article develops a layering and coordination of models for risk management of land development affecting infrastructure systems. The layers are: system identification, expert elicitation, predictive modeling, comparison of investment alternatives, and implications of current decisions for future options. The modeling layers share a focus on observable factors that most contribute to volatility of land development and land use. The relevant data and expert evidence include current and forecasted growth in population and employment, conservation and preservation rules, land topography and geometries, real estate assessments, market and economic conditions, and other factors. The approach integrates to a decision framework of strategic considerations based on assessing risk, cost, and opportunity in order to prioritize needs and potential remedies that mitigate impacts of land development to the infrastructure systems. The approach is demonstrated for a 5,700-mile multimodal transportation system adjacent to 60,000 tracts of potential land development. © 2011 Society for Risk Analysis.
Measuring Security Effectiveness and Efficiency at U.S. Commercial Airports
2013-03-01
formative program evaluation and policy analysis to investigate current airport security programs. It identifies innovative public administration and...policy-analysis tools that could provide potential benefits to airport security. These tools will complement the System Based Risk Management framework if
Design and implementation of a risk assessment module in a spatial decision support system
NASA Astrophysics Data System (ADS)
Zhang, Kaixi; van Westen, Cees; Bakker, Wim
2014-05-01
The spatial decision support system named 'Changes SDSS' is currently under development. The goal of this system is to analyze changing hydro-meteorological hazards and the effects of risk reduction alternatives to support decision makers in choosing the best alternatives. The risk assessment module within the system is designed to assess current risk, to analyze risk after implementation of risk reduction alternatives, and to analyze risk in different future years under scenarios such as climate change, land use change and population growth. The objective of this work is to present the detailed design and implementation plan of the risk assessment module. The main challenges faced are how to shift the risk assessment from traditional desktop software to an open-source web-based platform, the availability of input data, and the inclusion of uncertainties in the risk analysis. The risk assessment module is developed using the Ext JS library for the user interface on the client side, Python for scripting, and PostGIS spatial functions for complex computations on the server side. Comprehensive consideration of the underlying uncertainties in input data can lead to a better quantification of risk and a more reliable Changes SDSS, since the outputs of the risk assessment module are the basis for the decision-making module within the system. The implementation of this module will contribute to the development of open-source web-based modules for multi-hazard risk assessment in the future. This work is part of the "CHANGES SDSS" project, funded by the European Community's 7th Framework Program.
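One common way such a module compares hazard risk before and after a mitigation alternative is to integrate a risk curve (loss versus annual exceedance probability) into an average annual loss. A sketch, with scenario values that are illustrative assumptions rather than CHANGES project data:

```python
# Sketch: average annual loss by trapezoidal integration of a loss-exceedance
# curve, used to compare risk reduction alternatives on a single number.

def average_annual_loss(scenarios):
    """scenarios: list of (return_period_years, loss).
    Integrates loss over annual exceedance probability p = 1/return_period."""
    pts = sorted(((1.0 / rp, loss) for rp, loss in scenarios), reverse=True)
    aal = 0.0
    for (p1, l1), (p2, l2) in zip(pts, pts[1:]):
        aal += (p1 - p2) * (l1 + l2) / 2.0  # trapezoid between scenario points
    return aal

current = [(10, 1.0e6), (50, 5.0e6), (100, 9.0e6)]    # before mitigation (assumed)
with_alt = [(10, 0.2e6), (50, 2.0e6), (100, 5.0e6)]   # after an alternative (assumed)
reduction = average_annual_loss(current) - average_annual_loss(with_alt)
print(f"annual risk reduction: {reduction:,.0f}")
```

The reduction can then feed a cost-benefit comparison of the alternatives.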
Rosswog, Carolina; Schmidt, Rene; Oberthuer, André; Juraeva, Dilafruz; Brors, Benedikt; Engesser, Anne; Kahlert, Yvonne; Volland, Ruth; Bartenhagen, Christoph; Simon, Thorsten; Berthold, Frank; Hero, Barbara; Faldum, Andreas; Fischer, Matthias
2017-12-01
Current risk stratification systems for neuroblastoma patients consider clinical, histopathological, and genetic variables, and additional prognostic markers have been proposed in recent years. We here sought to select highly informative covariates in a multistep strategy based on consecutive Cox regression models, resulting in a risk score that integrates hazard ratios of prognostic variables. A cohort of 695 neuroblastoma patients was divided into a discovery set (n=75) for multigene predictor generation, a training set (n=411) for risk score development, and a validation set (n=209). Relevant prognostic variables were identified by stepwise multivariable L1-penalized least absolute shrinkage and selection operator (LASSO) Cox regression, followed by backward selection in multivariable Cox regression, and then integrated into a novel risk score. The variables stage, age, MYCN status, and two multigene predictors, NB-th24 and NB-th44, were selected as independent prognostic markers by LASSO Cox regression analysis. Following backward selection, only the multigene predictors were retained in the final model. Integration of these classifiers in a risk scoring system distinguished three patient subgroups that differed substantially in their outcome. The scoring system discriminated patients with diverging outcome in the validation cohort (5-year event-free survival, 84.9±3.4 vs 63.6±14.5 vs 31.0±5.4; P<.001), and its prognostic value was validated by multivariable analysis. We here propose a translational strategy for developing risk assessment systems based on hazard ratios of relevant prognostic variables. Our final neuroblastoma risk score comprised two multigene predictors only, supporting the notion that molecular properties of the tumor cells strongly impact clinical courses of neuroblastoma patients. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
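The scoring idea described above, integrating hazard ratios of the retained prognostic variables, can be sketched as follows. The hazard ratios and cut-offs here are illustrative assumptions, not those of the published neuroblastoma score:

```python
# Sketch: a risk score as the sum of log hazard ratios of the covariates
# present in a patient, with cut-offs defining risk groups.
import math

# Illustrative hazard ratios from a fitted Cox model (assumed values)
HAZARD_RATIOS = {"predictor_A_high": 3.5, "predictor_B_high": 2.1}

def risk_score(patient):
    """patient: dict of covariate name -> 0/1 indicator."""
    return sum(math.log(hr) * patient.get(cov, 0)
               for cov, hr in HAZARD_RATIOS.items())

def risk_group(score, low_cut=0.8, high_cut=1.5):
    """Cut-offs (assumed) splitting patients into three outcome groups."""
    if score < low_cut:
        return "low"
    return "intermediate" if score < high_cut else "high"

p = {"predictor_A_high": 1, "predictor_B_high": 0}
s = risk_score(p)       # log(3.5), roughly 1.25
print(risk_group(s))
```

In the actual study the covariates were themselves multigene expression classifiers, but the integration step follows this additive log-hazard form.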
Bergion, Viktor; Lindhe, Andreas; Sokolova, Ekaterina; Rosén, Lars
2018-04-01
Waterborne outbreaks of gastrointestinal diseases can cause large costs to society. Risk management needs to be holistic and transparent in order to reduce these risks in an effective manner. Microbial risk mitigation measures in a drinking water system were investigated using a novel approach combining probabilistic risk assessment and cost-benefit analysis. Lake Vomb in Sweden was used to exemplify and illustrate the risk-based decision model. Four mitigation alternatives were compared, where the first three alternatives, A1-A3, represented connecting 25, 50 and 75%, respectively, of on-site wastewater treatment systems in the catchment to the municipal wastewater treatment plant. The fourth alternative, A4, represented installing a UV-disinfection unit in the drinking water treatment plant. Quantitative microbial risk assessment was used to estimate the positive health effects, in terms of quality adjusted life years (QALYs), resulting from the four mitigation alternatives. The health benefits were monetised using a unit cost per QALY. For each mitigation alternative, the net present value of health and environmental benefits and investment, maintenance and running costs was calculated. The results showed that only A4 can reduce the risk (probability of infection) below the World Health Organization guideline of 10^-4 infections per person per year (looking at the 95th percentile). Furthermore, all alternatives resulted in a negative net present value. However, the net present value would be positive (looking at the 50th percentile using a 1% discount rate) if non-monetised benefits (e.g. increased property value divided evenly over the studied time horizon and reduced microbial risks posed to animals), estimated at 800-1200 SEK (€100-150) per connected on-site wastewater treatment system per year, were included. This risk-based decision model creates a robust and transparent decision support tool.
It is flexible enough to be tailored and applied to local settings of drinking water systems. The model provides a clear and holistic structure for decisions related to microbial risk mitigation. To improve the decision model, we suggest further developing the valuation and monetisation of health effects and refining the propagation of uncertainties and variabilities between the included methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
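The cost-benefit step described above, monetising QALYs and discounting costs and benefits into a net present value, can be sketched as follows. All figures are illustrative assumptions, not the study's values:

```python
# Sketch: net present value of a mitigation alternative, with health benefits
# monetised via a unit cost per QALY.

def npv(annual_benefit, annual_cost, investment, years, rate):
    """Net present value over a time horizon at a given discount rate."""
    pv = -investment
    for t in range(1, years + 1):
        pv += (annual_benefit - annual_cost) / (1 + rate) ** t
    return pv

qalys_per_year = 0.5        # health gain of a mitigation alternative (assumed)
cost_per_qaly = 500_000     # SEK, monetisation unit (assumed)
benefit = qalys_per_year * cost_per_qaly
value = npv(benefit, annual_cost=60_000, investment=2_000_000,
            years=20, rate=0.035)
print(f"NPV: {value:,.0f} SEK")
```

As in the study, the sign of the NPV (and its sensitivity to the discount rate and to non-monetised benefits) is what drives the comparison between alternatives.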
System for decision analysis support on complex waste management issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shropshire, D.E.
1997-10-01
A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts on waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives; objectives may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs, or combinations of these. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and to reduce the environmental legacy resulting from nuclear material production over the past forty years.
The East London glaucoma prediction score: web-based validation of glaucoma risk screening tool
Stephen, Cook; Benjamin, Longo-Mbenza
2013-01-01
AIM It is difficult for optometrists and general practitioners to know which patients are at risk of glaucoma. The East London glaucoma prediction score (ELGPS) is a web-based risk calculator developed to determine glaucoma risk at the time of screening. Multiple risk factors that are available in a low-tech environment are assessed to provide a risk estimate. This is extremely useful in settings where access to specialist care is difficult. Use of the calculator is educational. It is a free web-based service. Data capture is user specific. METHOD The scoring system is a web-based questionnaire that captures and subsequently calculates the relative risk for the presence of glaucoma at the time of screening. Three categories of patient are described: unlikely to have glaucoma; glaucoma suspect; and glaucoma. A case review methodology of patients with a known diagnosis is employed to validate the calculator's risk assessment. RESULTS Data from the records of 400 patients with an established diagnosis have been captured and used to validate the screening tool. The website reports that the calculated diagnosis correlates with the actual diagnosis 82% of the time. Biostatistical analysis showed: sensitivity = 88%; positive predictive value = 97%; specificity = 75%. CONCLUSION Analysis of the first 400 patients validates the web-based screening tool as a good method of screening the at-risk population. The validation is ongoing. The web-based format will allow more widespread recruitment across different geographic, population and personnel variables. PMID:23550097
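The validation statistics quoted above come from a standard confusion-matrix calculation over the case-review counts. A sketch, with illustrative counts (not the actual ELGPS validation data):

```python
# Sketch: screening-test metrics from true/false positive and negative counts.

def screening_metrics(tp, fp, fn, tn):
    """tp/fp/fn/tn: true/false positives and negatives from case review."""
    return {
        "sensitivity": tp / (tp + fn),  # share of true cases flagged
        "specificity": tn / (tn + fp),  # share of non-cases correctly cleared
        "ppv": tp / (tp + fp),          # flagged patients who truly have disease
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts chosen to give figures in the range the abstract reports
m = screening_metrics(tp=220, fp=7, fn=30, tn=21)
print({k: round(v, 2) for k, v in m.items()})
```

Note that PPV, unlike sensitivity and specificity, depends on the prevalence in the validated sample, which is why a case-review cohort enriched with known diagnoses can show a high PPV.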
Di Rosa, Mirko; Hausdorff, Jeff M; Stara, Vera; Rossi, Lorena; Glynn, Liam; Casey, Monica; Burkard, Stefan; Cherubini, Antonio
2017-06-01
Falls are a major health problem for older adults with immediate effects, such as fractures and head injuries, and longer term effects including fear of falling, loss of independence, and disability. The goals of the WIISEL project were to develop an unobtrusive, self-learning and wearable system aimed at assessing gait impairments and fall risk of older adults in the home setting; assessing activity and mobility in daily living conditions; identifying decline in mobility performance and detecting falls in the home setting. The WIISEL system was based on a pair of electronic insoles, able to transfer data to a commercially available smartphone, which was used to wirelessly collect data in real time from the insoles and transfer it to a backend computer server via mobile internet connection and then onwards to a gait analysis tool. Risk of falls was calculated by the system using a novel Fall Risk Index (FRI) based on multiple gait parameters and gait pattern recognition. The system was tested by twenty-nine older users, and data collected by the insoles were compared with standardized functional tests using a concurrent validity approach. The results showed that the FRI captures the risk of falls with accuracy similar to that of conventional performance-based tests of fall risk. These preliminary findings support the idea that the WIISEL system can be a useful research tool and may have clinical utility for long-term monitoring of fall risk at home and in the community setting. Copyright © 2017 Elsevier B.V. All rights reserved.
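One plausible form of an index built from multiple gait parameters is to normalise each parameter against a reference range and combine them with weights into a 0-1 score. The parameters, weights, ranges and threshold below are illustrative assumptions; the published WIISEL FRI also uses gait pattern recognition not shown here:

```python
# Sketch: a weighted, range-normalised fall-risk index from gait parameters.

GAIT_REFERENCE = {                 # parameter: (low-risk value, high-risk value)
    "gait_speed_m_s": (1.2, 0.6),
    "stride_time_cv_pct": (2.0, 8.0),
    "double_support_pct": (20.0, 35.0),
}
WEIGHTS = {"gait_speed_m_s": 0.4, "stride_time_cv_pct": 0.35,
           "double_support_pct": 0.25}

def fall_risk_index(gait):
    """gait: dict of parameter -> measured value. Returns a score in [0, 1]."""
    fri = 0.0
    for name, (lo, hi) in GAIT_REFERENCE.items():
        frac = (gait[name] - lo) / (hi - lo)     # 0 at low risk, 1 at high risk
        fri += WEIGHTS[name] * min(max(frac, 0.0), 1.0)
    return fri

subject = {"gait_speed_m_s": 0.8, "stride_time_cv_pct": 6.5,
           "double_support_pct": 28.0}
fri = fall_risk_index(subject)
print(f"FRI = {fri:.2f}", "(elevated)" if fri > 0.5 else "(typical)")
```

A monitoring system would recompute such a score over rolling windows of home data to flag decline in mobility performance.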
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.
2016-01-01
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...
2017-01-24
We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
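The simulation-based idea described in these two records can be sketched as a Monte Carlo loop: sample uncertain boundary conditions, run a physics model (here a deliberately trivial surrogate), and count cases where the passive system fails to deliver the needed cooling. The surrogate model and distributions are illustrative assumptions, not the actual RCCS analysis:

```python
# Sketch: Monte Carlo sampling of boundary conditions to estimate the failure
# probability of a passive cooling system.
import random

random.seed(7)

def cavity_cooling_ok(air_temp_c, flow_blockage_frac):
    """Toy surrogate: natural-circulation heat removal degrades with inlet
    temperature and flow-path blockage (e.g. from flooding debris)."""
    capacity = 100.0 * (1.0 - flow_blockage_frac) - 0.8 * air_temp_c
    return capacity >= 40.0          # required heat removal, arbitrary units

N = 50_000
failures = 0
for _ in range(N):
    air_temp = random.gauss(30.0, 10.0)                      # post-event ambient
    blockage = min(max(random.gauss(0.1, 0.15), 0.0), 1.0)   # clipped to [0, 1]
    if not cavity_cooling_ok(air_temp, blockage):
        failures += 1
print(f"estimated failure probability: {failures / N:.3f}")
```

This is what makes boundary-condition-driven failure tractable: the "failure" is not a discrete component state but an outcome of sampled physics, which is then folded back into the conventional probabilistic risk assessment.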
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
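The fault tree calculation described above can be sketched with a small Monte Carlo model. The tree below (top event "no water delivered" if the source fails OR both treatment lines fail) and all probabilities are illustrative, not the Swedish study's values:

```python
# Sketch: Monte Carlo evaluation of a small fault tree with OR and AND gates,
# with downtime expressed as customer minutes lost (CML).
import random

random.seed(1)

def top_event(p_source, p_line_a, p_line_b):
    """One trial: does the top event (quantity failure) occur?"""
    source_fail = random.random() < p_source
    both_lines_fail = (random.random() < p_line_a) and (random.random() < p_line_b)
    return source_fail or both_lines_fail        # OR gate over the two branches

N = 100_000
failures = sum(top_event(0.01, 0.05, 0.05) for _ in range(N))
p_fail = failures / N                            # fraction of time unavailable
print(f"P(top event) = {p_fail:.4f}")
print(f"CML = {p_fail * 365 * 24 * 60:.0f} minutes per customer per year")
```

Here each trial is read as an independent snapshot of the system state, so the failure fraction converts directly to expected downtime; the study's actual model additionally distinguishes quantity from quality failures and propagates expert-judgement uncertainty through the event probabilities.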
Overview of NASA Langley's Systems Analysis Capabilities
NASA Technical Reports Server (NTRS)
Cavanaugh, Stephen; Kumar, Ajay; Brewer, Laura; Kimmel, Bill; Korte, John; Moul, Tom
2006-01-01
The Systems Analysis and Concepts Directorate (SACD) has been in the systems analysis business line supporting National Aeronautics and Space Administration (NASA) aeronautics, exploration, space operations and science since the 1960s. Our current organization structure is shown in Figure 1. SACD's mission can be summed up in the following statements: 1. We conduct advanced concept studies for Agency decision makers and programs. 2. We provide aerospace systems analysis products such as mission architectures, advanced system concepts, system and technology trades, life cycle cost and risk analysis, system integration and pre-decisional sensitive information. 3. Our work enables informed technical, programmatic and budgetary decisions. SACD has a complement of 114 government employees and approximately 50 on-site contractors, which is roughly equally split between supporting aeronautics and exploration. SACD strives for technical excellence and credibility in the systems analysis products delivered to its customers. The Directorate office is continuously building market intelligence and working with other NASA centers and external partners to expand our business base. The Branches strive for technical excellence and credibility of our systems analysis products by seeking out existing and new partnerships that are critical for successful systems analysis. The Directorate's long-term goal is to grow the science systems analysis business base.
METAL SPECIATION IN SOIL, SEDIMENT, AND WATER SYSTEMS VIA SYNCHROTRON RADIATION RESEARCH
Metal contaminated environmental systems (soils, sediments, and water) have challenged researchers for many years. Traditional methods of analysis have employed extraction methods to determine total metal content and define risk based on the premise that as metal concentration in...
New risk metrics and mathematical tools for risk analysis: Current and future challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos; Psomas, Antonios
The current status of food safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports detailing the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to food safety management. 
Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models. Such models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software (e.g., the Seafood Spoilage Predictor), have advanced the use of information systems in food safety management. These tools are updateable with new food-pathogen specific models containing cardinal parameters and multiple dependent variables, including plate counts, concentrations of metabolic products, or even expression levels of certain genes. They may then further serve as decision-support tools to assist in product logistics, based on the scientifically based, "momentary" expressed spoilage and safety level of a product.
New risk metrics and mathematical tools for risk analysis: Current and future challenges
NASA Astrophysics Data System (ADS)
Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon
2015-01-01
The current status of food safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports detailing the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total `failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to food safety management. 
Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models. Such models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software (e.g., the Seafood Spoilage Predictor), have advanced the use of information systems in food safety management. These tools are updateable with new food-pathogen specific models containing cardinal parameters and multiple dependent variables, including plate counts, concentrations of metabolic products, or even expression levels of certain genes. They may then further serve as decision-support tools to assist in product logistics, based on the scientifically based, "momentary" expressed spoilage and safety level of a product.
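The link between process control and the FSO described above is usually checked with the ICMSF inequality H0 - ΣR + ΣI ≤ FSO, where H0 is the initial hazard level, ΣR the total reductions (e.g. cooking), and ΣI the total increases (growth, recontamination), all in log10 units. A sketch with illustrative numbers, not values from the article:

```python
# Sketch: checking a process design against a Food Safety Objective using
# H0 - sum(reductions) + sum(increases) <= FSO, all terms in log10 cfu/g.

def meets_fso(h0, reductions, increases, fso):
    """h0: initial hazard level; reductions: log10 kill steps;
    increases: log10 growth/recontamination; fso: target at consumption."""
    final_level = h0 - sum(reductions) + sum(increases)
    return final_level, final_level <= fso

level, ok = meets_fso(h0=3.0, reductions=[6.0], increases=[1.5], fso=-2.0)
print(level, ok)   # -1.5 is above the -2.0 target, so the FSO is not met
```

When the check fails, either a larger reduction (a stricter performance criterion for a process step) or a smaller increase (tighter cold-chain control) is needed.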
Citizen Science to Support Community-based Flood Early Warning and Resilience Building
NASA Astrophysics Data System (ADS)
Paul, J. D.; Buytaert, W.; Allen, S.; Ballesteros-Cánovas, J. A.; Bhusal, J.; Cieslik, K.; Clark, J.; Dewulf, A.; Dhital, M. R.; Hannah, D. M.; Liu, W.; Nayaval, J. L.; Schiller, A.; Smith, P. J.; Stoffel, M.; Supper, R.
2017-12-01
In Disaster Risk Management, an emerging shift has been noted from broad-scale, top-down assessments towards more participatory, community-based, bottom-up approaches. Combined with technologies for robust and low-cost sensor networks, a citizen science approach has recently emerged as a promising direction in the provision of extensive, real-time information for flood early warning systems. Here we present the framework and initial results of a major new international project, Landslide EVO, aimed at increasing local resilience against hydrologically induced disasters in western Nepal by exploiting participatory approaches to knowledge generation and risk governance. We identify three major technological developments that strongly support our approach to flood early warning and resilience building in Nepal. First, distributed sensor networks, participatory monitoring, and citizen science hold great promise in complementing official monitoring networks and remote sensing by generating site-specific information with local buy-in, especially in data-scarce regions. Secondly, the emergence of open source, cloud-based risk analysis platforms supports the construction of a modular, distributed, and potentially decentralised data processing workflow. Finally, linking data analysis platforms to social computer networks and ICT (e.g. mobile phones, tablets) allows tailored interfaces and people-centred decision- and policy-support systems to be built. Our proposition is that maximum impact is created if end-users are involved not only in data collection, but also over the entire project life-cycle, including the analysis and provision of results. In this context, citizen science complements more traditional knowledge generation practices, and also enhances multi-directional information provision, risk management, early-warning systems and local resilience building.
WE-B-BRC-00: Concepts in Risk-Based Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology-specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; discussion of radiation oncology-specific risk assessment strategies and issues; evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
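The FMEA approach referenced above ranks failure modes by a Risk Priority Number, the product of occurrence, severity, and detectability scores. The following is a minimal illustrative sketch; the failure modes and scores are hypothetical examples, not taken from TG-100.

```python
# Minimal FMEA sketch: rank hypothetical failure modes by Risk Priority
# Number (RPN = occurrence x severity x detectability), as used in
# TG-100-style prospective risk assessment. Modes and scores are invented.
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    occurrence: int     # 1 (rare) .. 10 (frequent)
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    detectability: int  # 1 (always caught) .. 10 (never caught)

    @property
    def rpn(self) -> int:
        return self.occurrence * self.severity * self.detectability

modes = [
    FailureMode("wrong CT dataset imported", 2, 9, 4),
    FailureMode("contour transferred to wrong plan", 3, 7, 3),
    FailureMode("MU second check skipped", 4, 6, 5),
]

# Highest-RPN modes are the first candidates for added quality controls.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.step}: RPN={fm.rpn}")
```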
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
Braulke, Friederike; Platzbecker, Uwe; Müller-Thomas, Catharina; Götze, Katharina; Germing, Ulrich; Brümmendorf, Tim H.; Nolte, Florian; Hofmann, Wolf-Karsten; Giagounidis, Aristoteles A. N.; Lübbert, Michael; Greenberg, Peter L.; Bennett, John M.; Solé, Francesc; Mallo, Mar; Slovak, Marilyn L.; Ohyashiki, Kazuma; Le Beau, Michelle M.; Tüchler, Heinz; Pfeilstöcker, Michael; Nösslinger, Thomas; Hildebrandt, Barbara; Shirneshan, Katayoon; Aul, Carlo; Stauder, Reinhard; Sperr, Wolfgang R.; Valent, Peter; Fonatsch, Christa; Trümper, Lorenz; Haase, Detlef; Schanz, Julie
2015-01-01
International Prognostic Scoring Systems are used to determine the individual risk profile of myelodysplastic syndrome patients. For the assessment of International Prognostic Scoring Systems, an adequate chromosome banding analysis of the bone marrow is essential. Cytogenetic information is not available for a substantial number of patients (5%–20%) with dry marrow or an insufficient number of metaphase cells. For these patients, a valid risk classification is impossible. In the study presented here, the International Prognostic Scoring Systems were validated based on fluorescence in situ hybridization analyses using extended probe panels applied to cluster of differentiation 34 positive (CD34+) peripheral blood cells of 328 MDS patients of our prospective multicenter German diagnostic study and compared to chromosome banding results of 2902 previously published patients with myelodysplastic syndromes. For cytogenetic risk classification by fluorescence in situ hybridization analyses of CD34+ peripheral blood cells, the groups differed significantly for overall and leukemia-free survival by uni- and multivariate analyses without discrepancies between treated and untreated patients. Including cytogenetic data of fluorescence in situ hybridization analyses of peripheral CD34+ blood cells (instead of bone marrow banding analysis) into the complete International Prognostic Scoring System assessment, the prognostic risk groups separated significantly for overall and leukemia-free survival. Our data show that a reliable stratification to the risk groups of the International Prognostic Scoring Systems is possible from peripheral blood in patients with missing chromosome banding analysis by using a comprehensive probe panel (clinicaltrials.gov identifier:01355913). PMID:25344522
Nykanen, David G; Forbes, Thomas J; Du, Wei; Divekar, Abhay A; Reeves, Jaxk H; Hagler, Donald J; Fagan, Thomas E; Pedra, Carlos A C; Fleming, Gregory A; Khan, Danyal M; Javois, Alexander J; Gruenstein, Daniel H; Qureshi, Shakeel A; Moore, Phillip M; Wax, David H
2016-02-01
We sought to develop a scoring system that predicts the risk of serious adverse events (SAEs) for individual pediatric patients undergoing cardiac catheterization procedures. Systematic assessment of the risk of SAEs in pediatric catheterization can be challenging in view of a wide variation in procedure and patient complexity as well as rapidly evolving technology. A 10-component scoring system was originally developed based on expert consensus and review of the existing literature. Data from an international multi-institutional catheterization registry (CCISC) between 2008 and 2013 were used to validate this scoring system. In addition, we used multivariate methods to further refine the original risk score to improve its predictive power for SAEs. Univariate analysis confirmed the strong correlation of each of the 10 components of the original risk score with SAEs attributed to a pediatric cardiac catheterization (P < 0.001 for all variables). Multivariate analysis resulted in a modified risk score (CRISP) that corresponds to an increase in the area under a receiver operating characteristic curve (AUC) from 0.715 to 0.741. The CRISP score predicts the risk of occurrence of an SAE for individual patients undergoing pediatric cardiac catheterization procedures. © 2015 Wiley Periodicals, Inc.
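Additive procedural risk scores of this kind sum points for pre-procedure components and map the total to a risk category. A minimal sketch follows; the component names, point values, and category cutoffs are hypothetical, since the abstract does not publish the actual CRISP weights.

```python
# Sketch of an additive procedural risk score in the spirit of CRISP.
# Components, points, and cutoffs are illustrative assumptions only.
def risk_score(components: dict) -> int:
    points = {
        "age_under_30_days": 3,
        "single_ventricle": 2,
        "inotrope_support": 2,
        "interventional_procedure": 1,
    }
    return sum(points[k] for k, present in components.items() if present)

def risk_category(score: int) -> str:
    if score >= 5:
        return "high"
    if score >= 2:
        return "intermediate"
    return "low"

s = risk_score({"age_under_30_days": True, "inotrope_support": True,
                "single_ventricle": False, "interventional_procedure": False})
print(s, risk_category(s))
```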
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
...The Food and Drug Administration (FDA) is proposing to amend its regulation for Current Good Manufacturing Practice In Manufacturing, Packing, or Holding Human Food (CGMPs) to modernize it and to add requirements for domestic and foreign facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to establish and implement hazard analysis and risk-based preventive controls for human food. FDA also is proposing to revise certain definitions in FDA's current regulation for Registration of Food Facilities to clarify the scope of the exemption from registration requirements provided by the FD&C Act for ``farms.'' FDA is taking this action as part of its announced initiative to revisit the CGMPs since they were last revised in 1986 and to implement new statutory provisions in the FD&C Act. The proposed rule is intended to build a food safety system for the future that makes modern, science-, and risk-based preventive controls the norm across all sectors of the food system.
The Use of Object-Oriented Analysis Methods in Surety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.
1999-05-01
Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
ERIC Educational Resources Information Center
Lutz, John E.; And Others
The degree of success of the computerized Child-Based Information System (CBIS) was analyzed in two areas--presenting, delivering, and managing a developmental curriculum; and recording, filing, and monitoring child tracking data, including requirements for Individualized Education Plans (IEPs). Preschool handicapped and high-risk children and…
Quantifying the Metrics That Characterize Safety Culture of Three Engineered Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, Julie; Ernesti, Mary; Tokuhiro, Akira
2002-07-01
With potential energy shortages and increasing electricity demand, the nuclear energy option is being reconsidered in the United States. Public opinion will have a considerable voice in policy decisions that will 'road-map' the future of nuclear energy in this country. This report is an extension of the last author's work on the 'safety culture' associated with three engineered systems (automobiles, commercial airplanes, and nuclear power plants) in Japan and the United States. Safety culture, in brief, is defined as a specifically developed culture based on societal and individual interpretations of the balance of real, perceived, and imagined risks versus the benefits drawn from utilizing a given engineered system. The method of analysis is a modified scale analysis, with two fundamental eigen-metrics, time- (t) and number-scales (N), that describe both engineered systems and human factors. The scale analysis approach is appropriate because human perception of risk, perception of benefit and level of (technological) acceptance are inherently subjective, therefore 'fuzzy' and rarely quantifiable in exact magnitude. Perception of risk, expressed in terms of the psychometric factors 'dread risk' and 'unknown risk', contains both time- and number-scale elements. Various engineering system accidents with fatalities, reported by mass media, are characterized by t and N, and are presented in this work using the scale analysis method. We contend that level of acceptance infers a perception of benefit at least two orders of magnitude larger than the perception of risk. The 'amplification' influence of mass media is also deduced as being 100- to 1000-fold the actual number of fatalities/serious injuries in a nuclear-related accident. (authors)
Comparative and Predictive Multimedia Assessments Using Monte Carlo Uncertainty Analyses
NASA Astrophysics Data System (ADS)
Whelan, G.
2002-05-01
Multiple-pathway frameworks (sometimes referred to as multimedia models) provide a platform for combining medium-specific environmental models and databases, such that they can be utilized in a more holistic assessment of contaminant fate and transport in the environment. These frameworks provide a relatively seamless transfer of information from one model to the next and from databases to models. Within these frameworks, multiple models are linked, resulting in models that consume information from upstream models and produce information to be consumed by downstream models. The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) is an example, which allows users to link their models to other models and databases. FRAMES is an icon-driven, site-layout platform that is an open-architecture, object-oriented system that interacts with environmental databases; helps the user construct a Conceptual Site Model that is real-world based; allows the user to choose the most appropriate models to solve simulation requirements; solves the standard risk paradigm of release, transport and fate, and exposure/risk assessment for people and ecology; and presents graphical packages for analyzing results. FRAMES is specifically designed to allow users to link their own models into a system which contains models developed by others. This paper will present the use of FRAMES to evaluate potential human health exposures using real site data and realistic assumptions from sources, through the vadose and saturated zones, to exposure and risk assessment at three real-world sites, using the Multimedia Environmental Pollutant Assessment System (MEPAS), which is a multimedia model contained within FRAMES. These real-world examples use predictive and comparative approaches coupled with a Monte Carlo analysis.
A predictive analysis is where models are calibrated to monitored site data, prior to the assessment, and a comparative analysis is where models are not calibrated but based solely on literature or judgement and is usually used to compare alternatives. In many cases, a combination is employed where the model is calibrated to a portion of the data (e.g., to determine hydrodynamics), then used to compare alternatives. Three subsurface-based multimedia examples are presented, increasing in complexity. The first presents the application of a predictive, deterministic assessment; the second presents a predictive and comparative, Monte Carlo analysis; and the third presents a comparative, multi-dimensional Monte Carlo analysis. Endpoints are typically presented in terms of concentration, hazard, risk, and dose, and because the vadose zone model typically represents a connection between a source and the aquifer, it does not generally represent the final medium in a multimedia risk assessment.
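The Monte Carlo analyses described above propagate input uncertainty through the linked models to a distribution of endpoint risk. A minimal stand-alone sketch of the idea, using a simplified drinking-water ingestion risk equation with illustrative distributions (not the cited sites' data or MEPAS itself):

```python
# Monte Carlo propagation sketch: sample uncertain inputs to a simplified
# ingestion-risk equation and summarize the output distribution.
# All distributions and parameter values are illustrative assumptions.
import random
import statistics

random.seed(42)
N = 10_000
risks = []
for _ in range(N):
    conc = random.lognormvariate(0.0, 0.5)     # mg/L in groundwater
    intake = random.uniform(1.0, 2.5)          # L/day drinking water
    slope = random.lognormvariate(-4.0, 0.3)   # cancer slope factor, (mg/kg-day)^-1
    bw = random.gauss(70.0, 10.0)              # kg body weight
    risks.append(conc * intake * slope / bw)

risks.sort()
print("median risk:", statistics.median(risks))
print("95th percentile:", risks[int(0.95 * N)])
```

The 95th-percentile endpoint, rather than a single deterministic value, is what supports the comparative screening of alternatives described in the abstract.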
Analysis of the Space Propulsion System Problem Using RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis; Rabiti, Cristian
This paper presents the solution of the space propulsion problem using a PRA code currently under development at Idaho National Laboratory (INL). RAVEN (Reactor Analysis and Virtual control ENvironment) is a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities. It is designed to derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures) and to perform both Monte Carlo sampling of random distributed events and Event Tree based analysis. In order to facilitate input/output handling, a Graphical User Interface (GUI) and a post-processing data-mining module are available. RAVEN can also interface with several numerical codes such as RELAP5 and RELAP-7 and with ad-hoc system simulators. For the space propulsion system problem, an ad-hoc simulator has been developed, written in Python, and then interfaced to RAVEN. This simulator fully models both deterministic behaviors (e.g., system dynamics and interactions between system components) and stochastic behaviors (i.e., failures of components/systems such as distribution lines and thrusters). Stochastic analysis is performed using random-sampling-based methodologies (i.e., Monte Carlo). This analysis is performed both to determine the reliability of the space propulsion system and to propagate the uncertainties associated with a specific set of parameters. As also indicated in the scope of the benchmark problem, the results generated by the stochastic analysis are used to generate risk-informed insights, such as conditions under which different strategies can be followed.
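The Monte Carlo reliability estimate described above can be sketched with a toy architecture: sample exponential failure times for distribution lines and thrusters over a mission window and count successful missions. The architecture (two lines feeding four thrusters, at least two thrusters required) and the failure rates are illustrative assumptions, not the benchmark's actual values.

```python
# Monte Carlo system-reliability sketch in the spirit of the RAVEN
# space-propulsion benchmark. All rates and the architecture are invented.
import random

random.seed(1)
MISSION_T = 1000.0       # mission duration, hours
LAMBDA_LINE = 1e-4       # distribution-line failures per hour
LAMBDA_THRUSTER = 2e-4   # thruster failures per hour

def mission_success() -> bool:
    line_fail = [random.expovariate(LAMBDA_LINE) < MISSION_T for _ in range(2)]
    ok = 0
    for i in range(4):
        thruster_fail = random.expovariate(LAMBDA_THRUSTER) < MISSION_T
        fed = not line_fail[i % 2]   # thruster i is fed by line i % 2
        if fed and not thruster_fail:
            ok += 1
    return ok >= 2                   # need at least 2 operational thrusters

N = 20_000
reliability = sum(mission_success() for _ in range(N)) / N
print(f"estimated system reliability: {reliability:.3f}")
```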
Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan
2017-09-15
Mixtures of hydrocarbon and carbon dioxide show excellent cycle performance in Organic Rankine Cycles (ORC) used for engine waste heat recovery, but unavoidable leakage in practical applications is a threat to safety due to their flammability. In this work, a quantitative risk assessment system (QR-AS) is established, aiming to provide a general method of risk assessment for flammable working-fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosive risk assessment based on the TNT equivalent method, and risk mitigation based on evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of the QR-AS. According to the assessment results, proper ventilation speed, a safe mixture ratio, and locations of gas-detecting devices have been proposed to guarantee security in case of leakage. The results revealed that the presented QR-AS is reliable for practical application and that the evaluation results can provide valuable guidance for the design of mitigation measures to improve the safety performance of the ORC system. Copyright © 2017 Elsevier B.V. All rights reserved.
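The TNT equivalent method referenced above converts the released fuel mass to an equivalent TNT mass via a yield factor, and blast overpressure is then read off charts against the Hopkinson-Cranz scaled distance. A minimal sketch follows; the yield factor, heats of combustion, and leak mass are illustrative assumptions, not values from the paper.

```python
# TNT-equivalent screening sketch for a flammable-leak scenario.
# Property values below are approximate and for illustration only.
DELTA_HC_PROPANE = 46.3e6   # J/kg, heat of combustion of propane (approx.)
E_TNT = 4.52e6              # J/kg, specific blast energy of TNT (approx.)
YIELD_FACTOR = 0.05         # assumed fraction of energy contributing to blast

def tnt_equivalent_mass(fuel_mass_kg: float) -> float:
    return YIELD_FACTOR * fuel_mass_kg * DELTA_HC_PROPANE / E_TNT

def scaled_distance(r_m: float, w_tnt_kg: float) -> float:
    # Hopkinson-Cranz scaling: Z = R / W^(1/3), in m/kg^(1/3)
    return r_m / w_tnt_kg ** (1.0 / 3.0)

w = tnt_equivalent_mass(2.0)   # hypothetical 2 kg propane leak
print(f"TNT equivalent: {w:.2f} kg")
print(f"scaled distance at 10 m: {scaled_distance(10.0, w):.2f} m/kg^(1/3)")
```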
NASA Astrophysics Data System (ADS)
Pingel, N.; Liang, Y.; Bindra, A.
2016-12-01
More than 1 million Californians live and work in the floodplains of the Sacramento-San Joaquin Valley where flood risks are among the highest in the nation. In response to this threat to people, property and the environment, the Department of Water Resources (DWR) has been called to action to improve flood risk management. This has transpired through significant advances in development of flood information and tools, analysis, and planning. Senate Bill 5 directed DWR to prepare the Central Valley Flood Protection Plan (CVFPP) and update it every 5 years. A key component of this aggressive planning approach is answering the question: What is the current flood risk, and how would proposed improvements change flood risk throughout the system? Answering this question is a substantial challenge due to the size and complexity of the watershed and flood control system. The watershed is roughly 42,000 sq mi, and flows are controlled by numerous reservoirs, bypasses, and levees. To overcome this challenge, the State invested in development of a comprehensive analysis "tool box" through various DWR programs. Development of the tool box included: collection of hydro-meteorological, topographic, geotechnical, and economic data; development of rainfall-runoff, reservoir operation, hydraulic routing, and flood risk analysis models; and development of specialized applications and computing schemes to accelerate the analysis. With this toolbox, DWR is analyzing flood hazard, flood control system performance, exposure and vulnerability of people and property to flooding, consequence of flooding for specific events, and finally flood risk for a range of CVFPP alternatives. Based on the results, DWR will put forward a State Recommended Plan in the 2017 CVFPP. Further, the value of the analysis tool box extends beyond the CVFPP. 
It will serve as a foundation for other flood studies for years to come and has already been successfully applied for inundation mapping to support emergency response, reservoir operation analysis, and others.
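A standard endpoint of the kind of flood risk analysis described above is expected annual damage (EAD): the damage-frequency curve integrated over annual exceedance probability, typically by the trapezoidal rule. A minimal sketch, with event probabilities and damage values that are purely illustrative, not DWR results:

```python
# Expected-annual-damage (EAD) sketch: trapezoidal integration of a
# damage-frequency curve. The curve values below are invented.
def expected_annual_damage(curve):
    """curve: (annual exceedance probability, damage in $) pairs, ordered
    from frequent/small events to rare/large events."""
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(curve, curve[1:]):
        ead += (p1 - p2) * (d1 + d2) / 2.0   # trapezoid over probability
    return ead

curve = [
    (0.10, 0.0),     # 10-year event: negligible damage
    (0.02, 2.0e8),   # 50-year event
    (0.01, 6.0e8),   # 100-year event
    (0.005, 1.5e9),  # 200-year event
]
print(f"EAD: ${expected_annual_damage(curve):,.0f}")
```

Comparing EAD with and without a proposed improvement is one way such a "tool box" quantifies how an alternative changes flood risk.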
How the ownership structures cause epidemics in financial markets: A network-based simulation model
NASA Astrophysics Data System (ADS)
Dastkhan, Hossein; Gharneh, Naser Shams
2018-02-01
Analysis of systemic risks and contagions is one of the main challenges of policy makers and researchers in recent years. Network theory is introduced as a main approach in the modeling and simulation of financial and economic systems. In this paper, a simulation model is introduced based on the ownership network to analyze contagion and systemic risk events. For this purpose, different network structures with different values for parameters are considered to investigate the stability of the financial system in the presence of different kinds of idiosyncratic and aggregate shocks. The considered network structures include Erdos-Renyi, core-periphery, segregated and power-law networks. Moreover, the results of the proposed model are also calculated for a real ownership network. The results show that the network structure has a significant effect on the probability and the extent of contagion in financial systems. For each network structure, various values for the parameters result in remarkable differences in the systemic risk measures. The results of the real case show that the proposed model is appropriate for the analysis of systemic risk and contagion in financial markets, the identification of systemically important firms, and the estimation of market loss when initial failures occur. This paper suggests a new direction in the modeling of contagion in financial markets, in particular by clarifying the effects of new kinds of financial exposure. The paper's idea and analytical results may also be useful for financial policy makers, portfolio managers and firms to conduct their investment in the right direction.
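A common way to simulate contagion of this kind is a threshold cascade on a random network: a firm fails when the fraction of its failed holdings exceeds a loss threshold, and failures iterate to a fixed point. The sketch below is a generic illustration on an Erdos-Renyi-style ownership network; the network size, edge probability, and threshold are illustrative, not the paper's parameters.

```python
# Threshold-cascade contagion sketch on a random ownership network.
# Parameters are illustrative assumptions only.
import random

random.seed(7)
N, P_EDGE, THRESHOLD = 100, 0.05, 0.25

# holdings[i] = set of firms that firm i owns shares in
holdings = [{j for j in range(N) if j != i and random.random() < P_EDGE}
            for i in range(N)]

failed = {0}          # idiosyncratic initial shock: firm 0 fails
changed = True
while changed:        # iterate until no further failures propagate
    changed = False
    for i in range(N):
        if i in failed or not holdings[i]:
            continue
        loss = len(holdings[i] & failed) / len(holdings[i])
        if loss > THRESHOLD:
            failed.add(i)
            changed = True

print(f"cascade size: {len(failed)} of {N} firms")
```

Re-running the cascade over many random seeds and shock locations yields the contagion probability and extent measures the abstract refers to.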
NASA Astrophysics Data System (ADS)
Augustine, Kurt E.; Camp, Jon J.; Holmes, David R.; Huddleston, Paul M.; Lu, Lichun; Yaszemski, Michael J.; Robb, Richard A.
2012-03-01
Failure of the spine's structural integrity from metastatic disease can lead to both pain and neurologic deficit. Fractures that require treatment occur in over 30% of bony metastases. Our objective is to use computed tomography (CT) in conjunction with analytic techniques that have been previously developed to predict fracture risk in cancer patients with metastatic disease to the spine. Current clinical practice for cancer patients with spine metastasis often requires an empirical decision regarding spinal reconstructive surgery. Early image-based software systems used for CT analysis are time consuming and poorly suited for clinical application. The Biomedical Image Resource (BIR) at Mayo Clinic, Rochester has developed an image analysis computer program that calculates, from CT scans, the residual load-bearing capacity in a vertebra with metastatic cancer. The Spine Cancer Assessment (SCA) program is built on a platform designed for clinical practice, with a workflow format that allows for rapid selection of patient CT exams, followed by guided image analysis tasks, resulting in a fracture risk report. The analysis features allow the surgeon to quickly isolate a single vertebra and obtain an immediate pre-surgical multiple parallel section composite beam fracture risk analysis based on algorithms developed at Mayo Clinic. The analysis software is undergoing clinical validation studies. We expect this approach will facilitate patient management and the utilization of reliable guidelines for selecting among various treatment options based on fracture risk.
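CT-based structural analyses of this general kind often estimate each voxel's elastic modulus from its calibrated bone density, sum E·dA over every axial cross-section, and treat the weakest section as the load-capacity bottleneck. The sketch below illustrates that idea only; the density-modulus power law, its constants, and the voxel data are assumptions, not the Mayo Clinic algorithm.

```python
# Sketch of a CT-based axial-rigidity screen for a vertebral body.
# The power law E = a * rho^b and all numbers are illustrative assumptions.
def voxel_modulus(rho_g_cm3: float) -> float:
    # assumed density-modulus power law (MPa), common form in QCT literature
    return 0.0 if rho_g_cm3 <= 0 else 2000.0 * rho_g_cm3 ** 2

def axial_rigidity(section, voxel_area_mm2=0.25):
    # section: calibrated voxel densities (g/cm^3) in one axial CT slice
    return sum(voxel_modulus(r) for r in section) * voxel_area_mm2

sections = [
    [1.2, 1.1, 1.0, 1.1],   # intact cross-section
    [1.2, 0.3, 0.2, 1.0],   # lytic defect: low-density voxels
    [1.1, 1.0, 1.0, 1.2],
]
rigidities = [axial_rigidity(s) for s in sections]
weakest = min(range(len(sections)), key=lambda i: rigidities[i])
print(f"weakest section: #{weakest}, EA = {rigidities[weakest]:.0f}")
```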
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...
NASA Astrophysics Data System (ADS)
Ho, Long-Phi; Chau, Nguyen-Xuan-Quang; Nguyen, Hong-Quan
2013-04-01
The Nhieu Loc - Thi Nghe basin is the most important administrative and business area of Ho Chi Minh City. Due to the complexity of the basin, including increasing trends in rainfall intensity, (tidal) water levels, and land subsidence, the simulation of hydrological and hydraulic variables for flood prediction is often inadequate in practical projects. The basin remains highly vulnerable despite multi-million USD investments in urban drainage improvement projects over the last decade. In this paper, an integrated system analysis in both spatial and temporal aspects, based on statistical, GIS, and modelling approaches, has been conducted in order to: (1) analyse risks before and after projects, (2) foresee water-related risk under uncertainties of unfavourable driving factors, and (3) develop a sustainable flood risk management strategy for the basin. The results show that, within this framework of risk analysis and adaptive strategy, certain urban development plans in the basin must be carefully revised and/or checked in order to reduce highly unexpected losses in the future.
The Role and Quality of Software Safety in the NASA Constellation Program
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.
2010-01-01
In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software or involved software in their prevention. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found a number of traceability risks in the hazard reports that may impede verification of software and system safety.
The economic viability of pursuing a space power system concept
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1977-01-01
The development of a space power system requires no fundamental technological breakthroughs. There are, however, uncertainties regarding the degree to which necessary developments can be achieved or exceeded. An analysis is conducted concerning the implementation of a 5000 MW space-based solar power system based on photovoltaic conversion of solar energy to electrical energy. The solar array is about 13 km long and 5 km wide. Placed in geosynchronous orbit, it provides power to the earth for 30 years. Attention is given to the economic feasibility of a space power system, a risk analysis for space power systems, and the use of the presented methodology for comparing alternative technology development programs.
IEEE 1982. Proceedings of the international conference on cybernetics and society
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-01-01
The following topics were dealt with: knowledge-based systems; risk analysis; man-machine interactions; human information processing; metaphor, analogy and problem-solving; manual control modelling; transportation systems; simulation; adaptive and learning systems; biocybernetics; cybernetics; mathematical programming; robotics; decision support systems; analysis, design and validation of models; computer vision; systems science; energy systems; environmental modelling and policy; pattern recognition; nuclear warfare; technological forecasting; artificial intelligence; the Turin shroud; optimisation; workloads. Abstracts of individual papers can be found under the relevant classification codes in this or future issues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.
We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Zhou, Jinzhe; Zhou, Yanbing; Cao, Shougen; Li, Shikuan; Wang, Hao; Niu, Zhaojian; Chen, Dong; Wang, Dongsheng; Lv, Liang; Zhang, Jian; Li, Yu; Jiao, Xuelong; Tan, Xiaojie; Zhang, Jianli; Wang, Haibo; Zhang, Bingyuan; Lu, Yun; Sun, Zhenqing
2016-01-01
Reporting of surgical complications is common, but few reports provide information about their severity or estimate risk factors for complications, and those that do lack specificity. We retrospectively analyzed data on 2795 gastric cancer patients who underwent surgical procedures at the Affiliated Hospital of Qingdao University between June 2007 and June 2012, and established a multivariate logistic regression model to identify risk factors for postoperative complications graded according to the Clavien-Dindo classification system. Twenty-four out of 86 variables were identified as statistically significant in univariate logistic regression analysis; the 11 significant variables that entered the multivariate analysis were employed to produce the risk model. Liver cirrhosis, diabetes mellitus, Child classification, invasion of neighboring organs, combined resection, intraoperative transfusion, Billroth II reconstruction, malnutrition, surgical volume of surgeons, operating time and age were independent risk factors for postoperative complications after gastrectomy. Based on the logistic regression equation, p = e^(ΣBiXi)/(1 + e^(ΣBiXi)), a multivariate logistic regression predictive model that calculates the risk of postoperative morbidity was developed: p = 1/(1 + e^(4.810-1.287X1-0.504X2-0.500X3-0.474X4-0.405X5-0.318X6-0.316X7-0.305X8-0.278X9-0.255X10-0.138X11)). The accuracy, sensitivity and specificity of the model in predicting postoperative complications were 86.7%, 76.2% and 88.6%, respectively. This risk model, based on the Clavien-Dindo grading of complication severity and logistic regression analysis, can predict severe morbidity specific to an individual patient's risk factors, serve as an accurate decision-making tool for estimating a patient's risks and benefits of gastric surgery, and may serve as a template for the development of risk models for other surgical groups.
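The published equation can be evaluated directly. The sketch below uses the coefficients as printed in the abstract; the coding of the predictors X1..X11 (taken here as 0/1 indicators in the order listed) is an assumption for illustration, since the abstract does not specify each variable's coding.

```python
# Evaluate the abstract's logistic risk equation,
# p = 1 / (1 + e^(4.810 - sum(Bi * Xi))).
# Predictor coding as 0/1 indicators is an illustrative assumption.
import math

B = [1.287, 0.504, 0.500, 0.474, 0.405, 0.318,
     0.316, 0.305, 0.278, 0.255, 0.138]

def complication_probability(x):
    """x: 11 predictor values, ordered as the coefficients above."""
    z = 4.810 - sum(b * xi for b, xi in zip(B, x))
    return 1.0 / (1.0 + math.exp(z))

print(complication_probability([0] * 11))   # baseline: no risk factors
print(complication_probability([1] * 11))   # all risk factors present
```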
Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S
2016-01-01
In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but a low percentage of the overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability.
Critical asset and portfolio risk analysis: an all-hazards framework.
Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark
2007-08-01
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
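The notional product at the heart of the all-hazards formula can be sketched in a few lines; the asset names and numbers below are hypothetical, and a real CAPRA portfolio analysis would also apply the article's portfolio consequence model for interdependency effects:

```python
def capra_risk(threat, vulnerability, consequence):
    """Notional asset-level risk: annual likelihood of a hazard or attempt
    (threat), probability of success given an attempt (vulnerability), and
    loss given success (consequence)."""
    for p in (threat, vulnerability):
        if not 0.0 <= p <= 1.0:
            raise ValueError("threat and vulnerability must lie in [0, 1]")
    return threat * vulnerability * consequence

# First-order portfolio risk as a sum over assets, ignoring the
# interdependency effects the article's portfolio model adds:
portfolio = [
    ("asset A", 0.02, 0.6, 5e6),   # (name, threat/yr, vulnerability, loss in $)
    ("asset B", 0.01, 0.9, 2e7),
]
total = sum(capra_risk(t, v, c) for _, t, v, c in portfolio)
```

As the article notes, the parameter values can come from a high-level screening or from detailed systems analysis; the formula itself stays the same.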
Integration of expert knowledge and uncertainty in natural risk assessment
NASA Astrophysics Data System (ADS)
Baruffini, Mirko; Jaboyedoff, Michel
2010-05-01
Natural hazard events in alpine regions during the last decades, such as interruptions of the Swiss railway power supply and closures of the Gotthard highway, have increased awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; in these cases expert knowledge and experience, or engineering judgment, can be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge that make qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism for computing with words (Zadeh, 1965) and for modelling the qualitative human thought processes involved in analyzing complex systems and decisions. Uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. An alternative is a probabilistic approach based on the triangular probability density function (T-PDF), which can follow the same flow chart as FR.
We implemented the Swiss natural hazard recommendations with FR and with T-PDF-based probabilities in order to obtain hazard zoning and its uncertainties. We followed the same approach for each term of the risk equation, i.e. hazard, vulnerability, elements at risk, and exposure. This risk approach can be supported by a comprehensive use of several artificial intelligence (AI) technologies, for example: (1) GIS techniques; (2) FR or T-PDF for qualitatively predicting risks; and (3) multi-criteria evaluation for analyzing weak points. The main advantages of FR and the T-PDF are the ability to express not-fully-formalized knowledge, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach reveals quite a wide zone of uncertainty. REFERENCES Zadeh L.A. 1965: Fuzzy Sets. Information and Control, 8:338-353.
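A minimal sketch of the triangular membership function shared by the FR and T-PDF approaches; the class names and bounds below are illustrative, not the official Swiss zoning thresholds:

```python
def triangular_membership(x, a, b, c):
    """Membership degree in a triangular fuzzy set (a <= b <= c):
    zero outside [a, c], rising linearly to 1 at the mode b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Rating a hazard intensity against overlapping qualitative classes
# (illustrative bounds only):
classes = {"low": (-1, 0, 5), "medium": (2, 5, 8), "high": (5, 10, 11)}
intensity = 6.0
degrees = {name: triangular_membership(intensity, *abc) for name, abc in classes.items()}
```

The same triangle, normalized to unit area, serves as the T-PDF in the probabilistic variant, which is why both approaches can follow the same flow chart.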
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing method to show the effectiveness of the proposed model.
RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk
NASA Astrophysics Data System (ADS)
van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina
2015-04-01
Within the framework of the EU FP7 Marie Curie project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes and check-dams), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards for code as well as for data formats and service interfaces. The architecture of the system is modular: the various parts are loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components.
The risk assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and asset maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves) resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness/spatial multi-criteria evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows users to define future scenarios in terms of climate change, land use change and population change, and the time periods for which these scenarios will be made. The component does not generate these scenarios but uses input maps describing the effect of the scenarios on the hazard and asset maps. The last component is a communication and visualization component, which can compare scenarios and alternatives not only in the form of maps but also in other forms (risk curves, tables and graphs).
Schultz, Michael; Seo, Steven Bohwan; Holt, Alec; Regenbrecht, Holger
2015-11-18
Colorectal cancer (CRC) has a high incidence, especially in New Zealand; the reasons for this are unknown. While most cancers develop sporadically, a positive family history, determined by the number and age at diagnosis of affected first- and second-degree relatives with CRC, is one of the major factors that may increase an individual's lifetime risk. Before a patient can be enrolled in a surveillance program, a detailed assessment and documentation of the family history is important but time consuming and often inaccurate, and the documentation is usually paper-based. Our aim was therefore to develop and validate the usability and efficacy of a web-based family history assessment tool for CRC suitable for the general population. The tool also calculates the risk and makes a recommendation for surveillance. Two versions of an electronic assessment tool, diagram-based and questionnaire-based, were developed, with the risk analysis and recommendations for surveillance based on the New Zealand Guidelines Group recommendations. Accuracy of the tool was tested prior to the study by comparing its risk calculations with those made from the family history by experienced gastroenterologists. Members of the general public visiting a local science fair were asked to use and comment on the usability of the two interfaces. Ninety people assessed and commented on the two interfaces. Both interfaces were effective in assessing the risk of developing CRC through the familial history of CRC. However, the questionnaire-based interface performed with significantly better satisfaction (p = 0.001) than the diagram-based interface; there was no difference in efficacy. We conclude that a web-based questionnaire tool can assist in the accurate documentation and analysis of the family history relevant to determining an individual's risk of CRC based on local guidelines.
The calculator is now implemented and accessible through the web page of a local colorectal cancer awareness charity and is an integral part of the local general practitioners' e-referral system for colonic imaging.
Classifying Nanomaterial Risks Using Multi-Criteria Decision Analysis
NASA Astrophysics Data System (ADS)
Linkov, I.; Steevens, J.; Chappell, M.; Tervonen, T.; Figueira, J. R.; Merad, M.
There is rapidly growing interest by regulatory agencies and stakeholders in the potential toxicity and other risks associated with nanomaterials throughout the different stages of the product life cycle (e.g., development, production, use and disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material because of the variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as promote the safe use and handling of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials into different risk categories based on our current knowledge of nanomaterials' physico-chemical characteristics, variation in the produced material, and best professional judgment. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
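SMAA-TRI itself sorts alternatives into categories using ELECTRE TRI profiles; the sketch below illustrates only the core Monte Carlo idea of exploring feasible criterion weights, with hypothetical material scores and "acceptability" defined here as the fraction of weight samples in which a material ranks as riskiest:

```python
import random

def smaa_rank_acceptability(scores, n_samples=10000, seed=0):
    """Monte Carlo sketch in the spirit of SMAA: sample uniform random
    criterion weights and record how often each alternative ranks first.

    scores: dict name -> list of criterion measurements (higher = riskier).
    Returns dict name -> fraction of weight samples in which it ranked first.
    """
    rng = random.Random(seed)
    names = list(scores)
    n_crit = len(next(iter(scores.values())))
    wins = {name: 0 for name in names}
    for _ in range(n_samples):
        w = [rng.random() for _ in range(n_crit)]
        s = sum(w)
        w = [wi / s for wi in w]  # normalize weights to sum to 1
        totals = {n: sum(wi * xi for wi, xi in zip(w, scores[n])) for n in names}
        wins[max(totals, key=totals.get)] += 1
    return {n: wins[n] / n_samples for n in names}

# Hypothetical toxicity / variability / life-cycle-impact scores for two materials:
result = smaa_rank_acceptability({"nano-A": [0.9, 0.2, 0.5], "nano-B": [0.4, 0.8, 0.6]})
```

A grouping is robust, in SMAA terms, when an alternative keeps its rank or category across most of the feasible weight space rather than only at one chosen weight vector.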
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
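The PFA idea of propagating parameter uncertainty through an engineering failure model can be sketched with a generic stress-versus-strength comparison; the lognormal parameters below are assumptions for illustration, not values from the report:

```python
import math
import random

def pfa_failure_probability(n_samples=20000, seed=1):
    """Monte Carlo sketch of the PFA idea: propagate uncertainty in the
    parameters of an engineering failure model (here a generic stress vs.
    strength comparison with assumed lognormal uncertainties) to estimate
    the probability of the failure mode."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        # Assumed parameter uncertainties (illustrative only):
        stress = math.exp(rng.gauss(math.log(80.0), 0.10))     # applied load
        strength = math.exp(rng.gauss(math.log(100.0), 0.08))  # material capability
        if stress >= strength:
            failures += 1
    return failures / n_samples
```

In the full methodology this prior failure-probability distribution would then be updated, via the PFA statistical procedures, to reflect accumulated test or flight experience.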
Wallops Ship Surveillance System
NASA Technical Reports Server (NTRS)
Smith, Donna C.
2011-01-01
Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve the current ship surveillance method, the system is designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystems areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.
Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da
2016-12-01
Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases: risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving the hazardous waste landfills and migration in the vadose zone as well as the aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis, and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk of groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2% of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method is feasible and valid, and can provide reference data for the risk management of groundwater contamination at hazardous waste landfill sites.
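The two statistical steps named above, PCA-derived indicator weights and K-means class boundaries, can be sketched with synthetic data (pure standard library; the 37x14 matrix mirrors the paper's 37 sites and 14 indicators, but the values are random):

```python
import random

def standardize(X):
    """Column-wise z-scores of a list-of-rows matrix."""
    n, m = len(X), len(X[0])
    cols = list(zip(*X))
    means = [sum(c) / n for c in cols]
    sds = [(sum((v - mu) ** 2 for v in c) / n) ** 0.5 for c, mu in zip(cols, means)]
    return [[(X[i][j] - means[j]) / sds[j] for j in range(m)] for i in range(n)]

def pca_weights(X, iters=200):
    """First-principal-component loadings via power iteration on the covariance
    matrix of the standardized indicators, normalized to sum to 1."""
    Z = standardize(X)
    n, m = len(Z), len(Z[0])
    cov = [[sum(Z[i][a] * Z[i][b] for i in range(n)) / n for b in range(m)]
           for a in range(m)]
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(m)) for a in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    loadings = [abs(x) for x in v]
    s = sum(loadings)
    return [x / s for x in loadings]

def kmeans_1d(values, k=3, iters=50):
    """Tiny 1-D k-means to split composite scores into k risk classes."""
    sv = sorted(values)
    centers = [sv[int((i + 0.5) * len(sv) / k)] for i in range(k)]
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(x - centers[j])) for x in values]
        for j in range(k):
            members = [x for x, lab in zip(values, labels) if lab == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

rng = random.Random(0)
X = [[rng.random() for _ in range(14)] for _ in range(37)]  # 37 sites x 14 indicators
w = pca_weights(X)
scores = [sum(wi * xi for wi, xi in zip(w, row)) for row in X]
labels = kmeans_1d(scores, k=3)
```

The paper clusters indicator values to set class boundaries rather than clustering composite scores; the sketch compresses both steps into one pipeline to show how weights and boundaries combine into a three-class ranking.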
Interactive decision support in hepatic surgery
Dugas, Martin; Schauer, Rolf; Volk, Andreas; Rau, Horst
2002-01-01
Background Hepatic surgery is characterized by complicated operations with a significant peri- and postoperative risk for the patient. We developed a web-based, high-granular research database for comprehensive documentation of all relevant variables to evaluate new surgical techniques. Methods To integrate this research system into the clinical setting, we designed an interactive decision support component. The objective is to provide relevant information for the surgeon and the patient to assess preoperatively the risk of a specific surgical procedure. Based on five established predictors of patient outcomes, the risk assessment tool searches for similar cases in the database and aggregates the information to estimate the risk for an individual patient. Results The physician can verify the analysis and manually exclude non-matching cases according to his expertise. The analysis is visualized by means of a Kaplan-Meier plot. To evaluate the decision support component, we analyzed data on 165 patients diagnosed with hepatocellular carcinoma (period 1996–2000). The similarity search yields a two-peak distribution, indicating that there are groups of similar patients as well as singular cases that are quite different from the average. The results of the risk estimation are consistent with the observed survival data, but must be interpreted with caution because of the limited number of matching reference cases. Conclusion Critical issues for the decision support system are clinical integration, a transparent and reliable knowledge base, and user feedback. PMID:12003639
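The Kaplan-Meier visualization mentioned above follows a simple product-limit recipe; a self-contained sketch with synthetic follow-up times (event flag 1 = death observed, 0 = censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate.

    times:  observed follow-up times
    events: 1 if the event occurred at that time, 0 if censored
    Returns (event_times, survival_probabilities) for plotting a step curve.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    out_t, out_s = [], []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = censored = 0
        while i < len(order) and times[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk  # survive this event time
            out_t.append(t)
            out_s.append(surv)
        at_risk -= deaths + censored
    return out_t, out_s

# Synthetic follow-up data (months), not from the 165-patient cohort:
t, s = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0])
```

Subjects censored at an event time are still counted as at risk at that time, which is the standard convention; the curve only steps down at observed events.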
Space Shuttle critical function audit
NASA Technical Reports Server (NTRS)
Sacks, Ivan J.; Dipol, John; Su, Paul
1990-01-01
A large fault-tolerance model of the main propulsion system of the US space shuttle has been developed. This model is being used to identify single components and pairs of components that will cause loss of shuttle critical functions. In addition, this model is the basis for risk quantification of the shuttle. The process used to develop and analyze the model is digraph matrix analysis (DMA). The DMA modeling and analysis process is accessed via a graphics-based computer user interface. This interface provides coupled display of the integrated system schematics, the digraph models, the component database, and the results of the fault tolerance and risk analyses.
Evaluating the Cost, Safety, and Proliferation Risks of Small Floating Nuclear Reactors.
Ford, Michael J; Abdulla, Ahmed; Morgan, M Granger
2017-11-01
It is hard to see how our energy system can be decarbonized if the world abandons nuclear power, but equally hard to introduce the technology in nonnuclear energy states. This is especially true in countries with limited technical, institutional, and regulatory capabilities, where safety and proliferation concerns are acute. Given the need to achieve serious emissions mitigation by mid-century, and the multidecadal effort required to develop robust nuclear governance institutions, we must look to other models that might facilitate nuclear plant deployment while mitigating the technology's risks. One such deployment paradigm is the build-own-operate-return model. Because returning small land-based reactors containing spent fuel is infeasible, we evaluate the cost, safety, and proliferation risks of a system in which small modular reactors are manufactured in a factory, and then deployed to a customer nation on a floating platform. This floating small modular reactor would be owned and operated by a single entity and returned unopened to the developed state for refueling. We developed a decision model that allows for a comparison of floating and land-based alternatives considering key International Atomic Energy Agency plant-siting criteria. Abandoning onsite refueling is beneficial, and floating reactors built in a central facility can potentially reduce the risk of cost overruns and the consequences of accidents. However, if the floating platform must be built to military-grade specifications, then the cost would be much higher than a land-based system. The analysis tool presented is flexible, and can assist planners in determining the scope of risks and uncertainty associated with different deployment options.
Risk-based principles for defining and managing water security
Hall, Jim; Borgomeo, Edoardo
2013-01-01
The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
Atella, Vincenzo; Brunetti, Marianna; Maestas, Nicole
2012-05-01
Health risk is increasingly viewed as an important form of background risk that affects household portfolio decisions. However, its role might be mediated by the presence of a protective full-coverage national health service that could reduce households' probability of incurring current and future out-of-pocket medical expenditures. We use SHARE data to study the influence of current health status and future health risk on the decision to hold risky assets, across ten European countries with different health systems, each offering a different degree of protection against out-of-pocket medical expenditures. We find robust empirical evidence that perceived health status matters more than objective health status and, consistent with the theory of background risk, health risk affects portfolio choices only in countries with less protective health care systems. Furthermore, portfolio decisions consistent with background risk models are observed only with respect to middle-aged and highly-educated investors.
Constellation Program (CxP) Crew Exploration Vehicle (CEV) Project Integrated Landing System
NASA Technical Reports Server (NTRS)
Baker, John D.; Yuchnovicz, Daniel E.; Eisenman, David J.; Peer, Scott G.; Fasanella, Edward L.; Lawrence, Charles
2009-01-01
The Crew Exploration Vehicle (CEV) Chief Engineer requested a risk comparison of the Integrated Landing System design developed by NASA and the design developed by the Contractor, referred to as the LM 604 baseline. Based on the results of this risk comparison, the CEV Chief Engineer requested that the NESC evaluate the identified risks and develop strategies for their reduction or mitigation. The assessment progressed in two phases. A brief Phase I analysis was performed by the Water versus Land-Landing Team to compare the CEV Integrated Landing System proposed by the Contractor against the NASA TS-LRS001 baseline with respect to risk. A Phase II effort examined the areas of critical importance to the overall landing risk, evaluating risk to the crew and to the CEV Crew Module (CM) during a nominal land landing. The findings of the assessment are contained in this report.
Unmanned aircraft system sense and avoid integrity and continuity
NASA Astrophysics Data System (ADS)
Jamoom, Michael B.
This thesis describes new methods to guarantee safety of sense and avoid (SAA) functions for Unmanned Aircraft Systems (UAS) by evaluating integrity and continuity risks. Previous SAA efforts focused on relative safety metrics, such as risk ratios, comparing the risk of using an SAA system versus not using it. The methods in this thesis evaluate integrity and continuity risks as absolute measures of safety, as is the established practice in commercial aircraft terminal area navigation applications. The main contribution of this thesis is a derivation of a new method, based on a standard intruder relative constant velocity assumption, that uses hazard state estimates and estimate error covariances to establish (1) the integrity risk of the SAA system not detecting imminent loss of "well clear," which is the time and distance required to maintain safe separation from intruder aircraft, and (2) the probability of false alert, the continuity risk. Another contribution is applying these integrity and continuity risk evaluation methods to set quantifiable and certifiable safety requirements on sensors. A sensitivity analysis uses this methodology to evaluate the impact of sensor errors on integrity and continuity risks. The penultimate contribution is an integrity and continuity risk evaluation where the estimation model is refined to address realistic intruder relative linear accelerations, which goes beyond the current constant velocity standard. The final contribution is an integrity and continuity risk evaluation addressing multiple intruders. This evaluation is a new innovation-based method to determine the risk of mis-associating intruder measurements. A mis-association occurs when the SAA system incorrectly associates a measurement to the wrong intruder, causing large errors in the estimated intruder trajectories.
The new methods described in this thesis can help ensure safe encounters between aircraft and enable SAA sensor certification for UAS integration into the National Airspace System.
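The thesis' multi-dimensional derivation is not reproduced in the abstract; as a rough one-dimensional sketch of the absolute-risk framing, assuming a Gaussian miss-distance estimate error (the threshold and all numbers below are illustrative assumptions, not values from the thesis):

```python
from math import erf, sqrt

def well_clear_violation_prob(est_miss_km, sigma_km, well_clear_km):
    """P(true miss distance < well-clear threshold), assuming the
    miss-distance estimate error is Gaussian with std dev sigma_km.
    Without an alert, this probability is the integrity risk (missed
    detection); with an alert, one minus it is the continuity risk
    (false alert)."""
    z = (well_clear_km - est_miss_km) / sigma_km
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at z

# Illustrative numbers: estimated miss 2.0 km, 1-sigma error 0.5 km,
# assumed well-clear threshold 1.2 km
p_violation = well_clear_violation_prob(2.0, 0.5, 1.2)  # ~0.055
```

The point of the absolute framing is visible here: the same covariance that drives the state estimate directly bounds the missed-detection and false-alert probabilities, rather than being compared against a no-SAA baseline.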
Multi Criteria Evaluation Module for RiskChanges Spatial Decision Support System
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; Jaboyedoff, Michel; van Westen, Cees; Bakker, Wim
2015-04-01
The Multi-Criteria Evaluation (MCE) module is one of the five modules of the RiskChanges spatial decision support system. The RiskChanges web-based platform analyzes changes in hydro-meteorological risk and provides tools for selecting the best risk reduction alternative. It was developed under the CHANGES framework (changes-itn.eu) and the INCREO project (increo-fp7.eu). The MCE tool helps decision makers and spatial planners to evaluate, sort and rank decision alternatives. Users can choose among indicators defined within the system from the Risk and Cost-Benefit analysis results, and can also add their own indicators; the system then standardizes and prioritizes them. Finally, the best decision alternative is selected using the weighted sum model (WSM). The aim of this work is to support MCE for analyzing risk that changes over time under different scenarios and future years, bringing group decision making into practice and allowing results to be compared in numeric and graphical views within the system. We believe this study helps decision makers reach the best solution by expressing their preferences for strategies under future scenarios. Keywords: Multi-Criteria Evaluation, Spatial Decision Support System, Weighted Sum Model, Natural Hazard Risk Management
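The weighted sum model (WSM) selection step can be sketched as follows; the indicator names, standardized scores, and weights are hypothetical, not taken from RiskChanges:

```python
def weighted_sum_rank(alternatives, weights):
    """Score each decision alternative by the weighted sum model (WSM).

    alternatives: {name: {criterion: standardized score in [0, 1]}}
    weights: {criterion: weight}, assumed to sum to 1.
    Returns (name, score) pairs sorted best-first.
    """
    scores = {
        name: sum(weights[c] * v for c, v in criteria.items())
        for name, criteria in alternatives.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical standardized indicators for three risk-reduction alternatives
alts = {
    "dike_raising":  {"risk_reduction": 0.8, "cost_benefit": 0.4},
    "early_warning": {"risk_reduction": 0.5, "cost_benefit": 0.9},
    "relocation":    {"risk_reduction": 0.9, "cost_benefit": 0.2},
}
weights = {"risk_reduction": 0.6, "cost_benefit": 0.4}
ranking = weighted_sum_rank(alts, weights)  # best alternative first
```

In a group decision-making setting, each stakeholder's weight vector can be run through the same function and the resulting rankings compared side by side.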
MAVEN Information Security Governance, Risk Management, and Compliance (GRC): Lessons Learned
NASA Technical Reports Server (NTRS)
Takamura, Eduardo; Gomez-Rosa, Carlos A.; Mangum, Kevin; Wasiak, Fran
2014-01-01
As the first interplanetary mission managed by the NASA Goddard Space Flight Center, the Mars Atmosphere and Volatile EvolutioN (MAVEN) had three IT security goals for its ground system: COMPLIANCE, (IT) RISK REDUCTION, and COST REDUCTION. In a multiorganizational environment in which government, industry and academia work together in support of the ground system and mission operations, information security governance, risk management, and compliance (GRC) becomes a challenge, as each component of the ground system has and follows its own set of IT security requirements. These requirements are not necessarily the same as, or even similar to, each other's, making the auditing of the ground system security a challenging feat. A combination of standards-based information security management based on the National Institute of Standards and Technology (NIST) Risk Management Framework (RMF), due diligence by the Mission's leadership, and effective collaboration among all elements of the ground system enabled MAVEN to successfully meet NASA's requirements for IT security, and therefore meet the Federal Information Security Management Act (FISMA) mandate on the Agency. Throughout the implementation of GRC on MAVEN during the early stages of the mission development, the Project faced many challenges, some of which have been identified in this paper. The purpose of this paper is to document these challenges, and provide a brief analysis of the lessons MAVEN learned. The historical information documented herein, derived from an internal pre-launch lessons learned analysis, can be used by current and future missions and organizations implementing and auditing GRC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khalil, Y. F.
2012-04-30
The objective of this project is to examine safety aspects of candidate hydrogen storage materials and systems being developed in the DOE Hydrogen Program. As a result of this effort, the general DOE safety target will be given useful meaning by establishing a link between the characteristics of new storage materials and the satisfaction of safety criteria. This will be accomplished through the development and application of formal risk analysis methods, standardized materials testing, chemical reactivity characterization, novel risk mitigation approaches and subscale system demonstration. The project also will collaborate with other DOE and international activities in materials-based hydrogen storage safety to provide a larger, highly coordinated effort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munganahalli, D.
Sedco Forex is a drilling contractor that operates approximately 80 rigs on land and offshore worldwide. The HSE management system developed by Sedco Forex is an effort to prevent accidents and minimize losses. An integral part of the HSE management system is establishing risk profiles and thereby minimizing risk and reducing loss exposures. Risk profiles are established based on accident reports, potential accident reports and other risk identification reports (RIR) like the Du Pont STOP system. A rig could fill in as many as 30 accident reports, 30 potential accident reports and 500 STOP cards each year. Statistics are important for an HSE management system, since they are indicators of the success or failure of HSE systems. It is, however, difficult to establish risk profiles based on statistical information unless tools are available at the rig site to aid with the analysis. Risk profiles are then used to identify important areas in the operation that may require specific attention to minimize the loss exposure. Programs to address the loss exposure can then be identified and implemented with either a local or corporate approach. In January 1995, Sedco Forex implemented a uniform HSE database on all the rigs worldwide. In one year companywide, the HSE database would contain information on approximately 500 accident and potential accident reports, and 10,000 STOP cards. This paper demonstrates the salient features of the database and describes how it has helped in establishing key risk profiles. It also shows a recent example of how risk profiles have been established at the corporate level and used to identify the key contributing factors to hand and finger injuries. Based on this information, a campaign was launched to minimize the frequency of occurrence and the associated losses attributed to hand and finger accidents.
Risk analysis for autonomous underwater vehicle operations in extreme environments.
Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter
2010-12-01
Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
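The nonparametric Kaplan-Meier survival estimate over mission distance can be sketched as follows; the mission distances and loss flags are illustrative, not Autosub3 data, and tie-breaking and censoring conventions are simplified:

```python
def kaplan_meier(distances, failed):
    """Kaplan-Meier survival estimate S(d) over mission distance.

    distances: distance travelled in each mission (km)
    failed: True if the mission ended in a loss event, False if censored
            (mission completed without loss)
    Returns [(distance, survival probability)] at each loss distance.
    """
    events = sorted(zip(distances, failed))
    n_at_risk = len(events)
    surv, curve = 1.0, []
    for d, did_fail in events:
        if did_fail:
            # multiply by (1 - events/n at risk) at this distance
            surv *= (n_at_risk - 1) / n_at_risk
            curve.append((d, surv))
        n_at_risk -= 1  # this mission leaves the risk set either way
    return curve

# Illustrative mission record: distances in km; True = loss event
curve = kaplan_meier([10, 25, 40, 60, 80], [False, True, False, True, False])
```

Censored missions (completed without loss) still shrink the risk set, which is what lets the estimator use the full fault/incident history rather than only the losses.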
Multicriteria Decision Framework for Cybersecurity Risk Assessment and Management.
Ganin, Alexander A; Quach, Phuoc; Panwar, Mahesh; Collier, Zachary A; Keisler, Jeffrey M; Marchese, Dayton; Linkov, Igor
2017-09-05
Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and the complex network structures often including thousands of nodes. Here, we review probabilistic and risk-based decision-making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision-analysis-based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate biases and subjectivity necessary for selecting countermeasures, but provides justifiable methods for selecting risk management actions consistent with stakeholder and decisionmaker values and technical data. Published 2017. This article is a U.S. Government work and is in the public domain in the U.S.A.
A risk-based approach to robotic mission requirements
NASA Technical Reports Server (NTRS)
Dias, William C.; Bourke, Roger D.
1992-01-01
A NASA Risk Team has developed a method for the application of risk management to the definition of robotic mission requirements for the Space Exploration Initiative. These requirements encompass environmental information, infrastructural emplacement in advance, and either technology testing or system/subsystems demonstration. Attention is presently given to a method for step-by-step consideration and analysis of the risk component inherent in mission architecture, followed by a calculation of the subjective risk level. Mitigation strategies are then applied with the same rules, and a comparison is made.
A GIS-based approach for comparative analysis of potential fire risk assessment
NASA Astrophysics Data System (ADS)
Sun, Ying; Hu, Lieqiu; Liu, Huiping
2007-06-01
Urban fires are one of the most important sources of property loss and human casualty, and it is therefore necessary to assess potential fire risk with consideration of urban community safety. Two evaluation models are proposed, both of which are integrated with GIS. One is a single-factor model concerning the accessibility of fire passages, and the other is a grey clustering approach based on a multifactor system. In the latter model, fourteen factors are introduced and divided into four categories: security management, evacuation facility, construction resistance and fire fighting capability. A case study on the campus of Beijing Normal University is presented to illustrate the potential risk assessment models in detail. A comparative analysis of the two models is carried out to validate their accuracy. The results are approximately consistent with each other. Moreover, modeling with GIS improves the efficiency of the potential risk assessment.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or the risk of failure probability. While the consequence of failure is often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.
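The gap the paper describes can be illustrated with a minimal sketch: converting a handbook-style constant failure rate into a mission probability of failure, and noting that process-related failure modes sit outside that prediction (the failure rate, mission duration, and process-escape probability below are illustrative assumptions, not values from the paper):

```python
import math

def mission_failure_prob(lambda_fpmh, mission_hours):
    """Probability of failure over a mission, assuming a constant
    (exponential) failure rate as in handbook-style predictions.

    lambda_fpmh: predicted failure rate in failures per million hours
    """
    lam = lambda_fpmh / 1e6            # failures per hour
    return 1.0 - math.exp(-lam * mission_hours)

# Hypothetical avionics box: 5 failures/million hours, 10-hour mission
p_component = mission_failure_prob(5.0, 10.0)   # ~5e-5

# The component prediction alone can understate risk: manufacturing and
# process escapes (not captured in the handbook rate) contribute their
# own, assumed-independent, probability of failure.
p_process_escape = 1e-4                          # illustrative assumption
p_total = 1.0 - (1.0 - p_component) * (1.0 - p_process_escape)
```

Even with a small assumed process-escape probability, `p_total` here is roughly triple `p_component`, which is the sense in which a prediction-only risk model can understate the probability of occurrence.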
Image Based Biomarker of Breast Cancer Risk: Analysis of Risk Disparity among Minority Populations
2013-03-01
TITLE: Image Based Biomarker of Breast Cancer Risk: Analysis of Risk Disparity among Minority Populations
PRINCIPAL INVESTIGATOR: Fengshan Liu
...identifying the prevalence of women with incomplete visualization of the breast. We developed a code to estimate the breast cancer risks using the...
NASA Astrophysics Data System (ADS)
Ersin, Ozlem Hacer
Novel technologies and their resultant products demand fresh ways of thinking about pre-market risk analysis and post-market surveillance. A regulatory framework that is responsive to emerging knowledge about the hazards of novel technologies offers repeatable and transparent processes and remains economically and socially feasible. Workers are an especially vulnerable population, exposed to unknown hazards of novel technologies and often serving as unwitting sentinels of impending risks. This Grounded Theory-based case study identifies gaps in our current ability to regulate novel technologies so as to minimize occupational health risks, and offers the modifications necessary for an environment that is conducive to proper regulation. Nanopharmaceuticals, and the nano-based technologies underlying them, serve as exemplars of technologies that are currently taxing the ability of the regulatory system to provide adequate oversight. Ambiguities of definition, the absence of a tracking system (of who is doing nanotechnology research), and the paucity of scientific evidence to support risk management efforts are among the findings of the study, and need to be addressed as ameliorative steps toward an effective regulatory structure.
Braulke, Friederike; Platzbecker, Uwe; Müller-Thomas, Catharina; Götze, Katharina; Germing, Ulrich; Brümmendorf, Tim H; Nolte, Florian; Hofmann, Wolf-Karsten; Giagounidis, Aristoteles A N; Lübbert, Michael; Greenberg, Peter L; Bennett, John M; Solé, Francesc; Mallo, Mar; Slovak, Marilyn L; Ohyashiki, Kazuma; Le Beau, Michelle M; Tüchler, Heinz; Pfeilstöcker, Michael; Nösslinger, Thomas; Hildebrandt, Barbara; Shirneshan, Katayoon; Aul, Carlo; Stauder, Reinhard; Sperr, Wolfgang R; Valent, Peter; Fonatsch, Christa; Trümper, Lorenz; Haase, Detlef; Schanz, Julie
2015-02-01
International Prognostic Scoring Systems are used to determine the individual risk profile of myelodysplastic syndrome patients. For the assessment of International Prognostic Scoring Systems, an adequate chromosome banding analysis of the bone marrow is essential. Cytogenetic information is not available for a substantial number of patients (5%-20%) with dry marrow or an insufficient number of metaphase cells. For these patients, a valid risk classification is impossible. In the study presented here, the International Prognostic Scoring Systems were validated based on fluorescence in situ hybridization analyses using extended probe panels applied to cluster of differentiation 34 positive (CD34(+)) peripheral blood cells of 328 MDS patients of our prospective multicenter German diagnostic study and compared to chromosome banding results of 2902 previously published patients with myelodysplastic syndromes. For cytogenetic risk classification by fluorescence in situ hybridization analyses of CD34(+) peripheral blood cells, the groups differed significantly for overall and leukemia-free survival by uni- and multivariate analyses without discrepancies between treated and untreated patients. Including cytogenetic data of fluorescence in situ hybridization analyses of peripheral CD34(+) blood cells (instead of bone marrow banding analysis) into the complete International Prognostic Scoring System assessment, the prognostic risk groups separated significantly for overall and leukemia-free survival. Our data show that a reliable stratification to the risk groups of the International Prognostic Scoring Systems is possible from peripheral blood in patients with missing chromosome banding analysis by using a comprehensive probe panel (clinicaltrials.gov identifier:01355913). Copyright© Ferrata Storti Foundation.
Vulnerability and risk of deltaic social-ecological systems exposed to multiple hazards.
Hagenlocher, Michael; Renaud, Fabrice G; Haas, Susanne; Sebesvari, Zita
2018-08-01
Coastal river deltas are hotspots of global change impacts. Sustainable delta futures are increasingly threatened due to rising hazard exposure combined with high vulnerabilities of deltaic social-ecological systems. While the need for integrated multi-hazard approaches has been clearly articulated, studies on vulnerability and risk in deltas either focus on local case studies or single hazards and do not apply a social-ecological systems perspective. As a result, vulnerabilities and risks in areas with strong social and ecological coupling, such as coastal deltas, are not fully understood and the identification of risk reduction and adaptation strategies are often based on incomplete assumptions. To overcome these limitations, we propose an innovative modular indicator library-based approach for the assessment of multi-hazard risk of social-ecological systems across and within coastal deltas globally, and apply it to the Amazon, Ganges-Brahmaputra-Meghna (GBM), and Mekong deltas. Results show that multi-hazard risk is highest in the GBM delta and lowest in the Amazon delta. The analysis reveals major differences between social and environmental vulnerability across the three deltas, notably in the Mekong and the GBM deltas where environmental vulnerability is significantly higher than social vulnerability. Hotspots and drivers of risk vary spatially, thus calling for spatially targeted risk reduction and adaptation strategies within the deltas. Ecosystems have been identified as both an important element at risk as well as an entry point for risk reduction and adaptation strategies. Copyright © 2018. Published by Elsevier B.V.
ASIL determination for motorbike's Electronics Throttle Control System (ETCS) malfunction
NASA Astrophysics Data System (ADS)
Zaman Rokhani, Fakhrul; Rahman, Muhammad Taqiuddin Abdul; Ain Kamsani, Noor; Sidek, Roslina Mohd; Saripan, M. Iqbal; Samsudin, Khairulmizam; Khair Hassan, Mohd
2017-11-01
The Electronics Throttle Control System (ETCS) is the principal electronic unit in all fuel-injection engine motorbikes, improving engine performance efficiency in comparison to the conventional carburetor-based engine. The ETCS is regarded as a safety-critical component: an ETCS malfunction can cause an unintended acceleration or deceleration event, which can be hazardous to riders. In this study, Hazard Analysis and Risk Assessment, an ISO 26262 functional safety standard analysis, has been applied to the motorbike's ETCS to determine the required automotive safety integrity level. Based on the analysis, the established automotive safety integrity level can help to derive technical and functional safety measures for ETCS development.
NASA Astrophysics Data System (ADS)
Simicevic, Aleksandra; Bonadonna, Costanza; di Traglia, Federico; Rosi, Mauro
2010-05-01
Volcanic eruptions are accompanied by numerous hazards which pose short- and long-term threats to people and property. Recent experiences have shown that successful responses to hazard events correlate strongly with the degree to which proactive policies of risk reduction are already in place before an eruption occurs. Effective proactive risk-reduction strategies require contributions from numerous disciplines. A volcanic eruption is not a hazard, per se, but rather an event capable of producing a variety of hazards (e.g. earthquakes, pyroclastic density currents, lava flows, tephra fall, lahars, landslides, gas release, and tsunamis) that can affect the built environment in a variety of ways, over different time scales and with different degrees of intensity. Our proposed model for the assessment and mitigation of exposure-based volcanic risk is mainly based on the compilation of three types of maps: hazard maps, hazard-specific vulnerability maps and exposure-based risk maps. Hazard maps identify the spatial distribution of each individual volcanic hazard and include both event analysis and impact analysis. Hazard-specific vulnerability maps represent the systematic evaluation of the physical vulnerability of the built environment to a range of volcanic phenomena, i.e. the spatial distribution of buildings vulnerable to a given hazard based on the analysis of selected building elements. Buildings are classified on the basis of their major components that are relevant for different volcanic hazards, their strength and their construction materials, and are defined taking into account the potential damage that each group of building elements (e.g. walls, roof, load-bearing structure) will suffer under a volcanic hazard. All these factors are enumerated in a checklist and are used for the building survey. Hazard-specific vulnerability maps are then overlaid on hazard maps in order to compile exposure-based risk maps and so quantify the potential damage.
Such quantification is the starting point of the identification of suitable mitigation measures which will be analyzed through a cost-benefit analysis to assess their financial feasibility. Information about public networks is also recorded in order to give an overall idea of the built environment condition of the island. The vulnerability assessment of the technical systems describes the potential damages that could stress systems like electricity supply, water distribution, communication networks or transport systems. These damages can also be described as function disruption of the system. The important aspect is not only the physical capacity of a system to resist, but also its capacity to continue functioning. The model will be tested on the island of Vulcano in southern Italy. Vulcano is characterized by clear signs of volcanic unrest and is the type locality for a deadly style of eruption. The main active system of Vulcano Island (La Fossa cone) is known to produce a variety of eruption styles and intensities, each posing their own hazards and threats. Six different hazard scenarios have been identified based on a detailed stratigraphic work. The urbanization on Vulcano took place in the 1980s with no real planning and its population mostly subsists on tourism. Our preliminary results show that Vulcano is not characterized by a great variability of architectural typologies and construction materials. Three main types of buildings are present (masonry with concrete frame, masonry with manufactured stone units, masonry with hollow clay bricks) and no statistically significant trends were found between physical and morphological characteristics. The recent signs of volcanic unrest combined with a complex vulnerability of the island due to an uncontrolled urban development and a significant seasonal variation of the exposed population in summer months result in a high volcanic risk. 
As a result, Vulcano represents the ideal environment to test a multi-hazard based risk model and to study the transition between micro (building) and macro (urban environment) scale of analysis, which is still an unexplored field in the study of volcanic risk. Different levels of vulnerability need to be analyzed in order to increase the level of preparedness, plan a potential evacuation, manage a potential volcanic crisis and assess the best mitigation measures to put in place and reduce the volcanic risk.
NASA Astrophysics Data System (ADS)
van der Vat, Marnix; Schasfoort, Femke; Van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; Van den Hoek, Paul; Ter Maat, Judith; Mens, Marjolein
2016-04-01
It is widely acknowledged that drought management should move from a crisis to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are for example crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach as well as the economic valuation of water shortage for various water users was explored in a study for the Dutch Government. First, an historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained into integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data is needed, which is not always readily available. 
Furthermore, for some sectors hydrological data was lacking to make a reliable estimate of drought return periods. By 2021, the Netherlands Government aims to agree on the water supply service levels, which should describe water availability and quality that can be delivered with a certain return period. The Netherlands' Ministry of Infrastructure and the Environment, representatives of the regional water boards and Rijkswaterstaat (operating the main water system) as well as several consultants and research institutes are important stakeholders for further development of the method, evaluation of cases and the development of a quantitative risk-informed decision-making tool.
Kim, MinJeong; Liu, Hongbin; Kim, Jeong Tai; Yoo, ChangKyoo
2014-08-15
Sensor faults in metro systems provide incorrect information to indoor air quality (IAQ) ventilation systems, resulting in the mis-operation of ventilation systems and adverse effects on passenger health. In this study, a new sensor validation method is proposed to (1) detect, identify and repair sensor faults and (2) evaluate the influence of sensor reliability on passenger health risk. To address the dynamic non-Gaussianity problem of IAQ data, dynamic independent component analysis (DICA) is used. To detect and identify sensor faults, the DICA-based squared prediction error and sensor validity index are used, respectively. To restore the faults to normal measurements, a DICA-based iterative reconstruction algorithm is proposed. The comprehensive indoor air-quality index (CIAI), which evaluates the influence of the current IAQ on passenger health, is then compared using the faulty and reconstructed IAQ data sets. Experimental results from a metro station showed that the DICA-based method can produce an improved IAQ level in the metro station and reduce passenger health risk, since it validates sensor faults more accurately than conventional methods do. Copyright © 2014 Elsevier B.V. All rights reserved.
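The squared-prediction-error (SPE) detection idea can be illustrated with a simplified PCA-based stand-in for the DICA model (the dynamic ICA machinery, control limits, and the simulated sensor data below are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def pca_spe(train, test, n_components=2):
    """Squared prediction error of test samples against a PCA model of
    normal operation; a large SPE flags a possible sensor fault."""
    mu = train.mean(axis=0)
    X = train - mu
    # principal directions of the normal-operation data
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    P = Vt[:n_components].T                  # loading matrix
    Y = test - mu
    R = Y - Y @ P @ P.T                      # residual off the model plane
    return (R ** 2).sum(axis=1)

# Simulated "normal" IAQ-like data: 4 sensors, two of them correlated
rng = np.random.default_rng(0)
normal = rng.normal(size=(200, 4))
normal[:, 3] = normal[:, 0] + 0.05 * rng.normal(size=200)

# A biased sensor 3 breaks the learned correlation structure
faulty = normal[:5].copy()
faulty[:, 3] += 5.0
spe = pca_spe(normal, faulty)   # well above the normal-operation SPE
```

The fault is visible precisely because the biased sensor violates the correlation the model learned; a univariate limit check on sensor 3 alone would miss a bias of this kind if the absolute reading stayed in range.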
NASA Applications and Lessons Learned in Reliability Engineering
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Fuller, Raymond P.
2011-01-01
Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making process. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of these case studies discussed are reliability-based life limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on the Space Shuttle System risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.
Quantitative risk assessment system (QRAS)
NASA Technical Reports Server (NTRS)
Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)
2001-01-01
A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
Study of the propensity for hemorrhage in Hispanic Americans with stroke.
Frey, James L; Jahnke, Heidi K; Goslar, Pamela W
2008-01-01
Multiple sources document a higher proportion of intraparenchymal hemorrhage (HEM) in Hispanic (HIS) than white (WHI) patients with stroke. We sought an explanation for this phenomenon through analysis of multiple variables in our hospital-based stroke population. We performed univariate and multivariate analysis of risk factors in our HIS and WHI patients with stroke to identify differences that might account for a greater propensity for HEM in HIS patients. Multivariate analysis disclosed that the risk of HEM correlated significantly with untreated hypertension (HTN), HIS ethnicity, and heavy alcohol intake. A negative correlation was found for hyperlipidemia and diabetes. Our HIS patients with stroke had a greater prevalence of untreated HTN and heavy alcohol intake, with HIS men being at greatest risk. HIS patients with stroke in our hospital-based population appear relatively more prone to HEM than do WHI patients. This risk correlates with a greater likelihood of having untreated HTN and heavy alcohol intake, more so for HIS men. The explanation appears to be a relative lack of health awareness and involvement in our health care system. The possibility that HIS ethnicity itself constitutes a biological risk factor for HEM remains a matter of speculation. Validation of this work with community data should lead to remediation through a community-based effort.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Analysis and Risk-Based Preventive Controls for Human Food'' (the proposed preventive controls rule) and... Farm.'' The purpose of the draft RA is to provide a science-based risk analysis of those activity/food... Food, Drug, and Cosmetic Act for hazard analysis and risk-based preventive controls (the proposed...
Design of a secure remote management module for a software-operated medical device.
Burnik, Urban; Dobravec, Štefan; Meža, Marko
2017-12-09
Software-based medical devices need to be maintained throughout their entire life cycle. The efficiency of after-sales maintenance can be improved by managing medical systems remotely. This paper presents how to design remote access function extensions in order to prevent the risks posed by uncontrolled remote access. A thorough analysis of standards and legislation requirements regarding safe operation and risk management of medical devices is presented. Based on the formal requirements, a multi-layer machine design solution is proposed that eliminates remote connectivity risks by strictly separating regular device functionalities from the remote management service, deploying encrypted communication links, and using digital signatures to prevent mishandling of software images. The proposed system may also be used to deliver efficient version updates of existing medical device designs.
Cyber security risk assessment for SCADA and DCS networks.
Ralston, P A S; Graham, J H; Hieb, J L
2007-10-01
The growing dependence of critical infrastructures and industrial automation on interconnected physical and cyber-based control systems has resulted in a growing and previously unforeseen cyber security threat to supervisory control and data acquisition (SCADA) and distributed control systems (DCSs). It is critical that engineers and managers understand these issues and know how to locate the information they need. This paper provides a broad overview of cyber security and risk assessment for SCADA and DCS, introduces the main industry organizations and government groups working in this area, and gives a comprehensive review of the literature to date. Major concepts related to the risk assessment methods are introduced with references cited for more detail. Included are risk assessment methods such as HHM, IIM, and RFRM, which have been applied successfully to SCADA systems with many interdependencies and have highlighted the need for quantifiable metrics. Presented in broad terms is probabilistic risk analysis (PRA), which includes methods such as FTA, ETA, and FMEA. The paper concludes with a general discussion of two recent methods (one based on compromise graphs and one on augmented vulnerability trees) that quantitatively determine the probability of an attack, the impact of the attack, and the reduction in risk associated with a particular countermeasure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Hirt, Evelyn H.; Veeramany, Arun
This research report summarizes the development and evaluation of a prototypic enhanced risk monitor (ERM) methodology (framework) that includes alternative risk metrics and uncertainty analysis. This updated ERM methodology accounts for uncertainty in the equipment condition assessment (ECA), the prognostic result, and the probabilistic risk assessment (PRA) model. It is anticipated that the ability to characterize uncertainty in the estimated risk and update the risk estimates in real time based on ECA will provide a mechanism for optimizing plant performance while staying within specified safety margins. These results (based on impacting active component O&M using real-time equipment condition information) are a step towards ERMs that, if integrated with AR supervisory plant control systems, can help control O&M costs and improve affordability of advanced reactors.
Stein, Deborah M; Kufera, Joseph A; Ho, Shiu M; Ryb, Gabriel E; Dischinger, Patricia C; O'Connor, James V; Scalea, Thomas M
2011-02-01
Motor vehicle collisions (MVCs) are the leading cause of spine and spinal cord injuries in the United States. Traumatic cervical spine injuries (CSIs) result in significant morbidity and mortality. This study was designed to evaluate both the epidemiologic and biomechanical risk factors associated with CSI in MVCs by using a population-based database and to describe occupant and crash characteristics for a subset of severe crashes in which a CSI was sustained as represented by the Crash Injury Research Engineering Network (CIREN) database. Prospectively collected CIREN data from the eight centers were used to identify all case occupants between 1996 and November 2009. Case occupants older than 14 years and case vehicles of the four most common vehicle types were included. The National Automotive Sampling System's Crashworthiness Data System, a probability sample of all police-reported MVCs in the United States, was queried using the same inclusion criteria between 1997 and 2008. Cervical spinal cord and spinal column injuries were identified using Abbreviated Injury Scale (AIS) score codes. Data were abstracted on all case occupants, biomechanical crash characteristics, and injuries sustained. Univariate analysis was performed using a χ² analysis. Logistic regression was used to identify significant risk factors in a multivariate analysis to control for confounding associations. CSIs were identified in 11.5% of CIREN case occupants. Case occupants aged 65 years or older and those occupants involved in rollover crashes were more likely to sustain a CSI. In univariate analysis of the subset of severe crashes represented by CIREN, the use of airbag and seat belt together (reference) was more protective than seat belt alone (odds ratio [OR]=1.73, 95% confidence interval [CI]=1.32-2.27) or the use of neither restraint system (OR=1.45, 95% CI=1.02-2.07). 
The most frequent injury sources in CIREN crashes were roof and its components (24.8%) and noncontact sources (15.5%). In multivariate analysis, age, rollover impact, and airbag-only restraint systems were associated with an increased odds of CSI. Using the population-based National Automotive Sampling System's Crashworthiness Data System data, 0.35% of occupants sustained a CSI. In univariate analysis, older age was noted to be a significant risk factor for CSI. Airbag-only restraint systems and both rollover and lateral crashes were also identified as risk factors for CSI. In addition, increasing delta v was highly associated with CSIs. In multivariate analysis, similar risk factors were noted. Of all the restraint systems, seat belt use without airbag deployment was found to be the most protective restraint system (OR=0.29, 95% CI=0.16-0.50), whereas airbag-only restraint was associated with the highest risk of CSI (OR=3.54, 95% CI=2.29-5.46). Despite advances in automotive safety, CSIs sustained in MVC continue to occur too often. Older case occupants are at an increased risk of CSI. Rollover crashes and severe crashes led to a much higher risk of CSI than other types and severity of MVCs. Seat belt use is very effective in preventing CSI, whereas airbag deployment may increase the risk of occupants sustaining a CSI. More protection for older occupants is needed and protection in both rollover and lateral crashes should remain a focus of the automotive industry. The design of airbag restraint systems should be evaluated so that they are not causative of serious injury. In addition, engineers should continue to focus on improving automotive design to minimize the risk of spinal injury to occupants in high severity crashes.
A theoretical treatment of technical risk in modern propulsion system design
NASA Astrophysics Data System (ADS)
Roth, Bryce Alexander
2000-09-01
A prevalent trend in modern aerospace systems is increasing complexity and cost, which in turn drives increased risk. Consequently, there is a clear and present need for the development of formalized methods to analyze the impact of risk on the design of aerospace vehicles. The objective of this work is to develop such a method that enables analysis of risk via a consistent, comprehensive treatment of aerothermodynamic and mass properties aspects of vehicle design. The key elements enabling the creation of this methodology are recent developments in the analytical estimation of work potential based on the second law of thermodynamics. This dissertation develops the theoretical foundation of a vehicle analysis method based on work potential and validates it using the Northrop F-5E with GE J85-GE-21 engines as a case study. Although the method is broadly applicable, emphasis is given to aircraft propulsion applications. Three work potential figures of merit are applied using this method: exergy, available energy, and thrust work potential. It is shown that each possesses unique properties making them useful for specific vehicle analysis tasks, though the latter two are actually special cases of exergy. All three are demonstrated on the analysis of the J85-GE-21 propulsion system, resulting in a comprehensive description of propulsion system thermodynamic loss. This "loss management" method is used to analyze aerodynamic drag loss of the F-5E and is then used in conjunction with the propulsive loss model to analyze the usage of fuel work potential throughout the F-5E design mission. The results clearly show how and where work potential is used during flight and yield considerable insight as to where the greatest opportunity for design improvement is. Next, usage of work potential is translated into fuel weight so that the aerothermodynamic performance of the F-5E can be expressed entirely in terms of vehicle gross weight. 
This technique is then applied as a means to quantify the impact of engine cycle technologies on the F-5E airframe. Finally, loss management methods are used in conjunction with probabilistic analysis methods to quantify the impact of risk on F-5E aerothermodynamic performance.
Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A
2017-01-01
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program towards floods with various recurrences, the ICRM uses a stochastic optimization procedure, which relies on quantile-related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.
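The contrast between the traditional average-annual-loss premium and a quantile-oriented (solvency-driven) funding level can be shown with a toy simulation; the heavy-tailed loss model, parameters, and percentile choice below are assumptions for illustration, not the article's ICRM.

```python
import random
import statistics

random.seed(0)  # reproducible toy example

def simulate_annual_loss():
    """Heavy-tailed toy loss model: most years no flood, rare years a large loss."""
    if random.random() < 0.05:          # a damaging flood occurs this year
        return random.paretovariate(1.5) * 100.0
    return 0.0

losses = sorted(simulate_annual_loss() for _ in range(10_000))

# Traditional premium: the average annual loss (AAL).
aal_premium = statistics.mean(losses)

# Quantile-oriented funding level: cover losses up to the 99th percentile,
# so the pool stays solvent in rare years that the average cannot cover.
quantile_premium = losses[int(0.99 * len(losses))]

print(aal_premium, quantile_premium)
```

The quantile level sits far above the mean precisely because the loss distribution is dominated by rare catastrophic years, which is the article's motivation for moving beyond the AAL approach.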
Options and Risk for Qualification of Electric Propulsion System
NASA Technical Reports Server (NTRS)
Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)
2002-01-01
Electric propulsion vehicle systems envelop a wide range of propulsion alternatives including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification the system level risk implications will be developed. Also, the paper will explore the implications of analysis versus test for various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring of a verification program based on cost, risk and value return. A successful verification program should establish controls and define objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors, which may impact options for system verification.
Risk analysis based CWR track buckling safety evaluations
DOT National Transportation Integrated Search
2001-01-01
As part of the Federal Railroad Administration's (FRA) track systems research program, the US DOT's Volpe Center is conducting analytic and experimental investigations to evaluate track lateral strength and stability limits for improved safety an...
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang
2015-01-01
PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of loss of mission (LOM), loss of vehicle (LOV), and loss of crew (LOC) for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict overall launch vehicle risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA began a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. 
the metric) is the probability that an item will perform its intended function(s) for a specified mission profile. In general, the reliability metric can be calculated through the analyses using reliability demonstration and reliability prediction methodologies. Reliability analysis is very critical for understanding component failure mechanisms and in identifying reliability critical design and process drivers. The following sections discuss the PRA process and reliability engineering in detail and provide an application where reliability analysis and PRA were jointly used in a complementary manner to support a Space Shuttle flight risk assessment.
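A minimal sketch of how the reliability metric and PRA-style fault-tree logic complement each other, assuming a constant-failure-rate (exponential) model and independent failures; the rates, mission time, and gate structure are hypothetical.

```python
import math

def reliability(failure_rate, mission_time):
    """Probability the item performs its function over the mission,
    assuming a constant failure rate (exponential model)."""
    return math.exp(-failure_rate * mission_time)

def and_gate(*fail_probs):
    """Top event requires all inputs to fail (independent failures)."""
    p = 1.0
    for q in fail_probs:
        p *= q
    return p

def or_gate(*fail_probs):
    """Top event occurs if any input fails (independent failures)."""
    p_none = 1.0
    for q in fail_probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Hypothetical numbers: two redundant engines plus one single-point failure.
q_engine = 1.0 - reliability(1e-4, 500.0)   # one engine fails during the mission
q_both_engines = and_gate(q_engine, q_engine)
q_loss = or_gate(q_both_engines, 1e-5)      # loss-of-mission probability
print(q_loss)
```

Component-level reliability analysis supplies the failure probabilities; the PRA fault/event-tree structure combines them into a scenario-level risk figure, which is the complementary relationship the abstract describes.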
Plioutsias, Anastasios; Karanikas, Nektarios; Chatzimihailidou, Maria Mikela
2018-03-01
Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights. © 2017 Society for Risk Analysis.
Long, Haiming; Zhang, Ji; Tang, Nengyu
2017-01-01
This study considers the effect of an industry's network topology on its systemic risk contribution to the stock market using data from the CSI 300 two-tier industry indices from the Chinese stock market. We first measure each industry's conditional value-at-risk (CoVaR) and systemic risk contribution (ΔCoVaR) using a fitted time-varying t-copula function. The network of the stock industries is established based on dynamic conditional correlations with the minimum spanning tree. Then, we investigate the connection characteristics and topology of the network. Finally, we utilize seemingly unrelated regression (SUR) estimation of panel data to analyze the relationship between the network topology of the stock industries and an industry's systemic risk contribution. The results show that the systemic risk contribution of small-scale industries, such as real estate, food and beverage, software services, and durable goods and clothing, is higher than that of large-scale industries, such as banking, insurance and energy. Industries with large betweenness centrality, closeness centrality, and clustering coefficient and a small node occupancy layer are associated with greater systemic risk contribution. In addition, further analysis using a threshold model confirms that the results are robust.
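An empirical illustration of the CoVaR and ΔCoVaR quantities on simulated returns (not the article's time-varying t-copula estimation); the factor loading, quantile level, and benchmark window are assumptions.

```python
import random

random.seed(1)  # reproducible toy data

# Simulated daily returns: part of the system return loads on the industry.
n = 20_000
industry = [random.gauss(0.0, 1.0) for _ in range(n)]
system = [0.6 * x + 0.8 * random.gauss(0.0, 1.0) for x in industry]

def var(sample, q=0.05):
    """Value-at-risk as the q-quantile of a return sample (a loss threshold)."""
    return sorted(sample)[int(q * len(sample))]

industry_var = var(industry)

# CoVaR: system VaR conditional on the industry being in distress
# (at or below its own VaR); benchmark state: industry near its median.
distress = [s for s, x in zip(system, industry) if x <= industry_var]
benchmark = [s for s, x in zip(system, industry) if abs(x) < 0.1]
delta_covar = var(benchmark) - var(distress)   # systemic risk contribution

print(round(delta_covar, 3))
```

A positive ΔCoVaR indicates that distress in the industry pulls the system's tail further down than its normal state does, which is the contribution measure ranked across industries in the study.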
[Development and clinical evaluation of an anesthesia information management system].
Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei
2010-09-21
To study the design, implementation and clinical evaluation of an anesthesia information management system. To record, process and store peri-operative patient data automatically, all kinds of bedside monitoring equipment are connected into the system based on information-integration technology; after a statistical analysis of the patient data by data-mining technology, patient status can be evaluated automatically based on a risk prediction standard and a decision support system, so that the anesthetist can perform reasonable and safe clinical processes; with clinical processes electronically recorded, standard record tables can be generated and the clinical workflow is optimized as well. With the system, various patient data can be collected, stored, analyzed and archived, various anesthesia documents can be generated, and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.
Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)
NASA Technical Reports Server (NTRS)
Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis
2011-01-01
Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. 
That was, however, not to happen. Early in the Apollo program, estimates of the probability for a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values and NASA became discouraged from further performing quantitative risk analyses until some two decades later when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.
[Economic effects of integrated RIS-PACS solution in the university environment].
Kröger, M; Nissen-Meyer, S; Wetekam, V; Reiser, M
1999-04-01
The goal of the current article is to demonstrate how qualitative and monetary effects resulting from an integrated RIS/PACS installation can be evaluated. First of all, the system concept of a RIS/PACS solution for a university hospital is defined and described. Based on this example, a generic method for the evaluation of qualitative and monetary effects as well as associated risks is depicted and demonstrated. To this end, qualitative analyses, investment calculations and risk analysis are employed. The sample analysis of a RIS/PACS solution specially designed for a university hospital demonstrates positive qualitative and monetary effects of the system. Under ideal conditions the payoff time of the investments is reached after 4 years of an assumed 8 years effective life of the system. Furthermore, under conservative assumptions, the risk analysis shows a probability of 0% for realising a negative net present value at the end of the payoff time period. It should be pointed out that the positive result of this sample analysis will not necessarily apply to other clinics or hospitals. However, the same methods may be used for the individual evaluation of the qualitative and monetary effects of a RIS/PACS installation in any clinic.
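The investment-calculation step of the evaluation method can be illustrated with a toy net-present-value computation; the cash flows and discount rate below are hypothetical, chosen so that payoff occurs around year 4 of an 8-year effective life, as in the sample analysis.

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the initial outlay (negative)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical RIS/PACS case: initial investment, then equal annual net
# savings over an assumed 8-year effective life.
flows = [-1_000_000] + [300_000] * 8

# Undiscounted payoff is reached in year 4 (4 x 300,000 > 1,000,000);
# the NPV at a 5% discount rate shows the investment stays positive.
print(round(npv(0.05, flows)))
```

The risk-analysis step then amounts to repeating this calculation over distributions of the uncertain cash flows and reporting the probability of a negative NPV.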
Novel risk score of contrast-induced nephropathy after percutaneous coronary intervention.
Ji, Ling; Su, XiaoFeng; Qin, Wei; Mi, XuHua; Liu, Fei; Tang, XiaoHong; Li, Zi; Yang, LiChuan
2015-08-01
Contrast-induced nephropathy (CIN) post-percutaneous coronary intervention (PCI) is a major cause of acute kidney injury. In this study, we established a comprehensive risk score model to assess the risk of CIN after the PCI procedure, which could be easily used in a clinical environment. A total of 805 PCI patients, divided into an analysis cohort (70%) and a validation cohort (30%), were enrolled retrospectively in this study. Risk factors for CIN were identified using univariate analysis and multivariate logistic regression in the analysis cohort. The risk score model was developed based on multiple regression coefficients. The sensitivity and specificity of the new risk score system were validated in the validation cohort. Comparisons between the new risk score model and previously reported models were performed. The incidence of post-PCI CIN in the analysis cohort (n = 565) was 12%. A considerably high CIN incidence (50%) was observed in patients with chronic kidney disease (CKD). Age >75, body mass index (BMI) >25, myoglobin level, cardiac function level, hypoalbuminaemia, history of CKD, intra-aortic balloon pump (IABP) use and peripheral vascular disease (PVD) were identified as independent risk factors of post-PCI CIN. A novel risk score model was established using multivariate regression coefficients, which showed the highest sensitivity and specificity (0.917, 95% CI 0.877-0.957) compared with previous models. A new post-PCI CIN risk score model was developed based on a retrospective study of 805 patients. Application of this model might be helpful to predict CIN in patients undergoing the PCI procedure. © 2015 Asian Pacific Society of Nephrology.
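The general recipe of turning multivariate logistic-regression coefficients into an integer bedside score can be sketched as follows; the coefficients, point unit, and intercept are hypothetical, not the paper's fitted values.

```python
import math

# Hypothetical log-odds coefficients for a subset of the identified factors.
betas = {
    "age_over_75": 0.9,
    "bmi_over_25": 0.5,
    "ckd_history": 1.6,
    "hypoalbuminaemia": 0.7,
}

UNIT = 0.5  # coefficient magnitude worth one point

# Round each coefficient to integer points relative to the unit.
score_table = {factor: round(b / UNIT) for factor, b in betas.items()}

def risk_score(patient):
    """Integer score: sum the points of the factors present."""
    return sum(score_table[f] for f, present in patient.items() if present)

def predicted_probability(patient, intercept=-3.0):
    """Map the score back to an approximate probability via the logistic curve."""
    logit = intercept + UNIT * risk_score(patient)
    return 1.0 / (1.0 + math.exp(-logit))

patient = {"age_over_75": True, "bmi_over_25": False,
           "ckd_history": True, "hypoalbuminaemia": False}
print(risk_score(patient), round(predicted_probability(patient), 3))
```

Rounding to points is what makes the model usable at the bedside; sensitivity and specificity are then assessed by thresholding the score in the held-out validation cohort.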
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
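The core framing, fire risk as the sum over event-tree scenarios of occurrence probability times consequence, can be sketched as follows; the protection-system probabilities and consequence values are illustrative, and the time dependence and uncertainty distributions of the article are omitted.

```python
# Event tree over the states of two fire-protection systems; each end state
# is a fire scenario with an occurrence probability and a consequence.
p_sprinkler = 0.95   # probability the sprinkler system works
p_alarm = 0.90       # probability the alarm system works

# (scenario probability, consequence as expected fatalities) - illustrative.
scenarios = [
    (p_sprinkler * p_alarm,               0.0001),  # both systems work
    (p_sprinkler * (1 - p_alarm),         0.001),   # alarm fails
    ((1 - p_sprinkler) * p_alarm,         0.01),    # sprinkler fails
    ((1 - p_sprinkler) * (1 - p_alarm),   0.1),     # both fail
]

# Fire risk to life safety: sum over scenarios of probability x consequence.
fire_risk = sum(p * c for p, c in scenarios)
print(fire_risk)
```

In the article, each consequence term is itself derived from the distributions of evacuation time and onset time of untenable conditions, and the scenario probabilities evolve over time via the Markov chain.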
Irestig, Magnus; Timpka, Toomas
2010-02-01
We set out to examine design conflict resolution tactics used in the development of large information systems for health services and to outline the design consequences of these tactics. Discourse analysis methods were applied to data collected from meetings conducted during the development of a web-based system in a public health context. We found that low-risk tactics were characterized by design issues being managed within the formal mandate and competences of the design group. In comparison, high-risk tactics were associated with irresponsible compromises, i.e. decisions being passed on to others or to later phases of the design process. The consequence of this collective disregard of issues such as responsibility and legitimacy is that the system design will be impossible to implement in actual health service contexts. The results imply that downstream responsibility issues have to be continuously dealt with in system development in health services.
Risk Assessment for Mobile Systems Through a Multilayered Hierarchical Bayesian Network.
Li, Shancang; Tryfonas, Theo; Russell, Gordon; Andriotis, Panagiotis
2016-08-01
Mobile systems are facing a number of application vulnerabilities that can be combined and exploited to penetrate systems with devastating impact. When assessing the overall security of a mobile system, it is important to assess the security risks posed by each mobile application (app), thus gaining a stronger understanding of any vulnerabilities present. This paper aims at developing a three-layer framework that assesses the potential risks that apps introduce within Android mobile systems. A Bayesian risk graphical model is proposed to evaluate risk propagation in a layered risk architecture. By integrating static analysis, dynamic analysis, and behavior analysis in a hierarchical framework, the risks and their propagation through each layer are well modeled by the Bayesian risk graph, which can quantitatively analyze the risks faced by both apps and mobile systems. The proposed hierarchical Bayesian risk graph model offers a novel way to investigate security risks in the mobile environment and enables users and administrators to evaluate the potential risks. This strategy allows strengthening both app security and the security of the entire system.
Model Based Mission Assurance: Emerging Opportunities for Robotic Systems
NASA Technical Reports Server (NTRS)
Evans, John W.; DiVenti, Tony
2016-01-01
The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).
Ecological covariates based predictive model of malaria risk in the state of Chhattisgarh, India.
Kumar, Rajesh; Dash, Chinmaya; Rani, Khushbu
2017-09-01
Malaria being an endemic disease in the state of Chhattisgarh and an ecologically dependent mosquito-borne disease, this study was intended to identify the ecological covariates of malaria risk in the districts of the state and to build a suitable predictive model based on those predictors, which could assist in developing a weather-based early warning system. This secondary-data-based analysis used one-month-lagged district-level malaria-positive cases as the response variable and ecological covariates as independent variables, which were tested with fixed-effect panel negative binomial regression models. Interactions among the covariates were explored using two-way factorial interactions in the model. Although malaria risk in the state possesses perennial characteristics, higher parasitic incidence was observed during the rainy and winter seasons. The univariate analysis indicated that malaria incidence risk was statistically significantly associated with rainfall, maximum humidity, minimum temperature, wind speed, and forest cover (p < 0.05). The efficient predictive model includes forest cover [IRR-1.033 (1.024-1.042)], maximum humidity [IRR-1.016 (1.013-1.018)], and a two-way factorial interaction between the district-specific averaged monthly minimum temperature and the monthly minimum temperature; the monthly minimum temperature was statistically significant [IRR-1.44 (1.231-1.695)] whereas the interaction term had a protective effect [IRR-0.982 (0.974-0.990)] against malaria infections. Forest cover, maximum humidity, minimum temperature and wind speed emerged as potential covariates to be used in predictive models for modelling malaria risk in the state, which could be efficiently used for early warning systems in the state.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed in design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
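The core PFA idea, propagating parameter and modeling uncertainty through an engineering analysis model to a failure probability, can be sketched by Monte Carlo. The Basquin-type fatigue model and every numerical value below are illustrative assumptions, not figures from the PFA reports:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # Monte Carlo samples

# Hypothetical fatigue life model N_f = A * S**(-b) (the "engineering
# analysis model"). Uncertainty in the coefficient A (material scatter)
# and in the stress amplitude S is sampled and propagated.
A = rng.lognormal(mean=np.log(1e15), sigma=0.5, size=N)
S = rng.normal(300.0, 20.0, size=N)   # stress amplitude (MPa), assumed
b = 4.0                               # fatigue exponent, assumed known

life = A * S ** (-b)                  # cycles to failure per sampled set
mission = 50_000.0                    # required mission life (cycles)
p_fail = np.mean(life < mission)      # estimated failure probability
print(f"estimated failure probability: {p_fail:.4f}")
```

In the full methodology this prior distribution would then be updated with test and flight experience; this sketch stops at the uncertainty-propagation step.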
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the Space Shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown, and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from the wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
Chughtai, Abrar Ahmad; MacIntyre, C. Raina
2017-01-01
Abstract The 2014 Ebola virus disease (EVD) outbreak affected several countries worldwide, including six West African countries. It was the largest Ebola epidemic in history and the first to affect multiple countries simultaneously. Significant national and international delays in response to the epidemic resulted in 28,652 cases and 11,325 deaths. The aim of this study was to develop a risk analysis framework to prioritize rapid response for situations of high risk. Based on findings from the literature, sociodemographic features of the affected countries, and documented epidemic data, a risk scoring framework using 18 criteria was developed. The framework includes measures of socioeconomics, health systems, geographical factors, cultural beliefs, and traditional practices. The three worst-affected West African countries (Guinea, Sierra Leone, and Liberia) had the highest risk scores. The scores were much lower in developed countries that experienced Ebola than in the West African countries. A more complex risk analysis framework using 18 measures was compared with a simpler one with 10 measures, and both predicted risk equally well. A simple risk scoring system can incorporate measures of hazard and impact that may otherwise be neglected in prioritizing outbreak response. This framework can be used by public health personnel as a tool to prioritize outbreak investigation and flag outbreaks with potentially catastrophic outcomes for urgent response. Such a tool could mitigate costly delays in epidemic response. PMID:28810081
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christie, R.F.; Stetkar, J.W.
1985-01-01
The change in availability of the high-pressure coolant injection system (HPCIS) due to a change in pump and valve test interval from monthly to quarterly was analyzed. This analysis started from the HPCIS baseline evaluation produced as part of the Browns Ferry Nuclear Plant (BFN) Probabilistic Risk Assessment (PRA). The baseline evaluation showed that the dominant contributors to the unavailability of the HPCIS are hardware failures and the resultant downtime for unscheduled maintenance.
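The sensitivity of standby-system unavailability to the test interval can be illustrated with the standard PRA approximation q ≈ λT/2 for the standby-failure contribution. The failure rate below is an assumed illustrative value, not the Browns Ferry HPCIS figure:

```python
# Standby-failure contribution to unavailability under periodic testing:
# q ≈ lambda * T / 2, where T is the test interval. Illustrative numbers.
lam = 1.0e-5                        # standby failure rate per hour (assumed)
q_monthly = lam * 730.0 / 2.0       # ~730 hours per month
q_quarterly = lam * 2190.0 / 2.0    # ~2190 hours per quarter
print(q_quarterly / q_monthly)      # tripling the interval triples this term
```

Note this captures only the undetected-standby-failure term; as the baseline evaluation found, hardware failures and unscheduled maintenance downtime can dominate the total unavailability regardless of test interval.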
Karmakar, Chandan; Jelinek, Herbert; Khandoker, Ahsan; Tulppo, Mikko; Makikallio, Timo; Kiviniemi, Antti; Huikuri, Heikki; Palaniswami, Marimuthu
2012-01-01
Diabetes mellitus is associated with multi-organ system dysfunction. One of the key causative factors is the increased blood sugar level, which leads to an increase in free radical activity and organ damage, including to the cardiovascular and nervous systems. Heart rhythm is extrinsically modulated by the autonomic nervous system, and cardiac autonomic neuropathy or dysautonomia has been shown to lead to sudden cardiac death in people with diabetes due to the decrease in heart rate variability (HRV). Current algorithms for determining HRV describe only beat-to-beat variation and therefore do not consider the ability of a heart beat to influence a train of succeeding beats. As a result, mortality risk analysis based on HRV has often been unable to discern the presence of an increased risk. This study used a novel innovation of the tone-entropy algorithm, incorporating increased lag intervals, and found that both the sympatho-vagal balance and the total activity changed at larger lag intervals. Tone-entropy was found to be a better identifier of cardiac mortality risk in people with diabetes at lags higher than one, and best at lag seven.
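The lagged tone-entropy idea can be sketched as follows. The percentage-index definition follows the published tone-entropy algorithm, where tone is its mean and entropy is the Shannon entropy of its distribution; the histogram bin count and the synthetic RR series are assumptions for illustration:

```python
import numpy as np

def tone_entropy(rr, lag=1, bins=50):
    """Tone and entropy of an RR-interval series at a given lag.

    Sketch of the lagged variant described in the abstract: the percentage
    index compares each beat with the beat `lag` positions ahead.
    """
    rr = np.asarray(rr, dtype=float)
    pi = 100.0 * (rr[:-lag] - rr[lag:]) / rr[:-lag]   # percentage index
    tone = pi.mean()                                  # sympatho-vagal balance
    counts, _ = np.histogram(pi, bins=bins)
    p = counts[counts > 0] / counts.sum()
    entropy = -np.sum(p * np.log2(p))                 # total activity (bits)
    return tone, entropy

# Synthetic RR series in seconds; real use would take ECG-derived intervals.
rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * np.sin(np.arange(500) / 10.0) + rng.normal(0.0, 0.01, 500)
for lag in (1, 7):
    print(lag, tone_entropy(rr, lag=lag))
```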
Aoki, Tomonori; Nagata, Naoyoshi; Shimbo, Takuro; Niikura, Ryota; Sakurai, Toshiyuki; Moriyasu, Shiori; Okubo, Hidetaka; Sekine, Katsunori; Watanabe, Kazuhiro; Yokoi, Chizu; Yanase, Mikio; Akiyama, Junichi; Mizokami, Masashi; Uemura, Naomi
2016-11-01
We aimed to develop and validate a risk scoring system to determine the risk of severe lower gastrointestinal bleeding (LGIB) and predict patient outcomes. We first performed a retrospective analysis of data from 439 patients emergently hospitalized for acute LGIB at the National Center for Global Health and Medicine in Japan, from January 2009 through December 2013. We used data on comorbidities, medications, presenting symptoms, vital signs, and laboratory test results to develop a scoring system for severe LGIB (defined as continuous and/or recurrent bleeding). We validated the risk score in a prospective study of 161 patients with acute LGIB admitted to the same center from April 2014 through April 2015. We assessed the system's accuracy in predicting patient outcome using area under the receiver operating characteristic curve (AUC) analysis. All patients underwent colonoscopy. In the first study, 29% of the patients developed severe LGIB. We devised a risk scoring system based on nonsteroidal anti-inflammatory drug use, no diarrhea, no abdominal tenderness, blood pressure of 100 mm Hg or lower, antiplatelet drug use, albumin level less than 3.0 g/dL, disease scores of 2 or higher, and syncope (NOBLADS), all of which were independent correlates of severe LGIB. Severe LGIB developed in 75.7% of patients with scores of 5 or higher compared with 2% of patients without any of the factors correlated with severe LGIB (P < .001). The NOBLADS score determined the severity of LGIB with an AUC value of 0.77. In the validation (second) study, severe LGIB developed in 35% of patients; the NOBLADS score predicted the severity of LGIB with an AUC value of 0.76. Higher NOBLADS scores were associated with a requirement for blood transfusion, longer hospital stay, and intervention (P < .05 for trend). We developed and validated a scoring system for risk of severe LGIB based on 8 factors (NOBLADS score).
The system also determined the risk for blood transfusion, longer hospital stay, and intervention. It might be used in decision making regarding intervention and management. Copyright © 2016 AGA Institute. Published by Elsevier Inc. All rights reserved.
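Since the abstract describes NOBLADS as a count of eight independent factors, the score can be sketched as a simple sum of binary indicators. Equal weighting is implied by the reported score cut-offs; this is a sketch for illustration, not the validated clinical tool:

```python
def noblads_score(nsaid_use, no_diarrhea, no_abdominal_tenderness,
                  sbp_100_or_lower, antiplatelet_use, albumin_below_3,
                  disease_score_2_or_higher, syncope):
    """Eight-factor NOBLADS score: one point per factor present."""
    return sum(map(bool, (nsaid_use, no_diarrhea, no_abdominal_tenderness,
                          sbp_100_or_lower, antiplatelet_use, albumin_below_3,
                          disease_score_2_or_higher, syncope)))

# A hypothetical patient with four of the eight factors present:
score = noblads_score(True, True, False, True, False, True, False, False)
print(score)  # → 4
```

Per the abstract, scores of 5 or higher identified the group in which 75.7% developed severe LGIB.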
Chang, Jee Suk; Kim, Kyung Hwan; Keum, Ki Chang; Noh, Sung Hoon; Lim, Joon Seok; Kim, Hyo Song; Rha, Sun Young; Lee, Yong Chan; Hyung, Woo Jin; Koom, Woong Sub
2016-12-01
To classify patients with nonmetastatic advanced gastric cancer who underwent D2-gastrectomy into prognostic groups based on peritoneal and systemic recurrence risks. Between 2004 and 2007, 1,090 patients with T3-4 or N+ gastric cancer were identified from our registry. Recurrence rates were estimated using a competing-risk analysis. Different prognostic groups were defined using recursive partitioning analysis (RPA). Median follow-up was 7 years. In the RPA model for peritoneal recurrence risk, the initial node was split by T stage, indicating that the differences between patients with T1-3 and T4 cancer were the greatest. The 5-year peritoneal recurrence rates for patients with T4 (n = 627) and T1-3 (n = 463) disease were 34.3% and 9.1%, respectively. N stage and neural invasion had an additive impact in high-risk patients. The RPA model for systemic relapse incorporated N stage alone and gave two terminal nodes: N0-2 (n = 721) and N3 (n = 369). The 5-year cumulative incidences were 7.7% and 24.5%, respectively. We propose risk stratification models of peritoneal and systemic recurrence in patients undergoing D2-gastrectomy. This classification could be used for stratification protocols in future studies evaluating adjuvant therapies such as preoperative chemoradiotherapy. J. Surg. Oncol. 2016;114:859-864. © 2016 Wiley Periodicals, Inc.
FlySec: a risk-based airport security management system based on security as a service concept
NASA Astrophysics Data System (ADS)
Kyriazanos, Dimitris M.; Segou, Olga E.; Zalonis, Andreas; Thomopoulos, Stelios C. A.
2016-05-01
Complementing the ACI/IATA efforts, the FLYSEC European H2020 Research and Innovation project (http://www.fly-sec.eu/) aims to develop and demonstrate an innovative, integrated, end-to-end airport security process for passengers, enabling a guided and streamlined procedure from landside to airside and into the boarding gates, and offering an operationally validated innovative concept for end-to-end aviation security. Through a well-structured work plan, FLYSEC's ambition translates into: (i) innovative processes facilitating risk-based screening; (ii) deployment and integration of new technologies and repurposing of existing solutions towards a risk-based security paradigm shift; (iii) improvement of passenger facilitation and customer service, bringing security as a real service in the airport of tomorrow; (iv) achievement of measurable throughput improvement and a whole new level of quality of service; and (v) validation of the results through advanced "in-vitro" simulation and "in-vivo" pilots. On the technical side, FLYSEC achieves its ambitious goals by integrating new technologies for video surveillance, intelligent remote image processing, and biometrics, combined with big data analysis, open-source intelligence, and crowdsourcing. Repurposing existing technologies is also among the FLYSEC objectives, such as mobile application technologies for improved passenger experience and positive boarding applications (i.e., services to facilitate boarding and landside/airside wayfinding), as well as RFID for carry-on luggage tracking and quick unattended luggage handling. In this paper, the authors describe the risk-based airport security management system which powers FLYSEC's intelligence and serves as the backend on top of which FLYSEC's front-end technologies reside for security services management, behaviour analysis, and risk analysis.
Stärk, Katharina DC; Regula, Gertraud; Hernandez, Jorge; Knopf, Lea; Fuchs, Klemens; Morris, Roger S; Davies, Peter
2006-01-01
Background Emerging animal and zoonotic diseases and increasing international trade have resulted in an increased demand for veterinary surveillance systems. However, the human and financial resources available to support government veterinary services are becoming more and more limited in many countries worldwide. Intuitively, issues that present higher risks merit higher priority for surveillance resources, as investments will yield higher benefit-cost ratios. The rapid rate of acceptance of this core concept of risk-based surveillance has outpaced the development of its theoretical and practical bases. Discussion The principal objectives of risk-based veterinary surveillance are to identify surveillance needs to protect the health of livestock and consumers, to set priorities, and to allocate resources effectively and efficiently. An important goal is to achieve a higher benefit-cost ratio with existing or reduced resources. We propose to define risk-based surveillance systems as those that apply risk assessment methods in different steps of traditional surveillance design for early detection and management of diseases or hazards. In risk-based designs, the public health, economic, and trade consequences of diseases play an important role in the selection of diseases or hazards. Furthermore, certain strata of the population of interest have a higher probability of being sampled for detection of diseases or hazards. Evaluation of risk-based surveillance systems should demonstrate that the efficacy of risk-based systems is equal to or higher than that of traditional systems; the efficiency (benefit-cost ratio), however, should be higher in risk-based surveillance systems. Summary Risk-based surveillance considerations are useful to support both strategic and operational decision making. This article highlights applications of risk-based surveillance systems in the veterinary field, including food safety.
Examples are provided for risk-based hazard selection, risk-based selection of sampling strata as well as sample size calculation based on risk considerations. PMID:16507106
Risk analysis based CWR track buckling safety evaluations
DOT National Transportation Integrated Search
1999-12-01
As part of the Federal Railroad Administration's (FRA) track systems research program, the US DOT'S Volpe Center is conducting analytic and experimental investigations to evaluate track lateral strength and stability limits for improved safety and pe...
76 FR 76215 - Privacy Act; System of Records: State-78, Risk Analysis and Management Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-06
... network. Vetting requests, analyses, and results will be stored separately on a classified computer... DEPARTMENT OF STATE [Public Notice 7709] Privacy Act; System of Records: State-78, Risk Analysis... a system of records, Risk Analysis and Management Records, State-78, pursuant to the provisions of...
Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun
2018-09-01
Toxicity of heavy metals from industrialization poses critical concern, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently for the whole region and its sub-regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source, and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of the case study show that, for children (a sensitive population, with school and residential areas as their major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emissions, and agricultural activity. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards of source categories to human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
Impact of Domain Analysis on Reuse Methods
1989-11-06
return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increases the risks of a...importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or...are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality
Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez
2014-01-01
According to Codex Alimentarius Commission recommendations, management options applied at the production process level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment can be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of a primo-contamination event of the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for the improvement and validation of the model. What-if scenarios were simulated and allowed for the identification of the major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
A Tissue Systems Pathology Assay for High-Risk Barrett's Esophagus.
Critchley-Thorne, Rebecca J; Duits, Lucas C; Prichard, Jeffrey W; Davison, Jon M; Jobe, Blair A; Campbell, Bruce B; Zhang, Yi; Repa, Kathleen A; Reese, Lia M; Li, Jinhong; Diehl, David L; Jhala, Nirag C; Ginsberg, Gregory; DeMarshall, Maureen; Foxwell, Tyler; Zaidi, Ali H; Lansing Taylor, D; Rustgi, Anil K; Bergman, Jacques J G H M; Falk, Gary W
2016-06-01
Better methods are needed to predict risk of progression for Barrett's esophagus. We aimed to determine whether a tissue systems pathology approach could predict progression in patients with nondysplastic Barrett's esophagus, indefinite for dysplasia, or low-grade dysplasia. We performed a nested case-control study to develop and validate a test that predicts progression of Barrett's esophagus to high-grade dysplasia (HGD) or esophageal adenocarcinoma (EAC), based upon quantification of epithelial and stromal variables in baseline biopsies. Data were collected from Barrett's esophagus patients at four institutions. Patients who progressed to HGD or EAC in ≥1 year (n = 79) were matched with patients who did not progress (n = 287). Biopsies were assigned randomly to training or validation sets. Immunofluorescence analyses were performed for 14 biomarkers, and quantitative biomarker and morphometric features were analyzed. Prognostic features were selected in the training set and combined into classifiers. The top-performing classifier was assessed in the validation set. A 3-tier, 15-feature classifier was selected in the training set and tested in the validation set. The classifier stratified patients into low-, intermediate-, and high-risk classes [HR, 9.42; 95% confidence interval, 4.6-19.24 (high-risk vs. low-risk); P < 0.0001]. It also provided independent prognostic information that outperformed predictions based on pathology analysis, segment length, age, sex, or p53 overexpression. We developed a tissue systems pathology test that better predicts risk of progression in Barrett's esophagus than clinicopathologic variables. The test has the potential to improve upon histologic analysis as an objective method to risk stratify Barrett's esophagus patients. Cancer Epidemiol Biomarkers Prev; 25(6); 958-68. ©2016 American Association for Cancer Research.
Araki, Tadashi; Jain, Pankaj K; Suri, Harman S; Londhe, Narendra D; Ikeda, Nobutaka; El-Baz, Ayman; Shrivastava, Vimal K; Saba, Luca; Nicolaides, Andrew; Shafique, Shoaib; Laird, John R; Gupta, Ajay; Suri, Jasjit S
2017-01-01
Stroke risk stratification based on grayscale morphology of the ultrasound carotid wall has recently shown promise for classifying high-risk versus low-risk, or symptomatic versus asymptomatic, plaques. In previous studies, this stratification has mainly been based on analysis of the far wall of the carotid artery. Due to the multifocal nature of atherosclerotic disease, plaque growth is not restricted to the far wall alone. This paper presents a new approach to stroke risk assessment that integrates assessment of both the near and far walls of the carotid artery using grayscale morphology of the plaque. Further, this paper presents a scientific validation system for stroke risk assessment. Neither of these innovations has been presented before. The methodology consists of an automated segmentation system for the near-wall and far-wall regions in grayscale carotid B-mode ultrasound scans. Sixteen grayscale texture features are computed and fed into the machine learning system. The training system utilizes the lumen diameter to create ground-truth labels for the stratification of stroke risk. A cross-validation procedure is adopted to obtain the machine learning testing classification accuracy under three partition protocols (5-fold, 10-fold, and jackknife). The mean classification accuracy over all partition protocols for the automated system in the far and near walls is 95.08% and 93.47%, respectively. The corresponding accuracies for the manual system are 94.06% and 92.02%, respectively. The precision of merit of the automated machine learning system, compared against the manual risk assessment system, is 98.05% and 97.53% for the far and near walls, respectively. The ROC of the risk assessment system for the far and near walls is close to 1.0, demonstrating high accuracy. Copyright © 2016 Elsevier Ltd. All rights reserved.
Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)
2002-01-01
When designing products, it is crucial to assure failure- and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented in the form of a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, represented in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes with the highest potential risks to the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method is explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
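The PCA step described above, decomposing a component-by-failure-mode matrix into a low-dimensional transformed coordinate system, can be sketched with an SVD on hypothetical count data (the matrix below stands in for the accident-report data and is randomly generated):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical component-by-failure-mode count matrix
# (rows: components, columns: failure modes).
F = rng.poisson(2.0, size=(12, 8)).astype(float)

# PCA via SVD of the column-centered matrix: each principal component is a
# linear combination of failure modes, yielding a low-dimensional
# transformed coordinate system for the components.
Fc = F - F.mean(axis=0)
U, s, Vt = np.linalg.svd(Fc, full_matrices=False)
scores = U * s                     # component positions in PC space
loadings = Vt                      # failure-mode weights per component
explained = s**2 / np.sum(s**2)    # variance fraction per component
print(explained.round(3))
```

Keeping only the leading components with the largest `explained` values gives the reduced representation used for pattern analysis across components and failure modes.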
Ontology based decision system for breast cancer diagnosis
NASA Astrophysics Data System (ADS)
Trabelsi Ben Ameur, Soumaya; Cloppet, Florence; Wendling, Laurent; Sellami, Dorra
2018-04-01
In this paper, we focus on the analysis and diagnosis of breast masses inspired by expert concepts and rules. Accordingly, a Bag of Words is built based on the ontology of breast cancer diagnosis, accurately described in the Breast Imaging Reporting and Data System. To fill the gap between low-level knowledge and expert concepts, a semantic annotation is developed using a machine learning tool. Then, breast masses are classified as benign or malignant according to expert rules implicitly modeled with a set of classifiers (KNN, ANN, SVM, and Decision Tree). This semantic context of analysis offers a frame in which we can include external factors and other meta-knowledge, such as patient risk factors, as well as exploit more than one modality. Based on MRI and DECEDM modalities, our developed system achieves a recognition rate of 99.7% with the Decision Tree, where an improvement of 24.7% is obtained owing to the semantic analysis.
A White Paper on Global Wheat Health Based on Scenario Development and Analysis.
Savary, S; Djurle, A; Yuen, J; Ficke, A; Rossi, V; Esker, P D; Fernandes, J M C; Del Ponte, E M; Kumar, J; Madden, L V; Paul, P; McRoberts, N; Singh, P K; Huber, L; Pope de Vallavielle, C; Saint-Jean, S; Willocquet, L
2017-10-01
Scenario analysis constitutes a useful approach to synthesizing knowledge and deriving hypotheses for complex systems that are documented with mainly qualitative or very diverse information. In this article, a framework for scenario analysis is designed and then applied to global wheat health within a timeframe from today to 2050. Scenario analysis entails the choice of settings, the definition of scenarios of change, and the analysis of the outcomes of these scenarios in the chosen settings. Three idealized agrosystems, representing a large fraction of the global diversity of wheat-based agrosystems, are considered; these represent the settings of the analysis. Several components of global change are considered in their consequences for global wheat health: climate change and climate variability, nitrogen fertilizer use, tillage, crop rotation, pesticide use, and the deployment of host plant resistances. Each idealized agrosystem is associated with a scenario of change that considers, first, a production situation and its dynamics and, second, the impacts of the evolving production situation on the evolution of crop health. Crop health is represented by six functional groups of wheat pathogens: the pathogens associated with Fusarium head blight, biotrophic fungi, Septoria-like fungi, necrotrophic fungi, soilborne pathogens, and insect-transmitted viruses. The analysis of scenario outcomes is conducted along a risk-analytical pattern, which involves risk probabilities, represented by categorized probability levels of disease epidemics, and risk magnitudes, represented by categorized levels of crop losses resulting from these levels of epidemics within each production situation. The results from this scenario analysis suggest an overall increase of risk probabilities and magnitudes in the three idealized agrosystems. Changes in risk probability or magnitude, however, vary with the agrosystem and the functional groups of pathogens.
We discuss the effects of global changes on the six functional groups, in terms of their epidemiology and of the crop losses they cause. Scenario analysis enables qualitative analysis of complex systems, such as plant pathosystems that are evolving in response to global changes, including climate change and technology shifts. It also provides a useful framework for quantitative simulation modeling analysis for plant disease epidemiology.
NASA Technical Reports Server (NTRS)
Zelkin, Natalie; Henriksen, Stephen
2011-01-01
This document is being provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." ITT has completed a safety hazard analysis providing a preliminary safety assessment for the proposed C-band (5091- to 5150-MHz) airport surface communication system. The assessment was performed following the guidelines outlined in the Federal Aviation Administration Safety Risk Management Guidance for System Acquisitions document. The safety analysis did not identify any hazards with an unacceptable risk, though a number of hazards with a medium risk were documented. This effort represents an initial high-level safety hazard analysis and notes the triggers for risk reassessment. A detailed safety hazards analysis is recommended as a follow-on activity to assess particular components of the C-band communication system after the profile is finalized and system rollout timing is determined. A security risk assessment has been performed by NASA as a parallel activity. While safety analysis is concerned with a prevention of accidental errors and failures, the security threat analysis focuses on deliberate attacks. Both processes identify the events that affect operation of the system; and from a safety perspective the security threats may present safety risks.
Risk analysis of computer system designs
NASA Technical Reports Server (NTRS)
Vallone, A.
1981-01-01
Adverse events during implementation can affect the final capabilities, schedule, and cost of a computer system, even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in time, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with system design evaluation and enables a meaningful comparison among alternative designs.
WE-B-BRC-02: Risk Analysis and Incident Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraass, B.
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for the use of process mapping, failure modes and effects analysis (FMEA), and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology-specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) that we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: (1) description of risk assessment methodologies used in healthcare and industry; (2) discussion of radiation oncology-specific risk assessment strategies and issues; (3) evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
FMEA of manual and automated methods for commissioning a radiotherapy treatment planning system.
Wexler, Amy; Gu, Bruce; Goddu, Sreekrishna; Mutic, Maya; Yaddanapudi, Sridhar; Olsen, Lindsey; Harry, Taylor; Noel, Camille; Pawlicki, Todd; Mutic, Sasa; Cai, Bin
2017-09-01
To evaluate the level of risk involved in treatment planning system (TPS) commissioning using a manual test procedure (MTP), and to compare the associated process-based risk to that of an automated commissioning process (ACP) by performing an in-depth failure modes and effects analysis (FMEA). The authors collaborated to determine the potential failure modes of the TPS commissioning process using (a) approaches involving manual data measurement, modeling, and validation tests, and (b) an automated process utilizing application programming interface (API) scripting, preloaded and premodeled standard radiation beam data, a digital heterogeneous phantom, and an automated commissioning test suite (ACTS). The severity (S), occurrence (O), and detectability (D) were scored for each failure mode, and the risk priority numbers (RPN) were derived based on the TG-100 scale. Failure modes were then analyzed and ranked based on RPN. The total number of failure modes, the RPN scores, and the ten failure modes with the highest risk are described and cross-compared between the two approaches. An RPN reduction analysis is also presented and used as another quantifiable metric to evaluate the proposed approach. The FMEA of the MTP resulted in 47 failure modes with an RPN_ave of 161 and S_ave of 6.7. The highest-risk process, "Measurement Equipment Selection", resulted in an RPN_max of 640. The FMEA of the ACP resulted in 36 failure modes with an RPN_ave of 73 and S_ave of 6.7. The highest-risk process, "EPID Calibration", resulted in an RPN_max of 576. An FMEA of treatment planning commissioning tests using automation and standardization via API scripting, preloaded and premodeled standard beam data, and digital phantoms suggests that errors and risks may be reduced through the use of an ACP. © 2017 American Association of Physicists in Medicine.
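The RPN ranking described above (RPN = S × O × D, each scored on the TG-100 1-10 scales) can be sketched as follows; the failure modes and scores here are illustrative, not the study's measured data:

```python
# Sketch of the RPN ranking used in FMEA: each failure mode is scored
# for severity (S), occurrence (O), and detectability (D) on 1-10
# scales, and RPN = S * O * D. Failure modes and scores are invented
# for illustration.

failure_modes = [
    # (name, S, O, D)
    ("Measurement equipment selection", 8, 8, 10),
    ("Beam data transcription error",   7, 6, 6),
    ("EPID calibration error",          8, 8, 9),
    ("Wrong phantom geometry",          6, 4, 5),
]

scored = [(name, s * o * d) for name, s, o, d in failure_modes]
scored.sort(key=lambda x: x[1], reverse=True)  # highest risk first

for name, rpn in scored:
    print(f"RPN {rpn:4d}  {name}")
```

Ranking by RPN and comparing the totals before and after a process change is exactly the kind of "RPN reduction" comparison the abstract uses to argue for automation.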
Quality transitivity and traceability system of herbal medicine products based on quality markers.
Liu, Changxiao; Guo, De-An; Liu, Liang
2018-05-15
Because a variety of factors affect herb quality, the existing quality management model is unable to evaluate process control. The development of the concept of the "quality marker" (Q-marker) lays the basis for establishing an independent process quality control system for herbal products. To ensure the highest degree of safety, effectiveness, and quality process control of herbal products, this work aimed to establish a quality transitivity and traceability system covering quality and process control from raw materials to finished herbal products. Based on the key issues and challenges of quality assessment, the current status of quality and process controls from raw materials to herbal medicinal products listed in pharmacopoeias was analyzed, and research models including the discovery and identification of Q-markers and the analysis and quality management of risk evaluation were designed. The authors introduce several new technologies and methodologies, such as DNA barcoding, chromatographic technologies, fingerprint analysis, chemical markers, bio-responses, risk management, and solutions for quality process control. Quality and process control models for herbal medicinal products were proposed, and the transitivity and traceability system from raw materials to finished products was constructed to improve herbal quality across the entire supply and production chain. The transitivity and traceability system has been established based on quality markers, with particular attention to controlling the production process under Good Engineering Practices and to implementing risk management for quality and process control in herbal medicine production. Copyright © 2018 Elsevier GmbH. All rights reserved.
Global computer-assisted appraisal of osteoporosis risk in Asian women: an innovative study.
Chang, Shu F; Hong, Chin M; Yang, Rong S
2011-05-01
To develop a computer-assisted appraisal system of osteoporosis that can predict osteoporosis health risk in community-dwelling women, and to use it in an empirical analysis of the risk in Asian women. As the literature indicates, health risk assessment tools are generally applied in clinical practice for patient diagnosis. However, few studies have explored how to help community-dwelling women understand their risk of osteoporosis without invasive data. A longitudinal, evidence-based study. The first stage of this study was to establish a system that combines expertise in nursing, medicine, and information technology. This part includes information from random samples (n = 700), including data on bone mineral density, osteoporosis risk factors, knowledge, beliefs, and behaviour, which are used as the health risk appraisal system database. The second stage was an empirical study. The relative risks of osteoporosis of the participants (n = 300) were determined with the system. The participants classified as at risk were randomly assigned to experimental and control groups. Each group was treated using different nursing intervention methods. The sensitivity and specificity of the analytical tools were 75%. In the empirical study, the results indicate that the prevalence of osteoporosis was 14.0%. The data indicate that strategic application of multiple nursing interventions can promote osteoporosis prevention knowledge in high-risk women and enhance the effectiveness of preventive action. The system can also provide people in remote areas or with insufficient medical resources a simple and effective means of managing health risk, and implement the idea of self-evaluation and self-care among community-dwelling women at home, to achieve the final goal of early detection and early treatment of osteoporosis. This study developed a useful approach for providing Asian women with a reliable, valid, convenient, and economical self-health-management model.
Health care professionals can explore the use of advanced information systems and nursing interventions to increase the effectiveness of osteoporosis prevention programmes for women. © 2011 Blackwell Publishing Ltd.
Johnson, Stephen; Proctor, Matthew; Bluth, Edward; Smetherman, Dana; Baumgarten, Katherine; Troxclair, Laurie; Bienvenu, Michele
2013-10-01
Because of the complex process and the risk of errors associated with the glutaraldehyde-based solutions previously used at our institution for disinfection, our department has implemented a new method for high-level disinfection of vaginal ultrasound probes: the hydrogen peroxide-based Trophon system (Nanosonics, Alexandria, New South Wales, Australia). The aim of this study was to compare the time difference, safety, and sonographers' satisfaction between the glutaraldehyde-based Cidex (CIVCO Medical Solutions, Kalona, IA) and the hydrogen peroxide-based Trophon disinfection systems. The Institutional Review Board approved a 14-question survey administered to the 13 sonographers in our department. Survey questions addressed a variety of aspects of the disinfection processes with graded responses over a standardized 5-point scale. A process diagram was developed for each disinfection method with segmental timing analysis, and a cost analysis was performed. Nonparametric analysis of the survey data with the Wilcoxon signed rank test showed a statistical difference in survey responses in favor of the hydrogen peroxide-based system over the glutaraldehyde-based system regarding efficiency (P = .0013), ease of use (P = .0013), ability to maintain work flow (P = .026), safety (P = .0026), fixing problems (P = .0158), time (P = .0011), and overall satisfaction (P = .0018). The glutaraldehyde-based system took 32 minutes versus 14 minutes for the hydrogen peroxide-based system; the hydrogen peroxide-based system saved on average 7.5 hours per week. The cost of the hydrogen peroxide-based system and its weekly maintenance pays for itself if 1.5 more ultrasound examinations are performed each week. The hydrogen peroxide-based disinfection system proved more efficient and was viewed as easier and safer to use than the glutaraldehyde-based system. The adoption of the hydrogen peroxide-based system led to higher satisfaction among sonographers.
Interdisciplinary approach for disaster risk reduction in Valtellina Valley, northern Italy
NASA Astrophysics Data System (ADS)
Garcia, Carolina; Blahut, Jan; Luna, Byron Quan; Poretti, Ilaria; Camera, Corrado; de Amicis, Mattia; Sterlacchini, Simone
2010-05-01
Inside the framework of the European research network Mountain Risks, an interdisciplinary research group has been working in the Consortium of Mountain Municipalities of Valtellina di Tirano (northern Italy). This area has been continuously affected by several mountain hazards, such as landslides, debris flows, and floods, that directly affect the population and in some cases have caused several deaths and millions of euros in losses. An aim of the interdisciplinary work in this study area is to integrate the different scientific products of the research group, in the areas of risk assessment, management, and governance, in order to generate, among others, risk reduction tools addressed to the general public and stakeholders. Two types of phenomena have been particularly investigated: debris flows and floods. The scientific products range from modelling and mapping of hazard and risk to emergency planning based on real-time decision support systems and surveys for the evaluation of risk perception and preparedness. Outputs from medium-scale hazard and risk modelling can be used by decision makers, spatial planners, and civil protection authorities to obtain a general overview of the area and identify hot spots for further detailed analysis. Subsequently, local-scale analysis is necessary to define possible events and risk scenarios for emergency planning. As for the modelling of past events and new scenarios of debris flows, physical outputs were used as inputs into physical vulnerability assessment and quantitative risk analysis within dynamic runout models. In a pilot zone, the physical damage was quantified for each affected structure within the context of physical vulnerability, and different empirical vulnerability curves were obtained. Prospective direct economic losses were estimated. For flood hazard assessment, different approaches and models are being tested in order to produce flood maps for various return periods, related to registered rainfalls.
Regarding civil protection, the main aim is to set up and manage contingency plans in advance; that is, to identify and prepare the people in charge of taking action, to define the activities to be performed, to be aware of available resources, and to optimize the communication system among the people involved, in order to face a prospective crisis phase efficiently. For this purpose, a real-time emergency plan has been developed based on GIS (Geographical Information Systems), DSS (Decision Support Systems), and ICT (Information & Communication Technology).
Determination of viable legionellae in engineered water systems: Do we find what we are looking for?
Kirschner, Alexander K.T.
2016-01-01
In developed countries, legionellae are one of the most important water-based bacterial pathogens caused by management failure of engineered water systems. For routine surveillance of legionellae in engineered water systems and outbreak investigations, cultivation-based standard techniques are currently applied. However, in many cases culture-negative results are obtained despite the presence of viable legionellae, and clinical cases of legionellosis cannot be traced back to their respective contaminated water source. Among the various explanations for these discrepancies, the presence of viable but non-culturable (VBNC) Legionella cells has received increased attention in recent discussions and scientific literature. Alternative culture-independent methods to detect and quantify legionellae have been proposed in order to complement or even substitute the culture method in the future. Such methods should detect VBNC Legionella cells and provide a more comprehensive picture of the presence of legionellae in engineered water systems. However, it is still unclear whether and to what extent these VBNC legionellae are hazardous to human health. Current risk assessment models to predict the risk of legionellosis from Legionella concentrations in the investigated water systems contain many uncertainties and are mainly based on culture-based enumeration. If VBNC legionellae should be considered in future standard analysis, quantitative risk assessment models including VBNC legionellae must be proven to result in better estimates of human health risk than models based on cultivation alone. This review critically evaluates current methods to determine legionellae in the VBNC state, their potential to complement the standard culture-based method in the near future, and summarizes current knowledge on the threat that VBNC legionellae may pose to human health. PMID:26928563
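As one concrete example of the kind of quantitative risk assessment model discussed above, a single-hit exponential dose-response curve is a common form in QMRA; the parameter value and doses below are purely illustrative, and the review does not prescribe this particular model:

```python
import math

# Sketch of a single-hit exponential dose-response model, a common form
# in quantitative microbial risk assessment (QMRA). The parameter r and
# the doses are illustrative assumptions, not values from the review.

def p_infection(dose, r):
    """Probability of infection for a given mean dose (organisms)."""
    return 1.0 - math.exp(-r * dose)

r = 0.06  # hypothetical per-organism infection probability
for dose in (1, 10, 100):
    print(f"dose={dose:4d}  P(infection)={p_infection(dose, r):.3f}")
```

Whether VBNC cells should enter the dose term, and with what (if any) reduced infectivity, is exactly the open question the review raises about such models.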
NASA Astrophysics Data System (ADS)
Baruffini, Mirko
2010-05-01
Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches, and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by applying the concept of integral risk management. Risk analysis, the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people, and infrastructures. With a Geographical Information System adapted to run with a tool developed for risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows, and avalanches, to vulnerability assessment and risk analysis, and their integration into land-use planning at the cantonal and municipal level. The scheme is limited to the direct consequences of natural hazards.
Thus, we developed a system that integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event data such as route number and route position, and thematic maps. As a result of this step, the record appears in the WebGIS. The user can select a specific area to review previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis that can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules.
The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years, our GIS application will become a database containing all information needed for the evaluation of risk sites along the Gotthard line. Our GIS application can help technical management decide about protection measures because, in addition to the visualisation, tools for spatial data analysis will be available. REFERENCES Bründl, M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern. 416 S. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL). Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk management in Switzerland). In: Veyret, Y., Garry, G., Meschinet de Richemont, N. & Armand Colin (eds): Colloque Arche de la Défense, 22-24 octobre 2002, dans Risques naturels et aménagement en Europe, 108-120. Maggi, R. et al. 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors, NRP 54 "Sustainable Development of the Built Environment", Projekt Nr. 405 440, Final Scientific Report, Lugano.
Information and problem report usage in system safety engineering division
NASA Technical Reports Server (NTRS)
Morrissey, Stephen J.
1990-01-01
Five basic problems or question areas are examined: (1) evaluate the adequacy of the current problem/performance database; (2) evaluate methods of performing trend analysis; (3) identify methods and sources of data for probabilistic risk assessment; and (4) determine how risk assessment documentation is upgraded and/or updated. The fifth task was to provide recommendations for each of the four areas above.
Advanced tracking systems design and analysis
NASA Technical Reports Server (NTRS)
Potash, R.; Floyd, L.; Jacobsen, A.; Cunningham, K.; Kapoor, A.; Kwadrat, C.; Radel, J.; Mccarthy, J.
1989-01-01
The results of an assessment of several types of high-accuracy tracking systems proposed to track the spacecraft in the National Aeronautics and Space Administration (NASA) Advanced Tracking and Data Relay Satellite System (ATDRSS) are summarized. Tracking systems based on the use of interferometry and ranging are investigated. For each system, the top-level system design and operations concept are provided. A comparative system assessment is presented in terms of orbit determination performance, ATDRSS impacts, life-cycle cost, and technological risk.
Wang, J; Yang, S; Guo, F H; Mao, X; Zhou, H; Dong, Y Q; Wang, Z M; Luo, F
2015-11-13
The angiotensin-converting enzyme (ACE) gene insertion/deletion (I/D) polymorphism has been reported to be associated with digestive system cancer; however, the results from previous studies have been conflicting. The present study aimed to investigate the association between the ACE I/D polymorphism and the risk of digestive system cancer using a meta-analysis of previously published studies. Databases were systematically searched to identify relevant studies published prior to December 2014. We estimated the pooled odds ratio (OR) with its 95% confidence interval (CI) to assess the association. The meta-analysis consisted of thirteen case-control studies that included 2557 patients and 4356 healthy controls. Meta-analysis results based on all the studies showed no significant association between the ACE I/D polymorphism and the risk of digestive system cancer (DD vs II: OR = 0.85, 95%CI = 0.59-1.24; DI vs II: OR = 0.94, 95%CI = 0.78-1.15; dominant model: OR = 0.96, 95%CI = 0.81-1.15; recessive model: OR = 1.06, 95%CI = 0.76-1.48). Subgroup analyses by race and cancer type did not detect an association between the ACE I/D polymorphism and digestive system cancer risk. However, when the analyses were restricted to smaller studies (N < 500 patients), the summary OR of DI vs II was 0.80 (95%CI = 0.66-0.97). Our analyses detected possible publication bias, with smaller studies misestimating the true association. Overall, the meta-analysis results suggest the ACE I/D polymorphism might not be associated with susceptibility to digestive system cancer. Further large and well-designed studies are needed to confirm these conclusions.
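Pooled ORs like those reported above typically come from inverse-variance weighting on the log-odds scale; a minimal fixed-effect sketch, with illustrative study data rather than the meta-analysis's actual thirteen studies:

```python
import math

# Sketch of inverse-variance (fixed-effect) pooling of odds ratios.
# The per-study ORs and 95% CIs below are illustrative assumptions.

def pooled_or(studies):
    """studies: list of (OR, ci_lower, ci_upper) with 95% CIs."""
    num = den = 0.0
    for or_value, ci_lo, ci_hi in studies:
        log_or = math.log(or_value)
        se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)  # SE from CI width
        w = 1.0 / se ** 2                                       # inverse-variance weight
        num += w * log_or
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

studies = [(0.85, 0.60, 1.20), (1.10, 0.80, 1.51), (0.95, 0.70, 1.29)]
or_, lo, hi = pooled_or(studies)
print(f"pooled OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A random-effects model would additionally widen the weights by a between-study variance term; the fixed-effect version shown here is the simpler baseline.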
Local Food Systems Food Safety Concerns.
Chapman, Benjamin; Gunter, Chris
2018-04-01
Foodborne disease causes an estimated 48 million illnesses and 3,000 deaths annually (Scallan E, et al., Emerg Infect Dis 17:7-15, 2011), with U.S. economic costs estimated at $152 billion to $1.4 trillion annually (Roberts T, Am J Agric Econ 89:1183-1188, 2007; Scharff RL, http://www.pewtrusts.org/en/research-and-analysis/reports/0001/01/01/healthrelated-costs-from-foodborne-illness-in-the-united-states, 2010). An increasing number of these illnesses are associated with fresh fruits and vegetables. An analysis of outbreaks from 1990 to 2003 found that 12% of outbreaks and 20% of outbreak-related illnesses were associated with produce (Klein S, Smith DeWaal CS, Center for Science in the Public Interest, https://cspinet.org/sites/default/files/attachment/ddreport.pdf, June 2008; Lynch M, Tauxe R, Hedberg C, Epidemiol Infect 137:307-315, 2009). These food safety problems have led various stakeholders to recommend a shift to a more preventative and risk-based food safety system. A modern risk-based food safety system takes a farm-to-fork preventative approach and relies on the proactive collection and analysis of data to better understand potential hazards and risk factors, to design and evaluate interventions, and to prioritize prevention efforts. Such a system focuses limited resources at the points in the food system with the likelihood of the greatest benefit to public health. As shared kitchens, food hubs, and local food systems such as community-supported agriculture become more prevalent throughout the United States, so do foodborne illness outbreaks at these locations. At many of these locations, which often have limited resources, food safety prevention methods are rarely the main focus, and this lack of attention to food safety is one reason a growing number of foodborne illness outbreaks occur there.
IV&V Project Assessment Process Validation
NASA Technical Reports Server (NTRS)
Driskell, Stephen
2012-01-01
The Space Launch System (SLS) will launch NASA's Multi-Purpose Crew Vehicle (MPCV). This launch vehicle will provide American launch capability for human exploration and travel beyond Earth orbit. SLS is designed to be flexible for crew or cargo missions. The first test flight is scheduled for December 2017. The SLS SRR/SDR provided insight into the project development life cycle. NASA IV&V ran the standard Risk Based Assessment and Portfolio Based Risk Assessment to identify analysis tasking for the SLS program. This presentation examines the SLS System Requirements Review/System Definition Review (SRR/SDR) and correlates IV&V findings with the selected IV&V tasking and capabilities for IV&V process validation. It also provides a reusable IEEE 1012 scorecard for programmatic completeness across the software development life cycle.
Accurate Diabetes Risk Stratification Using Machine Learning: Role of Missing Value and Outliers.
Maniruzzaman, Md; Rahman, Md Jahanur; Al-MehediHasan, Md; Suri, Harman S; Abedin, Md Menhazul; El-Baz, Ayman; Suri, Jasjit S
2018-04-10
Diabetes mellitus is a group of metabolic diseases in which blood sugar levels are too high. About 8.8% of the world's population was diabetic in 2017, and this is projected to reach nearly 10% by 2045. The major challenge is that applying machine learning-based classifiers to such data sets for risk stratification leads to lower performance. Thus, our objective was to develop an optimized and robust machine learning (ML) system under the assumption that replacing missing values or outliers with a median configuration will yield higher risk stratification accuracy. This ML-based risk stratification system is designed, optimized, and evaluated as follows: the features are extracted and optimized using six feature selection techniques (random forest, logistic regression, mutual information, principal component analysis, analysis of variance, and Fisher discriminant ratio) and combined with ten different types of classifiers (linear discriminant analysis, quadratic discriminant analysis, naïve Bayes, Gaussian process classification, support vector machine, artificial neural network, AdaBoost, logistic regression, decision tree, and random forest), under the hypothesis that replacing both missing values and outliers with computed medians will improve the risk stratification accuracy. The Pima Indian diabetes dataset (768 patients: 268 diabetic and 500 controls) was used. Our results demonstrate that replacing missing values and outliers with group median and median values, respectively, and further using the combination of random forest feature selection and random forest classification yields an accuracy, sensitivity, specificity, positive predictive value, negative predictive value, and area under the curve of 92.26%, 95.96%, 79.72%, 91.14%, 91.20%, and 0.93, respectively. This is an improvement of 10% over previously developed techniques published in the literature. The system was validated for its stability and reliability.
The random forest-based model showed the best performance when outliers were replaced by median values.
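The median-replacement preprocessing hypothesized in the study can be sketched in isolation; the `median_impute` helper, the IQR-based outlier rule, and the data column below are assumptions for illustration, and the classifier stage (random forest, etc.) is omitted:

```python
import statistics

# Sketch of median imputation for one feature column: replace missing
# entries (None) and IQR outliers with the column median. The column
# values are illustrative, not the Pima dataset.

def median_impute(column, iqr_factor=1.5):
    """Replace None entries and IQR outliers with the column median."""
    observed = sorted(v for v in column if v is not None)
    median = statistics.median(observed)
    q1, _, q3 = statistics.quantiles(observed, n=4)       # quartile cut points
    lo = q1 - iqr_factor * (q3 - q1)
    hi = q3 + iqr_factor * (q3 - q1)
    return [median if v is None or not (lo <= v <= hi) else v
            for v in column]

# Illustrative glucose-like column: None marks a missing value,
# 400 is an implausible outlier.
glucose = [95, 110, None, 105, 400, 98, 102, None, 99]
print(median_impute(glucose))
```

In a full pipeline this step would run per feature (or per class group, for the "group median" variant) before feature selection and classification.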
Space-based solar power conversion and delivery systems study. Volume 5: Economic analysis
NASA Technical Reports Server (NTRS)
1977-01-01
Space-based solar power conversion and delivery systems are studied along with a variety of economic and programmatic issues relevant to their development and deployment. The costs, uncertainties and risks associated with the current photovoltaic Satellite Solar Power System (SSPS) configuration, and issues affecting the development of an economically viable SSPS development program are addressed. In particular, the desirability of low earth orbit (LEO) and geosynchronous (GEO) test satellites is examined and critical technology areas are identified. The development of SSPS unit production (nth item), and operation and maintenance cost models suitable for incorporation into a risk assessment (Monte Carlo) model (RAM) are reported. The RAM was then used to evaluate the current SSPS configuration expected costs and cost-risk associated with this configuration. By examining differential costs and cost-risk as a function of postulated technology developments, the critical technologies, that is, those which drive costs and/or cost-risk, are identified. It is shown that the key technology area deals with productivity in space, that is, the ability to fabricate and assemble large structures in space, not, as might be expected, with some hardware component technology.
Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.
Haimes, Yacov Y
2018-01-01
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and which relates these elements to the domains of experts and decisionmakers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Huang, Weiqing; Fan, Hongbo; Qiu, Yongfu; Cheng, Zhiyu; Xu, Pingru; Qian, Yu
2016-05-01
Recently, China has frequently experienced large-scale, severe and persistent haze pollution due to surging urbanization and industrialization and rapid growth in the number of motor vehicles and in energy consumption. Vehicle emissions due to the consumption of large amounts of fossil fuels are undoubtedly a critical factor in haze pollution. This work focuses on the causation mechanism of haze pollution related to vehicle emissions for Guangzhou city by employing the Fault Tree Analysis (FTA) method for the first time. With the establishment of the fault tree system of "Haze weather-Vehicle exhausts explosive emission", all of the important risk factors are discussed and identified by using this deductive FTA method. The qualitative and quantitative assessments of the fault tree system are carried out based on the structure, probability and critical importance degree analysis of the risk factors. The study may provide a new, simple and effective tool/strategy for the causation mechanism analysis and risk management of haze pollution in China. Copyright © 2016 Elsevier Ltd. All rights reserved.
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mossahebi, S; Feigenberg, S; Nichols, E
Purpose: GammaPod™, the first stereotactic radiotherapy device for early stage breast cancer treatment, has been recently installed and commissioned at our institution. A multidisciplinary working group applied the failure mode and effects analysis (FMEA) approach to perform a risk analysis. Methods: FMEA was applied to the GammaPod™ treatment process by: 1) generating process maps for each stage of treatment; 2) identifying potential failure modes and outlining their causes and effects; 3) scoring the potential failure modes using the risk priority number (RPN) system based on the product of severity, frequency of occurrence, and detectability (each ranging 1-10). An RPN of 150 was set as the threshold below which risk was considered of minimal concern. For the high-risk failure modes above this threshold, potential quality assurance procedures and risk control techniques have been proposed. A new set of severity, occurrence, and detectability values was re-assessed in the presence of the suggested mitigation strategies. Results: In the single-day image-and-treat workflow, 19, 22, and 27 sub-processes were identified for the stages of simulation, treatment planning, and delivery, respectively. During the simulation stage, 38 potential failure modes were found and scored, in terms of RPN, in the range of 9-392. 34 potential failure modes were analyzed in treatment planning with a score range of 16-200. For the treatment delivery stage, 47 potential failure modes were found with an RPN score range of 16-392. The most critical failure modes consisted of breast-cup pressure loss and incorrect target localization due to patient upper-body alignment inaccuracies. The final RPN scores of these failure modes, based on the recommended actions, were assessed to be below 150. Conclusion: FMEA risk analysis technique was applied to the treatment process of GammaPod™, a new stereotactic radiotherapy technology.
Application of systematic risk analysis methods is projected to lead to improved quality of GammaPod™ treatments. Ying Niu and Cedric Yu are affiliated with Xcision Medical Systems.
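The RPN arithmetic this abstract describes can be sketched in a few lines. Only the product-of-three-factors rule, the 1-10 rating scales, the 150 threshold, and the failure-mode names come from the text; the individual severity/occurrence/detectability ratings below are illustrative assumptions (chosen so the worst cases reproduce the reported maximum score of 392).

```python
# Illustrative failure modes with assumed (severity, occurrence, detectability)
# ratings on the 1-10 scale; only the mode names are taken from the abstract.
FAILURE_MODES = {
    "breast-cup pressure loss": (8, 7, 7),
    "incorrect target localization": (8, 7, 7),
    "plan transfer mismatch": (4, 2, 2),     # hypothetical low-risk mode
}

THRESHOLD = 150  # RPNs above this value require mitigation

def rpn(severity, occurrence, detectability):
    # Risk priority number: product of the three crisp ratings
    return severity * occurrence * detectability

# Failure modes whose RPN exceeds the concern threshold
flagged = {name: rpn(*scores)
           for name, scores in FAILURE_MODES.items()
           if rpn(*scores) > THRESHOLD}
```

Re-scoring `FAILURE_MODES` with post-mitigation ratings and checking that `flagged` becomes empty mirrors the re-assessment step the working group performed.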
Atella, Vincenzo; Brunetti, Marianna; Maestas, Nicole
2013-01-01
Health risk is increasingly viewed as an important form of background risk that affects household portfolio decisions. However, its role might be mediated by the presence of a protective full-coverage national health service that could reduce households’ probability of incurring current and future out-of-pocket medical expenditures. We use SHARE data to study the influence of current health status and future health risk on the decision to hold risky assets, across ten European countries with different health systems, each offering a different degree of protection against out-of-pocket medical expenditures. We find robust empirical evidence that perceived health status matters more than objective health status and, consistent with the theory of background risk, health risk affects portfolio choices only in countries with less protective health care systems. Furthermore, portfolio decisions consistent with background risk models are observed only with respect to middle-aged and highly-educated investors. PMID:23885134
Early Warning System for West Nile Virus Risk Areas, California, USA
Ahearn, Sean C.; McConchie, Alan; Glaser, Carol; Jean, Cynthia; Barker, Chris; Park, Bborie; Padgett, Kerry; Parker, Erin; Aquino, Ervic; Kramer, Vicki
2011-01-01
The Dynamic Continuous-Area Space-Time (DYCAST) system is a biologically based spatiotemporal model that uses public reports of dead birds to identify areas at high risk for West Nile virus (WNV) transmission to humans. In 2005, during a statewide epidemic of WNV (880 cases), the California Department of Public Health prospectively implemented DYCAST over 32,517 km2 in California. Daily risk maps were made available online and used by local agencies to target public education campaigns, surveillance, and mosquito control. DYCAST had 80.8% sensitivity and 90.6% specificity for predicting human cases, and κ analysis indicated moderate strength of chance-adjusted agreement for >4 weeks. High-risk grid cells (populations) were identified an average of 37.2 days before onset of human illness; relative risk for disease was >39× higher than for low-risk cells. Although prediction rates declined in subsequent years, results indicate DYCAST was a timely and effective early warning system during the severe 2005 epidemic. PMID:21801622
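The surveillance performance measures quoted for DYCAST (sensitivity, specificity, relative risk) can be computed from a confusion matrix over grid cells, as sketched below; the counts used in the test values are illustrative, not the study's data.

```python
def sensitivity(tp, fn):
    # Fraction of cells with human cases that were flagged high-risk
    return tp / (tp + fn)

def specificity(tn, fp):
    # Fraction of cells without human cases correctly left unflagged
    return tn / (tn + fp)

def relative_risk(cases_hi, n_hi, cases_lo, n_lo):
    # Disease risk in high-risk cells relative to low-risk cells
    return (cases_hi / n_hi) / (cases_lo / n_lo)
```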
OCCURRENCE AND EXPOSURE ASSESSMENT FOR THE ...
Describes the occurrence of Cryptosporidium and other pathogens in the raw and finished water of public water systems (PWS) based on modeling of source water survey data. Analysis of microbial occurrence data to support LT2ESWTR microbial risk assessment
2013-06-30
QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004)...assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and...www.amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC
Xu, Zicheng; Li, Xiao; Qin, Zhiqiang; Xue, Jianxin; Wang, Jingyuan; Liu, Zhentao; Cai, Hongzhou; Yu, Bin; Xu, Ting; Zou, Qin
2017-07-24
Individual studies of the association between the N-acetyltransferase 1 (NAT1)*10 allele and bladder cancer susceptibility have shown inconclusive results. To derive a more precise estimation of any such relationship, we performed this systematic review and updated meta-analysis based on 17 publications. A total of 17 studies were investigated with 4,322 bladder cancer cases and 4,944 controls. The pooled odds ratios (ORs) with 95% confidence intervals (CIs) were used to assess the strength of the association. Subgroup analyses were conducted based on ethnicity, sex, source of controls and detecting methods. Trial sequential analysis was then performed to evaluate whether the evidence of the results was sufficient and to reduce the risk of type I error. There was no association between the NAT1*10 allele and bladder cancer risk in a random-effects model (OR = 0.96, 95% CI, 0.84-1.10) or in a fixed-effects model (OR = 0.95, 95% CI, 0.87-1.03). In addition, no significantly increased risk of bladder cancer was found in any other subgroup analysis. Trial sequential analyses demonstrated that the results of our study need to be further verified. Despite its limitations, the present meta-analysis suggested that there was no association between the NAT1*10 allele and bladder cancer risk. More importantly, our findings need to be further validated regarding whether the absence of the NAT1*10 allele could in the future be shown to be a potential marker for the risk of bladder cancer.
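A minimal sketch of the fixed-effects (inverse-variance) pooling used in such meta-analyses, assuming each study reports an odds ratio with a symmetric 95% confidence interval on the log scale; this is the generic textbook formula, not the authors' actual code.

```python
import math

def pooled_or_fixed(ors, cis):
    """Inverse-variance fixed-effects pooled odds ratio.

    ors: per-study odds ratios; cis: matching (lower, upper) 95% CIs,
    from which each study's log-OR standard error is recovered.
    Returns (pooled OR, 95% CI lower, 95% CI upper).
    """
    weights, log_ors = [], []
    for or_, (lo, hi) in zip(ors, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # log-scale SE
        weights.append(1.0 / se**2)                       # inverse-variance weight
        log_ors.append(math.log(or_))
    pooled_log = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * se_pooled),
            math.exp(pooled_log + 1.96 * se_pooled))
```

A random-effects model would additionally estimate between-study variance (e.g. DerSimonian-Laird) and widen the weights accordingly.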
2011-08-01
investigated. Implementation of this technology into the maintenance framework depends on several factors, including safety of the structural system, cost...Maintenance Parameters The F-15 Program has indicated that, in practice, maintenance actions are generally performed on flight hour multiples of 200...Risk Analysis or the Perform Cost Benefit Analysis sections of the flowchart. 4.6. Determine System Configurations The current maintenance practice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, C.M.; Fortmann, K.M.; Hill, S.W.
1994-12-01
Environmental restoration is an area of concern in an environmentally conscious world. Much effort is required to clean up the environment and promote environmentally sound methods for managing current land use. In light of the public consciousness of the latter topic, the United States Air Force must also take an active role in addressing these environmental issues with respect to current and future USAF base land use. This thesis uses the systems engineering technique to assess human health risks and to evaluate risk management options with respect to depleted uranium contamination in the sampled region of Test Area (TA) C-64 at Eglin Air Force Base (AFB). The research combines the disciplines of environmental data collection, DU soil concentration distribution modeling, ground water modeling, particle resuspension modeling, exposure assessment, health hazard assessment, and uncertainty analysis to characterize the test area. These disciplines are required to quantify current and future health risks, as well as to recommend cost effective ways to increase confidence in health risk assessment and remediation options.
NASA Technical Reports Server (NTRS)
Carreno, Victor
2006-01-01
This document describes a method to demonstrate that a UAS, operating in the NAS, can avoid collisions with an equivalent level of safety compared to a manned aircraft. The method is based on the calculation of a collision probability for a UAS, the calculation of a collision probability for a baseline manned aircraft, and the calculation of a risk ratio given by: Risk Ratio = P(collision_UAS)/P(collision_manned). A UAS will achieve an equivalent level of safety for collision risk if the risk ratio is less than or equal to one. Calculation of the probability of collision for UAS and manned aircraft is accomplished through event/fault trees.
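The risk-ratio criterion is simple enough to transcribe directly; the code below restates the formula from the abstract, with the two collision probabilities assumed to come from the event/fault-tree analysis.

```python
def risk_ratio(p_collision_uas, p_collision_manned):
    # Risk Ratio = P(collision_UAS) / P(collision_manned)
    return p_collision_uas / p_collision_manned

def equivalent_level_of_safety(p_collision_uas, p_collision_manned):
    # The UAS meets the equivalence criterion when the ratio is <= 1
    return risk_ratio(p_collision_uas, p_collision_manned) <= 1.0
```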
NASA Technical Reports Server (NTRS)
Zelkin, Natalie; Henriksen, Stephen
2011-01-01
This document is being provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." ITT has completed a safety hazard analysis providing a preliminary safety assessment for the proposed L-band (960 to 1164 MHz) terrestrial en route communications system. The assessment was performed following the guidelines outlined in the Federal Aviation Administration Safety Risk Management Guidance for System Acquisitions document. The safety analysis did not identify any hazards with an unacceptable risk, though a number of hazards with a medium risk were documented. This effort represents a preliminary safety hazard analysis and notes the triggers for risk reassessment. A detailed safety hazards analysis is recommended as a follow-on activity to assess particular components of the L-band communication system after the technology is chosen and system rollout timing is determined. The security risk analysis resulted in identifying main security threats to the proposed system as well as noting additional threats recommended for a future security analysis conducted at a later stage in the system development process. The document discusses various security controls, including those suggested in the COCR Version 2.0.
DiMase, Daniel; Collier, Zachary A; Carlson, Jinae; Gray, Robin B; Linkov, Igor
2016-10-01
Within the microelectronics industry, there is a growing concern regarding the introduction of counterfeit electronic parts into the supply chain. Even though this problem is widespread, there have been limited attempts to implement risk-based approaches for testing and supply chain management. Supply chain risk management tends to focus on the highly visible disruptions of the supply chain instead of the covert entrance of counterfeits; thus counterfeit risk is difficult to mitigate. This article provides an overview of the complexities of the electronics supply chain, and highlights some gaps in risk assessment practices. In particular, this article calls for enhanced traceability capabilities to track and trace parts at risk through various stages of the supply chain. A focus on risk-informed decision making is needed, through strategies including prioritization of high-risk parts, moving beyond certificates of conformance, incentivizing best supply chain management practices, adoption of industry standards, and design and management for supply chain resilience. © 2016 Society for Risk Analysis.
Jiang, Jheng Jie; Lee, Chon Lin; Fang, Meng Der; Boyd, Kenneth G.; Gibb, Stuart W.
2015-01-01
This paper presents a methodology based on multivariate data analysis for characterizing potential source contributions of emerging contaminants (ECs) detected in 26 river water samples across multi-scape regions during dry and wet seasons. Based on this methodology, we unveil an approach toward potential source contributions of ECs, a concept we refer to as the “Pharmaco-signature.” Exploratory analysis of data points has been carried out by unsupervised pattern recognition (hierarchical cluster analysis, HCA) and receptor model (principal component analysis-multiple linear regression, PCA-MLR) in an attempt to demonstrate significant source contributions of ECs in different land-use zone. Robust cluster solutions grouped the database according to different EC profiles. PCA-MLR identified that 58.9% of the mean summed ECs were contributed by domestic impact, 9.7% by antibiotics application, and 31.4% by drug abuse. Diclofenac, ibuprofen, codeine, ampicillin, tetracycline, and erythromycin-H2O have significant pollution risk quotients (RQ>1), indicating potentially high risk to aquatic organisms in Taiwan. PMID:25874375
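The PCA-MLR receptor-modeling step described above (often called APCS-MLR) can be sketched as follows: PCA on standardized concentrations, a shift to absolute principal component scores, and a regression of summed concentrations on those scores to apportion source contributions. The synthetic data, the choice of three components, and all variable names are assumptions for illustration only, not the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.lognormal(size=(26, 6))            # synthetic: 26 samples x 6 ECs
Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each compound

# PCA via SVD of the standardized matrix; retain three components
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U[:, :3] * s[:3]

# Absolute principal component scores (APCS): shift by the score of a
# hypothetical zero-concentration sample so contributions are non-centered
z0 = (0.0 - X.mean(axis=0)) / X.std(axis=0)
apcs = scores - z0 @ Vt[:3].T

# MLR: regress summed ECs on the APCS to apportion source contributions
y = X.sum(axis=1)
A = np.column_stack([np.ones(len(y)), apcs])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
mean_contrib = coef[1:] * apcs.mean(axis=0)
pct = 100.0 * mean_contrib / mean_contrib.sum()   # % contribution per source
```

With real data, each retained component is interpreted as a source (domestic, antibiotics, drug abuse) from its loadings before the percentages in `pct` are reported.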
Pretagostini, R; Gabbrielli, F; Fiaschetti, P; Oliveti, A; Cenci, S; Peritore, D; Stabile, D
2010-05-01
Starting from the report on medical errors published in 1999 by the US Institute of Medicine, a number of different approaches to risk management have been developed for maximum risk reduction in health care activities. The health care authorities in many countries have focused attention on patient safety, employing action research programs that are based on quite different principles. We performed a systematic Medline search of the literature since 1999. The following key words were used, also combining Boolean operators and medical subject heading terms: "adverse event," "risk management," "error," and "governance." Studies published in the last 5 years were classified into various groups: risk management in health care systems; safety in specific hospital activities; and health care institutions' official documents. The methods of these action research programs were analysed and their characteristics compared. Their suitability for safety development in donation, retrieval, and transplantation processes was discussed in the context of the Italian transplant network. Some action research programs and studies were dedicated to entire national health care systems, whereas others focused on specific risks. Many research programs have undergone critical review in the literature. Retrospective analysis has centered on so-called sentinel events, which capture only a minor portion of the organizational phenomena that can be at the origin of an adverse event, an incident, or an error. Sentinel events give useful information if they are studied in highly engineered and standardized organizations like laboratories or tissue establishments, but they show several limits in the analysis of organ donation, retrieval, and transplantation processes, which are characterized by prevailing human factors, with high intrinsic risk and variability.
Thus, they are of limited effectiveness in providing reliable elements on which to base safety management improvement programs, especially for multidisciplinary systems with high complexity. In organ transplantation, the possibility of increasing safety seems greater using proactive research, centred mainly on organizational processes, together with retrospective analyses not limited to sentinel event reports. Copyright (c) 2010. Published by Elsevier Inc.
An evaluation of Computational Fluid dynamics model for flood risk analysis
NASA Astrophysics Data System (ADS)
Di Francesco, Silvia; Biscarini, Chiara; Montesarchio, Valeria
2014-05-01
This work presents an analysis of the hydrological-hydraulic engineering requisites for risk evaluation and efficient flood damage reduction plans. Most of the research efforts have been dedicated to the scientific and technical aspects of risk assessment, providing estimates of possible alternatives and of the associated risk. In the decision-making process for a mitigation plan, the contribution of scientists is crucial, because risk-damage analysis is based on evaluation of the flow field, of the hydraulic risk, and on economic and societal considerations. The present paper focuses on the first part of the process, the mathematical modelling of flood events, which is the basis for all further considerations. The evaluation of the potential catastrophic damage consequent to a flood event, and in particular to a dam failure, requires modelling of the flood with sufficient detail to capture the spatial and temporal evolution of the event, as well as the velocity field. Thus, the selection of an appropriate mathematical model to correctly simulate flood routing is an essential step. In this work we present the application of two 3D computational fluid dynamics models to a synthetic and a real case study in order to evaluate the evolution of the flow field and the associated flood risk. The first model is based on an open-source CFD platform, OpenFOAM. Water flow is schematized with a classical continuum approach based on the Navier-Stokes equations coupled with the volume of fluid (VOF) method to take into account the multiphase character of river bottom-water-air systems. The second model is instead based on the lattice Boltzmann method, an innovative numerical fluid dynamics scheme based on Boltzmann's kinetic equation that represents the flow dynamics at the macroscopic level by incorporating a microscopic kinetic approach. The fluid is seen as composed of particles that move and collide with one another.
Simulation results from both models are promising and congruent with experimental results available in the literature, though the LBM model requires less computational effort than the NS one.
Chang, Y S; Chang, C C; Chen, Y H; Chen, W S; Chen, J H
2017-10-01
Objectives Patients with systemic lupus erythematosus are considered vulnerable to infective endocarditis and prophylactic antibiotics are recommended before an invasive dental procedure. However, the evidence is insufficient. This nationwide population-based study evaluated the risk and related factors of infective endocarditis in systemic lupus erythematosus. Methods We identified 12,102 systemic lupus erythematosus patients from the National Health Insurance research-oriented database, and compared the incidence rate of infective endocarditis with that among 48,408 non-systemic lupus erythematosus controls. A Cox multivariable proportional hazards model was employed to evaluate the risk of infective endocarditis in the systemic lupus erythematosus cohort. Results After a mean follow-up of more than six years, the systemic lupus erythematosus cohort had a significantly higher incidence rate of infective endocarditis (42.58 vs 4.32 per 100,000 person-years, incidence rate ratio = 9.86, p < 0.001) than that of the control cohort. By contrast, the older systemic lupus erythematosus cohort had lower risk (adjusted hazard ratio 11.64) than that of the younger-than-60-years systemic lupus erythematosus cohort (adjusted hazard ratio 15.82). Cox multivariate proportional hazards analysis revealed heart disease (hazard ratio = 5.71, p < 0.001), chronic kidney disease (hazard ratio = 2.98, p = 0.034), receiving a dental procedure within 30 days (hazard ratio = 36.80, p < 0.001), and intravenous steroid therapy within 30 days (hazard ratio = 39.59, p < 0.001) were independent risk factors for infective endocarditis in systemic lupus erythematosus patients. Conclusions A higher risk of infective endocarditis was observed in systemic lupus erythematosus patients. 
Risk factors for infective endocarditis in the systemic lupus erythematosus cohort included heart disease, chronic kidney disease, steroid pulse therapy within 30 days, and a recent invasive dental procedure within 30 days.
Yen, Jennifer; Van Arendonk, Kyle J.; Streiff, Michael B.; McNamara, LeAnn; Stewart, F. Dylan; Conner G, Kim G; Thompson, Richard E.; Haut, Elliott R.; Takemoto, Clifford M.
2017-01-01
OBJECTIVES Identify risk factors for venous thromboembolism (VTE) and develop a VTE risk assessment model for pediatric trauma patients. DESIGN, SETTING, AND PATIENTS We performed a retrospective review of patients 21 years and younger who were hospitalized following traumatic injuries at the Johns Hopkins level 1 adult and pediatric trauma center (1987-2011). The clinical characteristics of patients with and without VTE were compared, and multivariable logistic regression analysis was used to identify independent risk factors for VTE. Weighted risk assessment scoring systems were developed based on these and previously identified factors from patients in the National Trauma Data Bank (NTDB, 2008-2010); the scoring systems were validated in this cohort from Johns Hopkins as well as a cohort of pediatric admissions from the NTDB (2011-2012). MAIN RESULTS Forty-nine of 17,366 pediatric trauma patients (0.28%) were diagnosed with VTE after admission to our trauma center. After adjusting for potential confounders, VTE was independently associated with older age, surgery, blood transfusion, higher Injury Severity Score (ISS), and lower Glasgow Coma Scale (GCS) score. These and additional factors were identified in 402,329 pediatric patients from the NTDB from 2008-2010; independent risk factors from the logistic regression analysis of this NTDB cohort were selected and incorporated into weighted risk assessment scoring systems. Two models were developed and cross-validated in 2 separate pediatric trauma cohorts: 1) 282,535 patients in the NTDB from 2011 to 2012, and 2) 17,366 patients from Johns Hopkins. The receiver operating characteristic curves for these models in the validation cohorts had areas under the curve ranging from 90% to 94%. CONCLUSIONS VTE is infrequent after trauma in pediatric patients. We developed weighted scoring systems to stratify pediatric trauma patients at risk for VTE.
These systems may have potential to guide risk-appropriate VTE prophylaxis in children after trauma. PMID:26963757
Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh
2017-01-01
This study aimed to apply quality risk management based on the International Conference on Harmonisation guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation comprising tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard and critical control points analysis was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes for potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes the product development sustainability and supports the regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
An improved method for risk evaluation in failure modes and effects analysis of CNC lathe
NASA Astrophysics Data System (ADS)
Rachieru, N.; Belu, N.; Anghel, D. C.
2015-11-01
Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by the multiplication of crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank risks associated with failure modes that could appear in the functioning of the Turn 55 Lathe CNC. Two case studies are presented to demonstrate the methodology thus developed. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN numbers and produce a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
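A minimal sketch of one simple fuzzy-RPN variant, using triangular fuzzy numbers and centroid defuzzification; the paper's actual system is richer (a fuzzy rule base, plus Gaussian and trapezoidal membership functions), so this is an illustrative assumption, not the authors' method.

```python
# A triangular fuzzy number is represented as (a, b, c): the support
# endpoints a and c, and the peak b (the crisp case is a == b == c).

def tri_mul(x, y):
    # Approximate product of two triangular fuzzy numbers (vertex arithmetic)
    return (x[0] * y[0], x[1] * y[1], x[2] * y[2])

def centroid(t):
    # Defuzzify a triangular fuzzy number by its centroid
    return sum(t) / 3.0

def fuzzy_rpn(S, O, D):
    # Fuzzy analogue of RPN = S * O * D, defuzzified to a crisp score
    return centroid(tri_mul(tri_mul(S, O), D))
```

Because the spread of the fuzzy ratings feeds into the centroid, two failure modes with equal crisp products S*O*D but different rating uncertainty can receive different fuzzy RPNs, which is how this family of methods breaks the duplicated-RPN ties the abstract mentions.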
Equipment management risk rating system based on engineering endpoints.
James, P J
1999-01-01
The equipment management risk ratings system outlined here offers two significant departures from current practice: risk classifications are based on intrinsic device risks, and the risk rating system is based on engineering endpoints. Intrinsic device risks are categorized as physical, clinical and technical, and these flow from the incoming equipment assessment process. Engineering risk management is based on verification of engineering endpoints such as clinical measurements or energy delivery. This practice eliminates the ambiguity associated with ranking risk in terms of physiologic and higher-level outcome endpoints such as no significant hazards, low significance, injury, or mortality.
Long, Haiming; Tang, Nengyu
2017-01-01
This study considers the effect of an industry’s network topology on its systemic risk contribution to the stock market using data from the CSI 300 two-tier industry indices from the Chinese stock market. We first measure industry’s conditional-value-at-risk (CoVaR) and the systemic risk contribution (ΔCoVaR) using the fitted time-varying t-copula function. The network of the stock industry is established based on dynamic conditional correlations with the minimum spanning tree. Then, we investigate the connection characteristics and topology of the network. Finally, we utilize seemingly unrelated regression estimation (SUR) of panel data to analyze the relationship between network topology of the stock industry and the industry’s systemic risk contribution. The results show that the systemic risk contribution of small-scale industries such as real estate, food and beverage, software services, and durable goods and clothing, is higher than that of large-scale industries, such as banking, insurance and energy. Industries with large betweenness centrality, closeness centrality, and clustering coefficient and small node occupancy layer are associated with greater systemic risk contribution. In addition, further analysis using a threshold model confirms that the results are robust. PMID:28683130
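ΔCoVaR can be estimated empirically as the difference between the system's conditional tail quantile when an industry is in distress (at its own VaR) and when it is near its median state. The simulated returns, the 0.6 dependence coefficient, and the 5% level below are illustrative assumptions; the paper itself fits a time-varying t-copula rather than using raw empirical quantiles.

```python
import numpy as np

rng = np.random.default_rng(1)
industry = rng.standard_normal(5000)                  # industry index returns
system = 0.6 * industry + rng.standard_normal(5000)   # correlated market returns

q = 0.05                                              # 5% tail level
var_industry = np.quantile(industry, q)               # industry VaR

# CoVaR: system VaR conditional on the industry's state
covar_distress = np.quantile(system[industry <= var_industry], q)
covar_median = np.quantile(
    system[np.abs(industry - np.median(industry)) <= 0.1], q)

# Delta-CoVaR: extra system tail risk contributed by the industry's distress
delta_covar = covar_distress - covar_median
```

A more negative `delta_covar` corresponds to a larger systemic risk contribution, which is the quantity the paper relates to the industry's network topology.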
[Risk Management: concepts and chances for public health].
Palm, Stefan; Cardeneo, Margareta; Halber, Marco; Schrappe, Matthias
2002-01-15
Errors are a common problem in medicine and occur as a result of a complex process involving many contributing factors. Medical errors significantly reduce the safety margin for the patient and contribute additional costs in health care delivery. In most cases adverse events cannot be attributed to a single underlying cause. Therefore an effective risk management strategy must follow a system approach, which is based on counting and analysis of near misses. The development of defenses against the undesired effects of errors should be the main focus rather than asking the question "Who blundered?". Analysis of near misses (which in this context can be compared to indicators) offers several methodological advantages as compared to the analysis of errors and adverse events. Risk management is an integral element of quality management.
Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan
2013-01-01
The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve an integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental-economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers' preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and the corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative with relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental-economic optimization scheme in integrated watershed management.
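The interval-programming idea underlying ILP/REILP can be sketched simply: solving the same linear program at both ends of an interval constraint yields the interval of optimal system returns that REILP then trades off against violation risk. A toy two-industry example with invented coefficients (not the Lake Fuxian model):

```python
import numpy as np
from scipy.optimize import linprog

# Toy watershed model: output levels x1, x2 for two industries, chosen to
# maximize economic return under an interval-valued pollutant-load cap.
# All coefficients are invented for illustration.
returns = np.array([3.0, 5.0])   # economic return per unit output
load = np.array([2.0, 4.0])      # pollutant load per unit output
cap_lo, cap_hi = 80.0, 120.0     # interval bound on total allowed load

def optimal_return(cap):
    # linprog minimizes, so negate the return coefficients.
    res = linprog(-returns, A_ub=[load], b_ub=[cap], bounds=[(0, 30), (0, 25)])
    return -res.fun

# ILP gives an interval of optimal returns; REILP then chooses where inside
# [f_lo, f_hi] to operate by trading return against constraint-violation risk.
f_lo, f_hi = optimal_return(cap_lo), optimal_return(cap_hi)
print(f_lo, f_hi)
```

The decision maker's aspiration level then selects a point inside this return interval, with points closer to `f_hi` carrying a higher risk of violating the environmental cap.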
NASA Astrophysics Data System (ADS)
Koliopoulos, T. C.; Koliopoulou, G.
2007-10-01
We present an input-output solution for simulating the behavior and optimized physical needs of an environmental system. The simulations and numerical analysis determined the accurate boundary loads and areas that were required to interact for the proper physical operation of a complicated environmental system. A case study was conducted to simulate the optimum balance of an environmental system based on an artificial-intelligence multi-interacting input-output numerical scheme. The numerical results focused on probable further environmental management techniques, with the objective of minimizing risks and the associated environmental impact to protect public health and the environment. Our conclusions allow the associated risks to be minimized, focusing on probable emergency cases, to protect the surrounding anthropogenic or natural environment. Therefore, the lining magnitude could be determined for any associated technical works needed to support the environmental system under examination, taking into account its particular boundary necessities and constraints.
Mars Exploration Rovers Landing Dispersion Analysis
NASA Technical Reports Server (NTRS)
Knocke, Philip C.; Wawrzyniak, Geoffrey G.; Kennedy, Brian M.; Desai, Prasun N.; Parker, Timothy J.; Golombek, Matthew P.; Duxbury, Thomas C.; Kass, David M.
2004-01-01
Landing dispersion estimates for the Mars Exploration Rover missions were key elements in the site targeting process and in the evaluation of landing risk. This paper addresses the process and results of the landing dispersion analyses performed for both Spirit and Opportunity. The several contributors to landing dispersions (navigation and atmospheric uncertainties, spacecraft modeling, winds, and margins) are discussed, as are the analysis tools used. JPL's MarsLS program, a MATLAB-based landing dispersion visualization and statistical analysis tool, was used to calculate the probability of landing within hazardous areas. By convolving this with the probability of landing within flight system limits (in-spec landing) for each hazard area, a single overall measure of landing risk was calculated for each landing ellipse. In-spec probability contours were also generated, allowing a more synoptic view of site risks, illustrating the sensitivity to changes in landing location, and quantifying the possible consequences of anomalies such as incomplete maneuvers. Data and products required to support these analyses are described, including the landing footprints calculated by NASA Langley's POST program and JPL's AEPL program, cartographically registered base maps and hazard maps, and flight system estimates of in-spec landing probabilities for each hazard terrain type. Various factors encountered during operations, including evolving navigation estimates and changing atmospheric models, are discussed and final landing points are compared with approach estimates.
Modelling Risk to US Military Populations from Stopping Blanket Mandatory Polio Vaccination.
Burgess, Colleen; Burgess, Andrew; McMullen, Kellie
2017-01-01
Transmission of polio poses a threat to military forces when deploying to regions where such viruses are endemic. US-born soldiers generally enter service with immunity resulting from childhood immunization against polio; moreover, new recruits are routinely vaccinated with inactivated poliovirus vaccine (IPV), supplemented based upon deployment circumstances. Given residual protection from childhood vaccination, risk-based vaccination may sufficiently protect troops from polio transmission. This analysis employed a mathematical system for polio transmission within military populations interacting with locals in a polio-endemic region to evaluate changes in vaccination policy. Removal of blanket immunization had no effect on simulated polio incidence among deployed military populations when risk-based immunization was employed; however, when these individuals reintegrated with their base populations, risk of transmission to nondeployed personnel increased by 19%. In the absence of both blanket- and risk-based immunization, transmission to nondeployed populations increased by 25%. The overall number of new infections among nondeployed populations was negligible for both scenarios due to high childhood immunization rates, partial protection against transmission conferred by IPV, and low global disease incidence levels. Risk-based immunization driven by deployment to polio-endemic regions is sufficient to prevent transmission among both deployed and nondeployed US military populations.
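The kind of compartmental transmission model used in such analyses can be sketched in a few lines. A discrete-time SIR model with a vaccinated fraction, using invented parameters rather than the study's calibrated military-population model, shows how coverage drives the attack rate:

```python
def attack_rate(vacc_frac, r0=4.0, days=365, n=10_000.0, gamma=1.0 / 7.0):
    """Discrete-time SIR with a vaccinated fraction assumed fully immune.

    All parameters are illustrative assumptions (in reality IPV confers only
    partial protection against transmission, as the abstract notes).
    """
    beta = r0 * gamma                # transmission rate implied by R0
    s = n * (1.0 - vacc_frac) - 1.0  # susceptibles after one index case
    i = 1.0
    s0 = s
    for _ in range(days):
        new_inf = beta * s * i / n   # new infections this day
        s -= new_inf
        i += new_inf - gamma * i     # infectious pool turns over in ~1/gamma days
    return (s0 - s) / n              # cumulative infections as a population fraction

low_coverage = attack_rate(0.50)
high_coverage = attack_rate(0.95)
print(low_coverage, high_coverage)
```

With 95% coverage the effective reproduction number falls below one and the outbreak dies out; at 50% coverage a substantial epidemic occurs, which is the qualitative mechanism behind the abstract's 19-25% transmission increases when immunization is relaxed.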
TOKEN: Trustable Keystroke-Based Authentication for Web-Based Applications on Smartphones
NASA Astrophysics Data System (ADS)
Nauman, Mohammad; Ali, Tamleek
Smartphones are increasingly being used to store personal information as well as to access sensitive data from the Internet and the cloud. Establishment of the identity of a user requesting information from smartphones is a prerequisite for secure systems in such scenarios. In the past, keystroke-based user identification has been successfully deployed on production-level mobile devices to mitigate the risks associated with naïve username/password based authentication. However, these approaches have two major limitations: they are not applicable to services where authentication occurs outside the domain of the mobile device - such as web-based services; and they often overly tax the limited computational capabilities of mobile devices. In this paper, we propose a protocol for keystroke dynamics analysis which allows web-based applications to make use of remote attestation and delegated keystroke analysis. The end result is an efficient keystroke-based user identification mechanism that strengthens traditional password protected services while mitigating the risks of user profiling by collaborating malicious web services.
Quantitative risk management in gas injection project: a case study from Oman oil and gas industry
NASA Astrophysics Data System (ADS)
Khadem, Mohammad Miftaur Rahman Khan; Piya, Sujan; Shamsuzzoha, Ahm
2017-09-01
The purpose of this research was to study the recognition, application and quantification of the risks associated with managing projects. In this research, the management of risks in an oil and gas project is studied and implemented within a case company in Oman. First, qualitative data related to risks in the project were identified through field visits and extensive interviews. These data were then translated into numerical values based on expert opinion, and the numerical data were used as input to a Monte Carlo simulation. RiskyProject Professional™ software was used to simulate the system based on the identified risks. The simulation predicted a delay of about 2 years in the worst case, with no chance of meeting the project's on-stream date, and an 8% chance of exceeding the total estimated budget. The result of the numerical analysis from the proposed model is validated by comparing it with the result of the qualitative analysis, which was obtained through discussions with various project managers of the company.
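The Monte Carlo schedule-risk technique the study applies (via commercial software) can be sketched generically. A minimal Python version with invented activity estimates, not the case company's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo trials

# Three sequential activities with triangular (optimistic, most-likely,
# pessimistic) duration estimates in weeks; the numbers are invented.
activities = [(10, 14, 30), (8, 12, 26), (6, 9, 20)]
total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in activities)

deadline = 40.0  # assumed contractual on-stream date, in weeks
p_overrun = float((total > deadline).mean())
print(f"P(missing the deadline): {p_overrun:.1%}")
print(f"90th-percentile duration: {np.percentile(total, 90):.1f} weeks")
```

Because expert estimates are skewed toward the pessimistic tail, the simulated completion distribution sits well above the sum of most-likely durations, which is how such models expose unrealistic deadlines.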
Quality Interaction Between Mission Assurance and Project Team Members
NASA Technical Reports Server (NTRS)
Kwong-Fu, Helenann H.; Wilson, Robert K.
2006-01-01
Mission Assurance independent assessments started during the development cycle and continued through post-launch operations. In operations, health and safety of the observatory is of utmost importance. Therefore, Mission Assurance must ensure requirements compliance and focus on the process improvements required across the operational systems, including new or modified products, tools, and procedures. The deployment of the interactive model involves three objectives: team member interaction, good root cause analysis practices, and risk assessment to avoid recurrences. In applying this model, we used a metric-based measurement process, which was found to have the most significant effect. This points to the importance of focusing on a combination of root cause analysis and risk approaches, giving engineers the ability to prioritize and quantify their corrective actions based on a well-defined set of root cause definitions (i.e., closure criteria for problem reports), success criteria, and risk rating definitions.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-06
... Rulemaking: State-78, Risk Analysis and Management Records SUMMARY: Notice is hereby given that the... portions of the Risk Analysis and Management (RAM) Records, State-78, system of records contain criminal...) * * * (2) * * * Risk Analysis and Management Records, STATE-78. * * * * * (b) * * * (1) * * * Risk Analysis...
Varzakas, Theodoros H
2011-09-01
The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach to FMEA application in the pastry industry was attempted in conjunction with ISO22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of the ISO22000 analysis with HACCP is carried out for pastry processing and packaging. The main emphasis, however, was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials and storage of final products at -18°C followed by freezing were identified as the processes with the highest RPN (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect, or tree) diagram led to convergent results, thus corroborating the validity of the conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a pastry processing industry is considered imperative.
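The RPN arithmetic is simple to reproduce. A sketch in Python, with severity/occurrence/detectability scores invented so that the products match the RPN values reported in the abstract (225, 225, 144):

```python
# FMEA Risk Priority Number: RPN = Severity x Occurrence x Detectability,
# each scored 1-10. The individual factor scores below are invented; only
# the resulting products 225, 225 and 144 come from the abstract.
hazards = {
    "raw material storage at -18C": (9, 5, 5),   # (S, O, D)
    "final product storage at -18C": (9, 5, 5),
    "freezing": (8, 6, 3),
    "packaging": (4, 3, 4),
}
RPN_LIMIT = 130  # upper acceptable limit used in the study

rpn = {name: s * o * d for name, (s, o, d) in hazards.items()}
needs_action = sorted(name for name, value in rpn.items() if value > RPN_LIMIT)
print(rpn)
print("corrective action needed:", needs_action)
```

After corrective actions lower occurrence or improve detectability, the same product is recomputed and compared against the 130 threshold, as in the study's second RPN pass.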
Development of the Methodology Needed to Quantify Risks to Groundwater at CO2 Storage Sites
NASA Astrophysics Data System (ADS)
Brown, C. F.; Birkholzer, J. T.; Carroll, S.; Hakala, A.; Keating, E. H.; Lopano, C. L.; Newell, D. L.; Spycher, N.
2011-12-01
The National Risk Assessment Partnership (NRAP) is an effort that harnesses capabilities across five U.S. Department of Energy (DOE) national laboratories into a mission-focused platform to develop a defensible, science-based quantitative methodology for determining risk profiles at CO2 storage sites. NRAP is conducting risk and uncertainty analysis in the areas of reservoir performance, natural leakage pathways, wellbore integrity, groundwater protection, monitoring, and systems level modeling. The mission of NRAP is "to provide the scientific underpinning for risk assessment with respect to the long-term storage of CO2, including assessment of residual risk associated with a site post-closure." Additionally, NRAP will develop a strategic, risk-based monitoring protocol, such that monitoring at all stages of a project effectively minimizes uncertainty in the predicted behavior of the site, thereby increasing confidence in storage integrity. NRAP's research focus in the area of groundwater protection is divided into three main tasks: 1) development of quantitative risk profiles for potential groundwater impacts; 2) filling key science gaps in developing those risk profiles; and 3) field-based confirmation. Within these three tasks, researchers are engaged in collaborative studies to determine metrics to identify system perturbation and their associated risk factors. Reservoir simulations are being performed to understand/predict consequences of hypothetical leakage scenarios, from which reduced order models are being developed to feed risk profile development. Both laboratory-based experiments and reactive transport modeling studies provide estimates of geochemical impacts over a broad range of leakage scenarios. This presentation will provide an overview of the research objectives within NRAP's groundwater protection focus area, as well as select accomplishments achieved to date.
[Study on the risk assessment method of regional groundwater pollution].
Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei
2013-02-01
Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, which included regional groundwater specific vulnerability assessment, regional pollution source characteristics assessment, and health risk assessment of regional featured pollutants. The three sub-evaluation systems were coupled with the multi-index comprehensive method, the risk was characterized with spatial analysis in ArcMap, and a new method to evaluate regional groundwater pollution risk, suitable for different natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. It was found that the vulnerability index of groundwater in Changzhou is high and unevenly distributed; the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risks; and, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high and unevenly distributed, concentrated north of the Anjia-Xuejia-Zhenglu line, in the city center, and in the southeast, where human activities are more intense and pollution sources are dense.
Developing points-based risk-scoring systems in the presence of competing risks.
Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P
2016-09-30
Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
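The paper supplies R code; as a language-neutral illustration, the Sullivan-style construction that usually underlies such points systems can be sketched in Python with invented regression coefficients (the competing-risks fitting step itself is out of scope here):

```python
# Sullivan-style points system: each coefficient of a (hypothetical)
# prognostic model is divided by a reference coefficient and rounded to an
# integer number of points. All coefficients below are invented.
betas = {
    "age_per_5y": 0.30,      # 1 point == the risk of being 5 years older
    "diabetes": 0.50,
    "prior_MI": 0.62,
    "sbp_per_10mmHg": 0.18,
}
ref = betas["age_per_5y"]
points = {name: round(beta / ref) for name, beta in betas.items()}
print(points)

# Score a hypothetical patient: age 70 (baseline 50), diabetic, SBP 150
# (baseline 120), no prior myocardial infarction.
score = 4 * points["age_per_5y"] + points["diabetes"] + 3 * points["sbp_per_10mmHg"]
print("total points:", score)
```

In the competing-risks setting the paper addresses, the coefficients would come from a subdistribution-hazard (Fine-Gray) model rather than a Cox model, so that the total score maps onto the cumulative incidence of the event in the presence of competing deaths.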
Zare Moayedi, Mahboobeh; Aslani, Azam; Fakhrahmad, Mostafa; Ezzatzadegan J, Shahrokh
2018-01-01
This study was conducted to develop an Android-based patient decision aid (PDA) as a self-care instrument for patients after kidney transplant, and to evaluate its usability. The systematic development process of the Android-based self-care application, following the Ottawa standard, included: scoping, assembling a steering group, analysis of requirements, design, development of a prototype, and system evaluation. The PDA is a self-triage system that supports early identification of risk symptoms in patients and helps manage them. The system's recommendations for risk signs are: refer to the nearest hospital or healthcare center without delay; refer to the doctor; or tell your doctor at the next visit. To identify patient care needs, the researchers conducted semi-structured interviews with members of the steering group, including patients and clinical experts. A prototype of the decision aid was made according to the needs identified in the previous step. Finally, usability was evaluated by experts and patients using the System Usability Scale (SUS) questionnaire. This study identified the information needs, risk signs, and steps patients require to make appropriate decisions. The main capabilities of the decision aid include reminders for appointments/tests and medication times; registration of symptoms, weight, blood pressure, body temperature, and test results, which were reported in a diagram; and advice to the patient in case of risk signs. The mean usability scores given by medical informatics specialists, clinicians, and patients were 88.33, 95, and 91, respectively. The PDA was usable and desirable from the point of view of medical informatics specialists, clinicians, and patients.
Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport
NASA Technical Reports Server (NTRS)
Luxhoj, James T.
2003-01-01
Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.
Pugliese, F; Albini, E; Serio, O; Apostoli, P
2011-01-01
The 81/2008 Act has defined a model of a health and safety management system that can contribute to preventing occupational health and safety risks. We have developed the structure of a health and safety management system model and the tools necessary for its implementation in health care facilities. The realization of the model is structured in various phases: initial review, safety policy, planning, implementation, monitoring, management review, and continuous improvement. Such a model, in continuous evolution, is based on the responsibilities of the different corporate actors and on an accurate analysis of the risks and the norms involved.
Human errors and measurement uncertainty
NASA Astrophysics Data System (ADS)
Kuselman, Ilya; Pennecchi, Francesca
2015-04-01
Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
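The Monte Carlo evaluation of a residual human-error contribution to an uncertainty budget can be sketched as follows, with invented probabilities and magnitudes standing in for the published expert judgments:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000  # Monte Carlo trials

# Sketch: combine instrumental uncertainty with a residual human-error term.
# An error occurs with small probability p (from expert judgment) and then
# shifts the result by a random amount; all numbers are invented.
u_instrument = 0.02   # standard uncertainty of a pH measurement
p_error = 0.01        # residual probability of a human error per measurement
error_size = 0.15     # typical pH shift if the error occurs

instrument = rng.normal(0.0, u_instrument, n)
human = np.where(rng.random(n) < p_error, rng.normal(0.0, error_size, n), 0.0)
combined = instrument + human

u_no_human = instrument.std()
u_total = combined.std()
print(f"u without human errors: {u_no_human:.4f}")
print(f"u including residual human-error risk: {u_total:.4f}")
```

With these assumed numbers the human-error term inflates the budget noticeably without dominating it, matching the abstract's qualitative finding of a "not negligible, yet also not dominant" contribution.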
NASA Technical Reports Server (NTRS)
Jones, Harry W.; Dillon-Merrill, Robin L.; Thomas, Gretchen A.
2003-01-01
The Advanced Integration Matrix (AIM) Project will study and solve systems-level integration issues for exploration missions beyond Low Earth Orbit (LEO), through the design and development of a ground-based facility for developing revolutionary integrated systems for joint human-robotic missions. This paper describes a Probabilistic Risk Analysis (PRA) of human space missions that was developed to help define the direction and priorities for AIM. Risk analysis is required for all major NASA programs and has been used for shuttle, station, and Mars lander programs. It is a prescribed part of early planning and is necessary during concept definition, even before mission scenarios and system designs exist. PRA can begin when little failure data are available, and be continually updated and refined as detail becomes available. PRA provides a basis for examining tradeoffs among safety, reliability, performance, and cost. The objective of AIM's PRA is to indicate how risk can be managed and future human space missions enabled by the AIM Project. Many critical events can cause injuries and fatalities to the crew without causing loss of vehicle or mission. Some critical systems are beyond AIM's scope, such as propulsion and guidance. Many failure-causing events can be mitigated by conducting operational tests in AIM, such as testing equipment and evaluating operational procedures, especially in the areas of communications and computers, autonomous operations, life support, thermal design, EVA and rover activities, physiological factors including habitation, medical equipment, and food, and multifunctional tools and repairable systems. AIM is well suited to test and demonstrate the habitat, life support, crew operations, and human interface. Because these account for significant crew, systems performance, and science risks, AIM will help reduce mission risk, and missions beyond LEO are far enough in the future that AIM can have significant impact.
Geo-hazard harmonised data a driven process to environmental analysis system
NASA Astrophysics Data System (ADS)
Cipolloni, Carlo; Iadanza, Carla; Pantaloni, Marco; Trigila, Alessandro
2015-04-01
In the last decade an increase in damage caused by natural disasters has been recorded in Italy. Supporting environmental safety and human protection, by reducing the vulnerability of exposed elements and improving the resilience of the communities involved, requires access to harmonized and customized data; this is one of several steps towards delivering adequate support to risk assessment, reduction, and management. In this context, SEIS and Copernicus-GEMES have been developed as infrastructures based on web services for environmental analysis, integrating specifications and results from INSPIRE. The two landslide risk scenarios developed in different European projects drove the harmonization process, which is the basic element needed to obtain interoperable web services in an environmental analysis system. From two different perspectives we have built a common methodology to analyse datasets and transform them into an INSPIRE-compliant format, following the INSPIRE Data Specifications on Geology and on Natural Risk Zones. To ensure maximum re-usability of the data, we have also applied a wider data-model standard, GeoSciML, to the landslide and geological datasets; it represents a natural extension of the INSPIRE data model that provides richer information. The aim of this work is to present the first results of the two projects concerning the data harmonisation process, in which an important role is played by semantic harmonisation using ontology services and/or hierarchical vocabularies, available as Linked Data or Linked Open Data via URIs directly in the spatial data services. We also present how the harmonised web services can add value in a risk scenario analysis system, showing the first results of the landslide environmental analysis developed by the eENVplus and LIFE+IMAGINE projects.
Stone, Erik E; Skubic, Marjorie
2011-01-01
We present an analysis of measuring stride-to-stride gait variability passively, in a home setting using two vision based monitoring techniques: anonymized video data from a system of two web-cameras, and depth imagery from a single Microsoft Kinect. Millions of older adults fall every year. The ability to assess the fall risk of elderly individuals is essential to allowing them to continue living safely in independent settings as they age. Studies have shown that measures of stride-to-stride gait variability are predictive of falls in older adults. For this analysis, a set of participants were asked to perform a number of short walks while being monitored by the two vision based systems, along with a marker based Vicon motion capture system for ground truth. Measures of stride-to-stride gait variability were computed using each of the systems and compared against those obtained from the Vicon.
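The underlying gait-variability measures are straightforward once heel-strike times have been extracted. A sketch with synthetic timestamps (the vision systems' extraction step, the hard part of the study, is out of scope here):

```python
import numpy as np

# Stride-to-stride gait variability from a sequence of heel-strike times in
# seconds. These timestamps are synthetic; in the study they come from the
# Kinect, web-camera, or Vicon systems.
heel_strikes = np.array([0.00, 1.10, 2.18, 3.30, 4.38, 5.52, 6.60])

stride_times = np.diff(heel_strikes)
mean_stride = stride_times.mean()
stride_sd = stride_times.std(ddof=1)          # stride-time variability
cv_percent = 100.0 * stride_sd / mean_stride  # coefficient of variation
print(f"mean stride {mean_stride:.3f}s, SD {stride_sd:.3f}s, CV {cv_percent:.1f}%")
```

It is this coefficient of variation (and related stride-to-stride measures) that the fall-risk literature links to elevated fall risk, so agreement between the low-cost vision systems and Vicon on these quantities is what the comparison evaluates.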
Syndromic surveillance system based on near real-time cattle mortality monitoring.
Torres, G; Ciaravino, V; Ascaso, S; Flores, V; Romero, L; Simón, F
2015-05-01
Early detection of an infectious disease incursion will minimize the impact of outbreaks in livestock. Syndromic surveillance based on the analysis of readily available data can enhance traditional surveillance systems and allow veterinary authorities to react in a timely manner. This study was based on monitoring the number of cattle carcasses sent for rendering in the veterinary unit of Talavera de la Reina (Spain). The aim was to develop a system to detect deviations from expected values which would signal unexpected health events. Historical weekly collected dead cattle (WCDC) time series, stabilized by the Box-Cox transformation and adjusted by the minimum least squares method, were used to build a univariate cyclic regression model based on a Fourier transformation. Three different models, according to the type of production system, were built to estimate the baseline expected number of WCDC. Two types of risk signals were generated: point risk signals, when the observed value was greater than the upper 95% confidence interval of the expected baseline, and cumulative risk signals, generated by a modified cumulative sum algorithm, when the cumulative sum of reported deaths was above the cumulative sum of expected deaths. Data from 2011 were used to prospectively validate the model, generating seven risk signals. None of them were correlated with infectious disease events, but some coincided in time with very high climatic temperatures recorded in the region. The harvest effect was also observed during the first week of the study year. Establishing appropriate risk-signal thresholds is a limiting factor of predictive models; thresholds need to be adjusted based on experience gained during use of the models. To increase the sensitivity and specificity of the predictions, epidemiological interpretation of non-specific risk signals should be complemented by other sources of information.
The methodology developed in this study can enhance other existing early detection surveillance systems. Syndromic surveillance based on mortality monitoring can reduce the detection time for certain disease outbreaks associated with mild mortality only detected at regional level. The methodology can be adapted to monitor other parameters routinely collected at farm level which can be influenced by communicable diseases. Copyright © 2015 Elsevier B.V. All rights reserved.
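The baseline-plus-alarm scheme can be sketched end to end: a one-term Fourier (cyclic) regression fitted to historical weeks, followed by a one-sided CUSUM on the residuals. All data and thresholds below are synthetic assumptions, not the Talavera de la Reina series or the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(156)  # three years of synthetic weekly carcass counts

# Synthetic mortality series: annual cycle plus Poisson noise, with excess
# mortality inserted in the final eight weeks to mimic an outbreak.
truth = 50 + 10 * np.sin(2 * np.pi * weeks / 52)
counts = rng.poisson(truth).astype(float)
counts[148:] += 25.0

# Cyclic (harmonic) regression baseline: least-squares fit of a one-term
# Fourier model on the first two years, analogous to the expected WCDC baseline.
X = np.column_stack([
    np.ones(len(weeks)),
    np.sin(2 * np.pi * weeks / 52),
    np.cos(2 * np.pi * weeks / 52),
])
train = slice(0, 104)
coef, *_ = np.linalg.lstsq(X[train], counts[train], rcond=None)
expected = X @ coef
resid = counts - expected
sigma = resid[train].std()

# One-sided CUSUM risk signal over the third year: accumulate residuals above
# a slack of 1 sigma and flag once the sum exceeds 4 sigma (assumed thresholds).
c, alarm_week = 0.0, None
for t in range(104, 156):
    c = max(0.0, c + resid[t] - sigma)
    if c > 4 * sigma:
        alarm_week = t
        break
print("alarm raised in week:", alarm_week)
```

The slack and threshold play the role the abstract describes for risk-signal thresholds: tightening them raises sensitivity at the cost of more non-specific signals, such as those triggered by heat waves rather than disease.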
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Yuping; Zheng, Qipeng P.; Wang, Jianhui
2014-11-01
This paper presents a two-stage stochastic unit commitment (UC) model, which integrates non-generation resources such as demand response (DR) and energy storage (ES) while including risk constraints to balance cost and system reliability under the fluctuation of variable generation such as wind and solar power. The paper uses conditional value-at-risk (CVaR) measures to model risks associated with the decisions in a stochastic environment. In contrast to chance-constrained models, which require extra binary variables, risk constraints based on CVaR involve only linear constraints and continuous variables, making the model more computationally attractive. The proposed models with risk constraints are able to avoid over-conservative solutions while still ensuring system reliability, represented by loss of load. Numerical experiments are conducted to study the effects of non-generation resources on generator schedules and the difference in total expected generation costs under risk consideration. Sensitivity analysis based on reliability parameters is also performed to test the effects of confidence levels and load-shedding loss allowances on generation cost reduction.
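The linearization that makes CVaR constraints attractive (due to Rockafellar and Uryasev) can be shown on a toy capacity problem: an auxiliary variable eta and per-scenario slacks z_s turn the CVaR of load shedding into purely linear constraints. All numbers are invented; this is not the paper's UC formulation:

```python
import numpy as np
from scipy.optimize import linprog

# Toy problem: pick the smallest capacity g such that the CVaR (expected
# shortfall) of unserved load stays within a limit, over equiprobable
# demand scenarios. Demands, alpha, and the limit are invented.
demand = np.array([80.0, 90.0, 100.0, 110.0, 150.0])
S, alpha, cvar_limit = len(demand), 0.8, 10.0

# Variables: [g, eta, u_1..u_S, z_1..z_S]. Objective: minimize g.
nvar = 2 + 2 * S
c = np.zeros(nvar)
c[0] = 1.0

A_ub, b_ub = [], []
for s in range(S):
    # Unserved load, linearized: u_s >= demand_s - g.
    row = np.zeros(nvar)
    row[0], row[2 + s] = -1.0, -1.0
    A_ub.append(row)
    b_ub.append(-demand[s])
    # CVaR slack: z_s >= u_s - eta.
    row = np.zeros(nvar)
    row[2 + s], row[1], row[2 + S + s] = 1.0, -1.0, -1.0
    A_ub.append(row)
    b_ub.append(0.0)
# Rockafellar-Uryasev: eta + (1 / ((1 - alpha) * S)) * sum(z_s) <= limit.
row = np.zeros(nvar)
row[1] = 1.0
row[2 + S:] = 1.0 / ((1 - alpha) * S)
A_ub.append(row)
b_ub.append(cvar_limit)

bounds = [(0, None), (None, None)] + [(0, None)] * (2 * S)
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
print("minimum capacity meeting the CVaR constraint:", res.x[0])
```

With alpha = 0.8 and five scenarios, CVaR equals the worst single scenario, so the constraint forces enough capacity to keep shedding in the 150-demand scenario within the limit; no binary variables are needed, which is the computational advantage the abstract contrasts against chance constraints.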
NASA Astrophysics Data System (ADS)
Elias, E.; Reyes, J. J.; Steele, C. M.; Rango, A.
2017-12-01
Assessing vulnerability of agricultural systems to climate variability and change is vital in securing food systems and sustaining rural livelihoods. Farmers, ranchers, and forest landowners rely on science-based, decision-relevant, and localized information to maintain production, ecological viability, and economic returns. This contribution synthesizes a collection of research on the future of agricultural production in the American Southwest (SW). Research was based on a variety of geospatial methodologies and datasets to assess the vulnerability of rangelands and livestock, field crops, specialty crops, and forests in the SW to climate risk and change. This collection emerged from the development of regional vulnerability assessments for agricultural climate risk by the U.S. Department of Agriculture (USDA) Climate Hub Network, established to deliver science-based information and technologies to enable climate-informed decision-making. Authors defined vulnerability differently based on their agricultural system of interest, although each primarily focuses on biophysical systems. We found that no single, uniform framework for vulnerability and climate risk could adequately capture the diversity, variability, and heterogeneity of SW landscapes, peoples, and agriculture. Through the diversity of research questions and methodologies, this collection of articles provides valuable information on various aspects of SW vulnerability. All articles relied on geographic information systems technology, with highly variable levels of complexity. Agricultural articles used National Agricultural Statistics Service data, either as tabular county-level summaries or through the CropScape cropland raster datasets. Most relied on modeled historic and future climate information, but with differing assumptions regarding spatial resolution and temporal framework.
We assert that it is essential to evaluate climate risk using a variety of complementary methodologies and perspectives. In addition, we found that spatial analysis supports informed adaptation, within and outside the SW United States. The persistence and adaptive capacity of agriculture in the water-limited Southwest serves as an instructive example and may offer solutions to reduce future climate risk.
Cantrelle, Christelle; Legeai, Camille; Latouche, Aurélien; Tuppin, Philippe; Jasseron, Carine; Sebbag, Laurent; Bastien, Olivier; Dorent, Richard
2017-08-01
Heart allocation systems are usually urgency-based, offering grafts to candidates at high risk of waitlist mortality. In the context of a revision of the heart allocation rules, we determined observed predictors of 1-year waitlist mortality in France, considering the competing risk of transplantation, to determine which candidate subgroups are favored or disadvantaged by the current allocation system. Patients registered on the French heart waitlist between 2010 and 2013 were included. Cox cause-specific hazards and Fine and Gray subdistribution hazards were used to determine candidate characteristics associated with waitlist mortality and access to transplantation. Of the 2053 candidates, 7 variables were associated with 1-year waitlist mortality by the Fine and Gray method, including 4 candidate characteristics related to heart failure severity (hospitalization at listing, serum natriuretic peptide level, systolic pulmonary artery pressure, and glomerular filtration rate) and 3 characteristics not associated with heart failure severity but with lower access to transplantation (blood type, age, and body mass index). Observed waitlist mortality for candidates on mechanical circulatory support was similar to that of other candidates. The heart allocation system strongly modifies the risk of pretransplant mortality related to heart failure severity. An in-depth competing risk analysis is therefore a more appropriate method to evaluate graft allocation systems. This knowledge should help to prioritize candidates in the context of a limited donor pool.
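The competing-risk point can be illustrated with a nonparametric cumulative incidence estimator (Aalen-Johansen). This is a generic sketch, not the Fine and Gray regression the authors fit, and the event codes are assumptions for illustration.

```python
def cumulative_incidence(times, events, cause, horizon):
    """Aalen-Johansen cumulative incidence of one event type under
    competing risks.

    times:   follow-up time per candidate
    events:  0 = censored, 1 = waitlist death, 2 = transplant
             (codes are illustrative)
    cause:   event code whose incidence is estimated
    horizon: evaluate the CIF at this time
    """
    surv, cif = 1.0, 0.0
    for t in sorted(set(times)):
        if t > horizon:
            break
        at_risk = sum(1 for ti in times if ti >= t)
        if at_risk == 0:
            break
        d_cause = sum(1 for ti, ei in zip(times, events)
                      if ti == t and ei == cause)
        d_any = sum(1 for ti, ei in zip(times, events)
                    if ti == t and ei != 0)
        cif += surv * d_cause / at_risk  # cause hazard weighted by survival
        surv *= 1.0 - d_any / at_risk    # all-cause event-free survival
    return cif
```

Treating transplantation as plain censoring (Kaplan-Meier) would overstate waitlist mortality; the cumulative incidence function does not, which is why the competing-risk analysis changes the ranking of candidate subgroups.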
Decision support systems and methods for complex networks
Huang, Zhenyu [Richland, WA; Wong, Pak Chung [Richland, WA; Ma, Jian [Richland, WA; Mackey, Patrick S [Richland, WA; Chen, Yousu [Richland, WA; Schneider, Kevin P [Seattle, WA
2012-02-28
Methods and systems for automated decision support in analyzing operation data from a complex network. Embodiments of the present invention utilize these algorithms and techniques not only to characterize the past and present condition of a complex network, but also to predict future conditions to help operators anticipate deteriorating and/or problem situations. In particular, embodiments of the present invention characterize network conditions from operation data using a state estimator. Contingency scenarios can then be generated based on those network conditions. For at least a portion of all of the contingency scenarios, risk indices are determined that describe the potential impact of each of those scenarios. Contingency scenarios with risk indices are presented visually as graphical representations in the context of a visual representation of the complex network. Analysis of the historical risk indices based on the graphical representations can then provide trends that allow for prediction of future network conditions.
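The risk-index step can be sketched as likelihood-times-impact scoring of the generated contingency scenarios. The field names and the scoring rule here are illustrative, not the patented algorithm; a real system would derive both terms from the state estimator's network model.

```python
def rank_contingencies(scenarios):
    """Score and rank contingency scenarios by a simple risk index
    (likelihood x impact), highest risk first.

    scenarios: list of dicts with hypothetical keys
               "name", "likelihood", "impact".
    Returns (name, risk_index) pairs sorted by descending index.
    """
    ranked = sorted(scenarios,
                    key=lambda s: s["likelihood"] * s["impact"],
                    reverse=True)
    return [(s["name"], s["likelihood"] * s["impact"]) for s in ranked]
```

Trends in such indices over time are what the described system visualizes on top of the network diagram to help operators anticipate deteriorating conditions.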
NASA Technical Reports Server (NTRS)
1985-01-01
Task 2 in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make design/programmatic decisions. This volume identifies the preferred options in the programmatic category and characterizes these options with respect to performance attributes, constraints, costs, and risks. The programmatic category includes methods used to administrate/manage the development, operation and maintenance of the SSDS. The specific areas discussed include standardization/commonality; systems management; and systems development, including hardware procurement, software development and system integration, test and verification.
Development of the Expert System Domain Advisor and Analysis Tool
1991-09-01
analysis. Typical of the current methods in use at this time is the "TAROT metric". This method defines a decision rule whose output is whether to go... APPENDIX B - TAROT METRIC. B. INTRODUCTION: The system chart of ESEM, Figure 1, shows the following three risk-based decision points: 1. At project initiation... decisions. Table B-1, Evaluation Factors for ES Development: FACTORS and POSSIBLE VALUE RATINGS; TAROT metric (overall suitability): Poor, Fair
A Bibliometric Analysis of U.S.-Based Research on the Behavioral Risk Factor Surveillance System
Khalil, George M.; Gotway Crawford, Carol A.
2017-01-01
Background: Since Alan Pritchard defined bibliometrics as "the application of statistical methods to media of communication" in 1969, bibliometric analyses have become widespread. To date, however, bibliometrics has not been used to analyze publications related to the U.S. Behavioral Risk Factor Surveillance System (BRFSS). Purpose: To determine the most frequently cited BRFSS-related topical areas, institutions, and journals. Methods: A search of the Web of Knowledge database in 2013 identified U.S.-published studies related to BRFSS, from its start in 1984 through 2012. Search terms were BRFSS, Behavioral Risk Factor Surveillance System, or Behavioral Risk Survey. The resulting 1,387 articles were analyzed descriptively and produced data for VOSviewer, a computer program that plotted a relevance distance-based map and clustered keywords from text in titles and abstracts. Results: Topics, journals, and publishing institutions ranged widely. Most research was clustered by content area, such as cancer screening, access to care, heart health, and quality of life. The American Journal of Preventive Medicine and American Journal of Public Health published the most BRFSS-related papers (95 and 70, respectively). Conclusions: Bibliometrics can help identify the most frequently published BRFSS-related topics, publishing journals, and publishing institutions. BRFSS data are widely used, particularly by CDC and academic institutions such as the University of Washington and other universities hosting top-ranked schools of public health. Bibliometric analysis and mapping provides an innovative way of quantifying and visualizing the plethora of research conducted using BRFSS data and summarizing the contribution of this surveillance system to public health. PMID:25442231
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Assessing the Value of Information for Identifying Optimal Floodplain Management Portfolios
NASA Astrophysics Data System (ADS)
Read, L.; Bates, M.; Hui, R.; Lund, J. R.
2014-12-01
Floodplain management is a complex portfolio problem that can be analyzed from an integrated perspective incorporating traditional structural and nonstructural options. One method to identify effective strategies for preparing, responding to, and recovering from floods is to optimize for a portfolio of temporary (emergency) and permanent floodplain management options. A risk-based optimization approach to this problem assigns probabilities to specific flood events and calculates the associated expected damages. This approach is currently limited by: (1) the assumption of perfect flood forecast information, i.e. implementing temporary management activities according to the actual flood event may differ from optimizing based on forecasted information and (2) the inability to assess system resilience across a range of possible future events (risk-centric approach). Resilience is defined here as the ability of a system to absorb and recover from a severe disturbance or extreme event. In our analysis, resilience is a system property that requires integration of physical, social, and information domains. This work employs a 3-stage linear program to identify the optimal mix of floodplain management options using conditional probabilities to represent perfect and imperfect flood stages (forecast vs. actual events). We assess the value of information in terms of minimizing damage costs for two theoretical cases - urban and rural systems. We use portfolio analysis to explore how the set of optimal management options differs depending on whether the goal is for the system to be risk-averse to a specified event or resilient over a range of events.
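The value-of-information comparison can be illustrated by the expected value of perfect information (EVPI) for a stylized two-action, two-state floodplain decision. The actions, costs, and flood probability below are hypothetical, not from the study.

```python
def evpi(p_flood, damage):
    """Expected value of perfect flood-forecast information.

    damage[(action, state)] is total cost (management cost plus flood
    damage) for each action under each state.  Without information the
    manager commits to the single action with the lowest expected cost;
    with a perfect forecast the best action is chosen per state.
    """
    p = {"flood": p_flood, "dry": 1.0 - p_flood}
    actions = {a for a, _ in damage}
    # best fixed choice under forecast uncertainty
    without = min(sum(p[s] * damage[(a, s)] for s in p) for a in actions)
    # expected cost when the realized state is known in advance
    with_info = sum(p[s] * min(damage[(a, s)] for a in actions) for s in p)
    return without - with_info
```

For example, if deploying a temporary levee costs 5 but caps flood losses at 20, while doing nothing risks 100, a perfect forecast lets the manager pay for protection only in flood years, and the EVPI measures that saving.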
Yue, Wencong; Cai, Yanpeng; Xu, Linyu; Yang, Zhifeng; Yin, Xin'An; Su, Meirong
2017-07-11
To improve the capabilities of conventional methodologies in facilitating industrial water allocation under uncertain conditions, an integrated approach was developed through the combination of operational research, uncertainty analysis, and violation risk analysis methods. The developed approach can (a) address complexities of industrial water resources management (IWRM) systems, (b) facilitate reflections of multiple uncertainties and risks of the system and incorporate them into a general optimization framework, and (c) manage robust actions for industrial productions in consideration of water supply capacity and wastewater discharging control. The developed method was then demonstrated in a water-stressed city (i.e., the City of Dalian), northeastern China. Three scenarios were proposed according to the city's industrial plans. The results indicated that in the planning year of 2020 (a) the production of civilian-used steel ships and machine-made paper & paperboard would reduce significantly, (b) violation risk of chemical oxygen demand (COD) discharge under scenario 1 would be the most prominent, compared with those under scenarios 2 and 3, (c) the maximal total economic benefit under scenario 2 would be higher than the benefit under scenario 3, and (d) the production of rolling contact bearing, rail vehicles, and commercial vehicles would be promoted.
Security Investment in Contagious Networks.
Hasheminasab, Seyed Alireza; Tork Ladani, Behrouz
2018-01-16
Security of the systems is normally interdependent in such a way that security risks of one part affect other parts and threats spread through the vulnerable links in the network. So, the risks of the systems can be mitigated through investments in the security of interconnecting links. This article takes an innovative look at the problem of security investment of nodes on their vulnerable links in a given contagious network as a game-theoretic model that can be applied to a variety of applications including information systems. In the proposed game model, each node computes its corresponding risk based on the value of its assets, vulnerabilities, and threats to determine the optimum level of security investments on its external links respecting its limited budget. Furthermore, direct and indirect nonlinear influences of a node's security investment on the risks of other nodes are considered. The existence and uniqueness of the game's Nash equilibrium in the proposed game are also proved. Further analysis of the model in a practical case revealed that taking advantage of the investment effects of other players, perfectly rational players (i.e., those who use the utility function of the proposed game model) make more cost-effective decisions than selfish nonrational or semirational players. © 2018 Society for Risk Analysis.
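The equilibrium logic can be sketched with a toy two-node version in which each node's residual risk decays exponentially in its own and its neighbor's link investment. The functional form, spillover coefficient, and best-response iteration are illustrative, not the paper's model.

```python
def nash_investments(log_asset, spillover=0.5, rounds=100):
    """Best-response iteration for a toy 2-player security game.

    Player i minimizes  x_i + exp(log_asset[i] - x_i - spillover * x_j),
    i.e. investment cost plus residual contagion risk.  Setting the
    derivative to zero gives the best response
        x_i = max(0, log_asset[i] - spillover * x_j),
    and repeated simultaneous updates converge to the Nash equilibrium
    when the spillover is below 1.
    """
    x = [0.0, 0.0]
    for _ in range(rounds):
        x = [max(0.0, log_asset[i] - spillover * x[1 - i]) for i in (0, 1)]
    return x
```

With symmetric assets the fixed point is x = log_asset / (1 + spillover): each player free-rides partially on the neighbor's investment, which is the interdependence effect the article analyzes.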
Martins, Isabella Vilhena Freire; de Avelar, Barbara Rauta; Pereira, Maria Julia Salim; da Fonseca, Adevair Henrique
2012-09-01
A model based on geographical information systems for mapping the risk of fascioliasis was developed for the southern part of Espírito Santo state, Brazil. The determinants investigated were precipitation, temperature, elevation, slope, soil type and land use. Weightings and grades were assigned to determinants and their categories according to their relevance with respect to fascioliasis. Theme maps depicting the spatial distribution of risk areas indicate that over 50% of southern Espírito Santo is either at high or at very high risk for fascioliasis. These areas were found to be characterized by comparatively high temperature but relatively low slope, low precipitation and low elevation corresponding to periodically flooded grasslands or soils that promote water retention.
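The mapping method amounts to a weighted overlay: each cell's risk score is the weighted sum of graded determinant layers, then classified by thresholds. The weights, grades, and class breaks below are illustrative, not those assigned in the study.

```python
def overlay_risk(layers, weights, breaks=(2.0, 3.0, 4.0)):
    """Weighted-overlay risk mapping on a per-cell basis.

    layers:  dict of determinant name -> grid (list of rows) of grades
             (e.g. 1-5, higher = more favorable to the parasite)
    weights: dict of determinant name -> relative weight (summing to 1)
    breaks:  score thresholds separating low/moderate/high/very high
    Returns a grid of risk-class labels.
    """
    labels = ("low", "moderate", "high", "very high")
    names = list(layers)
    rows, cols = len(layers[names[0]]), len(layers[names[0]][0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            score = sum(weights[n] * layers[n][r][c] for n in names)
            cls = sum(score >= b for b in breaks)  # thresholds passed
            row.append(labels[cls])
        out.append(row)
    return out
```

In a GIS this same computation runs over raster layers for precipitation, temperature, elevation, slope, soil type, and land use, yielding the theme maps of risk areas the study describes.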
Peters, Roger H; Young, M Scott; Rojas, Elizabeth C; Gorey, Claire M
2017-07-01
Over seven million persons in the United States are supervised by the criminal justice system, including many who have co-occurring mental and substance use disorders (CODs). This population is at high risk for recidivism and presents numerous challenges to those working in the justice system. To provide a contemporary review of the existing research and examine key issues and evidence-based treatment and supervision practices related to CODs in the justice system. We reviewed COD research involving offenders that has been conducted over the past 20 years and provide an analysis of key findings. Several empirically supported frameworks are available to guide services for offenders who have CODs, including Integrated Dual Disorders Treatment (IDDT), the Risk-Need-Responsivity (RNR) model, and Cognitive-Behavioral Therapy (CBT). Evidence-based services include integrated assessment that addresses both sets of disorders and the risk for criminal recidivism. Although several evidence-based COD interventions have been implemented at different points in the justice system, there remains a significant gap in services for offenders who have CODs. Existing program models include Crisis Intervention Teams (CIT), day reporting centers, specialized community supervision teams, pre- and post-booking diversion programs, and treatment-based courts (e.g., drug courts, mental health courts, COD dockets). Jail-based COD treatment programs provide stabilization of acute symptoms, medication consultation, and triage to community services, while longer-term prison COD programs feature Modified Therapeutic Communities (MTCs). Despite the availability of multiple evidence-based interventions that have been implemented across diverse justice system settings, these services are not sufficiently used to address the scope of treatment and supervision needs among offenders with CODs.
Mills, Joseph L; Conte, Michael S; Armstrong, David G; Pomposelli, Frank B; Schanzer, Andres; Sidawy, Anton N; Andros, George
2014-01-01
Critical limb ischemia, first defined in 1982, was intended to delineate a subgroup of patients with a threatened lower extremity primarily because of chronic ischemia. It was the intent of the original authors that patients with diabetes be excluded or analyzed separately. The Fontaine and Rutherford Systems have been used to classify risk of amputation and likelihood of benefit from revascularization by subcategorizing patients into two groups: ischemic rest pain and tissue loss. Due to demographic shifts over the last 40 years, especially a dramatic rise in the incidence of diabetes mellitus and rapidly expanding techniques of revascularization, it has become increasingly difficult to perform meaningful outcomes analysis for patients with threatened limbs using these existing classification systems. Particularly in patients with diabetes, limb threat is part of a broad disease spectrum. Perfusion is only one determinant of outcome; wound extent and the presence and severity of infection also greatly impact the threat to a limb. Therefore, the Society for Vascular Surgery Lower Extremity Guidelines Committee undertook the task of creating a new classification of the threatened lower extremity that reflects these important considerations. We term this new framework, the Society for Vascular Surgery Lower Extremity Threatened Limb Classification System. Risk stratification is based on three major factors that impact amputation risk and clinical management: Wound, Ischemia, and foot Infection (WIfI). The implementation of this classification system is intended to permit more meaningful analysis of outcomes for various forms of therapy in this challenging, but heterogeneous population. Copyright © 2014 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Benchmarking Discount Rate in Natural Resource Damage Assessment with Risk Aversion.
Wu, Desheng; Chen, Shuzhen
2017-08-01
Benchmarking a credible discount rate is of crucial importance in natural resource damage assessment (NRDA) and restoration evaluation. This article integrates a holistic framework of NRDA with prevailing low discount rate theory, and proposes a discount rate benchmarking decision support system based on service-specific risk aversion. The proposed approach has the flexibility of choosing appropriate discount rates for gauging long-term services, as opposed to decisions based simply on duration. It improves injury identification in NRDA since potential damages and side-effects to ecosystem services are revealed within the service-specific framework. A real embankment case study demonstrates valid implementation of the method. © 2017 Society for Risk Analysis.
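The sensitivity that motivates careful benchmarking is easy to demonstrate: the present value of a long-lived ecosystem-service stream varies severalfold with the discount rate. A minimal sketch with hypothetical figures:

```python
def present_value(annual_benefit, rate, years):
    """Discounted present value of a constant annual service benefit,
    discounted from the end of each year."""
    return sum(annual_benefit / (1.0 + rate) ** t
               for t in range(1, years + 1))
```

For a 50-year stream of 100 per year, a 7% rate gives a present value of roughly 1380, while a 2% rate (of the kind low-discount-rate theory recommends for long-term, risk-averse service valuation) gives roughly 3140; the rate choice, not the damage estimate, can dominate the assessment.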
Enhanced project management tool
NASA Technical Reports Server (NTRS)
Hsu, Chen-Jung (Inventor); Patel, Hemil N. (Inventor); Maluf, David A. (Inventor); Moh Hashim, Jairon C. (Inventor); Tran, Khai Peter B. (Inventor)
2012-01-01
A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as one or more of a monthly report, a task plan report, a schedule report, a budget report and a risk management report, are generated and made available for display or further analysis or collection into a customized report template. An extensible database allows searching for information based upon context and upon content. Seven different types of project risks are addressed, including non-availability of required skill mix of workers. The system can be configured to exchange data and results with corresponding portions of similar project analyses, and to provide user-specific access to specified information.
NASA Astrophysics Data System (ADS)
Salloum, Ahmed
Constraint relaxation by definition means that certain security, operational, or financial constraints are allowed to be violated in the energy market model for a predetermined penalty price. System operators utilize this mechanism in an effort to impose a price cap on shadow prices throughout the market. In addition, constraint relaxations can serve as corrective approximations that help in reducing the occurrence of infeasible or extreme solutions in the day-ahead markets. This work aims to capture the impact constraint relaxations have on system operational security. Moreover, this analysis also provides a better understanding of the correlation between DC market models and AC real-time systems and analyzes how relaxations in market models propagate to real-time systems. This information can be used not only to assess the criticality of constraint relaxations, but also as a basis for determining penalty prices more accurately. The constraint relaxation practice was replicated in this work using a test case and a real-life large-scale system, while capturing both energy market aspects and AC real-time system performance. System performance investigation included static and dynamic security analysis for base-case and post-contingency operating conditions. PJM peak hour loads were dynamically modeled in order to capture delayed voltage recovery and sustained depressed voltage profiles as a result of reactive power deficiency caused by constraint relaxations. Moreover, impacts of constraint relaxations on operational system security were investigated when risk-based penalty prices are used. Transmission lines in the PJM system were categorized according to their risk index and each category was assigned a different penalty price accordingly in order to avoid real-time overloads on high-risk lines. This work also extends the investigation of constraint relaxations to post-contingency relaxations, where emergency limits are allowed to be relaxed in energy market models.
Various scenarios were investigated to capture and compare between the impacts of base-case and post-contingency relaxations on real-time system performance, including the presence of both relaxations simultaneously. The effect of penalty prices on the number and magnitude of relaxations was investigated as well.
Denny, Diane S; Allen, Debra K; Worthington, Nicole; Gupta, Digant
2014-01-01
Delivering radiation therapy in an oncology setting is a high-risk process where system failures are more likely to occur because of increasing utilization, complexity, and sophistication of the equipment and related processes. Healthcare failure mode and effect analysis (FMEA) is a method used to proactively detect risks to the patient in a particular healthcare process and correct potential errors before adverse events occur. FMEA is a systematic, multidisciplinary team-based approach to error prevention and enhancing patient safety. We describe our experience of using FMEA as a prospective risk-management technique in radiation oncology at a national network of oncology hospitals in the United States, capitalizing not only on the use of a team-based tool but also creating momentum across a network of collaborative facilities seeking to learn from and share best practices with each other. The major steps of our analysis across 4 sites and collectively were: choosing the process and subprocesses to be studied, assembling a multidisciplinary team at each site responsible for conducting the hazard analysis, and developing and implementing actions related to our findings. We identified 5 areas of performance improvement for which risk-reducing actions were successfully implemented across our enterprise. © 2012 National Association for Healthcare Quality.
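FMEA conventionally scores each failure mode for severity, occurrence, and detectability (often on 1-10 scales) and ranks modes by the Risk Priority Number. The failure modes in the test below are invented for illustration and are not findings of this study.

```python
def rank_by_rpn(modes):
    """Rank FMEA failure modes by Risk Priority Number (RPN).

    modes: list of (name, severity, occurrence, detection) tuples,
    each scored 1-10 (10 = worst; for detection, 10 = hardest to
    detect before reaching the patient).
    Returns (name, RPN) pairs, highest priority first.
    """
    scored = [(name, sev * occ * det) for name, sev, occ, det in modes]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

The multidisciplinary team then targets risk-reducing actions at the top-ranked modes, which is the prioritization step the hospitals in this network performed for their radiation therapy subprocesses.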
Performance of diagnosis-based risk adjustment measures in a population of sick Australians.
Duckett, S J; Agius, P A
2002-12-01
Australia is beginning to explore 'managed competition' as an organising framework for the health care system. This requires setting fair capitation rates, i.e. rates that adjust for the risk profile of covered lives. This paper tests two US-developed risk adjustment approaches using Australian data. Data from the 'co-ordinated care' dataset (which incorporates all service costs of 16,538 participants in a large health service research project conducted in 1996-99) were grouped into homogeneous risk categories using risk adjustment 'grouper software'. The grouper products yielded homogeneous categories including Diagnostic Groups and Diagnostic Cost Groups. A two-stage analysis of predictive power was used: probability of any service use in the concurrent year, the next year, and the year after (logistic regression) and, for service users, a regression of the logged cost of service use. The independent variables were diagnosis, gender, a socioeconomic status (SES) variable, and age. Age-, gender- and diagnosis-based risk adjustment measures explain around 40-45% of the variation in costs of service use in the current year for untrimmed data (compared with around 15% for age and gender alone). Prediction of subsequent use is much poorer (around 20%). Using more information to assign people to risk categories generally improves prediction. The predictive power of diagnosis-based risk adjusters on this Australian dataset is similar to that found in the US studies. Low predictive power carries policy risks of cream skimming rather than managing population health and care. Competitive funding models with risk adjustment based on prior-year experience could reduce system efficiency if implemented with current risk adjustment technology.
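The two-stage analysis combines its models at prediction time: the probability of any service use multiplied by the retransformed mean of logged costs among users. The sketch below assumes lognormal retransformation, a common simplification that may differ from the smearing estimator actually applied.

```python
import math

def predicted_annual_cost(p_any_use, mean_log_cost, var_log_cost):
    """Two-part model prediction of expected cost per covered life.

    Stage 1 (logistic regression) supplies p_any_use; stage 2 (OLS on
    logged costs among service users) supplies the fitted mean and the
    residual variance of log cost.  Under lognormal retransformation,
        E[cost] = P(use) * exp(mu + sigma^2 / 2).
    """
    return p_any_use * math.exp(mean_log_cost + var_log_cost / 2.0)
```

For instance, a 50% probability of any use with a geometric-mean cost of 1000 among users implies an expected capitation cost of 500 before the variance correction; ignoring the sigma^2/2 term systematically underfunds high-variance risk categories.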
Disaster preparedness in a complex urban system: the case of Kathmandu Valley, Nepal.
Carpenter, Samuel; Grünewald, François
2016-07-01
The city is a growing centre of humanitarian concern. Yet, aid agencies, governments and donors are only beginning to comprehend the scale and, importantly, the complexity of the humanitarian challenge in urban areas. Using the case study of the Kathmandu Valley, Nepal, this paper examines the analytical utility of recent research on complex urban systems in strengthening scholarly understanding of urban disaster risk management, and outlines its operational relevance to disaster preparedness. Drawing on a literature review and 26 interviews with actors from across the Government of Nepal, the International Red Cross and Red Crescent Movement, non-governmental organisations, United Nations agencies, and at-risk communities, the study argues that complexity can be seen as a defining feature of urban systems and the risks that confront them. To manage risk in these systems effectively, preparedness efforts must be based on adaptive and agile approaches, incorporating the use of network analysis, partnerships, and new technologies. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Baum, Thomas; Karampinos, Dimitrios C; Brockow, Knut; Seifert-Klauss, Vanadin; Jungmann, Pia M; Biedermann, Tilo; Rummeny, Ernst J; Bauer, Jan S; Müller, Dirk
2015-01-01
Subjects with indolent systemic mastocytosis (ISM) have an increased risk for osteoporosis. It has been demonstrated that trabecular bone microstructure analysis improves the prediction of bone strength beyond dual-energy X-ray absorptiometry-based bone mineral density. The purpose of this study was to obtain Magnetic Resonance (MR)-based trabecular bone microstructure parameters as advanced imaging biomarkers in subjects with ISM (n=18) and compare them with those of normal controls (n=18). Trabecular bone microstructure parameters were not significantly (P>.05) different between subjects with ISM and controls. These findings revealed important pathophysiological information about ISM-associated osteoporosis and may limit the use of trabecular bone microstructure analysis in this clinical setting. Copyright © 2015 Elsevier Inc. All rights reserved.
Simulation Assisted Risk Assessment: Blast Overpressure Modeling
NASA Technical Reports Server (NTRS)
Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael
2006-01-01
A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.
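In a PRA of this kind, a failure-mode probability is typically estimated by Monte Carlo sampling of the physics model's output against a structural capability threshold. A sketch with a hypothetical lognormal overpressure distribution; all parameters are invented, and a real analysis would take them from the blast-overpressure and structural models.

```python
import math
import random

def failure_probability(threshold_psi, median_psi=3.0, sigma=0.4,
                        n=100_000, seed=7):
    """Monte Carlo estimate of P(blast overpressure at the capsule
    exceeds its structural capability).

    median_psi, sigma: placeholder lognormal parameters for the
    overpressure environment; threshold_psi: assumed capability.
    """
    rng = random.Random(seed)
    mu = math.log(median_psi)  # log-space location from the median
    hits = sum(rng.lognormvariate(mu, sigma) > threshold_psi
               for _ in range(n))
    return hits / n
```

Sweeping the capability threshold (or swapping in higher-fidelity overpressure distributions for near-field and headwinds cases) is how such a PRA identifies where added modeling fidelity most changes the crew-risk estimate.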
RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, Susan D.; Hunter, Regina L.; Link, Madison D.
RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.
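Tools of this kind ultimately aggregate expected annual loss as hazard frequency times consequence, summed over hazards and loss categories. A minimal sketch of that aggregation; the hazard rates and consequence figures in the test are invented, not RAMPART data or its actual algorithms.

```python
def expected_annual_loss(hazards):
    """Aggregate expected annual loss for a building across hazards.

    hazards: dict of hazard name -> (annual event frequency,
             dict of loss category -> consequence per event).
    Returns (total expected annual loss, per-hazard breakdown).
    """
    breakdown = {
        name: freq * sum(consequences.values())
        for name, (freq, consequences) in hazards.items()
    }
    return sum(breakdown.values()), breakdown
```

The per-hazard breakdown is what supports ranking buildings in a portfolio and deciding which hazard (or loss category, such as loss of mission versus loss of contents) drives a given building's risk.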
Impact of model-based risk analysis for liver surgery planning.
Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K
2014-05-01
A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of the international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To determine whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, the time to complete the planning task and the confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.
An agent based architecture for high-risk neonate management at neonatal intensive care unit.
Malak, Jaleh Shoshtarian; Safdari, Reza; Zeraati, Hojjat; Nayeri, Fatemeh Sadat; Mohammadzadeh, Niloofar; Farajollah, Seide Sedighe Seied
2018-01-01
In recent years, the use of new tools and technologies has decreased the neonatal mortality rate. Despite the positive effects of these technologies, decisions remain complex and uncertain in critical conditions, such as when the neonate is preterm or has a low birth weight or malformations. There is a need to automate the high-risk neonate management process by creating real-time, more precise decision support tools. The objective was to create a collaborative, real-time environment to manage neonates with critical conditions at the NICU (Neonatal Intensive Care Unit) and to overcome the weaknesses of high-risk neonate management by applying a multi-agent-based analysis and design methodology as a new solution for NICU management. This study was basic research for medical informatics method development, carried out in 2017. The requirement analysis was done by reviewing articles on NICU decision support systems; the PubMed, Science Direct, and IEEE databases were searched, and only English articles published after 1990 were included. A needs assessment was also done by reviewing the extracted features and the current processes in the NICU environment where the research was conducted. We analyzed the requirements and identified the main system roles (agents) and interactions through a comparative study of existing NICU decision support systems. The Universal Multi Agent Platform (UMAP) was applied to implement a prototype of our multi-agent-based high-risk neonate management architecture. Local environment agents interacted inside a container, and each container interacted with external resources, including other NICU systems and consultation centers. In the NICU container, the main identified agents were reception, monitoring, NICU registry, and outcome prediction, which interacted with human agents including nurses and physicians. Managing patients in NICU units requires online data collection, real-time collaboration, and management of many components.
Multi-agent systems are a well-known solution for the management, coordination, modeling, and control of NICU processes. We are currently working on an outcome prediction module that uses artificial intelligence techniques for neonatal mortality risk prediction. Full implementation and evaluation of the proposed architecture is considered future work.
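The container-and-agents structure described above can be sketched in a few lines. The agent names (reception, monitoring, registry, outcome prediction) come from the abstract; the message format, routing logic, and class names are purely illustrative, not the UMAP API:

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Minimal agent: records received messages; real agents would specialise handle()."""
    name: str
    inbox: list = field(default_factory=list)

    def handle(self, msg):
        self.inbox.append(msg)

class Container:
    """Holds local agents and routes messages between them, a simplified
    stand-in for a UMAP-style container."""
    def __init__(self):
        self.agents = {}

    def register(self, agent):
        self.agents[agent.name] = agent

    def send(self, to, msg):
        self.agents[to].handle(msg)

# Wire up the main agents identified in the study.
nicu = Container()
for name in ("reception", "monitoring", "registry", "outcome_prediction"):
    nicu.register(Agent(name))

# A new admission flows from reception to the registry and monitoring agents.
admission = {"neonate_id": "N-001", "gestational_age_weeks": 29, "weight_g": 1150}
nicu.send("reception", admission)
nicu.send("registry", admission)
nicu.send("monitoring", {"neonate_id": "N-001", "event": "start_vitals_stream"})
```

In a full system, each container would also expose its agents to external resources (other NICU systems, consultation centers), as the architecture describes.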
NASA Technical Reports Server (NTRS)
Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert
2010-01-01
Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting the cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a process-based cost estimation methodology bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
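The analytic hierarchy process mentioned above derives criterion weights from a pairwise comparison matrix. A minimal sketch using the common geometric-mean approximation of the principal eigenvector; the three criteria and the comparison values are hypothetical, not taken from the NASA GRC/Boeing model:

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights via the geometric mean of each row,
    normalised so the weights sum to 1."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise comparisons for three criteria:
# cost-risk, performance, schedule (Saaty 1-9 scale).
pairwise = [
    [1.0, 3.0, 5.0],    # cost-risk moderately/strongly preferred
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_weights(pairwise)
```

For a consistent matrix this geometric-mean vector coincides with the eigenvector method; a production implementation would also compute the consistency ratio before trusting the weights.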
Patriarca, Peter A; Van Auken, R Michael; Kebschull, Scott A
2018-01-01
Benefit-risk evaluations of drugs have been conducted since the introduction of modern regulatory systems more than 50 years ago. Such judgments are typically made on the basis of qualitative or semiquantitative approaches, often without the aid of quantitative assessment methods, the latter having often been applied asymmetrically to place emphasis on benefit more so than harm. In an effort to preliminarily evaluate the utility of lives lost or saved, or quality-adjusted life-years (QALY) lost and gained as a means of quantitatively assessing the potential benefits and risks of a new chemical entity, we focused our attention on the unique scenario in which a drug was initially approved based on one set of data, but later withdrawn from the market based on a second set of data. In this analysis, a dimensionless risk to benefit ratio was calculated in each instance, based on the risk and benefit quantified in similar units. The results indicated that FDA decisions to approve the drug corresponded to risk to benefit ratios less than or equal to 0.136, and that decisions to withdraw the drug from the US market corresponded to risk to benefit ratios greater than or equal to 0.092. The probability of FDA approval was then estimated using logistic regression analysis. The results of this analysis indicated that there was a 50% probability of FDA approval if the risk to benefit ratio was 0.121, and that the probability approaches 100% for values much less than 0.121, and the probability approaches 0% for values much greater than 0.121. The large uncertainty in these estimates due to the small sample size and overlapping data may be addressed in the future by applying the methodology to other drugs.
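The logistic relationship reported above (a 50% approval probability at a risk-to-benefit ratio of 0.121, approaching 100% and 0% for much smaller and much larger ratios) can be sketched as follows. The midpoint is taken from the abstract, but the slope parameter is an illustrative assumption, since the fitted regression coefficients are not given:

```python
import math

def approval_probability(risk_benefit_ratio, midpoint=0.121, slope=200.0):
    """Logistic model of approval probability vs. risk-to-benefit ratio.
    midpoint comes from the abstract; slope is a hypothetical steepness."""
    return 1.0 / (1.0 + math.exp(slope * (risk_benefit_ratio - midpoint)))
```

With these parameters, a ratio well below 0.121 yields a probability near 1 and a ratio well above it a probability near 0, matching the qualitative behaviour described.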
Ecological Risk Assessment with MCDM of Some Invasive Alien Plants in China
NASA Astrophysics Data System (ADS)
Xie, Guowen; Chen, Weiguang; Lin, Meizhen; Zheng, Yanling; Guo, Peiguo; Zheng, Yisheng
Alien plant invasion is an urgent global issue that threatens the sustainable development of the ecosystem health. The study of its ecological risk assessment (ERA) could help us to prevent and reduce the invasion risk more effectively. Based on the theory of ERA and methods of the analytic hierarchy process (AHP) of multi-criteria decision-making (MCDM), and through the analyses of the characteristics and processes of alien plant invasion, this paper discusses the methodologies of ERA of alien plant invasion. The assessment procedure consisted of risk source analysis, receptor analysis, exposure and hazard assessment, integral assessment, and countermeasure of risk management. The indicator system of risk source assessment as well as the indices and formulas applied to measure the ecological loss and risk were established, and the method for comprehensively assessing the ecological risk of alien plant invasion was worked out. The result of ecological risk analysis to 9 representative invasive alien plants in China shows that the ecological risk of Erigeron annuus, Ageratum conyzoides, Alternanthera philoxeroides and Mikania midrantha is high (grade1-2), that of Oxalis corymbosa and Wedelia chinensis comes next (grade3), while Mirabilis jalapa, Pilea microphylla and Calendula officinalis of the last (grade 4). Risk strategies are put forward on this basis.
Cyber Contingency Analysis version 1.x
DOE Office of Scientific and Technical Information (OSTI.GOV)
A contingency-analysis-based approach for quantifying and examining the resiliency of a cyber system with respect to confidentiality, integrity, and availability. A graph representing an organization's cyber system and related resources is used for the availability contingency analysis. The mission-critical paths associated with an organization are used to determine the consequences of a potential contingency. A node (or combination of nodes) is removed from the graph to analyze a particular contingency. The value of all mission-critical paths disrupted by that contingency is used to quantify its severity. A total severity score can be calculated based on the complete list of all these contingencies. A simple n-1 analysis can be done, in which only one node is removed at a time; we can also compute an n-k analysis, where k is the number of nodes removed simultaneously. A contingency risk score can also be computed, which takes the probability of the contingencies into account. In addition to availability, we can also quantify confidentiality and integrity scores for the system. These treat user accounts as potential contingencies. The amount (and type) of files that an account can read is used to compute the confidentiality score; the amount (and type) of files that an account can write is used to compute the integrity score. As with the availability analysis, we can use this information to compute total severity scores with regard to confidentiality and integrity, and take probability into account to compute associated risk scores.
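The availability analysis described above can be sketched with a toy example. The node names and path values below are hypothetical; a real implementation would derive the mission-critical paths and their values from the organization's dependency graph:

```python
from itertools import combinations

# Each mission-critical path has a value; a contingency (node removal)
# "costs" the total value of the paths it disrupts.
critical_paths = {
    ("firewall", "web", "db"): 10,        # customer portal
    ("firewall", "vpn", "fileshare"): 5,  # remote work
    ("web", "db"): 8,                     # internal reporting
}

def severity(removed_nodes):
    """Total value of mission-critical paths disrupted by removing these nodes."""
    removed = set(removed_nodes)
    return sum(v for path, v in critical_paths.items() if removed & set(path))

def n_k_analysis(nodes, k=1):
    """Severity of every k-node contingency (k=1 gives the simple n-1 analysis)."""
    return {combo: severity(combo) for combo in combinations(sorted(nodes), k)}

nodes = {"firewall", "web", "db", "vpn", "fileshare"}
n1 = n_k_analysis(nodes, k=1)
total_severity = sum(n1.values())
```

Weighting each contingency's severity by its probability of occurrence would turn the severity scores into the risk scores the abstract mentions.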
A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas
Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan
2016-01-01
Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land use, and large numbers of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. The Urban Flood Simulation Model (UFSM) and the Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, with the flood risk reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing the changes of flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation. PMID:27527202
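The S-shaped return-period-to-damage (R-D) relationship used in the framework can be sketched with a logistic curve: damage is negligible below the protection standard and saturates for extreme floods. All parameter values below are illustrative, not the fitted Pudong values:

```python
import math

def damage(return_period_yr, d_max=100.0, midpoint=66.0, steepness=0.08):
    """S-shaped R-D curve: damage (arbitrary units) vs. flood return period.
    midpoint is the return period at which half the maximum damage occurs."""
    return d_max / (1.0 + math.exp(-steepness * (return_period_yr - midpoint)))

# Flood control works effectively shift the curve toward rarer floods
# (hypothetical shift of the midpoint from 66 to 90 years).
without_works = damage(50, midpoint=66.0)
with_works = damage(50, midpoint=90.0)
reduction = 1.0 - with_works / without_works
```

Integrating such a curve over the exceedance probabilities of all return periods, with and without the works, yields the risk-reduction percentages the study reports.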
Alonazi, Wadi B
2017-06-12
The natural assimilation of the process through which health partners sustain long-term relationships is a key issue in maintaining social well-being, reducing health risk factors, and sustaining public health programs. One global initiative in building effective healthcare systems is the public-private partnership (PPP). This study elucidates the key performance indicators proposed by the Ministry of Health of Saudi Arabia, based on the government's projections known as Vision 2030, from the perspective of health risk factors. Through an inductive content analysis, this study assessed primary and secondary data in relation to the Saudi National Transformation Program (NTP). To identify the institutions that played a role in formulating the new Saudi Healthcare System, health policies, regulations, and reports published between 1996 and 2016 were categorized. After ranking the risk factors, the investigator selected 13 healthcare professionals for four focus group interviews to explore in depth the challenges that the NTP faces from a health risk perspective. Thus, the study employed qualitative data gathered through focus group interviews with key figures, as well as data extracted from written sources, to identify distinct but interrelated partnerships practiced within risk management. A methodological overview of NTP priorities and implementation offered practical guidance in the healthcare context. The five critical factors in maintaining successful and sustainable PPPs were (1) trustworthiness, (2) technological capability, (3) patient-centeredness, (4) competence, and (5) flexibility. Concessions on primary and secondary healthcare services might be a good option, based on the literature review and considering their popularity in other countries. A high outcome-based risk of PPPs was the most commonly shared perspective in risk management.
Although the impact of the NTP has yet to be explored, its potential for challenging health consequences requires consideration and substantial regulatory action. This study contributes to the emerging critical analysis of local health initiatives by highlighting how integration may only be possible with a more radical conceptualization of national health governance.
Domingues, Patrícia Henriques; Sousa, Pablo; Otero, Álvaro; Gonçalves, Jesus Maria; Ruiz, Laura; de Oliveira, Catarina; Lopes, Maria Celeste; Orfao, Alberto; Tabernero, Maria Dolores
2014-01-01
Background Tumor recurrence remains the major clinical complication of meningiomas, the majority of recurrences occurring among WHO grade I/benign tumors. In the present study, we propose a new scoring system for the prognostic stratification of meningioma patients based on analysis of a large series of meningiomas followed for a median of >5 years. Methods Tumor cytogenetics were systematically investigated by interphase fluorescence in situ hybridization in 302 meningioma samples, and the proposed classification was further validated in an independent series of cases (n = 132) analyzed by high-density (500K) single-nucleotide polymorphism (SNP) arrays. Results Overall, we found an adverse impact on patient relapse-free survival (RFS) for males, presence of brain edema, younger patients (<55 years), tumor size >50 mm, tumor localization at intraventricular and anterior cranial base areas, WHO grade II/III meningiomas, and complex karyotypes; the latter 5 variables showed an independent predictive value in multivariate analysis. Based on these parameters, a prognostic score was established for each individual case, and patients were stratified into 4 risk categories with significantly different (P < .001) outcomes. These included a good prognosis group, consisting of approximately 20% of cases, that showed a RFS of 100% ± 0% at 10 years and a very poor-prognosis group with a RFS rate of 0% ± 0% at 10 years. The prognostic impact of the scoring system proposed here was also retained when WHO grade I cases were considered separately (P < .001). Conclusions Based on this risk-stratification classification, different strategies may be adopted for follow-up, and eventually also for treatment, of meningioma patients at different risks for relapse. PMID:24536048
Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems
NASA Astrophysics Data System (ADS)
Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.
2012-04-01
Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impacts on agriculture, the environment, the economy and society. This paper deals with drought risk assessment, the first step of which is designed to find out what the problems are; it comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. Apart from risk management, which is not covered in this paper, there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and the extraction of several features such as severity, duration, areal extent, and onset and end time. It also involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months).
The results of these three-step drought assessment processes are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece. Moreover, remote sensing has proven very effective in delineating spatial variability and features in drought monitoring and assessment.
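The initial RDI value is the ratio of accumulated precipitation to accumulated potential evapotranspiration over the reference period, and a log-standardised form is commonly used for drought classification. A minimal sketch; the series values below are made up, not Thessaly data:

```python
import math
import statistics

def rdi_alpha(precip_mm, pet_mm):
    """Initial RDI value: accumulated precipitation over accumulated PET."""
    return sum(precip_mm) / sum(pet_mm)

def rdi_standardised(alpha_series):
    """Standardised RDI: z-scores of ln(alpha) over a historical series of
    alpha values (one per year); negative values indicate drought."""
    logs = [math.log(a) for a in alpha_series]
    mu, sd = statistics.mean(logs), statistics.stdev(logs)
    return [(x - mu) / sd for x in logs]

# Hypothetical monthly accumulations for one hydrological year:
alpha = rdi_alpha([10, 20], [30, 30])          # -> 0.5
z = rdi_standardised([0.5, 0.8, 1.1, 0.6])     # multi-year standardisation
```

In the remote-sensing application described above, the precipitation and PET inputs would be derived per pixel from the NOAA-AVHRR record rather than from station data.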
Weather Augmented Risk Determination (WARD) System
NASA Astrophysics Data System (ADS)
Niknejad, M.; Mazdiyasni, O.; Momtaz, F.; AghaKouchak, A.
2017-12-01
Extreme climatic events have direct and indirect impacts on society, the economy and the environment. Based on United States Bureau of Economic Analysis (BEA) data, over one third of the U.S. GDP can be considered weather-sensitive, involving some degree of weather risk. This extends from local-scale concrete foundation construction to large-scale transportation systems. Extreme and unexpected weather conditions have always been considered one of the probable risks to human health, productivity and activities. The construction industry is a large sector of the economy and is greatly influenced by weather-related risks, including work stoppage and low labor productivity. Identification and quantification of these risks, and mitigation of their effects, are constant concerns of construction project managers. In addition to the destructive effects of severe weather conditions, seasonal changes in weather conditions can also have negative impacts on human health. Work stoppage and reduced labor productivity can be caused by precipitation, wind, temperature, relative humidity and other weather conditions. Historical and project-specific weather information can support better project management and mitigation planning, and ultimately reduce the risk of weather-related conditions. This paper proposes new software for project-specific, user-defined data analysis that offers (a) the probability of work stoppage and the estimated project length considering weather conditions; (b) information on reduced labor productivity and its impacts on project duration; and (c) probabilistic information on the project timeline based on both weather-related work stoppage and labor productivity. The software (WARD System) is designed such that it can be integrated into already available project management tools.
While the system and presented application focuses on the construction industry, the developed software is general and can be used for any application that involves labor productivity (e.g., farming) and work stoppage due to weather conditions (e.g., transportation, agriculture industry).
Effects of changes along the risk chain on flood risk
NASA Astrophysics Data System (ADS)
Duha Metin, Ayse; Apel, Heiko; Viet Dung, Nguyen; Guse, Björn; Kreibich, Heidi; Schröter, Kai; Vorogushyn, Sergiy; Merz, Bruno
2017-04-01
Interactions of hydrological and socio-economic factors shape flood disaster risk. For this reason, assessment of flood risk ideally takes into account the whole flood risk chain, from atmospheric processes, through catchment and river system processes, to the damage mechanisms in the affected areas. Since very different processes at various scales interact along the flood risk chain, the impact of the individual components is rather unclear. For flood risk management, however, it is necessary to know the controlling factors of flood damage. The present study, using the flood-prone Mulde catchment in Germany, discusses the sensitivity of flood risk to disturbances along the risk chain: How do disturbances propagate through the risk chain? How do different disturbances combine or conflict and affect flood risk? This sensitivity analysis includes five components of flood risk change: climate, catchment, river system, exposure and vulnerability. A model framework representing the complete risk chain is combined with observational data to understand how the sensitivities evolve along the risk chain, by considering three plausible change scenarios for each of the five components. The flood risk is calculated using the Regional Flood Model (RFM), which is based on a continuous simulation approach including rainfall-runoff, 1D river network, 2D hinterland inundation and damage estimation models. The sensitivity analysis covers more than 240 scenarios with different combinations of the five components. It is investigated how changes in different components affect risk indicators, such as the risk curve and expected annual damage (EAD). In conclusion, changes in exposure and vulnerability seem to outweigh changes in hazard.
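The expected annual damage (EAD) indicator mentioned above is the area under the risk curve, i.e. damage as a function of annual exceedance probability. A minimal trapezoid-rule sketch with hypothetical numbers (not Mulde results):

```python
def ead(exceedance_probs, damages):
    """Expected annual damage: trapezoid-rule area under the risk curve.
    Points are ordered from frequent (high probability) to rare (low)."""
    total = 0.0
    for i in range(len(damages) - 1):
        dp = exceedance_probs[i] - exceedance_probs[i + 1]
        total += 0.5 * (damages[i] + damages[i + 1]) * dp
    return total

# Illustrative risk curve for return periods of 2, 10, 100 and 1000 years.
probs = [0.5, 0.1, 0.01, 0.001]     # annual exceedance probabilities
dmg = [0.0, 5.0, 60.0, 200.0]       # damage, hypothetical million EUR
baseline = ead(probs, dmg)
```

A sensitivity scenario (e.g. increased exposure) would modify the damage values, and the resulting change in EAD quantifies that component's contribution to risk.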
Kawabata, Hiroshi; Tohyama, Kaoru; Matsuda, Akira; Araseki, Kayano; Hata, Tomoko; Suzuki, Takahiro; Kayano, Hidekazu; Shimbo, Kei; Zaike, Yuji; Usuki, Kensuke; Chiba, Shigeru; Ishikawa, Takayuki; Arima, Nobuyoshi; Nogawa, Masaharu; Ohta, Akiko; Miyazaki, Yasushi; Mitani, Kinuko; Ozawa, Keiya; Arai, Shunya; Kurokawa, Mineo; Takaori-Kondo, Akifumi
2017-09-01
The Japanese National Research Group on Idiopathic Bone Marrow Failure Syndromes has been conducting prospective registration, central review, and follow-up studies for patients with aplastic anemia and myelodysplastic syndrome (MDS) since 2006. Using this database, we retrospectively analyzed the prognosis of patients with MDS. As of May 2016, 351 cases were registered in this database, 186 of which were eligible for the present study. Kaplan-Meier analysis showed that the overall survival (OS) curves of the five risk categories stipulated by the revised international prognostic scoring system (IPSS-R) were reasonably separated. Two-year OS rates for the very low-, low-, intermediate-, high-, and very high-risk categories were 95, 89, 79, 35, and 12%, respectively. In the same categories, the incidence of leukemic transformation at 2 years was 0, 10, 8, 56, and 40%, respectively. Multivariate analysis revealed that male sex, low platelet counts, increased blast percentage (>2%), and high-risk karyotype abnormalities were independent risk factors for poor OS. Based on these data, we assigned Japanese MDS patients classified as intermediate-risk by the IPSS-R to the lower-risk MDS category, highlighting the need for careful assessment of treatments within low- and high-risk treatment protocols.
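The Kaplan-Meier overall-survival estimate used above can be reproduced with a short product-limit estimator. The follow-up times below are toy values, not the registry's data:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up times (e.g. months); events: 1 = death, 0 = censored.
    Returns a list of (time, survival probability) at each death time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        while i < len(data) and data[i][0] == t:   # gather ties at time t
            if data[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

# Toy cohort: deaths at 3, 5, 8, 12 months; censoring at 5 and 12 months.
curve = kaplan_meier([3, 5, 5, 8, 12, 12], [1, 0, 1, 1, 0, 1])
```

Running this estimator separately on each IPSS-R category and comparing the curves (e.g. with a log-rank test) is the standard route to the category separation the study reports.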
NASA Astrophysics Data System (ADS)
Chu, Zhongyi; Di, Jingnan; Cui, Jing
2017-10-01
Space debris occupies a valuable orbital resource and poses an inevitable and urgent problem, especially large space debris, because of its high risk and the possible crippling effects of a collision; it has attracted much attention in recent years. A tethered system used in an active debris removal scenario is a promising method to de-orbit large debris in a safe manner. In a tethered system, the flexibility of the tether used in debris removal can induce tangling, which is dangerous and should be avoided. In particular, attachment point bias due to capture error can significantly affect the motion of the debris relative to the tether and increase the tangling risk. Hence, in this paper, the effect of attachment point bias on the tethered system is studied using a dynamic model established with a Newtonian approach. Next, a safety metric for avoiding a tangle when a tether is tensioned with attachment point bias is designed to analyse the tangling risk of the tethered system. Finally, several numerical cases are established and simulated to validate the effects of attachment point bias on a space tethered system.
Evaluating robustness in rank-based risk assessments of freshwater ecosystems
Mattson, K.M.; Angermeier, Paul
2007-01-01
Conservation planning aims to protect biodiversity by sustaining the natural physical, chemical, and biological processes within representative ecosystems. Often the data needed to measure these components are inadequate or unavailable. The impact of human activities on ecosystem processes complicates integrity assessments and might alter ecosystem organization at multiple spatial scales. Freshwater conservation targets, such as populations and communities, are influenced by both intrinsic aquatic properties and the surrounding landscape, and locally collected data might not accurately reflect potential impacts. We suggest that changes in five major biotic drivers (energy sources, physical habitat, flow regime, water quality, and biotic interactions) might be used as surrogates to inform conservation planners of the ecological integrity of freshwater ecosystems. Threats to freshwater systems might be evaluated based on their impacts on these drivers to provide an overview of potential risk to conservation targets. We developed a risk-based protocol, the Ecological Risk Index (ERI), to identify watersheds with the least and most risk to conservation targets. Our protocol combines risk-based components, specifically the frequency and severity of human-induced stressors, with biotic drivers and mappable land- and water-use data to provide a summary of relative risk to watersheds. We illustrate the application of our protocol with a case study of the upper Tennessee River basin, USA. Differences in risk patterns among the major drainages in the basin reflect dominant land uses, such as mining and agriculture. A principal components analysis showed that localized, moderately severe threats accounted for most of the differences in threat composition among watersheds. We also found that the relative importance of threats is sensitive to the spatial grain of the analysis.
Our case study demonstrates that the ERI is useful for evaluating the frequency and severity of ecosystemwide risk, which can inform local and regional conservation planning.
Risk Analysis Methods for Deepwater Port Oil Transfer Systems
DOT National Transportation Integrated Search
1976-06-01
This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessmen...
Megacity Indicator System for Disaster Risk Management in Istanbul (MegaIST)
NASA Astrophysics Data System (ADS)
Yahya Menteşe, Emin; Kılıç, Osman; Baş, Mahmut; Khazai, Bijan; Ergün Konukcu, Betul; Emre Basmacı, Ahmet
2017-04-01
Decision makers need tools to understand priorities, set up benchmarks, and track progress in their disaster risk reduction activities, so that they can justify their decisions and investments. To this end, the Megacity Indicator System for Disaster Risk Management (MegaIST) was developed by the Directorate of Earthquake and Ground Research of Istanbul Metropolitan Municipality (IMM). MegaIST is intended for use in disaster risk management studies: it helps decision makers and managers establish the right strategies and proper risk reduction actions, enhance resource management and investment decisions, set priorities, monitor progress in DRM, and validate decisions taken, with the aims of supporting disaster-oriented urban redevelopment, informing investors about the risk profile of the city, and providing a basis for dissemination and sharing of risk components with related stakeholders. MegaIST achieves these goals by analyzing earthquake risk in three separate but complementary sub-categories, urban seismic risk, coping capacity, and a disaster risk management index, in an integrated way. The MegaIST model presents its outputs in a simple and user-friendly format benefiting from GIS technology, which ensures the adoptability of the model. Urban seismic risk analysis includes two components, namely physical risk and social vulnerability analysis. Physical risk analysis is based on the possible physical losses (such as building damage and casualties) due to an earthquake, while social vulnerability is considered a factor that amplifies the physical losses in correlation with the level of education, health, economic status, and disaster awareness/preparedness of society.
Coping capacity analysis is carried out to understand the Municipality's readiness to respond to and recover from a disaster in Istanbul. This readiness can be defined both in terms of operational capacities (the capacity of the Municipality, given the demand on its resources, to respond to emergencies and restore services) and functional capacities (the policies and planning measures at the Municipality that lead to risk reduction and the protection of people). The Disaster Risk Management Index (DRMI) is used as a "control system" within the conceptual framework of MegaIST. The index was developed to understand the impact of corporate governance, enforcement structures, and policies on total urban seismic risk, and to enable performance evaluation. DRMI is composed of macro indicators developed to monitor the institution's progress in reducing disaster risk, presented in four broad indicator groups: Legal and Institutional Requirements, Risk Reduction Implementation and Preparedness Activities, Readiness to Respond and Recover, and Strategy and Coordination. As a result, by identifying and analyzing physical and social vulnerabilities along with coping capacity and disaster risk management performance indicators, MegaIST establishes an integrated, analytical decision support system to enhance the DRM process and move toward a disaster-resilient urban environment.
A Strategic Approach to Medical Care for Exploration Missions
NASA Technical Reports Server (NTRS)
Canga, Michael A.; Shah, Ronak V.; Mindock, Jennifer A.; Antonsen, Erik L.
2016-01-01
Exploration missions will present significant new challenges to crew health, including effects of variable gravity environments, limited communication with Earth-based personnel for diagnosis and consultation for medical events, limited resupply, and limited ability for crew return. Providing health care capabilities for exploration class missions will require system trades be performed to identify a minimum set of requirements and crosscutting capabilities, which can be used in design of exploration medical systems. Medical data, information, and knowledge collected during current space missions must be catalogued and put in formats that facilitate querying and analysis. These data are used to inform the medical research and development program through analysis of risk trade studies between medical care capabilities and system constraints such as mass, power, volume, and training. Medical capability as a quantifiable variable is proposed as a surrogate risk metric and explored for trade space analysis that can improve communication between the medical and engineering approaches to mission design. The resulting medical system design approach selected will inform NASA mission architecture, vehicle, and subsystem design for the next generation of spacecraft.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wellman, Dawn M.; Freshley, Mark D.; Truex, Michael J.
Current requirements for site remediation and closure are standards-based and are often overly conservative, costly, and in some cases, technically impractical to achieve. Use of risk-informed alternate endpoints provides a means to achieve remediation goals that are permitted by regulations and are protective of human health and the environment. Alternate endpoints enable establishing a path for cleanup that may include intermediate remedial milestones and transition points and/or regulatory alternatives to standards-based remediation. A framework is presented that is centered around developing and refining conceptual models in conjunction with assessing risks and potential endpoints as part of a system-based assessment that integrates site data with scientific understanding of processes that control the distribution and transport of contaminants in the subsurface and pathways to receptors. This system-based assessment and subsequent implementation of the remediation strategy with appropriate monitoring are targeted at providing a holistic approach to addressing risks to human health and the environment. This holistic approach also enables effective predictive analysis of contaminant behavior to provide defensible criteria and data for making long-term decisions. Developing and implementing an alternate endpoint-based approach for remediation and waste site closure presents a number of challenges and opportunities. Categories of these challenges include scientific and technical, regulatory, institutional, and budget and resource allocation issues. Opportunities exist for developing and implementing systems-based approaches with respect to supportive characterization, monitoring, predictive modeling, and remediation approaches.
Conceptual Launch Vehicle and Spacecraft Design for Risk Assessment
NASA Technical Reports Server (NTRS)
Motiwala, Samira A.; Mathias, Donovan L.; Mattenberger, Christopher J.
2014-01-01
One of the most challenging aspects of developing human space launch and exploration systems is minimizing and mitigating the many potential risk factors to ensure the safest possible design while also meeting the required cost, weight, and performance criteria. In order to accomplish this, effective risk analyses and trade studies are needed to identify key risk drivers, dependencies, and sensitivities as the design evolves. The Engineering Risk Assessment (ERA) team at NASA Ames Research Center (ARC) develops advanced risk analysis approaches, models, and tools to provide such meaningful risk and reliability data throughout vehicle development. The goal of the project presented in this memorandum is to design a generic launch vehicle and spacecraft architecture that can be used to develop and demonstrate these new risk analysis techniques without relying on other proprietary or sensitive vehicle designs. To accomplish this, initial spacecraft and launch vehicle (LV) designs were established using historical sizing relationships for a mission delivering four crewmembers and equipment to the International Space Station (ISS). Mass-estimating relationships (MERs) were used to size the crew capsule and launch vehicle, and a combination of optimization techniques and iterative design processes were employed to determine a possible two-stage-to-orbit (TSTO) launch trajectory into a 350-kilometer orbit. Primary subsystems were also designed for the crewed capsule architecture, based on a 24-hour on-orbit mission with a 7-day contingency. Safety analysis was performed to identify major risks to crew survivability and assess the system's overall reliability. These procedures and analyses validate that the architecture's basic design and performance are reasonable for use in risk trade studies.
While the vehicle designs presented are not intended to represent a viable architecture, they will provide a valuable initial platform for developing and demonstrating innovative risk assessment capabilities.
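The mass-estimating iteration described above ultimately has to close a delta-v budget. A minimal sketch of that first-order check, using the ideal rocket equation for a two-stage vehicle; the Isp values, stage masses, and the 9.3 km/s target below are illustrative assumptions, not figures from the memorandum.

```python
# Hedged sketch: first-order TSTO sizing with the ideal rocket equation.
# All masses, Isp values, and the delta-v target are invented placeholders.
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_dv(isp_s, m0, mf):
    """Ideal rocket equation delta-v (m/s) for one stage burn."""
    return isp_s * G0 * math.log(m0 / mf)

# gross 450 t; stage 1 burns 300 t of propellant, stage 2 burns 100 t
m0 = 450e3
m_after_s1_burn = m0 - 300e3
m_after_staging = m_after_s1_burn - 30e3   # drop stage-1 dry mass
m_final = m_after_staging - 100e3
dv = (stage_dv(300, m0, m_after_s1_burn)        # sea-level-class Isp
      + stage_dv(450, m_after_staging, m_final)) # vacuum-class Isp
meets_orbit = dv >= 9300.0   # rough LEO delta-v budget, m/s
```

In a sizing loop like the one the abstract describes, stage propellant masses would be iterated until `dv` closes the budget with margin.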
NASA Langley Systems Analysis & Concepts Directorate Technology Assessment/Portfolio Analysis
NASA Technical Reports Server (NTRS)
Cavanaugh, Stephen; Chytka, Trina; Arcara, Phil; Jones, Sharon; Stanley, Doug; Wilhite, Alan W.
2006-01-01
Systems analysis develops and documents candidate missions and architectures, associated system concepts, enabling capabilities, and investment strategies to achieve NASA's strategic objectives. The technology assessment process connects the missions and architectures to the investment strategies. In order to successfully implement a technology assessment, there is a need to collect, manipulate, analyze, document, and disseminate technology-related information. Information must be collected and organized on the wide variety of potentially applicable technologies, including: previous research results, key technical parameters and characteristics, technology readiness levels, relationships to other technologies, costs, and potential barriers and risks. This information must be manipulated to facilitate planning and documentation. An assessment is included of the programmatic and technical risks associated with each technology task as well as potential risk mitigation plans. Risks are assessed and tracked in terms of likelihood of the risk occurring and consequences of the risk if it does occur. The risk assessments take into account cost, schedule, and technical risk dimensions. Assessment data must be simplified for presentation to decision makers. The Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center has a wealth of experience in performing technology assessment and portfolio analysis, as this has been a business line since 1978.
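The likelihood-times-consequence tracking described above can be sketched as a simple risk matrix. A minimal illustration, assuming a 1-5 ordinal scale on each axis; the dimension names and scores are invented for the example, not taken from the SACD process.

```python
# Hedged sketch: ordinal likelihood x consequence scoring across the
# cost/schedule/technical dimensions mentioned in the abstract.

def risk_score(likelihood: int, consequence: int) -> int:
    """Ordinal score on an assumed 5x5 risk matrix: 1 (low) .. 25 (high)."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must lie on the 1-5 scale")
    return likelihood * consequence

def task_risk(likelihood: int, consequences: dict) -> dict:
    """Score one technology task across several consequence dimensions."""
    return {dim: risk_score(likelihood, c) for dim, c in consequences.items()}

ratings = task_risk(4, {"cost": 3, "schedule": 2, "technical": 5})
driver = max(ratings, key=ratings.get)   # dimension driving mitigation
```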
NASA Astrophysics Data System (ADS)
Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi
2009-10-01
As the production of engineered nanomaterials quantitatively expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system is needed for workplaces in the nanomaterial industry based on the precautionary principle. One of the problems in the risk management system is difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement especially at facilities with high levels of background NPs.
Risk Costs for New Dams: Economic Analysis and Effects of Monitoring
NASA Astrophysics Data System (ADS)
Paté-Cornell, M. Elisabeth; Tagaras, George
1986-01-01
This paper presents new developments and illustrations of the introduction of risk and costs in cost-benefit analysis for new dams. The emphasis is on a method of evaluation of the risk costs based on the structure of the local economy. Costs to agricultural property as well as residential, commercial, industrial, and public property are studied in detail. Of particular interest is the case of sequential dam failure and the evaluation of the risk costs attributable to a new dam upstream from an existing one. Three real cases are presented as illustrations of the method: the Auburn Dam, the Dickey-Lincoln School Project, and the Teton Dam, which failed in 1976. This last case provides a calibration tool for the estimation of loss ratios. For these three projects, the risk-modified benefit-cost ratios are computed to assess the effect of the risk on the economic performance of the project. The role of a warning system provided by systematic monitoring of the dam is analyzed: by reducing the risk costs, the warning system attenuates their effect on the benefit-cost ratio. The precursors, however, can be missed or misinterpreted: monitoring does not guarantee that the risks to human life can be reduced to zero. This study shows, in particular, that it is critical to consider the risk costs in the decision to build a new dam when the flood area is large and densely populated.
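The risk-modified benefit-cost ratio described above can be sketched by folding the present value of expected annual failure losses into the cost side. A minimal illustration under assumed numbers; the failure probability, loss estimates, and discount rate are placeholders, not values from the Auburn, Dickey-Lincoln, or Teton analyses.

```python
# Hedged sketch: risk-modified benefit-cost ratio for a new dam.
# All monetary figures and probabilities below are illustrative.

def present_value(annual_amount, rate, years):
    """Present value of a constant annual cash flow (ordinary annuity)."""
    return annual_amount * (1 - (1 + rate) ** -years) / rate

def risk_modified_bc_ratio(benefits_pv, costs_pv, p_fail, loss_if_fail,
                           rate=0.05, years=50):
    """Fold expected annual failure losses into the cost denominator."""
    risk_cost_pv = present_value(p_fail * loss_if_fail, rate, years)
    return benefits_pv / (costs_pv + risk_cost_pv)

plain = 120e6 / 100e6                                    # ratio ignoring risk
modified = risk_modified_bc_ratio(120e6, 100e6, 1e-4, 5e9)
# a warning system from monitoring reduces exposure, hence loss_if_fail,
# but (as the paper notes) cannot drive the risk cost to zero
with_warning = risk_modified_bc_ratio(120e6, 100e6, 1e-4, 2e9)
```

The warning system attenuates, but does not remove, the penalty the risk cost imposes on the benefit-cost ratio.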
Broitman, D; Raviv, O; Ayalon, O; Kan, I
2018-05-01
Setting up a sustainable agricultural vegetative waste-management system is a challenging investment task, particularly when markets for output products of waste-treatment technologies are not well established. We conduct an economic analysis of possible investments in treatment technologies of agricultural vegetative waste, while accounting for fluctuating output prices. Under a risk-neutral approach, we find the range of output-product prices within which each considered technology becomes most profitable, using average final prices as the exclusive factor. Under a risk-averse perspective, we rank the treatment technologies based on their computed certainty-equivalent profits as functions of the coefficient of variation of the technologies' output prices. We find the ranking of treatment technologies based on average prices to be robust to output-price fluctuations provided that the coefficient of variation of the output prices is below about 0.4, that is, approximately twice as high as that of well-established recycled-material markets such as glass, paper and plastic. We discuss some policy implications that arise from our analysis regarding vegetative waste management and its associated risks. Copyright © 2018 Elsevier Ltd. All rights reserved.
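The certainty-equivalent ranking described above can be sketched with the standard CARA/normal approximation, CE = mu - 0.5*r*sigma^2, where sigma follows from the output-price coefficient of variation. The technology names, mean profits, and risk-aversion coefficient below are invented for illustration, not values from the study.

```python
# Hedged sketch: ranking waste-treatment technologies by certainty-
# equivalent profit; all numbers are illustrative placeholders.

def certainty_equivalent(mean_profit, cv, r=2e-6):
    """CE = mu - 0.5*r*sigma^2 under a CARA/normal approximation."""
    sigma = cv * mean_profit        # std dev implied by the price CV
    return mean_profit - 0.5 * r * sigma ** 2

def rank(technologies, cv, r=2e-6):
    """Return technology names ordered by CE profit, best first."""
    return sorted(technologies,
                  key=lambda t: certainty_equivalent(technologies[t], cv, r),
                  reverse=True)

techs = {"composting": 900_000, "gasification": 1_000_000, "mulching": 700_000}
ranking_low_cv  = rank(techs, cv=0.2)   # well-established recycled market
ranking_high_cv = rank(techs, cv=0.9)   # volatile output-product prices
```

At low CV the CE ranking matches the mean-profit ranking, mirroring the paper's robustness finding; at high CV the risk penalty can reverse it.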
Construction risk assessment of deep foundation pit in metro station based on G-COWA method
NASA Astrophysics Data System (ADS)
You, Weibao; Wang, Jianbo; Zhang, Wei; Liu, Fangmeng; Yang, Diying
2018-05-01
In order to obtain an accurate understanding of the construction safety of deep foundation pits in metro stations and to reduce the probability and loss of risk occurrence, a risk assessment method based on G-COWA is proposed. Firstly, drawing on specific engineering examples and the construction characteristics of deep foundation pits, an evaluation index system based on the five factors of "human, management, technology, material and environment" is established. Secondly, the C-OWA operator is introduced to weight the evaluation indices and weaken the negative influence of subjective expert preference. Gray cluster analysis and the fuzzy comprehensive evaluation method are combined to construct a construction risk assessment model for deep foundation pits, which can effectively handle the uncertainties involved. Finally, the model is applied to the actual deep foundation pit project of Qingdao Metro North Station: its construction risk rating is determined to be "medium", and the model is shown to be feasible and reasonable. Corresponding control measures are then put forward, providing a useful reference.
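The C-OWA weighting step mentioned above is commonly implemented with combination-number weights, w_j = C(n-1, j-1) / 2^(n-1), applied to descending-sorted expert scores so that extreme opinions receive the smallest weights. A minimal sketch under that reading; the expert scores are invented.

```python
# Hedged sketch of the C-OWA (combination-number weighted OWA) operator
# as commonly used to aggregate expert scores while damping outliers.
from math import comb

def c_owa_weights(n):
    """Weights w_j = C(n-1, j-1) / 2^(n-1), j = 1..n; they sum to 1."""
    return [comb(n - 1, j) / 2 ** (n - 1) for j in range(n)]

def c_owa(scores):
    """Sort scores descending, then take the combination-weighted sum."""
    ordered = sorted(scores, reverse=True)
    w = c_owa_weights(len(scores))
    return sum(wi * si for wi, si in zip(w, ordered))

expert_scores = [9.0, 7.0, 8.0, 2.0]   # the outlier 2.0 sits at a tail weight
value = c_owa(expert_scores)
```

For four experts the weights are [1/8, 3/8, 3/8, 1/8], so the two middle opinions dominate the aggregate.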
NASA System Safety Handbook. Volume 1; System Safety Framework and Concepts for Implementation
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Smith, Curtis; Stamatelatos, Michael; Youngblood, Robert
2011-01-01
System safety assessment is defined in NPR 8715.3C, NASA General Safety Program Requirements as a disciplined, systematic approach to the analysis of risks resulting from hazards that can affect humans, the environment, and mission assets. Achievement of the highest practicable degree of system safety is one of NASA's highest priorities. Traditionally, system safety assessment at NASA and elsewhere has focused on the application of a set of safety analysis tools to identify safety risks and formulate effective controls. Familiar tools used for this purpose include various forms of hazard analyses, failure modes and effects analyses, and probabilistic safety assessment (commonly also referred to as probabilistic risk assessment (PRA)). In the past, it has been assumed that to show that a system is safe, it is sufficient to provide assurance that the process for identifying the hazards has been as comprehensive as possible and that each identified hazard has one or more associated controls. The NASA Aerospace Safety Advisory Panel (ASAP) has made several statements in its annual reports supporting a more holistic approach. In 2006, it recommended that "... a comprehensive risk assessment, communication and acceptance process be implemented to ensure that overall launch risk is considered in an integrated and consistent manner." In 2009, it advocated for "... a process for using a risk-informed design approach to produce a design that is optimally and sufficiently safe." As a rationale for the latter advocacy, it stated that "... the ASAP applauds switching to a performance-based approach because it emphasizes early risk identification to guide designs, thus enabling creative design approaches that might be more efficient, safer, or both." For purposes of this preface, it is worth mentioning three areas where the handbook emphasizes a more holistic type of thinking.
First, the handbook takes the position that it is important to not just focus on risk on an individual basis but to consider measures of aggregate safety risk and to ensure wherever possible that there be quantitative measures for evaluating how effective the controls are in reducing these aggregate risks. The term aggregate risk, when used in this handbook, refers to the accumulation of risks from individual scenarios that lead to a shortfall in safety performance at a high level: e.g., an excessively high probability of loss of crew, loss of mission, planetary contamination, etc. Without aggregated quantitative measures such as these, it is not reasonable to expect that safety has been optimized with respect to other technical and programmatic objectives. At the same time, it is fully recognized that not all sources of risk are amenable to precise quantitative analysis and that the use of qualitative approaches and bounding estimates may be appropriate for those risk sources. Second, the handbook stresses the necessity of developing confidence that the controls derived for the purpose of achieving system safety not only handle risks that have been identified and properly characterized but also provide a general, more holistic means for protecting against unidentified or uncharacterized risks. For example, while it is not possible to be assured that all credible causes of risk have been identified, there are defenses that can provide protection against broad categories of risks and thereby increase the chances that individual causes are contained. Third, the handbook strives at all times to treat uncertainties as an integral aspect of risk and as a part of making decisions. The term "uncertainty" here does not refer to an actuarial type of data analysis, but rather to a characterization of our state of knowledge regarding results from logical and physical models that approximate reality. 
Uncertainty analysis finds how the output parameters of the models are related to plausible variations in the input parameters and in the modeling assumptions. The evaluation of uncertainties represents a method of probabilistic thinking wherein the analyst and decision makers recognize possible outcomes other than the outcome perceived to be "most likely." Without this type of analysis, it is not possible to determine the worth of an analysis product as a basis for making decisions related to safety and mission success. In line with these considerations the handbook does not take a hazard-analysis-centric approach to system safety. Hazard analysis remains a useful tool to facilitate brainstorming but does not substitute for a more holistic approach geared to a comprehensive identification and understanding of individual risk issues and their contributions to aggregate safety risks. The handbook strives to emphasize the importance of identifying the most critical scenarios that contribute to the risk of not meeting the agreed-upon safety objectives and requirements using all appropriate tools (including but not limited to hazard analysis). Thereafter, emphasis shifts to identifying the risk drivers that cause these scenarios to be critical and ensuring that there are controls directed toward preventing or mitigating the risk drivers. To address these and other areas, the handbook advocates a proactive, analytic-deliberative, risk-informed approach to system safety, enabling the integration of system safety activities with systems engineering and risk management processes. It emphasizes how one can systematically provide the necessary evidence to substantiate the claim that a system is safe to within an acceptable risk tolerance, and that safety has been achieved in a cost-effective manner.
The methodology discussed in this handbook is part of a systems engineering process and is intended to be integral to the system safety practices being conducted by the NASA safety and mission assurance and systems engineering organizations. The handbook posits that to conclude that a system is adequately safe, it is necessary to consider a set of safety claims that derive from the safety objectives of the organization. The safety claims are developed from a hierarchy of safety objectives and are therefore hierarchical themselves. Assurance that all the claims are true within acceptable risk tolerance limits implies that all of the safety objectives have been satisfied, and therefore that the system is safe. The acceptable risk tolerance limits are provided by the authority who must make the decision whether or not to proceed to the next step in the life cycle. These tolerances are therefore referred to as the decision maker's risk tolerances. In general, the safety claims address two fundamental facets of safety: 1) whether required safety thresholds or goals have been achieved, and 2) whether the safety risk is as low as possible within reasonable impacts on cost, schedule, and performance. The latter facet includes consideration of controls that are collective in nature (i.e., apply generically to broad categories of risks) and thereby provide protection against unidentified or uncharacterized risks.
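The aggregate-risk idea in this excerpt, accumulating individual scenario risks into one high-level measure such as probability of loss of crew, can be sketched in a few lines. The scenario probabilities, the independence assumption, and the 1-in-200 threshold below are illustrative, not handbook values.

```python
# Hedged sketch: aggregating independent scenario probabilities into a
# single high-level safety measure (e.g., P(loss of crew)); all numbers
# and the independence assumption are illustrative.
from math import prod

def aggregate_risk(scenario_probs):
    """P(at least one scenario occurs), assuming scenario independence."""
    return 1 - prod(1 - p for p in scenario_probs)

scenarios = [1e-3, 5e-4, 2e-3]      # individual loss-of-crew scenario estimates
p_loc = aggregate_risk(scenarios)
acceptable = p_loc <= 1 / 200        # example decision-maker risk tolerance
```

Comparing `p_loc` against a stated tolerance is the kind of quantitative aggregate check the handbook argues hazard-by-hazard controls cannot provide on their own.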
NASA Astrophysics Data System (ADS)
Mazza, Mirko
2015-12-01
Reinforced concrete (r.c.) framed buildings designed in compliance with inadequate seismic classifications and code provisions present in many cases a high vulnerability and need to be retrofitted. To this end, the insertion of a base isolation system allows a considerable reduction of the seismic loads transmitted to the superstructure. However, strong near-fault ground motions, which are characterised by long-duration horizontal pulses, may amplify the inelastic response of the superstructure and induce a failure of the isolation system. The above considerations point out the importance of checking the effectiveness of different isolation systems for retrofitting a r.c. framed structure. For this purpose, a numerical investigation is carried out with reference to a six-storey r.c. framed building which, originally designed as a fixed-base structure in compliance with the previous Italian code (DM96) for a medium-risk seismic zone, has to be retrofitted by insertion of an isolation system at the base for attaining the performance levels imposed by the current Italian code (NTC08) in a high-risk seismic zone. Besides the (fixed-base) original structure, three cases of base isolation are studied: elastomeric bearings acting alone (high-damping laminated rubber bearings, HDLRBs); an in-parallel combination of elastomeric and friction bearings (HDLRBs and steel-PTFE sliding bearings, SBs); and friction bearings acting alone (friction pendulum bearings, FPBs). The nonlinear analysis of the fixed-base and base-isolated structures subjected to horizontal components of near-fault ground motions is performed for checking plastic conditions at the potential critical (end) sections of the girders and columns as well as critical conditions of the isolation systems.
Unexpected high values of ductility demand are highlighted at the lower floors of all base-isolated structures, while re-centring problems of the base isolation systems under near-fault earthquakes are expected in case of friction bearings acting alone (i.e. FPBs) or that in combination (i.e. SBs) with HDLRBs.
Acoustic detection of Melolonthine larvae in Australian sugarcane
USDA-ARS?s Scientific Manuscript database
Decision support systems have been developed for risk analysis and control of root-feeding white grub pests in Queensland sugarcane, based partly on manual inspection of cane soil samples. Acoustic technology was considered as a potential alternative to this laborious procedure. Field surveys were...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, D. G.; Arent, D. J.; Johnson, L.
2006-06-01
This paper documents a probabilistic risk assessment of existing and alternative power supply systems at a large telecommunications office. The analysis characterizes the increase in the reliability of power supply through the use of two alternative power configurations. Failures in the power systems supporting major telecommunications service nodes are a main contributor to significant telecommunications outages. A logical approach to improving the robustness of telecommunication facilities is to increase the depth and breadth of technologies available to restore power during power outages. Distributed energy resources such as fuel cells and gas turbines could provide additional on-site electric power sources to provide backup power if batteries and diesel generators fail. The analysis is based on a hierarchical Bayesian approach and focuses on the failure probability associated with each of three possible facility configurations, along with assessment of the uncertainty or confidence level in the probability of failure. A risk-based characterization of the final best configuration is presented.
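The Bayesian comparison of configurations described above can be sketched with a simple Beta-Binomial stand-in: a Beta posterior over each configuration's per-demand failure probability, with uncertainty expressed as a sampled credible interval. The failure counts, demand counts, and configuration names are invented; the paper's hierarchical model is richer than this.

```python
# Hedged sketch: Beta-Binomial posterior over per-demand failure
# probability for each power configuration; all counts are illustrative.
import random

def posterior(failures, demands, a0=0.5, b0=0.5):
    """Jeffreys-prior Beta posterior parameters (a, b)."""
    return (a0 + failures, b0 + demands - failures)

def credible_interval(a, b, n=20000, seed=0):
    """95% credible interval by Monte Carlo sampling from Beta(a, b)."""
    rng = random.Random(seed)
    draws = sorted(rng.betavariate(a, b) for _ in range(n))
    return draws[int(0.025 * n)], draws[int(0.975 * n)]

configs = {
    "baseline (battery+diesel)": (4, 200),   # (failures, demands)
    "with fuel cell backup":     (1, 200),
}
for name, (f, n) in configs.items():
    a, b = posterior(f, n)
    lo, hi = credible_interval(a, b)
    mean = a / (a + b)           # posterior mean failure probability
```

Comparing posterior means alongside the width of each interval captures both the reliability gain and the confidence in it, as the abstract describes.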
Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance
NASA Astrophysics Data System (ADS)
Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra
2017-06-01
In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced, with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty, and risk. In this paper, the authors present a novel methodology for quantifying risk and ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, among other factors. The third, hazardous risk coefficient covers anticipated future hazards, with risk deduced from criteria of consequences on safety, environment, maintenance, and economics, together with the corresponding cost of consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence, random-number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful for minimizing financial losses and optimizing the timing of maintenance actions.
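The random-number simulation step can be sketched as Monte Carlo sampling of the three risk coefficients within their controlling ranges, summing them, and ranking items by the averaged final coefficient. The item names and coefficient ranges below are invented for illustration; the paper's fuzzy and cost-based derivations of each coefficient are not reproduced here.

```python
# Hedged sketch: Monte Carlo combination of three risk coefficients
# (likelihood, abstract, hazardous) per critical item; ranges illustrative.
import random

def simulate_final_coefficient(ranges, n=10000, seed=1):
    """Sample each coefficient uniformly in its range, sum, and average."""
    rng = random.Random(seed)
    totals = (sum(rng.uniform(lo, hi) for lo, hi in ranges) for _ in range(n))
    return sum(totals) / n

items = {  # (likelihood, abstract, hazardous) coefficient ranges per item
    "gearbox":  [(0.6, 0.8), (0.3, 0.5), (0.4, 0.7)],
    "coupling": [(0.2, 0.4), (0.2, 0.3), (0.1, 0.3)],
}
ranking = sorted(items, key=lambda k: simulate_final_coefficient(items[k]),
                 reverse=True)   # highest final coefficient first
```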
A Risk Stratification Model for Lung Cancer Based on Gene Coexpression Network and Deep Learning
2018-01-01
A risk stratification model for lung cancer based on gene expression profiles is of great interest. Instead of previous models based on individual prognostic genes, we aimed to develop a novel system-level risk stratification model for lung adenocarcinoma based on gene coexpression networks. Using multiple microarrays, gene coexpression network analysis was performed to identify survival-related networks. A deep-learning-based risk stratification model was constructed with representative genes of these networks. The model was validated in two test sets. Survival analysis was performed using the output of the model to evaluate whether it could predict patients' survival independent of clinicopathological variables. Five networks were significantly associated with patients' survival. Considering prognostic significance and representativeness, genes of the two survival-related networks were selected as input to the model. The output of the model was significantly associated with patients' survival in the training set and both test sets (p < 0.00001, p < 0.0001, and p = 0.02 for the training set and test sets 1 and 2, respectively). In multivariate analyses, the model was associated with patients' prognosis independent of other clinicopathological features. Our study presents a new perspective on incorporating gene coexpression networks into the gene expression signature, and on the clinical application of deep learning in genomic data science for prognosis prediction. PMID:29581968
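The coexpression step can be sketched as follows: build a gene-gene network from expression correlations, take each module's most-connected (hub) gene as its representative, and feed the representatives to a risk model. The data are synthetic, and a simple linear score stands in for the paper's deep model; gene indices and weights are invented.

```python
# Hedged sketch: correlation-based coexpression network, hub-gene
# selection, and a linear stand-in for the deep risk model.
import numpy as np

expr = np.array([                      # rows: genes g0..g3, cols: 6 samples
    [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],    # g0
    [1.1, 2.1, 2.9, 4.2, 5.1, 5.9],    # g1 (tracks g0)
    [6.0, 5.0, 4.0, 3.0, 2.0, 1.0],    # g2 (anti-correlated with g0)
    [2.0, 2.0, 5.0, 1.0, 4.0, 3.0],    # g3 (unrelated)
])
corr = np.corrcoef(expr)
adj = (np.abs(corr) > 0.8) & ~np.eye(4, dtype=bool)   # coexpression edges
degree = adj.sum(axis=1)
hub = int(degree.argmax())             # most-connected gene = representative

def risk_score(sample_expr, genes=(0, 2), weights=(0.7, -0.4)):
    """Linear stand-in for the deep model over representative genes."""
    return float(sum(w * sample_expr[g] for g, w in zip(genes, weights)))

score = risk_score(expr[:, 0])         # risk score for the first sample
```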
Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination
NASA Technical Reports Server (NTRS)
Groen, Frank
2010-01-01
This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.
Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin
2015-12-15
As a random event, a natural disaster has a complex occurrence mechanism, and comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Given the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, while the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays a foundation for the prediction and warning of other natural disasters. Copyright © 2015 Elsevier B.V. All rights reserved.
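The copula-based joint return period can be sketched in the bivariate case: a Frank copula joins the marginal non-exceedance probabilities, and the "OR" and "AND" return periods follow from it. The dependence parameter theta, the marginal levels, and the mean interarrival time below are illustrative, not values fitted to the dust-storm data.

```python
# Hedged sketch: bivariate Frank copula and the joint return periods it
# implies; theta and the marginal probabilities are illustrative.
import math

def frank_copula(u, v, theta):
    """Bivariate Frank copula C(u, v); theta != 0 sets the dependence."""
    num = (math.exp(-theta * u) - 1) * (math.exp(-theta * v) - 1)
    return -math.log(1 + num / (math.exp(-theta) - 1)) / theta

def joint_return_periods(u, v, theta, mu=1.0):
    """mu = mean interarrival time (years per event)."""
    c = frank_copula(u, v, theta)
    t_or = mu / (1 - c)                 # either factor exceeds its level
    t_and = mu / (1 - u - v + c)        # both factors exceed their levels
    return t_or, t_and

# 10-year marginal levels (u = v = 0.9) with moderate positive dependence
t_or, t_and = joint_return_periods(0.9, 0.9, theta=5.0)
```

As expected, the "OR" joint return period falls below the 10-year marginal period and the "AND" period rises above it.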
Xu, Rong; Wang, QuanQiu; Li, Li
2015-01-01
Dietary intakes of red meat and fat are established risk factors for both colorectal cancer (CRC) and cardiovascular diseases (CVDs). Recent studies have shown a mechanistic link between TMAO, an intestinal microbial metabolite of red meat and fat, and risk of CVDs. Data linking TMAO directly to CRC are, however, lacking. Here, we present an unbiased, data-driven, network-based systems approach to uncover a potential genetic relationship between TMAO and CRC. We constructed two different epigenetic interaction networks (EINs) using chemical-gene, disease-gene and protein-protein interaction data from multiple large-scale data resources. We developed a network-based ranking algorithm to ascertain TMAO-related diseases from the EINs. We systematically analyzed disease categories among TMAO-related diseases at different ranking cutoffs. We then determined which genetic pathways were associated with both TMAO and CRC. We show that CVDs and their major risk factors were ranked highly among TMAO-related diseases, confirming the newly discovered mechanistic link between CVDs and TMAO and thus validating our algorithms. CRC was ranked highly among TMAO-related diseases retrieved from both EINs (top 0.02%, #1 out of 4,372 diseases retrieved based on Mendelian genetics, and top 10.9% among 882 diseases based on genome-wide association genetics), providing strong supporting evidence for our hypothesis that TMAO is genetically related to CRC. We have also identified putative genetic pathways that may link TMAO to CRC, which warrants further investigation. Through systematic disease enrichment analysis, we also demonstrated that TMAO is related to metabolic syndromes and cancers in general. Our genome-wide analysis demonstrates that systems approaches to studying the epigenetic interactions among diet, microbiome metabolism, and disease genetics hold promise for understanding disease pathogenesis. Our results show that TMAO is genetically associated with CRC.
This study suggests that TMAO may be an important intermediate marker linking dietary meat and fat and gut microbiota metabolism to risk of CRC, underscoring opportunities for the development of new gut microbiome-dependent diagnostic tests and therapeutics for CRC.
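The network-based ranking step described above can be sketched as a random walk with restart over a toy interaction graph; the graph, seed choice, and parameters below are illustrative assumptions, not the authors' actual EINs or algorithm:

```python
def rank_by_rwr(graph, seeds, restart=0.3, iters=100):
    """Random walk with restart: score nodes by proximity to seed nodes."""
    nodes = list(graph)
    p0 = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    p = dict(p0)
    for _ in range(iters):
        # Each step: restart mass returns to the seeds, the rest diffuses
        # uniformly along edges.
        nxt = {n: restart * p0[n] for n in nodes}
        for n in nodes:
            nbrs = graph[n]
            if not nbrs:
                continue
            share = (1.0 - restart) * p[n] / len(nbrs)
            for m in nbrs:
                nxt[m] += share
        p = nxt
    return sorted(nodes, key=p.get, reverse=True)

# Hypothetical chemical-gene-disease graph: TMAO links to two genes shared
# with CVD; CRC shares one gene; an unrelated disease shares none.
graph = {
    "TMAO": ["GENE1", "GENE2"],
    "GENE1": ["TMAO", "CVD", "CRC"],
    "GENE2": ["TMAO", "CVD"],
    "CVD": ["GENE1", "GENE2"],
    "CRC": ["GENE1"],
    "OTHER": ["GENE3"],
    "GENE3": ["OTHER"],
}
ranking = rank_by_rwr(graph, seeds=["TMAO"])
```

On this toy graph the diseases sharing more genes with the seed rank higher, mirroring the abstract's observation that CVDs rank above CRC, which ranks above unrelated diseases.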
WE-B-BRC-01: Current Methodologies in Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rath, F.
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for the use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology-specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define the risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: (1) Description of risk assessment methodologies used in healthcare and industry; (2) Discussion of radiation oncology-specific risk assessment strategies and issues; (3) Evaluation of risk in the context of medical imaging and image quality. Disclosure: E. Samei: Research grants from Siemens and GE.
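The FMEA component discussed in this session can be illustrated with a minimal risk priority number (RPN) calculation; the failure modes and scores below are invented for illustration, not drawn from TG-100:

```python
def rpn(severity, occurrence, detectability):
    """Classic FMEA risk priority number on 1-10 scales."""
    for v in (severity, occurrence, detectability):
        if not 1 <= v <= 10:
            raise ValueError("FMEA scores must lie in 1..10")
    return severity * occurrence * detectability

# Hypothetical failure modes: (description, severity, occurrence, detectability).
failure_modes = [
    ("wrong patient selected", 9, 2, 3),
    ("plan transferred incorrectly", 8, 3, 4),
    ("imaging protocol suboptimal", 5, 5, 6),
]

# Rank failure modes by RPN, highest priority first.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
```

Note how a moderate-severity but frequent and hard-to-detect failure can outrank a severe but rare, easily caught one, which is the usual argument for scoring all three axes rather than severity alone.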
WE-B-BRC-03: Risk in the Context of Medical Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samei, E.
Choi, Young Jun; Baek, Jung Hwan; Shin, Jung Hee; Shim, Woo Hyun; Kim, Seon-Ok; Lee, Won-Hong; Song, Dong Eun; Kim, Tae Yong; Chung, Ki-Wook; Lee, Jeong Hyun
2018-05-13
The purpose of this study was to construct a web-based predictive model using ultrasound characteristics and subcategorized biopsy results for thyroid nodules of atypia of undetermined significance/follicular lesion of undetermined significance (AUS/FLUS) to stratify the risk of malignancy. Data included 672 thyroid nodules from 656 patients from a historical cohort. We analyzed ultrasound images of thyroid nodules and biopsy results according to nuclear atypia and architectural atypia. Multivariate logistic regression analysis was performed to predict whether nodules were diagnosed as malignant or benign. The ultrasound features, including spiculated margin, marked hypoechogenicity, and calcifications, as well as biopsy results and cytologic atypia, showed significant differences between groups. A 13-point risk scoring system was developed, and the areas under the receiver operating characteristic (ROC) curves for the development and validation sets were 0.837 and 0.830, respectively (http://www.gap.kr/thyroidnodule_b3.php). We devised a web-based predictive model using the combined information of ultrasound characteristics and biopsy results for AUS/FLUS thyroid nodules to stratify the risk of malignancy. © 2018 Wiley Periodicals, Inc.
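A point-based risk score of this kind can be sketched as follows; the point weights and band cutoffs are hypothetical stand-ins, not the published 13-point system (which is available at the URL above):

```python
# Assumed per-feature points, chosen so the maximum total is 13 for
# illustration only; the real weights come from the fitted regression model.
POINTS = {
    "spiculated_margin": 3,
    "marked_hypoechogenicity": 3,
    "calcifications": 2,
    "nuclear_atypia": 3,
    "architectural_atypia": 2,
}

def risk_score(features):
    """Sum the points for the ultrasound/cytology features that are present."""
    return sum(POINTS[f] for f in features)

def risk_band(score, low=4, high=8):
    """Map a 0-13 score into coarse malignancy-risk bands (cutoffs assumed)."""
    if score < low:
        return "low"
    return "intermediate" if score < high else "high"
```

A nodule with a spiculated margin and calcifications would score 5 under these assumed weights and fall in the intermediate band.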
Araki, Tadashi; Ikeda, Nobutaka; Shukla, Devarshi; Jain, Pankaj K; Londhe, Narendra D; Shrivastava, Vimal K; Banchhor, Sumit K; Saba, Luca; Nicolaides, Andrew; Shafique, Shoaib; Laird, John R; Suri, Jasjit S
2016-05-01
Percutaneous coronary interventional procedures need advance planning prior to stenting or an endarterectomy. Cardiologists use intravascular ultrasound (IVUS) for screening, risk assessment and stratification of coronary artery disease (CAD). We hypothesize that plaque components are vulnerable to rupture due to plaque progression. Currently, there are no standard grayscale IVUS tools for risk assessment of plaque rupture. This paper presents a novel strategy for risk stratification based on plaque morphology, embedded with principal component analysis (PCA) for plaque feature dimensionality reduction and a dominant-feature selection technique. The risk assessment utilizes 56 grayscale coronary features in a machine learning framework while linking information from carotid and coronary plaque burdens due to their common genetic makeup. This system consists of a machine learning paradigm which uses a support vector machine (SVM) combined with PCA for optimal and dominant coronary artery morphological feature extraction. The carotid artery intima-media thickness (cIMT) biomarker is adopted as the gold standard during the training phase of the machine learning system. For the performance evaluation, a K-fold cross-validation protocol is adopted with 20 trials per fold. For choosing the dominant features out of the 56 grayscale features, a polling strategy of PCA is adopted in which the original values of the features are unaltered. Different protocols are designed for establishing the stability and reliability criteria of the coronary risk assessment system (cRAS). Using the PCA-based machine learning paradigm and cross-validation protocol, a classification accuracy of 98.43% (AUC 0.98) with K=10 folds using an SVM radial basis function (RBF) kernel was achieved. A reliability index of 97.32% and machine learning stability criteria of 5% were met for the cRAS.
This is the first computer-aided diagnosis (CADx) system of its kind that is able to demonstrate the ability of coronary risk assessment and stratification while demonstrating a successful design of the machine learning system based on our assumptions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
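The PCA polling idea, selecting dominant features by first-component loadings while leaving the original feature values unaltered, can be sketched in pure Python on toy data; the data and feature count below are illustrative, not the 56-feature cRAS pipeline:

```python
def first_pc_loadings(rows, iters=200):
    """First principal component of mean-centered data, via power iteration
    on the sample covariance matrix."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    # Sample covariance matrix (d x d).
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

# Toy feature matrix: columns 0 and 1 vary strongly together; column 2 barely.
data = [[2.0, 4.1, 0.1], [4.0, 8.2, 0.0], [6.0, 11.9, 0.1], [8.0, 16.1, 0.2]]
loadings = first_pc_loadings(data)
# "Polling": order features by absolute first-component loading; the selected
# features are then used with their original (unaltered) values.
dominant = sorted(range(3), key=lambda j: abs(loadings[j]), reverse=True)
```

The feature with the largest loading dominates the ranking, so a downstream classifier (an SVM in the paper) can be trained on the top-ranked raw features rather than on transformed components.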
NASA Astrophysics Data System (ADS)
Valach, J.; Cacciotti, R.; Kuneš, P.; Čerňanský, M.; Bláha, J.
2012-04-01
The paper presents a project aiming to develop a knowledge-based system for the documentation and analysis of defects in cultural heritage objects and monuments. The MONDIS information system concentrates knowledge on damage to immovable structures due to various causes, and on the preventive or remedial actions performed to protect or repair them, where possible. The system under development is intended to provide an understanding of the causal relationships between a defect, the materials, the external load, and the environment of the built object. The foundation for the knowledge-based system will be the systemized and formalized knowledge on defects and their mitigation acquired through the analysis of a representative set of cases documented in the past. On the basis of comparability of design, technologies used, materials, and the nature of the external forces and surroundings, the developed software system has the capacity to indicate the most likely risks of new defects occurring or existing ones extending. The system will also allow a comparison of an actual failure with similar documented cases and will propose a suitable technical intervention plan. The system will provide conservationists, administrators and owners of historical objects with a toolkit for documenting defects in their objects. Advanced artificial intelligence methods will also offer the accumulated knowledge to users and enable them to orient themselves among relevant techniques of preventive interventions and reconstructions based on similarity with their own case.
Avaliani, S L; Novikov, S M; Shashina, T A; Dodina, N S; Kislitsin, V A; Mishina, A L
2014-01-01
The lack of an adequate legislative and regulatory framework for minimizing health risks in the field of environmental protection is an obstacle to the application of the risk analysis methodology as a leading tool for administrative activity in Russia. The "Principles of the state policy in the sphere of ensuring chemical and biological safety of the Russian Federation for the period up to 2025 and beyond", approved by the President of the Russian Federation on 1 November 2013, No. Pr-2573, are aimed at providing legal support for the health risk analysis methodology. This article proposes the main stages of operative control of environmental quality that lead to the reduction of health risk to an acceptable level. Further improvement of the health risk analysis methodology in Russia should contribute to the implementation of the state policy in the sphere of chemical and biological safety through the introduction of complex measures to neutralize chemical and biological threats to human health and the environment, as well as evaluation of the economic effectiveness of these measures. The primary step should be to fix in legislation a quantitative value for the term "acceptable risk".
A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.
Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet
2018-01-01
Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.
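The core quantification can be illustrated as an incremental benefit-minus-risk tally against the comparator arm; the numbers and the common-scale conversion below are hypothetical, not the framework's actual parameterization:

```python
def net_benefit_risk(benefit_tx, benefit_ctrl, risk_tx, risk_ctrl):
    """Incremental benefit minus incremental risk versus the comparator,
    with both already expressed on a common scale."""
    return (benefit_tx - benefit_ctrl) - (risk_tx - risk_ctrl)

# Hypothetical trial: +6 months of median PFS gained versus control, against
# +1.5 "month-equivalents" of harm from added toxicity (the unit conversion
# is an assessor's choice, assumed here for illustration).
net = net_benefit_risk(benefit_tx=18.0, benefit_ctrl=12.0,
                       risk_tx=3.5, risk_ctrl=2.0)
decision = "approvable" if net > 0 else "not approvable"
```

A sensitivity analysis, as done for ixazomib in the paper, would re-run this tally across plausible ranges of each input to see whether the sign of the net result is stable.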
Methods of quantitative risk assessment: The case of the propellant supply system
NASA Astrophysics Data System (ADS)
Merz, H. A.; Bienz, A.
1984-08-01
As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered the replacement of the manual supply of propellant hoppers by a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level, where an accidental ignition would no longer lead to a detonation, and this would drastically limit the effects on persons. A quantitative risk assessment of the present and the planned supply system demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively by other safety measures at considerably lower costs. Based on this practical example, the advantages of a strictly quantitative risk assessment for the safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.
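The quantitative comparison of the two supply systems can be sketched as a frequency-times-consequence summation; the scenario frequencies, consequences, and costs below are invented for illustration, not the study's data:

```python
def collective_risk(scenarios):
    """Expected annual harm: sum of frequency x consequence over scenarios."""
    return sum(freq * consequence for _, freq, consequence in scenarios)

# Hypothetical scenario sets: (description, events/year, persons affected).
manual = [
    ("hopper ignition, full propellant load", 1e-3, 40),
    ("minor fire", 1e-2, 1),
]
pneumatic = [
    ("hopper ignition, reduced propellant load", 1e-3, 2),
    ("minor fire", 1e-2, 1),
]

risk_reduction = collective_risk(manual) - collective_risk(pneumatic)
# Hypothetical system cost, to weigh against the achieved risk reduction.
cost_effectiveness = 5_000_000 / risk_reduction
```

The paper's conclusion follows this logic: if the cost per unit of risk reduced is very high, cheaper measures achieving a similar reduction are preferable.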
Hydra: A web-based system for cardiovascular analysis, diagnosis and treatment.
Novo, J; Hermida, A; Ortega, M; Barreira, N; Penedo, M G; López, J E; Calvo, C
2017-02-01
Cardiovascular (CV) risk stratification is a highly complex process involving an extensive set of clinical trials to support the clinical decision-making process. There are many clinical conditions (e.g. diabetes, obesity, stress, etc.) that can lead to the early diagnosis or establishment of cardiovascular disease. In order to determine all these clinical conditions, a complete set of clinical patient analyses is typically performed, including a physical examination, blood analysis, electrocardiogram, blood pressure (BP) analysis, etc. This article presents a web-based system, called Hydra, which integrates a full and detailed set of services and functionalities for clinical decision support in order to help and improve the work of clinicians in cardiovascular patient diagnosis, risk assessment, treatment and monitoring over time. Hydra integrates a number of different services: a service for inputting all the information gathered by specialists (physical examination, habits, BP, blood analysis, electrocardiogram, etc.); a tool to automatically determine the CV risk stratification, including well-known standard risk stratification tables; and, finally, various tools to incorporate, analyze and graphically present the records of the ambulatory BP monitoring that provides BP analysis over a given period of time (24 or 48 hours). In addition, the platform presents a set of reports derived from all the information gathered from the patient in order to support physicians in their clinical decisions. Hydra was tested and validated in a real domain. In particular, internal medicine specialists at the Hypertension Unit of the Santiago de Compostela University Hospital (CHUS) validated the platform and used it in different clinical studies to demonstrate its utility. It was observed that the platform increased productivity and accuracy in the assessment of patient data yielding a cost reduction in clinical practice. 
This paper proposes a complete platform that includes different services for cardiovascular clinical decision support. It runs as a web-based application to facilitate its use by clinicians, who can access the platform from any remote computer with Internet access. Hydra also includes different automated methods to facilitate the physicians' work and avoid potential errors in the analysis of patient data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Application of a web-based Decision Support System in risk management
NASA Astrophysics Data System (ADS)
Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri
2013-04-01
Risk information is increasingly widely available thanks to advanced technologies such as Earth observation satellites and global positioning technologies, coupled with hazard modeling and analysis and geographical information systems (GIS). Yet even when such information exists, no action will follow unless it is properly presented to decision makers. This information needs to be communicated clearly, and its usefulness demonstrated, so that people can make better-informed decisions. Communicating available risk information has therefore become an important challenge, and decision support systems are one significant approach that can help not only in presenting risk information to decision makers but also in making efficient decisions while reducing the human resources and time needed. In this study, the conceptual framework of an internet-based decision support system is presented to highlight its important role in the risk management framework and how it can be applied in the chosen case study areas. The main purpose of the proposed system is to make available risk information usable for risk reduction by taking into account changes in climate, land use and socio-economic conditions along with the risk scenarios. It allows the users to formulate, compare and select risk reduction scenarios (mainly for floods and landslides) through an enhanced participatory platform with diverse stakeholders involved in the decision-making process. It is based on a three-tier (client-server) architecture which integrates web-GIS and DSS functionalities together with cost-benefit analysis and other supporting tools. Embedding web-GIS enables end users to make better plans and informed decisions referenced to a geographical location, one of the essential factors in disaster risk reduction programs.
Different risk reduction measures of a specific area (local scale) will be evaluated using this web-GIS tool, the available risk scenarios obtained from a Probabilistic Risk Assessment (PRA) model, and the knowledge collected from experts. The visualization of the risk reduction scenarios can also be shared among the users on the web to support the online participatory process. In addition, cost-benefit ratios of the different risk reduction scenarios can be prepared to serve as inputs for high-level decision makers. The most appropriate risk reduction scenarios will be chosen using the Multi-Criteria Evaluation (MCE) method by weighting different parameters according to the preferences and criteria defined by the users. The role of public participation has been changing from one-way communication between authorities, experts, stakeholders and citizens towards more intensive two-way interaction. Involving the affected public and interest groups can enhance the level of legitimacy, transparency, and confidence in the decision-making process. Because of this important role in decision making, an online participatory tool is included in the DSS to allow the involved stakeholders to take part interactively in risk reduction and to be aware of the existing vulnerability conditions of the community. Moreover, it aims to achieve a more transparent and better informed decision-making process. The system is still under development; the first tools implemented will be presented, showing the wide possibilities of new web technologies, which can have a great impact on the decision-making process. It will be applied in four pilot areas in Europe: the French Alps, north-eastern Italy, Romania and Poland. Nevertheless, the framework will be designed and implemented in a way that is applicable to other regions.
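The MCE weighting step described above can be sketched as a simple weighted-sum evaluation; the criteria, weights, and scenario scores below are hypothetical:

```python
def mce_score(scores, weights):
    """Weighted sum of normalized criterion scores (weights must sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * s for c, s in scores.items())

# User-defined criteria weights (assumed values).
weights = {"cost_benefit": 0.5, "feasibility": 0.3, "acceptance": 0.2}

# Candidate risk reduction scenarios scored 0..1 per criterion (assumed).
scenarios = {
    "dike raising":  {"cost_benefit": 0.6, "feasibility": 0.9, "acceptance": 0.7},
    "relocation":    {"cost_benefit": 0.9, "feasibility": 0.3, "acceptance": 0.2},
    "early warning": {"cost_benefit": 0.7, "feasibility": 0.8, "acceptance": 0.9},
}
best = max(scenarios, key=lambda s: mce_score(scenarios[s], weights))
```

Changing the weights, as different stakeholders would in the participatory process, can change which scenario ranks first; that sensitivity is precisely what the platform aims to make visible.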
Modelling Risk to US Military Populations from Stopping Blanket Mandatory Polio Vaccination
Burgess, Andrew
2017-01-01
Objectives: Transmission of polio poses a threat to military forces when deploying to regions where such viruses are endemic. US-born soldiers generally enter service with immunity resulting from childhood immunization against polio; moreover, new recruits are routinely vaccinated with inactivated poliovirus vaccine (IPV), supplemented based upon deployment circumstances. Given residual protection from childhood vaccination, risk-based vaccination may sufficiently protect troops from polio transmission. Methods: This analysis employed a mathematical model of polio transmission within military populations interacting with locals in a polio-endemic region to evaluate changes in vaccination policy. Results: Removal of blanket immunization had no effect on simulated polio incidence among deployed military populations when risk-based immunization was employed; however, when these individuals reintegrated with their base populations, risk of transmission to nondeployed personnel increased by 19%. In the absence of both blanket- and risk-based immunization, transmission to nondeployed populations increased by 25%. The overall number of new infections among nondeployed populations was negligible for both scenarios due to high childhood immunization rates, partial protection against transmission conferred by IPV, and low global disease incidence levels. Conclusion: Risk-based immunization driven by deployment to polio-endemic regions is sufficient to prevent transmission among both deployed and nondeployed US military populations. PMID:29104608
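A deterministic compartmental step of the broad kind used in such transmission models might look as follows; the parameters and population sizes are notional, and the study's actual model additionally couples deployed troops to an endemic local population:

```python
def sir_step(s, i, r, beta, gamma, vacc_rate=0.0):
    """One time step of S-I-R dynamics with ongoing vaccination of
    susceptibles (vaccinated individuals move directly to R)."""
    n = s + i + r
    new_inf = beta * s * i / n   # new infections this step
    new_rec = gamma * i          # recoveries this step
    new_vac = vacc_rate * s      # susceptibles vaccinated this step
    return (s - new_inf - new_vac,
            i + new_inf - new_rec,
            r + new_rec + new_vac)

# Compare one step with and without risk-based vaccination (notional values).
no_vacc = sir_step(900.0, 10.0, 90.0, beta=0.3, gamma=0.1)
with_vacc = sir_step(900.0, 10.0, 90.0, beta=0.3, gamma=0.1, vacc_rate=0.05)
```

Iterating such steps over a deployment period, and moving individuals between deployed and base populations, is how policy changes like dropping blanket immunization can be evaluated in simulation.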
Oakes, Benjamin Donald; Mattsson, Lars-Göran; Näsman, Per; Glazunov, Andrés Alayón
2018-06-01
Modern infrastructures are becoming increasingly dependent on electronic systems, leaving them more vulnerable to electrical surges or electromagnetic interference. Electromagnetic disturbances appear in nature, e.g., lightning and solar wind; however, they may also be generated by man-made technology to maliciously damage or disturb electronic equipment. This article presents a systematic risk assessment framework for identifying possible, consequential, and plausible intentional electromagnetic interference (IEMI) attacks on an arbitrary distribution network infrastructure. In the absence of available data on IEMI occurrences, we find that a systems-based risk assessment is more useful than a probabilistic approach. We therefore modify the often applied definition of risk, i.e., a set of triplets containing scenario, probability, and consequence, to a set of quadruplets: scenario, resource requirements, plausibility, and consequence. Probability is "replaced" by resource requirements and plausibility, where the former is the minimum amount and type of equipment necessary to successfully carry out an attack scenario and the latter is a subjective assessment of the extent of the existence of attackers who possess the motivation, knowledge, and resources necessary to carry out the scenario. We apply the concept of intrusion areas and classify electromagnetic source technology according to key attributes. Worst-case scenarios are identified for different quantities of attacker resources. The most plausible and consequential of these are deemed the most important scenarios and should provide useful decision support in a countermeasures effort. Finally, an example of the proposed risk assessment framework, based on notional data, is provided on a hypothetical water distribution network. © 2017 Society for Risk Analysis.
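The quadruplet-based risk definition can be captured in a small data structure and ranked by plausibility and consequence; the scenarios, resources, and scores below are notional, echoing the article's own use of notional data:

```python
from dataclasses import dataclass

@dataclass
class RiskQuadruplet:
    """Scenario, resource requirements, plausibility, consequence: the
    quadruplet replacing the classic probability-based risk triplet."""
    scenario: str
    resources: str      # minimum equipment needed to mount the attack
    plausibility: int   # subjective 1-5 scale
    consequence: int    # 1-5 scale

quadruplets = [
    RiskQuadruplet("EMI at pump-station SCADA", "portable jammer", 4, 3),
    RiskQuadruplet("HPM pulse at control center", "van-mounted HPM source", 2, 5),
    RiskQuadruplet("surge via exposed cabling", "off-the-shelf surge gear", 5, 2),
]

# Rank: the most plausible AND consequential scenarios first, to prioritize
# countermeasure effort.
ranked = sorted(quadruplets, key=lambda q: q.plausibility * q.consequence,
                reverse=True)
```

The product used as the sort key is one simple way to combine the two axes; the framework itself leaves the combination to the analyst's judgment.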
Assurance of Fault Management: Risk-Significant Adverse Condition Awareness
NASA Technical Reports Server (NTRS)
Fitz, Rhonda
2016-01-01
Fault Management (FM) systems are ranked high in risk-based assessment of criticality within flight software, emphasizing the importance of establishing highly competent domain expertise to provide assurance for NASA projects, especially as spaceflight systems continue to increase in complexity. Insight into specific characteristics of FM architectures seen embedded within safety- and mission-critical software systems analyzed by the NASA Independent Verification and Validation (IV&V) Program has been enhanced with an FM Technical Reference (TR) suite. Benefits are aimed beyond the IV&V community to those that seek ways to efficiently and effectively provide software assurance to reduce the FM risk posture of NASA and other space missions. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. The role FM has with regard to overall asset protection of flight software systems is being addressed with the development of an adverse condition (AC) database encompassing flight software vulnerabilities. Identification of potential off-nominal conditions, and analysis to determine how a system responds to these conditions, are important aspects of hazard analysis and fault management. Understanding what ACs the mission may face, and ensuring they are prevented or addressed, is the responsibility of the assurance team, which necessarily should have insight into ACs beyond those defined by the project itself. Research efforts sponsored by NASA's Office of Safety and Mission Assurance defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions.
This prototype tool helps projects improve analysis by tracking ACs, and allowing queries based on project, mission type, domain component, causal fault, and other key characteristics. The repository has a firm structure, initial collection of data, and an interface established for informational queries, with plans for integration within the Enterprise Architecture at NASA IV&V, enabling support and accessibility across the Agency. The development of an improved workflow process for adaptive, risk-informed FM assurance is currently underway.
Evaluating the risk of water distribution system failure: A shared frailty model
NASA Astrophysics Data System (ADS)
Clark, Robert M.; Thurnau, Robert C.
2011-12-01
Condition assessment (CA) modeling is drawing increasing interest as a technique that can assist in managing drinking water infrastructure. This paper develops a model based on the application of a Cox proportional hazard (PH)/shared frailty model and applies it to evaluating the risk of failure in drinking water networks using data from the Laramie Water Utility (located in Laramie, Wyoming, USA). Using the risk model, a cost/benefit analysis incorporating the inspection value method (IVM) is used to assist in making improved repair, replacement and rehabilitation decisions for selected drinking water distribution system pipes. A separate model is developed to predict failures in prestressed concrete cylinder pipe (PCCP). Various currently available inspection technologies are presented and discussed.
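The Cox PH/shared frailty scoring can be sketched as a relative-hazard computation in which pipes in the same zone share one frailty multiplier; the coefficients, covariates, and frailties below are invented for illustration, whereas the paper estimates them from utility break data:

```python
import math

def relative_hazard(covariates, coefficients, frailty=1.0):
    """Cox-style relative hazard: z * exp(beta . x), with z the shared
    frailty of the pipe's group (e.g., neighborhood or soil zone)."""
    lin = sum(coefficients[k] * v for k, v in covariates.items())
    return frailty * math.exp(lin)

# Assumed coefficients: hazard rises with age and prior breaks, falls
# slightly with diameter.
coeffs = {"age_decades": 0.35, "prior_breaks": 0.6, "diameter_in": -0.05}
zone_frailty = {"north": 1.4, "south": 0.8}  # shared within each zone

pipe_a = relative_hazard({"age_decades": 6, "prior_breaks": 2, "diameter_in": 8},
                         coeffs, zone_frailty["north"])
pipe_b = relative_hazard({"age_decades": 3, "prior_breaks": 0, "diameter_in": 12},
                         coeffs, zone_frailty["south"])
```

Ranking pipes by relative hazard is what feeds the downstream inspection-value cost/benefit decision: inspect or rehabilitate the highest-hazard pipes first.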
Smart Extraction and Analysis System for Clinical Research.
Afzal, Muhammad; Hussain, Maqbool; Khan, Wajahat Ali; Ali, Taqdir; Jamshed, Arif; Lee, Sungyoung
2017-05-01
With the increasing use of electronic health records (EHRs), there is a growing need to expand the utilization of EHR data to support clinical research. The key challenge in achieving this goal is the unavailability of smart systems and methods to overcome the issue of data preparation, structuring, and sharing for smooth clinical research. We developed a robust analysis system called the smart extraction and analysis system (SEAS) that consists of two subsystems: (1) the information extraction system (IES), for extracting information from clinical documents, and (2) the survival analysis system (SAS), for a descriptive and predictive analysis to compile the survival statistics and predict the future chance of survivability. The IES subsystem is based on a novel permutation-based pattern recognition method that extracts information from unstructured clinical documents. Similarly, the SAS subsystem is based on a classification and regression tree (CART)-based prediction model for survival analysis. SEAS is evaluated and validated on a real-world case study of head and neck cancer. The overall information extraction accuracy of the system for semistructured text is recorded at 99%, while that for unstructured text is 97%. Furthermore, the automated, unstructured information extraction has reduced the average time spent on manual data entry by 75%, without compromising the accuracy of the system. Moreover, around 88% of patients are found in a terminal or dead state for the highest clinical stage of disease (level IV). Similarly, there is an ∼36% probability of a patient being alive if at least one of the lifestyle risk factors was positive. We presented our work on the development of SEAS to replace costly and time-consuming manual methods with smart automatic extraction of information and survival prediction methods. SEAS has reduced the time and energy of human resources spent unnecessarily on manual tasks.
NASA Astrophysics Data System (ADS)
Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko
2002-05-01
The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures, and to optimize the distribution of security across every portion of the medical practice. Where possible, quantitative expression is introduced to enable simplified follow-up security procedures and easy evaluation of security outcomes or results. System analysis using fault tree analysis (FTA) showed that subdividing system elements into detailed groups yields a much more accurate analysis. Such subdivided composition factors greatly depend on the behavior of staff, interactive terminal devices, the kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security and proposed security measures for each medical information system, together with the basic events, and combinations of events, that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, a number of elements for each composition factor, and potential security measures were found. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors for basic events.
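The FTA step can be illustrated with a minimal fault-tree evaluator over AND/OR gates, assuming independent basic events; the events and probabilities below are placeholders, not the study's data:

```python
def ft_prob(node, probs):
    """Probability of a fault-tree node. A node is either a leaf name (basic
    event) or a tuple ('AND'|'OR', [children])."""
    if isinstance(node, str):
        return probs[node]
    gate, children = node
    p = [ft_prob(c, probs) for c in children]
    if gate == "AND":
        out = 1.0
        for q in p:
            out *= q
        return out
    # OR of independent events: 1 - prod(1 - p_i)
    out = 1.0
    for q in p:
        out *= (1.0 - q)
    return 1.0 - out

# Hypothetical basic-event probabilities per analysis period.
probs = {"staff_error": 0.05, "terminal_fault": 0.02, "network_breach": 0.01}

# Top event (e.g., information leak): a breach, OR a staff error AND a
# terminal fault occurring together.
top = ("OR", ["network_breach", ("AND", ["staff_error", "terminal_fault"])])
leak_risk = ft_prob(top, probs)
```

Subdividing composition factors, as the abstract describes, corresponds to replacing a coarse leaf with a finer subtree, which is what makes the analysis more accurate.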
Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?
Hey, Spencer Phillips; Kimmelman, Jonathan
2016-10-01
Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable direct and effective measurement of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach. © 2016 John Wiley & Sons Ltd.
Extended GTST-MLD for aerospace system safety analysis.
Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo
2012-06-01
The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be addressed. This article introduces a method for identifying hazard interactions in aerospace systems based on extended GTST-MLD (goal tree-success tree-master logic diagram) during the design stage. GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend the system interaction description capability of GTST-MLD by adding system design knowledge and past accident experience. At the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to illustrate the method. © 2011 Society for Risk Analysis.
Direct measurement of human exposure to environmental contaminants in real time (when the exposure is actually occurring) is rare and difficult to obtain. This frustrates both exposure assessments and investigations into the linkage between chemical exposure and human disease. ...
Networks Analysis of a Regional Ecosystem of Afterschool Programs
ERIC Educational Resources Information Center
Russell, Martha G.; Smith, Marc A.
2011-01-01
Case studies have documented the impact of family-school-community collaboration in afterschool programs on increasing awareness about the problems of at-risk youth, initiating dialogue among leaders and community representatives, developing rich school-based information systems, and demonstrating how to build strong relationships between public…
A novel risk classification system for 30-day mortality in children undergoing surgery
Walter, Arianne I.; Jones, Tamekia L.; Huang, Eunice Y.; Davis, Robert L.
2018-01-01
A simple, objective and accurate way of grouping children undergoing surgery into clinically relevant risk groups is needed. The purpose of this study is to develop and validate a preoperative risk classification system for postsurgical 30-day mortality in children undergoing a wide variety of operations. The National Surgical Quality Improvement Project-Pediatric participant use file data for calendar years 2012–2014 were analyzed to determine the preoperative variables most associated with death within 30 days of operation (D30). Risk groups were created using classification tree analysis based on these preoperative variables. The resulting risk groups were validated using 2015 data and applied to neonates and higher-risk CPT codes to determine validity in high-risk subpopulations. A five-level risk classification was found to be most accurate. The preoperative need for ventilation, oxygen support, inotropic support, sepsis, the need for emergent surgery, and a do-not-resuscitate order defined non-overlapping groups with observed rates of D30 that vary from 0.075% (Very Low Risk) to 38.6% (Very High Risk). When CPT codes where death was never observed are eliminated, or when the system is applied to neonates, the groupings remain predictive of death in an ordinal manner. PMID:29351327
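The published classification tree itself is not reproduced here, but a rule-based grouping of the same flavor can be sketched as follows; the rule order and group assignments are illustrative assumptions, not the validated tree:

```python
# Illustrative five-level preoperative risk grouping from binary flags.
# The specific rules below are hypothetical, not the published tree.

def risk_group(ventilator, inotropes, sepsis, oxygen, emergent, dnr):
    """Assign one of five ordinal risk levels from preoperative flags."""
    if ventilator and inotropes:
        return "Very High Risk"
    if ventilator or inotropes or dnr:
        return "High Risk"
    if sepsis:
        return "Intermediate Risk"
    if oxygen or emergent:
        return "Low Risk"
    return "Very Low Risk"

print(risk_group(False, False, False, False, False, False))  # Very Low Risk
print(risk_group(True, True, False, False, False, False))    # Very High Risk
```

A tree of nested rules like this is exactly what makes the system usable at the bedside: it needs only a handful of preoperative yes/no answers.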
Medical technology at home: safety-related items in technical documentation.
Hilbers, Ellen S M; de Vries, Claudette G J C A; Geertsma, Robert E
2013-01-01
This study aimed to investigate manufacturers' technical documentation on issues of safe use of their devices in a home setting. Three categories of equipment were selected: infusion pumps, ventilators, and dialysis systems. Risk analyses, instructions for use, labels, and post-market surveillance procedures were requested from manufacturers. Additionally, they were asked to fill out a questionnaire on the collection of field experience, on incidents, and on training activities. Specific risks of device operation by lay users in a home setting were incompletely addressed in the risk analyses. A substantial number of user manuals were designed for professionals rather than for patients or lay carers. Risk analyses and user information often showed incomplete coherence. Post-market surveillance was mainly based on passive collection of field experiences. Manufacturers of infusion pumps, ventilators, and dialysis systems pay insufficient attention to the specific risks of use by lay persons in home settings. This conclusion is expected to apply to other medical equipment for treatment at home as well. Manufacturers of medical equipment for home use should pay more attention to use errors, lay use, and home-specific risks in design, risk analysis, and user information. Field experiences should be collected more actively. Coherence between risk analysis and user information should be improved. Notified bodies should address these aspects in their assessment. User manuals issued by institutions supervising a specific home therapy should be drawn up in consultation with the manufacturer.
Security Events and Vulnerability Data for Cybersecurity Risk Estimation.
Allodi, Luca; Massacci, Fabio
2017-08-01
Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices rather than quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools, which make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
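A toy version of the two-stage structure described above can be sketched as follows; the vulnerability counts and per-vulnerability success rates are invented illustrative numbers, not values from the article:

```python
# Two-stage attack probability sketch: the attacker must first breach a
# perimeter system, then escalate via a local vulnerability. All counts
# and success rates below are made up for illustration only.

def p_compromise(n_weaponized, vulns_present, p_per_vuln):
    """P(at least one of the attacker's weaponized exploits lands),
    counting only vulnerabilities actually present on the target."""
    usable = min(n_weaponized, vulns_present)
    return 1.0 - (1.0 - p_per_vuln) ** usable

p_stage1 = p_compromise(n_weaponized=5, vulns_present=3, p_per_vuln=0.2)
p_stage2 = p_compromise(n_weaponized=5, vulns_present=2, p_per_vuln=0.1)
p_attack = p_stage1 * p_stage2  # both stages must succeed
print(round(p_attack, 3))  # 0.093
```

Raising the attacker's weaponized-vulnerability budget or the per-exploit success rate moves the estimate, which is how the model can be tuned to an organization's risk appetite.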
NEW APPROACHES IN RISK ANALYSIS OF ENVIRONMENTAL STRESSORS TO HUMAN AND ECOLOGICAL SYSTEMS
We explore the application of novel techniques for improving and integrating risk analysis of environmental stressors to human and ecological systems. Environmental protection decisions are guided by risk assessments serving as tools to develop regulatory policy and other relate...
Sun, Feifei; Zhu, Jia; Lu, Suying; Zhen, Zijun; Wang, Juan; Huang, Junting; Ding, Zonghui; Zeng, Musheng; Sun, Xiaofei
2018-01-02
Systemic inflammatory parameters are associated with poor outcomes in patients with malignancies. Several inflammation-based cumulative prognostic score systems have been established for various solid tumors; however, few exist for patients with diffuse large B cell lymphoma (DLBCL). We retrospectively reviewed 564 adult DLBCL patients who had received rituximab, cyclophosphamide, doxorubicin, vincristine and prednisolone (R-CHOP) therapy between Nov 1, 2006 and Dec 30, 2013, and assessed by univariate and multivariate analysis the prognostic significance of six systemic inflammatory parameters evaluated in previous studies: C-reactive protein (CRP), albumin levels, the lymphocyte-monocyte ratio (LMR), the neutrophil-lymphocyte ratio (NLR), the platelet-lymphocyte ratio (PLR) and fibrinogen levels. Multivariate analysis identified CRP, albumin levels and the LMR as three independent prognostic parameters for overall survival (OS). Based on these three factors, we constructed a novel inflammation-based cumulative prognostic score (ICPS) system. Four risk groups were formed: ICPS = 0, ICPS = 1, ICPS = 2 and ICPS = 3. Further multivariate analysis indicated that the ICPS model is a prognostic score system independent of the International Prognostic Index (IPI) for both progression-free survival (PFS) (p < 0.001) and OS (p < 0.001). The 3-year OS for patients with ICPS = 0, ICPS = 1, ICPS = 2 and ICPS = 3 was 95.6, 88.2, 76.0 and 62.2%, respectively (p < 0.001). The 3-year PFS for patients with ICPS = 0-1, ICPS = 2 and ICPS = 3 was 84.8, 71.6 and 54.5%, respectively (p < 0.001). The prognostic value of the ICPS model indicates that the degree of systemic inflammation is associated with clinical outcomes of patients with DLBCL in the rituximab era. The ICPS model classified risk groups more accurately than any single inflammatory prognostic parameter.
These findings may be useful for identifying candidates for further inflammation-related mechanism research or novel anti-inflammation target therapies.
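The ICPS is an additive score, one point per adverse inflammatory marker. A sketch with placeholder cutoffs that stand in for the thresholds derived in the paper:

```python
# ICPS-style cumulative scoring over CRP, albumin and LMR. The cutoff
# values below are hypothetical placeholders, not the paper's thresholds.

def icps(crp_mg_l, albumin_g_l, lmr,
         crp_cut=10.0, alb_cut=35.0, lmr_cut=2.6):
    score = 0
    if crp_mg_l > crp_cut:      # elevated CRP is adverse
        score += 1
    if albumin_g_l < alb_cut:   # hypoalbuminaemia is adverse
        score += 1
    if lmr < lmr_cut:           # low lymphocyte-monocyte ratio is adverse
        score += 1
    return score                # 0-3; higher means worse prognosis

print(icps(25.0, 30.0, 1.8))  # 3
print(icps(2.0, 42.0, 4.0))   # 0
```

Each one-point increment maps to a distinct survival stratum in the paper, which is what makes the cumulative form more informative than any single marker.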
Computerized database management system for breast cancer patients.
Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah
2014-01-01
Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL (My Structured Query Language) was selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in the system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface was developed to control the MySQL database. Case studies show that breast cancer incidence is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is 50 to 59 years. Results suggest that the chance of developing breast cancer increases in older women and is reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
Song, Ruiguang; Hall, H Irene; Harrison, Kathleen McDavid; Sharpe, Tanya Telfair; Lin, Lillian S; Dean, Hazel D
2011-01-01
We developed a statistical tool that brings together standard, accessible, and well-understood analytic approaches and uses area-based information and other publicly available data to identify social determinants of health (SDH) that significantly affect the morbidity of a specific disease. We specified AIDS as the disease of interest and used data from the American Community Survey and the National HIV Surveillance System. Morbidity and socioeconomic variables in the two data systems were linked through geographic areas that can be identified in both systems. Correlation and partial correlation coefficients were used to measure the impact of socioeconomic factors on AIDS diagnosis rates in certain geographic areas. We developed an easily explained approach that can be used by a data analyst with access to publicly available datasets and standard statistical software to identify the impact of SDH. We found that the AIDS diagnosis rate was highly correlated with the distribution of race/ethnicity, population density, and marital status in an area. The impact of poverty, education level, and unemployment depended on other SDH variables. Area-based measures of socioeconomic variables can be used to identify risk factors associated with a disease of interest. When correlation analysis is used to identify risk factors, potential confounding from other variables must be taken into account.
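The partial correlation the authors rely on can be computed directly from pairwise Pearson coefficients. A minimal sketch on synthetic data, where the variable roles (confounder, socioeconomic factor, disease rate) are illustrative assumptions, not the study's actual data:

```python
import math
import random

# First-order partial correlation r_xy.z: the correlation between x and y
# after removing the linear effect of a shared area-level confounder z.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def partial_corr(x, y, z):
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

rng = random.Random(0)
z = [rng.gauss(0, 1) for _ in range(2000)]   # confounder (e.g., density)
x = [v + rng.gauss(0, 1) for v in z]         # socioeconomic factor
y = [v + rng.gauss(0, 1) for v in z]         # disease rate

# Raw correlation is inflated by z; the partial correlation is near zero.
print(round(pearson(x, y), 2), round(partial_corr(x, y, z), 2))
```

This is the mechanism behind the paper's finding that the apparent impact of poverty, education, and unemployment depended on the other SDH variables.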
The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics
NASA Astrophysics Data System (ADS)
Mazzorana, B.; Fuchs, S.; Levaggi, L.
2012-04-01
The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers to enhance, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically relies on a careful vulnerability analysis of both the immobile and mobile elements at risk potentially exposed to flood hazards. Based on notions from fluid and classical mechanics, we developed computation schemes enabling a dynamic vulnerability and risk analysis for a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities, resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective, we coupled the description of the hydrodynamic flow behaviour with the induced structural modifications of the exposed elements at risk. Valuation methods suitable to support a correct mapping from the value domains of the physical damage variables to economic loss values are discussed. In this way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within systems prone to flood risk.
Spatial epidemiology of Toxoplasma gondii infection in goats in Serbia.
Djokic, Vitomir; Klun, Ivana; Musella, Vincenzo; Rinaldi, Laura; Cringoli, Giuseppe; Sotiraki, Smaragda; Djurkovic-Djakovic, Olgica
2014-05-01
A major risk factor for Toxoplasma gondii infection is consumption of undercooked meat. Increasing demand for goat meat is likely to promote the role of this animal for human toxoplasmosis. As there are virtually no data on toxoplasmosis in goats in Serbia, we undertook a cross-sectional serological study, including prediction modelling using geographical information systems (GIS). Sera from 431 goats reared in 143 households/farms throughout Serbia, sampled between January 2010 and September 2011, were examined for T. gondii antibodies by a modified agglutination test. Seroprevalence was 73.3% at the individual level and 84.6% at the farm level. Risk factor analysis showed above two-fold higher risk of infection for goats used for all purposes compared to dairy goats (P = 0.012), almost seven-fold higher risk for goats kept as sole species versus those kept with other animals (P = 0.001) and a two-fold lower risk for goats introduced from outside the farm compared to those raised on the farm (P = 0.027). Moreover, households/farms located in centre-eastern Serbia were found to be less often infected than those in northern Serbia (P = 0.004). The risk factor analysis was fully supported by spatial analysis based on a GIS database containing data on origin, serology, land cover, elevation, meteorology and a spatial prediction map based on kriging analysis, which showed western Serbia as the area most likely for finding goats positive for T. gondii and centre-eastern Serbia as the least likely. In addition, rainfall favoured seropositivity, whereas temperature, humidity and elevation did not.
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1976-01-01
A variety of economic and programmatic issues are discussed concerning the development and deployment of a fleet of space-based solar power satellites (SSPS). The costs, uncertainties and risks associated with the current photovoltaic SSPS configuration, and with issues affecting the development of an economically viable SSPS development program are analyzed. The desirability of a low earth orbit (LEO) demonstration satellite and a geosynchronous (GEO) pilot satellite is examined and critical technology areas are identified. In addition, a preliminary examination of utility interface issues is reported. The main focus of the effort reported is the development of SSPS unit production, and operation and maintenance cost models suitable for incorporation into a risk assessment (Monte Carlo) model (RAM). It is shown that the key technology area deals with the productivity of man in space, not, as might be expected, with some hardware component technology.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
... science-based risk analysis of those activity/food combinations that would be considered low risk. We... proposed requirements of the Federal Food, Drug, and Cosmetic Act for hazard analysis and risk-based... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 117 [Docket No...
Composite power system well-being analysis
NASA Astrophysics Data System (ADS)
Aboreshaid, Saleh Abdulrahman Saleh
The evaluation of composite system reliability is extremely complex, as it is necessary to include detailed modeling of both generation and transmission facilities and their auxiliary elements. The most significant quantitative indices in composite power system adequacy evaluation are those relating to load curtailment. Many utilities have difficulty interpreting the expected load curtailment indices, as the existing models are based on adequacy analysis and in many cases do not consider realistic operating conditions in the system under study. This thesis presents a security-based approach that alleviates this difficulty and provides the ability to evaluate the well-being of customer load points and of the overall composite generation and transmission power system. Acceptable deterministic criteria are included in the probabilistic evaluation of the composite system reliability indices to monitor load point well-being. The degree of load point well-being is quantified in terms of healthy and marginal state indices in addition to the traditional risk indices. The individual well-being indices of the different system load points are aggregated to produce system indices. This thesis presents new models and techniques to quantify the well-being of composite generation and direct- and alternating-current transmission systems. Security constraints are the operating limits which must be satisfied for normal system operation; they depend mainly on the purpose of the study. The constraints that govern the practical operation of a power system are divided, in this thesis, into three sets: steady-state, voltage stability and transient stability constraints. The inclusion of an appropriate transient stability constraint leads to a more accurate appraisal of overall power system well-being.
This thesis illustrates the utilization of a bisection method in the analytical evaluation of the critical clearing time which forms the basis of most existing stability assessments. The effect of employing high-speed-simultaneous or adaptive reclosing schemes is presented in this thesis. An effective and fast technique to incorporate voltage stability considerations in composite generation and transmission system reliability evaluation is also presented. The proposed technique can be easily incorporated in an existing composite power system reliability program using voltage stability constraints that are constructed for individual load points based on a relatively simple risk index. It is believed that the concepts, procedures and indices presented in this thesis will provide useful tools for power system designers, planners and operators and assist them to perform composite system well-being analysis in addition to traditional risk assessment.
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for risk analysis execution in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under new, real conditions. The manufacturing process is the core factor for risk analysis, having the most impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach in designing a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
How Do the Metabolic Effects of Chronic Stress Influence Breast Cancer Biology
2013-04-01
[Abstract not recovered; only reference fragments remain, e.g.: "…life events and breast cancer risk: a meta-analysis. International Journal of Cancer. 2003;107:1023-9"; "Song M, Lee K-M, Kang D. Breast Cancer Prevention Based on Gene-Environment…"; mentions of a PCR system, statistical analysis in a supplemental methods section, and adipocyte glucose consumption.]
Reviewing the economic efficiency of disaster risk management
NASA Astrophysics Data System (ADS)
Mechler, Reinhard
2013-04-01
There is a lot of rhetoric suggesting that disaster risk management (DRM) pays, yet surprisingly little in the way of hard facts. Cost-benefit analysis (CBA) is one major tool that can provide quantitative information for prioritizing disaster risk management (and climate adaptation) based on economic principles. Yet, on a global scale, there has been surprisingly little robust evidence on the economic efficiency and benefits of risk management measures. This review shows that, for the limited evidence reported, the economic case for DRM across a range of hazards is strong: the benefits of investing in DRM outweigh the costs of doing so, on average, by about four to one in terms of avoided and reduced losses. Most studies using a CBA approach focus on structural DRM, and most information has been made available on physical flood prevention. There have been some limited studies on preparedness and risk financing. The global evidence base is limited and estimates appear not very solid; overall, in line with the conclusion of the recent IPCC SREX report, there is limited evidence and medium agreement across the literature. Some of the factors behind the limited robustness are inherent to CBA more widely: these challenges comprise the inability to price intangibles, evaluating strategies rather than single projects, difficulties in assessing softer rather than infrastructure-related options, choices regarding a proper discount rate, lack of accounting for the distribution of benefits and costs, and difficulties with assessing nonmarket values such as those related to health, the environment, or public goods. Although techniques exist to address some of these challenges, they are not likely to go away easily. Other challenges associated specifically with DRM, such as the need for and difficulty of undertaking risk-based analysis, can be overcome, and manuals and reports have provided a way forward.
In an age of austerity, cost-benefit analysis continues to be an important tool for prioritising efficient DRM measures, yet with a shifting emphasis from infrastructure-based options (hard resilience) to preparedness and systemic interventions (soft resilience), other tools such as cost-effectiveness analysis, multi-criteria analysis and robust decision-making approaches deserve more attention.
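The core CBA calculation the review refers to can be sketched as a discounted benefit-cost ratio; all figures below (costs, avoided annual losses, horizon, discount rate) are invented for illustration:

```python
# Discounted benefit-cost ratio for a DRM measure: NPV of expected
# avoided losses divided by NPV of costs. All numbers are hypothetical.

def npv(cashflows, rate):
    """Net present value of a list of cashflows, one per year from t=0."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows))

years = 20
rate = 0.05                                  # discount rate: a key choice
costs = [100.0] + [2.0] * (years - 1)        # capital outlay + maintenance
avoided = [0.0] + [25.0] * (years - 1)       # expected avoided losses

bcr = npv(avoided, rate) / npv(costs, rate)
print(round(bcr, 2))
```

A BCR above 1 means the measure pays for itself in expectation; note how sensitive the result is to the discount rate, one of the CBA challenges the review lists.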
2018-01-01
Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluating the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869
McAndrews, Carolyn; Beyer, Kirsten; Guse, Clare E; Layde, Peter
2013-11-01
Comparing the injury risk of different travel modes requires using a travel-based measure of exposure. In this study we quantify injury risk by travel mode, age, race/ethnicity, sex, and injury severity using three different travel-based exposure measures (person-trips, person-minutes of travel, and person-miles of travel) to learn how these metrics affect the characterization of risk across populations. We used a linked database of hospital and police records to identify non-fatal injuries (2001-2009), the Fatality Analysis Reporting System for fatalities (2001-2009), and the 2001 Wisconsin Add-On to the National Household Travel Survey for exposure measures. In Wisconsin, bicyclists and pedestrians have a moderately higher injury risk compared to motor vehicle occupants (adjusting for demographic factors), but the risk is much higher when exposure is measured in distance. Although the analysis did not control for socio-economic status (a likely confounder) it showed that American Indian and Black travelers in Wisconsin face higher transportation injury risk than White travelers (adjusting for sex and travel mode), across all three measures of exposure. Working with multiple metrics to form comprehensive injury risk profiles such as this one can inform decision making about how to prioritize investments in transportation injury prevention. Copyright © 2013 Elsevier Ltd. All rights reserved.
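The key point, that the choice of exposure denominator changes the risk ranking, can be illustrated with hypothetical counts; none of these figures are from the Wisconsin data:

```python
# Injury risk per unit of travel exposure. The same injury counts yield
# different mode comparisons under trips, minutes, or miles.
# All numbers below are invented for illustration.

def rate_per_million(injuries, exposure):
    return 1e6 * injuries / exposure

modes = {
    #          injuries, trips,      minutes,     miles
    "car":     (900, 50_000_000, 900_000_000, 400_000_000),
    "bicycle": ( 60,  1_000_000,  20_000_000,   3_000_000),
}
for mode, (inj, trips, mins, miles) in modes.items():
    print(mode,
          round(rate_per_million(inj, trips), 1),   # per million trips
          round(rate_per_million(inj, mins), 2),    # per million minutes
          round(rate_per_million(inj, miles), 1))   # per million miles
```

Because bicycle trips are short in miles but not in minutes, the distance-based rate inflates cycling risk far more than the time-based rate, matching the pattern the study reports.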
NASA Astrophysics Data System (ADS)
Widodo, L.; Adianto; Sartika, D. I.
2017-12-01
PT. XYZ is a large automotive manufacturing company that manufactures, assembles and exports cars; its other products are spare parts, jigs and dies. PT. XYZ has long implemented an Occupational Safety and Health Management System (OSHMS) to reduce the potential hazards that cause work accidents. However, this does not mean that the OSHMS does not need to be upgraded and improved, because the potential danger posed by the work is quite high. This research was conducted in the Sunter 2 Plant, where production activities have a high level of potential hazard. Hazard Identification, Risk Assessment, and Risk Control (HIRARC) analysis found 10 potential hazards in the Stamping Production plant, consisting of 4 very high risk potential hazards (E), 5 high risk potential hazards (H), and 1 moderate risk potential hazard (M). In the Casting Production plant, 22 potential hazards were found, consisting of 7 very high risk (E), 12 high risk (H), and 3 moderate risk (M) potential hazards. Based on the results of Fault Tree Analysis (FTA), the main priorities are the high risk (H) and very high risk (E) potential hazards. The proposed improvements are to create visual displays on the importance of always using the correct Personal Protective Equipment (PPE), establish good working procedures, conduct OSH training for workers on a regular basis, and continue to conduct safety campaigns.
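A HIRARC assessment typically bands a likelihood-times-severity product into the E/H/M ratings mentioned above. A sketch using a common 5x5 matrix; the banding thresholds are an assumption, not taken from the study:

```python
# Common 5x5 risk-matrix banding: rating = likelihood x severity.
# The thresholds below follow a typical matrix and are an assumption.

def risk_rating(likelihood, severity):
    """likelihood, severity: 1 (rare/negligible) .. 5 (certain/catastrophic)."""
    score = likelihood * severity
    if score >= 15:
        return "E"   # extreme / very high risk
    if score >= 8:
        return "H"   # high risk
    if score >= 4:
        return "M"   # moderate risk
    return "L"       # low risk

print(risk_rating(5, 4))  # E
print(risk_rating(3, 3))  # H
print(risk_rating(2, 2))  # M
```

Ranking hazards by these bands is what lets the E and H findings be prioritized for controls ahead of the moderate ones.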
Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S
2014-01-01
The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of available approaches allowing a quantitative analysis of the performance of the human movement control system, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and inter-/intra-examiner reliability, to obtain objective scores and more detailed information allowing prediction of fall risk. We used Microsoft Kinect to record subjects' movements while they performed challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
Risk-Return Relationship in a Complex Adaptive System
Song, Kunyu; An, Kenan; Yang, Guang; Huang, Jiping
2012-01-01
For survival and development, autonomous agents in complex adaptive systems involving human society must compete against or collaborate with others to share limited resources or wealth, using different methods. One method is to invest, in order to obtain payoffs with risk. It is a common belief that investments with a positive risk-return relationship (namely, high risk, high return, and vice versa) are dominant over those with a negative risk-return relationship (i.e., high risk, low return, and vice versa) in human society; this belief has a notable impact on the daily investing activities of investors. Here we investigate the risk-return relationship in a model complex adaptive system, in order to study the effect of both market efficiency and closeness, which exist in human society and play an important role in establishing traditional finance/economics theories. We conduct a series of computer-aided human experiments, and also perform agent-based simulations and theoretical analysis to confirm the experimental observations and reveal the underlying mechanism. We report that investments with a negative risk-return relationship instead have dominance over those with a positive risk-return relationship in such a complex adaptive system. We formulate the dynamical process of the system's evolution, which helps to discover the different roles of identical and heterogeneous preferences. This work might be valuable not only to complexity science, but also to finance and economics, to management and social science, and to physics. PMID:22479416
Integrated risk framework for onsite wastewater treatment systems.
Carroll, Steven; Goonetilleke, Ashantha; Thomas, Evan; Hargreaves, Megan; Frost, Ray; Dawes, Les
2006-08-01
Onsite wastewater treatment systems (OWTS) are becoming increasingly important for the treatment and dispersal of effluent in new urbanised developments that are not serviced by centralised wastewater collection and treatment systems. However, the current standards and guidelines adopted by many local authorities for assessing suitable site and soil conditions for OWTS are increasingly coming under scrutiny due to the public health and environmental impacts caused by poorly performing systems, in particular septic tank-soil adsorption systems. In order to achieve sustainable onsite wastewater treatment with minimal impacts on the environment and public health, more appropriate means of assessment are required. This paper highlights an integrated risk based approach for assessing the inherent hazards associated with OWTS in order to manage and mitigate the environmental and public health risks inherent with onsite wastewater treatment. In developing a sound and cohesive integrated risk framework for OWTS, several key issues must be recognised. These include the inclusion of relevant stakeholders throughout framework development, the integration of scientific knowledge, data and analysis with risk assessment and management ideals, and identification of the appropriate performance goals for successful management and mitigation of associated risks. These issues were addressed in the development of the risk framework to provide a generic approach to assessing risk from OWTS. The utilisation of the developed risk framework for achieving more appropriate assessment and management techniques for OWTS is presented in a case study for the Gold Coast region, Queensland State, Australia.
Risk-return relationship in a complex adaptive system.
Song, Kunyu; An, Kenan; Yang, Guang; Huang, Jiping
2012-01-01
For survival and development, autonomous agents in complex adaptive systems involving human society must compete or collaborate with others to share limited resources or wealth, using different methods. One method is to invest in order to obtain payoffs with risk. It is a common belief that investments with a positive risk-return relationship (namely, high risk, high return and vice versa) dominate those with a negative risk-return relationship (i.e., high risk, low return and vice versa) in human society; this belief has a notable impact on the daily investing activities of investors. Here we investigate the risk-return relationship in a model complex adaptive system, in order to study the effect of both market efficiency and closeness, which exist in human society and play an important role in establishing traditional finance/economics theories. We conduct a series of computer-aided human experiments, and also perform agent-based simulations and theoretical analysis to confirm the experimental observations and reveal the underlying mechanism. We report that, in such a complex adaptive system, investments with a negative risk-return relationship instead dominate those with a positive risk-return relationship. We formulate the dynamical process of the system's evolution, which helps to uncover the different roles of identical and heterogeneous preferences. This work might be valuable not only to complexity science, but also to finance and economics, to management and social science, and to physics.
Horta, Rodrigo S; Lavalle, Gleidice E; Monteiro, Lidianne N; Souza, Mayara C C; Cassali, Geovanni D; Araújo, Roberto B
2018-03-01
Mast cell tumor (MCT) is a frequent cutaneous neoplasm in dogs that is heterogeneous in clinical presentation and biological behavior, with a variable potential for recurrence and metastasis. Accurate prediction of clinical outcomes has been challenging. The study objective was to develop a system for classifying canine MCT according to mortality risk, based on individual assessment of clinical, histologic, immunohistochemical, and molecular features. The study included 149 dogs with a histologic diagnosis of cutaneous or subcutaneous MCT. By univariate analysis, MCT metastasis and related death were significantly associated with clinical stage (P < .0001, rP = -0.610), history of tumor recurrence (P < .0001, rP = -0.550), Patnaik (P < .0001, rP = -0.380) and Kiupel grades (P < .0001, rP = -0.500), predominant organization of neoplastic cells (P < .0001, rP = -0.452), mitotic count (P < .0001, rP = -0.325), Ki-67 labeling index (P < .0001, rP = -0.414), KITr pattern (P = .02, rP = 0.207), and c-KIT mutational status (P < .0001, rP = -0.356). By multivariate analysis with a Cox proportional hazards model, only 2 features were independent predictors of overall survival: an amendment of the World Health Organization clinical staging system (hazard ratio [95% CI]: 1.824 [1.210-4.481]; P = .01) and a history of tumor recurrence (hazard ratio [95% CI]: 9.250 [2.158-23.268]; P < .001). From these results, we propose an amendment of the WHO staging system, a method of risk analysis, and a suggested approach to clinical and laboratory evaluation of dogs with cutaneous MCT.
Rehfuess, Eva A; Best, Nicky; Briggs, David J; Joffe, Mike
2013-12-06
Effective interventions require evidence on how individual causal pathways jointly determine disease. Based on the concept of systems epidemiology, this paper develops Diagram-based Analysis of Causal Systems (DACS) as an approach to analyzing complex systems, and applies it by examining the contributions of proximal and distal determinants of childhood acute lower respiratory infections (ALRI) in sub-Saharan Africa. Diagram-based Analysis of Causal Systems combines the use of causal diagrams with multiple routinely available data sources, using a variety of statistical techniques. In a step-by-step process, the causal diagram evolves from conceptual (based on a priori knowledge and assumptions), through operational (informed by data availability and subjected to empirical testing), to integrated (synthesizing information from multiple datasets). In our application, we apply different regression techniques to Demographic and Health Survey (DHS) datasets for Benin, Ethiopia, Kenya and Namibia and a pooled World Health Survey (WHS) dataset for sixteen African countries. Explicit strategies are employed to make transparent the decisions about the inclusion/omission of arrows, the sign and strength of the relationships, and homogeneity/heterogeneity across settings. Findings about the current state of evidence on the complex web of socio-economic, environmental, behavioral and healthcare factors influencing childhood ALRI, based on DHS and WHS data, are summarized in an integrated causal diagram. Notably, solid fuel use is structured by socio-economic factors and increases the risk of childhood ALRI mortality. Diagram-based Analysis of Causal Systems is a means of organizing the current state of knowledge about a specific area of research, and a framework for integrating statistical analyses across a whole system. This partly a priori approach is explicit about the causal assumptions guiding the analysis and about researcher judgment, and wrong assumptions can be reversed following empirical testing. The approach is well suited to dealing with complex systems, in particular where data are scarce.
2013-06-01
measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the... quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost... Kindle version]. Retrieved from Amazon.com 83 Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review
NASA Astrophysics Data System (ADS)
Pellicani, R.; Spilotro, G.; Colangelo, G.; Petraglia, A.; Pizzo, V.
2012-04-01
The rockfall risk along the "Tirrena Inferiore" State Road SS18, between km 220+600 and km 243+670 in the coastal area of Maratea (Basilicata, Italy), has been evaluated through a specific multilayer technique. The results are particularly significant because they were validated in the field by rockfall events that occurred after the study. The studied stretch of the SS18 road is often affected by rockfalls, which periodically (coinciding with abundant rainfall, earthquakes and temperature drops) cause large amounts of damage and traffic interruptions. In order to assess the rockfall risk and define the countermeasures needed to mitigate it, an integrated index-based and physically-based approach was implemented. The roadway runs beneath slopes with steep, vertical or sub-vertical rock faces cut by several discontinuity systems and showing widespread fracturing. The superficial parts of the slopes are characterized by open, often karstified fractures. Several historical rockfall events were recognized in the area, and numerous geomechanical analyses, aimed at the stability analysis of the rock walls, were carried out. The localization of the potentially unstable areas and the quantification of relative rockfall risk were evaluated in three successive phases of analysis. First, a map based on the SMR (Slope Mass Rating) index of Romana (1985) was produced through a spatial analysis of both geomechanical parameters, such as the RMR index of Bieniawski, and the distribution of the discontinuities. This approach allowed the estimation of the potentially unstable zones and their classification on the basis of the resulting degree of stability. Subsequently, an analysis of the rockfall trajectories for the most unstable zones of the slope was carried out using ROTOMAP, a three-dimensional rockfall simulation software.
The input data for computing the rockfall trajectories are the following: (1) the digital terrain model (DTM), (2) the location of rockfall release points (source areas), (3) geometrical parameters of block motion, such as the limit angles of flight, impact and rebound, and (4) geomechanical parameters of block motion, such as the coefficients of normal and tangential energy restitution. For each DTM cell the software calculates the number of blocks passing through, the maximum rockfall velocity and the maximum flying height. This information was used to verify the efficiency of the existing rockfall protection systems. Finally, the rockfall risk map was produced by evaluating the spatial distribution of the following three parameters: (i) lithology, (ii) kinematic compatibility, and (iii) historical rockfall events. After quantifying the risk, the most suitable types of rockfall protection systems were identified for the most unstable sections of the slopes. The importance and usefulness of this study derive from the validation of the obtained results, in terms of risk, by the occurrence of new rockfall events in the areas for which the highest level of rockfall risk had been defined.
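As background to the mapping step above, Romana's SMR adjusts Bieniawski's basic RMR for the orientation of discontinuities relative to the slope and for the excavation method, via SMR = RMR_b + (F1 × F2 × F3) + F4. The following is a minimal sketch of that formula; the input values are hypothetical, not data from this study:

```python
def slope_mass_rating(rmr_basic, f1, f2, f3, f4=0.0):
    """Romana (1985): SMR = RMR_b + (F1 * F2 * F3) + F4.
    F1 and F2 (roughly 0.15-1.0) penalize unfavourable joint/slope
    geometry, F3 is a non-positive orientation adjustment, and F4
    credits the excavation method."""
    return rmr_basic + f1 * f2 * f3 + f4

# Hypothetical slope section: RMR_b = 62 with unfavourable planar geometry
smr = slope_mass_rating(62, f1=0.85, f2=1.0, f3=-50)
```

In Romana's method the adjustment factors are tabulated from joint/slope geometry; a low SMR flags a slope section as a candidate for protection works.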
Pain management and opioid risk mitigation in the military.
Sharpe Potter, Jennifer; Bebarta, Vikhyat S; Marino, Elise N; Ramos, Rosemarie G; Turner, Barbara J
2014-05-01
Opioid analgesic misuse is a significant military health concern recognized as a priority issue by military leadership. Opioids are among the most commonly prescribed medications in the military for pain management. The military has implemented opioid risk mitigation strategies, including the Sole Provider Program and the Controlled Drug Management Analysis and Reporting Tool, which are used to identify and monitor risk and misuse. However, there are substantial opportunities to build on these existing systems to better ensure safer opioid prescribing and to monitor for misuse. Opioid risk mitigation strategies implemented in the civilian sector include establishing clinical guidelines for opioid prescribing and prescription monitoring programs; these strategies may help to inform opioid risk mitigation in the military health system. Reducing the risk of opioid misuse and improving the quality of care for our Warfighters is necessary. This must be done through evidence-based approaches, with an investment in research to improve patient care and prevent opioid misuse as well as its sequelae. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.
Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management.
Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E; Choi, Tsan-Ming
2016-08-01
In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, hybridizing the well-known NSGA-II algorithm with an adaptive population-based simulated annealing (APBSA) method, is developed to solve systems reliability optimization problems. In the first step, a coevolutionary strategy is used to construct a good algorithm. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate its appropriate parameters. Moreover, to examine the performance of the proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies show that the proposed algorithm is an effective approach for systems reliability and risk management.
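The bi-objective setting above trades system reliability against cost, so candidate allocations are compared by Pareto dominance. Below is a minimal dominance filter of the kind any such multiobjective algorithm relies on; it is a sketch only, not the authors' NSGA-II/APBSA hybrid, and the candidate designs are hypothetical:

```python
def dominates(a, b):
    """True if design a Pareto-dominates design b: no worse in both
    objectives (reliability up, cost down) and strictly better in one."""
    rel_a, cost_a = a
    rel_b, cost_b = b
    no_worse = rel_a >= rel_b and cost_a <= cost_b
    strictly_better = rel_a > rel_b or cost_a < cost_b
    return no_worse and strictly_better

def pareto_front(designs):
    """Keep only the non-dominated (reliability, cost) designs."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# Hypothetical candidate redundancy allocations: (system reliability, cost)
candidates = [(0.90, 10), (0.95, 14), (0.92, 18), (0.99, 25), (0.95, 12)]
front = pareto_front(candidates)
```

Algorithms such as NSGA-II layer a full non-dominated sorting and crowding-distance selection on top of exactly this dominance test.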
Gerli, Sandro; Favilli, Alessandro; Franchini, David; De Giorgi, Marcello; Casucci, Paola; Parazzini, Fabio
2018-01-01
To assess whether the maternal risk profile and hospital care levels influence inter-hospital comparisons within classes 1 and 3 of the "Ten Group Classification System" (TGCS), a population-based analysis was carried out using data from the institutional database of an Italian region. The 11 maternity wards were divided into two categories: second-level hospitals (SLH) and first-level hospitals (FLH). The recorded deliveries were classified according to the TGCS. To analyze whether different maternal characteristics and the hospital care level could influence the cesarean section (CS) risk, a multivariate analysis was performed separately for women in TGCS classes 1 and 3. From January 2011 to December 2013, 19,987 deliveries were recorded; of these, 7,693 were in TGCS class 1 and 4,919 in class 3. The CS rates were 20.8% and 14.7% in class 1 (p < 0.0001) and 6.9% and 5.3% (p < 0.0230) in class 3, in the FLH and SLH respectively. The multivariate logistic regression showed that FLH status, older maternal age and gestational diabetes were independent risk factors for CS in groups 1 and 3. Obesity and gestational hypertension were also independent risk factors for group 1. The TGCS is a useful tool for analyzing the incidence of CS in a single center, but when comparing different hospitals, maternal characteristics and differing care levels should be considered as potential sources of bias.
Decerns: A framework for multi-criteria decision analysis
Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...
2015-02-27
A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical risk management problems is introduced. The Decerns framework contains a library of modules that form the basis for two scalable systems: DecernsMCDA, for analysis of multicriteria problems, and DecernsSDSS, for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods as well as original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on a multicriteria location problem.
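As a minimal illustration of the kind of step MCDA tools build on, the sketch below applies simple additive weighting to hypothetical siting options. Decerns itself also offers probabilistic and fuzzy-number uncertainty treatment, which this sketch omits; all option names and scores are invented, and higher scores are assumed better:

```python
def weighted_sum(option_scores, weights):
    """Simple additive weighting: weights are normalized to sum to 1,
    and each option gets a weighted average of its criterion scores."""
    total_w = sum(weights.values())
    return {opt: sum(weights[c] * scores[c] for c in weights) / total_w
            for opt, scores in option_scores.items()}

# Hypothetical siting options, scored 0-10 per criterion (higher = better)
option_scores = {
    "site_A": {"safety": 3, "cost_savings": 8, "access": 6},
    "site_B": {"safety": 7, "cost_savings": 5, "access": 9},
}
weights = {"safety": 0.5, "cost_savings": 0.3, "access": 0.2}
overall = weighted_sum(option_scores, weights)
```

Additive weighting is only the simplest MCDA aggregation; outranking and value-function methods change this aggregation step while keeping the same criteria/weights structure.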
The Financial Benefit of Early Flood Warnings in Europe
NASA Astrophysics Data System (ADS)
Pappenberger, Florian; Cloke, Hannah L.; Wetterhall, Fredrik; Parker, Dennis J.; Richardson, David; Thielen, Jutta
2015-04-01
Effective disaster risk management relies on science-based solutions to close the gap between prevention and preparedness measures. The outcome of consultations on the UNISDR post-2015 framework for disaster risk reduction highlights the need for cross-border early warning systems to strengthen the preparedness phases of disaster risk management, in order to save lives and property and reduce the overall impact of severe events. In particular, continental- and global-scale flood forecasting systems provide vital information to decision makers with which early warnings of floods can be made. Here the potential monetary benefits of early flood warnings are calculated for the example of the European Flood Awareness System (EFAS), based on pan-European flood damage data and calculations of potential flood damage reductions. The benefits are of the order of 400 Euros for every 1 Euro invested. Because of the uncertainties that accompany the calculation, a large sensitivity analysis is performed in order to develop an envelope of possible financial benefits. Current EFAS system skill is compared against perfect forecasts to demonstrate the importance of further improving forecast skill. Improving the response to warnings is also essential in reaping the benefits of flood early warnings.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...
Data Hemorrhages in the Health-Care Sector
NASA Astrophysics Data System (ADS)
Johnson, M. Eric
Confidential data hemorrhaging from health-care providers poses financial risks to firms and medical risks to patients. We examine the consequences of data hemorrhages, including privacy violations, medical fraud, financial identity theft, and medical identity theft. We also examine the types and sources of data hemorrhages, focusing on inadvertent disclosures. Through an analysis of leaked files, we examine data hemorrhages stemming from inadvertent disclosures on internet-based file-sharing networks. We characterize the security risk for a group of health-care organizations using a direct analysis of leaked files. These files contained highly sensitive medical and personal information that could be maliciously exploited by criminals seeking to commit medical and financial identity theft. We also present evidence of the threat by examining user-issued searches. Our analysis demonstrates both the substantial threat and vulnerability for the health-care sector and the unique complexity exhibited by the US health-care system.
Wang, Xin; Jin, Jing; Yang, Yong; Liu, Wen-Yang; Ren, Hua; Feng, Yan-Ru; Xiao, Qin; Li, Ning; Deng, Lei; Fang, Hui; Jing, Hao; Lu, Ning-Ning; Tang, Yu; Wang, Jian-Yang; Wang, Shu-Lian; Wang, Wei-Hu; Song, Yong-Wen; Liu, Yue-Ping; Li, Ye-Xiong
2016-10-04
The role of adjuvant chemoradiotherapy (ACRT) or adjuvant chemotherapy (ACT) in treating patients with locally advanced upper rectal cancer (URC) after total mesorectal excision (TME) surgery remains unclear. We developed a clinical nomogram and a recursive partitioning analysis (RPA)-based risk stratification system for predicting 5-year cancer-specific survival (CSS), to determine whether these individuals require ACRT or ACT. This retrospective analysis included 547 patients with primary URC. A nomogram was developed based on the Cox regression model. The performance of the model was assessed by concordance index (C-index) and calibration curve in internal validation with bootstrapping. RPA stratified patients into risk groups based on their tumor characteristics. Five independent prognostic factors (age, preoperatively increased carcinoembryonic antigen and carbohydrate antigen 19-9, positive lymph node [PLN] number, tumor deposits [TD], and pathological T classification) were identified and entered into the predictive nomogram. The bootstrap-corrected C-index was 0.757. RPA stratification into three prognostic groups showed clearly different prognoses. Only the high-risk group (patients with PLN ≤ 6 and TD, or PLN > 6) benefited from ACRT plus ACT when compared with surgery followed by ACRT or ACT, and surgery alone (5-year CSS: 70.8% vs. 57.8% vs. 15.6%, P < 0.001). Our nomogram predicts 5-year CSS after TME surgery for locally advanced rectal cancer, and the RPA-based stratification indicates that ACRT plus ACT post-surgery may be an important treatment plan with potentially significant survival advantages in high-risk URC. This may help to select candidates for adjuvant treatment in prospective studies.
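The C-index reported above (0.757) is the fraction of comparable patient pairs whose predicted risk ordering matches their observed survival ordering. Here is a minimal sketch of Harrell's C for fully observed (uncensored) survival times; the full statistic also handles censoring, and the data below are hypothetical:

```python
from itertools import combinations

def c_index(times, risk_scores):
    """Harrell's concordance index for uncensored data: the fraction of
    comparable pairs where the higher-risk subject dies earlier
    (ties in risk score count as half-concordant)."""
    concordant = ties = usable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue  # tied times are not comparable in this sketch
        usable += 1
        # the subject with shorter survival should carry the higher risk
        short, long_ = (i, j) if times[i] < times[j] else (j, i)
        if risk_scores[short] > risk_scores[long_]:
            concordant += 1
        elif risk_scores[short] == risk_scores[long_]:
            ties += 1
    return (concordant + 0.5 * ties) / usable

# Hypothetical survival times (months) and model risk scores,
# perfectly ordered, so C = 1.0
times = [12, 30, 45, 60]
scores = [0.9, 0.7, 0.4, 0.2]
```

A C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, which is why a bootstrap-corrected 0.757 indicates useful but imperfect discrimination.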
Pancer, Katarzyna
2013-01-01
Many factors affect the risk of Legionella infection, such as the design, construction and maintenance of water distribution systems, the presence of individuals who may be exposed and their vulnerability to infection, and the degree of water system colonization and the properties of the Legionella strains. For epidemiological investigations, two characteristics of the Legionella strains are usually determined: serotype and genotype (by sequence-based typing, SBT). In Poland, data regarding legionellosis are fragmentary, despite the fact that it has been a notifiable disease since 2002. The number of reported cases is very low; moreover, the main method of diagnosis is serological examination (delayed diagnosis and cheaper methods), and only isolated cases of Legionnaires' disease (LD) were confirmed by bacterial culture. Therefore, after 10 years of mandatory reporting of Legionella spp. infection in Poland, the real epidemiological situation is still unknown; nevertheless, risk assessment should be carried out, especially in hospitals. In the presented study, the sequence types of 111 L. pneumophila strains isolated from hospital water systems were compared with those present in the EWGLI SBT database, as a complementary element of a comprehensive risk analysis. In total, strains of L. pneumophila belonging to 12 of the 19 STs determined in the presented study had previously been reported to the EWGLI SBT database (ST1, ST42, ST59, ST81, ST87, ST114, ST152, ST191, ST371, ST421, ST461, ST520). Among these, only 7 STs had previously been reported 10 or more times (mainly ST1, ST42, ST81). Analysis of the EWGLI data was carried out; proportionally, the highest percentages of hospital-acquired strains (clinical and environmental) were found for ST81, ST421 and ST152, but the largest number was for ST1. Based on the EWGLI data and the presented results, it was found that persistent colonization of the hospital water systems of 3 hospitals by strains belonging to ST42, ST1 and ST87 indicated an increased risk of legionellosis, especially for ST42.
Yi, Yujun; Tang, Caihong; Yi, Tieci; Yang, Zhifeng; Zhang, Shanghong
2017-11-01
This study examines the distribution of As, Cr, Cd, Hg, Cu, Zn, Pb and Fe in surface sediment, zoobenthos and fish, and quantifies the cumulative ecological risk and human health risk of metals in the river ecosystem, based on a field investigation in the upper Yangtze River. The results revealed high ecological risk of As, Cd, Cu, Hg, Zn and Pb in sediment. As and Cd in fish presented potential human health risks, as assessed by integrated target hazard quotient results based on average and maximum concentrations, respectively. No detrimental health effects of individual heavy metals on humans were found from daily fish consumption. However, the total target hazard quotient (1.659) exceeded 1, meaning that the exposed population might experience noncarcinogenic health risks from the cumulative effect of the metals. An ecological network analysis model was established to identify the transfer routes and quantify cumulative effects of metals on the river ecosystem. Control analysis between compartments showed that large predatory fish depended primarily on omnivorous fish. The cumulative ecological risk of metals indicated that zoobenthos had the largest metal propagation risk, and that compartments at higher trophic levels were not more easily affected by external environmental pollution. A potential cumulative ecological risk of heavy metals in the food web was quantified, and the noncarcinogenic health risk of fish consumption was revealed, for the upper reach of the Yangtze River. Copyright © 2017 Elsevier Inc. All rights reserved.
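The target hazard quotient used above divides the estimated chronic daily metal intake from fish by the oral reference dose, with THQ > 1 flagging potential noncarcinogenic risk; total THQ sums the quotients across metals. Below is a minimal sketch in the standard US EPA form of the equation; all parameter values are hypothetical, not this study's data:

```python
def target_hazard_quotient(c_metal_mg_per_kg, intake_g_per_day,
                           rfd_mg_per_kg_day, body_weight_kg=70.0,
                           exposure_freq_days=365, exposure_years=30):
    """US EPA-style THQ: chronic daily dose of a metal from fish
    consumption divided by its oral reference dose (RfD)."""
    averaging_time_days = exposure_years * 365
    daily_dose = (exposure_freq_days * exposure_years
                  * intake_g_per_day * 1e-3          # g -> kg fish per day
                  * c_metal_mg_per_kg) / (body_weight_kg * averaging_time_days)
    return daily_dose / rfd_mg_per_kg_day

# Hypothetical: As at 0.21 mg/kg in fish, 50 g/day consumption,
# oral RfD 3e-4 mg/kg/day
thq_as = target_hazard_quotient(0.21, 50, 3e-4)
```

Summing such per-metal quotients is what can push a total THQ above 1 even when every individual quotient stays below it, which is the pattern reported in this abstract.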
NASA Astrophysics Data System (ADS)
Gur, David; Zheng, Bin; Lederman, Dror; Dhurjaty, Sreeram; Sumkin, Jules; Zuley, Margarita
2010-02-01
A new resonance-frequency-based electronic impedance spectroscopy (REIS) system with multiple probes, one central probe and six external probes designed to contact the breast skin in a circle of 60 mm radius around the central ("nipple") probe, has been assembled and installed in our breast imaging facility. We are conducting a prospective clinical study to test the performance of this REIS system in identifying younger women (< 50 years old) at higher risk of having or developing breast cancer. In this preliminary analysis, we selected a subset of 100 examinations. Among these, 50 examinations were recommended for biopsy due to detection of a highly suspicious breast lesion and 50 were determined negative during mammography screening. The REIS output signal sweeps used to compute the initial features included both amplitude and phase information, representing differences between corresponding (matched) EIS signal values acquired from the left and right breasts. A genetic algorithm was applied to reduce the feature set and optimize a support vector machine (SVM) to classify the REIS examinations into "biopsy recommended" and "non-biopsy recommended" groups. Using the leave-one-case-out testing method, the classification performance, as measured by the area under the receiver operating characteristic (ROC) curve, was 0.816 ± 0.042. This pilot analysis suggests that the new multi-probe REIS system could potentially be used as a risk stratification tool to identify pre-screened young women who are at higher risk of having or developing breast cancer.
Soy food consumption and risk of prostate cancer: a meta-analysis of observational studies.
Hwang, Ye Won; Kim, Soo Young; Jee, Sun Ha; Kim, Youn Nam; Nam, Chung Mo
2009-01-01
Soybean products have been suggested to have a chemopreventive effect against prostate cancer. The aim of this study was to provide a comprehensive meta-analysis of the extent of the possible association between soy-based food consumption and the risk of prostate cancer. Five cohort studies and 8 case-control studies were identified using MEDLINE, EMBASE, CINAHL, the Korea Medical Database, KoreaMed, the Korean Studies Information Service System, Japana Centra Revuo Medicina, the China National Knowledge Infrastructure, and a manual search. Summary odds ratios (ORs) comparing high versus low categories of soybean consumption were calculated on the basis of a random-effects model. We analyzed the associations for the different types of soy food consumed. The summary ORs (95% CI) were 0.69 (CI = 0.57-0.84) for total soy foods and 0.75 (CI = 0.62-0.89) for nonfermented soy foods. Among individual soy foods, only tofu yielded a significant value, 0.73 (CI = 0.57-0.92). Consumption of soybean milk, miso, or natto did not significantly reduce the risk of prostate cancer. Genistein and daidzein were associated with a lower risk of prostate cancer. This systematic review suggests that soy food consumption could lower the risk of prostate cancer. This conclusion, however, should be interpreted with caution because various biases can affect the results of a meta-analysis.
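Pooling under a random-effects model, as in the summary ORs above, weights each study by the inverse of its within-study plus between-study variance. Below is a minimal DerSimonian-Laird sketch on invented log-OR data, not the 13 studies reviewed here:

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooled OR: estimate between-study variance tau^2
    by the DerSimonian-Laird moment method, then inverse-variance pool."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    return math.exp(pooled)                   # back to the OR scale

# Hypothetical per-study ORs and variances of the log-OR
ors = [0.65, 0.80, 0.70]
pooled_or = dersimonian_laird([math.log(o) for o in ors], [0.04, 0.02, 0.06])
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau^2 is truncated to zero and the result coincides with the fixed-effect pooled estimate.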
NASA Technical Reports Server (NTRS)
Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.
1995-01-01
Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.
Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A
2015-09-01
This article presents ongoing research on the efficient allocation of defense resources to minimize the damage inflicted by an active adversary on a spatially distributed physical network such as a pipeline, water system, or power distribution system. It recognizes the fundamental difference between preparing for natural disasters, such as hurricanes, earthquakes, or even accidental system failures, and allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg "leader-follower" game in which the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionarily stable strategies, in which actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach yields a greater percentage of defender victories than the PRA-based approach. © 2015 Society for Risk Analysis.
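The leader-follower logic above can be illustrated by brute force on a toy network: the defender commits first, anticipating that the attacker will then hit the most damaging unprotected arc. This sketch assumes a protected arc fully neutralizes an attack there and uses invented damage values; it is a minimax caricature, not the authors' integer-programming/agent-based model:

```python
def stackelberg_defense(arc_damage):
    """Defender (leader) picks one arc to protect so as to minimize the
    attacker's (follower's) best response: the maximum damage achievable
    on any unprotected arc."""
    def attacker_payoff(protected):
        return max((d for arc, d in arc_damage.items() if arc != protected),
                   default=0)
    # leader minimizes the follower's best-response damage
    return min(arc_damage, key=attacker_payoff)

# Hypothetical damage if each arc of a pipeline network is destroyed
damage = {"pump_A": 90, "valve_B": 60, "pipe_C": 40}
best_protect = stackelberg_defense(damage)   # defending pump_A caps damage at 60
```

With multiple defenders, probabilistic attack success, and repeated play, this enumeration becomes intractable, which is why the article turns to interdiction models and evolutionary agent-based simulation.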
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a representative medical device (a C-arm X-ray machine).
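The abstract does not reproduce the improved FMEA's formulas, but a standard grey relational ranking of failure modes (one ingredient the authors name) can be sketched as follows. The failure modes, their severity/occurrence/detectability scores, and the distinguishing coefficient are illustrative assumptions, not the paper's data.

```python
# Failure modes scored 1-10 on Severity, Occurrence, Detectability
# (higher = worse). Scores are illustrative, not from the paper.
MODES = {
    "wrong dose entered": (9, 4, 6),
    "cable disconnects":  (6, 3, 2),
    "display misread":    (7, 6, 7),
}
RHO = 0.5  # distinguishing coefficient, conventionally 0.5

def grey_relational_grades(modes, rho=RHO):
    # Reference (worst-case) series: the maximum of each factor.
    ref = [max(s[i] for s in modes.values()) for i in range(3)]
    deltas = {m: [abs(ref[i] - s[i]) for i in range(3)]
              for m, s in modes.items()}
    dmin = min(min(d) for d in deltas.values())
    dmax = max(max(d) for d in deltas.values())
    coef = lambda d: (dmin + rho * dmax) / (d + rho * dmax)
    # Grade = mean grey relational coefficient; higher = closer to worst case.
    return {m: sum(coef(x) for x in d) / 3 for m, d in deltas.items()}

grades = grey_relational_grades(MODES)
ranking = sorted(grades, key=grades.get, reverse=True)
```

Modes whose scores sit closest to the worst-case reference rank first, which avoids the ties and masking effects of the classic severity-times-occurrence-times-detectability RPN product.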
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Best, Susan L.
2006-01-01
When the systems are developed and in place to provide the services needed to operate en route and on the lunar and Martian surfaces, an Earth-based replication will need to be in place for safety and the protection of mission success. The replication will entail all aspects of the flight configuration end to end but will not include any closed-loop systems. It would replicate the infrastructure from lunar and Martian robots, manned surface excursions, through manned and unmanned terrestrial bases, through the various types of communication systems and technologies, manned and unmanned space vehicles (large and small), to Earth-based systems and control centers. An Earth-based replicated infrastructure will enable checkout and test of new technologies, hardware, software updates and upgrades, and procedures without putting humans and missions at risk. Analysis of events, what-ifs, and trouble resolution could be played out on the ground to remove as much risk as possible from any type of proposed change to flight operational systems. With adequate detail, it is possible that failures could be predicted with a high probability and action taken to eliminate them. A major factor in any mission to the Moon and to Mars is the complexity of systems, interfaces, and processes, their limitations, the associated risks, and the factor of the unknown, including development by many contractors and NASA centers. The need to introduce new technologies over the life of the program requires an end-to-end test bed to analyze and evaluate these technologies and what will happen when they are introduced into the flight system. The ability to analyze system behaviors end to end under varying conditions would enhance safety, e.g. fault tolerances. This analysis, along with the ability to mine data from the development environment (e.g. test data), flight operations, and modeling/simulation data, would provide a level of information not currently available to operations and astronauts.
In this paper we analyze the beginnings of such a replication and what it could do to reduce risk in the near term for development. We examine the Space Shuttle Main Engine (SSME) test lab, which has to a large extent accomplished this replication for the SSME and has been highly successful in analyzing hardware and software problems and changes. The cost of replicating the flight system as proposed here could be very high if attempted as an afterthought. We describe the initial steps for developing a replication of this infrastructure, starting with the communication infrastructure. The Constellation of Labs (CofL), under the Command, Control, Communication and Information (C3I) project for the NASA Exploration Initiative, will provide the initial foundation upon which to base this replication. Simply put, there is very little margin for error in high-latency situations, e.g. en route to/from Mars or in an autonomous process on the lunar far side. Any well-thought-out approach to reduce risk and increase safety needs to be accomplished end to end with the actual systems configuration.
O'Reilly, Kathleen M; Lamoureux, Christine; Molodecky, Natalie A; Lyons, Hil; Grassly, Nicholas C; Tallis, Graham
2017-05-26
The international spread of wild poliomyelitis outbreaks continues to threaten eradication of poliomyelitis and in 2014 a public health emergency of international concern was declared. Here we describe a risk scoring system that has been used to assess country-level risks of wild poliomyelitis outbreaks, to inform prioritisation of mass vaccination planning, and describe the change in risk from 2014 to 2016. The methods were also used to assess the risk of emergence of vaccine-derived poliomyelitis outbreaks. Potential explanatory variables were tested against the reported outbreaks of wild poliomyelitis since 2003 using multivariable regression analysis. The regression analysis was translated to a risk score and used to classify countries as Low, Medium, Medium High and High risk, based on the predictive ability of the score. Indicators of population immunity, population displacement and diarrhoeal disease were associated with an increased risk of both wild and vaccine-derived outbreaks. High migration from countries with wild cases was associated with wild outbreaks. High birth numbers were associated with an increased risk of vaccine-derived outbreaks. Use of the scoring system is a transparent and rapid approach to assess country risk of wild and vaccine-derived poliomyelitis outbreaks. Since 2008 there has been a steep reduction in the number of wild poliomyelitis outbreaks and the reduction in countries classified as High and Medium High risk has reflected this. The risk of vaccine-derived poliomyelitis outbreaks has varied geographically. These findings highlight that many countries remain susceptible to poliomyelitis outbreaks and maintenance or improvement in routine immunisation is vital.
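A hedged sketch of how a regression-derived risk score can be banded into the Low/Medium/Medium High/High classes the abstract describes. The coefficients, intercept, indicator values, and cutoffs below are hypothetical, not the paper's fitted values.

```python
import math

# Hypothetical coefficients for the country-level indicators named in the
# abstract (the paper's fitted values are not reproduced here).
COEF = {
    "low_immunity":         1.2,   # 1 - routine immunisation coverage
    "displacement_index":   0.8,
    "diarrhoea_burden":     0.6,
    "migration_from_cases": 1.0,
}
INTERCEPT = -4.0
# Ascending probability cutoffs mapped to ordinal risk classes (assumed).
BANDS = [(0.05, "Low"), (0.15, "Medium"), (0.40, "Medium High"), (1.01, "High")]

def outbreak_risk(indicators):
    """Logistic-regression-style score mapped to an ordinal risk class."""
    z = INTERCEPT + sum(COEF[k] * indicators[k] for k in COEF)
    p = 1.0 / (1.0 + math.exp(-z))          # predicted outbreak probability
    return next(label for cutoff, label in BANDS if p < cutoff)

country = {"low_immunity": 0.5, "displacement_index": 2.0,
           "diarrhoea_burden": 1.0, "migration_from_cases": 1.5}
risk_class = outbreak_risk(country)
```

Banding the continuous score keeps the ranking transparent for vaccination planners while preserving the regression's ordering of countries.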
Dotson, G Scott; Hudson, Naomi L; Maier, Andrew
2015-01-01
Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management.
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
Analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that the identified failure modes can be mitigated cost-effectively and efficiently. The FMECA can begin once there is enough detail to define the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs coincident with the design process and is iterative, allowing design changes to overcome deficiencies identified in the analysis. Risk registers for major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
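A minimal sketch of the risk-register ranking step of an FMECA, assuming a simple severity-times-occurrence criticality number. The subsystems, failure modes, and category scales are illustrative, not taken from the Stingray registers.

```python
# Minimal risk-register entries for an FMECA pass over WEC subsystems.
# Severity and occurrence categories are illustrative (1 = lowest).
REGISTER = [
    # (subsystem, failure mode, severity 1-4, occurrence 1-5)
    ("mooring", "line fatigue fracture", 4, 3),
    ("PTO",     "hydraulic seal leak",   2, 4),
    ("hull",    "weld corrosion",        3, 2),
]

def criticality(entry):
    """Simple criticality number; higher means address first."""
    _, _, sev, occ = entry
    return sev * occ

# Sort the register so the highest-criticality items drive design changes.
ranked = sorted(REGISTER, key=criticality, reverse=True)
```

Re-running the ranking after each design iteration is what makes the FMECA the iterative tool the abstract describes.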
CAD system for automatic analysis of CT perfusion maps
NASA Astrophysics Data System (ADS)
Hachaj, T.; Ogiela, M. R.
2011-03-01
In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). These methods perform both quantitative analysis (detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions) and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation of visualized symptoms (deciding the type of lesion, ischemic or hemorrhagic, and whether the brain tissue is at risk of infarction) is done by so-called cognitive inference processes, allowing reasoning about the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (in the C# programming language) and can be used on any standard PC with the .NET Framework installed.
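The quantitative asymmetry detection mentioned above can be sketched by mirroring a perfusion map across the midline and thresholding a left/right asymmetry index. This is a simplified stand-in for the paper's atlas-based lesion description; the array and threshold are illustrative.

```python
import numpy as np

def perfusion_asymmetry(cbf, threshold=0.3):
    """Flag voxels whose left/right mirrored CBF differs by more than
    `threshold` of the bilateral mean (a common asymmetry index)."""
    mirrored = cbf[:, ::-1]                  # reflect across the midline
    mean = (cbf + mirrored) / 2.0
    with np.errstate(divide="ignore", invalid="ignore"):
        ai = np.abs(cbf - mirrored) / np.where(mean > 0, mean, np.inf)
    return ai > threshold

cbf = np.array([[50.0, 50.0],
                [60.0, 20.0]])  # second row: strong right-side deficit
mask = perfusion_asymmetry(cbf)
```

In a full system the flagged region would then be located against the anatomy atlas and passed to the semantic interpretation stage.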
A Research Roadmap for Computation-Based Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
Qi, Xiaoxing; Liu, Liming; Liu, Yabin; Yao, Lan
2013-06-01
Integrated food security covers three aspects: food quantity security, food quality security, and sustainable food security. Because sustainable food security requires that food security be compatible with sustainable development, the risk assessment of sustainable food security is becoming one of the most important issues. This paper focuses on the characteristics of sustainable food security problems in China's major grain-producing areas. We establish an index system based on land resources and eco-environmental conditions and apply a dynamic assessment method based on status assessments and trend analysis models to overcome the shortcomings of static evaluation. Using fuzzy mathematics, the risks are categorized into four grades: negligible risk, low risk, medium risk, and high risk. A case study was conducted in one of China's major grain-producing areas, the Dongting Lake area. The results predict that the status of sustainable food security in the Dongting Lake area is unsatisfactory for the foreseeable future. The number of districts in the medium-risk range will increase from six to ten by 2015 due to increasing population pressure, a decrease in the cultivated area, and a decrease in the effectively irrigated area. Therefore, appropriate policies and measures should be put forward to improve it. The results could also provide direct support for an early warning system, which could be used to monitor food security trends or nutritional status so as to inform policy makers of impending food shortages, to prevent sustainable food security risk based on classical systematic methods. This is the first study of sustainable food security in terms of risk assessment, from the perspective of resources and the environment, at the regional scale.
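The four-grade fuzzy categorization can be sketched with triangular membership functions over a composite risk index. The breakpoints below are assumptions, since the paper's membership functions are not given in the abstract.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Membership functions over a 0-1 composite risk index (breakpoints assumed).
GRADES = {
    "negligible": (-0.01, 0.0, 0.3),
    "low":        (0.1, 0.35, 0.6),
    "medium":     (0.4, 0.65, 0.9),
    "high":       (0.7, 1.0, 1.01),
}

def fuzzy_grade(index):
    """Assign the grade with the highest membership degree."""
    return max(GRADES, key=lambda g: triangular(index, *GRADES[g]))
```

Overlapping memberships let a district near a grade boundary contribute to both grades before the maximum-membership rule picks its label.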
NASA Astrophysics Data System (ADS)
Gebert, Niklas; Post, Joachim
2010-05-01
The development of early warning systems is one of the key domains of adaptation to global environmental change and contributes greatly to the development of societal reaction and adaptive capacities for dealing with extreme events. Indonesia in particular is highly exposed to tsunamis: on average, a small or medium-size tsunami occurs in the region every three years, causing damage and death. In the aftermath of the Indian Ocean tsunami of 2004, the German and Indonesian governments agreed on a joint cooperation to develop a people-centered, end-to-end early warning system (GITEWS). The analysis of risk and vulnerability, as an important step in risk (and early warning) governance, is a precondition for the design of effective early warning structures: it delivers the knowledge base for developing institutionalized quick-response mechanisms for the organizations involved in issuing a tsunami warning, and for exposed populations to react to warnings and manage evacuation before the first tsunami wave hits. A special challenge for developing countries is thus the governance of the complex cross-sectoral and cross-scale institutional, social, and spatial processes and requirements for the conceptualization, implementation, and optimization of a people-centered tsunami early warning system. In support of this, the risk and vulnerability assessment of the case study aims at identifying the factors that constitute the causal structure of the (dis)functionality between the technological warning system and the social response system that causes loss of life during an emergency: Which social groups are likely to be less able to receive and respond to an early warning alert? And are people able to evacuate in due time? Here, only an interdisciplinary research approach is capable of analyzing the socio-spatial and environmental conditions of vulnerability and risk and of producing valuable results for decision makers and civil society to manage tsunami risk in the early warning context.
This requires the integration of natural/spatial and social science concepts, methods, and data. For example, a scenario-based approach to tsunami inundation modeling was developed to provide decision makers with options for deciding up to what level they aim to protect their people and territory, while household surveys were conducted for the spatial analysis of the evacuation preparedness of the population as a function of place-specific hazard, risk, warning, and evacuation perception; remote sensing was applied for the spatial (land-use) analysis of the socio-physical conditions of a city and region for evacuation; and existing social and population statistics were combined with land-use data for the precise spatial mapping of the population exposed to tsunami risks. Only by utilizing such a comprehensive assessment approach can valuable information for risk governance be generated. The results are mapped using GIS and designed according to the specific needs of different end users, such as public authorities involved in the design of warning dissemination strategies, land-use planners (shelter planning, road network configuration), and NGOs mandated to educate the general public on tsunami risk and evacuation behavior. The case study of the city of Padang, Indonesia (one of the pilot areas of GITEWS) clearly shows that only by intersecting social (vulnerability) and natural hazards research can a comprehensive picture of tsunami risk be provided with which risk governance in the early warning context can be conducted in a comprehensive, systemic, and sustainable manner.
Conversion of Questionnaire Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness. So, the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log normal scale typically used in human error analysis (see A.D. Swain and H.E. 
Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk of failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data is absent.
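A minimal sketch of the conversion and fault-tree steps described above: adjectival ratings are mapped to log-spaced failure probabilities (illustrative anchors, not the NUREG/CR-1278 values) and combined through an OR gate.

```python
import math

# Adjectival performance ratings mapped to basic-event failure probabilities
# on a log scale, in the spirit of Swain & Guttmann's HRA tables (the exact
# NUREG/CR-1278 values are not reproduced; these anchors are illustrative).
RATING_TO_PFAIL = {
    "well":              1e-3,
    "adequate":          1e-2,
    "needs improvement": 1e-1,
    "not performed":     1.0,   # task in a state of failure
}

def or_gate(probabilities):
    """Top-event probability when any basic-event failure causes failure."""
    survive = math.prod(1.0 - p for p in probabilities)
    return 1.0 - survive

answers = ["well", "adequate", "needs improvement"]
p_top = or_gate(RATING_TO_PFAIL[a] for a in answers)
```

A real MC&A fault tree would mix AND and OR gates over many basic events; the OR gate above shows how a single poorly performed task dominates the top-event probability.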
The SAM framework: modeling the effects of management factors on human behavior in risk analysis.
Murphy, D M; Paté-Cornell, M E
1996-08-01
Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.
NASA Astrophysics Data System (ADS)
Parwatiningtyas, Diyan; Ambarsari, Erlin Windia; Marlina, Dwi; Wiratomo, Yogi
2014-03-01
Indonesia possesses vast natural assets to be managed and utilized, by both local governments and local communities, especially in the mining sector. However, mining activities can change the state of the surface layer of the earth and carry a high disaster risk. This could threaten safety and disrupt human life, damage the environment, cause loss of property, and have psychological impacts, as addressed by Law No. 24 of 2007. We therefore strive to manage and minimize the risk of mine disasters in the region by calculating the Amplification Factor (AF) from microtremor analysis following Kanai and Nakamura, with the decision system tested by Analytic Network Process (ANP) analysis. Based on the amplification factors and ANP results obtained, some points showed instability in the surface layer of the mining area, including sites TP-7, TP-8, TP-9, and TP-10 (Birowo2). In terms of structure, these locations are indicated as unstable because of a sloping surface layer, resulting in a high risk of landslides and earthquake damage. The other areas of the mine site can be considered stable.
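The Nakamura-style H/V amplification factor referenced above can be sketched from smoothed microtremor amplitude spectra. The spectra here are illustrative arrays, not data from the study sites.

```python
import numpy as np

def hv_amplification(freqs, ns, ew, v):
    """Nakamura-style H/V spectral ratio from smoothed amplitude spectra.
    Returns (peak amplification factor, frequency of the peak)."""
    h = np.sqrt(ns * ew)        # combined horizontal amplitude
    ratio = h / v
    i = int(np.argmax(ratio))
    return float(ratio[i]), float(freqs[i])

freqs = np.array([0.5, 1.0, 2.0, 4.0])   # Hz
ns = np.array([2.0, 6.0, 3.0, 1.0])      # illustrative spectra
ew = np.array([2.0, 6.0, 3.0, 1.0])
v  = np.array([2.0, 2.0, 2.0, 1.0])
af, f0 = hv_amplification(freqs, ns, ew, v)
```

A high peak AF at a site's resonance frequency is the kind of evidence used to flag surface-layer instability at points such as TP-7 through TP-10.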
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
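Component reliability as the probability that structural resistance falls below structural response can be sketched with a small Monte Carlo estimate. The normal distributions and their parameters are illustrative assumptions, not NESSUS resistance or response models.

```python
import random

random.seed(42)

def failure_probability(n=200_000):
    """P(resistance < response) for two independent normal variables,
    estimated by Monte Carlo (distribution parameters are illustrative)."""
    fails = 0
    for _ in range(n):
        response = random.gauss(mu=100.0, sigma=10.0)    # e.g. stress, MPa
        resistance = random.gauss(mu=150.0, sigma=15.0)  # e.g. strength, MPa
        fails += resistance < response
    return fails / n

p_f = failure_probability()
```

For this normal-normal case the exact answer is the tail probability of the margin R - S (about 0.0028 here); codes like NESSUS use faster approximate reliability methods when resistance models are more complex.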
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; van Westen, Cees; Bakker, Wim H.; Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri
2014-05-01
Natural hazard risk management requires decision making in several stages. Decision making on alternatives for risk-reduction planning starts with an intelligence phase, in which the decision problems are recognized and the objectives identified. In the design phase, the alternatives are developed and decision makers assign values to each. The final phase evaluates the optimal choice by comparing the alternatives, defining indicators, assigning a weight to each, and ranking them. This process is referred to as Multi-Criteria Decision Making (MCDM) analysis, Multi-Criteria Evaluation (MCE), or Multi-Criteria Analysis (MCA). In the framework of the ongoing 7th Framework Program "CHANGES" (2011-2014, Grant Agreement No. 263953) of the European Commission, a Spatial Decision Support System is under development, which aims to analyse changes in hydro-meteorological risk and provide support in selecting the best risk-reduction alternative. This paper describes the module for Multi-Criteria Decision Making (MCDM) analysis, which incorporates monetary and non-monetary criteria in the analysis of the optimal alternative. The MCDM module consists of several components. The first step is to define criteria (or indicators), which are subdivided into disadvantages (criteria that indicate the difficulty of implementing the risk-reduction strategy, also referred to as costs) and advantages (criteria that indicate its favorability, also referred to as benefits). In the next step the stakeholders can use the developed web-based tool for prioritizing criteria and building the decision matrix. Public participation plays a role in decision making, and this is also planned through the use of a mobile web version where the general public can indicate their agreement with the proposed alternatives. The application is being tested through a case study on risk reduction in a mountainous valley in the Alps affected by flooding. 
Four alternatives are evaluated in this case study, namely: construction of defense structures, relocation, implementation of an early warning system, and spatial planning regulations. Some of the criteria are determined partly in other modules of the CHANGES SDSS, such as the costs of implementation, the risk reduction in monetary values, and societal risk. Other criteria, which may be environmental, economic, cultural, or perception-based in nature, are defined by different stakeholders such as local authorities, expert organizations, the private sector, and the local public. In the next step, the stakeholders weight the importance of the criteria by pairwise comparison and visualize the decision matrix, a matrix of criteria versus alternative values. Finally, the alternatives are ranked by the Analytic Hierarchy Process (AHP) method. We expect that this approach will help decision makers simplify their work and reduce costs, because the process is more transparent, more accurate, and involves group decision making, giving more confidence in the overall decision-making process. Keywords: MCDM, Analytic Hierarchy Process (AHP), SDSS, Natural Hazard Risk Management
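The AHP ranking step can be sketched with a principal-eigenvector priority computation over a pairwise comparison matrix. The judgments below are illustrative, not the case-study stakeholders' weights.

```python
import numpy as np

# Pairwise comparison of the four risk-reduction alternatives on one criterion
# (Saaty's 1-9 scale; the judgments are illustrative, not case-study data).
ALTS = ["defense structures", "relocation", "early warning", "spatial planning"]
A = np.array([
    [1.0,   3.0, 2.0,   4.0],
    [1/3,   1.0, 1/2,   2.0],
    [1/2,   2.0, 1.0,   3.0],
    [1/4,   1/2, 1/3,   1.0],
])

def ahp_weights(m):
    """Priority weights from the principal eigenvector, normalized to sum 1."""
    vals, vecs = np.linalg.eig(m)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

w = ahp_weights(A)
ranking = [a for _, a in sorted(zip(w, ALTS), reverse=True)]
```

In practice one pairwise matrix is built per criterion and per stakeholder group, and a consistency ratio check guards against contradictory judgments before the weights are aggregated.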
Selected considerations of implementation of the GNSS
NASA Astrophysics Data System (ADS)
Cwiklak, Janusz; Fellner, Andrzej; Fellner, Radoslaw; Jafernik, Henryk; Sledzinski, Janusz
2014-05-01
The article describes an analysis of safety and risk for the implementation of precise approach procedures (Localizer Performance with Vertical Guidance, LPV) using a GNSS sensor at the Warsaw and Katowice airports. Techniques for the identification of threats (controlled flight into terrain, landing accident, mid-air collision) were used together with evaluation methods based on Fault Tree Analysis, risk probability, a safety risk evaluation matrix, and Functional Hazard Assessment. Safety goals were also determined. The research established the probabilities of the threats occurring and allowed them to be compared with the ILS. As a result of conducting the Preliminary System Safety Assessment (PSSA), the requirements essential to reaching the required level of safety were defined. It is worth underlining that the quantitative requirements were defined using FTA.
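A safety risk evaluation matrix of the kind named above can be sketched as a severity-conditional probability threshold. The categories and tolerable-probability limits are illustrative, in the style of aviation safety-assessment practice, not the study's actual matrix.

```python
# Severity categories and maximum tolerable probability per flight hour
# (illustrative limits in the style of aviation SSA/PSSA practice).
SEVERITY = {"catastrophic": 4, "hazardous": 3, "major": 2, "minor": 1}
ACCEPTABLE_PER_SEVERITY = {
    4: 1e-9, 3: 1e-7, 2: 1e-5, 1: 1e-3,
}

def assess(hazard, severity, probability):
    """Classify a hazard as acceptable/unacceptable against its severity's limit."""
    limit = ACCEPTABLE_PER_SEVERITY[SEVERITY[severity]]
    verdict = "acceptable" if probability <= limit else "unacceptable"
    return (hazard, verdict)

h1 = assess("controlled flight into terrain", "catastrophic", 5e-10)
h2 = assess("landing accident", "hazardous", 3e-6)
```

Hazards landing in the unacceptable region are exactly those for which the PSSA derives additional safety requirements.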
VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security, and HIPAA Compliance
Watzlaf, Valerie J.M.; Moeini, Sohrab; Firouzan, Patti
2010-01-01
Voice over the Internet Protocol (VoIP) systems such as Adobe ConnectNow, Skype, ooVoo, etc. may include the use of software applications for telerehabilitation (TR) therapy that can provide voice and video teleconferencing between patients and therapists. Privacy and security applications as well as HIPAA compliance within these protocols have been questioned by information technologists, providers of care and other health care entities. This paper develops a privacy and security checklist that can be used within a VoIP system to determine if it meets privacy and security procedures and whether it is HIPAA compliant. Based on this analysis, specific HIPAA criteria that therapists and health care facilities should follow are outlined and discussed, and therapists must weigh the risks and benefits when deciding to use VoIP software for TR. PMID:25945172
Ou, Huang-Tz; Chen, Yen-Ting; Liu, Ya-Ming; Wu, Jin-Shang
2016-06-01
To assess the cost-effectiveness of metformin-based dual therapies associated with cardiovascular disease (CVD) risk in a Chinese population with type 2 diabetes. We utilized Taiwan's National Health Insurance Research Database (NHIRD) 1997-2011, which is derived from the claims of National Health Insurance, a mandatory-enrollment single-payer system that covers over 99% of Taiwan's population. Four metformin-based dual therapy cohorts were used, namely a reference group of metformin plus sulfonylureas (Metformin-SU) and metformin plus acarbose, metformin plus thiazolidinediones (Metformin-TZD), and metformin plus glinides (Metformin-glinides). Using propensity scores, each subject in a comparison cohort was 1:1 matched to a referent. The effectiveness outcome was CVD risk, and only direct medical costs were included. A Markov chain model was applied to project lifetime outcomes, discounted at 3% per annum, and bootstrapping was performed to assess uncertainty in the analysis. Metformin-glinides was the most cost-effective option in the base-case analysis, saving $194 USD per percentage point of reduction in CVD risk compared to Metformin-SU. However, for the elderly or those with severe diabetic complications, Metformin-TZD, especially pioglitazone, was more suitable: compared to Metformin-SU, Metformin-TZD saved $840.1 USD per percentage point of reduction in CVD risk. Among TZDs, Metformin-pioglitazone saved $1831.5 USD per percentage point of associated CVD risk reduction, as compared to Metformin-rosiglitazone. When CVD is considered an important clinical outcome, Metformin-pioglitazone is cost-effective, in particular for the elderly and those with severe diabetic complications. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
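The discounting and cost-per-risk-point arithmetic underlying such comparisons can be sketched as follows. This is a simplified illustration of 3% annual discounting and of the "dollars saved per percentage point of CVD risk reduction" ratio; the input numbers are hypothetical (chosen to reproduce the reported $194 figure), and the study's actual projection uses a full Markov chain model:

```python
def present_value(annual_costs, rate=0.03):
    """Discount a stream of annual costs to present value at
    `rate` per annum (3% as in the study); year 0 is undiscounted."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(annual_costs))

def cost_saved_per_cvd_point(cost_ref, cost_alt, cvd_ref, cvd_alt):
    """Cost saved per percentage-point reduction in CVD risk for an
    alternative regimen relative to the reference regimen."""
    return (cost_ref - cost_alt) / (cvd_ref - cvd_alt)

# Hypothetical lifetime costs (USD) and CVD risks (%) after discounting:
saving = cost_saved_per_cvd_point(cost_ref=1000.0, cost_alt=806.0,
                                  cvd_ref=10.0, cvd_alt=9.0)
```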
NASA Technical Reports Server (NTRS)
Smart, Christian
1998-01-01
During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and the risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled.
The groundrules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the occurrence of a failure mode to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).
Physical risk factors identification based on body sensor network combined to videotaping.
Vignais, Nicolas; Bernard, Fabien; Touvenot, Gérard; Sagot, Jean-Claude
2017-11-01
The aim of this study was to perform an ergonomic analysis of a material handling task by combining a subtask video analysis with a RULA computation, implemented continuously through a motion capture system combining inertial sensors and electrogoniometers. Five workers participated in the experiment. Seven inertial measurement units, placed on the worker's upper body (pelvis, thorax, head, arms, forearms), were implemented through a biomechanical model of the upper body to continuously provide trunk, neck, shoulder and elbow joint angles. Wrist joint angles were derived from electrogoniometers synchronized with the inertial measurement system. The worker's activity was simultaneously recorded on video. During post-processing, joint angles were used as inputs to a computationally implemented ergonomic evaluation based on the RULA method. Consequently, a RULA score was calculated at each time step to characterize the risk of exposure of the upper body (right and left sides). Local risk scores were also computed to identify the anatomical origin of the exposure. Moreover, the video-recorded work activity was time-studied in order to classify and quantify all subtasks involved in the task. Results showed that mean RULA scores were at high risk for all participants (6 and 6.2 for the right and left sides, respectively). A temporal analysis demonstrated that workers spent most of the work time at a RULA score of 7 (right: 49.19 ± 35.27%; left: 55.5 ± 29.69%). Mean local scores revealed that the most exposed joints during the task were the elbows, lower arms, wrists and hands. The elbows and lower arms were indeed at a high level of risk during the entire work cycle (100% for right and left sides). The wrists and hands were also exposed to a risky level for much of the work period (right: 82.13 ± 7.46%; left: 77.85 ± 12.46%).
Concerning the subtask analysis, the subtasks called 'snow thrower', 'opening the vacuum sealer', 'cleaning' and 'storing' were identified as the most awkward for the right and left sides, given their mean RULA scores and percentages of time spent at risky levels. The analysis of the results led to ergonomic recommendations for the redesign of the workstation. Contributions of the proposed innovative system dedicated to physical ergonomic assessment are further discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
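A minimal sketch of how continuous joint angles can be turned into RULA exposure statistics of the kind reported above. The lower-arm rule follows the published RULA worksheet (score 1 inside the 60-100 degree flexion range, 2 outside it); the score series and the `percent_time_at` helper are illustrative, not the authors' implementation:

```python
import numpy as np

def rula_lower_arm_score(elbow_flexion_deg):
    """RULA lower-arm score: 1 within the 60-100 degree comfort
    range of elbow flexion, 2 outside it."""
    return 1 if 60 <= elbow_flexion_deg <= 100 else 2

def percent_time_at(rula_scores, level):
    """Share of the recorded work cycle (in %) spent at a given
    grand RULA score, assuming equally spaced time steps."""
    s = np.asarray(rula_scores)
    return 100.0 * np.mean(s == level)

# Hypothetical per-time-step grand scores for one worker:
cycle = [7, 7, 6, 5]
share_at_7 = percent_time_at(cycle, 7)
```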
A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Non-Small Cell Lung Cancer.
Raju, G K; Gurumurthi, K; Domike, R; Kazandjian, D; Blumenthal, G; Pazdur, R; Woodcock, J
2016-12-01
Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analyses. There is much interest in quantifying regulatory approaches to benefit and risk. In this work, a quantitative benefit-risk analysis was applied to regulatory decision-making about new drugs to treat advanced non-small cell lung cancer (NSCLC). Benefits and risks were analyzed for 20 US Food and Drug Administration (FDA) decisions on candidate treatments submitted between 2003 and 2015. For the benefit analysis, the median overall survival (OS) was used where available; when not available, OS was estimated based on the overall response rate (ORR) or progression-free survival (PFS). Risks were analyzed based on the magnitude (or severity) of harm and the likelihood of occurrence. Additionally, a sensitivity analysis was explored to demonstrate the analysis of systematic uncertainty. The FDA approval decision outcomes considered were found to be consistent with the benefit-risk logic. © 2016 American Society for Clinical Pharmacology and Therapeutics.
Li, Ye; Wang, Hao; Wang, Wei; Xing, Lu; Liu, Shanwen; Wei, Xueyan
2017-01-01
Although plenty of studies have recently examined the impact of cooperative adaptive cruise control (CACC) systems on traffic efficiency, little research has analyzed the safety effects of this advanced driver-assistance system. Thus, the primary objective of this study is to evaluate the impact of the CACC system on reducing rear-end collision risk on freeways. The CACC model is first developed, based on the Intelligent Driver Model (IDM). Then, two surrogate safety measures derived from time-to-collision (TTC), namely time exposed time-to-collision (TET) and time integrated time-to-collision (TIT), are introduced to quantify collision risk. The safety effects are analyzed both theoretically and experimentally, by linear stability analysis and by simulation. The theoretical and simulation results consistently indicate that the CACC system brings dramatic benefits for reducing rear-end collision risk (TET and TIT are each reduced by more than 90%) when the desired time headway and time delay are set properly. The sensitivity analysis indicates little difference among different values of the TTC threshold and of the length of a CACC platoon. The results also show that the safety improvements weaken as the market penetration rate of CACC decreases and the time delay between platoons increases. We also evaluate the traffic efficiency of the CACC system with different desired time headways. Copyright © 2016 Elsevier Ltd. All rights reserved.
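The two surrogate safety measures can be sketched as follows, using a common TET/TIT formulation (TET is the time spent below a TTC threshold; TIT integrates the shortfall below that threshold over the exposed time). The threshold, time step, and sample series are illustrative, not values from the study:

```python
def ttc(gap, closing_speed):
    """Time-to-collision: spacing divided by closing speed
    (infinite if the gap is opening)."""
    return gap / closing_speed if closing_speed > 0 else float('inf')

def tet_tit(ttc_series, dt, ttc_star=2.0):
    """TET: total time with TTC at or below the threshold TTC*.
    TIT: integral of (TTC* - TTC) over that exposed time."""
    tet = 0.0
    tit = 0.0
    for t in ttc_series:
        if 0 <= t <= ttc_star:
            tet += dt
            tit += (ttc_star - t) * dt
    return tet, tit

# Hypothetical TTC samples for one follower, at 0.5 s steps:
series = [5.0, 3.0, 1.5, 1.0, 2.5]
tet, tit = tet_tit(series, dt=0.5)
```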
NASA Astrophysics Data System (ADS)
van den Dool, G.
2017-11-01
This study (van den Dool, 2017) is a proof of concept for a global predictive wildfire model, in which the temporal-spatial characteristics of wildfires are placed in a Geographical Information System (GIS) and the risk analysis is based on data-driven fuzzy logic functions. The data sources used in this model are available as global datasets, but are subdivided into three pilot areas: North America (California/Nevada), Europe (Spain), and Asia (Mongolia), and are downscaled to the highest resolution (3 arc-seconds). The GIS is constructed around three themes: topography, fuel availability and climate. From the topographical data, six derived sub-themes are created and converted to a fuzzy membership based on the catchment area statistics. The fuel availability score is a composite of four data layers: land cover, wood loads, biomass, and biovolumes. As input for the climatological sub-model, reanalysed daily-averaged weather data is used, accumulated into a global weekly time window (to account for the uncertainty within the climatological model); this forms the temporal component of the model. The final product is a wildfire risk score (from 0 to 1) by week, representing the average wildfire risk in an area. To compute the potential wildfire risk, the sub-models are combined using a Multi-Criteria Approach, and the model results are validated against the area under the Receiver Operating Characteristic curve.
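The fuzzy-membership conversion and the multi-criteria combination of the three sub-models might be sketched as below. Linear membership functions and the particular weights are assumptions for illustration; the study's actual membership functions are data-driven:

```python
def linear_membership(x, lo, hi):
    """Map a raw indicator onto [0, 1] with a linear fuzzy
    membership function: 0 at or below lo, 1 at or above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def wildfire_risk(topo, fuel, climate, weights=(0.3, 0.3, 0.4)):
    """Weighted multi-criteria combination of the three sub-model
    scores (each in [0, 1]) into a single 0-1 risk score."""
    w_t, w_f, w_c = weights
    return w_t * topo + w_f * fuel + w_c * climate

# Hypothetical weekly sub-model scores for one cell:
risk = wildfire_risk(0.2, 0.8, 0.6)   # 0.06 + 0.24 + 0.24 = 0.54
```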
Assessment and uncertainty analysis of groundwater risk.
Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei
2018-01-01
Groundwater of relatively stable quantity and quality is commonly used by human beings. However, with the over-extraction of groundwater, problems such as groundwater funnels, land subsidence and salt water intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-exploited regions, it is necessary to conduct an assessment of groundwater risk. In this paper, the risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet decision makers' demand for risk information and overcome the limitation of previous risk assessments, whose results were expressed as deterministic point estimates that ignore the uncertainty of the assessment. Copyright © 2017 Elsevier Inc. All rights reserved.
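The normalisation-and-weighting step for the risk values can be sketched generically as follows (a min-max normalisation followed by a weighted sum; the actual index system, factor values, and synthetic weights are those defined in the paper):

```python
def min_max_normalize(values):
    """Min-max normalise raw factor values onto [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite_risk(norm_factors, weights):
    """Composite risk value: weighted sum of normalised factor
    values, with weights summing to 1."""
    return sum(f * w for f, w in zip(norm_factors, weights))

# Hypothetical factors for one evaluation unit:
norm = min_max_normalize([2.0, 4.0, 6.0])
risk_value = composite_risk([1.0, 0.0], [0.6, 0.4])
```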
Jordan, Teresa E.
2015-09-30
This submission of Utilization Analysis data to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (project DE-EE0006726). The submission includes data pertinent to the methods and results of an analysis of the Surface Levelized Cost of Heat (SLCOH) for US Census Bureau Places within the study area. This was calculated using a modification of a program called GEOPHIRES, available at http://koenraadbeckers.net/geophires/index.php. The MATLAB modules used in conjunction with GEOPHIRES, the MATLAB data input file, the GEOPHIRES output data file, and an explanation of the software components have been provided. Results of the SLCOH analysis appear on 4 .png image files as mapped risk of heat utilization. For each of the 4 image (.png) files, there is an accompanying georeferenced TIF (.tif) file by the same name. In addition to calculating SLCOH, this Task 4 also identified many sites that may be prospects for use of a geothermal district heating system, based on their size and industry, rather than on the SLCOH. An industry sorted listing of the sites (.xlsx) and a map of these sites plotted as a layer onto different iterations of maps combining the three geological risk factors (Thermal Quality, Natural Reservoir Quality, and Risk of Seismicity) has been provided. In addition to the 6 image (.png) files of the maps in this series, a shape (.shp) file and 7 associated files are included as well. Finally, supporting files (.pdf) describing the utilization analysis methodology and summarizing the anticipated permitting for a deep district heating system are supplied. UPDATE: Newer version of the Utilization Analysis has been added here: https://gdr.openei.org/submissions/878
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2009-08-01
Failure Mode and Effect Analysis (FMEA) has been applied to the risk assessment of snail manufacturing. A tentative application of FMEA to the snail industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the failure modes occurring in a food chain system (a snail processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work, a comparison of the ISO 22000 analysis with HACCP is carried out for snail processing and packaging. However, the main emphasis was placed on quantifying the risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells, and poisonous mushrooms were the processes identified as those with the highest RPN (280, 240, 147, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of the conclusions derived from the risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a snail processing industry is considered imperative.
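The RPN quantification is simply severity x occurrence x detection on 1-10 scales. In the sketch below, the individual S/O/D ratings are hypothetical, chosen only to reproduce the RPN magnitudes reported above, with 130 as the acceptability threshold:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number = S x O x D, each rated 1-10."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be in 1-10")
    return severity * occurrence * detection

# Hypothetical S/O/D ratings per hazard (products match the
# abstract's reported RPNs of 280 and 240):
hazards = {
    "tin sterilization":    rpn(8, 7, 5),
    "heavy-metal build-up": rpn(8, 6, 5),
}

# Flag hazards above the acceptability limit for corrective action.
RPN_LIMIT = 130
flagged = {h: v for h, v in hazards.items() if v > RPN_LIMIT}
```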
Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments
NASA Technical Reports Server (NTRS)
Manning, Ted A.; Lawrence, Scott L.
2014-01-01
As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.
Analysis of reactor trips originating in balance of plant systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stetson, F.T.; Gallagher, D.W.; Le, P.T.
1990-09-01
This report documents the results of an analysis of balance-of-plant (BOP) related reactor trips at commercial US nuclear power plants over a 5-year period, from January 1, 1984, through December 31, 1988. The study was performed for the Plant Systems Branch, Office of Nuclear Reactor Regulation, US Nuclear Regulatory Commission. The objectives of the study were: to improve the level of understanding of BOP-related challenges to safety systems by identifying and categorizing such events; to prepare a computerized database of BOP-related reactor trip events and use the database to identify trends and patterns in the population of these events; to investigate the risk implications of BOP events that challenge safety systems; and to provide recommendations on how to address BOP-related concerns in a regulatory context. 18 refs., 2 figs., 27 tabs.
Real time sound analysis for medical remote monitoring.
Istrate, Dan; Binet, Morgan; Cheng, Sreng
2008-01-01
The growth of the aging population in Europe means more people living alone at home, with an increased risk of home accidents or falls. In order to prevent or detect a distress situation in the case of an elderly person living alone, a remote monitoring system based on sound environment analysis can be used. We have already proposed a system which monitors the sound environment and identifies everyday-life sounds and distress expressions in order to contribute to an alarm decision. This first system uses a classical sound card on a PC or embedded PC, allowing only one channel to be monitored. In this paper, we propose a new architecture for the remote monitoring system, relying on a real-time multichannel implementation based on a USB acquisition card. This structure allows eight channels to be monitored, in order to cover all the rooms of an apartment. Moreover, the SNR estimation currently drives the adaptation of the recognition models to the environment.
Industrial Accidents Triggered by Natural Hazards: an Emerging Risk Issue
NASA Astrophysics Data System (ADS)
Renni, Elisabetta; Krausmann, Elisabeth; Basco, Anna; Salzano, Ernesto; Cozzani, Valerio
2010-05-01
Natural disasters such as earthquakes, tsunamis, floods or hurricanes have recently and dramatically hit several countries worldwide. Both direct and indirect consequences affected the population, causing on the one hand a high number of fatalities and on the other hand economic losses so significant that the gross national product may be affected for many years. Loss of critical industrial infrastructure (electricity generation and distribution, gas pipelines, oil refineries, etc.) also occurred, causing further indirect damage to the population. In several cases, accident scenarios with large releases of hazardous materials were triggered by these natural events, causing so-called "Natech events", in which the overall damage resulted from the simultaneous consequences of the natural event and of the release of hazardous substances. Toxic releases, large fires and explosions, as well as possible long-term environmental pollution, economic losses, and the overloading of emergency systems were recognised by post-event studies as the main issues of these Natech scenarios. In recent years, the increasing frequency and severity of some natural hazards due to climate change has slowly raised awareness of Natech risk as an emerging risk among stakeholders. Indeed, the iNTeg-Risk project, co-funded by the European Commission within the 7th Framework Programme, specifically addresses these scenarios among new technological issues in public safety. The present study, carried out in part within the iNTeg-Risk project, was aimed at the analysis and further development of methods and tools for the assessment and mitigation of Natech accidents. Available tools and knowledge gaps in the assessment of Natech scenarios were highlighted. The analysis mainly addressed the potential impact of flood, lightning and earthquake events on industrial installations where hazardous substances are present.
Preliminary screening methodologies and more detailed methods based on quantitative risk analysis were developed. Strategies based on the use of multiple information layers aiming at the identification of mitigation and early warning systems were also explored. A case-study in the Emilia-Romagna region is presented.
Mghirbi, Oussama; Bord, Jean-Paul; Le Grusse, Philippe; Mandart, Elisabeth; Fabre, Jacques
2018-03-08
Faced with the health, environmental, and socio-economic issues related to the heavy use of pesticides, diffuse phytosanitary pollution has become a major concern shared by all field actors. These actors, namely farmers and territorial managers, have expressed the need for decision-support tools for the territorial management of the diffuse pollution resulting from plant protection practices and their impacts. To meet these steadily increasing requests, a cartographic analysis approach was implemented based on a GIS, which allows the spatialization of the impacts of diffuse pollution related to plant protection practices in the Etang de l'Or catchment area in the South of France. Risk mapping is a decision-support tool that enables the different field actors to identify and locate vulnerable areas, so as to determine action plans and agri-environmental measures depending on the context of the natural environment. This work shows that mapping is helpful for managing the risks related to the use of pesticides in agriculture, by employing indicators of pressure (TFI) and of risk to the applicator's health (IRSA) and to the environment (IRTE). These indicators were designed to assess the impact of plant protection practices at various spatial scales (field, farm, etc.). The cartographic analysis of the risks related to plant protection practices shows that diffuse pollution is unequally distributed between the North (known for its abundant garrigues and vineyards) and the South of the Etang de l'Or catchment area (the Mauguio-Lunel agricultural plain, known for its diversified cropping systems). This spatial inequity is essentially related to land use and the agricultural production system. Indeed, agricultural land covers about 60% of the total catchment area.
Consequently, this cartographic analysis helps territorial actors with the implementation of strategies for managing the risks of diffuse pollution related to pesticide use in agriculture, based on environmental and socio-economic issues and the characteristics of the natural environment.