NASA Astrophysics Data System (ADS)
Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao
2017-05-01
Cable overheating reduces the insulation level of a cable, accelerates insulation aging, and can even cause short-circuit faults, so identification and warning of cable overheating risk is necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system, improving the accuracy of impedance parameter estimation. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from SCADA forecasting data. Thirdly, a rules library for cable overheating risk warning is established; the cable impedance forecast value and the rate of change of impedance are calculated, and the overheating risk of the cable line is then assessed against the rules library according to the relationship between impedance and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in a distribution network. The overheating risk warning result can provide a decision basis for operation, maintenance, and repair.
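As a rough illustration of the least-squares step described above, the sketch below estimates a cable's series resistance and reactance from synthetic phasor measurements; the measurement model, noise levels, and parameter values are assumptions for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
R_true, X_true = 0.12, 0.35  # assumed cable resistance/reactance (ohms)
n = 200

# Synthetic SCADA-style samples: complex load currents and voltage drops
I = rng.uniform(50, 200, n) * np.exp(1j * rng.uniform(-0.3, 0.3, n))
dV = (R_true + 1j * X_true) * I \
    + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Stack the complex phasor equation dV = (R + jX) I into real equations
A = np.column_stack([
    np.concatenate([I.real, I.imag]),
    np.concatenate([-I.imag, I.real]),
])
b = np.concatenate([dV.real, dV.imag])
(R_est, X_est), *_ = np.linalg.lstsq(A, b, rcond=None)
```

Tracking `R_est` over time against a historical threshold would then play the role of the overheating indicator the abstract describes.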
Yang, Yu; Lian, Xin-Ying; Jiang, Yong-Hai; Xi, Bei-Dou; He, Xiao-Song
2017-11-01
Agricultural regions are a significant source of groundwater pesticide pollution. To ensure that agricultural regions with a significantly high risk of groundwater pesticide contamination are properly managed, a risk-based ranking method related to groundwater pesticide contamination is needed. In the present paper, a risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions was established. The method encompasses 3 phases, including indicator selection, characterization, and classification. In the risk ranking index system employed here, 17 indicators involving the physicochemical properties, environmental behavior characteristics, pesticide application methods, and inherent vulnerability of groundwater in the agricultural region were selected. The boundary of each indicator was determined using K-means cluster analysis based on a survey of a typical agricultural region and the physical and chemical properties of 300 typical pesticides. The total risk characterization was calculated by multiplying the risk value of each indicator, which could effectively avoid the subjectivity of index weight calculation and identify the main factors associated with the risk. The results indicated that the risk for groundwater pesticide contamination from agriculture in a region could be ranked into 4 classes from low to high risk. This method was applied to an agricultural region in Jiangsu Province, China, and it showed that this region had a relatively high risk for groundwater contamination from pesticides, and that the pesticide application method was the primary factor contributing to the relatively high risk. The risk ranking method was determined to be feasible, valid, and able to provide reference data related to the risk management of groundwater pesticide pollution from agricultural regions. Integr Environ Assess Manag 2017;13:1052-1059. © 2017 SETAC.
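The boundary-determination step can be illustrated with a one-dimensional K-means pass over a synthetic indicator; the data, cluster count, and class scheme below are illustrative assumptions, not the study's actual indicators.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic indicator values (e.g. a pesticide property) with 4 latent groups
x = np.concatenate([rng.normal(m, 0.3, 75) for m in (1.0, 3.0, 5.0, 7.0)])

def kmeans_1d(x, k, iters=50):
    """Plain 1-D k-means: assign to nearest centroid, recompute means."""
    centroids = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread initial guesses
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        centroids = np.array([x[labels == j].mean() for j in range(k)])
    return np.sort(centroids)

centroids = kmeans_1d(x, 4)
# Indicator class boundaries = midpoints between adjacent cluster centroids
boundaries = (centroids[:-1] + centroids[1:]) / 2

def risk_class(v):
    """Class of a new indicator value: 1 (low) .. 4 (high)."""
    return int(np.searchsorted(boundaries, v)) + 1
```

Multiplying the per-indicator class values, as the abstract describes, then yields the total risk characterization without explicit index weights.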
Greene, Barry R; Redmond, Stephen J; Caulfield, Brian
2017-05-01
Falls are the leading global cause of accidental death and disability in older adults and are the most common cause of injury and hospitalization. Accurate, early identification of patients at risk of falling, could lead to timely intervention and a reduction in the incidence of fall-related injury and associated costs. We report a statistical method for fall risk assessment using standard clinical fall risk factors (N = 748). We also report a means of improving this method by automatically combining it, with a fall risk assessment algorithm based on inertial sensor data and the timed-up-and-go test. Furthermore, we provide validation data on the sensor-based fall risk assessment method using a statistically independent dataset. Results obtained using cross-validation on a sample of 292 community dwelling older adults suggest that a combined clinical and sensor-based approach yields a classification accuracy of 76.0%, compared to either 73.6% for sensor-based assessment alone, or 68.8% for clinical risk factors alone. Increasing the cohort size by adding an additional 130 subjects from a separate recruitment wave (N = 422), and applying the same model building and validation method, resulted in a decrease in classification performance (68.5% for combined classifier, 66.8% for sensor data alone, and 58.5% for clinical data alone). This suggests that heterogeneity between cohorts may be a major challenge when attempting to develop fall risk assessment algorithms which generalize well. Independent validation of the sensor-based fall risk assessment algorithm on an independent cohort of 22 community dwelling older adults yielded a classification accuracy of 72.7%. Results suggest that the present method compares well to previously reported sensor-based fall risk assessment methods in assessing falls risk. 
Implementation of objective fall risk assessment methods on a large scale has the potential to improve quality of care and lead to a reduction in associated hospital costs, due to fewer admissions and reduced injuries due to falling.
Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika
2017-01-01
Computer work is associated with musculoskeletal disorders (MSDs), and several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of current pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 to 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors, and musculoskeletal disorders. The review assessed the risk factors covered, as well as the reliability and validity, of pen-and-paper observational methods for computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force, and repetition. Of the seven methods, only five had been tested for reliability; these proved reliable and were rated moderate to good. For validity, only four of the seven methods had been tested, with moderate results. Many observational tools already exist, but no single tool appears to cover all of the risk factors (working posture, office components, force, repetition, and office environment) at office workstations and computer work. Although proper validation of exposure assessment techniques is the most important factor in developing a tool, not all existing observational methods have been tested for reliability and validity. Furthermore, this review suggests ways in which researchers could improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.
Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da
2016-12-01
Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that the landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases: risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving the hazardous waste landfill and contaminant migration in the vadose zone and aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis, and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2% of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible and valid, and it can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
Van der Fels-Klerx, H J; Van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'agostino, M; Coles, D; Marvin, H J P; Frewer, L J
2018-01-22
This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food, environmental science and socio-economic sciences. The review used a predefined search protocol, and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and the risk ranking method characterized. The methods were then clustered, based on their characteristics, into eleven method categories: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health-adjusted life years (HALY), multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques, and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded that there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.
A Review on Methods of Risk Adjustment and their Use in Integrated Healthcare Systems
Juhnke, Christin; Bethge, Susanne
2016-01-01
Introduction: Effective risk adjustment is given increasing weight against the background of competitive health insurance systems and vital healthcare systems. The objective of this review was to obtain an overview of existing models of risk adjustment, as well as of the crucial weighting factors used in risk adjustment, and to analyse the predictive performance of selected methods in international healthcare systems. Theory and methods: A comprehensive, systematic literature review on methods of risk adjustment was conducted as an encompassing, interdisciplinary examination of the related disciplines. Results: In general, several distinctions can be made: by risk horizon, by risk factors, or by the combination of indicators included. Within these, a further differentiation into three levels seems reasonable: methods based on mortality risks, methods based on morbidity risks, and those based on information on (self-reported) health status. Conclusions and discussion: The final examination of different methods of risk adjustment showed that the methodology used to adjust risks varies, and the models differ greatly in the morbidity indicators they include. The findings of this review can be used in the evaluation of integrated healthcare delivery systems and can be integrated into quality- and patient-oriented reimbursement of care providers in the design of healthcare contracts.
NASA Astrophysics Data System (ADS)
Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua
2017-05-01
To reinforce the correlation analysis of threat factors in risk assessment, a dynamic safety risk assessment method based on particle filtering is proposed, with threat analysis at its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weight of each indicator, and determines information system risk levels by combining this with state estimation theory. To improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: all particles are clustered and each cluster centroid is used as the representative in subsequent operations, reducing the amount of calculation. Empirical results indicate that the method reasonably captures the mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management control strategy.
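A minimal bootstrap particle filter conveys the state-estimation core of such a method (the k-means particle-reduction step is omitted here); the latent risk level, observation model, and noise magnitudes are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_particles, T = 500, 30
true_x = 0.5  # assumed latent risk level in [0, 1]

particles = rng.uniform(0, 1, n_particles)       # candidate risk levels
weights = np.full(n_particles, 1 / n_particles)

for t in range(T):
    obs = true_x + 0.05 * rng.standard_normal()          # noisy threat indicator
    particles += 0.01 * rng.standard_normal(n_particles)  # process noise
    lik = np.exp(-0.5 * ((obs - particles) / 0.05) ** 2)  # Gaussian likelihood
    weights *= lik
    weights /= weights.sum()
    # Resample when the effective sample size degenerates
    if 1 / np.sum(weights ** 2) < n_particles / 2:
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
        weights = np.full(n_particles, 1 / n_particles)

risk_estimate = np.sum(weights * particles)  # posterior mean risk level
```

The paper's k-means refinement would replace the full particle set with cluster centroids before the update step to cut the computational cost.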
Developing points-based risk-scoring systems in the presence of competing risks.
Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P
2016-09-30
Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
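The usual construction of a points-based score, dividing regression coefficients by a base constant and rounding to integers, can be sketched as follows; the coefficients, factor names, and points constant are hypothetical, not the paper's fitted values, and the competing-risks regression itself is not reproduced here.

```python
# Hypothetical cause-specific log-hazard-ratio coefficients per risk factor;
# these values are invented for illustration, not taken from the paper
coefs = {"age_per_10yr": 0.45, "diabetes": 0.60, "prior_mi": 0.35, "low_sbp": 0.80}
B = 0.35  # points constant: the coefficient increment worth one point

def points(factor_values):
    """Round the coefficient-weighted sum of risk factors to an integer score."""
    return int(round(sum(coefs[k] * v for k, v in factor_values.items()) / B))

# A rapid bedside score needing no computer, as the abstract emphasizes
patient = {"age_per_10yr": 7.5, "diabetes": 1, "prior_mi": 0, "low_sbp": 1}
score = points(patient)
```

In the competing-risks setting the coefficients would come from a cause-specific or subdistribution hazard model rather than an ordinary Cox model, but the points arithmetic is unchanged.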
Probabilistic Risk Assessment of a Turbine Disk
NASA Astrophysics Data System (ADS)
Carter, Jace A.; Thomas, Michael; Goswami, Tarun; Fecke, Ted
Current Federal Aviation Administration (FAA) rotor design certification practice performs risk assessment using a probabilistic framework focused only on the life-limiting defect location of a component, a method that generates conservative approximations of the operational risk. The first section of this article covers a discretization method that allows a transition from this relative risk to an absolute risk, in which the component is discretized into regions called zones. General guidelines were established for the zone-refinement process, based on the stress gradient topology, in order to reach risk convergence. The second section covers a risk assessment method for predicting the total fatigue life due to fatigue-induced damage. The total fatigue life incorporates a dual-mechanism approach, including the crack initiation life and the propagation life, while simultaneously determining the associated initial flaw sizes. A microstructure-based model was employed to address uncertainties in material response and relate crack initiation life to crack size, while propagation life was characterized by large-crack growth laws. The two proposed methods were applied to a representative Inconel 718 turbine disk. The zone-based method reduces the conservatism of current approaches while showing the effects of feature-based inspection on the risk assessment. In the fatigue damage assessment, the predicted initial crack distribution was found to be the most sensitive probabilistic parameter and can be used to establish enhanced inspection planning.
This problems-based, introductory workshop focuses on methods to assess health risks posed by exposures to chemical mixtures in the environment. Chemical mixtures health risk assessment methods continue to be developed and evolve to address concerns over health risks from multic...
Fox, Aaron S; Bonacci, Jason; McLean, Scott G; Spittle, Michael; Saunders, Natalie
2016-05-01
Laboratory-based measures provide an accurate method to identify risk factors for anterior cruciate ligament (ACL) injury; however, these methods are generally prohibitive to the wider community. Screening methods that can be completed in a field or clinical setting may be more applicable for wider community use. Examination of field-based screening methods for ACL injury risk can aid in identifying the most applicable method(s) for use in these settings. The objective of this systematic review was to evaluate and compare field-based screening methods for ACL injury risk to determine their efficacy of use in wider community settings. An electronic database search was conducted on the SPORTDiscus™, MEDLINE, AMED and CINAHL databases (January 1990-July 2015) using a combination of relevant keywords. A secondary search of the same databases, using relevant keywords from identified screening methods, was also undertaken. Studies identified as potentially relevant were independently examined by two reviewers for inclusion. Where consensus could not be reached, a third reviewer was consulted. Original research articles that examined screening methods for ACL injury risk that could be undertaken outside of a laboratory setting were included for review. Two reviewers independently assessed the quality of included studies. Included studies were categorized according to the screening method they examined. A description of each screening method, and data pertaining to the ability to prospectively identify ACL injuries, validity and reliability, recommendations for identifying 'at-risk' athletes, equipment and training required to complete screening, time taken to screen athletes, and applicability of the screening method across sports and athletes were extracted from relevant studies. Of 1077 citations from the initial search, a total of 25 articles were identified as potentially relevant, with 12 meeting all inclusion/exclusion criteria. 
From the secondary search, eight further studies met all criteria, resulting in 20 studies being included for review. Five ACL screening methods were identified: the Landing Error Scoring System (LESS), Clinic-Based Algorithm, Observational Screening of Dynamic Knee Valgus (OSDKV), 2D-Cam Method, and Tuck Jump Assessment. There was limited evidence supporting the use of field-based screening methods in predicting ACL injuries across a range of populations. Differences relating to the equipment and time required to complete the screening methods were identified. Only screening methods for ACL injury risk were included for review; field-based screening methods developed for lower-limb injury risk in general may also incorporate, and be useful in, screening for ACL injury risk. Limited studies were available relating to the OSDKV and 2D-Cam Method. The LESS showed predictive validity in identifying ACL injuries, though only in a youth athlete population. The LESS also appears practical for community-wide use due to the minimal equipment and set-up/analysis time required. The Clinic-Based Algorithm may have predictive value for ACL injury risk as it identifies athletes who exhibit high frontal plane knee loads during a landing task, but it requires extensive additional equipment and time, which may limit its application to wider community settings.
This problems-based, half-day, introductory workshop focuses on methods to assess health risks posed by exposures to chemical mixtures in the environment. Chemical mixtures health risk assessment methods continue to be developed and evolve to address concerns over health risks f...
Evaluation on Cost Overrun Risks of Long-distance Water Diversion Project Based on SPA-IAHP Method
NASA Astrophysics Data System (ADS)
Yuanyue, Yang; Huimin, Li
2018-02-01
Large investment, long routes, and many change orders are the main causes of cost overruns in long-distance water diversion projects. Building on existing research, this paper constructs a full-process cost overrun risk evaluation index system for water diversion projects, applies the SPA-IAHP method to set up a cost overrun risk evaluation model, and calculates and ranks the weight of every risk evaluation index. Finally, the cost overrun risks are comprehensively evaluated by calculating the linkage measure, and a comprehensive risk level is obtained. The SPA-IAHP method can evaluate risks accurately and with high reliability. As verified by a case calculation, it can provide valid cost overrun decision-making information to construction companies.
Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai
2016-08-26
Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.
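The core idea of rule-based analysis, evaluating a conjunction of risk-factor ranges against the population baseline, can be sketched on synthetic data; the factors, thresholds, and risk levels below are assumptions for illustration, not TEDDY study variables.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
age = rng.uniform(0, 10, n)     # hypothetical continuous risk factor
marker = rng.uniform(0, 1, n)   # hypothetical biomarker

# Outcome risk is elevated only inside one interaction subgroup,
# which average-effect regression would blur across the population
p = np.where((age < 3) & (marker > 0.7), 0.40, 0.05)
outcome = rng.random(n) < p

baseline = outcome.mean()

def rule_risk(mask):
    """A rule is a conjunction of ranges; its value is the subgroup event rate."""
    return outcome[mask].mean()

subgroup_risk = rule_risk((age < 3) & (marker > 0.7))
```

Comparing `subgroup_risk` with `baseline` shows how a range-based rule isolates a heterogeneous high-risk subgroup that a single average coefficient per factor would understate.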
Comprehensive risk assessment method of catastrophic accident based on complex network properties
NASA Astrophysics Data System (ADS)
Cui, Zhen; Pang, Jun; Shen, Xiaohong
2017-09-01
At the macro level, the structural properties of the network, together with the electrical characteristics of its micro-level components, determine the risk of cascading failures. Because a cascading failure is a dynamically developing process, not only the direct risk but also the potential risk should be considered. In this paper, the direct and potential risks of failures are considered comprehensively, based on uncertain risk analysis theory and connection number theory; the uncertain correlation is quantified by node degree and node clustering coefficient, and a comprehensive risk indicator of failure is then established. The proposed method is validated by simulation on an actual power grid: a network is modeled according to the actual grid, and the rationality of the proposed method is verified.
Chen, Yu; Song, Guobao; Yang, Fenglin; Zhang, Shushen; Zhang, Yun; Liu, Zhenyu
2012-01-01
According to risk systems theory and the characteristics of the chemical industry, an index system was established for risk assessment of enterprises in chemical industrial parks (CIPs) based on the inherent risk of the source, effectiveness of the prevention and control mechanism, and vulnerability of the receptor. A comprehensive risk assessment method based on catastrophe theory was then proposed and used to analyze the risk levels of ten major chemical enterprises in the Songmu Island CIP, China. According to the principle of equal distribution function, the chemical enterprise risk level was divided into the following five levels: 1.0 (very safe), 0.8 (safe), 0.6 (generally recognized as safe, GRAS), 0.4 (unsafe), 0.2 (very unsafe). The results revealed five enterprises (50%) with an unsafe risk level, and another five enterprises (50%) at the generally recognized as safe risk level. This method solves the multi-objective evaluation and decision-making problem. Additionally, this method involves simple calculations and provides an effective technique for risk assessment and hierarchical risk management of enterprises in CIPs.
A stable systemic risk ranking in China's banking sector: Based on principal component analysis
NASA Astrophysics Data System (ADS)
Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing
2018-02-01
In this paper, we compare five popular systemic risk rankings and apply a principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that the five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to the factor loadings of the first component, the PCA-combined ranking is mainly based on fundamentals rather than market price data; we find that price-based rankings are less practical than fundamentals-based ones. This PCA-combined ranking directly shows the systemic risk contribution of each bank for banking supervision purposes and helps banks prepare for financial crises in advance.
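Combining several noisy rankings through the first principal component can be sketched as follows; the synthetic rankings, bank count, and noise level are illustrative assumptions, and the first component simply plays the role of the combined systemic risk score.

```python
import numpy as np

rng = np.random.default_rng(4)
n_banks = 20

# Five synthetic systemic-risk rankings that partially agree:
# each is the rank order of a shared signal plus method-specific noise
signal = rng.standard_normal(n_banks)
rankings = np.column_stack([
    np.argsort(np.argsort(signal + 0.5 * rng.standard_normal(n_banks)))
    for _ in range(5)
]).astype(float)

# PCA on the standardized rankings; the first component combines them
Z = (rankings - rankings.mean(axis=0)) / rankings.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z.T))
pc1 = eigvecs[:, -1]                    # loading vector of the largest eigenvalue
combined_score = Z @ pc1                # combined systemic risk score per bank
combined_rank = np.argsort(np.argsort(combined_score))
```

Inspecting the loadings in `pc1` is the analogue of the paper's step of checking whether fundamentals-based or price-based rankings dominate the combined measure.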
Røssvoll, Elin Halbach; Ueland, Øydis; Hagtvedt, Therese; Jacobsen, Eivind; Lavik, Randi; Langsrud, Solveig
2012-09-01
Traditionally, consumer food safety survey responses have been classified as either "right" or "wrong" and food handling practices that are associated with high risk of infection have been treated in the same way as practices with lower risks. In this study, a risk-based method for consumer food safety surveys has been developed, and HACCP (hazard analysis and critical control point) methodology was used for selecting relevant questions. We conducted a nationally representative Web-based survey (n = 2,008), and to fit the self-reported answers we adjusted a risk-based grading system originally developed for observational studies. The results of the survey were analyzed both with the traditional "right" and "wrong" classification and with the risk-based grading system. The results using the two methods were very different. Only 5 of the 10 most frequent food handling violations were among the 10 practices associated with the highest risk. These 10 practices dealt with different aspects of heat treatment (lacking or insufficient), whereas the majority of the most frequent violations involved storing food at room temperature for too long. Use of the risk-based grading system for survey responses gave a more realistic picture of risks associated with domestic food handling practices. The method highlighted important violations and minor errors, which are performed by most people and are not associated with significant risk. Surveys built on a HACCP-based approach with risk-based grading will contribute to a better understanding of domestic food handling practices and will be of great value for targeted information and educational activities.
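The contrast between a flat right/wrong count and risk-based grading can be shown with a toy grading table; the violations and point values below are invented for illustration and are not the study's actual grading system.

```python
# Hypothetical grading: each violation carries a risk weight instead of a
# flat right/wrong point (weights are illustrative, not from the study)
risk_points = {
    "undercooked_chicken": 8,     # heat-treatment violations: highest risk
    "room_temp_storage_4h": 3,
    "same_cutting_board": 2,
    "fridge_above_4C": 1,
}

def grade(violations):
    """Return (naive right/wrong count, risk-weighted score)."""
    naive = len(violations)
    risk = sum(risk_points[v] for v in violations)
    return naive, risk

# Two respondents with the same violation count but very different risk
a = grade(["undercooked_chicken"])
b = grade(["fridge_above_4C"])
```

Under the traditional scoring both respondents look identical; the risk-weighted score separates them, which is the paper's central point.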
Dr. Simmons will provide a concise overview of established and emerging methods to group chemicals for component-based mixture risk assessments. This will be followed by introduction to several important component-based methods, the Hazard Index, Target Organ Hazard Index, Multi...
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructure, chemical plants and nuclear power plants. For many applications beyond the design of infrastructure, it is of interest to assess the efficiency of the design measures taken, and these applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, originally developed by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method over traditional PSHA are (1) its flexibility, allowing different probabilistic models of earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion in formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
Sanni, Steinar; Lyng, Emily; Pampanin, Daniela M
2017-06-01
Offshore oil and gas activities are required not to cause adverse environmental effects, and risk-based management has been established to meet environmental standards. In some risk assessment schemes, Risk Indicators (RIs) are parameters used to monitor the development of risk-affecting factors. RIs have not yet been established in the Environmental Risk Assessment procedures for management of oil-based discharges offshore. This paper evaluates the usefulness of biomarkers as RIs, based on their properties, existing laboratory biomarker data, and assessment methods. The data show several correlations between oil concentrations and biomarker responses, and assessment principles exist that qualify biomarkers for integration into risk procedures. Different ways in which these existing biomarkers and methods can be applied as RIs in a probabilistic risk assessment system, when linked with whole-organism responses, are discussed. This can be a useful approach for integrating biomarkers into probabilistic risk assessment of oil-based discharges, and a potential supplement to the information biomarkers already provide about environmental impact and risk related to these kinds of discharges. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L
2016-02-10
Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure. Copyright © 2015 John Wiley & Sons, Ltd.
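A toy sketch of the core idea of propagating multiple plausible dose vectors through a dose-response fit; the actual study uses a full Bayesian model averaging framework, not the simple pooled OLS shown here, and all data below are synthetic:

```python
import random
import statistics

# Synthetic sketch (assumption-laden, not the authors' model): combine risk
# estimates across many plausible dose vectors, mimicking the shared/unshared
# error structure of a two-dimensional Monte Carlo dose reconstruction.
random.seed(1)

n_subjects, n_vectors = 200, 50
true_dose = [random.uniform(0.0, 2.0) for _ in range(n_subjects)]
# Outcome generated with a true slope of 0.5 plus noise (synthetic data)
outcome = [0.5 * d + random.gauss(0.0, 0.2) for d in true_dose]

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Each dose vector has one multiplicative error shared by all subjects plus
# per-subject (unshared) multiplicative noise.
slopes = []
for _ in range(n_vectors):
    shared = random.lognormvariate(0.0, 0.1)
    doses = [d * shared * random.lognormvariate(0.0, 0.1) for d in true_dose]
    slopes.append(ols_slope(doses, outcome))

# Pooled estimate and between-vector spread (a crude stand-in for a posterior
# over the risk coefficient)
pooled = statistics.fmean(slopes)
spread = statistics.stdev(slopes)
print(pooled, spread)
```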
Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...
Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method
Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui
2014-01-01
A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and the entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can accurately assess the risk state of a bogie system. PMID:25574159
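The entropy weight component of such an evaluation can be sketched as follows; the 4x3 decision matrix (four samples by three indices) is invented for illustration:

```python
import math

# Sketch of the entropy weight method: indices whose values vary more across
# samples carry more information and receive larger weights. The matrix is
# a made-up example, not the bogie inspection data.

def entropy_weights(matrix):
    """matrix[i][j]: normalized value of index j for sample i (all > 0)."""
    n = len(matrix)                      # number of samples
    m = len(matrix[0])                   # number of indices
    entropies = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        entropies.append(e)
    d = [1 - e for e in entropies]       # degree of diversification per index
    return [di / sum(d) for di in d]

matrix = [
    [0.6, 0.2, 0.9],
    [0.4, 0.3, 0.8],
    [0.7, 0.2, 0.7],
    [0.5, 0.3, 0.9],
]
w = entropy_weights(matrix)
print(w)  # weights sum to 1; more dispersed indices weigh more
```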
Risk-Based Object Oriented Testing
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert
2000-01-01
Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
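A minimal sketch of the kind of complexity-driven prioritization described above; the class names, metrics, and the particular risk score are hypothetical, not taken from the proposed methodology:

```python
# Hypothetical sketch of risk-based test prioritization: rank classes by a
# score combining a failure-likelihood proxy (cyclomatic complexity) with an
# impact proxy (how many other classes depend on them). All values invented.

classes = {
    # name: (cyclomatic_complexity, dependent_count)
    "OrbitPropagator": (42, 11),
    "TelemetryParser": (17, 25),
    "ConfigLoader":    (5, 3),
    "CommandQueue":    (28, 8),
}

def risk_score(complexity, dependents):
    # Probability-of-failure proxy times consequence proxy
    return complexity * dependents

ranked = sorted(classes, key=lambda c: risk_score(*classes[c]), reverse=True)
print(ranked)  # test the riskiest classes first
```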
Gaziano, Thomas A; Young, Cynthia R; Fitzmaurice, Garrett; Atwood, Sidney; Gaziano, J Michael
2008-01-01
Background Around 80% of all cardiovascular deaths occur in developing countries. Assessment of those patients at high risk is an important strategy for prevention. Since developing countries have limited resources for prevention strategies that require laboratory testing, we assessed whether a risk prediction method that did not require any laboratory tests could be as accurate as one requiring laboratory information. Methods The National Health and Nutrition Examination Survey (NHANES) was a prospective cohort study of 14 407 US participants aged 25–74 years at the time they were first examined (between 1971 and 1975). Our follow-up study population included participants with complete information on these surveys who did not report a history of cardiovascular disease (myocardial infarction, heart failure, stroke, angina) or cancer, yielding an analysis dataset of N=6186. We compared how well either method could predict first-time fatal and non-fatal cardiovascular disease events in this cohort. For the laboratory-based model, which required blood testing, we used standard risk factors to assess risk of cardiovascular disease: age, systolic blood pressure, smoking status, total cholesterol, reported diabetes status, and current treatment for hypertension. For the non-laboratory-based model, we substituted body-mass index for cholesterol. Findings In the cohort of 6186, there were 1529 first-time cardiovascular events and 578 (38%) deaths due to cardiovascular disease over 21 years. In women, the laboratory-based model was useful for predicting events, with a c statistic of 0·829. The c statistic of the non-laboratory-based model was 0·831. In men, the results were similar (0·784 for the laboratory-based model and 0·783 for the non-laboratory-based model). Results were similar between the laboratory-based and non-laboratory-based models in both men and women when restricted to fatal events only. 
Interpretation A method that uses non-laboratory-based risk factors predicted cardiovascular events as accurately as one that relied on laboratory-based values. This approach could simplify risk assessment in situations where laboratory testing is inconvenient or unavailable. PMID:18342687
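The c statistic used to compare the two models is the probability that a randomly chosen subject who had an event was assigned a higher predicted risk than a randomly chosen subject who did not; a minimal sketch with made-up predicted risks:

```python
# Sketch of the c statistic (area under the ROC curve); the predicted risks
# below are invented, not NHANES data.

def c_statistic(scores_events, scores_nonevents):
    """Fraction of (event, non-event) pairs where the event subject scored
    higher; ties count as half."""
    pairs = 0
    concordant = 0.0
    for e in scores_events:
        for n in scores_nonevents:
            pairs += 1
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / pairs

events = [0.9, 0.7, 0.6]          # predicted risks, subjects with CVD events
nonevents = [0.2, 0.4, 0.6, 0.1]  # predicted risks, subjects without events
c = c_statistic(events, nonevents)
print(c)  # 0.5 = no discrimination, 1.0 = perfect discrimination
```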
This problems-based, half-day, introductory workshop focuses on methods to assess health risks posed by exposures to chemical mixtures in the environment. Chemical mixtures health risk assessment methods continue to be developed and evolve to address concerns over health risks f...
[Ecological risk assessment of sediment pollution based on triangular fuzzy number].
Zhou, Xiao-Wei; Wang, Li-Ping; Zheng, Bing-Hui
2008-11-01
Water environmental systems are characterized by randomness and fuzziness, and their monitoring data are often scarce and imprecise. To account for this, the environmental background values of sediments and the pollutant concentrations are represented as triangular fuzzy numbers, and a fuzzy risk assessment model based on the potential ecological risk index (RI) is established. Using this method, heavy metal pollution and ecological risk in the Yangtze Estuary and its adjacent waters were analyzed. The results show that the study area is subject to varying degrees of pollution, with the pollution extent following the order Cu, Hg, Zn, Pb, As, Cd. The RI obtained by this method shows a trend similar to that of the Hakanson ecological risk method. The RI of the estuary, the turbidity maximum zone and Hangzhou Bay is greater than that outside the estuary and in the sea area near Zhoushan, and the potential ecological risk rate increases correspondingly. The assessment results were well validated against macrobenthic community parameters from the corresponding period.
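A minimal sketch of a potential ecological risk index computed with triangular fuzzy numbers, under an assumed (low, mode, high) representation and the standard Hakanson toxicity coefficients; the concentration and background values are invented:

```python
# Sketch (assumed formulation, not the paper's exact model): Hakanson's
# potential ecological risk index RI = sum of Tr_i * (C_i / B_i), with C_i and
# B_i as positive triangular fuzzy numbers (a, m, b).

def tfn_div(x, y):
    """Divide two positive triangular fuzzy numbers (a, m, b)."""
    return (x[0] / y[2], x[1] / y[1], x[2] / y[0])

def tfn_scale(k, x):
    return (k * x[0], k * x[1], k * x[2])

def tfn_add(x, y):
    return tuple(xi + yi for xi, yi in zip(x, y))

# Standard Hakanson toxicity coefficients
toxicity = {"Cd": 30, "Hg": 40, "Pb": 5, "Cu": 5, "Zn": 1, "As": 10}
# Hypothetical fuzzy concentrations and background values (mg/kg)
conc = {"Cd": (0.2, 0.3, 0.4), "Hg": (0.05, 0.08, 0.12)}
background = {"Cd": (0.1, 0.15, 0.2), "Hg": (0.02, 0.03, 0.05)}

ri = (0.0, 0.0, 0.0)
for metal in conc:
    er = tfn_scale(toxicity[metal], tfn_div(conc[metal], background[metal]))
    ri = tfn_add(ri, er)
print(ri)  # fuzzy potential ecological risk index (low, modal, high)
```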
Salganik, Matthew J; Fazito, Dimitri; Bertoni, Neilane; Abdo, Alexandre H; Mello, Maeve B; Bastos, Francisco I
2011-11-15
One of the many challenges hindering the global response to the human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) epidemic is the difficulty of collecting reliable information about the populations most at risk for the disease. Thus, the authors empirically assessed a promising new method for estimating the sizes of most at-risk populations: the network scale-up method. Using 4 different data sources, 2 of which were from other researchers, the authors produced 5 estimates of the number of heavy drug users in Curitiba, Brazil. The authors found that the network scale-up and generalized network scale-up estimators produced estimates 5-10 times higher than estimates made using standard methods (the multiplier method and the direct estimation method using data from 2004 and 2010). Given that equally plausible methods produced such a wide range of results, the authors recommend that additional studies be undertaken to compare estimates based on the scale-up method with those made using other methods. If scale-up-based methods routinely produce higher estimates, this would suggest that scale-up-based methods are inappropriate for populations most at risk of HIV/AIDS or that standard methods may tend to underestimate the sizes of these populations.
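The basic network scale-up estimator can be sketched as follows; the simple ratio form and all numbers are illustrative assumptions, not the generalized estimator or the Curitiba data:

```python
# Sketch of the basic network scale-up estimator: hidden population size is
# roughly N * (total alters known in the hidden population) / (total personal
# network size). All inputs are invented.

def scale_up_estimate(known_hidden, network_sizes, population_size):
    """known_hidden[i]: how many hidden-population members respondent i knows;
    network_sizes[i]: respondent i's total personal network size."""
    return population_size * sum(known_hidden) / sum(network_sizes)

m = [2, 0, 1, 3, 0, 1]             # reported heavy drug users known
d = [150, 200, 120, 300, 180, 250] # estimated personal network sizes
N = 1_800_000                      # approximate city population (hypothetical)

estimate = scale_up_estimate(m, d, N)
print(estimate)
```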
Sassen, Barbara; Kok, Gerjo; Mesters, Ilse; Crutzen, Rik; Cremers, Anita; Vanhees, Luc
2012-12-14
Patients with cardiovascular risk factors can reduce their risk of cardiovascular disease by increasing their physical activity and their physical fitness. According to the guidelines for cardiovascular risk management, health professionals should encourage their patients to engage in physical activity. In this paper, we provide insight regarding the systematic development of a Web-based intervention for both health professionals and patients with cardiovascular risk factors using the development method Intervention Mapping. The different steps of Intervention Mapping are described to open up the "black box" of Web-based intervention development and to support future Web-based intervention development. The development of the Professional and Patient Intention and Behavior Intervention (PIB2 intervention) was initiated with a needs assessment for both health professionals (ie, physiotherapy and nursing) and their patients. We formulated performance and change objectives and, subsequently, theory- and evidence-based intervention methods and strategies were selected that were thought to affect the intention and behavior of health professionals and patients. The rationale of the intervention was based on different behavioral change methods that allowed us to describe the scope and sequence of the intervention and produced the Web-based intervention components. The Web-based intervention consisted of 5 modules, including individualized messages and self-completion forms, and charts and tables. The systematic and planned development of the PIB2 intervention resulted in an Internet-delivered behavior change intervention. The intervention was not developed as a substitute for face-to-face contact between professionals and patients, but as an application to complement and optimize health services. 
The focus of the Web-based intervention was to extend professional behavior of health care professionals, as well as to improve the risk-reduction behavior of patients with cardiovascular risk factors. The Intervention Mapping protocol provided a systematic method for developing the intervention and each intervention design choice was carefully thought-out and justified. Although it was not a rapid or an easy method for developing an intervention, the protocol guided and directed the development process. The application of evidence-based behavior change methods used in our intervention offers insight regarding how an intervention may change intention and health behavior. The Web-based intervention appeared feasible and was implemented. Further research will test the effectiveness of the PIB2 intervention. Dutch Trial Register, Trial ID: ECP-92.
Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method
Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan
2018-01-01
Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of a dam failure and the resulting life loss. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. The life loss associated with dam failure is summarized and refined to be suitable for Chinese dams from previous studies. The proposed method and model are applied to one reservoir dam in Jiangxi province. Both engineering and non-engineering measures are proposed to reduce the risk. The risk analysis of the dam failure has essential significance for reducing dam failure probability and improving dam risk management level. PMID:29710824
A comparative potency method for cancer risk assessment has been developed based upon a constant relative potency hypothesis. This method was developed and tested using data from a battery of short-term mutagenesis bioassays, animal tumorigenicity data and human lung cancer risk ...
NASA Astrophysics Data System (ADS)
Liu, P.
2013-12-01
Quantitative analysis of the risk of reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict the inflows directly, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk, defined as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are included to produce ensemble-based hydrologic forecasts. Bayesian inference via Markov chain Monte Carlo is used to account for the parameter uncertainty. Two reservoir operation schemes, the actually operated and the scenario-optimized, are evaluated for flood risks and hydropower profits. With the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most risks come from the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias low for reservoir operational purposes.
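The scenario-counting risk measure for the forecast lead-time can be sketched as follows; the ensemble peaks and critical level are synthetic, not TGR data:

```python
# Sketch of the lead-time risk measure described above: the fraction of
# ensemble members whose simulated peak water level exceeds a critical value.

def lead_time_risk(scenario_peaks, critical_level):
    """Ratio of failing scenarios to total scenarios."""
    failures = sum(1 for peak in scenario_peaks if peak > critical_level)
    return failures / len(scenario_peaks)

# Peak reservoir water levels (m) from a hypothetical 10-member ensemble
peaks = [172.1, 174.8, 175.3, 173.0, 176.2, 171.5, 174.1, 175.9, 172.8, 173.6]
critical = 175.0  # hypothetical critical water level

risk = lead_time_risk(peaks, critical)
print(risk)
```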
NASA Astrophysics Data System (ADS)
Hu, Xiaojing; Li, Qiang; Zhang, Hao; Guo, Ziming; Zhao, Kun; Li, Xinpeng
2018-06-01
Based on the Monte Carlo method, an improved risk assessment method for a hybrid AC/DC power system with a VSC station is proposed, considering the operation status of generators, converter stations, AC lines and DC lines. According to the sequential AC/DC power flow algorithm, node voltages and line active powers are solved, and then the operation risk indices of node voltage over-limit and line active power over-limit are calculated. Finally, an improved two-area IEEE RTS-96 system is taken as a case to analyze and assess its operation risk. The results show that the proposed model and method can intuitively and directly reveal the weak nodes and weak lines of the system, which can provide a reference for the dispatching department.
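A stripped-down Monte Carlo sketch of the state-sampling idea, standing in for the full sequential AC/DC power-flow and over-limit checks; the outage rates and the toy system-failure criterion are invented:

```python
import random

# Illustrative Monte Carlo sketch (not the paper's power-flow model): sample
# component up/down states and estimate the probability that the system is in
# an at-risk state. All probabilities are invented.
random.seed(0)

components = {"G1": 0.02, "G2": 0.03, "VSC": 0.01,
              "AC_line": 0.05, "DC_line": 0.04}  # forced-outage rates

def sample_state(outage_prob):
    """True means the component is in service."""
    return {c: (random.random() >= p) for c, p in outage_prob.items()}

def system_ok(state):
    # Toy criterion standing in for voltage / active-power limit checks:
    # the system needs at least one generator, one line, and the VSC station.
    gens_ok = state["G1"] or state["G2"]
    lines_ok = state["AC_line"] or state["DC_line"]
    return gens_ok and lines_ok and state["VSC"]

n = 100_000
at_risk = sum(1 for _ in range(n) if not system_ok(sample_state(components)))
ratio = at_risk / n
print(ratio)  # estimated operation-risk index
```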
NASA Astrophysics Data System (ADS)
Setiawan, E. P.; Rosadi, D.
2017-01-01
Portfolio selection problems conventionally mean ‘minimizing the risk, given a certain level of return’ from some financial assets. This problem is frequently solved with quadratic or linear programming methods, depending on the risk measure used in the objective function. However, the solutions obtained by these methods are real numbers, which may cause problems in real applications because each asset usually has its minimum transaction lot. Classical approaches considering minimum transaction lots were developed based on linear mean absolute deviation (MAD), variance (as in Markowitz’s model), and semi-variance as risk measures. In this paper we investigate portfolio selection with minimum transaction lots using conditional value at risk (CVaR) as the risk measure. The mean-CVaR methodology involves only the part of the tail of the distribution that contributes to high losses. This approach is better suited to non-symmetric return distributions. Solutions of this model can be found with genetic algorithm (GA) methods. We provide real examples using stocks from the Indonesian stock market.
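The historical (sample-based) form of CVaR used in the mean-CVaR objective can be sketched as follows; the loss series is synthetic:

```python
# Sketch of conditional value at risk (CVaR) from a sample of losses: the
# average of the worst (1 - alpha) fraction of outcomes. Data are invented.

def cvar(losses, alpha=0.95):
    """Historical CVaR: mean of the worst (1 - alpha) share of losses."""
    ordered = sorted(losses, reverse=True)            # largest losses first
    k = max(1, int(round(len(ordered) * (1 - alpha))))
    return sum(ordered[:k]) / k

# Hypothetical daily portfolio losses in percent (negative = gain)
losses = [-1.2, 0.4, 2.5, -0.8, 3.9, 0.1, 5.2, -2.0, 1.1, 0.7,
          4.4, -0.3, 0.9, 2.2, -1.5, 0.2, 6.1, 0.5, -0.6, 1.8]

tail_loss = cvar(losses, alpha=0.95)
print(tail_loss)  # mean loss in the worst 5% of cases
```

Unlike variance, this objective penalizes only the loss tail, which is why the abstract recommends it for non-symmetric return distributions.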
NASA's human system risk management approach and its applicability to commercial spaceflight.
Law, Jennifer; Mathers, Charles H; Fondy, Susan R E; Vanderploeg, James M; Kerstman, Eric L
2013-01-01
As planning continues for commercial spaceflight, attention is turned to NASA to assess whether its human system risk management approach can be applied to mitigate the risks associated with commercial suborbital and orbital flights. NASA uses a variety of methods to assess the risks to the human system based on their likelihood and consequences. In this article, we review these methods and categorize the risks in the system as "definite," "possible," or "least" concern for commercial spaceflight. As with career astronauts, these risks will be primarily mitigated by screening and environmental control. Despite its focus on long-duration exploration missions, NASA's human system risk management approach can serve as a preliminary knowledge base to help medical planners prepare for commercial spaceflights.
Knerr, Sarah; Wernli, Karen J; Leppig, Kathleen; Ehrlich, Kelly; Graham, Amanda L; Farrell, David; Evans, Chalanda; Luta, George; Schwartz, Marc D; O'Neill, Suzanne C
2017-05-01
Mammographic breast density is one of the strongest risk factors for breast cancer after age and family history. Mandatory breast density disclosure policies are increasing nationally without clear guidance on how to communicate density status to women. Coupling density disclosure with personalized risk counseling and decision support through a web-based tool may be an effective way to allow women to make informed, values-consistent risk management decisions without increasing distress. This paper describes the design and methods of Engaged, a prospective, randomized controlled trial examining the effect of online personalized risk counseling and decision support on risk management decisions in women with dense breasts and increased breast cancer risk. The trial is embedded in a large integrated health care system in the Pacific Northwest. A total of 1250 female health plan members aged 40-69 with a recent negative screening mammogram who are at increased risk for interval cancer based on their 5-year breast cancer risk and BI-RADS® breast density will be randomly assigned to access either a personalized web-based counseling and decision support tool or standard educational content. Primary outcomes will be assessed using electronic health record data (i.e., chemoprevention and breast MRI utilization) and telephone surveys (i.e., distress) at baseline, six weeks, and twelve months. Engaged will provide evidence about whether a web-based personalized risk counseling and decision support tool is an effective method for communicating with women about breast density and risk management. An effective intervention could be disseminated with minimal clinical burden to align with density disclosure mandates. Clinical Trials Registration Number: NCT03029286. Copyright © 2017 Elsevier Inc. All rights reserved.
Risk based inspection for atmospheric storage tank
NASA Astrophysics Data System (ADS)
Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin
2016-04-01
Corrosion is an attack that occurs on a metallic material as a result of its reaction with the environment. It causes atmospheric storage tank leakage, material loss, environmental pollution and equipment failure, shortens the life of process equipment and finally leads to financial damage. Corrosion risk measurement becomes a vital part of asset management at the plant for operating any aging asset. This paper provides six case studies dealing with high speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries was discussed prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risk associated with corrosion on the equipment, in terms of its likelihood and its consequences, was discussed. The corrosion risk analysis outcome was used to formulate the Risk Based Inspection (RBI) method that should be a part of atmospheric storage tank operation at the plant. RBI concentrates inspection resources mostly on 'High Risk' and 'Medium Risk' equipment and less on 'Low Risk' shells. Risk categories of the evaluated equipment were illustrated through the case study analysis outcome.
Feng, Qiang; Chen, Yiran; Sun, Bo; Li, Songjie
2014-01-01
An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the aircraft fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM with lower maintenance cost and dispatch risk is translated into the combinatorial optimization problem of single-aircraft strategies. The remaining useful life (RUL) distribution of the key line replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation method of the costs and risks for a mission, based on the health status matrix and the maintenance matrix, is given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that the method achieves optimization and control of the aircraft fleet oriented to mission success.
Li, L L; Jiang, Z; Song, W L; Ding, Y Y; Xu, J; He, N
2017-10-10
Objective: To develop an HIV infection risk assessment tool for men who have sex with men (MSM) based on the Delphi method. Methods: After an exhaustive literature review, we used the Delphi method to determine the specific items and relative risk scores of the assessment tool through two rounds of specialist consultation, taking into overall consideration the opinions and suggestions of 17 specialists. Results: The positivity coefficient in the first and second rounds of specialist consultation was 100.0% and 94.1%, respectively. The mean authority coefficient (Cr) was 0.86. Kendall's W coefficient was 0.55 for the first round of consultation (χ²=84.426, P<0.001) and 0.46 for the second round (χ²=65.734, P<0.001), suggesting that the specialists had similar opinions. The final HIV infection risk assessment tool for MSM has 8 items. Conclusions: The HIV infection risk assessment tool for MSM, developed using the Delphi method, can be used in the evaluation of HIV infection risk in MSM and in individualized prevention and intervention. However, the reliability and validity of this risk assessment tool need to be further evaluated.
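Kendall's W reported above can be computed from a judges-by-items rank matrix; a minimal sketch with an invented 3-specialist, 4-item example (no ties):

```python
# Sketch of Kendall's coefficient of concordance W, used to measure agreement
# among consulted specialists. The rank matrix below is invented.

def kendalls_w(ranks):
    """ranks[j][i] = rank given by judge j to item i (1..n, no ties)."""
    m, n = len(ranks), len(ranks[0])
    totals = [sum(judge[i] for judge in ranks) for i in range(n)]
    mean_total = m * (n + 1) / 2
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Three hypothetical specialists ranking four candidate items
ranks = [
    [1, 2, 3, 4],
    [2, 1, 3, 4],
    [1, 3, 2, 4],
]
w = kendalls_w(ranks)
print(w)  # 0 = no agreement, 1 = perfect agreement
```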
Dynamic drought risk assessment using crop model and remote sensing techniques
NASA Astrophysics Data System (ADS)
Sun, H.; Su, Z.; Lv, J.; Li, L.; Wang, Y.
2017-02-01
Drought risk assessment is of great significance for reducing agricultural drought losses and ensuring food security. The conventional drought risk assessment method evaluates a specific region's exposure to the hazard and its vulnerability to extended periods of water shortage, which is a static evaluation. Dynamic Drought Risk Assessment (DDRA) estimates drought risk according to crop growth and water stress conditions in real time. In this study, a DDRA method using a crop model and remote sensing techniques is proposed. The crop model employed is the DeNitrification and DeComposition (DNDC) model. Drought risk was quantified by the yield losses predicted by the crop model in a scenario-based manner. The crop model was re-calibrated to improve its performance using the Leaf Area Index (LAI) retrieved from MODerate Resolution Imaging Spectroradiometer (MODIS) data. The in-situ, station-based crop model was then extended to assess regional drought risk by integrating crop planting maps. The crop planted area was extracted from MODIS data with the extended CPPI method. The study was implemented and validated on maize in Liaoning province, China.
Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya
2016-12-01
To examine the abilities of the traditional failure mode and effects analysis (FMEA) and modified healthcare FMEA (m-HFMEA) scoring methods by comparing their degree of congruence in identifying high-risk failures. The authors applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed: based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently compared to determine if the results and rated risks matched. The authors' results showed an agreement of 85% between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA-RPN scores. In the m-HFMEA analysis, once the risk score is determined, the failure mode should be investigated more thoroughly on the basis of the established HFMEA Decision Tree™. m-HFMEA is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of high risks and mitigation measures. It is therefore a useful prospective risk analysis tool for radiotherapy.
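The two scoring schemes being compared can be sketched as follows; the scales, matrix thresholds, and example scores are invented, not the study's actual categories:

```python
# Sketch of the two scoring schemes: the traditional FMEA risk priority number
# (RPN = occurrence x severity x detectability) and an m-HFMEA-style
# severity x frequency matrix. All numbers and thresholds are invented.

def rpn(occurrence, severity, detectability):
    """Traditional FMEA risk priority number (each index scored 1-10)."""
    return occurrence * severity * detectability

def hfmea_category(severity, frequency):
    """Toy 4-level risk matrix standing in for the m-HFMEA hazard scoring
    (each index scored 1-4)."""
    score = severity * frequency
    if score >= 12:
        return "very high"
    if score >= 8:
        return "high"
    if score >= 4:
        return "low"
    return "very low"

# One hypothetical failure mode scored under both schemes
score = rpn(occurrence=4, severity=7, detectability=3)
cat = hfmea_category(severity=3, frequency=3)
print(score, cat)
```

The example shows why the two rankings can diverge: RPN folds detectability into a single product, while the matrix approach categorizes on severity and frequency alone.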
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects of transport objects. The management of such projects demands the application of special probabilistic methods due to the large level of uncertainty in their implementation. Risk management in these projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that yield reliable risk assessments in projects. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures makes it possible to carry out a quantitative assessment of the main risk indicators of projects when solving the tasks of managing innovation-investment projects. Any competent specialist can calculate the damage from the onset of a risk event, but assessing the probability of its occurrence requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
EFFECTS-BASED CUMULATIVE RISK ASSESSMENT IN A LOW-INCOME URBAN COMMUNITY NEAR A SUPERFUND SITE
We will introduce into the cumulative risk assessment framework novel methods for non-cancer risk assessment, techniques for dose-response modeling that extend insights from chemical mixtures frameworks to non-chemical stressors, multilevel statistical methods used to address ...
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
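Among the sampling-based approaches listed above, Monte Carlo simulation is the most common. The sketch below propagates uncertainty through a deliberately simple risk model (lifetime risk = dose × slope factor); the model form and all distribution parameters are illustrative assumptions, not values from the review.

```python
import random

def simulate_risk(n: int = 10000, seed: int = 1) -> list:
    """Monte Carlo propagation of two uncertain inputs into a risk estimate."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n):
        # Daily DBP dose (mg/kg/day): lognormal to reflect exposure variability.
        dose = rng.lognormvariate(-9.0, 0.5)
        # Cancer slope factor (per mg/kg/day): also uncertain.
        slope = rng.lognormvariate(-4.0, 0.3)
        risks.append(dose * slope)
    return risks

risks = simulate_risk()
risks.sort()
median = risks[len(risks) // 2]
p95 = risks[int(0.95 * len(risks))]
print(f"median risk {median:.2e}, 95th percentile {p95:.2e}")
```

Reporting percentiles of the output distribution, rather than a single point estimate, is what distinguishes this from a deterministic risk calculation.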
Construction risk assessment of deep foundation pit in metro station based on G-COWA method
NASA Astrophysics Data System (ADS)
You, Weibao; Wang, Jianbo; Zhang, Wei; Liu, Fangmeng; Yang, Diying
2018-05-01
In order to gain an accurate understanding of the construction safety of deep foundation pits in metro stations and to reduce the probability and loss of risk events, a risk assessment method based on G-COWA is proposed. Firstly, drawing on specific engineering examples and the construction characteristics of deep foundation pits, an evaluation index system is established around the five factors of “human, management, technology, material and environment”. Secondly, the C-OWA operator is introduced to weight the evaluation indices and weaken the negative influence of experts' subjective preferences. Grey cluster analysis and the fuzzy comprehensive evaluation method are combined to construct a construction risk assessment model for deep foundation pits that can effectively handle the uncertainties involved. Finally, the model is applied to the actual deep foundation pit project of Qingdao Metro North Station; its construction risk rating is determined to be “medium”, and the model is shown to be feasible and reasonable. Corresponding control measures are then put forward, providing a useful reference.
Population-based absolute risk estimation with survey data
Kovalchik, Stephanie A.; Pfeiffer, Ruth M.
2013-01-01
Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
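The competing-risks calculation described above can be sketched with piecewise constant cause-specific hazards, the piecewise exponential baseline mentioned in the abstract. The hazard values below are illustrative assumptions.

```python
import math

def absolute_risk(hazards_by_cause, intervals):
    """hazards_by_cause: {cause: [hazard per year in each interval]}
    intervals: list of interval lengths (years).
    Returns {cause: probability that cause's event occurs first}."""
    causes = list(hazards_by_cause)
    surv = 1.0                      # P(no event of any type yet)
    risk = {c: 0.0 for c in causes}
    for i, width in enumerate(intervals):
        total = sum(hazards_by_cause[c][i] for c in causes)
        # Probability of any event in this interval, given event-free entry.
        p_event = 1.0 - math.exp(-total * width)
        for c in causes:
            # The event is attributed to each cause in proportion to its hazard.
            risk[c] += surv * p_event * hazards_by_cause[c][i] / total
        surv *= math.exp(-total * width)
    return risk

# Two competing causes over three 5-year intervals:
r = absolute_risk({"cvd": [0.01, 0.02, 0.03], "cancer": [0.005, 0.01, 0.02]},
                  [5, 5, 5])
print(r)
```

By construction, the cause-specific risks plus the final event-free survival probability sum to one, which is the defining property of absolute (crude) risk in the presence of competing events.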
2011-01-01
Background Fall events contribute significantly to mortality, morbidity and costs in our ageing population. In order to identify persons at risk and to target preventive measures, many scores and assessment tools have been developed. These often require expertise and are costly to implement. Recent research investigates the use of wearable inertial sensors to provide objective data on motion features which can be used to assess individual fall risk automatically. So far it is unknown how well this new method performs in comparison with conventional fall risk assessment tools. The aim of our research is to compare the predictive performance of our new sensor-based method with conventional and established methods, based on prospective data. Methods In a first study phase, 119 inpatients of a geriatric clinic took part in motion measurements using a wireless triaxial accelerometer during a Timed Up&Go (TUG) test and a 20 m walk. Furthermore, the St. Thomas Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY) was performed, and the multidisciplinary geriatric care team estimated the patients' fall risk. In a second follow-up phase of the study, 46 of the participants were interviewed after one year, including a fall and activity assessment. The predictive performances of the TUG, the STRATIFY and team scores are compared. Furthermore, two automatically induced logistic regression models based on conventional clinical and assessment data (CONV) as well as sensor data (SENSOR) are compared. Results Among the risk assessment scores, the geriatric team score (sensitivity 56%, specificity 80%) outperforms STRATIFY and TUG. The induced logistic regression models CONV and SENSOR achieve similar performance values (sensitivity 68%/58%, specificity 74%/78%, AUC 0.74/0.72, +LR 2.64/2.61). Both models are able to identify more persons at risk than the simple scores.
Conclusions Sensor-based objective measurements of motion parameters in geriatric patients can be used to assess individual fall risk, and our prediction model's performance matches that of a model based on conventional clinical and assessment data. Sensor-based measurements using a small wearable device may contribute significant information to conventional methods and are feasible in an unsupervised setting. More prospective research is needed to assess the cost-benefit relation of our approach. PMID:21711504
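The performance measures reported above (sensitivity, specificity, positive likelihood ratio) are straightforward to compute from binary fall predictions. The labels and predictions below are made up for illustration, not the study's data.

```python
def diagnostics(y_true, y_pred):
    """Sensitivity, specificity and +LR from binary outcomes/predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    pos_lr = sens / (1.0 - spec)   # +LR, as quoted for the CONV/SENSOR models
    return sens, spec, pos_lr

# 1 = faller, 0 = non-faller (hypothetical follow-up outcomes and predictions)
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
print(diagnostics(y_true, y_pred))
```

A +LR above 1 means a positive prediction raises the odds of being a faller; the models above achieve +LR around 2.6.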
NASA Astrophysics Data System (ADS)
Bai, Tao; Ma, Pan-pan; Kan, Yan-bin; Huang, Qiang
2017-12-01
Ecological risk assessment of rivers is an important element of protecting and improving the ecological environment. In this paper, taking the Xiaolangdi reservoir as an example, ecological risk is assessed based on daily runoff data for 1956-1997 and 2002-2008, the periods before and after the construction of the reservoir. Considering the hydrological regimes before and after construction, an ecological risk assessment index system for the downstream reach is established based on the Index of Hydrologic Alteration-Range of Variability Approach (IHA-RVA) method, covering the magnitude, timing, frequency, duration and rate of change of flows. A fuzzy comprehensive ecological risk evaluation model for the downstream reach is then built on the risk index and the RVA method. The results show that after the construction of the Xiaolangdi reservoir, ecological risk arose in the downstream reach of the Yellow River because of changed hydrological indices, such as monthly average flow and the frequency and duration of annual extreme flows, which could damage the river ecosystem as a whole; for example, the ecological risk downstream of the Xiaolangdi reservoir rose to level two in 2008. The results provide reference values and a scientific basis for the ecological risk assessment and management of reservoirs after construction.
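The RVA-style degree of hydrologic alteration underlying such a risk index compares, for each IHA indicator, how often post-dam values fall inside the pre-dam target range with the frequency expected from the pre-dam record. The flow values and the 25th-75th percentile target range below are illustrative assumptions.

```python
def alteration_degree(pre, post, low_pct=0.25, high_pct=0.75):
    """RVA degree of alteration D = |(observed - expected) / expected|
    for one hydrologic indicator."""
    s = sorted(pre)
    lo = s[int(low_pct * (len(s) - 1))]    # lower RVA target boundary
    hi = s[int(high_pct * (len(s) - 1))]   # upper RVA target boundary
    observed = sum(1 for v in post if lo <= v <= hi)
    expected = (high_pct - low_pct) * len(post)  # expected count inside range
    return abs(observed - expected) / expected

pre = [820, 760, 900, 840, 780, 870, 810, 950, 730, 880]  # pre-dam flows
post = [560, 540, 600, 580, 610, 570]                     # post-dam flows
print(alteration_degree(pre, post))  # 1.0: post-dam flows fully outside range
```

A degree near 0 means the post-dam regime still behaves like the pre-dam one; values near 1 indicate strong alteration and hence higher ecological risk.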
NASA Astrophysics Data System (ADS)
Cao, Guangxi; Han, Yan; Li, Qingchen; Xu, Wei
2017-02-01
The acceleration of economic globalization gradually reveals linkages among the stock markets of various countries and produces a risk conduction effect. An asymmetric MF-DCCA method based on the different directions of risk conduction (DMF-ADCCA) is constructed from the traditional MF-DCCA. To keep the empirical results objective and robust, this study selects stock index data for China, the US, Germany, India, and Brazil from January 2011 to September 2014 and applies the asymmetric MF-DCCA method based on different risk conduction effects, together with nonlinear Granger causality tests, to study the asymmetric cross-correlation between the domestic and foreign stock markets. The empirical results indicate a bidirectional conduction effect between the domestic and foreign stock markets, with a greater degree of influence from foreign markets on the domestic market than from the domestic market on foreign markets.
ERIC Educational Resources Information Center
Common, Eric Alan; Lane, Kathleen Lynne; Pustejovsky, James E.; Johnson, Austin H.; Johl, Liane Elizabeth
2017-01-01
This systematic review investigated one systematic approach to designing, implementing, and evaluating functional assessment-based interventions (FABI) for use in supporting school-age students with or at-risk for high-incidence disabilities. We field tested several recently developed methods for single-case design syntheses. First, we appraised…
Chemical Mixture Risk Assessment Additivity-Based Approaches
Powerpoint presentation includes additivity-based chemical mixture risk assessment methods. Basic concepts, theory and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.
Calysto: Risk Management for Commercial Manned Spaceflight
NASA Technical Reports Server (NTRS)
Dillaman, Gary
2012-01-01
The Calysto: Risk Management for Commercial Manned Spaceflight study analyzes risk management in large enterprises and how to effectively communicate risks across organizations. The Calysto Risk Management tool developed by NASA's Kennedy Space Center's SharePoint team is used and referenced throughout the study. Calysto is a web-based tool built on Microsoft's SharePoint platform. The risk management process at NASA is examined and incorporated in the study. Using risk management standards from industry and specific organizations at the Kennedy Space Center, three methods of communicating and elevating risk are examined. For each method, the effectiveness and plausibility of its use within the Calysto Risk Management Tool are described. At the end of the study, suggestions are made for future renditions of Calysto.
A simulation model of IT risk on program trading
NASA Astrophysics Data System (ADS)
Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan
2015-12-01
The biggest difficulty in measuring the IT risk of program trading lies in the lack of loss data. Current scholars address this by collecting records of IT incidents at home and abroad from courts, networks and other public media, and basing quantitative analyses of IT risk losses on the resulting database. However, an IT risk loss database established in this way can only fuzzily reflect the real situation and cannot truly explain it. In this paper, building on the concept and steps of Monte Carlo (MC) simulation, we apply the MC simulation method within the "Program trading simulation system" developed by our team to simulate real program trading and obtain IT risk loss data through IT failure experiments; the validity of the experimental data is then verified. In this way, the deficiencies of traditional research methods are better overcome, and the problem of the lack of IT risk data in quantitative research is solved. The work also provides researchers with a template for simulation-based research ideas and processes.
A BHR Composite Network-Based Visualization Method for Deformation Risk Level of Underground Space
Zheng, Wei; Zhang, Xiaoya; Lu, Qi
2015-01-01
This study proposes a visualization processing method for the deformation risk level of underground space. The proposed method is based on a BP-Hopfield-RGB (BHR) composite network. Complex environmental factors are integrated in the BP neural network. Dynamic monitoring data are then automatically classified in the Hopfield network. The deformation risk level is combined with the RGB color space model and is displayed visually in real time, after which experiments are conducted with the use of an ultrasonic omnidirectional sensor device for structural deformation monitoring. The proposed method is also compared with some typical methods using a benchmark dataset. Results show that the BHR composite network visualizes the deformation monitoring process in real time and can dynamically indicate dangerous zones. PMID:26011618
USDA-ARS?s Scientific Manuscript database
Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...
NHEERL is conducting a demonstration project to develop tools and approaches for assessing the risks of multiple stressors to populations of piscivorous wildlife, leading to the development of risk-based criteria. Specifically, we are developing methods and approaches to assess...
PRA and Risk Informed Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernsen, Sidney A.; Simonen, Fredric A.; Balkey, Kenneth R.
2006-01-01
The Boiler and Pressure Vessel Code (BPVC) of the American Society of Mechanical Engineers (ASME) has introduced a risk-based approach into Section XI, which covers Rules for Inservice Inspection of Nuclear Power Plant Components. The risk-based approach requires the application of probabilistic risk assessments (PRAs). Because no industry consensus standard existed for PRAs, ASME has developed a standard to evaluate the quality level of an available PRA needed to support a given risk-based application. The paper describes the PRA standard, Section XI applications of PRAs, and plans for broader applications of PRAs to other ASME nuclear codes and standards. The paper addresses several specific topics of interest to Section XI. An important consideration is the special methods (surrogate components) used to overcome the lack of treatment of passive components in PRAs. The approach allows calculation of conditional core damage probabilities both for component failures that cause initiating events and for failures in standby systems that decrease the availability of those systems. The paper relates the explicit risk-based methods of the new Section XI code cases to the implicit consideration of risk used in the development of Section XI. Other topics include the needed interactions of ISI engineers, plant operating staff, PRA specialists, and members of the expert panels that review the risk-based programs.
Sexual Pleasure and Sexual Risk among Women who Use Methamphetamine: A Mixed Methods Study
Lorvick, Jennifer; Bourgois, Philippe; Wenger, Lynn D.; Arreola, Sonya G.; Lutnick, Alexandra; Wechsberg, Wendee M.; Kral, Alex H.
2012-01-01
Background The intersection of drug use, sexual pleasure and sexual risk behavior is rarely explored when it comes to poor women who use drugs. This paper explores the relationship between sexual behavior and methamphetamine use in a community-based sample of women, exploring not only risk, but also desire, pleasure and the challenges of overcoming trauma. Methods Quantitative data were collected using standard epidemiological methods (N=322) for community-based studies. In addition, using purposive sampling, qualitative data were collected among a subset of participants (n=34). Data were integrated for mixed methods analysis. Results While many participants reported sexual risk behavior (unprotected vaginal or anal intercourse) in the quantitative survey, sexual risk was not the central narrative pertaining to sexual behavior and methamphetamine use in qualitative findings. Rather, desire, pleasure and disinhibition arose as central themes. Women described feelings of power and agency related to sexual behavior while high on methamphetamine. Findings were mixed on whether methamphetamine use increased sexual risk behavior. Conclusion The use of mixed methods afforded important insights into the sexual behavior and priorities of methamphetamine-using women. Efforts to reduce sexual risk should recognize and valorize the positive aspects of methamphetamine use for some women, building on positive feelings of power and agency as an approach to harm minimization. PMID:22954501
NASA Astrophysics Data System (ADS)
Sun, K.; Cheng, D. B.; He, J. J.; Zhao, Y. L.
2018-02-01
Collapse gully erosion is a type of soil erosion specific to the red soil region of southern China, and early warning and prevention of its occurrence are very important. Based on the idea of risk assessment, and taking Guangdong Province as an example, this research adopts information acquisition analysis and logistic regression analysis to discuss the feasibility of collapse gully erosion risk assessment at the regional scale and to compare the applicability of the different risk assessment methods. The results show that in Guangdong Province the risk of collapse gully erosion is high in the northeastern and western areas and relatively low in the southwestern and central parts. The comparison of the different risk assessment methods also indicated that the risk distribution patterns from the different methods were basically consistent, although the accuracy of the risk map from the information acquisition analysis method was slightly better than that from the logistic regression analysis method.
Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as risk management methods. However, most studies choose only one of these two methods in their risk management methodology, whereas combining them reduces the drawbacks each method has when implemented separately. This paper combines FMEA and FTA into a single methodology for assessing risk. A case study in a metal company illustrates how the methodology can be implemented: the combined methodology is used to assess the internal risks that occur in the production process, and those internal risks are then mitigated according to their risk levels.
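A minimal sketch of the combined workflow: FMEA ranks failure modes by RPN, and FTA then quantifies how basic-event probabilities combine into a top-event probability for the highest-ranked mode. The gate structure, failure modes and all numbers below are illustrative assumptions.

```python
def rpn(occurrence, severity, detectability):
    """FMEA risk priority number."""
    return occurrence * severity * detectability

def or_gate(probs):
    """Top event occurs if any independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """Top event occurs only if all independent basic events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# FMEA step: pick the highest-RPN failure mode for deeper FTA study.
modes = {"pump leak": (4, 7, 5), "sensor drift": (6, 3, 4), "valve stuck": (3, 9, 6)}
worst = max(modes, key=lambda m: rpn(*modes[m]))
print(worst)  # "valve stuck", RPN = 162

# FTA step for that mode: top = (seal wear AND overpressure) OR maintenance lapse
p_top = or_gate([and_gate([0.10, 0.05]), 0.02])
print(round(p_top, 4))  # 0.0249
```

FMEA thus supplies the prioritization that FTA lacks, while FTA supplies the causal probability structure that the RPN alone cannot express.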
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from the perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing methods to show the effectiveness of the proposed model.
A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.
Yang, Harry; Zhang, Jianchun
2015-01-01
The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually to guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow method validity to be assessed while taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus the consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, the choice between the generalized pivotal quantity and β-content (0.9) methods for an analytical method validation depends on the accuracy of the analytical method.
It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current methods. Analytical methods are often used to ensure safety, efficacy, and quality of medicinal products. According to government regulations and regulatory guidelines, these methods need to be validated through well-designed studies to minimize the risk of accepting unsuitable methods. This article describes a novel statistical test for analytical method validation, which provides better protection for the risk of accepting unsuitable analytical methods. © PDA, Inc. 2015.
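A total-error acceptance check of the kind discussed above can be sketched as follows: a method is accepted if an interval covering most future (measured − true) errors lies within the acceptance limits. The k-factor below is a simple large-sample normal approximation, not the exact β-content tolerance-interval or generalized-pivotal-quantity construction; the error data and limits are illustrative.

```python
import statistics

def total_error_ok(errors, limit, k=2.0):
    """errors: observed (measured - true) values from a validation run.
    Accept if |mean bias| + k * sd stays inside the acceptance limit."""
    bias = statistics.mean(errors)
    sd = statistics.stdev(errors)
    return abs(bias) + k * sd <= limit

errors = [0.8, -0.5, 1.2, 0.3, -0.9, 0.6, 0.1, -0.2]   # % recovery errors
print(total_error_ok(errors, limit=5.0))   # wide limit: method accepted
print(total_error_ok(errors, limit=1.0))   # tight limit: not fit for purpose
```

The point of the more rigorous constructions in the paper is precisely that a naive k-factor like this one does not control the consumer's risk at a stated confidence level.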
Risk analysis for veterinary biologicals released into the environment.
Silva, S V; Samagh, B S; Morley, R S
1995-12-01
All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
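The core of scenario tree analysis is that each outcome is a path through conditional branch probabilities, and the path probability is their product. The tree structure, branch probabilities and consequence scale below are hypothetical, for illustration only.

```python
def path_probability(branches):
    """Probability of one scenario path: product of its conditional
    branch probabilities."""
    p = 1.0
    for b in branches:
        p *= b
    return p

# Hypothetical path for a released veterinary biological:
# escape from vaccinated host -> survival in environment -> exposure of wildlife
p_exposure = path_probability([0.01, 0.2, 0.05])

# Expected impact = path probability x consequence score (assumed 0-10 scale)
expected_impact = p_exposure * 8.0
print(p_exposure, expected_impact)
```

Summing such expected impacts over all paths of the tree gives the overall risk estimate that informs the release recommendation.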
Unmanned aircraft system sense and avoid integrity and continuity
NASA Astrophysics Data System (ADS)
Jamoom, Michael B.
This thesis describes new methods to guarantee safety of sense and avoid (SAA) functions for Unmanned Aircraft Systems (UAS) by evaluating integrity and continuity risks. Previous SAA efforts focused on relative safety metrics, such as risk ratios, comparing the risk of using an SAA system versus not using it. The methods in this thesis evaluate integrity and continuity risks as absolute measures of safety, as is the established practice in commercial aircraft terminal area navigation applications. The main contribution of this thesis is a derivation of a new method, based on a standard intruder relative constant velocity assumption, that uses hazard state estimates and estimate error covariances to establish (1) the integrity risk of the SAA system not detecting imminent loss of '"well clear," which is the time and distance required to maintain safe separation from intruder aircraft, and (2) the probability of false alert, the continuity risk. Another contribution is applying these integrity and continuity risk evaluation methods to set quantifiable and certifiable safety requirements on sensors. A sensitivity analysis uses this methodology to evaluate the impact of sensor errors on integrity and continuity risks. The penultimate contribution is an integrity and continuity risk evaluation where the estimation model is refined to address realistic intruder relative linear accelerations, which goes beyond the current constant velocity standard. The final contribution is an integrity and continuity risk evaluation addressing multiple intruders. This evaluation is a new innovation-based method to determine the risk of mis-associating intruder measurements. A mis-association occurs when the SAA system incorrectly associates a measurement to the wrong intruder, causing large errors in the estimated intruder trajectories. 
The new methods described in this thesis can help ensure safe encounters between aircraft and enable SAA sensor certification for UAS integration into the National Airspace System.
Animal food intake and cooking methods in relation to endometrial cancer risk in Shanghai
Xu, W-H; Dai, Q; Xiang, Y-B; Zhao, G-M; Zheng, W; Gao, Y-T; Ruan, Z-X; Cheng, J-R; Shu, X-O
2006-01-01
We evaluated animal food intake and cooking methods in relation to endometrial cancer risk in a population-based case–control study in Shanghai, China. A validated food frequency questionnaire was used to collect the usual dietary habits of 1204 cases and 1212 controls aged 30–69 years between 1997 and 2003. Statistical analyses were based on an unconditional logistic regression model adjusting for potential confounders. High intake of meat and fish was associated with an increased risk of endometrial cancer, with adjusted odds ratios for the highest vs the lowest quartile groups being 1.7 (95% confidence interval: 1.3–2.2) and 2.4 (1.8–3.1), respectively. The elevated risk was observed for all types of meat and fish intake. Intake of eggs and milk was not related to risk. Cooking methods and doneness levels for meat and fish were not associated with risk, nor did they modify the association with meat and fish consumption. Our study suggests that animal food consumption may play an important role in the aetiology of endometrial cancer, but cooking methods have minimal influence on risk among Chinese women. PMID:17060930
Semicompeting risks in aging research: methods, issues and needs
Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen
2015-01-01
A semicompeting risks problem involves two types of events: a nonterminal and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of the multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (class L models). These advances notwithstanding, the semicompeting risks methodology has not penetrated biomedical research in general, or gerontological research in particular. Some possible reasons for this lack of uptake are: the methods are relatively new and sophisticated, the conceptual problems associated with potential failure time models are difficult to overcome, the paucity of expository articles aimed at educating practitioners, and the non-availability of readily usable software. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of the semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of the semicompeting risks methodology by the broader biomedical research community. PMID:24729136
NASA Astrophysics Data System (ADS)
Zhang, Kejiang; Kluck, Cheryl; Achari, Gopal
2009-11-01
A ranking system for contaminated sites based on comparative risk methodology using fuzzy Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE) was developed in this article. It combines the concepts of fuzzy sets to represent uncertain site information with the PROMETHEE, a subgroup of Multi-Criteria Decision Making (MCDM) methods. Criteria are identified based on a combination of the attributes (toxicity, exposure, and receptors) associated with the potential human health and ecological risks posed by contaminated sites, chemical properties, site geology and hydrogeology and contaminant transport phenomena. Original site data are directly used avoiding the subjective assignment of scores to site attributes. When the input data are numeric and crisp the PROMETHEE method can be used. The Fuzzy PROMETHEE method is preferred when substantial uncertainties and subjectivities exist in site information. The PROMETHEE and fuzzy PROMETHEE methods are both used in this research to compare the sites. The case study shows that this methodology provides reasonable results.
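A minimal crisp PROMETHEE II sketch (the non-fuzzy variant mentioned above): pairwise preference degrees are aggregated into net outranking flows used to rank sites. The "usual" (step) preference function is assumed for every criterion, and the site data, criteria and weights are illustrative, not from the case study.

```python
def net_flows(scores, weights, maximize):
    """PROMETHEE II net flows. scores[a][c]: alternative a on criterion c."""
    n = len(scores)
    phi = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pref_ab = pref_ba = 0.0
            for c, w in enumerate(weights):
                d = scores[a][c] - scores[b][c]
                if not maximize[c]:
                    d = -d          # for minimized criteria, smaller is better
                # Usual (step) preference function: full preference if better.
                if d > 0:
                    pref_ab += w
                elif d < 0:
                    pref_ba += w
            phi[a] += (pref_ab - pref_ba) / (n - 1)
    return phi

# Three sites; criteria: toxicity, exposure, cleanup cost (all minimized)
sites = [[3.0, 40.0, 1.2], [2.0, 25.0, 0.8], [5.0, 60.0, 2.0]]
phi = net_flows(sites, weights=[0.5, 0.3, 0.2], maximize=[False, False, False])
ranking = sorted(range(3), key=lambda i: -phi[i])
print(phi, ranking)  # site 1 ranks best, site 2 worst
```

The fuzzy variant replaces the crisp scores with fuzzy numbers when site data are uncertain, which is the extension the article develops.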
What is the lifetime risk of developing cancer?: the effect of adjusting for multiple primaries
Sasieni, P D; Shelton, J; Ormiston-Smith, N; Thomson, C S; Silcocks, P B
2011-01-01
Background: The ‘lifetime risk’ of cancer is generally estimated by combining current incidence rates with current all-cause mortality (the ‘current probability’ method) rather than by describing the experience of a birth cohort. As individuals may get more than one type of cancer, what is generally estimated is the average (mean) number of cancers over a lifetime. This is not the same as the probability of getting cancer. Methods: We describe a method for estimating lifetime risk that corrects for the inclusion of multiple primary cancers in the incidence rates routinely published by cancer registries. The new method applies cancer incidence rates to the estimated probability of being alive without a previous cancer. The new method is illustrated using data from the Scottish Cancer Registry and is compared with ‘gold-standard’ estimates that use (unpublished) data on first primaries. Results: The effect of this correction is to make the estimated ‘lifetime risk’ smaller. The new estimates are extremely similar to those obtained using incidence based on first primaries. The usual ‘current probability’ method considerably overestimates the lifetime risk of all cancers combined, although the correction for any single cancer site is minimal. Conclusion: Estimation of the lifetime risk of cancer should either be based on first primaries or should use the new method. PMID:21772332
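The gap between the mean number of cancers and the probability of at least one cancer, which motivates the correction, can be shown with a toy calculation. The flat incidence rate below is invented and mortality is ignored for simplicity; nothing here is registry data.

```python
import numpy as np

# Hypothetical flat incidence: 0.5% per year of life, ages 0-89.
incidence = np.full(90, 0.005)

# 'Current probability'-style accumulation approximates the MEAN number of
# cancers per person, counting multiple primaries.
mean_cancers = incidence.sum()

# Probability of developing at least one cancer: each year, only people
# still alive and cancer-free can have a first cancer.
prob_at_least_one = 1 - np.prod(1 - incidence)
```

Here mean_cancers is 0.45 while prob_at_least_one is about 0.36, so the uncorrected figure overstates the lifetime risk, which is the direction of bias the abstract reports.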
Papageorgiou, Elpiniki I; Jayashree Subramanian; Karmegam, Akila; Papandrianos, Nikolaos
2015-11-01
Breast cancer is one of the deadliest diseases affecting women, and thus it is natural for women aged 40-49 years (who have a family history of breast cancer or other related cancers) to assess their personal risk of developing familial breast cancer (FBC). Because each woman has a different level of breast cancer risk depending on her family history, genetic predisposition, and personal medical history, an individualized care-setting mechanism needs to be identified so that appropriate risk assessment, counseling, screening, and prevention options can be determined by health care professionals. The presented work aims at developing a soft-computing-based medical decision support system using Fuzzy Cognitive Maps (FCMs) that assists health care professionals in deciding the individualized care-setting mechanism based on the FBC risk level of a given woman. The FCM-based FBC risk management system uses nonlinear Hebbian learning (NHL) to learn causal weights from 40 patient records and achieves 95% diagnostic accuracy. The results obtained from the proposed model are in concurrence with the comprehensive risk evaluation tool based on the Tyrer-Cuzick model for 38/40 patient cases (95%). Moreover, the proposed model identifies high-risk women with higher prediction accuracy than the standard Gail and NSABP models. The testing accuracy of the proposed model using 10-fold cross-validation outperforms other standard machine-learning-based inference engines as well as previous FCM-based risk prediction methods for BC. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
The East London glaucoma prediction score: web-based validation of glaucoma risk screening tool
Stephen, Cook; Benjamin, Longo-Mbenza
2013-01-01
AIM It is difficult for optometrists and general practitioners to know which patients are at risk of glaucoma. The East London glaucoma prediction score (ELGPS) is a web-based risk calculator developed to determine glaucoma risk at the time of screening. Multiple risk factors that are available in a low-tech environment are assessed to provide a risk assessment. This is extremely useful in settings where access to specialist care is difficult. Use of the calculator is educational. It is a free web-based service, and data capture is user specific. METHOD The scoring system is a web-based questionnaire that captures and subsequently calculates the relative risk of the presence of glaucoma at the time of screening. Three categories of patient are described: unlikely to have glaucoma, glaucoma suspect, and glaucoma. A case review methodology of patients with a known diagnosis is employed to validate the calculator's risk assessment. RESULTS Data from the records of 400 patients with an established diagnosis have been captured and used to validate the screening tool. The website reports that the calculated diagnosis correlates with the actual diagnosis 82% of the time. Biostatistical analysis showed: sensitivity = 88%; positive predictive value = 97%; specificity = 75%. CONCLUSION Analysis of the first 400 patients validates the web-based screening tool as a good method of screening the at-risk population. The validation is ongoing. The web-based format will allow more widespread recruitment across different geographic, population, and personnel variables. PMID:23550097
ERIC Educational Resources Information Center
Brammeier, Monique; Chow, Joan M.; Samuel, Michael C.; Organista, Kurt C.; Miller, Jamie; Bolan, Gail
2008-01-01
Context: The prevalence of sexually transmitted diseases and associated risk behaviors among California farmworkers is not well described. Purpose: To estimate the prevalence of sexually transmitted diseases (STDs) and associated risk behaviors among California farmworkers. Methods: Cross-sectional analysis of population-based survey data from 6…
Chronic Disease Risk Reduction with a Community-Based Lifestyle Change Programme
ERIC Educational Resources Information Center
Merrill, Ray M; Aldana, Steven G; Greenlaw, Roger L; Salberg, Audrey; Englert, Heike
2008-01-01
Objective: To assess whether reduced health risks resulting from the Coronary Health Improvement Project (CHIP) persist through 18 months. Methods: The CHIP is a four-week health education course designed to help individuals reduce cardiovascular risk by improving nutrition and physical activity behaviors. Analyses were based on 211 CHIP enrollees,…
ERIC Educational Resources Information Center
Skelton, Alexander; Riley, David; Wales, David; Vess, James
2006-01-01
A growing research base supports the predictive validity of actuarial methods of risk assessment with sexual offenders. These methods use clearly defined variables with demonstrated empirical association with re-offending. The advantages of actuarial measures for screening large numbers of offenders quickly and economically are further enhanced…
Robust Derivation of Risk Reduction Strategies
NASA Technical Reports Server (NTRS)
Richardson, Julian; Port, Daniel; Feather, Martin
2007-01-01
Effective risk reduction strategies can be derived mechanically given sufficient characterization of the risks present in the system and the effectiveness of available risk reduction techniques. In this paper, we address an important question: can we reliably expect mechanically derived risk reduction strategies to be better than fixed or hand-selected risk reduction strategies, given that the quantitative assessment of risks and risk reduction techniques upon which mechanical derivation is based is difficult and likely to be inaccurate? We consider this question relative to two methods for deriving effective risk reduction strategies: the strategic method defined by Kazman, Port et al [Port et al, 2005], and the Defect Detection and Prevention (DDP) tool [Feather & Cornford, 2003]. We performed a number of sensitivity experiments to evaluate how inaccurate knowledge of risk and risk reduction techniques affect the performance of the strategies computed by the Strategic Method compared to a variety of alternative strategies. The experimental results indicate that strategies computed by the Strategic Method were significantly more effective than the alternative risk reduction strategies, even when knowledge of risk and risk reduction techniques was very inaccurate. The robustness of the Strategic Method suggests that its use should be considered in a wide range of projects.
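Mechanical derivation of a risk reduction strategy can be illustrated with a generic greedy heuristic: repeatedly pick the technique with the best ratio of expected risk reduced to cost until the budget is exhausted. This is a sketch of the general idea only, not the Strategic Method or DDP themselves; all risks, effectiveness fractions, and costs below are invented.

```python
# Hypothetical expected losses per risk and candidate mitigation techniques.
risks = {'r1': 10.0, 'r2': 6.0, 'r3': 3.0}
techniques = {
    # name: (cost, {risk: fraction of that risk removed})
    't1': (4.0, {'r1': 0.5}),
    't2': (2.0, {'r2': 0.5, 'r3': 0.5}),
    't3': (5.0, {'r1': 0.3, 'r2': 0.3}),
}

def greedy_strategy(risks, techniques, budget):
    """Greedily select techniques by (risk reduced)/cost until budget runs out."""
    risks = dict(risks)
    avail = dict(techniques)
    chosen, remaining = [], budget
    while avail:
        def benefit_per_cost(t):
            cost, eff = avail[t]
            return sum(risks[r] * f for r, f in eff.items()) / cost
        best = max(avail, key=benefit_per_cost)
        cost, eff = avail.pop(best)
        if cost > remaining:
            continue  # unaffordable: discard and consider the rest
        for r, f in eff.items():
            risks[r] *= (1 - f)  # apply the risk reduction
        chosen.append(best)
        remaining -= cost
    return chosen

plan = greedy_strategy(risks, techniques, budget=6.0)
```

Sensitivity experiments like those in the paper would perturb the risk and effectiveness numbers and check how often the derived plan still beats fixed alternatives.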
Deep vein thrombosis in hospitalized patients: a review of evidence-based guidelines for prevention.
Kehl-Pruett, Wendy
2006-01-01
Deep vein thrombosis affects many hospitalized patients because of decreased activity and therapeutic equipment. This article reviews known risk factors for developing deep vein thrombosis, current prevention methods, and current evidence-based guidelines in order to raise nurses' awareness of early prevention methods in all hospitalized patients. Early prophylaxis can reduce patient risk of deep vein thrombosis and its complications.
Syed, Zeeshan; Saeed, Mohammed; Rubinfeld, Ilan
2010-01-01
For many clinical conditions, only a small number of patients experience adverse outcomes. Developing risk stratification algorithms for these conditions typically requires collecting large volumes of data to capture enough positive and negative examples for training. This process is slow, expensive, and may not be appropriate for new phenomena. In this paper, we explore different anomaly detection approaches to identify high-risk patients as cases that lie in sparse regions of the feature space. We study three broad categories of anomaly detection methods: classification-based, nearest neighbor-based, and clustering-based techniques. When evaluated on data from the National Surgical Quality Improvement Program (NSQIP), these methods were able to successfully identify patients at an elevated risk of mortality and rare morbidities following inpatient surgical procedures. PMID:21347083
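A minimal version of the nearest neighbor-based category can be written directly in NumPy: score each case by its distance to its k-th nearest neighbor, so that cases lying in sparse regions of the feature space score high. The data below are synthetic, not NSQIP records.

```python
import numpy as np

# Synthetic cohort: 200 'typical' patients in a 5-feature space,
# plus one isolated case far from the bulk of the data.
rng = np.random.default_rng(0)
X = rng.normal(0, 1, size=(200, 5))
X = np.vstack([X, np.full((1, 5), 6.0)])   # index 200: the isolated case

def knn_anomaly_scores(X, k=5):
    """Each point's score is its distance to its k-th nearest neighbor."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    d_sorted = np.sort(d, axis=1)   # column 0 is the distance to self (0)
    return d_sorted[:, k]

scores = knn_anomaly_scores(X)
flagged = int(np.argmax(scores))   # index of the most anomalous case
```

No labeled outcomes are needed, which is the appeal of this family of methods when positive examples are rare.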
Operational Risk Measurement of Chinese Commercial Banks Based on Extreme Value Theory
NASA Astrophysics Data System (ADS)
Song, Jiashan; Li, Yong; Ji, Feng; Peng, Cheng
Financial institutions and supervisory bodies agree on the need to strengthen the measurement and management of operational risk. This paper builds a model of operational risk losses based on the Peaks Over Threshold (POT) model, emphasizing a weighted least squares refinement of Hill's estimation method, discussing the small-sample situation, and fixing the sample threshold more objectively, based on media-published data on the operational risk losses of major banks from 1994 to 2007.
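The classical Hill estimator that the paper's weighted least squares approach refines can be sketched as follows. The losses here are simulated Pareto draws, not actual bank operational-loss records, and this is the plain, unweighted estimator rather than the paper's variant.

```python
import numpy as np

# Simulated heavy-tailed losses: Pareto with tail index alpha = 2
# (numpy's pareto draws Lomax; adding 1 gives classical Pareto with x_m = 1).
rng = np.random.default_rng(1)
alpha_true = 2.0
losses = rng.pareto(alpha_true, size=5000) + 1.0

def hill_estimator(x, k):
    """Hill estimate of the tail index from the k largest observations."""
    x = np.sort(x)[::-1]                      # descending order statistics
    return 1.0 / np.mean(np.log(x[:k] / x[k]))

alpha_hat = hill_estimator(losses, k=500)     # should be near alpha_true
```

The choice of k (equivalently, of the threshold) is the delicate step the paper addresses; the weighted least squares version aims to stabilize the estimate across thresholds.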
Multiple Interacting Risk Factors: On Methods for Allocating Risk Factor Interactions.
Price, Bertram; MacNicoll, Michael
2015-05-01
A persistent problem in health risk analysis where it is known that a disease may occur as a consequence of multiple risk factors with interactions is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game-theory-determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed. © 2015 Society for Risk Analysis.
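The game-theoretic allocation described is the Shapley value: each factor receives its marginal contribution averaged over all orderings of the factors. For two factors this splits the interaction term equally, matching the abstract's observation. The excess-risk numbers below are illustrative.

```python
from itertools import permutations

# v(S): excess risk attributable to the set S of factors jointly
# (illustrative numbers; the A-B interaction is 5 - 1 - 2 = 2).
v = {
    frozenset(): 0.0,
    frozenset({'A'}): 1.0,
    frozenset({'B'}): 2.0,
    frozenset({'A', 'B'}): 5.0,
}

def shapley(players, v):
    """Average each player's marginal contribution over all orderings."""
    shares = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        seen = set()
        for p in order:
            shares[p] += v[frozenset(seen | {p})] - v[frozenset(seen)]
            seen.add(p)
    return {p: s / len(orders) for p, s in shares.items()}

alloc = shapley(['A', 'B'], v)
```

With v({A}) = 1, v({B}) = 2, and a joint excess risk of 5, the interaction of 2 is split equally: A is allocated 2 and B is allocated 3, the same result as equal-weight apportionment.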
Lobach, Iryna; Fan, Ruzong; Carroll, Raymond J.
2011-01-01
With the advent of dense single nucleotide polymorphism genotyping, population-based association studies have become the major tools for identifying human disease genes and for fine gene mapping of complex traits. We develop a genotype-based approach for association analysis of case-control studies of gene-environment interactions in the case when environmental factors are measured with error and genotype data are available on multiple genetic markers. To directly use the observed genotype data, we propose two genotype-based models: genotype effect and additive effect models. Our approach offers several advantages. First, the proposed risk functions can directly incorporate the observed genotype data while modeling the linkage disequilibrium information in the regression coefficients, thus eliminating the need to infer haplotype phase. Compared with the haplotype-based approach, an estimating procedure based on the proposed methods can be much simpler and significantly faster. In addition, there is no potential risk due to haplotype phase estimation. Further, by fitting the proposed models, it is possible to analyze the risk alleles/variants of complex diseases, including their dominant or additive effects. To model measurement error, we adopt the pseudo-likelihood method by Lobach et al. [2008]. Performance of the proposed method is examined using simulation experiments. An application of our method is illustrated using a population-based case-control study of association between calcium intake and the risk of colorectal adenoma development. PMID:21031455
Gis-Based Multi-Criteria Decision Analysis for Forest Fire Risk Mapping
NASA Astrophysics Data System (ADS)
Akay, A. E.; Erdoğan, A.
2017-11-01
The forested areas along the coastal zone of the Mediterranean region in Turkey are classified as first-degree fire sensitive areas. Forest fires are major environmental disasters that affect the sustainability of forest ecosystems. Besides, forest fires result in important economic losses and even threaten human lives. Thus, it is critical to determine the forested areas with fire risks and thereby minimize the damage to forest resources by taking necessary precautionary measures in these areas. The risk of forest fire can be assessed based on various factors such as forest vegetation structure (tree species, crown closure, tree stage), topographic features (slope and aspect), and climatic parameters (temperature, wind). In this study, a GIS-based Multi-Criteria Decision Analysis (MCDA) method was used to generate a forest fire risk map. The study was implemented in the forested areas within the Yayla Forest Enterprise Chief at Dursunbey Forest Enterprise Directorate, which is classified as a first-degree fire sensitive area. In the solution process, the "extAhp 2.0" plug-in running the Analytic Hierarchy Process (AHP) method in ArcGIS 10.4.1 was used to categorize the study area into fire risk classes: extreme risk, high risk, moderate risk, and low risk. The results indicated that 23.81% of the area was at extreme risk, while 25.81% was at high risk. The most effective criterion was tree species, followed by tree stage; aspect was the least effective criterion for forest fire risk. It was revealed that GIS techniques integrated with MCDA methods are effective tools for quickly estimating forest fire risk at low cost. The integration of these factors into GIS can be very useful for determining forested areas with high fire risk and for planning forestry management after fire.
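The AHP step used to weight the criteria can be sketched with the common geometric-mean approximation to the principal eigenvector of the pairwise-comparison matrix. The judgments below are invented to echo the reported ordering (tree species > tree stage > slope > aspect); they are not taken from the study.

```python
import numpy as np

# Hypothetical AHP pairwise-comparison matrix for four criteria:
# tree species, tree stage, slope, aspect (Saaty 1-9 scale, reciprocal).
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

gm = A.prod(axis=1) ** (1.0 / A.shape[0])   # row geometric means
weights = gm / gm.sum()                      # normalized priority weights
```

These weights would then multiply the reclassified GIS layers (vegetation, slope, aspect, climate) and the weighted sum would be binned into the risk classes.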
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
NASA Astrophysics Data System (ADS)
van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein
2016-04-01
It is widely acknowledged that drought management should move from a crisis to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are for example crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach as well as the economic valuation of water shortage for various water users was explored in a study for the Dutch Government. First, an historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained into integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data is needed, which is not always readily available. 
Furthermore, for some sectors hydrological data was lacking to make a reliable estimate of drought return periods. By 2021, the Netherlands Government aims to agree on the water supply service levels, which should describe water availability and quality that can be delivered with a certain return period. The Netherlands' Ministry of Infrastructure and the Environment, representatives of the regional water boards and Rijkswaterstaat (operating the main water system) as well as several consultants and research institutes are important stakeholders for further development of the method, evaluation of cases and the development of a quantitative risk-informed decision-making tool.
Marschollek, Michael; Rehwald, Anja; Wolf, Klaus-Hendrik; Gietzelt, Matthias; Nemitz, Gerhard; zu Schwabedissen, Hubertus Meyer; Schulze, Mareike
2011-06-28
Fall events contribute significantly to mortality, morbidity and costs in our ageing population. In order to identify persons at risk and to target preventive measures, many scores and assessment tools have been developed. These often require expertise and are costly to implement. Recent research investigates the use of wearable inertial sensors to provide objective data on motion features which can be used to assess individual fall risk automatically. So far it is unknown how well this new method performs in comparison with conventional fall risk assessment tools. The aim of our research is to compare the predictive performance of our new sensor-based method with conventional and established methods, based on prospective data. In a first study phase, 119 inpatients of a geriatric clinic took part in motion measurements using a wireless triaxial accelerometer during a Timed Up&Go (TUG) test and a 20 m walk. Furthermore, the St. Thomas Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY) was performed, and the multidisciplinary geriatric care team estimated the patients' fall risk. In a second follow-up phase of the study, 46 of the participants were interviewed after one year, including a fall and activity assessment. The predictive performances of the TUG, the STRATIFY and team scores are compared. Furthermore, two automatically induced logistic regression models based on conventional clinical and assessment data (CONV) as well as sensor data (SENSOR) are matched. Among the risk assessment scores, the geriatric team score (sensitivity 56%, specificity 80%) outperforms STRATIFY and TUG. The induced logistic regression models CONV and SENSOR achieve similar performance values (sensitivity 68%/58%, specificity 74%/78%, AUC 0.74/0.72, +LR 2.64/2.61). Both models are able to identify more persons at risk than the simple scores. 
Sensor-based objective measurements of motion parameters in geriatric patients can be used to assess individual fall risk, and our prediction model's performance matches that of a model based on conventional clinical and assessment data. Sensor-based measurements using a small wearable device may contribute significant information to conventional methods and are feasible in an unsupervised setting. More prospective research is needed to assess the cost-benefit relation of our approach.
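The performance metrics reported for the fall-risk models are all functions of a 2x2 confusion table. The counts below are a hypothetical reconstruction chosen to roughly reproduce the CONV model's figures, not data from the study.

```python
# Hypothetical 2x2 confusion table for a fall-risk classifier at follow-up.
tp, fn = 13, 6    # fallers correctly flagged / missed
tn, fp = 20, 7    # non-fallers correctly passed / falsely flagged

sensitivity = tp / (tp + fn)               # fraction of fallers detected
specificity = tn / (tn + fp)               # fraction of non-fallers passed
pos_lr = sensitivity / (1 - specificity)   # +LR: factor by which a positive
                                           # result raises the odds of falling
```

With these counts, sensitivity is about 68%, specificity about 74%, and the positive likelihood ratio about 2.64, the neighborhood of the values quoted for the CONV model.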
Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance
NASA Technical Reports Server (NTRS)
Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.
2016-01-01
Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.
Barmaz, Stefania; Potts, Simon G; Vighi, Marco
2010-10-01
Pollination is one of the most important ecosystem services in agroecosystems and supports food production. Pollinators are potentially at risk being exposed to pesticides and the main route of exposure is direct contact, in some cases ingestion, of contaminated materials such as pollen, nectar, flowers and foliage. To date there are no suitable methods for predicting pesticide exposure for pollinators, therefore official procedures to assess pesticide risk are based on a Hazard Quotient. Here we develop a procedure to assess exposure and risk for pollinators based on the foraging behaviour of honeybees (Apis mellifera) and using this species as indicator representative of pollinating insects. The method was applied in 13 European field sites with different climatic, landscape and land use characteristics. The level of risk during the crop growing season was evaluated as a function of the active ingredients used and application regime. Risk levels were primarily determined by the agronomic practices employed (i.e. crop type, pest control method, pesticide use), and there was a clear temporal partitioning of risks through time. Generally the risk was higher in sites cultivated with permanent crops, such as vineyard and olive, than in annual crops, such as cereals and oil seed rape. The greatest level of risk is generally found at the beginning of the growing season for annual crops and later in June-July for permanent crops.
Kandhasamy, Chandrasekaran; Ghosh, Kaushik
2017-02-01
Indian states are currently classified into HIV-risk categories based on the observed prevalence counts, percentage of infected attendees in antenatal clinics, and percentage of infected high-risk individuals. This method, however, does not account for the spatial dependence among the states, nor does it provide any measure of statistical uncertainty. We provide an alternative model-based approach to address these issues. Our method uses Poisson log-normal models having various conditional autoregressive structures with neighborhood-based and distance-based weight matrices and incorporates all available covariate information. We use R and WinBUGS software to fit these models to the 2011 HIV data. Based on the Deviance Information Criterion, the convolution model using a distance-based weight matrix and covariate information on female sex workers, literacy rate, and intravenous drug users is found to have the best fit. The relative risk of HIV for the various states is estimated using the best model, and the states are then classified into risk categories based on these estimated values. An HIV risk map of India is constructed based on these results. The choice of the final model suggests that an HIV control strategy focusing on female sex workers, intravenous drug users, and literacy rate would be most effective. Copyright © 2017 Elsevier Ltd. All rights reserved.
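The raw ingredient of such a risk map is the crude relative risk per region, the ratio of observed to expected case counts (the standardized morbidity ratio, SMR), which the Poisson log-normal CAR models then smooth using the spatial weight matrix. A minimal sketch with invented counts:

```python
import numpy as np

# Invented regional counts; not the 2011 HIV data.
observed = np.array([30, 12, 45, 8])             # cases per region
population = np.array([1e5, 8e4, 9e4, 1.2e5])    # persons at risk

overall_rate = observed.sum() / population.sum()
expected = overall_rate * population             # cases under a uniform rate
smr = observed / expected                        # crude relative risk per region
```

A CAR model would shrink each region's SMR toward its neighbors (by adjacency or distance weights) and attach credible intervals, which is exactly what the crude classification lacks.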
Frössling, Jenny; Nusinovici, Simon; Nöremark, Maria; Widgren, Stefan; Lindberg, Ann
2014-11-15
In the design of surveillance, there is often a desire to target high risk herds. Such risk-based approaches result in better allocation of resources and improve the performance of surveillance activities. For many contagious animal diseases, movement of live animals is a main route of transmission, and because of this, herds that purchase many live animals or have a large contact network due to trade can be seen as a high risk stratum of the population. This paper presents a new method to assess herd disease risk in animal movement networks. It is an improvement to current network measures that takes direction, temporal order, and also movement size and probability of disease into account. In the study, the method was used to calculate a probability of disease ratio (PDR) of herds in simulated datasets, and of real herds based on animal movement data from dairy herds included in a bulk milk survey for Coxiella burnetii. Known differences in probability of disease are easily incorporated in the calculations and the PDR was calculated while accounting for regional differences in probability of disease, and also by applying equal probability of disease throughout the population. Each herd's increased probability of disease due to purchase of animals was compared to both the average herd and herds within the same risk stratum. The results show that the PDR is able to capture the different circumstances related to disease prevalence and animal trade contact patterns. Comparison of results based on inclusion or exclusion of differences in risk also highlights how ignoring such differences can influence the ability to correctly identify high risk herds. The method shows a potential to be useful for risk-based surveillance, in the classification of herds in control programmes or to represent influential contacts in risk factor studies. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lah, J; Manger, R; Kim, G
Purpose: To examine the ability of traditional failure mode and effects analysis (FMEA) and a light version of Healthcare FMEA (HFMEA), called scenario analysis of FMEA (SAFER), by comparing their outputs in terms of the risks identified and their severity rankings. Methods: We applied two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For traditional FMEA, decisions on how to improve an operation are based on the risk priority number (RPN), the product of three indices: occurrence, severity, and detectability. The SAFER approach utilized two indices, frequency and severity, which were defined by a multidisciplinary team. A criticality matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. Results: The two methods were independently compared to determine whether the results and rated risks matched. Our results showed an agreement of 67% between the FMEA and SAFER approaches for the 15 riskiest SIG-specific failure modes. The main differences between the two approaches were the distribution of the values and the fact that failure modes (No. 52, 54, 154) with high SAFER scores do not necessarily have high FMEA RPN scores. In our results, there were additional risks identified by both methods with little correspondence. In SAFER, when the risk score is determined, the basis of the established decision tree or the failure mode should be investigated further. Conclusion: The FMEA method takes into account the probability that an error passes without being detected. SAFER is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of risks and mitigation measures, and thus is readily applicable to the clinical parts of radiotherapy.
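The two scoring schemes being compared can be sketched side by side: the FMEA risk priority number RPN = occurrence x severity x detectability, and a SAFER-style frequency x severity criticality lookup. The failure-mode names, scores, and matrix thresholds below are illustrative, not the study's.

```python
# Illustrative failure modes for an image-guided radiosurgery workflow.
failure_modes = {
    # name: (occurrence, severity, detectability, frequency, safer_severity)
    'wrong isocenter shift': (3, 9, 7, 2, 4),
    'camera occlusion':      (6, 4, 2, 4, 4),
}

def rpn(occurrence, severity, detectability):
    """Traditional FMEA risk priority number."""
    return occurrence * severity * detectability

def safer_class(frequency, severity):
    """Toy criticality matrix: frequency x severity binned into 4 classes."""
    score = frequency * severity
    if score >= 12:
        return 'very high'
    if score >= 8:
        return 'high'
    if score >= 4:
        return 'low'
    return 'very low'

results = {name: (rpn(o, s, d), safer_class(f, sv))
           for name, (o, s, d, f, sv) in failure_modes.items()}
```

Even in this toy, the second mode ranks lower by RPN but higher by SAFER criticality, the kind of divergence the abstract reports, because detectability enters only the RPN.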
Cao, Jing; Steffen, Brian T; Guan, Weihua; Remaley, Alan T; McConnell, Joseph P; Palamalai, Vikram; Tsai, Michael Y
Apolipoprotein B-100 (ApoB) is a well-researched lipoprotein marker used in assessing the risk of coronary heart disease (CHD) development. Despite its continued use at the bedside, ApoB methodologies have not been thoroughly compared and may differentially discriminate CHD risk, resulting in patient misclassification. This study compared 3 ApoB immunoassays and their associations with incident CHD risk over a 12-year follow-up period in the Multi-Ethnic Study of Atherosclerosis. Plasma ApoB concentrations were measured in 4679 participants of the Multi-Ethnic Study of Atherosclerosis at baseline, using 3 immunoturbidimetric methods. Roche and Kamiya reagent-based methods were analyzed on a Roche Modular P analyzer, and the Diazyme reagent-based method was analyzed on a Siemens Dimension analyzer. Cox proportional hazards analysis estimated ApoB-related risk of incident CHD over a median follow-up period of 12.5 years with adjustments for nonlipid CHD risk factors. ApoB concentrations were examined as continuous variables but were also dichotomized based on clinical designations of borderline (100 mg/dL), high (120 mg/dL), and very high ApoB levels (140 mg/dL). Moderate to strong correlations among ApoB methods were observed (r = 0.79-0.98). ApoB concentrations (per standard deviation) were similarly associated with CHD risk and hazard ratio (95% confidence interval): Roche: 1.16 (1.03-1.30); Kamiya: 1.14 (1.02-1.28); and Diazyme: 1.14 (1.02-1.28). Although all 3 ApoB methods were similarly associated with risk of incident CHD over the study period regardless of the reagent type, the bias between methods suggests that these reagents are not fungible, and assay harmonization may be warranted. Copyright © 2017 National Lipid Association. Published by Elsevier Inc. All rights reserved.
IT Operational Risk Measurement Model Based on Internal Loss Data of Banks
NASA Astrophysics Data System (ADS)
Hao, Xiaoling
Business operation of banks relies increasingly on information technology (IT), and the most important role of IT is to guarantee the operational continuity of business processes. IT risk management efforts therefore need to be viewed from the perspective of operational continuity. Traditional IT risk studies have focused on IT asset-based risk analysis and risk-matrix-based qualitative risk evaluation. In practice, IT risk management in the banking industry is still limited to the IT department and is not integrated into business risk management, which causes the two departments to work in isolation. This paper presents an improved methodology for dealing with IT operational risk. It adopts a quantitative measurement method based on internal business loss data about IT events and uses Monte Carlo simulation to predict potential losses. We establish the correlation between IT resources and business processes so that IT and business risk management can work synergistically.
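A Monte Carlo loss-simulation of the kind described above can be sketched with a standard frequency-severity model. The Poisson/lognormal distributional choices and every parameter below are assumptions for illustration, not the paper's calibration to internal bank loss data.

```python
import math
import random

def simulate_annual_loss(freq_mean, sev_mu, sev_sigma, n_years=10000, seed=7):
    """Aggregate-loss Monte Carlo sketch for IT operational events:
    yearly event counts are Poisson, single-event losses lognormal."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method for a Poisson draw.
        limit, k, p = math.exp(-lam), 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        return k - 1

    totals = []
    for _ in range(n_years):
        totals.append(sum(rng.lognormvariate(sev_mu, sev_sigma)
                          for _ in range(poisson(freq_mean))))
    return totals

losses = sorted(simulate_annual_loss(freq_mean=3.0, sev_mu=10.0, sev_sigma=1.0))
var_99 = losses[int(0.99 * len(losses))]   # 99th-percentile annual loss
print(f"99% annual loss quantile: {var_99:,.0f}")
```

Tail quantiles of the simulated annual totals then serve as the "potential loss" figures that the business side can act on.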
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
Walsh, Linda; Schneider, Uwe
2013-03-01
Radiation-related risks of cancer can be transported from one population to another population at risk, for the purpose of calculating lifetime risks from radiation exposure. Transfer via excess relative risks (ERR) or excess absolute risks (EAR) or a mixture of both (i.e., from the life span study (LSS) of Japanese atomic bomb survivors) has been done in the past based on qualitative weighting. Consequently, the values of the weights applied and the method of application of the weights (i.e., as additive or geometric weighted means) have varied both between reports produced at different times by the same regulatory body and also between reports produced at similar times by different regulatory bodies. Since the gender and age patterns are often markedly different between EAR and ERR models, it is useful to have an evidence-based method for determining the relative goodness of fit of such models to the data. This paper identifies a method, using Akaike model weights, which could aid expert judgment and be applied to help to achieve consistency of approach and quantitative evidence-based results in future health risk assessments. The results of applying this method to recent LSS cancer incidence models are that the relative EAR weighting by solid cancer site, on a scale of 0-1, is zero for breast and colon, 0.02 for all solid, 0.03 for lung, 0.08 for liver, 0.15 for thyroid, 0.18 for bladder and 0.93 for stomach. The EAR weighting for female breast cancer increases from 0 to 0.3, if a generally observed change in the trend between female age-specific breast cancer incidence rates and attained age, associated with menopause, is accounted for in the EAR model. Application of this method to preferred models from a study of multi-model inference from many models fitted to the LSS leukemia mortality data results in an EAR weighting of 0. From these results it can be seen that lifetime risk transfer is most highly weighted by EAR only for stomach cancer.
However, the generalization and interpretation of radiation effect estimates based on the LSS cancer data, when projected to other populations, are particularly uncertain if considerable differences exist between site-specific baseline rates in the LSS and the other populations of interest. Definitive conclusions, regarding the appropriate method for transporting cancer risks, are limited by a lack of knowledge in several areas including unknown factors and uncertainties in biological mechanisms and genetic and environmental risk factors for carcinogenesis; uncertainties in radiation dosimetry; and insufficient statistical power and/or incomplete follow-up in data from radio-epidemiological studies.
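The Akaike model weights used above follow the standard formula w_i = exp(-d_i/2) / sum_j exp(-d_j/2), with d_i = AIC_i - min(AIC). A minimal sketch with hypothetical AIC values (not the LSS fits):

```python
import math

def akaike_weights(aics):
    """Akaike weights: relative likelihood of each model given its AIC,
    normalized to sum to 1."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AICs for an ERR and an EAR model fitted to the same data:
w_err, w_ear = akaike_weights([997.1, 1002.3])
print(round(w_err, 3), round(w_ear, 3))
```

The resulting weight pair can then be used directly as the EAR/ERR mixing proportions when transporting risk, replacing qualitative weighting with a goodness-of-fit-based one.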
Automatic Identification of Web-Based Risk Markers for Health Events
Borsa, Diana; Hayward, Andrew C; McKendry, Rachel A; Cox, Ingemar J
2015-01-01
Background: The escalating cost of global health care is driving the development of new technologies to identify early indicators of an individual's risk of disease. Traditionally, epidemiologists have identified such risk factors using medical databases and lengthy clinical studies, but these are often limited in size and cost and can fail to take full account of diseases where there are social stigmas or to identify transient acute risk factors. Objective: Here we report that Web search engine queries coupled with information on Wikipedia access patterns can be used to infer health events associated with an individual user and automatically generate Web-based risk markers for some of the most common medical conditions worldwide, from cardiovascular disease to sexually transmitted infections and mental health conditions, as well as pregnancy. Methods: Using anonymized datasets, we present methods to first distinguish individuals likely to have experienced specific health events and classify them into distinct categories. We then use the self-controlled case series method to find the incidence of health events in risk periods directly following a user's search for a query category, and compare it to the incidence during other periods for the same individuals. Results: Searches for pet stores were risk markers for allergy. We also identified some possible new risk markers; for example, searching for fast food and theme restaurants was associated with a transient increase in risk of myocardial infarction, suggesting this exposure goes beyond a long-term risk factor and may also act as an acute trigger of myocardial infarction. Dating and adult content websites were risk markers for sexually transmitted infections, such as human immunodeficiency virus (HIV). Conclusions: Web-based methods provide a powerful, low-cost approach to automatically identify risk factors, and support more timely and personalized public health efforts to bring human and economic benefits. PMID:25626480
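In its simplest form, the self-controlled case-series comparison described above reduces to a relative-incidence ratio: events per person-time in the risk windows after a query, divided by events per person-time in baseline periods for the same individuals. The counts below are invented for illustration.

```python
def relative_incidence(events_risk, time_risk, events_base, time_base):
    """Relative incidence: rate in post-query risk windows divided by
    rate in baseline periods (same individuals, so fixed confounders
    cancel out)."""
    rate_risk = events_risk / time_risk
    rate_base = events_base / time_base
    return rate_risk / rate_base

# e.g. 12 health events in 400 person-days of risk windows vs.
# 30 events in 3000 person-days of baseline time:
print(round(relative_incidence(12, 400, 30, 3000), 2))
```

A ratio well above 1 flags the query category as a candidate risk marker; the full method additionally fits the comparison within a conditional Poisson model.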
The Community-based Participatory Intervention Effect of “HIV-RAAP”
Yancey, Elleen M.; Mayberry, Robert; Armstrong-Mensah, Elizabeth; Collins, David; Goodin, Lisa; Cureton, Shava; Trammell, Ella H.; Yuan, Keming
2012-01-01
Objectives: To design and test HIV-RAAP (HIV/AIDS Risk Reduction Among Heterosexually Active African American Men and Women: A Risk Reduction Prevention Intervention), a coeducational, culture- and gender-sensitive, community-based participatory HIV risk reduction intervention. Methods: A community-based participatory research process included intervention development and implementation of a 7-session coeducational curriculum conducted over 7 consecutive weeks. Results: The results indicated a significant intervention effect on reducing sexual behavior risk (P=0.02), improving HIV risk knowledge (P=0.006), and increasing sexual partner conversations about HIV risk reduction (P=0.001). Conclusions: The HIV-RAAP intervention impacts key domains of heterosexual HIV transmission. PMID:22488405
The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism
NASA Technical Reports Server (NTRS)
Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.
2006-01-01
This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.
Risk Factors of Falls in Community-Dwelling Older Adults: Logistic Regression Tree Analysis
ERIC Educational Resources Information Center
Yamashita, Takashi; Noe, Douglas A.; Bailer, A. John
2012-01-01
Purpose of the Study: A novel logistic regression tree-based method was applied to identify fall risk factors and possible interaction effects of those risk factors. Design and Methods: A nationally representative sample of American older adults aged 65 years and older (N = 9,592) in the Health and Retirement Study 2004 and 2006 modules was used.…
Cyber security risk assessment for SCADA and DCS networks.
Ralston, P A S; Graham, J H; Hieb, J L
2007-10-01
The growing dependence of critical infrastructures and industrial automation on interconnected physical and cyber-based control systems has resulted in a growing and previously unforeseen cyber security threat to supervisory control and data acquisition (SCADA) and distributed control systems (DCSs). It is critical that engineers and managers understand these issues and know how to locate the information they need. This paper provides a broad overview of cyber security and risk assessment for SCADA and DCS, introduces the main industry organizations and government groups working in this area, and gives a comprehensive review of the literature to date. Major concepts related to the risk assessment methods are introduced with references cited for more detail. Included are risk assessment methods such as HHM, IIM, and RFRM, which have been applied successfully to SCADA systems with many interdependencies and have highlighted the need for quantifiable metrics. Presented in broad terms is probabilistic risk analysis (PRA), which includes methods such as FTA, ETA, and FMEA. The paper concludes with a general discussion of two recent methods (one based on compromise graphs and one on augmented vulnerability trees) that quantitatively determine the probability of an attack, the impact of the attack, and the reduction in risk associated with a particular countermeasure.
Advances in Chemical Mixtures Risk Methods
This presentation is an overview of emerging issues for dose addition in chemical mixtures risk assessment. It is intended to give the participants a perspective of recent developments in methods for dose addition. The workshop abstract is as follows: This problems-based, half-day...
Osteoporosis risk prediction using machine learning and conventional methods.
Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won
2013-01-01
A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared with the ability of a conventional clinical decision tool, osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) based on various predictors associated with low bone density. The learning models were compared with OST. SVM had significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to perform comparisons of the performance of osteoporosis prediction between the machine learning and conventional methods using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
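The AUC comparisons reported above can be illustrated with the rank-statistic definition of ROC AUC: the probability that a randomly chosen positive case outscores a randomly chosen negative one, with ties counting one half. The risk scores below are invented, not the KNHANES data.

```python
def auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney pairwise-comparison definition."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted risks for women with/without osteoporosis:
with_op    = [0.9, 0.8, 0.7, 0.55]
without_op = [0.6, 0.4, 0.3, 0.2]
print(auc(with_op, without_op))
```

Comparing such AUCs across classifiers (SVM, RF, ANN, LR, OST) on a held-out test set is exactly the evaluation design the abstract describes.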
Dynamic building risk assessment theoretic model for rainstorm-flood utilization ABM and ABS
NASA Astrophysics Data System (ADS)
Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun
2015-12-01
Flooding is among the natural disasters causing the greatest losses worldwide. Flood disaster risk must be assessed so that these losses can be reduced, and practical disaster management requires dynamic risk results for buildings. A rainstorm-flood disaster system is a typical complex system: from the viewpoint of complex system theory, flood disaster risk is the result of interactions among hazard-affected objects, rainstorm-flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. A rainstorm-flood building risk dynamic assessment method (RFBRDAM) using ABM is proposed in this paper, and the interior structures and procedures of the different agents in the proposed method are designed. The method was implemented on the NetLogo platform to assess changes in building risk during a rainstorm-flood disaster in the Huaihe River Basin using agent-based simulation (ABS). The results indicated that the proposed method can dynamically assess building risk over the whole process of a rainstorm-flood disaster, providing a new approach for dynamic building risk assessment and flood disaster management.
Seitz, Holli H.; Gibson, Laura; Skubisz, Christine; Forquer, Heather; Mello, Susan; Schapira, Marilyn M.; Armstrong, Katrina; Cappella, Joseph N.
2016-01-01
Objective: This experiment tested the effects of an individualized risk-based online mammography decision intervention. The intervention employs exemplification theory and the Elaboration Likelihood Model of persuasion to improve the match between breast cancer risk and mammography intentions. Methods: 2,918 women ages 35-49 were stratified into two levels of 10-year breast cancer risk (< 1.5%; ≥ 1.5%), then randomly assigned to one of eight conditions: two comparison conditions and six risk-based intervention conditions that varied according to a 2 (amount of content: brief vs. extended) × 3 (format: expository vs. untailored exemplar [example case] vs. tailored exemplar) design. Outcomes included mammography intentions and accuracy of perceived breast cancer risk. Results: Risk-based intervention conditions improved the match between objective risk estimates and perceived risk, especially for high-numeracy women with a 10-year breast cancer risk < 1.5%. For women with a risk < 1.5%, exemplars improved accuracy of perceived risk, and all risk-based interventions increased intentions to wait until age 50 to screen. Conclusion: A risk-based mammography intervention improved accuracy of perceived risk and the match between objective risk estimates and mammography intentions. Practice Implications: Interventions could be applied in online or clinical settings to help women understand risk and make mammography decisions. PMID:27178707
2013-01-01
Background: The high burden and rising incidence of cardiovascular disease (CVD) in resource-constrained countries necessitates implementation of robust and pragmatic primary and secondary prevention strategies. Many current CVD management guidelines recommend absolute cardiovascular (CV) risk assessment as a clinically sound guide to preventive and treatment strategies. Development of non-laboratory-based cardiovascular risk assessment algorithms enables absolute risk assessment in resource-constrained countries. The objective of this review is to evaluate the performance of existing non-laboratory-based CV risk assessment algorithms using the benchmarks for clinically useful CV risk assessment algorithms outlined by Cooney and colleagues. Methods: A literature search to identify non-laboratory-based risk prediction algorithms was performed in the MEDLINE, CINAHL, Ovid Premier Nursing Journals Plus, and PubMed databases. The identified algorithms were evaluated using the benchmarks for clinically useful cardiovascular risk assessment algorithms outlined by Cooney and colleagues. Results: Five non-laboratory-based CV risk assessment algorithms were identified. The Gaziano and Framingham algorithms met the criteria for appropriateness of the statistical methods used to derive the algorithms and endpoints. The Swedish Consultation, Framingham, and Gaziano algorithms demonstrated good discrimination in derivation datasets. Only the Gaziano algorithm was externally validated, where it had optimal discrimination. The Gaziano and WHO algorithms had chart formats which made them simple and user-friendly for clinical application. Conclusion: Both the Gaziano and Framingham non-laboratory-based algorithms met most of the criteria outlined by Cooney and colleagues. External validation of the algorithms in diverse samples is needed to ascertain their performance and applicability to different populations and to enhance clinicians' confidence in them. PMID:24373202
Time-based collision risk modeling for air traffic management
NASA Astrophysics Data System (ADS)
Bell, Alan E.
Since the emergence of commercial aviation in the early part of last century, economic forces have driven a steadily increasing demand for air transportation. Increasing density of aircraft operating in a finite volume of airspace is accompanied by a corresponding increase in the risk of collision, and in response to a growing number of incidents and accidents involving collisions between aircraft, governments worldwide have developed air traffic control systems and procedures to mitigate this risk. The objective of any collision risk management system is to project conflicts and provide operators with sufficient opportunity to recognize potential collisions and take necessary actions to avoid them. It is therefore the assertion of this research that the currency of collision risk management is time. Future Air Traffic Management Systems are being designed around the foundational principle of four dimensional trajectory based operations, a method that replaces legacy first-come, first-served sequencing priorities with time-based reservations throughout the airspace system. This research will demonstrate that if aircraft are to be sequenced in four dimensions, they must also be separated in four dimensions. In order to separate aircraft in four dimensions, time must emerge as the primary tool by which air traffic is managed. A functional relationship exists between the time-based performance of aircraft, the interval between aircraft scheduled to cross some three dimensional point in space, and the risk of collision. This research models that relationship and presents two key findings. First, a method is developed by which the ability of an aircraft to meet a required time of arrival may be expressed as a robust standard for both industry and operations. 
Second, a method by which airspace system capacity may be increased while maintaining an acceptable level of collision risk is presented and demonstrated for the purpose of formulating recommendations for procedures regulating air traffic management methods and industry standards governing performance requirements for avionics designed to support trajectory based operations.
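The functional relationship between crossing interval, time-of-arrival performance, and collision risk can be sketched with a toy model: if each aircraft's arrival-time error at a fix is independent zero-mean normal, the risk proxy is the probability that the two arrival times fall within a minimum separation of each other. This normal-error model and its parameters are assumptions for illustration, not the model developed in the research.

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def loss_of_separation_prob(interval, sigma_rta, min_sep):
    """Probability that two aircraft scheduled `interval` seconds apart at a
    fix pass within `min_sep` seconds of each other, given independent
    zero-mean normal time-of-arrival errors with std `sigma_rta` each.
    The difference of arrival times is N(interval, sigma_rta * sqrt(2))."""
    sigma_diff = sigma_rta * math.sqrt(2.0)
    return (norm_cdf(min_sep, interval, sigma_diff)
            - norm_cdf(-min_sep, interval, sigma_diff))

# Tighter spacing or sloppier time-keeping raises the risk:
p = loss_of_separation_prob(interval=60, sigma_rta=15, min_sep=30)
print(round(p, 4))
```

The model makes the trade-off explicit: for a fixed acceptable probability, better required-time-of-arrival performance (smaller sigma) permits shorter intervals, i.e., higher airspace capacity.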
Proposal for a new categorization of aseptic processing facilities based on risk assessment scores.
Katayama, Hirohito; Toda, Atsushi; Tokunaga, Yuji; Katoh, Shigeo
2008-01-01
Risk assessment of aseptic processing facilities was performed using two published risk assessment tools. Calculated risk scores were compared with experimental test results, including environmental monitoring and media fill run results, in three different types of facilities. The two risk assessment tools used gave a generally similar outcome. However, depending on the tool used, variations were observed in the relative scores between the facilities. For the facility yielding the lowest risk scores, the corresponding experimental test results showed no contamination, indicating that these ordinal testing methods are insufficient to evaluate this kind of facility. A conventional facility having acceptable aseptic processing lines gave relatively high risk scores. The facility showing a rather high risk score demonstrated the usefulness of conventional microbiological test methods. Considering the significant gaps observed in calculated risk scores and in the ordinal microbiological test results between advanced and conventional facilities, we propose a facility categorization based on risk assessment. The most important risk factor in aseptic processing is human intervention. When human intervention is eliminated from the process by advanced hardware design, the aseptic processing facility can be classified into a new risk category that is better suited for assuring sterility based on a new set of criteria rather than on currently used microbiological analysis. To fully benefit from advanced technologies, we propose three risk categories for these aseptic facilities.
Towards a global water scarcity risk assessment framework: using scenarios and risk distributions
NASA Astrophysics Data System (ADS)
Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Philip
2016-04-01
Over the past decades, changing hydro-climatic and socioeconomic conditions have led to increased water scarcity problems. A large number of studies have shown that these water scarcity conditions will worsen in the near future. Despite numerous calls for risk-based assessments of water scarcity, a framework that includes UNISDR's definition of risk does not yet exist at the global scale. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change projections and socioeconomic scenarios. Our study highlights that water scarcity risk increases under all future scenarios, affecting up to >56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity in terms of Expected Annual Exposed Population, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, where deviations increase up to 50% of estimated risk levels. Covering hazard, exposure, and vulnerability, risk-based methods are well suited to assess water scarcity adaptation. Completing the presented risk framework therefore offers water managers a promising perspective for increasing water security in a well-informed and adaptive manner.
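The Expected Annual Exposed Population metric mentioned above can be illustrated empirically: count the exposed population in each simulated year, then take the expectation across years. The scarcity indicator and the yearly figures below are simplifications invented for the example; the study itself fits a Gamma distribution rather than averaging raw years.

```python
def exposed_population(availability, demand, population):
    """People counted as exposed in a year in which per-capita water
    availability falls below demand (a simple scarcity indicator)."""
    return population if availability < demand else 0

def expected_annual_exposed(per_year_exposed):
    """Expected Annual Exposed Population: the mean, across simulated
    years, of the population exposed to water scarcity in that year."""
    return sum(per_year_exposed) / len(per_year_exposed)

# Millions of people exposed in five hypothetical simulation years:
yearly = [120, 95, 140, 110, 135]
print(expected_annual_exposed(yearly))
```

Because the expectation integrates over the whole distribution of years, it is less sensitive to where a single scarcity threshold is drawn than a one-year head count would be.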
Occupational risk assessment in the construction industry in Iran.
Seifi Azad Mard, Hamid Reza; Estiri, Ali; Hadadi, Parinaz; Seifi Azad Mard, Mahshid
2017-12-01
Occupational accidents in the construction industry are more common than in other fields, and these accidents are more severe than the global average in developing countries, especially in Iran. Studies that trace the sources of these accidents and suggest solutions for them are therefore valuable. In this study, a combination of the failure mode and effects analysis method and fuzzy theory is used as a semi-qualitative-quantitative method for analyzing risks and failure modes. The main causes of occupational accidents in this field were identified and analyzed based on three factors: severity, detection, and occurrence. Based on whether the risks were high or low priority, corrective actions were suggested to reduce the occupational risks. Finally, the results showed that these actions reduced the high-priority risks by 40%.
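One common way to combine FMEA with fuzzy theory, as a rough sketch of the kind of method described above, is to rate each factor as a triangular fuzzy number capturing expert uncertainty and defuzzify before multiplying. The ratings are invented, and the paper's actual fuzzy rules may differ.

```python
def defuzzify(tri):
    """Centroid defuzzification of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + b + c) / 3.0

def fuzzy_rpn(severity, occurrence, detection):
    """Fuzzy FMEA sketch: defuzzified severity, occurrence, and detection
    ratings multiplied into a risk priority number."""
    return defuzzify(severity) * defuzzify(occurrence) * defuzzify(detection)

# Hypothetical expert rating for "fall from scaffolding":
print(round(fuzzy_rpn((7, 8, 9), (4, 5, 6), (2, 3, 4)), 1))
```

Ranking failure modes by this fuzzy RPN, then targeting the top of the list with corrective actions, mirrors the prioritization workflow the abstract describes.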
Code of Federal Regulations, 2010 CFR
2010-01-01
... APHIS will select a method to use for the destruction of such poultry based on the following factors: (1... are maintained; (3) The risk to human health or safety of the method used; (4) Whether the method requires specialized equipment or training; (5) The risk that the method poses of spreading the H5/H7 LPAI...
ERIC Educational Resources Information Center
Rule, David L.
Several regression methods were examined within the framework of weighted structural regression (WSR), comparing their regression weight stability and score estimation accuracy in the presence of outlier contamination. The methods compared are: (1) ordinary least squares; (2) WSR ridge regression; (3) minimum risk regression; (4) minimum risk 2;…
Remarks on a financial inverse problem by means of Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Cuomo, Salvatore; Di Somma, Vittorio; Sica, Federica
2017-10-01
Estimating the price of a barrier option is a typical inverse problem. In this paper we present a numerical and statistical framework for a market with a risk-free interest rate and a risky asset, described by a Geometric Brownian Motion (GBM). After approximating the risky asset with a numerical method, we find the final option price by following an approach based on sequential Monte Carlo methods. All theoretical results are applied to the case of an option whose underlying is a real stock.
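A plain (non-sequential) Monte Carlo pricer for an up-and-out barrier call under GBM illustrates the forward problem in this setting; all market parameters below are invented, and the paper's sequential Monte Carlo approach for the inverse problem is more involved.

```python
import math
import random

def barrier_call_price(s0, k, barrier, r, sigma, t,
                       n_steps=100, n_paths=5000, seed=1):
    """Monte Carlo price of an up-and-out call on a GBM asset.
    Paths are monitored at n_steps discrete dates; a path that touches
    the barrier pays nothing."""
    rng = random.Random(seed)
    dt = t / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    payoff_sum = 0.0
    for _ in range(n_paths):
        s = s0
        knocked_out = False
        for _ in range(n_steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            if s >= barrier:
                knocked_out = True
                break
        if not knocked_out:
            payoff_sum += max(s - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

price = barrier_call_price(s0=100, k=100, barrier=130, r=0.02, sigma=0.2, t=1.0)
print(round(price, 2))
```

The inverse problem then amounts to searching over model parameters until this forward price matches an observed market price.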
DOT National Transportation Integrated Search
2012-11-01
New methods are proposed for mitigating risk in hazardous materials (hazmat) transportation, based on the Conditional Value-at-Risk (CVaR) measure, on time-dependent vehicular networks. While the CVaR risk measure has been popularly used in financial...
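The CVaR measure underlying the proposed methods can be computed from a loss sample as the mean of the worst (1 - alpha) share of outcomes. The per-shipment consequence values below are invented for illustration.

```python
def cvar(losses, alpha=0.95):
    """Conditional Value-at-Risk: the average of the losses in the worst
    (1 - alpha) tail of the empirical distribution."""
    ordered = sorted(losses)
    cut = int(round(alpha * len(ordered)))
    tail = ordered[cut:] or [ordered[-1]]
    return sum(tail) / len(tail)

# Hypothetical per-shipment consequence values for one hazmat route:
losses = [0, 0, 0, 1, 1, 2, 2, 3, 5, 40]
print(cvar(losses, alpha=0.9))
```

Unlike expected loss, CVaR is driven by the tail, so a route with rare but catastrophic outcomes ranks as riskier even when its average consequence is modest, which is why it suits hazmat routing.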
Risk Decision Making Model for Reservoir Floodwater resources Utilization
NASA Astrophysics Data System (ADS)
Huang, X.
2017-12-01
Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but it carries risks. To utilize floodwater resources safely and efficiently, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safe discharge are estimated. Based on the principles of minimum risk and maximum benefit of FRU, a multi-objective risk decision-making model for FRU is constructed. Probability theory and mathematical statistics are used to calculate the risk rate; the C-D production function method and the emergy analysis method are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision-making problem of the model is solved by the constraint method. Taking the Shilianghe Reservoir in Jiangsu Province as an example, the optimal equilibrium solution for FRU of the Shilianghe Reservoir is found using the risk decision-making model, verifying the validity and applicability of the model.
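The constraint method used to solve the multi-objective model can be sketched in its epsilon-constraint form: maximize benefit subject to a cap on the risk rate. The candidate operating schemes and all their numbers below are invented for illustration.

```python
def constraint_method(options, risk_cap):
    """Epsilon-constraint sketch: among candidate floodwater-utilization
    schemes, maximize benefit subject to a risk-rate cap; returns None
    when no scheme satisfies the cap."""
    feasible = [o for o in options if o["risk"] <= risk_cap]
    if not feasible:
        return None
    return max(feasible, key=lambda o: o["benefit"])

# Hypothetical operating schemes: flood-limited water level (m),
# utilization benefit (arbitrary units), and risk rate of exceedance.
options = [
    {"level": 23.0, "benefit": 10.0, "risk": 0.01},
    {"level": 23.5, "benefit": 14.0, "risk": 0.03},
    {"level": 24.0, "benefit": 16.0, "risk": 0.08},
]
best = constraint_method(options, risk_cap=0.05)
print(best["level"])
```

Sweeping the cap over a range of values traces out the risk-benefit trade-off curve from which an equilibrium solution can be chosen.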
A risk-based approach to robotic mission requirements
NASA Technical Reports Server (NTRS)
Dias, William C.; Bourke, Roger D.
1992-01-01
A NASA Risk Team has developed a method for the application of risk management to the definition of robotic mission requirements for the Space Exploration Initiative. These requirements encompass environmental information, infrastructural emplacement in advance, and either technology testing or system/subsystems demonstration. Attention is presently given to a method for step-by-step consideration and analysis of the risk component inherent in mission architecture, followed by a calculation of the subjective risk level. Mitigation strategies are then applied with the same rules, and a comparison is made.
Park, In-Sun; Park, Jae-Woo
2011-01-30
Total petroleum hydrocarbon (TPH) is an important environmental contaminant that is toxic to human and environmental receptors. However, human health risk assessment for petroleum, oil, and lubricant (POL)-contaminated sites is especially challenging because TPH is not a single compound, but rather a mixture of numerous substances. To address this concern, this study recommends a new human health risk assessment strategy for POL-contaminated sites. The strategy is based on a newly modified TPH fractionation method and includes an improved analytical protocol. The proposed TPH fractionation method is composed of ten fractions (e.g., aliphatic and aromatic EC8-10, EC10-12, EC12-16, EC16-22 and EC22-40). Physicochemical properties and toxicity values of each fraction were newly defined in this study. The stepwise ultrasonication-based analytical process was established to measure TPH fractions. Analytical results were compared with those from the TPH Criteria Working Group (TPHCWG) Direct Method. Better analytical efficiencies in TPH, aliphatic, and aromatic fractions were achieved when contaminated soil samples were analyzed with the new analytical protocol. Finally, a human health risk assessment was performed based on the developed tiered risk assessment framework. Results showed that a detailed quantitative risk assessment should be conducted to determine scientifically and economically appropriate cleanup target levels, although the phase II process is useful for determining the potency of human health risks posed by POL-contamination. Copyright © 2010 Elsevier B.V. All rights reserved.
Efficient discovery of risk patterns in medical data.
Li, Jiuyong; Fu, Ada Wai-chee; Fahey, Paul
2009-01-01
This paper studies the problem of efficiently discovering risk patterns in medical data. Risk patterns are defined by a statistical metric, relative risk, which has been widely used in epidemiological research. To avoid fruitless search in the complete exploration of risk patterns, we define an optimal risk pattern set to exclude superfluous patterns, i.e. complicated patterns with lower relative risk than their corresponding simpler-form patterns. We prove that mining optimal risk pattern sets conforms to an anti-monotone property that supports an efficient mining algorithm. We propose an efficient algorithm for mining optimal risk pattern sets based on this property. We also propose a hierarchical structure to present discovered patterns for easy perusal by domain experts. The proposed approach is compared with two well-known rule discovery methods, decision tree and association rule mining approaches, on benchmark data sets and applied to a real-world application. The proposed method discovers more and better-quality risk patterns than a decision tree approach. The decision tree method is not designed for such applications and is inadequate for pattern exploration. The proposed method does not discover a large number of uninteresting superfluous patterns as an association mining approach does. The proposed method is more efficient than an association rule mining method. A real-world case study shows that the method reveals some interesting risk patterns to medical practitioners. The proposed method is an efficient approach to explore risk patterns. It quickly identifies cohorts of patients that are vulnerable to a risk outcome from a large data set. The proposed method is useful for exploratory study on large medical data to generate and refine hypotheses. The method is also useful for designing medical surveillance systems.
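Relative risk for a pattern comes from a 2x2 contingency table; a sketch of the metric and of the superfluous-pattern check described in the abstract (all counts are illustrative, not from the paper):

```python
def relative_risk(a, b, c, d):
    """RR = [a/(a+b)] / [c/(c+d)], where
    a = cases matching the pattern, b = non-cases matching the pattern,
    c = cases not matching,        d = non-cases not matching."""
    return (a / (a + b)) / (c / (c + d))

# A simple pattern vs. a more complicated superpattern (invented counts).
rr_simple = relative_risk(30, 70, 10, 90)    # RR = 0.3 / 0.1 = 3.0
rr_super = relative_risk(12, 28, 28, 132)    # RR = 0.3 / 0.175 ~ 1.71

# A superpattern with no higher RR than its simpler sub-pattern is
# superfluous and is excluded from the optimal risk pattern set.
superfluous = rr_super <= rr_simple
```

The anti-monotone property the paper proves is what lets a miner prune every extension of a pattern once this check fails, instead of enumerating them.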
Malekmohammadi, Bahram; Tayebzadeh Moghadam, Negar
2018-04-13
Environmental risk assessment (ERA) is a commonly used, effective tool applied to reduce adverse effects of environmental risk factors. In this study, ERA was investigated using the Bayesian network (BN) model based on a hierarchical structure of variables in an influence diagram (ID). The ID facilitated ranking of the different alternatives under uncertainty, which were then used to compare the different risk factors. BN was used to present a new model for ERA applicable to complicated development projects such as dam construction. The methodology was applied to the Gabric Dam in southern Iran. The main environmental risk factors in the region presented by the Gabric Dam were identified based on the Delphi technique and the specific features of the study area. These included the following: flood, water pollution, earthquake, changes in land use, erosion and sedimentation, effects on the population, and eco-sensitivity. These risk factors were then categorized based on results from the output decision node of the BN, including expected utility values for risk factors in the decision node. ERA was also performed for the Gabric Dam using the analytic hierarchy process (AHP) method to compare the results of BN modeling with those of conventional methods. Results determined that a BN-based hierarchical structure for ERA provides acceptable and reasonable risk assessment prioritization in proposing suitable solutions to reduce environmental risks and can be used as a powerful decision support system for evaluating environmental risks.
Determination of viable legionellae in engineered water systems: Do we find what we are looking for?
Kirschner, Alexander K.T.
2016-01-01
In developed countries, legionellae are one of the most important water-based bacterial pathogens caused by management failure of engineered water systems. For routine surveillance of legionellae in engineered water systems and outbreak investigations, cultivation-based standard techniques are currently applied. However, in many cases culture-negative results are obtained despite the presence of viable legionellae, and clinical cases of legionellosis cannot be traced back to their respective contaminated water source. Among the various explanations for these discrepancies, the presence of viable but non-culturable (VBNC) Legionella cells has received increased attention in recent discussions and scientific literature. Alternative culture-independent methods to detect and quantify legionellae have been proposed in order to complement or even substitute the culture method in the future. Such methods should detect VBNC Legionella cells and provide a more comprehensive picture of the presence of legionellae in engineered water systems. However, it is still unclear whether and to what extent these VBNC legionellae are hazardous to human health. Current risk assessment models to predict the risk of legionellosis from Legionella concentrations in the investigated water systems contain many uncertainties and are mainly based on culture-based enumeration. If VBNC legionellae should be considered in future standard analysis, quantitative risk assessment models including VBNC legionellae must be proven to result in better estimates of human health risk than models based on cultivation alone. This review critically evaluates current methods to determine legionellae in the VBNC state, their potential to complement the standard culture-based method in the near future, and summarizes current knowledge on the threat that VBNC legionellae may pose to human health. PMID:26928563
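Quantitative microbial risk assessment for Legionella commonly uses an exponential dose-response model; a sketch of how counting VBNC cells in the enumerated dose would shift the risk estimate (the dose-response parameter r and the doses below are illustrative placeholders, not values from the review):

```python
import math

def p_infection(dose_cfu, r):
    """Exponential dose-response model: P = 1 - exp(-r * dose).
    r is pathogen-specific; the value used below is an assumption
    for illustration, not a published Legionella parameter."""
    return 1.0 - math.exp(-r * dose_cfu)

R_ILLUSTRATIVE = 0.05

# Culture-based enumeration sees only culturable cells; adding VBNC cells
# raises the estimated dose and therefore the estimated risk.
p_culturable_only = p_infection(10, R_ILLUSTRATIVE)   # dose from culture
p_with_vbnc = p_infection(50, R_ILLUSTRATIVE)         # dose incl. VBNC cells
```

This is precisely the open question the review raises: whether inflating the dose with VBNC counts yields a *better* risk estimate depends on the (unknown) infectivity of VBNC cells, which the fixed r above cannot capture.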
Khandan, Mohammad; Nili, Majid; Koohpaei, Alireza; Mosaferchi, Saeedeh
2016-01-01
Nowadays, occupational health decision makers need to analyze a huge amount of data and consider many conflicting evaluation criteria and sub-criteria. An ergonomic evaluation of the work environment aimed at controlling occupational disorders can therefore be treated as a Multi Criteria Decision Making (MCDM) problem. In this study, ergonomic risk factors that may influence health were evaluated in a manufacturing company in 2014, and the entropy method was then applied to prioritize the different risk factors. The study used a descriptive-analytical approach; 13 tasks were assessed across the 240 employees working in the seven halls of an ark opal manufacturing plant. Required information was gathered with a demographic questionnaire and the Assessment of Repetitive Tasks (ART) method for repetitive task assessment. Entropy was then used to prioritize the risk factors according to ergonomic control needs. The total exposure score calculated with the ART method was 30.07 ± 12.43. Data analysis showed that 179 cases (74.6% of tasks) were at a high level of risk and 13.8% were at a medium level of risk. The ART-entropy results revealed that, among the weighted factors, the highest value belonged to the grip factor and the lowest values to neck and hand posture and to duration. Given limited financial resources, MCDM appears usable in many challenging situations, such as setting control procedures and priorities. Other MCDM methods for evaluating and prioritizing ergonomic problems are recommended.
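The entropy weighting step can be sketched as follows; the decision matrix is a toy example, not the study's ART data. Criteria whose scores vary more across alternatives carry more information and receive higher weight:

```python
import math

def entropy_weights(matrix):
    """matrix[i][j]: positive score of alternative i on criterion j.
    Returns criterion weights via the entropy method:
    p_ij = x_ij / sum_i x_ij; e_j = -(1/ln n) * sum_i p_ij ln p_ij;
    w_j = (1 - e_j) / sum_k (1 - e_k)."""
    n, m = len(matrix), len(matrix[0])
    k = 1.0 / math.log(n)
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergences.append(1.0 - e)
    s = sum(divergences)
    return [d / s for d in divergences]

# Toy matrix: criterion 0 is identical across alternatives (weight ~ 0),
# criterion 1 discriminates strongly (it absorbs nearly all the weight).
w = entropy_weights([[1, 5], [1, 1], [1, 3]])
```

A criterion on which all tasks score alike (entropy 1) cannot help set control priorities, so the method assigns it zero weight automatically, without expert judgment.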
Gavrilyuk, Oxana; Braaten, Tonje; Skeie, Guri; Weiderpass, Elisabete; Dumeaux, Vanessa; Lund, Eiliv
2014-03-25
Coffee and its compounds have been proposed to inhibit endometrial carcinogenesis. Studies in the Norwegian population can be especially interesting due to the high coffee consumption and increasing incidence of endometrial cancer in the country. A total of 97 926 postmenopausal Norwegian women from the population-based prospective Norwegian Women and Cancer (NOWAC) Study, were included in the present analysis. We evaluated the general association between total coffee consumption and endometrial cancer risk as well as the possible impact of brewing method. Multivariate Cox regression analysis was used to estimate risks, and heterogeneity tests were performed to compare brewing methods. During an average of 10.9 years of follow-up, 462 incident endometrial cancer cases were identified. After multivariate adjustment, significant risk reduction was found among participants who drank ≥8 cups/day of coffee with a hazard ratio of 0.52 (95% confidence interval, CI 0.34-0.79). However, we did not observe a significant dose-response relationship. No significant heterogeneity in risk was found when comparing filtered and boiled coffee brewing methods. A reduction in endometrial cancer risk was observed in subgroup analyses among participants who drank ≥8 cups/day and had a body mass index ≥25 kg/m2, and in current smokers. These data suggest that in this population with high coffee consumption, endometrial cancer risk decreases in women consuming ≥8 cups/day, independent of brewing method.
NASA Technical Reports Server (NTRS)
Carreno, Victor
2006-01-01
This document describes a method to demonstrate that a UAS, operating in the NAS, can avoid collisions with an equivalent level of safety compared to a manned aircraft. The method is based on the calculation of a collision probability for a UAS, the calculation of a collision probability for a baseline manned aircraft, and the calculation of a risk ratio given by: Risk Ratio = P(collision_UAS)/P(collision_manned). A UAS will achieve an equivalent level of safety for collision risk if the Risk Ratio is less than or equal to one. Calculation of the probability of collision for UAS and manned aircraft is accomplished through event/fault trees.
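The risk-ratio criterion can be expressed directly; the probabilities below are illustrative placeholders, not values from the document:

```python
def risk_ratio(p_collision_uas, p_collision_manned):
    """Risk Ratio = P(collision_UAS) / P(collision_manned)."""
    return p_collision_uas / p_collision_manned

def equivalent_level_of_safety(p_uas, p_manned):
    """Equivalent level of safety holds when the Risk Ratio is <= 1."""
    return risk_ratio(p_uas, p_manned) <= 1.0

# Illustrative per-flight-hour probabilities (assumed, not from the report);
# the actual inputs would come from the event/fault trees described above.
rr = risk_ratio(1e-7, 2e-7)  # 0.5: the UAS meets the criterion
```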
An Integrated Web-Based Assessment Tool for Assessing Pesticide Exposure and Risks
Background/Question/Methods We have created an integrated web-based tool designed to estimate exposure doses and ecological risks under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Species Act. This involved combining a number of disparat...
O’Brien, Denzil
2016-01-01
Simple Summary This paper examines a number of methods for calculating injury risk for riders in the equestrian sport of eventing, and suggests that the primary locus of risk is the action of the horse jumping, and the jump itself. The paper argues that risk calculation should therefore focus first on this locus. Abstract All horse-riding is risky. In competitive horse sports, eventing is considered the riskiest, and is often characterised as very dangerous. But based on what data? There has been considerable research on the risks and unwanted outcomes of horse-riding in general, and on particular subsets of horse-riding such as eventing. However, there can be problems in accessing accurate, comprehensive and comparable data on such outcomes, and in using different calculation methods which cannot compare like with like. This paper critically examines a number of risk calculation methods used in estimating risk for riders in eventing, including one method which calculates risk based on hours spent in the activity and in one case concludes that eventing is more dangerous than motorcycle racing. This paper argues that the primary locus of risk for both riders and horses is the jump itself, and the action of the horse jumping. The paper proposes that risk calculation in eventing should therefore concentrate primarily on this locus, and suggests that eventing is unlikely to be more dangerous than motorcycle racing. The paper proposes avenues for further research to reduce the likelihood and consequences of rider and horse falls at jumps. PMID:26891334
Diagnostic accuracy of different caries risk assessment methods. A systematic review.
Senneby, Anna; Mejàre, Ingegerd; Sahlin, Nils-Eric; Svensäter, Gunnel; Rohlin, Madeleine
2015-12-01
To evaluate the accuracy of different methods used to identify individuals with increased risk of developing dental coronal caries. Studies on the following methods were included: previous caries experience, tests using microbiota, buffering capacity, salivary flow rate, oral hygiene, dietary habits and sociodemographic variables. QUADAS-2 was used to assess risk of bias. Sensitivity, specificity, predictive values, and likelihood ratios (LR) were calculated. Quality of evidence based on ≥3 studies of a method was rated according to GRADE. PubMed, Cochrane Library, Web of Science and reference lists of included publications were searched up to January 2015. From 5776 identified articles, 18 were included. Assessment of study quality identified methodological limitations concerning study design, test technology and reporting. No study presented low risk of bias in all domains. Three or more studies were found only for previous caries experience and salivary mutans streptococci, and quality of evidence for these methods was low. Evidence regarding other methods was lacking. For previous caries experience, sensitivity ranged between 0.21 and 0.94 and specificity between 0.20 and 1. Tests using salivary mutans streptococci resulted in low sensitivity and high specificity. For children with primary teeth at baseline, pooled LR for a positive test was 3 for previous caries experience and 4 for salivary mutans streptococci, given a threshold ≥10^5 CFU/ml. Evidence on the validity of the analysed methods used for caries risk assessment is limited. As methodological quality was low, there is a need to improve study design. Low validity of the analysed methods may lead to patients with increased risk not being identified, whereas some are falsely identified as being at risk. As caries risk assessment guides individualized decisions on interventions and intervals for patient recall, improved performance based on best evidence is greatly needed. Copyright © 2015 Elsevier Ltd. All rights reserved.
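The likelihood ratio for a positive test (LR+) reported in the review follows directly from sensitivity and specificity; a sketch with illustrative 2x2 counts chosen to reproduce an LR+ of 4 (the counts are not from any included study):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive likelihood ratio
    LR+ = sensitivity / (1 - specificity) from 2x2 test counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1.0 - specificity)
    return sensitivity, specificity, lr_pos

# Invented counts: low sensitivity, high specificity, as reported for
# salivary mutans streptococci tests.
sens, spec, lr_pos = diagnostic_metrics(tp=40, fp=10, fn=60, tn=90)
# sens = 0.4, spec = 0.9, LR+ = 0.4 / 0.1 = 4.0
```

An LR+ of 4 means a positive test raises the odds of developing caries fourfold, which is modest; this is the quantitative sense in which the review calls the evidence limited.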
Sabotage at Nuclear Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purvis, James W.
1999-07-21
Recently there has been a noted worldwide increase in violent actions, including attempted sabotage, at nuclear power plants. Several organizations, such as the International Atomic Energy Agency and the US Nuclear Regulatory Commission, have guidelines, recommendations, and formal threat- and risk-assessment processes for the protection of nuclear assets. Other examples are the former Defense Special Weapons Agency, which used a risk-assessment model to evaluate force-protection security requirements for terrorist incidents at DOD military bases. The US DOE uses a graded approach to protect its assets based on risk and vulnerability assessments. The Federal Aviation Administration and Federal Bureau of Investigation conduct joint threat and vulnerability assessments on high-risk US airports. Several private companies under contract to government agencies use formal risk-assessment models and methods to identify security requirements. The purpose of this paper is to survey these methods and present an overview of all potential types of sabotage at nuclear power plants. The paper discusses emerging threats and current methods of choice for sabotage--especially vehicle bombs and chemical attacks. Potential consequences of sabotage acts, including economic and political consequences, not just those that may result in unacceptable radiological exposure to the public, are also discussed. Applicability of risk-assessment methods and mitigation techniques are also presented.
Application of the risk assessment paradigm to the induction of allergic contact dermatitis.
Felter, Susan P; Ryan, Cindy A; Basketter, David A; Gilmour, Nicola J; Gerberick, G Frank
2003-02-01
The National Academy of Science (NAS) risk assessment paradigm has been widely accepted as a framework for estimating risk from exposure to environmental chemicals (NAS, 1983). Within this framework, quantitative risk assessments (QRAs) serve as the cornerstone of health-based exposure limits, and have been used routinely for both cancer and noncancer endpoints. These methods have focused primarily on the extrapolation of data from laboratory animals to establish acceptable levels of exposure for humans. For health effects associated with a threshold, uncertainty and variability inherent in the extrapolation process is generally dealt with by the application of "uncertainty factors (UFs)." The adaptation of QRA methods to address skin sensitization is a natural and desirable extension of current practices. Based on our chemical, cellular and molecular understanding of the induction of allergic contact dermatitis, one can conduct a QRA using established methods of identifying a NOAEL (No Observed Adverse Effect Level) or other point of departure, and applying appropriate UFs. This paper describes the application of the NAS paradigm to characterize risks from human exposure to skin sensitizers; consequently, this method can also be used to establish an exposure level for skin allergens that does not present an appreciable risk of sensitization. Copyright 2003 Elsevier Science (USA)
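The threshold-based QRA step described here reduces to dividing a point of departure by the product of uncertainty factors; a sketch with placeholder values (the NOAEL, units, and UF magnitudes are illustrative assumptions, not values from the paper):

```python
def acceptable_exposure_level(noael, uncertainty_factors):
    """AEL = NOAEL / (UF1 * UF2 * ...): the point of departure divided
    by the product of uncertainty factors."""
    total_uf = 1.0
    for uf in uncertainty_factors:
        total_uf *= uf
    return noael / total_uf

# Illustrative: NOAEL of 100 (in some dose-per-area unit) with assumed
# 10x interspecies and 10x intraspecies uncertainty factors.
ael = acceptable_exposure_level(100.0, [10.0, 10.0])  # -> 1.0
```

The adaptation to skin sensitization keeps this structure; what changes is the endpoint used to set the point of departure and which UFs are judged appropriate.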
Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu
2006-11-01
Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that the sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to the sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
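For a small toy model, the first-order variance-based indices that Sobol's method and FAST estimate can be computed directly from conditional means on a grid. This sketch is not the SHEDS testbed, and real applications use Monte Carlo sampling schemes (e.g., Saltelli designs) to also capture interaction and total effects:

```python
def first_order_sobol(f, grid=200):
    """First-order indices S1, S2 for y = f(x1, x2), with x1, x2
    independent and uniform on (0, 1), computed on a midpoint grid as
    S_i = Var(E[Y | X_i]) / Var(Y)."""
    xs = [(i + 0.5) / grid for i in range(grid)]
    ys = [[f(x1, x2) for x2 in xs] for x1 in xs]
    flat = [y for row in ys for y in row]
    n = len(flat)
    mean = sum(flat) / n
    var = sum((y - mean) ** 2 for y in flat) / n
    cond1 = [sum(row) / grid for row in ys]                 # E[Y | X1 = x1]
    cond2 = [sum(ys[i][j] for i in range(grid)) / grid
             for j in range(grid)]                          # E[Y | X2 = x2]
    s1 = sum((c - mean) ** 2 for c in cond1) / grid / var
    s2 = sum((c - mean) ** 2 for c in cond2) / grid / var
    return s1, s2

# Additive toy model Y = X1 + 2*X2: analytically S1 = 1/5 and S2 = 4/5,
# and (with no interactions) the first-order indices sum to 1.
s1, s2 = first_order_sobol(lambda a, b: a + 2.0 * b)
```

When interactions are present, s1 + s2 falls below 1, and the gap is exactly the interaction contribution the variance-based methods are designed to expose.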
Climate Change Impacts and Adaptation on Southwestern DoD Facilities
2017-03-03
integrating climate change risks into decision priorities. Subject terms: adaptation, baseline sensitivity, climate change, climate exposure. ... four bases we found that integrating climate change risks into the current decision matrix, by linking projected risks to current or past impacts ... data and decision tools and methods. Bases have some capacity to integrate climate-related information, but they have limited resources to undertake
Risk assessment of metals in road-deposited sediment along an urban-rural gradient.
Zhao, Hongtao; Li, Xuyong
2013-03-01
We applied the traditional risk assessment methods originally designed for soils and river sediments to evaluation of risk associated with metals in road-deposited sediment (RDS) along an urban-rural gradient that included central urban (UCA), urban village (UVA), central suburban county (CSA), rural town (RTA), and rural village (RVA) areas in the Beijing metropolitan region. A new indicator, RI_RDS, was developed that integrated the RDS characteristics of mobility, grain size, and amount with the potential ecological risk index. The risk associated with metals in RDS in urban areas was generally higher than that in rural areas based on the assessment using traditional methods, but the risk was higher in urban and rural village areas than in areas at higher administrative levels based on the indicator RI_RDS. These findings imply that variation of RDS characteristics along the urban-rural gradient must be considered in metal risk assessment and RDS washoff pollution control. Copyright © 2012 Elsevier Ltd. All rights reserved.
Qi, Xiaoxing; Liu, Liming; Liu, Yabin; Yao, Lan
2013-06-01
Integrated food security covers three aspects: food quantity security, food quality security, and sustainable food security. Because sustainable food security requires that food security be compatible with sustainable development, the risk assessment of sustainable food security is becoming one of the most important issues. This paper focuses on the characteristics of sustainable food security problems in the major grain-producing areas in China. We establish an index system based on land resources and eco-environmental conditions and apply a dynamic assessment method based on status assessments and trend analysis models to overcome the shortcomings of static evaluation methods. Using fuzzy mathematics, the risks are categorized into four grades: negligible risk, low risk, medium risk, and high risk. A case study was conducted in one of China's major grain-producing areas, the Dongting Lake area. The results predict that the status of sustainable food security in the Dongting Lake area is unsatisfactory for the foreseeable future. The number of districts in the medium-risk range will increase from six to ten by 2015 due to increasing population pressure, a decrease in the cultivated area, and a decrease in the effectively irrigated area. Therefore, appropriate policies and measures should be put forward to improve the situation. The results could also provide direct support for an early warning system, which could be used to monitor food security trends or nutritional status so as to inform policy makers of impending food shortages and to prevent sustainable food security risk, based on classical systematic methods. This is the first study of sustainable food security risk assessment from the perspective of resources and the environment at the regional scale.
Fire risk in San Diego County, California: A weighted Bayesian model approach
Kolden, Crystal A.; Weigel, Timothy J.
2007-01-01
Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
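The weights-of-evidence model scores each predictor pattern with positive and negative weights derived from conditional probabilities of the pattern given ignition and non-ignition; a sketch with illustrative cell counts (not the San Diego data):

```python
import math

def weights_of_evidence(n_pattern_event, n_event, n_pattern_noevent, n_noevent):
    """W+ = ln[P(B|D) / P(B|~D)] and W- = ln[P(~B|D) / P(~B|~D)],
    where B = predictor pattern present (e.g., a cell near a road)
    and D = the event (e.g., a fire ignition) occurred."""
    p_b_d = n_pattern_event / n_event
    p_b_nd = n_pattern_noevent / n_noevent
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1.0 - p_b_d) / (1.0 - p_b_nd))
    return w_plus, w_minus

# Invented counts: 80 of 100 ignition cells fall near roads, versus
# 2000 of 10000 background cells, so proximity to roads is evidential.
w_plus, w_minus = weights_of_evidence(80, 100, 2000, 10000)
# w_plus = ln(0.8 / 0.2) = ln 4 > 0; w_minus = ln(0.2 / 0.8) < 0
```

Summing the weights of all predictor layers in a cell (plus a prior log-odds term) gives the cell's posterior ignition-risk score; this is what makes the model "empirically derived" rather than opinion-weighted.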
Development of a Fall-Risk Self-Assessment for Community-Dwelling Seniors
Vivrette, Rebecca L.; Rubenstein, Laurence Z.; Martin, Jennifer L.; Josephson, Karen R.; Kramer, B. Josea
2012-01-01
Objective To determine seniors’ beliefs about falls and design a fall-risk self-assessment and educational materials to promote early identification of evidence-based fall risks and encourage prevention behaviors. Methods Focus groups with community-dwelling seniors, conducted in two phases to identify perceptions about fall risks and risk reduction and to assess face validity of the fall-risk self-assessment and acceptability of educational materials. Results Lay perception of fall risks was in general concordance with evidence-based research. Maintaining independence and positive tone were perceived as key motivators for fall prevention. Seniors intended to use information in the educational tool to stimulate discussions about falls with health care providers. Implications An evidence-based, educational fall-risk self-assessment acceptable to older adults can build on existing lay knowledge about fall risks and perception that falls are a relevant problem and can educate seniors about their specific risks and how to minimize them. PMID:21285473
The Research on the Loan-to-Value of Inventory Pledge Loan Based Upon the Unified Credit Mode
NASA Astrophysics Data System (ADS)
Peng, Yang
This paper focuses on the loan limit indicator for seasonal inventory financing in supply chain financial innovation, based on the logistics features of the unified credit mode. Following the "corporate and debt" method in trade credit, the paper analyzes the cash flow properties of the borrowing firm and the profit level of the logistics enterprise. It then assumes a downside-risk-averse rather than risk-neutral logistics enterprise and applies the VaR method to derive the maximum loan-to-value ratio of inventory consistent with the risk tolerance of the logistics enterprise in seasonal inventory pledge financing.
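Under an assumed distribution for the end-of-period inventory value, a VaR-style loan-to-value bound can be sketched as the value quantile at the lender's shortfall tolerance. The normal model and every number below are simplifying assumptions for illustration, not the paper's formulation:

```python
from statistics import NormalDist

def max_loan_to_value(v0, mu, sigma, rate, alpha):
    """Sketch: end-of-period inventory value V_T ~ Normal(mu, sigma).
    Return the largest LTV such that P(V_T < LTV * v0 * (1 + rate)) <= alpha,
    i.e. the probability that liquidating the pledged inventory fails to
    cover loan plus interest stays within the lender's VaR tolerance."""
    q_alpha = NormalDist(mu, sigma).inv_cdf(alpha)  # alpha-quantile of V_T
    return q_alpha / (v0 * (1.0 + rate))

# Illustrative: inventory worth 100 now, expected 100 later with sd 15,
# 8% financing cost, 5% shortfall tolerance -> LTV just under 0.70.
ltv = max_loan_to_value(v0=100.0, mu=100.0, sigma=15.0, rate=0.08, alpha=0.05)
```

A more downside-risk-averse lender (smaller alpha) pushes the quantile, and hence the admissible LTV, down; that is the qualitative mechanism the paper builds on.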
Zhang, Bo; Cohen, Joanna E; O'Connor, Shawn
2014-01-01
Selection of priority groups is important for health interventions. However, no quantitative method has been developed. To develop a quantitative method to support the process of selecting priority groups for public health interventions based on both high risk and population health burden. Secondary data analysis of the 2010 Canadian Community Health Survey. Canadian population. Survey respondents. We identified priority groups for 3 diseases: heart disease, stroke, and chronic lower respiratory diseases. Three measures--prevalence, population counts, and adjusted odds ratios (OR)--were calculated for subpopulations (sociodemographic characteristics and other risk factors). A Priority Group Index (PGI) was calculated by summing the rank scores of these 3 measures. Of the 30 priority groups identified by the PGI (10 for each of the 3 disease outcomes), 7 were identified on the basis of high prevalence only, 5 based on population count only, 3 based on high OR only, and the remainder based on combinations of these. The identified priority groups were all in line with the literature as risk factors for the 3 diseases, such as elderly people for heart disease and stroke and those with low income for chronic lower respiratory diseases. The PGI was thus able to balance both high risk and population burden approaches in selecting priority groups, and thus it would address health inequities as well as disease burden in the overall population. The PGI is a quantitative method to select priority groups for public health interventions; it has the potential to enhance the effective use of limited public resources.
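The PGI's rank-sum construction can be sketched as follows; the groups and measure values are invented for illustration, and the paper's exact scoring rules may differ:

```python
def priority_group_index(groups):
    """groups: {name: (prevalence, population_count, adjusted_or)}.
    Rank each of the three measures across groups (1 = lowest value)
    and sum the rank scores; a higher PGI means higher priority.
    A sketch of the rank-sum idea only."""
    names = list(groups)
    pgi = {n: 0 for n in names}
    for k in range(3):
        ordered = sorted(names, key=lambda n: groups[n][k])
        for rank, n in enumerate(ordered, start=1):
            pgi[n] += rank
    return pgi

# Invented subgroups: one is high on prevalence and count but mid on OR,
# another is high on OR but small in count, etc.
pgi = priority_group_index({
    "elderly": (0.20, 1000, 2.0),
    "low_income": (0.10, 5000, 1.5),
    "smokers": (0.05, 200, 3.0),
})
```

Because the index sums ranks across prevalence (high risk in relative terms), count (population burden), and OR, a group can top the list without being extreme on any single measure, which is how the PGI balances the two selection philosophies.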
Lu, Hao; Wang, Mingyang; Yang, Baohuai; Rong, Xiaoli
2013-01-01
With the development of subway engineering, according to uncertain factors and serious accidents involved in the construction of subways, implementing risk assessment is necessary and may bring a number of benefits for construction safety. The Kent index method extensively used in pipeline construction is improved to make risk assessment much more practical for the risk assessment of disastrous accidents in subway engineering. In the improved method, the indexes are divided into four categories, namely, basic, design, construction, and consequence indexes. In this study, a risk assessment model containing four kinds of indexes is provided. Three kinds of risk occurrence modes are listed. The probability index model which considers the relativity of the indexes is established according to the risk occurrence modes. The model provides the risk assessment process through the fault tree method and has been applied in the risk assessment of Nanjing subway's river-crossing tunnel construction. Based on the assessment results, the builders were informed of what risks should be noticed and what they should do to avoid the risks. The need for further research is discussed. Overall, this method may provide a tool for the builders, and improve the safety of the construction. PMID:23710136
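The fault-tree calculation step can be sketched with independent basic events combined through AND/OR gates; the gate structure and probabilities below are illustrative, not from the Nanjing assessment:

```python
def and_gate(probs):
    """Top event requires ALL independent basic events: product of probabilities."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(probs):
    """Top event requires AT LEAST ONE independent basic event:
    1 minus the probability that none occurs."""
    p_none = 1.0
    for x in probs:
        p_none *= (1.0 - x)
    return 1.0 - p_none

# Invented tree: the accident occurs if (design flaw AND poor ground
# conditions) or a construction error occurs.
p_top = or_gate([and_gate([0.1, 0.2]), 0.05])
# = 1 - (1 - 0.02) * (1 - 0.05) = 0.069
```

In the improved Kent framework, the index scores feed the basic-event probabilities, and the tree then propagates them to the disastrous-accident probability the builders act on.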
Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho
2015-04-01
Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume large amounts of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital from 2008.1.1 to 2008.12.31. Based on the dataset, we first apply sampling techniques and a dimension reduction method to preprocess the testing data. Then, we construct various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier is able to achieve a recall (or sensitivity) of 100%. At a recall of 100%, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with the random forest classifier were 2.9% and 14.87%, respectively. In our study, we built a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
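The 100% recall operating point can be illustrated without any particular classifier: given risk scores, lower the decision threshold until every true case is captured, then read off the precision and specificity that remain. This is only a sketch of the cost-sensitive idea; the scores and labels below are synthetic, not hospital data.

```python
# Sketch: choose the largest threshold that still flags every positive case,
# then compute the resulting recall, precision, and specificity.

def operating_point_full_recall(scores, labels):
    """Pick the largest threshold capturing every positive (label == 1)."""
    thr = min(s for s, y in zip(scores, labels) if y == 1)
    pred = [1 if s >= thr else 0 for s in scores]
    tp = sum(p and y for p, y in zip(pred, labels))
    fp = sum(p and not y for p, y in zip(pred, labels))
    tn = sum((not p) and (not y) for p, y in zip(pred, labels))
    fn = sum((not p) and y for p, y in zip(pred, labels))
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    specificity = tn / (tn + fp)
    return thr, recall, precision, specificity

# synthetic risk scores and true labels (1 = case, 0 = non-case)
scores = [0.9, 0.7, 0.6, 0.4, 0.35, 0.3, 0.2, 0.1]
labels = [1,   0,   1,   0,   1,    0,   0,   0  ]
thr, recall, precision, specificity = operating_point_full_recall(scores, labels)
```

As in the abstract, forcing recall to 100% trades away precision and specificity; with imbalanced real data the precision drops far lower than in this toy example.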
Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L
2017-07-01
To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
Motamedzade, Majid; Ashuri, Mohammad Reza; Golmohammadi, Rostam; Mahjub, Hossein
2011-06-13
During the last decades, numerous observational methods have been developed to assess the risk factors of work-related musculoskeletal disorders (WMSDs). Rapid Entire Body Assessment (REBA) and Quick Exposure Check (QEC) are two general methods in this field. This study aimed to compare the ergonomic risk assessment outputs of QEC and REBA in terms of agreement in the distribution of postural loading scores based on analysis of working postures. This cross-sectional study was conducted in an engine oil company in which 40 jobs were studied. All jobs were observed by a trained occupational health practitioner. Job information was collected to ensure the completion of the ergonomic risk assessment tools, including QEC and REBA. The results revealed a significant correlation between the final scores (r = 0.731) and the action levels (r = 0.893) of the two applied methods. Comparison between the action levels and final scores of the two methods showed no significant difference among working departments. Most of the studied postures were rated at low or moderate risk levels in the QEC assessment (low risk = 20%, moderate risk = 50%, high risk = 30%) and in the REBA assessment (low risk = 15%, moderate risk = 60%, high risk = 25%). There is a significant correlation between the two methods. They correlate strongly in identifying risky jobs and determining the potential risk for incidence of WMSDs. Therefore, researchers may apply both methods interchangeably for postural risk assessment in appropriate working environments.
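The reported agreement figures come down to correlating two paired score lists. A plain Pearson correlation, shown here on invented QEC/REBA score pairs (none of these values come from the study), reproduces the kind of r the abstract reports:

```python
# Pearson correlation between two paired sets of ergonomic risk scores.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# illustrative paired scores for eight jobs (not the study's data)
qec_scores  = [30, 45, 52, 61, 70, 75, 82, 90]
reba_scores = [3,  4,  6,  6,  8,  9,  10, 11]
r = pearson_r(qec_scores, reba_scores)
```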
Kim, MinJeong; Liu, Hongbin; Kim, Jeong Tai; Yoo, ChangKyoo
2014-08-15
Sensor faults in metro systems provide incorrect information to indoor air quality (IAQ) ventilation systems, resulting in the mis-operation of ventilation systems and adverse effects on passenger health. In this study, a new sensor validation method is proposed to (1) detect, identify and repair sensor faults and (2) evaluate the influence of sensor reliability on passenger health risk. To address the dynamic non-Gaussianity problem of IAQ data, dynamic independent component analysis (DICA) is used. To detect and identify sensor faults, the DICA-based squared prediction error and sensor validity index are used, respectively. To restore the faults to normal measurements, a DICA-based iterative reconstruction algorithm is proposed. The comprehensive indoor air-quality index (CIAI) that evaluates the influence of the current IAQ on passenger health is then compared using the faulty and reconstructed IAQ data sets. Experimental results from a metro station showed that the DICA-based method can produce an improved IAQ level in the metro station and reduce passenger health risk, since it validates sensor faults more accurately than conventional methods do. Copyright © 2014 Elsevier B.V. All rights reserved.
Cooper, Hannah LF; Bossak, Brian; Tempalski, Barbara; Des Jarlais, Don C.; Friedman, Samuel R.
2009-01-01
The concept of the “risk environment” – defined as the “space … [where] factors exogenous to the individual interact to increase the chances of HIV transmission” – draws together the disciplines of public health and geography. Researchers have increasingly turned to geographic methods to quantify dimensions of the risk environment that are both structural and spatial (e.g., local poverty rates). The scientific power of the intersection between public health and geography, however, has yet to be fully mined. In particular, research on the risk environment has rarely applied geographic methods to create neighbourhood-based measures of syringe exchange programs (SEPs) or of drug-related law enforcement activities, despite the fact that these interventions are widely conceptualized as structural and spatial in nature and are two of the most well-established dimensions of the risk environment. To strengthen research on the risk environment, this paper presents a way of using geographic methods to create neighbourhood-based measures of (1) access to SEP sites and (2) exposure to drug-related arrests, and then applies these methods to one setting (New York City). NYC-based results identified substantial cross-neighbourhood variation in SEP site access and in exposure to drug-related arrest rates (even within the subset of neighbourhoods nominally experiencing the same drug-related police strategy). These geographic measures – grounded as they are in conceptualizations of SEPs and drug-related law enforcement strategies – can help develop new arenas of inquiry regarding the impact of these two dimensions of the risk environment on injectors’ health, including exploring whether and how neighbourhood-level access to SEP sites and exposure to drug-related arrests shape a range of outcomes among local injectors. PMID:18963907
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described.
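The improved FMEA builds on the conventional Risk Priority Number, RPN = severity × occurrence × detection, with each factor rated on a 1-10 scale. A minimal baseline sketch for ranking use-related failure modes follows; the failure modes and ratings are invented for illustration, and the paper's fuzzy and grey relational refinements are not reproduced here.

```python
# Conventional FMEA baseline: rank failure modes by Risk Priority Number.

def rpn(severity, occurrence, detection):
    """RPN = S x O x D, each rated on the conventional 1-10 scale."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("FMEA ratings must lie in 1..10")
    return severity * occurrence * detection

# hypothetical use-related failure modes with (S, O, D) ratings
failure_modes = {
    "wrong exposure setting selected":  (8, 4, 3),
    "foot switch pressed accidentally": (6, 3, 2),
    "display ambiguity misread by user": (7, 5, 6),
}
ranked = sorted(failure_modes, key=lambda m: rpn(*failure_modes[m]), reverse=True)
```

The fuzzy/grey improvement in the paper addresses known weaknesses of this baseline, e.g. that different (S, O, D) triples can yield identical RPNs.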
Bansback, Nick; Sizto, Sonia; Guh, Daphne; Anis, Aslam H
2012-10-01
Numerous websites offer direct-to-consumer (DTC) genetic testing, yet it is unknown how individuals will react to genetic risk profiles online. The objective of this study was to determine the feasibility of using a web-based survey and conjoint methods to elicit individuals' interpretations of genetic risk profiles by their anticipated worry/anxiousness and health-seeking behaviors. A web-based survey was developed using conjoint methods. Each survey presented 12 hypothetical genetic risk profiles describing genetic test results for four diseases. Test results were characterized by the type of disease (eight diseases), individual risk (five levels), and research confidence (three levels). After each profile, four questions were asked regarding anticipated worry and health-seeking behaviors. Probabilities of response outcomes based on attribute levels were estimated from logistic regression models, adjusting for covariates. Overall, 319 participants (69%) completed 3828 unique genetic risk profiles. Across all profiles, most participants anticipated making doctor's appointments (63%), lifestyle changes (57%), and accessing screening (57%); 40% anticipated feeling more worried and anxious. Higher levels of disease risk were significantly associated with affirmative responses. Conjoint methods may be used to elicit reactions to genetic information online. Preliminary results suggest that genetic information may increase worry/anxiousness and health-seeking behaviors among consumers of DTC tests. Further research is planned to determine the appropriateness of these affects and behaviors.
Adeniyi, D A; Wei, Z; Yang, Y
2018-01-30
A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model that suitably combines the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction for some life-threatening ailments using a Chi-square case-based reasoning (χ²-CBR) model. The proposed predictive engine is capable of reducing runtime and speeds up the execution process through the use of a critical χ² distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item based rule (FIBR) method. This FIBR method is used to select the best features for the proposed χ²-CBR model at the preprocessing stage of the predictive procedures. The implementation of the proposed risk calculator is achieved through an in-house developed PHP program running on a XAMPP/Apache HTTP server as the hosting server. The process of data acquisition and case-base development is implemented using the MySQL application. Performance comparison between our system and the NBY, ED-KNN, ANN, SVM, Random Forest, and traditional CBR techniques shows that the quality of predictions produced by our system outperformed the baseline methods studied. The results of our experiments show that the precision rate and predictive quality of our system are in most cases equal to or greater than 70%. Our results also show that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, fast, accurate and efficient risk level predictions to both patients and physicians at any time, online and on a real-time basis.
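The retrieval core of such a χ²-CBR engine can be sketched as a chi-square distance between a query case and each stored case, returning the nearest case's risk label. The feature vectors and labels below are illustrative, not clinical data, and the FIBR feature selection step is omitted.

```python
# Case-based reasoning with a chi-square distance: retrieve the nearest
# stored case and return its risk label as the prediction.

def chi2_distance(x, y, eps=1e-12):
    """Chi-square distance between two non-negative feature vectors."""
    return sum((a - b) ** 2 / (a + b + eps) for a, b in zip(x, y))

def predict_risk(query, case_base):
    """case_base: list of (feature_vector, risk_label) pairs."""
    best = min(case_base, key=lambda c: chi2_distance(query, c[0]))
    return best[1]

# hypothetical normalized feature vectors with risk labels
case_base = [
    ([0.9, 0.2, 0.4], "high"),
    ([0.1, 0.8, 0.3], "low"),
    ([0.5, 0.5, 0.5], "medium"),
]
label = predict_risk([0.85, 0.25, 0.35], case_base)
```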
2013-01-01
Background Privacy and information security are important for all healthcare services, including home-based services. We have designed and implemented a prototype technology platform for providing home-based healthcare services. It supports a personal electronic health diary and enables secure and reliable communication and interaction with peers and healthcare personnel. The platform runs on a small computer with a dedicated remote control. It is connected to the patient’s TV and to a broadband Internet connection. The platform has been tested with home-based rehabilitation and education programs for chronic obstructive pulmonary disease and diabetes. As part of our work, a risk assessment of privacy and security aspects has been performed, to reveal actual risks and to ensure adequate information security in this technical platform. Methods Risk assessment was performed in an iterative manner during the development process. Thus, security solutions have been incorporated into the design from an early stage instead of being included as an add-on to a nearly completed system. We have adapted existing risk management methods to our own environment, thus creating our own method. Our method conforms to ISO’s standard for information security risk management. Results A total of approximately 50 threats and possible unwanted incidents were identified and analysed. Among the threats to the four information security aspects (confidentiality, integrity, availability, and quality), confidentiality threats were identified as the most serious, with one threat given an unacceptable level of High risk. This is because health-related personal information is regarded as sensitive. Availability threats were analysed as low risk, as the aim of the home programmes is to provide education and rehabilitation services; they are not intended for use in acute situations or for continuous health monitoring.
Conclusions Most of the identified threats apply to healthcare services intended for patients or citizens in their own homes. Confidentiality risks in the home differ from those in a more controlled environment such as a hospital, and electronic equipment located in private homes and communicating via the Internet is more exposed to unauthorised access. By implementing the proposed measures, it has been possible to design a home-based service that ensures the necessary level of information security and privacy. PMID:23937965
Measuring the coupled risks: A copula-based CVaR model
NASA Astrophysics Data System (ADS)
He, Xubiao; Gong, Pu
2009-01-01
Integrated risk management for financial institutions requires an approach for aggregating risk types (such as market and credit) whose distributional shapes vary considerably. Financial institutions often ignore the coupling influence of risks and thereby underestimate their financial risks. We constructed a copula-based Conditional Value-at-Risk (CVaR) model for market and credit risks. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks, such as skewness and fat tails, while allowing for a rich dependence structure. Finally, a numerical simulation method is used to implement the model. Our results indicate that the coupled risks for a listed company's stock may be undervalued if credit risk is ignored, especially for listed companies with poor credit quality.
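The abstract's central point, that ignoring the coupling between market and credit risk understates tail risk, can be shown with a toy Monte Carlo sketch. A Gaussian dependence structure stands in for the paper's copula, and all parameters below are illustrative.

```python
# Toy Monte Carlo: CVaR of an aggregate loss under coupled vs independent
# shocks. Positive correlation fattens the tail of the aggregate loss.
import random, math

def simulate_cvar(rho, n=100_000, alpha=0.95, seed=7):
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)                                        # market shock
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)  # credit shock
        losses.append(z1 + z2)                                      # aggregate loss
    losses.sort()
    tail = losses[int(alpha * n):]          # losses beyond the alpha-quantile (VaR)
    return sum(tail) / len(tail)            # CVaR: expected loss beyond VaR

cvar_coupled = simulate_cvar(rho=0.6)       # coupled market and credit risk
cvar_indep = simulate_cvar(rho=0.0)         # independence assumption
```

With rho = 0.6 the aggregate CVaR is noticeably larger than under independence, which is the undervaluation the abstract warns about.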
Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan
2016-07-01
Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established to evaluate the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where the Danjiangkou Reservoir, China's key source water area supplying the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have high risk values in terms of industrial discharge, while Danjiangkou City and Yunxian County would have high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas of the region of Shiyan, close to the main stream and reservoir, were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (level II), two at a moderate risk level (level III), one at a higher risk level (level IV), and three at the highest risk level (level V). The risks of industrial discharge are also higher than those of the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
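The entropy weight method used here assigns larger weights to indicators whose values vary more across the assessed sources, since those carry more discriminating information. A compact sketch, with an invented indicator matrix:

```python
# Entropy weight method: weight_j is proportional to 1 - entropy of the
# normalized column for indicator j. A constant column carries no
# information and receives (near-)zero weight.
import math

def entropy_weights(matrix):
    """matrix[i][j]: non-negative value of indicator j for source i."""
    m, n = len(matrix), len(matrix[0])
    divergences = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [v / total for v in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        divergences.append(1 - e)           # information divergence of indicator j
    s = sum(divergences)
    return [d / s for d in divergences]

# three sources x two indicators: the second indicator varies across sources
matrix = [[10.0, 1.0],
          [10.0, 5.0],
          [10.0, 30.0]]
w = entropy_weights(matrix)
```

The constant first indicator gets essentially zero weight; nearly all weight goes to the second, which actually discriminates among sources.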
REBOUND: A Media-Based Life Skills and Risk Education Programme
ERIC Educational Resources Information Center
Kröninger-Jungaberle, Henrik; Nagy, Ede; von Heyden, Maximilian; DuBois, Fletcher
2015-01-01
Background: REBOUND is a novel media-based life skills and risk education programme developed for 14- to 25-year olds in school, university or youth group settings. This paper outlines the programme's rationale, curriculum and implementation. It provides information of relevance to researchers, programme developers and policymakers. Methods/design…
Sandle, Tim
2012-01-01
Environmental monitoring programs are essential for pharmaceutical facilities in order to assess the level of environmental control. For biotechnology facilities there is little advice as to the frequency at which viable environmental monitoring should be conducted. This paper outlines an approach, based on the principles of quality risk management, for the development of a framework from which monitoring frequencies can be determined. This involved the identification of common hazards and the evaluation of those hazards in terms of the severity of contamination and the probability of contamination occurring. These elements of risk were evaluated for different cleanrooms and the relative risks ranked. Once the risk scores were calculated, the methods for detecting risks within the cleanrooms were assessed. Risk filtering was then used to group different cleanrooms, based on their relative risks and detection methods, against predetermined monitoring frequencies. Through the use of case study examples, the paper presents the model and describes how appropriate frequencies for the environmental monitoring of cleanrooms can be set. Cleanrooms in which biotechnology pharmaceutical processing takes place are subject to environmental monitoring. The frequency at which such monitoring should be performed can be difficult to determine. This paper uses quality risk assessment methods to construct a framework for determining monitoring frequencies and illustrates the suitability of the framework through a case study.
Tan, Maxine; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin
2017-01-01
The purpose of this study is to evaluate a new method to improve performance of computer-aided detection (CAD) schemes of screening mammograms with two approaches. In the first approach, we developed a new case based CAD scheme using a set of optimally selected global mammographic density, texture, spiculation, and structural similarity features computed from all four full-field digital mammography (FFDM) images of the craniocaudal (CC) and mediolateral oblique (MLO) views by using a modified fast and accurate sequential floating forward selection feature selection algorithm. Selected features were then applied to a “scoring fusion” artificial neural network (ANN) classification scheme to produce a final case based risk score. In the second approach, we combined the case based risk score with the conventional lesion based scores of a conventional lesion based CAD scheme using a new adaptive cueing method that is integrated with the case based risk scores. We evaluated our methods using a ten-fold cross-validation scheme on 924 cases (476 cancer and 448 recalled or negative), whereby each case had all four images from the CC and MLO views. The area under the receiver operating characteristic curve was AUC = 0.793±0.015 and the odds ratio monotonically increased from 1 to 37.21 as CAD-generated case based detection scores increased. Using the new adaptive cueing method, the region based and case based sensitivities of the conventional CAD scheme at a false positive rate of 0.71 per image increased by 2.4% and 0.8%, respectively. The study demonstrated that supplementary information can be derived by computing global mammographic density image features to improve CAD-cueing performance on the suspicious mammographic lesions. PMID:27997380
Advanced uncertainty modelling for container port risk analysis.
Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin
2016-08-13
Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis methods, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shocks; applications of reliability and risk methods to probabilistic risk assessment; human factors and man-machine interface; and data bases and special applications.
Verrier, Richard L.; Klingenheben, Thomas; Malik, Marek; El-Sherif, Nabil; Exner, Derek V.; Hohnloser, Stefan H.; Ikeda, Takanori; Martínez, Juan Pablo; Narayan, Sanjiv M.; Nieminen, Tuomo; Rosenbaum, David S.
2014-01-01
This consensus guideline was prepared on behalf of the International Society for Holter and Noninvasive Electrocardiology and is cosponsored by the Japanese Circulation Society, the Computers in Cardiology Working Group on e-Cardiology of the European Society of Cardiology, and the European Cardiac Arrhythmia Society. It discusses the electrocardiographic phenomenon of T-wave alternans (TWA) (i.e., a beat-to-beat alternation in the morphology and amplitude of the ST-segment or T-wave). This statement focuses on its physiological basis, measurement technologies, and clinical utility in stratifying risk for life-threatening ventricular arrhythmias. Signal processing techniques including the frequency-domain Spectral Method and the time-domain Modified Moving Average method have demonstrated the utility of TWA in arrhythmia risk stratification in prospective studies in >12,000 patients. The majority of exercise-based studies using both methods have reported high relative risks for cardiovascular mortality and for sudden cardiac death in patients with preserved as well as depressed left ventricular ejection fraction. Studies with ambulatory electrocardiogram-based TWA analysis with the Modified Moving Average method have yielded significant predictive capacity. However, negative studies with the Spectral Method have also appeared, including 2 interventional studies in patients with implantable defibrillators. Meta-analyses have been performed to gain insights into this issue. Frontiers of TWA research include use in arrhythmia risk stratification of individuals with preserved ejection fraction, improvements in predictivity with quantitative analysis, and utility in guiding medical as well as device-based therapy. Overall, although TWA appears to be a useful marker of risk for arrhythmic and cardiovascular death, there is as yet no definitive evidence that it can guide therapy. PMID:21920259
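The time-domain Modified Moving Average method can be schematized as two running beat templates, one for even beats and one for odd beats, with TWA read as the difference between them. Real MMA bounds each template update; the fixed 1/8 update factor and the synthetic beat amplitudes below are simplifications for illustration only.

```python
# Schematic Modified-Moving-Average-style TWA estimate: maintain separate
# running averages of even-indexed and odd-indexed T-wave amplitudes and
# report the absolute difference between the two templates.

def mma_twa(t_wave_amplitudes, factor=0.125):
    """Simplified MMA: exponential update of even/odd beat templates."""
    even = odd = None
    for i, amp in enumerate(t_wave_amplitudes):
        if i % 2 == 0:
            even = amp if even is None else even + factor * (amp - even)
        else:
            odd = amp if odd is None else odd + factor * (amp - odd)
    return abs(even - odd)

# synthetic T-wave amplitudes in microvolts: alternating vs steady beats
alternating = [100, 140] * 32   # every-other-beat alternation of 40 uV
steady = [120, 120] * 32        # no alternans
```

On the alternating series the estimate converges to the 40 uV alternation; on the steady series it is zero.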
Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. En...
Towards Risk Based Design for NASA's Missions
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila
2004-01-01
This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to characterize and optimize risk, and to use risk as a tradeable resource for making robust and reliable decisions, in the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early phase design.
A New Time-varying Concept of Risk in a Changing Climate.
Sarhadi, Ali; Ausín, María Concepción; Wiper, Michael P
2016-10-20
In a changing climate arising from anthropogenic global warming, the nature of extreme climatic events is changing over time. Existing analytical stationary-based risk methods, however, assume multi-dimensional extreme climate phenomena will not significantly vary over time. To strengthen the reliability of infrastructure designs and the management of water systems in the changing environment, multidimensional stationary risk studies should be replaced with a new adaptive perspective. The results of a comparison indicate that current multi-dimensional stationary risk frameworks are no longer applicable to projecting the changing behaviour of multi-dimensional extreme climate processes. Using static stationary-based multivariate risk methods may lead to undesirable consequences in designing water system infrastructures. The static stationary concept should be replaced with a flexible multi-dimensional time-varying risk framework. The present study introduces a new multi-dimensional time-varying risk concept to be incorporated in updating infrastructure design strategies under changing environments arising from human-induced climate change. The proposed generalized time-varying risk concept can be applied for all stochastic multi-dimensional systems that are under the influence of changing environments.
Alternative evaluation metrics for risk adjustment methods.
Park, Sungchul; Basu, Anirban
2018-06-01
Risk adjustment is instituted to counter risk selection by accurately equating payments with expected expenditures. Traditional risk-adjustment methods are designed to estimate accurate payments at the group level. However, this generates residual risks at the individual level, especially for high-expenditure individuals, thereby inducing health plans to avoid those with high residual risks. To identify an optimal risk-adjustment method, we perform a comprehensive comparison of prediction accuracies at the group level, at the tail distributions, and at the individual level across 19 estimators: 9 parametric regression, 7 machine learning, and 3 distributional estimators. Using the 2013-2014 MarketScan database, we find that no one estimator performs best in all prediction accuracies. Generally, machine learning and distribution-based estimators achieve higher group-level prediction accuracy than parametric regression estimators. However, parametric regression estimators show higher tail distribution prediction accuracy and individual-level prediction accuracy, especially at the tails of the distribution. This suggests that there is a trade-off in selecting an appropriate risk-adjustment method between estimating accurate payments at the group level and lower residual risks at the individual level. Our results indicate that an optimal method cannot be determined solely on the basis of statistical metrics but rather needs to account for simulating plans' risk-selective behaviors. Copyright © 2018 John Wiley & Sons, Ltd.
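The trade-off described here is easy to see in miniature: a flat group-mean payment is perfect by a group-level metric (predictive ratio = 1) yet leaves large residuals for high-expenditure individuals. The numbers below are toy values, not MarketScan data.

```python
# Group-level vs individual-level accuracy of a payment rule: the group
# predictive ratio can be exactly 1 while individual residuals stay large.

def predictive_ratio(predicted, actual):
    """Group-level metric: total predicted payment over total actual cost."""
    return sum(predicted) / sum(actual)

def mean_abs_error(predicted, actual):
    """Individual-level metric: mean absolute residual per person."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

actual = [100, 200, 300, 5000]          # one high-expenditure individual
flat = [sum(actual) / len(actual)] * 4  # flat group-mean payment of 1400 each

pr = predictive_ratio(flat, actual)     # perfect at the group level
mae = mean_abs_error(flat, actual)      # but large individual residuals
```

The high-cost individual is underpaid by 3600 under the flat rule, which is exactly the residual risk that invites plan-level risk selection.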
Comparing biomarkers as principal surrogate endpoints.
Huang, Ying; Gilbert, Peter B
2011-12-01
Recently a new definition of surrogate endpoint, the "principal surrogate," was proposed based on causal associations between treatment effects on the biomarker and on the clinical endpoint. Despite its appealing interpretation, limited research has been conducted to evaluate principal surrogates, and existing methods focus on risk models that consider a single biomarker. How to compare principal surrogate value of biomarkers or general risk models that consider multiple biomarkers remains an open research question. We propose to characterize a marker or risk model's principal surrogate value based on the distribution of risk difference between interventions. In addition, we propose a novel summary measure (the standardized total gain) that can be used to compare markers and to assess the incremental value of a new marker. We develop a semiparametric estimated-likelihood method to estimate the joint surrogate value of multiple biomarkers. This method accommodates two-phase sampling of biomarkers and is more widely applicable than existing nonparametric methods by incorporating continuous baseline covariates to predict the biomarker(s), and is more robust than existing parametric methods by leaving the error distribution of markers unspecified. The methodology is illustrated using a simulated example set and a real data set in the context of HIV vaccine trials. © 2011, The International Biometric Society.
NASA Astrophysics Data System (ADS)
Puissant, Anne; Wernert, Pauline; Débonnaire, Nicolas; Malet, Jean-Philippe; Bernardie, Séverine; Thomas, Loic
2017-04-01
Landslide risk assessment has become a major research subject over recent decades. In the context of the French-funded ANR project SAMCO, which aims to enhance the overall resilience of societies to the impacts of mountain risks, we developed a procedure to quantify changes in landslide risk at the catchment scale. First, we investigated landslide susceptibility, the spatial component of the hazard, through a weights-of-evidence probabilistic model. The latter is based on the knowledge of past and current landslides and simulates their spatial locations in relation to environmental controlling factors. Second, we studied potential consequences using a semi-quantitative, region-scale, indicator-based method, the Potential Damage Index (PDI) method. It estimates the possible damage related to landslides by combining weighted indicators reflecting the exposure of the elements at risk in terms of structural, functional and socio-economic stakes. Finally, we produced landslide risk maps by combining the susceptibility and potential consequence maps resulting from the two previous steps. The risk maps are produced for the present and for the future (e.g., the periods 2050 and 2100), taking into account four scenarios of future landcover and landuse development (based on the Prelude European Project) that are consistent with the likely evolution of mountain communities. The results identify the geographical areas that are likely to be exposed to landslide risk in the future. They are integrated into a web-based demonstrator, enabling the comparison of the various scenarios, and could thus be used as a decision-support tool for local stakeholders. The method and the demonstrator are presented through the analysis of landslide risk in two catchments of the French Alps, the Vars catchment and the Barcelonnette basin, each characterized by a different exposure to landslide hazards.
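The weights-of-evidence step can be sketched as follows. This is a generic illustration of the positive/negative weight and contrast calculation for a single evidential class (e.g. a slope-angle bin); the counts used in the example are hypothetical and are not taken from the SAMCO study:

```python
import math

def weights_of_evidence(n_class_slide, n_class, n_slide, n_total):
    """Positive/negative weights for one evidential class of a controlling factor.

    n_class_slide: landslide cells inside the class
    n_class:       total cells inside the class
    n_slide:       landslide cells in the whole study area
    n_total:       total cells in the study area
    """
    # P(class | landslide) vs P(class | no landslide)
    p_b_d = n_class_slide / n_slide
    p_b_nd = (n_class - n_class_slide) / (n_total - n_slide)
    w_plus = math.log(p_b_d / p_b_nd)
    # Complementary probabilities for cells outside the class
    p_nb_d = (n_slide - n_class_slide) / n_slide
    p_nb_nd = ((n_total - n_class) - (n_slide - n_class_slide)) / (n_total - n_slide)
    w_minus = math.log(p_nb_d / p_nb_nd)
    # The contrast C = W+ - W- measures the strength of the association
    return w_plus, w_minus, w_plus - w_minus
```

Summing the relevant weight of each factor class at every cell yields the susceptibility score that is then mapped.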
Credit scoring analysis using weighted k nearest neighbor
NASA Astrophysics Data System (ADS)
Mukid, M. A.; Widiharih, T.; Rusgiyono, A.; Prahutama, A.
2018-05-01
Credit scoring is a quantitative method to evaluate the credit risk of loan applications. Both statistical methods and artificial intelligence are often used by credit analysts to help them decide whether applicants are creditworthy. These methods aim to predict future behaviour in terms of credit risk based on past experience with customers with similar characteristics. This paper reviews the weighted k nearest neighbor (WKNN) method for credit assessment, considering the use of several kernels. We use credit data from a private bank in Indonesia. The results show that the Gaussian and rectangular kernels perform best, each achieving a percentage of correctly classified cases of 82.4%.
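A minimal sketch of WKNN classification with a Gaussian kernel on neighbour distances. The toy data and the 0/1 label encoding below are hypothetical; the paper's actual applicant features and tuned bandwidth are not reproduced here:

```python
import math

def wknn_predict(train, query, k=5, bandwidth=1.0):
    """Weighted k nearest neighbor classification with a Gaussian kernel.

    train: list of (features, label) pairs; labels are class ids,
           e.g. 1 = creditworthy, 0 = not (hypothetical encoding).
    """
    # k closest training points by Euclidean distance
    neighbors = sorted((math.dist(x, query), y) for x, y in train)[:k]
    # Gaussian kernel: closer neighbors cast exponentially larger votes
    votes = {}
    for d, y in neighbors:
        votes[y] = votes.get(y, 0.0) + math.exp(-0.5 * (d / bandwidth) ** 2)
    return max(votes, key=votes.get)
```

A rectangular kernel corresponds to replacing the exponential weight with a constant 1 for every retained neighbour, i.e. plain majority voting among the k nearest.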
Lu, Jia-Yang; Cheung, Michael Lok-Man; Huang, Bao-Tian; Wu, Li-Li; Xie, Wen-Jia; Chen, Zhi-Jian; Li, De-Rui; Xie, Liang-Xi
2015-01-01
To assess the performance of a simple optimisation method for improving target coverage and organ-at-risk (OAR) sparing in intensity-modulated radiotherapy (IMRT) for cervical oesophageal cancer. For 20 selected patients, clinically acceptable original IMRT plans (Original plans) were created, and two optimisation methods were adopted to improve the plans: 1) a base dose function (BDF)-based method, in which the treatment plans were re-optimised based on the original plans, and 2) a dose-controlling structure (DCS)-based method, in which the original plans were re-optimised by assigning additional constraints for hot and cold spots. The Original, BDF-based and DCS-based plans were compared with regard to target dose homogeneity, conformity, OAR sparing, planning time and monitor units (MUs). Dosimetric verifications were performed and delivery times were recorded for the BDF-based and DCS-based plans. The BDF-based plans provided significantly superior dose homogeneity and conformity compared with both the DCS-based and Original plans. The BDF-based method further reduced the doses delivered to the OARs by approximately 1-3%. The re-optimisation time was reduced by approximately 28%, but the MUs and delivery time were slightly increased. All verification tests were passed and no significant differences were found. The BDF-based method for the optimisation of IMRT for cervical oesophageal cancer can achieve significantly better dose distributions with better planning efficiency at the expense of slightly more MUs.
Hänsel Petersson, Gunnel; Åkerman, Sigvard; Isberg, Per-Erik; Ericson, Dan
2016-07-07
Predicting future risk for oral diseases, treatment need and prognosis are tasks performed daily in clinical practice. A large variety of methods have been reported, ranging from clinical judgement, "gut feeling" or even patient interviewing to complex assessments of combinations of known risk factors. In clinical practice, there is a continuous search for less complicated and more valid tools for risk assessment. There is also a lack of knowledge of how different common methods relate to one another. The aim of this study was to investigate whether caries risk assessment (CRA) based on clinical judgement and the Cariogram model give similar results. In addition, we assessed which factors from clinical status and history agree best with the CRA based on clinical judgement, and how the patient's own perception of future oral treatment need corresponds with the sum of the examiner's risk scores. Clinical examinations were performed on randomly selected individuals 20-89 years old living in Skåne, Sweden. In total, 451 individuals were examined, 51 % of whom were women. The clinical examination included caries detection, saliva sampling and radiographic examination, together with history taking and a questionnaire. The examiners made a risk classification, and the authors made a second risk calculation according to the Cariogram. Of those assessed as low risk using the Cariogram, 69 % were also assessed as low risk based on clinical judgement. For the other risk groups the agreement was lower. Clinical variables significantly related to CRA based on clinical judgement were DS (decayed surfaces), DS combined with incipient lesions, DMFT (decayed, missed, filled teeth), plaque amount, history and soft drink intake. Patients' perception of future oral treatment need correlated to some extent with the sum of the examiner's risk scores.
The main finding was that CRA based on clinical judgement and the Cariogram model gave similar results for the groups predicted to be at low risk of future disease, but less so for the other groups. CRA based on clinical judgement agreed best with the number of DS plus incipient lesions.
Braunstein, Sarah L; van de Wijgert, Janneke H; Vyankandondera, Joseph; Kestelyn, Evelyne; Ntirushwa, Justin; Nash, Denis
2012-01-01
Background: The epidemiologic utility of STARHS hinges not only on producing accurate estimates of HIV incidence, but also on identifying risk factors for recent HIV infection. Methods: As part of an HIV seroincidence study, 800 Rwandan female sex workers (FSW) were HIV tested, with those testing positive further tested by BED-CEIA (BED) and AxSYM Avidity Index (Ax-AI) assays. A sample of HIV-negative (N=397) FSW were followed prospectively for HIV seroconversion. We compared estimates of risk factors for: 1) prevalent HIV infection; 2) recently acquired HIV infection (RI) based on three different STARHS classifications (BED alone, Ax-AI alone, BED/Ax-AI combined); and 3) prospectively observed seroconversion. Results: There was mixed agreement in risk factors between methods. HSV-2 coinfection and recent STI treatment were associated with both prevalent HIV infection and all three measures of recent infection. A number of risk factors were associated only with prevalent infection, including widowhood, history of forced sex, regular alcohol consumption, prior imprisonment, and current breastfeeding. Number of sex partners in the last 3 months was associated with recent infection based on BED/Ax-AI combined, but not other STARHS-based recent infection outcomes or prevalent infection. Risk factor estimates for prospectively observed seroconversion differed in magnitude and direction from those for recent infection via STARHS. Conclusions: Differences in risk factor estimates by each method could reflect true differences in risk factors between the prevalent, recently, or newly infected populations, the effect of study interventions (among those followed prospectively), or assay misclassification. Similar investigations in other populations/settings are needed to further establish the epidemiologic utility of STARHS for identifying risk factors, in addition to incidence rate estimation. PMID:23056162
[Legal and methodical aspects of occupational risk management].
2011-01-01
Legal and methodical aspects of occupational risk management (ORM) are considered, taking into account new official documents. The introduction of the notions of risk and risk management into the Labor Code reflects a change in the forms of occupational health and safety. The role of hygienists and occupational medicine professionals in workplace conditions certification (WCC) and periodical medical examinations (PME) is strengthened. ORM could be improved by introducing a block of prognosis and causation based on IT technologies that could link the WCC and PME systems, thus improving the effectiveness of prophylaxis.
Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare
2018-01-01
Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.
Failure prediction using machine learning and time series in optical network.
Wang, Zhilong; Zhang, Min; Wang, Danshi; Song, Chuang; Liu, Min; Li, Jin; Lou, Liqi; Liu, Zhuo
2017-08-07
In this paper, we propose a performance monitoring and failure prediction method for optical networks based on machine learning. The primary algorithms of this method are the support vector machine (SVM) and double exponential smoothing (DES). With a focus on risk-aware models in optical networks, the proposed protection plan primarily investigates how to predict the risk of an equipment failure. To the best of our knowledge, this important problem has not yet been fully considered. Experimental results showed that the average prediction accuracy of our method was 95% when predicting the failure state of optical equipment. This means that our method can forecast equipment failure risk with high accuracy. Our proposed DES-SVM method can therefore effectively improve traditional risk-aware models to protect services from possible failures and enhance optical network stability.
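Double exponential smoothing (Holt's linear method) tracks both a level and a trend in a monitored performance series and extrapolates them forward; a minimal sketch (the smoothing parameters are illustrative defaults, not the values tuned in the paper):

```python
def des_forecast(series, alpha=0.5, beta=0.3, horizon=1):
    """Holt's double exponential smoothing: track level + trend, then
    extrapolate `horizon` steps ahead. `series` needs >= 2 points."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        # Blend the new observation with the previous level-plus-trend forecast
        level = alpha * x + (1 - alpha) * (level + trend)
        # Update the trend from the change in level
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend
```

In a DES-SVM pipeline of the kind described, such forecasts of monitored parameters would then be fed to the classifier that labels the equipment state.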
Taksler, Glen B; Perzynski, Adam T; Kattan, Michael W
2017-04-01
Recommendations for colorectal cancer screening encourage patients to choose among various screening methods based on individual preferences for benefits, risks, screening frequency, and discomfort. We devised a model to illustrate how individuals with varying tolerance for screening complications risk might decide on their preferred screening strategy. We developed a discrete-time Markov mathematical model that allowed hypothetical individuals to maximize expected lifetime utility by selecting screening method, start age, stop age, and frequency. Individuals could choose from stool-based testing every 1 to 3 years, flexible sigmoidoscopy every 1 to 20 years with annual stool-based testing, colonoscopy every 1 to 20 years, or no screening. We compared the life expectancy gained from the chosen strategy with the life expectancy available from a benchmark strategy of decennial colonoscopy. For an individual at average risk of colorectal cancer who was risk neutral with respect to screening complications (and therefore was willing to undergo screening if it would actuarially increase life expectancy), the model predicted that he or she would choose colonoscopy every 10 years, from age 53 to 73 years, consistent with national guidelines. For a similar individual who was moderately averse to screening complications risk (and therefore required a greater increase in life expectancy to accept potential risks of colonoscopy), the model predicted that he or she would prefer flexible sigmoidoscopy every 12 years with annual stool-based testing, with 93% of the life expectancy benefit of decennial colonoscopy. For an individual with higher risk aversion, the model predicted that he or she would prefer 2 lifetime flexible sigmoidoscopies, 20 years apart, with 70% of the life expectancy benefit of decennial colonoscopy. 
Mathematical models may formalize how individuals with different risk attitudes choose between various guideline-recommended colorectal cancer screening strategies.
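The decision logic can be illustrated with a stylized expected-utility comparison in which a risk-aversion weight penalizes lifetime complication risk. The strategy names, life-year gains and complication probabilities below are made up for illustration and are not the paper's Markov-model estimates:

```python
def choose_strategy(strategies, risk_aversion):
    """Pick the screening strategy maximizing a stylized expected utility:
    life-years gained minus risk_aversion * lifetime complication risk."""
    def utility(s):
        return s["life_years_gained"] - risk_aversion * s["complication_risk"]
    return max(strategies, key=utility)

# Hypothetical strategies: invasiveness trades off benefit against complication risk
strategies = [
    {"name": "no screening", "life_years_gained": 0.0, "complication_risk": 0.0},
    {"name": "colonoscopy q10y", "life_years_gained": 0.25, "complication_risk": 0.02},
    {"name": "sigmoidoscopy q12y + annual stool test",
     "life_years_gained": 0.23, "complication_risk": 0.008},
]
```

As in the abstract, a risk-neutral chooser prefers the highest-benefit strategy, while increasing risk aversion shifts the choice toward less invasive options and eventually to no screening.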
Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements.
Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis
2017-02-01
Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Y; Liu, B; Kalra, M
Purpose: X-rays from CT scans can increase patients' cancer risk. The Lifetime Attributable Risk of Cancer Incidence for adult patients has been investigated and shown to decrease as patients age. However, a new risk model shows an increasing risk trend for several radiosensitive organs in middle-aged patients. This study investigates the feasibility of a general method for optimizing tube current modulation (TCM) functions to minimize risk by reducing radiation dose to patients' radiosensitive organs. Methods: Organ-based TCM has been investigated in the literature for eye lens dose and breast dose. Adopting the concept of organ-based TCM, this study seeks an optimized tube current that minimizes the total risk to breasts and lungs by reducing the dose to these organs. The contribution of each CT view to organ dose is determined through view-by-view simulations of the CT scan using a GPU-based fast Monte Carlo code, ARCHER. A linear programming problem is established for tube current optimization, with the Monte Carlo results as weighting factors at each view. A pre-determined dose is used as the upper dose boundary, and the tube current of each view is optimized to minimize the total risk. Results: An optimized tube current is found that minimizes the total risk to lungs and breasts: compared with a fixed current, the risk is reduced by 13%, with breast dose reduced by 38% and lung dose reduced by 7%. The average tube current is maintained during optimization to preserve image quality. In addition, dose to other organs in the chest region is only slightly affected, with relative changes in dose smaller than 10%. Conclusion: Optimized tube current plans can be generated to minimize cancer risk to lungs and breasts while maintaining image quality. In the future, various risk models and greater numbers of projections per rotation will be simulated on phantoms of different gender and age. National Institutes of Health R01EB015478.
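Under the notation assumed below (not the authors' own symbols), the view-by-view optimization can be written as a small linear program: minimize the risk-weighted organ doses subject to per-organ dose caps and a fixed average tube current:

```latex
\begin{aligned}
\min_{m_1,\dots,m_V}\quad & \sum_{o \in \{\text{lung},\,\text{breast}\}} r_o\, D_o,
  \qquad D_o = \sum_{v=1}^{V} w_{o,v}\, m_v,\\
\text{s.t.}\quad & D_o \le D_o^{\max} \quad \text{for each organ } o,\\
& \tfrac{1}{V} \sum_{v=1}^{V} m_v = \bar{m}, \qquad m_v \ge 0,
\end{aligned}
```

where $m_v$ is the tube current at view $v$, $w_{o,v}$ the Monte Carlo dose contribution per unit current of view $v$ to organ $o$, $r_o$ a risk coefficient, $D_o^{\max}$ the pre-determined dose bound, and $\bar{m}$ the fixed-current average preserved for image quality.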
[Study on the risk assessment method of regional groundwater pollution].
Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei
2013-02-01
Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, comprising regional groundwater specific vulnerability assessment, assessment of regional pollution source characteristics, and health risk assessment of regional featured pollutants. The three sub-evaluation systems were coupled with the multi-index comprehensive method, the risk was characterized with the Spatial Analysis of ArcMap, and a new method to evaluate regional groundwater pollution risk, suitable for different natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. The vulnerability index of groundwater in Changzhou is high and unevenly distributed; the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risk; and, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high and unevenly distributed, concentrated north of the Anjia-Xuejia-Zhenglu line and in the city centre and southeast, where human activities are more intense and pollution sources are dense.
Solvency II solvency capital requirement for life insurance companies based on expected shortfall.
Boonen, Tim J
2017-01-01
This paper examines the consequences for a life annuity insurance company if the Solvency II solvency capital requirements (SCR) are calibrated based on expected shortfall (ES) instead of value-at-risk (VaR). We focus on the SCR risk modules for three risk classes: equity risk, interest rate risk and longevity risk. The stress scenarios are determined using the calibration method proposed by EIOPA in 2014. We apply the stress scenarios for these three risk classes to a fictitious life annuity insurance company. We find that at EIOPA's current 99.5% VaR quantile, the stress scenarios of the various risk classes based on ES are close to those based on VaR. Should EIOPA choose to calibrate the stress scenarios at a smaller quantile, the longevity SCR would be relatively larger and the equity SCR relatively smaller if ES is used instead of VaR. We reach the same conclusion if the stress scenarios are determined from empirical stress scenarios.
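The two risk measures differ in how they treat the tail: VaR is a quantile of the loss distribution, while ES is the mean loss beyond that quantile. An empirical sketch using a simple order-statistic estimator (this is a generic illustration, not EIOPA's calibration procedure):

```python
def var_es(losses, q=0.995):
    """Empirical value-at-risk and expected shortfall at confidence q.
    losses: a sample of losses (positive values = losses)."""
    xs = sorted(losses)
    idx = int(q * len(xs))               # index of the q-quantile (simple estimator)
    var = xs[min(idx, len(xs) - 1)]
    tail = [x for x in xs if x >= var]   # losses at or beyond VaR
    es = sum(tail) / len(tail)           # ES = mean of the tail losses
    return var, es
```

Because ES averages the whole tail, it always lies at or above VaR at the same confidence level, which is why recalibrating stress scenarios from VaR to ES changes the relative size of the risk-class SCRs.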
Agapova, Maria; Bresnahan, Brian B; Higashi, Mitchell; Kessler, Larry; Garrison, Louis P; Devine, Beth
2017-02-01
The American College of Radiology develops evidence-based practice guidelines to aid appropriate utilization of radiological procedures. Panel members use expert opinion to weight trade-offs and consensus methods to rate appropriateness of imaging tests. These ratings include an equivocal range, assigned when there is disagreement about a technology's appropriateness and the evidence base is weak or for special circumstances. It is not clear how expert consensus merges with the evidence base to arrive at an equivocal rating. Quantitative benefit-risk assessment (QBRA) methods may assist decision makers in this capacity. However, many methods exist and it is not clear which methods are best suited for this application. We perform a critical appraisal of QBRA methods and propose several steps that may aid in making transparent areas of weak evidence and barriers to consensus in guideline development. We identify QBRA methods with potential to facilitate decision making in guideline development and build a decision aid for selecting among these methods. This study identified 2 families of QBRA methods suited to guideline development when expert opinion is expected to contribute substantially to decision making. Key steps to deciding among QBRA methods involve identifying specific benefit-risk criteria and developing a state-of-evidence matrix. For equivocal ratings assigned for reasons other than disagreement or weak evidence base, QBRA may not be needed. In the presence of disagreement but the absence of a weak evidence base, multicriteria decision analysis approaches are recommended; and in the presence of weak evidence base and the absence of disagreement, incremental net health benefit alone or combined with multicriteria decision analysis is recommended. Our critical appraisal further extends investigation of the strengths and limitations of select QBRA methods in facilitating diagnostic radiology clinical guideline development. 
The process of using the decision aid exposes and makes transparent areas of weak evidence and barriers to consensus. © 2016 John Wiley & Sons, Ltd.
Li, Pei-Chiun; Ma, Hwong-Wen
2016-01-25
The total quantity of chemical emissions does not take chemical toxicity into account, and thus fails to be an accurate indicator of the potential impact on human health. The sources of released contaminants, and therefore the potential risk, also differ geographically. Because of the complexity of risk, there is no integrated method to evaluate the effectiveness of risk reduction. This study therefore developed a method to incorporate the spatial variability of emissions into human health risk assessment and to evaluate how to reduce risk effectively using risk elasticity analysis. Risk elasticity, the percentage change in risk in response to the percentage change in emissions, was adopted to evaluate the effectiveness and efficiency of risk reduction. The results show that the main industry sectors differ between areas and that high emissions in an area do not correspond to high risk; decreasing the high emissions of certain sectors in an area does not necessarily result in efficient risk reduction there. This method can provide more holistic information for risk management, prevent the development of increased risk, and help prioritize risk reduction strategies. Copyright © 2015 Elsevier B.V. All rights reserved.
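Risk elasticity as defined here (percentage change in risk per percentage change in emissions) can be sketched numerically. `risk_fn` is a placeholder standing in for whatever exposure-and-toxicity model maps an emission quantity to a health risk:

```python
def risk_elasticity(risk_fn, emissions, delta=0.01):
    """Arc elasticity: (% change in risk) / (% change in emissions),
    evaluated by perturbing emissions by a small fraction `delta`."""
    r0 = risk_fn(emissions)
    r1 = risk_fn(emissions * (1.0 + delta))
    return ((r1 - r0) / r0) / delta
```

An elasticity near 1 means risk scales proportionally with emissions; values well above 1 flag sector-area combinations where emission cuts buy disproportionately large risk reduction.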
Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.
2012-01-01
Background: The ability to identify the risk factors related to an adverse condition, e.g., a heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or the literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data-driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge-driven and data-driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the performance metric the Area Under the ROC Curve (AUC). The combined knowledge- and data-driven risk factors significantly outperform knowledge-based risk factors alone. Furthermore, the additional risk factors were confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge- and data-driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
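The core idea, a sparse regression in which known risk factors carry a smaller L1 penalty than candidate data-driven factors, can be sketched with a proximal-gradient (ISTA) loop. This is a generic illustration, not the authors' exact regularization scheme, and the toy data in the test are hypothetical:

```python
def weighted_lasso(X, y, penalties, lr=0.01, epochs=2000):
    """L1-regularized least squares with a per-coefficient penalty,
    fitted by proximal gradient descent (ISTA).

    Known (knowledge-driven) risk factors get small penalties so they
    stay in the model; candidate data-driven factors get larger
    penalties and enter only when strongly supported by the data.
    """
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(epochs):
        # Gradient of the smooth part 1/(2n) * ||X beta - y||^2
        resid = [sum(X[i][j] * beta[j] for j in range(p)) - y[i]
                 for i in range(n)]
        for j in range(p):
            g = sum(resid[i] * X[i][j] for i in range(n)) / n
            b = beta[j] - lr * g
            t = lr * penalties[j]  # soft-threshold by this coefficient's own penalty
            beta[j] = b - t if b > t else b + t if b < -t else 0.0
    return beta
```

Coefficients that survive the heavier data-driven penalty are the "complementary" risk factors the abstract describes.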
Häggström, Christel; Van Hemelrijck, Mieke; Garmo, Hans; Robinson, David; Stattin, Pär; Rowley, Mark; Coolen, Anthony C C; Holmberg, Lars
2018-05-09
Most previous studies of prostate cancer have not taken into account that men in the studied populations are also at risk of competing events, and that these men may differ in their susceptibility to prostate cancer. The aim of this study was to investigate heterogeneity in risk of prostate cancer using a recently developed latent class regression method for competing risks. We further aimed to elucidate the association between type 2 diabetes mellitus (T2DM) and prostate cancer risk, and to compare the results with conventional methods for survival analysis. We analysed the risk of prostate cancer in 126,482 men from the comparison cohort of the Prostate Cancer Data base Sweden (PCBaSe) 3.0. During a mean follow-up of 6 years, 6,036 men were diagnosed with prostate cancer and 22,393 men died. We detected heterogeneity in risk of prostate cancer, with two distinct latent classes in the study population. The smaller class comprised 9% of the study population; men in this class had a higher risk of prostate cancer, and the risk was more strongly associated with class membership than with any of the covariates included in the study. Moreover, we found no association between T2DM and risk of prostate cancer after removal of the effect of informative censoring due to competing risks. The recently developed latent class method for competing risks could be used to provide new insights in precision medicine, with the target of classifying individuals with regard to their susceptibility to a particular disease, reaction to a risk factor or response to treatment. This article is protected by copyright. All rights reserved. © 2018 UICC.
Quantitative Method for Analyzing the Allocation of Risks in Transportation Construction
DOT National Transportation Integrated Search
1979-04-01
The report presents a conceptual model of risk that was developed to analyze the impact on owner's cost of alternate allocations of risk among owner and contractor in mass transit construction. A model and analysis procedure are developed, based on d...
Active fans and grizzly bears: Reducing risks for wilderness campers
NASA Astrophysics Data System (ADS)
Sakals, M. E.; Wilford, D. J.; Wellwood, D. W.; MacDougall, S. A.
2010-03-01
Active geomorphic fans experience debris flows, debris floods and/or floods (hydrogeomorphic processes) that can be hazards to humans. Grizzly bears (Ursus arctos) can also be a hazard to humans. This paper presents the results of a cross-disciplinary study that analyzed both hydrogeomorphic and grizzly bear hazards to wilderness campers on geomorphic fans along a popular hiking trail in Kluane National Park and Reserve in southwestern Yukon Territory, Canada. Based on the results, a method is proposed to reduce the risks to campers associated with camping on fans. The method includes both landscape and site scales and is based on easily understood and readily available information regarding weather, vegetation, stream bank conditions, and bear ecology and behaviour. Educating wilderness campers and providing a method of decision-making to reduce risk supports Parks Canada's public safety program; a program based on the principle of user self-sufficiency. Reducing grizzly bear-human conflicts complements the efforts of Parks Canada to ensure a healthy grizzly bear population.
NASA Astrophysics Data System (ADS)
Oh, Jung Hun; Kerns, Sarah; Ostrer, Harry; Powell, Simon N.; Rosenstein, Barry; Deasy, Joseph O.
2017-02-01
The biological cause of clinically observed variability of normal tissue damage following radiotherapy is poorly understood. We hypothesized that machine/statistical learning methods using single nucleotide polymorphism (SNP)-based genome-wide association studies (GWAS) would identify groups of patients of differing complication risk, and furthermore could be used to identify key biological sources of variability. We developed a novel learning algorithm, called pre-conditioned random forest regression (PRFR), to construct polygenic risk models using hundreds of SNPs, thereby capturing genomic features that confer small differential risk. Predictive models were trained and validated on a cohort of 368 prostate cancer patients for two post-radiotherapy clinical endpoints: late rectal bleeding and erectile dysfunction. The proposed method results in better predictive performance compared with existing computational methods. Gene ontology enrichment analysis and protein-protein interaction network analysis are used to identify key biological processes and proteins that were plausible based on other published studies. In conclusion, we confirm that novel machine learning methods can produce large predictive models (hundreds of SNPs), yielding clinically useful risk stratification models, as well as identifying important underlying biological processes in the radiation damage and tissue repair process. The methods are generally applicable to GWAS data and are not specific to radiotherapy endpoints.
NASA Astrophysics Data System (ADS)
Samat, N. A.; Ma'arof, S. H. Mohd Imam
2015-05-01
Disease mapping is a method to display the geographical distribution of disease occurrence, which generally involves the use and interpretation of a map to show the incidence of certain diseases. Relative risk (RR) estimation is one of the most important issues in disease mapping. This paper begins by providing a brief overview of Chikungunya disease. This is followed by a review of the classical model used in disease mapping, based on the standardized morbidity ratio (SMR), which we then apply to our Chikungunya data. We then fit an extension of the classical model, which we refer to as a Poisson-Gamma model, in which prior distributions for the relative risks are assumed known. Both sets of results are displayed and compared using maps; the Poisson-Gamma model yields a smoother map with fewer extreme values of estimated relative risk. Extensions of this work will consider other methods relevant to overcoming the drawbacks of the existing approaches, in order to inform and direct government strategy for monitoring and controlling Chikungunya disease.
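The two estimators compared above can be sketched directly. The Gamma prior parameters in the example are illustrative only (the paper's priors are not specified here):

```python
def smr(observed, expected):
    """Standardized morbidity ratio: crude relative-risk estimate per area."""
    return observed / expected

def poisson_gamma_rr(observed, expected, alpha, beta):
    """Posterior mean relative risk under a Gamma(alpha, beta) prior.

    Poisson likelihood + Gamma prior gives a Gamma(alpha + O, beta + E)
    posterior, so the posterior mean shrinks extreme SMRs in areas with
    small expected counts toward the prior mean alpha/beta."""
    return (alpha + observed) / (beta + expected)
```

With small expected counts the estimate is pulled toward the prior mean, which is what produces the smoother map; with large counts it stays close to the SMR.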
Eisen, Lars; Eisen, Rebecca J.
2018-01-01
The nymphal stage of the blacklegged tick, Ixodes scapularis Say, is considered the primary vector to humans in the eastern United States of the Lyme disease spirochete Borrelia burgdorferi sensu stricto. The abundance of infected host-seeking nymphs is commonly used to estimate the fundamental risk of human exposure to B. burgdorferi, for the purpose of environmental risk assessment and as an outcome measure when evaluating environmentally based tick or pathogen control methods. However, as this tick-based risk measure does not consider the likelihoods of either human encounters with infected ticks or tick bites resulting in pathogen transmission, its linkage to the occurrence of Lyme disease cases is worth evaluating. In this Forum article, we describe different tick-based risk measures, discuss their strengths and weaknesses, and review the evidence for their capacity to predict the occurrence of Lyme disease cases. We conclude that: 1) the linkage between abundance of host-seeking B. burgdorferi-infected nymphs and Lyme disease occurrence is strong at community or county scales but weak at the fine spatial scale of residential properties where most human exposures to infected nymphs occur in the Northeast, 2) the combined use of risk measures based on infected nymphs collected from the environment and ticks collected from humans is preferable to either one of these risk measures used singly when assessing the efficacy of environmentally based tick or pathogen control methods aiming to reduce the risk of human exposure to B. burgdorferi, 3) there is a need for improved risk assessment methodology for residential properties that accounts for both the abundance of infected nymphs and the likelihood of human–tick contact, and 4) we need to better understand how specific human activities conducted in defined residential microhabitats relate to risk for nymphal exposures and bites. PMID:27330093
Henriksen, Eva; Burkow, Tatjana M; Johnsen, Elin; Vognild, Lars K
2013-08-09
Privacy and information security are important for all healthcare services, including home-based services. We have designed and implemented a prototype technology platform for providing home-based healthcare services. It supports a personal electronic health diary and enables secure and reliable communication and interaction with peers and healthcare personnel. The platform runs on a small computer with a dedicated remote control. It is connected to the patient's TV and to a broadband Internet connection. The platform has been tested with home-based rehabilitation and education programmes for chronic obstructive pulmonary disease and diabetes. As part of our work, a risk assessment of privacy and security aspects was performed, to reveal actual risks and to ensure adequate information security in this technical platform. Risk assessment was performed in an iterative manner during the development process. Thus, security solutions have been incorporated into the design from an early stage instead of being included as an add-on to a nearly completed system. We have adapted existing risk management methods to our own environment, thus creating our own method, which conforms to ISO's standard for information security risk management. A total of approximately 50 threats and possible unwanted incidents were identified and analysed. Among the threats to the four information security aspects (confidentiality, integrity, availability, and quality), confidentiality threats were identified as the most serious, with one threat given an unacceptable level of High risk. This is because health-related personal information is regarded as sensitive. Availability threats were analysed as low risk, as the aim of the home programmes is to provide education and rehabilitation services, not acute care or continuous health monitoring. Most of the identified threats are applicable to healthcare services intended for patients or citizens in their own homes.
Confidentiality risks in the home differ from those in a more controlled environment such as a hospital, and electronic equipment located in private homes and communicating via the Internet is more exposed to unauthorised access. By implementing the proposed measures, it has been possible to design a home-based service that ensures the necessary level of information security and privacy.
NASA Astrophysics Data System (ADS)
Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia
2015-12-01
The transit route choice model is a key technology of public transit system planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT) can be used to describe travelers' decision-making process under uncertainty of transit supply and the risk preferences of multiple types of travelers. The method used to calibrate the reference point, a key parameter of a CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method is put forward to obtain the value of the reference point, combining theoretical calculation with field investigation results. A comparison with the traditional method, based on a transit trip investigation from Nanjing City, China, shows that the new method improves the quality of the CPT-based model by more accurately simulating travelers' route choice behavior. The proposed method is of great significance to sound transit planning and management, and to some extent remedies the shortcoming that the reference point has previously been obtained solely through qualitative analysis.
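A minimal sketch of CPT valuation of transit routes relative to a reference point. The parameter values are the commonly cited Tversky-Kahneman estimates, not the paper's calibration, the routes are invented, and cumulative probability weighting is omitted for brevity.

```python
# Prospect-theory valuation of route travel times relative to a reference point.
alpha, beta_, lam = 0.88, 0.88, 2.25   # illustrative CPT parameters

def value(x):
    # Gains (x >= 0: arriving earlier than the reference time) are valued
    # concavely; losses are valued convexly and magnified by loss aversion lam.
    return x ** alpha if x >= 0 else -lam * (-x) ** beta_

def prospect(outcomes, reference):
    # outcomes: list of (travel_time_minutes, probability) for one route.
    # Probability-weighted sum of values of deviations from the reference time.
    return sum(p * value(reference - t) for t, p in outcomes)

route_a = [(30, 0.8), (50, 0.2)]   # usually fast, occasionally very late
route_b = [(36, 1.0)]              # always 36 minutes
ref = 35                           # reference travel time in minutes
print(prospect(route_a, ref) > prospect(route_b, ref))
```

With these parameters the risky-but-usually-faster route is preferred, illustrating how the choice prediction hinges on where the reference point is set.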
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2007-01-01
A formal method is described to quantify structural reliability and risk in the presence of a multitude of uncertainties. The method starts at the material behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale, where reliability and risk are usually specified. A sample case is described to illustrate the effectiveness, versatility, and maturity of the method. Typical results demonstrate that the method is mature and can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantage in world markets. The results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
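The propagation step described above can be illustrated with a plain Monte Carlo simulation. The limit state (a bar fails when stress exceeds strength) and the distributions of the primitive variables below are assumptions for the sketch, not values from the report.

```python
import numpy as np

# Monte Carlo propagation of primitive-variable scatter to the structural scale.
rng = np.random.default_rng(42)
n = 200_000
strength = rng.normal(450.0, 30.0, n)        # MPa, material-level scatter
load     = rng.normal(60_000.0, 5_000.0, n)  # N, applied-load scatter
area     = rng.normal(200.0, 8.0, n)         # mm^2, fabrication scatter

stress = load / area                    # MPa, structural response per sample
p_failure = np.mean(stress > strength)  # probability of exceeding strength
print(p_failure)
```

Each sample draws one realization of every primitive variable, so the estimated failure probability reflects all scatters simultaneously, which is the essence of the propagation the abstract describes.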
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Houston, Ebony; Peterson, James; Kuo, Irene; Magnus, Manya
2016-01-01
Purpose: To develop optimal methods to study sexual health among black young men who have sex with men and transgender women (BYMSM/TW). Methods: We conducted a mixed-methods prospective study to identify recruitment and retention strategies for BYMSM/TW (age 16–21) in Washington D.C., and to describe HIV risk behaviors and context. Results: Incentivized peer referral was highly productive, and 60% of BYMSM/TW were retained for 3 months. Participants reported high levels of sexual risk, homophobia, racism, and maternal support. Conclusion: BYMSM/TW studies should utilize a combination of peer-based, in-person, and technology-based recruiting strategies. Additional research is needed to leverage mobile technology and social media to enhance retention. PMID:26651365
Bitar, A; Maghrabi, M; Doubal, A W
2013-12-01
Two methods for determining internal dose due to (131)I intake during the preparation and handling of iodine radiopharmaceutical products have been compared. The first method was based on the measurement of (131)I in 24-hour urine samples, while the second was based on in vivo measurement of (131)I in the thyroid. The results showed that the urine analysis method can be used as a screening test but not for internal dose assessment of exposed workers. The thyroid monitoring method was found to be more reliable and accurate for assessing internal dose from (131)I intake. In addition, the assessed internal dose showed that the annual internal effective dose for some workers was below 1 mSv, with no risk classification, whereas the results for another group of workers were between 1 and 6 mSv, with low risk classification. Only one worker reached 7.66 mSv, with high risk classification; this worker must be monitored individually. © 2013 Elsevier Ltd. All rights reserved.
Indicators of economic security of the region: a risk-based approach to assessing and rating
NASA Astrophysics Data System (ADS)
Karanina, Elena; Loginov, Dmitri
2017-10-01
The article presents the results of research of theoretical and methodical problems of strategy development for economic security of a particular region, justified by the composition of risk factors. The analysis of those risk factors is performed. The threshold values of indicators of economic security of regions were determined using the methods of socioeconomic statistics. The authors concluded that in modern Russian conditions it is necessary to pay great attention to the analysis of the composition and level of indicators of economic security of the region and, based on the materials of this analysis, to formulate more accurate decisions concerning the strategy of socio-economic development.
Risk evaluation of highway engineering project based on the fuzzy-AHP
NASA Astrophysics Data System (ADS)
Yang, Qian; Wei, Yajun
2011-10-01
Engineering projects are social activities that integrate technology, economy, management and organization. There are uncertainties in every aspect of engineering projects, and risk management urgently needs to be strengthened. Based on an analysis of the characteristics of highway engineering and a study of the basic theory of risk evaluation, the paper builds an index system for highway project risk evaluation. In addition, based on fuzzy mathematics principles, the analytic hierarchy process was applied, and as a result a comprehensive fuzzy-AHP appraisal model was set up for the risk evaluation of expressway concession projects. The validity and practicability of the risk evaluation of an expressway concession project were verified when the model was applied in practice.
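A sketch of the fuzzy-AHP combination the abstract describes, with a hypothetical three-criterion risk index; the criteria names, pairwise judgments and fuzzy membership values are all invented for illustration.

```python
# AHP step: pairwise comparison matrix a[i][j] = importance of criterion i over j
# for a hypothetical highway-project risk index (construction/financial/policy).
a = [
    [1.0, 3.0, 5.0],   # construction risk
    [1 / 3, 1.0, 2.0],  # financial risk
    [1 / 5, 1 / 2, 1.0],  # policy risk
]

# Approximate the principal eigenvector by normalized geometric row means.
geo = [(row[0] * row[1] * row[2]) ** (1 / 3) for row in a]
weights = [g / sum(geo) for g in geo]
print([round(w, 3) for w in weights])

# Fuzzy step: aggregate each criterion's membership over risk grades
# (low/medium/high) with the AHP weights: grade_j = sum_i w_i * m_ij.
m = [
    [0.2, 0.5, 0.3],   # construction: membership in low/medium/high
    [0.5, 0.3, 0.2],   # financial
    [0.6, 0.3, 0.1],   # policy
]
grades = [sum(weights[i] * m[i][j] for i in range(3)) for j in range(3)]
print(max(range(3), key=lambda j: grades[j]))  # index of the dominant risk grade
```

The geometric-mean approximation avoids an eigenvalue solver; a full implementation would also compute the consistency ratio of the judgment matrix before trusting the weights.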
Eliasson, Kristina; Palm, Peter; Nyman, Teresia; Forsman, Mikael
2017-07-01
A common way to conduct practical risk assessments is to observe a job and report the observed long-term risks for musculoskeletal disorders. The aim of this study was to evaluate the inter- and intra-observer reliability of ergonomists' risk assessments made without the support of an explicit risk assessment method. Twenty-one experienced ergonomists assessed the risk level (low, moderate, high risk) of eight upper body regions, as well as the global risk, of 10 video-recorded work tasks. Intra-observer reliability was assessed by having nine of the ergonomists repeat the procedure at least three weeks after the first assessment. The ergonomists made their risk assessments based on their experience and knowledge. The statistical parameters of reliability included percentage agreement, kappa, linearly weighted kappa, intraclass correlation and Kendall's coefficient of concordance. The average inter-observer agreement on the global risk was 53% and the corresponding weighted kappa (Kw) was 0.32, indicating fair reliability. The intra-observer agreement was 61% and 0.41 (Kw). This study indicates that risk assessments of the upper body, made without an explicit observational method, have non-acceptable reliability. It is therefore recommended to use systematic risk assessment methods to a higher degree. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
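Linearly weighted kappa, one of the reliability statistics used in the study, can be computed directly. The ratings below are illustrative, not the study's data.

```python
# Linearly weighted kappa for two observers rating tasks on an ordinal scale
# (0 = low, 1 = moderate, 2 = high risk).
def weighted_kappa(r1, r2, k=3):
    n = len(r1)
    # Linear weights: full credit on the diagonal, partial for near misses.
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    # Joint distribution of the two raters' categories.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1 / n
    m1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]  # rater-1 marginals
    m2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater-2 marginals
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * m1[i] * m2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

rater1 = [0, 1, 2, 2, 1, 0, 2, 1, 1, 2]
rater2 = [0, 1, 2, 1, 1, 0, 2, 2, 1, 2]
print(round(weighted_kappa(rater1, rater2), 2))
```

The linear weights give partial credit when the two ergonomists differ by one category, which is why weighted kappa is preferred over plain kappa for ordinal risk levels.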
Stärk, Katharina DC; Regula, Gertraud; Hernandez, Jorge; Knopf, Lea; Fuchs, Klemens; Morris, Roger S; Davies, Peter
2006-01-01
Background Emerging animal and zoonotic diseases and increasing international trade have resulted in an increased demand for veterinary surveillance systems. However, human and financial resources available to support government veterinary services are becoming more and more limited in many countries world-wide. Intuitively, issues that present higher risks merit higher priority for surveillance resources as investments will yield higher benefit-cost ratios. The rapid rate of acceptance of this core concept of risk-based surveillance has outpaced the development of its theoretical and practical bases. Discussion The principal objectives of risk-based veterinary surveillance are to identify surveillance needs to protect the health of livestock and consumers, to set priorities, and to allocate resources effectively and efficiently. An important goal is to achieve a higher benefit-cost ratio with existing or reduced resources. We propose to define risk-based surveillance systems as those that apply risk assessment methods in different steps of traditional surveillance design for early detection and management of diseases or hazards. In risk-based designs, public health, economic and trade consequences of diseases play an important role in the selection of diseases or hazards. Furthermore, certain strata of the population of interest have a higher probability of being sampled for detection of diseases or hazards. Evaluation of risk-based surveillance systems shall prove that the efficacy of risk-based systems is equal to or higher than that of traditional systems; however, the efficiency (benefit-cost ratio) shall be higher in risk-based surveillance systems. Summary Risk-based surveillance considerations are useful to support both strategic and operational decision making. This article highlights applications of risk-based surveillance systems in the veterinary field including food safety.
Examples are provided for risk-based hazard selection, risk-based selection of sampling strata as well as sample size calculation based on risk considerations. PMID:16507106
Sexton, Ken
2012-01-01
Systematic evaluation of cumulative health risks from the combined effects of multiple environmental stressors is becoming a vital component of risk-based decisions aimed at protecting human populations and communities. This article briefly examines the historical development of cumulative risk assessment as an analytical tool, and discusses current approaches for evaluating cumulative health effects from exposure to both chemical mixtures and combinations of chemical and nonchemical stressors. A comparison of stressor-based and effects-based assessment methods is presented, and the potential value of focusing on viable risk management options to limit the scope of cumulative evaluations is discussed. The ultimate goal of cumulative risk assessment is to provide answers to decision-relevant questions based on organized scientific analysis; even if the answers, at least for the time being, are inexact and uncertain. PMID:22470298
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco
Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work through the application of a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
NASA Astrophysics Data System (ADS)
Qiu, Yuchen; Wang, Yunzhi; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Liu, Hong; Zheng, Bin
2016-03-01
In order to establish a new personalized breast cancer screening paradigm, it is critically important to accurately predict the short-term risk of a woman having image-detectable cancer after a negative mammographic screening. In this study, we developed and tested a novel short-term risk assessment model based on a deep learning method. For the experiment, a total of 270 "prior" negative screening cases were assembled. In the next sequential ("current") screening mammography, 135 cases were positive and 135 cases remained negative. These cases were randomly divided into a training set of 200 cases and a testing set of 70 cases. A deep learning based computer-aided diagnosis (CAD) scheme was then developed for the risk assessment, which consists of two modules: an adaptive feature identification module and a risk prediction module. The adaptive feature identification module is composed of three pairs of convolution-max-pooling layers, which contain 20, 10, and 5 feature maps respectively. The risk prediction module is implemented by a multilayer perceptron (MLP) classifier, which produces a risk score to predict the likelihood of the woman developing short-term mammography-detectable cancer. The results show that the new CAD-based risk model yielded a positive predictive value of 69.2% and a negative predictive value of 74.2%, with a total prediction accuracy of 71.4%. This study demonstrated that applying this new deep learning technology may have significant potential for developing a new short-term risk prediction scheme with improved performance in detecting early abnormal symptoms from negative mammograms.
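A toy forward pass of the risk prediction module only; the convolution-max-pooling stack is omitted, and the layer sizes, random weights and inputs below are made up rather than taken from the trained CAD scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Pretend the adaptive feature-identification module emitted 5 pooled feature
# values per mammogram; here a batch of 4 cases.
features = rng.normal(size=(4, 5))

# MLP risk-prediction module: one hidden layer of 8 units, then a single
# sigmoid output unit producing a risk score in (0, 1).
W1, b1 = rng.normal(scale=0.5, size=(5, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)

hidden = np.tanh(features @ W1 + b1)
risk_scores = sigmoid(hidden @ W2 + b2).ravel()  # likelihood of short-term cancer
print(risk_scores.shape)
```

In the actual scheme these weights would be learned end-to-end together with the convolutional feature extractor, and the score thresholded to yield the reported positive/negative predictive values.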
Code of Federal Regulations, 2010 CFR
2010-04-01
... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...
Code of Federal Regulations, 2014 CFR
2014-04-01
... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...
Code of Federal Regulations, 2013 CFR
2013-04-01
... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...
Code of Federal Regulations, 2012 CFR
2012-04-01
... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...
Code of Federal Regulations, 2011 CFR
2011-04-01
... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...
Failure mode and effects analysis: a comparison of two common risk prioritisation methods.
McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L
2016-05-01
Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified method ('high' risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low' risk, 30 as 'medium' and 22 as 'high' risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as 'high' risk by the simplified method (92% agreement). The top 20% of CIs (≥60) included 12 failures, of which six were designated as 'high' risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited.
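The two ranking schemes can be contrasted on hypothetical handoff failures. The failure names and scores are invented, the RPN ≥ 300 criticality cutoff follows the abstract, and the simplified rule below is an invented stand-in for the paper's direct high/medium/low designation process.

```python
# Each hypothetical failure is scored 1-10 for severity S, occurrence O and
# detectability D.
failures = {
    "monitor not reconnected":         (9, 7, 8),
    "medication list not transferred": (8, 5, 9),
    "report page missing":             (4, 6, 3),
}

def rpn(s, o, d):
    # Traditional risk priority number, range 1-1000.
    return s * o * d

def simplified(s, o, d):
    # Invented stand-in: a coarse judgment from severity and occurrence only.
    if s >= 7 and o >= 5:
        return "high"
    if s >= 4 or o >= 4:
        return "medium"
    return "low"

# Congruence check as in the abstract: failures critical by RPN >= 300
# versus failures designated 'high' by the simplified rule.
critical_trad = {n for n, (s, o, d) in failures.items() if rpn(s, o, d) >= 300}
critical_simp = {n for n, (s, o, d) in failures.items() if simplified(s, o, d) == "high"}
agreement = len(critical_trad & critical_simp) / len(critical_trad)
print(agreement)
```

On this tiny example the two schemes agree perfectly; the study's point is that agreement was high (92%) against the RPN cutoff but much lower (50%) against the criticality-index cutoff.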
For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-22
... secured borrowers within each year), the coefficients of variation of the time series of annual default... the method you use, please do not submit your comment multiple times via different methods. You may... component to directly recognize the credit risk on such loans.\\4\\ At the time of the Farm Bill's enactment...
White Paper: A Defect Prioritization Method Based on the Risk Priority Number
2013-11-01
The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called the Risk Priority Number (RPN) to quantify the risk of each defect. [Table 1, "Time Scaling Factors", survives only in fragments, e.g.: "Up to an hour", 16-60, factor 1.5; "Brief Interrupt", 0-15, factor 1.] In the FMEA formulation, the RPN is a product of the three categories.
Parent-Reported Predictors of Adolescent Panic Attacks.
ERIC Educational Resources Information Center
Hayward, Chris; Wilson, Kimberly A.; Lagle, Kristy; Killen, Joel D.; Taylor, C. Barr
2004-01-01
Objective: To identify parent-reported risk factors for adolescent panic attacks. Method: Structured diagnostic interviews were obtained from 770 parents of participants in a school-based risk factor study for adolescent panic. Parent-reported risk factors assessed included characteristics of the child (negative affect, separation anxiety disorder…
IMPROVED RISK ASSESSMENT AND REMEDIATION OF SOIL METALS BASED ON BIOAVAILABILITY MEASUREMENTS
Heavy metals in soils can pose risks through plant uptake or soil ingestion. Recent research results and progress in understanding risks and methods for soil metal remediation will be presented. Beneficial use of composts/biosolids plus limestone to remediate metal-killed e...
NASA Astrophysics Data System (ADS)
Debnath, Ashim Kumar; Chin, Hoong Chor
Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.
A time series modeling approach in risk appraisal of violent and sexual recidivism.
Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E
2010-10-01
For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information into risk assessment and management of violent offenders.
Contraceptive use and risk of unintended pregnancy in California.
Foster, Diana G; Bley, Julia; Mikanda, John; Induni, Marta; Arons, Abigail; Baumrind, Nikki; Darney, Philip D; Stewart, Felicia
2004-07-01
California is home to more than one out of eight American women of reproductive age. Because California has a large, diverse and growing population, national statistics do not necessarily describe the reproductive health of California women. This article presents the risk of pregnancy and sexually transmitted infections among women in California based on the California Women's Health Survey. The more than 8900 women of reproductive age who participated in this survey between 1998 and 2001 provide estimates of access to care and use of family-planning methods in the state. We find that 49% of the female population aged 18-44 in California is at risk of unintended pregnancy. Nine percent of women at risk of an unintended pregnancy are not using any method of contraception, primarily for method-related reasons, such as concern about side effects or a dislike of available contraceptive methods. Among women at risk of unintended pregnancy, we find disparities by race/ethnicity and education in the use of contraceptive methods.
Risk management of key issues of FPSO
NASA Astrophysics Data System (ADS)
Sun, Liping; Sun, Hai
2012-12-01
Risk analysis of key systems has become a growing topic of late because of the development of offshore structures. Equipment failures of the offloading system and fire accidents were analyzed based on the features of the floating production, storage and offloading (FPSO) vessel. Fault tree analysis (FTA) and failure modes and effects analysis (FMEA) methods were examined based on the modules of Relex Reliability Studio (RRS). Given the shortage of failure cases and statistical data, equipment failures were also analyzed qualitatively by establishing a fault tree and a Boolean structure function, and risk control measures were examined. Failure modes of fire accidents were classified according to the areas of fire occurrence during the FMEA process, using risk priority number (RPN) methods to evaluate their severity rank. The qualitative FTA gave basic insight into the formation of FPSO offloading failure modes, and the fire FMEA gave priorities and suggested processes. The research has practical importance for the security analysis of FPSO.
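The Boolean structure function idea can be sketched for a toy offloading fault tree; the events, the tree topology and the probabilities below are hypothetical, not from the study.

```python
# Toy FPSO offloading fault tree:
# TOP = pump failure OR (valve failure AND backup-sensor failure).
p_pump, p_valve, p_sensor = 0.02, 0.05, 0.10  # illustrative basic-event probabilities

def top_event(pump, valve, sensor):
    # Boolean structure function: True if the offloading system fails.
    return pump or (valve and sensor)

# Exact top-event probability for independent basic events:
# P(TOP) = 1 - (1 - p_pump) * (1 - p_valve * p_sensor)
p_top = 1 - (1 - p_pump) * (1 - p_valve * p_sensor)
print(round(p_top, 5))
```

Writing the tree as a structure function makes the minimal cut sets explicit ({pump} and {valve, sensor}), which is the qualitative insight the FTA step provides even when failure statistics are scarce.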
The prefabricated building risk decision research of DM technology on the basis of Rough Set
NASA Astrophysics Data System (ADS)
Guo, Z. L.; Zhang, W. B.; Ma, L. H.
2017-08-01
With resource crises and increasingly serious pollution, green building has been strongly advocated by many countries and has become a new building style in the construction field. Compared with traditional building, prefabricated building has its own irreplaceable advantages but is influenced by many uncertainties. So far, most scholars worldwide have relied on qualitative research. This paper expounds the significance of prefabricated building. Building on existing research methods combined with rough set theory, it redefines the factors that affect prefabricated building risk. Moreover, it quantifies the risk factors and establishes an expert knowledge base through assessment, then reduces redundant attributes and attribute values among the risk factors to form the simplest decision rules. These decision rules, based on the data mining (DM) technology of rough set theory, provide prefabricated building with a controllable new decision-making method.
From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment
NASA Astrophysics Data System (ADS)
Klose, M.; Damm, B.
2014-12-01
The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk from an integrated perspective is to analyze what types of landslide damage affected people and property in which way, and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms, as well as nonlinearity between landslide magnitude, damage intensity, and direct costs, are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). The two approaches can complement each other, but a combination of them has not yet been realized. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective.
Fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, wherefore analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.
Olver, Mark E; Beggs Christofferson, Sarah M; Wong, Stephen C P
2015-02-01
We examined the use of the clinically significant change (CSC) method with the Violence Risk Scale-Sexual Offender version (VRS-SO), and its implications for risk communication, in a combined sample of 945 treated sexual offenders from three international settings, followed up for a minimum of 5 years post-release. The reliable change (RC) index was used to identify thresholds of clinically meaningful change and to create four CSC groups (already okay, recovered, improved, unchanged) based on VRS-SO dynamic scores and the amount of change made. Outcome analyses demonstrated important CSC-group differences in 5-year rates of sexual and violent recidivism. However, when baseline risk was controlled via Cox regression survival analysis, the pattern and magnitude of CSC-group differences in sexual and violent recidivism changed, suggesting that the observed variation in recidivism base rates could be at least partly explained by pre-existing group differences in risk level. Implications for the communication of risk-change information and applications to clinical practice are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
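The reliable change index mentioned above is conventionally computed with the Jacobson-Truax formula: the pre-post difference scaled by the standard error of the difference score. The sketch below assumes that formula; the pre/post scores, baseline SD and reliability are made up for illustration, not VRS-SO values from the study:

```python
import math

def reliable_change(pre, post, sd_pre, reliability):
    """Jacobson-Truax reliable change index: change in score divided by
    the standard error of the difference score."""
    se_measure = sd_pre * math.sqrt(1.0 - reliability)  # SE of measurement
    s_diff = math.sqrt(2.0) * se_measure                # SE of difference
    return (post - pre) / s_diff

# |RC| > 1.96 is conventionally read as change unlikely (p < .05)
# to be due to measurement error alone. Numbers are hypothetical.
rc = reliable_change(pre=20.0, post=14.0, sd_pre=5.0, reliability=0.9)
print(round(rc, 2))  # -2.68, a reliable decrease
```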
Application of Risk-Based Inspection method for gas compressor station
NASA Astrophysics Data System (ADS)
Zhang, Meng; Liang, Wei; Qiu, Zeyang; Lin, Yang
2017-05-01
Because of its complex process and large amount of equipment, a gas compressor station carries many risks, and research on the integrity management of gas compressor stations is at present insufficient. In this paper, the basic principle of Risk-Based Inspection (RBI) and the RBI methodology are studied, and an RBI process for the gas compressor station is developed. The corrosion loops and logistics loops of the gas compressor station are determined through study of its corrosion mechanisms and process flow. The probability of failure is calculated using modified coefficients, and the consequence of failure is calculated by a quantitative method. In particular, we address the application of an RBI methodology in a gas compressor station. The resulting risk ranking helps find the best preventive inspection plan in the case study.
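The risk-ranking step of RBI combines a probability-of-failure (PoF) category with a consequence-of-failure (CoF) category on a risk matrix. A minimal sketch, with illustrative category labels and thresholds that are not the values of any RBI standard or of this paper:

```python
# Hypothetical RBI risk matrix: PoF categories 1-5 (5 = most likely to
# fail) and CoF categories A-E (E = most severe consequence).
POF_LEVELS = ["1", "2", "3", "4", "5"]
COF_LEVELS = ["A", "B", "C", "D", "E"]

def risk_category(pof, cof):
    """Map a (PoF, CoF) pair to Low/Medium/High via a simple combined score."""
    score = POF_LEVELS.index(pof) + COF_LEVELS.index(cof)  # 0..8
    if score <= 2:
        return "Low"
    if score <= 5:
        return "Medium"
    return "High"

print(risk_category("4", "D"))  # High
```

High-risk items would then be inspected first and most thoroughly.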
Le Vu, Stéphane; Ratmann, Oliver; Delpech, Valerie; Brown, Alison E; Gill, O Noel; Tostevin, Anna; Fraser, Christophe; Volz, Erik M
2018-06-01
Phylogenetic clustering of HIV sequences from a random sample of patients can reveal epidemiological transmission patterns, but interpretation is hampered by limited theoretical support, and the statistical properties of clustering analysis remain poorly understood. Alternatively, source attribution methods allow fitting of HIV transmission models and thereby quantify aspects of disease transmission. A simulation study was conducted to assess the error rates of clustering methods for detecting transmission risk factors. We modeled HIV epidemics among men who have sex with men and generated phylogenies comparable to those that can be obtained from HIV surveillance data in the UK. Clustering and source attribution approaches were applied to evaluate their ability to identify patient attributes as transmission risk factors. We find that commonly used methods show a misleading association between cluster size or odds of clustering and covariates that are correlated with time since infection, regardless of their influence on transmission. Clustering methods usually have higher error rates and lower sensitivity than source attribution methods for identifying transmission risk factors, but neither method provides robust estimates of transmission risk ratios. Source attribution can alleviate the drawbacks of phylogenetic clustering, but formal population genetic modeling may be required to estimate quantitative transmission risk factors. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim
2017-10-01
A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in multi-model socio-ecological risk assessment is formally defined, and a method for utilizing observation, measurement and modeling data within the multi-model approach is described. The methodology and models for risk assessment in a decision support framework are defined and described. A method of water quality assessment using satellite observation data, based on analysis of the spectral reflectance of aquifers, is presented. Spectral signatures of freshwater bodies and offshore waters are analyzed, and correlations between spectral reflectance, pollution and selected water quality parameters are quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 were utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support on water quality degradation risk is discussed: the decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters, drawing on satellite observations, field measurements and modeling within the proposed framework. It is shown that this algorithm allows estimation of water quality degradation rates and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
Seismic Hazard Analysis — Quo vadis?
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2008-05-01
The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications such as the design of critical and general (non-critical) civil infrastructures, and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies that limit their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. They result in an inability to correctly treat dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, the results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not improved the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. The assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, yet the current method is based on a probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to too conservative results, especially if applied for generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
Caries risk assessment in schoolchildren - a form based on Cariogram® software
CABRAL, Renata Nunes; HILGERT, Leandro Augusto; FABER, Jorge; LEAL, Soraya Coelho
2014-01-01
Identifying caries risk factors is an important measure which contributes to a better understanding of the cariogenic profile of the patient. The Cariogram® software provides this analysis, and protocols simplifying the method have been suggested. Objectives The aim of this study was to determine whether a newly developed Caries Risk Assessment (CRA) form based on the Cariogram® software could classify schoolchildren according to their caries risk, and to evaluate relationships between caries risk and the variables in the form. Material and Methods 150 schoolchildren aged 5 to 7 years old were included in this survey. Caries prevalence was obtained according to the International Caries Detection and Assessment System (ICDAS) II. Information for filling in the form based on the Cariogram® was collected clinically and from questionnaires sent to parents. Linear regression and a forward stepwise multiple regression model were applied to correlate the variables included in the form with caries risk. Results Caries prevalence in the primary dentition was 98.6% when enamel and dentine carious lesions were included, and 77.3% when only dentine lesions were considered. Eighty-six percent of the children were classified as at moderate caries risk. The forward stepwise multiple regression model was significant (R2=0.904; p<0.00001), showing that the most significant factors influencing caries risk were caries experience, oral hygiene, frequency of food consumption, sugar consumption and fluoride sources. Conclusion The use of the form based on the Cariogram® software enabled classification of the schoolchildren at low, moderate and high caries risk. Caries experience, oral hygiene, frequency of food consumption, sugar consumption and fluoride sources are the variables that were shown to be highly correlated with caries risk. PMID:25466473
WRF-based fire risk modelling and evaluation for years 2010 and 2012 in Poland
NASA Astrophysics Data System (ADS)
Stec, Magdalena; Szymanowski, Mariusz; Kryza, Maciej
2016-04-01
Wildfires are one of the main disturbances of forested, seminatural and agricultural ecosystems. They generate significant economic losses, especially in forest management and agriculture, so forest fire risk modelling is essential, e.g. for forestry administration. In August 2015 a new method of forest fire risk forecasting entered into force in Poland. The method predicts the fire risk level on a 4-degree scale (0 - no risk, 3 - highest risk) and consists of a set of linearized regression equations. Meteorological information is used as the predictors in the regression equations: air temperature, relative humidity, average wind speed, cloudiness and rainfall. The equations also include pine litter humidity as a measure of potential fuel characteristics. All these parameters are measured routinely in Poland at 42 basic and 94 auxiliary sites. The fire risk level is estimated for the current day (based on morning measurements) or the next day (based on midday measurements). The entire country is divided into 42 prognostic zones, and the fire risk level for each zone is taken from the closest measuring site. The first goal of this work is to assess whether the measurements needed for fire risk forecasting may be replaced by data from a mesoscale meteorological model. Additionally, the use of a meteorological model would allow the fire risk level to reflect a much more realistic spatial differentiation of the relevant weather elements instead of discrete point measurements. Meteorological data have been calculated using the Weather Research and Forecasting (WRF) model. For the purpose of this study the WRF model is run in reanalysis mode, allowing all required meteorological data to be estimated on a 5-kilometre grid. The only parameter that cannot be directly calculated using WRF is the litter humidity, which has been estimated using the empirical formula developed by Sakowska (2007). The experiments are carried out for two selected years: 2010 and 2012.
The year 2010 was characterized by the smallest number of wildfires and the smallest burnt area, whereas 2012 had the largest number of fires and the largest burnt area. The data on the time, location, scale and causes of individual wildfire occurrences in the given years are taken from the National Forest Fire Information System (KSIPL), administered by the Forest Fire Protection Department of the Polish Forest Research Institute. The database is part of the European Forest Fire Information System (EFFIS). Based on these data and on the WRF-based fire risk modelling, we pursue the second goal of the study: the evaluation of the forecasted fire risk against the occurrence of wildfires. Special attention is paid to the number, timing and spatial distribution of wildfires that occurred when the predicted fire risk was low. The results obtained reveal the effectiveness of the new forecasting method, and our investigation leads to the conclusion that some adjustments could improve the efficiency of the fire-risk estimation method.
Train integrity detection risk analysis based on PRISM
NASA Astrophysics Data System (ADS)
Wen, Yuan
2018-04-01
The GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
Zhou, Yanju; Chen, Qian; Chen, Xiaohong; Wang, Zongrun
2014-01-01
This paper considers a decentralized supply chain in which a single supplier sells a perishable product to a single retailer facing uncertain demand. We assume that the supplier and the retailer are both risk averse and use Conditional Value at Risk (CVaR), a risk measure popular in financial risk management, to capture their risk attitudes. We establish a buyback policy model based on Stackelberg game theory that accounts for the supply chain members' risk preferences, and we derive expressions for the supplier's optimal repurchase price and the retailer's optimal order quantity, which are compared with those of the risk-neutral case. Finally, a numerical example is used to simulate the model and verify the related conclusions. PMID:25247605
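The CVaR risk measure used in the model above has a simple empirical form: the mean of the worst (1 - alpha) fraction of losses. A minimal sketch with made-up loss data:

```python
# Empirical CVaR (expected shortfall): average of the losses at or
# beyond the alpha-quantile (VaR). Loss data below is illustrative.
def cvar(losses, alpha=0.95):
    """Mean of the worst (1 - alpha) fraction of a loss sample."""
    ordered = sorted(losses)
    k = int(len(ordered) * alpha)   # index of the empirical VaR quantile
    tail = ordered[k:]              # losses at or beyond VaR
    return sum(tail) / len(tail)

losses = list(range(1, 101))        # synthetic losses 1, 2, ..., 100
print(cvar(losses, alpha=0.95))     # mean of {96,...,100} = 98.0
```

Because CVaR averages over the whole tail rather than reading off a single quantile, it is a coherent risk measure and is convenient for optimizing order quantities under risk aversion.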
Comparison of risk assessment procedures used in OCRA and ULRA methods
Roman-Liu, Danuta; Groborz, Anna; Tokarski, Tomasz
2013-01-01
The aim of this study was to analyse the convergence of two methods by comparing exposure and the assessed risk of developing musculoskeletal disorders at 18 repetitive task workstations. The already established occupational repetitive actions (OCRA) method and the recently developed upper limb risk assessment (ULRA) produce correlated results (R = 0.84, p = 0.0001). A discussion of the factors that influence the values of the OCRA index and ULRA's repetitive task indicator shows that both similarities and differences in the results produced by the two methods can arise from the concepts that underlie them. The assessment procedure and the mathematical calculations applied to the basic parameters are crucial to the results of risk assessment; the way the basic parameters are defined influences the assessment of exposure and risk to a lesser degree. The analysis also showed that large differences in load indicator values do not always result in different risk zones. Practitioner Summary: We focused on comparing methods that, even though based on different concepts, serve the same purpose. The results proved that different methods with different assumptions can produce similar assessments of upper limb load; sharp criteria in risk assessment are not the best solution. PMID:24041375
NASA Astrophysics Data System (ADS)
Mardi Safitri, Dian; Arfi Nabila, Zahra; Azmi, Nora
2018-03-01
Musculoskeletal disorders (MSDs) are an ergonomic risk arising from manual activity, non-neutral posture and repetitive motion. The purpose of this study is to measure risk and implement ergonomic interventions to reduce the risk of MSDs at the paper pallet assembly work station. Work posture was measured with the Ovako Working posture Analysis System (OWAS) and the Rapid Entire Body Assessment (REBA) method, while work repetitiveness was measured with the Strain Index (SI) method. The assembly process operators were identified as having the highest risk level, with OWAS, Strain Index and REBA scores of 4, 20.25 and 11, respectively. Ergonomic improvements are needed to reduce that level of risk. Proposed improvements were developed using the Quality Function Deployment (QFD) method applied with the Axiomatic House of Quality (AHOQ) and a morphological chart. As a result, the risk levels based on the OWAS and REBA scores fell from 4 and 11 to 1 and 2, respectively. Biomechanical analysis of the operator also shows decreased values for the L4-L5 moment, compression, joint shear, and joint moment strength.
A Contextualized Approach to Faith-Based HIV Risk Reduction for African American Women.
Stewart, Jennifer M; Rogers, Christopher K; Bellinger, Dawn; Thompson, Keitra
2016-07-01
HIV/AIDS has a devastating impact on African Americans, particularly women and young adults. We sought to characterize risks, barriers, and content and delivery needs for a faith-based intervention to reduce HIV risk among African American women ages 18 to 25. In a convergent parallel mixed methods study, we conducted four focus groups (n = 38) and surveyed 71 young adult women. Data were collected across four African American churches for a total of 109 participants. We found the majority of women in this sample were engaged in behaviors that put them at risk for contracting HIV, struggled with religiously based barriers and matters of sexuality, and had a desire to incorporate their intimate relationships, parenting, and financial burdens into faith-based HIV risk-reduction interventions. Incorporating additional social context-related factors into HIV risk-reduction interventions for young African American women is critical to adapting and developing HIV interventions to reduce risk among young adult women in faith settings. © The Author(s) 2016.
The network approach and interventions to prevent HIV among injection drug users.
Neaigus, A
1998-01-01
OBJECTIVE: To review human immunodeficiency virus (HIV) risk reduction interventions among injecting drug users (IDUs) that have adopted a network approach. METHODS: The design and outcomes of selected network-based interventions among IDUs are reviewed using the network concepts of the dyad (two-person relationship), the personal risk network (an index person and all of his or her relationships), and the "sociometric" network (the complete set of relations between people in a population) and community. RESULTS: In a dyad intervention among HIV-serodiscordant couples, many of which included IDUs, there were no HIV seroconversions. Participants in personal risk network interventions were more likely to reduce drug risks and in some of these interventions, sexual risks, than were participants in individual-based interventions. Sociometric network interventions reached more IDUs and may be more cost-effective than individual-based interventions. CONCLUSION: Network-based HIV risk reduction interventions among IDUs, and others at risk for HIV, hold promise and should be encouraged. PMID:9722819
Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh
2017-03-01
Rapid population growth of major urban centres in many developing countries has created massive landfills with extraordinary heights and steep side-slopes, which are frequently surrounded by illegal low-income residential settlements developed too close to landfills. These extraordinary landfills are facing high risks of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario is then used to analyse the probability of slope failure and the resulting run-out length to calculate the potential risk of fatalities. In comparison with existing methods, which are solely based on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.
Prevalence and Predictors of Sexual Risks Among Homeless Youth
ERIC Educational Resources Information Center
Halcon, Linda L.; Lifson, Alan R.
2004-01-01
This study examined prevalence of sexual risks among homeless adolescents and described factors associated with those risks. Community-based outreach methods were used successfully to access this difficult-to-reach population. The sample included 203 homeless youth aged 15-22 recruited from community sites. Questionnaire items addressed…
College Students' Perceived Disease Risk versus Actual Prevalence Rates
ERIC Educational Resources Information Center
Smith, Matthew Lee; Dickerson, Justin B.; Sosa, Erica T.; McKyer, E. Lisako J.; Ory, Marcia G.
2012-01-01
Objective: To compare college students' perceived disease risk with disease prevalence rates. Methods: Data were analyzed from 625 college students collected with an Internet-based survey. Paired t-tests were used to separately compare participants' perceived 10-year and lifetime disease risk for 4 diseases: heart disease, cancer, diabetes, and…
NASA Astrophysics Data System (ADS)
DELİCE, Yavuz
2015-04-01
Highways, in both urban and intercity locations, are generally prone to many kinds of natural disaster risk. Natural hazards and disasters that may occur from the highway design stage through construction and operation, and later during the implementation of highway maintenance and repair, have to be taken into consideration. The assessment of the risks of such adverse events is very important in terms of project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways. In addition, the assets at risk and the impacts of the events must be examined and rated in their own right. Through these activities, the intended improvements against natural hazards and disasters are made with the Failure Mode and Effects Analysis (FMEA) method, and their effects are analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them according to failure rate and impact, and finding the most economical and effective solution. Besides guiding the measures taken for the identified risks, this analysis method can also provide public institutions with information about the nature of these risks when required, so that the necessary measures can be taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in the risk assessments. The most important of these dangers can be listed as follows: • Natural disasters: 1. meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.). • Human-originated disasters: 1. transport accidents (traffic accidents) originating from road surface defects (icing, signaling-related malfunctions and risks), fire or explosion, etc.
In this study, the FMEA method was used to perform a risk analysis of urban and intercity motorways against natural disasters and hazards, and solutions to these risks were proposed. Keywords: Failure Mode and Effects Analysis (FMEA), Pareto Analysis (PA), Highways, Risk Management.
Risk-based corrective action and brownfields restorations. Geotechnical special publication No. 82
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benson, C.H.; Meegoda, J.N.; Gilbert, R.G.
Risk-based corrective action (RBCA) and brownfields restoration now play a significant role in contaminated site remediation. RBCA provides the necessary framework for balancing health and environmental risks with costs while targeting the ultimate objective of sensible remediation. Brownfields restoration is a reasonable, economical approach for remediating contaminated land intended for industrial use. This book describes the tools and methods employed in RBCA, and provides illustrative examples through case histories with emphasis on brownfields restorations.
Research on the Application of Risk-based Inspection for the Boiler System in Power Plant
NASA Astrophysics Data System (ADS)
Li, Henan
2017-12-01
A power plant boiler is one of the three main pieces of equipment in a coal-fired power plant, and the requirements for its safe and stable operation are very high, as it plays a significant role in the whole thermal power generation system. Risk-based inspection is a system management idea and method that pursues security and economy in a unified way; it can effectively evaluate equipment risk and reduce operational cost.
Gardosi, J; Clausson, B; Francis, A
2009-09-01
We wanted to compare customised and population standards for defining smallness for gestational age (SGA) in the assessment of perinatal mortality risk associated with parity and maternal size. Population-based cohort study. Sweden. Swedish Birth Registry database 1992-1995 with 354 205 complete records. Coefficients were derived and applied to determine SGA by the fully customised method, or by adjustment for fetal sex only, and using the same fetal weight standard. Perinatal deaths and rates of small for gestational age (SGA) babies within subgroups stratified by parity, body mass index (BMI) and maternal size within the BMI range of 20.0-24.9. Perinatal mortality rates (PMR) had a U-shaped distribution in parity groups, increased proportionately with maternal BMI, and had no association with maternal size within the normal BMI range. For each of these subgroups, SGA rates determined by the customised method showed strong association with the PMR. In contrast, SGA based on uncustomised, population-based centiles had poor correlation with perinatal mortality. The increased perinatal mortality risk in pregnancies of obese mothers was associated with an increased risk of SGA using customised centiles, and a decreased risk of SGA using population-based centiles. The use of customised centiles to determine SGA improves the identification of pregnancies which are at increased risk of perinatal death.
Software Risk Identification for Interplanetary Probes
NASA Technical Reports Server (NTRS)
Dougherty, Robert J.; Papadopoulos, Periklis E.
2005-01-01
The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that are using increasingly complex and critical software. Several probe failures are examined that suggest more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have led to the development of a practical methodology to identify risks for interplanetary probes. The proposed methodology is based upon the tailoring of the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. The use of this methodology will ensure a more consistent and complete identification of software risks in these probes.
ERIC Educational Resources Information Center
Prince, Kort C.; Ho, Edward A.; Hansen, Sharon B.
2010-01-01
This study examined the effects of the Living Skills school-based intervention program as a method of improving school adjustment and the social lives of at-risk elementary school students. Youth participants were referred to the program by teachers or school counselors based on perceptions of risk due to rejection and isolation, aggressive and…
[Consideration of Mobile Medical Device Regulation].
Peng, Liang; Yang, Pengfei; He, Weigang
2015-07-01
The regulation of mobile medical devices is currently one of the hot topics in the industry. The definition, regulation scope and requirements, and potential risks of mobile medical devices were analyzed and discussed based on mobile computing techniques and the FDA guidance on mobile medical applications. The regulation of mobile medical devices in China needs to adopt a risk-based method.
NASA Astrophysics Data System (ADS)
Wei, Yu; Chen, Wang; Lin, Yu
2013-05-01
Recent studies in the econophysics literature reveal that price variability has fractal and multifractal characteristics not only in developed financial markets, but also in emerging markets. Taking high-frequency intraday quotes of the Shanghai Stock Exchange Component (SSEC) Index as an example, this paper proposes a new method to measure daily Value-at-Risk (VaR) by combining the newly introduced multifractal volatility (MFV) model and the extreme value theory (EVT) method. Two VaR backtesting techniques are then employed to compare the performance of the model with that of a group of linear and nonlinear generalized autoregressive conditional heteroskedasticity (GARCH) models. The empirical results show the multifractal nature of price volatility in the Chinese stock market. VaR measures based on the multifractal volatility model and EVT method outperform many GARCH-type models at high-risk levels.
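The EVT leg of the VaR construction can be illustrated with a small peaks-over-threshold sketch. The synthetic data, the method-of-moments GPD fit, and all parameter choices below are assumptions for illustration, not the paper's MFV-EVT estimator:

```python
import random
import statistics

def evt_var(losses, q=0.99, threshold_quantile=0.90):
    """Peaks-over-threshold VaR sketch: fit a generalized Pareto
    distribution (GPD) to the excesses over a high threshold by the
    method of moments, then invert the GPD tail quantile formula."""
    xs = sorted(losses)
    u = xs[int(threshold_quantile * len(xs))]        # high threshold
    excesses = [x - u for x in xs if x > u]
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)                     # GPD shape estimate
    sigma = 0.5 * m * (m * m / v + 1.0)              # GPD scale estimate
    n, n_u = len(xs), len(excesses)
    # Tail quantile: VaR_q = u + (sigma/xi) * ((n/n_u * (1 - q))**(-xi) - 1)
    return u + sigma / xi * ((n / n_u * (1.0 - q)) ** (-xi) - 1.0)

random.seed(7)
# Synthetic heavy-tailed daily losses (Pareto tail), purely illustrative
losses = [random.paretovariate(5) for _ in range(5000)]
var_99 = evt_var(losses)
```

For Pareto-tailed data the estimated 99% VaR should sit near the empirical 99th-percentile loss while extrapolating more stably into the tail than the raw order statistic.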
Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults
ERIC Educational Resources Information Center
Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.
2007-01-01
Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…
[Risk management for medical devices].
Xie, Ying-jie; Xu, Xing-gang
2007-07-01
Based on the practices of the risk management activities by Chinese medical device manufacturers and theoretical study of the latest international standard ISO 14971:2007, this article analyses the risk management in medical device manufacturing industry by introducing the status quo of applications, four requirements at operational stages, and future trends of development. Methods and suggestions are therefore given to medical device manufacturers for risk management.
Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko
2016-12-01
There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity, which leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. This method is based on the concentration ratio, the ratio of the exposure concentration to the critical effect concentration, both considered to be random. In this method, variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The authors' method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
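The separation of variability from uncertainty that the authors describe is typically implemented as a nested (two-dimensional) Monte Carlo loop: the outer loop samples uncertain parameters, the inner loop samples variability given those parameters. The distributions and values below are invented for illustration, not the paper's nano-TiO2 model:

```python
import random
import statistics

random.seed(42)

def two_dim_mc(n_unc=200, n_var=500):
    """Nested (two-dimensional) Monte Carlo sketch.  Outer loop: draws of
    *uncertain* model parameters.  Inner loop: *variability* across
    locations/species.  Returns, per uncertainty draw, the fraction of
    variable cases whose concentration ratio exceeds 1."""
    risk_fractions = []
    for _ in range(n_unc):
        # Uncertainty: imperfect knowledge of median exposure and effect
        mu_exp = random.gauss(-1.0, 0.3)   # log10 exposure conc. (ug/L)
        mu_eff = random.gauss(0.5, 0.3)    # log10 critical effect conc.
        exceed = 0
        for _ in range(n_var):
            # Variability: real spread across environments
            exposure = 10 ** random.gauss(mu_exp, 0.4)
            effect = 10 ** random.gauss(mu_eff, 0.4)
            if exposure / effect > 1.0:    # concentration ratio > 1 = risk
                exceed += 1
        risk_fractions.append(exceed / n_var)
    return risk_fractions

fractions = two_dim_mc()
median_risk = statistics.median(fractions)           # central estimate
p95_risk = sorted(fractions)[int(0.95 * len(fractions))]  # uncertainty bound
```

The spread between `median_risk` and `p95_risk` is the part of the total variation attributable to uncertainty; the inner-loop fraction captures variability.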
Nixdorf, Erik; Sun, Yuanyuan; Lin, Mao; Kolditz, Olaf
2017-12-15
The main objective of this study is to quantify the groundwater contamination risk of Songhua River Basin by applying a novel approach of integrating public datasets, web services and numerical modelling techniques. To our knowledge, this study is the first to establish groundwater risk maps for the entire Songhua River Basin, one of the largest and most contamination-endangered river basins in China. Index-based groundwater risk maps were created with GIS tools at a spatial resolution of 30 arc-seconds by combining the results of groundwater vulnerability and hazard assessment. Groundwater vulnerability was evaluated using the DRASTIC index method based on public datasets at the highest available resolution in combination with numerical groundwater modelling. As a novel approach to overcome data scarcity at large scales, a web mapping service based data query was applied to obtain an inventory of potential hazardous sites within the basin. The groundwater risk assessment demonstrated that <1% of the Songhua River Basin is at high or very high contamination risk. These areas were mainly located in the vast plain areas, with hotspots particularly in the Changchun metropolitan area. Moreover, groundwater levels and pollution point sources were found to have a significantly larger impact in these areas than originally assumed by the index scheme. Moderate contamination risk was assigned to 27% of the aquifers, predominantly associated with less densely populated agricultural areas. However, the majority of aquifer area in the sparsely populated mountain ranges displayed low groundwater contamination risk. Sensitivity analysis demonstrated that this novel method is valid for regional assessments of groundwater contamination risk.
Despite limitations in resolution and input data consistency, the obtained groundwater contamination risk maps will be beneficial for regional and local decision-making processes with regard to groundwater protection measures, particularly if other data availability is limited. Copyright © 2017 Elsevier B.V. All rights reserved.
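The DRASTIC vulnerability step can be sketched as a weighted sum over the seven hydrogeological parameters. The weights below are the standard DRASTIC weights; the cell ratings are hypothetical, not Songhua River Basin data:

```python
# Standard DRASTIC parameter weights; each parameter is rated 1-10 per cell.
DRASTIC_WEIGHTS = {
    "Depth_to_water": 5,
    "net_Recharge": 4,
    "Aquifer_media": 3,
    "Soil_media": 2,
    "Topography": 1,
    "Impact_of_vadose_zone": 5,
    "hydraulic_Conductivity": 3,
}

def drastic_index(ratings):
    """Weighted sum of the seven parameter ratings; ranges 23 to 230."""
    return sum(DRASTIC_WEIGHTS[k] * r for k, r in ratings.items())

# Hypothetical grid cell in a plain area: shallow water table, high recharge
cell = {
    "Depth_to_water": 9,
    "net_Recharge": 8,
    "Aquifer_media": 6,
    "Soil_media": 5,
    "Topography": 10,
    "Impact_of_vadose_zone": 6,
    "hydraulic_Conductivity": 4,
}
vulnerability = drastic_index(cell)
```

In a GIS workflow this sum is evaluated per raster cell and then combined with the hazard score to form the risk map.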
Health Risk Assessment of Inhalable Particulate Matter in Beijing Based on the Thermal Environment
Xu, Lin-Yu; Yin, Hao; Xie, Xiao-Dong
2014-01-01
Inhalable particulate matter (PM10) is a primary air pollutant closely related to public health, and an especially serious problem in urban areas. The urban heat island (UHI) effect has made the urban PM10 pollution situation more complex and severe. In this study, we established a health risk assessment system utilizing an epidemiological method that takes thermal environment effects into consideration. We utilized a remote sensing method to retrieve the PM10 concentration, UHI, Normalized Difference Vegetation Index (NDVI), and Normalized Difference Water Index (NDWI). Using the correlation between the difference vegetation index (DVI) and PM10 concentration, we applied the established model relating PM10 to thermal environmental indicators to evaluate PM10 health risks based on the epidemiological study. Additionally, through the regulation of UHI, NDVI and NDWI, we aimed at regulating the PM10 health risks and thermal environment simultaneously. This study attempted to accomplish concurrent thermal environment regulation and elimination of PM10 health risks through control of UHI intensity. The results indicate that urban Beijing has a higher PM10 health risk than rural areas; the PM10 health risk based on the thermal environment is 1.145, which is similar to the health risk (1.144) calculated from the PM10 concentration inversion; according to the regulation results, regulation of UHI and NDVI is effective and helpful for mitigation of PM10 health risk in functional zones. PMID:25464132
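Epidemiological PM10 health risks of this kind are commonly expressed through a log-linear exposure-response relative-risk model. The coefficient and concentrations below are illustrative assumptions, not values from this study:

```python
import math

def pm10_relative_risk(conc, baseline=40.0, beta=0.0008):
    """Log-linear exposure-response sketch: RR = exp(beta * (C - C0)).
    RR equals 1 at the baseline concentration C0; beta is the
    epidemiological exposure-response coefficient (per ug/m3).
    All numbers here are hypothetical."""
    return math.exp(beta * (conc - baseline))

urban_rr = pm10_relative_risk(210.0)   # hypothetical urban mean, ug/m3
rural_rr = pm10_relative_risk(120.0)   # hypothetical rural mean, ug/m3
```

Mapping such a model over remotely sensed PM10 concentrations yields a spatial health-risk surface of the kind the study compares between urban and rural zones.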
Kim, Dohyeong; Galeano, M. Alicia Overstreet; Hull, Andrew; Miranda, Marie Lynn
2008-01-01
Background Preventive approaches to childhood lead poisoning are critical for addressing this longstanding environmental health concern. Moreover, increasing evidence of cognitive effects of blood lead levels < 10 μg/dL highlights the need for improved exposure prevention interventions. Objectives Geographic information system–based childhood lead exposure risk models, especially if executed at highly resolved spatial scales, can help identify children most at risk of lead exposure, as well as prioritize and direct housing and health-protective intervention programs. However, developing highly resolved spatial data requires labor-and time-intensive geocoding and analytical processes. In this study we evaluated the benefit of increased effort spent geocoding in terms of improved performance of lead exposure risk models. Methods We constructed three childhood lead exposure risk models based on established methods but using different levels of geocoded data from blood lead surveillance, county tax assessors, and the 2000 U.S. Census for 18 counties in North Carolina. We used the results to predict lead exposure risk levels mapped at the individual tax parcel unit. Results The models performed well enough to identify high-risk areas for targeted intervention, even with a relatively low level of effort on geocoding. Conclusions This study demonstrates the feasibility of widespread replication of highly spatially resolved childhood lead exposure risk models. The models guide resource-constrained local health and housing departments and community-based organizations on how best to expend their efforts in preventing and mitigating lead exposure risk in their communities. PMID:19079729
Chen, C L; Kaber, D B; Dempsey, P G
2000-06-01
A new and improved method to feedforward neural network (FNN) development for application to data classification problems, such as the prediction of levels of low-back disorder (LBD) risk associated with industrial jobs, is presented. Background on FNN development for data classification is provided along with discussions of previous research and neighborhood (local) solution search methods for hard combinatorial problems. An analytical study is presented which compared prediction accuracy of a FNN based on an error-back propagation (EBP) algorithm with the accuracy of a FNN developed by considering results of local solution search (simulated annealing) for classifying industrial jobs as posing low or high risk for LBDs. The comparison demonstrated superior performance of the FNN generated using the new method. The architecture of this FNN included fewer input (predictor) variables and hidden neurons than the FNN developed based on the EBP algorithm. Independent variable selection methods and the phenomenon of 'overfitting' in FNN (and statistical model) generation for data classification are discussed. The results are supportive of the use of the new approach to FNN development for applications to musculoskeletal disorders and risk forecasting in other domains.
NASA Astrophysics Data System (ADS)
Xu, P.; Li, D.
2017-12-01
Microplastic, which refers to plastic fragments and particles with diameters of less than 5 mm, has potentially threatening impacts on various ambient media. The shortage of knowledge about the ecological risks of microplastics inhibits the scientific research process. Based on a wide-ranging review of the literature, this paper analyzed the potential ecological risk of microplastics in the sediments of Shanghai and Hong Kong by means of an ecological risk index and the hazard classes developed by the UN Globally Harmonized System. Combining the two assessment methods, results showed that the order of microplastic pollution extents in sediments was Changjiang Estuary
A utility/cost analysis of breast cancer risk prediction algorithms
NASA Astrophysics Data System (ADS)
Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.
2016-03-01
Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit-rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.
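The expected-cost bookkeeping, and the linear iso-cost contours in the (false-alarm rate, hit-rate) plane, can be sketched directly. The prevalence and cost figures below are arbitrary placeholders, not the published model's values:

```python
def expected_cost(h, f, prevalence, c_tp, c_fn, c_fp, c_tn):
    """Expected per-woman cost of a risk-prediction rule with hit rate h
    (flagged high-risk given cancer) and false-alarm rate f (flagged
    high-risk given no cancer).  Each c_* folds in the downstream cost
    of the screening path and diagnostic outcome."""
    p = prevalence
    return (p * h * c_tp + p * (1 - h) * c_fn
            + (1 - p) * f * c_fp + (1 - p) * (1 - f) * c_tn)

def iso_cost_slope(prevalence, c_tp, c_fn, c_fp, c_tn):
    """Slope dh/df of an iso-cost contour in the (f, h) plane.
    Setting dE = 0:  p*(c_tp - c_fn)*dh + (1-p)*(c_fp - c_tn)*df = 0."""
    p = prevalence
    return -(1 - p) * (c_fp - c_tn) / (p * (c_tp - c_fn))

# Illustrative numbers only: enhanced screening catches cancer earlier and
# more cheaply overall (c_tp < c_fn) but adds cost for healthy women
# routed to it (c_fp > c_tn).
slope = iso_cost_slope(prevalence=0.01, c_tp=2000, c_fn=20000,
                       c_fp=500, c_tn=100)
base = expected_cost(h=0.8, f=0.2, prevalence=0.01,
                     c_tp=2000, c_fn=20000, c_fp=500, c_tn=100)
```

Because `expected_cost` is linear in both h and f, the iso-cost contours are straight lines whose slope depends only on the cost structure and prevalence, matching the abstract's observation.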
Risk-Based Sampling: I Don't Want to Weight in Vain.
Powell, Mark R
2015-12-01
Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
Vehicle crashworthiness ratings in Australia.
Cameron, M; Mach, T; Neiger, D; Graham, A; Ramsay, R; Pappas, M; Haley, J
1994-08-01
The paper reviews the published vehicle safety ratings based on mass crash data from the United States, Sweden, and Great Britain. It then describes the development of vehicle crashworthiness ratings based on injury compensation claims and police accident reports from Victoria and New South Wales, the two most populous states in Australia. Crashworthiness was measured by a combination of injury severity (of injured drivers) and injury risk (of drivers involved in crashes). Injury severity was based on 22,600 drivers injured in crashes in the two states. Injury risk was based on 70,900 drivers in New South Wales involved in crashes after which a vehicle was towed away. Injury risk measured in this way was compared with the "relative injury risk" of particular model cars involved in two car crashes in Victoria (where essentially only casualty crashes are reported), which was based on the method developed by Folksam Insurance in Sweden from Evans' double-pair comparison method. The results include crashworthiness ratings for the makes and models crashing in Australia in sufficient numbers to measure their crash performance adequately. The ratings were normalised for the driver sex and speed limit at the crash location, the two factors found to be strongly related to injury risk and/or severity and to vary substantially across makes and models of Australian crash-involved cars. This allows differences in crashworthiness of individual models to be seen, uncontaminated by major crash exposure differences.
[Screening for risk of child abuse and neglect. A practicable method?].
Kindler, H
2010-10-01
Selective primary prevention programs for child abuse and neglect depend on risk screening instruments that have the goal of systematically identifying families who can profit most from early help. Based on a systematic review of longitudinal studies, a set of established risk factors for early child abuse and neglect is presented. Nearly half of the items included in screening instruments can be seen as validated. Available studies indicate a high sensitivity of risk screening instruments. Positive predictive values, however, are low. Overall, the use of risk screening instruments in the area of primary prevention for families at risk represents a feasible method, as long as stigmatizing effects can be avoided and participating families also benefit beyond preventing endangerment.
An improved method for risk evaluation in failure modes and effects analysis of CNC lathe
NASA Astrophysics Data System (ADS)
Rachieru, N.; Belu, N.; Anghel, D. C.
2015-11-01
Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying the crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank risks associated with failure modes that could appear in the operation of a Turn 55 CNC lathe. Two case studies are presented to demonstrate the methodology. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN numbers and produce a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
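The contrast between the crisp RPN and a fuzzy variant can be sketched as follows. The triangular fuzzy arithmetic (vertex approximation) and centroid defuzzification below are one common simplified choice, not necessarily the rule-base system used in the paper:

```python
def crisp_rpn(s, o, d):
    """Traditional RPN: product of crisp Severity, Occurrence, Detection."""
    return s * o * d

def fuzzy_rpn(s, o, d):
    """Sketch of a fuzzy alternative: S, O, D are triangular fuzzy numbers
    given as (low, mode, high); multiplication uses the usual vertex
    approximation, and the result is defuzzified by the triangle centroid."""
    lo = s[0] * o[0] * d[0]
    md = s[1] * o[1] * d[1]
    hi = s[2] * o[2] * d[2]
    return (lo + md + hi) / 3.0   # centroid defuzzification

# Two failure modes that tie under the crisp method...
a = crisp_rpn(9, 2, 5)   # severe but rare
b = crisp_rpn(3, 6, 5)   # mild but frequent
# ...can be separated once expert ratings carry their spread:
fa = fuzzy_rpn((8, 9, 10), (1, 2, 3), (4, 5, 6))
fb = fuzzy_rpn((2, 3, 4), (5, 6, 7), (4, 5, 6))
```

The worked tie (both crisp RPNs equal 90) illustrates the duplicated-RPN deficiency the paper cites; the fuzzy scores break the tie.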
Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P
2014-06-26
To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination is low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
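For the simplest case of a single binary exposure, both models reduce to the sample risk ratio, which can be computed with the standard delta-method (Katz) confidence interval. The cohort counts below are hypothetical:

```python
import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk for a binary exposure with a delta-method (Katz)
    95% confidence interval.  With a single binary covariate, the
    log-binomial and robust-Poisson fits both reduce to this sample
    ratio; the models differ once continuous covariates enter."""
    p1 = events_exposed / n_exposed
    p0 = events_unexposed / n_unexposed
    rr = p1 / p0
    # Delta-method standard error of log(RR)
    se_log = math.sqrt(1 / events_exposed - 1 / n_exposed
                       + 1 / events_unexposed - 1 / n_unexposed)
    z = 1.959963984540054   # 97.5th percentile of the standard normal
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, (lo, hi)

# Hypothetical cohort: 30/100 events among exposed, 15/100 among unexposed
rr, ci = relative_risk(30, 100, 15, 100)
```

This closed form is a useful check on either regression model in the two-group case before moving to covariate-adjusted fits.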
Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements
Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis
2017-01-01
Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix. PMID:28157149
NASA Astrophysics Data System (ADS)
Sari, Diana Puspita; Pujotomo, Darminto; Wardani, Nadira Kusuma
2017-11-01
Risk concerns uncertain events, which can have negative or positive impacts on project objectives. A project is a series of activities and tasks that have a purpose, specifications, and cost limits. The Banyumanik Hospital Development Project, one of the construction projects in Semarang, has experienced several problems. The first problem is project delay in the building-stake work. The second is delay of material supply. The third is insufficient management attention to health and safety, as evidenced by the unavailability of personal protective equipment (PPE) for the workers. These problems pose risks, so risk management by the contractor on the Banyumanik Hospital Development Project is very important to reduce the impact that the risks would impose on the construction services provider. This research aims at risk identification, risk assessment, and risk mitigation. Project risk management begins with the identification of risks based on the project life cycle. The risk assessment was carried out according to AS/NZS 4360:2004 with respect to impacts on cost, time, and quality. The AS/NZS 4360:2004 method identifies the risks that require mitigation, namely those at the significant and high levels. Four risks require mitigation with Bow-Tie diagrams: work accidents, contract delays, material delays, and design changes. The Bow-Tie diagram method identifies the causes of a risk together with preventive and recovery actions; these actions are then used as input to the ALARP method, which prioritizes the proposed strategies into the broadly acceptable, tolerable, and unacceptable categories.
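A qualitative likelihood-consequence matrix in the spirit of AS/NZS 4360:2004, feeding an ALARP-style categorization, can be sketched as below. The 5x5 scale, the band thresholds, and the ratings are illustrative assumptions, not the project's actual assessment:

```python
# Illustrative 5x5 qualitative risk matrix; band thresholds are assumed,
# not the standard's normative values.
def risk_level(likelihood, consequence):
    """Both inputs rated 1 (low) .. 5 (high)."""
    score = likelihood * consequence
    if score >= 15:
        return "high"          # mitigate (e.g. Bow-Tie preventive/recovery)
    if score >= 6:
        return "significant"   # tolerable only if ALARP
    return "low"               # broadly acceptable

# Hypothetical (likelihood, consequence) ratings for some project risks
project_risks = {
    "work accident":  (4, 5),
    "contract delay": (4, 4),
    "material delay": (3, 3),
    "design change":  (3, 5),
    "minor rework":   (2, 2),
}
to_mitigate = sorted(name for name, lc in project_risks.items()
                     if risk_level(*lc) == "high")
```

With these made-up ratings, the high-band risks surface for Bow-Tie treatment while the rest remain at the tolerable or broadly acceptable levels.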
How Many Batches Are Needed for Process Validation under the New FDA Guidance?
Yang, Harry
2013-01-01
The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
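A Bayesian calculation of this kind can be sketched with a beta-binomial model: Stage 1 batch outcomes update a uniform prior, and the number of PPQ batches is the smallest m for which m consecutive conforming batches would push the posterior probability of an adequate pass rate above an assurance level. All numbers below (prior, targets, Stage 1 results) are illustrative assumptions, not the paper's examples:

```python
import math

def beta_pdf(x, a, b):
    """Beta(a, b) density via log-gamma for numerical stability."""
    ln = (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
          + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))
    return math.exp(ln)

def beta_tail(t, a, b, steps=20000):
    """P(p >= t) for p ~ Beta(a, b), by midpoint-rule integration."""
    h = (1.0 - t) / steps
    return sum(beta_pdf(t + (i + 0.5) * h, a, b) * h for i in range(steps))

def ppq_batches(stage1_pass, stage1_total, p_target=0.80, assurance=0.90,
                max_batches=30):
    """Smallest number of PPQ batches m such that, if all m conform, the
    posterior probability that the batch pass rate exceeds p_target is at
    least `assurance`.  Uniform Beta(1, 1) prior before Stage 1."""
    a = 1 + stage1_pass
    b = 1 + (stage1_total - stage1_pass)
    for m in range(1, max_batches + 1):
        if beta_tail(p_target, a + m, b) >= assurance:
            return m
    return None

# Hypothetical Stage 1 record: 18 of 20 design-stage batches conformed
m = ppq_batches(stage1_pass=18, stage1_total=20)
```

The answer grows as Stage 1 evidence weakens or the target pass rate rises, which is the risk-based behavior the life cycle approach intends.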
Evaluating Risk Communication After the Fukushima Disaster Based on Nudge Theory.
Murakami, Michio; Tsubokura, Masaharu
2017-03-01
Using nudge theory and some examples of risk communication that followed the Fukushima disaster, this article discusses the influences and justifications of risk communication, in addition to how risk communication systems are designed. To assist people in making decisions based on their own value systems, we offer three suggestions, keeping in mind that people can be influenced (ie, "nudged") depending on how risk communication takes place: (1) accumulate knowledge on how the method of risk communication and a system's default design can influence people; (2) clarify the purpose and outcomes of risk communication; and (3) examine which forms of risk communication might be ethically unjustifiable. Quantitative studies on risk communication and collective narratives will provide ideas for how to design better risk communication systems and help people make decisions. Furthermore, we show examples of unjustifiable risk communication.
Simon, Steven L.; Bouville, André; Kleinerman, Ruth
2009-01-01
Biodosimetry measurements can potentially be an important and integral part of the dosimetric methods used in long-term studies of health risk following radiation exposure. Such studies rely on accurate estimation of doses to the whole body or to specific organs of individuals in order to derive reliable estimates of cancer risk. However, dose estimates based on analytical dose reconstruction (i.e., models) or personnel monitoring measurements, e.g., film-badges, can have substantial uncertainty. Biodosimetry can potentially reduce uncertainty in health risk studies by corroboration of model-based dose estimates or by using them to assess bias in dose models. While biodosimetry has begun to play a more significant role in long-term health risk studies, its use is still generally limited in that context due to one or more factors including inadequate limits of detection, large inter-individual variability of the signal measured, high per-sample cost, and invasiveness. Presently, the most suitable biodosimetry methods for epidemiologic studies are chromosome aberration frequencies from fluorescence in situ hybridization (FISH) of peripheral blood lymphocytes and electron paramagnetic resonance (EPR) measurements made on tooth enamel. Both types of measurements, however, are usually invasive and require difficult-to-obtain biological samples. Moreover, doses derived from these methods are not always directly relevant to the tissues of interest. To increase the value of biodosimetry to epidemiologic studies, a number of issues need to be considered including limits of detection, effects of inhomogeneous exposure of the body, how to extrapolate from the tissue sampled to the tissues of interest, and how to adjust dosimetry models applied to large populations based on sparse biodosimetry measurements.
The requirements of health risk studies suggest a set of characteristics that, if satisfied by new biodosimetry methods, would increase the overall usefulness of biodosimetry to determining radiation health risks. PMID:20065672
2013-01-01
Background All rigorous primary cardiovascular disease (CVD) prevention guidelines recommend absolute CVD risk scores to identify high- and low-risk patients, but laboratory testing can be impractical in low- and middle-income countries. The purpose of this study was to compare the ranking performance of a simple, non-laboratory-based risk score to laboratory-based scores in various South African populations. Methods We calculated and compared 10-year CVD (or coronary heart disease (CHD)) risk for 14,772 adults from thirteen cross-sectional South African populations (data collected from 1987 to 2009). Risk characterization performance for the non-laboratory-based score was assessed by comparing rankings of risk with six laboratory-based scores (three versions of Framingham risk, SCORE for high- and low-risk countries, and CUORE) using Spearman rank correlation and percent of population equivalently characterized as ‘high’ or ‘low’ risk. Total 10-year non-laboratory-based risk of CVD death was also calculated for a representative cross-section from the 1998 South African Demographic Health Survey (DHS, n = 9,379) to estimate the national burden of CVD mortality risk. Results Spearman correlation coefficients for the non-laboratory-based score with the laboratory-based scores ranged from 0.88 to 0.986. Using conventional thresholds for CVD risk (10% to 20% 10-year CVD risk), 90% to 92% of men and 94% to 97% of women were equivalently characterized as ‘high’ or ‘low’ risk using the non-laboratory-based and Framingham (2008) CVD risk score. These results were robust across the six risk scores evaluated and the thirteen cross-sectional datasets, with few exceptions (lower agreement between the non-laboratory-based and Framingham (1991) CHD risk scores). Approximately 18% of adults in the DHS population were characterized as ‘high CVD risk’ (10-year CVD death risk >20%) using the non-laboratory-based score. 
Conclusions We found a high level of correlation between a simple, non-laboratory-based CVD risk score and commonly-used laboratory-based risk scores. The burden of CVD mortality risk was high for men and women in South Africa. The policy and clinical implications are that fast, low-cost screening tools can lead to similar risk assessment results compared to time- and resource-intensive approaches. Until setting-specific cohort studies can derive and validate country-specific risk scores, non-laboratory-based CVD risk assessment could be an effective and efficient primary CVD screening approach in South Africa. PMID:23880010
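The two comparison statistics in the abstract above (Spearman rank correlation between scores, and the share of people classified the same at a risk threshold) can be sketched as follows. The risk values here are hypothetical, not data from the study.

```python
# Sketch: comparing two CVD risk scores by Spearman rank correlation and
# by threshold agreement ('high' vs 'low' risk at a 20% 10-year cutoff).

def rank(values):
    """Average ranks (handles ties), as used by Spearman's rho."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def pct_agree(x, y, threshold=0.20):
    """Share of people classified the same ('high' vs 'low') by both scores."""
    same = sum((a > threshold) == (b > threshold) for a, b in zip(x, y))
    return same / len(x)

# Hypothetical 10-year risks from a lab-based and a non-lab-based score
lab    = [0.05, 0.12, 0.25, 0.08, 0.31, 0.02, 0.18, 0.27]
nonlab = [0.06, 0.10, 0.28, 0.07, 0.26, 0.03, 0.21, 0.24]
print(round(spearman(lab, nonlab), 3))  # rank agreement of the two scores
print(pct_agree(lab, nonlab))           # fraction equivalently classified
```

A high rho with high threshold agreement is exactly the pattern the study reports for the non-laboratory-based score.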
A Simple Model to Rank Shellfish Farming Areas Based on the Risk of Disease Introduction and Spread.
Thrush, M A; Pearce, F M; Gubbins, M J; Oidtmann, B C; Peeler, E J
2017-08-01
The European Union Council Directive 2006/88/EC requires that risk-based surveillance (RBS) for listed aquatic animal diseases is applied to all aquaculture production businesses. The principle behind this is the efficient use of resources directed towards high-risk farm categories, animal types and geographic areas. To achieve this requirement, fish and shellfish farms must be ranked according to their risk of disease introduction and spread. We present a method to risk rank shellfish farming areas based on the risk of disease introduction and spread and demonstrate how the approach was applied in 45 shellfish farming areas in England and Wales. Ten parameters were used to inform the risk model, which were grouped into four risk themes based on related pathways for transmission of pathogens: (i) live animal movement, (ii) transmission via water, (iii) short distance mechanical spread (birds) and (iv) long distance mechanical spread (vessels). Weights (informed by expert knowledge) were applied both to individual parameters and to risk themes for introduction and spread to reflect their relative importance. A spreadsheet model was developed to determine quantitative scores for the risk of pathogen introduction and risk of pathogen spread for each shellfish farming area. These scores were used to independently rank areas for risk of introduction and for risk of spread. Thresholds were set to establish risk categories (low, medium and high) for introduction and spread based on risk scores. Risk categories for introduction and spread for each area were combined to provide overall risk categories to inform a risk-based surveillance programme directed at the area level. Applying the combined risk category designation framework for risk of introduction and spread suggested by European Commission guidance for risk-based surveillance, 4, 10 and 31 areas were classified as high, medium and low risk, respectively. © 2016 Crown copyright.
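The area-ranking scheme described above (weighted parameters grouped into weighted risk themes, with thresholds mapping scores to low/medium/high categories) can be sketched as below. All weights, scores and thresholds are invented for illustration; they are not the expert-elicited values used by the authors.

```python
# Sketch of the shellfish-area risk model: parameter scores are weighted
# within four transmission themes, theme scores are weighted into an
# overall score, and thresholds assign surveillance categories.

# theme -> (theme weight, {parameter: (parameter weight, score 0-3)})
themes = {
    "live_animal_movement": (0.4, {"imports": (0.6, 3), "local_moves": (0.4, 1)}),
    "water_transmission":   (0.3, {"shared_water": (1.0, 2)}),
    "birds":                (0.2, {"bird_density": (1.0, 1)}),
    "vessels":              (0.1, {"vessel_traffic": (1.0, 2)}),
}

def area_score(themes):
    """Weighted sum of weighted parameter scores across risk themes."""
    total = 0.0
    for theme_w, params in themes.values():
        theme_score = sum(w * s for w, s in params.values())
        total += theme_w * theme_score
    return total

def category(score, low=1.0, high=2.0):
    """Thresholds partition scores into the three surveillance categories."""
    return "low" if score < low else ("medium" if score < high else "high")

s = area_score(themes)
print(round(s, 2), category(s))
```

In the paper this calculation is done separately for introduction and for spread, and the two categories are then combined per the European Commission guidance.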
Solving Word Problems using Schemas: A Review of the Literature
Powell, Sarah R.
2011-01-01
Solving word problems is a difficult task for students at-risk for or with learning disabilities (LD). One instructional approach that has emerged as a valid method for helping students at-risk for or with LD to become more proficient at word-problem solving is using schemas. A schema is a framework for solving a problem. With a schema, students are taught to recognize problems as falling within word-problem types and to apply a problem solution method that matches that problem type. This review highlights two schema approaches for 2nd- and 3rd-grade students at-risk for or with LD: schema-based instruction and schema-broadening instruction. A total of 12 schema studies were reviewed and synthesized. Both types of schema approaches enhanced the word-problem skill of students at-risk for or with LD. Based on the review, suggestions are provided for incorporating word-problem instruction using schemas. PMID:21643477
Health Hazard Appraisal in Patient Counseling
LaDou, Joseph; Sherwood, John N.; Hughes, Lewis
1975-01-01
A program of annual health examinations was expanded to include counseling based on a computerized appraisal of individual patients' specific health hazard factors. Data obtained from a specially designed questionnaire, laboratory tests and a physical examination yielded a printout showing a number of weighted risk factors and their relation to ten leading causes of death as determined for that patient. From all of this information, a risk (“apparent”) age was developed for the patient. The results were reviewed with each patient, and methods of correcting health hazards were stressed. A total of 488 persons were appraised, and 107 were randomly reappraised in less than a year, with the finding that the net risk age was reduced by 1.4 years. Such a reduction in risk age is significant; it indicates that appraisal-based counseling is an effective method of altering priorities of health practices. PMID:1114813
Grey situation group decision-making method based on prospect theory.
Zhang, Na; Fang, Zhigeng; Liu, Xiaqing
2014-01-01
This paper puts forward a grey situation group decision-making method based on prospect theory, addressing the fact that such decisions are often made by multiple decision experts who have risk preferences. The method takes the distances to the positive and negative ideal situations as reference points, defines positive and negative prospect value functions, and introduces the experts' risk preferences into grey situation decision-making so that the final decision better reflects the experts' psychological behavior. Based on the TOPSIS method, the paper determines the weight of each decision expert, sets up a comprehensive prospect value matrix for the experts' evaluations, and finally determines the optimal situation. A specific example verifies the effectiveness and feasibility of the method.
PMID:25197706
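The reference-point construction described in the abstract above can be sketched as follows. The value-function parameters are the commonly cited Tversky-Kahneman estimates, not values from this paper, and the situation effect values are invented for illustration.

```python
# Minimal sketch of the prospect-theory ingredients: distances to the
# positive and negative ideal situations act as reference points, and a
# prospect value function treats gains and losses asymmetrically
# (losses loom larger than gains).

def value(x, alpha=0.88, beta=0.88, lambda_=2.25):
    """Prospect value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lambda_ * (-x) ** beta

def prospect_scores(effects):
    """Score each situation against the best and worst observed effects."""
    pos_ideal, neg_ideal = max(effects), min(effects)
    scores = []
    for e in effects:
        gain = value(e - neg_ideal)   # positive prospect vs. the worst case
        loss = value(e - pos_ideal)   # negative prospect vs. the best case
        scores.append(gain + loss)
    return scores

effects = [0.55, 0.80, 0.40, 0.70]    # hypothetical situation effect values
scores = prospect_scores(effects)
best = max(range(len(effects)), key=lambda i: scores[i])
print(best)  # index of the optimal situation
```

The paper aggregates such scores across experts (weighted via TOPSIS) before selecting the optimal situation; the single-expert sketch above shows only the reference-point step.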
Ziegler, Ildikó; Borbély-Jakab, Judit; Sugó, Lilla; Kovács, Réka J
2017-01-01
In this case study, the principles of quality risk management were applied to review sampling points and monitoring frequencies in the hormonal tableting unit of a formulation development pilot plant. Premises of different functions are located in the cleanroom area. Therefore, a general method for risk evaluation was established, based on the Hazard Analysis and Critical Control Points (HACCP) method, to evaluate these premises (i.e., the production area itself and ancillary clean areas) from the point of view of microbial load and state, in order to observe whether the existing monitoring program met current advanced monitoring practice. LAY ABSTRACT: In pharmaceutical production, cleanrooms are needed for the manufacturing of final dosage forms of drugs (intended for human or veterinary use) in order to protect the patient's weakened body from further infections. Cleanrooms are premises with a controlled level of contamination that is specified by the number of particles per cubic meter at a specified particle size or the number of microorganisms (i.e., microbial count) per surface area. To ensure a low microbial count over time, microorganisms are regularly detected and counted by environmental monitoring methods. It is reasonable to identify the places most prone to contamination by risk analysis, to make sure the obtained results really represent the state of the whole room. This paper presents a risk analysis method for the optimization of environmental monitoring and verification of the suitability of the method. © PDA, Inc. 2017.
Reinders, Lars M H; Klassen, Martin D; Jaeger, Martin; Teutenberg, Thorsten; Tuerk, Jochen
2018-04-01
Monoclonal antibodies are a group of commonly used therapeutics whose occupational health risk is still debated. The side effects of long-term low-dose exposure are insufficiently evaluated; hence, discussions often remain theoretical or extrapolate side effects from therapeutic dosages. While some research groups recommend applying the precautionary principle for monoclonal antibodies, others consider the exposure risk too low to justify occupational health and safety measures. However, both groups agree that airborne monoclonal antibodies have the biggest risk potential. Therefore, we developed a peptide-based analytical method for occupational exposure monitoring of airborne monoclonal antibodies. The method will allow collecting data about occupational exposure to monoclonal antibodies. Thus, the mean daily intake for personnel in pharmacies and the pharmaceutical industry can be determined for the first time and will help to substantiate the risk assessment with relevant data. The introduced monitoring method includes air sampling, sample preparation and detection of individual monoclonal antibodies, as well as a sum parameter, by liquid chromatography coupled with high-resolution mass spectrometry. For method development and validation, a chimeric (rituximab), a humanised (trastuzumab) and a fully human (daratumumab) monoclonal antibody are used. A limit of detection between 1 μg per sample for daratumumab and 25 μg per sample for the collective peptide is achieved. Graphical abstract: Demonstration of the analytical workflow, from the release of monoclonal antibodies to their detection as single substances as well as a sum parameter.
12 CFR Appendix C to Part 325 - Risk-Based Capital for State Nonmember Banks: Market Risk
Code of Federal Regulations, 2013 CFR
2013-01-01
... Section 10: Standardized Measurement Method for Specific Risk; Section 11: Simplified Supervisory Formula Approach; Section ... apply: Affiliate, with respect to a company, means any company that controls, is controlled by, or is under common control with, the company. Backtesting means the comparison of a bank's internal estimates...
12 CFR Appendix C to Part 325 - Risk-Based Capital for State Nonmember Banks: Market Risk
Code of Federal Regulations, 2014 CFR
2014-01-01
... Section 10: Standardized Measurement Method for Specific Risk; Section 11: Simplified Supervisory Formula Approach; Section ... apply: Affiliate, with respect to a company, means any company that controls, is controlled by, or is under common control with, the company. Backtesting means the comparison of a bank's internal estimates...
Yang, Yong
2017-11-01
Most health studies focus on one health outcome and examine the influence of one or multiple risk factors. However, in reality, various pathways, interactions, and associations exist not only between risk factors and health outcomes but also among the risk factors and among health outcomes. The advance of system science methods, Big Data, and accumulated knowledge allows us to examine how multiple risk factors influence multiple health outcomes at multiple levels (termed a 3M study). Using the study of neighborhood environment and health as an example, I elaborate on the significance of 3M studies. 3M studies may lead to a significantly deeper understanding of the dynamic interactions among risk factors and outcomes and could help us design better interventions that may be of particular relevance for upstream interventions. Agent-based modeling (ABM) is a promising method in the 3M study, although its potentials are far from being fully explored. Future challenges include the gap of epidemiologic knowledge and evidence, lack of empirical data sources, and the technical challenges of ABM. © 2017 New York Academy of Sciences.
Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan
2017-09-15
A mixture of hydrocarbon and carbon dioxide shows excellent cycle performance in the Organic Rankine Cycle (ORC) used for engine waste heat recovery, but unavoidable leakage in practical application is a safety threat due to its flammability. In this work, a quantitative risk assessment system (QR-AS) is established, aiming to provide a general method of risk assessment for flammable working fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosion risk assessment based on the TNT equivalent method, and risk mitigation based on the evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of the QR-AS. According to the assessment results, a proper ventilation speed, a safe mixture ratio and locations for gas-detecting devices have been proposed to guarantee safety in case of leakage. The results revealed that the presented QR-AS is reliable for practical application and that the evaluation results can provide valuable guidance for the design of mitigation measures to improve the safety performance of the ORC system. Copyright © 2017 Elsevier B.V. All rights reserved.
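The TNT-equivalent step named in the abstract above converts the released flammable mass into an equivalent TNT mass, from which a scaled distance for blast-overpressure charts follows. The yield factor and heats of combustion below are typical handbook values, not figures from this paper, and the leak size is hypothetical.

```python
# Sketch of the TNT equivalent method for a flammable-vapor release.

H_TNT = 4.68e6      # J/kg, detonation energy of TNT (commonly cited value)
H_PROPANE = 46.3e6  # J/kg, lower heating value of propane (approximate)

def tnt_equivalent_mass(fuel_mass_kg, heat_of_combustion, yield_factor=0.03):
    """Equivalent TNT mass: W = eta * m * dHc / H_TNT."""
    return yield_factor * fuel_mass_kg * heat_of_combustion / H_TNT

def scaled_distance(distance_m, tnt_mass_kg):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3), in m/kg^(1/3)."""
    return distance_m / tnt_mass_kg ** (1 / 3)

w = tnt_equivalent_mass(2.0, H_PROPANE)   # hypothetical 2 kg propane leak
print(round(w, 3), round(scaled_distance(10.0, w), 2))
```

The scaled distance is then looked up against empirical overpressure curves to estimate damage, which is the consequence side of the risk assessment.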
Goldenberg, Shira; Strathdee, Steffanie A.; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J.; Patterson, Thomas L.
2011-01-01
In 2008, 400 males ≥ 18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental HIV vulnerability among male clients of FSWs in Tijuana, Mexico, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients’ perspectives on venue-based risks. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients’ narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. PMID:21396875
Goldenberg, Shira M; Strathdee, Steffanie A; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J; Patterson, Thomas L
2011-05-01
In 2008, 400 males ≥18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental sources of HIV vulnerability among male clients of FSWs in Tijuana, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients' perspectives on venue-based HIV risk. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients' narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. Copyright © 2011 Elsevier Ltd. All rights reserved.
Kramer, Erik J; Dodington, James; Hunt, Ava; Henderson, Terrell; Nwabuo, Adaobi; Dicker, Rochelle; Juillard, Catherine
2017-09-01
Violent injury is the second most common cause of death among 15- to 24-year-olds in the US. Up to 58% of violently injured youth return to the hospital with a second violent injury. Hospital-based violence intervention programs (HVIPs) have been shown to reduce injury recidivism through intensive case management. However, no validated guidelines for risk assessment strategies in the HVIP setting have been reported. We aimed to use qualitative methods to investigate the key components of risk assessments employed by HVIP case managers and to propose a risk assessment model based on this qualitative analysis. An established academic hospital-affiliated HVIP served as the nexus for this research. Thematic saturation was reached with 11 semi-structured interviews and two focus groups conducted with HVIP case managers and key informants identified through snowball sampling. Interactions were analyzed by a four-member team using NVivo 10, employing the constant comparison method. Identified risk factors were used to create a set of models presented in two follow-up focus groups with HVIP case managers and leadership. Eighteen key themes within seven domains (environment, identity, mental health, behavior, conflict, indicators of lower risk, and case management) and 141 potential risk factors for use in the risk assessment framework were identified. The most salient factors were incorporated into eight models that were presented to the HVIP case managers. A 29-item algorithmic structured professional judgment model was chosen. We identified four tiers of risk factors for violent reinjury that were incorporated into a proposed risk assessment instrument, VRRAI. Copyright © 2017 Elsevier Inc. All rights reserved.
Comparing methods of determining Legionella spp. in complex water matrices.
Díaz-Flores, Álvaro; Montero, Juan Carlos; Castro, Francisco Javier; Alejandres, Eva María; Bayón, Carmen; Solís, Inmaculada; Fernández-Lafuente, Roberto; Rodríguez, Guillermo
2015-04-29
Legionella testing conducted at environmental laboratories plays an essential role in assessing the risk of disease transmission associated with water systems. However, drawbacks of culture-based methodology used for Legionella enumeration can have great impact on the results and interpretation which together can lead to underestimation of the actual risk. Up to 20% of the samples analysed by these laboratories produced inconclusive results, making effective risk management impossible. Overgrowth of competing microbiota was reported as an important factor for culture failure. For quantitative polymerase chain reaction (qPCR), the interpretation of the results from the environmental samples still remains a challenge. Inhibitors may cause up to 10% of inconclusive results. This study compared a quantitative method based on immunomagnetic separation (IMS method) with culture and qPCR, as a new approach to routine monitoring of Legionella. First, pilot studies evaluated the recovery and detectability of Legionella spp. using an IMS method, in the presence of microbiota and biocides. The IMS method results were not affected by microbiota while culture counts were significantly reduced (1.4 log) or negative in the same samples. Damage by biocides of viable Legionella was detected by the IMS method. Secondly, a total of 65 water samples were assayed by all three techniques (culture, qPCR and the IMS method). Of these, 27 (41.5%) were recorded as positive by at least one test. Legionella spp. was detected by culture in 7 (25.9%) of the 27 samples. Eighteen (66.7%) of the 27 samples were positive by the IMS method, thirteen of them reporting counts below 10³ colony forming units per liter (CFU l⁻¹), six presented interfering microbiota and three presented PCR inhibition. Of the 65 water samples, 24 presented interfering microbiota by culture and 8 presented partial or complete inhibition of the PCR reaction. 
Thus, the rates of inconclusive results for culture and PCR were 36.9% and 12.3%, respectively, with no inconclusive results reported for the IMS method. The IMS method generally improved the recovery and detectability of Legionella in environmental matrices, suggesting that the IMS method could be used as a valuable indicator of risk. Thus, this method may significantly improve our knowledge about the exposure risk to these bacteria, allowing us to implement evidence-based monitoring and disinfection strategies.
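The inconclusive-result rates quoted above follow directly from the counts reported for the 65 samples:

```python
# Reproducing the reported inconclusive rates from the stated counts.
total = 65
culture_inconclusive = 24   # samples with interfering microbiota on culture
pcr_inconclusive = 8        # samples with partial or complete PCR inhibition

print(round(100 * culture_inconclusive / total, 1))  # culture: 36.9%
print(round(100 * pcr_inconclusive / total, 1))      # qPCR: 12.3%
```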
Evidence-based causation in toxicology: A 10-year retrospective.
James, R C; Britt, J K; Halmes, N C; Guzelian, P S
2015-12-01
We introduced Evidence-based Toxicology (EBT) in 2005 to address the disparities that exist between the various Weight-of-Evidence (WOE) methods typically applied in the regulatory hazard decision-making arena and urged toxicologists to adopt the evidence-based guidelines long-utilized in medicine (i.e., Evidence-Based Medicine or EBM). This review of the activities leading to the adoption of evidence-based methods and EBT during the last decade demonstrates how fundamental concepts that form EBT, such as the use of systematic reviews to capture and consider all available information, are improving toxicological evaluations performed by various groups and agencies. We reiterate how the EBT framework, a process that provides a method for performing human chemical causation analyses in an objective, transparent and reproducible manner, differs significantly from past and current regulatory WOE approaches. We also discuss why the uncertainties associated with regulatory WOE schemes lead to a definition of the term "risk" that contains unquantifiable uncertainties not present in this term as it is used in epidemiology and medicine. We believe this distinctly different meaning of "risk" should be clearly conveyed to those not familiar with this difference (e.g., the lay public), when theoretical/nomologic risks associated with chemical-induced toxicities are presented outside of regulatory and related scientific parlance. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Pykhtin, Kirill; Simankina, Tatiana; Sharmanov, Vladimir; Kopytova, Anna
2017-10-01
The danger of injuries and accidents in various industries such as transportation and construction urges the government to control occupational health and safety more strictly. However, in order to do so with minimal costs, modern risk management tools have to be implemented. The risk-based approach is an essential tool for competent risk assessment and is used in a great variety of other countries, demonstrating great results in providing a safe working environment. The article describes the problems that the implementation of the method faces in Russia and suggests certain ways to resolve them.
Assessing the validity of prospective hazard analysis methods: a comparison of two techniques
2014-01-01
Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system. 
PMID:24467813
WE-B-BRC-00: Concepts in Risk-Based Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. 
Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; Discussion of radiation oncology-specific risk assessment strategies and issues; Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
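The FMEA step mentioned in the session summary above scores each failure mode on occurrence (O), severity (S) and lack of detectability (D) and ranks modes by the risk priority number RPN = O × S × D; this is the general FMEA recipe (as adopted by TG-100), with illustrative failure modes and scores invented here.

```python
# Sketch of FMEA risk ranking by risk priority number (RPN).
failure_modes = [
    # (description, O, S, D) on 1-10 scales (10 = worst)
    ("wrong patient plan loaded", 2, 10, 4),
    ("incorrect contour transferred", 4, 7, 5),
    ("stale imaging protocol used", 6, 4, 3),
]

# Rank failure modes from highest to lowest RPN.
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for desc, o, s, d in ranked:
    print(f"RPN={o * s * d:3d}  {desc}")
```

Quality management effort is then directed first at the modes with the highest RPN, typically supplemented by a severity-only check so that rare catastrophic failures are not overlooked.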
NASA Astrophysics Data System (ADS)
Jiang, Shengqian; Liu, Peng; Fu, Danni; Xue, Yiming; Luo, Wentao; Wang, Mingjie
2017-04-01
As an effective survey method for upper limb disorders, rapid upper limb assessment (RULA) has wide application in industry. However, it is very difficult to rapidly evaluate operators' postures in real, complex workplaces. In this paper, a real-time RULA method is proposed to accurately assess the potential risk of an operator's postures based on somatosensory data collected from the Kinect sensor, a line of motion-sensing input devices by Microsoft. First, the static position information of each bone point is collected to obtain the effective angles of body parts, using calculation methods based on joint angles. Second, a whole-body RULA score is obtained to assess the risk level of the current posture in real time. Third, those RULA scores are compared with the results provided by a group of ergonomic practitioners who were asked to observe the same static postures. All the experiments were carried out in an ergonomics lab. The results show that the proposed method can detect operators' postures more accurately. Moreover, the method runs in real time, which improves evaluation efficiency.
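The core geometric step such a Kinect-based RULA pipeline needs is the angle at a joint, computed from three tracked 3-D joint positions (e.g. shoulder-elbow-wrist for the lower-arm score). The coordinates below are illustrative, not the paper's implementation; the lower-arm band is the standard RULA scoring range.

```python
# Sketch: joint angle from 3-D skeleton points, mapped to a RULA sub-score.
import math

def joint_angle(a, b, c):
    """Angle at b (degrees) formed by points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def lower_arm_score(elbow_angle):
    """RULA lower-arm band: 60-100 degrees of flexion scores 1, else 2."""
    return 1 if 60 <= elbow_angle <= 100 else 2

# Hypothetical shoulder/elbow/wrist positions in meters (Kinect-style axes)
shoulder, elbow, wrist = (0, 0, 0), (0, -0.3, 0), (0.25, -0.45, 0)
angle = joint_angle(shoulder, elbow, wrist)
print(round(angle, 1), lower_arm_score(angle))
```

The full method repeats this for each body segment and combines the sub-scores through the RULA tables into a whole-body score.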
Yi, Haeseung; Xiao, Tong; Thomas, Parijatham; Aguirre, Alejandra; Smalletz, Cindy; David, Raven; Crew, Katherine
2015-01-01
Background Breast cancer risk assessment including genetic testing can be used to classify people into different risk groups with screening and preventive interventions tailored to the needs of each group, yet the implementation of risk-stratified breast cancer prevention in primary care settings is complex. Objective To address barriers to breast cancer risk assessment, risk communication, and prevention strategies in primary care settings, we developed a Web-based decision aid, RealRisks, that aims to improve preference-based decision-making for breast cancer prevention, particularly in low-numerate women. Methods RealRisks incorporates experience-based dynamic interfaces to communicate risk, aimed at reducing inaccurate risk perceptions, with tailored modules on breast cancer risk, genetic testing, and chemoprevention. To begin, participants learn about risk by interacting with two games of experience-based risk interfaces, demonstrating average 5-year and lifetime breast cancer risk. We conducted four focus groups with English-speaking women (age ≥18 years), each including a questionnaire completed before and after interacting with the decision aid and a semistructured group discussion. We employed a mixed-methods approach to assess accuracy of perceived breast cancer risk and acceptability of RealRisks. The qualitative analysis of the semistructured discussions assessed understanding of risk, risk models, and risk-appropriate prevention strategies. Results Among 34 participants, mean age was 53.4 years, 62% (21/34) were Hispanic, and 41% (14/34) demonstrated low numeracy. According to the Gail breast cancer risk assessment tool (BCRAT), the mean 5-year and lifetime breast cancer risks were 1.11% (SD 0.77) and 7.46% (SD 2.87), respectively. After interacting with RealRisks, the difference in perceived and estimated breast cancer risk according to BCRAT improved for 5-year risk (P=.008). 
In the qualitative analysis, we identified potential barriers to adopting risk-appropriate breast cancer prevention strategies, including uncertainty about breast cancer risk and risk models, distrust toward the health care system, and perception that risk assessment to pre-screen women for eligibility for genetic testing may be viewed as rationing access to care. Conclusions In a multi-ethnic population, we demonstrated a significant improvement in accuracy of perceived breast cancer risk after exposure to RealRisks. However, we identified potential barriers that suggest that accurate risk perceptions will not suffice as the sole basis to support informed decision making and the acceptance of risk-appropriate prevention strategies. Findings will inform the iterative design of the RealRisks decision aid. PMID:26175193
Co-Occurrence of Conduct Disorder and Depression in a Clinic-Based Sample of Boys with ADHD
ERIC Educational Resources Information Center
Drabick, Deborah A. G.; Gadow, Kenneth D.; Sprafkin, Joyce
2006-01-01
Background: Children with attention-deficit/hyperactivity disorder (ADHD) are at risk for the development of comorbid conduct disorder (CD) and depression. The current study examined potential psychosocial risk factors for CD and depression in a clinic-based sample of 203 boys (aged 6-10 years) with ADHD. Methods: The boys and their mothers…
Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah
2015-09-01
Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are key to obtaining accurate information for planning CVD prevention programs, a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires and, possibly, to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library (PubMed, CINAHL, PsycINFO, and ProQuest) were searched for studies conducted between 2008 and 2012, in the English language and among humans. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. The methods identified for the development of lifestyle CVD risk factor questionnaires were review of the literature (systematic or traditional), involvement of experts and/or the target population using focus group discussions or interviews, the clinical experience of the authors, and the authors' deductive reasoning. For validation, the methods used were the involvement of an expert panel, the use of the target population, and factor analysis. Combining methods produces questionnaires with good content validity and other good psychometric properties.
Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method
Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng
2016-01-01
Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen at a specific site are treated as the risk. The aggregate crash occurrence probability over all exposed vehicles is estimated based on the empirical Bayesian method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damage) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct and indirect losses are uniformly monetized and treated as the consequences of this risk. The potential cost of crashes, as a criterion for ranking high-risk sites, can be explicitly expressed as the crash probability aggregated over all passing vehicles multiplied by the corresponding consequences of crashes. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009
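The ranking criterion described in this abstract can be sketched as follows; the function name, traffic volumes, loss values, and value of time are hypothetical illustrations, not figures from the Shanghai case study:

```python
# Hypothetical sketch of the QRA-style hotspot ranking: each site's
# potential crash cost = (crash probability aggregated over exposed
# vehicles) x (monetized direct + indirect losses).
def potential_crash_cost(crash_prob_per_vehicle, aadt, direct_loss,
                         delay_hours, value_of_time):
    """Expected crash cost for one site (monetary units per day)."""
    exposure_prob = crash_prob_per_vehicle * aadt   # aggregate over passing vehicles
    indirect_loss = delay_hours * value_of_time     # extra delay, monetized
    return exposure_prob * (direct_loss + indirect_loss)

sites = {
    "A": potential_crash_cost(2e-6, 80000, 50000, 120, 25),
    "B": potential_crash_cost(1e-6, 120000, 50000, 300, 25),
}
ranked = sorted(sites, key=sites.get, reverse=True)  # highest-risk sites first
```

Ranking by this monetized criterion, rather than by crash frequency alone, is what lets the method surface sites whose crashes are rare but costly.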
Dynamic TIMI Risk Score for STEMI
Amin, Sameer T.; Morrow, David A.; Braunwald, Eugene; Sloan, Sarah; Contant, Charles; Murphy, Sabina; Antman, Elliott M.
2013-01-01
Background Although there are multiple methods of risk stratification for ST‐elevation myocardial infarction (STEMI), this study presents a prospectively validated method for reclassification of patients based on in‐hospital events. A dynamic risk score provides an initial risk stratification and reassessment at discharge. Methods and Results The dynamic TIMI risk score for STEMI was derived in ExTRACT‐TIMI 25 and validated in TRITON‐TIMI 38. Baseline variables were from the original TIMI risk score for STEMI. New variables were major clinical events occurring during the index hospitalization. Each variable was tested individually in a univariate Cox proportional hazards regression. Variables with P<0.05 were incorporated into a full multivariable Cox model to assess the risk of death at 1 year. Each variable was assigned an integer value based on the odds ratio, and the final score was the sum of these values. The dynamic score included the development of in‐hospital MI, arrhythmia, major bleed, stroke, congestive heart failure, recurrent ischemia, and renal failure. The C‐statistic produced by the dynamic score in the derivation database was 0.76, with a net reclassification improvement (NRI) of 0.33 (P<0.0001) from the addition of dynamic events to the original TIMI risk score. In the validation database, the C‐statistic was 0.81, with an NRI of 0.35 (P=0.01). Conclusions This score is a prospectively derived, validated means of estimating 1‐year mortality of STEMI at hospital discharge and can serve as a clinically useful tool. By incorporating events during the index hospitalization, it can better define risk and help to guide treatment decisions. PMID:23525425
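The scoring mechanics (integer points derived from odds ratios, summed on top of a baseline score) can be sketched as below; the point scaling, event list, and odds ratios are invented for illustration and are not the published TIMI coefficients:

```python
# Illustrative only: a dynamic risk score of the kind described above.
# Each significant in-hospital event gets an integer point value scaled
# from a (hypothetical) odds ratio; the dynamic score is the baseline
# score plus the points for events observed during hospitalization.
import math

def points_from_or(odds_ratio, scale=2.0):
    """Map an odds ratio to an integer point value (hypothetical scaling)."""
    return max(1, round(scale * math.log(odds_ratio)))

event_points = {                       # hypothetical events -> points
    "in_hospital_mi": points_from_or(2.5),
    "major_bleed": points_from_or(2.0),
    "renal_failure": points_from_or(3.0),
}

def dynamic_score(baseline_score, events):
    return baseline_score + sum(event_points[e] for e in events)

score = dynamic_score(4, ["major_bleed", "renal_failure"])
```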
Information and problem report usage in the System Safety Engineering Division
NASA Technical Reports Server (NTRS)
Morrissey, Stephen J.
1990-01-01
Five basic problems or question areas are examined: (1) evaluate the adequacy of the current problem/performance database; (2) evaluate methods of performing trend analysis; (3) methods and sources of data for probabilistic risk assessment; and (4) how risk assessment documentation is upgraded and/or updated. The fifth problem was to provide recommendations for each of the above four areas.
NASA Technical Reports Server (NTRS)
LaBel, Kenneth A.; Sampson, Michael J.
2017-01-01
As the space business rapidly evolves to accommodate a lower-cost model of development and operation via concepts such as commercial space and small spacecraft (e.g., CubeSats and swarms), traditional EEE parts screening and qualification methods are being scrutinized under a risk-reward trade space. In this presentation, two basic concepts will be discussed: (1) the movement from completely risk-averse EEE parts methods to managing and/or accepting risk via alternate approaches; and (2) emerging assurance methods to reduce overdesign, as well as emerging model-based mission assurance (MBMA) concepts. Example scenarios will be described, as well as considerations for trading traditional versus alternate methods.
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also 0.0001.
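The flavor of such a probabilistic reliability evaluation can be illustrated with a toy Monte Carlo sketch: sample a material resistance and an applied load, and estimate the probability that load exceeds strength. The normal distributions and their parameters here are assumptions for illustration, not the multifactor interaction model of the abstract:

```python
# Toy Monte Carlo estimate of a structural failure probability:
# failure occurs when the sampled load exceeds the sampled strength.
import random

def failure_probability(n=100_000, seed=1):
    rng = random.Random(seed)          # fixed seed for reproducibility
    failures = 0
    for _ in range(n):
        strength = rng.gauss(mu=100.0, sigma=8.0)   # resistance (hypothetical units)
        load = rng.gauss(mu=60.0, sigma=10.0)       # demand
        failures += load > strength
    return failures / n

p_fail = failure_probability()         # on the order of 1e-3 for these inputs
```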
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also 0.0001.
Segmentation of organs at risk in CT volumes of head, thorax, abdomen, and pelvis
NASA Astrophysics Data System (ADS)
Han, Miaofei; Ma, Jinfeng; Li, Yan; Li, Meiling; Song, Yanli; Li, Qiang
2015-03-01
Accurate segmentation of organs at risk (OARs) is a key step in the treatment planning system (TPS) of image-guided radiation therapy. We are developing three classes of methods to segment 17 organs at risk throughout the whole body, including brain, brain stem, eyes, mandible, temporomandibular joints, parotid glands, spinal cord, lungs, trachea, heart, livers, kidneys, spleen, prostate, rectum, femoral heads, and skin. The three classes of segmentation methods include (1) threshold-based methods for organs of large contrast with adjacent structures, such as lungs, trachea, and skin; (2) context-driven Generalized Hough Transform-based methods combined with a graph cut algorithm for robust localization and segmentation of the liver, kidneys and spleen; and (3) atlas- and registration-based methods for segmentation of the heart and all organs in CT volumes of the head and pelvis. The segmentation accuracy for the seventeen organs was subjectively evaluated by two medical experts using a three-level score: 0, poor (unusable in clinical practice); 1, acceptable (minor revision needed); and 2, good (nearly no revision needed). A database was collected from Ruijin Hospital, Huashan Hospital, and Xuhui Central Hospital in Shanghai, China, including 127 head scans, 203 thoracic scans, 154 abdominal scans, and 73 pelvic scans. The percentages of "good" segmentation results were 97.6%, 92.9%, 81.1%, 87.4%, 85.0%, 78.7%, 94.1%, 91.1%, 81.3%, 86.7%, 82.5%, 86.4%, 79.9%, 72.6%, 68.5%, 93.2%, 96.9% for brain, brain stem, eyes, mandible, temporomandibular joints, parotid glands, spinal cord, lungs, trachea, heart, livers, kidneys, spleen, prostate, rectum, femoral heads, and skin, respectively. Various organs at risk can be reliably segmented from CT scans by use of the three classes of segmentation methods.
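The first class of methods (thresholding for high-contrast organs such as lungs) can be sketched on a toy slice; the array values and the -400 HU cutoff are illustrative assumptions, and a real pipeline would additionally keep only the largest connected components:

```python
# Minimal sketch of threshold-based segmentation: label voxels whose
# Hounsfield value falls below an air/lung cutoff as candidate lung.
def threshold_segment(slice_hu, cutoff=-400):
    """Return a binary mask (1 = below cutoff) for a 2-D slice of HU values."""
    return [[1 if hu < cutoff else 0 for hu in row] for row in slice_hu]

toy_slice = [
    [50, -800, -850, 40],   # soft tissue surrounding low-density lung voxels
    [30, -780, -900, 60],
]
mask = threshold_segment(toy_slice)
```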
Pedophilia: an evaluation of diagnostic and risk prediction methods.
Wilson, Robin J; Abracen, Jeffrey; Looman, Jan; Picheca, Janice E; Ferguson, Meaghan
2011-06-01
One hundred thirty child sexual abusers were diagnosed using each of the following four methods: (a) phallometric testing, (b) strict application of Diagnostic and Statistical Manual of Mental Disorders (4th ed., text revision [DSM-IV-TR]) criteria, (c) Rapid Risk Assessment of Sex Offender Recidivism (RRASOR) scores, and (d) "expert" diagnoses rendered by a seasoned clinician. The comparative utility and intermethod consistency of these methods are reported, along with recidivism data indicating predictive validity for risk management. Results suggest that inconsistency exists in diagnosing pedophilia, leading to diminished accuracy in risk assessment. Although the RRASOR and DSM-IV-TR methods were significantly correlated with expert ratings, RRASOR and DSM-IV-TR were unrelated to each other. Deviant arousal was not associated with any of the other methods. Only the expert ratings and RRASOR scores were predictive of sexual recidivism. Logistic regression analyses showed that expert diagnosis did not add to prediction of sexual offence recidivism over and above RRASOR alone. Findings are discussed within a context of encouragement of clinical consistency and evidence-based practice regarding treatment and risk management of those who sexually abuse children.
The 10-year Absolute Risk of Cardiovascular (CV) Events in Northern Iran: a Population Based Study
Motamed, Nima; Mardanshahi, Alireza; Saravi, Benyamin Mohseni; Siamian, Hasan; Maadi, Mansooreh; Zamani, Farhad
2015-01-01
Background: The present study was conducted to estimate the 10-year cardiovascular disease (CVD) event risk using three instruments in northern Iran. Material and methods: Baseline data of 3201 participants aged 40-79 from a population-based cohort conducted in northern Iran were analyzed. The Framingham risk score (FRS), World Health Organization (WHO) risk prediction charts, and the American College of Cardiology/American Heart Association (ACC/AHA) tool were applied to assess 10-year CVD event risk. Agreement between the risk assessment instruments was determined using the kappa statistic. Results: Our study estimated that 53.5% of the male population aged 40-79 had a 10-year risk of CVD events ≥10% based on the ACC/AHA approach, 48.9% based on the FRS, and 11.8% based on the WHO risk charts. A 10-year risk ≥10% was estimated for 20.1% of women using the ACC/AHA approach, 11.9% using the FRS, and 5.7% using the WHO tool. The ACC/AHA and Framingham tools had the closest agreement in the estimation of a 10-year risk ≥10% in men (κ=0.7757), while the ACC/AHA and WHO approaches displayed the highest agreement in women (κ=0.6123). Conclusion: Different estimates of 10-year CVD event risk were provided by the ACC/AHA, FRS, and WHO approaches. PMID:26236160
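The agreement statistic used in this study can be sketched as Cohen's kappa on the binary classification "10-year risk ≥10%"; the two classification vectors below are hypothetical, not the study data:

```python
# Minimal Cohen's kappa: chance-corrected agreement between two raters
# (here, two risk instruments) on the same subjects.
def cohens_kappa(a, b):
    assert len(a) == len(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    labels = set(a) | set(b)
    pe = sum((a.count(l) / n) * (b.count(l) / n)        # agreement expected
             for l in labels)                           # by chance alone
    return (po - pe) / (1 - pe)

# hypothetical classifications (1 = estimated 10-year risk >= 10%)
acc_aha = [1, 1, 0, 0, 1, 0, 1, 0]
frs     = [1, 0, 0, 0, 1, 0, 1, 0]
kappa = cohens_kappa(acc_aha, frs)
```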
Advanced space-based InSAR risk analysis of planned and existing transportation infrastructure.
DOT National Transportation Integrated Search
2017-03-21
The purpose of this document is to summarize activities by Stanford University and MDA Geospatial Services Inc. (MDA) to estimate surface deformation and associated risk to transportation infrastructure using SAR Interferometric methods for the ...
Web-enabling Ecological Risk Assessment for Accessibility and Transparency
Ecological risk methods and tools are necessarily diverse to account for different combinations of receptors, exposure processes, effects estimation, and degree of conservatism/realism necessary to support chemical-based assessments. These tools have been continuously developed s...
ASSESSING SUSCEPTIBILITY FROM EARLY-LIFE EXPOSURE TO CARCINOGENS
Cancer risks from childhood exposures to chemicals are generally analyzed using methods based upon exposure from adults, which assumes chemicals are equally potent for inducing risks at these different lifestages. Published literature was evaluated to determine whether there was...
When methods meet politics: how risk adjustment became part of Medicare managed care.
Weissman, Joel S; Wachterman, Melissa; Blumenthal, David
2005-06-01
Health-based risk adjustment has long been touted as key to the success of competitive models of health care. Because it decreases the incentive to enroll only healthy patients in insurance plans, risk adjustment was incorporated into Medicare policy via the Balanced Budget Act of 1997. However, full implementation of risk adjustment was delayed due to clashes with the managed care industry over payment policy, concerns over perverse incentives, and problems of data burden. We review the history of risk adjustment leading up to the Balanced Budget Act and examine the controversies surrounding attempts to stop or delay its implementation during the years that followed. The article provides lessons for the future of health-based risk adjustment and possible alternatives.
Application of multi-criteria decision-making to risk prioritisation in tidal energy developments
NASA Astrophysics Data System (ADS)
Kolios, Athanasios; Read, George; Ioannou, Anastasia
2016-01-01
This paper presents an analytical multi-criteria analysis for the prioritisation of risks in the development of tidal energy projects. After a baseline identification of risks throughout the project and of relevant stakeholders in the UK, classified through a political, economic, social, technological, legal and environmental (PESTLE) analysis, relevant questionnaires provided scores for each risk and corresponding weights for each of the different sectors. Employing an extended Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) as well as the weighted sum method on the data obtained, the identified risks are ranked by their criticality, drawing the industry's attention to mitigating those scoring higher. Both methods were modified to take averages at different stages of the analysis in order to observe the effects on the final risk ranking. A sensitivity analysis of the results was also carried out with regard to the weighting factors assigned to the perceived expertise of participants, with different results obtained depending on whether a linear, squared or square-root regression is used. The results of the study show that academia and industry hold conflicting opinions with regard to the perception of the most critical risks.
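The core TOPSIS ranking step can be sketched as below; the decision matrix and weights are hypothetical, all criteria are treated as "higher score = more critical", and the paper's extensions (averaging stages, expertise weighting) are omitted:

```python
# Minimal TOPSIS sketch: vector-normalize and weight the decision matrix,
# find the ideal (most critical) and anti-ideal profiles, and rank
# alternatives by closeness to the ideal.
import math

def topsis_rank(matrix, weights):
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(len(weights))]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
         for row in matrix]
    ideal = [max(col) for col in zip(*v)]       # most critical profile
    anti = [min(col) for col in zip(*v)]        # least critical profile
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return sorted(range(len(matrix)), key=lambda i: scores[i], reverse=True)

risks = [[7, 5, 9], [4, 8, 6], [9, 9, 8]]       # hypothetical criterion scores
order = topsis_rank(risks, [0.5, 0.3, 0.2])     # indices, most critical first
```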
Risk-taking and decision-making in youth: relationships to addiction vulnerability
Balogh, Kornelia N.; Mayes, Linda C.; Potenza, Marc N.
2013-01-01
Background Decision-making and risk-taking behavior undergo developmental changes during adolescence. Disadvantageous decision-making and increased risk-taking may lead to problematic behaviors such as substance use and abuse, pathological gambling and excessive internet use. Methods Based on MEDLINE searches, this article reviews the literature on decision-making and risk-taking and their relationship to addiction vulnerability in youth. Results Decision-making and risk-taking behaviors involve brain areas that undergo developmental changes during puberty and young adulthood. Individual differences and peer pressure also relate importantly to decision-making and risk-taking. Conclusions Brain-based changes in emotional, motivational and cognitive processing may underlie risk-taking and decision-making propensities in adolescence, making this period a time of heightened vulnerability for engagement in addictive behaviors. PMID:24294500
Wang, Molin; Liao, Xiaomei; Laden, Francine; Spiegelman, Donna
2016-01-01
Identification of the latency period and age-related susceptibility, if any, is an important aspect of assessing risks of environmental, nutritional and occupational exposures. We consider estimation and inference for latency and age-related susceptibility in relative risk and excess risk models. We focus on likelihood-based methods for point and interval estimation of the latency period and age-related windows of susceptibility coupled with several commonly considered exposure metrics. The method is illustrated in a study of the timing of the effects of constituents of air pollution on mortality in the Nurses’ Health Study. PMID:26750582
The use of advanced web-based survey design in Delphi research.
Helms, Christopher; Gardner, Anne; McInnes, Elizabeth
2017-12-01
A discussion of the application of metadata, paradata and embedded data in web-based survey research, using two completed Delphi surveys as examples. Metadata, paradata and embedded data use in web-based Delphi surveys has not been described in the literature. The rapid evolution and widespread use of online survey methods imply that paper-based Delphi methods will likely become obsolete. Commercially available web-based survey tools offer a convenient and affordable means of conducting Delphi research. Researchers and ethics committees may be unaware of the benefits and risks of using metadata in web-based surveys. Discussion paper. Two web-based, three-round Delphi surveys were conducted sequentially between August 2014 - January 2015 and April - May 2016. Their aims were to validate the Australian nurse practitioner metaspecialties and their respective clinical practice standards. Our discussion paper is supported by researcher experience and data obtained from conducting both web-based Delphi surveys. Researchers and ethics committees should consider the benefits and risks of metadata use in web-based survey methods. Web-based Delphi research using paradata and embedded data may introduce efficiencies that improve individual participant survey experiences and reduce attrition across iterations. Use of embedded data allows the efficient conduct of multiple simultaneous Delphi surveys across a shorter timeframe than traditional survey methods. The use of metadata, paradata and embedded data appears to improve response rates, identify bias and give possible explanation for apparent outlier responses, providing an efficient method of conducting web-based Delphi surveys. © 2017 John Wiley & Sons Ltd.
[Impact of water pollution risk in water transfer project based on fault tree analysis].
Liu, Jian-Chang; Zhang, Wei; Wang, Li-Min; Li, Dai-Qing; Fan, Xiu-Ying; Deng, Hong-Bing
2009-09-15
The methods to assess water pollution risk for medium water transfer are gradually being explored. The event-nature-proportion method was developed to evaluate the probability of a single event. Fault tree analysis, on the basis of the calculation for single events, was employed to evaluate the extent of the whole water pollution risk for the channel water body. The result indicates that the risk of pollutants from towns and villages along the line of the water transfer project to the channel water body is at a high level, with a probability of 0.373, which will increase pollution to the channel water body at the rate of 64.53 mg/L COD, 4.57 mg/L NH4(+) -N and 0.066 mg/L volatilization hydroxybenzene, respectively. The measurement of fault probability on the basis of the proportion method is proved to be useful in assessing water pollution risk under much uncertainty.
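The fault-tree aggregation implied here, combining single-event probabilities through OR and AND gates up to a top event, can be sketched as follows; apart from the 0.373 figure quoted in the abstract, all basic-event probabilities are hypothetical:

```python
# Fault-tree gates over independent basic events: an OR gate fires if any
# input event occurs; an AND gate fires only if all inputs occur.
def or_gate(probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p           # P(at least one event)

def and_gate(probs):
    p = 1.0
    for q in probs:
        p *= q
    return p                 # P(all events)

# hypothetical: a spill pollutes the channel only if it occurs AND reaches it
spill = and_gate([0.5, 0.2])
# top event: pollution from towns/villages (0.373, from the abstract),
# a second hypothetical source, or the spill path
top_event = or_gate([0.373, 0.05, spill])
```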
Di Salvo, Francesca; Meneghini, Elisabetta; Vieira, Veronica; Baili, Paolo; Mariottini, Mauro; Baldini, Marco; Micheli, Andrea; Sant, Milena
2015-01-01
Introduction The study investigated the geographic variation of mortality risk for hematological malignancies (HMs) in order to identify potential high-risk areas near an Italian petrochemical refinery. Material and methods A population-based case-control study was conducted and residential histories for 171 cases and 338 sex- and age-matched controls were collected. Confounding factors were obtained from interviews with consenting relatives for 109 HM deaths and 267 controls. To produce risk mortality maps, two different approaches were applied. We mapped (1) adaptive kernel density relative risk estimation (KDE) for case-control studies, which estimates a spatial relative risk function using the ratio between the cases' and controls' densities, and (2) estimated odds ratios for case-control study data using generalized additive models (GAMs) to smooth the effect of location, a proxy for exposure, while adjusting for confounding variables. Results No high-risk areas for HM mortality were identified among all subjects (men and women combined), by applying both approaches. Using the adaptive KDE approach, we found a significant increase in death risk only among women in a large area 2–6 km southeast of the refinery, and the application of GAMs also identified a similarly-located significant high-risk area among women only (global p-value<0.025). Potential confounding risk factors we considered in the GAM did not alter the results. Conclusion Both approaches identified a high-risk area close to the refinery among women only. These spatial methods are useful tools for public policy management to determine priority areas for intervention. Our findings suggest several directions for further research in order to identify other potential environmental exposures that may be assessed in forthcoming studies based on detailed exposure modeling. PMID:26073202
Reiman, Arto; Pekkala, Janne; Väyrynen, Seppo; Putkonen, Ari; Forsman, Mikael
2014-01-01
The aim of this study was to identify risks and ergonomic discomfort during the work of local and short-haul delivery truck drivers outside the cab. The study used a video- and computer-based method (VIDAR). VIDAR is a participatory method for identifying demanding work situations and their potential risks. The drivers' work was videoed and analysed by the subjects and by ergonomists. Delivery truck drivers should not be perceived as one group with equal risks, because there were significant differences between the 2 types of transportation and in the specific types of risks. VIDAR produces visual material for risk management processes. As a participatory approach, VIDAR stimulates active discussion about work-related risks and discomfort, and about possibilities for improvement. VIDAR may also be applied to work that comprises different working environments.
Parkin Kullmann, Jane Alana; Hayes, Susan; Wang, Min-Xia
2015-01-01
Background Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disease with a typical survival of three to five years. Epidemiological studies using paper-based questionnaires in individual countries or continents have failed to find widely accepted risk factors for the disease. The advantages of online versus paper-based questionnaires have been extensively reviewed, but few online epidemiological studies into human neurodegenerative diseases have so far been undertaken. Objective To design a Web-based questionnaire to identify environmental risk factors for ALS and enable international comparisons of these risk factors. Methods A Web-based epidemiological questionnaire for ALS has been developed based on experience gained from administering a previous continent-wide paper-based questionnaire for this disease. New and modified questions have been added from our previous paper-based questionnaire, from literature searches, and from validated ALS questionnaires supplied by other investigators. New criteria to allow the separation of familial and sporadic ALS cases have been included. The questionnaire addresses many risk factors that have already been proposed for ALS, as well as a number that have not yet been rigorously examined. To encourage participation, responses are collected anonymously and no personally identifiable information is requested. The survey is being translated into a number of languages which will allow many people around the world to read and answer it in their own language. Results After the questionnaire had been online for 4 months, it had 379 respondents compared to only 46 respondents for the same initial period using a paper-based questionnaire. The average age of the first 379 web questionnaire respondents was 54 years compared to the average age of 60 years for the first 379 paper questionnaire respondents. 
The questionnaire is soon to be promoted in a number of countries through ALS associations and disease registries. Conclusions Web-based questionnaires are a time- and resource-efficient method for performing large epidemiological studies of neurodegenerative diseases such as ALS. The ability to compare risk factors between different countries using the same analysis tool will be of particular value for finding robust risk factors that underlie ALS. PMID:26239255
NASA Astrophysics Data System (ADS)
Delaney, C.; Hartman, R. K.; Mendoza, J.; Evans, K. M.; Evett, S.
2016-12-01
Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation or flow forecasts to inform the flood operations of reservoirs. Previous research and modeling for flood control reservoirs have shown that FIRO can reduce flood risk and increase water supply for many reservoirs. The risk-based method of FIRO presents a unique approach that incorporates flow forecasts made by NOAA's California-Nevada River Forecast Center (CNRFC) to model and assess the risk of meeting or exceeding identified management targets or thresholds. Forecasted risk is evaluated against set risk tolerances to set reservoir flood releases. A water management model was developed for Lake Mendocino, a 116,500 acre-foot reservoir located near Ukiah, California. Lake Mendocino is a dual-use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated by the Sonoma County Water Agency for water supply. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has been plagued with water supply reliability issues since 2007. FIRO is applied to Lake Mendocino by simulating daily hydrologic conditions from 1985 to 2010 in the Upper Russian River from Lake Mendocino to the City of Healdsburg, approximately 50 miles downstream. The risk-based method is simulated using a 15-day, 61-member streamflow hindcast from the CNRFC. Model simulation results of risk-based flood operations demonstrate a 23% increase in average end-of-water-year (September 30) storage levels over current operations. Model results show no increase in the occurrence of flood damages at points downstream of Lake Mendocino. This investigation demonstrates that FIRO may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
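The risk-based decision rule described here can be sketched as follows: the forecast risk is the fraction of ensemble hindcast members that would push storage past a flood-pool threshold, and releases are increased only when that risk exceeds the tolerance. All numbers (storage, inflows, threshold, 10% tolerance) are hypothetical, not Lake Mendocino values:

```python
# Minimal FIRO-style rule: estimate exceedance risk from an ensemble
# forecast, then compare it against a set risk tolerance.
def forecast_risk(current_storage, member_inflows, planned_release, threshold):
    """Fraction of ensemble members whose inflow drives storage past threshold."""
    exceed = sum(current_storage + inflow - planned_release > threshold
                 for inflow in member_inflows)
    return exceed / len(member_inflows)

members = [10, 12, 25, 30, 8, 15, 40, 5, 22, 18]   # 10 ensemble inflow volumes
risk = forecast_risk(current_storage=90, member_inflows=members,
                     planned_release=10, threshold=105)
increase_release = risk > 0.10                      # 10% risk tolerance
```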
Dawe, Russell Eric; Bishop, Jessica; Pendergast, Amanda; Avery, Susan; Monaghan, Kelly; Duggan, Norah; Aubrey-Bassler, Kris
2017-01-01
Background: Previous research suggests that family physicians have rates of cesarean delivery that are lower than or equivalent to those for obstetricians, but adjustments for risk differences in these analyses may have been inadequate. We used an econometric method to adjust for observed and unobserved factors affecting the risk of cesarean delivery among women attended by family physicians versus obstetricians. Methods: This retrospective population-based cohort study included all Canadian (except Quebec) hospital deliveries by family physicians and obstetricians between Apr. 1, 2006, and Mar. 31, 2009. We excluded women with multiple gestations, and newborns with a birth weight less than 500 g or gestational age less than 20 weeks. We estimated the relative risk of cesarean delivery using instrumental-variable-adjusted and logistic regression. Results: The final cohort included 776 299 women who gave birth in 390 hospitals. The risk of cesarean delivery was 27.3%, and the mean proportion of deliveries by family physicians was 26.9% (standard deviation 23.8%). The relative risk of cesarean delivery for family physicians versus obstetricians was 0.48 (95% confidence interval [CI] 0.41-0.56) with logistic regression and 1.27 (95% CI 1.02-1.57) with instrumental-variable-adjusted regression. Interpretation: Our conventional analyses suggest that family physicians have a lower rate of cesarean delivery than obstetricians, but instrumental variable analyses suggest the opposite. Because instrumental variable methods adjust for unmeasured factors and traditional methods do not, the large discrepancy between these estimates of risk suggests that clinical and/or sociocultural factors affecting the decision to perform cesarean delivery may not be accounted for in our database. PMID:29233843
Karp, Igor; Sylvestre, Marie-Pierre; Abrahamowicz, Michal; Leffondré, Karen; Siemiatycki, Jack
2016-11-01
Assessment of individual risk of illness is an important activity in preventive medicine. Development of risk-assessment models has heretofore relied predominantly on studies involving follow-up of cohort-type populations, while case-control studies have generally been considered unfit for this purpose. To present a method for individualized assessment of absolute risk of an illness (as illustrated by lung cancer) based on data from a 'non-nested' case-control study. We used data from a case-control study conducted in Montreal, Canada in 1996-2001. Individuals diagnosed with lung cancer (n = 920) and age- and sex-matched lung-cancer-free subjects (n = 1288) completed questionnaires documenting life-time cigarette-smoking history and occupational, medical, and family history. Unweighted and weighted logistic models were fitted. Model overfitting was assessed using bootstrap-based cross-validation and 'shrinkage.' The discriminating ability was assessed by the c-statistic, and the risk-stratifying performance was assessed by examination of the variability in risk estimates over hypothetical risk-profiles. In the logistic models, the logarithm of incidence-density of lung cancer was expressed as a function of age, sex, cigarette-smoking history, history of respiratory conditions and exposure to occupational carcinogens, and family history of lung cancer. The models entailed a minimal degree of overfitting ('shrinkage' factor: 0.97 for both unweighted and weighted models) and moderately high discriminating ability (c-statistic: 0.82 for the unweighted model and 0.66 for the weighted model). The method's risk-stratifying performance was quite high. The presented method allows for individualized assessment of risk of lung cancer and can be used for development of risk-assessment models for other illnesses.
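The discriminating ability reported above (c-statistics of 0.82 and 0.66) is the area under the ROC curve, which for a case-control setting can be computed as the proportion of case/control pairs in which the case receives the higher predicted risk. The predicted risks below are hypothetical:

```python
# Pairwise c-statistic (AUC): concordant case/control pairs, with ties
# counted as half-concordant.
def c_statistic(case_scores, control_scores):
    concordant = ties = 0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                concordant += 1
            elif c == k:
                ties += 1
    return (concordant + 0.5 * ties) / (len(case_scores) * len(control_scores))

cases = [0.9, 0.7, 0.6]           # predicted risks for subjects with lung cancer
controls = [0.2, 0.6, 0.4, 0.1]   # predicted risks for cancer-free subjects
auc = c_statistic(cases, controls)
```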
Glick, Sara Nelson; Houston, Ebony; Peterson, James; Kuo, Irene; Magnus, Manya
2016-08-01
To develop optimal methods to study sexual health among black young men who have sex with men and transgender women (BYMSM/TW). We conducted a mixed-methods prospective study to identify recruitment and retention strategies for BYMSM/TW (age 16-21) in Washington D.C., and describe HIV risk behaviors and context. Incentivized peer referral was highly productive, and 60% of BYMSM/TW were retained for 3 months. Participants reported high levels of sexual risk, homophobia, racism, and maternal support. BYMSM/TW studies should utilize a combination of peer-based, in-person, and technology-based recruiting strategies. Additional research is needed to leverage mobile technology and social media to enhance retention.
Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.
Reyes Santos, Joost; Haimes, Yacov Y
2004-06-01
The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz the 1990 Nobel Memorial Prize in Economic Sciences. A typical approach to measuring a portfolio's expected return is based on the historical returns of the assets included in the portfolio. Portfolio risk, on the other hand, is usually measured by volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility has been the dominant measure of risk owing to its simplicity and its validity for relatively small asset-price fluctuations. Volatility is a justified measure for stable market performance, but it is weak at addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that of October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attacks of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of extreme-risk analysis through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted f(4), is defined as the conditional expectation over a lower-tail region of the distribution of possible portfolio returns. The article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model. 
However, under extremely unfavorable market conditions, results indicate that f(4) can be a more valid measure of risk than volatility.
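The extreme-risk measure f(4) described above is a conditional expectation of returns given that they fall in a lower-tail partition. A minimal empirical sketch follows; the PMRM proper partitions the probability axis of a fitted distribution, so this sample-based version with a 5% partition point is an illustrative assumption, not the paper's exact procedure:

```python
import numpy as np

def f4(returns, beta=0.05):
    """PMRM-style extreme-risk metric: the expected portfolio return
    conditional on falling in the lower beta-tail of the return distribution."""
    r = np.sort(np.asarray(returns, dtype=float))
    cutoff = np.quantile(r, beta)      # partition point separating the extreme tail
    tail = r[r <= cutoff]
    return tail.mean()

rng = np.random.default_rng(1)
returns = rng.normal(loc=0.001, scale=0.02, size=100_000)   # simulated daily returns
print(f"volatility (std of returns): {returns.std():.4f}")
print(f"f4, mean of worst 5% days  : {f4(returns):.4f}")
```

Unlike volatility, which weights all deviations symmetrically, f4 responds only to the magnitude of the worst outcomes, which is why the two measures can rank portfolios differently under crash scenarios.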
Risk assessment is a crucial component of the site remediation decision-making process. Some current EPA methods do not have detection limits low enough for risk assessment of many VOCs (e.g., EPA Region 3 Risk Based Concentration levels, EPA Region 9 Preliminary Remediation Goa...
ERIC Educational Resources Information Center
Mayo, Carrie; George, Valerie
2014-01-01
Objective: To investigate the relationship between risk of eating disorders, body dissatisfaction, and perceptual attractiveness in male university students. Participants: Research was conducted January-April 2012 and involved 339 male and 441 female students. Methods: Eating disorder risk was assessed with the Eating Attitudes Test (EAT) and body…
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Probabilistic Exposure Analysis for Chemical Risk Characterization
Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.
2009-01-01
This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
Approach to risk identification in undifferentiated mental disorders
Silveira, José; Rockman, Patricia; Fulford, Casey; Hunter, Jon
2016-01-01
Objective: To provide primary care physicians with a novel approach to risk identification and related clinical decision making in the management of undifferentiated mental disorders. Sources of information: We conducted a review of the literature in PubMed, CINAHL, PsycINFO, and Google Scholar using the search terms diagnostic uncertainty, diagnosis, risk identification, risk assessment/methods, risk, risk factors, risk management/methods, cognitive biases and psychiatry, decision making, mental disorders/diagnosis, clinical competence, evidence-based medicine, interviews as topic, psychiatry/education, psychiatry/methods, documentation/methods, forensic psychiatry/education, forensic psychiatry/methods, mental disorders/classification, mental disorders/psychology, violence/prevention and control, and violence/psychology. Main message: Mental disorders are a large component of practice in primary care and often present in an undifferentiated manner, remaining so for prolonged periods. The challenging search for a diagnosis can divert attention from risk identification, as diagnosis is commonly presumed to be necessary before treatment can begin. This might inadvertently contribute to preventable adverse events. Focusing on salient aspects of the patient presentation related to risk should be prioritized. This article presents a novel approach to organizing patient information to assist risk identification and decision making in the management of patients with undifferentiated mental disorders. Conclusion: A structured approach can help physicians to manage the clinical uncertainty common to risk identification in patients with mental disorders and cope with the common anxiety and cognitive biases that affect priorities in risk-related decision making. 
By focusing on risk, functional impairments, and related symptoms using a novel framework, physicians can meet their patients’ immediate needs while continuing the search for diagnostic clarity and long-term treatment. PMID:27965330
Elbogen, Eric B; Fuller, Sara; Johnson, Sally C; Brooks, Stephanie; Kinneer, Patricia; Calhoun, Patrick S; Beckham, Jean C
2010-08-01
Increased media attention to post-deployment violence highlights the need to develop effective models to guide risk assessment among military Veterans. Ideally, a method would help identify which Veterans are most at risk for violence so that it can be determined what could be done to prevent violent behavior. This article suggests how empirical approaches to risk assessment used successfully in civilian populations can be applied to Veterans. A review was conducted of the scientific literature on Veteran populations regarding factors related to interpersonal violence generally and to domestic violence specifically. A checklist was then generated of empirically-supported risk factors for clinicians to consider in practice. To conceptualize how these known risk factors relate to a Veteran's violence potential, risk assessment scholarship was utilized to develop an evidence-based method to guide mental health professionals. The goals of this approach are to integrate science into practice, overcome logistical barriers, and permit more effective assessment, monitoring, and management of violence risk for clinicians working with Veterans, both in Department of Veteran Affairs settings and in the broader community. Research is needed to test the predictive validity of risk assessment models. Ultimately, the use of a systematic, empirical framework could lead to improved clinical decision-making in the area of risk assessment and potentially help prevent violence among Veterans. Published by Elsevier Ltd.
Rooney, Andrew A.; Cooper, Glinda S.; Jahnke, Gloria D.; Lam, Juleen; Morgan, Rebecca L.; Boyles, Abee L.; Ratcliffe, Jennifer M.; Kraft, Andrew D.; Schünemann, Holger J.; Schwingl, Pamela; Walker, Teneille D.; Thayer, Kristina A.; Lunn, Ruth M.
2016-01-01
Environmental health hazard assessments are routinely relied upon for public health decision-making. The evidence base used in these assessments is typically developed from a collection of diverse sources of information of varying quality. It is critical that literature-based evaluations consider the credibility of individual studies used to reach conclusions through consistent, transparent and accepted methods. Systematic review procedures address study credibility by assessing internal validity or “risk of bias” — the assessment of whether the design and conduct of a study compromised the credibility of the link between exposure/intervention and outcome. This paper describes the commonalities and differences in risk-of-bias methods developed or used by five groups that conduct or provide methodological input for performing environmental health hazard assessments: the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) Working Group, the Navigation Guide, the National Toxicology Program’s (NTP) Office of Health Assessment and Translation (OHAT) and Office of the Report on Carcinogens (ORoC), and the Integrated Risk Information System of the U.S. Environmental Protection Agency (EPA-IRIS). Each of these groups has been developing and applying rigorous assessment methods for integrating across a heterogeneous collection of human and animal studies to inform conclusions on potential environmental health hazards. There is substantial consistency across the groups in the consideration of risk-of-bias issues or “domains” for assessing observational human studies. There is a similar overlap in terms of domains addressed for animal studies; however, the groups differ in the relative emphasis placed on different aspects of risk of bias. Future directions for the continued harmonization and improvement of these methods are also discussed. PMID:26857180
Yi, Huso; Hallowell, Nina; Griffiths, Sian; Yeung Leung, Tak
2013-01-01
Background: A newly introduced non-invasive prenatal test based on cell-free fetal DNA sequencing (DNA-NIPT) detects Down syndrome with a sensitivity of 99% at an early gestational stage without risk of miscarriage. Attention has been given to its public health implications, but little is known about consumer perspectives. This qualitative study aimed to explore women's motivations for using, and perceptions of, DNA-NIPT in Hong Kong. Methods and Findings: In-depth interviews were conducted with 45 women who had undertaken DNA-NIPT, recruited by purposive sampling based on socio-demographic and clinical characteristics. The sample included 31 women identified as high risk by serum- and ultrasound-based Down syndrome screening (SU-DSS). Thematic narrative analysis examined informed decision making about the test and identified its benefits and needs. Women outlined a number of reasons for accessing DNA-NIPT: reducing the uncertainty associated with the probability-based results of SU-DSS, undertaking DNA-NIPT as a comprehensive measure to counteract the risks of childbearing, especially at advanced age, and its perceived predictive accuracy and absence of risk of harm to the fetus. The accounts of women deemed high risk or not high risk were distinctive in a number of respects. High-risk women accessed DNA-NIPT to get a clearer idea of their risk. This group perceived SU-DSS as an unnecessary and confusing procedure because of its varying, protocol-dependent detection rates. Women not deemed high risk, in contrast, undertook DNA-NIPT for psychological assurance and to reduce anxiety even after receiving a negative result from SU-DSS. Conclusions: DNA-NIPT was regarded positively by women who chose this method of screening over the routine, less expensive testing options. Given its perceived utility, health providers need to consider whether DNA-NIPT should be offered as part of universal routine care to women at high risk for fetal aneuploidy. 
If this is the case, then further development of guidelines and quality assurance will be needed to provide a service suited to patients’ needs. PMID:24312358
SOME PROBLEMS OF "SAFE DOSE" ESTIMATION
In environmental carcinogenic risk assessment, the usually defined "safe doses" appear subjective in some sense. In this paper a method of standardizing "safe doses" based on some objective parameters is introduced, along with a procedure for estimating safe doses under the competing risks...
Using qPCR for Water Microbial Risk Assessments
Microbial risk assessment (MRA) has traditionally utilized microbiological data that was obtained by culture-based techniques that are expensive and time consuming. With the advent of PCR methods there is a realistic opportunity to conduct MRA studies economically, in less time,...
Evidence-based risk communication: a systematic review.
Zipkin, Daniella A; Umscheid, Craig A; Keating, Nancy L; Allen, Elizabeth; Aung, KoKo; Beyth, Rebecca; Kaatz, Scott; Mann, Devin M; Sussman, Jeremy B; Korenstein, Deborah; Schardt, Connie; Nagi, Avishek; Sloane, Richard; Feldstein, David A
2014-08-19
Effective communication of risks and benefits to patients is critical for shared decision making. To review the comparative effectiveness of methods of communicating probabilistic information to patients that maximize their cognitive and behavioral outcomes. PubMed (1966 to March 2014) and CINAHL, EMBASE, and the Cochrane Central Register of Controlled Trials (1966 to December 2011) using several keywords and structured terms. Prospective or cross-sectional studies that recruited patients or healthy volunteers and compared any method of communicating probabilistic information with another method. Two independent reviewers extracted study characteristics and assessed risk of bias. Eighty-four articles, representing 91 unique studies, evaluated various methods of numerical and visual risk display across several risk scenarios and with diverse outcome measures. Studies showed that visual aids (icon arrays and bar graphs) improved patients' understanding and satisfaction. Presentations including absolute risk reductions were better than those including relative risk reductions for maximizing accuracy and seemed less likely than presentations with relative risk reductions to influence decisions to accept therapy. The presentation of numbers needed to treat reduced understanding. Comparative effects of presentations of frequencies (such as 1 in 5) versus event rates (percentages, such as 20%) were inconclusive. Most studies were small and highly variable in terms of setting, context, and methods of administering interventions. Visual aids and absolute risk formats can improve patients' understanding of probabilistic information, whereas numbers needed to treat can lessen their understanding. Due to study heterogeneity, the superiority of any single method for conveying probabilistic information is not established, but there are several good options to help clinicians communicate with patients. None.
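The formats compared in the review above are arithmetic re-expressions of the same pair of event rates. A toy illustration of how one treatment effect becomes an absolute risk reduction, a relative risk reduction, and a number needed to treat (rates invented, not drawn from any cited study):

```python
def risk_formats(control_risk, treated_risk):
    """Express one treatment effect as ARR, RRR, and NNT."""
    arr = control_risk - treated_risk   # absolute risk reduction
    rrr = arr / control_risk            # relative risk reduction
    nnt = 1.0 / arr                     # number needed to treat
    return arr, rrr, nnt

# Toy trial: event rate falls from 20% in controls to 15% with treatment
arr, rrr, nnt = risk_formats(0.20, 0.15)
print(f"ARR {arr:.0%}, RRR {rrr:.0%}, NNT {nnt:.0f}")
# The same effect sounds larger framed as a "25% relative reduction" than as
# "5 percentage points absolute", which is the framing issue the review documents.
```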
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, Edward T.
Purpose: To develop a robust method for deriving dose-painting prescription functions using spatial information about the risk for disease recurrence. Methods: Spatial distributions of radiobiological model parameters are derived from distributions of recurrence risk after uniform irradiation. These model parameters are then used to derive optimal dose-painting prescription functions given a constant mean biologically effective dose. Results: An estimate for the optimal dose distribution can be derived based on spatial information about recurrence risk. Dose painting based on imaging markers that are moderately or poorly correlated with recurrence risk is predicted to potentially result in inferior disease control when compared with the same mean biologically effective dose delivered uniformly. A robust optimization approach may partially mitigate this issue. Conclusions: The methods described here can be used to derive an estimate for a robust, patient-specific prescription function for use in dose painting. Two approximate scaling relationships were observed: First, the optimal choice for the maximum dose differential when using either a linear or two-compartment prescription function is proportional to R, where R is the Pearson correlation coefficient between a given imaging marker and recurrence risk after uniform irradiation. Second, the predicted maximum possible gain in tumor control probability for any robust optimization technique is nearly proportional to the square of R.
Ueda, Peter; Woodward, Mark; Lu, Yuan; Hajifathalian, Kaveh; Al-Wotayan, Rihab; Aguilar-Salinas, Carlos A; Ahmadvand, Alireza; Azizi, Fereidoun; Bentham, James; Cifkova, Renata; Di Cesare, Mariachiara; Eriksen, Louise; Farzadfar, Farshad; Ferguson, Trevor S; Ikeda, Nayu; Khalili, Davood; Khang, Young-Ho; Lanska, Vera; León-Muñoz, Luz; Magliano, Dianna J; Margozzini, Paula; Msyamboza, Kelias P; Mutungi, Gerald; Oh, Kyungwon; Oum, Sophal; Rodríguez-Artalejo, Fernando; Rojas-Martinez, Rosalba; Valdivia, Gonzalo; Wilks, Rainford; Shaw, Jonathan E; Stevens, Gretchen A; Tolstrup, Janne S; Zhou, Bin; Salomon, Joshua A; Ezzati, Majid; Danaei, Goodarz
2017-01-01
Background: Worldwide implementation of risk-based cardiovascular disease (CVD) prevention requires risk prediction tools that are contemporarily recalibrated for the target country and can be used where laboratory measurements are unavailable. We present two cardiovascular risk scores, with and without laboratory-based measurements, and the corresponding risk charts for 182 countries to predict 10-year risk of fatal and non-fatal CVD in adults aged 40–74 years. Methods: Based on our previous laboratory-based prediction model (Globorisk), we used data from eight prospective studies to estimate coefficients of the risk equations using proportional hazard regressions. The laboratory-based risk score included age, sex, smoking, blood pressure, diabetes, and total cholesterol; in the non-laboratory (office-based) risk score, we replaced diabetes and total cholesterol with BMI. We recalibrated risk scores for each sex and age group in each country using country-specific mean risk factor levels and CVD rates. We used recalibrated risk scores and data from national surveys (using data from adults aged 40–64 years) to estimate the proportion of the population at different levels of CVD risk for ten countries from different world regions as examples of the information the risk scores provide; we applied a risk threshold for high risk of at least 10% for high-income countries (HICs) and at least 20% for low-income and middle-income countries (LMICs) on the basis of national and international guidelines for CVD prevention. We estimated the proportion of men and women who were similarly categorised as high risk or low risk by the two risk scores. Findings: Predicted risks for the same risk factor profile were generally lower in HICs than in LMICs, with the highest risks in countries in central and southeast Asia and eastern Europe, including China and Russia. In HICs, the proportion of people aged 40–64 years at high risk of CVD ranged from 1% for South Korean women to 42% for Czech men (using a ≥10% risk threshold), and in low-income countries ranged from 2% in Uganda (men and women) to 13% in Iranian men (using a ≥20% risk threshold). More than 80% of adults were similarly classified as low or high risk by the laboratory-based and office-based risk scores. However, the office-based model substantially underestimated the risk among patients with diabetes. Interpretation: Our risk charts provide risk assessment tools that are recalibrated for each country and make the estimation of CVD risk possible without using laboratory-based measurements. PMID:28126460
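The recalibration idea (keeping the hazard-ratio coefficients fixed while rescaling the baseline to each country's mean risk-factor levels and CVD rate) can be sketched as follows. The coefficients, country means, and simplified functional form below are hypothetical placeholders, not the published Globorisk equations:

```python
import numpy as np

def recalibrated_risk(x, betas, country_means, country_mean_risk):
    """Proportional-hazards risk, recalibrated so that a person at the
    country's mean risk-factor levels has the country's mean 10-year CVD risk.
    (A simplified sketch of the recalibration principle, not the paper's model.)"""
    s0 = 1.0 - country_mean_risk                         # baseline 10-year survival
    lp = np.dot(betas, np.asarray(x) - np.asarray(country_means))
    return 1.0 - s0 ** np.exp(lp)

# Hypothetical log-hazard coefficients for age, systolic BP, current smoking
betas = [0.06, 0.015, 0.5]
means = [52.0, 130.0, 0.2]    # hypothetical country means for the same factors
print(recalibrated_risk([52.0, 130.0, 0.2], betas, means, 0.10))  # mean profile: 0.10
print(recalibrated_risk([60.0, 150.0, 1.0], betas, means, 0.10))  # older smoker: higher
```

Because the baseline is anchored to country-level rates, the same individual profile yields different predicted risks in different countries, which is the behaviour the Findings section describes.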
Gazijahani, Farhad Samadi; Ravadanegh, Sajad Najafi; Salehi, Javad
2018-02-01
The inherent volatility and unpredictable nature of renewable generation and load demand pose considerable challenges for the energy exchange optimization of microgrids (MGs). To address these challenges, this paper proposes a new risk-based multi-objective energy exchange optimization for networked MGs from economic and reliability standpoints under load consumption and renewable power generation uncertainties. Three risk-based strategies are distinguished by using the conditional value at risk (CVaR) approach. The proposed model has two distinct objective functions: the first minimizes the operation and maintenance costs, the cost of power transactions between the upstream network and the MGs, and the power loss cost, whereas the second minimizes the energy not supplied (ENS). Furthermore, a stochastic scenario-based approach is incorporated into the model to handle the uncertainty, and the Kantorovich distance scenario reduction method is implemented to reduce the computational burden. Finally, the non-dominated sorting genetic algorithm (NSGA-II) is applied to minimize both objective functions simultaneously, and the best solution is extracted by a fuzzy satisfying method with respect to the risk-based strategies. The model is evaluated on the modified IEEE 33-bus distribution system, and the obtained results show that the presented approach is an efficient tool for optimal energy exchange among MGs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
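The CVaR measure underpinning the risk-based strategies can be sketched empirically: at confidence level alpha, it is the mean cost over the worst (1 - alpha) share of scenarios. The per-scenario costs below are invented for illustration; the paper itself embeds CVaR inside a scenario-based optimization rather than computing it post hoc:

```python
import numpy as np

def cvar(costs, alpha=0.95):
    """Conditional value at risk: expected cost in the worst (1 - alpha) tail
    of the empirical scenario-cost distribution."""
    c = np.sort(np.asarray(costs, dtype=float))
    var = np.quantile(c, alpha)      # value at risk at level alpha
    tail = c[c >= var]
    return tail.mean()

scenario_costs = [100, 105, 98, 110, 250, 102, 99, 400, 101, 103]  # cost per scenario
print(f"CVaR at 95%: {cvar(scenario_costs):.1f}")              # 400.0
print(f"CVaR at 80%: {cvar(scenario_costs, alpha=0.8):.1f}")   # 325.0
```

Raising alpha focuses the objective on rarer, more extreme scenarios, which is what separates the paper's risk-averse strategies from a risk-neutral expected-cost objective.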
Mollalo, A; Khodabandehloo, E
2016-07-01
Zoonotic cutaneous leishmaniasis (ZCL) constitutes a serious public health problem in many parts of the world including Iran. This study was carried out to assess the risk of the disease in an endemic province by developing spatial environmentally based models in yearly intervals. To fill the gap of underestimated true burden of ZCL and short study period, analytical hierarchy process (AHP) and fuzzy AHP decision-making methods were used to determine the ZCL risk zones in a Geographic Information System platform. Generated risk maps showed that high-risk areas were predominantly located at the northern and northeastern parts in each of the three study years. Comparison of the generated risk maps with geocoded ZCL cases at the village level demonstrated that in both methods more than 90%, 70% and 80% of the cases occurred in high and very high risk areas for the years 2010, 2011, and 2012, respectively. Moreover, comparison of the risk categories with spatially averaged normalized difference vegetation index (NDVI) images and a digital elevation model of the study region indicated persistent strong negative relationships between these environmental variables and ZCL risk degrees. These findings identified more susceptible areas of ZCL and will help the monitoring of this zoonosis to be more targeted.
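The AHP step behind risk maps like these derives criterion weights from a pairwise-comparison matrix of expert judgments; the standard eigenvector method is sketched below. The criteria and judgment values are hypothetical, not those elicited in the ZCL study:

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP criterion weights as the principal eigenvector of a reciprocal
    pairwise-comparison matrix (Saaty's eigenvector method)."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    principal = vecs[:, np.argmax(vals.real)].real   # Perron eigenvector
    w = np.abs(principal)
    return w / w.sum()

# Hypothetical judgments: NDVI vs elevation vs proximity to rodent habitats
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(A)
print(np.round(w, 3))   # weights sum to 1; the first criterion dominates
```

In a GIS workflow these weights would then multiply the reclassified environmental layers before summing them into a composite risk surface.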
Using technology to assess and intervene with illicit drug-using persons at risk for HIV.
Horvath, Keith J; Lammert, Sara; LeGrand, Sara; Muessig, Kathryn E; Bauermeister, José A
2017-09-01
This review describes recent literature on novel ways technology is used for assessment of illicit drug use and HIV risk behaviours, suggestions for optimizing intervention acceptability, and recently completed and ongoing technology-based interventions for drug-using persons at risk for HIV and others with high rates of drug use and HIV risk behaviour. Among studies (n = 5) comparing technology-based to traditional assessment methods, those using Ecological Momentary Assessment (EMA) had high rates of reported drug use and high concordance with traditional assessment methods. The two recent studies assessing the acceptability of mHealth approaches overall demonstrate high interest in these approaches. Current or in-progress technology-based interventions (n = 8) are delivered using mobile apps (n = 5), text messaging (n = 2) and computers (n = 1). Most intervention studies are in progress or do not report intervention outcomes; the results from one efficacy trial showed significantly higher HIV testing rates among persons in need of drug treatment. Studies are needed to continually assess technology adoption and intervention preferences among drug-using populations to ensure that interventions are appropriately matched to users. Large-scale technology-based intervention trials to assess the efficacy of these approaches, as well as the impact of individual intervention components, on drug use and other high-risk behaviours are recommended.
ERIC Educational Resources Information Center
Ha, Yeongmi; Choi, Eunsook; Seo, Yeongmi; Kim, Tae-gu
2013-01-01
Background: This study identified relationships among subjective social status (SSS), weight perception, weight control behaviors, and weight status in Korean adolescents using nationally representative data collected from the 2009 Korea Youth Risk Behaviors Web-Based Survey. Methods: Data from 67,185 students aged 12-18 years were analyzed.…
Contract Design: Risk Management and Evaluation.
Mühlbacher, Axel C; Amelung, Volker E; Juhnke, Christin
2018-01-12
Effective risk adjustment is an aspect that is given more and more weight against the background of competitive health insurance systems and vital healthcare systems. The risk structure of the providers plays a vital role in Pay for Performance. A prerequisite for optimal incentive-based service models is a (partial) dependence of the agent's returns on the provider's gain level. Integrated care systems as well as accountable care organisations (ACOs) in the US and similar concepts in other countries are advocated as an effective method of improving the performance of healthcare systems. These systems outline a payment and care delivery model that intends to tie provider reimbursements to predefined quality metrics, thereby reducing the total costs of care. Little is known about the contractual design and the main challenges of delegating "accountability" to these new kinds of organisations and/or contracts. The costs of market utilisation are highly relevant for the conception of healthcare contracts; furthermore, information asymmetries and contract-specific investments are an obstacle to the efficient operation of ACOs. A comprehensive literature review on methods of designing contracts in Integrated Care was conducted. The research question in this article focuses on how reimbursement strategies, evaluation of measures and methods of risk adjustment can best be integrated in healthcare contracting. Each integrated care contract includes challenges for both payers and providers without sufficient empirical data on either side. These challenges are of a clinical, administrative or financial nature. Risk-adjusted contracts ensure that the reimbursement roughly matches the true costs resulting from the morbidity of a population. If the reimbursement of the care provider corresponds to the actual expenses for an individual/population, the problem of risk selection is greatly reduced.
The currently used methods of risk adjustment have widely differing model and forecast accuracy. For this reason, it is necessary to clearly regulate the method of risk adjustment in the integrated care contract. The series of three articles on contract design has shown that coordination and motivation problems in designing healthcare contracts cannot be solved at no cost. Moreover, it became clear that complete contracts in healthcare are unrealistic and that contracts always include certain uncertainties. These stem from random risk, and no contracting party can control such risks completely. Nor is it possible to fully integrate these risks into the contract or for the parties to eliminate them.
Risk Control Through the Use of Procedures - A Method for Evaluating the Change in Risk
NASA Technical Reports Server (NTRS)
Praino, Gregory; Sharit, Joseph
2010-01-01
Organizations use procedures to influence or control the behavior of their workers, but often have no basis for determining whether an additional rule or procedural control will be beneficial. This paper outlines a proposed method for determining whether the addition or removal of procedural controls will affect the occurrence of critical consequences. The proposed method focuses on two aspects: how valuable the procedural control is, based on the inevitability of the consequence and the opportunity to intervene; and how likely the control is to fail, based on five procedural design elements that address how well the rule or control has been Defined, Assigned, Trained, Organized and Monitored (referred to as the DATOM elements).
Risk analysis theory applied to fishing operations: A new approach on the decision-making problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cunha, J.C.S.
1994-12-31
In the past, decisions concerning whether to continue or interrupt a fishing operation were based primarily on the operator's previous experience. This procedure often led to wrong decisions and unnecessary loss of money and time. This paper describes a decision-making method based on risk analysis theory and previous operation results from a field under study. The method leads to more accurate decisions on a daily basis, allowing the operator to verify, for each day of the operation, whether the decision being carried out is the one with the highest probability of leading to the best economic result. An example of the method's application is provided at the end of the paper.
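The daily go/no-go logic this abstract describes can be sketched as an expected-value comparison. Everything below is hypothetical: the cost structure, the success probability, and the dollar figures are illustrative stand-ins, not values from the paper.

```python
def expected_cost_continue(p_success, cost_per_day, days_left, fallback_cost):
    """Expected cost of continuing the fishing operation: fishing costs
    for the remaining days plus, on failure, the fallback (e.g. a
    sidetrack) that would still have to be paid for."""
    return cost_per_day * days_left + (1 - p_success) * fallback_cost

def should_continue(p_success, cost_per_day, days_left, fallback_cost):
    # Continue only while the expected cost of fishing on beats
    # cutting losses and paying for the fallback immediately.
    return expected_cost_continue(p_success, cost_per_day, days_left,
                                  fallback_cost) < fallback_cost

# Illustrative figures: 40% chance of success over 2 more days at
# $15k/day, versus an immediate $120k sidetrack.
print(should_continue(0.40, 15_000, 2, 120_000))  # → True
```

Re-running the comparison each day with an updated success probability mirrors the paper's idea of verifying daily that the current decision is still the economically best one.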
Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach
NASA Astrophysics Data System (ADS)
Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh
2017-03-01
Retreat mining is always accompanied by a great number of accidents, most of which are due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of this method is that it ignores subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this shortcoming and improve the original method, this paper presents a novel methodology to assess RFS using a fuzzy approach. The fuzzy approach provides an effective tool for handling subjective uncertainties. Furthermore, the fuzzy analytic hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during the development of the method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that the methodology is effective and efficient in assessing RFS.
Forecasting VaR and ES of stock index portfolio: A Vine copula method
NASA Astrophysics Data System (ADS)
Zhang, Bangzheng; Wei, Yu; Yu, Jiang; Lai, Xiaodong; Peng, Zhenfeng
2014-12-01
Risk measurement has both theoretical and practical significance in risk management. Using daily samples of 10 international stock indices, this paper first models the dependence structures among different stock markets with C-Vine, D-Vine and R-Vine copula models. Secondly, the Value-at-Risk (VaR) and Expected Shortfall (ES) of the international stock market portfolio are forecasted using a Monte Carlo method based on the dependence estimated by the different Vine copulas. Finally, the accuracy of the VaR and ES measurements obtained from the different statistical models is evaluated by UC, IND, CC and posterior analysis. The empirical results show that the VaR forecasts at the quantile levels of 0.9, 0.95, 0.975 and 0.99 with the three kinds of Vine copula models are sufficiently accurate. Several traditional methods, such as historical simulation, mean-variance and DCC-GARCH models, fail to pass the CC backtesting. The Vine copula methods can accurately forecast the ES of the portfolio on the basis of the VaR measurement, and the D-Vine copula model is superior to the other Vine copulas.
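Once portfolio returns have been simulated (from a Vine copula or any other dependence model), VaR and ES follow from simple tail statistics. The sketch below is not the paper's estimation pipeline: a Student-t sample stands in for copula-simulated portfolio returns.

```python
import numpy as np

def var_es(returns, level=0.95):
    """Estimate Value-at-Risk and Expected Shortfall from simulated
    returns. VaR is the loss quantile at the given confidence level;
    ES is the mean loss beyond VaR (both as positive loss numbers)."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, level)
    es = losses[losses >= var].mean()
    return var, es

rng = np.random.default_rng(42)
# Hypothetical stand-in for copula-simulated daily portfolio returns
simulated = rng.standard_t(df=4, size=100_000) * 0.01
var95, es95 = var_es(simulated, 0.95)
print(f"VaR(95%)={var95:.4f}  ES(95%)={es95:.4f}")
```

By construction ES is at least as large as VaR at the same level, which is why the paper treats ES forecasting as a refinement built "on the basis of" the VaR measurement.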
Ability to distinguish between human and animal fecal pollution is important for risk assessment and watershed management, particularly in bodies of water used as sources of drinking water or for recreation. PCR-based methods were used to determine the source of fecal pollution ...
Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas
Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian
2015-01-01
Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area. PMID:26580644
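The multiplicative aggregation the abstract describes (total risk as the product of per-indicator risk values, which sidesteps weight elicitation and exposes the dominant factor) can be sketched in a few lines. The indicator names and ordinal scores below are invented for illustration.

```python
from math import prod

def integrated_risk(scores):
    """Multiply per-indicator risk values; the product lets any single
    high-risk indicator dominate, which helps expose the main driver."""
    return prod(scores.values())

def main_driver(scores):
    # The indicator contributing the largest factor to the product.
    return max(scores, key=scores.get)

# Hypothetical indicator scores on a 1-4 ordinal scale
scores = {"heavy_metals": 4, "factory_density": 2, "land_use_conflict": 1}
print(integrated_risk(scores), main_driver(scores))  # → 8 heavy_metals
```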
Instability risk assessment of construction waste pile slope based on fuzzy entropy
NASA Astrophysics Data System (ADS)
Ma, Yong; Xing, Huige; Yang, Mao; Nie, Tingting
2018-05-01
Considering the nature and characteristics of construction waste piles, this paper analyzes the factors affecting the stability of construction waste pile slopes and establishes a system of assessment indexes for their failure risks. Based on the basic principles and methods of fuzzy mathematics, the factor set and the evaluation set were established. The membership grades of the continuous factor indexes were determined using the "ridge-shaped distribution" function, while those of the discrete factor indexes were determined by the Delphi method. For the factor weights, the subjective weights were determined by the Analytic Hierarchy Process (AHP) and the objective weights by the entropy weight method, and a distance function was introduced to determine the combination coefficient. The paper establishes a fuzzy comprehensive assessment model of the slope failure risks of construction waste piles and assesses pile slopes along the two dimensions of hazard and vulnerability; the root mean square of the hazard and vulnerability assessment results is the final assessment result. A construction waste pile slope is then used as a worked example: the risks of the four stages of a landfill are assessed, the assessment model is verified, and the slope's failure risks and preventive measures against sliding are analyzed.
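Two of the steps above, the objective (entropy) weighting of indexes and the final root-mean-square combination of the hazard and vulnerability results, can be sketched as follows. The score matrix and the two sub-assessment values are made-up numbers, and the AHP weighting and combination-coefficient steps are omitted.

```python
import numpy as np

def entropy_weights(X):
    """Objective indicator weights via the entropy weight method.
    X: (m samples x n indicators), non-negative scores."""
    P = X / X.sum(axis=0)                          # normalise each column
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logP).sum(axis=0) / np.log(len(X))   # information entropy
    d = 1.0 - E                                    # degree of divergence
    return d / d.sum()                             # weights sum to 1

X = np.array([[0.6, 0.2, 0.9],     # hypothetical normalised index scores
              [0.4, 0.8, 0.1],
              [0.5, 0.5, 0.7]])
w = entropy_weights(X)

hazard, vulnerability = 0.62, 0.48            # assumed sub-assessment results
final = np.sqrt((hazard**2 + vulnerability**2) / 2)   # RMS combination
print(w.round(3), round(final, 3))
```

Indexes whose scores vary more across samples carry more information (lower entropy) and therefore receive larger objective weights.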
NASA Astrophysics Data System (ADS)
Purwanggono, Bambang; Margarette, Anastasia
2017-12-01
Timely completion of highway construction is very important for smooth transportation, especially as the number of motor vehicles is expected to increase each year. This study was therefore conducted to analyze the constraints encountered in an infrastructure development project. The research was conducted on the Jatingaleh Underpass Project, Semarang, while the project was running; during implementation, the project experienced delays. The aim is to find out which constraints occur in the execution of a road infrastructure project, in particular those that cause delays. A fishbone diagram is used to find the root causes and obtain possible means of mitigation, coupled with the RFMEA method to determine the critical risks that must be addressed immediately. The tabulated data indicate that the most feasible mitigation measure is to issue Standard Operating Procedure (SOP) recommendations for handling utilities that disrupt project implementation. The risk assessment process was carried out systematically based on ISO 31000:2009 on risk management, and the delay variables were determined using the process-group requirements of ISO 21500:2013 on project management.
Assessing diet in populations at risk for konzo and neurolathyrism.
Dufour, Darna L
2011-03-01
Although both konzo and neurolathyrism are diseases associated with diet, we know surprisingly little about the diets of the groups at risk. The objective of this paper is to discuss methods for assessing dietary intake in populations at risk for konzo and lathyrism. These methods include weighed food records and interview-based techniques such as 24-h recalls and food frequency questionnaires (FFQs). Food records have the potential to provide accurate information on food quantities and are generally the method of choice. Interview-based methods provide less precise information on the quantities of foods ingested and are subject to recall bias, but may be useful in some studies or for surveillance. Sample sizes need to be adequate to account for day-to-day and seasonal variability in food intake, and for differences between age and sex groups. Adequate data on the composition of foods, as actually consumed, are needed to evaluate the food intake information. This is especially important in the case of cassava and grass pea, where the toxin content of the diet is a function of processing. Biomarkers for assessing cyanogen exposure from cassava-based diets are available; biomarkers for β-ODAP exposure from grass pea diets need development. Copyright © 2010 Elsevier Ltd. All rights reserved.
Wang, Ying; Wang, Juying; Mu, Jingli; Wang, Zhen; Cong, Yi; Yao, Ziwei; Lin, Zhongsheng
2016-06-01
Polycyclic aromatic hydrocarbons (PAHs), a class of ubiquitous pollutants in marine environments, exhibit moderate to high adverse effects on aquatic organisms and humans. However, the lack of PAH toxicity data for aquatic organisms has limited the evaluation of their ecological risks. In the present study, aquatic predicted no-effect concentrations (PNECs) of 16 priority PAHs were derived based on species sensitivity distribution models, and their probabilistic ecological risks in the seawater of Liaodong Bay, Bohai Sea, China, were assessed. A quantitative structure-activity relationship method was adopted to obtain the predicted chronic toxicity data for the PNEC derivation. Good agreement between aquatic PNECs of 8 PAHs based on predicted and experimental chronic toxicity data was observed (R(2) = 0.746), and the calculated PNECs ranged from 0.011 µg/L to 205.3 µg/L. A significant log-linear relationship also existed between the octanol-water partition coefficient and the PNECs derived from experimental toxicity data (R(2) = 0.757). A similar ordering of ecological risks for the 16 PAHs in the seawater of Liaodong Bay was found by the probabilistic risk quotient and joint probability curve methods. The individually high ecological risks of benzo[a]pyrene, benzo[b]fluoranthene, and benz[a]anthracene warrant particular attention. The combined ecological risk of PAHs in the seawater of Liaodong Bay calculated by the joint probability curve method was 13.9%, indicating a high risk as a result of co-exposure to PAHs. Environ Toxicol Chem 2016;35:1587-1593. © 2015 SETAC.
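A minimal version of the SSD-based PNEC derivation: fit a log-normal species sensitivity distribution to chronic endpoints, take its 5th percentile (HC5), and divide by an assessment factor. The NOEC values and the assessment factor of 5 below are hypothetical, not the paper's data.

```python
import numpy as np

def hc5(toxicity_ugL):
    """Hazardous concentration for 5% of species (HC5) from a
    log-normal species sensitivity distribution fitted to chronic
    toxicity endpoints."""
    logs = np.log10(toxicity_ugL)
    mu, sigma = logs.mean(), logs.std(ddof=1)
    z05 = -1.6449  # 5th-percentile standard-normal quantile
    return 10 ** (mu + z05 * sigma)

# Hypothetical chronic NOEC values (ug/L), for illustration only
noecs = np.array([5.0, 12.0, 30.0, 80.0, 150.0, 400.0])
pnec = hc5(noecs) / 5.0  # assessment factor of 5 (assumed)
print(round(pnec, 3))
```

Because z05 is negative, the HC5 always falls below the geometric mean of the endpoints, so the derived PNEC is protective of the bulk of tested species.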
Guidelines for Risk-Based Changeover of Biopharma Multi-Product Facilities.
Lynch, Rob; Barabani, David; Bellorado, Kathy; Canisius, Peter; Heathcote, Doug; Johnson, Alan; Wyman, Ned; Parry, Derek Willison
2018-01-01
In multi-product biopharma facilities, protecting products from contamination when multiple products are manufactured simultaneously is paramount to assuring product quality. To that end, traditional changeover methods (elastomer change-out, full sampling, etc.) have been widely used within the industry and accepted by regulatory agencies. However, with the endorsement of Quality Risk Management (1), risk-based approaches may be applied to assess and continuously improve established changeover processes. All processes, including changeover, can be improved with investment (money/resources), parallel activities, equipment design improvements, and standardization. However, processes can also be improved by eliminating waste. For product changeover, waste is any activity not needed for the new process or that does not provide added assurance of the quality of the subsequent product. The application of a risk-based approach to changeover aligns with the principles of Quality Risk Management. Through the use of risk assessments, the appropriate changeover controls can be identified and controlled to assure product quality is maintained. Likewise, risk assessments and risk-based approaches may be used to improve operational efficiency, reduce waste, and permit concurrent manufacturing of products. © PDA, Inc. 2018.
Impact of model-based risk analysis for liver surgery planning.
Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K
2014-05-01
A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al., in Proceedings of the International Symposium on Computer Assisted Radiology and Surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al., Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To examine whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, the time to complete the planning task and the confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.
NASA Astrophysics Data System (ADS)
Davis, Adam Christopher
This research develops a new framework for evaluating the occupational risks of exposure to hazardous substances in any setting where As Low As Reasonably Achievable (ALARA) practices are mandated or used. The evaluation is performed by developing a hypothesis-test-based procedure for evaluating the homogeneity of various epidemiological cohorts, and thus the appropriateness of the application of aggregate data-pooling techniques to those cohorts. A statistical methodology is then developed as an alternative to aggregate pooling for situations in which individual cohorts show heterogeneity between them and are thus unsuitable for pooled analysis. These methods are then applied to estimate the all-cancer mortality risks incurred by workers at four Department-of-Energy nuclear weapons laboratories. Both linear, no-threshold and dose-bin averaged risks are calculated and it is further shown that aggregate analysis tends to overestimate the risks with respect to those calculated by the methods developed in this work. The risk estimates developed in Chapter 2 are, in Chapter 3, applied to assess the risks to workers engaged in americium recovery operations at Los Alamos National Laboratory. The work described in Chapter 3 develops a full radiological protection assessment for the new americium recovery project, including development of exposure cases, creation and modification of MCNP5 models, development of a time-and-motion study, and the final synthesis of all data. This work also develops a new risk-based method of determining whether administrative controls, such as staffing increases, are ALARA-optimized. The EPA's estimate of the value of statistical life is applied to these risk estimates to determine a monetary value for risk. 
The rate of change of this "risk value" (marginal risk) is then compared with the rate of change of workers' compensations as additional workers are added to the project to reduce the dose (and therefore, presumably, risk) to each individual.
Introduction of risk size in the determination of uncertainty factor UFL in risk assessment
NASA Astrophysics Data System (ADS)
Xue, Jinling; Lu, Yun; Velasquez, Natalia; Yu, Ruozhen; Hu, Hongying; Liu, Zhengtao; Meng, Wei
2012-09-01
The methodology for using uncertainty factors in health risk assessment has been developed over several decades. A default value is usually applied for the uncertainty factor UFL, which is used to extrapolate from the LOAEL (lowest observed adverse effect level) to the NAEL (no adverse effect level). Here, we have developed a new method that establishes a linear relationship between UFL and the additional risk level at the LOAEL based on the dose-response information, which represents a very important factor that should be carefully considered. This linear formula makes it possible to select UFL properly in the additional risk range from 5.3% to 16.2%. The results also indicate that the default value of 10 may not be conservative enough when the additional risk level at the LOAEL exceeds 16.2%. Furthermore, this novel method not only provides a flexible UFL instead of the traditional default value, but can also ensure a conservative estimation of UFL with fewer errors, and avoids the benchmark response selection involved in the benchmark dose method. These advantages improve the estimation of the extrapolation starting point in risk assessment.
A comparison of the environmental impact of different AOPs: risk indexes.
Giménez, Jaime; Bayarri, Bernardí; González, Óscar; Malato, Sixto; Peral, José; Esplugas, Santiago
2014-12-31
Today, environmental impact associated with pollution treatment is a matter of great concern. A method is proposed for evaluating environmental risk associated with Advanced Oxidation Processes (AOPs) applied to wastewater treatment. The method is based on the type of pollution (wastewater, solids, air or soil) and on materials and energy consumption. An Environmental Risk Index (E), constructed from numerical criteria provided, is presented for environmental comparison of processes and/or operations. The Operation Environmental Risk Index (EOi) for each of the unit operations involved in the process and the Aspects Environmental Risk Index (EAj) for process conditions were also estimated. Relative indexes were calculated to evaluate the risk of each operation (E/NOP) or aspect (E/NAS) involved in the process, and the percentage of the maximum achievable for each operation and aspect was found. A practical application of the method is presented for two AOPs: photo-Fenton and heterogeneous photocatalysis with suspended TiO2 in Solarbox. The results report the environmental risks associated with each process, so that AOPs tested and the operations involved with them can be compared.
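The index arithmetic described above (the total index E, a relative per-operation index, and the percentage of the maximum achievable score) can be sketched as below. The operation names, the 0-10 scale, and the scores are assumptions for illustration, not values from the paper's criteria.

```python
def environmental_risk_index(operation_scores, max_per_op=10):
    """Aggregate an Environmental Risk Index E from per-operation
    scores, plus the relative index E/N_OP and the percentage of the
    maximum achievable score."""
    E = sum(operation_scores.values())
    n_op = len(operation_scores)
    return {
        "E": E,
        "E_per_operation": E / n_op,
        "pct_of_max": 100 * E / (n_op * max_per_op),
    }

# Hypothetical scores for a photo-Fenton treatment train
scores = {"pumping": 2, "H2O2_dosing": 6, "UV_irradiation": 4}
print(environmental_risk_index(scores))
```

Normalising by the number of operations (or aspects) is what makes two processes with different numbers of unit operations comparable, as the abstract's E/NOP and E/NAS indexes intend.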
NASA Astrophysics Data System (ADS)
Haining, Wang; Lei, Wang; Qian, Zhang; Zongqiang, Zheng; Hongyu, Zhou; Chuncheng, Gao
2018-03-01
To address the uncertainty in the comprehensive evaluation of supervision risk in electricity transactions, this paper uses unascertained rational numbers to evaluate supervision risk, obtaining the possible evaluation results with their corresponding credibilities and thus quantifying the risk indexes. The model yields the risk degree of each index, which makes it easier for electricity transaction supervisors to identify transaction risk and determine the risk level, assisting decision-making and enabling effective supervision of the risk. The results of a case analysis verify the effectiveness of the model.
Spatial generalised linear mixed models based on distances.
Melo, Oscar O; Mateu, Jorge; Melo, Carlos E
2016-10-01
Risk models derived from environmental data have been widely shown to be effective in delineating geographical areas of risk because they are intuitively easy to understand. We present a new method based on distances, which allows the modelling of continuous and non-continuous random variables through distance-based spatial generalised linear mixed models. The parameters are estimated using Markov chain Monte Carlo maximum likelihood, which is a feasible and useful technique. The proposed method depends on a detrending step built from continuous or categorical explanatory variables, or a mixture of them, using an appropriate Euclidean distance. The method is illustrated through the analysis of variation in the prevalence of Loa loa among a sample of village residents in Cameroon, where the explanatory variables included elevation, together with the maximum normalised-difference vegetation index (NDVI) and the standard deviation of NDVI calculated from repeated satellite scans over time. © The Author(s) 2013.
DEVELOPMENT OF ENVIRONMENTAL INDICES FOR GREEN CHEMICAL PRODUCTION AND USE
Chemical production, use and disposal cause adverse impacts on the environment. Consequently, much research has been conducted to develop methods for estimating the risk of chemicals and to screen them based on environmental impact. Risk assessment may be subdivide...
Fish Consumption Advisories: Toward a Unified, Scientifically Credible Approach
A model is proposed for fish consumption advisories based on consensus-derived risk assessment values for common contaminants in fish and the latest risk assessment methods. The model accounts in part for the expected toxicity of mixtures of chemicals, the underlying uncertainties...
2013-01-01
Background Making evidence-based decisions often requires comparison of two or more options. Research-based evidence may exist which quantifies how likely the outcomes are for each option. Understanding these numeric estimates improves patients’ risk perception and leads to better informed decision making. This paper summarises current “best practices” in communication of evidence-based numeric outcomes for developers of patient decision aids (PtDAs) and other health communication tools. Method An expert consensus group of fourteen researchers from North America, Europe, and Australasia identified eleven main issues in risk communication. Two experts for each issue wrote a “state of the art” summary of best evidence, drawing on the PtDA, health, psychological, and broader scientific literature. In addition, commonly used terms were defined and a set of guiding principles and key messages derived from the results. Results The eleven key components of risk communication were: 1) Presenting the chance an event will occur; 2) Presenting changes in numeric outcomes; 3) Outcome estimates for test and screening decisions; 4) Numeric estimates in context and with evaluative labels; 5) Conveying uncertainty; 6) Visual formats; 7) Tailoring estimates; 8) Formats for understanding outcomes over time; 9) Narrative methods for conveying the chance of an event; 10) Important skills for understanding numerical estimates; and 11) Interactive web-based formats. Guiding principles from the evidence summaries advise that risk communication formats should reflect the task required of the user, should always define a relevant reference class (i.e., denominator) over time, should aim to use a consistent format throughout documents, should avoid “1 in x” formats and variable denominators, consider the magnitude of numbers used and the possibility of format bias, and should take into account the numeracy and graph literacy of the audience. 
Conclusion A substantial and rapidly expanding evidence base exists for risk communication. Developers of tools to facilitate evidence-based decision making should apply these principles to improve the quality of risk communication in practice. PMID:24625237
An evaluation of Computational Fluid dynamics model for flood risk analysis
NASA Astrophysics Data System (ADS)
Di Francesco, Silvia; Biscarini, Chiara; Montesarchio, Valeria
2014-05-01
This work presents an analysis of the hydrological-hydraulic engineering requisites for Risk evaluation and efficient flood damage reduction plans. Most of the research efforts have been dedicated to the scientific and technical aspects of risk assessment, providing estimates of possible alternatives and of the risk associated. In the decision making process for mitigation plan, the contribute of scientist is crucial, due to the fact that Risk-Damage analysis is based on evaluation of flow field ,of Hydraulic Risk and on economical and societal considerations. The present paper will focus on the first part of process, the mathematical modelling of flood events which is the base for all further considerations. The evaluation of potential catastrophic damage consequent to a flood event and in particular to dam failure requires modelling of the flood with sufficient detail so to capture the spatial and temporal evolutions of the event, as well of the velocity field. Thus, the selection of an appropriate mathematical model to correctly simulate flood routing is an essential step. In this work we present the application of two 3D Computational fluid dynamics models to a synthetic and real case study in order to evaluate the correct evolution of flow field and the associated flood Risk . The first model is based on a opensource CFD platform called openFoam. Water flow is schematized with a classical continuum approach based on Navier-Stokes equation coupled with Volume of fluid (VOF) method to take in account the multiphase character of river bottom-water- air systems. The second model instead is based on the Lattice Boltzmann method, an innovative numerical fluid dynamics scheme based on Boltzmann's kinetic equation that represents the flow dynamics at the macroscopic level by incorporating a microscopic kinetic approach. Fluid is seen as composed by particles that can move and collide among them. 
Simulation results from both models are promising and consistent with experimental results available in the literature, though the LBM model requires less computational effort than the Navier-Stokes one.
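The collision-streaming structure at the heart of the lattice Boltzmann method can be sketched in a few lines. This is a generic D2Q9 BGK step with periodic boundaries, intended only as an illustration of the scheme's mechanics, not the specific solver used in the study:

```python
import numpy as np

# D2Q9 lattice: 9 discrete velocities and their quadrature weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """BGK equilibrium distribution for each of the 9 directions."""
    feq = np.empty((9,) + rho.shape)
    usq = ux**2 + uy**2
    for i, (cx, cy) in enumerate(c):
        cu = cx*ux + cy*uy
        feq[i] = w[i]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    return feq

def lbm_step(f, tau=0.6):
    """One collision + streaming step with periodic boundaries."""
    rho = f.sum(axis=0)                                 # macroscopic density
    ux = (f * c[:, 0][:, None, None]).sum(axis=0)/rho   # macroscopic velocity
    uy = (f * c[:, 1][:, None, None]).sum(axis=0)/rho
    f += (equilibrium(rho, ux, uy) - f)/tau             # BGK collision
    for i, (cx, cy) in enumerate(c):                    # streaming
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f
```

Collision and periodic streaming both conserve mass exactly, which offers a quick sanity check for an implementation.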
NASA Astrophysics Data System (ADS)
Misztal, A.; Belu, N.
2016-08-01
Operation of every company is associated with the risk of interference with the proper performance of its fundamental processes. This risk is associated with various internal areas of the company, as well as with the environment in which it operates. From the point of view of ensuring the compliance of specific technological processes and, consequently, the conformity of products with requirements, it is important to identify these threats and eliminate or reduce the risk of their occurrence. The purpose of this article is to present a model of areas for identifying risk affecting the compliance of processes and products, based on multiregional targeted monitoring of typical places of interference and on risk management methods. The model is based on the verification of risk analyses carried out in small and medium-sized manufacturing companies in various industries.
Integrated rare variant-based risk gene prioritization in disease case-control sequencing studies.
Lin, Jhih-Rong; Zhang, Quanwei; Cai, Ying; Morrow, Bernice E; Zhang, Zhengdong D
2017-12-01
Rare variants of major effect play an important role in human complex diseases and can be discovered by sequencing-based genome-wide association studies. Here, we introduce an integrated approach that combines the rare variant association test with gene network and phenotype information to identify risk genes implicated by rare variants for human complex diseases. Our data integration method follows a 'discovery-driven' strategy without relying on prior knowledge about the disease and thus maintains the unbiased character of genome-wide association studies. Simulations reveal that our method can outperform a widely-used rare variant association test method by 2 to 3 times. In a case study of a small disease cohort, we uncovered putative risk genes and the corresponding rare variants that may act as genetic modifiers of congenital heart disease in 22q11.2 deletion syndrome patients. These variants were missed by a conventional approach that relied on the rare variant association test alone.
Communication about environmental health risks: a systematic review.
Fitzpatrick-Lewis, Donna; Yost, Jennifer; Ciliska, Donna; Krishnaratne, Shari
2010-11-01
Using the most effective methods and techniques for communicating risk to the public is critical. Understanding the impact that different types of risk communication have had on real and perceived public health risks can inform how messages, policies, and programs can and should be communicated in order to be most effective. The purpose of this systematic review is to identify the effectiveness of communication strategies and the factors that impact communication uptake related to environmental health risks. A systematic review of English-language articles was conducted using multiple databases with appropriate search terms. Data sources also included grey literature. Key organization websites and key journals were hand searched for relevant articles, and experts were consulted to locate any additional references. Articles had to meet relevance criteria for study design (randomized controlled trials, controlled clinical trials, cohort analytic, cohort, any pre-post, interrupted time series, mixed methods, or any qualitative studies), participants (community-living, non-clinical populations), interventions (including, but not limited to, any community-based methods or tools such as Internet, telephone, or media-based interventions, or any combination thereof), and outcomes (reported measurable outcomes such as awareness, knowledge, or attitudinal or behavioural change). Articles were assessed for quality and data were extracted using standardized tools by two independent reviewers. Articles were given an overall assessment of strong, moderate, or weak quality. There were no strong or moderate studies. Meta-analysis was not appropriate to the data. Data for 24 articles were analyzed and reported in a narrative format. The findings suggest that a multi-media approach is more effective than any single-media approach.
Similarly, printed material that offers a combination of information types (i.e., text and diagrams) is more effective than a single type, such as all text. The findings also suggest that responses to risk communications are influenced by personal risk perception, previous personal experience with risk, sources of information, and trust in those sources. No single method of message delivery is best. Risk communication strategies that incorporate the needs of the target audience(s) with a multi-faceted delivery method are most effective at reaching the audience.
A method for scenario-based risk assessment for robust aerospace systems
NASA Astrophysics Data System (ADS)
Thomas, Victoria Katherine
In years past, aircraft conceptual design centered on creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, recognizing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspectives of performance, schedule, and cost. Recently, safety and reliability analyses have been brought forward in the design process to be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup, during which the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification, with significant iteration among steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created.
The four steps involved in completing the modeling and simulation are: Alternative Solution Modeling, Uncertainty Quantification, Risk Assessment, and Risk Mitigation. Focus area three consists of Decision Support. In this area a decision support interface is created that allows for game playing between solution alternatives and risk mitigation, and a multi-attribute decision-making process is implemented to aid decision making. A demonstration problem inspired by Airbus's mid-1980s decision to break into the widebody long-range market was developed to illustrate the use of this method. The results showed that the method is able to capture additional types of risk beyond those covered by previous analysis methods, particularly at the early stages of aircraft design. It was also shown that the method can be used to help create a system that is robust to external environmental factors. The addition of an external-environment risk analysis in the early stages of conceptual design adds another dimension to the analysis of feasibility and viability. The ability to take risk into account during the early stages of the design process allows for the elimination of potentially feasible and viable but too-risky alternatives. The use of a scenario-based analysis instead of a traditional probabilistic analysis enables uncertainty to be effectively bounded and examined over a variety of potential futures instead of only a single future. There is also potential for a product to be groomed for a specific future that one believes is likely to happen, or for a product to be steered during design as the future unfolds.
Application of the Risk-Based Early Warning Method in a Fracture-Karst Water Source, North China.
Guo, Yongli; Wu, Qing; Li, Changsuo; Zhao, Zhenhua; Sun, Bin; He, Shiyi; Jiang, Guanghui; Zhai, Yuanzheng; Guo, Fang
2018-03-01
The paper proposes a risk-based early warning method that considers the characteristics of fracture-karst aquifers in North China and applies it to a super-large fracture-karst water source. Groundwater vulnerability, type of land use, water abundance, transmissivity, and spatiotemporal variation of groundwater quality were chosen as the indexes of the method. The weights of the factors were obtained using the AHP method based on their relative importance; maps of the factors were zoned with GIS, and the early warning map was constructed based on extension theory with the help of GIS and ENVI+IDL. The early warning map fused the five factors well: serious and tremendous warning areas are mainly located in the northwest and east, in zones with high or relatively high transmissivity and groundwater pollutant loading and an obviously deteriorating trend of petroleum. The early warning map shows where more attention should be paid, and the paper guides decision makers in taking appropriate protection actions in areas at different warning levels.
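The AHP weighting step can be illustrated with a short routine that extracts the principal eigenvector of a pairwise comparison matrix and checks Saaty's consistency ratio. The 3x3 judgment matrix in the test is invented for illustration, not the study's actual matrix:

```python
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector weights from an AHP pairwise comparison matrix,
    plus the consistency ratio (CR) after Saaty."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # dominant eigenvalue
    wts = np.abs(eigvecs[:, k].real)
    wts /= wts.sum()                         # normalize weights to 1
    ci = (eigvals[k].real - n) / (n - 1)     # consistency index
    ri = {1: 0, 2: 0, 3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.24)  # random index
    cr = ci / ri if ri else 0.0
    return wts, cr
```

A perfectly consistent matrix (every entry the exact ratio of two weights) yields CR = 0; CR above about 0.1 usually signals that the judgments should be revisited.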
Risk Assessment in Underground Coalmines Using Fuzzy Logic in the Presence of Uncertainty
NASA Astrophysics Data System (ADS)
Tripathy, Debi Prasad; Ala, Charan Kumar
2018-04-01
Fatal accidents occur every year as regular events in the Indian coal mining industry. To improve safety, it has become essential to perform risk assessments of the various operations in mines. However, due to uncertain accident data, it is hard to conduct such assessments. The objective of this study is to present a method to assess safety risks in underground coal mines. The assessment of safety risks is based on a fuzzy reasoning approach: a Mamdani fuzzy logic model is developed in the Fuzzy Logic Toolbox of MATLAB. A case study is used to demonstrate the applicability of the developed model. The risk evaluation for the case-study mine indicated that mine fire has the highest risk level among all the hazard factors. This study could help mine management prepare safety measures based on the risk rankings obtained.
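The Mamdani inference cycle built in MATLAB's Fuzzy Logic Toolbox can be sketched in plain Python. The membership functions and the two-input rule base below are illustrative placeholders, not the paper's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a)/(b - a) if x <= b else (c - x)/(c - b)

def mamdani_risk(likelihood, severity):
    """Minimal two-input Mamdani inference with centroid defuzzification.
    Inputs and output are on a 0-10 scale; the rule base is illustrative."""
    # fuzzify the inputs
    lo_l, hi_l = tri(likelihood, -5, 0, 5), tri(likelihood, 5, 10, 15)
    lo_s, hi_s = tri(severity, -5, 0, 5), tri(severity, 5, 10, 15)
    # rules: min for AND, max to aggregate rules sharing an output set
    fire = {
        'low':  min(lo_l, lo_s),
        'med':  max(min(lo_l, hi_s), min(hi_l, lo_s)),
        'high': min(hi_l, hi_s),
    }
    out_mf = {'low': (-5, 0, 5), 'med': (0, 5, 10), 'high': (5, 10, 15)}
    xs = [i/10 for i in range(0, 101)]   # output universe 0.0 .. 10.0
    num = den = 0.0
    for x in xs:                         # clip, aggregate, centroid
        mu = max(min(fire[k], tri(x, *out_mf[k])) for k in fire)
        num += x*mu
        den += mu
    return num/den if den else 5.0
```

High likelihood and severity should defuzzify to a clearly higher risk score than low inputs, which is the behavior the assertions below check.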
Active epidemiological surveillance of musculoskeletal disorders in a shoe factory
Roquelaure, Y; Mariel, J; Fanello, S; Boissiere, J; Chiron, H; Dano, C; Bureau, D; Penneau-Fontbonne, D
2002-01-01
Aims: (1) To evaluate an active method of surveillance of musculoskeletal disorders (MSDs). (2) To compare different criteria for deciding whether or not a work situation could be considered at high risk of MSDs in a large, modern shoe factory. Methods: A total of 253 blue collar workers were interviewed and examined by the same physician in 1996; 191 of them were re-examined in 1997. Risk factors of MSDs were assessed for each worker by standardised job site work analysis. Prevalence and incidence rates of carpal tunnel syndrome, rotator cuff syndrome, and tension neck syndrome were calculated for each of the nine main types of work situation. Different criteria used to assess situations with high risk of MSDs were compared. Results: On the basis of prevalence data, three types of work situation were detected to be at high risk of MSDs: cutting, sewing, and assembly preparation. The three types of work situations identified on the basis of incidence data (sewing preparation, mechanised assembling, and finishing) were different from those identified by prevalence data. At least one recognised risk factor for MSDs was identified for all groups of work situations. The ergonomic risk could be considered as serious for the four types of work situation having the highest ergonomic scores (sewing, assembly preparation, pasting, and cutting). Conclusion: The results of the health surveillance method depend largely on the definition of the criteria used to define the risk of MSDs. The criteria based on incidence data are more valid than those based on prevalence data. Health and risk factor surveillance must be combined to predict the risk of MSDs in the company. However, exposure assessment plays a greater role in determining the priorities for ergonomic intervention. PMID:12107293
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. The methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences, and can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identifying cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
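The propagation-path idea can be sketched as a breadth-first search over a directed subsystem-interaction graph; the subsystem names and ratings below are hypothetical, and the scoring is a simple stand-in for the cumulative measures described above:

```python
from collections import deque

def reachable_targets(edges, sources):
    """BFS over directed subsystem-interaction edges: returns, for every
    entity a hazard source can influence, one propagation path to it."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    seen = set(sources)
    paths = {s: [s] for s in sources}
    q = deque(sources)
    while q:
        u = q.popleft()
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                paths[v] = paths[u] + [v]
                q.append(v)
    return paths

def risk_score(hazard_severity, vulnerability):
    """Toy cumulative measure: severity of the cause times the
    vulnerability rating of the reached target."""
    return hazard_severity * vulnerability
```

Running the search from each hazard source enumerates the hazard-vulnerability pairs and the interaction paths that connect them.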
1990-12-01
Armstrong Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, and Drs. Melvin Andersen and Michael Cargas, formerly with the Harry G... based on the arterial blood concentration surrogate were more similar to those derived in the traditional manner than were the estimates based on... pharmacokinetic modeling. Prepared by Office of Risk Analysis, Oak Ridge National Laboratory, Oak Ridge, Tennessee. Prepared under Contract No. DE-AC05-84
Brautbar, Ariel; Pompeii, Lisa A.; Dehghan, Abbas; Ngwa, Julius S.; Nambi, Vijay; Virani, Salim S.; Rivadeneira, Fernando; Uitterlinden, André G.; Hofman, Albert; Witteman, Jacqueline C.M.; Pencina, Michael J.; Folsom, Aaron R.; Cupples, L. Adrienne; Ballantyne, Christie M.; Boerwinkle, Eric
2013-01-01
Objective Multiple studies have identified single-nucleotide polymorphisms (SNPs) that are associated with coronary heart disease (CHD). We examined whether SNPs selected based on predefined criteria will improve CHD risk prediction when added to traditional risk factors (TRFs). Methods SNPs were selected from the literature based on association with CHD, lack of association with a known CHD risk factor, and successful replication. A genetic risk score (GRS) was constructed based on these SNPs. A Cox proportional hazards model was used to calculate CHD risk based on the Atherosclerosis Risk in Communities (ARIC) and Framingham CHD risk scores with and without the GRS. Results The GRS was associated with risk for CHD (hazard ratio [HR] = 1.10; 95% confidence interval [CI]: 1.07-1.13). Addition of the GRS to the ARIC risk score significantly improved discrimination, reclassification, and calibration beyond that afforded by TRFs alone in non-Hispanic whites in the ARIC study. The area under the receiver operating characteristic curve (AUC) increased from 0.742 to 0.749 (Δ = 0.007; 95% CI: 0.004-0.013), and the net reclassification index (NRI) was 6.3%. Although the risk estimates for CHD in the Framingham Offspring (HR = 1.12; 95% CI: 1.10-1.14) and Rotterdam (HR = 1.08; 95% CI: 1.02-1.14) studies were significantly improved by adding the GRS to TRFs, the improvements in AUC and NRI were modest. Conclusion Addition of a GRS based on direct associations with CHD to TRFs significantly improved discrimination and reclassification in white participants of the ARIC study, with no significant improvement in the Rotterdam and Framingham Offspring studies. PMID:22789513
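Two quantities at the heart of this analysis, a SNP-based genetic risk score and the categorical net reclassification index, can be sketched as follows. The inputs are illustrative; the actual study derived its risk categories from Cox-model predictions:

```python
def genetic_risk_score(genotypes, weights=None):
    """GRS as a (possibly weighted) sum of risk-allele counts.
    genotypes: per-SNP risk-allele counts (0/1/2); weights: e.g. per-SNP
    log hazard ratios (unweighted allele-count score if omitted)."""
    if weights is None:
        weights = [1.0]*len(genotypes)
    return sum(g*w for g, w in zip(genotypes, weights))

def net_reclassification_index(old_cat, new_cat, event):
    """Categorical NRI: net proportion of events moving UP a risk category
    plus net proportion of non-events moving DOWN."""
    up_e = down_e = up_n = down_n = n_e = n_n = 0
    for o, n, e in zip(old_cat, new_cat, event):
        if e:
            n_e += 1
            up_e += n > o
            down_e += n < o
        else:
            n_n += 1
            up_n += n > o
            down_n += n < o
    return (up_e - down_e)/n_e + (down_n - up_n)/n_n
```

In the toy test below, the one event moves up a category and one non-event moves down, so both components contribute positively.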
[Medium-term forecast of solar cosmic rays radiation risk during a manned Mars mission].
Petrov, V M; Vlasov, A G
2006-01-01
Medium-term forecasting of the radiation hazard from solar cosmic rays (SCR) will be vital in a manned Mars mission. Modern methods of space physics lack acceptable reliability in medium-term forecasting of SCR onset and parameters. The proposed estimation of average radiation risk from SCR during a manned Mars mission uses existing SCR fluence and spectrum models and the correlation of solar particle event frequency with the predicted Wolf number. Radiation risk is considered as an additional probability of death from acute radiation reactions (the ergonomic component) or acute radiation disease in flight. The algorithm for radiation risk calculation is described, and the resulting risk levels for various periods of the 23rd solar cycle are presented. The applicability of this method to advance forecasting and possible improvements are discussed. Recommendations to the crew based on risk estimation are exemplified.
Wang, Molin; Liao, Xiaomei; Laden, Francine; Spiegelman, Donna
2016-06-15
Identification of the latency period and age-related susceptibility, if any, is an important aspect of assessing risks of environmental, nutritional, and occupational exposures. We consider estimation and inference for latency and age-related susceptibility in relative risk and excess risk models. We focus on likelihood-based methods for point and interval estimation of the latency period and age-related windows of susceptibility coupled with several commonly considered exposure metrics. The method is illustrated in a study of the timing of the effects of constituents of air pollution on mortality in the Nurses' Health Study. Copyright © 2016 John Wiley & Sons, Ltd.
Development and application of a probabilistic method for wildfire suppression cost modeling
Matthew P. Thompson; Jessica R. Haas; Mark A. Finney; David E. Calkin; Michael S. Hand; Mark J. Browne; Martin Halek; Karen C. Short; Isaac C. Grenfell
2015-01-01
Wildfire activity and escalating suppression costs continue to threaten the financial health of federal land management agencies. In order to minimize and effectively manage the cost of financial risk, agencies need the ability to quantify that risk. A fundamental aim of this research effort, therefore, is to develop a process for generating risk-based metrics for...
ERIC Educational Resources Information Center
Fleming, Steven T.
1992-01-01
The concept of risk-adjusted measures of quality is discussed, and a methodology is proposed for risk-adjusting and integrating multiple adverse outcomes of anesthesia services into measures for quality assurance and quality improvement programs. Although designed for a new anesthesiology database, the methods should apply to other health…
ERIC Educational Resources Information Center
Davies, Kurtland; Cohen, Michael J.
A model of an integrated ecologically-based counseling and recovery program is explored as a means of incorporating educational and psychological nature-connecting methods and materials with traditional recovery activities for people at risk and as a preventative. The first part of the program introduces high-risk high school students, most of…
Risk Management using Dependency Structure Matrix
NASA Astrophysics Data System (ADS)
Petković, Ivan
2011-09-01
An efficient method based on dependency structure matrix (DSM) analysis is given for ranking risks in a complex system or process whose entities are mutually dependent. The rank is determined from the components of the unique positive eigenvector corresponding to the spectral radius of the matrix that models the considered engineering system. For demonstration, the risk problem of a NASA robotic spacecraft is analyzed.
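The Perron-eigenvector ranking can be sketched with power iteration, which converges to the unique positive eigenvector of a nonnegative irreducible matrix. The 3x3 DSM below is invented for illustration:

```python
import numpy as np

def perron_rank(dsm, iters=1000, tol=1e-12):
    """Rank risks by the Perron (unique positive) eigenvector of a
    nonnegative, irreducible dependency structure matrix, computed by
    power iteration and normalized to sum to 1."""
    A = np.asarray(dsm, dtype=float)
    x = np.ones(A.shape[0])
    for _ in range(iters):
        y = A @ x
        y /= y.sum()                       # renormalize each iteration
        done = np.abs(y - x).max() < tol
        x = y
        if done:
            break
    return x, np.argsort(-x)               # scores, ranking (riskiest first)
```

The returned scores are the eigenvector components; sorting them descending gives the risk ranking described in the abstract.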
Takeuchi, Yoshinori; Shinozaki, Tomohiro; Matsuyama, Yutaka
2018-01-08
Despite the frequent use of self-controlled methods in pharmacoepidemiological studies, the factors that may bias their estimates have not been adequately compared in real-world settings. Here, we comparatively examined the impact of a time-varying confounder and its interactions with time-invariant confounders, time trends in exposures and events, restrictions, and misspecification of risk period durations on the estimators of three self-controlled methods. This study analyzed the self-controlled case series (SCCS), the case-crossover (CCO) design, and sequence symmetry analysis (SSA) using simulated and actual electronic medical records datasets. We evaluated the performance of the three self-controlled methods in simulated cohorts for the following scenarios: 1) time-invariant confounding with interactions between the confounders, 2) time-invariant and time-varying confounding without interactions, 3) time-invariant and time-varying confounding with interactions among the confounders, 4) time trends in exposures and events, 5) restricted follow-up time based on event occurrence, and 6) patient restriction based on event history. The sensitivity of the estimators to misspecified risk period durations was also evaluated. As a case study, we applied these methods to evaluate the risk of macrolides on liver injury using electronic medical records. In the simulation analysis, time-varying confounding biased the SCCS and CCO design estimates, and this bias was aggravated in the presence of interactions between the time-invariant and time-varying confounders. The SCCS estimates were biased by time trends in both exposures and events. Erroneously short risk periods introduced bias into the CCO design estimate, whereas erroneously long risk periods introduced bias into the estimates of all three methods. Restricting the follow-up time led to severe bias in the SSA estimates, and the SCCS estimates were sensitive to patient restriction.
The case study showed that although macrolide use was significantly associated with increased liver injury occurrence in all methods, the value of the estimates varied. The estimations of the three self-controlled methods depended on various underlying assumptions, and the violation of these assumptions may cause non-negligible bias in the resulting estimates. Pharmacoepidemiologists should select the appropriate self-controlled method based on how well the relevant key assumptions are satisfied with respect to the available data.
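Of the three methods, sequence symmetry analysis has the simplest estimator: a ratio of patients in whom exposure precedes the event to those in whom the event precedes exposure. A minimal sketch (omitting the null-effect sequence ratio adjustment that full SSA applies to correct for time trends):

```python
def sequence_symmetry_ratio(first_exposure, first_event):
    """Crude SSA estimator: among patients with both an exposure and an
    event, the ratio of exposure-first to event-first sequences.
    Times may be any comparable values (e.g. days from cohort entry)."""
    exp_first = sum(1 for e, o in zip(first_exposure, first_event) if e < o)
    out_first = sum(1 for e, o in zip(first_exposure, first_event) if o < e)
    return exp_first / out_first if out_first else float('inf')
```

A ratio well above 1 suggests the exposure tends to precede the event more often than chance, the signal SSA screens for.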
Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A
2011-09-10
In 2001, a multidisciplinary team of analytical scientists and statisticians at Sanofi-Aventis published a methodology which has since governed the transfer of release monographs from R&D sites to manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced into our practices, the comparative-testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. This is conducted with the aim of controlling the most important consumer's risks involved at two levels of analytical decisions in transfer studies: the risk, for the receiving laboratory, of taking poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting a receiving laboratory despite insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process settled within our company for better integration of the transfer study into the method life cycle, as well as proposals of generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
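Equivalence testing of this kind is typically operationalized as two one-sided tests (TOST) on the difference between the sending and receiving laboratories' means. The pooled-variance sketch below is generic; the margin and data are illustrative, not the company's acceptance criteria:

```python
import numpy as np
from scipy import stats

def tost_equivalence(x, y, margin):
    """Two one-sided tests for mean equivalence within +/- margin.
    Returns the two one-sided p-values; equivalence is concluded when
    BOTH are below the chosen alpha."""
    nx, ny = len(x), len(y)
    d = np.mean(x) - np.mean(y)
    sp = np.sqrt(((nx - 1)*np.var(x, ddof=1)
                  + (ny - 1)*np.var(y, ddof=1)) / (nx + ny - 2))
    se = sp*np.sqrt(1/nx + 1/ny)
    df = nx + ny - 2
    p_lower = stats.t.sf((d + margin)/se, df)    # H0: d <= -margin
    p_upper = stats.t.cdf((d - margin)/se, df)   # H0: d >= +margin
    return p_lower, p_upper
```

Note the logic is inverted relative to a difference test: a small p-value here supports equivalence, which matches the consumer's-risk framing in the abstract.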
NASA Astrophysics Data System (ADS)
Jinguuji, Motoharu; Toprak, Selcuk
2017-12-01
The Hinode area of Itako City in Ibaraki Prefecture, Japan, suffered some of the most severe liquefaction damage of any area in the Great Eastern Japan Earthquake in 2011. This liquefaction damage has been investigated by Itako City, as well as by universities and research institutes in Japan. The National Institute of Advanced Industrial Science and Technology (AIST) has carried out numerous investigations along the Tone River; in particular, intensive surveys were done in the Hinode area. We have conducted a risk analysis based on the thickness and depth of the liquefaction layer measured using cone penetration testing (CPT) data and electric resistivity data obtained in the Hinode area. The distribution of the risk estimated from CPT at 143 points, and that obtained from analysis of the resistivity survey data, agreed with the distribution of actual damage. We also carried out conventional risk analysis using the liquefaction resistance factor (FL) and liquefaction potential index (PL) methods with CPT data. The results show high PL values over the entire area, but their distribution did not agree well with actual damage in some parts of the study area. Because analysis of the thickness and depth of the liquefaction layer using geophysical prospecting methods can cover a widespread area, this method will be very useful in investigating liquefaction risk, especially for gas and water pipelines.
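The liquefaction potential index is commonly computed with Iwasaki-style depth weighting. A sketch assuming that standard formulation (PL integrates 1 − FL over the top 20 m with weight w(z) = 10 − 0.5z; the paper's exact variant may differ):

```python
import numpy as np

def liquefaction_potential_index(depths_m, fl_values):
    """Iwasaki-style liquefaction potential index:
    PL = integral over 0-20 m of F(z) * w(z) dz, where w(z) = 10 - 0.5z
    and F = 1 - FL where FL < 1, else 0. Trapezoidal rule over the
    CPT sounding depths."""
    z = np.asarray(depths_m, float)
    F = np.clip(1.0 - np.asarray(fl_values, float), 0.0, 1.0)
    w = np.maximum(10.0 - 0.5*z, 0.0)      # weight vanishes below 20 m
    g = F*w
    return float(np.sum((g[1:] + g[:-1]) * np.diff(z)) / 2)
```

By construction PL ranges from 0 (no layer with FL below 1) to 100 (FL = 0 over the entire top 20 m), which gives two exact checks.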
Risk-based transfer responses to climate change, simulated through autocorrelated stochastic methods
NASA Astrophysics Data System (ADS)
Kirsch, B.; Characklis, G. W.
2009-12-01
Maintaining municipal water supply reliability despite growing demands can be achieved through a variety of mechanisms, including supply strategies such as temporary transfers. However, much of the attention on transfers has been focused on market-based transfers in the western United States, largely ignoring the potential for transfers in the eastern U.S. The different legal frameworks of the eastern and western U.S. lead to characteristic differences between their respective transfers. Western transfers tend to be agricultural-to-urban and involve raw, untreated water, with the transfer often involving a simple change in the location and/or timing of withdrawals. Eastern transfers tend to be contractually established urban-to-urban transfers of treated water, thereby requiring infrastructure to transfer water between utilities. Utilities require tools to evaluate transfer decision rules and the resulting expected future transfer behavior. Given the long-term planning horizons of utilities, potential changes in hydrologic patterns due to climate change must be considered. In response, this research develops a method for generating a stochastic time series that reproduces the historic autocorrelation and can be adapted to accommodate future climate scenarios. While analogous in operation to an autoregressive model, this method reproduces the seasonal autocorrelation structure, rather than assuming the strict stationarity produced by an autoregressive model. Such urban-to-urban transfers are designed to be rare, transient events used primarily during times of severe drought, and incorporating Monte Carlo techniques allows for the development of probability distributions of likely outcomes. This research evaluates a risk-based, urban-to-urban transfer agreement between three utilities in the Triangle region of North Carolina.
Two utilities maintain their own surface water supplies in adjoining watersheds and look to obtain transfers via interconnections to a third utility with access to excess supply. The stochastic generation method is adapted to maintain the cross-correlation of inflows between watersheds. Risk-based decision rules are developed to govern transfers based upon the current level of risk to the water supply. This work determines how expected transfer behavior changes under four future climate scenarios assuming several different risk-thresholds.
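A seasonal lag-1 generator of the Thomas-Fiering type illustrates how a stochastic series can preserve period-specific means, standard deviations, and month-to-month autocorrelation rather than assuming stationarity. This is a generic monthly sketch with invented parameters, not the study's model of the Triangle utilities' inflows:

```python
import numpy as np

def seasonal_ar1(means, sds, lag1_corr, n_years, seed=0):
    """Thomas-Fiering-style generator: each season s has its own mean,
    standard deviation, and lag-1 correlation with the previous season,
    so the seasonal autocorrelation structure is reproduced."""
    rng = np.random.default_rng(seed)
    m = len(means)
    q = np.empty(n_years*m)
    q[0] = means[0]                        # deterministic start
    for t in range(1, n_years*m):
        s, sp = t % m, (t - 1) % m         # current and previous season
        r = lag1_corr[s]
        z = rng.standard_normal()
        q[t] = (means[s]
                + r*(sds[s]/sds[sp])*(q[t-1] - means[sp])
                + sds[s]*np.sqrt(1 - r*r)*z)
    return q
```

Wrapping many such traces in a Monte Carlo loop yields the probability distributions of transfer outcomes that the risk-based decision rules are evaluated against.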
Risk assessment of logistics outsourcing based on BP neural network
NASA Astrophysics Data System (ADS)
Liu, Xiaofeng; Tian, Zi-you
The purpose of this article is to evaluate the risk of enterprise logistics outsourcing. To this end, the paper first analysed the main risks existing in logistics outsourcing and set up a risk evaluation index system for it; it then applied a BP neural network to logistics outsourcing risk evaluation and used MATLAB for the simulation. The results show that the network error is small and the method has strong practicability, so it can be used by enterprises to evaluate the risks of logistics outsourcing.
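The core of a BP (backpropagation) network for mapping risk indicators to a risk score fits in a few lines of NumPy. This is a generic one-hidden-layer sketch with toy data, not the paper's MATLAB model or its index system:

```python
import numpy as np

def train_bp(X, y, hidden=6, lr=0.5, epochs=5000, seed=0):
    """Minimal one-hidden-layer backpropagation network (sigmoid units,
    squared-error loss, batch gradient descent). Returns a predictor."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    sig = lambda z: 1/(1 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                  # forward pass
        out = sig(h @ W2 + b2)
        err = out - y                         # backward pass (squared error)
        d2 = err * out*(1 - out)
        d1 = (d2 @ W2.T) * h*(1 - h)
        W2 -= lr * h.T @ d2 / len(X); b2 -= lr * d2.mean(0)
        W1 -= lr * X.T @ d1 / len(X); b1 -= lr * d1.mean(0)
    return lambda Xn: sig(sig(Xn @ W1 + b1) @ W2 + b2)
```

In practice the inputs would be the normalized scores of the risk evaluation index system and the target a graded risk level from expert assessment.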
Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian
2012-04-01
Probabilistic approaches, such as Monte Carlo sampling (MCS) and Latin hypercube sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory, and variance propagation, were used to characterize the uncertainties associated with the risk assessment of ΣPAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of the hazard quotient of ΣPAH8 in the surface waters of Taihu Lake, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of ΣPAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval at the 90% confidence level was (0.0015, 0.0163) using fuzzy set theory and (0.00016, 0.88) based on variance propagation. These results indicated that the ecological risk of ΣPAH8 to aquatic organisms is low. Each method, being based on a different theory, has its own advantages and limitations; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The probabilistic approach was selected as the most appropriate method to assess the risk of ΣPAH8 in the surface water of Taihu Lake, providing an important scientific foundation for the risk management and control of organic pollutants in water.
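The contrast between MCS and LHS can be sketched for a lognormal hazard quotient HQ = exposure / toxicity threshold. The distribution parameters here are illustrative, not the Taihu Lake fits:

```python
import numpy as np
from scipy.special import ndtri   # inverse standard normal CDF

def hazard_quotient_mc(n, seed=0):
    """Plain Monte Carlo: draw exposure and toxicity independently and
    return the HQ sample plus the exceedance probability P(HQ > 1)."""
    rng = np.random.default_rng(seed)
    exposure = rng.lognormal(mean=-2.0, sigma=1.0, size=n)
    toxicity = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    hq = exposure / toxicity
    return hq, float((hq > 1).mean())

def hazard_quotient_lhs(n, seed=0):
    """Latin hypercube variant: one stratified uniform draw per [0,1) bin
    for each variable, mapped through the lognormal inverse CDF, which
    covers the tails more evenly at equal sample size."""
    rng = np.random.default_rng(seed)
    def lhs_lognormal(mu, sigma):
        u = (np.arange(n) + rng.random(n)) / n      # one point per stratum
        u = np.clip(u, 1e-12, 1 - 1e-12)
        rng.shuffle(u)                              # break rank pairing
        return np.exp(mu + sigma*ndtri(u))
    hq = lhs_lognormal(-2.0, 1.0) / lhs_lognormal(0.0, 0.5)
    return hq, float((hq > 1).mean())
```

At large sample sizes the two estimators agree on P(HQ > 1); the practical advantage of LHS shows up as lower variance at small n.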
Xu, H D; Zhao, L; Tang, S C; Zhang, J; Kong, F L; Jia, G
2016-12-20
Objective: To explore and validate suitable risk assessment methods for titanium dioxide though applying three risk assessment tools for nanomaterials based on the control banding (CB) approach. Methods: A factory manufacturing titanium dioxide in Jinan city, Shandong province, was assessed using a quantitative exposure method and qualitative risk assessment methods in September, 2014. A condensation particle counter equipment was used to monitor the number concentration of particles at packaging workshop and jet milling workshop. We employed three control banding tools, including CB nanotool, Stoffenmanager nano and the Guidance on working safely with nanomaterials and nanoproducts (GWSNN) to evaluate the two workshops, then compared the evaluation results. Results: The increases of particle concentrations were generated directly by packaging and jet milling processes, the number concentration from (3.52±1.46) ×10(4)/cm(3) to (14.70±8.86) ×10(4)/cm(3) at packaging workshop and from (0.97±0.25) ×10(4)/cm(3) to (1.26±0.35) ×10(4)/cm(3) at milling workshop (both P <0.05) . The number concentrations at packaging workshop were higher than those at jet milling workshop during both manufacturing and break times (both P <0.05) . The results of CB nanotool showed that the risk level of the packaging workshop was classified as high and the risk level of the jet milling workshop was classified asmedium. The results of Stoffenmanager nano showed that the risk level of the packaging workshop was classified as medium and the risk level of the jet milling workshop was classified as low. The results of GWSNN showed that the risk level of packaging workshop was classified as high and the risk level of jet milling workshop was classified as low. 
Conclusion: The risk levels assessed with the three control banding tools are related and consistent with the results of quantitative monitoring, so all three are, to some extent, suitable for occupational health risk assessment of industrial-scale production of titanium dioxide.
Design Tool Using a New Optimization Method Based on a Stochastic Process
NASA Astrophysics Data System (ADS)
Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio
Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, such methods depend on the initial conditions and risk converging to a local solution. In this paper, we propose a new optimization method based on the concept of path integrals used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require experience-based tuning. We applied the new optimization method to a hang glider design, in which both the hang glider design and its flight trajectory were optimized. The numerical results show that the performance of the method is sufficient for practical use.
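The idea of obtaining an optimum as a stochastic average rather than by deterministic search can be sketched as below; the objective function, temperature, and sampling range are illustrative assumptions, not the authors' hang glider formulation.

```python
# A Boltzmann-weighted stochastic average in the spirit of a
# path-integral formulation: random candidates are weighted by
# exp(-f(x)/T), so the expectation concentrates near the global minimum
# as the temperature T shrinks. No initial guess is needed.
import math
import random

random.seed(0)

def f(x):
    # Multimodal test objective; its global minimum is exactly at x = 2.
    return (x - 2.0) ** 2 * (1.0 + 0.1 * math.sin(10.0 * x))

def stochastic_optimum(lo=-5.0, hi=10.0, n=50_000, temperature=0.05):
    # x* ~= E[x * exp(-f(x)/T)] / E[exp(-f(x)/T)], estimated by sampling.
    num = den = 0.0
    for _ in range(n):
        x = random.uniform(lo, hi)
        w = math.exp(-f(x) / temperature)
        num += w * x
        den += w
    return num / den

x_hat = stochastic_optimum()
print(f"estimated optimum: {x_hat:.3f}")
```

Because every candidate is drawn independently over the whole search range, the estimate does not depend on a starting point, which is the property the abstract emphasizes.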
The Spatial Distributions and Variations of Water Environmental Risk in Yinma River Basin, China.
Di, Hui; Liu, Xingpeng; Zhang, Jiquan; Tong, Zhijun; Ji, Meichen
2018-03-15
Water environmental risk is the probability of the occurrence of events, caused by human activities or by the interaction of human activities and natural processes, that will damage a water environment. This study proposed a water environmental risk index (WERI) model to assess the water environmental risk in the Yinma River Basin based on hazard, exposure, vulnerability, and regional management ability indicators of the water environment. The data for each indicator were gathered for 2000, 2005, 2010, and 2015 to assess the spatial and temporal variations in water environmental risk using the particle swarm optimization and analytic hierarchy process (PSO-AHP) method. The results showed that the water environmental risk in the Yinma River Basin decreased from 2000 to 2015. The risk level of the water environment was high in Changchun, while the risk levels in Yitong and Yongji were low. The research methods provide information to support future decision making by risk managers in the Yinma River Basin, which faces a high level of water environmental risk. Moreover, water environment managers could reduce the risks by adjusting the indicators that affect water environmental risks.
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
NASA Astrophysics Data System (ADS)
Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin
2016-04-01
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for the process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative risk assessment per the API 581 standard thus places the existing equipment at medium risk, although in fact no critical problem has occurred in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
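A minimal sketch of the semi-quantitative banding idea: a probability category and a consequence category combine into a risk band. The band boundaries below are invented so that the mapping reproduces the abstract's examples (4C medium-high, 3C medium); they are not the normative API 581 matrix.

```python
# Toy risk matrix: probability category 1-5 and consequence category A-E
# map to a band. Boundaries are illustrative assumptions chosen to match
# the abstract's examples, NOT the normative API 581 matrix.
def risk_band(prob_cat: int, cons_cat: str) -> str:
    score = prob_cat + "ABCDE".index(cons_cat.upper())  # combined 1..9 index
    if score <= 2:
        return "low"
    if score <= 5:
        return "medium"
    if score <= 7:
        return "medium-high"
    return "high"

for item, (p, c) in {"HP superheater": (4, "C"),
                     "HP evaporator": (4, "C"),
                     "HP economizer": (3, "C")}.items():
    print(f"{item} ({p}{c}) -> {risk_band(p, c)}")
```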
A Review on Automatic Mammographic Density and Parenchymal Segmentation
He, Wenda; Juette, Arne; Denton, Erika R. E.; Oliver, Arnau
2015-01-01
Breast cancer is the most frequently diagnosed cancer in women. However, the exact cause(s) of breast cancer remains unknown. Early detection, precise identification of women at risk, and application of appropriate disease prevention measures are by far the most effective ways to tackle breast cancer. There are more than 70 common genetic susceptibility factors included in the current non-image-based risk prediction models (e.g., the Gail and the Tyrer-Cuzick models). Image-based risk factors, such as mammographic densities and parenchymal patterns, have been established as biomarkers but have not been fully incorporated in the risk prediction models used for risk stratification in screening and/or measuring responsiveness to preventive approaches. Within computer aided mammography, automatic mammographic tissue segmentation methods have been developed for estimation of breast tissue composition to facilitate mammographic risk assessment. This paper presents a comprehensive review of automatic mammographic tissue segmentation methodologies developed over the past two decades and the evidence for risk assessment/density classification using segmentation. The aim of this review is to analyse how engineering advances have progressed and the impact automatic mammographic tissue segmentation has in a clinical environment, as well as to understand the current research gaps with respect to the incorporation of image-based risk factors in non-image-based risk prediction models. PMID:26171249
Method for the measurement of susceptibility to decubitus ulcer formation.
Meijer, J H; Schut, G L; Ribbe, M W; Goovaerts, H G; Nieuwenhuys, R; Reulen, J P; Schneider, H
1989-09-01
A method for measuring the susceptibility of a patient to develop decubitus ulcers is described and initially evaluated. It is based on an indirect, noninvasive measurement of the transient regional blood flow response after a test pressure load which simulates the external stimulus for pressure-sore formation. This method was developed to determine the individual risk of a patient and to study the subfactors which contribute to the susceptibility. This would also offer the possibility of evaluating the effect of preventive treatment aimed at reducing the susceptibility. The method was found to discriminate between preselected elderly patients at risk on the one hand, and non-risk patients and healthy young adults on the other hand. No differences in blood flow responses were found between the non-risk elderly patients and the healthy young adults. This suggests that age per se is not a factor in the formation of pressure sores. In the risk group the recovery time after pressure relief was found to be three times as long as the duration of the pressure exercise. This indicates that the recovery time after pressure exercise may be as important as the period of pressure exercise in deducing the risk of developing decubitus ulcers.
ERIC Educational Resources Information Center
MacMaster, Samuel A.; Jones, Jenny L.; Rasch, Randolph F. R.; Crawford, Sharon L.; Thompson, Stephanie; Sanders, Edwin C., II
2007-01-01
Objective: This article provides an evaluation of a federally funded faith-based program that serves African Americans who use heroin and cocaine and are at risk for HIV/AIDS in Nashville, Tennessee. Methods: Data were collected from 163 individuals at baseline and 6- and 12-month follow-up interviews. A subset of participants (n = 51) completed…
2010-06-01
a storytelling narrative of how the risk was mitigated and what worked or did not work. A knowledge-based risk is also a means of transferring...developing a cadre of trained facilitators to assist teams in using Web-based decision-support technology to support team brainstorming...and use "contextualization" (a.k.a. storytelling) as an alternative method to analysis? Storytelling Instead of Analysis There have been some
Milburn, Trelani F; Lonigan, Christopher J; Allan, Darcey M; Phillips, Beth M
2017-04-01
To investigate approaches for identifying young children who may be at risk for later reading-related learning disabilities, this study compared the use of four contemporary methods of indexing learning disability (LD) with older children (i.e., IQ-achievement discrepancy, low achievement, low growth, and dual-discrepancy) to determine risk status with a large sample of 1,011 preschoolers. These children were classified as at risk or not using each method across three early-literacy skills (i.e., language, phonological awareness, print knowledge) and at three levels of severity (i.e., 5th, 10th, 25th percentiles). Chance-corrected affected-status agreement (CCASA) indicated poor agreement among methods with rates of agreement generally decreasing with greater levels of severity for both single- and two-measure classification, and agreement rates were lower for two-measure classification than for single-measure classification. These low rates of agreement between conventional methods of identifying children at risk for LD represent a significant impediment for identification and intervention for young children considered at-risk.
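Chance-corrected agreement between two binary at-risk classifications can be sketched with Cohen's kappa, a closely related chance-corrected statistic (CCASA itself conditions on affected cases); the labels below are hypothetical.

```python
# Cohen's kappa on two hypothetical binary "at risk" classifications:
# observed agreement corrected for the agreement expected by chance.
def kappa(a, b):
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                  # "at risk" rate per method
    p_exp = pa * pb + (1 - pa) * (1 - pb)            # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

method_a = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]   # hypothetical labels, method A
method_b = [1, 0, 0, 0, 1, 1, 0, 0, 0, 0]   # hypothetical labels, method B
print(f"kappa = {kappa(method_a, method_b):.3f}")
```

Note how a 70% raw agreement rate shrinks to a much smaller chance-corrected value, which is why raw agreement overstates the consistency of LD identification methods.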
A risk analysis for production processes with disposable bioreactors.
Merseburger, Tobias; Pahl, Ina; Müller, Daniel; Tanner, Markus
2014-01-01
Quality management systems are, as a rule, tightly defined systems that conserve existing processes and therefore guarantee compliance with quality standards. But maintaining quality also includes introducing new enhanced production methods and making use of the latest findings of bioscience. The advances in biotechnology and single-use manufacturing methods for producing new drugs especially impose new challenges on quality management, as quality standards have not yet been set. New methods to ensure patient safety have to be established, as it is insufficient to rely only on current rules. A concept of qualification, validation, and manufacturing procedures based on risk management needs to be established and realized in pharmaceutical production. The chapter starts with an introduction to the regulatory background of the manufacture of medicinal products. It then continues with key methods of risk management. Hazards associated with the production of medicinal products with single-use equipment are described with a focus on bioreactors, storage containers, and connecting devices. The hazards are subsequently evaluated and criteria for risk evaluation are presented. This chapter concludes with aspects of industrial application of quality risk management.
Assessing Arsenic Bioavailability In Soil When In Vitro Gastrointestinal Methods Are The Only Option
Human health risk assessment science continues to mature with bioavailability-based risk assessment frameworks being developed and/or considered for implementation in the U.S., Canada, the European Union, Australia and other countries. Incidental ingestion is an important exposu...
Historically, risk assessment has relied upon toxicological data to obtain hazard-based reference levels, which are subsequently compared to exposure estimates to determine whether an unacceptable risk to public health may exist. Recent advances in analytical methods, biomarker ...
Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox
Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul
2011-01-01
Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194
Kulinich, S I; Gertsekovich, D A; Cherniak, E V
1994-01-01
The results of questionnaire-based case-control screening for ovarian oncopathology in 478 patients and 478 healthy females are presented. The two groups were matched with respect to age and place of residence. An 82.4% effective mathematical decision rule was developed. It can be used as a method for early diagnosis of ovarian tumors as well as for forming high-risk groups.
Robust approaches to quantification of margin and uncertainty for sparse data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin
Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low-probability, high-consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating the risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk by integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice; Baggs, Rhoda
2007-01-01
Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides software safety measurements for legacy systems, which may or may not have the suite of software engineering documentation now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.
Krass, I; Mitchell, B; Clarke, P; Brillant, M; Dienaar, R; Hughes, J; Lau, P; Peterson, G; Stewart, K; Taylor, S; Wilkinson, J; Armour, C
2007-03-01
To compare the efficacy and cost-effectiveness of two methods of screening for undiagnosed type 2 diabetes in Australian community pharmacy. A random sample of 30 pharmacies was allocated into two groups: (i) tick test only (TTO); or (ii) sequential screening (SS) method. Both methods used the same initial risk assessment for type 2 diabetes. Subjects with one or more risk factors in the TTO group were offered a referral to their general practitioner (GP). Under the SS method, patients with risk factors were offered a capillary blood glucose test and those identified as being at risk were referred to a GP. The effectiveness and cost-effectiveness of these approaches were assessed. A total of 1286 people were screened over a period of 3 months. The rate of diagnosis of diabetes was significantly higher for SS compared with the TTO method (1.7% versus 0.2%; p=0.008). The SS method resulted in fewer referrals to the GP and a higher uptake of referrals than the TTO method and so was the more cost-effective screening method. SS is the superior method from a cost and efficacy perspective. It should be considered as the preferred option for screening by community based pharmacists in Australia.
Landy, Rebecca; Cheung, Li C; Schiffman, Mark; Gage, Julia C; Hyun, Noorie; Wentzensen, Nicolas; Kinney, Walter K; Castle, Philip E; Fetterman, Barbara; Poitras, Nancy E; Lorey, Thomas; Sasieni, Peter D; Katki, Hormuzd A
2018-06-01
Electronic health records (EHR) are increasingly used by epidemiologists studying disease following surveillance testing to provide evidence for screening intervals and referral guidelines. Although cost-effective, undiagnosed prevalent disease and interval censoring (in which asymptomatic disease is only observed at the time of testing) raise substantial analytic issues when estimating risk that cannot be addressed using Kaplan-Meier methods. Based on our experience analysing EHR from cervical cancer screening, we previously proposed the logistic-Weibull model to address these issues. Here we demonstrate how the choice of statistical method can impact risk estimates. We use observed data on 41,067 women in the cervical cancer screening program at Kaiser Permanente Northern California, 2003-2013, as well as simulations to evaluate the ability of different methods (Kaplan-Meier, Turnbull, Weibull and logistic-Weibull) to accurately estimate risk within a screening program. Cumulative risk estimates from the statistical methods varied considerably, with the largest differences occurring for prevalent disease risk when baseline disease ascertainment was random but incomplete. Kaplan-Meier underestimated risk at earlier times and overestimated risk at later times in the presence of interval censoring or undiagnosed prevalent disease. Turnbull performed well, though it was inefficient and not smooth. The logistic-Weibull model performed well, except when event times did not follow a Weibull distribution. We have demonstrated that methods for right-censored data, such as Kaplan-Meier, result in biased estimates of disease risks when applied to interval-censored data, such as screening programs using EHR data. The logistic-Weibull model is attractive, but the model fit must be checked against Turnbull non-parametric risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
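A small simulation, under assumed Weibull and visit-interval parameters rather than the Kaiser Permanente data, illustrates why interval censoring biases naive analyses: when disease onset is timestamped at the next screening visit, early cumulative risk is underestimated.

```python
# Interval-censoring sketch: latent onset times follow a Weibull
# distribution but are only observed at periodic screening visits.
# Treating "detection time" as the event time (as a naive right-censored
# analysis would) shifts risk later. All parameters are assumptions.
import math
import random

random.seed(1)
SHAPE, SCALE, VISIT = 1.5, 6.0, 3.0   # Weibull shape/scale (years), visit spacing
N = 20_000

# True (latent) onset times vs naively timestamped detection times.
onsets = [random.weibullvariate(SCALE, SHAPE) for _ in range(N)]
detections = [math.ceil(t / VISIT) * VISIT for t in onsets]

true_5y = sum(t <= 5.0 for t in onsets) / N        # true 5-year cumulative risk
naive_5y = sum(t <= 5.0 for t in detections) / N   # risk if detection = event time
print(f"true 5-year risk {true_5y:.3f} vs naive {naive_5y:.3f}")
```

The naive estimate is markedly lower at 5 years, matching the abstract's finding that Kaplan-Meier underestimates risk at earlier times under interval censoring.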
NASA Astrophysics Data System (ADS)
Chen, Qi-An; Xiao, Yinghong; Chen, Hui; Chen, Liang
Our research analyzes the effect of traders' subjective risk attitude, optimism, and overconfidence on their risk-taking behavior on the Chinese stock market using an experimental method. We find that investors' risk-taking behavior is significantly affected by their subjective risk attitude, optimism, and overconfidence. Our results also show that the objective return and volatility of a stock are not as good predictors of risk-taking behavior as subjective risk and return measures. Moreover, we illustrate that overconfidence and optimism have a significant impact on risk-taking behavior, in line with theoretical models.
Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R
2013-01-01
The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injury compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
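The loss-distribution estimate described above can be sketched as a compound Poisson-lognormal Monte Carlo; the frequency and severity parameters are illustrative assumptions, not the Lodi claims data.

```python
# Annual loss = sum of a Poisson number of lognormal claim amounts,
# simulated many times to obtain expected loss and tail percentiles.
# All parameters are illustrative assumptions.
import math
import random
from statistics import mean

random.seed(7)
LAMBDA = 25            # expected claims per year (assumed)
MU, SIGMA = 9.0, 1.2   # lognormal claim severity parameters (assumed)
SIMS = 5_000

def poisson(lam):
    # Knuth's inversion method; adequate for small lambda.
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

losses = sorted(sum(random.lognormvariate(MU, SIGMA) for _ in range(poisson(LAMBDA)))
                for _ in range(SIMS))
expected_loss = mean(losses)
loss_95 = losses[int(0.95 * SIMS)]   # percentile used for unexpected loss
print(f"expected annual loss {expected_loss:,.0f}; 95th percentile {loss_95:,.0f}")
```

The gap between the expected value and the upper percentile is the "unexpected loss" the abstract refers to, and it is what drives insurance-policy sizing.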
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacvarov, D.C.
1981-01-01
A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The approach of applying the finite element method to probabilistic risk assessment is shown to be very powerful, for two reasons. First, the finite element method is inherently suitable for analysis of three-dimensional spaces in which the parameters, such as the trivariate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three-dimensional probability spaces, yielding high accuracy in critical regions, such as the area of low-probability events, while maintaining coarse discretization in the non-critical areas to keep the number of grid points and the size of the problem at a manageably low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electromagnetic transients program. Compared with other available methods, the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude more computationally efficient. The method is especially suited for accurate assessment of rare, very-low-probability events.
Screening-Level Ecological Risk Assessment Methods, Revision 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mirenda, Richard J.
2012-08-16
This document provides guidance for screening-level assessments of potential adverse impacts to ecological resources from release of environmental contaminants at the Los Alamos National Laboratory (LANL or the Laboratory). The methods presented are based on two objectives: to provide a basis for reaching consensus with regulators, managers, and other interested parties on how to conduct screening-level ecological risk investigations at the Laboratory; and to provide guidance for ecological risk assessors under the Environmental Programs (EP) Directorate. This guidance promotes consistency, rigor, and defensibility in ecological screening investigations and in reporting those investigation results. The purpose of the screening assessment is to provide information to the risk managers so that informed risk-management decisions can be made. This document provides examples of recommendations and possible risk-management strategies.
Qualitative risk assessment during polymer mortar test specimens preparation - methods comparison
NASA Astrophysics Data System (ADS)
Silva, F.; Sousa, S. P. B.; Arezes, P.; Swuste, P.; Ribeiro, M. C. S.; Baptista, J. S.
2015-05-01
Polymer binder modification with inorganic nanomaterials (NM) could be a potential and efficient solution to control matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposures can occur all along the life cycle of a NM and "nanoproducts", from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory-scale production process was divided into 3 main phases (pre-production, production and post-production), which allowed testing the assessment methods in different situations. The risk assessment involved in the manufacturing process of PM was made using the qualitative analyses based on: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); the Guidance working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro, Italy method (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. The different methods produced different final results. In phases 1 and 3 the risk assessment tends to be classified as medium-high risk, while for phase 2 the more common result is a medium level. It is necessary to improve the use of qualitative methods by defining narrow criteria for method selection for each assessed situation, bearing in mind that the uncertainties are also a relevant factor when dealing with risk in the nanotechnologies field.
NASA Astrophysics Data System (ADS)
van Ginneken, Meike; Oron, Gideon
2000-09-01
This study assesses health risks to consumers due to the use of agricultural products irrigated with reclaimed wastewater. The analysis is based on a definition of an exposure model which takes into account several parameters: (1) the quality of the applied wastewater, (2) the irrigation method, (3) the elapsed times between irrigation, harvest, and product consumption, and (4) the consumers' habits. The exposure model is used for numerical simulation of human consumers' risks using the Monte Carlo simulation method. The results of the numerical simulation show large deviations, probably caused by uncertainty (impreciseness in the quality of input data) and variability due to diversity among populations. There is a difference of 10 orders of magnitude in the risk of infection between the different exposure scenarios with the same water quality. This variation indicates the need for setting risk-based criteria for wastewater reclamation rather than single water quality guidelines. Extra data are required to decrease uncertainty in the risk assessment. Future research needs to include definition of acceptable risk criteria, more accurate dose-response modeling, information regarding pathogen survival in treated wastewater, additional data related to the passage of pathogens into and within plants during irrigation, and information regarding the behavior patterns of the community of human consumers.
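The exposure-model simulation can be sketched as below, with an exponential dose-response model; every parameter range here is an assumption for illustration, not a value from the study.

```python
# Monte Carlo risk-of-infection sketch: pathogen concentration at
# irrigation, first-order die-off until consumption, ingested volume,
# and an exponential dose-response model P = 1 - exp(-r * dose).
# All parameter ranges are illustrative assumptions.
import math
import random

random.seed(3)
R = 0.005       # exponential dose-response parameter (assumed)
DECAY = 0.5     # first-order pathogen die-off rate per day (assumed)
SIMS = 10_000

risks = []
for _ in range(SIMS):
    conc = 10 ** random.uniform(0.0, 3.0)    # pathogens per litre at irrigation
    days = random.uniform(1.0, 14.0)         # irrigation-to-consumption interval
    volume = random.uniform(0.001, 0.01)     # litres of residual water ingested
    dose = conc * math.exp(-DECAY * days) * volume
    risks.append(1.0 - math.exp(-R * dose))  # probability of infection

risks.sort()
print(f"median risk {risks[SIMS // 2]:.2e}; 95th percentile {risks[int(0.95 * SIMS)]:.2e}")
```

Even this toy model spans several orders of magnitude between scenarios, which is the behavior the abstract cites as motivation for risk-based (rather than single-threshold) reclamation criteria.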
Peptide and protein biomarkers for type 1 diabetes mellitus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Qibin; Metz, Thomas O.
2014-06-10
A method for identifying persons with increased risk of developing type 1 diabetes mellitus, or having type I diabetes mellitus, utilizing selected biomarkers described herein either alone or in combination. The present disclosure allows for broad based, reliable, screening of large population bases. Also provided are arrays and kits that can be used to perform such methods.
[Using sequential indicator simulation method to define risk areas of soil heavy metals in farmland].
Yang, Hao; Song, Ying Qiang; Hu, Yue Ming; Chen, Fei Xiang; Zhang, Rui
2018-05-01
Heavy metals in soil have serious impacts on safety, the ecological environment, and human health due to their toxicity and accumulation. Efficiently identifying the risk areas of heavy metals in farmland soil is therefore of great significance for environmental protection, pollution warning, and farmland risk control. We collected 204 samples and analyzed the contents of seven heavy metals (Cu, Zn, Pb, Cd, Cr, As, Hg) in Zengcheng District of Guangzhou, China. To overcome problems in the data, including the influence of abnormal values, skewed distributions, and the smoothing effect of traditional kriging methods, we used the sequential indicator simulation method (SISIM) to map the spatial distribution of heavy metals, combined with the Hakanson index method to identify potential ecological risk areas of heavy metals in farmland. The results showed that: (1) With similar accuracy in the spatial prediction of soil heavy metals, SISIM reproduced spatial detail better than ordinary kriging in this small-scale area. Compared with indicator kriging, SISIM had a lower error rate (4.9%-17.1%) in the uncertainty evaluation of heavy-metal risk identification. SISIM showed less smoothing effect and was more applicable to simulating the spatial uncertainty of soil heavy metals and identifying risk. (2) There was no pollution in Zengcheng's farmland. Moderate potential ecological risk was found in the southern part of the study area due to enterprise production, human activities, and river sediments. This study combined sequential indicator simulation with the Hakanson risk index method and effectively overcame the outlier information loss and smoothing effect of the traditional kriging method. It provides a new way to identify soil heavy metal risk areas in farmland under uneven sampling.
Meteorological risks are drivers of environmental innovation in agro-ecosystem management
NASA Astrophysics Data System (ADS)
Gobin, Anne; Van de Vijver, Hans; Vanwindekens, Frédéric; de Frutos Cachorro, Julia; Verspecht, Ann; Planchon, Viviane; Buyse, Jeroen
2017-04-01
Agricultural crop production is to a great extent determined by weather conditions. The research hypothesis is that meteorological risks act as drivers of environmental innovation in agro-ecosystem management. The methodology comprised five major parts: the hazard, its impact on different agro-ecosystems, vulnerability, risk management and risk communication. Generalized Extreme Value (GEV) theory was used to model annual maxima of meteorological variables based on location, scale and shape parameters that determine, respectively, the centre of the distribution, the spread around the location parameter, and the decay of the upper tail. Spatial interpolation of GEV-derived return levels resulted in spatial temperature extremes, precipitation deficits and wet periods. The temporal overlap between extreme weather conditions and sensitive periods in the agro-ecosystem was realised using a bio-physically based modelling framework that couples phenology, a soil water balance and crop growth. 20-year return values for drought and waterlogging during different crop stages were related to arable yields. The method helped quantify agricultural production risks and rate both weather and crop-based agricultural insurance. The spatial extent of vulnerability is developed on different layers of geo-information to include meteorology, soil-landscapes, crop cover and management. Vulnerability of agroecosystems was mapped based on rules set by experts' knowledge and implemented by Fuzzy Inference System modelling and Geographical Information System tools. The approach was applied for cropland vulnerability to heavy rain and grassland vulnerability to drought. The level of vulnerability and resilience of an agro-ecosystem was also determined by risk management, which differed across sectors and farm types. A calibrated agro-economic model demonstrated a marked influence of climate-adapted land allocation and crop management on individual utility.
The "chain of risk" approach allowed for investigating the hypothesis that meteorological risks act as drivers for agricultural innovation. Risk types were quantified in terms of probability and distribution, and further distinguished according to production type. Examples of strategies and options were provided at field, farm and policy level using different modelling methods.
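The GEV fitting and return-level step described above can be sketched with SciPy. The data here are synthetic stand-ins for an annual-maxima series (a hypothetical 60-year temperature record), not the study's meteorological observations:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic "annual maxima" of a meteorological variable (illustrative only).
rng = np.random.default_rng(0)
annual_maxima = rng.gumbel(loc=30.0, scale=5.0, size=60)  # e.g. 60 years of max temperature

# Fit the GEV: SciPy's c is the shape parameter, loc the location, scale the scale.
shape, loc, scale = genextreme.fit(annual_maxima)

# T-year return level = the value exceeded on average once every T years.
def return_level(T):
    return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

rl5, rl20 = return_level(5), return_level(20)
print(f"5-year return level: {rl5:.1f}, 20-year return level: {rl20:.1f}")
```

Interpolating such return levels over a station network would then yield the spatial extremes maps the abstract refers to.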
Quantifying prognosis with risk predictions.
Pace, Nathan L; Eberhart, Leopold H J; Kranke, Peter R
2012-01-01
Prognosis is a forecast, based on present observations in a patient, of their probable outcome from disease, surgery and so on. Research methods for the development of risk probabilities may not be familiar to some anaesthesiologists. We briefly describe methods for identifying risk factors and risk scores. A probability prediction rule assigns a risk probability to a patient for the occurrence of a specific event. Probability reflects the continuum between absolute certainty (Pi = 1) and certain impossibility (Pi = 0). Biomarkers and clinical covariates that modify risk are known as risk factors. The Pi as modified by risk factors can be estimated by identifying the risk factors and their weighting; these are usually obtained by stepwise logistic regression. The accuracy of probabilistic predictors can be separated into the concepts of 'overall performance', 'discrimination' and 'calibration'. Overall performance is the mathematical distance between predictions and outcomes. Discrimination is the ability of the predictor to rank order observations with different outcomes. Calibration is the correctness of prediction probabilities on an absolute scale. Statistical methods include the Brier score, coefficient of determination (Nagelkerke R2), C-statistic and regression calibration. External validation is the comparison of the actual outcomes to the predicted outcomes in a new and independent patient sample. External validation uses the statistical methods of overall performance, discrimination and calibration and is uniformly recommended before acceptance of the prediction model. Evidence from randomised controlled clinical trials should be obtained to show the effectiveness of risk scores for altering patient management and patient outcomes.
Augmenting the Deliberative Method for Ranking Risks.
Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel
2016-01-01
The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.
Epidemiologic research using probabilistic outcome definitions.
Cai, Bing; Hennessy, Sean; Lo Re, Vincent; Small, Dylan S
2015-01-01
Epidemiologic studies using electronic healthcare data often define the presence or absence of binary clinical outcomes by using algorithms with imperfect specificity, sensitivity, and positive predictive value. This results in misclassification and bias in study results. We describe and evaluate a new method called probabilistic outcome definition (POD) that uses logistic regression to estimate the probability of a clinical outcome using multiple potential algorithms and then uses multiple imputation to make valid inferences about the risk ratio or other epidemiologic parameters of interest. We conducted a simulation to evaluate the performance of the POD method with two variables that can predict the true outcome and compared the POD method with the conventional method. The simulation results showed that when the true risk ratio is equal to 1.0 (null), the conventional method based on a binary outcome provides unbiased estimates. However, when the risk ratio is not equal to 1.0, the traditional method, either using one predictive variable or both predictive variables to define the outcome, is biased when the positive predictive value is <100%, and the bias is very severe when the sensitivity or positive predictive value is poor (less than 0.75 in our simulation). In contrast, the POD method provides unbiased estimates of the risk ratio both when this measure of effect is equal to 1.0 and not equal to 1.0. Even when the sensitivity and positive predictive value are low, the POD method continues to provide unbiased estimates of the risk ratio. The POD method provides an improved way to define outcomes in database research. This method has a major advantage over the conventional method in that it provided unbiased estimates of risk ratios and it is easy to use. Copyright © 2014 John Wiley & Sons, Ltd.
Developing an objective evaluation method to estimate diabetes risk in community-based settings.
Kenya, Sonjia; He, Qing; Fullilove, Robert; Kotler, Donald P
2011-05-01
Exercise interventions often aim to affect abdominal obesity and glucose tolerance, two significant risk factors for type 2 diabetes. Because of limited financial and clinical resources in community and university-based environments, intervention effects are often measured with interviews or questionnaires and correlated with weight loss or body fat indicated by bioimpedance analysis (BIA). However, self-reported assessments are subject to high levels of bias and low levels of reliability. Because obesity and body fat are correlated with diabetes at different levels in various ethnic groups, data reflecting changes in weight or fat do not necessarily indicate changes in diabetes risk. To determine how exercise interventions affect diabetes risk in community and university-based settings, improved evaluation methods are warranted. We compared a noninvasive, objective measurement technique--regional BIA--with whole-body BIA for its ability to assess abdominal obesity and predict glucose tolerance in 39 women. To determine regional BIA's utility in predicting glucose, we tested the association between the regional BIA method and blood glucose levels. Regional BIA estimates of abdominal fat area were significantly correlated (r = 0.554, P < 0.003) with fasting glucose. When waist circumference and family history of diabetes were added to abdominal fat in multiple regression models, the association with glucose increased further (r = 0.701, P < 0.001). Regional BIA estimates of abdominal fat may predict fasting glucose better than whole-body BIA as well as provide an objective assessment of changes in diabetes risk achieved through physical activity interventions in community settings.
A Multifaceted School-based Intervention to Reduce Risk for Type 2 Diabetes in At-Risk Youth
Grey, Margaret; Jaser, Sarah S.; Holl, Marita G.; Jefferson, Vanessa; Dziura, James; Northrup, Veronika
2009-01-01
Objective To evaluate the impact of a multifaceted, school-based intervention on inner city youth at high risk for type 2 diabetes mellitus (T2DM) and to determine whether the addition of coping skills training (CST) and health coaching improves outcomes. Method 198 students in New Haven, CT at risk for T2DM (BMI > 85th percentile and family history of diabetes) were randomized by school to an educational intervention with or without the addition of CST and health coaching. Students were enrolled from 2004–2007 and followed for 12 months. Results Students in both groups showed some improvement in anthropometric measures, lipids, and depressive symptoms over 12 months. BMI was not improved by the intervention. Students who received CST showed greater improvement on some indicators of metabolic risk than students who received education only. Conclusion A multifaceted, school-based intervention may hold promise for reducing metabolic risk in urban, minority youth. PMID:19643125
Are Cultural Values and Beliefs Included in U.S. Based HIV Interventions?
Wyatt, Gail E.; Williams, John K.; Gupta, Arpana; Malebranche, Dominique
2013-01-01
Objective To determine the extent to which current U.S. based HIV/AIDS prevention and risk reduction interventions address and include aspects of cultural beliefs in definitions, curricula, measures and related theories that may contradict current safer sex messages. Method A comprehensive literature review was conducted to determine which published HIV/AIDS prevention and risk reduction interventions incorporated aspects of cultural beliefs. Results This review of 166 HIV prevention and risk reduction interventions, published between 1988 and 2010, identified 34 interventions that varied in cultural definitions and the integration of cultural concepts. Conclusion HIV interventions need to move beyond targeting specific populations based upon race/ethnicity, gender, sexual, drug and/or risk behaviors and incorporate cultural beliefs and experiences pertinent to an individual’s risk. Theory based interventions that incorporate cultural beliefs within a contextual framework are needed if prevention and risk reduction messages are to reach targeted at risk populations. Implications for the lack of uniformity of cultural definitions, measures and related theories are discussed and recommendations are made to ensure that cultural beliefs are acknowledged for their potential conflict with safer sex skills and practices. PMID:21884721
Privacy-preserving record linkage on large real world datasets.
Randall, Sean M; Ferrante, Anna M; Boyd, James H; Bauer, Jacqueline K; Semmens, James B
2014-08-01
Record linkage typically involves the use of dedicated linkage units who are supplied with personally identifying information to determine individuals from within and across datasets. The personally identifying information supplied to linkage units is separated from clinical information prior to release by data custodians. While this substantially reduces the risk of disclosure of sensitive information, some residual risks still exist and remain a concern for some custodians. In this paper we trial a method of record linkage which reduces privacy risk still further on large real world administrative data. The method uses encrypted personal identifying information (bloom filters) in a probability-based linkage framework. The privacy preserving linkage method was tested on ten years of New South Wales (NSW) and Western Australian (WA) hospital admissions data, comprising in total over 26 million records. No difference in linkage quality was found when the results were compared to traditional probabilistic methods using full unencrypted personal identifiers. This presents as a possible means of reducing privacy risks related to record linkage in population level research studies. It is hoped that through adaptations of this method or similar privacy preserving methods, risks related to information disclosure can be reduced so that the benefits of linked research taking place can be fully realised. Copyright © 2013 Elsevier Inc. All rights reserved.
Ferreira, António Miguel; Marques, Hugo; Tralhão, António; Santos, Miguel Borges; Santos, Ana Rita; Cardoso, Gonçalo; Dores, Hélder; Carvalho, Maria Salomé; Madeira, Sérgio; Machado, Francisco Pereira; Cardim, Nuno; de Araújo Gonçalves, Pedro
2016-11-01
Current guidelines recommend the use of the Modified Diamond-Forrester (MDF) method to assess the pre-test likelihood of obstructive coronary artery disease (CAD). We aimed to compare the performance of the MDF method with two contemporary algorithms derived from multicenter trials that additionally incorporate cardiovascular risk factors: the calculator-based 'CAD Consortium 2' method, and the integer-based CONFIRM score. We assessed 1069 consecutive patients without known CAD undergoing coronary CT angiography (CCTA) for stable chest pain. Obstructive CAD was defined as the presence of coronary stenosis ≥50% on 64-slice dual-source CT. The three methods were assessed for calibration, discrimination, net reclassification, and changes in proposed downstream testing based upon calculated pre-test likelihoods. The observed prevalence of obstructive CAD was 13.8% (n=147). Overestimations of the likelihood of obstructive CAD were 140.1%, 9.8%, and 18.8%, respectively, for the MDF, CAD Consortium 2 and CONFIRM methods. The CAD Consortium 2 showed greater discriminative power than the MDF method, with a C-statistic of 0.73 vs. 0.70 (p<0.001), while the CONFIRM score did not (C-statistic 0.71, p=0.492). Reclassification of pre-test likelihood using the 'CAD Consortium 2' or CONFIRM scores resulted in a net reclassification improvement of 0.19 and 0.18, respectively, which would change the diagnostic strategy in approximately half of the patients. Newer risk factor-encompassing models allow for a more precise estimation of pre-test probabilities of obstructive CAD than the guideline-recommended MDF method. Adoption of these scores may improve disease prediction and change the diagnostic pathway in a significant proportion of patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Bayesian-network-based safety risk assessment for steel construction projects.
Leu, Sou-Sen; Chang, Ching-Miao
2013-05-01
There are four primary accident types at steel building construction (SC) projects: falls (tumbles), object falls, object collapse, and electrocution. Several systematic safety risk assessment approaches, such as fault tree analysis (FTA) and failure mode and effect criticality analysis (FMECA), have been used to evaluate safety risks at SC projects. However, these traditional methods ineffectively address dependencies among safety factors at various levels that fail to provide early warnings to prevent occupational accidents. To overcome the limitations of traditional approaches, this study addresses the development of a safety risk-assessment model for SC projects by establishing the Bayesian networks (BN) based on fault tree (FT) transformation. The BN-based safety risk-assessment model was validated against the safety inspection records of six SC building projects and nine projects in which site accidents occurred. The ranks of posterior probabilities from the BN model were highly consistent with the accidents that occurred at each project site. The model accurately provides site safety-management abilities by calculating the probabilities of safety risks and further analyzing the causes of accidents based on their relationships in BNs. In practice, based on the analysis of accident risks and significant safety factors, proper preventive safety management strategies can be established to reduce the occurrence of accidents on SC sites. Copyright © 2013 Elsevier Ltd. All rights reserved.
Beronius, Anna; Molander, Linda; Zilliacus, Johanna; Rudén, Christina; Hanberg, Annika
2018-05-28
The Science in Risk Assessment and Policy (SciRAP) web-based platform was developed to promote and facilitate structure and transparency in the evaluation of ecotoxicity and toxicity studies for hazard and risk assessment of chemicals. The platform includes sets of criteria and a colour-coding tool for evaluating the reliability and relevance of individual studies. The SciRAP method for evaluating in vivo toxicity studies was first published in 2014 and the aim of the work presented here was to evaluate and develop that method further. Toxicologists and risk assessors from different sectors and geographical areas were invited to test the SciRAP criteria and tool on a specific set of in vivo toxicity studies and to provide feedback concerning the scientific soundness and user-friendliness of the SciRAP approach. The results of this expert assessment were used to refine and improve both the evaluation criteria and the colour-coding tool. It is expected that the SciRAP web-based platform will continue to be developed and enhanced to keep up to date with the needs of end-users. Copyright © 2018 John Wiley & Sons, Ltd.
Perceived Versus Objective Breast Cancer Risk in Diverse Women
Fehniger, Julia; Livaudais-Toman, Jennifer; Karliner, Leah; Kerlikowske, Karla; Tice, Jeffrey A.; Quinn, Jessica; Ozanne, Elissa
2014-01-01
Abstract Background: Prior research suggests that women do not accurately estimate their risk for breast cancer. Estimating and informing women of their risk is essential for tailoring appropriate screening and risk reduction strategies. Methods: Data were collected for BreastCARE, a randomized controlled trial designed to evaluate a PC-tablet based intervention providing multiethnic women and their primary care physicians with tailored information about breast cancer risk. We included women ages 40–74 visiting general internal medicine primary care clinics at one academic practice and one safety net practice who spoke English, Spanish, or Cantonese, and had no personal history of breast cancer. We collected baseline information regarding risk perception and concern. Women were categorized as high risk (vs. average risk) if their family history met criteria for referral to genetic counseling or if they were in the top 5% of risk for their age based on the Gail or Breast Cancer Surveillance Consortium Model (BCSC) breast cancer risk model. Results: Of 1,261 participants, 25% (N=314) were classified as high risk. More average risk than high risk women had correct risk perception (72% vs. 18%); 25% of both average and high risk women reported being very concerned about breast cancer. Average risk women with correct risk perception were less likely to be concerned about breast cancer (odds ratio [OR]=0.3; 95% confidence interval [CI]=0.2–0.4) while high risk women with correct risk perception were more likely to be concerned about breast cancer (OR=5.1; 95%CI=2.7–9.6). Conclusions: Many women did not accurately perceive their risk for breast cancer. Women with accurate risk perception had an appropriate level of concern about breast cancer. Improved methods of assessing and informing women of their breast cancer risk could motivate high risk women to apply appropriate prevention strategies and allay unnecessary concern among average risk women. PMID:24372085
NASA Astrophysics Data System (ADS)
Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio
Conventional optimization methods are based on a deterministic approach, since their purpose is to find an exact solution. However, these methods depend on initial conditions and risk falling into local solutions. In this paper, we propose a new optimization method based on the concept of the path integral method used in quantum mechanics. The method obtains a solution as an expected value (stochastic average) using a stochastic process. The advantages of this method are that it is not affected by initial conditions and does not require experience-based tuning. We applied the new optimization method to the design of a hang glider. In this problem, not only the hang glider design but also its flight trajectory were optimized. The numerical calculation results showed that the method has sufficient performance.
There are significant scientific and technological challenges to managing natural resources. Data needs are cited as an obvious limitation, but there exist more fundamental scientific issues. What is still needed is a method of comparing management strategies based on projected i...
Community-based programs for assessing and mitigating environmental risks represent a challenge to participants because each brings a different level of understanding of the issues affecting the community. These programs often require the collaboration of several community sectors...
AN INFORMATIC APPROACH TO ESTIMATING ECOLOGICAL RISKS POSED BY PHARMACEUTICAL USE
A new method for estimating risks of human prescription pharmaceuticals based on information found in regulatory filings as well as scientific and trade literature is described in a presentation at the Pharmaceuticals in the Environment Workshop in Las Vegas, NV, August 23-25, 20...
Cut set-based risk and reliability analysis for arbitrarily interconnected networks
Wyss, Gregory D.
2000-01-01
Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses
Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah
2015-01-01
Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
Small numbers, disclosure risk, security, and reliability issues in Web-based data query systems.
Rudolph, Barbara A; Shah, Gulzar H; Love, Denise
2006-01-01
This article describes the process for developing consensus guidelines and tools for releasing public health data via the Web and highlights approaches leading agencies have taken to balance disclosure risk with public dissemination of reliable health statistics. An agency's choice of statistical methods for improving the reliability of released data for Web-based query systems is based upon a number of factors, including query system design (dynamic analysis vs preaggregated data and tables), population size, cell size, data use, and how data will be supplied to users. The article also describes those efforts that are necessary to reduce the risk of disclosure of an individual's protected health information.
Market-implied spread for earthquake CAT bonds: financial implications of engineering decisions.
Damnjanovic, Ivan; Aslan, Zafer; Mander, John
2010-12-01
In the event of natural and man-made disasters, owners of large-scale infrastructure facilities (assets) need contingency plans to effectively restore the operations within the acceptable timescales. Traditionally, the insurance sector provides the coverage against potential losses. However, there are many problems associated with this traditional approach to risk transfer including counterparty risk and litigation. Recently, a number of innovative risk mitigation methods, termed alternative risk transfer (ART) methods, have been introduced to address these problems. One of the most important ART methods is catastrophe (CAT) bonds. The objective of this article is to develop an integrative model that links engineering design parameters with financial indicators including spread and bond rating. The developed framework is based on a four-step structural loss model and transformed survival model to determine expected excess returns. We illustrate the framework for a seismically designed bridge using two unique CAT bond contracts. The results show a nonlinear relationship between engineering design parameters and market-implied spread. © 2010 Society for Risk Analysis.
Classifying Nanomaterial Risks Using Multi-Criteria Decision Analysis
NASA Astrophysics Data System (ADS)
Linkov, I.; Steevens, J.; Chappell, M.; Tervonen, T.; Figueira, J. R.; Merad, M.
There is rapidly growing interest by regulatory agencies and stakeholders in the potential toxicity and other risks associated with nanomaterials throughout the different stages of the product life cycle (e.g., development, production, use and disposal). Risk assessment methods and tools developed and applied to chemical and biological material may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material because of the variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as promote the safe use/handling of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. The stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different risk categories based on our current knowledge of nanomaterial's physico-chemical characteristics, variation in produced material, and best professional judgement. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
Analysis of dengue fever risk using geostatistics model in bone regency
NASA Astrophysics Data System (ADS)
Amran, Stang, Mallongi, Anwar
2017-03-01
This research aims to analyze dengue fever risk based on a geostatistical model in Bone Regency. Risk levels of dengue fever are represented by the parameter of a binomial distribution. The effects of temperature, rainfall, elevation, and larva abundance are investigated through the geostatistical model. A Bayesian hierarchical method is used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have significant effects on dengue fever risk in Bone Regency.
Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins
Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.
2010-12-14
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
Climate change vulnerability for species-Assessing the assessments.
Wheatley, Christopher J; Beale, Colin M; Bradbury, Richard B; Pearce-Higgins, James W; Critchlow, Rob; Thomas, Chris D
2017-09-01
Climate change vulnerability assessments are commonly used to identify species at risk from global climate change, but the wide range of methodologies available makes it difficult for end users, such as conservation practitioners or policymakers, to decide which method to use as a basis for decision-making. In this study, we evaluate whether different assessments consistently assign species to the same risk categories and whether any of the existing methodologies perform well at identifying climate-threatened species. We compare the outputs of 12 climate change vulnerability assessment methodologies, using both real and simulated species, and validate the methods using historic data for British birds and butterflies (i.e. using historical data to assign risks and more recent data for validation). Our results show that the different vulnerability assessment methods are not consistent with one another; different risk categories are assigned for both the real and simulated sets of species. Validation of the different vulnerability assessments suggests that methods incorporating historic trend data into the assessment perform best at predicting distribution trends in subsequent time periods. This study demonstrates that climate change vulnerability assessments should not be used interchangeably due to the poor overall agreement between methods when considering the same species. The results of our validation provide more support for the use of trend-based rather than purely trait-based approaches, although further validation will be required as data become available. © 2017 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Sinha, Jill Witmer
2007-01-01
Many minority adolescents in the United States today are at a high risk for truancy, dropout, and academic under-achievement. Truancy is related to a host of preceding and subsequent risks such as delinquency and limited vocational outcomes. Using participatory research methods, this federally funded, 10-month study assessed youths' perceptions of…
ERIC Educational Resources Information Center
Lauritsen, Marlene Briciet; Pedersen, Carsten Bocker; Mortensen, Preben Bo
2005-01-01
Background: The etiology of autism is unknown. A strong genetic component has been detected but non-genetic factors may also be involved in the etiology. Methods: We used data from the Danish Psychiatric Central Register and the Danish Civil Registration System to study some risk factors of autism, including place of birth, parental place of…
Modelling Risk to US Military Populations from Stopping Blanket Mandatory Polio Vaccination
Burgess, Andrew
2017-01-01
Objectives Transmission of polio poses a threat to military forces deploying to regions where such viruses are endemic. US-born soldiers generally enter service with immunity resulting from childhood immunization against polio; moreover, new recruits are routinely vaccinated with inactivated poliovirus vaccine (IPV), supplemented based upon deployment circumstances. Given residual protection from childhood vaccination, risk-based vaccination may sufficiently protect troops from polio transmission. Methods This analysis employed a mathematical model of polio transmission within military populations interacting with local populations in a polio-endemic region to evaluate changes in vaccination policy. Results Removal of blanket immunization had no effect on simulated polio incidence among deployed military populations when risk-based immunization was employed; however, when these individuals reintegrated with their base populations, the risk of transmission to nondeployed personnel increased by 19%. In the absence of both blanket- and risk-based immunization, transmission to nondeployed populations increased by 25%. The overall number of new infections among nondeployed populations was negligible in both scenarios due to high childhood immunization rates, partial protection against transmission conferred by IPV, and low global disease incidence levels. Conclusion Risk-based immunization driven by deployment to polio-endemic regions is sufficient to prevent transmission among both deployed and nondeployed US military populations. PMID:29104608
Creating a Chinese suicide dictionary for identifying suicide risk on social media.
Lv, Meizhen; Li, Ang; Liu, Tianli; Zhu, Tingshao
2015-01-01
Introduction. Suicide has become a serious worldwide epidemic. Early detection of individual suicide risk in the population is important for reducing suicide rates. Traditional methods are ineffective in identifying suicide risk in time, suggesting a need for novel techniques. This paper proposes to detect suicide risk on social media using a Chinese suicide dictionary. Methods. To build the Chinese suicide dictionary, eight researchers were recruited to select initial words from 4,653 posts published on Sina Weibo (the largest social media service provider in China) and two Chinese sentiment dictionaries (HowNet and NTUSD). Then, another three researchers were recruited to filter out irrelevant words. Finally, the remaining words were further expanded using a corpus-based method. After building the Chinese suicide dictionary, we tested its performance in identifying suicide risk on Weibo. First, we compared dictionary-based identifications with expert ratings in both detecting suicidal expression in Weibo posts and evaluating individual levels of suicide risk. Second, to differentiate between individuals with high and non-high scores on a self-rating measure of suicide risk (Suicidal Possibility Scale, SPS), we built Support Vector Machine (SVM) models on the Chinese suicide dictionary and the Simplified Chinese Linguistic Inquiry and Word Count (SCLIWC) program, respectively, and then compared the classification performance of the two types of SVM models. Results and Discussion. Dictionary-based identifications were significantly correlated with expert ratings in terms of both detecting suicidal expression (r = 0.507) and evaluating individual suicide risk (r = 0.455). 
For the differentiation between individuals with high and non-high scores on SPS, the Chinese suicide dictionary (t1: F1 = 0.48; t2: F1 = 0.56) produced more accurate identifications than SCLIWC (t1: F1 = 0.41; t2: F1 = 0.48) over different observation windows. Conclusions. This paper confirms that, using social media, it is possible to implement real-time monitoring of individual suicide risk in the population. Results of this study may be useful for improving Chinese suicide prevention programs and may be insightful for other countries.
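The dictionary-based SVM classification step can be illustrated with a toy example. The English words, posts, and labels below are hypothetical stand-ins for the Chinese suicide dictionary and Weibo data; features are simple dictionary-word counts per post:

```python
import numpy as np
from sklearn.svm import SVC

# Toy dictionary and labeled posts (invented for illustration)
dictionary = ["hopeless", "die", "worthless", "goodbye", "pain"]
posts = [
    "i feel hopeless and want to die",
    "so much pain worthless goodbye",
    "great lunch with friends today",
    "excited about the new movie",
    "nothing but pain and hopeless nights",
    "sunny weather going for a run",
]
labels = [1, 1, 0, 0, 1, 0]  # 1 = post contains suicidal expression

def dict_features(text):
    """Count occurrences of each dictionary word in the post."""
    words = text.lower().split()
    return [words.count(w) for w in dictionary]

X = np.array([dict_features(p) for p in posts])
clf = SVC(kernel="linear").fit(X, labels)

risky = clf.predict([dict_features("goodbye the pain is too much")])[0]
neutral = clf.predict([dict_features("what a lovely afternoon")])[0]
print(risky, neutral)
```

In the paper, the same idea is applied at scale with the Chinese lexicon and compared against SCLIWC-based features.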
Manuel, Douglas G.; Tuna, Meltem; Hennessy, Deirdre; Okhmatovskaia, Anya; Finès, Philippe; Tanuseputro, Peter; Tu, Jack V.; Flanagan, William
2014-01-01
Background Reductions in preventable risks associated with cardiovascular disease have contributed to a steady decrease in its incidence over the past 50 years in most developed countries. However, it is unclear whether this trend will continue. Our objective was to examine future risk by projecting trends in preventable risk factors in Canada to 2021. Methods We created a population-based microsimulation model using national data on births, deaths and migration; socioeconomic data; cardiovascular disease risk factors; and algorithms for changes in these risk factors (based on sociodemographic characteristics and previous cardiovascular disease risk). An initial population of 22.5 million people, representing the Canadian adult population in 2001, had 13 characteristics including the risk factors used in clinical risk prediction. There were 6.1 million potential exposure profiles for each person each year. Outcome measures included annual prevalence of risk factors (smoking, obesity, diabetes, hypertension and lipid levels) and of co-occurring risks. Results From 2003 to 2009, the projected risks of cardiovascular disease based on the microsimulation model closely approximated those based on national surveys. Except for obesity and diabetes, all risk factors were projected to decrease through to 2021. The largest projected decreases were for the prevalence of smoking (from 25.7% in 2001 to 17.7% in 2021) and uncontrolled hypertension (from 16.1% to 10.8%). Between 2015 and 2017, obesity was projected to surpass smoking as the most prevalent risk factor. Interpretation Risks of cardiovascular disease are projected to decrease modestly in Canada, leading to a likely continuing decline in its incidence. PMID:25077135
2014-01-01
Background We developed a standardised method to assess the quality of infection control in Dutch nursing homes (NHs), based on a cross-sectional survey that visualises the results. The method was called the Infection control RIsk Infection Scan (IRIS). We tested the applicability of this new tool in a multicentre surveillance executed in June and July 2012. Methods The IRIS includes two patient outcome variables, i.e. the prevalence of healthcare-associated infections (HAI) and rectal carriage of Extended-Spectrum Beta-Lactamase-producing Enterobacteriaceae (ESBL-E); two patient-related risk factors, i.e. use of medical devices and antimicrobial therapy; and three ward-related risk factors, i.e. environmental contamination, availability of local guidelines, and shortcomings in infection prevention preconditions. Results were categorised as low, intermediate, and high risk, presented in an easy-to-read graphic risk spider-plot. This plot was given as feedback to the management and healthcare workers of the NHs. Results Large differences were found for most of the variables among the different NHs. Common shortcomings were the availability of infection control guidelines and the level of environmental cleaning. The most striking differences were observed in the prevalence of ESBL carriage, which ranged from zero to 20.6% (p < 0.001). Conclusions The IRIS provided a rapid and easy-to-understand assessment of the infection control situation of the participating NHs. The results can be used to improve the quality of infection control based on the specific needs of an NH, but the tool needs further validation in future studies. Repeated measurement can determine the effectiveness of interventions. This makes the IRIS a useful tool for quality systems. PMID:25243067
Effects of protection forests on rockfall risks: implementation in the Swiss risk concept
NASA Astrophysics Data System (ADS)
Trappmann, Daniel; Moos, Christine; Fehlmann, Michael; Ernst, Jacqueline; Sandri, Arthur; Dorren, Luuk; Stoffel, Markus
2016-04-01
Forests growing on slopes below active rockfall cliffs can provide effective protection for human lives and infrastructure. The risk-based approach for natural hazards in Switzerland shall take such biological measures into account just like existing technical protective measures, provided that certain criteria regarding condition, maintenance and durability are met. This contribution describes a project in which we are investigating how the effects of protection forests can be considered in rockfall risk analyses in an appropriate way. In principle, protection forests reduce rockfall risks in three different ways: (i) reduction of the event magnitude (energy) due to collisions with tree stems; (ii) reduction of the frequency of occurrence of a given scenario (block volume arriving at the damage potential); (iii) reduction of the spatial probability of occurrence (spread and runout) of a given scenario in the case of multiple fragments during one event. The aim of this work is to develop methods for adequately implementing these three effects of rockfall protection forests in risk calculations. To achieve this, we use rockfall simulations that take collisions with trees into account, together with detailed field validation. On five test sites, detailed knowledge of past rockfall activity is gathered by combining investigations of impacted trees, analysis of documented historical events, and deposits in the field. Based on these empirical data on past rockfalls, a methodology is developed that allows transferring real past rockfall activity to simulation results obtained with the three-dimensional, process-based model Rockyfor3D. Different ways of quantifying the protective role of forests will be considered by comparing simulation results with and without forest cover. Combining these different research approaches, systematic considerations shall lead to the development of methods for adequate inclusion of the protective effects of forests in risk calculations. 
The applicability of the developed methods will be tested on the case-study slopes in order to ensure that they are practical for a broad range of rockfall situations on forested slopes.
NASA Astrophysics Data System (ADS)
Moan, T.
2017-12-01
An overview of integrity management of offshore structures, with emphasis on the oil and gas energy sector, is given. Based on relevant accident experiences and the means to control the associated risks, accidents are categorized from a technical-physical as well as a human and organizational point of view. Structural risk relates to extreme actions as well as structural degradation. Risk mitigation measures, including adequate design criteria, inspection, repair and maintenance as well as quality assurance and control of engineering processes, are briefly outlined. The current status of risk and reliability methodology to aid decisions in integrity management is briefly reviewed. Finally, the need to balance the uncertainties in data, methods, and computational effort, and to apply high-fidelity methods cautiously and under quality assurance and control so as to avoid human errors, is emphasized, together with a plea to develop both high-fidelity and efficient simplified methods for design.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
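The core reliability computation described above, combining a structural response distribution with a resistance distribution to estimate the failure probability P(R < S) and then a cost-based risk, can be sketched by Monte Carlo. The distributions and cost figure below are illustrative assumptions, not NESSUS models:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Stress-strength reliability: the component fails when response S exceeds
# resistance R.  Both distributions are invented for illustration.
S = rng.lognormal(mean=np.log(100), sigma=0.2, size=n)   # structural response
R = rng.normal(loc=180, scale=20, size=n)                # structural resistance

p_fail = np.mean(R < S)
print(f"estimated failure probability: {p_fail:.4f}")

# Risk with respect to cost: expected loss given an assumed consequence cost
cost_of_failure = 5e6
risk = p_fail * cost_of_failure
```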
New Methods for the Analysis of Heartbeat Behavior in Risk Stratification
Glass, Leon; Lerma, Claudia; Shrier, Alvin
2011-01-01
Developing better methods of risk stratification for tachyarrhythmic sudden cardiac death remains a major challenge for physicians and scientists. Since the transition from sinus rhythm to ventricular tachycardia/fibrillation happens by different mechanisms in different people, it is unrealistic to think that a single measure will be adequate to provide a good index for risk stratification. We analyze the dynamical properties of ventricular premature complexes over 24 h in an effort to understand the underlying mechanisms of ventricular arrhythmias and to better understand the arrhythmias that occur in individual patients. Two-dimensional density plots, called heartprints, correlate characteristic features of the dynamics of premature ventricular complexes and the sinus rate. Heartprints show distinctive characteristics in individual patients. Based on a better understanding of the nature of transitions from sinus rhythm to sudden cardiac death and the mechanisms of arrhythmia prior to cardiac arrest, it should be possible to develop better methods for risk stratification. PMID:22144963
An Emerging New Risk Analysis Science: Foundations and Implications.
Aven, Terje
2018-05-01
To solve real-life problems-such as those related to technology, health, security, or climate change-and make suitable decisions, risk is nearly always a main issue. Different types of sciences are often supporting the work, for example, statistics, natural sciences, and social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and large uncertainties when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking, from the search for accurate predictions and risk estimates, to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct/separate risk analysis science for solving risk problems, supporting science in general and other disciplines in particular. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Teschke, Kay; Marion, Stephen A; Tsui, Joseph K C; Shen, Hui; Rugbjerg, Kathrine; Harris, M Anne
2014-02-01
We used a population-based sample of 403 Parkinson's disease cases and 405 controls to examine risks by occupation. Results were compared to a previous clinic-based analysis. With censoring of jobs held within 10 years of diagnosis, the following had significantly or strongly increased risks: social science, law and library jobs (OR = 1.8); farming and horticulture jobs (OR = 2.0); gas station jobs (OR = 2.6); and welders (OR = 3.0). The following had significantly decreased risks: management and administration jobs (OR = 0.70); and other health care jobs (OR = 0.44). These results were consistent with other findings for social science and farming occupations. Risks for teaching, medicine and health occupations were not elevated, unlike our previous clinic-based study. This underscores the value of population-based over clinic-based samples. Occupational studies may be particularly susceptible to referral bias because social networks may spread preferentially via jobs. © 2013 Wiley Periodicals, Inc.
Stream Bank Stability in Eastern Nebraska
Soenksen, Phillip J.; Turner, Mary J.; Dietsch, Benjamin J.; Simon, Andrew
2003-01-01
Dredged and straightened channels in eastern Nebraska have experienced degradation leading to channel widening by bank failure. Degradation has progressed headward and affected the drainage systems upstream from the modified reaches. This report describes a study that was undertaken to analyze bank stability at selected sites in eastern Nebraska and develop a simplified method for estimating the stability of banks at future study sites. Bank cross sections along straight reaches of channel and geotechnical data were collected at approximately 150 sites in 26 counties of eastern Nebraska. The sites were categorized into three groups based on mapped soil permeability. With increasing permeability of the soil groups, the median cohesion values decreased and the median friction angles increased. Three analytical methods were used to determine if banks were stable (should not fail even when saturated), at risk (should not fail unless saturated), or unstable (should have already failed). The Culmann and Agricultural Research Service methods were based on the Coulomb equation and planar failure; an indirect method was developed that was based on Bishop's simplified method of slices and rotational failure. The maximum angle from horizontal at which the bank would be stable for the given soil and bank height conditions also was computed with the indirect method. Because of few soil shear-strength data, all analyses were based on the assumption of homogeneous banks, which was later shown to be atypical, at least for some banks. Using the Culmann method and assuming no soil tension cracks, 67 percent of all 908 bank sections were identified as stable, 32 percent were at risk, and 1 percent were unstable; when tension cracks were assumed, the results changed to 58 percent stable, 40 percent at risk, and 1 percent unstable. Using the Agricultural Research Service method, 67 percent of all bank sections were identified as stable and 33 percent were at risk. 
Using the indirect method, 62 percent of all bank sections were identified as stable and 31 percent were at risk; 3 percent were unstable, and 3 percent were outside of the range of the tables developed for the method. For each of the methods that were used, the largest percentage of stable banks and the smallest percentage of at risk banks was for the soil group with the lowest soil permeability and highest median cohesion values. A comparison of the expected stable bank angles for saturated conditions and the surveyed bank angles indicated that many of the surveyed bank angles were considerably less than the maximum expected stable bank angles despite the banks being classified as at risk or unstable. For severely degraded channels along straight reaches this was not expected. It was expected that they would have angles close to the maximum stable angle as they should have been failing from an oversteepened condition. Several explanations are possible. The channel reaches of some study sites have not yet been affected to a significant degree by degradation; study sites were selected throughout individual basins and severe degradation has not yet extended to some sites along upper reaches; and some reaches have experienced aggradation as degradation progresses upstream. Another possibility is that some bank sections have been affected by lateral migration processes, which typically result in shallow bank angles on the inside bend of the channel. Another possibility is that the maximum expected stable bank angles are too steep. The stability methods used were well established and in essential agreement with each other, and there was no reason to question the geometry data. This left non-representative soil data as a probable reason for computed stable bank angles that, at least in some cases, are overly steep. 
Based on an examination of the cohesion data, to which the stable bank-angle calculations were most sensitive, both vertical and horizontal variability in soil properties
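The Culmann planar-failure analysis used in the report admits a compact implementation. The soil parameters below are hypothetical, and treating saturation as a reduction of cohesion and friction angle is a simplification of the report's actual procedure:

```python
import math

def culmann_critical_height(c, gamma, beta, phi):
    """Critical vertical bank height (m) from the Culmann planar-failure
    analysis: Hc = (4 c / gamma) * sin(beta) * cos(phi) / (1 - cos(beta - phi)).
    c: cohesion (kPa), gamma: unit weight (kN/m^3),
    beta: bank angle, phi: friction angle (both in degrees)."""
    b, p = math.radians(beta), math.radians(phi)
    return (4 * c / gamma) * math.sin(b) * math.cos(p) / (1 - math.cos(b - p))

def classify(height, c, gamma, beta, phi, c_sat, phi_sat):
    """Three-way rating used in the report: stable (stands even saturated),
    at risk (stands only when unsaturated), unstable (should already have failed)."""
    if height < culmann_critical_height(c_sat, gamma, beta, phi_sat):
        return "stable"
    if height < culmann_critical_height(c, gamma, beta, phi):
        return "at risk"
    return "unstable"

# Hypothetical silty bank: 4 m high at a 70 degree angle
print(classify(4.0, c=15.0, gamma=18.0, beta=70.0, phi=20.0,
               c_sat=7.0, phi_sat=10.0))
```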
NASA Astrophysics Data System (ADS)
Beketskaya, Olga
2010-05-01
In Russia, quality standards for contaminant concentrations in the environment comprise ecological and sanitary rate-setting. Sanitary risk assessment is based on the potential risk that contaminants pose to human beings, while the main purpose of ecological risk assessment is to protect ecosystems. To determine negative influences on living organisms, sanitary risk assessment in Russia uses maximum permissible concentrations (MPCs); these values show how substances affect different parts of the environment, biological activity, and soil processes. Ecological risk assessment is based on comparing compound concentrations with background concentrations for particular territories. Taking into account the wide range of microelement concentrations in soils, we suggest using a statistical method to determine concentration levels of chemical elements in the soils of Russia. The method is based on determining mean levels of element content under natural conditions. The upper limit of mean chemical element concentration in soils is the value that exceeds the mean regional background level by three standard deviations; exceedance of this upper limit of natural concentration can be interpreted as anthropogenic impact. First, we studied changes in the mean content of microelements in soils of geographic regions in the European part of Russia on the basis of cartographical analysis. The cartographical analysis showed that the soils of mountainous and surrounding regions are enriched with microelements. On the plain territory of the European part of Russia, for most microelements a general increase in soil concentration from north to south was observed, in the same direction in which soil clay content rises for the majority of soils. For the other territories, a clear connection with the distribution of sandy sediments was noticed. A database was created from our own investigations and data from the scientific literature. 
This database contains the following soil properties: texture, organic matter content, microelement concentrations, and pH. On its basis, a data set for the Forest-steppe and Steppe regions was created and divided by texture. Statistics were computed for all data, and the maximum level of natural microelement content was calculated for soils of different texture (x̄ + 3σ). As a result of these statistical calculations, we obtained the mean and the upper limit of background microelement concentrations in sandy and clayey soils (conditional boundary: sandy loam) of the two regions. We showed that, both for the whole European part of Russia and for the Forest-steppe and Steppe regions separately, the mean content and the maximum natural microelement concentrations (x̄ + 3σ) are higher in clayey soils than in sandy soils. Data characterizing soils of similar texture in different regions differ less than data collected for sandy and clayey soils of the same region. These calculations indicate that the mean and upper limits of background microelement concentrations in soils, derived with the statistical method, can be used for ecological risk assessment. The proposed method allows calculation of the upper limit of background concentrations for sandy and clayey soils of large geographic regions; exceeding this limit is evidence of anthropogenic contamination of the soil.
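The statistical threshold at the heart of the method, x̄ + 3σ per texture class, is straightforward to compute; the concentrations below are invented for illustration:

```python
import numpy as np

# Hypothetical background Zn concentrations (mg/kg) in soils grouped by texture
samples = {
    "sandy": np.array([28.0, 31.5, 25.2, 30.1, 27.8, 33.0, 29.4, 26.7]),
    "clay":  np.array([62.0, 70.5, 58.3, 66.1, 74.2, 61.8, 69.0, 64.4]),
}

# Upper limit of natural background content per texture class: x_bar + 3*sigma.
# A measurement above the limit for its class suggests anthropogenic
# contamination rather than natural variation.
limits = {}
for texture, x in samples.items():
    mean, sd = x.mean(), x.std(ddof=1)
    limits[texture] = mean + 3 * sd
    print(f"{texture}: mean={mean:.1f}, background limit={limits[texture]:.1f} mg/kg")
```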
Rank-based methods for modeling dependence between loss triangles.
Côté, Marie-Pier; Genest, Christian; Abdallah, Anas
2016-01-01
In order to determine the risk capital for their aggregate portfolio, property and casualty insurance companies must fit a multivariate model to the loss triangle data relating to each of their lines of business. As an inadequate choice of dependence structure may have an undesirable effect on reserve estimation, a two-stage inference strategy is proposed in this paper to assist with model selection and validation. Generalized linear models are first fitted to the margins. Standardized residuals from these models are then linked through a copula selected and validated using rank-based methods. The approach is illustrated with data from six lines of business of a large Canadian insurance company for which two hierarchical dependence models are considered, i.e., a fully nested Archimedean copula structure and a copula-based risk aggregation model.
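The second stage, linking standardized residuals through a rank-based copula, can be sketched as follows. Simulated residuals stand in for fitted GLM output, and a simple inversion of Kendall's tau estimates the parameter of one-parameter Archimedean families (one common rank-based approach, not necessarily the paper's exact selection procedure):

```python
import numpy as np
from scipy.stats import kendalltau, rankdata

rng = np.random.default_rng(7)

# Stand-ins for standardized residuals from marginal GLMs fitted to two
# lines of business (the first stage); here, correlated Gaussian noise.
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
r1, r2 = z[:, 0], z[:, 1]

# Rank-based pseudo-observations, invariant to the marginal distributions
n = len(r1)
u = rankdata(r1) / (n + 1)
v = rankdata(r2) / (n + 1)

# Moment-style estimators: invert Kendall's tau for one-parameter families
tau, _ = kendalltau(u, v)
theta_clayton = 2 * tau / (1 - tau)   # Clayton: tau = theta / (theta + 2)
theta_gumbel = 1 / (1 - tau)          # Gumbel:  tau = 1 - 1 / theta
print(f"tau={tau:.2f}, Clayton theta={theta_clayton:.2f}, Gumbel theta={theta_gumbel:.2f}")
```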
NASA Technical Reports Server (NTRS)
Friedlander, Alan
1991-01-01
A number of disposal options for space nuclear reactors, and the associated risks, mostly long-term and based on probabilities of Earth reentry, are discussed. The results are based on a five-year study, conducted between 1978 and 1983, on the space disposal of high-level nuclear waste. The study provided an assessment of disposal options, the stability of disposal or storage orbits, and the long-term risks of Earth reentry of the nuclear waste.
Proposal of a method for evaluating tsunami risk using response-surface methodology
NASA Astrophysics Data System (ADS)
Fukutani, Y.
2017-12-01
Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive because they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the object variable. Tsunami risk could then be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probabilistic distributions for earthquake generation, inundation height, and vulnerability. Based on these distributions, we conducted Monte Carlo simulations of 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, given that an earthquake occurs. 
The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response-surface and Monte Carlo simulation without conducting multiple tsunami numerical simulations.
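The risk chain described above (Poisson earthquake occurrence, response-surface inundation depth, lognormal fragility) can be sketched in a few lines. The response-surface coefficients below are the ones reported in the abstract; the fault-parameter ranges and the fragility parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np
from math import erf, log, sqrt

# Response-surface reported in the abstract: inundation depth
# y = a*x1 + b*x2 + c, with x1 = fault depth and x2 = slip.
A, B, C = 0.2615, 3.1763, -1.1802

def damage_probability(y, median=2.0, beta=0.5):
    # Lognormal fragility curve; `median` (m) and `beta` are illustrative
    # placeholders, not values from the paper.
    y = np.maximum(np.asarray(y, dtype=float), 1e-12)
    z = (np.log(y) - log(median)) / (beta * sqrt(2.0))
    return 0.5 * (1.0 + np.vectorize(erf, otypes=[float])(z))

def simulate(years=1_000_000, rate=0.005, seed=0):
    rng = np.random.default_rng(seed)
    n_events = int(rng.poisson(rate * years))   # Poisson earthquake occurrence
    # Hypothetical priors on fault depth (km) and slip (m):
    x1 = rng.uniform(0.0, 10.0, n_events)
    x2 = rng.uniform(0.5, 3.0, n_events)
    y = A * x1 + B * x2 + C                     # response-surface inundation depth
    return float(damage_probability(y).mean())  # expected damage prob. per event

print(simulate(years=100_000))
```

Because the response surface replaces the tsunami simulations, each Monte Carlo draw is a single linear evaluation, which is what makes the approach cheap.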
Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, S.P.; Stover, R.L.; Hashimoto, P.S.
1989-01-01
Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and earthquake experience database evaluation methods, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed. 5 figs.
Risk Evaluation of Railway Coal Transportation Network Based on Multi Level Grey Evaluation Model
NASA Astrophysics Data System (ADS)
Niu, Wei; Wang, Xifu
2018-01-01
Rail is currently the most important mode of coal transportation, and China's railway coal transportation network has become increasingly complete, but issues remain, including insufficient capacity and some lines operating close to saturation. In this paper, risk assessment theory, the analytic hierarchy process, and a multi-level grey evaluation model are applied to the risk evaluation of China's coal railway transportation network. An example analysis of the Shanxi railway coal transportation network shows how the results can guide improvements to the network's internal structure and market competitiveness.
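The analytic hierarchy process step used above derives criterion weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch, with a hypothetical three-criterion matrix (the criteria and judgments are invented for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical pairwise comparisons for three risk criteria
# (e.g. capacity shortfall, line condition, demand volatility).
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

def ahp_weights(M, iters=200):
    # Power iteration converges to the principal eigenvector of a
    # positive matrix; normalising each step keeps the weights summing to 1.
    w = np.ones(M.shape[0]) / M.shape[0]
    for _ in range(iters):
        w = M @ w
        w /= w.sum()
    return w

w = ahp_weights(M)
lam = (M @ w / w).mean()                    # principal eigenvalue estimate
ci = (lam - M.shape[0]) / (M.shape[0] - 1)  # consistency index (CI)
print(w, ci)
```

In AHP practice the CI is checked against a random-consistency threshold before the weights are trusted; the grey evaluation step then aggregates expert scores under these weights.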
Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)
2002-01-01
When designing products, it is crucial to assure failure-free and low-risk operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented as a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, expressed in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes posing the highest potential risks to the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method are explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
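A minimal sketch of the PCA decomposition described above, using a small hypothetical component-by-failure-mode incidence matrix in place of the accident-report data:

```python
import numpy as np

# Hypothetical incidence matrix: rows are components, columns are failure
# modes, entries count reported occurrences (invented for illustration).
X = np.array([[4., 0., 1., 0.],
              [3., 1., 0., 0.],
              [0., 5., 2., 1.],
              [0., 4., 3., 0.],
              [1., 0., 0., 6.]])

Xc = X - X.mean(axis=0)                  # center columns before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()          # variance fraction per component
scores = Xc @ Vt[:2].T                   # components projected onto top-2 PCs
print(explained[:2], scores.shape)
```

The low-dimensional scores place similar components near each other in the transformed coordinate system, which is the basis for the pattern analysis the paper describes.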
A risk adjustment approach to estimating the burden of skin disease in the United States.
Lim, Henry W; Collins, Scott A B; Resneck, Jack S; Bolognia, Jean; Hodge, Julie A; Rohrer, Thomas A; Van Beek, Marta J; Margolis, David J; Sober, Arthur J; Weinstock, Martin A; Nerenz, David R; Begolka, Wendy Smith; Moyano, Jose V
2018-01-01
Direct insurance claims tabulation and risk adjustment statistical methods can be used to estimate health care costs associated with various diseases. In this third manuscript derived from the new national Burden of Skin Disease Report from the American Academy of Dermatology, a risk adjustment method that was based on modeling the average annual costs of individuals with or without specific diseases, and specifically tailored for 24 skin disease categories, was used to estimate the economic burden of skin disease. The results were compared with the claims tabulation method used in the first 2 parts of this project. The risk adjustment method estimated the direct health care costs of skin diseases to be $46 billion in 2013, approximately $15 billion less than estimates using claims tabulation. For individual skin diseases, the risk adjustment cost estimates ranged from 11% to 297% of those obtained using claims tabulation for the 10 most costly skin disease categories. Although either method may be used for purposes of estimating the costs of skin disease, the choice of method will affect the end result. These findings serve as an important reference for future discussions about the method chosen in health care payment models to estimate both the cost of skin disease and the potential cost impact of care changes. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
Viability qPCR, a new tool for Legionella risk management.
Lizana, X; López, A; Benito, S; Agustí, G; Ríos, M; Piqué, N; Marqués, A M; Codony, F
2017-11-01
Viability quantitative polymerase chain reaction (v-qPCR) is a recent analytical approach for detecting only live microorganisms by DNA amplification-based methods. This approach is based on the use of a reagent that irreversibly fixes the DNA of dead cells. In this study, we evaluate the utility of v-qPCR versus the culture method for Legionellosis risk management. The present study was performed using 116 real samples. Water samples were simultaneously analysed by culture, v-qPCR and qPCR methods. Results were compared by means of a non-parametric test. In 11.6% of samples, both methods (culture and v-qPCR) gave positive results; in 50.0% of samples, both methods gave negative results. As expected, equivalence between methods was not observed in all cases: in 32.1% of samples, positive results were obtained by v-qPCR while culture was negative in all of them. Only in 6.3% of samples, with very low Legionella levels, was culture positive and v-qPCR negative. In 3.5% of samples, overgrowth of other bacteria prevented culture. When comparing both methods, significant differences between culture and v-qPCR were found in the samples belonging to the cooling towers-evaporative condensers group. The v-qPCR method detected greater presence and obtained higher concentrations of Legionella spp. (p<0.001). No significant differences between methods were found in the rest of the groups. The v-qPCR method can be used as a quick tool to evaluate Legionellosis risk, especially in cooling towers-evaporative condensers, where this technique can detect higher levels than culture. The combined interpretation of PCR results along with the ratio of live cells is proposed as a tool for understanding the sample context and estimating the Legionellosis risk potential according to 4 levels of hierarchy. Copyright © 2017 Elsevier GmbH. All rights reserved.
Rücker, Viktoria; Keil, Ulrich; Fitzgerald, Anthony P; Malzahn, Uwe; Prugger, Christof; Ertl, Georg; Heuschmann, Peter U; Neuhauser, Hannelore
2016-01-01
Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology, including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008–11 (DEGS1), and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking, and systolic blood pressure risk factor levels, sex, and 5-year age groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40–65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was lower by 29%, and the estimated proportion of high-risk people (10-year risk ≥5%) by 50%, compared to the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk. PMID:27612145
A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.
Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen
2014-01-01
Risk classification and survival probability prediction are two major goals in survival data analysis, since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite sample performance of the proposed method under various settings. Applications to glioma tumor data and breast cancer gene expression survival data illustrate the new methodology in real data analysis.
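The core mechanism, support vector classification with per-subject weights, can be sketched with scikit-learn's sample_weight argument. The data and weights below are synthetic; the paper derives its weights from the censoring distribution, which this sketch does not reproduce:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
# Synthetic binary survival status with a nonlinear decision boundary.
y = (X[:, 0]**2 + X[:, 1] > 0.5).astype(int)
# Illustrative per-subject weights standing in for, e.g., inverse-probability-
# of-censoring weights; here they are just random placeholders near 1.
w = rng.uniform(0.5, 1.5, size=len(y))

# RBF kernel lets the classifier capture the nonlinear covariate effect.
clf = SVC(kernel="rbf", C=1.0).fit(X, y, sample_weight=w)
print(clf.score(X, y))
```

Weighting each subject's loss term is what lets a standard SVM accommodate censored observations without a parametric survival model.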
Fuzzy risk analysis of a modern γ-ray industrial irradiator.
Castiglia, F; Giardina, M
2011-06-01
Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first-generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to take into account more directly the uncertainties in the error-promoting factors on which the methodology is based. Moreover, for some identified accident scenarios, the fuzzy radiological exposure risk, expressed in terms of potential annual deaths, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by the International Commission on Radiological Protection.
Nonparametric estimation of benchmark doses in environmental risk assessment
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
Summary An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
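A minimal sketch of the isotonic-regression approach to benchmark dose estimation with quantal data, assuming scikit-learn is available. The dose-response counts are invented, the benchmark response is set to an extra risk of 10%, and the paper's bootstrap confidence-limit machinery is not reproduced:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Invented quantal dose-response data: dose, responders, number tested.
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
resp = np.array([2, 3, 6, 10, 18, 27])
n = np.full(6, 30)

# Nonparametric monotone estimate of the dose-response curve.
iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
iso.fit(dose, resp / n, sample_weight=n)
p0 = iso.predict([0.0])[0]          # background response rate

# BMD for extra risk BMR = 0.10: the smallest dose d with
# (p(d) - p0) / (1 - p0) >= BMR, located on a fine dose grid.
bmr = 0.10
grid = np.linspace(dose.min(), dose.max(), 2001)
extra = (iso.predict(grid) - p0) / (1.0 - p0)
bmd = grid[np.argmax(extra >= bmr)]
print(round(float(bmd), 3))
```

Because no parametric form is imposed, the estimated curve cannot be distorted by model misspecification, which is exactly the concern the abstract raises about parametric BMD methods.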
A Population-based survey of risk for cancer in individuals diagnosed with myotonic dystrophy
Abbott, Diana; Johnson, Nicholas E; Cannon-Albright, Lisa A.
2018-01-01
Introduction The risk of cancer in patients diagnosed with myotonic dystrophy (DM) is reported for the homogeneous Utah population. Methods Clinical data accessed from the largest Utah healthcare providers have been record-linked to the Utah Population Database (UPDB), a population-based resource also linked to the Utah Cancer Registry. Relative risks were estimated for 36 cancers of different types in 281 DM patients. Results Testicular cancer (RR=10.74; 95% CI: 1.91, 38.79), endometrial cancer (6.98; 1.24, 25.22), and non-Hodgkin lymphoma (4.25; 1.16, 12.43) were all observed at significant excess in DM patients. Discussion This study confirms an overall increased risk of cancer in DM. Individuals diagnosed with DM might benefit from risk counseling. PMID:27064430
Total Homocysteine Is Associated With White Matter Hyperintensity Volume
Wright, Clinton B.; Paik, Myunghee C.; Brown, Truman R.; Stabler, Sally P.; Allen, Robert H.; Sacco, Ralph L.; DeCarli, Charles
2005-01-01
Background Total homocysteine (tHcy) has been implicated as a risk factor for stroke and dementia, but the mechanism is unclear. White matter hyperintensities may be a risk factor for both, but studies of the relationship between tHcy and quantitative measures of white matter hyperintensity volume (WMHV) are lacking, especially in minority populations. Methods A community-based sample of 259 subjects with baseline tHcy levels underwent pixel-based quantitative measurement of WMHV. We examined the relationship between tHcy and WMHV adjusting for age, sociodemographics, vascular risk factors, and B12 deficiency. Results Higher levels of tHcy were associated with WMHV adjusting for sociodemographics and vascular risk factors. Conclusions These cross-sectional data provide evidence that tHcy is a risk factor for white matter damage. PMID:15879345
Risk assessments for mixtures: technical methods commonly used in the United States
A brief (20 minute) talk on the technical approaches used by EPA and other US agencies to assess risks posed by combined exposures to one or more chemicals. The talk systemically reviews the methodologies (whole-mixtures and component-based approaches) that are or have been used ...
School-Based Obesity Interventions: A Literature Review
ERIC Educational Resources Information Center
Shaya, Fadia T.; Flores, David; Gbarayor, Confidence M.; Wang, Jingshu
2008-01-01
Background: Childhood obesity is an impending epidemic. This article is an overview of different interventions conducted in school settings so as to guide efforts for an effective management of obesity in children, thus minimizing the risk of adult obesity and related cardiovascular risk. Methods: PubMed and OVID Medline databases were searched…
[Occupational risk as a criterion determining economic responsibility of employers].
Subbotin, V V; Tkachev, V V
2003-01-01
The authors suggested a new method to calculate discounts and increments, value of assurance collection, that is based on differentiation of insurers, but not of economic branches. Occupational risk class should be set according to the previous results with consideration of work safety parameters described in the article.
Electronic health record-based cardiac risk assessment and identification of unmet preventive needs.
Persell, Stephen D; Dunne, Alexis P; Lloyd-Jones, Donald M; Baker, David W
2009-04-01
Cardiac risk assessment may not be routinely performed. Electronic health records (EHRs) offer the potential to automate risk estimation. We compared EHR-based assessment with manual chart review to determine the accuracy of automated cardiac risk estimation and determination of candidates for antiplatelet or lipid-lowering interventions. We performed an observational retrospective study of 23,111 adults aged 20 to 79 years, seen in a large urban primary care group practice. Automated assessments classified patients into 4 cardiac risk groups or as unclassifiable and determined candidates for antiplatelet or lipid-lowering interventions based on current guidelines. A blinded physician manually reviewed 100 patients from each risk group and the unclassifiable group. We determined the agreement between full review and automated assessments for cardiac risk estimation and identification of which patients were candidates for interventions. By automated methods, 9.2% of the population were candidates for lipid-lowering interventions, and 8.0% were candidates for antiplatelet medication. Agreement between automated risk classification and manual review was high (kappa = 0.91; 95% confidence interval [CI], 0.88-0.93). Automated methods accurately identified candidates for antiplatelet therapy [sensitivity, 0.81 (95% CI, 0.73-0.89); specificity, 0.98 (95% CI, 0.96-0.99); positive predictive value, 0.86 (95% CI, 0.78-0.94); and negative predictive value, 0.98 (95% CI, 0.97-0.99)] and lipid lowering [sensitivity, 0.92 (95% CI, 0.87-0.96); specificity, 0.98 (95% CI, 0.97-0.99); positive predictive value, 0.94 (95% CI, 0.89-0.99); and negative predictive value, 0.99 (95% CI, 0.98-≥0.99)]. EHR data can be used to automatically perform cardiovascular risk stratification and identify patients in need of risk-lowering interventions. This could improve detection of high-risk patients of whom physicians would otherwise be unaware.
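The agreement statistics reported above (Cohen's kappa, sensitivity, specificity) can be computed directly from a 2x2 table. The counts below are hypothetical, chosen only to illustrate the calculation, not reconstructed from the study:

```python
import numpy as np

# Hypothetical 2x2 agreement table (counts): automated EHR classification
# in rows, manual chart review in columns, for "candidate for intervention".
#                 review+  review-
table = np.array([[86,     14],    # automated+
                  [12,    388]])   # automated-

n = table.sum()
po = np.trace(table) / n                                   # observed agreement
pe = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2  # chance agreement
kappa = (po - pe) / (1 - pe)                               # Cohen's kappa

tp, fp = table[0]          # agree-positive, false alarm vs. review standard
fn, tn = table[1]          # missed candidate, agree-negative
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(round(kappa, 3), round(sensitivity, 3), round(specificity, 3))
```

Kappa corrects the raw agreement rate for the agreement expected by chance alone, which is why it is preferred over simple percent agreement for validation studies like this one.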
Davis, Alexander L; Wong-Parodi, Gabrielle; Fischhoff, Baruch; Sadovsky, Yoel; Simhan, Hyagriv N
2017-01-01
Background Despite significant advances in medical interventions and health care delivery, preterm births in the United States are on the rise. Existing research has identified important, seemingly simple precautions that could significantly reduce preterm birth risk. However, it has proven difficult to communicate even these simple recommendations to women in need of them. Our objective was to draw on methods from behavioral decision research to develop a personalized smartphone app-based medical communication tool to assess and communicate pregnancy risks related to preterm birth. Objective A longitudinal, prospective pilot study was designed to develop an engaging, usable smartphone app that communicates personalized pregnancy risk and gathers risk data, with the goal of decreasing preterm birth rates in a typically hard-to-engage patient population. Methods We used semistructured interviews and user testing to develop a smartphone app based on an approach founded in behavioral decision research. For usability evaluation, 16 participants were recruited from the outpatient clinic at a major academic hospital specializing in high-risk pregnancies and provided a smartphone with the preloaded app and a digital weight scale. Through the app, participants were queried daily to assess behavioral risks, mood, and symptomology associated with preterm birth risk. Participants also completed monthly phone interviews to report technical problems and their views on the app’s usefulness. Results App use was higher among participants at higher risk, as reflected in reporting poorer daily moods (Odds ratio, OR 1.20, 95% CI 0.99-1.47, P=.08), being more likely to smoke (OR 4.00, 95% CI 0.93-16.9, P=.06), being earlier in their pregnancy (OR 1.07, 95% CI 1.02-1.12, P=.005), and having a lower body mass index (OR 1.07, 95% CI 1.00-1.15, P=.05). Participant-reported intention to breastfeed increased from baseline to the end of the trial, t15=−2.76, P=.01. 
Participants’ attendance at prenatal appointments was 84% compared with the clinic norm of 50%, indicating a conservatively estimated cost savings of ~US $450/patient over 3 months. Conclusions Our app is an engaging method for assessing and communicating risk during pregnancy in a typically hard-to-reach population, providing accessible and personalized distant obstetrical care, designed to target preterm birth risk, specifically. PMID:28396302
Underhill, Meghan L.; Lally, Robin M.; Kiviniemi, Marc T.; Murekeyisoni, Christine; Dickerson, Suzanne S.
2013-01-01
Background Based on known or suggested genetic risk factors, a growing number of women now live with knowledge of a potential cancer diagnosis that may never occur. Given this, it is important to understand the meaning of living with high risk for hereditary breast cancer. Objective The objective of the study was to explore how women at high risk for hereditary breast cancer (1) form self-identity, (2) apply self-care strategies toward risk, and (3) describe the meaning of care through a high-risk breast program. Methods Interpretive hermeneutic phenomenology guided the qualitative research method. Women at high risk for hereditary breast cancer were recruited from a high-risk breast program. Open-ended interview questions focused on experiences living as women managing high risk for breast cancer. Consistent with hermeneutic methodology, the principal investigator led a team to analyze the interview transcripts. Results Twenty women participated in in-depth interviews. Analysis revealed that women describe their own identity based on their family story and grieve over actual and potential familial loss. This experience influences self-care strategies, including seeking care from hereditary breast cancer risk experts for early detection and prevention, as well as maintaining a connection for early treatment “when” diagnosis occurs. Conclusions Healthy women living with high risk for hereditary breast cancer are living within the context of their family cancer story, which influences how they define themselves and engage in self-care. Implications for Practice Findings present important practical, research, and policy information regarding health promotion, psychosocial assessment, and support for women living with this risk. PMID:22544165
Risk management for sulfur dioxide abatement under multiple uncertainties
NASA Astrophysics Data System (ADS)
Dai, C.; Sun, W.; Tan, Q.; Liu, Y.; Lu, W. T.; Guo, H. C.
2016-03-01
In this study, interval-parameter programming, two-stage stochastic programming (TSP), and conditional value-at-risk (CVaR) were incorporated into a general optimization framework, leading to an interval-parameter CVaR-based two-stage programming (ICTP) method. The ICTP method had several advantages: (i) its objective function simultaneously took expected cost and risk cost into consideration, and also used discrete random variables and discrete intervals to reflect uncertain properties; (ii) it quantitatively evaluated the right tail of distributions of random variables which could better calculate the risk of violated environmental standards; (iii) it was useful for helping decision makers to analyze the trade-offs between cost and risk; and (iv) it was effective to penalize the second-stage costs, as well as to capture the notion of risk in stochastic programming. The developed model was applied to sulfur dioxide abatement in an air quality management system. The results indicated that the ICTP method could be used for generating a series of air quality management schemes under different risk-aversion levels, for identifying desired air quality management strategies for decision makers, and for considering a proper balance between system economy and environmental quality.
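The CVaR component of the ICTP objective can be illustrated on its own: for a cost distribution, CVaR at level alpha is the expected cost in the worst (1 - alpha) tail, which is how the method quantifies the right tail of the random variables. A sketch with a synthetic cost sample (the paper embeds CVaR in a two-stage stochastic program rather than computing it empirically):

```python
import numpy as np

def cvar(costs, alpha=0.95):
    # Conditional value-at-risk: expected cost in the worst (1 - alpha) tail.
    costs = np.asarray(costs, dtype=float)
    var = np.quantile(costs, alpha)     # value-at-risk threshold
    return costs[costs >= var].mean()   # mean of the tail beyond VaR

rng = np.random.default_rng(42)
# Synthetic distribution of abatement-plus-penalty costs (million $); the
# paper obtains its cost scenarios from the two-stage program instead.
costs = rng.lognormal(mean=3.0, sigma=0.4, size=10_000)
print(round(costs.mean(), 1), round(cvar(costs, 0.95), 1))
```

Trading off the mean cost against CVaR at different alpha values is what generates the family of risk-aversion schemes the abstract describes.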
Construction of an Exome-Wide Risk Score for Schizophrenia Based on a Weighted Burden Test.
Curtis, David
2018-01-01
Polygenic risk scores obtained as a weighted sum of associated variants can be used to explore association in additional data sets and to assign risk scores to individuals. The methods used to derive polygenic risk scores from common SNPs are not suitable for variants detected in whole exome sequencing studies. Rare variants, which may have major effects, are seen too infrequently to judge whether they are associated and may not be shared between training and test subjects. A method is proposed whereby variants are weighted according to their frequency, their annotations and the genes they affect. A weighted sum across all variants provides an individual risk score. Scores constructed in this way are used in a weighted burden test and are shown to be significantly different between schizophrenia cases and controls using a five-way cross-validation procedure. This approach represents a first attempt to summarise exome sequence variation into a summary risk score, which could be combined with risk scores from common variants and from environmental factors. It is hoped that the method could be developed further. © 2017 John Wiley & Sons Ltd/University College London.
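The proposed score is a weighted sum over a subject's variants, with weights reflecting frequency, annotation, and the affected gene. A minimal sketch with invented variants and an invented weighting function (the paper defines its own weighting scheme):

```python
import math

# Hypothetical variant table: id -> (allele frequency, annotation weight,
# gene weight). All values are invented for illustration.
variants = {
    "v1": (0.0005, 10.0, 2.0),   # rare, damaging, in a prioritized gene
    "v2": (0.0500, 2.0, 1.0),
    "v3": (0.2000, 1.0, 1.0),    # common, benign
}

def variant_weight(freq, anno_w, gene_w):
    # Rarer variants get larger weights (illustrative -log10 frequency
    # up-weighting), scaled by annotation and gene weights.
    return -math.log10(freq) * anno_w * gene_w

def burden_score(genotype):
    # genotype: variant id -> allele count (0, 1, or 2) for one subject.
    return sum(count * variant_weight(*variants[v])
               for v, count in genotype.items() if count)

case = burden_score({"v1": 1, "v3": 2})
control = burden_score({"v3": 2})
print(case, control)
```

Because the weight is computed per variant rather than estimated from case-control counts, the score can be assigned to test subjects whose rare variants were never seen in training, which is the problem the abstract highlights.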
Codony, Francesc; Pérez, Leonardo Martín; Adrados, Bárbara; Agustí, Gemma; Fittipaldi, Mariana; Morató, Jordi
2012-01-01
Culture-based methods for fecal indicator microorganisms are the standard protocol to assess potential health risk from drinking water systems. However, these traditional fecal indicators are inappropriate surrogates for disinfection-resistant fecal pathogens and the indigenous pathogens that grow in drinking water systems. There is now a range of molecular-based methods, such as quantitative PCR, which allow detection of a variety of pathogens and alternative indicators. Hence, in addition to targeting total Escherichia coli (i.e., dead and alive) for the detection of fecal pollution, various amoebae may be suitable to indicate the potential presence of pathogenic amoeba-resisting microorganisms, such as Legionellae. Therefore, monitoring amoeba levels by quantitative PCR could be a useful tool for directly and indirectly evaluating health risk and could also be a complementary approach to current microbial quality control strategies for drinking water systems.
Decision support systems and methods for complex networks
Huang, Zhenyu [Richland, WA; Wong, Pak Chung [Richland, WA; Ma, Jian [Richland, WA; Mackey, Patrick S [Richland, WA; Chen, Yousu [Richland, WA; Schneider, Kevin P [Seattle, WA
2012-02-28
Methods and systems for automated decision support in analyzing operation data from a complex network. Embodiments of the present invention utilize these algorithms and techniques not only to characterize the past and present condition of a complex network, but also to predict future conditions to help operators anticipate deteriorating and/or problem situations. In particular, embodiments of the present invention characterize network conditions from operation data using a state estimator. Contingency scenarios can then be generated based on those network conditions. For at least a portion of all of the contingency scenarios, risk indices are determined that describe the potential impact of each of those scenarios. Contingency scenarios with risk indices are presented visually as graphical representations in the context of a visual representation of the complex network. Analysis of the historical risk indices based on the graphical representations can then provide trends that allow for prediction of future network conditions.
NASA Astrophysics Data System (ADS)
Lusiana, N.
2013-12-01
Floods have frequently hit Indonesia and have had increasingly severe impacts. In Java, both the area affected by flooding and the amount of damage caused by floods have increased. At least five factors affect flooding in Indonesia, including rainfall, reduced retention capacity of the watershed, erroneous design of river channel development, silting-up of the river, and erroneous regional layout. The level of disaster risk can be evaluated based on the extent of the threat and the susceptibility of a region. One method for risk assessment is Geographical Information System (GIS)-based mapping. The objectives of this research are: 1) evaluating current flood risk in susceptible areas, 2) applying a supported land-based layout as an effort to mitigate flood risk, and 3) evaluating flood risk for the period up to 2031 in the Tempuran floodplain of Ponorogo Regency. Results show that the area categorized as high risk covers 104.6 ha (1.2%), moderate risk covers 2512.9 ha (28.4%), low risk covers 3140.8 ha (35.5%), and the lowest risk covers 3096.1 ha (34.9%). Using the Regional Layout Design for the years 2011-2031, the high risk area covers 67.9 ha (0.8%), moderate risk covers 3033 ha (34.3%), low risk covers 2770.8 ha (31.3%), and the lowest risk covers 2982.6 ha (34%). Based on supported land suitability, the high-risk area is only 2.9 ha (0.1%), moderate risk covers 426.1 ha (4.8%), low risk covers 4207.4 ha (47.5%), and the lowest risk covers 4218 ha (47.6%). Flood risk can be mitigated by applying a supported land-based layout, as shown by the reduced high-risk area and the fact that >90% of the areas are categorized as low or lowest risk of disaster. Keywords: Carrying Capacity, Land Capacity, Flood Risk
NASA Astrophysics Data System (ADS)
Zhang, Weihong.; Zhao, Yongsheng; Hong, Mei; Guo, Xiaodong
2009-04-01
Groundwater pollution is usually complex and concealed, and its remediation is difficult, costly, time-consuming, and often ineffective. An early warning system for groundwater pollution is needed that detects groundwater quality problems and provides the information necessary to make sound decisions before massive groundwater quality degradation occurs. Groundwater pollution early warning was performed by comprehensively considering the current groundwater quality, the groundwater quality trend, and the groundwater pollution risk. The map of basic groundwater quality was obtained by fuzzy comprehensive evaluation or BP neural network evaluation. Based on multi-annual groundwater monitoring datasets, future water quality was forecast using time-series analysis methods, and the water quality trend was analyzed using Spearman's rank correlation coefficient. The relative risk map of groundwater pollution was estimated through a procedure that identifies, cell by cell, the values of three factors: inherent vulnerability, pollution source load risk, and contamination hazard. The DRASTIC method was used to assess the inherent vulnerability of the aquifer. The pollution source load risk was analyzed based on contamination potential and pollution degree; its assessment index involves the variety of pollution sources, the quantity of contaminants, the releasing potential of pollutants, and distance. The load risks of all sources were combined using GIS overlay technology. The early warning model of groundwater pollution was integrated with ComGIS technology, a regional groundwater pollution early-warning information system was developed, and it was applied to groundwater early warning in Qiqiha'er. The system can be used to evaluate current water quality, to forecast water quality trends, and to analyze the spatiotemporal influence of natural processes and human activities on groundwater quality.
Keywords: groundwater pollution, early warning, aquifer vulnerability, pollution load, pollution risk, ComGIS
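The cell-by-cell overlay of the three risk factors described above can be sketched as a weighted raster sum followed by classification into warning levels. The layers, weights, and class breaks below are all hypothetical, standing in for the DRASTIC, source-load, and hazard layers:

```python
import numpy as np

# Illustrative 4x4 raster layers (index scores 1-10) for the three factors:
# intrinsic vulnerability (DRASTIC), pollution source load risk, and hazard.
vuln   = np.array([[3, 5, 7, 9], [2, 4, 6, 8], [1, 3, 5, 7], [2, 2, 4, 6]], float)
load   = np.array([[1, 2, 8, 9], [1, 2, 7, 9], [1, 1, 3, 5], [1, 1, 2, 4]], float)
hazard = np.array([[2, 3, 6, 8], [2, 3, 5, 7], [1, 2, 4, 6], [1, 1, 3, 5]], float)

# Cell-by-cell weighted overlay; the weights are hypothetical, not the paper's.
w = {"vuln": 0.40, "load": 0.35, "hazard": 0.25}
risk = w["vuln"] * vuln + w["load"] * load + w["hazard"] * hazard

# Classify into warning levels by quantile breaks: 0 = low, 1 = medium, 2 = high.
levels = np.digitize(risk, np.quantile(risk, [0.5, 0.8]))
print(levels)
```

A GIS implementation performs the same arithmetic per raster cell; the quantile breaks here stand in for whatever warning thresholds the system adopts.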
Real-time flood forecasts & risk assessment using a possibility-theory based fuzzy neural network
NASA Astrophysics Data System (ADS)
Khan, U. T.
2016-12-01
Globally, floods are one of the most devastating natural disasters, and improved flood forecasting methods are essential for better flood protection in urban areas. Given the availability of high-resolution real-time datasets for flood variables (e.g. streamflow and precipitation) in many urban areas, data-driven models have been effectively used to predict peak flow rates in rivers; however, the selection of input parameters for these types of models is often subjective. Additionally, the inherent uncertainty associated with data-driven models, along with errors in extreme event observations, means that uncertainty quantification is essential. Addressing these concerns will enable improved flood forecasting methods and provide more accurate flood risk assessments. In this research, a new type of data-driven model, a quasi-real-time updating fuzzy neural network, is developed to predict peak flow rates in urban riverine watersheds. A possibility-to-probability transformation is first used to convert observed data into fuzzy numbers. A possibility-theory-based training regime is then used to construct the fuzzy parameters and the outputs. A new entropy-based optimisation criterion is used to train the network. Two existing methods to select the optimum input parameters are modified to account for fuzzy number inputs and compared. These methods are: the Entropy-Wavelet-based Artificial Neural Network (EWANN) and Combined Neural Pathway Strength Analysis (CNPSA). Finally, an automated algorithm designed to select the optimum structure of the neural network is implemented. The overall impact of each component of training this network is to replace traditional ad hoc network configuration methods with one based on objective criteria. Ten years of data from the Bow River in Calgary, Canada (including two major floods in 2005 and 2013) are used to calibrate and test the network.
The EWANN method selected lagged peak flow as a candidate input, whereas the CNPSA method selected lagged precipitation and lagged mean daily flow as candidate inputs. Model performance metric show that the CNPSA method had higher performance (with an efficiency of 0.76). Model output was used to assess the risk of extreme peak flows for a given day using an inverse possibility-to-probability transformation.
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG to stop operating and can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of standard API 581 placed the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The risk-based approach was evaluated with the aim of reducing risk by optimizing risk assessment activities.
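The semi-quantitative RBI result above combines a probability class (1 to 5) and a consequence class (A to E) into a matrix cell such as "4C". A minimal sketch of such a risk-matrix lookup follows; the additive scoring and the band edges are illustrative assumptions chosen so that the cells reported in the study land in their stated levels, not values taken from API 581 itself.

```python
def risk_level(prob_class: int, cons_class: str) -> str:
    """Map a probability class (1..5) and a consequence class (A..E)
    to a qualitative risk level. Scoring and band edges are
    illustrative assumptions, not API 581 values."""
    cons_index = "ABCDE".index(cons_class.upper()) + 1
    score = prob_class + cons_index
    if score <= 4:
        return "low"
    if score <= 6:
        return "medium"
    if score <= 8:
        return "medium-high"
    return "high"

# Cells reported in the study: superheater/evaporator at 4C, economizer at 3C.
print(risk_level(4, "C"))  # medium-high
print(risk_level(3, "C"))  # medium
```

A real API 581 assessment would use the standard's own probability/consequence categories and matrix rather than an additive score.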
Consideration of VT5 etch-based OPC modeling
NASA Astrophysics Data System (ADS)
Lim, ChinTeong; Temchenko, Vlad; Kaiser, Dieter; Meusel, Ingo; Schmidt, Sebastian; Schneider, Jens; Niehoff, Martin
2008-03-01
Including etch-based empirical data during OPC model calibration is a desired yet controversial decision for OPC modeling, especially for processes with large litho-to-etch biasing. While many OPC software tools now provide this functionality, few have been implemented in manufacturing owing to various risk considerations, such as compromises in resist and optical effects prediction, etch model accuracy, or even runtime concerns. The conventional method of applying rule-based correction alongside a resist model is popular but requires lengthy code generation to provide a leaner OPC input. This work discusses the risk factors and their considerations, together with an introduction to the techniques used within Mentor Calibre VT5 etch-based modeling at the sub-90 nm technology node. Various strategies are discussed with the aim of better handling large etch bias offsets without adding complexity to the final OPC package. Finally, results are presented to assess the advantages and limitations of the final method chosen.
Support vector machines-based fault diagnosis for turbo-pump rotor
NASA Astrophysics Data System (ADS)
Yuan, Sheng-Fa; Chu, Fu-Lei
2006-05-01
Most artificial intelligence methods used in fault diagnosis are based on the empirical risk minimisation principle and generalise poorly when fault samples are few. Support vector machines (SVM) are a general machine-learning tool based on the structural risk minimisation principle that exhibits good generalisation even when fault samples are few. Fault diagnosis based on SVM is discussed. Since the basic SVM is originally designed for two-class classification, while most fault diagnosis problems are multi-class cases, a new multi-class SVM classification scheme named the 'one to others' algorithm is presented to solve multi-class recognition problems. It is a binary tree classifier composed of several two-class classifiers organised by fault priority; it is simple, requires little repeated training, and expedites both training and recognition. The effectiveness of the method is verified by application to fault diagnosis for a turbo-pump rotor.
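The 'one to others' binary-tree scheme can be sketched structurally as follows: classes are ordered by fault priority, and each node's two-class classifier separates one class from everything remaining. The threshold predicates below stand in for trained two-class SVMs, and the fault labels and the 1-D vibration feature are invented for illustration.

```python
def make_tree_classifier(nodes, default_label):
    """'One to others' binary tree: nodes is a priority-ordered list of
    (label, predicate); predicate(x) -> True means 'x belongs to this
    class', otherwise descend to the next node. The last node's
    complement is the default class."""
    def classify(x):
        for label, is_member in nodes:
            if is_member(x):
                return label
        return default_label
    return classify

# Toy 1-D feature (e.g. vibration amplitude); thresholds stand in for SVMs.
classify = make_tree_classifier(
    [("severe-rub", lambda x: x > 0.8),
     ("misalignment", lambda x: x > 0.4)],
    default_label="normal",
)
print(classify(0.9))   # severe-rub
print(classify(0.5))   # misalignment
print(classify(0.1))   # normal
```

Ordering by fault priority means the most critical fault is tested first, so a sample is never misrouted past a high-priority class by a lower-priority classifier.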
Stukel, Thérèse A.; Fisher, Elliott S; Wennberg, David E.; Alter, David A.; Gottlieb, Daniel J.; Vermeulen, Marian J.
2007-01-01
Context Comparisons of outcomes between patients treated and untreated in observational studies may be biased due to differences in patient prognosis between groups, often because of unobserved treatment selection biases. Objective To compare 4 analytic methods for removing the effects of selection bias in observational studies: multivariable model risk adjustment, propensity score risk adjustment, propensity-based matching, and instrumental variable analysis. Design, Setting, and Patients A national cohort of 122 124 patients who were elderly (aged 65–84 years), receiving Medicare, and hospitalized with acute myocardial infarction (AMI) in 1994–1995, and who were eligible for cardiac catheterization. Baseline chart reviews were taken from the Cooperative Cardiovascular Project and linked to Medicare health administrative data to provide a rich set of prognostic variables. Patients were followed up for 7 years through December 31, 2001, to assess the association between long-term survival and cardiac catheterization within 30 days of hospital admission. Main Outcome Measure Risk-adjusted relative mortality rate using each of the analytic methods. Results Patients who received cardiac catheterization (n=73 238) were younger and had lower AMI severity than those who did not. After adjustment for prognostic factors by using standard statistical risk-adjustment methods, cardiac catheterization was associated with a 50% relative decrease in mortality (for multivariable model risk adjustment: adjusted relative risk [RR], 0.51; 95% confidence interval [CI], 0.50–0.52; for propensity score risk adjustment: adjusted RR, 0.54; 95% CI, 0.53–0.55; and for propensity-based matching: adjusted RR, 0.54; 95% CI, 0.52–0.56). Using regional catheterization rate as an instrument, instrumental variable analysis showed a 16% relative decrease in mortality (adjusted RR, 0.84; 95% CI, 0.79–0.90). 
The survival benefits of routine invasive care from randomized clinical trials are between 8% and 21%. Conclusions Estimates of the observational association of cardiac catheterization with long-term AMI mortality are highly sensitive to analytic method. All standard risk-adjustment methods have the same limitations regarding removal of unmeasured treatment selection biases. Compared with standard modeling, instrumental variable analysis may produce less biased estimates of treatment effects, but is more suited to answering policy questions than specific clinical questions. PMID:17227979
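The instrumental-variable idea used above (regional catheterization rate as an instrument) can be sketched with the Wald estimator for a binary instrument: the effect of treatment on outcome is the ratio of the instrument's effect on the outcome to its effect on treatment uptake. The data below are tiny and synthetic, purely to show the arithmetic.

```python
def wald_estimate(y, t, z):
    """Wald/IV estimator with a binary instrument z:
    (E[y|z=1] - E[y|z=0]) / (E[t|z=1] - E[t|z=0])."""
    def mean(v):
        return sum(v) / len(v)
    y1 = mean([yi for yi, zi in zip(y, z) if zi == 1])
    y0 = mean([yi for yi, zi in zip(y, z) if zi == 0])
    t1 = mean([ti for ti, zi in zip(t, z) if zi == 1])
    t0 = mean([ti for ti, zi in zip(t, z) if zi == 0])
    return (y1 - y0) / (t1 - t0)

# Synthetic toy data: z = instrument (e.g. high/low regional rate),
# t = treatment received, y = adverse outcome (e.g. death).
z = [1, 1, 1, 1, 0, 0, 0, 0]
t = [1, 1, 1, 0, 1, 0, 0, 0]
y = [0, 0, 1, 1, 0, 1, 1, 1]
print(wald_estimate(y, t, z))  # -0.5
```

Unlike multivariable or propensity-score adjustment, the estimator never conditions on patient covariates, which is why it can bypass unmeasured treatment selection bias when the instrument is valid.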
Topography- and nightlight-based national flood risk assessment in Canada
NASA Astrophysics Data System (ADS)
Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich
2017-04-01
In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. 
The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the effect of the overglow phenomenon on flood risk mapping.
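The layered workflow above (hazard map x exposure map -> risk map in five classes) can be sketched cell-by-cell on raster layers. The equal-width binning on [0, 1] is an assumption for illustration; the study does not state its classification rule, and the toy 2x2 grids are invented.

```python
import numpy as np

def classify_risk(hazard, exposure, n_classes=5):
    """Combine hazard and exposure layers (each scaled to [0, 1]) into
    risk classes 1..n_classes via equal-width bins (an assumption)."""
    risk = hazard * exposure
    edges = np.linspace(0, 1, n_classes + 1)[1:-1]  # interior bin edges
    return np.digitize(risk, edges) + 1

# Toy 2x2 rasters standing in for national-scale grids.
hazard = np.array([[0.1, 0.9], [0.5, 1.0]])
exposure = np.array([[0.2, 0.8], [0.5, 1.0]])
print(classify_risk(hazard, exposure))
```

In the paper, hazard would come from the EAND/DFND topographic parameters and exposure from land use and nightlight luminosity, each already normalised before combination.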
A Strategy to Safely Live and Work in the Space Radiation Environment
NASA Technical Reports Server (NTRS)
Corbin, Barbara J.; Sulzman, Frank M.; Krenek, Sam
2006-01-01
The goal of the National Aeronautics and Space Administration's Space Radiation Project is to ensure that astronauts can safely live and work in the space radiation environment. The space radiation environment poses both acute and chronic risks to crew health and safety, but unlike some other aspects of space travel, space radiation exposure has clinically relevant implications for the lifetime of the crew. The term safely means that risks are sufficiently understood such that acceptable limits on mission, post-mission and multi-mission consequences (for example, excess lifetime fatal cancer risk) can be defined. The Space Radiation Project strategy has several elements. The first element is to use a peer-reviewed research program to increase our mechanistic knowledge and genetic capabilities to develop tools for individual risk projection, thereby reducing our dependency on epidemiological data and population-based risk assessment. The second element is to use the NASA Space Radiation Laboratory as a ground-based facility to study the health effects and mechanisms of damage from space radiation exposure and to develop and validate biological models of risk, as well as methods for extrapolation to human risk. The third element is a risk modeling effort that integrates the results of these research efforts into models of human risk to reduce uncertainties in predicting the risk of carcinogenesis, central nervous system damage, degenerative tissue disease, and acute radiation effects. To understand the biological basis for risk, we must also understand the physical aspects of the crew environment. Thus the fourth element develops computer codes to predict radiation transport properties, evaluate integrated shielding technologies, and provide optimization recommendations for the design of human space systems. Understanding the risks and determining methods to mitigate them are keys to a successful radiation protection strategy.
Case Study on Project Risk Management Planning Based on Soft System Methodology
NASA Astrophysics Data System (ADS)
Lifang, Xie; Jun, Li
This paper analyses the soft-system characteristics of construction projects and the applicability of Soft System Methodology (SSM) to risk analysis, after a brief review of SSM. Taking a hydropower project as an example, it constructs a general framework for project risk management planning (PRMP) and establishes a risk management planning (RMP) system from the perspective of coordinating stakeholder interests. The paper provides ideas and methods for construction RMP in a win-win situation through the practice of SSM.
Schütte, Katrin; Boeing, Heiner; Hart, Andy; Heeschen, Walther; Reimerdes, Ernst H; Santare, Dace; Skog, Kerstin; Chiodini, Alessandro
2012-11-01
The aim of the European funded project BRAFO (benefit-risk analysis of foods) was to develop a framework that allows quantitative comparison of human health risks and benefits of foods on a common scale of measurement. This publication describes the application of the BRAFO methodology to three case studies: the formation of acrylamide in potato- and cereal-based products, the formation of benzo(a)pyrene through smoking and grilling of meat and fish, and the heat treatment of milk. A reference scenario, an alternative scenario, and a target population formed the basic structure for testing the tiers of the framework. Various intervention methods intended to reduce acrylamide in potato and cereal products were evaluated against the historical production methods; the benefits of the acrylamide-reducing measures were considered to prevail. For benzo(a)pyrene, three illustrative alternative scenarios were evaluated against the most common smoking practice. The alternative scenarios were assessed as delivering benefits while introducing only minimal potential risks. Similar considerations were made for the heat treatment of milk, comparing the microbiological benefits of heat treatment with the physico-chemical changes of milk constituents that have positive and negative health effects. In general, based on the available data, the benefits of heat treatment outweighed any risks. Copyright © 2012 ILSI Europe. Published by Elsevier Ltd. All rights reserved.
Clarke, John R; Ragone, Andrew V; Greenwald, Lloyd
2005-09-01
We conducted a comparison of methods for predicting survival using survival risk ratios (SRRs), including new comparisons based on International Classification of Diseases, Ninth Revision (ICD-9) versus Abbreviated Injury Scale (AIS) six-digit codes. From the Pennsylvania trauma center registry, all direct trauma admissions were collected through June 22, 1999. Patients with no comorbid medical diagnoses and both ICD-9 and AIS injury codes were used for comparisons based on a single set of data. SRRs for ICD-9 and then for AIS diagnostic codes were each calculated in two ways: from the survival rate of all patients with each diagnosis, and from patients for whom the diagnosis was an isolated diagnosis. Probabilities of survival for the cohort were calculated using each set of SRRs by the multiplicative ICISS method and, where appropriate, the minimum SRR method. These prediction sets were then internally validated against actual survival by the Hosmer-Lemeshow goodness-of-fit statistic. The 41,364 patients had 1,224 different ICD-9 injury diagnoses in 32,261 combinations and 1,263 corresponding AIS injury diagnoses in 31,755 combinations, ranging from 1 to 27 injuries per patient. All conventional ICD-9-based combinations of SRRs and methods had better Hosmer-Lemeshow goodness-of-fit than their AIS-based counterparts. The minimum SRR method produced better calibration than the multiplicative methods, presumably because it did not magnify inaccuracies in the SRRs as multiplication can. Predictions of survival based on anatomic injury alone can be performed using ICD-9 codes, with no advantage from extra coding of AIS diagnoses. Predictions based on the single worst SRR were closer to actual outcomes than those based on multiplying SRRs.
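The two prediction rules compared above are simple to state: the multiplicative ICISS takes the product of the SRRs of all of a patient's injury diagnoses, while the minimum SRR rule uses only the single worst one. The SRR values below are invented for illustration; in the study they are estimated from registry survival rates per diagnosis code.

```python
from math import prod

def iciss_multiplicative(srrs):
    """Multiplicative ICISS: predicted survival probability is the
    product of the SRRs of all coded injury diagnoses."""
    return prod(srrs)

def iciss_minimum(srrs):
    """Single-worst-injury rule: predicted survival probability is the
    minimum SRR among the coded diagnoses."""
    return min(srrs)

patient_srrs = [0.99, 0.95, 0.70]   # three coded injuries (illustrative)
print(iciss_multiplicative(patient_srrs))  # about 0.658
print(iciss_minimum(patient_srrs))         # 0.7
```

The example shows why multiplication can over-penalise: each mildly imperfect SRR further shrinks the product, which is the miscalibration mechanism the abstract suggests for the multiplicative method.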
Imaging Breast Density: Established and Emerging Modalities
Chen, Jeon-Hor; Gulsen, Gultekin; Su, Min-Ying
2015-01-01
Mammographic density has been proven to be an independent risk factor for breast cancer. Women with dense breast tissue visible on a mammogram have a much higher cancer risk than women with little density. A great research effort has been devoted to incorporating breast density into risk prediction models to better estimate each individual's cancer risk. In recent years, the passage of breast density notification legislation in many US states requires that every mammography report provide information regarding the patient's breast density. Accurate definition and measurement of breast density are thus important, as they may allow the potential clinical applications of breast density to be implemented. Because two-dimensional mammography-based measurement is subject to tissue overlap and cannot provide volumetric information, there is an urgent need to develop reliable quantitative measurements of breast density. Various new imaging technologies are being developed. Among these new modalities, volumetric mammographic density methods and three-dimensional magnetic resonance imaging are the best studied. In addition, emerging modalities, including various x-ray-based, optical imaging, and ultrasound-based methods, have been investigated. All these modalities may either overcome some fundamental problems related to mammographic density or provide additional density and/or compositional information. This review article summarizes the current established and emerging imaging techniques for the measurement of breast density and the literature evidence for the clinical use of these density methods. PMID:26692524
Multisite Parent-Centered Risk Assessment to Reduce Pediatric Oral Chemotherapy Errors
Walsh, Kathleen E.; Mazor, Kathleen M.; Roblin, Douglas; Biggins, Colleen; Wagner, Joann L.; Houlahan, Kathleen; Li, Justin W.; Keuker, Christopher; Wasilewski-Masker, Karen; Donovan, Jennifer; Kanaan, Abir; Weingart, Saul N.
2013-01-01
Purpose: Observational studies describe high rates of errors in home oral chemotherapy use in children. In hospitals, proactive risk assessment methods help front-line health care workers develop error prevention strategies. Our objective was to engage parents of children with cancer in a multisite study using proactive risk assessment methods to identify how errors occur at home and to propose risk reduction strategies. Methods: We recruited parents from three outpatient pediatric oncology clinics in the northeast and southeast United States to participate in failure mode and effects analyses (FMEA). An FMEA is a systematic, team-based proactive risk assessment approach to understanding the ways a process can fail and developing prevention strategies. Steps included diagramming the process, brainstorming and prioritizing failure modes (places where things go wrong), and proposing risk reduction strategies. We focused on home oral chemotherapy administration after a change in dose because prior studies identified this area as high risk. Results: Parent teams consisted of four parents at two of the sites and 10 at the third. Parents developed a 13-step process map, with two to 19 failure modes per step. The highest priority failure modes included miscommunication when receiving instructions from the clinician (caused by conflicting instructions or parent lapses) and unsafe chemotherapy handling at home. Recommended risk reduction strategies included novel uses of technology to improve parent access to information, clinicians, and other parents while at home. Conclusion: Parents of pediatric oncology patients readily participated in a proactive risk assessment method, identifying processes that pose a risk for medication errors involving home oral chemotherapy. PMID:23633976
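The "prioritize failure modes" step of an FMEA is commonly done by scoring each mode with a risk priority number (severity x occurrence x detection difficulty, each on a 1-10 scale) and ranking. That RPN convention is standard FMEA practice rather than something the abstract states, and the failure modes and scores below are illustrative, not the study's data.

```python
def risk_priority_number(severity, occurrence, detection):
    """Standard FMEA risk priority number: each factor scored 1-10."""
    return severity * occurrence * detection

# Illustrative failure modes echoing those named in the abstract.
failure_modes = [
    ("conflicting dosing instructions", 9, 6, 7),
    ("parent lapse when giving dose", 7, 5, 5),
    ("unsafe chemotherapy handling at home", 8, 4, 6),
]
ranked = sorted(failure_modes,
                key=lambda fm: risk_priority_number(*fm[1:]),
                reverse=True)
for name, s, o, d in ranked:
    print(f"{risk_priority_number(s, o, d):4d}  {name}")
```

Ranking by RPN lets a team spend its prevention effort on the few modes that dominate the risk, which is exactly the prioritisation the parent teams performed.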
The development of a 3D risk analysis method.
I, Yet-Pole; Cheng, Te-Lung
2008-05-01
Much attention has been paid to quantitative risk analysis (QRA) research in recent years owing to the increasingly severe disasters that have happened in the process industries. Because of its computational complexity, few software packages, such as SAFETI, can present risk in a way that meets practical requirements. Moreover, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions and no concentration fluctuations, quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can overcome these limitations, but cannot by itself resolve the complexity of risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. Such a technique need not be limited to risk analysis at ground level, but may also be extended to aerial, submarine, or space risk analyses in the near future.
Dickerson, Justin B; McNeal, Catherine J; Tsai, Ginger; Rivera, Cathleen M; Smith, Matthew Lee; Ohsfeldt, Robert L; Ory, Marcia G
2014-04-18
Health risk assessments are becoming more popular as a tool to conveniently and effectively reach community-dwelling adults who may be at risk for serious chronic conditions such as coronary heart disease (CHD). The use of such instruments to improve adults' risk factor awareness and concordance with clinically measured risk factor values could be an opportunity to advance public health knowledge and build effective interventions. The objective of this study was to determine if an Internet-based health risk assessment can highlight important aspects of agreement between respondents' self-reported and clinically measured CHD risk factors for community-dwelling adults who may be at risk for CHD. Data from an Internet-based cardiovascular health risk assessment (Heart Aware) administered to community-dwelling adults at 127 clinical sites were analyzed. Respondents were recruited through individual hospital marketing campaigns, such as media advertising and print media, found throughout inpatient and outpatient facilities. CHD risk factors from the Framingham Heart Study were examined. Weighted kappa statistics were calculated to measure interrater agreement between respondents' self-reported and clinically measured CHD risk factors. Weighted kappa statistics were then calculated for each sample by strata of overall 10-year CHD risk. Three samples were drawn based on strategies for treating missing data: a listwise deleted sample, a pairwise deleted sample, and a multiple imputation (MI) sample. The MI sample (n=16,879) was most appropriate for addressing missing data. No CHD risk factor had better than marginal interrater agreement (κ>.60). High-density lipoprotein cholesterol (HDL-C) exhibited suboptimal interrater agreement that deteriorated (eg, κ<.30) as overall CHD risk increased. Conversely, low-density lipoprotein cholesterol (LDL-C) interrater agreement improved (eg, up to κ=.25) as overall CHD risk increased. 
Overall CHD risk of the sample was lower than comparative population-based CHD risk (ie, no more than 15% risk of CHD for the sample vs up to a 30% chance of CHD for the population). Interventions are needed to improve knowledge of CHD risk factors. Specific interventions should address perceptions of HDL-C and LDL-C. Internet-based health risk assessments such as Heart Aware may contribute to public health surveillance, but they must address selection bias of Internet-based recruitment methods.
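The weighted kappa statistic used above measures agreement between self-reported and clinically measured risk-factor categories while giving partial credit for near-misses. A minimal sketch follows; the 2x2 contingency tables are invented for illustration (the study used larger category tables per risk factor).

```python
import numpy as np

def weighted_kappa(conf_matrix, weights="linear"):
    """Weighted kappa from a k x k contingency table
    (rows = self-reported category, cols = clinically measured)."""
    m = np.asarray(conf_matrix, dtype=float)
    k = m.shape[0]
    m = m / m.sum()                         # joint proportions
    i, j = np.indices((k, k))
    d = np.abs(i - j)
    w = d / (k - 1) if weights == "linear" else (d / (k - 1)) ** 2
    expected = np.outer(m.sum(axis=1), m.sum(axis=0))  # chance agreement
    return 1 - (w * m).sum() / (w * expected).sum()

print(weighted_kappa([[5, 0], [0, 5]]))  # perfect agreement -> 1.0
print(weighted_kappa([[4, 1], [1, 4]]))  # about 0.6
```

With k = 2 categories the weighted and unweighted statistics coincide; the weighting matters for ordered multi-category factors such as binned cholesterol levels.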
NASA Technical Reports Server (NTRS)
Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.
2016-01-01
Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, to more than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed thresholds to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, where deviations increase up to 50% of estimated risk levels.
Risk factors for operated carpal tunnel syndrome: a multicenter population-based case-control study
Mattioli, Stefano; Baldasseroni, Alberto; Bovenzi, Massimo; Curti, Stefania; Cooke, Robin MT; Campo, Giuseppe; Barbieri, Pietro G; Ghersi, Rinaldo; Broccoli, Marco; Cancellieri, Maria Pia; Colao, Anna Maria; dell'Omo, Marco; Fateh-Moghadam, Pirous; Franceschini, Flavia; Fucksia, Serenella; Galli, Paolo; Gobba, Fabriziomaria; Lucchini, Roberto; Mandes, Anna; Marras, Teresa; Sgarrella, Carla; Borghesi, Stefano; Fierro, Mauro; Zanardi, Francesca; Mancini, Gianpiero; Violante, Francesco S
2009-01-01
Background Carpal tunnel syndrome (CTS) is a socially and economically relevant disease caused by compression or entrapment of the median nerve within the carpal tunnel. This population-based case-control study aims to investigate occupational/non-occupational risk factors for surgically treated CTS. Methods Cases (n = 220) aged 18-65 years were randomly drawn from 13 administrative databases of citizens who were surgically treated with carpal tunnel release during 2001. Controls (n = 356) were randomly sampled from National Health Service registry records and were frequency matched by age-gender-specific CTS hospitalization rates. Results At multivariate analysis, risk factors were blue-collar/housewife status, BMI ≥ 30 kg/m2, sibling history of CTS and coexistence of trigger finger. Being relatively tall (cut-offs based on tertiles: women ≥165 cm; men ≥175 cm) was associated with lower risk. Blue-collar work was a moderate/strong risk factor in both sexes. Raised risks were apparent for combinations of biomechanical risk factors that included frequent repetitivity and sustained force. Conclusion This study strongly underlines the relevance of biomechanical exposures in both non-industrial and industrial work as risk factors for surgically treated CTS. PMID:19758429
Zhang, Yan; Zhong, Ming
2013-01-01
Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, influenced by various geological and hydrological factors. To deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process and fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Given the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degree of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with linguistic uncertainty and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
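The two-step scheme above can be sketched numerically: AHP derives factor weights from a pairwise-comparison matrix (here via its principal eigenvector), and fuzzy comprehensive evaluation aggregates per-factor membership degrees as B = W . R. The 3-factor comparison matrix, membership degrees, and grade labels below are invented for illustration only.

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP weights: principal right eigenvector of the
    pairwise-comparison matrix, normalised to sum to 1."""
    a = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(a)
    v = np.abs(vecs[:, np.argmax(vals.real)].real)
    return v / v.sum()

def fuzzy_evaluate(weights, membership):
    """Fuzzy comprehensive evaluation B = W . R, where rows of R are
    factors and columns are risk grades."""
    return weights @ np.asarray(membership, dtype=float)

pairwise = [[1,   3,   5],     # illustrative Saaty-scale comparisons
            [1/3, 1,   3],
            [1/5, 1/3, 1]]
w = ahp_weights(pairwise)

membership = [[0.1, 0.3, 0.6],  # factor 1 leans toward the 'high' grade
              [0.2, 0.5, 0.3],
              [0.7, 0.2, 0.1]]
b = fuzzy_evaluate(w, membership)
grades = ["low", "medium", "high"]
print(grades[int(np.argmax(b))])
```

A full application would also check the consistency ratio of the comparison matrix before accepting the weights; that step is omitted here for brevity.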
Interplay between past market correlation structure changes and future volatility outbursts.
Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T
2016-11-18
We report significant relations between past changes in the market correlation structure and future changes in the market volatility. This relation is made evident by using a measure of "correlation structure persistence" on correlation-based information filtering networks that quantifies the rate of change of the market dependence structure. We also measured changes in the correlation structure by means of a "metacorrelation" that measures a lagged correlation between correlation matrices computed over different time windows. Both methods show a deep interplay between past changes in correlation structure and future changes in volatility and we demonstrate they can anticipate market risk variations and this can be used to better forecast portfolio risk. Notably, these methods overcome the curse of dimensionality that limits the applicability of traditional econometric tools to portfolios made of a large number of assets. We report on forecasting performances and statistical significance of both methods for two different equity datasets. We also identify an optimal region of parameters in terms of True Positive and False Positive trade-off, through a ROC curve analysis. We find that this forecasting method is robust and it outperforms logistic regression predictors based on past volatility only. Moreover the temporal analysis indicates that methods based on correlation structural persistence are able to adapt to abrupt changes in the market, such as financial crises, more rapidly than methods based on past volatility.
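The "metacorrelation" described above is a lagged correlation between correlation matrices computed over different time windows. A minimal sketch follows, correlating the off-diagonal entries of today's rolling-window correlation matrix with those from a lagged window; the window length, lag, and toy return series are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def corr_offdiag(returns):
    """Upper-triangular (off-diagonal) entries of the correlation
    matrix of a (time x assets) return window."""
    c = np.corrcoef(returns, rowvar=False)
    iu = np.triu_indices_from(c, k=1)
    return c[iu]

def metacorrelation(returns, window, lag):
    """Lagged correlation between the correlation structure of the
    current window and the window 'lag' steps earlier."""
    t = returns.shape[0]
    vals = []
    for end in range(window + lag, t + 1):
        now = corr_offdiag(returns[end - window:end])
        past = corr_offdiag(returns[end - window - lag:end - lag])
        vals.append(np.corrcoef(now, past)[0, 1])
    return np.array(vals)

rng = np.random.default_rng(0)
returns = rng.standard_normal((200, 5))   # toy market of 5 assets
mc = metacorrelation(returns, window=50, lag=10)
print(mc.shape)  # (141,)
```

A low metacorrelation flags a fast-changing dependence structure, which the paper links to subsequent volatility outbursts; the paper's second measure, correlation structure persistence on filtering networks, is not sketched here.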
The Spatial Distributions and Variations of Water Environmental Risk in Yinma River Basin, China
Di, Hui; Liu, Xingpeng; Tong, Zhijun; Ji, Meichen
2018-01-01
Water environmental risk is the probability of the occurrence of events caused by human activities or the interaction of human activities and natural processes that will damage a water environment. This study proposed a water environmental risk index (WERI) model to assess the water environmental risk in the Yinma River Basin based on hazards, exposure, vulnerability, and regional management ability indicators in a water environment. The data for each indicator were gathered from 2000, 2005, 2010, and 2015 to assess the spatial and temporal variations in water environmental risk using particle swarm optimization and the analytic hierarchy process (PSO-AHP) method. The results showed that the water environmental risk in the Yinma River Basin decreased from 2000 to 2015. The risk level of the water environment was high in Changchun, while the risk levels in Yitong and Yongji were low. The research methods provide information to support future decision making by the risk managers in the Yinma River Basin, which is in a high-risk water environment. Moreover, water environment managers could reduce the risks by adjusting the indicators that affect water environmental risks. PMID:29543706
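The AHP half of the PSO-AHP weighting step can be sketched as deriving indicator weights from a pairwise comparison matrix via its principal eigenvector (power iteration). The 4x4 matrix below over the four indicator groups is hypothetical, not from the study; in PSO-AHP the comparison judgments themselves would additionally be optimized by particle swarm optimization.

```python
# Hedged sketch: AHP weights as the normalized principal eigenvector of a
# pairwise comparison matrix, computed by power iteration.

def ahp_weights(pairwise, iters=100):
    """Normalized principal eigenvector of a pairwise comparison matrix."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [v / s for v in w_new]
    return w

# Hypothetical comparisons among hazard, exposure, vulnerability, and
# management-ability indicators (entry [i][j] = importance of i relative to j).
matrix = [
    [1,   3,   2,   4],
    [1/3, 1,   1/2, 2],
    [1/2, 2,   1,   3],
    [1/4, 1/2, 1/3, 1],
]
weights = ahp_weights(matrix)
assert abs(sum(weights) - 1.0) < 1e-9
assert weights[0] == max(weights)  # hazard judged most important in this example
```

The resulting weights would then combine the normalized indicator scores into the overall WERI value for each sub-region.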
NASA Astrophysics Data System (ADS)
Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille
2017-04-01
In mountain areas, natural phenomena such as snow avalanches, debris-flows and rock-falls put people and objects at risk, sometimes with dramatic consequences. Risk is classically considered as a combination of hazard, i.e. the intensity and frequency of the phenomenon, and vulnerability, which corresponds to the consequences of the phenomenon for exposed people and material assets. Risk management consists in identifying the risk level as well as choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information, which comes from more or less reliable sources ranging from historical data and expert assessments to numerical simulations. Finally, risk management decisions are the result of complex knowledge management and reasoning processes. Tracing the information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling, and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on and present two main developments.
The first relates to uncertainty and imprecision propagation in numerical modeling, using both the classical Monte-Carlo probabilistic approach and a so-called hybrid approach based on possibility theory. The second deals with new multi-criteria decision-making methods that consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods handle information imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions in certain but also risky or uncertain contexts (see the new COWA-ER and FOWA-ER, Cautious and Fuzzy Ordered Weighted Averaging with Evidential Reasoning). For example, the ER-MCDA methodology considers expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it combines AHP, fuzzy sets theory, possibility theory and belief function theory within the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
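The two propagation schemes mentioned can be contrasted on a toy model. This is our own minimal illustration, not the paper's model: classical Monte-Carlo sampling of the inputs versus a possibilistic, hybrid-style propagation of triangular fuzzy numbers through alpha-cuts, applied to f(x, y) = x * y with positive inputs.

```python
# Monte-Carlo vs. possibilistic (alpha-cut) uncertainty propagation on a
# toy function; all distributions and fuzzy numbers below are hypothetical.
import random

def f(x, y):
    return x * y

def monte_carlo(n=10000):
    """Probabilistic propagation: sample inputs, record the output range."""
    random.seed(1)
    outs = [f(random.uniform(2, 4), random.uniform(1, 3)) for _ in range(n)]
    return min(outs), max(outs)

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def possibilistic(tri_x, tri_y, alpha):
    """Propagate alpha-cuts through f; endpoint evaluation suffices because
    f is monotone increasing in both arguments on positive intervals."""
    (xl, xu), (yl, yu) = alpha_cut(tri_x, alpha), alpha_cut(tri_y, alpha)
    return f(xl, yl), f(xu, yu)

mc_lo, mc_hi = monte_carlo()
ps_lo, ps_hi = possibilistic((2, 3, 4), (1, 2, 3), alpha=0.0)
# The alpha = 0 cut is the full support, so it encloses the sampled range.
assert ps_lo <= mc_lo and mc_hi <= ps_hi
```

The contrast illustrates why the possibilistic result is more conservative: it bounds everything the inputs allow, while Monte-Carlo reports what the assumed distributions make likely.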
Mapping of HLA-DQ haplotypes in a group of Danish patients with celiac disease.
Lund, Flemming; Hermansen, Mette N; Pedersen, Merete F; Hillig, Thore; Toft-Hansen, Henrik; Sölétormos, György
2015-10-01
A cost-effective identification of HLA-DQ risk haplotypes using the single nucleotide polymorphism (SNP) technique has recently been applied in the diagnosis of celiac disease (CD) in four European populations. The objective of the study was to map risk HLA-DQ haplotypes in a group of Danish CD patients using the SNP technique. Cohort A: Among 65 patients with gastrointestinal symptoms we compared the HLA-DQ2 and HLA-DQ8 risk haplotypes obtained by the SNP technique (method 1) with results based on a sequence specific primer amplification technique (method 2) and a technique used in an assay from BioDiagene (method 3). Cohort B: 128 patients with histologically verified CD were tested for CD risk haplotypes (method 1). Patients with negative results were further tested for sub-haplotypes of HLA-DQ2 (methods 2 and 3). Cohort A: The three applied methods provided the same HLA-DQ2 and HLA-DQ8 results among 61 patients. Four patients were negative for the HLA-DQ2 and HLA-DQ8 haplotypes (method 1) but were positive for the HLA-DQ2.5-trans and HLA-DQ2.2 haplotypes (methods 2 and 3). Cohort B: A total of 120 patients were positive for the HLA-DQ2.5-cis and HLA-DQ8 haplotypes (method 1). The remaining seven patients were positive for HLA-DQ2.5-trans or HLA-DQ2.2 haplotypes (methods 2 and 3). One patient was negative with all three HLA methods. The HLA-DQ risk haplotypes were detected in 93.8% of the CD patients using the SNP technique (method 1). The sensitivity increased to 99.2% by combining methods 1-3.
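The reported sensitivities for cohort B (n = 128) follow directly from the counts in the abstract and can be verified:

```python
# Sensitivity arithmetic for cohort B: method 1 alone detected 120 carriers;
# methods 2-3 found 7 more; one patient was negative with all three methods.
positive_method_1 = 120
positive_methods_2_3 = 7
n = 128

sens_method_1 = positive_method_1 / n
sens_combined = (positive_method_1 + positive_methods_2_3) / n
assert round(sens_method_1 * 100, 1) == 93.8   # SNP technique alone
assert round(sens_combined * 100, 1) == 99.2   # methods 1-3 combined
```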
Rethinking the Clinically Based Thresholds of TransCelerate BioPharma for Risk-Based Monitoring.
Zink, Richard C; Dmitrienko, Anastasia; Dmitrienko, Alex
2018-01-01
The quality of data from clinical trials has received a great deal of attention in recent years. Of central importance is the need to protect the well-being of study participants and maintain the integrity of final analysis results. However, traditional approaches to assess data quality have come under increased scrutiny as providing little benefit for the substantial cost. Numerous regulatory guidance documents and industry position papers have described risk-based approaches to identify quality and safety issues. In particular, the position paper of TransCelerate BioPharma recommends defining risk thresholds to assess safety and quality risks based on past clinical experience. This exercise can be extremely time-consuming, and the resulting thresholds may only be relevant to a particular therapeutic area, patient or clinical site population. In addition, predefined thresholds cannot account for safety or quality issues where the underlying rate of observing a particular problem may change over the course of a clinical trial, and often do not consider varying patient exposure. In this manuscript, we appropriate rules commonly utilized for funnel plots to define a traffic-light system for risk indicators based on statistical criteria that consider the duration of patient follow-up. Further, we describe how these methods can be adapted to assess changing risk over time. Finally, we illustrate numerous graphical approaches to summarize and communicate risk, and discuss hybrid clinical-statistical approaches to allow for the assessment of risk at sites with low patient enrollment. We illustrate the aforementioned methodologies for a clinical trial in patients with schizophrenia. Funnel plots are a flexible graphical technique that can form the basis for a risk-based strategy to assess data integrity, while considering site sample size, patient exposure, and changing risk across time.
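The funnel-plot traffic-light idea can be sketched as follows. This is a simplified illustration, not the manuscript's exact criteria: a site's observed adverse-event proportion is compared against control limits around an overall rate p0 that widen as site exposure n shrinks, with the 2- and 3-standard-error thresholds being conventional funnel-plot choices.

```python
# Hedged sketch of funnel-plot control limits as a traffic-light risk flag.
import math

def classify_site(events, n, p0):
    """Green/amber/red flag for an observed event proportion vs. funnel limits
    around the overall rate p0, for a site with n exposed patients."""
    se = math.sqrt(p0 * (1 - p0) / n)   # binomial standard error at rate p0
    z = abs(events / n - p0) / se
    if z < 2:
        return "green"
    if z < 3:
        return "amber"
    return "red"

# Hypothetical sites, all compared against an overall rate p0 = 0.10:
assert classify_site(events=3, n=30, p0=0.10) == "green"   # on target
assert classify_site(events=7, n=30, p0=0.10) == "amber"   # elevated
assert classify_site(events=12, n=30, p0=0.10) == "red"    # well above limits
```

Because the standard error shrinks with n, the same observed proportion can be green at a small site but red at a large one, which is exactly the sample-size sensitivity the funnel plot is designed to encode.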
New methods for fall risk prediction.
Ejupi, Andreas; Lord, Stephen R; Delbaere, Kim
2014-09-01
Accidental falls are the leading cause of injury-related death and hospitalization in old age, with over one-third of the older adults experiencing at least one fall or more each year. Because of limited healthcare resources, regular objective fall risk assessments are not possible in the community on a large scale. New methods for fall prediction are necessary to identify and monitor those older people at high risk of falling who would benefit from participating in falls prevention programmes. Technological advances have enabled less expensive ways to quantify physical fall risk in clinical practice and in the homes of older people. Recently, several studies have demonstrated that sensor-based fall risk assessments of postural sway, functional mobility, stepping and walking can discriminate between fallers and nonfallers. Recent research has used low-cost, portable and objective measuring instruments to assess fall risk in older people. Future use of these technologies holds promise for assessing fall risk accurately in an unobtrusive manner in clinical and daily life settings.
Investigating Sociodemographic Disparities in Cancer Risk Using Web-Based Informatics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Hong-Jun; Tourassi, Georgia
Cancer health disparities due to demographic and socioeconomic factors are an area of great interest in the epidemiological community. Adjusting for such factors is important when developing cancer risk models. However, for digital epidemiology studies relying on online sources such information is not readily available. This paper presents a novel method for extracting demographic and socioeconomic information from openly available online obituaries. The method relies on tailored language processing rules and a probabilistic scheme to map subjects’ occupation history to the occupation classification codes and related earnings provided by the U.S. Census Bureau. Using this information, a case-control study is executed fully in silico to investigate how age, gender, parity, and income level impact breast and lung cancer risk. Based on 48,368 online obituaries (4,643 for breast cancer, 6,274 for lung cancer, and 37,451 cancer-free) collected automatically and a generalized cancer risk model, our study shows strong association between age, parity, and socioeconomic status and cancer risk. Although for breast cancer the observed trends are very consistent with traditional epidemiological studies, some inconsistency is observed for lung cancer with respect to socioeconomic status.
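The kind of association measure such a case-control design yields can be sketched as an odds ratio with a Wald confidence interval. The 2x2 counts below are entirely hypothetical, not the obituary-derived data, and serve only to show the computation:

```python
# Hedged sketch: odds ratio and Wald 95% CI from hypothetical 2x2 counts.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for exposed/unexposed cases (a, b) vs controls (c, d)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical: low-income exposure among 200 cases and 1000 controls.
or_, lo, hi = odds_ratio_ci(a=80, b=120, c=250, d=750)
assert abs(or_ - 2.0) < 1e-9
assert lo < or_ < hi
```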
Evaluating Risk Perception based on Gender Differences for Mountaineering Activity
NASA Astrophysics Data System (ADS)
Susanto, Novie; Susatyo, Nugroho W. P.; Rizkiyah, Ega
2018-02-01
An average of 26 deaths per year in mountaineering is reported for the period from 2003 to 2012. The number of women dying during mountaineering is significantly smaller than that of men (3.5 male deaths for each female death). This study aims to analyze differences in risk perception based on gender and to provide recommendations as an educational basis for preventing accidents in mountaineering. This study uses the Kruskal-Wallis test and the Delphi method. A total of 200 mountaineer respondents (100 males and 100 females) participated in this study. The independent variable in this study was gender. The dependent variable was risk perception, including perception of the severity of an accident, perception of the probability of an accident, anxiety level, and perception of efficacy and self-efficacy. The study results showed that the risk perception of women is significantly higher than that of men (p-value = 0.019). The recommendations from the Delphi method are: developing a positive mental attitude, teaching about the risks that exist in nature, implementing Cognitive Behaviour Therapy (CBT) to raise awareness of one's own safety, attending a climbing or mountaineering school, and using instructors to give lessons about safety in outdoor activities.
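The Kruskal-Wallis comparison of risk-perception scores across gender groups can be sketched in pure Python. The Likert-style ratings below are hypothetical, not the study's data; the H statistic is compared to the chi-square critical value for 1 degree of freedom at alpha = 0.05 (3.841), and no tie-correction factor is applied in this sketch.

```python
# Hedged sketch: Kruskal-Wallis H statistic (average ranks for ties).

def kruskal_wallis_h(*groups):
    """H statistic over any number of groups, using average ranks for ties."""
    pooled = sorted(v for g in groups for v in g)
    n = len(pooled)
    # average 1-based rank for each distinct value
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2   # mean of ranks i+1 .. j
        i = j
    h = 0.0
    for g in groups:
        r_sum = sum(ranks[v] for v in g)
        h += r_sum ** 2 / len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Hypothetical 1-5 risk-perception ratings:
women = [4, 5, 4, 5, 3, 5, 4, 4]
men = [2, 3, 2, 4, 3, 2, 3, 2]
h = kruskal_wallis_h(women, men)
assert h > 3.841  # difference significant at the 0.05 level (1 d.f.)
```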
Zhang, Wei; Wei, Shilin; Teng, Yanbin; Zhang, Jianku; Wang, Xiufang; Yan, Zheping
2017-01-01
In view of a dynamic obstacle environment with motion uncertainty, we present a dynamic collision avoidance method based on collision risk assessment and an improved velocity obstacle method. First, through the fusion optimization of forward-looking sonar data, the redundancy of the data is reduced and the position, size and velocity information of the obstacles is obtained, which provides an accurate decision-making basis for the subsequent collision avoidance step. Second, according to the minimum meeting time and the minimum distance between the obstacle and the unmanned underwater vehicle (UUV), this paper establishes a collision risk assessment model and screens the key obstacles for collision avoidance. Finally, the optimization objective function is established based on the improved velocity obstacle method, and the UUV's motion characteristics are used to calculate the set of reachable velocities. The optimal collision-avoidance velocity of the UUV is searched for in the velocity space, and the corresponding heading and speed commands are calculated and output to the motion control module. Together, these steps constitute the complete dynamic obstacle avoidance process. The simulation results show that the proposed method obtains a good collision avoidance effect in a dynamic environment and has good adaptability to unknown dynamic environments. PMID:29186878
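The geometric test at the core of the velocity obstacle method can be sketched in 2-D: a candidate relative velocity leads to collision if it points inside the cone from the vehicle toward the obstacle, whose half-angle is asin(R/d) for combined radius R and obstacle distance d. This is a simplified sketch under constant-velocity assumptions; the paper's improved method adds sonar fusion, risk screening, and velocity-space optimization on top of this basic test.

```python
# Hedged sketch of the basic velocity-obstacle collision-cone test (2-D).
import math

def in_collision_cone(rel_pos, rel_vel, combined_radius):
    """True if the velocity of the vehicle relative to the obstacle lies
    inside the collision cone toward the obstacle at rel_pos."""
    d = math.hypot(*rel_pos)
    if d <= combined_radius:
        return True   # already overlapping
    speed = math.hypot(*rel_vel)
    if speed == 0:
        return False  # no relative motion, no collision
    half_angle = math.asin(combined_radius / d)
    # angle between rel_vel and the line of sight to the obstacle
    dot = rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]
    angle = math.acos(max(-1.0, min(1.0, dot / (d * speed))))
    return angle < half_angle

# Obstacle 20 m ahead, combined radius 3 m:
assert in_collision_cone((20, 0), (1, 0), 3)        # heading straight at it
assert not in_collision_cone((20, 0), (0, 1), 3)    # moving perpendicular
```

Velocities outside every key obstacle's cone form the safe set from which an optimal avoidance velocity is then selected.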
Prioritizing chemicals for environmental management in China based on screening of potential risks
NASA Astrophysics Data System (ADS)
Yu, Xiangyi; Mao, Yan; Sun, Jinye; Shen, Yingwa
2014-03-01
The rapid development of China's chemical industry has created increasing pressure to improve the environmental management of chemicals. To bridge the large gap between the use and safe management of chemicals, we performed a comprehensive review of the international methods used to prioritize chemicals for environmental management. By comparing domestic and foreign methods, we confirmed the presence of this gap and identified potential solutions. Based on our literature review, we developed an appropriate screening method that accounts for the unique characteristics of chemical use within China. The proposed method is based on an evaluation using nine indices of the potential hazard posed by a chemical: three environmental hazard indices (persistence, bioaccumulation, and eco-toxicity), four health hazard indices (acute toxicity, carcinogenicity, mutagenicity, and reproductive and developmental toxicity), and two environmental exposure hazard indices (chemical amount and utilization pattern). The results of our screening agree with results of previous efforts from around the world, confirming the validity of the new system. The classification method will help decision-makers to prioritize and identify the chemicals with the highest environmental risk, thereby providing a basis for improving chemical management in China.
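The nine-index screening idea can be sketched as a ranking over combined hazard scores. This is a hedged illustration only: the chemical names and index values are invented, each index is pre-scaled to 0-1 here, and a plain sum is used where the paper's actual scoring scales and aggregation may differ.

```python
# Hedged sketch: rank chemicals by a combined score over the nine hazard
# indices named in the abstract (values and names are hypothetical).

INDICES = ["persistence", "bioaccumulation", "eco_toxicity",
           "acute_toxicity", "carcinogenicity", "mutagenicity",
           "repro_dev_toxicity", "chemical_amount", "utilization_pattern"]

def screen(chemicals):
    """Rank chemicals by total hazard score, highest risk first."""
    scored = [(sum(values[i] for i in INDICES), name)
              for name, values in chemicals.items()]
    return [name for score, name in sorted(scored, reverse=True)]

chemicals = {
    "chem_A": dict(zip(INDICES, [0.9, 0.8, 0.7, 0.6, 0.9, 0.8, 0.7, 0.9, 0.8])),
    "chem_B": dict(zip(INDICES, [0.2, 0.1, 0.3, 0.2, 0.1, 0.2, 0.1, 0.4, 0.3])),
    "chem_C": dict(zip(INDICES, [0.5, 0.6, 0.4, 0.5, 0.6, 0.5, 0.4, 0.6, 0.5])),
}
ranking = screen(chemicals)
assert ranking == ["chem_A", "chem_C", "chem_B"]
```

The top of such a ranking is the priority list the abstract describes: the chemicals whose management deserves attention first.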