Sample records for evaluating value-at-risk models

  1. An integrated approach to evaluating alternative risk prediction strategies: a case study comparing alternative approaches for preventing invasive fungal disease.

    PubMed

    Sadique, Z; Grieve, R; Harrison, D A; Jit, M; Allen, E; Rowan, K M

    2013-12-01

    This article proposes an integrated approach to the development, validation, and evaluation of new risk prediction models illustrated with the Fungal Infection Risk Evaluation study, which developed risk models to identify non-neutropenic, critically ill adult patients at high risk of invasive fungal disease (IFD). Our decision-analytical model compared alternative strategies for preventing IFD at up to three clinical decision time points (critical care admission, after 24 hours, and end of day 3), followed by antifungal prophylaxis for those judged "high" risk versus "no formal risk assessment." We developed prognostic models to predict the risk of IFD before critical care unit discharge, with data from 35,455 admissions to 70 UK adult critical care units, and validated the models externally. The decision model was populated with positive predictive values and negative predictive values from the best-fitting risk models. We projected lifetime cost-effectiveness and expected value of partial perfect information for groups of parameters. The risk prediction models performed well in internal and external validation. Risk assessment and prophylaxis at the end of day 3 was the most cost-effective strategy at the 2% and 1% risk thresholds. Risk assessment at each time point was the most cost-effective strategy at a 0.5% risk threshold. Expected values of partial perfect information were high for positive predictive values or negative predictive values (£11 million-£13 million) and quality-adjusted life-years (£11 million). It is cost-effective to formally assess the risk of IFD for non-neutropenic, critically ill adult patients. This integrated approach to developing and evaluating risk models is useful for informing clinical practice and future research investment. © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by International Society for Pharmacoeconomics and Outcomes Research (ISPOR). All rights reserved.

  2. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distribution model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR), and conditional value at risk (CVaR) for monthly and weekly rates of return for the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two-component univariate normal mixture distribution model. First, we present the application of the model in empirical finance, where we fit it to our real data. Second, we present its application in risk analysis, where we use it to evaluate VaR and CVaR, with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distribution model fits the data well and performs better in estimating VaR and CVaR because it captures the stylized facts of non-normality and leptokurtosis in the return distribution.
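As an illustrative sketch (not the authors' fitted model), the VaR and CVaR of a two-component normal mixture can be obtained by simulation; the mixture weights, means, and standard deviations below are assumed values standing in for parameters fitted to FBMKLCI returns:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-component mixture parameters (weights, means, std devs);
# in practice these would be fitted to the return series, e.g. by EM.
w = np.array([0.8, 0.2])
mu = np.array([0.01, -0.02])
sigma = np.array([0.04, 0.10])

# Simulate returns from the mixture
n = 100_000
comp = rng.choice(2, size=n, p=w)
returns = rng.normal(mu[comp], sigma[comp])

# 95% VaR: loss threshold exceeded with 5% probability
alpha = 0.05
var_95 = -np.quantile(returns, alpha)

# 95% CVaR (expected shortfall): mean loss beyond the VaR
cvar_95 = -returns[returns <= -var_95].mean()

print(f"VaR(95%)  = {var_95:.4f}")
print(f"CVaR(95%) = {cvar_95:.4f}")
```

CVaR is always at least as large as VaR at the same level, since it averages the losses in the tail beyond the VaR threshold.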

  3. Persistent hemifacial spasm after microvascular decompression: a risk assessment model.

    PubMed

    Shah, Aalap; Horowitz, Michael

    2017-06-01

    Microvascular decompression (MVD) for hemifacial spasm (HFS) provides resolution of disabling symptoms such as eyelid twitching and muscle contractions of the entire hemiface. The primary aim of this study was to evaluate the predictive value of patient demographics and spasm characteristics on long-term outcomes, with or without intraoperative lateral spread response (LSR) as an additional variable in a risk assessment model. A retrospective study was undertaken to evaluate the associations of pre-operative patient characteristics, as well as intraoperative LSR and need for a staged procedure, with the presence of persistent or recurrent HFS at the time of hospital discharge and at follow-up. A risk assessment model was constructed with the inclusion of six clinically or statistically significant variables from the univariate analyses. A receiver operating characteristic curve was generated, and the area under the curve was calculated to determine the strength of the predictive model. A risk assessment model was first created consisting of significant pre-operative variables (Model 1) (age >50, female gender, history of botulinum toxin use, platysma muscle involvement). This model demonstrated borderline predictive value for persistent spasm at discharge (AUC .60; p=.045) and fair predictive value at follow-up (AUC .75; p=.001). Intraoperative variables (e.g. LSR persistence) demonstrated little additive value (Model 2) (AUC .67). Patients with a higher risk score (three or greater) demonstrated greater odds of persistent HFS at the time of discharge (OR 1.5 [95%CI 1.16-1.97]; p=.035), as well as greater odds of persistent or recurrent spasm at the time of follow-up (OR 3.0 [95%CI 1.52-5.95]; p=.002). Conclusions: A risk assessment model consisting of pre-operative clinical characteristics is useful in prognosticating HFS persistence at follow-up.

  4. Research on efficiency evaluation model of integrated energy system based on hybrid multi-attribute decision-making.

    PubMed

    Li, Yan

    2017-05-25

    The efficiency evaluation of an integrated energy system involves many influencing factors whose attribute values are heterogeneous and non-deterministic: they often cannot be given as specific numerical values or accurate probability distributions, which biases the final evaluation result. According to the characteristics of the integrated energy system, a hybrid multi-attribute decision-making model is constructed that accounts for the decision maker's risk preference. In evaluating the efficiency of the integrated energy system, some evaluation indexes take linguistic values, or the evaluation experts disagree; this makes the decision information ambiguous, usually in the form of uncertain linguistic values and numerical interval values. Accordingly, an interval-valued multiple-attribute decision-making method and a fuzzy linguistic multiple-attribute decision-making model are proposed. Finally, the mathematical model for efficiency evaluation of the integrated energy system is constructed.

  5. Risk evaluation on leading companies in property and real estate subsector at IDX: A Value-at-Risk with ARMAX-GARCHX approach and duration test

    NASA Astrophysics Data System (ADS)

    Dwi Prastyo, Dedy; Handayani, Dwi; Fam, Soo-Fen; Puteri Rahayu, Santi; Suhartono; Luh Putu Satyaning Pradnya Paramita, Ni

    2018-03-01

    Risk assessment and evaluation are essential for financial institutions to measure the potential risk of their counterparties. From the middle of 2016 until the first quarter of 2017, the Indonesian government ran a national program known as Tax Amnesty. One subsector with the potential to benefit from the Tax Amnesty program is property and real estate. This work evaluates the risk of the top five companies by capital share listed on the Indonesia Stock Exchange (IDX). To do this, the Value-at-Risk (VaR) with an ARMAX-GARCHX approach is employed. The ARMAX-GARCHX simultaneously models the adaptive mean and variance of each company's stock return, considering exogenous variables, i.e. the IDR/USD exchange rate and the Jakarta Composite Index (JCI). The risk is evaluated in a moving-window scheme. Risk evaluation using the 5% quantile with a window size of 500 transaction days performs better than the other scenarios. In addition, a duration test is used to test the dependency between shortfalls; it indicates that the series of shortfalls is independent.
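The moving-window evaluation described above can be sketched as follows; the returns are simulated stand-ins (the study uses IDX stock returns and a fitted ARMAX-GARCHX model, whereas this sketch uses a plain empirical quantile):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated daily stock returns standing in for a real price series
returns = rng.normal(0.0005, 0.02, size=1500)

window, q = 500, 0.05  # 500-day moving window, 5% quantile

# One-step-ahead empirical VaR from each moving window
violations = []
for t in range(window, len(returns)):
    var_t = np.quantile(returns[t - window:t], q)  # 5% quantile of past window
    violations.append(returns[t] < var_t)          # shortfall if loss exceeds VaR

viol = np.array(violations)
rate = viol.mean()
print(f"Observed violation rate: {rate:.3f} (target {q})")

# Durations between consecutive shortfalls, the quantity the duration test examines
idx = np.flatnonzero(viol)
durations = np.diff(idx)
```

A well-calibrated 5% VaR should be violated on roughly 5% of out-of-window days, and the duration test then checks whether the gaps between violations show clustering.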

  6. [Health risk assessment of coke oven PAHs emissions].

    PubMed

    Bo, Xin; Wang, Gang; Wen, Rou; Zhao, Chun-Li; Wu, Tie; Li, Shi-Bei

    2014-07-01

    Polycyclic aromatic hydrocarbons (PAHs) produced by coke ovens are strongly toxic and carcinogenic. Taking a typical coke oven of iron and steel enterprises as the case study, the dispersion and migration of 13 kinds of PAHs emitted from the coke oven were analyzed using the AERMOD dispersion model, and the carcinogenic and non-carcinogenic risks at the receptors within the modeling domain were evaluated using BREEZE Risk Analyst, following the Human Health Risk Assessment Protocol for Hazardous Waste Combustion (HHRAP); the health risks caused by PAH emissions from the coke oven were thus quantitatively evaluated. The results indicated that attention should be paid to the non-carcinogenic risk of naphthalene emission (the maximum value was 0.97). The carcinogenic risks of each single pollutant were all below 1.0E-06, while the maximum value of total carcinogenic risk was 2.65E-06, which may have some influence on the health of local residents.

  7. A wildfire risk modeling system for evaluating landscape fuel treatment strategies

    Treesearch

    Alan Ager; Mark Finney; Andrew McMahan

    2006-01-01

    Despite a wealth of literature and models concerning wildfire risk, field units in Federal land management agencies lack a clear framework and operational tools to measure how risk might change from proposed fuel treatments. In an actuarial context, risk is defined as the expected value change from a fire, calculated as the product of (1) probability of a fire at a...

  8. Risk assessment of brine contamination to aquatic resources from energy development in glacial drift deposits: Williston Basin, USA

    USGS Publications Warehouse

    Preston, Todd M.; Chesley-Preston, Tara

    2015-01-01

    Our goal was to improve the Sheridan County assessment (SCA) and evaluate the use of this new Williston Basin assessment (WBA) across 31 counties mantled by glacial drift in the Williston Basin. To determine if the WBA model improved on the SCA model, results from both assessments were compared to CI values from 37 surface and groundwater samples collected to evaluate the SCA. The WBA (R2 = 0.65) outperformed the SCA (R2 = 0.52), indicating improved model performance. Applicability across the Williston Basin was evaluated by comparing WBA results to CI values from 123 surface water samples collected from 97 sections. Based on the WBA, the majority (83.5%) of sections lacked an oil well and had minimal risk. Sections with one or more oil wells comprised low (8.4%), moderate (6.5%), or high (1.7%) risk areas. The percentage of contaminated water samples, the percentage of sections with at least one contaminated sample, and the average CI value of contaminated samples all increased from low to high risk, indicating applicability across the Williston Basin. Furthermore, the WBA performed better when compared against only the contaminated samples (R2 = 0.62) than against all samples (R2 = 0.38). This demonstrates that the WBA was successful at identifying sections, but not individual aquatic resources, with an increased risk of contamination; therefore, WBA results can prioritize future sampling within areas of increased risk.

  9. Occupational health and safety: Designing and building with MACBETH a value risk-matrix for evaluating health and safety risks

    NASA Astrophysics Data System (ADS)

    Lopes, D. F.; Oliveira, M. D.; Costa, C. A. Bana e.

    2015-05-01

    Risk matrices (RMs) are commonly used to evaluate health and safety risks. Nonetheless, they violate some theoretical principles that compromise their feasibility and use. This study describes how multiple criteria decision analysis methods have been used to improve the design and the deployment of RMs to evaluate health and safety risks at the Occupational Health and Safety Unit (OHSU) of the Regional Health Administration of Lisbon and Tagus Valley. ‘Value risk-matrices’ (VRMs) are built with the MACBETH approach in four modelling steps: a) structuring risk impacts, involving the construction of descriptors of impact that link risk events with health impacts and are informed by scientific evidence; b) generating a value measurement scale of risk impacts, by applying the MACBETH-Choquet procedure; c) building a system for eliciting subjective probabilities that makes use of a numerical probability scale that was constructed with MACBETH qualitative judgments on likelihood; and d) defining a classification colouring scheme for the VRM. A VRM built with OHSU members was implemented in a decision support system which will be used by OHSU members to evaluate health and safety risks and to identify risk mitigation actions.

  10. Human Health Risk Assessment Simulations in a Distributed Environment for Shuttle Launch

    NASA Technical Reports Server (NTRS)

    Thirumalainambi, Rajkumar; Bardina, Jorge

    2004-01-01

    During the launch of a rocket under prevailing weather conditions, commanders at Cape Canaveral Air Force Station evaluate the possibility that wind-blown toxic emissions might reach civilian and military personnel in the nearby area. In our model, we focused mainly on hydrogen chloride (HCl), nitrogen oxides (NOx), and nitric acid (HNO3), which are non-carcinogenic chemicals per the United States Environmental Protection Agency (USEPA) classification. We used the hazard quotient model to estimate the number of people at risk, based on the number of people with exposure above a reference exposure level that is unlikely to cause adverse health effects. The risk to the exposed population is calculated by multiplying the individual risk by the number in the exposed population. The risk values are compared against acceptable risk values, and a go/no-go decision for the Shuttle launch is made based on them. The entire model is simulated over the web, and different scenarios can be generated, which allows management to choose an optimum decision.
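A minimal sketch of the hazard quotient screening logic described above; the reference exposure levels, modeled exposures, individual risk, and population size are invented placeholders, not USEPA values:

```python
# Hazard quotient screening: HQ = modeled exposure / reference exposure level
# (REL); HQ > 1 flags potential non-carcinogenic effects. All numbers below
# are illustrative placeholders, not actual USEPA reference values.

rel_mg_m3 = {"HCl": 2.1, "NOx": 0.47, "HNO3": 0.086}     # assumed RELs
exposure_mg_m3 = {"HCl": 0.9, "NOx": 0.6, "HNO3": 0.05}  # assumed modeled exposures

hq = {chem: exposure_mg_m3[chem] / rel_mg_m3[chem] for chem in rel_mg_m3}
flagged = [chem for chem, q in hq.items() if q > 1.0]

# Population risk as in the abstract: individual risk times exposed population
individual_risk = 1e-4        # assumed per-person probability of adverse effect
exposed_population = 12_000   # assumed head count downwind
population_risk = individual_risk * exposed_population

for chem, q in hq.items():
    print(f"{chem}: HQ = {q:.2f} -> {'exceeds' if q > 1 else 'below'} REL")
print(f"expected population risk: {population_risk}")
```

In a go/no-go setting, the computed population risk would then be compared against a pre-agreed acceptable-risk threshold.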

  11. Chemical-specific screening criteria for interpretation of biomonitoring data for volatile organic compounds (VOCs)--application of steady-state PBPK model solutions.

    PubMed

    Aylward, Lesa L; Kirman, Chris R; Blount, Ben C; Hays, Sean M

    2010-10-01

    The National Health and Nutrition Examination Survey (NHANES) generates population-representative biomonitoring data for many chemicals including volatile organic compounds (VOCs) in blood. However, no health or risk-based screening values are available to evaluate these data from a health safety perspective or to use in prioritizing among chemicals for possible risk management actions. We gathered existing risk assessment-based chronic exposure reference values such as reference doses (RfDs), reference concentrations (RfCs), tolerable daily intakes (TDIs), cancer slope factors, etc. and key pharmacokinetic model parameters for 47 VOCs. Using steady-state solutions to a generic physiologically-based pharmacokinetic (PBPK) model structure, we estimated chemical-specific steady-state venous blood concentrations across chemicals associated with unit oral and inhalation exposure rates and with chronic exposure at the identified exposure reference values. The geometric means of the slopes relating modeled steady-state blood concentrations to steady-state exposure to a unit oral dose or unit inhalation concentration among 38 compounds with available pharmacokinetic parameters were 12.0 µg/L per mg/kg-d (geometric standard deviation [GSD] of 3.2) and 3.2 µg/L per mg/m³ (GSD=1.7), respectively. Chemical-specific blood concentration screening values based on non-cancer reference values for both oral and inhalation exposure range from 0.0005 to 100 µg/L; blood concentrations associated with cancer risk-specific doses at the 1E-05 risk level ranged from 5E-06 to 6E-02 µg/L. The distribution of modeled steady-state blood concentrations associated with unit exposure levels across VOCs may provide a basis for estimating blood concentration screening values for VOCs that lack chemical-specific pharmacokinetic data. 
The screening blood concentrations presented here provide a tool for risk assessment-based evaluation of population biomonitoring data for VOCs and are most appropriately applied to central tendency estimates for such datasets. Copyright (c) 2010 Elsevier Inc. All rights reserved.
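The screening-value construction can be sketched as follows, using the geometric-mean slopes reported above; the reference dose and reference concentration are made-up example values, not chemical-specific ones:

```python
# Screening blood concentration for a VOC: multiply the chronic exposure
# reference value by the modeled steady-state blood-concentration slope.
# Slopes are the geometric means reported in the abstract; the RfD and RfC
# below are invented example values, not chemical-specific ones.

slope_oral = 12.0        # ug/L blood per mg/kg-d oral dose (abstract GM)
slope_inhalation = 3.2   # ug/L blood per mg/m^3 inhaled (abstract GM)

rfd_mg_kg_d = 0.01       # assumed oral reference dose for some VOC
rfc_mg_m3 = 0.5          # assumed inhalation reference concentration

screening_oral = slope_oral * rfd_mg_kg_d          # ug/L blood
screening_inhalation = slope_inhalation * rfc_mg_m3  # ug/L blood
print(screening_oral, screening_inhalation)
```

A measured central-tendency blood concentration below such a screening value would suggest population exposure below the corresponding reference value.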

  12. Experimental Evaluation of the Value Added by Raising a Reader and Supplemental Parent Training in Shared Reading

    ERIC Educational Resources Information Center

    Anthony, Jason L.; Williams, Jeffrey M.; Zhang, Zhoe; Landry, Susan H.; Dunkelberger, Martha J.

    2014-01-01

    Research Findings: In an effort toward developing a comprehensive, effective, scalable, and sustainable early childhood education program for at-risk populations, we conducted an experimental evaluation of the value added by 2 family involvement programs to the Texas Early Education Model (TEEM). A total of 91 preschool classrooms that served…

  13. Minimizing metastatic risk in radiotherapy fractionation schedules

    NASA Astrophysics Data System (ADS)

    Badri, Hamidreza; Ramakrishnan, Jagdish; Leder, Kevin

    2015-11-01

    Metastasis is the process by which cells from a primary tumor disperse and form new tumors at distant anatomical locations. The treatment and prevention of metastatic cancer remains an extremely challenging problem. This work introduces a novel biologically motivated objective function to the radiation optimization community that takes into account metastatic risk instead of the status of the primary tumor. In this work, we consider the problem of developing fractionated irradiation schedules that minimize production of metastatic cancer cells while keeping normal tissue damage below an acceptable level. A dynamic programming framework is utilized to determine the optimal fractionation scheme. We evaluated our approach on a breast cancer case using the heart and the lung as organs-at-risk (OAR). For small tumor α/β values, hypo-fractionated schedules were optimal, which is consistent with standard models. However, for relatively larger α/β values, we found the type of schedule depended on various parameters such as the time when metastatic risk was evaluated, the α/β values of the OARs, and the normal tissue sparing factors. Interestingly, in contrast to standard models, hypo-fractionated and semi-hypo-fractionated schedules (large initial doses with doses tapering off with time) were suggested even with large tumor α/β values. Numerical results indicate the potential for significant reduction in metastatic risk.
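The standard result the paper compares against rests on the linear-quadratic model, under which hypo-fractionation is favored for small tumor α/β. A minimal sketch of the biologically effective dose (BED) comparison, with illustrative schedule and α/β values (not the paper's dynamic-programming optimization):

```python
# Linear-quadratic model: biologically effective dose for n fractions of
# dose d is BED = n * d * (1 + d / (alpha/beta)). All parameter values here
# are illustrative only.

def bed(n_fractions, dose_per_fraction, alpha_beta):
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# Conventional (25 x 2 Gy) vs hypo-fractionated (5 x 10 Gy) schedules,
# both delivering the same 50 Gy physical dose
for ab in (3.0, 10.0):  # low vs high tumor alpha/beta (Gy)
    conventional = bed(25, 2.0, ab)
    hypo = bed(5, 10.0, ab)
    print(f"alpha/beta = {ab} Gy: BED conventional = {conventional:.1f}, "
          f"hypo = {hypo:.1f}")
```

At fixed physical dose, the BED advantage of large fractions shrinks as α/β grows, which is why the α/β values of the tumor and the OARs drive which schedule wins.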

  14. Estimating the Value-at-Risk for some stocks at the capital market in Indonesia based on ARMA-FIGARCH models

    NASA Astrophysics Data System (ADS)

    Sukono; Lesmana, E.; Susanti, D.; Napitupulu, H.; Hidayat, Y.

    2017-11-01

    Value-at-Risk has become a standard measurement that financial institutions must carry out, both for internal purposes and for regulators. In this paper, the estimation of the Value-at-Risk of several stocks is analyzed with an econometric modelling approach. We assume that the stock returns follow a time series model: ARMA models are used to estimate the mean, and FIGARCH models to estimate the variance. The mean and variance estimators are then used to estimate the Value-at-Risk. The analysis shows that for the five stocks PRUF, BBRI, MPPA, BMRI, and INDF, the Value-at-Risk estimates are 0.01791, 0.06037, 0.02550, 0.06030, and 0.02585 respectively. Since the Value-at-Risk represents the maximum loss of each stock at a 95% confidence level, it can be taken into consideration in determining investment policy on stocks.
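A minimal sketch of the final step, computing a parametric VaR from a conditional mean and standard deviation; the paper obtains these from fitted ARMA and FIGARCH models, while the values below are assumed stand-ins:

```python
from statistics import NormalDist

# Parametric one-period VaR given a conditional mean and standard deviation,
# assuming (for illustration) normally distributed innovations.

def parametric_var(mu, sigma, confidence=0.95):
    """VaR as a positive loss magnitude at the given confidence level."""
    z = NormalDist().inv_cdf(1 - confidence)  # about -1.645 at 95%
    return -(mu + z * sigma)

# Hypothetical conditional forecasts for one stock
mu_hat, sigma_hat = 0.0004, 0.011
var_95 = parametric_var(mu_hat, sigma_hat)
print(f"VaR(95%) = {var_95:.5f}")
```

The FIGARCH variance forecast changes day to day, so the resulting VaR is a moving risk limit rather than a single fixed number.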

  15. A quantile-based Time at Risk: A new approach for assessing risk in financial markets

    NASA Astrophysics Data System (ADS)

    Bolgorian, Meysam; Raei, Reza

    2013-11-01

    In this paper, we provide a new measure for evaluating risk in financial markets, based on the return interval of critical events in financial markets or other investment situations. Our main goal was to devise a model analogous to Value at Risk (VaR). Whereas VaR, for a given financial asset, probability level, and time horizon, gives a critical value such that the likelihood that the loss on the asset over the time horizon exceeds this value equals the given probability level, our concept of Time at Risk (TaR), using a probability distribution function of return intervals, provides a critical time such that the probability that the return interval of a critical event exceeds this time equals the given probability level. As an empirical application, we applied the model to data from the Tehran Stock Exchange Price Index (TEPIX) as a financial asset (market portfolio) and report the results.
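An empirical sketch of the TaR idea on simulated data (the paper fits a probability distribution to the return intervals; here the quantile is taken empirically, and the return series is synthetic rather than TEPIX):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated daily returns standing in for an index such as TEPIX
returns = rng.normal(0, 0.015, size=5000)

# A "critical event" is a loss beyond a chosen threshold
threshold = np.quantile(returns, 0.05)
event_days = np.flatnonzero(returns <= threshold)
intervals = np.diff(event_days)  # return intervals between critical events

# TaR at level p: the time such that P(interval > TaR) = p,
# i.e. the (1 - p) quantile of the return-interval distribution
p = 0.05
tar = np.quantile(intervals, 1 - p)
print(f"TaR at p = {p}: about {tar:.0f} days")
```

Where VaR fixes a horizon and asks how large the loss can be, TaR fixes an event severity and asks how long one can expect to wait between such events.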

  16. Risk of Acute Liver Failure in Patients With Drug-Induced Liver Injury: Evaluation of Hy's Law and a New Prognostic Model.

    PubMed

    Lo Re, Vincent; Haynes, Kevin; Forde, Kimberly A; Goldberg, David S; Lewis, James D; Carbonari, Dena M; Leidl, Kimberly B F; Reddy, K Rajender; Nezamzadeh, Melissa S; Roy, Jason; Sha, Daohang; Marks, Amy R; De Boer, Jolanda; Schneider, Jennifer L; Strom, Brian L; Corley, Douglas A

    2015-12-01

    Few studies have evaluated the ability of laboratory tests to predict risk of acute liver failure (ALF) among patients with drug-induced liver injury (DILI). We aimed to develop a highly sensitive model to identify DILI patients at increased risk of ALF. We compared its performance with that of Hy's Law, which predicts severity of DILI based on levels of alanine aminotransferase or aspartate aminotransferase and total bilirubin, and validated the model in a separate sample. We conducted a retrospective cohort study of 15,353 Kaiser Permanente Northern California members who were diagnosed with DILI from 2004 through 2010, had liver aminotransferase levels above the upper limit of normal, and had no pre-existing liver disease. Thirty ALF events were confirmed by medical record review. Logistic regression was used to develop prognostic models for ALF based on laboratory results measured at DILI diagnosis. External validation was performed in a sample of 76 patients with DILI at the University of Pennsylvania. Hy's Law identified patients who developed ALF with a high level of specificity (0.92) and negative predictive value (0.99), but a low level of sensitivity (0.68) and positive predictive value (0.02). The model we developed, comprising data on platelet count and total bilirubin level, identified patients with ALF with a C statistic of 0.87 (95% confidence interval [CI], 0.76-0.96) and enabled calculation of a risk score (Drug-Induced Liver Toxicity ALF Score). We found a cut-off score that identified patients at high risk for ALF with a sensitivity value of 0.91 (95% CI, 0.71-0.99) and a specificity value of 0.76 (95% CI, 0.75-0.77). This cut-off score identified patients at high risk for ALF with a high level of sensitivity (0.89; 95% CI, 0.52-1.00) in the validation analysis. Hy's Law identifies patients with DILI at high risk for ALF with low sensitivity but high specificity. 
We developed a model (the Drug-Induced Liver Toxicity ALF Score) based on platelet count and total bilirubin level that identifies patients at increased risk for ALF with high sensitivity. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
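The performance measures used above (sensitivity, specificity, PPV, NPV) all follow from a 2x2 confusion matrix; the counts below are invented for illustration and merely reproduce the qualitative pattern of a high-NPV, low-PPV rule like Hy's Law applied to a rare outcome:

```python
# 2x2 confusion matrix for a binary prognostic cut-off.
# tp: flagged and developed ALF; fn: missed cases;
# fp: flagged but no ALF; tn: correctly cleared.
# These counts are hypothetical, not the study's data.
tp, fn, fp, tn = 26, 4, 3700, 11600

sensitivity = tp / (tp + fn)  # share of true cases the rule catches
specificity = tn / (tn + fp)  # share of non-cases the rule clears
ppv = tp / (tp + fp)          # probability a flagged patient is a true case
npv = tn / (tn + fn)          # probability a cleared patient is a true non-case

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"PPV={ppv:.3f} NPV={npv:.4f}")
```

With a rare outcome, even a specific rule yields a tiny PPV, which is why the abstract's 0.02 PPV can coexist with 0.92 specificity.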

  17. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate them. In this paper, an integrated model based on k-means clustering analysis and set pair analysis was established to evaluate the risks associated with water pollution in source water areas, with the weights of indicators determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where China's key source water area, the Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge, while Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern regions close to the main stream and reservoir were higher than those in the south. The risk levels indicated that five sources were at the lower risk level (level II), two at the moderate risk level (level III), one at the higher risk level (level IV), and three at the highest risk level (level V). Risks from industrial discharge are also higher than those from agriculture. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
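The entropy weight method used to determine the indicator weights can be sketched as follows; the decision matrix is a made-up example, not the Shiyan data:

```python
import numpy as np

# Entropy weight method: indicators whose values vary more across sources
# carry more information and receive larger weights. The matrix below is a
# made-up example (rows = pollution sources, columns = indicators).
X = np.array([
    [0.8, 0.2, 0.5],
    [0.4, 0.9, 0.3],
    [0.6, 0.1, 0.7],
    [0.2, 0.6, 0.4],
])

# Normalize each column to proportions
P = X / X.sum(axis=0)

# Entropy of each indicator (the constant k keeps e_j in [0, 1])
n = X.shape[0]
k = 1.0 / np.log(n)
e = -k * (P * np.log(P)).sum(axis=0)

# Weights: higher divergence (1 - e) -> larger weight
w = (1 - e) / (1 - e).sum()
print("entropy:", np.round(e, 3), "weights:", np.round(w, 3))
```

These objective weights would then feed the set pair analysis scoring of each pollution source.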

  18. Clearing margin system in the futures markets—Applying the value-at-risk model to Taiwanese data

    NASA Astrophysics Data System (ADS)

    Chiu, Chien-Liang; Chiang, Shu-Mei; Hung, Jui-Cheng; Chen, Yu-Lung

    2006-07-01

    This article sets out to investigate whether the TAIFEX has an adequate clearing margin adjustment system, using the unconditional coverage test, the conditional coverage test, and the mean relative scaled bias to assess the performance of three value-at-risk (VaR) models (the TAIFEX, RiskMetrics, and GARCH-t). For the same model, original and absolute returns are compared to explore which more accurately captures the true risk. For the same return, daily and tiered adjustment methods are examined to evaluate which corresponds to risk best. The results indicate that the clearing margin adjustment of the TAIFEX cannot reflect true risks. The adjustment rules, including the use of absolute returns and tiered adjustment of the clearing margin, have distorted VaR-based margin requirements. The results suggest that the TAIFEX should use original returns to compute VaR and a daily adjustment system to set the clearing margin. This approach would improve the efficiency of funds operation and the liquidity of the futures markets.
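The unconditional coverage test mentioned above is commonly implemented as Kupiec's likelihood-ratio test; a sketch with illustrative violation counts:

```python
import math

# Kupiec's unconditional coverage test: do VaR violations occur at the
# promised rate p? LR_uc is asymptotically chi-squared with 1 degree of
# freedom. The violation counts below are illustrative.

def lr_unconditional_coverage(n_obs, n_violations, p):
    x, n = n_violations, n_obs
    pi_hat = x / n  # observed violation rate
    log_l0 = x * math.log(p) + (n - x) * math.log(1 - p)            # null
    log_l1 = x * math.log(pi_hat) + (n - x) * math.log(1 - pi_hat)  # observed
    return -2 * (log_l0 - log_l1)

CHI2_1_95 = 3.841  # 5% critical value of chi-squared(1)

# A well-calibrated 95% VaR (about 5% violations) vs. a mis-calibrated one
for x in (52, 90):
    lr = lr_unconditional_coverage(1000, x, 0.05)
    verdict = "reject" if lr > CHI2_1_95 else "accept"
    print(f"{x}/1000 violations: LR_uc = {lr:.2f} -> {verdict} correct coverage")
```

The conditional coverage test extends this by additionally testing whether violations are independent rather than clustered.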

  19. Debris flow risk mapping on medium scale and estimation of prospective economic losses

    NASA Astrophysics Data System (ADS)

    Blahut, Jan; Sterlacchini, Simone

    2010-05-01

    Delimitation of potential zones affected by debris flow hazard, mapping of areas at risk, and estimation of future economic damage provide important information for spatial planners and local administrators in all countries endangered by this type of phenomenon. This study presents a medium scale (1:25 000 - 1:50 000) analysis applied in the Consortium of Mountain Municipalities of Valtellina di Tirano (Italian Alps, Lombardy Region). In this area a debris flow hazard map was coupled with information about the elements at risk to obtain monetary values of prospective damage. Two available hazard maps were obtained from GIS medium scale modelling. Probability estimations of debris flow occurrence were calculated using existing susceptibility maps and two sets of aerial images. Values were assigned to the elements at risk according to the official information on housing costs and land value from the Territorial Agency of the Lombardy Region. In the first risk map, vulnerability values were assumed to be 1. The second risk map uses three classes of vulnerability values, qualitatively estimated according to the possible propagation of the debris flow. Risk curves summarizing the possible economic losses were calculated. Finally, these maps of economic risk were compared to maps derived from qualitative evaluation of the values of the elements at risk.
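The coupling of hazard, vulnerability, and element value into monetary risk can be sketched per map cell as follows; all cell values are invented for illustration:

```python
# Economic risk per map cell: risk = hazard probability x vulnerability x
# monetary value of the elements at risk. The cell values below are invented.

cells = [
    # (annual hazard probability, vulnerability in [0, 1], exposed value in EUR)
    (0.02, 1.0, 350_000),    # first risk map in the study assumes vulnerability 1
    (0.02, 0.6, 350_000),    # qualitatively estimated vulnerability class
    (0.005, 0.3, 1_200_000),
]

risk_per_cell = [p * v * value for p, v, value in cells]
total_expected_loss = sum(risk_per_cell)

for (p, v, value), r in zip(cells, risk_per_cell):
    print(f"p={p}, V={v}, value={value} EUR -> expected annual loss {r:.0f} EUR")
print(f"total expected annual loss: {total_expected_loss:.0f} EUR")
```

Summing such expected losses over probability scenarios is what produces the risk curves mentioned in the abstract.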

  20. Estimation of Value-at-Risk for Energy Commodities via CAViaR Model

    NASA Astrophysics Data System (ADS)

    Xiliang, Zhao; Xi, Zhu

    This paper uses the Conditional Autoregressive Value at Risk (CAViaR) model proposed by Engle and Manganelli (2004) to evaluate the value-at-risk for daily spot prices of Brent crude oil and West Texas Intermediate crude oil covering the period May 21st, 1987 to November 18th, 2008. The accuracy of the estimates from the CAViaR model, Normal-GARCH, and GED-GARCH is then compared. The results show that all the methods do a good job at the low confidence level (95%): GED-GARCH is the best for the spot WTI price, while Normal-GARCH and Adaptive-CAViaR are the best for the spot Brent price. However, at the high confidence level (99%), Normal-GARCH does a good job for spot WTI, while GED-GARCH and all four CAViaR specifications do well for the spot Brent price; Normal-GARCH does badly for the spot Brent price. The results suggest that CAViaR performs as well as GED-GARCH, since CAViaR directly models the quantile autoregression, but it does not outperform GED-GARCH, although it does outperform Normal-GARCH.
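The symmetric absolute value (SAV) specification of Engle and Manganelli's CAViaR can be sketched as a quantile recursion; the coefficients below are assumed rather than estimated by minimizing the quantile loss, and the returns are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
returns = rng.normal(0, 0.02, size=500)  # stand-in for crude oil spot returns

# Symmetric absolute value (SAV) CAViaR of Engle & Manganelli (2004):
#   q_t = beta0 + beta1 * q_{t-1} + beta2 * |r_{t-1}|
# The betas below are assumed, not estimated; in the paper they would be
# fitted by minimizing the quantile (pinball) loss.
beta0, beta1, beta2 = -0.001, 0.85, -0.25

q = np.empty_like(returns)
q[0] = np.quantile(returns[:100], 0.05)  # initialize at the empirical quantile
for t in range(1, len(returns)):
    q[t] = beta0 + beta1 * q[t - 1] + beta2 * abs(returns[t - 1])

hit_rate = (returns < q).mean()  # share of returns below the modeled quantile
print(f"in-sample hit rate: {hit_rate:.3f} (target 0.05)")
```

The recursion lets the modeled quantile widen after large absolute returns and decay back afterwards, which is the autoregressive behavior the model name refers to.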

  21. Geographic Mapping as a Tool for Identifying Communities at High Risk for Fires.

    PubMed

    Fahey, Erin; Lehna, Carlee; Hanchette, Carol; Coty, Mary-Beth

    2016-01-01

    The purpose of this study was to evaluate whether the sample of older adults in a home fire safety (HFS) study captured participants living in the areas at highest risk for fire occurrence. The secondary aim was to identify high risk areas to focus future HFS interventions. Geographic information systems software was used to identify census tracts where study participants resided. Census data for these tracts were compared with participant data based on seven risk factors (ie, age greater than 65 years, nonwhite race, below high school education, low socioeconomic status, rented housing, year home built, home value) previously identified in a fire risk model. The distribution of participants and census tracts among risk categories determined how well higher risk census tracts were sampled. Of the 46 census tracts where the HFS intervention was implemented, 78% (n = 36) were identified as high or severe risk according to the fire risk model. Study participants' means for median annual family income (P < .0001) and median home value (P < .0001) were significantly lower than the census tract means (n = 46), indicating participants were at higher risk of fire occurrence. Of the 92 census tracts identified as high or severe risk in the entire county, the study intervention was implemented in 39% (n = 36), indicating 56 census tracts as potential areas for future HFS interventions. The geographic information system-based fire risk model is an underutilized but important tool for practice that allows community agencies to develop, plan, and evaluate their outreach efforts and ensure the most effective use of scarce resources.

  22. Biomarkers and low risk in heart failure. Data from COACH and TRIUMPH.

    PubMed

    Meijers, Wouter C; de Boer, Rudolf A; van Veldhuisen, Dirk J; Jaarsma, Tiny; Hillege, Hans L; Maisel, Alan S; Di Somma, Salvatore; Voors, Adriaan A; Peacock, W Frank

    2015-12-01

Traditionally, risk stratification in heart failure (HF) emphasizes assessment of high risk. We aimed to determine if biomarkers could identify patients with HF at low risk for death or HF rehospitalization. This analysis was a substudy of The Coordinating Study Evaluating Outcomes of Advising and Counselling in Heart Failure (COACH) trial. Enrolment of HF patients occurred before discharge. We defined low risk as the absence of death and/or HF rehospitalizations at 180 days. We tested a diverse group of 29 biomarkers on top of a clinical risk model, with and without N-terminal pro-B-type natriuretic peptide (NT-proBNP), and defined the low-risk biomarker cut-off at the 10th percentile associated with high positive predictive value. The best performing biomarkers, together with NT-proBNP and cardiac troponin I (cTnI), were re-evaluated in a validation cohort of 285 HF patients. Of the 592 eligible COACH patients, mean (± SD) age was 71 (± 11) years and median (IQR) NT-proBNP was 2521 (1301-5634) pg/mL. Logistic regression analysis showed that only galectin-3, fully adjusted, was significantly associated with the absence of events at 180 days (OR 8.1, 95% confidence interval 1.06-50.0, P = 0.039). Galectin-3 also showed incremental value when added to the clinical risk model without NT-proBNP (increase in area under the curve from 0.712 to 0.745, P = 0.04). However, no biomarker showed significant improvement by net reclassification improvement on top of the clinical risk model, with or without NT-proBNP. We confirmed our results regarding galectin-3, NT-proBNP, and cTnI in the independent validation cohort. We describe the value of various biomarkers to define low risk, and demonstrate that galectin-3 identifies HF patients at (very) low risk for 30-day and 180-day mortality and HF rehospitalizations after an episode of acute HF. Such patients might be safely discharged. © 2015 The Authors European Journal of Heart Failure © 2015 European Society of Cardiology.

  3. Mathematical modelling of risk reduction in reinsurance

    NASA Astrophysics Data System (ADS)

    Balashov, R. B.; Kryanev, A. V.; Sliva, D. E.

    2017-01-01

The paper presents a mathematical model of efficient portfolio formation in the reinsurance markets. The presented approach provides the optimal ratio between the expected value of return and the risk of yield values falling below a certain level. The uncertainty in the return values arises from the use of expert evaluations and preliminary calculations, which yield expected return values and the corresponding risk levels. The proposed method allows for implementation of computationally simple schemes and algorithms for numerical calculation of the structure of the efficient portfolios of reinsurance contracts of a given insurance company.

  4. Can shoulder dystocia be reliably predicted?

    PubMed

    Dodd, Jodie M; Catcheside, Britt; Scheil, Wendy

    2012-06-01

To evaluate factors reported to increase the risk of shoulder dystocia, and to evaluate their predictive value at a population level. The South Australian Pregnancy Outcome Unit's population database from 2005 to 2010 was accessed to determine the occurrence of shoulder dystocia in addition to reported risk factors, including age, parity, self-reported ethnicity, presence of diabetes and infant birth weight. Odds ratios (with 95% confidence intervals) for shoulder dystocia were calculated for each risk factor, which were then incorporated into a logistic regression model. Test characteristics for each variable in predicting shoulder dystocia were calculated. As a proportion of all births, the reported rate of shoulder dystocia increased significantly from 0.95% in 2005 to 1.38% in 2010 (P = 0.0002). Using a logistic regression model, induction of labour and infant birth weight greater than both 4000 and 4500 g were identified as significant independent predictors of shoulder dystocia. Risk factors, whether considered alone or incorporated into the logistic regression model, were poorly predictive of the occurrence of shoulder dystocia. While there are a number of factors associated with an increased risk of shoulder dystocia, none are of sufficient sensitivity or positive predictive value to allow their use clinically to reliably and accurately identify the occurrence of shoulder dystocia. © 2012 The Authors ANZJOG © 2012 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  5. Quantile uncertainty and value-at-risk model risk.

    PubMed

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
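    The quantile estimates at the heart of this methodology can be illustrated with a minimal historical-simulation sketch. The loss series, confidence level, and function name below are illustrative assumptions, not details from the article:

```python
import numpy as np

def historical_var(pnl, alpha=0.01):
    """One-day Value-at-Risk as the empirical alpha-quantile of losses.

    pnl: daily profit-and-loss values (positive = gain).
    Returns VaR as a positive number: the loss level exceeded with
    probability alpha under the empirical distribution.
    """
    losses = -np.asarray(pnl, dtype=float)
    return float(np.quantile(losses, 1.0 - alpha))

# Hypothetical daily P&L series (illustration only, not bank data).
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1.0e6, size=1000)
var_99 = historical_var(pnl, alpha=0.01)   # 99% one-day VaR
```

    A model-risk adjustment in the spirit of the article would then compare such a quantile estimate against a benchmark model's quantile and add capital for the discrepancy.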

  6. [Theoretical model study about the application risk of high risk medical equipment].

    PubMed

    Shang, Changhao; Yang, Fenghui

    2014-11-01

To establish a theoretical model for monitoring the risk of high-risk medical equipment at the site of application. The application site is regarded as a system containing several sub-systems, each consisting of several risk-estimating indicators. After each indicator is quantized, the quantized values are multiplied by their corresponding weights and the products accumulated, yielding a risk-estimating value for each sub-system. Following the same calculation, the sub-system risk values are multiplied by their corresponding weights and the products accumulated; the cumulative sum is the status indicator of the high-risk medical equipment at the application site, and it reflects the equipment's application risk. The resulting theoretical model can monitor the application risk of high-risk medical equipment on site dynamically and specifically.
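    The two-level weighted accumulation described above can be sketched as follows. The sub-system names, quantized indicator values, and weights are hypothetical placeholders, not values from the study:

```python
def weighted_score(values, weights):
    """Multiply each quantized value by its weight and accumulate the products."""
    assert len(values) == len(weights)
    return sum(v * w for v, w in zip(values, weights))

# Hypothetical sub-systems of one application site, each with quantized
# indicator values and per-indicator weights (illustrative only).
subsystems = [
    {"values": [0.8, 0.6], "weights": [0.7, 0.3]},            # e.g. maintenance
    {"values": [0.4, 0.9, 0.5], "weights": [0.2, 0.5, 0.3]},  # e.g. operation
]
subsystem_weights = [0.6, 0.4]   # weight of each sub-system in the site score

# Risk-estimating value of each sub-system, then the site status indicator.
sub_scores = [weighted_score(s["values"], s["weights"]) for s in subsystems]
status_indicator = weighted_score(sub_scores, subsystem_weights)
```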

  7. Evaluating critical uncertainty thresholds in a spatial model of forest pest invasion risk

    Treesearch

    Frank H. Koch; Denys Yemshanov; Daniel W. McKenney; William D. Smith

    2009-01-01

    Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model’s numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for...

  8. Combined Hydrologic (AGWA-KINEROS2) and Hydraulic (HEC2) Modeling for Post-Fire Runoff and Inundation Risk Assessment through a Set of Python Tools

    NASA Astrophysics Data System (ADS)

    Barlow, J. E.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.

    2016-12-01

Wildfires in the Western United States can alter landscapes by removing vegetation and changing soil properties. These altered landscapes produce more runoff than pre-fire landscapes, which can lead to post-fire flooding that damages infrastructure and impairs natural resources. Resources, structures, historical artifacts and other assets that could be impacted by increased runoff are considered values at risk. The Automated Geospatial Watershed Assessment tool (AGWA) allows users to quickly set up and execute the Kinematic Runoff and Erosion model (KINEROS2 or K2) in the ESRI ArcMap environment. The AGWA-K2 workflow leverages the visualization capabilities of GIS to facilitate rapid watershed assessments for post-fire planning efforts. High relative change in peak discharge, as simulated by K2, provides a visual and numeric indicator of those channels in the watershed that should be evaluated for more detailed analysis, especially if values at risk are within or near the channel. Modeling inundation extent along a channel would provide more specific guidance about risk along that channel. HEC-2 and HEC-RAS can be used for hydraulic modeling efforts at the reach and river-system scale. These models have been used to delineate flood boundaries and, accordingly, flood risk. However, data collection and organization for hydraulic models can be time consuming, and therefore a combined hydrologic-hydraulic modeling approach is not often employed for rapid assessments. A simplified approach could streamline this process and provide managers with a simple workflow and tool to perform a quick risk assessment for a single reach. By focusing on a single reach highlighted by a large relative change in peak discharge, data collection efforts can be minimized and the hydraulic computations can be performed to supplement risk analysis.
The incorporation of hydraulic analysis through a suite of Python tools (as outlined by HEC-2) with AGWA-K2 will allow more rapid applications of combined hydrologic-hydraulic modeling. This combined modeling approach is built in the ESRI ArcGIS application to enable rapid model preparation, execution and result visualization for risk assessment in post-fire environments.

  9. EVALUATING RISK-PREDICTION MODELS USING DATA FROM ELECTRONIC HEALTH RECORDS.

    PubMed

    Wang, L E; Shaw, Pamela A; Mathelier, Hansie M; Kimmel, Stephen E; French, Benjamin

    2016-03-01

    The availability of data from electronic health records facilitates the development and evaluation of risk-prediction models, but estimation of prediction accuracy could be limited by outcome misclassification, which can arise if events are not captured. We evaluate the robustness of prediction accuracy summaries, obtained from receiver operating characteristic curves and risk-reclassification methods, if events are not captured (i.e., "false negatives"). We derive estimators for sensitivity and specificity if misclassification is independent of marker values. In simulation studies, we quantify the potential for bias in prediction accuracy summaries if misclassification depends on marker values. We compare the accuracy of alternative prognostic models for 30-day all-cause hospital readmission among 4548 patients discharged from the University of Pennsylvania Health System with a primary diagnosis of heart failure. Simulation studies indicate that if misclassification depends on marker values, then the estimated accuracy improvement is also biased, but the direction of the bias depends on the direction of the association between markers and the probability of misclassification. In our application, 29% of the 1143 readmitted patients were readmitted to a hospital elsewhere in Pennsylvania, which reduced prediction accuracy. Outcome misclassification can result in erroneous conclusions regarding the accuracy of risk-prediction models.
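    A small simulation can illustrate the mechanism this abstract describes: when events are missed ("false negatives") independently of the marker, estimated sensitivity is essentially unaffected (captured events are a random subsample of true events), while estimated specificity is biased because missed events contaminate the non-event group. The marker distributions, cut-off, and capture rate below are arbitrary assumptions, not parameters from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
event = rng.random(n) < 0.10                     # true outcome status
# Marker is higher, on average, for true events.
marker = rng.normal(loc=np.where(event, 1.0, 0.0), scale=1.0)
positive = marker > 0.5                          # marker-based prediction

# Miss 30% of events at random, independent of the marker value;
# uncaptured events are recorded as non-events in the health record.
observed_event = event & (rng.random(n) < 0.70)

def sens_spec(pred, outcome):
    """Empirical sensitivity and specificity of pred against outcome."""
    return pred[outcome].mean(), (~pred[~outcome]).mean()

true_sens, true_spec = sens_spec(positive, event)
obs_sens, obs_spec = sens_spec(positive, observed_event)
# obs_sens tracks true_sens closely; obs_spec is biased downward here,
# since the missed events (with higher markers) sit among the "non-events".
```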

  10. Predicting nonstationary flood frequencies: Evidence supports an updated stationarity thesis in the United States

    NASA Astrophysics Data System (ADS)

    Luke, Adam; Vrugt, Jasper A.; AghaKouchak, Amir; Matthew, Richard; Sanders, Brett F.

    2017-07-01

    Nonstationary extreme value analysis (NEVA) can improve the statistical representation of observed flood peak distributions compared to stationary (ST) analysis, but management of flood risk relies on predictions of out-of-sample distributions for which NEVA has not been comprehensively evaluated. In this study, we apply split-sample testing to 1250 annual maximum discharge records in the United States and compare the predictive capabilities of NEVA relative to ST extreme value analysis using a log-Pearson Type III (LPIII) distribution. The parameters of the LPIII distribution in the ST and nonstationary (NS) models are estimated from the first half of each record using Bayesian inference. The second half of each record is reserved to evaluate the predictions under the ST and NS models. The NS model is applied for prediction by (1) extrapolating the trend of the NS model parameters throughout the evaluation period and (2) using the NS model parameter values at the end of the fitting period to predict with an updated ST model (uST). Our analysis shows that the ST predictions are preferred, overall. NS model parameter extrapolation is rarely preferred. However, if fitting period discharges are influenced by physical changes in the watershed, for example from anthropogenic activity, the uST model is strongly preferred relative to ST and NS predictions. The uST model is therefore recommended for evaluation of current flood risk in watersheds that have undergone physical changes. Supporting information includes a MATLAB® program that estimates the (ST/NS/uST) LPIII parameters from annual peak discharge data through Bayesian inference.

  11. Biomechanical, psychosocial and individual risk factors predicting low back functional impairment among furniture distribution employees

    PubMed Central

    Ferguson, Sue A.; Allread, W. Gary; Burr, Deborah L.; Heaney, Catherine; Marras, William S.

    2013-01-01

Background Biomechanical, psychosocial and individual risk factors for low back disorder have been studied extensively; however, few researchers have examined all three types of risk factor together. The objective of this study was to develop a low back disorder risk model for furniture distribution workers using biomechanical, psychosocial and individual risk factors. Methods This was a prospective study with a six-month follow-up time. There were 454 subjects at 9 furniture distribution facilities enrolled in the study. Biomechanical exposure was evaluated using the American Conference of Governmental Industrial Hygienists (2001) lifting threshold limit values for low back injury risk. Psychosocial and individual risk factors were evaluated via questionnaires. Low back functional status was measured using the lumbar motion monitor. Low back disorder cases were defined as a loss of low back functional performance of −0.14 or more. Findings There were 92 cases of meaningful loss in low back functional performance and 185 non-cases. A multivariate logistic regression model included baseline functional performance probability, facility, perceived workload, intermediate reach distance, number of exertions above threshold limit values, job tenure, manual material handling, and age, which combined to provide a model sensitivity of 68.5% and specificity of 71.9%. Interpretation The results of this study indicate which biomechanical, individual and psychosocial risk factors are important, as well as how much of each risk factor is too much, resulting in increased risk of low back disorder among furniture distribution workers. PMID:21955915

  12. Development and validation of a risk model for identification of non-neutropenic, critically ill adult patients at high risk of invasive Candida infection: the Fungal Infection Risk Evaluation (FIRE) Study.

    PubMed

    Harrison, D; Muskett, H; Harvey, S; Grieve, R; Shahin, J; Patel, K; Sadique, Z; Allen, E; Dybowski, R; Jit, M; Edgeworth, J; Kibbler, C; Barnes, R; Soni, N; Rowan, K

    2013-02-01

    There is increasing evidence that invasive fungal disease (IFD) is more likely to occur in non-neutropenic patients in critical care units. A number of randomised controlled trials (RCTs) have evaluated antifungal prophylaxis in non-neutropenic, critically ill patients, demonstrating a reduction in the risk of proven IFD and suggesting a reduction in mortality. It is necessary to establish a method to identify and target antifungal prophylaxis at those patients at highest risk of IFD, who stand to benefit most from any antifungal prophylaxis strategy. To develop and validate risk models to identify non-neutropenic, critically ill adult patients at high risk of invasive Candida infection, who would benefit from antifungal prophylaxis, and to assess the cost-effectiveness of targeting antifungal prophylaxis to high-risk patients based on these models. Systematic review, prospective data collection, statistical modelling, economic decision modelling and value of information analysis. Ninety-six UK adult general critical care units. Consecutive admissions to participating critical care units. None. Invasive fungal disease, defined as a blood culture or sample from a normally sterile site showing yeast/mould cells in a microbiological or histopathological report. For statistical and economic modelling, the primary outcome was invasive Candida infection, defined as IFD-positive for Candida species. Systematic review: Thirteen articles exploring risk factors, risk models or clinical decision rules for IFD in critically ill adult patients were identified. Risk factors reported to be significantly associated with IFD were included in the final data set for the prospective data collection. Data were collected on 60,778 admissions between July 2009 and March 2011. Overall, 383 patients (0.6%) were admitted with or developed IFD. The majority of IFD patients (94%) were positive for Candida species. The most common site of infection was blood (55%). 
The incidence of IFD identified in-unit was 4.7 cases per 1000 admissions; for unit-acquired IFD it was 3.2 cases per 1000 admissions. Statistical modelling: Risk models were developed at admission to the critical care unit, at 24 hours and at the end of calendar day 3. The risk model at admission had fair discrimination (c-index 0.705). Discrimination improved at 24 hours (c-index 0.823) and this was maintained at the end of calendar day 3 (c-index 0.835). There was a drop in model performance in the validation sample. Economic decision model: Irrespective of risk threshold, incremental quality-adjusted life-years of prophylaxis strategies compared with current practice were positive but small compared with the incremental costs. Incremental net benefits of each prophylaxis strategy compared with current practice were all negative. Cost-effectiveness acceptability curves showed that current practice was the strategy most likely to be cost-effective. Across all parameters in the decision model, results indicated that the value of further research for the whole population of interest might be high relative to the research costs. The results of the Fungal Infection Risk Evaluation (FIRE) Study, derived from a highly representative sample of adult general critical care units across the UK, indicated a low incidence of IFD among non-neutropenic, critically ill adult patients. IFD was associated with substantially higher mortality, more intensive organ support and longer length of stay. Risk modelling produced simple risk models that provided acceptable discrimination for identifying patients at 'high risk' of invasive Candida infection. Results of the economic model suggested that the current most cost-effective treatment strategy for prophylactic use of systemic antifungal agents among non-neutropenic, critically ill adult patients admitted to NHS adult general critical care units is a strategy of no risk assessment and no antifungal prophylaxis. 
Funding for this study was provided by the Health Technology Assessment programme of the National Institute for Health Research.

  13. The necessity of sociodemographic status adjustment in hospital value rankings for perforated appendicitis in children.

    PubMed

    Tian, Yao; Sweeney, John F; Wulkan, Mark L; Heiss, Kurt F; Raval, Mehul V

    2016-06-01

    Hospitals are increasingly focused on demonstration of high-value care for common surgical procedures. Although sociodemographic status (SDS) factors have been tied to various surgical outcomes, the impact of SDS factors on hospital value rankings has not been well explored. Our objective was to examine effects of SDS factors on high-value surgical care at the patient level, and to illustrate the importance of SDS adjustment when evaluating hospital-level performance. Perforated appendicitis hospitalizations were identified from the 2012 Kids' Inpatient Database. The primary outcome of interest was high-value care as defined by evaluation of duration of stay and cost. SDS factors included race, health insurance type, median household income, and patient location. The impact of SDS on high-value care was estimated using regression models after accounting for hospital-level variation. Risk-adjusted value rankings were compared before and after adjustment for SDS. From 9,986 hospitalizations, 998 high-value encounters were identified. African Americans were less likely to experience high-value care compared with white patients after adjusting for all SDS variables. Although private insurance and living in nonmetro counties were associated independently with high-value care, the effects were attenuated in the fully adjusted models. For the 136 hospitals ranked according to risk-adjusted value status, 59 hospitals' rankings improved after adjustment and 53 hospitals' rankings declined. After adjustment for patient and hospital factors, SDS has a small but significant impact on risk-adjusted hospital performance ranking for pediatric appendicitis. Adjustment for SDS should be considered in future comparative performance assessment. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. The Lack of Utility of Circulating Biomarkers of Inflammation and Endothelial Dysfunction for Type 2 Diabetes Risk Prediction Among Postmenopausal Women

    PubMed Central

    Chao, Chun; Song, Yiqing; Cook, Nancy; Tseng, Chi-Hong; Manson, JoAnn E.; Eaton, Charles; Margolis, Karen L.; Rodriguez, Beatriz; Phillips, Lawrence S.; Tinker, Lesley F.; Liu, Simin

    2011-01-01

Background Recent studies have linked plasma markers of inflammation and endothelial dysfunction to type 2 diabetes mellitus (DM) development. However, the utility of these novel biomarkers for type 2 DM risk prediction remains uncertain. Methods The Women’s Health Initiative Observational Study (WHIOS), a prospective cohort, and a nested case-control study within the WHIOS of 1584 incident type 2 DM cases and 2198 matched controls were used to evaluate the utility of plasma markers of inflammation and endothelial dysfunction for type 2 DM risk prediction. Between September 1994 and December 1998, 93,676 women aged 50 to 79 years were enrolled in the WHIOS. Fasting plasma levels of glucose, insulin, white blood cells, tumor necrosis factor receptor 2, interleukin 6, high-sensitivity C-reactive protein, E-selectin, soluble intercellular adhesion molecule 1, and vascular cell adhesion molecule 1 were measured using blood samples collected at baseline. A series of prediction models including traditional risk factors and novel plasma markers were evaluated on the basis of global model fit, model discrimination, net reclassification improvement, and positive and negative predictive values. Results Although white blood cell count and levels of interleukin 6, high-sensitivity C-reactive protein, and soluble intercellular adhesion molecule 1 significantly enhanced model fit, none of the inflammatory and endothelial dysfunction markers improved model discrimination (area under the receiver operating characteristic curve, 0.93 vs 0.93), net reclassification, or predictive values (positive, 0.22 vs 0.24; negative, 0.99 vs 0.99 [using 15% 6-year type 2 DM risk as the cutoff]) compared with traditional risk factors. Similar results were obtained in ethnic-specific analyses. 
Conclusion Beyond traditional risk factors, measurement of plasma markers of systemic inflammation and endothelial dysfunction contributes relatively little additional value to clinical type 2 DM risk prediction in a multiethnic cohort of postmenopausal women. PMID:20876407

  15. Developing Risk Prediction Models for Kidney Injury and Assessing Incremental Value for Novel Biomarkers

    PubMed Central

    Kerr, Kathleen F.; Meisner, Allison; Thiessen-Philbrook, Heather; Coca, Steven G.

    2014-01-01

    The field of nephrology is actively involved in developing biomarkers and improving models for predicting patients’ risks of AKI and CKD and their outcomes. However, some important aspects of evaluating biomarkers and risk models are not widely appreciated, and statistical methods are still evolving. This review describes some of the most important statistical concepts for this area of research and identifies common pitfalls. Particular attention is paid to metrics proposed within the last 5 years for quantifying the incremental predictive value of a new biomarker. PMID:24855282

  16. Extensions of criteria for evaluating risk prediction models for public health applications.

    PubMed

    Pfeiffer, Ruth M

    2013-04-01

    We recently proposed two novel criteria to assess the usefulness of risk prediction models for public health applications. The proportion of cases followed, PCF(p), is the proportion of individuals who will develop disease who are included in the proportion p of individuals in the population at highest risk. The proportion needed to follow-up, PNF(q), is the proportion of the general population at highest risk that one needs to follow in order that a proportion q of those destined to become cases will be followed (Pfeiffer, R.M. and Gail, M.H., 2011. Two criteria for evaluating risk prediction models. Biometrics 67, 1057-1065). Here, we extend these criteria in two ways. First, we introduce two new criteria by integrating PCF and PNF over a range of values of q or p to obtain iPCF, the integrated PCF, and iPNF, the integrated PNF. A key assumption in the previous work was that the risk model is well calibrated. This assumption also underlies novel estimates of iPCF and iPNF based on observed risks in a population alone. The second extension is to propose and study estimates of PCF, PNF, iPCF, and iPNF that are consistent even if the risk models are not well calibrated. These new estimates are obtained from case-control data when the outcome prevalence in the population is known, and from cohort data, with baseline covariates and observed health outcomes. We study the efficiency of the various estimates and propose and compare tests for comparing two risk models, both of which were evaluated in the same validation data.
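    The two criteria defined above translate directly into code. The sketch below follows the definitions in the abstract; the function names and the toy risk vector in the usage note are illustrative assumptions:

```python
import numpy as np

def pcf(risk, case, p):
    """PCF(p): proportion of cases captured within the fraction p of the
    population at highest predicted risk."""
    risk, case = np.asarray(risk, float), np.asarray(case, bool)
    order = np.argsort(-risk)                     # highest risk first
    n_follow = int(np.ceil(p * len(risk)))
    return case[order[:n_follow]].sum() / case.sum()

def pnf(risk, case, q):
    """PNF(q): smallest fraction of the population (followed in order of
    decreasing risk) needed so that a proportion q of cases is captured."""
    risk, case = np.asarray(risk, float), np.asarray(case, bool)
    order = np.argsort(-risk)
    cum_cases = np.cumsum(case[order]) / case.sum()
    n_follow = int(np.searchsorted(cum_cases, q)) + 1
    return n_follow / len(risk)
```

    The integrated criteria iPCF and iPNF from the abstract would then be obtained by averaging these quantities over a range of p or q values (e.g. with a trapezoidal rule).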

  17. A stochastic inventory management model for a dual sourcing supply chain with disruptions

    NASA Astrophysics Data System (ADS)

    Iakovou, Eleftherios; Vlachos, Dimitrios; Xanthopoulos, Anastasios

    2010-03-01

As companies continue to globalise their operations and outsource a significant portion of their value chain activities, they often end up relying heavily on order replenishments from distant suppliers. The explosion in long-distance sourcing is exposing supply chains and shareholder value to ever-increasing operational and disruption risks. It is well established, both in academia and in real-world business environments, that resource flexibility is an effective method for hedging against supply chain disruption risks. In this contextual framework, we propose a single-period stochastic inventory decision-making model that could be employed for capturing the trade-off between inventory policies and disruption risks for an unreliable dual sourcing supply network, for both the capacitated and uncapacitated cases. Through the developed model, we obtain some important managerial insights and evaluate the merit of contingency strategies in managing uncertain supply chains.

  18. Facility-specific radiation exposure risks and their implications for radiation workers at Department of Energy laboratories

    NASA Astrophysics Data System (ADS)

    Davis, Adam Christopher

    This research develops a new framework for evaluating the occupational risks of exposure to hazardous substances in any setting where As Low As Reasonably Achievable (ALARA) practices are mandated or used. The evaluation is performed by developing a hypothesis-test-based procedure for evaluating the homogeneity of various epidemiological cohorts, and thus the appropriateness of the application of aggregate data-pooling techniques to those cohorts. A statistical methodology is then developed as an alternative to aggregate pooling for situations in which individual cohorts show heterogeneity between them and are thus unsuitable for pooled analysis. These methods are then applied to estimate the all-cancer mortality risks incurred by workers at four Department-of-Energy nuclear weapons laboratories. Both linear, no-threshold and dose-bin averaged risks are calculated and it is further shown that aggregate analysis tends to overestimate the risks with respect to those calculated by the methods developed in this work. The risk estimates developed in Chapter 2 are, in Chapter 3, applied to assess the risks to workers engaged in americium recovery operations at Los Alamos National Laboratory. The work described in Chapter 3 develops a full radiological protection assessment for the new americium recovery project, including development of exposure cases, creation and modification of MCNP5 models, development of a time-and-motion study, and the final synthesis of all data. This work also develops a new risk-based method of determining whether administrative controls, such as staffing increases, are ALARA-optimized. The EPA's estimate of the value of statistical life is applied to these risk estimates to determine a monetary value for risk. 
The rate of change of this "risk value" (marginal risk) is then compared with the rate of change of workers' compensations as additional workers are added to the project to reduce the dose (and therefore, presumably, risk) to each individual.

  19. Street Gangs: A Modeling Approach to Evaluating At Risk Youth

    DTIC Science & Technology

    2008-03-01

Ishikawa diagrams, named for Kaoru Ishikawa, stem from the area of quality control but have been used in many other areas (Ryan, 2000; Herrmann, 2001). [Only table-of-contents fragments of this record survive, referencing an example Ishikawa "fishbone" diagram and an example value hierarchy.]

  20. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values on hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model, HEC-RAS is selected and linked to Monte-Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high quality DEM for input data uncertainty minimisation and to improve determination accuracy on stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy to represent the estimated roughness values. Finally, Latin Hypercube Sampling has been used for generation of different sets of Manning roughness values and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data, from an extreme historical flash flood event, are used for validation of the method. The calibration process is based on a binary wet-dry reasoning with the use of Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
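    The roughness-sampling step described above can be sketched as follows. The lognormal distribution and its parameters for Manning's n are illustrative assumptions, not the distribution fitted in the study:

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

# Hypothetical fitted distribution for Manning's n in the main channel.
n_dist = stats.lognorm(s=0.25, scale=0.035)    # median n = 0.035

# Latin Hypercube Sample of 100 roughness values (one channel parameter).
sampler = qmc.LatinHypercube(d=1, seed=7)
u = sampler.random(n=100)                      # stratified uniforms in [0, 1)
manning_n = n_dist.ppf(u).ravel()              # map strata to roughness values

# In the workflow above, each sampled n would drive one hydraulic model run;
# the resulting wet/dry rasters are then averaged cell-wise into a flood
# inundation probability map.
```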

  1. A Discrete Event Simulation Model to Assess the Economic Value of a Hypothetical Pharmacogenomics Test for Statin-Induced Myopathy in Patients Initiating a Statin in Secondary Cardiovascular Prevention.

    PubMed

    Mitchell, Dominic; Guertin, Jason R; Dubois, Anick; Dubé, Marie-Pierre; Tardif, Jean-Claude; Iliza, Ange Christelle; Fanton-Aita, Fiorella; Matteau, Alexis; LeLorier, Jacques

    2018-04-01

    Statin (HMG-CoA reductase inhibitor) therapy is the mainstay of dyslipidemia treatment and reduces the risk of a cardiovascular (CV) event (CVE) by up to 35%. However, adherence to statin therapy is poor. One reason patients discontinue statin therapy is musculoskeletal pain and the associated risk of rhabdomyolysis. Research is ongoing to develop a pharmacogenomics (PGx) test for statin-induced myopathy as an alternative to the current diagnostic method, which relies on creatine kinase levels. The potential economic value of a PGx test for statin-induced myopathy is unknown. We developed a lifetime discrete event simulation (DES) model for patients 65 years of age initiating a statin after a first CVE consisting of either an acute myocardial infarction (AMI) or a stroke. The model evaluates the potential economic value of a hypothetical PGx test for diagnosing statin-induced myopathy. We assessed the model over the spectrum of test sensitivity and specificity parameters. Our model showed that a strategy with a perfect PGx test had an incremental cost-utility ratio of 4273 Canadian dollars ($Can) per quality-adjusted life year (QALY). The probabilistic sensitivity analysis shows that when the payer willingness-to-pay per QALY reaches $Can12,000, the PGx strategy is favored in 90% of the model simulations. We found that a strategy favoring patients staying on statin therapy is cost effective even if patients maintained on statins are at risk of rhabdomyolysis. Our results are explained by the fact that statins are highly effective in reducing CV risk in patients at high CV risk, and this benefit largely outweighs the risk of rhabdomyolysis.
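
    The headline figure above is an incremental cost-utility ratio, i.e. extra cost divided by extra QALYs. A one-line sketch with hypothetical numbers (not the study's inputs):

```python
def icur(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-utility ratio: extra cost per extra QALY
    of the new strategy versus the reference strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical lifetime figures: the PGx strategy costs $Can2,100
# more and yields 0.5 more QALYs than the no-test strategy.
ratio = icur(25_100, 9.5, 23_000, 9.0)  # 4200.0 $Can per QALY
```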

  2. Introduction of a new laboratory test: an econometric approach with the use of neural network analysis.

    PubMed

    Jabor, A; Vlk, T; Boril, P

    1996-04-15

    We designed a simulation model for the assessment of the financial risks involved when a new diagnostic test is introduced in the laboratory. The model is based on a neural network consisting of ten neurons and assumes that input entities can be assigned an appropriate uncertainty. Simulations are done on a 1-day interval basis. Risk analysis completes the model, and the financial effects are evaluated for a selected time period. The basic output of the simulation consists of total expenses and income during the simulation time, the net present value of the project at the end of the simulation, the total number of control samples during the simulation, the total number of patients evaluated and the total number of kits used.

  3. Pollution assessment and source apportionment of heavy metals in contaminated site soils

    NASA Astrophysics Data System (ADS)

    Zheng, Hongbo; Ma, Yan

    2018-03-01

    Pollution characteristics of heavy metals in soil were analyzed with a typical contaminated site as the case area. The pollution degree of each element was evaluated with the index of geoaccumulation (Igeo). The potential ecological risk of heavy metals was assessed with the potential ecological risk index model. Principal component analysis (PCA) was simultaneously carried out to identify the main sources of heavy metals in topsoils. The results indicated that: 1. Mean values of 11 metals in topsoils were greater than the respective soil background values, in the order Zn>Pb>V>Cr>Cu>Ni>Co>As>Sb>Cd>Hg; heavy metals thus showed a degree of accumulation in the research area, significantly affected by external factors. 2. Igeo results showed that Cd and Zn reached the strongly polluted degree, Pb was moderately to strongly polluted, Sb and Hg were moderately polluted, Cu, Co, Ni and Cr were unpolluted to moderately polluted, and V and As were unpolluted. 3. The potential ecological risk assessment placed Cd at very high risk, Hg at high risk, Pb at moderate risk and the other metals at low risk; the comprehensive risk of all the metals combined was very high. 4. PCA identified three main sources with their contributions: industrial activities (44.18%), traffic and burning dust (26.68%) and soil parent materials (12.20%).
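
    The geoaccumulation index used above has a standard form, Igeo = log2(Cn / (1.5·Bn)), and the pollution classes named in the abstract follow Müller's seven-class scale. A stdlib-Python sketch; the example concentration and background value are hypothetical:

```python
import math

def igeo(conc, background):
    """Geoaccumulation index: Igeo = log2(Cn / (1.5 * Bn)),
    where the factor 1.5 allows for natural background fluctuation."""
    return math.log2(conc / (1.5 * background))

def igeo_class(value):
    """Mueller's seven-class pollution scale."""
    bounds = [(0, "unpolluted"),
              (1, "unpolluted to moderately polluted"),
              (2, "moderately polluted"),
              (3, "moderately to strongly polluted"),
              (4, "strongly polluted"),
              (5, "strongly to extremely polluted")]
    for upper, label in bounds:
        if value <= upper:
            return label
    return "extremely polluted"

# Hypothetical Cd concentration 12 mg/kg against a 0.3 mg/kg background:
label = igeo_class(igeo(12, 0.3))
```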

  4. Forecasting risk along a river basin using a probabilistic and deterministic model for environmental risk assessment of effluents through ecotoxicological evaluation and GIS.

    PubMed

    Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente

    2009-12-20

    This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole-effluent toxicity testing and water flows along a specific river basin, and is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D. magna feeding test) were performed on grab samples from 5 wastewater treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements and precise distance measurements using Geographical Information Systems (GIS). The model was implemented in a spreadsheet and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassay results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, run for each month of one calendar year, showed a significant seasonal variation of risk and revealed that September represents the worst-case scenario, with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach, using Monte Carlo analysis, was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points, with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approaches reveal the presence of certain substances which might be causing sublethal effects in the aquatic species present in the Henares River.

  5. Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.

    PubMed

    Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans

    2017-01-01

    The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs versus that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as to increase human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.

  6. The European Thoracic Surgery Database project: modelling the risk of in-hospital death following lung resection.

    PubMed

    Berrisford, Richard; Brunelli, Alessandro; Rocco, Gaetano; Treasure, Tom; Utley, Martin

    2005-08-01

    To identify pre-operative factors associated with in-hospital mortality following lung resection and to construct a risk model that could be used prospectively to inform decisions and retrospectively to enable fair comparisons of outcomes. Data were submitted to the European Thoracic Surgery Database from 27 units in 14 countries. We analysed data concerning all patients who had a lung resection. Logistic regression was used with a random sample of 60% of cases to identify pre-operative factors associated with in-hospital mortality and to build a model of risk. The resulting model was tested on the remaining 40% of patients. A second model based on age and ppoFEV1% was developed for the risk of in-hospital death amongst tumour resection patients. Of the 3426 adult patients who had a first lung resection and for whom mortality data were available, 66 died within the same hospital admission. Within the data used for model development, dyspnoea (according to the Medical Research Council classification), ASA (American Society of Anesthesiologists) score, class of procedure and age were found to be significantly associated with in-hospital death in a multivariate analysis. The logistic model developed on these data displayed predictive value when tested on the remaining data. Two models of the risk of in-hospital death amongst adult patients undergoing lung resection have been developed. The models show predictive value and can be used to discern between high-risk and low-risk patients. Amongst the test data, the model developed for all diagnoses performed well at low risk, underestimated mortality at medium risk and overestimated mortality at high risk. The second model for resection of lung neoplasms was developed after establishing the performance of the first model and so could not be tested robustly. That said, we were encouraged by its performance over the entire range of estimated risk. 
The first of these two models could be regarded as an evaluation based on clinically available criteria while the second uses data obtained from objective measurement. We are optimistic that further model development and testing will provide a tool suitable for case mix adjustment.
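
    A logistic model of this kind combines the pre-operative factors into a risk via the logistic function. A sketch with hypothetical placeholder coefficients (the fitted values are not given in the abstract, and the function name is illustrative):

```python
import math

def inhospital_death_risk(coeffs, intercept, features):
    """Logistic risk model: p = 1 / (1 + exp(-(b0 + sum(bi * xi)))).
    The coefficients here are hypothetical placeholders, not the
    values fitted on the European Thoracic Surgery Database."""
    z = intercept + sum(coeffs[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative patient: age 70 (7 decades), ASA score 3, MRC dyspnoea grade 2.
coeffs = {"age_decades": 0.5, "asa_score": 0.6, "dyspnoea_grade": 0.4}
p = inhospital_death_risk(coeffs, -8.0,
                          {"age_decades": 7, "asa_score": 3, "dyspnoea_grade": 2})
```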

  7. Multifractal Value at Risk model

    NASA Astrophysics Data System (ADS)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper, a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR). The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.
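
    For contrast with the proposed MFVaR (whose construction the abstract does not detail), the conventional historical-simulation VaR that such models are benchmarked against is simply an empirical quantile of past losses. A minimal sketch with made-up returns:

```python
def historical_var(returns, confidence=0.99):
    """Historical-simulation Value at Risk: the empirical
    confidence-level quantile of the loss distribution,
    reported as a positive loss figure."""
    losses = sorted(-r for r in returns)
    idx = int(confidence * len(losses))
    return losses[min(idx, len(losses) - 1)]

# Ten hypothetical daily returns; the 90% VaR is the 90% empirical
# quantile of the corresponding losses.
rets = [0.01, -0.02, 0.003, -0.015, 0.007, -0.03, 0.012, -0.001, 0.004, -0.008]
var_90 = historical_var(rets, confidence=0.9)  # 0.03, i.e. a 3% loss
```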

  8. A site-specific farm-scale GIS approach for reducing groundwater contamination by pesticides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mulla, D.J.; Perillo, C.A.; Cogger, C.G.

    1996-05-01

    It has been proposed to vary pesticide applications by patterns in surface organic C to reduce the potential for contamination of groundwater. To evaluate the feasibility of this "precision farming" approach, data for carbofuran concentrations from 57 locations sampled to a depth of 180 cm were fit to the convective-dispersive equation. Fitted values for pore water velocity (v) ranged from 0.17 to 1.92 cm d⁻¹, with a mean of 0.68 cm d⁻¹. Values for dispersion (D) ranged from 0.29 to 13.35 cm² d⁻¹, with a mean of 2.57 cm² d⁻¹. With these data, risks of pesticide leaching were estimated at each location using the attenuation factor (AF) model and a dispersion-based leached factor (LF) model. Using the AF model gave two locations with a very high pesticide leaching risk, 6 with a low risk, and 2 with no risk. Using the LF model, 6 had a high risk, 13 had a medium risk, 18 had a low risk, and 20 had no risk. Pesticide leaching risks were not correlated with any measured surface soil properties. Much of the variability in leaching risk was because of velocity variations, so it would be incorrect to assume that surface organic C content controls the leaching risk. 30 refs., 1 fig., 3 tabs.
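
    The attenuation factor referred to above is commonly written as AF = exp(−ln 2 · tr / t½), where tr = d·RF·θFC/q is the travel time to groundwater; this form is an assumption recalled from the screening-model literature, not quoted from the abstract. A stdlib-Python sketch; only the 0.68 cm/d mean velocity comes from the abstract, and the remaining inputs are hypothetical:

```python
import math

def attenuation_factor(depth_cm, retardation, theta_fc, recharge_cm_d, half_life_d):
    """Attenuation factor: predicted fraction of surface-applied
    pesticide reaching groundwater. Travel time to depth d is
    t = d * RF * theta_fc / q; first-order decay over that time
    gives AF = exp(-ln(2) * t / t_half)."""
    travel_time_d = depth_cm * retardation * theta_fc / recharge_cm_d
    return math.exp(-math.log(2) * travel_time_d / half_life_d)

# Hypothetical inputs: 180 cm to groundwater, RF = 2, field-capacity
# water content 0.3, recharge 0.68 cm/day, 50-day half-life.
af = attenuation_factor(180, 2.0, 0.3, 0.68, 50)  # ~0.11
```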

  9. Developing risk prediction models for kidney injury and assessing incremental value for novel biomarkers.

    PubMed

    Kerr, Kathleen F; Meisner, Allison; Thiessen-Philbrook, Heather; Coca, Steven G; Parikh, Chirag R

    2014-08-07

    The field of nephrology is actively involved in developing biomarkers and improving models for predicting patients' risks of AKI and CKD and their outcomes. However, some important aspects of evaluating biomarkers and risk models are not widely appreciated, and statistical methods are still evolving. This review describes some of the most important statistical concepts for this area of research and identifies common pitfalls. Particular attention is paid to metrics proposed within the last 5 years for quantifying the incremental predictive value of a new biomarker. Copyright © 2014 by the American Society of Nephrology.

  10. The transition to emerging revenue models.

    PubMed

    Harris, John M; Hemnani, Rashi

    2013-04-01

    A financial assessment aimed at gauging the true impact of the healthcare industry's new value-based payment models for a health system should begin with separate analyses of the following: the direct contract results; the impact of volume changes on net income; the impact of operational improvements; and the net income at risk from competitor actions. The results of these four analyses should then be evaluated in combination to identify the ultimate impact of the new revenue models on the health system's bottom line.

  11. Validation of the Retinal Detachment after Open Globe Injury (RD-OGI) Score as an Effective Tool for Predicting Retinal Detachment.

    PubMed

    Brodowska, Katarzyna; Stryjewski, Tomasz P; Papavasileiou, Evangelia; Chee, Yewlin E; Eliott, Dean

    2017-05-01

    The Retinal Detachment after Open Globe Injury (RD-OGI) Score is a clinical prediction model that was developed at the Massachusetts Eye and Ear Infirmary to predict the risk of retinal detachment (RD) after open globe injury (OGI). This study sought to validate the RD-OGI Score in an independent cohort of patients. Retrospective cohort study. The predictive value of the RD-OGI Score was evaluated by comparing the original RD-OGI Scores of 893 eyes with OGI that presented between 1999 and 2011 (the derivation cohort) with 184 eyes with OGI that presented from January 1, 2012, to January 31, 2014 (the validation cohort). Three risk classes (low, moderate, and high) were created and logistic regression was undertaken to evaluate the optimal predictive value of the RD-OGI Score. A Kaplan-Meier survival analysis evaluated survival experience between the risk classes. Time to RD. At 1 year after OGI, 255 eyes (29%) in the derivation cohort and 66 eyes (36%) in the validation cohort were diagnosed with an RD. At 1 year, the low risk class (RD-OGI Scores 0-2) had a 3% detachment rate in the derivation cohort and a 0% detachment rate in the validation cohort, the moderate risk class (RD-OGI Scores 2.5-4.5) had a 29% detachment rate in the derivation cohort and a 35% detachment rate in the validation cohort, and the high risk class (RD-OGI scores 5-7.5) had a 73% detachment rate in the derivation cohort and an 86% detachment rate in the validation cohort. Regression modeling revealed the RD-OGI to be highly discriminative, especially 30 days after injury, with an area under the receiver operating characteristic curve of 0.939 in the validation cohort. Survival experience was significantly different depending upon the risk class (P < 0.0001, log-rank chi-square). The RD-OGI Score can reliably predict the future risk of developing an RD based on clinical variables that are present at the time of the initial evaluation after OGI. 
Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  12. Cost-benefit analysis of biopsy methods for suspicious mammographic lesions; discussion 994-5.

    PubMed

    Fahy, B N; Bold, R J; Schneider, P D; Khatri, V; Goodnight, J E

    2001-09-01

    Stereotactic core biopsy (SCB) is more cost-effective than needle-localized biopsy (NLB) for evaluation and treatment of mammographic lesions. A computer-generated mathematical model was developed based on clinical outcome modeling to estimate costs accrued during evaluation and treatment of suspicious mammographic lesions. Total costs were determined for evaluation and subsequent treatment of cancer when either SCB or NLB was used as the initial biopsy method. Cost was estimated by the cumulative work relative value units accrued. The risk of malignancy based on the Breast Imaging Reporting Data System (BIRADS) score and mammographic suspicion of ductal carcinoma in situ were varied to simulate common clinical scenarios. Total cost accumulated during evaluation and subsequent surgical therapy (if required). Evaluation of BIRADS 5 lesions (highly suggestive, risk of malignancy = 90%) resulted in equivalent relative value units for both techniques (SCB, 15.54; NLB, 15.47). Evaluation of lesions highly suspicious for ductal carcinoma in situ yielded similar total treatment relative value units (SCB, 11.49; NLB, 10.17). Only for evaluation of BIRADS 4 lesions (suspicious abnormality, risk of malignancy = 34%) was SCB more cost-effective than NLB (SCB, 7.65 vs. NLB, 15.66). No difference in cost-benefit was found when lesions highly suggestive of malignancy (BIRADS 5) or those suspicious for ductal carcinoma in situ were evaluated initially with SCB vs. NLB, thereby disproving the hypothesis. Only for intermediate-risk lesions (BIRADS 4) did initial evaluation with SCB yield a greater cost savings than with NLB.

  13. Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis

    NASA Astrophysics Data System (ADS)

    Lari, S.; Frattini, P.; Crosta, G. B.

    2009-04-01

    We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km² area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economic damage and loss of life for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information for evaluating the hazards, the value of the exposed elements (e.g., residential and industrial areas, population, lifelines, sensitive elements such as schools and hospitals) and the process-specific vulnerability, and due to a lack of knowledge of the processes (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, as for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, as for phenomenon intensity (e.g., depth of water during flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact, by assuming some variables to be homogeneously distributed or averaged over the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty through the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions). We developed a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is in all three cases around 30% of the expected value. 
    Each of the models, nevertheless, requires different assumptions and computational efforts, and provides results with different levels of detail.
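
    The contrast between Monte Carlo and FOSM propagation can be sketched for a simple multiplicative risk model R = hazard × vulnerability × exposed value. This is an illustrative assumption; the paper's risk model and input values are richer than this. Stdlib Python:

```python
import math
import random

def fosm_product(means, sds):
    """First Order Second Moment estimate for R = x1 * x2 * ... * xn
    with independent inputs: the mean is evaluated at the input means,
    the variance comes from a first-order Taylor expansion."""
    mean_r = math.prod(means)
    var_r = sum((mean_r / m) ** 2 * s ** 2 for m, s in zip(means, sds))
    return mean_r, math.sqrt(var_r)

def monte_carlo_product(means, sds, n=100_000, seed=1):
    """Monte Carlo counterpart: sample independent normal inputs
    and propagate each draw through the same product."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        r = 1.0
        for m, s in zip(means, sds):
            r *= rng.gauss(m, s)
        draws.append(r)
    mean_r = sum(draws) / n
    sd_r = math.sqrt(sum((d - mean_r) ** 2 for d in draws) / (n - 1))
    return mean_r, sd_r

# Hypothetical hazard x vulnerability x exposed-value inputs,
# each with roughly 10% coefficient of variation:
means, sds = [0.02, 0.5, 1.0e6], [0.002, 0.05, 1.0e5]
fosm_mean, fosm_sd = fosm_product(means, sds)
mc_mean, mc_sd = monte_carlo_product(means, sds)
```

    With mild input uncertainties the two estimates agree closely, consistent with the similar expected losses the abstract reports across the three propagation schemes.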

  14. Evaluation of viral load thresholds for predicting new WHO Stage 3 and 4 events in HIV-infected children receiving highly active antiretroviral therapy

    PubMed Central

    Siberry, George K; Harris, D. Robert; Oliveira, Ricardo Hugo; Krauss, Margot R.; Hofer, Cristina B.; Tiraboschi, Adriana Aparecida; Marques, Heloisa; Succi, Regina C.; Abreu, Thalita; Negra, Marinella Della; Mofenson, Lynne M.; Hazra, Rohan

    2012-01-01

    Background This study evaluated a wide range of viral load (VL) thresholds to identify a cut-point that best predicts new clinical events in children on stable highly active antiretroviral therapy (HAART). Methods Cox proportional hazards modeling was used to assess the adjusted risk of World Health Organization stage 3 or 4 clinical events (WHO events) as a function of time-varying CD4, VL, and hemoglobin values in a cohort study of Latin American children on HAART for ≥ 6 months. Models were fit using different VL cut-points between 400 and 50,000 copies/mL, with model fit evaluated on the basis of the minimum Akaike Information Criterion (AIC) value, a standard model fit statistic. Results Models were based on 67 subjects with WHO events out of 550 subjects on study. The VL cut-points of > 2600 copies/mL and > 32,000 copies/mL corresponded to the lowest AIC values and were associated with the highest hazard ratios [2.0 (p = 0.015) and 2.1 (p = 0.0058), respectively] for WHO events. Conclusions In HIV-infected Latin American children on stable HAART, two distinct VL thresholds (> 2,600 copies/mL and > 32,000 copies/mL) were identified for predicting children at significantly increased risk of HIV-related clinical illness, after accounting for CD4 level, hemoglobin level, and other significant factors. PMID:22343177
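
    The model-selection criterion used here is standard: AIC = 2k − 2 ln L, lower is better. A sketch of scanning cut-points by AIC; the log-likelihoods are illustrative, not the study's values:

```python
def aic(log_likelihood, n_params):
    """Akaike Information Criterion: AIC = 2k - 2 ln L.
    Lower values indicate better fit penalised for complexity."""
    return 2 * n_params - 2 * log_likelihood

# Illustrative log-likelihoods for the same Cox model refit at three
# viral-load cut-points (made-up numbers, not the study's):
candidates = {2600: -410.2, 10_000: -413.8, 32_000: -410.5}
best_cut = min(candidates, key=lambda c: aic(candidates[c], n_params=4))  # 2600
```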

  15. Carotid Artery End-Diastolic Velocity and Future Cerebro-Cardiovascular Events in Asymptomatic High Risk Patients.

    PubMed

    Chung, Hyemoon; Jung, Young Hak; Kim, Ki-Hyun; Kim, Jong-Youn; Min, Pil-Ki; Yoon, Young Won; Lee, Byoung Kwon; Hong, Bum-Kee; Rim, Se-Joong; Kwon, Hyuck Moon; Choi, Eui-Young

    2016-01-01

    The prognostic value of carotid Doppler evaluation in addition to carotid intima-media thickness (IMT) and plaque assessment has not been completely evaluated. A total of 1119 patients with risk factors for, but without, overt coronary artery disease (CAD), who underwent both carotid ultrasound and Doppler examination, were included in the present study. Parameters of interest included peak systolic and end-diastolic velocities, resistive indices of the carotid arteries, IMT, and plaque measurements. The primary end-point was all-cause cerebro-cardiovascular events (CVEs), including acute myocardial infarction, coronary revascularization therapy, heart failure admission, stroke, and cardiovascular death. Model 1 covariates comprised age and sex; Model 2 also included hypertension, diabetes and smoking; Model 3 also included use of aspirin and statin; and Model 4 also included IMT and plaque. The mean follow-up duration was 1386±461 days and the mean age of the study population was 60±12 years. Amongst 1119 participants, 43% were women, 57% had a history of hypertension, and 23% had diabetes. During follow-up, 6.6% of patients experienced CVEs. Among carotid Doppler parameters, average common carotid artery end-diastolic velocity was an independent predictor of future CVEs after adjustment for all model covariates (HR 0.95 per cm/s, 95% confidence interval 0.91-0.99, p=0.034 in Model 4) and significantly increased the predictive value of Model 4 (global χ² = 59.0 vs. 62.8, p=0.029). Carotid Doppler measurements in addition to IMT and plaque evaluation are independently associated with future CVEs in asymptomatic patients at risk for CAD.

  16. Prognostic Value of the Thrombolysis in Myocardial Infarction Risk Score in ST-Elevation Myocardial Infarction Patients With Left Ventricular Dysfunction (from the EPHESUS Trial).

    PubMed

    Popovic, Batric; Girerd, Nicolas; Rossignol, Patrick; Agrinier, Nelly; Camenzind, Edoardo; Fay, Renaud; Pitt, Bertram; Zannad, Faiez

    2016-11-15

    The Thrombolysis in Myocardial Infarction (TIMI) risk score remains a robust prediction tool for short-term and midterm outcome in the patients with ST-elevation myocardial infarction (STEMI). However, the validity of this risk score in patients with STEMI with reduced left ventricular ejection fraction (LVEF) remains unclear. A total of 2,854 patients with STEMI with early coronary revascularization participating in the randomized EPHESUS (Eplerenone Post-Acute Myocardial Infarction Heart Failure Efficacy and Survival Study) trial were analyzed. TIMI risk score was calculated at baseline, and its predictive value was evaluated using C-indexes from Cox models. The increase in reclassification of other variables in addition to TIMI score was assessed using the net reclassification index. TIMI risk score had a poor predictive accuracy for all-cause mortality (C-index values at 30 days and 1 year ≤0.67) and recurrent myocardial infarction (MI; C-index values ≤0.60). Among TIMI score items, diabetes/hypertension/angina, heart rate >100 beats/min, and systolic blood pressure <100 mm Hg were inconsistently associated with survival, whereas none of the TIMI score items, aside from age, were significantly associated with MI recurrence. Using a constructed predictive model, lower LVEF, lower estimated glomerular filtration rate (eGFR), and previous MI were significantly associated with all-cause mortality. The predictive accuracy of this model, which included LVEF and eGFR, was fair for both 30-day and 1-year all-cause mortality (C-index values ranging from 0.71 to 0.75). In conclusion, TIMI risk score demonstrates poor discrimination in predicting mortality or recurrent MI in patients with STEMI with reduced LVEF. LVEF and eGFR are major factors that should not be ignored by predictive risk scores in this population. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Systematic review of fall risk screening tools for older patients in acute hospitals.

    PubMed

    Matarese, Maria; Ivziku, Dhurata; Bartolozzi, Francesco; Piredda, Michela; De Marinis, Maria Grazia

    2015-06-01

    To determine the most accurate fall risk screening tools for predicting falls among patients aged 65 years or older admitted to acute care hospitals. Falls represent a serious problem in older inpatients due to the potential physical, social, psychological and economic consequences. Older inpatients present with risk factors associated with age-related physiological and psychological changes as well as multiple morbidities. Thus, fall risk screening tools for older adults should include these specific risk factors. There are no published recommendations addressing what tools are appropriate for older hospitalized adults. Systematic review. MEDLINE, CINAHL and Cochrane electronic databases were searched between January 1981-April 2013. Only prospective validation studies reporting sensitivity and specificity values were included. Recommendations of the Cochrane Handbook of Diagnostic Test Accuracy Reviews have been followed. Three fall risk assessment tools were evaluated in seven articles. Due to the limited number of studies, meta-analysis was carried out only for the STRATIFY and Hendrich Fall Risk Model II. In the combined analysis, the Hendrich Fall Risk Model II demonstrated higher sensitivity than STRATIFY, while the STRATIFY showed higher specificity. In both tools, the Youden index showed low prognostic accuracy. The identified tools do not demonstrate predictive values as high as needed for identifying older inpatients at risk for falls. For this reason, no tool can be recommended for fall detection. More research is needed to evaluate fall risk screening tools for older inpatients. © 2014 John Wiley & Sons Ltd.
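
    The Youden index mentioned in the review combines the two reported accuracy measures: J = sensitivity + specificity − 1. A one-line sketch with illustrative values, not the pooled estimates from this meta-analysis:

```python
def youden_index(sensitivity, specificity):
    """Youden's J = sensitivity + specificity - 1: 0 means no
    prognostic value, 1 means perfect discrimination."""
    return sensitivity + specificity - 1

# Illustrative (not pooled-estimate) values for two screening tools:
j_a = youden_index(0.67, 0.57)  # ~0.24: low prognostic accuracy
j_b = youden_index(0.70, 0.42)  # ~0.12: low prognostic accuracy
```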

  18. Risk Prediction Models of Locoregional Failure After Radical Cystectomy for Urothelial Carcinoma: External Validation in a Cohort of Korean Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ku, Ja Hyeon; Kim, Myong; Jeong, Chang Wook

    2014-08-01

    Purpose: To evaluate the predictive accuracy and general applicability of the locoregional failure model in a different cohort of patients treated with radical cystectomy. Methods and Materials: A total of 398 patients were included in the analysis. Death and isolated distant metastasis were considered competing events, and patients without any events were censored at the time of last follow-up. The model included the 3 variables pT classification, the number of lymph nodes identified, and margin status, as follows: low risk (≤pT2), intermediate risk (≥pT3 with ≥10 nodes removed and negative margins), and high risk (≥pT3 with <10 nodes removed or positive margins). Results: The bootstrap-corrected concordance index of the model 5 years after radical cystectomy was 66.2%. When the risk stratification was applied to the validation cohort, the 5-year locoregional failure estimates were 8.3%, 21.2%, and 46.3% for the low-risk, intermediate-risk, and high-risk groups, respectively. The risk of locoregional failure differed significantly between the low-risk and intermediate-risk groups (subhazard ratio [SHR], 2.63; 95% confidence interval [CI], 1.35-5.11; P<.001) and between the low-risk and high-risk groups (SHR, 4.28; 95% CI, 2.17-8.45; P<.001). Although decision curves were appropriately affected by the incidence of the competing risk, decisions about the value of the models are not likely to be affected because the model remains of value over a wide range of threshold probabilities. Conclusions: The model is not completely accurate, but it demonstrates a modest level of discrimination, adequate calibration, and meaningful net benefit gain for prediction of locoregional failure after radical cystectomy.

  19. Threshold Values for Identification of Contamination Predicted by Reduced-Order Models

    DOE PAGES

    Last, George V.; Murray, Christopher J.; Bott, Yi-Ju; ...

    2014-12-31

    The U.S. Department of Energy’s (DOE’s) National Risk Assessment Partnership (NRAP) Project is developing reduced-order models to evaluate potential impacts on underground sources of drinking water (USDWs) if CO2 or brine leaks from deep CO2 storage reservoirs. Threshold values, below which there would be no predicted impacts, were determined for portions of two aquifer systems. These threshold values were calculated using an interwell approach for determining background groundwater concentrations that is an adaptation of methods described in the U.S. Environmental Protection Agency’s Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities.

  20. Risk-based cost-benefit analysis for evaluating microbial risk mitigation in a drinking water system.

    PubMed

    Bergion, Viktor; Lindhe, Andreas; Sokolova, Ekaterina; Rosén, Lars

    2018-04-01

    Waterborne outbreaks of gastrointestinal diseases can impose large costs on society. Risk management needs to be holistic and transparent in order to reduce these risks in an effective manner. Microbial risk mitigation measures in a drinking water system were investigated using a novel approach combining probabilistic risk assessment and cost-benefit analysis. Lake Vomb in Sweden was used to exemplify and illustrate the risk-based decision model. Four mitigation alternatives were compared, where the first three alternatives, A1-A3, represented connecting 25, 50 and 75%, respectively, of on-site wastewater treatment systems in the catchment to the municipal wastewater treatment plant. The fourth alternative, A4, represented installing a UV-disinfection unit in the drinking water treatment plant. Quantitative microbial risk assessment was used to estimate the positive health effects in terms of quality-adjusted life years (QALYs) resulting from the four mitigation alternatives. The health benefits were monetised using a unit cost per QALY. For each mitigation alternative, the net present value of health and environmental benefits and investment, maintenance and running costs was calculated. The results showed that only A4 can reduce the risk (probability of infection) below the World Health Organization guideline of 10^-4 infections per person per year (looking at the 95th percentile). Furthermore, all alternatives resulted in a negative net present value. However, the net present value would be positive (looking at the 50th percentile using a 1% discount rate) if non-monetised benefits (e.g. increased property value divided evenly over the studied time horizon and reduced microbial risks posed to animals), estimated at 800-1200 SEK (€100-150) per connected on-site wastewater treatment system per year, were included. This risk-based decision model creates a robust and transparent decision support tool.
It is flexible enough to be tailored and applied to local settings of drinking water systems. The model provides a clear and holistic structure for decisions related to microbial risk mitigation. To improve the decision model, we suggest further developing the valuation and monetisation of health effects and refining the propagation of uncertainties and variabilities between the included methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
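    The two quantitative checks described in this abstract, comparing an annual infection risk against the WHO 10^-4 guideline and computing a net present value of benefits minus costs, can be sketched in a few lines. This is an illustrative sketch only: the daily risk, cash flows, exposure count and 1% discount rate below are hypothetical, not the study's values.

```python
# Illustrative sketch (not the authors' model). All numbers are made up.

def annual_infection_prob(daily_prob, exposures_per_year=365):
    """Aggregate a per-exposure infection probability to an annual risk."""
    return 1.0 - (1.0 - daily_prob) ** exposures_per_year

def net_present_value(benefits, costs, discount_rate):
    """NPV of yearly benefit/cost streams (year 0 = first element)."""
    return sum((b - c) / (1.0 + discount_rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))

risk = annual_infection_prob(1e-7)   # hypothetical per-exposure risk
meets_who = risk < 1e-4              # WHO guideline: 1e-4 infections/person/year
# hypothetical: up-front investment in year 0, benefits in later years
npv = net_present_value(benefits=[0, 100, 100], costs=[250, 10, 10],
                        discount_rate=0.01)
```

A negative NPV, as the authors report for all four alternatives, means discounted costs exceed discounted monetised benefits over the studied horizon.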

  1. Prediction models for the risk of spontaneous preterm birth based on maternal characteristics: a systematic review and independent external validation.

    PubMed

    Meertens, Linda J E; van Montfort, Pim; Scheepers, Hubertina C J; van Kuijk, Sander M J; Aardenburg, Robert; Langenveld, Josje; van Dooren, Ivo M A; Zwaan, Iris M; Spaanderman, Marc E A; Smits, Luc J M

    2018-04-17

    Prediction models may contribute to personalized risk-based management of women at high risk of spontaneous preterm delivery. Although prediction models are published frequently, often with promising results, external validation generally is lacking. We performed a systematic review of prediction models for the risk of spontaneous preterm birth based on routine clinical parameters. Additionally, we externally validated and evaluated the clinical potential of the models. Prediction models based on routinely collected maternal parameters obtainable during the first 16 weeks of gestation were eligible for selection. Risk of bias was assessed according to the CHARMS guidelines. We validated the selected models in a Dutch multicenter prospective cohort study comprising 2614 unselected pregnant women. Information on predictors was obtained by a web-based questionnaire. Predictive performance of the models was quantified by the area under the receiver operating characteristic curve (AUC) and calibration plots for the outcomes spontaneous preterm birth <37 weeks and <34 weeks of gestation. Clinical value was evaluated by means of decision curve analysis and calculating classification accuracy for different risk thresholds. Four studies describing five prediction models fulfilled the eligibility criteria. Risk of bias assessment revealed a moderate to high risk of bias in three studies. The AUC of the models ranged from 0.54 to 0.67 and from 0.56 to 0.70 for the outcomes spontaneous preterm birth <37 weeks and <34 weeks of gestation, respectively. A subanalysis showed that the models discriminated poorly (AUC 0.51-0.56) for nulliparous women. Although we recalibrated the models, two models retained evidence of overfitting. The decision curve analysis showed low clinical benefit for the best performing models. This review revealed several reporting and methodological shortcomings of published prediction models for spontaneous preterm birth.
Our external validation study indicated that none of the models had the ability to predict spontaneous preterm birth adequately in our population. Further improvement of prediction models, using recent knowledge about both model development and potential risk factors, is necessary to provide an added value in personalized risk assessment of spontaneous preterm birth. © 2018 The Authors Acta Obstetricia et Gynecologica Scandinavica published by John Wiley & Sons Ltd on behalf of Nordic Federation of Societies of Obstetrics and Gynecology (NFOG).

  2. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model.

    PubMed

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, an extreme value theory (EVT) distribution is fitted to the tails of the residuals to model the marginal residual distributions. Third, multivariate Gaussian and Student t-copulas are employed to describe the dependence structure of the natural gas portfolio risk. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gas series, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels.
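    The final estimation step this abstract describes, computing VaR and CVaR from simulated portfolio losses, reduces to an empirical quantile and a tail mean. A minimal sketch follows, with stand-in Gaussian draws in place of the authors' copula-simulated portfolio losses:

```python
# Minimal sketch of empirical VaR/CVaR estimation (not the paper's code).
import random

def var_cvar(losses, alpha=0.95):
    """Empirical value-at-risk and conditional value-at-risk.

    VaR is the alpha-quantile of the loss distribution; CVaR is the
    mean loss beyond VaR, so CVaR >= VaR by construction.
    """
    s = sorted(losses)
    k = int(alpha * len(s))
    var = s[k]
    tail = s[k:]
    cvar = sum(tail) / len(tail)
    return var, cvar

random.seed(0)
# stand-in for copula-simulated portfolio losses
losses = [random.gauss(0.0, 1.0) for _ in range(10_000)]
var95, cvar95 = var_cvar(losses)
```

For standard normal losses the 95% VaR should land near the theoretical quantile of about 1.645, with CVaR strictly larger, mirroring the paper's finding that the heavier-tailed Student t-copula produces larger VaR and CVaR than the Gaussian copula.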

  4. Admission glucose does not improve GRACE score at 6 months and 5 years after myocardial infarction.

    PubMed

    de Mulder, Maarten; van der Ploeg, Tjeerd; de Waard, Guus A; Boersma, Eric; Umans, Victor A

    2011-01-01

    Admission plasma glucose (APG) is a biomarker that predicts mortality in myocardial infarction (MI) patients. Therefore, APG may improve risk stratification based on the GRACE risk score. We collected data on baseline characteristics and long-term (median 55 months) outcome of 550 MI patients who entered our hospital in 2003 and 2006. We determined the GRACE risk score at admission for each patient, which was entered in a logistic regression model, together with APG, to evaluate their prognostic value for 6-month and 5-year mortality. Patients with APG ≥7.8 mmol/l had a higher mortality than those with APG levels <7.8 mmol/l; 6 months: 13.7 versus 3.6%, p value <0.001; 5 years: 20.4 versus 11.1%, p value 0.003. After adjustment for the GRACE risk score variables, APG appeared to be a significant predictor of 6-month and 5-year mortality, adjusted OR 1.17 (1.06-1.29) and 1.12 (1.03-1.22). The combination of the GRACE risk score and APG increased the model's performance (discrimination C-index 0.87 vs. 0.85), although the difference was not significant (p = 0.095). Combining the GRACE risk score and APG reclassified 12.9% of the patients, but the net reclassification improvement was nonsignificant (p = 0.146). APG is a predictor of 6-month and 5-year mortality, each mmol/l increase in APG being associated with a mortality increase of 17 and 12%, respectively, independent of the GRACE risk score. However, adding APG to the GRACE model did not result in significantly improved clinical risk stratification. Copyright © 2012 S. Karger AG, Basel.
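    The discrimination C-index compared above (0.87 vs. 0.85) is the probability that a patient who died received a higher predicted risk than one who survived. A hedged sketch of the pairwise computation, on toy data rather than the study's:

```python
# Sketch of a concordance (C) statistic from predicted risks and outcomes.

def c_statistic(risks, outcomes):
    """risks: predicted probabilities; outcomes: 1 = died, 0 = survived."""
    events = [r for r, y in zip(risks, outcomes) if y == 1]
    nonevents = [r for r, y in zip(risks, outcomes) if y == 0]
    concordant = ties = 0
    for e in events:
        for n in nonevents:
            if e > n:          # event patient ranked above non-event patient
                concordant += 1
            elif e == n:       # tied predictions count half
                ties += 1
    total = len(events) * len(nonevents)
    return (concordant + 0.5 * ties) / total

# toy check: perfectly separated predictions give C = 1.0
c = c_statistic([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
```

In this form the C-statistic equals the area under the ROC curve, which is why small gains such as 0.85 to 0.87 require a formal test (as the authors perform) before claiming improved discrimination.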

  5. Combined Endoscopic/Sonographic-Based Risk Matrix Model for Predicting One-Year Risk of Surgery: A Prospective Observational Study of a Tertiary Center Severe/Refractory Crohn's Disease Cohort.

    PubMed

    Rispo, Antonio; Imperatore, Nicola; Testa, Anna; Bucci, Luigi; Luglio, Gaetano; De Palma, Giovanni Domenico; Rea, Matilde; Nardone, Olga Maria; Caporaso, Nicola; Castiglione, Fabiana

    2018-03-08

    In the management of Crohn's disease (CD) patients, having a simple score combining clinical, endoscopic and imaging features to predict the risk of surgery could help to tailor treatment more effectively. Aims: To prospectively evaluate the one-year risk factors for surgery in refractory/severe CD and to generate a risk matrix for predicting the probability of surgery at one year. CD patients needing a disease re-assessment at our tertiary IBD centre underwent clinical, laboratory, endoscopy and bowel sonography (BS) examinations within one week. The optimal cut-off values in predicting surgery were identified using ROC curves for the Simple Endoscopic Score for CD (SES-CD), bowel wall thickness (BWT) at BS, and small bowel CD extension at BS. Binary logistic regression and Cox's regression were then carried out. Finally, the probabilities of surgery were calculated for selected baseline levels of covariates and the results were arranged in a prediction matrix. Of 100 CD patients, 30 underwent surgery within one year. SES-CD ≥ 9 (OR 15.3; p<0.001), BWT ≥ 7 mm (OR 15.8; p<0.001), small bowel CD extension at BS ≥ 33 cm (OR 8.23; p<0.001) and stricturing/penetrating behavior (OR 4.3; p<0.001) were the only independent factors predictive of surgery at one year based on binary logistic and Cox's regressions. Our matrix model combined these risk factors, and the probability of surgery ranged from 0.48% to 87.5% (sixteen combinations). Our risk matrix combining clinical, endoscopic and ultrasonographic findings can accurately predict the one-year risk of surgery in patients with severe/refractory CD requiring a disease re-evaluation. This tool could be of value in clinical practice, serving as the basis for a tailored management of CD patients.

  6. A risk hedging strategy under the nonparallel-shift yield curve

    NASA Astrophysics Data System (ADS)

    Gong, Pu; He, Xubiao

    2005-08-01

    Under the assumption of rigid movement, a nonparallel-shift model of the term structure of interest rates is developed by introducing the Fisher-Weil duration, a well-known concept in interest rate risk management. This paper studies hedging and replication for portfolio immunization to minimize risk exposure. Through numerical simulation experiments, the risk exposures of the portfolio under the different risk hedging strategies are quantitatively evaluated by order statistics (OS) estimation of value at risk (VaR). The results show that the risk hedging strategy proposed in this paper is very effective for the interest rate risk management of default-free bonds.
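    The Fisher-Weil duration this abstract builds on is a present-value-weighted mean time to payment, with each cash flow discounted at its own spot rate rather than a single yield. An illustrative computation under assumed continuously compounded spot rates (the bond and curve below are hypothetical, not from the paper):

```python
# Illustrative Fisher-Weil duration of a coupon bond (assumed inputs).
import math

def fisher_weil_duration(cashflows, spot_rates):
    """cashflows: {time_in_years: amount}; spot_rates: {time: cont. rate}."""
    # discount each cash flow at its own spot rate
    pv = {t: cf * math.exp(-spot_rates[t] * t) for t, cf in cashflows.items()}
    price = sum(pv.values())
    # PV-weighted mean time to payment
    return sum(t * v for t, v in pv.items()) / price

# hypothetical 3-year 5% annual-coupon bond, face value 100
cf = {1: 5.0, 2: 5.0, 3: 105.0}
rates = {1: 0.02, 2: 0.025, 3: 0.03}
d = fisher_weil_duration(cf, rates)
```

Because each maturity is discounted at its own spot rate, this duration remains meaningful under the nonparallel yield-curve shifts the paper studies, unlike the classic Macaulay duration, which assumes one flat yield.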

  7. A Prognostic Indicator for Patients Hospitalized with Heart Failure.

    PubMed

    Snow, Richard; Vogel, Karen; Vanderhoff, Bruce; Kelch, Benjamin P; Ferris, Frank D

    2016-12-01

    Current methods for identifying patients at risk of dying within six months suffer from clinician biases resulting in underestimation of this risk. As a result, patients who are potentially eligible for hospice and palliative care services frequently do not benefit from these services until they are very close to the end of their lives. To develop a prospective prognostic indicator based on actual survival within Centers for Medicare and Medicaid Services (CMS) claims data that identifies patients with congestive heart failure (CHF) who are at risk of six-month mortality. CMS claims data from January 1, 2008 to June 30, 2009 were reviewed to find the first hospitalization for CHF patients with episode of care diagnosis-related groups (DRGs) 291, 292, and 293. Univariate and multivariable analyses were used to determine the associations between demographic and clinical factors and six-month mortality. The resulting model was evaluated for discrimination and calibration. The resulting prospective prognostic model demonstrated fair discrimination, with an area under the ROC curve of 0.71, and good calibration, with a Hosmer-Lemeshow statistic of 0.98. Across all DRGs, 5% of discharged patients had a six-month mortality risk of greater than 50%. This prospective approach appears to provide a method to identify patients with CHF who would potentially benefit from a clinical evaluation for referral to hospice care or for a palliative care consult due to high predicted risk of dying within 180 days after discharge from a hospital. This approach can provide a model to match at-risk patients with evidenced-based care in a more consistent manner. This method of identifying patients at risk needs further prospective evaluation to see if it has value for clinicians, increases referrals to hospice and palliative care services, and benefits patients and families.

  8. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground‐motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site‐specific factors based on VS30 shear‐wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. Resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking.
National models should be developed by scientists and engineers in each country using the best available science.

  9. Ecosystem Risk Assessment Using the Comprehensive Assessment of Risk to Ecosystems (CARE) Tool

    NASA Astrophysics Data System (ADS)

    Battista, W.; Fujita, R.; Karr, K.

    2016-12-01

    Effective Ecosystem Based Management requires a localized understanding of the health and functioning of a given system as well as of the various factors that may threaten the ongoing ability of the system to support the provision of valued services. Several risk assessment models are available that can provide a scientific basis for understanding these factors and for guiding management action, but these models focus mainly on single species and evaluate only the impacts of fishing in detail. We have developed a new ecosystem risk assessment model - the Comprehensive Assessment of Risk to Ecosystems (CARE) - that allows analysts to consider the cumulative impact of multiple threats, interactions among multiple threats that may result in synergistic or antagonistic impacts, and the impacts of a suite of threats on whole-ecosystem productivity and functioning, as well as on specific ecosystem services. The CARE model was designed to be completed in as little as two hours, and uses local and expert knowledge where data are lacking. The CARE tool can be used to evaluate risks facing a single site; to compare multiple sites for the suitability or necessity of different management options; or to evaluate the effects of a proposed management action aimed at reducing one or more risks. This analysis can help users identify which threats are the most important at a given site, and therefore where limited management resources should be targeted. CARE can be applied to virtually any system, and can be modified as knowledge is gained or to better match different site characteristics. CARE builds on previous ecosystem risk assessment tools to provide a comprehensive assessment of fishing and non-fishing threats that can be used to inform environmental management decisions across a broad range of systems.

  11. 76 FR 53162 - Acceptance of Public Submissions Regarding the Study of Stable Value Contracts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-25

    ... the risk of a run on a SVF? To the extent that SVC providers use value-at-risk (``VaR'') models, do such VaR models adequately assess the risk of loss resulting from such events or other possible but extremely unlikely events? Do other loss models more adequately assess the risk of loss, such as the...

  12. Evaluation of Foreign Investment in Power Plants using Real Options

    NASA Astrophysics Data System (ADS)

    Kato, Moritoshi; Zhou, Yicheng

    This paper proposes new methods for evaluating foreign investment in power plants under market uncertainty using a real options approach. We consider a thermal power plant project in a deregulated electricity market. Our first proposed method calculates the cash flow generated by the project in a reference year using actual market data, so as to incorporate the periodic characteristics of energy prices into a yearly cash flow model. We construct a stochastic yearly cash flow model whose initial value is the cash flow in the reference year, with a given trend and volatility. We then calculate the real options value (ROV) of the project, which includes abandonment options, using the yearly cash flow model. Our second proposed method evaluates foreign currency/domestic currency exchange rate risk by treating the ROV in the foreign currency as a yearly payoff and converting it into an ROV in the domestic currency using a stochastic exchange rate model. We analyze the effect of the heat rate and the operation and maintenance costs of the power plant on the ROV, and evaluate exchange rate risk through numerical examples. Our proposed methods will be useful for the risk management of foreign investment in power plants.

  13. Accidental falls in hospital inpatients: evaluation of sensitivity and specificity of two risk assessment tools.

    PubMed

    Lovallo, Carmela; Rolandi, Stefano; Rossetti, Anna Maria; Lusignani, Maura

    2010-03-01

    This paper is a report of a study comparing the effectiveness of two falls risk assessment tools (Conley Scale and Hendrich Risk Model) by using them simultaneously with the same sample of hospital inpatients. Different risk assessment tools are available in the literature. However, neither recent critical reviews nor international guidelines on fall prevention have identified tools that can be generalized to all categories of hospitalized patients. A prospective observational study was carried out in acute medical and surgical wards and rehabilitation units. From October 2007 to January 2008, 1148 patients were assessed with both instruments, subsequently noting the occurrence of falls. The sensitivity, specificity, positive and negative predictive values, and receiver operating characteristic curves were calculated. The number of patients correctly identified with the Conley Scale (n = 41) was higher than with the Hendrich Model (n = 27). The Conley Scale gave sensitivity and specificity values of 69.49% and 61% respectively. The Hendrich Model gave a sensitivity value of 45.76% and a specificity value of 71%. Positive and negative predictive values were comparable. The Conley Scale is indicated for use in the medical sector, on the strength of its high sensitivity. However, since its specificity is very low, it is deemed useful to submit individual patients giving positive results to more in-depth clinical evaluation in order to decide whether preventive measures need to be taken. In surgical sectors, the low sensitivity values given by both scales suggest that further studies are warranted.
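    The accuracy figures reported above all derive from a standard 2x2 confusion matrix of tool prediction (at-risk yes/no) against observed falls. As a sketch, the counts below are reconstructed to be approximately consistent with the Conley Scale figures (41 correctly identified fallers, sensitivity 69.49%, specificity 61%, n = 1148); they are illustrative, not the study's published table:

```python
# Screening-accuracy metrics from a 2x2 confusion matrix (illustrative).

def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # fallers flagged as at-risk
        "specificity": tn / (tn + fp),  # non-fallers correctly cleared
        "ppv": tp / (tp + fp),          # flagged patients who actually fell
        "npv": tn / (tn + fn),          # cleared patients who did not fall
    }

# counts reconstructed from the reported Conley Scale percentages
m = screening_metrics(tp=41, fp=425, fn=18, tn=664)
```

Note how a tool can combine high sensitivity with low specificity, exactly the Conley Scale trade-off the authors describe: most fallers are flagged, but so are many patients who never fall, which is why they recommend follow-up clinical evaluation of positives.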

  14. Detection of high GS risk group prostate tumors by diffusion tensor imaging and logistic regression modelling.

    PubMed

    Ertas, Gokhan

    2018-07-01

    To assess the value of joint evaluation of diffusion tensor imaging (DTI) measures by using logistic regression modelling to detect high GS risk group prostate tumors. Fifty tumors imaged using DTI on a 3 T MRI device were analyzed. Regions of interest focusing on the center of tumor foci and noncancerous tissue on the maps of mean diffusivity (MD) and fractional anisotropy (FA) were used to extract the minimum, maximum and mean measures. The measure ratio was computed by dividing the tumor measure by the noncancerous tissue measure. Logistic regression models were fitted for all possible pair combinations of the measures using 5-fold cross validation. Systematic differences are present for all MD measures and also for all FA measures in distinguishing the high-risk tumors [GS ≥ 7(4 + 3)] from the low-risk tumors [GS ≤ 7(3 + 4)] (P < 0.05). A smaller value for MD measures and a larger value for FA measures indicate high risk. The models enrolling the measures achieve good fits and good classification performances (adjusted R² = 0.55-0.60, AUC = 0.88-0.91); however, the models using the measure ratios perform better (adjusted R² = 0.59-0.75, AUC = 0.88-0.95). The model that employs the ratios of minimum MD and maximum FA accomplishes the highest sensitivity, specificity and accuracy (Se = 77.8%, Sp = 96.9% and Acc = 90.0%). Joint evaluation of MD and FA diffusion tensor imaging measures is valuable for detecting high GS risk group peripheral zone prostate tumors. However, use of the ratios of the measures improves the accuracy of the detections substantially. Logistic regression modelling provides a favorable solution for the joint evaluations, easily adoptable in clinical practice. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Advantages of new cardiovascular risk-assessment strategies in high-risk patients with hypertension.

    PubMed

    Ruilope, Luis M; Segura, Julian

    2005-10-01

    Accurate assessment of cardiovascular disease (CVD) risk in patients with hypertension is important when planning appropriate treatment of modifiable risk factors. The causes of CVD are multifactorial, and hypertension seldom exists as an isolated risk factor. Classic models of risk assessment are more accurate than a simple counting of risk factors, but they are not generalizable to all populations. In addition, the risk associated with hypertension is graded, continuous, and independent of other risk factors, and this is not reflected in classic models of risk assessment. This article is intended to review both classic and newer models of CVD risk assessment. MEDLINE was searched for articles published between 1990 and 2005 that contained the terms cardiovascular disease, hypertension, or risk assessment. Articles describing major clinical trials, new data about cardiovascular risk, or global risk stratification were selected for review. Some patients at high long-term risk for CVD events (eg, patients aged <50 years with multiple risk factors) may go untreated because they do not meet the absolute risk-intervention threshold of 20% risk over 10 years with the classic model. Recognition of the limitations of classic risk-assessment models led to new guidelines, particularly those of the European Society of Hypertension-European Society of Cardiology. These guidelines view hypertension as one of many risk and disease factors that require treatment to decrease risk. These newer guidelines include a more comprehensive range of risk factors and more finely graded blood pressure ranges to stratify patients by degree of risk. Whether they accurately predict CVD risk in most populations is not known. Evidence from the Valsartan Antihypertensive Long-term Use Evaluation (VALUE) study, which stratified patients by several risk and disease factors, highlights the predictive value of some newer CVD risk assessments. 
Modern risk assessments, which include blood pressure along with a wide array of modifiable risk factors, may be more accurate than classic models for CVD risk prediction.

  16. Forecasting the value-at-risk of Chinese stock market using the HARQ model and extreme value theory

    NASA Astrophysics Data System (ADS)

    Liu, Guangqiang; Wei, Yu; Chen, Yongfei; Yu, Jiang; Hu, Yang

    2018-06-01

    Using intraday data on the CSI300 index, this paper discusses value-at-risk (VaR) forecasting for the Chinese stock market from the perspective of high-frequency volatility models. First, we measure the realized volatility (RV) with 5-minute high-frequency returns of the CSI300 index and then model it with the newly introduced heterogeneous autoregressive quarticity (HARQ) model, which can handle the time-varying coefficients of the HAR model. Second, we forecast the out-of-sample VaR of the CSI300 index by combining the HARQ model and extreme value theory (EVT). Finally, using several popular backtesting methods, we compare the VaR forecasting accuracy of the HARQ model with that of traditional HAR-type models, such as HAR, HAR-J, CHAR, and SHAR. The empirical results show that the novel HARQ model can beat other HAR-type models in forecasting the VaR of the Chinese stock market at various risk levels.
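    The two intraday building blocks of the HARQ approach, realized volatility (RV) and realized quarticity (RQ), are simple sums over intraday returns; RQ proxies the measurement error in RV and is what drives the HARQ model's time-varying coefficients. A minimal sketch of the standard estimators (notation assumed, not the authors' code):

```python
# Standard realized-measure estimators from intraday returns (sketch).

def realized_volatility(returns):
    """RV: sum of squared intraday returns over one trading day."""
    return sum(r * r for r in returns)

def realized_quarticity(returns):
    """RQ: (n/3) * sum of fourth-power returns; proxies noise in RV."""
    n = len(returns)
    return (n / 3.0) * sum(r ** 4 for r in returns)

# one toy trading day of 48 alternating five-minute returns of +/-0.1%
day = [0.001 * ((-1) ** i) for i in range(48)]
rv = realized_volatility(day)   # 48 * 1e-6 = 4.8e-05
rq = realized_quarticity(day)
```

In the HARQ specification, the coefficient on lagged daily RV is adjusted by a term proportional to the square root of RQ, so that noisy RV observations (large RQ) are down-weighted when forecasting; the paper then feeds these forecasts into an EVT tail model to obtain VaR.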

  17. The ACC/AHA 2013 pooled cohort equations compared to a Korean Risk Prediction Model for atherosclerotic cardiovascular disease.

    PubMed

    Jung, Keum Ji; Jang, Yangsoo; Oh, Dong Joo; Oh, Byung-Hee; Lee, Sang Hoon; Park, Seong-Wook; Seung, Ki-Bae; Kim, Hong-Kyu; Yun, Young Duk; Choi, Sung Hee; Sung, Jidong; Lee, Tae-Yong; Kim, Sung Hi; Koh, Sang Baek; Kim, Moon Chan; Chang Kim, Hyeon; Kimm, Heejin; Nam, Chungmo; Park, Sungha; Jee, Sun Ha

    2015-09-01

    To evaluate the performance of the American College of Cardiology/American Heart Association (ACC/AHA) 2013 Pooled Cohort Equations in the Korean Heart Study (KHS) population and to develop a Korean Risk Prediction Model (KRPM) for atherosclerotic cardiovascular disease (ASCVD) events. The KHS cohort included 200,010 Korean adults aged 40-79 years who were free from ASCVD at baseline. Discrimination, calibration, and recalibration of the ACC/AHA Equations in predicting 10-year ASCVD risk in the KHS cohort were evaluated. The KRPM was derived using Cox model coefficients, mean risk factor values, and mean incidences from the KHS cohort. In the discriminatory analysis, the ACC/AHA Equations' White and African-American (AA) models moderately distinguished cases from non-cases, and were similar to the KRPM: For men, the area under the receiver operating characteristic curve (AUROCs) were 0.727 (White model), 0.725 (AA model), and 0.741 (KRPM); for women, the corresponding AUROCs were 0.738, 0.739, and 0.745. Absolute 10-year ASCVD risk for men in the KHS cohort was overestimated by 56.5% (White model) and 74.1% (AA model), while the risk for women was underestimated by 27.9% (White model) and overestimated by 29.1% (AA model). Recalibration of the ACC/AHA Equations did not affect discriminatory ability but improved calibration substantially, especially in men in the White model. Of the three ASCVD risk prediction models, the KRPM showed best calibration. The ACC/AHA Equations should not be directly applied for ASCVD risk prediction in a Korean population. The KRPM showed best predictive ability for ASCVD risk. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  18. Evaluating the risk of water distribution system failure: A shared frailty model

    NASA Astrophysics Data System (ADS)

    Clark, Robert M.; Thurnau, Robert C.

    2011-12-01

    Condition assessment (CA) modeling is drawing increasing interest as a technique that can assist in managing drinking water infrastructure. This paper develops a model based on a Cox proportional hazards (PH)/shared frailty framework and applies it to evaluating the risk of failure in drinking water networks, using data from the Laramie Water Utility (located in Laramie, Wyoming, USA). A cost-benefit analysis based on the risk model and incorporating the inspection value method (IVM) is then used to support improved repair, replacement, and rehabilitation decisions for selected drinking water distribution system pipes. A separate model is developed to predict failures in prestressed concrete cylinder pipe (PCCP). Various currently available inspection technologies are presented and discussed.

  19. Predicting Readmission at Early Hospitalization Using Electronic Clinical Data: An Early Readmission Risk Score.

    PubMed

    Tabak, Ying P; Sun, Xiaowu; Nunez, Carlos M; Gupta, Vikas; Johannes, Richard S

    2017-03-01

    Identifying patients at high risk for readmission early during hospitalization may aid efforts to reduce readmissions. We sought to develop an early readmission risk predictive model using automated clinical data available at hospital admission. We developed an early readmission risk model using a derivation cohort and validated the model with a validation cohort. We used a published Acute Laboratory Risk of Mortality Score as an aggregated measure of clinical severity at admission and the number of hospital discharges in the previous 90 days as a measure of disease progression. We then evaluated the administrative data-enhanced model by adding principal and secondary diagnoses and other variables. We examined the change in c-statistic when additional variables were added to the model. There were 1,195,640 adult discharges from 70 hospitals, 39.8% male, with a median age of 63 years (first and third quartiles: 43, 78). The 30-day readmission rate was 11.9% (n=142,211). The early readmission model yielded a graded relationship between readmission and both the Acute Laboratory Risk of Mortality Score and the number of previous discharges within 90 days. The model c-statistic was 0.697 with good calibration. When administrative variables were added to the model, the c-statistic increased to 0.722. Automated clinical data can generate a readmission risk score early in hospitalization with fair discrimination, and may have practical value in aiding early care transitions. Adding administrative data increases predictive accuracy. The administrative data-enhanced model may be used for hospital comparison and outcomes research.
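
The c-statistic reported here is the area under the ROC curve: the probability that a randomly chosen readmitted patient receives a higher score than a randomly chosen non-readmitted one. A minimal sketch of that pairwise definition, on hypothetical scores rather than the study's data:

```python
import numpy as np

def c_statistic(y_true, y_score):
    """AUC as the probability that a randomly chosen event
    outranks a randomly chosen non-event (ties count 0.5)."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    # pairwise comparison of every event/non-event pair
    diff = pos[:, None] - neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

# hypothetical readmission scores: higher = riskier
y = [0, 0, 1, 0, 1, 1]
s = [0.1, 0.3, 0.35, 0.4, 0.8, 0.9]
print(round(c_statistic(y, s), 3))  # → 0.889
```

A c-statistic of 0.697, as in the clinical-data-only model above, means roughly 70% of such random event/non-event pairs are ranked correctly.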

  20. Using Radiation Risk Models in Cancer Screening Simulations: Important Assumptions and Effects on Outcome Projections

    PubMed Central

    Lee, Janie M.; McMahon, Pamela M.; Lowry, Kathryn P.; Omer, Zehra B.; Eisenberg, Jonathan D.; Pandharipande, Pari V.; Gazelle, G. Scott

    2012-01-01

    Purpose: To evaluate the effect of incorporating radiation risk into microsimulation (first-order Monte Carlo) models for breast and lung cancer screening to illustrate effects of including radiation risk on patient outcome projections. Materials and Methods: All data used in this study were derived from publicly available or deidentified human subject data. Institutional review board approval was not required. The challenges of incorporating radiation risk into simulation models are illustrated with two cancer screening models (Breast Cancer Model and Lung Cancer Policy Model) adapted to include radiation exposure effects from mammography and chest computed tomography (CT), respectively. The primary outcome projected by the breast model was life expectancy (LE) for BRCA1 mutation carriers. Digital mammographic screening beginning at ages 25, 30, 35, and 40 years was evaluated in the context of screenings with false-positive results and radiation exposure effects. The primary outcome of the lung model was lung cancer–specific mortality reduction due to annual screening, comparing two diagnostic CT protocols for lung nodule evaluation. The Metropolis-Hastings algorithm was used to estimate the mean values of the results with 95% uncertainty intervals (UIs). Results: Without radiation exposure effects, the breast model indicated that annual digital mammography starting at age 25 years maximized LE (72.03 years; 95% UI: 72.01 years, 72.05 years) and had the highest number of screenings with false-positive results (2.0 per woman). When radiation effects were included, annual digital mammography beginning at age 30 years maximized LE (71.90 years; 95% UI: 71.87 years, 71.94 years) with a lower number of screenings with false-positive results (1.4 per woman). 
For annual chest CT screening of 50-year-old females with no follow-up for nodules smaller than 4 mm in diameter, the lung model predicted lung cancer–specific mortality reduction of 21.50% (95% UI: 20.90%, 22.10%) without radiation risk and 17.75% (95% UI: 16.97%, 18.41%) with radiation risk. Conclusion: Because including radiation exposure risk can influence long-term projections from simulation models, it is important to include these risks when conducting modeling-based assessments of diagnostic imaging. © RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11110352/-/DC1 PMID:22357897

  1. Plasma Free Amino Acid Profiles Predict Four-Year Risk of Developing Diabetes, Metabolic Syndrome, Dyslipidemia, and Hypertension in Japanese Population

    PubMed Central

    Yamakado, Minoru; Nagao, Kenji; Imaizumi, Akira; Tani, Mizuki; Toda, Akiko; Tanaka, Takayuki; Jinzu, Hiroko; Miyano, Hiroshi; Yamamoto, Hiroshi; Daimon, Takashi; Horimoto, Katsuhisa; Ishizaka, Yuko

    2015-01-01

    The plasma free amino acid (PFAA) profile has attracted attention for its association with visceral obesity, hyperinsulinemia, and future diabetes. PFAA profiling can therefore potentially evaluate an individual's future risk of developing lifestyle-related diseases beyond diabetes. However, few studies, especially in Asian populations, have examined the optimal combination of PFAAs for evaluating health risks. We quantified PFAA levels in 3,701 Japanese subjects, and determined visceral fat area (VFA) and two-hour post-challenge insulin (Ins120 min) values in 865 and 1,160 subjects, respectively. Models relating PFAA levels to VFA or Ins120 min values were then constructed by multiple linear regression analysis with variable selection. Finally, a cohort study of 2,984 subjects was conducted to examine the ability of the obtained models to predict the four-year risk of developing new-onset lifestyle-related diseases. The correlation coefficients of the obtained PFAA models against VFA or Ins120 min were higher than those of any single PFAA level, and the models performed well for future risk prediction. Even after adjusting for commonly accepted multiple risk factors, these models can predict the future development of diabetes, metabolic syndrome, and dyslipidemia. PFAA profiles confer independent and differing contributions to lifestyle-related disease risks beyond the currently known factors in a general Japanese population. PMID:26156880

  2. [Perioperative mortality. Risk factors associated with anaesthesia].

    PubMed

    Zajac, Krzysztof; Zajac, Małgorzata

    2005-01-01

    Perioperative mortality associated with anaesthesia has been closely monitored for half a century. The breakthrough in anaesthesia safety occurred in the 1980s and 1990s, which saw a five-fold reduction in anaesthesia-related mortality, from 1 death per 2,680 operations/anaesthetic procedures (in the 1950s) to 1 per 10,000 (and even a twenty-fold reduction among ASA 1 and 2 patients, to 1 death per 185,000 procedures). However, more detailed analysis showed that overall perioperative mortality is significantly higher, approximately 1 death per 500 procedures, and 1 per 4.5 procedures among ASA 5 patients; more importantly, these numbers have not changed in 50 years. This phenomenon supports the thesis of anaesthesia safety, but it also points to shortcomings in the models and scoring systems used to evaluate operative risk. The several available scoring scales can predict death rates but are unable to assess the extent of risk factors other than biological ones. This "extra-biological risk" (i.e., the process of therapy) may in some cases increase the overall operative risk. The value of the operative risk, expressed as a predicted death rate, lies between 0 and 1 (between survival and death in a binary probability model). Recognizing the value of the "extra-biological risk", however, depends on the high sensitivity of the scales used to predict death rates.

  3. Measuring daily Value-at-Risk of SSEC index: A new approach based on multifractal analysis and extreme value theory

    NASA Astrophysics Data System (ADS)

    Wei, Yu; Chen, Wang; Lin, Yu

    2013-05-01

    Recent studies in the econophysics literature reveal that price variability has fractal and multifractal characteristics not only in developed financial markets, but also in emerging markets. Taking high-frequency intraday quotes of the Shanghai Stock Exchange Component (SSEC) Index as an example, this paper proposes a new method to measure daily Value-at-Risk (VaR) by combining the newly introduced multifractal volatility (MFV) model and the extreme value theory (EVT) method. Two VaR backtesting techniques are then employed to compare the performance of the model with that of a group of linear and nonlinear generalized autoregressive conditional heteroskedasticity (GARCH) models. The empirical results show the multifractal nature of price volatility in the Chinese stock market. VaR measures based on the multifractal volatility model and EVT method outperform many GARCH-type models at high-risk levels.
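
A standard VaR backtest of the kind referred to here is Kupiec's unconditional-coverage (proportion-of-failures) likelihood-ratio test, which asks whether VaR violations occur at the promised rate. A minimal sketch of that test (not necessarily the paper's exact choice of backtests):

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, T, p):
    """Kupiec LR test: do VaR violations occur at the promised rate p?
    violations = number of days the loss exceeded VaR, out of T days.
    Assumes 0 < violations < T (the boundary cases need separate handling)."""
    x = violations
    phat = x / T
    # likelihood ratio of observed vs promised violation frequency
    lr = -2 * (x * np.log(p) + (T - x) * np.log(1 - p)
               - x * np.log(phat) - (T - x) * np.log(1 - phat))
    pvalue = 1 - chi2.cdf(lr, df=1)  # asymptotically chi-squared, 1 df
    return lr, pvalue

# hypothetical backtest: 20 violations in 1000 days for a 1% VaR model
lr, pv = kupiec_pof(violations=20, T=1000, p=0.01)
print(pv < 0.05)  # the model's coverage is rejected
```

A model whose violation count is close to p·T passes; here twice the promised rate is enough to reject at the 5% level.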

  4. [Early prediction of the neurological result at 12 months in newborns at neurological risk].

    PubMed

    Herbón, F; Garibotti, G; Moguilevsky, J

    2015-08-01

    The aim of this study was to evaluate the Amiel-Tison neurological examination (AT) and cranial ultrasound at term for predicting the neurological outcome at 12 months in newborns at neurological risk. The study included 89 newborns at high risk of neurological damage who were discharged from the Neonatal Intensive Care Unit of the Hospital Zonal Bariloche, Argentina. The assessment consisted of a neurological examination and cranial ultrasound at term, and a neurological examination and evaluation of development at 12 months. Sensitivity, specificity, and positive and negative predictive values were calculated. The relationship between perinatal factors and neurodevelopment at 12 months of age was also assessed using logistic regression models. Seventy children completed the follow-up. At 12 months of age, 14% had an abnormal neurological examination and 17% abnormal development. The neurological examination and cranial ultrasound at term had low sensitivity for predicting abnormal neurodevelopment. At 12 months, 93% of newborns with a normal AT showed normal neurological results, and 86% normal development. Among newborns with a normal cranial ultrasound, the percentages were 90% and 81%, respectively. Among children with three or more perinatal risk factors, the frequency of abnormal neurological results was 5.4 times higher than among those with fewer risk factors, and abnormal development was 3.5 times more frequent. The neurological examination and cranial ultrasound at term had low sensitivity but high negative predictive value for neurodevelopment at 12 months. Three or more perinatal risk factors were associated with neurodevelopmental abnormalities at 12 months of age. Copyright © 2014 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
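
The four metrics reported here follow directly from confusion-matrix counts. A minimal sketch, with hypothetical counts chosen to mirror the "low sensitivity, high negative predictive value" pattern described (not the study's actual data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # abnormal outcomes correctly flagged
        "specificity": tn / (tn + fp),   # normal outcomes correctly cleared
        "ppv": tp / (tp + fp),           # flagged infants truly abnormal
        "npv": tn / (tn + fn),           # cleared infants truly normal
    }

# hypothetical: 10 abnormal and 60 normal infants; the test flags 12
m = diagnostic_metrics(tp=5, fp=7, fn=5, tn=53)
print(round(m["npv"], 2))  # → 0.91, high NPV despite sensitivity of only 0.5
```

When the outcome is rare, NPV stays high even with mediocre sensitivity, which is exactly the pattern the abstract reports.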

  5. A microRNA-based prediction model for lymph node metastasis in hepatocellular carcinoma.

    PubMed

    Zhang, Li; Xiang, Zuo-Lin; Zeng, Zhao-Chong; Fan, Jia; Tang, Zhao-You; Zhao, Xiao-Mei

    2016-01-19

    We developed an efficient microRNA (miRNA) model that could predict the risk of lymph node metastasis (LNM) in hepatocellular carcinoma (HCC). We first evaluated a training cohort of 192 HCC patients after hepatectomy and found five LNM-associated predictive factors: vascular invasion, Barcelona Clinic Liver Cancer stage, miR-145, miR-31, and miR-92a. These five statistically independent factors were used to develop a predictive model. The predictive value of the miRNA-based model was confirmed in a validation cohort of 209 consecutive HCC patients. The prediction model scored LNM risk from 0 to 8, and a cutoff value of 4 was used to distinguish high-risk and low-risk groups. The model's sensitivity and specificity were 69.6% and 80.2%, respectively, at 5 years in the validation cohort, and the area under the curve (AUC) for the miRNA-based prognostic model was 0.860. The 5-year positive and negative predictive values of the model in the validation cohort were 30.3% and 95.5%, respectively. Cox regression analysis revealed that the LNM hazard ratio of the high-risk versus low-risk groups was 11.751 (95% CI, 5.110-27.021; P < 0.001) in the validation cohort. In conclusion, the miRNA-based model is reliable and accurate for the early prediction of LNM in patients with HCC.

  6. Construction of a pathological risk model of occult lymph node metastases for prognostication by semi-automated image analysis of tumor budding in early-stage oral squamous cell carcinoma

    PubMed Central

    Pedersen, Nicklas Juel; Jensen, David Hebbelstrup; Lelkaitis, Giedrius; Kiss, Katalin; Charabi, Birgitte; Specht, Lena; von Buchwald, Christian

    2017-01-01

    It is challenging to identify at diagnosis those patients with early oral squamous cell carcinoma (OSCC), who have a poor prognosis and those that have a high risk of harboring occult lymph node metastases. The aim of this study was to develop a standardized and objective digital scoring method to evaluate the predictive value of tumor budding. We developed a semi-automated image-analysis algorithm, Digital Tumor Bud Count (DTBC), to evaluate tumor budding. The algorithm was tested in 222 consecutive patients with early-stage OSCC and major endpoints were overall (OS) and progression free survival (PFS). We subsequently constructed and cross-validated a binary logistic regression model and evaluated its clinical utility by decision curve analysis. A high DTBC was an independent predictor of both poor OS and PFS in a multivariate Cox regression model. The logistic regression model was able to identify patients with occult lymph node metastases with an area under the curve (AUC) of 0.83 (95% CI: 0.78–0.89, P <0.001) and a 10-fold cross-validated AUC of 0.79. Compared to other known histopathological risk factors, the DTBC had a higher diagnostic accuracy. The proposed, novel risk model could be used as a guide to identify patients who would benefit from an up-front neck dissection. PMID:28212555
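
Decision curve analysis, used above to judge clinical utility, compares the net benefit of acting on the model's predictions against the default strategies of treating everyone or no one. A minimal sketch of the standard net-benefit formula on hypothetical data (not the study's cohort):

```python
import numpy as np

def net_benefit(y_true, y_prob, threshold):
    """Net benefit at risk threshold pt: (TP - FP * pt/(1-pt)) / n,
    i.e. true positives gained, penalized by false positives weighted
    by the odds at the chosen threshold."""
    y_true = np.asarray(y_true)
    treat = np.asarray(y_prob) >= threshold
    n = len(y_true)
    tp = np.sum(treat & (y_true == 1))
    fp = np.sum(treat & (y_true == 0))
    return (tp - fp * threshold / (1 - threshold)) / n

# hypothetical occult-metastasis labels and model probabilities
y = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])
p = np.array([0.8, 0.2, 0.1, 0.6, 0.3, 0.05, 0.4, 0.7, 0.15, 0.25])
nb_model = net_benefit(y, p, threshold=0.5)
nb_all = net_benefit(y, np.ones_like(p), threshold=0.5)  # "dissect everyone"
print(nb_model > nb_all)  # model beats treat-all at this threshold
```

A model is clinically useful at a given threshold only if its net benefit exceeds both the treat-all and treat-none (net benefit 0) lines; repeating this over a range of thresholds traces the decision curve.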

  7. Influences on emergency department length of stay for older people.

    PubMed

    Street, Maryann; Mohebbi, Mohammadreza; Berry, Debra; Cross, Anthony; Considine, Julie

    2018-02-14

    The aim of this study was to examine the influences on emergency department (ED) length of stay (LOS) for older people and to develop a predictive model for an ED LOS of more than 4 h. This retrospective cohort study used organizational data linkage at the patient level from a major Australian health service. The study population comprised patients aged 65 years or older attending an ED during the 2013/2014 financial year. We developed and internally validated a clinical prediction rule. Discriminatory performance of the model was evaluated by receiver operating characteristic (ROC) curve analysis. An integer-based risk score was developed using multivariate logistic regression and evaluated using ROC analysis. There were 33,926 ED attendances: 57.5% (n=19,517) had an ED LOS of more than 4 h. The area under the ROC curve for a model comprising age, usual accommodation, triage category, arrival by ambulance, arrival overnight, imaging, laboratory investigations, overcrowding, time to be seen by a doctor, ED visits with admission, and access block was 0.796, indicating good performance. In the validation set, the area under the ROC curve was 0.80, the P-value was 0.36, and the prediction mean square error was 0.18, indicating good calibration. The risk score value attributed to each risk factor ranged from 2 to 68 points, and the clinical prediction rule stratified patients into five levels of risk on the basis of the total risk score. Objective identification of older people at intermediate and high risk of an ED LOS of more than 4 h early in ED care enables targeted approaches to streamline the patient journey, decrease ED LOS, and optimize emergency care for older people.
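
Integer risk scores of the kind described are commonly derived by scaling logistic-regression coefficients to points (Framingham-style). A minimal sketch with hypothetical coefficients, not the study's fitted model:

```python
import numpy as np

def coefficients_to_points(betas, base=None, scale=1.0):
    """Scale logistic-regression coefficients to integer points:
    points_i = round(beta_i / base * scale). By default the smallest
    nonzero effect becomes the 1-point reference."""
    betas = np.asarray(betas, dtype=float)
    if base is None:
        base = np.min(np.abs(betas[betas != 0]))
    return np.round(betas / base * scale).astype(int)

# hypothetical predictors of an ED stay > 4 h, with made-up log-odds
names = ["arrival by ambulance", "access block", "laboratory tests", "overnight arrival"]
betas = [0.35, 1.40, 0.70, 0.18]
points = coefficients_to_points(betas)
print(dict(zip(names, points.tolist())))
```

Summing a patient's points then gives a total score whose strata (here, five risk levels) can be mapped back to predicted probabilities.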

  8. Measuring Value-at-Risk and Expected Shortfall of crude oil portfolio using extreme value theory and vine copula

    NASA Astrophysics Data System (ADS)

    Yu, Wenhua; Yang, Kun; Wei, Yu; Lei, Likun

    2018-01-01

    Volatilities of crude oil prices have important impacts on the steady and sustainable development of the world real economy, so it is of great academic and practical significance to model and measure the volatility and risk of crude oil markets accurately. This paper measures the Value-at-Risk (VaR) and Expected Shortfall (ES) of a portfolio consisting of four crude oil assets by using GARCH-type models, extreme value theory (EVT), and vine copulas. The backtesting results show that the combination of GARCH-type-EVT models and vine copula methods can produce accurate risk measures for the oil portfolio. The mixed R-vine copula is more flexible and superior to the other vine copulas. Different GARCH-type models, which capture the long memory and/or leverage effects of oil price volatility, nevertheless yield similar marginal distributions of the oil returns.
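
Once portfolio returns have been simulated from a fitted model, VaR and ES reduce to a tail quantile and a tail mean. A minimal sketch using plain historical simulation on fat-tailed synthetic returns, standing in for the paper's full GARCH-EVT-copula pipeline:

```python
import numpy as np

def var_es(returns, alpha=0.05):
    """VaR and ES at level alpha from a sample of portfolio returns.
    Losses are negated returns; VaR is the (1-alpha) loss quantile,
    ES the mean loss at or beyond it."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, 1 - alpha)
    es = losses[losses >= var].mean()
    return var, es

rng = np.random.default_rng(42)
# hypothetical equal-weight portfolio of four fat-tailed oil-return series
rets = rng.standard_t(df=4, size=(10_000, 4)) * 0.02
port = rets.mean(axis=1)
var, es = var_es(port, alpha=0.05)
print(es > var > 0)  # ES is always at least as severe as VaR
```

ES averages over the whole tail rather than reading off a single quantile, which is why it is preferred for capturing the extreme co-movements the copula is fitted to model.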

  9. Integrating Tenascin-C protein expression and 1q25 copy number status in pediatric intracranial ependymoma prognostication: A new model for risk stratification.

    PubMed

    Andreiuolo, Felipe; Le Teuff, Gwénaël; Bayar, Mohamed Amine; Kilday, John-Paul; Pietsch, Torsten; von Bueren, André O; Witt, Hendrik; Korshunov, Andrey; Modena, Piergiorgio; Pfister, Stefan M; Pagès, Mélanie; Castel, David; Giangaspero, Felice; Chimelli, Leila; Varlet, Pascale; Rutkowski, Stefan; Frappaz, Didier; Massimino, Maura; Grundy, Richard; Grill, Jacques

    2017-01-01

    Despite multimodal therapy, the prognosis of pediatric intracranial ependymomas remains poor, with a 5-year survival rate below 70% and frequent late deaths. This multicentric European study evaluated putative prognostic biomarkers. Tenascin-C (TNC) immunohistochemical expression and the copy number status of 1q25 were retained for a pooled analysis of 5 independent cohorts. The prognostic value of TNC and 1q25 for overall survival (OS) was assessed using a Cox model adjusted for age at diagnosis, tumor location, WHO grade, extent of resection, and radiotherapy (RT), and stratified by cohort. Stratification on a predictor that did not satisfy the proportional hazards assumption was considered. Model performance was evaluated, and an internal-external cross-validation was performed. Among complete cases with 5-year median follow-up (n = 470; 131 deaths), TNC and 1q25 gain were significantly associated with age at diagnosis and posterior fossa tumor location. 1q25 status added independent prognostic value for death beyond the classical variables, with a hazard ratio (HR) of 2.19 (95% CI 1.29-3.76; p = 0.004), while the prognostic relation of TNC was tumor location-dependent, with HR = 2.19 (95% CI 1.29-3.76; p = 0.004) in the posterior fossa and HR = 0.64 (95% CI 0.28-1.48; p = 0.295) in supratentorial tumors (interaction p = 0.015). The derived prognostic score identified 3 robust risk groups. The omission of upfront RT was not associated with OS in the good and intermediate prognostic groups, while the absence of upfront RT was negatively associated with OS in the poor-risk group. Integrating TNC expression and 1q25 status is useful for better stratifying patients and eventually adapting treatment regimens in pediatric intracranial ependymoma.

  10. Breeding objectives for pigs in Kenya. II: economic values incorporating risks in different smallholder production systems.

    PubMed

    Mbuthia, Jackson Mwenda; Rewe, Thomas Odiwuor; Kahi, Alexander Kigunzu

    2015-02-01

    This study estimated economic values for production traits (dressing percentage (DP), %; live weight for growers (LWg), kg; live weight for sows (LWs), kg) and functional traits (feed intake for growers (FEEDg); feed intake for sows (FEEDs); preweaning survival rate (PrSR), %; postweaning survival rate (PoSR), %; sow survival rate (SoSR), %; total number of piglets born (TNB); and farrowing interval (FI), days) under different smallholder pig production systems in Kenya. Economic values were estimated under two production circumstances: fixed herd and fixed feed. Under the fixed-herd scenario, economic values were estimated assuming a situation where the herd cannot be increased due to constraints other than feed resources. The fixed-feed scenario assumed that herd size is restricted by the limitation of available feed resources. In addition to the traditional profit model, a risk-rated bio-economic model was used to derive risk-rated economic values. This model accounted for imperfect knowledge concerning farmers' risk attitudes and the variance of input and output prices. The positive economic values obtained for DP, LWg, LWs, PoSR, PrSR, SoSR, and TNB indicate that targeting these traits for improvement would positively impact profitability in pig breeding programmes. Under the fixed-feed basis, the risk-rated economic values for DP, LWg, LWs, and SoSR were similar to those obtained under the fixed-herd situation. Accounting for risks in the economic values did not yield errors greater than ±50% in any of the production systems or bases of evaluation, meaning there would be relatively little effect on the real genetic gain of a selection index. Therefore, both traditional and risk-rated models can satisfactorily be used to predict profitability in pig breeding programmes.

  11. Bioconcentration of gaseous organic chemicals in plant leaves: Comparison of experimental data with model predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polder, M.D.; Hulzebos, E.M.; Jager, D.T.

    1998-01-01

    This literature study was performed to support the implementation of two models in a risk assessment system for the evaluation of chemicals and their risk to human health and the environment. One of the exposure pathways for humans and cattle is the uptake of chemicals by plants. In this risk assessment system, the transfer of gaseous organic substances from air to plants as modeled by Riederer is included. A similar model with a more refined approach, including dilution by growth, was proposed by Trapp and Matthies and implemented in the European version of this risk assessment system (EUSES). In this study both models are evaluated by comparison with experimental data on leaf/air partition coefficients found in the literature. For herbaceous plants, both models give good estimates of the leaf/air partition coefficient up to 10^7, with deviations for most substances within a factor of five. For the azalea and spruce group, the fit between experimental BCF values and the calculated model values is less adequate. For substances for which Riederer estimates a leaf/air partition coefficient above 10^7, the approach of Trapp and Matthies seems more adequate; however, few data were available.

  12. Coronary artery calcium scoring does not add prognostic value to standard 64-section CT angiography protocol in low-risk patients suspected of having coronary artery disease.

    PubMed

    Kwon, Sung Woo; Kim, Young Jin; Shim, Jaemin; Sung, Ji Min; Han, Mi Eun; Kang, Dong Won; Kim, Ji-Ye; Choi, Byoung Wook; Chang, Hyuk-Jae

    2011-04-01

    To evaluate the prognostic outcome of cardiac computed tomography (CT) for prediction of major adverse cardiac events (MACEs) in low-risk patients suspected of having coronary artery disease (CAD) and to explore the differential prognostic values of coronary artery calcium (CAC) scoring and coronary CT angiography. Institutional review committee approval and informed consent were obtained. In 4338 patients who underwent 64-section CT for evaluation of suspected CAD, both CAC scoring and CT angiography were concurrently performed by using standard scanning protocols. Follow-up clinical outcome data regarding composite MACEs were procured. Multivariable Cox proportional hazards models were developed to predict MACEs. Risk-adjusted models incorporated traditional risk factors for CAC scoring and coronary CT angiography. During the mean follow-up of 828 days ± 380, there were 105 MACEs, for an event rate of 3%. The presence of obstructive CAD at coronary CT angiography had independent prognostic value, which escalated according to the number of stenosed vessels (P < .001). In the receiver operating characteristic curve (ROC) analysis, the superiority of coronary CT angiography to CAC scoring was demonstrated by a significantly greater area under the ROC curve (AUC) (0.892 vs 0.810, P < .001), whereas no significant incremental value for the addition of CAC scoring to coronary CT angiography was established (AUC = 0.892 for coronary CT angiography alone vs 0.902 with addition of CAC scoring, P = .198). Coronary CT angiography is better than CAC scoring in predicting MACEs in low-risk patients suspected of having CAD. Furthermore, the current standard multisection CT protocol (coronary CT angiography combined with CAC scoring) has no incremental prognostic value compared with coronary CT angiography alone. Therefore, in terms of determining prognosis, CAC scoring may no longer need to be incorporated in the cardiac CT protocol in this population. © RSNA, 2011.

  13. Risk adjustment in the American College of Surgeons National Surgical Quality Improvement Program: a comparison of logistic versus hierarchical modeling.

    PubMed

    Cohen, Mark E; Dimick, Justin B; Bilimoria, Karl Y; Ko, Clifford Y; Richards, Karen; Hall, Bruce Lee

    2009-12-01

    Although logistic regression has commonly been used to adjust for risk differences in patient and case mix to permit quality comparisons across hospitals, hierarchical modeling has been advocated as the preferred methodology because it accounts for the clustering of patients within hospitals. It is unclear whether hierarchical models would yield important differences in quality assessments compared with logistic models when applied to American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) data. Our objective was to evaluate differences between logistic and hierarchical modeling for identifying hospitals with outlying outcomes in the ACS-NSQIP. Data from ACS-NSQIP patients who underwent colorectal operations in 2008 at hospitals that reported at least 100 operations were used to generate logistic and hierarchical prediction models for 30-day morbidity and mortality. Differences in risk-adjusted performance (ratio of observed-to-expected events) and outlier detections from the two models were compared. The logistic and hierarchical models identified the same 25 hospitals as morbidity outliers (14 low and 11 high outliers), but the hierarchical model identified 2 additional high outliers. Both models identified the same eight hospitals as mortality outliers (five low and three high outliers). The observed-to-expected event ratios and p values from the two models were highly correlated. Results were similar when data from hospitals providing fewer than 100 patients were included. When applied to ACS-NSQIP data, logistic and hierarchical models provided nearly identical results with respect to the identification of hospitals' observed-to-expected event ratio outliers. As hierarchical models are prone to implementation problems, logistic regression will remain an accurate and efficient method for performing risk adjustment of hospital quality comparisons.
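
Observed-to-expected (O/E) ratio screening of this kind can be sketched with an exact Poisson test of a hospital's observed event count against its model-expected count. This is a simplified stand-in with hypothetical counts, not NSQIP's actual outlier methodology:

```python
from scipy.stats import poisson

def oe_outlier(observed, expected, alpha=0.05):
    """Flag a hospital whose observed events deviate from the
    risk-model expectation, via a two-sided exact Poisson test."""
    oe = observed / expected
    p_low = poisson.cdf(observed, expected)      # P(X <= observed)
    p_high = poisson.sf(observed - 1, expected)  # P(X >= observed)
    p = min(1.0, 2 * min(p_low, p_high))
    return oe, p, p < alpha

# hypothetical hospital: 30 observed complications vs 15 expected
oe, p, flag = oe_outlier(observed=30, expected=15.0)
print(flag)  # flagged as a high outlier
```

Hospitals with O/E significantly above 1 are high outliers and those significantly below 1 are low outliers; the hierarchical approach additionally shrinks small hospitals' ratios toward 1 before testing.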

  14. A risk-based multi-objective model for optimal placement of sensors in water distribution system

    NASA Astrophysics Data System (ADS)

    Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein

    2018-02-01

    In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for the optimal placement of sensors in a water distribution system (WDS). This model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using the CVaR approach. The CVaR approach considers the uncertainties of contamination injection in the form of a probability distribution function and captures low-probability extreme events, in which extreme losses occur in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses from contamination injection (through the CVaR of the affected population and detection time) and also to minimize the two other main criteria of optimal sensor placement: the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi-Criteria Decision Making (MCDM) approach, is used to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also performed to investigate the importance of each criterion to the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in southwestern Iran. PROMETHEE suggests 6 sensors with a suitable distribution that covers approximately all regions of the WDS. The optimal values of the CVaR of the affected population, detection time, and probability of undetected events for the best solution are 17,055 persons, 31 min, and 0.045%, respectively. The results for the Lamerd WDS show the applicability of the CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme losses in a WDS.
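
For a discrete set of contamination scenarios, the CVaR objective used above is the expected loss in the worst (1 − alpha) probability tail, splitting the probability atom that straddles VaR. A minimal sketch with hypothetical scenario losses and probabilities:

```python
import numpy as np

def cvar(losses, probs, alpha=0.95):
    """CVaR_alpha of a discrete loss distribution: average loss over
    the worst (1 - alpha) probability mass, splitting the atom at VaR."""
    order = np.argsort(losses)[::-1]            # worst losses first
    l, p = np.asarray(losses, float)[order], np.asarray(probs, float)[order]
    tail = 1.0 - alpha                          # tail mass to average over
    cum = np.cumsum(p)
    # probability mass taken from each atom, clipped so exactly `tail` is used
    take = np.clip(tail - (cum - p), 0.0, p)
    return float(np.sum(l * take) / tail)

# hypothetical contamination scenarios: (affected population, probability)
losses = [20000, 12000, 5000, 1000, 0]
probs = [0.01, 0.04, 0.10, 0.35, 0.50]
print(cvar(losses, probs, alpha=0.95))  # → 13600.0
```

Minimizing this tail average, rather than the plain expectation, is what makes the sensor placement robust to rare but severe injection events.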

  15. Risk of false decision on conformity of a multicomponent material when test results of the components' content are correlated.

    PubMed

    Kuselman, Ilya; Pennecchi, Francesca R; da Silva, Ricardo J N B; Hibbert, D Brynn

    2017-11-01

    The probability of a false decision on conformity of a multicomponent material due to measurement uncertainty is discussed when test results are correlated. Specification limits of the components' content of such a material generate a multivariate specification interval/domain. When true values of components' content and corresponding test results are modelled by multivariate distributions (e.g. by multivariate normal distributions), a total global risk of a false decision on the material conformity can be evaluated based on calculation of integrals of their joint probability density function. No transformation of the raw data is required for that. A total specific risk can be evaluated as the joint posterior cumulative function of true values of a specific batch or lot lying outside the multivariate specification domain, when the vector of test results, obtained for the lot, is inside this domain. It was shown, using a case study of four components under control in a drug, that the correlation influence on the risk value is not easily predictable. To assess this influence, the evaluated total risk values were compared with those calculated for independent test results and also with those assuming much stronger correlation than that observed. While the observed statistically significant correlation did not lead to a visible difference in the total risk values in comparison to the independent test results, the stronger correlation among the variables caused either the total risk decreasing or its increasing, depending on the actual values of the test results. Copyright © 2017 Elsevier B.V. All rights reserved.
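
The total global risk described, the probability that the true content lies outside the specification domain while the test result falls inside it, can be sketched by Monte Carlo over a correlated bivariate model. The spec limits, production distribution, and measurement uncertainties below are hypothetical illustrations, not the paper's drug case:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# hypothetical two-component material: spec limits per component
low, high = np.array([9.0, 4.5]), np.array([11.0, 5.5])

# true contents: correlated bivariate normal production distribution
mean = [10.0, 5.0]
cov = [[0.16, 0.06],
       [0.06, 0.09]]
true = rng.multivariate_normal(mean, cov, size=n)

# test results: true value plus independent measurement error
sigma_m = np.array([0.15, 0.10])
measured = true + rng.normal(0.0, sigma_m, size=(n, 2))

def inside(x):
    """Both components within their specification limits."""
    return np.all((x >= low) & (x <= high), axis=1)

# total global consumer's risk: accepted on measurement, truly non-conforming
risk = np.mean(inside(measured) & ~inside(true))
print(round(risk, 3))
```

With a multivariate normal model this probability can equivalently be obtained by integrating the joint density, as the paper does; the Monte Carlo version makes the effect of the correlation term in `cov` easy to probe by re-running with different off-diagonal values.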

  16. Science You Can Use Bulletin: Wildfire triage: Targeting mitigation based on social, economic, and ecological values

    Treesearch

    Karl Malcolm; Matthew Thompson; Dave Calkin; Mark Finney; Alan Ager

    2012-01-01

    Evaluating the risks of wildfire relative to the valuable resources found in any managed landscape requires an interdisciplinary approach. Researchers at the Rocky Mountain Research Station and Western Wildland Threat Assessment Center developed such a process, using a combination of techniques rooted in fire modeling and ecology, economics, decision sciences, and the...

  17. A nomogram based on mammary ductoscopic indicators for evaluating the risk of breast cancer in intraductal neoplasms with nipple discharge.

    PubMed

    Lian, Zhen-Qiang; Wang, Qi; Zhang, An-Qin; Zhang, Jiang-Yu; Han, Xiao-Rong; Yu, Hai-Yun; Xie, Si-Mei

    2015-04-01

Mammary ductoscopy (MD) is commonly used to detect intraductal lesions associated with nipple discharge. This study investigated the relationships between ductoscopic image-based indicators and breast cancer risk, and developed a nomogram for evaluating breast cancer risk in intraductal neoplasms with nipple discharge. A total of 879 consecutive inpatients (916 breasts) with nipple discharge who underwent selective duct excision for intraductal neoplasms detected by MD from June 2008 to April 2014 were analyzed retrospectively. A nomogram was developed using a multivariate logistic regression model based on data from a training set (687 cases) and validated in an independent validation set (229 cases). A Youden-derived cut-off value was assigned to the nomogram for the diagnosis of breast cancer. Color of discharge; location, appearance, and surface of the neoplasm; and morphology of the ductal wall were independent predictors of breast cancer in the multivariate logistic regression analysis. A nomogram based on these predictors performed well. The P value of the Hosmer-Lemeshow test for the prediction model was 0.36. Area under the curve values of 0.812 (95 % confidence interval (CI) 0.763-0.860) and 0.738 (95 % CI 0.635-0.841) were obtained in the training and validation sets, respectively. The accuracies of the nomogram for breast cancer diagnosis were 71.2 % in the training set and 75.5 % in the validation set. We developed a nomogram for evaluating breast cancer risk in intraductal neoplasms with nipple discharge based on MD image findings. This model may aid individual risk assessment and guide treatment in clinical practice.

  18. SU-E-T-128: Applying Failure Modes and Effects Analysis to a Risk-Based Quality Management for Stereotactic Radiosurgery in Brazil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teixeira, F; Universidade do Estado do Rio de Janeiro, Rio De Janeiro, RJ; Almeida, C de

    2015-06-15

Purpose: The goal of the present work was to evaluate the process maps for stereotactic radiosurgery (SRS) treatment at three radiotherapy centers in Brazil and apply the FMEA technique to evaluate similarities and differences, if any, of the hazards and risks associated with these processes. Methods: A team, consisting of professionals from different disciplines involved in the SRS treatment, was formed at each center. Each team was responsible for the development of the process map and performance of FMEA and FTA. A facilitator knowledgeable in these techniques led the work at each center. The TG-100 recommended scales were used for the evaluation of hazard and severity of each step of the major process "treatment planning". Results: The hazard index given by the Risk Priority Number (RPN) is found to range from 4-270 for the various processes, and the severity (S) index is found to range from 1-10. RPN values > 100 and severity values ≥ 7 were chosen to flag safety improvement interventions. The numbers of steps with RPN ≥ 100 were found to be 6, 59 and 45 for the three centers; the corresponding values for S ≥ 7 are 24, 21 and 25, respectively. The ranges of RPN and S values for each center belong to different process steps and failure modes. Conclusion: These results show that the interventions needed to improve safety are different for each center and are associated with the skill level of the professional team as well as the technology used to provide radiosurgery treatment. The present study will very likely be a model for implementation of a risk-based prospective quality management program for SRS treatment in Brazil, where currently there are 28 radiotherapy centers performing SRS. A complete FMEA for SRS at these three radiotherapy centers is currently under development.
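The RPN screening described above can be sketched as follows. The process steps and their occurrence/severity/detectability scores are invented; only the TG-100-style flagging thresholds (RPN > 100 or S ≥ 7) come from the abstract.

```python
# Each step carries occurrence (O), severity (S) and detectability (D) scores;
# RPN = O * S * D, and a step is flagged if RPN > 100 or S >= 7.
steps = [
    # (process step, O, S, D) -- hypothetical examples, not from the study
    ("contour target volume", 4, 8, 5),
    ("verify isocenter",      2, 9, 2),
    ("export plan",           3, 4, 3),
]

flagged = []
for name, o, s, d in steps:
    rpn = o * s * d
    if rpn > 100 or s >= 7:
        flagged.append((name, rpn, s))
```

Note that "verify isocenter" is flagged despite a low RPN, because a high severity score alone triggers review under this rule.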

  19. Mapping Dependence Between Extreme Rainfall and Storm Surge

    NASA Astrophysics Data System (ADS)

    Wu, Wenyan; McInnes, Kathleen; O'Grady, Julian; Hoeke, Ron; Leonard, Michael; Westra, Seth

    2018-04-01

Dependence between extreme storm surge and rainfall can have significant implications for flood risk in coastal and estuarine regions. To supplement limited observational records, we use reanalysis surge data from a hydrodynamic model as the basis for dependence mapping, providing information at a resolution of approximately 30 km along the Australian coastline. We evaluated this approach by comparing the dependence estimates from modeled surge to those calculated using historical surge records from 79 tide gauges around Australia. The results show reasonable agreement between the two sets of dependence values, with the exception of lower seasonal variation in the modeled dependence values compared to the observed data, especially at locations where there are multiple processes driving extreme storm surge. This is due to the combined impact of local bathymetry as well as the resolution of the hydrodynamic model and its meteorological inputs. Meteorological drivers were also investigated for different combinations of extreme rainfall and surge (rain-only, surge-only, and coincident extremes), finding that different synoptic patterns are responsible for each combination. The ability to supplement observational records with high-resolution modeled surge data enables a much more precise quantification of dependence along the coastline, strengthening the physical basis for assessments of flood risk in coastal regions.

  20. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1986-01-01

The risks, values, and costs of the SETI project are evaluated and compared with those of the Viking project. Examination of the scientific values, side benefits, and costs of the two projects reveals that both provide equal benefits at equal costs. The probability of scientific and technical success is analyzed.

  1. Incremental Value of Repeated Risk Factor Measurements for Cardiovascular Disease Prediction in Middle-Aged Korean Adults: Results From the NHIS-HEALS (National Health Insurance System-National Health Screening Cohort).

    PubMed

    Cho, In-Jeong; Sung, Ji Min; Chang, Hyuk-Jae; Chung, Namsik; Kim, Hyeon Chang

    2017-11-01

Increasing evidence suggests that repeatedly measured cardiovascular disease (CVD) risk factors may have an additive predictive value compared with single measured levels. Thus, we evaluated the incremental predictive value of incorporating periodic health screening data for CVD prediction in a large nationwide cohort with periodic health screening tests. A total of 467 708 persons aged 40 to 79 years and free from CVD were randomly divided into development (70%) and validation (30%) subcohorts. We developed 3 different CVD prediction models: a single measure model using single time point screening data; a longitudinal average model using average risk factor values from periodic screening data; and a longitudinal summary model using average values and the variability of risk factors. The development subcohort included 327 396 persons who had 3.2 health screenings on average and 25 765 cases of CVD over 12 years. The C statistics (95% confidence interval [CI]) for the single measure, longitudinal average, and longitudinal summary models were 0.690 (95% CI, 0.682-0.698), 0.695 (95% CI, 0.687-0.703), and 0.752 (95% CI, 0.744-0.760) in men and 0.732 (95% CI, 0.722-0.742), 0.735 (95% CI, 0.725-0.745), and 0.790 (95% CI, 0.780-0.800) in women, respectively. The net reclassification index from the single measure model to the longitudinal average model was 1.78% in men and 1.33% in women, and the index from the longitudinal average model to the longitudinal summary model was 32.71% in men and 34.98% in women. Using averages of repeatedly measured risk factor values modestly improves CVD predictability compared with single measurement values. Incorporating the average and variability information of repeated measurements can lead to great improvements in disease prediction. URL: https://www.clinicaltrials.gov. Unique identifier: NCT02931500. © 2017 American Heart Association, Inc.

  2. A step function model to evaluate the real monetary value of man-sievert with real GDP.

    PubMed

    Na, Seong H; Kim, Sun G

    2009-01-01

For use in cost-benefit analyses to establish optimum levels of radiation protection in Korea under the ALARA principle, we introduce a discrete step function model to evaluate the monetary value of the man-sievert in real economic terms. The model formula, which is unique and country-specific, is composed of real GDP, the nominal risk coefficient for cancer and hereditary effects, the aversion factor against radiation exposure, and average life expectancy. Unlike previous research on alpha-value assessment, we show different alpha values in real terms, differentiated with respect to the range of individual doses, which is more realistic and informative for application to radiation protection practices. GDP deflators can reflect the economic situation of the society. Finally, we suggest that the Korean model can be generalized simply to other countries without normalizing any country-specific factors.
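The step-function idea can be sketched as below. The base value, dose bands, and aversion factors are invented placeholders, since the paper's country-specific coefficients are not reproduced in the abstract.

```python
BASE_ALPHA = 50_000  # hypothetical base man-sievert value derived from real GDP
AVERSION_STEPS = [   # (upper bound of individual dose band in mSv, aversion factor)
    (1.0, 1.0),
    (10.0, 2.0),
    (100.0, 5.0),
]

def alpha_value(dose_msv):
    """Monetary value of a man-sievert as a discrete step function of individual dose."""
    for upper, factor in AVERSION_STEPS:
        if dose_msv <= upper:
            return BASE_ALPHA * factor
    return BASE_ALPHA * 10.0  # highest aversion band

protection_value_low = alpha_value(0.5)    # low-dose band
protection_value_high = alpha_value(50.0)  # higher-dose band, larger aversion factor
```

The step structure captures the abstract's point that the alpha value is differentiated by individual dose range rather than being a single constant.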

  3. Computer-Based Model Calibration and Uncertainty Analysis: Terms and Concepts

    DTIC Science & Technology

    2015-07-01

uncertainty analyses throughout the lifecycle of planning, designing, and operating of Civil Works flood risk management projects as described in...value 95% of the time. In the frequentist approach to PE, model parameters are regarded as having true values, and their estimate is based on the...in catchment models. 1. Evaluating parameter uncertainty. Water Resources Research 19(5):1151–1172. Lee, P. M. 2012. Bayesian statistics: An

  4. Forecasting VaR and ES of stock index portfolio: A Vine copula method

    NASA Astrophysics Data System (ADS)

    Zhang, Bangzheng; Wei, Yu; Yu, Jiang; Lai, Xiaodong; Peng, Zhenfeng

    2014-12-01

Risk measurement has both theoretical and practical significance in risk management. Using a daily sample of 10 international stock indices, this paper first models the internal dependence structures among the different stock markets with C-Vine, D-Vine and R-Vine copula models. Secondly, the Value-at-Risk (VaR) and Expected Shortfall (ES) of the international stock market portfolio are forecasted using a Monte Carlo method based on the dependence estimated by the different Vine copulas. Finally, the accuracy of the VaR and ES measurements obtained from the different statistical models is evaluated by UC, IND, CC and Posterior analysis. The empirical results show that the VaR forecasts at the quantile levels of 0.9, 0.95, 0.975 and 0.99 with the three kinds of Vine copula models are sufficiently accurate. Several traditional methods, such as historical simulation, mean-variance and DCC-GARCH models, fail to pass the CC backtest. The Vine copula methods can accurately forecast the ES of the portfolio on the basis of the VaR measurement, and the D-Vine copula model is superior to the other Vine copulas.
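The UC backtest mentioned above is usually Kupiec's unconditional coverage likelihood-ratio test, sketched here under the assumption that the violation count is strictly between 0 and the number of observations (so the logarithms are defined):

```python
import math

def kupiec_uc(n_obs, n_viol, p):
    """Kupiec LR_uc statistic: chi-square with 1 df under correct VaR coverage p."""
    pi = n_viol / n_obs  # observed violation rate
    def loglik(q):
        # binomial log-likelihood of the violation sequence under rate q
        return n_viol * math.log(q) + (n_obs - n_viol) * math.log(1 - q)
    return -2 * (loglik(p) - loglik(pi))

# 10 violations in 1000 days exactly matches 1% VaR coverage -> LR = 0
lr_ok = kupiec_uc(1000, 10, 0.01)
# 25 violations is far too many for 1% coverage -> LR exceeds the 3.84 cut-off
lr_bad = kupiec_uc(1000, 25, 0.01)
```

The IND and CC tests extend this by also checking whether violations cluster in time; a VaR model must match both the rate and the independence of exceedances.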

  5. Evaluation and Enhancement of Calibration in the American College of Surgeons NSQIP Surgical Risk Calculator.

    PubMed

    Liu, Yaoming; Cohen, Mark E; Hall, Bruce L; Ko, Clifford Y; Bilimoria, Karl Y

    2016-08-01

The American College of Surgeons (ACS) NSQIP Surgical Risk Calculator has been widely adopted as a decision aid and informed consent tool by surgeons and patients. Previous evaluations showed excellent discrimination and combined discrimination and calibration, but model calibration alone, and the potential benefits of recalibration, were not explored. Because lack of calibration can lead to systematic errors in assessing surgical risk, our objective was to assess calibration and determine whether spline-based adjustments could improve it. We evaluated Surgical Risk Calculator model calibration, as well as discrimination, for each of 11 outcomes modeled from nearly 3 million patients (2010 to 2014). Using independent random subsets of data, we evaluated model performance for the Development (60% of records), Validation (20%), and Test (20%) datasets, where prediction equations from the Development dataset were recalibrated using restricted cubic splines estimated from the Validation dataset. We also evaluated performance on data subsets composed of higher-risk operations. The nonrecalibrated Surgical Risk Calculator performed well, but there was a slight tendency for predicted risk to be overestimated for the lowest- and highest-risk patients and underestimated for moderate-risk patients. After recalibration, this distortion was eliminated, and p values for miscalibration were most often nonsignificant. Calibration was also excellent for subsets of higher-risk operations, though observed calibration was reduced due to instability associated with smaller sample sizes. Performance of the NSQIP Surgical Risk Calculator models was shown to be excellent and improved with recalibration. Surgeons and patients can rely on the calculator to provide accurate estimates of surgical risk. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  6. Comparison of different risk stratification systems in predicting short-term serious outcome of syncope patients.

    PubMed

    Safari, Saeed; Baratloo, Alireza; Hashemi, Behrooz; Rahmati, Farhad; Forouzanfar, Mohammad Mehdi; Motamedi, Maryam; Mirmohseni, Ladan

    2016-01-01

Determining etiologic causes and prognosis can significantly improve management of syncope patients. The present study aimed to compare the values of the San Francisco, Osservatorio Epidemiologico sulla Sincope nel Lazio (OESIL), Boston, and Risk Stratification of Syncope in the Emergency Department (ROSE) score clinical decision rules in predicting the short-term serious outcome of syncope patients. The present diagnostic accuracy study with 1-week follow-up was designed to evaluate the predictive values of the four mentioned clinical decision rules. Screening performance characteristics of each model in predicting mortality, myocardial infarction (MI), and cerebrovascular accidents (CVAs) were calculated and compared. To evaluate the value of each aforementioned model in predicting the outcome, sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio were calculated, and receiver operating characteristic (ROC) curve analysis was done. A total of 187 patients (mean age: 64.2 ± 17.2 years) were enrolled in the study. Mortality, MI, and CVA were seen in 19 (10.2%), 12 (6.4%), and 36 (19.2%) patients, respectively. The area under the ROC curve for the OESIL, San Francisco, Boston, and ROSE models in predicting the risk of 1-week mortality, MI, and CVA was in the 30-70% range, with no significant difference among models (P > 0.05). The pooled model did not show higher accuracy in predicting mortality, MI, and CVA compared to the others (P > 0.05). This study revealed the weakness of all four evaluated models in predicting the short-term serious outcome of syncope patients referred to the emergency department, without any significant advantage of one over the others.

  7. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper, the authors explain a novel methodology for quantifying risk and ranking critical items in order to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third, the hazardous risk coefficient, covers anticipated hazards that may occur in the future; here the risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence random number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.

  8. Research and Evaluations of the Health Aspects of Disasters, Part IX: Risk-Reduction Framework.

    PubMed

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Loretti, Alessandro

    2016-06-01

    A disaster is a failure of resilience to an event. Mitigating the risks that a hazard will progress into a destructive event, or increasing the resilience of a society-at-risk, requires careful analysis, planning, and execution. The Disaster Logic Model (DLM) is used to define the value (effects, costs, and outcome(s)), impacts, and benefits of interventions directed at risk reduction. A Risk-Reduction Framework, based on the DLM, details the processes involved in hazard mitigation and/or capacity-building interventions to augment the resilience of a community or to decrease the risk that a secondary event will develop. This Framework provides the structure to systematically undertake and evaluate risk-reduction interventions. It applies to all interventions aimed at hazard mitigation and/or increasing the absorbing, buffering, or response capacities of a community-at-risk for a primary or secondary event that could result in a disaster. The Framework utilizes the structure provided by the DLM and consists of 14 steps: (1) hazards and risks identification; (2) historical perspectives and predictions; (3) selection of hazard(s) to address; (4) selection of appropriate indicators; (5) identification of current resilience standards and benchmarks; (6) assessment of the current resilience status; (7) identification of resilience needs; (8) strategic planning; (9) selection of an appropriate intervention; (10) operational planning; (11) implementation; (12) assessments of outputs; (13) synthesis; and (14) feedback. Each of these steps is a transformation process that is described in detail. Emphasis is placed on the role of Coordination and Control during planning, implementation of risk-reduction/capacity building interventions, and evaluation. Birnbaum ML , Daily EK , O'Rourke AP , Loretti A . Research and evaluations of the health aspects of disasters, part IX: Risk-Reduction Framework. Prehosp Disaster Med. 2016;31(3):309-325.

  9. Determination of osteoporosis risk factors using a multiple logistic regression model in postmenopausal Turkish women.

    PubMed

    Akkus, Zeki; Camdeviren, Handan; Celik, Fatma; Gur, Ali; Nas, Kemal

    2005-09-01

To determine the risk factors of osteoporosis using a multiple binary logistic regression method and to assess the risk variables for osteoporosis, which is a major and growing health problem in many countries. We present a case-control study consisting of 126 postmenopausal healthy women as the control group and 225 postmenopausal osteoporotic women as the case group. The study was carried out in the Department of Physical Medicine and Rehabilitation, Dicle University, Diyarbakir, Turkey between 1999-2002. The data from the 351 participants were collected using a standard questionnaire that contains 43 variables. A multiple logistic regression model was then used to evaluate the data and to find the best regression model. We correctly classified 80.1% (281/351) of the participants using the regression model. Furthermore, the specificity of the model was 67% (84/126) in the control group, while the sensitivity was 88% (197/225) in the case group. Using the Kolmogorov-Smirnov test, we found the distribution of the standardized residual values for the final model to be exponential (p=0.193). The receiver operating characteristic curve was found to successfully predict patients at risk for osteoporosis. This study suggests that low levels of dietary calcium intake, physical activity, and education, and a longer duration of menopause are independent predictors of the risk of low bone density in our population. Adequate dietary calcium intake in combination with maintaining daily physical activity, increasing educational level, decreasing birth rate, and limiting the duration of breast-feeding may contribute to healthy bones and play a role in the practical prevention of osteoporosis in Southeast Anatolia. In addition, the findings of the present study indicate that the use of a multivariate statistical method such as multiple logistic regression in osteoporosis, which may be influenced by many variables, is better than univariate statistical evaluation.
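The reported classification figures can be reproduced directly from the confusion counts implied by the abstract (197/225 cases and 84/126 controls correctly classified):

```python
tp, fn = 197, 28  # osteoporotic women classified correctly / incorrectly
tn, fp = 84, 42   # healthy controls classified correctly / incorrectly

sensitivity = tp / (tp + fn)                # 197/225, about 0.88
specificity = tn / (tn + fp)                # 84/126, about 0.67
accuracy = (tp + tn) / (tp + fn + tn + fp)  # 281/351, about 0.801
```

This makes explicit how the 80.1% overall accuracy combines a high sensitivity with a noticeably lower specificity.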

  10. Predictor characteristics necessary for building a clinically useful risk prediction model: a simulation study.

    PubMed

    Schummers, Laura; Himes, Katherine P; Bodnar, Lisa M; Hutcheon, Jennifer A

    2016-09-21

Compelled by the intuitive appeal of predicting each individual patient's risk of an outcome, there is a growing interest in risk prediction models. While the statistical methods used to build prediction models are increasingly well understood, the literature offers little insight to researchers seeking to gauge a priori whether a prediction model is likely to perform well for their particular research question. The objective of this study was to inform the development of new risk prediction models by evaluating model performance under a wide range of predictor characteristics. Data from all births to overweight or obese women in British Columbia, Canada from 2004 to 2012 (n = 75,225) were used to build a risk prediction model for preeclampsia. The data were then augmented with simulated predictors of the outcome with pre-set prevalence values and univariable odds ratios. We built 120 risk prediction models that included known demographic and clinical predictors, and one, three, or five of the simulated variables. Finally, we evaluated standard model performance criteria (discrimination, risk stratification capacity, calibration, and Nagelkerke's r²) for each model. Findings from our models built with simulated predictors demonstrated the predictor characteristics required for a risk prediction model to adequately discriminate cases from non-cases and to adequately classify patients into clinically distinct risk groups. Several predictor characteristics can yield well performing risk prediction models; however, these characteristics are not typical of predictor-outcome relationships in many population-based or clinical data sets. Novel predictors must be both strongly associated with the outcome and prevalent in the population to be useful for clinical prediction modeling (e.g., one predictor with prevalence ≥20 % and odds ratio ≥8, or 3 predictors with prevalence ≥10 % and odds ratios ≥4). Area under the receiver operating characteristic curve values of >0.8 were necessary to achieve reasonable risk stratification capacity. Our findings provide a guide for researchers to estimate the expected performance of a prediction model, before the model has been built, based on the characteristics of available predictors.
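A toy version of such a simulation, assuming a single binary predictor with a preset prevalence and odds ratio in a logistic outcome model; the baseline risk and sample size below are invented, not the study's values.

```python
import math
import random

random.seed(3)

def simulate_auc(prevalence, odds_ratio, base_risk=0.05, n=100_000):
    """Simulate outcomes and return the AUC of the lone binary predictor."""
    b0 = math.log(base_risk / (1 - base_risk))  # intercept from baseline risk
    b1 = math.log(odds_ratio)                   # log-odds-ratio of the predictor
    cases, controls = [], []
    for _ in range(n):
        x = 1 if random.random() < prevalence else 0
        p = 1 / (1 + math.exp(-(b0 + b1 * x)))
        (cases if random.random() < p else controls).append(x)
    # AUC of a binary marker: P(case marker > control marker) + 0.5 * P(tie)
    p_case = sum(cases) / len(cases)
    p_ctrl = sum(controls) / len(controls)
    return p_case * (1 - p_ctrl) + 0.5 * (p_case * p_ctrl + (1 - p_case) * (1 - p_ctrl))

# even a strong, common predictor (prevalence 20%, OR = 8) gives only moderate AUC
auc = simulate_auc(prevalence=0.2, odds_ratio=8.0)
```

On its own this predictor reaches an AUC of roughly 0.72, which illustrates the paper's point that clinically useful discrimination typically requires such a predictor on top of other covariates, or several of them.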

  11. Estimation and prediction under local volatility jump-diffusion model

    NASA Astrophysics Data System (ADS)

    Kim, Namhyoung; Lee, Younhee

    2018-02-01

    Volatility is an important factor in operating a company and managing risk. In the portfolio optimization and risk hedging using the option, the value of the option is evaluated using the volatility model. Various attempts have been made to predict option value. Recent studies have shown that stochastic volatility models and jump-diffusion models reflect stock price movements accurately. However, these models have practical limitations. Combining them with the local volatility model, which is widely used among practitioners, may lead to better performance. In this study, we propose a more effective and efficient method of estimating option prices by combining the local volatility model with the jump-diffusion model and apply it using both artificial and actual market data to evaluate its performance. The calibration process for estimating the jump parameters and local volatility surfaces is divided into three stages. We apply the local volatility model, stochastic volatility model, and local volatility jump-diffusion model estimated by the proposed method to KOSPI 200 index option pricing. The proposed method displays good estimation and prediction performance.
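For illustration, here is a Monte Carlo price of a European call under a Merton-style jump-diffusion with constant volatility; in the paper's model the constant sigma would be replaced by a calibrated local volatility surface, and all parameter values below are invented.

```python
import math
import random

random.seed(7)

def poisson(lam):
    """Knuth's method for sampling a Poisson count with mean lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def mc_call_jump_diffusion(s0, k, r, sigma, t, lam, mu_j, sig_j, n_paths=50_000):
    """European call price under jump-diffusion with lognormal jump sizes."""
    kappa = math.exp(mu_j + 0.5 * sig_j ** 2) - 1    # mean relative jump size
    drift = (r - lam * kappa - 0.5 * sigma ** 2) * t  # jump-compensated risk-neutral drift
    total = 0.0
    for _ in range(n_paths):
        jumps = sum(random.gauss(mu_j, sig_j) for _ in range(poisson(lam * t)))
        st = s0 * math.exp(drift + sigma * math.sqrt(t) * random.gauss(0, 1) + jumps)
        total += max(st - k, 0.0)
    return math.exp(-r * t) * total / n_paths

price = mc_call_jump_diffusion(s0=100, k=100, r=0.05, sigma=0.2,
                               t=1.0, lam=0.5, mu_j=-0.1, sig_j=0.15)
```

The compensator term `lam * kappa` in the drift keeps the discounted price process a martingale; dropping it is a common implementation bug that biases prices.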

  12. Exploring heterogeneous market hypothesis using realized volatility

    NASA Astrophysics Data System (ADS)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

This study investigates the heterogeneous market hypothesis using high-frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicated the presence of long memory behaviour and predictability elements in the financial time series, which supports the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power-variation realized volatilities for forecast evaluation and risk measurement, in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used in determining the market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies and risk management.

  13. Assessing the performance of remotely-sensed flooding indicators and their potential contribution to early warning for leptospirosis in Cambodia

    PubMed Central

    Ledien, Julia; Sorn, Sopheak; Hem, Sopheak; Huy, Rekol; Buchy, Philippe

    2017-01-01

Remote sensing can contribute to early warning for diseases with environmental drivers, such as flooding for leptospirosis. In this study we assessed whether and which remotely-sensed flooding indicator could be used in Cambodia to study any disease for which flooding has already been identified as an important driver, using leptospirosis as a case study. The performance of six potential flooding indicators was assessed by ground truthing. The Modified Normalized Difference Water Index (MNDWI) was used to estimate the Risk Ratio (RR) of being infected by leptospirosis when exposed to the floods it detected, in particular during the rainy season. Chi-square tests were also calculated. Another variable, the time elapsed since the first flooding of the year, was created using MNDWI values and was also included as an explanatory variable in a generalized linear model (GLM) and in a boosted regression tree model (BRT) of leptospirosis infections, along with other explanatory variables. Interestingly, the MNDWI thresholds for both detecting water and predicting the risk of leptospirosis seroconversion were independently evaluated at -0.3. An MNDWI value greater than -0.3 was significantly related to leptospirosis infection (RR = 1.61 [1.10–1.52]; χ2 = 5.64, p-value = 0.02), especially during the rainy season (RR = 2.03 [1.25–3.28]; χ2 = 8.15, p-value = 0.004). Time since the first flooding of the year was a significant risk factor in our GLM model (p-value = 0.042). These results suggest that MNDWI may be useful as a risk indicator in an early warning remote sensing tool for flood-driven diseases like leptospirosis in South East Asia. PMID:28704461

  14. Assessing the performance of remotely-sensed flooding indicators and their potential contribution to early warning for leptospirosis in Cambodia.

    PubMed

    Ledien, Julia; Sorn, Sopheak; Hem, Sopheak; Huy, Rekol; Buchy, Philippe; Tarantola, Arnaud; Cappelle, Julien

    2017-01-01

Remote sensing can contribute to early warning for diseases with environmental drivers, such as flooding for leptospirosis. In this study we assessed whether and which remotely-sensed flooding indicator could be used in Cambodia to study any disease for which flooding has already been identified as an important driver, using leptospirosis as a case study. The performance of six potential flooding indicators was assessed by ground truthing. The Modified Normalized Difference Water Index (MNDWI) was used to estimate the Risk Ratio (RR) of being infected by leptospirosis when exposed to the floods it detected, in particular during the rainy season. Chi-square tests were also calculated. Another variable, the time elapsed since the first flooding of the year, was created using MNDWI values and was also included as an explanatory variable in a generalized linear model (GLM) and in a boosted regression tree model (BRT) of leptospirosis infections, along with other explanatory variables. Interestingly, the MNDWI thresholds for both detecting water and predicting the risk of leptospirosis seroconversion were independently evaluated at -0.3. An MNDWI value greater than -0.3 was significantly related to leptospirosis infection (RR = 1.61 [1.10-1.52]; χ2 = 5.64, p-value = 0.02), especially during the rainy season (RR = 2.03 [1.25-3.28]; χ2 = 8.15, p-value = 0.004). Time since the first flooding of the year was a significant risk factor in our GLM model (p-value = 0.042). These results suggest that MNDWI may be useful as a risk indicator in an early warning remote sensing tool for flood-driven diseases like leptospirosis in South East Asia.
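MNDWI itself is straightforward to compute from green and shortwave-infrared reflectances. The reflectance values below are invented; only the -0.3 cut-off comes from the study.

```python
def mndwi(green, swir):
    """Modified Normalized Difference Water Index: (G - SWIR) / (G + SWIR)."""
    return (green - swir) / (green + swir)

def flooded(green, swir, threshold=-0.3):
    """Flag a pixel as flooded using the study's MNDWI cut-off."""
    return mndwi(green, swir) > threshold

wet = flooded(0.2, 0.3)  # MNDWI = -0.2, above the cut-off
dry = flooded(0.1, 0.5)  # MNDWI about -0.67, below the cut-off
```

Applied pixel-by-pixel over a satellite scene, this thresholding yields the flood maps that feed the GLM and BRT models.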

  15. Evaluating the role of coastal habitats and sea-level rise in hurricane risk mitigation: An ecological economic assessment method and application to a business decision.

    PubMed

    Reddy, Sheila M W; Guannel, Gregory; Griffin, Robert; Faries, Joe; Boucher, Timothy; Thompson, Michael; Brenner, Jorge; Bernhardt, Joey; Verutes, Gregory; Wood, Spencer A; Silver, Jessica A; Toft, Jodie; Rogers, Anthony; Maas, Alexander; Guerry, Anne; Molnar, Jennifer; DiMuro, Johnathan L

    2016-04-01

Businesses may be missing opportunities to account for ecosystem services in their decisions, because they do not have methods to quantify and value ecosystem services. We developed a method to quantify and value coastal protection and other ecosystem services in the context of a cost-benefit analysis of hurricane risk mitigation options for a business. We first analyzed linked biophysical and economic models to examine the potential protection provided by marshes. We then applied this method to The Dow Chemical Company's Freeport, Texas facility to evaluate natural (marshes), built (levee), and hybrid (marshes and a levee designed for marshes) defenses against a 100-y hurricane. Model analysis shows that future sea-level rise decreases marsh area, increases flood heights, and increases the required levee height (12%) and cost (8%). In this context, marshes do not provide sufficient protection to the facility, located 12 km inland, to warrant a change in levee design for a 100-y hurricane. Marshes do provide some protection near shore and under smaller storm conditions, which may help maintain the coastline and levee performance in the face of sea-level rise. In sum, the net present value to the business of built defenses ($217 million [2010 US$]) is greater than natural defenses ($15 million [2010 US$]) and similar to the hybrid defense scenario ($229 million [2010 US$]). Examination of a sample of public benefits from the marshes shows they provide at least $117 million (2010 US$) in coastal protection, recreational value, and C sequestration to the public, while supporting 12 fisheries and more than 300 wildlife species. This study provides information on where natural defenses may be effective and a replicable approach that businesses can use to incorporate private, as well as public, ecosystem service values into hurricane risk management at other sites. © 2015 The Authors. 
Integrated Environmental Assessment and Management Published by Wiley Periodicals, Inc. on behalf of SETAC.

  16. Health risk assessment for exposure to nitrate in drinking water from village wells in Semarang, Indonesia.

    PubMed

    Sadler, Ross; Maetam, Brooke; Edokpolo, Benjamin; Connell, Des; Yu, Jimmy; Stewart, Donald; Park, M-J; Gray, Darren; Laksono, Budi

    2016-09-01

    The levels of nitrate in 52 drinking water wells in rural Central Java, Indonesia were evaluated in April 2014, and the results were used for a health risk assessment for the local populations using probabilistic techniques. The concentrations of nitrate in drinking water had a range of 0.01-84 mg/L, a mean of 20 mg/L and a median of 14 mg/L. Only two of the 52 samples exceeded the WHO guideline value of 50 mg/L for infant methaemoglobinaemia. The hazard quotient values as evaluated against the WHO guideline value at the 50th and 95th percentile points were HQ50 = 0.42 and HQ95 = 1.2, respectively. These indicated a low risk of infant methaemoglobinaemia for the whole population, but some risk for the sensitive portion of the population. The HQ50 and HQ95 values based on the WHO acceptable daily intake dose for adult males and females were 0.35 and 1.0, respectively, indicating a generally low level of risk. A risk characterisation linking birth defects to nitrate levels in water consumed during the first three months of pregnancy resulted in an HQ50/50 value of 1.5 and an HQ95/5 value of 65. These HQ values indicated an elevated risk of birth defects, in particular for the more sensitive population. A sanitation improvement program in the study area had a positive effect in reducing nitrate levels in wells and the corresponding risk to public health. For example, the birth defect HQ50/50 value for a subset of wells surveyed in both 2014 and 2015 was reduced from 1.1 to 0.71. Copyright © 2016 Elsevier Ltd. All rights reserved.
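    Hazard quotients of this kind are the ratio of an estimated exposure to a reference (guideline) value; the study's HQ figures additionally fold in intake assumptions. A minimal sketch with illustrative numbers only (the 95th-percentile concentration below is hypothetical, not the study's intake-adjusted figure):

```python
def hazard_quotient(exposure, reference):
    """Hazard quotient: ratio of estimated exposure to a reference (safe) level.
    HQ < 1 suggests low risk; HQ >= 1 flags potential concern."""
    return exposure / reference

# Illustrative values only: nitrate concentrations (mg/L) at the 50th and a
# hypothetical 95th percentile, evaluated against the WHO guideline of 50 mg/L.
guideline = 50.0
hq50 = hazard_quotient(14.0, guideline)   # median concentration from the study
hq95 = hazard_quotient(60.0, guideline)   # hypothetical 95th-percentile value

print(round(hq50, 2), round(hq95, 2))  # 0.28 1.2
```

    An HQ above 1 for the upper percentile, as here, signals risk concentrated in the most-exposed portion of the population even when the median is well below the guideline.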

  17. Baseline ecological risk assessment of the Calcasieu Estuary, Louisiana: 2. An evaluation of the predictive ability of effects-based sediment quality guidelines

    USGS Publications Warehouse

    MacDonald, Donald D.; Ingersoll, Christopher G.; Smorong, Dawn E.; Sinclair, Jesse A.; Lindskoog, Rebekka; Wang, Ning; Severn, Corrine; Gouguet, Ron; Meyer, John; Field, Jay

    2011-01-01

    Three sets of effects-based sediment-quality guidelines (SQGs) were evaluated to support the selection of sediment-quality benchmarks for assessing risks to benthic invertebrates in the Calcasieu Estuary, Louisiana. These SQGs included probable effect concentrations (PECs), effects range median values (ERMs), and logistic regression model (LRM)-based T50 values. The results of this investigation indicate that all three sets of SQGs tend to underestimate sediment toxicity in the Calcasieu Estuary (i.e., relative to the national data sets), as evaluated using the results of 10-day toxicity tests with the amphipods Hyalella azteca or Ampelisca abdita, and 28-day whole-sediment toxicity tests with H. azteca. These results emphasize the importance of deriving site-specific toxicity thresholds for assessing risks to benthic invertebrates.

  18. Valuing a Lifestyle Intervention for Middle Eastern Immigrants at Risk of Diabetes.

    PubMed

    Saha, Sanjib; Gerdtham, Ulf-G; Siddiqui, Faiza; Bennet, Louise

    2018-02-27

    Willingness-to-pay (WTP) techniques are increasingly being used in the healthcare sector for assessing the value of interventions. The objective of this study was to estimate WTP and its predictors in a randomized controlled trial of a lifestyle intervention exclusively targeting Middle Eastern immigrants living in Malmö, Sweden, who are at high risk of type 2 diabetes. We used the contingent valuation method to evaluate WTP. The questionnaire was designed following the payment-scale approach, and administered at the end of the trial, giving an ex-post perspective. We performed logistic regression and linear regression techniques to identify the factors associated with zero WTP value and positive WTP values. The intervention group had significantly higher average WTP than the control group (216 SEK vs. 127 SEK; p = 0.035; 1 U.S.$ = 8.52 SEK, 2015 price year) per month. The regression models demonstrated that being in the intervention group, acculturation, and self-employment were significant factors associated with positive WTP values. Male participants and lower-educated participants had a significantly higher likelihood of zero WTP. In this era of increased migration, our findings can help policy makers to take informed decisions to implement lifestyle interventions for immigrant populations.

  19. Valuing a Lifestyle Intervention for Middle Eastern Immigrants at Risk of Diabetes

    PubMed Central

    Siddiqui, Faiza

    2018-01-01

    Willingness-to-pay (WTP) techniques are increasingly being used in the healthcare sector for assessing the value of interventions. The objective of this study was to estimate WTP and its predictors in a randomized controlled trial of a lifestyle intervention exclusively targeting Middle Eastern immigrants living in Malmö, Sweden, who are at high risk of type 2 diabetes. We used the contingent valuation method to evaluate WTP. The questionnaire was designed following the payment-scale approach, and administered at the end of the trial, giving an ex-post perspective. We performed logistic regression and linear regression techniques to identify the factors associated with zero WTP value and positive WTP values. The intervention group had significantly higher average WTP than the control group (216 SEK vs. 127 SEK; p = 0.035; 1 U.S.$ = 8.52 SEK, 2015 price year) per month. The regression models demonstrated that being in the intervention group, acculturation, and self-employment were significant factors associated with positive WTP values. Male participants and lower-educated participants had a significantly higher likelihood of zero WTP. In this era of increased migration, our findings can help policy makers to take informed decisions to implement lifestyle interventions for immigrant populations. PMID:29495529

  20. [LaboRisCh: an algorithm for assessment of health risks due to chemicals in research laboratories and similar workplaces].

    PubMed

    Strafella, Elisabetta; Bracci, M; Calisti, R; Governa, M; Santarelli, Lory

    2008-01-01

    Chemical risk assessment in research laboratories is complicated by factors such as the large number of agents to be considered, each present in small quantities, and the very short and erratic periods of exposure, all of which make reliable environmental and biological monitoring particularly difficult and at times impossible. In such environments, a preliminary evaluation procedure based on algorithms would be useful to establish the hazard potential of a given situation and to guide the appropriate intervention. The LaboRisCh model was expressly designed to assess the health risk due to chemicals in research laboratories and similar workplaces. The model is based on the calculation of the value of a synthetic single risk index for each substance and compound found in a laboratory and, subsequently, of a further synthetic single risk index for the whole laboratory or, where required, a section thereof. This makes LaboRisCh a compromise between need for information, ease of use, and resources required for the assessment. The risk index includes several items, chiefly the physical and chemical properties, intrinsic hazard potential, amount, dilution, and time of exposure to each agent; waste management; possible interactions; presence and efficiency of collective and individual protection devices, and staff training in good laboratory practices. The value of the synthetic single index corresponds to one of three areas: no risk (green), possible risk (yellow), and certain risk (red). Preliminary data confirm the model. LaboRisCh appears to be a reliable method for chemical risk assessment in research laboratories and similar workplaces.

  1. Comparing biomarkers as principal surrogate endpoints.

    PubMed

    Huang, Ying; Gilbert, Peter B

    2011-12-01

    Recently a new definition of surrogate endpoint, the "principal surrogate," was proposed based on causal associations between treatment effects on the biomarker and on the clinical endpoint. Despite its appealing interpretation, limited research has been conducted to evaluate principal surrogates, and existing methods focus on risk models that consider a single biomarker. How to compare principal surrogate value of biomarkers or general risk models that consider multiple biomarkers remains an open research question. We propose to characterize a marker or risk model's principal surrogate value based on the distribution of risk difference between interventions. In addition, we propose a novel summary measure (the standardized total gain) that can be used to compare markers and to assess the incremental value of a new marker. We develop a semiparametric estimated-likelihood method to estimate the joint surrogate value of multiple biomarkers. This method accommodates two-phase sampling of biomarkers and is more widely applicable than existing nonparametric methods by incorporating continuous baseline covariates to predict the biomarker(s), and is more robust than existing parametric methods by leaving the error distribution of markers unspecified. The methodology is illustrated using a simulated example set and a real data set in the context of HIV vaccine trials. © 2011, The International Biometric Society.

  2. Long-Term Post-CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions.

    PubMed

    Carr, Brendan M; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C; Zhu, Wei; Shroyer, A Laurie

    2016-01-01

    Clinical risk models are commonly used to predict short-term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long-term mortality. The added value of long-term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long-term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Long-term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c-index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Mortality rates were 3%, 9%, and 17% at one-, three-, and five years, respectively (median follow-up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long-term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Long-term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long-term mortality risk can be accurately assessed and subgroups of higher-risk patients can be identified for enhanced follow-up care. More research appears warranted to refine long-term CABG clinical risk models. © 2015 The Authors. Journal of Cardiac Surgery Published by Wiley Periodicals, Inc.

  3. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    PubMed Central

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  4. A risk assessment tool applied to the study of shale gas resources.

    PubMed

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

    The implementation of a risk assessment tool capable of evaluating the health, safety and environmental (HSE) risks of extracting non-conventional fossil fuel resources by the hydraulic fracturing (fracking) technique can help advance the technology and win public trust and acceptance of it. At the early project stages, the scarcity of data on candidate non-conventional gas deposits makes it difficult to apply existing approaches for assessing the risks of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects of the project. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO2) storage sites. Two global characteristics, (1) characteristics centered on the natural aspects of the site and (2) characteristics centered on the technological aspects of the project, are evaluated through user input of property values, which define attributes, which in turn define the characteristics. To allow an individual evaluation of each characteristic and element of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for shale gas exploitation in Asturias (northwestern Spain), with three different technological options to test the approach. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. A framework for global river flood risk assessment

    NASA Astrophysics Data System (ADS)

    Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.

    2012-04-01

    There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) international financing institutes and disaster management agencies, to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies, to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse-resolution hazard probability distributions, derived from global hydrological model runs (typically about 0.5 degree resolution), with high-resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value are generally strongly variable in space and time. The framework therefore estimates hazard at a high resolution (~1 km2) by using a) global forcing data sets of the current (or, in scenario mode, future) climate; b) a global hydrological model; c) a global flood routing model, and d) importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models, and the sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model.
Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied on a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establish distributed estimates of GDP and asset exposure to flooding.
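    The framework's second component, combining the hazard probability distribution with an impact model, amounts to integrating damage over annual exceedance probability to obtain an expected annual damage. A minimal sketch of that integration with hypothetical return periods and damages (not the framework's actual impact model):

```python
def expected_annual_damage(return_periods, damages):
    """Trapezoidal integration of damage over annual exceedance probability.
    Ignores contributions beyond the most and least frequent events listed."""
    probs = [1.0 / t for t in return_periods]   # exceedance probability = 1/T
    pairs = sorted(zip(probs, damages))         # ascending probability
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)      # trapezoid between two events
    return ead

# Hypothetical damages (million $) for 2-, 10-, 100-, and 1000-year floods:
print(expected_annual_damage([2, 10, 100, 1000], [1.0, 10.0, 50.0, 120.0]))
```

    Sensitivity of the risk estimate to the choice of climate forcing or downscaling decisions can then be explored simply by re-running the integration with alternative damage curves.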

  6. PREDICTION OF POPULATION-LEVEL RESPONSE FROM MYSID TOXICITY TEST DATA USING POPULATION MODEL TECHNIQUES

    EPA Science Inventory

    Acute and chronic bioassay statistics are used to evaluate the toxicity and the risks of chemical stressors to the mysid shrimp Americamysis bahia (formerly Mysidopsis bahia). These include LC50 values from acute tests, chronic values (the geometric mean of the no-observed-effect co...

  7. Multiple attribute decision making model and application to food safety risk evaluation.

    PubMed

    Ma, Lihua; Chen, Hong; Yan, Huizhe; Yang, Lifeng; Wu, Lifeng

    2017-01-01

    Decision making for supermarket food purchases is characterized by network relationships. This paper analyzes factors that influence supermarket food selection and proposes a supplier evaluation index system based on the whole process of food production. The authors established an interval-valued intuitionistic fuzzy set evaluation model based on characteristics of the network relationship among decision makers, and validated it in a multiple attribute decision making case study. The proposed model thus provides a reliable, accurate method for multiple attribute decision making.

  8. Development and validation of multivariable predictive model for thromboembolic events in lymphoma patients.

    PubMed

    Antic, Darko; Milic, Natasa; Nikolovski, Srdjan; Todorovic, Milena; Bila, Jelena; Djurdjevic, Predrag; Andjelic, Bosko; Djurasinovic, Vladislava; Sretenovic, Aleksandra; Vukovic, Vojin; Jelicic, Jelena; Hayman, Suzanne; Mihaljevic, Biljana

    2016-10-01

    Lymphoma patients are at increased risk of thromboembolic events, but thromboprophylaxis in these patients is largely underused. We sought to develop and validate a simple model, based on individual clinical and laboratory patient characteristics, that would designate lymphoma patients at risk for thromboembolic events. The study population included 1,820 lymphoma patients who were treated in the Lymphoma Departments at the Clinics of Hematology, Clinical Center of Serbia and Clinical Center Kragujevac. The model was developed using data from a derivation cohort (n = 1,236), and further assessed in the validation cohort (n = 584). Sixty-five patients (5.3%) in the derivation cohort and 34 (5.8%) patients in the validation cohort developed thromboembolic events. The variables independently associated with risk for thromboembolism were: previous venous and/or arterial events, mediastinal involvement, BMI > 30 kg/m(2), reduced mobility, extranodal localization, development of neutropenia and hemoglobin level < 100 g/L. Based on the risk model score, the population was divided into the following risk categories: low (score 0-1), intermediate (score 2-3), and high (score >3). For patients classified at risk (intermediate and high-risk scores), the model produced a negative predictive value of 98.5%, a positive predictive value of 25.1%, sensitivity of 75.4%, and specificity of 87.5%. A high-risk score had a positive predictive value of 65.2%. The diagnostic performance measures retained similar values in the validation cohort. The developed prognostic Thrombosis Lymphoma (ThroLy) score is more specific for lymphoma patients than any other available score targeting thrombosis in cancer patients. Am. J. Hematol. 91:1014-1019, 2016. © 2016 Wiley Periodicals, Inc.
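    A score of this shape can be sketched by summing points over the listed predictors and mapping the total onto the abstract's category cut-offs (0-1 low, 2-3 intermediate, >3 high). The one-point-per-factor weighting below is an illustrative assumption, not the published ThroLy weights:

```python
# Seven predictors from the abstract; weighting is an illustrative assumption.
FACTORS = [
    "previous_thromboembolic_event",
    "mediastinal_involvement",
    "bmi_over_30",
    "reduced_mobility",
    "extranodal_localization",
    "neutropenia",
    "hemoglobin_below_100",
]

def throly_like_score(patient):
    """Sum one illustrative point per risk factor present in the patient dict."""
    return sum(1 for f in FACTORS if patient.get(f, False))

def risk_category(score):
    """Map a total score onto the categories reported in the abstract."""
    if score <= 1:
        return "low"
    if score <= 3:
        return "intermediate"
    return "high"

patient = {"bmi_over_30": True, "reduced_mobility": True,
           "neutropenia": True, "extranodal_localization": True}
s = throly_like_score(patient)
print(s, risk_category(s))  # 4 high
```

    Patients in the intermediate and high categories are the ones for whom the model's 98.5% negative predictive value supports withholding prophylaxis from the remainder.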

  9. Serum C-reactive protein (CRP) as a simple and independent prognostic factor in extranodal natural killer/T-cell lymphoma, nasal type.

    PubMed

    Li, Ya-Jun; Li, Zhi-Ming; Xia, Yi; Huang, Jia-Jia; Huang, Hui-Qiang; Xia, Zhong-Jun; Lin, Tong-Yu; Li, Su; Cai, Xiu-Yu; Wu-Xiao, Zhi-Jun; Jiang, Wen-Qi

    2013-01-01

    C-reactive protein (CRP) is a biomarker of the inflammatory response, and it shows significant prognostic value for several types of solid tumors. The prognostic significance of CRP for lymphoma has not been fully examined. We evaluated the prognostic role of baseline serum CRP levels in patients with extranodal natural killer (NK)/T-cell lymphoma (ENKTL). We retrospectively analyzed 185 patients with newly diagnosed ENKTL. The prognostic value of the serum CRP level was evaluated for the low-CRP group (CRP≤10 mg/L) versus the high-CRP group (CRP>10 mg/L). The prognostic value of the International Prognostic Index (IPI) and the Korean Prognostic Index (KPI) were evaluated and compared with the newly developed prognostic model. Patients in the high-CRP group tended to display increased adverse clinical characteristics, lower rates of complete remission (P<0.001), inferior progression-free survival (PFS, P = 0.001), and inferior overall survival (OS, P<0.001). Multivariate analysis demonstrated that elevated serum CRP levels, age >60 years, hypoalbuminemia, and elevated lactate dehydrogenase levels were independent adverse predictors of OS. Based on these four independent predictors, we constructed a new prognostic model that identified 4 groups with varying OS: group 1, no adverse factors; group 2, 1 factor; group 3, 2 factors; and group 4, 3 or 4 factors (P<0.001). The novel prognostic model was found to be superior to both the IPI in discriminating patients with different outcomes in the IPI low-risk group and the KPI in distinguishing between the low- and intermediate-low-risk groups, the intermediate-low- and high-intermediate-risk groups, and the high-intermediate- and high-risk groups. Our results suggest that pretreatment serum CRP levels represent an independent predictor of clinical outcome for patients with ENKTL. The prognostic value of the new prognostic model is superior to both IPI and KPI.

  10. Serum C-Reactive Protein (CRP) as a Simple and Independent Prognostic Factor in Extranodal Natural Killer/T-Cell Lymphoma, Nasal Type

    PubMed Central

    Xia, Yi; Huang, Jia-Jia; Huang, Hui-Qiang; Xia, Zhong-Jun; Lin, Tong-Yu; Li, Su; Cai, Xiu-Yu; Wu-Xiao, Zhi-Jun; Jiang, Wen-Qi

    2013-01-01

    Background C-reactive protein (CRP) is a biomarker of the inflammatory response, and it shows significant prognostic value for several types of solid tumors. The prognostic significance of CRP for lymphoma has not been fully examined. We evaluated the prognostic role of baseline serum CRP levels in patients with extranodal natural killer (NK)/T-cell lymphoma (ENKTL). Methods We retrospectively analyzed 185 patients with newly diagnosed ENKTL. The prognostic value of the serum CRP level was evaluated for the low-CRP group (CRP≤10 mg/L) versus the high-CRP group (CRP>10 mg/L). The prognostic value of the International Prognostic Index (IPI) and the Korean Prognostic Index (KPI) were evaluated and compared with the newly developed prognostic model. Results Patients in the high-CRP group tended to display increased adverse clinical characteristics, lower rates of complete remission (P<0.001), inferior progression-free survival (PFS, P = 0.001), and inferior overall survival (OS, P<0.001). Multivariate analysis demonstrated that elevated serum CRP levels, age >60 years, hypoalbuminemia, and elevated lactate dehydrogenase levels were independent adverse predictors of OS. Based on these four independent predictors, we constructed a new prognostic model that identified 4 groups with varying OS: group 1, no adverse factors; group 2, 1 factor; group 3, 2 factors; and group 4, 3 or 4 factors (P<0.001). The novel prognostic model was found to be superior to both the IPI in discriminating patients with different outcomes in the IPI low-risk group and the KPI in distinguishing between the low- and intermediate-low-risk groups, the intermediate-low- and high-intermediate-risk groups, and the high-intermediate- and high-risk groups. Conclusions Our results suggest that pretreatment serum CRP levels represent an independent predictor of clinical outcome for patients with ENKTL. The prognostic value of the new prognostic model is superior to both IPI and KPI. PMID:23724031

  11. Impact of a clinical decision model for febrile children at risk for serious bacterial infections at the emergency department: a randomized controlled trial.

    PubMed

    de Vos-Kerkhof, Evelien; Nijman, Ruud G; Vergouwe, Yvonne; Polinder, Suzanne; Steyerberg, Ewout W; van der Lei, Johan; Moll, Henriëtte A; Oostenbrink, Rianne

    2015-01-01

    To assess the impact of a clinical decision model for febrile children at risk for serious bacterial infections (SBI) attending the emergency department (ED). Randomized controlled trial with 439 febrile children, aged 1 month-16 years, attending the pediatric ED of a Dutch university hospital during 2010-2012. Febrile children were randomly assigned to the intervention (clinical decision model; n = 219) or the control group (usual care; n = 220). The clinical decision model included clinical symptoms, vital signs, and C-reactive protein and provided high/low-risks for "pneumonia" and "other SBI". Nurses were guided by the intervention to initiate additional tests for high-risk children. The clinical decision model was evaluated by 1) area-under-the-receiver-operating-characteristic-curve (AUC) to indicate discriminative ability and 2) feasibility, to measure nurses' compliance to model recommendations. Primary patient outcome was defined as correct SBI diagnoses. Secondary process outcomes were defined as length of stay; diagnostic tests; antibiotic treatment; hospital admission; revisits and medical costs. The decision model had good discriminative ability for both pneumonia (n = 33; AUC 0.83 (95% CI 0.75-0.90)) and other SBI (n = 22; AUC 0.81 (95% CI 0.72-0.90)). Compliance to model recommendations was high (86%). No differences in correct SBI determination were observed. Application of the clinical decision model resulted in fewer full-blood-counts (14% vs. 22%, p-value < 0.05) and more urine-dipstick testing (71% vs. 61%, p-value < 0.05). In contrast to our expectations, no substantial impact on patient outcome was perceived. The clinical decision model preserved, however, good discriminatory ability to detect SBI, achieved good compliance among nurses and resulted in a more standardized diagnostic approach towards febrile children, with fewer full-blood-counts and more appropriate urine-dipstick testing. Nederlands Trial Register NTR2381.

  12. Impact of a Clinical Decision Model for Febrile Children at Risk for Serious Bacterial Infections at the Emergency Department: A Randomized Controlled Trial

    PubMed Central

    de Vos-Kerkhof, Evelien; Nijman, Ruud G.; Vergouwe, Yvonne; Polinder, Suzanne; Steyerberg, Ewout W.; van der Lei, Johan; Moll, Henriëtte A.; Oostenbrink, Rianne

    2015-01-01

    Objectives To assess the impact of a clinical decision model for febrile children at risk for serious bacterial infections (SBI) attending the emergency department (ED). Methods Randomized controlled trial with 439 febrile children, aged 1 month-16 years, attending the pediatric ED of a Dutch university hospital during 2010-2012. Febrile children were randomly assigned to the intervention (clinical decision model; n=219) or the control group (usual care; n=220). The clinical decision model included clinical symptoms, vital signs, and C-reactive protein and provided high/low-risks for “pneumonia” and “other SBI”. Nurses were guided by the intervention to initiate additional tests for high-risk children. The clinical decision model was evaluated by 1) area-under-the-receiver-operating-characteristic-curve (AUC) to indicate discriminative ability and 2) feasibility, to measure nurses’ compliance to model recommendations. Primary patient outcome was defined as correct SBI diagnoses. Secondary process outcomes were defined as length of stay; diagnostic tests; antibiotic treatment; hospital admission; revisits and medical costs. Results The decision model had good discriminative ability for both pneumonia (n=33; AUC 0.83 (95% CI 0.75-0.90)) and other SBI (n=22; AUC 0.81 (95% CI 0.72-0.90)). Compliance to model recommendations was high (86%). No differences in correct SBI determination were observed. Application of the clinical decision model resulted in fewer full-blood-counts (14% vs. 22%, p-value<0.05) and more urine-dipstick testing (71% vs. 61%, p-value<0.05). Conclusions In contrast to our expectations, no substantial impact on patient outcome was perceived. The clinical decision model preserved, however, good discriminatory ability to detect SBI, achieved good compliance among nurses and resulted in a more standardized diagnostic approach towards febrile children, with fewer full-blood-counts and more appropriate urine-dipstick testing.
Trial Registration Nederlands Trial Register NTR2381 PMID:26024532

  13. A framework to evaluate the cost-effectiveness of the NADiA ProsVue slope to guide adjuvant radiotherapy among men with high-risk characteristics following prostatectomy for prostate cancer.

    PubMed

    Reed, Shelby D; Stewart, Suzanne Biehn; Scales, Charles D; Moul, Judd W

    2014-07-01

    The NADiA ProsVue is a prognostic system that measures prostate-specific antigen slope to identify men at lower risk of clinical recurrence of prostate cancer after radical prostatectomy. We developed a decision-modeling framework to evaluate its cost-effectiveness to guide the use of adjuvant radiotherapy (ART). We populated the model using patient-level data and external sources. Patients were classified as intermediate risk or high risk on the basis of Cancer of the Prostate Risk Assessment-Postsurgical (CAPRA-S) nomogram and then stratified by the ProsVue slope (≤2 pg/mL/mo; >2 pg/mL/mo) and receipt of ART. In sensitivity analyses, we varied the effect of the ProsVue slope on the use of ART and other model parameters. The cost-effectiveness of the ProsVue-guided strategy varied widely because of small differences in quality-adjusted life-years (QALYs) at 10 years. In the intermediate-risk group, when the use of ART decreased from 20% (standard care) to 7.5% among patients with a ProsVue slope value of 2 pg/mL/mo or less, the incremental cost-effectiveness ratio was $25,160/QALY. In the high-risk group, the use of ART would have to decrease from 40% (standard care) to 11.5% among those with a ProsVue slope value of 2 pg/mL/mo or less to obtain a ratio of $50,000/QALY. The cost-effectiveness ratios were sensitive to varying benefits of salvage therapy, quality of life, and costs of ART and ProsVue testing. The effect of the ProsVue system on costs will be dependent on the extent to which ART decreases among men identified as having a low risk of recurrence. Its effect on QALYs will remain conditional on uncertain clinical and quality-of-life benefits associated with ART. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  14. Customer-Specific Transaction Risk Management in E-Commerce

    NASA Astrophysics Data System (ADS)

    Ruch, Markus; Sackmann, Stefan

    Increasing turnover potential in e-commerce is inextricably linked with increased risk. Online retailers (e-tailers) aiming for a company-wide value orientation should manage this risk. However, current approaches to risk management either use average retail prices elevated by an overall risk premium or restrict the payment methods offered to customers. They thus neglect customer-specific value and risk attributes and leave turnover potential unconsidered. To close this gap, this contribution proposes an innovative valuation model that integrates customer-specific risk and potential turnover. The approach evaluates different payment methods by their risk-turnover characteristics, provides a risk-adjusted decision basis for selecting payment methods, and allows e-tailers to derive automated risk-management decisions per customer and transaction without reducing turnover potential.

  15. New Tools and Methods for Assessing Risk-Management Strategies

    DTIC Science & Technology

    2004-03-01

    Used Expected Value and Multi-Attribute Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed researchers to monitor the process subjects used to arrive at decisions; the results revealed distinct risk-management strategies. Subject terms: risk management, acquisition process, expected value theory, multi-attribute utility theory.

  16. Refining value-at-risk estimates using a Bayesian Markov-switching GJR-GARCH copula-EVT model.

    PubMed

    Sampid, Marius Galabe; Hasim, Haslifah M; Dai, Hongsheng

    2018-01-01

    In this paper, we propose a model for forecasting Value-at-Risk (VaR) using a Bayesian Markov-switching GJR-GARCH(1,1) model with skewed Student's-t innovations, copula functions, and extreme value theory (EVT). The Bayesian Markov-switching GJR-GARCH(1,1) model, which identifies non-constant volatility over time and allows the GARCH parameters to vary following a Markov process, is combined with copula functions and EVT to formulate the Bayesian Markov-switching GJR-GARCH(1,1) copula-EVT VaR model, which is then used to forecast the level of risk on financial asset returns. We further propose a new method for threshold selection in EVT analysis, which we term the hybrid method. Empirical and back-testing results show that the proposed VaR models capture VaR reasonably well in periods of calm and in periods of crisis.
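The "back-testing results" referenced at the end of such studies are typically produced by counting VaR violations and testing them against the nominal coverage. A minimal sketch of the standard Kupiec proportion-of-failures test (generic back-test machinery, not the authors' procedure; function names are illustrative):

```python
import math

def var_violations(returns, var_forecasts):
    """Count days on which the realised loss exceeded the VaR forecast.

    `returns` are daily returns; `var_forecasts` are positive loss
    thresholds for the same days."""
    return sum(1 for r, v in zip(returns, var_forecasts) if -r > v)

def kupiec_lr(n_obs, n_viol, coverage):
    """Kupiec proportion-of-failures likelihood-ratio statistic.

    Under correct coverage it is asymptotically chi-squared with 1 df,
    so values above 3.84 reject the model at the 5% level."""
    p = coverage                       # expected violation rate, e.g. 0.01
    phat = n_viol / n_obs              # observed violation rate
    phat = min(max(phat, 1e-12), 1 - 1e-12)   # guard log(0)
    log_l0 = (n_obs - n_viol) * math.log(1 - p) + n_viol * math.log(p)
    log_l1 = (n_obs - n_viol) * math.log(1 - phat) + n_viol * math.log(phat)
    return -2.0 * (log_l0 - log_l1)
```

For a 99% VaR over 250 trading days, 3 violations is close to the expected 2.5 and passes; 20 violations fails decisively.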

  17. [Quantitative evaluation of health risk associated with occupational inhalation exposure to vinyl chloride at production plants in Poland].

    PubMed

    Szymczak, W

    1997-01-01

    Vinyl chloride is classified by the IARC in Group 1 (carcinogenic to humans). In Poland, occupational exposure to vinyl chloride occurs among workers in many branches of industry, among others in vinyl chloride synthesis and polymerization as well as in the plastics, footwear, rubber, pharmaceutical and metallurgical industries. Observed concentrations range from non-determinable levels to 90 mg/m3, against a MAC value of 5 mg/m3. Liver neoplasm is the major carcinogenic effect of vinyl chloride; hence, the health assessment focused on this critical risk. Four different linear dose-response models, developed by several authors and based on results of different epidemiological studies, were used to characterise the extent of cancer risk as a function of vinyl chloride concentration. The estimated risk related to forty years of employment at exposure equal to the MAC value (5 mg/m3) fell within the range 2.9×10^-4 to 2.6×10^-3. As these figures show, it did not exceed the acceptable level (10^-3).
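A linear dose-response model of the kind described reduces to risk proportional to concentration and exposure duration. A sketch, using a unit risk of 5.8×10^-5 per mg/m3 inferred from the abstract's own lower bound (2.9×10^-4 at 5 mg/m3 over 40 years); this inferred slope and the duration scaling are illustrative assumptions, not values stated by the paper:

```python
def linear_excess_risk(unit_risk_per_mg_m3, concentration_mg_m3,
                       years_exposed, reference_years=40):
    """Linear (no-threshold) excess cancer risk, scaled by exposure
    duration relative to the 40-year working lifetime used here."""
    return (unit_risk_per_mg_m3 * concentration_mg_m3
            * years_exposed / reference_years)
```

At the MAC of 5 mg/m3 for 40 years this reproduces the reported lower-bound risk of 2.9×10^-4, and halving the duration halves the estimate.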

  18. Comparison of different risk stratification systems in predicting short-term serious outcome of syncope patients

    PubMed Central

    Safari, Saeed; Baratloo, Alireza; Hashemi, Behrooz; Rahmati, Farhad; Forouzanfar, Mohammad Mehdi; Motamedi, Maryam; Mirmohseni, Ladan

    2016-01-01

    Background: Determining etiologic causes and prognosis can significantly improve management of syncope patients. The present study aimed to compare the values of the San Francisco, Osservatorio Epidemiologico sulla Sincope nel Lazio (OESIL), Boston, and Risk Stratification of Syncope in the Emergency Department (ROSE) clinical decision rules in predicting the short-term serious outcome of syncope patients. Materials and Methods: The present diagnostic accuracy study with 1-week follow-up was designed to evaluate the predictive values of the four mentioned clinical decision rules. Screening performance characteristics of each model in predicting mortality, myocardial infarction (MI), and cerebrovascular accidents (CVAs) were calculated and compared. To evaluate the value of each model in predicting the outcome, sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio were calculated, and receiver operating characteristic (ROC) curve analysis was done. Results: A total of 187 patients (mean age: 64.2 ± 17.2 years) were enrolled in the study. Mortality, MI, and CVA were seen in 19 (10.2%), 12 (6.4%), and 36 (19.2%) patients, respectively. The area under the ROC curve for the OESIL, San Francisco, Boston, and ROSE models in predicting the risk of 1-week mortality, MI, and CVA was in the 30–70% range, with no significant difference among models (P > 0.05). The pooled model did not show higher accuracy in predicting mortality, MI, and CVA compared with the others (P > 0.05). Conclusion: This study revealed the weakness of all four evaluated models in predicting the short-term serious outcome of syncope patients referred to the emergency department, with no significant advantage of any one model over the others. PMID:27904602

  19. A behavioural and neural evaluation of prospective decision-making under risk

    PubMed Central

    Symmonds, Mkael; Bossaerts, Peter; Dolan, Raymond J.

    2010-01-01

    Making the best choice when faced with a chain of decisions requires a person to judge both anticipated outcomes and future actions. Although economic decision-making models account for both risk and reward in single-choice contexts, there is a dearth of similar knowledge about sequential choice. Classical utility-based models assume that decision-makers select and follow an optimal predetermined strategy, irrespective of the particular order in which options are presented. An alternative model involves continuously re-evaluating decision utilities, without prescribing a specific future set of choices. Here, using behavioral and functional magnetic resonance imaging (fMRI) data, we studied human subjects in a sequential choice task and used these data to compare alternative decision models of valuation and strategy selection. We provide evidence that subjects adopt a model of re-evaluating decision utilities, in which available strategies are continuously updated and combined in assessing action values. We validate this model by using simultaneously acquired fMRI data to show that sequential choice evokes a pattern of neural response consistent with a tracking of the anticipated distribution of future reward, as expected in such a model. Thus, brain activity evoked at each decision point reflects the expected mean, variance, and skewness of possible payoffs, consistent with the idea that sequential choice evokes a prospective evaluation of both available strategies and possible outcomes. PMID:20980595

  20. Fate and risk of atrazine and sulfentrazone to nontarget species at an agriculture site.

    PubMed

    Thorngren, Jordan L; Harwood, Amanda D; Murphy, Tracye M; Huff Hartz, Kara E; Fung, Courtney Y; Lydy, Michael J

    2017-05-01

    The present study evaluated the risk associated with the application and co-occurrence of 2 herbicides, atrazine and sulfentrazone, applied to a 32-ha corn and soybean rotational field. Field concentrations of the compounds were measured in soil, runoff water, and groundwater, with peak mean atrazine and sulfentrazone concentrations found in the soil (144 ng/g dry wt and 318 ng/g dry wt, respectively). Individual and mixture laboratory bioassays were conducted to determine the effects of atrazine and sulfentrazone on the survival of Daphnia magna and Pimephales promelas, the germination of Lactuca sativa, and the growth of Pseudokirchneriella subcapita and Lemna minor. Pseudokirchneriella subcapita and L. minor were the most susceptible species tested, and the effects on growth of the herbicides in mixtures best fit an independent action model. Risk quotients and margin of safety of 10% (MOS10) values were used to estimate risk and were calculated using runoff water concentrations. The MOS10 values were more sensitive than risk quotients in estimating risk. The MOS10 value for sulfentrazone runoff water concentration effects on P. subcapita was 7.8, and for L. minor was 1.1, with MOS10 values < 1 indicating potential risk. Overall, the environmentally relevant concentrations fell below the effect concentrations; therefore, atrazine and sulfentrazone posed little to no risk to the nontarget species tested. Environ Toxicol Chem 2017;36:1301-1310. © 2016 SETAC.

  1. Modelling the impact of new patient visits on risk adjusted access at 2 clinics.

    PubMed

    Kolber, Michael A; Rueda, Germán; Sory, John B

    2018-06-01

    To evaluate the effect that new outpatient clinic visits have on the availability of follow-up visits for established patients when patient visit frequency is risk adjusted. Diagnosis codes for patients from 2 Internal Medicine clinics were extracted through billing data. The HHS-HCC risk-adjusted scores for each clinic were determined based upon the average of all clinic practitioners' profiles. These scores were then used to project encounter frequencies for established patients and for new patients entering the clinic, based on risk and time of entry into the clinics. A distinct mean risk frequency distribution for physicians in each clinic could be defined, providing model parameters. Within the model, follow-up visit utilization at the highest risk-adjusted visit frequencies would require more follow-up slots than currently available when new-patient no-show rates and annual patient loss are included. Patients seen at an intermediate or lower risk-adjusted visit frequency could be accommodated when new-patient no-show rates and annual patient clinic loss are considered. Value-based care is driven by control of cost while maintaining quality of care. In order to control cost, there has been a drive to increase visit frequency in primary care for those patients at increased risk. Adding new patients to primary care clinics limits the availability of follow-up slots that accrue over time for those at highest risk, thereby limiting disease and, potentially, cost control. If the frequency of established-care visits can be reduced by improved disease control, closing the practice to new patients, hiring health care extenders, or providing non-face-to-face care models, then quality and cost of care may be improved. © 2018 John Wiley & Sons, Ltd.

  2. Density dependence and risk of extinction in a small population of sea otters

    USGS Publications Warehouse

    Gerber, L.R.; Buenau, K.E.; VanBlaricom, G.

    2004-01-01

    Sea otters (Enhydra lutris (L.)) were hunted to extinction off the coast of Washington State early in the 20th century. A new population was established by translocations from Alaska in 1969 and 1970; it currently numbers at least 550 animals. A major threat to the population is the ongoing risk of major oil spills in sea otter habitat. We apply population models to census and demographic data in order to evaluate the status of the population. We fit several density-dependent models to test for density dependence and determine plausible values for the carrying capacity (K) by comparing model goodness of fit to an exponential model. Model fits were compared using the Akaike Information Criterion (AIC). A significant negative relationship was found between the population growth rate and population size (r2=0.27, F=5.57, df=16, p<0.05), suggesting density dependence in Washington State sea otters. Information criterion statistics suggest that the logistic model is the most parsimonious, followed closely by the Beverton-Holt model. Values of K ranged from 612 to 759, with best-fit parameter estimates for the Beverton-Holt model of 0.26 for r and 612 for K. The latest (2001) population index count (555) puts the population at 87-92% of the estimated carrying capacity, above the suggested range for optimum sustainable population (OSP). Elasticity analysis was conducted to examine the effects of proportional changes in vital rates on the population growth rate (λ). The elasticity values indicate the population is most sensitive to changes in survival rates (particularly adult survival).
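The record's model comparison can be sketched with the reported best-fit Beverton-Holt parameters (r = 0.26, K = 612) and an AIC computed from residual sums of squares. The exact discrete form the authors fitted is not given in the abstract, so the parameterisation below is one common choice, labelled illustrative:

```python
import math

def beverton_holt_step(n, r=0.26, k=612.0):
    """One annual step of a discrete Beverton-Holt model (illustrative
    form), using the abstract's best-fit r and K. A population at K is
    a fixed point; below K, growth is positive but density-damped."""
    return n * (1.0 + r) / (1.0 + r * n / k)

def aic(rss, n_obs, n_params):
    """AIC from a least-squares fit, as used to rank the growth models;
    lower is more parsimonious."""
    return n_obs * math.log(rss / n_obs) + 2 * n_params
```

Stepping a population of 300 otters forward gives roughly 335, while a population at 612 stays put, matching the fixed-point interpretation of K.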

  3. Conservative Exposure Predictions for Rapid Risk Assessment of Phase-Separated Additives in Medical Device Polymers.

    PubMed

    Chandrasekar, Vaishnavi; Janes, Dustin W; Saylor, David M; Hood, Alan; Bajaj, Akhil; Duncan, Timothy V; Zheng, Jiwen; Isayeva, Irada S; Forrey, Christopher; Casey, Brendan J

    2018-01-01

    A novel approach for rapid risk assessment of targeted leachables in medical device polymers is proposed and validated. Risk evaluation involves understanding the potential of these additives to migrate out of the polymer, and comparing their exposure to a toxicological threshold value. In this study, we propose that a simple diffusive transport model can be used to provide conservative exposure estimates for phase-separated color additives in device polymers. This model has been illustrated using a representative phthalocyanine color additive (manganese phthalocyanine, MnPC) and polymer (PEBAX 2533) system. Sorption experiments of MnPC into PEBAX were conducted in order to experimentally determine the diffusion coefficient, D = (1.6 ± 0.5) × 10^-11 cm^2/s, and matrix solubility limit, C_s = 0.089 wt%, and model-predicted exposure values were validated by extraction experiments. Exposure values for the color additive were compared to a toxicological threshold for a sample risk assessment. Results from this study indicate that a diffusion model-based approach to predict exposure has considerable potential for use as a rapid, screening-level tool to assess the risk of color additives and other small-molecule additives in medical device polymers.
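Only D and C_s come from the abstract; the slab geometry, one-sided release, and the early-time Fickian approximation below (released fraction ≈ 2·sqrt(D·t / (π·L²)) for a slab of half-thickness L, valid while the fraction stays below about 0.6) are assumptions added for illustration, not the paper's stated model:

```python
import math

def release_fraction(d_cm2_s, t_s, half_thickness_cm):
    """Early-time fractional mass release from a polymer slab under
    Fickian diffusion (illustrative geometry), capped at total release."""
    frac = 2.0 * math.sqrt(d_cm2_s * t_s / (math.pi * half_thickness_cm ** 2))
    return min(frac, 1.0)

def exposure_ug(c_wt_frac, polymer_mass_ug, frac):
    """Conservative exposure: extractable additive mass times the
    fraction released so far."""
    return c_wt_frac * polymer_mass_ug * frac
```

With the reported D and an assumed 0.5 mm half-thickness, only a few percent of the additive is released in the first day, and the square-root time dependence means quadrupling the time doubles the release.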

  4. A Risk Score Model for Evaluation and Management of Patients with Thyroid Nodules.

    PubMed

    Zhang, Yongwen; Meng, Fanrong; Hong, Lianqing; Chu, Lanfang

    2018-06-12

    This study aimed to establish a simplified and practical tool for analyzing thyroid nodules. A novel risk score model was designed; risk factors including patient history, patient characteristics, physical examination, symptoms of compression, thyroid function, and ultrasonography (US) of the thyroid and cervical lymph nodes were evaluated and classified into high-risk, intermediate-risk, and low-risk factors. A total of 243 thyroid nodules in 162 patients were assessed with the risk score system and the Thyroid Imaging-Reporting and Data System (TI-RADS), and the diagnostic performance of the two was compared. The accuracy in the diagnosis of thyroid nodules was 89.3% for the risk score system and 74.9% for TI-RADS. The specificity, accuracy, and positive predictive value (PPV) of the risk score system were significantly higher than those of TI-RADS (χ^2 = 26.287, 17.151, 11.983; p < 0.05); statistically significant differences were not observed in sensitivity and negative predictive value (NPV) between the two systems (χ^2 = 1.276, 0.290; p > 0.05). The area under the curve (AUC) was 0.963 for the risk score system (standard error 0.014; 95% confidence interval [CI] = 0.934-0.991) and 0.912 for TI-RADS (standard error 0.021; 95% CI = 0.871-0.953); the difference was significant (Z = 2.02; p < 0.05). The risk score model is a reliable, simplified, and cost-effective diagnostic tool for the diagnosis of thyroid cancer: the higher the score, the higher the risk of malignancy. © Georg Thieme Verlag KG Stuttgart · New York.
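The quantities compared between the two systems all come from a 2×2 confusion table. A generic sketch (the counts in the example are hypothetical, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and accuracy from a 2x2
    diagnostic table (tp/fp/tn/fn = true/false positives/negatives)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }
```

Note that PPV and accuracy, the metrics on which the risk score outperformed TI-RADS here, depend on the prevalence in the sample, whereas sensitivity and specificity do not.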

  5. Applying spatial regression to evaluate risk factors for microbiological contamination of urban groundwater sources in Juba, South Sudan

    NASA Astrophysics Data System (ADS)

    Engström, Emma; Mörtberg, Ulla; Karlström, Anders; Mangold, Mikael

    2017-06-01

    This study developed methodology for statistically assessing groundwater contamination mechanisms. It focused on microbial water pollution in low-income regions. Risk factors for faecal contamination of groundwater-fed drinking-water sources were evaluated in a case study in Juba, South Sudan. The study was based on counts of thermotolerant coliforms in water samples from 129 sources, collected by the humanitarian aid organisation Médecins Sans Frontières in 2010. The factors included hydrogeological settings, land use and socio-economic characteristics. The results showed that the residuals of a conventional probit regression model had a significant positive spatial autocorrelation (Moran's I = 3.05, I-stat = 9.28); therefore, a spatial model was developed that had better goodness-of-fit to the observations. The most significant factor in this model (p-value 0.005) was the distance from a water source to the nearest Tukul area, an area with informal settlements that lack sanitation services. It is thus recommended that future remediation and monitoring efforts in the city be concentrated in such low-income regions. The spatial model differed from the conventional approach: in contrast with the latter case, lowland topography was not significant at the 5% level, as the p-value was 0.074 in the spatial model and 0.040 in the traditional model. This study showed that statistical risk-factor assessments of groundwater contamination need to consider spatial interactions when the water sources are located close to each other. Future studies might further investigate the cut-off distance that reflects spatial autocorrelation. In particular, these results motivate further research on urban groundwater quality.
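The residual diagnostic that motivated the spatial model is Moran's I: I = (n/ΣW) · Σᵢⱼ wᵢⱼ(xᵢ−x̄)(xⱼ−x̄) / Σᵢ(xᵢ−x̄)². A generic from-scratch sketch (a textbook statistic, not the study's spatial probit estimation; the chain-of-neighbours weights in the example are illustrative):

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic.

    `weights[i][j]` is the spatial weight between observations i and j
    (zero diagonal). Positive values indicate clustering of similar
    values among neighbours; negative values indicate alternation."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)
```

On a 4-point chain, two contaminated sources next to each other give a positive I, while a perfectly alternating pattern gives a negative one.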

  6. Evaluating predictive modeling's potential to improve teleretinal screening participation in urban safety net clinics.

    PubMed

    Ogunyemi, Omolola; Teklehaimanot, Senait; Patty, Lauren; Moran, Erin; George, Sheba

    2013-01-01

    Screening guidelines for diabetic patients recommend yearly eye examinations to detect diabetic retinopathy and other forms of diabetic eye disease. However, annual screening rates for retinopathy in US urban safety net settings remain low. Using data gathered from a study of teleretinal screening in six urban safety net clinics, we assessed whether predictive modeling could be of value in identifying patients at risk of developing retinopathy. We developed and examined the accuracy of two predictive modeling approaches for diabetic retinopathy in a sample of 513 diabetic individuals, using routinely available clinical variables from retrospective medical record reviews. Bayesian networks and radial basis function (neural) networks were learned using ten-fold cross-validation. The models were only modestly predictive, with the best achieving an AUC of 0.71. Using routinely available clinical variables to predict which patients are at risk of developing retinopathy, and to target them for annual eye screenings, may be of some use to safety net clinics.
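An AUC like the 0.71 reported here can be computed without plotting a ROC curve at all, via the rank (Mann-Whitney) identity: the probability that a randomly chosen positive case scores above a randomly chosen negative one. A generic sketch, not tied to the study's models:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity.

    `labels` are 0/1; ties between a positive and a negative score
    count as half a win, so a constant classifier scores exactly 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```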

  7. Reproductive Risk Factors and Coronary Heart Disease in the Women’s Health Initiative Observational Study

    PubMed Central

    Parikh, Nisha I.; Jeppson, Rebecca P.; Berger, Jeffrey S.; Eaton, Charles B.; Kroenke, Candyce H.; LeBlanc, Erin S.; Lewis, Cora E.; Loucks, Eric B.; Parker, Donna R.; Rillamas-Sun, Eileen; Ryckman, Kelli K; Waring, Molly E.; Schenken, Robert S.; Johnson, Karen C; Edstedt-Bonamy, Anna-Karin; Allison, Matthew A.; Howard, Barbara V.

    2016-01-01

    Background Reproductive factors provide an early window into a woman’s coronary heart disease (CHD) risk; however, their contribution to CHD risk stratification is uncertain. Methods and Results In the Women’s Health Initiative Observational Study, we constructed Cox proportional hazards models for CHD including age, pregnancy status, number of live births, age at menarche, menstrual irregularity, age at first birth, stillbirths, miscarriages, infertility ≥ 1 year, infertility cause, and breastfeeding. We next added each candidate reproductive factor to an established CHD risk factor model. A final model was then constructed with significant reproductive factors added to established CHD risk factors. Improvement in C-statistic, net reclassification index (NRI, with risk categories of <5%, 5–<10%, and ≥10% 10-year risk of CHD) and integrated discriminatory index (IDI) were assessed. Among 72,982 women [n=4607 CHD events, median follow-up=12.0 (IQR=8.3–13.7) years, mean (SD) age 63.2 (7.2) years], an age-adjusted reproductive risk factor model had a C-statistic of 0.675 for CHD. In a model adjusted for established CHD risk factors, younger age at first birth, number of stillbirths, number of miscarriages and lack of breastfeeding were positively associated with CHD. Reproductive factors modestly improved model discrimination (C-statistic increased from 0.726 to 0.730; IDI=0.0013, p-value < 0.0001). Net reclassification for women with events was not improved (NRI events=0.007, p-value=0.18), and for women without events was marginally improved (NRI non-events=0.002, p-value=0.04). Conclusions Key reproductive factors are associated with CHD independently of established CHD risk factors, very modestly improve model discrimination and do not materially improve net reclassification. PMID:27143682
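The NRI values quoted above use fixed risk categories (<5%, 5–<10%, ≥10% 10-year risk) and are computed separately for subjects with and without events. A generic categorical-NRI sketch (standard definition, not the WHI analysis code; category codes 0/1/2 in the example are illustrative):

```python
def categorical_nri(old_cat, new_cat, event):
    """Categorical net reclassification improvement.

    Categories are ordered integers (e.g. 0 = <5%, 1 = 5-<10%,
    2 = >=10%). Upward moves help events; downward moves help
    non-events. Returns (NRI_events, NRI_nonevents)."""
    up_e = down_e = n_e = up_ne = down_ne = n_ne = 0
    for old, new, ev in zip(old_cat, new_cat, event):
        if ev:
            n_e += 1
            up_e += new > old
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old
    return (up_e - down_e) / n_e, (down_ne - up_ne) / n_ne
```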

  8. Exchangeability, extreme returns and Value-at-Risk forecasts

    NASA Astrophysics Data System (ADS)

    Huang, Chun-Kai; North, Delia; Zewotir, Temesgen

    2017-07-01

    In this paper, we propose a new approach to extreme value modelling for the forecasting of Value-at-Risk (VaR). In particular, the block maxima and the peaks-over-threshold methods are generalised to exchangeable random sequences. This caters for the dependencies, such as serial autocorrelation, of financial returns observed empirically. In addition, this approach allows for parameter variations within each VaR estimation window. Empirical prior distributions of the extreme value parameters are attained by using resampling procedures. We compare the results of our VaR forecasts with those of the unconditional extreme value theory (EVT) approach and the conditional GARCH-EVT model for robust conclusions.
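Once a generalised Pareto tail is fitted to exceedances over a threshold u, the peaks-over-threshold VaR has a closed form: VaR_p = u + (σ/ξ)·[((n/N_u)(1−p))^(−ξ) − 1]. A sketch assuming σ and ξ are already estimated (the standard POT quantile, not this paper's exchangeable-resampling generalisation; all numbers in the example are illustrative):

```python
import math

def pot_var(u, sigma, xi, n, n_exceed, p):
    """VaR at confidence level p from a fitted generalised-Pareto tail.

    u: threshold, sigma/xi: fitted GPD scale and shape, n: sample
    size, n_exceed: number of exceedances over u."""
    tail = (n / n_exceed) * (1.0 - p)   # rescaled tail probability
    if abs(xi) < 1e-12:                 # xi -> 0: exponential tail limit
        return u - sigma * math.log(tail)
    return u + (sigma / xi) * (tail ** (-xi) - 1.0)
```

A heavier tail (larger ξ) or a higher confidence level pushes the quantile further beyond the threshold, as expected.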

  9. Cannabis Use and Reduced Risk of Insulin Resistance in HIV-HCV Infected Patients: A Longitudinal Analysis (ANRS CO13 HEPAVIH).

    PubMed

    Carrieri, Maria Patrizia; Serfaty, Lawrence; Vilotitch, Antoine; Winnock, Maria; Poizot-Martin, Isabelle; Loko, Marc-Arthur; Lions, Caroline; Lascoux-Combe, Caroline; Roux, Perrine; Salmon-Ceron, Dominique; Spire, Bruno; Dabis, Francois

    2015-07-01

    Diabetes and insulin resistance (IR) are common in human immunodeficiency virus-hepatitis C virus (HIV-HCV)-coinfected patients, a population in which cannabis use is also elevated. Cannabis has been associated with reduced IR risk in some population-based surveys. We determined whether cannabis use was consistently associated with reduced IR risk in HEPAVIH, a French nationwide cohort of HIV-HCV-coinfected patients. HEPAVIH medical and sociobehavioral data were collected (using annual self-administered questionnaires). We used 60 months of follow-up data for patients with at least 1 medical visit where IR (using the homeostatic model assessment of insulin resistance [HOMA-IR]) and cannabis use were assessed. A mixed logistic regression model was used to evaluate the association between IR risk (HOMA-IR > 2.77) and cannabis use (occasional, regular, daily). Among the 703 patients included in the study (1287 visits), 323 (46%) had HOMA-IR > 2.77 for at least 1 follow-up visit and 319 (45%) reported cannabis use in the 6 months before the first available visit. Cannabis users (irrespective of frequency) were less likely to have HOMA-IR > 2.77 (odds ratio [95% confidence interval], 0.4 [.2-.5]) after adjustment for known correlates/confounders. Two sensitivity analyses with HOMA-IR values as a continuous variable and a cutoff value of 3.8 confirmed the association between reduced IR risk and cannabis use. Cannabis use is associated with a lower IR risk in HIV-HCV-coinfected patients. The benefits of cannabis-based pharmacotherapies for patients at increased risk of IR and diabetes need to be evaluated in clinical research and practice. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
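The outcome variable here is the standard HOMA-IR index (fasting insulin in µU/mL times fasting glucose in mmol/L, divided by the original HOMA model's normalisation constant 22.5), dichotomised at the study's 2.77 cutoff. A minimal sketch:

```python
def homa_ir(fasting_insulin_uu_ml, fasting_glucose_mmol_l):
    """HOMA-IR = insulin (microU/mL) x glucose (mmol/L) / 22.5."""
    return fasting_insulin_uu_ml * fasting_glucose_mmol_l / 22.5

def ir_flag(homa, cutoff=2.77):
    """Binary IR outcome as modelled in the cohort's mixed logistic
    regression (True = insulin resistant)."""
    return homa > cutoff
```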

  10. Risk factors and algorithms for chlamydial and gonococcal cervical infections in women attending family planning clinics in Thailand.

    PubMed

    Rugpao, Sungwal; Rungruengthanakit, Kittipong; Werawatanakul, Yuthapong; Sinchai, Wanida; Ruengkris, Tosaporn; Lamlertkittikul, Surachai; Pinjareon, Sutham; Koonlertkit, Sompong; Limtrakul, Aram; Sriplienchan, Somchai; Wongthanee, Antika; Sirirojn, Bangorn; Morrison, Charles S; Celentano, David D

    2010-02-01

    To identify risk factors associated with and evaluate algorithms for predicting Chlamydia trachomatis (CT) and Neisseria gonorrhoeae (NG) cervical infections in women attending family planning clinics in Thailand. Eligible women were recruited from family planning clinics from all regions in Thailand. The women were followed at 3-month intervals for 15-24 months. At each visit, the women were interviewed for interval sexually transmitted infection (STI) history in the past 3 months, recent sexual behavior, and contraceptive use. Pelvic examinations were performed and endocervical specimens were collected to test for CT and NG using polymerase chain reaction. Factors associated with incident CT/NG cervical infections in multivariate analyses included region of country other than the north, age

  11. Process change evaluation framework for allogeneic cell therapies: impact on drug development and commercialization.

    PubMed

    Hassan, Sally; Huang, Hsini; Warren, Kim; Mahdavi, Behzad; Smith, David; Jong, Simcha; Farid, Suzanne S

    2016-04-01

    Some allogeneic cell therapies requiring a high dose of cells for large indication groups demand a change in cell expansion technology, from planar units to microcarriers in single-use bioreactors, for the market phase. The aim was to model the optimal timing for making this change. A development lifecycle cash flow framework was created to examine the implications of process changes to microcarrier cultures at different stages of a cell therapy's lifecycle. The analysis, performed under the assumptions used in the framework, predicted that making this switch earlier in development is optimal from a total expected out-of-pocket cost perspective. From a risk-adjusted net present value view, switching at Phase I is economically competitive, but a post-approval switch can offer the highest risk-adjusted net present value, as the cost of switching is offset by initial market penetration with planar technologies. The framework can facilitate early decision-making during process development.
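The risk-adjusted NPV criterion used to compare switching times can be sketched generically: each year's cash flow is weighted by the probability that the programme reaches that stage and discounted back to year zero (the formula is the standard rNPV; the example cash flows and probabilities are illustrative, not the paper's figures):

```python
def risk_adjusted_npv(cashflows, success_probs, rate):
    """rNPV: cash flow in year t, weighted by the cumulative probability
    of reaching year t, discounted at the given annual rate."""
    return sum(cf * p / (1.0 + rate) ** t
               for t, (cf, p) in enumerate(zip(cashflows, success_probs)))
```

An upfront switching cost that is certain, followed by revenues that are only 50% likely to materialise, can easily leave the early-switch rNPV negative, which is the trade-off the framework quantifies.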

  12. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    USGS Publications Warehouse

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision-makers can use these results to better evaluate environmental risk, treating future metal concentrations as a limited range of possibilities based on a scientific evaluation of uncertainty.
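The forward Monte Carlo option can be sketched generically: sample uncertain parameters, run the model, and report percentile bounds of the prediction. Everything in the sketch (Gaussian parameter distributions, the toy model, the 5th/95th percentile limits) is an illustrative assumption; a real application would propagate through the calibrated reactive-transport model:

```python
import random

def monte_carlo_interval(model, param_dists, n_draws=5000, seed=1):
    """Forward Monte Carlo uncertainty limits.

    `param_dists` maps parameter names to (mean, sd) of assumed Gaussian
    distributions; `model` maps a parameter dict to a scalar prediction.
    Returns the (5th, 95th) percentiles of the simulated predictions."""
    rng = random.Random(seed)   # fixed seed for reproducibility
    outputs = []
    for _ in range(n_draws):
        params = {name: rng.gauss(mu, sd)
                  for name, (mu, sd) in param_dists.items()}
        outputs.append(model(params))
    outputs.sort()
    return outputs[int(0.05 * n_draws)], outputs[int(0.95 * n_draws)]
```

The sorted-output percentiles are exactly the "nonlinear uncertainty limits" the record describes, and the full sorted list can stand in for a probability density function.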

  13. Value of Progression of Coronary Artery Calcification for Risk Prediction of Coronary and Cardiovascular Events: Result of the HNR Study (Heinz Nixdorf Recall).

    PubMed

    Lehmann, Nils; Erbel, Raimund; Mahabadi, Amir A; Rauwolf, Michael; Möhlenkamp, Stefan; Moebus, Susanne; Kälsch, Hagen; Budde, Thomas; Schmermund, Axel; Stang, Andreas; Führer-Sakel, Dagmar; Weimar, Christian; Roggenbuck, Ulla; Dragano, Nico; Jöckel, Karl-Heinz

    2018-02-13

    Computed tomography (CT) allows estimation of coronary artery calcium (CAC) progression. We evaluated several progression algorithms in our unselected, population-based cohort for risk prediction of coronary and cardiovascular events. In 3281 participants (45-74 years of age), free from cardiovascular disease until the second visit, risk factors and CTs at baseline (b) and after a mean of 5.1 years (5y) were measured. Hard coronary and cardiovascular events, as well as total cardiovascular events including revascularization, were recorded during a follow-up time of 7.8±2.2 years after the second CT. The added predictive value of 10 CAC progression algorithms on top of risk factors including baseline CAC was evaluated by using survival analysis, C-statistics, net reclassification improvement, and integrated discrimination index. A subgroup analysis of risk in CAC categories was performed. We observed 85 (2.6%) hard coronary, 161 (4.9%) hard cardiovascular, and 241 (7.3%) total cardiovascular events. Absolute CAC progression was higher with versus without subsequent coronary events (median, 115 [Q1-Q3, 23-360] versus 8 [0-83], P<0.0001; similar for hard/total cardiovascular events). Some progression algorithms added to the predictive value of baseline CT and risk assessment in terms of C-statistic or integrated discrimination index, especially for total cardiovascular events. However, CAC progression did not improve models including CAC5y and 5-year risk factors. An excellent prognosis was found for 921 participants with double-zero CAC (CACb = CAC5y = 0; 10-year coronary and hard/total cardiovascular risk: 1.4%, 2.0%, and 2.8%), versus 1.8%, 3.8%, and 6.6%, respectively, for participants with incident CAC. When CACb of 1 to 399 progressed to CAC5y ≥400, coronary and total cardiovascular risks were nearly 2-fold those of subjects who remained below CAC5y = 400. Participants with CACb ≥400 had high rates of hard coronary and hard/total cardiovascular events (10-year risk: 12.0%, 13.5%, and 30.9%, respectively). CAC progression is associated with coronary and cardiovascular event rates, but adds only weakly to risk prediction. What counts is the most recent CAC value and risk factor assessment. Therefore, a repeat scan >5 years after the first scan may be of additional value, except when a double-zero CT scan is present or when the subjects are already at high risk. © 2017 The Authors.
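
    The C-statistic used above to gauge added predictive value can be computed for censored follow-up data as Harrell's concordance: among usable pairs (those where the shorter follow-up ends in an observed event), count how often the higher predicted risk belongs to the participant who had the event first. A minimal pure-Python sketch with made-up data (ties in follow-up time are handled naively):

```python
def harrells_c(times, events, risks):
    """Harrell's concordance index for right-censored data.

    A pair is usable when the subject with the shorter follow-up
    actually had an event; it is concordant when that subject also
    carries the higher predicted risk. Ties in risk count one half.
    """
    concordant = tied = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so that subject a has the shorter follow-up
            a, b = (i, j) if times[i] < times[j] else (j, i)
            if not events[a]:   # shorter follow-up censored: pair unusable
                continue
            usable += 1
            if risks[a] > risks[b]:
                concordant += 1
            elif risks[a] == risks[b]:
                tied += 1
    return (concordant + 0.5 * tied) / usable

# toy data: follow-up years, event indicator, predicted risk
times  = [2.0, 5.0, 3.5, 7.8, 1.2]
events = [1, 0, 1, 0, 1]
risks  = [0.9, 0.2, 0.6, 0.1, 0.8]
print(harrells_c(times, events, risks))  # 8 of 9 usable pairs concordant
```

    A value of 0.5 means no discrimination and values near 1.0 mean good risk ordering, which is the scale on which the ten progression algorithms were compared.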

  14. Weighted Genetic Risk Scores and Prediction of Weight Gain in Solid Organ Transplant Populations

    PubMed Central

    Saigi-Morgui, Núria; Quteineh, Lina; Bochud, Pierre-Yves; Crettol, Severine; Kutalik, Zoltán; Wojtowicz, Agnieszka; Bibert, Stéphanie; Beckmann, Sonja; Mueller, Nicolas J; Binet, Isabelle; van Delden, Christian; Steiger, Jürg; Mohacsi, Paul; Stirnimann, Guido; Soccal, Paola M.; Pascual, Manuel; Eap, Chin B

    2016-01-01

    Background: Polygenic obesity in Solid Organ Transplant (SOT) populations is considered a risk factor for the development of metabolic abnormalities and for graft survival. Few studies to date have examined the genetics of weight gain in SOT recipients. We aimed to determine whether weighted genetic risk scores (w-GRS) integrating genetic polymorphisms from GWAS studies (SNP group#1 and SNP group#2) and from candidate gene studies (SNP group#3) influence BMI in SOT populations and whether they predict ≥10% weight gain (WG) one year after transplantation. To do so, two samples (nA = 995, nB = 156) were obtained from naturalistic studies, and three w-GRS were constructed and tested for association with BMI over time. Prediction of ≥10% WG at one year after transplantation was assessed with models containing genetic and clinical factors. Results: w-GRS were associated with BMI in samples A and B combined (BMI increased by 0.14 and 0.11 units per additional risk allele in SNP group#1 and #2, respectively, p-values <0.008). The w-GRS of SNP group#3 showed an effect of 0.01 kg/m2 per additional risk allele when combining samples A and B (p-value 0.04). Models with genetic factors performed better than models without them in predicting ≥10% WG at one year after transplantation. Conclusions: This is the first study in SOT to extensively evaluate the association of w-GRS with BMI and the influence of clinical and genetic factors on ≥10% WG one year after transplantation, showing the importance of integrating genetic factors in the final model. The genetics of obesity among SOT recipients remains an important issue and can contribute to treatment personalization and prediction of WG after transplantation. PMID:27788139
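
    The w-GRS construction described above is, at its core, a dosage-weighted sum: each SNP contributes its risk-allele count (0, 1 or 2) multiplied by a published per-allele effect weight. A minimal sketch with a hypothetical three-SNP panel and illustrative weights (not the study's actual SNP groups):

```python
def weighted_grs(dosages, weights):
    """Weighted genetic risk score: risk-allele counts (0/1/2)
    times per-SNP effect weights (e.g. published betas for BMI)."""
    missing = set(dosages) - set(weights)
    if missing:
        raise ValueError(f"no weight for SNPs: {missing}")
    return sum(count * weights[snp] for snp, count in dosages.items())

# illustrative panel and effect sizes, not the study's
weights = {"rs9939609": 0.39, "rs17782313": 0.31, "rs1558902": 0.39}
patient = {"rs9939609": 2, "rs17782313": 1, "rs1558902": 0}
print(round(weighted_grs(patient, weights), 2))  # 2*0.39 + 1*0.31 = 1.09
```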

  15. Weighted Genetic Risk Scores and Prediction of Weight Gain in Solid Organ Transplant Populations.

    PubMed

    Saigi-Morgui, Núria; Quteineh, Lina; Bochud, Pierre-Yves; Crettol, Severine; Kutalik, Zoltán; Wojtowicz, Agnieszka; Bibert, Stéphanie; Beckmann, Sonja; Mueller, Nicolas J; Binet, Isabelle; van Delden, Christian; Steiger, Jürg; Mohacsi, Paul; Stirnimann, Guido; Soccal, Paola M; Pascual, Manuel; Eap, Chin B

    2016-01-01

    Polygenic obesity in Solid Organ Transplant (SOT) populations is considered a risk factor for the development of metabolic abnormalities and for graft survival. Few studies to date have examined the genetics of weight gain in SOT recipients. We aimed to determine whether weighted genetic risk scores (w-GRS) integrating genetic polymorphisms from GWAS studies (SNP group#1 and SNP group#2) and from candidate gene studies (SNP group#3) influence BMI in SOT populations and whether they predict ≥10% weight gain (WG) one year after transplantation. To do so, two samples (nA = 995, nB = 156) were obtained from naturalistic studies, and three w-GRS were constructed and tested for association with BMI over time. Prediction of ≥10% WG at one year after transplantation was assessed with models containing genetic and clinical factors. w-GRS were associated with BMI in samples A and B combined (BMI increased by 0.14 and 0.11 units per additional risk allele in SNP group#1 and #2, respectively, p-values <0.008). The w-GRS of SNP group#3 showed an effect of 0.01 kg/m2 per additional risk allele when combining samples A and B (p-value 0.04). Models with genetic factors performed better than models without them in predicting ≥10% WG at one year after transplantation. This is the first study in SOT to extensively evaluate the association of w-GRS with BMI and the influence of clinical and genetic factors on ≥10% WG one year after transplantation, showing the importance of integrating genetic factors in the final model. The genetics of obesity among SOT recipients remains an important issue and can contribute to treatment personalization and prediction of WG after transplantation.

  16. Partnership working between the Fire Service and NHS: delivering a cost-saving service to improve the safety of high-risk people.

    PubMed

    Craig, Joyce A; Creegan, Shelagh; Tait, Martin; Dolan, Donna

    2015-04-14

    The Scottish Fire and Rescue Service and NHS Tayside piloted partnership working. A Community Fire Safety Link Worker provided Risk Assessments to adults, identified by community health teams, at high risk of fires, with the aim of reducing fires. An existing evaluation shows the Service developed a culture of 'high trust' between partners and achieved high client satisfaction. This paper reports on an economic evaluation of the costs and benefits of the Link Worker role. Changes in the Risk Assessment score following delivery of the Service were used to estimate the potential fires avoided, which were valued using a national cost of a fire; the estimated cost of delivering the Service was deducted from these savings. The pilot was estimated to prevent 4.4 fires, equivalent to savings of £286 per client. The estimated cost of delivering the Service was £55 per client, giving net savings of £231 per client. The pilot was cost-saving under all scenarios, with results sensitive to the probability of a fire. We believe this is the first economic evaluation of Fire Safety Risk Assessments. Partnership working, delivering joint Risk Assessments in the homes of people at high risk of fire, is modelled to be cost saving. Uncertainties in the data and the small sample size are key limitations, and further research is required into the ex ante risk of fire by risk category. Despite these limitations, the potential savings identified in this study support greater adoption of this partnership initiative.
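
    The cost-benefit arithmetic reported above can be reproduced directly from the per-client figures, including the noted sensitivity to the probability of a fire (savings scale linearly with it):

```python
# Per-client figures from the pilot evaluation (GBP)
gross_saving_per_client = 286.0    # value of fires avoided
delivery_cost_per_client = 55.0    # Link Worker service cost

net_saving_per_client = gross_saving_per_client - delivery_cost_per_client
print(net_saving_per_client)  # 231.0, as reported

# Savings scale linearly with the assumed probability of a fire, so the
# service stays cost-saving until that probability falls by this factor.
break_even_scale = gross_saving_per_client / delivery_cost_per_client
print(round(break_even_scale, 1))  # 5.2
```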

  17. CalTOX (registered trademark), A multimedia total exposure model spreadsheet user's guide. Version 4.0(Beta)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKone, T.E.; Enoch, K.G.

    2002-08-01

    CalTOX has been developed as a set of spreadsheet models and spreadsheet data sets to assist in assessing human exposures from continuous releases to multiple environmental media, i.e. air, soil, and water. It has also been used for waste classification and for setting soil clean-up levels at uncontrolled hazardous waste sites. The modeling components of CalTOX include a multimedia transport and transformation model, multi-pathway exposure scenario models, and add-ins to quantify and evaluate uncertainty and variability. All parameter values used as inputs to CalTOX are distributions, described in terms of mean values and a coefficient of variation, rather than as point estimates or plausible upper values such as most other models employ. This probabilistic approach allows both sensitivity and uncertainty analyses to be directly incorporated into the model operation. This manual provides CalTOX users with a brief overview of the CalTOX spreadsheet model and provides instructions for using the spreadsheet to make deterministic and probabilistic calculations of source-dose-risk relationships.
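
    Describing each input by a mean and coefficient of variation (CV), as CalTOX does, maps naturally onto a lognormal distribution, a common choice for non-negative environmental parameters. A sketch of that conversion and a small Monte Carlo draw; the lognormal family and the example parameter values are our assumptions, not necessarily CalTOX's:

```python
import math
import random

def lognormal_params(mean, cv):
    """Convert an arithmetic mean and CV into the (mu, sigma)
    of the underlying normal distribution of a lognormal."""
    sigma2 = math.log(1.0 + cv * cv)
    return math.log(mean) - 0.5 * sigma2, math.sqrt(sigma2)

def mc_sample(mean, cv, n, seed=0):
    """Monte Carlo draws reproducing the requested mean and CV."""
    mu, sigma = lognormal_params(mean, cv)
    rng = random.Random(seed)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

# hypothetical input parameter: mean 2.5, CV 0.8
draws = mc_sample(2.5, 0.8, 100_000)
print(sum(draws) / len(draws))  # close to the target mean of 2.5
```

    Propagating such draws through the transport and exposure models is what lets sensitivity and uncertainty analysis fall out of the same calculation.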

  18. Machine Learning for Social Services: A Study of Prenatal Case Management in Illinois.

    PubMed

    Pan, Ian; Nolan, Laura B; Brown, Rashida R; Khan, Romana; van der Boor, Paul; Harris, Daniel G; Ghani, Rayid

    2017-06-01

    To evaluate the positive predictive value of machine learning algorithms for early assessment of adverse birth risk among pregnant women, as a means of improving the allocation of social services. We used administrative data for 6457 women collected by the Illinois Department of Human Services from July 2014 to May 2015 to develop a machine learning model for adverse birth prediction and to improve upon the existing paper-based risk assessment. We compared different models and determined the strongest predictors of adverse birth outcomes, using positive predictive value as the metric for selection. Machine learning algorithms performed similarly, outperforming the current paper-based risk assessment by up to 36%; a refined paper-based assessment outperformed the current assessment by up to 22%. We estimate that these improvements will allow 100 to 170 additional high-risk pregnant women each year to be screened for program eligibility and to receive services that would otherwise have been unobtainable. Our analysis demonstrates the potential for machine learning to move government agencies toward a more data-informed approach to evaluating risk and providing social services. Overall, such efforts will improve the efficiency of allocating resource-intensive interventions.
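
    Using positive predictive value as the model-selection metric fits the program's setting: with a limited number of service slots, what matters is how many of the k women flagged as highest risk actually have an adverse outcome. A toy "precision at k" comparison (invented scores and labels, not the study's models):

```python
def ppv_at_k(scores, labels, k):
    """Positive predictive value among the k cases ranked highest risk."""
    top = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
    return sum(labels[i] for i in top) / k

# toy example: two models scoring the same 8 cases
labels  = [1, 0, 1, 1, 0, 0, 1, 0]          # adverse birth outcome
model_a = [.9, .8, .7, .6, .5, .4, .3, .2]
model_b = [.9, .2, .8, .7, .3, .1, .6, .4]
print(ppv_at_k(model_a, labels, 4))  # 0.75
print(ppv_at_k(model_b, labels, 4))  # 1.0
```

    Here model_b would be preferred: all four of its top-ranked cases are true positives, so every filled program slot reaches a woman who needed it.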

  19. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks. The model covers five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision. It relates these elements to the domains of experts and decision makers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  20. Predicting High Imaging Utilization Based on Initial Radiology Reports: A Feasibility Study of Machine Learning.

    PubMed

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    Imaging utilization has significantly increased over the last two decades and is only recently showing signs of moderating. To help healthcare providers identify patients at risk for high imaging utilization, we developed a prediction model to recognize high imaging utilizers based on their initial imaging reports. The prediction model uses a machine learning text classification framework. In this study, we used radiology reports from 18,384 patients with at least one abdomen computed tomography study in their imaging record at Stanford Health Care as the training set. We modeled the radiology reports in a vector space and trained a support vector machine classifier for this prediction task. We evaluated our model on a separate test set of 4791 patients. In addition to high prediction accuracy, we aimed to achieve high specificity in identifying patients at high risk for high imaging utilization. Our results (accuracy: 94.0%, sensitivity: 74.4%, specificity: 97.9%, positive predictive value: 87.3%, negative predictive value: 95.1%) show that a prediction model can enable healthcare providers to identify in advance patients who are likely to be high utilizers of imaging services. Machine learning classifiers developed from narrative radiology reports are feasible methods to predict imaging utilization. Such systems can be used to identify high utilizers, inform future image ordering behavior, and encourage judicious use of imaging. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
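
    The vector-space representation mentioned above can be illustrated with a bare-bones TF-IDF weighting; the study then trained a support vector machine on such vectors, a step this sketch omits. The reports and whitespace tokenization here are toy examples:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Map each document to a sparse TF-IDF vector (term -> weight)."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    return [{t: tf_count * math.log(n / df[t])
             for t, tf_count in Counter(toks).items()}
            for toks in tokenized]

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

reports = [
    "abdomen ct shows no acute findings",
    "abdomen ct shows renal mass follow up recommended",
    "chest xray clear lungs",
]
vecs = tfidf_vectors(reports)
print(round(cosine(vecs[0], vecs[1]), 3))  # the two abdomen CT reports align
print(round(cosine(vecs[0], vecs[2]), 3))  # disjoint vocabularies: 0.0
```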

  1. [The model of perioperative risk assessment in elderly patients - interim analysis].

    PubMed

    Grabowska, Izabela; Ścisło, Lucyna; Pietruszka, Szymon; Walewska, Elzbieta; Paszko, Agata; Siarkiewicz, Benita; Richter, Piotr; Budzyński, Andrzej; Szczepanik, Antoni M

    2017-04-21

    Demographic changes in contemporary society require implementation of proper perioperative care of elderly patients, due to an increased risk of perioperative complications in this group. Preoperative assessment of health status identifies risks and enables preventive interventions, improving outcomes of surgical treatment. The Comprehensive Geriatric Assessment contains numerous diagnostic tests and consultations, which is expensive and difficult to use in everyday practice. The development of a simplified model of perioperative assessment of elderly patients will help identify the group of patients who require further diagnostic workup. The aim of the study is to evaluate the usefulness of the tests used in a proposed model of perioperative risk assessment in elderly patients. In a group of 178 patients older than 64 years admitted for surgical procedures, a battery of tests was performed. The proposed model of perioperative risk assessment included: the Charlson Comorbidity Index, ADL (activities of daily living), the TUG test (timed "up and go" test), MNA (mini nutritional assessment), AMTS (abbreviated mental test score), spirometry, and measurement of respiratory muscle strength (PImax, PEmax). The distribution of abnormal results of each test was analysed. A Charlson Index over 6 points was recorded in 10.1% of patients (15.1% in cancer patients). An abnormal TUG test result was observed in 32.1%. A risk of malnutrition in the MNA test was identified in 29.7% (39.2% in cancer patients). Abnormal test results at the level of 10-30% indicate the potential diagnostic value of the Charlson Comorbidity Index, the TUG test and the MNA in the evaluation of perioperative risk in elderly patients.

  2. Adverse conditions at the workplace are associated with increased suicide risk.

    PubMed

    Baumert, Jens; Schneider, Barbara; Lukaschek, Karoline; Emeny, Rebecca T; Meisinger, Christa; Erazo, Natalia; Dragano, Nico; Ladwig, Karl-Heinz

    2014-10-01

    The present study addressed potential harms of a negative working environment for employed subjects. The main aim was to evaluate whether adverse working conditions and job strain are related to an increase in suicide mortality. The study population consisted of 6817 participants drawn from the MONICA/KORA Augsburg, Germany, surveys conducted in 1984-1995, employed at baseline examination and followed up on average for 12.6 years. Adverse working conditions were assessed by an instrument of 16 items about chronobiological, physical and psychosocial conditions at the workplace; job strain was assessed as defined by Karasek. Suicide risks were estimated by Cox regression adjusted for suicide-related risk factors. A total of 28 suicide cases were observed during follow-up. High levels of adversity in chronobiological/physical working conditions significantly increased the risk of suicide mortality (HR 3.28, 95% CI 1.43-7.54) compared to low/intermediate levels in a model adjusted for age, sex and survey (p value 0.005). Additional adjustment for living alone, low educational level, smoking, high alcohol consumption, obesity and depressed mood attenuated this effect (HR 2.73), but significance remained (p value 0.022). Adverse psychosocial working conditions and job strain, in contrast, had no impact on subsequent suicide mortality risk (p values > 0.200). A negative working environment concerning chronobiological or physical conditions at the workplace had an unfavourable impact on suicide mortality risk, even after controlling for relevant suicide-related risk factors. Employer interventions aimed at improving workplace conditions might be considered a suitable means to prevent suicides among employees. Copyright © 2014 Elsevier Ltd. All rights reserved.
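
    The reported hazard ratio, confidence interval and p value are mutually consistent, which can be checked directly: a Wald 95% CI is symmetric on the log scale, so the standard error of the Cox coefficient is (ln hi - ln lo)/3.92, and the two-sided p value follows from the z statistic:

```python
import math

def wald_p_from_hr(hr, ci_lo, ci_hi):
    """Two-sided Wald p value recovered from a hazard ratio and its
    95% CI, which is symmetric on the log scale."""
    beta = math.log(hr)
    se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)
    z = beta / se
    return math.erfc(abs(z) / math.sqrt(2))  # = 2 * (1 - Phi(|z|))

# HR 3.28 (95% CI 1.43-7.54) reproduces the reported p value of 0.005
print(round(wald_p_from_hr(3.28, 1.43, 7.54), 3))  # 0.005
```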

  3. Evaluation of Revised International Staging System (R-ISS) for transplant-eligible multiple myeloma patients.

    PubMed

    González-Calle, Verónica; Slack, Abigail; Keane, Niamh; Luft, Susan; Pearce, Kathryn E; Ketterling, Rhett P; Jain, Tania; Chirackal, Sintosebastian; Reeder, Craig; Mikhael, Joseph; Noel, Pierre; Mayo, Angela; Adams, Roberta H; Ahmann, Gregory; Braggio, Esteban; Stewart, A Keith; Bergsagel, P Leif; Van Wier, Scott A; Fonseca, Rafael

    2018-04-06

    The International Myeloma Working Group has proposed the Revised International Staging System (R-ISS) for risk stratification of multiple myeloma (MM) patients. A limited number of studies have validated this risk model in the autologous stem cell transplant (ASCT) setting. In this retrospective study, we evaluated the applicability and value for predicting survival of the R-ISS model in 134 MM patients treated with new agents and ASCT at the Mayo Clinic in Arizona and the University Hospital of Salamanca in Spain. The patients were reclassified at diagnosis according to the R-ISS: 44 patients (33%) had stage I, 75 (56%) had stage II, and 15 (11%) had stage III. After a median follow-up of 60 months, R-ISS assessed at diagnosis was an independent predictor of overall survival (OS) after ASCT, with median OS not reached, 111 months, and 37 months for R-ISS I, II and III, respectively (P < 0.001). We also found that patients belonging to R-ISS II with high-risk chromosomal abnormalities (CA) had a significantly shorter median OS than those with R-ISS II without CA: 70 vs. 111 months, respectively. Therefore, this study lends further support to the R-ISS as a reliable prognostic tool for estimating survival in transplant myeloma patients and suggests the importance of high-risk CA in the R-ISS II group.

  4. Mining geriatric assessment data for in-patient fall prediction models and high-risk subgroups

    PubMed Central

    2012-01-01

    Background: Hospital in-patient falls constitute a prominent problem in terms of costs and consequences. Geriatric institutions are most often affected, and common screening tools cannot predict in-patient falls consistently. Our objectives are to derive comprehensible fall risk classification models from a large data set of geriatric in-patients' assessment data and to evaluate their predictive performance (aim#1), and to identify high-risk subgroups from the data (aim#2). Methods: A data set of n = 5,176 single in-patient episodes covering 1.5 years of admissions to a geriatric hospital was extracted from the hospital's database and matched with fall incident reports (n = 493). A classification tree model was induced using the C4.5 algorithm, as well as a logistic regression model, and their predictive performance was evaluated. Furthermore, high-risk subgroups were identified from extracted classification rules with a support of more than 100 instances. Results: The classification tree model showed an overall classification accuracy of 66%, with a sensitivity of 55.4%, a specificity of 67.1%, and positive and negative predictive values of 15% and 93.5%, respectively. Five high-risk groups were identified, defined by high age, low Barthel index, cognitive impairment, multi-medication and co-morbidity. Conclusions: Our results show that a little more than half of the fallers may be identified correctly by our model, but the positive predictive value is too low to be applicable. Non-fallers, on the other hand, may be sorted out with the model quite well. The high-risk subgroups and the risk factors identified (age, low ADL score, cognitive impairment, institutionalization, polypharmacy and co-morbidity) reflect domain knowledge and may be used to screen certain subgroups of patients with a high risk of falling. Classification models derived from a large data set using data mining methods can compete with current dedicated fall risk screening tools, yet lack diagnostic precision. High-risk subgroups may be identified automatically from existing geriatric assessment data, especially when combined with domain knowledge in a hybrid classification model. Further work is necessary to validate our approach in a controlled prospective setting. PMID:22417403
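
    The predictive values reported above follow from sensitivity, specificity and the fall prevalence in the data set (493/5,176, about 9.5%) via Bayes' rule, which is why the PPV is so low despite moderate specificity:

```python
def predictive_values(sens, spec, prev):
    """PPV and NPV from sensitivity, specificity and prevalence."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

prev = 493 / 5176                 # fall prevalence in the data set
ppv, npv = predictive_values(0.554, 0.671, prev)
print(round(ppv * 100, 1), round(npv * 100, 1))  # ~15 and 93.5, as reported
```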

  5. Mining geriatric assessment data for in-patient fall prediction models and high-risk subgroups.

    PubMed

    Marschollek, Michael; Gövercin, Mehmet; Rust, Stefan; Gietzelt, Matthias; Schulze, Mareike; Wolf, Klaus-Hendrik; Steinhagen-Thiessen, Elisabeth

    2012-03-14

    Hospital in-patient falls constitute a prominent problem in terms of costs and consequences. Geriatric institutions are most often affected, and common screening tools cannot predict in-patient falls consistently. Our objectives are to derive comprehensible fall risk classification models from a large data set of geriatric in-patients' assessment data and to evaluate their predictive performance (aim#1), and to identify high-risk subgroups from the data (aim#2). A data set of n = 5,176 single in-patient episodes covering 1.5 years of admissions to a geriatric hospital was extracted from the hospital's database and matched with fall incident reports (n = 493). A classification tree model was induced using the C4.5 algorithm, as well as a logistic regression model, and their predictive performance was evaluated. Furthermore, high-risk subgroups were identified from extracted classification rules with a support of more than 100 instances. The classification tree model showed an overall classification accuracy of 66%, with a sensitivity of 55.4%, a specificity of 67.1%, and positive and negative predictive values of 15% and 93.5%, respectively. Five high-risk groups were identified, defined by high age, low Barthel index, cognitive impairment, multi-medication and co-morbidity. Our results show that a little more than half of the fallers may be identified correctly by our model, but the positive predictive value is too low to be applicable. Non-fallers, on the other hand, may be sorted out with the model quite well. The high-risk subgroups and the risk factors identified (age, low ADL score, cognitive impairment, institutionalization, polypharmacy and co-morbidity) reflect domain knowledge and may be used to screen certain subgroups of patients with a high risk of falling. Classification models derived from a large data set using data mining methods can compete with current dedicated fall risk screening tools, yet lack diagnostic precision. High-risk subgroups may be identified automatically from existing geriatric assessment data, especially when combined with domain knowledge in a hybrid classification model. Further work is necessary to validate our approach in a controlled prospective setting.

  6. The Relative Impact of Climate Change on the Extinction Risk of Tree Species in the Montane Tropical Andes.

    PubMed

    Tejedor Garavito, Natalia; Newton, Adrian C; Golicher, Duncan; Oldfield, Sara

    2015-01-01

    There are widespread concerns that anthropogenic climate change will become a major cause of global biodiversity loss. However, the potential impact of climate change on the extinction risk of species remains poorly understood, particularly in comparison to other current threats. The objective of this research was to examine the relative impact of climate change on extinction risk of upper montane tree species in the tropical Andes, an area of high biodiversity value that is particularly vulnerable to climate change impacts. The extinction risk of 129 tree species endemic to the region was evaluated according to the IUCN Red List criteria, both with and without the potential impacts of climate change. Evaluations were supported by development of species distribution models, using three methods (generalized additive models, recursive partitioning, and support vector machines), all of which produced similarly high AUC values when averaged across all species evaluated (0.82, 0.86, and 0.88, respectively). Inclusion of climate change increased the risk of extinction of 18-20% of the tree species evaluated, depending on the climate scenario. The relative impact of climate change was further illustrated by calculating the Red List Index, an indicator that shows changes in the overall extinction risk of sets of species over time. A 15% decline in the Red List Index was obtained when climate change was included in this evaluation. While these results suggest that climate change represents a significant threat to tree species in the tropical Andes, they contradict previous suggestions that climate change will become the most important cause of biodiversity loss in coming decades. Conservation strategies should therefore focus on addressing the multiple threatening processes currently affecting biodiversity, rather than focusing primarily on potential climate change impacts.
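
    The AUC values used to compare the three modelling methods have a simple rank interpretation: the probability that a randomly chosen presence site receives a higher suitability score than a randomly chosen absence site. A pure-Python sketch on toy scores:

```python
def auc(scores, labels):
    """ROC AUC via its pairwise (Mann-Whitney) interpretation:
    P(score of a positive > score of a negative), ties count one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy suitability scores: label 1 = species present, 0 = absent
scores = [0.9, 0.8, 0.65, 0.6, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0, 0]
print(auc(scores, labels))  # 11 of 12 presence-absence pairs ranked correctly
```

    An AUC of 0.5 is chance-level discrimination, so the 0.82-0.88 averages reported above indicate all three methods separate presences from absences well.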

  7. The Relative Impact of Climate Change on the Extinction Risk of Tree Species in the Montane Tropical Andes

    PubMed Central

    Tejedor Garavito, Natalia; Newton, Adrian C.; Golicher, Duncan; Oldfield, Sara

    2015-01-01

    There are widespread concerns that anthropogenic climate change will become a major cause of global biodiversity loss. However, the potential impact of climate change on the extinction risk of species remains poorly understood, particularly in comparison to other current threats. The objective of this research was to examine the relative impact of climate change on extinction risk of upper montane tree species in the tropical Andes, an area of high biodiversity value that is particularly vulnerable to climate change impacts. The extinction risk of 129 tree species endemic to the region was evaluated according to the IUCN Red List criteria, both with and without the potential impacts of climate change. Evaluations were supported by development of species distribution models, using three methods (generalized additive models, recursive partitioning, and support vector machines), all of which produced similarly high AUC values when averaged across all species evaluated (0.82, 0.86, and 0.88, respectively). Inclusion of climate change increased the risk of extinction of 18–20% of the tree species evaluated, depending on the climate scenario. The relative impact of climate change was further illustrated by calculating the Red List Index, an indicator that shows changes in the overall extinction risk of sets of species over time. A 15% decline in the Red List Index was obtained when climate change was included in this evaluation. While these results suggest that climate change represents a significant threat to tree species in the tropical Andes, they contradict previous suggestions that climate change will become the most important cause of biodiversity loss in coming decades. Conservation strategies should therefore focus on addressing the multiple threatening processes currently affecting biodiversity, rather than focusing primarily on potential climate change impacts. PMID:26177097

  8. Development and validation of an automated delirium risk assessment system (Auto-DelRAS) implemented in the electronic health record system.

    PubMed

    Moon, Kyoung-Ja; Jin, Yinji; Jin, Taixian; Lee, Sun-Mi

    2018-01-01

    A key component of delirium management is prevention and early detection. The aims were to develop an automated delirium risk assessment system (Auto-DelRAS) that automatically alerts health care providers to an intensive care unit (ICU) patient's delirium risk, based only on data collected in an electronic health record (EHR) system, and to evaluate the clinical validity of this system. Cohort and system development designs were used. The setting comprised medical and surgical ICUs in two university hospitals in Seoul, Korea. A total of 3284 patients were included for the development of Auto-DelRAS, 325 for external validation, and 694 for validation after clinical application. The 4211 data items were extracted from the EHR system, and delirium was measured using the CAM-ICU (Confusion Assessment Method for the Intensive Care Unit). Potential predictors were selected and a logistic regression model was established to create a delirium risk scoring algorithm for the Auto-DelRAS. The Auto-DelRAS was evaluated at three months and one year after its application to clinical practice to establish the predictive validity of the system. Eleven predictors were finally included in the logistic regression model. The results of the Auto-DelRAS risk assessment were shown as high/moderate/low risk on a Kardex screen. The predictive validity, analyzed one year after the clinical application of Auto-DelRAS, showed a sensitivity of 0.88, specificity of 0.72, positive predictive value of 0.53, negative predictive value of 0.94, and a Youden index of 0.59. A relatively high level of predictive validity was maintained with the Auto-DelRAS system, even one year after it was applied to clinical practice. Copyright © 2017. Published by Elsevier Ltd.
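
    Once the logistic model is fitted, the scoring step behind a system like Auto-DelRAS is mechanical: combine the EHR predictors linearly, pass the sum through the logistic function, and bin the probability into the displayed high/moderate/low categories. A sketch with entirely hypothetical coefficients, predictors and cutoffs (the abstract does not publish the eleven predictors or their thresholds):

```python
import math

# Hypothetical coefficients for illustration only; the real
# Auto-DelRAS model uses eleven EHR-derived predictors.
INTERCEPT = -3.0
COEFS = {"age_over_65": 0.9, "mech_ventilation": 1.4, "sedative_use": 0.8}

def delirium_risk(patient):
    """Logistic score -> probability -> display category."""
    logit = INTERCEPT + sum(c * patient.get(k, 0) for k, c in COEFS.items())
    prob = 1.0 / (1.0 + math.exp(-logit))
    if prob >= 0.5:
        return prob, "high"
    if prob >= 0.2:
        return prob, "moderate"
    return prob, "low"

prob, category = delirium_risk({"age_over_65": 1, "mech_ventilation": 1})
print(round(prob, 2), category)  # 0.33 moderate
```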

  9. Assessing the reliability of dose coefficients for exposure to radioiodine by members of the public, accounting for dosimetric and risk model uncertainties.

    PubMed

    Puncher, M; Zhang, W; Harrison, J D; Wakeford, R

    2017-06-26

    Assessments of risk to a specific population group resulting from internal exposure to a particular radionuclide can be used to assess the reliability of the appropriate International Commission on Radiological Protection (ICRP) dose coefficients used as a radiation protection device for the specified exposure pathway. An estimate of the uncertainty on the associated risk is important for informing judgments on reliability; a derived uncertainty factor, UF, is an estimate of the 95% probable geometric difference between the best risk estimate and the nominal risk, and is a useful tool for making this assessment. This paper describes the application of parameter uncertainty analysis to quantify uncertainties resulting from internal exposures to radioiodine by members of the public, specifically 1-, 10- and 20-year-old females from the population of England and Wales. Best estimates of thyroid cancer incidence risk (lifetime attributable risk) are calculated for ingestion or inhalation of 129I and 131I, accounting for uncertainties in biokinetic model and cancer risk model parameter values. These estimates are compared with the equivalent ICRP-derived nominal age-, sex- and population-averaged estimates of excess thyroid cancer incidence to obtain UFs. Derived UF values for ingestion or inhalation of 131I for 1-, 10- and 20-year-olds are around 28, 12 and 6, respectively, when compared with ICRP Publication 103 nominal values, and 9, 7 and 14, respectively, when compared with ICRP Publication 60 values. Broadly similar results were obtained for 129I. The uncertainties on risk estimates are largely determined by uncertainties on risk model parameters rather than uncertainties on biokinetic model parameters. An examination of the sensitivity of the results to the risk models and populations used in the calculations shows variations in the central estimates of risk of a factor of around 2-3. It is assumed that the direct proportionality of excess thyroid cancer risk and dose observed at low to moderate acute doses and incorporated in the risk models also applies to very small doses received at very low dose rates; the uncertainty in this assumption is considerable, but largely unquantifiable. The UF values illustrate the need for an informed approach to the use of ICRP dose and risk coefficients.

  10. ADMS Evaluation Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2018-01-23

    Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.

  11. The Effect of Forest Management Strategy on Carbon Storage and Revenue in Western Washington: A Probabilistic Simulation of Tradeoffs.

    PubMed

    Fischer, Paul W; Cullen, Alison C; Ettl, Gregory J

    2017-01-01

    The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45- and 65-year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber-oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3-14%), and short-term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land-holding costs, a no-harvest management scenario would become revenue-positive at a carbon credit break-point price of $14.17/Mg carbon dioxide equivalent (CO2e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business-as-usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation. © 2016 Society for Risk Analysis.
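The break-even carbon price logic can be sketched with a toy Monte Carlo simulation. All input distributions below are hypothetical placeholders, not the study's FVS-based estimates:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000  # Monte Carlo draws

# Hypothetical per-hectare inputs (illustrative values, not study data):
timber_npv = rng.normal(3000, 600, n)        # $/ha, business-as-usual NPV
credits = rng.normal(250, 40, n)             # creditable Mg CO2e/ha
holding_cost_npv = rng.normal(900, 150, n)   # $/ha land-holding costs

# Break-even carbon price per draw: the price at which the no-harvest
# scenario's carbon revenue covers both the forgone timber NPV and the
# holding costs.
breakeven_price = (timber_npv + holding_cost_npv) / credits

median_price = float(np.median(breakeven_price))
print(f"median break-even price: ${median_price:.2f}/Mg CO2e")
```

Propagating the input uncertainty through the ratio, rather than dividing point estimates, is what yields the distribution of break-even prices (and hence the chance of large losses) that the study emphasizes.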

  12. An overview of techniques for linking high-dimensional molecular data to time-to-event endpoints by risk prediction models.

    PubMed

    Binder, Harald; Porzelius, Christine; Schumacher, Martin

    2011-03-01

    Analysis of molecular data promises identification of biomarkers for improving prognostic models, thus potentially enabling better patient management. For identifying such biomarkers, risk prediction models can be employed that link high-dimensional molecular covariate data to a clinical endpoint. In low-dimensional settings, a multitude of statistical techniques already exists for building such models, e.g. allowing for variable selection or for quantifying the added value of a new biomarker. We provide an overview of techniques for regularized estimation that transfer this toward high-dimensional settings, with a focus on models for time-to-event endpoints. Techniques for incorporating specific covariate structure are discussed, as well as techniques for dealing with more complex endpoints. Employing gene expression data from patients with diffuse large B-cell lymphoma, some typical modeling issues from low-dimensional settings are illustrated in a high-dimensional application. First, the performance of classical stepwise regression is compared to stage-wise regression, as implemented by a component-wise likelihood-based boosting approach. A second issue arises when artificially transforming the response into a binary variable. The effects of the resulting loss of efficiency and potential bias in a high-dimensional setting are illustrated, and a link to competing risks models is provided. Finally, we discuss conditions for adequately quantifying the added value of high-dimensional gene expression measurements, both at the stage of model fitting and when performing evaluation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A Risk Prediction Model for Sporadic CRC Based on Routine Lab Results.

    PubMed

    Boursi, Ben; Mamtani, Ronac; Hwang, Wei-Ting; Haynes, Kevin; Yang, Yu-Xiao

    2016-07-01

    Current risk scores for colorectal cancer (CRC) are based on demographic and behavioral factors and have limited predictive values. To develop a novel risk prediction model for sporadic CRC using clinical and laboratory data in electronic medical records. We conducted a nested case-control study in a UK primary care database. Cases included those with a diagnostic code of CRC, aged 50-85. Each case was matched with four controls using incidence density sampling. CRC predictors were examined using univariate conditional logistic regression. Variables with p value <0.25 in the univariate analysis were further evaluated in multivariate models using backward elimination. Discrimination was assessed using the receiver operating characteristic (ROC) curve. Calibration was evaluated using McFadden's R2. Net reclassification index (NRI) associated with incorporation of laboratory results was calculated. Results were internally validated. A model similar to existing CRC prediction models which included age, sex, height, obesity, ever smoking, alcohol dependence, and previous screening colonoscopy had an AUC of 0.58 (0.57-0.59) with poor goodness of fit. A laboratory-based model including hematocrit, MCV, lymphocytes, and neutrophil-lymphocyte ratio (NLR) had an AUC of 0.76 (0.76-0.77) and a McFadden's R2 of 0.21 with an NRI of 47.6%. A combined model including sex, hemoglobin, MCV, white blood cells, platelets, NLR, and oral hypoglycemic use had an AUC of 0.80 (0.79-0.81) with a McFadden's R2 of 0.27 and an NRI of 60.7%. Similar results were shown in an internal validation set. A laboratory-based risk model had good predictive power for sporadic CRC risk.
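The net reclassification index reported above can be computed as follows. This sketch implements the category-free (continuous) NRI variant with made-up risk scores, not the paper's categorical thresholds:

```python
import numpy as np

def net_reclassification_index(old_risk, new_risk, event):
    """Category-free (continuous) NRI: any upward or downward movement in
    predicted risk counts as reclassification. Positive values mean the new
    model moves events up and non-events down more often than the reverse."""
    event = np.asarray(event, dtype=bool)
    up = np.asarray(new_risk) > np.asarray(old_risk)
    down = np.asarray(new_risk) < np.asarray(old_risk)
    nri_events = up[event].mean() - down[event].mean()
    nri_nonevents = down[~event].mean() - up[~event].mean()
    return nri_events + nri_nonevents

# Tiny illustrative example (hypothetical predicted risks and outcomes):
old = np.array([0.10, 0.20, 0.15, 0.30, 0.25, 0.05])
new = np.array([0.30, 0.40, 0.10, 0.35, 0.20, 0.02])
y   = np.array([1, 1, 0, 0, 1, 0])
nri = net_reclassification_index(old, new, y)
print(f"NRI = {nri:.3f}")
```

Here two of three events move up and two of three non-events move down, giving an NRI of 1/3 + 1/3 = 2/3.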

  14. Relationship between neck circumference, insulin resistance and arterial stiffness in overweight and obese subjects.

    PubMed

    Fantin, Francesco; Comellato, Gabriele; Rossi, Andrea P; Grison, Elisa; Zoico, Elena; Mazzali, Gloria; Zamboni, Mauro

    2017-09-01

    Background Only a few studies have investigated the relationship between neck circumference and cardiometabolic risk. The aim of this study was to assess the relationships between neck circumference, waist circumference, metabolic variables and arterial stiffness in a group of overweight and obese subjects, evaluating a possible independent role of neck circumference in determining arterial stiffness. Methods and results We studied 95 subjects (53 women) with an age range of 20-77 years and body mass index range from 25.69 to 47.04 kg/m2. In each subject we evaluated body mass index, waist, hip and neck circumference, systolic and diastolic blood pressure, insulin, fasting glucose, cholesterol, low-density lipoprotein and high-density lipoprotein cholesterol and triglycerides. Arterial stiffness was assessed by carotid-femoral pulse wave velocity (PWVcf) and carotid-radial pulse wave velocity (PWVcr). Both PWVcf and PWVcr were higher in subjects with high values of neck circumference compared with subjects with normal values of neck circumference. Subjects with high values of neck circumference and abdominal obesity presented higher values of mean arterial pressure, PWVcr and homeostasis model assessment (HOMA) index and lower values of high-density lipoprotein than subjects with only abdominal obesity. Two models of stepwise multiple regression were performed in order to evaluate the combined effect of independent variables on arterial stiffness. In the first model PWVcf was considered a dependent variable, and age, gender, systolic blood pressure, triglycerides, high-density lipoprotein cholesterol, waist circumference, neck circumference, HOMA index and the use of anti-hypertensive medications were considered independent variables. Age, systolic blood pressure, triglycerides and waist circumference were significant predictors of PWVcf, explaining 65% of its variance.
In the second model, in which PWVcr was considered a dependent variable, neck circumference and gender were significant predictors of PWVcr, explaining 24% of its variance. Conclusions These findings emphasise the need to measure not only waist but also neck circumference to better stratify and identify individuals at increased cardiometabolic risk, as upper-body subcutaneous fat is a novel, easily measured fat depot.

  15. Supervised Risk Predictor of Breast Cancer Based on Intrinsic Subtypes

    PubMed Central

    Parker, Joel S.; Mullins, Michael; Cheang, Maggie C.U.; Leung, Samuel; Voduc, David; Vickery, Tammi; Davies, Sherri; Fauron, Christiane; He, Xiaping; Hu, Zhiyuan; Quackenbush, John F.; Stijleman, Inge J.; Palazzo, Juan; Marron, J.S.; Nobel, Andrew B.; Mardis, Elaine; Nielsen, Torsten O.; Ellis, Matthew J.; Perou, Charles M.; Bernard, Philip S.

    2009-01-01

    Purpose To improve on current standards for breast cancer prognosis and prediction of chemotherapy benefit by developing a risk model that incorporates the gene expression–based “intrinsic” subtypes luminal A, luminal B, HER2-enriched, and basal-like. Methods A 50-gene subtype predictor was developed using microarray and quantitative reverse transcriptase polymerase chain reaction data from 189 prototype samples. Test sets from 761 patients (no systemic therapy) were evaluated for prognosis, and 133 patients were evaluated for prediction of pathologic complete response (pCR) to a taxane and anthracycline regimen. Results The intrinsic subtypes as discrete entities showed prognostic significance (P = 2.26E-12) and remained significant in multivariable analyses that incorporated standard parameters (estrogen receptor status, histologic grade, tumor size, and node status). A prognostic model for node-negative breast cancer was built using intrinsic subtype and clinical information. The C-index estimate for the combined model (subtype and tumor size) was a significant improvement on either the clinicopathologic model or subtype model alone. The intrinsic subtype model predicted neoadjuvant chemotherapy efficacy with a negative predictive value for pCR of 97%. Conclusion Diagnosis by intrinsic subtype adds significant prognostic and predictive information to standard parameters for patients with breast cancer. The prognostic properties of the continuous risk score will be of value for the management of node-negative breast cancers. The subtypes and risk score can also be used to assess the likelihood of efficacy from neoadjuvant chemotherapy. PMID:19204204

  16. Evaluating predictive modeling’s potential to improve teleretinal screening participation in urban safety net clinics

    PubMed Central

    Ogunyemi, Omolola; Teklehaimanot, Senait; Patty, Lauren; Moran, Erin; George, Sheba

    2013-01-01

    Introduction Screening guidelines for diabetic patients recommend yearly eye examinations to detect diabetic retinopathy and other forms of diabetic eye disease. However, annual screening rates for retinopathy in US urban safety net settings remain low. Methods Using data gathered from a study of teleretinal screening in six urban safety net clinics, we assessed whether predictive modeling could be of value in identifying patients at risk of developing retinopathy. We developed and examined the accuracy of two predictive modeling approaches for diabetic retinopathy in a sample of 513 diabetic individuals, using routinely available clinical variables from retrospective medical record reviews. Bayesian networks and radial basis function (neural) networks were learned using ten-fold cross-validation. Results The models were modestly predictive, with the best achieving an AUC of 0.71. Discussion Using routinely available clinical variables to predict patients at risk of developing retinopathy and to target them for annual eye screenings may be of some usefulness to safety net clinics. PMID:23920536

  17. Glycemic index, glycemic load and invasive breast cancer incidence in postmenopausal women: The PREDIMED study.

    PubMed

    Castro-Quezada, Itandehui; Sánchez-Villegas, Almudena; Martínez-González, Miguel Á; Salas-Salvadó, Jordi; Corella, Dolores; Estruch, Ramón; Schröder, Helmut; Álvarez-Pérez, Jacqueline; Ruiz-López, María D; Artacho, Reyes; Ros, Emilio; Bulló, Mónica; Sorli, Jose V; Fitó, Montserrat; Ruiz-Gutiérrez, Valentina; Toledo, Estefanía; Buil-Cosiales, Pilar; García Rodríguez, Antonio; Lapetra, José; Pintó, Xavier; Salaverría, Itziar; Tur, Josep A; Romaguera, Dora; Tresserra-Rimbau, Anna; Serra-Majem, Lluís

    2016-11-01

    The objective of this study was to evaluate the prospective associations between dietary glycemic index (GI) and glycemic load (GL) and the risk for invasive breast cancer incidence in postmenopausal women at high cardiovascular disease (CVD) risk. This study was conducted within the framework of the PREvención con DIeta MEDiterránea (PREDIMED) study, a nutritional intervention trial for primary cardiovascular prevention. We included 4010 women aged between 60 and 80 years who were initially free from breast cancer but at high risk for CVD. Dietary information was collected using a validated 137-item food frequency questionnaire. We assigned GI values using the International Tables of GI and GL values. Cases were ascertained through yearly consultation of medical records and through consultation of the National Death Index. Only cases confirmed by results from cytology tests or histological evaluation were included. We estimated multivariable-adjusted hazard ratios for invasive breast cancer risk across tertiles of energy-adjusted dietary GI/GL using Cox regression models. We repeated our analyses using yearly repeated measures of GI/GL intakes. No associations were found between baseline dietary GI/GL and invasive breast cancer incidence. The multivariable hazard ratio and 95% confidence interval (CI) for the top tertile of dietary GI was 1.02 (95% CI: 0.42-2.46) and for dietary GL was 1.00 (95% CI: 0.44-2.30) when compared with the bottom tertile. Repeated-measures analyses yielded similar results. In sensitivity analyses, no significant associations were observed for women with obesity or diabetes. Dietary GI and GL did not appear to be associated with an increased risk for invasive breast cancer in postmenopausal women at high CVD risk.

  18. Fitness components and ecological risk of transgenic release: a model using Japanese medaka (Oryzias latipes).

    PubMed

    Muir, W M; Howard, R D

    2001-07-01

    Any release of transgenic organisms into nature is a concern because ecological relationships between genetically engineered organisms and other organisms (including their wild-type conspecifics) are unknown. To address this concern, we developed a method to evaluate risk in which we input estimates of fitness parameters from a founder population into a recurrence model to predict changes in transgene frequency after a simulated transgenic release. With this method, we grouped various aspects of an organism's life cycle into six net fitness components: juvenile viability, adult viability, age at sexual maturity, female fecundity, male fertility, and mating advantage. We estimated these components for wild-type and transgenic individuals using the fish Japanese medaka (Oryzias latipes). We generalized our model's predictions using various combinations of fitness component values in addition to our experimentally derived estimates. Our model predicted that, for a wide range of parameter values, transgenes could spread in populations despite high juvenile viability costs if transgenes also have sufficiently high positive effects on other fitness components. Sensitivity analyses indicated that transgene effects on age at sexual maturity should have the greatest impact on transgene frequency, followed by juvenile viability, mating advantage, female fecundity, and male fertility, with changes in adult viability resulting in the least impact.
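The recurrence idea can be illustrated with a deliberately minimal one-locus sketch. The fitness values are hypothetical and the model is a simplification; the authors' model tracks six net fitness components and full genotype frequencies:

```python
# Minimal one-locus recurrence sketch of transgene spread (not the authors'
# full six-component model): allele frequency under selection, where the
# transgene carries a juvenile-viability cost but a mating advantage.

def next_freq(p, viability_cost=0.3, mating_advantage=2.0):
    """One generation of selection on the transgene allele. Net relative
    fitness of transgene carriers combines a juvenile viability cost with
    a mating advantage (both values are hypothetical)."""
    q = 1.0 - p
    w_transgene = (1.0 - viability_cost) * mating_advantage  # net fitness
    w_wildtype = 1.0
    mean_w = p * w_transgene + q * w_wildtype
    return p * w_transgene / mean_w

p = 0.01  # initial transgene frequency after a small simulated release
for _ in range(60):
    p = next_freq(p)
print(f"transgene frequency after 60 generations: {p:.3f}")
```

Even with a 30% viability cost, the net fitness of carriers here is 0.7 x 2.0 = 1.4 > 1, so the transgene spreads toward fixation, which is the qualitative point the abstract makes.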

  19. Routine real-time cost-effectiveness monitoring of a web-based depression intervention: a risk-sharing proposal.

    PubMed

    Naveršnik, Klemen; Mrhar, Aleš

    2014-02-27

    A new health care technology must be cost-effective in order to be adopted. If evidence regarding cost-effectiveness is uncertain, then the decision maker faces two choices: (1) adopt the technology and run the risk that it is less effective in actual practice, or (2) reject the technology and risk that potential health is forgone. A new depression eHealth service was found to be cost-effective in a previously published study. The results, however, were unreliable because they were based on a pilot clinical trial. A conservative decision maker would normally require stronger evidence for the intervention to be implemented. Our objective was to evaluate how to facilitate service implementation by shifting the burden of risk due to uncertainty to the service provider and ensure that the intervention remains cost-effective during routine use. We propose a risk-sharing scheme, where the service cost depends on the actual effectiveness of the service in a real-life setting. Routine efficacy data can be used as the input to the cost-effectiveness model, which employs a mapping function to translate a depression-specific score into quality-adjusted life-years. The latter is the denominator in the cost-effectiveness ratio calculation, required by the health care decision maker. The output of the model is a "value graph", showing intervention value as a function of its observed (future) efficacy, using the €30,000 per quality-adjusted life-year (QALY) threshold. We found that the eHealth service should improve the patient's outcome by at least 11.9 points on the Beck Depression Inventory scale in order for the cost-effectiveness ratio to remain below the €30,000/QALY threshold. The value of a single point improvement was found to be between €200 and €700, depending on depression severity at treatment start. The value of the eHealth service, based on the current efficacy estimates, is €1900, which is significantly above its estimated cost (€200).
The eHealth depression service is particularly suited to routine monitoring, since data can be gathered through the Internet within the service communication channels. This enables real-time cost-effectiveness evaluation and allows a value-based price to be established. We propose a novel pricing scheme where the price is set to a point in the interval between cost and value, which provides an economic surplus to both the payer and the provider. Such a business model will assure that a portion of the surplus is retained by the payer and not completely appropriated by the private provider. If the eHealth service were to turn out less effective than originally anticipated, then the price would be lowered in order to achieve the cost-effectiveness threshold and this risk of financial loss would be borne by the provider.
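The pricing arithmetic can be sketched as follows, assuming a linear BDI-to-QALY mapping with a hypothetical coefficient (0.01 QALY per point, i.e. €300 of value per point, inside the €200-700 range reported above). The study's actual mapping function is nonlinear and severity-dependent, so this does not reproduce its 11.9-point threshold:

```python
# Sketch of value-based and risk-sharing pricing (hypothetical linear
# BDI -> QALY mapping; the study uses a published mapping function).
THRESHOLD = 30_000        # EUR per QALY, willingness-to-pay threshold
QALY_PER_POINT = 0.01     # assumed QALYs gained per BDI point improvement
SERVICE_COST = 200.0      # EUR, estimated cost of delivering the service

def value_based_price(bdi_improvement):
    """Maximum price at which the service stays at the EUR 30,000/QALY
    threshold: the 'value' axis of the value graph."""
    return bdi_improvement * QALY_PER_POINT * THRESHOLD

def risk_sharing_price(bdi_improvement, surplus_share=0.5):
    """Price set between cost and value so that payer and provider split
    the economic surplus; if observed efficacy yields no surplus, the
    price falls to cost and the provider bears the loss of margin."""
    value = value_based_price(bdi_improvement)
    return SERVICE_COST + surplus_share * max(value - SERVICE_COST, 0.0)

print(value_based_price(11.9), risk_sharing_price(11.9))
```

The key property is that the price tracks observed efficacy: better real-world outcomes raise the value-based ceiling and hence the negotiated price, while poor outcomes push the price back toward cost.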

  20. Improvement of selective screening strategy for gestational diabetes through a more accurate definition of high-risk groups.

    PubMed

    Pintaudi, Basilio; Di Vieste, Giacoma; Corrado, Francesco; Lucisano, Giuseppe; Pellegrini, Fabio; Giunta, Loretta; Nicolucci, Antonio; D'Anna, Rosario; Di Benedetto, Antonino

    2014-01-01

    This study aimed to assess the predictive value of risk factors (RFs) for gestational diabetes mellitus (GDM) established by selective screening (SS) and to identify subgroups of women at a higher risk of developing GDM. A retrospective, single-center study design was employed. Data of 1015 women screened for GDM at 24-28 weeks of gestation and diagnosed according to the International Association of Diabetes and Pregnancy Study Groups criteria were evaluated. Information on RFs established by SS was also collected and their association with GDM was determined. To identify distinct and homogeneous subgroups of patients at a higher risk, the RECursive Partitioning and AMalgamation (RECPAM) method was used. Overall, 113 (11.1%) women were diagnosed as having GDM. The application of the SS criteria would result in the execution of an oral glucose tolerance test (OGTT) in 58.3% of women, and 26 (23.0%) cases of GDM would not be detected due to the absence of any RF. The RECPAM analysis identified high-risk subgroups characterized by fasting plasma glucose values >5.1 mmol/l (odds ratio (OR)=26.5; 95% CI 14.3-49.0) and pre-pregnancy BMI (OR=7.0; 95% CI 3.9-12.8 for overweight women). In a final logistic model including RECPAM classes, previous macrosomia (OR=3.6; 95% CI 1.1-11.6), and family history of diabetes (OR=1.8; 95% CI 1.1-2.8), but not maternal age, were also found to be associated with an increased risk of developing GDM. A screening approach based on the RECPAM model would reduce by over 50% (23.0 vs 10.6%) the number of undiagnosed GDM cases when compared with the current SS approach, at the expense of 50 additional OGTTs. A screening approach based on our RECPAM model results in a significant reduction in the number of undetected GDM cases compared with the current SS procedure.

  1. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or 50 mg (5+25 and 5+50) scored the highest clinical utility and supported taking the solifenacin-mirabegron combination into phase III clinical development at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
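The MCDA scoring step can be sketched as a weighted sum of normalized attribute scores. The weights and scores below are invented for illustration; they are not the study's elicited values:

```python
# Minimal MCDA sketch: clinical utility as a weighted sum of 0-1 normalized
# attribute scores (all weights and scores are hypothetical).

WEIGHTS = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}

ARMS = {
    "solifenacin 5 mg":   {"efficacy": 0.55, "safety": 0.80, "tolerability": 0.85},
    "mirabegron 50 mg":   {"efficacy": 0.52, "safety": 0.85, "tolerability": 0.90},
    "solifenacin 5 mg + mirabegron 50 mg":
                          {"efficacy": 0.75, "safety": 0.70, "tolerability": 0.80},
}

def clinical_utility(scores):
    """Weighted-sum aggregation across the MCDA attributes."""
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

ranked = sorted(ARMS, key=lambda arm: clinical_utility(ARMS[arm]), reverse=True)
for arm in ranked:
    print(f"{arm}: {clinical_utility(ARMS[arm]):.3f}")
```

With these toy numbers the combination's efficacy gain outweighs its small safety/tolerability penalty, so it ranks first, mirroring the qualitative conclusion of the abstract.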

  2. Correlation between oral health in disabled children and depressive symptoms in their mothers.

    PubMed

    D'Alessandro, G; Cremonesi, I; Alkhamis, N; Piana, G

    2014-09-01

    The aim of this study was to evaluate the presence and degree of depressive symptoms in mothers of disabled children and to assess the correlation between maternal major depression risk and son/daughter oral health. A prospective study was conducted in 51 disabled children and their 51 mothers. In children, dmft/DMFT values, food and/or sugar-sweetened consumption levels and daily tooth-brushing frequency were evaluated. Depressive maternal symptoms were measured by the EPDS questionnaire: the questionnaire scores were converted into positive predictive values (PPV) that represented the risk of falling into major depression. A regression analysis was performed on the variables (statistical significance was set at p ≤ 0.05). The children's (8.68 ± 3.98 years old) average dmft/DMFT was 2.7. Fifty-three percent of the mothers (38.37 ± 6.04 years) were at risk for depression (PPV > 60%), while depressive symptoms were already present in 25% of the subjects (PPV=100%). Mothers of disabled children are more likely to fall into major depression compared to mothers of healthy children. For each mother-child couple the correlation between different variables was evaluated: there was a statistically significant correlation between children's dmft/DMFT values and mothers' depression risk. The risk of maternal depression was statistically correlated to prevalence of caries and sugar consumption in children.

  3. Risk assessment models to evaluate the necessity of prostate biopsies in North Chinese patients with 4-50 ng/mL PSA.

    PubMed

    Zhao, Jing; Liu, Shuai; Gao, Dexuan; Ding, Sentai; Niu, Zhihong; Zhang, Hui; Huang, Zhilong; Qiu, Juhui; Li, Qing; Li, Ning; Xie, Fang; Cui, Jilei; Lu, Jiaju

    2017-02-07

    Prostate-specific antigen (PSA) is widely used for prostate cancer screening, but low specificity results in high false positive rates of prostate biopsies. To develop new risk assessment models to overcome the diagnostic limitation of PSA and reduce unnecessary prostate biopsies in North Chinese patients with 4-50 ng/mL PSA. A total of 702 patients in seven hospitals with 4-10 or 10-50 ng/mL PSA who had undergone transrectal ultrasound-guided prostate biopsies were assessed. An analysis-modeling stage was carried out for several clinical indexes related to prostate cancer and renal function. Multiple logistic regression analyses were used to develop new risk assessment models of prostate cancer for both PSA level ranges, 4-10 and 10-50 ng/mL. An external validation stage of the new models was performed to assess the necessity of biopsy. The new models for both PSA ranges performed significantly better than PSA for detecting prostate cancers. Both models showed higher areas under the curves (0.937 and 0.873, respectively) compared with PSA alone (0.624 and 0.595), at pre-determined cut-off values of 0.1067 and 0.6183, respectively. Patients above the cut-off values were recommended for immediate biopsy, while the others were actively observed. External validation of the models showed significantly increased detection rates for prostate cancer (4-10 ng/mL group, 39.29% vs 17.79%, p=0.006; 10-50 ng/mL group, 71.83% vs 50.0%, p=0.015). We developed risk assessment models for North Chinese patients with 4-50 ng/mL PSA to reduce unnecessary prostate biopsies and increase the detection rate of prostate cancer.

  4. On set-valued functionals: Multivariate risk measures and Aumann integrals

    NASA Astrophysics Data System (ADS)

    Ararat, Cagin

    In this dissertation, multivariate risk measures for random vectors and Aumann integrals of set-valued functions are studied. Both are set-valued functionals with values in a complete lattice of subsets of R^m. Multivariate risk measures are considered in a general d-asset financial market with trading opportunities in discrete time. Specifically, the following features of the market are incorporated in the evaluation of multivariate risk: convex transaction costs modeled by solvency regions, intermediate trading constraints modeled by convex random sets, and the requirement of liquidation into the first m ≤ d of the assets. It is assumed that the investor has a "pure" multivariate risk measure R on the space of m-dimensional random vectors which represents her risk attitude towards the assets but does not take into account the frictions of the market. Then, the investor with a d-dimensional position minimizes the set-valued functional R over all m-dimensional positions that she can reach by trading in the market subject to the frictions described above. The resulting functional Rmar on the space of d-dimensional random vectors is another multivariate risk measure, called the market-extension of R. A dual representation for Rmar that decomposes the effects of R and the frictions of the market is proved. Next, multivariate risk measures are studied in a utility-based framework. It is assumed that the investor has a complete risk preference towards each individual asset, which can be represented by a von Neumann-Morgenstern utility function. Then, an incomplete preference is considered for multivariate positions which is represented by the vector of the individual utility functions. Under this structure, multivariate shortfall and divergence risk measures are defined as the optimal values of set minimization problems. The dual relationship between the two classes of multivariate risk measures is constructed via a recent Lagrange duality for set optimization.
In particular, it is shown that a shortfall risk measure can be written as an intersection over a family of divergence risk measures indexed by a scalarization parameter. Examples include the multivariate versions of the entropic risk measure and the average value at risk. In the second part, Aumann integrals of set-valued functions on a measurable space are viewed as set-valued functionals and a Daniell-Stone type characterization theorem is proved for such functionals. More precisely, it is shown that a functional that maps measurable set-valued functions into a certain complete lattice of subsets of R^m can be written as the Aumann integral with respect to a measure if and only if the functional (1) is additive, (2) is positively homogeneous, (3) preserves decreasing limits, (4) maps halfspace-valued functions to halfspaces, and (5) maps shifted cone-valued functions to shifted cones. While the first three properties already exist in the classical Daniell-Stone theorem for the Lebesgue integral, the last two properties are peculiar to the set-valued framework and they suffice to complement the first three properties to identify a set-valued functional as the Aumann integral with respect to a measure.
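The intersection result for shortfall and divergence risk measures can be written schematically as follows. This is a sketch in simplified notation, not the dissertation's exact definitions: the index set W and the superscripts are placeholders, and the scalar formula is the classical (one-dimensional) analogue:

```latex
% Shortfall risk measure as an intersection of divergence risk measures,
% indexed by a scalarization parameter w (schematic set-valued form):
R^{\mathrm{sf}}(X) \;=\; \bigcap_{w \in \mathcal{W}} R^{\mathrm{div}}_{w}(X)

% Scalar analogue of one member of the family: the average value at risk,
% in its optimized-certainty-equivalent (divergence-type) representation:
\mathrm{AV@R}_{\alpha}(X) \;=\; \inf_{x \in \mathbb{R}}
  \left\{\, x + \tfrac{1}{\alpha}\,\mathbb{E}\!\left[(-X - x)^{+}\right] \right\}
```

In the set-valued setting, intersection of upper sets plays the role that the supremum plays for scalar risk measures, which is why the family is combined by intersection here.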

  5. An Extreme-Value Approach to Anomaly Vulnerability Identification

    NASA Technical Reports Server (NTRS)

    Everett, Chris; Maggio, Gaspare; Groen, Frank

    2010-01-01

    The objective of this paper is to present a method for importance analysis in parametric probabilistic modeling where the result of interest is the identification of potential engineering vulnerabilities associated with postulated anomalies in system behavior. In the context of Accident Precursor Analysis (APA), under which this method has been developed, these vulnerabilities, designated as anomaly vulnerabilities, are conditions that produce high risk in the presence of anomalous system behavior. The method defines a parameter-specific Parameter Vulnerability Importance measure (PVI), which identifies anomaly risk-model parameter values that indicate the potential presence of anomaly vulnerabilities, and allows them to be prioritized for further investigation. This entails analyzing each uncertain risk-model parameter over its credible range of values to determine where it produces the maximum risk. A parameter that produces high system risk for a particular range of values suggests that the system is vulnerable to the modeled anomalous conditions, if indeed the true parameter value lies in that range. Thus, PVI analysis provides a means of identifying and prioritizing anomaly-related engineering issues that at the very least warrant improved understanding to reduce uncertainty, such that true vulnerabilities may be identified and proper corrective actions taken.
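The PVI scan can be sketched as follows, with a toy two-parameter conditional risk model standing in for a full probabilistic risk assessment (both the model and the nominal parameter values are hypothetical):

```python
import numpy as np

# Toy conditional risk model: risk grows with an anomaly leak rate and
# falls with detection probability (illustrative functional form only).
def system_risk(leak_rate, detection_prob):
    return leak_rate * (1.0 - detection_prob)

# Credible range for each uncertain parameter, discretized for the scan:
grid = np.linspace(0.0, 1.0, 101)

# PVI-style importance: the maximum risk the model produces anywhere in a
# parameter's credible range, holding the other parameter at its nominal
# value (nominals 0.2 and 0.9 are hypothetical).
pvi_leak = max(system_risk(v, 0.9) for v in grid)    # scan leak_rate
pvi_detect = max(system_risk(0.2, v) for v in grid)  # scan detection_prob

print(pvi_leak, pvi_detect)
```

Here the larger PVI for detection_prob would flag its credible range as the one most worth investigating: if the true detection probability sits at the low end of that range, the system is vulnerable to the modeled anomaly.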

  6. Extreme value modelling of Ghana stock exchange index.

    PubMed

    Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe

    2015-01-01

    Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling of such rare events leading to these crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying the extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedastic terms present in the returns series, before the EVT method was applied. The Peak Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The sizes of extreme daily Ghanaian stock market movements were then computed using the value at risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
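The peaks-over-threshold step can be sketched as follows. For self-containment this uses simulated exponential losses and method-of-moments GPD estimates, rather than the paper's ARMA-GARCH-filtered returns and maximum likelihood fit:

```python
import numpy as np

rng = np.random.default_rng(1)
losses = rng.exponential(1.0, 5000)  # stand-in for negated daily returns

u = np.percentile(losses, 95)        # high threshold
exc = losses[losses > u] - u         # excesses over the threshold
zeta_u = exc.size / losses.size      # empirical tail fraction P(L > u)

# Method-of-moments GPD estimates (shape xi, scale beta) from the excesses:
m, v = exc.mean(), exc.var()
xi = 0.5 * (1.0 - m * m / v)
beta = m * (1.0 - xi)

def var_es(q):
    """Standard POT formulas for the q-quantile VaR and the matching ES
    (valid for xi < 1)."""
    var_q = u + (beta / xi) * (((1.0 - q) / zeta_u) ** (-xi) - 1.0)
    es_q = var_q / (1.0 - xi) + (beta - xi * u) / (1.0 - xi)
    return var_q, es_q

var99, es99 = var_es(0.99)
print(f"VaR(99%) = {var99:.2f}, ES(99%) = {es99:.2f}")
```

Because the simulated losses are exponential, the fitted shape parameter comes out near zero and VaR(99%) lands near the theoretical ln(100) ≈ 4.6; for real heavy-tailed returns the fitted xi would be positive and the ES/VaR gap correspondingly wider.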

  7. Evaluation and simplification of the occupational slip, trip and fall risk-assessment test

    PubMed Central

    NAKAMURA, Takehiro; OYAMA, Ichiro; FUJINO, Yoshihisa; KUBO, Tatsuhiko; KADOWAKI, Koji; KUNIMOTO, Masamizu; ODOI, Haruka; TABATA, Hidetoshi; MATSUDA, Shinya

    2016-01-01

    Objective: The purpose of this investigation is to evaluate the efficacy of the occupational slip, trip and fall (STF) risk assessment test developed by the Japan Industrial Safety and Health Association (JISHA). We further intended to simplify the test to improve efficiency. Methods: A cohort study was performed on 540 employees aged ≥50 years who had taken the JISHA’s STF risk assessment test. We conducted multivariate analysis using these previous results as baseline values and answers to questionnaire items or scores on physical fitness tests as variables. The screening efficiency of each model was evaluated based on the obtained receiver operating characteristic (ROC) curve. Results: The area under the ROC curve obtained in multivariate analysis was 0.79 when using all items. Six of the 25 questionnaire items were selected by stepwise analysis, giving an area under the ROC curve of 0.77. Conclusion: Based on the results of follow-up performed one year after the initial examination, we confirmed the usefulness of the STF risk assessment test. Administering the questionnaire alone is sufficient for screening subjects at risk of STF during the subsequent one-year period. PMID:27021057

  8. Detection of Cardiovascular Disease Risk's Level for Adults Using Naive Bayes Classifier.

    PubMed

    Miranda, Eka; Irwansyah, Edy; Amelga, Alowisius Y; Maribondang, Marco M; Salim, Mulyadi

    2016-07-01

    The number of deaths caused by cardiovascular disease and stroke is predicted to reach 23.3 million in 2030. As a contribution to the prevention of this phenomenon, this paper proposes a mining model using a naïve Bayes classifier that can detect cardiovascular disease and identify its risk level for adults. The process of designing the method began by identifying the knowledge related to the cardiovascular disease profile and the level of cardiovascular disease risk factors for adults based on medical records, and then designing a mining technique model using a naïve Bayes classifier. Evaluation of this research employed two methods: calculation of accuracy, sensitivity, and specificity, as well as an evaluation session with cardiologists and internists. The characteristics of cardiovascular disease are identified by its primary risk factors: diabetes mellitus, the level of lipids in the blood, coronary artery function, and kidney function. Class labels were assigned according to the values of these factors: risk level 1, risk level 2, and risk level 3. The evaluation of the classifier's performance (accuracy, sensitivity, and specificity) showed that the proposed model predicted the class label of tuples correctly (above 80%). More than eighty percent of the respondents (cardiologists and internists) who participated in the evaluation session agreed or strongly agreed that this research followed medical procedures and that the result can support medical analysis related to cardiovascular disease. The research showed that the proposed model achieves good performance for risk level detection of cardiovascular disease.
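
    The classifier idea described above can be illustrated with a small Gaussian naïve Bayes sketch. The toy feature pair (fasting glucose, LDL cholesterol) and the synthetic training points are invented for illustration only; the actual study used medical-record factors such as coronary artery and kidney function, possibly in categorical form.

```python
import math
from collections import defaultdict
from statistics import mean, pvariance

class GaussianNB:
    """Minimal Gaussian naive Bayes: per-class feature means/variances,
    prediction by maximum log-likelihood (equal class priors assumed)."""
    def fit(self, X, y):
        groups = defaultdict(list)
        for row, label in zip(X, y):
            groups[label].append(row)
        self.stats = {
            label: [(mean(col), pvariance(col)) for col in zip(*rows)]
            for label, rows in groups.items()
        }
        return self

    def predict(self, row):
        def log_lik(label):
            return sum(
                -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
                for x, (mu, var) in zip(row, self.stats[label])
            )
        return max(self.stats, key=log_lik)

# Toy (glucose mg/dL, LDL mg/dL) profiles for three risk levels.
X = [(90, 150), (95, 160), (100, 155),      # risk level 1
     (130, 200), (140, 195), (135, 205),    # risk level 2
     (180, 240), (190, 250), (200, 245)]    # risk level 3
y = [1, 1, 1, 2, 2, 2, 3, 3, 3]

clf = GaussianNB().fit(X, y)
print(clf.predict((92, 152)), clf.predict((195, 248)))  # → 1 3
```

    Each new patient is assigned the risk level whose per-feature Gaussians make the observed values most likely, which is the same maximum-posterior rule a categorical naïve Bayes would apply to discretized factors.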

  9. Risk stratification for arrhythmic death in an emergency department cohort: a new method of nonlinear PD2i analysis of the ECG

    PubMed Central

    Skinner, James E; Meyer, Michael; Dalsey, William C; Nester, Brian A; Ramalanjaona, George; O’Neil, Brian J; Mangione, Antoinette; Terregino, Carol; Moreyra, Abel; Weiss, Daniel N; Anchin, Jerry M; Geary, Una; Taggart, Pamela

    2008-01-01

    Heart rate variability (HRV) reflects both cardiac autonomic function and risk of sudden arrhythmic death (AD). Indices of HRV based on linear stochastic models are independent risk factors for AD in postmyocardial infarction (MI) cohorts. Indices based on nonlinear deterministic models have a higher sensitivity and specificity for predicting AD in retrospective data. A new nonlinear deterministic model, the automated Point Correlation Dimension (PD2i), was prospectively evaluated for prediction of AD. Patients were enrolled (N = 918) in 6 emergency departments (EDs) upon presenting with chest pain and being determined to have a >7% risk of acute MI (AMI). Brief digital ECGs (>1000 heartbeats, ∼15 min) were recorded and automated PD2i results obtained. Out-of-hospital AD was determined by modified Hinkle-Thaler criteria. All-cause mortality at 1 year was 6.2%, with 3.5% being ADs. Of the AD fatalities, 34% were without previous history of MI or diagnosis of AMI. The PD2i prediction of AD had sensitivity = 96%, specificity = 85%, negative predictive value = 99%, and relative risk >24.2 (p ≤ 0.001). HRV analysis by the time-dependent nonlinear PD2i algorithm can accurately predict risk of AD in an ED cohort and may have both life-saving and resource-saving implications for individual risk assessment. PMID:19209249

  10. Foraging and predation risk for larval cisco (Coregonus artedi) in Lake Superior: a modelling synthesis of empirical survey data

    USGS Publications Warehouse

    Myers, Jared T.; Yule, Daniel L.; Jones, Michael L.; Quinlan, Henry R.; Berglund, Eric K.

    2014-01-01

    The relative importance of predation and food availability as contributors to larval cisco (Coregonus artedi) mortality in Lake Superior were investigated using a visual foraging model to evaluate potential predation pressure by rainbow smelt (Osmerus mordax) and a bioenergetic model to evaluate potential starvation risk. The models were informed by observations of rainbow smelt, larval cisco, and zooplankton abundance at three Lake Superior locations during the period of spring larval cisco emergence and surface-oriented foraging. Predation risk was highest at Black Bay, ON, where average rainbow smelt densities in the uppermost 10 m of the water column were >1000 ha−1. Turbid conditions at the Twin Ports, WI-MN, affected larval cisco predation risk because rainbow smelt remained suspended in the upper water column during daylight, placing them alongside larval cisco during both day and night hours. Predation risk was low at Cornucopia, WI, owing to low smelt densities (<400 ha−1) and deep light penetration, which kept rainbow smelt near the lakebed and far from larvae during daylight. In situ zooplankton density estimates were low compared to the values used to develop the larval coregonid bioenergetics model, leading to predictions of negative growth rates for 10 mm larvae at all three locations. The model predicted that 15 mm larvae were capable of attaining positive growth at Cornucopia and the Twin Ports where low water temperatures (2–6 °C) decreased their metabolic costs. Larval prey resources were highest at Black Bay but warmer water temperatures there offset the benefit of increased prey availability. A sensitivity analysis performed on the rainbow smelt visual foraging model showed that it was relatively insensitive, while the coregonid bioenergetics model showed that the absolute growth rate predictions were highly sensitive to input parameters (i.e., 20% parameter perturbation led to order of magnitude differences in model estimates). 
Our modelling indicated that rainbow smelt predation may limit larval cisco survival at Black Bay and, to a lesser extent, at the Twin Ports, and that starvation may be a major source of mortality at all three locations. The framework we describe has the potential to further our understanding of the relative importance of starvation and predation for larval fish survivorship, provided that prey resources available to larvae are measured at sufficiently fine spatial scales and the models provide a realistic depiction of the dynamic processes that the larvae experience.

  11. Deficient Contractor Business Systems: Applying the Value at Risk (VaR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-30

    QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004)... assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and... www.amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC

  12. No added value of age at menopause and the lifetime cumulative number of menstrual cycles for cardiovascular risk prediction in postmenopausal women.

    PubMed

    Atsma, Femke; van der Schouw, Yvonne T; Grobbee, Diederick E; Hoes, Arno W; Bartelink, Marie-Louise E L

    2008-11-12

    The aim of the present study was to investigate the added value of age at menopause and the lifetime cumulative number of menstrual cycles in cardiovascular risk prediction in postmenopausal women. This study included 971 women. The ankle-arm index was used as a proxy for cardiovascular morbidity and mortality. The ankle-arm index was calculated for each leg by dividing the highest ankle systolic blood pressure by the highest brachial systolic blood pressure. A cut-off value of 0.95 was used to differentiate between low- and high-risk women. Three cardiovascular risk models were constructed. In the initial model, all classical predictors for cardiovascular disease were investigated. This model was then extended with age at menopause or the lifetime cumulative number of menstrual cycles to test their added value for cardiovascular risk prediction. Differences in discriminative power between the models were investigated by comparing the areas under the receiver operating characteristic (ROC) curves. The mean age was 66.0 (±5.6) years. The 6 independent predictors for cardiovascular disease were age, systolic blood pressure, total to HDL cholesterol ratio, current smoking, glucose level, and body mass index ≥30 kg/m². The ROC area was 0.69 (0.64-0.73) and did not change when age at menopause or the lifetime cumulative number of menstrual cycles was added. The findings in this study among postmenopausal women did not support the view that age at menopause or a refined estimation of lifetime endogenous estrogen exposure would improve cardiovascular risk prediction as approximated by the ankle-arm index.
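
    The ankle-arm index computation described above is simple arithmetic and can be sketched directly. The per-leg rule (highest ankle pressure over highest brachial pressure) and the 0.95 cut-off come from the abstract; taking the lower of the two per-leg indices and reading values below the cut-off as high risk are assumptions, consistent with the usual interpretation of a low ankle-arm index.

```python
def ankle_arm_index(ankle_pressures, brachial_pressures):
    """Per-leg index: highest ankle systolic BP / highest brachial systolic BP."""
    return max(ankle_pressures) / max(brachial_pressures)

def high_risk(left_ankle, right_ankle, brachial, cutoff=0.95):
    """Flag high risk when the lower of the two per-leg indices falls
    below the cut-off (assumed direction: low index = high risk)."""
    indices = (ankle_arm_index(left_ankle, brachial),
               ankle_arm_index(right_ankle, brachial))
    return min(indices) < cutoff

# Example: left-leg ankle readings 118/122 mmHg, brachial 150/145 mmHg.
print(ankle_arm_index([118, 122], [150, 145]))        # 122/150 ≈ 0.813
print(high_risk([118, 122], [140, 150], [150, 145]))  # True: left index < 0.95
```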

  13. [Screening of malnutrition risk versus indicators of nutritional status and systemic inflammatory response in newly diagnosed lung cancer patients].

    PubMed

    Illa, P; Tomíšková, M; Skřičková, J

    2014-01-01

    Most lung cancers are already advanced at the time of diagnosis. In these patients, protein-energy malnutrition is a frequent symptom, often diagnosed prior to oncological treatment. Malnutrition results in poor tolerance of treatment and increased morbidity and mortality. Nutritional Risk Screening (NRS) 2002, adapted for oncological patients, was used to assess the risk of undernutrition in a group of 188 lung cancer patients. The risk was evaluated on a 6-point scale according to common signs of nutritional status and risk factors related to the tumor and its treatment. A score of 3 or more ("nutritional risk") indicates a significant risk of malnutrition. Furthermore, pretreatment nutritional characteristics (including BMI) and laboratory values indicating malnutrition or an acute-phase response (albumin, C-reactive protein [CRP]) were evaluated. An acceptable NRS score was found in 50.6% of patients, while 45.3% were at risk of malnutrition ("nutritional risk"). Only 6.6% of our patients had a BMI less than 20 kg/m2. Significant differences in albumin and CRP values across NRS categories were confirmed. Initial signs of cancer malnutrition may be overlooked in patients whose BMI falls within or above the adequate-weight range, although these patients may be at significant risk of malnutrition. The indicators of nutritional status and systemic inflammatory response were significantly associated with the resulting NRS score.

  14. Assessing and Mitigating Hurricane Storm Surge Risk in a Changing Environment

    NASA Astrophysics Data System (ADS)

    Lin, N.; Shullman, E.; Xian, S.; Feng, K.

    2017-12-01

    Hurricanes have induced devastating storm surge flooding worldwide. The impacts of these storms may worsen in the coming decades because of rapid coastal development coupled with sea-level rise and possibly increasing storm activity due to climate change. Major advances in coastal flood risk management are urgently needed. We present an integrated dynamic risk analysis for flooding task (iDraft) framework to assess and manage coastal flood risk at the city or regional scale, considering integrated dynamic effects of storm climatology change, sea-level rise, and coastal development. We apply the framework to New York City. First, we combine climate-model projected storm surge climatology and sea-level rise with engineering- and social/economic-model projected coastal exposure and vulnerability to estimate the flood damage risk for the city over the 21st century. We derive temporally-varying risk measures such as the annual expected damage as well as temporally-integrated measures such as the present value of future losses. We also examine the individual and joint contributions to the changing risk of the three dynamic factors (i.e., sea-level rise, storm change, and coastal development). Then, we perform probabilistic cost-benefit analysis for various coastal flood risk mitigation strategies for the city. Specifically, we evaluate previously proposed mitigation measures, including elevating houses on the floodplain and constructing flood barriers at the coast, by comparing their estimated cost and probability distribution of the benefit (i.e., present value of avoided future losses). We also propose new design strategies, including optimal design (e.g., optimal house elevation) and adaptive design (e.g., flood protection levels that are designed to be modified over time in a dynamic and uncertain environment).

  15. Deficient Contractor Business Systems: Applying the Value at Risk (VAR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-01

    measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the... quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost... [Kindle version]. Retrieved from Amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review

  16. Illustrative case using the RISK21 roadmap and matrix: prioritization for evaluation of chemicals found in drinking water

    PubMed Central

    Wolf, Douglas C.; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I.; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P.; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R.

    2016-01-01

    ABSTRACT The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework’s roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization. PMID:26451723

  17. Illustrative case using the RISK21 roadmap and matrix: prioritization for evaluation of chemicals found in drinking water.

    PubMed

    Wolf, Douglas C; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R

    2016-01-01

    The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework's roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization.

  18. Factors predictive of obstructive sleep apnea in patients undergoing pre-operative evaluation for bariatric surgery and referred to a sleep laboratory for polysomnography

    PubMed Central

    Duarte, Ricardo Luiz de Menezes; Magalhães-da-Silveira, Flavio José

    2015-01-01

    Objective: To identify the main predictive factors for obtaining a diagnosis of obstructive sleep apnea (OSA) in patients awaiting bariatric surgery. Methods: Retrospective study of consecutive patients undergoing pre-operative evaluation for bariatric surgery and referred for in-laboratory polysomnography. Eight variables were evaluated: sex, age, neck circumference (NC), BMI, Epworth Sleepiness Scale (ESS) score, snoring, observed apnea, and hypertension. We employed ROC curve analysis to determine the best cut-off value for each variable and multiple linear regression to identify independent predictors of OSA severity. Results: We evaluated 1,089 patients, of whom 781 (71.7%) were female. The overall prevalence of OSA, defined as an apnea-hypopnea index (AHI) ≥ 5.0 events/h, was 74.8%. The best cut-off values for NC, BMI, age, and ESS score were 42 cm, 42 kg/m2, 37 years, and 10 points, respectively. All eight variables were found to be independent predictors of a diagnosis of OSA in general, and all but one were found to be independent predictors of a diagnosis of moderate/severe OSA (AHI ≥ 15.0 events/h), the exception being hypertension. We devised a 6-item model, designated the NO-OSAS model (NC, Obesity, Observed apnea, Snoring, Age, and Sex), with a cut-off value of ≥ 3 for identifying high-risk patients. For a diagnosis of moderate/severe OSA, the model showed 70.8% accuracy, 82.8% sensitivity, and 57.9% specificity. Conclusions: In our sample of patients awaiting bariatric surgery, there was a high prevalence of OSA. At a cut-off value of ≥ 3, the proposed 6-item model showed good accuracy for a diagnosis of moderate/severe OSA. PMID:26578136
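
    A sketch of how the 6-item NO-OSAS score might be computed. The ≥ 3 cut-off and the component thresholds (NC 42 cm, BMI 42 kg/m2, age 37 years) are reported in the abstract; scoring one point per positive item and coding male sex as the at-risk category are assumptions, since the abstract does not spell out the item weights.

```python
def no_osas_score(nc_cm, bmi, age, male, snoring, observed_apnea):
    """One point per positive NO-OSAS item (assumed equal weighting):
    NC >= 42 cm, Obesity (BMI >= 42), Observed apnea, Snoring,
    Age >= 37, male Sex."""
    return sum([
        nc_cm >= 42,
        bmi >= 42,
        observed_apnea,
        snoring,
        age >= 37,
        male,
    ])

def high_risk(score, cutoff=3):
    """Patients scoring at or above the cut-off are flagged as high risk."""
    return score >= cutoff

score = no_osas_score(nc_cm=45, bmi=44, age=52, male=True,
                      snoring=True, observed_apnea=True)
print(score, high_risk(score))  # 6 True
```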

  19. Development and Evaluation of an Automated Machine Learning Algorithm for In-Hospital Mortality Risk Adjustment Among Critical Care Patients.

    PubMed

    Delahanty, Ryan J; Kaufman, David; Jones, Spencer S

    2018-06-01

    Risk adjustment algorithms for ICU mortality are necessary for measuring and improving ICU performance. Existing risk adjustment algorithms are not widely adopted. Key barriers to adoption include licensing and implementation costs as well as labor costs associated with human-intensive data collection. Widespread adoption of electronic health records makes automated risk adjustment feasible. Using modern machine learning methods and open source tools, we developed and evaluated a retrospective risk adjustment algorithm for in-hospital mortality among ICU patients. The Risk of Inpatient Death score can be fully automated and is reliant upon data elements that are generated in the course of usual hospital processes. One hundred thirty-one ICUs in 53 hospitals operated by Tenet Healthcare. A cohort of 237,173 ICU patients discharged between January 2014 and December 2016. The data were randomly split into training (36 hospitals), and validation (17 hospitals) data sets. Feature selection and model training were carried out using the training set while the discrimination, calibration, and accuracy of the model were assessed in the validation data set. Model discrimination was evaluated based on the area under receiver operating characteristic curve; accuracy and calibration were assessed via adjusted Brier scores and visual analysis of calibration curves. Seventeen features, including a mix of clinical and administrative data elements, were retained in the final model. The Risk of Inpatient Death score demonstrated excellent discrimination (area under receiver operating characteristic curve = 0.94) and calibration (adjusted Brier score = 52.8%) in the validation dataset; these results compare favorably to the published performance statistics for the most commonly used mortality risk adjustment algorithms. Low adoption of ICU mortality risk adjustment algorithms impedes progress toward increasing the value of the healthcare delivered in ICUs. 
The Risk of Inpatient Death score has many attractive attributes that address the key barriers to adoption of ICU risk adjustment algorithms and performs comparably to existing human-intensive algorithms. Automated risk adjustment algorithms have the potential to obviate known barriers to adoption such as cost-prohibitive licensing fees and significant direct labor costs. Further evaluation is needed to ensure that the level of performance observed in this study could be achieved at independent sites.

  20. A radiomics model from joint FDG-PET and MRI texture features for the prediction of lung metastases in soft-tissue sarcomas of the extremities

    NASA Astrophysics Data System (ADS)

    Vallières, M.; Freeman, C. R.; Skamene, S. R.; El Naqa, I.

    2015-07-01

    This study aims at developing a joint FDG-PET and MRI texture-based model for the early evaluation of lung metastasis risk in soft-tissue sarcomas (STSs). We investigate if the creation of new composite textures from the combination of FDG-PET and MR imaging information could better identify aggressive tumours. Towards this goal, a cohort of 51 patients with histologically proven STSs of the extremities was retrospectively evaluated. All patients had pre-treatment FDG-PET and MRI scans comprised of T1-weighted and T2-weighted fat-suppression sequences (T2FS). Nine non-texture features (SUV metrics and shape features) and forty-one texture features were extracted from the tumour region of separate (FDG-PET, T1 and T2FS) and fused (FDG-PET/T1 and FDG-PET/T2FS) scans. Volume fusion of the FDG-PET and MRI scans was implemented using the wavelet transform. The influence of six different extraction parameters on the predictive value of textures was investigated. The incorporation of features into multivariable models was performed using logistic regression. The multivariable modeling strategy involved imbalance-adjusted bootstrap resampling in the following four steps leading to final prediction model construction: (1) feature set reduction; (2) feature selection; (3) prediction performance estimation; and (4) computation of model coefficients. Univariate analysis showed that the isotropic voxel size at which texture features were extracted had the most impact on predictive value. In multivariable analysis, texture features extracted from fused scans significantly outperformed those from separate scans in terms of lung metastases prediction estimates. The best performance was obtained using a combination of four texture features extracted from FDG-PET/T1 and FDG-PET/T2FS scans. This model reached an area under the receiver-operating characteristic curve of 0.984 ± 0.002, a sensitivity of 0.955 ± 0.006, and a specificity of 0.926 ± 0.004 in bootstrapping evaluations. 
Ultimately, lung metastasis risk assessment at diagnosis of STSs could improve patient outcomes by allowing better treatment adaptation.

  1. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the quality of the pumped water after transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How should the impacts from different contaminants and spill locations be aggregated to an overall, cumulative impact on the value at risk? (2) How can the stochastic nature of spill events be properly accounted for when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, i.e., the values at risk, risk attitudes and risk metrics that can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept, uses mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
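
    The aggregation idea behind STORM, combining stochastically occurring spills from several hazards into one damage distribution by Monte Carlo simulation, can be caricatured in a few lines. The two hazards, their spill probabilities, and the fixed unit impacts below are invented numbers; a real application would replace the fixed impacts with output from the flow-and-transport models.

```python
import random
from statistics import mean

# Hypothetical hazards: (annual spill probability, damage to the value
# at risk if that hazard spills, e.g. a monetized impact at the well).
HAZARDS = [(0.10, 5.0), (0.02, 50.0)]

def simulate_year(rng):
    """One Monte Carlo realization: each hazard independently spills or
    not; impacts are aggregated additively across hazards."""
    return sum(impact for p, impact in HAZARDS if rng.random() < p)

rng = random.Random(7)
damages = [simulate_year(rng) for _ in range(20000)]

expected_damage = mean(damages)               # risk as expected damage
p_exceed = mean(d > 10.0 for d in damages)    # a tail-oriented risk metric
print(f"E[damage]={expected_damage:.2f}, P(damage>10)={p_exceed:.4f}")
```

    Different stakeholder objectives then correspond to different functionals of the same simulated damage distribution: a risk-neutral stakeholder reads off the mean, while a risk-averse one looks at an exceedance probability or a high quantile.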

  2. Conservation Risks: When Will Rhinos be Extinct?

    PubMed

    Haas, Timothy C; Ferreira, Sam M

    2016-08-01

    We develop a risk intelligence system for biodiversity enterprises. Such enterprises depend on a supply of endangered species for their revenue. Many of these enterprises, however, cannot purchase a supply of this resource and are largely unable to secure the resource against theft in the form of poaching. Because replacements are not available once a species becomes extinct, insurance products are not available to reduce the risk exposure of these enterprises to an extinction event. For many species, the dynamics of anthropogenic impacts driven by economic as well as noneconomic values of associated wildlife products, along with their ecological stressors, can help meaningfully predict extinction risks. We develop an agent/individual-based economic-ecological model that captures these effects and apply it to the case of South African rhinos. Our model uses observed rhino dynamics and poaching statistics. It seeks to predict rhino extinction under the present scenario, which has no legal horn trade but allows live African rhino trade and legal hunting. Present rhino populations are small and threatened by a rising onslaught of poaching. This scenario and its associated dynamics predict continued decline in rhino population size, with accelerated extinction risks for rhinos by 2036. Our model supports the computation of extinction risks at any future time point. This capability can be used to evaluate the effectiveness of proposed conservation strategies at reducing a species' extinction risk. Models used to compute risk predictions, however, need to be statistically estimated. We point out that statistically fitting such models to observations will involve massive numbers of observations on consumer behavior and time-stamped location observations on thousands of animals. Finally, we propose Big Data algorithms to perform such estimates and to interpret the fitted model's output.

  3. Temperature Prediction Model for Bone Drilling Based on Density Distribution and In Vivo Experiments for Minimally Invasive Robotic Cochlear Implantation.

    PubMed

    Feldmann, Arne; Anso, Juan; Bell, Brett; Williamson, Tom; Gavaghan, Kate; Gerber, Nicolas; Rohrbach, Helene; Weber, Stefan; Zysset, Philippe

    2016-05-01

    Surgical robots have been proposed ex vivo to drill precise holes in the temporal bone for minimally invasive cochlear implantation. The main risk of the procedure is damage to the facial nerve due to mechanical interaction or due to temperature elevation during the drilling process. To evaluate the thermal risk of the drilling process, a simplified model is proposed which aims to enable an assessment of the risk posed to the facial nerve for a given set of constant process parameters for different mastoid bone densities. The model uses the bone density distribution along the drilling trajectory in the mastoid bone to calculate a time-dependent heat production function at the tip of the drill bit. Using a time-dependent moving point source Green's function, the heat equation can be solved at a certain point in space so that the resulting temperatures can be calculated over time. The model was calibrated and initially verified with in vivo temperature data. The data were collected in minimally invasive robotic drilling of 12 holes in four different sheep. The sheep were anesthetized and the temperature elevations were measured with a thermocouple inserted in a previously drilled hole next to the planned drilling trajectory. Bone density distributions were extracted from pre-operative CT data by averaging Hounsfield values over the drill bit diameter. Post-operative micro-CT (μCT) data were used to verify the drilling accuracy of the trajectories. The comparison of measured and calculated temperatures shows a very good match for both heating and cooling phases. The average prediction error of the maximum temperature was less than 0.7 °C and the average root mean square error was approximately 0.5 °C. To analyze potential thermal damage, the model was used to calculate temperature profiles and cumulative equivalent minutes at 43 °C at a minimal distance to the facial nerve.
For the selected drilling parameters, temperature elevation profiles and cumulative equivalent minutes suggest that thermal elevation of this minimally invasive cochlear implantation surgery may pose a risk to the facial nerve, especially in sclerotic or high density mastoid bones. Optimized drilling parameters need to be evaluated and the model could be used for future risk evaluation.
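The moving point-source approach described above can be illustrated with the classical instantaneous point-source solution of the heat equation in an infinite medium, ΔT = E·exp(−r²/4αt) / (ρc·(4παt)^{3/2}), superposed along a discretised drill trajectory. The sketch below is a generic illustration, not the study's calibrated model; the thermal parameters (`alpha`, `rho_c`) and the drill path are placeholder values.

```python
import numpy as np

def point_source_dT(E, r, t, alpha, rho_c):
    """Temperature rise at distance r and elapsed time t after an
    instantaneous point release of energy E (J) in an infinite medium."""
    return E / (rho_c * (4.0 * np.pi * alpha * t) ** 1.5) * np.exp(-r**2 / (4.0 * alpha * t))

def moving_source_dT(obs, times, positions, powers, alpha, rho_c, t_eval):
    """Superpose instantaneous kernels along a discretised drill path to
    approximate a moving, time-dependent heat source."""
    dT = 0.0
    for i in range(len(times) - 1):
        tau = t_eval - times[i]
        if tau <= 0.0:
            continue  # source segments in the future contribute nothing
        dt = times[i + 1] - times[i]
        r = np.linalg.norm(obs - positions[i])
        dT += point_source_dT(powers[i] * dt, r, tau, alpha, rho_c)
    return dT

# Placeholder properties loosely representative of bone (assumed, not calibrated)
ALPHA = 1.5e-7   # thermal diffusivity, m^2/s
RHO_C = 2.3e6    # volumetric heat capacity, J/(m^3 K)
```

Both heating and cooling phases fall out of the superposition: the temperature at a fixed observation point rises while nearby path segments are active and decays afterwards.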

  4. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the RiskMetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, so the model is suitable for use by financial institutions as well.
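The generalized-Pareto building block of such models can be sketched with a plain peaks-over-threshold estimator: fit a GPD to the exceedances over a threshold u and invert the tail to get VaR_q = u + (σ/ξ)[((n/N_u)(1−q))^{−ξ} − 1]. This is a minimal generic sketch with a fixed empirical threshold, not the paper's wavelet-based threshold selection or hybrid model.

```python
import numpy as np
from scipy.stats import genpareto

def evt_var(losses, threshold, q=0.99):
    """Peaks-over-threshold VaR estimate from a GPD fit to exceedances.

    Assumes the fitted shape parameter xi is away from zero; the xi -> 0
    exponential limit would need a separate branch.
    """
    losses = np.asarray(losses, dtype=float)
    exceed = losses[losses > threshold] - threshold
    n, n_u = len(losses), len(exceed)
    xi, _, sigma = genpareto.fit(exceed, floc=0.0)  # location pinned at 0
    return threshold + (sigma / xi) * ((n / n_u * (1.0 - q)) ** (-xi) - 1.0)
```

For q above the threshold's empirical quantile, the estimate necessarily sits above the threshold regardless of the sign of the fitted ξ, since both the bracket and the σ/ξ prefactor flip sign together.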

  5. Maternal Characteristics for the Prediction of Preeclampsia in Nulliparous Women: The Great Obstetrical Syndromes (GOS) Study.

    PubMed

    Boutin, Amélie; Gasse, Cédric; Demers, Suzanne; Giguère, Yves; Tétu, Amélie; Bujold, Emmanuel

    2018-05-01

    Low-dose aspirin started in early pregnancy significantly reduces the risk of preeclampsia (PE) in high-risk women, especially preterm PE. This study aimed to evaluate the influence of maternal characteristics on the risk of PE in nulliparous women. The Great Obstetrical Syndromes (GOS) study recruited nulliparous women with singleton pregnancies at 11 to 13 weeks. The following maternal characteristics were collected: age, BMI, ethnicity, chronic diseases, smoking, and assisted reproductive technologies. Relative weight analyses were conducted, and predictive multivariate proportional hazard models were constructed. Receiver operating characteristic curve analyses with their area under the curve (AUC) were used to evaluate the value of each factor for the prediction of PE and preterm PE. The study also evaluated the SOGC guidelines for identification of women at high risk of PE. Of 4739 participants, 232 (4.9%) developed PE, including 30 (0.6%) with preterm PE. In univariate analyses, only BMI was significantly associated with the risk of PE (AUC 0.60; 95% CI 0.55-0.65) and preterm PE (AUC 0.64; 95% CI 0.54-0.73). Adding other maternal characteristics to BMI had a non-significant and marginal impact on the discriminative ability of the models for PE (AUC 0.62; 95% CI 0.58-0.66) and preterm PE (AUC 0.65; 95% CI 0.56-0.74). At a false-positive rate of 10%, maternal characteristics could have predicted 23% of PE and 19% of preterm PE. The SOGC guidelines were not discriminant for PE (detecting 96% of PE and 93% of preterm PE with a 94% false-positive rate). In nulliparous women, BMI is the most discriminant maternal characteristic for the prediction of PE. Maternal characteristics should not be used alone to identify nulliparous women at high risk of PE. Copyright © 2018 Society of Obstetricians and Gynaecologists of Canada. Published by Elsevier Inc. All rights reserved.
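AUC values like those reported for BMI reduce to the Mann–Whitney statistic: the probability that a randomly chosen case outranks a randomly chosen non-case. A minimal generic sketch, not the study's software:

```python
import numpy as np

def auc(y_true, score):
    """Rank-based AUC: fraction of (case, non-case) pairs ranked correctly,
    with ties counted as half. O(n^2) pairwise form, fine for illustration."""
    y = np.asarray(y_true, dtype=bool)
    s = np.asarray(score, dtype=float)
    pos, neg = s[y], s[~y]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))
```

Read this way, an AUC of 0.60 for BMI means a woman who developed PE had only a 60% chance of having the higher BMI in a random case/non-case pair.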

  6. Scintigraphic calf perfusion symmetry after exercise and prediction of cardiovascular events: One stone to kill two birds?

    NASA Astrophysics Data System (ADS)

    Tellier, Philippe; Lecouffe, Pascal; Zureik, Mahmoud

    2007-02-01

    Background: Peripheral arterial disease (PAD) is commonly associated with high cardiovascular mortality and morbidity as a marker of plurifocal atherosclerosis. Whether exercise thallium perfusion muscular asymmetry in the legs associated with PAD has prognostic value is unknown. This hypothesis was evaluated in a prospective study, the gold standard design in clinical research. Methods and results: Scintigraphic calf perfusion symmetry after exercise (SCPSE) was measured at the end of a maximal or symptom-limited treadmill exercise test in 358 patients with known or suspected coronary artery disease (CAD). During the follow-up period (mean 85.3±32.8 months), 93 cardiovascular events and deaths (incident cases) occurred. Among those incident cases, the percentage of subjects with higher SCPSE values (third tertile) was 45.2%, versus 29.1% in controls (lower tertiles) (p = 0.005). In stepwise multivariate analysis performed with the Cox proportional hazards model, previous CAD and SCPSE were the only significant independent predictors of prognosis. The multivariate relative risk of cardiovascular death or event in subjects with higher values of SCPSE was 1.94 (95% CI: 1.15-3.21; p<0.01). Conclusions: Scintigraphic calf perfusion asymmetry after exercise was independently associated with incident cardiovascular events in high-risk subjects. This index, which is easily and quickly calculated, could be used for evaluation of cardiovascular risk.

  7. Psychometric assessment of HIV/STI sexual risk scale among MSM: a Rasch model approach.

    PubMed

    Li, Jian; Liu, Hongjie; Liu, Hui; Feng, Tiejian; Cai, Yumao

    2011-10-05

    Little research has assessed the degree of severity and ordering of different types of sexual behaviors for HIV/STI infection in a measurement scale. The purpose of this study was to apply the Rasch model to the psychometric assessment of an HIV/STI sexual risk scale among men who have sex with men (MSM). A cross-sectional study using respondent-driven sampling was conducted among 351 MSM in Shenzhen, China. The Rasch model was used to examine the psychometric properties of an HIV/STI sexual risk scale comprising nine types of sexual behaviors. The Rasch analysis of the nine items met the unidimensionality and local independence assumptions. Although the person reliability was low at 0.35, the item reliability was high at 0.99. The fit statistics provided acceptable infit and outfit values. Item difficulty invariance analysis showed that the item estimates of the risk behavior items were invariant (within error). The findings suggest that the Rasch model can be utilized for measuring the level of sexual risk for HIV/STI infection as a single latent construct and for establishing the relative degree of severity of each type of sexual behavior in HIV/STI transmission and acquisition among MSM. The scale provides a useful measurement tool to inform the design and evaluation of behavioral interventions for HIV/STI infection among MSM.

  8. Integrating seasonal climate prediction and agricultural models for insights into agricultural practice

    PubMed Central

    Hansen, James W

    2005-01-01

    Interest in integrating crop simulation models with dynamic seasonal climate forecast models is expanding in response to a perceived opportunity to add value to seasonal climate forecasts for agriculture. Integrated modelling may help to address some obstacles to effective agricultural use of climate information. First, modelling can address the mismatch between farmers' needs and available operational forecasts. Probabilistic crop yield forecasts are directly relevant to farmers' livelihood decisions and, at a different scale, to early warning and market applications. Second, credible ex ante evidence of livelihood benefits, using integrated climate–crop–economic modelling in a value-of-information framework, may assist in the challenge of obtaining institutional, financial and political support; and inform targeting for greatest benefit. Third, integrated modelling can reduce the risk and learning time associated with adaptation and adoption, and related uncertainty on the part of advisors and advocates. It can provide insights to advisors, and enhance site-specific interpretation of recommendations when driven by spatial data. Model-based ‘discussion support systems’ contribute to learning and farmer–researcher dialogue. Integrated climate–crop modelling may play a genuine, but limited role in efforts to support climate risk management in agriculture, but only if such models are used appropriately, with understanding of their capabilities and limitations, and with cautious evaluation of model predictions and of the insights that arise from model-based decision analysis. PMID:16433092

  9. Developing a novel risk prediction model for severe malarial anemia.

    PubMed

    Brickley, E B; Kabyemela, E; Kurtis, J D; Fried, M; Wood, A M; Duffy, P E

    2017-01-01

    As a pilot study to investigate whether personalized medicine approaches could have value for the reduction of malaria-related mortality in young children, we evaluated questionnaire and biomarker data collected from the Mother Offspring Malaria Study Project birth cohort (Muheza, Tanzania, 2002-2006) at the time of delivery as potential prognostic markers for pediatric severe malarial anemia. Severe malarial anemia, defined here as a Plasmodium falciparum infection accompanied by hemoglobin levels below 50 g/L, is a key manifestation of life-threatening malaria in high transmission regions. For this study sample, a prediction model incorporating cord blood levels of interleukin-1β provided the strongest discrimination of severe malarial anemia risk with a C-index of 0.77 (95% CI 0.70-0.84), whereas a pragmatic model based on sex, gravidity, transmission season at delivery, and bed net possession yielded a more modest C-index of 0.63 (95% CI 0.54-0.71). Although additional studies, ideally incorporating larger sample sizes and higher event per predictor ratios, are needed to externally validate these prediction models, the findings provide proof of concept that risk score-based screening programs could be developed to avert severe malaria cases in early childhood.

  10. Risk control and the minimum significant risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seiler, F.A.; Alvarez, J.L.

    1996-06-01

    Risk management implies that the risk manager can, by his actions, exercise at least a modicum of control over the risk in question. In the terminology of control theory, a management action is a control signal imposed as feedback on the system to bring about a desired change in the state of the system. In the terminology of risk management, an action is taken to bring a predicted risk to lower values. Even if it is assumed that the management action taken is 100% effective and that the projected risk reduction is infinitely well known, there is a lower limit to the desired effects that can be achieved. It is based on the fact that all risks, such as the incidence of cancer, exhibit a degree of variability due to a number of extraneous factors such as age at exposure, sex, location, and some lifestyle parameters such as smoking or the consumption of alcohol. If the control signal is much smaller than the variability of the risk, the signal is lost in the noise and control is lost. This defines a minimum controllable risk based on the variability of the risk over the population considered. This quantity is the counterpart of the minimum significant risk which is defined by the uncertainties of the risk model. Both the minimum controllable risk and the minimum significant risk are evaluated for radiation carcinogenesis and are shown to be of the same order of magnitude. For a realistic management action, the assumptions of perfectly effective action and perfect model prediction made above have to be dropped, resulting in an effective minimum controllable risk which is determined by both risk limits. Any action below that effective limit is futile, but it is also unethical due to the ethical requirement of doing more good than harm. Finally, some implications of the effective minimum controllable risk on the use of the ALARA principle and on the evaluation of remedial action goals are presented.

  11. Mapping eastern equine encephalitis virus risk for white-tailed deer in Michigan

    PubMed Central

    Downs, Joni A.; Hyzer, Garrett; Marion, Eric; Smith, Zachary J.; Kelen, Patrick Vander; Unnasch, Thomas R.

    2015-01-01

    Eastern equine encephalitis (EEE) is a mosquito-borne viral disease that is often fatal to humans and horses. Some species including white-tailed deer and passerine birds can survive infection with the EEE virus (EEEV) and develop antibodies that can be detected using laboratory techniques. In this way, serum samples collected from free-ranging white-tailed deer can be used to monitor the presence of the virus in ecosystems. This study developed and tested a risk index model designed to predict EEEV activity in white-tailed deer in a three-county area of Michigan. The model evaluates EEEV risk on a continuous scale from 0.0 (no measurable risk) to 1.0 (highest possible risk). High-risk habitats are identified as those preferred by white-tailed deer that are also located in close proximity to an abundance of wetlands and lowland forests, which support disease vectors and hosts. The model was developed based on relevant literature and was tested with known locations of infected deer that showed neurological symptoms. The risk index model accurately predicted the known locations, with the mean value for those sites equal to the 94th percentile of values in the study area. The risk map produced by the model could be used to refine future EEEV monitoring efforts that use serum samples from free-ranging white-tailed deer to monitor viral activity. Alternatively, it could be used to focus educational efforts targeted toward deer hunters who may have elevated risks of infection. PMID:26494931

  12. A risk assessment of direct and indirect exposure to emissions from a proposed hazardous waste incinerator in Puerto Rico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallinger, K.; Huggins, A.; Warner, L.

    1995-12-31

    An Indirect Exposure Assessment (IEA) was conducted, under USEPA's RCRA Combustion Strategy, as part of the Part B permitting process for a proposed hazardous waste incinerator. The IEA involved identification of constituents of concern, emissions estimations, air dispersion and deposition modeling, evaluation of site-specific exposure pathways/scenarios, and food chain modeling in order to evaluate potential human health and environmental risks. The COMPDEP model was used to determine ambient ground level concentrations and dry and wet deposition rates of constituents of concern. The air modeling results were input into 50th percentile (Central) and 95th percentile (High-End) exposure scenarios which evaluated direct exposure via inhalation, dermal contact, and soil ingestion pathways, and indirect exposure through the food chain. The indirect pathway analysis considered the accumulation of constituents in plants and animals used as food sources by local inhabitants. Local food consumption data obtained from the Puerto Rico USDA were combined with realistic present-day and future-use exposure scenarios such as residential use, pineapple farming, and subsistence farming to obtain a comprehensive evaluation of risk. Overall risk was calculated using constituent doses and toxicity factors associated with the various routes of exposure. Risk values for each exposure pathway were summed to determine total carcinogenic and non-carcinogenic hazard to exposed individuals. A population risk assessment was also conducted in order to assess potential risks to the population surrounding the facility. Results of the assessment indicated no acute effects from constituents of concern, and a high-end excess lifetime cancer risk of approximately 6 in a million with dioxins (as 2,3,7,8-TCDD) and arsenic dominating the risk estimate.

  13. Net reclassification index at event rate: properties and relationships.

    PubMed

    Pencina, Michael J; Steyerberg, Ewout W; D'Agostino, Ralph B

    2017-12-10

    The net reclassification improvement (NRI) is an attractively simple summary measure quantifying the improvement in performance resulting from the addition of new risk marker(s) to a prediction model. Originally proposed for settings with well-established classification thresholds, it was quickly extended into applications with no thresholds in common use. Here we aim to explore properties of the NRI at event rate. We express this NRI as a difference in performance measures for the new versus old model and show that the quantity underlying this difference is related to several global as well as decision analytic measures of model performance. It maximizes the relative utility (standardized net benefit) across all classification thresholds and can be viewed as the Kolmogorov-Smirnov distance between the distributions of risk among events and non-events. It can be expressed as a special case of the continuous NRI, measuring reclassification from the 'null' model with no predictors. It is also a criterion based on the value of information and quantifies the reduction in expected regret for a given regret function, casting the NRI at event rate as a measure of incremental reduction in expected regret. More generally, we find it informative to present plots of standardized net benefit/relative utility for the new versus old model across the domain of classification thresholds. Then, these plots can be summarized with their maximum values, and the increment in model performance can be described by the NRI at event rate. We provide theoretical examples and a clinical application on the evaluation of prognostic biomarkers for atrial fibrillation. Copyright © 2016 John Wiley & Sons, Ltd.
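The abstract notes that the NRI at event rate can be viewed as the Kolmogorov-Smirnov distance between the distributions of predicted risk among events and non-events. That distance is just the maximum vertical gap between the two empirical CDFs; a generic sketch of its computation:

```python
import numpy as np

def ks_distance(risk_events, risk_nonevents):
    """Maximum vertical gap between the empirical CDFs of predicted risk
    among events and among non-events."""
    grid = np.sort(np.concatenate([risk_events, risk_nonevents]))
    F_e = np.searchsorted(np.sort(risk_events), grid, side="right") / len(risk_events)
    F_n = np.searchsorted(np.sort(risk_nonevents), grid, side="right") / len(risk_nonevents)
    return np.max(np.abs(F_e - F_n))
```

A model whose risks perfectly separate events from non-events attains the maximum distance of 1, while a model assigning identical risk distributions to both groups scores 0.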

  14. Effectiveness of a Risk Screener in Identifying Hepatitis C Virus in a Primary Care Setting

    PubMed Central

    Litwin, Alain H.; Smith, Bryce D.; Koppelman, Elisa A.; McKee, M. Diane; Christiansen, Cindy L.; Gifford, Allen L.; Weinbaum, Cindy M.; Southern, William N.

    2012-01-01

    Objectives. We evaluated an intervention designed to identify patients at risk for hepatitis C virus (HCV) through a risk screener used by primary care providers. Methods. A clinical reminder sticker prompted physicians at 3 urban clinics to screen patients for 12 risk factors and order HCV testing if any risks were present. Risk factor data were collected from the sticker; demographic and testing data were extracted from electronic medical records. We used the t test, χ2 test, and rank-sum test to compare patients who had and had not been screened and developed an analytic model to identify the incremental value of each element of the screener. Results. Among screened patients, 27.8% (n = 902) were identified as having at least 1 risk factor. Of screened patients with risk factors, 55.4% (n = 500) were tested for HCV. Our analysis showed that 7 elements (injection drug use, intranasal drug use, elevated alanine aminotransferase, transfusions before 1992, ≥ 20 lifetime sex partners, maternal HCV, existing liver disease) accounted for all HCV infections identified. Conclusions. A brief risk screener with a paper-based clinical reminder was effective in increasing HCV testing in a primary care setting. PMID:22994166

  15. Role of mathematical models in assessment of risk and in attempts to define management strategy.

    PubMed

    Flamm, W G; Winbush, J S

    1984-06-01

    Risk assessment of food-borne carcinogens is becoming a common practice at FDA. Actual risk is not being estimated, only the upper limit of risk. The risk assessment process involves a large number of steps and assumptions, many of which affect the numerical value estimated. The mathematical model which is to be applied is only one of the factors which affect these numerical values. To fulfill the policy objective of using the "worst plausible case" in estimating the upper limit of risk, recognition needs to be given to a proper balancing of assumptions and decisions. Interaction between risk assessors and risk managers should avoid making or giving the appearance of making specific technical decisions such as the choice of the mathematical model. The importance of this emerging field is too great to jeopardize it by inappropriately mixing scientific judgments with policy judgments. The risk manager should understand fully the points and range of uncertainty involved in arriving at the estimates of risk which must necessarily affect the choice of the policy or regulatory options available.

  16. Predicting Acute Exacerbations in Chronic Obstructive Pulmonary Disease.

    PubMed

    Samp, Jennifer C; Joo, Min J; Schumock, Glen T; Calip, Gregory S; Pickard, A Simon; Lee, Todd A

    2018-03-01

    With increasing health care costs that have outpaced those of other industries, payers of health care are moving from a fee-for-service payment model to one in which reimbursement is tied to outcomes. Chronic obstructive pulmonary disease (COPD) is a disease where this payment model has been implemented by some payers, and COPD exacerbations are a quality metric that is used. Under an outcomes-based payment model, it is important for health systems to be able to identify patients at risk for poor outcomes so that they can target interventions to improve outcomes. To develop and evaluate predictive models that could be used to identify patients at high risk for COPD exacerbations. This study was retrospective and observational and included COPD patients treated with a bronchodilator-based combination therapy. We used health insurance claims data to obtain demographics, enrollment information, comorbidities, medication use, and health care resource utilization for each patient over a 6-month baseline period. Exacerbations were examined over a 6-month outcome period and included inpatient (primary discharge diagnosis for COPD), outpatient, and emergency department (outpatient/emergency department visits with a COPD diagnosis plus an acute prescription for an antibiotic or corticosteroid within 5 days) exacerbations. The cohort was split into training (75%) and validation (25%) sets. Within the training cohort, stepwise logistic regression models were created to evaluate risk of exacerbations based on factors measured during the baseline period. Models were evaluated using sensitivity, specificity, and positive and negative predictive values. The base model included all confounding or effect modifier covariates. Several other models were explored using different sets of observations and variables to determine the best predictive model. There were 478,772 patients included in the analytic sample, of whom 40.5% had exacerbations during the outcome period. 
Patients with exacerbations had slightly more comorbidities, medication use, and health care resource utilization compared with patients without exacerbations. In the base model, sensitivity was 41.6% and specificity was 85.5%. Positive and negative predictive values were 66.2% and 68.2%, respectively. The other models that were evaluated yielded test characteristics similar to those of the base model. In this study, we were not able to predict COPD exacerbations with a high level of accuracy using health insurance claims data from COPD patients treated with bronchodilator-based combination therapy. Future studies should explore predictive models for exacerbations. No outside funding supported this study. Samp is now employed by, and owns stock in, AbbVie. The other authors have nothing to disclose. Study concept and design were contributed by Joo and Pickard, along with the other authors. Samp and Lee performed the data analysis, with assistance from the other authors. Samp wrote the manuscript, which was revised by Schumock and Calip, along with the other authors.
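The test characteristics reported above come straight from confusion-matrix counts. A minimal sketch with hypothetical counts (the study's underlying counts are not given in the abstract):

```python
def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # of true exacerbators, how many were flagged
        "specificity": tn / (tn + fp),  # of non-exacerbators, how many were cleared
        "ppv": tp / (tp + fp),          # of those flagged, how many exacerbated
        "npv": tn / (tn + fn),          # of those cleared, how many did not
    }

# Hypothetical counts for illustration only
metrics = classification_metrics(tp=40, fp=10, tn=90, fn=60)
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with the event rate in the cohort, which is one reason claims-based models are hard to transport between populations.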

  17. Cost-effectiveness and value of information analysis of nutritional support for preventing pressure ulcers in high-risk patients: implement now, research later.

    PubMed

    Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A

    2015-04-01

    Pressure ulcers are a major cause of mortality, morbidity, and increased healthcare costs. Nutritional support may reduce the incidence of pressure ulcers in hospitalised patients who are at risk of pressure ulcers and malnutrition. To evaluate the cost-effectiveness of nutritional support in preventing pressure ulcers in high-risk hospitalised patients, and to assess the value of further research to inform the decision to implement this intervention using value of information analysis (VOI). The analysis was from the perspective of Queensland Health, Australia, using a decision model with evidence derived from a systematic review and meta-analysis. Resources were valued using 2014 prices and the time horizon of the analysis was one year. Monte Carlo simulation was used to estimate net monetary benefits (NB) and to calculate VOI measures. Compared with standard hospital diet, nutritional support was cost saving at AU$425 per patient, and more effective with an average 0.005 quality-adjusted life years (QALY) gained. At a willingness-to-pay of AU$50,000 per QALY, the incremental NB was AU$675 per patient, with a probability of 87% that nutritional support is cost-effective. The expected value of perfect information was AU$5 million and the expected value of perfect parameter information was highest for the relative risk of developing a pressure ulcer at AU$2.5 million. For a future trial investigating the relative effectiveness of the interventions, the expected net benefit of research would be maximised at AU$100,000 with 1,200 patients in each arm if nutritional support was perfectly implemented. The opportunity cost of withholding the decision to implement the intervention until the results of the future study are available would be AU$14 million. Nutritional support is cost-effective in preventing pressure ulcers in high-risk hospitalised patients compared with standard diet. 
Future research to reduce decision uncertainty is worthwhile; however, given the opportunity losses associated with delaying the implementation, "implement and research" is the approach recommended for this intervention.
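The incremental net monetary benefit reported above follows directly from the standard formula INB = WTP × ΔQALY − ΔCost. Using the abstract's own figures (0.005 QALYs gained, AU$425 saved per patient, willingness-to-pay of AU$50,000 per QALY):

```python
def incremental_net_benefit(delta_qaly, delta_cost, wtp):
    """INB = WTP * dQALY - dCost; positive values favour the intervention."""
    return wtp * delta_qaly - delta_cost

# A cost saving of AU$425 enters as a negative incremental cost
inb = incremental_net_benefit(delta_qaly=0.005, delta_cost=-425.0, wtp=50_000)
# inb is approximately AU$675 per patient, matching the abstract
```

In the full analysis this calculation is repeated across Monte Carlo draws of the model parameters; the fraction of draws with positive INB gives the 87% probability of cost-effectiveness quoted above.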

  18. Literature Review on Modeling Cyber Networks and Evaluating Cyber Risks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Campbell, Philip L

    The National Infrastructure Simulations and Analysis Center (NISAC) conducted a literature review on modeling cyber networks and evaluating cyber risks. The literature review explores where modeling is used in the cyber regime and ways that consequence and risk are evaluated. The relevant literature clusters in three different spaces: network security, cyber-physical, and mission assurance. In all approaches, some form of modeling is utilized at varying levels of detail, while the ability to understand consequence varies, as do interpretations of risk. This document summarizes the different literature viewpoints and explores their applicability to securing enterprise networks.

  19. Coastal flooding impact evaluation using an INtegrated DisRuption Assessment (INDRA) model for Varna region, Western Black Sea

    NASA Astrophysics Data System (ADS)

    Andreeva, Nataliya; Eftimova, Petya; Valchev, Nikolay; Prodanov, Bogdan

    2017-04-01

    The study presents an evaluation and comparative analysis of storm-induced flooding impacts on different coastal receptors at the scale of the Varna region using the INtegrated DisRuption Assessment (INDRA) model. The model was developed within the FP7 RISC-KIT project, as part of the Coastal Risk Assessment Framework (CRAF), which consists of two phases. CRAF Phase 1 is a screening process that evaluates coastal risk at a regional scale by means of a coastal indices approach, which helps to identify potentially vulnerable coastal sectors: hot spots (HS). CRAF Phase 2 aims to assess and rank the identified hotspots through detailed risk analysis, jointly performing a hazard assessment and an impact evaluation on different categories (population, businesses, ecosystems, transport and utilities) using the INDRA model at a regional level. The model assesses the shock of events by estimating the impact on receptors of different vulnerability that are directly exposed to the flooding hazard, as well as the potential ripple effects during an event, in order to assess the "indirect" impacts, which occur outside the hazard area and/or continue after the event, for all considered categories. The potential impacts are expressed in terms of uniform "Impact Indicators", which independently score the indirect impacts of these categories, assessing disruption and recovery of the receptors. The ultimate hotspot ranking is obtained through the use of a Multi Criteria Analysis (MCA) incorporated in the model, considering the preferences of stakeholders. The case study area - the Varna regional coast - is located on the western Black Sea, Bulgaria. The coastline, with a length of about 70 km, stretches from cape Ekrene to cape St. Atanas and includes Varna Bay. After application of CRAF Phase 1, three hotspots were selected for further analysis: Kabakum beach (HS1), Varna Central beach plus Port wall (HS2) and Artificial Island (HS3). 
For the first two hotspots, beaches and associated infrastructure are the assets that attract holiday-makers and tourists in the summer season. For HS3, the exposed area is occupied by storage premises for industrial goods and oil/fuel tanks. Flooding hazard was assessed through the coupled use of XBeach 1D and LISFLOOD 2D inundation models at the selected hotspots. The "response" approach was adopted: 75 extreme storm events were simulated to obtain storm maxima series of overtopping discharges, flood depth, depth-velocity and berm retreat. The return periods selected for the extreme value analysis were 20, 50 and 100 years. For impact evaluation with the INDRA model, the categories "Population" and "Business" were considered. Impacts on Population were addressed by three impact indicators: "Risk to Life", "Household Displacement Time" and "Household Financial Recovery", while the Business category was addressed only by "Business Financial Recovery". Hotspot ranking was done using MCA by weighting the evaluated indicators, with one scheme focused on Risk to Life (F1) and one on Business Financial Recovery (F2). MCA scoring focused on household displacement/recovery was not evaluated because modelling results revealed quite a low number of flooded household receptors. Results show that for both F1 and F2, and for all considered return periods, HS2 has the highest scores, which makes it the final hotspot.
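The MCA ranking step is, at its core, a weighted sum of normalised impact indicators under a chosen stakeholder weighting. A sketch with hypothetical indicator values and weights (the study's actual scores and weights are not given in the abstract):

```python
def mca_score(indicators, weights):
    """Weighted-sum multi-criteria score over identically keyed dicts."""
    return sum(weights[k] * indicators[k] for k in weights)

# Hypothetical normalised impact indicators per hotspot (illustration only)
hotspots = {
    "HS1": {"risk_to_life": 0.3, "business_recovery": 0.4},
    "HS2": {"risk_to_life": 0.8, "business_recovery": 0.9},
    "HS3": {"risk_to_life": 0.5, "business_recovery": 0.7},
}
w_f1 = {"risk_to_life": 0.7, "business_recovery": 0.3}  # F1: Risk-to-Life focus
ranking = sorted(hotspots, key=lambda h: mca_score(hotspots[h], w_f1), reverse=True)
```

Changing the weight vector (e.g. to an F2 scheme emphasising business recovery) can reorder the hotspots, which is why the study reports rankings under both stakeholder focuses.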

  20. Validation of risk stratification for children with febrile neutropenia in a pediatric oncology unit in India.

    PubMed

    Das, Anirban; Trehan, Amita; Oberoi, Sapna; Bansal, Deepak

    2017-06-01

    The study aims to validate a score predicting the risk of complications in pediatric patients with chemotherapy-related febrile neutropenia (FN) and to evaluate the performance of previously published models for risk stratification. Children diagnosed with cancer and presenting with FN were evaluated in a prospective single-center study. A score predicting the risk of complications, previously derived in the unit, was validated on a prospective cohort. The performance of six predictive models published from geographically distinct settings was assessed on the same cohort. Complications were observed in 109 (26.3%) of 414 episodes of FN over 15 months. A risk score based on undernutrition (two points), time from last chemotherapy (<7 days = two points), presence of a nonupper respiratory focus of infection (two points), C-reactive protein (>60 mg/l = five points), and absolute neutrophil count (<100 per μl = two points) was used to stratify patients into a "low risk" group (score <7, n = 208). The score was assessed using the following parameters: overall performance (Nagelkerke R² = 34.4%), calibration (calibration slope = 0.39; P = 0.25 in the Hosmer-Lemeshow test), discrimination (c-statistic = 0.81), overall sensitivity (86%), negative predictive value (93%), and clinical net benefit (0.43). The six previously published rules demonstrated inferior performance in this cohort. An indigenous decision rule using five simple predefined variables was successful in identifying children at risk for complications. Prediction models derived in developed nations may not be appropriate for low-middle-income settings and need to be validated before use. © 2016 Wiley Periodicals, Inc.
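The abstract gives the score's point values explicitly, so the rule can be written down directly; the exact operationalisation of each input below (argument names, strict inequalities at the stated cut-offs) is a sketch rather than the study's validated implementation:

```python
def fn_risk_score(undernourished, days_since_chemo, nonupper_focus, crp_mg_l, anc_per_ul):
    """Febrile-neutropenia risk score as described in the abstract.
    Scores below 7 classify the episode as "low risk"."""
    score = 0
    score += 2 if undernourished else 0
    score += 2 if days_since_chemo < 7 else 0
    score += 2 if nonupper_focus else 0   # non-upper-respiratory focus of infection
    score += 5 if crp_mg_l > 60 else 0
    score += 2 if anc_per_ul < 100 else 0
    return score
```

With all five factors present the score reaches 13; an episode is labelled low risk when `fn_risk_score(...) < 7`.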

  1. Characterizing Decision-Analysis Performances of Risk Prediction Models Using ADAPT Curves.

    PubMed

    Lee, Wen-Chung; Wu, Yun-Chun

    2016-01-01

The area under the receiver operating characteristic curve is a widely used index to characterize the performance of diagnostic tests and prediction models. However, the index does not explicitly acknowledge the utilities of risk predictions. Moreover, for most clinical settings, what counts is whether a prediction model can guide therapeutic decisions in a way that improves patient outcomes, rather than simply update probabilities. Based on decision theory, the authors propose an alternative index, the "average deviation about the probability threshold" (ADAPT). An ADAPT curve (a plot of ADAPT value against the probability threshold) neatly characterizes the decision-analysis performance of a risk prediction model. Several prediction models can be compared for their ADAPT values at a chosen probability threshold, for a range of plausible threshold values, or for their whole ADAPT curves. This should greatly facilitate the selection of diagnostic tests and prediction models.

  2. Predicting drug-induced liver injury in human with Naïve Bayes classifier approach.

    PubMed

    Zhang, Hui; Ding, Lan; Zou, Yi; Hu, Shui-Qing; Huang, Hai-Guo; Kong, Wei-Bao; Zhang, Ji

    2016-10-01

Drug-induced liver injury (DILI) is one of the major safety concerns in drug development. Although various toxicological studies assessing DILI risk have been developed, these methods were not sufficient for predicting DILI in humans. Thus, developing new tools and approaches to better predict DILI risk in humans has become an important and urgent task. In this study, we aimed to develop a computational model for assessment of DILI risk using a larger-scale human dataset and a Naïve Bayes classifier. The established Naïve Bayes prediction model was evaluated by 5-fold cross validation and an external test set. For the training set, the overall prediction accuracy of the 5-fold cross validation was 94.0%. The sensitivity, specificity, positive predictive value and negative predictive value were 97.1%, 89.2%, 93.5% and 95.1%, respectively. The test set showed a concordance of 72.6%, sensitivity of 72.5%, specificity of 72.7%, positive predictive value of 80.4%, and negative predictive value of 63.2%. Furthermore, some important molecular descriptors related to DILI risk and some toxic/non-toxic fragments were identified. Thus, we hope the prediction model established here can be employed for the assessment of human DILI risk, and that the obtained molecular descriptors and substructures will be taken into consideration in the design of new candidate compounds, to help medicinal chemists rationally select the chemicals with the best prospects to be effective and safe.
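As an illustration of the underlying classifier only (not the authors' descriptor set, data, or implementation), a minimal Bernoulli Naïve Bayes over binary structural fingerprints can be hand-rolled as below; the toy fingerprints and labels are invented.

```python
import math

def train_bernoulli_nb(X, y, alpha=1.0):
    """Fit class priors and per-feature Bernoulli probabilities (Laplace smoothing)."""
    n, d = len(X), len(X[0])
    model = {}
    for c in sorted(set(y)):
        Xc = [x for x, yc in zip(X, y) if yc == c]
        prior = len(Xc) / n
        cond = [(sum(x[j] for x in Xc) + alpha) / (len(Xc) + 2 * alpha)
                for j in range(d)]
        model[c] = (prior, cond)
    return model

def predict_nb(model, x):
    """Return the class with the highest log posterior for fingerprint x."""
    best, best_lp = None, -math.inf
    for c, (prior, cond) in model.items():
        lp = math.log(prior)
        for p, xj in zip(cond, x):
            lp += math.log(p if xj else 1.0 - p)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Toy binary "fingerprints": feature 0 marks a hypothetical toxic fragment,
# feature 1 a hypothetical non-toxic one. Labels: 1 = DILI-positive.
X = [[1, 0], [1, 0], [1, 1], [0, 1], [0, 1], [0, 0]]
y = [1, 1, 1, 0, 0, 0]
model = train_bernoulli_nb(X, y)
```

Wrapping this in a 5-fold split and tallying true/false positives and negatives would yield the sensitivity, specificity, PPV and NPV figures the abstract reports.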

  3. Hydrogeochemical assessment of mine-impacted water and sediment of iron ore mining

    NASA Astrophysics Data System (ADS)

    Nur Atirah Affandi, Fatin; Kusin, Faradiella Mohd; Aqilah Sulong, Nur; Madzin, Zafira

    2018-04-01

This study was carried out to evaluate the hydrogeochemical behaviour of mine-impacted water and sediment of a former iron ore mining area. Sampling of mine water and sediment was carried out at selected locations within the mine, including the former mining ponds, the mine tailings and the nearby stream. The water samples were analysed for their hydrochemical facies and major and trace elements, including heavy metals. The water in the mining ponds and the mine tailings was characterised as highly acidic (pH 2.54-3.07), whereas the nearby stream had near-neutral pH. Results indicated that Fe and Mn in water exceeded the recommended guideline values, a finding also supported by the results of geochemical modelling. The results also indicated that sediments in the mining area were contaminated with Cd and As, as shown by the potential ecological risk index values. The total risk index of heavy metals in the sediment was ranked in the order Cd>As>Pb>Cu>Zn>Cr. Overall, the potential ecological risk of the mining area was categorised as low to moderate.

  4. Impact of Atmospheric Aerosols on Solar Photovoltaic Electricity Generation in China

    NASA Astrophysics Data System (ADS)

    Li, X.; Mauzerall, D. L.; Wagner, F.; Peng, W.; Yang, J.

    2016-12-01

    Hurricanes have induced devastating storm surge flooding worldwide. The impacts of these storms may worsen in the coming decades because of rapid coastal development coupled with sea-level rise and possibly increasing storm activity due to climate change. Major advances in coastal flood risk management are urgently needed. We present an integrated dynamic risk analysis for flooding task (iDraft) framework to assess and manage coastal flood risk at the city or regional scale, considering integrated dynamic effects of storm climatology change, sea-level rise, and coastal development. We apply the framework to New York City. First, we combine climate-model projected storm surge climatology and sea-level rise with engineering- and social/economic-model projected coastal exposure and vulnerability to estimate the flood damage risk for the city over the 21st century. We derive temporally-varying risk measures such as the annual expected damage as well as temporally-integrated measures such as the present value of future losses. We also examine the individual and joint contributions to the changing risk of the three dynamic factors (i.e., sea-level rise, storm change, and coastal development). Then, we perform probabilistic cost-benefit analysis for various coastal flood risk mitigation strategies for the city. Specifically, we evaluate previously proposed mitigation measures, including elevating houses on the floodplain and constructing flood barriers at the coast, by comparing their estimated cost and probability distribution of the benefit (i.e., present value of avoided future losses). We also propose new design strategies, including optimal design (e.g., optimal house elevation) and adaptive design (e.g., flood protection levels that are designed to be modified over time in a dynamic and uncertain environment).
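The temporally-integrated risk measure mentioned above, the present value of future losses, reduces to discounting a stream of expected annual damages. A minimal sketch, assuming a constant discount rate (the numbers in the test are illustrative, not projections from the study):

```python
def present_value_of_losses(expected_annual_damage, discount_rate):
    """Discount a stream of expected annual damages (year 1..T) to present value."""
    return sum(d / (1 + discount_rate) ** t
               for t, d in enumerate(expected_annual_damage, start=1))

def net_benefit(losses_baseline, losses_mitigated, mitigation_cost, discount_rate):
    """Benefit of a mitigation strategy = present value of avoided losses - cost."""
    avoided = (present_value_of_losses(losses_baseline, discount_rate)
               - present_value_of_losses(losses_mitigated, discount_rate))
    return avoided - mitigation_cost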

  5. Association of Toll-Like Receptor 4 Polymorphisms with Diabetic Foot Ulcers and Application of Artificial Neural Network in DFU Risk Assessment in Type 2 Diabetes Patients

    PubMed Central

    Singh, Kanhaiya; Agrawal, Neeraj K.; Gupta, Sanjeev K.

    2013-01-01

Toll-like receptor 4 (TLR4) plays an important role in immunity, tissue repair, and regeneration. The objective of the present work was to evaluate the association of the TLR4 single nucleotide polymorphisms (SNPs) rs4986790, rs4986791, rs11536858 (merged into rs10759931), rs1927911, and rs1927914 with increased diabetic foot ulcer (DFU) risk in patients with type 2 diabetes mellitus (T2DM). PCR-RFLP was used for genotyping the TLR4 SNPs in 125 T2DM patients with DFU and 130 controls. The haplotypes and linkage disequilibrium between the SNPs were determined using Haploview software. Multivariate linear regression (MLR) and artificial neural network (ANN) modeling were performed to assess their predictability for the risk of DFU in T2DM patients. Risk genotypes of all SNPs except rs1927914 were significantly associated with DFU. Haplotype ACATC (P value = 9.3E-5) showed a strong association with DFU risk. Two haplotypes, ATATC (P value = 0.0119) and ATGTT (P value = 0.0087), were found to be protective against DFU. In conclusion, TLR4 SNPs and their haplotypes may increase the risk of impairment of wound healing in T2DM patients. The ANN model (83% accuracy) performed better than the MLR model (76%) and can be used as a tool for DFU risk assessment in T2DM patients. PMID:23936790

  6. Anaemia to predict outcome in patients with acute coronary syndromes.

    PubMed

    Ennezat, Pierre Vladimir; Maréchaux, Sylvestre; Pinçon, Claire; Finzi, Jonathan; Barrailler, Stéphanie; Bouabdallaoui, Nadia; Van Belle, Eric; Montalescot, Gilles; Collet, Jean-Philippe

    2013-01-01

Owing to the heterogeneous population of patients with acute coronary syndromes (ACS), risk stratification with tools such as the GRACE risk score is recommended to guide therapeutic management and improve outcome. We evaluated whether anaemia refines the value of the GRACE risk model for predicting midterm outcome after an ACS. A prospective registry of 1064 ACS patients (63 ± 14 years; 73% men; 57% ST-segment elevation myocardial infarction [MI]) was studied. Anaemia was defined as haemoglobin less than 13 g/dL in men or less than 12 g/dL in women. The primary endpoint was 6-month death or rehospitalization for MI. The primary endpoint was reached in 132 patients, including 68 deaths. Anaemia was associated with adverse clinical outcomes (hazard ratio 3.008, 95% confidence interval 2.137-4.234; P<0.0001) in univariate analysis and remained independently associated with outcome after adjustment for the Global Registry of Acute Coronary Events (GRACE) risk score (hazard ratio 2.870, 95% confidence interval 1.815-4.538; P<0.0001). Anaemia provided additional prognostic information to the GRACE score, as demonstrated by a systematic improvement in global model fit and discrimination (c-statistic increasing from 0.633 [0.571;0.696] to 0.697 [0.638;0.755]). Subsequently, adding anaemia to the GRACE score led to reclassification of 595 patients into different risk categories: 16.5% of patients at low risk (≤ 5% risk of death or rehospitalization for MI) were upgraded to intermediate (>5-10%) or high risk (>10%); 79.5% of patients at intermediate risk were reclassified as low (55%) or high risk (24%); and 45.5% of patients at high risk were downgraded to intermediate risk. Overall, 174 patients were reclassified into a higher risk category (17.3%) and 421 into a lower risk category (41.9%). Anaemia provides independent additional prognostic information to the GRACE score. Combining anaemia with the GRACE score refines the predictive value of the GRACE score, which otherwise often overestimates risk.
Copyright © 2013. Published by Elsevier Masson SAS.

  7. Temporal and geographical external validation study and extension of the Mayo Clinic prediction model to predict eGFR in the younger population of Swiss ADPKD patients.

    PubMed

    Girardat-Rotar, Laura; Braun, Julia; Puhan, Milo A; Abraham, Alison G; Serra, Andreas L

    2017-07-17

Prediction models in autosomal dominant polycystic kidney disease (ADPKD) are useful in clinical settings to identify patients at greater risk of rapid disease progression, in whom a treatment may have more benefits than harms. Mayo Clinic investigators developed a risk prediction tool for ADPKD patients using a single kidney volume measurement. Our aim was to perform an independent geographical and temporal external validation, as well as to evaluate the potential for improving the predictive performance by including additional information on total kidney volume. We used data from the ongoing Swiss ADPKD study from 2006 to 2016. The main analysis included a sample of 214 patients with typical ADPKD (Class 1). We evaluated the calibration and discrimination performance of the Mayo Clinic model in our external sample and assessed whether predictive performance could be improved through the addition of subsequent kidney volume measurements beyond the baseline assessment. The calibration of both versions of the Mayo Clinic prediction model, using continuous height-adjusted total kidney volume (HtTKV) and using risk subclasses, was good, with R² of 78% and 70%, respectively. Accuracy was also good, with 91.5% and 88.7% of predicted values within 30% of the observed values, respectively. Additional information regarding kidney volume did not substantially improve the model performance. The Mayo Clinic prediction models are generalizable to other clinical settings and provide an accurate tool, based on available predictors, to identify patients at high risk of rapid disease progression.

  8. Multifractality and value-at-risk forecasting of exchange rates

    NASA Astrophysics Data System (ADS)

    Batten, Jonathan A.; Kinateder, Harald; Wagner, Niklas

    2014-05-01

    This paper addresses market risk prediction for high frequency foreign exchange rates under nonlinear risk scaling behaviour. We use a modified version of the multifractal model of asset returns (MMAR) where trading time is represented by the series of volume ticks. Our dataset consists of 138,418 5-min round-the-clock observations of EUR/USD spot quotes and trading ticks during the period January 5, 2006 to December 31, 2007. Considering fat-tails, long-range dependence as well as scale inconsistency with the MMAR, we derive out-of-sample value-at-risk (VaR) forecasts and compare our approach to historical simulation as well as a benchmark GARCH(1,1) location-scale VaR model. Our findings underline that the multifractal properties in EUR/USD returns in fact have notable risk management implications. The MMAR approach is a parsimonious model which produces admissible VaR forecasts at the 12-h forecast horizon. For the daily horizon, the MMAR outperforms both alternatives based on conditional as well as unconditional coverage statistics.
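Coverage-based VaR backtesting of the kind used to compare these models is straightforward to sketch. Below is a generic historical-simulation VaR and the Kupiec unconditional-coverage likelihood ratio (asymptotically χ²(1) under correct coverage); this is an illustration of the evaluation machinery, not the paper's MMAR implementation.

```python
import math

def historical_var(returns, alpha=0.99):
    """One-period VaR as the empirical alpha-quantile of the loss distribution."""
    losses = sorted(-r for r in returns)
    k = math.ceil(alpha * len(losses)) - 1
    return losses[k]

def kupiec_lr(n_obs, n_exceptions, alpha=0.99):
    """Kupiec unconditional-coverage LR statistic; ~chi2(1) under H0."""
    p = 1.0 - alpha          # expected exception rate
    x, n = n_exceptions, n_obs
    if x == 0:
        return -2.0 * n * math.log(1.0 - p)
    phat = x / n             # observed exception rate
    ll0 = x * math.log(p) + (n - x) * math.log(1.0 - p)
    ll1 = x * math.log(phat) + (n - x) * math.log(1.0 - phat)
    return -2.0 * (ll0 - ll1)
```

With 1000 forecasts at the 99% level, 10 exceptions match the nominal rate exactly (LR ≈ 0), while 25 exceptions would reject correct coverage at the 5% χ²(1) critical value of 3.84.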

  9. Utility of Risk Models in Decision Making After Radical Prostatectomy: Lessons from a Natural History Cohort of Intermediate- and High-Risk Men.

    PubMed

    Ross, Ashley E; Yousefi, Kasra; Davicioni, Elai; Ghadessi, Mercedeh; Johnson, Michael H; Sundi, Debasish; Tosoian, Jeffery J; Han, Misop; Humphreys, Elizabeth B; Partin, Alan W; Walsh, Patrick C; Trock, Bruce J; Schaeffer, Edward M

    2016-03-01

    Current guidelines suggest adjuvant radiation therapy for men with adverse pathologic features (APFs) at radical prostatectomy (RP). We examine at-risk men treated only with RP until the time of metastasis. To evaluate whether clinicopathologic risk models can help guide postoperative therapeutic decision making. Men with National Comprehensive Cancer Network intermediate- or high-risk localized prostate cancer undergoing RP in the prostate-specific antigen (PSA) era were identified (n=3089). Only men with initial undetectable PSA after surgery and who received no therapy prior to metastasis were included. APFs were defined as pT3 disease or positive surgical margins. Area under the receiver operating characteristic curve (AUC) for time to event data was used to measure the discrimination performance of the risk factors. Cumulative incidence curves were constructed using Fine and Gray competing risks analysis to estimate the risk of biochemical recurrence (BCR) or metastasis, taking censoring and death due to other causes into consideration. Overall, 43% of the cohort (n=1327) had APFs at RP. Median follow-up for censored patients was 5 yr. Cumulative incidence of metastasis was 6% at 10 yr after RP for all patients. Cumulative incidence of metastasis among men with APFs was 7.5% at 10 yr after RP. Among men with BCR, the incidence of metastasis was 38% 5 yr after BCR. At 10 yr after RP, time-dependent AUC for predicting metastasis by Cancer of the Prostate Risk Assessment Postsurgical or Eggener risk models was 0.81 (95% confidence interval [CI], 0.72-0.97) and 0.78 (95% CI, 0.67-0.97) in the APF population, respectively. At 5 yr after BCR, these values were lower (0.58 [95% CI, 0.50-0.66] and 0.70 [95% CI, 0.63-0.76]) among those who developed BCR. 
Use of risk model cut points could substantially reduce overtreatment while minimally increasing undertreatment (ie, use of an Eggener cut point of 2.5% for treatment of men with APFs would spare 46% from treatment while only allowing for metastatic events in 1% at 10 yr after RP). Use of risk models reduces overtreatment and should be a routine part of patient counseling when considering adjuvant therapy. Risk model performance is significantly reduced among men with BCR. Use of current risk models can help guide decision making regarding therapy after surgery and reduce overtreatment. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  10. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    PubMed

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. 
Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.

  11. Preoperative and postoperative serial assessments of postural balance and fall risk in patients with arthroscopic anterior cruciate ligament reconstruction.

    PubMed

    Gokalp, Oguzhan; Akkaya, Semih; Akkaya, Nuray; Buker, Nihal; Gungor, Harun R; Ok, Nusret; Yorukoglu, Cagdas

    2016-04-27

Impaired postural balance due to somatosensory data loss with mechanical instability has been shown in patients with ACL deficiency. The aim was to assess postural balance in patients with ACL insufficiency prior to surgery and following reconstruction, with serial evaluations. Thirty patients (mean age 27.7 ± 6.7 years) who underwent arthroscopic reconstruction of the ACL with bone-patellar tendon-bone autograft were examined for clinical and functional variables on the preoperative day and at the 12th postoperative week. Posturographic analyses were performed using the Tetrax Interactive Balance System (Sunlight Medical Ltd, Israel) on the preoperative day and at the 4th, 8th, and 12th weeks following reconstruction. From the oscillation velocities of body sways, the posturographic software computes fall risk as a numeric value (0-100; lower values indicate better balance). All of the patients had significant improvements in clinical and functional evaluations and in fall risk (p< 0.05). Mean fall risk was within the high-risk category (59.9 ± 22.8) preoperatively. The highest fall risk was detected at the postoperative 4th week. Patients had high fall risk at the 8th week, similar to the preoperative value. Mean fall risk decreased to low-level risk at the 12th week. Preoperative symptom duration was related to preoperative fall risk and to the postoperative improvement of fall risk (p= 0.001, r= -0.632; p= 0.001, r= -0.870, respectively). The improvement of fall risk was greater in patients with symptoms shorter than 6 months (p= 0.001). According to these results, mean fall risk of patients with ACL insufficiency was within the high-risk category preoperatively, and fall risk improves after surgical reconstruction; but as the duration of complaints lengthens, especially beyond 6 months, the improvement of fall risk following reconstruction decreases.

  12. A TCP model for external beam treatment of intermediate-risk prostate cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walsh, Sean; Putten, Wil van der

    2013-03-15

Purpose: Biological models offer the ability to predict clinical outcomes. The authors describe a model to predict the clinical response of intermediate-risk prostate cancer to external beam radiotherapy for a variety of fractionation regimes. Methods: A fully heterogeneous population-averaged tumor control probability model was fit to clinical outcome data for hyper-, standard, and hypofractionated treatments. The tumor control probability model was then employed to predict the clinical outcome of extreme hypofractionation regimes, as utilized in stereotactic body radiotherapy. Results: The tumor control probability model achieves an excellent level of fit, an R² value of 0.93 and a root mean squared error of 1.31%, to the clinical outcome data for hyper-, standard, and hypofractionated treatments using realistic values for biological input parameters. Residuals ≤ 1.0% are produced by the tumor control probability model when compared to clinical outcome data for stereotactic body radiotherapy. Conclusions: The authors conclude that this tumor control probability model, used with the optimized radiosensitivity values obtained from the fit, is an appropriate mechanistic model for the analysis and evaluation of external beam RT plans with regard to tumor control for these clinical conditions.
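The abstract does not give the model's equations. For context, a basic homogeneous Poisson TCP with linear-quadratic cell kill, a much simpler relative of the fully heterogeneous population-averaged model described, can be sketched as follows; all parameter values in the test are invented, not the authors' fitted radiosensitivities.

```python
import math

def tcp_poisson(n0, alpha, beta, dose_per_fraction, n_fractions):
    """Poisson TCP with linear-quadratic cell kill (homogeneous sketch).

    n0: initial clonogen number; alpha, beta: LQ radiosensitivities (Gy^-1, Gy^-2);
    surviving fraction per fraction of dose d is exp(-(alpha*d + beta*d^2)).
    """
    d, n = dose_per_fraction, n_fractions
    expected_survivors = n0 * math.exp(-n * (alpha * d + beta * d * d))
    return math.exp(-expected_survivors)   # P(zero surviving clonogens)
```

The population-averaged version in the paper would additionally integrate this expression over distributions of the biological input parameters across patients.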

  13. Risk Score for Detecting Dysglycemia: A Cross-Sectional Study of a Working-Age Population in an Oil Field in China.

    PubMed

    Tian, Xiubiao; Liu, Yan; Han, Ying; Shi, Jieli; Zhu, Tiehong

    2017-06-11

    BACKGROUND Dysglycemia (pre-diabetes or diabetes) in young adults has increased rapidly. However, the risk scores for detecting dysglycemia in oil field staff and workers in China are limited. This study developed a risk score for the early identification of dysglycemia based on epidemiological and health examination data in an oil field working-age population with increased risk of diabetes. MATERIAL AND METHODS Multivariable logistic regression was used to develop the risk score model in a population-based, cross-sectional study. All subjects completed the questionnaires and underwent physical examination and oral glucose tolerance tests. The performance of the risk score models was evaluated using the area under the receiver operating characteristic curve (AUC). RESULTS The study population consisted of 1995 participants, 20-64 years old (49.4% males), with undiagnosed diabetes or pre-diabetes who underwent periodic health examinations from March 2014 to June 2015 in Dagang oil field, Tianjin, China. Age, sex, body mass index, history of high blood glucose, smoking, triglyceride, and fasting plasma glucose (FPG) constituted the Dagang dysglycemia risk score (Dagang DRS) model. The performance of Dagang DRS was superior to m-FINDRISC (AUC: 0.791; 95% confidence interval (CI), 0.773-0.809 vs. 0.633; 95% CI, 0.611-0.654). At the cut-off value of 5.6 mmol/L, the Dagang DRS (AUC: 0.616; 95% CI, 0.592-0.641) was better than the FPG value alone (AUC: 0.571; 95% CI, 0.546-0.596) in participants with FPG <6.1 mmol/L (n=1545, P=0.028). CONCLUSIONS Dagang DRS is a valuable tool for detecting dysglycemia, especially when FPG <6.1 mmol/L, in oil field workers in China.
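The AUC comparisons reported above rest on the rank interpretation of the AUC: the probability that a randomly chosen case scores higher than a randomly chosen non-case. A minimal sketch of that statistic (the score lists in the test are illustrative only):

```python
def auc_mann_whitney(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic; ties count as half a win."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 means the score cannot separate the groups; the Dagang DRS's 0.791 versus 0.633 for m-FINDRISC is a difference on exactly this scale.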

  14. Combining radar and direct observation to estimate pelican collision risk at a proposed wind farm on the Cape west coast, South Africa

    PubMed Central

    Reid, Tim; du Plessis, Johan; Colyn, Robin; Benn, Grant; Millikin, Rhonda

    2018-01-01

Pre-construction assessments of bird collision risk at proposed wind farms are often confounded by insufficient or poor quality data describing avian flight paths through the development area. These limitations can compromise the practical value of wind farm impact studies. We used radar- and observer-based methods to quantify great white pelican flights in the vicinity of a planned wind farm on the Cape west coast, South Africa, and modelled turbine collision risk under various scenarios. Model outputs were combined with pre-existing demographic data to evaluate the possible influence of the wind farm on the pelican population, and to examine impact mitigation options. We recorded high volumes of great white pelican movement through the wind farm area, coincident with the breeding cycle of the nearby colony and associated with flights to feeding areas located about 50 km away. Pelicans were exposed to collision risk at a mean rate of 2.02 High Risk flights·h⁻¹. Risk was confined to daylight hours, highest during the middle of the day and in conditions of strong north-westerly winds, and 82% of High Risk flights were focused on only five of the proposed 35 turbine placements. Predicted mean mortality rates (22 fatalities·yr⁻¹, 95% CI, 16–29, with average bird and blade speeds and 95% avoidance rates) were not sustainable, resulting in a negative population growth rate (λ = 0.991). Models suggested that removal of the five highest-risk turbines from the project, or institution of a curtailment regimen that shuts down at least these turbines at peak traffic times, could theoretically reduce impacts to manageable levels. However, in spite of the large quantities of high quality data used in our analyses, our collision risk model remains compromised by untested assumptions about pelican avoidance rates and uncertainties about the existing dynamics of the pelican population, and our findings are probably not reliable enough to ensure sustainable development.
PMID:29408877

  15. Combining radar and direct observation to estimate pelican collision risk at a proposed wind farm on the Cape west coast, South Africa.

    PubMed

    Jenkins, Andrew R; Reid, Tim; du Plessis, Johan; Colyn, Robin; Benn, Grant; Millikin, Rhonda

    2018-01-01

Pre-construction assessments of bird collision risk at proposed wind farms are often confounded by insufficient or poor quality data describing avian flight paths through the development area. These limitations can compromise the practical value of wind farm impact studies. We used radar- and observer-based methods to quantify great white pelican flights in the vicinity of a planned wind farm on the Cape west coast, South Africa, and modelled turbine collision risk under various scenarios. Model outputs were combined with pre-existing demographic data to evaluate the possible influence of the wind farm on the pelican population, and to examine impact mitigation options. We recorded high volumes of great white pelican movement through the wind farm area, coincident with the breeding cycle of the nearby colony and associated with flights to feeding areas located about 50 km away. Pelicans were exposed to collision risk at a mean rate of 2.02 High Risk flights·h⁻¹. Risk was confined to daylight hours, highest during the middle of the day and in conditions of strong north-westerly winds, and 82% of High Risk flights were focused on only five of the proposed 35 turbine placements. Predicted mean mortality rates (22 fatalities·yr⁻¹, 95% CI, 16-29, with average bird and blade speeds and 95% avoidance rates) were not sustainable, resulting in a negative population growth rate (λ = 0.991). Models suggested that removal of the five highest-risk turbines from the project, or institution of a curtailment regimen that shuts down at least these turbines at peak traffic times, could theoretically reduce impacts to manageable levels. However, in spite of the large quantities of high quality data used in our analyses, our collision risk model remains compromised by untested assumptions about pelican avoidance rates and uncertainties about the existing dynamics of the pelican population, and our findings are probably not reliable enough to ensure sustainable development.

  16. Cognitive Function and Vascular Risk Factors Among Older African American Adults

    PubMed Central

    Park, Moon Ho; Tsang, Siny; Sperling, Scott A.; Manning, Carol

    2017-01-01

To evaluate the association between vascular risk factors and cognitive impairment among older African American (AA) adults in a primary care clinic. Participants included 96 AA adults aged 60 years or older who were evaluated for global and domain-specific cognition. Participants were interviewed using the Computerized Assessment of Memory and Cognitive Impairment (CAMCI). The relationship between CAMCI cognitive domain scores and vascular risk factors was examined using hierarchical regression models. Patients who smoked and those with higher SBP/DBP values had lower accuracy rates on CAMCI cognitive domains (attention, executive, memory). Those with higher BMI had better attention scores. Patients with higher HbA1C values had worse verbal memory. Patients with higher blood pressure were significantly faster in responding to tasks in the executive domain. Primary care providers working with older AA adults with these vascular risk factors could implement cognitive screening earlier in their practice to reduce barriers to seeking treatment. PMID:28417319

  17. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

Investment appraisal for investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed as a risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
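The risk-adjusted NPV idea can be sketched by folding the monetary value of the annual expected-loss reduction into an ordinary discounted cash-flow calculation. This is a generic illustration under my own parameterisation, not the paper's Bayesian decision-theoretic formulation.

```python
def risk_adjusted_npv(investment_cost, annual_saving, annual_risk_reduction,
                      discount_rate, years):
    """NPV where each year's benefit = ordinary savings + monetised reduction
    in expected fire losses attributable to the safety system."""
    benefit = annual_saving + annual_risk_reduction
    present_value = sum(benefit / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    return present_value - investment_cost
```

A positive result means the expected risk reduction (plus any other savings) justifies the investment at the chosen discount rate and horizon.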

  18. Approximation of the ruin probability using the scaled Laplace transform inversion

    PubMed Central

    Mnatsakanov, Robert M.; Sarkisian, Khachatur; Hakobyan, Artak

    2015-01-01

The problem of recovering the ruin probability in the classical risk model based on the scaled Laplace transform inversion is studied. It is shown how to overcome the problem of evaluating the ruin probability at large values of the initial surplus. Comparisons of the proposed approximations with those based on Laplace transform inversion using a fixed Talbot algorithm, as well as with those using the Trefethen-Weideman-Schmelzer and maximum entropy methods, are presented via a simulation study. PMID:26752796
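In the classical (Cramér-Lundberg) risk model, the ruin probability has an exact closed form when claim sizes are exponential, which is the standard benchmark against which inversion-based approximations like those studied here are checked. A sketch, using my own parameterisation (claim sizes Exp(β), premium loading θ), not the paper's notation:

```python
import math

def ruin_prob_exponential(u, beta, theta):
    """Exact ruin probability psi(u) in the classical risk model with
    Exp(beta) claim sizes and relative premium loading theta > 0:
        psi(u) = (1 / (1 + theta)) * exp(-(theta / (1 + theta)) * beta * u)
    """
    return (1.0 / (1.0 + theta)) * math.exp(-(theta / (1.0 + theta)) * beta * u)
```

At zero initial surplus the ruin probability is 1/(1+θ) regardless of β, and it decays exponentially in u, which is exactly why direct numerical evaluation at large surplus values is delicate and motivates the scaled-inversion approach.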

  19. Evaluation of the nutrition screening tool for childhood cancer (SCAN).

    PubMed

    Murphy, Alexia J; White, Melinda; Viani, Karina; Mosby, Terezie T

    2016-02-01

Malnutrition is a serious concern for children with cancer, and nutrition screening may offer a simple alternative to nutrition assessment for identifying children with cancer who are at risk of malnutrition. The present paper aimed to evaluate the nutrition screening tool for childhood cancer (SCAN). SCAN was developed after an extensive review of currently available tools and published screening recommendations, consideration of pediatric oncology nutrition guidelines, piloting of questions, and consultation with members of the International Pediatric Oncology Nutrition Group. In Study 1, the accuracy and validity of SCAN against the pediatric subjective global nutrition assessment (pediatric SGNA) was determined. In Study 2, subjects were classified as 'at risk of malnutrition' or 'not at risk of malnutrition' according to SCAN, and measures of height, weight, body mass index (BMI) and body composition were compared between the groups. The validation of SCAN against pediatric SGNA showed that SCAN had 'excellent' accuracy (0.90, 95% CI 0.78-1.00; p < 0.001), 100% sensitivity, 39% specificity, 56% positive predictive value and 100% negative predictive value. When subjects in Study 2 were classified into 'at risk of malnutrition' and 'not at risk of malnutrition' groups according to SCAN, the 'at risk of malnutrition' group had significantly lower values for weight Z score (p = 0.001), BMI Z score (p = 0.001) and fat mass index (FMI) (p = 0.04) than the 'not at risk of malnutrition' group. This study shows that SCAN is a simple, quick and valid tool which can be used to identify children with cancer who are at risk of malnutrition. Copyright © 2015 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
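The four validation measures reported for SCAN all come from a 2×2 confusion matrix against the reference assessment. The sketch below computes them from raw counts; the counts in the test are illustrative numbers chosen to reproduce the reported pattern (100% sensitivity/NPV, ~39% specificity, 56% PPV), not the study's actual data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Diagnostic accuracy of a screening tool vs a reference assessment.

    tp/fp/fn/tn: true/false positives and negatives against the reference.
    """
    return {
        "sensitivity": tp / (tp + fn),  # at-risk children correctly flagged
        "specificity": tn / (tn + fp),  # not-at-risk children correctly passed
        "ppv": tp / (tp + fp),          # flagged children truly at risk
        "npv": tn / (tn + fn),          # passed children truly not at risk
    }
```

For a screening tool, 100% sensitivity and NPV at the cost of low specificity is the intended trade-off: no at-risk child is missed, at the price of some over-referral to full assessment.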

  20. Model Continuation High Schools: Social-Cognitive Factors That Contribute to Re-Engaging At-Risk Students Emotionally, Behaviorally, and Cognitively towards Graduation

    ERIC Educational Resources Information Center

    Sumbera, Becky

    2017-01-01

    This three-phase, two-method qualitative study explored and identified policies, programs, and practices that school-site administrators perceived as most effective in reengaging at-risk students emotionally, behaviorally, and cognitively at 10 California Model Continuation High Schools (MCHS). Eccles' expectancy-value theoretical framework was…

  1. A systematic and critical review of model-based economic evaluations of pharmacotherapeutics in patients with bipolar disorder.

    PubMed

    Mohiuddin, Syed

    2014-08-01

    Bipolar disorder (BD) is a chronic and relapsing mental illness with a considerable health-related and economic burden. The primary goal of pharmacotherapeutics for BD is to improve patients' well-being. The use of decision-analytic models is key to assessing the added value of pharmacotherapeutics aimed at treating the illness, but concerns have been expressed about the appropriateness of different modelling techniques and about transparency in the reporting of economic evaluations. This paper aimed to identify and critically appraise published model-based economic evaluations of pharmacotherapeutics in BD patients. A systematic review combining common terms for BD and economic evaluation was conducted in MEDLINE, EMBASE, PSYCINFO and ECONLIT. The studies identified were summarised and critically appraised in terms of modelling technique, model structure and data sources. Considering the prognosis and management of BD, the possible benefits and limitations of each modelling technique are discussed. Fourteen model-based economic evaluations of pharmacotherapeutics in BD patients were identified. Of these 14 studies, nine used Markov models, three used discrete-event simulation (DES) and two used decision-tree models. Most of the studies (n = 11) did not include a rationale for the choice of modelling technique. Half of the studies did not include the risk of mortality. Surprisingly, no study considered the risk of a mixed bipolar episode. This review identified various modelling issues that could potentially reduce the comparability of one pharmacotherapeutic intervention with another. Better use and reporting of modelling techniques in future studies are essential. DES appears to be a flexible and comprehensive technique for comparing BD treatment options because of its greater flexibility in depicting disease progression over time. However, depending on the research question, modelling techniques other than DES might also be appropriate in some cases.

  2. Spatial gradient of human health risk from exposure to trace elements and radioactive pollutants in soils at the Puchuncaví-Ventanas industrial complex, Chile.

    PubMed

    Salmani-Ghabeshi, S; Palomo-Marín, M R; Bernalte, E; Rueda-Holgado, F; Miró-Rodríguez, C; Cereceda-Balic, F; Fadic, X; Vidal, V; Funes, M; Pinilla-Gil, E

    2016-11-01

    The Puchuncaví Valley in central Chile, heavily affected by a range of anthropogenic emissions from a localized industrial complex, has been studied as a model environment for evaluating the spatial gradient of human health risk, mainly caused by trace elemental pollutants in soil. Soil elemental profiles in 121 samples from five selected locations representing different degrees of impact from the industrial source were used for human risk estimation. Distance-to-source-dependent cumulative non-carcinogenic hazard indexes above 1 for children (max 4.4, min 1.5) were found in the study area, with ingestion the most relevant exposure pathway. The significance of health risk differences within the study area was confirmed by statistical analysis (ANOVA and HCA) of individual hazard index values at the five sampling locations. Arsenic (As) was the dominant factor causing unacceptable carcinogenic risk levels for children (>10(-4)) at the two sampling locations closest to the industrial complex, whereas the risk was within the tolerable range (10(-6)-10(-4)) for children and adults at the remaining sampling locations. Furthermore, we assessed gamma radiation external hazard indexes and the annual effective dose rate from natural radioactivity ((226)Ra, (232)Th and (40)K) levels in the surface soils of the study area. The highest average specific activities of (232)Th (31 Bq kg(-1)), (40)K (615 Bq kg(-1)) and (226)Ra (25 Bq kg(-1)) are lower than the limits recommended by the OECD, so no significant radioactive risk was detected within the study area. In addition, no significant variability of radioactive risk was observed among sampling locations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Investigation of Two Models to Set and Evaluate Quality Targets for HbA1c: Biological Variation and Sigma-metrics

    PubMed Central

    Weykamp, Cas; John, Garry; Gillery, Philippe; English, Emma; Ji, Linong; Lenters-Westra, Erna; Little, Randie R.; Roglic, Gojka; Sacks, David B.; Takei, Izumi

    2016-01-01

    Background A major objective of the IFCC Task Force on implementation of HbA1c standardization is to develop a model to define quality targets for HbA1c. Methods Two generic models, the Biological Variation model and the Sigma-metrics model, are investigated. Variables in the models were selected for HbA1c, and data from EQA/PT programs were used to evaluate the suitability of the models for setting and evaluating quality targets within and between laboratories. Results In the Biological Variation model, 48% of individual laboratories and none of the 26 instrument groups met the minimum performance criterion. In the Sigma-metrics model, with the total allowable error (TAE) set at 5 mmol/mol (0.46% NGSP), 77% of individual laboratories and 12 of 26 instrument groups met the 2-sigma criterion. Conclusion The Biological Variation and Sigma-metrics models were demonstrated to be suitable for setting and evaluating quality targets within and between laboratories. The Sigma-metrics model is more flexible, as both the TAE and the risk of failure can be adjusted to requirements related to, e.g., use for diagnosis/monitoring or requirements of (inter)national authorities. With the aim of reaching international consensus on advice regarding quality targets for HbA1c, the Task Force suggests the Sigma-metrics model as the model of choice, with default values of 5 mmol/mol (0.46%) for TAE and risk levels of 2 and 4 sigma for routine laboratories and laboratories performing clinical trials, respectively. These goals should serve as a starting point for discussion with international stakeholders in the field of diabetes. PMID:25737535
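    The sigma criterion described above can be sketched numerically. A minimal Python sketch, assuming the conventional sigma-metric formula sigma = (TAE − |bias|) / SD with all quantities in mmol/mol; the laboratory values below are illustrative, not from the study:

```python
def sigma_metric(tae, bias, sd):
    """Sigma-metric: how many standard deviations of imprecision fit
    inside the total allowable error after subtracting absolute bias."""
    return (tae - abs(bias)) / sd

# Illustrative laboratory: TAE 5 mmol/mol, bias 1 mmol/mol, SD 1 mmol/mol
sigma = sigma_metric(5.0, 1.0, 1.0)
meets_routine_criterion = sigma >= 2   # 2-sigma level suggested for routine labs
meets_trial_criterion = sigma >= 4     # 4-sigma level suggested for trial labs
```

    Raising the TAE or tolerating a lower sigma level is how the model's flexibility, noted in the conclusion, plays out in practice.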

  4. Cumulative Risk Assessment: An Overview of Methodological Approaches for Evaluating Combined Health Effects from Exposure to Multiple Environmental Stressors

    PubMed Central

    Sexton, Ken

    2012-01-01

    Systematic evaluation of cumulative health risks from the combined effects of multiple environmental stressors is becoming a vital component of risk-based decisions aimed at protecting human populations and communities. This article briefly examines the historical development of cumulative risk assessment as an analytical tool, and discusses current approaches for evaluating cumulative health effects from exposure to both chemical mixtures and combinations of chemical and nonchemical stressors. A comparison of stressor-based and effects-based assessment methods is presented, and the potential value of focusing on viable risk management options to limit the scope of cumulative evaluations is discussed. The ultimate goal of cumulative risk assessment is to provide answers to decision-relevant questions based on organized scientific analysis; even if the answers, at least for the time being, are inexact and uncertain. PMID:22470298

  5. Ecotoxicological evaluation of propranolol hydrochloride and losartan potassium to Lemna minor L. (1753) individually and in binary mixtures.

    PubMed

    Godoy, Aline A; Kummrow, Fábio; Pamplin, Paulo Augusto Z

    2015-07-01

    Antihypertensive pharmaceuticals, including beta-blockers, are among the most frequently detected therapeutic classes in the environment. The ecotoxicity of propranolol hydrochloride and losartan potassium was evaluated, both individually and combined in a binary mixture, using the Lemna minor growth inhibition test. The endpoints evaluated in the single-pharmaceutical tests were frond number, total frond area and fresh weight. For the evaluation of mixture toxicity, the selected endpoint was frond number. Water quality criteria (WQC) values were derived for the protection of freshwater and saltwater pelagic communities against the effects induced by propranolol and losartan, using ecotoxicological data from the literature together with our own. The risks associated with the effects of both pharmaceuticals on non-target organisms were quantified through measured environmental concentration (MEC)/predicted-no-effect concentration (PNEC) ratios. For propranolol, total frond area was the most sensitive endpoint (EC50 = 77.3 mg L(-1)), while for losartan there was no statistically significant difference between endpoints. Losartan is only slightly more toxic than propranolol. Both the concentration addition and independent action models overestimated the mixture toxicity of the pharmaceuticals at all effect concentration levels evaluated. The joint action of both pharmaceuticals showed an antagonistic interaction for L. minor. The derived WQC were lower for propranolol than for losartan. The MEC/PNEC ratios showed that propranolol may pose a risk to the most sensitive aquatic species, while acceptable risks were estimated for losartan in most aquatic matrices. To the authors' knowledge these are the first data on losartan toxicity for L. minor.
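    The two mixture models named above are standard reference models for joint toxicity. A minimal sketch of both (the textbook formulas, not the authors' fitted curves); `fractions` are the relative proportions of each pharmaceutical in the mixture and `effects` are fractional responses between 0 and 1:

```python
def concentration_addition_ec50(fractions, ec50s):
    # Concentration addition: 1/EC50_mix = sum(p_i / EC50_i)
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

def independent_action_effect(effects):
    # Independent action: E_mix = 1 - prod(1 - E_i)
    remaining = 1.0
    for e in effects:
        remaining *= (1.0 - e)
    return 1.0 - remaining

# Equitoxic 50:50 mixture of two compounds with identical EC50s
ec50_mix = concentration_addition_ec50([0.5, 0.5], [77.3, 77.3])
combined = independent_action_effect([0.5, 0.5])
```

    An antagonistic interaction, as reported here, means the observed mixture effect fell below what either of these reference predictions gives.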

  6. Measurement error and timing of predictor values for multivariable risk prediction models are poorly reported.

    PubMed

    Whittle, Rebecca; Peat, George; Belcher, John; Collins, Gary S; Riley, Richard D

    2018-05-18

    Measurement error in predictor variables may threaten the validity of clinical prediction models. We sought to evaluate the possible extent of the problem. A secondary objective was to examine whether predictors are measured at the intended moment of model use. A systematic search of Medline was used to identify a sample of articles reporting the development of a clinical prediction model published in 2015. After screening according to predefined inclusion criteria, information on predictors, strategies to control for measurement error and the intended moment of model use was extracted. Susceptibility to measurement error for each predictor was classified as low or high risk. Thirty-three studies were reviewed, including 151 different predictors in the final prediction models. Fifty-one (33.7%) predictors were categorised as at high risk of error; however, this was not accounted for in model development. Only 8 (24.2%) studies explicitly stated the intended moment of model use and when the predictors were measured. Reporting of measurement error and the intended moment of model use is poor in prediction model studies. There is a need to identify circumstances where ignoring measurement error in prediction models is consequential and whether accounting for the error will improve the predictions. Copyright © 2018. Published by Elsevier Inc.

  7. Soy and isoflavone consumption and risk of gastrointestinal cancer: a systematic review and meta-analysis.

    PubMed

    Tse, Genevieve; Eslick, Guy D

    2016-02-01

    Evidence suggests that soy foods have chemoprotective properties that may reduce the risk of certain cancers, such as breast and prostate cancer. However, data on gastrointestinal (GI) cancers have been limited, and the evidence remains controversial. This study aims to determine the potential relationship between dietary soy intake and GI cancer risk, with an evaluation of the effects of isoflavone as an active soy constituent. Relevant studies were identified through a literature search of electronic databases through May 2014. Subgroup analysis for isoflavone intake (n = 10 studies) was performed. Covariates including gender, anatomical subsite and preparation method were also evaluated. Pooled adjusted odds ratios (ORs) comparing the highest and lowest categories of dietary pattern scores were calculated using a random effects model. Twenty-two case-control and 18 cohort studies were included in the meta-analysis, comprising a total of 633,476 participants and 13,639 GI cancer cases. The combined OR was 0.93 (95% CI 0.87-0.99; p value heterogeneity = 0.01), showing only a slight decrease in risk; the association was stronger for colon cancer (OR 0.92; 95% CI 0.96-0.99; p value heterogeneity = 0.163) and colorectal cancer (CRC) (OR 0.92; 95% CI 0.87-0.97; p value heterogeneity = 0.3). Subgroup analysis for isoflavone intake showed a statistically significant risk reduction, with a risk estimate of 0.73 (95% CI 0.59-0.92; p value heterogeneity = 0), particularly for CRC (OR 0.76; 95% CI 0.59-0.98; p value heterogeneity = 0). This study provides evidence that soy intake as a food group is associated with only a small reduction in GI cancer risk. Separate analysis of dietary isoflavone intake suggests a stronger inverse association.
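    The pooled ORs above come from a random effects model; a common choice for such pooling is the DerSimonian-Laird estimator on the log-OR scale. A self-contained sketch under that assumption (the abstract does not specify which random-effects method was used):

```python
import math

def pooled_or_random_effects(ors, ses):
    """DerSimonian-Laird random-effects pooling on the log-OR scale.
    ors: study odds ratios; ses: standard errors of the log-ORs."""
    y = [math.log(o) for o in ors]
    w = [1.0 / s**2 for s in ses]                       # inverse-variance weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)             # between-study variance
    wstar = [1.0 / (s**2 + tau2) for s in ses]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wstar, y)) / sum(wstar)
    return math.exp(pooled)
```

    With no between-study heterogeneity the estimate collapses to the fixed-effect result; heterogeneity inflates tau2 and widens the effective weights.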

  8. Predicting and evaluation the severity in acute pancreatitis using a new modeling built on body mass index and intra-abdominal pressure.

    PubMed

    Fei, Yang; Gao, Kun; Tu, Jianfeng; Wang, Wei; Zong, Guang-Quan; Li, Wei-Qin

    2017-06-03

    Acute pancreatitis (AP) remains a difficult medical diagnosis and treatment problem, and early evaluation of severity and risk stratification in patients with AP is very important. Scoring systems such as the acute physiology and chronic health evaluation-II (APACHE-II), the computed tomography severity index (CTSI), Ranson's score and the bedside index of severity of AP (BISAP) have been used; nevertheless, these methods have shortcomings. The aim of this study was to construct a new model including intra-abdominal pressure (IAP) and body mass index (BMI) to evaluate severity in AP. The study comprised two independent cohorts of patients with AP: a development set of 1073 patients from Jinling Hospital admitted between January 2013 and October 2016, and a validation set of 326 patients from the 81st Hospital admitted between January 2012 and December 2016. The association between risk factors and severity of AP was assessed by univariable analysis; the multivariable model was explored through stepwise selection regression. The changes in IAP and BMI were combined to generate a regression equation as the new model. Statistical indexes were used to evaluate the predictive value of the new model. Univariable analysis confirmed changes in IAP and BMI to be significantly associated with severity of AP. The predicted sensitivity, specificity, positive predictive value, negative predictive value and accuracy of the new model for severity of AP were 77.6%, 82.6%, 71.9%, 87.5% and 74.9%, respectively, in the development dataset. There were significant differences between the new model and other scoring systems in these parameters (P < 0.05). In addition, a comparison of the areas under the receiver operating characteristic curves showed a statistically significant difference (P < 0.05). The same results were found in the validation dataset. A new model based on IAP and BMI thus better predicts the severity of AP. Copyright © 2017. Published by Elsevier Inc.
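    The performance figures reported in this abstract (and several others in this collection) are standard confusion-matrix quantities. A minimal sketch of their definitions; the counts below are illustrative, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Screening-performance measures from a 2x2 table of
    true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),   # severe cases correctly flagged
        "specificity": tn / (tn + fp),   # mild cases correctly ruled out
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts only
m = diagnostic_metrics(tp=8, fp=2, fn=2, tn=8)
```
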

  9. A MELD-based model to determine risk of mortality among patients with acute variceal bleeding.

    PubMed

    Reverter, Enric; Tandon, Puneeta; Augustin, Salvador; Turon, Fanny; Casu, Stefania; Bastiampillai, Ravin; Keough, Adam; Llop, Elba; González, Antonio; Seijo, Susana; Berzigotti, Annalisa; Ma, Mang; Genescà, Joan; Bosch, Jaume; García-Pagán, Joan Carles; Abraldes, Juan G

    2014-02-01

    Patients with cirrhosis with acute variceal bleeding (AVB) have high mortality rates (15%-20%). Previously described models are seldom used to determine prognoses of these patients, partially because they have not been validated externally and because they include subjective variables, such as bleeding during endoscopy and Child-Pugh score, which are evaluated inconsistently. We aimed to improve determination of risk for patients with AVB. We analyzed data collected from 178 patients with cirrhosis (Child-Pugh scores of A, B, and C: 15%, 57%, and 28%, respectively) and esophageal AVB who received standard therapy from 2007 through 2010. We tested the performance (discrimination and calibration) of previously described models, including the model for end-stage liver disease (MELD), and developed a new MELD calibration to predict the mortality of patients within 6 weeks of presentation with AVB. MELD-based predictions were validated in cohorts of patients from Canada (n = 240) and Spain (n = 221). Among study subjects, the 6-week mortality rate was 16%. MELD was the best model in terms of discrimination; it was recalibrated to predict the 6-week mortality rate with logistic regression (logit = -5.312 + 0.207 × MELD; bootstrapped R(2) = 0.3295). MELD values of 19 or greater predicted 20% or greater mortality, whereas MELD scores less than 11 predicted less than 5% mortality. The model performed well for patients from Canada at all risk levels. In the Spanish validation set, in which all patients were treated with banding ligation, MELD predictions were accurate up to the 20% risk threshold. We developed a MELD-based model that accurately predicts mortality among patients with AVB, based on objective variables available at admission. This model could be useful to evaluate the efficacy of new therapies and stratify patients in randomized trials. Copyright © 2014 AGA Institute. Published by Elsevier Inc. All rights reserved.
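    The recalibrated model above is fully specified by its logit, so the 6-week mortality estimate can be reproduced directly; a short sketch, assuming the standard inverse-logit link:

```python
import math

def meld_mortality_6wk(meld):
    """6-week mortality after acute variceal bleeding, from the
    recalibrated model: logit(p) = -5.312 + 0.207 * MELD."""
    logit = -5.312 + 0.207 * meld
    return 1.0 / (1.0 + math.exp(-logit))

# Consistent with the reported thresholds:
p_high = meld_mortality_6wk(19)   # about 0.20 (>= 20% mortality)
p_low = meld_mortality_6wk(10)    # about 0.04 (< 5% mortality)
```
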

  10. University of North Carolina Caries Risk Assessment Study: comparisons of high risk prediction, any risk prediction, and any risk etiologic models.

    PubMed

    Beck, J D; Weintraub, J A; Disney, J A; Graves, R C; Stamm, J W; Kaste, L M; Bohannan, H M

    1992-12-01

    The purpose of this analysis is to compare three different statistical models for predicting which children are likely to be at risk of developing dental caries over a 3-yr period. Data are based on 4117 children who participated in the University of North Carolina Caries Risk Assessment Study, a longitudinal study conducted in the Aiken, South Carolina, and Portland, Maine areas. The three models differed with respect to either the types of variables included or the definition of the disease outcome. The two "Prediction" models included both risk factor variables thought to cause dental caries and indicator variables that are associated with dental caries but are not thought to be causal for the disease. The "Etiologic" model included only etiologic factors as variables. A dichotomous outcome measure (none vs. any 3-yr increment) was used in the "Any Risk Etiologic Model" and the "Any Risk Prediction Model". Another outcome, based on a gradient measure of disease, was used in the "High Risk Prediction Model". The variables that are significant in these models vary across grades and sites, but are more consistent in the Etiologic Model than in the Prediction Models. However, among the three sets of models, the Any Risk Prediction Models have the highest sensitivity and positive predictive values, whereas the High Risk Prediction Models have the highest specificity and negative predictive values. Considerations in determining model preference are discussed.

  11. Problems With Risk Reclassification Methods for Evaluating Prediction Models

    PubMed Central

    Pepe, Margaret S.

    2011-01-01

    For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714
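    The Net Reclassification Index components the author argues for reporting are simple proportions of reclassified subjects. A minimal sketch with hypothetical counts; "up" means moved to a higher risk category under the expanded model:

```python
def nri_components(up_events, down_events, n_events,
                   up_nonevents, down_nonevents, n_nonevents):
    """Event and non-event components of the Net Reclassification Index.
    Events should move up in risk; non-events should move down."""
    nri_events = (up_events - down_events) / n_events
    nri_nonevents = (down_nonevents - up_nonevents) / n_nonevents
    return nri_events, nri_nonevents, nri_events + nri_nonevents

# Hypothetical study: 100 events, 100 non-events
ev, nonev, total = nri_components(30, 10, 100, 5, 20, 100)
```

    Reporting `ev` and `nonev` separately, rather than only `total`, shows whether the new predictor helps with events, non-events, or both.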

  12. Calibrating the response to health warnings: limiting both overreaction and underreaction with self-affirmation.

    PubMed

    Griffin, Dale W; Harris, Peter R

    2011-05-01

    Self-affirmation, reflecting on one's defining personal values, increases acceptance of threatening information, but does it do so at the cost of inducing undue alarm in people at low risk of harm? We contrast an alarm model, wherein self-affirmation simply increases response to threat, with a calibration model, wherein self-affirmation increases sensitivity to the self-relevance of health-risk information. Female seafood consumers (N = 165) completed a values self-affirmation or control task before reading a U.S. Food and Drug Administration brochure on mercury in seafood. Findings support the calibration model: Among frequent seafood consumers, self-affirmation generally increased concern (reports of depth of thought, personal message relevance, perceived risk, and negative affect) for those high in defensiveness and reduced it for those low in defensiveness. Among infrequent consumers of seafood, self-affirmation typically reduced concern. Thus, self-affirmation increased the sensitivity with which women at different levels of risk, and at different levels of defensiveness, responded cognitively and affectively to the materials.

  13. Left ventricular energy model predicts adverse events in women with suspected myocardial ischemia: results from the NHLBI-sponsored women’s ischemia syndrome evaluation (WISE) study

    PubMed Central

    Weinberg, Nicole; Pohost, Gerald M.; Bairey Merz, C. Noel; Shaw, Leslee J.; Sopko, George; Fuisz, Anthon; Rogers, William J.; Walsh, Edward G.; Johnson, B. Delia; Sharaf, Barry L.; Pepine, Carl J.; Mankad, Sunil; Reis, Steven E.; Rayarao, Geetha; Vido, Diane A.; Bittner, Vera; Tauxe, Lindsey; Olson, Marian B.; Kelsey, Sheryl F.; Biederman, Robert WW

    2013-01-01

    Objectives To assess the prognostic value of a left ventricular energy model in women with suspected myocardial ischemia. Background The prognostic value of internal energy utilization (IEU) of the left ventricle in women with suspected myocardial ischemia is unknown. Methods Women [n=227, mean age 59±12 years (range, 31-86 years)] with symptoms of myocardial ischemia underwent myocardial perfusion imaging (MPI) assessment for regional perfusion defects, along with measurement of ventricular volumes separately by gated single photon emission computed tomography (SPECT) (n=207) and magnetic resonance imaging (MRI) (n=203). During follow-up (40±17 months), time to first major adverse cardiovascular event (MACE: death, myocardial infarction or hospitalization for congestive heart failure) was analyzed using MRI and gated SPECT variables. Results Adverse events occurred in 31 women (14%). Multivariable Cox models were formed for each modality: IEU and wall thickness by MRI (Chi-squared 34, P<0.005) and IEU and systolic blood pressure by gated SPECT (Chi-squared 34, P<0.005). The models remained predictive after adjustment for age, disease history and Framingham risk score. For each Cox model, patients were categorized as high-risk if the model hazard was positive and not high-risk otherwise. Kaplan-Meier analysis of time to MACE was performed for high-risk vs. not high-risk for the MRI (log rank 25.3, P<0.001) and gated SPECT (log rank 18.2, P<0.001) models. Conclusions Among women with suspected myocardial ischemia, high internal energy utilization has higher prognostic value than either a low ejection fraction or the presence of a myocardial perfusion defect, assessed using the two independent modalities of MRI and gated SPECT. PMID:24015377

  14. Risk model for estimating the 1-year risk of deferred lesion intervention following deferred revascularization after fractional flow reserve assessment.

    PubMed

    Depta, Jeremiah P; Patel, Jayendrakumar S; Novak, Eric; Gage, Brian F; Masrani, Shriti K; Raymer, David; Facey, Gabrielle; Patel, Yogesh; Zajarias, Alan; Lasala, John M; Amin, Amit P; Kurz, Howard I; Singh, Jasvindar; Bach, Richard G

    2015-02-21

    Although lesions for which revascularization is deferred following fractional flow reserve (FFR) assessment have a low risk of adverse cardiac events, variability in the risk for deferred lesion intervention (DLI) has not been previously evaluated. The aim of this study was to develop a prediction model to estimate the 1-year risk of DLI for coronary lesions where revascularization was not performed following FFR assessment. A prediction model for DLI was developed from a cohort of 721 patients with 882 coronary lesions where revascularization was deferred based on FFR between 10/2002 and 7/2010. Deferred lesion intervention was defined as any revascularization of a lesion previously deferred following FFR. The final DLI model was developed using stepwise Cox regression and validated using bootstrapping techniques. An algorithm was constructed to predict the 1-year risk of DLI. During a mean (±SD) follow-up period of 4.0 ± 2.3 years, 18% of lesions deferred after FFR underwent DLI; the 1-year incidence of DLI was 5.3%, while the predicted risk of DLI varied from 1 to 40%. The final Cox model included the FFR value, age, current or former smoking, history of coronary artery disease (CAD) or prior percutaneous coronary intervention, multi-vessel CAD, and serum creatinine. The c statistic for the DLI prediction model was 0.66 (95% confidence interval, CI: 0.61-0.70). Patients for whom revascularization is deferred based on FFR vary in their risk for DLI. A clinical prediction model consisting of five clinical variables and the FFR value can help predict the risk of DLI in the first year following FFR assessment. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2014. For permissions please email: journals.permissions@oup.com.

  15. Assessment of winter wheat loss risk impacted by climate change from 1982 to 2011

    NASA Astrophysics Data System (ADS)

    Du, Xin

    2017-04-01

    The world's farmers will face increasing pressure to grow more food on less land in the coming decades, because continued population growth and the diversion of agricultural products to biofuels are expected to extend well into the future. The increased demand for food supply worldwide therefore calls for improved accuracy in crop productivity estimation and in the assessment of grain production loss risk. Extensive studies have evaluated the impacts of climate change on crop production using various crop models driven by global or regional climate model (GCM/RCM) output. However, assessment of climate change impacts on agricultural productivity is plagued by uncertainty in future climate change scenarios and by the complexity of crop models. Given uncertain climate conditions and a lack of model parameters, these methods are strictly limited in application. In this study, an empirical approach for assessing crop loss risk caused by water stress was established and used to evaluate the risk of winter wheat loss in China, the United States, Germany, France and the United Kingdom. The average winter wheat loss attributable to water stress for the three European countries is about -931 kg/ha, markedly higher than in China (-570 kg/ha) and the United States (-367 kg/ha). Our study has important implications for the operational assessment of crop loss risk at a country or regional scale. Future studies should focus on using higher-spatial-resolution remote sensing data, combining actual evapotranspiration to estimate water stress, improving the method for downscaling statistical crop yield data, and establishing a more rational and elaborate zoning method.

  16. Deciding which drugs get onto the formulary: a value-based approach.

    PubMed

    Seigfried, Raymond J; Corbo, Teresa; Saltzberg, Mitchell T; Reitz, Jeffrey; Bennett, Dean A

    2013-01-01

    Hospitals, physicians, payers, and patients face economic and ethical decisions about the use of biotechnology drugs, commonly called specialty medications. These often target a small population, have data based on smaller clinical trials, are expensive, and may offer questionable advantage. This is a consequence of how the Food and Drug Administration (FDA) approves medications, which is based only on safety and efficacy. Once approved by the FDA, cancer drugs must be covered by Medicare regardless of cost or value, and some states have laws requiring additional coverage as well. All of this has created an unintended consequence: it has driven up costs with questionable evidence to support the medications' value, placing patients, payers, and providers in an ethical conflict. In this new era of health care transformation, health care leaders must focus on creating value to support a sustainable health system. Christiana Care Health System's Value Institute has designed a new model to evaluate specialty medications, using value as its main criterion. This article describes the process and outcomes of using the new value model to evaluate specialty medications for a hospital formulary. It also introduces a new evaluation criterion, entitled "Societal Benefit," that provides a rating on quality-of-life issues. With measurable factors of efficacy, risk, cost, and quality-of-life concerns, our methodology provides a more balanced approach to the evaluation of specialty medications. Specialty medications are the fastest growing segment of drug expense, and it is hard to see how these medications will be sustainable under health care reforms. Unlike other countries, the United States has no national agency providing cost-effectiveness review; review occurs, if at all, at a local level. Laws governing Medicare and most private insurers' coverage of FDA-approved medications, along with some clinical quality standards, conflict with cost-effectiveness, making this type of review difficult. Finally, because these medications affect the health system as a whole, they are a fitting starting point for supporting health care reform. Hospitals need to challenge the value of specialty medications. Although our model will continue to evolve, value is now our central consideration when selecting specialty medications for addition to the formulary. We share this experience to encourage other hospitals to design their own approach to this vital issue. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. Risk Assessment of Alzheimer's Disease using the Information Diffusion Model from Structural Magnetic Resonance Imaging.

    PubMed

    Beheshti, Iman; Olya, Hossain G T; Demirel, Hasan

    2016-04-05

    Recently, automatic risk assessment methods have become a target for the detection of Alzheimer's disease (AD) risk. This study aims to develop an automatic computer-aided AD diagnosis technique for risk assessment of AD using information diffusion theory. Information diffusion is a set-valued fuzzy mathematical method used for risk assessment of natural phenomena; it accounts for the fuzziness (uncertainty) and incompleteness of the data. Data were obtained from voxel-based morphometry analysis of structural magnetic resonance imaging. The information diffusion model results revealed that the risk of AD increases with a reduction of the normalized gray matter ratio (p > 0.5, normalized gray matter ratio <40%). The information diffusion model results were evaluated by calculating their correlation with two traditional risk assessments of AD, the Mini-Mental State Examination and the Clinical Dementia Rating. The correlation results revealed that the information diffusion model findings were in line with the Mini-Mental State Examination and Clinical Dementia Rating results. Application of the information diffusion model contributes to the computerization of risk assessment of AD, which has a practical implication for the early detection of AD.
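
The information diffusion idea above can be sketched in a few lines: each observation spreads unit probability mass over a set of monitor points via a Gaussian kernel, giving an empirical distribution from which exceedance risks can be read off. The sample values, monitor points, and bandwidth below are illustrative placeholders, not data from the study.

```python
# Minimal sketch of normal information diffusion for risk estimation;
# gray-matter ratios, monitor points, and bandwidth h are invented.
import math

def diffuse(samples, monitor_points, h):
    """Spread each observation over the monitor points with a Gaussian
    kernel of bandwidth h, normalising so each sample carries unit mass."""
    dist = [0.0] * len(monitor_points)
    for x in samples:
        weights = [math.exp(-(x - u) ** 2 / (2 * h * h)) for u in monitor_points]
        total = sum(weights)
        for j, w in enumerate(weights):
            dist[j] += w / total
    n = sum(dist)
    return [d / n for d in dist]          # empirical probability distribution

def exceedance_risk(samples, monitor_points, h):
    """P(indicator <= u) at each monitor point, e.g. the risk that a
    normalized gray-matter ratio falls at or below a given level."""
    p = diffuse(samples, monitor_points, h)
    cum, out = 0.0, []
    for pj in p:
        cum += pj
        out.append(cum)
    return out

# Hypothetical normalized gray-matter ratios (%) for a handful of subjects
gm_ratios = [37.0, 39.5, 41.0, 43.5, 45.0, 46.5]
points = [35 + i for i in range(15)]      # monitor points at 35..49%
risk_below = exceedance_risk(gm_ratios, points, h=1.5)
```

The cumulative curve rises toward 1.0 across the monitor points; reading it at a threshold such as 40% gives a fuzzy estimate of the fraction of subjects at or below that level.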

  18. Investing in a healthy lifestyle strategy: is it worth it?

    PubMed

    Benmarhnia, Tarik; Dionne, Pierre-Alexandre; Tchouaket, Éric; Fansi, Alvine K; Brousselle, Astrid

    2017-01-01

    In Quebec, various actors fund activities aimed at increasing physical activity, improving eating habits and reducing smoking. The objective was to evaluate how effective the healthy lifestyle habits promotion (HLHP) strategy needs to be to offset its costs. First, we built the logic model of the HLHP strategy. We then assessed the strategy's total cost as well as the direct health care expenditures associated with lifestyle-related risk factors (smoking, physical inactivity, insufficient intake of fruits and vegetables, obesity and overweight). Finally, we estimated the break-even point beyond which the economic benefits of the HLHP strategy would outweigh its costs. The HLHP strategy cost for 2010-2011 was estimated at $110 million. Direct healthcare expenditures associated with lifestyle-related risk factors were estimated at $4.161 billion. We estimated that 47% of these expenditures were attributable to these risk factors. We concluded that the HLHP strategy cost corresponded to 5.6% of the annual healthcare expenditures attributable to these risk factors. This study compared the economic value of HLHP activities against healthcare expenditures associated with the targeted risk factors.
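
The break-even arithmetic reported above can be reproduced directly from the published figures:

```python
# Back-of-envelope reproduction of the cost comparison in the abstract
# (figures as published; rounding to one decimal as reported).
hlhp_cost = 110e6                  # HLHP strategy cost, 2010-2011 ($)
direct_expenditures = 4.161e9      # direct healthcare spending linked to the risk factors ($)
attributable_fraction = 0.47       # share attributable to the risk factors

attributable = direct_expenditures * attributable_fraction
share = hlhp_cost / attributable   # strategy cost as a share of attributable spending
print(f"{share:.1%}")              # -> 5.6%, matching the abstract
```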

  19. Comparison of Nutritional Risk Scores for Predicting Mortality in Japanese Chronic Hemodialysis Patients.

    PubMed

    Takahashi, Hiroshi; Inoue, Keiko; Shimizu, Kazue; Hiraga, Keiko; Takahashi, Erika; Otaki, Kaori; Yoshikawa, Taeko; Furuta, Kumiko; Tokunaga, Chika; Sakakibara, Tomoyo; Ito, Yasuhiko

    2017-05-01

    Protein energy wasting (PEW) is consistently associated with poor prognosis in hemodialysis (HD) patients. We compared the predictability of PEW as diagnosed by the International Society of Renal Nutrition and Metabolism criteria (PEW-ISRNM) and the geriatric nutritional risk index (GNRI) for all-cause mortality in Japanese HD patients. As cut-off values of body mass index (BMI) for PEW have not been established in PEW-ISRNM for Asian populations, these were also investigated. The nutritional status of 409 HD patients was evaluated according to ISRNM and GNRI criteria. To compare the predictability of mortality, the C-index, net reclassification improvement (NRI) and integrated discrimination improvement were evaluated. During follow-up (median, 52 months; range, 7 months), 70 patients (17.1%) presented PEW according to ISRNM and 131 patients (32.1%) according to GNRI; in addition, 101 patients (24.7%) died. PEW-ISRNM and GNRI were identified as independent predictors of death. Addition of PEW-ISRNM and GNRI to a predictive model based on established risk factors improved NRI and integrated discrimination improvement. However, no differences were found between the models including PEW-ISRNM and GNRI. When lowering the criterion level of BMI by 1 kg/m² sequentially, PEW-ISRNM at BMI <20 kg/m² maximized the hazard ratio for mortality. The model including PEW-ISRNM at BMI <20 kg/m² improved NRI compared with the model including GNRI. PEW-ISRNM and GNRI represent independent predictors of mortality, with comparable predictability. The diagnostic criterion of BMI in the ISRNM for the Japanese population might be better at <20 kg/m² than at <23 kg/m². Copyright © 2016 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  20. The Moderating Effects of Students’ Personality Traits on Pro-Environmental Behavioral Intentions in Response to Climate Change

    PubMed Central

    Yu, Tai-Yi; Yu, Tai-Kuei

    2017-01-01

    This study developed a model that examined the relationship between undergraduate students’ beliefs, norms and pro-environment behavioral intentions in the context of global climate change (GCC). The model was further evaluated to determine whether latent variables, such as sustainability value, environmental concern, social norms, perceived risk, pro-environmental attitude, as defined by the theory of planned behavior and value-belief-norm theory, significantly influenced students’ intentions towards pro-environmental behavior. The research model was empirically tested using data collected from 275 undergraduate students. Empirical results found support for four interaction effects of personality traits and the related latent variables of environmental attitude, including sustainability value, social norms, environmental concern and perceived risk. The impact of undergraduate students’ environmental attitudes was moderated by personality traits. The findings of this research offer policy makers and enterprises a better understanding of undergraduate students’ attitudes and behavioral intentions towards GCC and promote the visibility of this issue. PMID:29186016

  1. The Moderating Effects of Students' Personality Traits on Pro-Environmental Behavioral Intentions in Response to Climate Change.

    PubMed

    Yu, Tai-Yi; Yu, Tai-Kuei

    2017-11-29

    This study developed a model that examined the relationship between undergraduate students' beliefs, norms and pro-environment behavioral intentions in the context of global climate change (GCC). The model was further evaluated to determine whether latent variables, such as sustainability value, environmental concern, social norms, perceived risk, pro-environmental attitude, as defined by the theory of planned behavior and value-belief-norm theory, significantly influenced students' intentions towards pro-environmental behavior. The research model was empirically tested using data collected from 275 undergraduate students. Empirical results found support for four interaction effects of personality traits and the related latent variables of environmental attitude, including sustainability value, social norms, environmental concern and perceived risk. The impact of undergraduate students' environmental attitudes was moderated by personality traits. The findings of this research offer policy makers and enterprises a better understanding of undergraduate students' attitudes and behavioral intentions towards GCC and promote the visibility of this issue.

  2. Assessing the value of post-processed state-of-the-art long-term weather forecast ensembles for agricultural water management mediated by farmers' behaviours

    NASA Astrophysics Data System (ADS)

    Li, Yu; Giuliani, Matteo; Castelletti, Andrea

    2016-04-01

    Recent advances in the modelling of coupled ocean-atmosphere dynamics have significantly improved the skill of long-term climate forecasts from global circulation models (GCMs). These more accurate weather predictions are expected to be a valuable support to farmers in optimizing farming operations (e.g. crop choice, cropping and watering time) and in coping more effectively with the adverse impacts of climate variability. Yet, assessing how valuable this information actually is to a farmer is not straightforward, and farmers' response must be taken into consideration. Indeed, in the context of agricultural systems, potentially useful forecast information should alter stakeholders' expectations, modify their decisions, and ultimately produce an impact on their performance. Nevertheless, long-term forecasts are mostly evaluated in terms of accuracy (i.e., forecast quality) by comparing hindcast and observed values, and only a few studies have investigated the operational value of forecasts in terms of the utility gained within the decision-making context, e.g. by considering derived products of forecast information, such as simulated crop yields or simulated soil moisture, which are essential to farmers' decision-making process. In this study, we take a step further in the assessment of the operational value of long-term weather forecast products by embedding the latter into farmers' behavioral models. This allows a more critical assessment of the forecast value as mediated by the end-users' perspective, including farmers' risk attitudes and behavioral patterns. Specifically, we evaluate the operational value of thirteen state-of-the-art long-range forecast products against climatology forecasts and empirical predictions (i.e. past year climate and historical average) within an integrated agronomic modeling framework embedding an implicit model of the farmers' decision-making process. 
Raw ensemble datasets are bias-corrected and downscaled using a stochastic weather generator, in order to address the mismatch of spatio-temporal scale between forecast data from GCMs and our model. For each product, the experiment consists of two cascaded simulations: 1) an ex-ante simulation using forecast data, and 2) an ex-post simulation with observations. Multi-year simulations are performed to account for climate variability, and the operational value of the different forecast products is evaluated against perfect foresight on the basis of expected crop productivity as well as the final decisions under different decision-making criteria. Our results show that not all products are beneficial to farmers' performance, and that forecast errors might be amplified by the farmers' decision-making process and risk attitudes, yielding little improvement, or even worse performance, compared with the empirical approaches.

  3. Evaluating the risk of death via the hematopoietic syndrome mode for prolonged exposure of nuclear workers to radiation delivered at very low rates.

    PubMed

    Scott, B R; Lyzlov, A F; Osovets, S V

    1998-05-01

    During a Phase-I effort, studies were planned to evaluate deterministic (nonstochastic) effects of chronic exposure of nuclear workers at the Mayak atomic complex in the former Soviet Union to relatively high levels (> 0.25 Gy) of ionizing radiation. The Mayak complex has been used, since the late 1940s, to produce plutonium for nuclear weapons. Workers at Site A of the complex were involved in plutonium breeding using nuclear reactors, and some were exposed to relatively large doses of gamma rays plus relatively small neutron doses. The Weibull normalized-dose model, which has been set up to evaluate the risk of specific deterministic effects of combined, continuous exposure of humans to alpha, beta, and gamma radiations, is here adapted for chronic exposure to gamma rays and neutrons during repeated 6-h work shifts--as occurred for some nuclear workers at Site A. Using the adapted model, key conclusions were reached that will facilitate a Phase-II study of deterministic effects among Mayak workers. 
These conclusions include the following: (1) neutron doses may be more important for Mayak workers than for Japanese A-bomb victims in Hiroshima and can be accounted for using an adjusted dose (which accounts for neutron relative biological effectiveness); (2) to account for dose-rate effects, the normalized dose X (a dimensionless fraction of an LD50 or ED50) can be evaluated in terms of an adjusted dose; (3) nonlinear dose-response curves for the risk of death via the hematopoietic mode can be converted to linear dose-response curves (for low levels of risk) using a newly proposed dimensionless dose, D = X^V, in units of Oklad (where D is pronounced "deh" and V is the shape parameter in the Weibull model); (4) for X ≤ Xo, where Xo is the threshold normalized dose, D = 0; (5) unlike absorbed dose, the dose D can be averaged over different Mayak workers in order to calculate the average risk of death via the hematopoietic mode for the population exposed at Site A; and (6) the expected number of deaths via the hematopoietic syndrome mode for Mayak workers chronically exposed during work shifts at Site A to gamma rays and neutrons can be predicted using ln(2)·B·M[D], where B (pronounced "beh") is the number of workers at risk (criticality accident victims excluded) and M[D] is the average (mean) value of D (averaged over the worker population at risk, for Site A, for the time period considered). These results can be used to facilitate a Phase II study of deterministic radiation effects among Mayak workers chronically exposed to gamma rays and neutrons.
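
The dose bookkeeping described in conclusions (3)-(6) can be sketched directly: D = X^V above the threshold (zero below it), and expected deaths in the low-risk linear regime equal to ln(2) times the number of workers times the mean D. The shape parameter V, threshold Xo, and worker doses below are illustrative placeholders, not Mayak values.

```python
# Sketch of the Oklad dose arithmetic from the abstract; V, Xo, and the
# normalized doses are invented for illustration.
import math

def oklad(X, V, X0):
    """Dimensionless dose D = X**V (in Oklad), zero at or below threshold X0."""
    return 0.0 if X <= X0 else X ** V

def expected_deaths(normalized_doses, V, X0):
    """Expected hematopoietic-mode deaths ~ ln(2) * B * M[D], where B is the
    number of workers and M[D] the mean Oklad dose (low-risk linear regime)."""
    D = [oklad(X, V, X0) for X in normalized_doses]
    B = len(normalized_doses)
    return math.log(2) * B * (sum(D) / len(D))

doses = [0.05, 0.10, 0.15, 0.20]   # hypothetical normalized doses (fractions of an LD50)
cases = expected_deaths(doses, V=6.0, X0=0.02)
```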

  4. Race, Genetic West African Ancestry, and Prostate Cancer Prediction by PSA in Prospectively Screened High-Risk Men

    PubMed Central

    Giri, Veda N.; Egleston, Brian; Ruth, Karen; Uzzo, Robert G.; Chen, David Y.T.; Buyyounouski, Mark; Raysor, Susan; Hooker, Stanley; Torres, Jada Benn; Ramike, Teniel; Mastalski, Kathleen; Kim, Taylor Y.; Kittles, Rick

    2008-01-01

    Introduction: “Race-specific” PSA needs evaluation in men at high risk for prostate cancer (PCA) to optimize early detection. Baseline PSA and longitudinal prediction for PCA were examined by self-reported race and genetic West African (WA) ancestry in the Prostate Cancer Risk Assessment Program, a prospective high-risk cohort. Materials and Methods: Eligibility criteria were age 35–69 years, FH of PCA, African American (AA) race, or BRCA1/2 mutations. Biopsies were performed at low PSA values (<4.0 ng/mL). WA ancestry was discerned by genotyping 100 ancestry informative markers. Cox proportional hazards models evaluated baseline PSA, self-reported race, and genetic WA ancestry; Cox models were also used for 3-year predictions for PCA. Results: 646 men (63% AA) were analyzed. Individual WA ancestry estimates varied widely among self-reported AA men. “Race-specific” differences in baseline PSA were not found by self-reported race or genetic WA ancestry. Among men with ≥1 follow-up visit (405 total, 54% AA), three-year prediction for PCA with a PSA of 1.5–4.0 ng/mL was higher in AA men, with age in the model (p=0.025), compared to EA men. Hazard ratios of PSA for PCA were also higher by self-reported race (1.59 for AA vs. 1.32 for EA, p=0.04). There was a trend toward increasing prediction for PCA with increasing genetic WA ancestry. Conclusions: “Race-specific” PSA may need to be redefined as higher prediction for PCA at any given PSA in AA men. Large-scale studies are needed to confirm whether genetic WA ancestry explains these findings in order to make progress in personalizing PCA early detection. PMID:19240249

  5. Indoor tanning and the MC1R genotype: risk prediction for basal cell carcinoma risk in young people.

    PubMed

    Molinaro, Annette M; Ferrucci, Leah M; Cartmel, Brenda; Loftfield, Erikka; Leffell, David J; Bale, Allen E; Mayne, Susan T

    2015-06-01

    Basal cell carcinoma (BCC) incidence is increasing, particularly in young people, and can be associated with significant morbidity and treatment costs. To identify young individuals at risk of BCC, we assessed existing melanoma or overall skin cancer risk prediction models and built a novel risk prediction model, with a focus on indoor tanning and the melanocortin 1 receptor gene, MC1R. We evaluated logistic regression models among 759 non-Hispanic whites from a case-control study of patients seen between 2006 and 2010 in New Haven, Connecticut. In our data, the adjusted area under the receiver operating characteristic curve (AUC) for a model by Han et al. (Int J Cancer. 2006;119(8):1976-1984) with 7 MC1R variants was 0.72 (95% confidence interval (CI): 0.66, 0.78), while that by Smith et al. (J Clin Oncol. 2012;30(15 suppl):8574) with MC1R and indoor tanning had an AUC of 0.69 (95% CI: 0.63, 0.75). Our base model had greater predictive ability than existing models and was significantly improved when we added ever-indoor tanning, burns from indoor tanning, and MC1R (AUC = 0.77, 95% CI: 0.74, 0.81). Our early-onset BCC risk prediction model incorporating MC1R and indoor tanning extends the work of other skin cancer risk prediction models, emphasizes the value of both genotype and indoor tanning in skin cancer risk prediction in young people, and should be validated with an independent cohort. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
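
The AUC comparisons quoted above can be illustrated with the standard rank-based (Mann-Whitney) estimator: the AUC is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen control, counting ties as one half. The risk scores below are fabricated for the sketch; this is not study data.

```python
# Minimal rank-based AUC, the quantity behind the model comparisons above.
def auc(scores_cases, scores_controls):
    """Probability that a randomly chosen case outscores a randomly chosen
    control, with ties counted as one half."""
    wins = ties = 0
    for c in scores_cases:
        for k in scores_controls:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_cases) * len(scores_controls))

cases = [0.9, 0.8, 0.6, 0.55]      # hypothetical predicted risks, BCC cases
controls = [0.7, 0.5, 0.4, 0.3]    # hypothetical predicted risks, controls
print(auc(cases, controls))        # -> 0.875
```

An AUC of 0.5 is chance-level discrimination; values such as the 0.77 reported above mean the model ranks a true case above a control about 77% of the time.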

  6. Risk Score Algorithm for Treatment of Persistent Apical Periodontitis

    PubMed Central

    Yu, V.S.; Khin, L.W.; Hsu, C.S.; Yee, R.; Messer, H.H.

    2014-01-01

    Persistent apical periodontitis related to a nonvital tooth that does not resolve following root canal treatment may be compatible with health and may not require further intervention. This research aimed to develop a Deterioration Risk Score (DRS) to differentiate lesions requiring further intervention from lesions likely to be compatible with health. In this cross-sectional study, patient records (2003-2008) were screened for root-filled teeth with periapical radiolucency visible on periapical radiographs taken at treatment and at recruitment at least 4 yr later. The final sample consisted of 228 lesions in 182 patients. Potential demographic and treatment risk factors were screened against 3 categorical outcomes (improved/unchanged/deteriorated), and a multivariate independent multinomial probit regression model was built. A 5-level DRS was constructed by summing values of adjusted regression coefficients in the model, based on predicted probabilities of deterioration. Most lesions (127, 55.7%) had improved over time, while 32 (14.0%) remained unchanged, and 69 (30.3%) had deteriorated. Significant predictors of deterioration were as follows: time since treatment (relative risk [RR]: 1.11, 95% confidence interval [CI]: 1.01-1.22, p = .030, rounded beta value = 1, for every year increase after 4 yr), current pain (RR: 3.79, 95% CI: 1.48-9.70, p = .005, rounded beta value = 13), sinus tract present (RR: 4.13, 95% CI: 1.11-15.29, p = .034, rounded beta value = 14), and lesion size (RR: 7.20, 95% CI: 3.70-14.02, p < .001, rounded beta value = 20). Persistent apical periodontitis with DRS <15 represented very low risk; 15-20, low risk; 21-30, moderate risk; 31-40, high risk; and >40, very high risk. The DRS could help the clinician identify persistent apical periodontitis at low risk for deterioration, which would not require intervention. 
When validated, this tool could reduce the risk of overtreatment and contribute toward targeted care and better efficiency in the timely management of disease. PMID:25190267
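
The DRS arithmetic summarised above is simple enough to sketch directly from the rounded beta values and cut-offs quoted in the abstract. The dichotomisation of lesion size into a single "large lesion" flag is an assumption for this sketch (the abstract does not give the size cut-off).

```python
# Sketch of the Deterioration Risk Score using the rounded beta values
# from the abstract; the large_lesion dichotomy is a hypothetical simplification.
def drs(years_since_treatment, current_pain, sinus_tract, large_lesion):
    score = max(0, years_since_treatment - 4) * 1   # +1 per year beyond 4 yr
    score += 13 if current_pain else 0              # current pain present
    score += 14 if sinus_tract else 0               # sinus tract present
    score += 20 if large_lesion else 0              # lesion size contribution
    return score

def risk_level(score):
    if score < 15:
        return "very low"
    if score <= 20:
        return "low"
    if score <= 30:
        return "moderate"
    if score <= 40:
        return "high"
    return "very high"

s = drs(years_since_treatment=6, current_pain=True, sinus_tract=False, large_lesion=True)
print(s, risk_level(s))   # -> 35 high
```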

  7. Peak Pc Prediction in Conjunction Analysis: Conjunction Assessment Risk Analysis. Pc Behavior Prediction Models

    NASA Technical Reports Server (NTRS)

    Vallejo, J.J.; Hejduk, M.D.; Stamey, J. D.

    2015-01-01

    Satellite conjunction risk is typically evaluated through the probability of collision (Pc), which considers both the conjunction geometry and the uncertainties in both state estimates. Conjunction events are initially discovered through Joint Space Operations Center (JSpOC) screenings, usually seven days before the Time of Closest Approach (TCA); however, JSpOC continues to track the objects and issue conjunction updates. Changes in the state estimates and the reduced propagation time cause Pc to change as the event develops. These changes are a combination of potentially predictable development and unpredictable changes in the state estimate covariance. An operationally useful datum is the peak Pc: if it can reasonably be inferred that the peak Pc value has passed, then risk assessment can be conducted against this peak value, and if this value is below the remediation level, the intensity of the event response can be relaxed. Can the peak Pc location be reasonably predicted?
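
One crude way to operationalise the "has the peak passed?" question is a run-length heuristic over the sequence of Pc updates: declare the peak passed once some number of consecutive updates have failed to exceed the running maximum. This is an invented illustration, not the prediction models the abstract studies; the update series and the run length k are placeholders.

```python
# Illustrative heuristic (not the paper's method): the peak Pc is deemed
# passed after k consecutive updates below the running maximum.
def peak_passed(pc_updates, k=3):
    """Return (running peak, bool: has Pc plausibly peaked?)."""
    peak, decline_run = 0.0, 0
    for pc in pc_updates:
        if pc > peak:
            peak, decline_run = pc, 0   # new maximum resets the run
        else:
            decline_run += 1
    return peak, decline_run >= k

updates = [1e-7, 5e-6, 4e-5, 9e-5, 6e-5, 2e-5, 8e-6]  # hypothetical Pc over successive screenings
peak, passed = peak_passed(updates)
print(peak, passed)    # -> 9e-05 True
```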

  8. Value-at-Risk forecasts by a spatiotemporal model in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Gong, Pu; Weng, Yingliang

    2016-01-01

    This paper generalizes a recently proposed spatial autoregressive model and introduces a spatiotemporal model for forecasting stock returns. We support the view that stock returns are affected not only by the absolute values of factors such as firm size, book-to-market ratio and momentum, but also by the relative values of factors such as trading volume ranking and market capitalization ranking in each period. This article proposes a new method, called the quartile method, for constructing stocks' reference groups. Applying the method empirically to the Shanghai Stock Exchange 50 Index, we compare the daily volatility forecasting performance and the out-of-sample forecasting performance of Value-at-Risk (VaR) estimates from different models. The empirical results show that the spatiotemporal model performs surprisingly well in capturing spatial dependencies among individual stocks, and it produces more accurate VaR forecasts than the other three models introduced in the previous literature. Moreover, the findings indicate that both allowing for serial correlation in the disturbances and using time-varying spatial weight matrices can greatly improve the predictive accuracy of a spatial autoregressive model.
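
The out-of-sample VaR evaluation behind comparisons like this one can be sketched without any spatial structure: compute a rolling historical-simulation VaR and count violations, which for a well-calibrated model should occur at roughly the nominal rate. The returns below are simulated Gaussians, not SSE 50 data, and this generic backtest stands in for, rather than reproduces, the paper's spatiotemporal model.

```python
# Generic historical-simulation VaR backtest (violation counting);
# returns are simulated, window and alpha are illustrative choices.
import random

random.seed(0)
returns = [random.gauss(0, 0.02) for _ in range(750)]   # hypothetical daily returns
window, alpha = 500, 0.05

violations = 0
for t in range(window, len(returns)):
    hist = sorted(returns[t - window:t])
    var = -hist[int(alpha * window)]     # 5% historical VaR, as a positive loss level
    if returns[t] < -var:                # realized loss exceeded the forecast
        violations += 1

coverage = violations / (len(returns) - window)
# For a well-calibrated model, coverage should sit near alpha = 5%.
```

Comparing models then amounts to comparing how close each model's violation rate is to the nominal alpha (and, more formally, applying coverage tests such as Kupiec's).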

  9. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  10. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  11. Reevaluation of 1999 Health-Based Environmental Screening Levels (HBESLs) for Chemical Warfare Agents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Annetta Paule; Dolislager, Fredrick G

    2007-05-01

    This report evaluates whether new information and updated scientific models require that changes be made to previously published health-based environmental soil screening levels (HBESLs) and associated environmental fate/breakdown information for chemical warfare agents (USACHPPM 1999). Specifically, the present evaluation describes and compares changes that have been made since 1999 to U.S. Environmental Protection Agency (EPA) risk assessment models, EPA exposure assumptions, as well as to specific chemical warfare agent parameters (e.g., toxicity values). Comparison was made between screening value estimates recalculated with current assumptions and earlier health-based environmental screening levels presented in 1999. The chemical warfare agents evaluated include the G-series and VX nerve agents and the vesicants sulfur mustard (agent HD) and Lewisite (agent L). In addition, key degradation products of these agents were also evaluated. Study findings indicate that the combined effect of updates and/or changes to EPA risk models, EPA default exposure parameters, and certain chemical warfare agent toxicity criteria does not result in significant alteration to the USACHPPM (1999) health-based environmental screening level estimates for the G-series and VX nerve agents or the vesicant agents HD and L. Given that EPA's final position on separate Tier 1 screening levels for indoor and outdoor worker screening assessments has not yet been released as of May 2007, the study authors find that the 1999 screening level estimates (see Table ES.1) are still appropriate and protective for screening residential as well as nonresidential sites. As such, risk management decisions made on the basis of USACHPPM (1999) recommendations do not require reconsideration. 
While the 1999 HBESL values are appropriate for continued use as general screening criteria, the updated '2007' estimates (presented below) that follow the new EPA protocols currently under development are also protective. When EPA finalizes and documents a position on the matter of indoor and outdoor worker screening assessments, site-specific risk assessments should make use of modified models and criteria. Screening values such as those presented in this report may be used to assess soil or other porous media to determine whether chemical warfare agent contamination is present as part of initial site investigations (whether due to intentional or accidental releases) and to determine whether weather/decontamination has adequately mitigated the presence of agent residual to below levels of concern. However, despite the availability of scientifically supported health-based criteria, there are significant resource needs that should be considered during sample planning. In particular, few analytical laboratories are likely to be able to meet these screening levels. Analyses will take time and usually have limited confidence at these concentrations. Therefore, and particularly for the more volatile agents, soil/destructive samples of porous media should be limited and instead supplemented with headspace monitoring and presence-absence wipe sampling.

  12. Functional correlation approach to operational risk in banking organizations

    NASA Astrophysics Data System (ADS)

    Kühn, Reimer; Neu, Peter

    2003-05-01

    A Value-at-Risk-based model is proposed to compute the adequate equity capital necessary to cover potential losses due to operational risks, such as human and system process failures, in banking organizations. Exploring the analogy to a lattice gas model from physics, correlations between sequential failures are modeled as functionally defined, heterogeneous couplings between mutually supportive processes. In contrast to traditional risk models for market and credit risk, where correlations are described as equal-time correlations by a covariance matrix, the dynamics of the model shows collective phenomena such as bursts and avalanches of process failures.
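
A toy version of the coupling idea can make the avalanche behaviour concrete: each process's failure probability rises with the number of its (mutually supportive) neighbours that failed in the previous step, so failures can cascade, and a VaR-style capital charge is read off the tail of the simulated loss distribution. The ring topology, probabilities, and loss units below are invented for illustration and are not the paper's calibration.

```python
# Toy cascading-failure simulation in the spirit of the coupled-process
# model described above; all parameters are illustrative placeholders.
import random

random.seed(1)
n, steps = 10, 200
base_p, coupling = 0.02, 0.15        # baseline failure prob, increment per failed neighbour
neighbours = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # ring of processes

failed = [False] * n
losses = []
for _ in range(steps):
    nxt = []
    for i in range(n):
        p = base_p + coupling * sum(failed[j] for j in neighbours[i])
        nxt.append(random.random() < min(p, 1.0))
    failed = nxt
    losses.append(sum(failed))       # one loss unit per failed process per period

# VaR-style capital charge: the 99th-percentile period loss.
capital = sorted(losses)[int(0.99 * steps)]
```

Unlike a covariance-matrix description, the sequential coupling lets a single failure raise the next period's failure probabilities, which is what produces the bursts and avalanches the abstract refers to.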

  13. Validation of the Japanese disease severity classification and the GAP model in Japanese patients with idiopathic pulmonary fibrosis.

    PubMed

    Kondoh, Shun; Chiba, Hirofumi; Nishikiori, Hirotaka; Umeda, Yasuaki; Kuronuma, Koji; Otsuka, Mitsuo; Yamada, Gen; Ohnishi, Hirofumi; Mori, Mitsuru; Kondoh, Yasuhiro; Taniguchi, Hiroyuki; Homma, Sakae; Takahashi, Hiroki

    2016-09-01

    The clinical course of idiopathic pulmonary fibrosis (IPF) shows great inter-individual differences. It is important to standardize the severity classification to accurately evaluate each patient's prognosis. In Japan, an original severity classification (the Japanese disease severity classification, JSC) is used. In the United States, a new multidimensional index and staging system (the GAP model) has been proposed. The objective of this study was to evaluate the performance of the JSC and GAP models in predicting mortality risk, using a large cohort of Japanese patients with IPF. This is a retrospective cohort study including 326 patients with IPF in the Hokkaido prefecture from 2003 to 2007. We obtained and compared the survival curves for each stage of the GAP and JSC models. For the GAP model, the prognostic value for the mortality risk of Japanese patients was also evaluated. In the JSC, patient prognoses were roughly divided into two groups: mild cases (Stages I and II) and severe cases (Stages III and IV). In the GAP model, there was no significant difference in survival between Stages II and III, and the mortality rates of the patients classified into GAP Stages I and II were underestimated. It is difficult to predict an accurate prognosis of IPF using the JSC and GAP models. A re-examination of the variables from the two models is required, as well as an evaluation of the prognostic value, to revise the severity classification for Japanese patients with IPF. Copyright © 2016 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.

  14. VTE Risk assessment - a prognostic Model: BATER Cohort Study of young women.

    PubMed

    Heinemann, Lothar Aj; Dominh, Thai; Assmann, Anita; Schramm, Wolfgang; Schürmann, Rolf; Hilpert, Jan; Spannagl, Michael

    2005-04-18

    BACKGROUND: Community-based cohort studies are not available that evaluate the predictive power of both clinical and genetic risk factors for venous thromboembolism (VTE). There is, however, a clinical need to forecast the likelihood of future VTE, at least qualitatively, to support decisions about the intensity of diagnostic or preventive measures. MATERIALS AND METHODS: A 10-year observation period of the Bavarian Thromboembolic Risk (BATER) study, a cohort study of 4337 women (18-55 years), was used to develop a predictive model of VTE based on clinical and genetic variables at baseline (1993). The objective was to prepare a probabilistic scheme that discriminates women with virtually no VTE risk from those at higher levels of absolute VTE risk in the foreseeable future. A multivariate analysis determined which variables at baseline were the best predictors of a future VTE event, provided a ranking according to predictive power, and permitted the design of a simple graphic scheme to assess individual VTE risk using five predictor variables. RESULTS: Thirty-four new confirmed VTEs occurred during the observation period of over 32,000 women-years (WYs). A model was developed based mainly on clinical information (personal history of previous VTE, family history of VTE, age, BMI) and one composite genetic risk marker (combining the Factor V Leiden and prothrombin G20210A mutations). Four levels of increasing VTE risk were arbitrarily defined to map the prevalence in the study population: no/low risk of VTE (61.3%), moderate risk (21.1%), high risk (6.0%), and very high risk of future VTE (0.9%). In 10.6% of the population, risk assessment was not possible owing to a lack of observed VTE cases. The average incidence rates of VTE in these four levels were 4.1, 12.3, 47.2, and 170.5 per 10(4) WYs for no/low, moderate, high, and very high risk, respectively.
CONCLUSION: Our prognostic tool - containing clinical information (and, if available, also genetic data) - seems worth testing in medical practice to confirm or refute the positive findings of this study. Our cohort study will be continued to include more VTE cases and to increase the predictive value of the model.
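    As a quick check on the reported figures, the overall incidence implied by 34 events over roughly 32,000 WYs can be computed directly (a minimal sketch; the per-level rates above would be obtained the same way from each level's counts):

```python
def incidence_per_10k_wys(cases, women_years):
    """Incidence rate per 10,000 women-years (WYs)."""
    return cases * 1e4 / women_years

# Overall figures reported in the study: 34 confirmed VTEs over ~32,000 WYs.
print(round(incidence_per_10k_wys(34, 32000), 1))  # 10.6
```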

  15. Assessing risk factors for periodontitis using regression

    NASA Astrophysics Data System (ADS)

    Lobo Pereira, J. A.; Ferreira, Maria Cristina; Oliveira, Teresa

    2013-10-01

    Multivariate statistical analysis is indispensable for assessing the associations and interactions between different factors and the risk of periodontitis. Among other techniques, regression analysis is widely used in healthcare to investigate and model the relationship between variables. In our work we study the impact of socio-demographic, medical and behavioral factors on periodontal health. Using linear and logistic regression models, we assess the relevance, as risk factors for periodontitis, of the following independent variables (IVs): Age, Gender, Diabetic Status, Education, Smoking Status and Plaque Index. The multiple linear regression model was built to evaluate the influence of the IVs on mean Attachment Loss (AL), yielding the regression coefficients along with the p-values from the corresponding significance tests. The classification of a case (individual) adopted in the logistic model was the extent of destruction of periodontal tissues, defined as an Attachment Loss greater than or equal to 4 mm in at least 25% (AL≥4mm/≥25%) of sites surveyed. The association measures include the Odds Ratios together with the corresponding 95% confidence intervals.
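    The odds ratios and confidence intervals mentioned above can be sketched in a few lines; the 2x2 counts below are hypothetical, chosen only to illustrate the calculation with a Wald interval on the log scale:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table (a: exposed cases, b: exposed controls,
    c: unexposed cases, d: unexposed controls) with a Wald 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30 smokers and 20 non-smokers with periodontitis,
# 10 smokers and 40 non-smokers without.
or_, lo, hi = odds_ratio_ci(30, 10, 20, 40)
print(round(or_, 1), round(lo, 2), round(hi, 2))  # 6.0 2.45 14.68
```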

  16. Cost-Effectiveness of Osteoporosis Screening Strategies for Men

    PubMed Central

    Nayak, Smita; Greenspan, Susan L.

    2016-01-01

    Osteoporosis affects many men, with significant morbidity and mortality. However, the best osteoporosis screening strategies for men are unknown. We developed an individual-level state-transition cost-effectiveness model with a lifetime time horizon to identify the cost-effectiveness of different osteoporosis screening strategies for U.S. men involving various screening tests (dual-energy x-ray absorptiometry (DXA); the Osteoporosis Self-Assessment Tool (OST); or a fracture risk assessment strategy using age, femoral neck bone mineral density (BMD), and Vertebral Fracture Assessment (VFA)); screening initiation ages (50, 60, 70, or 80); and repeat screening intervals (5 years or 10 years). In base-case analysis, no screening was a less effective option than all other strategies evaluated; furthermore, no screening was more expensive than all strategies that involved screening with DXA or the OST risk assessment instrument, and thus no screening was “dominated” by screening with DXA or OST at all evaluated screening initiation ages and repeat screening intervals. Screening strategies that most frequently appeared as most cost-effective in base-case analysis and one-way sensitivity analyses when assuming willingness-to-pay of $50,000/QALY or $100,000/QALY included screening initiation at age 50 with the fracture risk assessment strategy and repeat screening every 10 years; screening initiation at age 50 with fracture risk assessment and repeat screening every 5 years; and screening initiation at age 50 with DXA and repeat screening every 5 years. In conclusion, expansion of osteoporosis screening for U.S. men to initiate routine screening at age 50 or 60 would be expected to be effective and of good value for improving health outcomes. A fracture risk assessment strategy using variables of age, femoral neck BMD, and VFA is likely to be the most effective of the evaluated strategies within accepted cost-effectiveness parameters. 
DXA and OST are also reasonable screening options, albeit likely slightly less effective than the evaluated fracture risk assessment strategy. PMID:26751984
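    The dominance and willingness-to-pay logic used in this record can be sketched as follows; the (cost, QALY) pairs and strategy labels are hypothetical, not the paper's results:

```python
def net_monetary_benefit(cost, qalys, wtp):
    """NMB = QALYs * willingness-to-pay - cost; the strategy with the
    highest NMB at a given WTP is the most cost-effective."""
    return qalys * wtp - cost

def dominated(strategies):
    """A strategy is dominated if another is at least as cheap and at least
    as effective, and strictly better on one of the two."""
    out = set()
    for name, (c, q) in strategies.items():
        for other, (c2, q2) in strategies.items():
            if other != name and c2 <= c and q2 >= q and (c2 < c or q2 > q):
                out.add(name)
    return out

# Hypothetical (lifetime cost, QALYs) per strategy:
strategies = {"no screening": (1200.0, 14.10),
              "DXA at 50 / 5y": (1100.0, 14.25),
              "FRA at 50 / 10y": (1150.0, 14.30)}
print(dominated(strategies))  # {'no screening'}
best = max(strategies, key=lambda s: net_monetary_benefit(*strategies[s], wtp=50000))
print(best)  # 'FRA at 50 / 10y'
```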

  17. Application of remote sensing data and GIS for landslide risk assessment as an environmental threat to Izmir city (west Turkey).

    PubMed

    Akgun, Aykut; Kıncal, Cem; Pradhan, Biswajeet

    2012-09-01

    In this study, a landslide risk assessment for Izmir city (west Turkey) was carried out, and the environmental effects of landslides on further urban development were evaluated using geographical information systems and remote sensing techniques. For this purpose, two different data groups, namely conditioning and triggering data, were produced. Using conditioning data such as lithology, slope gradient, slope aspect, distance from roads, distance from faults and distance from drainage lines, a landslide susceptibility model was constructed with a logistic regression modelling approach. The accuracy of the susceptibility map was assessed by the area under the curve (AUC) approach, and an AUC value of 0.810 was obtained, indicating that the map is successful. Because the study area is located in an active seismic region, earthquake data were considered the primary triggering factor contributing to landslide occurrence. In addition, precipitation data were taken into account as a secondary triggering factor. Combining the susceptibility data and triggering factors, a landslide hazard index was obtained. Furthermore, using the Aster data, a land-cover map was produced with an overall kappa value of 0.94. From this map, settlement areas were extracted, and these extracted data were treated as the elements at risk in the study area. Next, a vulnerability index was created from these data. Finally, the hazard index and the vulnerability index were combined to obtain a landslide risk map for Izmir city. Based on this final risk map, it was observed that the southern and northern parts of Izmir Bay in particular, where urbanization is dense, are threatened by future landslides. This result can be used for preliminary land-use planning by local governmental authorities.
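    The AUC used above to validate the susceptibility map can be computed directly from labelled cells via the rank (Mann-Whitney) formulation; a minimal sketch with toy scores:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen positive (landslide) cell scores
    higher than a randomly chosen negative one (ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy susceptibility scores for 4 landslide and 4 non-landslide cells:
print(auc([1, 1, 1, 1, 0, 0, 0, 0],
          [0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1]))  # 0.875
```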

  18. 12 CFR Appendix C to Part 325 - Risk-Based Capital for State Non-Member Banks: Market Risk

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... commodity prices. (2) Specific risk means changes in the market value of specific positions due to factors... its risk measurement and risk management systems at least annually. (c) Market risk factors. The bank's internal model must use risk factors sufficient to measure the market risk inherent in all covered...

  19. Approaches for the Application of Physiologically Based ...

    EPA Pesticide Factsheets

    EPA released the final report, Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment, as announced in a September 22, 2006 Federal Register Notice. The report addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications.

  20. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through the development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop the QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q(2) of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from the QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
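    The cross-validated Q(2) reported above is a standard fit statistic; a minimal sketch of its definition (toy log10 toxicity values, not the study's data):

```python
def q_squared(y_true, y_cv_pred):
    """Cross-validated Q2 = 1 - PRESS/TSS, where PRESS sums the squared
    cross-validation prediction errors and TSS is taken about the mean of y."""
    mean_y = sum(y_true) / len(y_true)
    press = sum((y - p) ** 2 for y, p in zip(y_true, y_cv_pred))
    tss = sum((y - mean_y) ** 2 for y in y_true)
    return 1 - press / tss

# Toy log10 toxicity values and their CV predictions:
print(round(q_squared([1.0, 2.0, 3.0, 4.0], [1.4, 1.8, 3.3, 3.7]), 3))  # 0.924
```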

  1. Evaluation of different radon guideline values based on characterization of ecological risk and visualization of lung cancer mortality trends in British Columbia, Canada.

    PubMed

    Branion-Calles, Michael C; Nelson, Trisalyn A; Henderson, Sarah B

    2015-11-19

    There is no safe concentration of radon gas, but guideline values provide threshold concentrations that are used to map areas at higher risk. These values vary between regions, countries, and organizations, which can lead to differential classification of risk. For example, the World Health Organization suggests a 100 Bq m(-3) value, while Health Canada recommends 200 Bq m(-3). Our objective was to describe how different thresholds characterized ecological radon risk and their visual association with lung cancer mortality trends in British Columbia, Canada. Eight threshold values between 50 and 600 Bq m(-3) were identified, and classes of radon vulnerability were defined based on whether the observed 95(th) percentile radon concentration was above or below each value. A balanced random forest algorithm was used to model vulnerability, and the results were mapped. We compared high-vulnerability areas, their estimated populations, and differences in lung cancer mortality trends stratified by smoking prevalence and sex. Classification accuracy improved as the threshold concentration decreased and the area classified as high vulnerability increased. The majority of the population lived within areas of lower vulnerability regardless of the threshold value. Thresholds as low as 50 Bq m(-3) were associated with higher lung cancer mortality, even in areas with low smoking prevalence. Temporal trends in lung cancer mortality were increasing for women while decreasing for men. Radon contributes to lung cancer in British Columbia. The results of the study contribute evidence supporting the use of a reference level lower than the current guideline of 200 Bq m(-3) for the province.
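    The vulnerability labelling step described above reduces to a percentile comparison; a minimal sketch with hypothetical readings (the paper's exact percentile estimator is not specified, so a nearest-rank version is assumed):

```python
def percentile(values, p):
    """Nearest-rank percentile (an assumed estimator for illustration)."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

def vulnerability_class(radon_bq_m3, threshold):
    """'high' if the area's 95th-percentile radon concentration exceeds
    the guideline threshold, else 'low'."""
    return "high" if percentile(radon_bq_m3, 95) > threshold else "low"

area = [20, 35, 40, 55, 60, 75, 90, 110, 130, 210]  # hypothetical Bq m(-3) readings
print(vulnerability_class(area, 200))  # 'high'
print(vulnerability_class(area, 600))  # 'low'
```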

  2. A Risk-Based Ecohydrological Approach to Assessing Environmental Flow Regimes

    NASA Astrophysics Data System (ADS)

    Mcgregor, Glenn B.; Marshall, Jonathan C.; Lobegeiger, Jaye S.; Holloway, Dean; Menke, Norbert; Coysh, Julie

    2018-03-01

    For several decades there has been recognition that water resource development alters river flow regimes and impacts ecosystem values. Determining strategies to protect or restore flow regimes to achieve ecological outcomes is a focus of water policy and legislation in many parts of the world. However, consideration of existing environmental flow assessment approaches for application in Queensland identified deficiencies precluding their adoption. Firstly, in managing flows and using ecosystem condition as an indicator of effectiveness, many approaches ignore the fact that river ecosystems are subjected to threatening processes other than flow regime alteration. Secondly, many focus on providing flows for responses without considering how often they are necessary to sustain ecological values in the long term. Finally, few consider requirements at spatial scales relevant to the desired outcomes, with frequent focus on individual places rather than the regions supporting sustainability. Consequently, we developed a risk-based ecohydrological approach that identifies ecosystem values linked to desired ecological outcomes, is sensitive to flow alteration and uses indicators of broader ecosystem requirements. Monitoring and research are undertaken to quantify flow dependencies, and ecological modelling is used to quantify flow-related ecological responses over an historical flow period. The relative risk from different flow management scenarios can then be evaluated at relevant spatial scales. This overcomes the deficiencies identified above and provides a robust and useful foundation upon which to build the information needed to support water planning decisions. Application of the risk assessment approach is illustrated here by two case studies.

  3. A Formative Evaluation of the Children, Youth, and Families at Risk Coaching Model

    ERIC Educational Resources Information Center

    Olson, Jonathan R.; Smith, Burgess; Hawkey, Kyle R.; Perkins, Daniel F.; Borden, Lynne M.

    2016-01-01

    In this article, we describe the results of a formative evaluation of a coaching model designed to support recipients of funding through the Children, Youth, and Families at Risk (CYFAR) initiative. Results indicate that CYFAR coaches draw from a variety of types of coaching and that CYFAR principal investigators (PIs) are generally satisfied with…

  4. An emission-weighted proximity model for air pollution exposure assessment.

    PubMed

    Zou, Bin; Wilson, J Gaines; Zhan, F Benjamin; Zeng, Yongnian

    2009-08-15

    Among the most common spatial models for estimating personal exposure are Traditional Proximity Models (TPMs). Though TPMs are straightforward to configure and interpret, they are prone to extensive errors in exposure estimates and do not provide prospective estimates. To resolve these inherent problems, we introduce here a novel Emission-Weighted Proximity Model (EWPM), which takes into consideration the emissions from all sources potentially influencing the receptors. EWPM performance was evaluated by comparing the normalized exposure risk values of sulfur dioxide (SO(2)) calculated by EWPM with those calculated by TPM and with monitored observations over a one-year period in two large Texas counties. To investigate whether the limitations of TPM in predicting potential exposure risk without recorded incidence can be overcome, we also introduce a hybrid framework, a 'Geo-statistical EWPM': a synthesis of Ordinary Kriging geo-statistical interpolation and EWPM. The prediction results are presented as two potential exposure risk prediction maps, whose performance in predicting individual SO(2) exposure risk was validated with 10 virtual cases in prospective exposure scenarios. Risk values from EWPM agreed more closely with the observed concentrations than those from TPM. Over the entire study area, the mean SO(2) exposure risk from EWPM was higher relative to TPM (1.00 vs. 0.91). The mean bias of the exposure risk values of the 10 virtual cases between EWPM and 'Geo-statistical EWPM' was much smaller than that between TPM and 'Geo-statistical TPM' (5.12 vs. 24.63). EWPM appears to portray individual exposure more accurately than TPM. The 'Geo-statistical EWPM' effectively augments the role of the standard proximity model and makes it possible to predict individual risk in future exposure scenarios resulting in adverse health effects from environmental pollution.
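    The contrast between the two models can be made concrete with a minimal sketch (hypothetical source locations and emission rates; the actual EWPM weighting scheme may differ in detail):

```python
import math

def tpm_risk(receptor, sources):
    """Traditional Proximity Model: risk ~ inverse distance to the
    nearest source, ignoring how much each source emits."""
    return 1.0 / min(math.dist(receptor, s["xy"]) for s in sources)

def ewpm_risk(receptor, sources):
    """Emission-weighted sketch: every source contributes, weighted by
    its emission rate and inversely by its distance to the receptor."""
    return sum(s["emission"] / math.dist(receptor, s["xy"]) for s in sources)

# Two hypothetical SO2 sources; the nearer one emits far less.
sources = [{"xy": (1.0, 0.0), "emission": 10.0},
           {"xy": (4.0, 0.0), "emission": 1000.0}]
receptor = (0.0, 0.0)
print(tpm_risk(receptor, sources))   # 1.0: driven by the weak nearby source
print(ewpm_risk(receptor, sources))  # 260.0: dominated by the strong distant source
```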

  5. Feasibility Study of Low-Cost Image-Based Heritage Documentation in Nepal

    NASA Astrophysics Data System (ADS)

    Dhonju, H. K.; Xiao, W.; Sarhosis, V.; Mills, J. P.; Wilkinson, S.; Wang, Z.; Thapa, L.; Panday, U. S.

    2017-02-01

    Cultural heritage structural documentation is of great importance in terms of historical preservation, tourism, educational and spiritual values. Cultural heritage across the world, and in Nepal in particular, is at risk from various natural hazards (e.g. earthquakes, flooding, rainfall etc), poor maintenance and preservation, and even human destruction. This paper evaluates the feasibility of low-cost photogrammetric modelling cultural heritage sites, and explores the practicality of using photogrammetry in Nepal. The full pipeline of 3D modelling for heritage documentation and conservation, including visualisation, reconstruction, and structure analysis, is proposed. In addition, crowdsourcing is discussed as a method of data collection of growing prominence.

  6. Protecting groundwater resources at biosolids recycling sites.

    PubMed

    McFarland, Michael J; Kumarasamy, Karthik; Brobst, Robert B; Hais, Alan; Schmitz, Mark D

    2013-01-01

    In developing the national biosolids recycling rule (Title 40 of the Code of Federal Regulations Part 503, or Part 503), the USEPA conducted deterministic risk assessments whose results indicated that the probability of groundwater impairment associated with biosolids recycling was insignificant. Unfortunately, the computational capabilities available for performing risk assessments of pollutant fate and transport at that time were limited. Using recent advances in USEPA risk assessment methodology, the present study evaluates whether the current national biosolids pollutant limits remain protective of groundwater quality. To take advantage of new risk assessment approaches, a computer-based groundwater risk characterization screening tool (RCST) was developed using USEPA's Multimedia, Multi-pathway, Multi-receptor Exposure and Risk Assessment program. The RCST, which generates a noncarcinogenic human health risk estimate (i.e., a hazard quotient [HQ] value), has the ability to conduct screening-level risk characterizations. The regulated heavy metals modeled in this study were As, Cd, Ni, Se, and Zn. Results from RCST application to biosolids recycling sites located in Yakima County, Washington, indicated that biosolids could be recycled at rates as high as 90 Mg ha(-1) with no negative human health effects associated with groundwater consumption. Only under unrealistically high biosolids land application rates were public health risks characterized as significant (HQ ≥ 1.0). For example, by increasing the biosolids application rate and pollutant concentrations to 900 Mg ha(-1) and 10 times the regulatory limit, respectively, the HQ values varied from 1.4 (Zn) to 324.0 (Se). Since promulgation of Part 503, no verifiable cases of groundwater contamination by regulated biosolids pollutants have been reported. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
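    The HQ screening criterion reduces to a ratio of estimated dose to reference dose; a minimal sketch (the doses below are hypothetical, and the selenium oral reference dose of 0.005 mg/kg-day is our assumption from EPA IRIS, not a value given in this record):

```python
def hazard_quotient(daily_dose, reference_dose):
    """Noncarcinogenic hazard quotient: HQ = exposure dose / reference dose
    (both in mg/kg-day). HQ >= 1.0 flags a potentially significant risk."""
    return daily_dose / reference_dose

RFD_SE = 0.005  # assumed oral RfD for selenium, mg/kg-day (not from this record)
print(hazard_quotient(0.002, RFD_SE) < 1.0)   # True: below the level of concern
print(hazard_quotient(0.02, RFD_SE) >= 1.0)   # True: flagged as significant
```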

  7. A comparison of imputation techniques for handling missing predictor values in a risk model with a binary outcome.

    PubMed

    Ambler, Gareth; Omar, Rumana Z; Royston, Patrick

    2007-06-01

    Risk models that aim to predict the future course and outcome of disease processes are increasingly used in health research, and it is important that they are accurate and reliable. Most of these risk models are fitted using routinely collected data in hospitals or general practices. Clinical outcomes such as short-term mortality will be near-complete, but many of the predictors may have missing values. A common approach to dealing with this is to perform a complete-case analysis. However, this may lead to overfitted models and biased estimates if entire patient subgroups are excluded. The aim of this paper is to investigate a number of methods for imputing missing data to evaluate their effect on risk model estimation and the reliability of the predictions. Multiple imputation methods, including hotdecking and multiple imputation by chained equations (MICE), were investigated along with several single imputation methods. A large national cardiac surgery database was used to create simulated yet realistic datasets. The results suggest that complete case analysis may produce unreliable risk predictions and should be avoided. Conditional mean imputation performed well in our scenario, but may not be appropriate if using variable selection methods. MICE was amongst the best performing multiple imputation methods with regards to the quality of the predictions. Additionally, it produced the least biased estimates, with good coverage, and hence is recommended for use in practice.
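    The contrast between complete-case analysis and imputation can be shown on a toy table (None marks a missing predictor; unconditional mean imputation stands in here for the simpler single-imputation methods, while MICE instead cycles regression models over the incomplete columns):

```python
def complete_case(rows):
    """Drop every record with any missing predictor (None)."""
    return [r for r in rows if None not in r]

def mean_impute(rows):
    """Unconditional mean imputation: replace each missing value with its
    column mean over the observed cases (a single-imputation baseline)."""
    cols = list(zip(*rows))
    means = [sum(v for v in c if v is not None) / sum(v is not None for v in c)
             for c in cols]
    return [[means[j] if v is None else v for j, v in enumerate(r)]
            for r in rows]

# Toy predictors (age, systolic BP) with two incomplete records:
rows = [[70, 120.0], [60, None], [None, 140.0], [80, 160.0]]
print(complete_case(rows))  # [[70, 120.0], [80, 160.0]] -- half the data discarded
print(mean_impute(rows))    # all four records retained
```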

  8. Reliability of IOTA score and ADNEX model in the screening of ovarian malignancy in postmenopausal women.

    PubMed

    Nohuz, Erdogan; De Simone, Luisa; Chêne, Gautier

    2018-04-28

    The IOTA (International Ovarian Tumor Analysis) group has developed the ADNEX (Assessment of Different NEoplasias in the adneXa) model to predict the risk that an ovarian mass is benign, borderline or malignant. This study aimed to test the reliability of these risk prediction models to improve the performance of pelvic ultrasound and discriminate between benign and malignant cysts. Postmenopausal women with an adnexal mass (including ovarian, para-ovarian and tubal) who underwent a standardized ultrasound examination before surgery were included. Prospectively and retrospectively collected data and ultrasound appearances of the tumors were described using the terms and definitions of the IOTA group, tested in accordance with the ADNEX model, and compared with the final histological diagnosis. Of the 107 menopausal patients recruited between 2011 and 2016, 14 were excluded (incomplete inclusion criteria). Thus, 93 patients constituted the cohort, of whom 89 had benign cysts (83 ovarian and 6 tubal or para-ovarian cysts), 1 had a borderline tumor and 3 had invasive ovarian cancers (1 early stage, 1 advanced stage and 1 metastatic tumor in the ovary). The overall prevalence of malignancy was 4.3%. Every benign ovarian cyst was classified as probably benign by the IOTA score, which also showed high specificity: all lesions rated probably malignant proved malignant on histological examination. The limitation of this score was the substantial rate of unclassified or undetermined cysts. However, the malignancy risks calculated by the ADNEX model identified all malignancies. Thus, the combination of the two methods of analysis showed sensitivity and specificity of 100% and 98%, respectively. Evaluation of malignancy risk by these 2 tests yielded a negative predictive value of 100% (there were no false negatives) and a positive predictive value of 80%.
On the basis of our findings, the IOTA classification and the ADNEX multimodal algorithm, used as risk prediction models, can improve the performance of pelvic ultrasound and discriminate between benign and malignant cysts in postmenopausal women, especially for undetermined lesions. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
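    The reported diagnostic figures can be reproduced from a confusion table; the counts below are our reconstruction consistent with the reported rates (4 malignancies among 93 women, no false negatives, one false positive), not a table from the paper:

```python
def diagnostics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion counts."""
    return {"sens": tp / (tp + fn), "spec": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

# Reconstructed counts: 4 true positives, 1 false positive, 88 true negatives.
m = diagnostics(tp=4, fp=1, tn=88, fn=0)
print({k: round(v, 2) for k, v in m.items()})  # sens 1.0, spec 0.99, ppv 0.8, npv 1.0
```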

  9. Analyzing seasonal patterns of wildfire exposure factors in Sardinia, Italy.

    PubMed

    Salis, Michele; Ager, Alan A; Alcasena, Fermin J; Arca, Bachisio; Finney, Mark A; Pellizzaro, Grazia; Spano, Donatella

    2015-01-01

    In this paper, we applied landscape-scale wildfire simulation modeling to explore the spatiotemporal patterns of wildfire likelihood and intensity on the island of Sardinia (Italy). We also performed wildfire exposure analysis for selected highly valued resources on the island to identify areas characterized by high risk. We observed substantial variation in burn probability, fire size, and flame length among time periods within the fire season, which starts in early June and ends in late September. Peak burn probability and flame length were observed in late July. We found that patterns of wildfire likelihood and intensity were mainly related to spatiotemporal variation in ignition locations, fuel moisture, and wind vectors. Our modeling approach allowed us to consider the effects of historical patterns of winds, ignition locations, and live and dead fuel moisture on fire exposure factors. The proposed methodology can be useful for analyzing potential wildfire risk and effects at the landscape scale, evaluating historical changes and future trends in wildfire exposure, and addressing and informing fuel management and risk mitigation issues.

  10. Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks.

    PubMed

    Cox, Louis Anthony Tony

    2015-10-01

    Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning, contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies using regulation of air pollutants with uncertain health effects as an example. © 2015 Society for Risk Analysis.
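    The exploration/exploitation balance invoked above can be illustrated with the simplest reinforcement-learning device, an epsilon-greedy bandit; the "policy options" and reward numbers are hypothetical, not drawn from the air-pollution example:

```python
import random

def epsilon_greedy(true_means, steps=10000, eps=0.1, seed=0):
    """Epsilon-greedy bandit: with probability eps try a random option
    (explore), otherwise take the option with the best estimated reward
    so far (exploit). Returns how often each option was tried."""
    rng = random.Random(seed)
    est = [0.0] * len(true_means)   # running reward estimates
    n = [0] * len(true_means)       # trial counts
    for _ in range(steps):
        if rng.random() < eps:
            a = rng.randrange(len(true_means))                    # explore
        else:
            a = max(range(len(true_means)), key=est.__getitem__)  # exploit
        r = rng.gauss(true_means[a], 0.5)   # noisy observed benefit
        n[a] += 1
        est[a] += (r - est[a]) / n[a]       # incremental mean update
    return n

pulls = epsilon_greedy([0.2, 0.5, 0.9])  # option 2 has the highest true benefit
print(pulls.index(max(pulls)))           # 2: most trials go to the best option
```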

  11. Distributions of observed death tolls govern sensitivity to human fatalities

    PubMed Central

    Olivola, Christopher Y.; Sagara, Namika

    2009-01-01

    How we react to humanitarian crises, epidemics, and other tragic events involving the loss of human lives depends largely on the extent to which we are moved by the size of their associated death tolls. Many studies have demonstrated that people generally exhibit a diminishing sensitivity to the number of human fatalities and, equivalently, a preference for risky (vs. sure) alternatives in decisions under risk involving human losses. However, the reason for this tendency remains unknown. Here we show that the distributions of event-related death tolls that people observe govern their evaluations of, and risk preferences concerning, human fatalities. In particular, we show that our diminishing sensitivity to human fatalities follows from the fact that these death tolls are approximately power-law distributed. We further show that, by manipulating the distribution of mortality-related events that people observe, we can alter their risk preferences in decisions involving fatalities. Finally, we show that the tendency to be risk-seeking in mortality-related decisions is lower in countries in which high-mortality events are more frequently observed. Our results support a model of magnitude evaluation based on memory sampling and relative judgment. This model departs from the utility-based approaches typically encountered in psychology and economics in that it does not rely on stable, underlying value representations to explain valuation and choice, or on choice behavior to derive value functions. Instead, preferences concerning human fatalities emerge spontaneously from the distributions of sampled events and the relative nature of the evaluation process. PMID:20018778

  12. Economic risk assessment of drought impacts on irrigated agriculture

    NASA Astrophysics Data System (ADS)

    Lopez-Nicolas, A.; Pulido-Velazquez, M.; Macian-Sorribes, H.

    2017-07-01

    In this paper we present an innovative framework for an economic risk analysis of drought impacts on irrigated agriculture. It consists of the integration of three components: stochastic time series modelling for prediction of inflows and future reservoir storages at the beginning of the irrigation season; statistical regression for the evaluation of water deliveries based on projected inflows and storages; and econometric modelling for economic assessment of the production value of agriculture based on irrigation water deliveries and crop prices. The effect of price volatility can therefore be isolated from the losses due to water scarcity when assessing drought impacts. Monte Carlo simulations are applied to generate probability functions of inflows, which are translated into probabilities of storages, deliveries, and, finally, the production value of agriculture. The framework also allows the assessment of the value of mitigation measures as a reduction of economic losses during droughts. The approach was applied to the Jucar river basin, a complex system affected by multiannual severe droughts, with irrigated agriculture as the main consumptive demand. Probability distributions of deliveries and production value were obtained for each irrigation season. In the majority of the irrigation districts, drought causes a significant economic impact. Increases in crop prices can partially offset the losses from reduced production due to water scarcity in some districts. Emergency wells contribute to mitigating drought impacts on the Jucar river system.
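    The chained Monte Carlo propagation can be sketched end-to-end in a few lines; every distribution, cap, and coefficient below is hypothetical, standing in for the fitted time-series, regression, and econometric components:

```python
import random

def simulate_season(n=5000, seed=1):
    """Monte Carlo sketch of the chained framework: sampled inflows ->
    deliveries (a capped linear rule stands in for the regression) ->
    production value (hypothetical econometric coefficients)."""
    rng = random.Random(seed)
    values = []
    for _ in range(n):
        inflow = max(0.0, rng.gauss(500.0, 150.0))  # hm3, toy inflow model
        delivery = min(400.0, 0.8 * inflow)         # demand cap at 400 hm3
        crop_price = rng.gauss(1.0, 0.1)            # price volatility, separate driver
        values.append(2.0 * delivery * crop_price)  # production value, EUR millions
    values.sort()
    return values[int(0.05 * n)], values[n // 2]    # 5th percentile vs median

p5, median = simulate_season()
print(p5 < median)  # True: the drought tail produces markedly lower value
```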

  13. Prediction impact curve is a new measure integrating intervention effects in the evaluation of risk models.

    PubMed

    Campbell, William; Ganna, Andrea; Ingelsson, Erik; Janssens, A Cecile J W

    2016-01-01

    We propose a new measure of assessing the performance of risk models, the area under the prediction impact curve (auPIC), which quantifies the performance of risk models in terms of their average health impact in the population. Using simulated data, we explain how the prediction impact curve (PIC) estimates the percentage of events prevented when a risk model is used to assign high-risk individuals to an intervention. We apply the PIC to the Atherosclerosis Risk in Communities (ARIC) Study to illustrate its application toward prevention of coronary heart disease. We estimated that if the ARIC cohort received statins at baseline, 5% of events would be prevented when the risk model was evaluated at a cutoff threshold of 20% predicted risk compared to 1% when individuals were assigned to the intervention without the use of a model. By calculating the auPIC, we estimated that an average of 15% of events would be prevented when considering performance across the entire interval. We conclude that the PIC is a clinically meaningful measure for quantifying the expected health impact of risk models that supplements existing measures of model performance. Copyright © 2016 Elsevier Inc. All rights reserved.
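
The prediction impact curve idea can be illustrated on simulated data: assume an intervention that removes a fixed fraction of events among those treated (the `efficacy` value and the simulated risk distribution below are hypothetical), sweep the risk cutoff, and average over cutoffs for an auPIC-style summary:

```python
import random

rng = random.Random(0)
n, efficacy = 20_000, 0.3  # assumed relative risk reduction of the intervention

people = []
for _ in range(n):
    risk = rng.betavariate(2, 18)   # simulated predicted risk, mean ~0.10
    event = rng.random() < risk     # event that occurs without intervention
    people.append((risk, event))

total_events = sum(e for _, e in people)

def pct_events_prevented(cutoff):
    """Events prevented (as % of all events) when everyone at or above
    `cutoff` predicted risk receives the intervention."""
    prevented = sum(efficacy for r, e in people if e and r >= cutoff)
    return 100.0 * prevented / total_events

# PIC: prevented percentage across cutoffs; auPIC-style summary as its mean.
cutoffs = [i / 100 for i in range(0, 51)]
pic = [pct_events_prevented(c) for c in cutoffs]
au_pic = sum(pic) / len(pic)
```

At cutoff 0 everyone is treated, so the curve starts at the intervention efficacy and falls as the cutoff excludes more events, mirroring the 20%-cutoff versus whole-interval comparison in the abstract.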

  14. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    NASA Astrophysics Data System (ADS)

    Wu, Yirong; Liu, Jie; Munoz del Rio, Alejandro; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-03-01

    Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called "radiogenomics." Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data come from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms, SNPs), and mammographic features in the Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail + SNP, and (3) Gail + SNP + BI-RADS. We then generated ROC curves for the three models. After assigning utility values to each category of findings (true negative, false positive, false negative, and true positive), we sought the operating points on the ROC curves that achieve the maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar's test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under the ROC curve (AUC) and MEU. SNPs improved the sensitivity of the Gail model (0.276 vs. 0.147) but reduced its specificity (0.855 vs. 0.912). When mammographic features were added, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our decision framework comprising utility analysis and McNemar's test provides a novel way to evaluate prediction models in the realm of radiogenomics.
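
Choosing a maximum-expected-utility operating point on an ROC curve reduces to weighting each point's error rates by prevalence and the assigned utilities. The utilities, prevalence, and toy ROC points below are illustrative, not the study's values:

```python
prevalence = 0.05                                    # assumed disease prevalence
U = {"TP": 0.9, "FN": 0.0, "TN": 1.0, "FP": 0.95}    # hypothetical utilities

# Toy ROC curve as (false-positive rate, true-positive rate) operating points.
roc = [(0.0, 0.0), (0.05, 0.30), (0.15, 0.55), (0.30, 0.75),
       (0.50, 0.90), (1.0, 1.0)]

def expected_utility(fpr, tpr):
    # Expected utility = sum over the four outcome categories of
    # (probability of category) * (utility of category).
    return (prevalence * (tpr * U["TP"] + (1 - tpr) * U["FN"])
            + (1 - prevalence) * ((1 - fpr) * U["TN"] + fpr * U["FP"]))

meu_point = max(roc, key=lambda pt: expected_utility(*pt))
meu = expected_utility(*meu_point)
```

Different utility assignments shift the winning operating point along the curve, which is why the framework pairs the utility analysis with a significance test (McNemar's) when comparing models.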

  15. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  16. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great practical interest: it provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose identifying a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and assessing the flood hazard of the analysed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis, so hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modelling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  17. Impact of risk factors on cardiovascular risk: a perspective on risk estimation in a Swiss population.

    PubMed

    Chrubasik, Sigrun A; Chrubasik, Cosima A; Piper, Jörg; Schulte-Moenting, Juergen; Erne, Paul

    2015-01-01

    In models and scores for estimating cardiovascular risk (CVR), the relative weightings given to blood pressure measurements (BPMs), and biometric and laboratory variables are such that even large differences in blood pressure lead to rather low differences in the resulting total risk when compared with other concurrent risk factors. We evaluated this phenomenon based on the PROCAM score, using BPMs made by volunteer subjects at home (HBPMs) and automated ambulatory BPMs (ABPMs) carried out in the same subjects. A total of 153 volunteers provided the data needed to estimate their CVR by means of the PROCAM formula. Differences (deltaCVR) between the risk estimated by entering the ABPM and that estimated with the HBPM were compared with the differences (deltaBPM) between the ABPM and the corresponding HBPM. In addition to the median values (= second quartile), the first and third quartiles of blood pressure profiles were also considered. PROCAM risk values were converted to European Society of Cardiology (ESC) risk values and all participants were assigned to the risk groups low, medium and high. Based on the PROCAM score, 132 participants had a low risk for suffering myocardial infarction, 16 a medium risk and 5 a high risk. The calculated ESC scores classified 125 participants into the low-risk group, 26 into the medium- and 2 into the high-risk group for death from a cardiovascular event. Mean ABPM tended to be higher than mean HBPM. Use of mean systolic ABPM or HBPM in the PROCAM formula had no major impact on the risk level. Our observations are in agreement with the rather low weighting of blood pressure as risk determinant in the PROCAM score. BPMs assessed with different methods had relatively little impact on estimation of cardiovascular risk in the given context of other important determinants. The risk calculations in our unselected population reflect the given classification of Switzerland as a so-called cardiovascular "low risk country".

  18. Impact of Lifestyle and Metformin Interventions on the Risk of Progression to Diabetes and Regression to Normal Glucose Regulation in Overweight or Obese People With Impaired Glucose Regulation.

    PubMed

    Herman, William H; Pan, Qing; Edelstein, Sharon L; Mather, Kieren J; Perreault, Leigh; Barrett-Connor, Elizabeth; Dabelea, Dana M; Horton, Edward; Kahn, Steven E; Knowler, William C; Lorenzo, Carlos; Pi-Sunyer, Xavier; Venditti, Elizabeth; Ye, Wen

    2017-12-01

    Both lifestyle and metformin interventions can delay or prevent progression to type 2 diabetes mellitus (DM) in people with impaired glucose regulation, but there is considerable interindividual variation in the likelihood of receiving benefit. Understanding an individual's 3-year risk of progressing to DM and regressing to normal glucose regulation (NGR) might facilitate benefit-based tailored treatment. We used the values of 19 clinical variables measured at the Diabetes Prevention Program (DPP) baseline evaluation and Cox proportional hazards models to assess the 3-year risk of progression to DM and regression to NGR separately for DPP lifestyle, metformin, and placebo participants who were adherent to the interventions. Lifestyle participants who lost ≥5% of their initial body weight at 6 months and metformin and placebo participants who reported taking ≥80% of their prescribed medication at the 6-month follow-up were defined as adherent. Eleven of 19 clinical variables measured at baseline predicted progression to DM, and 6 of 19 predicted regression to NGR. Compared with adherent placebo participants at lowest risk of developing diabetes, participants at lowest risk of developing diabetes who adhered to a lifestyle intervention had an 8% absolute risk reduction (ARR) of developing diabetes and a 35% greater absolute likelihood of reverting to NGR. Participants at lowest risk of developing diabetes who adhered to a metformin intervention had no reduction in their risk of developing diabetes and a 17% greater absolute likelihood of reverting to NGR. Participants at highest risk of developing DM who adhered to a lifestyle intervention had a 39% ARR of developing diabetes and a 24% greater absolute likelihood of reverting to NGR, whereas those who adhered to the metformin intervention had a 25% ARR of developing diabetes and an 11% greater absolute likelihood of reverting to NGR. 
Unlike our previous analyses that sought to explain population risk, these analyses evaluate individual risk. The models can be used by overweight and obese adults with fasting hyperglycemia and impaired glucose tolerance to facilitate personalized decision-making by allowing them to explicitly weigh the benefits and feasibility of the lifestyle and metformin interventions. © 2017 by the American Diabetes Association.

  19. Simulating wildfire spread behavior between two NASA Active Fire data timeframes

    NASA Astrophysics Data System (ADS)

    Adhikari, B.; Hodza, P.; Xu, C.; Minckley, T. A.

    2017-12-01

    Although NASA's Active Fire dataset is considered valuable for mapping the spatial distribution and extent of wildfires across the world, the data are only available at approximately 12-hour intervals, creating uncertainties and risks associated with fire spread and behavior between two Visible Infrared Imaging Radiometer Suite (VIIRS) data collection timeframes. Our study seeks to close this information gap for the United States by using the latest Active Fire data (collected, for instance, around 0130 hours) as the ignition source and critical input to a wildfire model that uniquely incorporates forecasted and real-time weather conditions to predict the fire perimeter at the next 12-hour reporting time (i.e., around 1330 hours). The model ingests highly dynamic variables such as fuel moisture, temperature, relative humidity, and wind, among others, and drives a Monte Carlo simulation that varies these inputs over their plausible ranges to evaluate the range of possible wildfire behaviors. The Monte Carlo simulation implemented in this model provides a measure of the relative wildfire risk at various locations based on the number of times those sites are intersected by simulated fire perimeters. Model calibration is achieved using data at the next reporting time (i.e., after 12 hours) to enhance predictive quality at further time steps. While initial results indicate that the calibrated model can predict the overall geometry and direction of wildland fire spread, the model seems to over-predict the sizes of most fire perimeters, possibly because of unaccounted-for fire suppression activities. Nonetheless, the results of this study show great promise for aiding wildland fire tracking, fighting, and risk management.
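
As a loose illustration of the perimeter-intersection risk measure, the toy simulation below replaces real fire-spread geometry with a disc of random radius around the ignition point and scores each site by the fraction of simulations whose perimeter covers it; the coordinates, spread distribution, and site names are all invented:

```python
import random

rng = random.Random(7)
ignition = (0.0, 0.0)
sites = {"near": (1.0, 0.5), "far": (6.0, 4.0)}   # hypothetical locations (km)
n_sim = 5_000

hits = {name: 0 for name in sites}
for _ in range(n_sim):
    spread = rng.uniform(0.5, 5.0)  # sampled effective spread distance (km)
    for name, (x, y) in sites.items():
        # Site is "burned" in this run if it lies inside the spread disc.
        if (x - ignition[0]) ** 2 + (y - ignition[1]) ** 2 <= spread ** 2:
            hits[name] += 1

# Relative risk per site = share of simulated perimeters intersecting it.
risk = {name: hits[name] / n_sim for name in sites}
```

In the real model each run would instead sample weather and fuel inputs and grow an irregular perimeter, but the risk readout, counting intersections across runs, is the same idea.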

  20. Robust optimization based upon statistical theory.

    PubMed

    Sobotta, B; Söhn, M; Alber, M

    2010-08-01

    Organ movement is still the biggest challenge in cancer treatment despite advances in online imaging. Due to the resulting geometric uncertainties, the delivered dose cannot be predicted precisely at treatment planning time. Consequently, all associated dose metrics (e.g., EUD and maxDose) are random variables with a patient-specific probability distribution. The method that the authors propose makes these distributions the basis of the optimization and evaluation process. The authors start from a model of motion derived from patient-specific imaging. On a multitude of geometry instances sampled from this model, a dose metric is evaluated. The resulting pdf of this dose metric is termed outcome distribution. The approach optimizes the shape of the outcome distribution based on its mean and variance. This is in contrast to the conventional optimization of a nominal value (e.g., PTV EUD) computed on a single geometry instance. The mean and variance allow for an estimate of the expected treatment outcome along with the residual uncertainty. Besides being applicable to the target, the proposed method also seamlessly includes the organs at risk (OARs). The likelihood that a given value of a metric is reached in the treatment is predicted quantitatively. This information reveals potential hazards that may occur during the course of the treatment, thus helping the expert to find the right balance between the risk of insufficient normal tissue sparing and the risk of insufficient tumor control. By feeding this information to the optimizer, outcome distributions can be obtained where the probability of exceeding a given OAR maximum and that of falling short of a given target goal can be minimized simultaneously. The method is applicable to any source of residual motion uncertainty in treatment delivery. Any model that quantifies organ movement and deformation in terms of probability distributions can be used as basis for the algorithm. 
Thus, it can generate dose distributions that are robust against interfraction and intrafraction motion alike, effectively removing the need for indiscriminate safety margins.
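
The outcome-distribution idea can be sketched as follows: sample many geometry instances from a motion model, evaluate a dose metric on each, and score candidate plans by the mean and variance of the resulting distribution rather than by a single nominal value. The dose and motion models below are toy stand-ins, not clinical models:

```python
import random

rng = random.Random(3)

def dose_metric(plan_margin, shift):
    # Toy coverage metric: full value if the target (displaced by `shift`)
    # stays inside the planned margin, falling off linearly outside it.
    overshoot = max(0.0, abs(shift) - plan_margin)
    return max(0.0, 1.0 - overshoot)

def outcome_distribution(plan_margin, n=4_000):
    # Evaluate the metric on n geometry instances sampled from a toy
    # Gaussian motion model; return mean and variance of the outcome.
    samples = [dose_metric(plan_margin, rng.gauss(0.0, 0.5)) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return mean, var

def score(plan_margin, lam=2.0):
    # Mean-variance objective: reward expected outcome, penalize spread.
    mean, var = outcome_distribution(plan_margin)
    return mean - lam * var

best_margin = max([0.0, 0.5, 1.0, 1.5], key=score)
```

A real implementation would add OAR terms of the opposite sign, so that the probability of exceeding an OAR maximum and of missing the target goal are traded off inside one objective, as the abstract describes.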

  1. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.

    The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.

  2. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE PAGES

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.; ...

    2017-08-25

    The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.

  3. A Probabilistic Model for Cushing's Syndrome Screening in At-Risk Populations: A Prospective Multicenter Study.

    PubMed

    León-Justel, Antonio; Madrazo-Atutxa, Ainara; Alvarez-Rios, Ana I; Infantes-Fontán, Rocio; Garcia-Arnés, Juan A; Lillo-Muñoz, Juan A; Aulinas, Anna; Urgell-Rull, Eulàlia; Boronat, Mauro; Sánchez-de-Abajo, Ana; Fajardo-Montañana, Carmen; Ortuño-Alonso, Mario; Salinas-Vert, Isabel; Granada, Maria L; Cano, David A; Leal-Cerro, Alfonso

    2016-10-01

    Cushing's syndrome (CS) is challenging to diagnose. Increased prevalence of CS in specific patient populations has been reported, but routine screening for CS remains questionable. To decrease diagnostic delay and improve disease outcomes, simple new screening methods for CS in at-risk populations are needed. Our objective was to develop and validate a simple scoring system to predict CS based on clinical signs and an easy-to-use biochemical test. We conducted an observational, prospective, multicenter study of a cohort of 353 patients attending outpatient endocrinology units at referral hospitals. All patients were evaluated with late-night salivary cortisol (LNSC) and a low-dose dexamethasone suppression test for CS; the outcome was diagnosis or exclusion of CS. Twenty-six cases of CS were diagnosed in the cohort. A risk scoring system was developed by logistic regression analysis, and cutoff values were derived from a receiver operating characteristic (ROC) curve. The risk score included clinical signs and symptoms (muscular atrophy, osteoporosis, and dorsocervical fat pad) and LNSC levels. The estimated area under the ROC curve was 0.93, with a sensitivity of 96.2% and a specificity of 82.9%. We developed a risk score to predict CS in an at-risk population. This score may help to identify at-risk patients in non-endocrinological settings such as primary care, but external validation is warranted.
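
A scoring system of this kind can be sketched as a logistic model over clinical signs and LNSC, thresholded at an ROC-derived cutoff. The coefficients, intercept, and cutoff below are invented for illustration; the study's fitted values are not reproduced here:

```python
import math

# Hypothetical per-sign log-odds weights (not the study's coefficients).
coef = {"atrophy": 1.6, "osteoporosis": 1.1, "fat_pad": 1.3}
lnsc_coef, intercept = 0.9, -5.0  # per log(nmol/L) of LNSC; both invented

def predicted_risk(signs, lnsc_nmol_l):
    # Logistic model: intercept + sign weights + LNSC term, squashed to (0,1).
    logit = intercept + lnsc_coef * math.log(lnsc_nmol_l)
    logit += sum(coef[s] for s in signs)
    return 1 / (1 + math.exp(-logit))

low = predicted_risk([], lnsc_nmol_l=2.0)
high = predicted_risk(["atrophy", "fat_pad"], lnsc_nmol_l=25.0)

cutoff = 0.10          # a screening cutoff would be read off the ROC curve
flag_high = high >= cutoff
```

Sweeping the cutoff trades sensitivity against specificity along the ROC curve; the study's reported 96.2%/82.9% pair corresponds to one such chosen operating point.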

  4. Impact of gastrectomy procedural complexity on surgical outcomes and hospital comparisons.

    PubMed

    Mohanty, Sanjay; Paruch, Jennifer; Bilimoria, Karl Y; Cohen, Mark; Strong, Vivian E; Weber, Sharon M

    2015-08-01

    Most risk adjustment approaches adjust for patient comorbidities and the primary procedure. However, procedures done at the same time as the index case may increase operative risk and merit inclusion in adjustment models for fair hospital comparisons. Our objectives were to evaluate the impact of surgical complexity on postoperative outcomes and hospital comparisons in gastric cancer surgery. Patients who underwent gastric resection for cancer were identified from a large clinical dataset. Procedure complexity was characterized using secondary procedure CPT codes and work relative value units (RVUs). Regression models were developed to evaluate the association between complexity variables and outcomes. The impact of complexity adjustment on model performance and hospital comparisons was examined. Among 3,467 patients who underwent gastrectomy for adenocarcinoma, 2,171 operations were distal and 1,296 total. A secondary procedure was reported for 33% of distal gastrectomies and 59% of total gastrectomies. Six of 10 secondary procedures were associated with adverse outcomes. For example, patients who underwent a synchronous bowel resection had a higher risk of mortality (odds ratio [OR], 2.14; 95% CI, 1.07-4.29) and reoperation (OR, 2.09; 95% CI, 1.26-3.47). Model performance was slightly better for nearly all outcomes with complexity adjustment (mortality c-statistics: standard model, 0.853; secondary procedure model, 0.858; RVU model, 0.855). Hospital ranking did not change substantially after complexity adjustment. Surgical complexity variables are associated with adverse outcomes in gastrectomy, but complexity adjustment does not affect hospital rankings appreciably. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Dynamic extreme values modeling and monitoring by means of sea shores water quality biomarkers and valvometry.

    PubMed

    Durrieu, Gilles; Pham, Quang-Khoai; Foltête, Anne-Sophie; Maxime, Valérie; Grama, Ion; Tilly, Véronique Le; Duval, Hélène; Tricot, Jean-Marie; Naceur, Chiraz Ben; Sire, Olivier

    2016-07-01

    Water quality can be evaluated using biomarkers such as the tissular enzymatic activities of endemic species. Measuring the activity of bivalve molluscs at high frequency (valvometry) over a long period is another way to record animal behavior and to evaluate perturbations of water quality in real time. As pollution affects the activity of oysters, we consider valve opening and closing velocities to monitor water quality. We propose to model the huge volume of velocity data collected in the framework of valvometry using a new nonparametric extreme-value statistical model. The objective is to estimate the tail probabilities and the extreme quantiles of the distribution of valve closing velocity. The tail of the distribution function of valve closing velocity is modeled by a Pareto distribution with a time- and threshold-dependent parameter θt,τ beyond a threshold τ, according to the time t of the experiment. Our modeling approach reveals the dependence between the specific activity of two enzymatic biomarkers (glutathione-S-transferase and acetylcholinesterase) and the continuous recording of oyster valve velocity, proving the suitability of this tool for water quality assessment. Thus, valvometry allows real-time in situ analysis of bivalve behavior and appears to be an effective early-warning tool for ecological risk assessment and marine environment monitoring.
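
Tail fitting of this kind can be sketched with the standard maximum-likelihood (Hill-type) estimator for a Pareto tail above a fixed threshold. The study's parameter additionally varies with the time t of the experiment, which this static sketch omits, and the data here are simulated rather than valvometry records:

```python
import math
import random

rng = random.Random(1)
alpha_true, tau = 3.0, 1.0
# Simulated "closing velocities" with an exact Pareto tail above tau.
data = [tau * rng.paretovariate(alpha_true) for _ in range(50_000)]

exceed = [x for x in data if x > tau]
k = len(exceed)
# Hill / MLE estimate of the tail index for a Pareto tail above tau.
alpha_hat = k / sum(math.log(x / tau) for x in exceed)

def tail_prob(x):
    """Estimated P(X > x) for x >= tau."""
    return (k / len(data)) * (tau / x) ** alpha_hat

def extreme_quantile(p):
    """Estimated quantile at small upper-tail probability p."""
    return tau * (k / (len(data) * p)) ** (1 / alpha_hat)
```

The two functions are algebraic inverses of each other, which is what lets the model turn a velocity threshold into an exceedance probability and vice versa for early-warning purposes.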

  6. Evaluation of self-combustion risk in tire derived aggregate fills.

    PubMed

    Arroyo, Marcos; San Martin, Ignacio; Olivella, Sebastian; Saaltink, Maarten W

    2011-01-01

    Lightweight tire derived aggregate (TDA) fills are a proven recycling outlet for waste tires, requiring relatively low-cost waste processing and being competitively priced against other lightweight fill alternatives. However, their value has been marred because several TDA fills self-combusted during the early applications of this technique. An empirical review of these cases led to prescriptive guidelines from the ASTM aimed at avoiding this problem, and this approach has been successful in preventing further incidents of self-combustion. However, at present there is no rational method available to quantify self-combustion risk in TDA fills, so it is not clear which aspects of the ASTM guidelines are essential and which are accessory. This hinders the practical use of TDA fills despite their inherent advantages as lightweight fill. Here, a quantitative approach to self-combustion risk evaluation is developed and illustrated with a parametric analysis of an embankment case, and later particularized to model a reported field self-combustion case. The approach is based on the available experimental observations and incorporates well-tested methodological (ISO corrosion evaluation) and theoretical (finite element analysis of coupled heat and mass flow) tools. The results obtained offer clear insights into the critical aspects of the problem, already allowing some meaningful recommendations for guideline revision. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Making predictions of mangrove deforestation: a comparison of two methods in Kenya.

    PubMed

    Rideout, Alasdair J R; Joshi, Neha P; Viergever, Karin M; Huxham, Mark; Briers, Robert A

    2013-11-01

    Deforestation of mangroves is of global concern given their importance for carbon storage, biogeochemical cycling and the provision of other ecosystem services, but the links between rates of loss and potential drivers or risk factors are rarely evaluated. Here, we identified key drivers of mangrove loss in Kenya and compared two different approaches to predicting risk. Risk factors tested included various possible predictors of anthropogenic deforestation, related to population, suitability for land use change and accessibility. Two approaches were taken to modelling risk; a quantitative statistical approach and a qualitative categorical ranking approach. A quantitative model linking rates of loss to risk factors was constructed based on generalized least squares regression and using mangrove loss data from 1992 to 2000. Population density, soil type and proximity to roads were the most important predictors. In order to validate this model it was used to generate a map of losses of Kenyan mangroves predicted to have occurred between 2000 and 2010. The qualitative categorical model was constructed using data from the same selection of variables, with the coincidence of different risk factors in particular mangrove areas used in an additive manner to create a relative risk index which was then mapped. Quantitative predictions of loss were significantly correlated with the actual loss of mangroves between 2000 and 2010 and the categorical risk index values were also highly correlated with the quantitative predictions. Hence, in this case the relatively simple categorical modelling approach was of similar predictive value to the more complex quantitative model of mangrove deforestation. The advantages and disadvantages of each approach are discussed, and the implications for mangroves are outlined. © 2013 Blackwell Publishing Ltd.

  8. The multiple stressor ecological risk assessment for the mercury-contaminated South River and upper Shenandoah River using the Bayesian network-relative risk model.

    PubMed

    Landis, Wayne G; Ayre, Kimberley K; Johns, Annie F; Summers, Heather M; Stinson, Jonah; Harris, Meagan J; Herring, Carlie E; Markiewicz, April J

    2017-01-01

    We have conducted a regional-scale risk assessment using the Bayesian Network Relative Risk Model (BN-RRM) to calculate the ecological risks to the South River and upper Shenandoah River study area. Four biological endpoints (smallmouth bass, white sucker, Belted Kingfisher, and Carolina Wren) and 4 abiotic endpoints (Fishing River Use, Swimming River Use, Boating River Use, and Water Quality Standards) were included in this risk assessment, based on stakeholder input. Although mercury (Hg) contamination was the original impetus for the site being remediated, other chemical and physical stressors were evaluated. There were 3 primary conclusions from the BN-RRM results. First, risk varies according to location, type and quality of habitat, and exposure to stressors within the landscape. The patterns of risk can be evaluated with reasonable certitude. Second, overall risk to abiotic endpoints was greater than overall risk to biotic endpoints. By including both biotic and abiotic endpoints, we are able to compare risk to endpoints that represent a wide range of stakeholder values. Third, whereas Hg reduction is the regulatory priority for the South River, Hg is not the only stressor driving risk to the endpoints. Ecological and habitat stressors contribute risk to the endpoints and should be considered when managing this site. This research provides the foundation for evaluating the risks of multiple stressors of the South River to a variety of endpoints. From this foundation, tools for the evaluation of management options and for adaptive management have been forged. Integr Environ Assess Manag 2017;13:85-99. © 2016 SETAC.

  9. Meta-analysis reveals PTPN22 1858C/T polymorphism confers susceptibility to rheumatoid arthritis in Caucasian but not in Asian population.

    PubMed

    Nabi, Gowher; Akhter, Naseem; Wahid, Mohd; Bhatia, Kanchan; Mandal, Raju Kumar; Dar, Sajad Ahmad; Jawed, Arshad; Haque, Shafiul

    2016-01-01

    The PTPN22 1858C/T polymorphism is associated with rheumatoid arthritis (RA). However, reports from Asian populations are conflicting and lack consensus. The aim of our study was to evaluate the association between the PTPN22 1858C/T polymorphism and RA in Asian and Caucasian subjects by carrying out a meta-analysis of Asian and Caucasian data. A total of 27,205 RA cases and 27,677 controls were considered in the present meta-analysis, involving eight Asian and 35 Caucasian studies. Pooled odds ratios (ORs) were calculated for the allele, dominant, and recessive genetic models. No statistically significant association was found between the PTPN22 1858C/T polymorphism and risk of RA in the Asian population (allele genetic model: OR = 1.217, 95% confidence interval (CI) = 0.99-1.496, p value 0.061; dominant genetic model: OR = 1.238, 95% CI = 0.982-1.562, p value 0.071; recessive genetic model: OR = 1.964, 95% CI = 0.678-5.693, p value 0.213). A significant association with risk of RA in the Caucasian population was observed, suggesting that the T allele does confer susceptibility to RA in this subgroup (allele genetic model: OR = 1.638, 95% CI = 1.574-1.705, p value < 0.0001; dominant genetic model: OR = 1.67, 95% CI = 1.598-1.745, p value < 0.0001; recessive genetic model: OR = 2.65, 95% CI = 2.273-3.089, p value < 0.0001). The PTPN22 1858C/T polymorphism is not associated with RA risk in Asian populations. However, our meta-analysis confirms that the PTPN22 1858C/T polymorphism is associated with RA susceptibility in Caucasians.
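
The pooled ORs above are the kind of output produced by inverse-variance weighting of study-level log odds ratios. A minimal fixed-effect sketch follows, with made-up study ORs and confidence intervals rather than the meta-analysis data:

```python
import math

# Invented (OR, CI lower, CI upper) triples for three hypothetical studies.
studies = [(1.55, 1.30, 1.85), (1.70, 1.45, 1.99), (1.62, 1.38, 1.90)]

z975 = 1.959964  # standard normal quantile for a 95% CI

log_ors, weights = [], []
for or_, lo, hi in studies:
    # Recover the standard error of log(OR) from the CI width.
    se = (math.log(hi) - math.log(lo)) / (2 * z975)
    log_ors.append(math.log(or_))
    weights.append(1 / se ** 2)                      # inverse-variance weight

pooled_log = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
pooled_or = math.exp(pooled_log)
ci = (math.exp(pooled_log - z975 * pooled_se),
      math.exp(pooled_log + z975 * pooled_se))
```

A random-effects analysis would widen the pooled CI by adding a between-study variance term to each weight; the fixed-effect version above shows only the core pooling arithmetic.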

  10. Risk prediction for chronic kidney disease progression using heterogeneous electronic health record data and time series analysis.

    PubMed

    Perotte, Adler; Ranganath, Rajesh; Hirsch, Jamie S; Blei, David; Elhadad, Noémie

    2015-07-01

    As adoption of electronic health records continues to increase, there is an opportunity to incorporate clinical documentation as well as laboratory values and demographics into risk prediction modeling. The authors develop a risk prediction model for chronic kidney disease (CKD) progression from stage III to stage IV that includes longitudinal data and features drawn from clinical documentation. The study cohort consisted of 2908 primary-care clinic patients who had at least three visits prior to January 1, 2013 and developed CKD stage III during their documented history. Development and validation cohorts were randomly selected from this cohort, and the study datasets included longitudinal inpatient and outpatient data from these populations. Time series analysis (Kalman filter) and survival analysis (Cox proportional hazards) were combined to produce a range of risk models. These models were evaluated using concordance, a discriminatory statistic. A risk model incorporating longitudinal data on clinical documentation and laboratory test results (concordance 0.849) predicts progression from stage III CKD to stage IV CKD more accurately than a similar model without laboratory test results (concordance 0.733, P < .001), a model that only considers the most recent laboratory test results (concordance 0.819, P < .031), and a model based on estimated glomerular filtration rate (concordance 0.779, P < .001). A risk prediction model that takes longitudinal laboratory test results and clinical documentation into consideration can predict CKD progression from stage III to stage IV more accurately than three models that do not take all of these variables into consideration. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
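
The concordance statistic used to compare these models can be computed directly as the fraction of comparable patient pairs in which the patient who progressed earlier also received the higher predicted risk. A minimal sketch on invented survival-style data:

```python
# (predicted_risk, time_to_progression, progressed) per hypothetical patient.
patients = [
    (0.90, 2.0, True), (0.70, 5.0, True), (0.40, 8.0, False),
    (0.60, 4.0, True), (0.20, 9.0, False), (0.55, 7.0, True),
]

def concordance(rows):
    concordant = comparable = 0
    for i in range(len(rows)):
        for j in range(len(rows)):
            ri, ti, ei = rows[i]
            rj, tj, _ = rows[j]
            # A pair is comparable if patient i progressed before time tj
            # (the other patient may be censored later without harm).
            if ei and ti < tj:
                comparable += 1
                if ri > rj:
                    concordant += 1
                elif ri == rj:
                    concordant += 0.5  # ties in predicted risk count as half
    return concordant / comparable

c_index = concordance(patients)
```

A value of 0.5 means the model ranks pairs no better than chance and 1.0 means perfect ranking, which is the scale on which the abstract's 0.849 versus 0.733 comparison is read.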

  11. Caries risk assessment in young adults using Public Dental Service guidelines and the Cariogram--a comparative study.

    PubMed

    Petersson, Gunnel Hänsel; Ericson, Ewa; Isberg, Per-Erik; Twetman, Svante

    2013-01-01

To investigate the caries risk profiles in young adults and to compare the risk classification using the Public Dental Service (PDS) guidelines with a risk assessment program, the Cariogram. All 19-year-old patients registered at eight public dental clinics were invited to participate (n = 1699). The study group who completed the baseline examination consisted of 1295 subjects, representing 10% of all 19-year-olds attending dental care at the PDS in Skåne, Sweden. A risk classification of each patient was made by the patient's regular team according to the PDS guidelines. A research team collected whole saliva samples and information from a questionnaire and a structured interview in order to calculate risk according to the Cariogram model. The mean DFS value was 4.9, and 23% of the patients were registered as caries-free (DFS = 0). The PDS risk classification was predominantly based on past caries and/or present caries activity. The majority were classified as 'some risk', while 16.7% were assessed as being at 'high' or 'very high' risk. The corresponding value for the two highest risk groups in the Cariogram model was 17.4%. The agreement between the two models was acceptable (77.5%) for those assessed as low risk, while discrepancies emerged among those classified with higher risks. Although the proportion of subjects assessed with high or very high risk was similar using the PDS guidelines and the Cariogram model, the agreement between the models was fair. An acceptable agreement was disclosed only for the low-risk category.

  12. Theory-Based Cartographic Risk Model Development and Application for Home Fire Safety.

    PubMed

    Furmanek, Stephen; Lehna, Carlee; Hanchette, Carol

There is a gap in the use of predictive risk models to identify areas at risk for home fires and burn injury. The purpose of this study was to describe the creation, validation, and application of such a model, using a sample from an intervention study with parents of newborns in Jefferson County, KY, as an example. A literature search was performed to identify risk factors for home fires and burn injury in the target population. Risk factor data were obtained from the American Community Survey at the census tract level and synthesized to create a predictive cartographic risk model. Model validation was performed through correlation, regression, and Moran's I with fire incidence data from open records. Independent-samples t-tests were used to examine the model in relation to geocoded participant addresses. Participant risk level for fire rate was determined, as was proximity to fire station service areas and hospitals. The model showed high and severe risk clustering in the northwest section of the county. Modeled risk was strongly correlated with fire rate; the best predictive model for fire risk contained home value (low), race (black), and non-high-school graduates. Applying the model to the intervention sample, the majority of participants were at lower risk and mostly within service areas closest to a fire department and hospital. Cartographic risk models were useful in identifying areas at risk and analyzing participant risk level. The methods outlined in this study are generalizable to other public health issues.

  13. Construction and evaluation of FiND, a fall risk prediction model of inpatients from nursing data.

    PubMed

    Yokota, Shinichiroh; Ohe, Kazuhiko

    2016-04-01

To construct and evaluate an easy-to-use fall risk prediction model based on the daily condition of inpatients, from secondary use of electronic medical record system data. The present authors scrutinized electronic medical record system data and created a dataset for analysis by including inpatient fall report data and Intensity of Nursing Care Needs data. The authors divided the analysis dataset into training data and testing data, constructed the fall risk prediction model FiND from the training data, and tested the model using the testing data. The dataset for analysis contained 1,230,604 records from 46,241 patients. The sensitivity of the model constructed from the training data was 71.3% and the specificity was 66.0%. The verification result from the testing dataset was almost equivalent to the theoretical value. Although the model's accuracy did not surpass that of models developed in previous research, the authors believe FiND will be useful in medical institutions all over Japan because it is composed of few variables (only age, sex, and the Intensity of Nursing Care Needs items) and its accuracy on unknown data was demonstrated. © 2016 Japan Academy of Nursing Science.
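
    The reported sensitivity (71.3%) and specificity (66.0%) follow directly from a 2×2 confusion matrix; the counts below are illustrative, chosen only to match the reported percentages, and are not the study's actual cell counts.

    ```python
    def sensitivity_specificity(tp, fn, tn, fp):
        """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    # illustrative counts: 713 of 1000 fallers flagged by the model,
    # 660 of 1000 non-fallers correctly cleared
    sens, spec = sensitivity_specificity(tp=713, fn=287, tn=660, fp=340)
    print(sens, spec)  # 0.713 0.66
    ```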

  14. Isoleucine-to-methionine substitution at residue 148 variant of PNPLA3 gene and metabolic outcomes in gestational diabetes.

    PubMed

    Bo, Simona; Gambino, Roberto; Menato, Guido; Canil, Stefania; Ponzo, Valentina; Pinach, Silvia; Durazzo, Marilena; Ghigo, Ezio; Cassader, Maurizio; Musso, Giovanni

    2015-02-01

A single nucleotide polymorphism (SNP) of the patatin-like phospholipase-3 (PNPLA3)/adiponutrin gene (rs738409 C>G) is strongly associated with nonalcoholic fatty liver disease; to our knowledge, no data are available on the impact of this PNPLA3 SNP on liver and metabolic outcomes during pregnancy in patients with gestational diabetes (GD). We evaluated the impact of the PNPLA3 rs738409 SNP on liver enzymes, metabolic indexes, and maternal and neonatal outcomes in 200 GD patients enrolled in a lifestyle intervention. In a randomized trial with a 2 × 2 factorial design, exercise significantly improved maternal and neonatal outcomes in GD patients. Effects of the G allele on metabolic and liver indexes and maternal and neonatal outcomes were evaluated in these patients. At the end of the trial, fasting insulin and homeostasis model assessment of insulin resistance (HOMA-IR) values were significantly lower and liver enzymes significantly higher in PNPLA3 G-allele carriers. In a multiple regression model, the G allele was associated directly with aspartate aminotransferase (β = 2.60; 95% CI: 0.99, 4.20), alanine aminotransferase (β = 3.70; 95% CI: 1.78, 5.62), and γ-glutamyl transferase (β = 3.70; 95% CI: 0.80, 6.60) and inversely with insulin (β = -2.01; 95% CI: -3.24, -0.78) and HOMA-IR (β = -0.39; -0.64, -0.14) values at the end of the trial. In a multiple logistic regression model, the G allele was associated directly with risk of developing liver enzyme elevation during pregnancy (OR: 4.21; 95% CI: 1.78, 9.97) and inversely with the birth of large-for-gestational-age newborns (OR: 0.19; 95% CI: 0.06, 0.62). No diet × genotype or exercise × genotype interaction was shown. The PNPLA3 SNP rs738409 G allele was associated with risk of mildly elevated transaminases in GD independent of a lifestyle intervention and despite a significant reduction in insulin resistance and risk of macrosomic offspring. This trial was registered at clinicaltrials.gov as NCT01506310. © 2015 American Society for Nutrition.

  15. Remodeling characteristics and collagen distribution in synthetic mesh materials explanted from human subjects after abdominal wall reconstruction: an analysis of remodeling characteristics by patient risk factors and surgical site classifications.

    PubMed Central

    Cavallo, Jaime A.; Roma, Andres A.; Jasielec, Mateusz S.; Ousley, Jenny; Creamer, Jennifer; Pichert, Matthew D.; Baalman, Sara; Frisella, Margaret M.; Matthews, Brent D.

    2014-01-01

Background The purpose of this study was to evaluate the associations between patient characteristics or surgical site classifications and the histologic remodeling scores of synthetic meshes biopsied from their abdominal wall repair sites in the first attempt to generate a multivariable risk prediction model of non-constructive remodeling. Methods Biopsies of the synthetic meshes were obtained from the abdominal wall repair sites of 51 patients during a subsequent abdominal re-exploration. Biopsies were stained with hematoxylin and eosin, and evaluated according to a semi-quantitative scoring system for remodeling characteristics (cell infiltration, cell types, extracellular matrix deposition, inflammation, fibrous encapsulation, and neovascularization) and a mean composite score (CR). Biopsies were also stained with Sirius Red and Fast Green, and analyzed to determine the collagen I:III ratio. Based on univariate analyses between subject clinical characteristics or surgical site classification and the histologic remodeling scores, cohort variables were selected for multivariable regression models using a threshold p value of ≤0.200. Results The model selection process for the extracellular matrix score yielded two variables: subject age at time of mesh implantation, and mesh classification (c-statistic = 0.842). For CR score, the model selection process yielded two variables: subject age at time of mesh implantation and mesh classification (r2 = 0.464). The model selection process for the collagen III area yielded a model with two variables: subject body mass index at time of mesh explantation and pack-year history (r2 = 0.244). Conclusion Host characteristics and surgical site assessments may predict degree of remodeling for synthetic meshes used to reinforce abdominal wall repair sites. These preliminary results constitute the first steps in generating a risk prediction model that predicts the patients and clinical circumstances for which non-constructive remodeling of an abdominal wall repair site with synthetic mesh reinforcement is most likely to occur. PMID:24442681

  16. Influence of information about specific absorption rate (SAR) upon customers' purchase decisions and safety evaluation of mobile phones.

    PubMed

    Wiedemann, Peter M; Schütz, Holger; Clauberg, Martin

    2008-02-01

This study investigated whether the SAR value is a purchase-relevant characteristic of mobile phones for laypersons and what effect the disclosure of a precautionary SAR value has on laypersons' risk perception. The study consisted of two parts: Study part 1 used a conjoint analysis design to explore the relevance of the SAR value and other features of mobile phones for an intended buying decision. Study part 2 used an experimental, repeated-measures design to examine the effect of the magnitude of SAR values and the disclosure of a precautionary SAR value on risk perception. In addition, the study included an analysis of prior concerns of the study participants with regard to mobile phone risks. Part 1 indicates that the SAR value has a high relevance for laypersons' purchase intentions. In the experimental purchase setting it ranks even before price and equipment features. The results of study part 2 show that providing information about a precautionary limit value does not influence risk perception. This result suggests that laypersons' underlying subjective "safety model" for mobile phones resembles a "margin of safety" concept more than a threshold concept. The latter observation holds true no matter how concerned the participants are. © 2007 Wiley-Liss, Inc.

  17. A behavioral and neural evaluation of prospective decision-making under risk.

    PubMed

    Symmonds, Mkael; Bossaerts, Peter; Dolan, Raymond J

    2010-10-27

Making the best choice when faced with a chain of decisions requires a person to judge both anticipated outcomes and future actions. Although economic decision-making models account for both risk and reward in single-choice contexts, there is a dearth of similar knowledge about sequential choice. Classical utility-based models assume that decision-makers select and follow an optimal predetermined strategy, regardless of the particular order in which options are presented. An alternative model involves continuously reevaluating decision utilities, without prescribing a specific future set of choices. Here, using behavioral and functional magnetic resonance imaging (fMRI) data, we studied human subjects in a sequential choice task and used these data to compare alternative decision models of valuation and strategy selection. We provide evidence that subjects adopt a model of reevaluating decision utilities, in which available strategies are continuously updated and combined in assessing action values. We validate this model by using simultaneously acquired fMRI data to show that sequential choice evokes a pattern of neural response consistent with a tracking of the anticipated distribution of future reward, as expected in such a model. Thus, brain activity evoked at each decision point reflects the expected mean, variance, and skewness of possible payoffs, consistent with the idea that sequential choice evokes a prospective evaluation of both available strategies and possible outcomes.
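
    The payoff statistics tracked here (expected mean, variance, and skewness of possible payoffs) have a simple closed form for a discrete gamble; the lottery below is a hypothetical example, not a stimulus from the experiment.

    ```python
    def payoff_moments(payoffs, probs):
        """Mean, variance, and standardized skewness of a discrete payoff
        distribution; skewness is 0.0 for a symmetric gamble."""
        mean = sum(p * x for x, p in zip(payoffs, probs))
        var = sum(p * (x - mean) ** 2 for x, p in zip(payoffs, probs))
        skew = sum(p * (x - mean) ** 3 for x, p in zip(payoffs, probs)) / var ** 1.5
        return mean, var, skew

    # symmetric 50/50 gamble over 0 or 10 points
    print(payoff_moments([0, 10], [0.5, 0.5]))  # (5.0, 25.0, 0.0)
    ```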

  18. Debates—Perspectives on socio-hydrology: Modeling flood risk as a public policy problem

    NASA Astrophysics Data System (ADS)

    Gober, Patricia; Wheater, Howard S.

    2015-06-01

    Socio-hydrology views human activities as endogenous to water system dynamics; it is the interaction between human and biophysical processes that threatens the viability of current water systems through positive feedbacks and unintended consequences. Di Baldassarre et al. implement socio-hydrology as a flood risk problem using the concept of social memory as a vehicle to link human perceptions to flood damage. Their mathematical model has heuristic value in comparing potential flood damages in green versus technological societies. It can also support communities in exploring the potential consequences of policy decisions and evaluating critical policy tradeoffs, for example, between flood protection and economic development. The concept of social memory does not, however, adequately capture the social processes whereby public perceptions are translated into policy action, including the pivotal role played by the media in intensifying or attenuating perceived flood risk, the success of policy entrepreneurs in keeping flood hazard on the public agenda during short windows of opportunity for policy action, and different societal approaches to managing flood risk that derive from cultural values and economic interests. We endorse the value of seeking to capture these dynamics in a simplified conceptual framework, but favor a broader conceptualization of socio-hydrology that includes a knowledge exchange component, including the way modeling insights and scientific results are communicated to floodplain managers. The social processes used to disseminate the products of socio-hydrological research are as important as the research results themselves in determining whether modeling is used for real-world decision making.

  19. Second-Year Results of an Obesity Prevention Program at The Dow Chemical Company

    PubMed Central

    Roemer, Enid C.; Pei, Xiaofei; Short, Meghan E.; Tabrizi, Maryam J.; Wilson, Mark G.; DeJoy, David M.; Craun, Beth A.; Tully, Karen J.; White, John M.; Baase, Catherine M.

    2010-01-01

    Objective Evaluate innovative, evidence-based approaches to organizational/supportive environmental interventions aimed at reducing the prevalence of obesity among Dow employees after two years of implementation. Methods A quasi-experimental study design compared outcomes for two levels of intervention intensity to a control group. Propensity scores were used to weight baseline differences between intervention and control subjects. Difference-in-differences methods and multi-level modeling were used to control for individual and site-level confounders. Results Intervention participants maintained their weight and BMI while control participants gained 1.3 pounds and increased their BMI values by 0.2 over two years. Significant differences in blood pressure and cholesterol values were observed when comparing intervention employees to controls. At higher intensity sites, improvements were more pronounced. Conclusions Environmental interventions at the workplace can support weight management and risk reduction after two years. PMID:20190646
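
    The difference-in-differences method used in this evaluation has a simple closed form; the weights below are hypothetical, chosen only so the control-group change matches the reported 1.3-pound gain.

    ```python
    def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
        """DiD estimate: change in the treated group minus change in controls."""
        return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

    # intervention group held its weight; controls gained 1.3 lb over two years
    effect = diff_in_diff(treat_pre=190.0, treat_post=190.0,
                          ctrl_pre=190.0, ctrl_post=191.3)
    print(round(effect, 1))  # -1.3
    ```

    The negative sign reads as weight averted relative to the control trend, which is the sense in which the intervention "maintained" weight.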

  20. Latino/a Youth Intentions to Smoke Cigarettes: Exploring the Roles of Culture and Gender

    PubMed Central

    Lorenzo-Blanco, Elma I.; Schwartz, Seth J.; Unger, Jennifer B.; Zamboanga, Byron L.; Des Rosiers, Sabrina E.; Huang, Shi; Villamar, Juan A.; Soto, Daniel W.; Pattarroyo, Monica; Baezconde-Garbanati, Lourdes

    2016-01-01

    Latino/a youth are at risk for cigarette smoking. This risk seems to increase as youth navigate the U.S. cultural context, especially for girls. To investigate how acculturation may influence Latino/a youths’ intentions to use cigarettes, this study combines a bidimensional/multidomain model of acculturation and the Theory of Reasoned Action. Our sample consisted of 303 recent Latino/a immigrant youth who had resided in the United States for five years or less at baseline (141 girls, 160 boys; 153 from Miami, 150 from Los Angeles) who completed surveys at 3 time-points. Youth completed measures of acculturation (Latino/a practices, Latino/a identity, collectivistic values; U.S. cultural practices, U.S. identity, individualistic values), smoking related health risk attitudes, perceived subjective norms regarding smoking, and intentions to use cigarettes. Structural equation modeling indicated that collectivistic values were associated with more perceived disapproval of smoking, which in turn was negatively associated with intentions to smoke. Collectivistic values may help protect Latino/a immigrant youth from intending to smoke. Thus, educational smoking prevention efforts could promote collectivistic values and disseminate messages about the negative consequences of smoking on interpersonal relationships. PMID:28042523

  1. Latino/a Youth Intentions to Smoke Cigarettes: Exploring the Roles of Culture and Gender.

    PubMed

    Lorenzo-Blanco, Elma I; Schwartz, Seth J; Unger, Jennifer B; Zamboanga, Byron L; Des Rosiers, Sabrina E; Huang, Shi; Villamar, Juan A; Soto, Daniel W; Pattarroyo, Monica; Baezconde-Garbanati, Lourdes

    2015-08-01

    Latino/a youth are at risk for cigarette smoking. This risk seems to increase as youth navigate the U.S. cultural context, especially for girls. To investigate how acculturation may influence Latino/a youths' intentions to use cigarettes, this study combines a bidimensional/multidomain model of acculturation and the Theory of Reasoned Action. Our sample consisted of 303 recent Latino/a immigrant youth who had resided in the United States for five years or less at baseline (141 girls, 160 boys; 153 from Miami, 150 from Los Angeles) who completed surveys at 3 time-points. Youth completed measures of acculturation (Latino/a practices, Latino/a identity, collectivistic values; U.S. cultural practices, U.S. identity, individualistic values), smoking related health risk attitudes, perceived subjective norms regarding smoking, and intentions to use cigarettes. Structural equation modeling indicated that collectivistic values were associated with more perceived disapproval of smoking, which in turn was negatively associated with intentions to smoke. Collectivistic values may help protect Latino/a immigrant youth from intending to smoke. Thus, educational smoking prevention efforts could promote collectivistic values and disseminate messages about the negative consequences of smoking on interpersonal relationships.

  2. Ecological risk assessment of microcystin-LR in the upstream section of the Haihe River based on a species sensitivity distribution model.

    PubMed

    Niu, Zhiguang; Du, Lei; Li, Jiafu; Zhang, Ying; Lv, Zhiwei

    2018-02-01

    The eutrophication of surface water has been the main problem of water quality management in recent decades, and the ecological risk of microcystin-LR (MC-LR), which is the by-product of eutrophication, has drawn more attention worldwide. The aims of our study were to determine the predicted no effect concentration (PNEC) of MC-LR and to assess the ecological risk of MC-LR in the upstream section of the Haihe River. HC 5 (hazardous concentration for 5% of biological species) and PNEC were obtained from a species sensitivity distribution (SSD) model, which was constructed with the acute toxicity data of MC-LR on aquatic organisms. The concentrations of MC-LR in the upstream section of the Haihe River from April to August of 2015 were analysed, and the ecological risk characteristics of MC-LR were evaluated based on the SSD model. The results showed that the HC 5 of MC-LR in freshwater was 17.18 μg/L and PNEC was 5.73 μg/L. The concentrations of MC-LR ranged from 0.68 μg/L to 32.21 μg/L and were obviously higher in summer than in spring. The values of the risk quotient (RQ) ranged from 0.12 to 5.62, suggesting that the risk of MC-LR for aquatic organisms in the river was at a medium or high level during the study period. Compared with other waterbodies in the world, the pollution level of MC-LR in the Haihe River was at a moderate level. This research could promote the study of the ecological risk of MC-LR at the ecosystem level. Copyright © 2017 Elsevier Ltd. All rights reserved.
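
    The risk quotient (RQ) figures here follow the standard RQ = MEC/PNEC form; deriving PNEC from HC5 with an assessment factor of 3 is an assumption in this sketch, but it reproduces the reported 5.73 μg/L, and the example MEC values are the reported minimum and maximum.

    ```python
    def risk_quotient(mec, pnec):
        """RQ = measured environmental concentration / predicted no effect
        concentration; RQ >= 1 is commonly read as high ecological risk."""
        return mec / pnec

    hc5 = 17.18      # μg/L, HC5 from the SSD model
    pnec = hc5 / 3   # assessment factor of 3 (assumed) gives the reported PNEC

    print(round(pnec, 2))                        # 5.73
    print(round(risk_quotient(0.68, pnec), 2))   # 0.12, lowest measured MC-LR
    print(round(risk_quotient(32.21, pnec), 2))  # 5.62, highest measured MC-LR
    ```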

  3. Research on Liquidity Risk Evaluation of Chinese A-Shares Market Based on Extension Theory

    NASA Astrophysics Data System (ADS)

    Bai-Qing, Sun; Peng-Xiang, Liu; Lin, Zhang; Yan-Ge, Li

This research defines the liquidity risk of the stock market in terms of matter-element theory and affair-element theory, establishes an indicator system for forewarning of liquidity risks, and designs an early-warning model and process using the extension set method, the extension dependent function, and a comprehensive evaluation model. The paper then studies the Chinese A-shares market empirically using data from index 1A0001, which shows that the model can describe the liquidity risk of China's A-share market well. Finally, it gives corresponding policy recommendations.

  4. Application and comparison of the FADES, MADIT, and SHFM-D risk models for risk stratification of prophylactic implantable cardioverter-defibrillator treatment

    PubMed Central

    van der Heijden, Aafke C.; van Rees, Johannes B.; Levy, Wayne C.; van der Bom, Johanna G.; Cannegieter, Suzanne C.; de Bie, Mihàly K.; van Erven, Lieselot; Schalij, Martin J.; Borleffs, C. Jan Willem

    2017-01-01

Aims Implantable cardioverter-defibrillator (ICD) treatment is beneficial in selected patients. However, it remains difficult to accurately predict which patients benefit most from ICD implantation. For this purpose, different risk models have been developed. The aim was to validate and compare the FADES, MADIT, and SHFM-D models. Methods and results All patients receiving a prophylactic ICD at the Leiden University Medical Center were evaluated. Individual model performance was evaluated by C-statistics. Model performances were compared using net reclassification improvement (NRI) and integrated discrimination improvement (IDI). The primary endpoint was non-benefit of ICD treatment, defined as mortality without prior ventricular arrhythmias requiring ICD intervention. A total of 1969 patients were included (age 63 ± 11 years; 79% male). During a median follow-up of 4.5 ± 3.9 years, 318 (16%) patients died without prior ICD intervention. All three risk models were predictive for event-free mortality (all: P < 0.001). The C-statistics were 0.66, 0.69, and 0.75, respectively, for FADES, MADIT, and SHFM-D (all: P < 0.001). Application of the SHFM-D resulted in an improved IDI of 4% and NRI of 26% compared with MADIT; IDI improved 11% with the use of SHFM-D instead of FADES (all: P < 0.001), but NRI remained unchanged (P = 0.71). Patients in the highest-risk category of the MADIT and SHFM-D models were 1.7 times more likely to experience ICD non-benefit than to receive appropriate ICD interventions [MADIT: mean difference (MD) 20% (95% CI: 7–33%), P = 0.001; SHFM-D: MD 16% (95% CI: 5–27%), P = 0.005]. Patients in the highest-risk category of FADES were as likely to experience ICD intervention as ICD non-benefit [MD 3% (95% CI: –8 to 14%), P = 0.60]. Conclusion The predictive and discriminatory value of SHFM-D in predicting non-benefit of ICD treatment is superior to that of FADES and MADIT in patients receiving prophylactic ICD treatment. PMID:28130376

  5. Are youth mentoring programs good value-for-money? An evaluation of the Big Brothers Big Sisters Melbourne Program.

    PubMed

    Moodie, Marjory L; Fisher, Jane

    2009-01-30

    The Big Brothers Big Sisters (BBBS) program matches vulnerable young people with a trained, supervised adult volunteer as mentor. The young people are typically seriously disadvantaged, with multiple psychosocial problems. Threshold analysis was undertaken to determine whether investment in the program was a worthwhile use of limited public funds. The potential cost savings were based on US estimates of life-time costs associated with high-risk youth who drop out-of-school and become adult criminals. The intervention was modelled for children aged 10-14 years residing in Melbourne in 2004. If the program serviced 2,208 of the most vulnerable young people, it would cost AUD 39.5 M. Assuming 50% were high-risk, the associated costs of their adult criminality would be AUD 3.3 billion. To break even, the program would need to avert high-risk behaviours in only 1.3% (14/1,104) of participants. This indicative evaluation suggests that the BBBS program represents excellent 'value for money'.
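
    The break-even figure of 1.3% (14/1,104) can be reproduced with back-of-the-envelope arithmetic; this is a sketch of the threshold analysis as summarized above, not the authors' full economic model.

    ```python
    import math

    program_cost = 39.5e6   # AUD, servicing 2,208 of the most vulnerable youth
    high_risk = 2208 // 2   # 50% assumed high-risk -> 1,104
    lifetime_cost = 3.3e9   # AUD, adult criminality costs for all high-risk youth

    cost_per_youth = lifetime_cost / high_risk
    # whole participants whose high-risk trajectory must be averted to break even
    averted = math.ceil(program_cost / cost_per_youth)

    print(averted, round(100 * averted / high_risk, 1))  # 14 1.3
    ```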

  6. America's Youth Are at Risk: Developing Models for Action in the Nation's Public Libraries.

    ERIC Educational Resources Information Center

    Flum, Judith G.; Weisner, Stan

    1993-01-01

    Discussion of public library support systems for at-risk teens focuses on the Bay Area Library and Information System (BALIS) that was developed to improve library services to at-risk teenagers in the San Francisco Bay area. Highlights include needs assessment; staff training; intervention models; and project evaluation. (10 references) (LRW)

  7. Prediction of new brain metastases after radiosurgery: validation and analysis of performance of a multi-institutional nomogram.

    PubMed

    Ayala-Peacock, Diandra N; Attia, Albert; Braunstein, Steve E; Ahluwalia, Manmeet S; Hepel, Jaroslaw; Chung, Caroline; Contessa, Joseph; McTyre, Emory; Peiffer, Ann M; Lucas, John T; Isom, Scott; Pajewski, Nicholas M; Kotecha, Rupesh; Stavas, Mark J; Page, Brandi R; Kleinberg, Lawrence; Shen, Colette; Taylor, Robert B; Onyeuku, Nasarachi E; Hyde, Andrew T; Gorovets, Daniel; Chao, Samuel T; Corso, Christopher; Ruiz, Jimmy; Watabe, Kounosuke; Tatter, Stephen B; Zadeh, Gelareh; Chiang, Veronica L S; Fiveash, John B; Chan, Michael D

    2017-11-01

    Stereotactic radiosurgery (SRS) without whole brain radiotherapy (WBRT) for brain metastases can avoid WBRT toxicities, but with risk of subsequent distant brain failure (DBF). Sole use of number of metastases to triage patients may be an unrefined method. Data on 1354 patients treated with SRS monotherapy from 2000 to 2013 for new brain metastases was collected across eight academic centers. The cohort was divided into training and validation datasets and a prognostic model was developed for time to DBF. We then evaluated the discrimination and calibration of the model within the validation dataset, and confirmed its performance with an independent contemporary cohort. Number of metastases (≥8, HR 3.53 p = 0.0001), minimum margin dose (HR 1.07 p = 0.0033), and melanoma histology (HR 1.45, p = 0.0187) were associated with DBF. A prognostic index derived from the training dataset exhibited ability to discriminate patients' DBF risk within the validation dataset (c-index = 0.631) and Heller's explained relative risk (HERR) = 0.173 (SE = 0.048). Absolute number of metastases was evaluated for its ability to predict DBF in the derivation and validation datasets, and was inferior to the nomogram. A nomogram high-risk threshold yielding a 2.1-fold increased need for early WBRT was identified. Nomogram values also correlated to number of brain metastases at time of failure (r = 0.38, p < 0.0001). We present a multi-institutionally validated prognostic model and nomogram to predict risk of DBF and guide risk-stratification of patients who are appropriate candidates for radiosurgery versus upfront WBRT.

  8. Diagnosis of periprosthetic joint infection in Medicare patients: multicriteria decision analysis.

    PubMed

    Diaz-Ledezma, Claudio; Lichstein, Paul M; Dolan, James G; Parvizi, Javad

    2014-11-01

In the setting of finite healthcare resources, developing cost-efficient strategies for periprosthetic joint infection (PJI) diagnosis is paramount. The current levels of knowledge allow for PJI diagnostic recommendations based on scientific evidence but do not consider the benefits, opportunities, costs, and risks of the different diagnostic alternatives. We determined the best diagnostic strategy for knee and hip PJI in the ambulatory setting for Medicare patients, utilizing benefits, opportunities, costs, and risks evaluation through multicriteria decision analysis (MCDA). The PJI diagnostic definition supported by the Musculoskeletal Infection Society was employed for the MCDA. Using a preclinical model, we evaluated three diagnostic strategies that can be conducted in a Medicare patient seen in the outpatient clinical setting complaining of a painful TKA or THA. Strategies were (1) screening with serum markers (erythrocyte sedimentation rate [ESR]/C-reactive protein [CRP]) followed by arthrocentesis in positive cases, (2) immediate arthrocentesis, and (3) serum markers requested simultaneously with arthrocentesis. MCDA was conducted through the analytic hierarchy process, comparing the diagnostic strategies in terms of benefits, opportunities, costs, and risks. Strategy 1 was the best alternative to diagnose knee PJI among Medicare patients (normalized value: 0.490), followed by Strategy 3 (normalized value: 0.403) and then Strategy 2 (normalized value: 0.106). The same ranking of alternatives was observed for the hip PJI model (normalized value: 0.487, 0.405, and 0.107, respectively). The sensitivity analysis found this sequence to be robust with respect to benefits, opportunities, and risks. However, if during the decision-making process, cost savings was given a priority of higher than 54%, the ranking for the preferred diagnostic strategy changed. After considering the benefits, opportunities, costs, and risks of the different available alternatives, our preclinical model supports the American Academy of Orthopaedic Surgeons recommendations regarding the use of serum markers (ESR/CRP) before arthrocentesis as the best diagnostic strategy for PJI among Medicare patients. Level II, economic and decision analysis. See Instructions to Authors for a complete description of levels of evidence.

  9. Relative Biological Effectiveness of HZE Particles for Chromosomal Exchanges and Other Surrogate Cancer Risk Endpoints.

    PubMed

    Cacao, Eliedonna; Hada, Megumi; Saganti, Premkumar B; George, Kerry A; Cucinotta, Francis A

    2016-01-01

The biological effects of high charge and energy (HZE) particle exposures are of interest for space radiation protection of astronauts and cosmonauts, and for estimating secondary cancer risks for patients undergoing hadron therapy for primary cancers. The large number of particle types and energies that make up primary or secondary radiation in HZE particle exposures precludes tumor induction studies in animal models for all but a few particle types and energies, thus leading to the use of surrogate endpoints to investigate the details of the radiation quality dependence of relative biological effectiveness (RBE) factors. In this report we make detailed predictions of the charge number and energy dependence of RBEs using a parametric track structure model to represent experimental results for the low dose response for chromosomal exchanges in normal human lymphocyte and fibroblast cells, with comparison to published data for neoplastic transformation and gene mutation. RBEs are evaluated against acute doses of γ-rays for doses near 1 Gy. Models that assume linear or non-targeted effects at low dose are considered. Modest values of RBE (<10) are found for simple exchanges using a linear dose response model; however, in the non-targeted effects model for fibroblast cells, large RBE values (>10) are predicted at low doses <0.1 Gy. The radiation quality dependence of RBEs against the effects of acute doses of γ-rays found for neoplastic transformation and gene mutation studies is similar to that found for simple exchanges if a linear response is assumed at low HZE particle doses. Comparisons of the resulting model parameters to those used in the NASA radiation quality factor function are discussed.

  12. Mortality Probability Model III and Simplified Acute Physiology Score II

    PubMed Central

    Vasilevskis, Eduard E.; Kuzniewicz, Michael W.; Cason, Brian A.; Lane, Rondall K.; Dean, Mitzi L.; Clay, Ted; Rennie, Deborah J.; Vittinghoff, Eric; Dudley, R. Adams

    2009-01-01

    Background: We sought to develop and compare ICU length-of-stay (LOS) risk-adjustment models using three commonly used mortality or LOS prediction models. Methods: Between 2001 and 2004, we performed a retrospective, observational study of 11,295 ICU patients from 35 hospitals in the California Intensive Care Outcomes Project. We compared the accuracy of the following three LOS models: a recalibrated acute physiology and chronic health evaluation (APACHE) IV-LOS model; and models developed using risk factors in the mortality probability model III at zero hours (MPM0) and the simplified acute physiology score (SAPS) II mortality prediction model. We evaluated models by calculating the following: (1) grouped coefficients of determination; (2) differences between observed and predicted LOS across subgroups; and (3) intraclass correlations of observed/expected LOS ratios between models. Results: The grouped coefficients of determination were APACHE IV with coefficients recalibrated to the LOS values of the study cohort (APACHE IVrecal) [R2 = 0.422], mortality probability model III at zero hours (MPM0 III) [R2 = 0.279], and simplified acute physiology score (SAPS II) [R2 = 0.008]. For each decile of predicted ICU LOS, the mean predicted LOS vs the observed LOS was significantly different (p ≤ 0.05) for three, two, and six deciles using APACHE IVrecal, MPM0 III, and SAPS II, respectively. Plots of the predicted vs the observed LOS ratios of the hospitals revealed a threefold variation in LOS among hospitals with high model correlations. Conclusions: APACHE IV and MPM0 III were more accurate than SAPS II for the prediction of ICU LOS. APACHE IV is the most accurate and best calibrated model. Although it is less accurate, MPM0 III may be a reasonable option if the data collection burden or the treatment effect bias is a consideration. PMID:19363210
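
The hospital comparison above rests on observed/expected LOS ratios. A minimal sketch with hypothetical numbers (not the study's data): sum a hospital's actual ICU days and its model-predicted days; a ratio above 1 means stays run longer than the risk-adjusted model predicts.

```python
def oe_los_ratio(observed_days, predicted_days):
    """Hospital-level observed/expected LOS ratio (>1 = longer than predicted)."""
    return sum(observed_days) / sum(predicted_days)

# Hypothetical five-patient hospital:
obs = [3.0, 6.0, 2.0, 10.0, 4.0]   # actual ICU days per patient
pred = [2.5, 5.0, 2.5, 6.0, 4.0]   # model-predicted ICU days per patient
print(oe_los_ratio(obs, pred))  # 1.25: stays 25% longer than expected
```

Comparing such ratios across hospitals is what revealed the threefold variation the abstract mentions.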

  13. Integrating human health and ecological concerns in risk assessments.

    PubMed

    Cirone, P A; Bruce Duncan, P

    2000-11-03

    The interconnections between ecosystems, human health and welfare have been increasingly recognized by the US government, academia, and the public. This paper continues this theme by addressing the use of risk assessment to integrate human health and ecological concerns into a single assessment. In a broad overview of the risk assessment process, we stress the need to build a conceptual model of the whole system, including multiple species (humans and other ecological entities), stressors, and cumulative effects. We also propose converging landscape ecology and the evaluation of ecosystem services with risk assessment to address these cumulative responses. We first look at how this integration can occur within the problem formulation step in risk assessment, where the system is defined, a conceptual model created, a subset of components and functions selected, and the analytical framework decided in a context that includes the management decisions. A variety of examples of problem formulations (salmon, wild insects, hyporheic ecosystems, ultraviolet (UV) radiation, nitrogen fertilization, toxic chemicals, and oil spills) are presented to illustrate how treating humans as components of the landscape can add value to risk assessments. We conclude that the risk assessment process should help address the urgent needs of society in proportion to their importance, provide a format to communicate knowledge and understanding, and inform policy and management decisions.

  14. Arthroscopic Bankart repair and capsular shift for recurrent anterior shoulder instability: functional outcomes and identification of risk factors for recurrence.

    PubMed

    Ahmed, Issaq; Ashton, Fiona; Robinson, Christopher Michael

    2012-07-18

    Arthroscopic Bankart repair and capsular shift is a well-established technique for the treatment of anterior shoulder instability. The purpose of this study was to evaluate the outcomes following arthroscopic Bankart repair and capsular shift and to identify risk factors that are predictive of recurrence of glenohumeral instability. We performed a retrospective review of a prospectively collected database consisting of 302 patients who had undergone arthroscopic Bankart repair and capsular shift for the treatment of recurrent anterior glenohumeral instability. The prevalence of patient- and injury-related risk factors for recurrence was assessed. Cox proportional hazards models were used to estimate the predicted probability of recurrence within two years. The chief outcome measures were the risk of recurrence and the two-year functional outcomes assessed with the Western Ontario Shoulder Instability Index (WOSI) and Disabilities of the Arm, Shoulder and Hand (DASH) scores. The rate of recurrent glenohumeral instability after arthroscopic Bankart repair and capsular shift was 13.2%. The median time to recurrence was twelve months, and this complication developed within one year in 55% of these patients. The risk of recurrence was independently predicted by the patient's age at surgery, the severity of glenoid bone loss, and the presence of an engaging Hill-Sachs lesion (all p < 0.001). These variables were incorporated into a model to provide an estimate of the risk of recurrence after surgery. Varying the cutoff level for the predicted probability of recurrence in the model from 50% to lower values increased the sensitivity of the model to detect recurrences but decreased the positive predictive value of the model to correctly predict failed repairs. 
There was a significant improvement in the mean WOSI and DASH scores at two years postoperatively (both p < 0.001), but the mean scores in the group with recurrence were significantly lower than those in the group without recurrence (both p < 0.001). Our study identified factors that are independently associated with a higher risk of recurrence following arthroscopic Bankart repair and capsular shift. These data can be useful for counseling patients undergoing this procedure for the treatment of recurrent glenohumeral instability and individualizing treatment options for particular groups of patients. Prognostic level I. See Instructions for authors for a complete description of levels of evidence.

  15. Can risk assessment predict suicide in secondary mental healthcare? Findings from the South London and Maudsley NHS Foundation Trust Biomedical Research Centre (SLaM BRC) Case Register.

    PubMed

    Lopez-Morinigo, Javier-David; Fernandes, Andrea C; Shetty, Hitesh; Ayesa-Arriola, Rosa; Bari, Ashraful; Stewart, Robert; Dutta, Rina

    2018-06-02

    The predictive value of suicide risk assessment in secondary mental healthcare remains unclear. This study aimed to investigate the extent to which clinical risk assessment ratings can predict suicide among people receiving secondary mental healthcare. Retrospective inception cohort study (n = 13,758) from the South London and Maudsley NHS Foundation Trust (SLaM) (London, UK) linked with national mortality data (n = 81 suicides). Cox regression models assessed survival from the last suicide risk assessment, and ROC curves evaluated the performance of risk assessment total scores. Hopelessness (RR = 2.24, 95% CI 1.05-4.80, p = 0.037) and having a significant loss (RR = 1.91, 95% CI 1.03-3.55, p = 0.041) were significantly associated with suicide in the multivariable Cox regression models. However, screening statistics for the best cut-off point (4-5) of the risk assessment total score were: sensitivity 0.65 (95% CI 0.54-0.76), specificity 0.62 (95% CI 0.62-0.63), positive predictive value 0.01 (95% CI 0.01-0.01) and negative predictive value 0.99 (95% CI 0.99-1.00). Although suicide was linked with hopelessness and having a significant loss, risk assessment performed poorly in predicting such an uncommon outcome in a large case register of patients receiving secondary mental healthcare.
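
The collapse of positive predictive value for a rare outcome can be reproduced from a 2×2 table. The counts below are approximations implied by the reported cohort size, suicide count, sensitivity and specificity, not the study's exact table:

```python
def screening_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from 2x2 screening counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv

# Approximate counts: 81 suicides among 13,758 patients, sensitivity ~0.65
# and specificity ~0.62 at the best cut-off (4-5) of the risk score.
sens, spec, ppv, npv = screening_stats(tp=53, fp=5197, fn=28, tn=8480)
print(f"PPV={ppv:.3f}, NPV={npv:.3f}")  # PPV near 0.01, NPV near 0.997
```

With a base rate of roughly 0.6%, almost every "high risk" flag is a false positive even at moderate sensitivity and specificity, which is the abstract's central point.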

  16. Comparison of Risk Scores for Prediction of Complications following Aortic Valve Replacement.

    PubMed

    Wang, Tom Kai Ming; Choi, David Hyun-Min; Haydock, David; Gamble, Greg; Stewart, Ralph; Ruygrok, Peter

    2015-06-01

    Risk models play an important role in stratification of patients for cardiac surgery, but their prognostic utilities for post-operative complications are rarely studied. We compared the EuroSCORE, EuroSCORE II, Society of Thoracic Surgeons (STS) Score and an Australasian model (Aus-AVR Score) for predicting morbidities after aortic valve replacement (AVR), and also evaluated seven STS complications models in this context. We retrospectively calculated risk scores for 620 consecutive patients undergoing isolated AVR at Auckland City Hospital during 2005-2012, assessing their discrimination and calibration for post-operative complications. Amongst mortality scores, the EuroSCORE was the best at discriminating stroke (c-statistic 0.845); the EuroSCORE II at deep sternal wound infection (c=0.748); and the STS Score at composite morbidity or mortality (c=0.666), renal failure (c=0.634), ventilation >24 hours (c=0.732), return to theatre (c=0.577) and prolonged hospital stay >14 days post-operatively (c=0.707). The individual STS complications models had a marginally higher c-statistic (c=0.634-0.846) for all complications except mediastinitis, and had good calibration (Hosmer-Lemeshow test P-value 0.123-0.915) for all complications. The STS Score was best overall at discriminating post-operative complications and their composite for AVR. All STS complications models except for deep sternal wound infection had good discrimination and calibration for post-operative complications. Copyright © 2014 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
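
The c-statistics quoted above measure discrimination: the probability that a randomly chosen patient who had the complication received a higher risk score than one who did not. A minimal all-pairs sketch on hypothetical scores (fine for illustration; it is O(n²), so large studies use rank-based formulas instead):

```python
def c_statistic(scores, outcomes):
    """Concordance probability over all case/non-case pairs; ties count 0.5."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum(1.0 if p > n else 0.5 if p == n else 0.0
                     for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical predicted risks; 1 = complication occurred, 0 = did not:
print(c_statistic([0.9, 0.4, 0.6, 0.2], [1, 1, 0, 0]))  # 0.75
```

A value of 0.5 is chance-level discrimination; the 0.577-0.845 range in the abstract spans weak to strong discrimination across complications.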

  17. Isolated and synergistic effects of PM10 and average temperature on cardiovascular and respiratory mortality.

    PubMed

    Pinheiro, Samya de Lara Lins de Araujo; Saldiva, Paulo Hilário Nascimento; Schwartz, Joel; Zanobetti, Antonella

    2014-12-01

    OBJECTIVE To analyze the effect of air pollution and temperature on mortality due to cardiovascular and respiratory diseases. METHODS We evaluated the isolated and synergistic effects of temperature and particulate matter with aerodynamic diameter < 10 µm (PM10) on the mortality of individuals > 40 years old due to cardiovascular disease and that of individuals > 60 years old due to respiratory diseases in Sao Paulo, SP, Southeastern Brazil, between 1998 and 2008. Three methodologies were used to evaluate the isolated association: time-series analysis using a Poisson regression model, bidirectional case-crossover analysis matched by period, and case-crossover analysis matched by the confounding factor, i.e., average temperature or pollutant concentration. The graphical representation of the response surface, generated by the interaction term between these factors added to the Poisson regression model, was interpreted to evaluate the synergistic effect of the risk factors. RESULTS No differences were observed between the results of the case-crossover and time-series analyses. The percentage change in the relative risk of cardiovascular and respiratory mortality was 0.85% (0.45;1.25) and 1.60% (0.74;2.46), respectively, due to an increase of 10 μg/m3 in the PM10 concentration. The pattern of correlation of temperature with cardiovascular mortality was U-shaped and that with respiratory mortality was J-shaped, indicating an increased relative risk at high temperatures. The values for the interaction term indicated a higher relative risk for cardiovascular and respiratory mortalities at low temperatures and high temperatures, respectively, when the pollution levels reached approximately 60 μg/m3. CONCLUSIONS The positive association estimated in the Poisson regression model for pollutant concentration is not confounded by temperature, and the effect of temperature is not confounded by the pollutant levels in the time-series analysis. 
The simultaneous exposure to different levels of environmental factors can create synergistic effects that are as disturbing as those caused by extreme concentrations.
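
In a Poisson log-linear model, percentage changes like those reported follow directly from the pollutant coefficient: a rise of Δ µg/m³ multiplies the relative risk by exp(βΔ). A sketch, backing out the coefficient implied by the reported 0.85% cardiovascular figure (illustrative arithmetic, not the authors' fitted β):

```python
import math

def percent_change_rr(beta, delta=10.0):
    """Percent change in relative risk for a `delta` ug/m3 rise in PM10,
    given the Poisson log-linear coefficient `beta` (per ug/m3)."""
    return (math.exp(beta * delta) - 1.0) * 100.0

# Coefficient implied by a 0.85% increase per 10 ug/m3:
beta_cv = math.log(1.0085) / 10.0
print(round(percent_change_rr(beta_cv), 2))  # 0.85
```

The same transformation applied to the confidence-interval endpoints yields the reported (0.45; 1.25) band.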

  18. A Model To Identify Individuals at High Risk for Esophageal Squamous Cell Carcinoma and Precancerous Lesions in Regions of High Prevalence in China.

    PubMed

    Liu, Mengfei; Liu, Zhen; Cai, Hong; Guo, Chuanhai; Li, Xiang; Zhang, Chaoting; Wang, Hui; Hang, Dong; Liu, Fangfang; Deng, Qiuju; Yang, Xin; Yuan, Wenqing; Pan, Yaqi; Li, Jingjing; Zhang, Chanyuan; Shen, Na; He, Zhonghu; Ke, Yang

    2017-10-01

    We aimed to develop a population-based model to identify individuals at high risk for esophageal squamous cell carcinoma (ESCC) in regions of China with a high prevalence of this cancer. We collected findings from 15,073 permanent residents (45-69 years old) of 334 randomly selected villages in Hua County, Henan Province, China who underwent endoscopic screening (with iodine staining) for ESCC from January 2012 through September 2015. The entire esophagus and stomach were examined; biopsies were collected from all focal lesions (or from standard sites in the esophagus if no abnormalities were found) and analyzed histologically. Squamous dysplasia, carcinoma in situ, and ESCC were independently confirmed by 2 pathologists. Before endoscopy, subjects completed a questionnaire on ESCC risk factors. Variables were evaluated with unconditional univariate logistic regression analysis; variables found to be significantly associated with ESCC were then analyzed by multivariate logistic regression modeling. We used the Akaike information criterion to develop our final model structure and the coding form of variables with multiple measures. We developed 2 groups of models, separately defining severe dysplasia and above (SDA) (lesions including severe dysplasia and higher-grade lesions) and moderate dysplasia and above (lesions including moderate dysplasia and higher-grade lesions) as outcome events. Age-stratified and whole-age models were developed; their discriminative ability in the full multivariate model and the simple age model was compared. We performed area under the receiver operating characteristic curve (AUC) and the DeLong test to evaluate model performance. Our age-stratified prediction models identified individuals 60 years of age or younger with SDA with an AUC value of 0.795 (95% confidence interval, 0.736-0.854) and individuals older than 60 years with SDA with an AUC value of 0.681 (95% confidence interval, 0.618-0.743). 
    Factors associated with SDA in individuals 60 years or younger included age closer to 60 years, use of coal or wood as a main source of cooking fuel, body mass index of 22 kg/m² or less, unexplained epigastric pain, and rapid ingestion of meals. In subjects older than 60 years, SDA was associated with age, family history of ESCC, cigarette smoking, body mass index of 22 kg/m² or less, pesticide exposure, irregular eating habits, intake of high-temperature foods, rapid ingestion of meals, and ingestion of leftover food in summer months. Use of our model in screening could have allowed 27% of subjects 60 years or younger and 9% of subjects older than 60 years to avoid endoscopy without missing SDAs. This means that approximately 2500 endoscopies in total (16.6%) could have been avoided. We developed a low-cost, easy-to-use model to identify individuals at risk for severe dysplasia or cancer of the esophagus living in a region of China with a high risk of ESCC. This model might be used to select individuals and groups of persons who should undergo endoscopic screening for esophageal cancer. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.

  19. Pharmaceutical quality assurance of local private distributors: a secondary analysis in 13 low-income and middle-income countries.

    PubMed

    Van Assche, Kerlijn; Nebot Giralt, Ariadna; Caudron, Jean Michel; Schiavetti, Benedetta; Pouget, Corinne; Tsoumanis, Achilleas; Meessen, Bruno; Ravinetto, Raffaella

    2018-01-01

    The rapid globalisation of pharmaceutical production and distribution has not been supported by harmonisation of regulatory systems worldwide. Thus, the supply systems in low-income and middle-income countries (LMICs) remain exposed to the risk of poor-quality medicines. To contribute to estimating this risk in the private sector in LMICs, we assessed the quality assurance system of a convenience sample of local private pharmaceutical distributors. This descriptive study uses secondary data derived from the audits conducted by the QUAMED group at 60 local private pharmaceutical distributors in 13 LMICs. We assessed the distributors' compliance with good distribution practices (GDP), general quality requirements (GQR) and cold chain management (CCM), based on an evaluation tool inspired by the WHO guidelines 'Model Quality Assurance System (MQAS) for procurement agencies'. Descriptive statistics describe the compliance for the whole sample, for distributors in sub-Saharan Africa (SSA) versus those in non-SSA, and for those in low-income countries (LICs) versus middle-income countries (MICs). Local private pharmaceutical distributors in our sample were non-compliant, very low-compliant or low-compliant for GQR (70%), GDP (60%) and CCM (41%). Only 7/60 showed good to full compliance for at least two criteria. Observed compliance varies by geographical region and by income group: maximum values are higher in non-SSA versus SSA and in MICs versus LICs, while minimum values are the same across different groups. The poor compliance with WHO quality standards observed in our sample indicates a concrete risk that patients in LMICs are exposed to poor-quality or degraded medicines. Significant investments are needed to strengthen regulatory supervision, including oversight of private pharmaceutical distributors. An adapted standardised evaluation tool inspired by the WHO MQAS would be helpful for self-evaluation, audit and inspection purposes.

  20. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  1. Evaluation of the prognostic value of platelet to lymphocyte ratio in patients with hepatocellular carcinoma.

    PubMed

    Wang, Yuchen; Attar, Bashar M; Fuentes, Harry E; Jaiswal, Palashkumar; Tafur, Alfonso J

    2017-12-01

    Hepatocellular carcinoma (HCC) is an increasingly common, potentially fatal cancer globally. The platelet-lymphocyte ratio (PLR), a biomarker of systemic inflammation, has recently been recognized as a valuable prognostic marker in multiple cancer types. The aim of the present study was to assess the prognostic value of PLR in HCC patients and determine the optimal cut-off value for risk stratification. We retrospectively analyzed patients with a diagnosis of HCC (screened by ICD-9 code, confirmed with radiographic examination and/or biopsy) at a large public hospital over 15 years (Jan 2000 through July 2015). PLR, among other serology laboratory values, was collected at diagnosis of HCC. Its association with overall survival was evaluated with a Cox proportional hazards model. Among 270 patients with HCC, 57 (21.1%) died within an average follow-up of 11.9 months. PLR at diagnosis was significantly different between survivors and deceased (128.9 vs. 186.7; P=0.003). In multivariate analysis, aspartate transaminase (AST) (HR 2.022, P<0.001) and PLR (HR 1.768, P=0.004) independently predicted mortality. The optimal cut-off value for PLR was determined to be 220 by receiver operating characteristic curve, and the high PLR group had significantly higher mortality (HR 3.42, P<0.001). Our results indicated that a PLR above 220 at diagnosis predicted poor prognosis in HCC patients. PLR is a low-cost and convenient tool that may serve as a useful prognostic marker for HCC.
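
Choosing an "optimal cut-off by ROC curve" typically means maximizing Youden's J (sensitivity + specificity − 1) over candidate thresholds. A brute-force sketch on toy data; the values and labels below are hypothetical, and the study's reported optimum of 220 comes from its real cohort, not from this code:

```python
def youden_cutoff(values, labels):
    """Scan candidate thresholds and return (cutoff, J) maximizing Youden's
    J = sensitivity + specificity - 1.  labels: 1 = died, 0 = survived."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v > cut and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v <= cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v <= cut and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v > cut and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if sens + spec - 1.0 > best_j:
            best_cut, best_j = cut, sens + spec - 1.0
    return best_cut, best_j

# Toy PLR values and vital status (hypothetical, not the study cohort):
print(youden_cutoff([100, 150, 200, 250, 300, 350], [0, 0, 0, 1, 1, 1]))
```

Each distinct biomarker value is tried as a threshold; here the scan finds the threshold that cleanly separates the two groups.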

  2. Prognostic value of heart rate turbulence for risk assessment in patients with unstable angina and non-ST elevation myocardial infarction

    PubMed Central

    Harris, Patricia RE; Stein, Phyllis K; Fung, Gordon L; Drew, Barbara J

    2013-01-01

    Background We sought to examine the prognostic value of heart rate turbulence derived from electrocardiographic recordings initiated in the emergency department for patients with non-ST elevation myocardial infarction (NSTEMI) or unstable angina. Methods Twenty-four-hour Holter recordings were started in patients with cardiac symptoms approximately 45 minutes after arrival in the emergency department. Patients subsequently diagnosed with NSTEMI or unstable angina who had recordings with ≥18 hours of sinus rhythm and sufficient data to compute Thrombolysis In Myocardial Infarction (TIMI) risk scores were chosen for analysis (n = 166). Endpoints were emergent re-entry to the cardiac emergency department and/or death at 30 days and one year. Results In Cox regression models, heart rate turbulence and TIMI risk scores together were significant predictors of 30-day (model chi square 13.200, P = 0.001, C-statistic 0.725) and one-year (model chi square 31.160, P < 0.001, C-statistic 0.695) endpoints, outperforming either measure alone. Conclusion Measurement of heart rate turbulence, initiated upon arrival at the emergency department, may provide additional incremental value in the risk assessment for patients with NSTEMI or unstable angina. PMID:23976860

  3. Two criteria for evaluating risk prediction models

    PubMed Central

    Pfeiffer, R.M.; Gail, M.H.

    2010-01-01

    SUMMARY We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed PCF(q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow-up, PNF(p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF(q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF(p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of those two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data. PMID:21155746
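
Both criteria have straightforward empirical versions: rank the validation sample by predicted risk, then PCF(q) is the share of cases inside the top-q fraction, and PNF(p) is the smallest top fraction whose cumulative cases reach a share p. A minimal sketch on toy data (hypothetical risks and case labels, not from the paper):

```python
def pcf(risks, cases, q):
    """PCF(q): share of all cases found in the top q fraction of the
    population ranked by predicted risk (cases[i] is 1 for a case, else 0)."""
    order = sorted(range(len(risks)), key=lambda i: -risks[i])
    n_follow = int(round(q * len(risks)))
    return sum(cases[i] for i in order[:n_follow]) / sum(cases)

def pnf(risks, cases, p):
    """PNF(p): smallest top-risk fraction of the population whose
    cumulative cases reach at least proportion p of all cases."""
    order = sorted(range(len(risks)), key=lambda i: -risks[i])
    total, got = sum(cases), 0
    for k, i in enumerate(order, start=1):
        got += cases[i]
        if got >= p * total:
            return k / len(risks)
    return 1.0

# Toy validation sample: predicted risks and case indicators (hypothetical).
risks = [0.9, 0.8, 0.7, 0.2, 0.1]
cases = [1, 1, 0, 0, 0]
print(pcf(risks, cases, 0.4))   # 1.0: the top 40% captures all cases
print(pnf(risks, cases, 1.0))   # 0.4: the top 40% must be followed to cover all cases
```

This ranking construction is exactly the Lorenz-curve connection the abstract notes: PCF traces the curve and PNF its inverse.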

  4. [Establish research model of post-marketing clinical safety evaluation for Chinese patent medicine].

    PubMed

    Zheng, Wen-ke; Liu, Zhi; Lei, Xiang; Tian, Ran; Zheng, Rui; Li, Nan; Ren, Jing-tian; Du, Xiao-xi; Shang, Hong-cai

    2015-09-01

    The safety of Chinese patent medicine has become a focus of social concern, and it is necessary to carry out post-marketing clinical safety evaluation for Chinese patent medicine. However, no criteria exist to guide such research, so it is urgent to establish a model and methods to guide practice. Drawing on a series of clinical studies, we propose the following: clearly define the objective and content of clinical safety evaluation; determine the workflow; compile a list of items for the safety evaluation project; and adopt a three-level classification for risk control. On this basis we set up a model of post-marketing clinical safety evaluation for Chinese patent medicine. Under this model, the item list can be used to rank a medicine's risks and then to take steps appropriate to each risk, with the aim of lowering the overall risk level. Finally, the medicine can be managed through five sequential steps: risk signal collection, risk recognition, risk assessment, risk management, and after-effect assessment. We hope to provide new ideas for future research.

  5. Commitment-Insurance: Compensating for the Autonomy Costs of Interdependence in Close Relationships

    PubMed Central

    Murray, Sandra L.; Holmes, John G.; Aloni, Maya; Pinkus, Rebecca T.; Derrick, Jaye L.; Leder, Sadie

    2014-01-01

    A model of the commitment-insurance system is proposed to examine how low and high self-esteem people cope with the costs interdependence imposes on autonomous goal pursuits. In this system, autonomy costs automatically activate compensatory cognitive processes that attach greater value to the partner. Greater partner-valuing compels greater responsiveness to the partner’s needs. Two experiments and a daily diary study of newlyweds supported the model. Autonomy costs automatically activate more positive implicit evaluations of the partner. On explicit measures of positive illusions, high self-esteem people continue to compensate for costs. However, cost-primed low self-esteem people correct and override their positive implicit sentiments when they have the opportunity to do so. Such corrections put the marriages of low self-esteem people at risk: Failing to compensate for costs predicted declines in satisfaction over a one year period. PMID:19634974

  6. Cost-effectiveness of statins revisited: lessons learned about the value of innovation.

    PubMed

    Lindgren, Peter; Jönsson, Bengt

    2012-08-01

    The economic evaluation of statins has undergone a development from risk-factor-based models to modeling of hard end points in clinical trials, with a shift back to risk-factor models after increased confidence in their predictive power has now been established. At this point, we can look back on the historical economic data on simvastatin to see what lessons regarding reimbursement we can learn. Historical data on the usage and sales of simvastatin in Sweden were combined with published epidemiological and clinical data to calculate the social value of simvastatin to the present day and to make projections until 2018. The distribution of the social surplus was calculated by taking the costs borne by society and the producer of the drug into consideration. The cost of simvastatin fell drastically following patent expiration, although the number of treated patients has continued to grow. Presently, the use of simvastatin is close to cost neutrality, taking direct and indirect cost savings from reduced morbidity into account. However, the major part of the social surplus generated comes from the value of improved quality-adjusted survival. Of the social surplus generated, the producer appropriated 20-43% of the value during the on-patent period, a figure dropping to 1% following loss of exclusivity. The total producer surplus between 1987 and 2018 is 2-5% of the total social surplus. Only a small part of the surplus value generated was appropriated by the producer. A regulatory and reimbursement approach that favors early market access and coverage with evidence development, as opposed to long-term trials as a prerequisite for launch, is more attractive from both a company and a social perspective.

  7. Population-Level Prediction of Type 2 Diabetes From Claims Data and Analysis of Risk Factors.

    PubMed

    Razavian, Narges; Blecker, Saul; Schmidt, Ann Marie; Smith-McLallen, Aaron; Nigam, Somesh; Sontag, David

    2015-12-01

    We present a new approach to population health, in which data-driven predictive models are learned for outcomes such as type 2 diabetes. Our approach enables risk assessment from readily available electronic claims data on large populations, without additional screening cost. The proposed model uncovers both early- and late-stage risk factors. Using administrative claims, pharmacy records, healthcare utilization, and laboratory results of 4.1 million individuals between 2005 and 2009, an initial set of 42,000 variables was derived that together describe the full health status and history of every individual. Machine learning was then used to methodically enhance the predictive variable set and fit models predicting onset of type 2 diabetes in 2009-2011, 2010-2012, and 2011-2013. We compared the enhanced model with a parsimonious model consisting of known diabetes risk factors in a real-world environment, where missing values are common. Furthermore, we analyzed novel and known risk factors emerging from the model across age groups and at different stages before onset. The parsimonious model using 21 classic diabetes risk factors resulted in an area under the ROC curve (AUC) of 0.75 for diabetes prediction within a 2-year window following the baseline. The enhanced model increased the AUC to 0.80, with about 900 variables selected as predictive (p < 0.0001 for differences between AUCs). Similar improvements were observed for models predicting diabetes onset 1-3 years and 2-4 years after baseline. The enhanced model improved positive predictive value by at least 50% and identified novel surrogate risk factors for type 2 diabetes, such as chronic liver disease (odds ratio [OR] 3.71), high alanine aminotransferase (OR 2.26), esophageal reflux (OR 1.85), and history of acute bronchitis (OR 1.45). Liver risk factors emerge later in the process of diabetes development compared with obesity-related factors such as hypertension and high hemoglobin A1c. 
In conclusion, population-level risk prediction for type 2 diabetes using readily available administrative data is feasible and has better prediction performance than classical diabetes risk prediction algorithms on very large populations with missing data. The new model enables intervention allocation at national scale quickly and accurately and recovers potentially novel risk factors at different stages before the disease onset.
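The abstract above compares models by area under the ROC curve (AUC). As an illustrative sketch (not the study's actual pipeline, and with invented risk scores), the AUC can be computed directly from predicted risks via the Mann-Whitney interpretation: the probability that a randomly chosen case outranks a randomly chosen non-case.

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the fraction of
    case/non-case pairs in which the case's predicted risk is higher
    (ties count as half a win)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted risks for 4 eventual cases and 5 non-cases
cases = [0.9, 0.8, 0.7, 0.4]
controls = [0.6, 0.5, 0.3, 0.2, 0.1]
print(auc(cases, controls))  # → 0.9
```

This quadratic pairwise form is fine for illustration; production code would sort once and use ranks.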

  8. Solid pulmonary nodule risk assessment and decision analysis: comparison of four prediction models in 285 cases.

    PubMed

    Perandini, Simone; Soardi, Gian Alberto; Motton, Massimiliano; Rossi, Arianna; Signorini, Manuel; Montemezzi, Stefania

    2016-09-01

    The aim of this study was to compare classification results from four major risk prediction models in a wide population of incidentally detected solitary pulmonary nodules (SPNs), selected to match the inclusion criteria of all four models. A total of 285 solitary pulmonary nodules with a definitive diagnosis were evaluated by means of four major risk assessment models developed from non-screening populations, namely the Mayo, Gurney, PKUPH and BIMC models. Accuracy was evaluated by receiver operating characteristic (ROC) area under the curve (AUC) analysis. Each model's fitness to provide reliable help in decision analysis was primarily assessed by adopting a surgical threshold of 65 % and an observation threshold of 5 % as suggested by ACCP guidelines. ROC AUC values, false positives, false negatives and indeterminate nodules were respectively 0.775, 3, 8, 227 (Mayo); 0.794, 41, 6, 125 (Gurney); 0.889, 42, 0, 144 (PKUPH); 0.898, 16, 0, 118 (BIMC). These results suggest that the BIMC model may be of greater help than the Mayo, Gurney and PKUPH models in preoperative SPN characterization when using ACCP risk thresholds, because of overall better accuracy and smaller numbers of indeterminate nodules and false positive results. • The BIMC and PKUPH models offer better characterization than older prediction models • Both the PKUPH and BIMC models completely avoided false negative results • The Mayo model suffers from a large number of indeterminate results.
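The decision-analysis step described above maps each model's malignancy probability onto the ACCP thresholds. A minimal sketch (function name hypothetical) of that triage rule:

```python
def triage(p_malignancy, observe_thr=0.05, surgery_thr=0.65):
    """ACCP-style triage: observe below 5% risk, resect above 65%,
    and call everything in between indeterminate."""
    if p_malignancy < observe_thr:
        return "observe"
    if p_malignancy > surgery_thr:
        return "surgery"
    return "indeterminate"

print([triage(p) for p in (0.02, 0.30, 0.80)])
# → ['observe', 'indeterminate', 'surgery']
```

Counting the "indeterminate" bucket per model is what yields the 227/125/144/118 figures reported in the abstract.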

  9. Serum Trimethylamine N-oxide, Carnitine, Choline and Betaine in Relation to Colorectal Cancer Risk in the Alpha Tocopherol and Beta Carotene Study

    PubMed Central

    Guertin, Kristin A.; Li, Xinmin S.; Graubard, Barry I.; Albanes, Demetrius; Weinstein, Stephanie J.; Goedert, James J.; Wang, Zeneng; Hazen, Stanley L.; Sinha, Rashmi

    2017-01-01

    Background: TMAO, a choline-derived metabolite produced by gut microbiota, and its biomarker precursors have not been adequately evaluated in relation to colorectal cancer risk. Methods: We investigated the relationship between serum concentrations of TMAO and its biomarker precursors (choline, carnitine and betaine) and incident colorectal cancer risk in a nested case-control study of male smokers in the Alpha-Tocopherol, Beta-Carotene Cancer Prevention (ATBC) Study. We measured biomarker concentrations in baseline fasting serum samples from 644 incident colorectal cancer cases and 644 controls using LC-MS/MS. Logistic regression models estimated the odds ratio (OR) and 95% confidence interval (CI) for colorectal cancer by quartile (Q) of serum TMAO, choline, carnitine and betaine concentrations. Results: Men with higher serum choline at ATBC baseline had approximately 3-fold greater risk of developing colorectal cancer over the ensuing (median ± IQR) 14 ± 10 years (in fully adjusted models, Q4 vs. Q1 OR, 3.22; 95% CI, 2.24–4.61; P trend < 0.0001). The prognostic value of serum choline for prediction of incident colorectal cancer development was similarly robust for proximal, distal and rectal colon cancers (all P < 0.0001). The association between serum TMAO, carnitine, or betaine and colorectal cancer risk was not statistically significant (P = 0.25, P = 0.71 and P = 0.61, respectively). Conclusions: Higher serum choline concentration (but not TMAO, carnitine, or betaine) was associated with increased risk of colorectal cancer. Impact: Serum choline levels showed strong prognostic value for prediction of incident colorectal cancer risks across all anatomical subsites, suggesting a role of altered choline metabolism in colorectal cancer pathogenesis. PMID:28077427

  10. Serum Trimethylamine N-oxide, Carnitine, Choline, and Betaine in Relation to Colorectal Cancer Risk in the Alpha Tocopherol, Beta Carotene Cancer Prevention Study.

    PubMed

    Guertin, Kristin A; Li, Xinmin S; Graubard, Barry I; Albanes, Demetrius; Weinstein, Stephanie J; Goedert, James J; Wang, Zeneng; Hazen, Stanley L; Sinha, Rashmi

    2017-06-01

    Background: Trimethylamine N-oxide (TMAO), a choline-derived metabolite produced by gut microbiota, and its biomarker precursors have not been adequately evaluated in relation to colorectal cancer risk. Methods: We investigated the relationship between serum concentrations of TMAO and its biomarker precursors (choline, carnitine, and betaine) and incident colorectal cancer risk in a nested case-control study of male smokers in the Alpha-Tocopherol, Beta-Carotene Cancer Prevention (ATBC) Study. We measured biomarker concentrations in baseline fasting serum samples from 644 incident colorectal cancer cases and 644 controls using LC/MS-MS. Logistic regression models estimated the ORs and 95% confidence interval (CI) for colorectal cancer by quartile (Q) of serum TMAO, choline, carnitine, and betaine concentrations. Results: Men with higher serum choline at ATBC baseline had approximately 3-fold greater risk of developing colorectal cancer over the ensuing (median ± IQR) 14 ± 10 years (in fully adjusted models, Q4 vs. Q1, OR, 3.22; 95% CI, 2.24-4.61; P trend < 0.0001). The prognostic value of serum choline for prediction of incident colorectal cancer was similarly robust for proximal, distal, and rectal colon cancers (all P < 0.0001). The association between serum TMAO, carnitine, or betaine and colorectal cancer risk was not statistically significant (P = 0.25, 0.71, and 0.61, respectively). Conclusions: Higher serum choline concentration (but not TMAO, carnitine, or betaine) was associated with increased risk of colorectal cancer. Impact: Serum choline levels showed strong prognostic value for prediction of incident colorectal cancer risk across all anatomical subsites, suggesting a role of altered choline metabolism in colorectal cancer pathogenesis. Cancer Epidemiol Biomarkers Prev; 26(6); 945-52. ©2017 American Association for Cancer Research.
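The quartile analysis above reports an adjusted odds ratio with a Wald-style 95% CI. As a hedged illustration (the study fit adjusted logistic regression models, not this simple estimator, and the counts below are invented), an unadjusted Q4-vs-Q1 odds ratio and its CI can be computed from a 2x2 table:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table with Wald 95% CI.
    a, b = cases/controls in the top quartile; c, d = cases/controls
    in the reference quartile."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(240, 120, 110, 160)  # hypothetical counts
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Adjusted ORs such as the reported 3.22 (2.24-4.61) additionally condition on covariates, which this sketch does not attempt.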

  11. Predicting the 10-Year Risks of Atherosclerotic Cardiovascular Disease in Chinese Population: The China-PAR Project (Prediction for ASCVD Risk in China).

    PubMed

    Yang, Xueli; Li, Jianxin; Hu, Dongsheng; Chen, Jichun; Li, Ying; Huang, Jianfeng; Liu, Xiaoqing; Liu, Fangchao; Cao, Jie; Shen, Chong; Yu, Ling; Lu, Fanghong; Wu, Xianping; Zhao, Liancheng; Wu, Xigui; Gu, Dongfeng

    2016-11-08

    The accurate assessment of individual risk can be of great value in guiding and facilitating the prevention of atherosclerotic cardiovascular disease (ASCVD). However, prediction models in common use were formulated primarily in white populations. The China-PAR project (Prediction for ASCVD Risk in China) is aimed at developing and validating 10-year risk prediction equations for ASCVD from 4 contemporary Chinese cohorts. Two prospective studies, followed up with a unified protocol, were used as the derivation cohort to develop 10-year ASCVD risk equations in 21 320 Chinese participants. The external validation was evaluated in 2 independent Chinese cohorts with 14 123 and 70 838 participants. Furthermore, model performance was compared with the Pooled Cohort Equations reported in the American College of Cardiology/American Heart Association guideline. Over 12 years of follow-up in the derivation cohort with 21 320 Chinese participants, 1048 subjects developed a first ASCVD event. Sex-specific equations had C statistics of 0.794 (95% confidence interval, 0.775-0.814) for men and 0.811 (95% confidence interval, 0.787-0.835) for women. The predicted rates were similar to the observed rates, as indicated by a calibration χ2 of 13.1 for men (P=0.16) and 12.8 for women (P=0.17). Good internal and external validations of our equations were achieved in subsequent analyses. Compared with the Chinese equations, the Pooled Cohort Equations had lower C statistics and much higher calibration χ2 values in men. Our project developed effective tools with good performance for 10-year ASCVD risk prediction among a Chinese population that will help to improve the primary prevention and management of cardiovascular disease. © 2016 American Heart Association, Inc.
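The calibration χ2 reported above compares observed and predicted event counts across risk groups. A hedged sketch of that idea (the paper's exact grouping and statistic variant may differ, and the counts below are invented) using the simple Pearson form per group:

```python
def calibration_chi2(observed, expected):
    """Pearson-style calibration statistic: sum over risk groups of
    (observed events - predicted events)^2 / predicted events.
    Small values indicate predicted rates close to observed rates."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

obs = [12, 25, 40, 78, 130]   # events observed per risk quintile (hypothetical)
exp = [10, 28, 42, 80, 125]   # events the risk equations predicted
print(round(calibration_chi2(obs, exp), 2))  # → 1.07
```

A well-calibrated model keeps this statistic below the relevant χ2 critical value, which is the sense in which the reported values of 13.1 (P=0.16) and 12.8 (P=0.17) indicate adequate calibration.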

  12. AERMOD performance evaluation for three coal-fired electrical generating units in Southwest Indiana.

    PubMed

    Frost, Kali D

    2014-03-01

    An evaluation of the steady-state dispersion model AERMOD was conducted to determine its accuracy at predicting hourly ground-level concentrations of sulfur dioxide (SO2) by comparing model-predicted concentrations to a full year of monitored SO2 data. The two study sites comprise three coal-fired electrical generating units (EGUs) located in southwest Indiana. The sites are characterized by tall, buoyant stacks, flat terrain, multiple SO2 monitors, and relatively isolated locations. AERMOD v12060 and AERMOD v12345 with BETA options were evaluated at each study site. For the six monitor-receptor pairs evaluated, AERMOD showed generally good agreement with monitor values for the hourly 99th percentile SO2 design value, with design value ratios that ranged from 0.92 to 1.99. AERMOD was within acceptable performance limits for the Robust Highest Concentration (RHC) statistic (RHC ratios ranged from 0.54 to 1.71) at all six monitors. Analysis of the top 5% of hourly concentrations at the six monitor-receptor sites, paired in time and space, indicated poor model performance in the upper concentration range. The amount of hourly model-predicted data that was within a factor of 2 of observations at these higher concentrations ranged from 14 to 43% over the six sites. Analysis of subsets of data showed consistent overprediction during low wind speed and unstable meteorological conditions, and underprediction during stable, low wind conditions. Hourly paired comparisons represent a stringent measure of model performance; however, given the potential for application of hourly model predictions to the SO2 NAAQS design value, this may be appropriate. At these two sites, AERMOD v12345 BETA options do not improve model performance. 
A regulatory evaluation of AERMOD utilizing quantile-quantile (Q-Q) plots, the RHC statistic, and 99th percentile design value concentrations indicates that model performance is acceptable according to widely accepted regulatory performance limits. However, a scientific evaluation examining hourly paired monitor and model values at concentrations of interest indicates overprediction and underprediction bias that is outside of acceptable model performance measures. Overprediction of 1-hr SO2 concentrations by AERMOD presents major ramifications for state and local permitting authorities when establishing emission limits.
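The "within a factor of 2" screen used in the paired analysis above is commonly summarized as the FAC2 metric. An illustrative sketch (the values below are hypothetical, not the study's monitor data):

```python
def fac2(predicted, observed):
    """Fraction of model/monitor pairs whose ratio falls in [0.5, 2.0],
    a standard dispersion-model performance measure."""
    ok = sum(1 for p, o in zip(predicted, observed) if 0.5 <= p / o <= 2.0)
    return ok / len(observed)

pred = [30.0, 80.0, 10.0, 55.0, 200.0]   # modeled hourly SO2, ug/m3 (invented)
obs  = [25.0, 30.0, 24.0, 50.0, 120.0]   # monitored hourly SO2, ug/m3 (invented)
print(fac2(pred, obs))  # → 0.6
```

The study's finding that only 14-43% of high-concentration pairs met this criterion is what drives its conclusion of poor performance in the upper range.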

  13. Time-dependent efficacy of longitudinal biomarker for clinical endpoint.

    PubMed

    Kolamunnage-Dona, Ruwanthi; Williamson, Paula R

    2018-06-01

    Joint modelling of longitudinal biomarker and event-time processes has gained popularity in recent years because it yields more accurate and precise estimates. Within this modelling framework, a new methodology for evaluating the time-dependent efficacy of a longitudinal biomarker for a clinical endpoint is proposed in this article. In particular, the proposed model assesses how well longitudinally repeated measurements of a biomarker over various time periods (0,t) distinguish between individuals who developed the disease by time t and individuals who remain disease-free beyond time t. The receiver operating characteristic curve is used to provide the corresponding efficacy summaries at various t based on the association between longitudinal biomarker trajectory and risk of clinical endpoint prior to each time point. The model also allows detecting the time period over which a biomarker should be monitored for its best discriminatory value. The proposed approach is evaluated through simulation and illustrated on the motivating dataset from a prospective observational study of biomarkers to diagnose the onset of sepsis.

  14. A bootstrap based space-time surveillance model with an application to crime occurrences

    NASA Astrophysics Data System (ADS)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population-at-risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significance testing. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
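The core mechanism described above can be sketched as follows. This is a hedged toy version, not the paper's algorithm: a cell's current count is compared against expected values resampled from its own past counts, and the resampling distribution supplies an upper-tail p-value (all counts invented).

```python
import random

def surveillance_pvalue(past_counts, current_count, n_boot=2000, seed=42):
    """Bootstrap the expected count from past occurrences and return the
    upper-tail permutation p-value for the current count."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    at_least = 0
    for _ in range(n_boot):
        resample = [rng.choice(past_counts) for _ in past_counts]
        expected = sum(resample) / len(resample)
        if expected >= current_count:
            at_least += 1
    return (at_least + 1) / (n_boot + 1)

past = [2, 3, 1, 4, 2, 3, 2, 1]  # weekly counts in one spatial cell (invented)
print(surveillance_pvalue(past, current_count=7) < 0.05)  # → True: emerging hotspot
```

Note how nothing here requires population-at-risk data: the expectation comes entirely from the cell's own history, which is the property the paper emphasizes.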

  15. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). 
    The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) by the building area and number of floors. The risk was calculated by multiplying the vulnerability by the spatial probability and the building values. Changes in landslide risk were assessed using the loss estimation of four different periods: (1) pre-August 2003 disaster, (2) the August 2003 event, (3) post-August 2003 to 2011 and (4) smaller frequent events occurring over the entire 1996-2011 period. One of the major findings of our work was the calculation of a significant decrease in landslide risk after the 2003 disaster compared to the pre-disaster risk period. This indicates the importance of estimating risk a few years after a major event in order to avoid overestimation or exaggeration of future losses.
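The loss calculation described above multiplies vulnerability, spatial probability, and building value per element at risk. A minimal sketch of that product, with entirely hypothetical buildings and numbers:

```python
def building_risk(vulnerability, spatial_prob, value_eur):
    """Expected loss for one building:
    risk = vulnerability (0-1) x spatial probability (0-1) x value (EUR)."""
    return vulnerability * spatial_prob * value_eur

buildings = [
    # (vulnerability, spatial probability, market value in EUR) - invented
    (0.30, 0.10, 250_000),
    (0.55, 0.02, 400_000),
    (0.10, 0.25, 180_000),
]
total = sum(building_risk(v, p, val) for v, p, val in buildings)
print(round(total))  # → 16400, total expected loss in EUR
```

Summing this per-building product over a period's inventory is what yields the period loss estimates compared in the study.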

  16. Thinking Inside the Box: The Health Cube Paradigm for Health and Wellness Program Evaluation and Design

    PubMed Central

    Harris, Sharon

    2013-01-01

    Appropriately constructed health promotions can improve population health. The authors developed a practical model for designing, evaluating, and improving initiatives to provide optimal value. Three independent model dimensions (impact, engagement, and sustainability) and the resultant three-dimensional paradigm were described using hypothetical case studies, including a walking challenge, a health risk assessment survey, and an individual condition management program. The 3-dimensional model is illustrated and the dimensions are defined. Calculation of a 3-dimensional score for program comparisons, refinements, and measurement is explained. Program 1, the walking challenge, had high engagement and impact, but limited sustainability. Program 2, the health risk assessment survey, had high engagement and sustainability but limited impact. Program 3, the on-site condition management program, had measurable impact and sustainability but limited engagement, because of a lack of program capacity. Each initiative, though successful in 2 dimensions, lacked sufficient evolution along the third axis for optimal value. Calculation of a 3-dimensional score is useful for health promotion program development comparison and refinements, and overall measurement of program success. (Population Health Management 2013;16:291–295) PMID:23869538
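The abstract does not give the scoring formula, so the following is only a hedged illustration of one natural reading of a "health cube" score: the volume spanned by the impact, engagement, and sustainability ratings, which, like the paper's case studies, penalizes a program that is weak on any single axis (all ratings invented).

```python
def cube_score(impact, engagement, sustainability):
    """Hypothetical 'cube volume' score (each axis rated 0-10):
    a program weak on any one dimension scores low overall."""
    return impact * engagement * sustainability

walking_challenge = cube_score(8, 9, 2)   # strong impact/engagement, weak sustainability
hra_survey        = cube_score(2, 9, 8)   # weak impact
balanced_program  = cube_score(7, 7, 7)   # moderate on all three axes
print(balanced_program > walking_challenge)  # → True
```

A multiplicative score captures the paper's observation that excelling on two dimensions cannot compensate for neglecting the third, whereas a simple sum would.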

  17. The Stroke Assessment of Fall Risk (SAFR): predictive validity in inpatient stroke rehabilitation.

    PubMed

    Breisinger, Terry P; Skidmore, Elizabeth R; Niyonkuru, Christian; Terhorst, Lauren; Campbell, Grace B

    2014-12-01

    To evaluate relative accuracy of a newly developed Stroke Assessment of Fall Risk (SAFR) for classifying fallers and non-fallers, compared with a health system fall risk screening tool, the Fall Harm Risk Screen. Prospective quality improvement study conducted at an inpatient stroke rehabilitation unit at a large urban university hospital. Patients admitted for inpatient stroke rehabilitation (N = 419) with imaging or clinical evidence of ischemic or hemorrhagic stroke, between 1 August 2009 and 31 July 2010. Not applicable. Sensitivity, specificity, and area under the curve for receiver operating characteristic curves of both scales' classifications, based on fall risk score completed upon admission to inpatient stroke rehabilitation. A total of 68 (16%) participants fell at least once. The SAFR was significantly more accurate than the Fall Harm Risk Screen (p < 0.001), with area under the curve of 0.73, positive predictive value of 0.29, and negative predictive value of 0.94. For the Fall Harm Risk Screen, area under the curve was 0.56, positive predictive value was 0.19, and negative predictive value was 0.86. Sensitivity and specificity of the SAFR (0.78 and 0.63, respectively) were higher than those of the Fall Harm Risk Screen (0.57 and 0.48, respectively). An evidence-derived, population-specific fall risk assessment may more accurately predict fallers than a general fall risk screen for stroke rehabilitation patients. While the SAFR improves upon the accuracy of a general assessment tool, additional refinement may be warranted. © The Author(s) 2014.
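The screening metrics above all derive from a 2x2 confusion table. As a hedged sketch, the counts below are back-calculated approximations (not taken from the paper) chosen to be roughly consistent with the reported SAFR values in this 419-patient, 68-faller cohort:

```python
def screen_metrics(tp, fp, fn, tn):
    """Standard screening metrics from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # fallers flagged high risk
        "specificity": tn / (tn + fp),   # non-fallers flagged low risk
        "ppv": tp / (tp + fp),           # flagged patients who fell
        "npv": tn / (tn + fn),           # cleared patients who did not fall
    }

# Approximate back-calculated counts: 53+15 = 68 fallers, 130+221 = 351 non-fallers
m = screen_metrics(tp=53, fp=130, fn=15, tn=221)
print({k: round(v, 2) for k, v in m.items()})
```

With these counts the four metrics round to 0.78, 0.63, 0.29, and 0.94, matching the SAFR figures in the abstract, which illustrates how all four values are tied together by the table.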

  18. Empirically evaluating decision-analytic models.

    PubMed

    Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J

    2010-08-01

    Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5- to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles, with 67 meeting inclusion criteria, and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.

  19. Causal modelling applied to the risk assessment of a wastewater discharge.

    PubMed

    Paul, Warren L; Rokahr, Pat A; Webb, Jeff M; Rees, Gavin N; Clune, Tim S

    2016-03-01

    Bayesian networks (BNs), or causal Bayesian networks, have become quite popular in ecological risk assessment and natural resource management because of their utility as a communication and decision-support tool. Since their development in the field of artificial intelligence in the 1980s, however, Bayesian networks have evolved and merged with structural equation modelling (SEM). Unlike BNs, which are constrained to encode causal knowledge in conditional probability tables, SEMs encode this knowledge in structural equations, which is thought to be a more natural language for expressing causal information. This merger has clarified the causal content of SEMs and generalised the method such that it can now be performed using standard statistical techniques. As with BNs, the utility of this new generation of SEM in ecological risk assessment will need to be demonstrated with examples to foster an understanding and acceptance of the method. Here, we applied SEM to the risk assessment of a wastewater discharge to a stream, with a particular focus on the process of translating a causal diagram (conceptual model) into a statistical model which might then be used in the decision-making and evaluation stages of the risk assessment. The process of building and testing a spatial causal model is demonstrated using data from a spatial sampling design, and the implications of the resulting model are discussed in terms of the risk assessment. It is argued that a spatiotemporal causal model would have greater external validity than the spatial model, enabling broader generalisations to be made regarding the impact of a discharge, and greater value as a tool for evaluating the effects of potential treatment plant upgrades. Suggestions are made on how the causal model could be augmented to include temporal as well as spatial information, including suggestions for appropriate statistical models and analyses.

  20. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodology of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
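The presentation does not define the indicator algebraically, so the following is only a hedged reading: one natural formulation measures the fractional drop from the baseline system reliability to the reliability once residual risks are included (function name and values hypothetical).

```python
def reliability_impact(r_baseline, r_residual):
    """Hypothetical reliability impact indicator: the fractional
    reduction in mission reliability attributable to identified
    residual risks, relative to the baseline."""
    return (r_baseline - r_residual) / r_baseline

# Invented values for an ELV mission: 0.98 baseline reliability,
# 0.95 after folding in residual-risk contributions
print(round(reliability_impact(r_baseline=0.98, r_residual=0.95), 4))
```

Whatever the exact form used at KSC, the key property is the one stated in the abstract: a single quantitative measure of how much identified residual risks erode the baseline reliability.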

  1. Application of Bayesian networks in a hierarchical structure for environmental risk assessment: a case study of the Gabric Dam, Iran.

    PubMed

    Malekmohammadi, Bahram; Tayebzadeh Moghadam, Negar

    2018-04-13

    Environmental risk assessment (ERA) is a commonly used, effective tool applied to reduce adverse effects of environmental risk factors. In this study, ERA was investigated using the Bayesian network (BN) model based on a hierarchical structure of variables in an influence diagram (ID). The ID facilitated ranking of the different alternatives under uncertainty, which were then used to evaluate comparisons of the different risk factors. BN was used to present a new model for ERA applicable to complicated development projects such as dam construction. The methodology was applied to the Gabric Dam, in southern Iran. The main environmental risk factors in the region, presented by the Gabric Dam, were identified based on the Delphi technique and specific features of the study area. These included the following: flood, water pollution, earthquake, changes in land use, erosion and sedimentation, effects on the population, and ecosensitivity. These risk factors were then categorized based on results from the output decision node of the BN, including expected utility values for risk factors in the decision node. ERA was also performed for the Gabric Dam using the analytical hierarchy process (AHP) method to compare results of BN modeling with those of conventional methods. Results showed that a BN-based hierarchical structure for ERA produces acceptable and reasonable risk prioritization, proposes suitable solutions to reduce environmental risks, and can be used as a powerful decision support system for evaluating environmental risks.

  2. Internal photon and electron dosimetry of the newborn patient—a hybrid computational phantom study

    NASA Astrophysics Data System (ADS)

    Wayson, Michael; Lee, Choonsik; Sgouros, George; Treves, S. Ted; Frey, Eric; Bolch, Wesley E.

    2012-03-01

    Estimates of radiation absorbed dose to organs of the nuclear medicine patient are a requirement for administered activity optimization and for stochastic risk assessment. Pediatric patients, and in particular the newborn child, represent that portion of the patient population where such optimization studies are most crucial owing to the enhanced tissue radiosensitivities and longer life expectancies of this patient subpopulation. In cases where whole-body CT imaging is not available, phantom-based calculations of radionuclide S values (absorbed dose to a target tissue per nuclear transformation in a source tissue) are required for dose and risk evaluation. In this study, a comprehensive model of electron and photon dosimetry of the reference newborn child is presented based on a high-resolution hybrid-voxel phantom from the University of Florida (UF) patient model series. Values of photon specific absorbed fraction (SAF) were assembled for both the reference male and female newborn using the radiation transport code MCNPX v2.6. Values of electron SAF were assembled in a unique and time-efficient manner whereby the collisional and radiative components of organ dose, for both self- and cross-dose terms, were computed separately. Doses to the newborn skeletal tissues were assessed via fluence-to-dose response functions reported for the first time in this study. Values of photon and electron SAFs were used to assemble a complete set of S values for some 16 radionuclides commonly associated with molecular imaging of the newborn. These values were then compared to those available in the OLINDA/EXM software. S value ratios for organ self-dose ranged from 0.46 to 1.42, while similar ratios for organ cross-dose varied from a low of 0.04 to a high of 3.49. These large discrepancies are due in large part to the simplistic organ modeling in the stylized newborn model used in the OLINDA/EXM software. 
A comprehensive model of internal dosimetry is presented in this study for the newborn nuclear medicine patient based upon the UF hybrid computational phantom. Photon dose response functions, photon and electron SAFs, and tables of radionuclide S values for the newborn child (both male and female) are given in a series of four electronic annexes available at stacks.iop.org/pmb/57/1433/mmedia. These values can be applied to optimization studies of image quality and stochastic risk for this most vulnerable class of pediatric patients.

  3. Performance of new thresholds of the Glasgow Blatchford score in managing patients with upper gastrointestinal bleeding.

    PubMed

    Laursen, Stig B; Dalton, Harry R; Murray, Iain A; Michell, Nick; Johnston, Matt R; Schultz, Michael; Hansen, Jane M; Schaffalitzky de Muckadell, Ove B; Blatchford, Oliver; Stanley, Adrian J

    2015-01-01

    Upper gastrointestinal hemorrhage (UGIH) is a common cause of hospital admission. The Glasgow Blatchford score (GBS) is an accurate determinant of patients' risk for hospital-based intervention or death. Patients with a GBS of 0 are at low risk for poor outcome and could be managed as outpatients. Some investigators have therefore proposed extending the definition of low-risk patients by using a higher GBS cut-off value, possibly with an age adjustment. We compared 3 thresholds of the GBS and 2 age-adjusted modifications to identify the optimal cut-off value or modification. We performed an observational study of 2305 consecutive patients presenting with UGIH at 4 centers (Scotland, England, Denmark, and New Zealand). The performance of each threshold and modification was evaluated based on sensitivity and specificity analyses, the proportion of low-risk patients identified, and outcomes of patients classified as low risk. There were differences in age (P = .0001), need for intervention (P < .0001), mortality (P < .015), and GBS (P = .0001) among sites. All systems identified low-risk patients with high levels of sensitivity (>97%). The GBS at cut-off values of ≤1 and ≤2, and both modifications, identified low-risk patients with higher levels of specificity (40%-49%) than the GBS with a cut-off value of 0 (22% specificity; P < .001). The GBS at a cut-off value of ≤2 had the highest specificity, but 3% of patients classified as low risk had adverse outcomes. All GBS cut-off values, and score modifications, had low levels of specificity when tested in New Zealand (2.5%-11%). A GBS cut-off value of ≤1 and both GBS modifications identify almost twice as many low-risk patients with UGIH as a GBS cut-off value of 0. Implementing a protocol for outpatient management, based on one of these scores, could reduce hospital admissions by 15% to 20%. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.

  4. Incremental diagnostic quality gain of CTA over V/Q scan in the assessment of pulmonary embolism by means of a Wells score Bayesian model: results from the ACDC collaboration.

    PubMed

    Cochon, Laila; McIntyre, Kaitlin; Nicolás, José M; Baez, Amado Alejandro

    2017-08-01

    Our objective was to evaluate the diagnostic value of computed tomography angiography (CTA) and ventilation perfusion (V/Q) scan in the assessment of pulmonary embolism (PE) by means of a Bayesian statistical model. Wells criteria defined pretest probability. Sensitivity and specificity of CTA and V/Q scan for PE were derived from pooled meta-analysis data. Likelihood ratios calculated for CTA and V/Q were inserted in the nomogram. Absolute (ADG) and relative diagnostic gains (RDG) were analyzed comparing post- and pretest probability. Comparative gain difference was calculated for CTA ADG over V/Q scan, integrating ANOVA with a p value set at 0.05. The sensitivity for CTA was 86.0% (95% CI: 80.2%, 92.1%) and the specificity 93.7% (95% CI: 91.1%, 96.3%). The V/Q scan yielded a sensitivity of 96% (95% CI: 95%, 97%) and a specificity of 97% (95% CI: 96%, 98%). Bayes nomogram results for CTA in the low-risk group yielded a posttest probability of 71.1%, an ADG of 56.1%, and an RDG of 374%; in the moderate-risk group, a posttest probability of 85.1%, an ADG of 56.1%, and an RDG of 193.4%; and in the high-risk group, a posttest probability of 95.2%, an ADG of 36.2%, and an RDG of 61.35%. The comparative gain difference favoring CTA was 46.1% in the low-risk population, 41.6% in the moderate-risk population, and 22.1% in the high-risk population. ANOVA for LR+ and LR- showed no significant difference (p = 0.8745 and p = 0.9841, respectively). This Bayesian model demonstrated the superiority of CTA over V/Q scan for the diagnosis of pulmonary embolism, with low-risk patients showing the greatest overall comparative gain favoring CTA.
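
The nomogram arithmetic behind these posttest probabilities is Bayes' theorem in odds form: posttest odds = pretest odds × likelihood ratio. A sketch using the CTA operating characteristics reported above; the 15% pretest probability for the Wells low-risk group is an assumed illustration value, since the abstract does not state the pretest figures:

```python
def positive_likelihood_ratio(sens, spec):
    """LR+ = sensitivity / (1 - specificity)."""
    return sens / (1.0 - spec)

def posttest_probability(pretest, lr):
    """Bayes in odds form: convert to odds, multiply by LR, convert back."""
    odds = pretest / (1.0 - pretest)
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

lr_pos = positive_likelihood_ratio(0.86, 0.937)  # CTA: roughly 13.7
p_post = posttest_probability(0.15, lr_pos)      # ~0.71 with the assumed 15% pretest
```

With a 15% pretest probability this reproduces a posttest probability close to the 71.1% reported for the low-risk group, which is why that pretest value was chosen for the illustration.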

  5. Echocardiography and risk prediction in advanced heart failure: incremental value over clinical markers.

    PubMed

    Agha, Syed A; Kalogeropoulos, Andreas P; Shih, Jeffrey; Georgiopoulou, Vasiliki V; Giamouzis, Grigorios; Anarado, Perry; Mangalat, Deepa; Hussain, Imad; Book, Wendy; Laskar, Sonjoy; Smith, Andrew L; Martin, Randolph; Butler, Javed

    2009-09-01

    Incremental value of echocardiography over clinical parameters for outcome prediction in advanced heart failure (HF) is not well established. We evaluated 223 patients with advanced HF receiving optimal therapy (91.9% angiotensin-converting enzyme inhibitor/angiotensin receptor blocker, 92.8% beta-blockers, 71.8% biventricular pacemaker, and/or defibrillator use). The Seattle Heart Failure Model (SHFM) was used as the reference clinical risk prediction scheme. The incremental value of echocardiographic parameters for event prediction (death or urgent heart transplantation) was measured by the improvement in fit and discrimination achieved by addition of standard echocardiographic parameters to the SHFM. After a median follow-up of 2.4 years, there were 38 (17.0%) events (35 deaths; 3 urgent transplants). The SHFM had a likelihood ratio (LR) χ² of 32.0 and a C statistic of 0.756 for event prediction. Left ventricular end-systolic volume, stroke volume, and severe tricuspid regurgitation were independent echocardiographic predictors of events. The addition of these parameters to the SHFM improved the LR χ² to 72.0 and the C statistic to 0.866 (P < .001 and P = .019, respectively). Reclassifying the SHFM-predicted risk with use of the echocardiography-added model resulted in improved prognostic separation. Addition of standard echocardiographic variables to the SHFM results in significant improvement in risk prediction for patients with advanced HF.

  6. Two-stage stochastic unit commitment model including non-generation resources with conditional value-at-risk constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yuping; Zheng, Qipeng P.; Wang, Jianhui

    2014-11-01

    This paper presents a two-stage stochastic unit commitment (UC) model, which integrates non-generation resources such as demand response (DR) and energy storage (ES) while including risk constraints to balance between cost and system reliability due to the fluctuation of variable generation such as wind and solar power. This paper uses conditional value-at-risk (CVaR) measures to model risks associated with the decisions in a stochastic environment. In contrast to chance-constrained models requiring extra binary variables, risk constraints based on CVaR only involve linear constraints and continuous variables, making the model more computationally attractive. The proposed models with risk constraints are able to avoid over-conservative solutions but still ensure system reliability represented by loss of loads. Numerical experiments are then conducted to study the effects of non-generation resources on generator schedules and the difference in total expected generation costs with risk consideration. Sensitivity analysis based on reliability parameters is also performed to test the decision preferences of confidence levels and load-shedding loss allowances on generation cost reduction.
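
The reason CVaR constraints stay linear, as the abstract notes, is the Rockafellar-Uryasev representation: over m equally likely loss scenarios, CVaR_alpha = min over eta of [eta + (1/((1-alpha)m)) * sum_s max(0, loss_s - eta)], where each max(0, .) term becomes an auxiliary continuous variable in the LP or MILP. A self-contained sketch on toy scenario losses (the loss values are illustrative, not from the paper's experiments):

```python
def cvar(losses, alpha):
    """Empirical CVaR via the Rockafellar-Uryasev formula.

    The minimizing eta is attained at one of the scenario losses (the VaR),
    so for equally likely scenarios it suffices to evaluate the objective
    at each scenario value and take the minimum.
    """
    m = len(losses)

    def objective(eta):
        return eta + sum(max(0.0, l - eta) for l in losses) / ((1.0 - alpha) * m)

    return min(objective(eta) for eta in losses)

# Ten equally likely scenario losses; at alpha = 0.8 the CVaR is the
# mean of the worst 20% of scenarios: (9 + 10) / 2 = 9.5
losses = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(cvar(losses, 0.8))  # 9.5
```

In the UC model the same max(0, .) terms are written as nonnegative auxiliary variables with linear inequality constraints, which is exactly why no extra binaries are needed.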

  7. Integrated Research on the Development of Global Climate Risk Management Strategies - Framework and Initial Results of the Research Project ICA-RUS

    NASA Astrophysics Data System (ADS)

    Emori, Seita; Takahashi, Kiyoshi; Yamagata, Yoshiki; Oki, Taikan; Mori, Shunsuke; Fujigaki, Yuko

    2013-04-01

    With the aim of proposing strategies of global climate risk management, we have launched a five-year research project called ICA-RUS (Integrated Climate Assessment - Risks, Uncertainties and Society). In this project with the phrase "risk management" in its title, we aim for a comprehensive assessment of climate change risks, explicit consideration of uncertainties, utilization of the best available information, and consideration of all possible conditions and options. We also regard the problem as one of decision-making at the human level, which involves social value judgments and adapts to future changes in circumstances. The ICA-RUS project consists of the following five themes: 1) Synthesis of global climate risk management strategies, 2) Optimization of land, water and ecosystem uses for climate risk management, 3) Identification and analysis of critical climate risks, 4) Evaluation of climate risk management options under technological, social and economic uncertainties and 5) Interactions between scientific and social rationalities in climate risk management (see also: http://www.nies.go.jp/ica-rus/en/). For the integration of quantitative knowledge of climate change risks and responses, we apply a tool named AIM/Impact [Policy], which consists of an energy-economic model, a simplified climate model and impact projection modules. At the same time, in order to make use of qualitative knowledge as well, we hold monthly project meetings for the discussion of risk management strategies and publish annual reports based on the quantitative and qualitative information. To enhance the comprehensiveness of the analyses, we maintain an inventory of risks and risk management options. The inventory is revised iteratively through interactive meetings with stakeholders such as policymakers, government officials and industrial representatives.

  8. The evaluation of speed skating helmet performance through peak linear and rotational accelerations.

    PubMed

    Karton, Clara; Rousseau, Philippe; Vassilyadi, Michael; Hoshizaki, Thomas Blaine

    2014-01-01

    As in many sports involving high speeds and body contact, head injuries are a concern for short track speed skating athletes and coaches. While the mandatory use of helmets has managed to nearly eliminate catastrophic head injuries such as skull fractures and cerebral haemorrhages, helmets may not be as effective at reducing the risk of a concussion. The purpose of this study was to evaluate the performance characteristics of speed skating helmets with respect to managing peak linear and peak rotational acceleration, and to compare their performance against other types of helmets commonly worn within the sport. Commercially available speed skating, bicycle and ice hockey helmets were evaluated using a three-impact condition test protocol at an impact velocity of 4 m/s. Two speed skating helmet models yielded mean peak linear accelerations in a low estimated probability range for sustaining a concussion for all three impact conditions. Conversely, the resulting mean peak rotational acceleration values were all close to the high end of the probability range for sustaining a concussion. A similar tendency was observed for the bicycle and ice hockey helmets under the same impact conditions. Speed skating helmets may therefore not be as effective at managing rotational acceleration and may not successfully protect the user against risks associated with concussion injuries.

  9. Quantifying the predictive accuracy of time-to-event models in the presence of competing risks.

    PubMed

    Schoop, Rotraut; Beyersmann, Jan; Schumacher, Martin; Binder, Harald

    2011-02-01

    Prognostic models for time-to-event data play a prominent role in therapy assignment, risk stratification and inter-hospital quality assurance. The assessment of their prognostic value is vital not only for responsible resource allocation, but also for their widespread acceptance. The additional presence of competing risks to the event of interest requires proper handling not only on the model building side, but also during assessment. Research into methods for the evaluation of the prognostic potential of models accounting for competing risks is still needed, as most proposed methods measure either their discrimination or calibration, but do not examine both simultaneously. We adapt the prediction error proposal of Graf et al. (Statistics in Medicine 1999, 18, 2529–2545) and Gerds and Schumacher (Biometrical Journal 2006, 48, 1029–1040) to handle models with competing risks, i.e. more than one possible event type, and introduce a consistent estimator. A simulation study investigating the behaviour of the estimator in small sample size situations and for different levels of censoring together with a real data application follows.
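
The prediction error of Graf et al. that this work extends reduces, in the uncensored single-event case, to a time-dependent Brier score: the mean squared difference between each subject's 0/1 event status at time t and the model's predicted event probability. A minimal uncensored sketch (the inverse-probability-of-censoring weights and competing-risks cause-specific terms of the actual estimator are omitted; the data are illustrative):

```python
def brier_score(event_observed, predicted_prob):
    """Mean squared error between 0/1 event status at time t and the
    predicted event probability; lower is better, and 0.25 is the score
    of an uninformative constant prediction of 0.5."""
    n = len(event_observed)
    return sum((y - p) ** 2 for y, p in zip(event_observed, predicted_prob)) / n

# Four patients: status at time t (1 = event of interest occurred) and
# the model's predicted event probabilities at t
status = [1, 0, 1, 0]
preds = [0.9, 0.2, 0.7, 0.1]
print(brier_score(status, preds))  # close to 0.0375
```

The squared-error form is what lets the score capture discrimination and calibration simultaneously, which is the property the authors carry over to the competing-risks setting.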

  10. The application of a decision tree to establish the parameters associated with hypertension.

    PubMed

    Tayefi, Maryam; Esmaeili, Habibollah; Saberi Karimian, Maryam; Amirabadi Zadeh, Alireza; Ebrahimi, Mahmoud; Safarian, Mohammad; Nematy, Mohsen; Parizadeh, Seyed Mohammad Reza; Ferns, Gordon A; Ghayour-Mobarhan, Majid

    2017-02-01

    Hypertension is an important risk factor for cardiovascular disease (CVD). The goal of this study was to establish the factors associated with hypertension by using a decision-tree algorithm as a supervised classification method of data mining. Data from a cross-sectional study were used in this study. A total of 9078 subjects who met the inclusion criteria were recruited. 70% of these subjects (6358 cases) were randomly allocated to the training dataset for constructing the decision tree. The remaining 30% (2720 cases) were used as the testing dataset to evaluate the performance of the decision tree. Two models were evaluated in this study. In model I, age, gender, body mass index, marital status, level of education, occupation status, depression and anxiety status, physical activity level, smoking status, LDL, TG, TC, FBG, uric acid and hs-CRP were considered as input variables, and in model II, age, gender, WBC, RBC, HGB, HCT, MCV, MCH, PLT, RDW and PDW were considered as input variables. The validation of the model was assessed by constructing a receiver operating characteristic (ROC) curve. The prevalence of hypertension was 32% in our population. For decision-tree model I, the accuracy, sensitivity, specificity and area under the ROC curve (AUC) value for identifying the related risk factors of hypertension were 73%, 63%, 77% and 0.72, respectively. The corresponding values for model II were 70%, 61%, 74% and 0.68, respectively. We have developed a decision tree model to identify the risk factors associated with hypertension that may be used to develop programs for hypertension management. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Multilevel models for evaluating the risk of pedestrian-motor vehicle collisions at intersections and mid-blocks

    PubMed Central

    Quistberg, D. Alex; Howard, Eric J.; Ebel, Beth E.; Moudon, Anne V.; Saelens, Brian E.; Hurvitz, Philip M.; Curtin, James E.; Rivara, Frederick P.

    2015-01-01

    Walking is a popular form of physical activity associated with clear health benefits. Promoting safe walking for pedestrians requires evaluating the risk of pedestrian-motor vehicle collisions at specific roadway locations in order to identify where road improvements and other interventions may be needed. The objective of this analysis was to estimate the risk of pedestrian collisions at intersections and mid-blocks in Seattle, WA. The study used 2007-2013 pedestrian-motor vehicle collision data from police reports and detailed characteristics of the microenvironment and macroenvironment at intersection and mid-block locations. The primary outcome was the number of pedestrian-motor vehicle collisions over time at each location (incident rate ratio [IRR] and 95% confidence interval [95% CI]). Multilevel mixed effects Poisson models accounted for correlation within and between locations and census blocks over time. Analysis accounted for pedestrian and vehicle activity (e.g., residential density and road classification). In the final multivariable model, intersections with 4 segments or 5 or more segments had higher pedestrian collision rates compared to mid-blocks. Non-residential roads had significantly higher rates than residential roads, with principal arterials having the highest collision rate. The pedestrian collision rate was higher by 9% per 10 feet of street width. Locations with traffic signals had twice the collision rate of locations without a signal and those with marked crosswalks also had a higher rate. Locations with a one-way road or those with signs encouraging motorists to cede the right-of-way to pedestrians had fewer pedestrian collisions. Collision rates were higher in locations that encourage greater pedestrian activity (more bus use, more fast food restaurants, higher employment, residential, and population densities).
Locations with higher intersection density had a lower rate of collisions, as did those in areas with higher residential property values. The novel spatiotemporal approach used here, which integrates road/crossing characteristics with surrounding neighborhood characteristics, should help city agencies better identify high-risk locations for further study and analysis. Improving roads and making them safer for pedestrians achieves the public health goals of reducing pedestrian collisions and promoting physical activity. PMID:26339944
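
In a Poisson model of this kind, a covariate effect is reported as an incidence rate ratio, IRR = exp(beta), and effects multiply on the rate scale, which is how the reported 9% increase per 10 feet of street width compounds over wider streets. A small sketch of that arithmetic; the coefficient is back-derived from the reported IRR of 1.09 rather than taken from the paper's tables:

```python
import math

def irr(beta):
    """Incidence rate ratio for a one-unit covariate increase in a Poisson model."""
    return math.exp(beta)

# Back-derived coefficient for one unit = 10 ft of street width (IRR = 1.09)
beta_width = math.log(1.09)
print(irr(beta_width))  # ~1.09

# Effects multiply on the rate scale: a street 30 ft wider has
# 1.09**3, roughly 1.30, times the expected pedestrian collision rate
print(irr(3 * beta_width))
```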

  12. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  13. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.

  14. Longitudinal histories as predictors of future diagnoses of domestic abuse: modelling study

    PubMed Central

    Kohane, Isaac S; Mandl, Kenneth D

    2009-01-01

    Objective To determine whether longitudinal data in patients’ historical records, commonly available in electronic health record systems, can be used to predict a patient’s future risk of receiving a diagnosis of domestic abuse. Design Bayesian models, known as intelligent histories, used to predict a patient’s risk of receiving a future diagnosis of abuse, based on the patient’s diagnostic history. Retrospective evaluation of the model’s predictions using an independent testing set. Setting A state-wide claims database covering six years of inpatient admissions to hospital, admissions for observation, and encounters in emergency departments. Population All patients aged over 18 who had at least four years between their earliest and latest visits recorded in the database (561 216 patients). Main outcome measures Timeliness of detection, sensitivity, specificity, positive predictive values, and area under the ROC curve. Results 1.04% (5829) of the patients met the narrow case definition for abuse, while 3.44% (19 303) met the broader case definition for abuse. The model achieved sensitive, specific (area under the ROC curve of 0.88), and early (10-30 months in advance, on average) prediction of patients’ future risk of receiving a diagnosis of abuse. Analysis of model parameters showed important differences between sexes in the risks associated with certain diagnoses. Conclusions Commonly available longitudinal diagnostic data can be useful for predicting a patient’s future risk of receiving a diagnosis of abuse. This modelling approach could serve as the basis for an early warning system to help doctors identify high risk patients for further screening. PMID:19789406
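
A model of the kind described, scoring future risk from the diagnoses in a patient's history, can be sketched as a naive Bayes log-odds update in which each observed diagnosis contributes its likelihood ratio. The diagnosis codes and ratios below are invented for illustration and are unrelated to the study's fitted parameters:

```python
import math

# Hypothetical per-diagnosis likelihood ratios P(dx | abuse) / P(dx | no abuse)
LIKELIHOOD_RATIOS = {"contusion": 2.5, "fracture": 1.8, "anxiety": 1.4, "flu": 1.0}

def risk_from_history(prior, history):
    """Naive Bayes: start from prior log-odds, add the log-LR of each
    observed diagnosis, then convert back to a probability."""
    log_odds = math.log(prior / (1.0 - prior))
    for dx in history:
        log_odds += math.log(LIKELIHOOD_RATIOS.get(dx, 1.0))
    return 1.0 / (1.0 + math.exp(-log_odds))

# A low population prior updated by three suggestive diagnoses
p = risk_from_history(prior=0.01, history=["contusion", "fracture", "anxiety"])
```

A fitted version would estimate the per-diagnosis likelihoods from training data (per sex, given the differences the authors report) and would need to address the conditional-independence assumption that naive Bayes makes.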

  15. Evaluation of Eligibility Criteria Used to Identify Patients for Medication Therapy Management Services: A Retrospective Cohort Study in a Medicare Advantage Part D Population.

    PubMed

    Lee, Janet S; Yang, Jianing; Stockl, Karen M; Lew, Heidi; Solow, Brian K

    2016-01-01

    General eligibility criteria used by the Centers for Medicare & Medicaid Services (CMS) to identify patients for medication therapy management (MTM) services include having multiple chronic conditions, taking multiple Part D drugs, and being likely to incur annual drug costs that exceed a predetermined threshold. The performance of these criteria in identifying patients in greatest need of MTM services is unknown. Although there are numerous possible versions of MTM identification algorithms that satisfy these criteria, there are limited data that evaluate the performance of MTM services using eligibility thresholds representative of those used by the majority of Part D sponsors. To (a) evaluate the performance of the 2013 CMS MTM eligibility criteria thresholds in identifying Medicare Advantage Prescription Drug (MAPD) plan patients with at least 2 drug therapy problems (DTPs) relative to alternative criteria threshold levels and (b) identify additional patient risk factors significantly associated with the number of DTPs for consideration as potential future MTM eligibility criteria. All patients in the Medicare Advantage Part D population who had pharmacy eligibility as of December 31, 2013, were included in this retrospective cohort study. Study outcomes included 7 different types of DTPs: use of high-risk medications in the elderly, gaps in medication therapy, medication nonadherence, drug-drug interactions, duplicate therapy, drug-disease interactions, and brand-to-generic conversion opportunities. DTPs were identified for each member based on 6 months of most recent pharmacy claims data and 14 months of most recent medical claims data. Risk factors examined in this study included patient demographics and prior health care utilization in the most recent 6 months. Descriptive statistics were used to summarize patient characteristics and to evaluate unadjusted relationships between the average number of DTPs identified per patient and each risk factor. 
Quartile values identified in the study population for number of diseases, number of drugs, and annual spend were used as potential new criteria thresholds, resulting in 27 new MTM criteria combinations. The performance of each eligibility criterion was evaluated using sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs). Patients identified with at least 2 DTPs were defined as those who would benefit from MTM services and were used as the gold standard. As part of a sensitivity analysis, patients identified with at least 1 DTP were used as the gold standard. Lastly, a multivariable negative binomial regression model was used to evaluate the relationship between each risk factor and the number of identified DTPs per patient while controlling for the patients' number of drugs, number of chronic diseases, and annual drug spend. A total of 2,578,336 patients were included in the study. The sensitivity, specificity, PPV, and NPV of CMS MTM criteria for the 2013 plan year were 15.3%, 95.6%, 51.3%, and 78.8%, respectively. Sensitivity and PPV improved when the drug count threshold increased from 8 to 10, and when the annual drug cost decreased from $3,144 to $2,239 or less. Results were consistent when at least 1 DTP was used as the gold standard. The adjusted rate of DTPs was significantly greater among patients identified with higher drug and disease counts, annual drug spend, and prior ER or outpatient or hospital visits. Patients with higher median household incomes who were male, younger, or white had significantly lower rates of DTPs. The performance of MTM eligibility criteria can be improved by increasing the threshold values for drug count while decreasing the threshold value for annual drug spend. Furthermore, additional risk factors, such as a recent ER or hospital visit, may be considered as potential MTM eligibility criteria.
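
The four performance measures used throughout this evaluation derive from a 2×2 table of the eligibility flag against the ≥2-DTP gold standard. A minimal sketch with made-up cell counts (the study's actual counts are not given in the abstract):

```python
def classification_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV from 2x2 confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # flagged among those who truly need MTM
        "specificity": tn / (tn + fp),  # not flagged among those who do not
        "ppv": tp / (tp + fp),          # flagged patients who truly need MTM
        "npv": tn / (tn + fn),          # unflagged patients who truly do not
    }

# Illustrative counts only: tp = eligible and >= 2 DTPs, fp = eligible and < 2 DTPs, etc.
m = classification_metrics(tp=30, fp=10, fn=20, tn=40)
```

Raising the drug-count threshold moves patients from the flagged to the unflagged column, which is why it trades sensitivity against PPV in the way the study reports.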

  16. Validating prediction scales of type 2 diabetes mellitus in Spain: the SPREDIA-2 population-based prospective cohort study protocol

    PubMed Central

    Salinero-Fort, Miguel Ángel; de Burgos-Lunar, Carmen; Mostaza Prieto, José; Lahoz Rallo, Carlos; Abánades-Herranz, Juan Carlos; Gómez-Campelo, Paloma; Laguna Cuesta, Fernando; Estirado De Cabo, Eva; García Iglesias, Francisca; González Alegre, Teresa; Fernández Puntero, Belén; Montesano Sánchez, Luis; Vicent López, David; Cornejo Del Río, Víctor; Fernández García, Pedro J; Sabín Rodríguez, Concesa; López López, Silvia; Patrón Barandío, Pedro

    2015-01-01

    Introduction The incidence of type 2 diabetes mellitus (T2DM) is increasing worldwide. When diagnosed, many patients already have organ damage or advanced subclinical atherosclerosis. An early diagnosis could allow the implementation of lifestyle changes and treatment options aimed at delaying the progression of the disease and avoiding cardiovascular complications. Different scores for identifying undiagnosed diabetes have been reported; however, their performance in populations of southern Europe has not been sufficiently evaluated. The main objectives of our study are: to evaluate the screening performance and cut-off points of the main scores that identify the risk of undiagnosed T2DM and prediabetes in a Spanish population, and to develop and validate our own predictive models of undiagnosed T2DM (screening model) and future T2DM (prediction risk model) after 5-year follow-up. As a secondary objective, we will evaluate the atherosclerotic burden of the population with undiagnosed T2DM. Methods and analysis Population-based prospective cohort study with baseline screening, to evaluate the performance of the FINDRISC, DANISH, DESIR, ARIC and QDScore against the gold standard tests: fasting plasma glucose, oral glucose tolerance and/or HbA1c. The sample size will include 1352 participants between the ages of 45 and 74 years. Analysis: sensitivity, specificity, positive predictive value, negative predictive value, likelihood ratio positive, likelihood ratio negative, and receiver operating characteristic curves and area under curve. Binary logistic regression for the first 700 individuals (derivation) and last 652 (validation) will be performed. All analyses will be calculated with their 95% CI; statistical significance will be p<0.05. Ethics and dissemination The study protocol has been approved by the Research Ethics Committee of the Carlos III Hospital (Madrid).
The score performance and predictive model will be presented in medical conferences, workshops, seminars and round table discussions. Furthermore, the predictive model will be published in a peer-reviewed medical journal to further increase the exposure of the scores. PMID:26220868

  17. Coal Seam Methane Pressure as a Parameter Determining the Level of the Outburst Risk - Laboratory and in Situ Research / Ciśnienie Złożowe Jako Parametr Określający Stan Zagrożenia Wyrzutami Metanu I Skał - Badania Laboratoryjne I Kopalniane

    NASA Astrophysics Data System (ADS)

    Skoczylas, Norbert

    2012-12-01

    Scarcity of research focusing on the evaluation of coal seam methane pressure as a parameter determining the outburst risk makes it difficult to assess the value at which the level of this risk increases considerably. It is obvious that, apart from the gas factor, the evaluation of the threat should also take into account the strength factor. The research presented in this paper attempted to estimate the level of the outburst risk on the basis of the coal seam methane pressure value and the firmness of coal. In this work, the author presents both the relevant laboratory research and the measurements carried out in mines.

  18. Measuring the coupled risks: A copula-based CVaR model

    NASA Astrophysics Data System (ADS)

    He, Xubiao; Gong, Pu

    2009-01-01

    Integrated risk management for financial institutions requires an approach for aggregating risk types (such as market and credit) whose distributional shapes vary considerably. Financial institutions often ignore the coupling influence among risks and thus underestimate their overall financial risk. We constructed a copula-based Conditional Value-at-Risk (CVaR) model for market and credit risks. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks, such as skewness and fat tails, while allowing for a rich dependence structure. Finally, numerical simulation is used to implement the model. Our results indicate that the coupled risks for a listed company's stock may be undervalued if credit risk is ignored, especially for listed companies with poor credit quality.
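
The paper's central point, that ignoring the coupling between market and credit losses understates tail risk, can be sketched with a Gaussian copula inside a Monte Carlo CVaR estimate. This standard-library sketch uses lognormal marginals; the correlation, marginal choice, and confidence level are illustrative assumptions, not the paper's calibration:

```python
import math
import random

def empirical_cvar(losses, alpha):
    """Mean of the worst (1 - alpha) fraction of simulated losses."""
    tail = sorted(losses, reverse=True)[: max(1, int(len(losses) * (1 - alpha)))]
    return sum(tail) / len(tail)

def simulate_total_losses(rho, n=20000, seed=0):
    """Gaussian copula: correlate two standard normals, map each through a
    lognormal marginal (fat-tailed), and sum the two loss components."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x2 = rho * z1 + math.sqrt(1 - rho * rho) * z2  # correlated normal
        losses.append(math.exp(z1) + math.exp(x2))      # lognormal marginals
    return losses

cvar_indep = empirical_cvar(simulate_total_losses(rho=0.0), alpha=0.95)
cvar_coupled = empirical_cvar(simulate_total_losses(rho=0.8), alpha=0.95)
# cvar_coupled exceeds cvar_indep: ignoring the coupling undervalues tail risk
```

The same seed drives both runs, so the gap between the two CVaR figures isolates the effect of the dependence structure rather than sampling noise.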

  19. Seawater intrusion risk analysis under climate change conditions for the Gaza Strip aquifer (Palestine)

    NASA Astrophysics Data System (ADS)

    Dentoni, Marta; Deidda, Roberto; Paniconi, Claudio; Marrocu, Marino; Lecca, Giuditta

    2014-05-01

    Seawater intrusion (SWI) has become a major threat to coastal freshwater resources, particularly in the Mediterranean basin, where this problem is exacerbated by the lack of appropriate groundwater resources management and by serious potential impacts from projected climate changes. A proper analysis and risk assessment that includes climate scenarios is essential for the design of water management measures to mitigate the environmental and socio-economic impacts of SWI. In this study a methodology for SWI risk analysis in coastal aquifers is developed and applied to the Gaza Strip coastal aquifer in Palestine. The method is based on the origin-pathway-target model, evaluating the final value of SWI risk by applying the overlay principle to the hazard map (representing the origin of SWI), the vulnerability map (representing the pathway of groundwater flow) and the elements map (representing the target of SWI). Results indicate the important role of groundwater simulation in SWI risk assessment and illustrate how mitigation measures can be developed according to predefined criteria to arrive at quantifiable expected benefits. Keywords: Climate change, coastal aquifer, seawater intrusion, risk analysis, simulation/optimization model. Acknowledgements. The study is partially funded by the project "Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB)", FP7-ENV-2009-1, GA 244151.

  20. Solving the Value Equation: Assessing Surgeon Performance Using Risk-Adjusted Quality-Cost Diagrams and Surgical Outcomes.

    PubMed

    Knechtle, William S; Perez, Sebastian D; Raval, Mehul V; Sullivan, Patrick S; Duwayri, Yazan M; Fernandez, Felix; Sharma, Joe; Sweeney, John F

    Quality-cost diagrams have been used previously to assess interventions and their cost-effectiveness. This study explores the use of risk-adjusted quality-cost diagrams to compare the value provided by surgeons by presenting cost and outcomes simultaneously. Colectomy cases from a single institution captured in the National Surgical Quality Improvement Program database were linked to hospital cost-accounting data to determine costs per encounter. Risk adjustment models were developed and observed average cost and complication rates per surgeon were compared to expected cost and complication rates using the diagrams. Surgeons were surveyed to determine if the diagrams could provide information that would result in practice adjustment. Of 55 surgeons surveyed on the utility of the diagrams, 92% of respondents believed the diagrams were useful. The diagrams seemed intuitive to interpret, and making risk-adjusted comparisons accounted for patient differences in the evaluation.

  1. Risk of hydrocyanic acid release in the electroplating industry.

    PubMed

    Piccinini, N; Ruggiero, G N; Baldi, G; Robotto, A

    2000-01-07

    This paper suggests assessing the consequences of hydrocyanic acid (HCN) release into the air from aqueous cyanide solutions in abnormal situations, such as the accidental introduction of an acid or the insertion of a cyanide into a pickling bath. It provides a well-defined source model and its resolution by methods peculiar to mass transport phenomena. The procedure consists of four stages: calculation of the liquid-phase concentration, estimation of the HCN liquid-vapour equilibrium, determination of the mass transfer coefficient at the liquid-vapour interface, and evaluation of the air concentration of HCN and of the damage distances. The results show that small baths operating at high temperatures are the major sources of risk. The build-up of lethal air concentrations, on the other hand, is governed by the value of the mass transfer coefficient, which is itself determined by the flow dynamics and bath geometry. Concerning the magnitude of the risk, the fallout for external emergency planning is slight in all the cases investigated.

  2. A probabilistic strategy for parametric catastrophe insurance

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin

    2017-04-01

    Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. 
The resulting model is able, for any given day, to issue probabilities of occurrence of loss events. Due to the nature of parametric programmes, it is still necessary to clearly define when a payout is due or not, and so a decision threshold probability above which a loss event is considered to occur must be set, effectively converting the issued probabilities into deterministic binary outcomes. Model skill and value are evaluated over the range of possible threshold probabilities, with the objective of defining the optimal one. The predictive ability of the model is assessed. In terms of value assessment, a decision model is proposed, allowing users to quantify monetarily their expected expenses when different combinations of model event triggering and actual event occurrence take place, directly tackling the problem of basis risk.
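The threshold-selection step described above can be sketched as follows. The issued probabilities and the basis-risk cost structure are simulated and invented here; the study's actual logistic model and decision model are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily loss-event probabilities issued by a calibrated logistic
# model (the model itself is assumed; here we simulate its outputs)
probs = rng.beta(0.8, 4.0, size=2000)        # issued probabilities per day
events = rng.random(2000) < probs            # actual loss-event occurrences

# Convert probabilities to binary payout decisions at the decision threshold
# that minimises the expected basis-risk cost (costs are illustrative)
c_miss, c_false = 10.0, 1.0                  # missed payout vs. unnecessary payout
thresholds = np.linspace(0.05, 0.95, 19)
costs = [c_miss * np.sum(events & (probs < th))
         + c_false * np.sum(~events & (probs >= th)) for th in thresholds]
best_threshold = thresholds[int(np.argmin(costs))]
```

Sweeping the threshold in this way is what turns the probabilistic forecasts back into the deterministic binary trigger a parametric programme requires.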

  3. CTLA-4 and MDR1 polymorphisms increase the risk for ulcerative colitis: A meta-analysis.

    PubMed

    Zhao, Jia-Jun; Wang, Di; Yao, Hui; Sun, Da-Wei; Li, Hong-Yu

    2015-09-14

    To evaluate the correlations between cytotoxic T lymphocyte-associated antigen-4 (CTLA-4) and multi-drug resistance 1 (MDR1) gene polymorphisms and ulcerative colitis (UC) risk. PubMed, EMBASE, Web of Science, Cochrane Library, CBM, Springerlink, Wiley, EBSCO, Ovid, Wanfang, VIP, China National Knowledge Infrastructure, and Weipu Journal databases were exhaustively searched using combinations of keywords relating to CTLA-4, MDR1 and UC. The published studies were filtered using our stringent inclusion and exclusion criteria, the quality assessment for each eligible study was conducted using the Critical Appraisal Skills Programme, and the resultant high-quality data from the final selected studies were analyzed using Comprehensive Meta-analysis 2.0 (CMA 2.0) software. The correlations between SNPs of the CTLA-4 gene, the MDR1 gene and the risk of UC were evaluated by odds ratios (ORs) with 95% confidence intervals (95%CIs). A Z test was carried out to evaluate the significance of the overall effect values. Cochran's Q-statistic and I(2) tests were applied to quantify heterogeneity among studies. Funnel plots, classic fail-safe N and Egger's linear regression test were inspected for indication of publication bias. A total of 107 studies were initially retrieved and 12 were eventually selected for meta-analysis. These 12 case-control studies involved 1860 UC patients and 2663 healthy controls. Our major result revealed that single nucleotide polymorphisms (SNPs) of the CTLA-4 gene, rs3087243 G > A and rs231775 G > A, may increase the risk of UC (rs3087243 G > A: allele model: OR = 1.365, 95%CI: 1.023-1.822, P = 0.035; dominant model: OR = 1.569, 95%CI: 1.269-1.940, P < 0.001; rs231775 G > A: allele model: OR = 1.583, 95%CI: 1.306-1.918, P < 0.001; dominant model: OR = 1.805, 95%CI: 1.393-2.340, P < 0.001). 
In addition, based on our results, the MDR1 gene SNP rs1045642 C > T might also significantly increase the risk of UC (allele model: OR = 1.389, 95%CI: 1.214-1.590, P < 0.001; dominant model: OR = 1.518, 95%CI: 1.222-1.886, P < 0.001). CTLA-4 gene rs3087243 G > A and rs231775 G > A, and MDR1 gene rs1045642 C > T, might increase UC risk.
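The fixed-effect, inverse-variance pooling underlying such odds-ratio meta-analyses can be sketched as below. The per-study values are invented for illustration and are not the studies analysed in this paper:

```python
import numpy as np

# Hypothetical per-study odds ratios with 95% CI bounds
or_vals = np.array([1.4, 1.2, 1.6])
ci_lo = np.array([1.0, 0.9, 1.1])
ci_hi = np.array([1.9, 1.7, 2.3])

log_or = np.log(or_vals)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from CI width
w = 1.0 / se**2                                     # inverse-variance weights

pooled_or = np.exp(np.sum(w * log_or) / np.sum(w))  # fixed-effect pooled OR
z_stat = np.sum(w * log_or) / np.sqrt(np.sum(w))    # Z test on the overall effect
```

Comparing `z_stat` against 1.96 corresponds to the two-sided P < 0.05 criterion used to judge the significance of the overall effect.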

  4. Modeling the survival of Salmonella Enteritidis and Salmonella Typhimurium during the fermentation of yogurt.

    PubMed

    Savran, Derya; Pérez-Rodríguez, Fernando; Halkman, A Kadir

    2018-03-01

    The objective of this study was to evaluate the behavior of Salmonella Enteritidis and Salmonella Typhimurium, the two serovars most frequently implicated in salmonellosis, during the fermentation of yogurt. The microorganisms were enumerated in milk throughout the fermentation process at three initial inoculum levels (3, 5 and 7 log CFU/mL). DMFit software (IFR, Norwich, UK, Version 3.5) was used to fit the data. The data provided sigmoidal curves that were successfully described by the Baranyi model. The results showed that the initial inoculum level did not affect growth for either pathogen; thus, the µmax values (maximum specific growth rate) did not significantly differ across the contamination levels, ranging from 0.26 to 0.38 for S. Enteritidis and from 0.50 to 0.56 log CFU/g/h for S. Typhimurium (P > 0.05). However, the µmax values significantly differed between the two serovars (P < 0.05). The λ values (lag time) did not show a clear trend for either pathogen. The present study showed that Salmonella can survive the fermentation process of milk even at a low contamination level. In addition, the models presented in this study can be used in quantitative risk assessment studies to estimate the threat to consumers.
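A Baranyi-type fit like the one performed with DMFit can be reproduced in outline with SciPy. The time-course data here are simulated, not the study's measurements, and counts are kept in natural-log units for simplicity:

```python
import numpy as np
from scipy.optimize import curve_fit

def baranyi(t, y0, ymax, mu, lam):
    """Baranyi-Roberts sigmoidal growth model (ln counts vs. time)."""
    h0 = mu * lam                                # "work to be done" during lag
    a = t + (1.0 / mu) * np.log(np.exp(-mu * t) + np.exp(-h0)
                                - np.exp(-mu * t - h0))
    return y0 + mu * a - np.log(1.0 + (np.exp(mu * a) - 1.0)
                                / np.exp(ymax - y0))

# Simulated fermentation time-course (h, ln CFU/mL) with small noise
t_obs = np.linspace(0.0, 10.0, 11)
rng = np.random.default_rng(2)
y_obs = baranyi(t_obs, 3.0, 7.0, 0.5, 2.0) + 0.05 * rng.normal(size=11)

popt, _ = curve_fit(baranyi, t_obs, y_obs, p0=[3.0, 7.0, 0.4, 1.5])
y0_hat, ymax_hat, mu_hat, lam_hat = popt         # mu_hat: max specific growth rate
```

The fitted `mu_hat` and `lam_hat` play the role of the µmax and λ values compared between serovars in the abstract.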

  5. Ambulatory versus home versus clinic blood pressure: the association with subclinical cerebrovascular diseases: the Ohasama Study.

    PubMed

    Hara, Azusa; Tanaka, Kazushi; Ohkubo, Takayoshi; Kondo, Takeo; Kikuya, Masahiro; Metoki, Hirohito; Hashimoto, Takanao; Satoh, Michihiro; Inoue, Ryusuke; Asayama, Kei; Obara, Taku; Hirose, Takuo; Izumi, Shin-Ichi; Satoh, Hiroshi; Imai, Yutaka

    2012-01-01

    The usefulness of ambulatory, home, and casual/clinic blood pressure measurements to predict subclinical cerebrovascular diseases (silent cerebrovascular lesions and carotid atherosclerosis) was compared in a general population. Data on ambulatory, home, and casual/clinic blood pressures and brain MRI to detect silent cerebrovascular lesions were obtained in 1007 subjects aged ≥55 years in a general population of Ohasama, Japan. Of the 1007 subjects, 583 underwent evaluation of the extent of carotid atherosclerosis. Twenty-four-hour, daytime, and nighttime ambulatory and home blood pressure levels were closely associated with the risk of silent cerebrovascular lesions and carotid atherosclerosis (all P<0.05). When home and one of the ambulatory blood pressure values were simultaneously included in the same regression model, each of the ambulatory blood pressure values remained a significant predictor of silent cerebrovascular lesions, whereas home blood pressure lost its predictive value. Of the ambulatory blood pressure values, nighttime blood pressure was the strongest predictor of silent cerebrovascular lesions. The home blood pressure value was more closely associated with the risk of carotid atherosclerosis than any of the ambulatory blood pressure values when home and one of the ambulatory blood pressure values were simultaneously included in the same regression model. The casual/clinic blood pressure value had no significant association with the risk of subclinical cerebrovascular diseases. Although the clinical indications for ambulatory blood pressure monitoring and home blood pressure measurements may overlap, the clinical significance of each method for predicting target organ damage may differ for different target organs.

  6. Modeling individual movement decisions of brown hare (Lepus europaeus) as a key concept for realistic spatial behavior and exposure: A population model for landscape-level risk assessment.

    PubMed

    Kleinmann, Joachim U; Wang, Magnus

    2017-09-01

    Spatial behavior is of crucial importance for the risk assessment of pesticides and for the assessment of effects of agricultural practice or multiple stressors, because it determines field use, exposure, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, "probabilistic walk," was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation of the model demonstrated that it can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these important parameters for the risk assessment may, however, vary in different landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.

  7. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: Choice, Control & Change

    PubMed Central

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2012-01-01

    Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n=8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman's rho correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers' satisfaction with teaching the curriculum was highly correlated with students' satisfaction (P < .05). Teachers' perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Conclusions and Implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021

  8. Using a systematic conceptual model for a process evaluation of a middle school obesity risk-reduction nutrition curriculum intervention: choice, control & change.

    PubMed

    Lee, Heewon; Contento, Isobel R; Koch, Pamela

    2013-03-01

    To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. A process evaluation study based on a systematic conceptual model. Five middle schools in New York City. Five hundred sixty-two students in 20 classes and their science teachers (n = 8). Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and the student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teacher satisfaction with teaching the curriculum was highly correlated with student satisfaction (P < .05). Teacher perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  9. Analyte variations in consecutive 24-hour urine collections in children.

    PubMed

    Ellison, Jonathan S; Hollingsworth, John M; Langman, Craig B; Asplin, John R; Schwaderer, Andrew L; Yan, Phyllis; Bierlein, Maggie; Barraza, Mark A; Defoor, William R; Figueroa, T Ernesto; Jackson, Elizabeth C; Jayanthi, Venkata R; Johnson, Emilie K; Joseph, David B; Shnorhavorian, Margarett

    2017-12-01

    The metabolic evaluation of children with nephrolithiasis begins with a 24-h urine collection. For adults, the diagnostic yield increases with consecutive collections; however, little is known regarding the variability of multiple 24-h studies in the pediatric population. We sought to evaluate the variability of consecutive 24-h urine collections in children through a multi-institutional study, hypothesizing that compared with a single collection, consecutive 24-h urine collections would reveal a greater degree of clinically useful information in the evaluation of children at risk for nephrolithiasis. Including data from six institutions, we identified children less than 18 years of age considered at risk for recurrent nephrolithiasis who were undergoing metabolic evaluation. We evaluated a subset of patients performing two collections with urine creatinine varying by 10% or less during a 7-day period. Discordance between repeat collections based on normative urine chemistry values was evaluated. A total of 733 children met inclusion criteria, and in over a third, both urine calcium and urine volume differed by 30% or more between samples. Urine oxalate demonstrated greater variation between collections in children <5 years than among older children (p = 0.030), while variation in other parameters did not differ by age. Discordance between repeat samples based on normative values was most common for urine oxalate (22.5%) and the derived relative supersaturation ratios for both calcium phosphate (25.1%) and calcium oxalate (20.5%). The proportion of discordant samples based on normative thresholds, as well as variability ≥30% and ≥50%, respectively, are shown in the table. Our analysis indicates that stone risk in as many as one in four children may be misclassified if normative values of only a single 24-h urine are used. 
In light of these findings, repeat 24-h urine collections prior to targeted intervention to modify stone risk are advised to increase diagnostic yield in children at risk for nephrolithiasis. Copyright © 2017 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  10. Risk management in a large-scale CO2 geosequestration pilot project, Illinois, USA

    USGS Publications Warehouse

    Hnottavange-Telleen, K.; Chabora, E.; Finley, R.J.; Greenberg, S.E.; Marsteller, S.

    2011-01-01

    Like most large-scale infrastructure projects, carbon dioxide (CO2) geological sequestration (GS) projects have multiple success criteria and multiple stakeholders. In this context "risk evaluation" encompasses multiple scales. Yet a risk management program aims to maximize the chance of project success by assessing, monitoring, and minimizing all risks in a consistent framework. The 150,000-km2 Illinois Basin underlies much of the state of Illinois, USA, and parts of adjacent Kentucky and Indiana. Its potential for CO2 storage is first-rate among basins in North America, an impression that has been strengthened by early testing of the injection well of the Midwest Geological Sequestration Consortium's (MGSC's) Phase III large-scale demonstration project, the Illinois Basin - Decatur Project (IBDP). The IBDP, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), represents a key trial of GS technologies and project-management techniques. Though risks are specific to each site and project, IBDP risk management methodologies provide valuable experience for future GS projects. IBDP views risk as the potential for negative impact to any of these five values: health and safety, environment, financial, advancing the viability and public acceptability of a GS industry, and research. Research goals include monitoring one million metric tonnes of injected CO2 in the subsurface. Risk management responds to the ways in which any values are at risk: for example, monitoring is designed to reduce uncertainties in parameter values that are important for research and system control, and is also designed to provide public assurance. Identified risks are the primary basis for risk-reduction measures: risks linked to uncertainty in geologic parameters guide further characterization work and guide simulations applied to performance evaluation. 
Formally, industry defines risk (more precisely risk criticality) as the product L*S, the Likelihood multiplied by the Severity of negative impact. L and S are each evaluated on five-point scales, yielding a theoretical spread in risk values of 1 through 25. So defined, these judgment-based values are categorical and ordinal - they do not represent physically measurable quantities, but are nonetheless useful for comparison and therefore decision support. The "risk entities" first evaluated are FEPs - conceptual Features, Events, and Processes based on the list published by Quintessa Ltd. After concrete scenarios are generated based on selected FEPs, scenarios become the critical entities whose associated risks are evaluated and tracked. In IBDP workshops, L and S values for 123 FEPs were generated through expert elicitation. About 30 experts in the project or in GS in general were assigned among six facilitated working groups, and each group was charged to envision risks within a sphere of project operations. Working groups covered FEPs with strong spatial characteristics - such as those related to the injection wellbore and simulated plume footprint - and "nonspatial" FEPs related to finance, regulations, legal, and stakeholder issues. Within these working groups, experts shared information, examined assumptions, refined and extended the FEP list, calibrated responses, and provided initial L and S values by consensus. Individual rankings were collected in a follow-up process via emailed spreadsheets. For each of L and S, three values were collected: Lower Bound, Best Guess, and Upper Bound. The Lower-Upper Bound ranges and the spreads among experts can be interpreted to yield rough confidence measures. Based on experts' responses, FEPs were ranked in terms of their L*S risk levels. FEP rankings were determined from individual (not consensus or averaged) results, thus no high-risk responses were damped out. 
The higher-risk FEPs were used to generate one or more concrete, well-defined risk-bearing scenarios for each FEP. Any FEP scored by any expert as having associated risk of
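The L*S risk-criticality ranking described above can be sketched as follows; the FEP names and the five-point Likelihood/Severity scores are invented for illustration, not taken from the IBDP elicitation:

```python
# Each FEP carries (Likelihood, Severity) pairs, one per expert, on 1-5 scales
fep_scores = {
    "wellbore-integrity": [(4, 5), (3, 4)],
    "induced-seismicity": [(2, 5), (2, 4)],
    "stakeholder-opposition": [(3, 3), (4, 2)],
}

# Rank by the highest individual L*S score (range 1-25) so that no single
# expert's high-risk response is damped out by consensus or averaging
criticality = {fep: max(l * s for l, s in scores)
               for fep, scores in fep_scores.items()}
ranked = sorted(criticality, key=criticality.get, reverse=True)
```

Because the L and S scales are ordinal judgments rather than physical measurements, the resulting ordering supports comparison and decision-making but not quantitative risk arithmetic.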

  11. Spatial Variability in ADHD-Related Behaviors Among Children Born to Mothers Residing Near the New Bedford Harbor Superfund Site

    PubMed Central

    Vieira, Verónica M.; Fabian, M. Patricia; Webster, Thomas F.; Levy, Jonathan I.; Korrick, Susan A.

    2017-01-01

    Attention-deficit/hyperactivity disorder (ADHD) has an uncertain etiology, with potential contributions from different risk factors such as prenatal environmental exposure to organochlorines and metals, social risk factors, and genetics. The degree to which geographic variability in ADHD is independent of, or explained by, risk factors may provide etiological insight. We investigated determinants of geographic variation in ADHD-related behaviors among children living near the polychlorinated biphenyl–contaminated New Bedford Harbor (NBH) Superfund site in Massachusetts. Participants were 573 children recruited at birth (1993–1998) who were born to mothers residing near the NBH site. We assessed ADHD-related behaviors at age 8 years using Conners’ Teacher Rating Scale–Revised: Long Version. Adjusted generalized additive models were used to smooth the association of pregnancy residence with ADHD-related behaviors and assess whether prenatal organochlorine or metal exposures, sociodemographic factors, or other factors explained spatial patterns. Models that adjusted for child's age and sex displayed significantly increased ADHD-related behavior among children whose mothers resided west of the NBH site during pregnancy. These spatial patterns persisted after adjusting for prenatal exposure to organochlorines and metals but were no longer significant after controlling for sociodemographic factors. The findings underscore the value of spatial analysis in identifying high-risk subpopulations and evaluating candidate risk factors. PMID:28444119

  12. Supporting virtual enterprise design by a web-based information model

    NASA Astrophysics Data System (ADS)

    Li, Dong; Barn, Balbir; McKay, Alison; de Pennington, Alan

    2001-10-01

    Developments in IT and its applications have led to significant changes in business processes. To pursue agility, flexibility, and the best service to customers, enterprises focus on their core competence and dynamically build relationships with partners to form virtual enterprises as customer-driven temporary demand chains/networks. Building the networked enterprise requires responsively interactive decisions instead of a single-direction partner selection process. Benefits and risks of the combination should be systematically analysed, and aggregated information about the value-adding abilities and risks of networks needs to be derived from the interactions of all partners. In this research, a hierarchical information model to assess partnerships for designing virtual enterprises was developed. Internet techniques have been applied to the evaluation process so that interactive decisions can be visualised and made responsively during the design process. The assessment is based on a process that allows each partner to respond to the requirements of the virtual enterprise by planning its operational process as a bidder. The assessment then produces an aggregated value to represent the prospects of the combination of partners given the current bidding. The final design is the combination of partners with the greatest total value-adding capability and lowest risk.
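The final selection step amounts to scoring combinations of bidding partners on aggregated value and risk. A toy sketch, with entirely invented roles, partners, and figures (the paper's hierarchical model is not reproduced):

```python
from itertools import product

# Bids per role: (partner, value-adding ability, risk); all figures invented
bids = {
    "logistics": [("A", 8, 2), ("B", 6, 1)],
    "manufacturing": [("C", 9, 4), ("D", 7, 3)],
}

def aggregated_score(combo):
    """Total value-adding capability net of total risk for one combination."""
    return sum(v for _, v, _ in combo) - sum(r for _, _, r in combo)

# Evaluate every combination of one bidder per role and keep the best
best = max(product(*bids.values()), key=aggregated_score)
```

In the interactive setting the abstract describes, each round of bidding would update `bids` and the combination would be re-scored, rather than being selected once in a single direction.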

  13. Risk factors for the treatment outcome of retreated pulmonary tuberculosis patients in China: an optimized prediction model.

    PubMed

    Wang, X-M; Yin, S-H; Du, J; Du, M-L; Wang, P-Y; Wu, J; Horbinski, C M; Wu, M-J; Zheng, H-Q; Xu, X-Q; Shu, W; Zhang, Y-J

    2017-07-01

    Retreatment of tuberculosis (TB) often fails in China, yet the risk factors associated with failure remain unclear. To identify risk factors for treatment failure in retreated pulmonary tuberculosis (PTB) patients, we analyzed data on 395 retreated PTB patients who received retreatment between July 2009 and July 2011 in China. PTB patients were categorized into 'success' and 'failure' groups by their treatment outcome. Univariable and multivariable logistic regression were used to evaluate the association between treatment outcome and socio-demographic as well as clinical factors. We also created an optimized risk score model to evaluate the predictive values of these risk factors for treatment failure. Of 395 patients, 99 (25.1%) were diagnosed as retreatment failures. Our results showed that risk factors associated with treatment failure included drug resistance, low education level, low body mass index (6 months), standard treatment regimen, retreatment type, positive culture result after 2 months of treatment, and the place where the first medicine was taken. An optimized Framingham risk model was then used to calculate the risk scores of these factors. Place where the first medicine was taken (temporary living places) received a score of 6, the highest among all factors. The predicted probability of treatment failure increases as the risk score increases. Ten out of 359 patients had a risk score >9, which corresponded to an estimated probability of treatment failure >70%. In conclusion, we have identified multiple clinical and socio-demographic factors that are associated with treatment failure in retreated PTB patients. We also created an optimized risk score model that was effective in predicting retreatment failure. These results provide novel insights for the prognosis and improvement of treatment of retreated PTB patients.
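A Framingham-style risk score of the kind the authors describe is built by scaling logistic-regression coefficients to integer points and mapping a total score back to a probability. The coefficients, reference unit, and intercept below are invented for illustration and are not the study's fitted values:

```python
import numpy as np

# Hypothetical log-odds coefficients from a multivariable logistic model
coefs = {"temporary_residence": 1.32, "drug_resistance": 1.10,
         "positive_2mo_culture": 0.88, "low_education": 0.44}
b = 0.22          # reference unit: one point = b on the log-odds scale (assumed)
intercept = -4.0  # assumed baseline log-odds

# Framingham-style integer points: coefficient divided by the reference unit
points = {k: round(v / b) for k, v in coefs.items()}

def failure_probability(total_points):
    """Predicted probability of retreatment failure for a given total score."""
    return 1.0 / (1.0 + np.exp(-(intercept + b * total_points)))
```

With these assumed numbers the strongest factor receives 6 points, and the predicted failure probability rises monotonically with the total score, mirroring the behaviour reported in the abstract.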

  14. Implementing the Keele stratified care model for patients with low back pain: an observational impact study.

    PubMed

    Bamford, Adrian; Nation, Andy; Durrell, Susie; Andronis, Lazaros; Rule, Ellen; McLeod, Hugh

    2017-02-03

    The Keele stratified care model for management of low back pain comprises use of the prognostic STarT Back Screening Tool to allocate patients into one of three risk-defined categories leading to associated risk-specific treatment pathways, such that high-risk patients receive enhanced treatment and more sessions than medium- and low-risk patients. The Keele model is associated with economic benefits and is being widely implemented. The objective was to assess the use of the stratified model following its introduction in an acute hospital physiotherapy department setting in Gloucestershire, England. Physiotherapists recorded data on 201 patients treated using the Keele model in two audits in 2013 and 2014. To assess whether implementation of the stratified model was associated with the anticipated range of treatment sessions, regression analysis of the audit data was used to determine whether high- or medium-risk patients received significantly more treatment sessions than low-risk patients. The analysis controlled for patient characteristics, year, physiotherapists' seniority and physiotherapist. To assess the physiotherapists' views on the usefulness of the stratified model, audit data on this were analysed using framework methods. To assess the potential economic consequences of introducing the stratified care model in Gloucestershire, published economic evaluation findings on back-related National Health Service (NHS) costs, quality-adjusted life years (QALYs) and societal productivity losses were applied to audit data on the proportion of patients by risk classification and estimates of local incidence. When the Keele model was implemented, patients received significantly more treatment sessions as the risk-rating increased, in line with the anticipated impact of targeted treatment pathways. Physiotherapists were largely positive about using the model. 
The potential annual impact of rolling out the model across Gloucestershire is a gain in approximately 30 QALYs, a reduction in productivity losses valued at £1.4 million and almost no change to NHS costs. The Keele model was implemented and risk-specific treatment pathways successfully used for patients presenting with low back pain. Applying published economic evidence to the Gloucestershire locality suggests that substantial health and productivity outcomes would be associated with rollout of the Keele model while being cost-neutral for the NHS.

  15. Modelling the effect of temperature, water activity and pH on the growth of Serpula lacrymans.

    PubMed

    Maurice, S; Coroller, L; Debaets, S; Vasseur, V; Le Floch, G; Barbier, G

    2011-12-01

To predict the risk factors for building infestation by Serpula lacrymans, which is one of the most destructive fungi causing timber decay in buildings. The growth rate was assessed on malt extract agar media at temperatures between 1.5 and 45°C, at water activity (a(w)) over the range of 0.800-0.993 and at pH values from 1.5 to 11.0. The radial growth rate (μ) and the lag phase (λ) were estimated from the radial growth kinetics via plots of radius versus time. These parameters were then modelled as a function of the environmental factors tested. Models derived from the cardinal model (CM) were used to fit the experimental data and allowed estimation of the optimal and limit values for fungal growth. The optimal growth rate occurred at 20°C, at a high a(w) level (0.993) and at pH between 4.0 and 6.0. The strain effect on the temperature parameters was further evaluated using 14 strains of S. lacrymans. The robustness of the temperature model was validated on data sets measured in two different wood-based media (Quercus robur L. and Picea abies). The two-step procedure, an exponential model with latency followed by the CM with inflection, gave reliable predictions of the growth conditions of the filamentous fungus in our study. The procedure was validated for the study of abiotic factors on the growth rate of S. lacrymans. This work describes the usefulness of evaluating the effect of physico-chemical factors on fungal growth in predictive building mycology. Consequently, the developed mathematical models for predicting fungal growth on a macroscopic scale can be used as a tool for risk assessment of timber decay in buildings. © 2011 The Authors. Journal of Applied Microbiology © 2011 The Society for Applied Microbiology.
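The cardinal model family referenced above has a standard closed form. As an illustration, here is a minimal sketch of the widely used Rosso cardinal temperature model with inflection (CTMI); the parameter values in the test below are placeholders, not the study's fitted estimates for S. lacrymans:

```python
def cardinal_model(T, T_min, T_opt, T_max, mu_opt):
    """Rosso cardinal temperature model with inflection (CTMI).

    Returns the predicted growth rate at temperature T given the three
    cardinal temperatures and the optimal growth rate mu_opt. Growth is
    zero outside the (T_min, T_max) interval.
    """
    if T <= T_min or T >= T_max:
        return 0.0
    num = (T - T_max) * (T - T_min) ** 2
    den = (T_opt - T_min) * (
        (T_opt - T_min) * (T - T_opt)
        - (T_opt - T_max) * (T_opt + T_min - 2.0 * T)
    )
    return mu_opt * num / den
```

The same cardinal structure can be reused for water activity or pH by substituting the corresponding minimum, optimum and maximum values, which is essentially how CM-derived models cover several abiotic factors at once.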

  16. Assessing the potential of economic instruments for managing drought risk at river basin scale

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, M.; Lopez-Nicolas, A.; Macian-Sorribes, H.

    2015-12-01

Economic instruments work as incentives to adapt individual decisions to collectively agreed goals. Different types of economic instruments have been applied to manage water resources, such as water-related taxes and charges (water pricing, environmental taxes, etc.), subsidies, markets or voluntary agreements. Hydroeconomic models (HEMs) provide useful insight into optimal strategies for coping with droughts by simultaneously analysing the engineering, hydrology and economics of water resources management. We use HEMs to evaluate the potential of economic instruments for managing drought risk at the river basin scale, considering three criteria for assessing drought risk: reliability, resilience and vulnerability. HEMs allow calculation of water scarcity costs as the economic losses due to water deliveries below the target demands, which can be used as a vulnerability descriptor of drought risk. Two generic hydroeconomic DSS tools, SIMGAMS and OPTIGAMS (both programmed in GAMS), have been developed to evaluate water scarcity costs at the river basin scale based on simulation and optimization approaches. The simulation tool SIMGAMS allocates water according to the system priorities and operating rules, and evaluates the scarcity costs using economic demand functions. The optimization tool allocates water resources to maximize net benefits (minimizing total water scarcity plus operating costs of water use). SIMGAMS allows simulation of incentive water pricing policies based on water availability in the system (scarcity pricing), while OPTIGAMS is used to simulate the effect of ideal water markets by economic optimization. These tools have been applied to the Jucar river system (Spain), a highly regulated system with a high share of water use for crop irrigation (greater than 80%), where water scarcity, irregular hydrology and groundwater overdraft cause droughts to have significant economic, social and environmental consequences.
An econometric model was first used to explain the variation of the production value of irrigated agriculture during droughts, assessing revenue responses to varying crop prices and water availability. Hydroeconomic approaches were then used to show the potential of economic instruments in setting incentives for a more efficient management of water resources systems.
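The water scarcity cost described above is, in essence, the foregone economic benefit between the delivered volume and the target demand. A minimal sketch of how such a cost can be evaluated from an economic demand (marginal value) curve — written in Python rather than GAMS, and with a hypothetical linear demand function for illustration:

```python
def scarcity_cost(demand_curve, delivered, target, n=1000):
    """Economic loss from delivering less water than the target demand:
    the area under the marginal-value (demand) curve between the delivered
    volume and the target volume, approximated by the trapezoidal rule.
    """
    if delivered >= target:
        return 0.0
    step = (target - delivered) / n
    total = 0.0
    for i in range(n):
        q0 = delivered + i * step
        q1 = q0 + step
        total += 0.5 * (demand_curve(q0) + demand_curve(q1)) * step
    return total
```

For example, with a hypothetical demand curve p(q) = 100 - q, delivering 40 of a target 100 units integrates the marginal value over the unmet 60 units, giving a scarcity cost of 1800 in the demand curve's monetary units.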

  17. A joint model for longitudinal and time-to-event data to better assess the specific role of donor and recipient factors on long-term kidney transplantation outcomes.

    PubMed

    Fournier, Marie-Cécile; Foucher, Yohann; Blanche, Paul; Buron, Fanny; Giral, Magali; Dantan, Etienne

    2016-05-01

In renal transplantation, serum creatinine (SCr) is the main biomarker routinely measured to assess a patient's health, with chronic increases being strongly associated with long-term graft failure risk (death with a functioning graft or return to dialysis). Joint modeling may be useful to identify the specific role of risk factors in the chronic evolution of kidney transplant recipients: some can be related to the SCr evolution, finally leading to graft failure, whereas others can be associated with graft failure without any modification of SCr. Sample data for 2749 patients transplanted between 2000 and 2013 with a functioning kidney at 1-year post-transplantation were obtained from the DIVAT cohort. A shared random effect joint model for longitudinal SCr values and time to graft failure was performed. We show that graft failure risk depended on both the current value and slope of the SCr. Patients receiving a deceased-donor graft seemed to have a steeper SCr increase, as did patients with a history of diabetes, while no significant association of these two features with graft failure risk was found. Patients with a second graft were at higher risk of graft failure, independent of changes in SCr values. Anti-HLA immunization was associated with both processes simultaneously. Joint models for repeated and time-to-event data bring new opportunities to improve the epidemiological knowledge of chronic diseases. For instance, in renal transplantation, several features should receive additional attention, as we demonstrated that their correlation with graft failure risk was independent of the SCr evolution.
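In a shared random-effect joint model of this kind, the failure hazard typically depends on both the current (true) value of the biomarker trajectory and its current slope. A minimal sketch of that hazard structure — the baseline hazard, trajectory functions and association parameters below are hypothetical, not the study's fitted model:

```python
import math

def joint_model_hazard(t, baseline, value, slope, alpha_value, alpha_slope):
    """Graft-failure hazard at time t in a shared random-effect joint model:
    the baseline hazard is scaled by the current biomarker value and its
    current slope through the association parameters alpha_value and
    alpha_slope.
    """
    return baseline(t) * math.exp(alpha_value * value(t) + alpha_slope * slope(t))
```

With both association parameters positive, a patient whose SCr is high and still rising has a multiplicatively higher hazard than one whose SCr is equally high but stable — which is the distinction the abstract draws between "current value" and "slope" effects.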

  18. Residential surface soil guidance values applied worldwide to the original 2001 Stockholm Convention POP pesticides.

    PubMed

    Jennings, Aaron A; Li, Zijian

    2015-09-01

Surface soil contamination is a worldwide problem. Many regulatory jurisdictions attempt to control human exposures with regulatory guidance values (RGVs) that specify a soil's maximum allowable concentration. Pesticides are important soil contaminants because of their intentional toxicity and widespread surface soil application. Worldwide, at least 174 regulatory jurisdictions from 54 United Nations member states have published more than 19,400 pesticide RGVs for at least 739 chemically unique pesticides. This manuscript examines the variability of the guidance values that are applied worldwide to the original 2001 Stockholm Convention persistent organic pollutants (POP) pesticides (Aldrin, Chlordane, DDT, Dieldrin, Endrin, Heptachlor, Mirex, and Toxaphene) for which at least 1667 RGVs have been promulgated. Results indicate that the spans of the RGVs applied to each of these pesticides vary from 6.1 orders of magnitude for Toxaphene to 10.0 orders of magnitude for Mirex. The distribution of values across these value spans resembles the distribution of lognormal random variables, but also contains non-random value clusters. Approximately 40% of all the POP RGVs fall within uncertainty bounds computed from the U.S. Environmental Protection Agency (USEPA) RGV cancer risk model. Another 22% of the values fall within uncertainty bounds computed from the USEPA's non-cancer risk model, but the cancer risk calculations yield the binding (lowest) value for all POP pesticides except Endrin. The results presented emphasize the continued need to rationalize the RGVs applied worldwide to important soil contaminants. Copyright © 2015 Elsevier Ltd. All rights reserved.
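The "orders of magnitude" span reported above is simply the difference of base-10 logarithms of the largest and smallest guidance values for a pesticide. A one-function sketch (the values in the test are invented for illustration, not actual RGVs):

```python
import math

def span_orders_of_magnitude(values):
    """Span of a set of regulatory guidance values, in orders of magnitude:
    log10 of the largest value minus log10 of the smallest value.
    """
    return math.log10(max(values)) - math.log10(min(values))
```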

  19. Evaluation of the National Surgical Quality Improvement Program Universal Surgical Risk Calculator for a gynecologic oncology service.

    PubMed

    Szender, J Brian; Frederick, Peter J; Eng, Kevin H; Akers, Stacey N; Lele, Shashikant B; Odunsi, Kunle

    2015-03-01

The National Surgical Quality Improvement Program is aimed at preventing perioperative complications. An online calculator was recently published, but the primary studies used limited gynecologic surgery data. The purpose of this study was to evaluate the performance of the National Surgical Quality Improvement Program Universal Surgical Risk Calculator (URC) on the patients of a gynecologic oncology service. We reviewed 628 consecutive surgeries performed by our gynecologic oncology service between July 2012 and June 2013. Demographic data, including diagnosis and cancer stage, if applicable, were collected. Charts were reviewed to determine complication rates. Specific complications were as follows: death, pneumonia, cardiac complications, surgical site infection (SSI) or urinary tract infection, renal failure, or venous thromboembolic event. Data were compared with modeled outcomes using Brier scores and receiver operating characteristic curves. Significance was declared based on P < 0.05. The model accurately predicted death and venous thromboembolic event, with Brier scores of 0.004 and 0.003, respectively. Predicted risk was 50% greater than experienced for urinary tract infection; the experienced SSI and pneumonia rates were 43% and 36% greater than predicted. For any complication, the Brier score of 0.023 indicates poor performance of the model. In this study of gynecologic surgeries, we could not verify the predictive value of the URC for cardiac complications, SSI, and pneumonia. One disadvantage of applying a URC to multiple subspecialties is that, for some categories, complications are not accurately estimated. Our data demonstrate that some predicted risks reported by the calculator need to be interpreted with reservation.
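The Brier score used above is the mean squared difference between the predicted risk and the observed 0/1 outcome, so 0 is perfect and roughly 0.25 is uninformative at a 50% event rate. A minimal sketch:

```python
def brier_score(predicted_probs, outcomes):
    """Mean squared difference between predicted risks (0..1) and observed
    binary outcomes (0 or 1). Lower is better; 0 means every prediction
    matched its outcome exactly.
    """
    n = len(predicted_probs)
    return sum((p - y) ** 2 for p, y in zip(predicted_probs, outcomes)) / n
```

Note that a very low Brier score for a rare event (such as the death rate here) can reflect the low base rate as much as genuine discrimination, which is why the abstract pairs it with ROC analysis.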

  20. Development and validation of a dementia screening tool for primary care in Taiwan: Brain Health Test

    PubMed Central

    Tsai, Ping-Huang; Liu, Jian-Liang; Lin, Ker-Neng; Chang, Chiung-Chih; Pai, Ming-Chyi; Wang, Wen-Fu; Huang, Jen-Ping; Hwang, Tzung-Jeng; Wang, Pei-Ning

    2018-01-01

Objectives To develop a simple dementia screening tool to assist primary care physicians in identifying patients with cognitive impairment among subjects with memory complaints or at a high risk for dementia. Design The Brain Health Test (BHT) was developed by several experienced neurologists, psychiatrists, and clinical psychologists in the Taiwan Dementia Society. Validation of the BHT was conducted in the memory clinics of various levels of hospitals in Taiwan. Participants All dementia patients at the memory clinics who met the inclusion criteria of age greater or equal to 50 years were enrolled. Besides the BHT, the Mini-Mental State Examination and Clinical Dementia Rating were used to evaluate the cognitive state of the patients and the severity of dementia. Results The BHT includes two parts: a risk evaluation and a cognitive test (BHT-cog). Self- or informant reports of memory decline or needing help from others to manage money or medications were significantly associated with cognitive impairment. Among the risk factors evaluated in the BHT, a total risk score greater or equal to 8 was defined as a high risk for dementia. The total score for the finalized BHT-cog was 16. When the cutoff value for the BHT-cog was set to 10 for differentiating dementia and a normal mental state, the sensitivity was 91.5%, the specificity was 87.3%, the positive predictive value was 94.8%, and the negative predictive value was 80.1%. The area under the receiver operating characteristic curve between dementia and healthy subjects was 0.958 (95% CI = 0.941–0.975). Conclusions The BHT is a simple tool that may be useful in primary care settings to identify high-risk patients to target for cognitive screening. PMID:29694392
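The reported sensitivity, specificity, PPV and NPV all follow from the 2x2 table induced by a cutoff. A minimal sketch with toy scores and labels (the direction of the comparison assumes lower BHT-cog scores indicate impairment, consistent with the abstract's cutoff of 10 out of 16):

```python
def screening_metrics(scores, labels, cutoff):
    """Sensitivity, specificity, PPV and NPV when scores <= cutoff are
    classified as positive (impaired) and labels are 1 = impaired,
    0 = healthy.
    """
    tp = sum(1 for s, y in zip(scores, labels) if s <= cutoff and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s > cutoff and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s <= cutoff and y == 0)
    tn = sum(1 for s, y in zip(scores, labels) if s > cutoff and y == 0)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }
```

Sweeping the cutoff over all observed scores and plotting sensitivity against 1 - specificity yields the ROC curve whose area (0.958 here) summarises discrimination.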

  1. Evaluating scale and roughness effects in urban flood modelling using terrestrial LIDAR data

    NASA Astrophysics Data System (ADS)

    Ozdemir, H.; Sampson, C. C.; de Almeida, G. A. M.; Bates, P. D.

    2013-10-01

    This paper evaluates the results of benchmark testing a new inertial formulation of the St. Venant equations, implemented within the LISFLOOD-FP hydraulic model, using different high resolution terrestrial LiDAR data (10 cm, 50 cm and 1 m) and roughness conditions (distributed and composite) in an urban area. To examine these effects, the model is applied to a hypothetical flooding scenario in Alcester, UK, which experienced surface water flooding during summer 2007. The sensitivities of simulated water depth, extent, arrival time and velocity to grid resolutions and different roughness conditions are analysed. The results indicate that increasing the terrain resolution from 1 m to 10 cm significantly affects modelled water depth, extent, arrival time and velocity. This is because hydraulically relevant small scale topography that is accurately captured by the terrestrial LIDAR system, such as road cambers and street kerbs, is better represented on the higher resolution DEM. It is shown that altering surface friction values within a wide range has only a limited effect and is not sufficient to recover the results of the 10 cm simulation at 1 m resolution. Alternating between a uniform composite surface friction value (n = 0.013) or a variable distributed value based on land use has a greater effect on flow velocities and arrival times than on water depths and inundation extent. We conclude that the use of extra detail inherent in terrestrial laser scanning data compared to airborne sensors will be advantageous for urban flood modelling related to surface water, risk analysis and planning for Sustainable Urban Drainage Systems (SUDS) to attenuate flow.
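The inertial formulation being benchmarked drops the advection term of the St. Venant equations, leaving a simpler momentum balance. A sketch of the semi-implicit unit-width flux update popularised for LISFLOOD-FP by Bates et al. (2010) — treat the exact form here as an assumption recalled from the literature, not a quotation of the model's code:

```python
def inertial_flux_update(q, h, slope, n, dt, g=9.81):
    """One time step of the inertial (local inertia) simplification of the
    St. Venant equations, in the semi-implicit form used by LISFLOOD-FP-
    family models.

    q: unit-width discharge (m^2/s), h: flow depth (m), slope: water-surface
    slope, n: Manning's roughness, dt: time step (s). The friction term is
    treated implicitly (denominator), which keeps the update stable as
    depths become small.
    """
    return (q - g * h * dt * slope) / (
        1.0 + g * h * dt * n ** 2 * abs(q) / h ** (10.0 / 3.0)
    )
```

The appearance of Manning's n only in the implicit friction term is why, as the abstract finds, varying roughness within a plausible range cannot compensate for topographic detail lost at coarser grid resolutions.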

  2. Evaluating scale and roughness effects in urban flood modelling using terrestrial LIDAR data

    NASA Astrophysics Data System (ADS)

    Ozdemir, H.; Sampson, C. C.; de Almeida, G. A. M.; Bates, P. D.

    2013-05-01

    This paper evaluates the results of benchmark testing a new inertial formulation of the de St. Venant equations, implemented within the LISFLOOD-FP hydraulic model, using different high resolution terrestrial LiDAR data (10 cm, 50 cm and 1 m) and roughness conditions (distributed and composite) in an urban area. To examine these effects, the model is applied to a hypothetical flooding scenario in Alcester, UK, which experienced surface water flooding during summer 2007. The sensitivities of simulated water depth, extent, arrival time and velocity to grid resolutions and different roughness conditions are analysed. The results indicate that increasing the terrain resolution from 1 m to 10 cm significantly affects modelled water depth, extent, arrival time and velocity. This is because hydraulically relevant small scale topography that is accurately captured by the terrestrial LIDAR system, such as road cambers and street kerbs, is better represented on the higher resolution DEM. It is shown that altering surface friction values within a wide range has only a limited effect and is not sufficient to recover the results of the 10 cm simulation at 1 m resolution. Alternating between a uniform composite surface friction value (n = 0.013) or a variable distributed value based on land use has a greater effect on flow velocities and arrival times than on water depths and inundation extent. We conclude that the use of extra detail inherent in terrestrial laser scanning data compared to airborne sensors will be advantageous for urban flood modelling related to surface water, risk analysis and planning for Sustainable Urban Drainage Systems (SUDS) to attenuate flow.

  3. Spatial-temporal and cancer risk assessment of selected hazardous air pollutants in Seattle.

    PubMed

    Wu, Chang-fu; Liu, L-J Sally; Cullen, Alison; Westberg, Hal; Williamson, John

    2011-01-01

In the Seattle Air Toxics Monitoring Pilot Program, we measured 15 hazardous air pollutants (HAPs) at 6 sites for more than a year between 2000 and 2002. Spatial-temporal variations were evaluated with random-effects models and principal component analyses. The potential health risks were further estimated based on the monitored data, with the incorporation of the bootstrapping technique for the uncertainty analysis. It was found that the temporal variability was generally higher than the spatial variability for most air toxics. The highest temporal variability was observed for tetrachloroethylene (70% temporal vs. 34% spatial variability). Nevertheless, most air toxics still exhibited significant spatial variations, even after accounting for the temporal effects. These results suggest that better evaluation of population exposure to HAPs would require operating multiple air toxics monitoring sites over a significant period of time with proper monitoring frequency. The median values of the estimated inhalation cancer risks ranged between 4.3 × 10⁻⁵ and 6.0 × 10⁻⁵, with the 5th and 95th percentile levels exceeding the 1 in a million level. VOCs as a whole contributed over 80% of the risk among the HAPs measured, and arsenic contributed most substantially to the overall risk associated with metals. Copyright © 2010 Elsevier Ltd. All rights reserved.
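Bootstrapped risk estimates of this kind typically resample the measured concentrations and recompute lifetime inhalation risk as mean concentration times a unit risk estimate. A minimal percentile-bootstrap sketch — the concentrations and unit risk value in the test are made up, not the study's data:

```python
import random

def bootstrap_risk_interval(concentrations, unit_risk, n_boot=2000, seed=42):
    """Percentile bootstrap (5th-95th) of the lifetime inhalation cancer
    risk, computed as mean concentration x unit risk estimate.
    Resamples the concentration measurements with replacement.
    """
    rng = random.Random(seed)
    n = len(concentrations)
    stats = []
    for _ in range(n_boot):
        sample = [concentrations[rng.randrange(n)] for _ in range(n)]
        stats.append(sum(sample) / n * unit_risk)
    stats.sort()
    return stats[int(0.05 * n_boot)], stats[int(0.95 * n_boot)]
```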

  4. Rapid experimental measurements of physicochemical properties to inform models and testing.

    PubMed

    Nicolas, Chantel I; Mansouri, Kamel; Phillips, Katherine A; Grulke, Christopher M; Richard, Ann M; Williams, Antony J; Rabinowitz, James; Isaacs, Kristin K; Yau, Alice; Wambaugh, John F

    2018-05-02

The structures and physicochemical properties of chemicals are important for determining their potential toxicological effects, toxicokinetics, and route(s) of exposure. These data are needed to prioritize the risk for thousands of environmental chemicals, but experimental values are often lacking. In an attempt to efficiently fill data gaps in physicochemical property information, we generated new data for 200 structurally diverse compounds, which were rigorously selected from the USEPA ToxCast chemical library, and whose structures are available within the Distributed Structure-Searchable Toxicity Database (DSSTox). This pilot study evaluated rapid experimental methods to determine five physicochemical properties, including the log of the octanol:water partition coefficient (known as log(Kow) or logP), vapor pressure, water solubility, Henry's law constant, and the acid dissociation constant (pKa). For most compounds, experiments were successful for at least one property; log(Kow) yielded the largest return (176 values). It was determined that 77 ToxPrint structural features were enriched in chemicals with at least one measurement failure, indicating which features may have played a role in rapid method failures. To gauge consistency with traditional measurement methods, the new measurements were compared with previous measurements (where available). Since quantitative structure-activity/property relationship (QSAR/QSPR) models are used to fill gaps in physicochemical property information, five suites of QSPRs were evaluated for their predictive ability and chemical coverage (applicability domain) against the new experimental measurements.
The ability to have accurate measurements of these properties will facilitate better exposure predictions in two ways: 1) direct input of these experimental measurements into exposure models; and 2) construction of QSPRs with a wider applicability domain, as their predicted physicochemical values can be used to parameterize exposure models in the absence of experimental data. Published by Elsevier B.V.

  5. Hypertensive Disorders of Pregnancy in Women with Gestational Diabetes Mellitus on Overweight Status of Their Children

    PubMed Central

    Zhang, Shuang; Wang, Leishen; Leng, Junhong; Liu, Huikun; Li, Weiqin; Zhang, Tao; Li, Nan; Li, Wei; Tian, Huiguang; Baccarelli, Andrea A.; Hou, Lifang; Hu, Gang

    2017-01-01

Hypertensive disorders of pregnancy (HDP) as a group of medical complications in pregnancy are believed to be associated with an increased risk of poor fetal growth, but the influence on the offspring's body composition is not clear. The aim of the present study was to evaluate the association between maternal hypertensive disorders of pregnancy and overweight status in the offspring of mothers with gestational diabetes mellitus (GDM). A cross-sectional study among 1263 GDM mother-child pairs was performed in Tianjin, China. General linear models and logistic regression models were used to assess the associations of maternal hypertension in pregnancy with anthropometry and overweight status in the offspring from birth to 1–5 years old. Offspring of GDM mothers who were diagnosed with hypertensive disorders during pregnancy had higher mean values of Z scores for birth weight for gestational age and birth weight for length, and higher mean values of Z scores for weight for age, weight for length/height, and body mass index for age at 1–5 years old than those of GDM mothers with normal blood pressure during pregnancy. Maternal hypertensive disorders of pregnancy were associated with increased risks of large for gestational age (OR 1.74, 95%CI 1.08–2.79) and macrosomia (OR 2.02, 95%CI 1.23–3.31) at birth and childhood overweight/obesity at 1–5 years of age (OR 1.88, 95%CI 1.16–3.04). For offspring of mothers with GDM, maternal hypertension during pregnancy was a risk factor for macrosomia at birth and childhood overweight and obesity, and controlling maternal hypertension may be particularly important for preventing large for gestational age babies and childhood obesity. PMID:28300070

  6. Assessment of the effectiveness of participatory developed adaptation strategies for HCMC

    NASA Astrophysics Data System (ADS)

    Lasage, R.; Veldkamp, T. I. E.; de Moel, H.; Van, T. C.; Phi, H. L.; Vellinga, P.; Aerts, J. C. J. H.

    2014-01-01

Coastal cities are vulnerable to flooding, and flood risk to coastal cities will increase due to sea-level rise. Moreover, Asian cities in particular are subject to considerable population growth and associated urban developments, increasing this risk even more. Empirical data on vulnerability and the cost and benefits of flood risk reducing measures are therefore paramount for sustainable development of these cities. This paper presents an approach to explore the impacts of sea level rise and socio-economic developments on flood risk for the flood prone District 4 in Ho Chi Minh City, Vietnam, and to develop and evaluate the effects of different adaptation strategies (new levees, dry- and wet flood proofing of buildings). A flood damage model was developed to simulate current and future flood risk using the results from a household survey to establish stage-damage curves for residential buildings. The model has been used to assess the effects of several participatory developed adaptation strategies to reduce flood risk, expressed in Expected Annual Damage (EAD). Adaptation strategies were evaluated assuming combinations of both sea level scenarios and land use scenarios. Together with information on costs of these strategies, we calculated the benefit-cost ratio and net present value for the adaptation strategies until 2100, taking into account depreciation rates of 2.5% and 5%. The results of this modeling study indicate that the current flood risk in District 4 is USD 0.31 million per year, increasing up to USD 0.78 million per year in 2100. The net present value and benefit-cost ratios using a discount rate of 5% range from USD -107 to -1.5 million, and from 0.086 to 0.796 for the different strategies. Using a discount rate of 2.5% leads to an increase in both net present value and benefit cost ratio. The adaptation strategies wet proofing and dry proofing generate the best results using these economic indicators.
The information on different strategies will be used by the government of Ho Chi Minh City for selecting a new flood protection strategy. Future research should focus on gathering empirical data right after a flood on the occurring damage, as this appears to be the most uncertain factor in the risk assessment.

  7. Assessment of the effectiveness of flood adaptation strategies for HCMC

    NASA Astrophysics Data System (ADS)

    Lasage, R.; Veldkamp, T. I. E.; de Moel, H.; Van, T. C.; Phi, H. L.; Vellinga, P.; Aerts, J. C. J. H.

    2014-06-01

Coastal cities are vulnerable to flooding, and flood risk to coastal cities will increase due to sea-level rise. Moreover, Asian cities in particular are subject to considerable population growth and associated urban developments, increasing this risk even more. Empirical data on vulnerability and the cost and benefits of flood risk reduction measures are therefore paramount for sustainable development of these cities. This paper presents an approach to explore the impacts of sea-level rise and socio-economic developments on flood risk for the flood-prone District 4 in Ho Chi Minh City, Vietnam, and to develop and evaluate the effects of different adaptation strategies (new levees, dry- and wet proofing of buildings and elevating roads and buildings). A flood damage model was developed to simulate current and future flood risk using the results from a household survey to establish stage-damage curves for residential buildings. The model has been used to assess the effects of several participatory developed adaptation strategies to reduce flood risk, expressed in expected annual damage (EAD). Adaptation strategies were evaluated assuming combinations of both sea-level scenarios and land-use scenarios. Together with information on costs of these strategies, we calculated the benefit-cost ratio and net present value for the adaptation strategies until 2100, taking into account depreciation rates of 2.5% and 5%. The results of this modelling study indicate that the current flood risk in District 4 is USD 0.31 million per year, increasing up to USD 0.78 million per year in 2100. The net present value and benefit-cost ratios using a discount rate of 5% range from USD -107 to -1.5 million, and from 0.086 to 0.796 for the different strategies. Using a discount rate of 2.5% leads to an increase in both net present value and benefit-cost ratio. The adaptation strategies wet-proofing and dry-proofing generate the best results using these economic indicators.
The information on different strategies will be used by the government of Ho Chi Minh City to determine a new flood protection strategy. Future research should focus on gathering empirical data right after a flood on the occurring damage, as this appears to be the most uncertain factor in the risk assessment.
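The EAD and NPV figures quoted in this study follow from two standard calculations: integrating damage over annual exceedance probability, and discounting annual risk-reduction benefits against an upfront investment. A minimal sketch with illustrative numbers (not the District 4 figures):

```python
def expected_annual_damage(prob_damage_pairs):
    """Expected annual damage (EAD): area under the damage vs. annual
    exceedance probability curve, by the trapezoidal rule. Input is a list
    of (annual exceedance probability, damage) pairs.
    """
    pairs = sorted(prob_damage_pairs)
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

def net_present_value(annual_benefit, upfront_cost, rate, years):
    """NPV of an adaptation strategy: the discounted stream of annual
    risk-reduction benefits (avoided EAD) minus the upfront investment.
    """
    pv = sum(annual_benefit / (1.0 + rate) ** t for t in range(1, years + 1))
    return pv - upfront_cost
```

Because the benefit stream is discounted while the cost is paid up front, lowering the discount rate from 5% to 2.5% raises both the NPV and the benefit-cost ratio, as the abstract reports.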

  8. Risk Prediction Models in Psychiatry: Toward a New Frontier for the Prevention of Mental Illnesses.

    PubMed

    Bernardini, Francesco; Attademo, Luigi; Cleary, Sean D; Luther, Charles; Shim, Ruth S; Quartesan, Roberto; Compton, Michael T

    2017-05-01

    We conducted a systematic, qualitative review of risk prediction models designed and tested for depression, bipolar disorder, generalized anxiety disorder, posttraumatic stress disorder, and psychotic disorders. Our aim was to understand the current state of research on risk prediction models for these 5 disorders and thus future directions as our field moves toward embracing prediction and prevention. Systematic searches of the entire MEDLINE electronic database were conducted independently by 2 of the authors (from 1960 through 2013) in July 2014 using defined search criteria. Search terms included risk prediction, predictive model, or prediction model combined with depression, bipolar, manic depressive, generalized anxiety, posttraumatic, PTSD, schizophrenia, or psychosis. We identified 268 articles based on the search terms and 3 criteria: published in English, provided empirical data (as opposed to review articles), and presented results pertaining to developing or validating a risk prediction model in which the outcome was the diagnosis of 1 of the 5 aforementioned mental illnesses. We selected 43 original research reports as a final set of articles to be qualitatively reviewed. The 2 independent reviewers abstracted 3 types of data (sample characteristics, variables included in the model, and reported model statistics) and reached consensus regarding any discrepant abstracted information. Twelve reports described models developed for prediction of major depressive disorder, 1 for bipolar disorder, 2 for generalized anxiety disorder, 4 for posttraumatic stress disorder, and 24 for psychotic disorders. Most studies reported on sensitivity, specificity, positive predictive value, negative predictive value, and area under the (receiver operating characteristic) curve. Recent studies demonstrate the feasibility of developing risk prediction models for psychiatric disorders (especially psychotic disorders). 
The field must now advance by (1) conducting more large-scale, longitudinal studies pertaining to depression, bipolar disorder, anxiety disorders, and other psychiatric illnesses; (2) replicating and carrying out external validations of proposed models; (3) further testing potential selective and indicated preventive interventions; and (4) evaluating effectiveness of such interventions in the context of risk stratification using risk prediction models. © Copyright 2017 Physicians Postgraduate Press, Inc.

  9. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

Dam breach modeling often involves sophisticated, computationally intensive models that compute flood propagation at high temporal and spatial resolutions. This creates a significant need for computational capacity and drives the development of newer flood models using multi-processor and graphics-processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case study, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data, and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies used are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis (1984) (MLM), iii) Von Thun and Gillette (1990) (VTG), and iv) Froehlich (2008). To achieve this objective, three different modeling components were used. First, using HEC-RAS v.4.1, dam breach discharge hydrographs are developed. These hydrographs are then provided as flow inputs to a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card to greatly accelerate the computation.
Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the consequence assessment for the problem statement. For the four breach methodologies, a sensitivity analysis of four breach parameters, breach side slope (SS), breach width (Wb), breach invert elevation (Elb), and time of failure (tf), is conducted. In total, 68 simulations were computed to produce breach hydrographs in HEC-RAS for input into Flood2D-GPU. The Flood2D-GPU simulation results were then post-processed in HEC-FIA to evaluate: Total Population at Risk (PAR), 14-yr and Under PAR (PAR14-), 65-yr and Over PAR (PAR65+), Loss of Life (LOL), and Direct Economic Impact (DEI). The MLM approach resulted in wide variability in the simulated minimum and maximum values of the PAR, PAR65+ and LOL estimates. For PAR14- and DEI, Froehlich (1995) resulted in lower values while MLM resulted in higher estimates. This preliminary study demonstrated the relative performance of four commonly used dam breach methodologies and their impacts on consequence estimation.
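
The regression-based breach estimators compared above can be illustrated with the Froehlich (1995) relations. The coefficients below are the commonly cited published values, reproduced here from memory as a sketch rather than taken from this study's inputs:

```python
def froehlich_1995_breach(vw_m3, hb_m, overtopping=True):
    """Froehlich (1995) regression estimates of breach geometry and timing.

    vw_m3: reservoir volume above the breach invert (m^3)
    hb_m:  height of the breach (m)
    Returns (average breach width in m, failure time in hours).
    """
    k0 = 1.4 if overtopping else 1.0  # failure-mode factor
    b_avg = 0.1803 * k0 * vw_m3**0.32 * hb_m**0.19
    t_f = 0.00254 * vw_m3**0.53 * hb_m**-0.90
    return b_avg, t_f

# Sensitivity sketch: vary one input while holding the other fixed,
# mirroring the one-at-a-time breach parameter analysis described above.
widths = [froehlich_1995_breach(v, 30.0)[0] for v in (1e6, 1e7, 1e8)]
```

Runs like this make clear why the choice of breach methodology drives the downstream hydrograph, and hence the consequence estimates.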

  10. [Establishment of risk evaluation model of peritoneal metastasis in gastric cancer and its predictive value].

    PubMed

    Zhao, Junjie; Zhou, Rongjian; Zhang, Qi; Shu, Ping; Li, Haojie; Wang, Xuefei; Shen, Zhenbin; Liu, Fenglin; Chen, Weidong; Qin, Jing; Sun, Yihong

    2017-01-25

    To establish a risk evaluation model of peritoneal metastasis in gastric cancer and to assess its clinical significance, clinical and pathological data of consecutive gastric cancer cases admitted between April 2015 and December 2015 to the Department of General Surgery, Zhongshan Hospital of Fudan University were analyzed retrospectively. A total of 710 patients were enrolled in the study after 18 patients with other distant metastases were excluded. The correlations between peritoneal metastasis and different factors were studied through univariate (Pearson's chi-squared test or Fisher's exact test) and multivariate analyses (binary logistic regression). Independent predictive factors for peritoneal metastasis were combined to establish a risk evaluation model (nomogram). The nomogram was created with R software using the 'rms' package. In the nomogram, each factor had a different score, and every patient received a total score by adding the scores of each factor; a higher total score represented a higher risk of peritoneal metastasis. Receiver operating characteristic (ROC) curve analysis was used to assess the sensitivity and specificity of the established nomogram, and the DeLong-DeLong-Clarke-Pearson test was used to compare differences in the area under the curve (AUC). The cut-off value was determined from the ROC analysis at the point that best balanced sensitivity and specificity. Among the 710 patients, 47 had peritoneal metastasis (6.6%), including 30 males (30/506, 5.9%) and 17 females (17/204, 8.3%); 31 were ≥ 60 years old (31/429, 7.2%); 38 had tumors ≥ 3 cm (38/461, 8.2%). Lauren classification indicated that 2 patients were intestinal type (2/245, 0.8%), 8 were mixed type (8/208, 3.8%), 11 were diffuse type (11/142, 7.7%), and the rest had no associated data. CA19-9 was ≥ 37 kU/L in 13 patients (13/61, 21.3%); CA125 was ≥ 35 kU/L in 11 patients (11/36, 30.6%); CA72-4 was ≥ 10 kU/L in 11 patients (11/39, 28.2%).
Neutrophil/lymphocyte ratio (NLR) was ≥ 2.37 in 26 patients (26/231, 11.3%). Multivariate analysis showed that Lauren classification (HR=8.95, 95%CI: 1.32-60.59, P=0.025), CA125 (HR=17.45, 95%CI: 5.54-54.89, P=0.001), CA72-4 (HR=20.06, 95%CI: 5.05-79.68, P=0.001), and NLR (HR=4.16, 95%CI: 1.17-14.75, P=0.032) were independent risk factors for peritoneal metastasis in gastric cancer. In the nomogram, the highest score was 241, comprising diffuse or mixed Lauren classification (54 points), CA125 ≥ 35 kU/L (66 points), CA72-4 ≥ 10 kU/L (100 points), and NLR ≥ 2.37 (21 points), which represented the highest risk of peritoneal metastasis (more than 90%). The AUC of the nomogram was 0.912, superior to any single variable (AUC of Lauren classification: 0.678; CA125: 0.720; CA72-4: 0.792; NLR: 0.613; all P=0.000). The total score of the nomogram increased with TNM stage and was highest in the peritoneal metastasis group (F=49.1, P=0.000). When the cut-off value calculated by ROC analysis was set at 140, the model best balanced sensitivity (0.79) and specificity (0.87). Only 5% of patients had peritoneal metastasis when their nomogram scores were lower than 140, while 58% had peritoneal metastasis when their scores were ≥ 140 (χ²=69.1, P=0.000). The risk evaluation model established with Lauren classification, CA125, CA72-4 and NLR can effectively predict the risk of peritoneal metastasis in gastric cancer, and provides a reference for preoperative staging and choice of therapeutic strategy.
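
The scoring described above reduces to a simple point lookup. The point values and the 140-point cut-off are those reported in the abstract; the function and argument names are illustrative:

```python
# Point values reported for the nomogram (maximum total score: 241).
POINTS = {
    "lauren_diffuse_or_mixed": 54,
    "ca125_ge_35": 66,
    "ca724_ge_10": 100,
    "nlr_ge_2_37": 21,
}
CUTOFF = 140  # ROC-derived cut-off balancing sensitivity (0.79) and specificity (0.87)

def peritoneal_metastasis_score(**risk_factors):
    """Sum nomogram points for the factors present; flag high risk at >= 140."""
    total = sum(POINTS[name] for name, present in risk_factors.items() if present)
    return total, total >= CUTOFF
```

For example, a patient positive for all four factors scores 241 and is flagged high risk, while a patient positive only for CA72-4 scores 100 and falls below the cut-off.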

  11. Evaluation of Biomonitoring Data from the CDC National ...

    EPA Pesticide Factsheets

    BACKGROUND: Biomonitoring data reported in the National Report on Human Exposure to Environmental Chemicals (NER) provide information on the presence and concentrations of more than 400 chemicals in human blood and urine. Biomonitoring Equivalents (BEs) and other risk assessment-based values now allow interpretation of these biomonitoring data in a public health risk context. OBJECTIVES: Compare the measured biomarker concentrations in the NER with BEs and similar risk assessment values to provide an across-chemical risk assessment perspective on the measured levels for approximately 130 analytes in the NER. METHODS: Available risk assessment-based biomarker screening values, including BEs and Human Biomonitoring-I (HBM-I) values from the German Human Biomonitoring Commission, were identified. Geometric mean and 95th percentile population biomarker concentrations from the NER were compared to the available screening values to generate chemical-specific hazard quotients (HQs) or cancer risk estimates. CONCLUSIONS: Several analytes in the NER approach or exceed HQ values of 1 or cancer risks greater than 1 × 10⁻⁴ at the geometric mean or 95th percentile, suggesting exposure levels exceed what is considered safe in a large fraction of the population. Analytes of concern include acrylamide, dioxin-like chemicals, benzene, xylene, several metals, di(2-ethylhexyl) phthalate, and some legacy organochlorine pesticides. This analysis provides for the first time a mean
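
The hazard-quotient comparison described above is simply the ratio of a measured population biomarker concentration to its risk-based screening value (HQ = measured / BE). The analyte names and numbers below are hypothetical placeholders, not values from the report:

```python
def hazard_quotients(measured_conc, screening_values):
    """HQ = measured biomarker concentration / screening value (same units)."""
    return {analyte: measured_conc[analyte] / screening_values[analyte]
            for analyte in measured_conc if analyte in screening_values}

# Hypothetical illustration only; units of the two dictionaries must match.
hq = hazard_quotients({"analyte_a": 2.4, "analyte_b": 0.3},
                      {"analyte_a": 1.2, "analyte_b": 3.0})
flagged = {a: q for a, q in hq.items() if q >= 1.0}  # exceeds screening value
```

An HQ at or above 1 at the geometric mean or 95th percentile marks the analyte for further attention, which is the triage logic the paper applies across chemicals.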

  12. Risk assessment of metal vapor arcing

    NASA Technical Reports Server (NTRS)

    Hill, Monika C. (Inventor); Leidecker, Henning W. (Inventor)

    2009-01-01

    A method for assessing metal vapor arcing risk for a component is provided. The method comprises acquiring a current variable value associated with an operation of the component; comparing the current variable value with a threshold value for the variable; evaluating compared variable data to determine the metal vapor arcing risk in the component; and generating a risk assessment status for the component.
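
The four claimed steps (acquire, compare, evaluate, generate status) reduce to a threshold comparison. This is a minimal sketch with invented names, not the patented implementation:

```python
def assess_arcing_risk(current_value, threshold):
    """Acquire a variable value, compare it with its threshold, and
    generate a risk assessment status for the component."""
    at_risk = current_value >= threshold
    return {"value": current_value, "threshold": threshold,
            "status": "at-risk" if at_risk else "nominal"}
```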

  13. High Normal Uric Acid Levels Are Associated with an Increased Risk of Diabetes in Lean, Normoglycemic Healthy Women.

    PubMed

    Shani, Michal; Vinker, Shlomo; Dinour, Dganit; Leiba, Merav; Twig, Gilad; Holtzman, Eliezer J; Leiba, Adi

    2016-10-01

    The risk associated with serum uric acid (SUA) levels within the normal range is unknown, especially among lean and apparently healthy adults. We evaluated whether high-normal SUA levels, 6.8 mg/dL and below, are associated with an increased diabetes risk compared with low-normal SUA. This was a cohort study with 10 years of follow-up involving all clinics of the largest nationally distributed Health Maintenance Organization in Israel. Participants included 469,947 examinees, 40-70 years old at baseline, who had their SUA measured during 2002. We excluded examinees who had hyperuricemia (SUA > 6.8 mg/dL), impaired fasting glucose, overweight or obesity, or chronic cardiovascular or renal disorders. The final cohort comprised 30,302 participants, who were followed up to a new diagnosis of diabetes during the study period. Odds ratios of developing diabetes among participants with high-normal baseline SUA were compared with low-normal (2 ≤ uric acid < 3 and 3 ≤ uric acid < 4 mg/dL in women and men, respectively). In a logistic regression model adjusted for age, body mass index, socioeconomic status, smoking, baseline estimated glomerular filtration rate, and baseline glucose, SUA levels of 4-5 mg/dL in women were associated with a 61% increased risk of incident diabetes (95% confidence interval, 1.1-2.3). At the highest normal levels for women (SUA, 5-6 mg/dL) the odds ratio was 2.7 (1.8-4.0), whereas men had a comparable diabetes risk at values of 6-6.8 mg/dL (hazard ratio, 1.35; 95% confidence interval, 0.9-2.1). SUA levels within the normal range are associated with an increased risk of new-onset diabetes among healthy lean women compared with those with low-normal values.

  14. Refining Ovarian Cancer Test accuracy Scores (ROCkeTS): protocol for a prospective longitudinal test accuracy study to validate new risk scores in women with symptoms of suspected ovarian cancer

    PubMed Central

    Sundar, Sudha; Rick, Caroline; Dowling, Francis; Au, Pui; Rai, Nirmala; Champaneria, Rita; Stobart, Hilary; Neal, Richard; Davenport, Clare; Mallett, Susan; Sutton, Andrew; Kehoe, Sean; Timmerman, Dirk; Bourne, Tom; Van Calster, Ben; Gentry-Maharaj, Aleksandra; Deeks, Jon

    2016-01-01

    Introduction Ovarian cancer (OC) is associated with non-specific symptoms such as bloating, making accurate diagnosis challenging: only 1 in 3 women with OC presents through primary care referral. National Institute for Health and Care Excellence guidelines recommend sequential testing with CA125 and routine ultrasound in primary care. However, these diagnostic tests have limited sensitivity or specificity. Improving the accuracy of triage in women with vague symptoms is likely to reduce mortality by streamlining referral and care pathways. The Refining Ovarian Cancer Test Accuracy Scores (ROCkeTS; HTA 13/13/01) project will derive and validate new tests/risk prediction models that estimate the probability of having OC in women with symptoms. This protocol refers to the prospective study only (phase III). Methods and analysis ROCkeTS comprises four parallel phases. The full ROCkeTS protocol can be found at http://www.birmingham.ac.uk/ROCKETS. Phase III is a prospective test accuracy study that will recruit 2450 patients from 15 UK sites. Recruited patients complete symptom and anxiety questionnaires, donate a serum sample and undergo ultrasound scored as per International Ovarian Tumour Analysis (IOTA) criteria. Recruitment is at rapid access clinics, emergency departments and elective clinics. Models to be evaluated include those based on ultrasound derived by the IOTA group and novel models derived from analysis of existing data sets. Estimates of the sensitivity, specificity, c-statistic (area under the receiver operating curve), positive predictive value and negative predictive value of the diagnostic tests are evaluated, and a calibration plot for models will be presented. ROCkeTS has received ethical approval from the NHS West Midlands REC (14/WM/1241) and is registered on the controlled trials website (ISRCTN17160843) and the National Institute of Health Research Cancer and Reproductive Health portfolios. PMID:27507231
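
The accuracy measures to be estimated (sensitivity, specificity, positive and negative predictive value) all derive from a 2×2 table of test result against final diagnosis. A generic sketch, not the ROCkeTS analysis code:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard test-accuracy summaries from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate among diseased
        "specificity": tn / (tn + fp),  # true-negative rate among healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with disease prevalence, which is why a symptomatic-population validation study such as this one is needed.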

  15. [Study on pollution evaluation of heavy metal in surface soil of the original site of Qingdao North Station].

    PubMed

    Zhu, Lei; Jia, Yong-gang; Pan, Yu-ying

    2013-09-01

    Determining the extent of pollution and assessing health risk are prerequisites for the remediation of heavy-metal-contaminated sites. The contents of Cu, Cr, Pb, Cd, Zn and Ni in surface soil of the original site of Qingdao North Station were measured, and the correlations among the six heavy metal contents were analyzed. The extent of pollution in excess of background values was characterized by the anthropogenic influence multiple, and heavy metal pollution in soil was evaluated using the geoaccumulation index and a new method that combines the geoaccumulation index with the Nemerow index. Finally, human health risk assessment was carried out with a health risk assessment model for heavy metal content. The results showed that Qingdao North Station soils were polluted by heavy metals. The six heavy metal pollution levels were: Cd > Cu > Ni > Pb > Cr > Zn; Cd had reached the severe pollution level, followed by Cu and Ni, while Cr, Pb and Zn were at the minor pollution level. The order of the coefficients of variation of the heavy metals was: Cd > Ni > Cr > Zn > Pb > Cu. The distribution of soil heavy metals within the study area varied, but the overall discrepancy was small. The order of non-cancer hazards of heavy metals in soil was Cr > Pb > Cu > Ni > Cd > Zn, and the order of carcinogenic risks was Ni > Cd. The non-cancer hazard and carcinogenic risk values of the metals were both lower than their threshold values, indicating no direct threat to human health.
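
The two indices combined in the evaluation have standard forms: the geoaccumulation index Igeo = log2(Cn / (1.5·Bn)), where Cn is the measured content and Bn the geochemical background, and the Nemerow composite index, which blends the mean and maximum single-metal indices. A sketch with hypothetical concentrations:

```python
import math

def igeo(c_n, b_n):
    """Geoaccumulation index: log2 of measured content over 1.5x background."""
    return math.log2(c_n / (1.5 * b_n))

def nemerow(pollution_indices):
    """Nemerow composite index: combines mean and maximum single-metal indices,
    so one severely polluting metal (here, Cd) dominates the composite score."""
    p_avg = sum(pollution_indices) / len(pollution_indices)
    p_max = max(pollution_indices)
    return math.sqrt((p_avg**2 + p_max**2) / 2)
```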

  16. Intracerebral hemorrhage after external ventricular drain placement: an evaluation of risk factors for post-procedural hemorrhagic complications.

    PubMed

    Rowe, A Shaun; Rinehart, Derrick R; Lezatte, Stephanie; Langdon, J Russell

    2018-03-07

    The objective of this study was to evaluate and identify the risk factors for developing a new or enlarged intracranial hemorrhage (ICH) after the placement of an external ventricular drain (EVD). A single-center, nested case-control study of individuals who received an external ventricular drain from June 1, 2011 to June 30, 2014 was conducted at a large academic medical center. A bivariate analysis was conducted to compare the individuals who experienced a post-procedural intracranial hemorrhage with those who did not experience a new bleed. The variables with a p-value less than 0.15 in the bivariate analysis were then evaluated using a multivariate logistic regression model. Twenty-seven of the eighty-one study participants experienced a new or enlarged intracranial hemorrhage after the placement of an external ventricular drain. Of these twenty-seven patients, 6 individuals received an antiplatelet agent within ninety-six hours of external ventricular drain placement (p = 0.024). The multivariate logistic regression model identified antiplatelet use within 96 h of external ventricular drain insertion as an independent risk factor for post-EVD ICH (OR 13.1; 95% CI 1.95-88.6; p = 0.008). Compared with the study participants who did not receive an antiplatelet agent within 96 h of external ventricular drain placement, those who did were 13.1 times more likely to exhibit a new or enlarged intracranial hemorrhage.

  17. A new gender-specific model for skin autofluorescence risk stratification

    PubMed Central

    Ahmad, Muhammad S.; Damanhouri, Zoheir A.; Kimhofer, Torben; Mosli, Hala H.; Holmes, Elaine

    2015-01-01

    Advanced glycation endproducts (AGEs) are believed to play a significant role in the pathophysiology of a variety of diseases including diabetes and cardiovascular diseases. Non-invasive skin autofluorescence (SAF) measurement serves as a proxy for tissue accumulation of AGEs. We assessed reference SAF and skin reflectance (SR) values in a Saudi population (n = 1,999) and evaluated the existing risk stratification scale. The mean SAF of the study cohort was 2.06 (SD = 0.57) arbitrary units (AU), which is considerably higher than the values reported for other populations. We show a previously unreported and significant difference in SAF values between men and women, with median (range) values of 1.77 AU (0.79–4.84 AU) and 2.20 AU (0.75–4.59 AU) respectively (p ≪ 0.01). Age, presence of diabetes and BMI were the most influential variables in determining SAF values in men, whilst in female participants, SR was also highly correlated with SAF. Diabetes, hypertension and obesity all showed strong association with SAF, particularly when gender differences were taken into account. We propose an adjusted, gender-specific disease risk stratification scheme for Middle Eastern populations. SAF is a potentially valuable clinical screening tool for cardiovascular risk assessment but risk scores should take gender and ethnicity into consideration for accurate diagnosis. PMID:25974028

  18. Cardiovascular outcomes after pharmacologic stress myocardial perfusion imaging.

    PubMed

    Lee, Douglas S; Husain, Mansoor; Wang, Xuesong; Austin, Peter C; Iwanochko, Robert M

    2016-04-01

    While pharmacologic stress single photon emission computed tomography myocardial perfusion imaging (SPECT-MPI) is used for noninvasive evaluation of patients who are unable to perform treadmill exercise, its impact on net reclassification improvement (NRI) of prognosis is unknown. We evaluated the prognostic value of pharmacologic stress MPI for prediction of cardiovascular death or non-fatal myocardial infarction (MI) within 1 year at a single-center, university-based laboratory. We examined continuous and categorical NRI of pharmacologic SPECT-MPI for prediction of outcomes beyond clinical factors alone. Six thousand two hundred forty patients (median age 66 years [IQR 56-74], 3466 men) were studied and followed for 5963 person-years. SPECT-MPI variables associated with increased risk of cardiovascular death or non-fatal MI included summed stress score, stress ST-shift, and post-stress resting left ventricular ejection fraction ≤50%. Compared to a clinical model which included age, sex, cardiovascular disease, risk factors, and medications, the model χ² (210.5 vs. 281.9, P < .001) and c-statistic (0.74 vs. 0.78, P < .001) were significantly increased by addition of the SPECT-MPI predictors (summed stress score, stress ST-shift and post-stress resting left ventricular ejection fraction). SPECT-MPI predictors increased continuous NRI by 49.4% (P < .001), reclassifying 66.5% of patients as lower risk and 32.8% as higher risk of cardiovascular death or non-fatal MI. Addition of MPI predictors to clinical factors using risk categories, defined as <1%, 1% to 3%, and >3% annualized risk of cardiovascular death or non-fatal MI, yielded a 15.0% improvement in NRI (95% CI 7.6%-27.6%, P < .001). Pharmacologic stress MPI substantially improved net reclassification of cardiovascular death or MI risk beyond that afforded by clinical factors. Copyright © 2016 Elsevier Inc. All rights reserved.
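
Categorical NRI as used above is the net upward reclassification among patients with events plus the net downward reclassification among patients without events. A generic sketch with ordered, 0-indexed risk categories (e.g. <1%, 1% to 3%, >3%), not the study's analysis code:

```python
def categorical_nri(old_cat, new_cat, event):
    """Net reclassification improvement over ordered risk categories.

    old_cat/new_cat: risk category per patient before/after adding predictors.
    event: True if the patient experienced the outcome.
    """
    n_event = sum(event)
    n_nonevent = len(event) - n_event
    up_ev = sum(n > o for o, n, e in zip(old_cat, new_cat, event) if e)
    dn_ev = sum(n < o for o, n, e in zip(old_cat, new_cat, event) if e)
    up_ne = sum(n > o for o, n, e in zip(old_cat, new_cat, event) if not e)
    dn_ne = sum(n < o for o, n, e in zip(old_cat, new_cat, event) if not e)
    # events should move up in risk, non-events should move down
    return (up_ev - dn_ev) / n_event + (dn_ne - up_ne) / n_nonevent
```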

  19. Chronic disease and chronic disease risk factors among First Nations, Inuit and Métis populations of northern Canada.

    PubMed

    Bruce, S G; Riediger, N D; Lix, L M

    2014-11-01

    Aboriginal populations in northern Canada are experiencing rapid changes in their environments, which may negatively impact their health status. The purpose of our study was to compare chronic conditions and risk factors in northern Aboriginal populations, including First Nations (FN), Inuit and Métis populations, and northern non-Aboriginal populations. Data were from the Canadian Community Health Survey for the period from 2005 to 2008. Weighted multiple logistic regression models tested the associations between ethnic groups and health outcomes. Model covariates were age, sex, territory of residence, education and income. Odds ratios (ORs) are reported, and a bootstrap method was used to calculate 95% confidence intervals (CIs) and p values. The odds of having at least one chronic condition were significantly lower for the Inuit (OR = 0.59; 95% CI: 0.43-0.81) than for the non-Aboriginal population, but similar among the FN, Métis and non-Aboriginal populations. The prevalence of many risk factors was significantly different for the Inuit, FN and Métis populations. Aboriginal populations in Canada's north have heterogeneous health status. Continued chronic disease and risk factor surveillance will be important to monitor changes over time and to evaluate the impact of public health interventions.

  20. Surface complexation modeling of Cd(II) sorption to montmorillonite, bacteria, and their composite

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Du, Huihui; Huang, Qiaoyun; Cai, Peng; Rong, Xingmin; Feng, Xionghan; Chen, Wenli

    2016-10-01

    Surface complexation modeling (SCM) has emerged as a powerful tool for simulating heavy metal adsorption processes on the surface of soil solid components under different geochemical conditions. The component additivity (CA) approach is one of the strategies that have been widely used in multicomponent systems. In this study, potentiometric titration, isothermal adsorption, zeta potential measurement, and extended X-ray absorption fine-structure (EXAFS) spectral analysis were conducted to investigate Cd adsorption on the 2:1 clay mineral montmorillonite, on the Gram-positive bacterium Bacillus subtilis, and on their mineral-organic composite. We developed constant capacitance models of Cd adsorption on montmorillonite, bacterial cells, and the mineral-organic composite. The adsorption behavior of Cd on the surface of the composite was well explained by CA-SCM. Some deviations from the model simulations were observed at pH < 5, where the values predicted by the model were lower than the experimental results. The Cd complexes X2Cd, SOCd+, R-COOCd+, and R-POCd+ were the predominant species on the composite surface over the pH range of 3 to 8. The distribution ratio of the adsorbed Cd between the montmorillonite and bacterial fractions in the composite as predicted by CA-SCM closely coincided with the EXAFS estimate at pH 6. The model could be useful for predicting heavy metal distribution at the interfaces of multicomponent systems and for risk evaluation in soils and associated environments.
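
The core of the component additivity idea is that sorption to the composite is predicted from the separately calibrated end-member models, weighted by their mass fractions, with no composite-specific fitting. It can be sketched as follows (illustrative function, not the study's constant capacitance code):

```python
def component_additivity(q_end_members, mass_fractions):
    """CA prediction: mass-weighted sum of adsorbed metal per end member
    (e.g. q in mg Cd per g of montmorillonite and of bacterial cells)."""
    assert abs(sum(mass_fractions) - 1.0) < 1e-9  # fractions must sum to 1
    return sum(q * f for q, f in zip(q_end_members, mass_fractions))
```

The same weights also give the predicted distribution ratio of adsorbed metal between the mineral and bacterial fractions, which is the quantity the study cross-checks against EXAFS.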

  1. Preoperative anemia versus blood transfusion: Which is the culprit for worse outcomes in cardiac surgery?

    PubMed

    LaPar, Damien J; Hawkins, Robert B; McMurry, Timothy L; Isbell, James M; Rich, Jeffrey B; Speir, Alan M; Quader, Mohammed A; Kron, Irving L; Kern, John A; Ailawadi, Gorav

    2018-04-04

    Reducing blood product utilization after cardiac surgery has become a focus of perioperative care, as studies have suggested improved outcomes. The relative impact of preoperative anemia versus packed red blood cell (PRBC) transfusion on outcomes remains poorly understood, however. In this study, we investigated the relative associations between preoperative hematocrit (Hct) level and PRBC transfusion and postoperative outcomes after coronary artery bypass grafting (CABG) surgery. Patient records for primary, isolated CABG operations performed between January 2007 and December 2017 at 19 cardiac surgery centers were evaluated. Hierarchical logistic regression modeling was used to estimate the relationships between baseline preoperative Hct level as well as PRBC transfusion and the likelihoods of postoperative mortality and morbidity, adjusted for baseline patient risk. Variable and model performance characteristics were compared to determine the relative strength of association between Hct level, PRBC transfusion and the primary outcomes. A total of 33,411 patients (median age, 65 years; interquartile range [IQR], 57-72 years; 26% female) were evaluated. The median preoperative Hct value was 39% (IQR, 36%-42%), and the mean Society of Thoracic Surgeons (STS) predicted risk of mortality was 1.8 ± 3.1%. Complications included PRBC transfusion in 31% of patients, renal failure in 2.8%, stroke in 1.3%, and operative mortality in 2.0%. A strong association was observed between preoperative Hct value and the likelihood of PRBC transfusion (P < .001). After risk adjustment, PRBC transfusion, but not Hct value, demonstrated stronger associations with postoperative mortality (odds ratio [OR], 4.3; P < .0001), renal failure (OR, 6.3; P < .0001), and stroke (OR, 2.4; P < .0001). A 1-point increase in preoperative Hct was associated with decreased probabilities of mortality (OR, 0.97; P = .0001) and renal failure (OR, 0.94; P < .0001).
The models with PRBC had superior predictive power, with a larger area under the curve, compared with Hct for all outcomes (all P < .01). Preoperative anemia was associated with up to a 4-fold increase in the probability of PRBC transfusion, a 3-fold increase in renal failure, and almost double the mortality. PRBC transfusion appears to be more closely associated with risk-adjusted morbidity and mortality compared with preoperative Hct level alone, supporting efforts to reduce unnecessary PRBC transfusions. Preoperative anemia independently increases the risk of postoperative morbidity and mortality. These data suggest that preoperative Hct should be included in the STS risk calculators. Finally, efforts to optimize preoperative hematocrit should be investigated as a potentially modifiable risk factor for mortality and morbidity. Copyright © 2018 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  2. Essays in applied microeconomics

    NASA Astrophysics Data System (ADS)

    Davis, Lucas William

    2005-11-01

    The first essay measures the impact of an outbreak of pediatric leukemia on local housing values. A model of residential location choice is used to describe conditions under which the gradient of the hedonic price function with respect to health risk is equal to household marginal willingness to pay to avoid pediatric leukemia risk. This equalizing differential is estimated using property-level sales records from a county in Nevada where residents have recently experienced a severe increase in pediatric leukemia. Housing values are compared before and after the increase, with a nearby county acting as a control group. The results indicate that housing values decreased 15.6% during the period of maximum risk. Results are similar for alternative measures of risk and across houses of different sizes. With risk estimates derived using a Bayesian learning model, the results imply a statistical value of pediatric leukemia of $5.6 million. The results from the paper provide some of the first market-based estimates of the value of health for children. The second essay evaluates the cost-effectiveness of public incentives that encourage households to purchase high-efficiency durable goods. The demand for durable goods and the demand for energy and other inputs are modeled jointly as the solution to a household production problem. The empirical analysis focuses on the case of clothes washers. The production technology and utilization decision are estimated using household-level data from field trials in which participants received front-loading clothes washers free of charge. The estimation strategy exploits this quasi-random replacement of washers to derive robust estimates of the utilization decision. The results indicate a price elasticity of -0.06 that is statistically different from zero across specifications. The parameters from the utilization decision are used to estimate the purchase decision using data from the Consumer Expenditure Survey, 1994-2002.
Households consider optimal utilization levels, purchase prices, water rates, energy rates and other factors when deciding which clothes washer to purchase. The complete model is used to simulate the effects of rebate programs and other policies on adoption patterns of clothes washers and household demand for water and energy.

  3. Inpatient Glucose Values: Determining the Nondiabetic Range and Use in Identifying Patients at High Risk for Diabetes.

    PubMed

    Rhee, Mary K; Safo, Sandra E; Jackson, Sandra L; Xue, Wenqiong; Olson, Darin E; Long, Qi; Barb, Diana; Haw, J Sonya; Tomolo, Anne M; Phillips, Lawrence S

    2018-04-01

    Many individuals with diabetes remain undiagnosed, leading to delays in treatment and higher risk for subsequent diabetes complications. Despite recommendations for diabetes screening in high-risk groups, the optimal approach is not known. We evaluated the utility of inpatient glucose levels as an opportunistic screening tool for identifying patients at high risk for diabetes. We retrospectively examined 462,421 patients in the US Department of Veterans Affairs healthcare system, hospitalized on medical/surgical services in 2000-2010 for ≥3 days, with ≥2 inpatient random plasma glucose (RPG) measurements. All had continuity of care: ≥1 primary care visit and ≥1 glucose measurement within 2 years before hospitalization and yearly for ≥3 years after discharge. Glucose levels during hospitalization and the incidence of diabetes within 3 years after discharge in patients without diabetes were evaluated. Patients had a mean age of 65.0 years and body mass index of 29.9 kg/m², and were 96% male, 71% white, and 18% black. Pre-existing diabetes was present in 39.4%, 1.3% were diagnosed during hospitalization, 8.1% were diagnosed within 5 years after discharge, and 51.3% were never diagnosed (NonDM). The NonDM group had the lowest mean hospital RPG value (112 mg/dL [6.2 mmol/L]). Having at least 2 RPG values >140 mg/dL (>7.8 mmol/L), the 95th percentile of NonDM hospital glucose, provided 81% specificity for identifying incident diabetes within 3 years after discharge. Screening for diabetes could be considered in patients with at least 2 hospital glucose values at/above the 95th percentile of the nondiabetic range (141 mg/dL [7.8 mmol/L]). Published by Elsevier Inc.
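
The opportunistic screening rule suggested by these data is simple to state in code; the threshold and count follow the abstract, and the function name is illustrative:

```python
def flag_for_diabetes_screening(rpg_mg_dl, threshold=140.0, min_count=2):
    """Flag a hospitalized patient for follow-up diabetes screening when at
    least `min_count` random plasma glucose values exceed `threshold` mg/dL
    (the 95th percentile of the nondiabetic inpatient range)."""
    return sum(g > threshold for g in rpg_mg_dl) >= min_count
```

For example, a patient with inpatient RPG values of 152, 168 and 118 mg/dL would be flagged, while one with a single excursion above 140 mg/dL would not.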

  4. Estimating the value of a Country's built assets: investment-based exposure modelling for global risk assessment

    NASA Astrophysics Data System (ADS)

    Daniell, James; Pomonis, Antonios; Gunasekera, Rashmin; Ishizawa, Oscar; Gaspari, Maria; Lu, Xijie; Aubrecht, Christoph; Ungar, Joachim

    2017-04-01

    In order to quantify disaster risk, there is a need to determine consistent and reliable economic values of the built assets exposed to natural hazards at the national or sub-national level. The value of the built stock of a city or country is critical for risk modelling applications, as it allows the upper bound on potential losses to be established. Under the World Bank probabilistic disaster risk assessment - Country Disaster Risk Profiles (CDRP) Program and rapid post-disaster loss analyses in CATDAT, key methodologies have been developed that quantify the asset exposure of a country. In this study, we assess two complementary methods of determining the value of building stock: capital investment data versus aggregated ground-up values based on built area and unit cost of construction analyses. Different approaches to modelling exposure around the world have resulted in estimated values of the built assets of some countries differing by order(s) of magnitude. Using the aforementioned methodology of comparing investment-based capital stock and bottom-up unit cost of construction values per square meter of assets, a suitable range of capital stock estimates for built assets has been created. A blind test was undertaken to compare the two types of approaches, top-down (investment) and bottom-up (construction cost per unit). Census, demographic, engineering and construction cost data from previous years are key for the bottom-up calculations; similarly, for the top-down investment approach, distributed GFCF (Gross Fixed Capital Formation) data are required. Over the past few years, numerous studies have been undertaken through the World Bank Caribbean and Central America disaster risk assessment program adopting this methodology, initially developed by Gunasekera et al. (2015). The range of values of the building stock is tested for around 15 countries.
In addition, three types of costs - Reconstruction cost (building back to the standard required by building codes); Replacement cost (gross capital stock) and Book value (net capital stock - depreciated value of assets) are discussed and the differences in methodologies assessed. We then examine historical costs (reconstruction and replacement) and losses (book value) of natural disasters versus this upper bound of capital stock in various locations to examine the impact of a reasonable capital stock estimate. It is found that some historic loss estimates in publications are not reasonable given the value of assets at the time of the event. This has applications for quantitative disaster risk assessment and development of country disaster risk profiles, economic analyses and benchmarking upper loss limits of built assets damaged due to natural hazards.

  5. Optimizing isothiocyanate formation during enzymatic glucosinolate breakdown by adjusting pH value, temperature and dilution in Brassica vegetables and Arabidopsis thaliana

    NASA Astrophysics Data System (ADS)

    Hanschen, Franziska S.; Klopsch, Rebecca; Oliviero, Teresa; Schreiner, Monika; Verkerk, Ruud; Dekker, Matthijs

    2017-01-01

    Consumption of glucosinolate-rich Brassicales vegetables is associated with a decreased risk of cancer with enzymatic hydrolysis of glucosinolates playing a key role. However, formation of health-promoting isothiocyanates is inhibited by the epithiospecifier protein in favour of nitriles and epithionitriles. Domestic processing conditions, such as changes in pH value, temperature or dilution, might also affect isothiocyanate formation. Therefore, the influences of these three factors were evaluated in accessions of Brassica rapa, Brassica oleracea, and Arabidopsis thaliana. Mathematical modelling was performed to determine optimal isothiocyanate formation conditions and to obtain knowledge on the kinetics of the reactions. At 22 °C and endogenous plant pH, nearly all investigated plants formed nitriles and epithionitriles instead of health-promoting isothiocyanates. Response surface models, however, clearly demonstrated that upon change in pH to domestic acidic (pH 4) or basic pH values (pH 8), isothiocyanate formation considerably increases. While temperature also affects this process, the pH value has the greatest impact. Further, a kinetic model showed that isothiocyanate formation strongly increases due to dilution. Finally, the results show that isothiocyanate intake can be strongly increased by optimizing the conditions of preparation of Brassicales vegetables.

  6. Comparison of Measured and Predicted Bioconcentration Estimates of Pharmaceuticals in Fish Plasma and Prediction of Chronic Risk.

    PubMed

    Nallani, Gopinath; Venables, Barney; Constantine, Lisa; Huggett, Duane

    2016-05-01

    Evaluation of the environmental risk of human pharmaceuticals is now a mandatory component of all new drug applications submitted for approval in the EU. With >3000 drugs currently in use, it is not feasible to test each active ingredient, so prioritization is key. A recent review listed nine prioritization approaches, including the fish plasma model (FPM). The present paper focuses on a comparison of measured and predicted fish plasma bioconcentration factors (BCFs) of four common over-the-counter/prescribed pharmaceuticals: norethindrone (NET), ibuprofen (IBU), verapamil (VER) and clozapine (CLZ). The measured data were obtained from earlier published fish BCF studies. The measured BCF estimates of NET, IBU, VER and CLZ were 13.4, 1.4, 0.7 and 31.2, while the corresponding predicted BCFs (based on log Kow at pH 7) were 19, 1.0, 7.6 and 30, respectively. These results indicate that the predicted BCFs matched the measured values well. The BCF estimates were used to calculate the human:fish plasma concentration ratio of each drug to predict potential risk to fish. The plasma ratio results show the following order of risk potential for fish: NET > CLZ > VER > IBU. The FPM has value in prioritizing pharmaceutical products for ecotoxicological assessments.
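The FPM prediction step can be sketched as follows. The partition-coefficient regression used here (log10 P_blood:water = 0.73 × logKow − 0.88) is the relationship commonly cited in fish plasma model studies; the numeric inputs are hypothetical, not this study's data:

```python
def plasma_bcf_from_logkow(log_kow):
    """Predicted fish blood:water partition coefficient from the regression
    log10(P_blood:water) = 0.73 * logKow - 0.88 (commonly cited in FPM work)."""
    return 10.0 ** (0.73 * log_kow - 0.88)

def effect_ratio(human_therapeutic_plasma, water_conc, log_kow):
    """Human:fish plasma concentration ratio; lower values flag higher priority."""
    fish_plasma = water_conc * plasma_bcf_from_logkow(log_kow)  # predicted steady state
    return human_therapeutic_plasma / fish_plasma

# Hypothetical inputs; concentration units need only be mutually consistent.
print(round(plasma_bcf_from_logkow(1.2), 2))    # predicted plasma BCF
print(round(effect_ratio(100.0, 0.5, 1.2), 1))  # human:fish plasma ratio
```

Ranking candidate drugs by this ratio (ascending) reproduces the kind of prioritization order reported in the abstract.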

  7. Assessing Breast Cancer Risk with an Artificial Neural Network

    PubMed

    Sepandi, Mojtaba; Taghdir, Maryam; Rezaianzadeh, Abbas; Rahimikazerooni, Salar

    2018-04-25

    Objectives: Radiologists face uncertainty in making decisions based on their judgment of breast cancer risk. Artificial intelligence and machine learning techniques have been widely applied in the detection and recognition of cancer. This study aimed to establish a model to aid radiologists in breast cancer risk estimation, incorporating imaging methods and fine-needle aspiration biopsy (FNAB) for cytopathological diagnosis. Methods: An artificial neural network (ANN) was trained on a retrospectively collected dataset including mammographic results, risk factors, and clinical findings to predict the probability of breast cancer in individual patients. Area under the receiver-operating characteristic curve (AUC), accuracy, sensitivity, specificity, and positive and negative predictive values were used to evaluate discriminative performance. Results: The network incorporating the selected features performed best (AUC = 0.955). Sensitivity and specificity of the ANN were 0.82 and 0.90, respectively. Negative and positive predictive values were 0.90 and 0.80, respectively. Conclusion: The ANN has potential applications as a decision-support tool to help underperforming practitioners improve the positive predictive value of biopsy recommendations. Creative Commons Attribution License

  8. Evaluating the impact of prioritization of antiretroviral pre-exposure prophylaxis (PrEP) in New York City

    PubMed Central

    Kessler, Jason; Myers, Julie E.; Nucifora, Kimberly A.; Mensah, Nana; Toohey, Christopher; Khademi, Amin; Cutler, Blayne; Braithwaite, R. Scott

    2015-01-01

    Objective To compare the value and effectiveness of different prioritization strategies for pre-exposure prophylaxis (PrEP) in New York City (NYC). Design Mathematical modeling was used, as a clinical trial is not feasible. Methods Using a model accounting for both sexual and parenteral transmission of HIV, we compared different prioritization strategies (PPS) for PrEP with two scenarios: no PrEP, and PrEP for all susceptible at-risk individuals. The PPS included PrEP for all MSM, only high-risk MSM, high-risk heterosexuals, and injection drug users, and all combinations of these four strategies. Outcomes included HIV infections averted and incremental cost-effectiveness (per infection averted) ratios. Initial assumptions regarding PrEP included a 44% reduction in HIV transmission, 50% uptake in the prioritized population, and an annual cost per person of $9,762. Sensitivity analyses on key parameters were conducted. Results Prioritization to all MSM results in a 19% reduction in new HIV infections. Compared with PrEP for all persons at risk, this PPS retains 79% of the preventive effect at 15% of the total cost. PrEP prioritized to only high-risk MSM results in a 15% reduction in new HIV infections; this PPS retains 60% of the preventive effect at 6% of the total cost. There are diminishing returns when PrEP utilization is expanded beyond this group. Conclusions PrEP implementation is relatively cost-inefficient under our initial assumptions. Our results suggest that PrEP should first be promoted among MSM who are at particularly high risk of HIV acquisition. Further expansion beyond this group may be cost-effective, but is unlikely to be cost-saving. PMID:25493594
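The cost-effectiveness outcome above (cost per infection averted) reduces to simple arithmetic once the model has produced infection counts. The sketch below uses the stated unit-cost and uptake assumptions, but the population and infection counts are hypothetical, not the study's results:

```python
ANNUAL_COST_PER_PERSON = 9_762  # stated PrEP cost assumption, USD
UPTAKE = 0.5                    # stated uptake in the prioritized population

def prep_program_cost(prioritized_population, years):
    """Total PrEP drug cost for a prioritization strategy over a horizon."""
    return prioritized_population * UPTAKE * ANNUAL_COST_PER_PERSON * years

def cost_per_infection_averted(strategy_cost, infections_no_prep, infections_strategy):
    """Incremental cost-effectiveness ratio versus the no-PrEP scenario."""
    averted = infections_no_prep - infections_strategy
    return strategy_cost / averted

# Hypothetical 10-year illustration for a prioritized group of 100,000 people,
# assuming a 19% reduction in new infections (30,000 -> 24,300):
cost = prep_program_cost(100_000, years=10)
icer = cost_per_infection_averted(cost, infections_no_prep=30_000, infections_strategy=24_300)
print(f"cost = ${cost:,.0f}; cost per infection averted = ${icer:,.0f}")
```

Comparing such ratios across strategies is what reveals the diminishing returns the authors describe.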

  9. Approaches for the Application of Physiologically Based ...

    EPA Pesticide Factsheets

    This draft report, Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment, addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications. Topics covered include: the types of data required for the use of PBPK models in risk assessment; the evaluation of PBPK models for use in risk assessment; and the application of these models to address uncertainties resulting from extrapolations (e.g., interspecies extrapolation) often used in risk assessment. In addition, appendices are provided that include a compilation of chemical partition coefficients and rate constants, algorithms for estimating chemical-specific parameters, and a list of publications relating to PBPK modeling. This report is primarily meant to serve as a learning tool for EPA scientists and risk assessors who may be less familiar with the field. In addition, this report can be informative to PBPK modelers within and outside the Agency, as it provides an assessment of the types of data and models that the EPA requires for consideration of a model for use in risk assessment.

  10. Which risk models perform best in selecting ever-smokers for lung cancer screening?

    Cancer.gov

    A new analysis by scientists at NCI evaluates nine different individualized lung cancer risk prediction models based on their selections of ever-smokers for computed tomography (CT) lung cancer screening.

  11. Diffusion-like recommendation with enhanced similarity of objects

    NASA Astrophysics Data System (ADS)

    An, Ya-Hui; Dong, Qiang; Sun, Chong-Jing; Nie, Da-Cheng; Fu, Yan

    2016-11-01

    In the last decade, diversity and accuracy have been regarded as two important measures in evaluating a recommendation model. However, a clear concern is that a model focusing excessively on one measure will put the other one at risk, thus it is not easy to greatly improve diversity and accuracy simultaneously. In this paper, we propose to enhance the Resource-Allocation (RA) similarity in resource transfer equations of diffusion-like models, by giving a tunable exponent to the RA similarity, and traversing the value of this exponent to achieve the optimal recommendation results. In this way, we can increase the recommendation scores (allocated resource) of many unpopular objects. Experiments on three benchmark data sets, MovieLens, Netflix and RateYourMusic show that the modified models can yield remarkable performance improvement compared with the original ones.
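A minimal sketch of the idea on a toy user-object bipartite network: compute the resource-allocation (RA) similarity between objects, raise it to a tunable exponent theta, and use the result to redistribute each user's resource onto candidate objects. This is a simplified variant for illustration, not the authors' exact transfer equations:

```python
import numpy as np

A = np.array([  # user-object adjacency: rows = users, cols = objects
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
], dtype=float)

k_user = A.sum(axis=1)  # user degrees
k_obj = A.sum(axis=0)   # object degrees

# RA similarity: S[i, j] = sum over users l of A[l, i] * A[l, j] / k_user[l]
S = (A / k_user[:, None]).T @ A

theta = 0.8  # tunable exponent; in the paper its value is traversed for the optimum
W = np.where(S > 0, S ** theta, 0.0) / k_obj[None, :]  # normalized transfer matrix

scores = A @ W.T        # resource each user's collected objects send to each object
scores[A > 0] = 0.0     # never recommend items the user has already collected
print(scores)
```

Lowering theta below 1 flattens the similarity weights, which boosts the allocated resource of low-degree (unpopular) objects and thereby trades a little accuracy for diversity, matching the tuning described above.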

  12. Evaluation of pro-convulsant risk in the rat: spontaneous and provoked convulsions.

    PubMed

    Esneault, Elise; Peyon, Guillaume; Froger-Colléaux, Christelle; Castagné, Vincent

    2015-01-01

    The aim of the present study was to evaluate the utility of different tests, performed in the absence or presence of factors promoting seizures, for evaluating the pro-convulsant effects of drugs. We studied the effects of theophylline in the rat, since this is a well-known pro-convulsant substance in humans. The occurrence of spontaneous convulsions following administration of theophylline was evaluated by observation in the Irwin test and by measuring brain activity using video-EEG recording in conscious telemetered animals. Theophylline was also tested in the electroconvulsive shock (ECS) threshold and pentylenetetrazole (PTZ)-induced convulsions tests, two commonly used models of provoked convulsions. In the Irwin test, theophylline induced convulsions in 1 out of 6 rats at 128 mg/kg. Paroxysmal/seizure activity was also observed by video-EEG recording in 4 out of the 12 animals tested at 128 mg/kg, in the presence of clonic convulsions in 3 of the 4 rats. Paroxysmal activity was observed in two rats in the absence of clear behavioral symptoms, indicating that some precursor signs can be detected using video-EEG. Clear pro-convulsant activity was shown over the dose range 32-128 mg/kg in the ECS threshold and PTZ-induced convulsions tests. Evaluation of spontaneous convulsions provides information on the therapeutic window of a drug, and the translational value of the approach is increased by the use of video-EEG. Tests based on provoked convulsions further complement the evaluation, since they aim to mimic high-risk situations. Measurement of both spontaneous and provoked convulsions improves the evaluation of the pro-convulsant risk of novel pharmacological substances. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Risk-Based Questionnaires Fail to Detect Adolescent Iron Deficiency and Anemia.

    PubMed

    Sekhar, Deepa L; Murray-Kolb, Laura E; Schaefer, Eric W; Paul, Ian M

    2017-08-01

    To evaluate the predictive ability of screening questionnaires to identify adolescent women at high risk for iron deficiency or iron deficiency anemia who warrant objective laboratory testing. Cross-sectional study of 96 female individuals 12-21 years old seen at an academic medical center. Participants completed an iron deficiency risk assessment questionnaire including the 4 Bright Futures Adolescent Previsit Questionnaire anemia questions, along with depression, attention, food insecurity, and daytime sleepiness screens. Multiple linear regression controlling for age, race, and hormonal contraception use compared the predictive ability of 2 models for adolescent iron deficiency (defined as ferritin <12 mcg/L) and anemia (hemoglobin <12 g/dL). Model 1, the Bright Futures questions, was compared with model 2, which included the 4 aforementioned screens and body mass index percentile. Among participants, 18% (17/96) had iron deficiency and 5% (5/96) had iron deficiency anemia. Model 1 (Bright Futures) poorly predicted ferritin and hemoglobin values (R² = 0.03 and 0.08, respectively). Model 2 demonstrated similarly poor predictive ability (R² = 0.05 and 0.06, respectively). Mean differences for depressive symptoms (0.3, 95% CI -0.2, 0.8), attention difficulty (-0.1, 95% CI -0.5, 0.4), food insecurity (0.04, 95% CI -0.5, 0.6), daytime sleepiness (0.1, 95% CI -0.1, 0.3), and body mass index percentile (-0.04, 95% CI -0.3, 0.2) were not significantly associated with ferritin in model 2. Mean differences for hemoglobin were also nonsignificant. Risk-based surveys poorly predict objective measures of iron status using ferritin and hemoglobin. Next steps are to establish the optimal timing for objective assessment of adolescent iron deficiency and anemia. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Evaluation of regulatory variation and theoretical health risk for pesticide maximum residue limits in food.

    PubMed

    Li, Zijian

    2018-08-01

    To evaluate whether pesticide maximum residue limits (MRLs) can protect public health, a deterministic dietary risk assessment of maximum pesticide legal exposure was conducted to convert global MRLs to theoretical maximum dose intake (TMDI) values, by estimating the average food intake rate and human body weight for each country. A total of 114 nations (58% of the nations in the world) and two international organizations, the European Union (EU) and Codex (WHO), have regulated at least one of the most commonly used pesticides in at least one of the most consumed agricultural commodities. In this study, 14 of the most commonly used pesticides and 12 of the most commonly consumed agricultural commodities were identified and selected for analysis. A health risk analysis indicated that nearly 30% of the computed pesticide TMDI values were greater than the acceptable daily intake (ADI) values; however, many nations lack common pesticide MRLs for many commonly consumed foods, and other human exposure pathways, such as soil, water, and air, were not considered. Normality tests of the TMDI value sets indicated that all distributions had a right skewness, due to large TMDI clusters at the low end of the distribution caused by some strict pesticide MRLs regulated by the EU (normally a default MRL of 0.01 mg/kg when essential data are missing). The Box-Cox transformation and optimal lambda (λ) were applied to these TMDI distributions, and normality tests of the transformed data indicated that the power-transformed TMDI values of at least eight pesticides presented a normal distribution. It was concluded that unifying strict pesticide MRLs worldwide could significantly skew the distribution of TMDI values to the right, lower the legal exposure to pesticides, and effectively control human health risks. Copyright © 2018 Elsevier Ltd. All rights reserved.
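The TMDI conversion described above amounts to summing MRL × commodity intake over foods and normalizing by body weight, so that the result can be compared directly with the ADI. A minimal sketch with hypothetical numbers (not the study's data):

```python
def tmdi_mg_per_kg_bw(mrls_mg_per_kg, intakes_kg_per_day, body_weight_kg):
    """Theoretical maximum daily intake of one pesticide across commodities,
    expressed per kg body weight for comparison against the ADI."""
    exposure = sum(mrl * intake for mrl, intake in zip(mrls_mg_per_kg, intakes_kg_per_day))
    return exposure / body_weight_kg

# Hypothetical example: one pesticide regulated in three commodities.
mrls = [0.05, 0.01, 0.5]      # MRLs in mg/kg (e.g. wheat, potato, apple)
intakes = [0.25, 0.15, 0.10]  # national average intake, kg/person/day
tmdi = tmdi_mg_per_kg_bw(mrls, intakes, body_weight_kg=60.0)
adi = 0.001                   # hypothetical ADI, mg/kg bw/day
print(f"TMDI = {tmdi:.5f} mg/kg bw/day; exceeds ADI: {tmdi > adi}")
```

Repeating this per country (with country-specific intake rates and body weights) yields the TMDI distributions whose right skewness is analyzed above.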

  15. Sensor-based fall risk assessment--an expert 'to go'.

    PubMed

    Marschollek, M; Rehwald, A; Wolf, K H; Gietzelt, M; Nemitz, G; Meyer Zu Schwabedissen, H; Haux, R

    2011-01-01

    Falls are a predominant problem in our aging society, often leading to severe somatic and psychological consequences, and having an incidence of about 30% in the group of persons aged 65 years or above. In order to identify persons at risk, many assessment tools and tests have been developed, but most of these have to be conducted in a supervised setting and depend on an expert rater. The overall aim of our research work is to develop an objective and unobtrusive method to determine individual fall risk based on the use of motion sensor data. The aims of our work for this paper are to derive a fall risk model based on sensor data that may potentially be measured during typical activities of daily life (aim #1), and to evaluate the resulting model with data from a one-year follow-up study (aim #2). A sample of n = 119 geriatric inpatients wore an accelerometer on the waist during a Timed 'Up & Go' test and a 20 m walk. Fifty patients were included in a one-year follow-up study, assessing fall events and scoring average physical activity at home in telephone interviews. The sensor data were processed to extract gait and dynamic balance parameters, from which four fall risk models--two classification trees and two logistic regression models--were computed: models CT#1 and SL#1 using accelerometer data only, models CT#2 and SL#2 including the physical activity score. The risk models were evaluated in a ten-times tenfold cross-validation procedure, calculating sensitivity (SENS), specificity (SPEC), positive and negative predictive values (PPV, NPV), classification accuracy, area under the curve (AUC) and the Brier score. Both classification trees show fair to good performance (models CT#1/CT#2): SENS 74%/58%, SPEC 96%/82%, PPV 92%/74%, NPV 77%/82%, accuracy 80%/78%, AUC 0.83/0.87 and Brier scores 0.14/0.14. The logistic regression models (SL#1/SL#2) perform worse: SENS 42%/58%, SPEC 82%/78%, PPV 62%/65%, NPV 67%/72%, accuracy 65%/70%, AUC 0.65/0.72 and Brier scores 0.23/0.21. Our results suggest that accelerometer data may be used to predict falls in an unsupervised setting. Furthermore, the parameters used for prediction are measurable with an unobtrusive sensor device during normal activities of daily living. These promising results have to be validated in a larger, long-term prospective trial.
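The performance figures quoted above derive from a 2x2 confusion matrix plus predicted probabilities. A quick sketch of how such metrics are computed; the counts here are hypothetical, chosen only to roughly mirror model CT#1:

```python
def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV and accuracy from confusion-matrix counts."""
    return {
        "SENS": tp / (tp + fn),
        "SPEC": tn / (tn + fp),
        "PPV":  tp / (tp + fp),
        "NPV":  tn / (tn + fn),
        "ACC":  (tp + tn) / (tp + fp + tn + fn),
    }

def brier_score(predicted_probs, outcomes):
    """Mean squared difference between predicted fall probability and outcome (0/1)."""
    return sum((p - y) ** 2 for p, y in zip(predicted_probs, outcomes)) / len(outcomes)

# Hypothetical counts: 17 true fallers detected, 6 missed; 1 false alarm, 26 correct negatives.
m = classification_metrics(tp=17, fp=1, tn=26, fn=6)
print({k: round(v, 2) for k, v in m.items()})
print(brier_score([0.9, 0.2, 0.7, 0.1], [1, 0, 1, 0]))
```

In the study these metrics are averaged over ten repetitions of tenfold cross-validation rather than computed once, which stabilizes the estimates on a small sample.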

  16. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    PubMed

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. 
Finally, we compared the "classical" risk assessment approach with the model-based approach. These comparisons showed that TK and TK-TD models can bring more realism to the risk assessment through the possibility of studying realistic exposure scenarios and simulating relevant mechanisms of effects (including delayed toxicity and recovery). Notably, using TK-TD models is currently the most relevant way to directly connect realistic exposure patterns to effects. We conclude with recommendations on how to properly use TK and TK-TD models in acute risk assessment for vertebrates. © 2015 SETAC.

  17. The ecological risks of genetically engineered organisms

    NASA Astrophysics Data System (ADS)

    Wolfenbarger, Lareesa

    2001-03-01

    Highly publicized studies have suggested environmental risks of releasing genetically engineered organisms (GEOs) and have renewed concerns over the evaluation and regulation of these products in domestic and international arenas. I present an overview of the risks of GEOs and the available evidence addressing these risks, and discuss the challenges for risk assessment. Main categories of risk include non-target effects from GEOs, the emergence of new viral diseases, and the spread of invasive (weedy) characteristics. Studies have detected non-target effects in some cases but not all; however, much less information exists on other risks, in part due to a lack of conceptual knowledge. For example, general models for predicting invasiveness are not well developed for any introduced organism. The risks of GEOs appear comparable to those for any introduced species or organism, but the magnitude of the risk or the pathway of exposure to the risk can differ among introduced organisms. Therefore, assessing the risks requires a case-by-case analysis so that any differences can be identified. Challenges to assessing risks to valued ecosystems include variability in effects and ecosystem complexity. Ecosystems are a dynamic and complex network of biological and physical interactions. Introducing a new biological entity, such as a GEO, may potentially alter any of these interactions, but evaluating all of these is unrealistic. Effects on a valued ecosystem could vary greatly depending on the geographical location of the experimental site, the GEO used, the plot size of the experiment (scaling effects), and the biological and physical parameters used in the experiment. Experiments that address these sources of variability will provide the most useful information for risk assessments.

  18. WHAT ARE THE BEST MEANS TO ASSESS CONTAMINANT TRANSPORT AND BIODEGRADATION AND MOVE TOWARD CLOSURE, USING APPROPRIATE SITE-SPECIFIC RISK EVALUATIONS?

    EPA Science Inventory

    Site remedy and closure decisions are made from a mixture of site data, literature values, and model results. Often assessment of this information is difficult for State Agency case managers because conventional approaches to site characterization do not yield a clear and strai...

  19. Risk assessment of flood disaster and forewarning model at different spatial-temporal scales

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Jin, Juliang; Xu, Jinchao; Guo, Qizhong; Hang, Qingfeng; Chen, Yaqian

    2018-05-01

    Aiming to reduce losses from flood disasters, a model for flood disaster risk assessment and forewarning is studied. The model is built upon risk indices in the flood disaster system, proceeding from the whole structure and its parts at different spatial-temporal scales. On the one hand, the study establishes a long-term forewarning model for the surface area with three levels: prediction, evaluation, and forewarning. A structure-adaptive back-propagation neural network with peak identification is used to simulate indices in the prediction sub-model. Set pair analysis is employed to calculate the connection degrees of a single index, the comprehensive index, and systematic risk through the multivariate connection number, and a comprehensive assessment is made using assessment matrices in the evaluation sub-model. A comparison judging method is adopted to assign the flood disaster warning degree from the comprehensive risk assessment index against forewarning standards in the forewarning sub-model, yielding the long-term local conditions for proposing planning schemes. On the other hand, the study sets up a real-time forewarning model for the spot, which introduces a real-time correction technique (a Kalman filter) based on a hydrological model with a forewarning index, yielding the real-time local conditions for presenting an emergency plan. This study takes the Tunxi area, Huangshan City, China, as an example. After establishing and applying the risk assessment and forewarning model at different spatial-temporal scales with actual and simulated data from 1989 to 2008, the forewarning results show that the flood disaster risk trend declines on the whole from 2009 to 2013, despite a rise in 2011. At the macroscopic level, structural and non-structural measures are advanced, while at the microcosmic level, the time, place, and method are listed. This suggests that the proposed model is feasible in both theory and application, thus offering a way of assessing and forewarning flood disaster risk.

  20. Does the Surgical Apgar Score Measure Intraoperative Performance?

    PubMed Central

    Regenbogen, Scott E.; Lancaster, R. Todd; Lipsitz, Stuart R.; Greenberg, Caprice C.; Hutter, Matthew M.; Gawande, Atul A.

    2008-01-01

    Objective To evaluate whether Surgical Apgar Scores measure the relationship between intraoperative care and surgical outcomes. Summary Background Data With preoperative risk-adjustment now well-developed, the role of intraoperative performance in surgical outcomes may be considered. We previously derived and validated a ten-point Surgical Apgar Score—based on intraoperative blood loss, heart rate, and blood pressure—that effectively predicts major postoperative complications within 30 days of general and vascular surgery. This study evaluates whether the predictive value of this score comes solely from patients’ preoperative risk, or also measures care in the operating room. Methods Among a systematic sample of 4,119 general and vascular surgery patients at a major academic hospital, we constructed a detailed risk-prediction model including 27 patient-comorbidity and procedure-complexity variables, and computed patients’ propensity to suffer a major postoperative complication. We evaluated the prognostic value of patients’ Surgical Apgar Scores before and after adjustment for this preoperative risk. Results After risk-adjustment, the Surgical Apgar Score remained strongly correlated with postoperative outcomes (p<0.0001). Odds of major complications among average-scoring patients (scores 7–8) were equivalent to preoperative predictions (likelihood ratio (LR) 1.05, 95%CI 0.78–1.41), significantly decreased for those who achieved the best scores of 9–10 (LR 0.52, 95%CI 0.35–0.78), and were significantly poorer for those with low scores—LRs 1.60 (1.12–2.28) for scores 5–6, and 2.80 (1.50–5.21) for scores 0–4. Conclusions Even after accounting for fixed preoperative risk—due to patients’ acute condition, comorbidities and/or operative complexity—the Surgical Apgar Score appears to detect differences in intraoperative management that reduce odds of major complications by half, or increase them by nearly three-fold. PMID:18650644
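The reported likelihood ratios translate a patient's preoperative (risk-adjusted) complication probability into a post-score probability via the odds form of Bayes' rule. A small sketch using the LRs quoted above and a hypothetical 10% preoperative risk:

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Convert a preoperative predicted complication probability into the
    post-score probability: post_odds = pre_odds * LR."""
    pre_odds = pretest_prob / (1.0 - pretest_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# LRs reported in the study for each Surgical Apgar Score band:
for score_band, lr in [("9-10", 0.52), ("7-8", 1.05), ("5-6", 1.60), ("0-4", 2.80)]:
    print(score_band, round(posttest_probability(0.10, lr), 3))
```

This makes the abstract's conclusion concrete: the same preoperative risk roughly halves for the best scores and nearly triples (in odds) for the worst.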

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walsh, Seán, E-mail: walshsharp@gmail.com; Department of Oncology, Gray Institute for Radiation Oncology and Biology, University of Oxford, Oxford OX3 7DQ; Roelofs, Erik

    Purpose: A fully heterogeneous population averaged mechanistic tumor control probability (TCP) model is appropriate for the analysis of external beam radiotherapy (EBRT). This has been accomplished for EBRT photon treatment of intermediate-risk prostate cancer. Extending the TCP model to low- and high-risk patients would be beneficial for overall decision making. Furthermore, different radiation treatment modalities such as protons and carbon-ions are becoming increasingly available. Consequently, there is a need for a complete TCP model. Methods: A TCP model was fitted and validated to a primary endpoint of 5-year biological no evidence of disease clinical outcome data obtained from a review of the literature for low, intermediate, and high-risk prostate cancer patients (5218 patients fitted, 1088 patients validated), treated by photons, protons, or carbon-ions. The review followed the preferred reporting items for systematic reviews and meta-analyses (PRISMA) statement. Treatment regimens include standard fractionation and hypofractionation treatments. Residual analysis and goodness-of-fit statistics were applied. Results: The TCP model achieves a good level of fit overall: linear regression results in a p-value of <0.00001, with an adjusted weighted R² value of 0.77 and a weighted root mean squared error (wRMSE) of 1.2% against the fitted clinical outcome data. Validation of the model utilizing three independent datasets obtained from the literature resulted in an adjusted weighted R² value of 0.78 and a wRMSE of less than 1.8% against the validation clinical outcome data. The weighted mean absolute residual across the entire dataset is found to be 5.4%. Conclusions: This TCP model, fitted and validated to clinical outcome data, appears to be an appropriate model for the inclusion of all clinical prostate cancer risk categories, and allows evaluation of current EBRT modalities with regard to tumor control prediction.
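The quoted goodness-of-fit statistics (weighted R² and wRMSE) can be computed as below. The weighting convention (e.g. by cohort size) and the data points are assumptions for illustration, not the study's dataset:

```python
import math

def weighted_rmse(observed, predicted, weights):
    """Weighted root mean squared error, in the same units as the outcome (e.g. %)."""
    total_w = sum(weights)
    return math.sqrt(sum(w * (o - p) ** 2
                         for o, p, w in zip(observed, predicted, weights)) / total_w)

def weighted_r2(observed, predicted, weights):
    """One common weighted R² definition: 1 - weighted SS_res / weighted SS_tot."""
    total_w = sum(weights)
    wmean = sum(w * o for o, w in zip(observed, weights)) / total_w
    ss_res = sum(w * (o - p) ** 2 for o, p, w in zip(observed, predicted, weights))
    ss_tot = sum(w * (o - wmean) ** 2 for o, w in zip(observed, weights))
    return 1.0 - ss_res / ss_tot

# Hypothetical cohorts: observed vs model-predicted 5-year control rates (%),
# weighted by cohort size.
obs = [85.0, 78.0, 92.0, 70.0]
pred = [84.0, 80.0, 91.0, 72.0]
n = [200, 150, 300, 120]
print(round(weighted_rmse(obs, pred, n), 2), round(weighted_r2(obs, pred, n), 3))
```

Weighting by cohort size keeps large, well-powered outcome studies from being outvoted by small ones in the fit statistics.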

  2. Extension of classical hydrological risk analysis to non-stationary conditions due to climate change - application to the Fulda catchment, Germany

    NASA Astrophysics Data System (ADS)

    Fink, G.; Koch, M.

    2010-12-01

    An important aspect in water resources and hydrological engineering is the assessment of hydrological risk due to the occurrence of extreme events, e.g. droughts or floods. When dealing with the latter - as is the focus here - the classical methods of flood frequency analysis (FFA) are usually used for the proper dimensioning of a hydraulic structure, for the purpose of bringing the flood risk down to an acceptable level. FFA is based on extreme value statistics theory. Despite the progress of methods in this scientific branch, the development, selection, and fitting of an appropriate distribution function still remains a challenge, particularly when certain underlying assumptions of the theory are not met in real applications. This is, for example, the case when the stationarity condition for a random flood time series is no longer satisfied, as may be the situation when long-term hydrological impacts of future climate change are to be considered. The objective here is to verify the applicability of classical (stationary) FFA to predicted flood time series in the Fulda catchment in central Germany, as they may occur in the wake of climate change during the 21st century. These discharge time series at the outlet of the Fulda basin have been simulated with a distributed hydrological model (SWAT) that is forced by predicted climate variables of a regional climate model for Germany (REMO). From the simulated future daily time series, annual maximum (extreme) values are computed and analyzed for the purpose of risk evaluation. Although the 21st-century estimated extreme flood series of the Fulda river turn out to be only mildly non-stationary, alleviating the need for further action and concern at first sight, a more detailed analysis of the risk, as quantified, for example, by the return period, shows non-negligible differences in the calculated risk levels.
This could be verified by employing a new method, the so-called flood series maximum analysis (FSMA) method, which consists of the stochastic simulation of numerous trajectories of a stochastic process with a given GEV distribution over a certain length of time (larger than the desired return period). Then the maximum value for each trajectory is computed, and all of these maxima are used to determine the empirical distribution of the maximum series. Through graphical inversion of this distribution function, the size of the design flood for a given risk (quantile) and given life duration can be inferred. The results of numerous simulations show that for stationary flood series, the new FSMA method results, expectedly, in nearly identical risk values as the classical FFA approach. However, once the flood time series becomes slightly non-stationary - for reasons as discussed - and regardless of whether the trend is increasing or decreasing, large differences in the computed risk values for a given design flood occur. In other words, for the same risk, the new FSMA method would lead to different values of the design flood for a hydraulic structure than the classical FFA method. This, in turn, could lead to some cost savings in the realization of a hydraulic project.
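The FSMA procedure described above lends itself to a compact simulation: draw many trajectories of annual maxima, take each trajectory's maximum, and invert the empirical distribution of those maxima. The sketch below is a minimal illustration under the stationary case, not the authors' implementation; the GEV parameters, trajectory count, and lifetime are hypothetical.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Hypothetical GEV parameters for annual maximum discharge (m^3/s);
# note scipy's shape parameter c corresponds to -xi in the GEV convention
shape, loc, scale = -0.1, 500.0, 120.0

n_traj = 10_000   # number of simulated trajectories
n_years = 200     # trajectory length, larger than the desired return period

# Simulate annual maxima and take the maximum of each trajectory
annual_maxima = genextreme.rvs(shape, loc=loc, scale=scale,
                               size=(n_traj, n_years), random_state=rng)
series_maxima = annual_maxima.max(axis=1)

# Empirical distribution of the trajectory maxima; invert it to read off
# the design flood for a given risk (quantile) over the structure's lifetime
risk = 0.10  # accept a 10% chance of exceedance during the lifetime
design_flood = np.quantile(series_maxima, 1.0 - risk)
print(f"FSMA design flood for {risk:.0%} lifetime risk: {design_flood:.0f} m^3/s")
```

A non-stationary variant would let `loc` or `scale` drift over the simulated years, which is exactly where FSMA and classical FFA begin to diverge.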

  3. A potential gender bias in assessing quality of life - a standard gamble experiment among university students.

    PubMed

    Obaidi, Leath Al; Mahlich, Jörg

    2015-01-01

    There are several methodologies that can be used for evaluating patients' perception of their quality of life. Most commonly, utilities are directly elicited by means of either the time-trade-off or the standard-gamble method. In both methods, risk attitudes determine the quality of life values. Quality of life values among 31 Austrian undergraduate students were elicited by means of the standard gamble approach. The impact of several variables such as gender, side job, length of study, and living arrangements on the quality of life were identified using different types of regression techniques (ordinary least squares, generalized linear model, Betafit). Significant evidence was found that females are associated with a higher quality of life in all specifications of our estimations. The observed gender differences in quality of life can be attributed to a higher degree of risk aversion of women. A higher risk aversion leads to a higher valuation of given health states and a potential gender bias in health economic evaluations. This result could have implications for health policy planners when it comes to budget allocation decisions.

  4. Computed Tomography Angiography Evaluation of Risk Factors for Unstable Intracranial Aneurysms.

    PubMed

    Wang, Guang-Xian; Gong, Ming-Fu; Wen, Li; Liu, Lan-Lan; Yin, Jin-Bo; Duan, Chun-Mei; Zhang, Dong

    2018-03-19

    To evaluate risk factors for instability in intracranial aneurysms (IAs) using computed tomography angiography (CTA). A total of 614 consecutive patients diagnosed with 661 IAs between August 2011 and February 2016 were reviewed. Patients and IAs were divided into stable and unstable groups. Along with clinical characteristics, IA characteristics were evaluated by CTA. Multiple logistic regression analysis was used to identify the independent risk factors associated with unstable IAs. Receiver operating characteristic (ROC) curve analysis was performed on the final model, and optimal thresholds were obtained. Patient age (odds ratio [OR], 0.946), cerebral atherosclerosis (CA; OR, 0.525), and IAs located at the middle cerebral artery (OR, 0.473) or internal carotid artery (OR, 0.512) were negatively correlated with instability, whereas IAs with irregular shape (OR, 2.157), deep depth (OR, 1.557), or large flow angle (FA; OR, 1.015) were more likely to be unstable. ROC analysis revealed threshold values of age, depth, and FA of 59.5 years, 4.25 mm, and 87.8°, respectively. The stability of IAs is significantly affected by several factors, including patient age and the presence of CA. IA shape and location also have an impact on the stability of IAs. An irregular shape, a deep depth, and a large FA are risk factors for a change in IAs from stable to unstable. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Drawing the line on the sand

    NASA Astrophysics Data System (ADS)

    Ranasinghe, R.; Jongejan, R.; Wainwright, D.; Callaghan, D. P.

    2016-02-01

    Up to 70% of the world's sandy coastlines are eroding, resulting in gradual and continuous coastline recession. The rate of coastline recession is likely to increase due to the projected impacts of climate change on mean sea levels, offshore wave climate and storm surges. At the same time, rapid development in the world's coastal zones continues to increase potential damages, while often reducing the resilience of coastal systems. The risks associated with coastline recession are thus likely to increase over the coming decades, unless effective risk management plans are put in place. Land-use restrictions are a key component of coastal zone risk management plans. These involve the use of coastal setback lines, which are mainly established by linearly adding the impacts of storms, recession due to sea level rise, and ambient long-term trends in shoreline evolution. This approach neither differentiates between uncertainties that develop differently over time, nor takes into account the value and lifetime of property developments. Both shortcomings could entail considerable social cost. For balancing risk and reward, probabilistic estimates of coastline recession are a pre-requisite. Yet the presently adopted deterministic methods for establishing setback lines are unable to provide such estimates. Here, we present a quantitative risk analysis (QRA) model, underpinned by a multi-scale, physics-based coastal recession model capable of providing time-dependent risk estimates. The modelling approach presented enables the determination of setback lines in terms of exceedance probabilities, a quantity that directly feeds into risk evaluations and economic optimizations. As a demonstration, the risk-informed approach is applied to Narrabeen beach, Sydney, Australia.

  6. An Asian validation of the TIMI risk score for ST-segment elevation myocardial infarction.

    PubMed

    Selvarajah, Sharmini; Fong, Alan Yean Yip; Selvaraj, Gunavathy; Haniff, Jamaiyah; Uiterwaal, Cuno S P M; Bots, Michiel L

    2012-01-01

    Risk stratification in ST-elevation myocardial infarction (STEMI) is important, such that the most resource intensive strategy is used to achieve the greatest clinical benefit. This is essential in developing countries with wide variation in health care facilities, scarce resources and increasing burden of cardiovascular diseases. This study sought to validate the Thrombolysis In Myocardial Infarction (TIMI) risk score for STEMI in a multi-ethnic developing country. Data from a national, prospective, observational registry of acute coronary syndromes was used. The TIMI risk score was evaluated in 4701 patients who presented with STEMI. Model discrimination and calibration were tested in the overall population and in subgroups of patients that were at higher risk of mortality; i.e., diabetics and those with renal impairment. Compared to the TIMI population, this study population was younger, had more chronic conditions, more severe index events and received treatment later. The TIMI risk score was strongly associated with 30-day mortality. Discrimination was good for the overall study population (c statistic 0.785) and in the high risk subgroups; diabetics (c statistic 0.764) and renal impairment (c statistic 0.761). Calibration was good for the overall study population and diabetics, with χ2 goodness of fit test p values of 0.936 and 0.983, respectively, but poor for those with renal impairment, with a χ2 goodness of fit test p value of 0.006. The TIMI risk score is valid and can be used for risk stratification of STEMI patients for better targeted treatment.
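Model discrimination of the kind reported here is summarized by the c statistic: the probability that a randomly chosen event receives a higher predicted risk than a randomly chosen non-event. A minimal sketch of computing it directly from scores and outcomes; the integer TIMI-style scores and the score-to-mortality relation below are invented for illustration, not the registry's data.

```python
import numpy as np

def c_statistic(risk, event):
    """Concordance (c) statistic: probability that a randomly chosen
    event has a higher predicted risk than a randomly chosen
    non-event, counting ties as 1/2."""
    r_event, r_nonevent = risk[event], risk[~event]
    diffs = r_event[:, None] - r_nonevent[None, :]
    return (diffs > 0).mean() + 0.5 * (diffs == 0).mean()

# Hypothetical data: integer risk scores and simulated 30-day mortality
rng = np.random.default_rng(3)
score = rng.integers(0, 14, size=2000)
death = rng.random(2000) < (0.01 + 0.02 * score)  # mortality rises with score

c = c_statistic(score, death)
print(f"c statistic: {c:.3f}")
```

For large cohorts a rank-based formulation (equivalent to the Mann-Whitney U statistic) avoids the pairwise matrix.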

  7. Value of Information Analysis Applied to the Economic Evaluation of Interventions Aimed at Reducing Juvenile Delinquency: An Illustration.

    PubMed

    Eeren, Hester V; Schawo, Saskia J; Scholte, Ron H J; Busschbach, Jan J V; Hakkaart, Leona

    2015-01-01

    To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that demonstrated the comparison of two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal activity free year. At a societal willingness-to-pay of €71,700 per criminal activity free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million in reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted according to health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform the reimbursement decision in regard to interventions for juvenile delinquents.

  8. Next steps in the development of ecological soil clean-up values for metals.

    PubMed

    Wentsel, Randall; Fairbrother, Anne

    2014-07-01

    This special series in Integrated Environmental Assessment and Management presents the results from 6 workgroups that were formed at the workshop on Ecological Soil Levels-Next Steps in the Development of Metal Clean-Up Values (17-21 September 2012, Sundance, Utah). This introductory article presents an overview of the issues assessors face when conducting risk assessments for metals in soils, summarizes key US Environmental Protection Agency (USEPA) documents on metals risk assessment, and discusses the importance of leveraging recent major terrestrial research projects, primarily conducted to address Registration, Evaluation, Authorization and Restriction of Chemical Substances (REACH) requirements in Europe, that have significantly advanced our understanding of the behavior and toxicity of metals in soils. These projects developed large data sets that are useful for the risk assessment of metals in soil environments. The workshop attendees met to work toward developing a process for establishing ecological soil clean-up values (Eco-SCVs). The goal of the workshop was to progress from ecological soil screening values (Eco-SSLs) to final clean-up values by providing regulators with the methods and processes to incorporate bioavailability, normalize toxicity thresholds, address food-web issues, and incorporate background concentrations. The REACH data sets were used by workshop participants as case studies in the development of the ecological standards for soils. The workshop attendees discussed scientific advancements in bioavailability, soil biota and wildlife case studies, soil processes, and food-chain modeling. In addition, one of the workgroups discussed the processes needed to frame the topics to gain regulatory acceptance as a directive or guidance by Canada, the USEPA, or the United States. © 2013 SETAC.

  9. Classification models for identification of at-risk groups for incident memory complaints.

    PubMed

    van den Kommer, Tessa N; Comijs, Hannie C; Rijs, Kelly J; Heymans, Martijn W; van Boxtel, Martin P J; Deeg, Dorly J H

    2014-02-01

    Memory complaints in older adults may be a precursor of measurable cognitive decline. Causes for these complaints may vary across age groups. The goal of this study was to develop classification models for the early identification of persons at risk for memory complaints using a broad range of characteristics. Two age groups were studied, 55-65 years old (N = 1,416) and 65-75 years old (N = 471), using data from the Longitudinal Aging Study Amsterdam. Participants reporting memory complaints at baseline were excluded. Data on predictors of memory complaints were collected at baseline and analyzed using logistic regression analyses. Multiple imputation was applied to handle the missing data; missing data due to mortality were not imputed. In persons aged 55-65 years, 14.4% reported memory complaints after three years of follow-up. Persons using medication, who were former smokers and had insufficient/poor hearing, were at the highest risk of developing memory complaints, i.e., a predictive value of 33.3%. In persons 65-75 years old, the incidence of memory complaints was 22.5%. Persons with a low sense of mastery, who reported having pain, were at the highest risk of memory complaints resulting in a final predictive value of 56.9%. In the subsample of persons without a low sense of mastery who (almost) never visited organizations and had a low level of memory performance, 46.8% reported memory complaints at follow-up. The classification models led to the identification of specific target groups at risk for memory complaints. Suggestions for person-tailored interventions may be based on these risk profiles.

  10. [Prognostic value of JAK2, MPL and CALR mutations in Chinese patients with primary myelofibrosis].

    PubMed

    Xu, Z F; Li, B; Liu, J Q; Li, Y; Ai, X F; Zhang, P H; Qin, T J; Zhang, Y; Wang, J Y; Xu, J Q; Zhang, H L; Fang, L W; Pan, L J; Hu, N B; Qu, S Q; Xiao, Z J

    2016-07-01

    To evaluate the prognostic value of JAK2, MPL and CALR mutations in Chinese patients with primary myelofibrosis (PMF). Four hundred and two Chinese patients with PMF were retrospectively analyzed. The Kaplan-Meier method, the Log-rank test, the likelihood ratio test and the Cox proportional hazards regression model were used to evaluate the prognostic scoring system. This cohort of patients included 209 males and 193 females with a median age of 55 years (range: 15-89). JAK2V617F mutations were detected in 189 subjects (47.0%), MPLW515 mutations in 13 (3.2%) and CALR mutations in 81 (20.1%) [30 (37.0%) type-1, 48 (59.3%) type-2 and 3 (3.7%) less common CALR mutations], respectively. 119 subjects (29.6%) had no detectable mutation in JAK2, MPL or CALR. Univariate analysis indicated that patients with CALR type-2 mutations or no detectable mutations had inferior survival compared to those with JAK2, MPL, CALR type-1 or other less common CALR mutations (median survival 74 vs 168 months, respectively) [HR 2.990 (95% CI 1.935-4.619), P<0.001]. Therefore, patients were categorized as high-risk (CALR type-2 mutations or no detectable driver mutations) or low-risk (all other mutation statuses). The DIPSS-Chinese molecular prognostic model was proposed by adopting these mutation categories and the DIPSS-Chinese risk group. The median survival of patients classified as low risk (132 subjects, 32.8%), intermediate-1 risk (143 subjects, 35.6%), intermediate-2 risk (106 subjects, 26.4%) and high risk (21 subjects, 5.2%) was not reached, 156 (95% CI 117-194), 60 (95% CI 28-91) and 22 (95% CI 10-33) months, respectively, and there was a statistically significant difference in overall survival among the four risk groups (P<0.001). 
There was significantly higher predictive power for survival with the DIPSS-Chinese molecular prognostic model compared with the DIPSS-Chinese model (P=0.005; -2 log-likelihood ratios of 855.6 and 869.7, respectively). The impact of CALR type-2 mutations or no detectable driver mutation on survival was independent of current prognostic scoring systems. The DIPSS-Chinese molecular prognostic model, based on the molecular features of Chinese patients, was proposed and worked well for prognostic indication.

  11. Influence of exposure time on toxicity-An overview.

    PubMed

    Connell, Des W; Yu, Qiming J; Verma, Vibha

    2016-04-29

    Data on the toxicity of chemicals are usually reported as the LD50 or LC50, together with the exposure time used in experimental laboratory testing. However, exposure time is generally not treated as a quantifiable variable whose importance for expressed toxicity can be evaluated; instead it is described in general terms such as acute, chronic and so on. For the last hundred years Haber's Rule has been successfully used to extrapolate from reported exposure times to other exposure times which may be needed for setting standards, health risk assessments and other applications. But it has limitations, particularly in environmental applications where exposure levels are low and exposure times are relatively long. The Reduced Life Expectancy (RLE) model overcomes these problems and can be utilised under all exposure conditions. It can be expressed as ln(LT50) = -a(LC50)^ν + b, where the constants ν, a and b can be evaluated by fitting the model to experimental data on the LC50 and corresponding LT50, together with the Normal Life Expectancy (NLE) of the organism being considered as a data point where the LC50 is zero. The constant ν at a value of unity gives a linear relationship, and where ν < 1 the relationship has a concave shape. In our extensive evaluations of the RLE model for fish, invertebrates and mammals, involving 115 data sets and a wide range of organic and inorganic toxicants, the RLE model gave correlation coefficients of >0.8 with 107 sets of data. The RLE model can be used to extrapolate from a limited data set on exposure times and corresponding LT50 values to any exposure time and corresponding LT50 value. The discrepancy between Haber's Rule and the RLE model increases as the exposure time increases. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
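The RLE relation ln(LT50) = -a(LC50)^ν + b can be fitted to paired LC50/LT50 observations by nonlinear least squares, with the NLE entering as the LC50 = 0 data point exactly as the abstract describes. A minimal sketch with entirely hypothetical data (not the 115 data sets evaluated in the study):

```python
import numpy as np
from scipy.optimize import curve_fit

# Reduced Life Expectancy model: ln(LT50) = -a * (LC50)^v + b
def rle(lc50, a, v, b):
    return -a * lc50**v + b

# Hypothetical paired (LC50, LT50) observations; the first point is the
# organism's Normal Life Expectancy (NLE), treated as the datum at LC50 = 0
lc50 = np.array([0.0, 2.0, 5.0, 10.0, 20.0])        # toxicant concentration
lt50 = np.array([700.0, 300.0, 150.0, 60.0, 20.0])  # median time to death (h)

params, _ = curve_fit(rle, lc50, np.log(lt50), p0=[0.2, 1.0, 6.5],
                      bounds=([0.01, 0.1, 0.0], [10.0, 2.0, 20.0]))
a, v, b = params

# Extrapolate: predicted LT50 at an exposure concentration between data points
lt50_pred = np.exp(rle(15.0, a, v, b))
print(f"fitted a={a:.3f}, v={v:.3f}, b={b:.3f}; LT50(15) = {lt50_pred:.1f} h")
```

A fitted ν below 1 reproduces the concave shape noted in the abstract; ν = 1 recovers the linear case.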

  12. Modelling the influence of predicted future climate change on the risk of wind damage within New Zealand's planted forests.

    PubMed

    Moore, John R; Watt, Michael S

    2015-08-01

    Wind is the major abiotic disturbance in New Zealand's planted forests, but little is known about how the risk of wind damage may be affected by future climate change. We linked a mechanistic wind damage model (ForestGALES) to an empirical growth model for radiata pine (Pinus radiata D. Don) and a process-based growth model (cenw) to predict the risk of wind damage under different future emissions scenarios and assumptions about the future wind climate. The cenw model was used to estimate site productivity for constant CO2 concentration at 1990 values and for assumed increases in CO2 concentration from current values to those expected during 2040 and 2090 under the B1 (low), A1B (mid-range) and A2 (high) emission scenarios. Stand development was modelled for different levels of site productivity, contrasting silvicultural regimes and sites across New Zealand. The risk of wind damage was predicted for each regime and emission scenario combination using the ForestGALES model. The sensitivity to changes in the intensity of the future wind climate was also examined. Results showed that increased tree growth rates under the different emissions scenarios had the greatest impact on the risk of wind damage. The increase in risk was greatest for stands growing at high stand density under the A2 emissions scenario with increased CO2 concentration. The increased productivity under this scenario resulted in increased tree height, without a corresponding increase in diameter, leading to more slender trees that were predicted to be at greater risk from wind damage. The risk of wind damage was further increased by the modest increases in the extreme wind climate that are predicted to occur. These results have implications for the development of silvicultural regimes that are resilient to climate change and also indicate that future productivity gains may be offset by greater losses from disturbances. © 2015 John Wiley & Sons Ltd.

  13. Building a Values-Informed Mental Model for New Orleans Climate Risk Management.

    PubMed

    Bessette, Douglas L; Mayer, Lauren A; Cwik, Bryan; Vezér, Martin; Keller, Klaus; Lempert, Robert J; Tuana, Nancy

    2017-10-01

    Individuals use values to frame their beliefs and simplify their understanding when confronted with complex and uncertain situations. The high complexity and deep uncertainty involved in climate risk management (CRM) lead to individuals' values likely being coupled to and contributing to their understanding of specific climate risk factors and management strategies. Most mental model approaches, however, which are commonly used to inform our understanding of people's beliefs, ignore values. In response, we developed a "Values-informed Mental Model" research approach, or ViMM, to elicit individuals' values alongside their beliefs and determine which values people use to understand and assess specific climate risk factors and CRM strategies. Our results show that participants consistently used one of three values to frame their understanding of risk factors and CRM strategies in New Orleans: (1) fostering a healthy economy, wealth, and job creation, (2) protecting and promoting healthy ecosystems and biodiversity, and (3) preserving New Orleans' unique culture, traditions, and historically significant neighborhoods. While the first value frame is common in analyses of CRM strategies, the latter two are often ignored, despite their mirroring commonly accepted pillars of sustainability. Other values like distributive justice and fairness were prioritized differently depending on the risk factor or strategy being discussed. These results suggest that the ViMM method could be a critical first step in CRM decision-support processes and may encourage adoption of CRM strategies more in line with stakeholders' values. © 2017 Society for Risk Analysis.

  14. Application of a fall screening algorithm stratified fall risk but missed preventive opportunities in community-dwelling older adults: a prospective study.

    PubMed

    Muir, Susan W; Berg, Katherine; Chesworth, Bert; Klar, Neil; Speechley, Mark

    2010-01-01

    Evaluate the ability of the American and British Geriatrics Society fall prevention guideline's screening algorithm to identify and stratify future fall risk in community-dwelling older adults. Prospective cohort of community-dwelling older adults (n = 117) aged 65 to 90 years. Fall history, balance, and gait measured during a comprehensive geriatric assessment at baseline. Falls data were collected monthly for 1 year. The outcomes of any fall and any injurious fall were evaluated. The algorithm stratified participants into 4 hierarchal risk categories. Fall risk was 33% and 68% for the "no intervention" and "comprehensive fall evaluation required" groups, respectively. The relative risk estimate for falling comparing participants in the 2 intervention groups was 2.08 (95% CI 1.42-3.05) for any fall and 2.60 (95% CI 1.53-4.42) for any injurious fall. Prognostic accuracy values were: sensitivity of 0.50 (95% CI 0.36-0.64) and specificity of 0.82 (95% CI 0.70-0.90) for any fall; and sensitivity of 0.56 (95% CI 0.38-0.72) and specificity of 0.78 (95% CI 0.67-0.86) for any injurious fall. The algorithm was able to identify and stratify fall risk for each fall outcome, though the values of prognostic accuracy demonstrate moderate clinical utility. The recommendations of fall evaluation for individuals in the highest risk groups appear supported, though the recommendation of no intervention in the lowest risk groups may not address their needs for fall prevention interventions. Further evaluation of the algorithm is recommended to refine the identification of fall risk in community-dwelling older adults.

  15. Automated identification and predictive tools to help identify high-risk heart failure patients: pilot evaluation.

    PubMed

    Evans, R Scott; Benuzillo, Jose; Horne, Benjamin D; Lloyd, James F; Bradshaw, Alejandra; Budge, Deborah; Rasmusson, Kismet D; Roberts, Colleen; Buckway, Jason; Geer, Norma; Garrett, Teresa; Lappé, Donald L

    2016-09-01

    Develop and evaluate an automated identification and predictive risk report for hospitalized heart failure (HF) patients. Dictated free-text reports from the previous 24 h were analyzed each day with natural language processing (NLP), to help improve the early identification of hospitalized patients with HF. A second application that uses an Intermountain Healthcare-developed predictive score to determine each HF patient's risk for 30-day hospital readmission and 30-day mortality was also developed. That information was included in an identification and predictive risk report, which was evaluated at a 354-bed hospital that treats high-risk HF patients. The addition of NLP-identified HF patients increased the identification score's sensitivity from 82.6% to 95.3% and its specificity from 82.7% to 97.5%, and the model's positive predictive value was 97.45%. Daily multidisciplinary discharge planning meetings are now based on the information provided by the HF identification and predictive report, and clinicians' review of potential HF admissions takes less time compared to the previously used manual methodology (10 vs 40 min). An evaluation of the use of the HF predictive report identified a significant reduction in 30-day mortality and a significant increase in patient discharges to home care instead of to a specialized nursing facility. Using clinical decision support to help identify HF patients and automatically calculating their 30-day all-cause readmission and 30-day mortality risks, coupled with a multidisciplinary care process pathway, was found to be an effective process to improve HF patient identification, significantly reduce 30-day mortality, and significantly increase patient discharges to home care. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Using physiologically based pharmacokinetic modeling and benchmark dose methods to derive an occupational exposure limit for N-methylpyrrolidone.

    PubMed

    Poet, T S; Schlosser, P M; Rodriguez, C E; Parod, R J; Rodwell, D E; Kirman, C R

    2016-04-01

    The developmental effects of NMP are well studied in Sprague-Dawley rats following oral, inhalation, and dermal routes of exposure. Short-term and chronic occupational exposure limit (OEL) values were derived using an updated physiologically based pharmacokinetic (PBPK) model for NMP, along with benchmark dose modeling. Two suitable developmental endpoints were evaluated for human health risk assessment: (1) for acute exposures, the increased incidence of skeletal malformations, an effect noted only at oral doses that were toxic to the dam and fetus; and (2) for repeated exposures to NMP, changes in fetal/pup body weight. Where possible, data from multiple studies were pooled to increase the predictive power of the dose-response data sets. For the purposes of internal dose estimation, the window of susceptibility was estimated for each endpoint, and was used in the dose-response modeling. A point of departure value of 390 mg/L (in terms of peak NMP in blood) was calculated for skeletal malformations based on pooled data from oral and inhalation studies. Acceptable dose-response model fits were not obtained using the pooled data for fetal/pup body weight changes. These data sets were also assessed individually, from which the geometric mean value obtained from the inhalation studies (470 mg*hr/L), was used to derive the chronic OEL. A PBPK model for NMP in humans was used to calculate human equivalent concentrations corresponding to the internal dose point of departure values. Application of a net uncertainty factor of 20-21, which incorporates data-derived extrapolation factors, to the point of departure values yields short-term and chronic occupational exposure limit values of 86 and 24 ppm, respectively. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Evaluation of Two Approaches to Defining Extinction Risk under the U.S. Endangered Species Act.

    PubMed

    Thompson, Grant G; Maguire, Lynn A; Regan, Tracey J

    2018-05-01

    The predominant definition of extinction risk in conservation biology involves evaluating the cumulative distribution function (CDF) of extinction time at a particular point (the "time horizon"). Using the principles of decision theory, this article develops an alternative definition of extinction risk as the expected loss (EL) to society resulting from eventual extinction of a species. Distinct roles are identified for time preference and risk aversion. Ranges of tentative values for the parameters of the two approaches are proposed, and the performances of the two approaches are compared and contrasted for a small set of real-world species with published extinction time distributions and a large set of hypothetical extinction time distributions. Potential issues with each approach are evaluated, and the EL approach is recommended as the better of the two. The CDF approach suffers from the fact that extinctions that occur at any time before the specified time horizon are weighted equally, while extinctions that occur beyond the specified time horizon receive no weight at all. It also suffers from the fact that the time horizon does not correspond to any natural phenomenon, and so is impossible to specify nonarbitrarily; yet the results can depend critically on the specified value. In contrast, the EL approach has the advantage of weighting extinction time continuously, with no artificial time horizon, and the parameters of the approach (the rates of time preference and risk aversion) do correspond to natural phenomena, and so can be specified nonarbitrarily. © 2017 Society for Risk Analysis.
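The contrast between the two definitions can be made concrete with a toy calculation: the CDF approach evaluates P(extinction before an arbitrary horizon), while the EL approach integrates a discounted loss over the whole extinction-time distribution. Everything below is hypothetical (the lognormal extinction-time distribution, the discount rate, and the normalized loss), and risk aversion, which would enter as a curvature on the loss, is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical extinction-time distribution (years from now)
ext_time = rng.lognormal(mean=4.5, sigma=0.8, size=100_000)

# CDF approach: risk = P(extinction before an arbitrary time horizon)
horizon = 100.0
risk_cdf = (ext_time <= horizon).mean()

# EL approach: expected present value of the loss from eventual extinction,
# discounted at a rate of pure time preference (delta, hypothetical)
delta = 0.02
loss_at_extinction = 1.0  # societal loss, normalized to 1
risk_el = np.mean(loss_at_extinction * np.exp(-delta * ext_time))

print(f"CDF risk at {horizon:.0f}-yr horizon: {risk_cdf:.3f}; "
      f"expected loss: {risk_el:.3f}")
```

Note how the CDF measure gives zero weight to any extinction just beyond the horizon, whereas the exponential discount weights extinction time continuously, which is the article's central argument for the EL approach.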

  18. Number of Salmonella on chicken breast filet at retail level and its implications for public health risk.

    PubMed

    Straver, J M; Janssen, A F W; Linnemann, A R; van Boekel, M A J S; Beumer, R R; Zwietering, M H

    2007-09-01

    This study aimed to characterize the number of Salmonella on chicken breast filet at the retail level and to evaluate whether this number affects the risk of salmonellosis. From October to December 2005, 220 chilled raw filets (without skin) were collected from five local retail outlets in The Netherlands. Filet rinses that were positive after enrichment were enumerated with a three-tube most-probable-number (MPN) assay. Nineteen filets (8.6%) were contaminated above the detection limit of the MPN method (10 Salmonella per filet). The number of Salmonella on positive filets varied from 1 to 3.81 log MPN per filet. The obtained enumeration data were applied in a risk assessment model. The model considered possible growth during domestic storage, cross-contamination from filet via a cutting board to lettuce, and possible illness due to consumption of the prepared lettuce. A screening analysis with expected-case and worst-case estimates for the input values of the model showed that variability in the inputs was of relevance. Therefore, a Monte Carlo simulation with probability distributions for the inputs was carried out to predict the annual number of illnesses. Remarkably, over two-thirds of annual predicted illnesses were caused by the small fraction of filets containing more than 3 log Salmonella at retail (0.8% of all filets). The enumeration results can be used to confirm this hypothesis in a more elaborate risk assessment. Modeling of the supply chain can provide insight for possible intervention strategies to reduce the incidence of rare but extremely high contamination levels. Reduction seems feasible within current practices, because the retail market study indicated a significant difference between suppliers.
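A stripped-down version of such a Monte Carlo cross-contamination model might look as follows. Every input distribution and the dose-response parameter are illustrative placeholders, not the paper's fitted values; the point is only to show how contamination level, storage growth, transfer, and dose-response chain together, and how the illness burden can be attributed to high-count filets.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # simulated filet servings

# Hypothetical inputs (placeholders, not the paper's fitted distributions)
prevalence = 0.086                       # fraction of contaminated filets
log_count = rng.normal(1.5, 0.9, n)      # log10 Salmonella per contaminated filet
growth = rng.uniform(0.0, 1.0, n)        # log10 growth during domestic storage
transfer = 10 ** rng.uniform(-3, -1, n)  # fraction transferred via board to lettuce

# Ingested dose: zero for uncontaminated filets
dose = np.where(rng.random(n) < prevalence,
                10 ** (log_count + growth) * transfer, 0.0)

# Exponential dose-response: P(ill) = 1 - exp(-r * dose), r hypothetical
r = 2.5e-3
p_ill = 1.0 - np.exp(-r * dose)
illnesses = rng.random(n) < p_ill

# Share of the expected illness burden from filets above 3 log at retail
high = (dose > 0) & (log_count > 3.0)
share = p_ill[high].sum() / p_ill.sum()

print(f"risk per serving: {illnesses.mean():.2e}; "
      f"share from >3 log filets: {share:.0%}")
```

With heavy-tailed contamination levels, a small fraction of high-count filets dominates `share`, mirroring the abstract's two-thirds finding.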

  19. Tailoring the implementation of new biomarkers based on their added predictive value in subgroups of individuals.

    PubMed

    van Giessen, A; Moons, K G M; de Wit, G A; Verschuren, W M M; Boer, J M A; Koffijberg, H

    2015-01-01

The value of new biomarkers or imaging tests, when added to a prediction model, is currently evaluated using reclassification measures, such as the net reclassification improvement (NRI). However, these measures only provide an estimate of improved reclassification at population level. We present a straightforward approach to characterize subgroups of reclassified individuals in order to tailor implementation of a new prediction model to individuals expected to benefit from it. In a large Dutch population cohort (n = 21,992) we classified individuals into low (< 5%) and high (≥ 5%) fatal cardiovascular disease risk by the Framingham risk score (FRS) and reclassified them based on the systematic coronary risk evaluation (SCORE). Subsequently, we characterized the reclassified individuals and, in case of heterogeneity, applied cluster analysis to identify and characterize subgroups. These characterizations were used to select individuals expected to benefit from implementation of SCORE. Reclassification after applying SCORE in all individuals resulted in an NRI of 5.00% (95% CI [-0.53%; 11.50%]) within the events, 0.06% (95% CI [-0.08%; 0.22%]) within the nonevents, and a total NRI of 0.051 (95% CI [-0.004; 0.116]). Among the correctly downward reclassified individuals cluster analysis identified three subgroups. Using the characterizations of the typically correctly reclassified individuals, implementing SCORE only in individuals expected to benefit (n = 2,707, 12.3%) improved the NRI to 5.32% (95% CI [-0.13%; 12.06%]) within the events, 0.24% (95% CI [0.10%; 0.36%]) within the nonevents, and a total NRI of 0.055 (95% CI [0.001; 0.123]). Overall, the risk levels for individuals reclassified by tailored implementation of SCORE were more accurate. In our empirical example the presented approach successfully characterized subgroups of reclassified individuals that could be used to improve reclassification and reduce implementation burden.
In particular, when newly added biomarkers or imaging tests are costly or burdensome, such a tailored implementation strategy may save resources and improve (cost-)effectiveness.
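The population-level categorical NRI that motivates this subgroup approach can be computed in a few lines; the low/high classifications below are toy data at a 5% threshold, not the cohort's.

```python
# Categorical NRI: (fraction moving up - fraction moving down) among events,
# plus (fraction moving down - fraction moving up) among nonevents.
def moves(old_cat, new_cat):
    up = sum(1 for o, n in zip(old_cat, new_cat) if n > o)
    down = sum(1 for o, n in zip(old_cat, new_cat) if n < o)
    return (up - down) / len(old_cat)

# 0 = low risk (<5%), 1 = high risk (>=5%); invented toy classifications
events_old, events_new = [0, 0, 1, 1, 0], [1, 0, 1, 1, 1]              # two events moved up
nonevents_old, nonevents_new = [1, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1]  # two down, one up

nri_events = moves(events_old, events_new)            # upward moves help events
nri_nonevents = -moves(nonevents_old, nonevents_new)  # downward moves help nonevents
total_nri = nri_events + nri_nonevents
print(nri_events, nri_nonevents, total_nri)
```

Restricting the new model to subgroups that tend to be reclassified correctly raises exactly these components, which is the improvement the abstract reports.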

  20. Predicting Future Suicide Attempts Among Adolescent and Emerging Adult Psychiatric Emergency Patients

    PubMed Central

    Horwitz, Adam G.; Czyz, Ewa K.; King, Cheryl A.

    2014-01-01

    Objective The purpose of this study was to longitudinally examine specific characteristics of suicidal ideation in combination with histories of suicide attempts and non-suicidal self-injury (NSSI) to best evaluate risk for a future attempt among high-risk adolescents and emerging adults. Method Participants in this retrospective medical record review study were 473 (53% female; 69% Caucasian) consecutive patients, ages 15–24 years (M = 19.4 years) who presented for psychiatric emergency (PE) services during a 9-month period. These patients’ medical records, including a clinician-administered Columbia-Suicide Severity Rating Scale, were coded at the index visit and at future visits occurring within the next 18 months. Logistic regression models were used to predict suicide attempts during this period. Results SES, suicidal ideation severity (i.e., intent, method), suicidal ideation intensity (i.e., frequency, controllability), a lifetime history of suicide attempt, and a lifetime history of NSSI were significant independent predictors of a future suicide attempt. Suicidal ideation added incremental validity to the prediction of future suicide attempts above and beyond the influence of a past suicide attempt, whereas a lifetime history of NSSI did not. Sex moderated the relationship between the duration of suicidal thoughts and future attempts (predictive for males, but not females). Conclusions Results suggest value in incorporating both past behaviors and current thoughts into suicide risk formulation. Furthermore, suicidal ideation duration warrants additional examination as a potential critical factor for screening assessments evaluating suicide risk among high-risk samples, particularly for males. PMID:24871489

  1. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at local scale is usually ignored altogether due to the difficulty in acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. Carrying out this framework requires three steps: (1) physically based stochastic landslide and flood models are used to calculate the probability of physical impact on individual elements at risk; (2) this is combined with a statistical analysis of the vulnerability and monetary value of the elements at risk, to include their uncertainty in the risk assessment; (3) the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys also needs to be studied.

  2. Validation of Risk Assessment Models of Venous Thromboembolism in Hospitalized Medical Patients.

    PubMed

    Greene, M Todd; Spyropoulos, Alex C; Chopra, Vineet; Grant, Paul J; Kaatz, Scott; Bernstein, Steven J; Flanders, Scott A

    2016-09-01

    Patients hospitalized for acute medical illness are at increased risk for venous thromboembolism. Although risk assessment is recommended and several at-admission risk assessment models have been developed, these have not been adequately derived or externally validated. Therefore, an optimal approach to evaluate venous thromboembolism risk in medical patients is not known. We conducted an external validation study of existing venous thromboembolism risk assessment models using data collected on 63,548 hospitalized medical patients as part of the Michigan Hospital Medicine Safety (HMS) Consortium. For each patient, cumulative venous thromboembolism risk scores and risk categories were calculated. Cox regression models were used to quantify the association between venous thromboembolism events and assigned risk categories. Model discrimination was assessed using Harrell's C-index. Venous thromboembolism incidence in hospitalized medical patients is low (1%). Although existing risk assessment models demonstrate good calibration (hazard ratios for "at-risk" range 2.97-3.59), model discrimination is generally poor for all risk assessment models (C-index range 0.58-0.64). The performance of several existing risk assessment models for predicting venous thromboembolism among acutely ill, hospitalized medical patients at admission is limited. Given the low venous thromboembolism incidence in this nonsurgical patient population, careful consideration of how best to utilize existing venous thromboembolism risk assessment models is necessary, and further development and validation of novel venous thromboembolism risk assessment models for this patient population may be warranted. Published by Elsevier Inc.
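As a reminder of what the discrimination statistic measures: with a binary, uncensored outcome, Harrell's C-index reduces to the fraction of (event, non-event) pairs in which the event received the higher risk score. The scores and outcomes below are toy values, not the HMS data.

```python
# Minimal pairwise concordance (C-index) for a binary, uncensored outcome.
def c_index(scores, events):
    """Fraction of (event, non-event) pairs where the event scored higher;
    ties count half."""
    concordant = ties = total = 0
    for si, ei in zip(scores, events):
        for sj, ej in zip(scores, events):
            if ei == 1 and ej == 0:
                total += 1
                if si > sj:
                    concordant += 1
                elif si == sj:
                    ties += 1
    return (concordant + 0.5 * ties) / total

scores = [0.9, 0.4, 0.5, 0.2, 0.6]  # toy predicted risks
events = [1,   0,   1,   0,   0]    # 1 = venous thromboembolism event
print(c_index(scores, events))      # 5 of 6 pairs concordant: 5/6
```

A value near 0.5 means the score barely separates events from non-events, which is why C-indexes of 0.58-0.64 are described as poor discrimination.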

  3. Electrolyte and Metabolic Disturbances in Ebola Patients during a Clinical Trial, Guinea, 2015

    PubMed Central

    Bah, Elhadj Ibrahima; Haba, Nyankoye; Delamou, Alexandre; Camara, Bienvenu Salim; Olivier, Kadio Jean-Jacques; De Clerck, Hilde; Nordenstedt, Helena; Semple, Malcolm G.; Van Herp, Michel; Buyze, Jozefien; De Crop, Maaike; Van Den Broucke, Steven; Lynen, Lutgarde; De Weggheleire, Anja

    2016-01-01

    By using data from a 2015 clinical trial on Ebola convalescent-phase plasma in Guinea, we assessed the prevalence of electrolyte and metabolic abnormalities at admission and their predictive value to stratify patients into risk groups. Patients underwent testing with a point-of-care device. We used logistic regression to construct a prognostic model and summarized the predictive value with the area under the receiver operating curve. Abnormalities were common among patients, particularly hypokalemia, hypocalcemia, hyponatremia, raised creatinine, high anion gap, and anemia. Besides age and PCR cycle threshold value, renal dysfunction, low calcium levels, and low hemoglobin levels were independently associated with increased risk for death. A prognostic model using all 5 factors was highly discriminatory (area under the receiver operating curve 0.95; 95% CI 0.90–0.99) and enabled the definition of risk criteria to guide targeted care. Most patients had a very low (<5%) or very high (>80%) risk for death. PMID:27869610

  4. Electrolyte and Metabolic Disturbances in Ebola Patients during a Clinical Trial, Guinea, 2015.

    PubMed

    van Griensven, Johan; Bah, Elhadj Ibrahima; Haba, Nyankoye; Delamou, Alexandre; Camara, Bienvenu Salim; Olivier, Kadio Jean-Jacques; De Clerck, Hilde; Nordenstedt, Helena; Semple, Malcolm G; Van Herp, Michel; Buyze, Jozefien; De Crop, Maaike; Van Den Broucke, Steven; Lynen, Lutgarde; De Weggheleire, Anja

    2016-12-01

    By using data from a 2015 clinical trial on Ebola convalescent-phase plasma in Guinea, we assessed the prevalence of electrolyte and metabolic abnormalities at admission and their predictive value to stratify patients into risk groups. Patients underwent testing with a point-of-care device. We used logistic regression to construct a prognostic model and summarized the predictive value with the area under the receiver operating curve. Abnormalities were common among patients, particularly hypokalemia, hypocalcemia, hyponatremia, raised creatinine, high anion gap, and anemia. Besides age and PCR cycle threshold value, renal dysfunction, low calcium levels, and low hemoglobin levels were independently associated with increased risk for death. A prognostic model using all 5 factors was highly discriminatory (area under the receiver operating curve 0.95; 95% CI 0.90-0.99) and enabled the definition of risk criteria to guide targeted care. Most patients had a very low (<5%) or very high (>80%) risk for death.

  5. Psychosocial predictors of cannabis use in adolescents at risk.

    PubMed

    Hüsler, Gebhard; Plancherel, Bernard; Werlen, Egon

    2005-09-01

    This research has tested a social disintegration model in conjunction with risk and protection factors that have the power to differentiate relative, weighted interactions among variables in different socially disintegrated groups. The model was tested in a cross-sectional sample of 1082 at-risk youth in Switzerland. Structural equation analyses show significant differences between the social disintegration (low, moderate, high) groups and gender, indicating that the model works differently for groups and for gender. For the highly disintegrated adolescents results clearly show that the risk factors (negative mood, peer network, delinquency) are more important than the protective factors (family relations, secure sense of self). Family relations lose all protective value against negative peer influence, but personal variables, such as secure self, gain protective power.

  6. Extended robust support vector machine based on financial risk minimization.

    PubMed

    Takeda, Akiko; Fujiwara, Shuhei; Kanamori, Takafumi

    2014-11-01

Financial risk measures have been used recently in machine learning. For example, the ν-support vector machine (ν-SVM) minimizes the conditional value at risk (CVaR) of the margin distribution. The measure is popular in finance because of its subadditivity property, but it is very sensitive to a few outliers in the tail of the distribution. We propose a new classification method, the extended robust SVM (ER-SVM), which minimizes an intermediate risk measure between the CVaR and the value at risk (VaR), with the expectation that the resulting model becomes less sensitive than ν-SVM to outliers. ER-SVM can be regarded as an extension of the robust SVM, which uses a truncated hinge loss. Numerical experiments suggest that ER-SVM can achieve better prediction performance with proper parameter settings.
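The distinction between the two tail measures can be seen in an empirical sketch: VaR is a quantile of the loss distribution, CVaR the mean of the tail beyond it, so a single outlier moves CVaR but not VaR. The loss sample and the quantile convention below are illustrative.

```python
# Empirical VaR and CVaR at level alpha for a sample of losses.
def var_cvar(losses, alpha=0.8):
    s = sorted(losses)
    k = int(alpha * len(s))       # index of the alpha-quantile (one convention)
    var = s[k]                    # value at risk: the alpha-quantile
    tail = s[k:]                  # worst (1 - alpha) share of losses
    cvar = sum(tail) / len(tail)  # conditional value at risk: tail mean
    return var, cvar

losses = [1, 2, 2, 3, 3, 4, 5, 6, 8, 20]   # one large outlier
var, cvar = var_cvar(losses, alpha=0.8)
print(var, cvar)   # the outlier pulls CVaR well above VaR
```

ER-SVM targets a measure between these two, trading some of CVaR's tractability for robustness against exactly this kind of tail outlier.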

  7. Quantitative Microbial Risk Assessment for Escherichia coli O157:H7 in Fresh-Cut Lettuce.

    PubMed

    Pang, Hao; Lambertini, Elisabetta; Buchanan, Robert L; Schaffner, Donald W; Pradhan, Abani K

    2017-02-01

    Leafy green vegetables, including lettuce, are recognized as potential vehicles for foodborne pathogens such as Escherichia coli O157:H7. Fresh-cut lettuce is potentially at high risk of causing foodborne illnesses, as it is generally consumed without cooking. Quantitative microbial risk assessments (QMRAs) are gaining more attention as an effective tool to assess and control potential risks associated with foodborne pathogens. This study developed a QMRA model for E. coli O157:H7 in fresh-cut lettuce and evaluated the effects of different potential intervention strategies on the reduction of public health risks. The fresh-cut lettuce production and supply chain was modeled from field production, with both irrigation water and soil as initial contamination sources, to consumption at home. The baseline model (with no interventions) predicted a mean probability of 1 illness per 10 million servings and a mean of 2,160 illness cases per year in the United States. All intervention strategies evaluated (chlorine, ultrasound and organic acid, irradiation, bacteriophage, and consumer washing) significantly reduced the estimated mean number of illness cases when compared with the baseline model prediction (from 11.4- to 17.9-fold reduction). Sensitivity analyses indicated that retail and home storage temperature were the most important factors affecting the predicted number of illness cases. The developed QMRA model provided a framework for estimating risk associated with consumption of E. coli O157:H7-contaminated fresh-cut lettuce and can guide the evaluation and development of intervention strategies aimed at reducing such risk.

  8. How to interpret a small increase in AUC with an additional risk prediction marker: decision analysis comes through.

    PubMed

    Baker, Stuart G; Schuit, Ewoud; Steyerberg, Ewout W; Pencina, Michael J; Vickers, Andrew; Vickers, Andew; Moons, Karel G M; Mol, Ben W J; Lindeman, Karen S

    2014-09-28

    An important question in the evaluation of an additional risk prediction marker is how to interpret a small increase in the area under the receiver operating characteristic curve (AUC). Many researchers believe that a change in AUC is a poor metric because it increases only slightly with the addition of a marker with a large odds ratio. Because it is not possible on purely statistical grounds to choose between the odds ratio and AUC, we invoke decision analysis, which incorporates costs and benefits. For example, a timely estimate of the risk of later non-elective operative delivery can help a woman in labor decide if she wants an early elective cesarean section to avoid greater complications from possible later non-elective operative delivery. A basic risk prediction model for later non-elective operative delivery involves only antepartum markers. Because adding intrapartum markers to this risk prediction model increases AUC by 0.02, we questioned whether this small improvement is worthwhile. A key decision-analytic quantity is the risk threshold, here the risk of later non-elective operative delivery at which a patient would be indifferent between an early elective cesarean section and usual care. For a range of risk thresholds, we found that an increase in the net benefit of risk prediction requires collecting intrapartum marker data on 68 to 124 women for every correct prediction of later non-elective operative delivery. Because data collection is non-invasive, this test tradeoff of 68 to 124 is clinically acceptable, indicating the value of adding intrapartum markers to the risk prediction model. Copyright © 2014 John Wiley & Sons, Ltd.
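The decision-analytic bookkeeping behind the test tradeoff can be sketched with hypothetical counts; the true/false positive numbers and the threshold below are invented for illustration, not taken from the obstetric data.

```python
# Net benefit at risk threshold p_t: credit each true positive, debit each
# false positive by the odds of the threshold.
def net_benefit(tp, fp, n, p_t):
    return tp / n - (fp / n) * (p_t / (1 - p_t))

n = 1000
p_t = 0.2  # risk at which a patient is indifferent between the two options

nb_base  = net_benefit(tp=60, fp=150, n=n, p_t=p_t)  # antepartum markers only
nb_added = net_benefit(tp=70, fp=160, n=n, p_t=p_t)  # plus intrapartum markers

delta_nb = nb_added - nb_base             # extra net true predictions per patient
tests_per_true_prediction = 1 / delta_nb  # the "test tradeoff"
print(round(delta_nb, 4), round(tests_per_true_prediction, 1))
```

If collecting the extra markers is cheap or non-invasive relative to one correct prediction, a tradeoff of this size is clinically acceptable, which is the paper's argument.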

  9. A Point System to Forecast Hepatocellular Carcinoma Risk Before and After Treatment Among Persons with Chronic Hepatitis C.

    PubMed

    Xing, Jian; Spradling, Philip R; Moorman, Anne C; Holmberg, Scott D; Teshale, Eyasu H; Rupp, Loralee B; Gordon, Stuart C; Lu, Mei; Boscarino, Joseph A; Schmidt, Mark A; Trinacty, Connie M; Xu, Fujie

    2017-11-01

Risk of hepatocellular carcinoma (HCC) may be difficult to determine in the clinical setting. We aimed to develop a scoring system to forecast HCC risk among patients with chronic hepatitis C. Using data from the Chronic Hepatitis Cohort Study collected during 2005-2014, we derived HCC risk scores for males and females using an extended Cox model with the aspartate aminotransferase-to-platelet ratio index (APRI) as a time-dependent variable and mean Kaplan-Meier survival functions from patient data at two study sites, and used data collected at two separate sites for external validation. For model calibration, we used the Greenwood-Nam-D'Agostino goodness-of-fit statistic to examine differences between predicted and observed risk. Of 12,469 patients (1628 with a history of sustained viral response [SVR]), 504 developed HCC; median follow-up was 6 years. Final predictors in the model included age, alcohol abuse, interferon-based treatment response, and APRI. Point values, ranging from -3 to 14 (males) and -3 to 12 (females), were established using hazard ratios of the predictors aligned with 1-, 3-, and 5-year Kaplan-Meier survival probabilities of HCC. Discriminatory capacity was high (c-index 0.82 males and 0.84 females), and external calibration demonstrated no differences between predicted and observed HCC risk for 1-, 3-, and 5-year forecasts among males (all p values >0.97) and for 3- and 5-year risk among females (all p values >0.87). This scoring system, based on age, alcohol abuse history, treatment response, and APRI, can be used to forecast up to a 5-year risk of HCC among hepatitis C patients before and after SVR.

  10. Assessment of hazards and risks for landscape protection planning in Sicily.

    PubMed

    La Rosa, Daniele; Martinico, Francesco

    2013-09-01

Landscape protection planning is a complex task that requires an integrated assessment and involves heterogeneous issues. These issues include not only the management of a considerable amount of data to describe landscape features but also the choice of appropriate tools to evaluate hazards and risks. The landscape assessment phase can provide fundamental information for the definition of a Landscape Protection Plan, in which the selection of norms for protection or rehabilitation is strictly related to the hazards, values and risks that are found. This paper describes a landscape assessment methodology, conducted using GIS, concerning landscape hazards, values and risks. Four hazard categories related to urban sprawl and erosion are introduced and assessed: landscape transformations by new planned developments, intensification of urban sprawl patterns, loss of agricultural land, and erosion. Landscape value is evaluated by using different thematic layers overlaid with GIS geoprocessing. The risk of loss of landscape value is evaluated with reference to the potential occurrence of the previously assessed hazards. The case study is the Province of Enna (Sicily), where landscape protection is a relevant issue because of the importance of cultural and natural heritage. Results show that high-value landscape features have a low risk of loss of landscape value. For this reason, landscape protection policies assume a relevant role in landscapes with low-medium values, and they should aim to control the urban sprawl processes that are beginning in the area. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Cross-national validation of prognostic models predicting sickness absence and the added value of work environment variables.

    PubMed

    Roelen, Corné A M; Stapelfeldt, Christina M; Heymans, Martijn W; van Rhenen, Willem; Labriola, Merete; Nielsen, Claus V; Bültmann, Ute; Jensen, Chris

    2015-06-01

To validate Dutch prognostic models including age, self-rated health and prior sickness absence (SA) for ability to predict high SA in Danish eldercare. The added value of work environment variables to the models' risk discrimination was also investigated. 2,562 municipal eldercare workers (95% women) participated in the Working in Eldercare Survey. Predictor variables were measured by questionnaire at baseline in 2005. Prognostic models were validated for predictions of high (≥30) SA days and high (≥3) SA episodes retrieved from employer records during 1-year follow-up. The accuracy of predictions was assessed by calibration graphs and the ability of the models to discriminate between high- and low-risk workers was investigated by ROC-analysis. The added value of work environment variables was measured with Integrated Discrimination Improvement (IDI). 1,930 workers had complete data for analysis. The models underestimated the risk of high SA in eldercare workers and the SA episodes model had to be re-calibrated to the Danish data. Discrimination was practically useful for the re-calibrated SA episodes model, but not the SA days model. Physical workload improved the SA days model (IDI = 0.40; 95% CI 0.19-0.60) and psychosocial work factors, particularly the quality of leadership (IDI = 0.70; 95% CI 0.53-0.86), improved the SA episodes model. The prognostic model predicting high SA days showed poor performance even after physical workload was added. The prognostic model predicting high SA episodes could be used to identify high-risk workers, especially when psychosocial work factors are added as predictor variables.

  12. Development and validation of a habitat suitability model for ...

    EPA Pesticide Factsheets

We developed a spatially-explicit, flexible 3-parameter habitat suitability model that can be used to identify and predict areas at higher risk for non-native dwarf eelgrass (Zostera japonica) invasion. The model uses simple environmental parameters (depth, nearshore slope, and salinity) to quantitatively describe habitat suitable for Z. japonica invasion based on ecology and physiology from the primary literature. Habitat suitability is defined with values ranging from zero to one, where one denotes areas most conducive to Z. japonica and zero denotes areas not likely to support Z. japonica growth. The model was applied to Yaquina Bay, Oregon, USA, an area that has well documented Z. japonica expansion over the last two decades. The highest suitability values for Z. japonica occurred in the mid to upper portions of the intertidal zone, with larger expanses occurring in the lower estuary. While the upper estuary did contain suitable habitat, most areas were not as large as in the lower estuary, due to inappropriate depth, a steeply sloping intertidal zone, and lower salinity. The lowest suitability values occurred below the lower intertidal zone, within the Yaquina River channel. The model was validated by comparison to a multi-year time series of Z. japonica maps, revealing a strong predictive capacity. Sensitivity analysis performed to evaluate the contribution of each parameter to the model prediction revealed that depth was the most important factor.

  13. Coastal Adaptation Planning for Sea Level Rise and Extremes: A Global Model for Adaptation Decision-making at the Local Level Given Uncertain Climate Projections

    NASA Astrophysics Data System (ADS)

    Turner, D.

    2014-12-01

    Understanding the potential economic and physical impacts of climate change on coastal resources involves evaluating a number of distinct adaptive responses. This paper presents a tool for such analysis, a spatially-disaggregated optimization model for adaptation to sea level rise (SLR) and storm surge, the Coastal Impact and Adaptation Model (CIAM). This decision-making framework fills a gap between very detailed studies of specific locations and overly aggregate global analyses. While CIAM is global in scope, the optimal adaptation strategy is determined at the local level, evaluating over 12,000 coastal segments as described in the DIVA database (Vafeidis et al. 2006). The decision to pursue a given adaptation measure depends on local socioeconomic factors like income, population, and land values and how they develop over time, relative to the magnitude of potential coastal impacts, based on geophysical attributes like inundation zones and storm surge. For example, the model's decision to protect or retreat considers the costs of constructing and maintaining coastal defenses versus those of relocating people and capital to minimize damages from land inundation and coastal storms. Uncertain storm surge events are modeled with a generalized extreme value distribution calibrated to data on local surge extremes. Adaptation is optimized for the near-term outlook, in an "act then learn then act" framework that is repeated over the model time horizon. This framework allows the adaptation strategy to be flexibly updated, reflecting the process of iterative risk management. CIAM provides new estimates of the economic costs of SLR; moreover, these detailed results can be compactly represented in a set of adaptation and damage functions for use in integrated assessment models. 
Alongside the optimal result, CIAM evaluates suboptimal cases and finds that global costs could increase by an order of magnitude, illustrating the importance of adaptive capacity and coastal policy.
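Sampling uncertain surge extremes from a generalized extreme value distribution might look like the following sketch; the location, scale, and shape parameters are illustrative assumptions, not CIAM's calibrated values.

```python
# Inverse-CDF sampling from a GEV distribution for annual-maximum storm surge.
import math
import random

random.seed(1)

def gev_sample(mu, sigma, xi):
    """Draw from GEV(mu, sigma, xi) via the inverse CDF, for xi != 0."""
    u = random.random()
    return mu + sigma * ((-math.log(u)) ** (-xi) - 1.0) / xi

# Annual-maximum surge heights (metres) for one coastal segment (assumed params)
surges = [gev_sample(mu=1.0, sigma=0.3, xi=0.1) for _ in range(10_000)]
p_exceed_2m = sum(s > 2.0 for s in surges) / len(surges)
print(f"P(annual max surge > 2 m) ~ {p_exceed_2m:.3f}")
```

Feeding draws like these into the protect-versus-retreat decision at each segment is what lets the "act then learn then act" framework weigh rare extremes rather than only mean sea level.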

  14. The clinical value of 201Tl per rectum scintigraphy in the work-up of patients with alcoholic liver disease.

    PubMed

    Urbain, D; Reding, P; Georges, B; Thys, O; Ham, H R

    1986-01-01

    The clinical value of thallium 201 per rectum scintigraphy in the work-up of patients with alcoholic liver disease was evaluated using data obtained in 104 patients. The 25th min ratio of heart to liver activities was used as an index of portal systemic shunting. This ratio was found to be normal in alcoholic patients with normal liver biopsy and also in those presenting only steatosis. It was slightly higher in patients with liver fibrosis and significantly higher values were observed in patients with liver cirrhosis. High values of the ratio were associated with a higher risk of portal systemic encephalopathy and/or gastrointestinal bleeding. The prognostic value of the test was supported by the fact that good correlations were observed between the ratio and widely accepted prognostic scores such as the Child score or the Orrego index. Moreover, high ratios were associated with an increased mortality risk at one year. We conclude that this simple test is interesting in the screening of cirrhotics at risk of encephalopathy, gastrointestinal hemorrhage, or early death.

  15. Predicting dementia risk in primary care: development and validation of the Dementia Risk Score using routinely collected data.

    PubMed

    Walters, K; Hardoon, S; Petersen, I; Iliffe, S; Omar, R Z; Nazareth, I; Rait, G

    2016-01-21

    Existing dementia risk scores require collection of additional data from patients, limiting their use in practice. Routinely collected healthcare data have the potential to assess dementia risk without the need to collect further information. Our objective was to develop and validate a 5-year dementia risk score derived from primary healthcare data. We used data from general practices in The Health Improvement Network (THIN) database from across the UK, randomly selecting 377 practices for a development cohort and identifying 930,395 patients aged 60-95 years without a recording of dementia, cognitive impairment or memory symptoms at baseline. We developed risk algorithm models for two age groups (60-79 and 80-95 years). An external validation was conducted by validating the model on a separate cohort of 264,224 patients from 95 randomly chosen THIN practices that did not contribute to the development cohort. Our main outcome was 5-year risk of first recorded dementia diagnosis. Potential predictors included sociodemographic, cardiovascular, lifestyle and mental health variables. Dementia incidence was 1.88 (95% CI, 1.83-1.93) and 16.53 (95% CI, 16.15-16.92) per 1000 PYAR for those aged 60-79 (n = 6017) and 80-95 years (n = 7104), respectively. Predictors for those aged 60-79 included age, sex, social deprivation, smoking, BMI, heavy alcohol use, anti-hypertensive drugs, diabetes, stroke/TIA, atrial fibrillation, aspirin, depression. The discrimination and calibration of the risk algorithm were good for the 60-79 years model; D statistic 2.03 (95% CI, 1.95-2.11), C index 0.84 (95% CI, 0.81-0.87), and calibration slope 0.98 (95% CI, 0.93-1.02). The algorithm had a high negative predictive value, but lower positive predictive value at most risk thresholds. Discrimination and calibration were poor for the 80-95 years model. Routinely collected data predicts 5-year risk of recorded diagnosis of dementia for those aged 60-79, but not those aged 80+. 
This algorithm can identify higher risk populations for dementia in primary care. The risk score has a high negative predictive value and may be most helpful in 'ruling out' those at very low risk from further testing or intensive preventative activities.

  16. [The development and evaluation of software to verify diagnostic accuracy].

    PubMed

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model that uses fuzzy logic concepts and was implemented in Perl, with a MySQL database for Internet accessibility and the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding its technical quality and, according to the students, helped in their learning and may become an educational tool to teach the process of nursing diagnosis.

  17. An Interprofessional Model for Serving Youth at Risk for Substance Abuse: The Team Case Study.

    ERIC Educational Resources Information Center

    Cobia, Debra C.; And Others

    1995-01-01

    Three models of interprofessional education appropriate for serving youth at risk for substance abuse are described. The evaluation of the team case study model indicated that the participants were more sensitive to the needs of the youths, experienced increased comfort in consulting other agents, and were more confident in their ability to select…

  18. Predictive genetic testing for the identification of high-risk groups: a simulation study on the impact of predictive ability

    PubMed Central

    2011-01-01

    Background Genetic risk models could potentially be useful in identifying high-risk groups for the prevention of complex diseases. We investigated the performance of this risk stratification strategy by examining epidemiological parameters that impact the predictive ability of risk models. Methods We assessed sensitivity, specificity, and positive and negative predictive value for all possible risk thresholds that can define high-risk groups and investigated how these measures depend on the frequency of disease in the population, the frequency of the high-risk group, and the discriminative accuracy of the risk model, as assessed by the area under the receiver-operating characteristic curve (AUC). In a simulation study, we modeled genetic risk scores of 50 genes with equal odds ratios and genotype frequencies, and varied the odds ratios and the disease frequency across scenarios. We also performed a simulation of age-related macular degeneration risk prediction based on published odds ratios and frequencies for six genetic risk variants. Results We show that when the frequency of the high-risk group was lower than the disease frequency, positive predictive value increased with the AUC but sensitivity remained low. When the frequency of the high-risk group was higher than the disease frequency, sensitivity was high but positive predictive value remained low. When both frequencies were equal, both positive predictive value and sensitivity increased with increasing AUC, but higher AUC was needed to maximize both measures. Conclusions The performance of risk stratification is strongly determined by the frequency of the high-risk group relative to the frequency of disease in the population. The identification of high-risk groups with appreciable combinations of sensitivity and positive predictive value requires higher AUC. PMID:21797996
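    The dependence of predictive values on disease frequency described above follows directly from Bayes' rule. A minimal sketch with illustrative numbers (not the study's simulated genotype data):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value from Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

def npv(sensitivity, specificity, prevalence):
    """Negative predictive value from Bayes' rule."""
    true_neg = specificity * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    return true_neg / (true_neg + false_neg)

# The same test yields a far lower PPV when the disease is rare.
common = ppv(0.80, 0.90, 0.50)  # prevalence 50%
rare = ppv(0.80, 0.90, 0.01)    # prevalence 1%
```

    With sensitivity 0.80 and specificity 0.90, PPV falls from roughly 0.89 at 50% prevalence to under 0.08 at 1% prevalence, mirroring the paper's finding that appreciable PPV at low disease frequency demands a higher AUC.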

  19. Patient- and cohort-specific dose and risk estimation for abdominopelvic CT: a study based on 100 patients

    NASA Astrophysics Data System (ADS)

    Tian, Xiaoyu; Li, Xiang; Segars, W. Paul; Frush, Donald P.; Samei, Ehsan

    2012-03-01

    The purpose of this work was twofold: (a) to estimate patient- and cohort-specific radiation dose and cancer risk index for abdominopelvic computer tomography (CT) scans; (b) to evaluate the effects of patient anatomical characteristics (size, age, and gender) and CT scanner model on dose and risk conversion coefficients. The study included 100 patient models (42 pediatric models, 58 adult models) and multi-detector array CT scanners from two commercial manufacturers (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare). A previously-validated Monte Carlo program was used to simulate organ dose for each patient model and each scanner, from which DLP-normalized-effective dose (k factor) and DLP-normalized-risk index values (q factor) were derived. The k factor showed exponential decrease with increasing patient size. For a given gender, q factor showed exponential decrease with both increasing patient size and patient age. The discrepancies in k and q factors across scanners were on average 8% and 15%, respectively. This study demonstrates the feasibility of estimating patient-specific organ dose and cohort-specific effective dose and risk index in abdominopelvic CT requiring only the knowledge of patient size, gender, and age.

  20. Evaluation of an inpatient fall risk screening tool to identify the most critical fall risk factors in inpatients.

    PubMed

    Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin

    2017-03-01

    To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Variations exist in several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Secondary data analysis. A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase sensitivity, the Youden index suggests 1·5 points as the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. Fall risk was also significantly associated with days of hospital stay and with admission to surgical wards. The findings can raise awareness about the two most critical risk factors for falls among future clinical nurses and other healthcare professionals and thus facilitate the development of fall prevention interventions. This study highlights the need for redefining the cut-off points of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls.
Furthermore, inpatients with impaired balance and impaired elimination should be closely monitored by nurses to prevent falling during hospitalisations. © 2016 John Wiley & Sons Ltd.
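    The Youden-index cut-off selection applied above can be sketched as follows: J = sensitivity + specificity − 1 is computed at each candidate cut-off and the maximizer is chosen (toy scores, not the study's screening data):

```python
def youden_optimal_cutoff(scores, fell, cutoffs):
    """Pick the cut-off maximizing Youden's J = sensitivity + specificity - 1."""
    best_cutoff, best_j = None, -1.0
    for c in cutoffs:
        tp = sum(1 for s, y in zip(scores, fell) if s >= c and y)
        fn = sum(1 for s, y in zip(scores, fell) if s < c and y)
        fp = sum(1 for s, y in zip(scores, fell) if s >= c and not y)
        tn = sum(1 for s, y in zip(scores, fell) if s < c and not y)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_cutoff, best_j = c, j
    return best_cutoff, best_j

# Toy data: screening scores for 3 fallers and 5 non-fallers.
scores = [3, 4, 5, 1, 1, 2, 2, 3]
fell = [True, True, True, False, False, False, False, False]
cutoff, j = youden_optimal_cutoff(scores, fell, [1.5, 2.5, 3.5])
```

    On this toy data the optimal cut-off is 2.5 (J = 0.8); lowering a cut-off in this way trades specificity for sensitivity, as the study reports.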

  1. Fuzzy net present valuation based on risk assessment of Malaysian infrastructure

    NASA Astrophysics Data System (ADS)

    Shaffie, Siti Salihah; Jaaman, Saiful Hafizah; Mohamad, Daud

    2017-04-01

    In recent years, build-operate-transfer (BOT) projects have been widely adopted under project financing for infrastructure development in many countries. Such projects require substantial financing and involve complex, shared risks, and assessing those risks is vital to averting large financial losses. Net present value analysis is widely applied to BOT projects, but it treats the uncertainties in cash flows as deterministic values. This study proposes a fuzzy net present value model that takes the assessed risks of the BOT project into consideration. The proposed model provides a more flexible net present valuation of the project, and it is shown that the improved fuzzy cash flow model yields a valuation close to the real value of the project.
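    One common way to realize such a model is to represent each period's cash flow as a triangular fuzzy number (pessimistic, most likely, optimistic) and discount component-wise. This is a generic sketch under that assumption, not the authors' exact formulation, which additionally weights cash flows by the assessed project risks:

```python
def fuzzy_npv(cash_flows, rate):
    """NPV of triangular fuzzy cash flows (low, mode, high) at a crisp rate.

    Each cash flow is a (pessimistic, most-likely, optimistic) triple;
    discounting by a positive crisp factor preserves the triangular shape,
    so components can be summed independently.
    """
    return tuple(
        sum(cf[i] / (1 + rate) ** t for t, cf in enumerate(cash_flows))
        for i in range(3)
    )

# Year-0 outlay is certain; years 1-2 revenues are uncertain.
npv_low, npv_mode, npv_high = fuzzy_npv(
    [(-100, -100, -100), (50, 60, 70), (50, 60, 70)], rate=0.10
)
```

    The result is itself a triangular fuzzy number; here the support straddles zero, so the project's viability depends on how the fuzzy NPV is defuzzified and compared against the risk assessment.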

  2. The New York Sepsis Severity Score: Development of a Risk-Adjusted Severity Model for Sepsis.

    PubMed

    Phillips, Gary S; Osborn, Tiffany M; Terry, Kathleen M; Gesten, Foster; Levy, Mitchell M; Lemeshow, Stanley

    2018-05-01

    In accordance with Rory's Regulations, hospitals across New York State developed and implemented protocols for sepsis recognition and treatment to reduce variations in evidence informed care and preventable mortality. The New York Department of Health sought to develop a risk assessment model for accurate and standardized hospital mortality comparisons of adult septic patients across institutions using case-mix adjustment. Retrospective evaluation of prospectively collected data. Data from 43,204 severe sepsis and septic shock patients from 179 hospitals across New York State were evaluated. Prospective data were submitted to a database from January 1, 2015, to December 31, 2015. None. Maximum likelihood logistic regression was used to estimate model coefficients used in the New York State risk model. The mortality probability was estimated using a logistic regression model. Variables to be included in the model were determined as part of the model-building process. Interactions between variables were included if they made clinical sense and if their p values were less than 0.05. Model development used a random sample of 90% of available patients and was validated using the remaining 10%. Hosmer-Lemeshow goodness of fit p values were considerably greater than 0.05, suggesting good calibration. Areas under the receiver operator curve in the developmental and validation subsets were 0.770 (95% CI, 0.765-0.775) and 0.773 (95% CI, 0.758-0.787), respectively, indicating good discrimination. Development and validation datasets had similar distributions of estimated mortality probabilities. Mortality increased with rising age, comorbidities, and lactate. The New York Sepsis Severity Score accurately estimated the probability of hospital mortality in severe sepsis and septic shock patients. It performed well with respect to calibration and discrimination. 
This sepsis-specific model provides an accurate, comprehensive method for standardized mortality comparison of adult patients with severe sepsis and septic shock.
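    Discrimination in such a model is summarized by the area under the receiver operator curve. A minimal sketch of the rank-based (Mann-Whitney) AUC estimate, using illustrative probabilities rather than the New York State data:

```python
def auc(probs_died, probs_survived):
    """AUC as the probability that a randomly chosen death received a
    higher predicted mortality than a randomly chosen survivor (ties 0.5)."""
    wins = 0.0
    for p in probs_died:
        for q in probs_survived:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(probs_died) * len(probs_survived))

# Illustrative predicted mortality probabilities from a fitted model.
deaths = [0.90, 0.80, 0.60]
survivors = [0.40, 0.60, 0.20]
discrimination = auc(deaths, survivors)
```

    An AUC near 0.77, as reported for both the development and validation subsets, means a randomly chosen death outranks a randomly chosen survivor about 77% of the time.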

  3. Health Risk Assessment Research on Heavy Metals Ingestion Through Groundwater Drinking Pathway for the Residents in Baotou, China.

    PubMed

    Bai, Liping; Wang, Yeyao; Guo, Yongli; Zhou, Youya; Liu, Li; Yan, Zengguang; Li, Fasheng; Xie, Xuefeng

    2016-01-01

    Drinking groundwater is a significant pathway for human exposure to heavy metals. To evaluate the health effect of some heavy metals ingestion through the groundwater drinking pathway, the authors collected 35 groundwater samples from the drinking water wells of local residents and the exploitation wells of waterworks in Baotou, China. The monitoring results indicate that the groundwater had been polluted by heavy metals in some regions of the study area. A health risk assessment model derived from the U.S. Environmental Protection Agency was used to determine the noncarcinogenic and carcinogenic effects to residents who drink groundwater. All the respondents in the study area were at potential risk of carcinogenic health effects from arsenic when using the lowest safe standard for carcinogenic risk (1E-06). The hazard quotient values for noncarcinogenic health risk of arsenic exceeded 1 in 14.3% of the sampling wells in the study area. The research results could provide baseline data for groundwater utilization and supervision in the Baotou plain area.
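    The US EPA drinking-water-pathway formulas behind such an assessment are standard; the sketch below uses common default exposure parameters and published IRIS toxicity values for arsenic as assumptions (the paper's site-specific values may differ):

```python
def chronic_daily_intake(conc, ir=2.0, ef=365, ed=30, bw=70.0, at_days=30 * 365):
    """Chronic daily intake, mg/(kg*day), for ingestion of drinking water.

    conc: concentration in water (mg/L); ir: intake rate (L/day);
    ef: exposure frequency (days/yr); ed: exposure duration (yr);
    bw: body weight (kg); at_days: averaging time (days).
    """
    return conc * ir * ef * ed / (bw * at_days)

def hazard_quotient(cdi, rfd):
    """Non-carcinogenic hazard quotient; HQ > 1 flags potential risk."""
    return cdi / rfd

def carcinogenic_risk(cdi, slope_factor):
    """Lifetime excess cancer risk; often screened against 1E-06."""
    return cdi * slope_factor

# Arsenic at 0.01 mg/L; carcinogenic averaging time spans a 70-yr lifetime.
cdi_nc = chronic_daily_intake(0.01)
cdi_c = chronic_daily_intake(0.01, at_days=70 * 365)
hq = hazard_quotient(cdi_nc, rfd=3.0e-4)           # IRIS oral RfD, arsenic
risk = carcinogenic_risk(cdi_c, slope_factor=1.5)  # IRIS oral slope factor
```

    Even at the 0.01 mg/L drinking-water limit, the computed lifetime cancer risk exceeds 1E-06, which is consistent with the paper's finding that all respondents were at potential carcinogenic risk from arsenic under that screening standard.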

  4. [Evaluation of a training system for middle ear surgery with optoelectric detection].

    PubMed

    Strauss, G; Bahrami, N; Pössneck, A; Strauss, M; Dietz, A; Korb, W; Lüth, T; Haase, R; Moeckel, H; Grunert, R

    2009-10-01

    This work presents a new training concept for surgery of the temporal bone. It is based on a model of gypsum plastic with optoelectric detection of risk structures. A prototypical evaluation is given. The training models are based on high-resolution computed tomographic data of a human skull. The resulting data set was printed by a three-dimensional (3D) printer. A 3D phantom is created from gypsum powder and a bonding agent. Risk structures are the facial nerve, semicircular canal, cochlea, ossicular chain, sigmoid sinus, dura, and internal carotid artery. An electrically conductive metal (Wood's metal) and a fiber-optic cable were used as detection materials for the risk structures. For evaluating the training system, a study was done with eight inexperienced and eight experienced ear surgeons. They were asked to perform temporal bone surgery using two identical training models (group A). In group B, the same surgeons underwent surgical training with human cadavers. In the case of injuries, the number, point in time, degree (facial nerve), and injured structure were documented during the training on the model. In addition, the total time needed was noted. The training systems could be used in all cases. Evaluation of the anatomic accuracy of the models showed results that were between 49.5% and 90% agreement with the anatomic origin. Error detection was evaluated with values between 79% and 100% agreement with the perception of an experienced surgeon. The operating setting was estimated to be better than the previous "gold standard." The possibility of completely replacing the previous training method, which uses cadavers, with the examined training model was affirmed. This study shows that the examined system fulfills the conditions for a new training concept for temporal bone surgery. The system connects the preliminary work with printed and sintered models with the possibilities of microsystem engineering.
In addition, the model's digital database permits a complete virtual representation of the model with appropriate further applications ("look behind the wall," virtual endoscopy).

  5. Modeling Flood Insurance Penetration in the European Non-Life Market: An Overview

    NASA Astrophysics Data System (ADS)

    Mohan, P.; Thomson, M.-K.; Das, A.

    2012-04-01

    Non-life property insurance plays a significant role in assessing and managing economic risk. Understanding the exposure, or property at risk, helps insurers and reinsurers to better categorize and manage their portfolios. However, the nature of the flood peril, in particular adverse selection, has led to a complex system of different insurance covers and policies across Europe owing to its public and private distinctions based on premiums provided as ex ante or ex post, socio-economic characterization and various compensation schemes. To model this significant level of complexity within the European flood insurance market requires not only extensive data research, close understanding of insurance companies and associations as well as historic flood events, but also careful evaluation of the flood hazard in terms of return periods and flood extents, and the economic/financial background of the geographies involved. This abstract explores different approaches for modeling the flood insurance penetration rates in Europe depending on the information available and the complexity involved. For countries which have either a regulated market with mandatory or high penetration rates, as for example found in the UK, France and Switzerland, or indeed countries with negligible insurance cover such as Luxembourg, assumptions about the penetration rates can be made at country level. However, in countries with a private insurance market, the picture becomes inherently more complex. For example, in both Austria and Germany, flood insurance is generally restricted, associated with high costs to the insured, or not available at all in high-risk areas. In order to better manage flood risk, the Austrian and German government agencies produced the risk classification systems HORA and ZÜRS, respectively, which categorize risk into four risk zones based on the exceedance probability of a flood occurrence. Except for regions that have preserved mandatory flood inclusion from past policies, insurance cover is generally limited or not available in high-risk zones. To estimate this relationship, flood extent maps, modeled return periods, socio-economic indicators and the spatial distribution of insured portfolios can be used to quantify the economic-to-insured exposure ratio. Adequately modeling these insurance conditions not only allows the development of an industry view of the exposed property and values at risk from flood but also improves loss assessments. From an insurance perspective, such a model is beneficial for assessing insurance cover related to flood damage (especially given differences in policies in high-exposure zones) and the role of governments, and for assisting insurers and reinsurers in making informed decisions when allocating their portfolios.

  6. Program Evaluation of Growin' to Win: A Latchkey and Summer Program for At-Risk Youth.

    ERIC Educational Resources Information Center

    James, William H.; And Others

    This document presents an evaluation of the effectiveness of the Growin' to Win Project, an after-school and summer program targeted at elementary and middle school aged youth at high risk of substance abuse and gang involvement. Growin' to Win is an expansion of a model latchkey program piloted at two Tacoma (Washington) schools in 1990. The…

  7. An Application of the Patient Rule-Induction Method for Evaluating the Contribution of the Apolipoprotein E and Lipoprotein Lipase Genes to Predicting Ischemic Heart Disease

    PubMed Central

    Dyson, Greg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G.; Tybjærg-Hansen, Anne; Sing, Charles F.

    2007-01-01

    Different combinations of genetic and environmental risk factors are known to contribute to the complex etiology of ischemic heart disease (IHD) in different subsets of individuals. We employed the Patient Rule-Induction Method (PRIM) to select the combination of risk factors and risk factor values that identified each of 16 mutually exclusive partitions of individuals having significantly different levels of risk of IHD. PRIM balances two competing objectives: (1) finding partitions where the risk of IHD is high and (2) maximizing the number of IHD cases explained by the partitions. A sequential PRIM analysis was applied to data on the incidence of IHD collected over 8 years for a sample of 5,455 unrelated individuals from the Copenhagen City Heart Study (CCHS) to assess the added value of variation in two candidate susceptibility genes beyond the traditional, lipid and body mass index risk factors for IHD. An independent sample of 362 unrelated individuals also from the city of Copenhagen was used to test the model obtained for each of the hypothesized partitions. PMID:17436307

  8. Atrial Fibrillation in Hypertrophic Cardiomyopathy: Prevalence, Clinical Correlations, and Mortality in a Large High‐Risk Population

    PubMed Central

    Siontis, Konstantinos C.; Geske, Jeffrey B.; Ong, Kevin; Nishimura, Rick A.; Ommen, Steve R.; Gersh, Bernard J.

    2014-01-01

    Background Atrial fibrillation (AF) is a common sequela of hypertrophic cardiomyopathy (HCM), but evidence on its prevalence, risk factors, and effect on mortality is sparse. We sought to evaluate the prevalence of AF, identify clinical and echocardiographic correlates, and assess its effect on mortality in a large high‐risk HCM population. Methods and Results We identified HCM patients who underwent evaluation at our institution from 1975 to 2012. AF was defined by known history (either chronic or paroxysmal), electrocardiogram, or Holter monitoring at index visit. We examined clinical and echocardiographic variables in association with AF. The effect of AF on overall and cause‐specific mortality was evaluated with multivariate Cox proportional hazards models. Of 3673 patients with HCM, 650 (18%) had AF. Patients with AF were older and more symptomatic (P<0.001). AF was less common among patients with obstructive HCM phenotype and was associated with larger left atria, higher E/e’ ratios, and worse cardiopulmonary exercise tolerance (all P values<0.001). During median (interquartile range) follow‐up of 4.1 (0.2 to 10) years, 1069 (29%) patients died. Patients with AF had worse survival compared to those without AF (P<0.001). In multivariate analysis adjusted for established risk factors of mortality in HCM, the hazard ratio (95% confidence interval) for the effect of AF on overall mortality was 1.48 (1.27 to 1.71). AF did not have an effect on sudden or nonsudden cardiac death. Conclusions In this large referral HCM population, approximately 1 in 5 patients had AF. AF was a strong predictor of mortality, even after adjustment for established risk factors. PMID:24965028

  9. Automated Risk Assessment for School Violence: a Pilot Study.

    PubMed

    Barzman, Drew; Ni, Yizhao; Griffey, Marcus; Bachtel, Alycia; Lin, Kenneth; Jackson, Hannah; Sorter, Michael; DelBello, Melissa

    2018-05-01

    School violence has increased over the past ten years. This study evaluated students using a more standard and sensitive method to help identify students who are at high risk for school violence. 103 participants were recruited through Cincinnati Children's Hospital Medical Center (CCHMC) from psychiatry outpatient clinics, the inpatient units, and the emergency department. Participants (ages 12-18) were active students in 74 traditional schools (i.e. non-online education). Collateral information was gathered from guardians before participants were evaluated. School risk evaluations were performed with each participant, and audio recordings from the evaluations were later transcribed and manually annotated. The BRACHA (School Version) and the School Safety Scale (SSS), both 14-item scales, were used. A template of open-ended questions was also used. This analysis included 103 participants who were recruited from 74 different schools. Of the 103 students evaluated, 55 were found to be moderate to high risk and 48 were found to be low risk based on the paper risk assessments including the BRACHA and SSS. Both the BRACHA and the SSS were highly correlated with risk of violence to others (Pearson correlations>0.82). There were significant differences in BRACHA and SSS total scores between low risk and high risk to others groups (p-values <0.001 under unpaired t-test). In particular, there were significant differences in individual SSS items between the two groups (p-value <0.001). Of these items, Previous Violent Behavior (Pearson Correlation = 0.80), Impulsivity (0.69), School Problems (0.64), and Negative Attitudes (0.61) were positively correlated with risk to others. The novel machine learning algorithm achieved an AUC of 91.02% when using the interview content to predict risk of school violence, and the AUC increased to 91.45% when demographic and socioeconomic data were added. 
Our study indicates that the BRACHA and SSS are clinically useful for assessing risk for school violence. The machine learning algorithm was highly accurate in assessing school violence risk.

  10. Candida colonization as a risk marker for invasive candidiasis in mixed medical-surgical intensive care units: development and evaluation of a simple, standard protocol.

    PubMed

    Lau, Anna F; Kabir, Masrura; Chen, Sharon C-A; Playford, E Geoffrey; Marriott, Deborah J; Jones, Michael; Lipman, Jeffrey; McBryde, Emma; Gottlieb, Thomas; Cheung, Winston; Seppelt, Ian; Iredell, Jonathan; Sorrell, Tania C

    2015-04-01

    Colonization with Candida species is an independent risk factor for invasive candidiasis (IC), but the minimum and most practicable parameters for prediction of IC have not been optimized. We evaluated Candida colonization in a prospective cohort of 6,015 nonneutropenic, critically ill patients. Throat, perineum, and urine were sampled 72 h post-intensive care unit (ICU) admission and twice weekly until discharge or death. Specimens were cultured onto chromogenic agar, and a subset underwent molecular characterization. Sixty-three (86%) patients who developed IC were colonized prior to infection; 61 (97%) tested positive within the first two time points. The median time from colonization to IC was 7 days (range, 0 to 35). Colonization at any site was predictive of IC, with the risk of infection highest for urine colonization (relative risk [RR]=2.25) but with the sensitivity highest (98%) for throat and/or perineum colonization. Colonization of ≥2 sites and heavy colonization of ≥1 site were significant independent risk factors for IC (RR=2.25 and RR=3.7, respectively), increasing specificity to 71% to 74% but decreasing sensitivity to 48% to 58%. Molecular testing would have prompted a resistance-driven decision to switch from fluconazole treatment in only 11% of patients infected with C. glabrata, based upon species-level identification alone. Positive predictive values (PPVs) were low (2% to 4%) and negative predictive values (NPVs) high (99% to 100%) regardless of which parameters were applied. In the Australian ICU setting, culture of throat and perineum within the first two time points after ICU admission captures 84% (61/73 patients) of subsequent IC cases. These optimized parameters, in combination with clinical risk factors, should strengthen development of a setting-specific risk-predictive model for IC. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  11. The efficacy of serostatus disclosure for HIV Transmission risk reduction.

    PubMed

    O'Connell, Ann A; Reed, Sandra J; Serovich, Julianne A

    2015-02-01

    Interventions to assist HIV+ persons in disclosing their serostatus to sexual partners can play an important role in curbing rates of HIV transmission among men who have sex with men (MSM). Based on the methods of Pinkerton and Galletly (AIDS Behav 11:698-705, 2007), we develop a mathematical probability model for evaluating effectiveness of serostatus disclosure in reducing the risk of HIV transmission and extend the model to examine the impact of serosorting. In baseline data from 164 HIV+ MSM participating in a randomized controlled trial of a disclosure intervention, disclosure is associated with a 45.0 % reduction in the risk of HIV transmission. Accounting for serosorting, a 61.2 % reduction in risk due to disclosure was observed in serodisconcordant couples. The reduction in risk for seroconcordant couples was 38.4 %. Evidence provided supports the value of serostatus disclosure as a risk reduction strategy in HIV+ MSM. Interventions to increase serostatus disclosure and that address serosorting behaviors are needed.
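    Models in the Pinkerton-Galletly tradition treat transmission as a Bernoulli process over sexual acts, so disclosure and serosorting enter through changed act counts and protection levels. A hedged sketch of that core cumulative-risk calculation, with illustrative parameter values rather than the study's estimates:

```python
def cumulative_risk(per_act_prob, n_acts, condom_use=0.0, condom_efficacy=0.9):
    """Probability of at least one transmission over n_acts independent acts.

    A fraction condom_use of acts is protected, reducing the per-act
    transmission probability by condom_efficacy for those acts.
    """
    p_protected = per_act_prob * (1 - condom_efficacy)
    unprotected = int(round(n_acts * (1 - condom_use)))
    protected = n_acts - unprotected
    no_transmission = ((1 - per_act_prob) ** unprotected
                       * (1 - p_protected) ** protected)
    return 1 - no_transmission

# Illustrative: disclosure modeled as raising condom use from 0% to 80%.
baseline = cumulative_risk(0.01, n_acts=50)
with_condoms = cumulative_risk(0.01, n_acts=50, condom_use=0.8)
reduction = 1 - with_condoms / baseline
```

    The relative risk reduction attributed to disclosure in such a model is the complement of the ratio of the two cumulative probabilities, which is how figures like the 45.0% reduction above are expressed.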

  12. The Efficacy of Serostatus Disclosure for HIV Transmission Risk Reduction

    PubMed Central

    O’Connell, Ann A.; Serovich, Julianne A.

    2015-01-01

    Interventions to assist HIV+ persons in disclosing their serostatus to sexual partners can play an important role in curbing rates of HIV transmission among men who have sex with men (MSM). Based on the methods of Pinkerton and Galletly (AIDS Behav 11:698–705, 2007), we develop a mathematical probability model for evaluating effectiveness of serostatus disclosure in reducing the risk of HIV transmission and extend the model to examine the impact of serosorting. In baseline data from 164 HIV+ MSM participating in a randomized controlled trial of a disclosure intervention, disclosure is associated with a 45.0 % reduction in the risk of HIV transmission. Accounting for serosorting, a 61.2 % reduction in risk due to disclosure was observed in serodisconcordant couples. The reduction in risk for seroconcordant couples was 38.4 %. Evidence provided supports the value of serostatus disclosure as a risk reduction strategy in HIV+ MSM. Interventions to increase serostatus disclosure and that address serosorting behaviors are needed. PMID:25164375

  13. Development of a screening tool using electronic health records for undiagnosed Type 2 diabetes mellitus and impaired fasting glucose detection in the Slovenian population.

    PubMed

    Štiglic, G; Kocbek, P; Cilar, L; Fijačko, N; Stožer, A; Zaletel, J; Sheikh, A; Povalej Bržan, P

    2018-05-01

    To develop and validate a simplified screening test for undiagnosed Type 2 diabetes mellitus and impaired fasting glucose for the Slovenian population (SloRisk) to be used in the general population. Data on 11 391 people were collected from the electronic health records of comprehensive medical examinations in five Slovenian healthcare centres. Fasting plasma glucose as well as information related to the Finnish Diabetes Risk Score questionnaire, FINDRISC, were collected for 2073 people to build predictive models. Bootstrapping-based evaluation was used to estimate the area under the receiver-operating characteristic curve performance metric of two proposed logistic regression models as well as the Finnish Diabetes Risk Score model both at recommended and at alternative cut-off values. The final model contained five questions for undiagnosed Type 2 diabetes prediction and achieved an area under the receiver-operating characteristic curve of 0.851 (95% CI 0.850-0.853). The impaired fasting glucose prediction model included six questions and achieved an area under the receiver-operating characteristic curve of 0.840 (95% CI 0.839-0.840). There were four questions that were included in both models (age, sex, waist circumference and blood sugar history), with physical activity selected only for undiagnosed Type 2 diabetes and questions on family history and hypertension drug use selected only for the impaired fasting glucose prediction model. This study proposes two simplified models based on FINDRISC questions for screening of undiagnosed Type 2 diabetes and impaired fasting glucose in the Slovenian population. A significant improvement in performance was achieved compared with the original FINDRISC questionnaire. Both models include waist circumference instead of BMI. © 2018 Diabetes UK.

  14. Evaluating the Value of High Spatial Resolution in National Capacity Expansion Models using ReEDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Venkat; Cole, Wesley

    2016-11-14

    Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national-scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar, modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions--native resolution (134 BAs), state level, and NERC region level--and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of highly geographically resolved models in terms of their impact on relative competitiveness among renewable energy resources.

  15. Evaluating child welfare policies with decision-analytic simulation models.

    PubMed

    Goldhaber-Fiebert, Jeremy D; Bailey, Stephanie L; Hurlburt, Michael S; Zhang, Jinjin; Snowden, Lonnie R; Wulczyn, Fred; Landsverk, John; Horwitz, Sarah M

    2012-11-01

    The objective was to demonstrate decision-analytic modeling in support of Child Welfare policymakers considering implementing evidence-based interventions. Outcomes included permanency (e.g., adoptions) and stability (e.g., foster placement changes). Analyses of a randomized trial of KEEP (a foster parenting intervention) and of NSCAW-1 estimated placement change rates and KEEP's effects. A microsimulation model generalized these findings to other Child Welfare systems. The model projected that KEEP could increase permanency and stability, identifying strategies targeting higher-risk children and geographical regions that achieve benefits efficiently. Decision-analytic models enable planners to gauge the value of potential implementations.

  16. Predictive validity of the Hendrich fall risk model II in an acute geriatric unit.

    PubMed

    Ivziku, Dhurata; Matarese, Maria; Pedone, Claudio

    2011-04-01

    Falls are the most common adverse events reported in acute care hospitals, and older patients are the most likely to fall. The risk of falling cannot be completely eliminated, but it can be reduced through the implementation of a fall prevention program. A major evidence-based intervention to prevent falls has been the use of fall-risk assessment tools. Many tools have been increasingly developed in recent years, but most instruments have not been investigated regarding reliability, validity and clinical usefulness. This study intends to evaluate the predictive validity and inter-rater reliability of Hendrich fall risk model II (HFRM II) in order to identify older patients at risk of falling in geriatric units and recommend its use in clinical practice. A prospective descriptive design was used. The study was carried out in a geriatric acute care unit of an Italian University hospital. All over 65 years old patients consecutively admitted to a geriatric acute care unit of an Italian University hospital over 8-month period were enrolled. The patients enrolled were screened for the falls risk by nurses with the HFRM II within 24h of admission. The falls occurring during the patient's hospital stay were registered. Inter-rater reliability, area under the ROC curve, sensitivity, specificity, positive and negative predictive values and time for the administration were evaluated. 179 elderly patients were included. The inter-rater reliability was 0.87 (95% CI 0.71-1.00). The administration time was about 1min. The most frequently reported risk factors were depression, incontinence, vertigo. Sensitivity and specificity were respectively 86% and 43%. The optimal cut-off score for screening at risk patients was 5 with an area under the ROC curve of 0.72. The risk factors more strongly associated with falls were confusion and depression. 
Because falls among older patients are a common problem in acute care settings, nurses should use validated and reliable fall-risk assessment tools in order to implement the most effective prevention measures. Our findings provide supporting evidence for the choice of the HFRM II to screen older patients at risk of falling in acute care settings. Copyright © 2010 Elsevier Ltd. All rights reserved.
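The screening metrics reported above (sensitivity, specificity, positive and negative predictive values) all derive from the same 2x2 confusion matrix. A minimal sketch, using hypothetical counts chosen only for illustration (the study's raw 2x2 table is not given in the abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-tool metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),  # fallers correctly flagged as high risk
        "specificity": tn / (tn + fp),  # non-fallers correctly flagged as low risk
        "ppv": tp / (tp + fp),          # flagged patients who actually fell
        "npv": tn / (tn + fn),          # unflagged patients who did not fall
    }

# Hypothetical counts for illustration only, not the study's data
m = diagnostic_metrics(tp=18, fp=90, fn=3, tn=68)
print({k: round(v, 2) for k, v in m.items()})
# → {'sensitivity': 0.86, 'specificity': 0.43, 'ppv': 0.17, 'npv': 0.96}
```

Note the typical screening trade-off visible here: a low cut-off yields high sensitivity (few fallers missed) at the cost of low specificity and PPV.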

  17. Predictive risk stratification model: a progressive cluster-randomised trial in chronic conditions management (PRISMATIC) research protocol

    PubMed Central

    2013-01-01

Background An ageing population increases demand on health and social care. New approaches are needed to shift care from hospital to community and general practice. A predictive risk stratification tool (Prism) has been developed for general practice that estimates risk of an emergency hospital admission in the following year. We present a protocol for the evaluation of Prism. Methods/Design We will undertake a mixed methods progressive cluster-randomised trial. Practices begin as controls, delivering usual care without Prism. Practices will receive Prism and training in a randomised sequence, and thereafter will be able to use Prism with clinical and technical support. We will compare costs, processes of care, satisfaction and patient outcomes at baseline, 6 and 18 months, using routine data and postal questionnaires. We will assess technical performance by comparing predicted against actual emergency admissions. Focus groups and interviews will be undertaken to understand how Prism is perceived and adopted by practitioners and policy makers. We will model data using generalised linear models and survival analysis techniques to determine whether any differences exist between intervention and control groups. We will take account of covariates and explanatory factors. In the economic evaluation we will carry out a cost-effectiveness analysis to examine incremental cost per emergency admission to hospital avoided and will examine costs versus changes in primary and secondary outcomes in a cost-consequence analysis. We will also examine changes in quality of life of patients across the risk spectrum. We will record and transcribe focus groups and interviews and analyse them thematically. We have received full ethical and R&D approvals for the study and Information Governance Review Panel (IGRP) permission for the use of routine data. We will comply with the CONSORT guidelines and will disseminate the findings at national and international conferences and in peer-reviewed journals. 
Discussion The proposed study will provide information on costs and effects of Prism; how it is used in practice, barriers and facilitators to its implementation; and its perceived value in supporting the management of patients with and at risk of developing chronic conditions. Trial registration Controlled Clinical Trials ISRCTN no. ISRCTN55538212. PMID:24330749
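The protocol's headline economic measure, incremental cost per emergency admission avoided, is a simple ratio of cost and effect differences between arms. A minimal sketch with hypothetical per-practice figures (none of these numbers come from the trial):

```python
def cost_per_admission_avoided(cost_intervention, cost_control,
                               admissions_intervention, admissions_control):
    """Incremental cost per emergency admission avoided (illustrative)."""
    delta_cost = cost_intervention - cost_control
    avoided = admissions_control - admissions_intervention
    if avoided <= 0:
        # Intervention avoids no admissions: ratio is undefined/dominated
        return float("inf")
    return delta_cost / avoided

# Hypothetical figures for illustration only
print(cost_per_admission_avoided(cost_intervention=120_000, cost_control=100_000,
                                 admissions_intervention=45, admissions_control=55))
# → 2000.0
```

In a full cost-consequence analysis these ratios would sit alongside unaggregated outcomes (quality of life, satisfaction, processes of care) rather than being the sole decision criterion.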

  18. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  19. Ecological Screening Values for Surface Water, Sediment, and Soil: 2005 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friday, G. P.

    2005-07-18

One of the principal components of the environmental remediation program at the Savannah River Site (SRS) is the assessment of ecological risk. Used to support CERCLA, RCRA, and DOE orders, the ecological risk assessment (ERA) can identify environmental hazards and evaluate remedial action alternatives. Ecological risk assessment is also an essential means for achieving DOE's risk based end state vision for the disposition of nuclear material and waste hazards, the decommissioning of facilities, and the remediation of inactive waste units at SRS. The complexity of an ERA ranges from a screening level ERA (SLERA) to a full baseline ERA. A screening level ecological risk assessment, although abbreviated from a baseline risk assessment, is nonetheless considered a complete risk assessment (EPA, 2001a). One of the initial tasks of any ERA is to identify constituents that potentially or adversely affect the environment. Typically, this is accomplished by comparing a constituent's maximum concentration in surface water, sediment, or soil with an ecological screening value (ESV). The screening process can eliminate many constituents from further consideration in the risk assessment, but it also identifies those that require additional evaluation. This document is an update of a previous compilation (Friday, 1998) and provides a comprehensive listing of ecological screening values for surface water, sediment, and soil. It describes how the screening values were derived and recommends benchmarks that can be used for ecological risk assessment. The sources of these updated benchmarks include the U.S. Environmental Protection Agency (EPA), U.S. Fish and Wildlife Service (USFWS), U.S. Geological Survey (USGS), National Oceanic and Atmospheric Administration (NOAA), Oak Ridge National Laboratory (ORNL), the State of Florida, the Canadian Council of Ministers of the Environment (CCME), the Dutch Ministry of the Environment (RIVM), and the scientific literature. 
It should be noted that ESVs are continuously revised by the various issuing agencies. The references in this report provide the citations of each source and, where applicable, the internet address where they can be accessed. Although radiological screening values are not included herein due to space limitations, these have been recently derived by a technical working committee sponsored by the U.S. Department of Energy (DOE 2002, 2004). The recommended ecological screening values represent the most conservative concentrations of the cited sources, and are to be used for screening purposes only. They do not represent remedial action cleanup levels. Their use at locations other than SRS should take into account environmental variables such as water quality, soil chemistry, flora and fauna, and other ecological attributes specific to the ecosystem potentially at risk.
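The screening step described above, comparing each constituent's maximum measured concentration against its ESV and retaining exceedances (or constituents without a benchmark) for further evaluation, can be sketched as follows. All analyte names and concentrations here are hypothetical:

```python
def screen_constituents(max_conc, esv):
    """Retain constituents whose maximum concentration exceeds the ESV,
    or for which no screening benchmark is available."""
    retained = {}
    for analyte, conc in max_conc.items():
        benchmark = esv.get(analyte)
        if benchmark is None or conc > benchmark:
            retained[analyte] = conc  # carry forward into the risk assessment
    return retained

# Hypothetical surface-water data (mg/L), for illustration only
max_conc = {"boron": 1.2, "zinc": 0.05, "nickel": 0.8}
esv = {"boron": 0.75, "zinc": 0.12}  # no benchmark available for nickel
print(screen_constituents(max_conc, esv))
# → {'boron': 1.2, 'nickel': 0.8}
```

Conservatism is deliberate at this stage: a constituent is dropped only when its maximum concentration is clearly below the most protective available benchmark.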

  20. Evaluation of potential risks from ash disposal site leachate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, W.B.; Loh, J.Y.; Bate, M.C.

    1999-04-01

A risk-based approach is used to evaluate potential human health risks associated with a discharge from an ash disposal site into a small stream. The RIVRISK model was used to estimate downstream concentrations and corresponding risks. The modeling and risk analyses focus on boron, the constituent of greatest potential concern to public health at the site investigated, in Riddle Run, Pennsylvania. Prior to performing the risk assessment, the model is validated by comparing observed and predicted results. Agreement is good, and an uncertainty analysis is provided to explain the comparison. The hazard quotient (HQ) for boron is predicted to be greater than 1 at presently regulated compliance points over a range of flow rates. The reference dose (RfD) currently recommended by the United States Environmental Protection Agency (US EPA) was used for the analyses. However, the toxicity of boron as expressed by the RfD is now under review by both the U.S. EPA and the World Health Organization. Alternative reference doses being examined would produce predicted boron hazard quotients of less than 1 at nearly all flow conditions.
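The hazard quotient used above follows the standard noncancer form: chronic daily intake divided by the reference dose, with HQ > 1 flagging potential concern. A minimal sketch for a drinking-water pathway; the input values (and the RfD used here) are assumptions for illustration, not the study's parameters:

```python
def hazard_quotient(conc_mg_per_l, intake_l_per_day, body_weight_kg, rfd_mg_per_kg_day):
    """Noncancer hazard quotient: chronic daily intake (mg/kg-day) over RfD."""
    cdi = conc_mg_per_l * intake_l_per_day / body_weight_kg  # mg/kg-day
    return cdi / rfd_mg_per_kg_day

# Hypothetical inputs; the RfD shown is an assumed placeholder value
hq = hazard_quotient(conc_mg_per_l=2.0, intake_l_per_day=2.0,
                     body_weight_kg=70.0, rfd_mg_per_kg_day=0.2)
print(round(hq, 2))
# → 0.29
```

This also shows why the RfD review matters: because the RfD sits in the denominator, raising it proportionally lowers every predicted HQ, which is how the alternative reference doses push boron below 1 at nearly all flows.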
