Sample records for additive risk model

  1. Using structured additive regression models to estimate risk factors of malaria: analysis of 2010 Malawi malaria indicator survey data.

    PubMed

    Chirombo, James; Lowe, Rachel; Kazembe, Lawrence

    2014-01-01

    After years of implementing Roll Back Malaria (RBM) interventions, the changing landscape of malaria in terms of risk factors and spatial pattern has not been fully investigated. This paper uses the 2010 malaria indicator survey data to investigate if known malaria risk factors remain relevant after many years of interventions. We adopted a structured additive logistic regression model that allowed for spatial correlation, to more realistically estimate malaria risk factors. Our model included child and household level covariates, as well as climatic and environmental factors. Continuous variables were modelled by assuming second order random walk priors, while spatial correlation was specified as a Markov random field prior, with fixed effects assigned diffuse priors. Inference was fully Bayesian resulting in an under five malaria risk map for Malawi. Malaria risk increased with increasing age of the child. With respect to socio-economic factors, the greater the household wealth, the lower the malaria prevalence. A general decline in malaria risk was observed as altitude increased. Minimum temperatures and average total rainfall in the three months preceding the survey did not show a strong association with disease risk. The structured additive regression model offered a flexible extension to standard regression models by enabling simultaneous modelling of possible nonlinear effects of continuous covariates, spatial correlation and heterogeneity, while estimating usual fixed effects of categorical and continuous observed variables. Our results confirmed that malaria epidemiology is a complex interaction of biotic and abiotic factors, both at the individual, household and community level and that risk factors are still relevant many years after extensive implementation of RBM activities.

  2. Using Structured Additive Regression Models to Estimate Risk Factors of Malaria: Analysis of 2010 Malawi Malaria Indicator Survey Data

    PubMed Central

    Chirombo, James; Lowe, Rachel; Kazembe, Lawrence

    2014-01-01

    Background After years of implementing Roll Back Malaria (RBM) interventions, the changing landscape of malaria in terms of risk factors and spatial pattern has not been fully investigated. This paper uses the 2010 malaria indicator survey data to investigate if known malaria risk factors remain relevant after many years of interventions. Methods We adopted a structured additive logistic regression model that allowed for spatial correlation, to more realistically estimate malaria risk factors. Our model included child and household level covariates, as well as climatic and environmental factors. Continuous variables were modelled by assuming second order random walk priors, while spatial correlation was specified as a Markov random field prior, with fixed effects assigned diffuse priors. Inference was fully Bayesian resulting in an under five malaria risk map for Malawi. Results Malaria risk increased with increasing age of the child. With respect to socio-economic factors, the greater the household wealth, the lower the malaria prevalence. A general decline in malaria risk was observed as altitude increased. Minimum temperatures and average total rainfall in the three months preceding the survey did not show a strong association with disease risk. Conclusions The structured additive regression model offered a flexible extension to standard regression models by enabling simultaneous modelling of possible nonlinear effects of continuous covariates, spatial correlation and heterogeneity, while estimating usual fixed effects of categorical and continuous observed variables. Our results confirmed that malaria epidemiology is a complex interaction of biotic and abiotic factors, both at the individual, household and community level and that risk factors are still relevant many years after extensive implementation of RBM activities. PMID:24991915
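
    A minimal frequentist analogue of the structured additive logistic model described in this record can be sketched with the R package mgcv, using penalized splines in place of second-order random walk priors and an "mrf" smooth in place of the Markov random field prior. This is not the authors' fully Bayesian implementation; the data set, column names, and neighbour list below are hypothetical.

    ```r
    # Sketch only: penalized-spline analogue of a structured additive logistic model.
    # Data frame `mis2010`, its columns, and the neighbour list `nb_list` are hypothetical.
    library(mgcv)

    fit <- gam(
      rdt_positive ~ wealth_index +                        # categorical fixed effect
        s(child_age, bs = "ps") +                          # nonlinear effect of child age
        s(altitude,  bs = "ps") +                          # nonlinear effect of altitude
        s(district,  bs = "mrf", xt = list(nb = nb_list)), # spatially structured (MRF) effect
      family = binomial,
      data   = mis2010
    )
    summary(fit)
    ```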

  3. Low dose radiation risks for women surviving the a-bombs in Japan: generalized additive model.

    PubMed

    Dropkin, Greg

    2016-11-24

    Analyses of cancer mortality and incidence in Japanese A-bomb survivors have been used to estimate radiation risks, which are generally higher for women. Relative Risk (RR) is usually modelled as a linear function of dose. Extrapolation from data including high doses predicts small risks at low doses. Generalized Additive Models (GAMs) are flexible methods for modelling non-linear behaviour. GAMs are applied to cancer incidence in female low dose subcohorts, using anonymous public data for the 1958 - 1998 Life Span Study, to test for linearity, explore interactions, adjust for the skewed dose distribution, examine significance below 100 mGy, and estimate risks at 10 mGy. For all solid cancer incidence, RR estimated from 0 - 100 mGy and 0 - 20 mGy subcohorts is significantly raised. The response tapers above 150 mGy. At low doses, RR increases with age-at-exposure and decreases with time-since-exposure, the preferred covariate. Using the empirical cumulative distribution of dose improves model fit, and capacity to detect non-linear responses. RR is elevated over wide ranges of covariate values. Results are stable under simulation, or when removing exceptional data cells, or adjusting neutron RBE. Estimates of Excess RR at 10 mGy using the cumulative dose distribution are 10 - 45 times higher than extrapolations from a linear model fitted to the full cohort. Below 100 mGy, quasipoisson models find significant effects for all solid, squamous, uterus, corpus, and thyroid cancers, and for respiratory cancers when age-at-exposure > 35 yrs. Results for the thyroid are compatible with studies of children treated for tinea capitis, and Chernobyl survivors. Results for the uterus are compatible with studies of UK nuclear workers and the Techa River cohort. Non-linear models find large, significant cancer risks for Japanese women exposed to low dose radiation from the atomic bombings. The risks should be reflected in protection standards.
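
    The flexible dose-response modelling described above can be illustrated, in hedged form, with a quasi-Poisson GAM fitted to grouped cohort data (cases and person-years per cell). This is a generic sketch with hypothetical column names, not the author's exact specification.

    ```r
    # Sketch only: quasi-Poisson GAM for grouped cohort data with a person-year offset.
    # The data frame `lss` and its columns are hypothetical placeholders.
    library(mgcv)

    fit <- gam(
      cases ~ s(dose) + s(age_at_exposure) + s(time_since_exposure) +
        offset(log(person_years)),
      family = quasipoisson(link = "log"),
      data   = lss
    )
    summary(fit)
    ```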

  4. Additive mixed effect model for recurrent gap time data.

    PubMed

    Ding, Jieli; Sun, Liuquan

    2017-04-01

    Gap times between recurrent events are often of primary interest in medical and observational studies. The additive hazards model, focusing on risk differences rather than risk ratios, has been widely used in practice. However, the marginal additive hazards model does not take the dependence among gap times into account. In this paper, we propose an additive mixed effect model to analyze gap time data, and the proposed model includes a subject-specific random effect to account for the dependence among the gap times. Estimating equation approaches are developed for parameter estimation, and the asymptotic properties of the resulting estimators are established. In addition, some graphical and numerical procedures are presented for model checking. The finite sample behavior of the proposed methods is evaluated through simulation studies, and an application to a data set from a clinic study on chronic granulomatous disease is provided.
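
    For orientation, the marginal additive hazards (Aalen) model that this paper extends can be fitted in R with survival::aareg. The sketch below ignores the dependence between gap times (it contains no subject-specific random effect) and uses a hypothetical data frame and column names.

    ```r
    # Sketch only: marginal Aalen additive hazards model (no random effect, so the
    # dependence among gap times is ignored). `cgd_gaps` and its columns are hypothetical.
    library(survival)

    fit <- aareg(Surv(gap_time, event) ~ treatment + age, data = cgd_gaps)
    summary(fit)   # time-varying cumulative regression coefficients (risk differences)
    ```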

  5. Cardiovascular risk assessment: addition of CKD and race to the Framingham equation

    PubMed Central

    Drawz, Paul E.; Baraniuk, Sarah; Davis, Barry R.; Brown, Clinton D.; Colon, Pedro J.; Cuyjet, Aloysius B.; Dart, Richard A.; Graumlich, James F.; Henriquez, Mario A.; Moloo, Jamaluddin; Sakalayen, Mohammed G.; Simmons, Debra L.; Stanford, Carol; Sweeney, Mary Ellen; Wong, Nathan D.; Rahman, Mahboob

    2012-01-01

    Background/Aims The value of the Framingham equation in predicting cardiovascular risk in African Americans and patients with chronic kidney disease (CKD) is unclear. The purpose of the study was to evaluate whether the addition of CKD and race to the Framingham equation improves risk stratification in hypertensive patients. Methods Participants in the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) were studied. Those randomized to doxazosin, age greater than 74 years, and those with a history of coronary heart disease (CHD) were excluded. Two risk stratification models were developed using Cox proportional hazards models in a two-thirds developmental sample. The first model included the traditional Framingham risk factors. The second model included the traditional risk factors plus CKD, defined by eGFR categories, and stratification by race (Black vs. Non-Black). The primary outcome was a composite of fatal CHD, nonfatal MI, coronary revascularization, and hospitalized angina. Results There were a total of 19,811 eligible subjects. In the validation cohort, there was no difference in C-statistics between the Framingham equation and the ALLHAT model including CKD and race. This was consistent across subgroups by race and gender and among those with CKD. One exception was among Non-Black women where the C-statistic was higher for the Framingham equation (0.68 vs 0.65, P=0.02). Additionally, net reclassification improvement was not significant for any subgroup based on race and gender, ranging from −5.5% to 4.4%. Conclusion The addition of CKD status and stratification by race does not improve risk prediction in high-risk hypertensive patients. PMID:23194494

  6. Improving risk assessment of color additives in medical device polymers.

    PubMed

    Chandrasekar, Vaishnavi; Janes, Dustin W; Forrey, Christopher; Saylor, David M; Bajaj, Akhil; Duncan, Timothy V; Zheng, Jiwen; Riaz Ahmed, Kausar B; Casey, Brendan J

    2018-01-01

    Many polymeric medical device materials contain color additives which could lead to adverse health effects. The potential health risk of color additives may be assessed by comparing the amount of color additive released over time to levels deemed to be safe based on available toxicity data. We propose a conservative model for exposure that requires only the diffusion coefficient of the additive in the polymer matrix, D, to be specified. The model is applied here using a model polymer (poly(ether-block-amide), PEBAX 2533) and color additive (quinizarin blue) system. Sorption experiments performed in an aqueous dispersion of quinizarin blue (QB) into neat PEBAX yielded a diffusivity D = 4.8 × 10⁻¹⁰ cm² s⁻¹ and a solubility S = 0.32 wt%. On the basis of these measurements, we validated the model by comparing predictions to the leaching profile of QB from a PEBAX matrix into physiologically representative media. Toxicity data are not available to estimate a safe level of exposure to QB; as a result, we used a Threshold of Toxicological Concern (TTC) value for QB of 90 µg/adult/day. Because only 30% of the QB is released in the first day of leaching for our film thickness and calculated D, we demonstrate that a device may contain significantly more color additive than the TTC value without giving rise to a toxicological concern. The findings suggest that an initial screening-level risk assessment of color additives and other potentially toxic compounds found in device polymers can be improved. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 310-319, 2018.
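
    As a rough illustration of the exposure logic above, Crank's short-time solution for release from a plane sheet gives the released fraction as approximately (4/L)·sqrt(Dt/π). With the reported diffusivity and an assumed film thickness (not taken from the paper), one day of leaching releases on the order of 30% of the additive.

    ```r
    # Sketch only: short-time Fickian release fraction from a thin film.
    # The film thickness L is an assumed, illustrative value, not a value from the paper.
    D <- 4.8e-10        # cm^2/s, reported diffusivity of quinizarin blue in PEBAX
    L <- 0.05           # cm (500 micrometres), hypothetical film thickness
    t <- 24 * 3600      # one day, in seconds

    frac_day1 <- (4 / L) * sqrt(D * t / pi)   # valid while the fraction is below ~0.6
    frac_day1                                 # about 0.29, i.e. roughly 30% in the first day
    ```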

  7. SEMIPARAMETRIC ADDITIVE RISKS REGRESSION FOR TWO-STAGE DESIGN SURVIVAL STUDIES

    PubMed Central

    Li, Gang; Wu, Tong Tong

    2011-01-01

    In this article we study a semiparametric additive risks model (McKeague and Sasieni (1994)) for two-stage design survival data where accurate information is available only on second stage subjects, a subset of the first stage study. We derive two-stage estimators by combining data from both stages. Large sample inferences are developed. As a by-product, we also obtain asymptotic properties of the single stage estimators of McKeague and Sasieni (1994) when the semiparametric additive risks model is misspecified. The proposed two-stage estimators are shown to be asymptotically more efficient than the second stage estimators. They also demonstrate smaller bias and variance for finite samples. The developed methods are illustrated using small intestine cancer data from the SEER (Surveillance, Epidemiology, and End Results) Program. PMID:21931467
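
    For reference, the McKeague and Sasieni (1994) partly parametric additive risk model cited above combines a nonparametric, time-varying effect for some covariates with a constant effect for others, acting additively on the hazard:

    ```latex
    \lambda(t \mid X, Z) \;=\; \alpha(t)^{\top} X(t) \;+\; \beta^{\top} Z(t)
    ```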

  8. Changes in diet, cardiovascular risk factors and modelled cardiovascular risk following diagnosis of diabetes: 1-year results from the ADDITION-Cambridge trial cohort.

    PubMed

    Savory, L A; Griffin, S J; Williams, K M; Prevost, A T; Kinmonth, A-L; Wareham, N J; Simmons, R K

    2014-02-01

    To describe change in self-reported diet and plasma vitamin C, and to examine associations between change in diet and cardiovascular disease risk factors and modelled 10-year cardiovascular disease risk in the year following diagnosis of Type 2 diabetes. Eight hundred and sixty-seven individuals with screen-detected diabetes underwent assessment of self-reported diet, plasma vitamin C, cardiovascular disease risk factors and modelled cardiovascular disease risk at baseline and 1 year (n = 736) in the ADDITION-Cambridge trial. Multivariable linear regression was used to quantify the association between change in diet and cardiovascular disease risk at 1 year, adjusting for change in physical activity and cardio-protective medication. Participants reported significant reductions in energy, fat and sodium intake, and increases in fruit, vegetable and fibre intake over 1 year. The reduction in energy was equivalent to an average-sized chocolate bar; the increase in fruit was equal to one plum per day. There was a small increase in plasma vitamin C levels. Increases in fruit intake and plasma vitamin C were associated with small reductions in anthropometric and metabolic risk factors. Increased vegetable intake was associated with an increase in BMI and waist circumference. Reductions in fat, energy and sodium intake were associated with reduction in HbA1c , waist circumference and total cholesterol/modelled cardiovascular disease risk, respectively. Improvements in dietary behaviour in this screen-detected population were associated with small reductions in cardiovascular disease risk, independently of change in cardio-protective medication and physical activity. Dietary change may have a role to play in the reduction of cardiovascular disease risk following diagnosis of diabetes. © 2013 The Authors. Diabetic Medicine published by John Wiley & Sons Ltd on behalf of Diabetes UK.

  9. Evaluating cardiovascular mortality in type 2 diabetes patients: an analysis based on competing risks Markov chains and additive regression models.

    PubMed

    Rosato, Rosalba; Ciccone, G; Bo, S; Pagano, G F; Merletti, F; Gregori, D

    2007-06-01

    Type 2 diabetes represents a condition significantly associated with increased cardiovascular mortality. The aims of the study are: (i) to estimate the cumulative incidence function for cause-specific mortality using the Cox and Aalen models; (ii) to describe how the prediction of cardiovascular or other-cause mortality changes for patients with different patterns of covariates; (iii) to show whether different statistical methods may give different results. The Cox and Aalen additive regression models, through the Markov chain approach, are used to estimate the cause-specific hazards for cardiovascular or other-cause mortality in a cohort of 2865 type 2 diabetic patients without insulin treatment. The models are compared in the estimation of the risk of death for patients of different severity. For younger patients with a better covariate profile, the cumulative incidence functions estimated by the Cox and Aalen models were almost the same; for patients with the worst covariate profile, the models gave different results: at the end of follow-up the cardiovascular mortality rate estimated by the Cox and Aalen models was 0.26 [95% confidence interval (CI) = 0.21-0.31] and 0.14 (95% CI = 0.09-0.18), respectively. The standard Cox and Aalen models capture the risk process equally well for patients with average profiles of co-morbidities. The Aalen model, in addition, is shown to be better at identifying cause-specific risk of death for patients with more severe clinical profiles. This result is relevant in the development of analytic tools for research and resource management within diabetes care.
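
    The cumulative incidence function referred to above is obtained from the cause-specific hazards (whether estimated by the Cox or the Aalen model) via the standard competing-risks relation:

    ```latex
    F_k(t) \;=\; \int_0^t S(u^-)\,\lambda_k(u)\,du,
    \qquad
    S(u) \;=\; \exp\!\Big(-\sum_{j}\int_0^u \lambda_j(v)\,dv\Big)
    ```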

  10. Usefulness of the addition of beta-2-microglobulin, cystatin C and C-reactive protein to an established risk factors model to improve mortality risk prediction in patients undergoing coronary angiography.

    PubMed

    Nead, Kevin T; Zhou, Margaret J; Caceres, Roxanne Diaz; Sharp, Stephen J; Wehner, Mackenzie R; Olin, Jeffrey W; Cooke, John P; Leeper, Nicholas J

    2013-03-15

    Evidence-based therapies are available to reduce the risk for death from cardiovascular disease, yet many patients go untreated. Novel methods are needed to identify those at highest risk for cardiovascular death. In this study, the biomarkers β2-microglobulin, cystatin C, and C-reactive protein were measured at baseline in a cohort of participants who underwent coronary angiography. Adjusted Cox proportional-hazards models were used to determine whether the biomarkers predicted all-cause and cardiovascular mortality. Additionally, improvements in risk reclassification and discrimination were evaluated by calculating the net reclassification improvement, C-index, and integrated discrimination improvement with the addition of the biomarkers to a baseline model of risk factors for cardiovascular disease and death. During a median follow-up period of 5.6 years, there were 78 deaths among 470 participants. All biomarkers independently predicted future all-cause and cardiovascular mortality. A significant improvement in risk reclassification was observed for all-cause (net reclassification improvement 35.8%, p = 0.004) and cardiovascular (net reclassification improvement 61.9%, p = 0.008) mortality compared to the baseline risk factors model. Additionally, there was significantly increased risk discrimination with C-indexes of 0.777 (change in C-index 0.057, 95% confidence interval 0.016 to 0.097) and 0.826 (change in C-index 0.071, 95% confidence interval 0.010 to 0.133) for all-cause and cardiovascular mortality, respectively. Improvements in risk discrimination were further supported using the integrated discrimination improvement index. In conclusion, this study provides evidence that β2-microglobulin, cystatin C, and C-reactive protein predict mortality and improve risk reclassification and discrimination for a high-risk cohort of patients who undergo coronary angiography. Copyright © 2013 Elsevier Inc. All rights reserved.
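
    The net reclassification improvement reported above is, in its categorical form, the net proportion of events reclassified upward plus the net proportion of non-events reclassified downward once the biomarkers are added to the baseline model:

    ```latex
    \mathrm{NRI} \;=\;
    \big[P(\mathrm{up}\mid \mathrm{event}) - P(\mathrm{down}\mid \mathrm{event})\big]
    \;+\;
    \big[P(\mathrm{down}\mid \mathrm{non\text{-}event}) - P(\mathrm{up}\mid \mathrm{non\text{-}event})\big]
    ```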

  11. Conservative Exposure Predictions for Rapid Risk Assessment of Phase-Separated Additives in Medical Device Polymers.

    PubMed

    Chandrasekar, Vaishnavi; Janes, Dustin W; Saylor, David M; Hood, Alan; Bajaj, Akhil; Duncan, Timothy V; Zheng, Jiwen; Isayeva, Irada S; Forrey, Christopher; Casey, Brendan J

    2018-01-01

    A novel approach for rapid risk assessment of targeted leachables in medical device polymers is proposed and validated. Risk evaluation involves understanding the potential of these additives to migrate out of the polymer, and comparing their exposure to a toxicological threshold value. In this study, we propose that a simple diffusive transport model can be used to provide conservative exposure estimates for phase-separated color additives in device polymers. This model has been illustrated using a representative phthalocyanine color additive (manganese phthalocyanine, MnPC) and polymer (PEBAX 2533) system. Sorption experiments of MnPC into PEBAX were conducted in order to experimentally determine the diffusion coefficient, D = (1.6 ± 0.5) × 10⁻¹¹ cm²/s, and matrix solubility limit, Cs = 0.089 wt.%, and model-predicted exposure values were validated by extraction experiments. Exposure values for the color additive were compared to a toxicological threshold for a sample risk assessment. Results from this study indicate that a diffusion model-based approach to predict exposure has considerable potential for use as a rapid, screening-level tool to assess the risk of color additives and other small-molecule additives in medical device polymers.

  12. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers may...

  13. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers may...

  14. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers may...

  15. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers may...

  16. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers may...

  17. Quantile uncertainty and value-at-risk model risk.

    PubMed

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
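
    The quantile risk estimate at issue above is the usual Value-at-Risk, i.e. a quantile of the loss distribution over the holding period:

    ```latex
    \mathrm{VaR}_{\alpha}(L) \;=\; \inf\{\, \ell \in \mathbb{R} : P(L > \ell) \le 1-\alpha \,\}
    ```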

  18. Public risk perception of food additives and food scares. The case in Suzhou, China.

    PubMed

    Wu, Linhai; Zhong, Yingqi; Shan, Lijie; Qin, Wei

    2013-11-01

    This study examined the factors affecting public risk perception of food additive safety and possible resulting food scares using a survey conducted in Suzhou, Jiangsu Province, China. The model was proposed based on literature relating to the role of risk perception and information perception of public purchase intention under food scares. Structural equation modeling (SEM) was used for data analysis. The results showed that attitude towards behavior, subjective norm and information perception exerted moderate to high effect on food scares, and the effects were also mediated by risk perceptions of additive safety. Significant covariance was observed between attitudes toward behavior, subjective norm and information perception. Establishing an effective mechanism of food safety risk communication, releasing information of government supervision on food safety in a timely manner, curbing misleading media reports on public food safety risk, and enhancing public knowledge of the food additives are key to the development and implementation of food safety risk management policies by the Chinese government. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Breast cancer risks and risk prediction models.

    PubMed

    Engel, Christoph; Fischer, Christine

    2015-02-01

    BRCA1/2 mutation carriers have a considerably increased risk of developing breast and ovarian cancer. The personalized clinical management of carriers and other at-risk individuals depends on precise knowledge of the cancer risks. In this report, we give an overview of the present literature on empirical cancer risks, and we describe risk prediction models that are currently used for individual risk assessment in clinical practice. Cancer risks show large variability between studies. Breast cancer risks are at 40-87% for BRCA1 mutation carriers and 18-88% for BRCA2 mutation carriers. For ovarian cancer, the risk estimates are in the range of 22-65% for BRCA1 and 10-35% for BRCA2. The contralateral breast cancer risk is high (10-year risk after first cancer 27% for BRCA1 and 19% for BRCA2). Risk prediction models have been proposed to provide more individualized risk prediction, using additional knowledge on family history, mode of inheritance of major genes, and other genetic and non-genetic risk factors. User-friendly software tools have been developed that serve as a basis for decision-making in family counseling units. In conclusion, further assessment of cancer risks and model validation is needed, ideally based on prospective cohort studies. To obtain such data, clinical management of carriers and other at-risk individuals should always be accompanied by standardized scientific documentation.

  20. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    ERIC Educational Resources Information Center

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  1. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk protection and indemnity insurance. 308.204 Section 308.204 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk...

  2. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk protection and indemnity insurance. 308.204 Section 308.204 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk...

  3. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk protection and indemnity insurance. 308.204 Section 308.204 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk...

  4. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk protection and indemnity insurance. 308.204 Section 308.204 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk...

  5. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk protection and indemnity insurance. 308.204 Section 308.204 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk...

  6. Bayesian structured additive regression modeling of epidemic data: application to cholera

    PubMed Central

    2012-01-01

    Background A significant interest in spatial epidemiology lies in identifying associated risk factors which enhance the risk of infection. Most studies, however, make no, or limited, use of the spatial structure of the data, as well as possible nonlinear effects of the risk factors. Methods We develop a Bayesian Structured Additive Regression model for cholera epidemic data. Model estimation and inference is based on a fully Bayesian approach via Markov Chain Monte Carlo (MCMC) simulations. The model is applied to cholera epidemic data in the Kumasi Metropolis, Ghana. Proximity to refuse dumps, density of refuse dumps, and proximity to potential cholera reservoirs were modeled as continuous functions; presence of slum settlers and population density were modeled as fixed effects, whereas spatial references to the communities were modeled as structured and unstructured spatial effects. Results We observe that the risk of cholera is associated with slum settlements and high population density. The risk of cholera is equal and lower for communities with fewer refuse dumps, but variable and higher for communities with more refuse dumps. The risk is also lower for communities distant from refuse dumps and potential cholera reservoirs. The results also indicate distinct spatial variation in the risk of cholera infection. Conclusion The study highlights the usefulness of the Bayesian semi-parametric regression model in analyzing public health data. These findings could serve as novel information to help health planners and policy makers in making effective decisions to control or prevent cholera epidemics. PMID:22866662

  7. Evidence That Environmental and Familial Risks for Psychosis Additively Impact a Multidimensional Subthreshold Psychosis Syndrome.

    PubMed

    Pries, Lotta-Katrin; Guloksuz, Sinan; Ten Have, Margreet; de Graaf, Ron; van Dorsselaer, Saskia; Gunther, Nicole; Rauschenberg, Christian; Reininghaus, Ulrich; Radhakrishnan, Rajiv; Bak, Maarten; Rutten, Bart P F; van Os, Jim

    2018-06-06

    The observed link between positive psychotic experiences (PE) and psychosis spectrum disorder (PSD) may be stronger depending on concomitant presence of PE with other dimensions of psychopathology. We examined whether the effect of common risk factors for PSD on PE is additive and whether the impact of risk factors on the occurrence of PE depends on the co-occurrence of other symptom dimensions (affective dysregulation, negative symptoms, and cognitive alteration). Data from the Netherlands Mental Health Survey and Incidence Study 2 were used. Risk factors included childhood adversity, cannabis use, urbanicity, foreign born, hearing impairment, and family history of affective disorders. Logistic regression models were applied to test (1) the additive effect of risk factors (4 levels) on PE and (2) the moderating effects of symptom dimensions on the association between risk factors (present/absent) and PE, using additive interaction, expressed as the interaction contrast ratio. Risk factors were additive: the greater the number of risk factors, the greater the odds of PE. Furthermore, concomitant presence of the other symptom dimensions all increased the impact of risk factors on PE. After controlling for age, sex, and education, only affective dysregulation and negative symptoms remained significant moderators; only affective dysregulation remained a significant moderator if all dimensions were adjusted for each other. Risk factors may not be directly associated with PE but additively give rise to a multidimensional subthreshold state anticipating the multidimensional clinical syndrome. Early motivational and cognitive impairments in the context of PE may be reducible to affective dysregulation.
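
    The additive-interaction measure used above, the interaction contrast ratio (also known as RERI), is computed from the relative risks for exposure to one factor alone, the other alone, and both together:

    ```latex
    \mathrm{ICR} \;=\; RR_{11} \;-\; RR_{10} \;-\; RR_{01} \;+\; 1
    ```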

  8. Chemical Mixture Risk Assessment Additivity-Based Approaches

    EPA Science Inventory

    Powerpoint presentation includes additivity-based chemical mixture risk assessment methods. Basic concepts, theory and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.

  9. Widespread non-additive and interaction effects within HLA loci modulate the risk of autoimmune diseases.

    PubMed

    Lenz, Tobias L; Deutsch, Aaron J; Han, Buhm; Hu, Xinli; Okada, Yukinori; Eyre, Stephen; Knapp, Michael; Zhernakova, Alexandra; Huizinga, Tom W J; Abecasis, Gonçalo; Becker, Jessica; Boeckxstaens, Guy E; Chen, Wei-Min; Franke, Andre; Gladman, Dafna D; Gockel, Ines; Gutierrez-Achury, Javier; Martin, Javier; Nair, Rajan P; Nöthen, Markus M; Onengut-Gumuscu, Suna; Rahman, Proton; Rantapää-Dahlqvist, Solbritt; Stuart, Philip E; Tsoi, Lam C; van Heel, David A; Worthington, Jane; Wouters, Mira M; Klareskog, Lars; Elder, James T; Gregersen, Peter K; Schumacher, Johannes; Rich, Stephen S; Wijmenga, Cisca; Sunyaev, Shamil R; de Bakker, Paul I W; Raychaudhuri, Soumya

    2015-09-01

    Human leukocyte antigen (HLA) genes confer substantial risk for autoimmune diseases on a log-additive scale. Here we speculated that differences in autoantigen-binding repertoires between a heterozygote's two expressed HLA variants might result in additional non-additive risk effects. We tested the non-additive disease contributions of classical HLA alleles in patients and matched controls for five common autoimmune diseases: rheumatoid arthritis (ncases = 5,337), type 1 diabetes (T1D; ncases = 5,567), psoriasis vulgaris (ncases = 3,089), idiopathic achalasia (ncases = 727) and celiac disease (ncases = 11,115). In four of the five diseases, we observed highly significant, non-additive dominance effects (rheumatoid arthritis, P = 2.5 × 10(-12); T1D, P = 2.4 × 10(-10); psoriasis, P = 5.9 × 10(-6); celiac disease, P = 1.2 × 10(-87)). In three of these diseases, the non-additive dominance effects were explained by interactions between specific classical HLA alleles (rheumatoid arthritis, P = 1.8 × 10(-3); T1D, P = 8.6 × 10(-27); celiac disease, P = 6.0 × 10(-100)). These interactions generally increased disease risk and explained moderate but significant fractions of phenotypic variance (rheumatoid arthritis, 1.4%; T1D, 4.0%; celiac disease, 4.1%) beyond a simple additive model.
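
    In the usual coding behind such analyses, with g the count of a given classical HLA allele (0, 1 or 2), the non-additive (dominance) effect tested above corresponds to a heterozygote-specific deviation from the log-additive allele effect:

    ```latex
    \operatorname{logit} P(\text{disease}) \;=\; \mu \;+\; \beta\, g \;+\; \delta\, \mathbf{1}\{g = 1\}
    ```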

  10. Improving coeliac disease risk prediction by testing non-HLA variants additional to HLA variants.

    PubMed

    Romanos, Jihane; Rosén, Anna; Kumar, Vinod; Trynka, Gosia; Franke, Lude; Szperl, Agata; Gutierrez-Achury, Javier; van Diemen, Cleo C; Kanninga, Roan; Jankipersadsing, Soesma A; Steck, Andrea; Eisenbarth, Georges; van Heel, David A; Cukrowska, Bozena; Bruno, Valentina; Mazzilli, Maria Cristina; Núñez, Concepcion; Bilbao, Jose Ramon; Mearin, M Luisa; Barisani, Donatella; Rewers, Marian; Norris, Jill M; Ivarsson, Anneli; Boezen, H Marieke; Liu, Edwin; Wijmenga, Cisca

    2014-03-01

    The majority of coeliac disease (CD) patients are not being properly diagnosed and therefore remain untreated, leading to a greater risk of developing CD-associated complications. The major genetic risk factors, the HLA-DQ2 and HLA-DQ8 heterodimers, are already used clinically to help exclude disease. However, approximately 40% of the population carry these alleles and the majority never develop CD. We explored whether CD risk prediction can be improved by adding non-HLA susceptibility variants to common HLA testing. We developed an average weighted genetic risk score with 10, 26 and 57 single nucleotide polymorphisms (SNPs) in 2675 cases and 2815 controls and assessed the improvement in risk prediction provided by the non-HLA SNPs. Moreover, we assessed the transferability of the genetic risk model with 26 non-HLA variants to a nested case-control population (n=1709) and a prospective cohort (n=1245) and then tested how well this model predicted CD outcome for 985 independent individuals. Adding 57 non-HLA variants to HLA testing showed a statistically significant improvement compared to scores from models based on HLA only, HLA plus 10 SNPs and HLA plus 26 SNPs. With 57 non-HLA variants, the area under the receiver operating characteristic curve reached 0.854 compared to 0.823 for HLA only, and 11.1% of individuals were reclassified to a more accurate risk group. We show that the risk model with HLA plus 26 SNPs is useful in independent populations. Predicting risk with 57 additional non-HLA variants improved the identification of potential CD patients. This demonstrates a possible role for combined HLA and non-HLA genetic testing in diagnostic work for CD.
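
    A weighted genetic risk score of the kind described above is simple to compute: each SNP's risk-allele count is weighted by its log odds ratio and the weighted counts are summed per person. The sketch below uses simulated genotypes and illustrative weights, not the study's data.

    ```r
    # Sketch only: weighted genetic risk score from 57 SNPs, using simulated data.
    set.seed(1)
    geno   <- matrix(rbinom(2000 * 57, size = 2, prob = 0.3), nrow = 2000)  # 0/1/2 risk-allele counts
    log_or <- rnorm(57, mean = 0, sd = 0.15)          # illustrative per-SNP log odds ratios
    grs    <- as.vector(geno %*% log_or)              # weighted genetic risk score per person
    cd     <- rbinom(2000, 1, plogis(-2 + grs))       # simulated case/control status

    summary(glm(cd ~ grs, family = binomial))         # association of the score with disease
    ```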

  11. Widespread non-additive and interaction effects within HLA loci modulate the risk of autoimmune diseases

    PubMed Central

    Lenz, Tobias L.; Deutsch, Aaron J.; Han, Buhm; Hu, Xinli; Okada, Yukinori; Eyre, Stephen; Knapp, Michael; Zhernakova, Alexandra; Huizinga, Tom W.J.; Abecasis, Goncalo; Becker, Jessica; Boeckxstaens, Guy E.; Chen, Wei-Min; Franke, Andre; Gladman, Dafna D.; Gockel, Ines; Gutierrez-Achury, Javier; Martin, Javier; Nair, Rajan P.; Nöthen, Markus M.; Onengut-Gumuscu, Suna; Rahman, Proton; Rantapää-Dahlqvist, Solbritt; Stuart, Philip E.; Tsoi, Lam C.; Van Heel, David A.; Worthington, Jane; Wouters, Mira M.; Klareskog, Lars; Elder, James T.; Gregersen, Peter K.; Schumacher, Johannes; Rich, Stephen S.; Wijmenga, Cisca; Sunyaev, Shamil R.; de Bakker, Paul I.W.; Raychaudhuri, Soumya

    2015-01-01

    Human leukocyte antigen (HLA) genes confer strong risk for autoimmune diseases on a log-additive scale. Here we speculated that differences in autoantigen binding repertoires between a heterozygote’s two expressed HLA variants may result in additional non-additive risk effects. We tested non-additive disease contributions of classical HLA alleles in patients and matched controls for five common autoimmune diseases: rheumatoid arthritis (RA, Ncases=5,337), type 1 diabetes (T1D, Ncases=5,567), psoriasis vulgaris (Ncases=3,089), idiopathic achalasia (Ncases=727), and celiac disease (Ncases=11,115). In four out of five diseases, we observed highly significant non-additive dominance effects (RA: P=2.5×10−12; T1D: P=2.4×10−10; psoriasis: P=5.9×10−6; celiac disease: P=1.2×10−87). In three of these diseases, the dominance effects were explained by interactions between specific classical HLA alleles (RA: P=1.8×10−3; T1D: P=8.6×10−27; celiac disease: P=6.0×10−100). These interactions generally increased disease risk and explained moderate but significant fractions of phenotypic variance (RA: 1.4%, T1D: 4.0%, and celiac disease: 4.1%, beyond a simple additive model). PMID:26258845

  12. Calculating excess lifetime risk in relative risk models.

    PubMed Central

    Vaeth, M; Pierce, D A

    1990-01-01

    When assessing the impact of radiation exposure it is common practice to present the final conclusions in terms of excess lifetime cancer risk in a population exposed to a given dose. The present investigation is mainly a methodological study focusing on some of the major issues and uncertainties involved in calculating such excess lifetime risks and related risk projection methods. The age-constant relative risk model used in the recent analyses of the cancer mortality that was observed in the follow-up of the cohort of A-bomb survivors in Hiroshima and Nagasaki is used to describe the effect of the exposure on the cancer mortality. In this type of model the excess relative risk is constant in age-at-risk, but depends on the age-at-exposure. Calculation of excess lifetime risks usually requires rather complicated life-table computations. In this paper we propose a simple approximation to the excess lifetime risk; the validity of the approximation for low levels of exposure is justified empirically as well as theoretically. This approximation provides important guidance in understanding the influence of the various factors involved in risk projections. Among the further topics considered are the influence of a latent period, the additional problems involved in calculations of site-specific excess lifetime cancer risks, the consequences of a leveling off or a plateau in the excess relative risk, and the uncertainties involved in transferring results from one population to another. The main part of this study relates to the situation with a single, instantaneous exposure, but a brief discussion is also given of the problem with a continuous exposure at a low-dose rate. PMID:2269245
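
    In life-table terms, the excess lifetime risk discussed above is the difference between lifetime cancer-death probabilities computed with and without the exposure; under an age-constant excess relative risk r(d) that is small, it is roughly the baseline lifetime risk scaled by r(d), which is a simple approximation of the kind the paper proposes:

    ```latex
    \mathrm{ELR}(d) \;=\; R_{\mathrm{exposed}}(d) \;-\; R_{0}
    \;\;\approx\;\; r(d)\, R_{0}
    ```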

  13. Calculating excess lifetime risk in relative risk models.

    PubMed

    Vaeth, M; Pierce, D A

    1990-07-01

    When assessing the impact of radiation exposure it is common practice to present the final conclusions in terms of excess lifetime cancer risk in a population exposed to a given dose. The present investigation is mainly a methodological study focusing on some of the major issues and uncertainties involved in calculating such excess lifetime risks and related risk projection methods. The age-constant relative risk model used in the recent analyses of the cancer mortality that was observed in the follow-up of the cohort of A-bomb survivors in Hiroshima and Nagasaki is used to describe the effect of the exposure on the cancer mortality. In this type of model the excess relative risk is constant in age-at-risk, but depends on the age-at-exposure. Calculation of excess lifetime risks usually requires rather complicated life-table computations. In this paper we propose a simple approximation to the excess lifetime risk; the validity of the approximation for low levels of exposure is justified empirically as well as theoretically. This approximation provides important guidance in understanding the influence of the various factors involved in risk projections. Among the further topics considered are the influence of a latent period, the additional problems involved in calculations of site-specific excess lifetime cancer risks, the consequences of a leveling off or a plateau in the excess relative risk, and the uncertainties involved in transferring results from one population to another. The main part of this study relates to the situation with a single, instantaneous exposure, but a brief discussion is also given of the problem with a continuous exposure at a low-dose rate.

  14. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the risk of Loss of Mission (LOM) - Abort Required and is Successful, Loss of Crew (LOC) - Vehicle Crashes or Cannot Reach Orbit, and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.

  15. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592
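
    One concrete instance of the additive structure described above, for a functional response Y_i(t) with a scalar covariate z_i, a functional covariate X_i(s), and a functional random intercept for grouping unit g(i), is:

    ```latex
    E\big[Y_i(t)\big] \;=\; \mu(t) \;+\; f(z_i, t) \;+\; \int X_i(s)\,\beta(s,t)\,ds \;+\; b_{g(i)}(t)
    ```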

  16. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.

  17. Risks associated with endotoxins in feed additives produced by fermentation.

    PubMed

    Wallace, R John; Gropp, Jürgen; Dierick, Noël; Costa, Lucio G; Martelli, Giovanna; Brantom, Paul G; Bampidis, Vasileios; Renshaw, Derek W; Leng, Lubomir

    2016-01-15

    Increasingly, feed additives for livestock, such as amino acids and vitamins, are being produced by Gram-negative bacteria, particularly Escherichia coli. The potential therefore exists for animals, consumers and workers to be exposed to possibly harmful amounts of endotoxin from these products. The aim of this review was to assess the extent of the risk from endotoxins in feed additives and to calculate how such risk can be assessed from the properties of the additive. Livestock are frequently exposed to a relatively high content of endotoxin in the diet: no additional hazard to livestock would be anticipated if the endotoxin concentration of the feed additive falls in the same range as feedstuffs. Consumer exposure will be unaffected by the consumption of food derived from animals receiving endotoxin-containing feed, because the small concentrations of endotoxin absorbed do not accumulate in edible tissues. In contrast, workers processing a dusty additive may be exposed to hazardous amounts of endotoxin even if the endotoxin concentration of the product is low. A calculation method is proposed to compare the potential risk to the worker, based on the dusting potential, the endotoxin concentration and technical guidance of the European Food Safety Authority, with national exposure limits.

  18. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position t along a tract in the brain. In one example, the response is disease status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.
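
    Written out, the FGAM models the link-transformed mean response as the integral over the index t of an unknown bivariate surface F evaluated at the functional covariate and the index:

    ```latex
    g\big(E[Y \mid X]\big) \;=\; \theta_0 \;+\; \int F\big(X(t),\, t\big)\,dt
    ```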

  19. Non-additive and epistatic effects of HLA polymorphisms contributing to risk of adult glioma.

    PubMed

    Zhang, Chenan; de Smith, Adam J; Smirnov, Ivan V; Wiencke, John K; Wiemels, Joseph L; Witte, John S; Walsh, Kyle M

    2017-11-01

    Although genome-wide association studies have identified several susceptibility loci for adult glioma, little is known regarding the potential contribution of genetic variation in the human leukocyte antigen (HLA) region to glioma risk. HLA associations have been reported for various malignancies, with many studies investigating selected candidate HLA polymorphisms. However, no systematic analysis has been conducted in glioma patients, and no investigation into potential non-additive effects has been described. We conducted comprehensive genetic analyses of HLA variants among 1746 adult glioma patients and 2312 controls of European-ancestry from the GliomaScan Consortium. Genotype data were generated with the Illumina 660-Quad array, and we imputed HLA alleles using a reference panel of 5225 individuals in the Type 1 Diabetes Genetics Consortium who underwent high-resolution HLA typing via next-generation sequencing. Case-control comparisons were adjusted for population stratification using ancestry-informative principal components. Because alleles in different loci across the HLA region are linked, we created multigene haplotypes consisting of the genes DRB1, DQA1, and DQB1. Although none of the haplotypes were associated with glioma in additive models, inclusion of a dominance term significantly improved the model for multigene haplotype HLA-DRB1*1501-DQA1*0102-DQB1*0602 (P = 0.002). Heterozygous carriers of the haplotype had an increased risk of glioma [odds ratio (OR) 1.23; 95% confidence interval (CI) 1.01-1.49], while homozygous carriers were at decreased risk compared with non-carriers (OR 0.64; 95% CI 0.40-1.01). Our results suggest that the DRB1*1501-DQA1*0102-DQB1*0602 haplotype may contribute to the risk of glioma in a non-additive manner, with the positive dominance effect partly explained by an epistatic interaction with HLA-DRB1*0401-DQA1*0301-DQB1*0301.

  20. Adiponectin provides additional information to conventional cardiovascular risk factors for assessing the risk of atherosclerosis in both genders.

    PubMed

    Yoon, Jin-Ha; Kim, Sung-Kyung; Choi, Ho-June; Choi, Soo-In; Cha, So-Youn; Koh, Sang-Baek; Kang, Hee-Taik; Ahn, Song Vogue

    2013-01-01

    This study evaluated the relation between adiponectin and atherosclerosis in both genders, and investigated whether adiponectin provides useful additional information for assessing the risk of atherosclerosis. We measured serum adiponectin levels and other cardiovascular risk factors in 1033 subjects (454 men, 579 women) from the Korean Genomic Rural Cohort study. Carotid intima-media-thickness (CIMT) was used as measure of atherosclerosis. Odds ratios (ORs) with 95% confidence intervals (95% CI) were calculated using multiple logistic regression, and receiver operating characteristic curves (ROC), the category-free net reclassification improvement (NRI) and integrated discrimination improvement (IDI) were calculated. After adjustment for conventional cardiovascular risk factors, such as age, waist circumference, smoking history, low-density and high-density lipoprotein cholesterol, triglycerides, systolic blood pressure and insulin resistance, the ORs (95%CI) of the third tertile adiponectin group were 0.42 (0.25-0.72) in men and 0.47 (0.29-0.75) in women. The area under the curve (AUC) on the ROC analysis increased significantly by 0.025 in men and 0.022 in women when adiponectin was added to the logistic model of conventional cardiovascular risk factors (AUC in men: 0.655 to 0.680, p = 0.038; AUC in women: 0.654 to 0.676, p = 0.041). The NRI was 0.32 (95%CI: 0.13-0.50, p<0.001), and the IDI was 0.03 (95%CI: 0.01-0.04, p<0.001) for men. For women, the category-free NRI was 0.18 (95%CI: 0.02-0.34, p = 0.031) and the IDI was 0.003 (95%CI: -0.002-0.008, p = 0.189). Adiponectin and atherosclerosis were significantly related in both genders, and these relationships were independent of conventional cardiovascular risk factors. Furthermore, adiponectin provided additional information to conventional cardiovascular risk factors regarding the risk of atherosclerosis.

  1. Additive genetic risk from five serotonin system polymorphisms interacts with interpersonal stress to predict depression.

    PubMed

    Vrshek-Schallhorn, Suzanne; Stroud, Catherine B; Mineka, Susan; Zinbarg, Richard E; Adam, Emma K; Redei, Eva E; Hammen, Constance; Craske, Michelle G

    2015-11-01

    Behavioral genetic research supports polygenic models of depression in which many genetic variations each contribute a small amount of risk, and prevailing diathesis-stress models suggest gene-environment interactions (G×E). Multilocus profile scores of additive risk offer an approach that is consistent with polygenic models of depression risk. In a first demonstration of this approach in a G×E predicting depression, we created an additive multilocus profile score from 5 serotonin system polymorphisms (1 each in the genes HTR1A, HTR2A, HTR2C, and 2 in TPH2). Analyses focused on 2 forms of interpersonal stress as environmental risk factors. Using 5 years of longitudinal diagnostic and life stress interviews from 387 emerging young adults in the Youth Emotion Project, survival analyses show that this multilocus profile score interacts with major interpersonal stressful life events to predict major depressive episode onsets (hazard ratio [HR] = 1.815, p = .007). Simultaneously, there was a significant protective effect of the profile score without a recent event (HR = 0.83, p = .030). The G×E effect with interpersonal chronic stress was not significant (HR = 1.15, p = .165). Finally, effect sizes for genetic factors examined ignoring stress suggested such an approach could lead to overlooking or misinterpreting genetic effects. Both the G×E effect and the protective simple main effect were replicated in a sample of early adolescent girls (N = 105). We discuss potential benefits of the multilocus genetic profile score approach and caveats for future research. (c) 2015 APA, all rights reserved.
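
    The gene-by-environment test described above can be sketched as a summed multilocus score interacted with a recent-stressor indicator in a time-to-onset model. The Cox fit below is a simplified stand-in for the authors' survival analysis, with a hypothetical data frame and column names.

    ```r
    # Sketch only: additive multilocus profile score interacted with a major
    # interpersonal stressful life event. `dat` and its columns are hypothetical.
    library(survival)

    dat$profile <- rowSums(dat[, c("htr1a", "htr2a", "htr2c", "tph2_a", "tph2_b")])  # summed risk genotypes
    fit <- coxph(Surv(time_months, mde_onset) ~ profile * major_interpersonal_event, data = dat)
    summary(fit)   # the interaction term is the GxE effect of interest
    ```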

  2. Additive Genetic Risk from Five Serotonin System Polymorphisms Interacts with Interpersonal Stress to Predict Depression

    PubMed Central

    Vrshek-Schallhorn, Suzanne; Stroud, Catherine B.; Mineka, Susan; Zinbarg, Richard E.; Adam, Emma K.; Redei, Eva E.; Hammen, Constance; Craske, Michelle G.

    2016-01-01

    Behavioral genetic research supports polygenic models of depression in which many genetic variations each contribute a small amount of risk, and prevailing diathesis-stress models suggest gene-environment interactions (GxE). Multilocus profile scores of additive risk offer an approach that is consistent with polygenic models of depression risk. In a first demonstration of this approach in a GxE predicting depression, we created an additive multilocus profile score from five serotonin system polymorphisms (one each in the genes HTR1A, HTR2A, HTR2C, and two in TPH2). Analyses focused on two forms of interpersonal stress as environmental risk factors. Using five years of longitudinal diagnostic and life stress interviews from 387 emerging young adults in the Youth Emotion Project, survival analyses show that this multilocus profile score interacts with major interpersonal stressful life events to predict major depressive episode onsets (HR = 1.815, p = .007). Simultaneously, there was a significant protective effect of the profile score without a recent event (HR = 0.83, p = .030). The GxE effect with interpersonal chronic stress was not significant (HR = 1.15, p = .165). Finally, effect sizes for genetic factors examined ignoring stress suggested such an approach could lead to overlooking or misinterpreting genetic effects. Both the GxE effect and the protective simple main effect were replicated in a sample of early adolescent girls (N = 105). We discuss potential benefits of the multilocus genetic profile score approach and caveats for future research. PMID:26595467

  3. Modeling the cardiovascular system using a nonlinear additive autoregressive model with exogenous input

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Suhrbier, A.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2008-07-01

    The parameters of heart rate variability and blood pressure variability have proved to be useful analytical tools in cardiovascular physics and medicine. Model-based analysis of these variabilities additionally leads to new prognostic information about the mechanisms behind regulation in the cardiovascular system. In this paper, we analyze the complex interaction between heart rate, systolic blood pressure, and respiration using nonparametrically fitted nonlinear additive autoregressive models with external inputs. To this end, we consider measurements of healthy persons and patients suffering from obstructive sleep apnea syndrome (OSAS), with and without hypertension. It is shown that the proposed nonlinear models are capable of describing short-term fluctuations in heart rate as well as systolic blood pressure significantly better than similar linear ones, which supports the assumption that heart rate and blood pressure are nonlinearly controlled. Furthermore, the comparison of the nonlinear and linear approaches reveals that the heart rate and blood pressure variability in healthy subjects reflects higher levels of noise as well as nonlinearity than in patients suffering from OSAS. The residual analysis points to a further source of heart rate and blood pressure variability in healthy subjects, in addition to heart rate, systolic blood pressure, and respiration. Comparison of the nonlinear models within and among the different groups of subjects suggests that the models can discriminate between the cohorts, which could lead to a stratification of hypertension risk in OSAS patients.

  4. How to interpret a small increase in AUC with an additional risk prediction marker: decision analysis comes through.

    PubMed

    Baker, Stuart G; Schuit, Ewoud; Steyerberg, Ewout W; Pencina, Michael J; Vickers, Andrew; Moons, Karel G M; Mol, Ben W J; Lindeman, Karen S

    2014-09-28

    An important question in the evaluation of an additional risk prediction marker is how to interpret a small increase in the area under the receiver operating characteristic curve (AUC). Many researchers believe that a change in AUC is a poor metric because it increases only slightly with the addition of a marker with a large odds ratio. Because it is not possible on purely statistical grounds to choose between the odds ratio and AUC, we invoke decision analysis, which incorporates costs and benefits. For example, a timely estimate of the risk of later non-elective operative delivery can help a woman in labor decide if she wants an early elective cesarean section to avoid greater complications from possible later non-elective operative delivery. A basic risk prediction model for later non-elective operative delivery involves only antepartum markers. Because adding intrapartum markers to this risk prediction model increases AUC by 0.02, we questioned whether this small improvement is worthwhile. A key decision-analytic quantity is the risk threshold, here the risk of later non-elective operative delivery at which a patient would be indifferent between an early elective cesarean section and usual care. For a range of risk thresholds, we found that an increase in the net benefit of risk prediction requires collecting intrapartum marker data on 68 to 124 women for every correct prediction of later non-elective operative delivery. Because data collection is non-invasive, this test tradeoff of 68 to 124 is clinically acceptable, indicating the value of adding intrapartum markers to the risk prediction model. Copyright © 2014 John Wiley & Sons, Ltd.
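
    The decision-analytic quantities used above can be made concrete with a small amount of code. The sketch below (synthetic data, illustrative marker names) computes net benefit at a chosen risk threshold and one common formulation of the test tradeoff, the number of women on whom the additional markers would need to be collected per extra correct prediction, taken here as the reciprocal of the gain in net benefit; it is a simplified illustration of the calculation, not the authors' analysis.

    ```python
    # Hedged sketch: net benefit at a risk threshold and the implied "test tradeoff"
    # when an additional marker is added to a risk prediction model. Synthetic data.
    import numpy as np

    def net_benefit(risk, outcome, threshold):
        """Net benefit of acting on patients whose predicted risk exceeds the threshold."""
        act = risk >= threshold
        n = len(outcome)
        tp = np.sum(act & (outcome == 1)) / n
        fp = np.sum(act & (outcome == 0)) / n
        return tp - fp * threshold / (1 - threshold)

    rng = np.random.default_rng(0)
    n = 5000
    x_ante = rng.standard_normal(n)                  # antepartum marker summary (illustrative)
    x_intra = rng.standard_normal(n)                 # intrapartum marker summary (illustrative)
    risk_true = 1 / (1 + np.exp(-(-2.0 + 0.8 * x_ante + 0.5 * x_intra)))
    outcome = rng.binomial(1, risk_true)

    risk_base = 1 / (1 + np.exp(-(-2.0 + 0.8 * x_ante)))    # baseline model without the marker
    risk_full = risk_true                                    # model with the added marker

    for pt in (0.10, 0.20, 0.30):
        delta_nb = net_benefit(risk_full, outcome, pt) - net_benefit(risk_base, outcome, pt)
        if delta_nb > 0:
            # number of extra marker measurements per additional correct prediction
            print(f"threshold {pt:.2f}: test tradeoff ~ {1 / delta_nb:.0f}")
    ```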

  5. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational process and material modeling of powder-bed additive manufacturing of IN 718. The goals are to optimize material build parameters with reduced time and cost through modeling, increase understanding of build properties, increase the reliability of builds, decrease the time to adoption of the process for critical hardware, and potentially decrease post-build heat treatments. The approach is to conduct single-track and coupon builds at various build parameters, record build parameter information and QM Meltpool data, refine the Applied Optimization powder-bed AM process model using these data, report thermal modeling results, conduct metallography of build samples, calibrate STK models using the metallography findings, run STK models using AO thermal profiles and report the STK modeling results, and validate the modeling with an additional build. Results to date: photodiode intensity measurements are highly linear with power input; melt pool intensity is highly correlated with melt pool size; and melt pool size and intensity increase with power. Applied Optimization will use the data to develop a powder-bed additive manufacturing process model.

  6. Adiponectin Provides Additional Information to Conventional Cardiovascular Risk Factors for Assessing the Risk of Atherosclerosis in Both Genders

    PubMed Central

    Yoon, Jin-Ha; Kim, Sung-Kyung; Choi, Ho-June; Choi, Soo-In; Cha, So-Youn; Koh, Sang-Baek

    2013-01-01

    Background This study evaluated the relation between adiponectin and atherosclerosis in both genders, and investigated whether adiponectin provides useful additional information for assessing the risk of atherosclerosis. Methods We measured serum adiponectin levels and other cardiovascular risk factors in 1033 subjects (454 men, 579 women) from the Korean Genomic Rural Cohort study. Carotid intima–media-thickness (CIMT) was used as measure of atherosclerosis. Odds ratios (ORs) with 95% confidence intervals (95% CI) were calculated using multiple logistic regression, and receiver operating characteristic curves (ROC), the category-free net reclassification improvement (NRI) and integrated discrimination improvement (IDI) were calculated. Results After adjustment for conventional cardiovascular risk factors, such as age, waist circumference, smoking history, low-density and high-density lipoprotein cholesterol, triglycerides, systolic blood pressure and insulin resistance, the ORs (95%CI) of the third tertile adiponectin group were 0.42 (0.25–0.72) in men and 0.47 (0.29–0.75) in women. The area under the curve (AUC) on the ROC analysis increased significantly by 0.025 in men and 0.022 in women when adiponectin was added to the logistic model of conventional cardiovascular risk factors (AUC in men: 0.655 to 0.680, p = 0.038; AUC in women: 0.654 to 0.676, p = 0.041). The NRI was 0.32 (95%CI: 0.13–0.50, p<0.001), and the IDI was 0.03 (95%CI: 0.01–0.04, p<0.001) for men. For women, the category-free NRI was 0.18 (95%CI: 0.02–0.34, p = 0.031) and the IDI was 0.003 (95%CI: −0.002–0.008, p = 0.189). Conclusion Adiponectin and atherosclerosis were significantly related in both genders, and these relationships were independent of conventional cardiovascular risk factors. Furthermore, adiponectin provided additional information to conventional cardiovascular risk factors regarding the risk of atherosclerosis. PMID:24116054
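
    For readers unfamiliar with the reclassification metrics reported above, the following sketch shows how a category-free NRI and an IDI can be computed from baseline and updated risk predictions. The data are simulated and the helper functions are hypothetical; they only illustrate the standard definitions of the two measures.

    ```python
    # Hedged sketch (synthetic data): category-free NRI and IDI for a new model
    # (e.g., conventional risk factors plus adiponectin) versus a baseline model.
    import numpy as np

    def category_free_nri(p_old, p_new, y):
        up, down = p_new > p_old, p_new < p_old
        event, nonevent = y == 1, y == 0
        nri_events = up[event].mean() - down[event].mean()
        nri_nonevents = down[nonevent].mean() - up[nonevent].mean()
        return nri_events + nri_nonevents

    def idi(p_old, p_new, y):
        event, nonevent = y == 1, y == 0
        return (p_new[event].mean() - p_old[event].mean()) - \
               (p_new[nonevent].mean() - p_old[nonevent].mean())

    rng = np.random.default_rng(1)
    n = 1000
    y = rng.binomial(1, 0.3, n)
    p_old = np.clip(0.3 + 0.10 * (y - 0.3) + 0.10 * rng.standard_normal(n), 0.01, 0.99)
    p_new = np.clip(0.3 + 0.15 * (y - 0.3) + 0.10 * rng.standard_normal(n), 0.01, 0.99)

    print("category-free NRI:", round(category_free_nri(p_old, p_new, y), 3))
    print("IDI:", round(idi(p_old, p_new, y), 3))
    ```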

  7. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  8. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF -Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  9. Additive composite ABCG2, SLC2A9 and SLC22A12 scores of high-risk alleles with alcohol use modulate gout risk.

    PubMed

    Tu, Hung-Pin; Chung, Chia-Min; Min-Shan Ko, Albert; Lee, Su-Shin; Lai, Han-Ming; Lee, Chien-Hung; Huang, Chung-Ming; Liu, Chiu-Shong; Ko, Ying-Chin

    2016-09-01

    The aim of the present study was to evaluate the contribution of urate transporter genes and alcohol use to the risk of gout/tophi. Eight variants of ABCG2, SLC2A9, SLC22A12, SLC22A11 and SLC17A3 were genotyped in male individuals in a case-control study with 157 gout (33% tophi), 106 asymptomatic hyperuricaemia and 295 control subjects from Taiwan. The multilocus profiles of the genetic risk scores for urate gene variants were used to evaluate the risk of asymptomatic hyperuricaemia, gout and tophi. ABCG2 Q141K (T), SLC2A9 rs1014290 (A) and SLC22A12 rs475688 (C) under an additive model and alcohol use independently predicted the risk of gout (respective odds ratio for each factor=2.48, 2.03, 1.95 and 2.48). The additive composite Q141K, rs1014290 and rs475688 scores of high-risk alleles were associated with gout risk (P<0.0001). We observed the supramultiplicative interaction effect of genetic urate scores and alcohol use on gout and tophi risk (P for interaction=0.0452, 0.0033). The synergistic effect of genetic urate score 5-6 and alcohol use indicates that these combined factors correlate with gout and tophi occurrence.

  10. Compound risk judgment in tasks with both idiosyncratic and systematic risk: The "Robust Beauty" of additive probability integration.

    PubMed

    Sundh, Joakim; Juslin, Peter

    2018-02-01

    In this study, we explore how people integrate risks of assets in a simulated financial market into a judgment of the conjunctive risk that all assets decrease in value, both when assets are independent and when there is a systematic risk present affecting all assets. Simulations indicate that while mental calculation according to naïve application of probability theory is best when the assets are independent, additive or exemplar-based algorithms perform better when systematic risk is high. Considering that people tend to intuitively approach compound probability tasks using additive heuristics, we expected the participants to find it easiest to master tasks with high systematic risk - the most complex tasks from the standpoint of probability theory - while they should shift to probability theory or exemplar memory with independence between the assets. The results from 3 experiments confirm that participants shift between strategies depending on the task, starting off with the default of additive integration. In contrast to results in similar multiple cue judgment tasks, there is little evidence for use of exemplar memory. The additive heuristics also appear to be surprisingly context-sensitive, with limited generalization across formally very similar tasks. Copyright © 2017 Elsevier B.V. All rights reserved.
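
    The contrast between normative and additive probability integration can be illustrated with a one-factor simulation. In the sketch below (entirely synthetic, with a plain averaging rule standing in for the additive heuristics studied in the paper), multiplying marginal probabilities is exact when assets are independent but badly underestimates the conjunction once a strong systematic factor is present, which is when an additive-style judgment can come closer.

    ```python
    # Hedged illustration: P(all assets decrease) under independence versus a
    # shared systematic factor, compared with a product rule and an averaging rule.
    import numpy as np

    rng = np.random.default_rng(7)
    n_assets, n_draws = 3, 200_000

    def summaries(systematic_weight):
        common = rng.standard_normal(n_draws)
        idio = rng.standard_normal((n_assets, n_draws))
        returns = systematic_weight * common + np.sqrt(1 - systematic_weight ** 2) * idio
        down = returns < 0
        marginals = down.mean(axis=1)           # P(asset i decreases)
        true_conj = down.all(axis=0).mean()     # P(all assets decrease)
        product_rule = marginals.prod()         # normative answer under independence
        averaging_rule = marginals.mean()       # crude stand-in for an additive heuristic
        return true_conj, product_rule, averaging_rule

    for w in (0.0, 0.9):
        t, p, a = summaries(w)
        print(f"systematic weight {w}: true={t:.3f}  product rule={p:.3f}  averaging rule={a:.3f}")
    ```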

  11. "The Dose Makes the Poison": Informing Consumers About the Scientific Risk Assessment of Food Additives.

    PubMed

    Bearth, Angela; Cousin, Marie-Eve; Siegrist, Michael

    2016-01-01

    Intensive risk assessment is required before the approval of food additives. During this process, based on the toxicological principle of "the dose makes the poison," maximum usage doses are assessed. However, most consumers are not aware of these efforts to ensure the safety of food additives and are therefore sceptical, even though food additives bring certain benefits to consumers. This study investigated the effect of a short video, which explains the scientific risk assessment and regulation of food additives, on consumers' perceptions and acceptance of food additives. The primary goal of this study was to inform consumers and enable them to construct their own risk-benefit assessment and make informed decisions about food additives. The secondary goal was to investigate whether people have different perceptions of food additives of artificial (i.e., aspartame) or natural origin (i.e., steviol glycoside). To attain these research goals, an online experiment was conducted on 185 Swiss consumers. Participants were randomly assigned to either the experimental group, which was shown a video about the scientific risk assessment of food additives, or the control group, which was shown a video about a topic irrelevant to the study. After watching the video, the respondents knew significantly more, expressed more positive thoughts and feelings, perceived lower risk, and showed more acceptance than prior to watching the video. Thus, it appears that informing consumers about complex food safety topics, such as the scientific risk assessment of food additives, is possible, and using a carefully developed information video is a successful strategy for informing consumers. © 2015 Society for Risk Analysis.

  12. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  13. Concentration Addition, Independent Action and Generalized Concentration Addition Models for Mixture Effect Prediction of Sex Hormone Synthesis In Vitro

    PubMed Central

    Hadrup, Niels; Taxvig, Camilla; Pedersen, Mikael; Nellemann, Christine; Hass, Ulla; Vinggaard, Anne Marie

    2013-01-01

    Humans are concomitantly exposed to numerous chemicals. An infinite number of combinations and doses thereof can be imagined. For toxicological risk assessment the mathematical prediction of mixture effects, using knowledge on single chemicals, is therefore desirable. We investigated pros and cons of the concentration addition (CA), independent action (IA) and generalized concentration addition (GCA) models. First we measured effects of single chemicals and mixtures thereof on steroid synthesis in H295R cells. Then single chemical data were applied to the models; predictions of mixture effects were calculated and compared to the experimental mixture data. Mixture 1 contained environmental chemicals adjusted in ratio according to human exposure levels. Mixture 2 was a potency adjusted mixture containing five pesticides. Prediction of testosterone effects coincided with the experimental Mixture 1 data. In contrast, antagonism was observed for effects of Mixture 2 on this hormone. The mixtures contained chemicals exerting only limited maximal effects. This hampered prediction by the CA and IA models, whereas the GCA model could be used to predict a full dose response curve. Regarding effects on progesterone and estradiol, some chemicals were having stimulatory effects whereas others had inhibitory effects. The three models were not applicable in this situation and no predictions could be performed. Finally, the expected contributions of single chemicals to the mixture effects were calculated. Prochloraz was the predominant but not sole driver of the mixtures, suggesting that one chemical alone was not responsible for the mixture effects. In conclusion, the GCA model seemed to be superior to the CA and IA models for the prediction of testosterone effects. A situation with chemicals exerting opposing effects, for which the models could not be applied, was identified. In addition, the data indicate that in non-potency adjusted mixtures the effects cannot always be
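
    The two classical models named above can be written down in a few lines. The sketch below uses hypothetical Hill-type dose-response curves for three chemicals and computes the independent action prediction directly and the concentration addition prediction by solving for the effect level at which the toxic units sum to one; the generalized concentration addition extension for partial-efficacy curves is not implemented here.

    ```python
    # Hedged sketch: concentration addition (CA) and independent action (IA)
    # mixture predictions from hypothetical single-chemical Hill curves.
    import numpy as np
    from scipy.optimize import brentq

    def hill(c, ec50, slope):
        """Fraction of maximal effect of a single chemical at concentration c."""
        return c**slope / (ec50**slope + c**slope)

    ec50s  = np.array([1.0, 5.0, 20.0])     # hypothetical EC50s (same units)
    slopes = np.array([1.2, 1.0, 0.8])
    conc   = np.array([0.5, 2.0, 5.0])      # mixture concentrations of the three chemicals

    # Independent action: combine single-chemical effects as independent events
    ia_effect = 1.0 - np.prod(1.0 - hill(conc, ec50s, slopes))

    # Concentration addition: find the effect level x at which the toxic units
    # c_i / EC_x,i sum to one, where EC_x,i is the inverse of the Hill curve.
    def ecx(x, ec50, slope):
        return ec50 * (x / (1.0 - x)) ** (1.0 / slope)

    ca_effect = brentq(lambda x: np.sum(conc / ecx(x, ec50s, slopes)) - 1.0, 1e-6, 1 - 1e-6)

    print(f"IA-predicted mixture effect: {ia_effect:.3f}")
    print(f"CA-predicted mixture effect: {ca_effect:.3f}")
    ```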

  14. Mean-variance model for portfolio optimization with background risk based on uncertainty theory

    NASA Astrophysics Data System (ADS)

    Zhai, Jia; Bai, Manying

    2018-04-01

    The aim of this paper is to develop a mean-variance model for portfolio optimization that considers background risk, liquidity and transaction costs, based on uncertainty theory. In the portfolio selection problem, the returns of securities and the liquidity of assets are treated as uncertain variables because of incidents or a lack of historical data, which are common in economic and social environments. We provide crisp forms of the model and a hybrid intelligent algorithm to solve it. Under a mean-variance framework, we analyze the characteristics of the portfolio frontier in the presence of independently additive background risk. In addition, we discuss some effects of background risk and liquidity constraints on portfolio selection. Finally, we demonstrate the proposed models with numerical simulations.
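
    As a rough illustration of the frontier analysis described above, the sketch below solves a conventional probabilistic mean-variance problem in which an independently additive background risk simply adds a constant term to total variance; the uncertainty-theory formulation, liquidity and transaction costs of the paper are not reproduced, and all numbers are made up.

    ```python
    # Hedged sketch: mean-variance frontier points with independently additive
    # background risk (probabilistic analogue only; illustrative numbers).
    import numpy as np
    from scipy.optimize import minimize

    mu = np.array([0.06, 0.09, 0.12])                    # expected asset returns
    cov = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.16]])                 # covariance of asset returns
    var_background = 0.02                                 # variance of background (non-tradable) risk

    def frontier_point(target_return):
        def total_variance(w):
            return w @ cov @ w + var_background           # background risk adds on independently
        cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},
                {"type": "eq", "fun": lambda w: w @ mu - target_return})
        res = minimize(total_variance, x0=np.ones(3) / 3, bounds=[(0, 1)] * 3, constraints=cons)
        return res.x, res.fun

    for r in (0.07, 0.09, 0.11):
        w, v = frontier_point(r)
        print(f"target return {r:.2f}: weights {np.round(w, 2)}, total variance {v:.4f}")
    ```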

  15. Persistent hemifacial spasm after microvascular decompression: a risk assessment model.

    PubMed

    Shah, Aalap; Horowitz, Michael

    2017-06-01

    Microvascular decompression (MVD) for hemifacial spasm (HFS) provides resolution of disabling symptoms such as eyelid twitching and muscle contractions of the entire hemiface. The primary aim of this study was to evaluate the predictive value of patient demographics and spasm characteristics for long-term outcomes, with or without intraoperative lateral spread response (LSR) as an additional variable in a risk assessment model. A retrospective study was undertaken to evaluate the associations of pre-operative patient characteristics, as well as intraoperative LSR and the need for a staged procedure, with the presence of persistent or recurrent HFS at the time of hospital discharge and at follow-up. A risk assessment model was constructed with the inclusion of six clinically or statistically significant variables from the univariate analyses. A receiver operating characteristic curve was generated, and the area under the curve was calculated to determine the strength of the predictive model. A risk assessment model was first created consisting of significant pre-operative variables (Model 1) (age >50, female gender, history of botulinum toxin use, platysma muscle involvement). This model demonstrated borderline predictive value for persistent spasm at discharge (AUC .60; p=.045) and fair predictive value at follow-up (AUC .75; p=.001). Intraoperative variables (e.g. LSR persistence) demonstrated little additive value (Model 2) (AUC .67). Patients with a higher risk score (three or greater) demonstrated greater odds of persistent HFS at the time of discharge (OR 1.5 [95%CI 1.16-1.97]; p=.035), as well as greater odds of persistent or recurrent spasm at the time of follow-up (OR 3.0 [95%CI 1.52-5.95]; p=.002). A risk assessment model consisting of pre-operative clinical characteristics is useful in prognosticating HFS persistence at follow-up.

  16. Reference centiles for the middle cerebral artery and umbilical artery pulsatility index and cerebro-placental ratio from a low-risk population - a Generalised Additive Model for Location, Shape and Scale (GAMLSS) approach.

    PubMed

    Flatley, Christopher; Kumar, Sailesh; Greer, Ristan M

    2018-02-06

    The primary aim of this study was to create reference ranges for the fetal Middle Cerebral Artery Pulsatility Index (MCA PI), Umbilical Artery Pulsatility Index (UA PI) and the Cerebro-Placental Ratio (CPR) in a clearly defined low-risk cohort using the Generalised Additive Model for Location, Shape and Scale (GAMLSS) method. Prospectively collected cross-sectional biometry and Doppler data from low-risk women attending the Mater Mother's Hospital, Maternal and Fetal Medicine Department in Brisbane, Australia between January 2010 and April 2017 were used to derive gestation-specific centiles for the MCA PI, UA PI and CPR. All ultrasound scans were performed between 18 + 0 and 41 + 6 weeks gestation with recorded data for the MCA PI and/or UA PI. The GAMLSS method was used for the calculation of gestational age-adjusted centiles. Distributions and additive terms were assessed and the final model was chosen on the basis of the Global Deviance, Akaike information criterion (AIC) and Schwarz Bayesian criterion (SBC), along with the results of the model and residual diagnostics as well as visual assessment of the centiles themselves. Over the study period 6013 women met the inclusion criteria. The MCA PI was recorded in 4473 fetuses, the UA PI in 6008 fetuses and the CPR was able to be calculated in 4464 cases. The centiles for the MCA PI used a fractional polynomial additive term and a Box-Cox t (BCT) distribution. Centiles for the UA PI used a cubic spline additive term with a BCT distribution and the CPR used a fractional polynomial additive term and a BCT distribution. We have created gestational centile reference ranges for the MCA PI, UA PI and CPR from a large low-risk cohort, which supports their applicability and generalisability.

  17. VARIABLE SELECTION IN NONPARAMETRIC ADDITIVE MODELS

    PubMed Central

    Huang, Jian; Horowitz, Joel L.; Wei, Fengrong

    2010-01-01

    We consider a nonparametric additive model of a conditional mean function in which the number of variables and additive components may be larger than the sample size but the number of nonzero additive components is “small” relative to the sample size. The statistical problem is to determine which additive components are nonzero. The additive components are approximated by truncated series expansions with B-spline bases. With this approximation, the problem of component selection becomes that of selecting the groups of coefficients in the expansion. We apply the adaptive group Lasso to select nonzero components, using the group Lasso to obtain an initial estimator and reduce the dimension of the problem. We give conditions under which the group Lasso selects a model whose number of components is comparable with the underlying model, and the adaptive group Lasso selects the nonzero components correctly with probability approaching one as the sample size increases and achieves the optimal rate of convergence. The results of Monte Carlo experiments show that the adaptive group Lasso procedure works well with samples of moderate size. A data example is used to illustrate the application of the proposed method. PMID:21127739
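
    A compact numerical illustration of the component-selection idea is given below: each covariate is expanded in a B-spline basis, and a (non-adaptive) group lasso is fitted by proximal gradient descent so that whole groups of spline coefficients, and hence whole additive components, are set to zero. The adaptive reweighting step of the paper is omitted, the penalty level is chosen by hand rather than by any data-driven rule, and the data are simulated.

    ```python
    # Hedged sketch: group lasso over B-spline expansions for selecting nonzero
    # additive components; simulated data, plain (non-adaptive) group lasso only.
    import numpy as np
    from sklearn.preprocessing import SplineTransformer, StandardScaler

    rng = np.random.default_rng(0)
    n, p = 200, 10
    X = rng.uniform(-1, 1, (n, p))
    y = np.sin(np.pi * X[:, 0]) + 1.5 * X[:, 1] ** 2 + 0.3 * rng.standard_normal(n)  # 2 active components

    # B-spline basis per covariate; columns for each covariate form one group
    spl = SplineTransformer(degree=3, n_knots=6, include_bias=False)
    B = spl.fit_transform(X)                           # shape (n, p * k), feature-major order
    k = B.shape[1] // p
    groups = [np.arange(j * k, (j + 1) * k) for j in range(p)]
    B = StandardScaler().fit_transform(B)
    yc = y - y.mean()

    lam = 0.15                                         # penalty chosen by hand for this toy example
    beta = np.zeros(B.shape[1])
    step = 1.0 / (np.linalg.norm(B, 2) ** 2 / n)       # 1 / Lipschitz constant of the smooth part
    for _ in range(2000):                              # proximal gradient (ISTA) iterations
        grad = -B.T @ (yc - B @ beta) / n
        z = beta - step * grad
        for g in groups:                               # group soft-thresholding (prox of the penalty)
            norm = np.linalg.norm(z[g])
            z[g] = 0.0 if norm == 0 else max(0.0, 1 - step * lam / norm) * z[g]
        beta = z

    selected = [j for j, g in enumerate(groups) if np.linalg.norm(beta[g]) > 1e-8]
    print("selected additive components:", selected)
    ```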

  18. Development and External Validation of a Melanoma Risk Prediction Model Based on Self-assessed Risk Factors.

    PubMed

    Vuong, Kylie; Armstrong, Bruce K; Weiderpass, Elisabete; Lund, Eiliv; Adami, Hans-Olov; Veierod, Marit B; Barrett, Jennifer H; Davies, John R; Bishop, D Timothy; Whiteman, David C; Olsen, Catherine M; Hopper, John L; Mann, Graham J; Cust, Anne E; McGeechan, Kevin

    2016-08-01

    Identifying individuals at high risk of melanoma can optimize primary and secondary prevention strategies. The aim was to develop and externally validate a risk prediction model for incident first-primary cutaneous melanoma using self-assessed risk factors. We used unconditional logistic regression to develop a multivariable risk prediction model. Relative risk estimates from the model were combined with Australian melanoma incidence and competing mortality rates to obtain absolute risk estimates. A risk prediction model was developed using the Australian Melanoma Family Study (629 cases and 535 controls) and externally validated using 4 independent population-based studies: the Western Australia Melanoma Study (511 case-control pairs), Leeds Melanoma Case-Control Study (960 cases and 513 controls), Epigene-QSkin Study (44 544 participants, of whom 766 had melanoma), and Swedish Women's Lifestyle and Health Cohort Study (49 259 women, of whom 273 had melanoma). We validated model performance internally and externally by assessing discrimination using the area under the receiver operating characteristic curve (AUC). Additionally, using the Swedish Women's Lifestyle and Health Cohort Study, we assessed model calibration and clinical usefulness. The risk prediction model included hair color, nevus density, first-degree family history of melanoma, previous nonmelanoma skin cancer, and lifetime sunbed use. On internal validation, the AUC was 0.70 (95% CI, 0.67-0.73). On external validation, the AUC was 0.66 (95% CI, 0.63-0.69) in the Western Australia Melanoma Study, 0.67 (95% CI, 0.65-0.70) in the Leeds Melanoma Case-Control Study, 0.64 (95% CI, 0.62-0.66) in the Epigene-QSkin Study, and 0.63 (95% CI, 0.60-0.67) in the Swedish Women's Lifestyle and Health Cohort Study. Model calibration showed close agreement between predicted and observed numbers of incident melanomas across all deciles of predicted risk. In the external validation setting, there was higher net benefit when using the risk prediction

  19. Risk prediction models of breast cancer: a systematic review of model performances.

    PubMed

    Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin

    2012-05-01

    An increasing number of risk prediction models have been developed for estimating the risk of breast cancer in individual women. However, the performance of these models is questionable. We therefore conducted a study to systematically review previous risk prediction models. The results of this review help to identify the most reliable model and indicate the strengths and weaknesses of each model, to guide future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies that constructed models using regression methods were selected. Information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models, and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed in four models, while five models had external validation. The Gail model and the Rosner and Colditz model were the key models that were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair both in internal validation (concordance statistics: 0.53-0.66) and in external validation (concordance statistics: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy of existing models might be because of a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive for measuring improvements in discrimination. Therefore, newer methods such as the net reclassification index should be considered to evaluate the improvement in performance of a newly developed model.

  20. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
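
    In the same spirit, a toy Monte Carlo can make the structure of such a model concrete: uncertain impactor properties are sampled, converted to impact energy, and mapped to an affected population through a damage relation. Every distribution and the damage power law below are illustrative assumptions, not the PAIR model's actual inputs.

    ```python
    # Hedged toy Monte Carlo in the spirit of a probabilistic impact risk model:
    # sample uncertain impactor properties, compute energy, aggregate consequences.
    import numpy as np

    rng = np.random.default_rng(2016)
    n_scenarios = 100_000

    diameter = rng.lognormal(mean=np.log(50), sigma=0.6, size=n_scenarios)       # m
    density = rng.uniform(1500, 3500, n_scenarios)                               # kg/m^3
    velocity = rng.uniform(12_000, 25_000, n_scenarios)                          # m/s
    pop_density = rng.lognormal(mean=np.log(50), sigma=1.5, size=n_scenarios)    # people/km^2

    mass = density * (np.pi / 6.0) * diameter**3                                 # kg (sphere)
    energy_mt = 0.5 * mass * velocity**2 / 4.184e15                              # megatons TNT

    # crude damage-area scaling with energy (illustrative power law only)
    damage_area_km2 = 10.0 * energy_mt ** (2.0 / 3.0)
    affected = damage_area_km2 * pop_density

    print("median affected population per impact:", int(np.median(affected)))
    print("99th percentile:", int(np.percentile(affected, 99)))
    print("fraction of scenarios affecting >10,000 people:", (affected > 1e4).mean())
    ```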

  1. Problems With Risk Reclassification Methods for Evaluating Prediction Models

    PubMed Central

    Pepe, Margaret S.

    2011-01-01

    For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714

  2. A modeling framework for exposing risks in complex systems.

    PubMed

    Sharit, J

    2000-08-01

    This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.

  3. A regularized variable selection procedure in additive hazards model with stratified case-cohort design.

    PubMed

    Ni, Ai; Cai, Jianwen

    2018-07-01

    Case-cohort designs are commonly used in large epidemiological studies to reduce the cost associated with covariate measurement. In many such studies the number of covariates is very large. An efficient variable selection method is needed for case-cohort studies where the covariates are only observed in a subset of the sample. Current literature on this topic has focused on the proportional hazards model. However, in many studies the additive hazards model is preferred over the proportional hazards model either because the proportional hazards assumption is violated or because the additive hazards model provides more relevant information for the research question. Motivated by one such study, the Atherosclerosis Risk in Communities (ARIC) study, we investigate the properties of a regularized variable selection procedure in a stratified case-cohort design under an additive hazards model with a diverging number of parameters. We establish the consistency and asymptotic normality of the penalized estimator and prove its oracle property. Simulation studies are conducted to assess the finite sample performance of the proposed method with a modified cross-validation tuning parameter selection method. We apply the variable selection procedure to the ARIC study to demonstrate its practical use.

  4. Breast cancer risk prediction using a clinical risk model and polygenic risk score.

    PubMed

    Shieh, Yiwey; Hu, Donglei; Ma, Lin; Huntsman, Scott; Gard, Charlotte C; Leung, Jessica W T; Tice, Jeffrey A; Vachon, Celine M; Cummings, Steven R; Kerlikowske, Karla; Ziv, Elad

    2016-10-01

    Breast cancer risk assessment can inform the use of screening and prevention modalities. We investigated the performance of the Breast Cancer Surveillance Consortium (BCSC) risk model in combination with a polygenic risk score (PRS) comprised of 83 single nucleotide polymorphisms identified from genome-wide association studies. We conducted a nested case-control study of 486 cases and 495 matched controls within a screening cohort. The PRS was calculated using a Bayesian approach. The contributions of the PRS and variables in the BCSC model to breast cancer risk were tested using conditional logistic regression. Discriminatory accuracy of the models was compared using the area under the receiver operating characteristic curve (AUROC). Increasing quartiles of the PRS were positively associated with breast cancer risk, with OR 2.54 (95 % CI 1.69-3.82) for breast cancer in the highest versus lowest quartile. In a multivariable model, the PRS, family history, and breast density remained strong risk factors. The AUROC of the PRS was 0.60 (95 % CI 0.57-0.64), and an Asian-specific PRS had AUROC 0.64 (95 % CI 0.53-0.74). A combined model including the BCSC risk factors and PRS had better discrimination than the BCSC model (AUROC 0.65 versus 0.62, p = 0.01). The BCSC-PRS model classified 18 % of cases as high-risk (5-year risk ≥3 %), compared with 7 % using the BCSC model. The PRS improved discrimination of the BCSC risk model and classified more cases as high-risk. Further consideration of the PRS's role in decision-making around screening and prevention strategies is merited.
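
    A simplified sketch of the modelling strategy, with simulated data: a PRS is formed as a dosage-weighted sum of per-allele log odds ratios, added to a small clinical logistic model, and the AUROCs with and without the PRS are compared. The clinical variables are illustrative stand-ins; the BCSC model's actual inputs and the conditional logistic design of the nested case-control study are not reproduced.

    ```python
    # Hedged sketch: polygenic risk score plus a clinical logistic model, with
    # AUROC comparison. All data and coefficients are simulated for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    n, n_snps = 2000, 83
    dosages = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)   # 0/1/2 risk-allele counts
    log_or = rng.normal(0.0, 0.05, n_snps)                           # per-allele log odds ratios
    prs = dosages @ log_or                                           # polygenic risk score

    age = rng.normal(55, 10, n)
    density = rng.integers(1, 5, n)                                  # breast-density category (1-4)
    family_hx = rng.binomial(1, 0.15, n)

    logit = -3.0 + 0.03 * (age - 55) + 0.3 * (density - 2) + 0.6 * family_hx + 1.0 * prs
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X_clinical = np.column_stack([age, density, family_hx])
    X_combined = np.column_stack([X_clinical, prs])

    auc = {}
    for name, X in (("clinical", X_clinical), ("clinical + PRS", X_combined)):
        p = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
        auc[name] = roc_auc_score(y, p)
    print(auc)   # in-sample AUROCs; a real analysis would use held-out or cross-validated estimates
    ```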

  5. Breast Cancer Risk Prediction Using a Clinical Risk Model and Polygenic Risk Score

    PubMed Central

    Shieh, Yiwey; Hu, Donglei; Ma, Lin; Huntsman, Scott; Gard, Charlotte C.; Leung, Jessica W.T.; Tice, Jeffrey A.; Vachon, Celine M.; Cummings, Steven R.; Kerlikowske, Karla; Ziv, Elad

    2016-01-01

    Purpose Breast cancer risk assessment can inform the use of screening and prevention modalities. We investigated the performance of the Breast Cancer Surveillance Consortium (BCSC) risk model in combination with a polygenic risk score (PRS) comprised of 83 single nucleotide polymorphisms identified from genome wide association studies. Methods We conducted a nested case-control study of 486 cases and 495 matched controls within a screening cohort. The PRS was calculated using a Bayesian approach. The contributions of the PRS and variables in the BCSC model to breast cancer risk were tested using conditional logistic regression. Discriminatory accuracy of the models was compared using the area under the receiver operating characteristic curve (AUROC). Results Increasing quartiles of the PRS were positively associated with breast cancer risk, with OR 2.54 (95% CI 1.69-3.82) for breast cancer in the highest versus lowest quartile. In a multivariable model, the PRS, family history, and breast density remained strong risk factors. The AUROC of the PRS was 0.60 (95% CI 0.57-0.64), and an Asian-specific PRS had AUROC 0.64 (95% CI 0.53-0.74). A combined model including the BCSC risk factors and PRS had better discrimination than the BCSC model (AUROC 0.65 versus 0.62, p = 0.01). The BCSC-PRS model classified 18% of cases as high-risk (5-year risk ≥ 3%), compared with 7% using the BCSC model. Conclusion The PRS improved discrimination of the BCSC risk model and classified more cases as high-risk. Impact Further consideration of the PRS's role in decision-making around screening and prevention strategies is merited. PMID:27565998

  6. Doubly Robust Additive Hazards Models to Estimate Effects of a Continuous Exposure on Survival.

    PubMed

    Wang, Yan; Lee, Mihye; Liu, Pengfei; Shi, Liuhua; Yu, Zhi; Abu Awad, Yara; Zanobetti, Antonella; Schwartz, Joel D

    2017-11-01

    The effect of an exposure on survival can be biased when the regression model is misspecified. Hazard difference is easier to use in risk assessment than hazard ratio and has a clearer interpretation in the assessment of effect modifications. We proposed two doubly robust additive hazards models to estimate the causal hazard difference of a continuous exposure on survival. The first model is an inverse probability-weighted additive hazards regression. The second model is an extension of the doubly robust estimator for binary exposures by categorizing the continuous exposure. We compared these with the marginal structural model and outcome regression with correct and incorrect model specifications using simulations. We applied doubly robust additive hazard models to the estimation of hazard difference of long-term exposure to PM2.5 (particulate matter with an aerodynamic diameter less than or equal to 2.5 microns) on survival using a large cohort of 13 million older adults residing in seven states of the Southeastern United States. We showed that the proposed approaches are doubly robust. We found that each 1 μg/m³ increase in annual PM2.5 exposure was associated with a causal hazard difference in mortality of 8.0 × 10 (95% confidence interval 7.4 × 10, 8.7 × 10), which was modified by age, medical history, socioeconomic status, and urbanicity. The overall hazard difference translates to approximately 5.5 (5.1, 6.0) thousand deaths per year in the study population. The proposed approaches improve the robustness of the additive hazards model and produce a novel additive causal estimate of PM2.5 on survival and several additive effect modifications, including social inequality.
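
    The inverse-probability-weighting ingredient of the first proposed estimator can be sketched as follows for a continuous exposure: a model for exposure given confounders supplies the generalized propensity score, and stabilized weights are the ratio of marginal to conditional exposure densities. The data are synthetic, the normal-density assumption is illustrative, and the weighted additive hazards fit that would use these weights is omitted.

    ```python
    # Hedged sketch: stabilized inverse probability weights for a continuous
    # exposure (the weighted survival regression itself is not shown here).
    import numpy as np
    from scipy.stats import norm
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(4)
    n = 5000
    confounders = rng.standard_normal((n, 3))                        # e.g., age, SES, urbanicity proxies
    exposure = 10 + confounders @ np.array([0.8, -0.5, 0.3]) + rng.normal(0, 1.5, n)   # annual PM2.5

    # denominator: generalized propensity score f(exposure | confounders)
    fit = LinearRegression().fit(confounders, exposure)
    resid = exposure - fit.predict(confounders)
    dens_cond = norm.pdf(exposure, loc=fit.predict(confounders), scale=resid.std())

    # numerator: marginal density f(exposure), giving stabilized weights
    dens_marg = norm.pdf(exposure, loc=exposure.mean(), scale=exposure.std())
    weights = dens_marg / dens_cond

    print("weight summary (min, median, max):",
          np.round([weights.min(), np.median(weights), weights.max()], 2))
    # these weights would then be supplied to a weighted additive hazards regression
    ```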

  7. Impact of model-based risk analysis for liver surgery planning.

    PubMed

    Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K

    2014-05-01

    A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To determine whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the width of the safety margins when the risk analysis was available. In addition, the time to complete the planning task and participants' confidence did not increase when the risk analysis was used. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.

  8. A probabilistic topic model for clinical risk stratification from electronic health records.

    PubMed

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with the accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner, and, at the cohort level, often offer little insight beyond a flat score-based segmentation from the labeled clinical dataset. To this end, in this paper, we propose a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, this paper proposes a novel probabilistic topic modeling framework called probabilistic risk stratification model (PRSM) based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM) by incorporating minimum prior information into the model, in order to improve the risk stratification accuracy, and to make our models highly portable to risk stratification tasks of various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machine, and showed the effectiveness of our models in risk stratification of CHD in terms of the Area Under the receiver operating characteristic Curve (AUC) analysis. As well, in comparison with PRSM, WS-PRSM has over 2% performance gain, on the experimental dataset, demonstrating that
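
    The flavour of the topic-modelling approach can be reproduced with an off-the-shelf LDA fitted to a synthetic matrix of clinical-code counts, reading each patient's dominant latent sub-profile as an unsupervised stratum; PRSM's specific formulation and its weakly supervised extension are not implemented here.

    ```python
    # Hedged sketch (synthetic data): LDA over clinical-code counts as a stand-in
    # for topic-model-based risk stratification of patients from EHRs.
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    rng = np.random.default_rng(5)
    n_patients, n_codes, n_profiles = 500, 60, 3

    # synthetic EHR code-count matrix generated from three underlying sub-profiles
    true_profiles = rng.dirichlet(np.ones(n_codes) * 0.1, n_profiles)
    membership = rng.integers(0, n_profiles, n_patients)
    counts = np.vstack([rng.multinomial(40, true_profiles[m]) for m in membership])

    lda = LatentDirichletAllocation(n_components=n_profiles, random_state=0)
    theta = lda.fit_transform(counts)                   # patient-by-sub-profile mixture weights
    dominant = theta.argmax(axis=1)

    # in a PRSM-style analysis the sub-profiles would be labelled high-, medium-
    # and low-risk after inspecting which codes load on each component
    for k in range(n_profiles):
        print(f"sub-profile {k}: {np.sum(dominant == k)} patients, "
              f"top codes {np.argsort(lda.components_[k])[::-1][:5]}")
    ```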

  9. Predicting Risk of Type 2 Diabetes Mellitus with Genetic Risk Models on the Basis of Established Genome-wide Association Markers: A Systematic Review

    PubMed Central

    Bao, Wei; Hu, Frank B.; Rong, Shuang; Rong, Ying; Bowers, Katherine; Schisterman, Enrique F.; Liu, Liegang; Zhang, Cuilin

    2013-01-01

    This study aimed to evaluate the predictive performance of genetic risk models based on risk loci identified and/or confirmed in genome-wide association studies for type 2 diabetes mellitus. A systematic literature search was conducted in the PubMed/MEDLINE and EMBASE databases through April 13, 2012, and published data relevant to the prediction of type 2 diabetes based on genome-wide association marker–based risk models (GRMs) were included. Of the 1,234 potentially relevant articles, 21 articles representing 23 studies were eligible for inclusion. The median area under the receiver operating characteristic curve (AUC) among eligible studies was 0.60 (range, 0.55–0.68), which did not differ appreciably by study design, sample size, participants’ race/ethnicity, or the number of genetic markers included in the GRMs. In addition, the AUCs for type 2 diabetes did not improve appreciably with the addition of genetic markers into conventional risk factor–based models (median AUC, 0.79 (range, 0.63–0.91) vs. median AUC, 0.78 (range, 0.63–0.90), respectively). A limited number of included studies used reclassification measures and yielded inconsistent results. In conclusion, GRMs showed a low predictive performance for risk of type 2 diabetes, irrespective of study design, participants’ race/ethnicity, and the number of genetic markers included. Moreover, the addition of genome-wide association markers into conventional risk models produced little improvement in predictive performance. PMID:24008910

  10. An Agent-Based Model of Evolving Community Flood Risk.

    PubMed

    Tonn, Gina L; Guikema, Seth D

    2018-06-01

    Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent-based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near-miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high-risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in-depth behavioral and decision rules at the individual and community level. © 2017 Society for Risk Analysis.

  11. Enhanced risk prediction model for emergency department use and hospitalizations in patients in a primary care medical home.

    PubMed

    Takahashi, Paul Y; Heien, Herbert C; Sangaralingham, Lindsey R; Shah, Nilay D; Naessens, James M

    2016-07-01

    With the advent of healthcare payment reform, identifying high-risk populations has become more important to providers. Existing risk-prediction models often focus on chronic conditions. This study sought to better understand other factors that could improve identification of the highest-risk population. We conducted a retrospective cohort study of a paneled primary care population, using 2010 data to calibrate a risk prediction model of hospital and emergency department (ED) use in 2011. Data were randomly split into development and validation data sets. We compared the enhanced model containing the additional risk predictors with the Minnesota medical tiering model. The study was conducted in the primary care practice of an integrated delivery system at an academic medical center in Rochester, Minnesota. The study focus was primary care medical home patients in 2010 and 2011 (n = 84,752), with the primary outcome of subsequent hospitalization or ED visit. Data from 42,384 individuals were used to derive the enhanced risk-prediction model and data from 42,368 individuals were used to validate it. Predictors included Adjusted Clinical Groups-based Minnesota medical tiering, patient demographics, insurance status, and prior year healthcare utilization. Additional variables included specific mental and medical conditions, use of high-risk medications, and body mass index. The area under the curve in the enhanced model was 0.705 (95% CI, 0.698-0.712) compared with 0.662 (95% CI, 0.656-0.669) in the Minnesota medical tiering-only model. New high-risk patients in the enhanced model were more likely to lack health insurance and to have Medicaid coverage, diagnosed depression, and prior ED utilization. An enhanced model including additional healthcare-related factors improved the prediction of the risk of hospitalization or an ED visit.

  12. Risk prediction models for selection of lung cancer screening candidates: A retrospective validation study

    PubMed Central

    ten Haaf, Kevin; Tammemägi, Martin C.; Han, Summer S.; Kong, Chung Yin; Plevritis, Sylvia K.; de Koning, Harry J.; Steyerberg, Ewout W.

    2017-01-01

    Background Selection of candidates for lung cancer screening based on individual risk has been proposed as an alternative to criteria based on age and cumulative smoking exposure (pack-years). Nine previously established risk models were assessed for their ability to identify those most likely to develop or die from lung cancer. All models considered age and various aspects of smoking exposure (smoking status, smoking duration, cigarettes per day, pack-years smoked, time since smoking cessation) as risk predictors. In addition, some models considered factors such as gender, race, ethnicity, education, body mass index, chronic obstructive pulmonary disease, emphysema, personal history of cancer, personal history of pneumonia, and family history of lung cancer. Methods and findings Retrospective analyses were performed on 53,452 National Lung Screening Trial (NLST) participants (1,925 lung cancer cases and 884 lung cancer deaths) and 80,672 Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO) ever-smoking participants (1,463 lung cancer cases and 915 lung cancer deaths). Six-year lung cancer incidence and mortality risk predictions were assessed for (1) calibration (graphically) by comparing the agreement between the predicted and the observed risks, (2) discrimination (area under the receiver operating characteristic curve [AUC]) between individuals with and without lung cancer (death), and (3) clinical usefulness (net benefit in decision curve analysis) by identifying risk thresholds at which applying risk-based eligibility would improve lung cancer screening efficacy. To further assess performance, risk model sensitivities and specificities in the PLCO were compared to those based on the NLST eligibility criteria. Calibration was satisfactory, but discrimination ranged widely (AUCs from 0.61 to 0.81). The models outperformed the NLST eligibility criteria over a substantial range of risk thresholds in decision curve analysis, with a higher
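
    The decision-curve comparison referred to above comes down to computing net benefit for each selection rule across a range of risk thresholds. The sketch below does this for a simulated cohort, comparing risk-based selection with a fixed age and pack-year rule standing in for NLST-style eligibility criteria; all data and coefficients are invented for illustration.

    ```python
    # Hedged sketch: decision curve analysis comparing risk-based screening
    # selection with a fixed eligibility rule. Entirely simulated data.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 20_000
    pack_years = rng.gamma(2.0, 15.0, n)
    age = rng.uniform(50, 80, n)
    risk = 1 / (1 + np.exp(-(-5.5 + 0.02 * pack_years + 0.05 * (age - 50))))   # 6-year risk (toy model)
    cancer = rng.binomial(1, risk)

    def net_benefit(select, outcome, threshold):
        m = len(outcome)
        tp = np.sum(select & (outcome == 1)) / m
        fp = np.sum(select & (outcome == 0)) / m
        return tp - fp * threshold / (1 - threshold)

    fixed_criteria = (age >= 55) & (pack_years >= 30)        # stand-in for fixed eligibility criteria
    for pt in (0.01, 0.02, 0.03):
        nb_model = net_benefit(risk >= pt, cancer, pt)
        nb_fixed = net_benefit(fixed_criteria, cancer, pt)
        print(f"threshold {pt:.2f}: risk model NB={nb_model:.4f}, fixed criteria NB={nb_fixed:.4f}")
    ```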

  13. Risk-adjusted econometric model to estimate postoperative costs: an additional instrument for monitoring performance after major lung resection.

    PubMed

    Brunelli, Alessandro; Salati, Michele; Refai, Majed; Xiumé, Francesco; Rocco, Gaetano; Sabbatini, Armando

    2007-09-01

    The objectives of this study were to develop a risk-adjusted model to estimate individual postoperative costs after major lung resection and to use it for internal economic audit. Variable and fixed hospital costs were collected for 679 consecutive patients who underwent major lung resection from January 2000 through October 2006 at our unit. Several preoperative variables were used to develop a risk-adjusted econometric model from all patients operated on during the period 2000 through 2003 by a stepwise multiple regression analysis (validated by bootstrap). The model was then used to estimate the postoperative costs in the patients operated on during the 3 subsequent periods (years 2004, 2005, and 2006). Observed and predicted costs were then compared within each period by the Wilcoxon signed rank test. Multiple regression and bootstrap analysis yielded the following model predicting postoperative cost: 11,078 + 1340.3 × (age > 70 years) + 1927.8 × (cardiac comorbidity) - 95 × ppoFEV1%. No differences between predicted and observed costs were noted in the first 2 periods analyzed (year 2004, $6188.40 vs $6241.40, P = .3; year 2005, $6308.60 vs $6483.60, P = .4), whereas in the most recent period (2006) observed costs were significantly lower than the predicted ones ($3457.30 vs $6162.70, P < .0001). Greater precision in predicting outcome and costs after therapy may assist clinicians in the optimization of clinical pathways and allocation of resources. Our economic model may be used as a methodologic template for economic audit in our specialty and complement more traditional outcome measures in the assessment of performance.
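
    Written as a small function, the published equation above can be used to compute the expected cost for an individual patient and compare it with the observed cost during an internal audit; the reading of the age and comorbidity terms as 0/1 indicators and of ppoFEV1 in percent follows the abstract.

    ```python
    # The regression equation reported in the abstract, wrapped as a function.
    def predicted_postop_cost(age, cardiac_comorbidity, ppo_fev1_pct):
        """Risk-adjusted expected postoperative cost after major lung resection."""
        return (11078
                + 1340.3 * (age > 70)            # indicator: age above 70 years
                + 1927.8 * cardiac_comorbidity   # indicator: cardiac comorbidity present
                - 95.0 * ppo_fev1_pct)           # predicted postoperative FEV1 (%)

    # example: 74-year-old with cardiac comorbidity and a ppoFEV1 of 60%
    print(predicted_postop_cost(age=74, cardiac_comorbidity=1, ppo_fev1_pct=60))   # ~8646
    ```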

  14. Ridge, Lasso and Bayesian additive-dominance genomic models.

    PubMed

    Azevedo, Camila Ferreira; de Resende, Marcos Deon Vilela; E Silva, Fabyano Fonseca; Viana, José Marcelo Soriano; Valente, Magno Sávio Ferreira; Resende, Márcio Fernando Ribeiro; Muñoz, Patricio

    2015-08-25

    A complete approach for genome-wide selection (GWS) involves reliable statistical genetics models and methods. Reports on this topic are common for additive genetic models but not for additive-dominance models. The objective of this paper was (i) to compare the performance of 10 additive-dominance predictive models (including current models and proposed modifications), fitted using Bayesian, Lasso and Ridge regression approaches; and (ii) to decompose genomic heritability and accuracy in terms of three quantitative genetic information sources, namely, linkage disequilibrium (LD), co-segregation (CS) and pedigree relationships or family structure (PR). The simulation study considered two broad sense heritability levels (0.30 and 0.50, associated with narrow sense heritabilities of 0.20 and 0.35, respectively) and two genetic architectures for traits (the first consisting of small gene effects and the second consisting of a mixed inheritance model with five major genes). G-REML/G-BLUP and a modified Bayesian/Lasso (called BayesA*B* or t-BLASSO) method performed best in the prediction of genomic breeding values as well as the total genotypic values of individuals in all four scenarios (two heritabilities × two genetic architectures). The BayesA*B*-type method showed a better ability to recover the dominance variance/additive variance ratio. Decomposition of genomic heritability and accuracy revealed the following descending importance order of information: LD, CS and PR not captured by markers, the last two being very close. Amongst the 10 models/methods evaluated, the G-BLUP, BayesA*B* (-2,8) and BayesA*B* (4,6) methods presented the best results and were found to be adequate for accurately predicting genomic breeding values and total genotypic values as well as for estimating additive and dominance effects in additive-dominance genomic models.

  15. Validation of a novel air toxic risk model with air monitoring.

    PubMed

    Pratt, Gregory C; Dymond, Mary; Ellickson, Kristie; Thé, Jesse

    2012-01-01

    Three modeling systems were used to estimate human health risks from air pollution: two versions of MNRiskS (for Minnesota Risk Screening), and the USEPA National Air Toxics Assessment (NATA). MNRiskS is a unique cumulative risk modeling system used to assess risks from multiple air toxics, sources, and pathways on a local to a state-wide scale. In addition, ambient outdoor air monitoring data were available for estimation of risks and comparison with the modeled estimates of air concentrations. Highest air concentrations and estimated risks were generally found in the Minneapolis-St. Paul metropolitan area and lowest risks in undeveloped rural areas. Emissions from mobile and area (nonpoint) sources created greater estimated risks than emissions from point sources. Highest cancer risks were via ingestion pathway exposures to dioxins and related compounds. Diesel particles, acrolein, and formaldehyde created the highest estimated inhalation health impacts. Model-estimated air concentrations were generally highest for NATA and lowest for the AERMOD version of MNRiskS. This validation study showed reasonable agreement between available measurements and model predictions, although results varied among pollutants, and predictions were often lower than measurements. The results increased confidence in identifying pollutants, pathways, geographic areas, sources, and receptors of potential concern, and thus provide a basis for informing pollution reduction strategies and focusing efforts on specific pollutants (diesel particles, acrolein, and formaldehyde), geographic areas (urban centers), and source categories (nonpoint sources). The results heighten concerns about risks from food chain exposures to dioxins and PAHs. Risk estimates were sensitive to variations in methodologies for treating emissions, dispersion, deposition, exposure, and toxicity. © 2011 Society for Risk Analysis.

  16. A Cooperative Model for IS Security Risk Management in Distributed Environment

    PubMed Central

    Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. The actual case studied illustrates the cooperative model presented in this paper and how it can be exploited to manage the distributed IS security risk effectively. PMID:24563626

  17. A cooperative model for IS security risk management in distributed environment.

    PubMed

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. The actual case studied illustrates the cooperative model presented in this paper and how it can be exploited to manage the distributed IS security risk effectively.

  18. Assessing non-additive effects in GBLUP model.

    PubMed

    Vieira, I C; Dos Santos, J P R; Pires, L P M; Lima, B M; Gonçalves, F M A; Balestre, M

    2017-05-10

    Understanding non-additive effects in the expression of quantitative traits is very important in genotype selection, especially in species where the commercial products are clones or hybrids. The use of molecular markers has allowed the study of non-additive genetic effects on a genomic level, in addition to a better understanding of their importance in quantitative traits. Thus, the purpose of this study was to evaluate the behavior of the GBLUP model under different genetic models and relationship matrices and their influence on the estimates of genetic parameters. We used real data on circumference at breast height in Eucalyptus spp. and simulated data from an F2 population. Three kinship structures commonly reported in the literature were adopted. The simulation results showed that the inclusion of epistatic kinship improved prediction estimates of genomic breeding values. However, the non-additive effects were not accurately recovered. The Fisher information matrix for the real dataset showed high collinearity in estimates of additive, dominant, and epistatic variance, causing no gain in the prediction of the unobserved data and convergence problems. Estimates of genetic parameters and correlations differed across the kinship structures considered. Our results show that the inclusion of non-additive effects can improve the predictive ability or even the prediction of additive effects. However, the large distortions observed in the variance estimates when the Hardy-Weinberg equilibrium assumption is violated, owing to the presence of selection or inbreeding, can result in zero gains in models that consider epistasis in genomic kinship.

  19. Development of a prototype Typhoon Risk Model over the Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Kim, K. Y.; Cocke, S.; Shin, D. W.; CHOI, M.; Kwon, J.

    2016-12-01

    Risk can be defined as the probability of a hazard of a given level causing a particular level of loss or damage (Alexander, 2000). Risk management is important for mitigation and for developing plans for emergencies. More effective risk management strategies can help reduce potential losses from natural disasters such as typhoons, floods, and earthquakes. We are developing a prototype typhoon risk model to assess the current and potential future hazard due to typhoons in the Western Pacific. To develop the typhoon risk model, a variety of sources of data over Korea are used, such as population, damage to buildings, agriculture, ships, etc. The model is based on proven concepts used in catastrophe models that have been applied in the U.S. and other regions of the world. Recently, the sea surface temperatures where typhoons have occurred have tended to increase. According to recent studies of global warming, the intensity of typhoons could increase, and the frequency of typhoons may decrease, in the future climate. The prototype risk model can help us determine the change in risk as a consequence of the change in typhoon activity. We focus on Korea and other regions of interest to Korean insurers, re-insurers, and related industries. The model can potentially be coupled to various damage models or emergency management systems for planning and mitigation. In addition, the assessment would be useful for emergency planners, coastal community planners, and private and governmental insurance programs. This work was funded by the Korea Meteorological Administration Research and Development Program under Grant KMIPA2016-8030.

  20. Combined proportional and additive residual error models in population pharmacokinetic modelling.

    PubMed

    Proost, Johannes H

    2017-11-15

    In pharmacokinetic modelling, a combined proportional and additive residual error model is often preferred over a proportional or additive residual error model. Different approaches have been proposed, but a comparison between approaches is still lacking. The theoretical background of the methods is described. Method VAR assumes that the variance of the residual error is the sum of the statistically independent proportional and additive components; this method can be coded in three ways. Method SD assumes that the standard deviation of the residual error is the sum of the proportional and additive components. Using datasets from the literature and simulations based on these datasets, the methods are compared using NONMEM. The different codings of method VAR yield identical results. Using method SD, the values of the parameters describing residual error are lower than for method VAR, but the values of the structural parameters and their inter-individual variability are hardly affected by the choice of the method. Both methods are valid approaches in combined proportional and additive residual error modelling, and selection may be based on OFV. When the result of an analysis is used for simulation purposes, it is essential that the simulation tool uses the same method as used during analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
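
    A minimal sketch of the two parameterisations described above; the variable names and the σ values are assumed example inputs, not estimates from the article:

    ```python
    import numpy as np

    def residual_sd_var(pred, sigma_prop, sigma_add):
        # Method VAR: variance is the sum of independent proportional and additive components
        return np.sqrt((sigma_prop * pred) ** 2 + sigma_add ** 2)

    def residual_sd_sd(pred, sigma_prop, sigma_add):
        # Method SD: standard deviation is the sum of the proportional and additive components
        return sigma_prop * pred + sigma_add

    pred = np.array([0.1, 1.0, 10.0, 100.0])      # predicted concentrations (arbitrary units)
    for f in (residual_sd_var, residual_sd_sd):
        print(f.__name__, f(pred, sigma_prop=0.15, sigma_add=0.05))
    ```

    For identical parameter values, method SD always yields a total standard deviation at least as large as method VAR, which is consistent with the lower residual-error parameter estimates reported for method SD.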

  1. Surrogate modeling of joint flood risk across coastal watersheds

    NASA Astrophysics Data System (ADS)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.

  2. Using risk-adjustment models to identify high-cost risks.

    PubMed

    Meenan, Richard T; Goodman, Michael J; Fishman, Paul A; Hornbrook, Mark C; O'Keeffe-Rosetti, Maureen C; Bachman, Donald J

    2003-11-01

    We examine the ability of various publicly available risk models to identify high-cost individuals and enrollee groups using multi-HMO administrative data. Five risk-adjustment models (the Global Risk-Adjustment Model [GRAM], Diagnostic Cost Groups [DCGs], Adjusted Clinical Groups [ACGs], RxRisk, and Prior-expense) were estimated on a multi-HMO administrative data set of 1.5 million individual-level observations for 1995-1996. Models produced distributions of individual-level annual expense forecasts for comparison to actual values. Prespecified "high-cost" thresholds were set within each distribution. The area under the receiver operating characteristic curve (AUC) for "high-cost" prevalences of 1% and 0.5% was calculated, as was the proportion of "high-cost" dollars correctly identified. Results are based on a separate 106,000-observation validation dataset. For "high-cost" prevalence targets of 1% and 0.5%, ACGs, DCGs, GRAM, and Prior-expense are very comparable in overall discrimination (AUCs, 0.83-0.86). Given a 0.5% prevalence target and a 0.5% prediction threshold, DCGs, GRAM, and Prior-expense captured $963,000 (approximately 3%) more "high-cost" sample dollars than other models. DCGs captured the most "high-cost" dollars among enrollees with asthma, diabetes, and depression; predictive performance among demographic groups (Medicaid members, members over 64, and children under 13) varied across models. Risk models can efficiently identify enrollees who are likely to generate future high costs and who could benefit from case management. The dollar value of improved prediction performance of the most accurate risk models should be meaningful to decision-makers and encourage their broader use for identifying high costs.
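
    A sketch of the evaluation logic described above, on synthetic data; the 1% high-cost threshold and the AUC/dollar-capture metrics mirror the abstract, but the numbers are made up:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    actual = rng.lognormal(mean=7.0, sigma=1.2, size=100_000)                  # actual annual expense
    forecast = actual * rng.lognormal(mean=0.0, sigma=0.8, size=actual.size)   # imperfect model forecast

    high_cost = actual >= np.quantile(actual, 0.99)        # top 1% of actual expense = "high cost"
    auc = roc_auc_score(high_cost, forecast)               # discrimination for high-cost members

    flagged = forecast >= np.quantile(forecast, 0.99)      # flag the top 1% of forecasts
    dollars_captured = actual[high_cost & flagged].sum() / actual[high_cost].sum()
    print(f"AUC = {auc:.3f}, high-cost dollars captured = {dollars_captured:.1%}")
    ```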

  3. Comparison of risk assessment based on clinical judgement and Cariogram in addition to patient perceived treatment need.

    PubMed

    Hänsel Petersson, Gunnel; Åkerman, Sigvard; Isberg, Per-Erik; Ericson, Dan

    2016-07-07

    Predicting future risk for oral diseases, treatment need and prognosis are tasks performed daily in clinical practice. A large variety of methods have been reported, ranging from clinical judgement or "gut feeling", or even patient interviewing, to complex assessments of combinations of known risk factors. In clinical practice, there is a continuous search for less complicated and more valid tools for risk assessment. There is also a lack of knowledge of how different common methods relate to one another. The aim of this study was to investigate whether caries risk assessment (CRA) based on clinical judgement and the Cariogram model give similar results and, in addition, to assess which factors from clinical status and history agree best with CRA based on clinical judgement and how the patient's own perception of future oral treatment need corresponds with the sum of the examiners' risk scores. Clinical examinations were performed on randomly selected individuals 20-89 years old living in Skåne, Sweden. In total, 451 individuals were examined, 51 % of whom were women. The clinical examination included caries detection, saliva samples and radiographic examination, together with history and a questionnaire. The examiners made a risk classification and the authors made a second risk calculation according to the Cariogram. Of those assessed as low risk using the Cariogram, 69 % were also assessed as low risk based on clinical judgement. For the other risk groups the agreement was lower. Clinical variables that were significantly related to CRA based on clinical judgement were DS (decayed surfaces), DS combined with incipient lesions, DMFT (decayed, missed, filled teeth), plaque amount, history and soft drink intake. Patients' perception of future oral treatment need correlated to some extent with the sum of the examiners' risk scores. The main finding was that CRA based on clinical judgement and the Cariogram model gave similar results for the groups that were predicted at low level of future

  4. Applying Additive Hazards Models for Analyzing Survival in Patients with Colorectal Cancer in Fars Province, Southern Iran

    PubMed

    Madadizadeh, Farzan; Ghanbarnejad, Amin; Ghavami, Vahid; Zare Bandamiri, Mohammad; Mohammadianpanah, Mohammad

    2017-04-01

    Introduction: Colorectal cancer (CRC) is a commonly fatal cancer that ranks third worldwide, and third and fifth among Iranian women and men, respectively. There are several methods for analyzing time-to-event data. Additive hazards regression models take priority over the popular Cox proportional hazards model if the absolute hazard (risk) change, rather than the hazard ratio, is of primary concern, or if no proportionality assumption is made. Methods: This study used data gathered from medical records of 561 colorectal cancer patients who were admitted to Namazi Hospital, Shiraz, Iran, during 2005 to 2010 and followed until December 2015. The nonparametric Aalen's additive hazards model, the semiparametric Lin and Ying's additive hazards model and the Cox proportional hazards model were applied for data analysis. The proportionality assumption for the Cox model was evaluated with a test based on the Schoenfeld residuals, and goodness of fit of the additive models was assessed with Cox-Snell residual plots. Analyses were performed with SAS 9.2 and R 3.2 software. Results: The median follow-up time was 49 months. The five-year survival rate and the mean survival time after cancer diagnosis were 59.6% and 68.1±1.4 months, respectively. Multivariate analyses using Lin and Ying's additive model and the Cox proportional model indicated that age at diagnosis, site of tumor, stage, proportion of positive lymph nodes, lymphovascular invasion and type of treatment were factors affecting survival of the CRC patients. Conclusion: Additive models are suitable alternatives to the Cox proportionality model if there is interest in evaluating the absolute hazard change, or if no proportionality assumption is made.
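
    A minimal sketch of fitting Aalen's additive hazards model alongside a Cox model, assuming the lifelines package; the synthetic data and covariates below stand in for the study's dataset:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import AalenAdditiveFitter, CoxPHFitter

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({"age": rng.normal(60, 10, n),
                       "stage": rng.integers(1, 5, n)})
    # synthetic survival times: hazard loosely increases with age and stage
    df["time"] = rng.exponential(scale=100 / (1 + 0.02 * df["age"] + 0.3 * df["stage"]))
    df["event"] = (rng.uniform(size=n) < 0.7).astype(int)      # roughly 30% censoring

    aaf = AalenAdditiveFitter(fit_intercept=True, coef_penalizer=0.5)
    aaf.fit(df, duration_col="time", event_col="event")
    print(aaf.cumulative_hazards_.head())    # time-varying additive effects (absolute hazard change)

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()                      # hazard ratios, for comparison
    ```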

  5. Population Modeling of Modified Risk Tobacco Products Accounting for Smoking Reduction and Gradual Transitions of Relative Risk.

    PubMed

    Poland, Bill; Teischinger, Florian

    2017-11-01

    As suggested by the Food and Drug Administration (FDA) Modified Risk Tobacco Product (MRTP) Applications Draft Guidance, we developed a statistical model based on public data to explore the effect on population mortality of an MRTP resulting in reduced conventional cigarette smoking. Many cigarette smokers who try an MRTP persist as dual users while smoking fewer conventional cigarettes per day (CPD). Lower-CPD smokers have lower mortality risk based on large cohort studies. However, with little data on the effect of smoking reduction on mortality, predictive modeling is needed. We generalize prior assumptions of gradual, exponential decay of Excess Risk (ER) of death, relative to never-smokers, after quitting or reducing CPD. The same age-dependent slopes are applied to all transitions, including initiation to conventional cigarettes and to a second product (MRTP). A Monte Carlo simulation model generates random individual product use histories, including CPD, to project cumulative deaths through 2060 in a population with versus without the MRTP. Transitions are modeled to and from dual use, which affects CPD and cigarette quit rates, and to MRTP use only. Results in a hypothetical scenario showed high sensitivity of long-run mortality to CPD reduction levels and moderate sensitivity to ER transition rates. Models to project population effects of an MRTP should account for possible mortality effects of reduced smoking among dual users. In addition, studies should follow dual-user CPD histories and quit rates over long time periods to clarify long-term usage patterns and thereby improve health impact projections. We simulated mortality effects of a hypothetical MRTP accounting for cigarette smoking reduction by smokers who add MRTP use. Data on relative mortality risk versus CPD suggest that this reduction may have a substantial effect on mortality rates, unaccounted for in other models. This effect is weighed with additional hypothetical effects in an example.

  6. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize investment risk. The objective of the mean-variance model is to minimize portfolio risk while achieving a target rate of return, with variance used as the risk measure. The purpose of this study is to compare the portfolio composition as well as the performance of the optimal portfolio from the mean-variance model with those of an equally weighted portfolio. In an equally weighted portfolio, the proportions invested in each asset are equal. The results show that the portfolio compositions of the mean-variance optimal portfolio and the equally weighted portfolio are different. In addition, the mean-variance optimal portfolio performs better, as it gives a higher performance ratio than the equally weighted portfolio.
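
    A small sketch of the mean-variance optimisation compared with an equally weighted portfolio; the expected returns, covariances and target return are invented for illustration:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    mu = np.array([0.08, 0.12, 0.10])                 # expected returns (illustrative)
    cov = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.06]])              # covariance matrix (illustrative)
    target = 0.10                                     # target rate of return

    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},      # fully invested
            {"type": "eq", "fun": lambda w: w @ mu - target})    # hit the target return
    res = minimize(lambda w: w @ cov @ w, x0=np.ones(3) / 3,
                   constraints=cons, bounds=[(0, 1)] * 3, method="SLSQP")

    for name, w in (("mean-variance", res.x), ("equally weighted", np.ones(3) / 3)):
        print(f"{name}: weights={np.round(w, 3)}, return={w @ mu:.3f}, "
              f"risk={np.sqrt(w @ cov @ w):.3f}")
    ```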

  7. Default risk modeling beyond the first-passage approximation: extended Black-Cox model.

    PubMed

    Katz, Yuri A; Shokhirev, Nikolai V

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even if the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with a radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for the derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The closed formulas obtained fit the historical data on global corporate defaults well and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.
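
    The authors' analytical solution is not reproduced here, but a crude Monte Carlo sketch of the qualitative mechanism (default at a finite rate while below the barrier, rather than instantly at first passage) might look as follows; every parameter value is hypothetical:

    ```python
    import numpy as np

    def default_prob(horizon_yrs, mu=-0.02, sigma=0.25, x0=0.6, kappa=2.0,
                     n_paths=20_000, dt=1 / 52, seed=0):
        """Cumulative default probability when default occurs at a finite rate below the barrier.

        x0    : initial log distance to default (log assets minus log barrier), hypothetical
        kappa : default intensity per year while the firm sits at or below the barrier,
                a crude stand-in for the radiation boundary condition
        """
        rng = np.random.default_rng(seed)
        x = np.full(n_paths, x0)
        alive = np.ones(n_paths, dtype=bool)
        for _ in range(int(horizon_yrs / dt)):
            x = x + mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
            below = alive & (x <= 0.0)
            defaults_now = below & (rng.uniform(size=n_paths) < kappa * dt)
            alive &= ~defaults_now
        return 1.0 - alive.mean()

    for T in (1, 3, 5, 10):
        print(f"{T:>2} yr cumulative default probability ~ {default_prob(T):.3f}")
    ```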

  8. Default risk modeling beyond the first-passage approximation: Extended Black-Cox model

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.; Shokhirev, Nikolai V.

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even if the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with a radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for the derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The closed formulas obtained fit the historical data on global corporate defaults well and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.

  9. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: The industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operating condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The results of the Singapore semi-quantitative risk assessment model indicated that the workshop risk levels of dry method, wet method and printing were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The results of the occupational hazards risk assessment index method demonstrated that the position risk indices of pasting, burdening, unreeling, rolling and assisting were 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all the workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions
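
    A small helper following the three formulas quoted above; the input ratings are illustrative values, not the study's measurements:

    ```python
    import math

    def epa_hazard_quotient(ec, rfc):
        """EPA inhalation model: HQ = EC / RfC (HQ > 1 flags elevated risk)."""
        return ec / rfc

    def singapore_risk(hazard_rating, exposure_rating):
        """Singapore semi-quantitative model: Risk = sqrt(HR x ER)."""
        return math.sqrt(hazard_rating * exposure_rating)

    def occupational_risk_index(health_effect_level, exposure_ratio, operating_condition_level):
        """Index = 2^(health effect level) x 2^(exposure ratio) x operating condition level."""
        return 2 ** health_effect_level * 2 ** exposure_ratio * operating_condition_level

    print(epa_hazard_quotient(ec=0.9, rfc=0.3))                  # 3.0 -> HQ > 1
    print(singapore_risk(hazard_rating=4, exposure_rating=3))    # ~3.5
    print(occupational_risk_index(3, 1, 5))                      # 80 (illustrative ratings)
    ```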

  10. Conceptualizing a Dynamic Fall Risk Model Including Intrinsic Risks and Exposures.

    PubMed

    Klenk, Jochen; Becker, Clemens; Palumbo, Pierpaolo; Schwickert, Lars; Rapp, Kilan; Helbostad, Jorunn L; Todd, Chris; Lord, Stephen R; Kerse, Ngaire

    2017-11-01

    Falls are a major cause of injury and disability in older people, leading to serious health and social consequences including fractures, poor quality of life, loss of independence, and institutionalization. To design and provide adequate prevention measures, accurate understanding and identification of a person's individual fall risk is important. However, to date, the performance of fall risk models is weak compared with models estimating, for example, cardiovascular risk. This deficiency may result from 2 factors. First, current models consider risk factors to be stable for each person and not to change over time, an assumption that does not reflect real-life experience. Second, current models do not consider the interplay of individual exposure, including type of activity (eg, walking, undertaking transfers), and the environmental risks (eg, lighting, floor conditions) in which the activity is performed. Therefore, we posit a dynamic fall risk model consisting of intrinsic risk factors that vary over time and exposure (activity in context). eHealth sensor technology (eg, smartphones) is beginning to enable continuous measurement of both of the above factors. We illustrate our model with examples of real-world falls from the FARSEEING database. This dynamic framework for fall risk adds important aspects that may improve understanding of fall mechanisms, fall risk models, and the development of fall prevention interventions. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  11. The Pittsburgh Cervical Cancer Screening Model: a risk assessment tool.

    PubMed

    Austin, R Marshall; Onisko, Agnieszka; Druzdzel, Marek J

    2010-05-01

    Evaluation of cervical cancer screening has grown increasingly complex with the introduction of human papillomavirus (HPV) vaccination and newer screening technologies approved by the US Food and Drug Administration. To create a unique Pittsburgh Cervical Cancer Screening Model (PCCSM) that quantifies risk for histopathologic cervical precancer (cervical intraepithelial neoplasia [CIN] 2, CIN3, and adenocarcinoma in situ) and cervical cancer in an environment predominantly using newer screening technologies. The PCCSM is a dynamic Bayesian network consisting of 19 variables available in the laboratory information system, including patient history data (most recent HPV vaccination data), Papanicolaou test results, high-risk HPV results, procedure data, and histopathologic results. The model's graphic structure was based on the published literature. Results from 375 441 patient records from 2005 through 2008 were used to build and train the model. Additional data from 45 930 patients were used to test the model. The PCCSM compares risk quantitatively over time for histopathologically verifiable CIN2, CIN3, adenocarcinoma in situ, and cervical cancer in screened patients for each current cytology result category and for each HPV result. For each current cytology result, HPV test results affect risk; however, the degree of cytologic abnormality remains the largest positive predictor of risk. Prior history also alters the CIN2, CIN3, adenocarcinoma in situ, and cervical cancer risk for patients with common current cytology and HPV test results. The PCCSM can also generate negative risk projections, estimating the likelihood of the absence of histopathologic CIN2, CIN3, adenocarcinoma in situ, and cervical cancer in screened patients. The PCCSM is a dynamic Bayesian network that computes quantitative cervical disease risk estimates for patients undergoing cervical screening. Continuously updatable with current system data, the PCCSM provides a new tool to monitor

  12. The estimation of time-varying risks in asset pricing modelling using B-Spline method

    NASA Astrophysics Data System (ADS)

    Nurjannah; Solimun; Rinaldo, Adji

    2017-12-01

    Asset pricing modelling has been extensively studied in the past few decades to explore the risk-return relationship. The asset pricing literature has typically assumed a static risk-return relationship. However, several studies have found anomalies in asset pricing modelling that capture the presence of risk instability. A dynamic model is proposed to offer a better alternative. The main problem highlighted in the dynamic model literature is that the set of conditioning information is unobservable and therefore some assumptions have to be made. Hence, the estimation requires additional assumptions about the dynamics of risk. To overcome this problem, nonparametric estimators can be used as an alternative for estimating risk. The flexibility of the nonparametric setting avoids the problem of misspecification derived from selecting a functional form. This paper investigates the estimation of a time-varying asset pricing model using B-splines, as one nonparametric approach. The advantages of the spline method are its computational speed and simplicity, as well as the clarity of controlling curvature directly. Three popular asset pricing models are investigated, namely the CAPM (Capital Asset Pricing Model), the Fama-French 3-factor model and the Carhart 4-factor model. The results suggest that the estimated risks are time-varying and not stable over time, which confirms the risk instability anomaly. The effect is more pronounced in Carhart's 4-factor model.
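
    One way to sketch a spline-based time-varying beta, here for a single-factor (CAPM-style) model on simulated returns; the basis construction via identity coefficients is our shortcut, not the paper's code:

    ```python
    import numpy as np
    from scipy.interpolate import BSpline

    rng = np.random.default_rng(2)
    T = 500
    t = np.linspace(0.0, 1.0, T)
    rm = rng.normal(0.0, 0.02, T)                        # market excess return
    beta_true = 0.8 + 0.6 * np.sin(2 * np.pi * t)        # slowly varying true beta
    r = 0.001 + beta_true * rm + rng.normal(0, 0.01, T)  # asset excess return

    # Cubic B-spline basis on [0, 1], evaluated via identity coefficients
    k, n_basis = 3, 8
    knots = np.concatenate(([0.0] * k, np.linspace(0.0, 1.0, n_basis - k + 1), [1.0] * k))
    B = BSpline(knots, np.eye(n_basis), k)(t)            # (T, n_basis) basis values

    # beta(t) = sum_j c_j B_j(t)  ->  regress r on [1, B * rm]
    X = np.column_stack([np.ones(T), B * rm[:, None]])
    coef, *_ = np.linalg.lstsq(X, r, rcond=None)
    beta_hat = B @ coef[1:]
    print("max abs error in beta(t):", np.max(np.abs(beta_hat - beta_true)).round(3))
    ```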

  13. Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand

    NASA Astrophysics Data System (ADS)

    Kaiser, G.; Kortenhaus, A.

    2009-04-01

    The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction of or damage to life-supporting infrastructure, such as buildings, roads, water & power supply etc., caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood-prone areas at the Andaman Sea coast in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as a vulnerability analysis of the socio-economic and ecological systems, in order to determine the scenario-based, specific risk for the region. In this presentation results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have modelled tsunami wave generation and propagation in the Indian Ocean, detailed inundation patterns, i.e. water depth and flow dynamics, are still lacking. However, for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation depends strongly on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches as these data are free and almost world

  14. Major histocompatibility complex harbors widespread genotypic variability of non-additive risk of rheumatoid arthritis including epistasis.

    PubMed

    Wei, Wen-Hua; Bowes, John; Plant, Darren; Viatte, Sebastien; Yarwood, Annie; Massey, Jonathan; Worthington, Jane; Eyre, Stephen

    2016-04-25

    Genotypic variability based genome-wide association studies (vGWASs) can identify potentially interacting loci without prior knowledge of the interacting factors. We report a two-stage approach to make vGWAS applicable to diseases: firstly using a mixed model approach to partition dichotomous phenotypes into additive risk and non-additive environmental residuals on the liability scale and secondly using the Levene's (Brown-Forsythe) test to assess equality of the residual variances across genotype groups per marker. We found widespread significant (P < 2.5e-05) vGWAS signals within the major histocompatibility complex (MHC) across all three study cohorts of rheumatoid arthritis. We further identified 10 epistatic interactions between the vGWAS signals independent of the MHC additive effects, each with a weak effect but jointly explained 1.9% of phenotypic variance. PTPN22 was also identified in the discovery cohort but replicated in only one independent cohort. Combining the three cohorts boosted power of vGWAS and additionally identified TYK2 and ANKRD55. Both PTPN22 and TYK2 had evidence of interactions reported elsewhere. We conclude that vGWAS can help discover interacting loci for complex diseases but require large samples to find additional signals.

  15. Adolescent mental health and academic functioning: empirical support for contrasting models of risk and vulnerability.

    PubMed

    Lucier-Greer, Mallory; O'Neal, Catherine W; Arnold, A Laura; Mancini, Jay A; Wickrama, Kandauda K A S

    2014-11-01

    Adolescents in military families contend with normative stressors that are universal and exist across social contexts (minority status, family disruptions, and social isolation) as well as stressors reflective of their military life context (e.g., parental deployment, school transitions, and living outside the United States). This study utilizes a social ecological perspective and a stress process lens to examine the relationship between multiple risk factors and relevant indicators of youth well-being, namely depressive symptoms and academic performance, as well as the mediating role of self-efficacy (N = 1,036). Three risk models were tested: an additive effects model (each risk factor uniquely influences outcomes), a full cumulative effects model (the collection of risk factors influences outcomes), and a comparative model (a cumulative effects model exploring the differential effects of normative and military-related risks). This design allowed for the simultaneous examination of multiple risk factors and a comparison of alternative perspectives on measuring risk. Each model was predictive of depressive symptoms and academic performance through persistence; however, each model provides unique findings about the relationship between risk factors and youth outcomes. Discussion is provided, pertinent to service providers and researchers, on how risk is conceptualized, along with suggestions for identifying at-risk youth. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  16. Additive Synergism between Asbestos and Smoking in Lung Cancer Risk: A Systematic Review and Meta-Analysis

    PubMed Central

    Ngamwong, Yuwadee; Tangamornsuksan, Wimonchat; Lohitnavy, Ornrat; Chaiyakunapruk, Nathorn; Scholfield, C. Norman; Reisfeld, Brad; Lohitnavy, Manupat

    2015-01-01

    Smoking and asbestos exposure are important risks for lung cancer. Several epidemiological studies have linked asbestos exposure and smoking to lung cancer. To reconcile and unify these results, we conducted a systematic review and meta-analysis to provide a quantitative estimate of the increased risk of lung cancer associated with asbestos exposure and cigarette smoking and to classify their interaction. Five electronic databases were searched from inception to May, 2015 for observational studies on lung cancer. All case-control (N = 10) and cohort (N = 7) studies were included in the analysis. We calculated pooled odds ratios (ORs), relative risks (RRs) and 95% confidence intervals (CIs) using a random-effects model for the association of asbestos exposure and smoking with lung cancer. Lung cancer patients who were not exposed to asbestos and were non-smokers (A-S-) were compared with: (i) asbestos-exposed non-smokers (A+S-), (ii) non-exposed smokers (A-S+), and (iii) asbestos-exposed smokers (A+S+). Our meta-analysis showed a significant difference in risk of developing lung cancer among asbestos-exposed and/or smoking workers compared to controls (A-S-); odds ratios for the disease (95% CI) were (i) 1.70 (A+S-, 1.31–2.21), (ii) 5.65 (A-S+, 3.38–9.42), (iii) 8.70 (A+S+, 5.8–13.10). The additive interaction index of synergy was 1.44 (95% CI = 1.26–1.77) and the multiplicative index = 0.91 (95% CI = 0.63–1.30). Corresponding values for cohort studies were 1.11 (95% CI = 1.00–1.28) and 0.51 (95% CI = 0.31–0.85). Our results point to an additive synergism for lung cancer with co-exposure to asbestos and cigarette smoking. Assessments of industrial health risks should take smoking and other airborne health risks into account when setting occupational asbestos exposure limits. PMID:26274395
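
    The reported interaction indices can be reproduced directly from the pooled odds ratios quoted above; a quick check:

    ```python
    # Pooled odds ratios relative to non-exposed non-smokers (A-S-)
    or_asb = 1.70    # asbestos only   (A+S-)
    or_smk = 5.65    # smoking only    (A-S+)
    or_both = 8.70   # both exposures  (A+S+)

    # Additive interaction: synergy index S = (OR_both - 1) / ((OR_asb - 1) + (OR_smk - 1))
    synergy = (or_both - 1) / ((or_asb - 1) + (or_smk - 1))

    # Multiplicative interaction index = OR_both / (OR_asb * OR_smk)
    multiplicative = or_both / (or_asb * or_smk)

    print(round(synergy, 2), round(multiplicative, 2))   # 1.44 and 0.91, matching the meta-analysis
    ```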

  17. Risk-trading in flood management: An economic model.

    PubMed

    Chang, Chiung Ting

    2017-09-15

    Although flood management is no longer exclusively a topic of engineering, flood mitigation continues to be associated with hard engineering options. Flood adaptation or the capacity to adapt to flood risk, as well as a demand for internalizing externalities caused by flood risk between regions, complicate flood management activities. Even though integrated river basin management has long been recommended to resolve the above issues, it has proven difficult to apply widely, and sometimes even to bring into existence. This article explores how internalization of externalities as well as the realization of integrated river basin management can be encouraged via the use of a market-based approach, namely a flood risk trading program. In addition to maintaining efficiency of optimal resource allocation, a flood risk trading program may also provide a more equitable distribution of benefits by facilitating decentralization. This article employs a graphical analysis to show how flood risk trading can be implemented to encourage mitigation measures that increase infiltration and storage capacity. A theoretical model is presented to demonstrate the economic conditions necessary for flood risk trading. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. A poisson process model for hip fracture risk.

    PubMed

    Schechner, Zvi; Luo, Gangming; Kaufman, Jonathan J; Siffert, Robert S

    2010-08-01

    The primary method for assessing fracture risk in osteoporosis relies on measurement of bone mass. Estimation of fracture risk is most often evaluated using logistic or proportional hazards models. Notwithstanding the success of these models, there is still much uncertainty as to who will or will not suffer a fracture. This has led to a search for other components besides mass that affect bone strength. The purpose of this paper is to introduce a new mechanistic stochastic model that characterizes the risk of hip fracture in an individual. A Poisson process is used to model the occurrence of falls, which are assumed to occur at a rate λ. The load induced by a fall is assumed to be a random variable that has a Weibull probability distribution. The combination of falls together with loads leads to a compound Poisson process. By retaining only those occurrences of the compound Poisson process that result in a hip fracture, a thinned Poisson process is defined that is itself a Poisson process. The fall rate is modeled as an affine function of age, and hip strength is modeled as a power law function of bone mineral density (BMD). The risk of hip fracture can then be computed as a function of age and BMD. By extending the analysis to a Bayesian framework, the conditional densities of BMD given a prior fracture and no prior fracture can be computed and shown to be consistent with clinical observations. In addition, the conditional probabilities of fracture given a prior fracture and no prior fracture can also be computed, and these also demonstrate results similar to clinical data. The model elucidates the fact that the hip fracture process is inherently random and that improvements in hip strength estimation over and above that provided by BMD operate in a highly "noisy" environment and may therefore have little ability to impact clinical practice.
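
    A stripped-down simulation of the compound/thinned Poisson mechanism described above; the fall-rate function, Weibull load parameters and strength-BMD power law are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def hip_fracture_prob(age, bmd, years=5.0, n_sim=20_000):
        lam = 0.1 + 0.02 * (age - 65)              # falls per year, affine in age (illustrative)
        strength = 8000.0 * bmd ** 2.0             # hip strength in N, power law in BMD (illustrative)
        n_falls = rng.poisson(lam * years, size=n_sim)
        fractured = np.zeros(n_sim, dtype=bool)
        for i, k in enumerate(n_falls):
            if k:
                loads = 3000.0 * rng.weibull(2.0, size=k)   # fall-induced loads in N (illustrative)
                fractured[i] = loads.max() > strength       # fracture if any load exceeds strength
        return fractured.mean()

    for bmd in (0.6, 0.8, 1.0):
        print(f"BMD={bmd}: 5-yr fracture probability ~ {hip_fracture_prob(80, bmd):.3f}")
    ```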

  19. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effect of diverse backup, which often exists when two or more independent elements are connected together, is properly accounted for.

  20. A Multiple Risk Factors Model of the Development of Aggression among Early Adolescents from Urban Disadvantaged Neighborhoods

    ERIC Educational Resources Information Center

    Kim, Sangwon; Orpinas, Pamela; Kamphaus, Randy; Kelder, Steven H.

    2011-01-01

    This study empirically derived a multiple risk factors model of the development of aggression among middle school students in urban, low-income neighborhoods, using Hierarchical Linear Modeling (HLM). Results indicated that aggression increased from sixth to eighth grade. Additionally, the influences of four risk domains (individual, family,…

  1. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent of brands or the per cent of eating occasions within a food group that contained an additive. Since each of the three model components allowed two possible modes of input, the validity of eight (2^3) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of the full conceptual models. While the distributions of intake estimates from the models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
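
    A toy version of one such probabilistic model with two food groups, combining the three components per simulated consumer; the intake, presence-probability and concentration inputs are made up:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_consumers = 100_000

    # Per food group: lognormal food intake (g/day), probability the additive is present,
    # and lognormal additive concentration (mg/kg of food) -- all illustrative
    food_groups = {
        "soft drinks":   dict(intake_mu=5.0, intake_sigma=0.6, p_present=0.35, conc_mu=3.0, conc_sigma=0.4),
        "confectionery": dict(intake_mu=3.0, intake_sigma=0.8, p_present=0.20, conc_mu=4.0, conc_sigma=0.5),
    }

    total_mg = np.zeros(n_consumers)
    for g in food_groups.values():
        intake_g = rng.lognormal(g["intake_mu"], g["intake_sigma"], n_consumers)
        present = rng.uniform(size=n_consumers) < g["p_present"]      # additive in consumed brand?
        conc = rng.lognormal(g["conc_mu"], g["conc_sigma"], n_consumers)
        total_mg += present * intake_g / 1000.0 * conc                # mg/day from this food group

    print("mean additive intake (mg/day):", total_mg.mean().round(2))
    print("97.5th percentile            :", np.percentile(total_mg, 97.5).round(2))
    ```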

  2. A flexible count data regression model for risk analysis.

    PubMed

    Guikema, Seth D; Coffelt, Jeremy P; Goffelt, Jeremy P

    2008-02-01

    In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLM) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM can fit data as well as the commonly used existing models for overdispersed data sets while outperforming these commonly used models for underdispersed data sets.
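
    The key property of the Conway-Maxwell Poisson distribution, that a single extra parameter ν moves it between under- and overdispersion, can be illustrated numerically; this is a direct truncated-sum evaluation of the pmf, not the article's reformulated GLM:

    ```python
    import numpy as np
    from math import lgamma

    def com_poisson_pmf(y_max, lam, nu):
        """COM-Poisson pmf on 0..y_max: P(Y=y) proportional to lam^y / (y!)^nu."""
        y = np.arange(y_max + 1)
        log_w = y * np.log(lam) - nu * np.array([lgamma(v + 1) for v in y])
        w = np.exp(log_w - log_w.max())           # stabilise before normalising
        return w / w.sum()

    for nu in (0.5, 1.0, 2.0):    # nu < 1 overdispersed, nu = 1 Poisson, nu > 1 underdispersed
        p = com_poisson_pmf(200, lam=3.0, nu=nu)
        y = np.arange(p.size)
        mean = (y * p).sum()
        var = ((y - mean) ** 2 * p).sum()
        print(f"nu={nu}: mean={mean:.2f}, variance/mean={var / mean:.2f}")
    ```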

  3. University of North Carolina Caries Risk Assessment Study: comparisons of high risk prediction, any risk prediction, and any risk etiologic models.

    PubMed

    Beck, J D; Weintraub, J A; Disney, J A; Graves, R C; Stamm, J W; Kaste, L M; Bohannan, H M

    1992-12-01

    The purpose of this analysis is to compare three different statistical models for predicting children likely to be at risk of developing dental caries over a 3-yr period. Data are based on 4117 children who participated in the University of North Carolina Caries Risk Assessment Study, a longitudinal study conducted in the Aiken, South Carolina, and Portland, Maine areas. The three models differed with respect to either the types of variables included or the definition of the disease outcome. The two "Prediction" models included both risk factor variables thought to cause dental caries and indicator variables that are associated with dental caries but are not thought to be causal for the disease. The "Etiologic" model included only etiologic factors as variables. A dichotomous outcome measure (none versus any 3-yr increment) was used in the "Any Risk Etiologic Model" and the "Any Risk Prediction Model". Another outcome, based on a gradient measure of disease, was used in the "High Risk Prediction Model". The variables that are significant in these models vary across grades and sites, but are more consistent in the Etiologic model than in the Prediction models. However, among the three sets of models, the Any Risk Prediction Models have the highest sensitivity and positive predictive values, whereas the High Risk Prediction Models have the highest specificity and negative predictive values. Considerations in determining model preference are discussed.

  4. Multilevel joint competing risk models

    NASA Astrophysics Data System (ADS)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often encountered for different outcomes of competing risk time-to-event and count data in many biomedical and epidemiology studies in the presence of cluster effects. Hospital length of stay (LOS) is a widely used outcome measure of hospital utilization; its analysis must accommodate multiple terminating events such as discharge, transfer and death, as well as patients who had not completed the event of interest by the end of follow-up (censored) during hospitalization. Competing risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple events. In this study, the concept of joint modeling has been applied to dengue epidemiology in Sri Lanka, 2006-2008, to assess the relationship between different outcomes of LOS and the platelet count of dengue patients with a district cluster effect. Two key approaches have been applied to build the joint scenario. In the first approach, each competing risk is modelled separately using a binary logistic model, treating all other events as censored, under a multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results compared to fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).

  5. Bacterial-based additives for the production of artificial snow: what are the risks to human health?

    PubMed

    Lagriffoul, A; Boudenne, J L; Absi, R; Ballet, J J; Berjeaud, J M; Chevalier, S; Creppy, E E; Gilli, E; Gadonna, J P; Gadonna-Widehem, P; Morris, C E; Zini, S

    2010-03-01

    For around two decades, artificial snow has been used by numerous winter sports resorts to ensure good snow cover in low-altitude areas or, more generally, to lengthen the skiing season. Biological additives derived from certain bacteria are regularly used to make artificial snow. However, the use of these additives has raised doubts concerning their potential impact on human health and the environment. In this context, the French health authorities requested the French Agency for Environmental and Occupational Health Safety (Afsset) to assess the health risks resulting from the use of such additives. The health risk assessment was based on a review of the scientific literature, supplemented by professional consultations and expertise. Biological and chemical hazards from additives derived from the ice nucleation active bacterium Pseudomonas syringae were characterised. Potential health hazards to humans were considered in terms of infectious, toxic and allergenic capacities with respect to the human populations liable to be exposed and the means of possible exposure. Taking into account these data, a qualitative risk assessment was carried out according to four exposure scenarios, involving the different populations exposed and the conditions and routes of exposure. It was concluded that certain health risks can exist for specific categories of professional workers (mainly snowmakers during additive mixing and dilution tank cleaning steps, with risks estimated to be negligible to low if workers comply with safety precautions), that P. syringae does not present any pathogenic capacity to humans, and that the levels of its endotoxins found in artificial snow do not represent a danger beyond that of exposure to P. syringae endotoxins naturally present in snow. However, the risk of possible allergy in some particularly sensitive individuals cannot be excluded. Another important conclusion of this study concerns the use of water of poor microbiological quality to make artificial snow.

  6. Extracting risk modeling information from medical articles.

    PubMed

    Deleris, Léa A; Sacaleanu, Bogdan; Tounsi, Lamia

    2013-01-01

    Risk modeling in healthcare is both ubiquitous and in its infancy. On the one hand, a significant proportion of medical research focuses on determining the factors that influence the incidence, severity and treatment of diseases, which is a form of risk identification. Those studies typically investigate the micro-level of risk modeling, i.e., the existence of dependences between a reduced set of hypothesized (or demonstrated) risk factors and a focus disease or treatment. On the other hand, the macro-level of risk modeling, i.e., articulating how a large number of such risk factors interact to affect diseases and treatments is not widespread, though essential for medical decision support modeling. By exploiting advances in natural language processing, we believe that information contained in unstructured texts such as medical articles could be extracted to facilitate aggregation into macro-level risk models.

  7. Using generalized additive (mixed) models to analyze single case designs.

    PubMed

    Shadish, William R; Zuur, Alain F; Sullivan, Kristynn J

    2014-04-01

    This article shows how to apply generalized additive models and generalized additive mixed models to single-case design data. These models excel at detecting the functional form between two variables (often called trend), that is, whether trend exists, and if it does, what its shape is (e.g., linear and nonlinear). In many respects, however, these models are also an ideal vehicle for analyzing single-case designs because they can consider level, trend, variability, overlap, immediacy of effect, and phase consistency that single-case design researchers examine when interpreting a functional relation. We show how these models can be implemented in a wide variety of ways to test whether treatment is effective, whether cases differ from each other, whether treatment effects vary over cases, and whether trend varies over cases. We illustrate diagnostic statistics and graphs, and we discuss overdispersion of data in detail, with examples of quasibinomial models for overdispersed data, including how to compute dispersion and quasi-AIC fit indices in generalized additive models. We show how generalized additive mixed models can be used to estimate autoregressive models and random effects and discuss the limitations of the mixed models compared to generalized additive models. We provide extensive annotated syntax for doing all these analyses in the free computer program R. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
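
    A compact sketch of the idea for a two-phase (AB) single-case dataset, assuming the pygam package; the article's R-based workflow is not reproduced here, and the simulated data are purely illustrative:

    ```python
    import numpy as np
    from pygam import LinearGAM, s, l

    rng = np.random.default_rng(5)
    sessions = np.arange(30)
    phase = (sessions >= 15).astype(int)               # 0 = baseline, 1 = treatment
    # outcome with a nonlinear trend plus a level shift at treatment onset
    y = 5 + 0.05 * sessions ** 1.5 + 3 * phase + rng.normal(0, 1, sessions.size)

    X = np.column_stack([sessions, phase])
    gam = LinearGAM(s(0) + l(1)).fit(X, y)             # smooth trend + linear phase effect
    gam.summary()                                      # edof and significance of each term

    pred = gam.predict(X)
    print("mean predicted level, baseline vs treatment:",
          pred[phase == 0].mean().round(2), pred[phase == 1].mean().round(2))
    ```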

  8. A Hydrological Modeling Framework for Flood Risk Assessment for Japan

    NASA Astrophysics Data System (ADS)

    Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.

    2016-12-01

    Flooding has been the most frequent natural disaster that claims lives and imposes significant economic losses on human societies worldwide. Japan, with an annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a Temperature-Index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational time and convergence of the parameters, a set of a priori parameters is obtained from the relationships between the model parameters and the physical properties of the watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model which use high-resolution digital terrain models to estimate time-related parameters of the model, such as the time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate a priori estimates of the maximum soil moisture capacity, an important parameter of the PDM. Once the model is calibrated, its performance is examined for Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated for the extreme precipitation event in 2012 that affected Kyushu. In both cases, quantitative measures show that the simulated streamflow is in good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for the whole of Japan, which forms the basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
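
    The Temperature-Index snowmelt component mentioned above reduces, in its simplest degree-day form, to melt = DDF x max(T - T_base, 0). The sketch below illustrates only that relation; the degree-day factor, base temperature, and temperature values are hypothetical, not the parameters calibrated in the study.

        import numpy as np

        def degree_day_melt(temp_c, ddf_mm_per_degc_day=3.0, t_base_c=0.0):
            """Temperature-index snowmelt: melt (mm SWE/day) = DDF * max(T - T_base, 0)."""
            return ddf_mm_per_degc_day * np.maximum(np.asarray(temp_c, dtype=float) - t_base_c, 0.0)

        daily_mean_temp_c = [-2.0, 0.5, 3.0, 6.5]    # toy daily mean temperatures
        print(degree_day_melt(daily_mean_temp_c))    # daily melt contributions to streamflow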

  9. Carotid plaque-thickness and common carotid IMT show additive value in cardiovascular risk prediction and reclassification.

    PubMed

    Amato, Mauro; Veglia, Fabrizio; de Faire, Ulf; Giral, Philippe; Rauramaa, Rainer; Smit, Andries J; Kurl, Sudhir; Ravani, Alessio; Frigerio, Beatrice; Sansaro, Daniela; Bonomi, Alice; Tedesco, Calogero C; Castelnuovo, Samuela; Mannarino, Elmo; Humphries, Steve E; Hamsten, Anders; Tremoli, Elena; Baldassarre, Damiano

    2017-08-01

    Carotid plaque size and the mean common carotid intima-media thickness measured in plaque-free areas (PF CC-IMT_mean) have been identified as predictors of vascular events (VEs), but their complementarity in risk prediction and stratification is still unresolved. The aim of this study was to evaluate the independence of carotid plaque thickness and PF CC-IMT_mean in cardiovascular risk prediction and risk stratification. The IMPROVE-study is a European cohort (n = 3703), where the thickness of the largest plaque detected in the whole carotid tree was indexed as cIMT_max. PF CC-IMT_mean was also assessed. Hazard Ratios (HR) comparing the top quartiles of cIMT_max and PF CC-IMT_mean versus their respective 1-3 quartiles were calculated using Cox regression. After a 36.2-month follow-up, there were 215 VEs (125 coronary, 73 cerebral and 17 peripheral). Both cIMT_max and PF CC-IMT_mean were mutually independent predictors of combined-VEs, after adjustment for center, age, sex, risk factors and pharmacological treatment [HR (95% CI) = 1.98 (1.47, 2.67) and 1.68 (1.23, 2.29), respectively]. Both variables were independent predictors of cerebrovascular events (ischemic stroke, transient ischemic attack), while only cIMT_max was an independent predictor of coronary events (myocardial infarction, sudden cardiac death, angina pectoris, angioplasty, coronary bypass grafting). In reclassification analyses, PF CC-IMT_mean significantly adds to a model including both Framingham Risk Factors and cIMT_max (Integrated Discrimination Improvement; IDI = 0.009; p = 0.0001) and vice-versa (IDI = 0.02; p < 0.0001). cIMT_max and PF CC-IMT_mean are independent predictors of VEs, and as such, they should be used as additive rather than alternative variables in models for cardiovascular risk prediction and reclassification. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
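
    The top-quartile-versus-quartiles-1-3 Cox comparison described above can be reproduced in outline with the lifelines package. The sketch below assumes a hypothetical data frame and column names (time_months, event, cimt_max, pf_cc_imt_mean, age, sex) rather than the IMPROVE data, and omits the centre adjustment, the other covariates, and the reclassification (IDI) analysis.

        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.read_csv("cohort.csv")               # hypothetical file with the columns noted above
        for col in ["cimt_max", "pf_cc_imt_mean"]:
            df[col + "_top_q"] = (df[col] >= df[col].quantile(0.75)).astype(int)

        cph = CoxPHFitter()
        cph.fit(df[["time_months", "event", "cimt_max_top_q", "pf_cc_imt_mean_top_q", "age", "sex"]],
                duration_col="time_months", event_col="event")
        cph.print_summary()                          # mutually adjusted hazard ratios for the top quartiles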

  10. A model for assessing the risk of human trafficking on a local level

    NASA Astrophysics Data System (ADS)

    Colegrove, Amanda

    Human trafficking is a human rights violation that is difficult to quantify. Models for estimating the number of victims of trafficking presented by previous researchers depend on inconsistent, poor-quality data. As an intermediate step to help current efforts by nonprofits to combat human trafficking, this project presents a model that is not dependent on quantitative data specific to human trafficking, but rather profiles the risk of human trafficking at the local level through causative factors. Businesses identified in the literature as relevant were weighted based on the presence of characteristics that increase the likelihood of trafficking in persons. The mean risk was calculated by census tract to reveal the multiplicity of risk levels in both rural and urban settings. Results indicate that labor trafficking may be a more diffuse problem in Missouri than sex trafficking. Additionally, spatial patterns of risk remained largely the same regardless of adjustments made to the model.

  11. Melanoma Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. [Theoretical model study about the application risk of high risk medical equipment].

    PubMed

    Shang, Changhao; Yang, Fenghui

    2014-11-01

    This study establishes a theoretical risk-monitoring model for high-risk medical equipment at the application site. The application site is regarded as a system composed of several sub-systems, and every sub-system consists of several risk-estimating indicators. After each indicator is quantified, the quantified values are multiplied by their corresponding weights and summed, giving the risk estimate for that sub-system. Following the same calculation, the sub-system risk estimates are multiplied by their corresponding weights and summed; the cumulative sum is the status indicator of the high-risk medical equipment at the application site and reflects its application risk. The resulting model can dynamically and specifically monitor the application risk of high-risk medical equipment at the application site.
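
    The two-level weighted aggregation described above amounts to a weighted sum of weighted sums. A minimal sketch, with hypothetical indicator values and weights:

        def subsystem_risk(indicator_values, indicator_weights):
            """Risk estimate of one sub-system: quantified indicators times their weights, summed."""
            return sum(v * w for v, w in zip(indicator_values, indicator_weights))

        def site_status(subsystems):
            """Status indicator of the equipment at the application site.

            subsystems: list of (indicator_values, indicator_weights, subsystem_weight) tuples.
            """
            return sum(subsystem_risk(v, w) * sw for v, w, sw in subsystems)

        example = [([0.8, 0.4], [0.6, 0.4], 0.5),                 # e.g. a maintenance sub-system
                   ([0.2, 0.9, 0.5], [0.3, 0.3, 0.4], 0.5)]       # e.g. an operating-environment sub-system
        print(site_status(example))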

  13. A Dual-Process Approach to Health Risk Decision Making: The Prototype Willingness Model

    ERIC Educational Resources Information Center

    Gerrard, Meg; Gibbons, Frederick X.; Houlihan, Amy E.; Stock, Michelle L.; Pomery, Elizabeth A.

    2008-01-01

    Although dual-process models in cognitive, personality, and social psychology have stimulated a large body of research about analytic and heuristic modes of decision making, these models have seldom been applied to the study of adolescent risk behaviors. In addition, the developmental course of these two kinds of information processing, and their…

  14. Spatial model for risk prediction and sub-national prioritization to aid poliovirus eradication in Pakistan.

    PubMed

    Mercer, Laina D; Safdar, Rana M; Ahmed, Jamal; Mahamud, Abdirahman; Khan, M Muzaffar; Gerber, Sue; O'Leary, Aiden; Ryan, Mike; Salet, Frank; Kroiss, Steve J; Lyons, Hil; Upfill-Brown, Alexander; Chabot-Couture, Guillaume

    2017-10-11

    Pakistan is one of only three countries where poliovirus circulation remains endemic. For the Pakistan Polio Eradication Program, identifying high-risk districts is essential to target interventions and allocate limited resources. Using a hierarchical Bayesian framework, we developed a spatial Poisson hurdle model to jointly model the probability of one or more paralytic polio cases and the number of cases that would be detected in the event of an outbreak. Rates of under-immunization, routine immunization, and population immunity, as well as seasonality and a history of cases, were used to project future risk of cases. The expected number of cases in each district in a 6-month period was predicted using indicators from the previous 6 months and the estimated coefficients from the model. The model achieves an average predictive accuracy of 90%, as measured by the area under the receiver operating characteristic (ROC) curve, over the past 3 years of cases. The risk of poliovirus has decreased dramatically in many of the key reservoir areas in Pakistan. The results of this model have been used to prioritize sub-national areas in Pakistan to receive additional immunization activities, additional monitoring, or other special interventions.
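
    A greatly simplified, non-spatial two-part analogue of the hurdle model described above can be sketched as a logistic model for whether any paralytic case occurs and a count model for the outbreak size given that one occurs. The simulated data, covariate names, and the use of an ordinary (rather than zero-truncated) Poisson regression without spatial random effects are all simplifying assumptions; the published model is hierarchical Bayesian.

        import numpy as np
        from sklearn.linear_model import LogisticRegression, PoissonRegressor
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        n_districts = 200
        X = rng.normal(size=(n_districts, 3))        # e.g. under-immunization, campaign history, seasonality
        lam = np.exp(0.4 * X[:, 0] - 0.6 * X[:, 1])
        y = rng.poisson(lam) * rng.binomial(1, 0.4, n_districts)   # case counts with many structural zeros

        any_case = (y > 0).astype(int)
        occurrence = LogisticRegression(max_iter=1000).fit(X, any_case)
        size_given_case = PoissonRegressor().fit(X[y > 0], y[y > 0])

        p_any = occurrence.predict_proba(X)[:, 1]
        expected_cases = p_any * size_given_case.predict(X)         # expected cases per district
        print("AUC for predicting any case:", round(roc_auc_score(any_case, p_any), 3))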

  15. Lifetime and 5 years risk of breast cancer and attributable risk factor according to Gail model in Iranian women

    PubMed Central

    Mohammadbeigi, Abolfazl; Mohammadsalehi, Narges; Valizadeh, Razieh; Momtaheni, Zeinab; Mokhtari, Mohsen; Ansari, Hossein

    2015-01-01

    Introduction: Breast cancer is the most commonly diagnosed cancer in women worldwide and in Iran. It is expected to account for 29% of all new cancers in women in 2015. This study aimed to assess the 5-year and lifetime risk of breast cancer according to the Gail model, and to evaluate the effect of other additional risk factors on the Gail risk. Materials and Methods: A cross-sectional study was conducted on 296 women aged over 34 years in Qom, in the center of Iran. The Breast Cancer Risk Assessment Tool was used to calculate the Gail risk for each subject. Data were analyzed by paired t-test, independent t-test, and analysis of variance in a bivariate approach to evaluate the effect of each factor on the Gail risk. Multiple linear regression models with the stepwise method were used to predict the effect of each variable on the Gail risk. Results: The mean age of the participants was 47.8 ± 8.8 years, and 47% were of Fars ethnicity. The 5-year and lifetime risks were 0.37 ± 0.18 and 4.48 ± 0.925%, respectively, which were lower than the average risk for women of the same race and age (P < 0.001). Being single, a positive family history of breast cancer, a positive history of biopsy, and radiotherapy, as well as using non-hormonal contraceptives, were related to higher lifetime risk (P < 0.05). Moreover, a significant direct correlation was observed between lifetime risk and body mass index, age at first live birth, and menarche age, while an inverse correlation was observed between lifetime risk of breast cancer and total months of breast-feeding duration and age. Conclusion: Based on our results, the 5-year and lifetime risks of breast cancer according to the Gail model were lower than those for women of the same race and age. Moreover, by comparison with national epidemiologic indicators of breast cancer morbidity and mortality, it seems that the Gail model overestimates the risk of breast cancer in Iranian women. PMID:26229355

  16. A methodology for modeling regional terrorism risk.

    PubMed

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.

  17. Versatility of Cooperative Transcriptional Activation: A Thermodynamical Modeling Analysis for Greater-Than-Additive and Less-Than-Additive Effects

    PubMed Central

    Frank, Till D.; Carmody, Aimée M.; Kholodenko, Boris N.

    2012-01-01

    We derive a statistical model of transcriptional activation using equilibrium thermodynamics of chemical reactions. We examine to what extent this statistical model predicts synergy effects of cooperative activation of gene expression. We determine parameter domains in which greater-than-additive and less-than-additive effects are predicted for cooperative regulation by two activators. We show that the statistical approach can be used to identify different causes of synergistic greater-than-additive effects: nonlinearities of the thermostatistical transcriptional machinery and three-body interactions between RNA polymerase and two activators. In particular, our model-based analysis suggests that at low transcription factor concentrations cooperative activation cannot yield synergistic greater-than-additive effects, i.e., DNA transcription can only exhibit less-than-additive effects. Accordingly, transcriptional activity turns from synergistic greater-than-additive responses at relatively high transcription factor concentrations into less-than-additive responses at relatively low concentrations. In addition, two types of re-entrant phenomena are predicted. First, our analysis predicts that under particular circumstances transcriptional activity will feature a sequence of less-than-additive, greater-than-additive, and eventually less-than-additive effects when for fixed activator concentrations the regulatory impact of activators on the binding of RNA polymerase to the promoter increases from weak, to moderate, to strong. Second, for appropriate promoter conditions when activator concentrations are increased then the aforementioned re-entrant sequence of less-than-additive, greater-than-additive, and less-than-additive effects is predicted as well. Finally, our model-based analysis suggests that even for weak activators that individually induce only negligible increases in promoter activity, promoter activity can exhibit greater-than-additive responses when
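
    A generic thermodynamic (partition-function) formulation of the kind the abstract refers to can be sketched for two activators A1 and A2 and RNA polymerase P; the symbols and parameterization below are illustrative, not the authors' exact notation. Writing q_i for the statistical weight of factor i being bound (proportional to its concentration divided by its dissociation constant) and omega terms for pairwise and three-body cooperative interactions, the probability that the polymerase occupies the promoter is

        Z = 1 + q_P + q_1 + q_2 + q_P q_1 \omega_1 + q_P q_2 \omega_2
              + q_1 q_2 \omega_{12} + q_P q_1 q_2 \omega_1 \omega_2 \omega_{12},

        P_{\mathrm{bound}} = \frac{q_P \left(1 + q_1 \omega_1 + q_2 \omega_2
              + q_1 q_2 \omega_1 \omega_2 \omega_{12}\right)}{Z}.

    Greater-than-additive (synergistic) or less-than-additive behaviour then corresponds to whether the increase in activity with both activators present exceeds or falls short of the sum of the increases obtained with each activator alone, which depends on the magnitudes of q_1 and q_2 and on the interaction terms.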

  18. Boosting structured additive quantile regression for longitudinal childhood obesity data.

    PubMed

    Fenske, Nora; Fahrmeir, Ludwig; Hothorn, Torsten; Rzehak, Peter; Höhle, Michael

    2013-07-25

    Childhood obesity and the investigation of its risk factors has become an important public health issue. Our work is based on and motivated by a German longitudinal study including 2,226 children with up to ten measurements on their body mass index (BMI) and risk factors from birth to the age of 10 years. We introduce boosting of structured additive quantile regression as a novel distribution-free approach for longitudinal quantile regression. The quantile-specific predictors of our model include conventional linear population effects, smooth nonlinear functional effects, varying-coefficient terms, and individual-specific effects, such as intercepts and slopes. Estimation is based on boosting, a computer intensive inference method for highly complex models. We propose a component-wise functional gradient descent boosting algorithm that allows for penalized estimation of the large variety of different effects, particularly leading to individual-specific effects shrunken toward zero. This concept allows us to flexibly estimate the nonlinear age curves of upper quantiles of the BMI distribution, both on population and on individual-specific level, adjusted for further risk factors and to detect age-varying effects of categorical risk factors. Our model approach can be regarded as the quantile regression analog of Gaussian additive mixed models (or structured additive mean regression models), and we compare both model classes with respect to our obesity data.
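
    The article's component-wise functional gradient boosting is implemented in R (the mboost framework); as a loose, tree-based analogue only, the sketch below fits an upper-quantile (90th percentile) regression of a toy BMI-versus-age relationship using scikit-learn's quantile loss. The data, the chosen quantile, and all tuning values are hypothetical, and the individual-specific (mixed) effects of the original model are not represented.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(2)
        age = rng.uniform(0, 10, 500)                         # age in years (toy data)
        bmi = 15 + 0.4 * age + rng.normal(0, 1 + 0.2 * age)   # heteroscedastic toy BMI values

        X = age.reshape(-1, 1)
        q90 = GradientBoostingRegressor(loss="quantile", alpha=0.9, n_estimators=300,
                                        max_depth=2, learning_rate=0.05).fit(X, bmi)

        grid = np.linspace(0, 10, 50).reshape(-1, 1)
        upper_curve = q90.predict(grid)                       # nonlinear age curve of the 90th BMI percentile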

  19. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  20. Risk modelling study for carotid endarterectomy.

    PubMed

    Kuhan, G; Gardiner, E D; Abidia, A F; Chetter, I C; Renwick, P M; Johnson, B F; Wilkinson, A R; McCollum, P T

    2001-12-01

    The aims of this study were to identify factors that influence the risk of stroke or death following carotid endarterectomy (CEA) and to develop a model to aid in comparative audit of vascular surgeons and units. A series of 839 CEAs performed by four vascular surgeons between 1992 and 1999 was analysed. Multiple logistic regression analysis was used to model the effect of 15 possible risk factors on the 30-day risk of stroke or death. Outcome was compared for four surgeons and two units after adjustment for the significant risk factors. The overall 30-day stroke or death rate was 3.9 per cent (29 of 741). Heart disease, diabetes and stroke were significant risk factors. The 30-day predicted stroke or death rates increased with increasing risk scores. The observed 30-day stroke or death rate was 3.9 per cent for both vascular units and varied from 3.0 to 4.2 per cent for the four vascular surgeons. Differences in the outcomes between the surgeons and vascular units did not reach statistical significance after risk adjustment. Diabetes, heart disease and stroke are significant risk factors for stroke or death following CEA. The risk score model identified patients at higher risk and aided in comparative audit.

  1. Effects of additional food in a delayed predator-prey model.

    PubMed

    Sahoo, Banshidhar; Poria, Swarup

    2015-03-01

    We examine the effects of supplying additional food to the predator in a gestation-delay-induced predator-prey system with habitat complexity. In our model, additional food works in favor of predator growth and reduces the predator's attack rate on prey, so the predator population can be controlled by supplying additional food. Taking the time delay as the bifurcation parameter, the stability of the coexisting equilibrium point is analyzed. Hopf bifurcation analysis is done with respect to the time delay in the presence of additional food. The direction of the Hopf bifurcations and the stability of the bifurcated periodic solutions are determined by applying normal form theory and the center manifold theorem. The qualitative dynamical behavior of the model is simulated using experimental parameter values. It is observed that fluctuations of the population size can be controlled either by supplying additional food suitably or by increasing the degree of habitat complexity. It is pointed out that Hopf bifurcation occurs in the system when the delay crosses a critical value, and this critical value of the delay depends strongly on the quality and quantity of the supplied additional food. Therefore, the variation of the predator population significantly affects the dynamics of the model. Model results are compared with experimental results, and biological implications of the analytical findings are discussed in the conclusion section. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Development and Validation of Osteoporosis Risk-Assessment Model for Korean Men

    PubMed Central

    Oh, Sun Min; Song, Bo Mi; Nam, Byung-Ho; Rhee, Yumie; Moon, Seong-Hwan; Kim, Deog Young; Kang, Dae Ryong

    2016-01-01

    Purpose The aim of the present study was to develop an osteoporosis risk-assessment model to identify high-risk individuals among Korean men. Materials and Methods The study used data from 1340 and 1110 men ≥50 years who participated in the 2009 and 2010 Korean National Health and Nutrition Examination Survey, respectively, for development and validation of an osteoporosis risk-assessment model. Osteoporosis was defined as T score ≤-2.5 at either the femoral neck or lumbar spine. Performance of the candidate models and the Osteoporosis Self-assessment Tool for Asian (OSTA) was compared with sensitivity, specificity, and area under the receiver operating characteristics curve (AUC). A net reclassification improvement was further calculated to compare the developed Korean Osteoporosis Risk-Assessment Model for Men (KORAM-M) with OSTA. Results In the development dataset, the prevalence of osteoporosis was 8.1%. KORAM-M, consisting of age and body weight, had a sensitivity of 90.8%, a specificity of 42.4%, and an AUC of 0.666 with a cut-off score of -9. In the validation dataset, similar results were shown: sensitivity 87.9%, specificity 39.7%, and AUC 0.638. Additionally, risk categorization with KORAM-M showed improved reclassification over that of OSTA up to 22.8%. Conclusion KORAM-M can be simply used as a pre-screening tool to identify candidates for dual energy X-ray absorptiometry tests. PMID:26632400

  3. A calibration hierarchy for risk models was defined: from utopia to empirical data.

    PubMed

    Van Calster, Ben; Nieboer, Daan; Vergouwe, Yvonne; De Cock, Bavo; Pencina, Michael J; Steyerberg, Ewout W

    2016-06-01

    Calibrated risk models are vital for valid decision support. We define four levels of calibration and describe implications for model development and external validation of predictions. We present results based on simulated data sets. A common definition of calibration is "having an event rate of R% among patients with a predicted risk of R%," which we refer to as "moderate calibration." Weaker forms of calibration only require the average predicted risk (mean calibration) or the average prediction effects (weak calibration) to be correct. "Strong calibration" requires that the event rate equals the predicted risk for every covariate pattern. This implies that the model is fully correct for the validation setting. We argue that this is unrealistic: the model type may be incorrect, the linear predictor is only asymptotically unbiased, and all nonlinear and interaction effects would have to be correctly modeled. In addition, we prove that moderate calibration guarantees nonharmful decision making. Finally, results indicate that a flexible assessment of calibration in small validation data sets is problematic. Strong calibration is desirable for individualized decision support but is unrealistic and counterproductive, because it stimulates the development of overly complex models. Model development and external validation should focus on moderate calibration. Copyright © 2016 Elsevier Inc. All rights reserved.
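
    The notions of mean and moderate calibration described above can be checked directly on a validation set: mean calibration compares the average predicted risk with the observed event rate, while moderate calibration compares observed event rates within bands of predicted risk. The sketch below does this on simulated data that are perfectly calibrated by construction; the binning choice and sample size are arbitrary.

        import numpy as np
        from sklearn.calibration import calibration_curve

        rng = np.random.default_rng(3)
        y_prob = rng.uniform(0.01, 0.6, 2000)        # predicted risks on a validation set (simulated)
        y_true = rng.binomial(1, y_prob)             # outcomes generated so that the model is calibrated

        print("mean predicted risk:", y_prob.mean(), "vs observed event rate:", y_true.mean())
        obs_rate, mean_pred = calibration_curve(y_true, y_prob, n_bins=10, strategy="quantile")
        # plotting obs_rate against mean_pred (ideally with a smoother) gives the calibration curve;
        # points near the diagonal indicate moderate calibration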

  4. Comprehensive European dietary exposure model (CEDEM) for food additives.

    PubMed

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
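
    The underlying deterministic calculation, summing consumption times concentration over food categories and dividing by body weight, can be written in a few lines. Everything in the sketch below (category names, consumption amounts, permitted levels, body weight) is hypothetical and serves only to illustrate the structure of such an algorithm, not EFSA's published figures.

        consumption_g_per_day = {"soft drinks": 250.0, "confectionery": 30.0, "sauces": 20.0}
        max_permitted_mg_per_kg = {"soft drinks": 250.0, "confectionery": 300.0, "sauces": 500.0}
        body_weight_kg = 60.0

        exposure_mg_per_kg_bw_day = sum(
            consumption_g_per_day[food] / 1000.0 * max_permitted_mg_per_kg[food]
            for food in consumption_g_per_day
        ) / body_weight_kg
        print(round(exposure_mg_per_kg_bw_day, 2), "mg/kg bw/day")    # compare against the ADI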

  5. Geographic exposure risk of variant Creutzfeldt-Jakob disease in US blood donors: a risk-ranking model to evaluate alternative donor-deferral policies.

    PubMed

    Yang, Hong; Huang, Yin; Gregori, Luisa; Asher, David M; Bui, Travis; Forshee, Richard A; Anderson, Steven A

    2017-04-01

    Variant Creutzfeldt-Jakob disease (vCJD) has been transmitted by blood transfusion (TTvCJD). The US Food and Drug Administration (FDA) recommends deferring blood donors who resided in or traveled to 30 European countries where they may have been exposed to bovine spongiform encephalopathy (BSE) through beef consumption. Those recommendations warrant re-evaluation, because new cases of BSE and vCJD have markedly abated. The FDA developed a risk-ranking model to calculate the geographic vCJD risk using country-specific case rates and person-years of exposure of US blood donors. We used the reported country vCJD case rates, when available, or imputed vCJD case rates from reported BSE and UK beef exports during the risk period. We estimated the risk reduction and donor loss should the deferral be restricted to a few high-risk countries. We also estimated additional risk reduction by leukocyte reduction (LR) of red blood cells (RBCs). The United Kingdom, Ireland, and France had the greatest vCJD risk, contributing approximately 95% of the total risk. The model estimated that deferring US donors who spent extended periods of time in these three countries, combined with currently voluntary LR (95% of RBC units), would reduce the vCJD risk by 89.3%, a reduction similar to that achieved under the current policy (89.8%). Limiting deferrals to exposure in these three countries would potentially allow donations from an additional 100,000 donors who are currently deferred. Our analysis suggests that a deferral option focusing on the three highest risk countries would achieve a level of blood safety similar to that achieved by the current policy. © 2016 AABB.

  6. Competing risks models and time-dependent covariates

    PubMed Central

    Barnett, Adrian; Graves, Nick

    2008-01-01

    New statistical models for analysing survival data in an intensive care unit context have recently been developed. Two models that offer significant advantages over standard survival analyses are competing risks models and multistate models. Wolkewitz and colleagues used a competing risks model to examine survival times for nosocomial pneumonia and mortality. Their model was able to incorporate time-dependent covariates and so examine how risk factors that changed with time affected the chances of infection or death. We briefly explain how an alternative modelling technique (using logistic regression) can more fully exploit time-dependent covariates for this type of data. PMID:18423067

  7. Synergistic effect of rice husk addition on hydrothermal treatment of sewage sludge: fate and environmental risk of heavy metals.

    PubMed

    Shi, Wansheng; Liu, Chunguang; Shu, Youju; Feng, Chuanping; Lei, Zhongfang; Zhang, Zhenya

    2013-12-01

    Hydrothermal treatment (HTT) at 200°C was applied to immobilize heavy metals (HMs) and the effect of rice husk (RH) addition was investigated based on total HMs concentration, fractionation and leaching tests. The results indicated that a synergistic effect of RH addition and HTT could be achieved on reducing the risk of HMs from medium and low risk to no risk. Metals were redistributed and transformed from weakly bounded state to stable state during the HTT process under RH addition. Notably at a RH/sludge ratio of 1/1.75 (d.w.), all the HMs showed no eco-toxicity and no leaching toxicity, with the concentrations of leachable Cr, Ni, Cu and Cd decreased by 17%, 89%, 95% and 93%, respectively. This synergistic effect of RH addition and HTT on the risk reduction of HMs implies that HTT process with RH addition could be a promising and safe disposal technology for sewage sludge treatment in practice. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. A measurement model of perinatal stressors: identifying risk for postnatal emotional distress in mothers of high-risk infants.

    PubMed

    DeMier, R L; Hynan, M T; Hatfield, R F; Varner, M W; Harris, H B; Manniello, R L

    2000-01-01

    A measurement model of perinatal stressors was first evaluated for reliability and then used to identify risk factors for postnatal emotional distress in high-risk mothers. In Study 1, six measures (gestational age of the baby, birthweight, length of the baby's hospitalization, a postnatal complications rating for the infant, and Apgar scores at 1 and 5 min) were obtained from chart reviews of preterm births at two different hospitals. Confirmatory factor analyses revealed that the six measures could be accounted for by three factors: (a) Infant Maturity, (b) Apgar Ratings, and (c) Complications. In Study 2, a modified measurement model indicated that Infant Maturity and Complications were significant predictors of postnatal emotional distress in an additional sample of mothers. This measurement model may also be useful in predicting (a) other measures of psychological distress in parents, and (b) measures of cognitive and motor development in infants.

  9. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  10. Modeling risk of occupational zoonotic influenza infection in swine workers.

    PubMed

    Paccha, Blanca; Jones, Rachael M; Gibbs, Shawn; Kane, Michael J; Torremorell, Montserrat; Neira-Ramirez, Victor; Rabinowitz, Peter M

    2016-08-01

    Zoonotic transmission of influenza A virus (IAV) between swine and workers in swine production facilities may play a role in the emergence of novel influenza strains with pandemic potential. Guidelines to prevent transmission of influenza to swine workers have been developed but there is a need for evidence-based decision-making about protective measures such as respiratory protection. A mathematical model was applied to estimate the risk of occupational IAV exposure to swine workers by contact and airborne transmission, and to evaluate the use of respirators to reduce transmission.  The Markov model was used to simulate the transport and exposure of workers to IAV in a swine facility. A dose-response function was used to estimate the risk of infection. This approach is similar to methods previously used to estimate the risk of infection in human health care settings. This study uses concentration of virus in air from field measurements collected during outbreaks of influenza in commercial swine facilities, and analyzed by polymerase chain reaction.  It was found that spending 25 min working in a barn during an influenza outbreak in a swine herd could be sufficient to cause zoonotic infection in a worker. However, this risk estimate was sensitive to estimates of viral infectivity to humans. Wearing an excellent fitting N95 respirator reduced this risk, but with high aerosol levels the predicted risk of infection remained high under certain assumptions.  The results of this analysis indicate that under the conditions studied, swine workers are at risk of zoonotic influenza infection. The use of an N95 respirator could reduce such risk. These findings have implications for risk assessment and preventive programs targeting swine workers. The exact level of risk remains uncertain, since our model may have overestimated the viability or infectivity of IAV. Additionally, the potential for partial immunity in swine workers associated with repeated low
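
    The dose-response step referred to above is often taken to be the exponential model, P(infection) = 1 - exp(-r * dose), in quantitative microbial risk assessment. The sketch below illustrates only that functional form and the effect of a filtration factor for a respirator; the dose, the parameter r, and the assumed 95% exposure reduction are hypothetical, not the study's field measurements.

        import numpy as np

        def infection_risk(dose, r):
            """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
            return 1.0 - np.exp(-r * dose)

        inhaled_dose = 50.0                                     # hypothetical dose over a 25-min task
        for label, factor in [("no respirator", 1.0), ("well-fitted N95 (assumed)", 0.05)]:
            print(label, round(infection_risk(inhaled_dose * factor, r=0.05), 3))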

  11. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.

  12. Managing security risks for inter-organisational information systems: a multiagent collaborative model

    NASA Astrophysics Data System (ADS)

    Feng, Nan; Wu, Harris; Li, Minqiang; Wu, Desheng; Chen, Fuzan; Tian, Jin

    2016-09-01

    Information sharing across organisations is critical to effectively managing the security risks of inter-organisational information systems. Nevertheless, few previous studies on information systems security have focused on inter-organisational information sharing, and none have studied the sharing of inferred beliefs versus factual observations. In this article, a multiagent collaborative model (MACM) is proposed as a practical solution to assess the risk level of each allied organisation's information system and support proactive security treatment by sharing beliefs on event probabilities as well as factual observations. In MACM, for each allied organisation's information system, we design four types of agents: inspection agent, analysis agent, control agent, and communication agent. By sharing soft findings (beliefs) in addition to hard findings (factual observations) among the organisations, each organisation's analysis agent is capable of dynamically predicting its security risk level using a Bayesian network. A real-world implementation illustrates how our model can be used to manage security risks in distributed information systems and that sharing soft findings leads to lower expected loss from security risks.
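
    A toy version of the analysis agent's Bayesian-network risk prediction can be sketched with the pgmpy library. The network structure, node names, and probability tables below are invented for illustration (a local vulnerability finding and a shared threat belief driving an incident node); they are not the structure or probabilities used in MACM.

        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        model = BayesianNetwork([("Vulnerability", "Incident"), ("SharedThreatBelief", "Incident")])
        model.add_cpds(
            TabularCPD("Vulnerability", 2, [[0.7], [0.3]]),
            TabularCPD("SharedThreatBelief", 2, [[0.8], [0.2]]),
            TabularCPD("Incident", 2,
                       [[0.99, 0.90, 0.80, 0.40],    # P(Incident = no  | parents)
                        [0.01, 0.10, 0.20, 0.60]],   # P(Incident = yes | parents)
                       evidence=["Vulnerability", "SharedThreatBelief"], evidence_card=[2, 2]),
        )
        assert model.check_model()

        # update the predicted security risk level when an allied organisation shares a threat belief
        posterior = VariableElimination(model).query(["Incident"], evidence={"SharedThreatBelief": 1})
        print(posterior)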

  13. A review of the additive health risk of cannabis and tobacco co-use.

    PubMed

    Meier, Ellen; Hatsukami, Dorothy K

    2016-09-01

    Cannabis and tobacco are the most widely used substances, and are often used together. The present review examines the toxicant exposure associated with co-use (e.g., carbon monoxide, carcinogens), co-use via electronic nicotine delivery systems (ENDS), and problematic methodological issues present across co-use studies. An extensive literature search through PubMed was conducted and studies utilizing human subjects and in vitro methods were included. Keywords included tobacco, cigarette, e-cigarette, ENDS, smoking, or nicotine AND marijuana OR cannabis OR THC. Co-use may pose additive risk for toxicant exposure as certain co-users (e.g., blunt users) tend to have higher breath carbon monoxide levels and cannabis smoke can have higher levels of some carcinogens than tobacco smoke. Cannabis use via ENDS is low and occurs primarily among established tobacco or cannabis users, but its incidence may be increasing and expanding to tobacco/cannabis naïve individuals. There are several methodological issues across co-use research including varying definitions of co-use, sample sizes, lack of control for important covariates (e.g., time since last cigarette), and inconsistent measurement of outcome variables. There are some known additive risks for toxicant exposure as a result of co-use. Research utilizing consistent methodologies is needed to further establish the additive risk of co-use. Future research should also be aware of novel technologies (e.g., ENDS) as they likely alter some toxicant exposure when used alone and with cannabis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Increased risk of sadness and suicidality among victims of bullying experiencing additional threats to physical safety.

    PubMed

    Pham, Tammy B; Adesman, Andrew

    2017-11-23

    Objective To examine, in a nationally-representative sample of high school students, to what extent one or more additional threats to physical safety exacerbates the risk of sadness and suicidality among victims of school and/or cyber-bullying. Methods National data from the 2015 Youth Risk Behavior Survey (YRBS) were analyzed for grades 9-12 (n = 15,624). Victimization groups were characterized by school-bullying and cyber-bullying, with and without additional threats to physical safety: fighting at school, being threatened/injured at school, and skipping school out of fear for one's safety. Outcomes included 2-week sadness and suicidality. Outcomes for victimization groups were compared to non-victims using logistic regression adjusting for sex, grade and race/ethnicity. Results Overall, 20.2% of students were school-bullied, and 15.5% were cyber-bullied in the past year. Compared to non-victims, victims of school-bullying and victims of cyber-bullying (VoCBs) who did not experience additional threats to physical safety were 2.76 and 3.83 times more likely to report 2-week sadness, and 3.39 and 3.27 times more likely to exhibit suicidality, respectively. Conversely, victims of bullying who experienced one or more additional threats to physical safety were successively more likely to report these adverse outcomes. Notably, victims of school-bullying and VoCBs with all three additional risk factors were 13.13 and 17.75 times more likely to exhibit suicidality, respectively. Conclusion Risk of depression symptoms and suicidality among victims of school-bullying and/or cyber-bullying is greatly increased among those who have experienced additional threats to physical safety: fighting at school, being threatened/injured at school and skipping school out of fear for their safety.

  15. Extracting additional risk managers information from a risk assessment of Listeria monocytogenes in deli meats.

    PubMed

    Pérez-Rodríguez, F; van Asselt, E D; Garcia-Gimeno, R M; Zurera, G; Zwietering, M H

    2007-05-01

    The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and stakeholders to make decisions on food safety management. The present study was conducted to investigate how detailed sensitivity analysis can be used by assessors to extract more information on risk factors and how results can be communicated to managers and stakeholders in an understandable way. The extended sensitivity analysis revealed that the extremes at the right side of the dose distribution (at consumption, 9 to 11.5 log CFU per serving) were responsible for most of the cases of listeriosis simulated. For concentration at retail, values below the detection limit of 0.04 CFU/g and the often used limit for L. monocytogenes of 100 CFU/g (also at retail) were associated with a high number of annual cases of listeriosis (about 29 and 82%, respectively). This association can be explained by growth of L. monocytogenes at both average and extreme values of temperature and time, indicating that a wide distribution can lead to high risk levels. Another finding is the importance of the maximal population density (i.e., the maximum concentration of L. monocytogenes assumed at a certain temperature) for accurately estimating the risk of infection by opportunistic pathogens such as L. monocytogenes. According to the obtained results, mainly concentrations corresponding to the highest maximal population densities caused risk in the simulation. However, sensitivity analysis applied to the uncertainty parameters revealed that prevalence at retail was the most important source of uncertainty in the model.

  16. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
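
    As a rough, non-Bayesian stand-in for the MCMC/BMA workflow described above, the sketch below fits the same two candidate forms (logistic and quantal-linear) to toy quantal bioassay data by maximum likelihood, converts AIC differences into model weights, and reports a model-averaged benchmark dose at 10% extra risk. The data, starting values, and the use of AIC weights in place of posterior model probabilities are all simplifying assumptions, and no BMDL (lower bound) is computed.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        dose = np.array([0.0, 1.0, 3.0, 10.0])       # toy bioassay: dose, animals tested, responders
        n = np.array([50, 50, 50, 50])
        k = np.array([2, 5, 12, 30])

        def nll(model_fun, theta):
            p = np.clip(model_fun(dose, theta), 1e-9, 1 - 1e-9)
            return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

        logistic = lambda d, t: expit(t[0] + t[1] * d)
        quantal_linear = lambda d, t: expit(t[0]) + (1 - expit(t[0])) * (1 - np.exp(-np.exp(t[1]) * d))

        fits = {}
        for name, fun, start in [("logistic", logistic, [-3.0, 0.3]),
                                 ("quantal-linear", quantal_linear, [-3.0, -2.0])]:
            res = minimize(lambda t: nll(fun, t), start, method="Nelder-Mead")
            fits[name] = (fun, res.x, 2 * len(res.x) + 2 * res.fun)     # AIC = 2k + 2*NLL

        def bmd(fun, theta, bmr=0.10):
            p0 = fun(0.0, theta)
            target = p0 + bmr * (1 - p0)                                # 10% extra risk over background
            grid = np.linspace(1e-4, dose.max(), 20000)
            return grid[np.argmin(np.abs(fun(grid, theta) - target))]

        aic = np.array([fits[m][2] for m in fits])
        weights = np.exp(-0.5 * (aic - aic.min()))
        weights /= weights.sum()
        bmds = np.array([bmd(fits[m][0], fits[m][1]) for m in fits])
        print(dict(zip(fits, np.round(weights, 3))), "model-averaged BMD:", round(float(weights @ bmds), 3))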

  17. Testing the Predictive Validity of the Hendrich II Fall Risk Model.

    PubMed

    Jung, Hyesil; Park, Hyeoun-Ae

    2018-03-01

    Cumulative data on patient fall risk have been compiled in electronic medical records systems, and it is possible to test the validity of fall-risk assessment tools using these data between the times of admission and occurrence of a fall. The Hendrich II Fall Risk Model scores assessed at three time points of hospital stays were extracted and used for testing the predictive validity: (a) the score upon admission, (b) the maximum fall-risk score between admission and falling or discharge, and (c) the score immediately before falling or discharge. Predictive validity was examined using seven predictive indicators. In addition, logistic regression analysis was used to identify factors that significantly affect the occurrence of a fall. Among the different time points, the maximum fall-risk score assessed between admission and falling or discharge showed the best predictive performance. Confusion or disorientation and a poor ability to rise from a sitting position were significant risk factors for a fall.

  18. Calibration plots for risk prediction models in the presence of competing risks.

    PubMed

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-08-15

    A predicted risk of 17% can be called reliable if it can be expected that the event will occur to about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often encountered practical problems when the aim is to display or test if a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves for competing risks models based on jackknife pseudo-values that are combined with a nearest neighborhood smoother and a cross-validation approach to deal with all three problems. Copyright © 2014 John Wiley & Sons, Ltd.
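
    The jackknife pseudo-value underlying the proposed calibration curves is theta_i = n * theta_hat - (n - 1) * theta_hat_(-i), where theta_hat is an estimate of, for example, a cumulative incidence or survival probability at a fixed time, computed with and without subject i. The sketch below computes pseudo-values for the simpler all-cause survival probability at a fixed time using Kaplan-Meier estimates from the lifelines package; the data are simulated, and the competing-risks (Aalen-Johansen) version and the nearest-neighbour smoothing and cross-validation steps proposed in the article are not shown.

        import numpy as np
        from lifelines import KaplanMeierFitter

        rng = np.random.default_rng(4)
        T = rng.exponential(10.0, 200)               # simulated follow-up times
        E = rng.binomial(1, 0.7, 200)                # 1 = event observed, 0 = right censored
        t0 = 5.0                                     # time point of interest

        def km_surv_at(times, events, t):
            return KaplanMeierFitter().fit(times, events).predict(t)

        n = len(T)
        s_full = km_surv_at(T, E, t0)
        pseudo = np.array([n * s_full - (n - 1) * km_surv_at(np.delete(T, i), np.delete(E, i), t0)
                           for i in range(n)])
        # regressing `pseudo` on the model's predicted risks (with a smoother) yields a calibration
        # curve that remains usable under right censoring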

  19. Modelling the behaviour of additives in gun barrels

    NASA Astrophysics Data System (ADS)

    Rhodes, N.; Ludwig, J. C.

    1986-01-01

    A mathematical model which predicts the flow and heat transfer in a gun barrel is described. The model is transient and two-dimensional, and equations are solved for the velocities and enthalpies of a gas phase, which arises from the combustion of the propellant and cartridge case, and of the particle additives released from the case, as well as for the volume fractions of the gas and particles. Closure of the equations is obtained using a two-equation turbulence model. Preliminary calculations are described in which the proportions of particle additives in the cartridge case were altered. The model gives a good prediction of the ballistic performance and the gas-to-wall heat transfer. However, the expected magnitude of the reduction in heat transfer when particles are present is not predicted. The predictions of gas flow invalidate some of the assumptions made regarding case and propellant behavior during combustion, and further work is required to investigate these effects and other possible interactions, both chemical and physical, between gas and particles.

  20. Relative Importance and Additive Effects of Maternal and Infant Risk Factors on Childhood Asthma

    PubMed Central

    Rosas-Salazar, Christian; James, Kristina; Escobar, Gabriel; Gebretsadik, Tebeb; Li, Sherian Xu; Carroll, Kecia N.; Walsh, Eileen; Mitchel, Edward; Das, Suman; Kumar, Rajesh; Yu, Chang; Dupont, William D.; Hartert, Tina V.

    2016-01-01

    Background Environmental exposures that occur in utero and during early life may contribute to the development of childhood asthma through alteration of the human microbiome. The objectives of this study were to estimate the cumulative effect and relative importance of environmental exposures on the risk of childhood asthma. Methods We conducted a population-based birth cohort study of mother-child dyads who were born between 1995 and 2003 and were continuously enrolled in the PRIMA (Prevention of RSV: Impact on Morbidity and Asthma) cohort. The individual and cumulative impact of maternal urinary tract infections (UTI) during pregnancy, maternal colonization with group B streptococcus (GBS), mode of delivery, infant antibiotic use, and older siblings at home, on the risk of childhood asthma were estimated using logistic regression. Dose-response effect on childhood asthma risk was assessed for continuous risk factors: number of maternal UTIs during pregnancy, courses of infant antibiotics, and number of older siblings at home. We further assessed and compared the relative importance of these exposures on the asthma risk. In a subgroup of children for whom maternal antibiotic use during pregnancy information was available, the effect of maternal antibiotic use on the risk of childhood asthma was estimated. Results Among 136,098 singleton birth infants, 13.29% developed asthma. In both univariate and adjusted analyses, maternal UTI during pregnancy (odds ratio [OR] 1.2, 95% confidence interval [CI] 1.18, 1.25; adjusted OR [AOR] 1.04, 95%CI 1.02, 1.07 for every additional UTI) and infant antibiotic use (OR 1.21, 95%CI 1.20, 1.22; AOR 1.16, 95%CI 1.15, 1.17 for every additional course) were associated with an increased risk of childhood asthma, while having older siblings at home (OR 0.92, 95%CI 0.91, 0.93; AOR 0.85, 95%CI 0.84, 0.87 for each additional sibling) was associated with a decreased risk of childhood asthma, in a dose-dependent manner. Compared with vaginal

  1. Relative Importance and Additive Effects of Maternal and Infant Risk Factors on Childhood Asthma.

    PubMed

    Wu, Pingsheng; Feldman, Amy S; Rosas-Salazar, Christian; James, Kristina; Escobar, Gabriel; Gebretsadik, Tebeb; Li, Sherian Xu; Carroll, Kecia N; Walsh, Eileen; Mitchel, Edward; Das, Suman; Kumar, Rajesh; Yu, Chang; Dupont, William D; Hartert, Tina V

    2016-01-01

    Environmental exposures that occur in utero and during early life may contribute to the development of childhood asthma through alteration of the human microbiome. The objectives of this study were to estimate the cumulative effect and relative importance of environmental exposures on the risk of childhood asthma. We conducted a population-based birth cohort study of mother-child dyads who were born between 1995 and 2003 and were continuously enrolled in the PRIMA (Prevention of RSV: Impact on Morbidity and Asthma) cohort. The individual and cumulative impact of maternal urinary tract infections (UTI) during pregnancy, maternal colonization with group B streptococcus (GBS), mode of delivery, infant antibiotic use, and older siblings at home, on the risk of childhood asthma were estimated using logistic regression. Dose-response effect on childhood asthma risk was assessed for continuous risk factors: number of maternal UTIs during pregnancy, courses of infant antibiotics, and number of older siblings at home. We further assessed and compared the relative importance of these exposures on the asthma risk. In a subgroup of children for whom maternal antibiotic use during pregnancy information was available, the effect of maternal antibiotic use on the risk of childhood asthma was estimated. Among 136,098 singleton birth infants, 13.29% developed asthma. In both univariate and adjusted analyses, maternal UTI during pregnancy (odds ratio [OR] 1.2, 95% confidence interval [CI] 1.18, 1.25; adjusted OR [AOR] 1.04, 95%CI 1.02, 1.07 for every additional UTI) and infant antibiotic use (OR 1.21, 95%CI 1.20, 1.22; AOR 1.16, 95%CI 1.15, 1.17 for every additional course) were associated with an increased risk of childhood asthma, while having older siblings at home (OR 0.92, 95%CI 0.91, 0.93; AOR 0.85, 95%CI 0.84, 0.87 for each additional sibling) was associated with a decreased risk of childhood asthma, in a dose-dependent manner. Compared with vaginal delivery, C

  2. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and of architectural guidelines such as ISO7498-2 will be explained. Business-process-level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the research presentation will explore the appropriate usage of these standards. The paper will discuss standards-based approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all of the mentioned standards. The proposed 3D visual presentation of the approach and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and of the defined risk and security space for modeling and measuring.

  3. Sexual risk behavior among youth: modeling the influence of prosocial activities and socioeconomic factors.

    PubMed

    Ramirez-Valles, J; Zimmerman, M A; Newcomb, M D

    1998-09-01

    Sexual activity among high-school-aged youths has steadily increased since the 1970s, emerging as a significant public health concern. Yet, patterns of youth sexual risk behavior are shaped by social class, race, and gender. Based on sociological theories of financial deprivation and collective socialization, we develop and test a model of the relationships among neighborhood poverty; family structure and social class position; parental involvement; prosocial activities; race; and gender as they predict youth sexual risk behavior. We employ structural equation modeling to test this model on a cross-sectional sample of 370 sexually active high-school students from a midwestern city; 57 percent (n = 209) are males and 86 percent are African American. We find that family structure indirectly predicts sexual risk behavior through neighborhood poverty, parental involvement, and prosocial activities. In addition, family class position indirectly predicts sexual risk behavior through neighborhood poverty and prosocial activities. We address implications for theory and health promotion.

  4. Risk factors for an additional port in single-incision laparoscopic cholecystectomy in patients with cholecystitis.

    PubMed

    Araki, Kenichiro; Shirabe, Ken; Watanabe, Akira; Kubo, Norio; Sasaki, Shigeru; Suzuki, Hideki; Asao, Takayuki; Kuwano, Hiroyuki

    2017-01-01

    Although single-incision laparoscopic cholecystectomy is now widely performed in patients with cholecystitis, some cases require an additional port to complete the procedure. In this study, we focused on risk factors for requiring an additional port in this surgery. We performed single-incision cholecystectomy in 75 patients with acute cholecystitis or after cholecystitis between 2010 and 2014 at Gunma University Hospital. Surgical indications followed the TG13 guidelines. Our standard procedure for single-incision cholecystectomy routinely uses two needlescopic devices. We used logistic regression analysis to identify the risk factors associated with use of an additional full-size port (5 or 10 mm). Surgical outcome was acceptable without biliary injury. Nine patients (12.0%) required an additional port, and one patient (1.3%) required conversion to open cholecystectomy because of severe adhesions around the cystic duct and common bile duct. In multivariate analysis, high C-reactive protein (CRP) values (>7.0 mg/dl) during cholecystitis attacks were significantly correlated with the need for an additional port (P = 0.009), with a sensitivity of 55.6%, specificity of 98.5%, and accuracy of 93.3%. This study showed that the severe inflammation indicated by high CRP values during cholecystitis attacks predicts the need for an additional port. J. Med. Invest. 64: 245-249, August, 2017.

  5. The Risk GP Model: the standard model of prediction in medicine.

    PubMed

    Fuller, Jonathan; Flores, Luis J

    2015-12-01

    With the ascent of modern epidemiology in the Twentieth Century came a new standard model of prediction in public health and clinical medicine. In this article, we describe the structure of the model. The standard model uses epidemiological measures, most commonly risk measures, to predict outcomes (prognosis) and effect sizes (treatment) in a patient population that can then be transformed into probabilities for individual patients. In the first step, a risk measure in a study population is generalized or extrapolated to a target population. In the second step, the risk measure is particularized or transformed to yield probabilistic information relevant to a patient from the target population. Hence, we call the approach the Risk Generalization-Particularization (Risk GP) Model. There are serious problems at both stages, especially with the extent to which the required assumptions will hold and the extent to which we have evidence for the assumptions. Given that there are other models of prediction that use different assumptions, we should not inflexibly commit ourselves to one standard model. Instead, model pluralism should be standard in medical prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Risk analysis of sulfites used as food additives in China.

    PubMed

    Zhang, Jian Bo; Zhang, Hong; Wang, Hua Li; Zhang, Ji Yue; Luo, Peng Jie; Zhu, Lei; Wang, Zhu Tian

    2014-02-01

    This study aimed to analyze the risk of sulfites in food consumed by the Chinese people and assess the health protection capability of the maximum-permitted level (MPL) of sulfites in GB 2760-2011. Sulfites as food additives are overused or abused in many food categories. When the MPL in GB 2760-2011 was used as the sulfite content in food, the intake of sulfites in most surveyed populations was lower than the acceptable daily intake (ADI). Excess intake of sulfites was found in all the surveyed groups when a high percentile of sulfite content in food was used. Moreover, children aged 1-6 years are at high risk of excess sulfite intake. The primary cause for the excess intake of sulfites in Chinese people is the overuse and abuse of sulfites by the food industry. The current MPL of sulfites in GB 2760-2011 protects the health of most populations. Copyright © 2014 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.

  7. Risk assessment of consuming agricultural products irrigated with reclaimed wastewater: An exposure model

    NASA Astrophysics Data System (ADS)

    van Ginneken, Meike; Oron, Gideon

    2000-09-01

    This study assesses health risks to consumers due to the use of agricultural products irrigated with reclaimed wastewater. The analysis is based on a definition of an exposure model which takes into account several parameters: (1) the quality of the applied wastewater, (2) the irrigation method, (3) the elapsed times between irrigation, harvest, and product consumption, and (4) the consumers' habits. The exposure model is used for numerical simulation of human consumers' risks using the Monte Carlo simulation method. The results of the numerical simulation show large deviations, probably caused by uncertainty (impreciseness in quality of input data) and variability due to diversity among populations. There is a difference of 10 orders of magnitude in the risk of infection between the different exposure scenarios with the same water quality. This variation indicates the need for setting risk-based criteria for wastewater reclamation rather than single water quality guidelines. Extra data are required to decrease uncertainty in the risk assessment. Future research needs to include definition of acceptable risk criteria, more accurate dose-response modeling, information regarding pathogen survival in treated wastewater, additional data related to the passage of pathogens into and within the plants during irrigation, and information regarding the behavior patterns of the community of human consumers.
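    A minimal Monte Carlo sketch of this kind of ingestion-exposure risk assessment, using a beta-Poisson dose-response form; all distributions and parameter values below are assumptions for illustration and are not the study's exposure scenarios or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Uncertain/variable inputs (all distributions are illustrative assumptions).
conc = rng.lognormal(mean=0.0, sigma=1.5, size=n)          # pathogens per g on produce at harvest
decay_days = rng.uniform(1, 7, size=n)                      # days between harvest and consumption
decay_rate = 0.5                                             # assumed per-day die-off rate
serving_g = rng.normal(100, 20, size=n).clip(10, None)      # grams consumed per serving

dose = conc * np.exp(-decay_rate * decay_days) * serving_g

# Beta-Poisson dose-response: P(infection) = 1 - (1 + dose/beta)^(-alpha)
alpha, beta = 0.3, 50.0                                      # illustrative parameters
p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)

print("median risk per serving:", np.median(p_inf))
print("95th percentile risk:   ", np.quantile(p_inf, 0.95))
```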

  8. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  9. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  10. Genomic Model with Correlation Between Additive and Dominance Effects.

    PubMed

    Xiang, Tao; Christensen, Ole Fredslund; Vitezica, Zulma Gladis; Legarra, Andres

    2018-05-09

    Dominance genetic effects are rarely included in pedigree-based genetic evaluation. With the availability of single nucleotide polymorphism markers and the development of genomic evaluation, estimates of dominance genetic effects have become feasible using genomic best linear unbiased prediction (GBLUP). Usually, studies involving additive and dominance genetic effects ignore possible relationships between them. It has often been suggested that the magnitudes of functional additive and dominance effects at quantitative trait loci are related, but there is no existing GBLUP-like approach accounting for such correlation. Wellmann and Bennewitz showed two ways of considering directional relationships between additive and dominance effects, which they estimated in a Bayesian framework. However, these relationships cannot be fitted in a mixed model at the level of individuals instead of loci, and are not compatible with standard animal or plant breeding software. This comes from a fundamental ambiguity in assigning the reference allele at a given locus. We show that, if there has been selection, assigning the most frequent allele as the reference orients the correlation between functional additive and dominance effects. As a consequence, the most frequent reference allele is expected to have a positive value. We also demonstrate that selection creates negative covariance between genotypic additive and dominance genetic values. For parameter estimation, it is possible to use a combined additive and dominance relationship matrix computed from marker genotypes, and to use standard restricted maximum likelihood (REML) algorithms based on an equivalent model. Through a simulation study, we show that such correlations can easily be estimated by mixed model software and accuracy of prediction for genetic values is slightly improved if such correlations are used in GBLUP. However, a model assuming uncorrelated effects and fitting orthogonal breeding values and dominant
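    A sketch of building genomic additive and dominance relationship matrices from 0/1/2 genotype codes, following VanRaden-type and Vitezica-type constructions; it only illustrates the relationship-matrix ingredients of such models and is not the correlated additive-dominance GBLUP of the paper above. Allele frequencies and genotypes are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ind, n_snp = 50, 500
p = rng.uniform(0.1, 0.9, n_snp)                              # allele frequencies
M = rng.binomial(2, p, size=(n_ind, n_snp)).astype(float)     # genotypes coded 0/1/2

# Additive relationship matrix (VanRaden-type): centre genotypes by 2p.
Z = M - 2 * p
G_add = Z @ Z.T / (2 * np.sum(p * (1 - p)))

# Dominance relationship matrix (Vitezica-type coding of 0/1/2 genotypes).
W = np.where(M == 0, -2 * p**2, np.where(M == 1, 2 * p * (1 - p), -2 * (1 - p)**2))
G_dom = W @ W.T / np.sum((2 * p * (1 - p)) ** 2)

print(G_add.shape, G_dom.shape)
```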

  11. Applying risk and resilience models to predicting the effects of media violence on development.

    PubMed

    Prot, Sara; Gentile, Douglas A

    2014-01-01

    Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective.

  12. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  16. Testicular Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  17. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  18. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  19. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  20. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  1. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  2. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  3. Validation of a new mortality risk prediction model for people 65 years and older in northwest Russia: The Crystal risk score.

    PubMed

    Turusheva, Anna; Frolova, Elena; Bert, Vaes; Hegendoerfer, Eralda; Degryse, Jean-Marie

    2017-07-01

    Prediction models help to make decisions about further management in clinical practice. This study aims to develop a mortality risk score based on previously identified risk predictors and to perform internal and external validations. In a population-based prospective cohort study of 611 community-dwelling individuals aged 65+ in St. Petersburg (Russia), all-cause mortality risks over 2.5 years of follow-up were determined based on the results obtained from anthropometry, medical history, physical performance tests, spirometry and laboratory tests. C-statistic, risk reclassification analysis, integrated discrimination improvement analysis, decision curve analysis, internal validation and external validation were performed. Older adults were at higher risk for mortality [HR (95%CI)=4.54 (3.73-5.52)] when two or more of the following components were present: poor physical performance, low muscle mass, poor lung function, and anemia. When anemia combined with high C-reactive protein (CRP) and high B-type natriuretic peptide (BNP) was added, the HR (95%CI) was slightly higher [5.81 (4.73-7.14)], even after adjusting for age, sex and comorbidities. Our models were validated in an external population of adults 80+. The extended model had a better predictive capacity for cardiovascular mortality [HR (95%CI)=5.05 (2.23-11.44)] compared to the baseline model [HR (95%CI)=2.17 (1.18-4.00)] in the external population. We developed and validated a new risk prediction score that may be used to identify older adults at higher risk for mortality in Russia. Additional studies are needed to determine which targeted interventions improve the outcomes of these at-risk individuals. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. A model-based analysis of decision making under risk in obsessive-compulsive and hoarding disorders.

    PubMed

    Aranovich, Gabriel J; Cavagnaro, Daniel R; Pitt, Mark A; Myung, Jay I; Mathews, Carol A

    2017-07-01

    Attitudes towards risk are highly consequential in clinical disorders thought to be prone to "risky behavior", such as substance dependence, as well as those commonly associated with excessive risk aversion, such as obsessive-compulsive disorder (OCD) and hoarding disorder (HD). Moreover, it has recently been suggested that attitudes towards risk may serve as a behavioral biomarker for OCD. We investigated the risk preferences of participants with OCD and HD using a novel adaptive task and a quantitative model from behavioral economics that decomposes risk preferences into outcome sensitivity and probability sensitivity. Contrary to expectation, compared to healthy controls, participants with OCD and HD exhibited less outcome sensitivity, implying less risk aversion in the standard economic framework. In addition, risk attitudes were strongly correlated with depression, hoarding, and compulsion scores, while compulsion (hoarding) scores were associated with more (less) "rational" risk preferences. These results demonstrate how fundamental attitudes towards risk relate to specific psychopathology and thereby contribute to our understanding of the cognitive manifestations of mental disorders. In addition, our findings indicate that the conclusion made in recent work that decision making under risk is unaltered in OCD is premature. Copyright © 2017 Elsevier Ltd. All rights reserved.
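    A sketch of the two components that behavioural-economic models of this kind typically separate: outcome sensitivity (curvature of the value function) and probability sensitivity (curvature of the probability-weighting function). The functional forms and parameter values below are generic Tversky-Kahneman-style illustrations, not the specific model or estimates of the study above.

```python
import numpy as np

def value(x, alpha=0.88, lam=2.25):
    """Power value function with loss aversion; alpha governs outcome sensitivity."""
    x = np.asarray(x, dtype=float)
    v = np.abs(x) ** alpha
    return np.where(x >= 0, v, -lam * v)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting; gamma governs probability sensitivity."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

# Subjective value of a gamble: 50% chance of gaining 100, otherwise nothing.
p, x = 0.5, 100.0
subjective = weight(p) * value(x)
expected = p * x
print(f"weighted value {float(subjective):.1f} vs expected value {expected:.1f}")
```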

  5. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of returns for the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, where we fit it to our real data. Second, we present its application in risk analysis, where we apply the model to evaluate the value at risk (VaR) and conditional value at risk (CVaR) with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating value at risk (VaR) and conditional value at risk (CVaR), as it captures the stylized facts of non-normality and leptokurtosis in the returns distribution.
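    A sketch of the idea: fit a two-component normal mixture to a return series and read VaR and CVaR off draws from the fitted mixture. Synthetic returns stand in for the FBMKLCI data, and the confidence level and sample sizes are arbitrary choices.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Synthetic heavy-tailed returns: a mixture of calm and turbulent regimes.
returns = np.concatenate([rng.normal(0.01, 0.03, 200), rng.normal(-0.02, 0.08, 40)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(returns.reshape(-1, 1))

# Simulate from the fitted mixture and estimate 95% VaR and CVaR of losses.
sim, _ = gmm.sample(100_000)
sim = sim.ravel()
var_95 = -np.quantile(sim, 0.05)             # loss threshold exceeded 5% of the time
cvar_95 = -sim[sim <= -var_95].mean()        # mean loss beyond the VaR threshold
print(f"VaR(95%) = {var_95:.3f}, CVaR(95%) = {cvar_95:.3f}")
```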

  6. Use of Prolonged Travel to Improve Pediatric Risk-Adjustment Models

    PubMed Central

    Lorch, Scott A; Silber, Jeffrey H; Even-Shoshan, Orit; Millman, Andrea

    2009-01-01

    Objective To determine whether travel variables could explain previously reported differences in lengths of stay (LOS), readmission, or death at children's hospitals versus other hospital types. Data Source Hospital discharge data from Pennsylvania between 1996 and 1998. Study Design A population cohort of children aged 1–17 years with one of 19 common pediatric conditions was created (N=51,855). Regression models were constructed to determine differences in LOS, readmission, or death between children's hospitals and other types of hospitals after adding five types of additional illness severity variables to a traditional risk-adjustment model. Principal Findings With the traditional risk-adjustment model, children traveling longer to children's or rural hospitals had longer adjusted LOS and higher readmission rates. Inclusion of either a geocoded travel time variable or a nongeocoded travel distance variable provided the largest reduction in adjusted LOS, adjusted readmission rates, and adjusted mortality rates for children's hospitals and rural hospitals compared with other types of hospitals. Conclusions Adding a travel variable to traditional severity adjustment models may improve the assessment of an individual hospital's pediatric care by reducing systematic differences between different types of hospitals. PMID:19207591

  7. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for
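    A sketch of the multistage dose-response fit that underlies the LMS approach, P(d) = 1 - exp(-(q0 + q1·d + q2·d²)) with non-negative q, fitted by maximum likelihood to made-up bioassay counts. The regulatory LMS procedure additionally replaces q1 with an upper confidence bound, which is only noted, not reproduced, here; doses, group sizes and tumour counts are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

dose = np.array([0.0, 1.0, 3.0, 10.0])        # dose groups (illustrative)
n = np.array([50, 50, 50, 50])                 # animals per group (illustrative)
tumours = np.array([2, 5, 11, 30])             # tumour-bearing animals (illustrative)

def neg_log_lik(q):
    q0, q1, q2 = q
    p = 1.0 - np.exp(-(q0 + q1 * dose + q2 * dose**2))
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(tumours * np.log(p) + (n - tumours) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.05, 0.05, 0.01],
               bounds=[(0, None)] * 3, method="L-BFGS-B")
q0, q1, q2 = fit.x
print("fitted q:", fit.x)
# Extra risk over background at a low dose; approximately q1*d when d is small.
d_low = 0.01
print("extra risk at dose 0.01:", 1 - np.exp(-(q1 * d_low + q2 * d_low**2)))
```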

  8. An electrical circuit model for additive-modified SnO2 ceramics

    NASA Astrophysics Data System (ADS)

    Karami Horastani, Zahra; Alaei, Reza; Karami, Amirhossein

    2018-05-01

    In this paper an electrical circuit model for additive-modified metal oxide ceramics based on their physical structures and electrical resistivities is presented. The model predicts the resistance of the sample at different additive concentrations and different temperatures. To evaluate the model, two types of composite ceramics, SWCNT/SnO2 with SWCNT concentrations of 0.3, 0.6, 1.2, 2.4 and 3.8%wt, and Ag/SnO2 with Ag concentrations of 0.3, 0.5, 0.8 and 1.5%wt, were prepared and their electrical resistances versus temperature were experimentally measured. It is shown that the experimental data are in good agreement with the results obtained from the model. The proposed model can be used in the design process of ceramic-based gas sensors, and it also clarifies the role of the additive in the gas sensing process of additive-modified metal oxide gas sensors. Furthermore, the model can be used in the system-level modeling of designs in which these sensors are also present.

  9. Estimating interaction on an additive scale between continuous determinants in a logistic regression model.

    PubMed

    Knol, Mirjam J; van der Tweel, Ingeborg; Grobbee, Diederick E; Numans, Mattijs E; Geerlings, Mirjam I

    2007-10-01

    To determine the presence of interaction in epidemiologic research, typically a product term is added to the regression model. In linear regression, the regression coefficient of the product term reflects interaction as departure from additivity. However, in logistic regression it refers to interaction as departure from multiplicativity. Rothman has argued that interaction estimated as departure from additivity better reflects biologic interaction. So far, the literature on estimating interaction on an additive scale using logistic regression has focused only on dichotomous determinants. The objective of the present study was to provide the methods to estimate interaction between continuous determinants and to illustrate these methods with a clinical example. From the existing literature we derived the formulas to quantify interaction as departure from additivity between one continuous and one dichotomous determinant and between two continuous determinants using logistic regression. Bootstrapping was used to calculate the corresponding confidence intervals. To illustrate the theory with an empirical example, data from the Utrecht Health Project were used, with age and body mass index as risk factors for elevated diastolic blood pressure. The methods and formulas presented in this article are intended to assist epidemiologists in calculating interaction on an additive scale between two variables on a certain outcome. The proposed methods are included in a spreadsheet which is freely available at: http://www.juliuscenter.nl/additive-interaction.xls.
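    A sketch of additive-scale interaction (RERI) between one continuous and one dichotomous determinant from a logistic model with a product term, with a bootstrap confidence interval. The simulated data, the reference value, and the increment are illustrative choices, not the Utrecht Health Project analysis or the paper's exact formulas.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({"bmi": rng.normal(26, 4, n), "smoker": rng.binomial(1, 0.3, n)})
logit_p = -6 + 0.15 * df.bmi + 0.4 * df.smoker + 0.05 * df.bmi * df.smoker
df["high_dbp"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

def reri(data, delta=5.0, ref_bmi=22.0):
    """RERI for the joint exposure (smoker, BMI=ref+delta) vs the reference (non-smoker, BMI=ref)."""
    d = data.assign(bmi_c=data.bmi - ref_bmi)                 # centre BMI at the chosen reference
    m = smf.logit("high_dbp ~ bmi_c * smoker", data=d).fit(disp=0)
    b = m.params
    or_both = np.exp(delta * b["bmi_c"] + b["smoker"] + delta * b["bmi_c:smoker"])
    or_bmi = np.exp(delta * b["bmi_c"])
    or_smk = np.exp(b["smoker"])
    return or_both - or_bmi - or_smk + 1                      # departure from additivity of ORs

est = reri(df)
boot = [reri(df.sample(frac=1, replace=True, random_state=i)) for i in range(200)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"RERI = {est:.2f} (95% bootstrap CI {lo:.2f} to {hi:.2f})")
```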

  10. Risk terrain modeling predicts child maltreatment.

    PubMed

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive to child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Improving Risk Adjustment for Mortality After Pediatric Cardiac Surgery: The UK PRAiS2 Model.

    PubMed

    Rogers, Libby; Brown, Katherine L; Franklin, Rodney C; Ambler, Gareth; Anderson, David; Barron, David J; Crowe, Sonya; English, Kate; Stickley, John; Tibby, Shane; Tsang, Victor; Utley, Martin; Witter, Thomas; Pagel, Christina

    2017-07-01

    Partial Risk Adjustment in Surgery (PRAiS), a risk model for 30-day mortality after children's heart surgery, has been used by the UK National Congenital Heart Disease Audit to report expected risk-adjusted survival since 2013. This study aimed to improve the model by incorporating additional comorbidity and diagnostic information. The model development dataset was all procedures performed between 2009 and 2014 in all UK and Ireland congenital cardiac centers. The outcome measure was death within each 30-day surgical episode. Model development followed an iterative process of clinical discussion and development and assessment of models using logistic regression under 25 × 5 cross-validation. Performance was measured using Akaike information criterion, the area under the receiver-operating characteristic curve (AUC), and calibration. The final model was assessed in an external 2014 to 2015 validation dataset. The development dataset comprised 21,838 30-day surgical episodes, with 539 deaths (mortality, 2.5%). The validation dataset comprised 4,207 episodes, with 97 deaths (mortality, 2.3%). The updated risk model included 15 procedural, 11 diagnostic, and 4 comorbidity groupings, and nonlinear functions of age and weight. Performance under cross-validation was: median AUC of 0.83 (range, 0.82 to 0.83), median calibration slope and intercept of 0.92 (range, 0.64 to 1.25) and -0.23 (range, -1.08 to 0.85) respectively. In the validation dataset, the AUC was 0.86 (95% confidence interval [CI], 0.82 to 0.89), and the calibration slope and intercept were 1.01 (95% CI, 0.83 to 1.18) and 0.11 (95% CI, -0.45 to 0.67), respectively, showing excellent performance. A more sophisticated PRAiS2 risk model for UK use was developed with additional comorbidity and diagnostic information, alongside age and weight as nonlinear variables. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
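    A sketch of the kind of evaluation loop described above: repeated stratified cross-validation of a logistic risk model with a rare outcome, reporting AUC and a calibration slope/intercept obtained by refitting the outcome on the out-of-sample linear predictor. Synthetic data stand in for the audit dataset, and the number of repeats, covariates, and recalibration approach are illustrative choices, not the PRAiS2 specification.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import RepeatedStratifiedKFold

# Synthetic data with ~2.5% event rate, standing in for the real covariates.
X, y = make_classification(n_samples=4000, n_features=10, weights=[0.975], random_state=0)

aucs, slopes, intercepts = [], [], []
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=5, random_state=0)
for train, test in cv.split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    p = np.clip(model.predict_proba(X[test])[:, 1], 1e-6, 1 - 1e-6)
    aucs.append(roc_auc_score(y[test], p))
    # Calibration slope/intercept: logistic recalibration on the out-of-sample logit.
    lp = np.log(p / (1 - p))
    recal = sm.Logit(y[test], sm.add_constant(lp)).fit(disp=0)
    intercepts.append(recal.params[0])
    slopes.append(recal.params[1])

print(f"median AUC {np.median(aucs):.2f}, "
      f"median calibration slope {np.median(slopes):.2f}, "
      f"median intercept {np.median(intercepts):.2f}")
```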

  12. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    PubMed

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, after the number of notified cases was the highest among developed countries in 2006. Thus, there is a need to develop a model or tool to accurately predict the number of campylobacteriosis cases, as the Microbial Risk Assessment Model previously used for this purpose failed to accurately predict the number of actual cases. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data has additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from the years 1998 to 2008 were used to model the time series with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict for each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. It was noticed that the prediction by an additive ARIMA with intervention was slightly better than the prediction by the Holt-Winters multiplicative method for the annual total in year 2010, the former predicting only 23 cases fewer than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the problem of the campylobacteriosis epidemic. © 2013 Blackwell Verlag GmbH.
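    A sketch of the two candidate forecasters named above, fitted to synthetic monthly case counts containing a step-change "intervention" and compared on a 12-month holdout. The series, the intervention date, and the ARIMA orders are illustrative only; this is not the New Zealand surveillance data or the study's exact specifications.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
idx = pd.date_range("1998-01-01", periods=144, freq="MS")
season = 1 + 0.3 * np.sin(2 * np.pi * idx.month / 12)
level = np.where(idx >= "2006-01-01", 250, 400)               # step-change after an intervention
y = pd.Series(rng.poisson(level * season), index=idx)
step = pd.Series((idx >= "2006-01-01").astype(float), index=idx)  # intervention regressor
train, test = y[:-12], y[-12:]

hw = ExponentialSmoothing(train, trend="add", seasonal="mul",
                          seasonal_periods=12).fit()
arima = ARIMA(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12),
              exog=step[:-12]).fit()

hw_fc = hw.forecast(12)
arima_fc = arima.forecast(12, exog=step[-12:])
print("annual total: actual", test.sum(),
      "| Holt-Winters", int(hw_fc.sum()),
      "| ARIMA+intervention", int(arima_fc.sum()))
```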

  13. An integrated model-based approach to the risk assessment of pesticide drift from vineyards

    NASA Astrophysics Data System (ADS)

    Pivato, Alberto; Barausse, Alberto; Zecchinato, Francesco; Palmeri, Luca; Raga, Roberto; Lavagnolo, Maria Cristina; Cossu, Raffaello

    2015-06-01

    The inhalation of pesticides in air is of particular concern for people living in close contact with intensive agricultural activities. This study aims to develop an integrated modelling methodology to assess whether pesticides pose a risk to the health of people living near vineyards, and apply this methodology in the world-renowned Prosecco DOCG (Italian label for protection of origin and geographical indication of wines) region. A sample field in Bigolino di Valdobbiadene (North-Eastern Italy) was selected to perform the pesticide fate modelling and the consequent inhalation risk assessment for people living in the area. The modelling accounts for the direct pesticide loss during the treatment of vineyards and for the volatilization from soil after the end of the treatment. A fugacity model was used to assess the volatilization flux from soil. The Gaussian puff air dispersion model CALPUFF was employed to assess the airborne concentration of the emitted pesticide over the simulation domain. The subsequent risk assessment integrates the HArmonised environmental Indicators for pesticide Risk (HAIR) and US-EPA guidelines. In this case study the modelled situation turned out to be safe from the point of view of human health in the case of non-carcinogenic compounds, and additional improvements were suggested to further mitigate the effect of the most critical compound.

  14. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skills. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and the lack of prediction capability. Therefore, the multiplicative error model is a better choice.
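    A sketch of the two error conventions contrasted above, applied to synthetic "truth"/"estimate" pairs: an additive model, estimate = truth + e, versus a multiplicative model, estimate = truth × exp(e) (additive in log space). The synthetic generator and thresholds are illustrative, not the satellite data of the letter.

```python
import numpy as np

rng = np.random.default_rng(5)
truth = rng.gamma(shape=0.5, scale=10.0, size=5000)               # skewed daily precipitation
estimate = truth * np.exp(rng.normal(-0.1, 0.5, truth.size))      # multiplicative-type error

additive_resid = estimate - truth
multiplicative_resid = np.log(estimate / truth)

# Under the additive convention the residual spread grows with rain rate;
# under the multiplicative convention it is roughly constant.
light, heavy = truth < 5, truth > 20
print("additive resid SD (light, heavy):",
      round(additive_resid[light].std(), 2), round(additive_resid[heavy].std(), 2))
print("multiplicative SD (light, heavy):",
      round(multiplicative_resid[light].std(), 2), round(multiplicative_resid[heavy].std(), 2))
```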

  15. Modelling recurrent events: comparison of statistical models with continuous and discontinuous risk intervals on recurrent malaria episodes data

    PubMed Central

    2014-01-01

    Background Recurrent events data analysis is common in biomedicine. A literature review indicates that most statistical models used for such data are often based on time to the first event or consider events within a subject as independent. Even when taking into account the non-independence of recurrent events within subjects, data analyses are mostly done with continuous risk interval models, which may not be appropriate for treatments with sustained effects (e.g., drug treatments of malaria patients). Furthermore, results can be biased in the presence of a confounding factor implying different risk exposures, e.g. in malaria transmission, if subjects are located in zones with different environmental factors and hence different risk exposures. Methods This work aimed to compare four different approaches by analysing recurrent malaria episodes from a clinical trial assessing the effectiveness of three malaria treatments [artesunate + amodiaquine (AS + AQ), artesunate + sulphadoxine-pyrimethamine (AS + SP) or artemether-lumefantrine (AL)], with continuous and discontinuous risk intervals: Andersen-Gill counting process (AG-CP), Prentice-Williams-Peterson counting process (PWP-CP), a shared gamma frailty model, and Generalized Estimating Equations model (GEE) using Poisson distribution. Simulations were also made to analyse the impact of the addition of a confounding factor on recurrent malaria episodes. Results Using the discontinuous interval analysis, AG-CP and Shared gamma frailty models provided similar estimates of treatment effect on recurrent malaria episodes when adjusted for age category. Patients had a significantly decreased risk of recurrent malaria episodes when treated in the AS + AQ or AS + SP arms compared to the AL arm; Relative Risks were: 0.75 (95% CI (Confidence Interval): 0.62-0.89), 0.74 (95% CI: 0.62-0.88) respectively for the AG-CP model and 0.76 (95% CI: 0.64-0.89), 0.74 (95% CI: 0.62-0.87) for the Shared gamma frailty model. With both

  16. A Corrosion Risk Assessment Model for Underground Piping

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project: a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.

  17. Conceptual models for cumulative risk assessment.

    PubMed

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  18. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies have shown that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  19. Innovative Models of Dental Care Delivery and Coverage: Patient-Centric Dental Benefits Based on Digital Oral Health Risk Assessment.

    PubMed

    Martin, John; Mills, Shannon; Foley, Mary E

    2018-04-01

    Innovative models of dental care delivery and coverage are emerging across oral health care systems causing changes to treatment and benefit plans. A novel addition to these models is digital risk assessment, which offers a promising new approach that incorporates the use of a cloud-based technology platform to assess an individual patient's risk for oral disease. Risk assessment changes treatment by including risk as a modifier of treatment and as a determinant of preventive services. Benefit plans are being developed to use risk assessment to predetermine preventive benefits for patients identified at elevated risk for oral disease. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Quantifying the predictive accuracy of time-to-event models in the presence of competing risks.

    PubMed

    Schoop, Rotraut; Beyersmann, Jan; Schumacher, Martin; Binder, Harald

    2011-02-01

    Prognostic models for time-to-event data play a prominent role in therapy assignment, risk stratification and inter-hospital quality assurance. The assessment of their prognostic value is vital not only for responsible resource allocation, but also for their widespread acceptance. The additional presence of competing risks to the event of interest requires proper handling not only on the model building side, but also during assessment. Research into methods for the evaluation of the prognostic potential of models accounting for competing risks is still needed, as most proposed methods measure either their discrimination or calibration, but do not examine both simultaneously. We adapt the prediction error proposal of Graf et al. (Statistics in Medicine 1999, 18, 2529–2545) and Gerds and Schumacher (Biometrical Journal 2006, 48, 1029–1040) to handle models with competing risks, i.e. more than one possible event type, and introduce a consistent estimator. A simulation study investigating the behaviour of the estimator in small sample size situations and for different levels of censoring together with a real data application follows.
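    A sketch of the Graf-type prediction error (Brier score) at a fixed horizon t*, using inverse-probability-of-censoring weights from a product-limit estimate of the censoring distribution. It covers a single event type only; the competing-risks adaptation developed in the paper above is not reproduced. The simulated data, the constant predicted survival, and the right-continuous (rather than left-limit) censoring estimate are simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1000
event_time = rng.exponential(5.0, n)
censor_time = rng.exponential(8.0, n)
time = np.minimum(event_time, censor_time)
event = (event_time <= censor_time).astype(int)
pred_surv = np.full(n, 0.55)                   # a constant predicted S(t*) purely for illustration

def km_survival(times, status, eval_times):
    """Product-limit (Kaplan-Meier) survival estimate evaluated at eval_times."""
    m = len(times)
    order = np.argsort(times)
    t_sorted, s_sorted = times[order], status[order]
    at_risk = m - np.arange(m)
    factors = np.where(s_sorted == 1, 1.0 - 1.0 / at_risk, 1.0)
    surv = np.cumprod(factors)
    eval_times = np.atleast_1d(eval_times)
    out = np.ones(len(eval_times))
    for i, t in enumerate(eval_times):
        k = np.sum(t_sorted <= t)
        if k > 0:
            out[i] = surv[k - 1]
    return out

t_star = 3.0
G_at_Ti = km_survival(time, 1 - event, time)           # censoring survival at each subject's time
G_at_tstar = km_survival(time, 1 - event, t_star)[0]   # censoring survival at the horizon

died_by_t = (time <= t_star) & (event == 1)
known_alive = time > t_star
brier = np.mean(
    died_by_t * (pred_surv - 0.0) ** 2 / np.maximum(G_at_Ti, 1e-8)
    + known_alive * (pred_surv - 1.0) ** 2 / np.maximum(G_at_tstar, 1e-8)
)
print(f"IPCW Brier score at t* = {t_star}: {brier:.3f}")
```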

  1. Structural equation modeling in environmental risk assessment.

    PubMed

    Buncher, C R; Succop, P A; Dietrich, K N

    1991-01-01

    Environmental epidemiology requires effective models that take individual observations of environmental factors and connect them into meaningful patterns. Single-factor relationships have given way to multivariable analyses; simple additive models have been augmented by multiplicative (logistic) models. Each of these steps has produced greater enlightenment and understanding. Models in which factors cause outputs that can in turn affect later outputs, with putative causation working at several different time points (e.g., linkage), are not commonly used in the environmental literature. Structural equation models are a class of covariance structure models that have been used extensively in economics/business and social science but are still little used in the realm of biostatistics. Path analysis in genetic studies is one simplified form of this class of models. We have been using these models in a study of the health and development of infants who have been exposed to lead in utero and in the postnatal home environment. These models require the directionality of each relationship as input; they then fit multiple inputs as causes of each factor and allow outputs to serve as input variables in the next phase of the simultaneously fitted model. Some examples of these models from our research are presented to increase familiarity with this class of models. Use of these models can provide insight into the effect of changing an environmental factor when assessing risk. The usual cautions concerning believing a model, believing causation has been proven, and the assumptions that are required for each model are operative.

  2. Characterizing uncertainty when evaluating risk management metrics: risk assessment modeling of Listeria monocytogenes contamination in ready-to-eat deli meats.

    PubMed

    Gallagher, Daniel; Ebel, Eric D; Gallagher, Owen; Labarre, David; Williams, Michael S; Golden, Neal J; Pouillot, Régis; Dearfield, Kerry L; Kause, Janell

    2013-04-01

    This report illustrates how the uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model concerning Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second-order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home and consumption. The model accounted for growth inhibitor use, retail cross contamination, and applied an FAO/WHO dose-response model for evaluating the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all consumed servings per annum, and the model was used to solve for the corresponding performance objective (PO) risk metric as the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that, if the industry complies with a particular PO, the resulting risk-per-serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are re-distributed throughout the remaining concentration distribution and (3) no dose-response uncertainty, establishment POs of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence that the target ALOP is met, respectively. The PO concentrations from this example scenario are more stringent than the current typical monitoring level of an absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criterion of absence

  3. Ejaculation Frequency and Risk of Prostate Cancer: Updated Results with an Additional Decade of Follow-up

    PubMed Central

    Rider, Jennifer R.; Wilson, Kathryn M.; Sinnott, Jennifer A.; Kelly, Rachel S.; Mucci, Lorelei A.; Giovannucci, Edward L.

    2016-01-01

    Background Evidence suggests that ejaculation frequency may be inversely related to the risk of prostate cancer (PCa), a disease for which few modifiable risk factors have been identified. Objective To incorporate an additional 10 yr of follow-up into an original analysis and to comprehensively evaluate the association between ejaculation frequency and PCa, accounting for screening, clinically relevant disease subgroups, and the impact of mortality from other causes. Design, setting, and participants A prospective cohort study of participants in the Health Professionals Follow-up Study utilizing self-reported data on average monthly ejaculation frequency. The study includes 31 925 men who answered questions on ejaculation frequency on a 1992 questionnaire and followed through to 2010. The average monthly ejaculation frequency was assessed at three time points: age 20–29 yr, age 40–49 yr, and the year before questionnaire distribution. Outcome measurements and statistical analysis Incidence of total PCa and clinically relevant disease subgroups. Cox models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs). Results and limitations During 480 831 person-years, 3839 men were diagnosed with PCa. Ejaculation frequency at age 40–49 yr was positively associated with age-standardized body mass index, physical activity, divorce, history of sexually transmitted infections, and consumption of total calories and alcohol. Prostate-specific antigen (PSA) test utilization by 2008, number of PSA tests, and frequency of prostate biopsy were similar across frequency categories. In multivariable analyses, the hazard ratio for PCa incidence for ≥21 compared to 4–7 ejaculations per month was 0.81 (95% confidence interval [CI] 0.72–0.92; p < 0.0001 for trend) for frequency at age 20–29 yr and 0.78 (95% CI 0.69–0.89; p < 0.0001 for trend) for frequency at age 40–49 yr. Associations were driven by low-risk disease, were similar when restricted

  4. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  5. Modeling Research Project Risks with Fuzzy Maps

    ERIC Educational Resources Information Center

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  6. Additional risk factors for lethal hypothermia.

    PubMed

    Bright, Fiona; Gilbert, John D; Winskog, Calle; Byard, Roger W

    2013-08-01

    An 86-year-old woman was found dead lying on her back on the floor of an unkempt kitchen. She had last been seen four days before. Her dress was pulled up and she was not wearing underpants. The house was noted to be in "disarray" with papers covering most surfaces and the floor. Rubbish was piled up against one of the doors. At autopsy the major findings were of a fractured left neck of femur, fresh pressure areas over her right buttock, Wischnewski spots of the stomach and foci of pancreatic necrosis, in keeping with hypothermia. No significant underlying organic diseases were identified and there was no other evidence of trauma. Death was due to hypothermia complicating immobility from a fractured neck of femur. This case confirms the vulnerability of frail, elderly and socially-isolated individuals to death from hypothermia if a significant illness or injury occurs. Additional risk factors for hypothermia illustrated in this case involve inadequate housing construction, with absent insulation and absent window double glazing. The approach to hypothermic deaths should, therefore, include checking for these features as well as measuring room and environmental temperatures, evaluating the type and quality of heating and the nature of the floor and its coverings. Given the ageing population in many Western countries, increasing social isolation of the elderly, cost of fuel and electricity, and lack of energy-efficient housing, this type of death may become an increasingly witnessed occurrence during the colder months of the year. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  7. Statistical inference for the additive hazards model under outcome-dependent sampling.

    PubMed

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo

    2015-09-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk associated with radon exposure.

  8. Statistical inference for the additive hazards model under outcome-dependent sampling

    PubMed Central

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P.; Zhou, Haibo

    2015-01-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk associated with radon exposure. PMID:26379363

  9. Use of generalised additive models to categorise continuous variables in clinical prediction

    PubMed Central

    2013-01-01

    Background In medical practice many, essentially continuous, clinical parameters tend to be categorised by physicians for ease of decision-making. Indeed, categorisation is a common practice both in medical research and in the development of clinical prediction rules, particularly where the ensuing models are to be applied in daily clinical practice to support clinicians in the decision-making process. Since the number of categories into which a continuous predictor must be categorised depends partly on the relationship between the predictor and the outcome, the need for more than two categories must be borne in mind. Methods We propose a categorisation methodology for clinical-prediction models, using Generalised Additive Models (GAMs) with P-spline smoothers to determine the relationship between the continuous predictor and the outcome. The proposed method consists of creating at least one average-risk category along with high- and low-risk categories based on the GAM smooth function. We applied this methodology to a prospective cohort of patients with exacerbated chronic obstructive pulmonary disease. The predictors selected were respiratory rate and partial pressure of carbon dioxide in the blood (PCO2), and the response variable was poor evolution. An additive logistic regression model was used to show the relationship between the covariates and the dichotomous response variable. The proposed categorisation was compared to the continuous predictor as the best option, using the AIC and AUC evaluation parameters. The sample was divided into derivation (60%) and validation (40%) samples. The first was used to obtain the cut points while the second was used to validate the proposed methodology. Results The three-category proposal for the respiratory rate was ≤20, (20, 24], and >24, for which the following values were obtained: AIC=314.5 and AUC=0.638. The respective values for the continuous predictor were AIC=317.1 and AUC=0.634, with no statistically
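
    As a rough illustration of deriving risk categories from a smooth fit (here with a B-spline logistic regression in statsmodels rather than the authors' P-spline GAM), the following sketch uses simulated data; the variable names, thresholds and cut-point rule are assumptions for the example only.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Simulated stand-in for the COPD cohort: respiratory rate vs. a binary
    # poor-evolution outcome (all numbers are illustrative).
    rng = np.random.default_rng(1)
    rr = rng.normal(22, 5, 1000).clip(10, 45)
    logit = -2.5 + 0.004 * np.sign(rr - 22) * (rr - 22) ** 2   # nonlinear effect
    df = pd.DataFrame({"poor_evolution": rng.binomial(1, 1 / (1 + np.exp(-logit))),
                       "rr": rr})

    # Smooth logistic fit via a B-spline basis (a stand-in for a P-spline smoother).
    fit = smf.glm("poor_evolution ~ bs(rr, df=5)", data=df,
                  family=sm.families.Binomial()).fit()

    # Read candidate cut points off the fitted curve: here, where the predicted
    # risk crosses the lower and upper quartiles of the fitted risk distribution.
    grid = pd.DataFrame({"rr": np.linspace(10, 45, 200)})
    grid["risk"] = fit.predict(grid)
    low, high = grid["risk"].quantile([0.25, 0.75])
    cut_low = grid.loc[(grid["risk"] - low).abs().idxmin(), "rr"]
    cut_high = grid.loc[(grid["risk"] - high).abs().idxmin(), "rr"]
    print(round(cut_low, 1), round(cut_high, 1))   # low/average/high boundaries
    ```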

  10. Two criteria for evaluating risk prediction models

    PubMed Central

    Pfeiffer, R.M.; Gail, M.H.

    2010-01-01

    SUMMARY We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed PCF(q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow-up, PNF(p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF(q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF(p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of those two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data. PMID:21155746
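
    Both criteria lend themselves to a direct empirical calculation from a vector of predicted risks and observed case status. The sketch below is a minimal illustration with simulated data (the risk score and case-generating mechanism are invented for the example).

    ```python
    import numpy as np

    def pcf(risk, case, q):
        """Proportion of cases captured when following the fraction q of the
        population at highest predicted risk."""
        order = np.argsort(-risk)
        n_follow = int(np.ceil(q * len(risk)))
        return case[order][:n_follow].sum() / case.sum()

    def pnf(risk, case, p):
        """Smallest fraction of the population (taken from the top of the risk
        distribution) needed so that a proportion p of cases is followed."""
        order = np.argsort(-risk)
        cum_cases = np.cumsum(case[order]) / case.sum()
        return (np.searchsorted(cum_cases, p) + 1) / len(risk)

    rng = np.random.default_rng(2)
    risk = rng.uniform(0, 1, 10_000)                 # moderately informative score
    case = rng.binomial(1, 0.05 + 0.15 * risk)
    print(pcf(risk, case, q=0.2), pnf(risk, case, p=0.8))
    ```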

  11. Conceptual Models for Cumulative Risk Assessment

    PubMed Central

    Sexton, Ken

    2011-01-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive “family” of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects. PMID:22021317

  12. Modeling additive and non-additive effects in a hybrid population using genome-wide genotyping: prediction accuracy implications

    PubMed Central

    Bouvet, J-M; Makouanzi, G; Cros, D; Vigneron, Ph

    2016-01-01

    Hybrids are broadly used in plant breeding and accurate estimation of variance components is crucial for optimizing genetic gain. Genome-wide information may be used to explore models designed to assess the extent of additive and non-additive variance and test their prediction accuracy for the genomic selection. Ten linear mixed models, involving pedigree- and marker-based relationship matrices among parents, were developed to estimate additive (A), dominance (D) and epistatic (AA, AD and DD) effects. Five complementary models, involving the gametic phase to estimate marker-based relationships among hybrid progenies, were developed to assess the same effects. The models were compared using tree height and 3303 single-nucleotide polymorphism markers from 1130 cloned individuals obtained via controlled crosses of 13 Eucalyptus urophylla females with 9 Eucalyptus grandis males. Akaike information criterion (AIC), variance ratios, asymptotic correlation matrices of estimates, goodness-of-fit, prediction accuracy and mean square error (MSE) were used for the comparisons. The variance components and variance ratios differed according to the model. Models with a parent marker-based relationship matrix performed better than those that were pedigree-based, that is, an absence of singularities, lower AIC, higher goodness-of-fit and accuracy and smaller MSE. However, AD and DD variances were estimated with high s.es. Using the same criteria, progeny gametic phase-based models performed better in fitting the observations and predicting genetic values. However, DD variance could not be separated from the dominance variance and null estimates were obtained for AA and AD effects. This study highlighted the advantages of progeny models using genome-wide information. PMID:26328760
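
    For readers unfamiliar with marker-based relationship matrices, the sketch below builds a VanRaden-type additive genomic relationship matrix from 0/1/2 genotype codes and approximates an additive-by-additive epistatic matrix with a Hadamard product; this is a generic illustration on simulated genotypes, not the authors' exact model specification.

    ```python
    import numpy as np

    def additive_grm(M):
        """VanRaden-type genomic relationship matrix from an (n x m) matrix of
        genotypes coded 0/1/2 (counts of the alternative allele)."""
        p = M.mean(axis=0) / 2.0                 # allele frequencies
        W = M - 2.0 * p                          # centred genotypes
        return W @ W.T / (2.0 * np.sum(p * (1.0 - p)))

    rng = np.random.default_rng(3)
    M = rng.binomial(2, 0.3, size=(100, 3000)).astype(float)   # 100 individuals
    G = additive_grm(M)

    # Additive-by-additive epistatic relationships are often approximated by the
    # Hadamard (element-wise) product of the additive matrix with itself.
    G_AA = G * G
    print(G.shape, round(float(np.diag(G).mean()), 2))
    ```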

  13. Animal models of polycystic ovary syndrome: a focused review of rodent models in relationship to clinical phenotypes and cardiometabolic risk.

    PubMed

    Shi, Danni; Vine, Donna F

    2012-07-01

    To review rodent animal models of polycystic ovary syndrome (PCOS), with a focus on those associated with the metabolic syndrome and cardiovascular disease risk factors. Review. Rodent models of PCOS. Description and comparison of animal models. Comparison of animal models to clinical phenotypes of PCOS. Animals used to study PCOS include rodents, mice, rhesus monkeys, and ewes. Major methods to induce PCOS in these models include subcutaneous injection or implantation of androgens, estrogens, antiprogesterone, letrozole, prenatal exposure to excess androgens, and exposure to constant light. In addition, transgenic mice models and spontaneous PCOS-like rodent models have also been developed. Rodents are the most economical and widely used animals to study PCOS and ovarian dysfunction. The model chosen to study the development of PCOS and other metabolic parameters remains dependent on the specific etiologic hypotheses being investigated. Rodent models have been shown to demonstrate changes in insulin metabolism, with or without induction of hyperandrogenemia, and limited studies have investigated cardiometabolic risk factors for type 2 diabetes and cardiovascular disease. Given the clinical heterogeneity of PCOS, the utilization of different animal models may be the best approach to further our understanding of the pathophysiologic mechanisms associated with the early etiology of PCOS and cardiometabolic risk. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  14. Risk transfer modeling among hierarchically associated stakeholders in development of space systems

    NASA Astrophysics Data System (ADS)

    Henkle, Thomas Grove, III

    Research develops an empirically derived cardinal model that prescribes handling and transfer of risks between organizations with hierarchical relationships. Descriptions of mission risk events, risk attitudes, and conditions for risk transfer are determined for client and underwriting entities associated with acquisition, production, and deployment of space systems. The hypothesis anticipates that large client organizations should be able to assume larger dollar-value risks of a program in comparison to smaller organizations even though many current risk transfer arrangements via space insurance violate this hypothesis. A literature survey covers conventional and current risk assessment methods, current techniques used in the satellite industry for complex system development, cardinal risk modeling, and relevant aspects of utility theory. Data gathered from open literature on demonstrated launch vehicle and satellite in-orbit reliability, annual space insurance premiums and losses, and ground fatalities and range damage associated with satellite launch activities are presented. Empirically derived models are developed for risk attitudes of space system clients and third-party underwriters associated with satellite system development and deployment. Two application topics for risk transfer are examined: the client-underwriter relationship on assumption or transfer of risks associated with first-year mission success, and statutory risk transfer agreements between space insurance underwriters and the US government to promote growth in both commercial client and underwriting industries. Results indicate that client entities with wealth of at least an order of magnitude above satellite project costs should retain risks to first-year mission success despite present trends. Furthermore, large client entities such as the US government should never pursue risk transfer via insurance under previously demonstrated probabilities of mission success; potential savings may

  15. Uncertainty in surface water flood risk modelling

    NASA Astrophysics Data System (ADS)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    Two thirds of the flooding that occurred in the UK during summer 2007 was as a result of surface water (otherwise known as ‘pluvial') rather than river or coastal flooding. In response, the Environment Agency and Interim Pitt Reviews have highlighted the need for surface water risk mapping and warning tools to identify, and prepare for, flooding induced by heavy rainfall events. This need is compounded by the likely increase in rainfall intensities due to climate change. The Association of British Insurers has called for the Environment Agency to commission nationwide flood risk maps showing the relative risk of flooding from all sources. At the wider European scale, the recently-published EC Directive on the assessment and management of flood risks will require Member States to evaluate, map and model flood risk from a variety of sources. As such, there is now a clear and immediate requirement for the development of techniques for assessing and managing surface water flood risk across large areas. This paper describes an approach for integrating rainfall, drainage network and high-resolution topographic data using Flowroute™, a high-resolution flood mapping and modelling platform, to produce deterministic surface water flood risk maps. Information is provided from UK case studies to enable assessment and validation of modelled results using historical flood information and insurance claims data. Flowroute was co-developed with flood scientists at Cambridge University specifically to simulate river dynamics and floodplain inundation in complex, congested urban areas in a highly computationally efficient manner. It utilises high-resolution topographic information to route flows around individual buildings so as to enable the prediction of flood depths, extents, durations and velocities. As such, the model forms an ideal platform for the development of surface water flood risk modelling and mapping capabilities. The 2-dimensional component of Flowroute employs

  16. Assessment of yellow fever epidemic risk: an original multi-criteria modeling approach.

    PubMed

    Briand, Sylvie; Beresniak, Ariel; Nguyen, Tim; Yonli, Tajoua; Duru, Gerard; Kambire, Chantal; Perea, William

    2009-07-14

    Yellow fever (YF) virtually disappeared in francophone West African countries as a result of YF mass vaccination campaigns carried out between 1940 and 1953. However, because of the failure to continue mass vaccination campaigns, a resurgence of the deadly disease in many African countries began in the early 1980s. We developed an original modeling approach to assess YF epidemic risk (vulnerability) and to prioritize the populations to be vaccinated. We chose a two-step assessment of vulnerability at district level consisting of a quantitative and qualitative assessment per country. Quantitative assessment starts with data collection on six risk factors: five risk factors associated with "exposure" to virus/vector and one with "susceptibility" of a district to YF epidemics. The multiple correspondence analysis (MCA) modeling method was specifically adapted to reduce the five exposure variables to one aggregated exposure indicator. Health districts were then projected onto a two-dimensional graph to define different levels of vulnerability. Districts are presented on risk maps for qualitative analysis in consensus groups, allowing the addition of factors, such as population migrations or vector density, that could not be included in MCA. The example of rural districts in Burkina Faso show five distinct clusters of risk profiles. Based on this assessment, 32 of 55 districts comprising over 7 million people were prioritized for preventive vaccination campaigns. This assessment of yellow fever epidemic risk at the district level includes MCA modeling and consensus group modification. MCA provides a standardized way to reduce complexity. It supports an informed public health decision-making process that empowers local stakeholders through the consensus group. This original approach can be applied to any disease with documented risk factors.

  17. Metal-Polycyclic Aromatic Hydrocarbon Mixture Toxicity in Hyalella azteca. 1. Response Surfaces and Isoboles To Measure Non-additive Mixture Toxicity and Ecological Risk.

    PubMed

    Gauthier, Patrick T; Norwood, Warren P; Prepas, Ellie E; Pyle, Greg G

    2015-10-06

    Mixtures of metals and polycyclic aromatic hydrocarbons (PAHs) occur ubiquitously in aquatic environments, yet relatively little is known regarding their potential to produce non-additive toxicity (i.e., antagonism or potentiation). A review of the lethality of metal-PAH mixtures in aquatic biota revealed that more-than-additive lethality is as common as strictly additive effects. Approaches to ecological risk assessment do not consider non-additive toxicity of metal-PAH mixtures. Forty-eight-hour water-only binary mixture toxicity experiments were conducted to determine the additive toxic nature of mixtures of Cu, Cd, V, or Ni with phenanthrene (PHE) or phenanthrenequinone (PHQ) using the aquatic amphipod Hyalella azteca. In cases where more-than-additive toxicity was observed, we calculated the possible mortality rates at Canada's environmental water quality guideline concentrations. We used a three-dimensional response surface isobole model-based approach to compare the observed co-toxicity in juvenile amphipods to predicted outcomes based on concentration addition or effects addition mixtures models. More-than-additive lethality was observed for all Cu-PHE, Cu-PHQ, and several Cd-PHE, Cd-PHQ, and Ni-PHE mixtures. Our analysis predicts Cu-PHE, Cu-PHQ, Cd-PHE, and Cd-PHQ mixtures at the Canadian Water Quality Guideline concentrations would produce 7.5%, 3.7%, 4.4% and 1.4% mortality, respectively.
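
    The two reference models used to judge whether a mixture is "additive" can be written down compactly. The sketch below predicts a binary-mixture effect under concentration addition (Loewe additivity, summed toxic units equal to one) and under effects addition (independent action); the concentration-response curves, LC50s and concentrations are hypothetical, not data from the study.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    SLOPE = 2.0

    def mortality(c, lc50):
        """Illustrative Hill-type concentration-response curve (fraction dying)."""
        return c**SLOPE / (c**SLOPE + lc50**SLOPE)

    def independent_action(conc, lc50s):
        """Effects addition: combined effect of independently acting toxicants."""
        return 1.0 - np.prod([1.0 - mortality(c, l) for c, l in zip(conc, lc50s)])

    def concentration_addition(conc, lc50s):
        """Loewe additivity: effect level x at which sum_i c_i / EC_x,i = 1."""
        def toxic_units_minus_one(x):
            ecx = [l * (x / (1.0 - x)) ** (1.0 / SLOPE) for l in lc50s]
            return sum(c / e for c, e in zip(conc, ecx)) - 1.0
        return brentq(toxic_units_minus_one, 1e-9, 1.0 - 1e-9)

    conc = [5.0, 20.0]       # hypothetical metal and PAH concentrations
    lc50s = [30.0, 80.0]     # hypothetical single-compound LC50s
    print(concentration_addition(conc, lc50s), independent_action(conc, lc50s))
    ```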

  18. Assessing Academic Risk of Student-Athletes: Applicability of the NCAA Graduation Risk Overview Model to GPA

    ERIC Educational Resources Information Center

    Johnson, James

    2013-01-01

    In an effort to standardize academic risk assessment, the NCAA developed the graduation risk overview (GRO) model. Although this model was designed to assess graduation risk, its ability to predict grade-point average (GPA) remained unknown. Therefore, 134 individual risk assessments were made to determine GRO model effectiveness in the…

  19. THE COMBINED CARCINOGENIC RISK FOR EXPOSURE TO MIXTURES OF DRINKING WATER DISINFECTION BY-PRODUCTS MAY BE LESS THAN ADDITIVE

    EPA Science Inventory

    The Combined Carcinogenic Risk for Exposure to Mixtures of Drinking Water Disinfection By-Products May be Less Than Additive

    Risk assessment methods for chemical mixtures in drinking water are not well defined. Current default risk assessments for chemical mixtures assume...

  20. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  1. LIFETIME LUNG CANCER RISKS ASSOCIATED WITH INDOOR RADON EXPOSURE BASED ON VARIOUS RADON RISK MODELS FOR CANADIAN POPULATION.

    PubMed

    Chen, Jing

    2017-04-01

    This study calculates and compares the lifetime lung cancer risks associated with indoor radon exposure based on well-known risk models in the literature; two risk models are from joint studies among miners and the other three models were developed from pooling studies on residential radon exposure from China, Europe and North America respectively. The aim of this article is to make clear that the various models are mathematical descriptions of epidemiologically observed real risks in different environmental settings. The risk from exposure to indoor radon is real and it is normal that variations could exist among different risk models even when they were applied to the same dataset. The results show that lifetime risk estimates vary significantly between the various risk models considered here: the model based on the European residential data provides the lowest risk estimates, while models based on the European miners and Chinese residential pooling with complete dosimetry give the highest values. The lifetime risk estimates based on the EPA/BEIR-VI model lie within this range and agree reasonably well with the averages of risk estimates from the five risk models considered in this study. © Crown copyright 2016.
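
    The arithmetic behind such comparisons is essentially a life-table calculation: an excess relative risk (ERR) per unit exposure is applied to age-specific baseline lung cancer rates and accumulated over life. The sketch below uses made-up baseline rates and an assumed ERR coefficient purely to show the mechanics; it does not reproduce any of the models compared in the record and it ignores competing causes of death.

    ```python
    import numpy as np

    # Made-up 5-year age-band baseline lung cancer rates (per person-year) and an
    # assumed excess relative risk per 100 Bq/m^3 of long-term radon exposure.
    baseline_rate = np.array([20, 35, 60, 95, 140, 190, 240, 270, 260, 230]) / 1e5
    err_per_100 = 0.16
    radon = 200.0                                    # Bq/m^3

    survival, lifetime_excess = 1.0, 0.0
    for rate in baseline_rate:
        total_rate = rate * (1.0 + err_per_100 * radon / 100.0)
        lifetime_excess += survival * (total_rate - rate) * 5.0   # 5-year band
        survival *= np.exp(-5.0 * total_rate)        # competing mortality ignored
    print(f"approximate lifetime excess risk: {lifetime_excess:.3%}")
    ```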

  2. Predicting the cumulative risk of death during hospitalization by modeling weekend, weekday and diurnal mortality risks.

    PubMed

    Coiera, Enrico; Wang, Ying; Magrabi, Farah; Concha, Oscar Perez; Gallego, Blanca; Runciman, William

    2014-05-21

    Current prognostic models factor in patient and disease specific variables but do not consider cumulative risks of hospitalization over time. We developed risk models of the likelihood of death associated with cumulative exposure to hospitalization, based on time-varying risks of hospitalization over any given day, as well as day of the week. Model performance was evaluated alone, and in combination with simple disease-specific models. Patients admitted between 2000 and 2006 from 501 public and private hospitals in NSW, Australia were used for training and 2007 data for evaluation. The impact of hospital care delivered over different days of the week and/or times of the day was modeled by separating hospitalization risk into 21 separate time periods (morning, day, night across the days of the week). Three models were developed to predict death up to 7-days post-discharge: (1) a simple background risk model using age and gender; (2) a time-varying risk model for exposure to hospitalization (admission time, days in hospital); and (3) disease-specific models (Charlson co-morbidity index, DRG). Combining these three generated a full model. Models were evaluated by accuracy, AUC, Akaike and Bayesian information criteria. There was a clear diurnal rhythm to hospital mortality in the data set, peaking in the evening, as well as the well-known 'weekend-effect' where mortality peaks with weekend admissions. Individual models had modest performance on the test data set (AUC 0.71, 0.79 and 0.79 respectively). The combined model which included time-varying risk however yielded an average AUC of 0.92. This model performed best for stays up to 7-days (93% of admissions), peaking at days 3 to 5 (AUC 0.94). Risks of hospitalization vary not just with the day of the week but also time of the day, and can be used to make predictions about the cumulative risk of death associated with an individual's hospitalization. Combining disease-specific models with such time-varying estimates appears to

  3. The Common Risk Model for Dams: A Portfolio Approach to Security Risk Assessments

    DTIC Science & Technology

    2013-06-01

    …consequence, vulnerability, and threat estimates in a way that properly accounts for the relationships among these variables. The CRM-D can effectively quantify the benefits of… the Common Risk Model (CRM) for evaluating and comparing risks associated with the nation's critical infrastructure. This model incorporates commonly used risk…

  4. A Patient Risk Model of Chemotherapy-Induced Febrile Neutropenia: Lessons Learned From the ANC Study Group.

    PubMed

    Lyman, Gary H; Poniewierski, Marek S

    2017-12-01

    Neutropenia and its complications, including febrile neutropenia (FN), represent major toxicities associated with cancer chemotherapy, resulting in considerable morbidity, mortality, and costs. The myeloid growth factors such as granulocyte colony-stimulating factor (G-CSF) have been shown to reduce the risk of neutropenia complications while enabling safe and effective chemotherapy dose intensity. Concerns about the high costs of these agents, along with limited physician adherence to clinical practice guidelines resulting in both overuse and underuse, have stimulated interest in models for individual patient risk assessment to guide appropriate use of G-CSF. In a model developed and validated by the ANC Study Group, half of patients were classified as high risk and half as low risk based on patient-, disease-, and treatment-related factors. This model has been further validated in an independent patient population. Physician-assessed risk of FN, as well as the decision to use prophylactic CSF, has been shown to correlate poorly with the FN risk estimated by the model. Additional modeling efforts in both adults and children receiving cancer treatment have been reported. Identification of patients at a high individual risk for FN and its consequences may offer the potential for optimal chemotherapy delivery and patient outcomes. Likewise, identification of patients at low risk for neutropenic events may reduce costs when such supportive care is not warranted. This article reviews and summarizes FN modeling studies and the opportunities for personalizing supportive care in patients receiving chemotherapy. Copyright © 2017 by the National Comprehensive Cancer Network.

  5. Does additional cone beam computed tomography decrease the risk of inferior alveolar nerve injury in high-risk cases undergoing third molar surgery? Does CBCT decrease the risk of IAN injury?

    PubMed

    Korkmaz, Y T; Kayıpmaz, S; Senel, F C; Atasoy, K T; Gumrukcu, Z

    2017-05-01

    The objectives of this study were to evaluate the efficacy of additional cone beam computed tomography (CBCT) imaging on decreasing the risk of inferior alveolar nerve (IAN) injury during third molar removal in patients at high risk and to assess the surgical outcomes. The study sample included patients considered at high risk for IAN injury based on panoramic radiography (PAN) evaluation. The primary predictor was the type of imaging method (PAN only or with additional CBCT). The other variables were demographic and anatomical/radiographic factors. The primary outcome variable was IAN injury. The secondary outcome variables were the preoperative surgical plan and surgical results including IAN exposure and duration of surgery. The sample comprised 122 patients (139 teeth) aged 18-48 years. Postoperative temporary IAN injury was present in three (4.2%) cases in the CBCT group and 11 (16.4%) in the PAN group at 7 days after surgery. However, none of the patients had a permanent IAN injury at the 6-month follow-up. Additional CBCT imaging was not superior to PAN in reducing IAN injury after third molar surgery during long-term follow-up. Nonetheless, CBCT may decrease the prevalence of temporary IAN injury and improve the surgical outcomes in high-risk patients. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. Do Health Professionals Need Additional Competencies for Stratified Cancer Prevention Based on Genetic Risk Profiling?

    PubMed Central

    Chowdhury, Susmita; Henneman, Lidewij; Dent, Tom; Hall, Alison; Burton, Alice; Pharoah, Paul; Pashayan, Nora; Burton, Hilary

    2015-01-01

    There is growing evidence that inclusion of genetic information about known common susceptibility variants may enable population risk-stratification and personalized prevention for common diseases including cancer. This would require the inclusion of genetic testing as an integral part of individual risk assessment of an asymptomatic individual. Front line health professionals would be expected to interact with and assist asymptomatic individuals through the risk stratification process. In that case, additional knowledge and skills may be needed. Current guidelines and frameworks for genetic competencies of non-specialist health professionals place an emphasis on rare inherited genetic diseases. For common diseases, health professionals do use risk assessment tools but such tools currently do not assess genetic susceptibility of individuals. In this article, we compare the skills and knowledge needed by non-genetic health professionals, if risk-stratified prevention is implemented, with existing competence recommendations from the UK, USA and Europe, in order to assess the gaps in current competences. We found that health professionals would benefit from understanding the contribution of common genetic variations in disease risk, the rationale for a risk-stratified prevention pathway, and the implications of using genomic information in risk-assessment and risk management of asymptomatic individuals for common disease prevention. PMID:26068647

  7. Risk Decision Making Model for Reservoir Floodwater resources Utilization

    NASA Astrophysics Data System (ADS)

    Huang, X.

    2017-12-01

    Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but there are risks. In order to safely and efficiently utilize floodwater resources, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safety discharge are estimated. Based on the principle of minimum risk and maximum benefit of FRU, a multi-objective risk decision making model for FRU is constructed. Probability theory and mathematical statistics methods are used to calculate the risk rate; the C-D production function method and emergy analysis are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution of FRU for the Shilianghe reservoir is found using the risk decision making model, and the validity and applicability of the model are verified.
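
    A minimal way to picture the "risk rate of exceeding the design flood water level" is a Monte Carlo exceedance probability; the toy reservoir routing, distributions and numbers below are invented for illustration and are not the paper's model of the Shilianghe reservoir.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_sim = 100_000

    design_level = 32.0     # design flood water level (m), illustrative
    start_level = 28.5      # operating level raised for floodwater utilization
    level_per_vol = 0.002   # m of level rise per 10^4 m^3 of stored inflow (toy routing)

    inflow = rng.lognormal(mean=7.0, sigma=0.6, size=n_sim)   # flood volume, 10^4 m^3
    release = 800.0                                           # releasable volume per flood
    peak_level = start_level + level_per_vol * np.clip(inflow - release, 0, None)

    risk_rate = (peak_level > design_level).mean()
    print(f"estimated risk rate of exceeding the design flood level: {risk_rate:.4f}")
    ```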

  8. Risk perception in epidemic modeling

    NASA Astrophysics Data System (ADS)

    Bagnoli, Franco; Liò, Pietro; Sguanci, Luca

    2007-12-01

    We investigate the effects of risk perception in a simple model of epidemic spreading. We assume that the perception of the risk of being infected depends on the fraction of neighbors that are ill. The effect of this factor is to decrease the infectivity, that therefore becomes a dynamical component of the model. We study the problem in the mean-field approximation and by numerical simulations for regular, random, and scale-free networks. We show that for homogeneous and random networks, there is always a value of perception that stops the epidemics. In the “worst-case” scenario of a scale-free network with diverging input connectivity, a linear perception cannot stop the epidemics; however, we show that a nonlinear increase of the perception risk may lead to the extinction of the disease. This transition is discontinuous, and is not predicted by the mean-field analysis.
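
    In the spirit of the model (infectivity reduced as the fraction of ill neighbours grows), the sketch below runs an SIS-type simulation on a random contact network; the exponential damping form, the parameter values and the network are assumptions chosen for illustration rather than the paper's exact formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, k_mean = 2000, 8
    A = (rng.random((n, n)) < k_mean / n).astype(int)   # Erdos-Renyi contacts
    A = np.triu(A, 1)
    A = A + A.T

    tau, recovery, J = 0.12, 0.25, 4.0   # bare infectivity, recovery prob., perception strength
    infected = rng.random(n) < 0.05

    for _ in range(200):
        k = A.sum(1)
        n_inf = A @ infected.astype(int)                 # infected neighbours
        frac_inf = np.divide(n_inf, k, out=np.zeros(n), where=k > 0)
        # perception-reduced infectivity: tau * exp(-J * fraction of ill neighbours)
        p_inf = 1.0 - (1.0 - tau * np.exp(-J * frac_inf)) ** n_inf
        new_inf = (~infected) & (rng.random(n) < p_inf)
        recovered = infected & (rng.random(n) < recovery)
        infected = (infected | new_inf) & ~recovered
    print("endemic prevalence:", infected.mean())
    ```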

  9. Multifractal Value at Risk model

    NASA Astrophysics Data System (ADS)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper, a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR). The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.

  10. Malignancy Risk Models for Oral Lesions

    PubMed Central

    Zarate, Ana M.; Brezzo, María M.; Secchi, Dante G.; Barra, José L.

    2013-01-01

    Objectives: The aim of this work was to assess risk habits, clinical and cellular phenotypes and TP53 DNA changes in oral mucosa samples from patients with Oral Potentially Malignant Disorders (OPMD), in order to create models that enable genotypic and phenotypic patterns to be obtained that determine the risk of lesions becoming malignant. Study Design: Clinical phenotypes, family history of cancer and risk habits were collected in clinical histories. TP53 gene mutation and morphometric-morphological features were studied, and multivariate models were applied. Three groups were established: a) oral cancer (OC) group (n=10), b) OPMD group (n=10), and c) control group (n=8). Results: An average of 50% of patients with malignancy were found to have smoking and drinking habits. A high percentage of TP53 mutations were observed in OC (30%) and OPMD (average 20%) lesions (p=0.000). The majority of these mutations were GC → TA transversion mutations (60%). However, patients with OC presented mutations in all the exons and introns studied. Highest diagnostic accuracy (p=0.0001) was observed when incorporating alcohol and tobacco habits variables with TP53 mutations. Conclusions: Our results prove to be statistically reliable, with parameter estimates that are nearly unbiased even for small sample sizes. Models 2 and 3 were the most accurate for assessing the risk of an OPMD becoming cancerous. However, in a public health context, model 3 is the most recommended because the characteristics considered are easier and less costly to evaluate. Key words: TP53, oral potentially malignant disorders, risk factors, genotype, phenotype. PMID:23722122

  11. A predictive model of hospitalization risk among disabled medicaid enrollees.

    PubMed

    McAna, John F; Crawford, Albert G; Novinger, Benjamin W; Sidorov, Jaan; Din, Franklin M; Maio, Vittorio; Louis, Daniel Z; Goldfarb, Neil I

    2013-05-01

    To identify Medicaid patients, based on 1 year of administrative data, who were at high risk of admission to a hospital in the next year, and who were most likely to benefit from outreach and targeted interventions. Observational cohort study for predictive modeling. Claims, enrollment, and eligibility data for 2007 from a state Medicaid program were used to provide the independent variables for a logistic regression model to predict inpatient stays in 2008 for fully covered, continuously enrolled, disabled members. The model was developed using a 50% random sample from the state and was validated against the other 50%. Further validation was carried out by applying the parameters from the model to data from a second state's disabled Medicaid population. The strongest predictors in the model developed from the first 50% sample were over age 65 years, inpatient stay(s) in 2007, and higher Charlson Comorbidity Index scores. The areas under the receiver operating characteristic curve for the model based on the 50% state sample and its application to the 2 other samples ranged from 0.79 to 0.81. Models developed independently for all 3 samples were as high as 0.86. The results show a consistent trend of more accurate prediction of hospitalization with increasing risk score. This is a fairly robust method for targeting Medicaid members with a high probability of future avoidable hospitalizations for possible case management or other interventions. Comparison with a second state's Medicaid program provides additional evidence for the usefulness of the model.
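
    The modelling step itself is a standard logistic regression evaluated by AUC on a held-out half of the data; the sketch below mimics that workflow on synthetic claims-like features (the variable names and effect sizes are invented, not the study's coefficients).

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(6)
    n = 20_000
    over_65 = rng.binomial(1, 0.3, n)
    prior_stay = rng.binomial(1, 0.2, n)
    charlson = rng.poisson(1.5, n)
    logit = -3.0 + 0.8 * over_65 + 1.2 * prior_stay + 0.35 * charlson
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    X = np.column_stack([over_65, prior_stay, charlson])

    # 50/50 split mirrors the derivation/validation approach described above.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)
    print("validation AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    ```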

  12. Optimizing ACS NSQIP modeling for evaluation of surgical quality and risk: patient risk adjustment, procedure mix adjustment, shrinkage adjustment, and surgical focus.

    PubMed

    Cohen, Mark E; Ko, Clifford Y; Bilimoria, Karl Y; Zhou, Lynn; Huffman, Kristopher; Wang, Xue; Liu, Yaoming; Kraemer, Kari; Meng, Xiangju; Merkow, Ryan; Chow, Warren; Matel, Brian; Richards, Karen; Hart, Amy J; Dimick, Justin B; Hall, Bruce L

    2013-08-01

    The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) collects detailed clinical data from participating hospitals using standardized data definitions, analyzes these data, and provides participating hospitals with reports that permit risk-adjusted comparisons with a surgical quality standard. Since its inception, the ACS NSQIP has worked to refine surgical outcomes measurements and enhance statistical methods to improve the reliability and validity of this hospital profiling. From an original focus on controlling for between-hospital differences in patient risk factors with logistic regression, ACS NSQIP has added a variable to better adjust for the complexity and risk profile of surgical procedures (procedure mix adjustment) and stabilized estimates derived from small samples by using a hierarchical model with shrinkage adjustment. New models have been developed focusing on specific surgical procedures (eg, "Procedure Targeted" models), which provide opportunities to incorporate indication and other procedure-specific variables and outcomes to improve risk adjustment. In addition, comparative benchmark reports given to participating hospitals have been expanded considerably to allow more detailed evaluations of performance. Finally, procedures have been developed to estimate surgical risk for individual patients. This article describes the development of, and justification for, these new statistical methods and reporting strategies in ACS NSQIP. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  13. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of a risk prediction model. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviewed how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models were highlighted. According to the studies reviewed, the artificial neural network approach to developing prediction models was more accurate than the statistical approach. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.

  14. Assessment of Yellow Fever Epidemic Risk: An Original Multi-criteria Modeling Approach

    PubMed Central

    Briand, Sylvie; Beresniak, Ariel; Nguyen, Tim; Yonli, Tajoua; Duru, Gerard; Kambire, Chantal; Perea, William

    2009-01-01

    Background Yellow fever (YF) virtually disappeared in francophone West African countries as a result of YF mass vaccination campaigns carried out between 1940 and 1953. However, because of the failure to continue mass vaccination campaigns, a resurgence of the deadly disease in many African countries began in the early 1980s. We developed an original modeling approach to assess YF epidemic risk (vulnerability) and to prioritize the populations to be vaccinated. Methods and Findings We chose a two-step assessment of vulnerability at district level consisting of a quantitative and qualitative assessment per country. Quantitative assessment starts with data collection on six risk factors: five risk factors associated with “exposure” to virus/vector and one with “susceptibility” of a district to YF epidemics. The multiple correspondence analysis (MCA) modeling method was specifically adapted to reduce the five exposure variables to one aggregated exposure indicator. Health districts were then projected onto a two-dimensional graph to define different levels of vulnerability. Districts are presented on risk maps for qualitative analysis in consensus groups, allowing the addition of factors, such as population migrations or vector density, that could not be included in MCA. The example of rural districts in Burkina Faso shows five distinct clusters of risk profiles. Based on this assessment, 32 of 55 districts comprising over 7 million people were prioritized for preventive vaccination campaigns. Conclusion This assessment of yellow fever epidemic risk at the district level includes MCA modeling and consensus group modification. MCA provides a standardized way to reduce complexity. It supports an informed public health decision-making process that empowers local stakeholders through the consensus group. This original approach can be applied to any disease with documented risk factors. PMID:19597548

  15. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.

  16. Predictive risk models for proximal aortic surgery

    PubMed Central

    Díaz, Rocío; Pascual, Isaac; Álvarez, Rubén; Alperi, Alberto; Rozado, Jose; Morales, Carlos; Silva, Jacobo; Morís, César

    2017-01-01

    Predictive risk models help improve decision making, the information given to our patients, and quality control by comparing results between surgeons and between institutions. The use of these models promotes competitiveness and leads to increasingly better results. All these virtues are of utmost importance when the surgical operation entails high risk. Although proximal aortic surgery is less frequent than other cardiac surgery operations, this procedure is more challenging and technically demanding than other common cardiac surgery techniques. The aim of this study is to review the current status of predictive risk models for patients who undergo proximal aortic surgery, which means aortic root replacement, supracoronary ascending aortic replacement or aortic arch surgery. PMID:28616348

  17. Comparing GWAS Results of Complex Traits Using Full Genetic Model and Additive Models for Revealing Genetic Architecture

    PubMed Central

    Monir, Md. Mamun; Zhu, Jun

    2017-01-01

    Most of the genome-wide association studies (GWASs) for human complex diseases have ignored dominance, epistasis and ethnic interactions. We conducted comparative GWASs for total cholesterol using a full model and additive models, which illustrates the impact of ignoring these genetic effects on analysis results and demonstrates how genetic effects of multiple loci can differ across ethnic groups. There were 15 quantitative trait loci, with 13 individual loci and 3 pairs of epistatic loci, identified by the full model, whereas only 14 loci (9 common loci and 5 different loci) were identified by the multi-locus additive model. Again, 4 loci detected by the full model were not detected using the multi-locus additive model. PLINK analysis identified two loci and GCTA analysis detected only one locus with genome-wide significance. The full model identified three previously reported genes as well as several new genes. Bioinformatics analysis showed that some new genes are related to cholesterol-related chemicals and/or diseases. Analyses of the cholesterol data and simulation studies revealed that the full model performed better than the additive models in terms of detection power and unbiased estimation of genetic variants of complex traits. PMID:28079101
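
    The contrast between an additive-only model and a "full" model can be seen at a single locus by adding a dominance covariate and comparing model fit; the sketch below does this for one simulated SNP with ordinary least squares and AIC (the coding and effect sizes are illustrative, not the paper's analysis).

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 5000
    g = rng.binomial(2, 0.4, n)                  # genotype coded 0/1/2
    add = g.astype(float)                        # additive coding
    dom = (g == 1).astype(float)                 # dominance (heterozygote) coding
    y = 0.2 * add + 0.3 * dom + rng.normal(0, 1, n)   # trait with a dominance effect

    additive_fit = sm.OLS(y, sm.add_constant(add)).fit()
    full_fit = sm.OLS(y, sm.add_constant(np.column_stack([add, dom]))).fit()
    print("additive-only AIC:", round(additive_fit.aic, 1),
          " additive+dominance AIC:", round(full_fit.aic, 1))
    ```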

  18. A Probabilistic Typhoon Risk Model for Vietnam

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces from the low-lying Mekong River delta region in the southwest to the Red River delta region in northern Vietnam are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with a maximum sustained wind speed of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon causing significant damage to properties. As tropical cyclone risk is expected to increase with increasing exposure and population growth along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  19. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Little, Mark P.; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of high-dose radiation induced solid cancers included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed in addition to cellular initiation, inactivation, and proliferation a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed). As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low

  20. Additions to Mars Global Reference Atmospheric Model (MARS-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, Bonnie

    1992-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification was also made which allows heights to go 'below' local terrain height and return 'realistic' pressure, density, and temperature, and not the surface values, as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local 'valley' areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch versions of Mars-GRAM are presented.

  1. Additions to Mars Global Reference Atmospheric Model (Mars-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.

    1991-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification has also been made which allows heights to go below local terrain height and return realistic pressure, density, and temperature (not the surface values) as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local valley areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch version of Mars-GRAM are presented.

  2. The Tripartite Model of Risk Perception (TRIRISK): Distinguishing Deliberative, Affective, and Experiential Components of Perceived Risk.

    PubMed

    Ferrer, Rebecca A; Klein, William M P; Persoskie, Alexander; Avishai-Yitshak, Aya; Sheeran, Paschal

    2016-10-01

    Although risk perception is a key predictor in health behavior theories, current conceptions of risk comprise only one (deliberative) or two (deliberative vs. affective/experiential) dimensions. This research tested a tripartite model that distinguishes among deliberative, affective, and experiential components of risk perception. In two studies, and in relation to three common diseases (cancer, heart disease, diabetes), we used confirmatory factor analyses to examine the factor structure of the tripartite risk perception (TRIRISK) model and compared the fit of the TRIRISK model to dual-factor and single-factor models. In a third study, we assessed concurrent validity by examining the impact of cancer diagnosis on (a) levels of deliberative, affective, and experiential risk perception, and (b) the strength of relations among risk components, and tested predictive validity by assessing relations with behavioral intentions to prevent cancer. The tripartite factor structure was supported, producing better model fit across diseases (studies 1 and 2). Inter-correlations among the components were significantly smaller among participants who had been diagnosed with cancer, suggesting that affected populations make finer-grained distinctions among risk perceptions (study 3). Moreover, all three risk perception components predicted unique variance in intentions to engage in preventive behavior (study 3). The TRIRISK model offers both a novel conceptualization of health-related risk perceptions, and new measures that enhance predictive validity beyond that engendered by unidimensional and bidimensional models. The present findings have implications for the ways in which risk perceptions are targeted in health behavior change interventions, health communications, and decision aids.

  3. Additional risk of end-of-the-pipe geoengineering technologies

    NASA Astrophysics Data System (ADS)

    Bohle, Martin

    2014-05-01

    qualitatively from the known successes. They do not tackle the initial cause, namely the carbon-dioxide inputs that are too high. This is their additional specific risk. 'The acceptability of geoengineering will be determined as much by social, legal and political issues as by scientific and technical factors', conclude Adam Corner and Nick Pidgeon (2010) when reviewing the social and ethical implications of geoengineering the climate. It is in that context that it should be debated that most geoengineering technologies are 'end-of-the-pipe' technologies, which involves an additional specific risk. Should these technologies be part of the toolbox to tackle anthropogenic climate change? Adam Corner and Nick Pidgeon 2010, Geoengineering the climate: The social and ethical implications, Environment Vol. 52.

  4. Monitoring risk-adjusted outcomes in congenital heart surgery: does the appropriateness of a risk model change with time?

    PubMed

    Tsang, Victor T; Brown, Katherine L; Synnergren, Mats Johanssen; Kang, Nicholas; de Leval, Marc R; Gallivan, Steve; Utley, Martin

    2009-02-01

    Risk adjustment of outcomes in pediatric congenital heart surgery is challenging due to the great diversity in diagnoses and procedures. We have previously shown that variable life-adjusted display (VLAD) charts provide an effective graphic display of risk-adjusted outcomes in this specialty. A question arises as to whether the risk model used remains appropriate over time. We used a recently developed graphic technique to evaluate the performance of an existing risk model among those patients at a single center during 2000 to 2003 originally used in model development. We then compared the distribution of predicted risk among these patients with that among patients in 2004 to 2006. Finally, we constructed a VLAD chart of risk-adjusted outcomes for the latter period. Among 1083 patients between April 2000 and March 2003, the risk model performed well at predicted risks above 3%, underestimated mortality at 2% to 3% predicted risk, and overestimated mortality below 2% predicted risk. There was little difference in the distribution of predicted risk among these patients and among 903 patients between June 2004 and October 2006. Outcomes for the more recent period were appreciably better than those expected according to the risk model. This finding cannot be explained by any apparent bias in the risk model combined with changes in case-mix. Risk models can, and hopefully do, become out of date. There is scope for complacency in the risk-adjusted audit if the risk model used is not regularly recalibrated to reflect changing standards and expectations.
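
    The VLAD statistic itself is simple to compute: the running sum of (model-predicted risk minus observed death) over consecutive cases, so an upward drift indicates better-than-expected outcomes. The sketch below illustrates it with simulated risks and outcomes (the numbers are invented, not the centre's data).

    ```python
    import numpy as np

    def vlad(predicted_risk, died):
        """Variable life-adjusted display: cumulative (expected - observed) deaths
        against case number; upward drift = outcomes better than the risk model."""
        return np.cumsum(np.asarray(predicted_risk) - np.asarray(died))

    rng = np.random.default_rng(8)
    risk = rng.beta(1.0, 20.0, 1000)             # mostly low predicted risks
    died = rng.binomial(1, risk * 0.8)           # outcomes better than expected
    curve = vlad(risk, died)
    print("net lives 'gained' relative to the risk model:", round(float(curve[-1]), 1))
    ```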

  5. Dynamic Modeling of Systemic Risk in Financial Networks

    NASA Astrophysics Data System (ADS)

    Avakian, Adam

    Modern financial networks are complicated structures that can contain multiple types of nodes and connections between those nodes. Banks, governments and even individual people weave into an intricate network of debt, risk correlations and many other forms of interconnectedness. We explore multiple types of financial network models with a focus on understanding the dynamics and causes of cascading failures in such systems. In particular, we apply real-world data from multiple sources to these models to better understand real-world financial networks. We use the results of the Federal Reserve "Banking Organization Systemic Risk Report" (FR Y-15), which surveys the largest US banks on their level of interconnectedness, to find relationships between various measures of network connectivity and systemic risk in the US financial sector. This network model is then stress-tested under a number of scenarios to determine systemic risks inherent in the various network structures. We also use detailed historical balance sheet data from the Venezuelan banking system to build a bipartite network model and find relationships between the changing network structure over time and the response of the system to various shocks. We find that the relationship between interconnectedness and systemic risk is highly dependent on the system and model, but that it is always a significant one. These models are useful tools that can help regulators create new measures of systemic risk in financial networks, and they could serve as macroprudential tools for monitoring the health of the banking system as a whole rather than of individual banks only.

  6. Risk adjustment model of credit life insurance using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Saputra, A.; Sukono; Rusyaman, E.

    2018-03-01

    In managing the risk of credit life insurance, an insurance company should understand the character of the risks in order to predict future losses. Risk characteristics can be learned from a claim distribution model. There are two standard approaches to modelling the distribution of claims over the insurance period: the collective risk model and the individual risk model. In the collective risk model, the claim that arises when a risk occurs is called an individual claim, and the accumulation of individual claims during a period of insurance is called the aggregate claim. The aggregate claim model is therefore built from a model of the number of claims and a model of the individual claim sizes. The questions addressed are how to measure insurance risk with a premium model approach and whether this approach is appropriate for estimating potential future losses. To solve this problem, a genetic algorithm with roulette wheel selection is used.
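
    In the collective risk model described above, the aggregate claim is the sum of a random number of individual claims, S = X_1 + ... + X_N. A minimal Monte Carlo sketch of this structure (the Poisson claim-count and exponential claim-size distributions are illustrative assumptions, not the paper's calibrated model):

    ```python
    import numpy as np

    # Collective risk model sketch: aggregate claim S = X_1 + ... + X_N, with N the
    # random number of claims in the period and X_i the individual claim sizes.
    rng = np.random.default_rng(0)

    def simulate_aggregate_claims(n_sims=100_000, lam=50, mean_claim=2_000.0):
        counts = rng.poisson(lam, size=n_sims)                     # N ~ Poisson(lam)
        return np.array([rng.exponential(mean_claim, n).sum() for n in counts])

    S = simulate_aggregate_claims()
    print(S.mean(), np.quantile(S, 0.995))  # expected aggregate claim and a tail quantile
    ```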

  7. Mechanistic modeling of insecticide risks to breeding birds in ...

    EPA Pesticide Factsheets

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. At present, USEPA risk assessments do not include population-level endpoints. In this paper, we present a new mechanistic model, which allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to use agricultural fields during their breeding season. The new model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model has been applied to assess the relative risk of 12 insecticides used to control corn pests on a suite of 31 avian species known to use cornfields in midwestern agroecosystems. The 12 insecticides that were assessed in this study are all used to treat major pests of corn (corn root worm borer, cutworm, and armyworm). After running the integrated TIM/MCnest model, we found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (

  8. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. © 2015 Society for Risk Analysis.
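
    The Bayesian model averaging step described above can be sketched as a posterior update over a set of candidate opponent models, each of which assigns a probability to the opponent's observed actions. The models, action space, and probabilities below are hypothetical placeholders, not the authors' specification:

    ```python
    # Bayesian model averaging over opponent rationality models (sketch).
    # Each candidate model gives a predictive probability for each of three
    # discrete opponent actions; observed actions update the model weights.
    models = {
        "random":        lambda a: 1 / 3,                 # uniform over 3 actions
        "level_1":       lambda a: [0.7, 0.2, 0.1][a],    # hypothetical level-k predictor
        "prospect_like": lambda a: [0.2, 0.2, 0.6][a],    # hypothetical prospect-theory predictor
    }
    posterior = {name: 1 / len(models) for name in models}   # uniform prior over models

    for observed_action in [0, 0, 1]:                          # hypothetical observations
        likelihoods = {n: m(observed_action) for n, m in models.items()}
        norm = sum(posterior[n] * likelihoods[n] for n in models)
        posterior = {n: posterior[n] * likelihoods[n] / norm for n in models}

    print(posterior)  # posterior probability of each rationality paradigm
    ```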

  9. Usefulness and limitations of global flood risk models

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  10. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  11. Developing a novel risk prediction model for severe malarial anemia.

    PubMed

    Brickley, E B; Kabyemela, E; Kurtis, J D; Fried, M; Wood, A M; Duffy, P E

    2017-01-01

    As a pilot study to investigate whether personalized medicine approaches could have value for the reduction of malaria-related mortality in young children, we evaluated questionnaire and biomarker data collected from the Mother Offspring Malaria Study Project birth cohort (Muheza, Tanzania, 2002-2006) at the time of delivery as potential prognostic markers for pediatric severe malarial anemia. Severe malarial anemia, defined here as a Plasmodium falciparum infection accompanied by hemoglobin levels below 50 g/L, is a key manifestation of life-threatening malaria in high transmission regions. For this study sample, a prediction model incorporating cord blood levels of interleukin-1β provided the strongest discrimination of severe malarial anemia risk with a C-index of 0.77 (95% CI 0.70-0.84), whereas a pragmatic model based on sex, gravidity, transmission season at delivery, and bed net possession yielded a more modest C-index of 0.63 (95% CI 0.54-0.71). Although additional studies, ideally incorporating larger sample sizes and higher event per predictor ratios, are needed to externally validate these prediction models, the findings provide proof of concept that risk score-based screening programs could be developed to avert severe malaria cases in early childhood.
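
    The C-index reported above is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case; for a binary outcome it coincides with the area under the ROC curve. A sketch of how it could be computed (hypothetical outcomes and risk scores, not the study's data):

    ```python
    from sklearn.metrics import roc_auc_score

    # For a binary outcome, the C-index equals the ROC AUC: the probability that a
    # randomly chosen case is assigned a higher predicted risk than a non-case.
    y = [0, 0, 1, 0, 1, 1, 0]                       # severe malarial anemia (1 = yes), hypothetical
    risk_score = [0.1, 0.3, 0.8, 0.2, 0.6, 0.4, 0.5]  # hypothetical model predictions
    print(roc_auc_score(y, risk_score))              # C-index
    ```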

  12. A probabilistic asteroid impact risk model: assessment of sub-300 m impacts

    NASA Astrophysics Data System (ADS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2017-06-01

    A comprehensive asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain input parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions for objects up to 300 m in diameter. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data have little effect on the metrics of interest.
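
    The Monte Carlo structure described above can be illustrated with a toy loop that samples uncertain impactor properties, propagates them through a simplified consequence relation, and aggregates affected population across scenarios. The distributions and the damage-radius scaling below are illustrative assumptions only, not the PAIR model's:

    ```python
    import numpy as np

    # Toy Monte Carlo impact-risk loop: sample uncertain inputs, compute a
    # simplified consequence, aggregate the distribution of outcomes.
    rng = np.random.default_rng(1)
    n = 100_000

    diameter_m = rng.uniform(20, 300, n)            # impactor diameter (assumed range, m)
    density    = rng.normal(2_600, 300, n)          # bulk density (assumed, kg/m^3)
    velocity   = rng.normal(20_000, 3_000, n)       # entry speed (assumed, m/s)
    energy_mt  = 0.5 * density * (np.pi / 6) * diameter_m**3 * velocity**2 / 4.184e15

    damage_radius_km = 2.0 * energy_mt ** (1 / 3)   # illustrative damage scaling law
    pop_density = rng.lognormal(mean=2.0, sigma=1.5, size=n)   # people per km^2 (assumed)
    affected = pop_density * np.pi * damage_radius_km**2

    print(np.mean(affected), np.quantile(affected, 0.99))  # mean and tail of outcomes
    ```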

  13. Optimizing D'Amico risk groups in radical prostatectomy through the addition of magnetic resonance imaging data.

    PubMed

    Algarra, R; Zudaire, B; Tienza, A; Velis, J M; Rincón, A; Pascual, I; Zudaire, J

    2014-11-01

    To improve the predictive efficacy of the D'Amico risk classification system with magnetic resonance imaging (MRI) of the pelvis. We studied 729 patients from a series of 1310 radical prostatectomies for T1-T2 prostate cancer who underwent staging pelvic MRI. Each patient was classified as T2, T3a or T3b on MRI, and N (+) patients were excluded. We identified the therapeutic factors that affected the biochemical progression-free survival (BPFS) time (prostate specific antigen [PSA] levels >0.4 ng/mL) using a univariate and multivariate study with Cox models. We attempted to improve the predictive power of the D'Amico model (low risk: T1; Gleason 2-6; PSA levels <10 ng/mL; intermediate risk: T2 or Gleason 7 or PSA levels 10-20 ng/mL; high risk: T3 or Gleason 8-10 or PSA levels >20 ng/mL). In the univariate study, the clinical factors that influenced BPFS were the following: Gleason 7 (HR: 1.7); Gleason 8-10 (HR: 2.9); T2 (HR: 1.6); PSA levels 10-20 (HR: 2); PSA levels >20 (HR: 4.3); D'Amico intermediate (HR: 2.1) and high (HR: 4.8) risk; T3a MRI (HR: 2.3) and T3b MRI (HR: 4.5). In the multivariate study, the only variables that affected BPFS were the following: D'Amico intermediate risk (HR: 2; 95% CI 1.2-3.3); D'Amico high risk (HR: 4.1; 95% CI 2.4-6.8); T3a MRI (HR: 1.9; 95% CI 1.2-2.9) and T3b MRI (HR: 3.9; 95% CI 2.5-6.1). Predictive model: Using the multivariate Cox models, we assessed the weight of each variable. A value of 1 was given to D'Amico low risk and T2 MRI; a value of 2 was given to D'Amico intermediate risk and T3a MRI; and a value of 3 was given to D'Amico high risk and T3b MRI. Each patient had a marker that varied between 2 and 6. The best model included 3 groups, as follows: 494 (67.7%) patients in group 1, with a score of 2-3 points (HR, 1), a BPFS of 86%±2% and 79%±2% at 5 and 10 years, respectively; 179 (24.6%) patients in group 2, with a score of 4 points (HR, 3), a BPFS of 60%±4% and 54%±5% at 5 and 10 years, respectively; and 56 (7.7%) patients in
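
    Following the scoring scheme described in the abstract (D'Amico risk mapped to 1-3, MRI stage mapped to 1-3, summed to a 2-6 point marker and collapsed into three groups), a small illustrative function might look like the sketch below; the group cut-offs for scores 5-6 are inferred from the structure of the abstract, which is truncated before the third group is described:

    ```python
    # Illustrative scoring function based on the scheme described in the abstract:
    # D'Amico low/intermediate/high -> 1/2/3, MRI T2/T3a/T3b -> 1/2/3;
    # the sum (2-6) is collapsed into three prognostic groups (2-3, 4, 5-6 points).
    DAMICO = {"low": 1, "intermediate": 2, "high": 3}
    MRI = {"T2": 1, "T3a": 2, "T3b": 3}

    def risk_group(damico: str, mri_stage: str) -> int:
        score = DAMICO[damico] + MRI[mri_stage]
        if score <= 3:
            return 1   # best biochemical progression-free survival
        if score == 4:
            return 2
        return 3       # score 5-6: worst prognosis (assumed from the group structure)

    print(risk_group("intermediate", "T3a"))  # -> 2
    ```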

  14. Risk Modelling of Agricultural Products

    NASA Astrophysics Data System (ADS)

    Nugrahani, E. H.

    2017-03-01

    In the real-world market, agricultural commodities are subject to fluctuating prices. This means that the prices of agricultural products are relatively volatile, which makes agriculture a rather risky business for farmers. This paper presents some mathematical models for such risks in the form of price volatility, based on certain assumptions. The proposed models are a time-varying volatility model, as well as time-varying volatility models with mean reversion and with a seasonal mean equation. Implementation on empirical data shows that agricultural products are indeed risky.

  15. Weekend hospitalization and additional risk of death: an analysis of inpatient data.

    PubMed

    Freemantle, N; Richardson, M; Wood, J; Ray, D; Khosla, S; Shahian, D; Roche, W R; Stephens, I; Keogh, B; Pagano, D

    2012-02-01

    To assess whether weekend admissions to hospital and/or already being an inpatient on weekend days were associated with any additional mortality risk. Retrospective observational survivorship study. We analysed all admissions to the English National Health Service (NHS) during the financial year 2009/10, following up all patients for 30 days after admission and accounting for risk of death associated with diagnosis, co-morbidities, admission history, age, sex, ethnicity, deprivation, seasonality, day of admission and hospital trust, including day of death as a time dependent covariate. The principal analysis was based on time to in-hospital death. National Health Service Hospitals in England. 30 day mortality (in or out of hospital). There were 14,217,640 admissions included in the principal analysis, with 187,337 in-hospital deaths reported within 30 days of admission. Admission on weekend days was associated with a considerable increase in risk of subsequent death compared with admission on weekdays, hazard ratio for Sunday versus Wednesday 1.16 (95% CI 1.14 to 1.18; P < .0001), and for Saturday versus Wednesday 1.11 (95% CI 1.09 to 1.13; P < .0001). Hospital stays on weekend days were associated with a lower risk of death than midweek days, hazard ratio for being in hospital on Sunday versus Wednesday 0.92 (95% CI 0.91 to 0.94; P < .0001), and for Saturday versus Wednesday 0.95 (95% CI 0.93 to 0.96; P < .0001). Similar findings were observed on a smaller US data set. Admission at the weekend is associated with increased risk of subsequent death within 30 days of admission. The likelihood of death actually occurring is less on a weekend day than on a mid-week day.

  16. Optimal dividends in the Brownian motion risk model with interest

    NASA Astrophysics Data System (ADS)

    Fang, Ying; Wu, Rong

    2009-07-01

    In this paper, we consider a Brownian motion risk model, and in addition, the surplus earns investment income at a constant force of interest. The objective is to find a dividend policy so as to maximize the expected discounted value of dividend payments. It is well known that optimality is achieved by using a barrier strategy for unrestricted dividend rate. However, ultimate ruin of the company is certain if a barrier strategy is applied. In many circumstances this is not desirable. This consideration leads us to impose a restriction on the dividend stream. We assume that dividends are paid to the shareholders according to admissible strategies whose dividend rate is bounded by a constant. Under this additional constraint, we show that the optimal dividend strategy is formed by a threshold strategy.
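
    In the notation usually employed for this class of problems (a sketch consistent with the abstract, not a reproduction of the paper's equations), the controlled surplus with interest force \(\delta\) and a dividend rate \(l_t\) bounded by a constant \(M\) evolves as

    \[ dX_t = \bigl(\mu + \delta X_t - l_t\bigr)\,dt + \sigma\,dW_t, \qquad 0 \le l_t \le M, \]

    and the threshold strategy referred to above pays dividends at the maximal rate only while the surplus exceeds a threshold \(b\):

    \[ l_t = M\,\mathbf{1}\{X_t \ge b\}. \]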

  17. A review of unmanned aircraft system ground risk models

    NASA Astrophysics Data System (ADS)

    Washington, Achim; Clothier, Reece A.; Silva, Jose

    2017-11-01

    There is much effort being directed towards the development of safety regulations for unmanned aircraft systems (UAS). National airworthiness authorities have advocated the adoption of a risk-based approach, whereby regulations are driven by the outcomes of a systematic process to assess and manage identified safety risks. Subsequently, models characterising the primary hazards associated with UAS operations have now become critical to the development of regulations and in turn, to the future of the industry. Key to the development of airworthiness regulations for UAS is a comprehensive understanding of the risks UAS operations pose to people and property on the ground. A comprehensive review of the literature identified 33 different models (and component sub models) used to estimate ground risk posed by UAS. These models comprise failure, impact location, recovery, stress, exposure, incident stress and harm sub-models. The underlying assumptions and treatment of uncertainties in each of these sub-models differ significantly between models, which can have a significant impact on the development of regulations. This paper reviews the state-of-the-art in research into UAS ground risk modelling, discusses how the various sub-models relate to the different components of the regulation, and explores how model-uncertainties potentially impact the development of regulations for UAS.
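
    A building block common to many such ground-risk models (given here as a generic illustration of how the sub-models combine, not as a formula taken from any specific reviewed model) is the expected number of ground casualties per flight hour,

    \[ E[C] = \lambda_{\text{fail}} \times A_{\text{exp}} \times \rho_{\text{pop}} \times P(\text{harm} \mid \text{exposure}), \]

    where \(\lambda_{\text{fail}}\) is the failure rate, \(A_{\text{exp}}\) the exposed impact area, \(\rho_{\text{pop}}\) the population density at the impact location, and the final factor the probability that an exposed person is harmed; the sub-models listed above each inform one of these terms.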

  18. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
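
    Schematically, the risk decomposition described above is often written as a product of hazard, exposure, and vulnerability summed over elements at risk and landslide scenarios (the notation below is an assumption chosen for illustration, consistent with the abstract):

    \[ R = \sum_{j} \sum_{s} P_s \, E_j \, V_{js}, \]

    where \(P_s\) is the probability of landslide scenario \(s\) in the period considered, \(E_j\) the monetary value of element at risk \(j\), and \(V_{js}\) the expected degree of loss of element \(j\) under scenario \(s\).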

  19. Prognostic significance of smoking in addition to established risk factors in patients with Dukes B and C colorectal cancer: a retrospective analysis.

    PubMed

    Diamantis, N; Xynos, I D; Amptulah, S; Karadima, M; Skopelitis, H; Tsavaris, N

    2013-01-01

    To investigate the prognostic significance of smoking in addition to established risk factors in patients with Dukes stage B and C colorectal cancer (CRC). 291 consecutive non-selected CRC patients were studied retrospectively. Twenty-three variables were examined using a regression statistical model to identify relevant prognostic factors related to disease free survival (DFS) and overall survival (OS). On multivariate analysis, DFS was found to be negatively affected in patients with a smoking history of ≤10 pack-years vs. non-smokers (p<0.016). Additionally, performance status (PS)<90 (p<0.001), Dukes stage C (p<0.001) and elevated tumor markers (p<0.001) at the time of diagnosis were found to adversely affect DFS. Smoking also had a significant association with relapse. Patients with a smoking history of ≤10 pack-years had a 2.45-fold (p<0.018) higher risk of recurrence compared to patients with no smoking history. OS was influenced by Karnofsky performance status (PS), Dukes stage, and elevated tumor markers. In particular, patients with PS<90 had a 4.69-fold higher risk of death (p<0.001) than patients with better PS. Stage C disease was associated with a 2.27-fold higher risk of death (p<0.001) than stage B disease, and patients with elevated tumor markers at the time of diagnosis had a 2.74-fold higher risk of death (p<0.014) when compared to those whose tumor markers were normal at presentation. Our study associates smoking and relapse incidence in non-clinical-trial CRC patients and reiterates the prognostic significance of PS, stage and tumor markers at the time of diagnosis.

  20. Linking livestock snow disaster mortality and environmental stressors in the Qinghai-Tibetan Plateau: Quantification based on generalized additive models.

    PubMed

    Li, Yijia; Ye, Tao; Liu, Weihang; Gao, Yu

    2018-06-01

    Livestock snow disasters occur widely in Central-to-Eastern Asian temperate and alpine grasslands. The effects of snow disaster on livestock involve a complex interaction between precipitation, vegetation, livestock, and herder communities. Quantifying the relationship among livestock mortality, snow hazard intensity, and seasonal environmental stressors is of great importance for snow disaster early warning, risk assessments, and adaptation strategies. Using a wide spatial extent, long time series, event-based livestock snow disaster dataset, this study quantified those relationships and established a quantitative model of livestock mortality for prediction purposes for the Qinghai-Tibet Plateau region. Estimations using generalized additive models (GAMs) were shown to accurately predict livestock mortality and mortality rate due to snow disaster, with adjusted R² of up to 0.794 and 0.666, respectively. These results showed that a longer snow disaster duration, lower temperatures during the disaster, and a drier summer with less vegetation all contribute significantly and non-linearly to higher mortality (rate), after controlling for elevation and socioeconomic conditions. These results can be readily applied to risk assessment and risk-based adaptation actions. Copyright © 2017 Elsevier B.V. All rights reserved.
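
    A minimal sketch of fitting a generalized additive model of the kind described, using the pyGAM library as one possible tool; the predictors, simulated data, and coefficients below are hypothetical placeholders, not the study's event-based dataset or fitted model:

    ```python
    import numpy as np
    from pygam import LinearGAM, s

    # GAM sketch: mortality rate modelled with one smooth term per environmental
    # stressor (snow duration, temperature during the disaster, summer vegetation).
    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.uniform(1, 60, n),       # snow disaster duration (days), hypothetical
        rng.uniform(-30, 0, n),      # mean temperature during disaster (deg C), hypothetical
        rng.uniform(0.1, 0.8, n),    # summer NDVI (vegetation condition), hypothetical
    ])
    y = 0.002 * X[:, 0] - 0.003 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(0, 0.05, n)

    gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, y)   # one smooth per predictor
    gam.summary()                                    # prints fit statistics, incl. pseudo R^2
    ```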

  1. A comprehensive Network Security Risk Model for process control networks.

    PubMed

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example.

  2. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of each tool are as follows: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for the automated count; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  3. Genetic predisposition to coronary heart disease and stroke using an additive genetic risk score: a population-based study in Greece

    USDA-ARS?s Scientific Manuscript database

    Objective: To determine the extent to which the risk for incident coronary heart disease (CHD) increases in relation to a genetic risk score (GRS) that additively integrates the influence of high-risk alleles in nine documented single nucleotide polymorphisms (SNPs) for CHD, and to examine whether t...
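
    An additive genetic risk score of the kind referred to here is simply a (possibly weighted) count of risk alleles across the selected SNPs, GRS_i = Σ_j w_j g_ij. A small hypothetical sketch (genotypes and weights are invented for illustration only):

    ```python
    import numpy as np

    # Additive genetic risk score sketch: GRS_i = sum_j w_j * g_ij, where g_ij is the
    # number of risk alleles (0, 1, 2) carried by person i at SNP j, and w_j is an
    # optional per-SNP weight (e.g., a log odds ratio); w_j = 1 gives an unweighted count.
    genotypes = np.array([
        [0, 1, 2, 1, 0, 2, 1, 0, 1],    # person 1, nine SNPs (hypothetical)
        [2, 2, 1, 0, 1, 1, 0, 1, 2],    # person 2
    ])
    weights = np.ones(9)                 # unweighted additive score
    grs = genotypes @ weights
    print(grs)                           # one risk score per person
    ```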

  4. Hierarchical Bayesian modeling of spatio-temporal patterns of lung cancer incidence risk in Georgia, USA: 2000-2007

    NASA Astrophysics Data System (ADS)

    Yin, Ping; Mu, Lan; Madden, Marguerite; Vena, John E.

    2014-10-01

    Lung cancer is the second most commonly diagnosed cancer in both men and women in Georgia, USA. However, the spatio-temporal patterns of lung cancer risk in Georgia have not been fully studied. Hierarchical Bayesian models are used here to explore the spatio-temporal patterns of lung cancer incidence risk by race and gender in Georgia for the period of 2000-2007. With the census tract level as the spatial scale and the 2-year period aggregation as the temporal scale, we compare a total of seven Bayesian spatio-temporal models including two under a separate modeling framework and five under a joint modeling framework. One joint model outperforms others based on the deviance information criterion. Results show that the northwest region of Georgia has consistently high lung cancer incidence risk for all population groups during the study period. In addition, there are inverse relationships between the socioeconomic status and the lung cancer incidence risk among all Georgian population groups, and the relationships in males are stronger than those in females. By mapping more reliable variations in lung cancer incidence risk at a relatively fine spatio-temporal scale for different Georgian population groups, our study aims to better support healthcare performance assessment, etiological hypothesis generation, and health policy making.

  5. Electroacoustics modeling of piezoelectric welders for ultrasonic additive manufacturing processes

    NASA Astrophysics Data System (ADS)

    Hehr, Adam; Dapino, Marcelo J.

    2016-04-01

    Ultrasonic additive manufacturing (UAM) is a recent 3D metal printing technology that utilizes ultrasonic vibrations from high-power piezoelectric transducers to additively weld similar and dissimilar metal foils. CNC machining is used intermittently with welding to create internal channels, to embed temperature-sensitive components, sensors, and materials, and to net-shape parts. Structural dynamics of the welder and workpiece influence the performance of the welder and part quality. To understand the impact of structural dynamics on UAM, a linear time-invariant model is used to relate system shear force and electric current inputs to the system outputs of welder velocity and voltage. Frequency response measurements are combined with in-situ operating measurements of the welder to identify model parameters and to verify model assumptions. The proposed LTI model can enhance process consistency and performance, and guide the development of improved quality monitoring and control strategies.

  6. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    PubMed

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
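
    As a schematic of the toxicokinetic part of such models (a one-compartment sketch under assumed rate constants and exposure profile, not the study's calibrated model), internal concentration follows first-order uptake and elimination driven by the time-varying external exposure:

    ```python
    import numpy as np

    # One-compartment TK sketch: dC_int/dt = k_in * C_ext(t) - k_out * C_int,
    # integrated with a simple Euler scheme for a pulsed external exposure.
    k_in, k_out = 0.5, 0.2            # uptake and elimination rate constants (1/day), assumed
    dt, days = 0.01, 30
    t = np.arange(0, days, dt)
    c_ext = np.where((t > 5) & (t < 7), 10.0, 0.0)   # hypothetical pulsed exposure

    c_int = np.zeros_like(t)
    for i in range(1, len(t)):
        c_int[i] = c_int[i - 1] + dt * (k_in * c_ext[i - 1] - k_out * c_int[i - 1])

    print(c_int.max())   # peak internal concentration, to compare against an effect threshold
    ```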

  7. Modeling process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-02-01

    This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including using data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, proceeds to the meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analysis, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, which is motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance AM fundamental understanding and guide the monitoring and advanced diagnostics of AM processing.

  8. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effect of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.

  9. The Effects of Revealed Information on Catastrophe Loss Projection Models' Characterization of Risk: Damage Vulnerability Evidence from Florida.

    PubMed

    Karl, J Bradley; Medders, Lorilee A; Maroney, Patrick F

    2016-06-01

    We examine whether the risk characterization estimated by catastrophic loss projection models is sensitive to the revelation of new information regarding risk type. We use commercial loss projection models from two widely employed modeling firms to estimate the expected hurricane losses of Florida Atlantic University's building stock, both including and excluding secondary information regarding hurricane mitigation features that influence damage vulnerability. We then compare the results of the models without and with this revealed information and find that the revelation of additional, secondary information influences modeled losses for the windstorm-exposed university building stock, primarily evidenced by meaningful percent differences in the loss exceedance output indicated after secondary modifiers are incorporated in the analysis. Secondary risk characteristics for the data set studied appear to have substantially greater impact on probable maximum loss estimates than on average annual loss estimates. While it may be intuitively expected for catastrophe models to indicate that secondary risk characteristics hold value for reducing modeled losses, the finding that the primary value of secondary risk characteristics is in reduction of losses in the "tail" (low probability, high severity) events is less intuitive, and therefore especially interesting. Further, we address the benefit-cost tradeoffs that commercial entities must consider when deciding whether to undergo the data collection necessary to include secondary information in modeling. Although we assert the long-term benefit-cost tradeoff is positive for virtually every entity, we acknowledge short-term disincentives to such an effort. © 2015 Society for Risk Analysis.

  10. Collision risk model for NAT region.

    DOT National Transportation Integrated Search

    1971-05-01

    The paper reviews and summarizes the essential features of the collision risk model used to analyze the effects of separation standards on safety for the parallel tracking system employed in the North Atlantic. The derivation of the model is traced f...

  11. Modeling risk of pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Nowak, J. Joshua; Lukacs, Paul M.; Anderson, Neil J.; Ramsey, Jennifer M.; Gude, Justin A.; Krausman, Paul R.

    2015-01-01

    Pneumonia epizootics are a major challenge for management of bighorn sheep (Ovis canadensis), affecting persistence of herds, satisfaction of stakeholders, and allocations of resources by management agencies. Risk factors associated with the disease are poorly understood, making pneumonia epizootics hard to predict; such epizootics are thus managed reactively rather than proactively. We developed a model for herds in Montana that identifies risk factors and addresses biological questions about risk. Using Bayesian logistic regression with repeated measures, we found that private land, weed control using domestic sheep or goats, pneumonia history, and herd density were positively associated with risk of pneumonia epizootics in 43 herds that experienced 22 epizootics out of 637 herd-years from 1979–2013. We defined an area of high risk for pathogen exposure as the area of each herd distribution plus a 14.5-km buffer from that boundary. Within this area, the odds of a pneumonia epizootic increased by >1.5 times per additional unit of private land (unit is the standardized % of private land, where the global mean = 25.58% and SD = 14.53%). Odds were >3.3 times greater if domestic sheep or goats were used for weed control in a herd's area of high risk. If a herd or its neighbors within the area of high risk had a history of a pneumonia epizootic, odds of a subsequent pneumonia epizootic were >10 times greater. Risk greatly increased when herds were at high density, with nearly 15 times greater odds of a pneumonia epizootic compared to when herds were at low density. Odds of a pneumonia epizootic also appeared to decrease following increased spring precipitation (odds = 0.41 per unit increase, global mean = 100.18% and SD = 26.97%). Risk was not associated with number of federal sheep and goat allotments, proximity to nearest herds of bighorn sheep, ratio of rams to ewes, percentage of average winter precipitation, or whether herds were of native versus mixed

  12. Additivity and Interactions in Ecotoxicity of Pollutant Mixtures: Some Patterns, Conclusions, and Open Questions

    PubMed Central

    Rodea-Palomares, Ismael; González-Pleiter, Miguel; Martín-Betancor, Keila; Rosal, Roberto; Fernández-Piñas, Francisca

    2015-01-01

    Understanding the effects of exposure to chemical mixtures is a common goal of pharmacology and ecotoxicology. In risk assessment-oriented ecotoxicology, defining the scope of application of additivity models has received utmost attention in the last 20 years, since they potentially allow one to predict the effect of any chemical mixture relying on individual chemical information only. The gold standard for additivity in ecotoxicology has proven to be Loewe additivity, which originated the so-called Concentration Addition (CA) additivity model. In pharmacology, the search for interactions or deviations from additivity (synergism and antagonism) has similarly captured the attention of researchers over the last 20 years and has resulted in the definition and application of the Combination Index (CI) Theorem. CI is based on Loewe additivity, but focused on the identification and quantification of synergism and antagonism. Despite additive models demonstrating a surprisingly good predictive power in chemical mixture risk assessment, concerns still exist due to the occurrence of unpredictable synergism or antagonism in certain experimental situations. In the present work, we summarize the parallel history of development of the CA, independent action (IA), and CI models. We also summarize the applicability of these concepts in ecotoxicology and how their information may be integrated, as well as the possibility of prediction of synergism. Inside the box, the main question remaining is whether it is worthwhile to consider departures from additivity in mixture risk assessment and how to predict interactions among certain mixture components. Outside the box, the main question is whether the results observed under the experimental constraints imposed by fractional approaches are a bona fide reflection of what would be expected from chemical mixtures in real world circumstances. PMID:29051468
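
    For reference, the Loewe/Concentration Addition condition and the Combination Index discussed above can be written, for a two-component mixture producing effect level x, as

    \[ \frac{d_1}{D_{x,1}} + \frac{d_2}{D_{x,2}} = 1 \quad (\text{Loewe additivity / CA}), \qquad CI = \frac{d_1}{D_{x,1}} + \frac{d_2}{D_{x,2}}, \]

    where \(d_i\) are the doses of the components in the mixture and \(D_{x,i}\) the doses of each component alone producing the same effect x; CI < 1, CI = 1, and CI > 1 indicate synergism, additivity, and antagonism, respectively.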

  13. Estimating community health needs against a Triple Aim background: What can we learn from current predictive risk models?

    PubMed

    Elissen, Arianne M J; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2015-05-01

    To support providers and commissioners in accurately assessing their local populations' health needs, this study produces an overview of Dutch predictive risk models for health care, focusing specifically on the type, combination and relevance of included determinants for achieving the Triple Aim (improved health, better care experience, and lower costs). We conducted a mixed-methods study combining document analyses, interviews and a Delphi study. Predictive risk models were identified based on a web search and expert input. Participating in the study were Dutch experts in predictive risk modelling (interviews; n=11) and experts in healthcare delivery, insurance and/or funding methodology (Delphi panel; n=15). Ten predictive risk models were analysed, comprising 17 unique determinants. Twelve were considered relevant by experts for estimating community health needs. Although some compositional similarities were identified between models, the combination and operationalisation of determinants varied considerably. Existing predictive risk models provide a good starting point, but optimally balancing resources and targeting interventions on the community level will likely require a more holistic approach to health needs assessment. Development of additional determinants, such as measures of people's lifestyle and social network, may require policies pushing the integration of routine data from different (healthcare) sources. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Modeling of Ti-W Solidification Microstructures Under Additive Manufacturing Conditions

    NASA Astrophysics Data System (ADS)

    Rolchigo, Matthew R.; Mendoza, Michael Y.; Samimi, Peyman; Brice, David A.; Martin, Brian; Collins, Peter C.; LeSar, Richard

    2017-07-01

    Additive manufacturing (AM) processes have many benefits for the fabrication of alloy parts, including the potential for greater microstructural control and targeted properties than traditional metallurgy processes. To accelerate utilization of this process to produce such parts, an effective computational modeling approach to identify the relationships between material and process parameters, microstructure, and part properties is essential. Development of such a model requires accounting for the many factors in play during this process, including laser absorption, material addition and melting, fluid flow, various modes of heat transport, and solidification. In this paper, we start with a more modest goal, to create a multiscale model for a specific AM process, Laser Engineered Net Shaping (LENS™), which couples a continuum-level description of a simplified beam melting problem (coupling heat absorption, heat transport, and fluid flow) with a Lattice Boltzmann-cellular automata (LB-CA) microscale model of combined fluid flow, solute transport, and solidification. We apply this model to a binary Ti-5.5 wt pct W alloy and compare calculated quantities, such as dendrite arm spacing, with experimental results reported in a companion paper.

  15. Modeling returns volatility: Realized GARCH incorporating realized risk measure

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Ruan, Qingsong; Li, Jianfeng; Li, Ye

    2018-06-01

    This study applies realized GARCH models by introducing several risk measures of intraday returns into the measurement equation, to model the daily volatility of E-mini S&P 500 index futures returns. Besides using the conventional realized measures, realized volatility and realized kernel as our benchmarks, we also use generalized realized risk measures, realized absolute deviation, and two realized tail risk measures, realized value-at-risk and realized expected shortfall. The empirical results show that realized GARCH models using the generalized realized risk measures provide better volatility estimation for the in-sample and substantial improvement in volatility forecasting for the out-of-sample. In particular, the realized expected shortfall performs best for all of the alternative realized measures. Our empirical results reveal that future volatility may be more attributable to present losses (risk measures). The results are robust to different sample estimation windows.
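
    Written schematically in the usual log-linear form (a sketch of the standard realized GARCH structure, with the realized measure x_t replaced by the risk measures listed in the abstract; not a transcription of the paper's exact specification), the model couples a return equation, a volatility recursion, and a measurement equation:

    \[ r_t = \sqrt{h_t}\, z_t, \qquad \log h_t = \omega + \beta \log h_{t-1} + \gamma \log x_{t-1}, \qquad \log x_t = \xi + \varphi \log h_t + \tau(z_t) + u_t, \]

    where \(\tau(z) = \tau_1 z + \tau_2 (z^2 - 1)\) is the leverage function and \(u_t\) is the measurement error of the realized risk measure.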

  16. Challenges of Modeling Flood Risk at Large Scales

    NASA Astrophysics Data System (ADS)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale such that all inter-dependencies in a river network are well understood. From an insurance perspective, the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at a scale suitable for a viable insurance market for flood peril. In order to serve the insurance market, a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing

  17. Materials Testing and Cost Modeling for Composite Parts Through Additive Manufacturing

    DTIC Science & Technology

    2016-04-30

    Fused deposition modeling (FDM), trademarked by Stratasys, is also known as plastic jet printing (PJP), fused filament modeling (FFM), and fused filament fabrication (FFF); the term FFF was coined by the RepRap project. The report addresses materials testing and cost modeling for composite parts produced through such additive manufacturing processes.

  18. Modeling financial disaster risk management in developing countries

    NASA Astrophysics Data System (ADS)

    Mechler, R.; Hochrainer, S.; Pflug, G.; Linnerooth-Bayer, J.

    2005-12-01

    The public sector plays a major role in reducing the long-term economic repercussions of disasters by repairing damaged infrastructure and providing financial assistance to households and businesses. If critical infrastructure is not repaired in a timely manner, there can be serious effects on the economy and the livelihoods of the population. The repair of public infrastructure, however, can be a significant drain on public budgets, especially in developing and transition countries. Developing country governments frequently lack the liquidity, even including international aid and loans, to fully repair damaged critical public infrastructure or provide sufficient support to households and businesses for their recovery. The earthquake in Gujarat, and other recent cases of government post-disaster liquidity crises, have sounded an alarm, prompting financial development organizations, such as the World Bank, among others, to call for greater attention to reducing financial vulnerability and increasing the resilience of the public sector. This talk reports on a model designed to illustrate the tradeoffs and choices a developing country must make in financially managing the economic risks due to natural disasters. Budgetary resources allocated to pre-disaster risk management strategies, such as loss mitigation measures, a catastrophe reserve fund, insurance and contingent credit arrangements for public assets, reduce the probability of financing gaps (the inability of governments to meet their full obligations in providing relief to private victims and restoring public infrastructure) or prevent the deterioration of the ability to undertake additional borrowing without incurring a debt crisis. The model, which is equipped with a graphical interface, can be a helpful tool for building the capacity of policy makers to develop and assess public financing strategies for disaster risk by indicating the respective costs and consequences of financing alternatives.

  19. Score tests for independence in semiparametric competing risks models.

    PubMed

    Saïd, Mériem; Ghazzali, Nadia; Rivest, Louis-Paul

    2009-12-01

    A popular model for competing risks postulates the existence of a latent unobserved failure time for each risk. Assuming that these underlying failure times are independent is attractive since it allows standard statistical tools for right-censored lifetime data to be used in the analysis. This paper proposes simple independence score tests for the validity of this assumption when the individual risks are modeled using semiparametric proportional hazards regressions. It assumes that covariates are available, making the model identifiable. The score tests are derived for alternatives that specify that copulas are responsible for a possible dependency between the competing risks. The test statistics are constructed by adding to the partial likelihoods for the individual risks an explanatory variable for the dependency between the risks. A variance estimator is derived by writing the score function and the Fisher information matrix for the marginal models as stochastic integrals. Pitman efficiencies are used to compare test statistics. A simulation study and a numerical example illustrate the methodology proposed in this paper.

  20. Modeling perceptions of climatic risk in crop production.

    PubMed

    Reinmuth, Evelyn; Parker, Phillip; Aurbacher, Joachim; Högy, Petra; Dabbert, Stephan

    2017-01-01

    In agricultural production, land-use decisions are components of economic planning that result in the strategic allocation of fields. Climate variability represents an uncertainty factor in crop production. Considering yield impact, climatic influence is perceived during and evaluated at the end of crop production cycles. In practice, this information is then incorporated into planning for the upcoming season. This process contributes to attitudes toward climate-induced risk in crop production. In the literature, however, the subjective valuation of risk is modeled as a risk attitude toward variations in (monetary) outcomes. Consequently, climatic influence may be obscured by political and market influences so that risk perceptions during the production process are neglected. We present a utility concept that allows the inclusion of annual risk scores based on mid-season risk perceptions that are incorporated into field-planning decisions. This approach is exemplified and implemented for winter wheat production in the Kraichgau, a region in Southwest Germany, using the integrated bio-economic simulation model FarmActor and empirical data from the region. Survey results indicate that a profitability threshold for this crop, the level of "still-good yield" (sgy), is 69 dt ha-1 (regional mean Kraichgau sample) for a given season. This threshold governs the monitoring process and risk estimators. We tested the modeled estimators against simulation results using ten projected future weather time series for winter wheat production. The mid-season estimators generally proved to be effective. This approach can be used to improve the modeling of planning decisions by providing a more comprehensive evaluation of field-crop response to climatic changes from an economic risk point of view. The methodology further provides economic insight in an agrometeorological context where prices for crops or inputs are lacking, but farmer attitudes toward risk should still be included in

  1. Modeling perceptions of climatic risk in crop production

    PubMed Central

    Parker, Phillip; Aurbacher, Joachim; Högy, Petra; Dabbert, Stephan

    2017-01-01

    In agricultural production, land-use decisions are components of economic planning that result in the strategic allocation of fields. Climate variability represents an uncertainty factor in crop production. Considering yield impact, climatic influence is perceived during and evaluated at the end of crop production cycles. In practice, this information is then incorporated into planning for the upcoming season. This process contributes to attitudes toward climate-induced risk in crop production. In the literature, however, the subjective valuation of risk is modeled as a risk attitude toward variations in (monetary) outcomes. Consequently, climatic influence may be obscured by political and market influences so that risk perceptions during the production process are neglected. We present a utility concept that allows the inclusion of annual risk scores based on mid-season risk perceptions that are incorporated into field-planning decisions. This approach is exemplified and implemented for winter wheat production in the Kraichgau, a region in Southwest Germany, using the integrated bio-economic simulation model FarmActor and empirical data from the region. Survey results indicate that a profitability threshold for this crop, the level of “still-good yield” (sgy), is 69 dt ha-1 (regional mean Kraichgau sample) for a given season. This threshold governs the monitoring process and risk estimators. We tested the modeled estimators against simulation results using ten projected future weather time series for winter wheat production. The mid-season estimators generally proved to be effective. This approach can be used to improve the modeling of planning decisions by providing a more comprehensive evaluation of field-crop response to climatic changes from an economic risk point of view. The methodology further provides economic insight in an agrometeorological context where prices for crops or inputs are lacking, but farmer attitudes toward risk should still be included

  2. Urothelial cancer of the upper urinary tract: emerging biomarkers and integrative models for risk stratification.

    PubMed

    Mathieu, Romain; Vartolomei, Mihai D; Mbeutcha, Aurélie; Karakiewicz, Pierre I; Briganti, Alberto; Roupret, Morgan; Shariat, Shahrokh F

    2016-08-01

    The aim of this review was to provide an overview of current biomarkers and risk stratification models in urothelial cancer of the upper urinary tract (UTUC). A non-systematic Medline/PubMed literature search was performed using the terms "biomarkers", "preoperative models", "postoperative models", "risk stratification", together with "upper tract urothelial carcinoma". Original articles published between January 2003 and August 2015 were included based on their clinical relevance. Additional references were collected by cross-referencing the bibliography of the selected articles. Various promising predictive and prognostic biomarkers have been identified in UTUC thanks to the increasing knowledge of the different biological pathways involved in UTUC tumorigenesis. These biomarkers may help identify tumors with aggressive biology and worse outcomes. Current tools aim at predicting muscle invasive or non-organ confined disease, renal failure after radical nephroureterectomy and survival outcomes. These models are still mainly based on imaging and clinicopathological features, and none has integrated biomarkers. Risk stratification in UTUC is still suboptimal, especially in the preoperative setting due to current limitations in staging and grading. Identification of novel biomarkers and external validation of current prognostic models may help improve risk stratification to allow evidence-based counselling for kidney-sparing approaches, perioperative chemotherapy and/or risk-based surveillance. Despite growing understanding of the biology underlying UTUC, management of this disease remains difficult due to the lack of validated biomarkers and the limitations of current predictive and prognostic tools. Further efforts and collaborations are necessary to allow their integration in daily practice.

  3. A systematic review of breast cancer incidence risk prediction models with meta-analysis of their performance.

    PubMed

    Meads, Catherine; Ahmed, Ikhlaaq; Riley, Richard D

    2012-04-01

    'Gail 2' model showed the average C statistic was 0.63 (95% CI 0.59-0.67), and the expected/observed ratio of events varied considerably across studies (95% prediction interval for E/O ratio when the model was applied in practice was 0.75-1.19). There is a need for models with better predictive performance but, given the large amount of work already conducted, further improvement of existing models based on conventional risk factors is perhaps unlikely. Research to identify new risk factors with large additional predictive ability is therefore needed, alongside clearer reporting and continual validation of new models as they develop.

  4. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.

  5. Assessing Uncertainty in Risk Assessment Models (BOSC CSS meeting)

    EPA Science Inventory

    In vitro assays are increasingly being used in risk assessments. Uncertainty in assays leads to uncertainty in models used for risk assessments. This poster assesses uncertainty in the ER and AR models.

  6. Validation of Risk Assessment Models of Venous Thromboembolism in Hospitalized Medical Patients.

    PubMed

    Greene, M Todd; Spyropoulos, Alex C; Chopra, Vineet; Grant, Paul J; Kaatz, Scott; Bernstein, Steven J; Flanders, Scott A

    2016-09-01

    Patients hospitalized for acute medical illness are at increased risk for venous thromboembolism. Although risk assessment is recommended and several at-admission risk assessment models have been developed, these have not been adequately derived or externally validated. Therefore, an optimal approach to evaluate venous thromboembolism risk in medical patients is not known. We conducted an external validation study of existing venous thromboembolism risk assessment models using data collected on 63,548 hospitalized medical patients as part of the Michigan Hospital Medicine Safety (HMS) Consortium. For each patient, cumulative venous thromboembolism risk scores and risk categories were calculated. Cox regression models were used to quantify the association between venous thromboembolism events and assigned risk categories. Model discrimination was assessed using Harrell's C-index. Venous thromboembolism incidence in hospitalized medical patients is low (1%). Although existing risk assessment models demonstrate good calibration (hazard ratios for "at-risk" range 2.97-3.59), model discrimination is generally poor for all risk assessment models (C-index range 0.58-0.64). The performance of several existing risk assessment models for predicting venous thromboembolism among acutely ill, hospitalized medical patients at admission is limited. Given the low venous thromboembolism incidence in this nonsurgical patient population, careful consideration of how best to utilize existing venous thromboembolism risk assessment models is necessary, and further development and validation of novel venous thromboembolism risk assessment models for this patient population may be warranted. Published by Elsevier Inc.

  7. Lymphatic filariasis transmission risk map of India, based on a geo-environmental risk model.

    PubMed

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Subramanian, Swaminathan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-09-01

    The strategy adopted by a global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy by introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Therefore, identification of LF transmission risk areas in the entire country has become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and endemicity (historically known) of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was found to be correct in 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM developed on geographic information system (GIS) platform is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful for the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas.

  8. Flood loss model transfer: on the value of additional data

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Lüdtke, Stefan; Vogel, Kristin; Kreibich, Heidi; Thieken, Annegret; Merz, Bruno

    2017-04-01

    The transfer of models across geographical regions and flood events is a key challenge in flood loss estimation. Variations in local characteristics and continuous system changes require regional adjustments and continuous updating with current evidence. However, acquiring data on damage influencing factors is expensive and therefore assessing the value of additional data in terms of model reliability and performance improvement is of high relevance. The present study utilizes empirical flood loss data on direct damage to residential buildings available from computer aided telephone interviews that were carried out after the floods in 2002, 2005, 2006, 2010, 2011 and 2013 mainly in the Elbe and Danube catchments in Germany. Flood loss model performance is assessed for incrementally increased numbers of loss data which are differentiated according to region and flood event. Two flood loss modeling approaches are considered: (i) a multi-variable flood loss model approach using Random Forests and (ii) a uni-variable stage damage function. Both model approaches are embedded in a bootstrapping process which allows evaluating the uncertainty of model predictions. Predictive performance of both models is evaluated with regard to mean bias, mean absolute and mean squared errors, as well as hit rate and sharpness. Mean bias and mean absolute error give information about the accuracy of model predictions; mean squared error and sharpness about precision; and hit rate is an indicator of model reliability. The results of incremental, regional and temporal updating demonstrate the usefulness of additional data to improve model predictive performance and increase model reliability, particularly in a spatial-temporal transfer setting.
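
    As an illustration of the comparison described above, the sketch below fits a multi-variable Random Forest loss model and a uni-variable stage-damage function on synthetic loss records and scores both on bootstrap hold-out samples. The data, the depth-damage relationship and all parameter values are hypothetical, not the study's survey data.

```python
# Illustrative sketch (synthetic data): multi-variable Random Forest loss model
# versus a uni-variable stage-damage function, scored on bootstrap hold-outs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
water_depth = rng.uniform(0.1, 3.0, n)               # m above building floor
duration = rng.uniform(1, 14, n)                      # inundation duration, days
building_value = rng.uniform(50, 500, n)              # 1000 EUR
loss_ratio = np.clip(0.25 * np.sqrt(water_depth) + 0.01 * duration
                     + rng.normal(0, 0.05, n), 0, 1)
loss = loss_ratio * building_value                    # absolute loss, 1000 EUR
X = np.column_stack([water_depth, duration, building_value])

def stage_damage(depth, a=0.27):
    """Uni-variable stage-damage function: loss ratio from water depth only."""
    return np.clip(a * np.sqrt(depth), 0, 1)

metrics = {"rf": [], "sdf": []}
for _ in range(100):                                  # bootstrap resampling
    idx = rng.integers(0, n, n)
    oob = np.setdiff1d(np.arange(n), idx)             # held-out records
    rf = RandomForestRegressor(n_estimators=100, random_state=0)
    rf.fit(X[idx], loss[idx])
    pred_rf = rf.predict(X[oob])
    pred_sdf = stage_damage(water_depth[oob]) * building_value[oob]
    metrics["rf"].append(np.mean(np.abs(pred_rf - loss[oob])))
    metrics["sdf"].append(np.mean(np.abs(pred_sdf - loss[oob])))

print(f"mean absolute error  RF: {np.mean(metrics['rf']):.1f}  "
      f"stage-damage: {np.mean(metrics['sdf']):.1f}  (1000 EUR)")
```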

  9. Development of a Risk Prediction Model and Clinical Risk Score for Isolated Tricuspid Valve Surgery.

    PubMed

    LaPar, Damien J; Likosky, Donald S; Zhang, Min; Theurer, Patty; Fonner, C Edwin; Kern, John A; Bolling, Stephen F; Drake, Daniel H; Speir, Alan M; Rich, Jeffrey B; Kron, Irving L; Prager, Richard L; Ailawadi, Gorav

    2018-02-01

    While tricuspid valve (TV) operations remain associated with high mortality (∼8-10%), no robust prediction models exist to support clinical decision-making. We developed a preoperative clinical risk model with an easily calculable clinical risk score (CRS) to predict mortality and major morbidity after isolated TV surgery. Multi-state Society of Thoracic Surgeons database records were evaluated for 2,050 isolated TV repair and replacement operations for any etiology performed at 50 hospitals (2002-2014). Parsimonious preoperative risk prediction models were developed using multi-level mixed effects regression to estimate mortality and composite major morbidity risk. Model results were utilized to establish a novel CRS for patients undergoing TV operations. Models were evaluated for discrimination and calibration. Operative mortality and composite major morbidity rates were 9% and 42%, respectively. Final regression models performed well (both P<0.001, AUC = 0.74 and 0.76) and included preoperative factors: age, gender, stroke, hemodialysis, ejection fraction, lung disease, NYHA class, reoperation and urgent or emergency status (all P<0.05). A simple CRS from 0-10+ was highly associated (P<0.001) with incremental increases in predicted mortality and major morbidity. Predicted mortality risk ranged from 2%-34% across CRS categories, while predicted major morbidity risk ranged from 13%-71%. Mortality and major morbidity after isolated TV surgery can be predicted using preoperative patient data from the STS Adult Cardiac Database. A simple clinical risk score predicts mortality and major morbidity after isolated TV surgery. This score may facilitate perioperative counseling and identification of suitable patients for TV surgery. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  10. Risk assessment models to predict caries recurrence after oral rehabilitation under general anaesthesia: a pilot study.

    PubMed

    Lin, Yai-Tin; Kalhan, Ashish Chetan; Lin, Yng-Tzer Joseph; Kalhan, Tosha Ashish; Chou, Chein-Chin; Gao, Xiao Li; Hsu, Chin-Ying Stephen

    2018-05-08

    Oral rehabilitation under general anaesthesia (GA), commonly employed to treat high caries-risk children, has been associated with high economic and individual/family burden, besides high post-GA caries recurrence rates. As there is no caries prediction model available for paediatric GA patients, this study was performed to build caries risk assessment/prediction models using pre-GA data and to explore mid-term prognostic factors for early identification of high-risk children prone to caries relapse post-GA oral rehabilitation. Ninety-two children were identified and recruited with parental consent before oral rehabilitation under GA. Biopsychosocial data collection at baseline and the 6-month follow-up were conducted using questionnaire (Q), microbiological assessment (M) and clinical examination (C). The prediction models constructed using data collected from Q, Q + M and Q + M + C demonstrated an accuracy of 72%, 78% and 82%, respectively. Furthermore, of the 83 (90.2%) patients recalled 6 months after GA intervention, recurrent caries was identified in 54.2%, together with reduced bacterial counts, lower plaque index and increased percentage of children toothbrushing for themselves (all P < 0.05). Additionally, meal-time and toothbrushing duration were shown, through bivariate analyses, to be significant prognostic determinants for caries recurrence (both P < 0.05). Risk assessment/prediction models built using pre-GA data may be promising in identifying high-risk children prone to post-GA caries recurrence, although future internal and external validation of predictive models is warranted. © 2018 FDI World Dental Federation.

  11. Transmission of risk from parents with chronic pain to offspring: an integrative conceptual model

    PubMed Central

    Stone, Amanda L.; Wilson, Anna C.

    2017-01-01

    Offspring of parents with chronic pain are at increased risk for pain and adverse mental and physical health outcomes (Higgins et al, 2015). Although the association between chronic pain in parents and offspring has been established, few studies have addressed why or how this relation occurs. Identifying mechanisms for the transmission of risk that leads to the development of chronic pain in offspring is important for developing preventive interventions targeted to decrease risk for chronic pain and related outcomes (eg, disability and internalizing symptoms). This review presents a conceptual model for the intergenerational transmission of chronic pain from parents to offspring with the goal of setting an agenda for future research and the development of preventive interventions. Our proposed model highlights 5 potential mechanisms for the relation between parental chronic pain and pediatric chronic pain and related adverse outcomes: (1) genetics, (2) alterations in early neurobiological development, (3) pain-specific social learning, (4) general parenting and family health, and (5) exposure to a stressful environment. In addition, the model presents 3 potential moderators for the relation between parent and child chronic pain: (1) the presence of chronic pain in a second parent, (2) timing, course, and location of parental chronic pain, and (3) offspring’s characteristics (ie, sex, developmental stage, race or ethnicity, and temperament). Such a framework highlights chronic pain as inherently familial and intergenerational, opening up avenues for new models of intervention and prevention that can be family centered and include at-risk children. PMID:27380502

  12. Transmission of risk from parents with chronic pain to offspring: an integrative conceptual model.

    PubMed

    Stone, Amanda L; Wilson, Anna C

    2016-12-01

    Offspring of parents with chronic pain are at increased risk for pain and adverse mental and physical health outcomes (Higgins et al, 2015). Although the association between chronic pain in parents and offspring has been established, few studies have addressed why or how this relation occurs. Identifying mechanisms for the transmission of risk that leads to the development of chronic pain in offspring is important for developing preventive interventions targeted to decrease risk for chronic pain and related outcomes (eg, disability and internalizing symptoms). This review presents a conceptual model for the intergenerational transmission of chronic pain from parents to offspring with the goal of setting an agenda for future research and the development of preventive interventions. Our proposed model highlights 5 potential mechanisms for the relation between parental chronic pain and pediatric chronic pain and related adverse outcomes: (1) genetics, (2) alterations in early neurobiological development, (3) pain-specific social learning, (4) general parenting and family health, and (5) exposure to a stressful environment. In addition, the model presents 3 potential moderators for the relation between parent and child chronic pain: (1) the presence of chronic pain in a second parent, (2) timing, course, and location of parental chronic pain, and (3) offspring's characteristics (ie, sex, developmental stage, race or ethnicity, and temperament). Such a framework highlights chronic pain as inherently familial and intergenerational, opening up avenues for new models of intervention and prevention that can be family centered and include at-risk children.

  13. Using Set Model for Learning Addition of Integers

    ERIC Educational Resources Information Center

    Lestari, Umi Puji; Putri, Ratu Ilma Indra; Hartono, Yusuf

    2015-01-01

    This study aims to investigate how the set model can help students' understanding of addition of integers in fourth grade. The study was carried out with 23 students and a teacher of class IVC at SD Iba Palembang in January 2015. This study is design research that also promotes PMRI as the underlying design context and activity. Results showed that the…

  14. [A model list of high risk drugs].

    PubMed

    Cotrina Luque, J; Guerrero Aznar, M D; Alvarez del Vayo Benito, C; Jimenez Mesa, E; Guzman Laura, K P; Fernández Fernández, L

    2013-12-01

    «High-risk drugs» are those that have a very high «risk» of causing death or serious injury if an error occurs during their use. The Institute for Safe Medication Practices (ISMP) has prepared a high-risk drug list applicable to the general population (with no differences between the pediatric and adult population). Thus, there is a lack of information for the pediatric population. The main objective of this work is to develop a high-risk drug list adapted to the neonatal or pediatric population as a reference model for the pediatric hospital health workforce. We performed a literature search in May 2012 to identify any published lists or references in relation to pediatric and/or neonatal high-risk drugs. A total of 15 studies were found, from which 9 were selected. A model list was developed mainly based on the ISMP list, adding drugs perceived as high risk in pediatric patients and removing those whose pediatric use was anecdotal. There is no published list that suits pediatric risk management. The list of pediatric and neonatal high-risk drugs presented here could be a «reference list of high-risk drugs» for pediatric hospitals. Using this list, together with training, will help to prevent medication errors at each step of the drug supply chain (prescribing, transcribing, dispensing and administration). Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.

  15. Models for Pesticide Risk Assessment

    EPA Pesticide Factsheets

    EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environment may be exposed in risk assessment. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  16. East meets West: the influence of racial, ethnic and cultural risk factors on cardiac surgical risk model performance.

    PubMed

    Soo-Hoo, Sarah; Nemeth, Samantha; Baser, Onur; Argenziano, Michael; Kurlansky, Paul

    2018-01-01

    To explore the impact of racial and ethnic diversity on the performance of cardiac surgical risk models, the Chinese SinoSCORE was compared with the Society of Thoracic Surgeons (STS) risk model in a diverse American population. The SinoSCORE risk model was applied to 13 969 consecutive coronary artery bypass surgery patients from twelve American institutions. SinoSCORE risk factors were entered into a logistic regression to create a 'derived' SinoSCORE whose performance was compared with that of the STS risk model. Observed mortality was 1.51% (66% of that predicted by the STS model). The SinoSCORE 'low-risk' group had a mortality of 0.15%±0.04%, while the medium-risk and high-risk groups had mortalities of 0.35%±0.06% and 2.13%±0.14%, respectively. The derived SinoSCORE model had relatively good discrimination (area under the curve (AUC)=0.785) compared with that of the STS risk score (AUC=0.811; P=0.18 comparing the two). However, specific factors that were significant in the original SinoSCORE but that lacked significance in our derived model included body mass index, preoperative atrial fibrillation and chronic obstructive pulmonary disease. SinoSCORE demonstrated limited discrimination when applied to an American population. The derived SinoSCORE had a discrimination comparable with that of the STS, suggesting underlying similarities of physiological substrate undergoing surgery. However, differential influence of various risk factors suggests that there may be varying degrees of importance and interactions between risk factors. Clinicians should exercise caution when applying risk models across varying populations due to potential differences that racial, ethnic and geographic factors may play in cardiac disease and surgical outcomes.

  17. What can('t) we do with global flood risk models?

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Winsemius, Hessel

    2015-04-01

    In recent years, several global scale flood risk models have become available. Within the scientific community these have been, and are being, used to assess and map the current levels of risk faced by countries and societies. Increasingly, they are also being used to assess how that level of risk may change in the future, under scenarios of climate change and/or socioeconomic development. More and more, these 'quick and not so dirty' methods are also being used in practice, for a large range of uses and applications, and by an increasing range of practitioners and decision makers. For example, assessments can be used by: International Financing Institutes for prioritising investments in the most promising natural disaster risk reduction measures and strategies; intra-national institutes in the monitoring of progress on risk reduction activities; the (re-)insurance industry in assessing their risk portfolios and potential changes in those portfolios under climate change; by multinational companies in assessing risks to their regional investments and supply chains; and by international aid organisations for improved resource planning. However, global scale flood risk models clearly have their limits, and therefore both modellers and users need to critically address the question 'What can('t) we do with global flood risk models?'. This contribution is intended to start a dialogue between model developers, users, and decision makers to better answer this question. We will provide a number of examples of how the GLOFRIS global flood risk model has recently been used in several practical applications, and share both the positive and negative insights gained through these experiences. We wish to discuss similar experiences with other groups of modelers, users, and decision-makers, in order to better understand and harness the potential of this new generation of models, understand the differences in model approaches followed and their impacts on applicability, and develop

  18. Modelling of additive manufacturing processes: a review and classification

    NASA Astrophysics Data System (ADS)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  19. Theory-Based Cartographic Risk Model Development and Application for Home Fire Safety.

    PubMed

    Furmanek, Stephen; Lehna, Carlee; Hanchette, Carol

    There is a gap in the use of predictive risk models to identify areas at risk for home fires and burn injury. The purpose of this study was to describe the creation, validation, and application of such a model using a sample from an intervention study with parents of newborns in Jefferson County, KY, as an example. A literature search was performed to identify risk factors for home fires and burn injury in the target population. Risk factor data were obtained from the American Community Survey at the census tract level and synthesized to create a predictive cartographic risk model. Model validation was performed through correlation, regression, and Moran's I with fire incidence data from open records. Independent samples t-tests were used to examine the model in relation to geocoded participant addresses. Participant risk level for fire rate was determined, along with proximity to fire station service areas and hospitals. The model showed high and severe risk clustering in the northwest section of the county. Modeled risk was strongly correlated with fire rate; the best predictive model for fire risk contained home value (low), race (black), and non-high school graduates. Applying the model to the intervention sample, the majority of participants were at lower risk and mostly within service areas closest to a fire department and hospital. Cartographic risk models were useful in identifying areas at risk and analyzing participant risk level. The methods outlined in this study are generalizable to other public health issues.

  20. Integrating Professional and Folk Models of HIV Risk: YMSM's Perceptions of High-Risk Sex

    ERIC Educational Resources Information Center

    Kubicek, Katrina; Carpineto, Julie; McDavitt, Bryce; Weiss, George; Iverson, Ellen F.; Au, Chi-Wai; Kerrone, Dustin; Martinez, Miguel; Kipke, Michele D.

    2008-01-01

    Risks associated with HIV are well documented in research literature. Although a great deal has been written about high-risk sex, little research has been conducted to examine how young men who have sex with men (YMSM) perceive and define high-risk sexual behavior. In this study, we compare the "professional" and "folk" models of HIV risk based on…

  1. Forecasting extinction risk with nonstationary matrix models.

    PubMed

    Gotelli, Nicholas J; Ellison, Aaron M

    2006-02-01

    Matrix population growth models are standard tools for forecasting population change and for managing rare species, but they are less useful for predicting extinction risk in the face of changing environmental conditions. Deterministic models provide point estimates of lambda, the finite rate of increase, as well as measures of matrix sensitivity and elasticity. Stationary matrix models can be used to estimate extinction risk in a variable environment, but they assume that the matrix elements are randomly sampled from a stationary (i.e., non-changing) distribution. Here we outline a method for using nonstationary matrix models to construct realistic forecasts of population fluctuation in changing environments. Our method requires three pieces of data: (1) field estimates of transition matrix elements, (2) experimental data on the demographic responses of populations to altered environmental conditions, and (3) forecasting data on environmental drivers. These three pieces of data are combined to generate a series of sequential transition matrices that emulate a pattern of long-term change in environmental drivers. Realistic estimates of population persistence and extinction risk can be derived from stochastic permutations of such a model. We illustrate the steps of this analysis with data from two populations of Sarracenia purpurea growing in northern New England. Sarracenia purpurea is a perennial carnivorous plant that is potentially at risk of local extinction because of increased nitrogen deposition. Long-term monitoring records or models of environmental change can be used to generate time series of driver variables under different scenarios of changing environments. Both manipulative and natural experiments can be used to construct a linking function that describes how matrix parameters change as a function of the environmental driver. This synthetic modeling approach provides quantitative estimates of extinction probability that have an explicit mechanistic
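
    The sketch below illustrates the general idea of a nonstationary matrix projection: transition-matrix elements are tied to an environmental driver through a linking function, matrices are applied sequentially along a driver trajectory, and quasi-extinction risk is estimated from repeated stochastic runs. The two-stage matrix, the linking function and all parameter values are illustrative placeholders, not the Sarracenia parameterization.

```python
# Schematic nonstationary matrix projection: matrix elements depend on an
# environmental driver, and extinction risk is estimated over stochastic runs.
import numpy as np

rng = np.random.default_rng(1)

def transition_matrix(driver):
    """Linking function: survival and fecundity decline as the driver increases."""
    juvenile_survival = 0.35 * np.exp(-0.5 * driver)
    adult_survival = 0.80 * np.exp(-0.2 * driver)
    fecundity = 2.0 * np.exp(-0.3 * driver)
    return np.array([[0.0, fecundity],
                     [juvenile_survival, adult_survival]])

years = 50
driver_series = 0.05 * np.arange(years)        # steadily increasing driver
quasi_extinction = 5.0
n0 = np.array([50.0, 100.0])                   # initial juveniles, adults

extinct, n_runs = 0, 1000
for _ in range(n_runs):
    drivers = driver_series + rng.normal(0, 0.1, years)   # stochastic variation
    n = n0.copy()
    for d in drivers:
        n = transition_matrix(max(d, 0.0)) @ n
        if n.sum() < quasi_extinction:
            extinct += 1
            break

print(f"estimated quasi-extinction risk over {years} yr: {extinct / n_runs:.2f}")
```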

  2. CREATION OF THE MODEL ADDITIONAL PROTOCOL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houck, F.; Rosenthal, M.; Wulf, N.

    In 1991, the international nuclear nonproliferation community was dismayed to discover that the implementation of safeguards by the International Atomic Energy Agency (IAEA) under its NPT INFCIRC/153 safeguards agreement with Iraq had failed to detect Iraq's nuclear weapon program. It was now clear that ensuring that states were fulfilling their obligations under the NPT would require not just detecting diversion but also the ability to detect undeclared materials and activities. To achieve this, the IAEA initiated what would turn out to be a five-year effort to reappraise the NPT safeguards system. The effort engaged the IAEA and its Member States and led to agreement in 1997 on a new safeguards agreement, the Model Protocol Additional to the Agreement(s) between States and the International Atomic Energy Agency for the Application of Safeguards. The Model Protocol makes explicit that one IAEA goal is to provide assurance of the absence of undeclared nuclear material and activities. The Model Protocol requires an expanded declaration that identifies a State's nuclear potential, empowers the IAEA to raise questions about the correctness and completeness of the State's declaration, and, if needed, allows IAEA access to locations. The information required and the locations available for access are much broader than those provided for under INFCIRC/153. The negotiation was completed in quite a short time because it started with a relatively complete draft of an agreement prepared by the IAEA Secretariat. This paper describes how the Model Protocol was constructed and reviews key decisions that were made both during the five-year period and in the actual negotiation.

  3. Improving Disease Prediction by Incorporating Family Disease History in Risk Prediction Models with Large-Scale Genetic Data.

    PubMed

    Gim, Jungsoo; Kim, Wonji; Kwak, Soo Heon; Choi, Hosik; Park, Changyi; Park, Kyong Soo; Kwon, Sunghoon; Park, Taesung; Won, Sungho

    2017-11-01

    Despite the many successes of genome-wide association studies (GWAS), the known susceptibility variants identified by GWAS have modest effect sizes, leading to notable skepticism about the effectiveness of building a risk prediction model from large-scale genetic data. However, in contrast to genetic variants, the family history of diseases has been largely accepted as an important risk factor in clinical diagnosis and risk prediction. Nevertheless, the complicated structures of the family history of diseases have limited their application in clinical practice. Here, we developed a new method that enables incorporation of the general family history of diseases with a liability threshold model, and propose a new analysis strategy for risk prediction with penalized regression analysis that incorporates both large numbers of genetic variants and clinical risk factors. Application of our model to type 2 diabetes in the Korean population (1846 cases and 1846 controls) demonstrated that single-nucleotide polymorphisms accounted for 32.5% of the variation explained by the predicted risk scores in the test data set, and incorporation of family history led to an additional 6.3% improvement in prediction. Our results illustrate that family medical history provides valuable information on the variation of complex diseases and improves prediction performance. Copyright © 2017 by the Genetics Society of America.
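
    A minimal sketch of the general strategy (not the article's liability-threshold method): penalized logistic regression on synthetic SNP dosages with a simple family-history indicator added as a clinical covariate. All data and parameters are made up for illustration.

```python
# Penalized logistic regression combining synthetic SNP dosages with a simple
# family-history indicator; a sketch of the general strategy, not the article's method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n, p = 2000, 300
snps = rng.binomial(2, 0.3, size=(n, p)).astype(float)   # 0/1/2 allele dosages
family_history = rng.binomial(1, 0.2, n).astype(float)   # affected first-degree relative
logit = snps[:, :10] @ rng.normal(0.15, 0.05, 10) + 0.8 * family_history - 1.5
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))             # synthetic disease status

X = np.column_stack([snps, family_history])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"test AUC with SNPs + family history: {auc:.2f}")
```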

  4. Risk prediction model for knee pain in the Nottingham community: a Bayesian modelling approach.

    PubMed

    Fernandes, G S; Bhattacharya, A; McWilliams, D F; Ingham, S L; Doherty, M; Zhang, W

    2017-03-20

    Twenty-five percent of the British population over the age of 50 years experiences knee pain. Knee pain can limit physical ability and cause distress and bears significant socioeconomic costs. The objectives of this study were to develop and validate the first risk prediction model for incident knee pain in the Nottingham community and validate this internally within the Nottingham cohort and externally within the Osteoarthritis Initiative (OAI) cohort. A total of 1822 participants from the Nottingham community who were at risk for knee pain were followed for 12 years. Of this cohort, two-thirds (n = 1203) were used to develop the risk prediction model, and one-third (n = 619) were used to validate the model. Incident knee pain was defined as pain on most days for at least 1 month in the past 12 months. Predictors were age, sex, body mass index, pain elsewhere, prior knee injury and knee alignment. A Bayesian logistic regression model was used to determine the probability of an OR >1. The Hosmer-Lemeshow χ² statistic (HLS) was used for calibration, and ROC curve analysis was used for discrimination. The OAI cohort from the United States was also used to examine the performance of the model. A risk prediction model for knee pain incidence was developed using a Bayesian approach. The model had good calibration, with an HLS of 7.17 (p = 0.52) and moderate discriminative ability (ROC 0.70) in the community. Individual scenarios are given using the model. However, the model had poor calibration (HLS 5866.28, p < 0.01) and poor discriminative ability (ROC 0.54) in the OAI cohort. To our knowledge, this is the first risk prediction model for knee pain, regardless of underlying structural changes of knee osteoarthritis, in the community using a Bayesian modelling approach. The model appears to work well in a community-based population but not in individuals with a higher risk for knee osteoarthritis, and it may provide a convenient tool for use in
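
    A small sketch of the Hosmer-Lemeshow calibration check referred to above, computed on synthetic predictions: observed and expected event counts are compared across deciles of predicted risk. The data are simulated, not the Nottingham or OAI cohorts.

```python
# Hosmer-Lemeshow calibration statistic on synthetic predictions and outcomes.
import numpy as np

rng = np.random.default_rng(3)
p_pred = rng.uniform(0.02, 0.60, 1200)           # predicted knee-pain risk
y = rng.binomial(1, p_pred)                       # synthetic observed outcomes

edges = np.quantile(p_pred, np.linspace(0, 1, 11))
group = np.digitize(p_pred, edges[1:-1])          # decile group 0..9
hls = 0.0
for g in range(10):
    m = group == g
    obs, exp, n_g = y[m].sum(), p_pred[m].sum(), m.sum()
    hls += (obs - exp) ** 2 / (exp * (1 - exp / n_g))
print(f"Hosmer-Lemeshow chi-square over 10 groups: {hls:.2f}")
```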

  5. Development and Validation of a Practical Two-Step Prediction Model and Clinical Risk Score for Post-Thrombotic Syndrome.

    PubMed

    Amin, Elham E; van Kuijk, Sander M J; Joore, Manuela A; Prandoni, Paolo; Cate, Hugo Ten; Cate-Hoek, Arina J Ten

    2018-06-04

    Post-thrombotic syndrome (PTS) is a common chronic consequence of deep vein thrombosis that affects the quality of life and is associated with substantial costs. In clinical practice, it is not possible to predict the individual patient risk. We develop and validate a practical two-step prediction tool for PTS in the acute and sub-acute phase of deep vein thrombosis. Multivariable regression modelling with data from two prospective cohorts in which 479 (derivation) and 1,107 (validation) consecutive patients with objectively confirmed deep vein thrombosis of the leg, from the thrombosis outpatient clinic of Maastricht University Medical Centre, the Netherlands (derivation) and Padua University hospital in Italy (validation), were included. PTS was defined as a Villalta score of ≥ 5 at least 6 months after acute thrombosis. Variables in the baseline model in the acute phase were: age, body mass index, sex, varicose veins, history of venous thrombosis, smoking status, provoked thrombosis and thrombus location. For the secondary model, the additional variable was residual vein obstruction. Optimism-corrected area under the receiver operating characteristic curves (AUCs) were 0.71 for the baseline model and 0.60 for the secondary model. Calibration plots showed well-calibrated predictions. External validation of the derived clinical risk scores was successful: AUC, 0.66 (95% confidence interval [CI], 0.63-0.70) and 0.64 (95% CI, 0.60-0.69). Individual risk for PTS in the acute phase of deep vein thrombosis can be predicted based on readily accessible baseline clinical and demographic characteristics. The individual risk in the sub-acute phase can be predicted with limited additional clinical characteristics. Schattauer GmbH Stuttgart.

  6. A Model for Risk Analysis of Oil Tankers

    NASA Astrophysics Data System (ADS)

    Montewka, Jakub; Krata, Przemysław; Goerland, Floris; Kujala, Pentti

    2010-01-01

    The paper presents a model for risk analysis regarding marine traffic, with emphasis on the two most common types of marine accidents: collision and grounding. The focus is on oil tankers as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. Thus the model is regarded as a block-type model, consisting of blocks for estimating the probability of collision and grounding, respectively, as well as blocks for modelling the consequences of an accident. The probability of vessels colliding is assessed by means of a Minimum Distance To Collision (MDTC) based model. The model defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address the waterway crossing between Helsinki and Tallinn, where dense cross traffic is observed during certain hours. For assessment of grounding probability, a new approach is proposed, which utilizes a newly developed model in which spatial interactions between objects in different locations are recognized. A ship on a seaway and navigational obstructions may be perceived as interacting objects, and their repulsion may be modelled by a deterministic formulation. The risk of tankers running aground is assessed for an approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms and concern the costs of an oil spill, based on statistics of compensations claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by parties involved.
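
    A minimal sketch of the block-type risk formula described above, combining accident probabilities with monetary consequences; the probabilities and spill-cost figures are placeholders, not values from the study.

```python
# Block-type risk calculation: risk = sum over accident types of
# probability x consequence. All numbers below are illustrative placeholders.

def expected_annual_risk(p_collision, p_grounding,
                         cost_collision_spill, cost_grounding_spill):
    """Expected annual risk (EUR/yr) from collision and grounding blocks."""
    return p_collision * cost_collision_spill + p_grounding * cost_grounding_spill

risk = expected_annual_risk(
    p_collision=2.0e-3,          # collisions per tanker-year (placeholder)
    p_grounding=1.5e-3,          # groundings per tanker-year (placeholder)
    cost_collision_spill=40e6,   # EUR per collision spill (placeholder)
    cost_grounding_spill=25e6,   # EUR per grounding spill (placeholder)
)
print(f"expected annual risk: {risk / 1e3:.0f} kEUR")
```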

  7. An original traffic additional emission model and numerical simulation on a signalized road

    NASA Astrophysics Data System (ADS)

    Zhu, Wen-Xing; Zhang, Jing-Yu

    2017-02-01

    Based on the VSP (Vehicle Specific Power) model, real traffic emissions were theoretically classified into two parts: basic emission and additional emission. An original additional emission model was presented to calculate the vehicle's emission due to signal control effects. A car-following model was developed and used to describe traffic behavior, including cruising, accelerating, decelerating and idling at a signalized intersection. Simulations were conducted under two situations: a single intersection and two adjacent intersections with their respective control policies. Results are in good agreement with the theoretical analysis. It is also proved that the additional emission model may be used to design the signal control policy in our modern traffic system to solve serious environmental problems.
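
    The sketch below illustrates the basic/additional split using a VSP-style calculation for one stop-and-go cycle at a signal versus cruising the same distance. The VSP expression uses commonly cited light-duty coefficients, and the emission-rate lookup is a placeholder, not the model of the paper.

```python
# Split emissions into a "basic" part (steady cruising) and an "additional"
# part induced by a signal stop (decelerate, idle, accelerate).
import numpy as np

def vsp(v, a, grade=0.0):
    """Vehicle Specific Power (kW/t) for speed v (m/s) and acceleration a (m/s^2)."""
    return v * (1.1 * a + 9.81 * grade + 0.132) + 0.000302 * v ** 3

def emission_rate(vsp_value):
    """Placeholder emission rate (g/s) increasing with VSP."""
    return np.interp(vsp_value, [-10.0, 0.0, 10.0, 20.0, 30.0],
                     [1.0, 1.5, 3.0, 5.0, 7.5])

dt, v_free = 1.0, 13.9                                    # 1 s steps, 50 km/h
v_signal = np.concatenate([np.linspace(v_free, 0.0, 15),  # decelerate to red
                           np.zeros(20),                   # idle at the signal
                           np.linspace(0.0, v_free, 25)])  # accelerate on green
a_signal = np.gradient(v_signal, dt)

total = (emission_rate(vsp(v_signal, a_signal)) * dt).sum()
distance = v_signal.sum() * dt                             # metres covered
basic = emission_rate(vsp(v_free, 0.0)) * (distance / v_free)
print(f"basic {basic:.0f} g, additional {total - basic:.0f} g for one stop cycle")
```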

  8. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  9. Environmental fate and exposure models: advances and challenges in 21st century chemical risk assessment.

    PubMed

    Di Guardo, Antonio; Gouin, Todd; MacLeod, Matthew; Scheringer, Martin

    2018-01-24

    Environmental fate and exposure models are a powerful means to integrate information on chemicals, their partitioning and degradation behaviour, the environmental scenario and the emissions in order to compile a picture of chemical distribution and fluxes in the multimedia environment. A 1995 pioneering book, resulting from a series of workshops among model developers and users, reported the main advantages and identified needs for research in the field of multimedia fate models. Considerable efforts were devoted to their improvement in the past 25 years and many aspects were refined; notably the inclusion of nanomaterials among the modelled substances, the development of models at different spatial and temporal scales, the estimation of chemical properties and emission data, the incorporation of additional environmental media and processes, the integration of sensitivity and uncertainty analysis in the simulations. However, some challenging issues remain and require research efforts and attention: the need for methods to estimate partition coefficients for polar and ionizable chemicals in the environment, a better description of bioavailability in different environments, as well as the need to inject more ecological realism into exposure predictions to account for the diversity of ecosystem structures and functions in risk assessment. Finally, to transfer new scientific developments into the realm of regulatory risk assessment, we propose the formation of expert groups that compare, discuss and recommend model modifications and updates and help develop practical tools for risk assessment.

  10. Claims-based risk model for first severe COPD exacerbation.

    PubMed

    Stanford, Richard H; Nag, Arpita; Mapel, Douglas W; Lee, Todd A; Rosiello, Richard; Schatz, Michael; Vekeman, Francis; Gauthier-Loiselle, Marjolaine; Merrigan, J F Philip; Duh, Mei Sheng

    2018-02-01

    To develop and validate a predictive model for first severe chronic obstructive pulmonary disease (COPD) exacerbation using health insurance claims data and to validate the risk measure of controller medication to total COPD treatment (controller and rescue) ratio (CTR). A predictive model was developed and validated in 2 managed care databases: Truven Health MarketScan database and Reliant Medical Group database. This secondary analysis assessed risk factors, including CTR, during the baseline period (Year 1) to predict risk of severe exacerbation in the at-risk period (Year 2). Patients with COPD who were 40 years or older and who had at least 1 COPD medication dispensed during the year following COPD diagnosis were included. Subjects with severe exacerbations in the baseline year were excluded. Risk factors in the baseline period were included as potential predictors in multivariate analysis. Performance was evaluated using C-statistics. The analysis included 223,824 patients. The greatest risk factors for first severe exacerbation were advanced age, chronic oxygen therapy usage, COPD diagnosis type, dispensing of 4 or more canisters of rescue medication, and having 2 or more moderate exacerbations. A CTR of 0.3 or greater was associated with a 14% lower risk of severe exacerbation. The model performed well with C-statistics, ranging from 0.711 to 0.714. This claims-based risk model can predict the likelihood of first severe COPD exacerbation. The CTR could also potentially be used to target populations at greatest risk for severe exacerbations. This could be relevant for providers and payers in approaches to prevent severe exacerbations and reduce costs.
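
    A small sketch of the controller-to-total-treatment ratio (CTR) described above; the dispensing counts are hypothetical, and the 0.3 cut-point follows the abstract.

```python
# Controller-to-total-treatment ratio (CTR) over a baseline year of dispensings.

def controller_to_total_ratio(controller_fills, rescue_fills):
    """CTR = controller fills / (controller + rescue fills)."""
    total = controller_fills + rescue_fills
    return controller_fills / total if total else 0.0

ctr = controller_to_total_ratio(controller_fills=4, rescue_fills=6)
print(f"CTR = {ctr:.2f}",
      "-> lower-risk stratum" if ctr >= 0.3 else "-> higher-risk stratum")
```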

  11. Probability based models for estimation of wildfire risk

    Treesearch

    Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit

    2004-01-01

    We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
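
    The three probabilities combine multiplicatively, as in the sketch below; the cell-level values and the hypothetical region size are illustrative placeholders.

```python
# Three-probability decomposition for a single 1 km^2-day cell (placeholder values).
p_ignition = 0.004                              # P(fire occurrence in the cell-day)
p_large_given_ignition = 0.05                   # P(large fire | ignition)
p_large = p_ignition * p_large_given_ignition   # unconditional P(large fire)

cells, days = 1000, 180                         # hypothetical region and fire season
expected_large_fires = p_large * cells * days
print(f"P(large fire per cell-day) = {p_large:.1e}; "
      f"expected large fires over the season: {expected_large_fires:.0f}")
```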

  12. Research on Capacity Addition using Market Model with Transmission Congestion under Competitive Environment

    NASA Astrophysics Data System (ADS)

    Katsura, Yasufumi; Attaviriyanupap, Pathom; Kataoka, Yoshihiko

    In this research, the fundamental premises for deregulation of the electric power industry are reevaluated. The authors develop a simple model to represent a wholesale electricity market with a highly congested network. The model is developed by simplifying the power system and market of the New York ISO, based on available 2004 New York ISO data with some estimation. Based on the developed model and historical construction cost data, the economic impact of transmission line additions on market participants and the impact of deregulation on power plant additions in a market with transmission congestion are studied. Simulation results show that market signals may fail to facilitate proper capacity additions and may result in an undesirable cycle of over-construction and under-construction of capacity.

  13. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
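
    A simplified sketch in the spirit of the approach (not the article's full Bayesian framework): default-probability estimates from several rating agencies are pooled on the log-odds scale, weighted by a precision reflecting each agency's historical accuracy. All numbers are made up.

```python
# Precision-weighted pooling of default-probability estimates (illustrative only).
import numpy as np

def pooled_default_probability(estimates, precisions):
    """Pool probabilities on the log-odds scale, weighted by assumed precisions."""
    estimates = np.asarray(estimates, dtype=float)
    logits = np.log(estimates / (1 - estimates))
    pooled_logit = np.average(logits, weights=precisions)
    return 1 / (1 + np.exp(-pooled_logit))

p = pooled_default_probability(estimates=[0.020, 0.035, 0.015],   # agency estimates
                               precisions=[4.0, 1.5, 2.5])         # historical accuracy
print(f"pooled one-year default probability: {p:.3f}")
```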

  14. Avian collision risk models for wind energy impact assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masden, E.A., E-mail: elizabeth.masden@uhi.ac.uk; Cook, A.S.C.P.

    2016-01-15

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components e.g. the tower, angle of approach and uncertainty. 10 models were cited in the literature and of these, all included a probability of collision of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. 7 out of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision and these were used by others. 6 of the 10 models were deterministic and included the most frequently used models in the UK, with only 4 including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.
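
    A generic sketch of the structure shared by many collision risk models: bird passages through the rotor-swept area, a per-passage collision probability, and an avoidance correction (the avoidance term is an assumption added here for illustration). This is not any specific published CRM, and all values are placeholders.

```python
# Generic collision-risk structure: flux x per-passage probability x (1 - avoidance).

def expected_collisions(passages_through_rotor, p_collision_per_passage,
                        avoidance_rate):
    """Expected collisions per turbine over the period covered by the flux estimate."""
    return passages_through_rotor * p_collision_per_passage * (1 - avoidance_rate)

n = expected_collisions(passages_through_rotor=2_000,     # bird passages per year
                        p_collision_per_passage=0.05,      # geometric transit probability
                        avoidance_rate=0.98)                # behavioural avoidance (assumed)
print(f"expected collisions per turbine-year: {n:.1f}")
```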

  15. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.

  16. Creation of mortality risk charts using 123I meta-iodobenzylguanidine heart-to-mediastinum ratio in patients with heart failure: 2- and 5-year risk models.

    PubMed

    Nakajima, Kenichi; Nakata, Tomoaki; Matsuo, Shinro; Jacobson, Arnold F

    2016-10-01

    (123)I meta-iodobenzylguanidine (MIBG) imaging has been extensively used for prognostication in patients with chronic heart failure (CHF). The purpose of this study was to create mortality risk charts for short-term (2 years) and long-term (5 years) prediction of cardiac mortality. Using a pooled database of 1322 CHF patients, multivariate analysis, including (123)I-MIBG late heart-to-mediastinum ratio (HMR), left ventricular ejection fraction (LVEF), and clinical factors, was performed to determine optimal variables for the prediction of 2- and 5-year mortality risk using subsets of the patients (n = 1280 and 933, respectively). Multivariate logistic regression analysis was performed to create risk charts. Cardiac mortality was 10 and 22% for the sub-population of 2- and 5-year analyses. A four-parameter multivariate logistic regression model including age, New York Heart Association (NYHA) functional class, LVEF, and HMR was used. Annualized mortality rate was <1% in patients with NYHA Class I-II and HMR ≥ 2.0, irrespective of age and LVEF. In patients with NYHA Class III-IV, mortality rate was 4-6 times higher for HMR < 1.40 compared with HMR ≥ 2.0 in all LVEF classes. Among the subset of patients with b-type natriuretic peptide (BNP) results (n = 491 and 359 for 2- and 5-year models, respectively), the 5-year model showed incremental value of HMR in addition to BNP. Both 2- and 5-year risk prediction models with (123)I-MIBG HMR can be used to identify low-risk as well as high-risk patients, which can be effective for further risk stratification of CHF patients even when BNP is available. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Cardiology.

  17. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
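
    A Monte Carlo sketch of the failure probability defined above: the chance that a risk process crosses a time-dependent critical level within a finite horizon. A drifting random walk stands in for the article's general risk process; all parameters are illustrative.

```python
# Monte Carlo estimate of a finite-time failure probability with a
# time-dependent critical level; a drifting random walk is the stand-in process.
import numpy as np

rng = np.random.default_rng(7)
horizon, n_paths, dt = 10.0, 10_000, 0.02
steps = int(horizon / dt)
times = dt * np.arange(1, steps + 1)
critical_level = 5.0 + 0.2 * times               # time-dependent critical risk level

increments = 0.3 * dt + 0.8 * np.sqrt(dt) * rng.standard_normal((n_paths, steps))
paths = np.cumsum(increments, axis=1)
failed = (paths >= critical_level).any(axis=1)   # crossed the level before the horizon
print(f"estimated finite-time failure probability: {failed.mean():.3f}")
```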

  18. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    At the request of NASA, the National Research Council's (NRC's) Committee for Evaluation of Space Radiation Cancer Risk Model reviewed a number of changes that NASA proposes to make to its model for estimating the risk of radiation-induced cancer in astronauts. The NASA model in current use was last updated in 2005, and the proposed model would incorporate recent research directed at improving the quantification and understanding of the health risks posed by the space radiation environment. NASA's proposed model is defined by the 2011 NASA report Space Radiation Cancer Risk Projections and Uncertainties--2010. The committee's evaluation is based primarily on this source, which is referred to hereafter as the 2011 NASA report, with mention of specific sections or tables. The overall process for estimating cancer risks due to low linear energy transfer (LET) radiation exposure has been fully described in reports by a number of organizations. The approaches described in the reports from all of these expert groups are quite similar. NASA's proposed space radiation cancer risk assessment model calculates, as its main output, age- and gender-specific risk of exposure-induced death (REID) for use in the estimation of mission and astronaut-specific cancer risk. The model also calculates the associated uncertainties in REID. The general approach for estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements. However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as

  19. How TK-TD and population models for aquatic macrophytes could support the risk assessment for plant protection products.

    PubMed

    Hommen, Udo; Schmitt, Walter; Heine, Simon; Brock, Theo Cm; Duquesne, Sabine; Manson, Phil; Meregalli, Giovanna; Ochoa-Acuña, Hugo; van Vliet, Peter; Arts, Gertie

    2016-01-01

    This case study of the Society of Environmental Toxicology and Chemistry (SETAC) workshop MODELINK demonstrates the potential use of mechanistic effects models for macrophytes to extrapolate from effects of a plant protection product observed in laboratory tests to effects resulting from dynamic exposure on macrophyte populations in edge-of-field water bodies. A standard European Union (EU) risk assessment for an example herbicide based on macrophyte laboratory tests indicated risks for several exposure scenarios. Three of these scenarios are further analyzed using effect models for 2 aquatic macrophytes, the free-floating standard test species Lemna sp., and the sediment-rooted submerged additional standard test species Myriophyllum spicatum. Both models include a toxicokinetic (TK) part, describing uptake and elimination of the toxicant, a toxicodynamic (TD) part, describing the internal concentration-response function for growth inhibition, and a description of biomass growth as a function of environmental factors to allow simulating seasonal dynamics. The TK-TD models are calibrated and tested using laboratory tests, whereas the growth models were assumed to be fit for purpose based on comparisons of predictions with typical growth patterns observed in the field. For the risk assessment, biomass dynamics are predicted for the control situation and for several exposure levels. Based on specific protection goals for macrophytes, preliminary example decision criteria are suggested for evaluating the model outputs. The models refined the risk indicated by lower tier testing for 2 exposure scenarios, while confirming the risk for the third. Uncertainties related to the experimental and the modeling approaches and their application in the risk assessment are discussed. Based on this case study and the assumption that the models prove suitable for risk assessment once fully evaluated, we recommend that 1) ecological scenarios be developed that are also
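
    To make the TK-TD structure described above concrete, the sketch below couples a one-compartment toxicokinetic model to a log-logistic growth-inhibition term and integrates macrophyte biomass under a pulsed exposure. All rate constants, the internal EC50, and the exposure profile are illustrative assumptions; the calibrated Lemna and Myriophyllum models used in the case study are more detailed.

    ```python
    import numpy as np

    # Hypothetical parameter values; calibrated TK-TD parameters would come
    # from laboratory tests such as those described in the study.
    K_IN, K_OUT = 0.5, 0.2      # uptake / elimination rate constants (1/day)
    EC50, SLOPE = 10.0, 2.0     # internal concentration-response (log-logistic)
    R_GROWTH = 0.15             # intrinsic relative growth rate (1/day)

    def simulate(exposure, days=60, dt=0.1, biomass0=1.0):
        """Euler integration of one-compartment TK coupled to log-logistic
        growth inhibition (TD) under a dynamic external exposure profile."""
        c_int, biomass = 0.0, biomass0
        for i in range(int(days / dt)):
            t = i * dt
            c_int += (K_IN * exposure(t) - K_OUT * c_int) * dt           # TK
            inhibition = c_int**SLOPE / (EC50**SLOPE + c_int**SLOPE)     # TD
            biomass += R_GROWTH * biomass * (1.0 - inhibition) * dt      # growth
        return biomass

    pulse = lambda t: 20.0 if 10 <= t <= 15 else 0.0   # edge-of-field pulse exposure
    print("Biomass, control:", round(simulate(lambda t: 0.0), 2))
    print("Biomass, exposed:", round(simulate(pulse), 2))
    ```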

  20. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part is built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  1. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    NASA Astrophysics Data System (ADS)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They, therefore, require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems with an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  2. On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis.

    PubMed

    Li, Bing; Chun, Hyonho; Zhao, Hongyu

    2014-09-01

    We introduce a nonparametric method for estimating non-Gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use one-dimensional kernels regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the Gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the Gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis.

  3. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop control of the process, implications for local research efforts, and implications for local modeling efforts.

  4. On an additive partial correlation operator and nonparametric estimation of graphical models.

    PubMed

    Lee, Kuang-Yao; Li, Bing; Zhao, Hongyu

    2016-09-01

    We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance.

  5. On an additive partial correlation operator and nonparametric estimation of graphical models

    PubMed Central

    Li, Bing; Zhao, Hongyu

    2016-01-01

    We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance. PMID:29422689

  6. Measuring the coupled risks: A copula-based CVaR model

    NASA Astrophysics Data System (ADS)

    He, Xubiao; Gong, Pu

    2009-01-01

    Integrated risk management for financial institutions requires an approach for aggregating risk types (such as market and credit) whose distributional shapes vary considerably. Financial institutions often ignore the coupling between risks and thus underestimate their overall financial risk. We constructed a copula-based Conditional Value-at-Risk (CVaR) model for market and credit risks. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks, such as skewness and fat tails, while allowing for a rich dependence structure. Finally, numerical simulation is used to implement the model. Our results indicate that the coupled risks for a listed company's stock may be undervalued if credit risk is ignored, especially for companies with poor credit quality.
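
    A minimal sketch of the copula-plus-CVaR idea, assuming a Gaussian copula with fat-tailed and skewed marginals; the correlation, marginal distributions, and scales are invented for illustration and are not the model or data used in the study.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n, rho = 200_000, 0.4                      # assumed market-credit dependence

    # Gaussian copula: correlated uniforms obtained from a bivariate normal.
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    u = stats.norm.cdf(z)

    # Illustrative marginals: heavy-tailed market loss, skewed credit loss.
    market_loss = stats.t.ppf(u[:, 0], df=4) * 1.0e6
    credit_loss = stats.lognorm.ppf(u[:, 1], s=0.9) * 0.5e6
    total_loss = market_loss + credit_loss

    alpha = 0.99
    var = np.quantile(total_loss, alpha)          # Value-at-Risk
    cvar = total_loss[total_loss >= var].mean()   # CVaR (expected shortfall)
    print(f"VaR(99%) = {var:,.0f}   CVaR(99%) = {cvar:,.0f}")
    ```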

  7. A Process Model for Assessing Adolescent Risk for Suicide.

    ERIC Educational Resources Information Center

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level of risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  8. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data.

    PubMed

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2013-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, often there is prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while keeping the asymptotic variance the same as that of the unguided estimator. We examine the performance of our method via a simulation study and demonstrate our method by applying it to a real data set on mergers and acquisitions.
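
    A simplified, identity-link sketch of the guided two-step procedure: fit a parametric guide, smooth the remainder nonparametrically, and add the trend back. The quadratic guide, the Nadaraya-Watson smoother, and the simulated data are illustrative assumptions; the paper itself works with generalized additive models and a link function.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated data with a nonlinear covariate effect.
    x = rng.uniform(-3, 3, 400)
    y = np.sin(x) + 0.5 * x + rng.normal(0, 0.3, x.size)

    # Step 1: parametric guide (here a quadratic, e.g. suggested by a pilot study).
    guide_coef = np.polyfit(x, y, deg=2)

    # Step 2: nonparametric fit of the remainder with a Nadaraya-Watson smoother.
    def nw_smooth(x0, x, r, bandwidth=0.5):
        w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / bandwidth) ** 2)
        return (w * r).sum(axis=1) / w.sum(axis=1)

    remainder = y - np.polyval(guide_coef, x)
    grid = np.linspace(-3, 3, 50)

    # Final guided estimate: parametric trend plus smoothed remainder.
    fitted = np.polyval(guide_coef, grid) + nw_smooth(grid, x, remainder)
    print(np.round(fitted[:5], 3))
    ```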

  9. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data

    PubMed Central

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2012-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, often there is prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while keeping the asymptotic variance the same as that of the unguided estimator. We examine the performance of our method via a simulation study and demonstrate our method by applying it to a real data set on mergers and acquisitions. PMID:23645976

  10. Managing risks in the fisheries supply chain using House of Risk Framework (HOR) and Interpretive Structural Modeling (ISM)

    NASA Astrophysics Data System (ADS)

    Nguyen, T. L. T.; Tran, T. T.; Huynh, T. P.; Ho, T. K. D.; Le, A. T.; Do, T. K. H.

    2018-04-01

    The fishery industry is one of the sectors that contributes significantly to the development of the Vietnamese economy. In recent years, however, managing the performance of the fishery supply chain as a whole has proved difficult. In this paper, a framework for supply chain risk management (SCRM) is proposed. Initially, all activities are mapped using the Supply Chain Operations Reference (SCOR) model. Next, risks are ranked using the House of Risk framework. Furthermore, interpretive structural modeling (ISM) is used to identify inter-relationships among supply chain risks and to visualize the risks according to their levels. For illustration, the model has been tested in several case studies with fishery companies in Can Tho, Mekong Delta. This study identifies 22 risk events and 20 risk agents across the supply chain. The resulting risk priorities could be used in a further House of Risk phase with proactive actions in future studies.

  11. Flood risk (d)evolution: Disentangling key drivers of flood risk change with a retro-model experiment.

    PubMed

    Zischg, Andreas Paul; Hofer, Patrick; Mosimann, Markus; Röthlisberger, Veronika; Ramirez, Jorge A; Keiler, Margreth; Weingartner, Rolf

    2018-05-19

    Flood risks are dynamically changing over time. Over decades and centuries, the main drivers for flood risk change are influenced either by perturbations or slow alterations in the natural environment or, more importantly, by socio-economic development and human interventions. However, changes in the natural and human environment are intertwined. Thus, the analysis of the main drivers for flood risk changes requires a disentangling of the individual risk components. Here, we present a method for isolating the individual effects of selected drivers of change and selected flood risk management options based on a model experiment. In contrast to purely synthetic model experiments, we built our analyses upon a retro-model consisting of several spatio-temporal stages of river morphology and settlement structure. The main advantage of this approach is that the overall long-term dynamics are known and do not have to be assumed. We used this model setup to analyse the temporal evolution of the flood risk, for an ex-post evaluation of the key drivers of change, and for analysing possible alternative pathways for flood risk evolution under different governance settings. We showed that in the study region the construction of lateral levees and the consecutive river incision are the main drivers for decreasing flood risks over the last century. A rebound effect in flood risk can be observed following an increase in settlements since the 1960s. This effect is not as relevant as the river engineering measures, but it will become increasingly relevant in the future with continued socio-economic growth. The presented approach could provide a methodological framework for studying pathways for future flood risk evolvement and for the formulation of narratives for adapting governmental flood risk strategies to the spatio-temporal dynamics in the built environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Comparing predictions of extinction risk using models and subjective judgement

    NASA Astrophysics Data System (ADS)

    McCarthy, Michael A.; Keith, David; Tietjen, Justine; Burgman, Mark A.; Maunder, Mark; Master, Larry; Brook, Barry W.; Mace, Georgina; Possingham, Hugh P.; Medellin, Rodrigo; Andelman, Sandy; Regan, Helen; Regan, Tracey; Ruckelshaus, Mary

    2004-10-01

    Models of population dynamics are commonly used to predict risks in ecology, particularly risks of population decline. There is often considerable uncertainty associated with these predictions. However, alternatives to predictions based on population models have not been assessed. We used simulation models of hypothetical species to generate the kinds of data that might typically be available to ecologists and then invited other researchers to predict risks of population declines using these data. The accuracy of the predictions was assessed by comparison with the forecasts of the original model. The researchers used either population models or subjective judgement to make their predictions. Predictions made using models were only slightly more accurate than subjective judgements of risk. However, predictions using models tended to be unbiased, while subjective judgements were biased towards over-estimation. Psychology literature suggests that the bias of subjective judgements is likely to vary somewhat unpredictably among people, depending on their stake in the outcome. This will make subjective predictions more uncertain and less transparent than those based on models.

  13. Comparative analysis of bleeding risk by the location and shape of arachnoid cysts: a finite element model analysis.

    PubMed

    Lee, Chang-Hyun; Han, In Seok; Lee, Ji Yeoun; Phi, Ji Hoon; Kim, Seung-Ki; Kim, Young-Eun; Wang, Kyu-Chang

    2017-01-01

    Although arachnoid cysts (ACs) are observed in various locations, it is mainly sylvian ACs that are regarded as being associated with bleeding. The reason for this selective association of sylvian ACs with bleeding is not well understood. This study investigates the effect of the location and shape of ACs on the risk of bleeding. A previously developed finite element model of the head/brain was modified to create models of sylvian, suprasellar, and posterior fossa ACs. A spherical AC was placed at each location to compare the effect of AC location. Bowl-shaped and oval-shaped AC models were developed to compare the effect of shape. The shear force on the spot-weld elements (SFSW) was measured between the dura and the outer wall of the ACs, or the comparable arachnoid membrane in the normal model. All AC models revealed higher SFSW than comparable normal models. By location, the sylvian AC displayed the highest SFSW for frontal and lateral impacts. By shape, small outer wall AC models showed higher SFSW than large wall models in the sylvian area, and lower SFSW than large ones in the posterior fossa. In regression analysis, the presence of an AC was the only independent risk factor for bleeding. The bleeding mechanism of ACs is very complex, and the risk quantification failed to show a significant role of the location and shape of ACs. The presence of an AC increases shear force under impact conditions and may be a risk factor for bleeding, whereas a sylvian location may not add further risk of AC bleeding.

  14. Efficient GIS-based model-driven method for flood risk management and its application in central China

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhou, J.; Song, L.; Zou, Q.; Guo, J.; Wang, Y.

    2014-02-01

    In recent years, an important development in flood management has been the focal shift from flood protection towards flood risk management. This change greatly promoted the progress of flood control research in a multidisciplinary way. Moreover, given the growing complexity and uncertainty in many decision situations of flood risk management, traditional methods, e.g., tight-coupling integration of one or more quantitative models, are not enough to provide decision support for managers. Within this context, this paper presents a beneficial methodological framework to enhance the effectiveness of decision support systems, through the dynamic adaptation of support regarding the needs of the decision-maker. In addition, we illustrate a loose-coupling technical prototype for integrating heterogeneous elements, such as multi-source data, multidisciplinary models, GIS tools and existing systems. The main innovation is the application of model-driven concepts, which put the system in a state of continuous iterative optimization. We define the new system as a model-driven decision support system (MDSS). Two characteristics that differentiate the MDSS are as follows: (1) it is made accessible to non-technical specialists; and (2) it has a higher level of adaptability and compatibility. Furthermore, the MDSS was employed to manage the flood risk in the Jingjiang flood diversion area, located in central China near the Yangtze River. Compared with traditional solutions, we believe that this model-driven method is efficient, adaptable and flexible, and thus has bright prospects of application for comprehensive flood risk management.

  15. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    NASA Astrophysics Data System (ADS)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
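
    A toy sketch of the probabilistic-sampling side of such an assessment: asteroid characteristics are sampled from assumed ranges, converted to impact energy, and mapped to an affected population through a placeholder damage-scaling relation. The parameter ranges, scaling law, and population density are invented for illustration and do not represent the ERA team's physics models.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Sampled asteroid characteristics (illustrative ranges only).
    diameter = rng.uniform(20, 300, n)               # m
    density = rng.uniform(1500, 3500, n)             # kg/m^3
    velocity = rng.uniform(12_000, 25_000, n)        # m/s

    mass = density * (np.pi / 6.0) * diameter**3     # spherical body, kg
    energy_mt = 0.5 * mass * velocity**2 / 4.184e15  # kinetic energy, megatons TNT

    # Placeholder damage scaling: affected radius grows as the cube root of energy.
    damage_radius_km = 2.0 * energy_mt ** (1.0 / 3.0)
    pop_density = 50.0                               # persons/km^2, assumed average
    affected = pop_density * np.pi * damage_radius_km**2

    for q in (0.5, 0.9, 0.99):
        print(f"{q:.0%} quantile of affected population: {np.quantile(affected, q):,.0f}")
    ```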

  16. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of

  17. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  18. The additive effects of depressive symptoms and polysubstance use on HIV risk among gay, bisexual, and other men who have sex with men.

    PubMed

    Card, Kiffer G; Lachowsky, Nathan J; Armstrong, Heather L; Cui, Zishan; Wang, Lu; Sereda, Paul; Jollimore, Jody; Patterson, Thomas L; Corneil, Trevor; Hogg, Robert S; Roth, Eric A; Moore, David M

    2018-07-01

    Among gay, bisexual, and other men who have sex with men (GBM), collinearity between polysubstance use and mental health concerns has obscured their combined effects on HIV risk with multivariable results often highlighting only one or the other. We used mediation and moderation analyses to examine the effects of polysubstance use and depressive symptoms on high-risk sex (i.e., condomless anal sex with serodiscordant/unknown status partner) in a sample of sexually-active GBM, aged ≥16 years, recruited in Metro Vancouver using respondent driven sampling. Hospital Anxiety and Depression Scale scores assessed mental health. Alcohol Use Disorder Identification Test scores assessed alcohol disorders. Poly-use of multiple drug types (e.g., stimulants, sedatives, opiates, hallucinogens) was assessed over the previous six months. Among 719 predominantly white (68.0%), gay-identified (80.7%) GBM, alcohol use was not associated with increased prevalence of high-risk sex. Controlling for demographic factors and partner number, an interaction between polysubstance use and depressive symptoms revealed that the combined effects were additively associated with increased odds for high-risk sex. Mediation models showed that polysubstance use partially mediated the relationship between depressive symptoms and high-risk sex. An interaction effect between polysubstance use (defined by using 3 or more substances in the past six months) and depressive symptoms (defined by HADS scores) revealed that the combination of these factors was associated with increased risk for high-risk sex - supporting a syndemic understanding of the production of HIV risk. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Predictive Modeling of Risk Factors and Complications of Cataract Surgery

    PubMed Central

    Gaskin, Gregory L; Pershing, Suzann; Cole, Tyler S; Shah, Nigam H

    2016-01-01

    Purpose To quantify the relationship between aggregated preoperative risk factors and cataract surgery complications, as well as to build a model predicting outcomes at the individual level, given a constellation of demographic, baseline, preoperative, and intraoperative patient characteristics. Setting Stanford Hospital and Clinics between 1994 and 2013. Design Retrospective cohort study. Methods Patients aged 40 or older who received cataract surgery between 1994 and 2013. Risk factors, complications, and demographic information were extracted from the Electronic Health Record (EHR), based on International Classification of Diseases, 9th edition (ICD-9) codes, Current Procedural Terminology (CPT) codes, drug prescription information, and text data mining using natural language processing. We used a bootstrapped least absolute shrinkage and selection operator (LASSO) model to identify highly-predictive variables. We built random forest classifiers for each complication to create predictive models. Results Our data corroborated existing literature on postoperative complications, including the association of intraoperative complications, complex cataract surgery, black race, and/or prior eye surgery with an increased risk of any postoperative complications. We also found a number of other, less well-described risk factors, including systemic diabetes mellitus, young age (<60 years old), and hyperopia as risk factors for complex cataract surgery and intra- and post-operative complications. Our predictive models based on aggregated risk factors outperformed existing published models. Conclusions The constellations of risk factors and complications described here can guide new avenues of research and provide specific, personalized risk assessment for a patient considering cataract surgery. The predictive capacity of our models can enable risk stratification of patients, which has utility as a teaching tool as well as informing quality/value-based reimbursements. PMID:26692059

  20. WRF-based fire risk modelling and evaluation for years 2010 and 2012 in Poland

    NASA Astrophysics Data System (ADS)

    Stec, Magdalena; Szymanowski, Mariusz; Kryza, Maciej

    2016-04-01

    Wildfires are one of the main disturbances of forested, seminatural and agricultural ecosystems. They generate significant economic loss, especially in forest management and agriculture. Forest fire risk modeling is therefore essential, e.g. for forestry administration. In August 2015 a new method of forest fire risk forecasting entered into force in Poland. The method predicts a fire risk level on a 4-degree scale (0 - no risk, 3 - highest risk) and consists of a set of linearized regression equations. Meteorological variables are used as predictors in the regression equations: air temperature, relative humidity, average wind speed, cloudiness and rainfall. The equations also include pine litter humidity as a measure of potential fuel characteristics. All these parameters are measured routinely in Poland at 42 basic and 94 auxiliary sites. The fire risk level is estimated for the current day (based on morning measurements) or the next day (based on midday measurements). The entire country is divided into 42 prognostic zones, and the fire risk level for each zone is taken from the closest measuring site. The first goal of this work is to assess whether the measurements needed for fire risk forecasting may be replaced by data from a mesoscale meteorological model. Additionally, the use of a meteorological model would allow a much more realistic spatial differentiation of the weather elements determining the fire risk level to be taken into account, instead of discrete point measurements. Meteorological data have been calculated using the Weather Research and Forecasting (WRF) model. For the purpose of this study the WRF model is run in reanalysis mode, allowing all required meteorological data to be estimated on a 5-kilometer grid. The only parameter that cannot be directly calculated using WRF is the litter humidity, which has been estimated using the empirical formula developed by Sakowska (2007). The experiments are carried out for two selected years: 2010 and 2012. The

  1. Predicting the occurrence of wildfires with binary structured additive regression models.

    PubMed

    Ríos-Pena, Laura; Kneib, Thomas; Cadarso-Suárez, Carmen; Marey-Pérez, Manuel

    2017-02-01

    Wildfires are one of the main environmental problems facing societies today, and in the case of Galicia (north-west Spain), they are the main cause of forest destruction. This paper used binary structured additive regression (STAR) for modelling the occurrence of wildfires in Galicia. Binary STAR models are a recent contribution to the classical logistic regression and binary generalized additive models. Their main advantage lies in their flexibility for modelling non-linear effects, while simultaneously incorporating spatial and temporal variables directly, thereby making it possible to reveal possible relationships among the variables considered. The results showed that the occurrence of wildfires depends on many covariates which display variable behaviour across space and time, and which largely determine the likelihood of ignition of a fire. The joint possibility of working on spatial scales with a resolution of 1 × 1 km cells and mapping predictions in a colour range makes STAR models a useful tool for plotting and predicting wildfire occurrence. Lastly, it will facilitate the development of fire behaviour models, which can be invaluable when it comes to drawing up fire-prevention and firefighting plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
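
    A full STAR analysis fits penalized splines with structured spatial and temporal effects in a Bayesian framework; as a rough, simplified stand-in, the sketch below fits an ordinary binomial GLM with B-spline terms for two continuous covariates and a categorical zone effect on simulated data. The variables, simulated ignition process, and zone structure are invented for illustration.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 2000
    df = pd.DataFrame({
        "temp": rng.uniform(5, 35, n),      # air temperature
        "rain": rng.uniform(0, 60, n),      # recent rainfall
        "zone": rng.integers(0, 4, n),      # coarse spatial unit
    })

    # Simulated nonlinear ignition probability (illustration only).
    logit = -4 + 0.004 * (df["temp"] - 10) ** 2 - 0.05 * df["rain"] + 0.3 * df["zone"]
    df["fire"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    # Binary model with smooth (B-spline) effects and a categorical spatial term;
    # a STAR model would instead use penalized splines and Markov random field priors.
    model = smf.glm("fire ~ bs(temp, df=4) + bs(rain, df=4) + C(zone)",
                    data=df, family=sm.families.Binomial()).fit()
    print(model.params.round(3))
    ```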

  2. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from laboratory to field. The additivity model assumed that reaction properties in a sediment, including surface area, reactive site concentration, reaction rate, and extent, can be predicted from the field-scale grain size distribution by linearly adding reaction properties for individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of multi-rate parameters for individual grain size fractions. The statistical properties of the rate constants for the individual grain size fractions were then used to analyze the statistical properties of the additivity model to predict rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The result indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions has to be simulated in order to apply the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The results showed that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2-8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
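
    A minimal sketch of the additivity idea: the composite-sediment response is the mass-weighted sum of responses simulated separately for each grain-size fraction. For simplicity a single first-order desorption rate per fraction is assumed here, rather than the multi-rate surface complexation model used in the study; the mass fractions, rate constants, and sorbed concentrations are hypothetical.

    ```python
    import numpy as np

    # Illustrative grain-size fractions: (mass fraction, rate constant in 1/h,
    # initially sorbed U(VI) in umol/g). Values are hypothetical.
    fractions = {
        "<0.5 mm":  (0.30, 0.25, 1.8),
        "0.5-2 mm": (0.45, 0.10, 1.1),
        "2-8 mm":   (0.25, 0.03, 0.4),   # gravel fraction, often ignored
    }

    def composite_desorption(t_hours):
        """Additivity model: mass-weighted sum of first-order desorption
        simulated for each grain-size fraction."""
        total = 0.0
        for mass_frac, k, s0 in fractions.values():
            total += mass_frac * s0 * (1.0 - np.exp(-k * t_hours))
        return total

    for t in (1, 10, 100):
        print(f"t = {t:4d} h: desorbed U(VI) = {composite_desorption(t):.3f} umol/g")
    ```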

  3. Additive Interaction of MTHFR C677T and MTRR A66G Polymorphisms with Being Overweight/Obesity on the Risk of Type 2 Diabetes.

    PubMed

    Zhi, Xueyuan; Yang, Boyi; Fan, Shujun; Li, Yongfang; He, Miao; Wang, Da; Wang, Yanxun; Wei, Jian; Zheng, Quanmei; Sun, Guifan

    2016-12-15

    Although both methylenetetrahydrofolate reductase (MTHFR) C677T and methionine synthase reductase (MTRR) A66G polymorphisms have been associated with type 2 diabetes (T2D), their interactions with being overweight/obesity on T2D risk remain unclear. To evaluate the associations of the two polymorphisms with T2D and their interactions with being overweight/obesity on T2D risk, a case-control study of 180 T2D patients and 350 healthy controls was conducted in northern China. Additive interaction was estimated using relative excess risk due to interaction (RERI), attributable proportion due to interaction (AP) and synergy index (S). After adjustments for age and gender, borderline significant associations of the MTHFR C677T and MTRR A66G polymorphisms with T2D were observed under recessive (OR = 1.43, 95% CI: 0.98-2.10) and dominant (OR = 1.43, 95% CI: 1.00-2.06) models, respectively. There was a significant interaction between the MTHFR 677TT genotype and being overweight/obesity on T2D risk (AP = 0.404, 95% CI: 0.047-0.761), in addition to the MTRR 66AG/GG genotypes (RERI = 1.703, 95% CI: 0.401-3.004; AP = 0.528, 95% CI: 0.223-0.834). Our findings suggest that individuals with the MTHFR 677TT or MTRR 66AG/GG genotypes are more susceptible to the detrimental effect of being overweight/obesity on T2D. Further large-scale studies are still needed to confirm our findings.
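
    The additive-interaction measures quoted above follow standard formulas based on odds ratios relative to the doubly unexposed group; the sketch below computes RERI, AP, and S. The numerical odds ratios are hypothetical placeholders, not the study's estimates.

    ```python
    def additive_interaction(or11, or10, or01):
        """RERI, AP and S from odds ratios: or11 = both exposures,
        or10 = genotype only, or01 = overweight/obesity only."""
        reri = or11 - or10 - or01 + 1.0                   # relative excess risk due to interaction
        ap = reri / or11                                  # attributable proportion
        s = (or11 - 1.0) / ((or10 - 1.0) + (or01 - 1.0))  # synergy index
        return reri, ap, s

    # Hypothetical odds ratios for illustration only.
    reri, ap, s = additive_interaction(or11=4.0, or10=1.4, or01=1.9)
    print(f"RERI = {reri:.2f}, AP = {ap:.2f}, S = {s:.2f}")
    ```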

  4. Breast Cancer Risk Assessment SAS Macro (Gail Model)

    Cancer.gov

    A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.

  5. Long-Term Post-CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions.

    PubMed

    Carr, Brendan M; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C; Zhu, Wei; Shroyer, A Laurie

    2016-01-01

    Clinical risk models are commonly used to predict short-term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long-term mortality. The added value of long-term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long-term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Long-term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c-index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Mortality rates were 3%, 9%, and 17% at one-, three-, and five years, respectively (median follow-up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long-term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Long-term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long-term mortality risk can be accurately assessed and subgroups of higher-risk patients can be identified for enhanced follow-up care. More research appears warranted to refine long-term CABG clinical risk models. © 2015 The Authors. Journal of Cardiac Surgery Published by Wiley Periodicals, Inc.

  6. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    PubMed Central

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  7. Ambassadors: Models for At-Risk Students.

    ERIC Educational Resources Information Center

    Cahoon, Peggy

    1989-01-01

    The Ambassador Program, a partnership between Ferron Elementary School and the University of Nevada, Las Vegas, pairs university students with at-risk elementary students once a week to serve as role models. (TE)

  8. Additive influence of genetic predisposition and conventional risk factors in the incidence of coronary heart disease: a population-based study in Greece

    PubMed Central

    Yiannakouris, Nikos; Katsoulis, Michail; Trichopoulou, Antonia; Ordovas, Jose M; Trichopoulos, Dimitrios

    2014-01-01

    Objectives An additive genetic risk score (GRS) for coronary heart disease (CHD) has previously been associated with incident CHD in the population-based Greek European Prospective Investigation into Cancer and nutrition (EPIC) cohort. In this study, we explore GRS-‘environment’ joint actions on CHD for several conventional cardiovascular risk factors (ConvRFs), including smoking, hypertension, type-2 diabetes mellitus (T2DM), body mass index (BMI), physical activity and adherence to the Mediterranean diet. Design A case–control study. Setting The general Greek population of the EPIC study. Participants and outcome measures 477 patients with medically confirmed incident CHD and 1271 controls participated in this study. We estimated the ORs for CHD by dividing participants at higher or lower GRS and, alternatively, at higher or lower ConvRF, and calculated the relative excess risk due to interaction (RERI) as a measure of deviation from additivity. Results The joint presence of higher GRS and higher risk ConvRF was in all instances associated with an increased risk of CHD, compared with the joint presence of lower GRS and lower risk ConvRF. The OR (95% CI) was 1.7 (1.2 to 2.4) for smoking, 2.7 (1.9 to 3.8) for hypertension, 4.1 (2.8 to 6.1) for T2DM, 1.9 (1.4 to 2.5) for lower physical activity, 2.0 (1.3 to 3.2) for high BMI and 1.5 (1.1 to 2.1) for poor adherence to the Mediterranean diet. In all instances, RERI values were fairly small and not statistically significant, suggesting that the GRS and the ConvRFs do not have effects beyond additivity. Conclusions Genetic predisposition to CHD, operationalised through a multilocus GRS, and ConvRFs have essentially additive effects on CHD risk. PMID:24500614

  9. Formation and reduction of carcinogenic furan in various model systems containing food additives.

    PubMed

    Kim, Jin-Sil; Her, Jae-Young; Lee, Kwang-Geun

    2015-12-15

    The aim of this study was to analyse and reduce furan in various model systems. Furan model systems consisting of monosaccharides (0.5M glucose and ribose), amino acids (0.5M alanine and serine) and/or 1.0M ascorbic acid were heated at 121°C for 25 min. The effects of food additives (each 0.1M) such as metal ions (iron sulphate, magnesium sulphate, zinc sulphate and calcium sulphate), antioxidants (BHT and BHA), and sodium sulphite on the formation of furan were measured. The level of furan formed in the model systems was 6.8-527.3 ng/ml. The level of furan in the model systems of glucose/serine and glucose/alanine increased 7-674% when food additives were added. In contrast, the level of furan decreased by 18-51% in the Maillard reaction model systems that included ribose and alanine/serine with food additives except zinc sulphate. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Risk assessment of additives through soft drinks and nectars consumption on Portuguese population: a 2010 survey.

    PubMed

    Diogo, Janina S G; Silva, Liliana S O; Pena, Angelina; Lino, Celeste M

    2013-12-01

    This study investigated whether the Portuguese population is at risk of exceeding ADI levels for acesulfame-K, saccharin, aspartame, caffeine, benzoic and sorbic acid through an assessment of dietary intake of additives and specific consumption of four types of beverages, traditional soft drinks and soft drinks based on mineral waters, energetic drinks, and nectars. The highest mean levels of additives were found for caffeine in energetic drinks, 293.5mg/L, for saccharin in traditional soft drinks, 18.4 mg/L, for acesulfame-K and aspartame in nectars, with 88.2 and 97.8 mg/L, respectively, for benzoic acid in traditional soft drinks, 125.7 mg/L, and for sorbic acid in soft drinks based on mineral water, 166.5 mg/L. Traditional soft drinks presented the highest acceptable daily intake percentages (ADIs%) for acesulfame-K, aspartame, benzoic and sorbic acid and similar value for saccharin (0.5%) when compared with soft drinks based on mineral water, 0.7%, 0.08%, 7.3%, and 1.92% versus 0.2%, 0.053%, 0.6%, and 0.28%, respectively. However for saccharin the highest percentage of ADI was obtained for nectars, 0.9%, in comparison with both types of soft drinks, 0.5%. Therefore, it is concluded that the Portuguese population is not at risk of exceeding the established ADIs for the studied additives. Copyright © 2013. Published by Elsevier Ltd.
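
    A worked illustration of the calculation behind such ADI percentages: the daily additive intake per kilogram of body weight is compared with the acceptable daily intake. The body weight, daily beverage consumption, and ADI values below are assumptions made for illustration, not the survey's consumption data.

    ```python
    # Assumed consumer characteristics (illustrative only).
    BODY_WEIGHT_KG = 60.0
    DAILY_CONSUMPTION_L = 0.25   # assumed daily beverage consumption

    # (mean concentration in mg/L from the abstract, assumed ADI in mg/kg bw/day)
    additives = {
        "acesulfame-K": (88.2, 9.0),
        "aspartame":    (97.8, 40.0),
        "benzoic acid": (125.7, 5.0),
    }

    for name, (conc_mg_l, adi) in additives.items():
        intake = conc_mg_l * DAILY_CONSUMPTION_L / BODY_WEIGHT_KG   # mg/kg bw/day
        print(f"{name:12s}: intake = {intake:.3f} mg/kg bw/day "
              f"-> {100.0 * intake / adi:.1f}% of ADI")
    ```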

  11. Aviation Safety Risk Modeling: Lessons Learned From Multiple Knowledge Elicitation Sessions

    NASA Technical Reports Server (NTRS)

    Luxhoj, J. T.; Ancel, E.; Green, L. L.; Shih, A. T.; Jones, S. M.; Reveley, M. S.

    2014-01-01

    Aviation safety risk modeling has elements of both art and science. In a complex domain, such as the National Airspace System (NAS), it is essential that knowledge elicitation (KE) sessions with domain experts be performed to facilitate the making of plausible inferences about the possible impacts of future technologies and procedures. This study discusses lessons learned throughout the multiple KE sessions held with domain experts to construct probabilistic safety risk models for a Loss of Control Accident Framework (LOCAF), FLightdeck Automation Problems (FLAP), and Runway Incursion (RI) mishap scenarios. The intent of these safety risk models is to support a portfolio analysis of NASA's Aviation Safety Program (AvSP). These models use the flexible, probabilistic approach of Bayesian Belief Networks (BBNs) and influence diagrams to model the complex interactions of aviation system risk factors. Each KE session had a different set of experts with diverse expertise, such as pilot, air traffic controller, certification, and/or human factors knowledge that was elicited to construct a composite, systems-level risk model. There were numerous "lessons learned" from these KE sessions that deal with behavioral aggregation, conditional probability modeling, object-oriented construction, interpretation of the safety risk results, and model verification/validation that are presented in this paper.

  12. A new risk prediction model for critical care: the Intensive Care National Audit & Research Centre (ICNARC) model.

    PubMed

    Harrison, David A; Parry, Gareth J; Carpenter, James R; Short, Alasdair; Rowan, Kathy

    2007-04-01

    To develop a new model to improve risk prediction for admissions to adult critical care units in the UK. Prospective cohort study. The setting was 163 adult, general critical care units in England, Wales, and Northern Ireland, December 1995 to August 2003. Patients were 216,626 critical care admissions; there were no interventions. The performance of different approaches to modeling physiologic measurements was evaluated, and the best methods were selected to produce a new physiology score. This physiology score was combined with other information relating to the critical care admission (age, diagnostic category, source of admission, and cardiopulmonary resuscitation before admission) to develop a risk prediction model. Modeling interactions between diagnostic category and physiology score enabled the inclusion of groups of admissions that are frequently excluded from risk prediction models. The new model showed good discrimination (mean c index 0.870) and fit (mean Shapiro's R 0.665, mean Brier's score 0.132) in 200 repeated validation samples and performed well when compared with recalibrated versions of existing published risk prediction models in the cohort of patients eligible for all models. The hypothesis of perfect fit was rejected for all models, including the Intensive Care National Audit & Research Centre (ICNARC) model, as is to be expected in such a large cohort. The ICNARC model demonstrated better discrimination and overall fit than existing risk prediction models, even following recalibration of these models. We recommend it be used to replace previously published models for risk adjustment in the UK.
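
    For reference, the two fit statistics quoted above (the c index and Brier's score) can be computed as in the sketch below; the predicted risks and outcomes shown are a toy validation sample, not ICNARC data.

    ```python
    import numpy as np

    def brier_score(y, p):
        """Mean squared difference between predicted risk and observed outcome."""
        y, p = np.asarray(y, float), np.asarray(p, float)
        return np.mean((p - y) ** 2)

    def c_index(y, p):
        """Probability that a randomly chosen death received a higher predicted
        risk than a randomly chosen survivor (ties count one half)."""
        y, p = np.asarray(y), np.asarray(p, float)
        pos, neg = p[y == 1], p[y == 0]
        greater = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (greater + 0.5 * ties) / (len(pos) * len(neg))

    # Toy validation sample (hypothetical predicted risks and outcomes).
    p = [0.05, 0.10, 0.35, 0.60, 0.80, 0.15, 0.45, 0.25]
    y = [0, 0, 1, 1, 1, 0, 0, 1]
    print(f"c index = {c_index(y, p):.3f}, Brier score = {brier_score(y, p):.3f}")
    ```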

  13. Validation of a multifactorial risk factor model used for predicting future caries risk with Nevada adolescents.

    PubMed

    Ditmyer, Marcia M; Dounis, Georgia; Howard, Katherine M; Mobley, Connie; Cappelli, David

    2011-05-20

    The objective of this study was to measure the validity and reliability of a multifactorial Risk Factor Model developed for use in predicting future caries risk in Nevada adolescents in a public health setting. This study examined retrospective data from an oral health surveillance initiative that screened over 51,000 students 13-18 years of age, attending public/private schools in Nevada across six academic years (2002/2003-2007/2008). The Risk Factor Model included ten demographic variables: exposure to fluoridation in the municipal water supply, environmental smoke exposure, race, age, locale (metropolitan vs. rural), tobacco use, Body Mass Index, insurance status, sex, and sealant application. Multiple regression was used in a previous study to establish which variables significantly contributed to caries risk. Follow-up logistic regression ascertained the weight of contribution and odds ratios of the ten variables. Researchers in this study computed sensitivity, specificity, positive predictive value (PVP), negative predictive value (PVN), and prevalence across all six years of screening to assess the validity of the Risk Factor Model. Subjects' overall mean caries prevalence across all six years was 66%. Average sensitivity across all six years was 79%; average specificity was 81%; average PVP was 89% and average PVN was 67%. Overall, the Risk Factor Model provided a relatively constant, valid measure of caries that could be used in conjunction with a comprehensive risk assessment in population-based screenings by school nurses/nurse practitioners, health educators, and physicians to guide them in assessing potential future caries risk for use in prevention and referral practices.
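
    The validity measures used in this study follow directly from a 2x2 screening table; the sketch below computes them from hypothetical counts chosen only to roughly reproduce the reported averages.

    ```python
    def screening_metrics(tp, fp, fn, tn):
        """Validity measures for a caries risk screening model."""
        sensitivity = tp / (tp + fn)      # caries cases flagged as high risk
        specificity = tn / (tn + fp)      # caries-free students flagged as low risk
        pvp = tp / (tp + fp)              # positive predictive value
        pvn = tn / (tn + fn)              # negative predictive value
        prevalence = (tp + fn) / (tp + fp + fn + tn)
        return sensitivity, specificity, pvp, pvn, prevalence

    # Hypothetical 2x2 counts (approximately matching the reported averages:
    # sensitivity 79%, specificity 81%, PVP 89%, PVN 67%, prevalence 66%).
    sens, spec, pvp, pvn, prev = screening_metrics(tp=522, fp=65, fn=138, tn=275)
    print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
          f"PVP={pvp:.2f} PVN={pvn:.2f} prevalence={prev:.2f}")
    ```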

  14. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements (NCRP, 2000, 2006). However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as opposed to the previous approach based on linear energy transfer, the development of a new solar particle event (SPE) model, and the updates to galactic cosmic ray (GCR) and shielding transport models. The newer epidemiological information includes updates to the cancer incidence rates from the life span study (LSS) of the Japanese atomic bomb survivors (Preston et al., 2007), transferred to the U.S. population and converted to cancer mortality rates from U.S. population statistics. In addition, the proposed model provides an alternative analysis applicable to lifetime never-smokers (NSs). Details of the uncertainty analysis in the model have also been updated and revised. NASA's proposed model and associated uncertainties are complex in their formulation and as such require a very clear and precise set of descriptions. The committee found the 2011 NASA report challenging to review largely because of the lack of clarity in the model descriptions and derivation of the various parameters used. The committee requested some clarifications from NASA throughout its review and was able to resolve many, but not all, of the ambiguities in the written description.

  15. MODELING APPROACHES TO POPULATION-LEVEL RISK ASSESSMENT

    EPA Science Inventory

    A SETAC Pellston Workshop on Population-Level Risk Assessment was held in Roskilde, Denmark on 23-27 August 2003. One aspect of this workshop focused on modeling approaches for characterizing population-level effects of chemical exposure. The modeling work group identified th...

  16. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  17. Comparison of prospective risk estimates for postoperative complications: human vs computer model.

    PubMed

    Glasgow, Robert E; Hawn, Mary T; Hosokawa, Patrick W; Henderson, William G; Min, Sung-Joon; Richman, Joshua S; Tomeh, Majed G; Campbell, Darrell; Neumayer, Leigh A

    2014-02-01

    Surgical quality improvement tools such as NSQIP are limited in their ability to prospectively affect individual patient care by the retrospective audit and feedback nature of their design. We hypothesized that statistical models using patient preoperative characteristics could prospectively provide risk estimates of postoperative adverse events comparable to risk estimates provided by experienced surgeons, and could be useful for stratifying preoperative assessment of patient risk. This was a prospective observational cohort. Using previously developed models for 30-day postoperative mortality, overall morbidity, cardiac, thromboembolic, pulmonary, renal, and surgical site infection (SSI) complications, model and surgeon estimates of risk were compared with each other and with actual 30-day outcomes. The study cohort included 1,791 general surgery patients operated on between June 2010 and January 2012. Observed outcomes were mortality (0.2%), overall morbidity (8.2%), and pulmonary (1.3%), cardiac (0.3%), thromboembolism (0.2%), renal (0.4%), and SSI (3.8%) complications. Model and surgeon risk estimates showed significant correlation (p < 0.0001) for each outcome category. When surgeons perceived patient risk for overall morbidity to be low, the model-predicted risk and observed morbidity rates were 2.8% and 4.1%, respectively, compared with 10% and 18% in perceived high risk patients. Patients in the highest quartile of model-predicted risk accounted for 75% of observed mortality and 52% of morbidity. Across a broad range of general surgical operations, we confirmed that the model risk estimates are in fairly good agreement with risk estimates of experienced surgeons. Using these models prospectively can identify patients at high risk for morbidity and mortality, who could then be targeted for intervention to reduce postoperative complications. Published by Elsevier Inc.
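
    A minimal sketch of the kind of comparison described above: model-predicted risks are correlated with ordinal surgeon-perceived risk categories and set against observed event rates within each category. The data, column names, and effect sizes are invented for illustration and are not from the NSQIP cohort.

      # Hypothetical sketch: compare model-predicted risk with surgeon-perceived
      # risk categories and observed outcomes. All data are simulated.
      import numpy as np
      import pandas as pd
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      n = 500
      df = pd.DataFrame({
          "model_risk": rng.beta(2, 20, n),                         # predicted probability of morbidity
          "surgeon_risk": rng.choice(["low", "medium", "high"], n), # surgeon's preoperative impression
      })
      # Simulate observed 30-day morbidity driven by the model risk (illustrative only)
      df["morbidity"] = rng.random(n) < df["model_risk"]

      # Correlation between surgeon category (ordinal) and model-predicted risk
      order = {"low": 0, "medium": 1, "high": 2}
      rho, p = spearmanr(df["surgeon_risk"].map(order), df["model_risk"])
      print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")

      # Observed and model-predicted morbidity within each perceived-risk group
      summary = df.groupby("surgeon_risk").agg(
          predicted=("model_risk", "mean"),
          observed=("morbidity", "mean"),
      )
      print(summary)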

  18. Modeling of Flood Risk for the Continental United States

    NASA Astrophysics Data System (ADS)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We will pay particular attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. US Midwest flood 2008, US Northeast flood 2010) have increased concern about flood risk. Consequently, there is a growing need to assess flood risk adequately. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to assess antecedent conditions and determine the saturated area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow, and hence flooding, from different rivers, as well as low and high return-periods across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data. Output from

  19. [Multiple risk factors models of patients with acute coronary syndromes of different genders].

    PubMed

    Sun, Wanglexian; Hu, Tiemin; Huang, Xiansheng; Zhang, Ying; Guo, Jinrui; Wang, Wenfeng; Shi, Fei; Wang, Pengfei; Wang, Huarong; Sun, Jing; Li, Chunhua

    2014-12-23

    To establish multiple-risk-factor models for patients with acute coronary syndromes (ACS) of different genders and to quantitatively assess the pathogenic contribution of each factor. A total of 2 308 consecutive ACS inpatients and a control group of 256 cases with normal coronary arteries from January 2010 to December 2012 were enrolled and divided into 4 groups: female ACS (n = 970), male ACS (n = 1 338), female control (n = 136) and male control (n = 120). All demographic and clinical data were collected by the physicians and master's degree candidates in the division of cardiology. Logistic regression models of multiple risk factors were established for ACS by gender. Age over 45 years, dyslipidemia, type 2 diabetes mellitus, obesity and hypertension were all independent risk factors for ACS in both genders (P < 0.05). However, the same risk factors had different pathogenic effects on ACS between genders. The odds ratio (OR) was markedly different for females and males: per 5-year increase in age over 45 years (1.45 vs 1.13), dyslipidemia (3.45 vs 1.68), type 2 diabetes mellitus (4.06 vs 2.33), obesity (2.93 vs 1.91) and hypertension (1.78 vs 3.80), respectively (all P < 0.05). In addition, current smoking increased the risk of ACS in males (OR = 5.49, P < 0.05) but was not statistically significant in females. In particular, cerebral ischemic stroke increased the risk of ACS 5.49-fold in males but not in females (P < 0.05). Type 2 diabetes mellitus, dyslipidemia and obesity may confer higher risks of ACS for females than males, whereas smoking and hypertension are much more dangerous for males. Males with cerebral infarction are more susceptible to ACS than females.
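
    A hedged sketch of gender-stratified logistic regression of the sort described above, with odds ratios obtained by exponentiating the fitted coefficients. The simulated data, variable names and effect sizes are placeholders, not the study's registry data; statsmodels is assumed to be available.

      # Hypothetical sketch: fit a logistic model separately by gender and report
      # odds ratios as exp(coefficients). All data are simulated.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 1000
      df = pd.DataFrame({
          "male": rng.integers(0, 2, n),
          "age_over_45": rng.integers(0, 2, n),
          "dyslipidemia": rng.integers(0, 2, n),
          "diabetes": rng.integers(0, 2, n),
          "hypertension": rng.integers(0, 2, n),
          "smoking": rng.integers(0, 2, n),
      })
      logit = -2 + 0.8 * df.dyslipidemia + 1.0 * df.diabetes + 0.6 * df.hypertension
      df["acs"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

      for sex, grp in df.groupby("male"):
          X = sm.add_constant(grp[["age_over_45", "dyslipidemia", "diabetes",
                                   "hypertension", "smoking"]])
          fit = sm.Logit(grp["acs"], X).fit(disp=0)
          print("males" if sex else "females")
          print(np.exp(fit.params).round(2))   # odds ratios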

  20. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
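
    The following is a minimal discrete-event sketch, in the spirit of the approach described above, of a single storage step whose residence times emerge from delivery and ordering mechanisms rather than being sampled independently. It assumes the SimPy library is installed; batch sizes, delivery intervals and demand rates are hypothetical.

      # Minimal discrete-event sketch (SimPy) of one storage step in a food chain:
      # items arrive in delivery batches and leave when ordered, so storage times
      # emerge from the logistics. All rates and batch sizes are placeholders.
      import random
      import simpy

      random.seed(42)
      storage_times = []

      def deliveries(env, shelf, batch_size=24, interval=24.0):
          """Periodic deliveries put time-stamped items on the shelf (FIFO)."""
          while True:
              for _ in range(batch_size):
                  shelf.put(env.now)          # record each item's arrival time
              yield env.timeout(interval)     # next delivery after `interval` hours

      def demand(env, shelf, mean_gap=1.0):
          """Customers remove the oldest item at random intervals."""
          while True:
              yield env.timeout(random.expovariate(1.0 / mean_gap))
              arrival = yield shelf.get()     # FIFO: oldest item first
              storage_times.append(env.now - arrival)

      env = simpy.Environment()
      shelf = simpy.Store(env)
      env.process(deliveries(env, shelf))
      env.process(demand(env, shelf))
      env.run(until=24 * 30)                  # simulate 30 days

      # The tail of the storage-time distribution is what matters for microbial growth.
      storage_times.sort()
      print("mean storage time (h):", sum(storage_times) / len(storage_times))
      print("95th percentile (h):", storage_times[int(0.95 * len(storage_times))])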

  1. Emerging Infectious Diseases and Blood Safety: Modeling the Transfusion-Transmission Risk.

    PubMed

    Kiely, Philip; Gambhir, Manoj; Cheng, Allen C; McQuilten, Zoe K; Seed, Clive R; Wood, Erica M

    2017-07-01

    While the transfusion-transmission (TT) risk associated with the major transfusion-relevant viruses such as HIV is now very low, during the last 20 years there has been a growing awareness of the threat to blood safety from emerging infectious diseases, a number of which are known to be, or are potentially, transfusion transmissible. Two published models for estimating the transfusion-transmission risk from EIDs, referred to as the Biggerstaff-Petersen model and the European Upfront Risk Assessment Tool (EUFRAT), respectively, have been applied to several EIDs in outbreak situations. We describe and compare the methodological principles of both models, highlighting their similarities and differences. We also discuss the appropriateness of comparing results from the two models. Quantitating the TT risk of EIDs can inform decisions about risk mitigation strategies and their cost-effectiveness. Finally, we present a qualitative risk assessment for Zika virus (ZIKV), an EID agent that has caused several outbreaks since 2007. In the latest and largest ever outbreak, several probable cases of transfusion-transmitted ZIKV have been reported, indicating that it is transfusion-transmissible and therefore a risk to blood safety. We discuss why quantitative modeling of the TT risk of ZIKV is currently problematic. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  2. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires the mathematization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, danger, and so on. In order to solve this problem, as a first step, fundamental theoretical relationships about risk and risk management were analyzed for this paper in the light of mathematics, and then illustrated with some charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On the basis of this, a three-dimensional model of risk management was established that includes: a goal dimension; a management dimension; an operation dimension. This goal management operation (GMO) model was explained and then emphasis was laid on the discussion of the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and Risk Management was researched. This revealed that the FSA method, which the International Maritime Organization (IMO) is actively promoting, comes from Risk Management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  3. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  4. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  5. Prediction models for the risk of spontaneous preterm birth based on maternal characteristics: a systematic review and independent external validation.

    PubMed

    Meertens, Linda J E; van Montfort, Pim; Scheepers, Hubertina C J; van Kuijk, Sander M J; Aardenburg, Robert; Langenveld, Josje; van Dooren, Ivo M A; Zwaan, Iris M; Spaanderman, Marc E A; Smits, Luc J M

    2018-04-17

    Prediction models may contribute to personalized risk-based management of women at high risk of spontaneous preterm delivery. Although prediction models are published frequently, often with promising results, external validation is generally lacking. We performed a systematic review of prediction models for the risk of spontaneous preterm birth based on routine clinical parameters. Additionally, we externally validated and evaluated the clinical potential of the models. Prediction models based on routinely collected maternal parameters obtainable during the first 16 weeks of gestation were eligible for selection. Risk of bias was assessed according to the CHARMS guidelines. We validated the selected models in a Dutch multicenter prospective cohort study comprising 2614 unselected pregnant women. Information on predictors was obtained by a web-based questionnaire. Predictive performance of the models was quantified by the area under the receiver operating characteristic curve (AUC) and calibration plots for the outcomes spontaneous preterm birth <37 weeks and <34 weeks of gestation. Clinical value was evaluated by means of decision curve analysis and calculating classification accuracy for different risk thresholds. Four studies describing five prediction models fulfilled the eligibility criteria. Risk of bias assessment revealed a moderate to high risk of bias in three studies. The AUC of the models ranged from 0.54 to 0.67 and from 0.56 to 0.70 for the outcomes spontaneous preterm birth <37 weeks and <34 weeks of gestation, respectively. A subanalysis showed that the models discriminated poorly (AUC 0.51-0.56) for nulliparous women. Although we recalibrated the models, two models retained evidence of overfitting. The decision curve analysis showed low clinical benefit for the best performing models. This review revealed several reporting and methodological shortcomings of published prediction models for spontaneous preterm birth. Our external validation study

  6. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. Debris fields due to destruction of the launch vehicle are one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and velocity distributions on the strike probability and risk.

  7. Dynamic drought risk assessment using crop model and remote sensing techniques

    NASA Astrophysics Data System (ADS)

    Sun, H.; Su, Z.; Lv, J.; Li, L.; Wang, Y.

    2017-02-01

    Drought risk assessment is of great significance for reducing agricultural drought losses and ensuring food security. The conventional drought risk assessment method evaluates a region's exposure to the hazard and its vulnerability to extended periods of water shortage, which is a static approach. Dynamic Drought Risk Assessment (DDRA) estimates drought risk according to crop growth and water stress conditions in real time. In this study, a DDRA method using a crop model and remote sensing techniques was proposed. The crop model employed was the DeNitrification and DeComposition (DNDC) model. Drought risk was quantified by the yield losses predicted by the crop model in a scenario-based approach. The crop model was re-calibrated to improve its performance using the Leaf Area Index (LAI) retrieved from MODerate Resolution Imaging Spectroradiometer (MODIS) data, and the in-situ, station-based crop model was extended to assess regional drought risk by integrating crop planting maps. The planted crop area was extracted from MODIS data with the extended CPPI method. The approach was implemented and validated on maize in Liaoning province, China.
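
    A minimal sketch of the scenario-based yield-loss idea: drought risk is summarized from the distribution of simulated yield losses relative to a no-stress baseline. In the study these yields would come from calibrated DNDC runs; here they are random placeholders.

      # Hypothetical sketch: summarize drought risk from scenario yield losses.
      import numpy as np

      rng = np.random.default_rng(7)
      baseline_yield = 8.0                      # t/ha under no water stress (assumed)
      scenario_yields = rng.normal(6.5, 1.2, 1000).clip(0, baseline_yield)

      losses = (baseline_yield - scenario_yields) / baseline_yield   # fractional loss
      print("expected yield loss:", losses.mean().round(3))
      print("P(loss > 30%):", (losses > 0.30).mean().round(3))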

  8. Limits of Risk Predictability in a Cascading Alternating Renewal Process Model.

    PubMed

    Lin, Xin; Moussawi, Alaa; Korniss, Gyorgy; Bakdash, Jonathan Z; Szymanski, Boleslaw K

    2017-07-27

    Most risk analysis models systematically underestimate the probability and impact of catastrophic events (e.g., economic crises, natural disasters, and terrorism) by not taking into account interconnectivity and interdependence of risks. To address this weakness, we propose the Cascading Alternating Renewal Process (CARP) to forecast interconnected global risks. However, assessments of the model's prediction precision are limited by lack of sufficient ground truth data. Here, we establish prediction precision as a function of input data size by using alternative long ground truth data generated by simulations of the CARP model with known parameters. We illustrate the approach on a model of fires in artificial cities assembled from basic city blocks with diverse housing. The results confirm that parameter recovery variance exhibits power law decay as a function of the length of available ground truth data. Using CARP, we also demonstrate estimation using a disparate dataset that also has dependencies: real-world prediction precision for the global risk model based on the World Economic Forum Global Risk Report. We conclude that the CARP model is an efficient method for predicting catastrophic cascading events with potential applications to emerging local and global interconnected risks.
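
    A small sketch of the kind of check described above: if parameter-recovery variance decays as a power law in the length of the ground-truth series, the exponent can be estimated by a straight-line fit in log-log space. The variance values below are synthetic placeholders, not output of the CARP model.

      # Fit variance ~ c * T^(-alpha) by linear regression in log-log space.
      import numpy as np

      lengths = np.array([100, 200, 400, 800, 1600, 3200], dtype=float)
      noise = np.exp(np.random.default_rng(3).normal(0, 0.05, 6))
      variance = 5.0 * lengths ** -0.8 * noise          # synthetic decay with exponent 0.8

      slope, intercept = np.polyfit(np.log(lengths), np.log(variance), 1)
      print(f"estimated decay exponent alpha = {-slope:.2f}")   # should be near 0.8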

  9. The cardiovascular event reduction tool (CERT)--a simplified cardiac risk prediction model developed from the West of Scotland Coronary Prevention Study (WOSCOPS).

    PubMed

    L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J

    2000-03-15

    The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts where estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the actual observed primary event reduction seen in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years old at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/ high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/ 40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), 4 categories of TC/HDL ratio (<5.5, 5.5 to <6.5, 6.5 to <7.5, > or = 7.5), 2 levels of diastolic blood pressure (<90, > or = 90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in the very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.

  10. What can('t) we do with global flood risk models?

    NASA Astrophysics Data System (ADS)

    Ward, P.; Jongman, B.; Salamon, P.; Simpson, A.; Bates, P. D.; de Groeve, T.; Muis, S.; Coughlan, E.; Rudari, R.; Trigg, M. A.; Winsemius, H.

    2015-12-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2013) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward, P.J. et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742.

  11. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials and develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and predict optimized conditions with numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are multiphysics and multiscale in nature. In this project, the research work has been developed to model AAM processes in a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested. In addition
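
    As a toy illustration of the thermal side of such macroscale models, the sketch below integrates a one-dimensional heat equation with a moving Gaussian heat source by explicit finite differences. All material properties, source parameters and boundary conditions are placeholders and are unrelated to the models developed in the project.

      # Minimal 1-D explicit finite-difference sketch of a moving heat source on a
      # substrate; all parameter values are assumed placeholders.
      import numpy as np

      L, nx = 0.10, 201                 # domain length (m), grid points
      dx = L / (nx - 1)
      alpha = 5e-6                      # thermal diffusivity (m^2/s), assumed
      dt = 0.4 * dx**2 / alpha          # explicit stability: dt <= dx^2 / (2*alpha)
      v, sigma, q = 0.01, 0.002, 5.0e3  # source speed (m/s), width (m), strength (K/s)

      x = np.linspace(0.0, L, nx)
      T = np.full(nx, 300.0)            # initial temperature (K)

      t = 0.0
      while t < 5.0:                    # simulate 5 s of deposition
          src = q * np.exp(-((x - v * t) ** 2) / (2 * sigma**2))
          lap = np.zeros_like(T)
          lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
          T = T + dt * (alpha * lap + src)
          T[0] = T[-1] = 300.0          # fixed-temperature boundaries
          t += dt

      print("peak temperature (K):", round(T.max(), 1))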

  12. Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.

    PubMed

    Haimes, Yacov Y

    2018-01-01

    The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.

  13. Testing a fall risk model for injection drug users.

    PubMed

    Pieper, Barbara; Templin, Thomas N; Goldberg, Allon

    2012-01-01

    Fall risk is a critical component of clinical assessment and has not been examined for persons who have injected illicit drugs and are aging. The aim of this study was to test and develop the Fall Risk Model for Injection Drug Users by examining the relationships among injection drug use, chronic venous insufficiency, lower extremity impairments (i.e., decreased ankle range of motion, reduced calf muscle endurance, and leg pain), age and other covariates, and the Tinetti balance and gait total score as a measure of fall risk. A cross-sectional comparative design was used with four crossed factors. Standardized instruments were used to assess the variables. Moderated multiple regression with linear and quadratic trends in age was used to examine the nature of the relationship between the Tinetti balance and gait total and age and the potential moderating role of injection drug use. A prespecified series of models was tested. Participants (n = 713) were men (46.9%) and women recruited from methadone treatment centers, with a mean age of 46.26 years; most were African American (61.7%). The fall risk of a 48-year-old leg injector was comparable with the fall risk of a 69-year-old who had not injected drugs. Variables were added to the model sequentially; some lost significance when their effects were explained by subsequently added variables. Final significant variables in the model were employment status, number of comorbidities, ankle range of motion, leg pain, and calf muscle endurance. Fall risk was associated with route of drug use. Lower extremity impairments accounted for the effects of injection drug use and chronic venous insufficiency on risk for falls. Further understanding of fall risk in injection users is necessary as they age, attempt to work, and participate in activities.
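
    A hedged sketch of a moderated multiple regression with linear and quadratic age terms and an age-by-injection interaction, in the spirit of the analysis described above. The data frame, effect sizes and outcome scaling are invented, not the study's data; statsmodels is assumed.

      # Hypothetical moderated regression: balance/gait score on age, age^2, and
      # their interactions with injection drug use. All data are simulated.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(11)
      n = 700
      df = pd.DataFrame({
          "age": rng.normal(46, 8, n),
          "injector": rng.integers(0, 2, n),
      })
      # Synthetic score: declines with age, faster for injectors
      df["tinetti"] = (28 - 0.10 * (df.age - 45) - 0.004 * (df.age - 45) ** 2
                       - 1.5 * df.injector - 0.08 * df.injector * (df.age - 45)
                       + rng.normal(0, 1.5, n))

      model = smf.ols(
          "tinetti ~ injector * age + injector * I(age ** 2)", data=df
      ).fit()
      print(model.params.round(3))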

  14. Pharmaceutical supply chain risk assessment in Iran using analytic hierarchy process (AHP) and simple additive weighting (SAW) methods.

    PubMed

    Jaberidoost, Mona; Olfat, Laya; Hosseini, Alireza; Kebriaeezadeh, Abbas; Abdollahi, Mohammad; Alaeddini, Mahdi; Dinarvand, Rassoul

    2015-01-01

    The pharmaceutical supply chain is a significant component of the health system in supplying medicines, particularly in countries where the main drugs are provided by local pharmaceutical companies. No previous studies have assessed risks and disruptions within pharmaceutical companies as part of assessing the pharmaceutical supply chain, yet any risks affecting these companies could disrupt the supply of medicines and the efficiency of the health system. The goal of this study was risk assessment in the pharmaceutical industry in Iran, considering process priority and the hazard and probability of risks. The study was carried out in four phases: risk identification through a literature review; risk identification in Iranian pharmaceutical companies through interviews with experts; risk analysis through a questionnaire and consultation with experts using the group analytic hierarchy process (AHP) method and a rating scale (RS); and risk evaluation using the simple additive weighting (SAW) method. In total, 86 main risks were identified in the pharmaceutical supply chain from the perspective of pharmaceutical companies and classified into 11 classes. The majority of risks described in this study were related to the financial and economic category, and financial management was found to be the most important factor for consideration. Although the pharmaceutical industry and supply chain were affected by the political conditions in Iran during the study period, half of the total risks in the pharmaceutical supply chain were found to be internal risks that companies could address internally. Likewise, the political situation and related risks forced companies to focus more on financial and supply management, resulting in less attention to quality management.
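
    The SAW step lends itself to a compact illustration: each criterion is normalized, weighted (for example with AHP-derived weights), and summed to rank the risks, as sketched below. The risk names, scores and weights are hypothetical.

      # Simple additive weighting (SAW) sketch with made-up risks and weights.
      import numpy as np

      risks = ["currency fluctuation", "supplier delay", "quality failure"]
      # columns: probability, hazard severity, process priority (raw scores 1-9)
      scores = np.array([
          [8, 7, 9],
          [6, 5, 7],
          [4, 8, 6],
      ], dtype=float)
      weights = np.array([0.5, 0.3, 0.2])       # e.g., derived from an AHP comparison

      normalized = scores / scores.max(axis=0)  # benefit-type linear normalization
      saw_score = normalized @ weights

      for name, s in sorted(zip(risks, saw_score), key=lambda t: -t[1]):
          print(f"{name}: {s:.3f}")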

  15. High-risk regions and outbreak modelling of tularemia in humans.

    PubMed

    Desvars-Larrive, A; Liu, X; Hjertqvist, M; Sjöstedt, A; Johansson, A; Rydén, P

    2017-02-01

    Sweden reports large and variable numbers of human tularemia cases, but the high-risk regions are anecdotally defined and factors explaining annual variations are poorly understood. Here, high-risk regions were identified by spatial cluster analysis on disease surveillance data for 1984-2012. Negative binomial regression with five previously validated predictors (including predicted mosquito abundance and predictors based on local weather data) was used to model the annual number of tularemia cases within the high-risk regions. Seven high-risk regions were identified with annual incidences of 3·8-44 cases/100 000 inhabitants, accounting for 56·4% of the tularemia cases but only 9·3% of Sweden's population. For all high-risk regions, most cases occurred between July and September. The regression models explained the annual variation of tularemia cases within most high-risk regions and discriminated between years with and without outbreaks. In conclusion, tularemia in Sweden is concentrated in a few high-risk regions and shows high annual and seasonal variations. We present reproducible methods for identifying tularemia high-risk regions and modelling tularemia cases within these regions. The results may help health authorities to target populations at risk and lay the foundation for developing an early warning system for outbreaks.
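
    A minimal sketch of a negative binomial regression of annual case counts on a couple of standardized predictors, analogous in form to the regional models described above. The data are simulated stand-ins, not the Swedish surveillance series; statsmodels is assumed.

      # Hypothetical negative binomial regression of yearly case counts.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n_years = 29
      X = np.column_stack([
          rng.normal(0, 1, n_years),   # e.g., predicted mosquito abundance (standardized)
          rng.normal(0, 1, n_years),   # e.g., summer precipitation (standardized)
      ])
      mu = np.exp(2.0 + 0.6 * X[:, 0] + 0.3 * X[:, 1])
      cases = rng.negative_binomial(n=5, p=5 / (5 + mu))   # overdispersed counts

      model = sm.GLM(cases, sm.add_constant(X),
                     family=sm.families.NegativeBinomial(alpha=0.2)).fit()
      print(model.summary().tables[1])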

  16. Accelerated failure time models for semi-competing risks data in the presence of complex censoring.

    PubMed

    Lee, Kyu Ha; Rondeau, Virginie; Haneuse, Sebastien

    2017-12-01

    Statistical analyses that investigate risk factors for Alzheimer's disease (AD) are often subject to a number of challenges. Some of these challenges arise due to practical considerations regarding data collection such that the observation of AD events is subject to complex censoring including left-truncation and either interval or right-censoring. Additional challenges arise due to the fact that study participants under investigation are often subject to competing forces, most notably death, that may not be independent of AD. Towards resolving the latter, researchers may choose to embed the study of AD within the "semi-competing risks" framework for which the recent statistical literature has seen a number of advances including for the so-called illness-death model. To the best of our knowledge, however, the semi-competing risks literature has not fully considered analyses in contexts with complex censoring, as in studies of AD. This is particularly the case when interest lies with the accelerated failure time (AFT) model, an alternative to the traditional multiplicative Cox model that places emphasis away from the hazard function. In this article, we outline a new Bayesian framework for estimation/inference of an AFT illness-death model for semi-competing risks data subject to complex censoring. An efficient computational algorithm that gives researchers the flexibility to adopt either a fully parametric or a semi-parametric model specification is developed and implemented. The proposed methods are motivated by and illustrated with an analysis of data from the Adult Changes in Thought study, an on-going community-based prospective study of incident AD in western Washington State. © 2017, The International Biometric Society.

  17. Risk stratification following acute myocardial infarction.

    PubMed

    Singh, Mandeep

    2007-07-01

    This article reviews the current risk assessment models available for patients presenting with myocardial infarction (MI). These practical tools enhance the health care provider's ability to rapidly and accurately assess patient risk from the event or revascularization therapy, and are of paramount importance in managing patients presenting with MI. This article highlights the models used for ST-elevation MI (STEMI) and non-ST elevation MI (NSTEMI) and provides an additional description of models used to assess risks after primary angioplasty (ie, angioplasty performed for STEMI).

  18. Are Masking-Based Models of Risk Useful?

    PubMed

    Gisiner, Robert C

    2016-01-01

    As our understanding of directly observable effects from anthropogenic sound exposure has improved, concern about "unobservable" effects such as stress and masking have received greater attention. Equal energy models of masking such as power spectrum models have the appeal of simplicity, but do they offer biologically realistic assessments of the risk of masking? Data relevant to masking such as critical ratios, critical bandwidths, temporal resolution, and directional resolution along with what is known about general mammalian antimasking mechanisms all argue for a much more complicated view of masking when making decisions about the risk of masking inherent in a given anthropogenic sound exposure scenario.

  19. Comparison of GWAS models to identify non-additive genetic control of flowering time in sunflower hybrids.

    PubMed

    Bonnafous, Fanny; Fievet, Ghislain; Blanchet, Nicolas; Boniface, Marie-Claude; Carrère, Sébastien; Gouzy, Jérôme; Legrand, Ludovic; Marage, Gwenola; Bret-Mestries, Emmanuelle; Munos, Stéphane; Pouilly, Nicolas; Vincourt, Patrick; Langlade, Nicolas; Mangin, Brigitte

    2018-02-01

    This study compares five GWAS models to show the added value of non-additive modeling of allelic effects for identifying genomic regions controlling flowering time of sunflower hybrids. Genome-wide association studies are a powerful and widely used tool to decipher the genetic control of complex traits. One of the main challenges for hybrid crops, such as maize or sunflower, is to model the hybrid vigor in the linear mixed models, considering the relatedness between individuals. Here, we compared two additive and three non-additive association models for their ability to identify genomic regions associated with flowering time in sunflower hybrids. A panel of 452 sunflower hybrids, corresponding to incomplete crossing between 36 male lines and 36 female lines, was phenotyped in five environments and genotyped for 2,204,423 SNPs. Intra-locus effects were estimated in multi-locus models to detect genomic regions associated with flowering time using the different models. Thirteen quantitative trait loci were identified in total, two with both model categories and one with only non-additive models. A quantitative trait locus on LG09, detected by both the additive and non-additive models, is located near a GAI homolog and is presented in detail. Overall, this study shows the added value of non-additive modeling of allelic effects for identifying genomic regions that control traits of interest and that could participate in the heterosis observed in hybrids.
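
    A simplified sketch of the additive versus non-additive contrast at a single SNP: the additive term counts minor alleles (0/1/2) while a dominance term flags heterozygotes (0/1/0), and both can be compared in a regression on the phenotype. A real GWAS of hybrids would use multi-locus mixed models accounting for relatedness; the genotypes and flowering times below are simulated.

      # Additive vs. additive-plus-dominance coding at a single simulated SNP.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(9)
      n = 452
      genotype = rng.integers(0, 3, n)             # 0, 1, 2 copies of the minor allele
      additive = genotype.astype(float)
      dominance = (genotype == 1).astype(float)    # heterozygote indicator

      # Simulated flowering time with both additive and dominance effects
      flowering = 70 + 1.5 * additive + 2.0 * dominance + rng.normal(0, 3, n)

      fit_add = sm.OLS(flowering, sm.add_constant(additive)).fit()
      fit_full = sm.OLS(flowering,
                        sm.add_constant(np.column_stack([additive, dominance]))).fit()
      print("additive-only R^2:", round(fit_add.rsquared, 3))
      print("additive + dominance R^2:", round(fit_full.rsquared, 3))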

  20. Predictors of incident heart failure in patients after an acute coronary syndrome: The LIPID heart failure risk-prediction model.

    PubMed

    Driscoll, Andrea; Barnes, Elizabeth H; Blankenberg, Stefan; Colquhoun, David M; Hunt, David; Nestel, Paul J; Stewart, Ralph A; West, Malcolm J; White, Harvey D; Simes, John; Tonkin, Andrew

    2017-12-01

    Coronary heart disease is a major cause of heart failure. Availability of risk-prediction models that include both clinical parameters and biomarkers is limited. We aimed to develop such a model for prediction of incident heart failure. A multivariable risk-factor model was developed for prediction of first occurrence of heart failure death or hospitalization. A simplified risk score was derived that enabled subjects to be grouped into categories of 5-year risk varying from <5% to >20%. Among 7101 patients from the LIPID study (84% male), with median age 61 years (interquartile range 55-67 years), 558 (8%) died or were hospitalized because of heart failure. Older age, history of claudication or diabetes mellitus, body mass index >30 kg/m2, LDL-cholesterol >2.5 mmol/L, heart rate >70 beats/min, white blood cell count, and the nature of the qualifying acute coronary syndrome (myocardial infarction or unstable angina) were associated with an increase in heart failure events. Coronary revascularization was associated with a lower event rate. Incident heart failure increased with higher concentrations of B-type natriuretic peptide >50 ng/L, cystatin C >0.93 nmol/L, D-dimer >273 nmol/L, high-sensitivity C-reactive protein >4.8 nmol/L, and sensitive troponin I >0.018 μg/L. Addition of biomarkers to the clinical risk model improved the model's C statistic from 0.73 to 0.77. The net reclassification improvement incorporating biomarkers into the clinical model using categories of 5-year risk was 23%. Adding a multibiomarker panel to conventional parameters markedly improved discrimination and risk classification for future heart failure events. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
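
    A small sketch of how the gain in discrimination from adding biomarkers could be quantified, analogous to the C-statistic improvement reported above. The predictors, effect sizes and outcomes are simulated, and the AUCs are computed in-sample for brevity; scikit-learn is assumed.

      # Compare AUC of a clinical-only model with a clinical-plus-biomarker model.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(13)
      n = 5000
      clinical = rng.normal(0, 1, (n, 3))          # e.g., age, diabetes, heart rate
      biomarkers = rng.normal(0, 1, (n, 2))        # e.g., BNP, cystatin C
      logit = 0.8 * clinical[:, 0] + 0.5 * biomarkers[:, 0] + 0.4 * biomarkers[:, 1] - 2.5
      y = rng.random(n) < 1 / (1 + np.exp(-logit))

      full = np.hstack([clinical, biomarkers])
      auc_clin = roc_auc_score(
          y, LogisticRegression().fit(clinical, y).predict_proba(clinical)[:, 1])
      auc_full = roc_auc_score(
          y, LogisticRegression().fit(full, y).predict_proba(full)[:, 1])
      print(f"C statistic: clinical {auc_clin:.2f} -> clinical + biomarkers {auc_full:.2f}")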

  1. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking the critical items of a system in order to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third, hazardous risk coefficient accounts for anticipated hazards that may occur in the future, with the risk deduced from criteria of consequences for safety, environment, maintenance and economics, together with the corresponding cost of those consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritization and ranking of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.

  2. Development and validation of risk models to select ever-smokers for CT lung-cancer screening

    PubMed Central

    Katki, Hormuzd A.; Kovalchik, Stephanie A.; Berg, Christine D.; Cheung, Li C.; Chaturvedi, Anil K.

    2016-01-01

    Importance The US Preventive Services Task Force (USPSTF) recommends computed-tomography (CT) lung-cancer screening for ever-smokers ages 55-80 years who smoked at least 30 pack-years with no more than 15 years since quitting. However, selecting ever-smokers for screening using individualized lung-cancer risk calculations may be more effective and efficient than current USPSTF recommendations. Objective Comparison of modeled outcomes from risk-based CT lung-screening strategies versus USPSTF recommendations. Design/Setting/Participants Empirical risk models for lung-cancer incidence and death in the absence of CT screening using data on ever-smokers from the Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO; 1993-2009) control group. Covariates included age, education, sex, race, smoking intensity/duration/quit-years, Body Mass Index, family history of lung-cancer, and self-reported emphysema. Model validation in the chest radiography groups of the PLCO and the National Lung Screening Trial (NLST; 2002-2009), with additional validation of the death model in the National Health Interview Survey (NHIS; 1997-2001), a representative sample of the US. Models applied to US ever-smokers ages 50-80 (NHIS 2010-2012) to estimate outcomes of risk-based selection for CT lung-screening, assuming screening for all ever-smokers yields the percent changes in lung-cancer detection and death observed in the NLST. Exposure Annual CT lung-screening for 3 years. Main Outcomes and Measures Model validity: calibration (number of model-predicted cases divided by number of observed cases (Estimated/Observed)) and discrimination (Area-Under-Curve (AUC)). Modeled screening outcomes: estimated number of screen-avertable lung-cancer deaths, estimated screening effectiveness (number needed to screen (NNS) to prevent 1 lung-cancer death). Results Lung-cancer incidence and death risk models were well-calibrated in PLCO and NLST. The lung-cancer death model calibrated and

  3. Development and Validation of Risk Models to Select Ever-Smokers for CT Lung Cancer Screening.

    PubMed

    Katki, Hormuzd A; Kovalchik, Stephanie A; Berg, Christine D; Cheung, Li C; Chaturvedi, Anil K

    2016-06-07

    The US Preventive Services Task Force (USPSTF) recommends computed tomography (CT) lung cancer screening for ever-smokers aged 55 to 80 years who have smoked at least 30 pack-years with no more than 15 years since quitting. However, selecting ever-smokers for screening using individualized lung cancer risk calculations may be more effective and efficient than current USPSTF recommendations. Comparison of modeled outcomes from risk-based CT lung-screening strategies vs USPSTF recommendations. Empirical risk models for lung cancer incidence and death in the absence of CT screening using data on ever-smokers from the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial (PLCO; 1993-2009) control group. Covariates included age; education; sex; race; smoking intensity, duration, and quit-years; body mass index; family history of lung cancer; and self-reported emphysema. Model validation in the chest radiography groups of the PLCO and the National Lung Screening Trial (NLST; 2002-2009), with additional validation of the death model in the National Health Interview Survey (NHIS; 1997-2001), a representative sample of the United States. Models were applied to US ever-smokers aged 50 to 80 years (NHIS 2010-2012) to estimate outcomes of risk-based selection for CT lung screening, assuming that screening for all ever-smokers would yield the percent changes in lung cancer detection and death observed in the NLST. Annual CT lung screening for 3 years beginning at age 50 years. For model validity: calibration (number of model-predicted cases divided by number of observed cases [estimated/observed]) and discrimination (area under curve [AUC]). For modeled screening outcomes: estimated number of screen-avertable lung cancer deaths and estimated screening effectiveness (number needed to screen [NNS] to prevent 1 lung cancer death). Lung cancer incidence and death risk models were well calibrated in PLCO and NLST. The lung cancer death model calibrated and discriminated well for US
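
    The two validation metrics used above can be illustrated in a few lines: calibration as the ratio of model-predicted to observed cases (estimated/observed) and discrimination as the area under the ROC curve. The predicted risks and outcomes below are simulated, not PLCO or NLST data; scikit-learn is assumed.

      # Calibration (E/O) and discrimination (AUC) on simulated predicted risks.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(21)
      n = 10000
      predicted_risk = rng.beta(1, 30, n)            # model-predicted event risk
      observed = rng.random(n) < predicted_risk      # simulated observed cases

      expected_over_observed = predicted_risk.sum() / observed.sum()
      auc = roc_auc_score(observed, predicted_risk)
      print(f"calibration (E/O): {expected_over_observed:.2f}")
      print(f"discrimination (AUC): {auc:.2f}")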

  4. 46 CFR 308.502 - Additional insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional insurance. 308.502 Section 308.502 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Introduction § 308.502 Additional insurance. The assured may place increased value or...

  5. Cost-effectiveness of additional catheter-directed thrombolysis for deep vein thrombosis.

    PubMed

    Enden, T; Resch, S; White, C; Wik, H S; Kløw, N E; Sandset, P M

    2013-06-01

    Additional treatment with catheter-directed thrombolysis (CDT) has recently been shown to reduce post-thrombotic syndrome (PTS). To estimate the cost effectiveness of additional CDT compared with standard treatment alone. Using a Markov decision model, we compared the two treatment strategies in patients with a high proximal deep vein thrombosis (DVT) and a low risk of bleeding. The model captured the development of PTS, recurrent venous thromboembolism and treatment-related adverse events within a lifetime horizon and the perspective of a third-party payer. Uncertainty was assessed with one-way and probabilistic sensitivity analyses. Model inputs from the CaVenT study included PTS development, major bleeding from CDT and utilities for post DVT states including PTS. The remaining clinical inputs were obtained from the literature. Costs obtained from the CaVenT study, hospital accounts and the literature are expressed in US dollars ($); effects in quality adjusted life years (QALY). In base case analyses, additional CDT accumulated 32.31 QALYs compared with 31.68 QALYs after standard treatment alone. Direct medical costs were $64,709 for additional CDT and $51,866 for standard treatment. The incremental cost-effectiveness ratio (ICER) was $20,429/QALY gained. One-way sensitivity analysis showed model sensitivity to the clinical efficacy of both strategies, but the ICER remained < $55,000/QALY over the full range of all parameters. The probability that CDT is cost effective was 82% at a willingness to pay threshold of $50,000/QALY gained. Additional CDT is likely to be a cost-effective alternative to the standard treatment for patients with a high proximal DVT and a low risk of bleeding. © 2013 International Society on Thrombosis and Haemostasis.
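
    As a quick check of the headline figure, the incremental cost-effectiveness ratio follows directly from the quoted costs and QALYs; the full analysis, of course, comes from the Markov model rather than this arithmetic.

      # Back-of-the-envelope ICER from the figures quoted in the abstract.
      cost_cdt, cost_std = 64709.0, 51866.0       # direct medical costs (USD)
      qaly_cdt, qaly_std = 32.31, 31.68           # lifetime QALYs

      icer = (cost_cdt - cost_std) / (qaly_cdt - qaly_std)
      # ~ $20,400/QALY; the abstract reports $20,429, from unrounded model outputs
      print(f"ICER ~ ${icer:,.0f} per QALY gained")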

  6. Cost-effectiveness of additional catheter-directed thrombolysis for deep vein thrombosis

    PubMed Central

    ENDEN, T.; RESCH, S.; WHITE, C.; WIK, H. S.; KLØW, N. E.; SANDSET, P. M.

    2013-01-01

    Summary Background Additional treatment with catheter-directed thrombolysis (CDT) has recently been shown to reduce post-thrombotic syndrome (PTS). Objectives To estimate the cost effectiveness of additional CDT compared with standard treatment alone. Methods Using a Markov decision model, we compared the two treatment strategies in patients with a high proximal deep vein thrombosis (DVT) and a low risk of bleeding. The model captured the development of PTS, recurrent venous thromboembolism and treatment-related adverse events within a lifetime horizon and the perspective of a third-party payer. Uncertainty was assessed with one-way and probabilistic sensitivity analyses. Model inputs from the CaVenT study included PTS development, major bleeding from CDT and utilities for post DVT states including PTS. The remaining clinical inputs were obtained from the literature. Costs obtained from the CaVenT study, hospital accounts and the literature are expressed in US dollars ($); effects in quality adjusted life years (QALY). Results In base case analyses, additional CDT accumulated 32.31 QALYs compared with 31.68 QALYs after standard treatment alone. Direct medical costs were $64 709 for additional CDT and $51 866 for standard treatment. The incremental cost-effectiveness ratio (ICER) was $20 429/QALY gained. One-way sensitivity analysis showed model sensitivity to the clinical efficacy of both strategies, but the ICER remained < $55 000/QALY over the full range of all parameters. The probability that CDT is cost effective was 82% at a willingness to pay threshold of $50 000/QALY gained. Conclusions Additional CDT is likely to be a cost-effective alternative to the standard treatment for patients with a high proximal DVT and a low risk of bleeding. PMID:23452204

  7. Landslide risk mapping and modeling in China

    NASA Astrophysics Data System (ADS)

    Li, W.; Hong, Y.

    2015-12-01

    Under the combined influence of global climate change, tectonic stress and human activity, landslides are among the most frequent and severely widespread natural hazards on Earth, as demonstrated in the World Atlas of Natural Hazards (McGuire et al., 2004). Every year, landslides cause serious economic losses as well as casualties (Róbert et al., 2005). How landslides can be monitored and predicted is an urgent research topic for the international landslide research community. Particularly, there is a lack of high quality and updated landslide risk maps and guidelines that can be employed to better mitigate and prevent landslide disasters in many emerging regions, including China (Hong, 2007). Since the 1950s, landslide events have been recorded in the statistical yearbooks, newspapers, and monographs in China. As disasters have been increasingly concerned by the government and the public, information about landslide events is becoming available from online news reports (Liu et al., 2012). This study presents multi-scale landslide risk mapping and modeling in China. At the national scale, based on historical data and practical experiences, we carry out landslide susceptibility and risk mapping by adopting a statistical approach and pattern recognition methods to construct empirical models. Over the identified landslide hot-spot areas, we further evaluate the slope stability for each individual site (Sidle and Hirotaka, 2006), with the ultimate goal of setting up a space-time, multi-scale coupled system of landslide risk mapping and modeling for landslide hazard monitoring and early warning.

  8. Reciprocating Risks of Peer Problems and Aggression for Children's Internalizing Problems

    ERIC Educational Resources Information Center

    Hoglund, Wendy L. G.; Chisholm, Courtney A.

    2014-01-01

    Three complementary models of how peer relationship problems (exclusion and victimization) and aggressive behaviors relate to prospective levels of internalizing problems are examined. The additive risks model proposes that peer problems and aggression cumulatively increase risks for internalizing problems. The reciprocal risks model hypothesizes…

  9. Interaction of Reward Seeking and Self-Regulation in the Prediction of Risk Taking: A Cross-National Test of the Dual Systems Model

    ERIC Educational Resources Information Center

    Duell, Natasha; Steinberg, Laurence; Chein, Jason; Al-Hassan, Suha M.; Bacchini, Dario; Lei, Chang; Chaudhary, Nandita; Di Giunta, Laura; Dodge, Kenneth A.; Fanti, Kostas A.; Lansford, Jennifer E.; Malone, Patrick S.; Oburu, Paul; Pastorelli, Concetta; Skinner, Ann T.; Sorbring, Emma; Tapanya, Sombat; Uribe Tirado, Liliana Maria; Alampay, Liane Peña

    2016-01-01

    In the present analysis, we test the dual systems model of adolescent risk taking in a cross-national sample of over 5,200 individuals aged 10 through 30 (M = 17.05 years, SD = 5.91) from 11 countries. We examine whether reward seeking and self-regulation make independent, additive, or interactive contributions to risk taking, and ask whether…

  10. [Estimation of individual breast cancer risk: relevance and limits of risk estimation models].

    PubMed

    De Pauw, A; Stoppa-Lyonnet, D; Andrieu, N; Asselain, B

    2009-10-01

    Several risk estimation models for breast or ovarian cancers have been developed over the last decades. All of these models take family history into account, with different levels of sophistication. The Gail model was developed in 1989, taking into account family history (0, 1 or > or = 2 affected relatives) and several environmental factors. In 1990, the Claus model was the first to integrate explicit assumptions about genetic effects, assuming a single dominantly inherited gene occurring at low frequency in the population. The BRCAPRO model, developed after the identification of BRCA1 and BRCA2, assumes transmission restricted to these two dominantly inherited genes. The BOADICEA model adds the effect of a polygenic component to the effect of BRCA1 and BRCA2 to explain the residual clustering of breast cancer. Finally, the IBIS model assumes a third dominantly inherited gene to explain this residual clustering; moreover, this model incorporates environmental factors. We applied the Claus, BRCAPRO, BOADICEA and IBIS models to four clinical situations, corresponding to more or less heavy family histories, in order to study the consistency of the risk estimates. The three more recent models (BRCAPRO, BOADICEA and IBIS) gave the closest estimates. These estimates could be useful in clinical practice when faced with the complex analysis of a family history of breast and/or ovarian cancer.

  11. Modeling insurer-homeowner interactions in managing natural disaster risk.

    PubMed

    Kesete, Yohannes; Peng, Jiazhen; Gao, Yang; Shan, Xiaojun; Davidson, Rachel A; Nozick, Linda K; Kruse, Jamie

    2014-06-01

    The current system for managing natural disaster risk in the United States is problematic for both homeowners and insurers. Homeowners are often uninsured or underinsured against natural disaster losses, and typically do not invest in retrofits that can reduce losses. Insurers often do not want to insure against these losses, which are some of their biggest exposures and can cause an undesirably high chance of insolvency. There is a need to design an improved system that acknowledges the different perspectives of the stakeholders. In this article, we introduce a new modeling framework to help understand and manage the insurer's role in catastrophe risk management. The framework includes a new game-theoretic optimization model of insurer decisions that interacts with a utility-based homeowner decision model and is integrated with a regional catastrophe loss estimation model. Reinsurer and government roles are represented as bounds on the insurer-insured interactions. We demonstrate the model for a full-scale case study for hurricane risk to residential buildings in eastern North Carolina; present the results from the perspectives of all stakeholders: primary insurers, homeowners (insured and uninsured), and reinsurers; and examine the effect of key parameters on the results. © 2014 Society for Risk Analysis.

  12. Quantile regression via vector generalized additive models.

    PubMed

    Yee, Thomas W

    2004-07-30

    One of the most popular methods for quantile regression is the LMS method of Cole and Green. The method naturally falls within a penalized likelihood framework, and consequently allows for considerable flexibility because all three parameters may be modelled by cubic smoothing splines. The model is also very understandable: for a given value of the covariate, the LMS method applies a Box-Cox transformation to the response in order to transform it to standard normality; to obtain the quantiles, an inverse Box-Cox transformation is applied to the quantiles of the standard normal distribution. The purposes of this article are three-fold. Firstly, LMS quantile regression is presented within the framework of the class of vector generalized additive models. This confers a number of advantages such as a unifying theory and estimation process. Secondly, a new LMS method based on the Yeo-Johnson transformation is proposed, which has the advantage that the response is not restricted to be positive. Lastly, this paper describes a software implementation of three LMS quantile regression methods in the S language. This includes the LMS-Yeo-Johnson method, which is estimated efficiently by a new numerical integration scheme. The LMS-Yeo-Johnson method is illustrated by way of a large cross-sectional data set from a New Zealand working population. Copyright 2004 John Wiley & Sons, Ltd.
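
    To make the quantile-recovery step concrete, the following is a minimal numeric sketch of the LMS idea described above: given fitted smooth curves L(x), M(x) and S(x), a quantile is obtained by applying the inverse Box-Cox transformation to a standard normal quantile. The smooth functions below are illustrative assumptions, not output from an actual VGAM/LMS fit.

```python
# Minimal sketch of recovering quantiles from fitted LMS curves (Cole & Green style).
# lms_L, lms_M, lms_S stand in for spline fits and are purely illustrative.
import numpy as np
from scipy.stats import norm

def lms_quantile(x, alpha, lms_L, lms_M, lms_S):
    """Quantile q_alpha(x) implied by the LMS parameters at covariate value x."""
    L, M, S = lms_L(x), lms_M(x), lms_S(x)
    z = norm.ppf(alpha)                      # standard normal quantile
    if abs(L) < 1e-8:                        # limiting log-normal case (L -> 0)
        return M * np.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# Hypothetical smooth curves for illustration only.
lms_L = lambda x: 0.2 - 0.001 * x            # Box-Cox power
lms_M = lambda x: 20.0 + 0.05 * x            # median
lms_S = lambda x: 0.12                       # coefficient of variation

for a in (0.05, 0.5, 0.95):
    print(a, lms_quantile(40.0, a, lms_L, lms_M, lms_S))
```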

  13. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
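
    As a rough illustration of the embarrassingly parallel pattern the tutorial describes, the sketch below distributes independent Monte Carlo replications of a loss simulation across cores. It is written in Python rather than the article's MATLAB/R examples, and the loss model and parameter values are invented placeholders.

```python
# Hedged sketch of an embarrassingly parallel risk simulation: each replication
# is independent, so replications can simply be mapped across worker processes.
import numpy as np
from multiprocessing import Pool

def one_replication(seed):
    """One Monte Carlo replication: simulate a hypothetical loss and return it."""
    rng = np.random.default_rng(seed)
    intensity = rng.gamma(shape=2.0, scale=1.5)        # assumed hazard intensity
    loss = rng.lognormal(mean=np.log(intensity + 1.0), sigma=0.5)
    return loss

if __name__ == "__main__":
    seeds = range(10_000)                              # independent replications
    with Pool() as pool:                               # uses all available cores
        losses = pool.map(one_replication, seeds)
    print("mean loss:", np.mean(losses))
    print("95th percentile:", np.percentile(losses, 95))
```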

  14. The Society of Thoracic Surgeons Congenital Heart Surgery Database Mortality Risk Model: Part 1—Statistical Methodology

    PubMed Central

    O’Brien, Sean M.; Jacobs, Jeffrey P.; Pasquali, Sara K.; Gaynor, J. William; Karamlou, Tara; Welke, Karl F.; Filardo, Giovanni; Han, Jane M.; Kim, Sunghee; Shahian, David M.; Jacobs, Marshall L.

    2016-01-01

    Background This study’s objective was to develop a risk model incorporating procedure type and patient factors to be used for case-mix adjustment in the analysis of hospital-specific operative mortality rates after congenital cardiac operations. Methods Included were patients of all ages undergoing cardiac operations, with or without cardiopulmonary bypass, at centers participating in The Society of Thoracic Surgeons Congenital Heart Surgery Database during January 1, 2010, to December 31, 2013. Excluded were isolated patent ductus arteriosus closures in patients weighing less than or equal to 2.5 kg, centers with more than 10% missing data, and patients with missing data for key variables. Data from the first 3.5 years were used for model development, and data from the last 0.5 year were used for assessing model discrimination and calibration. Potential risk factors were proposed based on expert consensus and selected after empirically comparing a variety of modeling options. Results The study cohort included 52,224 patients from 86 centers with 1,931 deaths (3.7%). Covariates included in the model were primary procedure, age, weight, and 11 additional patient factors reflecting acuity status and comorbidities. The C statistic in the validation sample was 0.858. Plots of observed-vs-expected mortality rates revealed good calibration overall and within subgroups, except for a slight overestimation of risk in the highest decile of predicted risk. Removing patient preoperative factors from the model reduced the C statistic to 0.831 and affected the performance classification for 12 of 86 hospitals. Conclusions The risk model is well suited to adjust for case mix in the analysis and reporting of hospital-specific mortality for congenital heart operations. Inclusion of patient factors added useful discriminatory power and reduced bias in the calculation of hospital-specific mortality metrics. PMID:26245502
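
    For readers unfamiliar with the validation summaries quoted above, the following sketch computes a C statistic (discrimination) and an observed-versus-expected mortality comparison by risk decile (calibration) for a binary outcome. The predicted risks and outcomes are simulated placeholders, not STS registry data.

```python
# Hedged sketch of discrimination and calibration checks for a mortality risk model.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
p_pred = rng.beta(1, 25, size=5000)                   # simulated predicted mortality risks
died = rng.binomial(1, p_pred)                        # simulated observed outcomes

# For a binary outcome, the area under the ROC curve equals the C statistic.
print("C statistic:", roc_auc_score(died, p_pred))

# Observed vs expected mortality within deciles of predicted risk.
cuts = np.quantile(p_pred, np.linspace(0.1, 0.9, 9))
deciles = np.digitize(p_pred, cuts)
for d in range(10):
    mask = deciles == d
    print(f"decile {d}: expected={p_pred[mask].mean():.4f} observed={died[mask].mean():.4f}")
```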

  15. Shape of the BMI-mortality association by cause of death, using generalized additive models: NHIS 1986-2006.

    PubMed

    Zajacova, Anna; Burgard, Sarah A

    2012-03-01

    Numerous studies have examined the association between body mass index (BMI) and mortality. The precise shape of their association, however, has not been established. We use nonparametric methods to determine the relationship between BMI and mortality. Data from the National Health Interview Survey-Linked Mortality Files 1986-2006 for adults aged 50 to 80 are analyzed using a Poisson approach to survival modeling within the generalized additive model (GAM) framework. The BMI-mortality association is more V shaped than U shaped, with the odds of dying rising steeply from the lowest risk point at BMIs of 23 to 26. The association varies considerably by time since interview and cause of death. For instance, the association has an inverted J shape for respiratory causes but is monotonically increasing for diabetes deaths. Our findings have implications for interpreting results from BMI-mortality studies and suggest caution in translating the findings into public health messages.
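
    The sketch below illustrates the general flavour of such a fit: a Poisson regression with a flexible spline term in BMI, which approximates a GAM smooth. It uses a simulated V-shaped risk curve rather than NHIS data, and statsmodels/patsy as an assumed toolchain; it is not the authors' analysis.

```python
# Rough sketch of a smooth (GAM-like) Poisson fit of mortality on BMI using a
# B-spline basis. Data and the V-shaped log-rate are illustrative assumptions.
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix, build_design_matrices

rng = np.random.default_rng(1)
bmi = rng.uniform(17, 45, size=4000)
log_rate = -4.0 + 0.02 * np.abs(bmi - 24.5)           # assumed V-shaped log mortality rate
deaths = rng.poisson(np.exp(log_rate))                 # simulated event counts

X = dmatrix("bs(bmi, df=6)", {"bmi": bmi}, return_type="dataframe")
fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()

# Predict the fitted rate along a BMI grid, reusing the training design info
# so the spline knots stay consistent.
grid = {"bmi": np.linspace(18, 44, 6)}
Xg = build_design_matrices([X.design_info], grid, return_type="dataframe")[0]
print(fit.predict(Xg))
```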

  16. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  17. Network Reconstruction Using Nonparametric Additive ODE Models

    PubMed Central

    Henderson, James; Michailidis, George

    2014-01-01

    Network representations of biological systems are widespread and reconstructing unknown networks from data is a focal problem for computational biologists. For example, the series of biochemical reactions in a metabolic pathway can be represented as a network, with nodes corresponding to metabolites and edges linking reactants to products. In a different context, regulatory relationships among genes are commonly represented as directed networks with edges pointing from influential genes to their targets. Reconstructing such networks from data is a challenging problem receiving much attention in the literature. There is a particular need for approaches tailored to time-series data and not reliant on direct intervention experiments, as the former are often more readily available. In this paper, we introduce an approach to reconstructing directed networks based on dynamic systems models. Our approach generalizes commonly used ODE models based on linear or nonlinear dynamics by extending the functional class for the functions involved from parametric to nonparametric models. Concomitantly, we limit the complexity by imposing an additive structure on the estimated slope functions. Thus the submodel associated with each node is a sum of univariate functions. These univariate component functions form the basis for a novel coupling metric that we define in order to quantify the strength of proposed relationships and hence rank potential edges. We show the utility of the method by reconstructing networks using simulated data from computational models for the glycolytic pathway of Lactococcus lactis and a gene network regulating the pluripotency of mouse embryonic stem cells. For purposes of comparison, we also assess reconstruction performance using gene networks from the DREAM challenges. We compare our method to those that similarly rely on dynamic systems models and use the results to attempt to disentangle the distinct roles of linearity, sparsity, and derivative

  18. Risk Factors for the Development of Heterotopic Ossification in Seriously Burned Adults: A NIDRR Burn Model System Database Analysis

    PubMed Central

    Levi, Benjamin; Jayakumar, Prakash; Giladi, Avi; Jupiter, Jesse B.; Ring, David C.; Kowalske, Karen; Gibran, Nicole S.; Herndon, David; Schneider, Jeffrey C.; Ryan, Colleen M.

    2015-01-01

    Purpose Heterotopic ossification (HO) is a debilitating complication of burn injury; however, its incidence and risk factors are poorly understood. In this study we utilize a multicenter database of adults with burn injuries to identify and analyze clinical factors that predict HO formation. Methods Data from 6 high-volume burn centers in the Burn Injury Model System Database were analyzed. Univariate logistic regression models were used for model selection. Cluster-adjusted multivariate logistic regression was then used to evaluate the relationship between clinical and demographic data and the development of HO. Results Of the 2,979 patients in the database with information on HO, 98 (3.5%) developed HO. Of these 98 patients, 97 had arm burns, and 96 had arm grafts. Controlling for age and sex in a multivariate model, patients with >30% total body surface area (TBSA) burn had 11.5x higher odds of developing HO (p<0.001), and those with arm burns that required skin grafting had 96.4x higher odds of developing HO (p=0.04). For each additional time a patient went to the operating room, the odds of HO increased by 30% (OR 1.32, p<0.001), and each additional ventilator day increased the odds by 3.5% (OR 1.035, p<0.001). Joint contracture, inhalation injury, and bone exposure did not significantly increase the odds of HO. Conclusion Risk factors for HO development include >30% TBSA burn, arm burns, arm grafts, ventilator days, and number of trips to the operating room. Future studies can use these results to identify the highest-risk patients to guide deployment of prophylactic and experimental treatments. PMID:26496115

  19. Initial Risk Analysis and Decision Making Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
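
    As a purely illustrative sketch of the general approach described above (not the CCSI prototype itself), the example below propagates uncertainty in a few technical and financial factors through a net-present-value calculation and measures risk as variability in expected net returns. All distributions and numbers are assumptions for demonstration.

```python
# Hedged sketch: Monte Carlo propagation of uncertain factors to a profitability
# distribution, with risk summarized by the spread of net returns.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

capture_efficiency = rng.normal(0.90, 0.03, n).clip(0.7, 0.99)   # assumed technical factor
capital_cost = rng.lognormal(np.log(600e6), 0.15, n)             # assumed retrofit cost ($)
carbon_price = rng.triangular(20, 40, 80, n)                     # assumed $/tonne CO2
tonnes_captured = 3.5e6 * capture_efficiency                     # tonnes per year (assumed)

annual_revenue = carbon_price * tonnes_captured
npv = -capital_cost + sum(annual_revenue / (1.06 ** t) for t in range(1, 21))

print("expected NPV ($M):", npv.mean() / 1e6)
print("std of NPV ($M):", npv.std() / 1e6)
print("probability of loss:", (npv < 0).mean())
```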

  20. 46 CFR 308.502 - Additional insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional insurance. 308.502 Section 308.502 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance I-Introduction § 308.502 Additional insurance. The assured may place increased value or...

  1. 46 CFR 308.502 - Additional insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional insurance. 308.502 Section 308.502 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance I-Introduction § 308.502 Additional insurance. The assured may place increased value or...

  2. 46 CFR 308.502 - Additional insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional insurance. 308.502 Section 308.502 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance I-Introduction § 308.502 Additional insurance. The assured may place increased value or...

  3. 46 CFR 308.502 - Additional insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional insurance. 308.502 Section 308.502 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance I-Introduction § 308.502 Additional insurance. The assured may place increased value or...

  4. Modelling West Nile virus transmission risk in Europe: effect of temperature and mosquito biotypes on the basic reproduction number.

    PubMed

    Vogels, Chantal B F; Hartemink, Nienke; Koenraadt, Constantianus J M

    2017-07-10

    West Nile virus (WNV) is a mosquito-borne flavivirus which has caused repeated outbreaks in humans in southern and central Europe, but thus far not in northern Europe. The main mosquito vector for WNV, Culex pipiens, consists of two behaviourally distinct biotypes, pipiens and molestus, which can form hybrids. Differences between biotypes, such as vector competence and host preference, could be important in determining the risk of WNV outbreaks. Risks for WNV establishment can be modelled with basic reproduction number (R0) models. However, existing R0 models have not differentiated between biotypes. The aim of this study was, therefore, to explore the role of temperature-dependent and biotype-specific effects on the risk of WNV establishment in Europe. We developed an R0 model with temperature-dependent and biotype-specific parameters, and calculated R0 values using the next-generation matrix for several scenarios relevant for Europe. In addition, elasticity analysis was done to investigate the contribution of each biotype to R0. Global warming and increased mosquito-to-host ratios can possibly result in more intense WNV circulation in birds and spill-over to humans in northern Europe. Different contributions of the Cx. pipiens biotypes to R0 show the importance of including biotype-specific parameters in models for reliable WNV risk assessments.
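
    To make the next-generation-matrix step concrete, the sketch below assembles a simple 3x3 matrix for one host and two vector biotypes and takes its spectral radius as R0. The Ross-Macdonald-style entries and all parameter values are assumptions for illustration, not the published model or its parameterization.

```python
# Hedged sketch of an R0 computation via the next-generation matrix for a host
# and two mosquito biotypes with different competence and biting parameters.
import numpy as np

def r0_two_biotypes(ratio, bite, comp, mu, recovery):
    """ratio, bite, comp, mu are per-biotype arrays; recovery is the host recovery rate."""
    K = np.zeros((3, 3))                      # order: host, biotype 1, biotype 2
    for i in (0, 1):
        # new vector infections produced by one infectious host over its infectious period
        K[1 + i, 0] = ratio[i] * bite[i] * comp[i] / recovery
        # new host infections produced by one infectious vector over its lifetime
        K[0, 1 + i] = bite[i] * comp[i] / mu[i]
    return max(abs(np.linalg.eigvals(K)))     # spectral radius = R0

print(r0_two_biotypes(ratio=np.array([5.0, 2.0]),     # mosquitoes per host (assumed)
                      bite=np.array([0.15, 0.10]),    # bites per day (assumed)
                      comp=np.array([0.30, 0.20]),    # vector competence (assumed)
                      mu=np.array([0.10, 0.12]),      # vector mortality rate (assumed)
                      recovery=0.25))                 # host recovery rate (assumed)
```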

  5. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    PubMed Central

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2017-01-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  6. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    NASA Astrophysics Data System (ADS)

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  7. Adoption of Building Information Modelling in project planning risk management

    NASA Astrophysics Data System (ADS)

    Mering, M. M.; Aminudin, E.; Chai, C. S.; Zakaria, R.; Tan, C. S.; Lee, Y. Y.; Redzuan, A. A.

    2017-11-01

    Efficient and effective risk management requires a systematic and proper methodology in addition to knowledge and experience. However, if risk management is not addressed from the start of the project, this duty becomes notably complicated and no longer efficient. This paper presents the adoption of Building Information Modelling (BIM) in project planning risk management. The objectives are to identify traditional risk management practices and their functions, to determine the most useful functions of BIM in risk management, and to investigate the efficiency of adopting BIM-based risk management during the project planning phase. A quantitative approach was adopted to obtain data. Based on the data analysis, lack of compliance with project requirements and failure to recognise risks and develop responses to opportunities are the risks that occur when traditional risk management is implemented. When BIM is used in project planning, tracking of cost control and cash flow helps the project cycle be completed on time; 5D cost estimation and cash flow modelling benefit risk management by supporting reasonable planning, control and management of budget and cost. Two factors benefited most from BIM-based technology: the formwork plan with an integrated fall-protection plan, and the design-for-safety model check. By adopting risk management, potential risks linked to a project can be identified and responses developed so that they are reduced to an acceptable extent; this means recognizing potential risks and avoiding threats by reducing their negative effects. BIM-based risk management can enhance the planning process of construction projects and benefits construction players in various aspects. It is important to understand the application of BIM-based risk management, as it can serve as a lesson learnt for others implementing BIM and increase the quality of the project.

  8. Fine-mapping additive and dominant SNP effects using group-LASSO and Fractional Resample Model Averaging

    PubMed Central

    Sabourin, Jeremy; Nobel, Andrew B.; Valdar, William

    2014-01-01

    Genomewide association studies sometimes identify loci at which both the number and identities of the underlying causal variants are ambiguous. In such cases, statistical methods that model effects of multiple SNPs simultaneously can help disentangle the observed patterns of association and provide information about how those SNPs could be prioritized for follow-up studies. Current multi-SNP methods, however, tend to assume that SNP effects are well captured by additive genetics; yet when genetic dominance is present, this assumption translates to reduced power and faulty prioritizations. We describe a statistical procedure for prioritizing SNPs at GWAS loci that efficiently models both additive and dominance effects. Our method, LLARRMA-dawg, combines a group LASSO procedure for sparse modeling of multiple SNP effects with a resampling procedure based on fractional observation weights; it estimates for each SNP the robustness of association with the phenotype both to sampling variation and to competing explanations from other SNPs. In producing a SNP prioritization that best identifies underlying true signals, we show that: our method easily outperforms a single marker analysis; when additive-only signals are present, our joint model for additive and dominance is equivalent to or only slightly less powerful than modeling additive-only effects; and, when dominance signals are present, even in combination with substantial additive effects, our joint model is unequivocally more powerful than a model assuming additivity. We also describe how performance can be improved through calibrated randomized penalization, and discuss how dominance in ungenotyped SNPs can be incorporated through either heterozygote dosage or multiple imputation. PMID:25417853
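
    The coding step described above can be illustrated with a short sketch: genotypes coded 0/1/2 are expanded into one additive column and one dominance (heterozygote indicator) column per SNP, and the two columns per SNP form a natural group for a grouped penalty. Because a true group LASSO with fractional resampling is not shown here, a plain LASSO is used as a simplified, clearly-labelled stand-in; all data are simulated.

```python
# Hedged sketch of additive + dominance SNP coding with a sparse penalty.
# A plain LASSO stands in for the paper's group LASSO / resampling procedure.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
n, p = 500, 20
genotypes = rng.integers(0, 3, size=(n, p))           # 0/1/2 minor-allele counts

additive = genotypes - 1.0                             # centred additive coding
dominance = (genotypes == 1).astype(float)             # heterozygote indicator
X = np.hstack([additive, dominance])                   # columns [add_1..add_p, dom_1..dom_p]

# Hypothetical phenotype with one additive and one dominant signal.
y = 0.8 * additive[:, 3] + 0.6 * dominance[:, 7] + rng.normal(0, 1, n)

coefs = Lasso(alpha=0.05).fit(X, y).coef_
for j in range(p):
    if abs(coefs[j]) > 1e-6 or abs(coefs[p + j]) > 1e-6:
        print(f"SNP {j}: additive={coefs[j]:.3f} dominance={coefs[p + j]:.3f}")
```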

  9. Job strain (demands and control model) as a predictor of cardiovascular risk factors among petrochemical personnel

    PubMed Central

    Habibi, Ehsanollah; Poorabdian, Siamak; Shakerian, Mahnaz

    2015-01-01

    Background: One of the practical models for the assessment of stressful working conditions due to job strain is job demand and control model, which explains how physical and psychological adverse consequences, including cardiovascular risk factors can be established due to high work demands (the amount of workload, in addition to time limitations to complete that work) and low control of the worker on his/her work (lack of decision making) in the workplace. The aim of this study was to investigate how certain cardiovascular risk factors (including body mass index [BMI], heart rate, blood pressure, cholesterol and smoking) and the job demand and job control are related to each other. Materials and Methods: This prospective cohort study was conducted on 500 workers of the petrochemical industry in south of Iran, 2009. The study population was selected using simple random statistical method. They completed job demand and control questionnaire. The cardiovascular risk factors data was extracted from the workers hygiene profiles. Chi-square (χ2) test and hypothesis test (η) were used to assess the possible relationship between different quantified variables, individual demographic and cardiovascular risk factors. Results: The results of this study revealed that a significant relationship can be found between job demand control model and cardiovascular risk factors. Chi-square test result for the heart rate showed the highest (χ2 = 145.078) relationship, the corresponding results for smoking and BMI were χ2 = 85.652 and χ2 = 30.941, respectively. Subsequently, hypothesis testing results for cholesterol and hypertension was 0.469 and 0.684, respectively. Discussion: Job strain is likely to be associated with an increased risk of cardiovascular risk factors among male staff in a petrochemical company in Iran. The parameters illustrated in the Job demands and control model can act as acceptable predictors for the probability of job stress occurrence followed by showing

  10. MeProRisk - a Joint Venture for Minimizing Risk in Geothermal Reservoir Development

    NASA Astrophysics Data System (ADS)

    Clauser, C.; Marquart, G.

    2009-12-01

    Exploration and development of geothermal reservoirs for the generation of electric energy involve high engineering and economic risks due to the need for 3-D geophysical surface surveys and deep boreholes. The MeProRisk project provides a strategy guideline for reducing these risks by combining cross-disciplinary information from different specialists: scientists from three German universities and two private companies contribute new methods in seismic modeling and interpretation, numerical reservoir simulation, estimation of petrophysical parameters, and 3-D visualization. The approach chosen in MeProRisk is to treat the prospecting and development of geothermal reservoirs as an iterative process. A first conceptual model for fluid flow and heat transport simulation can be developed based on the limited initial information available on geology and rock properties. In the next step, additional data is incorporated, based on (a) new seismic interpretation methods designed for delineating fracture systems, (b) statistical studies on large numbers of rock samples for estimating reliable rock parameters, and (c) in situ estimates of the hydraulic conductivity tensor. This results in a continuous refinement of the reservoir model, where inverse modelling of fluid flow and heat transport allows inferring the uncertainty and resolution of the model at each iteration step. This finally yields a calibrated reservoir model which may be used to direct further exploration by optimizing additional borehole locations, estimate the uncertainty of key operational and economic parameters, and optimize the long-term operation of a geothermal reservoir.

  11. Two risk score models for predicting incident Type 2 diabetes in Japan.

    PubMed

    Doi, Y; Ninomiya, T; Hata, J; Hirakawa, Y; Mukai, N; Iwase, M; Kiyohara, Y

    2012-01-01

    Risk scoring methods are effective for identifying persons at high risk of Type 2 diabetes mellitus, but such approaches have not yet been established in Japan. A total of 1935 subjects of a derivation cohort were followed up for 14 years from 1988 and 1147 subjects of a validation cohort independent of the derivation cohort were followed up for 5 years from 2002. Risk scores were estimated based on the coefficients (β) of a Cox proportional hazards model in the derivation cohort and were verified in the validation cohort. In the derivation cohort, the non-invasive risk model was established using significant risk factors; namely, age, sex, family history of diabetes, abdominal circumference, body mass index, hypertension, regular exercise and current smoking. We also created another scoring risk model by adding fasting plasma glucose levels to the non-invasive model (plus-fasting plasma glucose model). The area under the curve of the non-invasive model was 0.700 and it increased significantly to 0.772 (P < 0.001) in the plus-fasting plasma glucose model. The ability of the non-invasive model to predict Type 2 diabetes was comparable with that of impaired glucose tolerance, and the plus-fasting plasma glucose model was superior to it. The cumulative incidence of Type 2 diabetes was significantly increased with elevating quintiles of the sum scores of both models in the validation cohort (P for trend < 0.001). We developed two practical risk score models for easily identifying individuals at high risk of incident Type 2 diabetes without an oral glucose tolerance test in the Japanese population. © 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
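
    The general idea of turning regression coefficients into a practical point score, and then checking discrimination with the area under the ROC curve, can be sketched as follows. The coefficients, risk factors and simulated cohort below are invented placeholders, not the values estimated in this study.

```python
# Hedged sketch: rescale assumed beta coefficients to integer points, sum them
# into a risk score, and evaluate discrimination with the AUC.
import numpy as np
from sklearn.metrics import roc_auc_score

betas = {"age>=60": 0.55, "family_history": 0.70, "bmi>=25": 0.45,
         "hypertension": 0.40, "no_exercise": 0.30, "smoking": 0.25}   # assumed
points = {k: round(v / 0.05) for k, v in betas.items()}                # integer points

rng = np.random.default_rng(3)
n = 2000
factors = {k: rng.binomial(1, 0.3, n) for k in betas}                  # simulated risk factors
score = sum(points[k] * factors[k] for k in betas)

# Simulate incident diabetes with risk increasing in the linear predictor.
lp = sum(betas[k] * factors[k] for k in betas) - 2.5
incident = rng.binomial(1, 1 / (1 + np.exp(-lp)))

print("AUC of the point score:", roc_auc_score(incident, score))
```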

  12. Dynamical modeling approach to risk assessment for radiogenic leukemia among astronauts engaged in interplanetary space missions.

    PubMed

    Smirnova, Olga A; Cucinotta, Francis A

    2018-02-01

    A recently developed, biologically motivated dynamical model for the assessment of the excess relative risk (ERR) for radiogenic leukemia among acutely/continuously irradiated humans (Smirnova, 2015, 2017) is applied to estimate the ERR for radiogenic leukemia among astronauts engaged in long-term interplanetary space missions. Numerous scenarios of space radiation exposure during space missions are used in the modeling studies. The dependence of the ERR for leukemia among astronauts on several mission parameters, including the dose equivalent rates of galactic cosmic rays (GCR) and large solar particle events (SPEs), the number of large SPEs, the time interval between SPEs, mission duration, the degree of astronauts' additional shielding during SPEs, the degree of their additional 12-hour daily shielding, as well as the total mission dose equivalent, is examined. The results of the estimation of ERR for radiogenic leukemia among astronauts, obtained in the framework of the developed dynamical model for various scenarios of space radiation exposure, are compared with the corresponding results computed by the commonly used linear model. It is revealed that the developed dynamical model, along with the linear model, can be applied to estimate ERR for radiogenic leukemia among astronauts engaged in long-term interplanetary space missions within the range of applicability of the latter. In turn, the developed dynamical model is capable of predicting the ERR for leukemia among astronauts for irradiation regimes beyond the applicability range of the linear model, in emergency cases. As a supplement to the estimations of cancer incidence and death (REIC and REID) (Cucinotta et al., 2013, 2017), the developed dynamical model for the assessment of the ERR for leukemia can be employed in the pre-mission design phase for, e.g., the optimization of the regimes of astronauts' additional shielding in the course of interplanetary space missions. The developed model can

  13. An Integrated Risk Management Model for Source Water Protection Areas

    PubMed Central

    Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien

    2012-01-01

    Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that use water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for the maintenance of drinking water safe for humans. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water quality. In this study, a quantitative risk model that uses hazard quotients for each water quality parameter was combined with a qualitative risk model that uses the relative risk level of potential pollution events in order to characterize the current condition and potential risk of watersheds providing drinking water. In a case study of Taipei Source Water Area in northern Taiwan, total coliforms and total phosphorus were the top two pollutants of concern. Intensive tea-growing and recreational activities around the riparian zone may contribute the greatest pollution to the watershed. Our risk assessment tool may be enhanced by developing, recording, and updating information on pollution sources in the water supply watersheds. Moreover, management authorities could use the resultant information to create watershed risk management plans. PMID:23202770
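
    The quantitative part of the model can be illustrated with a hazard-quotient sketch: each water quality parameter's measured concentration is divided by its benchmark, and quotients above 1 flag parameters of concern. The parameters, values and benchmarks below are illustrative assumptions, not Taipei Source Water Area data.

```python
# Hedged sketch of hazard quotients (measured concentration / benchmark) for
# screening water quality parameters in a source water protection area.
measured = {"total_coliforms_CFU_100mL": 240.0, "total_phosphorus_mg_L": 0.035,
            "ammonia_N_mg_L": 0.05, "turbidity_NTU": 2.0}
benchmark = {"total_coliforms_CFU_100mL": 100.0, "total_phosphorus_mg_L": 0.02,
             "ammonia_N_mg_L": 0.10, "turbidity_NTU": 4.0}

for param, value in measured.items():
    hq = value / benchmark[param]                      # hazard quotient
    flag = "exceeds benchmark" if hq > 1 else "ok"
    print(f"{param}: HQ = {hq:.2f} ({flag})")
```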

  14. People's Risk Recognition Preceding Evacuation and Its Role in Demand Modeling and Planning.

    PubMed

    Urata, Junji; Pel, Adam J

    2018-05-01

    Evacuation planning and management involves estimating the travel demand in the event that such action is required. This is usually done as a function of people's decision to evacuate, which we show is strongly linked to their risk awareness. We use an empirical data set, which shows tsunami evacuation behavior, to demonstrate that risk recognition is not synonymous with objective risk, but is instead determined by a combination of factors including risk education, information, and sociodemographics, and that it changes dynamically over time. Based on these findings, we formulate an ordered logit model to describe risk recognition combined with a latent class model to describe evacuation choices. Our proposed evacuation choice model along with a risk recognition class can evaluate quantitatively the influence of disaster mitigation measures, risk education, and risk information. The results obtained from the risk recognition model show that risk information has a greater impact in the sense that people recognize their high risk. The results of the evacuation choice model show that people who are unaware of their risk take a longer time to evacuate. © 2017 Society for Risk Analysis.

  15. Literature Review on Modeling Cyber Networks and Evaluating Cyber Risks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Campbell, Philip L

    The National Infrastructure Simulations and Analysis Center (NISAC) conducted a literature review on modeling cyber networks and evaluating cyber risks. The literature review explores where modeling is used in the cyber regime and ways that consequence and risk are evaluated. The relevant literature clusters in three different spaces: network security, cyber-physical, and mission assurance. In all approaches, some form of modeling is utilized at varying levels of detail, while the ability to understand consequence varies, as do interpretations of risk. This document summarizes the different literature viewpoints and explores their applicability to securing enterprise networks.

  16. Non-Targeted Effects Models Predict Significantly Higher Mars Mission Cancer Risk than Targeted Effects Models

    DOE PAGES

    Cucinotta, Francis A.; Cacao, Eliedonna

    2017-05-12

    Cancer risk is an important concern for galactic cosmic ray (GCR) exposures, which consist of a wide energy range of protons, heavy ions and secondary radiation produced in shielding and tissues. Relative biological effectiveness (RBE) factors for surrogate cancer endpoints in cell culture models and tumor induction in mice vary considerably, including significant variations for different tissues and mouse strains. Many studies suggest non-targeted effects (NTE) occur for low doses of high linear energy transfer (LET) radiation, leading to deviation from the linear dose response model used in radiation protection. Using the mouse Harderian gland tumor experiment, the only extensive data set for dose response modelling with a variety of particle types (>4), for the first time a particle track structure model of tumor prevalence is used to investigate the effects of NTEs in predictions of chronic GCR exposure risk. The NTE model led to a predicted risk 2-fold higher compared to a targeted effects model. The scarcity of data with animal models for tissues that dominate human radiation cancer risk, including lung, colon, breast, liver, and stomach, suggests that studies of NTEs in other tissues are urgently needed prior to long-term space missions outside the protection of the Earth's geomagnetic sphere.

  17. Non-Targeted Effects Models Predict Significantly Higher Mars Mission Cancer Risk than Targeted Effects Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cucinotta, Francis A.; Cacao, Eliedonna

    Cancer risk is an important concern for galactic cosmic ray (GCR) exposures, which consist of a wide energy range of protons, heavy ions and secondary radiation produced in shielding and tissues. Relative biological effectiveness (RBE) factors for surrogate cancer endpoints in cell culture models and tumor induction in mice vary considerably, including significant variations for different tissues and mouse strains. Many studies suggest non-targeted effects (NTE) occur for low doses of high linear energy transfer (LET) radiation, leading to deviation from the linear dose response model used in radiation protection. Using the mouse Harderian gland tumor experiment, the only extensive data set for dose response modelling with a variety of particle types (>4), for the first time a particle track structure model of tumor prevalence is used to investigate the effects of NTEs in predictions of chronic GCR exposure risk. The NTE model led to a predicted risk 2-fold higher compared to a targeted effects model. The scarcity of data with animal models for tissues that dominate human radiation cancer risk, including lung, colon, breast, liver, and stomach, suggests that studies of NTEs in other tissues are urgently needed prior to long-term space missions outside the protection of the Earth's geomagnetic sphere.

  18. The Climate-Agriculture-Modeling and Decision Tool (CAMDT) for Climate Risk Management in Agriculture

    NASA Astrophysics Data System (ADS)

    Ines, A. V. M.; Han, E.; Baethgen, W.

    2017-12-01

    Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk management associated with inter-annual climate variability. In spite of the popular use of crop simulation models in addressing climate risk problems, the models cannot readily take seasonal climate predictions issued in the format of tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the climate information can be further translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF allows simulating "what-if" scenarios with different crop choices or management practices and better informs decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs with the DSSAT-CSM-Rice model to guide decision makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. The CAMDT has functionality to disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or a non-parametric disaggregation method) and to run DSSAT-CSM-Rice with the disaggregated weather realizations. The convenient graphical user interface allows easy implementation of several "what-if" scenarios for non-technical users and visualizes the results of the scenario runs. In addition, the CAMDT also translates crop model outputs into economic terms once the user provides the expected crop price and cost. The CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, and has great flexibility for being adapted to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT

  19. Stochastic Watershed Models for Risk Based Decision Making

    NASA Astrophysics Data System (ADS)

    Vogel, R. M.

    2017-12-01

    Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) which could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation

  20. A risk evaluation model and its application in online retailing trustfulness

    NASA Astrophysics Data System (ADS)

    Ye, Ruyi; Xu, Yingcheng

    2017-08-01

    Building a general risk evaluation model in advance can improve the convenience, standardization and comparability of repeated risk evaluations when those evaluations concern the same area and serve a similar purpose. One of the most convenient and common risk evaluation models is an index system consisting of several indices, corresponding weights and a scoring method. This article proposes a method for building a risk evaluation index system that keeps the resulting credit score proportional to the expected risk loss, and provides an application example in online retailing.
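
    A minimal sketch of the index-system idea follows: indicator values are normalized against worst-case levels, weighted, and summed so that the resulting credit decreases as expected risk loss grows. The indicators, worst-case levels and weights below are hypothetical, not those of the article.

```python
# Hedged sketch of a weighted risk index and the corresponding trustfulness credit
# for an online retailer; all indicators and weights are illustrative assumptions.
indicators = {"complaint_rate": 0.08, "refund_delay_days": 3.0, "counterfeit_reports": 2.0}
worst_case = {"complaint_rate": 0.30, "refund_delay_days": 30.0, "counterfeit_reports": 20.0}
weights = {"complaint_rate": 0.5, "refund_delay_days": 0.2, "counterfeit_reports": 0.3}

# Normalize each indicator to [0, 1], weight it, and sum into a risk index.
risk_index = sum(weights[k] * min(indicators[k] / worst_case[k], 1.0) for k in indicators)
credit = 1.0 - risk_index                               # higher credit = lower expected loss
print(f"risk index = {risk_index:.3f}, trustfulness credit = {credit:.3f}")
```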

  1. Integrated Water and Sanitation Risk Assessment and Modeling in the Upper Sonora River basin (Northwest, Mexico)

    NASA Astrophysics Data System (ADS)

    Mayer, A. S.; Robles-Morua, A.; Halvorsen, K. E.; Vivoni, E. R.; Auer, M. T.

    2011-12-01

    explore the use of participatory modeling frameworks in less developed regions. Results indicate that respondents agreed strongly with the hydrologic and water quality modeling methodologies presented and considered the modeling results useful. Our results also show that participatory modeling approaches can have short term impacts as seen in the changes in water-related risk perceptions. In total, these projects revealed that water resources management solutions need to take into account variations across the human landscape (i.e. risk perceptions) and variations in the biophysical response of watersheds to natural phenomena (i.e. streamflow generation) and to anthropogenic activities (i.e. contaminant fate and transport). In addition, this work underscores the notion that sustainable water resources solutions need to contend with uncertainty in our understanding and predictions of human perceptions and biophysical systems.

  2. A Single-Boundary Accumulator Model of Response Times in an Addition Verification Task

    PubMed Central

    Faulkenberry, Thomas J.

    2017-01-01

    Current theories of mathematical cognition offer competing accounts of the interplay between encoding and calculation in mental arithmetic. Additive models propose that manipulations of problem format do not interact with the cognitive processes used in calculation. Alternatively, interactive models suppose that format manipulations have a direct effect on calculation processes. In the present study, we tested these competing models by fitting participants' RT distributions in an arithmetic verification task with a single-boundary accumulator model (the shifted Wald distribution). We found that in addition to providing a more complete description of RT distributions, the accumulator model afforded a potentially more sensitive test of format effects. Specifically, we found that format affected drift rate, which implies that problem format has a direct impact on calculation processes. These data give further support for an interactive model of mental arithmetic. PMID:28769853
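
    For readers unfamiliar with the shifted Wald, the sketch below fits an inverse Gaussian distribution with a location shift (a common way to represent the shifted Wald) to simulated response times using scipy. The parameter values, and the use of scipy's generic maximum-likelihood fit rather than the authors' procedure, are assumptions for illustration.

```python
# Hedged sketch of fitting a shifted Wald (shifted inverse Gaussian) to RTs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Simulated "response times": inverse Gaussian decision times plus a 0.3 s shift.
rts = stats.invgauss.rvs(mu=0.6, loc=0.3, scale=0.8, size=2000, random_state=rng)

mu_hat, loc_hat, scale_hat = stats.invgauss.fit(rts)
print(f"estimated shift (non-decision time): {loc_hat:.3f} s")
print(f"estimated mean decision time: {mu_hat * scale_hat:.3f} s")
```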

  3. Use of a Generalized Additive Model to Investigate Key Abiotic Factors Affecting Microcystin Cellular Quotas in Heavy Bloom Areas of Lake Taihu

    PubMed Central

    Tao, Min; Xie, Ping; Chen, Jun; Qin, Boqiang; Zhang, Dawen; Niu, Yuan; Zhang, Meng; Wang, Qing; Wu, Laiyan

    2012-01-01

    Lake Taihu is the third largest freshwater lake in China and is suffering from serious cyanobacterial blooms with the associated drinking water contamination by microcystin (MC) for millions of citizens. So far, most studies on MCs have been limited to two small bays, while systematic research on the whole lake is lacking. To explain the variations in MC concentrations during cyanobacterial bloom, a large-scale survey at 30 sites across the lake was conducted monthly in 2008. The health risks of MC exposure were high, especially in the northern area. Both Microcystis abundance and MC cellular quotas presented positive correlations with MC concentration in the bloom seasons, suggesting that the toxic risks during Microcystis proliferations were affected by variations in both Microcystis density and MC production per Microcystis cell. Use of a powerful predictive modeling tool named generalized additive model (GAM) helped visualize significant effects of abiotic factors related to carbon fixation and proliferation of Microcystis (conductivity, dissolved inorganic carbon (DIC), water temperature and pH) on MC cellular quotas from recruitment period of Microcystis to the bloom seasons, suggesting the possible use of these factors, in addition to Microcystis abundance, as warning signs to predict toxic events in the future. The interesting relationship between macrophytes and MC cellular quotas of Microcystis (i.e., high MC cellular quotas in the presence of macrophytes) needs further investigation. PMID:22384128

  4. Electricity market pricing, risk hedging and modeling

    NASA Astrophysics Data System (ADS)

    Cheng, Xu

    In this dissertation, we investigate the pricing, price risk hedging/arbitrage, and simplified system modeling for a centralized LMP-based electricity market. In an LMP-based market model, the full AC power flow model and the DC power flow model are most widely used to represent the transmission system. We investigate the differences of dispatching results, congestion pattern, and LMPs for the two power flow models. An appropriate LMP decomposition scheme to quantify the marginal costs of the congestion and real power losses is critical for the implementation of financial risk hedging markets. However, the traditional LMP decomposition heavily depends on the slack bus selection. In this dissertation we propose a slack-independent scheme to break LMP down into energy, congestion, and marginal loss components by analyzing the actual marginal cost of each bus at the optimal solution point. The physical and economic meanings of the marginal effect at each bus provide accurate price information for both congestion and losses, and thus the slack-dependency of the traditional scheme is eliminated. With electricity priced at the margin instead of the average value, the market operator typically collects more revenue from power sellers than that paid to power buyers. According to the LMP decomposition results, the revenue surplus is then divided into two parts: congestion charge surplus and marginal loss revenue surplus. We apply the LMP decomposition results to the financial tools, such as financial transmission right (FTR) and loss hedging right (LHR), which have been introduced to hedge against price risks associated to congestion and losses, to construct a full price risk hedging portfolio. The two-settlement market structure and the introduction of financial tools inevitably create market manipulation opportunities. We investigate several possible market manipulation behaviors by virtual bidding and propose a market monitor approach to identify and quantify such

  5. Risk assessment and remedial policy evaluation using predictive modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linkov, L.; Schell, W.R.

    1996-06-01

    As a result of nuclear industry operations and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the radionuclide fate in forest compartments after deposition as well as for evaluating the efficiency of remedial policies. Time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. Risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in environmental remediation of radionuclides and toxic metals as well as in dose reconstruction and risk assessment.

  6. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb survivor...

  7. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb survivor...

  8. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb survivor...

  9. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb survivor...

  10. Crisis and emergency risk communication as an integrative model.

    PubMed

    Reynolds, Barbara; W Seeger, Matthew

    2005-01-01

    This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.

  11. Team Risk Management: A New Model for Customer-Supplier Relationships

    DTIC Science & Technology

    1994-07-01

    Higuera, Ronald P.; Dorofee, Audrey J.; Walker, Julie A.; Williams, Ray C. (July 1994). Team Risk Management: A New Model for Customer-Supplier Relationships. Keywords: customer-supplier relationships, risk, team risk management.

  12. Determination of osteoporosis risk factors using a multiple logistic regression model in postmenopausal Turkish women.

    PubMed

    Akkus, Zeki; Camdeviren, Handan; Celik, Fatma; Gur, Ali; Nas, Kemal

    2005-09-01

    To determine the risk factors of osteoporosis using a multiple binary logistic regression method and to assess the risk variables for osteoporosis, which is a major and growing health problem in many countries. We presented a case-control study, consisting of 126 postmenopausal healthy women as the control group and 225 postmenopausal osteoporotic women as the case group. The study was carried out in the Department of Physical Medicine and Rehabilitation, Dicle University, Diyarbakir, Turkey, between 1999 and 2002. The data from the 351 participants were collected using a standard questionnaire containing 43 variables. A multiple logistic regression model was then used to evaluate the data and to find the best regression model. The regression model correctly classified 80.1% (281/351) of the participants. Furthermore, the specificity of the model was 67% (84/126) in the control group, while the sensitivity was 88% (197/225) in the case group. The distribution of the standardized residuals for the final model was found to be exponential using the Kolmogorov-Smirnov test (p=0.193). The receiver operating characteristic curve showed that the model successfully predicted patients at risk for osteoporosis. This study suggests that low levels of dietary calcium intake, physical activity, and education, and longer duration of menopause are independent predictors of the risk of low bone density in our population. Adequate dietary calcium intake in combination with maintaining daily physical activity, increasing educational level, decreasing birth rate, and duration of breast-feeding may contribute to healthy bones and play a role in practical prevention of osteoporosis in Southeast Anatolia. In addition, the findings of the present study indicate that the use of a multivariate statistical method such as multiple logistic regression in osteoporosis, which may be influenced by many variables, is better than univariate statistical evaluation.
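
    The classification figures reported above follow directly from the stated counts (281/351 = 80.1% overall, 84/126 = 67% specificity, 197/225 = 88% sensitivity). As a minimal, hypothetical sketch of the kind of multiple binary logistic regression and accuracy assessment the record describes, the following scikit-learn snippet fits such a model on synthetic data and computes the same three measures; the variable names and data are placeholders, not the study's.

        # Sketch of a multiple binary logistic regression with accuracy,
        # sensitivity, and specificity. Data and column names are hypothetical.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(0)
        n = 351
        X = pd.DataFrame({
            "calcium_intake": rng.normal(800, 200, n),
            "physical_activity": rng.normal(3, 1, n),
            "education_years": rng.normal(8, 3, n),
            "menopause_duration": rng.normal(10, 5, n),
        })
        y = (rng.random(n) < 0.64).astype(int)   # 1 = osteoporotic case, 0 = control

        model = LogisticRegression(max_iter=1000).fit(X, y)
        tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
        print("accuracy   :", (tp + tn) / n)     # study: 281/351 = 0.801
        print("sensitivity:", tp / (tp + fn))    # study: 197/225 = 0.876
        print("specificity:", tn / (tn + fp))    # study:  84/126 = 0.667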

  13. Risk assessment model for development of advanced age-related macular degeneration.

    PubMed

    Klein, Michael L; Francis, Peter J; Ferris, Frederick L; Hamon, Sara C; Clemons, Traci E

    2011-12-01

    To design a risk assessment model for development of advanced age-related macular degeneration (AMD) incorporating phenotypic, demographic, environmental, and genetic risk factors. We evaluated longitudinal data from 2846 participants in the Age-Related Eye Disease Study. At baseline, these individuals had all levels of AMD, ranging from none to unilateral advanced AMD (neovascular or geographic atrophy). Follow-up averaged 9.3 years. We performed a Cox proportional hazards analysis with demographic, environmental, phenotypic, and genetic covariates and constructed a risk assessment model for development of advanced AMD. Performance of the model was evaluated using the C statistic and the Brier score and externally validated in participants in the Complications of Age-Related Macular Degeneration Prevention Trial. The final model included the following independent variables: age, smoking history, family history of AMD (first-degree member), phenotype based on a modified Age-Related Eye Disease Study simple scale score, and genetic variants CFH Y402H and ARMS2 A69S. The model did well on performance measures, with very good discrimination (C statistic = 0.872) and excellent calibration and overall performance (Brier score at 5 years = 0.08). Successful external validation was performed, and a risk assessment tool was designed for use with or without the genetic component. We constructed a risk assessment model for development of advanced AMD. The model performed well on measures of discrimination, calibration, and overall performance and was successfully externally validated. This risk assessment tool is available for online use.
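
    For readers unfamiliar with the modelling step described above, the following is a minimal sketch of fitting a Cox proportional hazards model and reporting its concordance (C) statistic using the third-party lifelines package; it uses a bundled example dataset as a stand-in, since the AREDS data and covariates are not reproduced here.

        # Sketch of a Cox proportional hazards fit with a C statistic, analogous
        # to the AMD risk model above. Uses lifelines' bundled Rossi dataset as
        # a stand-in; the study's covariates are not available here.
        from lifelines import CoxPHFitter
        from lifelines.datasets import load_rossi

        df = load_rossi()                 # 'week' = time, 'arrest' = event indicator
        cph = CoxPHFitter()
        cph.fit(df, duration_col="week", event_col="arrest")

        cph.print_summary()               # hazard ratios and p-values
        print("C statistic:", cph.concordance_index_)   # discrimination measure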

  14. Future Bloom and Blossom Frost Risk for Malus domestica Considering Climate Model and Impact Model Uncertainties

    PubMed Central

    Hoffmann, Holger; Rath, Thomas

    2013-01-01

    The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, to 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners were determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021–2050 compared to 1971–2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence, however, it is unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078–2087. The projected phenophases advanced by 5.5 d K−1, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day. PMID:24116022

  15. Future bloom and blossom frost risk for Malus domestica considering climate model and impact model uncertainties.

    PubMed

    Hoffmann, Holger; Rath, Thomas

    2013-01-01

    The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, to 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners were determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence, however, it is unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K(-1), showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day.

  16. Revenue Risk Modelling and Assessment on BOT Highway Project

    NASA Astrophysics Data System (ADS)

    Novianti, T.; Setyawan, H. Y.

    2018-01-01

    An infrastructure project delivered as a public-private partnership under a BOT (Build-Operate-Transfer) arrangement, such as a highway, is risky. Assessment of risk factors is therefore essential, as the project has a concession period and is influenced by macroeconomic factors over that period. In this study, pre-construction risks of a highway were examined using a Delphi method to create a space for offline expert discussions; a fault tree analysis to map the intuition of experts and to build a model from the underlying risk events; and fuzzy logic to interpret the linguistic data of the risk models. The losses of revenue due to tariff risk, traffic volume risk, force majeure, and non-revenue events were then measured. The results showed that the loss of revenue caused by tariff risk was 10.5% of the normal total revenue. The loss of revenue caused by traffic volume risk was 21.0% of total revenue. The loss of revenue caused by force majeure was 12.2% of the normal income. The loss of income caused by non-revenue events was 6.9% of the normal revenue. It was also found that traffic volume was the major risk of a highway project because it relates to customer preferences.
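
    The fuzzy step described above maps linguistic risk ratings onto quantitative loss estimates. The snippet below is an illustrative sketch only, not the study's model: it defines a triangular membership function for one linguistic rating, defuzzifies it by centroid, and applies the result to a hypothetical revenue figure; every number in it is made up.

        # Illustrative fuzzy-to-quantitative sketch (hypothetical parameters):
        # triangular membership for a linguistic risk rating, centroid
        # defuzzification, and the implied revenue loss.
        import numpy as np

        def triangular(x, a, b, c):
            """Triangular membership with support [a, c] and peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        x = np.linspace(0.0, 1.0, 1001)            # normalized loss fraction
        mu = triangular(x, 0.10, 0.21, 0.30)       # e.g. "high" traffic-volume risk

        centroid = np.sum(mu * x) / np.sum(mu)     # defuzzified loss fraction
        normal_revenue = 100e6                     # hypothetical annual revenue
        print("expected revenue loss:", centroid * normal_revenue)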

  17. Water- and wastewater-related disease and infection risks: what is an appropriate value for the maximum tolerable additional burden of disease?

    PubMed

    Mara, Duncan

    2011-06-01

    The maximum additional burden of water- and wastewater-related disease of 10(-6) disability-adjusted life year (DALY) loss per person per year (pppy), used in the WHO Drinking-water Quality Guidelines and the WHO Guidelines for Wastewater Use in Agriculture, is based on US EPA's acceptance of a 70-year lifetime waterborne cancer risk of 10(-5) per person, equivalent to an annual risk of 1.4x10(-7) per person, which is four orders of magnitude lower than the actual all-cancer incidence in the USA in 2009 of 1.8x10(-3) pppy. A maximum additional burden of 10(-4) DALY loss pppy would reduce this risk to a more cost-effective, but still low, risk of 1.4x10(-5) pppy. It would increase the DALY loss pppy in low- and middle-income countries due to diarrhoeal diseases from the current level of 0.0119 pppy to 0.0120 pppy, and that due to ascariasis from 0.0026 pppy to 0.0027 pppy, but neither increase is of public-health significance. It is therefore recommended that the maximum additional burden of disease from these activities be increased to a DALY loss of 10(-4) pppy as this provides an adequate margin of public-health safety in relation to waterborne-cancer deaths, diarrhoeal disease and ascariasis in all countries.
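
    The lifetime-to-annual conversion underlying the record above is simple arithmetic; in LaTeX form (using the figures quoted in the abstract):

        % A 10^{-5} lifetime (70-year) cancer risk per person corresponds to
        \[
          \frac{10^{-5}\ \text{per person per lifetime}}{70\ \text{years}}
          \approx 1.4 \times 10^{-7}\ \text{per person per year},
        \]
        % about four orders of magnitude below the quoted 2009 US all-cancer
        % incidence of $1.8 \times 10^{-3}$ pppy; raising the tolerable burden
        % from $10^{-6}$ to $10^{-4}$ DALY loss pppy scales the corresponding
        % annual risk from $1.4 \times 10^{-7}$ to $1.4 \times 10^{-5}$ pppy.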

  18. Evaluation of a model of violence risk assessment among forensic psychiatric patients.

    PubMed

    Douglas, Kevin S; Ogloff, James R P; Hart, Stephen D

    2003-10-01

    This study tested the interrater reliability and criterion-related validity of structured violence risk judgments made by using one application of the structured professional judgment model of violence risk assessment, the HCR-20 violence risk assessment scheme, which assesses 20 key risk factors in three domains: historical, clinical, and risk management. The HCR-20 was completed for a sample of 100 forensic psychiatric patients who had been found not guilty by reason of a mental disorder and were subsequently released to the community. Violence in the community was determined from multiple file-based sources. Interrater reliability of structured final risk judgments of low, moderate, or high violence risk made on the basis of the structured professional judgment model was acceptable (weighted kappa=.61). Structured final risk judgments were significantly predictive of postrelease community violence, yielding moderate to large effect sizes. Event history analyses showed that final risk judgments made with the structured professional judgment model added incremental validity to the HCR-20 used in an actuarial (numerical) sense. The findings support the structured professional judgment model of risk assessment as well as the HCR-20 specifically and suggest that clinical judgment, if made within a structured context, can contribute in meaningful ways to the assessment of violence risk.
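
    The interrater agreement statistic quoted above (weighted kappa = .61) treats the low/moderate/high judgments as ordinal. A minimal sketch of computing such a statistic with scikit-learn is shown below; the two raters' judgments are invented for illustration.

        # Weighted kappa for two raters' structured risk judgments
        # (low / moderate / high). Ratings here are hypothetical.
        from sklearn.metrics import cohen_kappa_score

        rater_1 = ["low", "moderate", "high", "moderate", "low", "high", "moderate", "low"]
        rater_2 = ["low", "high", "high", "moderate", "low", "moderate", "moderate", "low"]

        # Map ordinal categories to integers so linear weights respect the ordering.
        order = {"low": 0, "moderate": 1, "high": 2}
        r1 = [order[r] for r in rater_1]
        r2 = [order[r] for r in rater_2]

        print("weighted kappa:", cohen_kappa_score(r1, r2, weights="linear"))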

  19. [Case study on health risk assessment based on site-specific conceptual model].

    PubMed

    Zhong, Mao-Sheng; Jiang, Lin; Yao, Jue-Jun; Xia, Tian-Xiang; Zhu, Xiao-Ying; Han, Dan; Zhang, Li-Na

    2013-02-01

    Site investigation was carried out on an area to be redeveloped as a subway station, which lies directly downstream, in terms of groundwater flow, of a former chemical plant. The results indicate that the subsurface soil and groundwater in the area are both heavily polluted by 1,2-dichloroethane originating from the chemical plant upstream; the highest concentration was 104.08 mg.kg-1 in a soil sample at 8.6 m below ground and 18500 microg.L-1 in groundwater. Further, a site-specific contamination conceptual model, taking into account the specific structural configuration of the station, was developed, and the corresponding risk calculation equation was derived. The carcinogenic risks calculated with models developed from the generic site conceptual model and those derived herein from the site-specific conceptual model were compared. Both models indicate that the carcinogenic risk is significantly higher than the acceptable level of 1 x 10(-6). The comparison reveals that the risks calculated with the former models for soil and groundwater are higher than those calculated with the latter models by 2 times and 1.5 times, respectively. The finding in this paper indicates that the generic risk assessment model may underestimate the risk if specific site conditions and structure configuration are not considered.

  20. Metal Big Area Additive Manufacturing: Process Modeling and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan; Nycz, Andrzej; Noakes, Mark W

    Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology for printing large-scale 3D objects. mBAAM is based on the gas metal arc welding process and uses a continuous feed of welding wire to manufacture an object. An electric arc forms between the wire and the substrate, which melts the wire and deposits a bead of molten metal along the predetermined path. In general, the welding process parameters and local conditions determine the shape of the deposited bead. The sequence of the bead deposition and the corresponding thermal history of the manufactured object determine the long-range effects, such as thermally induced distortions and residual stresses. Therefore, the resulting performance or final properties of the manufactured object are dependent on its geometry and the deposition path, in addition to depending on the basic welding process parameters. Physical testing is critical for gaining the necessary knowledge for quality prints, but traversing the process parameter space in order to develop an optimized build strategy for each new design is impractical by pure experimental means. Computational modeling and optimization may accelerate development of a build process strategy and save time and resources. Because computational modeling provides these opportunities, we have developed a physics-based Finite Element Method (FEM) simulation framework and numerical models to support the mBAAM process's development and design. In this paper, we performed a sequentially coupled heat transfer and stress analysis for predicting the final deformation of a small rectangular structure printed using the mild steel welding wire. Using the new simulation technologies, material was progressively added into the FEM simulation as the arc weld traversed the build path. In the sequentially coupled heat transfer and stress analysis, the heat transfer was performed to calculate the temperature evolution, which was used in a stress

  1. Generalised additive modelling approach to the fermentation process of glutamate.

    PubMed

    Liu, Chun-Bo; Li, Yun; Pan, Feng; Shi, Zhong-Ping

    2011-03-01

    In this work, generalised additive models (GAMs) were used for the first time to model the fermentation of glutamate (Glu). It was found that three fermentation parameters, fermentation time (T), dissolved oxygen (DO) and oxygen uptake rate (OUR), could capture 97% of the variance in the production of Glu during the fermentation process through a GAM model calibrated using online data from 15 fermentation experiments. This model was applied to investigate the individual and combined effects of T, DO and OUR on the production of Glu. The conditions to optimize the fermentation process were proposed based on the simulation study from this model. Results suggested that the production of Glu can reach a high level by controlling concentration levels of DO and OUR to the proposed optimization conditions during the fermentation process. The GAM approach therefore provides an alternative way to model and optimize the fermentation process of Glu. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.
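
    As a rough illustration of the GAM structure described above (a smooth term for each of T, DO and OUR), the sketch below uses the third-party pygam package on synthetic data; it is not the authors' calibrated model, and the data-generating process is invented.

        # Sketch of a GAM with smooth terms for fermentation time, dissolved
        # oxygen, and oxygen uptake rate. Synthetic data; third-party pygam.
        import numpy as np
        from pygam import LinearGAM, s

        rng = np.random.default_rng(1)
        n = 500
        T = rng.uniform(0, 40, n)           # fermentation time (h)
        DO = rng.uniform(5, 50, n)          # dissolved oxygen (%)
        OUR = rng.uniform(10, 60, n)        # oxygen uptake rate
        glu = 0.5 * T + 0.02 * DO * OUR + rng.normal(0, 2, n)   # synthetic response

        X = np.column_stack([T, DO, OUR])
        gam = LinearGAM(s(0) + s(1) + s(2)).fit(X, glu)
        gam.summary()                       # effective DoF and term significance
        print(gam.predict(X[:5]))           # predicted Glu for the first samples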

  2. Advantages of new cardiovascular risk-assessment strategies in high-risk patients with hypertension.

    PubMed

    Ruilope, Luis M; Segura, Julian

    2005-10-01

    Accurate assessment of cardiovascular disease (CVD) risk in patients with hypertension is important when planning appropriate treatment of modifiable risk factors. The causes of CVD are multifactorial, and hypertension seldom exists as an isolated risk factor. Classic models of risk assessment are more accurate than a simple counting of risk factors, but they are not generalizable to all populations. In addition, the risk associated with hypertension is graded, continuous, and independent of other risk factors, and this is not reflected in classic models of risk assessment. This article is intended to review both classic and newer models of CVD risk assessment. MEDLINE was searched for articles published between 1990 and 2005 that contained the terms cardiovascular disease, hypertension, or risk assessment. Articles describing major clinical trials, new data about cardiovascular risk, or global risk stratification were selected for review. Some patients at high long-term risk for CVD events (eg, patients aged <50 years with multiple risk factors) may go untreated because they do not meet the absolute risk-intervention threshold of 20% risk over 10 years with the classic model. Recognition of the limitations of classic risk-assessment models led to new guidelines, particularly those of the European Society of Hypertension-European Society of Cardiology. These guidelines view hypertension as one of many risk and disease factors that require treatment to decrease risk. These newer guidelines include a more comprehensive range of risk factors and more finely graded blood pressure ranges to stratify patients by degree of risk. Whether they accurately predict CVD risk in most populations is not known. Evidence from the Valsartan Antihypertensive Long-term Use Evaluation (VALUE) study, which stratified patients by several risk and disease factors, highlights the predictive value of some newer CVD risk assessments. Modern risk assessments, which include blood pressure

  3. Life stress and risk of precancerous cervical lesions: a pretest directed by the life stress model.

    PubMed

    Lovejoy, N C; Roche, N; McLean, D

    1997-01-01

    To present the results of a pilot study to pretest instruments designed to measure selected variables named in the Life Stress Model, a model of health outcomes. Additional aims were to determine the effect of completing personal risk assessments for precancerous squamous-cell intraepithelial lesions of the cervix (SIL) on receptivity to cervical cancer prevention information and to extend knowledge of stressful life events experienced by inner-city women attending high-risk health clinics. Cross-sectional. 20 adult women attending high-risk prenatal or human immunodeficiency virus (HIV) outpatient clinics in one of two New York City hospitals. Most were recovering drug abusers; half were diagnosed with HIV infections. Data were collected by self-report using standard measures. Demographics, medical histories of immunosuppressive states, and investigator-developed screening inventories of behavioral and dietary risk factors associated with SIL also were administered. Life event stressors, psychological state, social support, symptom distress, and SIL diagnosis. Instruments met acceptable psychometric standards for internal consistency, but the standard measure of stressful life events did not capture the full range of stressors experienced by this group of patients: hovering relatives, abusive spouses, HIV diagnosis, changes in welfare benefits, the HIV status of the unborn, and mandatory foster care. Although unable to recall cancer histories of family members, about half of the women who completed study instruments were interested in receiving more information about cervical cancer. Exploratory analyses suggest that women diagnosed with SIL experience more psychological distress and family dysfunction than women without SIL. Completing personal SIL risk assessments may stimulate patients' receptivity to cancer risk factor information. Health outcomes studies guided by the Life Stress Model may prove fruitful. Cancer risk factor assessments may be an important method

  4. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    The additivity model assumed that field-scale reaction properties in a sediment including surface area, reactive site concentration, and reaction rate can be predicted from field-scale grain-size distribution by linearly adding reaction properties estimated in laboratory for individual grain-size fractions. This study evaluated the additivity model in scaling mass transfer-limited, multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the rate constants for individual grain-size fractions, which were then used to predict rate-limited U(VI) desorption in the composite sediment. The result indicated that the additivity model with respect to the rate of U(VI) desorption provided a good prediction of U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The result showed that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel-size fraction (2 to 8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
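
    In generic form, the additivity model evaluated above predicts a field-scale reactive property as a mass-fraction-weighted sum over grain-size classes (symbols here are generic, not the authors' notation):

        % Field-scale property R (surface area, site concentration, or rate)
        % from laboratory values r_i for each grain-size fraction i with mass
        % fraction f_i:
        \[
          R \;=\; \sum_{i} f_i \, r_i , \qquad \sum_{i} f_i = 1 .
        \]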

  5. Acute radiation risk models

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

    Biologically motivated mathematical models, which describe the dynamics of the major hematopoietic lineages (the thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems) in acutely/chronically irradiated humans, are developed. These models are implemented as systems of nonlinear differential equations, whose variables and constant parameters have clear biological meaning. It is shown that the developed models are capable of reproducing clinical data on the dynamics of these systems in humans exposed to acute radiation as a result of incidents and accidents, as well as in humans exposed to low-level chronic radiation. Moreover, the averaged value of the "lethal" dose rates of chronic irradiation evaluated within the models of these four major hematopoietic lineages coincides with the real minimal dose rate of lethal chronic irradiation. The demonstrated ability of the models of the human thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems to predict the dynamical response of these systems to acute/chronic irradiation over wide ranges of doses and dose rates implies that these mathematical models form a universal tool for the investigation and prediction of the dynamics of the major human hematopoietic lineages for a vast pattern of irradiation scenarios. In particular, these models could be applied for the radiation risk assessment for the health of astronauts exposed to space radiation during long-term space missions, such as voyages to Mars or Lunar colonies, as well as for the health of people exposed to acute/chronic irradiation due to environmental radiological events.

  6. Modelling BSE trend over time in Europe, a risk assessment perspective.

    PubMed

    Ducrot, Christian; Sala, Carole; Ru, Giuseppe; de Koeijer, Aline; Sheridan, Hazel; Saegerman, Claude; Selhorst, Thomas; Arnold, Mark; Polak, Miroslaw P; Calavas, Didier

    2010-06-01

    BSE is a zoonotic disease that caused the emergence of variant Creutzfeldt-Jakob disease in the mid 1990s. The trend of the BSE epidemic in seven European countries was assessed and compared, using Age-Period-Cohort and Reproduction Ratio modelling applied to surveillance data from 2001-2007. A strong decline in BSE risk was observed for all countries that applied control measures during the 1990s, starting at different points in time in the different countries. Results were compared with the type and date of the BSE control measures implemented between 1990 and 2001 in each country. Results show that a ban on the feeding of meat and bone meal (MBM) to cattle alone was not sufficient to eliminate BSE. The fading out of the epidemic started shortly after the complementary measures targeted at controlling the risk in MBM. Given the long incubation period, it is still too early to estimate the additional effect of the ban on the feeding of animal protein to all farm animals that started in 2001. These results provide new insights into the risk assessment of BSE for cattle and humans, which will be especially useful in the context of possibly relaxing BSE surveillance and control measures.

  7. Risk Classification with an Adaptive Naive Bayes Kernel Machine Model.

    PubMed

    Minnier, Jessica; Yuan, Ming; Liu, Jun S; Cai, Tianxi

    2015-04-22

    Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper usage of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene set selection are derived, and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models.
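
    The snippet below is an illustrative two-stage analogue of the gene-set idea described above, not the authors' naive Bayes kernel machine estimator: stage I summarizes each gene set with kernel principal components, and stage II aggregates the summaries with a regularized logistic model. The gene-set structure and data are synthetic.

        # Two-stage gene-set sketch: kernel PCA per gene set, then regularized
        # aggregation. Synthetic SNP dosages and outcome; illustrative only.
        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n, markers_per_set, n_sets = 300, 10, 5
        gene_sets = [rng.integers(0, 3, size=(n, markers_per_set)).astype(float)
                     for _ in range(n_sets)]          # SNP dosages per gene set
        y = rng.integers(0, 2, size=n)                # disease status (synthetic)

        # Stage I: non-linear gene-set summaries via kernel PCA (RBF kernel).
        Z = np.hstack([KernelPCA(n_components=2, kernel="rbf").fit_transform(G)
                       for G in gene_sets])

        # Stage II: aggregate gene-set summaries with an L2-regularized model.
        risk_model = LogisticRegression(C=0.5, max_iter=1000).fit(Z, y)
        print(risk_model.predict_proba(Z)[:5, 1])     # in-sample risk scores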

  8. Predicting readmission risk with institution-specific prediction models.

    PubMed

    Yu, Shipeng; Farooq, Faisal; van Esbroeck, Alexander; Fung, Glenn; Anand, Vikram; Krishnapuram, Balaji

    2015-10-01

    The ability to predict patient readmission risk is extremely valuable for hospitals, especially under the Hospital Readmission Reduction Program of the Centers for Medicare and Medicaid Services, which went into effect on October 1, 2012. There is a plethora of work in the literature that deals with developing readmission risk prediction models, but most of them do not have sufficient prediction accuracy to be deployed in a clinical setting, partly because different hospitals may have different characteristics in their patient populations. We propose a generic framework for institution-specific readmission risk prediction, which takes patient data from a single institution and produces a statistical risk prediction model optimized for that particular institution and, optionally, for a specific condition. This provides great flexibility in model building, and is also able to provide institution-specific insights into its readmitted patient population. We have experimented with classification methods such as support vector machines, and prognosis methods such as the Cox regression. We compared our methods with industry-standard methods such as the LACE model, and showed the proposed framework is not only more flexible but also more effective. We applied our framework to patient data from three hospitals, and obtained some initial results for heart failure (HF), acute myocardial infarction (AMI), pneumonia (PN) patients as well as patients with all conditions. On Hospital 2, the LACE model yielded AUC 0.57, 0.56, 0.53 and 0.55 for AMI, HF, PN and All Cause readmission prediction, respectively, while the proposed model yielded 0.66, 0.65, 0.63, 0.74 for the corresponding conditions, all significantly better than the LACE counterpart. The proposed models that leverage all features at discharge time are more accurate than the models that only leverage features at admission time (0.66 vs. 0.61 for AMI, 0.65 vs. 0.61 for HF, 0.63 vs. 0.56 for PN, 0.74 vs. 0.60 for All
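
    A generic sketch of the workflow described above, fitting an institution-specific model on that institution's own data and comparing its AUC against a baseline score, is given below. The data, features, and baseline score are synthetic stand-ins; the LACE index itself is not implemented.

        # Institution-specific readmission model vs. a baseline score, compared
        # by AUC. Synthetic data; the baseline is a crude stand-in, not LACE.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 2000
        X = rng.normal(size=(n, 12))                   # discharge-time features
        y = (rng.random(n) < 1 / (1 + np.exp(-X[:, 0] - 0.5 * X[:, 1]))).astype(int)
        baseline = X[:, 0] + rng.normal(0, 2, n)       # stand-in baseline score

        X_tr, X_te, y_tr, y_te, b_tr, b_te = train_test_split(
            X, y, baseline, test_size=0.3, random_state=0)

        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        print("institution-specific AUC:",
              roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
        print("baseline-score AUC      :", roc_auc_score(y_te, b_te))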

  9. Lung cancer in never smokers: Epidemiology and risk prediction models

    PubMed Central

    McCarthy, William J.; Meza, Rafael; Jeon, Jihyoun; Moolgavkar, Suresh

    2012-01-01

    In this chapter we review the epidemiology of lung cancer incidence and mortality among never smokers/nonsmokers and describe the never smoker lung cancer risk models used by CISNET modelers. Our review focuses on those influences likely to have measurable population impact on never smoker risk, such as secondhand smoke, even though the individual-level impact may be small. Occupational exposures may also contribute importantly to the population attributable risk of lung cancer. We examine the following risk factors in this chapter: age, environmental tobacco smoke, cooking fumes, ionizing radiation including radon gas, inherited genetic susceptibility, selected occupational exposures, preexisting lung disease, and oncogenic viruses. We also compare the prevalence of never smokers between the three CISNET smoking scenarios and present the corresponding lung cancer mortality estimates among never smokers as predicted by a typical CISNET model. PMID:22882894

  10. Evaluating biomarkers to model cancer risk post cosmic ray exposure.

    PubMed

    Sridharan, Deepa M; Asaithamby, Aroumougame; Blattnig, Steve R; Costes, Sylvain V; Doetsch, Paul W; Dynan, William S; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D; Peterson, Leif E; Plante, Ianik; Ponomarev, Artem L; Saha, Janapriya; Snijders, Antoine M; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contributes to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed at later time points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  11. Modeling of Near-Surface Leakage and Seepage of CO2 for Risk Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldenburg, Curtis M.; Unger, Andre A.J.

    2004-02-18

    The injection of carbon dioxide (CO2) into deep geologic carbon sequestration sites entails the risk that CO2 will leak away from the primary storage formation and migrate upwards to the unsaturated zone from which it can seep out of the ground. We have developed a coupled modeling framework called T2CA for simulating CO2 leakage and seepage in the subsurface and in the atmospheric surface layer. The results of model simulations can be used to calculate the two key health, safety, and environmental (HSE) risk drivers, namely CO2 seepage flux and near-surface CO2 concentrations. Sensitivity studies for a subsurface system with a thick unsaturated zone show limited leakage attenuation resulting in correspondingly large CO2 concentrations in the shallow subsurface. Large CO2 concentrations in the shallow subsurface present a risk to plant and tree roots, and to humans and other animals in subsurface structures such as basements or utility vaults. Whereas CO2 concentrations in the subsurface can be high, surface-layer winds reduce CO2 concentrations to low levels for the fluxes investigated. We recommend more verification and case studies be carried out with T2CA, along with the development of extensions to handle additional scenarios such as calm conditions, topographic effects, and catastrophic surface-layer discharge events.

  12. Individual-based model for radiation risk assessment

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, the respective mathematical model is used too. This approach is applied to describe the effects of low level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on mortality dynamics of those in the absence of radiation are utilized. The life span probability and life span shortening predicted by the model agree with corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe aftereffects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  13. The influence of uncertain map features on risk beliefs and perceived ambiguity for maps of modeled cancer risk from air pollution

    PubMed Central

    Myers, Jeffrey D.

    2012-01-01

    Maps are often used to convey information generated by models, for example, modeled cancer risk from air pollution. The concrete nature of images, such as maps, may convey more certainty than warranted for modeled information. Three map features were selected to communicate the uncertainty of modeled cancer risk: (a) map contours appeared in or out of focus, (b) one or three colors were used, and (c) a verbal-relative or numeric risk expression was used in the legend. Study aims were to assess how these features influenced risk beliefs and the ambiguity of risk beliefs at four assigned map locations that varied by risk level. We applied an integrated conceptual framework to conduct this full factorial experiment with 32 maps that varied by the three dichotomous features and four risk levels; 826 university students participated. Data was analyzed using structural equation modeling. Unfocused contours and the verbal-relative risk expression generated more ambiguity than their counterparts. Focused contours generated stronger risk beliefs for higher risk levels and weaker beliefs for lower risk levels. Number of colors had minimal influence. The magnitude of risk level, conveyed using incrementally darker shading, had a substantial dose-response influence on the strength of risk beliefs. Personal characteristics of prior beliefs and numeracy also had substantial influences. Bottom-up and top-down information processing suggest why iconic visual features of incremental shading and contour focus had the strongest visual influences on risk beliefs and ambiguity. Variations in contour focus and risk expression show promise for fostering appropriate levels of ambiguity. PMID:22985196

  14. ADDITIVITY ASSESSMENT OF TRIHALOMETHANE MIXTURES BY PROPORTIONAL RESPONSE ADDITION

    EPA Science Inventory

    If additivity is known or assumed, the toxicity of a chemical mixture may be predicted from the dose response curves of the individual chemicals comprising the mixture. As single chemical data are abundant and mixture data sparse, mixture risk methods that utilize single chemical...

  15. A Risk Prediction Model for In-hospital Mortality in Patients with Suspected Myocarditis

    PubMed Central

    Xu, Duo; Zhao, Ruo-Chi; Gao, Wen-Hui; Cui, Han-Bin

    2017-01-01

    TnT ≥50 μg/L = 1, cTnT <50 μg/L = 0]). The area under the receiver operating characteristic curve was 0.96 (standard error = 0.015, 95% confidence interval [CI]: 0.93-0.99). The model demonstrated that a Ccr <60 ml/min (odds ratio [OR] = 19.94, 95% CI: 5.66–70.26), an age ≥50 years (OR = 7.43, 95% CI: 2.18–25.34), VT (OR = 6.89, 95% CI: 1.86–25.44), a NYHA classification ≥3 (OR = 4.03, 95% CI: 1.13–14.32), male gender (OR = 3.48, 95% CI: 0.99–12.20), and a cTnT level ≥50 μg/L (OR = 3.10, 95% CI: 0.91–10.62) were the independent risk factors for in-hospital mortality. Conclusions: A Ccr <60 ml/min, an age ≥50 years, VT, an NYHA classification ≥3, male gender, and a cTnT level ≥50 μg/L were the independent risk factors resulting from the prediction model for in-hospital mortality in patients with suspected myocarditis. In addition, sufficient life support during the early stage of the disease might improve the prognoses of patients with suspected myocarditis with multiple risk factors for in-hospital mortality. PMID:28345541

  16. Developing a suitable model for supplier selection based on supply chain risks: an empirical study from Iranian pharmaceutical companies.

    PubMed

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new product and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies in scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to basically rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, experts' opinion acquisition, statistical analysis and also the use of MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, factors involved in supply chain risks are determined. Based on them, a framework is considered. According to the results of the statistical analysis and MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry.

  17. Using the Job Demands-Resources model to investigate risk perception, safety climate and job satisfaction in safety critical organizations.

    PubMed

    Nielsen, Morten Birkeland; Mearns, Kathryn; Matthiesen, Stig Berge; Eid, Jarle

    2011-10-01

    Using the Job Demands-Resources model (JD-R) as a theoretical framework, this study investigated the relationship between risk perception as a job demand and psychological safety climate as a job resource with regard to job satisfaction in safety critical organizations. In line with the JD-R model, it was hypothesized that high levels of risk perception are related to low job satisfaction and that a positive perception of safety climate is related to high job satisfaction. In addition, it was hypothesized that safety climate moderates the relationship between risk perception and job satisfaction. Using a sample of Norwegian offshore workers (N = 986), all three hypotheses were supported. In summary, workers who perceived high levels of risk reported lower levels of job satisfaction, whereas this effect diminished when workers perceived their safety climate as positive. Follow-up analyses revealed that this interaction was dependent on the type of risks in question. The results of this study support the JD-R model, and provide further evidence for relationships between safety-related concepts and work-related outcomes indicating that organizations should not only develop and implement sound safety procedures to reduce the effects of risks and hazards on workers, but can also enhance other areas of organizational life through a focus on safety. © 2011 The Authors. Scandinavian Journal of Psychology © 2011 The Scandinavian Psychological Associations.
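
    The moderation hypothesis described above (safety climate buffering the effect of risk perception on job satisfaction) is conventionally tested as an interaction term in a regression. A sketch with statsmodels follows; the variable names and data are hypothetical, and the coefficients are not those of the study.

        # Moderation as an interaction term: job_satisfaction regressed on
        # risk_perception, safety_climate, and their product. Synthetic data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 986
        df = pd.DataFrame({
            "risk_perception": rng.normal(0, 1, n),
            "safety_climate": rng.normal(0, 1, n),
        })
        df["job_satisfaction"] = (-0.30 * df["risk_perception"]
                                  + 0.40 * df["safety_climate"]
                                  + 0.20 * df["risk_perception"] * df["safety_climate"]
                                  + rng.normal(0, 1, n))

        # 'a * b' expands to both main effects plus the a:b interaction term.
        fit = smf.ols("job_satisfaction ~ risk_perception * safety_climate", data=df).fit()
        print(fit.params)
        print("interaction p-value:", fit.pvalues["risk_perception:safety_climate"])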

  18. Developing a Suitable Model for Supplier Selection Based on Supply Chain Risks: An Empirical Study from Iranian Pharmaceutical Companies

    PubMed Central

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new product and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies in scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to basically rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the effective factors involved in supplier selection and their priorities. This paper is based on the results of a literature review, experts’ opinion acquisition, statistical analysis and also the use of MADM models on data gathered from distributed questionnaires. The model consists of the following steps and components: first, factors involved in supply chain risks are determined. Based on them, a framework is considered. According to the results of the statistical analysis and MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry. PMID:24250442

  19. A fuzzy linguistic model for the prediction of carpal tunnel syndrome risks in an occupational environment.

    PubMed

    Bell, P M; Crumpton, L

    1997-08-01

    This research presents the development and evaluation of a fuzzy linguistic model designed to predict the risk of carpal tunnel syndrome (CTS) in an occupational setting. CTS has become one of the largest problems facing ergonomists and the medical community because it is developing in epidemic proportions within the occupational environment. In addition, practitioners are interested in identifying accurate methods for evaluating the risk of CTS in an occupational setting. It is hypothesized that many factors impact an individual's likelihood of developing CTS and its eventual development. The disparity in the occurrence of CTS among workers with similar backgrounds and work activities has confused researchers and has been a stumbling block in the development of a model for widespread use in evaluating the development of CTS. Thus this research is an attempt to develop a method that can be used to predict the likelihood of CTS risk in a variety of environments. The intent is that this model will eventually be applied in an occupational setting; model development was therefore focused on a method that provides a usable interface and whose system inputs can be obtained without the involvement of a medical practitioner. The methodology involves knowledge acquisition to identify and categorize a holistic set of risk factors spanning task-related, personal, and organizational categories. The determination of relative factor importance was accomplished using analytic hierarchy process (AHP) analysis. Finally, a mathematical representation of the CTS risk was accomplished by utilizing fuzzy set theory to quantify linguistic input parameters. An evaluation of the model, including determination of sensitivity and specificity, indicates that the results are fairly accurate and that the method has the potential for widespread use. A significant aspect of this research is the comparison of this technique to other
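
    The AHP step mentioned above derives relative factor weights from a pairwise comparison matrix, conventionally via its principal eigenvector together with a consistency check. The sketch below shows that calculation with numpy; the comparison values are invented, not those elicited in the study.

        # AHP-style weighting: principal eigenvector of a pairwise comparison
        # matrix plus a consistency ratio. Comparison values are hypothetical.
        import numpy as np

        # Pairwise comparisons of three factor groups:
        # task-related vs personal vs organizational.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w = w / w.sum()                         # normalized priority weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
        ri = 0.58                               # Saaty's random index for n = 3
        print("weights:", w)
        print("consistency ratio:", ci / ri)    # < 0.1 conventionally acceptable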

  20. A comparison of imputation techniques for handling missing predictor values in a risk model with a binary outcome.

    PubMed

    Ambler, Gareth; Omar, Rumana Z; Royston, Patrick

    2007-06-01

    Risk models that aim to predict the future course and outcome of disease processes are increasingly used in health research, and it is important that they are accurate and reliable. Most of these risk models are fitted using routinely collected data in hospitals or general practices. Clinical outcomes such as short-term mortality will be near-complete, but many of the predictors may have missing values. A common approach to dealing with this is to perform a complete-case analysis. However, this may lead to overfitted models and biased estimates if entire patient subgroups are excluded. The aim of this paper is to investigate a number of methods for imputing missing data to evaluate their effect on risk model estimation and the reliability of the predictions. Multiple imputation methods, including hot-decking and multiple imputation by chained equations (MICE), were investigated along with several single imputation methods. A large national cardiac surgery database was used to create simulated yet realistic datasets. The results suggest that complete case analysis may produce unreliable risk predictions and should be avoided. Conditional mean imputation performed well in our scenario, but may not be appropriate if using variable selection methods. MICE was amongst the best performing multiple imputation methods with regard to the quality of the predictions. Additionally, it produced the least biased estimates, with good coverage, and hence is recommended for use in practice.
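
    A minimal sketch of a chained-equations (MICE-style) imputation followed by a risk model fit is shown below, using scikit-learn's IterativeImputer on synthetic data with artificial missingness; it produces a single completed dataset, whereas full multiple imputation would repeat the step and pool the fitted models.

        # MICE-style imputation sketch with IterativeImputer, then a risk model
        # fit on the completed data. Data and missingness are synthetic.
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        n = 1000
        X = rng.normal(size=(n, 5))                      # predictors
        y = (rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)

        X_missing = X.copy()
        X_missing[rng.random(X.shape) < 0.15] = np.nan   # ~15% values missing

        imputer = IterativeImputer(max_iter=10, random_state=0)
        X_imputed = imputer.fit_transform(X_missing)     # one completed dataset

        model = LogisticRegression(max_iter=1000).fit(X_imputed, y)
        print(model.coef_)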

  1. Reducing uncertainty in risk modeling for methylmercury exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ponce, R.; Egeland, G.; Middaugh, J.

    The biomagnification and bioaccumulation of methylmercury in marine species represent a challenge for risk assessment related to the consumption of subsistence foods in Alaska. Because of the profound impact that food consumption advisories have on indigenous peoples seeking to preserve a way of life, there is a need to reduce uncertainty in risk assessment. Thus, research was initiated to reduce the uncertainty in assessing the health risks associated with the consumption of subsistence foods. Because marine subsistence foods typically contain elevated levels of methylmercury, preliminary research efforts have focused on methylmercury as the principal chemical of concern. Of particular interest are the antagonistic effects of selenium on methylmercury toxicity. Because of this antagonism, methylmercury exposure through the consumption of marine mammal meat (with high selenium) may not be as toxic as comparable exposures through other sources of dietary intake, such as in the contaminated bread episode of Iraq (containing relatively low selenium). This hypothesis is supported by animal experiments showing reduced toxicity of methylmercury associated with marine mammal meat, by the antagonistic influence of selenium on methylmercury toxicity, and by negative clinical findings in adult populations exposed to methylmercury through a marine diet not subject to industrial contamination. Exploratory model development is underway to identify potential improvements and applications of current deterministic and probabilistic models, particularly by incorporating selenium as an antagonist in risk modeling methods.

  2. Joint relative risks for estrogen receptor-positive breast cancer from a clinical model, polygenic risk score, and sex hormones.

    PubMed

    Shieh, Yiwey; Hu, Donglei; Ma, Lin; Huntsman, Scott; Gard, Charlotte C; Leung, Jessica W T; Tice, Jeffrey A; Ziv, Elad; Kerlikowske, Karla; Cummings, Steven R

    2017-11-01

    Models that predict the risk of estrogen receptor (ER)-positive breast cancers may improve our ability to target chemoprevention. We investigated the contributions of sex hormones to the discrimination of the Breast Cancer Surveillance Consortium (BCSC) risk model and a polygenic risk score comprised of 83 single nucleotide polymorphisms. We conducted a nested case-control study of 110 women with ER-positive breast cancers and 214 matched controls within a mammography screening cohort. Participants were postmenopausal and not on hormonal therapy. The associations of estradiol, estrone, testosterone, and sex hormone binding globulin with ER-positive breast cancer were evaluated using conditional logistic regression. We assessed the individual and combined discrimination of estradiol, the BCSC risk score, and polygenic risk score using the area under the receiver operating characteristic curve (AUROC). Of the sex hormones assessed, estradiol (OR 3.64, 95% CI 1.64-8.06 for top vs bottom quartile), and to a lesser degree estrone, was most strongly associated with ER-positive breast cancer in unadjusted analysis. The BCSC risk score (OR 1.32, 95% CI 1.00-1.75 per 1% increase) and polygenic risk score (OR 1.58, 95% CI 1.06-2.36 per standard deviation) were also associated with ER-positive cancers. A model containing the BCSC risk score, polygenic risk score, and estradiol levels showed good discrimination for ER-positive cancers (AUROC 0.72, 95% CI 0.65-0.79), representing a significant improvement over the BCSC risk score (AUROC 0.58, 95% CI 0.50-0.65). Adding estradiol and a polygenic risk score to a clinical risk model improves discrimination for postmenopausal ER-positive breast cancers.
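
    As an illustration of how the discrimination gain reported above is typically quantified, the sketch below compares the AUROC of a clinical score alone with a model that adds a polygenic score and a hormone level. The study used conditional logistic regression on matched case-control sets; this unmatched sketch uses synthetic stand-ins for all three predictors.

        # AUROC of a clinical score alone vs. combined with a polygenic score
        # and a hormone level. All predictors and outcomes are synthetic.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(6)
        n = 324
        clinical = rng.normal(0, 1, n)       # stand-in for the BCSC risk score
        prs = rng.normal(0, 1, n)            # stand-in polygenic risk score
        estradiol = rng.normal(0, 1, n)      # stand-in (log) estradiol level
        lin = 0.3 * clinical + 0.5 * prs + 0.6 * estradiol
        y = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

        Z = np.column_stack([clinical, prs, estradiol])
        combined = LogisticRegression().fit(Z, y)

        print("clinical-only AUROC:", roc_auc_score(y, clinical))
        print("combined AUROC     :", roc_auc_score(y, combined.predict_proba(Z)[:, 1]))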

  3. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects.

  4. Home energy efficiency and radon related risk of lung cancer: modelling study

    PubMed Central

    Milner, James; Shrubsole, Clive; Das, Payel; Jones, Benjamin; Ridley, Ian; Chalabi, Zaid; Hamilton, Ian; Armstrong, Ben; Davies, Michael

    2014-01-01

    Objective: To investigate the effect of reducing home ventilation as part of household energy efficiency measures on deaths from radon related lung cancer. Design: Modelling study. Setting: England. Intervention: Home energy efficiency interventions, motivated in part by targets for reducing greenhouse gases, which entail reduction in uncontrolled ventilation in keeping with good practice guidance. Main outcome measures: Modelled current and future distributions of indoor radon levels for the English housing stock and associated changes in life years due to lung cancer mortality, estimated using life tables. Results: Increasing the air tightness of dwellings (without compensatory purpose-provided ventilation) increased mean indoor radon concentrations by an estimated 56.6%, from 21.2 becquerels per cubic metre (Bq/m3) to 33.2 Bq/m3. After the lag in lung cancer onset, this would result in an additional annual burden of 4700 life years lost and (at peak) 278 deaths. The increases in radon levels for the millions of homes that would contribute most of the additional burden are below the threshold at which radon remediation measures are cost effective. Fitting extraction fans and trickle ventilators to restore ventilation will help offset the additional burden but only if the ventilation related energy efficiency gains are lost. Mechanical ventilation systems with heat recovery may lower radon levels and the risk of cancer while maintaining the advantage of energy efficiency for the most airtight dwellings but there is potential for a major adverse impact on health if such systems fail. Conclusion: Unless specific remediation is used, reducing the ventilation of dwellings will improve energy efficiency only at the expense of population wide adverse impact on indoor exposure to radon and risk of lung cancer. The implications of this and other consequences of changes to ventilation need to be carefully evaluated to ensure that the desirable health and environmental benefits of

  5. Antimicrobial combinations: Bliss independence and Loewe additivity derived from mechanistic multi-hit models

    PubMed Central

    Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens

    2016-01-01

    Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of bacterial populations they target. It is relevant to understand if effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. Using this bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning of the additivity terms. The choice of the additivity term is essential to determine synergy or antagonism of antimicrobials. This article is part of the themed issue ‘Evolutionary ecology of arthropod antimicrobial peptides’. PMID:27160596

  6. Method of Breast Reconstruction Determines Venous Thromboembolism Risk Better Than Current Prediction Models

    PubMed Central

    Patel, Niyant V.; Wagner, Douglas S.

    2015-01-01

    Background: Venous thromboembolism (VTE) risk models including the Davison risk score and the 2005 Caprini risk assessment model have been validated in plastic surgery patients. However, their utility and predictive value in breast reconstruction has not been well described. We sought to determine the utility of current VTE risk models in this population and the VTE rate observed in various methods of breast reconstruction. Methods: A retrospective review of breast reconstructions by a single surgeon was performed. One hundred consecutive transverse rectus abdominis myocutaneous (TRAM) patients, 100 consecutive implant patients, and 100 consecutive latissimus dorsi patients were identified over a 10-year period. Patient demographics and presence of symptomatic VTE were collected. 2005 Caprini risk scores and Davison risk scores were calculated for each patient. Results: The TRAM reconstruction group was found to have a higher VTE rate (6%) than the implant (0%) and latissimus (0%) reconstruction groups (P < 0.01). Mean Davison risk scores and 2005 Caprini scores were similar across all reconstruction groups (P > 0.1). The vast majority of patients were stratified as high risk (87.3%) by the VTE risk models. However, only TRAM reconstruction patients demonstrated significant VTE risk. Conclusions: TRAM reconstruction appears to have a significantly higher risk of VTE than both implant and latissimus reconstruction. Current risk models do not effectively stratify breast reconstruction patients at risk for VTE. The method of breast reconstruction appears to have a significant role in patients’ VTE risk. PMID:26090287

  7. Data Model for Multi Hazard Risk Assessment Spatial Support Decision System

    NASA Astrophysics Data System (ADS)

    Andrejchenko, Vera; Bakker, Wim; van Westen, Cees

    2014-05-01

    The goal of the CHANGES Spatial Decision Support System is to support end-users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The implementation of the data model is carried out entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies, with PostGIS as the database, Python for scripting, and GeoServer and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy and Poland are considered). Furthermore, the data model handles information about administrative units, projects accessible by different types of users, user-defined hazard types (floods, snow avalanches, debris flows, etc.), hazard intensity maps of different return periods, spatial probability maps, elements at risk maps (buildings, land parcels, linear features etc.), and economic and population vulnerability information dependent on the hazard type and the type of the element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of a combination of different scenarios (e.g. related to climate change, land use change or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregation of the loss and exposure using the administrative unit maps, and finally, producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and multi-criteria evaluation (SMCE). The

  8. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing, as it offers numerous advantages in terms of economic savings and minimal management effort. Although elasticity and flexibility bring tremendous benefits, cloud computing still raises many information security issues due to its unique characteristic of allowing ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses the characteristics of cloud computing not covered by the existing models.

  9. Microbial Risk Assessment

    NASA Technical Reports Server (NTRS)

    Ott, C. M.; Mena, K. D.; Nickerson, C.A.; Pierson, D. L.

    2009-01-01

    Historically, microbiological spaceflight requirements have been established in a subjective manner based upon expert opinion of both environmental and clinical monitoring results and the incidence of disease. The limited amount of data, especially from long-duration missions, has created very conservative requirements based primarily on the concentration of microorganisms. Periodic reevaluations of new data from later missions have allowed some relaxation of these stringent requirements. However, the requirements remain very conservative and subjective in nature, and the risk of crew illness due to infectious microorganisms is not well defined. The use of modeling techniques for microbial risk has been applied in the food and potable water industries and has exceptional potential for spaceflight applications. From a productivity standpoint, this type of modeling can (1) decrease unnecessary costs and resource usage and (2) prevent inadequate or inappropriate data for health assessment. In addition, a quantitative model has several advantages for risk management and communication. By identifying the variable components of the model and the knowledge associated with each component, this type of modeling can: (1) Systematically identify and close knowledge gaps, (2) Systematically identify acceptable and unacceptable risks, (3) Improve communication with stakeholders as to the reasons for resource use, and (4) Facilitate external scientific approval of the NASA requirements. The modeling of microbial risk involves the evaluation of several key factors including hazard identification, crew exposure assessment, dose-response assessment, and risk characterization. Many of these factors are similar to conditions found on Earth; however, the spaceflight environment is very specialized as the inhabitants live in a small, semi-closed environment that is often dependent on regenerative life support systems. To further complicate modeling efforts, microbial dose

  10. Climate-Agriculture-Modeling and Decision Tool for Disease (CAMDT-Disease) for seasonal climate forecast-based crop disease risk management in agriculture

    NASA Astrophysics Data System (ADS)

    Kim, K. H.; Lee, S.; Han, E.; Ines, A. V. M.

    2017-12-01

    Climate-Agriculture-Modeling and Decision Tool (CAMDT) is a decision support system (DSS) tool that aims to facilitate translations of probabilistic seasonal climate forecasts (SCF) to crop responses such as yield and water stress. Since CAMDT is a software framework connecting different models and algorithms with SCF information, it can be easily customized for different types of agriculture models. In this study, we replaced the DSSAT-CSM-Rice model originally incorporated in CAMDT with a generic epidemiological model, EPIRICE, to generate a seasonal pest outlook. The resulting CAMDT-Disease generates potential risks for selected fungal, viral, and bacterial diseases of rice over the next months by translating SCFs into agriculturally-relevant risk information. The integrated modeling procedure of CAMDT-Disease first disaggregates a given SCF using temporal downscaling methods (predictWTD or FResampler1), runs EPIRICE with the downscaled weather inputs, and finally visualizes the EPIRICE outputs as disease risk compared to that of the previous year and the 30-year-climatological average. In addition, the easy-to-use graphical user interface adopted from CAMDT allows users to simulate "what-if" scenarios of disease risks over different planting dates with given SCFs. Our future work includes the simulation of the effect of crop disease on yields through the disease simulation models with the DSSAT-CSM-Rice model, as disease remains one of the most critical yield-reducing factors in the field.

  11. Validation of visualized transgenic zebrafish as a high throughput model to assay bradycardia related cardio toxicity risk candidates.

    PubMed

    Wen, Dingsheng; Liu, Aiming; Chen, Feng; Yang, Julin; Dai, Renke

    2012-10-01

    Drug-induced QT prolongation usually leads to torsade de pointes (TdP); thus, for drugs in the early phase of development this risk should be evaluated. In the present study, we demonstrated a visualized transgenic zebrafish as an in vivo high-throughput model to assay the risk of drug-induced QT prolongation. Zebrafish larvae 48 h post-fertilization expressing green fluorescent protein in the myocardium were incubated with compounds reported to induce QT prolongation or block the human ether-a-go-go-related gene (hERG) K⁺ current. The compounds sotalol, indapamide, erythromycin, ofloxacin, levofloxacin, sparfloxacin and roxithromycin were additionally administered by microinjection into the larval yolk sac. The ventricle heart rate was recorded using the automatic monitoring system after incubation or microinjection. As a result, 14 out of 16 compounds inducing dog QT prolongation caused bradycardia in zebrafish. A similar result was observed with 21 out of 26 compounds which block hERG current. Among the 30 compounds which induced human QT prolongation, 25 caused bradycardia in this model. Thus, the risk of compounds causing bradycardia in this transgenic zebrafish correlated with that causing QT prolongation and hERG K⁺ current blockage in established models. A tendency for compounds with high logP values to carry a higher risk of QT prolongation in this model was indicated, and the insensitivity of this model to antibacterial agents was revealed. These data suggest application of this transgenic zebrafish as a high-throughput model to screen QT prolongation-related cardiotoxicity of drug candidates. Copyright © 2012 John Wiley & Sons, Ltd.

  12. Re-operative urethroplasty after failed hypospadias repair: how prior surgery impacts risk for additional complications.

    PubMed

    Snodgrass, W; Bush, N C

    2017-06-01

    (16%) without treatment, P = 0.0001. Logistic regression in 1536 patients demonstrated that each prior surgery increased the odds of subsequent urethroplasty complications 1.5-fold (OR 1.51, 95% CI 1.25-1.83), along with small glans <14 mm (OR 2.40, 95% CI 1.48-3.87), mid/proximal meatal location (OR 2.54, 95% CI 1.65-3.92), and use of pre-operative testosterone (OR 2.57, 95% CI 1.53-4.31); age and surgery type did not increase odds (AUC = 0.739). Urethroplasty complications doubled in people undergoing a second hypospadias urethroplasty compared with those undergoing primary repair. This risk increased to 40% with three or more re-operations. Logistic regression demonstrates that each surgery increases the odds for additional complications 1.5-fold. Mid/proximal meatal location, small glans <14 mm, and use of pre-operative testosterone also significantly increase odds for complications. These observations support the theory that previously operated tissues have less robust vascularity than assumed in a primary repair, and suggest additional adjunctive therapies are needed to improve wound healing in re-operations. The finding that even a single re-operative urethroplasty has twice the risk for additional complications vs. a primary repair emphasizes the need for hypospadias surgeons to 'get it right the first time'. The fact that 40% of the re-operative urethroplasties in this series followed distal repairs emphasizes that there is no 'minor' hypospadias. A single re-operative hypospadias urethroplasty has twice the risk for additional complications vs. the primary repair, which increases to 40% with three or more re-operations. These results support a theory that vascularity of penile tissues decreases with successive operations, and suggest the need for treatments to improve vascularity. The higher risk for complications during re-operative urethroplasties also emphasizes the need to get the initial repair correct. Copyright © 2016 Journal of Pediatric

  13. Bayesian Framework for Water Quality Model Uncertainty Estimation and Risk Management

    EPA Science Inventory

    A formal Bayesian methodology is presented for integrated model calibration and risk-based water quality management using Bayesian Monte Carlo simulation and maximum likelihood estimation (BMCML). The primary focus is on lucid integration of model calibration with risk-based wat...

  14. Large-scale model-based assessment of deer-vehicle collision risk.

    PubMed

    Hothorn, Torsten; Brandl, Roland; Müller, Jörg

    2012-01-01

    Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and as browsers of palatable trees have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining hunting quota. Open

  15. Large-Scale Model-Based Assessment of Deer-Vehicle Collision Risk

    PubMed Central

    Hothorn, Torsten; Brandl, Roland; Müller, Jörg

    2012-01-01

    Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer–vehicle collisions and as browsers of palatable trees have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on 74,000 deer–vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer–vehicle collisions and to investigate the relationship between deer–vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer–vehicle collisions, which allows nonlinear environment–deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new “deer–vehicle collision index” for deer management. We show that the risk of deer–vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer–vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer–vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and

  16. Sensitivity Analysis of Median Lifetime on Radiation Risks Estimates for Cancer and Circulatory Disease amongst Never-Smokers

    NASA Technical Reports Server (NTRS)

    Chappell, Lori J.; Cucinotta, Francis A.

    2011-01-01

    Radiation risks are estimated in a competing risk formalism where age or time after exposure estimates of increased risks for cancer and circulatory diseases are folded with a probability to survive to a given age. The survival function, also called the life-table, changes with calendar year, gender, smoking status and other demographic variables. An outstanding problem in risk estimation is the method of risk transfer between exposed populations and a second population where risks are to be estimated. Approaches used to transfer risks are based on: 1) multiplicative risk transfer models, in which excess risks are proportional to background disease rates; and 2) additive risk transfer models, in which excess risks are independent of background rates. In addition, a mixture model is often considered where the multiplicative and additive transfer assumptions are given weighted contributions. We studied the influence of the survival probability on the risk of exposure-induced cancer and circulatory disease morbidity and mortality in the multiplicative transfer model and the mixture model. Risks for never-smokers (NS) compared to the average U.S. population are estimated to be reduced by between 30% and 60%, depending on model assumptions. Lung cancer is the major contributor to the reduction for NS, with additional contributions from circulatory diseases and cancers of the stomach, liver, bladder, oral cavity, esophagus, colon, a portion of the solid cancer remainder, and leukemia. Greater improvements in risk estimates for NS are possible, and would depend on improved understanding of risk transfer models and on elucidating the role of space radiation in the various stages of disease formation (e.g. initiation, promotion, and progression).
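    For reference, one common way to write the transfer assumptions described above (an illustrative form, not quoted from the report): if \(\lambda_0(a)\) is the background disease rate in the population where risks are to be estimated, and ERR and EAR are the excess relative and absolute risks derived from the exposed population for dose D, a weighted mixture transfer is

    \[
    \lambda(a) = \lambda_0(a)\bigl[1 + \nu\,\mathrm{ERR}(a, D)\bigr] + (1 - \nu)\,\mathrm{EAR}(a, D), \qquad 0 \le \nu \le 1,
    \]

    with \(\nu = 1\) recovering the multiplicative transfer and \(\nu = 0\) the additive transfer.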

  17. Sparse Additive Ordinary Differential Equations for Dynamic Gene Regulatory Network Modeling.

    PubMed

    Wu, Hulin; Lu, Tao; Xue, Hongqi; Liang, Hua

    2014-04-02

    The gene regulation network (GRN) is a high-dimensional complex system, which can be represented by various mathematical or statistical models. The ordinary differential equation (ODE) model is one of the popular dynamic GRN models. High-dimensional linear ODE models have been proposed to identify GRNs, but with a limitation of the linear regulation effect assumption. In this article, we propose a sparse additive ODE (SA-ODE) model, coupled with ODE estimation methods and adaptive group LASSO techniques, to model dynamic GRNs that could flexibly deal with nonlinear regulation effects. The asymptotic properties of the proposed method are established and simulation studies are performed to validate the proposed approach. An application example for identifying the nonlinear dynamic GRN of T-cell activation is used to illustrate the usefulness of the proposed method.
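    In one common formulation (an illustrative sketch, not quoted from the article), the sparse additive ODE for the expression level \(X_i(t)\) of gene i is

    \[
    \frac{dX_i(t)}{dt} = \mu_i + \sum_{j=1}^{p} f_{ij}\bigl(X_j(t)\bigr), \qquad i = 1, \dots, p,
    \]

    where each \(f_{ij}\) is a smooth (e.g., spline-based) regulation function and the adaptive group LASSO penalty shrinks entire functions \(f_{ij}\) to zero, so that only a sparse set of regulators remains in each equation.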

  18. "Near-term" Natural Catastrophe Risk Management and Risk Hedging in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Michel, Gero; Tiampo, Kristy

    2014-05-01

    Competing with analytics - can the insurance market take advantage of seasonal or "near-term" forecasting and temporal changes in risk? Natural perils (re)insurance has been based on models following climatology, i.e. the long-term "historical" average, as opposed to considering the "near term" and forecasting hazard and risk for the seasons or years to come. Variability and short-term changes in risk are deemed abundant for almost all perils. In addition to hydrometeorological perils, whose changes are widely discussed, earthquake activity might also change over various time-scales, affected by earlier local (or even global) events, regional changes in the distribution of stresses and strains, and more. Only recently has insurance risk modeling of (stochastic) hurricane-years or extratropical-storm-years started to consider our ability to forecast climate variability, thereby taking advantage of apparent correlations between climate indicators and the activity of storm events. Once some of these "near-term" measures were in the market, rating agencies and regulators swiftly adopted the concepts, demanding that companies deploy a selection of more conservative "time-dependent" models. This was despite the fact that the ultimate effect of some of these measures on insurance risk was not well understood. Apparent short-term success in near-term seasonal hurricane forecasting over recent years was brought to a halt in 2013, when these models failed to forecast the exceptional shortage of hurricanes, thereby contradicting an active-year forecast. Earthquake forecasting, in addition, has focused mostly on high rather than low temporal and regional activity, despite the fact that avoiding losses does not by itself create a product. This presentation sheds light on new risk management concepts for over-regional and global (re)insurance portfolios that take advantage of forecasting changes in risk. The presentation focuses on the "upside" and on new opportunities

  19. Geo-additive modelling of malaria in Burundi

    PubMed Central

    2011-01-01

    Background Malaria is a major public health issue in Burundi in terms of both morbidity and mortality, with around 2.5 million clinical cases and more than 15,000 deaths each year. It is still the single main cause of mortality in pregnant women and children below five years of age. Because of the severe health and economic burden of malaria, there is still a growing need for methods that will help to understand the influencing factors. Several studies have been carried out on the subject, yielding different results as to which factors are most responsible for the increase in malaria transmission. This paper considers the modelling of the dependence of malaria cases on spatial determinants and climatic covariates including rainfall, temperature and humidity in Burundi. Methods The analysis carried out in this work exploits real monthly data collected in the area of Burundi over 12 years (1996-2007). Semi-parametric regression models are used. The spatial analysis is based on a geo-additive model using provinces as the geographic units of study. The spatial effect is split into structured (correlated) and unstructured (uncorrelated) components. Inference is fully Bayesian and uses Markov chain Monte Carlo techniques. The effects of the continuous covariates are modelled by cubic P-splines with 20 equidistant knots and a second-order random walk penalty. For the spatially correlated effect, a Markov random field prior is chosen. The spatially uncorrelated effects are assumed to be i.i.d. Gaussian. The effects of climatic covariates and the effects of other spatial determinants are estimated simultaneously in a unified regression framework. Results The results obtained from the proposed model suggest that although malaria incidence in a given month is strongly positively associated with the minimum temperature of the previous months, regional patterns of malaria that are related to factors other than climatic variables have been identified, without being able to explain
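    A sketch of the structured additive predictor implied by the description above (illustrative notation, not quoted from the paper): for province s and month t, the linear predictor of malaria incidence is modelled as

    \[
    \eta_{st} = \beta_0 + f_1(\mathrm{rain}_{st}) + f_2(\mathrm{temp}_{st}) + f_3(\mathrm{humid}_{st}) + f_{\mathrm{str}}(s) + f_{\mathrm{unstr}}(s),
    \]

    where the \(f_k\) are cubic P-splines with second-order random walk penalties, \(f_{\mathrm{str}}(s)\) is the spatially structured effect with a Markov random field prior, and \(f_{\mathrm{unstr}}(s)\) is an i.i.d. Gaussian unstructured effect.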

  20. Phosphate additives in food--a health risk.

    PubMed

    Ritz, Eberhard; Hahn, Kai; Ketteler, Markus; Kuhlmann, Martin K; Mann, Johannes

    2012-01-01

    Hyperphosphatemia has been identified in the past decade as a strong predictor of mortality in advanced chronic kidney disease (CKD). For example, a study of patients in stage CKD 5 (with an annual mortality of about 20%) revealed that 12% of all deaths in this group were attributable to an elevated serum phosphate concentration. Recently, a high-normal serum phosphate concentration has also been found to be an independent predictor of cardiovascular events and mortality in the general population. Therefore, phosphate additives in food are a matter of concern, and their potential impact on health may well have been underappreciated. We reviewed pertinent literature retrieved by a selective search of the PubMed and EU databases (www.zusatzstoffe-online.de, www.codexalimentarius.de), with the search terms "phosphate additives" and "hyperphosphatemia." There is no need to lower the content of natural phosphate, i.e. organic esters, in food, because this type of phosphate is incompletely absorbed; restricting its intake might even lead to protein malnutrition. On the other hand, inorganic phosphate in food additives is effectively absorbed and can measurably elevate the serum phosphate concentration in patients with advanced CKD. Foods with added phosphate tend to be eaten by persons at the lower end of the socioeconomic scale, who consume more processed and "fast" food. The main pathophysiological effect of phosphate is vascular damage, e.g. endothelial dysfunction and vascular calcification. Aside from the quality of phosphate in the diet (which also requires attention), the quantity of phosphate consumed by patients with advanced renal failure should not exceed 1000 mg per day, according to the guidelines. Prospective controlled trials are currently unavailable. In view of the high prevalence of CKD and the potential harm caused by phosphate additives to food, the public should be informed that added phosphate is damaging to health. Furthermore, calls for labeling

  1. Geometric Modeling of Cellular Materials for Additive Manufacturing in Biomedical Field: A Review.

    PubMed

    Savio, Gianpaolo; Rosso, Stefano; Meneghello, Roberto; Concheri, Gianmaria

    2018-01-01

    Advances in additive manufacturing technologies facilitate the fabrication of cellular materials that have tailored functional characteristics. The application of solid freeform fabrication techniques is especially exploited in designing scaffolds for tissue engineering. In this review, firstly, a classification of cellular materials from a geometric point of view is proposed; then, the main approaches on geometric modeling of cellular materials are discussed. Finally, an investigation on porous scaffolds fabricated by additive manufacturing technologies is pointed out. Perspectives in geometric modeling of scaffolds for tissue engineering are also proposed.

  2. Energy risk in the arbitrage pricing model: an empirical and theoretical study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bremer, M.A.

    1986-01-01

    This dissertation empirically explores the Arbitrage Pricing Theory in the context of energy risk for securities over the 1960s, 1970s, and early 1980s. Starting from a general multifactor pricing model, the paper develops a two-factor model based on a market-like factor and an energy factor. This model is then tested on portfolios of securities grouped according to industrial classification using several econometric techniques designed to overcome some of the more serious estimation problems common to these models. The paper concludes that energy risk is priced in the 1970s and possibly even in the 1960s. Energy risk is found to be priced in the sense that investors who hold assets subjected to energy risk are paid for this risk. The classic version of the Capital Asset Pricing Model, which posits the market as the single priced factor, is rejected in favor of the Arbitrage Pricing Theory or multi-beta versions of the Capital Asset Pricing Model. The study introduces some original econometric methodology to carry out empirical tests.

  3. Modeling Data Containing Outliers using ARIMA Additive Outlier (ARIMA-AO)

    NASA Astrophysics Data System (ADS)

    Saleh Ahmar, Ansari; Guritno, Suryo; Abdurakhman; Rahman, Abdul; Awi; Alimuddin; Minggi, Ilham; Arif Tiro, M.; Kasim Aidid, M.; Annas, Suwardi; Utami Sutiksno, Dian; Ahmar, Dewi S.; Ahmar, Kurniawan H.; Abqary Ahmar, A.; Zaki, Ahmad; Abdullah, Dahlan; Rahim, Robbi; Nurdiyanto, Heri; Hidayat, Rahmat; Napitupulu, Darmawan; Simarmata, Janner; Kurniasih, Nuning; Andretti Abdillah, Leon; Pranolo, Andri; Haviluddin; Albra, Wahyudin; Arifin, A. Nurani M.

    2018-01-01

    The aim of this study is to discuss the detection and correction of data containing additive outliers (AO) in the ARIMA(p, d, q) model. Detection and correction of the data use an iterative procedure popularized by Box, Jenkins, and Reinsel (1994). Using this method, we obtain an ARIMA model fitted to the data containing AO; the outlier effects, obtained from the iteration process using regression methods, are added to the coefficients of the original ARIMA model. For the simulated data containing AO, the initial model is ARIMA(2,0,0) with MSE = 36,780; after detection and correction of the data, the iteration yields the model ARIMA(2,0,0) with coefficients obtained from the regression Z_t = 0.106 + 0.204 Z_{t-1} + 0.401 Z_{t-2} - 329 X_1(t) + 115 X_2(t) + 35.9 X_3(t) and MSE = 19,365. This shows an improvement in the forecasting error rate.
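    A minimal sketch of the iterative additive-outlier routine described above (not the authors' code), using statsmodels: fit an ARIMA(2,0,0), flag the largest standardized residual, and re-estimate with a pulse regressor for the flagged point. The simulated series, the critical value of 3.5, and the number of passes are illustrative assumptions, and the detection statistic is simplified relative to the full outlier tests.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
n = 200
e = rng.normal(size=n)
z = np.zeros(n)
for t in range(2, n):
    z[t] = 0.2 * z[t - 1] + 0.4 * z[t - 2] + e[t]   # AR(2) series
z[120] += 8.0                                       # inject an additive outlier (AO)

series = pd.Series(z)
detected = []
for _ in range(3):                                  # iterate: detect, estimate, adjust
    fit = ARIMA(series, order=(2, 0, 0)).fit()
    t_stat = fit.resid / fit.resid.std()            # simplified outlier statistic
    idx = int(t_stat.abs().idxmax())
    if abs(t_stat[idx]) < 3.5:                      # illustrative critical value
        break
    detected.append(idx)
    pulse = pd.Series(0.0, index=series.index, name="AO")
    pulse[idx] = 1.0
    # re-estimate with the AO pulse as a regressor, then remove its estimated effect
    ao_fit = ARIMA(series, order=(2, 0, 0), exog=pulse).fit()
    series = series - ao_fit.params["AO"] * pulse

print("detected AO positions:", detected)
```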

  4. Risk attitudes in a changing environment: An evolutionary model of the fourfold pattern of risk preferences.

    PubMed

    Mallpress, Dave E W; Fawcett, Tim W; Houston, Alasdair I; McNamara, John M

    2015-04-01

    A striking feature of human decision making is the fourfold pattern of risk attitudes, involving risk-averse behavior in situations of unlikely losses and likely gains, but risk-seeking behavior in response to likely losses and unlikely gains. Current theories to explain this pattern assume particular psychological processes to reproduce empirical observations, but do not address whether it is adaptive for the decision maker to respond to risk in this way. Here, drawing on insights from behavioral ecology, we build an evolutionary model of risk-sensitive behavior, to investigate whether particular types of environmental conditions could favor a fourfold pattern of risk attitudes. We consider an individual foraging in a changing environment, where energy is needed to prevent starvation and build up reserves for reproduction. The outcome, in terms of reproductive value (a rigorous measure of evolutionary success), of a one-off choice between a risky and a safe gain, or between a risky and a safe loss, determines the risk-sensitive behavior we should expect to see in this environment. Our results show that the fourfold pattern of risk attitudes may be adaptive in an environment in which conditions vary stochastically but are autocorrelated in time. In such an environment the current options provide information about the likely environmental conditions in the future, which affect the optimal pattern of risk sensitivity. Our model predicts that risk preferences should be both path-dependent and affected by the decision maker's current state. (c) 2015 APA, all rights reserved.

  5. Back-end Science Model Integration for Ecological Risk Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  6. Back-end Science Model Integration for Ecological Risk Assessment.

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  7. Posttraumatic stress disorder, alone or additively with early life adversity, is associated with obesity and cardiometabolic risk.

    PubMed

    Farr, O M; Ko, B-J; Joung, K E; Zaichenko, L; Usher, N; Tsoukas, M; Thakkar, B; Davis, C R; Crowell, J A; Mantzoros, C S

    2015-05-01

    There is some evidence that posttraumatic stress disorder (PTSD) and early life adversity may influence metabolic outcomes such as obesity, diabetes, and cardiovascular disease. However, whether and how these interact is not clear. We analyzed data from a cross-sectional and longitudinal study to determine how PTSD severity influences obesity, insulin sensitivity, and key measures and biomarkers of cardiovascular risk. We then looked at how PTSD and early life adversity may interact to impact these same outcomes. PTSD severity is associated with increasing risk of obesity, diabetes, and cardiovascular disease, with higher symptoms correlating with higher values of BMI, leptin, fibrinogen, and blood pressure, and lower values of insulin sensitivity. PTSD and early life adversity have an additive effect on these metabolic outcomes. The longitudinal study confirmed findings from the cross sectional study and showed that fat mass, leptin, CRP, sICAM-1, and sTNFRII were significantly increased with higher PTSD severity during a 2.5 year follow-up period. Individuals with early life adversity and PTSD are at high risk and should be monitored carefully for obesity, insulin resistance, and cardiometabolic risk. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Process Modeling and Validation for Metal Big Area Additive Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan; Nycz, Andrzej; Noakes, Mark W.

    Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology based on metal arc welding. A continuously fed metal wire is melted by an electric arc that forms between the wire and the substrate, and deposited in the form of a bead of molten metal along the predetermined path. Objects are manufactured one layer at a time starting from the base plate. The final properties of the manufactured object are dependent on its geometry and the metal deposition path, in addition to the basic welding process parameters. Computational modeling can be used to accelerate the development of the mBAAM technology as well as to serve as a design and optimization tool for the actual manufacturing process. We have developed a finite element method simulation framework for mBAAM using new features of the software ABAQUS. The computational simulation of material deposition with heat transfer is performed first, followed by the structural analysis based on the temperature history for predicting the final deformation and stress state. In this formulation, we assume that the two physics phenomena are coupled in only one direction, i.e. the temperatures are driving the deformation and internal stresses, but their feedback on the temperatures is negligible. The experiment instrumentation (measurement types, sensor types, sensor locations, sensor placements, measurement intervals) and the measurements are presented. The temperatures and distortions from the simulations show good correlation with experimental measurements. Ongoing modeling work is also briefly discussed.

  9. Methods for assessing fracture risk prediction models: experience with FRAX in a large integrated health care delivery system.

    PubMed

    Pressman, Alice R; Lo, Joan C; Chandra, Malini; Ettinger, Bruce

    2011-01-01

    Area under the receiver operating characteristics (AUROC) curve is often used to evaluate risk models. However, reclassification tests provide an alternative assessment of model performance. We performed both evaluations on results from FRAX (World Health Organization Collaborating Centre for Metabolic Bone Diseases, University of Sheffield, UK), a fracture risk tool, using Kaiser Permanente Northern California women older than 50yr with bone mineral density (BMD) measured during 1997-2003. We compared FRAX performance with and without BMD in the model. Among 94,489 women with mean follow-up of 6.6yr, 1579 (1.7%) sustained a hip fracture. Overall, AUROCs were 0.83 and 0.84 for FRAX without and with BMD, suggesting that BMD did not contribute to model performance. AUROC decreased with increasing age, and BMD contributed significantly to higher AUROC among those aged 70yr and older. Using an 81% sensitivity threshold (optimum level from receiver operating characteristic curve, corresponding to 1.2% cutoff), 35% of those categorized above were reassigned below when BMD was added. In contrast, only 10% of those categorized below were reassigned to the higher risk category when BMD was added. The net reclassification improvement was 5.5% (p<0.01). Two versions of this risk tool have similar AUROCs, but alternative assessments indicate that addition of BMD improves performance. Multiple methods should be used to evaluate risk tool performance with less reliance on AUROC alone. Copyright © 2011 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
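    A minimal sketch of the reclassification calculation used above as the alternative to AUROC (not the study's code): the net reclassification improvement (NRI) sums the net proportion of events correctly moved up a risk category and the net proportion of non-events correctly moved down. The up/down counts below are placeholders; only the event and non-event totals echo the record.

```python
def net_reclassification_improvement(up_ev, down_ev, n_ev, up_non, down_non, n_non):
    """NRI = [P(up|event) - P(down|event)] + [P(down|nonevent) - P(up|nonevent)]."""
    return (up_ev - down_ev) / n_ev + (down_non - up_non) / n_non

# Illustrative counts only (events = hip fractures over follow-up):
nri = net_reclassification_improvement(up_ev=130, down_ev=55, n_ev=1579,
                                        up_non=4500, down_non=7200, n_non=92910)
print(f"NRI = {nri:.1%}")
```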

  10. A simulations approach for meta-analysis of genetic association studies based on additive genetic model.

    PubMed

    John, Majnu; Lencz, Todd; Malhotra, Anil K; Correll, Christoph U; Zhang, Jian-Ping

    2018-06-01

    Meta-analysis of genetic association studies is being increasingly used to assess phenotypic differences between genotype groups. When the underlying genetic model is assumed to be dominant or recessive, assessing the phenotype differences based on summary statistics, reported for individual studies in a meta-analysis, is a valid strategy. However, when the genetic model is additive, a similar strategy based on summary statistics will lead to biased results. This fact about the additive model is one of the things that we establish in this paper, using simulations. The main goal of this paper is to present an alternate strategy for the additive model based on simulating data for the individual studies. We show that the alternate strategy is far superior to the strategy based on summary statistics.
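    A minimal sketch of the simulation strategy described above (not the authors' code): simulate individual-level genotypes and phenotypes under an additive model for each study, estimate the per-allele effect study by study, and pool the estimates by inverse-variance weighting. Study sizes, allele frequency, and the effect size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
study_sizes = [150, 300, 220, 500]
maf, beta = 0.3, 0.25            # minor allele frequency and additive effect per allele

estimates, variances = [], []
for n in study_sizes:
    g = rng.binomial(2, maf, size=n)        # genotype coded 0/1/2 (additive model)
    y = beta * g + rng.normal(size=n)       # phenotype with additive genetic effect
    gc = g - g.mean()                       # centred genotype
    b_hat = (gc @ y) / (gc @ gc)            # per-study least-squares slope
    resid = y - y.mean() - b_hat * gc
    var_b = resid.var(ddof=2) / (gc @ gc)   # slope variance, sigma^2 / Sxx
    estimates.append(b_hat)
    variances.append(var_b)

est = np.asarray(estimates)
w = 1 / np.asarray(variances)               # fixed-effect inverse-variance weights
pooled = np.sum(w * est) / np.sum(w)
print(f"pooled additive effect: {pooled:.3f} (SE {np.sqrt(1 / np.sum(w)):.3f})")
```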

  11. Adolescents Exiting Homelessness over Two Years: The Risk Amplification and Abatement Model

    ERIC Educational Resources Information Center

    Milburn, Norweeta G.; Rice, Eric; Rotheram-Borus, Mary Jane; Mallett, Shelley; Rosenthal, Doreen; Batterham, Phillip; May, Susanne J.; Witkin, Andrea; Duan, Naihua

    2009-01-01

    The Risk Amplification and Abatement Model (RAAM) demonstrates that negative contact with socializing agents amplify risk, while positive contact abates risk for homeless adolescents. To test this model, the likelihood of exiting homelessness and returning to familial housing at 2 years and stably exiting over time are examined with longitudinal…

  12. PACE and the Medicare+Choice risk-adjusted payment model.

    PubMed

    Temkin-Greener, H; Meiners, M R; Gruenberg, L

    2001-01-01

    This paper investigates the impact of the Medicare principal inpatient diagnostic cost group (PIP-DCG) payment model on the Program of All-Inclusive Care for the Elderly (PACE). Currently, more than 6,000 Medicare beneficiaries who are nursing home certifiable receive care from PACE, a program poised for expansion under the Balanced Budget Act of 1997. Overall, our analysis suggests that the application of the PIP-DCG model to the PACE program would reduce Medicare payments to PACE, on average, by 38%. The PIP-DCG payment model bases its risk adjustment on inpatient diagnoses and does not capture adequately the risk of caring for a population with functional impairments.

  13. Risk Prediction Models for Other Cancers or Multiple Sites

    Cancer.gov

    Developing statistical models that estimate the probability of developing other multiple cancers over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. A review and critique of some models used in competing risk analysis.

    PubMed

    Gail, M

    1975-03-01

    We have introduced a notation which allows one to define competing risk models easily and to examine underlying assumptions. We have treated the actuarial model for competing risk in detail, comparing it with other models and giving useful variance formulae both for the case when times of death are available and for the case when they are not. The generality of these methods is illustrated by an example treating two dependent competing risks.

  15. Geometric Modeling of Cellular Materials for Additive Manufacturing in Biomedical Field: A Review

    PubMed Central

    Rosso, Stefano; Meneghello, Roberto; Concheri, Gianmaria

    2018-01-01

    Advances in additive manufacturing technologies facilitate the fabrication of cellular materials that have tailored functional characteristics. The application of solid freeform fabrication techniques is especially exploited in designing scaffolds for tissue engineering. In this review, firstly, a classification of cellular materials from a geometric point of view is proposed; then, the main approaches on geometric modeling of cellular materials are discussed. Finally, an investigation on porous scaffolds fabricated by additive manufacturing technologies is pointed out. Perspectives in geometric modeling of scaffolds for tissue engineering are also proposed. PMID:29487626

  16. A Concentration Addition Model to Assess Activation of the Pregnane X Receptor (PXR) by Pesticide Mixtures Found in the French Diet

    PubMed Central

    de Sousa, Georges; Nawaz, Ahmad; Cravedi, Jean-Pierre; Rahmani, Roger

    2014-01-01

    French consumers are exposed to mixtures of pesticide residues in part through food consumption. As a xenosensor, the pregnane X receptor (hPXR) is activated by numerous pesticides, the combined effect of which is currently unknown. We examined the activation of hPXR by seven pesticide mixtures most likely found in the French diet and their individual components. The mixture's effect was estimated using the concentration addition (CA) model. PXR transactivation was measured by monitoring luciferase activity in hPXR/HepG2 cells and CYP3A4 expression in human hepatocytes. The three mixtures with the highest potency were evaluated using the CA model, at equimolar concentrations and at their relative proportion in the diet. The seven mixtures significantly activated hPXR and induced the expression of CYP3A4 in human hepatocytes. Of the 14 pesticides which constitute the three most active mixtures, four were found to be strong hPXR agonists, four medium, and six weak. Depending on the mixture and pesticide proportions, additive, greater-than-additive or less-than-additive effects between compounds were demonstrated. Predictions of the combined effects were obtained with both real-life and equimolar proportions at low concentrations. Pesticides act mostly additively to activate hPXR when present in a mixture. Modulation of hPXR activation and the induction of its target genes may represent a risk factor that exacerbates the physiological response of the hPXR signaling pathways and may explain some adverse effects in humans. PMID:25028461
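    For reference, the concentration addition prediction used above can be written in its standard Loewe form (notation is illustrative): a mixture with component concentrations \(c_i\) is predicted to produce effect level x when

    \[
    \sum_{i=1}^{n} \frac{c_i}{\mathrm{EC}_{x,i}} = 1,
    \]

    where \(\mathrm{EC}_{x,i}\) is the concentration of component i alone that produces the same effect x. Observed mixture responses stronger than this prediction indicate greater-than-additive effects, and weaker responses indicate less-than-additive effects.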

  17. DEVELOPMENT AND REVIEW OF MONITORING METHODS AND RISK ASSESSMENT MODELS USED TO DETERMINE THE EFFECTS OF BIOSOLIDS LAND APPLICATION ON HUMAN HEALTH AND THE ENVIRONMENT

    EPA Science Inventory

    Development and Review of monitoring methods and risk assessment models for biosolids land application impacts on air and land

    Ronald F Herrmann (NRMRL), Mike Broder (NCEA), and Mike Ware (NERL)

    Science Questions:

    MYP Science Question: What additional model...

  18. Analysis of dengue fever risk using geostatistics model in bone regency

    NASA Astrophysics Data System (ADS)

    Amran, Stang, Mallongi, Anwar

    2017-03-01

    This research aims to analyze dengue fever risk based on a geostatistics model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a binomial distribution. The effects of temperature, rainfall, elevation, and larval abundance are investigated through the geostatistics model. A Bayesian hierarchical method is used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.

  19. The role of building models in the evaluation of heat-related risks

    NASA Astrophysics Data System (ADS)

    Buchin, Oliver; Jänicke, Britta; Meier, Fred; Scherer, Dieter; Ziegler, Felix

    2016-04-01

    Hazard-risk relationships in epidemiological studies are generally based on the outdoor climate, despite the fact that most of humans' lifetime is spent indoors. By coupling indoor and outdoor climates with a building model, the risk concept developed can still be based on the outdoor conditions but also includes exposure to the indoor climate. The influence of non-linear building physics and the impact of air conditioning on heat-related risks can be assessed in a plausible manner using this risk concept. For proof of concept, the proposed risk concept is compared to a traditional risk analysis. As an example, daily and city-wide mortality data of the age group 65 and older in Berlin, Germany, for the years 2001-2010 are used. Four building models with differing complexity are applied in a time-series regression analysis. This study shows that indoor hazard better explains the variability in the risk data compared to outdoor hazard, depending on the kind of building model. Simplified parameter models include the main non-linear effects and are proposed for the time-series analysis. The concept shows that the definitions of heat events, lag days, and acclimatization in a traditional hazard-risk relationship are influenced by the characteristics of the prevailing building stock.

  20. Study of abrasive resistance of foundry models obtained with the use of additive technology

    NASA Astrophysics Data System (ADS)

    Ol'khovik, Evgeniy

    2017-10-01

    The present study considers the problem of determining the resistance to abrasive wear, in the environment of a foundry sand mould, of casting models and patterns made from ABS (PLA) plastic by 3D printing using FDM additive technology. The article describes the technique and equipment used for wear testing of casting models and patterns. The techniques for manufacturing models with a 3D printer (additive technology) are described. A vibration-load scheme was applied to the sample tests. To study the influence of the sand mixture on the plastic as thoroughly as possible, the models were tested under real conditions of abrasive wear. The results also cover the application of an acrylic paint coating and of a two-component coating to the plastic model. Practical proposals and recommendations are given for producing master models with FDM technology that achieve durability exceeding 2000 moulding cycles in foundry sand mix.

  1. Mechanistic modeling of insecticide risks to breeding birds in North American agroecosystems

    PubMed Central

    Garber, Kristina; Odenkirchen, Edward

    2017-01-01

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. There is accumulating evidence that insecticides adversely affect non-target wildlife species, including birds, causing mortality, reproductive impairment, and indirect effects through loss of prey base, and the type and magnitude of such effects differs by chemical class, or mode of action. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. Current USEPA risk assessments for pesticides generally rely on endpoints from laboratory based toxicity studies focused on groups of individuals and do not directly assess population-level endpoints. In this paper, we present a mechanistic model, which allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to forage in agricultural fields during their breeding season. This model relies on individual-based toxicity data and translates effects into endpoints meaningful at the population level (i.e., magnitude of mortality and reproductive impairment). The model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model was used to assess the relative risk of 12 insecticides applied via aerial spray to control corn pests on a suite of 31 avian species known to forage in cornfields in agroecosystems of the Midwest, USA. We found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (pyrethroids) posing the least risk. Comparative sensitivity analysis across the 31 species showed

  2. Mechanistic modeling of insecticide risks to breeding birds in North American agroecosystems.

    PubMed

    Etterson, Matthew; Garber, Kristina; Odenkirchen, Edward

    2017-01-01

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. There is accumulating evidence that insecticides adversely affect non-target wildlife species, including birds, causing mortality, reproductive impairment, and indirect effects through loss of prey base, and the type and magnitude of such effects differs by chemical class, or mode of action. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. Current USEPA risk assessments for pesticides generally rely on endpoints from laboratory based toxicity studies focused on groups of individuals and do not directly assess population-level endpoints. In this paper, we present a mechanistic model, which allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to forage in agricultural fields during their breeding season. This model relies on individual-based toxicity data and translates effects into endpoints meaningful at the population level (i.e., magnitude of mortality and reproductive impairment). The model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model was used to assess the relative risk of 12 insecticides applied via aerial spray to control corn pests on a suite of 31 avian species known to forage in cornfields in agroecosystems of the Midwest, USA. We found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (pyrethroids) posing the least risk. Comparative sensitivity analysis across the 31 species showed

  3. Cascading Failures in Bi-partite Graphs: Model for Systemic Risk Propagation

    PubMed Central

    Huang, Xuqing; Vodenska, Irena; Havlin, Shlomo; Stanley, H. Eugene

    2013-01-01

    As economic entities become increasingly interconnected, a shock in a financial network can provoke significant cascading failures throughout the system. To study the systemic risk of financial systems, we create a bi-partite banking network model composed of banks and bank assets and propose a cascading failure model to describe the risk propagation process during crises. We empirically test the model with 2007 US commercial banks balance sheet data and compare the model prediction of the failed banks with the real failed banks after 2007. We find that our model efficiently identifies a significant portion of the actual failed banks reported by Federal Deposit Insurance Corporation. The results suggest that this model could be useful for systemic risk stress testing for financial systems. The model also identifies that commercial rather than residential real estate assets are major culprits for the failure of over 350 US commercial banks during 2008–2011. PMID:23386974
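    A minimal sketch of a cascading-failure loop on a bipartite bank-asset network in the spirit of the model described above (not the authors' calibration); the balance sheets, initial shock, solvency rule, and fire-sale price impact are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n_banks, n_assets = 20, 5
holdings = rng.uniform(0, 1, size=(n_banks, n_assets))   # bank i's holding of asset j
equity = 0.08 * holdings.sum(axis=1)                      # equity buffer per bank
prices = np.ones(n_assets)
failed = np.zeros(n_banks, dtype=bool)

prices[0] *= 0.7                                          # initial shock to one asset
for _ in range(10):                                       # propagate until stable
    losses = holdings @ (1 - prices)                      # mark-to-market losses per bank
    newly_failed = (~failed) & (losses > equity)          # banks whose losses exceed equity
    if not newly_failed.any():
        break
    failed |= newly_failed
    # fire sales: liquidated holdings of newly failed banks depress asset prices further
    sold = holdings[newly_failed].sum(axis=0)
    prices *= np.exp(-0.05 * sold)

print(f"failed banks: {failed.sum()} of {n_banks}")
```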

  4. The risk of infant and fetal death by each additional week of expectant management in intrahepatic cholestasis of pregnancy by gestational age.

    PubMed

    Puljic, Anela; Kim, Elissa; Page, Jessica; Esakoff, Tania; Shaffer, Brian; LaCoursiere, Daphne Y; Caughey, Aaron B

    2015-05-01

    The objective of the study was to characterize the risk of infant and fetal death by each additional week of expectant management vs immediate delivery in pregnancies complicated by cholestasis. This was a retrospective cohort study of 1,604,386 singleton, nonanomalous pregnancies of women between 34 and 40 weeks' gestation with and without intrahepatic cholestasis of pregnancy (ICP) in the state of California during the years of 2005-2008. International Classification of Diseases, 9th version, codes and linked hospital discharge and vital statistics data were utilized. For each week of gestation, the following outcomes were assessed: the risk of stillbirth, the risk of delivery (represented by the risk of infant death at a given week of gestation), and the composite risk of expectant management for 1 additional week. Composite risk combines the risk of stillbirth at this gestational age week plus the risk of infant death if delivered at the subsequent week of gestation. Among women with ICP, the mortality risk of delivery is lower than the risk of expectant management at 36 weeks' gestation (4.7 vs 19.2 per 10,000). The risk of expectant management remains higher than delivery and continues to rise by week of gestation beyond 36 weeks. The risk of expectant management in women with ICP reaches a nadir at 35 weeks (9.1 per 10,000; 95% confidence interval, 1.4-16.9) and rises at 36 weeks (19.2 per 10,000; 95% confidence interval, 7.6-30.8). Among women with ICP, delivery at 36 weeks' gestation would reduce the perinatal mortality risk as compared with expectant management. For later diagnoses, this would also be true at gestational ages beyond 36 weeks. Timing of delivery must take into account both the reduction in stillbirth risk balanced with the morbidities associated with preterm delivery. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Comparison of prosthetic models produced by traditional and additive manufacturing methods.

    PubMed

    Park, Jin-Young; Kim, Hae-Young; Kim, Ji-Hwan; Kim, Jae-Hong; Kim, Woong-Chul

    2015-08-01

    The purpose of this study was to verify the clinical feasibility of additive manufacturing by comparing the accuracy of four different manufacturing methods for metal copings: the conventional lost wax technique (CLWT); a subtractive method, wax blank milling (WBM); and two additive methods, multi jet modeling (MJM) and micro-stereolithography (Micro-SLA). Thirty study models were created using an acrylic model with the maxillary right canine, first premolar, and first molar teeth. Based on the scan files from a non-contact blue light scanner (Identica; Medit Co. Ltd., Seoul, Korea), thirty cores were produced using the WBM, MJM, and Micro-SLA methods, and another thirty frameworks were produced using the CLWT method. To measure the marginal and internal gaps, the silicone replica method was adopted, and the silicone images obtained were evaluated using a digital microscope (KH-7700; Hirox, Tokyo, Japan) at 140X magnification. Analyses were performed using two-way analysis of variance (ANOVA) and the Tukey post hoc test (α=.05). The mean marginal and internal gaps showed significant differences according to tooth type (P<.001 and P<.001, respectively) and manufacturing method (P<.037 and P<.001, respectively). Unlike the WBM and MJM methods, Micro-SLA did not differ significantly from CLWT in mean marginal gap. The mean gaps resulting from the four manufacturing methods were within a clinically allowable range, and, thus, the clinical use of additive manufacturing methods is acceptable as an alternative to the traditional lost wax technique and subtractive manufacturing.
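
    A sketch of the reported analysis pipeline (two-way ANOVA on gap measurements by manufacturing method and tooth type, followed by a Tukey post hoc test), here run on simulated data; the column names, effect sizes, and sample layout are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(2)
methods = ["CLWT", "WBM", "MJM", "MicroSLA"]
teeth = ["canine", "premolar", "molar"]

# Simulated marginal-gap measurements (micrometres); offsets are invented.
rows = []
for method, m_off in zip(methods, [0, 15, 25, 5]):
    for tooth, t_off in zip(teeth, [0, 5, 10]):
        for _ in range(10):
            rows.append({"method": method, "tooth": tooth,
                         "gap": 80 + m_off + t_off + rng.normal(0, 12)})
df = pd.DataFrame(rows)

model = smf.ols("gap ~ C(method) + C(tooth)", data=df).fit()
print(anova_lm(model, typ=2))                       # two-way ANOVA table
print(pairwise_tukeyhsd(df["gap"], df["method"]))   # Tukey post hoc, alpha=0.05 default
```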

  6. A risk prediction model for xerostomia: a retrospective cohort study.

    PubMed

    Villa, Alessandro; Nordio, Francesco; Gohel, Anita

    2016-12-01

    We investigated the prevalence of xerostomia in dental patients and built a xerostomia risk prediction model by incorporating a wide range of risk factors. Socio-demographic data, past medical history, and self-reported dry mouth and related symptoms were collected retrospectively from January 2010 to September 2013 for all new dental patients. A logistic regression framework was used to build a risk prediction model for xerostomia. External validation was performed using an independent data set to test the prediction power. A total of 12,682 patients were included in this analysis (54.3% female). Xerostomia was reported by 12.2% of patients. The proportion of people reporting xerostomia was higher among those who were taking more medications (OR = 1.11, 95% CI = 1.08-1.13) or were recreational drug users (OR = 1.4, 95% CI = 1.1-1.9). Rheumatic diseases (OR = 2.17, 95% CI = 1.88-2.51), psychiatric diseases (OR = 2.34, 95% CI = 2.05-2.68), eating disorders (OR = 2.28, 95% CI = 1.55-3.36) and radiotherapy (OR = 2.00, 95% CI = 1.43-2.80) were good predictors of xerostomia. In testing model performance, the ROC-AUC was 0.816, and in the external validation sample it was 0.799. The xerostomia risk prediction model had high accuracy and discriminated between high- and low-risk individuals. Clinicians could use this model to identify the classes of medications and systemic diseases associated with xerostomia. © 2015 John Wiley & Sons A/S and The Gerodontology Association. Published by John Wiley & Sons Ltd.
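
    A minimal sketch of the modeling workflow described (a logistic regression risk model developed on one cohort and checked by ROC-AUC on an independent external sample), using simulated data; the predictors, coefficients, and cohort sizes are assumptions loosely inspired by the odds ratios quoted above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

def simulate(n):
    """Toy cohort: medication count plus rheumatic and psychiatric disease flags."""
    n_meds = rng.poisson(2, n)
    rheum = rng.random(n) < 0.05
    psych = rng.random(n) < 0.10
    # coefficients are roughly the logs of the quoted ORs (assumed mapping)
    logit = -2.5 + 0.10 * n_meds + 0.77 * rheum + 0.85 * psych
    y = rng.random(n) < 1 / (1 + np.exp(-logit))
    X = np.column_stack([n_meds, rheum, psych]).astype(float)
    return X, y.astype(int)

X_dev, y_dev = simulate(10_000)      # development cohort
X_ext, y_ext = simulate(3_000)       # independent external-validation cohort

clf = LogisticRegression().fit(X_dev, y_dev)
print("development AUC:", roc_auc_score(y_dev, clf.predict_proba(X_dev)[:, 1]))
print("external AUC:   ", roc_auc_score(y_ext, clf.predict_proba(X_ext)[:, 1]))
```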

  7. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  8. Predictor characteristics necessary for building a clinically useful risk prediction model: a simulation study.

    PubMed

    Schummers, Laura; Himes, Katherine P; Bodnar, Lisa M; Hutcheon, Jennifer A

    2016-09-21

    Compelled by the intuitive appeal of predicting each individual patient's risk of an outcome, there is a growing interest in risk prediction models. While the statistical methods used to build prediction models are increasingly well understood, the literature offers little insight to researchers seeking to gauge a priori whether a prediction model is likely to perform well for their particular research question. The objective of this study was to inform the development of new risk prediction models by evaluating model performance under a wide range of predictor characteristics. Data from all births to overweight or obese women in British Columbia, Canada from 2004 to 2012 (n = 75,225) were used to build a risk prediction model for preeclampsia. The data were then augmented with simulated predictors of the outcome with pre-set prevalence values and univariable odds ratios. We built 120 risk prediction models that included known demographic and clinical predictors, and one, three, or five of the simulated variables. Finally, we evaluated standard model performance criteria (discrimination, risk stratification capacity, calibration, and Nagelkerke's r²) for each model. Findings from our models built with simulated predictors demonstrated the predictor characteristics required for a risk prediction model to adequately discriminate cases from non-cases and to adequately classify patients into clinically distinct risk groups. Several predictor characteristics can yield well performing risk prediction models; however, these characteristics are not typical of predictor-outcome relationships in many population-based or clinical data sets. Novel predictors must be both strongly associated with the outcome and prevalent in the population to be useful for clinical prediction modeling (e.g., one predictor with prevalence ≥20% and odds ratio ≥8, or three predictors with prevalence ≥10% and odds ratios ≥4). Area under the receiver operating characteristic curve
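
    The augmentation step described, adding a simulated binary predictor with a pre-set prevalence and univariable odds ratio and then checking how much discrimination it provides on its own, can be sketched as follows; the baseline risk and the (prevalence, OR) pairs are illustrative.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n, baseline_risk = 100_000, 0.05          # cohort size and outcome risk in the unexposed (assumed)

def add_simulated_predictor(prevalence, odds_ratio):
    """Draw a binary predictor with the requested prevalence, then draw the
    outcome so that the predictor has (approximately) the requested
    univariable odds ratio against the baseline risk."""
    x = rng.random(n) < prevalence
    odds0 = baseline_risk / (1 - baseline_risk)
    p = np.where(x, odds0 * odds_ratio / (1 + odds0 * odds_ratio), baseline_risk)
    y = rng.random(n) < p
    return x.astype(int), y.astype(int)

for prev, or_ in [(0.20, 8), (0.10, 4), (0.01, 2)]:
    x, y = add_simulated_predictor(prev, or_)
    print(f"prevalence={prev:.2f}, OR={or_}: "
          f"AUC of the single predictor = {roc_auc_score(y, x):.3f}")
```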

  9. Recent development of risk-prediction models for incident hypertension: An updated systematic review

    PubMed Central

    Xiao, Lei; Liu, Ya; Wang, Zuoguang; Li, Chuang; Jin, Yongxin; Zhao, Qiong

    2017-01-01

    Background Hypertension is a leading global health threat and a major cardiovascular disease. Since clinical interventions are effective in delaying the disease progression from prehypertension to hypertension, diagnostic prediction models to identify patient populations at high risk for hypertension are imperative. Methods Both PubMed and Embase databases were searched for eligible reports of either prediction models or risk scores of hypertension. The study data were collected, including risk factors, statistic methods, characteristics of study design and participants, performance measurement, etc. Results From the searched literature, 26 studies reporting 48 prediction models were selected. Among them, 20 reports studied the established models using traditional risk factors, such as body mass index (BMI), age, smoking, blood pressure (BP) level, parental history of hypertension, and biochemical factors, whereas 6 reports used genetic risk score (GRS) as the prediction factor. AUC ranged from 0.64 to 0.97, and C-statistic ranged from 60% to 90%. Conclusions The traditional models are still the predominant risk prediction models for hypertension, but recently, more models have begun to incorporate genetic factors as part of their model predictors. However, these genetic predictors need to be well selected. The current reported models have acceptable to good discrimination and calibration ability, but whether the models can be applied in clinical practice still needs more validation and adjustment. PMID:29084293

  10. Deriving forest fire ignition risk with biogeochemical process modelling.

    PubMed

    Eastaugh, C S; Hasenauer, H

    2014-05-01

    Climate impacts the growth of trees and also affects disturbance regimes such as wildfire frequency. The European Alps have warmed considerably over the past half-century, but incomplete records make it difficult to definitively link alpine wildfire to climate change. Complicating this is the influence of forest composition and fuel loading on fire ignition risk, which is not considered by purely meteorological risk indices. Biogeochemical forest growth models track several variables that may be used as proxies for fire ignition risk. This study assesses the usefulness of the ecophysiological model BIOME-BGC's 'soil water' and 'labile litter carbon' variables in predicting fire ignition. A brief application case examines historic fire occurrence trends over pre-defined regions of Austria from 1960 to 2008. Results show that summer fire ignition risk is largely a function of low soil moisture, while winter fire ignitions are linked to the mass of volatile litter and atmospheric dryness.

  11. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lifetimes of many kinds of items and has a simple statistical form, characterized by a constant hazard rate; it is a special case of the Weibull distribution with shape parameter equal to one. In this paper we introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and present the corresponding analytic methods. The cases are limited to models with independent causes of failure, and a non-informative prior distribution is used in our analysis. The model description covers the likelihood function, followed by the posterior distribution and the point, interval, hazard-function, and reliability estimates. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
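
    For independent exponential causes the net and crude failure probabilities have closed forms, and under a non-informative (Jeffreys-type) prior the posterior of each cause-specific rate is Gamma(d_j, T) for d_j failures in total exposure time T. The sketch below uses those standard results with invented failure counts and exposure time; the paper's exact prior and derivations may differ.

```python
import numpy as np

# Observed data (assumed): total exposure time and failure counts per cause.
total_time = 5_000.0
failures = {"cause_A": 12, "cause_B": 5}

# Posterior mean of each exponential rate under a Gamma(d, T) posterior.
rates = {c: d / total_time for c, d in failures.items()}
lam_total = sum(rates.values())
t = 100.0                                 # mission time of interest (assumed)

reliability = np.exp(-lam_total * t)
for c, lam in rates.items():
    net = 1 - np.exp(-lam * t)                                # only this risk acting
    crude = (lam / lam_total) * (1 - np.exp(-lam_total * t))  # this risk, others present
    print(f"{c}: net={net:.4f}, crude={crude:.4f}")
print(f"overall reliability at t={t}: {reliability:.4f}")

# Posterior draws for an interval estimate of the total hazard (sketch).
rng = np.random.default_rng(5)
draws = sum(rng.gamma(d, 1 / total_time, 20_000) for d in failures.values())
print("95% credible interval for the total rate:",
      np.percentile(draws, [2.5, 97.5]))
```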

  12. Insights from socio-hydrology modelling on dealing with flood risk - Roles of collective memory, risk-taking attitude and trust

    NASA Astrophysics Data System (ADS)

    Viglione, Alberto; Di Baldassarre, Giuliano; Brandimarte, Luigia; Kuil, Linda; Carr, Gemma; Salinas, José Luis; Scolobig, Anna; Blöschl, Günter

    2014-10-01

    The risk coping culture of a community plays a major role in the development of urban floodplains. In this paper we analyse, in a conceptual way, the interplay of community risk coping culture, flooding damage and economic growth. We particularly focus on three aspects: (i) collective memory, i.e., the capacity of the community to keep risk awareness high; (ii) risk-taking attitude, i.e., the amount of risk the community is collectively willing to be exposed to; and (iii) trust of the community in risk reduction measures. To this end, we use a dynamic model that represents the feedback between the hydrological and social system components. Model results indicate that, on the one hand, under-perceiving the risk of flooding (because of short collective memory and too much trust in flood protection structures), in combination with a high risk-taking attitude, severely limits community development because of high damages caused by flooding. On the other hand, overestimation of risk (long memory and lack of trust in flood protection structures) leads to lost economic opportunities and recession. There are many scenarios of favourable development resulting from a trade-off between collective memory and trust in risk reduction measures combined with a low to moderate risk-taking attitude. Interestingly, the model gives rise to situations in which the development of the community in the floodplain is path dependent, i.e., the history of flooding may lead to community growth or recession.
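
    A toy discrete-time version of the feedback loop described (collective memory fades, trust in protection reduces perceived exposure, floods reset memory and damage the economy) is sketched below. The update rules and parameter values are invented for illustration and are not the authors' coupled socio-hydrological model.

```python
import numpy as np

rng = np.random.default_rng(6)
years = 200
memory, economy, levee = 0.0, 1.0, 0.0
mem_decay, trust, risk_taking = 0.9, 0.6, 0.5   # invented behavioural parameters

wealth = []
for year in range(years):
    flood = rng.random() < 0.05                               # random flood hazard
    # exposure grows when memory is low and risk-taking and trust are high
    exposure = economy * risk_taking * (1 - memory) * (1 - trust * levee)
    if flood:
        damage = exposure
        economy = max(economy - damage, 0.1)
        memory = min(memory + 0.5, 1.0)                       # awareness jumps after a flood
        levee = min(levee + trust * 0.2, 1.0)                 # society invests in protection
    economy *= 1.03                                           # background growth
    memory *= mem_decay                                       # collective memory fades
    wealth.append(economy)

print(f"final community wealth after {years} years: {wealth[-1]:.2f}")
```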

  13. Dynamic modeling of environmental risk associated with drilling discharges to marine sediments.

    PubMed

    Durgut, İsmail; Rye, Henrik; Reed, Mark; Smit, Mathijs G D; Ditlevsen, May Kristin

    2015-10-15

    Drilling discharges are complex mixtures of base fluids, chemicals and particulates, and may, after discharge to the marine environment, result in adverse effects on benthic communities. A numerical model was developed to estimate the fate of drilling discharges in the marine environment, and the associated environmental risks. Environmental risk from deposited drilling waste in marine sediments is generally caused by four types of stressors: oxygen depletion, toxicity, burial and change of grain size. In order to properly model these stressors, natural burial, biodegradation and bioturbation processes were also included. Diagenetic equations provide the basis for quantifying environmental risk. These equations are solved numerically by an implicit central-differencing scheme. The sediment model described here is, together with a fate and risk model focusing on the water column, implemented in the DREAM and OSCAR models, both available within the Marine Environmental Modeling Workbench (MEMW) at SINTEF in Trondheim, Norway. Copyright © 2015 Elsevier Ltd. All rights reserved.
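
    A minimal sketch of an implicit (backward Euler) central-difference scheme applied to a simple diagenetic diffusion-decay equation; the actual equations, coefficients, and boundary conditions of the DREAM/OSCAR sediment module are not reproduced here.

```python
import numpy as np

# dC/dt = D * d2C/dz2 - k*C on 0 <= z <= L, fixed concentration at the
# sediment-water interface, zero gradient at depth; backward Euler in time,
# central differences in space (all values illustrative).
nz, L = 50, 0.5                   # grid points, sediment depth (m)
dz = L / (nz - 1)
D, k = 1e-9, 1e-7                 # bioturbation "diffusivity" (m2/s), decay rate (1/s)
dt, n_steps = 3600.0, 24 * 365    # one-hour steps for a year

r = D * dt / dz**2
A = np.zeros((nz, nz))
for i in range(1, nz - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r + k * dt, -r
A[0, 0] = 1.0                     # Dirichlet: fixed concentration at the interface
A[-1, -1], A[-1, -2] = 1.0, -1.0  # no-flux at the bottom (C[-1] - C[-2] = 0)

C = np.zeros(nz)
C_surface = 1.0                   # normalized concentration in the deposited layer
for _ in range(n_steps):
    b = C.copy()
    b[0] = C_surface
    b[-1] = 0.0
    C = np.linalg.solve(A, b)

print("concentration profile (every 10th node):", np.round(C[::10], 4))
```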

  14. Frailty Models for Familial Risk with Application to Breast Cancer.

    PubMed

    Gorfine, Malka; Hsu, Li; Parmigiani, Giovanni

    2013-12-01

    In evaluating familial risk for disease we have two main statistical tasks: assessing the probability of carrying an inherited genetic mutation conferring higher risk; and predicting the absolute risk of developing diseases over time, for those individuals whose mutation status is known. Despite substantial progress, much remains unknown about the role of genetic and environmental risk factors, about the sources of variation in risk among families that carry high-risk mutations, and about the sources of familial aggregation beyond major Mendelian effects. These sources of heterogeneity contribute substantial variation in risk across families. In this paper we present simple and efficient methods for accounting for this variation in familial risk assessment. Our methods are based on frailty models. We implemented them in the context of generalizing Mendelian models of cancer risk, and compared our approaches to others that do not consider heterogeneity across families. Our extensive simulation study demonstrates that when predicting the risk of developing a disease over time conditional on carrier status, accounting for heterogeneity results in a substantial improvement in the area under the curve of the receiver operating characteristic. On the other hand, the improvement for carriership probability estimation is more limited. We illustrate the utility of the proposed approach through the analysis of BRCA1 and BRCA2 mutation carriers in the Washington Ashkenazi Kin-Cohort Study of Breast Cancer.
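
    The frailty idea can be sketched with the standard gamma shared-frailty model: family members share an unobserved multiplier on the baseline hazard, which induces familial aggregation and gives a closed-form population-averaged survival. The baseline hazard, hazard ratio, frailty variance, and carrier frequency below are invented for illustration; this is not the authors' Mendelian-model extension.

```python
import numpy as np

rng = np.random.default_rng(7)

theta = 0.5                 # frailty variance; larger => stronger familial clustering (assumed)
base_hazard = 0.002         # baseline hazard per year (assumed)
hr_carrier = 10.0           # hazard ratio for mutation carriers (assumed)

def marginal_survival(t, carrier):
    """Population-averaged survival under a gamma frailty with mean 1 and
    variance theta: S(t) = (1 + theta * H(t)) ** (-1 / theta)."""
    H = base_hazard * hr_carrier**carrier * t
    return (1 + theta * H) ** (-1 / theta)

# Simulate families: one shared frailty per family, exponential onset times.
n_fam, fam_size = 2_000, 3
Z = rng.gamma(1 / theta, theta, size=n_fam)             # shared frailty, mean 1
carrier = rng.random((n_fam, fam_size)) < 0.1
hazard = Z[:, None] * base_hazard * np.where(carrier, hr_carrier, 1.0)
age_onset = rng.exponential(1 / hazard)

print("10-yr risk, carrier,     marginal:", 1 - marginal_survival(10, 1))
print("10-yr risk, non-carrier, marginal:", 1 - marginal_survival(10, 0))
print("empirical 10-yr risk among simulated carriers:",
      (age_onset[carrier] < 10).mean())
```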

  15. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored at any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault tree based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314

  16. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored at any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault tree based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.
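
    The fault-tree structure described amounts to an AND-gate of conditional probabilities along the path from approaching the turbine to being struck. A minimal sketch with invented branch probabilities (not values from the paper):

```python
def collision_risk(n_passages, p_encounter, p_enter_swept_area,
                   p_fail_to_avoid, p_strike_given_transit):
    """Per-passage collision probability as a product of conditional
    probabilities (an AND-gate in fault-tree terms), and the expected
    number of collisions for a population making `n_passages` transits."""
    p_collision = (p_encounter * p_enter_swept_area *
                   p_fail_to_avoid * p_strike_given_transit)
    return p_collision, n_passages * p_collision

# Small fish in clear water: detection and avoidance are easy (illustrative numbers).
print(collision_risk(10_000, 0.3, 0.2, 0.05, 0.1))
# Large fish at a >=5 m turbine in turbid water: avoidance often fails (illustrative).
print(collision_risk(10_000, 0.3, 0.2, 0.6, 0.4))
```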

  17. Structural equation models to estimate risk of infection and tolerance to bovine mastitis.

    PubMed

    Detilleux, Johann; Theron, Léonard; Duprez, Jean-Noël; Reding, Edouard; Humblet, Marie-France; Planchon, Viviane; Delfosse, Camille; Bertozzi, Carlo; Mainil, Jacques; Hanzen, Christian

    2013-03-06

    One method to improve durably animal welfare is to select, as reproducers, animals with the highest ability to resist or tolerate infection. To do so, it is necessary to distinguish direct and indirect mechanisms of resistance and tolerance because selection on these traits is believed to have different epidemiological and evolutionary consequences. We propose structural equation models with latent variables (1) to quantify the latent risk of infection and to identify, among the many potential mediators of infection, the few ones that influence it significantly and (2) to estimate direct and indirect levels of tolerance of animals infected naturally with pathogens. We applied the method to two surveys of bovine mastitis in the Walloon region of Belgium, in which we recorded herd management practices, mastitis frequency, and results of bacteriological analyses of milk samples. Structural equation models suggested that, among more than 35 surveyed herd characteristics, only nine (age, addition of urea in the rations, treatment of subclinical mastitis, presence of dirty liner, cows with hyperkeratotic teats, machine stripping, pre- and post-milking teat disinfection, and housing of milking cows in cubicles) were directly and significantly related to a latent measure of bovine mastitis, and that treatment of subclinical mastitis was involved in the pathway between post-milking teat disinfection and latent mastitis. These models also allowed the separation of direct and indirect effects of bacterial infection on milk productivity. Results suggested that infected cows were tolerant but not resistant to mastitis pathogens. We revealed the advantages of structural equation models, compared to classical models, for dissecting measurements of resistance and tolerance to infectious diseases, here bovine mastitis. Using our method, we identified nine major risk factors that were directly associated with an increased risk of mastitis and suggested that cows were tolerant but

  18. Assessing Climate Change Risks Using a Multi-Model Approach

    NASA Astrophysics Data System (ADS)

    Knorr, W.; Scholze, M.; Prentice, C.

    2007-12-01

    We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from the IPCC AR4 data archive using 16 climate models, and by mapping the proportions of model runs showing exceedance of natural variability in wildfire frequency and freshwater supply or shifts in vegetation cover. Our analysis does not assign probabilities to scenarios. Instead, we consider the distribution of outcomes within three sets of model runs grouped according to the amount of global warming they simulate: <2 °C (including committed climate change simulations), 2-3 °C, and >3 °C. Here, we contrast two different methods for calculating the risks: first, an equal-weighting approach that gives every model within a set the same weight; and second, weighting the models according to their ability to simulate ENSO. The differences between the two underpin the need to develop more robust performance metrics for global climate models.
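
    The risk measure described, the (weighted) proportion of model runs in a warming class that exceed natural variability, and the contrast between equal and skill-based weights can be sketched as below; the exceedance flags and ENSO skill scores are invented.

```python
import numpy as np

rng = np.random.default_rng(8)
n_models = 16

# 1 where a model run simulates exceedance of natural variability in, say,
# wildfire frequency within one warming class; values are invented.
exceeds = rng.random(n_models) < 0.6
# Skill scores for simulating ENSO (invented); higher = better.
enso_skill = rng.random(n_models)

equal_w = np.full(n_models, 1 / n_models)
skill_w = enso_skill / enso_skill.sum()

print("risk, equal weighting:      ", float(exceeds @ equal_w))
print("risk, ENSO-skill weighting: ", float(exceeds @ skill_w))
```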

  19. Site-wide seismic risk model for Savannah River Site nuclear facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eide, S.A.; Shay, R.S.; Durant, W.S.

    1993-09-01

    The 200,000 acre Savannah River Site (SRS) has nearly 30 nuclear facilities spread throughout the site. The safety of each facility has been established in facility-specific safety analysis reports (SARs). Each SAR contains an analysis of risk from seismic events to both on-site workers and the off-site population. Both radiological and chemical releases are considered, and air and water pathways are modeled. Risks to the general public are generally characterized by evaluating exposure to the maximally exposed individual located at the SRS boundary and to the off-site population located within 50 miles. Although the SARs are appropriate methods for studying individual facility risks, there is a class of accident initiators that can simultaneously affect several or all of the facilities. Examples include seismic events, strong winds or tornadoes, floods, and loss of off-site electrical power. Overall risk to the off-site population from such initiators is not covered by the individual SARs. In such cases, multiple-facility radionuclide or chemical releases could occur, and off-site exposure would be greater than that indicated in a single-facility SAR. As a step toward an overall site-wide risk model that adequately addresses multiple-facility releases, a site-wide seismic model for determining off-site risk has been developed for nuclear facilities at the SRS. Risk from seismic events up to the design basis earthquake (DBE) of 0.2 g (frequency of 2.0E-4/yr) is covered by the model. Present plans include expanding the scope of the model to include other types of initiators that can simultaneously affect multiple facilities.

  20. Additional risk factors for infection by multidrug-resistant pathogens in healthcare-associated infection: a large cohort study.

    PubMed

    Cardoso, Teresa; Ribeiro, Orquídea; Aragão, Irene César; Costa-Pereira, Altamiro; Sarmento, António Eugénio

    2012-12-26

    There is a lack of consensus regarding the definition of risk factors for healthcare-associated infection (HCAI). The purpose of this study was to identify additional risk factors for HCAI, not included in the current definition of HCAI, that are associated with infection by multidrug-resistant (MDR) pathogens in all hospitalized infected patients admitted from the community. This 1-year prospective cohort study included all patients with infection admitted to a large, tertiary care, university hospital. Risk factors not included in the HCAI definition, and independently associated with MDR pathogen infection, namely MDR Gram-negative (MDR-GN) and ESKAPE microorganisms (vancomycin-resistant Enterococcus faecium, methicillin-resistant Staphylococcus aureus, extended-spectrum beta-lactamase-producing Escherichia coli and Klebsiella species, carbapenem-hydrolyzing Klebsiella pneumoniae and MDR Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species), were identified by logistic regression among patients admitted from the community (with either community-acquired infection or HCAI). There were 1035 patients with infection, 718 from the community. Of these, 439 (61%) had microbiologic documentation; 123 were MDR (28%). Among MDR infections, 104 (85%) were MDR-GN and 41 (33%) were ESKAPE infections. Independent risk factors associated with MDR and MDR-GN infection were: age (adjusted odds ratio (OR) = 1.7 and 1.5, p = 0.001 and p = 0.009, respectively), and hospitalization in the previous year (between 4 and 12 months previously) (adjusted OR = 2.0 and 1.7, p = 0.008 and p = 0.048, respectively). Infection by pathogens from the ESKAPE group was independently associated with previous antibiotic therapy (adjusted OR = 7.2, p < 0.001) and a Karnofsky index <70 (adjusted OR = 3.7, p = 0.003). Patients with infection by MDR, MDR-GN and pathogens from the ESKAPE group had significantly higher rates of inadequate antibiotic therapy than those without (46% vs 7%, 44% vs 10%, 61% vs 15
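
    A sketch of how adjusted odds ratios with 95% confidence intervals, like those quoted above, are obtained from a multivariable logistic regression; the data are simulated and the variable names and coefficients are assumptions loosely inspired by the quoted ORs.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 5_000

df = pd.DataFrame({
    "age_decade": rng.normal(6, 1.5, n),                       # age in decades (assumed scale)
    "prior_hospitalization": (rng.random(n) < 0.25).astype(int),
    "prior_antibiotics": (rng.random(n) < 0.30).astype(int),
})
# Simulated MDR-infection outcome; coefficients are invented.
logit = (-4 + 0.5 * df.age_decade
         + 0.7 * df.prior_hospitalization
         + 2.0 * df.prior_antibiotics)
df["mdr"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age_decade", "prior_hospitalization", "prior_antibiotics"]])
fit = sm.Logit(df["mdr"], X).fit(disp=False)

# Adjusted ORs and 95% CIs are the exponentiated coefficients and bounds.
odds_ratios = pd.DataFrame({"OR": np.exp(fit.params),
                            "2.5%": np.exp(fit.conf_int()[0]),
                            "97.5%": np.exp(fit.conf_int()[1])})
print(odds_ratios.round(2))
```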