Science.gov

Sample records for additive risk model

  1. Goodness-of-fit tests for the additive risk model with (p > 2)-dimensional time-invariant covariates.

    PubMed

    Kim, J; Song, M S; Lee, S

    1998-01-01

This paper presents methods for checking the goodness-of-fit of the additive risk model with p(> 2)-dimensional time-invariant covariates. The procedures are an extension of Kim and Lee (1996), who developed a test to assess the additive risk assumption for two-sample censored data. We apply the proposed tests to survival data from South Wales nickel refinery workers. Simulation studies are carried out to investigate the performance of the proposed tests for practical sample sizes. PMID:9880997
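
    For orientation: the "additive risk model" assessed in this record is the semiparametric additive hazards model usually attributed to Aalen (1980) and Lin and Ying (1994). A standard statement of the model class (my notation, not taken from the record) is:

    ```latex
    % Additive risk model with a p-dimensional time-invariant covariate Z:
    % the covariates shift an unspecified baseline hazard additively,
    % in contrast with the multiplicative (Cox) model.
    \[
      \lambda(t \mid Z) \;=\; \lambda_0(t) + \beta^{\top} Z ,
      \qquad Z \in \mathbb{R}^{p} .
    \]
    ```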

  2. Changes in diet, cardiovascular risk factors and modelled cardiovascular risk following diagnosis of diabetes: 1-year results from the ADDITION-Cambridge trial cohort

    PubMed Central

    Savory, L A; Griffin, S J; Williams, K M; Prevost, A T; Kinmonth, A-L; Wareham, N J; Simmons, R K

    2014-01-01

    Aims To describe change in self-reported diet and plasma vitamin C, and to examine associations between change in diet and cardiovascular disease risk factors and modelled 10-year cardiovascular disease risk in the year following diagnosis of Type 2 diabetes. Methods Eight hundred and sixty-seven individuals with screen-detected diabetes underwent assessment of self-reported diet, plasma vitamin C, cardiovascular disease risk factors and modelled cardiovascular disease risk at baseline and 1 year (n = 736) in the ADDITION-Cambridge trial. Multivariable linear regression was used to quantify the association between change in diet and cardiovascular disease risk at 1 year, adjusting for change in physical activity and cardio-protective medication. Results Participants reported significant reductions in energy, fat and sodium intake, and increases in fruit, vegetable and fibre intake over 1 year. The reduction in energy was equivalent to an average-sized chocolate bar; the increase in fruit was equal to one plum per day. There was a small increase in plasma vitamin C levels. Increases in fruit intake and plasma vitamin C were associated with small reductions in anthropometric and metabolic risk factors. Increased vegetable intake was associated with an increase in BMI and waist circumference. Reductions in fat, energy and sodium intake were associated with reduction in HbA1c, waist circumference and total cholesterol/modelled cardiovascular disease risk, respectively. Conclusions Improvements in dietary behaviour in this screen-detected population were associated with small reductions in cardiovascular disease risk, independently of change in cardio-protective medication and physical activity. Dietary change may have a role to play in the reduction of cardiovascular disease risk following diagnosis of diabetes. PMID:24102972

  3. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t}, where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position t along a tract in the brain. In one example the response is disease status (case or control); in a second example it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.
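
    The model the abstract describes in words is compact enough to display; the restatement below uses my notation (g is the link function, T the domain of the functional covariate, with an intercept added for completeness):

    ```latex
    % FGAM: the link-transformed mean response is the integral of an
    % unknown bivariate surface F evaluated along the functional covariate.
    \[
      g\bigl(\mathbb{E}[Y \mid X]\bigr)
        \;=\; \theta_0 + \int_{\mathcal{T}} F\bigl(X(t),\, t\bigr)\, dt .
    \]
    % The functional linear model is the special case F(x, t) = x \beta(t).
    ```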

  4. Mixed additive models

    NASA Astrophysics Data System (ADS)

    Carvalho, Francisco; Covas, Ricardo

    2016-06-01

We consider mixed models $y=\sum_{i=0}^{w}X_i\beta_i$ with $V(y)=\sum_{i=1}^{w}\theta_i M_i$, where $M_i=X_iX_i^{\top}$, $i=1,\dots,w$, and $\mu=X_0\beta_0$. For these we estimate the variance components $\theta_1,\dots,\theta_w$, as well as estimable vectors, through the decomposition of the initial model into sub-models $y(h)$, $h\in\Gamma$, with $V(y(h))=\gamma(h)I_{g(h)}$, $h\in\Gamma$. Moreover, we consider $L$ extensions of these models, i.e., $\mathring{y}=Ly+\epsilon$, where $L=D(1_{n_1},\dots,1_{n_w})$ and $\epsilon$, independent of $y$, has null mean vector and variance-covariance matrix $\theta_{w+1}I_n$, where $n=\sum_{i=1}^{w}n_i$.

  5. Changes in physical activity and modelled cardiovascular risk following diagnosis of diabetes: 1-year results from the ADDITION-Cambridge trial cohort

    PubMed Central

    Barakat, A; Williams, K M; Prevost, A T; Kinmonth, A-L; Wareham, N J; Griffin, S J; Simmons, R K

    2013-01-01

Aims To describe change in physical activity over 1 year and associations with change in cardiovascular disease risk factors in a population with screen-detected Type 2 diabetes. Methods Eight hundred and sixty-seven individuals with screen-detected diabetes underwent measurement of self-reported physical activity, cardiovascular disease risk factors and modelled cardiovascular disease risk at baseline and 1 year (n = 736) in the ADDITION-Cambridge trial. Multiple linear regression was used to quantify the association between change in different physical activity domains and cardiovascular disease risk factors at 1 year. Results There was no change in self-reported physical activity over 12 months. Even relatively large changes in physical activity were associated with relatively small changes in cardiovascular disease risk factors after allowing for changes in self-reported medication and diet. For every 30 metabolic equivalent-h increase in recreational activity (equivalent to 10 h of brisk walking per week), there was an average reduction of 0.1% in HbA1c in men (95% CI −0.15 to −0.01, P = 0.021) and an average reduction of 2 mmHg in systolic blood pressure in women (95% CI −4.0 to −0.05, P = 0.045). Conclusions Few associations were observed between change in different physical activity domains and cardiovascular disease risk factors in this trial cohort. Cardiovascular disease risk reduction appeared to be driven largely by factors other than changes in self-reported physical activity in the first year following diagnosis. PMID:22913463

  6. Polymorphisms associated with the risk of lung cancer in a healthy Mexican Mestizo population: Application of the additive model for cancer

    PubMed Central

    Pérez-Morales, Rebeca; Méndez-Ramírez, Ignacio; Castro-Hernández, Clementina; Martínez-Ramírez, Ollin C.; Gonsebatt, María Eugenia; Rubio, Julieta

    2011-01-01

Lung cancer is the leading cause of cancer mortality in Mexico and worldwide. In the past decade, there has been an increase in the number of lung cancer cases in young people, which suggests an important role for genetic background in the etiology of this disease. In this study, we genetically characterized 16 polymorphisms in 12 low penetrance genes (AhR, CYP1A1, CYP2E1, EPHX1, GSTM1, GSTT1, GSTP1, XRCC1, ERCC2, MGMT, CCND1 and TP53) in 382 healthy Mexican Mestizos as the first step in elucidating the genetic structure of this population and identifying high risk individuals. All of the genotypes analyzed were in Hardy-Weinberg equilibrium, but different degrees of linkage were observed for polymorphisms in the CYP1A1 and EPHX1 genes. The genetic variability of this population was distributed in six clusters that were defined based on their genetic characteristics. The use of a polygenic model to assess the additive effect of low penetrance risk alleles identified combinations of risk genotypes that could be useful in predicting a predisposition to lung cancer. Estimation of the level of genetic susceptibility showed that the individual calculated risk value (iCRV) ranged from 1 to 16, with a higher iCRV indicating a greater genetic susceptibility to lung cancer. PMID:22215955

  7. Does early intensive multifactorial therapy reduce modelled cardiovascular risk in individuals with screen-detected diabetes? Results from the ADDITION-Europe cluster randomized trial

    PubMed Central

    Black, J A; Sharp, S J; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2014-01-01

    Aims Little is known about the long-term effects of intensive multifactorial treatment early in the diabetes disease trajectory. In the absence of long-term data on hard outcomes, we described change in 10-year modelled cardiovascular risk in the 5 years following diagnosis, and quantified the impact of intensive treatment on 10-year modelled cardiovascular risk at 5 years. Methods In a pragmatic, cluster-randomized, parallel-group trial in Denmark, the Netherlands and the UK, 3057 people with screen-detected Type 2 diabetes were randomized by general practice to receive (1) routine care of diabetes according to national guidelines (1379 patients) or (2) intensive multifactorial target-driven management (1678 patients). Ten-year modelled cardiovascular disease risk was calculated at baseline and 5 years using the UK Prospective Diabetes Study Risk Engine (version 3β). Results Among 2101 individuals with complete data at follow up (73.4%), 10-year modelled cardiovascular disease risk was 27.3% (sd 13.9) at baseline and 21.3% (sd 13.8) at 5-year follow-up (intensive treatment group difference –6.9, sd 9.0; routine care group difference –5.0, sd 12.2). Modelled 10-year cardiovascular disease risk was lower in the intensive treatment group compared with the routine care group at 5 years, after adjustment for baseline cardiovascular disease risk and clustering (–2.0; 95% CI –3.1 to –0.9). Conclusions Despite increasing age and diabetes duration, there was a decline in modelled cardiovascular disease risk in the 5 years following diagnosis. Compared with routine care, 10-year modelled cardiovascular disease risk was lower in the intensive treatment group at 5 years. Our results suggest that patients benefit from intensive treatment early in the diabetes disease trajectory, where the rate of cardiovascular disease risk progression may be slowed. PMID:24533664

  8. Biosafety Risk Assessment Model

    SciTech Connect

    Daniel Bowen, Susan Caskey

    2011-05-27

    Software tool based on a structured methodology for conducting laboratory biosafety risk assessments by biosafety experts. Software is based upon an MCDA scheme and uses peer reviewed criteria and weights. The software was developed upon Microsoft’s .net framework. The methodology defines likelihood and consequence of a laboratory exposure for thirteen unique scenarios and provides numerical relative risks for each of the relevant thirteen. The software produces 2-d graphs reflecting the relative risk and a sensitivity analysis which highlights the overall importance of each factor. The software works as a set of questions with absolute scales and uses a weighted additive model to calculate the likelihood and consequence.
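
    The record names neither the criteria nor the weights, but the weighted additive scoring it describes is easy to illustrate. The sketch below uses entirely hypothetical criterion names, weights, and scores:

    ```python
    # Hedged sketch of a weighted additive MCDA score of the kind the
    # record describes; criteria, weights, and values are hypothetical.
    def weighted_additive_score(scores: dict, weights: dict) -> float:
        """Combine per-criterion scores (absolute 0-1 scales) additively."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
        return sum(weights[c] * scores[c] for c in weights)

    # Hypothetical likelihood criteria for one of the thirteen scenarios.
    weights = {"procedure_frequency": 0.40,
               "aerosol_potential": 0.35,
               "containment_failures": 0.25}
    scores = {"procedure_frequency": 0.6,
              "aerosol_potential": 0.8,
              "containment_failures": 0.2}
    print(f"relative likelihood: {weighted_additive_score(scores, weights):.2f}")
    # -> relative likelihood: 0.57
    ```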

  9. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  10. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  11. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  12. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  13. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  14. Biosafety Risk Assessment Model

    2011-05-27

Software tool based on a structured methodology for conducting laboratory biosafety risk assessments by biosafety experts. Software is based upon an MCDA scheme and uses peer reviewed criteria and weights. The software was developed upon Microsoft's .net framework. The methodology defines likelihood and consequence of a laboratory exposure for thirteen unique scenarios and provides numerical relative risks for each of the relevant thirteen. The software produces 2-d graphs reflecting the relative risk and a sensitivity analysis which highlights the overall importance of each factor. The software works as a set of questions with absolute scales and uses a weighted additive model to calculate the likelihood and consequence.

  15. Additive interaction in survival analysis: use of the additive hazards model.

    PubMed

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise; Marott, Jacob Louis; Diderichsen, Finn

    2012-09-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects is the relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures of additive interaction derived from multiplicative models-an approach that is both counter-intuitive and sometimes invalid. This paper presents a straightforward and intuitive way of assessing deviation from additivity of effects in survival analysis by use of the additive hazards model. The model directly estimates the absolute size of the deviation from additivity and provides confidence intervals. In addition, the model can accommodate both continuous and categorical exposures and models both exposures and potential confounders on the same underlying scale. To illustrate the approach, we present an empirical example of interaction between education and smoking on risk of lung cancer. We argue that deviations from additivity of effects are important for public health interventions and clinical decision-making, and such estimations should be encouraged in prospective studies on health. A detailed implementation guide of the additive hazards model is provided in the appendix.
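
    As a concrete illustration (the paper's own implementation guide is in its appendix and is not reproduced here), the Python lifelines package implements Aalen's time-varying-coefficient variant of the additive hazards model. A minimal sketch of estimating additive interaction between two binary exposures, with simulated data standing in for the education/smoking example, might look like this:

    ```python
    # Sketch: deviation from additivity as the coefficient of a product
    # term in an additive hazards model (lifelines' AalenAdditiveFitter).
    # The data are simulated stand-ins; nothing here is from the paper.
    import numpy as np
    import pandas as pd
    from lifelines import AalenAdditiveFitter

    rng = np.random.default_rng(0)
    n = 5000
    low_edu = rng.integers(0, 2, n)
    smoker = rng.integers(0, 2, n)
    # True hazard: baseline + additive main effects + positive interaction.
    hazard = 0.01 + 0.01 * low_edu + 0.02 * smoker + 0.01 * low_edu * smoker
    time = rng.exponential(1.0 / hazard)
    df = pd.DataFrame({"T": np.minimum(time, 30.0),   # censor at 30 years
                       "E": time < 30.0,
                       "low_edu": low_edu,
                       "smoker": smoker,
                       "low_edu_x_smoker": low_edu * smoker})

    aaf = AalenAdditiveFitter(coef_penalizer=0.5)
    aaf.fit(df, duration_col="T", event_col="E")
    # The slope of each cumulative coefficient is that covariate's additive
    # effect on the hazard; a nonzero interaction column measures the
    # absolute deviation from additivity directly.
    print(aaf.cumulative_hazards_.tail(1))
    ```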

  16. Chemical Mixture Risk Assessment Additivity-Based Approaches

    EPA Science Inventory

    Powerpoint presentation includes additivity-based chemical mixture risk assessment methods. Basic concepts, theory and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.
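
    For reference, the core additivity-based calculation in such mixture assessments is usually summarized as a hazard index, the sum of the component hazard quotients (general form only, not specific to the presentation described):

    ```latex
    % Dose-additive hazard index for a mixture of k chemicals: exposure
    % E_i to each component is scaled by its reference dose RfD_i.
    \[
      \mathrm{HI} \;=\; \sum_{i=1}^{k} \mathrm{HQ}_i
                 \;=\; \sum_{i=1}^{k} \frac{E_i}{\mathrm{RfD}_i} ,
      \qquad \mathrm{HI} > 1 \ \text{flags potential concern.}
    \]
    ```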

  17. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  18. Melanoma Risk Prediction Models

    Cancer.gov

Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  19. Public risk perception of food additives and food scares. The case in Suzhou, China.

    PubMed

    Wu, Linhai; Zhong, Yingqi; Shan, Lijie; Qin, Wei

    2013-11-01

    This study examined the factors affecting public risk perception of food additive safety and possible resulting food scares using a survey conducted in Suzhou, Jiangsu Province, China. The model was proposed based on literature relating to the role of risk perception and information perception of public purchase intention under food scares. Structural equation modeling (SEM) was used for data analysis. The results showed that attitude towards behavior, subjective norm and information perception exerted moderate to high effect on food scares, and the effects were also mediated by risk perceptions of additive safety. Significant covariance was observed between attitudes toward behavior, subjective norm and information perception. Establishing an effective mechanism of food safety risk communication, releasing information of government supervision on food safety in a timely manner, curbing misleading media reports on public food safety risk, and enhancing public knowledge of the food additives are key to the development and implementation of food safety risk management policies by the Chinese government. PMID:23831014

  20. Public risk perception of food additives and food scares. The case in Suzhou, China.

    PubMed

    Wu, Linhai; Zhong, Yingqi; Shan, Lijie; Qin, Wei

    2013-11-01

    This study examined the factors affecting public risk perception of food additive safety and possible resulting food scares using a survey conducted in Suzhou, Jiangsu Province, China. The model was proposed based on literature relating to the role of risk perception and information perception of public purchase intention under food scares. Structural equation modeling (SEM) was used for data analysis. The results showed that attitude towards behavior, subjective norm and information perception exerted moderate to high effect on food scares, and the effects were also mediated by risk perceptions of additive safety. Significant covariance was observed between attitudes toward behavior, subjective norm and information perception. Establishing an effective mechanism of food safety risk communication, releasing information of government supervision on food safety in a timely manner, curbing misleading media reports on public food safety risk, and enhancing public knowledge of the food additives are key to the development and implementation of food safety risk management policies by the Chinese government.

  1. Criteria for deviation from predictions by the concentration addition model.

    PubMed

    Takeshita, Jun-Ichi; Seki, Masanori; Kamo, Masashi

    2016-07-01

Loewe's additivity (concentration addition) is a well-known model for predicting the toxic effects of chemical mixtures under the additivity assumption of toxicity. However, from the perspective of chemical risk assessment and/or management, it is important to identify chemicals whose toxicities are additive when present concurrently, that is, it should be established whether there are chemical mixtures to which the concentration addition predictive model can be applied. The objective of the present study was to develop criteria for judging test results that deviated from the predictions of the concentration addition chemical mixture model. These criteria were based on the confidence interval of the concentration addition model's prediction and on estimation of the errors of the predicted concentration-effect curves from toxicity tests after exposure to single chemicals. A log-logit model with 2 parameters was assumed for the concentration-effect curve of each individual chemical. These parameters were determined by the maximum-likelihood method, and the criteria were defined using the variances and the covariance of the parameters. In addition, the criteria were applied to a toxicity test of a binary mixture of p-n-nonylphenol and p-n-octylphenol using the Japanese killifish, medaka (Oryzias latipes). Consequently, the concentration addition model with the confidence interval was capable of predicting the test results at any effect level, and no reason for rejecting concentration addition was found. Environ Toxicol Chem 2016;35:1806-1814. © 2015 SETAC. PMID:26660330
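
    The prediction side of the method is compact enough to sketch: fit a 2-parameter log-logit curve to each chemical alone, invert the curves, and combine them with Loewe's formula. The sketch below uses hypothetical parameter values, not the medaka estimates from the paper; the paper's deviation criteria additionally propagate the variances and covariance of the fitted parameters into a confidence band around this predicted curve.

    ```python
    # Concentration-addition (Loewe additivity) prediction from two fitted
    # single-chemical log-logit curves; parameter values are hypothetical.
    import numpy as np

    def ec(effect, loc, slope):
        """Invert the 2-parameter log-logit curve
        effect = 1 / (1 + exp(-slope * (log10(c) - loc)))."""
        return 10.0 ** (loc + np.log(effect / (1.0 - effect)) / slope)

    curves = {"p-n-nonylphenol": (1.0, 2.0),   # (log10 EC50, slope), toy
              "p-n-octylphenol": (1.3, 2.5)}
    fractions = {"p-n-nonylphenol": 0.5,       # mixture composition
                 "p-n-octylphenol": 0.5}

    effects = np.linspace(0.05, 0.95, 19)
    # Loewe additivity: 1 / ECx(mix) = sum_i p_i / ECx_i.
    ecx_mix = 1.0 / sum(fractions[c] / ec(effects, *curves[c]) for c in curves)
    for e, c in zip(effects, ecx_mix):
        print(f"effect {e:4.0%}: predicted mixture EC = {c:8.2f}")
    ```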

  2. Risk analysis of sulfites used as food additives in China.

    PubMed

    Zhang, Jian Bo; Zhang, Hong; Wang, Hua Li; Zhang, Ji Yue; Luo, Peng Jie; Zhu, Lei; Wang, Zhu Tian

    2014-02-01

This study analyzed the risk of sulfites in food consumed by the Chinese population and assessed the health protection capability of the maximum-permitted levels (MPLs) of sulfites in GB 2760-2011. Sulfites as food additives are overused or abused in many food categories. When the MPL in GB 2760-2011 was used as the sulfite content in food, the intake of sulfites in most surveyed populations was lower than the acceptable daily intake (ADI). Excess intake of sulfites was found in all the surveyed groups when a high percentile of sulfite content in food was used to estimate intake. Moreover, children aged 1-6 years are at high risk of excess sulfite intake. The primary cause of excess sulfite intake in the Chinese population is the overuse and abuse of sulfites by the food industry. The current MPLs of sulfites in GB 2760-2011 protect the health of most of the population.
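
    The screening arithmetic described is a simple intake-versus-ADI comparison. The sketch below uses the real JECFA ADI for sulfites (0-0.7 mg/kg bw/day, expressed as SO2) but invented consumption figures and MPLs; with MPL-level concentrations across several categories, a low body weight alone can push the estimate past the ADI, the pattern the abstract reports for young children:

    ```python
    # Sketch of the MPL-based exposure screen: estimated daily intake (EDI)
    # summed over food categories and compared with the ADI. The category
    # list, consumption amounts, and MPLs below are illustrative only.
    ADI = 0.7  # mg sulfite (as SO2) per kg body weight per day (JECFA)

    # category: (daily consumption in g, assumed MPL in mg/kg)
    diet = {"dried fruit": (20.0, 100.0),
            "wine": (100.0, 250.0),
            "starch product": (50.0, 30.0)}
    body_weight = 19.0  # kg, e.g. a young child

    edi = sum(g / 1000.0 * mpl for g, mpl in diet.values()) / body_weight
    print(f"EDI = {edi:.2f} mg/kg bw/day ({edi / ADI:.0%} of the ADI)")
    ```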

  3. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than from hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimating the operational risk of the lunar landing event and calculates estimates of the risk of Loss of Mission (LOM; an abort is required and is successful), Loss of Crew (LOC; the vehicle crashes or cannot reach orbit), and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
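
    The Monte Carlo structure described lends itself to a compact sketch: each trial draws the pilot/sensor/surface interactions and the outcome is binned into Success, LOM, or LOC. All branch probabilities below are hypothetical placeholders, not Altair numbers:

    ```python
    # Sketch of an LLORM-style Monte Carlo over the landing event.
    # Branch probabilities are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000
    counts = {"success": 0, "LOM (safe abort)": 0, "LOC": 0}

    for _ in range(N):
        hazard_in_footprint = rng.random() < 0.15  # rough terrain below
        sensor_detects = rng.random() < 0.95       # hazard detection works
        can_redesignate = rng.random() < 0.90      # margin to divert in time
        if not hazard_in_footprint or (sensor_detects and can_redesignate):
            counts["success"] += 1
        elif sensor_detects:                       # detected too late: abort
            abort_ok = rng.random() < 0.98         # ascent to safe orbit
            counts["LOM (safe abort)" if abort_ok else "LOC"] += 1
        else:                                      # undetected hazard: crash
            counts["LOC"] += 1

    for outcome, c in counts.items():
        print(f"P({outcome}) ~ {c / N:.4f}")
    ```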

  4. Network reconstruction using nonparametric additive ODE models.

    PubMed

    Henderson, James; Michailidis, George

    2014-01-01

Network representations of biological systems are widespread and reconstructing unknown networks from data is a focal problem for computational biologists. For example, the series of biochemical reactions in a metabolic pathway can be represented as a network, with nodes corresponding to metabolites and edges linking reactants to products. In a different context, regulatory relationships among genes are commonly represented as directed networks with edges pointing from influential genes to their targets. Reconstructing such networks from data is a challenging problem receiving much attention in the literature. There is a particular need for approaches tailored to time-series data and not reliant on direct intervention experiments, as the former are often more readily available. In this paper, we introduce an approach to reconstructing directed networks based on dynamic systems models. Our approach generalizes commonly used ODE models based on linear or nonlinear dynamics by extending the functional class for the functions involved from parametric to nonparametric models. Concomitantly we limit the complexity by imposing an additive structure on the estimated slope functions. Thus the submodel associated with each node is a sum of univariate functions. These univariate component functions form the basis for a novel coupling metric that we define in order to quantify the strength of proposed relationships and hence rank potential edges. We show the utility of the method by reconstructing networks using simulated data from computational models for the glycolytic pathway of Lactococcus lactis and a gene network regulating the pluripotency of mouse embryonic stem cells. For purposes of comparison, we also assess reconstruction performance using gene networks from the DREAM challenges. We compare our method to those that similarly rely on dynamic systems models and use the results to attempt to disentangle the distinct roles of linearity, sparsity, and derivative estimation.
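
    The additive structure being imposed can be written in one line (my notation; the abstract describes it in words):

    ```latex
    % Nonparametric additive ODE model for node j of a p-node network:
    % the slope of x_j is a sum of univariate component functions of the
    % other nodes; an edge k -> j is ranked by the size of f_{jk}.
    \[
      x_j'(t) \;=\; \sum_{k=1}^{p} f_{jk}\bigl(x_k(t)\bigr),
      \qquad j = 1, \dots, p .
    \]
    ```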

  5. Network Reconstruction Using Nonparametric Additive ODE Models

    PubMed Central

    Henderson, James; Michailidis, George

    2014-01-01

Network representations of biological systems are widespread and reconstructing unknown networks from data is a focal problem for computational biologists. For example, the series of biochemical reactions in a metabolic pathway can be represented as a network, with nodes corresponding to metabolites and edges linking reactants to products. In a different context, regulatory relationships among genes are commonly represented as directed networks with edges pointing from influential genes to their targets. Reconstructing such networks from data is a challenging problem receiving much attention in the literature. There is a particular need for approaches tailored to time-series data and not reliant on direct intervention experiments, as the former are often more readily available. In this paper, we introduce an approach to reconstructing directed networks based on dynamic systems models. Our approach generalizes commonly used ODE models based on linear or nonlinear dynamics by extending the functional class for the functions involved from parametric to nonparametric models. Concomitantly we limit the complexity by imposing an additive structure on the estimated slope functions. Thus the submodel associated with each node is a sum of univariate functions. These univariate component functions form the basis for a novel coupling metric that we define in order to quantify the strength of proposed relationships and hence rank potential edges. We show the utility of the method by reconstructing networks using simulated data from computational models for the glycolytic pathway of Lactococcus lactis and a gene network regulating the pluripotency of mouse embryonic stem cells. For purposes of comparison, we also assess reconstruction performance using gene networks from the DREAM challenges. We compare our method to those that similarly rely on dynamic systems models and use the results to attempt to disentangle the distinct roles of linearity, sparsity, and derivative estimation.

  6. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until layer-by-layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost -many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  7. Acute radiation risk models

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

Biologically motivated mathematical models, which describe the dynamics of the major hematopoietic lineages (the thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems) in acutely or chronically irradiated humans, are developed. These models are implemented as systems of nonlinear differential equations whose variables and constant parameters have clear biological meaning. It is shown that the developed models are capable of reproducing clinical data on the dynamics of these systems in humans exposed to acute radiation as a result of incidents and accidents, as well as in humans exposed to low-level chronic radiation. Moreover, the averaged value of the "lethal" dose rates of chronic irradiation evaluated within the models of these four major hematopoietic lineages coincides with the real minimal dose rate of lethal chronic irradiation. The demonstrated ability of the models of the human thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems to predict the dynamical response of these systems to acute/chronic irradiation over wide ranges of doses and dose rates implies that these mathematical models form a universal tool for the investigation and prediction of the dynamics of the major human hematopoietic lineages for a vast pattern of irradiation scenarios. In particular, these models could be applied for radiation risk assessment for the health of astronauts exposed to space radiation during long-term space missions, such as voyages to Mars or lunar colonies, as well as for the health of people exposed to acute or chronic irradiation due to environmental radiological events.

  8. CREATION OF THE MODEL ADDITIONAL PROTOCOL

    SciTech Connect

    Houck, F.; Rosenthal, M.; Wulf, N.

    2010-05-25

    In 1991, the international nuclear nonproliferation community was dismayed to discover that the implementation of safeguards by the International Atomic Energy Agency (IAEA) under its NPT INFCIRC/153 safeguards agreement with Iraq had failed to detect Iraq's nuclear weapon program. It was now clear that ensuring that states were fulfilling their obligations under the NPT would require not just detecting diversion but also the ability to detect undeclared materials and activities. To achieve this, the IAEA initiated what would turn out to be a five-year effort to reappraise the NPT safeguards system. The effort engaged the IAEA and its Member States and led to agreement in 1997 on a new safeguards agreement, the Model Protocol Additional to the Agreement(s) between States and the International Atomic Energy Agency for the Application of Safeguards. The Model Protocol makes explicit that one IAEA goal is to provide assurance of the absence of undeclared nuclear material and activities. The Model Protocol requires an expanded declaration that identifies a State's nuclear potential, empowers the IAEA to raise questions about the correctness and completeness of the State's declaration, and, if needed, allows IAEA access to locations. The information required and the locations available for access are much broader than those provided for under INFCIRC/153. The negotiation was completed in quite a short time because it started with a relatively complete draft of an agreement prepared by the IAEA Secretariat. This paper describes how the Model Protocol was constructed and reviews key decisions that were made both during the five-year period and in the actual negotiation.

  9. Detecting contaminated birthdates using generalized additive models

    PubMed Central

    2014-01-01

    Background Erroneous patient birthdates are common in health databases. Detection of these errors usually involves manual verification, which can be resource intensive and impractical. By identifying a frequent manifestation of birthdate errors, this paper presents a principled and statistically driven procedure to identify erroneous patient birthdates. Results Generalized additive models (GAM) enabled explicit incorporation of known demographic trends and birth patterns. With false positive rates controlled, the method identified birthdate contamination with high accuracy. In the health data set used, of the 58 actual incorrect birthdates manually identified by the domain expert, the GAM-based method identified 51, with 8 false positives (resulting in a positive predictive value of 86.0% (51/59) and a false negative rate of 12.0% (7/58)). These results outperformed linear time-series models. Conclusions The GAM-based method is an effective approach to identify systemic birthdate errors, a common data quality issue in both clinical and administrative databases, with high accuracy. PMID:24923281
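
    The abstract does not give the exact GAM specification. A rough Python analogue of the idea (a smooth long-term trend plus a known weekly birth pattern fit to daily birth counts, with far-out counts flagged) can be sketched with regression splines standing in for the penalized smooths:

    ```python
    # Sketch of GAM-style birthdate anomaly detection: Poisson regression
    # of daily birth counts on a spline trend and day-of-week, flagging
    # dates far above prediction (e.g. default birthdates such as Jan 1).
    # Column names, spline df, and the 4-sigma cutoff are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    dates = pd.date_range("1940-01-01", "1999-12-31", freq="D")
    df = pd.DataFrame({"date": dates})
    df["t"] = np.arange(len(df))            # time index for the trend
    df["dow"] = df["date"].dt.dayofweek     # weekday birth pattern
    rng = np.random.default_rng(1)          # simulated stand-in counts
    mu = np.exp(3.0 + 0.3 * np.sin(df["t"] / 2000.0) - 0.1 * (df["dow"] >= 5))
    df["births"] = rng.poisson(mu)
    # Inject artificial "default birthdate" spikes on Jan 1 to detect.
    df.loc[df["date"].dt.strftime("%m-%d") == "01-01", "births"] += 30

    model = smf.glm("births ~ bs(t, df=20) + C(dow)", data=df,
                    family=sm.families.Poisson()).fit()
    df["expected"] = model.predict(df)
    df["flagged"] = df["births"] > df["expected"] + 4 * np.sqrt(df["expected"])
    print(df.loc[df["flagged"], ["date", "births", "expected"]].head())
    ```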

  10. Are major behavioral and sociodemographic risk factors for mortality additive or multiplicative in their effects?

    PubMed

    Mehta, Neil; Preston, Samuel

    2016-04-01

All individuals are subject to multiple risk factors for mortality. In this paper, we consider the nature of interactions between certain major sociodemographic and behavioral risk factors associated with all-cause mortality in the United States. We develop the formal logic pertaining to two forms of interaction between risk factors, additive and multiplicative relations. We then consider the general circumstances in which additive or multiplicative relations might be expected. We argue that expectations about interactions among socio-demographic variables, and their relation to behavioral variables, have been stated in terms of additivity. However, the statistical models typically used to estimate the relation between risk factors and mortality assume that risk factors act multiplicatively. We examine empirically the nature of interactions among five major risk factors associated with all-cause mortality: smoking, obesity, race, sex, and educational attainment. Data were drawn from the cross-sectional NHANES III (1988-1994) and NHANES 1999-2010 surveys, linked to death records through December 31, 2011. Our analytic sample comprised 35,604 respondents and 5369 deaths. We find that obesity is additive with each of the remaining four variables. We speculate that its additivity is a reflection of the fact that obese status is generally achieved later in life. For all pairings of socio-demographic variables, risks are multiplicative. For survival chances, it is much more dangerous to be poorly educated if you are black or if you are male. And it is much riskier to be a male if you are black. These traits, established at birth or during childhood, literally result in deadly combinations. We conclude that the identification of interactions among risk factors can cast valuable light on the nature of the process being studied. It also has public health implications by identifying especially vulnerable groups and by properly identifying the proportion of deaths attributable to particular combinations of risk factors.
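
    The two benchmarks the authors formalize can be stated in two lines; with R_{ab} denoting the death rate at exposure statuses a and b of two risk factors (notation mine):

    ```latex
    % Additive relation: absolute excess risks sum.
    \[
      R_{11} \;=\; R_{10} + R_{01} - R_{00} .
    \]
    % Multiplicative relation: rate ratios multiply (the implicit default
    % of proportional hazards and logistic models).
    \[
      \frac{R_{11}}{R_{00}} \;=\; \frac{R_{10}}{R_{00}} \cdot \frac{R_{01}}{R_{00}} .
    \]
    ```

    For example, with R00 = 10, R10 = 20 and R01 = 30 deaths per 1000, additivity predicts R11 = 40 while multiplicativity predicts 60; the paper's finding is that obesity pairings track the first pattern and the sociodemographic pairings the second.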

  11. Additional risk of end-of-the-pipe geoengineering technologies

    NASA Astrophysics Data System (ADS)

    Bohle, Martin

    2014-05-01

qualitatively from the known successes. They do not tackle the initial cause, namely carbon-dioxide inputs that are too high. This is their additional specific risk. 'The acceptability of geoengineering will be determined as much by social, legal and political issues as by scientific and technical factors', conclude Adam Corner and Nick Pidgeon (2010) in their review of the social and ethical implications of geoengineering the climate. It is in that context that it should be debated that most geoengineering technologies are 'end-of-the-pipe' technologies, which involves an additional specific risk. Should these technologies be part of the toolbox to tackle anthropogenic climate change? Corner, A. and Pidgeon, N. (2010). Geoengineering the climate: The social and ethical implications. Environment, 52.

  12. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    ERIC Educational Resources Information Center

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  13. Mental Models of Security Risks

    NASA Astrophysics Data System (ADS)

    Asgharpour, Farzaneh; Liu, Debin; Camp, L. Jean

In computer security, risk communication refers to informing computer users about the likelihood and magnitude of a threat. Efficacy of risk communication depends not only on the nature of the risk, but also on the alignment between the conceptual model embedded in the risk communication and the user's mental model of the risk. The gap between the mental models of security experts and non-experts could lead to ineffective risk communication. Our research shows that for a variety of security risks, self-identified security experts and non-experts have different mental models. We propose that the design of risk communication methods should be based on non-expert mental models.

  14. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  15. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  16. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  17. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  18. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  19. Improving coeliac disease risk prediction by testing non-HLA variants additional to HLA variants

    PubMed Central

    Romanos, Jihane; Rosén, Anna; Kumar, Vinod; Trynka, Gosia; Franke, Lude; Szperl, Agata; Gutierrez-Achury, Javier; van Diemen, Cleo C; Kanninga, Roan; Jankipersadsing, Soesma A; Steck, Andrea; Eisenbarth, Georges; van Heel, David A; Cukrowska, Bozena; Bruno, Valentina; Mazzilli, Maria Cristina; Núñez, Concepcion; Bilbao, Jose Ramon; Mearin, M Luisa; Barisani, Donatella; Rewers, Marian; Norris, Jill M; Ivarsson, Anneli; Boezen, H Marieke; Liu, Edwin; Wijmenga, Cisca

    2014-01-01

Background The majority of coeliac disease (CD) patients are not being properly diagnosed and therefore remain untreated, leading to a greater risk of developing CD-associated complications. The major genetic risk heterodimers, HLA-DQ2 and HLA-DQ8, are already used clinically to help exclude disease. However, approximately 40% of the population carry these alleles and the majority never develop CD. Objective We explored whether CD risk prediction can be improved by adding non-HLA susceptibility variants to common HLA testing. Design We developed an average weighted genetic risk score with 10, 26 and 57 single nucleotide polymorphisms (SNPs) in 2675 cases and 2815 controls and assessed the improvement in risk prediction provided by the non-HLA SNPs. Moreover, we assessed the transferability of the genetic risk model with 26 non-HLA variants to a nested case–control population (n=1709) and a prospective cohort (n=1245) and then tested how well this model predicted CD outcome for 985 independent individuals. Results Adding 57 non-HLA variants to HLA testing showed a statistically significant improvement compared with scores from models based on HLA only, HLA plus 10 SNPs and HLA plus 26 SNPs. With 57 non-HLA variants, the area under the receiver operator characteristic curve reached 0.854 compared with 0.823 for HLA only, and 11.1% of individuals were reclassified to a more accurate risk group. We show that the risk model with HLA plus 26 SNPs is useful in independent populations. Conclusions Predicting risk with 57 additional non-HLA variants improved the identification of potential CD patients. This demonstrates a possible role for combined HLA and non-HLA genetic testing in diagnostic work for CD. PMID:23704318
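
    The score construction is straightforward to sketch: each risk-allele dosage is weighted by its log odds ratio and the weighted sum is averaged over variants. The SNPs and odds ratios below are hypothetical, not the study's panel:

    ```python
    # Sketch of an average weighted genetic risk score (GRS). SNP weights
    # and genotypes are toy values; in the study the HLA risk category
    # enters as an additional weighted term.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    log_or = np.log([1.30, 1.18, 0.85, 1.45])  # per-allele ORs (hypothetical)
    genotypes = np.array([[2, 1, 0, 1],        # rows: individuals,
                          [0, 1, 2, 0],        # cols: risk-allele dosages
                          [1, 2, 1, 2],
                          [0, 0, 1, 0]])
    status = np.array([1, 0, 1, 0])            # 1 = coeliac disease case

    grs = genotypes @ log_or / len(log_or)     # average weighted score
    print("scores:", np.round(grs, 3))
    # Discrimination is then summarized exactly as in the paper, by the
    # area under the ROC curve (they report 0.854 for HLA plus 57 SNPs).
    print("AUC:", roc_auc_score(status, grs))
    ```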

  20. Additional safety risk to exceptionally approved drugs in Europe?

    PubMed Central

    Arnardottir, Arna H; Haaijer-Ruskamp, Flora M; Straus, Sabine M J; Eichler, Hans-Georg; de Graeff, Pieter A; Mol, Peter G M

    2011-01-01

    AIMS Regulatory requirements for new drugs have increased. Special approval procedures with priority assessment are possible for drugs with clear ‘unmet medical need’. We question whether these Exceptional Circumstances (EC) or Conditional Approval (CA) procedures have led to a higher probability of serious safety issues. METHODS A retrospective cohort study was performed of new drugs approved in Europe between 1999 and 2009. The determinant was EC/CA vs. standard procedure approval. Outcome variables were frequency and timing of a first Direct Healthcare Professional Communication (DHPC). An association between approval procedure and the time from market approval to DHPC was assessed using Kaplan-Meyer survival analysis and Cox-regression to correct for covariates. RESULTS In total 289 new drugs were approved. Forty-six (16.4%) were approved under EC or CA, of which seven received a DHPC (15%). This was similar to the standard approval drugs (243), of which 33 received one or more DHPC (14%, P = 0.77). The probability of acquiring a DHPC for standard approval drugs vs. EC/CA drugs during 11-year follow-up is 22% (95% CI 14%, 29%) and 26% (95% CI 8%, 44%), respectively (log-rank P = 0.726). This difference remained not significant in the Cox-regression model: hazard ratio 0.94 (95% CI 0.40, 2.20). Only drug type was identified as a confounding covariate. CONCLUSION The EC/CA procedure is not associated with a higher probability of DHPCs despite limited clinical development data. These data do not support the view that early drug approval increases the risk of serious safety issues emerging after market approval. PMID:21501215

  1. Widespread non-additive and interaction effects within HLA loci modulate the risk of autoimmune diseases.

    PubMed

    Lenz, Tobias L; Deutsch, Aaron J; Han, Buhm; Hu, Xinli; Okada, Yukinori; Eyre, Stephen; Knapp, Michael; Zhernakova, Alexandra; Huizinga, Tom W J; Abecasis, Gonçalo; Becker, Jessica; Boeckxstaens, Guy E; Chen, Wei-Min; Franke, Andre; Gladman, Dafna D; Gockel, Ines; Gutierrez-Achury, Javier; Martin, Javier; Nair, Rajan P; Nöthen, Markus M; Onengut-Gumuscu, Suna; Rahman, Proton; Rantapää-Dahlqvist, Solbritt; Stuart, Philip E; Tsoi, Lam C; van Heel, David A; Worthington, Jane; Wouters, Mira M; Klareskog, Lars; Elder, James T; Gregersen, Peter K; Schumacher, Johannes; Rich, Stephen S; Wijmenga, Cisca; Sunyaev, Shamil R; de Bakker, Paul I W; Raychaudhuri, Soumya

    2015-09-01

    Human leukocyte antigen (HLA) genes confer substantial risk for autoimmune diseases on a log-additive scale. Here we speculated that differences in autoantigen-binding repertoires between a heterozygote's two expressed HLA variants might result in additional non-additive risk effects. We tested the non-additive disease contributions of classical HLA alleles in patients and matched controls for five common autoimmune diseases: rheumatoid arthritis (ncases = 5,337), type 1 diabetes (T1D; ncases = 5,567), psoriasis vulgaris (ncases = 3,089), idiopathic achalasia (ncases = 727) and celiac disease (ncases = 11,115). In four of the five diseases, we observed highly significant, non-additive dominance effects (rheumatoid arthritis, P = 2.5 × 10(-12); T1D, P = 2.4 × 10(-10); psoriasis, P = 5.9 × 10(-6); celiac disease, P = 1.2 × 10(-87)). In three of these diseases, the non-additive dominance effects were explained by interactions between specific classical HLA alleles (rheumatoid arthritis, P = 1.8 × 10(-3); T1D, P = 8.6 × 10(-27); celiac disease, P = 6.0 × 10(-100)). These interactions generally increased disease risk and explained moderate but significant fractions of phenotypic variance (rheumatoid arthritis, 1.4%; T1D, 4.0%; celiac disease, 4.1%) beyond a simple additive model. PMID:26258845

  2. Widespread non-additive and interaction effects within HLA loci modulate the risk of autoimmune diseases

    PubMed Central

    Lenz, Tobias L.; Deutsch, Aaron J.; Han, Buhm; Hu, Xinli; Okada, Yukinori; Eyre, Stephen; Knapp, Michael; Zhernakova, Alexandra; Huizinga, Tom W.J.; Abecasis, Goncalo; Becker, Jessica; Boeckxstaens, Guy E.; Chen, Wei-Min; Franke, Andre; Gladman, Dafna D.; Gockel, Ines; Gutierrez-Achury, Javier; Martin, Javier; Nair, Rajan P.; Nöthen, Markus M.; Onengut-Gumuscu, Suna; Rahman, Proton; Rantapää-Dahlqvist, Solbritt; Stuart, Philip E.; Tsoi, Lam C.; Van Heel, David A.; Worthington, Jane; Wouters, Mira M.; Klareskog, Lars; Elder, James T.; Gregersen, Peter K.; Schumacher, Johannes; Rich, Stephen S.; Wijmenga, Cisca; Sunyaev, Shamil R.; de Bakker, Paul I.W.; Raychaudhuri, Soumya

    2015-01-01

Human leukocyte antigen (HLA) genes confer strong risk for autoimmune diseases on a log-additive scale. Here we speculated that differences in autoantigen binding repertoires between a heterozygote's two expressed HLA variants may result in additional non-additive risk effects. We tested non-additive disease contributions of classical HLA alleles in patients and matched controls for five common autoimmune diseases: rheumatoid arthritis (RA, Ncases=5,337), type 1 diabetes (T1D, Ncases=5,567), psoriasis vulgaris (Ncases=3,089), idiopathic achalasia (Ncases=727), and celiac disease (Ncases=11,115). In four out of five diseases, we observed highly significant non-additive dominance effects (RA: P=2.5×10−12; T1D: P=2.4×10−10; psoriasis: P=5.9×10−6; celiac disease: P=1.2×10−87). In three of these diseases, the dominance effects were explained by interactions between specific classical HLA alleles (RA: P=1.8×10−3; T1D: P=8.6×10−27; celiac disease: P=6.0×10−100). These interactions generally increased disease risk and explained moderate but significant fractions of phenotypic variance (RA: 1.4%, T1D: 4.0%, and celiac disease: 4.1%, beyond a simple additive model). PMID:26258845
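
    In both versions of this record, the "non-additive dominance effect" has its usual quantitative-genetics form: on the log-odds scale, a heterozygote indicator is added to the allele dosage. The display below is my notation for a model of that kind, consistent with the described test but not copied from the paper:

    ```latex
    % Log-additive model: risk is linear in the allele dosage g_a in {0,1,2}.
    % The dominance test adds a heterozygote departure term delta_a.
    \[
      \operatorname{logit} P(\text{disease})
        \;=\; \beta_0 + \sum_{a} \beta_a\, g_a
              + \sum_{a} \delta_a\, \mathbf{1}[\,g_a = 1\,] ,
    \]
    % with all delta_a = 0 recovering the purely (log-)additive model.
    ```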

  3. Extracting additional risk managers information from a risk assessment of Listeria monocytogenes in deli meats.

    PubMed

    Pérez-Rodríguez, F; van Asselt, E D; Garcia-Gimeno, R M; Zurera, G; Zwietering, M H

    2007-05-01

    The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and stakeholders to make decisions on food safety management. The present study was conducted to investigate how detailed sensitivity analysis can be used by assessors to extract more information on risk factors and how results can be communicated to managers and stakeholders in an understandable way. The extended sensitivity analysis revealed that the extremes at the right side of the dose distribution (at consumption, 9 to 11.5 log CFU per serving) were responsible for most of the cases of listeriosis simulated. For concentration at retail, values below the detection limit of 0.04 CFU/g and the often used limit for L. monocytogenes of 100 CFU/g (also at retail) were associated with a high number of annual cases of listeriosis (about 29 and 82%, respectively). This association can be explained by growth of L. monocytogenes at both average and extreme values of temperature and time, indicating that a wide distribution can lead to high risk levels. Another finding is the importance of the maximal population density (i.e., the maximum concentration of L. monocytogenes assumed at a certain temperature) for accurately estimating the risk of infection by opportunistic pathogens such as L. monocytogenes. According to the obtained results, mainly concentrations corresponding to the highest maximal population densities caused risk in the simulation. However, sensitivity analysis applied to the uncertainty parameters revealed that prevalence at retail was the most important source of uncertainty in the model.

  4. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  5. Modeling techniques for gaining additional urban space

    NASA Astrophysics Data System (ADS)

    Thunig, Holger; Naumann, Simone; Siegmund, Alexander

    2009-09-01

One of the major accompaniments of globalization is the rapid growth of urban areas. Urban sprawl is the main environmental problem affecting cities of different characteristics across continents. Various reasons for the increase in urban sprawl in the last 10 to 30 years have been proposed [1], and these often depend on the socio-economic situation of a city. The quantitative reduction and sustainable handling of land should be achieved by inner urban development instead of expanding urban regions. Following the principle "spare the urban fringe, develop the inner suburbs first" requires differentiated tools allowing for quantitative and qualitative appraisals of current building potentials. Using spatially high-resolution remote sensing data within an object-based approach enables the detection of potential areas, while GIS data provide information for the quantitative valuation. This paper presents techniques for modeling the urban environment and opportunities for utilizing the retrieved information for urban planners and their special needs.

  6. RISK 0301 - MOLECULAR MODELING

    EPA Science Inventory

Risk assessment practices, in general, for a range of diseases now encourage the use of mechanistic data to enhance the ability to predict responses at low, environmental exposures. In particular, the pathway from normal biology to pathologic state can be described by a set of m...

  7. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Testicular Cancer Risk Prediction Models

    Cancer.gov

Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  16. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  17. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  18. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model is used in portfolio optimization to minimize portfolio risk while achieving a target rate of return, with variance serving as the risk measure. The purpose of this study is to compare the composition and performance of the optimal portfolio under the mean-variance model with those of an equally weighted portfolio, in which equal proportions are invested in each asset. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio differ, and that the mean-variance optimal portfolio performs better, achieving a higher performance ratio than the equally weighted portfolio.
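
    As a rough illustration of the comparison described above, the following sketch computes minimum-variance weights (a special case of the mean-variance problem, without a return target) and contrasts them with equal weights on synthetic returns. All numbers are illustrative, not the study's data.

```python
import numpy as np

# Minimal sketch: mean-variance (minimum-variance) weights vs. an equally
# weighted portfolio, on synthetic return data.
rng = np.random.default_rng(0)
returns = rng.normal(0.001, 0.02, size=(500, 4))   # 500 days, 4 assets

mu = returns.mean(axis=0)            # expected returns
cov = np.cov(returns, rowvar=False)  # covariance (risk) matrix

# Minimum-variance weights subject only to summing to one:
# w = C^{-1} 1 / (1' C^{-1} 1)
ones = np.ones(len(mu))
w_mv = np.linalg.solve(cov, ones)
w_mv /= w_mv.sum()

w_eq = ones / len(mu)                # equally weighted portfolio

for name, w in [("mean-variance", w_mv), ("equal-weight", w_eq)]:
    ret = w @ mu
    risk = np.sqrt(w @ cov @ w)
    print(f"{name}: weights={np.round(w, 3)}, "
          f"return={ret:.5f}, risk={risk:.5f}, ratio={ret / risk:.3f}")
```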

  19. Estimation of radiation risk in presence of classical additive and Berkson multiplicative errors in exposure doses.

    PubMed

    Masiuk, S V; Shklyar, S V; Kukush, A G; Carroll, R J; Kovgan, L N; Likhtarov, I A

    2016-07-01

    In this paper, the influence of measurement errors in exposure doses in a regression model with binary response is studied. Recently, it has been recognized that uncertainty in exposure dose is characterized by errors of two types: classical additive errors and Berkson multiplicative errors. The combination of classical additive and Berkson multiplicative errors has not previously been considered in the literature. In a simulation study based on data from radio-epidemiological research on thyroid cancer in Ukraine caused by the Chornobyl accident, it is shown that ignoring measurement errors in doses leads to overestimation of background prevalence and underestimation of excess relative risk. Several methods to reduce these biases are proposed: a new regression calibration, an additive version of efficient SIMEX, and novel corrected score methods.
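
    A minimal sketch of the two error types discussed above, on simulated doses. All distributions and parameter values are hypothetical, chosen only to show the qualitative difference between the error structures.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Classical additive error: the measured dose is the true dose plus
# independent instrument noise, D = X + U.
x_true = rng.lognormal(mean=0.0, sigma=1.0, size=n)     # true doses
d_classical = x_true + rng.normal(0.0, 0.3, size=n)

# Berkson multiplicative error: the dosimetry system assigns a group
# dose D, and the true individual dose scatters around it, X = D * V
# with V independent of D.
d_assigned = rng.lognormal(mean=0.0, sigma=1.0, size=n)
x_berkson = d_assigned * rng.lognormal(mean=0.0, sigma=0.2, size=n)

# Quick check of the signature difference: classical error inflates the
# variance of the measurement, Berkson error inflates the variance of
# the true dose relative to the assigned dose.
print("var(X), var(D) classical:", np.var(x_true), np.var(d_classical))
print("var(D), var(X) Berkson:  ", np.var(d_assigned), np.var(x_berkson))
```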

  20. Additive Genetic Risk from Five Serotonin System Polymorphisms Interacts with Interpersonal Stress to Predict Depression

    PubMed Central

    Vrshek-Schallhorn, Suzanne; Stroud, Catherine B.; Mineka, Susan; Zinbarg, Richard E.; Adam, Emma K.; Redei, Eva E.; Hammen, Constance; Craske, Michelle G.

    2016-01-01

    Behavioral genetic research supports polygenic models of depression in which many genetic variations each contribute a small amount of risk, and prevailing diathesis-stress models suggest gene-environment interactions (GxE). Multilocus profile scores of additive risk offer an approach that is consistent with polygenic models of depression risk. In a first demonstration of this approach in a GxE predicting depression, we created an additive multilocus profile score from five serotonin system polymorphisms (one each in the genes HTR1A, HTR2A, HTR2C, and two in TPH2). Analyses focused on two forms of interpersonal stress as environmental risk factors. Using five years of longitudinal diagnostic and life stress interviews from 387 emerging young adults in the Youth Emotion Project, survival analyses show that this multilocus profile score interacts with major interpersonal stressful life events to predict major depressive episode onsets (HR = 1.815, p = .007). Simultaneously, there was a significant protective effect of the profile score without a recent event (HR = 0.83, p = .030). The GxE effect with interpersonal chronic stress was not significant (HR = 1.15, p = .165). Finally, effect sizes for genetic factors examined ignoring stress suggested such an approach could lead to overlooking or misinterpreting genetic effects. Both the GxE effect and the protective simple main effect were replicated in a sample of early adolescent girls (N = 105). We discuss potential benefits of the multilocus genetic profile score approach and caveats for future research. PMID:26595467
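
    A minimal sketch of the multilocus-score-by-stress interaction idea in a survival model, assuming the lifelines package and entirely simulated data; genotype frequencies, effect sizes, and column names are hypothetical, not the Youth Emotion Project data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
# Five biallelic loci coded 0/1/2 risk alleles; the additive multilocus
# profile score is simply the sum of risk-allele counts across loci.
genotypes = rng.binomial(2, 0.3, size=(n, 5))
score = genotypes.sum(axis=1)
stress = rng.binomial(1, 0.4, size=n)          # major interpersonal event

# Simulated onset times with a score-by-stress effect on the hazard.
hazard = 0.05 * np.exp(0.25 * (score - score.mean()) * stress)
time_to_onset = rng.exponential(1.0 / hazard)
observed = time_to_onset < 5.0                 # five years of follow-up

df = pd.DataFrame({
    "years": np.minimum(time_to_onset, 5.0),
    "onset": observed.astype(int),
    "score": score,
    "stress": stress,
})
df["score_x_stress"] = df["score"] * df["stress"]   # GxE interaction term

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="onset")
cph.print_summary()   # hazard ratios for score, stress, and interaction
```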

  1. Multifractal Value at Risk model

    NASA Astrophysics Data System (ADS)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper, a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR provides more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.
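
    For readers unfamiliar with VaR itself, the sketch below contrasts two standard estimators on synthetic fat-tailed returns. It illustrates only the VaR concept; the multifractal MFVaR model of the paper is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
returns = rng.standard_t(df=4, size=2000) * 0.01   # fat-tailed daily returns

alpha = 0.01  # 99% VaR

# Historical (empirical-quantile) VaR: loss not exceeded 99% of the time.
var_hist = -np.quantile(returns, alpha)

# Parametric VaR assuming normally distributed returns.
var_norm = -(returns.mean() + norm.ppf(alpha) * returns.std())

print(f"historical 99% VaR: {var_hist:.4f}")
print(f"normal 99% VaR:     {var_norm:.4f}")  # tends to understate fat tails
```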

  2. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
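
    A toy Monte Carlo sketch of the modeling idea, with entirely made-up failure rates and times: instead of assuming LOC at the instant an ECLSS function fails, credit the time the cabin environment takes to degrade to a hazardous state, during which an abort can succeed.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
mission_hours = 200.0

failure_time = rng.exponential(5000.0, size=n)               # ECLSS failure (h)
propagation_time = rng.lognormal(np.log(12.0), 0.5, size=n)  # cabin buffer (h)
abort_time = 8.0                                             # time to return crew (h)

fails = failure_time < mission_hours
loc_instant = fails                                  # instant-LOC assumption
loc_buffered = fails & (propagation_time < abort_time)

print("P(LOC), instant assumption:", loc_instant.mean())
print("P(LOC), with cabin buffer :", loc_buffered.mean())
# The buffered estimate is lower, illustrating why the instant-LOC
# assumption is conservative and can distort relative risk drivers.
```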

  3. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, the Centre for Internet Security guidelines, and NSA configuration guidelines, along with the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the presentation will explore the appropriate usage of these standards, and the paper will discuss standards approaches to conducting risk and security metrics. The research findings demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics, and that such an approach spans all the standards mentioned. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated, clearly demonstrating the benefits of the proposed attribute-based approach and a defined risk and security space for modeling and measuring.

  4. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes in order to make sound institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command process. These models include simulation analysis and probabilistic risk assessment models.

  5. Evaluation of the Performance of Smoothing Functions in Generalized Additive Models for Spatial Variation in Disease

    PubMed Central

    Siangphoe, Umaporn; Wheeler, David C.

    2015-01-01

    Generalized additive models (GAMs) with bivariate smoothing functions have been applied to estimate spatial variation in risk for many types of cancers. Only a handful of studies have evaluated the performance of smoothing functions applied in GAMs with regard to different geographical areas of elevated risk and different risk levels. This study evaluates the ability of different smoothing functions to detect overall spatial variation of risk and elevated risk in diverse geographical areas at various risk levels using a simulation study. We created five scenarios with different true risk area shapes (circle, triangle, linear) in a square study region. We applied four different smoothing functions in the GAMs, including two types of thin plate regression splines (TPRS) and two versions of locally weighted scatterplot smoothing (loess). We tested the null hypothesis of constant risk and detected areas of elevated risk using analysis of deviance with permutation methods and assessed the performance of the smoothing methods based on the spatial detection rate, sensitivity, accuracy, precision, power, and false-positive rate. The results showed that all methods had a higher sensitivity and a consistently moderate-to-high accuracy rate when the true disease risk was higher. The models generally performed better in detecting elevated risk areas than detecting overall spatial variation. One of the loess methods had the highest precision in detecting overall spatial variation across scenarios and outperformed the other methods in detecting a linear elevated risk area. The TPRS methods outperformed loess in detecting elevated risk in two circular areas. PMID:25983545
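
    A minimal sketch of a spatial case-control GAM of the kind evaluated above, assuming the pygam package: a bivariate smooth of location models the log-odds of disease, with an elevated-risk circle planted in a unit-square study region (the circle scenario from the simulation design; all numbers are illustrative).

```python
import numpy as np
from pygam import LogisticGAM, te

rng = np.random.default_rng(5)
n = 2000
xy = rng.uniform(0, 1, size=(n, 2))          # case/control locations

# Elevated risk inside a circle centered at (0.5, 0.5) with radius 0.2.
inside = np.hypot(xy[:, 0] - 0.5, xy[:, 1] - 0.5) < 0.2
p = np.where(inside, 0.5, 0.2)
case = rng.binomial(1, p)

# Bivariate (tensor-product) smooth of the two coordinates, analogous in
# spirit to the TPRS and loess smoothers compared in the study.
gam = LogisticGAM(te(0, 1)).fit(xy, case)
risk_surface = gam.predict_mu(xy)            # fitted P(case | x, y)
print("mean fitted risk inside vs outside:",
      risk_surface[inside].mean(), risk_surface[~inside].mean())
```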

  6. Towards internationally acceptable standards for food additives and contaminants based on the use of risk analysis.

    PubMed

    Huggett, A; Petersen, B J; Walker, R; Fisher, C E; Notermans, S H; Rombouts, F M; Abbott, P; Debackere, M; Hathaway, S C; Hecker, E F; Knaap, A G; Kuznesof, P M; Meyland, I; Moy, G; Narbonne, J F; Paakkanen, J; Smith, M R; Tennant, D; Wagstaffe, P; Wargo, J; Würtzen, G

    1998-06-01

    Internationally acceptable norms need to incorporate sound science and consistent risk management principles in an open and transparent manner, as set out in the Agreement on the Application of Sanitary and Phytosanitary Measures (the SPS Agreement). The process of risk analysis provides a procedure to reach these goals. The interaction between risk assessors and risk managers is considered vital to this procedure. This paper reports the outcome of a meeting of risk assessors and risk managers on specific aspects of risk analysis and its application to international standard setting for food additives and contaminants. Case studies on aflatoxins and aspartame were used to identify the key steps of the interaction process which ensure scientific justification for risk management decisions. A series of recommendations were proposed in order to enhance the scientific transparency in these critical phases of the standard setting procedure.

  7. Comprehensive European dietary exposure model (CEDEM) for food additives.

    PubMed

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can reproduce the estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
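
    For orientation, the sketch below shows the generic shape of a deterministic dietary exposure calculation, summing consumption times maximum permitted level over food groups and dividing by body weight. All food groups and values are illustrative stand-ins, not CEDEM's or EFSA's data or exact algorithm.

```python
# exposure (mg/kg bw/day) = sum over food groups of
#   consumption (kg/day) * maximum additive level (mg/kg food) / body weight
foods = {
    # food group: (consumption g/day, max additive level mg/kg food)
    "soft drinks": (250.0, 150.0),
    "desserts":    (80.0, 300.0),
    "sauces":      (30.0, 500.0),
}
body_weight_kg = 70.0

exposure = sum(cons_g / 1000.0 * level_mg_per_kg
               for cons_g, level_mg_per_kg in foods.values()) / body_weight_kg
print(f"estimated exposure: {exposure:.3f} mg/kg bw/day")
```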

  8. "The Dose Makes the Poison": Informing Consumers About the Scientific Risk Assessment of Food Additives.

    PubMed

    Bearth, Angela; Cousin, Marie-Eve; Siegrist, Michael

    2016-01-01

    Intensive risk assessment is required before the approval of food additives. During this process, based on the toxicological principle of "the dose makes the poison," maximum usage doses are assessed. However, most consumers are not aware of these efforts to ensure the safety of food additives and are therefore sceptical, even though food additives bring certain benefits to consumers. This study investigated the effect of a short video, which explains the scientific risk assessment and regulation of food additives, on consumers' perceptions and acceptance of food additives. The primary goal of this study was to inform consumers and enable them to construct their own risk-benefit assessment and make informed decisions about food additives. The secondary goal was to investigate whether people have different perceptions of food additives of artificial (i.e., aspartame) or natural origin (i.e., steviolglycoside). To attain these research goals, an online experiment was conducted on 185 Swiss consumers. Participants were randomly assigned to either the experimental group, which was shown a video about the scientific risk assessment of food additives, or the control group, which was shown a video about a topic irrelevant to the study. After watching the video, the respondents knew significantly more, expressed more positive thoughts and feelings, had less risk perception, and more acceptance than prior to watching the video. Thus, it appears that informing consumers about complex food safety topics, such as the scientific risk assessment of food additives, is possible, and using a carefully developed information video is a successful strategy for informing consumers. PMID:25951078

  9. "The Dose Makes the Poison": Informing Consumers About the Scientific Risk Assessment of Food Additives.

    PubMed

    Bearth, Angela; Cousin, Marie-Eve; Siegrist, Michael

    2016-01-01

    Intensive risk assessment is required before the approval of food additives. During this process, based on the toxicological principle of "the dose makes the poison,ˮ maximum usage doses are assessed. However, most consumers are not aware of these efforts to ensure the safety of food additives and are therefore sceptical, even though food additives bring certain benefits to consumers. This study investigated the effect of a short video, which explains the scientific risk assessment and regulation of food additives, on consumers' perceptions and acceptance of food additives. The primary goal of this study was to inform consumers and enable them to construct their own risk-benefit assessment and make informed decisions about food additives. The secondary goal was to investigate whether people have different perceptions of food additives of artificial (i.e., aspartame) or natural origin (i.e., steviolglycoside). To attain these research goals, an online experiment was conducted on 185 Swiss consumers. Participants were randomly assigned to either the experimental group, which was shown a video about the scientific risk assessment of food additives, or the control group, which was shown a video about a topic irrelevant to the study. After watching the video, the respondents knew significantly more, expressed more positive thoughts and feelings, had less risk perception, and more acceptance than prior to watching the video. Thus, it appears that informing consumers about complex food safety topics, such as the scientific risk assessment of food additives, is possible, and using a carefully developed information video is a successful strategy for informing consumers.

  10. How to interpret a small increase in AUC with an additional risk prediction marker: decision analysis comes through.

    PubMed

    Baker, Stuart G; Schuit, Ewoud; Steyerberg, Ewout W; Pencina, Michael J; Vickers, Andrew; Moons, Karel G M; Mol, Ben W J; Lindeman, Karen S

    2014-09-28

    An important question in the evaluation of an additional risk prediction marker is how to interpret a small increase in the area under the receiver operating characteristic curve (AUC). Many researchers believe that a change in AUC is a poor metric because it increases only slightly with the addition of a marker with a large odds ratio. Because it is not possible on purely statistical grounds to choose between the odds ratio and AUC, we invoke decision analysis, which incorporates costs and benefits. For example, a timely estimate of the risk of later non-elective operative delivery can help a woman in labor decide if she wants an early elective cesarean section to avoid greater complications from possible later non-elective operative delivery. A basic risk prediction model for later non-elective operative delivery involves only antepartum markers. Because adding intrapartum markers to this risk prediction model increases AUC by 0.02, we questioned whether this small improvement is worthwhile. A key decision-analytic quantity is the risk threshold, here the risk of later non-elective operative delivery at which a patient would be indifferent between an early elective cesarean section and usual care. For a range of risk thresholds, we found that an increase in the net benefit of risk prediction requires collecting intrapartum marker data on 68 to 124 women for every correct prediction of later non-elective operative delivery. Because data collection is non-invasive, this test tradeoff of 68 to 124 is clinically acceptable, indicating the value of adding intrapartum markers to the risk prediction model.
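
    A minimal sketch of the decision-analytic quantities used above: net benefit at a risk threshold, and the test tradeoff (number of marker measurements per extra correct prediction). Data are simulated and all numbers are illustrative, not the obstetric study's.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5000
outcome = rng.binomial(1, 0.15, size=n)    # e.g. non-elective operative delivery
# Baseline risk model and a model with an added marker (more separation).
risk_base = np.clip(0.15 + 0.10 * (outcome - 0.15) + rng.normal(0, 0.05, n), 0, 1)
risk_plus = np.clip(risk_base + 0.05 * (outcome - 0.15) + rng.normal(0, 0.02, n), 0, 1)

def net_benefit(risk, y, pt):
    """Net benefit = TP/n - (FP/n) * pt/(1-pt) at risk threshold pt."""
    treat = risk >= pt
    tp = np.mean(treat & (y == 1))
    fp = np.mean(treat & (y == 0))
    return tp - fp * pt / (1 - pt)

pt = 0.2   # threshold at which a patient is indifferent between options
delta = net_benefit(risk_plus, outcome, pt) - net_benefit(risk_base, outcome, pt)
if delta > 0:
    # Test tradeoff: markers measured per additional true prediction.
    print(f"test tradeoff at pt={pt}: {1 / delta:.0f} women per correct prediction")
else:
    print("added marker gives no net benefit at this threshold")
```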

  11. How to interpret a small increase in AUC with an additional risk prediction marker: Decision analysis comes through

    PubMed Central

    Baker, Stuart G.; Schuit, Ewoud; Steyerberg, Ewout W.; Pencina, Michael J.; Vickers, Andrew; Moons, Karel G. M.; Mol, Ben W.J.; Lindeman, Karen S.

    2014-01-01

    An important question in the evaluation of an additional risk prediction marker is how to interpret a small increase in the area under the receiver operating characteristic curve (AUC). Many researchers believe that a change in AUC is a poor metric because it increases only slightly with the addition of a marker with a large odds ratio. Because it is not possible on purely statistical grounds to choose between the odds ratio and AUC, we invoke decision analysis, which incorporates costs and benefits. For example, a timely estimate of the risk of later non-elective operative delivery can help a woman in labor decide if she wants an early elective cesarean section to avoid greater complications from possible later non-elective operative delivery. A basic risk prediction model for later non-elective operative delivery involves only antepartum markers. Because adding intrapartum markers to this risk prediction model increases AUC by 0.02, we questioned whether this small improvement is worthwhile. A key decision-analytic quantity is the risk threshold, here the risk of later non-elective operative delivery at which a patient would be indifferent between an early elective cesarean section and usual care. For a range of risk thresholds, we found that an increase in the net benefit of risk prediction requires collecting intrapartum marker data on 68 to 124 women for every correct prediction of later non-elective operative delivery. Because data collection is non-invasive, this test tradeoff of 68 to 124 is clinically acceptable, indicating the value of adding intrapartum markers to the risk prediction model. PMID:24825728

  12. Transferability of regional permafrost disturbance susceptibility modelling using generalized linear and generalized additive models

    NASA Astrophysics Data System (ADS)

    Rudy, Ashley C. A.; Lamoureux, Scott F.; Treitz, Paul; van Ewijk, Karin Y.

    2016-07-01

    To effectively assess and mitigate risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory to test the transferability of the model to a broader region in the Canadian High Arctic. Resulting maps of susceptibility were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both sites and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim) with areas under the receiver operating characteristic curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without calibration and validated equally well, with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas. Terrain attributes associated with the initiation of disturbances were
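
    A minimal sketch of the transferability test described above: calibrate a GLM (logistic regression) on one region, apply it unchanged to an independent site, and compare AUROCs. The terrain predictors here are simulated stand-ins for slope, wetness index, and the like.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

def make_site(n, shift=0.0):
    """Simulate one site: 4 terrain predictors and disturbance labels."""
    X = rng.normal(shift, 1.0, size=(n, 4))
    logit = -1.0 + X @ np.array([0.8, -0.5, 0.3, 0.0])
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    return X, y

X_cal, y_cal = make_site(1500)             # calibration sites analogue
X_new, y_new = make_site(500, shift=0.3)   # independent site analogue

glm = LogisticRegression().fit(X_cal, y_cal)
print("calibration AUROC:", roc_auc_score(y_cal, glm.predict_proba(X_cal)[:, 1]))
print("transfer AUROC:   ", roc_auc_score(y_new, glm.predict_proba(X_new)[:, 1]))
```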

  13. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  14. Modeling the cardiovascular system using a nonlinear additive autoregressive model with exogenous input

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Suhrbier, A.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2008-07-01

    The parameters of heart rate variability and blood pressure variability have proved to be useful analytical tools in cardiovascular physics and medicine. Model-based analysis of these variabilities additionally leads to new prognostic information about mechanisms behind regulations in the cardiovascular system. In this paper, we analyze the complex interaction between heart rate, systolic blood pressure, and respiration with nonparametrically fitted nonlinear additive autoregressive models with external inputs. To this end, we consider measurements of healthy persons and patients suffering from obstructive sleep apnea syndrome (OSAS), with and without hypertension. It is shown that the proposed nonlinear models are capable of describing short-term fluctuations in heart rate as well as systolic blood pressure significantly better than similar linear ones, which confirms the assumption of nonlinearly controlled heart rate and blood pressure. Furthermore, the comparison of the nonlinear and linear approaches reveals that the heart rate and blood pressure variability in healthy subjects is caused by a higher level of noise as well as nonlinearity than in patients suffering from OSAS. The residue analysis points at a further source of heart rate and blood pressure variability in healthy subjects, in addition to heart rate, systolic blood pressure, and respiration. Comparison of the nonlinear models within and among the different groups of subjects suggests the ability to discriminate the cohorts, which could lead to a stratification of hypertension risk in OSAS patients.
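
    A minimal sketch of a nonlinear additive autoregressive model with exogenous input, fit by least squares on synthetic signals. A fixed polynomial basis stands in for the paper's nonparametric fits, and the signals and coefficients are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
T = 2000
# Synthetic respiration, systolic blood pressure, and heart rate signals.
resp = np.sin(2 * np.pi * 0.25 * np.arange(T)) + 0.1 * rng.normal(size=T)
sbp = 120 + 5 * np.roll(resp, 2) + rng.normal(size=T)
hr = (60 + 0.3 * np.roll(resp, 1) ** 2
      - 0.02 * (np.roll(sbp, 1) - 120) + rng.normal(0, 0.5, T))

def basis(x):
    # Each input enters additively through its own nonlinear basis.
    return np.column_stack([x, x ** 2, x ** 3])

lag = 1
y = hr[lag:]
X = np.column_stack([np.ones(len(y)),
                     basis(hr[:-lag]), basis(sbp[:-lag]), basis(resp[:-lag])])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef
print("residual std of additive NARX fit:", resid.std())
```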

  15. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data: Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.
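
    A minimal sketch of the kind of relational structure described above: a risk register with risk-reduction tasks mapped to identified risks, and one of the grouped views. Table names, columns, and sample rows are hypothetical, not the actual RMS schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE risk (
    risk_id INTEGER PRIMARY KEY,
    title TEXT, system TEXT, impact TEXT, status TEXT
);
CREATE TABLE task (
    task_id INTEGER PRIMARY KEY,
    risk_id INTEGER REFERENCES risk(risk_id),
    description TEXT, project_phase TEXT, status TEXT
);
""")
con.execute("INSERT INTO risk VALUES (1, 'Fuel qualification delay', "
            "'Reactor core', 'High', 'Open')")
con.execute("INSERT INTO task VALUES (1, 1, 'Irradiation test campaign', "
            "'Phase 1', 'In progress')")

# Analogue of a 'Tasks by Project Phase/Status' custom view.
for row in con.execute("""
    SELECT t.project_phase, t.status, r.title
    FROM task t JOIN risk r USING (risk_id)
    ORDER BY t.project_phase, t.status"""):
    print(row)
```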

  16. Do Health Professionals Need Additional Competencies for Stratified Cancer Prevention Based on Genetic Risk Profiling?

    PubMed Central

    Chowdhury, Susmita; Henneman, Lidewij; Dent, Tom; Hall, Alison; Burton, Alice; Pharoah, Paul; Pashayan, Nora; Burton, Hilary

    2015-01-01

    There is growing evidence that inclusion of genetic information about known common susceptibility variants may enable population risk-stratification and personalized prevention for common diseases including cancer. This would require the inclusion of genetic testing as an integral part of individual risk assessment of an asymptomatic individual. Front line health professionals would be expected to interact with and assist asymptomatic individuals through the risk stratification process. In that case, additional knowledge and skills may be needed. Current guidelines and frameworks for genetic competencies of non-specialist health professionals place an emphasis on rare inherited genetic diseases. For common diseases, health professionals do use risk assessment tools but such tools currently do not assess genetic susceptibility of individuals. In this article, we compare the skills and knowledge needed by non-genetic health professionals, if risk-stratified prevention is implemented, with existing competence recommendations from the UK, USA and Europe, in order to assess the gaps in current competences. We found that health professionals would benefit from understanding the contribution of common genetic variations in disease risk, the rationale for a risk-stratified prevention pathway, and the implications of using genomic information in risk-assessment and risk management of asymptomatic individuals for common disease prevention. PMID:26068647

  17. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skills. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and the lack of prediction capability. Therefore, the multiplicative error model is a better choice.
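
    A minimal sketch of the two error models, fit to synthetic satellite-versus-reference daily precipitation pairs: the additive model Y = X + a + e works on raw differences, while the multiplicative model Y = X * exp(b + e) is linear in log space. All values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.gamma(0.5, 10.0, size=2000) + 0.1          # reference rain (mm/day)
y = x * np.exp(0.2 + rng.normal(0, 0.4, x.size))   # simulated satellite estimate

# Additive model: systematic error = mean(Y - X), random error = std(Y - X).
add_sys = np.mean(y - x)
add_rand = np.std(y - x)

# Multiplicative model: same decomposition on log(Y) - log(X).
logratio = np.log(y) - np.log(x)
mul_sys = np.mean(logratio)
mul_rand = np.std(logratio)

print(f"additive:       bias={add_sys:.2f} mm, noise std={add_rand:.2f} mm")
print(f"multiplicative: bias={mul_sys:.2f} (log), noise std={mul_rand:.2f} (log)")
# For multiplicatively generated data, the log-space noise std is stable
# across rain rates, while the additive noise std grows with intensity.
```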

  18. Electroacoustics modeling of piezoelectric welders for ultrasonic additive manufacturing processes

    NASA Astrophysics Data System (ADS)

    Hehr, Adam; Dapino, Marcelo J.

    2016-04-01

    Ultrasonic additive manufacturing (UAM) is a recent 3D metal printing technology which utilizes ultrasonic vibrations from high-power piezoelectric transducers to additively weld similar and dissimilar metal foils. CNC machining is used intermittently with welding to create internal channels, embed temperature-sensitive components, sensors, and materials, and to net-shape parts. Structural dynamics of the welder and workpiece influence the performance of the welder and part quality. To understand the impact of structural dynamics on UAM, a linear time-invariant model is used to relate system shear force and electric current inputs to the system outputs of welder velocity and voltage. Frequency response measurements are combined with in-situ operating measurements of the welder to identify model parameters and to verify model assumptions. The proposed LTI model can enhance process consistency and performance, and guide the development of improved quality monitoring and control strategies.
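
    A minimal sketch of estimating a frequency response function empirically, as in the identification step described above: H(f) = Pxy/Pxx from an input signal to an output signal. The "plant" here is a made-up second-order resonance near a typical UAM weld frequency, not an actual welder model.

```python
import numpy as np
from scipy import signal

fs = 100_000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(10)
x = rng.normal(size=t.size)                  # broadband excitation (input)

# Hypothetical resonant system around 20 kHz.
b, a = signal.iirpeak(w0=20_000, Q=30, fs=fs)
y = signal.lfilter(b, a, x) + 0.01 * rng.normal(size=t.size)  # noisy output

f, Pxx = signal.welch(x, fs=fs, nperseg=4096)   # input auto-spectrum
_, Pxy = signal.csd(x, y, fs=fs, nperseg=4096)  # cross-spectrum
H = Pxy / Pxx                                   # H1 FRF estimator
print("peak response near", f[np.argmax(np.abs(H))], "Hz")
```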

  19. Additive composite ABCG2, SLC2A9 and SLC22A12 scores of high-risk alleles with alcohol use modulate gout risk.

    PubMed

    Tu, Hung-Pin; Chung, Chia-Min; Min-Shan Ko, Albert; Lee, Su-Shin; Lai, Han-Ming; Lee, Chien-Hung; Huang, Chung-Ming; Liu, Chiu-Shong; Ko, Ying-Chin

    2016-09-01

    The aim of the present study was to evaluate the contribution of urate transporter genes and alcohol use to the risk of gout/tophi. Eight variants of ABCG2, SLC2A9, SLC22A12, SLC22A11 and SLC17A3 were genotyped in male individuals in a case-control study with 157 gout patients (33% with tophi), 106 asymptomatic hyperuricaemia subjects and 295 controls from Taiwan. The multilocus profiles of the genetic risk scores for urate gene variants were used to evaluate the risk of asymptomatic hyperuricaemia, gout and tophi. ABCG2 Q141K (T), SLC2A9 rs1014290 (A) and SLC22A12 rs475688 (C) under an additive model and alcohol use independently predicted the risk of gout (respective odds ratios=2.48, 2.03, 1.95 and 2.48). The additive composite Q141K, rs1014290 and rs475688 scores of high-risk alleles were associated with gout risk (P<0.0001). We observed a supramultiplicative interaction effect of genetic urate scores and alcohol use on gout and tophi risk (P for interaction=0.0452, 0.0033). The synergistic effect of genetic urate score 5-6 and alcohol use indicates that these combined factors correlate with gout and tophi occurrence.
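
    A minimal sketch of testing a score-by-alcohol interaction with a logistic model on simulated data; allele frequencies, effect sizes, and the cohort itself are hypothetical stand-ins, not the Taiwanese study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 600
# Three loci (analogues of ABCG2 Q141K, SLC2A9 rs1014290, SLC22A12
# rs475688), each contributing 0-2 high-risk alleles to the additive score.
score = rng.binomial(2, 0.3, size=(n, 3)).sum(axis=1)
alcohol = rng.binomial(1, 0.3, size=n)
logit = -2.0 + 0.4 * score + 0.6 * alcohol + 0.3 * score * alcohol
gout = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"gout": gout, "score": score, "alcohol": alcohol})
fit = smf.logit("gout ~ score * alcohol", data=df).fit(disp=False)
print(np.exp(fit.params))   # odds ratios, including the interaction term
```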

  20. An Additional Symmetry in the Weinberg-Salam Model

    SciTech Connect

    Bakker, B.L.G.; Veselov, A.I.; Zubkov, M.A.

    2005-06-01

    An additional Z_6 symmetry hidden in the fermion and Higgs sectors of the Standard Model has been found recently. It has a singular nature and is connected to the centers of the SU(3) and SU(2) subgroups of the gauge group. A lattice regularization of the Standard Model was constructed that possesses this symmetry. In this paper, we report our results on the numerical simulation of its electroweak sector.

  1. Modeling uranium transport in acidic contaminated groundwater with base addition

    SciTech Connect

    Zhang, Fan; Luo, Wensui; Parker, Jack C.; Brooks, Scott C; Watson, David B; Jardine, Philip; Gu, Baohua

    2011-01-01

    This study investigates reactive transport modeling in a column of uranium(VI)-contaminated sediments with base additions in the circulating influent. The groundwater and sediment exhibit oxic conditions with low pH, high concentrations of NO(3)(-), SO(4)(2-), U and various metal cations. Preliminary batch experiments indicate that additions of strong base induce rapid immobilization of U for this material. In the column experiment that is the focus of the present study, effluent groundwater was titrated with NaOH solution in an inflow reservoir before reinjection to gradually increase the solution pH in the column. An equilibrium hydrolysis, precipitation and ion exchange reaction model developed through simulation of the preliminary batch titration experiments predicted faster reduction of aqueous Al than observed in the column experiment. The model was therefore modified to consider reaction kinetics for the precipitation and dissolution processes which are the major mechanism for Al immobilization. The combined kinetic and equilibrium reaction model adequately described variations in pH, aqueous concentrations of metal cations (Al, Ca, Mg, Sr, Mn, Ni, Co), sulfate and U(VI). The experimental and modeling results indicate that U(VI) can be effectively sequestered with controlled base addition due to sorption by slowly precipitated Al with pH-dependent surface charge. The model may prove useful to predict field-scale U(VI) sequestration and remediation effectiveness.

  2. Modeling uranium transport in acidic contaminated groundwater with base addition.

    PubMed

    Zhang, Fan; Luo, Wensui; Parker, Jack C; Brooks, Scott C; Watson, David B; Jardine, Philip M; Gu, Baohua

    2011-06-15

    This study investigates reactive transport modeling in a column of uranium(VI)-contaminated sediments with base additions in the circulating influent. The groundwater and sediment exhibit oxic conditions with low pH, high concentrations of NO(3)(-), SO(4)(2-), U and various metal cations. Preliminary batch experiments indicate that additions of strong base induce rapid immobilization of U for this material. In the column experiment that is the focus of the present study, effluent groundwater was titrated with NaOH solution in an inflow reservoir before reinjection to gradually increase the solution pH in the column. An equilibrium hydrolysis, precipitation and ion exchange reaction model developed through simulation of the preliminary batch titration experiments predicted faster reduction of aqueous Al than observed in the column experiment. The model was therefore modified to consider reaction kinetics for the precipitation and dissolution processes which are the major mechanism for Al immobilization. The combined kinetic and equilibrium reaction model adequately described variations in pH, aqueous concentrations of metal cations (Al, Ca, Mg, Sr, Mn, Ni, Co), sulfate and U(VI). The experimental and modeling results indicate that U(VI) can be effectively sequestered with controlled base addition due to sorption by slowly precipitated Al with pH-dependent surface charge. The model may prove useful to predict field-scale U(VI) sequestration and remediation effectiveness.
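
    A toy sketch of the kinetic element these records describe: first-order precipitation of dissolved Al toward a pH-dependent equilibrium concentration as base is slowly added. The rate constants and the solubility function are hypothetical, chosen only to illustrate coupling a kinetic rate law to a titration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def al_equilibrium(ph):
    # Assumed solubility curve: dissolved-Al equilibrium drops steeply
    # as pH rises above the initial acidic value (illustrative only).
    return 1e-3 * 10 ** (-2.0 * (ph - 3.5))

def rhs(t, y, k_precip=0.05, ph_rate=0.01):
    al, ph = y
    al_eq = al_equilibrium(ph)
    dal = -k_precip * max(al - al_eq, 0.0)   # kinetic precipitation only
    dph = ph_rate                            # slow titration with NaOH
    return [dal, dph]

sol = solve_ivp(rhs, (0.0, 300.0), y0=[1e-3, 3.5], max_step=1.0)
print("final pH:", round(sol.y[1, -1], 2),
      " dissolved Al (M):", sol.y[0, -1])
```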

  3. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
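
    A minimal sketch of the diversity credit: a function is available if any connected element still provides it, so its unavailability is the product of the element unavailabilities. Element names and availability numbers are illustrative.

```python
def function_availability(element_availabilities):
    """P(function up) when several elements independently provide it."""
    q = 1.0
    for a in element_availabilities:
        q *= (1.0 - a)          # probability that every provider is down
    return 1.0 - q

habitat, rover = 0.98, 0.95     # hypothetical element availabilities
print("habitat alone:  ", habitat)
print("habitat + rover:", function_availability([habitat, rover]))
# Tracking functions rather than systems credits this diverse backup,
# raising the estimated availability of the operational mode.
```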

  4. Using Set Model for Learning Addition of Integers

    ERIC Educational Resources Information Center

    Lestari, Umi Puji; Putri, Ratu Ilma Indra; Hartono, Yusuf

    2015-01-01

    This study aims to investigate how a set model can help students' understanding of addition of integers in the fourth grade. The study was carried out with 23 students and a teacher of IVC SD Iba Palembang in January 2015. This study is design research that also promotes PMRI as the underlying design context and activity. Results showed that the…

  5. Testing Nested Additive, Multiplicative, and General Multitrait-Multimethod Models.

    ERIC Educational Resources Information Center

    Coenders, Germa; Saris, Willem E.

    2000-01-01

    Provides alternatives to the definitions of additive and multiplicative method effects in multitrait-multimethod data given by D. Campbell and E. O'Connell (1967). The alternative definitions can be formulated by means of constraints in the parameters of the correlated uniqueness model (H. Marsh, 1989). (SLD)

  6. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  7. Structural equation modeling in environmental risk assessment.

    PubMed

    Buncher, C R; Succop, P A; Dietrich, K N

    1991-01-01

    Environmental epidemiology requires effective models that take individual observations of environmental factors and connect them into meaningful patterns. Single-factor relationships have given way to multivariable analyses; simple additive models have been augmented by multiplicative (logistic) models. Each of these steps has produced greater enlightenment and understanding. Models that allow for factors causing outputs that can affect later outputs with putative causation working at several different time points (e.g., linkage) are not commonly used in the environmental literature. Structural equation models are a class of covariance structure models that have been used extensively in economics/business and social science but are still little used in the realm of biostatistics. Path analysis in genetic studies is one simplified form of this class of models. We have been using these models in a study of the health and development of infants who have been exposed to lead in utero and in the postnatal home environment. These models require as input the directionality of the relationship and then produce fitted models for multiple inputs causing each factor and the opportunity to have outputs serve as input variables into the next phase of the simultaneously fitted model. Some examples of these models from our research are presented to increase familiarity with this class of models. Use of these models can provide insight into the effect of changing an environmental factor when assessing risk. The usual cautions concerning believing a model, believing causation has been proven, and the assumptions that are required for each model are operative. PMID:2050063
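
    A minimal sketch of a recursive path model of the kind described, fit as separate regressions on simulated data. Variables and coefficients are hypothetical stand-ins for the lead-exposure example; dedicated SEM software would fit the equations simultaneously, which coincides with equation-by-equation least squares for a recursive model like this one.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(12)
n = 400
pb_prenatal = rng.normal(size=n)                         # prenatal lead
birth_weight = -0.3 * pb_prenatal + rng.normal(size=n)   # first outcome
pb_postnatal = 0.5 * pb_prenatal + rng.normal(size=n)    # postnatal lead
development = 0.4 * birth_weight - 0.3 * pb_postnatal + rng.normal(size=n)

df = pd.DataFrame(dict(pb1=pb_prenatal, bw=birth_weight,
                       pb2=pb_postnatal, dev=development))

eq1 = smf.ols("bw ~ pb1", data=df).fit()        # pb1 -> bw
eq2 = smf.ols("dev ~ bw + pb2", data=df).fit()  # bw, pb2 -> dev
indirect = eq1.params["pb1"] * eq2.params["bw"] # path pb1 -> bw -> dev
print("indirect effect of prenatal lead on development:", round(indirect, 3))
```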

  8. Estimating classification images with generalized linear and additive models.

    PubMed

    Knoblauch, Kenneth; Maloney, Laurence T

    2008-12-22

    Conventional approaches to modeling classification image data can be described in terms of a standard linear model (LM). We show how the problem can be characterized as a Generalized Linear Model (GLM) with a Bernoulli distribution. We demonstrate via simulation that this approach is more accurate in estimating the underlying template in the absence of internal noise. With increasing internal noise, however, the advantage of the GLM over the LM decreases and GLM is no more accurate than LM. We then introduce the Generalized Additive Model (GAM), an extension of GLM that can be used to estimate smooth classification images adaptively. We show that this approach is more robust to the presence of internal noise, and finally, we demonstrate that GAM is readily adapted to estimation of higher order (nonlinear) classification images and to testing their significance.
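
    A minimal sketch of the GLM approach on simulated data: an observer's yes/no responses to noise stimuli are regressed on the pixel noise with a Bernoulli (logistic) model, and the coefficient map serves as the classification-image estimate. The template, noise, and internal-noise level are all made up.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(13)
n_trials, n_pix = 4000, 64
template = np.zeros(n_pix)
template[24:40] = 1.0                 # assumed 1-D "signal" region

noise = rng.normal(size=(n_trials, n_pix))          # per-trial noise fields
decision_var = noise @ template + 0.5 * rng.normal(size=n_trials)  # + internal noise
responses = (decision_var > 0).astype(int)          # simulated yes/no answers

glm = LogisticRegression(C=1.0).fit(noise, responses)
estimate = glm.coef_.ravel()                        # classification image
corr = np.corrcoef(estimate, template)[0, 1]
print(f"correlation between GLM classification image and template: {corr:.2f}")
```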

  9. The apolipoprotein epsilon4 allele confers additional risk in children with familial hypercholesterolemia.

    PubMed

    Wiegman, Albert; Sijbrands, Eric J G; Rodenburg, Jessica; Defesche, Joep C; de Jongh, Saskia; Bakker, Henk D; Kastelein, John J P

    2003-06-01

    Children with familial hypercholesterolemia (FH) exhibit substantial variance of LDL cholesterol. In previous studies, family members of children with FH were included, which may have influenced results. To avoid such bias, we studied phenotype in 450 unrelated children with FH and in 154 affected sib-pairs. In known families with classical FH, diagnosis was based on plasma LDL cholesterol above the age- and gender-specific 95th percentile. Girls had 0.47 +/- 0.15 mmol/L higher LDL cholesterol, compared with boys (p = 0.002). Also in girls, HDL cholesterol increased by 0.07 +/- 0.03 mmol/L per 5 y (p for trend = 0.005); this age effect was not observed in boys. The distribution of apolipoprotein (apo) E genotypes was not significantly different between probands, their paired affected siblings, or a Dutch control population. Carriers with or without one epsilon4 allele had similar LDL and HDL cholesterol levels. Within the affected sib-pairs, the epsilon4 allele explained 72.4% of the variance of HDL cholesterol levels (-0.15 mmol/L, 95% confidence interval -0.24 to -0.05, p = 0.003). The effect of apoE4 on HDL cholesterol differed with an analysis based on probands or on affected sib-pairs. The affected sib-pair model used adjustment for shared environment, type of LDL receptor gene mutation, and a proportion of additional genetic factors and may, therefore, be more accurate in estimating effects of risk factors on complex traits. We conclude that the epsilon4 allele was associated with lower HDL cholesterol levels in an affected sib-pair analysis, which strongly suggests that apoE4 influences HDL cholesterol levels in FH children. Moreover, the strong association suggests that apoE4 carries an additional disadvantage for FH children. PMID:12646733

  10. Additions to Mars Global Reference Atmospheric Model (MARS-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, Bonnie

    1992-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification was also made which allows heights to go 'below' local terrain height and return 'realistic' pressure, density, and temperature, and not the surface values, as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local 'valley' areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch versions of Mars-GRAM are presented.

  11. Additions to Mars Global Reference Atmospheric Model (Mars-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.

    1991-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification has also been made which allows heights to go below local terrain height and return realistic pressure, density, and temperature (not the surface values) as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local valley areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch version of Mars-GRAM are presented.

  12. Possible effects of protracted exposure on the additivity of risks from space radiations

    NASA Technical Reports Server (NTRS)

    Curtis, S. B.

    1996-01-01

    Conventional radiation risk assessments are presently based on the additivity assumption. This assumption states that risks from individual components of a complex radiation field involving many different types of radiation can be added to yield the total risk of the complex radiation field. If the assumption is not correct, the summations and integrations performed to obtain the presently quoted risk estimates are not appropriate. This problem is particularly important in the area of space radiation risk evaluation because of the many different types of high- and low-LET radiation present in the galactic cosmic ray environment. For both low- and high-LET radiations at low enough dose rates, the present convention is that the additivity assumption holds. Mathematically, the total risk is assumed to be R_tot = Σ_i R_i, where the sum runs over the different types of radiation present. If the total dose (or fluence) from each component is such that the interaction between biological lesions caused by separate single track traversals is negligible within a given cell, it is presently considered to be reasonable to accept the additivity assumption. However, when the exposure is protracted over many cell doubling times (as will be the case for extended missions to the moon or Mars), the possibility exists that radiation effects that depend on multiple cellular events over a long time period, such as is probably the case in radiation-induced carcinogenesis, may not be additive in the above sense and the exposure interval may have to be included in the evaluation procedure. It is shown, however, that "inverse" dose-rate effects are not expected from intermediate LET radiations arising from the galactic cosmic ray environment due to the "sensitive-window-in-the-cell-cycle" hypothesis.

  13. Backbone Additivity in the Transfer Model of Protein Solvation

    SciTech Connect

    Hu, Char Y.; Kokubo, Hironori; Lynch, Gillian C.; Bolen, D Wayne; Pettitt, Bernard M.

    2010-05-01

    The transfer model implying additivity of the peptide backbone free energy of transfer is computationally tested. Molecular dynamics simulations are used to determine the extent of change in transfer free energy (ΔGtr) with increase in chain length of oligoglycine with capped end groups. Solvation free energies of oligoglycine models of varying lengths in pure water and in the osmolyte solutions, 2M urea and 2M trimethylamine N-oxide (TMAO), were calculated from simulations of all atom models, and ΔGtr values for peptide backbone transfer from water to the osmolyte solutions were determined. The results show that the transfer free energies change linearly with increasing chain length, demonstrating the principle of additivity, and provide values in reasonable agreement with experiment. The peptide backbone transfer free energy contributions arise from van der Waals interactions in the case of transfer to urea, but from electrostatics on transfer to TMAO solution. The simulations used here allow the solvation and transfer free energies of longer oligoglycine models to be evaluated than is currently possible through experiment. The computed transfer free energy of the peptide backbone unit, –54 cal/mol/M, compares quite favorably with the –43 cal/mol/M determined experimentally.
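
    A minimal sketch of the additivity test: if the backbone contribution is additive, ΔGtr grows linearly with the number of glycine units, and the fitted slope is the per-unit transfer free energy. The ΔGtr values below are made-up stand-ins for the simulation results.

```python
import numpy as np

n_units = np.array([1, 2, 3, 4, 5])                 # oligoglycine lengths
dgtr_cal_per_mol = np.array([-55.0, -108.0, -165.0, -212.0, -270.0])  # hypothetical

slope, intercept = np.polyfit(n_units, dgtr_cal_per_mol, 1)
print(f"per-backbone-unit transfer free energy: {slope:.1f} cal/mol/M")
# A good linear fit (slope near -54 cal/mol/M in the study) supports the
# additivity assumption of the transfer model.
```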

  14. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods that provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions using numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are multiphysics and multiscale in nature. In this project, the research work has been developed to model AAM processes in a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested. In addition

  15. Modeling Research Project Risks with Fuzzy Maps

    ERIC Educational Resources Information Center

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  16. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  17. Assessing the additive risks of PSII herbicide exposure to the Great Barrier Reef.

    PubMed

    Lewis, Stephen E; Schaffelke, Britta; Shaw, Melanie; Bainbridge, Zoë T; Rohde, Ken W; Kennedy, Karen; Davis, Aaron M; Masters, Bronwyn L; Devlin, Michelle J; Mueller, Jochen F; Brodie, Jon E

    2012-01-01

    Herbicide residues have been measured in the Great Barrier Reef lagoon at concentrations which have the potential to harm marine plant communities. Monitoring of the Great Barrier Reef lagoon following wet season discharge shows that 80% of the time when herbicides are detected, more than one is present. These herbicides have been shown to act in an additive manner with regard to photosystem-II inhibition. In this study, the area of the Great Barrier Reef considered to be at risk from herbicides is compared when exposures are considered for each herbicide individually and for herbicide mixtures. Two normalisation indices for herbicide mixtures were calculated based on current guidelines and PSII inhibition thresholds. The results show that the area at risk for most regions is greatly increased under the proposed additive PSII inhibition threshold and that the resilience of this important ecosystem could be reduced by exposure to these herbicides.
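
    A minimal sketch of an additive (concentration-addition) mixture index: each measured concentration is normalised by a reference potency and the normalised units are summed, so the mixture exceeds the threshold when the sum reaches 1. The herbicide names appear in the study, but all concentration and reference values below are illustrative only.

```python
ref_conc = {"diuron": 0.9, "atrazine": 13.0, "hexazinone": 8.8}  # hypothetical µg/L
sample = {"diuron": 0.3, "atrazine": 2.0, "hexazinone": 0.5}     # measured µg/L

# Sum of toxic units: C_i / reference_i, added across the mixture.
additive_index = sum(sample[h] / ref_conc[h] for h in sample)
print(f"additive PSII index: {additive_index:.2f}")
print("exceeds threshold" if additive_index >= 1.0 else "below threshold")
# Assessed individually, the three ratios (0.33, 0.15, 0.06) each sit well
# below 1.0; the additive index (0.54 here) shows how mixtures move sites
# closer to, or past, the threshold.
```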

  18. Sensitivity analysis of geometric errors in additive manufacturing medical models.

    PubMed

    Pinto, Jose Miguel; Arrieta, Cristobal; Andia, Marcelo E; Uribe, Sergio; Ramos-Grez, Jorge; Vargas, Alex; Irarrazaval, Pablo; Tejos, Cristian

    2015-03-01

    Additive manufacturing (AM) models are used in medical applications for surgical planning, prosthesis design and teaching. For these applications, the accuracy of the AM models is essential. Unfortunately, this accuracy is compromised due to errors introduced by each of the building steps: image acquisition, segmentation, triangulation, printing and infiltration. However, the contribution of each step to the final error remains unclear. We performed a sensitivity analysis comparing errors obtained from a reference with those obtained modifying parameters of each building step. Our analysis considered global indexes to evaluate the overall error, and local indexes to show how this error is distributed along the surface of the AM models. Our results show that the standard building process tends to overestimate the AM models, i.e. models are larger than the original structures. They also show that the triangulation resolution and the segmentation threshold are critical factors, and that the errors are concentrated at regions with high curvatures. Errors could be reduced choosing better triangulation and printing resolutions, but there is an important need for modifying some of the standard building processes, particularly the segmentation algorithms.

  19. Addition Table of Colours: Additive and Subtractive Mixtures Described Using a Single Reasoning Model

    ERIC Educational Resources Information Center

    Mota, A. R.; Lopes dos Santos, J. M. B.

    2014-01-01

    Students' misconceptions concerning colour phenomena and the apparent complexity of the underlying concepts--due to the different domains of knowledge involved--make the topic very difficult to teach. We have developed and tested a teaching device, the addition table of colours (ATC), that encompasses additive and subtractive mixtures in a single…
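
    In RGB terms, the two kinds of mixture the ATC unifies can be sketched as follows: additive mixing sums light channel-wise, while subtractive mixing multiplies the transmittances of filters or pigments. This is a generic illustration, not the ATC itself.

```python
import numpy as np

def additive_mix(*colors):
    """Additive mixture of lights: channel-wise sum of RGB, clipped to 1."""
    return np.clip(np.sum(colors, axis=0), 0.0, 1.0)

def subtractive_mix(*colors):
    """Subtractive mixture of filters/pigments: channel-wise product of
    transmittances, so each layer removes part of the light."""
    return np.prod(colors, axis=0)

red = np.array([1.0, 0.0, 0.0])
green = np.array([0.0, 1.0, 0.0])
yellow = np.array([1.0, 1.0, 0.0])
cyan = np.array([0.0, 1.0, 1.0])

print(additive_mix(red, green))        # [1. 1. 0.]: red + green light = yellow
print(subtractive_mix(yellow, cyan))   # [0. 1. 0.]: yellow x cyan filters = green
```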

  20. Risk assessment of combined photogenotoxic effects of sunlight and food additives.

    PubMed

    Salih, Fadhil M

    2006-06-01

    The presence of flavored colorants (peach and raspberry), flavors (caramel, citric acid and vanilla) and food preservatives (sodium nitrite, sodium nitrate, sodium benzoate, benzoic acid, potassium sorbate and sodium chloride) in Escherichia coli suspension during exposure to sunlight did not change the extent of cell survival. Likewise, no effect on viability or mutation induction (kanamycin resistance) was seen when cells were kept in contact with any of the additives for 80 min in the dark. However, when the relevant additive was present in the cell suspension during sunlight exposure, the number of induced mutations increased to varying extents over that seen with sunlight alone. Raspberry and peach increased the number of mutations in a dose-dependent manner, while vanilla produced mutations in an additive fashion. Nitrite, nitrate, benzoate, sorbate and benzoic acid increased mutation somewhat additively over that of sunlight. Sodium chloride and citric acid were not effective. This investigation highlights the significance of the combination of sunlight and chemical food additives as a potential risk, which requires special attention and necessitates further investigation.

  1. A Study of Additive Noise Model for Robust Speech Recognition

    NASA Astrophysics Data System (ADS)

    Awatade, Manisha H.

    2011-12-01

    A model of how speech amplitude spectra are affected by additive noise is studied. Acoustic features are extracted based on the noise-robust parts of speech spectra without losing discriminative information. Two existing non-linear processing methods, harmonic demodulation and spectral peak-to-valley ratio locking, are designed to minimize the mismatch between clean and noisy speech features. Previously studied methods, including peak isolation [1], do not require noise estimation and are effective in dealing with both stationary and non-stationary noise.
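
    The additive noise model itself is compact: when noise is added in the time domain, the noisy power spectrum is approximately the sum of the clean and noise power spectra, since the cross-term averages out. A small numerical check, with a synthetic tone standing in for speech:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 16000, 512
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)
noise = 0.3 * rng.standard_normal(n)

# Additive noise in the time domain adds (approximately) in the power
# spectrum: |Y(f)|^2 ~ |X(f)|^2 + |N(f)|^2, the cross-term averaging to zero.
Y = np.abs(np.fft.rfft(clean + noise)) ** 2
approx = np.abs(np.fft.rfft(clean)) ** 2 + np.abs(np.fft.rfft(noise)) ** 2
print(f"relative error of the additive approximation: "
      f"{np.linalg.norm(Y - approx) / np.linalg.norm(Y):.2%}")
```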

  2. Additive Manufacturing of Medical Models--Applications in Rhinology.

    PubMed

    Raos, Pero; Klapan, Ivica; Galeta, Tomislav

    2015-09-01

    In this paper we introduce guidelines and suggestions for the use of 3D image-processing software in the diagnosis of head pathology, and procedures for obtaining a physical medical model by additive manufacturing/rapid prototyping techniques, bearing in mind the improvement of surgical performance, its maximum safety and faster postoperative recovery of patients. This approach has been verified in two case reports. In the treatment we used intelligent classifier schemes for abnormal patterns, using a computer-based system for 3D-virtual and endoscopic assistance in rhinology, with appropriate visualization of anatomy and pathology within the nose, paranasal sinuses, and skull base area.

  3. Relative risk regression models with inverse polynomials.

    PubMed

    Ning, Yang; Woodward, Mark

    2013-08-30

    The proportional hazards model assumes that the log hazard ratio is a linear function of parameters. In the current paper, we model the log relative risk as an inverse polynomial, which is particularly suitable for modeling bounded and asymmetric functions. The parameters estimated by maximizing the partial likelihood are consistent and asymptotically normal. The advantages of the inverse polynomial model over the ordinary polynomial model and the fractional polynomial model for fitting various asymmetric log relative risk functions are shown by simulation. The utility of the method is further supported by analyzing two real data sets, addressing the specific question of the location of the minimum risk threshold.
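
    The inverse-polynomial form is easy to state concretely. A first-order example, with made-up coefficients rather than fitted values, is sketched below: the curve is bounded and can be asymmetric, unlike an ordinary polynomial in the exposure.

```python
import numpy as np

def log_rr_inverse_poly(x, b0, b1, b2):
    """Log relative risk modelled as an inverse polynomial in exposure x:
    log RR(x) = x / (b0 + b1*x + b2*x**2).  For positive coefficients the
    curve passes through 0 at x = 0, stays bounded, and returns towards 0
    as x grows, so it can capture asymmetric risk functions."""
    return x / (b0 + b1 * x + b2 * x ** 2)

x = np.linspace(0.1, 10.0, 5)
print(np.round(log_rr_inverse_poly(x, 2.0, 0.5, 0.1), 3))
```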

  4. Multiscale Modeling of Powder Bed-Based Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Markl, Matthias; Körner, Carolin

    2016-07-01

    Powder bed fusion processes are additive manufacturing technologies that are expected to induce the third industrial revolution. Components are built up layer by layer in a powder bed by selectively melting confined areas, according to sliced 3D model data. This technique allows for manufacturing of highly complex geometries hardly machinable with conventional technologies. However, the underlying physical phenomena are sparsely understood and difficult to observe during processing. Therefore, an intensive and expensive trial-and-error principle is applied to produce components with the desired dimensional accuracy, material characteristics, and mechanical properties. This review presents numerical modeling approaches on multiple length scales and timescales to describe different aspects of powder bed fusion processes. In combination with tailored experiments, the numerical results enlarge the process understanding of the underlying physical mechanisms and support the development of suitable process strategies and component topologies.

  6. Additive functions in Boolean models of gene regulatory network modules.

    PubMed

    Darabos, Christian; Di Cunto, Ferdinando; Tomassini, Marco; Moore, Jason H; Provero, Paolo; Giacobini, Mario

    2011-01-01

    Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual gene interaction of each gene is believed to play a key role in the stability of the structure. With advances in biology, some effort was deployed to develop update functions in Boolean models that include recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell-cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. Results of this validation hint at increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos into the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows regimes to be discriminated in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful to guide experimental research. The update function confers additional realism to the model, while reducing the complexity
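
    A threshold-based Boolean update of this kind is compact to write down. The sketch below uses a toy three-gene wiring matrix, not one of the paper's real sub-networks: each entry is +1 for a promoting interaction, -1 for a repressing one, and a gene switches on when its summed input exceeds the threshold.

```python
import numpy as np

# Toy threshold-based Boolean update: +1 marks a promoting interaction,
# -1 a repressing one, 0 no edge.  The wiring is made up for illustration.
W = np.array([[ 0,  1, -1],    # inputs to gene 0
              [ 1,  0,  1],    # inputs to gene 1
              [-1,  1,  0]])   # inputs to gene 2

def update(state, theta=0):
    """Synchronous update: gene i turns ON iff its summed input exceeds theta."""
    return (W @ state > theta).astype(int)

state = np.array([1, 0, 1])
for _ in range(4):
    state = update(state)
    print(state)
```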

  8. WATEQ3 geochemical model: thermodynamic data for several additional solids

    SciTech Connect

    Krupka, K.M.; Jenne, E.A.

    1982-09-01

    Geochemical models such as WATEQ3 can be used to model the concentrations of water-soluble pollutants that may result from the disposal of nuclear waste and retorted oil shale. However, for a model to competently deal with these water-soluble pollutants, an adequate thermodynamic data base must be provided that includes elements identified as important in modeling these pollutants. To this end, several minerals and related solid phases were identified that were absent from the thermodynamic data base of WATEQ3. In this study, the thermodynamic data for the identified solids were compiled and selected from several published tabulations of thermodynamic data. For these solids, an accepted Gibbs free energy of formation, ΔG°f,298, was selected for each solid phase based on the recentness of the tabulated data and on considerations of internal consistency with respect to both the published tabulations and the existing data in WATEQ3. For those solids not included in these published tabulations, Gibbs free energies of formation were calculated from published solubility data (e.g., lepidocrocite), or were estimated (e.g., nontronite) using a free-energy summation method described by Mattigod and Sposito (1978). The accepted or estimated free energies were then combined with internally consistent, ancillary thermodynamic data to calculate equilibrium constants for the hydrolysis reactions of these minerals and related solid phases. Including these values in the WATEQ3 data base increased the competency of this geochemical model in applications associated with the disposal of nuclear waste and retorted oil shale. Additional minerals and related solid phases that need to be added to the solubility submodel will be identified as modeling applications continue in these two programs.
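
    The step from a selected free energy of reaction to an equilibrium constant is a one-line calculation. The sketch below shows it for a hypothetical hydrolysis reaction; the -45 kJ/mol value is purely illustrative.

```python
import math

R, T = 8.314462, 298.15   # gas constant, J/(mol K); reference temperature, K

def log_k(dG_r_kJ_per_mol):
    """Equilibrium constant from the Gibbs free energy of reaction:
    log K = -dG_r / (ln(10) * R * T)."""
    return -dG_r_kJ_per_mol * 1000.0 / (math.log(10.0) * R * T)

# Hypothetical hydrolysis reaction free energy, for illustration only:
print(f"log K = {log_k(-45.0):.2f}")   # -45 kJ/mol -> log K ~ 7.88
```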

  9. Requirements based system risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements. We assume a complete list of the requirements, the relevant risk elements and their probability of occurrence and the quantified effect of the risk elements on the requirements. In order to assess the degree to which each requirement is satisfied, we need to determine the effect of the various risk elements on the requirement.
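
    A minimal sketch of that bookkeeping, with notional weights, occurrence probabilities, and impact fractions (none of them from the paper), might look like this:

```python
# Expected mission success as the weight-averaged satisfaction of the
# requirements, where each risk element degrades the requirements it
# affects in proportion to its probability of occurrence.
weights = {"pointing": 0.5, "downlink": 0.3, "thermal": 0.2}
# impact[r][q]: fractional loss of requirement q if risk element r occurs
impact = {"gyro_drift": {"pointing": 0.6},
          "antenna_fail": {"downlink": 0.9, "pointing": 0.1}}
p_occur = {"gyro_drift": 0.05, "antenna_fail": 0.02}

satisfaction = {q: 1.0 for q in weights}
for r, p in p_occur.items():
    for q, loss in impact[r].items():
        satisfaction[q] -= p * loss          # expected degradation

success = sum(weights[q] * satisfaction[q] for q in weights)
print(f"expected degree of mission success: {success:.3f}")
```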

  10. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  12. Modeling extreme risks in ecology.

    PubMed

    Burgman, Mark; Franklin, James; Hayes, Keith R; Hosack, Geoffrey R; Peters, Gareth W; Sisson, Scott A

    2012-11-01

    Extreme risks in ecology are typified by circumstances in which data are sporadic or unavailable, understanding is poor, and decisions are urgently needed. Expert judgments are pervasive and disagreements among experts are commonplace. We outline approaches to evaluating extreme risks in ecology that rely on stochastic simulation, with a particular focus on methods to evaluate the likelihood of extinction and quasi-extinction of threatened species, and the likelihood of establishment and spread of invasive pests. We evaluate the importance of assumptions in these assessments and the potential of some new approaches to account for these uncertainties, including hierarchical estimation procedures and generalized extreme value distributions. We conclude by examining the treatment of consequences in extreme risk analysis in ecology and how expert judgment may better be harnessed to evaluate extreme risks.
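
    As one concrete piece of that toolkit, a generalized extreme value distribution can be fitted to annual maxima and used to read off return levels. The sketch below uses simulated stand-in data and SciPy's genextreme; nothing here reproduces the paper's analyses.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
# Stand-in data: 60 annual maxima of some environmental stressor.
annual_maxima = genextreme.rvs(c=-0.2, loc=10.0, scale=2.0, size=60,
                               random_state=rng)

# Fit the generalized extreme value distribution, then read off the value
# exceeded with probability 1/100 per year (the 100-year return level).
c, loc, scale = genextreme.fit(annual_maxima)
level_100yr = genextreme.isf(1.0 / 100.0, c, loc, scale)
print(f"fitted shape = {c:.2f}, 100-year return level = {level_100yr:.1f}")
```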

  13. [Critique of the additive model of the randomized controlled trial].

    PubMed

    Boussageon, Rémy; Gueyffier, François; Bejan-Angoulvant, Theodora; Felden-Dominiak, Géraldine

    2008-01-01

    Randomized, double-blind, placebo-controlled clinical trials are currently the best way to demonstrate the clinical effectiveness of drugs. Their methodology relies on the method of difference (John Stuart Mill), through which the observed difference between two groups (drug vs placebo) can be attributed to the pharmacological effect of the drug being tested. However, this additive model can be questioned in the event of statistical interactions between the pharmacological and the placebo effects. Evidence in different domains has shown that the placebo effect can influence the effect of the active principle. This article evaluates the methodological, clinical and epistemological consequences of this phenomenon. Topics treated include extrapolating results, accounting for heterogeneous results, demonstrating the existence of several factors in the placebo effect, the necessity to take these factors into account for given symptoms or pathologies, as well as the problem of the "specific" effect.

  14. Breast cancer risk assessment across the risk continuum: genetic and nongenetic risk factors contributing to differential model performance

    PubMed Central

    2012-01-01

    (AUC = 63.2%, CI = 57.6% to 68.9%). In almost all covariate-specific subgroups, BCRAT mean risks were significantly lower than the observed risks, while IBIS risks showed generally good agreement with observed risks, even in the subgroups of women considered at average risk (for example, no family history of breast cancer, BRCA1/2 mutation negative). Conclusions Models developed using extended family history and genetic data, such as the IBIS model, also perform well in women considered at average risk (for example, no family history of breast cancer, BRCA1/2 mutation negative). Extending such models to include additional nongenetic information may improve performance in women across the breast cancer risk continuum. PMID:23127309

  15. Risk assessment of additives through soft drinks and nectars consumption on Portuguese population: a 2010 survey.

    PubMed

    Diogo, Janina S G; Silva, Liliana S O; Pena, Angelina; Lino, Celeste M

    2013-12-01

    This study investigated whether the Portuguese population is at risk of exceeding ADI levels for acesulfame-K, saccharin, aspartame, caffeine, benzoic and sorbic acid through an assessment of dietary intake of additives and specific consumption of four types of beverages: traditional soft drinks, soft drinks based on mineral waters, energy drinks, and nectars. The highest mean levels of additives were found for caffeine in energy drinks, 293.5 mg/L; for saccharin in traditional soft drinks, 18.4 mg/L; for acesulfame-K and aspartame in nectars, 88.2 and 97.8 mg/L, respectively; for benzoic acid in traditional soft drinks, 125.7 mg/L; and for sorbic acid in soft drinks based on mineral water, 166.5 mg/L. Traditional soft drinks presented the highest acceptable daily intake percentages (ADI%) for acesulfame-K, aspartame, benzoic and sorbic acid, and a similar value for saccharin (0.5%), when compared with soft drinks based on mineral water: 0.7%, 0.08%, 7.3%, and 1.92% versus 0.2%, 0.053%, 0.6%, and 0.28%, respectively. However, for saccharin the highest percentage of ADI was obtained for nectars, 0.9%, in comparison with both types of soft drinks, 0.5%. It is therefore concluded that the Portuguese population is not at risk of exceeding the established ADIs for the studied additives.
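
    The ADI% figure is simple arithmetic: daily intake through the beverage divided by the acceptable daily intake scaled to body weight. The consumption volume and body weight below are illustrative assumptions, not the survey's figures; the concentration is the nectar mean quoted above and the ADI is the EU value for acesulfame-K.

```python
# Worked example of an ADI%: share of the acceptable daily intake consumed
# through one beverage.  Consumption and body weight are assumptions.
adi_mg_per_kg = 9.0          # acesulfame-K ADI, mg/kg bw/day (EU value)
conc_mg_per_L = 88.2         # mean level reported for nectars
body_weight_kg = 60.0
daily_consumption_L = 0.25

intake = conc_mg_per_L * daily_consumption_L          # mg/day
adi_percent = 100.0 * intake / (adi_mg_per_kg * body_weight_kg)
print(f"ADI% = {adi_percent:.2f}%")                   # ~4.08% of the ADI
```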

  16. Reliability Models and Attributable Risk

    NASA Technical Reports Server (NTRS)

    Jarvinen, Richard D.

    1999-01-01

    The intention of this report is to bring a developing and extremely useful statistical methodology to greater attention within the Safety, Reliability, and Quality Assurance Office of the NASA Johnson Space Center. The statistical methods in this exposition are found under the heading of attributable risk. Recently the Safety, Reliability, and Quality Assurance Office at the Johnson Space Center has supported efforts to introduce methods of medical research statistics dealing with the survivability of people to bear on the problems of aerospace that deal with the reliability of component hardware used in the NASA space program. This report, which describes several study designs for which attributable risk is used, is in concert with the latter goals. The report identifies areas of active research in attributable risk while briefly describing much of what has been developed in the theory of attributable risk. The report, which largely is a report on a report, attempts to recast the medical setting and language commonly found in descriptions of attributable risk into the setting and language of the space program and its component hardware.
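
    As a worked example of the central quantity, Levin's population attributable risk can be computed from an exposure prevalence and a relative risk. Recast in the report's hardware terms, the "exposure" below is a hypothetical component defect.

```python
def population_attributable_risk(p_exposed, rr):
    """Levin's formula: the fraction of cases (here, component failures)
    attributable to the exposure in the whole population:
    PAR = p*(RR - 1) / (1 + p*(RR - 1))."""
    excess = p_exposed * (rr - 1.0)
    return excess / (1.0 + excess)

# Hypothetical: 20% of components carry a defect that triples failure risk.
print(f"{population_attributable_risk(0.20, 3.0):.1%} of failures attributable")
```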

  17. Relative Importance and Additive Effects of Maternal and Infant Risk Factors on Childhood Asthma

    PubMed Central

    Rosas-Salazar, Christian; James, Kristina; Escobar, Gabriel; Gebretsadik, Tebeb; Li, Sherian Xu; Carroll, Kecia N.; Walsh, Eileen; Mitchel, Edward; Das, Suman; Kumar, Rajesh; Yu, Chang; Dupont, William D.; Hartert, Tina V.

    2016-01-01

    Background Environmental exposures that occur in utero and during early life may contribute to the development of childhood asthma through alteration of the human microbiome. The objectives of this study were to estimate the cumulative effect and relative importance of environmental exposures on the risk of childhood asthma. Methods We conducted a population-based birth cohort study of mother-child dyads who were born between 1995 and 2003 and were continuously enrolled in the PRIMA (Prevention of RSV: Impact on Morbidity and Asthma) cohort. The individual and cumulative impact of maternal urinary tract infections (UTI) during pregnancy, maternal colonization with group B streptococcus (GBS), mode of delivery, infant antibiotic use, and older siblings at home, on the risk of childhood asthma were estimated using logistic regression. Dose-response effect on childhood asthma risk was assessed for continuous risk factors: number of maternal UTIs during pregnancy, courses of infant antibiotics, and number of older siblings at home. We further assessed and compared the relative importance of these exposures on the asthma risk. In a subgroup of children for whom maternal antibiotic use during pregnancy information was available, the effect of maternal antibiotic use on the risk of childhood asthma was estimated. Results Among 136,098 singleton birth infants, 13.29% developed asthma. In both univariate and adjusted analyses, maternal UTI during pregnancy (odds ratio [OR] 1.2, 95% confidence interval [CI] 1.18, 1.25; adjusted OR [AOR] 1.04, 95%CI 1.02, 1.07 for every additional UTI) and infant antibiotic use (OR 1.21, 95%CI 1.20, 1.22; AOR 1.16, 95%CI 1.15, 1.17 for every additional course) were associated with an increased risk of childhood asthma, while having older siblings at home (OR 0.92, 95%CI 0.91, 0.93; AOR 0.85, 95%CI 0.84, 0.87 for each additional sibling) was associated with a decreased risk of childhood asthma, in a dose-dependent manner. Compared with vaginal

  18. A methodology for modeling regional terrorism risk.

    PubMed

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.

  19. THE COMBINED CARCINOGENIC RISK FOR EXPOSURE TO MIXTURES OF DRINKING WATER DISINFECTION BY-PRODUCTS MAY BE LESS THAN ADDITIVE

    EPA Science Inventory

    The Combined Carcinogenic Risk for Exposure to Mixtures of Drinking Water Disinfection By-Products May be Less Than Additive

    Risk assessment methods for chemical mixtures in drinking water are not well defined. Current default risk assessments for chemical mixtures assume...

  20. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems. PMID:22150163
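
    The defender's side of the simplest sequential defend-attack model reduces to minimising expected loss under a predictive distribution of the attacker's response, given each defense. The sketch below uses notional sites, probabilities, and losses.

```python
# Sketch of a sequential defend-attack calculation: the defender commits to
# a defense, then evaluates expected loss under a predictive probability
# model of the attacker's response.  All names and numbers are notional.
defenses = ["harden_site_A", "harden_site_B"]
attacks = ["attack_A", "attack_B"]

p_attack = {"harden_site_A": {"attack_A": 0.2, "attack_B": 0.8},
            "harden_site_B": {"attack_A": 0.7, "attack_B": 0.3}}
loss = {("harden_site_A", "attack_A"): 2.0, ("harden_site_A", "attack_B"): 6.0,
        ("harden_site_B", "attack_A"): 5.0, ("harden_site_B", "attack_B"): 3.0}

def expected_loss(d):
    """Expected loss of defense d against the predicted attack distribution."""
    return sum(p_attack[d][a] * loss[(d, a)] for a in attacks)

best = min(defenses, key=expected_loss)
print(best, f"-> expected loss {expected_loss(best):.2f}")
```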

  1. PRISM: a planned risk information seeking model.

    PubMed

    Kahlor, LeeAnn

    2010-06-01

    Recent attention on health-related information seeking has focused primarily on information seeking within specific health and health risk contexts. This study attempts to shift some of that focus to individual-level variables that may impact health risk information seeking across contexts. To locate these variables, the researcher posits an integrated model, the Planned Risk Information Seeking Model (PRISM). The model, which treats risk information seeking as a deliberate (planned) behavior, maps variables found in the Theory of Planned Behavior (TPB; Ajzen, 1991) and the Risk Information Seeking and Processing Model (RISP; Griffin, Dunwoody, & Neuwirth, 1999), and posits linkages among those variables. This effort is further informed by Kahlor's (2007) Augmented RISP, the Theory of Motivated Information Management (Afifi & Weiner, 2004), the Comprehensive Model of Information Seeking (Johnson & Meischke, 1993), the Health Information Acquisition Model (Freimuth, Stein, & Kean, 1989), and the Extended Parallel Processing Model (Witte, 1998). The resulting integrated model accounted for 59% of the variance in health risk information-seeking intent and performed better than the TPB or the RISP alone. PMID:20512716

  2. Quantitative risk modeling in aseptic manufacture.

    PubMed

    Tidswell, Edward C; McGarvey, Bernard

    2006-01-01

    Expedient risk assessment of aseptic manufacturing processes offers unique opportunities for improved and sustained assurance of product quality. Contemporary risk assessments applied to aseptic manufacturing processes, however, are commonly handicapped by assumptions and subjectivity, leading to inexactitude. Quantitative risk modeling augmented with Monte Carlo simulations represents a novel, innovative, and more efficient means of risk assessment. This technique relies upon fewer assumptions and removes subjectivity to more swiftly generate an improved, more realistic, quantitative estimate of risk. The fundamental steps and requirements for an assessment of the risk of bioburden ingress into aseptically manufactured products are described. A case study exemplifies how quantitative risk modeling and Monte Carlo simulations achieve a more rapid and improved determination of the risk of bioburden ingress during the aseptic filling of a parenteral product. Although application of quantitative risk modeling is described here purely for the purpose of process improvement, the technique has far wider relevance in the assisted disposition of batches, cleanroom management, and the utilization of real-time data from rapid microbial monitoring technologies. PMID:17089696
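
    In outline, such a quantitative model chains a distribution for each contributing factor and propagates them by Monte Carlo simulation instead of fixed worst-case assumptions. The distributions and rates below are illustrative, not validated process data.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Monte Carlo sketch of bioburden-ingress risk during aseptic filling.
fill_time_h = rng.triangular(2.0, 4.0, 8.0, n_sim)             # hours/batch
ingress_rate = rng.lognormal(mean=-9.0, sigma=1.0, size=n_sim)  # events/hour
n_ingress = rng.poisson(ingress_rate * fill_time_h)             # events/batch

p_contaminated = np.mean(n_ingress > 0)
print(f"P(at least one ingress event per batch) ~ {p_contaminated:.2e}")
```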

  3. Addition of dipeptidyl peptidase-4 inhibitors to sulphonylureas and risk of hypoglycaemia: systematic review and meta-analysis

    PubMed Central

    Moore, Nicholas; Arnaud, Mickael; Robinson, Philip; Raschi, Emanuel; De Ponti, Fabrizio; Bégaud, Bernard; Pariente, Antoine

    2016-01-01

    Objective To quantify the risk of hypoglycaemia associated with the concomitant use of dipeptidyl peptidase-4 (DPP-4) inhibitors and sulphonylureas compared with placebo and sulphonylureas. Design Systematic review and meta-analysis. Data sources Medline, ISI Web of Science, SCOPUS, Cochrane Central Register of Controlled Trials, and clinicaltrial.gov were searched without any language restriction. Study selection Placebo controlled randomised trials comprising at least 50 participants with type 2 diabetes treated with DPP-4 inhibitors and sulphonylureas. Review methods Risk of bias in each trial was assessed using the Cochrane Collaboration tool. The risk ratio of hypoglycaemia with 95% confidence intervals was computed for each study and then pooled using fixed effect models (Mantel-Haenszel method) or random effect models, when appropriate. Subgroup analyses were also performed (eg, dose of DPP-4 inhibitors). The number needed to harm (NNH) was estimated according to treatment duration. Results 10 studies were included, representing a total of 6546 participants (4020 received DPP-4 inhibitors plus sulphonylureas, 2526 placebo plus sulphonylureas). The risk ratio of hypoglycaemia was 1.52 (95% confidence interval 1.29 to 1.80). The NNH was 17 (95% confidence interval 11 to 30) for a treatment duration of six months or less, 15 (9 to 26) for 6.1 to 12 months, and 8 (5 to 15) for more than one year. In subgroup analysis, no difference was found between full and low doses of DPP-4 inhibitors: the risk ratio related to full dose DPP-4 inhibitors was 1.66 (1.34 to 2.06), whereas the increased risk ratio related to low dose DPP-4 inhibitors did not reach statistical significance (1.33, 0.92 to 1.94). Conclusions Addition of DPP-4 inhibitors to sulphonylureas to treat people with type 2 diabetes is associated with a 50% increased risk of hypoglycaemia and with one excess case of hypoglycaemia for every 17 patients in the first six months of treatment. This
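
    The link between a pooled risk ratio and an NNH is worth making explicit: NNH = 1 / (baseline risk x (RR - 1)). With the review's pooled RR of 1.52, an assumed baseline hypoglycaemia risk of about 11% reproduces the reported NNH of 17; the baseline rates below are assumptions for illustration.

```python
def number_needed_to_harm(rr, baseline_risk):
    """NNH from a pooled risk ratio and the baseline (placebo-arm) risk:
    NNH = 1 / (baseline_risk * (RR - 1))."""
    return 1.0 / (baseline_risk * (rr - 1.0))

# Pooled RR from the review is 1.52; the baseline rates are assumptions.
for baseline in (0.05, 0.11, 0.25):
    print(f"baseline {baseline:.0%}: NNH ~ "
          f"{number_needed_to_harm(1.52, baseline):.0f}")
```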

  4. Korean Risk Assessment Model for Breast Cancer Risk Prediction

    PubMed Central

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K.

    2013-01-01

    Purpose We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. Methods Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risk produced using the modified Gail model which applied Korean incidence and mortality data and the parameter estimators from the original Gail model with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and National Cancer Center (NCC) cohort. Results The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risk for the cases than for the controls (p<0.001 and <0.001, respectively). The observed incidence of breast cancer in the two cohorts was similar to the expected incidence from the KoBCRAT (KMCC, p = 0.880; NCC, p = 0.878). The AUC using the KoBCRAT was 0.61 for the KMCC and 0.89 for the NCC cohort. Conclusions Our findings suggest that the KoBCRAT is a better tool for predicting the risk of breast cancer in Korean women, especially urban women. PMID:24204664

  5. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  6. Risk assessment of nitrate and oxytetracycline addition on coastal ecosystem functions.

    PubMed

    Feng-Jiao, Liu; Shun-Xing, Li; Feng-Ying, Zheng; Xu-Guang, Huang; Yue-Gang, Zuo; Teng-Xiu, Tu; Xue-Qing, Wu

    2014-01-01

    Diatoms dominate phytoplankton communities in the well-mixed coastal and upwelling regions. Coastal diatoms are often exposed to both aquaculture pollution and eutrophication, but how these exposures influence coastal ecosystem functions is unknown. To examine these influences, a coastal centric diatom, Conticribra weissflogii, was maintained at different concentrations of nitrate (N) and/or oxytetracycline (OTC). Algal density, cell growth cycle, protein, chlorophyll a, superoxide dismutase (SOD) activity, and malonaldehyde (MDA) were determined for the assessment of algal biomass, lifetime, nutritional value, photosynthesis and respiration, antioxidant capacity, and lipid peroxidation, respectively. When N addition was combined with OTC pollution, the cell growth cycles were shortened by 56-73%; algal density, SOD activities, and the concentrations of chlorophyll a, protein, and MDA varied between 73 and 121%, 19 and 397%, 52 and 693%, 19 and 875%, and 66 and 2733% of the values observed in the N addition experiments, respectively. According to P-value analysis, the influence of OTC on algal density and SOD activity was not significant, but the effect on cell growth cycle, protein, chlorophyll a, and MDA was significant (P<0.05). The influence of N addition with simultaneous OTC pollution on the above six end points was significant. Algal biomass, lifetime, nutrition, antioxidant capacity, lipid peroxidation, photosynthesis, and respiration were all affected by the addition of OTC and N. Coastal ecosystem functions were severely affected by N and OTC additions, and the influence increased in the order N < N plus OTC. These findings provide a reference for risk assessment of aquaculture pollution on coastal ecosystem functions.

  7. Global flood risk modelling and its applications for disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Jongman, Brenden; Winsemius, Hessel; Bierkens, Marc; Bouwman, Arno; van Beek, Rens; Ligtvoet, Willem; Ward, Philip

    2014-05-01

    Flooding of river systems is the most costly natural hazard affecting societies around the world, with an average of US$55 billion in direct losses and 4,500 fatalities each year between 1990 and 2012. The accurate and consistent assessment of flood risk on a global scale is essential for international development organizations and the reinsurance industry, and for enhancing our understanding of climate change impacts. This need is especially felt in developing countries, where local data and models are largely unavailable, and where flood risk is increasing rapidly under strong population growth and economic development. Here we present ongoing applications of high-resolution flood risk modelling at a global scale. The work is based on GLOFRIS, a modelling chain that produces flood risk maps at a 1 km spatial resolution for the entire globe, under a range of climate and socioeconomic scenarios and various past and future time periods. This modelling chain combines a hydrological inundation model with socioeconomic datasets to assess past, current and future population exposure; economic damages; and agricultural risk. These tools are currently applied scientifically to gain insights into geographical patterns in current risk, and to assess the effects of possible future scenarios under climate change and climate variability. In this presentation we show recent applications from the global scale to national scales. The global scale applications include global risk profiling for the reinsurance industry and novel estimation of global flood mortality risk. In addition, we demonstrate how the global flood modelling approach was successfully applied to assess disaster risk reduction priorities on a national scale in Africa. Finally, we indicate how these global modelling tools can be used to quantify the costs and benefits of adaptation, and to explore pathways for development under a changing environment.
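
    The risk step of such a modelling chain can be condensed into a few lines: combine inundation depths for several return periods with exposure through a depth-damage curve, then integrate damage over exceedance probability to obtain the expected annual damage. Everything below is a toy illustration, not GLOFRIS itself.

```python
import numpy as np

return_periods = np.array([10.0, 100.0, 1000.0])       # years
p_exceed = 1.0 / return_periods                        # exceedance prob./year
depth = np.array([[[0.0, 0.2], [0.5, 1.0]],            # 2x2 depth grid per
                  [[0.1, 0.6], [1.2, 2.0]],            # return period, metres
                  [[0.4, 1.1], [2.0, 3.5]]])
exposure = np.array([[1.0, 2.0], [0.5, 4.0]])          # asset value per cell

def damage_fraction(d, d_max=6.0):
    """Simple depth-damage curve: linear up to total loss at d_max metres."""
    return np.clip(d / d_max, 0.0, 1.0)

damage = (damage_fraction(depth) * exposure).sum(axis=(1, 2))  # loss per RP
# Trapezoidal integration of loss over exceedance probability gives the EAD.
ead = float(np.sum(0.5 * (damage[:-1] + damage[1:])
                   * (p_exceed[:-1] - p_exceed[1:])))
print(f"expected annual damage ~ {ead:.3f} (units of exposure)")
```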

  8. Percolation model with an additional source of disorder

    NASA Astrophysics Data System (ADS)

    Kundu, Sumanta; Manna, S. S.

    2016-06-01

    The ranges of transmission of the mobiles in a mobile ad hoc network are not uniform in reality. They are affected by the temperature fluctuation in air, obstruction due to solid objects, even the humidity difference in the environment, etc. How the varying range of transmission of the individual active elements affects the global connectivity in the network may be an important practical question to ask. Here a model of percolation phenomena, with an additional source of disorder, is introduced for a theoretical understanding of this problem. As in ordinary percolation, sites of a square lattice are occupied randomly with probability p. Each occupied site is then assigned a circular disk of random value R for its radius. A bond is defined to be occupied if and only if the radii R1 and R2 of the disks centered at the ends satisfy a certain predefined condition. In a very general formulation, one divides the R1-R2 plane into two regions by an arbitrary closed curve. One defines a point within one region as representing an occupied bond; otherwise it is a vacant bond. The study of three different rules under this general formulation indicates that the percolation threshold always varies continuously. This threshold has two limiting values, one is pc(sq), the percolation threshold for the ordinary site percolation on the square lattice, and the other is unity. The approach of the percolation threshold to its limiting values is characterized by two exponents. In a special case, all lattice sites are occupied by disks of random radii R ∈ {0, R0} and a percolation transition is observed with R0 as the control variable, similar to the site occupation probability.
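
    The special case mentioned at the end is straightforward to simulate. The sketch below occupies every site, draws radii uniformly from [0, R0], and checks for a left-to-right spanning cluster with union-find. The bond rule used here (the two end disks overlap, R1 + R2 >= 1 on a unit-spacing lattice) is one plausible condition of the kind the general formulation allows, not necessarily one of the paper's three rules.

```python
import numpy as np

rng = np.random.default_rng(0)
L, R0 = 64, 1.2    # lattice size; R0 is the control variable of the model
radius = rng.uniform(0.0, R0, size=(L, L))

parent = np.arange(L * L)
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path halving
        i = parent[i]
    return i
def union(i, j):
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj

# Occupy a bond whenever the two end radii satisfy the overlap condition.
for x in range(L):
    for y in range(L):
        if x + 1 < L and radius[x, y] + radius[x + 1, y] >= 1.0:
            union(x * L + y, (x + 1) * L + y)
        if y + 1 < L and radius[x, y] + radius[x, y + 1] >= 1.0:
            union(x * L + y, x * L + y + 1)

# Spanning check: does some cluster touch both the left and right edges?
left = {find(x * L + 0) for x in range(L)}
right = {find(x * L + L - 1) for x in range(L)}
print("spanning cluster:", bool(left & right))
```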

  11. Breast cancer risk prediction using a clinical risk model and polygenic risk score.

    PubMed

    Shieh, Yiwey; Hu, Donglei; Ma, Lin; Huntsman, Scott; Gard, Charlotte C; Leung, Jessica W T; Tice, Jeffrey A; Vachon, Celine M; Cummings, Steven R; Kerlikowske, Karla; Ziv, Elad

    2016-10-01

    Breast cancer risk assessment can inform the use of screening and prevention modalities. We investigated the performance of the Breast Cancer Surveillance Consortium (BCSC) risk model in combination with a polygenic risk score (PRS) comprised of 83 single nucleotide polymorphisms identified from genome-wide association studies. We conducted a nested case-control study of 486 cases and 495 matched controls within a screening cohort. The PRS was calculated using a Bayesian approach. The contributions of the PRS and variables in the BCSC model to breast cancer risk were tested using conditional logistic regression. Discriminatory accuracy of the models was compared using the area under the receiver operating characteristic curve (AUROC). Increasing quartiles of the PRS were positively associated with breast cancer risk, with OR 2.54 (95% CI 1.69-3.82) for breast cancer in the highest versus lowest quartile. In a multivariable model, the PRS, family history, and breast density remained strong risk factors. The AUROC of the PRS was 0.60 (95% CI 0.57-0.64), and an Asian-specific PRS had AUROC 0.64 (95% CI 0.53-0.74). A combined model including the BCSC risk factors and PRS had better discrimination than the BCSC model (AUROC 0.65 versus 0.62, p = 0.01). The BCSC-PRS model classified 18% of cases as high-risk (5-year risk ≥3%), compared with 7% using the BCSC model. The PRS improved discrimination of the BCSC risk model and classified more cases as high-risk. Further consideration of the PRS's role in decision-making around screening and prevention strategies is merited. PMID:27565998
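
    The PRS itself is just a weighted allele count. The sketch below builds a standardised score from simulated effect sizes and genotype dosages (83 SNPs to match the abstract, but all numbers are stand-ins); such a score could then enter a logistic model alongside the clinical risk factors.

```python
import numpy as np

rng = np.random.default_rng(7)
n_people, n_snps = 5, 83

# Simulated per-SNP effect sizes (log odds ratios) and 0/1/2 allele dosages.
log_or = rng.normal(0.0, 0.05, n_snps)
dosage = rng.integers(0, 3, size=(n_people, n_snps))

prs_raw = dosage @ log_or                              # weighted allele count
prs = (prs_raw - prs_raw.mean()) / prs_raw.std()       # standardised PRS
print(np.round(prs, 2))
```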

  12. Major histocompatibility complex harbors widespread genotypic variability of non-additive risk of rheumatoid arthritis including epistasis

    PubMed Central

    Wei, Wen-Hua; Bowes, John; Plant, Darren; Viatte, Sebastien; Yarwood, Annie; Massey, Jonathan; Worthington, Jane; Eyre, Stephen

    2016-01-01

    Genotypic variability based genome-wide association studies (vGWASs) can identify potentially interacting loci without prior knowledge of the interacting factors. We report a two-stage approach to make vGWAS applicable to diseases: firstly using a mixed model approach to partition dichotomous phenotypes into additive risk and non-additive environmental residuals on the liability scale and secondly using the Levene’s (Brown-Forsythe) test to assess equality of the residual variances across genotype groups per marker. We found widespread significant (P < 2.5e-05) vGWAS signals within the major histocompatibility complex (MHC) across all three study cohorts of rheumatoid arthritis. We further identified 10 epistatic interactions between the vGWAS signals independent of the MHC additive effects, each with a weak effect but jointly explained 1.9% of phenotypic variance. PTPN22 was also identified in the discovery cohort but replicated in only one independent cohort. Combining the three cohorts boosted power of vGWAS and additionally identified TYK2 and ANKRD55. Both PTPN22 and TYK2 had evidence of interactions reported elsewhere. We conclude that vGWAS can help discover interacting loci for complex diseases but require large samples to find additional signals. PMID:27109064
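
    The second stage of the approach is a standard equality-of-variance test per marker. SciPy's levene with center='median' gives the Brown-Forsythe form; the genotype-group residuals below are simulated with deliberately unequal variances, which is the vGWAS signal being tested.

```python
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(3)
# Simulated environmental residuals per genotype group at one marker:
g0 = rng.normal(0.0, 1.0, 400)     # genotype AA
g1 = rng.normal(0.0, 1.2, 400)     # genotype Aa
g2 = rng.normal(0.0, 1.5, 200)     # genotype aa

stat, p_value = levene(g0, g1, g2, center='median')   # Brown-Forsythe form
print(f"Brown-Forsythe W = {stat:.2f}, p = {p_value:.3g}")
```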

  13. Risk Management in environmental geotechnical modelling

    NASA Astrophysics Data System (ADS)

    Tammemäe, Olavi; Torn, Hardi

    2008-01-01

    The objective of this article is to provide an overview of the basis of risk analysis, assessment and management, of the accompanying problems, and of the principles of risk management applied when drafting an environmental geotechnical model that enables an entire territory or developed region to be analysed as a whole, so that the environmental impact remains within the limits of the criteria specified in the standards and is acceptable for human health and the environment. An essential part of the solution of the problem is the engineering-geological model based on risk analysis and on the assessment and forecast of the mutual effects of the processes.

  14. Hyperbolic value addition and general models of animal choice.

    PubMed

    Mazur, J E

    2001-01-01

    Three mathematical models of choice--the contextual-choice model (R. Grace, 1994), delay-reduction theory (N. Squires & E. Fantino, 1971), and a new model called the hyperbolic value-added model--were compared in their ability to predict the results from a wide variety of experiments with animal subjects. When supplied with 2 or 3 free parameters, all 3 models made fairly accurate predictions for a large set of experiments that used concurrent-chain procedures. One advantage of the hyperbolic value-added model is that it is derived from a simpler model that makes accurate predictions for many experiments using discrete-trial adjusting-delay procedures. Some results favor the hyperbolic value-added model and delay-reduction theory over the contextual-choice model, but more data are needed from choice situations for which the models make distinctly different predictions.
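
    The hyperbolic value-added model is derived from Mazur's simpler hyperbolic discounting equation, V = A / (1 + kD). A two-line illustration with an arbitrary k shows how a smaller, sooner reward can outvalue a larger, later one:

```python
def hyperbolic_value(amount, delay, k=0.2):
    """Mazur's hyperbolic discounting equation: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

# A smaller, sooner reward can exceed a larger, later one in value:
print(round(hyperbolic_value(2.0, 2.0), 2))    # 1.43 (2 units after 2 s)
print(round(hyperbolic_value(5.0, 12.0), 2))   # 1.47 (5 units after 12 s)
```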

  15. Physical vulnerability modelling in natural hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Douglas, J.

    2007-04-01

    An evaluation of the risk to an exposed element from a hazardous event requires a consideration of the element's vulnerability, which expresses its propensity to suffer damage. This concept allows the assessed level of hazard to be translated into an estimated level of risk and is often used to evaluate the risk from earthquakes and cyclones. However, for other natural perils, such as mass movements, coastal erosion and volcanoes, the incorporation of vulnerability within risk assessment is not well established, and consequently quantitative risk estimations are not often made. This impedes the study of the relative contributions of different hazards to the overall risk at a site. Physical vulnerability is poorly modelled for many reasons: human casualties may be caused by the event itself rather than by building damage; observational data on the hazard, the elements at risk and the induced damage are lacking; the structural damage mechanisms are complex; the temporal and geographical scales vary; and the hazard level itself can sometimes be modified. Many of these causes are related to the nature of the peril; therefore, for some hazards, such as coastal erosion, the benefits of considering an element's physical vulnerability may be limited. However, for hazards such as volcanoes and mass movements, the modelling of vulnerability should be improved by, for example, following the efforts made in earthquake risk assessment: additional observational data on induced building damage and the hazardous event should be routinely collected and correlated, and numerical modelling of building behaviour during a damaging event should be attempted.

  16. Additive Synergism between Asbestos and Smoking in Lung Cancer Risk: A Systematic Review and Meta-Analysis

    PubMed Central

    Ngamwong, Yuwadee; Tangamornsuksan, Wimonchat; Lohitnavy, Ornrat; Chaiyakunapruk, Nathorn; Scholfield, C. Norman; Reisfeld, Brad; Lohitnavy, Manupat

    2015-01-01

    Smoking and asbestos exposure are important risks for lung cancer. Several epidemiological studies have linked asbestos exposure and smoking to lung cancer. To reconcile and unify these results, we conducted a systematic review and meta-analysis to provide a quantitative estimate of the increased risk of lung cancer associated with asbestos exposure and cigarette smoking and to classify their interaction. Five electronic databases were searched from inception to May 2015 for observational studies on lung cancer. All case-control (N = 10) and cohort (N = 7) studies were included in the analysis. We calculated pooled odds ratios (ORs), relative risks (RRs) and 95% confidence intervals (CIs) using a random-effects model for the association of asbestos exposure and smoking with lung cancer. Lung cancer patients who were not exposed to asbestos and non-smoking (A-S-) were compared with: (i) asbestos-exposed and non-smoking (A+S-), (ii) non-exposure to asbestos and smoking (A-S+), and (iii) asbestos-exposed and smoking (A+S+). Our meta-analysis showed a significant difference in risk of developing lung cancer among asbestos-exposed and/or smoking workers compared to controls (A-S-); odds ratios for the disease (95% CI) were (i) 1.70 (A+S-, 1.31–2.21), (ii) 5.65 (A-S+, 3.38–9.42), (iii) 8.70 (A+S+, 5.8–13.10). The additive interaction index of synergy was 1.44 (95% CI = 1.26–1.77) and the multiplicative index = 0.91 (95% CI = 0.63–1.30). Corresponding values for cohort studies were 1.11 (95% CI = 1.00–1.28) and 0.51 (95% CI = 0.31–0.85). Our results point to an additive synergism for lung cancer with co-exposure to asbestos and cigarette smoking. Assessments of industrial health risks should take smoking and other airborne health risks into account when setting occupational asbestos exposure limits.
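
    The reported synergy index follows directly from the three pooled odds ratios, as can be checked in two lines:

```python
def synergy_index(or_a, or_b, or_ab):
    """Additive-interaction index S = (OR_ab - 1) / ((OR_a - 1) + (OR_b - 1));
    S > 1 indicates a more-than-additive (synergistic) joint effect."""
    return (or_ab - 1.0) / ((or_a - 1.0) + (or_b - 1.0))

# Pooled odds ratios from the abstract: asbestos only, smoking only, both.
print(round(synergy_index(1.70, 5.65, 8.70), 2))   # 1.44, as reported
```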

  18. Using Generalized Additive Models to Analyze Single-Case Designs

    ERIC Educational Resources Information Center

    Shadish, William; Sullivan, Kristynn

    2013-01-01

    Many analyses for single-case designs (SCDs)--including nearly all the effect size indicators--currently assume no trend in the data. Regression and multilevel models allow for trend, but usually test only linear trend and have no principled way of knowing if higher order trends should be represented in the model. This paper shows how Generalized…

  19. Long range Ising model for credit risk modeling

    NASA Astrophysics Data System (ADS)

    Molins, Jordi; Vives, Eduard

    2005-07-01

    Within the framework of the maximum entropy principle, we show that the finite-size long-range Ising model is an adequate model for the description of homogeneous credit portfolios and the computation of credit risk when default correlations between the borrowers are included. The exact analysis of the model suggests that when the correlation increases, a first-order-like transition may occur, inducing a sudden increase in risk.
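
    As a rough illustration of the modelling idea, and not the authors' exact formulation, the sketch below samples a fully connected ("long-range") Ising portfolio with a Metropolis chain: each spin marks a borrower's default, the coupling J induces default correlation and the field h sets the stand-alone default tendency. All parameter values are invented, and temperature is absorbed into J and h.

        import numpy as np

        # Sketch only (not the authors' exact model): a fully connected Ising
        # portfolio in which spin s_i = +1 marks default of borrower i. J and
        # h are invented values.

        rng = np.random.default_rng(0)
        N, J, h = 200, 1.2, -1.0
        steps = 40_000

        s = -np.ones(N)                        # start with no defaults
        losses = []
        for t in range(steps):
            i = rng.integers(N)
            m = s.mean()                       # mean field seen by borrower i
            dE = 2 * s[i] * (J * m + h)        # mean-field energy change of a flip
            if dE <= 0 or rng.random() < np.exp(-dE):
                s[i] = -s[i]
            if t > steps // 2:                 # record after burn-in
                losses.append((s > 0).mean())  # fraction of defaulted borrowers

        print(f"mean default fraction: {np.mean(losses):.3f}")
        print(f"99th-percentile loss:  {np.quantile(losses, 0.99):.3f}")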

  20. The addition of whole soy flour to cafeteria diet reduces metabolic risk markers in wistar rats

    PubMed Central

    2013-01-01

    Background Soybean is termed a functional food because it contains bioactive compounds. However, its effects are not well known under unbalanced diet conditions. This work aimed to evaluate the effect of adding whole soy flour to a cafeteria diet on intestinal histomorphometry and on metabolic risk and toxicity markers in rats. Methods In this study, 30 male adult Wistar rats were distributed among three groups (n = 10): AIN-93M diet, cafeteria diet (CAF) and cafeteria diet with soy flour (CAFS), for 56 days. The following parameters were measured: food intake; weight gain; serum concentrations of triglycerides, total cholesterol, HDL-c, glycated hemoglobin (HbA1c), aspartate (AST) and alanine (ALT) aminotransferases and thiobarbituric acid reactive substances (TBARS); fecal moisture and lipid content; and liver weight and fat. The villus height, crypt depth and thickness of the duodenal and ileal circular and longitudinal muscle layers were also measured. Results There was a significant reduction in food intake in the CAF group. The CAFS group showed lower serum concentrations of triglycerides and TBARS and a lower percentage of hepatic fat, with a corresponding increase in the thickness of the intestinal muscle layers. In the CAF group, increases in HbA1c, ALT, lipid excretion, liver TBARS and crypt depth were observed, associated with lower HDL-c and villus height. The addition of soy did not promote any change in these parameters. Conclusions The inclusion of whole soy flour in a high-fat diet may help to reduce some markers of metabolic risk; however, more studies are required to clarify its effects in unbalanced diets. PMID:24119309

  1. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data were considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, and implant design and simulation, demonstrating the potential of FDM technology in the medical field. It will also improve communication between medical staff and patients. The current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models. PMID:26336695
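
    The DICOM-to-STL step can be illustrated with generic open-source packages (pydicom, scikit-image and numpy-stl) instead of the InVesalius program used in the paper; the file path, slice count and Hounsfield threshold for bone below are illustrative only.

        import numpy as np
        import pydicom                     # reads DICOM slices
        from skimage import measure        # marching cubes -> triangle mesh
        from stl import mesh               # numpy-stl, writes STL files

        # Illustrative pipeline, not the paper's tooling. Real data may also
        # need RescaleSlope/RescaleIntercept applied to get Hounsfield units.
        slices = [pydicom.dcmread(f"ct/slice_{i:03d}.dcm") for i in range(200)]
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)

        # Extract the bone surface at ~300 HU with marching cubes.
        verts, faces, _, _ = measure.marching_cubes(volume, level=300)

        skull = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
        skull.vectors[:] = verts[faces]    # (n_faces, 3, 3) triangle coordinates
        skull.save("skull.stl")            # ready for slicing and FDM printing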

  3. How much additional model complexity do the use of catchment hydrological signatures, additional data and expert knowledge warrant?

    NASA Astrophysics Data System (ADS)

    Hrachowitz, M.; Fovet, O.; RUIZ, L.; Gascuel-odoux, C.; Savenije, H.

    2013-12-01

    In the frequent absence of sufficient suitable data to constrain hydrological models, it is not uncommon to represent catchments at a range of scales by lumped model set-ups. Although process heterogeneity can average out at the catchment scale to generate simple catchment-integrated responses whose general flow features can frequently be reproduced by lumped models, these models often fail to capture details of the flow pattern, as well as catchment-internal dynamics such as groundwater level changes, to a sufficient degree, resulting in considerable predictive uncertainty. Traditionally, models are constrained by only one or two objective functions, which does not warrant more than a handful of parameters if elevated predictive uncertainty is to be avoided, thereby preventing more complex model set-ups that account for increased process heterogeneity. In this study it was tested how much additional process heterogeneity is warranted in models when the model calibration strategy is optimized using additional data and expert knowledge. Long-term time series of flow and groundwater levels for small nested experimental catchments in French Brittany, with considerable differences in geology, topography and flow regime, were used to test which degree of model process heterogeneity is warranted with increased availability of information. In a first step, as a benchmark, the system was treated as one lumped entity and the model was trained only on its ability to reproduce the hydrograph. Although the overall modelled flow generally reflected the observed flow response quite well, the internal system dynamics could not be reproduced. In further steps the complexity of this model was gradually increased, first by adding a separate riparian reservoir to the lumped set-up and then by a semi-distributed set-up allowing for independent, parallel model structures representing the contrasting nested catchments. Although calibration performance increased
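
    For readers unfamiliar with lumped set-ups, the sketch below shows the simplest member of the model family discussed here: a single linear-reservoir "bucket" driven by made-up forcing. The study's actual configurations extend this with a riparian reservoir and semi-distributed, parallel structures.

        import numpy as np

        # Minimal lumped "bucket" model (invented forcing and parameters):
        # one storage with evaporation losses and linear-reservoir outflow.

        rng = np.random.default_rng(9)
        days = 365
        precip = rng.gamma(0.6, 5.0, days)                          # mm/day
        pet = 2.0 + 1.5 * np.sin(np.linspace(0, 2 * np.pi, days))   # mm/day

        S, k = 50.0, 0.05                     # storage (mm), recession constant
        q = np.zeros(days)
        for t in range(days):
            S += precip[t] - min(pet[t], S)   # rain in, evaporation out
            q[t] = k * S                      # linear-reservoir discharge
            S -= q[t]
        print(f"mean simulated flow: {q.mean():.2f} mm/day")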

  4. Addition of Diffusion Model to MELCOR and Comparison with Data

    SciTech Connect

    Brad Merrill; Richard Moore; Chang Oh

    2004-06-01

    A chemical diffusion model was incorporated into the thermal-hydraulics package of the MELCOR Severe Accident code (Reference 1) for analyzing air ingress events for a very high temperature gas-cooled reactor.

  5. Non-additive model for specific heat of electrons

    NASA Astrophysics Data System (ADS)

    Anselmo, D. H. A. L.; Vasconcelos, M. S.; Silva, R.; Mello, V. D.

    2016-10-01

    Using the non-additive Tsallis entropy Sq, we demonstrate numerically that one-dimensional quasicrystals, whose energy spectra are multifractal Cantor sets, are characterized by an entropic parameter, and we calculate the electronic specific heat. In our method we consider an energy spectrum calculated using the one-dimensional tight-binding Schrödinger equation, with the bands (or levels) scaled onto the [0, 1] interval. The Tsallis formalism is applied to the energy spectra of Fibonacci and double-period one-dimensional quasiperiodic lattices. We analytically obtain an expression for the specific heat that we consider to be more appropriate for calculating this quantity in those quasiperiodic structures.

  6. Additional Research Needs to Support the GENII Biosphere Models

    SciTech Connect

    Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen

    2013-11-30

    In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved, both to accommodate the locally significant pathways identified and to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models. It is recommended that priorities be set by NRC staff to guide selection of the most useful improvements in a cost-effective manner. Suggestions are made based on relatively easy and inexpensive changes, and on longer-term, more costly studies. In the short term, there are several improved model formulations that could be applied to the GENII suite of codes to make them more generally useful:
    • implementation of the separation of the translocation and weathering processes;
    • implementation of an improved model for carbon-14 from non-atmospheric sources;
    • implementation of radon exposure pathway models;
    • development of a KML processor for the output report generator module, so that data calculated on a grid could be superimposed upon digital maps for easier presentation and display;
    • implementation of marine mammal models (manatees, seals, walrus, whales, etc.).
    Data needs in the longer term require extensive (and potentially expensive) research. Before picking any one radionuclide or food type, NRC staff should perform an in-house review of current and anticipated environmental analyses to select “dominant” radionuclides of interest to allow setting of cost-effective priorities for radionuclide- and pathway-specific research. These include:
    • soil-to-plant uptake studies for oranges and other citrus fruits; and
    • development of models for evaluation of radionuclide concentration in highly-processed foods such as oils and sugars.
    Finally, renewed

  7. Modeling food spoilage in microbial risk assessment.

    PubMed

    Koutsoumanis, Konstantinos

    2009-02-01

    In this study, I describe a systematic approach for modeling food spoilage in microbial risk assessment that is based on the incorporation of kinetic spoilage modeling in exposure assessment by combining data and models for the specific spoilage organisms (SSO: fraction of the total microflora responsible for spoilage) with those for pathogens. The structure of the approach is presented through an exposure assessment application for Escherichia coli O157:H7 in ground beef. The proposed approach allows for identifying spoiled products at the time of consumption by comparing the estimated level of SSO (pseudomonads) with the spoilage level (level of SSO at which spoilage is observed). The results of the application indicate that ignoring spoilage in risk assessment could lead to significant overestimations of risk.
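
    A minimal sketch of the screening idea, with entirely hypothetical growth rates and thresholds rather than the paper's values: units whose specific spoilage organisms (SSO) exceed the spoilage level at the time of consumption are rejected by the consumer and therefore never contribute to pathogen exposure.

        import numpy as np

        # Hypothetical numbers throughout; the point is the screening logic.
        rng = np.random.default_rng(1)
        n = 100_000
        storage_days = rng.uniform(1, 10, n)        # time to consumption
        sso0 = rng.normal(3.0, 0.5, n)              # initial SSO, log10 cfu/g
        path0 = rng.normal(0.0, 0.5, n)             # initial pathogen, log10 cfu/g
        mu_sso, mu_path = 0.9, 0.3                  # growth rates, log10/day
        spoilage_level = 7.0                        # SSO level at spoilage

        sso = sso0 + mu_sso * storage_days
        pathogen = path0 + mu_path * storage_days
        consumed = sso < spoilage_level             # spoiled units are discarded

        # Crude illness proxy: pathogen above 4 log10 cfu/g at consumption.
        print(f"risk ignoring spoilage:  {np.mean(pathogen > 4.0):.4f}")
        print(f"risk, non-spoiled only:  {np.mean(pathogen[consumed] > 4.0):.4f}")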

  8. Managing exploration risk using basin modeling

    SciTech Connect

    Wendebourg, J. )

    1996-01-01

    Economic risk analysis requires a well's dry-hole probability and a probability distribution of the type and volume of recoverable hydrocarbons. Today's world-wide exploration needs methods that can accommodate a wide variety of data quality and quantity. Monte Carlo methods are commonly used to compute volume distributions and dry-hole probability by multiplying probabilities of geologic risk factors such as source rock richness, migration loss, seal effectiveness, etc., assuming that these are independent parameters. This assumption, however, is not appropriate, because the factors represent interdependent physical processes that should be treated as an integrated system. Basin modeling is a tool for assessing exploration risk by simulating the interdependent processes that lead to hydrocarbon accumulations. Advanced 2-D and 3-D basin modeling can treat the occurrence, type, and volumes of hydrocarbons. These models need many parameters that individually may have great uncertainties, but calibration against available data may reduce those uncertainties significantly and therefore may quantify risk. Uncertainty in thermal and source rock parameters is evaluated by applying simple and fast 1-D tools to individual wells. Calibration against pressure and temperature data, as well as the occurrence and type of known hydrocarbon accumulations, with 2-D tools evaluates uncertainty between wells along geologic cross-sections. Individual prospect risk is finally determined by the uncertainty of local parameters within the calibrated model, for example seal effectiveness or fault permeability.
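
    The abstract's central criticism, that multiplying marginal factor probabilities assumes independence, can be illustrated with a toy Monte Carlo in which the same marginals are coupled through a Gaussian copula; all probabilities and the correlation value are invented.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        p = np.array([0.6, 0.5, 0.7])    # P(source), P(migration), P(seal) - invented

        print("independent product:", p.prod())                  # 0.21

        rho = 0.6                        # common inter-factor correlation (invented)
        cov = np.full((3, 3), rho) + (1 - rho) * np.eye(3)
        z = rng.multivariate_normal(np.zeros(3), cov, size=200_000)
        success = (z < norm.ppf(p)).all(axis=1)  # factor i succeeds below its quantile
        print("correlated estimate:", success.mean())            # noticeably larger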

  10. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    PubMed

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, tunnel characteristics such as tunnel configurations, geometries, provisions of tunnel electrical and mechanical systems, traffic volumes, etc. may vary from one section to another. Urban road tunnels with such nonuniform parameters are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels, because the existing QRA models for road tunnels are inapplicable to assessing the risks in such tunnels. The model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogeneous sections. Individual risk for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. The article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
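
    A minimal sketch of the segmentation principle, with invented numbers: each homogeneous section is assigned its own Poisson accident frequency (per million vehicle-kilometres, as a fitted Poisson regression would supply) and an expected consequence, and the section risks are summed into an aggregate index for the whole tunnel.

        # Invented per-section inputs:
        # (length_km, vehicles_per_year, accidents_per_million_veh_km, fatalities_per_accident)
        sections = [
            (1.2, 20e6, 0.35, 0.010),
            (0.8, 20e6, 0.60, 0.015),   # congested section with the extra top event
            (2.0, 18e6, 0.25, 0.008),
        ]

        total = 0.0
        for length, veh_per_year, freq, consequence in sections:
            veh_km = length * veh_per_year / 1e6        # million vehicle-km per year
            section_risk = veh_km * freq * consequence  # expected fatalities per year
            total += section_risk
            print(f"section risk: {section_risk:.4f} fatalities/year")
        print(f"aggregate tunnel risk: {total:.4f} fatalities/year")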

  11. Uncertainty in surface water flood risk modelling

    NASA Astrophysics Data System (ADS)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    Two thirds of the flooding that occurred in the UK during summer 2007 was a result of surface water (otherwise known as 'pluvial') rather than river or coastal flooding. In response, the Environment Agency and interim Pitt Reviews have highlighted the need for surface water risk mapping and warning tools to identify, and prepare for, flooding induced by heavy rainfall events. This need is compounded by the likely increase in rainfall intensities due to climate change. The Association of British Insurers has called for the Environment Agency to commission nationwide flood risk maps showing the relative risk of flooding from all sources. At the wider European scale, the recently published EC Directive on the assessment and management of flood risks will require Member States to evaluate, map and model flood risk from a variety of sources. As such, there is now a clear and immediate requirement for the development of techniques for assessing and managing surface water flood risk across large areas. This paper describes an approach for integrating rainfall, drainage network and high-resolution topographic data using Flowroute™, a high-resolution flood mapping and modelling platform, to produce deterministic surface water flood risk maps. Information is provided from UK case studies to enable assessment and validation of modelled results using historical flood information and insurance claims data. Flowroute was co-developed with flood scientists at Cambridge University specifically to simulate river dynamics and floodplain inundation in complex, congested urban areas in a highly computationally efficient manner. It utilises high-resolution topographic information to route flows around individual buildings so as to enable the prediction of flood depths, extents, durations and velocities. As such, the model forms an ideal platform for the development of surface water flood risk modelling and mapping capabilities. The 2-dimensional component of Flowroute employs

  12. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting.
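
    Two of the standard statistical checks for such a validation study, discrimination and calibration, are easy to sketch. In the hypothetical example below the external-cohort outcomes and predictions are simulated, with the model deliberately miscalibrated so that the recalibration intercept and slope deviate from 0 and 1.

        import numpy as np
        import statsmodels.api as sm
        from sklearn.metrics import roc_auc_score

        # Simulated stand-ins for an external cohort (y) and model output (p).
        rng = np.random.default_rng(3)
        lp_true = rng.normal(-2.0, 1.0, 5000)             # true linear predictor
        y = rng.binomial(1, 1 / (1 + np.exp(-lp_true)))   # observed outcomes
        p = 1 / (1 + np.exp(-(0.3 + 0.8 * lp_true)))      # miscalibrated predictions

        print("c-statistic:", round(roc_auc_score(y, p), 3))   # discrimination

        lp = np.log(p / (1 - p))                          # logit of predicted risk
        fit = sm.Logit(y, sm.add_constant(lp)).fit(disp=0)
        print("calibration intercept, slope:", fit.params.round(2))
        # Intercept far from 0 or slope far from 1 signals miscalibration.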

  13. Predicting the Survival Time for Bladder Cancer Using an Additive Hazards Model in Microarray Data

    PubMed Central

    TAPAK, Leili; MAHJUB, Hossein; SADEGHIFAR, Majid; SAIDIJAM, Massoud; POOROLAJAL, Jalal

    2016-01-01

    Background: One substantial part of microarray studies is to predict patients’ survival based on their gene expression profile. Variable selection techniques are powerful tools to handle high dimensionality in the analysis of microarray data. However, these techniques have not been investigated in a competing risks setting. This study aimed to investigate the performance of four sparse variable selection methods in estimating the survival time. Methods: The data included 1381 gene expression measurements and clinical information from 301 patients with bladder cancer operated on between 1987 and 2000 in hospitals in Denmark, Sweden, Spain, France, and England. Four methods, namely the least absolute shrinkage and selection operator, smoothly clipped absolute deviation, the smooth integration of counting and absolute deviation, and the elastic net, were utilized for simultaneous variable selection and estimation under an additive hazards model. The criteria of area under the ROC curve, Brier score and c-index were used to compare the methods. Results: The median follow-up time for all patients was 47 months. The elastic net approach was found to outperform the other methods. The elastic net had the lowest integrated Brier score (0.137±0.07) and the greatest median over-time AUC and C-index (0.803±0.06 and 0.779±0.13, respectively). Five out of 19 genes selected by the elastic net were significant (P<0.05) under an additive hazards model. It was indicated that the expression of RTN4, SON, IGF1R and CDC20 decreases the survival time, while the expression of SMARCAD1 increases it. Conclusion: The elastic net had higher capability than the other methods for predicting survival time in patients with bladder cancer in the presence of competing risks, based on an additive hazards model. PMID:27114989
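
    The additive (Aalen) hazards part of the analysis can be sketched with the lifelines package on toy gene-expression data. Note that lifelines' coef_penalizer is a ridge-type penalty, not the elastic net or SCAD selection studied in the paper, so the sketch illustrates the model class rather than the variable selection.

        import numpy as np
        import pandas as pd
        from lifelines import AalenAdditiveFitter

        # Toy data: survival times with a hazard that is additive in covariates.
        rng = np.random.default_rng(4)
        n, genes = 300, 5
        X = rng.normal(size=(n, genes))
        rate = np.clip(0.05 + 0.02 * X[:, 0] - 0.015 * X[:, 3], 0.01, None)
        T = rng.exponential(1 / rate)
        E = (T < 60).astype(int)               # administrative censoring at t = 60
        T = np.minimum(T, 60)

        df = pd.DataFrame(X, columns=[f"gene_{i}" for i in range(genes)])
        df["T"], df["E"] = T, E

        aaf = AalenAdditiveFitter(coef_penalizer=0.5)   # ridge-type penalty
        aaf.fit(df, duration_col="T", event_col="E")
        print(aaf.cumulative_hazards_.tail())  # time-varying cumulative coefficients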

  14. Suicide risk assessment and suicide risk formulation: essential components of the therapeutic risk management model.

    PubMed

    Silverman, Morton M

    2014-09-01

    Suicide and other suicidal behaviors are often associated with psychiatric disorders and dysfunctions. Therefore, psychiatrists have significant opportunities to identify at-risk individuals and offer treatment to reduce that risk. Although a suicide risk assessment is a core competency requirement, many clinical psychiatrists lack the requisite training and skills to appropriately assess for suicide risk. Moreover, the standard of care requires psychiatrists to foresee the possibility that a patient might engage in suicidal behavior, hence to conduct a suicide risk formulation sufficient to guide triage and treatment planning. Based on data collected via a suicide risk assessment, a suicide risk formulation is a process whereby the psychiatrist forms a judgment about a patient's foreseeable risk of suicidal behavior in order to inform triage decisions, safety and treatment planning, and interventions to reduce risk. This paper addresses the components of this process in the context of the model for therapeutic risk management of the suicidal patient developed at the Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education and Clinical Center by Wortzel et al.

  15. The addition of algebraic turbulence modeling to program LAURA

    NASA Astrophysics Data System (ADS)

    Cheatwood, F. Mcneil; Thompson, R. A.

    1993-04-01

    The Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA) is modified to allow the calculation of turbulent flows. This is accomplished using the Cebeci-Smith and Baldwin-Lomax eddy-viscosity models in conjunction with the thin-layer Navier-Stokes options of the program. Turbulent calculations can be performed for both perfect-gas and equilibrium flows. However, a requirement of the models is that the flow be attached. It is seen that for slender bodies, adequate resolution of the boundary-layer gradients may require more cells in the normal direction than for a laminar solution, even when grid stretching is employed. Results for axisymmetric and three-dimensional flows are presented. Comparison with experimental data and other numerical results reveals generally good agreement, except in regions of detached flow.

  16. The risk of stillbirth and infant death by each additional week of expectant management stratified by maternal age

    PubMed Central

    Page, Jessica M.; Snowden, Jonathan M.; Cheng, Yvonne W.; Doss, Amy; Rosenstein, Melissa G.; Caughey, Aaron B.

    2016-01-01

    OBJECTIVE The objective of the study was to examine fetal/infant mortality by gestational age at term, stratified by maternal age. STUDY DESIGN A retrospective cohort study was conducted using 2005 US national birth certificate data. For each week of term gestation, the risk of mortality associated with delivery was compared with the composite mortality risk of expectant management. The expectant management measure included stillbirth and infant death: the composite mortality risk of remaining pregnant an additional week was estimated by combining the risk of stillbirth during that additional week of pregnancy with the risk of infant death following delivery in the next week. Maternal age was stratified as 35 years or older versus younger than 35 years, with subgroup analyses of women younger than 20, 20–34, 35–39, and 40 years old or older. RESULTS The fetal/infant mortality risk of expectant management was greater than the risk of infant death at 39 weeks' gestation in women 35 years old or older (15.2 vs 10.9 per 10,000, P < .05). In women younger than 35 years, the risk of expectant management also exceeded that of infant death at 39 weeks (21.3 vs 18.8 per 10,000, P < .05). For women younger than 35 years, the overall expectant management risk is influenced by a higher infant death risk and does not rise significantly until 41 weeks, compared with women 35 years old or older, in whom it increased at 40 weeks. CONCLUSION Risk varies by maternal age, and delivery at 39 weeks minimizes fetal/infant mortality for both groups, although the magnitude of the risk reduction is greater in older women. PMID:23707677
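
    The decision rule described above is simple enough to state in code: delivery now is favoured when its infant death risk is below the composite risk of remaining pregnant (stillbirth during the additional week plus infant death after delivery the following week). The rates are the per-10,000 figures quoted in the abstract; since the split of the composite into its two components is not reported, it is passed in as a whole.

        def prefer_delivery(risk_delivery_now, risk_expectant_composite):
            """True if immediate delivery minimizes fetal/infant mortality."""
            return risk_delivery_now < risk_expectant_composite

        # Per 10,000 ongoing pregnancies at 39 weeks, from the abstract:
        print(prefer_delivery(10.9, 15.2))   # women >= 35 years -> True
        print(prefer_delivery(18.8, 21.3))   # women <  35 years -> True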

  17. Risk management model of winter navigation operations.

    PubMed

    Valdez Banda, Osiris A; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-07-15

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish-Swedish Winter Navigation System. This establishes the requirements and limitations for the vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO) and adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are the most probable, while major oil spills are found to be very unlikely but possible. PMID:27207023

  19. Clinical Model for Suicide Risk Assessment.

    ERIC Educational Resources Information Center

    Kral, Michael J.; Sakinofsky, Isaac

    1994-01-01

    Presents suicide risk assessment in a two-tiered model comprising background/contextual factors and subjectivity. The subjectivity portion is formulated around Shneidman's concepts of perturbation and lethality. Discusses decision of hospital admission versus ambulatory care. Suggests that theoretically informed approach should serve both…

  20. Increased Risk of Additional Cancers Among Patients with Gastrointestinal Stromal Tumors: A Population-Based Study

    PubMed Central

    Murphy, James D.; Ma, Grace L.; Baumgartner, Joel M.; Madlensky, Lisa; Burgoyne, Adam M.; Tang, Chih-Min; Martinez, Maria Elena; Sicklick, Jason K.

    2015-01-01

    Purpose Most gastrointestinal stromal tumors (GIST) are considered non-hereditary or sporadic. However, single-institution studies suggest that GIST patients develop additional malignancies at increased frequencies. We hypothesized that we could gain greater insight into possible associations between GIST and other malignancies using a national cancer database inquiry. Methods Patients diagnosed with GIST (2001–2011) in the Surveillance, Epidemiology, and End Results database were included. Standardized prevalence ratios (SPRs) and standardized incidence ratios (SIRs) were used to quantify cancer risks incurred by GIST patients before and after GIST diagnosis, respectively, when compared with the general U.S. population. Results Of 6,112 GIST patients, 1,047 (17.1%) had additional cancers. There were significant increases in overall cancer rates: 44% (SPR=1.44) before and 66% (SIR=1.66) after GIST diagnosis. Malignancies with significantly increased occurrence both before and after diagnosis included other sarcomas (SPR=5.24/SIR=4.02), neuroendocrine-carcinoid tumors (SPR=3.56/SIR=4.79), non-Hodgkin’s lymphoma (SPR=1.69/SIR=1.76), and colorectal adenocarcinoma (SPR=1.51/SIR=2.16). Esophageal adenocarcinoma (SPR=12.0), bladder adenocarcinoma (SPR=7.51), melanoma (SPR=1.46), and prostate adenocarcinoma (SPR=1.20) were significantly more common only before GIST. Ovarian carcinoma (SIR=8.72), small intestine adenocarcinoma (SIR=5.89), papillary thyroid cancer (SIR=5.16), renal cell carcinoma (SIR=4.46), hepatobiliary adenocarcinomas (SIR=3.10), gastric adenocarcinoma (SIR=2.70), pancreatic adenocarcinoma (SIR=2.03), uterine adenocarcinoma (SIR=1.96), non-small cell lung cancer (SIR=1.74), and transitional cell carcinoma of the bladder (SIR=1.65) were significantly more common only after GIST. Conclusion This is the first population-based study to characterize the associations and temporal relationships between GIST and other cancers, both by site and

  1. Software reliability: Additional investigations into modeling with replicated experiments

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.

    1984-01-01

    The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the rate of occurrence of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.

  2. Determinants of Low Birth Weight in Malawi: Bayesian Geo-Additive Modelling.

    PubMed

    Ngwira, Alfred; Stanley, Christopher C

    2015-01-01

    Studies on the factors behind low birth weight in Malawi have neglected the flexible approach of using smooth functions for some covariates in models. Such a flexible approach reveals the detailed relationship of covariates with the response. The study aimed to investigate risk factors for low birth weight in Malawi by assuming a flexible approach for continuous covariates and a geographical random effect. A Bayesian geo-additive model for birth weight in kilograms and size of the child at birth (less than average, or average and higher), with district as a spatial effect, was adopted using the 2010 Malawi demographic and health survey data. A Gaussian model for birth weight in kilograms and a binary logistic model for the binary outcome (size of child at birth) were fitted. Continuous covariates were modelled by penalized (P-) splines, and spatial effects were smoothed by a two-dimensional P-spline. The study found that child birth order, mother's weight and mother's height are significant predictors of birth weight. Secondary education for the mother, birth order categories 2-3 and 4-5, a wealth index of richer family and mother's height were significant predictors of child size at birth. The district associated with low birth weight was Chitipa, and the districts with increased risk of less-than-average size at birth were Chitipa and Mchinji. The study found support for the flexible modelling of some covariates that clearly have nonlinear influences. Nevertheless, there is no strong support for the inclusion of the geographical spatial analysis. The spatial patterns, though, point to the influence of omitted variables with some spatial structure, or possibly epidemiological processes that account for this spatial structure, and the maps generated could be used for targeting development efforts at a glance. PMID:26114866
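
    The geo-additive structure can be sketched with the pygam package on simulated stand-in data: penalized-spline smooths for the continuous covariates and a two-dimensional tensor-product smooth over coordinates for the spatial effect. This is a frequentist penalized fit, not the paper's Bayesian estimation.

        import numpy as np
        from pygam import LinearGAM, s, te

        # Simulated stand-ins, loosely themed on the application.
        rng = np.random.default_rng(8)
        n = 1000
        mother_wt = rng.normal(58, 8, n)                  # kg
        lon = rng.uniform(33, 36, n)                      # Malawi-like coordinates
        lat = rng.uniform(-17, -9, n)
        bw = (2.1 + 0.015 * mother_wt                     # covariate effect
              + 0.1 * np.sin(lon) + 0.1 * np.cos(lat)     # smooth spatial field
              + rng.normal(0, 0.4, n))                    # birth weight, kg

        X = np.column_stack([mother_wt, lon, lat])
        gam = LinearGAM(s(0) + te(1, 2)).fit(X, bw)       # smooth + spatial term
        gam.summary()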

  4. Additional Developments in Atmosphere Revitalization Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.; Gomez, Carlos

    2013-01-01

    NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.

  6. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation of B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  7. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
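
    The risk formulation used in the procedure (expected damage as hazard times exposure times vulnerability, summed over scenarios) reduces to a few lines; all numbers below are invented.

        # Expected annual loss for one map cell: sum over landslide scenarios
        # of P(event) x value of exposed elements x vulnerability (0..1).
        scenarios = [
            (0.010, 2_000_000, 0.15),   # frequent shallow slide, light damage
            (0.002, 5_000_000, 0.40),   # rarer, larger slide, heavier damage
        ]

        expected_annual_loss = sum(p * value * v for p, value, v in scenarios)
        print(f"expected annual loss: EUR {expected_annual_loss:,.0f}")   # EUR 7,000
        # Ranking cells by this quantity identifies where mitigation spending
        # is most cost-effective, as in the zoning procedure described above.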

  8. Landslide risk mapping and modeling in China

    NASA Astrophysics Data System (ADS)

    Li, W.; Hong, Y.

    2015-12-01

    Under the circumstances of global climate change, tectonic stress and human activity, landslides are among the most frequent and severely widespread natural hazards on Earth, as demonstrated in the World Atlas of Natural Hazards (McGuire et al., 2004). Every year, landslide activity causes serious economic loss as well as casualties (Róbert et al., 2005). How landslides can be monitored and predicted is an urgent research topic for the international landslide research community. In particular, there is a lack of high-quality, up-to-date landslide risk maps and guidelines that could be employed to better mitigate and prevent landslide disasters in many emerging regions, including China (Hong, 2007). Since the 1950s, landslide events have been recorded in statistical yearbooks, newspapers, and monographs in China. As disasters have drawn increasing concern from the government and the public, information about landslide events is becoming available from online news reports (Liu et al., 2012). This study presents multi-scale landslide risk mapping and modeling in China. At the national scale, based on historical data and practical experience, we carry out landslide susceptibility and risk mapping by adopting a statistical approach and pattern recognition methods to construct empirical models. Over the identified landslide hot-spot areas, we further evaluate the slope stability of each individual site (Sidle and Hirotaka, 2006), with the ultimate goal of setting up a space-time, multi-scale coupled system of landslide risk mapping and modeling for landslide hazard monitoring and early warning.

  9. Risk management model in road transport systems

    NASA Astrophysics Data System (ADS)

    Sakhapov, R. L.; Nikolaeva, R. V.; Gatiyatullin, M. H.; Makhmutov, M. M.

    2016-08-01

    The article presents the results of a study of road safety indicators that influence the development and operation of the transport system. Road safety is considered as a continuous process of risk management. The authors constructed a model that relates social risk to a major road safety indicator: the level of motorization. The model gives a fairly accurate assessment of the level of social risk for any given level of motorization. The authors also calculated the dependence of socio-economic costs on the number of accidents and of the people injured in them. The applicability of the concept of socio-economic damage rests on the presence of a linear relationship between the natural and economic indicators of damage from accidents. The optimization of social risk reduces to finding the extremum of an objective function that characterizes the economic effect of implementing measures to improve safety. The calculations make it possible to maximize the net present value as a function of the costs of improving road safety, taking into account the socio-economic damage caused by accidents. The proposed econometric models make it possible to quantify the efficiency of the transportation system and to simulate changes in road safety indicators.
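
    A toy version of the optimization described above, with an assumed diminishing-returns form for the avoided damage: choose the level of safety spending that maximizes the net economic effect. The functional form and all constants are invented.

        import numpy as np
        from scipy.optimize import minimize_scalar

        D0 = 100.0                                 # baseline annual damage, $M (invented)

        def npv(c):
            """Net effect of spending c $M/year on road safety measures."""
            avoided = D0 * (1 - np.exp(-0.08 * c))  # assumed saturating benefit
            return avoided - c

        res = minimize_scalar(lambda c: -npv(c), bounds=(0.0, 100.0), method="bounded")
        print(f"optimal spending: {res.x:.1f}M -> NPV {npv(res.x):.1f}M")  # ~26M -> ~61.5M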

  10. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball. PMID:16257374
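
    A toy Monte Carlo in the spirit of the scenario analysis described above, chaining the three strands with clinical success made conditional on manufacturability to mimic a simple Bayesian-network dependency; all probabilities and payoffs are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 1_000_000
        manufacturable = rng.random(n) < 0.7                # strand 1 (invented)
        p_clinical = np.where(manufacturable, 0.25, 0.05)   # conditional on strand 1
        clinical_ok = rng.random(n) < p_clinical            # strand 2
        payoff = np.where(manufacturable & clinical_ok,     # strand 3: returns, $M
                          rng.normal(500, 150, n), 0.0) - 80.0

        print(f"P(success):   {(manufacturable & clinical_ok).mean():.3f}")
        print(f"expected NPV: {payoff.mean():.1f}M, P(loss): {(payoff < 0).mean():.3f}")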

  11. Malignancy Risk Models for Oral Lesions

    PubMed Central

    Zarate, Ana M.; Brezzo, María M.; Secchi, Dante G.; Barra, José L.

    2013-01-01

    Objectives: The aim of this work was to assess risk habits, clinical and cellular phenotypes and TP53 DNA changes in oral mucosa samples from patients with Oral Potentially Malignant Disorders (OPMD), in order to create models that yield genotypic and phenotypic patterns determining the risk of lesions becoming malignant. Study Design: Clinical phenotypes, family history of cancer and risk habits were collected in clinical histories. TP53 gene mutations and morphometric-morphological features were studied, and multivariate models were applied. Three groups were established: a) an oral cancer (OC) group (n=10), b) an OPMD group (n=10), and c) a control group (n=8). Results: An average of 50% of patients with malignancy were found to have smoking and drinking habits. A high percentage of TP53 mutations were observed in OC (30%) and OPMD (average 20%) lesions (p=0.000). The majority of these mutations were GC → TA transversions (60%). However, patients with OC presented mutations in all the exons and introns studied. The highest diagnostic accuracy (p=0.0001) was observed when the alcohol and tobacco habit variables were incorporated together with TP53 mutations. Conclusions: Our results prove to be statistically reliable, with parameter estimates that are nearly unbiased even for small sample sizes. Models 2 and 3 were the most accurate for assessing the risk of an OPMD becoming cancerous. However, in a public health context, model 3 is the most recommended because the characteristics considered are easier and less costly to evaluate. Key words: TP53, oral potentially malignant disorders, risk factors, genotype, phenotype. PMID:23722122

  12. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparison to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGFβ-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  13. Improving the predictive accuracy of hurricane power outage forecasts using generalized additive models.

    PubMed

    Han, Seung-Ryong; Guikema, Seth D; Quiring, Steven M

    2009-10-01

    Electric power is a critical infrastructure service after hurricanes, and rapid restoration of electric power is important in order to minimize losses in the impacted areas. However, rapid restoration of electric power after a hurricane depends on obtaining the necessary resources, primarily repair crews and materials, before the hurricane makes landfall and then appropriately deploying these resources as soon as possible after the hurricane. This, in turn, depends on having sound estimates of both the overall severity of the storm and the relative risk of power outages in different areas. Past studies have developed statistical, regression-based approaches for estimating the number of power outages in advance of an approaching hurricane. However, these approaches have either not been applicable for future events or have had lower predictive accuracy than desired. This article shows that a different type of regression model, a generalized additive model (GAM), can outperform the types of models used previously. This is done by developing and validating a GAM based on power outage data during past hurricanes in the Gulf Coast region and comparing the results from this model to the previously used generalized linear models.
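
    The modelling idea can be sketched with the pygam package on simulated stand-ins for the study's covariates: outage counts are modelled as smooth, possibly nonlinear functions of storm and environment variables rather than the fixed linear forms of a GLM.

        import numpy as np
        from pygam import PoissonGAM, s

        # Simulated stand-ins; the nonlinear truth rises sharply above ~45 m/s.
        rng = np.random.default_rng(6)
        n = 2000
        wind = rng.uniform(20, 70, n)         # max gust speed, m/s
        rain = rng.uniform(0, 300, n)         # accumulated rainfall, mm
        mu = np.exp(0.5 + 0.002 * rain + 2.5 / (1 + np.exp(-(wind - 45) / 4)))
        y = rng.poisson(mu)                   # outage counts per grid cell

        X = np.column_stack([wind, rain])
        gam = PoissonGAM(s(0) + s(1)).fit(X, y)   # one smooth per covariate
        gam.summary()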

  14. A statistical model for collective risk assessment

    NASA Astrophysics Data System (ADS)

    Keef, Caroline; Tawn, Jonathan A.; Lamb, Rob

    2010-05-01

    In this paper we present the theoretical basis of a statistical method that can be used as the basis of a collective risk assessment for country (or continent)-wide events. Our method is based on the conditional dependence model of Heffernan and Tawn (2004), which has been extended to handle missing data and temporal dependence by Keef et al. (2009). This model describes the full joint distribution function of a set of variables and incorporates separate models for the marginal and dependence characteristics of the set using a copula approach. The advantages of this model include: the flexibility in the types of dependence modelled; the ability to handle situations where the dependence in the tails of the data differs from that in the main body of the data; the ability to handle both temporal and spatial dependence; and the ability to model a large number of variables. In this paper we present further extensions to the statistical model which allow us to simulate country-wide extreme events with the correct spatial and temporal structure, and show an application to river flood events. Heffernan, J. E. and Tawn, J. A. (2004). A conditional approach for multivariate extreme values (with discussion). J. R. Statist. Soc. B, 66, 497-546. Keef, C., Tawn, J. and Svensson, C. (2009). Spatial risk assessment for extreme river flows. Applied Statistics, 58(5), 601-618.

  15. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or on Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwind effects.

  16. [Risk hidden in the small print? : Some food additives may trigger pseudoallergic reactions].

    PubMed

    Zuberbier, Torsten; Hengstenberg, Claudine

    2016-06-01

    Some food additives may trigger pseudoallergic reactions. The prevalence of such overreactions in the general population is, despite the increasing number of food additives, rather low. The most common triggers of pseudoallergic reactions to food are naturally occurring ingredients. Symptoms in patients with chronic urticaria, however, should improve significantly on a pseudoallergen-free diet. In addition, some studies indicate that certain food additives may also have an impact on the symptoms of patients with neurodermatitis and asthma. PMID:27173908

  18. Flexible regression models for rate differences, risk differences and relative risks.

    PubMed

    Donoghoe, Mark W; Marschner, Ian C

    2015-05-01

    Generalized additive models (GAMs) based on the binomial and Poisson distributions can be used to provide flexible semi-parametric modelling of binary and count outcomes. When used with the canonical link function, these GAMs provide semi-parametrically adjusted odds ratios and rate ratios. For adjustment of other effect measures, including rate differences, risk differences and relative risks, non-canonical link functions must be used together with a constrained parameter space. However, the algorithms used to fit these models typically rely on a form of the iteratively reweighted least squares algorithm, which can be numerically unstable when a constrained non-canonical model is used. We describe an application of a combinatorial EM algorithm to fit identity-link Poisson, identity-link binomial and log-link binomial GAMs in order to estimate semi-parametrically adjusted rate differences, risk differences and relative risks. Using smooth regression functions based on B-splines, the method provides stable convergence to the maximum likelihood estimates, and it ensures that the estimates always remain within the parameter space. It is also straightforward to apply a monotonicity constraint to the smooth regression functions. We illustrate the method using data from a clinical trial in heart attack patients. PMID:25781711
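
    The sketch below illustrates the constrained-estimation problem the paper addresses, in the simplest case of an identity-link binomial (risk-difference) model: the likelihood is maximized subject to every fitted risk lying in (0, 1). It uses a generic constrained optimizer rather than the authors' combinatorial EM algorithm, and the data are simulated.

    ```python
    import numpy as np
    from scipy.optimize import minimize, LinearConstraint

    rng = np.random.default_rng(7)

    # Synthetic data: true risks are linear in the covariates (identity link).
    n = 1500
    X = np.column_stack([np.ones(n), rng.integers(0, 2, n), rng.uniform(0, 1, n)])
    beta_true = np.array([0.10, 0.08, 0.15])
    y = rng.binomial(1, X @ beta_true)

    def negloglik(beta):
        p = X @ beta
        return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

    eps = 1e-6
    # Keep every fitted risk X @ beta inside (0, 1): the constrained parameter
    # space that makes identity-link binomial models awkward for plain IRLS.
    constr = LinearConstraint(X, eps, 1 - eps)
    fit = minimize(negloglik, x0=np.array([0.2, 0.0, 0.0]),
                   method="trust-constr", constraints=[constr])
    print("risk difference estimates:", fit.x.round(3))
    ```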

  19. Risk modelling for vaccination: a risk assessment perspective.

    PubMed

    Wooldridge, M

    2007-01-01

    Any risk assessment involves a number of steps. First, the risk manager, in close liaison with the risk assessor, should identify the question of interest. Then, the hazards associated with each risk question should be identified. Only then can the risks themselves be assessed. Several questions may reasonably be asked about the risks associated with avian influenza vaccines and their use. Some apply to any vaccine, while others are specific to avian influenza. Risks may occur during manufacture and during use. Some concern the vaccines themselves, while others concern the effect of failure on disease control. The hazards associated with each risk question are then identified. These may be technical errors in design, development or production, such as contamination or failure to inactivate appropriately. They may relate to the biological properties of the pathogens themselves displayed during manufacture or use, for example, reversion to virulence, shedding, or not being the right strain for the subsequent challenge. Following a consideration of risks and hazards, the information needed and the steps necessary to assess the risk are summarized for an illustrative risk question, using as an example the risks associated with the use of vaccines in the field. A brief consideration of the differences between qualitative and quantitative risk assessments is also included, and the potential effects of uncertainty and variability on the results are discussed.

  20. Analysis of Time to Event Outcomes in Randomized Controlled Trials by Generalized Additive Models

    PubMed Central

    Argyropoulos, Christos; Unruh, Mark L.

    2015-01-01

    Background Randomized controlled trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. Methods By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAMs) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks or even differences in restricted mean survival time between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO, a long-duration study conducted under evolving standards of care on a heterogeneous patient population. Findings PGAMs can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, they supported not only unadjusted (overall treatment effect) but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. Conclusions By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial
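
    A compact sketch of the underlying "Poisson trick" on simulated trial data: follow-up is split at fixed nodes (the paper uses Gauss-Lobatto quadrature nodes; equally spaced splits keep this sketch short) and events are modeled as Poisson counts with a log-exposure offset, so the log hazard ratio is recovered from an ordinary Poisson regression. All data and node choices are illustrative.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)

    # Synthetic trial: exponential survival, treatment halves the hazard.
    n = 400
    arm = rng.integers(0, 2, n)
    t = rng.exponential(1.0 / np.where(arm == 1, 0.5, 1.0))
    event = (t < 2.0).astype(int)          # administrative censoring at 2 years
    t = np.minimum(t, 2.0)

    # Split each person's follow-up at fixed node times.
    nodes = np.linspace(0.0, 2.0, 9)
    rows = []
    for ti, di, ai in zip(t, event, arm):
        for lo, hi in zip(nodes[:-1], nodes[1:]):
            if ti <= lo:
                break
            exposure = min(ti, hi) - lo
            died = int(di and ti <= hi)
            rows.append((died, exposure, ai, (lo + hi) / 2))
    d, expo, a, mid = map(np.array, zip(*rows))

    # Poisson "trick": events ~ Poisson with log(exposure) offset; interval
    # midpoints enter as a covariate so the baseline hazard can vary in time.
    Xd = sm.add_constant(np.column_stack([a, mid]))
    fit = sm.GLM(d, Xd, family=sm.families.Poisson(), offset=np.log(expo)).fit()
    print("log hazard ratio (treatment):", fit.params[1].round(3))  # ~ log(0.5)
    ```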

  1. Analysis and Modeling of soil hydrology under different soil additives in artificial runoff plots

    NASA Astrophysics Data System (ADS)

    Ruidisch, M.; Arnhold, S.; Kettering, J.; Huwe, B.; Kuzyakov, Y.; Ok, Y.; Tenhunen, J. D.

    2009-12-01

    The impact of monsoon events during June and July in the Korean project region, the Haean Basin in the northeastern part of South Korea, plays a key role in erosion, leaching and groundwater pollution risk from agrochemicals. The project therefore investigates the main hydrological processes in agricultural soils under field and laboratory conditions on different scales (plot, hillslope and catchment). Soil hydrological parameters were analysed for different soil additives, which are known to prevent soil erosion and nutrient loss and to increase water infiltration, aggregate stability and soil fertility. Hence, synthetic water-soluble polyacrylamides (PAM), biochar (black carbon mixed with organic fertilizer), and both PAM and biochar together were applied in runoff plots at three agricultural field sites. Additionally, a control subplot was set up without any additives. The field sites were selected in areas with similar hillslope gradients and with emphasis on the dominant land management form of dryland farming in Haean, which is characterised by row planting and row covering with foil. Hydrological parameters such as saturated hydraulic conductivity, matric potential and water content were analysed by infiltration experiments, continuous tensiometer measurements and time domain reflectometry, as well as pressure plates to identify the characteristic water retention curve of each horizon. Weather data were recorded by three weather stations next to the runoff plots. The measured data also provide the input for modeling water transport in the unsaturated zone of the runoff plots with HYDRUS 1D/2D/3D and SWAT (Soil & Water Assessment Tool).
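
    Characteristic water retention curves of the kind mentioned above are conventionally described by the van Genuchten model; the sketch below fits it to hypothetical pressure-plate measurements. The observed values and parameter bounds are illustrative, not data from the Haean plots.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def van_genuchten(h, theta_r, theta_s, alpha, n):
        """Water retention curve theta(h); h is suction head (positive, cm)."""
        m = 1.0 - 1.0 / n
        return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

    # Hypothetical pressure-plate measurements for one soil horizon:
    # suction heads (cm) and measured volumetric water contents.
    h_obs = np.array([1, 10, 32, 100, 316, 1000, 3162, 15000], dtype=float)
    theta_obs = np.array([0.43, 0.41, 0.37, 0.30, 0.22, 0.15, 0.11, 0.08])

    popt, _ = curve_fit(van_genuchten, h_obs, theta_obs,
                        p0=[0.05, 0.45, 0.02, 1.5],
                        bounds=([0, 0.2, 1e-4, 1.01], [0.2, 0.6, 1.0, 5.0]))
    theta_r, theta_s, alpha, n = popt
    print(f"theta_r={theta_r:.3f} theta_s={theta_s:.3f} alpha={alpha:.4f} n={n:.2f}")
    ```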

  2. Human Plague Risk: Spatial-Temporal Models

    NASA Technical Reports Server (NTRS)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks in the human population. Using satellite remote-sensing observations of the Earth, the close coupling between the vectors of the disease and climate variability has been systematically analysed and mapped. The overall result is that the incidence of plague is correlated with the positive phase of the El Niño/Southern Oscillation (ENSO).

  3. Assessing calibration of multinomial risk prediction models.

    PubMed

    Van Hoorde, Kirsten; Vergouwe, Yvonne; Timmerman, Dirk; Van Huffel, Sabine; Steyerberg, Ewout W; Van Calster, Ben

    2014-07-10

    Calibration, that is, whether observed outcomes agree with predicted risks, is important when evaluating risk prediction models. For dichotomous outcomes, several tools exist to assess different aspects of model calibration, such as calibration-in-the-large, logistic recalibration, and (non-)parametric calibration plots. We aim to extend these tools to prediction models for polytomous outcomes. We focus on models developed using multinomial logistic regression (MLR): outcome Y with k categories is predicted using k - 1 equations comparing each category i (i = 2, …, k) with reference category 1 using a set of predictors, resulting in k - 1 linear predictors. We propose a multinomial logistic recalibration framework that involves an MLR fit where Y is predicted using the k - 1 linear predictors from the prediction model. A non-parametric alternative may use vector splines for the effects of the linear predictors. The parametric and non-parametric frameworks can be used to generate multinomial calibration plots. Further, the parametric framework can be used for the estimation and statistical testing of calibration intercepts and slopes. Two illustrative case studies are presented, one on the diagnosis of malignancy of ovarian tumors and one on residual mass diagnosis in testicular cancer patients treated with cisplatin-based chemotherapy. The risk prediction models were developed on data from 2037 and 544 patients and externally validated on 1107 and 550 patients, respectively. We conclude that calibration tools can be extended to polytomous outcomes. The polytomous calibration plots are particularly informative through the visual summary of the calibration performance.
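
    A minimal sketch of the parametric recalibration framework: the outcome is regressed on the k - 1 linear predictors of the original model, and the fitted intercepts and slopes summarize calibration (intercepts near 0 and slopes near 1 indicate good calibration). The data here are simulated with deliberately attenuated slopes; the reference-category parameterization follows the MLR setup described above.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(11)

    # Suppose the development model produced k-1 = 2 linear predictors per
    # patient for a 3-category outcome (first category = reference). On this
    # simulated validation set the model is miscalibrated: true slopes are 0.6.
    n = 4000
    lp = rng.normal(0.0, 1.5, size=(n, 2))
    logits = np.column_stack([np.zeros(n), 0.6 * lp])
    p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    y = np.array([rng.choice(3, p=pi) for pi in p])

    # Multinomial logistic recalibration: regress the outcome on the k-1
    # linear predictors; the fitted slopes should recover ~0.6 here.
    recal = sm.MNLogit(y, sm.add_constant(lp)).fit(disp=0)
    print(recal.params.round(2))   # one column per non-reference category
    ```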

  4. Bacterial-based additives for the production of artificial snow: what are the risks to human health?

    PubMed

    Lagriffoul, A; Boudenne, J L; Absi, R; Ballet, J J; Berjeaud, J M; Chevalier, S; Creppy, E E; Gilli, E; Gadonna, J P; Gadonna-Widehem, P; Morris, C E; Zini, S

    2010-03-01

    For around two decades, artificial snow has been used by numerous winter sports resorts to ensure good snow cover in low-altitude areas or, more generally, to lengthen the skiing season. Biological additives derived from certain bacteria are regularly used to make artificial snow. However, the use of these additives has raised doubts concerning their potential impact on human health and the environment. In this context, the French health authorities requested the French Agency for Environmental and Occupational Health Safety (Afsset) to assess the health risks resulting from the use of such additives. The health risk assessment was based on a review of the scientific literature, supplemented by professional consultations and expertise. Biological and chemical hazards from additives derived from the ice nucleation active bacterium Pseudomonas syringae were characterised. Potential health hazards to humans were considered in terms of infectious, toxic and allergenic capacities, with respect to the human populations liable to be exposed and the possible means of exposure. Taking these data into account, a qualitative risk assessment was carried out according to four exposure scenarios involving the different populations exposed and the conditions and routes of exposure. It was concluded that certain health risks can exist for specific categories of professional workers (mainly snowmakers during additive mixing and dilution-tank cleaning steps, with risks estimated to be negligible to low if workers comply with safety precautions). P. syringae does not present any pathogenic capacity to humans, and the levels of its endotoxins found in artificial snow do not represent a danger beyond that of exposure to P. syringae endotoxins naturally present in snow. However, the risk of possible allergy in some particularly sensitive individuals cannot be excluded. Another important conclusion of this study concerns the use of water of poor microbiological quality to make artificial snow.

  6. Animal Models of Ischemic Stroke. Part One: Modeling Risk Factors

    PubMed Central

    Bacigaluppi, Marco; Comi, Giancarlo; Hermann, Dirk M.

    2010-01-01

    Ischemic stroke is one of the leading causes of long-term disability and death in developed and developing countries. Stroke-related mortality and morbidity are expected to rise in the coming decades, owing both to the poor identification of risk factors and the persistence of unhealthy habits, and to the aging of the population. To counteract the estimated increase in stroke incidence, it is of primary importance to identify risk factors, study their effects, promote primary and secondary prevention, and extend the therapeutic repertoire that is currently limited to the very first hours after stroke. While epidemiologic studies in human populations are essential to identify emerging risk factors, adequate animal models represent a fundamental tool to dissect stroke risk factors down to their molecular mechanisms and to find efficacious therapeutic strategies for this complex multifactorial disorder. The present review is organized into two parts: the first part deals with the animal models that have been developed to study stroke and its related risk factors, and the second part analyzes the specific stroke models. These models represent an indispensable tool to investigate the mechanisms of cerebral injury and to develop novel therapies. PMID:20802809

  7. Predicting the risk of rheumatoid arthritis and its age of onset through modelling genetic risk variants with smoking.

    PubMed

    Scott, Ian C; Seegobin, Seth D; Steer, Sophia; Tan, Rachael; Forabosco, Paola; Hinks, Anne; Eyre, Stephen; Morgan, Ann W; Wilson, Anthony G; Hocking, Lynne J; Wordsworth, Paul; Barton, Anne; Worthington, Jane; Cope, Andrew P; Lewis, Cathryn M

    2013-01-01

    The improved characterisation of risk factors for rheumatoid arthritis (RA) suggests they could be combined to identify individuals at increased disease risk, in whom preventive strategies may be evaluated. We aimed to develop an RA prediction model capable of generating clinically relevant predictive data and to determine if it better predicted younger-onset RA (YORA). Our novel modelling approach combined odds ratios for 15 four-digit/10 two-digit HLA-DRB1 alleles, 31 single nucleotide polymorphisms (SNPs) and ever-smoking status in males to determine risk, using computer simulation and confidence-interval-based risk categorisation. Only males were evaluated in the models incorporating smoking, as ever-smoking is a significant risk factor for RA in men but not women. We developed multiple models to evaluate each risk factor's impact on prediction. Each model's ability to discriminate anti-citrullinated protein antibody (ACPA)-positive RA from controls was evaluated in two cohorts: Wellcome Trust Case Control Consortium (WTCCC: 1,516 cases; 1,647 controls) and UK RA Genetics Group Consortium (UKRAGG: 2,623 cases; 1,500 controls). HLA and smoking provided the strongest prediction, with good discrimination evidenced by an HLA-smoking model area under the curve (AUC) value of 0.813 in both WTCCC and UKRAGG. SNPs provided minimal prediction (AUC 0.660 WTCCC/0.617 UKRAGG). Whilst high individual risks were identified, with some cases having estimated lifetime risks of 86%, only a minority overall had substantially increased odds for RA. High risks from the HLA model were associated with YORA (P<0.0001); ever-smoking was associated with older-onset disease. This latter finding suggests smoking's impact on RA risk manifests later in life. Our modelling demonstrates that combining risk factors provides clinically informative RA prediction; additionally, HLA and smoking status can be used to predict the risk of younger- and older-onset RA, respectively. PMID:24068971
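
    The following sketch shows the general mechanics of combining log odds ratios into an additive risk score and evaluating its discrimination by AUC. All effect sizes, allele frequencies and the baseline risk are invented for illustration and are not the paper's estimates.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(5)

    # Illustrative (not the paper's) log odds ratios: a strong HLA term,
    # smoking, and many weak SNPs, combined into an additive risk score.
    n = 5000
    hla = rng.binomial(2, 0.25, n)            # HLA risk-allele count
    smoke = rng.binomial(1, 0.5, n)           # ever-smoking (males)
    snps = rng.binomial(2, 0.3, (n, 31))
    score = 1.1 * hla + 0.6 * smoke + snps @ rng.normal(0.05, 0.02, 31)

    # Convert the combined score to disease status via a logistic model with
    # a baseline chosen to give a low prevalence.
    p = 1.0 / (1.0 + np.exp(-(score - 6.0)))
    case = rng.binomial(1, p)

    print(f"prevalence: {case.mean():.4f}")
    print(f"AUC of combined score: {roc_auc_score(case, score):.3f}")
    ```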

  10. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    SciTech Connect

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
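
    A small Python rendering of the copula-sampling idea (the paper itself used R and WinBUGS): correlated normals are mapped through the normal CDF to uniforms, and then through inverse marginal CDFs to dependent failure times. The Weibull marginals, correlation and mission time are illustrative values.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2024)

    # Two redundant pumps with Weibull failure times whose dependence is
    # captured by a Gaussian copula with correlation rho (illustrative).
    rho, n_sim = 0.7, 200_000
    cov = np.array([[1.0, rho], [rho, 1.0]])

    # 1) correlated standard normals -> 2) uniforms via the normal CDF
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n_sim)
    u = stats.norm.cdf(z)

    # 3) uniforms -> marginal failure times via the inverse Weibull CDF
    t = stats.weibull_min.ppf(u, c=1.5, scale=10.0)   # shape 1.5, scale 10 y

    mission = 5.0   # years
    p_both_fail = np.mean((t[:, 0] < mission) & (t[:, 1] < mission))
    p_indep = stats.weibull_min.cdf(mission, c=1.5, scale=10.0) ** 2
    print(f"P(both fail) with copula: {p_both_fail:.4f}  independent: {p_indep:.4f}")
    ```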

  11. Polymorphism FXII 46C>T and cardiovascular risk: additional data from Spanish and Tunisian patients

    PubMed Central

    Athanasiadis, Georgios; Esteban, Esther; Vidal, Magdalena Gayà; Torres, Robert Carreras; Bahri, Raoudha; Moral, Pedro

    2009-01-01

    Background Previous studies showed an association between the Coagulation Factor XII 46C>T polymorphism and variation in FXII plasma levels, as 46C>T seems to affect translation efficiency. Case-control studies in Spanish samples indicated that genotype T/T is an independent risk factor for venous thrombosis, ischemic stroke and acute coronary artery disease. In this study, we tried to reaffirm the importance of 46C>T in two samples, from Spain and Tunisia. Findings A transmission disequilibrium test (TDT) based on 101 family trios from Barcelona with one offspring affected by ischemic heart disease (IHD) and a classical case-control study based on 76 patients with IHD and 118 healthy individuals from North and Centre-South Tunisia were conducted. Subjects were genotyped for 46C>T and the data were analyzed accordingly, revealing no association in either sample (TDT: P = 0.16, relative risk 1.17; case-control study: P = 0.59, odds ratio 1.36). Conclusion The results suggest that 46C>T is not a risk factor for ischemic heart disease in either of the two analyzed samples, and therefore the polymorphism does not seem to be a universal risk factor for cardiovascular diseases. PMID:19646235
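
    For reference, the two analyses reported above reduce to simple closed-form statistics; the sketch below computes a TDT chi-square from transmission counts and a case-control odds ratio with a Wald confidence interval. All counts are hypothetical, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import chi2

    # Hypothetical transmission counts from heterozygous parents in the trios:
    b = 58   # T allele transmitted, C untransmitted
    c = 46   # C transmitted, T untransmitted
    tdt = (b - c) ** 2 / (b + c)             # McNemar-type TDT statistic
    print(f"TDT chi2 = {tdt:.2f}, p = {chi2.sf(tdt, df=1):.3f}")

    # Case-control odds ratio for genotype T/T vs the rest (illustrative counts):
    tt_cases, other_cases = 9, 67
    tt_ctrls, other_ctrls = 8, 110
    or_ = (tt_cases * other_ctrls) / (tt_ctrls * other_cases)
    se = np.sqrt(1/tt_cases + 1/other_cases + 1/tt_ctrls + 1/other_ctrls)
    lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
    print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```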

  12. USING DOSE ADDITION TO ESTIMATE CUMULATIVE RISKS FROM EXPOSURES TO MULTIPLE CHEMICALS

    EPA Science Inventory

    The Food Quality Protection Act (FQPA) of 1996 requires the EPA to consider the cumulative risk from exposure to multiple chemicals that have a common mechanism of toxicity. Three methods, hazard index (HI), point-of-departure index (PODI), and toxicity equivalence factor (TEF), ...
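
    The three indices named in the abstract are all simple dose-addition sums; a sketch with invented exposures and reference values:

    ```python
    import numpy as np

    # Illustrative exposures (mg/kg-day) for three chemicals sharing a
    # common mechanism of toxicity, with hypothetical reference values.
    exposure = np.array([0.002, 0.010, 0.001])
    rfd      = np.array([0.010, 0.050, 0.002])   # reference doses
    pod      = np.array([1.0,   5.0,   0.2])     # points of departure
    tef      = np.array([1.0,   0.2,   5.0])     # potency vs index chemical

    hi = np.sum(exposure / rfd)      # hazard index: sum of hazard quotients
    podi = np.sum(exposure / pod)    # point-of-departure index
    teq = np.sum(exposure * tef)     # total index-chemical-equivalent dose
    print(f"HI = {hi:.2f}  PODI = {podi:.4f}  TEQ = {teq:.4f} mg/kg-day")
    ```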

  13. [Food additives and genetically modified food--a risk for allergic patients?].

    PubMed

    Wüthrich, B

    1999-04-01

    Adverse reactions to food and food additives must be classified according to pathogenic criteria. It is necessary to strictly differentiate between an allergy, triggered by a substance-specific immunological mechanism, and an intolerance, in which no specific immune reaction can be established. In contrast to views expressed in the media, by laymen and by patients, adverse reactions to additives are less frequent than is believed. Owing to the frequent use of "alternative" examination methods, an allergy to food additives is often wrongly blamed as the cause of a wide variety of symptoms and illness. Diagnosing an allergy or intolerance to additives normally involves carrying out double-blind, placebo-controlled oral provocation tests with food additives. Allergic reactions to food additives occur particularly with additives that are organic in origin. In principle, it is possible that during the manufacture of genetically modified plants and food, proteins are transferred which potentially create allergies. However, legislation exists both in the USA (Food and Drug Administration, FDA) and in Switzerland (Ordinance on the approval process for GM food, GM food additives and GM accessory agents for processing) which requires a careful analysis before a genetically modified product is launched, particularly where foreign genes are introduced. Products containing genetically modified organisms (GMO) as additives must be declared. In addition, the source of the foreign protein must be identified. The "Roundup Ready" (RR) soya flour introduced in Switzerland is no different from natural soya flour in terms of its allergenic potential. Genetically modified food can be a blessing for allergic individuals if gene technology were to succeed in removing the allergen (e.g. such possibilities exist for rice). The same caution shown towards genetically modified food might also be advisable for foreign food in our diet. Luckily, the immune system of the digestive tract in healthy people

  15. Electricity market pricing, risk hedging and modeling

    NASA Astrophysics Data System (ADS)

    Cheng, Xu

    In this dissertation, we investigate pricing, price risk hedging/arbitrage, and simplified system modeling for a centralized LMP-based electricity market. In an LMP-based market model, the full AC power flow model and the DC power flow model are most widely used to represent the transmission system. We investigate the differences in dispatch results, congestion patterns, and LMPs for the two power flow models. An appropriate LMP decomposition scheme to quantify the marginal costs of congestion and real power losses is critical for the implementation of financial risk hedging markets. However, the traditional LMP decomposition depends heavily on the slack bus selection. In this dissertation we propose a slack-independent scheme to break the LMP down into energy, congestion, and marginal loss components by analyzing the actual marginal cost of each bus at the optimal solution point. The physical and economic meanings of the marginal effect at each bus provide accurate price information for both congestion and losses, and thus the slack-dependency of the traditional scheme is eliminated. With electricity priced at the margin instead of at average value, the market operator typically collects more revenue from power sellers than it pays to power buyers. According to the LMP decomposition results, this revenue surplus is then divided into two parts: congestion charge surplus and marginal loss revenue surplus. We apply the LMP decomposition results to financial tools, such as the financial transmission right (FTR) and the loss hedging right (LHR), which have been introduced to hedge against price risks associated with congestion and losses, to construct a full price risk hedging portfolio. The two-settlement market structure and the introduction of financial tools inevitably create market manipulation opportunities. We investigate several possible market manipulation behaviors by virtual bidding and propose a market monitor approach to identify and quantify such

  16. Use of generalised additive models to categorise continuous variables in clinical prediction

    PubMed Central

    2013-01-01

    Background In medical practice many, essentially continuous, clinical parameters tend to be categorised by physicians for ease of decision-making. Indeed, categorisation is a common practice both in medical research and in the development of clinical prediction rules, particularly where the ensuing models are to be applied in daily clinical practice to support clinicians in the decision-making process. Since the number of categories into which a continuous predictor must be categorised depends partly on the relationship between the predictor and the outcome, the need for more than two categories must be borne in mind. Methods We propose a categorisation methodology for clinical-prediction models, using Generalised Additive Models (GAMs) with P-spline smoothers to determine the relationship between the continuous predictor and the outcome. The proposed method consists of creating at least one average-risk category along with high- and low-risk categories based on the GAM smooth function. We applied this methodology to a prospective cohort of patients with exacerbated chronic obstructive pulmonary disease. The predictors selected were respiratory rate and partial pressure of carbon dioxide in the blood (PCO2), and the response variable was poor evolution. An additive logistic regression model was used to show the relationship between the covariates and the dichotomous response variable. The proposed categorisation was compared to the continuous predictor as the best option, using the AIC and AUC evaluation parameters. The sample was divided into derivation (60%) and validation (40%) samples. The first was used to obtain the cut points, while the second was used to validate the proposed methodology. Results The three-category proposal for the respiratory rate was ≤20; (20, 24]; >24, for which the following values were obtained: AIC = 314.5 and AUC = 0.638. The respective values for the continuous predictor were AIC = 317.1 and AUC = 0.634, with no statistically
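
    A rough sketch of the categorisation idea on synthetic data: fit a logistic GAM with a spline smoother, then place cut points where the smooth risk crosses a band around the average risk. The pygam library, the ±20% band and all data are illustrative assumptions; the paper derives its cut points from P-spline GAMs together with clinical considerations.

    ```python
    import numpy as np
    from pygam import LogisticGAM, s

    rng = np.random.default_rng(9)

    # Synthetic stand-in for the respiratory-rate example: risk of poor
    # evolution rises nonlinearly with the predictor.
    n = 1200
    x = rng.uniform(10, 35, n)
    p_true = 1 / (1 + np.exp(-(-6 + 0.25 * x)))
    y = rng.binomial(1, p_true)

    gam = LogisticGAM(s(0)).fit(x.reshape(-1, 1), y)

    # Categorise where the smooth crosses a band around the average risk:
    grid = np.linspace(10, 35, 200).reshape(-1, 1)
    risk = gam.predict_proba(grid)
    avg = y.mean()
    low = grid[risk < 0.8 * avg].max()    # upper edge of the low-risk band
    high = grid[risk > 1.2 * avg].min()   # lower edge of the high-risk band
    print(f"categories: <= {low:.0f} | ({low:.0f}, {high:.0f}] | > {high:.0f}")
    ```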

  17. Predicting contamination by the fuel additive cerium oxide engineered nanoparticles within the United Kingdom and the associated risks.

    PubMed

    Johnson, Andrew C; Park, Barry

    2012-11-01

    As a fuel additive, cerium oxide nanoparticles may become widely dispersed throughout the environment. Commercial information from the United Kingdom (UK) on the use of cerium oxide nanoparticles was used to perform a modeling and risk assessment exercise. Modeled discharge from exhausts took into account the likely removal by filters fitted to these vehicles. For predicting current soil exposure, scenarios were examined ranging from dispersion across the entire UK landmass, to dispersion only within urban areas, to dispersion only within 20 m on either side of road networks. For soils, the highest predicted contamination level was 0.016 mg/kg within 20 m of a road following seven years of continuous deposition; this value would represent 0.027% of the reported natural background cerium. If usage were to double for five more years, levels would not be expected to exceed 0.04 mg/kg. The river water contamination estimate considered direct aerial deposition and indirect contamination via runoff in the water and entrained soil sediment, with a highest level of 0.02 ng/L predicted. The highest predicted water concentration, 300 ng/L, was associated with water draining from a road surface, assuming a restricted deposition spread. These predictions are well below most toxicological levels of concern.
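
    The soil predictions above are, at their core, mass-balance estimates. A back-of-envelope sketch of that kind of calculation follows, with every input an illustrative assumption rather than the study's value:

    ```python
    # Back-of-envelope soil deposition estimate of the kind used in the paper
    # (all numbers here are illustrative assumptions, not the study's inputs).
    fuel_used_L = 5.0e9          # diesel burned by treated vehicles over 7 years
    dose_mg_per_L = 5.0          # CeO2 added per litre of fuel
    filter_retention = 0.85      # fraction trapped by exhaust filters
    deposit_area_m2 = 2.0e9      # strip 20 m either side of treated routes
    mix_depth_m = 0.1            # soil mixing depth
    bulk_density_kg_m3 = 1300.0

    emitted_mg = fuel_used_L * dose_mg_per_L * (1.0 - filter_retention)
    soil_mass_kg = deposit_area_m2 * mix_depth_m * bulk_density_kg_m3
    conc_mg_per_kg = emitted_mg / soil_mass_kg
    print(f"predicted soil concentration: {conc_mg_per_kg:.4f} mg/kg")
    ```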

  18. Hazard and risk assessment of a nanoparticulate cerium oxide-based diesel fuel additive - a case study.

    PubMed

    Park, Barry; Donaldson, Kenneth; Duffin, Rodger; Tran, Lang; Kelly, Frank; Mudway, Ian; Morin, Jean-Paul; Guest, Robert; Jenkinson, Peter; Samaras, Zissis; Giannouli, Myrsini; Kouridis, Haris; Martin, Patricia

    2008-04-01

    Envirox is a scientifically and commercially proven diesel fuel combustion catalyst based on nanoparticulate cerium oxide and has been demonstrated to reduce fuel consumption, greenhouse gas emissions (CO2), and particulate emissions when added to diesel at levels of 5 mg/L. Studies have confirmed the adverse effects of particulates on respiratory and cardiac health, and while the use of Envirox contributes to a reduction in the particulate content of the air, it is necessary to demonstrate that the addition of Envirox does not alter the intrinsic toxicity of particles emitted in the exhaust. The purpose of this study was to evaluate the safety in use of Envirox by addressing the classical risk paradigm. Hazard assessment has been addressed by examining a range of in vitro cell and cell-free endpoints to assess the toxicity of cerium oxide nanoparticles as well as particulates emitted from engines using Envirox. Exposure assessment has taken data from modeling studies and from airborne monitoring sites in London and Newcastle adjacent to routes where vehicles using Envirox passed. Data have demonstrated that, for the exposure levels measured, the estimated internal dose for a reference human in a chronic exposure scenario is much lower than the no-observed-effect level (NOEL) in the in vitro toxicity studies. Exposure to nano-size cerium oxide as a result of the addition of Envirox to diesel fuel at the current levels of exposure in ambient air is therefore unlikely to lead to pulmonary oxidative stress and inflammation, which are the precursors for respiratory and cardiac health problems. PMID:18444008

  19. Modeling risk of occupational zoonotic influenza infection in swine workers.

    PubMed

    Paccha, Blanca; Jones, Rachael M; Gibbs, Shawn; Kane, Michael J; Torremorell, Montserrat; Neira-Ramirez, Victor; Rabinowitz, Peter M

    2016-08-01

    Zoonotic transmission of influenza A virus (IAV) between swine and workers in swine production facilities may play a role in the emergence of novel influenza strains with pandemic potential. Guidelines to prevent transmission of influenza to swine workers have been developed, but there is a need for evidence-based decision-making about protective measures such as respiratory protection. A mathematical model was applied to estimate the risk of occupational IAV exposure of swine workers by contact and airborne transmission, and to evaluate the use of respirators to reduce transmission. A Markov model was used to simulate the transport of IAV in a swine facility and workers' exposure to it. A dose-response function was used to estimate the risk of infection. This approach is similar to methods previously used to estimate the risk of infection in human health care settings. The study uses concentrations of virus in air from field measurements collected during outbreaks of influenza in commercial swine facilities and analyzed by polymerase chain reaction. It was found that spending 25 min working in a barn during an influenza outbreak in a swine herd could be sufficient to cause zoonotic infection in a worker. However, this risk estimate was sensitive to estimates of viral infectivity to humans. Wearing a well-fitted N95 respirator reduced this risk, but with high aerosol levels the predicted risk of infection remained high under certain assumptions. The results of this analysis indicate that, under the conditions studied, swine workers are at risk of zoonotic influenza infection. The use of an N95 respirator could reduce such risk. These findings have implications for risk assessment and preventive programs targeting swine workers. The exact level of risk remains uncertain, since our model may have overestimated the viability or infectivity of IAV. Additionally, the potential for partial immunity in swine workers associated with repeated low
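
    The risk calculation described above can be caricatured with a standard exponential dose-response model; the sketch below propagates an airborne concentration to infection risk with and without a respirator. The concentration, breathing rate, infectivity and assigned protection factor are all illustrative assumptions, not the study's fitted inputs.

    ```python
    import numpy as np

    # Exposure-to-risk sketch under an exponential dose-response model
    # P(infection) = 1 - exp(-r * dose); r and all exposure inputs are
    # illustrative assumptions, not the study's fitted values.
    c_air = 5.0e4          # virus RNA copies per m^3 of barn air
    inhal_rate = 1.2       # m^3/h, light work
    duration_h = 25 / 60   # 25 minutes in the barn
    infectivity = 1e-5     # infections per inhaled copy (highly uncertain)

    def p_infect(dose):
        return 1.0 - np.exp(-infectivity * dose)

    dose = c_air * inhal_rate * duration_h
    print(f"unprotected risk: {p_infect(dose):.2%}")

    # N95 respirator modeled through an assigned protection factor of 10
    # (inhaled dose divided by 10 for a well-fitted respirator).
    print(f"with N95 (APF 10): {p_infect(dose / 10):.2%}")
    ```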

  20. Modeling risk of pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Nowak, J. Joshua; Lukacs, Paul M.; Anderson, Neil J.; Ramsey, Jennifer M.; Gude, Justin A.; Krausman, Paul R.

    2015-01-01

    Pneumonia epizootics are a major challenge for management of bighorn sheep (Ovis canadensis), affecting persistence of herds, satisfaction of stakeholders, and allocations of resources by management agencies. Risk factors associated with the disease are poorly understood, making pneumonia epizootics hard to predict; such epizootics are thus managed reactively rather than proactively. We developed a model for herds in Montana that identifies risk factors and addresses biological questions about risk. Using Bayesian logistic regression with repeated measures, we found that private land, weed control using domestic sheep or goats, pneumonia history, and herd density were positively associated with risk of pneumonia epizootics in 43 herds that experienced 22 epizootics out of 637 herd-years from 1979-2013. We defined an area of high risk for pathogen exposure as the area of each herd distribution plus a 14.5-km buffer from that boundary. Within this area, the odds of a pneumonia epizootic increased by >1.5 times per additional unit of private land (the unit is the standardized % of private land, with overall mean = 25.58% and SD = 14.53%). Odds were >3.3 times greater if domestic sheep or goats were used for weed control in a herd's area of high risk. If a herd or its neighbors within the area of high risk had a history of a pneumonia epizootic, odds of a subsequent pneumonia epizootic were >10 times greater. Risk greatly increased when herds were at high density, with nearly 15 times greater odds of a pneumonia epizootic compared to when herds were at low density. Odds of a pneumonia epizootic also appeared to decrease following increased spring precipitation (odds = 0.41 per unit increase, overall mean = 100.18% and SD = 26.97%). Risk was not associated with number of federal sheep and goat allotments, proximity to nearest herds of bighorn sheep, ratio of rams to ewes, percentage of average winter precipitation, or whether herds were of native versus mixed

  1. An animal model of differential genetic risk for methamphetamine intake.

    PubMed

    Phillips, Tamara J; Shabani, Shkelzen

    2015-01-01

    The question of whether genetic factors contribute to risk for methamphetamine (MA) use and dependence has not been intensively investigated. Compared to human populations, genetic animal models offer the advantages of control over genetic family history and drug exposure. Using selective breeding, we created lines of mice that differ in genetic risk for voluntary MA intake and identified the chromosomal addresses of contributory genes. A quantitative trait locus was identified on chromosome 10 that accounts for more than 50% of the genetic variance in MA intake in the selected mouse lines. In addition, behavioral and physiological screening identified differences corresponding with risk for MA intake that have generated hypotheses that are testable in humans. Heightened sensitivity to aversive and certain physiological effects of MA, such as MA-induced reduction in body temperature, are hallmarks of mice bred for low MA intake. Furthermore, unlike MA-avoiding mice, MA-preferring mice are sensitive to rewarding and reinforcing MA effects, and to MA-induced increases in brain extracellular dopamine levels. Gene expression analyses implicate the importance of a network enriched in transcription factor genes, some of which regulate the mu opioid receptor gene, Oprm1, in risk for MA use. Neuroimmune factors appear to play a role in differential response to MA between the mice bred for high and low intake. In addition, chromosome 10 candidate gene studies provide strong support for a trace amine-associated receptor 1 gene, Taar1, polymorphism in risk for MA intake. MA is a trace amine-associated receptor 1 (TAAR1) agonist, and a non-functional Taar1 allele segregates with high MA consumption. Thus, reduced TAAR1 function has the potential to increase risk for MA use. Overall, existing findings support the MA drinking lines as a powerful model for identifying genetic factors involved in determining risk for harmful MA use. Future directions include the development of a

  3. A hybrid likelihood algorithm for risk modelling.

    PubMed

    Kellerer, A M; Kreisheimer, M; Chmelevsky, D; Barclay, D

    1995-03-01

    The risk of radiation-induced cancer is assessed through the follow-up of large cohorts, such as atomic bomb survivors or underground miners who have been occupationally exposed to radon and its decay products. The models relate to the dose, age and time dependence of the excess tumour rates, and they contain parameters that are estimated in terms of maximum likelihood computations. The computations are performed with the software package EPI-CURE, which contains the two main options of person-by-person regression or of Poisson regression with grouped data. The Poisson regression is most frequently employed, but there are certain models that require an excessive number of cells when grouped data are used. One example involves computations that account explicitly for the temporal distribution of continuous exposures, as they occur with underground miners. In past work such models had to be approximated, but it is shown here that they can be treated explicitly in a suitably reformulated person-by-person computation of the likelihood. The algorithm uses the familiar partitioning of the log-likelihood into two terms, L1 and L0. The first term, L1, represents the contribution of the 'events' (tumours). It needs to be evaluated in the usual way, but constitutes no computational problem. The second term, L0, represents the event-free periods of observation. It is, in its usual form, unmanageable for large cohorts. However, it can be reduced to a simple form, in which the number of computational steps is independent of cohort size. The method requires less computing time and computer memory, but more importantly it leads to more stable numerical results by obviating the need for grouping the data. The algorithm may be most relevant to radiation risk modelling, but it can facilitate the modelling of failure-time data in general. PMID:7604154
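
    The partition described above can be seen in miniature for a constant-rate model, where the event-free term L0 collapses to a single sum of person-times, so its cost does not depend on how the cohort is grouped. The sketch below is a toy illustration under that assumption, not the EPI-CURE implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(17)

    # Person-by-person log-likelihood for a rate model, split as L = L1 + L0:
    # L1 sums log(rate) over the events; L0 is minus the integrated rate over
    # each person's event-free observation time. Constant-rate toy example.
    n = 10_000
    lam_true = 0.02                        # events per person-year
    follow_up = rng.uniform(5.0, 30.0, n)  # years observed
    t_event = rng.exponential(1.0 / lam_true, n)
    event = t_event < follow_up
    t_obs = np.where(event, t_event, follow_up)

    def loglik(lam):
        L1 = np.sum(event) * np.log(lam)   # contribution of the 'events'
        L0 = -lam * np.sum(t_obs)          # event-free observation periods
        return L1 + L0

    # L0 needs no grouping: for a constant rate it reduces to one sum of
    # person-times, so the cost is independent of how the cohort is binned.
    grid = np.linspace(0.005, 0.05, 200)
    ll = [loglik(l) for l in grid]
    print(f"MLE rate: {grid[np.argmax(ll)]:.4f} (true {lam_true})")
    ```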

  4. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2006-01-01

    The development of techniques for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often complicated by anaphylactic shock, in a subset of allergic babies. Children with FA, even if subjected to preventative diets, always face the risk of developing allergic manifestations after unintentional intake of a non-tolerated food in restaurants, with relatives or schoolmates, etc., where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can happen deliberately, by either substantially altering the food ingredients or by genetic manipulations which change the composition or transfer allergens, or unintentionally, by quality-control failures due to contamination in the production process or to genetic mismanipulation. There is a controversy between multinationals, often favored by governments, and consumer associations, so an even-handed analysis faces some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen expressed in widely used infant formulas, paving the way to an often-reported multinational debacle. Genetic engineering poses novel ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. This paper emphasizes the practical concepts most crucial for pediatricians.

  5. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2009-01-01

    The development of techniques devised for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often complicating with anaphylactic shock in a subset of allergic babies. Children with FA, even if subjected to preventative diets, always challenge the risk of developing allergic manifestations after unintentional intake of a non tolerated food in restaurant settings, with relatives or schoolmates, etc, where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can be done deliberately, by either substantially altering the food ingredients, or by genetic manipulation which change the composition or transfer allergens, or unintentionally by qualitycontrol failures, due to contaminations in the production process, or to genetic mismanipulation. There is a controversy between multinationals often favored by governments and consumer association resistance, thus an equidistant analysis poses some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen ex-pressed in widely used infant formulas, so paving the way to an often reported multinational debacle. Genetic engineering poses innovative ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. In this paper will be emphasized practical concepts more crucial for pediatricians.

  6. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2006-01-01

    The development of techniques devised for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often complicating with anaphylactic shock in a subset of allergic babies. Children with FA, even if subjected to preventative diets, always challenge the risk of developing allergic manifestations after unintentional intake of a non tolerated food in restaurant settings, with relatives or schoolmates, etc, where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can be done deliberately, by either substantially altering the food ingredients, or by genetic manipulation which change the composition or transfer allergens, or unintentionally by quality-control failures, due to contaminations in the production process, or to genetic mismanipulation. There is a controversy between multinationals often favored by governments and consumer association resistance, thus an equidistant analysis poses some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen expressed in widely used infant formulas, so paving the way to an often reported multinational debacle. Genetic engineering poses innovative ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. In this paper will be emphasized practical concepts more crucial for pediatricians. PMID:16910351

  7. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2009-01-01

    The development of techniques devised for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often complicated by anaphylactic shock, in a subset of allergic babies. Children with FA, even if kept on preventative diets, always face the risk of developing allergic manifestations after unintentional intake of a non-tolerated food in restaurant settings, with relatives or schoolmates, etc., where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can occur deliberately, either by substantially altering the food ingredients or by genetic manipulation that changes the composition or transfers allergens, or unintentionally, through quality-control failures due to contamination in the production process or to genetic mismanipulation. There is a controversy between multinationals, often favored by governments, and consumer-association resistance, so an equidistant analysis poses some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen expressed in widely used infant formulas, paving the way to an often reported multinational debacle. Genetic engineering poses novel ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. This paper emphasizes the practical concepts most crucial for pediatricians. PMID:19364084

  8. Social models of HIV risk among young adults in Lesotho.

    PubMed

    Bulled, Nicola L

    2015-01-01

    Extensive research over the past 30 years has revealed that individual and social determinants impact HIV risk. Even so, prevention efforts focus primarily on individual behaviour change, with little recognition of the dynamic interplay of individual and social environment factors that further exacerbate risk engagement. Drawing on long-term research with young adults in Lesotho, I examine how social environment factors contribute to HIV risk. During preliminary ethnographic analysis, I developed novel scales to measure social control, adoption of modernity, and HIV knowledge. In survey research, I examined the effects of individual characteristics (i.e., socioeconomic status, HIV knowledge, adoption of modernity) and social environment (i.e., social control) on HIV risk behaviours. In addition, I measured the impact of altered environments by taking advantage of an existing situation whereby young adults attending a national college are assigned to either a main campus in a metropolitan setting or a satellite campus in a remote setting, irrespective of the environment in which they were socialised as youth. This arbitrary assignment process generates four distinct groups of young adults with altered or constant environments. Regression models show that lower levels of perceived social control and greater adoption of modernity are associated with HIV risk, controlling for other factors. The impact of social control and modernity varies with environment dynamics. PMID:26284999

  10. The developmental 'risk factor' model of schizophrenia.

    PubMed

    Murray, R M; Fearon, P

    1999-01-01

    There is no single cause for schizophrenia. We believe that, as with other common chronic diseases such as diabetes and coronary artery disease, the appropriate aetiological model is one involving multiple genes and environmental risk factors; the latter can be divided into (a) predisposing and (b) precipitating. Our model is that genetic and/or early environmental factors cause the development of anomalous neural networks. We postulate that these interact in the growing child with inherited schizotypal traits to establish a trajectory towards an increasingly solitary and deviant life style. This ultimately projects the individual across the threshold for expression of schizophrenia, sometimes by causing the drug abuse and social adversity that appear to precipitate the psychosis. PMID:10628525

  11. Extreme Earthquake Risk Estimation by Hybrid Modeling

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Garcia, S.; Emerson, D.; Perea, N.; Salazar, A.; Moulinec, C.

    2012-12-01

    The estimation of the hazard and of the economic consequences, i.e. the risk, associated with the occurrence of extreme-magnitude earthquakes in the neighborhood of urban or lifeline infrastructure, such as the 11 March 2011 Mw 9 Tohoku, Japan, event, represents a complex challenge, as it involves the propagation of seismic waves through large volumes of the Earth's crust, from unusually large seismic source ruptures up to the infrastructure location. The large number of casualties and huge economic losses observed for those earthquakes, some of which have a frequency of occurrence of hundreds or thousands of years, calls for the development of new paradigms and methodologies in order to generate better estimates both of the seismic hazard and of its consequences, and, if possible, to estimate the probability distributions of their ground intensities and of their economic impacts (direct and indirect losses), in order to implement technological and economic policies to mitigate and reduce those consequences as much as possible. Herewith, we propose a hybrid modeling approach which uses 3D seismic wave propagation (3DWP) and neural network (NN) modeling in order to estimate the seismic risk of extreme earthquakes. The 3DWP modeling is achieved by using a 3D finite difference code run on the ~100,000-core Blue Gene/Q supercomputer of the STFC Daresbury Laboratory in the UK, combined with empirical Green function (EGF) techniques and NN algorithms. In particular, the 3DWP is used to generate broadband samples of the 3D wave propagation of plausible extreme-earthquake scenarios corresponding to synthetic seismic sources, and to enlarge those samples by using feed-forward NNs. We present the results of the validation of the proposed hybrid modeling for Mw 8 subduction events, and show examples of its application to the estimation of the hazard and the economic consequences for extreme Mw 8.5 subduction earthquake scenarios with seismic sources in the Mexican
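
    As a loose sketch of the NN sample-enlargement step described above, the Python snippet below fits a small feed-forward network to synthetic magnitude-distance-intensity data and queries it at new scenarios. The attenuation-style target, the parameter ranges and every numeric value are invented placeholders for the 3DWP outputs, not data from the study.

      # Hypothetical stand-in for enlarging simulated ground-motion samples
      # with a feed-forward neural network; all data here are synthetic.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      X = rng.uniform([7.5, 10.0], [9.0, 200.0], size=(200, 2))  # (Mw, distance km)
      # synthetic intensity: grows with magnitude, decays with distance
      y = 10 ** (0.5 * X[:, 0] - 1.3 * np.log10(X[:, 1]) - 1.0)
      y *= rng.lognormal(0.0, 0.3, size=200)                     # aleatory scatter

      net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
      net.fit(X, np.log10(y))                                    # fit in log space

      new_scenarios = np.array([[8.5, 50.0], [8.5, 150.0]])
      print(10 ** net.predict(new_scenarios))                    # "enlarged" samples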

  12. Integrating Professional and Folk Models of HIV Risk: YMSM's Perceptions of High-Risk Sex

    ERIC Educational Resources Information Center

    Kubicek, Katrina; Carpineto, Julie; McDavitt, Bryce; Weiss, George; Iverson, Ellen F.; Au, Chi-Wai; Kerrone, Dustin; Martinez, Miguel; Kipke, Michele D.

    2008-01-01

    Risks associated with HIV are well documented in research literature. Although a great deal has been written about high-risk sex, little research has been conducted to examine how young men who have sex with men (YMSM) perceive and define high-risk sexual behavior. In this study, we compare the "professional" and "folk" models of HIV risk based on…

  13. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  14. The globalization of risk and risk perception: why we need a new model of risk communication for vaccines.

    PubMed

    Larson, Heidi; Brocard Paterson, Pauline; Erondu, Ngozi

    2012-11-01

    Risk communication about vaccines is complex, and the nature of risk perception is changing, with perceptions converging, evolving and having impacts well beyond specific geographic localities and points in time, especially when amplified through the Internet and other modes of global communication. This article examines the globalization of risk perceptions and their impacts, including the example of measles and the globalization of measles, mumps and rubella (MMR) vaccine risk perceptions, and calls for a new, more holistic model of risk assessment, risk communication and risk mitigation, embedded in an ongoing process of risk management for vaccines and immunization programmes. It envisions risk communication as an ongoing process that includes trust-building strategies hand-in-hand with the operational and policy strategies needed to mitigate and manage vaccine-related risks, as well as perceptions of risk.

  15. Concentration addition-based approach for aquatic risk assessment of realistic pesticide mixtures in Portuguese river basins.

    PubMed

    Silva, Emília; Cerejeira, Maria José

    2015-05-01

    A two-tiered outline for the predictive environmental risk assessment of chemical mixtures, with effect assessments based on concentration addition (CA) approaches as the first tier and consideration of independent action (IA) as the second tier, was applied to realistic pesticide mixtures measured in surface waters from 2002 to 2008 within three important Portuguese river basins ('Mondego', 'Sado' and 'Tejo'). The CA-based risk quotients, based on acute data and an assessment factor of 100, exceeded 1 in more than 39% of the 281 samples, indicating a potential risk for the aquatic environment, namely to algae. Seven herbicide compounds and three insecticides were the most toxic compounds in the pesticide mixtures and provided at least 50% of the mixture's toxicity in almost 100% of the samples with risk quotients based on the sum of toxic units (RQ_STU) above 1. In eight samples, the maximum cumulative ratio (MCR) and Junghans ratio values indicated that a chemical-by-chemical approach underestimated the toxicity of the pesticide mixtures, and CA predicted higher mixture toxicity than IA. From a risk management perspective, the results point out that deriving appropriate programmes of measures for a limited number of pesticides with the highest contribution to the total mixture toxicity could also produce relevant benefits on mixture impact. PMID:25424034
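
    A minimal sketch of the concentration-addition arithmetic used above: toxic units are summed into RQ_STU, and the maximum cumulative ratio is the sum of toxic units divided by the largest single toxic unit. The compound names, concentrations and EC50 values are hypothetical, not the study's data (Python).

      # Concentration-addition screening with hypothetical measurements.
      pesticides = {
          # name: (measured concentration in ug/L, algal EC50 in ug/L)
          "atrazine":       (0.8, 60.0),
          "terbuthylazine": (0.5, 16.0),
          "chlorpyrifos":   (0.1, 480.0),
      }
      AF = 100  # assessment factor applied to acute data, as in the study

      toxic_units = {n: c / ec50 for n, (c, ec50) in pesticides.items()}
      stu = sum(toxic_units.values())         # sum of toxic units (CA)
      rq_stu = stu * AF                       # CA-based risk quotient
      mcr = stu / max(toxic_units.values())   # maximum cumulative ratio

      print(f"RQ_STU = {rq_stu:.2f} (potential risk if > 1)")
      print(f"MCR    = {mcr:.2f} (near 1 means one compound dominates)")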

  16. Assessing patients' risk of febrile neutropenia: is there a correlation between physician-assessed risk and model-predicted risk?

    PubMed

    Lyman, Gary H; Dale, David C; Legg, Jason C; Abella, Esteban; Morrow, Phuong Khanh; Whittaker, Sadie; Crawford, Jeffrey

    2015-08-01

    This study evaluated the correlation between the risk of febrile neutropenia (FN) estimated by physicians and the risk of severe neutropenia or FN predicted by a validated multivariate model in patients with nonmyeloid malignancies receiving chemotherapy. Before patient enrollment, physician and site characteristics were recorded, and physicians self-reported the FN risk at which they would typically consider granulocyte colony-stimulating factor (G-CSF) primary prophylaxis (FN risk intervention threshold). For each patient, physicians electronically recorded their estimated FN risk, orders for G-CSF primary prophylaxis (yes/no), and patient characteristics for model predictions. Correlations between physician-assessed FN risk and model-predicted risk (primary endpoints) and between physician-assessed FN risk and G-CSF orders were calculated. Overall, 124 community-based oncologists registered; 944 patients initiating chemotherapy with intermediate FN risk enrolled. Median physician-assessed FN risk over all chemotherapy cycles was 20.0%, and median model-predicted risk was 17.9%; the correlation was 0.249 (95% CI, 0.179-0.316). The correlation between physician-assessed FN risk and subsequent orders for G-CSF primary prophylaxis (n = 634) was 0.313 (95% CI, 0.135-0.472). Among patients with a physician-assessed FN risk ≥ 20%, 14% did not receive G-CSF orders. G-CSF was not ordered for 16% of patients at or above their physician's self-reported FN risk intervention threshold (median, 20.0%) and was ordered for 21% below the threshold. Physician-assessed FN risk and model-predicted risk correlated weakly; however, there was moderate correlation between physician-assessed FN risk and orders for G-CSF primary prophylaxis. Further research and education on FN risk factors and appropriate G-CSF use are needed.
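
    A correlation interval of the kind reported above can be obtained with the Fisher z-transform. The sketch below assumes one independent observation per patient, so it need not reproduce the study's exact interval.

      import math
      from statistics import NormalDist

      def pearson_ci(r, n, alpha=0.05):
          """Two-sided CI for a Pearson correlation via Fisher's z."""
          z = math.atanh(r)
          se = 1.0 / math.sqrt(n - 3)
          zc = NormalDist().inv_cdf(1.0 - alpha / 2.0)
          return math.tanh(z - zc * se), math.tanh(z + zc * se)

      print(pearson_ci(0.249, 944))   # about (0.19, 0.31)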

  17. Multiprocessing and Correction Algorithm of 3D-models for Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Anamova, R. R.; Zelenov, S. V.; Kuprikov, M. U.; Ripetskiy, A. V.

    2016-07-01

    This article addresses matters related to additive manufacturing preparation. A layer-by-layer model presentation was developed on the basis of a routing method. Methods for correction of errors in the layer-by-layer model presentation were developed. A multiprocessing algorithm for forming an additive manufacturing batch file was realized.
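
    Any layer-by-layer presentation ultimately rests on slicing the triangle mesh with horizontal planes. The function below is a generic single-triangle slice for illustration, not the authors' routing-based method; vertices lying exactly on the plane are ignored in this sketch.

      def slice_triangle(tri, z0):
          """Segment where triangle `tri` (three xyz points) meets z = z0."""
          pts = []
          for i in range(3):
              (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
              if (z1 - z0) * (z2 - z0) < 0:            # edge crosses the plane
                  t = (z0 - z1) / (z2 - z1)            # linear interpolation
                  pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
          return pts if len(pts) == 2 else None

      tri = [(0, 0, 0), (1, 0, 2), (0, 1, 2)]
      print(slice_triangle(tri, 1.0))   # [(0.5, 0.0), (0.0, 0.5)]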

  18. Insulin resistance: an additional risk factor in the pathogenesis of cardiovascular disease in type 2 diabetes.

    PubMed

    Patel, Tushar P; Rawal, Komal; Bagchi, Ashim K; Akolkar, Gauri; Bernardes, Nathalia; Dias, Danielle da Silva; Gupta, Sarita; Singal, Pawan K

    2016-01-01

    Sedentary lifestyles and high-calorie dietary habits are leading causes of metabolic syndrome in the modern world. Obesity plays a central role in the occurrence of conditions such as hyperinsulinemia, hyperglycemia and hyperlipidemia, which lead to insulin resistance and metabolic derangements, including cardiovascular diseases (CVDs) mediated by oxidative stress. The mortality rate due to CVDs is on the rise in developing countries. Insulin resistance (IR) leads to micro- or macroangiopathy, peripheral arterial dysfunction, hampered blood flow, hypertension, and cardiomyocyte and endothelial cell dysfunction, thus increasing risk factors for coronary artery blockage, stroke and heart failure, suggesting that there is a strong association between IR and CVDs. The plausible linkages between these two pathophysiological conditions are altered levels of insulin signaling proteins such as IR-β, IRS-1, PI3K, Akt, Glut4 and PGC-1α that hamper insulin-mediated glucose uptake as well as other functions of insulin in the cardiomyocytes and the endothelial cells of the heart. Reduced AMPK and PFK-2 and elevated levels of NADP(H)-dependent oxidases produced by activated M1 macrophages of the adipose tissue, together with elevated levels of circulating angiotensin, are also causes of CVD in diabetes mellitus. Insulin sensitizers, angiotensin blockers and superoxide scavengers are used as therapeutics in the amelioration of CVD. It evidently becomes important to unravel the mechanisms of the association between IR and CVDs in order to formulate novel, efficient drugs to treat patients suffering from insulin resistance-mediated cardiovascular diseases. The possible associations between insulin resistance and cardiovascular diseases are reviewed here. PMID:26542377

  19. Risk assessment compatible fire models (RACFMs)

    SciTech Connect

    Lopez, A.R.; Gritzo, L.A.; Sherman, M.P.

    1998-07-01

    A suite of Probabilistic Risk Assessment Compatible Fire Models (RACFMs) has been developed to represent the hazard posed by a pool fire to weapon systems transported on the B-52H aircraft. These models represent both stand-off scenarios (i.e., the weapon system is outside of the flame zone but exposed to the radiant heat load from the fire) and fully-engulfing scenarios (i.e., the object is fully covered by flames). The approach taken in developing the RACFMs for both scenarios was to consolidate, reconcile, and apply data and knowledge from all available resources, including data and correlations from the literature, data from an extensive full-scale fire test program at the Naval Air Warfare Center (NAWC) at China Lake, and results from a fire field model (VULCAN). In the past, a single effective temperature, T_f, was used to represent the fire. The heat flux to an object exposed to a fire was estimated using the relationship for black-body radiation, σT_f⁴. Significant improvements have been made by employing the present approach, which accounts for the presence of temperature distributions in fully-engulfing fires and uses the best available correlations to estimate heat fluxes in stand-off scenarios.
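
    The single-temperature approach that the abstract contrasts with the RACFMs reduces to one line of arithmetic; the flame temperature below is illustrative only.

      SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
      T_f = 1273.0             # assumed effective flame temperature, K
      q = SIGMA * T_f ** 4     # black-body radiant flux, W/m^2
      print(f"q = {q / 1000:.0f} kW/m^2")   # about 149 kW/m^2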

  20. Probiotics as Additives on Therapy in Allergic Airway Diseases: A Systematic Review of Benefits and Risks

    PubMed Central

    Das, Rashmi Ranjan; Naik, Sushree Samiksha; Singh, Meenu

    2013-01-01

    Background. We conducted a systematic review to find out the role of probiotics in treatment of allergic airway diseases.  Methods. A comprehensive search of the major electronic databases was done till March 2013. Trials comparing the effect of probiotics versus placebo were included. A predefined set of outcome measures was assessed. Continuous data were expressed as standardized mean difference with 95% CI. Dichotomous data were expressed as odds ratio with 95% CI. A P value < 0.05 was considered significant. Results. A total of 12 studies were included. Probiotic intake was associated with a significantly improved quality of life score in patients with allergic rhinitis (SMD −1.9 (95% CI −3.62, −0.19); P = 0.03), though there was a high degree of heterogeneity. No improvement in quality of life score was noted in asthmatics. Probiotic intake also improved the following parameters: longer time free from episodes of asthma and rhinitis and decrease in the number of episodes of rhinitis per year. Adverse events were not significant. Conclusion. As the current evidence was generated from few trials with a high degree of heterogeneity, routine use of probiotics as an additive on therapy in subjects with allergic airway diseases cannot be recommended. PMID:23956972

  1. Breast Cancer Risk Assessment SAS Macro (Gail Model)

    Cancer.gov

    A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.
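
    The macro implements NCI's published BCRAT algorithm. As a deliberately simplified, hypothetical illustration of what an absolute-risk projection involves, the sketch below combines a constant baseline hazard, a single relative risk and competing mortality; none of these numbers or formulas are the Gail model's actual inputs.

      import math

      def absolute_risk(h0, rr, years, competing_mortality):
          """P(disease within `years`) for hazard h0*rr, discounted by the
          chance of dying of other causes first (constant-hazard sketch)."""
          h, m = h0 * rr, competing_mortality
          return (h / (h + m)) * (1.0 - math.exp(-(h + m) * years))

      print(f"{absolute_risk(0.003, 1.8, 5, 0.01):.3f}")   # about 0.026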

  2. Posttraumatic stress disorder, alone or additively with early life adversity, is associated with obesity and cardiometabolic risk

    PubMed Central

    Farr, Olivia M.; Ko, Byung-Joon; Joung, Kyoung Eun; Zaichenko, Lesya; Usher, Nicole; Tsoukas, Michael; Thakkar, Bindiya; Davis, Cynthia R.; Crowell, Judith A.; Mantzoros, Christos S.

    2015-01-01

    Background and Aims There is some evidence that posttraumatic stress disorder (PTSD) and early life adversity may influence metabolic outcomes such as obesity, diabetes, and cardiovascular disease. However, whether and how these interact is not clear. Methods We analyzed data from a cross-sectional and a longitudinal study to determine how PTSD severity influences obesity, insulin sensitivity, and key measures and biomarkers of cardiovascular risk. We then looked at how PTSD and early life adversity may interact to impact these same outcomes. Results PTSD severity is associated with increasing risk of obesity, diabetes, and cardiovascular disease, with higher symptoms correlating with higher values of BMI, leptin, fibrinogen, and blood pressure, and lower values of insulin sensitivity. PTSD and early life adversity have an additive effect on these metabolic outcomes. The longitudinal study confirmed findings from the cross-sectional study and showed that fat mass, leptin, CRP, ICAM, and TNFRII were significantly increased with higher PTSD severity during a 2.5-year follow-up period. Conclusions Individuals with early life adversity and PTSD are at high risk and should be monitored carefully for obesity, insulin resistance, and cardiometabolic risk. PMID:25770759

  3. Measuring and modelling pollution for risk analysis.

    PubMed

    Zidek, J V; Le, N D

    1999-01-01

    The great scale and complexity of environmental risk analysis offers major methodological challenges to those engaged in policymaking. In this paper we describe some of those challenges from the perspective gained through our work at the University of British Columbia (UBC). We describe some of our experiences with respect to the difficult problems of formulating environmental standards and developing abatement strategies. A failed but instructive attempt to find support for experiments on a promising method of reducing acid rain will be described. Then we describe an approach to scenario analysis under hypothetical new standards. Even with measurements of ambient environmental conditions in hand the problem of inferring actual human exposures remains. For example, in very hot weather people will tend to stay inside and population levels of exposure to e.g. ozone could be well below those predicted by the ambient measurements. Setting air quality criteria should ideally recognize the discrepancies likely to arise. Computer models that incorporate spatial random pollution fields and predict actual exposures from ambient levels will be described. From there we turn to the statistical issues of measurement and modelling and some of the contributions in these areas by the UBC group and its partners elsewhere. In particular we discuss the problem of measurement error when non-linear regression models are used. We sketch our approach to imputing unmeasured predictors needed in such models, deferring details to references cited below. We describe in general terms how those imputed measurements and their errors can be accommodated within the framework of health impact analysis.
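
    The measurement-error issue raised above has a classic consequence that is easy to simulate: regressing a health outcome on a noisily measured exposure attenuates the estimated slope toward zero. The simulation below is entirely synthetic.

      import random

      random.seed(42)
      n, beta = 10000, 1.0
      x_true = [random.gauss(0, 1) for _ in range(n)]
      x_obs = [x + random.gauss(0, 1) for x in x_true]    # noisy ambient proxy
      y = [beta * x + random.gauss(0, 0.5) for x in x_true]

      def ols_slope(x, y):
          mx, my = sum(x) / len(x), sum(y) / len(y)
          sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
          sxx = sum((a - mx) ** 2 for a in x)
          return sxy / sxx

      print(ols_slope(x_true, y))   # about 1.0
      print(ols_slope(x_obs, y))    # about 0.5: attenuated by the noise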

  4. Modeling HIV Risk in Highly Vulnerable Youth

    ERIC Educational Resources Information Center

    Huba, G. J.; Panter, A. T.; Melchior, Lisa A.; Trevithick, Lee; Woods, Elizabeth R.; Wright, Eric; Feudo, Rudy; Tierney, Steven; Schneir, Arlene; Tenner, Adam; Remafedi, Gary; Greenberg, Brian; Sturdevant, Marsha; Goodman, Elizabeth; Hodgins, Antigone; Wallace, Michael; Brady, Russell E.; Singer, Barney; Marconi, Katherine

    2003-01-01

    This article examines the structure of several HIV risk behaviors in an ethnically and geographically diverse sample of 8,251 clients from 10 innovative demonstration projects intended for adolescents living with, or at risk for, HIV. Exploratory and confirmatory factor analyses identified 2 risk factors for men (sexual intercourse with men and a…

  5. Optimal dividends in the Brownian motion risk model with interest

    NASA Astrophysics Data System (ADS)

    Fang, Ying; Wu, Rong

    2009-07-01

    In this paper, we consider a Brownian motion risk model, and in addition, the surplus earns investment income at a constant force of interest. The objective is to find a dividend policy so as to maximize the expected discounted value of dividend payments. It is well known that optimality is achieved by using a barrier strategy for unrestricted dividend rate. However, ultimate ruin of the company is certain if a barrier strategy is applied. In many circumstances this is not desirable. This consideration leads us to impose a restriction on the dividend stream. We assume that dividends are paid to the shareholders according to admissible strategies whose dividend rate is bounded by a constant. Under this additional constraint, we show that the optimal dividend strategy is formed by a threshold strategy.
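
    A rough Monte Carlo check of the threshold strategy described above: the surplus follows a drifted Brownian motion earning interest, and dividends are paid at the maximal admissible rate only while the surplus exceeds the threshold b. All parameter values are illustrative.

      import math, random

      def discounted_dividends(x0=5.0, mu=1.0, sigma=2.0, r=0.02, delta=0.05,
                               b=8.0, dmax=1.5, dt=1e-3, horizon=20.0, seed=0):
          random.seed(seed)
          x, value = x0, 0.0
          for k in range(int(horizon / dt)):
              if x <= 0.0:                    # ruin: dividend stream stops
                  break
              rate = dmax if x > b else 0.0   # threshold dividend strategy
              value += math.exp(-delta * k * dt) * rate * dt
              x += (mu + r * x - rate) * dt \
                   + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
          return value

      paths = [discounted_dividends(seed=s) for s in range(100)]
      print(sum(paths) / len(paths))   # expected discounted dividends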

  6. Risk assessment of nitrate and petroleum-derived hydrocarbon addition on Conticribra weissflogii biomass, lifetime, and nutritional value.

    PubMed

    Shun-Xing, Li; Feng-Jiao, Liu; Feng-Ying, Zheng; Xu-Guang, Huang; Yue-Gang, Zuo

    2014-03-15

    Coastal diatoms are often exposed to both petroleum-derived hydrocarbon pollution and eutrophication. How these exposures influence algal biomass, lifetime, and nutritional value is unknown. To enable a more accurate risk assessment of the pollutants' effect on the role of diatoms in coastal ecosystem functions, Conticribra weissflogii was maintained at different concentrations of nitrate (N) and/or water-soluble fractions of No. 0 diesel oil (WSF). Algal density, cell growth cycle, protein, chlorophyll a, superoxide dismutase (SOD) activity, and malonaldehyde (MDA) were determined for the assessment of algal biomass, lifetime, nutritional value, photosynthesis and respiration, antioxidant capacity, and lipid peroxidation, respectively. When N addition was combined with WSF pollution, the cell growth cycles were shortened by 27-44%; SOD activities were decreased by 1-64%; and algal density and the concentrations of chlorophyll a, protein, and MDA varied between 38 and 310%, 62 and 712%, 4 and 124%, and 19 and 233% of the values observed in the N addition experiments, respectively. Coastal ecosystem functions were severely weakened by N and WSF additions, and the influence increased in the order: N addition < WSF addition < combined N and WSF addition. These findings should be considered in the risk assessment of petroleum-derived hydrocarbons on coastal ecosystem functions.

  7. Bankruptcy risk model and empirical tests

    PubMed Central

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M.; Urošević, Branko; Stanley, H. Eugene

    2010-01-01

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor—the debt-to-asset ratio R—in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes’s theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees—although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903
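
    Two ingredients of the analysis above, sketched on synthetic data: a maximum-likelihood (Hill-type) estimate of a Pareto/Zipf exponent for the debt-to-asset ratio R, and the Bayes inversion relating P(R | bankrupt) to P(bankrupt | R). The base rate and the probabilities in the last step are invented.

      import math, random

      random.seed(0)
      alpha_true, r_min = 2.0, 0.1
      # inverse-CDF sampling of a Pareto distribution for R
      R = [r_min * (1.0 - random.random()) ** (-1.0 / alpha_true)
           for _ in range(5000)]

      # maximum-likelihood (Hill) estimate of the tail exponent
      alpha_hat = len(R) / sum(math.log(r / r_min) for r in R)
      print(f"alpha_hat = {alpha_hat:.2f}")   # close to 2.0

      def p_bankrupt_given_R(p_R_given_B, p_R, base_rate=0.02):
          """Bayes' theorem as used to relate the two conditionals."""
          return p_R_given_B * base_rate / p_R

      print(p_bankrupt_given_R(p_R_given_B=0.30, p_R=0.05))   # 0.12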

  8. Bankruptcy risk model and empirical tests.

    PubMed

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M; Urosevic, Branko; Stanley, H Eugene

    2010-10-26

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor--the debt-to-asset ratio R--in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes's theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees--although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903

  10. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Section 425.600 Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  11. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Section 425.600 Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  12. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Section 425.600 Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  13. Research on R&D Project Risk Management Model

    NASA Astrophysics Data System (ADS)

    Gu, Xiaoyan; Cai, Chen; Song, Hao; Song, Juan

    An R&D project is an exploratory, high-risk investment activity with potential management flexibility. In the R&D project risk management process, it is hard to quantify risk when very little past information is available. This paper introduces quality function deployment (QFD) and real options into the traditional project risk management process. Through a waterfall decomposition mode, the R&D project risk management process is constructed step by step; through real options, the managerial flexibility inherent in R&D projects can be modeled. First, a risk priority list is obtained from the relation matrix between R&D project success factors and risk indexes. Then, the risk features of the various stages are analyzed. Finally, real options are embedded into the various stages of the R&D project according to those risk features. In order to manage R&D risk effectively in a dynamic cycle, the steps above should be carried out repeatedly.
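
    The relation-matrix prioritization step can be sketched in a few lines; the factor weights, the 9-3-1 relation scale and the risk labels below are invented for illustration.

      success_weights = [0.5, 0.3, 0.2]   # importance of each success factor
      relation = [                         # rows: success factors; cols: risks
          [9, 3, 1, 0],                    # strong = 9, medium = 3, weak = 1
          [1, 9, 3, 3],
          [0, 1, 9, 3],
      ]
      risks = ["technical", "market", "funding", "schedule"]

      scores = [sum(w * row[j] for w, row in zip(success_weights, relation))
                for j in range(len(risks))]
      priority = sorted(zip(risks, scores), key=lambda p: -p[1])
      print(priority)   # risk priority list, highest first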

  14. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    SciTech Connect

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. The Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment, which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]
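
    A toy version of the categorical combination logic described above; the category labels, the index-averaging rule and the output buckets are invented for illustration and are not the model's actual tables.

      LIKELIHOOD = ["very low", "low", "medium", "high"]
      CONSEQUENCE = ["minor", "moderate", "major", "severe"]
      RISK_BUCKETS = ["low", "medium", "high", "very high"]

      def risk_category(likelihood, consequence):
          """Map qualitative likelihood x consequence to a risk category."""
          i = LIKELIHOOD.index(likelihood)
          j = CONSEQUENCE.index(consequence)
          return RISK_BUCKETS[(i + j) // 2]

      print(risk_category("high", "severe"))      # very high
      print(risk_category("very low", "major"))   # medium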

  15. The influence of dispersing additive on the paraffin crystallization in model systems

    NASA Astrophysics Data System (ADS)

    Gorshkov, A. M.; Tien Thang, Pham; Shishmina, L. V.; Chekantseva, L. V.

    2015-11-01

    The work is dedicated to investigation of the influence of dispersing additive on the paraffin crystallization in model systems. A new method to determine the paraffin saturation point of transparent solutions based on the phenomenon of light scattering has been proposed. The linear relationship between the values of critical micelle concentrations of the additive and the quantity of paraffin in solution has been obtained. The influence of the model system composition on the paraffin crystallization has been studied.

  16. Effects of additional food in a delayed predator-prey model.

    PubMed

    Sahoo, Banshidhar; Poria, Swarup

    2015-03-01

    We examine the effects of supplying additional food to the predator in a gestation-delay-induced predator-prey system with habitat complexity. Additional food works in favor of predator growth in our model, and its presence reduces the predatory attack rate on prey. By supplying additional food, we can control the predator population. Taking the time delay as the bifurcation parameter, the stability of the coexisting equilibrium point is analyzed. Hopf bifurcation analysis is done with respect to the time delay in the presence of additional food. The direction of the Hopf bifurcations and the stability of the bifurcated periodic solutions are determined by applying normal form theory and the center manifold theorem. The qualitative dynamical behavior of the model is simulated using experimental parameter values. It is observed that fluctuations of the population size can be controlled either by supplying additional food suitably or by increasing the degree of habitat complexity. It is pointed out that a Hopf bifurcation occurs in the system when the delay crosses some critical value, and this critical value depends strongly on the quality and quantity of the supplied additional food. Therefore, the variation of the predator population significantly affects the dynamics of the model. Model results are compared with experimental results, and biological implications of the analytical findings are discussed in the conclusion section.
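
    A numerical sketch of a gestation-delay predator-prey system with additional food, integrated by the Euler method with a history buffer. The functional forms and every parameter value are a plausible stand-in, not the authors' exact equations.

      r, K, c, h, e, d = 1.0, 10.0, 0.8, 0.5, 0.6, 0.3
      A, alpha = 1.0, 0.4          # quantity and effect of additional food
      tau, dt, T = 2.0, 0.001, 40.0
      lag = int(tau / dt)

      x = [5.0] * (lag + 1)        # prey history on [-tau, 0]
      y = [2.0] * (lag + 1)        # predator history

      for _ in range(int(T / dt)):
          xc, yc = x[-1], y[-1]
          xd, yd = x[-1 - lag], y[-1 - lag]        # states at t - tau
          denom = 1.0 + h * xc + alpha * A         # additional food lowers
          denom_d = 1.0 + h * xd + alpha * A       # the effective attack rate
          dx = r * xc * (1.0 - xc / K) - c * xc * yc / denom
          dy = e * (c * xd + alpha * A) * yd / denom_d - d * yc
          x.append(max(xc + dx * dt, 0.0))
          y.append(max(yc + dy * dt, 0.0))

      print(f"prey(T) = {x[-1]:.3f}, predator(T) = {y[-1]:.3f}")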

  17. Genetic predisposition to coronary heart disease and stroke using an additive genetic risk score: a population-based study in Greece

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To determine the extent to which the risk for incident coronary heart disease (CHD) increases in relation to a genetic risk score (GRS) that additively integrates the influence of high-risk alleles in nine documented single nucleotide polymorphisms (SNPs) for CHD, and to examine whether t...

  18. Modeling biotic habitat high risk areas

    USGS Publications Warehouse

    Despain, D.G.; Beier, P.; Tate, C.; Durtsche, B.M.; Stephens, T.

    2000-01-01

    Fire, especially stand-replacing fire, poses a threat to many threatened and endangered species as well as their habitat. On the other hand, fire is important in maintaining a variety of successional stages that can be important for these species. A risk assessment approach can assist in prioritizing areas for the allocation of fire mitigation funds. One example looks at assessing risk to the species and biotic communities of concern followed by the Colorado Natural Heritage Program. One looks at the risk to Mexican spotted owls. Another looks at the risk to cutthroat trout, and a fourth considers the general effects of fire on elk.

  19. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  20. A Social Ecological Model of Syndemic Risk affecting Women with and At-Risk for HIV in Impoverished Urban Communities.

    PubMed

    Batchelder, A W; Gonzalez, J S; Palma, A; Schoenbaum, E; Lounsbury, D W

    2015-12-01

    Syndemic risk is an ecological construct, defined by co-occurring interdependent socio-environmental, interpersonal and intrapersonal determinants. We posited syndemic risk to be a function of violence, substance use, perceived financial hardship, emotional distress and self-worth among women with and at-risk for HIV in an impoverished urban community. In order to better understand these interrelationships, we developed and validated a system dynamics (SD) model based upon peer-reviewed literature; secondary data analyses of a cohort dataset including women living with and at-risk of HIV in Bronx, NY (N = 620); and input from a Bronx-based community advisory board. Simulated model output revealed divergent levels and patterns of syndemic risk over time across different sample profiles. Outputs generated new insights about how to effectively explore multicomponent multi-level programs in order to strategically develop more effective services for this population. Specifically, the model indicated that effective multi-level interventions might bolster women's resilience by increasing self-worth, which may result in decreased perceived financial hardship and risk of violence. Overall, our stakeholder-informed model depicts how self-worth may be a major driver of vulnerability and a meaningful addition to syndemic theory affecting this population. PMID:26370203

  1. A new explained-variance based genetic risk score for predictive modeling of disease risk.

    PubMed

    Che, Ronglin; Motsinger-Reif, Alison A

    2012-09-25

    The goal of association mapping is to identify genetic variants that predict disease, and as the field of human genetics matures, the number of successful association studies is increasing. Many such studies have shown that for many diseases, risk is explained by a reasonably large number of variants that each explains a very small amount of disease risk. This is prompting the use of genetic risk scores in building predictive models, where information across several variants is combined for predictive modeling. In the current study, we compare the performance of four previously proposed genetic risk score methods and present a new method for constructing genetic risk score that incorporates explained variance information. The methods compared include: a simple count Genetic Risk Score, an odds ratio weighted Genetic Risk Score, a direct logistic regression Genetic Risk Score, a polygenic Genetic Risk Score, and the new explained variance weighted Genetic Risk Score. We compare the methods using a wide range of simulations in two steps, with a range of the number of deleterious single nucleotide polymorphisms (SNPs) explaining disease risk, genetic modes, baseline penetrances, sample sizes, relative risks (RR) and minor allele frequencies (MAF). Several measures of model performance were compared including overall power, C-statistic and Akaike's Information Criterion. Our results show the relative performance of methods differs significantly, with the new explained variance weighted GRS (EV-GRS) generally performing favorably to the other methods.
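
    The simplest scores compared above can be written down directly. The explained-variance weighting shown here, 2p(1-p)beta^2 under an assumed Hardy-Weinberg equilibrium, is a generic proxy and not necessarily the paper's exact EV-GRS construction.

      import math

      genotypes = [[0, 1, 2], [1, 1, 0], [2, 0, 1]]   # risk-allele counts
      odds_ratios = [1.2, 1.4, 1.1]                   # per-allele ORs
      maf = [0.30, 0.15, 0.45]                        # minor allele frequencies

      def count_grs(g):
          return sum(g)                               # simple count GRS

      def or_weighted_grs(g):
          return sum(x * math.log(o) for x, o in zip(g, odds_ratios))

      def ev_weighted_grs(g):
          # weight by a proxy for variance explained: 2p(1-p) * beta^2
          w = [2 * p * (1 - p) * math.log(o) ** 2
               for p, o in zip(maf, odds_ratios)]
          return sum(x * wi for x, wi in zip(g, w))

      for g in genotypes:
          print(count_grs(g),
                round(or_weighted_grs(g), 3),
                round(ev_weighted_grs(g), 4))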

  2. The modified model of radiation risk at radon exposure.

    PubMed

    Zhukovsky, Michael; Demin, Vladimir; Yarmoshenko, Ilia

    2014-07-01

    A combined, modified model for assessing risk from indoor radon exposure is proposed. A multiplicative dependence for fatal lung cancer is used. The model has been developed on the basis of modern health risk theory and the results of epidemiological studies, with special attention to the results of the European combined study and the WISMUT miners cohort study. The model is presented as an age-specific relative risk coefficient for a single (short-term) exposure. The risk coefficient for an extended exposure can be obtained from this risk coefficient in accordance with risk theory. Smoothed dependences of the risk coefficients on time since exposure, attained age, and radon progeny concentration are suggested.

  3. On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis

    PubMed Central

    Li, Bing; Chun, Hyonho; Zhao, Hongyu

    2014-01-01

    We introduce a nonparametric method for estimating non-gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use one-dimensional kernel regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis. PMID:26401064

  4. Uses and Abuses of Models in Radiation Risk Management

    SciTech Connect

    Strom, Daniel J.

    1998-12-10

    This paper is a high-level overview of managing risks to workers, the public, and the environment. It discusses the difference between a model and a hypothesis. The need for models in risk assessment is justified, and then it is shown that radiation risk models that are usable in risk management are highly simplistic. The weight of evidence is considered for and against the linear non-threshold (LNT) model for carcinogenesis and heritable ill-health that is currently the basis for radiation risk management. Finally, uses and misuses of this model are considered. It is concluded that the LNT model continues to be suitable for use as the basis for radiation protection.

  5. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires the mathematicization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, danger, and so on. In order to address this problem, fundamental theoretical relationships between risk and risk management were first analyzed mathematically and then illustrated with charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On this basis, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, with emphasis on the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was investigated. This revealed that the FSA method, which the International Maritime Organization (IMO) is actively promoting, derives from risk management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  6. Increased bioclogging and corrosion risk by sulfate addition during iodine recovery at a natural gas production plant.

    PubMed

    Lim, Choon-Ping; Zhao, Dan; Takase, Yuta; Miyanaga, Kazuhiko; Watanabe, Tomoko; Tomoe, Yasuyoshi; Tanji, Yasunori

    2011-02-01

    Iodine recovery at a natural gas production plant in Japan involved the addition of sulfuric acid for pH adjustment, resulting in approximately 200 mg/L of additional sulfate in the waste brine after iodine recovery. Bioclogging occurred at the waste brine injection well, causing a decrease in well injectivity. To examine the factors that contribute to bioclogging, an on-site experiment was conducted by amending 10 L of brine under different conditions and then incubating the brine for 5 months in open air. The control case was exposed to open air but did not receive additional chemicals. When sulfate addition was coupled with low iodine, there was a drastic increase in the total amount of accumulated biomass (and consequently the risk of bioclogging) that was nearly six times higher than the control. The bioclogging-associated corrosion rate of carbon steel was 84.5 μm/year, which is four times higher than that observed under the other conditions. Analysis of the microbial communities by denaturing gradient gel electrophoresis revealed that the additional sulfate established a sulfur cycle and induced the growth of phototrophic bacteria, including cyanobacteria and purple bacteria. In the presence of sulfate and low iodine levels, cyanobacteria and purple bacteria bloomed, and the accumulation of abundant biomass may have created a more conducive environment for anaerobic sulfate-reducing bacteria. It is believed that the higher corrosion rate was caused by a differential aeration cell established by the heterogeneous distribution of the biomass covering the surface of the test coupons. PMID:20922384

  7. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore, it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.
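
    A minimal version of the pdf-attachment step: fit a lognormal distribution to a measurement series and read off an exceedance probability. The (40)K values and the action level below are hypothetical.

      import math
      from statistics import NormalDist

      k40 = [310.0, 420.0, 515.0, 280.0, 640.0, 390.0, 470.0, 355.0]  # Bq/kg
      logs = [math.log(v) for v in k40]
      mu = sum(logs) / len(logs)
      sigma = (sum((l - mu) ** 2 for l in logs) / (len(logs) - 1)) ** 0.5

      action_level = 600.0   # Bq/kg, hypothetical threshold
      p_exceed = 1.0 - NormalDist(mu, sigma).cdf(math.log(action_level))
      print(f"P((40)K > {action_level} Bq/kg) = {p_exceed:.3f}")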

  8. Meat and bone meal and mineral feed additives may increase the risk of oral prion disease transmission

    USGS Publications Warehouse

    Johnson, Christopher J.; McKenzie, Debbie; Pedersen, Joel A.; Aiken, Judd M.

    2011-01-01

    Ingestion of prion-contaminated materials is postulated to be a primary route of prion disease transmission. Binding of prions to soil (micro)particles dramatically enhances peroral disease transmission relative to unbound prions, and it was hypothesized that micrometer-sized particles present in other consumed materials may affect prion disease transmission via the oral route of exposure. Small, insoluble particles are present in many substances, including soil, human foods, pharmaceuticals, and animal feeds. It is known that meat and bone meal (MBM), a feed additive believed responsible for the spread of bovine spongiform encephalopathy (BSE), contains particles smaller than 20 μm and that the pathogenic prion protein binds to MBM. The potentiation of disease transmission via the oral route by exposure to MBM or three micrometer-sized mineral feed additives was determined. Data showed that when the disease agent was bound to any of the tested materials, the penetrance of disease was increased compared to unbound prions. Our data suggest that in feed or other prion-contaminated substances consumed by animals or, potentially, humans, the addition of MBM or the presence of microparticles could heighten risks of prion disease acquisition.

  9. MEAT AND BONE MEAL AND MINERAL FEED ADDITIVES MAY INCREASE THE RISK OF ORAL PRION DISEASE TRANSMISSION

    PubMed Central

    Johnson, Christopher J.; McKenzie, Debbie; Pedersen, Joel A.; Aiken, Judd M.

    2011-01-01

    Ingestion of prion-contaminated materials is postulated to be a primary route of prion disease transmission. Binding of prions to soil (micro)particles dramatically enhances peroral disease transmission relative to unbound prions, and it was hypothesized that micrometer–sized particles present in other consumed materials may affect prion disease transmission via the oral route of exposure. Small, insoluble particles are present in many substances, including soil, human foods, pharmaceuticals, and animal feeds. It is known that meat and bone meal (MBM), a feed additive believed responsible for the spread of bovine spongiform encephalopathy (BSE), contains particles smaller than 20 μm and that the pathogenic prion protein binds to MBM. The potentiation of disease transmission via the oral route by exposure to MBM or three micrometer-sized mineral feed additives was determined. Data showed that when the disease agent was bound to any of the tested materials, the penetrance of disease was increased compared to unbound prions. Our data suggest that in feed or other prion–contaminated substances consumed by animals or, potentially, humans, the addition of MBM or the presence of microparticles could heighten risks of prion disease acquisition. PMID:21218345

  10. A Dual-Process Approach to Health Risk Decision Making: The Prototype Willingness Model

    ERIC Educational Resources Information Center

    Gerrard, Meg; Gibbons, Frederick X.; Houlihan, Amy E.; Stock, Michelle L.; Pomery, Elizabeth A.

    2008-01-01

    Although dual-process models in cognitive, personality, and social psychology have stimulated a large body of research about analytic and heuristic modes of decision making, these models have seldom been applied to the study of adolescent risk behaviors. In addition, the developmental course of these two kinds of information processing, and their…

  11. Building risk-on-a-chip models to improve breast cancer risk assessment and prevention

    PubMed Central

    Vidi, Pierre-Alexandre; Leary, James; Lelièvre, Sophie A.

    2013-01-01

    Summary Preventive actions for chronic diseases hold the promise of improving lives and reducing healthcare costs. For several diseases, including breast cancer, multiple risk and protective factors have been identified by epidemiologists. The impact of most of these factors has yet to be fully understood at the organism, tissue, cellular and molecular levels. Importantly, combinations of external and internal risk and protective factors involve cooperativity, thus synergizing or antagonizing disease onset. Models are needed to mechanistically decipher cancer risks under defined cellular and microenvironmental conditions. Here, we briefly review breast cancer risk models based on 3D cell culture and propose to improve risk modeling with lab-on-a-chip approaches. We suggest epithelial tissue polarity, DNA repair and epigenetic profiles as endpoints in risk assessment models and discuss the development of ‘risks-on-chips’ integrating biosensors of these endpoints and of general tissue homeostasis. Risks-on-chips will help identify biomarkers of risk, serve as screening platforms for cancer preventive agents, and provide a better understanding of risk mechanisms, hence resulting in novel developments in disease prevention. PMID:23681255

  13. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distribution model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 to July 2010 using a two-component univariate normal mixture distribution model. First, we present the application of the model in empirical finance, fitting it to the real data. Second, we present its application in risk analysis, using the fitted model to evaluate VaR and CVaR with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distribution model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
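
    A minimal sketch of this two-step workflow, assuming simulated returns in place of the FBMKLCI series; the data, seed and parameters are illustrative, not taken from the paper:

    ```python
    # Fit a two-component normal mixture to returns, then compute VaR/CVaR.
    import numpy as np
    from scipy.optimize import brentq
    from scipy.stats import norm
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Synthetic fat-tailed returns standing in for the index series
    returns = np.concatenate([rng.normal(0.01, 0.03, 800),
                              rng.normal(-0.02, 0.08, 200)])

    gm = GaussianMixture(n_components=2, random_state=0).fit(returns.reshape(-1, 1))
    w, mu = gm.weights_, gm.means_.ravel()
    sd = np.sqrt(gm.covariances_.ravel())

    def mixture_cdf(x):
        return np.sum(w * norm.cdf(x, mu, sd))

    alpha = 0.05
    # VaR: the alpha-quantile of the fitted mixture (root of CDF - alpha)
    var_alpha = brentq(lambda x: mixture_cdf(x) - alpha, -1.0, 1.0)
    # CVaR: expected return below VaR; each component adds a truncated mean
    tail_mass = norm.cdf(var_alpha, mu, sd)
    trunc_mean = mu - sd * norm.pdf((var_alpha - mu) / sd) / np.clip(tail_mass, 1e-12, None)
    cvar_alpha = np.sum(w * tail_mass * trunc_mean) / alpha
    print(f"VaR(5%) = {var_alpha:.4f}, CVaR(5%) = {cvar_alpha:.4f}")
    ```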

  14. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  15. A Process Model for Assessing Adolescent Risk for Suicide.

    ERIC Educational Resources Information Center

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level of risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  16. Meta-analysis identifies 29 additional ulcerative colitis risk loci, increasing the number of confirmed associations to 47

    PubMed Central

    Anderson, Carl A.; Boucher, Gabrielle; Lees, Charlie W.; Franke, Andre; D’Amato, Mauro; Taylor, Kent D.; Lee, James C.; Goyette, Philippe; Imielinski, Marcin; Latiano, Anna; Lagacé, Caroline; Scott, Regan; Amininejad, Leila; Bumpstead, Suzannah; Baidoo, Leonard; Baldassano, Robert N.; Barclay, Murray; Bayless, Theodore M.; Brand, Stephan; Büning, Carsten; Colombel, Jean-Frédéric; Denson, Lee A.; De Vos, Martine; Dubinsky, Marla; Edwards, Cathryn; Ellinghaus, David; Fehrmann, Rudolf S.N.; Floyd, James A.B.; Florin, Tim; Franchimont, Denis; Franke, Lude; Georges, Michel; Glas, Jürgen; Glazer, Nicole L.; Guthery, Stephen L.; Haritunians, Talin; Hayward, Nicholas K.; Hugot, Jean-Pierre; Jobin, Gilles; Laukens, Debby; Lawrance, Ian; Lémann, Marc; Levine, Arie; Libioulle, Cecile; Louis, Edouard; McGovern, Dermot P.; Milla, Monica; Montgomery, Grant W.; Morley, Katherine I.; Mowat, Craig; Ng, Aylwin; Newman, William; Ophoff, Roel A; Papi, Laura; Palmieri, Orazio; Peyrin-Biroulet, Laurent; Panés, Julián; Phillips, Anne; Prescott, Natalie J.; Proctor, Deborah D.; Roberts, Rebecca; Russell, Richard; Rutgeerts, Paul; Sanderson, Jeremy; Sans, Miquel; Schumm, Philip; Seibold, Frank; Sharma, Yashoda; Simms, Lisa; Seielstad, Mark; Steinhart, A. Hillary; Targan, Stephan R.; van den Berg, Leonard H.; Vatn, Morten; Verspaget, Hein; Walters, Thomas; Wijmenga, Cisca; Wilson, David C.; Westra, Harm-Jan; Xavier, Ramnik J.; Zhao, Zhen Z.; Ponsioen, Cyriel Y.; Andersen, Vibeke; Torkvist, Leif; Gazouli, Maria; Anagnou, Nicholas P.; Karlsen, Tom H.; Kupcinskas, Limas; Sventoraityte, Jurgita; Mansfield, John C.; Kugathasan, Subra; Silverberg, Mark S.; Halfvarson, Jonas; Rotter, Jerome I.; Mathew, Christopher G.; Griffiths, Anne M.; Gearry, Richard; Ahmad, Tariq; Brant, Steven R.; Chamaillard, Mathias; Satsangi, Jack; Cho, Judy H.; Schreiber, Stefan; Daly, Mark J.; Barrett, Jeffrey C.; Parkes, Miles; Annese, Vito; Hakonarson, Hakon; Radford-Smith, Graham; Duerr, Richard H.; Vermeire, Séverine; Weersma, Rinse K.; Rioux, John D.

    2011-01-01

    Genome-wide association studies (GWAS) and candidate gene studies in ulcerative colitis (UC) have identified 18 susceptibility loci. We conducted a meta-analysis of 6 UC GWAS, comprising 6,687 cases and 19,718 controls, and followed up the top association signals in 9,628 cases and 12,917 controls. We identified 29 additional risk loci (P < 5×10⁻⁸), increasing the number of UC-associated loci to 47. After annotating associated regions using GRAIL, eQTL data and correlations with non-synonymous SNPs, we identified many candidate genes providing potentially important insights into disease pathogenesis, including IL1R2, IL8RA/B, IL7R, IL12B, DAP, PRDM1, JAK2, IRF5, GNA12 and LSP1. The total number of confirmed inflammatory bowel disease (IBD) risk loci is now 99, including a minimum of 28 shared association signals between Crohn’s disease (CD) and UC. PMID:21297633

  17. Personal cancer knowledge and information seeking through PRISM: the planned risk information seeking model.

    PubMed

    Hovick, Shelly R; Kahlor, Leeann; Liang, Ming-Ching

    2014-04-01

    This study retested PRISM, a model of risk information seeking, and found that it is applicable to the context of cancer risk communication. The study, which used an online sample of 928 U.S. adults, also tested the effect of additional variables on that model and found that the original model better fit the data. Among the strongest predictors of cancer information seeking were seeking-related subjective norms, attitude toward seeking, perceived knowledge insufficiency, and affective risk response. Furthermore, risk perception was a strong predictor of an affective risk response. The authors suggest that, given the robustness across studies, the path between seeking-related subjective norms and seeking intention is ready to be implemented in communication practice. PMID:24433251

  18. Generalized additive modeling with implicit variable selection by likelihood-based boosting.

    PubMed

    Tutz, Gerhard; Binder, Harald

    2006-12-01

    The use of generalized additive models in statistical data analysis suffers from the restriction to few explanatory variables and the problem of selecting smoothing parameters. Generalized additive model boosting circumvents these problems by stagewise fitting of weak learners. A fitting procedure is derived that works for all simple exponential family distributions, including binomial, Poisson, and normal response variables. The procedure combines the selection of variables with the determination of the appropriate amount of smoothing. Penalized regression splines and the newly introduced penalized stumps are considered as weak learners. Estimates of standard deviations and stopping criteria, which are notorious problems in iterative procedures, are based on an approximate hat matrix. The method is shown to be a strong competitor to common procedures for fitting generalized additive models. In particular, in high-dimensional settings with many nuisance predictor variables it performs very well. PMID:17156269
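
    A schematic illustration of the core idea, component-wise boosting with implicit variable selection, here using plain stumps for a binomial response and omitting the paper's penalization and hat-matrix stopping rule; all names and data are invented:

    ```python
    # Component-wise boosting: at each step, fit a stump per covariate to the
    # negative gradient of the binomial loss and keep only the best one.
    import numpy as np

    def fit_stump(x, grad):
        """Best single split on x fitted to the gradient (least squares)."""
        best = (np.inf, 0.0, 0.0, 0.0)
        for t in np.quantile(x, np.linspace(0.1, 0.9, 9)):
            left = x <= t
            if left.all() or (~left).all():
                continue
            gl, gr = grad[left].mean(), grad[~left].mean()
            sse = ((grad[left] - gl) ** 2).sum() + ((grad[~left] - gr) ** 2).sum()
            if sse < best[0]:
                best = (sse, t, gl, gr)
        return best

    def boost_gam(X, y, n_iter=200, nu=0.1):
        n, p = X.shape
        f = np.zeros(n)                          # additive predictor
        ensemble = []
        for _ in range(n_iter):
            grad = y - 1.0 / (1.0 + np.exp(-f))  # negative gradient, log-loss
            fits = [fit_stump(X[:, j], grad) for j in range(p)]
            j = int(np.argmin([fit[0] for fit in fits]))  # implicit selection
            _, t, gl, gr = fits[j]
            f += nu * np.where(X[:, j] <= t, gl, gr)      # weak, shrunken update
            ensemble.append((j, t, nu * gl, nu * gr))
        return ensemble, f

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 5))                # 5 covariates, most are noise
    y = (rng.uniform(size=300) < 1 / (1 + np.exp(-1.5 * X[:, 0]))).astype(float)
    ensemble, f = boost_gam(X, y)
    print("selected covariates:", sorted({j for j, *_ in ensemble}))
    ```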

  19. Applying Four Different Risk Models in Local Ore Selection

    SciTech Connect

    Richmond, Andrew

    2002-12-15

    Given the uncertainty in grade at a mine location, a financially risk-averse decision-maker may prefer to incorporate this uncertainty into the ore selection process. A FORTRAN program, risksel, is presented to calculate local risk-adjusted optimal ore selections using a negative exponential utility function and three dominance models: mean-variance, mean-downside risk, and stochastic dominance. All four methods are demonstrated in a grade control environment. In the case study, optimal selections vary with the magnitude of financial risk that a decision-maker is prepared to accept. Except for the stochastic dominance method, the risk models reassign material from higher-cost to lower-cost processing options as the aversion to financial risk increases. The stochastic dominance model was usually unable to determine the optimal local selection.
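
    A hedged sketch of the expected-utility step under a negative exponential utility, with invented grade simulations, prices and processing costs standing in for risksel's inputs:

    ```python
    # Risk-averse selection: pick, per block, the processing option that
    # maximizes expected exponential utility over simulated grades.
    import numpy as np

    rng = np.random.default_rng(1)
    grades = rng.lognormal(mean=0.0, sigma=0.5, size=(100, 3))  # sims x blocks

    def profit(grade, option_cost, recovery=0.9, price=25.0):
        return price * recovery * grade - option_cost

    def choose_option(grade_sims, option_costs, risk_aversion=0.5):
        """Return the utility-maximizing processing option per block."""
        choices = []
        for b in range(grade_sims.shape[1]):
            utils = [np.mean(1.0 - np.exp(-risk_aversion *
                                          profit(grade_sims[:, b], c)))
                     for c in option_costs]
            choices.append(int(np.argmax(utils)))
        return choices

    print(choose_option(grades, option_costs=[5.0, 10.0]))
    ```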

  20. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced, shifting the methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors present a novel methodology for quantifying risk and ranking critical items in order to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). Critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients: a likelihood risk coefficient, which treats the probability as a fuzzy number; an abstract risk coefficient, which captures risk influenced by uncertainty and sensitivity, among other factors; and a hazardous risk coefficient, which covers anticipated future hazards and is deduced from consequence criteria for safety, environment, maintenance and economics, with corresponding costs for the consequences. Characteristic values of the three risk coefficients are obtained from a particular test; because further tests on the system may change the values significantly within the controlling range of each coefficient, random number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, from which the final rankings of critical items are estimated. The prioritized ranking of critical items produced by the developed mathematical model for risk assessment should be useful for optimizing financial losses and the timing of maintenance actions.
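
    An illustrative reading of the simulation-and-addition step, assuming triangular distributions for the three risk coefficients; the distributions, bounds and item names are invented:

    ```python
    # Monte Carlo aggregation of three risk coefficients per critical item,
    # then ranking items by the resulting final risk coefficient.
    import numpy as np

    rng = np.random.default_rng(42)
    items = {"pump": [(0.2, 0.4, 0.7), (0.1, 0.3, 0.5), (0.3, 0.5, 0.9)],
             "valve": [(0.1, 0.2, 0.4), (0.2, 0.4, 0.6), (0.1, 0.2, 0.3)]}

    def final_coefficient(coeff_ranges, n=10_000):
        """Simulate each coefficient and statistically add them."""
        draws = [rng.triangular(lo, mode, hi, n) for lo, mode, hi in coeff_ranges]
        return np.mean(np.sum(draws, axis=0))

    ranking = sorted(items, key=lambda k: final_coefficient(items[k]), reverse=True)
    print(ranking)  # items ordered from highest to lowest final risk coefficient
    ```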

  1. Modeling financial disaster risk management in developing countries

    NASA Astrophysics Data System (ADS)

    Mechler, R.; Hochrainer, S.; Pflug, G.; Linnerooth-Bayer, J.

    2005-12-01

    The public sector plays a major role in reducing the long-term economic repercussions of disasters by repairing damaged infrastructure and providing financial assistance to households and businesses. If critical infrastructure is not repaired in a timely manner, there can be serious effects on the economy and the livelihoods of the population. The repair of public infrastructure, however, can be a significant drain on public budgets, especially in developing and transition countries. Developing country governments frequently lack the liquidity, even including international aid and loans, to fully repair damaged critical public infrastructure or provide sufficient support to households and businesses for their recovery. The earthquake in Gujarat, and other recent cases of government post-disaster liquidity crises, have sounded an alarm, prompting financial development organizations, such as the World Bank, to call for greater attention to reducing financial vulnerability and increasing the resilience of the public sector. This talk reports on a model designed to illustrate the tradeoffs and choices a developing country must make in financially managing the economic risks due to natural disasters. Budgetary resources allocated to pre-disaster risk management strategies, such as loss mitigation measures, a catastrophe reserve fund, insurance and contingent credit arrangements for public assets, reduce the probability of financing gaps (the inability of governments to meet their full obligations in providing relief to private victims and restoring public infrastructure) or prevent the deterioration of the ability to undertake additional borrowing without incurring a debt crisis. The model, which is equipped with a graphical interface, can be a helpful tool for building the capacity of policy makers to develop and assess public financing strategies for disaster risk by indicating the respective costs and consequences of financing alternatives.

  2. A comprehensive Network Security Risk Model for process control networks.

    PubMed

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example. PMID:19000078

  3. Vector generalized additive models for extreme rainfall data analysis (case study: rainfall data in Indramayu)

    NASA Astrophysics Data System (ADS)

    Utami, Eka Putri Nur; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall patterns are good indicators of potential disasters. A Global Circulation Model (GCM) contains global-scale information that can be used to predict rainfall data. Statistical downscaling (SD) utilizes this global-scale information to make inferences at the local scale; essentially, SD can be used to predict local-scale variables based on global-scale variables. SD requires a method that accommodates nonlinear effects and extreme values. Extreme Value Theory (EVT) can be used to analyze extreme values; one method for identifying extreme events is peaks over threshold, where exceedances follow the Generalized Pareto Distribution (GPD). The vector generalized additive model (VGAM) is an extension of the generalized additive model; it accommodates linear or nonlinear effects by involving more than one additive predictor and has the advantage of handling multi-response models. The key ideas behind VGAM are iteratively reweighted least squares for maximum likelihood estimation, penalized smoothing, Fisher scoring and additive models. This work analyzes extreme rainfall data in Indramayu using VGAM. The results show that VGAM with the GPD is able to predict extreme rainfall data accurately; the prediction for February is very close to the actual value at the 75th quantile.
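
    A minimal peaks-over-threshold sketch with a GPD fit via scipy, using synthetic rainfall in place of the Indramayu data; the threshold choice and return-level horizon are illustrative:

    ```python
    # Peaks over threshold: fit a GPD to exceedances, then compute a return level.
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(7)
    rain = rng.gamma(shape=2.0, scale=10.0, size=5000)  # synthetic daily rainfall

    u = np.quantile(rain, 0.95)          # threshold: 95th percentile
    excess = rain[rain > u] - u          # exceedances over the threshold
    shape, _, scale = genpareto.fit(excess, floc=0.0)

    # Return level: the value exceeded once every m observations on average
    m = 1000
    zeta = excess.size / rain.size       # exceedance rate
    ret_level = u + genpareto.ppf(1 - 1 / (m * zeta), shape, loc=0.0, scale=scale)
    print(f"threshold={u:.1f}, GPD shape={shape:.3f}, return level={ret_level:.1f}")
    ```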

  4. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    SciTech Connect

    Doligez, B.; Eschard, R.; Geffroy, F.

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model informed with petrophysical properties. Scaling-up techniques then produce a reservoir model that is compatible with fluid flow simulators. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability matches the variability computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with excellent areal coverage but poor vertical resolution. Recent advances in modelling techniques now make it possible to integrate this type of additional external information in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  5. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk. PMID:19087232

  6. A Multiple Risk Factors Model of the Development of Aggression among Early Adolescents from Urban Disadvantaged Neighborhoods

    ERIC Educational Resources Information Center

    Kim, Sangwon; Orpinas, Pamela; Kamphaus, Randy; Kelder, Steven H.

    2011-01-01

    This study empirically derived a multiple risk factors model of the development of aggression among middle school students in urban, low-income neighborhoods, using Hierarchical Linear Modeling (HLM). Results indicated that aggression increased from sixth to eighth grade. Additionally, the influences of four risk domains (individual, family,…

  7. Experimental model and analytic solution for real-time observation of vehicle's additional steer angle

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolong; Li, Liang; Pan, Deng; Cao, Chengmao; Song, Jian

    2014-03-01

    Current research on real-time observation of a vehicle's roll steer angle and compliance steer angle (comprehensively referred to as the additional steer angle in this paper) mainly employs the linear vehicle dynamic model, in which only the lateral acceleration of the vehicle body is considered. The observation accuracy of this method cannot meet the requirements of real-time vehicle stability control, especially under extreme driving conditions. This paper explores a solution based on an experimental method. First, a multi-body dynamic model of a passenger car is built with the ADAMS/Car software, and its dynamic accuracy is verified against roadway test data from the same vehicle in a steady-state circular test. Based on this simulation platform, several factors influencing the additional steer angle under different driving conditions are quantitatively analyzed. The ε-SVR algorithm is then employed to build the additional steer angle prediction model, whose input vectors mainly comprise the sensor information of a standard electronic stability control system (ESC). Typical slalom tests and FMVSS 126 tests are used to run simulations, train the model and test its generalization performance. The test results show that the influence of lateral acceleration on the additional steer angle is largest (magnitude up to 1°), followed by longitudinal acceleration-deceleration and road wave amplitude (magnitude up to 0.3°). Moreover, both the prediction accuracy and the real-time performance of the model meet the control requirements of ESC. This research expands the methods available for accurate observation of the additional steer angle under extreme driving conditions.
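
    An illustrative ε-SVR fit mapping ESC-style sensor signals to the additional steer angle, with synthetic data; the feature set and toy target echo the reported influence ordering but are not the paper's dataset:

    ```python
    # epsilon-SVR regression from synthetic ESC sensor signals to steer angle.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(3)
    n = 2000
    X = np.column_stack([
        rng.uniform(-8, 8, n),    # lateral acceleration (m/s^2)
        rng.uniform(-4, 4, n),    # longitudinal acceleration (m/s^2)
        rng.uniform(0, 0.05, n),  # road wave amplitude (m)
    ])
    # Toy target: lateral acceleration dominates, as reported in the paper
    y = 0.12 * X[:, 0] + 0.03 * X[:, 1] + 4.0 * X[:, 2] + rng.normal(0, 0.02, n)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
    model.fit(X[:1500], y[:1500])
    print("test R^2:", model.score(X[1500:], y[1500:]))
    ```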

  8. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk by combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, while the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of an estimate of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Because the estimated probability of an intermediate event may carry large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on Southeast Michigan work zone crash data is carried out. The numerical results show a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is lowered by 20%, and a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in mitigating casualty risk.
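
    A toy event-tree calculation of expected fatalities, collapsing the paper's seven intermediate events into two for brevity; all branch probabilities are invented:

    ```python
    # Event-tree arithmetic: path probability times consequence, summed over
    # paths, scaled by crash frequency. The real model estimates these values
    # (with uncertainty) from crash data and uses far more intermediate events.
    crash_freq = 0.8                 # expected crashes per year in the zone
    severity = {"pdo": 0.70, "injury": 0.25, "fatal": 0.05}
    fatal_split = {"single": 0.90, "multiple": 0.10}

    paths = [
        (severity["pdo"], 0),                            # property damage only
        (severity["injury"], 0),                         # injury, no fatality
        (severity["fatal"] * fatal_split["single"], 1),
        (severity["fatal"] * fatal_split["multiple"], 2),
    ]
    assert abs(sum(p for p, _ in paths) - 1.0) < 1e-9
    fatalities_per_year = crash_freq * sum(p * n for p, n in paths)
    print(f"expected fatalities per year: {fatalities_per_year:.4f}")
    ```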

  9. Antimicrobial combinations: Bliss independence and Loewe additivity derived from mechanistic multi-hit models.

    PubMed

    Baeder, Desiree Y; Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens; Regoes, Roland R

    2016-05-26

    Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of the bacterial populations they target. It is relevant to understand whether the effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. This bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning for the additivity terms. The choice of additivity term is essential in determining synergy or antagonism of antimicrobials. This article is part of the themed issue 'Evolutionary ecology of arthropod antimicrobial peptides'. PMID:27160596
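
    A sketch contrasting the two additivity terms for two drugs with Hill-type dose-response curves; the doses, EC50 values and Hill coefficients are illustrative:

    ```python
    # Bliss: combine effects as if the drugs act independently.
    # Loewe: find the effect level at which the dose fractions sum to one.
    from scipy.optimize import brentq

    def effect(dose, ec50, hill=2.0):
        """Fractional kill for a single drug (Hill function)."""
        return dose**hill / (ec50**hill + dose**hill)

    def inverse(e, ec50, hill=2.0):
        """Dose of the drug alone that produces effect e."""
        return ec50 * (e / (1 - e)) ** (1 / hill)

    dose_a, dose_b = 1.0, 0.5
    ec50_a, ec50_b = 2.0, 1.0

    e_a, e_b = effect(dose_a, ec50_a), effect(dose_b, ec50_b)
    bliss = e_a + e_b - e_a * e_b

    loewe = brentq(
        lambda e: dose_a / inverse(e, ec50_a) + dose_b / inverse(e, ec50_b) - 1,
        1e-6, 1 - 1e-6)
    print(f"Bliss combined effect: {bliss:.3f}, Loewe combined effect: {loewe:.3f}")
    ```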

  10. Ecological risk assessment of water environment for Luanhe River Basin based on relative risk model.

    PubMed

    Liu, Jingling; Chen, Qiuying; Li, Yongli

    2010-11-01

    The relative risk model (RRM) has been applied successfully in regional ecological risk assessments. In this study, the RRM was developed further by expanding the risk source data and introducing the source-stressor-habitat exposure filter (SSH), the endpoint-habitat exposure filter (EH) and the stressor-endpoint effect filter (SE) to make the meanings of exposure and effect more explicit. The water environment, comprising water quality, water quantity and aquatic ecosystems, was selected as the ecological risk assessment endpoint. The Luanhe River Basin, located in North China, was selected as the model case. The results showed three low-risk regions, one medium-risk region and two high-risk regions in the Luanhe River Basin. The results also indicated that habitat destruction was the largest stressor for the Luanhe water environment, with a risk score as high as 11.87; the second was oxygen-consuming organic pollutants (9.28) and the third was nutrients (7.78), making these three stressors the main influencing factors of ecological pressure in the study area. Furthermore, animal husbandry was the biggest source, with a risk score as high as 20.38; the second was domestic sewage (14.00) and the third was polluting industry (9.96). Among habitats, waters and farmland were enduring the greatest pressure and should receive considerable attention. Water deterioration and damage to ecological service values faced the biggest risk pressure, followed by decreased biodiversity and landscape fragmentation. PMID:20683654

  11. Assessing Academic Risk of Student-Athletes: Applicability of the NCAA Graduation Risk Overview Model to GPA

    ERIC Educational Resources Information Center

    Johnson, James

    2013-01-01

    In an effort to standardize academic risk assessment, the NCAA developed the graduation risk overview (GRO) model. Although this model was designed to assess graduation risk, its ability to predict grade-point average (GPA) remained unknown. Therefore, 134 individual risk assessments were made to determine GRO model effectiveness in the…

  12. Sparse Additive Ordinary Differential Equations for Dynamic Gene Regulatory Network Modeling.

    PubMed

    Wu, Hulin; Lu, Tao; Xue, Hongqi; Liang, Hua

    2014-04-01

    The gene regulation network (GRN) is a high-dimensional complex system, which can be represented by various mathematical or statistical models. The ordinary differential equation (ODE) model is one of the popular dynamic GRN models. High-dimensional linear ODE models have been proposed to identify GRNs, but with a limitation of the linear regulation effect assumption. In this article, we propose a sparse additive ODE (SA-ODE) model, coupled with ODE estimation methods and adaptive group LASSO techniques, to model dynamic GRNs that could flexibly deal with nonlinear regulation effects. The asymptotic properties of the proposed method are established and simulation studies are performed to validate the proposed approach. An application example for identifying the nonlinear dynamic GRN of T-cell activation is used to illustrate the usefulness of the proposed method. PMID:25061254

  13. Midrapidity inclusive densities in high energy pp collisions in additive quark model

    NASA Astrophysics Data System (ADS)

    Shabelski, Yu. M.; Shuvaev, A. G.

    2016-08-01

    High energy (CERN SPS and LHC) inelastic pp (p̄p) scattering is treated in the framework of the additive quark model together with Pomeron exchange theory. We extract the midrapidity inclusive density of the charged secondaries produced in a single quark-quark collision and investigate its energy dependence. Predictions for πp collisions are presented.

  14. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data

    PubMed Central

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2012-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each covariate is nonparametric and additive. However, in practice there is often prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure in which the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that, when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while the asymptotic variance remains the same as that of the unguided estimator. We examine the performance of our method via a simulation study and demonstrate it by applying it to a real data set on mergers and acquisitions. PMID:23645976

  15. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    SciTech Connect

    Carlson, T.M.; Gregory, S.D.

    1995-12-31

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs); however, a true impact on the burrowing populations has not yet been verified. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use the areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program which ensures that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  16. Modeling extinction risk of endemic birds of mainland China.

    PubMed

    Chen, Youhua

    2013-01-01

    The extinction risk of endemic birds of mainland China was modeled over evolutionary time. Results showed that the extinction risk of endemic birds in mainland China always tended to be similar within subclades over the evolutionary time of species divergence, and that the overall evolution of extinction risk presented a pattern of conservatism, as evidenced by the disparity-through-time plot. A constant-rate evolutionary model was the best model for quantifying the evolution of extinction risk of endemic birds of mainland China; thus, there was no rate-shifting pattern in the evolution of extinction risk of Chinese endemic birds over time. In summary, the present work systematically quantifies the extinction risk of endemic birds of mainland China within an evolutionary framework.

  17. Risk models and scores for type 2 diabetes: systematic review

    PubMed Central

    Mathur, Rohini; Dent, Tom; Meads, Catherine; Greenhalgh, Trisha

    2011-01-01

    Objective To evaluate current risk models and scores for type 2 diabetes and inform selection and implementation of these in practice. Design Systematic review using standard (quantitative) and realist (mainly qualitative) methodology. Inclusion criteria Papers in any language describing the development or external validation, or both, of models and scores to predict the risk of an adult developing type 2 diabetes. Data sources Medline, PreMedline, Embase, and Cochrane databases were searched. Included studies were citation tracked in Google Scholar to identify follow-on studies of usability or impact. Data extraction Data were extracted on statistical properties of models, details of internal or external validation, and use of risk scores beyond the studies that developed them. Quantitative data were tabulated to compare model components and statistical properties. Qualitative data were analysed thematically to identify mechanisms by which use of the risk model or score might improve patient outcomes. Results 8864 titles were scanned, 115 full text papers considered, and 43 papers included in the final sample. These described the prospective development or validation, or both, of 145 risk prediction models and scores, 94 of which were studied in detail here. They had been tested on 6.88 million participants followed for up to 28 years. Heterogeneity of primary studies precluded meta-analysis. Some but not all risk models or scores had robust statistical properties (for example, good discrimination and calibration) and had been externally validated on a different population. Genetic markers added nothing to models over clinical and sociodemographic factors. Most authors described their score as “simple” or “easily implemented,” although few were specific about the intended users and under what circumstances. Ten mechanisms were identified by which measuring diabetes risk might improve outcomes. Follow-on studies that applied a risk score as part of an

  1. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  2. A Measurement Model of Women’s Behavioral Risk Taking

    PubMed Central

    VanZile-Tamsen, Carol; Testa, Maria; Livingston, Jennifer A.; Harlow, Lisa L.

    2009-01-01

    The current study was designed to gain a better understanding of the nature of the relationship between substance use and sexual risk taking within a community sample of women (N = 1,004). Using confirmatory factor analysis, the authors examined the factor structure of sexual risk behaviors and substance use to determine whether they are best conceptualized as domains underlying a single, higher order, risk-taking propensity. A two higher-order-factor model (sexual risk behavior and substance use) provided the best fit to the data, suggesting that these two general risk domains are correlated but distinct factors. Sensation seeking had large direct effects on the two risk domains and large indirect effects on the four first-order factors and the individual indicators. Negative affect had smaller, yet still significant, effects. Impulsivity and anxiety were unrelated to the sexual health risk domains. PMID:16569118

  3. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    ERIC Educational Resources Information Center

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  4. Modelling eutrophication and microbial risks in peri-urban river systems using discriminant function analysis.

    PubMed

    Pinto, U; Maheshwari, B; Shrestha, S; Morris, C

    2012-12-01

    The methodology currently available to river managers for assessing river conditions for eutrophication and microbial risks is often time consuming and costly. There is a need for efficient predictive tools, based on easily measured variables, for implementing appropriate management strategies and providing advice to local river users on river health and associated risks. Using the Hawkesbury-Nepean River system in New South Wales, Australia as a case study, a stepwise discriminant function analysis was employed to develop two predictive models, one for river eutrophication risk and the other for microbial risk. The models are intended for a preliminary assessment of a river reach, particularly to assess the level of risk (high or low) of an algal bloom and whether the river water is suitable for primary contact activities such as swimming. The input variables for both models included dissolved oxygen saturation and turbidity, while the eutrophication risk model included temperature as an additional variable. When validated with an independent data set, both models predicted the observed risk category correctly in two out of three instances. Since the models developed in this study use only two or three easy-to-measure variables, their application can help in rapid assessment of river conditions, offer potential cost savings in river monitoring programs and assist in providing timely advice to the community and other users on particular aspects of river use.
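
    A hedged sketch of a two-class discriminant model on the named inputs, with synthetic stand-ins for dissolved oxygen saturation, turbidity and temperature; the label rule and data are invented:

    ```python
    # Linear discriminant analysis classifying river reaches as high/low risk.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(11)
    n = 400
    X = np.column_stack([
        rng.normal(90, 15, n),   # dissolved oxygen saturation (%)
        rng.normal(20, 8, n),    # turbidity (NTU)
        rng.normal(22, 4, n),    # temperature (deg C)
    ])
    # Toy label: high risk when DO saturation and temperature are both elevated
    y = ((X[:, 0] > 100) & (X[:, 2] > 22)).astype(int)

    lda = LinearDiscriminantAnalysis().fit(X[:300], y[:300])
    print("held-out accuracy:", lda.score(X[300:], y[300:]))
    ```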

  5. Prediction models for cardiovascular disease risk in the general population: systematic review

    PubMed Central

    Hooft, Lotty; Schuit, Ewoud; Debray, Thomas P A; Collins, Gary S; Tzoulaki, Ioanna; Lassale, Camille M; Siontis, George C M; Chiocchia, Virginia; Roberts, Corran; Schlüssel, Michael Maia; Gerry, Stephen; Black, James A; Heus, Pauline; van der Schouw, Yvonne T; Peelen, Linda M; Moons, Karel G M

    2016-01-01

    Objective To provide an overview of prediction models for risk of cardiovascular disease (CVD) in the general population. Design Systematic review. Data sources Medline and Embase until June 2013. Eligibility criteria for study selection Studies describing the development or external validation of a multivariable model for predicting CVD risk in the general population. Results 9965 references were screened, of which 212 articles were included in the review, describing the development of 363 prediction models and 473 external validations. Most models were developed in Europe (n=167, 46%), predicted risk of fatal or non-fatal coronary heart disease (n=118, 33%) over a 10 year period (n=209, 58%). The most common predictors were smoking (n=325, 90%) and age (n=321, 88%), and most models were sex specific (n=250, 69%). Substantial heterogeneity in predictor and outcome definitions was observed between models, and important clinical and methodological information were often missing. The prediction horizon was not specified for 49 models (13%), and for 92 (25%) crucial information was missing to enable the model to be used for individual risk prediction. Only 132 developed models (36%) were externally validated and only 70 (19%) by independent investigators. Model performance was heterogeneous and measures such as discrimination and calibration were reported for only 65% and 58% of the external validations, respectively. Conclusions There is an excess of models predicting incident CVD in the general population. The usefulness of most of the models remains unclear owing to methodological shortcomings, incomplete presentation, and lack of external validation and model impact studies. Rather than developing yet another similar CVD risk prediction model, in this era of large datasets, future research should focus on externally validating and comparing head-to-head promising CVD risk models that already exist, on tailoring or even combining these models to local

  6. Formation and reduction of carcinogenic furan in various model systems containing food additives.

    PubMed

    Kim, Jin-Sil; Her, Jae-Young; Lee, Kwang-Geun

    2015-12-15

    The aim of this study was to analyse and reduce furan in various model systems. Furan model systems consisting of monosaccharides (0.5 M glucose and ribose), amino acids (0.5 M alanine and serine) and/or 1.0 M ascorbic acid were heated at 121°C for 25 min. The effects of food additives (each 0.1 M) such as metal ions (iron sulphate, magnesium sulphate, zinc sulphate and calcium sulphate), antioxidants (BHT and BHA), and sodium sulphite on the formation of furan were measured. The level of furan formed in the model systems was 6.8-527.3 ng/ml. The level of furan in the glucose/serine and glucose/alanine model systems increased by 7-674% when food additives were added. In contrast, the level of furan decreased by 18-51% in the Maillard reaction model systems that included ribose and alanine/serine with food additives, except zinc sulphate.

  7. NB-PLC channel modelling with cyclostationary noise addition & OFDM implementation for smart grid

    NASA Astrophysics Data System (ADS)

    Thomas, Togis; Gupta, K. K.

    2016-03-01

    Power line communication (PLC) technology can be a viable solution for future ubiquitous networks because it provides a cheaper alternative to other wired technologies currently used for communication. In smart grids, PLC is used to support low-rate communication on the low voltage (LV) distribution network. In this paper, we propose a channel model for narrowband (NB) PLC in the frequency range 5 kHz to 500 kHz using ABCD parameters with cyclostationary noise addition. The behaviour of the channel was studied by adding an 11 kV/230 V transformer and by varying the load and load location. Bit error rate (BER) versus signal-to-noise ratio (SNR) was plotted for the proposed model employing OFDM. Our simulation results based on the proposed channel model show acceptable performance in terms of bit error rate versus signal-to-noise ratio, enabling the communication required for smart grid applications.
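
    A minimal sketch of the ABCD-parameter step: cascading two-port matrices for a series line section and a shunt load, then reading off the voltage transfer into the receiver; all element values are invented:

    ```python
    # Cascade ABCD (transmission) matrices and compute H(f) = V2/V1 = ZL/(A*ZL + B).
    import numpy as np

    def line_section(z_series):
        """ABCD matrix of a series impedance."""
        return np.array([[1.0, z_series], [0.0, 1.0]], dtype=complex)

    def shunt_load(y_shunt):
        """ABCD matrix of a shunt admittance."""
        return np.array([[1.0, 0.0], [y_shunt, 1.0]], dtype=complex)

    freqs = np.linspace(5e3, 500e3, 200)
    h = []
    for f in freqs:
        w = 2 * np.pi * f
        abcd = line_section(0.5 + 1j * w * 2e-6) @ shunt_load(1 / (50 + 1j * w * 1e-7))
        a, b = abcd[0, 0], abcd[0, 1]
        zl = 50.0                      # receiver impedance
        h.append(zl / (a * zl + b))    # voltage transfer into the load
    print(f"|H| range: {min(abs(x) for x in h):.3f} to {max(abs(x) for x in h):.3f}")
    ```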

  8. The OPTIONS model of sexual risk assessment for adolescents.

    PubMed

    Lusczakoski, Kathryn D; Rue, Lisa A

    2012-03-01

    Typically, clinical evaluation of an adolescent's sexual risk is based on inquiring about past sexual activity, an approach limited by its omission of the adolescent's cognitive decision making regarding past sexual decisions. This study describes the novel OPTIONS framework for assessing adolescent sexual risk, comprising three general categories of risk (primary, secondary, and tertiary), which is designed to overcome the limitations of action-based risk assessment and improve practitioners' ability to assess levels of sexual risk. A convenience sample of 201 older adolescents (18-19 years of age) completed an online version of the Relationship Options Survey (ROS), designed to measure the OPTIONS sexual risk assessment. Bivariate correlations among the subscales functioned in the hypothesized manner, with all correlations statistically significant. Using the OPTIONS model, 22.4% of participants were classified as high risk primary, 7.0% as high risk secondary, and 27.4% as high risk tertiary. The study provides preliminary evidence for the OPTIONS model of sexual risk assessment, which offers a more tailored evaluation by including cognitive decisions regarding an adolescent's sexual actions.

  9. A Risk Score with Additional Four Independent Factors to Predict the Incidence and Recovery from Metabolic Syndrome: Development and Validation in Large Japanese Cohorts

    PubMed Central

    Obokata, Masaru; Negishi, Kazuaki; Ohyama, Yoshiaki; Okada, Haruka; Imai, Kunihiko; Kurabayashi, Masahiko

    2015-01-01

    Background: Although many risk factors for metabolic syndrome (MetS) have been reported, there is no clinical score that predicts its incidence. The purposes of this study were to create and validate a risk score for predicting both incidence of and recovery from MetS in a large cohort. Methods: Subjects without MetS at enrollment (n = 13,634) were randomly divided into two groups and followed to record incidence of MetS. We also examined recovery from MetS in the remaining 2,743 individuals with prevalent MetS. Results: During a median follow-up of 3.0 years, 878 subjects in the derivation cohort and 757 in the validation cohort developed MetS. Multiple logistic regression analysis identified 12 independent variables from the derivation cohort, and an initial score for subsequent MetS was created, which showed good discrimination in both the derivation (c-statistic 0.82) and validation cohorts (0.83). The predictability of the initial score for recovery from MetS was tested in the 2,743 subjects with prevalent MetS (906 of whom recovered), where nine variables (including age, sex, γ-glutamyl transpeptidase, uric acid and five MetS diagnostic criteria constituents) remained significant. The final score was then created using the nine variables. This score significantly predicted both recovery from MetS (c-statistic 0.70, p<0.001, 78% sensitivity and 54% specificity) and incident MetS (c-statistic 0.80), with an incremental discriminative ability over the model derived from the five factors used in the diagnosis of MetS (continuous net reclassification improvement: 0.35, p<0.001; integrated discrimination improvement: 0.01, p<0.001). Conclusions: We identified four additional independent risk factors associated with subsequent MetS, and developed and validated a risk score to predict both incidence of and recovery from MetS. PMID:26230621

  10. Student Choices: Using a Competing Risks Model of Survival Analysis.

    ERIC Educational Resources Information Center

    Denson, Katy; Schumacker, Randall E.

    By using a competing risks model, survival analysis methods can be extended to predict which of several mutually exclusive outcomes students will choose based on predictor variables, thereby ascertaining if the profile of risk differs across groups. The paper begins with a brief introduction to logistic regression and some of the basic concepts of…

  11. Risk Prediction Models for Other Cancers or Multiple Sites

    Cancer.gov

    Developing statistical models that estimate the probability of developing other cancers, or cancers at multiple sites, over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk.

  12. Back-end Science Model Integration for Ecological Risk Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  13. Risk prediction models for hepatocellular carcinoma in different populations

    PubMed Central

    Ma, Xiao; Yang, Yang; Tu, Hong; Gao, Jing; Tan, Yu-Ting; Zheng, Jia-Li; Bray, Freddie; Xiang, Yong-Bing

    2016-01-01

    Hepatocellular carcinoma (HCC) is a malignant disease with limited therapeutic options due to its aggressive progression, and treating HCC patients places a heavy burden on most low- and middle-income countries. Accurate HCC risk predictions can now inform decisions on the need for HCC surveillance and antiviral therapy. HCC risk prediction models based on the major risk factors of HCC are useful for providing adequate surveillance strategies to individuals with different risk levels. Several risk prediction models for estimating HCC incidence in cohorts of different populations have been presented recently, using simple, efficient, and ready-to-use parameters. Moreover, using predictive scoring systems to assess HCC development can suggest improvements to clinical and public health approaches, making them more cost- and effort-effective, and can support personalized surveillance programs according to risk stratification. In this review, the features of HCC risk prediction models across different populations are summarized, and the perspectives of these models are discussed. PMID:27199512

  14. Validation of a novel air toxic risk model with air monitoring.

    PubMed

    Pratt, Gregory C; Dymond, Mary; Ellickson, Kristie; Thé, Jesse

    2012-01-01

    Three modeling systems were used to estimate human health risks from air pollution: two versions of MNRiskS (Minnesota Risk Screening) and the USEPA National Air Toxics Assessment (NATA). MNRiskS is a unique cumulative risk modeling system used to assess risks from multiple air toxics, sources, and pathways on a local to statewide scale. In addition, ambient outdoor air monitoring data were available for estimating risks and for comparison with the modeled estimates of air concentrations. The highest air concentrations and estimated risks were generally found in the Minneapolis-St. Paul metropolitan area and the lowest risks in undeveloped rural areas. Emissions from mobile and area (nonpoint) sources created greater estimated risks than emissions from point sources. The highest cancer risks were via ingestion pathway exposures to dioxins and related compounds. Diesel particles, acrolein, and formaldehyde created the highest estimated inhalation health impacts. Model-estimated air concentrations were generally highest for NATA and lowest for the AERMOD version of MNRiskS. This validation study showed reasonable agreement between available measurements and model predictions, although results varied among pollutants, and predictions were often lower than measurements. The results increased confidence in identifying pollutants, pathways, geographic areas, sources, and receptors of potential concern, and thus provide a basis for informing pollution reduction strategies and focusing efforts on specific pollutants (diesel particles, acrolein, and formaldehyde), geographic areas (urban centers), and source categories (nonpoint sources). The results also heighten concerns about risks from food chain exposures to dioxins and PAHs. Risk estimates were sensitive to variations in the methodologies for treating emissions, dispersion, deposition, exposure, and toxicity.

  16. Generalized neurofuzzy network modeling algorithms using Bézier-Bernstein polynomial functions and additive decomposition.

    PubMed

    Hong, X; Harris, C J

    2000-01-01

    This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems, based upon basis functions that are Bézier-Bernstein polynomial functions. The algorithm is generalized in that it copes with n-dimensional inputs by utilising an additive decomposition construction to overcome the curse of dimensionality associated with high n. The construction algorithm also introduces univariate Bézier-Bernstein polynomial functions for the completeness of the generalized procedure. Like B-spline-expansion-based neurofuzzy systems, Bézier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support, and interpretability of the basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and Delaunay input-space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. The new modeling network is based on an additive decomposition approach together with two separate basis function formation approaches for the univariate and bivariate Bézier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data-based modeling approach.
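
    As an illustration of the two ingredients named in this record - a Bernstein polynomial basis and an additive decomposition fitted by least squares - the following minimal Python sketch is offered. The data, degree, and target function are invented for the example; this is not the authors' code.

      import numpy as np
      from scipy.special import comb

      def bernstein_basis(x, n):
          # Degree-n Bernstein basis on [0, 1]; returns shape (len(x), n + 1).
          # The columns are nonnegative and sum to one at every x, so each can
          # be read as a fuzzy membership function.
          x = np.asarray(x, dtype=float)[:, None]
          i = np.arange(n + 1)[None, :]
          return comb(n, i) * x**i * (1.0 - x)**(n - i)

      # Additive decomposition: approximate y = f(x1, x2) by f1(x1) + f2(x2),
      # each component expanded in a univariate Bernstein basis, with all
      # weights fitted jointly by ordinary least squares.
      rng = np.random.default_rng(0)
      X = rng.uniform(size=(200, 2))
      y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2   # toy target

      degree = 5
      Phi = np.hstack([bernstein_basis(X[:, j], degree) for j in range(2)])
      w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
      print("training RMSE:", float(np.sqrt(np.mean((Phi @ w - y) ** 2))))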

  17. Submission Form for Peer-Reviewed Cancer Risk Prediction Models

    Cancer.gov

    If you have information about a peer-reviewed cancer risk prediction model that you would like to be considered for inclusion on this list, submit as much information as possible through the form on this page.

  18. Lymphatic filariasis transmission risk map of India, based on a geo-environmental risk model.

    PubMed

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Subramanian, Swaminathan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-09-01

    The strategy adopted by a global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy by introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Therefore, identification of LF transmission risk areas in the entire country has become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and endemicity (historically known) of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was found to be correct in 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM developed on geographic information system (GIS) platform is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful for the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas.
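
    The quantified SFTRI-endemicity relationship is a logistic regression. A minimal Python sketch of that step, with synthetic data standing in for the survey (variable names and numbers are invented):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Synthetic SFTRI values and historically known endemic status (1/0)
      # for surveyed districts.
      rng = np.random.default_rng(1)
      sftri = rng.uniform(0, 10, size=300)
      endemic = (rng.uniform(size=300) < 1 / (1 + np.exp(-(sftri - 5)))).astype(int)

      model = LogisticRegression().fit(sftri.reshape(-1, 1), endemic)

      # Predicted probability of endemicity for an unsurveyed district:
      p = model.predict_proba([[6.2]])[0, 1]
      print(f"predicted transmission risk: {p:.2f}")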

  19. A model for assessing the risk of human trafficking on a local level

    NASA Astrophysics Data System (ADS)

    Colegrove, Amanda

    Human trafficking is a human rights violation that is difficult to quantify. Models for estimating the number of victims of trafficking presented by previous researchers depend on inconsistent, poor quality data. As an intermediate step to help current efforts by nonprofits to combat human trafficking, this project presents a model that is not dependent on quantitative data specific to human trafficking, but rather profiles the risk of human trafficking at the local level through causative factors. Businesses, indicated by the literature, were weighted based on the presence of characteristics that increase the likelihood of trafficking in persons. The mean risk was calculated by census tract to reveal the multiplicity of risk levels in both rural and urban settings. Results indicate that labor trafficking may be a more diffuse problem in Missouri than sex trafficking. Additionally, spatial patterns of risk remained largely the same regardless of adjustments made to the model.
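
    The aggregation step described here - business-level risk weights averaged by census tract - is a plain group-by-and-mean. A hypothetical pandas sketch (column names and weights invented):

      import pandas as pd

      # Each business carries a risk weight derived from the characteristics
      # indicated by the literature; tract-level risk is the mean weight of
      # the businesses located in that tract.
      businesses = pd.DataFrame({
          "census_tract": ["29019-0001", "29019-0001", "29019-0002", "29019-0002"],
          "risk_weight":  [0.8, 0.3, 0.5, 0.1],
      })
      tract_risk = businesses.groupby("census_tract")["risk_weight"].mean()
      print(tract_risk)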

  20. A Hybrid Tsunami Risk Model for Japan

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A. V.; Smith, D. F.; Khater, M.; Khemici, O.; Betov, B.; Scott, J.

    2014-12-01

    Around the margins of the Pacific Ocean, denser oceanic plates slipping under continental plates cause subduction earthquakes generating large tsunami waves. The subducting Pacific and Philippine Sea plates create damaging interplate earthquakes followed by huge tsunami waves. It was a rupture of the Japan Trench subduction zone (JTSZ) and the resultant M9.0 Tohoku-Oki earthquake that caused the unprecedented tsunami along the Pacific coast of Japan on March 11, 2011. EQECAT's Japan Earthquake model is a fully probabilistic model which includes a seismo-tectonic model describing the geometries, magnitudes, and frequencies of all potential earthquake events; a ground motion model; and a tsunami model. Within the much larger set of all modeled earthquake events, fault rupture parameters for about 24000 stochastic and 25 historical tsunamigenic earthquake events are defined to simulate tsunami footprints using the numerical tsunami model COMCOT. A hybrid approach using COMCOT simulated tsunami waves is used to generate inundation footprints, including the impact of tides and flood defenses. Modeled tsunami waves of major historical events are validated against observed data. Modeled tsunami flood depths on 30 m grids together with tsunami vulnerability and financial models are then used to estimate insured loss in Japan from the 2011 tsunami. The primary direct report of damage from the 2011 tsunami is in terms of the number of buildings damaged by municipality in the tsunami affected area. Modeled loss in Japan from the 2011 tsunami is proportional to the number of buildings damaged. A 1000-year return period map of tsunami waves shows high hazard along the west coast of southern Honshu, on the Pacific coast of Shikoku, and on the east coast of Kyushu, primarily associated with major earthquake events on the Nankai Trough subduction zone (NTSZ). The highest tsunami hazard of more than 20m is seen on the Sanriku coast in northern Honshu, associated with the JTSZ.
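
    On the hazard side, return-period maps of this kind are built cell by cell from simulated annual maxima. A schematic numpy sketch, with synthetic Gumbel draws standing in for the event-set output at one grid cell:

      import numpy as np

      # Hypothetical simulated annual-maximum tsunami wave heights (m) at a
      # single coastal grid cell.
      rng = np.random.default_rng(42)
      annual_max = rng.gumbel(loc=2.0, scale=1.5, size=100_000)

      # The T-year return level is the (1 - 1/T) quantile of the annual maxima.
      for T in (100, 475, 1000):
          level = np.quantile(annual_max, 1 - 1 / T)
          print(f"{T:5d}-year return level: {level:5.1f} m")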

  1. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    SciTech Connect

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate the significant adverse effects of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; (2) directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited; (3) identifying the key processes and chemical and/or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and (4) predicting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.

  2. An integrated risk management model for source water protection areas.

    PubMed

    Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien

    2012-10-17

    Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that draw water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for maintaining drinking water that is safe for humans. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water quality. In this study, a quantitative risk model that uses hazard quotients for each water quality parameter was combined with a qualitative risk model that uses the relative risk level of potential pollution events, in order to characterize the current condition and potential risk of watersheds providing drinking water. In a case study of the Taipei Source Water Area in northern Taiwan, total coliforms and total phosphorus were the top two pollutants of concern. Intensive tea-growing and recreational activities around the riparian zone may contribute the greatest pollution to the watershed. Our risk assessment tool may be enhanced by developing, recording, and updating information on pollution sources in the water supply watersheds. Moreover, management authorities could use the resulting information to create watershed risk management plans.
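
    The quantitative half of the model rests on hazard quotients, i.e., a concentration divided by its benchmark, with HQ > 1 flagging a parameter of potential concern. A minimal sketch with invented numbers:

      # Hazard-quotient screening for water quality parameters.
      measured = {"total coliforms": 240.0, "total phosphorus": 0.05, "nitrate": 2.0}
      benchmark = {"total coliforms": 100.0, "total phosphorus": 0.02, "nitrate": 10.0}

      for param, conc in measured.items():
          hq = conc / benchmark[param]
          print(f"{param:18s} HQ = {hq:4.1f} ({'concern' if hq > 1 else 'ok'})")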

  3. Modeling the Risks of Geothermal Development

    SciTech Connect

    Golabi, K.; Nair, K.; Rothstein, S.; Sioshansi, F.

    1980-12-16

    Geothermal energy has emerged as a promising energy source in recent years and has received serious attention from developers and potential users. Despite the advantages of this resource, such as potential cost competitiveness, reliability, public acceptance, etc., the commercial development and use of geothermal energy has been slow. Impediments to the development of this resource include technical, financial, environmental and regulatory uncertainties. Since geothermal power is unique in that the generation facility is tied to a single fuel at a single site, these uncertainties are of particular concern to utility companies. The areas of uncertainty and potential risks are well known. This paper presents a method for quantifying the relevant uncertainties and a framework for aggregating the risks through the use of submodels. The objective submodels can be combined with subjective probabilities (when sufficient data is not available) to yield a probability distribution over a single criterion (levelized busbar cost) that can be used to compare the desirability of geothermal power development with respect to other alternatives.
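
    The aggregation idea - propagate uncertain cost submodels through Monte Carlo into a distribution of levelized busbar cost - can be sketched as below. The distributions and figures are illustrative assumptions, not the paper's inputs:

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000
      capital = rng.lognormal(mean=np.log(3000), sigma=0.25, size=n)  # $/kW
      capacity_factor = rng.beta(8, 2, size=n)                        # utilization
      om_cost = rng.normal(1.5, 0.3, size=n)                          # cents/kWh

      fixed_charge_rate = 0.12
      kwh_per_kw_year = 8760 * capacity_factor
      levelized = 100 * fixed_charge_rate * capital / kwh_per_kw_year + om_cost

      print(f"median: {np.median(levelized):.1f} cents/kWh, "
            f"90th percentile: {np.percentile(levelized, 90):.1f} cents/kWh")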

  4. Usefulness and limitations of global flood risk models

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  5. Use of additive technologies for practical working with complex models for foundry technologies

    NASA Astrophysics Data System (ADS)

    Olkhovik, E.; Butsanets, A. A.; Ageeva, A. A.

    2016-07-01

    The article presents the results of research into the application of additive technology (3D printing) for developing geometrically complex models of cast parts. Investment casting is a well-known and widely used technology for the production of complex parts. The work proposes the use of 3D printing to manufacture model parts, which are later removed by thermal destruction. Traditional methods of producing tooling for investment casting rely either on manual labor, which has problems with dimensional accuracy, or on CNC machining, which is less commonly used; such a scheme has low productivity and demands considerable time. We offer an alternative method that consists in printing the main components on a 3D printer (in PLA and ABS) and subsequently producing casting models from them. The article considers the main technological methods and discusses their problems. The dimensional accuracy of the models, in comparison with conventional investment casting technology, is considered as the main aspect.

  6. Generalized Additive Mixed-Models for Pharmacology Using Integrated Discrete Multiple Organ Co-Culture

    PubMed Central

    Ingersoll, Thomas; Cole, Stephanie; Madren-Whalley, Janna; Booker, Lamont; Dorsey, Russell; Li, Albert; Salem, Harry

    2016-01-01

    Integrated Discrete Multiple Organ Co-culture (IDMOC) is emerging as an in-vitro alternative to in-vivo animal models for pharmacology studies. IDMOC allows dose-response relationships to be investigated at the tissue and organoid levels, yet, these relationships often exhibit responses that are far more complex than the binary responses often measured in whole animals. To accommodate departure from binary endpoints, IDMOC requires an expansion of analytic techniques beyond simple linear probit and logistic models familiar in toxicology. IDMOC dose-responses may be measured at continuous scales, exhibit significant non-linearity such as local maxima or minima, and may include non-independent measures. Generalized additive mixed-modeling (GAMM) provides an alternative description of dose-response that relaxes assumptions of independence and linearity. We compared GAMMs to traditional linear models for describing dose-response in IDMOC pharmacology studies. PMID:27110941

  7. Evidence of thermal additivity during short laser pulses in an in vitro retinal model

    NASA Astrophysics Data System (ADS)

    Denton, Michael L.; Tijerina, Amanda J.; Dyer, Phillip N.; Oian, Chad A.; Noojin, Gary D.; Rickman, John M.; Shingledecker, Aurora D.; Clark, Clifton D.; Castellanos, Cherry C.; Thomas, Robert J.; Rockwell, Benjamin A.

    2015-03-01

    Laser damage thresholds were determined for exposure to 2.5-ms 532-nm pulses in an established in vitro retinal model. Single and multiple pulses (10, 100, 1000) were delivered to the cultured cells at three different pulse repetition frequency (PRF) values, and overt damage (membrane breach) was scored 1 hr post laser exposure. Trends in the damage data within and across the PRF range identified significant thermal additivity as PRF was increased, as evidenced by drastically reduced threshold values (< 40% of single-pulse value). Microthermography data that were collected in real time during each exposure also provided evidence of thermal additivity between successive laser pulses. Using thermal profiles simulated at high temporal resolution, damage threshold values were predicted by an in-house computational model. Our simulated ED50 value for a single 2.5-ms pulse was in very good agreement with experimental results, but ED50 predictions for multiple-pulse trains will require more refinement.
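
    An ED50 in this setting is typically read off a sigmoid fitted to binary damage outcomes. A generic scipy sketch with invented doses and damage fractions (the study's own thresholds came from a more detailed computational model):

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(dose, ed50, slope):
          # Probability of damage as a log-logistic function of dose.
          return 1.0 / (1.0 + np.exp(-slope * (np.log(dose) - np.log(ed50))))

      dose = np.array([0.5, 0.8, 1.0, 1.3, 1.6, 2.0, 2.5])           # J/cm^2
      frac_damaged = np.array([0.0, 0.1, 0.3, 0.5, 0.75, 0.9, 1.0])  # per group

      (ed50, slope), _ = curve_fit(logistic, dose, frac_damaged, p0=(1.2, 3.0))
      print(f"ED50 = {ed50:.2f} J/cm^2")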

  8. QSAR in predictive models for ecological risk assessment

    SciTech Connect

    Passino-Reader, D.R.; Hickey, J.P.

    1994-12-31

    The end use of toxicity and exposure data is risk assessment to determine the probability that receptors experience harmful effects from exposure to environmental contaminants at a site. Determination of processes and development of predictive models precede the collection of data for risk assessment. The presence of hundreds of contaminants at a site and the absence of data for many contaminants lead to the use of QSAR to implement the models. Examples of the use of linear solvation energy relationships (LSER) to provide estimates of aquatic toxicity and exposure endpoints will be provided. Integration of QSAR estimates and measured data must be addressed in the uncertainty analysis accompanying ecological risk assessment.
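
    For reference, LSERs are commonly written in the Kamlet-Taft form below; whether the cited work uses exactly this descriptor set is an assumption:

      \log(1/C) \;=\; c + m\,V + s\,\pi^{*} + a\,\alpha + b\,\beta

    where C is an effect concentration, V a solute volume term, \pi^{*} the dipolarity/polarizability, and \alpha and \beta the hydrogen-bond acidity and basicity; the coefficients c, m, s, a, b are fitted by multiple linear regression.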

  9. A Corrosion Risk Assessment Model for Underground Piping

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies for the underground pipes at ARC.
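
    The Monte Carlo combination of submodels amounts to sampling a limit state. A compact Python sketch under assumed distributions (all parameters invented; not ARC data):

      import numpy as np

      # The segment fails when corrosion has thinned the wall below the
      # thickness the operating stress requires.
      rng = np.random.default_rng(2)
      n = 200_000
      wall0 = rng.normal(6.0, 0.3, n)                             # initial wall (mm)
      rate = rng.lognormal(mean=np.log(0.08), sigma=0.5, size=n)  # mm/year
      required = rng.normal(3.5, 0.2, n)                          # needed for stress (mm)

      years = 25.0
      p_fail = np.mean(wall0 - rate * years < required)
      print(f"estimated failure probability after {years:.0f} years: {p_fail:.3%}")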

  10. Model for Assembly Line Re-Balancing Considering Additional Capacity and Outsourcing to Face Demand Fluctuations

    NASA Astrophysics Data System (ADS)

    Samadhi, TMAA; Sumihartati, Atin

    2016-02-01

    The most critical stage in a garment factory is the sewing process, because it generally consists of a large number of operations and a large number of sewing machines for each operation. It therefore requires a balancing method that can assign tasks to workstations with balanced workloads. Many studies on assembly line balancing assume a new assembly line, but in reality re-balancing is needed because of demand fluctuations and demand growth. To cope with such fluctuating demand, additional capacity can be obtained by investing in spare sewing machines and by paying for sewing services through outsourcing. This study develops an assembly line balancing (ALB) model for an existing line to cope with fluctuating demand. Capacity redesign is decided when fluctuating demand exceeds the available capacity, through a combination of investment in new machines and outsourcing, while minimizing the cost of idle capacity in the future. The objective of the model is to minimize the total cost of the assembly line, consisting of operating costs, machine costs, added-capacity costs, losses due to idle capacity, and outsourcing costs. The model is formulated as an integer program. It is tested on one year of demand data for a line with 41 existing sewing machines. The results show that additional capacity of up to 76 machines is required when average demand increases by 60%, given equal cost parameters.
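
    A toy version of the capacity-expansion integer program, with hypothetical costs, capacities and an outsourcing limit (PuLP is used only to make the formulation concrete):

      from pulp import LpProblem, LpMinimize, LpVariable, value

      demand = 5200            # units per period
      cap_per_machine = 100    # units per machine per period
      existing = 41            # machines already on the line

      buy = LpVariable("new_machines", lowBound=0, cat="Integer")
      out = LpVariable("outsourced_units", lowBound=0)

      prob = LpProblem("line_rebalancing", LpMinimize)
      prob += 900 * buy + 0.8 * out                                # machine + outsourcing cost
      prob += cap_per_machine * (existing + buy) + out >= demand   # meet demand
      prob += out <= 800                                           # contracted outsourcing cap

      prob.solve()
      print("buy:", value(buy), "outsource:", value(out))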

  11. A DNA-hairpin model for repeat-addition processivity in telomere synthesis.

    PubMed

    Yang, Wei; Lee, Young-Sam

    2015-11-01

    We propose a DNA-hairpin model for the processivity of telomeric-repeat addition. Concomitantly with template-RNA translocation after each repeat synthesis, the complementary DNA repeat, for example, AGGGTT, loops out in a noncanonical base-paired hairpin, thus freeing the RNA template for the next round of repeat synthesis. The DNA hairpin is temporarily stabilized by telomerase and the incoming dGTP but becomes realigned for processive telomere synthesis.

  12. A Model for Risk Assessment in Health Care.

    PubMed

    Prijatelj, Vesna; Rajkovič, Vladislav; Šušteršič, Olga

    2016-01-01

    The purpose of our research is to reduce risks and hence prevent errors in the health care process. The aim is to design an organizational information model that uses error prevention methods for risk assessment in a clinical setting. The model is based on selected indicators of quality nursing care, derived from well-known theoretical and practical models combined with experience in Slovenian health care. The proposed organizational information model and software solution have a significant impact on professional attention, communication and information, critical thinking, experience and knowledge. PMID:27332383

  13. Rain water transport and storage in a model sandy soil with hydrogel particle additives.

    PubMed

    Wei, Y; Durian, D J

    2014-10-01

    We study rain water infiltration and drainage in a dry model sandy soil with superabsorbent hydrogel particle additives by measuring the mass of retained water for non-ponding rainfall using a self-built 3D laboratory set-up. In the pure model sandy soil, the retained water curve measurements indicate that instead of a stable horizontal wetting front that grows downward uniformly, a narrow fingered flow forms under the top layer of water-saturated soil. This rain water channelization phenomenon not only further reduces the available rain water in the plant root zone, but also affects the efficiency of soil additives, such as superabsorbent hydrogel particles. Our studies show that the shape of the retained water curve for a soil packing with hydrogel particle additives strongly depends on the location and the concentration of the hydrogel particles in the model sandy soil. By carefully choosing the particle size and distribution methods, we may use the swollen hydrogel particles to modify the soil pore structure, to clog or extend the water channels in sandy soils, or to build water reservoirs in the plant root zone.

  14. A triple risk model for unexplained late stillbirth

    PubMed Central

    2014-01-01

    Background The triple risk model for sudden infant death syndrome (SIDS) has been useful in understanding its pathogenesis. Risk factors for late stillbirth are well established, especially relating to maternal and fetal wellbeing. Discussion We propose a similar triple risk model for unexplained late stillbirth. The model proposed by us results from the interplay of three groups of factors: (1) maternal factors (such as maternal age, obesity, smoking), (2) fetal and placental factors (such as intrauterine growth retardation, placental insufficiency), and (3) a stressor (such as venocaval compression from maternal supine sleep position, sleep disordered breathing). We argue that the risk factors within each group in themselves may be insufficient to cause the death, but when they interrelate may produce a lethal combination. Summary Unexplained late stillbirth occurs when a fetus who is somehow vulnerable dies as a result of encountering a stressor and/or maternal condition in a combination which is lethal for them. PMID:24731396

  15. Risk Prediction Models for Mortality in Community-Acquired Pneumonia: A Systematic Review

    PubMed Central

    Loke, Yoon K.; Myint, Phyo Kyaw

    2013-01-01

    Background. Several models have been developed to predict the risk of mortality in community-acquired pneumonia (CAP). This study aims to systematically identify and evaluate the performance of published risk prediction models for CAP. Methods. We searched MEDLINE, EMBASE, and the Cochrane library in November 2011 for initial derivation and validation studies of models that predict pneumonia mortality. We aimed to present the comparative usefulness of their mortality prediction. Results. We identified 20 different published risk prediction models for mortality in CAP. Four models relied on clinical variables that could be assessed in community settings, with the two validated models BTS1 and CRB-65 showing fairly similar balanced accuracy levels (0.77 and 0.72, resp.), while CRB-65 had an AUROC of 0.78. Nine models required laboratory tests in addition to clinical variables, and the best performance levels amongst the validated models were those of CURB and CURB-65 (balanced accuracy 0.73 and 0.71, resp.), with CURB-65 having an AUROC of 0.79. The PSI (AUROC 0.82) was the only validated model with good discriminative ability among the four that relied on clinical, laboratory, and radiological variables. Conclusions. There is no convincing evidence that other risk prediction models improve upon the well-established CURB-65 and PSI models. PMID:24228253
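
    The CURB-65 score mentioned here assigns one point per criterion; a direct transcription:

      # CURB-65: Confusion, Urea > 7 mmol/L, Respiratory rate >= 30/min,
      # low Blood pressure (systolic < 90 or diastolic <= 60 mmHg), age >= 65.
      # CRB-65 is the same score without the urea criterion.
      def curb65(confusion, urea_mmol_l, resp_rate, sbp, dbp, age):
          return sum([
              bool(confusion),
              urea_mmol_l > 7,
              resp_rate >= 30,
              sbp < 90 or dbp <= 60,
              age >= 65,
          ])

      print(curb65(confusion=False, urea_mmol_l=8.2, resp_rate=32,
                   sbp=85, dbp=55, age=78))   # -> 4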

  16. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, A.; Edwards, T.C.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
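
    At its core, a GAM smooth term is a penalized basis expansion. The following self-contained numpy sketch fits one smooth via a truncated-power basis with a ridge-type penalty - a deliberately simplified stand-in for production GAM software:

      import numpy as np

      rng = np.random.default_rng(3)
      x = np.sort(rng.uniform(0, 1, 200))
      y = np.sin(3 * np.pi * x) + rng.normal(0, 0.3, 200)

      # Basis: intercept, linear term, and truncated cubic terms at 12 knots.
      knots = np.linspace(0.05, 0.95, 12)
      B = np.column_stack([np.ones_like(x), x] +
                          [np.clip(x - k, 0, None) ** 3 for k in knots])

      lam = 1e-3                               # smoothing parameter
      P = np.eye(B.shape[1]); P[:2, :2] = 0    # leave the linear part unpenalized
      beta = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
      print("residual SD:", float(np.std(y - B @ beta).round(3)))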

  17. Default risk modeling beyond the first-passage approximation: Extended Black-Cox model

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.; Shokhirev, Nikolai V.

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even if the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with a radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for the derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The closed formulas obtained fit the historical data on global corporate defaults well and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves the valuation of credit risk at short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.
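
    For context, the classical first-passage (absorbing-boundary) default probability that this model generalizes is, with initial log-distance to the barrier x_0 = \ln(V_0/B), drift \mu and volatility \sigma,

      P_{\mathrm{def}}(t) \;=\; \Phi\!\left(\frac{-x_0 - \mu t}{\sigma\sqrt{t}}\right)
      \;+\; e^{-2\mu x_0/\sigma^{2}}\,\Phi\!\left(\frac{-x_0 + \mu t}{\sigma\sqrt{t}}\right),

    where \Phi is the standard normal CDF. This is the standard limit, not the paper's extended formula: the radiation boundary condition replaces instantaneous absorption at x = 0 with a finite default rate, which is what lifts short-maturity credit spreads away from zero.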

  18. Forest fire risk assessment in Sweden using climate model data: bias correction and future changes

    NASA Astrophysics Data System (ADS)

    Yang, W.; Gardelin, M.; Olsson, J.; Bosshard, T.

    2015-01-01

    As the risk for a forest fire is largely influenced by weather, evaluating its tendency under a changing climate becomes important for management and decision making. Currently, biases in climate models make it difficult to realistically estimate the future climate and the consequent impact on fire risk. A distribution-based scaling (DBS) approach was developed as a post-processing tool that corrects systematic biases in climate model outputs. In this study, we used two projections, one driven by historical reanalysis (ERA40) and one from a global climate model (ECHAM5) for future projection, both dynamically downscaled by a regional climate model (RCA3). The effects of the post-processing tool on relative humidity and wind speed were studied in addition to the primary variables, precipitation and temperature. Finally, the Canadian Fire Weather Index system was used to evaluate the influence of changing meteorological conditions on the moisture content of fuel layers and the fire-spread risk. Forest fire risk computed with DBS-corrected data is shown to reflect the risk computed from observations better than that computed from raw climate model outputs. For future periods, southern Sweden is likely to have a higher fire risk than today, whereas northern Sweden will have a lower risk of forest fire.
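
    DBS belongs to the quantile-mapping family of bias corrections. An empirical-quantile sketch of the idea follows; the synthetic gamma draws are placeholders, and the actual DBS method fits parametric distributions rather than raw quantiles:

      import numpy as np

      def quantile_map(model_ctrl, obs_ctrl, model_future):
          # Map each raw model value to the observed value at the same
          # cumulative probability, estimated over a common control period.
          q = np.linspace(0, 1, 101)
          return np.interp(model_future,
                           np.quantile(model_ctrl, q),
                           np.quantile(obs_ctrl, q))

      rng = np.random.default_rng(5)
      obs = rng.gamma(2.0, 3.0, 5000)   # observed precipitation (synthetic)
      mod = rng.gamma(1.5, 4.5, 5000)   # biased model, control period
      fut = rng.gamma(1.5, 5.0, 5000)   # model, future period

      corrected = quantile_map(mod, obs, fut)
      print(f"raw mean {fut.mean():.1f} -> corrected mean {corrected.mean():.1f}")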

  19. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping re-allocate healthcare resources linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources, because aged people require more specialised medical care, notably due to cancer. We propose a method for monitoring changes of cancer incidence in space and time, taking into account two age categories that reflect the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, following the methodology presented in Wood, Generalised Additive Models: An Introduction with R, Chapman and Hall/CRC, 2006. Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase in the incidence of cancer over time and an obvious spatial pattern for people over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for cancer care of the elderly in this sub-region. PMID:23242683

  1. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  2. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  3. A generalized additive model for the spatial distribution of snowpack in the Spanish Pyrenees

    NASA Astrophysics Data System (ADS)

    López-Moreno, J. I.; Nogués-Bravo, D.

    2005-10-01

    A generalized additive model (GAM) was used to model the spatial distribution of snow depth in the central Spanish Pyrenees. Statistically significant non-linear relationships were found between distinct location and topographical variables and the average depth of the April snowpack at 76 snow poles from 1985 to 2000. The joint effect of the predictor variables explained more than 73% of the variance of the dependent variable. The performance of the model was assessed by applying a number of quantitative approaches to the residuals from a cross-validation test. The relatively low estimated errors and the possibility of understanding the processes that control snow accumulation, through the response curves of each independent variable, indicate that GAMs may be a useful tool for interpolating local snow depth or other climate parameters.

  4. Parity Symmetry and Parity Breaking in the Quantum Rabi Model with Addition of Ising Interaction

    NASA Astrophysics Data System (ADS)

    Wang, Qiong; He, Zhi; Yao, Chun-Mei

    2015-04-01

    We explore the possibility of generating a new parity symmetry in the quantum Rabi model after a bias is introduced. In contrast to the mathematical treatment in a previous publication [J. Phys. A 46 (2013) 265302], we consider a physically realistic method that involves an additional spin coupled to the original spin of the quantum Rabi model by an Ising interaction; introducing a bias then breaks the parity symmetry, as well as the scaling behavior of the ground state. The rule can be found that the parity symmetry is broken by introducing a bias and then restored by adding new degrees of freedom. Experimental feasibility of realizing the models under discussion is investigated. Supported by the National Natural Science Foundation of China under Grant Nos. 61475045 and 11347142, and the Natural Science Foundation of Hunan Province, China under Grant No. 2015JJ3092
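
    For orientation, the biased quantum Rabi Hamiltonian is usually written as below, together with the Z2 parity operator that the bias breaks (standard textbook forms, not taken from the paper):

      H \;=\; \omega\, a^{\dagger}a \;+\; \frac{\Delta}{2}\,\sigma_z \;+\; g\,\sigma_x\,(a + a^{\dagger}) \;+\; \varepsilon\,\sigma_x,
      \qquad
      \Pi \;=\; \sigma_z\, e^{i\pi a^{\dagger}a}

    For \varepsilon = 0 one has [H, \Pi] = 0, whereas the bias term \varepsilon\sigma_x anticommutes with \sigma_z and therefore breaks the parity symmetry.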

  5. Evaporation model for beam based additive manufacturing using free surface lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Klassen, Alexander; Scharowsky, Thorsten; Körner, Carolin

    2014-07-01

    Evaporation plays an important role in many technical applications including beam-based additive manufacturing processes, such as selective electron beam or selective laser melting (SEBM/SLM). In this paper, we describe an evaporation model which we employ within the framework of a two-dimensional free surface lattice Boltzmann method. With this method, we solve the hydrodynamics as well as thermodynamics of the molten material taking into account the mass and energy losses due to evaporation and the recoil pressure acting on the melt pool. Validation of the numerical model is performed by measuring maximum melt depths and evaporative losses in samples of pure titanium and Ti-6Al-4V molten by an electron beam. Finally, the model is applied to create processing maps for an SEBM process. The results predict that the penetration depth of the electron beam, which is a function of the acceleration voltage, has a significant influence on evaporation effects.
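
    Evaporation models of this kind commonly use a Hertz-Knudsen mass flux and the associated recoil pressure; typical closed forms from the SLM/SEBM literature are shown below (whether this paper uses exactly these expressions is an assumption):

      \dot m'' \;=\; (1 - \beta_R)\, p_{\mathrm{sat}}(T_s)\,\sqrt{\frac{M}{2\pi R\, T_s}},
      \qquad
      p_{\mathrm{recoil}} \;\approx\; 0.54\, p_{\mathrm{sat}}(T_s),

    where p_sat is the saturation pressure at surface temperature T_s, M the molar mass, R the gas constant, and \beta_R \approx 0.18 a commonly used retro-diffusion coefficient.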

  6. Application of the Beck model to stock markets: Value-at-Risk and portfolio risk assessment

    NASA Astrophysics Data System (ADS)

    Kozaki, M.; Sato, A.-H.

    2008-02-01

    We apply the Beck model, developed for turbulent systems that exhibit scaling properties, to stock markets. Our study reveals that the Beck model elucidates the properties of stock market returns and is applicable to practical use such as Value-at-Risk estimation and portfolio analysis. We perform empirical analysis with daily/intraday data of the S&P500 index return and find that the volatility fluctuation of real markets is consistent with the assumptions of the Beck model: the volatility fluctuates at a much larger time scale than the return itself, and the inverse of the variance, or “inverse temperature”, β obeys a Γ-distribution. As predicted by the Beck model, the distribution of returns is well fitted by the q-Gaussian distribution of Tsallis statistics. The evaluation method of Value-at-Risk (VaR), one of the most significant indicators in risk management, is studied for the q-Gaussian distribution. Our proposed method enables VaR evaluation in consideration of tail risk, which is underestimated by the variance-covariance method. A framework of portfolio risk assessment under the existence of tail risk is considered. We propose a multi-asset model with a single volatility fluctuation shared by all assets, named the single-β model, and empirically examine the agreement between the model and an imaginary portfolio with Dow Jones indices. It turns out that the single-β model gives a good approximation for portfolios composed of assets with non-Gaussian and correlated returns.
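
    A minimal simulation of the superstatistical mechanism described here - Gamma-distributed inverse variance mixing Gaussians into a fat-tailed return distribution - and its effect on VaR (parameters invented):

      import numpy as np

      rng = np.random.default_rng(11)
      n = 1_000_000
      beta = rng.gamma(shape=3.0, scale=1.0 / 3.0, size=n)  # "inverse temperature"
      returns = rng.normal(0.0, 1.0 / np.sqrt(beta))        # Gaussian given beta

      # 99% Value-at-Risk as the loss quantile of the simulated returns,
      # versus the variance-covariance (Gaussian) benchmark it exceeds.
      var99 = -np.quantile(returns, 0.01)
      gauss_var99 = 2.326 * returns.std()
      print(f"simulated VaR99 = {var99:.2f}, Gaussian VaR99 = {gauss_var99:.2f}")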

  7. Testing Departure from Additivity in Tukey’s Model using Shrinkage: Application to a Longitudinal Setting

    PubMed Central

    Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A.; Park, Sung Kyun; Kardia, Sharon L.R.; Allison, Matthew A.; Vokonas, Pantel S.; Chen, Jinbo; Diez-Roux, Ana V.

    2014-01-01

    While there has been extensive research developing gene-environment interaction (GEI) methods in case-control studies, little attention has been given to sparse and efficient modeling of GEI in longitudinal studies. In a two-way table for GEI with rows and columns as categorical variables, a conventional saturated interaction model involves estimation of a specific parameter for each cell, with constraints ensuring identifiability. The estimates are unbiased but are potentially inefficient because the number of parameters to be estimated can grow quickly with increasing categories of row/column factors. On the other hand, Tukey’s one degree of freedom (df) model for non-additivity treats the interaction term as a scaled product of row and column main effects. Due to the parsimonious form of interaction, the interaction estimate leads to enhanced efficiency and the corresponding test could lead to increased power. Unfortunately, Tukey’s model gives biased estimates and low power if the model is misspecified. When screening multiple GEIs where each genetic and environmental marker may exhibit a distinct interaction pattern, a robust estimator for interaction is important for GEI detection. We propose a shrinkage estimator for interaction effects that combines estimates from both Tukey’s and saturated interaction models and use the corresponding Wald test for testing interaction in a longitudinal setting. The proposed estimator is robust to misspecification of interaction structure. We illustrate the proposed methods using two longitudinal studies — the Normative Aging Study and the Multi-Ethnic Study of Atherosclerosis. PMID:25112650
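
    The two interaction models being combined can be written side by side. With row and column main effects \alpha_i and \beta_j, the saturated model estimates a free interaction \gamma_{ij}, while Tukey's one-df model constrains it to a scaled product of the main effects:

      \text{saturated:}\quad y_{ij} = \mu + \alpha_i + \beta_j + \gamma_{ij} + \varepsilon_{ij},
      \qquad
      \text{Tukey:}\quad y_{ij} = \mu + \alpha_i + \beta_j + \lambda\,\alpha_i\beta_j + \varepsilon_{ij}

    The proposed shrinkage estimator moves between the unbiased but noisy \hat{\gamma}_{ij} and the efficient but possibly misspecified \hat{\lambda}\,\hat{\alpha}_i\hat{\beta}_j.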

  8. Wall-models for large eddy simulation based on a generic additive-filter formulation

    NASA Astrophysics Data System (ADS)

    Sanchez Rocha, Martin

    Based on the philosophy of only resolving the large scales of turbulent motion, Large Eddy Simulation (LES) has demonstrated potential to provide high-fidelity turbulence simulations at low computational cost. However, when the scales that control the turbulence in a particular flow are not large, LES has to increase significantly its computational cost to provide accurate predictions. This is the case in wall-bounded flows, where the grid resolution required by LES to resolve the near-wall structures is close to the requirements to resolve the smallest dissipative scales in turbulence. Therefore, to reduce this demanding requirement, it has been proposed to model the near-wall region with Reynolds-Averaged Navier-Stokes (RANS) models, in what is known as hybrid RANS/LES approach. In this work, the mathematical implications of merging two different turbulence modeling approaches are addressed by deriving the exact hybrid RANS/LES Navier-Stokes equations. These equations are derived by introducing an additive-filter, which linearly combines the RANS and LES operators with a blending function. The equations derived with the additive-filter predict additional hybrid terms, which represent the interactions between RANS and LES formulations. Theoretically, the prediction of the hybrid terms demonstrates that the hybridization of the two approaches cannot be accomplished only by the turbulence model equations, as it is claimed in current hybrid RANS/LES models. The importance of the exact hybrid RANS/LES equations is demonstrated by conducting numerical calculations on a turbulent flat-plate boundary layer. Results indicate that the hybrid terms help to maintain an equilibrated model transition when the hybrid formulation switches from RANS to LES. Results also indicate, that when the hybrid terms are not included, the accuracy of the calculations strongly relies on the blending function implemented in the additive-filter. On the other hand, if the exact equations are

  9. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. PMID:23163724
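
    A stripped-down sketch of the combining idea - weight each source's default-probability estimate by its historical predictive accuracy - with invented numbers (the article's Bayesian framework is considerably richer):

      import numpy as np

      pd_estimates = np.array([0.020, 0.035, 0.015])  # agencies' PD estimates
      accuracy = np.array([0.60, 0.25, 0.15])         # historical likelihoods (hypothetical)
      prior = np.ones(3) / 3

      posterior = prior * accuracy
      posterior /= posterior.sum()                    # posterior model weights
      combined_pd = float(posterior @ pd_estimates)
      print(f"weights: {np.round(posterior, 3)}, combined PD: {combined_pd:.4f}")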

  10. Reduction of carcinogenic 4(5)-methylimidazole in a caramel model system: influence of food additives.

    PubMed

    Seo, Seulgi; Ka, Mi-Hyun; Lee, Kwang-Geun

    2014-07-01

    The effect of various food additives on the formation of carcinogenic 4(5)-methylimidazole (4-MI) in a caramel model system was investigated. The relationship between the levels of 4-MI and various pyrazines was studied. When glucose and ammonium hydroxide were heated, the amount of 4-MI was 556 ± 1.3 μg/mL, which increased to 583 ± 2.6 μg/mL by the addition of 0.1 M of sodium sulfite. When various food additives, such as 0.1 M of iron sulfate, magnesium sulfate, zinc sulfate, tryptophan, and cysteine were added, the amount of 4-MI was reduced to 110 ± 0.7, 483 ± 2.0, 460 ± 2.0, 409 ± 4.4, and 397 ± 1.7 μg/mL, respectively. The greatest reduction, 80%, occurred with the addition of iron sulfate. Among the 12 pyrazines, 2-ethyl-6-methylpyrazine with 4-MI showed the highest correlation (r = -0.8239).

  11. MODELING APPROACHES TO POPULATION-LEVEL RISK ASSESSMENT

    EPA Science Inventory

    A SETAC Pellston Workshop on Population-Level Risk Assessment was held in Roskilde, Denmark on 23-27 August 2003. One aspect of this workshop focused on modeling approaches for characterizing population-level effects of chemical exposure. The modeling work group identified th...

  12. Empirical Analysis of Farm Credit Risk under the Structure Model

    ERIC Educational Resources Information Center

    Yan, Yan

    2009-01-01

    The study measures farm credit risk by using farm records collected by Farm Business Farm Management (FBFM) during the period 1995-2004. The study addresses the following questions: (1) whether farm's financial position is fully described by the structure model, (2) what are the determinants of farm capital structure under the structure model, (3)…

  13. Dental caries: an updated medical model of risk assessment.

    PubMed

    Kutsch, V Kim

    2014-04-01

    Dental caries is a transmissible, complex biofilm disease that creates prolonged periods of low pH in the mouth, resulting in a net mineral loss from the teeth. Historically, the disease model for dental caries consisted of mutans streptococci and Lactobacillus species, and the dental profession focused on restoring the lesions/damage from the disease by using a surgical model. The current recommendation is to implement a risk-assessment-based medical model called CAMBRA (caries management by risk assessment) to diagnose and treat dental caries. Unfortunately, many of the suggestions of CAMBRA have been overly complicated and confusing for clinicians. The risk of caries, however, is usually related to just a few common factors, and these factors result in common patterns of disease. This article examines the biofilm model of dental caries, identifies the common disease patterns, and discusses their targeted therapeutic strategies to make CAMBRA more easily adaptable for the privately practicing professional.

  14. Model for Solar Proton Risk Assessment

    NASA Technical Reports Server (NTRS)

    Xapos, M. A.; Stauffer, C.; Gee, G. B.; Barth, J. L.; Stassinopoulos, E. G.; McGuire, R. E.

    2004-01-01

    A statistical model for cumulative solar proton event fluences during space missions is presented that covers both the solar minimum and solar maximum phases of the solar cycle. It is based on data from the IMP and GOES series of satellites, integrated so that the best features of each data set can be exploited. This allows fluence-energy spectra to be extended out to energies of 327 MeV.

  15. Modeling of Flood Risk for the Continental United States

    NASA Astrophysics Data System (ADS)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines, such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, and actuarial science, in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We especially pay attention to correlation (spatial and temporal), simulation, and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. the US Midwest flood of 2008 and the US Northeast flood of 2010) have increased concern about flood risk. Consequently, there is a growing need to assess flood risk adequately. The RMS flood hazard model is mainly comprised of three major components. (1) A stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow, and hence flooding, from different rivers, as well as low and high return periods across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparing to observations and published data. Output from

  16. How pharmacokinetic modeling could improve a risk assessment for manganese

    EPA Science Inventory

    The neurotoxicity of manganese (Mn) is well established, yet the risk assessment of Mn is made complex by certain enigmas. These include apparently greater toxicity via inhalation compared to oral exposure and greater toxicity in humans compared to rats. In addition, until recentl...

  18. Parametric Estimation in a Recurrent Competing Risks Model

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators. PMID:25346751

  19. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients undergoing cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. Prior transport codes calculate the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as bystander signaling; these effects are ignored by, or impossible in, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line at either a fixed number of user-specified depths or multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes the numerical estimates of basic
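
    One of the biophysical outputs mentioned above, the Poisson distribution for a specified cellular area, is easy to illustrate: the expected number of traversals is the fluence times the area. A minimal sketch with illustrative numbers, not GERM's actual interface:

```python
import math

def traversal_probabilities(fluence_per_um2, area_um2, n_max=5):
    """P(n traversals) of a cell nucleus of a given area under a particle
    fluence, assuming hits are Poisson with mean = fluence * area."""
    mean_hits = fluence_per_um2 * area_um2
    return [math.exp(-mean_hits) * mean_hits**n / math.factorial(n)
            for n in range(n_max + 1)]

# Illustrative values: 0.01 ions/um^2 over a 100 um^2 nucleus.
for n, p in enumerate(traversal_probabilities(0.01, 100.0)):
    print(f"P({n} traversals) = {p:.3f}")
```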

  20. Topsoil organic carbon content of Europe, a new map based on a generalised additive model

    NASA Astrophysics Data System (ADS)

    de Brogniez, Delphine; Ballabio, Cristiano; Stevens, Antoine; Jones, Robert J. A.; Montanarella, Luca; van Wesemael, Bas

    2014-05-01

    There is an increasing demand for up-to-date, spatially continuous organic carbon (OC) data for global environmental and climate modeling. Whilst the current map of topsoil organic carbon content for Europe (Jones et al., 2005) was produced by applying expert-knowledge-based pedo-transfer rules to large soil mapping units, the aim of this study was to replace it by applying digital soil mapping techniques to the first European harmonised geo-referenced topsoil (0-20 cm) database, which arises from the LUCAS (land use/cover area frame statistical survey) survey. A generalized additive model (GAM) was calibrated on 85% of the dataset (ca. 17 000 soil samples), and a backward stepwise approach selected slope, land cover, temperature, net primary productivity, latitude and longitude as environmental covariates (500 m resolution). Validation of the model (applied to 15% of the dataset) gave an R2 of 0.27. We observed that most organic soils were under-predicted by the model and that soils of Scandinavia were also poorly predicted. The model showed an RMSE of 42 g kg-1 for mineral soils and of 287 g kg-1 for organic soils. The map of predicted OC content showed the lowest values in Mediterranean countries and in croplands across Europe, whereas the highest OC contents were predicted in wetlands, woodlands and mountainous areas. The map of the standard error of the OC model predictions showed high values in northern latitudes, wetlands, moors and heathlands, whereas low uncertainty was mostly found in croplands. A comparison of our results with the map of Jones et al. (2005) showed general agreement on the prediction of mineral soils' OC content, most probably because the models use some common covariates, namely land cover and temperature. Our model however failed to predict values of OC content greater than 200 g kg-1, which we explain by the imposed unimodal distribution of our model, whose mean is tilted towards the majority of soils, which are mineral. Finally, average
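
    The additive structure of such a GAM can be illustrated with a minimal penalized-spline fit. The sketch below uses a truncated-power spline basis and a plain ridge penalty in place of the bases and smoothness selection a production GAM fit would use, and the covariates and response are simulated stand-ins for slope, temperature and OC content.

```python
import numpy as np

def spline_basis(x, knots):
    """Truncated-power cubic spline basis for one covariate."""
    cols = [x, x**2, x**3] + [np.maximum(x - k, 0.0)**3 for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
n = 500
slope_cov = rng.uniform(0, 30, n)            # stand-in for terrain slope
temp_cov = rng.uniform(-5, 20, n)            # stand-in for mean temperature
oc = 50 + 0.8*slope_cov - 2.0*temp_cov + rng.normal(0, 5, n)  # toy OC content

# Additive design matrix: intercept + smooth(slope) + smooth(temperature).
X = np.column_stack([np.ones(n),
                     spline_basis(slope_cov, np.linspace(5, 25, 4)),
                     spline_basis(temp_cov, np.linspace(-2, 17, 4))])

# A ridge penalty stands in for the smoothness penalty of a real GAM fit.
lam = 1.0
beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ oc)
pred = X @ beta
r2 = 1 - np.sum((oc - pred)**2) / np.sum((oc - oc.mean())**2)
print(f"R^2 = {r2:.2f}")
```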

  1. Model based climate information on drought risk in Africa

    NASA Astrophysics Data System (ADS)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa RiskView that aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI) as the fundamental indicator of agricultural performance, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa RiskView it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step in this challenging task is the exploration of the sources of uncertainty affecting assessments based on modeled climate change scenarios. For this purpose, a limited set of climate models has been selected in order to verify the relevance of using climate model output data with Africa RiskView and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and relied on model output available at the time. In particular, only one regional downscaling was available for the entire African continent, from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa RiskView risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed using the first data available under the CORDEX framework. In particular, we consider a set of simulations driven by boundary conditions from the ERA-Interim reanalysis to evaluate the skill drought
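
    The WRSI at the core of Africa RiskView can be sketched in a few lines. Real implementations track a soil-water balance through the crop's phenological stages; the version below merely accumulates per-dekad deficits, with invented rainfall and water-requirement values.

```python
def wrsi(rainfall, water_req):
    """Simplified WRSI: 100 * (1 - total deficit / total requirement).
    rainfall and water_req are per-dekad amounts (mm); surplus rain in one
    dekad is not carried forward here, which real implementations refine
    with a soil-moisture bucket."""
    deficit = sum(max(wr - r, 0.0) for r, wr in zip(rainfall, water_req))
    return 100.0 * (1.0 - deficit / sum(water_req))

rain = [30, 45, 10, 5, 60, 40]     # illustrative dekadal rainfall (mm)
req = [25, 35, 40, 45, 40, 30]     # illustrative crop water need (mm)
print(f"WRSI = {wrsi(rain, req):.0f}")   # 100 would mean no water stress
```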

  2. Comparison of prosthetic models produced by traditional and additive manufacturing methods

    PubMed Central

    Park, Jin-Young; Kim, Hae-Young; Kim, Ji-Hwan; Kim, Jae-Hong

    2015-01-01

    PURPOSE The purpose of this study was to verify the clinical feasibility of additive manufacturing by comparing the accuracy of four different manufacturing methods for metal copings: the conventional lost wax technique (CLWT); a subtractive method, wax blank milling (WBM); and two additive methods, multi jet modeling (MJM) and micro-stereolithography (Micro-SLA). MATERIALS AND METHODS Thirty study models were created using an acrylic model with the maxillary right canine, first premolar, and first molar teeth. Based on the scan files from a non-contact blue light scanner (Identica; Medit Co. Ltd., Seoul, Korea), thirty cores were produced using the WBM, MJM, and Micro-SLA methods, respectively, and another thirty frameworks were produced using the CLWT method. To measure the marginal and internal gap, the silicone replica method was adopted, and the silicone images obtained were evaluated using a digital microscope (KH-7700; Hirox, Tokyo, Japan) at 140X magnification. Analyses were performed using two-way analysis of variance (ANOVA) and the Tukey post hoc test (α=.05). RESULTS The mean marginal and internal gaps showed significant differences according to tooth type (P<.001 and P<.001, respectively) and manufacturing method (P<.037 and P<.001, respectively). Unlike the WBM and MJM methods, Micro-SLA showed no significant difference from CLWT in mean marginal gap. CONCLUSION The mean values of gaps resulting from the four different manufacturing methods were within a clinically allowable range, and, thus, the clinical use of additive manufacturing methods is acceptable as an alternative to the traditional lost wax technique and subtractive manufacturing. PMID:26330976

  3. Risk Classification with an Adaptive Naive Bayes Kernel Machine Model

    PubMed Central

    Minnier, Jessica; Yuan, Ming; Liu, Jun S.; Cai, Tianxi

    2014-01-01

    Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper usage of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene set selection are derived and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models. PMID:26236061

  4. Beliefs and stochastic modelling of interest rate scenario risk

    NASA Astrophysics Data System (ADS)

    Galic, E.; Molgedey, L.

    2001-04-01

    We present a framework that allows for a systematic assessment of risk given a specific model and belief on the market. Within this framework the time evolution of risk is modeled in a twofold way. On the one hand, risk is modeled by the time-discrete and nonlinear GARCH(1,1) process, which allows for a (time-)local understanding of its level, together with a short-term forecast. On the other hand, via a diffusion approximation, the time evolution of the probability density of risk is modeled by a Fokker-Planck equation. Then, as a final step, using Bayes' theorem, beliefs are conditioned on the stationary probability density function obtained from the Fokker-Planck equation. We believe this to be a highly rigorous framework for integrating subjective judgments of future market behavior with the underlying models. In order to demonstrate the approach, we apply it to risk assessment of empirical interest rate scenario methodologies, i.e. the application of Principal Component Analysis to the dynamics of bonds.
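
    The first, time-local half of the construction is straightforward to sketch: the GARCH(1,1) variance recursion gives the current risk level and a one-step forecast. The Fokker-Planck and Bayesian-conditioning steps are omitted here, and the parameters are illustrative rather than estimated.

```python
import numpy as np

def garch11_variance(returns, omega=1e-6, alpha=0.08, beta=0.90):
    """GARCH(1,1): sigma2[t+1] = omega + alpha*r[t]**2 + beta*sigma2[t].
    Returns the in-sample conditional-variance path and the one-step
    forecast. Parameters are illustrative, not estimated from data."""
    sigma2 = np.empty(len(returns) + 1)
    sigma2[0] = omega / (1.0 - alpha - beta)   # unconditional variance
    for t, r in enumerate(returns):
        sigma2[t + 1] = omega + alpha * r**2 + beta * sigma2[t]
    return sigma2[:-1], sigma2[-1]

rng = np.random.default_rng(1)
returns = rng.normal(0.0, 0.01, 250)           # toy daily rate changes
_, sigma2_next = garch11_variance(returns)
print(f"one-step volatility forecast: {float(np.sqrt(sigma2_next)):.4%}")
```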

  5. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from the increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for the rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.

  6. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
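
    The full credibility standard invoked here has a classical closed form for claim frequency: the expected claim count needed so that the observed frequency lies within a fraction k of its mean with probability p, under a Poisson model with a normal approximation. A minimal sketch:

```python
from statistics import NormalDist

def full_credibility_claims(p=0.90, k=0.05):
    """Limited-fluctuation full-credibility standard for claim frequency:
    the expected claim count lambda_F such that P(|X/lambda - 1| <= k) >= p
    under a Poisson model, via the normal approximation."""
    z = NormalDist().inv_cdf(1 - (1 - p) / 2)   # two-sided normal quantile
    return (z / k) ** 2

print(f"claims needed for full credibility: {full_credibility_claims():.0f}")
# ~1082 for the textbook choice p = 0.90, k = 0.05
```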

  7. Model-based benefit-risk assessment: can Archimedes help?

    PubMed

    Krishna, R

    2009-03-01

    In December 2008, the US Food and Drug Administration issued a new draft Guidance for Industry on Diabetes Mellitus--evaluating cardiovascular risk in new antidiabetic therapies to treat Type 2 diabetes. This guidance comes at a time when recent discussions have focused on delineation of cardiovascular risk reduction for new antidiabetic drugs. Computational tools that can enable early prediction of cardiovascular risk are reviewed with specific reference to Archimedes (Kaiser Permanente), with an aim of proposing a model-based solution and enabling decisions to be made as early as possible in the drug development value chain.

  8. Thermodynamic network model for predicting effects of substrate addition and other perturbations on subsurface microbial communities

    SciTech Connect

    Jack Istok; Melora Park; James McKinley; Chongxuan Liu; Lee Krumholz; Anne Spain; Aaron Peacock; Brett Baldwin

    2007-04-19

    The overall goal of this project is to develop and test a thermodynamic network model for predicting the effects of substrate additions and environmental perturbations on microbial growth, community composition and system geochemistry. The hypothesis is that a thermodynamic analysis of the energy-yielding growth reactions performed by defined groups of microorganisms can be used to make quantitative and testable predictions of the change in microbial community composition that will occur when a substrate is added to the subsurface or when environmental conditions change.

  9. Risk Management Model in Surface Exploitation of Mineral Deposits

    NASA Astrophysics Data System (ADS)

    Stojanović, Cvjetko

    2016-06-01

    Risk management is an integral part of all types of project management. One of the main tasks of pre-investment studies and other project documentation is to protect investment projects as much as possible against investment risks. The provision and regulation of risk information therefore ensure the identification of the probability of the emergence of adverse events, their forms, causes and consequences, and provide timely measures of protection against risks. This means that risk management involves a set of management methods and techniques used to reduce the possibility of adverse events and their consequences, and thus increase the possibility of achieving the planned results with minimal losses. Investments in mining projects are of capital importance because they are very complex, and therefore very risky, owing to the influence of internal and external factors and limitations arising from the socio-economic environment. Due to the lack of a risk management system, numerous organizations worldwide have suffered significant financial losses. It is therefore necessary for any organization to establish a risk management system as a structural element of its management system as a whole. This paper presents an approach to a risk management model for the project of opening a surface coal mine, developed on the basis of extensive scientific literature and the personal experience of the author, which, with certain modifications, may find use in any investment project, both in the mining industry and in other areas.

  10. Risk assessment and remedial policy evaluation using predictive modeling

    SciTech Connect

    Linkov, L.; Schell, W.R.

    1996-06-01

    As a result of nuclear industry operations and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the fate of radionuclides in forest compartments after deposition, as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. A risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about a 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and that a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide application in the environmental remediation of radionuclides and toxic metals, as well as in dose reconstruction and risk assessment.

  11. Modelling microbial health risk of wastewater reuse: A systems perspective.

    PubMed

    Beaudequin, Denise; Harden, Fiona; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Mengersen, Kerrie

    2015-11-01

    There is a widespread need for the use of quantitative microbial risk assessment (QMRA) to determine reclaimed water quality for specific uses; however, neither faecal indicator levels nor pathogen concentrations alone are adequate for assessing exposure health risk. The aim of this study was to build a conceptual model representing factors contributing to the microbiological health risks of reusing water treated in maturation ponds. This paper describes the development of an unparameterised model that provides a visual representation of theoretical constructs and variables of interest. Information was collected from the peer-reviewed literature and through consultation with experts from regulatory authorities and academic disciplines. In this paper we explore how, considering microbial risk as a modular system, following the QMRA framework enables incorporation of the many factors influencing human exposure and dose response, to better characterise likely human health impacts. By using and expanding upon the QMRA framework we deliver new insights into this important field of environmental exposures. We present a conceptual model of the health risk of microbial exposure which can be used for maturation ponds and, more importantly, as a generic tool to assess health risk in diverse wastewater reuse scenarios. PMID:26277638

  12. Generalized additive models and Lucilia sericata growth: assessing confidence intervals and error rates in forensic entomology.

    PubMed

    Tarone, Aaron M; Foran, David R

    2008-07-01

    Forensic entomologists use blow fly development to estimate a postmortem interval. Although accurate, fly age estimates can be imprecise for older developmental stages and no standard means of assigning confidence intervals exists. Presented here is a method for modeling growth of the forensically important blow fly Lucilia sericata, using generalized additive models (GAMs). Eighteen GAMs were created to predict the extent of juvenile fly development, encompassing developmental stage, length, weight, strain, and temperature data, collected from 2559 individuals. All measures were informative, explaining up to 92.6% of the deviance in the data, though strain and temperature exerted negligible influences. Predictions made with an independent data set allowed for a subsequent examination of error. Estimates using length and developmental stage were within 5% of true development percent during the feeding portion of the larval life cycle, while predictions for postfeeding third instars were less precise, but within expected error.

  13. Assessing risk factors for dental caries: a statistical modeling approach.

    PubMed

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and has received considerable attention in the scientific literature. From the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only a few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which requires inferences to be adjusted for 'optimism'.
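
    The model-space exploration the authors advocate can be sketched on a small scale. The snippet below exhaustively scores all subsets of five hypothetical caries risk factors by Gaussian AIC; the real application searches millions of models and additionally corrects fit measures for optimism.

```python
import itertools
import numpy as np

def aic_ols(y, X):
    """Gaussian OLS AIC: n*log(RSS/n) + 2*(k + 1), k = number of columns."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + 2 * (k + 1)

rng = np.random.default_rng(2)
n, names = 200, ["brushing", "sugar", "fluoride", "visits", "plaque"]
Z = rng.normal(size=(n, len(names)))                  # hypothetical covariates
dmft = 2.0 + 1.5*Z[:, 1] - 1.0*Z[:, 2] + rng.normal(size=n)  # toy caries index

best = (np.inf, None)
for r in range(1, len(names) + 1):
    for subset in itertools.combinations(range(len(names)), r):
        X = np.column_stack([np.ones(n), Z[:, subset]])
        best = min(best, (aic_ols(dmft, X), subset))

print("best AIC model:", [names[j] for j in best[1]])
```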

  14. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with an opportunity for hands-on demonstrations. Brief descriptions of the tools follow: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to solar particle events; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  15. A first screening and risk assessment of pharmaceuticals and additives in personal care products in waste water, sludge, recipient water and sediment from Faroe Islands, Iceland and Greenland.

    PubMed

    Huber, Sandra; Remberger, Mikael; Kaj, Lennart; Schlabach, Martin; Jörundsdóttir, Hrönn Ó; Vester, Jette; Arnórsson, Mímir; Mortensen, Inge; Schwartson, Richard; Dam, Maria

    2016-08-15

    A screening of a broad range of pharmaceuticals and additives in personal care products (PPCPs) in sub-arctic locations of the Faroe Islands (FO), Iceland (IS) and Greenland (GL) was conducted. In total, 36 pharmaceuticals, including some metabolites, and seven additives in personal care products were investigated in influent and effluent waters as well as sludge of waste water treatment plants (WWTPs), and in water and sediment of recipients. Concentrations and distribution patterns of PPCPs discharged via sewage lines (SLs) to the marine environment were assessed. Of the 36 pharmaceuticals or metabolites analysed, 33 were found close to or above the limit of detection (LOD) in all or a part of the samples. All seven investigated additives in personal care products were detected above the LOD. Some of the analysed PPCPs occurred in every or almost every sample, among them diclofenac, ibuprofen, lidocaine, naproxen, metformin, citalopram, venlafaxine, amiloride, furosemide, metoprolol, sodium dodecyl sulphate (SDS) and cetrimonium salt (ATAC-C16). Additionally, the study encompasses an ecotoxicological risk assessment of two-thirds of the analysed PPCPs in recipient and diluted effluent waters. For candesartan, only a small margin to levels with unacceptable risks was observed in diluted effluent waters at two locations (FO). Chronic risks for aquatic organisms staying and/or living around WWTP effluent pipe outlets were indicated for 17β-estradiol and estriol in the three countries. Additives in PCPs were found to pose the largest risk to the aquatic environment. The surfactants CAPB and ATAC-C16 were found in concentrations resulting in risk factors of up to 375 for CAPB and 165 for ATAC-C16 in recipients of diluted effluents from Iggia, Nuuk (GL) and Torshavn (FO), respectively. These results demonstrate a potentially high ecological risk stemming from the discharge of surfactants used in household and industrial detergents as well as additives in personal care

  17. Analysis of Air Toxics From NOAA WP-3 Aircraft Measurements During the TexAQS 2006 Campaign: Comparison With Emission Inventories and Additive Inhalation Risk Factors

    NASA Astrophysics Data System (ADS)

    Del Negro, L. A.; Warneke, C.; de Gouw, J. A.; Atlas, E.; Lueb, R.; Zhu, X.; Pope, L.; Schauffler, S.; Hendershot, R.; Washenfelder, R.; Fried, A.; Richter, D.; Walega, J. G.; Weibring, P.

    2007-12-01

    Benzene and nine other air toxics classified as human carcinogens by the International Agency for Research on Cancer (IARC) were measured from the NOAA WP-3 aircraft during the TexAQS 2006 campaign. In-situ measurements of benzene, made with a PTR-MS instrument, are used to estimate emission fluxes for comparison with point source emission inventories developed by the Texas Commission on Environmental Quality. Mean and median mixing ratios for benzene, acetaldehyde, formaldehyde, 1,3-butadiene, carbon tetrachloride, chloroform, 1,2-dichloroethane, dibromoethane, dichloromethane, and vinyl chloride encountered over the city of Houston during the campaign are combined with inhalation unit risk factor values developed by the California Environmental Protection Agency and the United States Environmental Protection Agency to estimate the additive inhalation risk factor. This additive risk factor represents the risk associated with lifetime (70-year) exposure at the levels measured and should not be used as an absolute indicator of risk to individuals. However, the results are useful for assessments of changing relative risk over time and for identifying dominant contributions to the overall air toxics risk.
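
    The additivity assumption behind the combined risk factor is simply a sum of concentration times unit risk over species. A minimal sketch with invented concentrations and unit risk factors, not the campaign's values:

```python
# Hypothetical mean concentrations (ug/m^3) and inhalation unit risk
# factors (per ug/m^3); the additive risk is risk = sum(C_i * URF_i).
measured = {"benzene": 1.2, "1,3-butadiene": 0.3, "formaldehyde": 2.5}
unit_risk = {"benzene": 7.8e-6, "1,3-butadiene": 1.7e-4, "formaldehyde": 6.0e-6}

additive_risk = sum(c * unit_risk[s] for s, c in measured.items())
print(f"additive lifetime inhalation risk ~ {additive_risk:.1e}")
# Interpreted as excess lifetime (70-year) cancer risk at these levels,
# not as an absolute risk to any individual.
```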

  18. Guarana provides additional stimulation over caffeine alone in the planarian model.

    PubMed

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R; Constable, Mic Andre; Mulligan, Margaret E; Voura, Evelyn B

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians to substance exposure have been shown to provide an excellent system for measuring the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  1. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.
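
    The trade-off among inspection interval, rejection crack size and wheel life can be sketched with a small Monte Carlo, in the spirit of (but much simpler than) the APU model; the crack-growth distribution and all sizes below are invented for illustration.

```python
import numpy as np

def wheel_failure_prob(n_trials=10_000, life=600, interval=150,
                       reject_size=1.5, crit_size=4.0, seed=3):
    """Each wheel accumulates random crack growth per operating cycle; it
    fails if the crack reaches crit_size, and is retired at the first
    inspection finding a crack above reject_size. All distributions and
    sizes are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    crack = np.cumsum(rng.gamma(0.05, 0.1, size=(n_trials, life)), axis=1)

    def first_cycle(exceed):
        # First 1-based cycle where the condition holds, or life+1 if never.
        hit = exceed.any(axis=1)
        return np.where(hit, np.argmax(exceed, axis=1) + 1, life + 1)

    fail_cycle = first_cycle(crack >= crit_size)
    inspections = np.arange(interval, life + 1, interval)
    found = crack[:, inspections - 1] >= reject_size
    reject_cycle = np.where(found.any(axis=1),
                            inspections[np.argmax(found, axis=1)],
                            life + 1)
    failed = (fail_cycle <= life) & (fail_cycle <= reject_cycle)
    return failed.mean()

print(f"P(wheel failure over service life) ~ {wheel_failure_prob():.4f}")
```

    Shortening the interval or the rejection size lowers the failure probability at the cost of more rejected wheels, which is exactly the trade space the abstract describes.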

  2. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models.

    PubMed

    Fan, Jianqing; Feng, Yang; Song, Rui

    2011-06-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening, called NIS, is a specific member of the sure independence screening family. Several closely related variable screening procedures are proposed. For general nonparametric models, it is shown that under some mild technical conditions the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, a data-driven thresholding and an iterative nonparametric independence screening (INIS) procedure are also proposed to enhance the finite-sample performance for fitting sparse additive models. The simulation results and a real data analysis demonstrate that the proposed procedure works well with moderate sample size and large dimension and performs better than competing methods.
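
    The core of NIS is easy to sketch: fit a small spline to each covariate separately, rank covariates by marginal goodness of fit, and keep the top few. The basis, ranking statistic (plain R-squared) and cutoff below are simplified relative to the paper.

```python
import numpy as np

def marginal_r2(x, y, n_knots=5):
    """R^2 of a marginal cubic-spline regression of y on one covariate."""
    knots = np.quantile(x, np.linspace(0.1, 0.9, n_knots))
    B = np.column_stack([np.ones_like(x), x, x**2, x**3] +
                        [np.maximum(x - k, 0.0)**3 for k in knots])
    beta, *_ = np.linalg.lstsq(B, y, rcond=None)
    resid = y - B @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(4)
n, p = 200, 1000                       # ultra-high dimension: p >> n
X = rng.normal(size=(n, p))
# True signal is nonlinear in two covariates; all others are noise.
y = np.sin(2 * X[:, 7]) + 0.5 * X[:, 42]**2 + rng.normal(0, 0.3, n)

scores = np.array([marginal_r2(X[:, j], y) for j in range(p)])
top = np.argsort(scores)[::-1][:10]    # keep the 10 best-fitting covariates
print("screened-in covariates:", sorted(top.tolist()))
```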

  3. The use of ecosystem models in risk assessment

    SciTech Connect

    Starodub, M.E.; Miller, P.A.; Willes, R.F.

    1994-12-31

    Ecosystem models, when used in conjunction with available environmental effects monitoring data, enable informed decisions regarding actions that should be taken to manage ecological risks from areas of localized chemical loadings and accumulation. These models provide quantitative estimates of chemical concentrations in various environmental media. The reliable application of these models as predictive tools for environmental assessment requires a thorough understanding of the theory and mathematical relationships described by the models, and demands rigorous validation of input data and model results against field and laboratory data. Food chain model selection should be based on the ability to best simulate the interactions of the food web and the processes governing the transfer of chemicals from the dissolved and particulate phases to various trophic levels for the site in question. This requires that users be familiar with the theories on which these models are based, and be aware of the merits and shortcomings of each before attempting to model food chain accumulation. Questions to be asked include: Are all potential exposure pathways addressed? Are omitted pathways critical to the risk assessment process? Is the model flexible? To answer these questions one must consider the chemical(s) of concern, site-specific ecosystem characteristics, the dietary habits of the risk assessment receptor (aquatic, wildlife, human), and the influence of effluent characteristics on food chain dynamics.

  4. Model Scramjet Inlet Unstart Induced by Mass Addition and Heat Release

    NASA Astrophysics Data System (ADS)

    Im, Seong-Kyun; Baccarella, Damiano; McGann, Brendan; Liu, Qili; Wermer, Lydiy; Do, Hyungrok

    2015-11-01

    The inlet unstart phenomena in a model scramjet are investigated in an arc-heated hypersonic wind tunnel. Unstart events induced by nitrogen or ethylene jets at low- or high-enthalpy Mach 4.5 freestream flow conditions are compared. The jet injection pressurizes the downstream flow through mass addition and flow blockage. In the case of ethylene injection, heat release from combustion increases the backpressure further. Time-resolved schlieren imaging is performed at the jet and at the lip of the model inlet to visualize the flow features during unstart. High-frequency pressure measurements provide information on pressure fluctuations at the scramjet wall. In both the mass-driven and heat-release-driven unstart cases, the unstart shockwave system exhibits similar transient and quasi-steady behaviors during the unstart process. Combustion-driven unstart induces severe oscillatory motions of the jet and of the unstart shock at the lip of the scramjet inlet after the completion of the unstart process, while the unstarted flow induced solely by mass addition remains relatively steady. The discrepancies between the mass-driven and heat-release-driven unstart processes are explained by the flow choking mechanism.

  5. Exact solutions for models of evolving networks with addition and deletion of nodes.

    PubMed

    Moore, Cristopher; Ghoshal, Gourab; Newman, M E J

    2006-09-01

    There has been considerable recent interest in the properties of networks, such as citation networks and the World Wide Web, that grow by the addition of vertices, and a number of simple solvable models of network growth have been studied. In the real world, however, many networks, including the web, not only add vertices but also lose them. Here we formulate models of the time evolution of such networks and give exact solutions for a number of cases of particular interest. For the case of net growth and so-called preferential attachment--in which newly appearing vertices attach to previously existing ones in proportion to vertex degree--we show that the resulting networks have power-law degree distributions, but with an exponent that diverges as the growth rate vanishes. We conjecture that the low exponent values observed in real-world networks are thus the result of vigorous growth in which the rate of addition of vertices far exceeds the rate of removal. Were growth to slow in the future--for instance, in a more mature future version of the web--we would expect to see exponents increase, potentially without bound.
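
    A small simulation makes the addition/deletion dynamics concrete. The sketch below adds a vertex with one degree-proportional edge with probability p_add and otherwise deletes a random vertex; it is far too small to estimate the tail exponent reliably, but it shows the mechanics the exact solutions describe.

```python
import random
from collections import defaultdict

def grow_with_deletion(steps=5000, p_add=0.75, seed=5):
    """Evolving-network sketch: with probability p_add a new vertex joins
    with one edge attached preferentially (proportional to degree);
    otherwise a uniformly random vertex is deleted with its edges."""
    rng = random.Random(seed)
    adj = defaultdict(set)
    adj[0].add(1); adj[1].add(0)                    # seed graph: one edge
    next_id = 2
    for _ in range(steps):
        if rng.random() < p_add or len(adj) < 3:
            nodes = list(adj)
            degs = [len(adj[v]) for v in nodes]
            if sum(degs) == 0:                      # all isolated: attach anywhere
                target = rng.choice(nodes)
            else:
                target = rng.choices(nodes, weights=degs, k=1)[0]
            adj[next_id].add(target); adj[target].add(next_id)
            next_id += 1
        else:
            v = rng.choice(list(adj))
            for u in adj.pop(v):
                adj[u].discard(v)
    return sorted((len(nbrs) for nbrs in adj.values()), reverse=True)

print("largest degrees:", grow_with_deletion()[:5])
# As p_add -> 0.5 (net growth rate vanishing), the degree-distribution
# tail thins, consistent with the diverging exponent the paper derives.
```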

  6. A new model for polluted soil risk assessment

    NASA Astrophysics Data System (ADS)

    Andretta, M.; Serra, R.; Villani, M.

    2006-08-01

    In this paper, we discuss the most important theoretical aspects of polluted-soil risk assessment methodologies, which have been developed in order to evaluate the risk to exposed people associated with residual contaminant concentrations in polluted soil, and we briefly present the major kinds of risk assessment methodologies. We also underline the relevant role played in this kind of analysis by pollutant transport models. We then describe a new and innovative model, based on the general framework of so-called Cellular Automata (CA), initially developed in the EU Esprit project COLOMBO for the simulation of bioremediation processes. Because of their intrinsically "finite and discrete" character, these models seem very well suited to a detailed analysis of the shape of pollutant sources, contaminant fate, and the evaluation of targets in risk assessment. In particular, we describe the future research activities we are going to develop toward a tight integration of pollutant fate and transport models with risk analysis methodologies.

  7. Development and application of chronic disease risk prediction models.

    PubMed

    Oh, Sun Min; Stefani, Katherine M; Kim, Hyeon Chang

    2014-07-01

    Currently, non-communicable chronic diseases are a major cause of morbidity and mortality worldwide, and a large proportion of chronic diseases are preventable through risk factor management. However, the prevention efficacy at the individual level is not yet satisfactory. Chronic disease prediction models have been developed to assist physicians and individuals in clinical decision-making. A chronic disease prediction model assesses multiple risk factors together and estimates an absolute disease risk for the individual. Accurate prediction of an individual's future risk for a certain disease enables the comparison of benefits and risks of treatment, the costs of alternative prevention strategies, and selection of the most efficient strategy for the individual. A large number of chronic disease prediction models, especially targeting cardiovascular diseases and cancers, have been suggested, and some of them have been adopted in the clinical practice guidelines and recommendations of many countries. Although few chronic disease prediction tools have been suggested in the Korean population, their clinical utility is not as high as expected. This article reviews methodologies that are commonly used for developing and evaluating a chronic disease prediction model and discusses the current status of chronic disease prediction in Korea.

  8. Modeling protein density of states: additive hydrophobic effects are insufficient for calorimetric two-state cooperativity.

    PubMed

    Chan, H S

    2000-09-01

    A well-established experimental criterion for two-state thermodynamic cooperativity in protein folding is that the van't Hoff enthalpy ΔH(vH) around the transition midpoint is equal, or very nearly so, to the calorimetric enthalpy ΔH(cal) of the entire transition. This condition is satisfied by many small proteins. We use simple lattice models to provide a statistical mechanical framework to elucidate how this calorimetric two-state picture may be reconciled with the hierarchical multistate scenario emerging from recent hydrogen exchange experiments. We investigate the feasibility of using inverse Laplace transforms to recover the underlying density of states (i.e., enthalpy distribution) from calorimetric data. We find that the constraint imposed by ΔH(vH)/ΔH(cal) ≈ 1 on densities of states of proteins is often more stringent than other "two-state" criteria proposed in recent theoretical studies. In conjunction with reasonable assumptions, the calorimetric two-state condition implies a narrow distribution of denatured-state enthalpies relative to the overall enthalpy difference between the native and the denatured conformations. This requirement does not always correlate with simple definitions of "sharpness" of a transition and has important ramifications for theoretical modeling. We find that protein models that assume capillarity cooperativity can exhibit overall calorimetric two-state-like behaviors. However, common heteropolymer models based on additive hydrophobic-like interactions, including highly specific two-dimensional Gō models, fail to produce proteinlike ΔH(vH)/ΔH(cal) ≈ 1. A simple model is constructed to illustrate a proposed scenario in which physically plausible local and nonlocal cooperative terms, which mimic helical cooperativity and environment-dependent hydrogen bonding strength, can lead to thermodynamic behaviors closer to experiment. Our results suggest that proteinlike thermodynamic
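
    The criterion itself can be checked numerically for a toy density of states, using the fluctuation relations C(T) = Var(H)/k_BT², ΔH(cal) = ∫C dT and the common estimate ΔH(vH) = 2·T_m·√(k_B·C(T_m)). Working in reduced units (k_B = 1) and with invented levels, a sharply two-state density of states should give a ratio near one:

```python
import numpy as np

def calorimetric_ratio(levels, degens, T):
    """DeltaH(vH)/DeltaH(cal) for a toy density of states, with k_B = 1.
    levels: enthalpies H_i; degens: degeneracies g_i; T: temperature grid."""
    boltz = degens * np.exp(-levels[None, :] / T[:, None])
    Z = boltz.sum(axis=1)
    U = (boltz * levels).sum(axis=1) / Z              # mean enthalpy
    var = (boltz * levels**2).sum(axis=1) / Z - U**2  # enthalpy fluctuations
    C = var / T**2                                    # heat capacity curve
    i_m = np.argmax(C)                                # transition midpoint T_m
    dH_vH = 2 * T[i_m] * np.sqrt(C[i_m])              # van't Hoff enthalpy
    dH_cal = U[-1] - U[0]                             # calorimetric enthalpy
    return dH_vH / dH_cal

T = np.linspace(0.2, 3.0, 2000)
# Toy two-state 'protein': a unique native state vs. a 1e30-fold degenerate
# denatured state at DeltaH = 50; illustrative numbers only.
ratio = calorimetric_ratio(np.array([0.0, 50.0]), np.array([1.0, 1e30]), T)
print(f"DeltaH(vH)/DeltaH(cal) = {ratio:.3f}")  # ~1 for a two-state system
```

    Broadening the denatured-state enthalpy distribution (adding intermediate levels) pushes the ratio below one, which is the stringency of the criterion discussed above.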

  9. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool, and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed. A comparison of the direct computation and response surface approaches was undertaken.
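
    The Monte Carlo layer of such a model reduces to sampling the uncertain inputs, pushing them through the thermal response calculation, and counting bondline-limit exceedances. In the sketch below the 1-D thermal response tool is replaced by a made-up monotone surrogate, and all distributions and the limit are illustrative.

```python
import numpy as np

def bondline_temp(heat_load, thickness, conductivity):
    """Stand-in for the 1-D thermal response tool: an invented monotone map
    from integrated heat load and TPS properties to peak bondline temp (K)."""
    return 300.0 + 0.004 * heat_load / (thickness * (1.5 - conductivity))

def tps_failure_prob(n=200_000, t_limit=440.0, seed=6):
    rng = np.random.default_rng(seed)
    heat_load = rng.normal(5.0e4, 6.0e3, n)     # trajectory + heating spread
    thickness = rng.normal(2.0, 0.05, n)        # manufacturing variability
    conductivity = rng.normal(0.5, 0.05, n)     # material property uncertainty
    temps = bondline_temp(heat_load, thickness, conductivity)
    return np.mean(temps > t_limit)             # fraction exceeding the limit

print(f"P(bondline temperature over limit) ~ {tps_failure_prob():.1e}")
```

    A response surface approach would replace the surrogate call with a polynomial fit to a modest number of direct computations, trading accuracy for speed, which is the comparison the paper undertakes.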

  10. A cooperative model for IS security risk management in distributed environment.

    PubMed

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across allied organizations is critical to effectively managing information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among interconnected IS under a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and how it can be exploited to manage distributed IS security risk effectively.
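
    The BN-based exchange can be illustrated with a toy three-node network (Threat and Vulnerability as parents of Breach) and inference by enumeration; the structure and probability tables are invented, and a production model would have many more nodes and learned CPTs.

```python
from itertools import product

# Hypothetical CPTs: Threat and Vulnerability are parents of Breach.
p_threat = {True: 0.3, False: 0.7}
p_vuln = {True: 0.2, False: 0.8}
p_breach = {(True, True): 0.9, (True, False): 0.3,
            (False, True): 0.2, (False, False): 0.01}

def p_breach_given(threat=None, vuln=None):
    """P(Breach = True | evidence) by enumeration over hidden variables."""
    num = den = 0.0
    for t, v in product([True, False], repeat=2):
        if threat is not None and t != threat:
            continue
        if vuln is not None and v != vuln:
            continue
        joint = p_threat[t] * p_vuln[v]
        num += joint * p_breach[(t, v)]
        den += joint
    return num / den

print(f"prior breach risk:         {p_breach_given():.3f}")
print(f"after shared threat intel: {p_breach_given(threat=True):.3f}")
```

    The second call mimics an allied organization sharing evidence of an active threat: conditioning on it raises the predicted risk level, which is the trigger for selecting a stronger safeguard.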

  11. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    NASA Astrophysics Data System (ADS)

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed at later time points post-exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  14. Are Masking-Based Models of Risk Useful?

    PubMed

    Gisiner, Robert C

    2016-01-01

    As our understanding of directly observable effects from anthropogenic sound exposure has improved, concern about "unobservable" effects such as stress and masking has received greater attention. Equal-energy models of masking, such as power spectrum models, have the appeal of simplicity, but do they offer biologically realistic assessments of the risk of masking? Data relevant to masking, such as critical ratios, critical bandwidths, temporal resolution, and directional resolution, along with what is known about general mammalian antimasking mechanisms, all argue for a much more complicated view of masking when making decisions about the risk of masking inherent in a given anthropogenic sound exposure scenario. PMID:26610979

  15. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of

  16. Additive influence of genetic predisposition and conventional risk factors in the incidence of coronary heart disease: a population-based study in Greece

    PubMed Central

    Yiannakouris, Nikos; Katsoulis, Michail; Trichopoulou, Antonia; Ordovas, Jose M; Trichopoulos, Dimitrios

    2014-01-01

    Objectives An additive genetic risk score (GRS) for coronary heart disease (CHD) has previously been associated with incident CHD in the population-based Greek European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. In this study, we explore GRS-‘environment’ joint actions on CHD for several conventional cardiovascular risk factors (ConvRFs), including smoking, hypertension, type 2 diabetes mellitus (T2DM), body mass index (BMI), physical activity and adherence to the Mediterranean diet. Design A case–control study. Setting The general Greek population of the EPIC study. Participants and outcome measures 477 patients with medically confirmed incident CHD and 1271 controls participated in this study. We estimated the ORs for CHD by dividing participants at higher or lower GRS and, alternatively, at higher or lower ConvRF, and calculated the relative excess risk due to interaction (RERI) as a measure of deviation from additivity. Results The joint presence of higher GRS and higher risk ConvRF was in all instances associated with an increased risk of CHD, compared with the joint presence of lower GRS and lower risk ConvRF. The OR (95% CI) was 1.7 (1.2 to 2.4) for smoking, 2.7 (1.9 to 3.8) for hypertension, 4.1 (2.8 to 6.1) for T2DM, 1.9 (1.4 to 2.5) for lower physical activity, 2.0 (1.3 to 3.2) for high BMI and 1.5 (1.1 to 2.1) for poor adherence to the Mediterranean diet. In all instances, RERI values were fairly small and not statistically significant, suggesting that the GRS and the ConvRFs do not have effects beyond additivity. Conclusions Genetic predisposition to CHD, operationalised through a multilocus GRS, and ConvRFs have essentially additive effects on CHD risk. PMID:24500614
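
    The additivity measure used in this record fits in a few lines of code. Below is a minimal Python sketch of the relative excess risk due to interaction, RERI = OR11 - OR10 - OR01 + 1, with purely hypothetical odds ratios rather than the study's estimates.

      def reri(or11, or10, or01):
          # Relative excess risk due to interaction: deviation of the joint
          # odds ratio from what additivity of the two solo effects implies.
          return or11 - or10 - or01 + 1.0

      # Hypothetical ORs: joint exposure, higher GRS only, higher-risk ConvRF only.
      print(reri(or11=2.7, or10=1.4, or01=1.9))  # 0.4; values near 0 suggest additivity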

  17. Analysis of Radiation Pneumonitis Risk Using a Generalized Lyman Model

    SciTech Connect

    Tucker, Susan L.; Liu, H. Helen; Liao, Zhongxing; Wei, Xiong; Wang, Shulian; Jin, Hekun; Komaki, Ritsuko; Martel, Mary K.; Mohan, Radhe

    2008-10-01

    Purpose: To introduce a version of the Lyman normal-tissue complication probability (NTCP) model adapted to incorporate censored time-to-toxicity data and clinical risk factors and to apply the generalized model to analysis of radiation pneumonitis (RP) risk. Methods and Materials: Medical records and radiation treatment plans were reviewed retrospectively for 576 patients with non-small cell lung cancer treated with radiotherapy. The time to severe (Grade ≥3) RP was computed, with event times censored at last follow-up for patients not experiencing this endpoint. The censored time-to-toxicity data were analyzed using the standard and generalized Lyman models with patient smoking status taken into account. Results: The generalized Lyman model with patient smoking status taken into account produced NTCP estimates up to 27 percentage points different from the model based on dose-volume factors alone. The generalized model also predicted that 8% of the expected cases of severe RP were unobserved because of censoring. The estimated volume parameter for lung was not significantly different from n = 1, corresponding to mean lung dose. Conclusions: NTCP models historically have been based solely on dose-volume effects and binary (yes/no) toxicity data. Our results demonstrate that inclusion of nondosimetric risk factors and censored time-to-event data can markedly affect outcome predictions made using NTCP models.
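
    For orientation, the standard Lyman NTCP formulation that this record generalizes can be sketched briefly. The censoring and clinical-covariate extensions described above are omitted, and the dose-volume histogram and parameter values below are illustrative, not the paper's fits.

      import numpy as np
      from scipy.stats import norm

      def geud(doses, volumes, n):
          # Generalized equivalent uniform dose; n = 1 reduces to the mean dose.
          v = np.asarray(volumes, float) / np.sum(volumes)
          return float(np.sum(v * np.asarray(doses, float) ** (1.0 / n)) ** n)

      def lyman_ntcp(doses, volumes, td50, m, n):
          # Lyman model: probit link on the scaled distance of gEUD from TD50.
          t = (geud(doses, volumes, n) - td50) / (m * td50)
          return float(norm.cdf(t))

      # Hypothetical lung DVH (dose in Gy, relative volumes) and parameters.
      print(lyman_ntcp([5, 15, 25, 40], [0.4, 0.3, 0.2, 0.1], td50=30.0, m=0.35, n=1.0))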

  18. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    PubMed Central

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries-independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries-independent data set (SEAMAP) and climate-scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223
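
    A minimal sketch of the modelling idea, assuming synthetic survey data rather than SEAMAP: smooth covariate effects approximated with B-spline bases inside a negative binomial GLM (statsmodels), which is one common way to fit a GAM-style abundance model in Python.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 200
      df = pd.DataFrame({"depth": rng.uniform(5, 80, n), "temp": rng.uniform(18, 29, n)})
      # Synthetic counts with a nonlinear depth effect and many zeros.
      mu = np.exp(1.0 - 0.002 * (df["depth"] - 25) ** 2 + 0.1 * (df["temp"] - 24))
      df["count"] = rng.poisson(mu)

      # Spline terms (patsy's bs) stand in for GAM smooths; the negative binomial
      # family accommodates overdispersion in the catch data.
      fit = smf.glm("count ~ bs(depth, df=4) + bs(temp, df=4)", data=df,
                    family=sm.families.NegativeBinomial()).fit()
      print(fit.summary())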

  19. A Model for Risk Analysis of Oil Tankers

    NASA Astrophysics Data System (ADS)

    Montewka, Jakub; Krata, Przemysław; Goerland, Floris; Kujala, Pentti

    2010-01-01

    The paper presents a model for risk analysis of marine traffic, with emphasis on the two most common types of marine accidents: collision and grounding. The focus is on oil tankers, as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. The model is thus regarded as a block-type model, consisting of blocks estimating the probability of collision and grounding, respectively, as well as blocks modelling the consequences of an accident. The probability of a vessel collision is assessed by means of a Minimum Distance To Collision (MDTC) based model. The model defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address the waterway crossing between Helsinki and Tallinn, where dense cross traffic is observed during certain hours. For the assessment of grounding probability, a new approach is proposed, which utilizes a newly developed model in which spatial interactions between objects in different locations are recognized. A ship on a seaway and navigational obstructions may be perceived as interacting objects, and their repulsion may be modelled by a sort of deterministic formulation. Risk due to tankers running aground addresses an approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms and concern the costs of an oil spill, based on statistics of compensations claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by parties involved.
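
    The underlying risk formula (probability of an unwanted event combined with its consequences, summed over accident types) is simple enough to state as code. The probabilities and spill costs below are placeholders, not outputs of the MDTC or grounding models.

      # Block-type risk model: expected annual loss = sum over accident types of P_i * C_i.
      scenarios = {
          "collision": {"p_annual": 0.012, "cost_meur": 45.0},  # hypothetical values
          "grounding": {"p_annual": 0.008, "cost_meur": 30.0},
      }
      risk = sum(s["p_annual"] * s["cost_meur"] for s in scenarios.values())
      print(f"expected annual loss: {risk:.2f} MEUR")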

  20. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low-risk, moderate-risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data were compared to an expert 'map' of risk perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases. PMID:24402720

  1. Assessing Landslide Risk Areas Using Statistical Models and Land Cover

    NASA Astrophysics Data System (ADS)

    Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.

    2015-12-01

    Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events related to climate change, such as typhoons and heavy rainfall, are the main drivers of this damage. Inje-gun, Gangwon-do, in particular suffered severe landslide damage in 2006 and 2007. In Inje-gun, 91% of the area is forest; therefore, many land covers related to human activities are adjacent to forest land. Thus, the establishment of adaptation plans for landslides was urgently needed. Landslide risk assessment can provide useful information to policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans that reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence. Various SDMs were used to make landslide probability maps that account for the uncertainty of the SDMs. The types of land cover were classified into 5 grades according to their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk. As a result of the overlay analysis, landslide risk areas were derived; agricultural and transportation areas in particular showed high risk and covered large areas in the risk map. In conclusion, policy makers in Inje-gun should consider the landslide risk map to establish adaptation plans effectively.
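
    The overlay step lends itself to a short sketch: a landslide probability raster multiplied by normalized land-cover vulnerability grades, then binned into risk classes. The arrays and class breaks are invented for illustration.

      import numpy as np

      prob = np.array([[0.1, 0.4, 0.7, 0.2],    # SDM-derived landslide probability
                       [0.3, 0.8, 0.9, 0.4],
                       [0.2, 0.6, 0.5, 0.1]])
      grade = np.array([[1, 3, 5, 2],           # land-cover vulnerability, 1..5
                        [2, 4, 5, 3],
                        [1, 3, 4, 2]])

      risk = prob * (grade / grade.max())       # overlay: probability x vulnerability
      risk_class = np.digitize(risk, bins=[0.2, 0.4, 0.6])  # 0=low ... 3=very high
      print(risk_class)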

  2. Risk assessment models for cancer-associated venous thromboembolism.

    PubMed

    Dutia, Mrinal; White, Richard H; Wun, Ted

    2012-07-15

    Venous thromboembolism (VTE) is common in cancer patients, and is associated with significant morbidity and mortality. Several factors, including procoagulant agents secreted by tumor cells, immobilization, surgery, indwelling catheters, and systemic treatment (including chemotherapy), contribute to an increased risk of VTE in cancer patients. There is growing interest in instituting primary prophylaxis in high-risk patients to prevent incident (first-time) VTE events. The identification of patients at sufficiently high risk of VTE to warrant primary thromboprophylaxis is essential, as anticoagulation may be associated with a higher risk of bleeding. Current guidelines recommend the use of pharmacological thromboprophylaxis in postoperative and hospitalized cancer patients, as well as ambulatory cancer patients receiving thalidomide or lenalidomide in combination with high-dose dexamethasone or chemotherapy, in the absence of contraindications to anticoagulation. However, the majority of cancer patients are ambulatory, and currently primary thromboprophylaxis is not recommended for these patients, even those considered at very high risk. In this concise review, the authors discuss risk stratification models that have been specifically developed to identify cancer patients at high risk for VTE, and thus might be useful in future studies designed to determine the potential benefit of primary thromboprophylaxis.

  3. The biobehavioral family model: testing social support as an additional exogenous variable.

    PubMed

    Woods, Sarah B; Priest, Jacob B; Roush, Tara

    2014-12-01

    This study tests the inclusion of social support as a distinct exogenous variable in the Biobehavioral Family Model (BBFM). The BBFM is a biopsychosocial approach to health that proposes that biobehavioral reactivity (anxiety and depression) mediates the relationship between family emotional climate and disease activity. Data for this study included married, English-speaking adult participants (n = 1,321; 55% female; M age = 45.2 years) from the National Comorbidity Survey Replication, a nationally representative epidemiological study of the frequency of mental disorders in the United States. Participants reported their demographics, marital functioning, social support from friends and relatives, anxiety and depression (biobehavioral reactivity), number of chronic health conditions, and number of prescription medications. Confirmatory factor analyses supported the items used in the measures of negative marital interactions, social support, and biobehavioral reactivity, as well as the use of negative marital interactions, friends' social support, and relatives' social support as distinct factors in the model. Structural equation modeling indicated a good fit of the data to the hypothesized model (χ² = 846.04, p = .000, SRMR = .039, CFI = .924, TLI = .914, RMSEA = .043). Negative marital interactions predicted biobehavioral reactivity (β = .38, p < .001), as did relatives' social support, inversely (β = -.16, p < .001). Biobehavioral reactivity predicted disease activity (β = .40, p < .001) and was demonstrated to be a significant mediator through tests of indirect effects. Findings are consistent with previous tests of the BBFM with adult samples, and suggest the important addition of family social support as a predicting factor in the model. PMID:24981970

  4. Modeling particulate matter concentrations measured through mobile monitoring in a deletion/substitution/addition approach

    NASA Astrophysics Data System (ADS)

    Su, Jason G.; Hopke, Philip K.; Tian, Yilin; Baldwin, Nichole; Thurston, Sally W.; Evans, Kristin; Rich, David Q.

    2015-12-01

    Land use regression (LUR) modeling with local-scale circular buffer domains has been used to predict traffic-related air pollution such as nitrogen oxides (NOX). LUR modeling for fine particulate matter (PM), which generally has smaller spatial gradients than NOX, has typically been applied in studies involving multiple study regions. To increase the spatial coverage for fine PM and key constituent concentrations, we designed a mobile monitoring network in Monroe County, New York to measure pollutant concentrations of black carbon (BC, wavelength of 880 nm), ultraviolet black carbon (UVBC, wavelength of 370 nm) and Delta-C (the difference between the UVBC and BC concentrations) using the Clarkson University Mobile Air Pollution Monitoring Laboratory (MAPL). A Deletion/Substitution/Addition (D/S/A) algorithm was applied, using circular buffers as a basis for the candidate statistics. The algorithm maximizes the prediction accuracy for locations without measurements using V-fold cross-validation, and it reduces overfitting compared to other approaches. We found that the D/S/A LUR modeling approach could achieve good results, with prediction powers of 60%, 63%, and 61%, respectively, for BC, UVBC, and Delta-C. The advantage of mobile monitoring is that it can monitor pollutant concentrations at hundreds of spatial points in a region, rather than the fewer than 100 points typical of a fixed-site saturation monitoring network. This research indicates that a mobile saturation sampling network, when combined with proper modeling techniques, can uncover small-area variations (e.g., 10 m) in particulate matter concentrations.
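
    A rough sketch of the selection idea, assuming synthetic data and scikit-learn: only the "addition" move of a D/S/A-style search is shown, growing the model one buffer covariate at a time whenever V-fold cross-validated R² improves (the deletion and substitution moves, and the real buffer geometry, are omitted).

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n = 150
      # Hypothetical buffer covariates, e.g. road length within 100 m / 500 m buffers.
      X_all = {"road_100m": rng.gamma(2, 1, n), "road_500m": rng.gamma(4, 1, n),
               "traffic_100m": rng.gamma(3, 1, n)}
      y = 2 + 0.8 * X_all["road_100m"] + 0.3 * X_all["traffic_100m"] + rng.normal(0, 1, n)

      chosen, best = [], -np.inf
      improved = True
      while improved:                           # greedy "addition" moves only
          improved = False
          for name in sorted(set(X_all) - set(chosen)):
              X = np.column_stack([X_all[v] for v in chosen + [name]])
              score = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2").mean()
              if score > best:
                  best, pick, improved = score, name, True
          if improved:
              chosen.append(pick)
      print(chosen, round(best, 3))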

  5. A habitat suitability model for Chinese sturgeon determined using the generalized additive method

    NASA Astrophysics Data System (ADS)

    Yi, Yujun; Sun, Jie; Zhang, Shanghong

    2016-03-01

    The Chinese sturgeon is a large anadromous fish that migrates between the ocean and rivers. Because dam construction has cut off its migration path, the species is currently on the verge of extinction. Simulating suitable environmental conditions for spawning, and then repairing or rebuilding its spawning grounds, are effective ways to protect this species. Various habitat suitability models based on expert knowledge have been used to evaluate the suitability of spawning habitat. In this study, a two-dimensional hydraulic simulation is used to inform a habitat suitability model based on the generalized additive method (GAM). The GAM is based on real observation data. The values of water depth and velocity are first calculated via the hydrodynamic model and later applied in the GAM. The final habitat suitability model is validated using the catch per unit effort (CPUE) data of 1999 and 2003. The model results show that a velocity of 1.06-1.56 m/s and a depth of 13.33-20.33 m are highly suitable ranges for the Chinese sturgeon to spawn. The hydraulic habitat suitability indices (HHSI) for seven discharges (4000; 9000; 12,000; 16,000; 20,000; 30,000; and 40,000 m3/s) are calculated to evaluate integrated habitat suitability. The results show that the integrated habitat suitability reaches its highest value at a discharge of 16,000 m3/s. This study is the first to apply a GAM to evaluate the suitability of spawning grounds for the Chinese sturgeon. The study provides a reference for the identification of potential spawning grounds in the entire basin.
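
    The final step of turning preferred hydraulic ranges into a suitability surface can be sketched briefly. The trapezoidal suitability function and fall-off widths are assumptions for illustration; the preferred ranges (velocity 1.06-1.56 m/s, depth 13.33-20.33 m) are the ones reported above.

      import numpy as np

      def suitability(x, lo, hi, falloff):
          # 1 inside [lo, hi], decaying linearly to 0 over `falloff` outside it.
          return np.clip(np.minimum((x - (lo - falloff)) / falloff,
                                    ((hi + falloff) - x) / falloff), 0.0, 1.0)

      velocity = np.array([0.8, 1.2, 1.5, 2.1])   # hypothetical model output, m/s
      depth = np.array([10.0, 15.0, 19.0, 25.0])  # m
      hsi = np.minimum(suitability(velocity, 1.06, 1.56, 0.5),
                       suitability(depth, 13.33, 20.33, 4.0))
      print(hsi.round(2))   # combined cell-level suitability in [0, 1]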

  7. Impact of an additional chronic BDNF reduction on learning performance in an Alzheimer mouse model.

    PubMed

    Psotta, Laura; Rockahr, Carolin; Gruss, Michael; Kirches, Elmar; Braun, Katharina; Lessmann, Volkmar; Bock, Jörg; Endres, Thomas

    2015-01-01

    There is increasing evidence that brain-derived neurotrophic factor (BDNF) plays a crucial role in Alzheimer's disease (AD) pathology. A number of studies demonstrated that AD patients exhibit reduced BDNF levels in the brain and the blood serum, and in addition, several animal-based studies indicated a potential protective effect of BDNF against Aβ-induced neurotoxicity. In order to further investigate the role of BDNF in the etiology of AD, we created a novel mouse model by crossing a well-established AD mouse model (APP/PS1) with a mouse exhibiting a chronic BDNF deficiency (BDNF(+/-)). This new triple transgenic mouse model enabled us to further analyze the role of BDNF in AD in vivo. We reasoned that if BDNF has a protective effect against AD pathology, an AD-like phenotype should occur earlier and/or with greater severity in our new mouse model than in the APP/PS1-mice. Indeed, the behavioral analysis revealed that the APP/PS1-BDNF(+/-)-mice show an earlier onset of learning impairments in a two-way active avoidance task in comparison to APP/PS1- and BDNF(+/-)-mice. However, in the Morris water maze (MWM) test we did not observe an overall aggravated impairment in spatial learning, and short-term memory in an object recognition task remained intact in all tested mouse lines. In addition to the behavioral experiments, we analyzed the amyloid plaque pathology in the APP/PS1 and APP/PS1-BDNF(+/-)-mice and observed a comparable plaque density in the two genotypes. Moreover, our results revealed a higher plaque density in prefrontal cortical compared to hippocampal brain regions. Our data reveal that higher cognitive tasks requiring the recruitment of cortical networks appear to be more severely affected in our new mouse model than learning tasks requiring mainly sub-cortical networks. Furthermore, our observations of an accelerated impairment in active avoidance learning in APP/PS1-BDNF(+/-)-mice further supports the hypothesis that BDNF deficiency

  8. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes' theorem was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes' theorem is adopted to obtain the posterior distribution from the prior distribution and sample information, so that sample features better reflect and represent the population. The forewarning level is judged by the maximum-probability rule, and local conditions are then used to propose management strategies that can downgrade severe warnings. This study takes the Taihu Basin as an example. After the model was applied and verified against actual and simulated water pollution risk data from 2000 to 2009, the forewarning level for 2010 was given as a severe warning, which coincides well with the logistic curve. The model is shown to be theoretically rigorous yet methodologically flexible, and to give reasonable results with a simple structure; it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk.
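
    The core update, a prior over warning levels revised by the likelihood of the simulated index values and then the maximum-probability rule, fits in a few lines. Priors and likelihoods below are invented for illustration.

      import numpy as np

      levels = ["light", "moderate", "severe"]
      prior = np.array([0.5, 0.3, 0.2])        # hypothetical prior over levels
      likelihood = np.array([0.1, 0.3, 0.9])   # P(simulated indexes | level), assumed

      posterior = prior * likelihood           # Bayes' theorem, then normalize
      posterior /= posterior.sum()
      print(dict(zip(levels, posterior.round(3))))
      print("forewarning level:", levels[int(np.argmax(posterior))])  # max-probability rule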

  9. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. The debris field due to destruction of the launch vehicle is one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and the velocity distributions on the strike probability and risk.
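
    A toy Monte Carlo version of the idea, with every distribution and threshold invented: sample debris masses and imparted velocities, propagate a miss distance, and count strikes. The paper's physics-based mass and velocity models are far more detailed.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000
      mass = (1 - rng.random(n)) ** (-1 / 1.5)   # Pareto-like fragment masses (kg)
      v_lat = rng.normal(0.0, 60.0, n)           # imparted lateral velocities (m/s)
      t_clear = 10.0                             # seconds until the capsule is clear
      miss = np.abs(v_lat) * t_clear             # lateral miss distance (m)

      # A fragment "strikes" if it stays inside the swept radius and is massive
      # enough to matter; both thresholds are illustrative only.
      strikes = (miss < 150.0) & (mass > 5.0)
      print("strike probability:", strikes.mean())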

  10. Nonlinear feedback in a six-dimensional Lorenz model: impact of an additional heating term

    NASA Astrophysics Data System (ADS)

    Shen, B.-W.

    2015-12-01

    In this study, a six-dimensional Lorenz model (6DLM) is derived, based on a recent study using a five-dimensional (5-D) Lorenz model (LM), in order to examine the impact of an additional mode and its accompanying heating term on solution stability. The new mode added to improve the representation of the streamfunction is referred to as a secondary streamfunction mode, while the two additional modes, which appear in both the 6DLM and 5DLM but not in the original LM, are referred to as secondary temperature modes. Two energy conservation relationships of the 6DLM are first derived in the dissipationless limit. The impact of three additional modes on solution stability is examined by comparing numerical solutions and ensemble Lyapunov exponents of the 6DLM and 5DLM as well as the original LM. For the onset of chaos, the critical value of the normalized Rayleigh number (rc) is determined to be 41.1. The critical value is larger than that in the 3DLM (rc ~ 24.74), but slightly smaller than the one in the 5DLM (rc ~ 42.9). A stability analysis and numerical experiments obtained using generalized LMs, with or without simplifications, suggest the following: (1) negative nonlinear feedback in association with the secondary temperature modes, as first identified using the 5DLM, plays a dominant role in providing feedback for improving the solution's stability of the 6DLM, (2) the additional heating term in association with the secondary streamfunction mode may destabilize the solution, and (3) overall feedback due to the secondary streamfunction mode is much smaller than the feedback due to the secondary temperature modes; therefore, the critical Rayleigh number of the 6DLM is comparable to that of the 5DLM. The 5DLM and 6DLM collectively suggest different roles for small-scale processes (i.e., stabilization vs. destabilization), consistent with the following statement by Lorenz (1972): "If the flap of a butterfly's wings can be instrumental in generating a tornado, it can

  11. Nonlinear feedback in a six-dimensional Lorenz Model: impact of an additional heating term

    NASA Astrophysics Data System (ADS)

    Shen, B.-W.

    2015-03-01

    In this study, a six-dimensional Lorenz model (6DLM) is derived, based on a recent study using a five-dimensional (5-D) Lorenz model (LM), in order to examine the impact of an additional mode and its accompanying heating term on solution stability. The new mode added to improve the representation of the streamfunction is referred to as a secondary streamfunction mode, while the two additional modes, which appear in both the 6DLM and 5DLM but not in the original LM, are referred to as secondary temperature modes. Two energy conservation relationships of the 6DLM are first derived in the dissipationless limit. The impact of three additional modes on solution stability is examined by comparing numerical solutions and ensemble Lyapunov exponents of the 6DLM and 5DLM as well as the original LM. For the onset of chaos, the critical value of the normalized Rayleigh number (rc) is determined to be 41.1. The critical value is larger than that in the 3DLM (rc ~ 24.74), but slightly smaller than the one in the 5DLM (rc ~ 42.9). A stability analysis and numerical experiments obtained using generalized LMs, with or without simplifications, suggest the following: (1) negative nonlinear feedback in association with the secondary temperature modes, as first identified using the 5DLM, plays a dominant role in providing feedback for improving the solution's stability of the 6DLM, (2) the additional heating term in association with the secondary streamfunction mode may destabilize the solution, and (3) overall feedback due to the secondary streamfunction mode is much smaller than the feedback due to the secondary temperature modes; therefore, the critical Rayleigh number of the 6DLM is comparable to that of the 5DLM. The 5DLM and 6DLM collectively suggest different roles for small-scale processes (i.e., stabilization vs. destabilization), consistent with the following statement by Lorenz (1972): "If the flap of a butterfly's wings can be instrumental in generating a tornado, it can
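
    The 5DLM/6DLM equations are not reproduced in these records, so as a stand-in the sketch below integrates the classic three-variable Lorenz system and estimates its leading Lyapunov exponent with a two-trajectory (Benettin-style) method, illustrating the chaos onset near the 3DLM critical value rc ~ 24.74 cited above. Step sizes and initial conditions are arbitrary choices.

      import numpy as np

      def lorenz3(s, sigma=10.0, beta=8.0 / 3.0, r=28.0):
          x, y, z = s
          return np.array([sigma * (y - x), r * x - y - x * z, x * y - beta * z])

      def leading_lyapunov(r, dt=0.002, steps=200_000, d0=1e-8):
          a = np.array([1.0, 1.0, 1.0])
          b = a + np.array([d0, 0.0, 0.0])
          total = 0.0
          for i in range(steps):
              a = a + dt * lorenz3(a, r=r)   # forward Euler is adequate for a sketch
              b = b + dt * lorenz3(b, r=r)
              if (i + 1) % 100 == 0:         # periodically renormalize the separation
                  d = np.linalg.norm(b - a)
                  total += np.log(d / d0)
                  b = a + (b - a) * (d0 / d)
          return total / (steps * dt)

      for r in (20.0, 28.0):  # below and above the 3DLM critical value rc ~ 24.74
          print(r, round(leading_lyapunov(r), 3))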

  12. Estimation and Inference in Generalized Additive Coefficient Models for Nonlinear Interactions with High-Dimensional Covariates

    PubMed Central

    Ma, Shujie; Carroll, Raymond J.; Liang, Hua; Xu, Shizhong

    2015-01-01

    In the low-dimensional case, the generalized additive coefficient model (GACM) proposed by Xue and Yang [Statist. Sinica 16 (2006) 1423–1446] has been demonstrated to be a powerful tool for studying nonlinear interaction effects of variables. In this paper, we propose estimation and inference procedures for the GACM when the dimension of the variables is high. Specifically, we propose a groupwise penalization based procedure to distinguish significant covariates for the “large p small n” setting. The procedure is shown to be consistent for model structure identification. Further, we construct simultaneous confidence bands for the coefficient functions in the selected model based on a refined two-step spline estimator. We also discuss how to choose the tuning parameters. To estimate the standard deviation of the functional estimator, we adopt the smoothed bootstrap method. We conduct simulation experiments to evaluate the numerical performance of the proposed methods and analyze an obesity data set from a genome-wide association study as an illustration. PMID:26412908

  13. Spectral models of additive and modulation noise in speech and phonatory excitation signals

    NASA Astrophysics Data System (ADS)

    Schoentgen, Jean

    2003-01-01

    The article presents spectral models of additive and modulation noise in speech. The purpose is to learn about the causes of noise in the spectra of normal and disordered voices and to gauge whether the spectral properties of the perturbations of the phonatory excitation signal can be inferred from the spectral properties of the speech signal. The approach to modeling consists of deducing the Fourier series of the perturbed speech, assuming that the Fourier series of the noise and of the clean monocycle-periodic excitation are known. The models explain published data, take into account the effects of supraglottal tremor, demonstrate the modulation distortion owing to vocal tract filtering, establish conditions under which noise cues of different speech signals may be compared, and predict the impossibility of inferring the spectral properties of the frequency modulating noise from the spectral properties of the frequency modulation noise (e.g., phonatory jitter and frequency tremor). The general conclusion is that only phonatory frequency modulation noise is spectrally relevant. Other types of noise in speech are either epiphenomenal, or their spectral effects are masked by the spectral effects of frequency modulation noise.

  14. There's risk, and then there's risk: The latest clinical prognostic risk stratification models in myelodysplastic syndromes.

    PubMed

    Zeidan, Amer M; Komrokji, Rami S

    2013-12-01

    Myelodysplastic syndromes (MDS) include a diverse group of clonal hematopoietic disorders characterized by progressive cytopenias and a propensity for leukemic progression. The biologic heterogeneity that underlies MDS translates clinically into wide variations in outcomes. Several prognostic schemes were developed to predict the natural course of MDS, counsel patients, and allow evidence-based, risk-adaptive implementation of therapeutic strategies. The prognostic schemes divide patients into subgroups with similar prognosis, but the extent to which the prognostic prediction applies to any individual patient is more variable. None of these instruments was designed to predict the clinical benefit in relation to any specific MDS therapy. The prognostic impact of molecular mutations is increasingly recognized, and attempts at incorporating it into the current prognostic schemes are ongoing.

  15. The benefits of an additional worker are task-dependent: assessing low-back injury risks during prefabricated (panelized) wall construction.

    PubMed

    Kim, Sunwook; Nussbaum, Maury A; Jia, Bochen

    2012-09-01

    Team manual material handling is a common practice in residential construction where prefabricated building components (e.g., wall panels) are increasingly used. As part of a larger effort to enable proactive control of ergonomic exposures among workers handling panels, this study explored the effects of additional workers on injury risks during team-based panel erection tasks, specifically by quantifying how injury risks are affected by increasing the number of workers (by one, above the nominal or most common number). Twenty-four participants completed panel erection tasks with and without an additional worker under different panel mass and size conditions. Four risk assessment methods were employed that emphasized the low back. Though including an additional worker generally reduced injury risk across several panel masses and sizes, the magnitude of these benefits varied depending on the specific task and exhibited somewhat high variability within a given task. These results suggest that a simple, generalizable recommendation regarding team-based panel erection tasks is not warranted. Rather, a more systems-level approach accounting for both injury risk and productivity (a strength of panelized wall systems) should be undertaken.

  16. A risk management model for securing virtual healthcare communities.

    PubMed

    Chryssanthou, Anargyros; Varlamis, Iraklis; Latsiou, Charikleia

    2011-01-01

    Virtual healthcare communities aim to bring together healthcare professionals and patients, improve the quality of healthcare services and assist healthcare professionals and researchers in their everyday activities. In a secure and reliable environment, patients share their medical data with doctors, expect confidentiality and demand reliable medical consultation. Apart from a concrete policy framework, several ethical, legal and technical issues must be considered in order to build a trustful community. This research emphasises security issues, which can arise inside a virtual healthcare community and relate to the communication and storage of data. It capitalises on a standardised risk management methodology and a prototype architecture for healthcare community portals, and justifies a security model that allows the identification, estimation and evaluation of potential security risks for the community. A hypothetical virtual healthcare community is employed in order to portray security risks and the solutions that the security model provides.

  17. Additive Factors Do Not Imply Discrete Processing Stages: A Worked Example Using Models of the Stroop Task

    PubMed Central

    Stafford, Tom; Gurney, Kevin N.

    2011-01-01

    Previously, it has been shown experimentally that the psychophysical law known as Piéron’s Law holds for color intensity and that the size of the effect is additive with that of Stroop condition (Stafford et al., 2011). According to the additive factors method (Donders, 1868–1869/1969; Sternberg, 1998), additivity is assumed to indicate independent and discrete processing stages. We present computational modeling work, using an existing Parallel Distributed Processing model of the Stroop task (Cohen et al., 1990) and a standard model of decision making (Ratcliff, 1978). This demonstrates that additive factors can be successfully accounted for by existing single stage models of the Stroop effect. Consequently, it is not valid to infer either discrete stages or separate loci of effects from additive factors. Further, our modeling work suggests that information binding may be a more important architectural property for producing additive factors than discrete stages. PMID:22102842
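
    Pieron's Law and the additivity claim can be made concrete in a few lines. The parameter values and the Stroop cost below are invented; the point is that an additive cost shifts the whole RT-intensity curve without interacting with the intensity term.

      import numpy as np

      def rt_pieron(intensity, a=0.30, b=0.25, beta=0.5, stroop_cost=0.0):
          # Pieron's Law: RT = a + b * I^(-beta), plus an additive condition cost.
          return a + b * intensity ** (-beta) + stroop_cost

      for cond, cost in (("congruent", 0.00), ("incongruent", 0.08)):
          rts = [rt_pieron(i, stroop_cost=cost) for i in (0.2, 0.5, 1.0)]
          print(cond, np.round(rts, 3))  # curves differ by a constant 0.08 s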

  18. Climate and weather risk in natural resource models

    NASA Astrophysics Data System (ADS)

    Merrill, Nathaniel Henry

    This work, consisting of three manuscripts, addresses natural resource management under risk due to variation in climate and weather. In three distinct but theoretically related applications, I quantify the role of natural resources in stabilizing economic outcomes. In Manuscript 1, we address policy designed to affect the risk of cyanobacteria blooms in a drinking water reservoir through watershed-wide policy. Combining a hydrologic and an economic model for a watershed in Rhode Island, we solve for the efficient allocation of best management practices (BMPs) on livestock pastures to meet a monthly risk-based, as well as a mean-based, water quality objective. In order to solve for the efficient allocations of nutrient control effort, we optimize a probabilistically constrained integer-programming problem representing the choices made on each farm and the resultant conditions that support cyanobacteria blooms. In doing so, we employ a genetic algorithm (GA). We hypothesize that management based on controlling the upper tail of the probability distribution of phosphorus loading implies different efficient management actions as compared to controlling mean loading. We find a shift to more intense effort on fewer acres when a probabilistic objective is specified, with cost savings of up to 25% in meeting risk levels relative to mean-loading-based policies. Additionally, we illustrate the relative cost effectiveness of various policies designed to meet this risk-based objective. Rainfall and the subsequent overland runoff transport nutrients to a receiving water body, with larger amounts of phosphorus moving in more intense rainfall events. We highlight the importance of this transport mechanism by comparing policies under climate change scenarios, where the intensity of rainfall is projected to increase and the time series process of rainfall to change. In Manuscript 2, we introduce a new economic groundwater model that incorporates the gradual shift
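
    The probabilistic constraint at the heart of Manuscript 1 can be sketched with a Monte Carlo feasibility check under an assumed loading distribution (all numbers hypothetical): a BMP allocation is acceptable if the simulated probability that monthly phosphorus loading exceeds a threshold stays at or below the risk level.

      import numpy as np

      rng = np.random.default_rng(3)

      def meets_risk_target(reduction_frac, threshold=100.0, risk=0.10, n=20_000):
          # Is P(monthly P loading after BMPs > threshold) <= risk ?
          baseline = rng.lognormal(mean=4.3, sigma=0.6, size=n)  # kg/month, assumed
          loading = baseline * (1.0 - reduction_frac)
          return float(np.mean(loading > threshold)) <= risk

      for frac in (0.0, 0.2, 0.4):  # candidate basin-wide load reductions from BMPs
          print(frac, meets_risk_target(frac))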

  19. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. PMID:22150359

  20. Active Contours Using Additive Local and Global Intensity Fitting Models for Intensity Inhomogeneous Image Segmentation

    PubMed Central

    Soomro, Shafiullah; Kim, Jeong Heon; Soomro, Toufique Ahmed

    2016-01-01

    This paper introduces an improved region-based active contour method with a level set formulation. The proposed energy functional integrates both local and global intensity fitting terms in an additive formulation. The local intensity fitting term exerts a local force that pulls the contour and confines it to object boundaries. In turn, the global intensity fitting term drives the movement of the contour at a distance from the object boundaries. The global intensity term is based on the global division algorithm, which can better capture the intensity information of an image than the Chan-Vese (CV) model. The local and global terms are mutually assimilated to construct an energy function based on a level set formulation to segment images with intensity inhomogeneity. Experimental results show that the proposed method performs better both qualitatively and quantitatively compared to other state-of-the-art methods. PMID:27800011

  1. Generalized Concentration Addition Modeling Predicts Mixture Effects of Environmental PPARγ Agonists.

    PubMed

    Watt, James; Webster, Thomas F; Schlezinger, Jennifer J

    2016-09-01

    The vast array of potential environmental toxicant combinations necessitates the development of efficient strategies for predicting toxic effects of mixtures. Current practices emphasize the use of concentration addition to predict joint effects of endocrine disrupting chemicals in coexposures. Generalized concentration addition (GCA) is one such method for predicting joint effects of coexposures to chemicals and has the advantage of allowing for mixture components to have differences in efficacy (ie, dose-response curve maxima). Peroxisome proliferator-activated receptor gamma (PPARγ) is a nuclear receptor that plays a central role in regulating lipid homeostasis, insulin sensitivity, and bone quality and is the target of an increasing number of environmental toxicants. Here, we tested the applicability of GCA in predicting mixture effects of therapeutic (rosiglitazone and nonthiazolidinedione partial agonist) and environmental PPARγ ligands (phthalate compounds identified using EPA's ToxCast database). Transcriptional activation of human PPARγ1 by individual compounds and mixtures was assessed using a peroxisome proliferator response element-driven luciferase reporter. Using individual dose-response parameters and GCA, we generated predictions of PPARγ activation by the mixtures, and we compared these predictions with the empirical data. At high concentrations, GCA provided a better estimation of the experimental response compared with 3 alternative models: toxic equivalency factor, effect summation and independent action. These alternatives provided reasonable fits to the data at low concentrations in this system. These experiments support the implementation of GCA in mixtures analysis with endocrine disrupting compounds and establish PPARγ as an important target for further studies of chemical mixtures.
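
    Under the common simplification of Hill-type dose-response curves with unit slope, f_i(c) = a_i c / (K_i + c), GCA defines the mixture effect E implicitly by sum_i c_i / f_i^{-1}(E) = 1, which lets partial agonists (smaller a_i) contribute negative inverse terms. The sketch below solves this with a root finder; all parameters and concentrations are hypothetical, not the paper's fits.

      import numpy as np
      from scipy.optimize import brentq

      alpha = np.array([1.0, 0.4])   # maximal effects: full vs partial agonist
      K = np.array([1e-7, 1e-6])     # EC50-like constants (M)
      conc = np.array([5e-8, 5e-7])  # mixture concentrations (M)

      def gca_residual(E):
          # sum_i c_i / f_i^{-1}(E) - 1, with f_i^{-1}(E) = K_i E / (alpha_i - E),
          # rewritten so it stays finite as E crosses the partial agonist's maximum.
          return np.sum(conc * (alpha - E) / (K * E)) - 1.0

      E = brentq(gca_residual, 1e-9, alpha.max() - 1e-9)
      print(round(E, 4))  # predicted mixture effect on the 0..1 response scale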

  2. Cognitive Processes that Account for Mental Addition Fluency Differences between Children Typically Achieving in Arithmetic and Children At-Risk for Failure in Arithmetic

    ERIC Educational Resources Information Center

    Berg, Derek H.; Hutchinson, Nancy L.

    2010-01-01

    This study investigated whether processing speed, short-term memory, and working memory accounted for the differential mental addition fluency between children typically achieving in arithmetic (TA) and children at-risk for failure in arithmetic (AR). Further, we drew attention to fluency differences in simple (e.g., 5 + 3) and complex (e.g., 16 +…

  3. Additive influence of genetic predisposition and conventional risk factors in the incidence of coronary heart disease: a population-based study in Greece

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An additive genetic risk score (GRS) for coronary heart disease (CHD) has previously been associated with incident CHD in the population-based Greek European Prospective Investigation into Cancer and nutrition (EPIC) cohort. In this study, we explore GRS-‘environment’ joint actions on CHD for severa...

  4. State-of-the-Art in Tsunami Risk Modelling for a global perspective

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Tsunamis can be considered the natural hazard with the largest global spread resulting from a single event, with the exception of global extinction-level events (volcanoes, meteor impacts). Multiple extreme events have occurred during the last decade, including the devastating 2004 Sumatra tsunami, the 2010 Chile event and the 2011 Japan event. In general, the hazard and risk of tsunamis are investigated in regional or inter-regional projects, as in Japan, Indonesia or New Zealand, following different methodologies and investigating various source mechanisms. Thus, in this study, a review of the state of the art in global tsunami risk modelling has been undertaken. The most recent and up-to-date methodologies and projects from all over the world have been assembled for direct comparison, to provide a global perspective on a hazard that can affect multiple countries at once with extreme magnitudes. The assemblage of these models provides insight into the temporal and spatial development of tsunami risk research and how it was adopted by research institutes and combined into official hazard modelling. A global map is assembled, indicating local and international case studies and projects with respect to their source model and date of development. In addition, the study covers the development of software packages used to set up hazard and risk models, and it investigates the different source processes of tsunami generation and propagation. A comparison is made using a multicriteria approach to examine the physical models and capabilities of software packages as well as the source identification procedures in different hazard models. A complete and up-to-date overview of tsunami risk and hazard modelling is presented, compared and classified, which has far-reaching uses for the preparation of a tsunami risk assessment at any point on the earth.

  5. A Dual System Model of Preferences under Risk

    ERIC Educational Resources Information Center

    Mukherjee, Kanchan

    2010-01-01

    This article presents a dual system model (DSM) of decision making under risk and uncertainty according to which the value of a gamble is a combination of the values assigned to it independently by the affective and deliberative systems. On the basis of research on dual process theories and empirical research in Hsee and Rottenstreich (2004) and…

  6. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  7. Surface Water Contamination Risk Assessment Modeled by Fuzzy-WRASTIC.

    PubMed

    Alavipoor, Fatemeh Sadat; Ghorbaninia, Zahra; Karimi, Saeed; Jafari, Hamidreza

    2016-07-01

    This research provides a new Fuzzy-WRASTIC model for water resource contamination risk assessment in a GIS (Geographic Information System) environment. First, the method, set within a multi-criteria evaluation (MCE) framework, reviewed and mapped the sub-criteria of each criterion. Then, the related sub-layers were fuzzified in accordance with GIS environment standards. In the next step, the sub-layers were combined, and the pollution risk status was modeled by utilizing a fuzzy overlay method, applying the OR, AND, SUM, PRODUCT and GAMMA operators, together with the WLC (Weighted Linear Combination) method and the weights provided in the WRASTIC model. The results provide the best combination of modeling and the percentages of its risk categories of low, medium, high and very high, which are 1.8%, 14.07%, 51.43% and 32.7%, respectively. Most areas are at severe risk due to the unbalanced arrangement and compactness of land uses around surface water resources. PMID:27329055
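
    The overlay operators named above have standard fuzzy definitions that are easy to state in code; the membership rasters and gamma value here are illustrative.

      import numpy as np

      a = np.array([[0.2, 0.7], [0.9, 0.4]])  # membership rasters, values in [0, 1]
      b = np.array([[0.5, 0.6], [0.8, 0.3]])

      fuzzy_and = np.minimum(a, b)
      fuzzy_or = np.maximum(a, b)
      fuzzy_product = a * b
      fuzzy_sum = 1 - (1 - a) * (1 - b)
      gamma = 0.9                              # blends fuzzy SUM and fuzzy PRODUCT
      fuzzy_gamma = fuzzy_sum ** gamma * fuzzy_product ** (1 - gamma)
      print(fuzzy_gamma.round(3))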

  8. Model of risk assessment under ballistic statistical tests

    NASA Astrophysics Data System (ADS)

    Gabrovski, Ivan; Karakaneva, Juliana

    The material presents the application of a mathematical method for risk assessment in the statistical determination of the ballistic limits of protective equipment. The authors have implemented a mathematical model based on Pierson's criteria. The software implementation of the model makes it possible to evaluate the V50 indicator and to assess the reliability of the statistical hypotheses. The results supply specialists with interval estimates of the probability determined during the testing process.
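
    V50 estimation from binary penetration outcomes is commonly done with a probit (or logistic) fit over impact velocity; the sketch below uses statsmodels with invented test data and reads V50 off as the velocity where the fitted index crosses zero.

      import numpy as np
      import statsmodels.api as sm

      v = np.array([380, 395, 410, 420, 430, 440, 455, 470, 485, 500])  # m/s, invented
      perforated = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

      X = sm.add_constant(v.astype(float))
      fit = sm.Probit(perforated, X).fit(disp=0)
      b0, b1 = fit.params
      print("V50 estimate (m/s):", round(-b0 / b1, 1))  # 50% perforation point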

  9. Modeling external carbon addition in biological nutrient removal processes with an extension of the International Water Association activated sludge model.

    PubMed

    Swinarski, M; Makinia, J; Stensel, H D; Czerwionka, K; Drewnowski, J

    2012-08-01

    The aim of this study was to expand the International Water Association Activated Sludge Model No. 2d (ASM2d) to account for a newly defined readily biodegradable substrate that can be consumed by polyphosphate-accumulating organisms (PAOs) under anoxic and aerobic conditions, but not under anaerobic conditions. The model change was to add a new substrate component and process terms for its use by PAOs and other heterotrophic bacteria under anoxic and aerobic conditions. The Gdansk (Poland) wastewater treatment plant (WWTP), which has a modified University of Cape Town (MUCT) process for nutrient removal, provided field data and mixed liquor for batch tests for model evaluation. The original ASM2d was first calibrated under dynamic conditions with the results of batch tests with settled wastewater and mixed liquor, in which nitrate-uptake rates, phosphorus-release rates, and anoxic phosphorus uptake rates were followed. Model validation was conducted with data from a 96-hour measurement campaign in the full-scale WWTP. The results of similar batch tests with ethanol and fusel oil as the external carbon sources were used to adjust kinetic and stoichiometric coefficients in the expanded ASM2d. Both models were compared based on their predictions of the effect of adding supplemental carbon to the anoxic zone of an MUCT process. In comparison with the ASM2d, the new model better predicted the anoxic behaviors of carbonaceous oxygen demand, nitrate-nitrogen (NO3-N), and phosphorous (PO4-P) in batch experiments with ethanol and fusel oil. However, when simulating ethanol addition to the anoxic zone of a full-scale biological nutrient removal facility, both models predicted similar effluent NO3-N concentrations (6.6 to 6.9 g N/m3). For the particular application, effective enhanced biological phosphorus removal was predicted by both models with external carbon addition but, for the new model, the effluent PO4-P concentration was approximately one-half of that found from

  10. Job strain (demands and control model) as a predictor of cardiovascular risk factors among petrochemical personnel

    PubMed Central

    Habibi, Ehsanollah; Poorabdian, Siamak; Shakerian, Mahnaz

    2015-01-01

    Background: One of the practical models for assessing stressful working conditions due to job strain is the job demand and control model, which explains how physical and psychological adverse consequences, including cardiovascular risk factors, can arise from high work demands (the amount of workload, together with time limitations to complete that work) and low worker control over the work (lack of decision-making authority) in the workplace. The aim of this study was to investigate how certain cardiovascular risk factors (including body mass index [BMI], heart rate, blood pressure, cholesterol and smoking) are related to job demand and job control. Materials and Methods: This prospective cohort study was conducted on 500 workers of the petrochemical industry in the south of Iran in 2009. The study population was selected using simple random sampling. Participants completed the job demand and control questionnaire. Cardiovascular risk factor data were extracted from the workers' occupational hygiene profiles. The chi-square (χ2) test and the eta (η) hypothesis test were used to assess possible relationships between the quantified variables, individual demographic factors and cardiovascular risk factors. Results: The results of this study revealed a significant relationship between the job demand and control model and cardiovascular risk factors. The chi-square test result for heart rate showed the strongest relationship (χ2 = 145.078); the corresponding results for smoking and BMI were χ2 = 85.652 and χ2 = 30.941, respectively. The hypothesis-test results for cholesterol and hypertension were 0.469 and 0.684, respectively. Discussion: Job strain is likely to be associated with an increased prevalence of cardiovascular risk factors among male staff in a petrochemical company in Iran. The parameters of the job demands and control model can act as acceptable predictors of the probability of job stress occurrence.

  11. Hospital-associated venous thromboembolism in pediatrics: a systematic review and meta-analysis of risk factors and risk-assessment models

    PubMed Central

    Mahajerin, Arash; Branchford, Brian R.; Amankwah, Ernest K.; Raffini, Leslie; Chalmers, Elizabeth; van Ommen, C. Heleen; Goldenberg, Neil A.

    2015-01-01

    Hospital-associated venous thromboembolism, including deep vein thrombosis and pulmonary embolism, is increasing in pediatric centers. The objective of this work was to systematically review literature on pediatric hospital-acquired venous thromboembolism risk factors and risk-assessment models, to inform future prevention research. We conducted a literature search on pediatric venous thromboembolism risk via PubMed (1946–2014) and Embase (1980–2014). Data on risk factors and risk-assessment models were extracted from case-control studies, while prevalence data on clinical characteristics were obtained from registries, large (n>40) retrospective case series, and cohort studies. Meta-analyses were conducted for risk factors or clinical characteristics reported in at least three studies. Heterogeneity among studies was assessed with the Cochran Q test and quantified by the I² statistic. From 394 initial articles, 60 met the final inclusion criteria (20 case-control studies and 40 registries/large case series/cohort studies). Significant risk factors among case-control studies were: intensive care unit stay (OR: 2.14, 95% CI: 1.97–2.32); central venous catheter (OR: 2.12, 95% CI: 2.00–2.25); mechanical ventilation (OR: 1.56, 95% CI: 1.42–1.72); and length of stay in hospital (per each additional day, OR: 1.03, 95% CI: 1.03–1.03). Three studies developed/applied risk-assessment models from a combination of these risk factors. Fourteen significant clinical characteristics were identified through non-case-control studies. This meta-analysis confirms central venous catheter, intensive care unit stay, mechanical ventilation, and length of stay as risk factors. A few pediatric hospital-acquired venous thromboembolism risk scores have emerged employing these factors. Prospective validation is necessary to inform risk-stratified prevention trials. PMID:26001789
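
    The pooled odds ratios and heterogeneity statistics reported here follow standard inverse-variance meta-analysis arithmetic. A minimal sketch with illustrative inputs (not the extracted study data), computing the fixed-effect pooled OR, Cochran's Q and I²:

    ```python
    import numpy as np

    # Per-study (OR, lower 95% CL, upper 95% CL) for one risk factor; values are
    # illustrative, loosely in the range reported above.
    studies = [(2.0, 1.5, 2.7), (2.3, 1.9, 2.8), (1.9, 1.4, 2.6)]

    log_or = np.log([o for o, _, _ in studies])
    se = (np.log([u for *_, u in studies]) - np.log([l for _, l, _ in studies])) / (2 * 1.96)
    w = 1 / se**2                                # inverse-variance (fixed-effect) weights

    pooled = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - pooled) ** 2)       # Cochran's Q
    i2 = max(0.0, (q - (len(studies) - 1)) / q)  # I²: share of variation from heterogeneity
    print(np.exp(pooled), q, i2)
    ```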

  13. Modeling and additive manufacturing of bio-inspired composites with tunable fracture mechanical properties.

    PubMed

    Dimas, Leon S; Buehler, Markus J

    2014-07-01

    Flaws, imperfections and cracks are ubiquitous in material systems and are commonly the catalysts of catastrophic material failure. As stresses and strains tend to concentrate around cracks and imperfections, structures tend to fail far before large regions of material have ever been subjected to significant loading. Therefore, a major challenge in material design is to engineer systems that perform on par with pristine structures despite the presence of imperfections. In this work we integrate knowledge of biological systems with computational modeling and state-of-the-art additive manufacturing to synthesize advanced composites with tunable fracture mechanical properties. Supported by extensive mesoscale computer simulations, we demonstrate the design and manufacturing of composites that exhibit deformation mechanisms characteristic of pristine systems, featuring flaw-tolerant properties. We analyze the results by directly comparing strain fields for the synthesized composites, obtained through digital image correlation (DIC), and the computationally tested composites. Moreover, we plot Ashby diagrams for the range of simulated and experimental composites. Our findings show good agreement between simulation and experiment, confirming that the proposed mechanisms have a significant potential for vastly improving the fracture response of composite materials. We elucidate the role of stiffness ratio variations of composite constituents as an important feature in determining the composite properties. Moreover, our work validates the predictive ability of our models, presenting them as useful tools for guiding further material design. This work enables the tailored design and manufacturing of composites assembled from inferior building blocks that achieve optimal combinations of stiffness and toughness. PMID:24700202

  14. Additive surface complexation modeling of uranium(VI) adsorption onto quartz-sand dominated sediments.

    PubMed

    Dong, Wenming; Wan, Jiamin

    2014-06-17

    Many aquifers contaminated by U(VI)-containing acidic plumes are composed predominantly of quartz-sand sediments. The F-Area of the Savannah River Site (SRS) in South Carolina (USA) is an example. To predict U(VI) mobility and natural attenuation, we conducted U(VI) adsorption experiments using the F-Area plume sediments and reference quartz, goethite, and kaolinite. The sediments are composed of ∼96% quartz-sand and 3-4% fine fractions of kaolinite and goethite. We developed a new humic acid adsorption method for determining the relative surface area abundances of goethite and kaolinite in the fine fractions. This method is expected to be applicable to many other binary mineral pairs, and allows successful application of component additivity (CA)-based surface complexation modeling (SCM) at the SRS F-Area and other similar aquifers. Our experimental results indicate that quartz has a stronger U(VI) adsorption ability per unit surface area than goethite and kaolinite at pH ≤ 4.0. Our modeling results indicate that the binary (goethite/kaolinite) CA-SCM under-predicts U(VI) adsorption to the quartz-sand dominated sediments at pH ≤ 4.0. The new ternary (quartz/goethite/kaolinite) CA-SCM provides excellent predictions. The contributions of quartz-sand, kaolinite, and goethite to U(VI) adsorption and the potential influences of dissolved Al, Si, and Fe are also discussed.
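
    The additivity step of a CA approach can be illustrated as surface-area-weighted bookkeeping. A real CA-SCM sums adsorbed species from full equilibrium speciation calculations for each mineral (e.g. in a geochemical solver); the sketch below only shows the additive combination, and all numbers are placeholders:

    ```python
    # Fraction of total sediment surface area contributed by each mineral, and a
    # hypothetical per-unit-area distribution coefficient fitted for each mineral
    # at a given pH (placeholder values, not the paper's).
    area_abundance = {"quartz": 0.55, "goethite": 0.25, "kaolinite": 0.20}
    kd_per_area = {"quartz": 1.2, "goethite": 0.6, "kaolinite": 0.3}

    # Component additivity: the sediment-scale coefficient is the abundance-weighted
    # sum of the single-mineral contributions.
    kd_sediment = sum(area_abundance[m] * kd_per_area[m] for m in area_abundance)
    print(kd_sediment)
    ```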

  16. An integrated model-based approach to the risk assessment of pesticide drift from vineyards

    NASA Astrophysics Data System (ADS)

    Pivato, Alberto; Barausse, Alberto; Zecchinato, Francesco; Palmeri, Luca; Raga, Roberto; Lavagnolo, Maria Cristina; Cossu, Raffaello

    2015-06-01

    The inhalation of pesticides in air is of particular concern for people living in close contact with intensive agricultural activities. This study aims to develop an integrated modelling methodology to assess whether pesticides pose a risk to the health of people living near vineyards, and to apply this methodology in the world-renowned Prosecco DOCG (Italian label for protection of origin and geographical indication of wines) region. A sample field in Bigolino di Valdobbiadene (North-Eastern Italy) was selected to perform the pesticide fate modelling and the consequent inhalation risk assessment for people living in the area. The modelling accounts for the direct pesticide loss during the treatment of vineyards and for the volatilization from soil after the end of the treatment. A fugacity model was used to assess the volatilization flux from soil. The Gaussian puff air dispersion model CALPUFF was employed to assess the airborne concentration of the emitted pesticide over the simulation domain. The subsequent risk assessment integrates the HArmonised environmental Indicators for pesticide Risk (HAIR) and US-EPA guidelines. In this case study, the modelled situation turned out to be safe from the point of view of human health for non-carcinogenic compounds, and additional improvements were suggested to further mitigate the effect of the most critical compound.
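
    CALPUFF itself is a full modelling system, but the kernel it generalizes is the textbook Gaussian puff. A minimal sketch of that kernel, with coordinates relative to the puff centre and no ground reflection, plume rise or chemistry (simplifications of this sketch, not of the study):

    ```python
    import numpy as np

    def gaussian_puff(q, x, y, z, sigma_x, sigma_y, sigma_z):
        """Concentration at (x, y, z) for an instantaneous puff of mass q with
        dispersion parameters sigma_x/y/z (textbook kernel, no reflection)."""
        norm = q / ((2 * np.pi) ** 1.5 * sigma_x * sigma_y * sigma_z)
        return norm * np.exp(-0.5 * ((x / sigma_x) ** 2
                                     + (y / sigma_y) ** 2
                                     + (z / sigma_z) ** 2))

    print(gaussian_puff(q=1.0, x=100.0, y=0.0, z=0.0,
                        sigma_x=50.0, sigma_y=50.0, sigma_z=20.0))
    ```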

  17. Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand

    NASA Astrophysics Data System (ADS)

    Kaiser, G.; Kortenhaus, A.

    2009-04-01

    The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction of or damage to life-supporting infrastructure, such as buildings, roads, and water and power supply, caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood-prone areas at the Andaman Sea coast, in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as a vulnerability analysis of the socio-economic and ecological systems, in order to determine the scenario-based, specific risk for the region. In this presentation, results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have modelled tsunami wave generation and propagation in the Indian Ocean, detailed inundation patterns, i.e. water depth and flow dynamics, are still poorly determined; for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation strongly depends on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) data have been widely applied in tsunami modelling approaches, as these data are free and available almost worldwide.

  18. Challenges of Modeling Flood Risk at Large Scales

    NASA Astrophysics Data System (ADS)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale, such that all inter-dependencies in a river network are well understood. From an insurance perspective, the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at the scale suitable for a viable insurance market for flood peril. In order to serve the insurance market, a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components, so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing module then propagates these flows through the river network.
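
    The portfolio output described, an annual probability distribution of losses, is conventionally summarized from a frequency-severity simulation. A minimal sketch with purely illustrative Poisson/lognormal assumptions (not the calibrated Great Britain model):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_years = 100_000
    # Illustrative assumptions: Poisson count of flood events per year,
    # lognormal economic loss per event.
    n_events = rng.poisson(lam=1.2, size=n_years)
    annual_loss = np.array([rng.lognormal(mean=15.0, sigma=1.0, size=k).sum()
                            for k in n_events])

    aal = annual_loss.mean()                   # average annual loss
    agg_100 = np.quantile(annual_loss, 0.99)   # ~1-in-100-year aggregate annual loss
    print(f"AAL = {aal:.3e}, 1-in-100 aggregate loss = {agg_100:.3e}")
    ```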

  19. Modeling of Radiation Risks for Human Space Missions

    NASA Technical Reports Server (NTRS)

    Fletcher, Graham

    2004-01-01

    Prior to any human space flight, calculations of radiation risks are used to determine the acceptable scope of astronaut activity. Using the supercomputing facilities at NASA Ames Research Center, Ames researchers have determined the damage probabilities of DNA functional groups by space radiation. The data supersede those used in the current Monte Carlo model for risk assessment. One example is the reaction of DNA with the hydroxyl radical produced by the interaction of highly energetic particles from space radiation with water molecules in the human body. This reaction is considered an important cause of DNA mutations, although its mechanism is not well understood.

  20. Reducing uncertainty in risk modeling for methylmercury exposure

    SciTech Connect

    Ponce, R.; Egeland, G.; Middaugh, J.; Lee, R.

    1995-12-31

    The biomagnification and bioaccumulation of methylmercury in marine species represents a challenge for risk assessment related to the consumption of subsistence foods in Alaska. Because of the profound impact that food consumption advisories have on indigenous peoples seeking to preserve a way of life, there is a need to reduce uncertainty in risk assessment. Thus, research was initiated to reduce the uncertainty in assessing the health risks associated with the consumption of subsistence foods. Because marine subsistence foods typically contain elevated levels of methylmercury, preliminary research efforts have focused on methylmercury as the principal chemical of concern. Of particular interest are the antagonistic effects of selenium on methylmercury toxicity. Because of this antagonism, methylmercury exposure through the consumption of marine mammal meat (with high selenium) may not be as toxic as comparable exposures through other sources of dietary intake, such as in the contaminated bread episode of Iraq (containing relatively low selenium). This hypothesis is supported by animal experiments showing reduced toxicity of methylmercury associated with marine mammal meat, by the antagonistic influence of selenium on methylmercury toxicity, and by negative clinical findings in adult populations exposed to methylmercury through a marine diet not subject to industrial contamination. Exploratory model development is underway to identify potential improvements and applications of current deterministic and probabilistic models, particularly by incorporating selenium as an antagonist in risk modeling methods.
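
    One way such probabilistic models are built is as a Monte Carlo hazard quotient. In the sketch below, every distribution is hypothetical and the selenium term is purely illustrative of how an antagonist could be incorporated; it is not an established dose-response correction:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    intake_ug_day = rng.lognormal(np.log(20), 0.5, n)   # MeHg intake (hypothetical)
    bw_kg = rng.lognormal(np.log(70), 0.2, n)           # body weight (hypothetical)
    rfd = 0.1                                           # ug/kg/day, illustrative reference dose
    se_hg_molar = rng.lognormal(np.log(2.0), 0.4, n)    # dietary Se:Hg molar ratio (hypothetical)

    hq = (intake_ug_day / bw_kg) / rfd                  # hazard quotient without antagonism
    # Illustrative antagonism: discount the quotient when Se is in molar excess.
    hq_adj = hq * np.clip(1.0 / se_hg_molar, 0.2, 1.0)
    print((hq > 1).mean(), (hq_adj > 1).mean())
    ```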

  1. The acquired preparedness model of risk for bulimic symptom development.

    PubMed

    Combs, Jessica L; Smith, Gregory T; Flory, Kate; Simmons, Jean R; Hill, Kelly K

    2010-09-01

    The authors applied person-environment transaction theory to test the acquired preparedness model of eating disorder risk. The model holds that (a) middle-school girls high in the trait of ineffectiveness are differentially prepared to acquire high-risk expectancies for reinforcement from dieting or thinness; (b) those expectancies predict subsequent binge eating and purging; and (c) the influence of the disposition of ineffectiveness on binge eating and purging is mediated by dieting or thinness expectancies. In a three-wave longitudinal study of 394 middle-school girls, the authors found support for the model. Seventh-grade girls' scores on ineffectiveness predicted their subsequent endorsement of high-risk dieting or thinness expectancies, which in turn predicted subsequent increases in binge eating and purging. Statistical tests of mediation supported the hypothesis that the prospective relation between ineffectiveness and binge eating was mediated by dieting or thinness expectancies, as was the prospective relation between ineffectiveness and purging. This application of a basic science theory to eating disorder risk appears fruitful, and the findings suggest the importance of early interventions that address both disposition and learning.

  2. Small scale water recycling systems--risk assessment and modelling.

    PubMed

    Diaper, C; Dixon, A; Bulier, D; Fewkes, A; Parsons, S A; Strathern, M; Stephenson, T; Strutt, J

    2001-01-01

    This paper aims to use quantitative risk analysis, risk modelling and simulation modelling tools to assess the performance of a proprietary single-house grey water recycling system. A preliminary Hazard and Operability study (HAZOP) identified the main hazards, both health related and economic, associated with installing the recycling system in a domestic environment. The health related consequences of system failure were associated with the presence of increased concentrations of micro-organisms at the point of use, due to failure of the disinfection system and/or the pump. The risk model was used to assess the increase in the probability of infection for a particular genus of micro-organism, Salmonella spp., during disinfection failure. The increase in the number of cases of infection above a base rate rose from 0.001% during normal operation to 4% for a recycling system with no disinfection. The simulation model was used to examine the possible effects of pump failure. The model indicated that the anaerobic COD release rate in the system storage tank increases over time and dissolved oxygen decreases during this failure mode. These conditions are likely to result in odour problems.
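
    Infection-probability estimates of this kind typically come from a microbial dose-response model; for Salmonella the approximate beta-Poisson form is often used. A sketch, where the parameter values are ones commonly cited in the QMRA literature rather than values taken from this paper:

    ```python
    def p_infection_beta_poisson(dose, alpha=0.3126, n50=23_600):
        """Approximate beta-Poisson dose-response:
        P = 1 - (1 + dose * (2**(1/alpha) - 1) / N50) ** (-alpha).
        alpha and N50 here are literature values often quoted for nontyphoid
        Salmonella (an assumption; check primary sources before use)."""
        return 1 - (1 + dose * (2 ** (1 / alpha) - 1) / n50) ** (-alpha)

    # Risk at an ingested dose of 10,000 organisms, e.g. during disinfection failure.
    print(p_infection_beta_poisson(1e4))
    ```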

  3. Modeling insurer-homeowner interactions in managing natural disaster risk.

    PubMed

    Kesete, Yohannes; Peng, Jiazhen; Gao, Yang; Shan, Xiaojun; Davidson, Rachel A; Nozick, Linda K; Kruse, Jamie

    2014-06-01

    The current system for managing natural disaster risk in the United States is problematic for both homeowners and insurers. Homeowners are often uninsured or underinsured against natural disaster losses, and typically do not invest in retrofits that can reduce losses. Insurers often do not want to insure against these losses, which are some of their biggest exposures and can cause an undesirably high chance of insolvency. There is a need to design an improved system that acknowledges the different perspectives of the stakeholders. In this article, we introduce a new modeling framework to help understand and manage the insurer's role in catastrophe risk management. The framework includes a new game-theoretic optimization model of insurer decisions that interacts with a utility-based homeowner decision model and is integrated with a regional catastrophe loss estimation model. Reinsurer and government roles are represented as bounds on the insurer-insured interactions. We demonstrate the model for a full-scale case study of hurricane risk to residential buildings in eastern North Carolina; present the results from the perspectives of all stakeholders: primary insurers, homeowners (insured and uninsured), and reinsurers; and examine the effect of key parameters on the results.

  4. Modelling of fire count data: fire disaster risk in Ghana.

    PubMed

    Boadi, Caleb; Harvey, Simon K; Gyeke-Dako, Agyapomaa

    2015-01-01

    Stochastic dynamics involved in ecological count data require distribution fitting procedures to model the data and make informed judgments. The study provides empirical research focused on the provision of an early warning system and a spatial graph that can detect societal fire risks. It offers an opportunity for communities, organizations, risk managers, actuaries and governments to be aware of and understand fire risks, so that they will increase the direct tackling of the threats posed by fire. A statistical distribution fitting method that best helps identify the stochastic dynamics of fire count data is used. The aim is to provide a fire-prediction model and a fire spatial graph for observed fire count data. An empirical probability distribution model is fitted to the fire count data and compared to the theoretical probability distribution of the stochastic process of fire count data. The distribution fitted to the fire frequency count data helps identify the class of models exhibited by the fire process and provides lead time for decisions. The research suggests that fire frequency and loss (fire fatality) count data in Ghana are best modelled with a negative binomial distribution. The spatial map of observed fire frequency and fatality measured over 5 years (2007-2011) offers in this study a first regional assessment of fire frequency and fire fatality in Ghana. PMID:26702383
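
    Fitting a negative binomial distribution to such counts is a short maximum-likelihood exercise; the counts below are illustrative, not the Ghanaian fire data:

    ```python
    import numpy as np
    from scipy import optimize, stats

    counts = np.array([3, 7, 2, 9, 4, 6, 11, 5, 3, 8])  # illustrative monthly fire counts

    def nb_negloglik(params):
        # Negative log-likelihood of NB(r, p) for the observed counts.
        r, p = params
        return -np.sum(stats.nbinom.logpmf(counts, r, p))

    res = optimize.minimize(nb_negloglik, x0=[2.0, 0.3],
                            bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
    r_hat, p_hat = res.x
    print(r_hat, p_hat, res.fun)
    ```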

  5. Generalized Concentration Addition Modeling Predicts Mixture Effects of Environmental PPARγ Agonists.

    PubMed

    Watt, James; Webster, Thomas F; Schlezinger, Jennifer J

    2016-09-01

    The vast array of potential environmental toxicant combinations necessitates the development of efficient strategies for predicting toxic effects of mixtures. Current practices emphasize the use of concentration addition to predict joint effects of endocrine disrupting chemicals in coexposures. Generalized concentration addition (GCA) is one such method for predicting joint effects of coexposures to chemicals and has the advantage of allowing for mixture components to have differences in efficacy (i.e., dose-response curve maxima). Peroxisome proliferator-activated receptor gamma (PPARγ) is a nuclear receptor that plays a central role in regulating lipid homeostasis, insulin sensitivity, and bone quality and is the target of an increasing number of environmental toxicants. Here, we tested the applicability of GCA in predicting mixture effects of therapeutic (rosiglitazone and a nonthiazolidinedione partial agonist) and environmental PPARγ ligands (phthalate compounds identified using EPA's ToxCast database). Transcriptional activation of human PPARγ1 by individual compounds and mixtures was assessed using a peroxisome proliferator response element-driven luciferase reporter. Using individual dose-response parameters and GCA, we generated predictions of PPARγ activation by the mixtures, and we compared these predictions with the empirical data. At high concentrations, GCA provided a better estimation of the experimental response compared with three alternative models: toxic equivalency factor, effect summation and independent action. These alternatives provided reasonable fits to the data at low concentrations in this system. These experiments support the implementation of GCA in mixtures analysis with endocrine disrupting compounds and establish PPARγ as an important target for further studies of chemical mixtures. PMID:27255385
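
    For Hill-type dose-response curves with slope 1, f_i(x) = alpha_i x / (K_i + x), the GCA mixture response can be obtained in closed form by solving sum_i c_i / f_i^{-1}(E) = 1 for E. The sketch below uses that slope-1 special case (an assumption; the paper's fitted curves may use other slopes):

    ```python
    def gca_hill1(conc, ec50, alpha):
        """GCA mixture effect for slope-1 Hill curves with EC50s K_i and maximal
        effects alpha_i: E = sum(c_i*alpha_i/K_i) / (1 + sum(c_i/K_i))."""
        num = sum(c * a / k for c, k, a in zip(conc, ec50, alpha))
        den = 1 + sum(c / k for c, k in zip(conc, ec50))
        return num / den

    # Full agonist (alpha=1.0) mixed with a partial agonist (alpha=0.4);
    # concentrations and EC50s are illustrative.
    print(gca_hill1(conc=[1.0, 5.0], ec50=[0.5, 10.0], alpha=[1.0, 0.4]))
    ```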

  6. Additive Effects of the Risk Alleles of PNPLA3 and TM6SF2 on Non-alcoholic Fatty Liver Disease (NAFLD) in a Chinese Population

    PubMed Central

    Wang, Xiaoliang; Liu, Zhipeng; Wang, Kai; Wang, Zhaowen; Sun, Xing; Zhong, Lin; Deng, Guilong; Song, Guohe; Sun, Baining; Peng, Zhihai; Liu, Wanqing

    2016-01-01

    Recent genome-wide association studies have identified that variants in or near PNPLA3, NCAN, GCKR, LYPLAL1, and TM6SF2 are significantly associated with non-alcoholic fatty liver disease (NAFLD) in multiple ethnic groups. Studies of their impact on NAFLD in Han Chinese are still limited. In this study, we examined the relevance of these variants to NAFLD in a community-based Han Chinese population and further explored their potential joint effect on NAFLD. Six single nucleotide polymorphisms (SNPs) (PNPLA3 rs738409, rs2294918, NCAN rs2228603, GCKR rs780094, LYPLAL1 rs12137855, and TM6SF2 rs58542926), previously identified in genome-wide analyses to be associated with NAFLD, were genotyped in 384 NAFLD patients and 384 age- and gender-matched healthy controls. We found that two of the six polymorphisms, PNPLA3 rs738409 (OR = 1.52, 95% CI: 1.19–1.96; P = 0.00087) and TM6SF2 rs58542926 (OR = 2.11, 95% CI: 1.34–3.39; P = 0.0016), are independently associated with NAFLD after adjustment for the effects of age, gender, and BMI. Our analysis further demonstrated strong additive effects of the risk alleles of PNPLA3 and TM6SF2, with a significant overall association between the number of risk alleles and NAFLD (OR = 1.64, 95% CI: 1.34–2.01; P = 1.4 × 10⁻⁶). The OR for NAFLD increased in an additive manner, with an average increase in OR of 1.52 per additional risk allele. Our results confirmed that the PNPLA3 and TM6SF2 variants are the most significant risk alleles for NAFLD in the Chinese population. Therefore, genotyping these two genetic risk factors may help identify individuals at the highest risk of NAFLD. PMID:27532011
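
    A per-allele odds ratio of this kind is usually read off a logistic regression with the risk-allele count as a covariate. A sketch on synthetic data (not the study's), with the simulated effect size chosen near the reported per-allele OR of 1.52:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 768
    alleles = rng.integers(0, 5, n)             # 0-4 risk alleles across the two SNPs
    age = rng.normal(50, 10, n)
    bmi = rng.normal(25, 3, n)
    logit = -2 + np.log(1.52) * alleles + 0.01 * (age - 50) + 0.1 * (bmi - 25)
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([alleles, age, bmi]))
    fit = sm.Logit(y, X).fit(disp=0)
    print(np.exp(fit.params[1]))                # recovered OR per additional risk allele
    ```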

  7. Brownfields and health risks--air dispersion modeling and health risk assessment at landfill redevelopment sites.

    PubMed

    Ofungwu, Joseph; Eget, Steven

    2006-07-01

    Redevelopment of landfill sites in the New Jersey-New York metropolitan area for recreational (golf courses), commercial, and even residential purposes seems to be gaining acceptance among municipal planners and developers. Landfill gas generation, which includes methane and potentially toxic nonmethane compounds, usually continues long after landfill closure. It is therefore prudent to evaluate potential health risks associated with exposure to gas emissions before redevelopment of landfill sites as recreational, commercial, and, especially, residential properties. Unacceptably high health risks would call for risk management measures such as limiting the development to commercial/recreational rather than residential uses, stringent gas control mechanisms, interior air filtration, etc. A methodology is presented for applying existing models to estimate residual emissions of hazardous landfill compounds and to quantify the associated health risks. Besides the toxic gas constituents of landfill emissions, other risk-related issues concerning buried waste, landfill leachate, and explosive gases were qualitatively evaluated. Five contiguously located landfill sites in New Jersey intended for residential and recreational redevelopment were used to exemplify the approach.

  8. Evaluation of Major Online Diabetes Risk Calculators and Computerized Predictive Models.

    PubMed

    Stiglic, Gregor; Pajnkihar, Majda

    2015-01-01

    Classical paper-and-pencil based risk assessment questionnaires are often accompanied by online versions of the questionnaire to reach a wider population. This study focuses on the loss, especially in risk estimation performance, that can be inflicted by direct transformation from the paper to online versions of risk estimation calculators, ignoring the possibilities of more complex and accurate calculations that can be performed using the online calculators. We empirically compare the risk estimation performance of four major diabetes risk calculators and two more advanced predictive models. National Health and Nutrition Examination Survey (NHANES) data from 1999-2012 were used to evaluate the performance of detecting diabetes and pre-diabetes. The American Diabetes Association risk test achieved the best predictive performance in the category of classical paper-and-pencil based tests, with an Area Under the ROC Curve (AUC) of 0.699 for undiagnosed diabetes (0.662 for pre-diabetes) and 47% (47% for pre-diabetes) of persons selected for screening. Our results demonstrate a significant difference in performance, with the additional benefit of a lower number of persons selected for screening, when statistical methods are used. The best AUC overall was obtained in diabetes risk prediction using logistic regression, with an AUC of 0.775 (0.734) and an average of 34% (48%) of persons selected for screening. However, generalized boosted regression models might be a better option from the economic point of view, as the number of persons selected for screening of 30% (47%) lies significantly lower for diabetes risk assessment in comparison to logistic regression (p < 0.001), with a significantly higher AUC (p < 0.001) of 0.774 (0.740) for the pre-diabetes group. Our results demonstrate a serious lack of predictive performance in four major online diabetes risk calculators. Therefore, one should take great care and consider optimizing the online versions of questionnaires that were directly transferred from their paper-based counterparts.
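
    The two headline metrics here, the AUC and the fraction of persons selected for screening, are straightforward to compute once a calculator outputs a score. A sketch on synthetic scores (illustrative, not NHANES data):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    y = rng.binomial(1, 0.15, 5000)             # diabetes status (synthetic)
    score = rng.normal(0, 1, 5000) + 0.8 * y    # risk-calculator output (synthetic)

    print(roc_auc_score(y, score))              # discrimination (AUC)
    threshold = np.quantile(score, 0.66)        # screen the top ~34% of scores
    print((score >= threshold).mean())          # fraction selected for screening
    ```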

  10. Implications of pharmacokinetic modeling in risk assessment analysis.

    PubMed Central

    Lutz, R J; Dedrick, R L

    1987-01-01

    Physiologic pharmacokinetic models are a useful interface between exposure models and risk assessment models by providing a means to estimate tissue concentrations of reactive chemical species at the site of action. The models utilize numerous parameters that can be characterized as anatomical, such as body size or tissue volume; physiological, such as tissue blood perfusion rates, clearances, and metabolism; thermodynamic, such as partition coefficients; and transport, such as membrane permeabilities. The models provide a format to investigate how these parameters can influence the disposition of chemicals throughout the body, which is an important consideration in interpreting toxicity studies. Physiologic models can take into account nonlinear effects related to clearance, metabolism, or transport. They allow for extrapolation of tissue concentration from high dose to low dose experiments and from species to species and can account for temporal variations in dose. PMID:3447907

  11. Derivation of a risk assessment model for hospital-acquired venous thrombosis: the NAVAL score.

    PubMed

    de Bastos, Marcos; Barreto, Sandhi M; Caiafa, Jackson S; Boguchi, Tânia; Silva, José Luiz Padilha; Rezende, Suely M

    2016-05-01

    Venous thrombosis (VT) is a preventable cause of death in hospitalized patients. The main strategy to decrease VT incidence is timely thromboprophylaxis in at-risk patients. We sought to evaluate the reliability of risk assessment model (RAM) data, the incremental usefulness of additional variables and the modelling of an adjusted score (the NAVAL score). We used the RAM proposed by Caprini for initial assessment. A 5 % systematic sample of data was independently reviewed for reliability. We evaluated the incremental usefulness of six variables for VT during the score modelling by logistic regression. We then assessed the NAVAL score for calibration, reclassification and discrimination performances. We observed 11,091 patients with 37 (0.3 %) VT events. Using the Caprini RAM, high-risk and moderate-risk patients were respectively associated with a 17.4 (95 % confidence interval [CI] 6.1-49.9) and 4.2 (95 % CI 1.6-11.0) increased VT risk compared with low-risk patients. Four independent variables were selected for the NAVAL score: "Age", "Admission clinic", "History of previous VT event" and "History of thrombophilia". The area under the receiver-operating-characteristic curve for the NAVAL score was 0.72 (95 % CI 0.63-0.81). The Net Reclassification Index (NRI) for the NAVAL score compared with the Caprini RAM was -0.1 (95 % CI -0.3 to 0.1; p = 0.28). We conclude that the NAVAL score is a simplified tool for the stratification of VT risk in hospitalized patients. With only four variables, it demonstrated good performance and discrimination, but requires external validation before clinical application. We also confirm that the Caprini RAM can effectively stratify VT risk in hospitalized patients in our population. PMID:26446587

  12. The determination of risk areas for muddy floods based on a worst-case erosion modelling

    NASA Astrophysics Data System (ADS)

    Saathoff, Ulfert; Schindewolf, Marcus; Annika Arévalo, Sarah

    2013-04-01

    Soil erosion and muddy floods are a frequently occurring hazard in the German state of Saxony because of the topography and high relief energy, together with the high proportion of arable land. Still, the events are rather heterogeneously distributed, and it is not known where damage is likely to occur. The goal of this study is to locate hot spots of muddy-flood risk, with the objective of preventing high economic damage in future. We applied a soil erosion and deposition map of Saxony, calculated with the process-based soil erosion model EROSION 3D. This map shows the potential soil erosion and transported sediment for worst-case soil conditions and a 10-year rain storm event. Furthermore, a map of the current land use in the state is used. From the land-use map, we extracted those areas that are especially vulnerable to muddy floods, such as residential and industrial areas, infrastructural facilities (e.g. power plants, hospitals) and highways. In combination with the output of the soil erosion model, the amount of sediment that enters each single land-use entity is calculated. Based on these data, a state-wide map with classified risks is created. The results are furthermore used to identify the risk of muddy floods for each single municipality in Saxony. The results are evaluated against muddy flood events with documented locations that occurred during the period between 2000 and 2010. Additionally, plausibility tests are performed for selected areas (examination of land use, topography and soil). The results prove to be plausible, and most of the documented events can be explained by the modelled risk map. The created map can be used by different institutions, such as city and traffic planners, to estimate the risk of muddy flood occurrence at specific locations. Furthermore, the risk map can serve insurance companies in evaluating the insurance risk of a building. To make the results easily accessible, the risk map will be published online via a web GIS.

  13. Agents, Bayes, and Climatic Risks - a modular modelling approach

    NASA Astrophysics Data System (ADS)

    Haas, A.; Jaeger, C.

    2005-08-01

    When insurance firms, energy companies, governments, NGOs, and other agents strive to manage climatic risks, it is by no means clear what the aggregate outcome should and will be. As a framework for investigating this subject, we present the LAGOM model family. It is based on modules depicting learning social agents. For managing climate risks, our agents use second-order probabilities and update them by means of a Bayesian mechanism, while differing in priors and risk aversion. The interactions between these modules and the aggregate outcomes of their actions are implemented using further modules. The software system is implemented as a series of parallel processes using the CIAMn approach. It is possible to couple modules irrespective of the language they are written in, the operating system under which they run, and the physical location of the machine.
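
    The abstract does not publish LAGOM's internals, but the belief-updating mechanism it names, Bayesian revision of probability beliefs by agents with different priors, can be sketched generically with a conjugate Beta model (everything below is a stand-in, not LAGOM code):

    ```python
    from dataclasses import dataclass

    @dataclass
    class BetaBelief:
        """An agent's belief about an annual loss probability, as a Beta distribution."""
        alpha: float
        beta: float

        def update(self, losses: int, no_losses: int) -> "BetaBelief":
            # Conjugate Bayesian update after observing loss / no-loss years.
            return BetaBelief(self.alpha + losses, self.beta + no_losses)

        @property
        def mean(self) -> float:
            return self.alpha / (self.alpha + self.beta)

    cautious = BetaBelief(2, 8)    # prior mean 0.20: agents differ in priors...
    optimist = BetaBelief(1, 19)   # prior mean 0.05
    print(cautious.update(3, 7).mean, optimist.update(3, 7).mean)  # ...and converge with data
    ```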

  14. A neural network model for credit risk evaluation.

    PubMed

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the backpropagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real-world credit applications from the Australian credit approval dataset. A comparison of the system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, following the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in automatic processing of credit applications.
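
    A comparable experiment is easy to reproduce with a modern library. The sketch below uses scikit-learn rather than the paper's own backpropagation implementation; make_classification stands in for the Australian credit approval data (690 applications, 14 attributes), and the hidden-layer sizes are illustrative:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Stand-in for the Australian credit data: 690 samples, 14 features.
    X, y = make_classification(n_samples=690, n_features=14, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for layers in [(16,), (16, 8)]:   # one vs. two hidden layers, as compared in the paper
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=layers,
                                          max_iter=2000, random_state=0))
        clf.fit(X_tr, y_tr)
        print(layers, clf.score(X_te, y_te))   # accept/reject accuracy
    ```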

  15. Development of Standardized Probabilistic Risk Assessment Models for Shutdown Operations Integrated in SPAR Level 1 Model

    SciTech Connect

    S. T. Khericha; J. Mitman

    2008-05-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during Modes 4, 5, and 6 at pressurized water reactors and Modes 4 and 5 at boiling water reactors can be significant. This paper describes using the U.S. Nuclear Regulatory Commission’s full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development of risk evaluation models for commercial nuclear power plants. The shutdown models are integrated with their respective internal event at-power SPAR model. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. Preliminary human reliability analysis results indicate that risk is dominated by the operator’s ability to correctly diagnose events and initiate systems.

  16. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    SciTech Connect

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion (TSCE) model of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute exposure models but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piecewise-constant-parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures in an acute series, either only the first mutation rate is allowed to vary with dose, or all the parameters may be dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with dose. With these analytical functions, it becomes easy to evaluate the risks of cancer and to deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple-exposure version of the TSCE model was to be used to estimate beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed. A draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed.
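
    For constant parameters, and neglecting stochastic extinction of intermediate clones, the TSCE hazard has a widely used closed-form approximation. This is the standard approximation, not the authors' exact piecewise-constant solution:

    ```python
    import numpy as np

    def tsce_hazard(t, nu, mu, gamma):
        """Approximate TSCE hazard h(t) = nu * mu * (exp(gamma*t) - 1) / gamma,
        with nu the initiation rate, mu the malignant-transformation rate and
        gamma the net clonal expansion rate of intermediate cells."""
        return nu * mu * np.expm1(gamma * t) / gamma

    print(tsce_hazard(t=60.0, nu=1e-5, mu=1e-7, gamma=0.1))  # illustrative parameter values
    ```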

  17. Risk assessment and management: a community forensic mental health practice model.

    PubMed

    Kelly, Teresa; Simmons, Warren; Gregory, Esther

    2002-12-01

    In Victoria, the Crimes (Mental Impairment and Unfitness to be Tried) Act (1997) reformed legal practice in relation to the detention, management and release of persons found by a court to be not guilty on the grounds of insanity or unfit to be tried. This Act provides a legal structure for such 'forensic patients' to move from secure inpatient facilities into the community. This new legislative landscape has generated challenges for all stakeholders and has provided the impetus for the development of a risk assessment and management model. The key components of the model are the risk profile, assessment and management plan. The discussion comprises theory, legislation, practice implications and limitations of the model. Practice implications concern the provision of objective tools, which identify risk and document strategic interventions to support clinical management. Some of the practice limitations include the model's applicability to risk assessment and management and its dependence on a mercurial multi-service interface in after-hours crisis situations. In addition to this, the paper articulates human limitations implicit in the therapeutic relationship that necessarily underpins the model. The paper concludes with an exploration of the importance of evaluative processes as well as the need for formal support and education for clinicians.

  18. Assessing discriminative ability of risk models in clustered data

    PubMed Central

    2014-01-01

    Background The discriminative ability of a risk model is often measured by Harrell’s concordance-index (c-index). The c-index estimates, for two randomly chosen subjects, the probability that the model predicts a higher risk for the subject with the poorer outcome (concordance probability). When data are clustered, as in multicenter data, two types of concordance are distinguished: concordance in subjects from the same cluster (within-cluster concordance probability) and concordance in subjects from different clusters (between-cluster concordance probability). We argue that the within-cluster concordance probability is most relevant when a risk model supports decisions within clusters (e.g. who should be treated in a particular center). We aimed to explore different approaches to estimate the within-cluster concordance probability in clustered data. Methods We used data of the CRASH trial (2,081 patients clustered in 35 centers) to develop a risk model for mortality after traumatic brain injury. To assess the discriminative ability of the risk model within centers, we first calculated cluster-specific c-indexes. We then pooled the cluster-specific c-indexes into a summary estimate with different meta-analytical techniques. We considered fixed effect meta-analysis with different weights (equal; inverse variance; number of subjects, events or pairs) and random effects meta-analysis. We reflected on pooling the estimates on the log-odds scale rather than the probability scale. Results The cluster-specific c-index varied substantially across centers (IQR = 0.70–0.81; I² = 0.76, 95% confidence interval 0.66 to 0.82). Summary estimates resulting from fixed effect meta-analysis ranged from 0.75 (equal weights) to 0.84 (inverse variance weights). With random effects meta-analysis, accounting for the observed heterogeneity in c-indexes across clusters, we estimated a mean of 0.77, a between-cluster variance of 0.0072 and a 95% prediction interval of 0.60 to 0.95.
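
    The within-cluster approach can be sketched in two steps: compute a c-index per cluster, then pool with random-effects weights. The pooling below uses the common DerSimonian-Laird estimator on the probability scale; the authors also consider other weights and the log-odds scale:

    ```python
    import numpy as np

    def c_index(risk, event):
        """Concordance for binary outcomes within one cluster: the probability
        that an event subject is assigned a higher risk than a non-event subject."""
        conc = ties = total = 0
        for i in range(len(risk)):
            for j in range(len(risk)):
                if event[i] and not event[j]:
                    total += 1
                    conc += risk[i] > risk[j]
                    ties += risk[i] == risk[j]
        return (conc + 0.5 * ties) / total

    def pool_random_effects(c, var):
        """DerSimonian-Laird random-effects pooling of cluster-specific c-indexes."""
        c, var = np.asarray(c, float), np.asarray(var, float)
        w = 1 / var
        fixed = np.sum(w * c) / np.sum(w)
        q = np.sum(w * (c - fixed) ** 2)
        tau2 = max(0.0, (q - (len(c) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
        w_star = 1 / (var + tau2)
        return np.sum(w_star * c) / np.sum(w_star), tau2

    print(c_index([0.9, 0.4, 0.7, 0.2], [1, 0, 1, 0]))
    print(pool_random_effects([0.75, 0.80, 0.70], [0.002, 0.003, 0.004]))
    ```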

  19. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    42 Public Health 1, revised as of 2011-10-01. Subpart: Risk Models Used To Estimate Probability of Causation. § 81.10 Use of cancer risk assessment models in NIOSH IREP. The tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb survivors.

  4. Multistage Carcinogenesis Modelling of Low and Protracted Radiation Exposure for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Brugmans, M. J. P.; Bijwaard, H.

    Exposure to cosmic radiation in space poses an increased risk of radiation-induced cancer later in life. Modelling is essential to quantify these excess risks from low and protracted exposures to a mixture of radiation types, since they cannot be determined directly in epidemiological studies. Multistage carcinogenesis models provide a mechanistic basis for the extrapolation of epidemiological data to the regime that is relevant for radiation protection. In recent years, we have exploited the well-known two-mutation carcinogenesis model to bridge the gap between radiobiology and epidemiology. We have fitted this model to a number of animal and epidemiological data sets, using dose-response relationships for the mutational steps that are well established in cellular radiobiology. The methodology and implications for radiation risks are illustrated with analyses of two radiation-induced tumours: bone cancer from internal (high-LET and low-LET) emitters and lung cancer after radon exposure. For the risks of bone-seeking radionuclides (Ra-226, Sr-90, Pu-239), model fits to beagle data show that the dose-effect relationship for bone cancer at low intakes is linear-quadratic. This is due to a combination of equally strong linear dose-effects in the two subsequent mutational steps in the model. This supra-linear dose-effect relationship is also found in a model analysis of bone cancer in radium dial painters. This implies that at low intakes the risks from bone seekers are significantly lower than estimated from a linear extrapolation from high doses. Model analyses of radon-exposed rats and uranium miners show that lung-cancer induction is dominated by a linear radiation effect in the first mutational step. For two miner cohorts with significantly different lung cancer baselines, a uniform description of the effect of radon is obtained in a joint analysis. This demonstrates the possibility of modelling risk transfer across populations.

  5. Survivorship models for estimating the risk of decompression sickness.

    PubMed

    Kumar, K V; Powell, M R

    1994-07-01

    Several approaches have been used for modeling the incidence of decompression sickness (DCS), such as Hill's dose-response and logistic regression. Most of these methods do not include the time-to-onset information in the model. Survival analysis (failure time analysis) is appropriate when the time to onset of an event is of interest. The applicability of survival analysis for modeling the risk of DCS is illustrated using data obtained from hypobaric chamber exposures simulating extravehicular activities (n = 426). Univariate analyses of incidence-free survival proportions were obtained for Doppler-detectable circulating microbubbles (CMB), symptoms of DCS and test aborts. A log-linear failure time regression model with the 360-min half-time tissue ratio (TR) as covariate was constructed, and estimated probabilities for various TR values were calculated. Further regression analysis including CMB status in this model showed significant improvement (p < 0.05) in the estimation of DCS over the previous model. Since DCS depends on the exposure pressure as well as the duration of exposure, we recommend the use of survival analysis for modeling the risk of DCS. PMID:7945136
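
    An accelerated failure time model of this log-linear kind can be fitted with the lifelines library; the data frame below is illustrative (onset times, censoring flags and 360-min tissue ratios are made up, not the chamber data):

    ```python
    import pandas as pd
    from lifelines import LogNormalAFTFitter

    # Time to DCS onset (min), event flag (0 = exposure ended without DCS),
    # and 360-min half-time tissue ratio (TR) as the covariate.
    df = pd.DataFrame({
        "onset_min": [110, 240, 360, 360, 75, 300, 360, 180, 90, 360],
        "dcs":       [1,   1,   0,   0,   1,  1,   0,   1,   1,  0],
        "tr360":     [1.9, 1.6, 1.4, 1.3, 2.1, 1.5, 1.2, 1.8, 2.0, 1.3],
    })

    aft = LogNormalAFTFitter()
    aft.fit(df, duration_col="onset_min", event_col="dcs")
    print(aft.summary)                                 # effect of TR on time to onset
    print(aft.predict_survival_function(df.iloc[:2]))  # incidence-free survival curves
    ```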

  6. Cost-Sensitive Boosting: Fitting an Additive Asymmetric Logistic Regression Model

    NASA Astrophysics Data System (ADS)

    Li, Qiu-Jie; Mao, Yao-Bin; Wang, Zhi-Quan; Xiang, Wen-Bo

    Conventional machine learning algorithms like boosting tend to treat misclassification errors equally, which is not adequate for certain cost-sensitive classification problems such as object detection. Although many cost-sensitive extensions of boosting that directly modify the weighting strategy of the corresponding original algorithms have been proposed and reported, they are heuristic in nature, proved effective only by empirical results, and lack sound theoretical analysis. This paper develops a framework from a statistical insight that can embody almost all existing cost-sensitive boosting algorithms: fitting an additive asymmetric logistic regression model by stage-wise optimization of certain criteria. Four cost-sensitive versions of boosting algorithms are derived, namely CSDA, CSRA, CSGA and CSLB, which respectively correspond to Discrete AdaBoost, Real AdaBoost, Gentle AdaBoost and LogitBoost. Experimental results on the application of face detection have shown the effectiveness of the proposed learning framework in the reduction of the cumulative misclassification cost.
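
    The common heuristic that the paper puts on a statistical footing can be sketched as Discrete AdaBoost with asymmetric example weights. This sketch is the heuristic baseline (CSDA-like in spirit), not the paper's derived update rules:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def cost_sensitive_adaboost(X, y, cost_fn, cost_fp, n_rounds=50):
        """Discrete AdaBoost over stumps where missing a positive costs cost_fn
        and a false alarm costs cost_fp, encoded in the initial weights."""
        w = np.where(y == 1, cost_fn, cost_fp).astype(float)
        w /= w.sum()
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = w[pred != y].sum()
            if err == 0 or err >= 0.5:
                break
            alpha = 0.5 * np.log((1 - err) / err)
            w *= np.exp(-alpha * np.where(pred == y, 1, -1))  # up-weight mistakes
            w /= w.sum()
            stumps.append(stump)
            alphas.append(alpha)
        return stumps, alphas
    ```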

  7. Influence of the heterogeneous reaction HCL + HOCl on an ozone hole model with hydrocarbon additions

    SciTech Connect

    Elliott, S.; Cicerone, R.J.; Turco, R.P.

    1994-02-20

    Injection of ethane or propane has been suggested as a means for reducing ozone loss within the Antarctic vortex because alkanes can convert active chlorine radicals into hydrochloric acid. In kinetic models of vortex chemistry including as heterogeneous processes only the hydrolysis and HCl reactions of ClONO₂ and N₂O₅, parts per billion by volume levels of the light alkanes counteract ozone depletion by sequestering chlorine atoms. Introduction of the surface reaction of HCl with HOCl causes ethane to deepen baseline ozone holes and generally works to impede any mitigation by hydrocarbons. The increased depletion occurs because HCl + HOCl can be driven by HOₓ radicals released during organic oxidation. Following initial hydrogen abstraction by chlorine, alkane breakdown leads to a net hydrochloric acid activation as the remaining hydrogen atoms enter the photochemical system. Lowering the rate constant for reactions of organic peroxy radicals with ClO to 10⁻¹³ cm³ molecule⁻¹ s⁻¹ does not alter results, and the major conclusions are insensitive to the timing of the ethane additions. Ignoring the organic peroxy radical plus ClO reactions entirely restores remediation capabilities by allowing HOₓ removal independent of HCl. Remediation also returns if early evaporation of polar stratospheric clouds leaves hydrogen atoms trapped in aldehyde intermediates, but real ozone losses are small in such cases. 95 refs., 4 figs., 7 tabs.

  8. In vivo characterization of two additional Leishmania donovani strains using the murine and hamster model.

    PubMed

    Kauffmann, F; Dumetz, F; Hendrickx, S; Muraille, E; Dujardin, J-C; Maes, L; Magez, S; De Trez, C

    2016-05-01

    Leishmania donovani is a protozoan parasite causing the neglected tropical disease visceral leishmaniasis. One difficulty in studying the immunopathology of L. donovani infection is the limited adaptability of the strains to experimental mammalian hosts. Our knowledge about L. donovani infections relies on a restricted number of East African strains (LV9, 1S). Isolated from patients in the 1960s, these strains were described extensively in mice and Syrian hamsters and have consequently become 'reference' laboratory strains. L. donovani strains from the Indian subcontinent display distinct clinical features compared to East African strains. Some reports describing the in vivo immunopathology of strains from the Indian subcontinent exist. This study comprises a comprehensive immunopathological characterization of infection with two additional strains, the Ethiopian L. donovani L82 strain and the Nepalese L. donovani BPK282 strain, in both Syrian hamsters and C57BL/6 mice. Parameters including parasitaemia levels, weight loss, hepatosplenomegaly and alterations in the cellular composition of the spleen and liver showed that the L82 strain generated an overall more virulent infection than the BPK282 strain. Altogether, both L. donovani strains are suitable and interesting for subsequent in vivo investigation of visceral leishmaniasis in the Syrian hamster and the C57BL/6 mouse model. PMID:27012562

  9. Enhancement of colour stability of anthocyanins in model beverages by gum arabic addition.

    PubMed

    Chung, Cheryl; Rojanasasithara, Thananunt; Mutilangi, William; McClements, David Julian

    2016-06-15

    This study investigated the potential of gum arabic to improve the stability of anthocyanins that are used in commercial beverages as natural colourants. The degradation of purple carrot anthocyanin in model beverage systems (pH 3.0) containing L-ascorbic acid proceeded with a first-order reaction rate during storage (40 °C for 5 days in light). The addition of gum arabic (0.05-5.0%) significantly enhanced the colour stability of anthocyanin, with the most stable systems observed at intermediate levels (1.5%). A further increase in concentration (>1.5%) reduced its efficacy due to a change in the conformation of the gum arabic molecules that hindered their exposure to the anthocyanins. Fluorescence quenching measurements showed that the anthocyanin could have interacted with the glycoprotein fractions of the gum arabic through hydrogen bonding, resulting in enhanced stability. Overall, this study provides valuable information about enhancing the stability of anthocyanins in beverage systems using natural ingredients.
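
    As a small worked example of the first-order kinetics invoked above, under C(t) = C0·exp(-k·t) the rate constant k is minus the slope of ln C against time; the concentrations below are hypothetical, not the paper's measurements.

        import numpy as np

        days = np.array([0, 1, 2, 3, 4, 5])
        conc = np.array([100.0, 82.0, 66.0, 54.0, 44.0, 36.0])  # arbitrary units

        # First-order decay: ln C = ln C0 - k*t, so k = -slope.
        slope, intercept = np.polyfit(days, np.log(conc), 1)
        k = -slope
        print(f"k = {k:.3f} per day, half-life = {np.log(2) / k:.1f} days")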

  10. Risk factors assessment and risk prediction models in lung cancer screening candidates

    PubMed Central

    Wachuła, Ewa; Szabłowska-Siwik, Sylwia; Boratyn-Nowicka, Agnieszka; Czyżewski, Damian

    2016-01-01

    From February 2015, low-dose computed tomography (LDCT) screening entered the armamentarium of diagnostic tools broadly available to individuals at high risk of developing lung cancer. While a huge number of pulmonary nodules are identified, only a small fraction turn out to be early lung cancers; the majority constitute a variety of benign lesions. Although this entails a burden of diagnostic work-up, the indisputable benefit emerges from: (I) lung cancer diagnosis at earlier stages (stage shift); (II) additional findings enabling the implementation of preventive action beyond the realm of thoracic oncology. This review presents how to utilize risk factors from distinct categories such as epidemiology, radiology and biomarkers to target the fraction of the population that may benefit most from the introduced screening modality. PMID:27195269

  11. Field evaluation of an avian risk assessment model

    USGS Publications Warehouse

    Vyas, N.B.; Spann, J.W.; Hulse, C.S.; Borges, S.L.; Bennett, R.S.; Torrez, M.; Williams, B.I.; Leffel, R.

    2006-01-01

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential for adverse effects to birds in the field. We tested technical-grade diazinon and its DZN 50W (50% diazinon active ingredient, wettable powder) formulation on Canada goose (Branta canadensis) goslings. Brain acetylcholinesterase activity was measured, and the feathers and skin, feet, and gastrointestinal contents were analyzed for diazinon residues. The dose-response curves showed that diazinon was significantly more toxic to goslings in the outdoor test than in the laboratory tests. The deterministic risk assessment method identified the potential for risk to birds in general, but the factors associated with extrapolating from the laboratory to the field, and from the laboratory test species to other species, resulted in the underestimation of risk to the goslings. The present study indicates that laboratory-based risk quotients should be interpreted with caution.

  12. Making Risk Models Operational for Situational Awareness and Decision Support

    SciTech Connect

    Paulson, Patrick R.; Coles, Garill A.; Shoemaker, Steven V.

    2012-06-12

    Modernization of nuclear power operations control systems, in particular the move to digital control systems, creates an opportunity to modernize existing legacy infrastructure and extend plant life. We describe here decision support tools that allow the assessment of different facets of risk and support the optimization of available resources to reduce risk as plants are upgraded and maintained. This methodology could become an integrated part of the design review process and of operations management systems. The methodology can be applied to the design of new reactors such as small modular reactors (SMRs) and is helpful in assessing the risks of different reactor configurations. Our tool provides a low-cost evaluation of alternative configurations and an expanded safety analysis by considering scenarios early in the implementation cycle, when cost impacts can be minimized. The effects of failures can be modeled and thoroughly vetted to understand their potential impact on risk. The process and tools presented here allow for an integrated assessment of risk by supporting traditional defense-in-depth approaches while taking into consideration the insertion of new digital instrumentation and control systems.

  13. A Longitudinal Transactional Risk Model for Early Eating Disorder Onset

    PubMed Central

    Pearson, Carolyn M.; Combs, Jessica L.; Zapolski, Tamika C. B.; Smith, Gregory T.

    2014-01-01

    The presence of binge eating behavior in early middle school predicts future diagnoses and health difficulties. The authors showed that this early binge eating behavior can, itself, be predicted by risk factors assessed in elementary school. We tested the acquired preparedness model of risk, which involves transactions among personality, psychosocial learning, and binge eating. In a sample of 1,906 children assessed in the spring of fifth grade (the last year of elementary school), the fall of sixth grade, and the spring of sixth grade, we found that fifth grade negative urgency (the personality tendency to act rashly when distressed) predicted subsequent increases in the expectancy that eating helps alleviate negative affect, which in turn predicted subsequent increases in binge eating behavior. This transactional risk process appeared to continue to occur at later time points. Negative urgency in the fall of sixth grade was predicted by fifth grade pubertal onset, binge eating behavior, and expectancies. It, in turn, predicted increases in high-risk eating expectancies by the spring of sixth grade, and thus heightened risk. PMID:22428790

  14. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    NASA Astrophysics Data System (ADS)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding, posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They therefore require holistic approaches to help understand their complexity, in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of the introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualise socio-technical systems from an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  15. Recurrence models of volcanic events: Applications to volcanic risk assessment

    SciTech Connect

    Crowe, B.M.; Picard, R.; Valentine, G.; Perry, F.V.

    1992-03-01

    An assessment of the risk of future volcanism has been conducted for isolation of high-level radioactive waste at the potential Yucca Mountain site in southern Nevada. Risk used in this context refers to a combined assessment of the probability and consequences of future volcanic activity. Past studies established bounds on the probability of magmatic disruption of a repository. These bounds were revised as additional data were gathered from site characterization studies. The probability of direct intersection of a potential repository located in an eight-km² area of Yucca Mountain by ascending basalt magma was bounded by the range of 10⁻⁸ to 10⁻¹⁰ yr⁻¹. The consequences of magmatic disruption of a repository were estimated in previous studies to be limited. The exact releases from such an event are dependent on the strike of an intruding basalt dike relative to the repository geometry, the timing of the basaltic event relative to the age of the radioactive waste and the mechanisms of release and dispersal of the waste radionuclides in the accessible environment. The combined low probability of repository disruption and the limited releases associated with this event established the basis for the judgement that the risk of future volcanism was relatively low. It was reasoned that the risk of future volcanism was not likely to result in disqualification of the potential Yucca Mountain site.

  16. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Mark P. Little; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of solid cancers induced by high-dose radiation included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed, in addition to cellular initiation, inactivation, and proliferation, a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed). As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low

  17. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: what is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at a nuclear power plant (NPP)? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economical consequences for different geographical areas and various population groups, taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  18. Forecasting the risk of brown tree snake dispersal from Guam: a mixed transport-establishment model.

    PubMed

    Perry, Gad; Vice, Dan

    2009-08-01

    The brown tree snake (Boiga irregularis) is a devastating invader that has ecologically and economically affected Guam and is poised to disperse further. Interdiction efforts are being conducted on Guam and at some of the potential receiving sites, but no tools exist for evaluating the potential for snake incursion; thus, the amount of effort that should be invested in protecting particular sites is unknown. We devised a model that predicts the relative risk of establishment of the brown tree snake (BTS) at a given site. To calculate overall risk, we incorporated in the model information on the likelihood of an organism entering the transportation system, avoiding detection, surviving to arrive at another location, and establishing at the receiving end. On the basis of documented rates of snake arrival at receiving sites, the model produced realistic predictions of invasion risk. Model outputs can thus be used to prioritize interdiction efforts to focus on especially vulnerable receiving locations. We provide examples of the utility of the model in evaluating the impacts of changes in transportation parameters. Finally, the model can be used to evaluate the impacts of BTS establishment at an additional site and of the creation of a new source of snakes. The use of qualitative inputs allows the model to be adapted by substituting data on other invasive species or transportation systems.

  19. Integrated Water and Sanitation Risk Assessment and Modeling in the Upper Sonora River basin (Northwest, Mexico)

    NASA Astrophysics Data System (ADS)

    Mayer, A. S.; Robles-Morua, A.; Halvorsen, K. E.; Vivoni, E. R.; Auer, M. T.

    2011-12-01

    explore the use of participatory modeling frameworks in less developed regions. Results indicate that respondents agreed strongly with the hydrologic and water quality modeling methodologies presented and considered the modeling results useful. Our results also show that participatory modeling approaches can have short-term impacts, as seen in the changes in water-related risk perceptions. In total, these projects revealed that water resources management solutions need to take into account variations across the human landscape (i.e. risk perceptions) and variations in the biophysical response of watersheds to natural phenomena (i.e. streamflow generation) and to anthropogenic activities (i.e. contaminant fate and transport). In addition, this work underscores the notion that sustainable water resources solutions need to contend with uncertainty in our understanding and predictions of human perceptions and biophysical systems.

  20. Prediction models for early risk detection of cardiovascular event.

    PubMed

    Purwanto; Eswaran, Chikkannan; Logeswaran, Rajasvaran; Abdul Rahman, Abdul Rashid

    2012-04-01

    Cardiovascular disease (CVD) is the major cause of death globally. More people die of CVDs each year than from any other disease. Over 80% of CVD deaths occur in low- and middle-income countries and occur almost equally in males and females. In this paper, different computational models based on Bayesian Networks, Multilayer Perceptron, Radial Basis Function and Logistic Regression methods are presented for early risk detection of cardiovascular events. A total of 929 (626 male and 303 female) heart attack records are used to construct the models. The models are tested using combined as well as separate male and female data. Among the models used, it is found that the Multilayer Perceptron model yields the best accuracy.
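
    As a hedged sketch of the best-performing of the compared methods, the snippet below trains a multilayer-perceptron classifier with scikit-learn; the synthetic features and labels are placeholders for the heart-attack dataset described, not the study's data.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(929, 8))                      # hypothetical risk factors
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 929) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                            random_state=0))
        model.fit(X_tr, y_tr)
        print("test accuracy:", model.score(X_te, y_te))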

  1. FIRESTORM: Modelling the water quality risk of wildfire.

    NASA Astrophysics Data System (ADS)

    Mason, C. I.; Sheridan, G. J.; Smith, H. G.; Jones, O.; Chong, D.; Tolhurst, K.

    2012-04-01

    Following wildfire, loss of vegetation and changes to soil properties may result in decreases in infiltration rates, less rainfall interception, and higher overland flow velocities. Rainfall events affecting burn areas before vegetation recovers can cause high magnitude erosion events that impact on downstream water quality. For cities and towns that rely upon fire-prone forest catchments for water supply, wildfire impacts on water quality represent a credible risk to water supply security. Quantifying the risk associated with the occurrence of wildfires and the magnitude of water quality impacts has important implications for managing water supplies. At present, no suitable integrative model exists that considers the probabilistic nature of system inputs as well as the range of processes and scales involved in this problem. We present FIRESTORM, a new model currently in development that aims to determine the range of sediment and associated contaminant loads that may be delivered to water supply reservoirs from the combination of wildfire and subsequent rainfall events. This Monte Carlo model incorporates the probabilistic nature of fire ignition, fire weather and rainfall, and includes deterministic models for fire behaviour and locally dominant erosion processes. FIRESTORM calculates the magnitude and associated annual risk of catchment-scale sediment loads associated with the occurrence of wildfire and rainfall generated by two rain event types. The two event types are localised, high intensity, short-duration convective storms, and widespread, longer duration synoptic-scale rainfall events. Initial application and testing of the model will focus on the two main reservoirs supplying water to Melbourne, Australia, both of which are situated in forest catchments vulnerable to wildfire. Probabilistic fire ignition and weather scenarios have been combined using 40 years of fire records and weather observations. These are used to select from a dataset of over 80
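
    As a schematic sketch of the kind of Monte Carlo loop described, the snippet below compounds an annual fire draw with a subsequent rainfall draw to build a distribution of sediment loads; every distribution and coefficient is an invented placeholder, not a FIRESTORM component.

        import numpy as np

        rng = np.random.default_rng(42)
        n_years = 100_000
        p_fire = 0.05                          # hypothetical annual chance of wildfire
        loads = np.zeros(n_years)              # sediment load to the reservoir (t)

        for i in range(n_years):
            if rng.random() >= p_fire:
                continue                           # no fire: negligible load this year
            burned = rng.beta(2, 5)                # fraction of catchment burned
            if rng.random() < 0.3:                 # convective storm vs synoptic rain
                rain = rng.gamma(2.0, 30.0)        # short, intense event (mm)
            else:
                rain = rng.gamma(4.0, 15.0)        # longer, widespread event (mm)
            loads[i] = 500.0 * burned * rain       # toy erosion response

        # Annual exceedance probability of a 10,000 t sediment load:
        print("P(load > 10,000 t) =", (loads > 10_000).mean())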

  2. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
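
    As a minimal sketch of the weights-of-evidence calculation that underlies such models, for a binary evidence layer B (e.g. proximity to roads) and ignition locations D, the positive and negative weights are log ratios of conditional probabilities; the toy arrays below are hypothetical.

        import numpy as np

        # Hypothetical rasters flattened to 1-D: evidence present / ignition occurred.
        near_road = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0], dtype=bool)
        ignition  = np.array([1, 1, 0, 0, 0, 1, 1, 0, 0, 0], dtype=bool)

        def weights_of_evidence(B, D):
            w_plus  = np.log((B &  D).sum() / D.sum()) - np.log((B & ~D).sum() / (~D).sum())
            w_minus = np.log((~B &  D).sum() / D.sum()) - np.log((~B & ~D).sum() / (~D).sum())
            return w_plus, w_minus

        w_plus, w_minus = weights_of_evidence(near_road, ignition)
        # The contrast C = W+ - W- measures how strongly the layer predicts ignitions.
        print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, C = {w_plus - w_minus:.2f}")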

  3. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like all other insurance (fire, auto), adopted an actuarial approach in the past; that is, the insurance rate was determined from historical loss experience. Because earthquakes are rare events with severe consequences, irrational determination of premium rates and a poor understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  4. Network Dependence in Risk Trading Games: A Banking Regulation Model

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan

    2003-04-01

    In the quest to quantitatively understand the risk-regulatory behavior of financial agents, we propose a physical model of interacting agents where interactions are defined by trades of financial derivatives. Consequences arising from various types of interaction-network topologies are shown for system safety and efficiency. We demonstrate that the model yields characteristic features of actually observed wealth time series. Further, we study global system safety as a function of a risk-control parameter (Basle multiplier). We find a phase transition-like phenomenon, where the Basle parameter plays the role of temperature and safety serves as the order parameter. This work is done together with R. Hanel and S. Pichler.

  5. Application of physiologically based pharmacokinetic models in chemical risk assessment.

    PubMed

    Mumtaz, Moiz; Fisher, Jeffrey; Blount, Benjamin; Ruiz, Patricia

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting "in silico" tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application-health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The "human PBPK model toolkit" is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures. PMID:22523493

  7. A Family-Centered Model for Sharing Genetic Risk.

    PubMed

    Daly, Mary B

    2015-01-01

    The successes of the Human Genome Project have ushered in a new era of genomic science. To effectively translate these discoveries, it will be critical to improve the communication of genetic risk within families. This will require a systematic approach that accounts for the nature of family relationships and sociocultural beliefs. This paper proposes the application of the Family Systems Illness Model, used in the setting of cancer care, to the evolving field of genomics. PMID:26479564

  8. Quasi-likelihood estimation for relative risk regression models.

    PubMed

    Carter, Rickey E; Lipsitz, Stuart R; Tilley, Barbara C

    2005-01-01

    For a prospective randomized clinical trial with two groups, the relative risk can be used as a measure of treatment effect and is directly interpretable as the ratio of success probabilities in the new treatment group versus the placebo group. For a prospective study with many covariates and a binary outcome (success or failure), relative risk regression may be of interest. If we model the log of the success probability as a linear function of covariates, the regression coefficients are log-relative risks. However, using such a log-linear model with a Bernoulli likelihood can lead to convergence problems in the Newton-Raphson algorithm. This is likely to occur when the success probabilities are close to one. A constrained likelihood method proposed by Wacholder (1986, American Journal of Epidemiology 123, 174-184) also has convergence problems. We propose a quasi-likelihood method-of-moments technique in which we naively assume the Bernoulli outcome is Poisson, with the mean (success probability) following a log-linear model. We use the Poisson maximum likelihood equations to estimate the regression coefficients without constraints. Using method-of-moments ideas, one can show that the estimates using the Poisson likelihood will be consistent and asymptotically normal. We apply these methods to a double-blinded randomized trial in primary biliary cirrhosis of the liver (Markus et al., 1989, New England Journal of Medicine 320, 1709-1713). PMID:15618526
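
    As a minimal sketch of this "modified Poisson" idea, the snippet below fits a log-linear model to a binary outcome with Poisson estimating equations in statsmodels; a robust (HC0) sandwich covariance is a common companion to the approach, and the data are simulated placeholders.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 500
        treat = rng.integers(0, 2, n)
        age = rng.normal(50, 10, n)
        # Hypothetical success probability following a log-linear model.
        p = np.minimum(np.exp(-1.2 + 0.4 * treat + 0.01 * (age - 50)), 1.0)
        y = rng.binomial(1, p)

        X = sm.add_constant(np.column_stack([treat, age]))
        # Poisson "working" likelihood on Bernoulli data, no constraints needed.
        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
        print("estimated relative risk for treatment:", np.exp(fit.params[1]))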

  9. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  10. Regime switching model for financial data: Empirical risk analysis

    NASA Astrophysics Data System (ADS)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, the HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks traded on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable distributions, power laws and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power-law model while remaining practical to implement for VaR measurement.
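
    As a hedged sketch of the EVT stage only (the HMM regime classification is omitted), the snippet below fits a generalized Pareto distribution to losses beyond a high threshold and inverts it for the Value-at-Risk; the peaks-over-threshold estimator is standard, and the simulated returns are placeholders.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(7)
        losses = -rng.standard_t(df=4, size=5000) * 0.01   # toy heavy-tailed daily losses

        u = np.quantile(losses, 0.95)                      # high threshold
        exceed = losses[losses > u] - u
        xi, _, beta = genpareto.fit(exceed, floc=0)        # tail shape and scale

        def var_pot(p, n=len(losses), n_u=len(exceed)):
            """Peaks-over-threshold VaR at level p (assumes xi != 0)."""
            return u + beta / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)

        print("99% one-day VaR:", var_pot(0.99))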

  11. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements (NCRP, 2000, 2006). However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as opposed to the previous approach based on linear energy transfer, the development of a new solar particle event (SPE) model, and the updates to galactic cosmic ray (GCR) and shielding transport models. The newer epidemiological information includes updates to the cancer incidence rates from the life span study (LSS) of the Japanese atomic bomb survivors (Preston et al., 2007), transferred to the U.S. population and converted to cancer mortality rates from U.S. population statistics. In addition, the proposed model provides an alternative analysis applicable to lifetime never-smokers (NSs). Details of the uncertainty analysis in the model have also been updated and revised. NASA's proposed model and associated uncertainties are complex in their formulation and as such require a very clear and precise set of descriptions. The committee found the 2011 NASA report challenging to review largely because of the lack of clarity in the model descriptions and derivation of the various parameters used. The committee requested some clarifications from NASA throughout its review and was able to resolve many, but not all, of the ambiguities in the written description.

  12. Evaluation of data for Sinkhole-development risk models

    NASA Astrophysics Data System (ADS)

    Upchurch, Sam B.; Littlefield, James R.

    1988-10-01

    Before risk assessments for sinkhole damage and indemnification are developed, a database must be created to predict the occurrence and distribution of sinkholes. This database must be evaluated in terms of the following questions: (1) are available records of modern sinkhole development adequate, (2) can the distribution of ancient sinks be used for predictive purposes, and (3) at what areal scale must sinkhole occurrences be evaluated for predictive and risk analysis purposes? Twelve 7.5' quadrangles with varying karst development in Hillsborough County, Florida, provide insight into these questions. The area includes 179 modern sinks that developed between 1964 and 1985 and 2,303 ancient sinks. The sinks occur in urban, suburban, agricultural, and major forest wetland areas. The number of ancient sinks ranges from 0.1 to 3.2/km² and averages 1.1/km² for the entire area. The quadrangle area occupied by ancient sinks ranges from 0.3 to 10.2 percent. The distribution of ancient sinkholes within a quadrangle ranges from 0 to over 25 percent of the land surface. In bare karst areas, the sinks are localized along major lineaments, especially at lineament intersections. Where there is covered karst, ancient sinks may be obscured. Modern sinkholes did not occur uniformly through time; annual counts ranged from 0 to 29/yr. The regional occurrence rate is 7.6/yr. Most were reported in urban or suburban areas and their locations coincide with the lineament-controlled areas of ancient karst. Moving-average analysis indicates that the distribution of modern sinks is highly localized and ranges from 0 to 1.9/km². Chi-square tests show that the distribution of ancient sinks in bare karst areas significantly predicts the locations of modern sinks. In areas of covered karst, the locations of ancient sinkholes do not predict modern sinks. It appears that risk-assessment models for sinkhole development can use the distribution of ancient sinks where bare karst is present. In covered karst areas

  13. A new model for polluted soil risk assessment

    NASA Astrophysics Data System (ADS)

    Andretta, M.; Villani, M.; Serra, R.

    2003-04-01

    In recent years, the problem of evaluating the risk related to soil pollution has become more and more important all over the world. The increasing number of polluted soils in all the industrialised countries has required the formalisation of well-defined methodologies for defining the technical and economical limits of soil remediation. Mainly, these limits are defined in terms of general threshold values that, in some cases, cannot be reached even with the so-called Best Available Technology (B.A.T.), due for example to the characteristics of the pollutants or of the affected soil, or to the extremely high cost or duration of the remedial intervention. For these reasons, both in the North American countries and in the European ones, many alternative methodologies based on systematic and scientifically well-founded approaches have been developed, in order to determine the real effects of the pollution on the receptor targets. Typically, these methodologies are organised into different levels of detail, the so-called "tiers". Tier 1 is based on a conservative estimation of the risk for the targets, derived from very general, worst-case situations. Tier 2 is based on a more detailed and site-specific estimation of the hazard, evaluated by the use of semi-empirical, analytical formulas for source characterisation, pollutant transport, and the evaluation of target exposure. Tier 3 is the most detailed and site-specific level of application of the risk assessment methodologies and requires the use of numerical methods with much detailed information on the site and on the receptors (e.g. chemical/physical parameters of the pollutants, hydro-geological data, exposure data, etc.). In this paper, we describe the most important theoretical aspects of polluted soil risk assessment methodologies and the relevant role played in this kind of analysis by pollutant transport models. In particular, we describe a new and innovative

  14. Modeling the operational risk in Iranian commercial banks: case study of a private bank

    NASA Astrophysics Data System (ADS)

    Momen, Omid; Kimiagari, Alimohammad; Noorbakhsh, Eaman

    2012-08-01

    The Basel Committee on Banking Supervision at the Bank for International Settlements classifies banking risks into three main categories: credit risk, market risk, and operational risk. The focus of this study is on operational risk measurement in Iranian banks. Issues arising when trying to implement operational risk models in Iran are therefore discussed, and some solutions are recommended. Moreover, all steps of operational risk measurement based on the Loss Distribution Approach, with modifications specific to Iran, are presented. We employed the approach of this study to model the operational risk of an Iranian private bank. The results are quite reasonable in comparison with the scale of the bank and with its other risk categories.

  15. Time-based collision risk modeling for air traffic management

    NASA Astrophysics Data System (ADS)

    Bell, Alan E.

    Since the emergence of commercial aviation in the early part of last century, economic forces have driven a steadily increasing demand for air transportation. Increasing density of aircraft operating in a finite volume of airspace is accompanied by a corresponding increase in the risk of collision, and in response to a growing number of incidents and accidents involving collisions between aircraft, governments worldwide have developed air traffic control systems and procedures to mitigate this risk. The objective of any collision risk management system is to project conflicts and provide operators with sufficient opportunity to recognize potential collisions and take necessary actions to avoid them. It is therefore the assertion of this research that the currency of collision risk management is time. Future Air Traffic Management Systems are being designed around the foundational principle of four dimensional trajectory based operations, a method that replaces legacy first-come, first-served sequencing priorities with time-based reservations throughout the airspace system. This research will demonstrate that if aircraft are to be sequenced in four dimensions, they must also be separated in four dimensions. In order to separate aircraft in four dimensions, time must emerge as the primary tool by which air traffic is managed. A functional relationship exists between the time-based performance of aircraft, the interval between aircraft scheduled to cross some three dimensional point in space, and the risk of collision. This research models that relationship and presents two key findings. First, a method is developed by which the ability of an aircraft to meet a required time of arrival may be expressed as a robust standard for both industry and operations. Second, a method by which airspace system capacity may be increased while maintaining an acceptable level of collision risk is presented and demonstrated for the purpose of formulating recommendations for procedures
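
    As a minimal sketch of that functional relationship, assume each aircraft's time-of-arrival error at a merge point is Gaussian; the loss-of-separation risk is then the probability that two arrival times fall within the minimum separation of each other, and all numbers below are illustrative.

        from math import erf, sqrt

        def loss_of_separation_prob(interval_s, sigma_s, min_sep_s):
            """P(|T2 - T1| < min_sep) when both aircraft have independent
            N(0, sigma^2) arrival-time errors, so T2 - T1 ~ N(interval, 2*sigma^2)."""
            s = sqrt(2) * sigma_s                      # std dev of the time difference
            cdf = lambda x: 0.5 * (1 + erf(x / (s * sqrt(2))))
            return cdf(min_sep_s - interval_s) - cdf(-min_sep_s - interval_s)

        # Tighter time-of-arrival control (smaller sigma) permits shorter
        # scheduled intervals at the same risk level:
        for sigma in (20.0, 10.0, 5.0):
            p = loss_of_separation_prob(interval_s=90.0, sigma_s=sigma, min_sep_s=60.0)
            print(f"sigma = {sigma:4.0f} s  P(loss of separation) = {p:.2e}")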

  16. A Spatio-temporal Model of African Animal Trypanosomosis Risk

    PubMed Central

    Dicko, Ahmadou H.; Percoma, Lassane; Sow, Adama; Adam, Yahaya; Mahama, Charles; Sidibé, Issa; Dayo, Guiguigbaza-Kossigan; Thévenon, Sophie; Fonta, William; Sanfo, Safietou; Djiteye, Aligui; Salou, Ernest; Djohan, Vincent; Cecchi, Giuliano; Bouyer, Jérémy

    2015-01-01

    Background: African animal trypanosomosis (AAT) is a major constraint to sustainable development of cattle farming in sub-Saharan Africa. The habitat of the tsetse fly vector is increasingly fragmented owing to demographic pressure and shifts in climate, which leads to heterogeneous risk of cyclical transmission both in space and time. In Burkina Faso and Ghana, the most important vectors are riverine species, namely Glossina palpalis gambiensis and G. tachinoides, which are more resilient to human-induced changes than the savannah and forest species. Although many authors studied the distribution of AAT risk both in space and time, spatio-temporal models allowing predictions of it are lacking. Methodology/Principal Findings: We used datasets generated by various projects, including two baseline surveys conducted in Burkina Faso and Ghana within PATTEC (Pan African Tsetse and Trypanosomosis Eradication Campaign) national initiatives. We computed the entomological inoculation rate (EIR) or tsetse challenge using a range of environmental data. The tsetse apparent density and their infection rate were separately estimated and subsequently combined to derive the EIR using a “one layer-one model” approach. The estimated EIR was then projected into suitable habitat. This risk index was finally validated against data on bovine trypanosomosis. It allowed a good prediction of the parasitological status (r² = 67%), showed a positive correlation but less predictive power with serological status (r² = 22%) aggregated at the village level, but was not related to the illness status (r² = 2%). Conclusions/Significance: The presented spatio-temporal model provides a fine-scale picture of the dynamics of AAT risk in sub-humid areas of West Africa. The estimated EIR was high in the proximity of rivers during the dry season and more widespread during the rainy season. The present analysis is a first step in a broader framework for an efficient risk management of climate

  17. Peer Review of NRC Standardized Plant Analysis Risk Models

    SciTech Connect

    Anthony Koonce; James Knudsen; Robert Buell

    2011-03-01

    The Nuclear Regulatory Commission (NRC) Standardized Plant Analysis Risk (SPAR) models underwent a peer review using the ASME PRA standard (Addendum C) as endorsed by NRC in Regulatory Guide (RG) 1.200. The review was performed by a mix of industry probabilistic risk analysis (PRA) experts and NRC PRA experts. Representative SPAR models, one PWR and one BWR, were reviewed against Capability Category I of the ASME PRA standard. Capability Category I was selected as the basis for review due to the specific uses/applications of the SPAR models. The BWR SPAR model was reviewed against 331 ASME PRA Standard supporting requirements; however, based on the Capability Category I level of review and the absence of internal flooding and containment performance (LERF) logic, only 216 requirements were determined to be applicable. Based on the review, the BWR SPAR model met 139 of the 216 supporting requirements. The review also generated 200 findings or suggestions; of these, 142 were findings and 58 were suggestions. The PWR SPAR model was also evaluated against the same 331 ASME PRA Standard supporting requirements. Of these requirements only 215 were deemed appropriate for the review (for the same reason as noted for the BWR). The PWR review determined that 125 of the 215 supporting requirements met Capability Category I or greater. The review identified 101 findings or suggestions (76 findings and 25 suggestions). These findings or suggestions were developed to identify areas where the SPAR models could be enhanced. A process to prioritize the findings/suggestions and incorporate them into the SPAR models is being developed. The prioritization process focuses on those findings that will enhance the accuracy, completeness and usability of the SPAR models.

  18. PGD for cystic fibrosis patients and couples at risk of an additional genetic disorder combined with 24-chromosome aneuploidy testing.

    PubMed

    Rechitsky, Svetlana; Verlinsky, Oleg; Kuliev, Anver

    2013-05-01

    Preimplantation genetic diagnosis (PGD) for inherited disorders is presently applied for more than 300 different conditions. The most frequent PGD indication is cystic fibrosis (CF), the largest series of which is reviewed here, totalling 404 PGD cycles. This involved testing for 52 different CFTR mutations, with almost half of the cases (195/404 cycles) performed for the ΔF508 mutation, one-quarter (103/404 cycles) for six other frequent mutations and only a few for the remaining 45 CFTR mutations. There were 44 PGD cycles performed for 25 CF-affected homozygous or double-heterozygous CF patients (18 male and seven female partners), which involved testing simultaneously for three mutations, resulting in the birth of 13 healthy CF-free children and no misdiagnosis. PGD was also performed for six couples at a combined risk of producing offspring with CF and another genetic disorder. Concomitant testing for CFTR and other mutations resulted in the birth of six healthy children, free of both CF and the other genetic disorder, in all but one cycle. A total of 96 PGD cycles for CF were performed with simultaneous aneuploidy testing, including microarray-based 24-chromosome analysis, as a comprehensive PGD for two or more conditions in the same biopsy material.

  19. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model

    PubMed Central

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-01-01

    The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors that were included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazards model. The time-dependent receiver operating characteristic (ROC) curve and time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess the calibration ability, adjusted for the competing risk of non-CAD death. The net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 rounds of bootstrap re-sampling. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs for men and women at t = 10 years were 0.841 [95% confidence interval (95% CI): 0.806–0.877] and 0.804 (95% CI: 0.768–0.839) in the Fine and Gray model, versus 0.784 (95% CI: 0.738–0.830) and 0.733 (95% CI: 0.692–0.775) in the Cox proportional hazards model. The competing risk model was significantly superior to the Cox proportional hazards model on discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, which have good sensitivity and specificity. A sex-specific, multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify the groups at high risk for CAD among the Chinese
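
    As a hedged sketch of the competing-risks machinery involved, the snippet below estimates the cumulative incidence of a first event in the presence of a competing cause of death using the Aalen-Johansen estimator from lifelines; this is a nonparametric cousin of the Fine and Gray regression used in the study, and the data are synthetic placeholders.

        import numpy as np
        from lifelines import AalenJohansenFitter

        rng = np.random.default_rng(3)
        n = 300
        durations = rng.exponential(12.0, n)        # follow-up time (years)
        # 0 = censored, 1 = first CAD event, 2 = death from other causes.
        event = rng.choice([0, 1, 2], size=n, p=[0.5, 0.3, 0.2])

        ajf = AalenJohansenFitter()
        ajf.fit(durations, event, event_of_interest=1)
        # Cumulative incidence of CAD by 10 years, properly discounting subjects
        # removed by the competing risk (1 - Kaplan-Meier would overstate it).
        print(ajf.cumulative_density_.loc[:10.0].tail(1))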

  1. Integrated Assessment Modeling for Carbon Storage Risk and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Bromhal, G. S.; Dilmore, R.; Pawar, R.; Stauffer, P. H.; Gastelum, J.; Oldenburg, C. M.; Zhang, Y.; Chu, S.

    2013-12-01

    The National Risk Assessment Partnership (NRAP) has developed tools to perform quantitative risk assessment at site-specific locations for long-term carbon storage. The approach that is being used is to divide the storage and containment system into components (e.g., reservoirs, seals, wells, groundwater aquifers), to develop detailed models for each component, to generate reduced order models (ROMs) based on the detailed models, and to reconnect the reduced order models within an integrated assessment model (IAM). CO2-PENS, developed at Los Alamos National Lab, is being used as the IAM for the simulations in this study. The benefit of this approach is that simulations of the complete system can be generated on a relatively rapid time scale so that Monte Carlo simulation can be performed. In this study, hundreds of thousands of runs of the IAMs have been generated to estimate likelihoods of the quantity of CO2 released to the atmosphere, the size of the aquifer impacted by pH, the size of the aquifer impacted by TDS, and the size of the aquifer with different metal concentrations. Correlations of the output variables with different reservoir, seal, wellbore, and aquifer parameters have been generated. Importance measures have been identified, and inputs have been ranked in the order of their impact on the output quantities. The presentation will describe the approach used, representative results, and implications for how the Monte Carlo analysis is implemented for uncertainty quantification.
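
    As a schematic sketch of this workflow, the snippet below samples uncertain inputs, pushes them through a cheap stand-in for a reduced-order model, and ranks the inputs with a simple importance measure (Spearman rank correlation); the toy ROM and all distributions are placeholders, not NRAP component models.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(11)
        n = 100_000

        # Uncertain inputs (all distributions hypothetical).
        perm_seal = rng.lognormal(mean=-38.0, sigma=1.0, size=n)  # seal permeability (m^2)
        n_wells   = rng.poisson(3, size=n)                        # legacy wells intersected
        overpress = rng.uniform(10.0, 30.0, size=n)               # overpressure (MPa)

        # Toy reduced-order model for leaked CO2 mass (placeholder functional form).
        leak = 1e16 * perm_seal * overpress + 0.05 * n_wells * overpress

        for name, x in [("seal permeability", perm_seal),
                        ("number of wells", n_wells),
                        ("overpressure", overpress)]:
            rho, _ = spearmanr(x, leak)
            print(f"{name:18s} Spearman rho = {rho:+.2f}")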

  2. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas

    PubMed Central

    Bedford, Tim; Daneshkhah, Alireza

    2015-01-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing the work of Cooke, Bedford, Kurowicka, and others on vines as a way of constructing higher-dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that achieve such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. PMID:26332240
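
    As a minimal sketch of the copula idea on which vine constructions build, the snippet below samples a single bivariate Gaussian pair-copula (a vine stitches many such pair-copulas together): draw correlated normals, map them to uniforms, then apply arbitrary margins; all distribution choices are hypothetical.

        import numpy as np
        from scipy.stats import norm, lognorm, gamma

        rng = np.random.default_rng(5)
        rho = 0.7                                  # pair-copula dependence parameter
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
        u = norm.cdf(z)                            # uniform margins, Gaussian dependence

        # Arbitrary margins for two dependent risk quantities.
        loss_a = lognorm(s=1.0, scale=100.0).ppf(u[:, 0])
        loss_b = gamma(a=2.0, scale=50.0).ppf(u[:, 1])

        total = loss_a + loss_b
        print("P(total loss > 500) =", (total > 500).mean())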

  3. Future bloom and blossom frost risk for Malus domestica considering climate model and impact model uncertainties.

    PubMed

    Hoffmann, Holger; Rath, Thomas

    2013-01-01

    The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and of model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, to 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners were determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend; as a consequence, however, it is unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K⁻¹, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of