Science.gov

Sample records for additive risk model

  1. Goodness-of-fit methods for additive-risk models in tumorigenicity experiments.

    PubMed

    Ghosh, Debashis

    2003-09-01

    In tumorigenicity experiments, a complication is that the time to event is generally not observed, so that the time to tumor is subject to interval censoring. One of the goals in these studies is to properly model the effect of dose on risk. Thus, it is important to have goodness of fit procedures available for assessing the model fit. While several estimation procedures have been developed for current-status data, relatively little work has been done on model-checking techniques. In this article, we propose numerical and graphical methods for the analysis of current-status data using the additive-risk model, primarily focusing on the situation where the monitoring times are dependent. The finite-sample properties of the proposed methodology are examined through numerical studies. The methods are then illustrated with data from a tumorigenicity experiment.
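
The additive-risk model referenced here specifies the hazard as a baseline rate plus a linear covariate term, so the hazard difference between dose groups is constant over time. A minimal numeric sketch (the baseline rate and dose effect are invented, and this is not the authors' estimator for current-status data):

```python
import numpy as np

# Illustrative additive-risk (additive hazards) model: the hazard at time t
# for a subject with covariate vector Z is lambda0(t) + beta @ Z.
# All numbers below are made up for illustration.
def additive_hazard(t, Z, beta, baseline=lambda t: 0.05):
    return baseline(t) + float(np.dot(beta, Z))

beta = np.array([0.02])                      # hypothetical excess hazard per unit dose
low, high = np.array([1.0]), np.array([4.0])

h_low = additive_hazard(1.0, low, beta)      # 0.05 + 0.02*1 = 0.07
h_high = additive_hazard(1.0, high, beta)    # 0.05 + 0.02*4 = 0.13

# Under the additive model the hazard difference between dose groups is
# constant in t: beta @ (Z_high - Z_low) = 0.02 * 3 = 0.06.
diff = h_high - h_low
```

This contrasts with proportional-hazards models, where dose multiplies rather than adds to the baseline hazard.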

  2. Group Sparse Additive Models

    PubMed Central

    Yin, Junming; Chen, Xi; Xing, Eric P.

    2016-01-01

    We consider the problem of sparse variable selection in nonparametric additive models, with the prior knowledge of the structure among the covariates to encourage those variables within a group to be selected jointly. Previous works either study the group sparsity in the parametric setting (e.g., group lasso), or address the problem in the nonparametric setting without exploiting the structural information (e.g., sparse additive models). In this paper, we present a new method, called group sparse additive models (GroupSpAM), which can handle group sparsity in additive models. We generalize the ℓ1/ℓ2 norm to Hilbert spaces as the sparsity-inducing penalty in GroupSpAM. Moreover, we derive a novel thresholding condition for identifying the functional sparsity at the group level, and propose an efficient block coordinate descent algorithm for constructing the estimate. We demonstrate by simulation that GroupSpAM substantially outperforms the competing methods in terms of support recovery and prediction accuracy in additive models, and also conduct a comparative experiment on a real breast cancer dataset.
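
The ℓ1/ℓ2 penalty behind GroupSpAM reduces, in the parametric case, to group-lasso shrinkage, where an entire group of coefficients is zeroed jointly when its norm falls below the threshold. A toy sketch of that group-level thresholding (values invented; the paper's condition operates on functions in Hilbert spaces, not on coefficient vectors):

```python
import numpy as np

# Group-level soft-thresholding: groups whose coefficient norm falls below
# lam are zeroed jointly; surviving groups are shrunk toward zero.
def group_soft_threshold(beta, groups, lam):
    out = np.zeros_like(beta)
    for g in groups:
        norm = np.linalg.norm(beta[g])
        if norm > lam:
            out[g] = (1 - lam / norm) * beta[g]
    return out

beta = np.array([3.0, 4.0, 0.1, 0.1])
groups = [[0, 1], [2, 3]]
shrunk = group_soft_threshold(beta, groups, lam=1.0)
# Group [0, 1] has norm 5, so it survives and is scaled by (1 - 1/5) = 0.8;
# group [2, 3] has norm ~0.14 < 1, so both entries are set to zero together.
```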

  3. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.
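
The FGAM linear predictor is the integral over t of F{X(t), t}. A sketch with a toy surface F and a quadrature approximation (in the actual model F is an unknown tensor-product B-spline estimated from data):

```python
import numpy as np

# Sketch of the FGAM linear predictor: integrate F(X(t), t) over t on a grid
# via the trapezoid rule. F and X here are toy choices for illustration.
t = np.linspace(0.0, 1.0, 101)
X = np.sin(2 * np.pi * t)          # hypothetical functional covariate

def F(x, s):                       # toy bivariate regression surface
    return x**2 + s

vals = F(X, t)
eta = float(np.sum((vals[1:] + vals[:-1]) / 2 * np.diff(t)))
# Analytically: integral of sin^2(2*pi*t) over [0,1] is 0.5, and of t is 0.5,
# so eta is close to 1.0; the mean response is then g^{-1}(eta) for link g.
```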

  4. Fused Lasso Additive Model

    PubMed Central

    Petersen, Ashley; Witten, Daniela; Simon, Noah

    2016-01-01

    We consider the problem of predicting an outcome variable using p covariates that are measured on n independent observations, in a setting in which additive, flexible, and interpretable fits are desired. We propose the fused lasso additive model (FLAM), in which each additive function is estimated to be piecewise constant with a small number of adaptively-chosen knots. FLAM is the solution to a convex optimization problem, for which a simple algorithm with guaranteed convergence to a global optimum is provided. FLAM is shown to be consistent in high dimensions, and an unbiased estimator of its degrees of freedom is proposed. We evaluate the performance of FLAM in a simulation study and on two data sets. Supplemental materials are available online, and the R package flam is available on CRAN. PMID:28239246
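
FLAM's piecewise-constant fits come from a fused-lasso (total-variation) penalty on the fitted values at the sorted observations. A toy illustration of why the penalty favors a small number of adaptively-chosen knots (numbers invented):

```python
import numpy as np

# The fused lasso penalty: for a fitted vector theta (values of one additive
# function at the sorted observations) it charges sum_i |theta_i - theta_{i-1}|,
# so solutions are piecewise constant with few jumps.
def fused_penalty(theta):
    return float(np.sum(np.abs(np.diff(theta))))

flat = np.array([1.0, 1.0, 1.0, 1.0])
step = np.array([0.0, 0.0, 2.0, 2.0])      # one knot: a single jump of size 2
wiggly = np.array([0.0, 2.0, 0.0, 2.0])    # same range, three jumps

p_flat = fused_penalty(flat)       # 0.0: a constant fit is free
p_step = fused_penalty(step)       # 2.0: one jump
p_wiggly = fused_penalty(wiggly)   # 6.0: wiggly fits are penalized heavily
```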

  5. Polymorphisms associated with the risk of lung cancer in a healthy Mexican Mestizo population: Application of the additive model for cancer

    PubMed Central

    Pérez-Morales, Rebeca; Méndez-Ramírez, Ignacio; Castro-Hernández, Clementina; Martínez-Ramírez, Ollin C.; Gonsebatt, María Eugenia; Rubio, Julieta

    2011-01-01

    Lung cancer is the leading cause of cancer mortality in Mexico and worldwide. In the past decade, there has been an increase in the number of lung cancer cases in young people, which suggests an important role for genetic background in the etiology of this disease. In this study, we genetically characterized 16 polymorphisms in 12 low penetrance genes (AhR, CYP1A1, CYP2E1, EPHX1, GSTM1, GSTT1, GSTPI, XRCC1, ERCC2, MGMT, CCND1 and TP53) in 382 healthy Mexican Mestizos as the first step in elucidating the genetic structure of this population and identifying high risk individuals. All of the genotypes analyzed were in Hardy-Weinberg equilibrium, but different degrees of linkage were observed for polymorphisms in the CYP1A1 and EPHX1 genes. The genetic variability of this population was distributed in six clusters that were defined based on their genetic characteristics. The use of a polygenic model to assess the additive effect of low penetrance risk alleles identified combinations of risk genotypes that could be useful in predicting a predisposition to lung cancer. Estimation of the level of genetic susceptibility showed that the individual calculated risk value (iCRV) ranged from 1 to 16, with a higher iCRV indicating a greater genetic susceptibility to lung cancer. PMID:22215955
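
One simple additive scheme in the spirit of the iCRV is to count risk genotypes, so that carrying more risk variants yields a higher score. The genotype labels below are hypothetical, and this counting rule is an illustration, not necessarily the authors' exact weighting:

```python
# Hypothetical set of risk genotypes; an individual's additive score is the
# number of risk genotypes carried.
risk_genotypes = {"CYP1A1*2A", "GSTM1-null", "TP53-Pro/Pro"}

def additive_risk_score(genotypes):
    return sum(1 for g in genotypes if g in risk_genotypes)

individual = ["CYP1A1*2A", "GSTM1-null", "XRCC1-Arg/Arg"]
score = additive_risk_score(individual)   # carries 2 of the 3 risk genotypes
```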

  6. Does early intensive multifactorial therapy reduce modelled cardiovascular risk in individuals with screen-detected diabetes? Results from the ADDITION-Europe cluster randomized trial

    PubMed Central

    Black, J A; Sharp, S J; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2014-01-01

    Aims Little is known about the long-term effects of intensive multifactorial treatment early in the diabetes disease trajectory. In the absence of long-term data on hard outcomes, we described change in 10-year modelled cardiovascular risk in the 5 years following diagnosis, and quantified the impact of intensive treatment on 10-year modelled cardiovascular risk at 5 years. Methods In a pragmatic, cluster-randomized, parallel-group trial in Denmark, the Netherlands and the UK, 3057 people with screen-detected Type 2 diabetes were randomized by general practice to receive (1) routine care of diabetes according to national guidelines (1379 patients) or (2) intensive multifactorial target-driven management (1678 patients). Ten-year modelled cardiovascular disease risk was calculated at baseline and 5 years using the UK Prospective Diabetes Study Risk Engine (version 3β). Results Among 2101 individuals with complete data at follow up (73.4%), 10-year modelled cardiovascular disease risk was 27.3% (sd 13.9) at baseline and 21.3% (sd 13.8) at 5-year follow-up (intensive treatment group difference –6.9, sd 9.0; routine care group difference –5.0, sd 12.2). Modelled 10-year cardiovascular disease risk was lower in the intensive treatment group compared with the routine care group at 5 years, after adjustment for baseline cardiovascular disease risk and clustering (–2.0; 95% CI –3.1 to –0.9). Conclusions Despite increasing age and diabetes duration, there was a decline in modelled cardiovascular disease risk in the 5 years following diagnosis. Compared with routine care, 10-year modelled cardiovascular disease risk was lower in the intensive treatment group at 5 years. Our results suggest that patients benefit from intensive treatment early in the diabetes disease trajectory, where the rate of cardiovascular disease risk progression may be slowed. PMID:24533664

  7. Biosafety Risk Assessment Model

    SciTech Connect

    Daniel Bowen, Susan Caskey

    2011-05-27

    Software tool based on a structured methodology for conducting laboratory biosafety risk assessments by biosafety experts. The software is built on a multi-criteria decision analysis (MCDA) scheme with peer-reviewed criteria and weights, and was developed on Microsoft's .NET Framework. The methodology defines the likelihood and consequence of a laboratory exposure for thirteen unique scenarios and provides numerical relative risks for each of the thirteen. The software produces 2-D graphs reflecting the relative risk and a sensitivity analysis that highlights the overall importance of each factor. It works as a set of questions with absolute scales and uses a weighted additive model to calculate likelihood and consequence.
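
A weighted additive aggregation of the kind described can be sketched as a weighted sum of per-question scores. The criteria names and weights below are invented for illustration, not the peer-reviewed ones in the tool:

```python
# Minimal weighted additive MCDA aggregation: each question yields a score on
# an absolute scale, and likelihood (or consequence) is the weighted sum.
# Criteria and weights are hypothetical; weights sum to 1.
weights = {"agent_stability": 0.5, "aerosol_potential": 0.3, "procedure_complexity": 0.2}
scores = {"agent_stability": 4, "aerosol_potential": 2, "procedure_complexity": 5}

likelihood = sum(weights[c] * scores[c] for c in weights)
# 0.5*4 + 0.3*2 + 0.2*5 = 3.6
```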

  8. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  9. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  10. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  11. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  12. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  13. Chemical Mixture Risk Assessment Additivity-Based Approaches

    EPA Science Inventory

    PowerPoint presentation covering additivity-based chemical mixture risk assessment methods. Basic concepts, theory and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.

  14. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational process and material modeling of powder-bed additive manufacturing of IN 718. The goals are to optimize material build parameters with reduced time and cost through modeling, increase understanding of build properties, increase the reliability of builds, decrease the time to adoption of the process for critical hardware, and potentially decrease post-build heat treatments. The approach: conduct single-track and coupon builds at various build parameters; record build-parameter information and QM Meltpool data; refine the Applied Optimization powder-bed AM process model using these data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run STK models using AO thermal profiles and report the STK modeling results; and validate the modeling with an additional build. Findings to date: photodiode intensity measurements are highly linear with power input; melt-pool intensity is highly correlated with melt-pool size; and melt-pool size and intensity increase with power. Applied Optimization will use the data to develop a powder-bed additive manufacturing process model.

  15. Risks associated with endotoxins in feed additives produced by fermentation.

    PubMed

    Wallace, R John; Gropp, Jürgen; Dierick, Noël; Costa, Lucio G; Martelli, Giovanna; Brantom, Paul G; Bampidis, Vasileios; Renshaw, Derek W; Leng, Lubomir

    2016-01-15

    Increasingly, feed additives for livestock, such as amino acids and vitamins, are being produced by Gram-negative bacteria, particularly Escherichia coli. The potential therefore exists for animals, consumers and workers to be exposed to possibly harmful amounts of endotoxin from these products. The aim of this review was to assess the extent of the risk from endotoxins in feed additives and to calculate how such risk can be assessed from the properties of the additive. Livestock are frequently exposed to a relatively high content of endotoxin in the diet: no additional hazard to livestock would be anticipated if the endotoxin concentration of the feed additive falls in the same range as feedstuffs. Consumer exposure will be unaffected by the consumption of food derived from animals receiving endotoxin-containing feed, because the small concentrations of endotoxin absorbed do not accumulate in edible tissues. In contrast, workers processing a dusty additive may be exposed to hazardous amounts of endotoxin even if the endotoxin concentration of the product is low. A calculation method is proposed to compare the potential risk to the worker, based on the dusting potential, the endotoxin concentration and technical guidance of the European Food Safety Authority, with national exposure limits.

  16. Melanoma Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk.

  17. Public risk perception of food additives and food scares. The case in Suzhou, China.

    PubMed

    Wu, Linhai; Zhong, Yingqi; Shan, Lijie; Qin, Wei

    2013-11-01

    This study examined the factors affecting public risk perception of food additive safety and the food scares that can result, using a survey conducted in Suzhou, Jiangsu Province, China. The model was proposed based on literature on the roles of risk perception and information perception in public purchase intention under food scares. Structural equation modeling (SEM) was used for data analysis. The results showed that attitude towards behavior, subjective norm and information perception exerted moderate to high effects on food scares, and these effects were also mediated by risk perception of additive safety. Significant covariance was observed between attitude towards behavior, subjective norm and information perception. Establishing an effective mechanism for food safety risk communication, releasing information on government supervision of food safety in a timely manner, curbing misleading media reports on public food safety risks, and enhancing public knowledge of food additives are key to the development and implementation of food safety risk management policies by the Chinese government.

  18. Quantile uncertainty and value-at-risk model risk.

    PubMed

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.
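
A daily Value-at-Risk figure is itself just a quantile of the profit-and-loss distribution. The sketch below computes an empirical 99% VaR from simulated P&L; the article's benchmark-based model-risk adjustment is more elaborate and is not reproduced here:

```python
import numpy as np

# Empirical 99% Value-at-Risk from simulated daily profit-and-loss.
# The P&L series is toy data (standard normal), not a real bank's.
rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1.0, 10_000)

# VaR is the loss at the 1st percentile of P&L, reported as a positive number.
var_99 = -float(np.quantile(pnl, 0.01))
# For a standard normal, the true 99% VaR is about 2.33; the empirical
# estimate carries sampling (and, in practice, model) uncertainty around it.
```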

  19. Nanotechnologies: Risk assessment model

    NASA Astrophysics Data System (ADS)

    Giacobbe, F.; Monica, L.; Geraci, D.

    2009-05-01

    The development and use of nanomaterials have grown widely in recent years. It is therefore necessary to carry out a careful, targeted risk assessment for the safety of workers. The objective of this research is a specific assessment model for workplaces where personnel manipulate nanoparticles. The model mainly takes into account the number of exposed workers, the dimensions of the particles, the information found in safety data sheets, and the uncertainty about the level of danger arising from exposure to nanomaterials. The evaluation algorithm considers normal work conditions, abnormal conditions (e.g. a broken air filter) and emergency situations (e.g. package cracking). Several risk conditions had to be defined in order to quantify the risk in increasing levels ("low", "middle" and "high"). Each level includes appropriate behavioural procedures. In particular, at the high level it is advisable that the user carry out urgent interventions to reduce the risk level (e.g. the use of a vacuum box for manipulation, high-efficiency protective PPE, etc.). The model has been implemented in a research laboratory where titanium dioxide and carbon nanotubes are used. This evaluation yielded a middle risk level.
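
The banding of a numeric risk score into the model's three levels can be sketched as follows; the cut-off values are invented, not those of the published algorithm:

```python
# Map a numeric risk score to the model's "low" / "middle" / "high" levels.
# The thresholds are hypothetical placeholders.
def risk_level(score, low_cut=3.0, high_cut=6.0):
    if score < low_cut:
        return "low"
    if score < high_cut:
        return "middle"
    return "high"

level = risk_level(4.2)   # falls between the cut-offs, so "middle"
```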

  20. Risk analysis of sulfites used as food additives in China.

    PubMed

    Zhang, Jian Bo; Zhang, Hong; Wang, Hua Li; Zhang, Ji Yue; Luo, Peng Jie; Zhu, Lei; Wang, Zhu Tian

    2014-02-01

    This study analyzed the risk of sulfites in food consumed by the Chinese population and assessed the health-protection capability of the maximum permitted level (MPL) of sulfites in GB 2760-2011. Sulfites as food additives are overused or abused in many food categories. When the MPL in GB 2760-2011 was used as the sulfite content of food, the intake of sulfites in most surveyed populations was lower than the acceptable daily intake (ADI). Excess intake of sulfites was found in all surveyed groups when a high percentile of sulfite content in food was assumed. Moreover, children aged 1-6 years are at high risk of excess sulfite intake. The primary cause of excess sulfite intake in the Chinese population is the overuse and abuse of sulfites by the food industry. The current MPL of sulfites in GB 2760-2011 protects the health of most of the population.
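
The underlying exposure arithmetic compares estimated intake per kilogram of body weight with the ADI (0.7 mg/kg bw/day expressed as SO2, the JECFA figure for sulfites). The consumption and residue numbers below are made up for illustration:

```python
# Illustrative exposure check: daily sulfite intake per kg body weight vs. ADI.
ADI = 0.7  # mg/kg bw/day, expressed as SO2 (JECFA ADI for sulfites)

def intake_mg_per_kg(food_g_per_day, residue_mg_per_kg_food, body_weight_kg):
    # grams of food -> kg of food, times residue concentration, per kg body weight
    return food_g_per_day / 1000.0 * residue_mg_per_kg_food / body_weight_kg

# Hypothetical child: 200 g/day of a food at 100 mg SO2/kg, 16 kg body weight.
child = intake_mg_per_kg(food_g_per_day=200, residue_mg_per_kg_food=100,
                         body_weight_kg=16)   # 1.25 mg/kg bw/day
exceeds_adi = child > ADI
```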

  1. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo-based approach to estimate the operational risk of the lunar landing event and calculates estimates of the risk of Loss of Mission (LOM: an abort is required and succeeds), Loss of Crew (LOC: the vehicle crashes or cannot reach orbit), and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
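
A Monte Carlo tally in the spirit of the LLORM can be sketched by sampling landing outcomes from assumed per-trial probabilities; the probabilities below are invented, not Altair estimates:

```python
import random

# Toy Monte Carlo over landing outcomes: each trial yields Success, LOM
# (abort required and successful), or LOC. Probabilities are hypothetical.
random.seed(1)
P_LOC, P_LOM = 0.005, 0.02

def one_landing():
    u = random.random()
    if u < P_LOC:
        return "LOC"
    if u < P_LOC + P_LOM:
        return "LOM"
    return "Success"

N = 100_000
counts = {"Success": 0, "LOM": 0, "LOC": 0}
for _ in range(N):
    counts[one_landing()] += 1

success_rate = counts["Success"] / N   # close to 1 - P_LOC - P_LOM = 0.975
```

In a fuller model, the per-trial outcome would come from simulating the pilot/sensor/terrain interaction rather than a fixed probability, which is what makes the sensitivity analysis informative.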

  2. Network Reconstruction Using Nonparametric Additive ODE Models

    PubMed Central

    Henderson, James; Michailidis, George

    2014-01-01

    Network representations of biological systems are widespread and reconstructing unknown networks from data is a focal problem for computational biologists. For example, the series of biochemical reactions in a metabolic pathway can be represented as a network, with nodes corresponding to metabolites and edges linking reactants to products. In a different context, regulatory relationships among genes are commonly represented as directed networks with edges pointing from influential genes to their targets. Reconstructing such networks from data is a challenging problem receiving much attention in the literature. There is a particular need for approaches tailored to time-series data and not reliant on direct intervention experiments, as the former are often more readily available. In this paper, we introduce an approach to reconstructing directed networks based on dynamic systems models. Our approach generalizes commonly used ODE models based on linear or nonlinear dynamics by extending the functional class for the functions involved from parametric to nonparametric models. Concomitantly we limit the complexity by imposing an additive structure on the estimated slope functions. Thus the submodel associated with each node is a sum of univariate functions. These univariate component functions form the basis for a novel coupling metric that we define in order to quantify the strength of proposed relationships and hence rank potential edges. We show the utility of the method by reconstructing networks using simulated data from computational models for the glycolytic pathway of Lactococcus lactis and a gene network regulating the pluripotency of mouse embryonic stem cells. For purposes of comparison, we also assess reconstruction performance using gene networks from the DREAM challenges. We compare our method to those that similarly rely on dynamic systems models and use the results to attempt to disentangle the distinct roles of linearity, sparsity, and derivative estimation.

  3. Acute radiation risk models

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

    Biologically motivated mathematical models are developed that describe the dynamics of the major hematopoietic lineages (the thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems) in acutely or chronically irradiated humans. These models are implemented as systems of nonlinear differential equations whose variables and constant parameters have clear biological meaning. It is shown that the developed models are capable of reproducing clinical data on the dynamics of these systems in humans exposed to acute radiation as a result of incidents and accidents, as well as in humans exposed to low-level chronic radiation. Moreover, the averaged value of the "lethal" dose rates of chronic irradiation evaluated within the models of these four major hematopoietic lineages coincides with the real minimal dose rate of lethal chronic irradiation. The demonstrated ability of the models of the human thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems to predict the dynamical response of these systems to acute or chronic irradiation over wide ranges of doses and dose rates implies that these mathematical models form a universal tool for investigating and predicting the dynamics of the major human hematopoietic lineages under a vast range of irradiation scenarios. In particular, these models could be applied to radiation risk assessment for the health of astronauts exposed to space radiation during long-term space missions, such as voyages to Mars or lunar colonies, as well as for the health of people exposed to acute or chronic irradiation due to environmental radiological events.

  4. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  5. Additional risk of end-of-the-pipe geoengineering technologies

    NASA Astrophysics Data System (ADS)

    Bohle, Martin

    2014-05-01

    End-of-the-pipe geoengineering technologies differ qualitatively from the known successes. They do not tackle the initial cause, namely carbon-dioxide inputs that are too high; this is their additional specific risk. 'The acceptability of geoengineering will be determined as much by social, legal and political issues as by scientific and technical factors', conclude Adam Corner and Nick Pidgeon (2010) in their review of the social and ethical implications of geoengineering the climate. It is in that context that the end-of-the-pipe character of most geoengineering technologies, and the additional specific risk it involves, should be debated. Should these technologies be part of the toolbox to tackle anthropogenic climate change? Reference: Adam Corner and Nick Pidgeon (2010), Geoengineering the climate: The social and ethical implications, Environment, Vol. 52.

  6. CREATION OF THE MODEL ADDITIONAL PROTOCOL

    SciTech Connect

    Houck, F.; Rosenthal, M.; Wulf, N.

    2010-05-25

    In 1991, the international nuclear nonproliferation community was dismayed to discover that the implementation of safeguards by the International Atomic Energy Agency (IAEA) under its NPT INFCIRC/153 safeguards agreement with Iraq had failed to detect Iraq's nuclear weapon program. It was now clear that ensuring that states were fulfilling their obligations under the NPT would require not just detecting diversion but also the ability to detect undeclared materials and activities. To achieve this, the IAEA initiated what would turn out to be a five-year effort to reappraise the NPT safeguards system. The effort engaged the IAEA and its Member States and led to agreement in 1997 on a new safeguards agreement, the Model Protocol Additional to the Agreement(s) between States and the International Atomic Energy Agency for the Application of Safeguards. The Model Protocol makes explicit that one IAEA goal is to provide assurance of the absence of undeclared nuclear material and activities. The Model Protocol requires an expanded declaration that identifies a State's nuclear potential, empowers the IAEA to raise questions about the correctness and completeness of the State's declaration, and, if needed, allows IAEA access to locations. The information required and the locations available for access are much broader than those provided for under INFCIRC/153. The negotiation was completed in quite a short time because it started with a relatively complete draft of an agreement prepared by the IAEA Secretariat. This paper describes how the Model Protocol was constructed and reviews key decisions that were made both during the five-year period and in the actual negotiation.

  7. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  8. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  9. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  10. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  11. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  12. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    ERIC Educational Resources Information Center

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  13. Widespread non-additive and interaction effects within HLA loci modulate the risk of autoimmune diseases

    PubMed Central

    Lenz, Tobias L.; Deutsch, Aaron J.; Han, Buhm; Hu, Xinli; Okada, Yukinori; Eyre, Stephen; Knapp, Michael; Zhernakova, Alexandra; Huizinga, Tom W.J.; Abecasis, Goncalo; Becker, Jessica; Boeckxstaens, Guy E.; Chen, Wei-Min; Franke, Andre; Gladman, Dafna D.; Gockel, Ines; Gutierrez-Achury, Javier; Martin, Javier; Nair, Rajan P.; Nöthen, Markus M.; Onengut-Gumuscu, Suna; Rahman, Proton; Rantapää-Dahlqvist, Solbritt; Stuart, Philip E.; Tsoi, Lam C.; Van Heel, David A.; Worthington, Jane; Wouters, Mira M.; Klareskog, Lars; Elder, James T.; Gregersen, Peter K.; Schumacher, Johannes; Rich, Stephen S.; Wijmenga, Cisca; Sunyaev, Shamil R.; de Bakker, Paul I.W.; Raychaudhuri, Soumya

    2015-01-01

    Human leukocyte antigen (HLA) genes confer strong risk for autoimmune diseases on a log-additive scale. Here we speculated that differences in autoantigen binding repertoires between a heterozygote’s two expressed HLA variants may result in additional non-additive risk effects. We tested non-additive disease contributions of classical HLA alleles in patients and matched controls for five common autoimmune diseases: rheumatoid arthritis (RA, Ncases=5,337), type 1 diabetes (T1D, Ncases=5,567), psoriasis vulgaris (Ncases=3,089), idiopathic achalasia (Ncases=727), and celiac disease (Ncases=11,115). In four out of five diseases, we observed highly significant non-additive dominance effects (RA: P=2.5×10−12; T1D: P=2.4×10−10; psoriasis: P=5.9×10−6; celiac disease: P=1.2×10−87). In three of these diseases, the dominance effects were explained by interactions between specific classical HLA alleles (RA: P=1.8×10−3; T1D: P=8.6×10−27; celiac disease: P=6.0×10−100). These interactions generally increased disease risk and explained moderate but significant fractions of phenotypic variance (RA: 1.4%, T1D: 4.0%, and celiac disease: 4.1%, beyond a simple additive model). PMID:26258845

  14. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  15. The Addition of Vascular Calcification Scores to Traditional Risk Factors Improves Cardiovascular Risk Assessment in Patients with Chronic Kidney Disease

    PubMed Central

    Diouf, Momar; Temmar, Mohamed; Renard, Cédric; Choukroun, Gabriel; Massy, Ziad A.

    2015-01-01

    Background Although a variety of non-invasive methods for measuring cardiovascular (CV) risk (such as carotid intima media thickness, pulse wave velocity (PWV), coronary artery and aortic calcification scores (measured either by CT scan or X-ray) and the ankle brachial index (ABI)) have been evaluated separately in chronic kidney disease (CKD) cohorts, few studies have evaluated these methods simultaneously. Here, we looked at whether the addition of non-invasive methods to traditional risk factors (TRFs) improves prediction of the CV risk in patients at different CKD stages. Methods We performed a prospective, observational study of the relationship between the outputs of non-invasive measurement methods on one hand and mortality and CV outcomes in 143 patients at different CKD stages on the other. During the follow-up period, 44 patients died and 30 CV events were recorded. We used Cox models to calculate the relative risk for outcomes. To assess the putative clinical value of each method, we also determined the categorical net reclassification improvement (NRI) and the integrated discrimination improvement. Results Vascular calcification, PWV and ABI predicted all-cause mortality and CV events in univariate analyses. However, after adjustment for TRFs, only aortic and coronary artery calcification scores were found to be significant, independent variables. Moreover, the addition of coronary artery calcification scores to TRFs improved the specificity of prediction by 20%. Conclusion The addition of vascular calcification scores (especially the coronary artery calcification score) to TRFs appears to improve CV risk assessment in a CKD population. PMID:26181592

  16. RISK 0301 - MOLECULAR MODELING

    EPA Science Inventory

    Risk assessment practices, in general, for a range of diseases now encourages the use of mechanistic data to enhance the ability to predict responses at low, environmental exposures. In particular, the pathway from normal biology to pathologic state can be dcscribed by a set of m...

  17. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  18. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  19. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  20. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  1. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  2. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  3. Testicular Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  4. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize investment risk: its objective is to minimize portfolio risk, measured by variance, while achieving a target rate of return. The purpose of this study is to compare the composition and performance of the mean-variance optimal portfolio with those of the equally weighted portfolio, in which equal proportions are invested in each asset. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio are different, and that the mean-variance optimal portfolio gives better performance, achieving a higher performance ratio than the equally weighted portfolio.
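The contrast in this study can be sketched numerically. The following is a minimal illustration with invented return data, comparing the closed-form global minimum-variance portfolio (the simplest mean-variance solution, stated without a return target) against the equally weighted 1/N portfolio:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical monthly returns for 4 assets (T x N matrix).
returns = rng.normal(loc=[0.01, 0.008, 0.012, 0.006],
                     scale=[0.05, 0.03, 0.07, 0.02],
                     size=(120, 4))

sigma = np.cov(returns, rowvar=False)   # covariance matrix (risk)

# Closed-form global minimum-variance weights: w proportional to inv(Sigma) * 1.
ones = np.ones(4)
w_mv = np.linalg.solve(sigma, ones)
w_mv /= w_mv.sum()

w_eq = ones / 4                         # equally weighted (1/N) portfolio

def port_risk(w):
    """Portfolio standard deviation for weight vector w."""
    return float(np.sqrt(w @ sigma @ w))

print("min-variance weights:", np.round(w_mv, 3))
print("risk (mean-variance):", port_risk(w_mv))
print("risk (1/N):          ", port_risk(w_eq))
```

By construction the minimum-variance portfolio's risk is no larger than that of the 1/N portfolio, though the two compositions differ, as the study observes.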

  9. Models for Pesticide Risk Assessment

    EPA Pesticide Factsheets

    In risk assessment, EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environment may be exposed. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  10. Adiponectin Provides Additional Information to Conventional Cardiovascular Risk Factors for Assessing the Risk of Atherosclerosis in Both Genders

    PubMed Central

    Yoon, Jin-Ha; Kim, Sung-Kyung; Choi, Ho-June; Choi, Soo-In; Cha, So-Youn; Koh, Sang-Baek

    2013-01-01

    Background This study evaluated the relation between adiponectin and atherosclerosis in both genders, and investigated whether adiponectin provides useful additional information for assessing the risk of atherosclerosis. Methods We measured serum adiponectin levels and other cardiovascular risk factors in 1033 subjects (454 men, 579 women) from the Korean Genomic Rural Cohort study. Carotid intima–media-thickness (CIMT) was used as measure of atherosclerosis. Odds ratios (ORs) with 95% confidence intervals (95% CI) were calculated using multiple logistic regression, and receiver operating characteristic curves (ROC), the category-free net reclassification improvement (NRI) and integrated discrimination improvement (IDI) were calculated. Results After adjustment for conventional cardiovascular risk factors, such as age, waist circumference, smoking history, low-density and high-density lipoprotein cholesterol, triglycerides, systolic blood pressure and insulin resistance, the ORs (95%CI) of the third tertile adiponectin group were 0.42 (0.25–0.72) in men and 0.47 (0.29–0.75) in women. The area under the curve (AUC) on the ROC analysis increased significantly by 0.025 in men and 0.022 in women when adiponectin was added to the logistic model of conventional cardiovascular risk factors (AUC in men: 0.655 to 0.680, p = 0.038; AUC in women: 0.654 to 0.676, p = 0.041). The NRI was 0.32 (95%CI: 0.13–0.50, p<0.001), and the IDI was 0.03 (95%CI: 0.01–0.04, p<0.001) for men. For women, the category-free NRI was 0.18 (95%CI: 0.02–0.34, p = 0.031) and the IDI was 0.003 (95%CI: −0.002–0.008, p = 0.189). Conclusion Adiponectin and atherosclerosis were significantly related in both genders, and these relationships were independent of conventional cardiovascular risk factors. Furthermore, adiponectin provided additional information to conventional cardiovascular risk factors regarding the risk of atherosclerosis. PMID:24116054
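The category-free NRI reported in this abstract has a compact definition: NRI = (P(up|event) − P(down|event)) + (P(down|non-event) − P(up|non-event)), where "up"/"down" record whether adding the new marker moved a subject's risk estimate up or down. A minimal sketch with simulated data follows; all numbers are hypothetical, and the "new model" is assumed to recover the true risk exactly:

```python
import numpy as np

def category_free_nri(p_old, p_new, events):
    """Category-free net reclassification improvement:
    (P(up|event) - P(down|event)) + (P(down|non-event) - P(up|non-event))."""
    up = p_new > p_old
    down = p_new < p_old
    e = np.asarray(events, dtype=bool)
    return (up[e].mean() - down[e].mean()) + (down[~e].mean() - up[~e].mean())

rng = np.random.default_rng(5)
n = 50_000
true_risk = rng.uniform(size=n)              # hypothetical true risk
events = rng.uniform(size=n) < true_risk     # outcomes drawn from that risk
p_old = np.full(n, events.mean())            # baseline: one average risk for all
p_new = true_risk                            # new marker recovers the true risk

print("category-free NRI:", round(category_free_nri(p_old, p_new, events), 2))
```

A perfectly informative added marker against an uninformative baseline drives the category-free NRI toward its interpretable extreme; real markers, as in the study above, yield much smaller values.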

  11. Multifractal Value at Risk model

    NASA Astrophysics Data System (ADS)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper, a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.
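The paper's MFVaR is analytic, but the quantity being forecast is the standard one. As a point of reference only (not the multifractal method itself), a plain historical-simulation VaR on heavy-tailed toy returns might look like this; all data are simulated:

```python
import numpy as np

def historical_var(returns, alpha=0.99):
    """One-period Value at Risk at confidence level alpha:
    the loss threshold exceeded with probability 1 - alpha."""
    losses = -np.asarray(returns)
    return float(np.quantile(losses, alpha))

rng = np.random.default_rng(1)
# Heavy-tailed toy returns (Student-t, df=3) to mimic a volatile market.
rets = 0.01 * rng.standard_t(df=3, size=10_000)

var99 = historical_var(rets, 0.99)
print(f"99% historical-simulation VaR: {var99:.4f}")
```

Roughly 1% of the simulated losses exceed the reported threshold, which is the defining property any VaR model, multifractal or otherwise, must satisfy.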

  12. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  13. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, the Centre for Internet Security guidelines, NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business-process-level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the research presentation will explore the appropriate usage of these standards. The paper will discuss the standards' approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and of a defined risk and security space for modeling and measuring.

  14. Influence of dispersing additive on asphaltenes aggregation in model system

    NASA Astrophysics Data System (ADS)

    Gorshkov, A. M.; Shishmina, L. V.; Tukhvatullina, A. Z.; Ismailov, Yu R.; Ges, G. A.

    2016-09-01

    This work investigates the influence of a dispersing additive on asphaltene aggregation in the asphaltenes-toluene-heptane model system by the photon correlation spectroscopy method. The experimental relationship between the onset point of asphaltenes and their concentration in toluene has been obtained. The influence of the model system's composition on asphaltene aggregation has been examined. The aggregative and sedimentation stability of asphaltenes has been estimated both in the model system and in the system with the dispersing additive added.

  15. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command process. These models include simulation analysis and probabilistic risk assessment models.

  16. Common genetic variants, acting additively, are a major source of risk for autism

    PubMed Central

    2012-01-01

    Background Autism spectrum disorders (ASD) are early onset neurodevelopmental syndromes typified by impairments in reciprocal social interaction and communication, accompanied by restricted and repetitive behaviors. While rare and especially de novo genetic variation are known to affect liability, whether common genetic polymorphism plays a substantial role is an open question and the relative contribution of genes and environment is contentious. It is probable that the relative contributions of rare and common variation, as well as environment, differ between ASD families having only a single affected individual (simplex) and multiplex families who have two or more affected individuals. Methods By using quantitative genetics techniques and the contrast of ASD subjects to controls, we estimate what portion of liability can be explained by additive genetic effects, known as narrow-sense heritability. We evaluate relatives of ASD subjects using the same methods to evaluate the assumptions of the additive model and partition families by simplex/multiplex status to determine how heritability changes with status. Results By analyzing common variation throughout the genome, we show that common genetic polymorphism exerts substantial additive genetic effects on ASD liability and that simplex/multiplex family status has an impact on the identified composition of that risk. As a fraction of the total variation in liability, the estimated narrow-sense heritability exceeds 60% for ASD individuals from multiplex families and is approximately 40% for simplex families. By analyzing parents, unaffected siblings and alleles not transmitted from parents to their affected children, we conclude that the data for simplex ASD families follow the expectation for additive models closely. The data from multiplex families deviate somewhat from an additive model, possibly due to parental assortative mating. Conclusions Our results, when viewed in the context of results from genome

  17. Evaluation of the performance of smoothing functions in generalized additive models for spatial variation in disease.

    PubMed

    Siangphoe, Umaporn; Wheeler, David C

    2015-01-01

    Generalized additive models (GAMs) with bivariate smoothing functions have been applied to estimate spatial variation in risk for many types of cancers. Only a handful of studies have evaluated the performance of smoothing functions applied in GAMs with regard to different geographical areas of elevated risk and different risk levels. This study evaluates the ability of different smoothing functions to detect overall spatial variation of risk and elevated risk in diverse geographical areas at various risk levels using a simulation study. We created five scenarios with different true risk area shapes (circle, triangle, linear) in a square study region. We applied four different smoothing functions in the GAMs, including two types of thin plate regression splines (TPRS) and two versions of locally weighted scatterplot smoothing (loess). We tested the null hypothesis of constant risk and detected areas of elevated risk using analysis of deviance with permutation methods and assessed the performance of the smoothing methods based on the spatial detection rate, sensitivity, accuracy, precision, power, and false-positive rate. The results showed that all methods had a higher sensitivity and a consistently moderate-to-high accuracy rate when the true disease risk was higher. The models generally performed better in detecting elevated risk areas than detecting overall spatial variation. One of the loess methods had the highest precision in detecting overall spatial variation across scenarios and outperformed the other methods in detecting a linear elevated risk area. The TPRS methods outperformed loess in detecting elevated risk in two circular areas.
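As a rough illustration of the smoothing idea evaluated in this study, a Nadaraya-Watson kernel smoother (a simple stand-in for the bivariate TPRS/loess smoothers used inside the GAMs, not the study's actual estimator) can recover a circular elevated-risk area from simulated case/control points. All data and parameter values here are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated case/control locations in a unit square; cases cluster in a
# circular elevated-risk area centred at (0.7, 0.7) with radius 0.15.
n = 2000
xy = rng.uniform(0, 1, size=(n, 2))
in_circle = np.linalg.norm(xy - [0.7, 0.7], axis=1) < 0.15
p_case = np.where(in_circle, 0.8, 0.3)
y = rng.uniform(size=n) < p_case            # 1 = case, 0 = control

def kernel_risk(x0, y0, h=0.1):
    """Nadaraya-Watson estimate of P(case) at (x0, y0): a Gaussian-kernel
    weighted average of the 0/1 outcomes (a crude bivariate smoother)."""
    d2 = (xy[:, 0] - x0) ** 2 + (xy[:, 1] - y0) ** 2
    w = np.exp(-d2 / (2 * h ** 2))
    return float(np.sum(w * y) / np.sum(w))

print("estimated risk inside cluster :", round(kernel_risk(0.7, 0.7), 2))
print("estimated risk outside cluster:", round(kernel_risk(0.2, 0.2), 2))
```

The smoothed surface is clearly elevated inside the true risk area, which is the kind of signal the permutation tests in the study are designed to detect; the bandwidth plays the role of the smoothing parameter whose choice the paper evaluates.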

  19. Risk perception in epidemic modeling

    NASA Astrophysics Data System (ADS)

    Bagnoli, Franco; Liò, Pietro; Sguanci, Luca

    2007-12-01

    We investigate the effects of risk perception in a simple model of epidemic spreading. We assume that the perception of the risk of being infected depends on the fraction of neighbors that are ill. The effect of this factor is to decrease the infectivity, that therefore becomes a dynamical component of the model. We study the problem in the mean-field approximation and by numerical simulations for regular, random, and scale-free networks. We show that for homogeneous and random networks, there is always a value of perception that stops the epidemics. In the “worst-case” scenario of a scale-free network with diverging input connectivity, a linear perception cannot stop the epidemics; however, we show that a nonlinear increase of the perception risk may lead to the extinction of the disease. This transition is discontinuous, and is not predicted by the mean-field analysis.
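A toy mean-field version of this mechanism can be sketched as follows; the exponential perception term and all parameter values are illustrative assumptions, not the paper's exact model:

```python
import math

def steady_state(tau0, J, recovery=0.2, steps=20_000):
    """Iterate a mean-field SIS-style model in which perceived risk lowers
    the effective infectivity: tau(i) = tau0 * exp(-J * i), where i is the
    infected fraction and J is the strength of risk perception."""
    i = 0.01
    for _ in range(steps):
        tau = tau0 * math.exp(-J * i)
        i += 0.01 * (tau * i * (1 - i) - recovery * i)   # small Euler step
        i = min(max(i, 0.0), 1.0)
    return i

print("endemic level, no perception (J=0):", round(steady_state(0.5, 0.0), 3))
print("endemic level, strong perception  :", round(steady_state(0.5, 8.0), 3))
```

In this toy homogeneous setting, stronger perception sharply lowers the endemic level; the paper's point is that on scale-free networks a linear perception term is no longer enough and a nonlinear one is needed to extinguish the disease.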

  20. Additive and subtractive scrambling in optional randomized response modeling.

    PubMed

    Hussain, Zawar; Al-Sobhi, Mashail M; Al-Zahrani, Bander

    2014-01-01

    This article considers unbiased estimation of the mean, variance and sensitivity level of a sensitive variable via scrambled response modeling; in particular, we focus on estimation of the mean. The idea of using additive and subtractive scrambling has been suggested under a recent scrambled response model. Whether it is estimation of the mean, variance or sensitivity level, the proposed scheme of estimation is shown to be relatively more efficient than that recent model. As far as estimation of the mean is concerned, the proposed estimators perform relatively better than the estimators based on recent additive scrambling models. Relative efficiency comparisons are also made in order to highlight the performance of the proposed estimators under the suggested scrambling technique.
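The core of additive scrambling can be illustrated in a few lines: each respondent reports the sensitive value plus a random draw from a scrambling distribution with known mean, so the interviewer never sees the true value, yet subtracting the known scrambling mean yields an unbiased estimator of the population mean. All data below are simulated:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
# Hypothetical sensitive variable X (e.g., true expenditure), E[X] = 100.
x = rng.gamma(shape=2.0, scale=50.0, size=n)

# Additive scrambling: each respondent adds a random S drawn from a
# distribution whose mean is publicly known, and reports only Z = X + S.
mu_s = 30.0
s = rng.normal(loc=mu_s, scale=10.0, size=n)
z = x + s                        # the only values the interviewer sees

mean_hat = z.mean() - mu_s       # unbiased estimator of E[X]
print("estimated mean of X:", round(mean_hat, 1))
```

Subtractive scrambling works the same way with Z = X − S; the variance of the scrambling distribution trades respondent privacy against estimator efficiency, which is the trade-off the article's efficiency comparisons quantify.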

  1. Complex Modelling Scheme Of An Additive Manufacturing Centre

    NASA Astrophysics Data System (ADS)

    Popescu, Liliana Georgeta

    2015-09-01

    This paper presents a modelling scheme supporting the development of a model of an additive manufacturing research centre and its processes. The modelling is performed using IDEF0, with the resulting process model representing the basic processes required to develop such a centre in any university. While the activities presented in this study are those recommended in general, changes may be needed for the specific situation of an existing research centre.

  2. Concentration Addition, Independent Action and Generalized Concentration Addition Models for Mixture Effect Prediction of Sex Hormone Synthesis In Vitro

    PubMed Central

    Hadrup, Niels; Taxvig, Camilla; Pedersen, Mikael; Nellemann, Christine; Hass, Ulla; Vinggaard, Anne Marie

    2013-01-01

    Humans are concomitantly exposed to numerous chemicals. An infinite number of combinations and doses thereof can be imagined. For toxicological risk assessment the mathematical prediction of mixture effects, using knowledge on single chemicals, is therefore desirable. We investigated pros and cons of the concentration addition (CA), independent action (IA) and generalized concentration addition (GCA) models. First we measured effects of single chemicals and mixtures thereof on steroid synthesis in H295R cells. Then single chemical data were applied to the models; predictions of mixture effects were calculated and compared to the experimental mixture data. Mixture 1 contained environmental chemicals adjusted in ratio according to human exposure levels. Mixture 2 was a potency adjusted mixture containing five pesticides. Prediction of testosterone effects coincided with the experimental Mixture 1 data. In contrast, antagonism was observed for effects of Mixture 2 on this hormone. The mixtures contained chemicals exerting only limited maximal effects. This hampered prediction by the CA and IA models, whereas the GCA model could be used to predict a full dose response curve. Regarding effects on progesterone and estradiol, some chemicals were having stimulatory effects whereas others had inhibitory effects. The three models were not applicable in this situation and no predictions could be performed. Finally, the expected contributions of single chemicals to the mixture effects were calculated. Prochloraz was the predominant but not sole driver of the mixtures, suggesting that one chemical alone was not responsible for the mixture effects. In conclusion, the GCA model seemed to be superior to the CA and IA models for the prediction of testosterone effects. A situation with chemicals exerting opposing effects, for which the models could not be applied, was identified. In addition, the data indicate that in non-potency adjusted mixtures the effects cannot always be
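The two classical prediction formulas compared in this study are short enough to state directly. Assuming simple mixture fractions and effects expressed as fractions of the maximal response, a sketch:

```python
import numpy as np

def ca_ec50(ec50s, fractions):
    """Concentration addition (CA): the mixture EC50 from the components'
    EC50s and their mixture fractions p_i: 1/EC50_mix = sum_i p_i / EC50_i."""
    ec50s, fractions = np.asarray(ec50s), np.asarray(fractions)
    return 1.0 / np.sum(fractions / ec50s)

def ia_effect(effects):
    """Independent action (IA): combined effect (fraction of maximum) from
    the individual effects E_i: E_mix = 1 - prod_i (1 - E_i)."""
    return 1.0 - np.prod(1.0 - np.asarray(effects))

# Hypothetical two-chemical mixture, equal mass fractions.
print("CA mixture EC50:   ", ca_ec50([10.0, 40.0], [0.5, 0.5]))   # -> 16.0
print("IA combined effect:", ia_effect([0.2, 0.3]))               # -> 0.44
```

Both formulas presuppose that each component acts in the same direction and reaches a full dose-response curve; the study's point is that partial-efficacy chemicals break CA and IA (motivating GCA), and opposing stimulatory/inhibitory effects break all three models.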

  3. "The Dose Makes the Poison": Informing Consumers About the Scientific Risk Assessment of Food Additives.

    PubMed

    Bearth, Angela; Cousin, Marie-Eve; Siegrist, Michael

    2016-01-01

    Intensive risk assessment is required before the approval of food additives. During this process, based on the toxicological principle of "the dose makes the poison," maximum usage doses are assessed. However, most consumers are not aware of these efforts to ensure the safety of food additives and are therefore sceptical, even though food additives bring certain benefits to consumers. This study investigated the effect of a short video, which explains the scientific risk assessment and regulation of food additives, on consumers' perceptions and acceptance of food additives. The primary goal of this study was to inform consumers and enable them to construct their own risk-benefit assessment and make informed decisions about food additives. The secondary goal was to investigate whether people have different perceptions of food additives of artificial (i.e., aspartame) or natural origin (i.e., steviolglycoside). To attain these research goals, an online experiment was conducted on 185 Swiss consumers. Participants were randomly assigned to either the experimental group, which was shown a video about the scientific risk assessment of food additives, or the control group, which was shown a video about a topic irrelevant to the study. After watching the video, the respondents knew significantly more, expressed more positive thoughts and feelings, had less risk perception, and more acceptance than prior to watching the video. Thus, it appears that informing consumers about complex food safety topics, such as the scientific risk assessment of food additives, is possible, and using a carefully developed information video is a successful strategy for informing consumers.

  4. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data.
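The defender-attacker structure described here can be illustrated as a tiny max-min calculation: the defender chooses an investment, the intelligent adversary then best-responds, and the defender's choice minimises the adversary's maximum expected consequence. The options and consequence values below are entirely notional:

```python
# Toy defender-attacker sketch (illustrative numbers only).
consequences = {
    # defender option -> {attack option: expected consequence}
    "no defense": {"agent A": 100, "agent B": 60},
    "detectors":  {"agent A": 35,  "agent B": 55},
    "stockpile":  {"agent A": 50,  "agent B": 20},
}

def worst_case(option):
    """The intelligent adversary best-responds: it picks the attack that
    maximises the consequence given the defender's choice."""
    return max(consequences[option].values())

# The defender anticipates the best response and minimises the worst case.
best = min(consequences, key=worst_case)
print("defender choice:", best, "| worst-case consequence:", worst_case(best))
```

This adaptive best-response step is exactly what distinguishes intelligent-adversary analysis from PRA for uncertain hazards, where the "attacker" does not observe and react to the defender's investments; the full model in the article adds post-attack defender actions and uncertain implementation, detection, and consequences.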

  5. Comprehensive European dietary exposure model (CEDEM) for food additives.

    PubMed

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
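
A deterministic additive-exposure algorithm of the kind CEDEM builds on can be sketched in a few lines: multiply consumption of each food category by the maximum permitted use level, sum, and normalize by body weight. The food categories, consumption figures, and use levels below are illustrative assumptions, not EFSA data.

```python
# Deterministic dietary exposure sketch: exposure (mg/kg bw/day) =
# sum over foods of consumption (g/day) * use level (mg/kg food) / body weight.
# All input values are hypothetical.

def dietary_exposure(consumption_g_per_day, use_level_mg_per_kg, body_weight_kg):
    """Return additive exposure in mg per kg body weight per day."""
    total_mg = sum(
        consumption_g_per_day[food] / 1000.0 * use_level_mg_per_kg[food]
        for food in consumption_g_per_day
    )
    return total_mg / body_weight_kg

consumption = {"soft drinks": 250.0, "desserts": 80.0}   # g/day (assumed)
use_levels = {"soft drinks": 350.0, "desserts": 500.0}   # mg/kg food (assumed)
exposure = dietary_exposure(consumption, use_levels, body_weight_kg=70.0)
```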

  6. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
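
The Monte Carlo workflow described above can be sketched minimally: sample uncertain impactor properties, map each scenario to a consequence, and aggregate the outcomes into a distribution. The size distribution, damage-radius scaling, and population density below are crude placeholders, not the published PAIR relations.

```python
import math
import random

# Minimal Monte Carlo sketch of a PAIR-style loop. All distributions and
# scaling constants are illustrative assumptions.
random.seed(0)

def sample_consequence(rng):
    diameter_m = rng.lognormvariate(3.0, 1.0)          # assumed size distribution
    velocity = rng.uniform(12e3, 30e3)                 # m/s, assumed entry speeds
    density = 3000.0                                   # kg/m^3, assumed
    mass = density * math.pi / 6.0 * diameter_m ** 3
    energy_j = 0.5 * mass * velocity ** 2
    damage_radius_m = 1e-3 * energy_j ** (1.0 / 3.0)   # crude cube-root scaling
    population_density = 50e-6                         # people per m^2, assumed
    return population_density * math.pi * damage_radius_m ** 2

outcomes = [sample_consequence(random) for _ in range(10000)]
mean_affected = sum(outcomes) / len(outcomes)
```

The aggregated `outcomes` list plays the role of the scenario distribution from which risk-tolerance questions (e.g. a minimum threatening size) could then be answered.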

  7. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  8. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification and Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data - Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  9. Transferability of regional permafrost disturbance susceptibility modelling using generalized linear and generalized additive models

    NASA Astrophysics Data System (ADS)

    Rudy, Ashley C. A.; Lamoureux, Scott F.; Treitz, Paul; van Ewijk, Karin Y.

    2016-07-01

    To effectively assess and mitigate risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study, we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory to test the transferability of the model to a broader region in the Canadian High Arctic. Resulting maps of susceptibility were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both sites and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim) with the area under the receiver operating characteristic curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without calibration and validated equally with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas.
Terrain attributes associated with the initiation of disturbances were
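
The validation metric used above, the AUROC, has a simple pair-counting interpretation: the probability that a randomly chosen disturbed site receives a higher susceptibility score than a randomly chosen undisturbed one. A minimal implementation on hypothetical scores (not the study's data):

```python
# Pair-counting AUROC: ties count half. Scores below are hypothetical.

def auroc(scores_pos, scores_neg):
    """P(random positive scores higher than random negative)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

disturbed = [0.9, 0.8, 0.6]        # susceptibility at disturbed sites (assumed)
undisturbed = [0.7, 0.4, 0.2, 0.1] # susceptibility at undisturbed sites (assumed)
score = auroc(disturbed, undisturbed)
```

An AUROC of 0.5 corresponds to a random ranking and 1.0 to a perfect one, which is why values of 0.76-0.79 in the abstract indicate useful but imperfect discrimination.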

  10. Do Health Professionals Need Additional Competencies for Stratified Cancer Prevention Based on Genetic Risk Profiling?

    PubMed Central

    Chowdhury, Susmita; Henneman, Lidewij; Dent, Tom; Hall, Alison; Burton, Alice; Pharoah, Paul; Pashayan, Nora; Burton, Hilary

    2015-01-01

    There is growing evidence that inclusion of genetic information about known common susceptibility variants may enable population risk-stratification and personalized prevention for common diseases including cancer. This would require the inclusion of genetic testing as an integral part of individual risk assessment of an asymptomatic individual. Front line health professionals would be expected to interact with and assist asymptomatic individuals through the risk stratification process. In that case, additional knowledge and skills may be needed. Current guidelines and frameworks for genetic competencies of non-specialist health professionals place an emphasis on rare inherited genetic diseases. For common diseases, health professionals do use risk assessment tools but such tools currently do not assess genetic susceptibility of individuals. In this article, we compare the skills and knowledge needed by non-genetic health professionals, if risk-stratified prevention is implemented, with existing competence recommendations from the UK, USA and Europe, in order to assess the gaps in current competences. We found that health professionals would benefit from understanding the contribution of common genetic variations in disease risk, the rationale for a risk-stratified prevention pathway, and the implications of using genomic information in risk-assessment and risk management of asymptomatic individuals for common disease prevention. PMID:26068647

  12. Additive composite ABCG2, SLC2A9 and SLC22A12 scores of high-risk alleles with alcohol use modulate gout risk.

    PubMed

    Tu, Hung-Pin; Chung, Chia-Min; Min-Shan Ko, Albert; Lee, Su-Shin; Lai, Han-Ming; Lee, Chien-Hung; Huang, Chung-Ming; Liu, Chiu-Shong; Ko, Ying-Chin

    2016-09-01

    The aim of the present study was to evaluate the contribution of urate transporter genes and alcohol use to the risk of gout/tophi. Eight variants of ABCG2, SLC2A9, SLC22A12, SLC22A11 and SLC17A3 were genotyped in male individuals in a case-control study with 157 gout (33% tophi), 106 asymptomatic hyperuricaemia and 295 control subjects from Taiwan. The multilocus profiles of the genetic risk scores for urate gene variants were used to evaluate the risk of asymptomatic hyperuricaemia, gout and tophi. ABCG2 Q141K (T), SLC2A9 rs1014290 (A) and SLC22A12 rs475688 (C) under an additive model and alcohol use independently predicted the risk of gout (odds ratios of 2.48, 2.03, 1.95 and 2.48, respectively). The additive composite Q141K, rs1014290 and rs475688 scores of high-risk alleles were associated with gout risk (P<0.0001). We observed a supramultiplicative interaction effect of genetic urate scores and alcohol use on gout and tophi risk (P for interaction=0.0452, 0.0033). The synergistic effect of genetic urate score 5-6 and alcohol use indicates that these combined factors correlate with gout and tophi occurrence.
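
The additive composite score used above simply counts high-risk alleles (0, 1, or 2 per locus) across the three loci, giving a 0-6 scale that is then crossed with alcohol use. A sketch under assumed genotype coding (the stratification rule is a simplification of the abstract's "score 5-6" stratum, not the study's full statistical model):

```python
# Additive composite genetic risk score across three urate transporter loci.
# Genotype coding and the risk_group rule are illustrative simplifications.

RISK_LOCI = ("ABCG2_Q141K", "SLC2A9_rs1014290", "SLC22A12_rs475688")

def composite_score(genotype):
    """genotype maps locus name -> number of high-risk alleles (0-2)."""
    return sum(genotype[locus] for locus in RISK_LOCI)

def risk_group(genotype, alcohol_use):
    score = composite_score(genotype)  # 0-6 under the additive model
    high_genetic = score >= 5          # the "score 5-6" stratum
    return ("high" if high_genetic and alcohol_use else
            "intermediate" if high_genetic or alcohol_use else "low")

# Hypothetical patient carrying five high-risk alleles:
patient = {"ABCG2_Q141K": 2, "SLC2A9_rs1014290": 2, "SLC22A12_rs475688": 1}
```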

  13. Modeling the cardiovascular system using a nonlinear additive autoregressive model with exogenous input

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Suhrbier, A.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2008-07-01

    The parameters of heart rate variability and blood pressure variability have proved to be useful analytical tools in cardiovascular physics and medicine. Model-based analysis of these variabilities additionally leads to new prognostic information about mechanisms behind regulations in the cardiovascular system. In this paper, we analyze the complex interaction between heart rate, systolic blood pressure, and respiration using nonparametrically fitted nonlinear additive autoregressive models with exogenous inputs. To this end, we consider measurements of healthy persons and patients suffering from obstructive sleep apnea syndrome (OSAS), with and without hypertension. It is shown that the proposed nonlinear models are capable of describing short-term fluctuations in heart rate as well as systolic blood pressure significantly better than similar linear ones, which confirms the assumption of nonlinearly controlled heart rate and blood pressure. Furthermore, the comparison of the nonlinear and linear approaches reveals that the heart rate and blood pressure variability in healthy subjects is caused by a higher level of noise as well as nonlinearity than in patients suffering from OSAS. The residue analysis points to a further source of heart rate and blood pressure variability in healthy subjects, in addition to heart rate, systolic blood pressure, and respiration. Comparison of the nonlinear models within and among the different groups of subjects suggests the ability to discriminate the cohorts that could lead to a stratification of hypertension risk in OSAS patients.

  14. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skills. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and the lack of prediction capability. Therefore, the multiplicative error model is a better choice.
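
The nonconstant-variance problem of the additive model can be demonstrated directly: when errors scale with magnitude, additive residuals (y - x) spread out as the true value grows, while multiplicative residuals (log y - log x) stay homogeneous. The synthetic numbers below are for demonstration only.

```python
import math

# Compare residual spread under the two error models for data whose
# measurement error is proportional to the true value. Synthetic data.
truth = [1.0, 10.0, 100.0]
factors = [0.8, 1.25]  # multiplicative error factors applied to each truth

additive_spread = []        # range of y - x at each magnitude
multiplicative_spread = []  # range of log(y) - log(x) at each magnitude
for x in truth:
    add_res = [x * f - x for f in factors]
    mul_res = [math.log(x * f) - math.log(x) for f in factors]
    additive_spread.append(max(add_res) - min(add_res))
    multiplicative_spread.append(max(mul_res) - min(mul_res))
```

The additive residual spread grows in proportion to the truth, whereas the log-space spread is the same at every magnitude, which is the letter's core argument in miniature.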

  15. Electroacoustics modeling of piezoelectric welders for ultrasonic additive manufacturing processes

    NASA Astrophysics Data System (ADS)

    Hehr, Adam; Dapino, Marcelo J.

    2016-04-01

    Ultrasonic additive manufacturing (UAM) is a recent 3D metal printing technology which utilizes ultrasonic vibrations from high power piezoelectric transducers to additively weld similar and dissimilar metal foils. CNC machining is used intermittently with welding to create internal channels, to embed temperature-sensitive components, sensors, and materials, and to net-shape parts. Structural dynamics of the welder and work piece influence the performance of the welder and part quality. To understand the impact of structural dynamics on UAM, a linear time-invariant model is used to relate system shear force and electric current inputs to the system outputs of welder velocity and voltage. Frequency response measurements are combined with in-situ operating measurements of the welder to identify model parameters and to verify model assumptions. The proposed LTI model can enhance process consistency, performance, and guide the development of improved quality monitoring and control strategies.

  16. Risk assessment for the combinational effects of food color additives: neural progenitor cells and hippocampal neurogenesis.

    PubMed

    Park, Mikyung; Park, Hee Ra; Kim, So Jung; Kim, Min-Sun; Kong, Kyoung Hye; Kim, Hyun Soo; Gong, Ein Ji; Kim, Mi Eun; Kim, Hyung Sik; Lee, Byung Mu; Lee, Jaewon

    2009-01-01

    In 2006, the Korea Food and Drug Administration reported that combinations of dietary colors such as allura red AC (R40), tartrazine (Y4), sunset yellow FCF (Y5), amaranth (R2), and brilliant blue FCF (B1) are widely used in food manufacturing. Although individual tar food colors are controlled based on acceptable daily intake (ADI), there is no apparent information available for how combinations of these additives affect food safety. In the current study, the potencies of single and combination use of R40, Y4, Y5, R2, and B1 were examined on neural progenitor cell (NPC) toxicity, a biomarker for developmental stage, and neurogenesis, indicative of adult central nervous system (CNS) functions. R40 and R2 reduced NPC proliferation and viability in mouse multipotent NPC, in the developing CNS model. Among several combinations tested in a mouse model, the combination of Y4 and B1 at 1000-fold higher than average daily intake in Korea significantly decreased numbers of newly generated cells in adult mouse hippocampus, indicating potent adverse actions on hippocampal neurogenesis. However, other combinations including R40 and R2 did not affect adult hippocampal neurogenesis in the dentate gyrus. Evidence indicates that single and combination use of most tar food colors may be safe with respect to both developmental NPC toxicity and adult hippocampal neurogenesis. However, the response to the excessively high dose combination of Y4 and B1 is suggestive of synergistic effects that suppress proliferation of NPC in the adult hippocampus. Data indicated that certain combinations of tar colors may adversely affect both developmental and adult hippocampal neurogenesis; thus, further extensive studies are required to assess the safety of these additive combinations.

  17. An Additional Symmetry in the Weinberg-Salam Model

    SciTech Connect

    Bakker, B.L.G.; Veselov, A.I.; Zubkov, M.A.

    2005-06-01

    An additional Z{sub 6} symmetry hidden in the fermion and Higgs sectors of the Standard Model has been found recently. It has a singular nature and is connected to the centers of the SU(3) and SU(2) subgroups of the gauge group. A lattice regularization of the Standard Model was constructed that possesses this symmetry. In this paper, we report our results on the numerical simulation of its electroweak sector.

  18. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.

  19. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  20. Modeling uranium transport in acidic contaminated groundwater with base addition.

    PubMed

    Zhang, Fan; Luo, Wensui; Parker, Jack C; Brooks, Scott C; Watson, David B; Jardine, Philip M; Gu, Baohua

    2011-06-15

    This study investigates reactive transport modeling in a column of uranium(VI)-contaminated sediments with base additions in the circulating influent. The groundwater and sediment exhibit oxic conditions with low pH, high concentrations of NO(3)(-), SO(4)(2-), U and various metal cations. Preliminary batch experiments indicate that additions of strong base induce rapid immobilization of U for this material. In the column experiment that is the focus of the present study, effluent groundwater was titrated with NaOH solution in an inflow reservoir before reinjection to gradually increase the solution pH in the column. An equilibrium hydrolysis, precipitation and ion exchange reaction model developed through simulation of the preliminary batch titration experiments predicted faster reduction of aqueous Al than observed in the column experiment. The model was therefore modified to consider reaction kinetics for the precipitation and dissolution processes which are the major mechanism for Al immobilization. The combined kinetic and equilibrium reaction model adequately described variations in pH, aqueous concentrations of metal cations (Al, Ca, Mg, Sr, Mn, Ni, Co), sulfate and U(VI). The experimental and modeling results indicate that U(VI) can be effectively sequestered with controlled base addition due to sorption by slowly precipitated Al with pH-dependent surface charge. The model may prove useful to predict field-scale U(VI) sequestration and remediation effectiveness.
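
The kinetic treatment that replaced the pure-equilibrium precipitation model can be sketched as a first-order relaxation: the aqueous concentration approaches its pH-dependent equilibrium value at a finite rate, dC/dt = -k (C - C_eq), rather than instantaneously. The rate constant and concentrations below are illustrative placeholders, not fitted values from the study.

```python
# First-order kinetic sketch of precipitation/dissolution:
# dC/dt = -k (C - C_eq), integrated with explicit Euler steps.
# All parameter values are hypothetical.

def simulate_kinetic_precipitation(c0, c_eq, k, dt, steps):
    """Return the concentration history as C relaxes toward C_eq."""
    c = c0
    history = [c]
    for _ in range(steps):
        c += -k * (c - c_eq) * dt
        history.append(c)
    return history

# Aqueous Al (mmol/L, assumed) declining toward equilibrium after base addition:
profile = simulate_kinetic_precipitation(c0=5.0, c_eq=0.1, k=0.5, dt=0.1, steps=200)
```

With a finite `k` the modeled aqueous Al declines more slowly than an equilibrium model would predict, which is the behavior the column experiment exhibited.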

  1. Generalised additive modelling approach to the fermentation process of glutamate.

    PubMed

    Liu, Chun-Bo; Li, Yun; Pan, Feng; Shi, Zhong-Ping

    2011-03-01

    In this work, generalised additive models (GAMs) were used for the first time to model the fermentation of glutamate (Glu). It was found that three fermentation parameters, fermentation time (T), dissolved oxygen (DO) and oxygen uptake rate (OUR), could capture 97% of the variance in the production of Glu during the fermentation process through a GAM model calibrated using online data from 15 fermentation experiments. This model was applied to investigate the individual and combined effects of T, DO and OUR on the production of Glu. The conditions to optimize the fermentation process were proposed based on the simulation study from this model. Results suggested that the production of Glu can reach a high level by controlling concentration levels of DO and OUR to the proposed optimization conditions during the fermentation process. The GAM approach therefore provides an alternative way to model and optimize the fermentation process of Glu.
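
The additive structure underlying a GAM, y = alpha + f1(x1) + f2(x2), is typically estimated by backfitting: each smooth term is refitted to the partial residuals of the others until convergence. The sketch below uses a crude running-mean smoother in place of the penalized splines a real GAM implementation (e.g. mgcv or pyGAM) would use, and synthetic data rather than the paper's fermentation measurements.

```python
# Minimal backfitting sketch for an additive model y = alpha + f1(x1) + f2(x2).
# The running-mean smoother and the data are illustrative stand-ins.

def smooth(x, r, window=3):
    """Running-mean smoother of residuals r against predictor x (toy spline stand-in)."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    fitted = [0.0] * len(x)
    for pos, i in enumerate(order):
        lo = max(0, pos - window // 2)
        hi = min(len(order), pos + window // 2 + 1)
        fitted[i] = sum(r[order[j]] for j in range(lo, hi)) / (hi - lo)
    return fitted

def backfit(x1, x2, y, iters=20):
    """Cycle over the smooth terms, refitting each to the partial residuals."""
    n = len(y)
    alpha = sum(y) / n
    f1 = [0.0] * n
    f2 = [0.0] * n
    for _ in range(iters):
        r1 = [y[i] - alpha - f2[i] for i in range(n)]
        f1 = smooth(x1, r1)
        m1 = sum(f1) / n
        f1 = [v - m1 for v in f1]  # center each term for identifiability
        r2 = [y[i] - alpha - f1[i] for i in range(n)]
        f2 = smooth(x2, r2)
        m2 = sum(f2) / n
        f2 = [v - m2 for v in f2]
    return alpha, f1, f2

# Synthetic additive response on two hypothetical process variables:
x1 = [float(i) for i in range(10)]
x2 = [float((3 * i) % 10) for i in range(10)]
y = [2.0 + 0.1 * (a - 4.5) ** 2 + 0.3 * b for a, b in zip(x1, x2)]
alpha, f1, f2 = backfit(x1, x2, y)
fitted = [alpha + f1[i] + f2[i] for i in range(10)]
```

Because each term is a smooth of one predictor, the fitted `f1` and `f2` can be inspected separately, which is what lets a GAM report the individual effects of T, DO, and OUR.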

  2. Structural equation modeling in environmental risk assessment.

    PubMed Central

    Buncher, C R; Succop, P A; Dietrich, K N

    1991-01-01

    Environmental epidemiology requires effective models that take individual observations of environmental factors and connect them into meaningful patterns. Single-factor relationships have given way to multivariable analyses; simple additive models have been augmented by multiplicative (logistic) models. Each of these steps has produced greater enlightenment and understanding. Models that allow for factors causing outputs that can affect later outputs with putative causation working at several different time points (e.g., linkage) are not commonly used in the environmental literature. Structural equation models are a class of covariance structure models that have been used extensively in economics/business and social science but are still little used in the realm of biostatistics. Path analysis in genetic studies is one simplified form of this class of models. We have been using these models in a study of the health and development of infants who have been exposed to lead in utero and in the postnatal home environment. These models require as input the directionality of the relationship and then produce fitted models for multiple inputs causing each factor and the opportunity to have outputs serve as input variables into the next phase of the simultaneously fitted model. Some examples of these models from our research are presented to increase familiarity with this class of models. Use of these models can provide insight into the effect of changing an environmental factor when assessing risk. The usual cautions concerning believing a model, believing causation has been proven, and the assumptions that are required for each model are operative. PMID:2050063

  3. Validation of transport models using additive flux minimization technique

    NASA Astrophysics Data System (ADS)

    Pankin, A. Y.; Kruger, S. E.; Groebner, R. J.; Hakim, A.; Kritz, A. H.; Rafiq, T.

    2013-10-01

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V&V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V&V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V&V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
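
The additive-flux idea can be reduced to a one-parameter optimization: add an effective diffusivity to the model's transport coefficient and choose the value that minimizes the mismatch with the experimental profile. The steady-state profile model, the target data, and the grid search below are contrived placeholders standing in for the FACETS::Core/DAKOTA workflow.

```python
# Sketch of additive flux minimization: find the additional diffusivity
# D_add that best matches a target profile. All models and data are toy.

def predicted_profile(d_model, d_add, flux, positions):
    """Steady-state n(x) under constant flux: gradient = flux / (D_model + D_add)."""
    grad = flux / (d_model + d_add)
    return [grad * x for x in positions]

def mismatch(d_add, observed, positions, d_model=1.0, flux=4.0):
    pred = predicted_profile(d_model, d_add, flux, positions)
    return sum((p - o) ** 2 for p, o in zip(pred, observed))

positions = [0.0, 0.25, 0.5, 0.75, 1.0]
# "Experimental" profile consistent with a total diffusivity of 2.0:
observed = [4.0 / 2.0 * x for x in positions]

# Crude grid search over D_add (a stand-in for DAKOTA's optimizers):
candidates = [i * 0.1 for i in range(31)]
best = min(candidates, key=lambda d: mismatch(d, observed, positions))
```

A large optimal `best` at some location signals that the tested transport model underpredicts the flux there, which is how the technique diagnoses the pedestal-top deficit described above.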

  4. The apolipoprotein epsilon4 allele confers additional risk in children with familial hypercholesterolemia.

    PubMed

    Wiegman, Albert; Sijbrands, Eric J G; Rodenburg, Jessica; Defesche, Joep C; de Jongh, Saskia; Bakker, Henk D; Kastelein, John J P

    2003-06-01

    Children with familial hypercholesterolemia (FH) exhibit substantial variance of LDL cholesterol. In previous studies, family members of children with FH were included, which may have influenced results. To avoid such bias, we studied phenotype in 450 unrelated children with FH and in 154 affected sib-pairs. In known families with classical FH, diagnosis was based on plasma LDL cholesterol above the age- and gender-specific 95th percentile. Girls had 0.47 +/- 0.15 mmol/L higher LDL cholesterol, compared with boys (p = 0.002). Also in girls, HDL cholesterol increased by 0.07 +/- 0.03 mmol/L per 5 y (p for trend = 0.005); this age effect was not observed in boys. The distribution of apolipoprotein (apo) E genotypes was not significantly different between probands, their paired affected siblings, or a Dutch control population. Carriers with or without one epsilon4 allele had similar LDL and HDL cholesterol levels. Within the affected sib-pairs, the epsilon4 allele explained 72.4% of the variance of HDL cholesterol levels (-0.15 mmol/L, 95% confidence interval -0.24 to -0.05, p = 0.003). The effect of apoE4 on HDL cholesterol differed with an analysis based on probands or on affected sib-pairs. The affected sib-pair model adjusted for shared environment, type of LDL receptor gene mutation, and a proportion of additional genetic factors and may, therefore, be more accurate in estimating effects of risk factors on complex traits. We conclude that the epsilon4 allele was associated with lower HDL cholesterol levels in an affected sib-pair analysis, which strongly suggests that apoE4 influences HDL cholesterol levels in FH children. Moreover, the strong association suggests that apoE4 carries an additional disadvantage for FH children.

  5. Additive pressor effects of caffeine and stress in male medical students at risk for hypertension.

    PubMed

    Shepard, J D; al'Absi, M; Whitsett, T L; Passey, R B; Lovallo, W R

    2000-05-01

    The effects of caffeine on blood pressure (BP) and cortisol secretion were examined during elevated work stress in medical students at high versus low risk for hypertension. Among 31 male medical students who were regular consumers of caffeine, 20 were considered at low risk for hypertension (negative parental history and all screening BP < 125/78 mm Hg) and 11 at high risk based on epidemiologic criteria (positive parental history and average screening BPs between 125/78 and 139/89 mm Hg). Cortisol levels and ambulatory BP were measured with and without caffeine during two lectures (low work stress) and two exams (high work stress) in a randomized, double-blind, crossover trial. Caffeine consumption and exam stress increased cortisol secretion in both groups (P < .05). BP increased with caffeine or exam stress in both groups, low versus high risk, respectively (Caffeine: + 5/4 vs + 3/3 mm Hg; Stress: + 4/1 vs + 7/3 mm Hg; P < .05). The combination of stress and caffeine caused additive increases in BP (Low Risk + 9/5 mm Hg, High Risk + 10/6 mm Hg) such that 46% of high-risk participants had average systolic BP > or = 140 mm Hg. This combined effect of stress and caffeine on BP suggests that it may be beneficial for individuals at high risk for hypertension to refrain from the use of caffeinated beverages, particularly at times when work demands and attendant stressors are high. For the same reasons, recent intake of caffeine should be controlled in patients undergoing BP measurement for the diagnosis of hypertension.

  6. Modeling Research Project Risks with Fuzzy Maps

    ERIC Educational Resources Information Center

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risks evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taken into account their typical lifecycle. The model was applied to an e-testing research…

  7. Assessing the additive risks of PSII herbicide exposure to the Great Barrier Reef.

    PubMed

    Lewis, Stephen E; Schaffelke, Britta; Shaw, Melanie; Bainbridge, Zoë T; Rohde, Ken W; Kennedy, Karen; Davis, Aaron M; Masters, Bronwyn L; Devlin, Michelle J; Mueller, Jochen F; Brodie, Jon E

    2012-01-01

    Herbicide residues have been measured in the Great Barrier Reef lagoon at concentrations which have the potential to harm marine plant communities. Monitoring of the Great Barrier Reef lagoon following wet season discharge shows that when herbicides are detected, more than one is present 80% of the time. These herbicides have been shown to act in an additive manner with regard to photosystem-II inhibition. In this study, the area of the Great Barrier Reef considered to be at risk from herbicides is compared when exposures are considered for each herbicide individually and also for herbicide mixtures. Two normalisation indices for herbicide mixtures were calculated based on current guidelines and PSII inhibition thresholds. The results show that the area of risk for most regions is greatly increased under the proposed additive PSII inhibition threshold and that the resilience of this important ecosystem could be reduced by exposure to these herbicides.
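
The additive (concentration-addition) index for PSII herbicide mixtures can be sketched directly: normalize each measured residue by its effect threshold and sum the ratios; a total at or above 1 flags the site as at risk even when every individual residue is below its own threshold. The concentrations and thresholds below are hypothetical, not Great Barrier Reef monitoring data.

```python
# Additive risk quotient for a PSII herbicide mixture (concentration addition).
# Thresholds and sample concentrations (ug/L) are hypothetical.
PSII_THRESHOLDS_UG_L = {"diuron": 0.9, "atrazine": 0.6, "hexazinone": 1.2}

def additive_risk_quotient(measured_ug_l):
    """Sum of concentration/threshold ratios across the detected herbicides."""
    return sum(
        conc / PSII_THRESHOLDS_UG_L[herbicide]
        for herbicide, conc in measured_ug_l.items()
    )

sample = {"diuron": 0.45, "atrazine": 0.30, "hexazinone": 0.12}
rq = additive_risk_quotient(sample)
individually_safe = all(c < PSII_THRESHOLDS_UG_L[h] for h, c in sample.items())
mixture_at_risk = rq >= 1.0
```

In this example every residue sits below its own threshold yet the mixture quotient exceeds 1, which is exactly why the at-risk area grows when mixtures are assessed additively rather than herbicide by herbicide.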

  8. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  9. Additional Treatments for High-Risk Obstetric Antiphospholipid Syndrome: a Comprehensive Review.

    PubMed

    Ruffatti, Amelia; Hoxha, Ariela; Favaro, Maria; Tonello, Marta; Colpo, Anna; Cucchini, Umberto; Banzato, Alessandra; Pengo, Vittorio

    2016-06-25

    Most investigators currently advocate prophylactic-dose heparin plus low-dose aspirin as the preferred treatment of otherwise healthy women with obstetric antiphospholipid syndrome, whilst women with a history of vascular thrombosis, alone or associated with pregnancy morbidity, are usually treated with therapeutic heparin doses in association with low-dose aspirin in an attempt to prevent both thrombosis and pregnancy morbidity. However, the protocols outlined above fail in about 20% of pregnant women with antiphospholipid syndrome. Identifying risk factors associated with pregnancy failure when conventional therapies are utilized is an important step in establishing guidelines to manage these high-risk patients. Some clinical and laboratory risk factors have been found to be related to maternal-foetal complications in pregnant women on conventional therapy. However, the most efficacious treatments to administer to high-risk antiphospholipid syndrome women in addition to conventional therapy, in order to avoid pregnancy complications, have yet to be established. This is a comprehensive review of this topic and an invitation to participate in a multicentre study aimed at identifying the best additional treatments for this subset of antiphospholipid syndrome patients.

  10. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods that provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as in AAM processes. AAM encompasses a class of processes that use a focused heat source to create a melt pool on a substrate; examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions using numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are inherently multiphysics and multiscale. In this project, the research work modeled AAM processes with a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes: a sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free-surface flow, solute redistribution, and surface tension. Because of the excessive computing time required, a parallel computing approach was also tested.

  11. [The real-world effectiveness of personal protective equipment and additional risks for workers' health].

    PubMed

    Denisov, É I; Morozova, T V; Adeninskaia, E E; Kur'erov, N N

    2013-01-01

    The effectiveness of personal protective equipment (PPE) for hearing, the respiratory organs, and the hands is considered. It is shown that the real-world effectiveness of PPE is about half that declared by the supplier, which argues for a derating system. Aspects of discomfort and additional risks are analyzed. Hygienic and physiological evaluation of PPE is required, along with the elaboration of an official document (an OSH standard or sanitary regulation) on the selection, personal fit, organization of use, and individual training and motivation of workers.

  12. MTHFR homozygous mutation and additional risk factors for cerebral infarction in a large Italian family.

    PubMed

    Del Balzo, Francesca; Spalice, Alberto; Perla, Massimo; Properzi, Enrico; Iannetti, Paola

    2009-01-01

    Several cases with cerebral infarctions associated with the C677T mutation in the methylenetetrahydrofolate reductase gene (MTHFR) have been reported. Given the large number of asymptomatic individuals with the MTHFR mutation, additional risk factors for cerebral infarction should be considered. This study describes a large family with the MTHFR mutation and a combination of heterozygous factor V Leiden mutations and different additional exogenous and endogenous thrombogenic risk factors. Psychomotor retardation and a left fronto-insular infarct associated with the MTHFR mutation together with diminished factor VII and low level of protein C was documented in the first patient. In the second patient, generalized epilepsy and a malacic area in the right nucleus lenticularis was associated with the MTHFR mutation and a low level of protein C. In the third patient, right hemiparesis and a left fronto-temporal porencephalic cyst were documented, together with the MTHFR mutation and hyperhomocysteinemia. An extensive search of additional circumstantial and genetic thrombogenic risk factors should be useful for prophylaxis and prognosis of infants with cerebral infarctions associated with the MTHFR mutation and of their related family members.

  13. Addition Table of Colours: Additive and Subtractive Mixtures Described Using a Single Reasoning Model

    ERIC Educational Resources Information Center

    Mota, A. R.; Lopes dos Santos, J. M. B.

    2014-01-01

    Students' misconceptions concerning colour phenomena and the apparent complexity of the underlying concepts--due to the different domains of knowledge involved--make its teaching very difficult. We have developed and tested a teaching device, the addition table of colours (ATC), that encompasses additive and subtractive mixtures in a single…

  14. Relative risk regression models with inverse polynomials.

    PubMed

    Ning, Yang; Woodward, Mark

    2013-08-30

    The proportional hazards model assumes that the log hazard ratio is a linear function of parameters. In the current paper, we model the log relative risk as an inverse polynomial, which is particularly suitable for modeling bounded and asymmetric functions. The parameters estimated by maximizing the partial likelihood are consistent and asymptotically normal. The advantages of the inverse polynomial model over the ordinary polynomial model and the fractional polynomial model for fitting various asymmetric log relative risk functions are shown by simulation. The utility of the method is further supported by analyzing two real data sets, addressing the specific question of the location of the minimum risk threshold.
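
    For illustration, the simplest (first-order) inverse polynomial for the log relative risk is log RR(x) = x / (a + b·x): it is zero at baseline, asymmetric, and bounded above by 1/b, which is exactly the behaviour the abstract highlights. The coefficients below are arbitrary, not fitted values from the paper.

```python
import math

# First-order inverse polynomial for the log relative risk:
# log RR(x) = x / (a + b*x). It is 0 at x = 0 (so RR = 1 at baseline),
# rises asymmetrically, and saturates at 1/b, unlike an ordinary
# polynomial, which is unbounded. Coefficients are illustrative only.
def log_rr(x, a=1.0, b=0.5):
    return x / (a + b * x)

print(math.exp(log_rr(0.0)))        # 1.0 -> relative risk 1 at baseline
print(log_rr(1000.0) < 1.0 / 0.5)   # True -> bounded above by 1/b
```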

  15. Sensitivity analysis of geometric errors in additive manufacturing medical models.

    PubMed

    Pinto, Jose Miguel; Arrieta, Cristobal; Andia, Marcelo E; Uribe, Sergio; Ramos-Grez, Jorge; Vargas, Alex; Irarrazaval, Pablo; Tejos, Cristian

    2015-03-01

    Additive manufacturing (AM) models are used in medical applications for surgical planning, prosthesis design and teaching. For these applications, the accuracy of the AM models is essential. Unfortunately, this accuracy is compromised due to errors introduced by each of the building steps: image acquisition, segmentation, triangulation, printing and infiltration. However, the contribution of each step to the final error remains unclear. We performed a sensitivity analysis comparing errors obtained from a reference with those obtained modifying parameters of each building step. Our analysis considered global indexes to evaluate the overall error, and local indexes to show how this error is distributed along the surface of the AM models. Our results show that the standard building process tends to overestimate the AM models, i.e. models are larger than the original structures. They also show that the triangulation resolution and the segmentation threshold are critical factors, and that the errors are concentrated at regions with high curvatures. Errors could be reduced choosing better triangulation and printing resolutions, but there is an important need for modifying some of the standard building processes, particularly the segmentation algorithms.

  16. Additive Manufacturing of Medical Models--Applications in Rhinology.

    PubMed

    Raos, Pero; Klapan, Ivica; Galeta, Tomislav

    2015-09-01

    In this paper we introduce guidelines and suggestions for the use of 3D image-processing software in head pathology diagnostics, and procedures for obtaining physical medical models by additive manufacturing/rapid prototyping techniques, with the aims of improving surgical performance, maximizing its safety, and speeding patients' postoperative recovery. This approach has been verified in two case reports. In the treatment we used intelligent classifier schemes for abnormal patterns in a computer-based system for 3D virtual and endoscopic assistance in rhinology, with appropriate visualization of anatomy and pathology within the nose, paranasal sinuses, and skull base area.

  17. Multiscale Modeling of Powder Bed-Based Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Markl, Matthias; Körner, Carolin

    2016-07-01

    Powder bed fusion processes are additive manufacturing technologies that are expected to induce the third industrial revolution. Components are built up layer by layer in a powder bed by selectively melting confined areas, according to sliced 3D model data. This technique allows for manufacturing of highly complex geometries hardly machinable with conventional technologies. However, the underlying physical phenomena are sparsely understood and difficult to observe during processing. Therefore, an intensive and expensive trial-and-error principle is applied to produce components with the desired dimensional accuracy, material characteristics, and mechanical properties. This review presents numerical modeling approaches on multiple length scales and timescales to describe different aspects of powder bed fusion processes. In combination with tailored experiments, the numerical results enlarge the process understanding of the underlying physical mechanisms and support the development of suitable process strategies and component topologies.

  18. Additive Functions in Boolean Models of Gene Regulatory Network Modules

    PubMed Central

    Darabos, Christian; Di Cunto, Ferdinando; Tomassini, Marco; Moore, Jason H.; Provero, Paolo; Giacobini, Mario

    2011-01-01

    Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interaction function of each gene is believed to play a key role in the stability of the structure. With advances in biology, effort has been devoted to developing update functions in Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell-cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. Results of this validation hint at increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos, through the critical regime, using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, that also allows the regimes to be discriminated quantitatively. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful to guide experimental research. The update function confers additional realism to the model, while reducing the complexity…
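
    A minimal sketch of such a threshold-based update rule (my illustration, not the authors' implementation): each gene sums the signs of its currently active regulators, switches on for a positive total, off for a negative total, and keeps its state on a tie.

```python
# Threshold-based Boolean update: activators contribute +1, repressors -1,
# but only when their source gene is currently active.
def threshold_update(state, regulators):
    """state: dict gene -> 0/1; regulators: gene -> list of (source, sign)."""
    new_state = {}
    for gene, inputs in regulators.items():
        total = sum(sign for source, sign in inputs if state[source])
        if total > 0:
            new_state[gene] = 1
        elif total < 0:
            new_state[gene] = 0
        else:
            new_state[gene] = state[gene]  # tie: keep current state
    return new_state

# Toy three-gene module: A activates B, B activates C, C represses A.
net = {"A": [("C", -1)], "B": [("A", +1)], "C": [("B", +1)]}
s = {"A": 1, "B": 0, "C": 0}
s = threshold_update(s, net)
print(s)  # {'A': 1, 'B': 1, 'C': 0}
```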

  20. WATEQ3 geochemical model: thermodynamic data for several additional solids

    SciTech Connect

    Krupka, K.M.; Jenne, E.A.

    1982-09-01

    Geochemical models such as WATEQ3 can be used to model the concentrations of water-soluble pollutants that may result from the disposal of nuclear waste and retorted oil shale. However, for a model to competently deal with these water-soluble pollutants, an adequate thermodynamic data base must be provided that includes elements identified as important in modeling these pollutants. To this end, several minerals and related solid phases were identified that were absent from the thermodynamic data base of WATEQ3. In this study, the thermodynamic data for the identified solids were compiled and selected from several published tabulations of thermodynamic data. For these solids, an accepted Gibbs free energy of formation, ΔG°f,298, was selected for each solid phase based on the recency of the tabulated data and on considerations of internal consistency with respect to both the published tabulations and the existing data in WATEQ3. For those solids not included in these published tabulations, Gibbs free energies of formation were calculated from published solubility data (e.g., lepidocrocite), or were estimated (e.g., nontronite) using a free-energy summation method described by Mattigod and Sposito (1978). The accepted or estimated free energies were then combined with internally consistent, ancillary thermodynamic data to calculate equilibrium constants for the hydrolysis reactions of these minerals and related solid phases. Including these values in the WATEQ3 data base increased the competency of this geochemical model in applications associated with the disposal of nuclear waste and retorted oil shale. Additional minerals and related solid phases that need to be added to the solubility submodel will be identified as modeling applications continue in these two programs.
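
    The free-energy-to-equilibrium-constant step described above can be sketched as follows. The reaction free energy is the stoichiometric sum of formation free energies, and log₁₀K follows from ΔG°r = −RT ln K; the species and free-energy values below are placeholders, not WATEQ3 database entries.

```python
import math

# Convert Gibbs free energies of formation into an equilibrium constant:
# dG_reaction = sum(dG_f, products) - sum(dG_f, reactants)
# log10 K    = -dG_reaction / (R * T * ln 10)
R = 8.314462618e-3  # gas constant, kJ/(mol*K)
T = 298.15          # standard temperature, K

def log10_K(delta_g_products, delta_g_reactants):
    dg_reaction = sum(delta_g_products) - sum(delta_g_reactants)  # kJ/mol
    return -dg_reaction / (R * T * math.log(10))

# Placeholder formation free energies (kJ/mol) for a reaction A -> B + C.
print(round(log10_K([-100.0, -50.0], [-160.0]), 3))  # -1.752
```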

  1. Requirements based system risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements. We assume a complete list of the requirements, the relevant risk elements and their probability of occurrence and the quantified effect of the risk elements on the requirements. In order to assess the degree to which each requirement is satisfied, we need to determine the effect of the various risk elements on the requirement.
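
    One way to make this aggregation concrete (a sketch under assumed independent risk elements and a simple weighted-average rule; the paper's exact formulation may differ):

```python
# Each requirement's satisfaction degree is degraded, in expectation, by
# every risk element that can affect it; mission success is then the
# weight-averaged satisfaction across requirements.
def satisfaction_after_risks(risk_probs, impacts):
    """Expected satisfaction of one requirement in [0, 1], assuming
    independent risk elements, each with an occurrence probability and a
    fractional impact on the requirement."""
    s = 1.0
    for p, impact in zip(risk_probs, impacts):
        s *= 1.0 - p * impact  # expected multiplicative degradation
    return s

def expected_success(weights, satisfaction):
    """Weighted average of per-requirement satisfaction degrees."""
    return sum(w * s for w, s in zip(weights, satisfaction)) / sum(weights)

s1 = satisfaction_after_risks([0.1, 0.3], [0.5, 0.2])  # requirement 1
s2 = satisfaction_after_risks([0.05], [1.0])           # requirement 2
print(round(expected_success([2.0, 1.0], [s1, s2]), 3))  # 0.912
```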

  2. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  3. Risk assessment of additives through soft drinks and nectars consumption on Portuguese population: a 2010 survey.

    PubMed

    Diogo, Janina S G; Silva, Liliana S O; Pena, Angelina; Lino, Celeste M

    2013-12-01

    This study investigated whether the Portuguese population is at risk of exceeding ADI levels for acesulfame-K, saccharin, aspartame, caffeine, benzoic and sorbic acid through an assessment of dietary intake of additives and specific consumption of four types of beverages, traditional soft drinks and soft drinks based on mineral waters, energetic drinks, and nectars. The highest mean levels of additives were found for caffeine in energetic drinks, 293.5mg/L, for saccharin in traditional soft drinks, 18.4 mg/L, for acesulfame-K and aspartame in nectars, with 88.2 and 97.8 mg/L, respectively, for benzoic acid in traditional soft drinks, 125.7 mg/L, and for sorbic acid in soft drinks based on mineral water, 166.5 mg/L. Traditional soft drinks presented the highest acceptable daily intake percentages (ADIs%) for acesulfame-K, aspartame, benzoic and sorbic acid and similar value for saccharin (0.5%) when compared with soft drinks based on mineral water, 0.7%, 0.08%, 7.3%, and 1.92% versus 0.2%, 0.053%, 0.6%, and 0.28%, respectively. However for saccharin the highest percentage of ADI was obtained for nectars, 0.9%, in comparison with both types of soft drinks, 0.5%. Therefore, it is concluded that the Portuguese population is not at risk of exceeding the established ADIs for the studied additives.

  4. Increased risk of delayed cerebral ischemia in subarachnoid hemorrhage patients with additional intracerebral hematoma.

    PubMed

    Platz, Johannes; Güresir, Erdem; Wagner, Marlies; Seifert, Volker; Konczalla, Juergen

    2017-02-01

    OBJECTIVE Delayed cerebral ischemia (DCI) has a major impact on the outcome of patients suffering from aneurysmal subarachnoid hemorrhage (SAH). The aim of this study was to assess the influence of an additional intracerebral hematoma (ICH) on the occurrence of DCI. METHODS The authors conducted a single-center retrospective analysis of cases of SAH involving patients treated between 2006 and 2011. Patients who died or were transferred to another institution within 10 days after SAH without the occurrence of DCI were excluded from the analysis. RESULTS Additional ICH was present in 123 (24.4%) of 504 included patients (66.7% female). ICH was classified as frontal in 72 patients, temporal in 24, and perisylvian in 27. DCI occurred in 183 patients (36.3%). A total of 59 (32.2%) of these 183 patients presented with additional ICH, compared with 64 (19.9%) of the 321 without DCI (p = 0.002). In addition, DCI was detected significantly more frequently in patients with higher World Federation of Neurosurgical Societies (WFNS) grades. The authors compared the original and modified Fisher Scales with respect to the occurrence of DCI. The modified Fisher Scale (mFS) was superior to the original Fisher Scale (oFS) in predicting DCI. Furthermore, they suggest a new classification based on the mFS, which demonstrates the impact of additional ICH on the occurrence of DCI. After the different scales were corrected for age, sex, WFNS score, and aneurysm site, the oFS no longer was predictive for the occurrence of DCI, while the new scale demonstrated a superior capacity for prediction as compared with the mFS. CONCLUSIONS Additional ICH was associated with an increased risk of DCI in this study. Furthermore, adding the presence or absence of ICH to the mFS improved the identification of patients at the highest risk for the development of DCI. Thus, a simple adjustment of the mFS might help to identify patients at high risk for DCI.

  5. Estimation of propensity scores using generalized additive models.

    PubMed

    Woo, Mi-Ja; Reiter, Jerome P; Karr, Alan F

    2008-08-30

    Propensity score matching is often used in observational studies to create treatment and control groups with similar distributions of observed covariates. Typically, propensity scores are estimated using logistic regressions that assume linearity between the logistic link and the predictors. We evaluate the use of generalized additive models (GAMs) for estimating propensity scores. We compare logistic regressions and GAMs in terms of balancing covariates using simulation studies with artificial and genuine data. We find that, when the distributions of covariates in the treatment and control groups overlap sufficiently, using GAMs can improve overall covariate balance, especially for higher-order moments of distributions. When the distributions in the two groups overlap insufficiently, GAM more clearly reveals this fact than logistic regression does. We also demonstrate via simulation that matching with GAMs can result in larger reductions in bias when estimating treatment effects than matching with logistic regression.
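
    The idea can be sketched from scratch, without any statistics library: a GAM propensity model is, at heart, a logistic regression on a flexible basis expansion of each covariate. The basis, simulated data, and optimizer below are all illustrative choices, not the authors' implementation.

```python
import math
import random

# GAM-style propensity scores: logistic regression on a spline basis.
def spline_basis(x, knots=(-1.0, 0.0, 1.0)):
    """Cubic truncated-power basis: 1, x, x^2, x^3, and (x-k)^3_+ per knot."""
    return [1.0, x, x * x, x ** 3] + [max(0.0, x - k) ** 3 for k in knots]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))

def fit_logistic(rows, labels, lr=0.01, epochs=300):
    """Plain stochastic gradient ascent on the logistic log-likelihood."""
    w = [0.0] * len(rows[0])
    for _ in range(epochs):
        for xs, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, xs)))
            w = [wi + lr * (y - p) * xi for wi, xi in zip(w, xs)]
    return w

def propensity(w, x):
    return sigmoid(sum(wi * bi for wi, bi in zip(w, spline_basis(x))))

# Simulate treatment assignment that depends nonlinearly (U-shaped) on x:
# a purely linear logistic model cannot represent this; a spline basis can.
random.seed(0)
xs = [random.uniform(-2.0, 2.0) for _ in range(400)]
ys = [1 if random.random() < sigmoid(x * x - 1.0) else 0 for x in xs]
w = fit_logistic([spline_basis(x) for x in xs], ys)

print(propensity(w, 0.0) < propensity(w, 1.8))  # did the fit recover the U-shape?
```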

  6. [A critique of the additive model of the randomized controlled trial].

    PubMed

    Boussageon, Rémy; Gueyffier, François; Bejan-Angoulvant, Theodora; Felden-Dominiak, Géraldine

    2008-01-01

    Randomized, double-blind, placebo-controlled clinical trials are currently the best way to demonstrate the clinical effectiveness of drugs. Their methodology relies on the method of difference (John Stuart Mill), through which the observed difference between two groups (drug vs placebo) can be attributed to the pharmacological effect of the drug being tested. However, this additive model can be questioned in the event of statistical interactions between the pharmacological and the placebo effects. Evidence in different domains has shown that the placebo effect can influence the effect of the active principle. This article evaluates the methodological, clinical and epistemological consequences of this phenomenon. Topics treated include extrapolating results, accounting for heterogeneous results, demonstrating the existence of several factors in the placebo effect, the need to take these factors into account for given symptoms or pathologies, as well as the problem of the "specific" effect.

  7. Root Caries Risk Indicators: A Systematic Review of Risk Models

    PubMed Central

    Ritter, André V.; Shugars, Daniel A.; Bader, James D.

    2010-01-01

    Objective To identify risk indicators that are associated with root caries incidence in published predictive risk models. Methods Abstracts (n=472) identified from a MEDLINE, EMBASE, and Cochrane registry search were screened independently by two investigators to exclude articles not in English (n=39), published prior to 1970 (none), or containing no information on either root caries incidence, risk indicators, or risk models (n=209). A full-article duplicate review of the remaining articles (n=224) selected those reporting predictive risk models based on original/primary longitudinal root caries incidence studies. The quality of the included articles was assessed based both on selected criteria of methodological standards for observational studies and on the statistical quality of the modeling strategy. Data from these included studies were extracted and compiled into evidence tables, which included information about the cohort location, incidence period, sample size, age of the study participants, risk indicators included in the model, root caries incidence, modeling strategy, significant risk indicators/predictors, and parameter estimates and statistical findings. Results Thirteen articles were selected for data extraction. The overall quality of the included articles was poor to moderate. Root caries incidence ranged from 12% to 77% (mean±SD=45%±17%); follow-up time of the published studies was ≤10 years (range=9; median=3); sample size ranged from 23 to 723 (mean±SD=264±203; median=261); person-years ranged from 23 to 1540 (mean±SD=760±556; median=746). Variables most frequently tested and significantly associated with root caries incidence were (times tested; % significant; directionality): baseline root caries (12; 58%; positive); number of teeth (7; 71%; 3 times positive, twice negative), and plaque index (4; 100%; positive). Ninety-two other clinical and non-clinical variables were tested: 27 were tested 3 times or more and were significant between 9…

  8. Relative Importance and Additive Effects of Maternal and Infant Risk Factors on Childhood Asthma

    PubMed Central

    Rosas-Salazar, Christian; James, Kristina; Escobar, Gabriel; Gebretsadik, Tebeb; Li, Sherian Xu; Carroll, Kecia N.; Walsh, Eileen; Mitchel, Edward; Das, Suman; Kumar, Rajesh; Yu, Chang; Dupont, William D.; Hartert, Tina V.

    2016-01-01

    Background Environmental exposures that occur in utero and during early life may contribute to the development of childhood asthma through alteration of the human microbiome. The objectives of this study were to estimate the cumulative effect and relative importance of environmental exposures on the risk of childhood asthma. Methods We conducted a population-based birth cohort study of mother-child dyads who were born between 1995 and 2003 and were continuously enrolled in the PRIMA (Prevention of RSV: Impact on Morbidity and Asthma) cohort. The individual and cumulative impact of maternal urinary tract infections (UTI) during pregnancy, maternal colonization with group B streptococcus (GBS), mode of delivery, infant antibiotic use, and older siblings at home on the risk of childhood asthma was estimated using logistic regression. A dose-response effect on childhood asthma risk was assessed for the continuous risk factors: number of maternal UTIs during pregnancy, courses of infant antibiotics, and number of older siblings at home. We further assessed and compared the relative importance of these exposures on the asthma risk. In a subgroup of children for whom information on maternal antibiotic use during pregnancy was available, the effect of maternal antibiotic use on the risk of childhood asthma was estimated. Results Among 136,098 singleton birth infants, 13.29% developed asthma. In both univariate and adjusted analyses, maternal UTI during pregnancy (odds ratio [OR] 1.2, 95% confidence interval [CI] 1.18, 1.25; adjusted OR [AOR] 1.04, 95%CI 1.02, 1.07 for every additional UTI) and infant antibiotic use (OR 1.21, 95%CI 1.20, 1.22; AOR 1.16, 95%CI 1.15, 1.17 for every additional course) were associated with an increased risk of childhood asthma, while having older siblings at home (OR 0.92, 95%CI 0.91, 0.93; AOR 0.85, 95%CI 0.84, 0.87 for each additional sibling) was associated with a decreased risk of childhood asthma, in a dose-dependent manner.
Compared with vaginal…
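
    The per-unit adjusted odds ratios reported above imply a multiplicative dose-response on the odds scale; this sketch simply composes them for a hypothetical child, assuming the per-unit effects combine with no interaction (the 1.16 per antibiotic course and 0.85 per older sibling are the abstract's adjusted ORs; the scenario is invented).

```python
# Compose per-unit adjusted odds ratios multiplicatively: each additional
# unit of a risk factor scales the odds by its per-unit OR, so n units
# contribute OR ** n (assuming effects are additive on the log-odds scale).
def combined_or(per_unit_ors, counts):
    odds_ratio = 1.0
    for per_unit, n in zip(per_unit_ors, counts):
        odds_ratio *= per_unit ** n
    return odds_ratio

# Hypothetical child: three antibiotic courses and one older sibling.
print(round(combined_or([1.16, 0.85], [3, 1]), 3))  # 1.327
```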

  9. THE COMBINED CARCINOGENIC RISK FOR EXPOSURE TO MIXTURES OF DRINKING WATER DISINFECTION BY-PRODUCTS MAY BE LESS THAN ADDITIVE

    EPA Science Inventory

    The Combined Carcinogenic Risk for Exposure to Mixtures of Drinking Water Disinfection By-Products May be Less Than Additive

    Risk assessment methods for chemical mixtures in drinking water are not well defined. Current default risk assessments for chemical mixtures assume...

  10. Individualized Risk Prediction Model for Lung Cancer in Korean Men

    PubMed Central

    Park, Sohee; Nam, Byung-Ho; Yang, Hye-Ryung; Lee, Ji An; Lim, Hyunsun; Han, Jun Tae; Park, Il Su; Shin, Hai-Rim; Lee, Jin Soo

    2013-01-01

    Purpose Lung cancer is the leading cause of cancer deaths in Korea. The objective of the present study was to develop an individualized risk prediction model for lung cancer in Korean men using population-based cohort data. Methods From a population-based cohort study of 1,324,804 Korean men free of cancer at baseline, the individualized absolute risk of developing lung cancer was estimated using the Cox proportional hazards model. We checked the validity of the model using C statistics and the Hosmer–Lemeshow chi-square test on an external validation dataset. Results The risk prediction model for lung cancer in Korean men included smoking exposure, age at smoking initiation, body mass index, physical activity, and fasting glucose levels. The model showed excellent performance (C statistic = 0.871, 95% CI = 0.867–0.876). Smoking was significantly associated with the risk of lung cancer in Korean men, with a four-fold increased risk in current smokers consuming more than one pack a day relative to non-smokers. Age at smoking initiation was also a significant predictor for developing lung cancer; a younger age at initiation was associated with a higher risk of developing lung cancer. Conclusion This is the first study to provide an individualized risk prediction model for lung cancer in an Asian population with very good model performance. In addition to current smoking status, earlier exposure to smoking was a very important factor for developing lung cancer. Since most of the risk factors are modifiable, this model can be used to identify those who are at a higher risk and who can subsequently modify their lifestyle choices to lower their risk of lung cancer. PMID:23408946
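    The C statistic used to validate the model above is Harrell's concordance index for survival data. A minimal sketch on toy data (illustrative only, not the study's cohort):

```python
def harrell_c(times, events, risk_scores):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is comparable when the subject with the shorter time
    had an observed event; the pair is concordant when that subject also
    has the higher predicted risk score. Tied scores count 1/2.
    """
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# Toy data: a higher score should mean an earlier event.
times  = [2, 4, 6, 8]
events = [1, 1, 0, 1]        # 0 = censored
scores = [0.9, 0.7, 0.5, 0.1]
print(harrell_c(times, events, scores))  # 1.0: perfectly concordant
```

A value of 0.5 corresponds to random ordering; the paper's 0.871 indicates that higher predicted risks reliably precede earlier lung cancer diagnoses.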

  11. A methodology for modeling regional terrorism risk.

    PubMed

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.

  12. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems.

  13. Quantifying fatigue risk in model-based fatigue risk management.

    PubMed

    Rangan, Suresh; Van Dongen, Hans P A

    2013-02-01

    The question of what is a maximally acceptable level of fatigue risk is hotly debated in model-based fatigue risk management in commercial aviation and other transportation modes. A quantitative approach to addressing this issue, referred to by the Federal Aviation Administration with regard to its final rule for commercial aviation "Flightcrew Member Duty and Rest Requirements," is to compare predictions from a mathematical fatigue model against a fatigue threshold. While this accounts for duty time spent at elevated fatigue risk, it does not account for the degree of fatigue risk and may, therefore, result in misleading schedule assessments. We propose an alternative approach based on the first-order approximation that fatigue risk is proportional to both the duty time spent below the fatigue threshold and the distance of the fatigue predictions to the threshold--that is, the area under the curve (AUC). The AUC approach is straightforward to implement for schedule assessments in commercial aviation and also provides a useful fatigue metric for evaluating thousands of scheduling options in industrial schedule optimization tools.
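    The proposed AUC metric can be sketched directly: accumulate the area by which the model's predictions fall below the fatigue threshold over the duty period. A minimal trapezoidal sketch with hypothetical numbers (the paper's underlying fatigue model is not reproduced here):

```python
def fatigue_auc(t, prediction, threshold):
    """Area between a fatigue/performance prediction curve and the
    threshold, accumulated only while the prediction is below the
    threshold; trapezoidal integration over time points t (hours)."""
    area = 0.0
    for k in range(len(t) - 1):
        d0 = max(0.0, threshold - prediction[k])
        d1 = max(0.0, threshold - prediction[k + 1])
        area += 0.5 * (d0 + d1) * (t[k + 1] - t[k])
    return area

# Hypothetical 3-hour duty period with predictions dipping below a
# threshold of 75: the AUC weights both duration and depth of the dip.
print(fatigue_auc([0, 1, 2, 3], [80, 70, 60, 70], 75))  # 22.5
```

Unlike a simple count of hours below threshold, the same duration at a deeper deficit yields a larger AUC, which is the paper's point about accounting for the degree of fatigue risk.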

  14. Risk assessment of nitrate and oxytetracycline addition on coastal ecosystem functions.

    PubMed

    Feng-Jiao, Liu; Shun-Xing, Li; Feng-Ying, Zheng; Xu-Guang, Huang; Yue-Gang, Zuo; Teng-Xiu, Tu; Xue-Qing, Wu

    2014-01-01

    Diatoms dominate phytoplankton communities in well-mixed coastal and upwelling regions. Coastal diatoms are often exposed to both aquaculture pollution and eutrophication, but how these exposures influence coastal ecosystem functions is unknown. To examine these influences, a coastal centric diatom, Conticribra weissflogii, was maintained at different concentrations of nitrate (N) and/or oxytetracycline (OTC). Algal density, cell growth cycle, protein, chlorophyll a, superoxide dismutase (SOD) activity, and malonaldehyde (MDA) were determined for the assessment of algal biomass, lifetime, nutritional value, photosynthesis and respiration, antioxidant capacity, and lipid peroxidation, respectively. When N addition was combined with OTC pollution, the cell growth cycles were shortened by 56-73%; algal density, SOD activities, and the concentrations of chlorophyll a, protein, and MDA varied between 73 and 121%, 19 and 397%, 52 and 693%, 19 and 875%, and 66 and 2733% of the values observed in the N addition experiments, respectively. According to the P-values, the influence of OTC alone on algal density and SOD activity was not significant, but its effect on cell growth cycle, protein, chlorophyll a, and MDA was significant (P < 0.05). The influence of N addition with simultaneous OTC pollution on all six end points was significant. Algal biomass, lifetime, nutrition, antioxidant capacity, lipid peroxidation, photosynthesis, and respiration were all affected by the addition of OTC and N. Coastal ecosystem functions were severely affected by N and OTC additions, and the influence increased in the order: N addition alone < N addition with simultaneous OTC pollution. These results are relevant to the risk assessment of aquaculture pollution on coastal ecosystem functions.

  15. Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity

    DTIC Science & Technology

    2015-01-07

    Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity. R. Tyrrell Rockafellar, Johannes O. Royset. ...insights and a new class of distributionally robust optimization models. Keywords: risk measures, residual risk, generalized regression, surrogate models.

  16. Modelling suicide risk in later life.

    PubMed

    Lo, C F; Kwok, Cordelia M Y

    2006-08-01

    Affective disorder is generally regarded as the prominent risk factor for suicide in the old-age population. Despite the large number of empirical studies available in the literature, no attempt has yet been made to model the dynamics of an individual's level of suicide risk theoretically. In particular, a dynamic model which can simulate the time evolution of an individual's level of risk for suicide and provide quantitative estimates of the probability of suicide risk is still lacking. In the present study we apply the contingent claims analysis of credit risk modelling, from the field of quantitative finance, to derive a theoretical stochastic model for estimating the probability of suicide risk in later life in terms of a signalling index of affective disorder. Our model is based upon the hypotheses that the current state of affective disorder of a patient can be represented by a signalling index which exhibits stochastic movement, and that a threshold of affective disorder exists whose crossing signifies the occurrence of suicide. According to the numerical results, the implications of our model are consistent with the clinical findings. Hence, we believe that such a dynamic model will be essential to the design of effective suicide prevention strategies in the target population of older adults, especially in the primary care setting.
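    In the contingent-claims (structural credit risk) analogy, the event of interest occurs when the stochastic signalling index first crosses a fixed threshold. For a driftless Brownian index, the first-passage probability has a textbook closed form via the reflection principle; this sketch is the generic building block, not the paper's calibrated model:

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def crossing_prob(d: float, sigma: float, t: float) -> float:
    """P(a driftless Brownian signalling index with volatility sigma,
    starting a distance d below the threshold, crosses it within time t),
    by the reflection principle: 2 * (1 - Phi(d / (sigma * sqrt(t))))."""
    if t <= 0:
        return 0.0
    return 2.0 * (1.0 - norm_cdf(d / (sigma * math.sqrt(t))))

# The farther the index sits below the threshold, the lower the risk;
# the longer the horizon, the higher the risk.
print(crossing_prob(1.0, 1.0, 1.0))   # ~0.317
print(crossing_prob(1.0, 1.0, 4.0))   # higher, ~0.617
```

Drift (e.g., a deteriorating or improving course of affective disorder) would add a drift term and change the formula; the monotone dependence on distance-to-threshold and horizon is the qualitative behaviour the abstract describes.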

  17. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  18. Global flood risk modelling and its applications for disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Jongman, Brenden; Winsemius, Hessel; Bierkens, Marc; Bouwman, Arno; van Beek, Rens; Ligtvoet, Willem; Ward, Philip

    2014-05-01

    Flooding of river systems is the most costly natural hazard affecting societies around the world, with an average of US$55 billion in direct losses and 4,500 fatalities each year between 1990 and 2012. The accurate and consistent assessment of flood risk on a global scale is essential for international development organizations and the reinsurance industry, and for enhancing our understanding of climate change impacts. This need is especially felt in developing countries, where local data and models are largely unavailable, and where flood risk is increasing rapidly under strong population growth and economic development. Here we present ongoing applications of high-resolution flood risk modelling at a global scale. The work is based on GLOFRIS, a modelling chain that produces flood risk maps at a 1 km spatial resolution for the entire globe, under a range of climate and socioeconomic scenarios and various past and future time periods. This modelling chain combines a hydrological inundation model with socioeconomic datasets to assess past, current and future population exposure; economic damages; and agricultural risk. These tools are currently applied scientifically to gain insights into geographical patterns of current risk, and to assess the effects of possible future scenarios under climate change and climate variability. In this presentation we show recent applications from the global to the national scale. The global-scale applications include global risk profiling for the reinsurance industry and novel estimation of global flood mortality risk. In addition, we demonstrate how the global flood modelling approach was successfully applied to assess disaster risk reduction priorities on a national scale in Africa. Finally, we indicate how these global modelling tools can be used to quantify the costs and benefits of adaptation, and to explore pathways for development under a changing environment.

  19. Major histocompatibility complex harbors widespread genotypic variability of non-additive risk of rheumatoid arthritis including epistasis

    PubMed Central

    Wei, Wen-Hua; Bowes, John; Plant, Darren; Viatte, Sebastien; Yarwood, Annie; Massey, Jonathan; Worthington, Jane; Eyre, Stephen

    2016-01-01

    Genotypic variability based genome-wide association studies (vGWASs) can identify potentially interacting loci without prior knowledge of the interacting factors. We report a two-stage approach to make vGWAS applicable to diseases: first, using a mixed model approach to partition dichotomous phenotypes into additive risk and non-additive environmental residuals on the liability scale, and second, using Levene's (Brown-Forsythe) test to assess equality of the residual variances across genotype groups per marker. We found widespread significant (P < 2.5e-05) vGWAS signals within the major histocompatibility complex (MHC) across all three study cohorts of rheumatoid arthritis. We further identified 10 epistatic interactions between the vGWAS signals independent of the MHC additive effects, each with a weak effect but jointly explaining 1.9% of phenotypic variance. PTPN22 was also identified in the discovery cohort but replicated in only one independent cohort. Combining the three cohorts boosted the power of vGWAS and additionally identified TYK2 and ANKRD55. Both PTPN22 and TYK2 had evidence of interactions reported elsewhere. We conclude that vGWAS can help discover interacting loci for complex diseases but requires large samples to find additional signals. PMID:27109064
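    The second-stage variance test is compact: Levene's test with group medians (the Brown-Forsythe variant) is a one-way ANOVA on absolute deviations from each genotype group's median. A minimal sketch computing only the F statistic on toy residuals (a p-value would come from an F distribution with k-1 and N-k degrees of freedom):

```python
from statistics import median, mean

def brown_forsythe_F(groups):
    """Brown-Forsythe statistic: one-way ANOVA F computed on absolute
    deviations from each group's median. A large F indicates unequal
    residual variances across genotype groups (a vGWAS signal)."""
    z = [[abs(x - median(g)) for x in g] for g in groups]
    k = len(z)
    n_total = sum(len(g) for g in z)
    grand = mean([x for g in z for x in g])
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in z)
    ss_within = sum((x - mean(g)) ** 2 for g in z for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Two toy genotype groups whose residuals differ in spread but not centre
# shape: the statistic responds to the variance difference.
print(brown_forsythe_F([[1, 2, 3], [10, 20, 30]]))  # ~3.208
```

Using the median rather than the mean (Levene's original choice) makes the test robust to skewed residual distributions, which matters on the liability scale.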

  20. Applying risk assessment models in non-surgical patients: effective risk stratification.

    PubMed

    Eldor, A

    1999-08-01

    Pulmonary embolism and deep vein thrombosis are serious complications of non-surgical patients, but scarcity of data documenting prophylaxis means antithrombotic therapy is rarely used. Prediction of risk is complicated by the variation in the medical conditions associated with venous thromboembolism (VTE), and lack of data defining risk in different groups. Accurate risk assessment is further confounded by inherited or acquired factors for VTE, additional risk due to medical interventions, and interactions between risk factors. Acquired and inherited risk factors may underlie thromboembolic complications in a range of conditions, including pregnancy, ischaemic stroke, myocardial infarction and cancer. Risk stratification may be feasible in non-surgical patients by considering individual risk factors and their cumulative effects. Current risk assessment models require expansion and modification to reflect emerging evidence in the non-surgical field. A large on-going study of prophylaxis with low-molecular-weight heparin in non-surgical patients will clarify our understanding of the components of risk, and assist in developing therapy recommendations.

  1. Percolation model with an additional source of disorder

    NASA Astrophysics Data System (ADS)

    Kundu, Sumanta; Manna, S. S.

    2016-06-01

    The ranges of transmission of the mobiles in a mobile ad hoc network are not uniform in reality. They are affected by temperature fluctuations in the air, obstruction by solid objects, and even humidity differences in the environment. How the varying transmission ranges of the individual active elements affect the global connectivity of the network is an important practical question. Here a model of percolation phenomena, with an additional source of disorder, is introduced for a theoretical understanding of this problem. As in ordinary percolation, sites of a square lattice are occupied randomly with probability p. Each occupied site is then assigned a circular disk of random value R for its radius. A bond is defined to be occupied if and only if the radii R1 and R2 of the disks centered at its ends satisfy a certain predefined condition. In a very general formulation, one divides the R1-R2 plane into two regions by an arbitrary closed curve: a point within one region represents an occupied bond; otherwise the bond is vacant. The study of three different rules under this general formulation indicates that the percolation threshold always varies continuously. This threshold has two limiting values: one is pc(sq), the percolation threshold for ordinary site percolation on the square lattice, and the other is unity. The approach of the percolation threshold to its limiting values is characterized by two exponents. In a special case, all lattice sites are occupied by disks of random radii R ∈ {0, R0} and a percolation transition is observed with R0 as the control variable, similar to the site occupation probability.
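    The lattice model described above is straightforward to simulate. A minimal union-find sketch under assumed conventions (top-to-bottom spanning as the percolation criterion; the bond rule and radius distribution are passed in as functions, mirroring the paper's general formulation):

```python
import random

def percolates(L, p, radius_fn, bond_rule, seed=0):
    """Site percolation on an L x L square lattice where each occupied
    site carries a random disk radius; a bond between neighbouring
    occupied sites is open iff bond_rule(R1, R2). Returns True when an
    open cluster spans top to bottom (checked with union-find)."""
    rng = random.Random(seed)
    occ = [[rng.random() < p for _ in range(L)] for _ in range(L)]
    rad = [[radius_fn(rng) for _ in range(L)] for _ in range(L)]
    parent = list(range(L * L + 2))   # two virtual nodes: top, bottom
    TOP, BOT = L * L, L * L + 1

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for i in range(L):
        for j in range(L):
            if not occ[i][j]:
                continue
            if i == 0:
                union(i * L + j, TOP)
            if i == L - 1:
                union(i * L + j, BOT)
            for di, dj in ((1, 0), (0, 1)):
                ni, nj = i + di, j + dj
                if ni < L and nj < L and occ[ni][nj] and \
                        bond_rule(rad[i][j], rad[ni][nj]):
                    union(i * L + j, ni * L + nj)
    return find(TOP) == find(BOT)

# The abstract's special case: radii drawn from {0, R0}; here a bond is
# taken to be open only when both disks are non-zero (an assumed rule).
rule = lambda r1, r2: r1 > 0 and r2 > 0
radii = lambda rng: rng.choice([0.0, 1.0])
print(percolates(20, 0.9, radii, rule))
```

Sweeping p (or the radius distribution) and averaging the spanning indicator over seeds estimates the percolation threshold for a given bond rule.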

  2. Two criteria for evaluating risk prediction models.

    PubMed

    Pfeiffer, R M; Gail, M H

    2011-09-01

    We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed, PCF(q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion, the proportion needed to follow-up, PNF(p), is the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF(q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF(p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of these two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data.
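    On a finite validation sample the two criteria reduce to simple rank computations over the predicted risks. A minimal empirical sketch (toy data; the paper's distribution theory and influence-function inference are not reproduced):

```python
def pcf(risks, is_case, q):
    """Proportion of cases followed: the fraction of all cases found
    among the proportion q of the population at highest predicted risk."""
    order = sorted(range(len(risks)), key=lambda i: -risks[i])
    n_follow = round(q * len(risks))
    followed_cases = sum(is_case[i] for i in order[:n_follow])
    return followed_cases / sum(is_case)

def pnf(risks, is_case, p):
    """Proportion needed to follow-up: the smallest fraction of the
    population, taken from the top of the risk distribution, that
    contains a proportion p of all cases."""
    order = sorted(range(len(risks)), key=lambda i: -risks[i])
    total_cases, found = sum(is_case), 0
    for k, i in enumerate(order, start=1):
        found += is_case[i]
        if found >= p * total_cases:
            return k / len(risks)
    return 1.0

risks   = [0.9, 0.8, 0.7, 0.2, 0.1]
is_case = [1, 1, 0, 0, 0]
print(pcf(risks, is_case, 0.4))  # 1.0: top 40% captures every case
print(pnf(risks, is_case, 1.0))  # 0.4: 40% must be followed for all cases
```

Both quantities are points on the Lorenz-type curve the paper studies: PCF reads it vertically (cases captured at a fixed coverage), PNF horizontally (coverage required for a fixed case proportion).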

  3. Hyperbolic value addition and general models of animal choice.

    PubMed

    Mazur, J E

    2001-01-01

    Three mathematical models of choice--the contextual-choice model (R. Grace, 1994), delay-reduction theory (N. Squires & E. Fantino, 1971), and a new model called the hyperbolic value-added model--were compared in their ability to predict the results from a wide variety of experiments with animal subjects. When supplied with 2 or 3 free parameters, all 3 models made fairly accurate predictions for a large set of experiments that used concurrent-chain procedures. One advantage of the hyperbolic value-added model is that it is derived from a simpler model that makes accurate predictions for many experiments using discrete-trial adjusting-delay procedures. Some results favor the hyperbolic value-added model and delay-reduction theory over the contextual-choice model, but more data are needed from choice situations for which the models make distinctly different predictions.
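    The hyperbola at the core of the hyperbolic value-added model is easy to state. A minimal sketch of hyperbolic discounting, showing the preference reversal such models predict when a common delay is added to both alternatives (k = 1 is an illustrative sensitivity parameter, not a value fitted to the reviewed experiments):

```python
def hyperbolic_value(amount, delay, k=1.0):
    """Hyperbolic discounting of a delayed reward: V = A / (1 + k*D).
    The hyperbolic value-added model scores alternatives by the value
    each adds relative to the choice context; this is the core hyperbola."""
    return amount / (1.0 + k * delay)

# An immediate small reward can beat a larger delayed one...
print(hyperbolic_value(5, 0) > hyperbolic_value(10, 2))    # True
# ...but adding a common 10-unit front-end delay reverses the preference,
# which exponential discounting cannot produce.
print(hyperbolic_value(5, 10) > hyperbolic_value(10, 12))  # False
```

This reversal is the kind of discrete-trial adjusting-delay result the abstract credits the simpler parent model with predicting accurately.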

  4. Synergistic effect of rice husk addition on hydrothermal treatment of sewage sludge: fate and environmental risk of heavy metals.

    PubMed

    Shi, Wansheng; Liu, Chunguang; Shu, Youju; Feng, Chuanping; Lei, Zhongfang; Zhang, Zhenya

    2013-12-01

    Hydrothermal treatment (HTT) at 200°C was applied to immobilize heavy metals (HMs) and the effect of rice husk (RH) addition was investigated based on total HMs concentration, fractionation and leaching tests. The results indicated that a synergistic effect of RH addition and HTT could be achieved on reducing the risk of HMs from medium and low risk to no risk. Metals were redistributed and transformed from weakly bounded state to stable state during the HTT process under RH addition. Notably at a RH/sludge ratio of 1/1.75 (d.w.), all the HMs showed no eco-toxicity and no leaching toxicity, with the concentrations of leachable Cr, Ni, Cu and Cd decreased by 17%, 89%, 95% and 93%, respectively. This synergistic effect of RH addition and HTT on the risk reduction of HMs implies that HTT process with RH addition could be a promising and safe disposal technology for sewage sludge treatment in practice.

  5. Additive Synergism between Asbestos and Smoking in Lung Cancer Risk: A Systematic Review and Meta-Analysis

    PubMed Central

    Ngamwong, Yuwadee; Tangamornsuksan, Wimonchat; Lohitnavy, Ornrat; Chaiyakunapruk, Nathorn; Scholfield, C. Norman; Reisfeld, Brad; Lohitnavy, Manupat

    2015-01-01

    Smoking and asbestos exposure are important risks for lung cancer. Several epidemiological studies have linked asbestos exposure and smoking to lung cancer. To reconcile and unify these results, we conducted a systematic review and meta-analysis to provide a quantitative estimate of the increased risk of lung cancer associated with asbestos exposure and cigarette smoking and to classify their interaction. Five electronic databases were searched from inception to May, 2015 for observational studies on lung cancer. All case-control (N = 10) and cohort (N = 7) studies were included in the analysis. We calculated pooled odds ratios (ORs), relative risks (RRs) and 95% confidence intervals (CIs) using a random-effects model for the association of asbestos exposure and smoking with lung cancer. Lung cancer patients who were not exposed to asbestos and non-smoking (A-S-) were compared with: (i) asbestos-exposed and non-smoking (A+S-), (ii) non-exposure to asbestos and smoking (A-S+), and (iii) asbestos-exposed and smoking (A+S+). Our meta-analysis showed a significant difference in risk of developing lung cancer among asbestos-exposed and/or smoking workers compared to controls (A-S-); odds ratios for the disease (95% CI) were (i) 1.70 (A+S-, 1.31–2.21), (ii) 5.65 (A-S+, 3.38–9.42), and (iii) 8.70 (A+S+, 5.8–13.10). The additive interaction index of synergy was 1.44 (95% CI = 1.26–1.77) and the multiplicative index = 0.91 (95% CI = 0.63–1.30). Corresponding values for cohort studies were 1.11 (95% CI = 1.00–1.28) and 0.51 (95% CI = 0.31–0.85). Our results point to an additive synergism for lung cancer with co-exposure to asbestos and cigarette smoking. Assessments of industrial health risks should take smoking and other airborne health risks into account when setting occupational asbestos exposure limits. PMID:26274395
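    The two interaction indices follow directly from the pooled ORs, and recomputing them reproduces the published 1.44 and 0.91 (Rothman's synergy index for additive interaction; the helper names are ours):

```python
def synergy_index(or_both, or_a_only, or_b_only):
    """Rothman's synergy index S for additive interaction:
    S > 1 means the joint excess odds exceed the sum of the two
    separate excess odds."""
    return (or_both - 1.0) / ((or_a_only - 1.0) + (or_b_only - 1.0))

def multiplicative_index(or_both, or_a_only, or_b_only):
    """Multiplicative interaction: ratio of the joint OR to the product
    of the separate ORs (1 = exactly multiplicative)."""
    return or_both / (or_a_only * or_b_only)

# Pooled ORs from the meta-analysis: A+S- = 1.70, A-S+ = 5.65, A+S+ = 8.70
print(round(synergy_index(8.70, 1.70, 5.65), 2))         # 1.44
print(round(multiplicative_index(8.70, 1.70, 5.65), 2))  # 0.91
```

S = 1.44 > 1 with a multiplicative index below 1 is exactly the pattern the abstract summarizes: more than additive, slightly less than multiplicative.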

  6. Additive Synergism between Asbestos and Smoking in Lung Cancer Risk: A Systematic Review and Meta-Analysis.

    PubMed

    Ngamwong, Yuwadee; Tangamornsuksan, Wimonchat; Lohitnavy, Ornrat; Chaiyakunapruk, Nathorn; Scholfield, C Norman; Reisfeld, Brad; Lohitnavy, Manupat

    2015-01-01

    Smoking and asbestos exposure are important risks for lung cancer. Several epidemiological studies have linked asbestos exposure and smoking to lung cancer. To reconcile and unify these results, we conducted a systematic review and meta-analysis to provide a quantitative estimate of the increased risk of lung cancer associated with asbestos exposure and cigarette smoking and to classify their interaction. Five electronic databases were searched from inception to May, 2015 for observational studies on lung cancer. All case-control (N = 10) and cohort (N = 7) studies were included in the analysis. We calculated pooled odds ratios (ORs), relative risks (RRs) and 95% confidence intervals (CIs) using a random-effects model for the association of asbestos exposure and smoking with lung cancer. Lung cancer patients who were not exposed to asbestos and non-smoking (A-S-) were compared with: (i) asbestos-exposed and non-smoking (A+S-), (ii) non-exposure to asbestos and smoking (A-S+), and (iii) asbestos-exposed and smoking (A+S+). Our meta-analysis showed a significant difference in risk of developing lung cancer among asbestos-exposed and/or smoking workers compared to controls (A-S-); odds ratios for the disease (95% CI) were (i) 1.70 (A+S-, 1.31-2.21), (ii) 5.65 (A-S+, 3.38-9.42), and (iii) 8.70 (A+S+, 5.8-13.10). The additive interaction index of synergy was 1.44 (95% CI = 1.26-1.77) and the multiplicative index = 0.91 (95% CI = 0.63-1.30). Corresponding values for cohort studies were 1.11 (95% CI = 1.00-1.28) and 0.51 (95% CI = 0.31-0.85). Our results point to an additive synergism for lung cancer with co-exposure to asbestos and cigarette smoking. Assessments of industrial health risks should take smoking and other airborne health risks into account when setting occupational asbestos exposure limits.

  7. Additive genetic variation in schizophrenia risk is shared by populations of African and European descent.

    PubMed

    de Candia, Teresa R; Lee, S Hong; Yang, Jian; Browning, Brian L; Gejman, Pablo V; Levinson, Douglas F; Mowry, Bryan J; Hewitt, John K; Goddard, Michael E; O'Donovan, Michael C; Purcell, Shaun M; Posthuma, Danielle; Visscher, Peter M; Wray, Naomi R; Keller, Matthew C

    2013-09-05

    To investigate the extent to which the proportion of schizophrenia's additive genetic variation tagged by SNPs is shared by populations of European and African descent, we analyzed the largest combined African descent (AD [n = 2,142]) and European descent (ED [n = 4,990]) schizophrenia case-control genome-wide association study (GWAS) data set available, the Molecular Genetics of Schizophrenia (MGS) data set. We show how a method that uses genomic similarities at measured SNPs to estimate the additive genetic correlation (SNP correlation [SNP-rg]) between traits can be extended to estimate SNP-rg for the same trait between ethnicities. We estimated SNP-rg for schizophrenia between the MGS ED and MGS AD samples to be 0.66 (SE = 0.23), which is significantly different from 0 (p(SNP-rg = 0) = 0.0003), but not 1 (p(SNP-rg = 1) = 0.26). We re-estimated SNP-rg between an independent ED data set (n = 6,665) and the MGS AD sample to be 0.61 (SE = 0.21, p(SNP-rg = 0) = 0.0003, p(SNP-rg = 1) = 0.16). These results suggest that many schizophrenia risk alleles are shared across ethnic groups and predate African-European divergence.

  8. Calibrated predictions for multivariate competing risks models.

    PubMed

    Gorfine, Malka; Hsu, Li; Zucker, David M; Parmigiani, Giovanni

    2014-04-01

    Prediction models for time-to-event data play a prominent role in assessing the individual risk of a disease, such as cancer. Accurate disease prediction models provide an efficient tool for identifying individuals at high risk, and provide the groundwork for estimating the population burden and cost of disease and for developing patient care guidelines. We focus on risk prediction of a disease in which family history is an important risk factor that reflects inherited genetic susceptibility, shared environment, and common behavior patterns. In this work family history is accommodated using frailty models, with the main novel feature being allowing for competing risks, such as other diseases or mortality. We show through a simulation study that naively treating competing risks as independent right censoring events results in non-calibrated predictions, with the expected number of events overestimated. Discrimination performance is not affected by ignoring competing risks. Our proposed prediction methodologies correctly account for competing events, are very well calibrated, and easy to implement.
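    The bias from treating competing events as independent censoring can be seen with four subjects. A toy sketch (no within-family frailty; the only point is that 1 - KM overestimates the cumulative incidence, as the simulation study reports):

```python
def one_minus_km(times, causes, cause=1):
    """1 - Kaplan-Meier for one cause, *naively* censoring competing
    events: this is the biased estimate the abstract warns about."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv = len(times), 1.0
    for i in order:
        if causes[i] == cause:
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return 1.0 - surv

def cumulative_incidence(times, causes, cause=1):
    """Empirical cumulative incidence when all competing events are
    observed: the proportion that experienced the cause of interest."""
    return sum(c == cause for c in causes) / len(causes)

# Two subjects fail from cause 1, two from the competing cause 2.
times, causes = [1, 2, 3, 4], [1, 2, 1, 2]
print(one_minus_km(times, causes))          # 0.625: overestimates
print(cumulative_incidence(times, causes))  # 0.5: the true proportion
```

The naive estimate exceeds 0.5 because subjects removed by the competing cause can never experience cause 1, yet the Kaplan-Meier construction pretends they still could; a proper competing-risks estimator (e.g., Aalen-Johansen) recovers the 0.5.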

  9. Risk terrain modeling predicts child maltreatment.

    PubMed

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children.
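    The cumulative-environmental-factor idea at the core of RTM can be sketched as a weighted overlay of gridded risk-factor layers (hypothetical 2x2 layers; operational RTM additionally tests each factor's significance and spatial influence before combining them):

```python
def risk_terrain(layers, weights=None):
    """Overlay binary risk-factor layers on a common grid and sum the
    (optionally weighted) presence of each factor per cell, yielding a
    composite risk surface: higher cell values = more co-located factors."""
    rows, cols = len(layers[0]), len(layers[0][0])
    if weights is None:
        weights = [1] * len(layers)
    return [[sum(w * layer[r][c] for w, layer in zip(weights, layers))
             for c in range(cols)] for r in range(rows)]

# Two hypothetical factor layers on a 2x2 grid (e.g., presence of
# liquor outlets and of vacant housing):
bars    = [[1, 0], [0, 1]]
vacancy = [[1, 1], [0, 0]]
print(risk_terrain([bars, vacancy]))  # [[2, 1], [0, 1]]
```

Cells scoring highest across many environmental layers are the "highest risk areas" the abstract describes, and the per-layer weights express the relative importance of each contributing factor.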

  10. Earthquake Risk Modelling - Opening the black box

    NASA Astrophysics Data System (ADS)

    Alarcon, John E.; Simic, Milan; Franco, Guillermo; Shen-Tu, Bingming

    2010-05-01

    Assessing the risk from natural catastrophes such as earthquakes involves the detailed study of the seismic sources and site conditions that contribute to the earthquake hazard in the region of interest, the distribution and particular characteristics of the exposures through the study of the building stock and its vulnerabilities, and the application of specific financial terms for particular portfolios. The catastrophe modelling framework encompasses these relatively complex considerations while also including a measure of uncertainty. This paper succinctly describes the structure and modules of a probabilistic catastrophe risk model and presents several examples of risk modelling for realistic scenarios, such as the expected earthquakes in the Marmara Sea region of Turkey, and results from modelling the 2009 L'Aquila (Abruzzo) earthquake.
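    One standard output of such a probabilistic model is the expected annual loss: the rate-weighted sum of losses over the stochastic event set, after the portfolio's financial terms have been applied per event (the event loss table below is hypothetical):

```python
def expected_annual_loss(event_loss_table):
    """Average annual loss from a catastrophe model's event loss table:
    the sum over stochastic events of annual occurrence rate x loss."""
    return sum(rate * loss for rate, loss in event_loss_table)

# Hypothetical events: (annual occurrence rate, loss after financial terms)
elt = [
    (0.01,   5_000_000),    # frequent, moderate event
    (0.002,  40_000_000),   # rarer, larger event
    (0.0005, 200_000_000),  # rare, extreme event
]
print(expected_annual_loss(elt))  # ~230,000 per year
```

Sorting the same table by loss and accumulating the rates yields the exceedance-probability curve, the other headline output of the modelling chain the paper describes.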

  11. The addition of whole soy flour to cafeteria diet reduces metabolic risk markers in wistar rats

    PubMed Central

    2013-01-01

    Background Soybean is termed a functional food because it contains bioactive compounds. However, its effects are not well known under unbalanced diet conditions. This work aimed to evaluate the effect of adding whole soy flour to a cafeteria diet on intestinal histomorphometry and on metabolic risk and toxicity markers in rats. Methods In this study, 30 male adult Wistar rats were used, distributed among three groups (n = 10): AIN-93M diet, cafeteria diet (CAF) and cafeteria diet with soy flour (CAFS), for 56 days. The following parameters were measured: food intake; weight gain; serum concentrations of triglycerides, total cholesterol, HDL-c, glycated hemoglobin (HbA1c), aspartate (AST) and alanine (ALT) aminotransferases and thiobarbituric acid reactive substances (TBARS); fecal moisture and lipid content; and liver weight and fat. The villous height, crypt depth and thickness of the duodenal and ileal circular and longitudinal muscle layers of the animals were also measured. Results There was a significant reduction in food intake in the CAF group. The CAFS group showed lower serum concentrations of triglycerides and serum TBARS and a lower percentage of hepatic fat, with a corresponding increase in the thickness of the intestinal muscle layers. In the CAF group, an increase in HbA1c, ALT, lipid excretion, liver TBARS and crypt depth was observed, associated with lower HDL-c and villous height. The addition of soy did not promote any change in these parameters. Conclusions The inclusion of whole soy flour in a high-fat diet may be helpful in reducing some markers of metabolic risk; however, more studies are required to clarify its effects on unbalanced diets. PMID:24119309

  12. A Risk Model for Lung Cancer Incidence

    PubMed Central

    Hoggart, Clive; Brennan, Paul; Tjonneland, Anne; Vogel, Ulla; Overvad, Kim; Østergaard, Jane Nautrup; Kaaks, Rudolf; Canzian, Federico; Boeing, Heiner; Steffen, Annika; Trichopoulou, Antonia; Bamia, Christina; Trichopoulos, Dimitrios; Johansson, Mattias; Palli, Domenico; Krogh, Vittorio; Tumino, Rosario; Sacerdote, Carlotta; Panico, Salvatore; Boshuizen, Hendriek; Bueno-de-Mesquita, H. Bas; Peeters, Petra H.M.; Lund, Eiliv; Gram, Inger Torhild; Braaten, Tonje; Rodríguez, Laudina; Agudo, Antonio; Sanchez-Cantalejo, Emilio; Arriola, Larraitz; Chirlaque, Maria-Dolores; Barricarte, Aurelio; Rasmuson, Torgny; Khaw, Kay-Tee; Wareham, Nicholas; Allen, Naomi E.; Riboli, Elio; Vineis, Paolo

    2015-01-01

    Risk models for lung cancer incidence would be useful for prioritizing individuals for screening and participation in clinical trials of chemoprevention. We present a risk model for lung cancer, built using prospective cohort data from a general population, which predicts individual incidence in a given time period. We built separate risk models for current and former smokers using 169,035 ever smokers from the multicenter European Prospective Investigation into Cancer and Nutrition (EPIC) and considered a model for never smokers. The data set was split into independent training and test sets. Lung cancer incidence was modeled using survival analysis, stratifying by age started smoking and, for former smokers, also by smoking duration. Other risk factors considered were smoking intensity, 10 occupational/environmental exposures previously implicated in lung cancer, and single-nucleotide polymorphisms at two loci identified by genome-wide association studies of lung cancer. Individual risk in the test set was measured by the predicted probability of lung cancer incidence in the year preceding the last follow-up time; predictive accuracy was measured by the area under the receiver operating characteristic curve (AUC). Using smoking information alone gave good predictive accuracy: the AUC and 95% confidence interval in ever smokers was 0.843 (0.810–0.875), whereas the Bach model applied to the same data gave an AUC of 0.775 (0.737–0.813). Other risk factors had a negligible effect on the AUC; prediction for never smokers was poor. Our model is generalizable and straightforward to implement. Its accuracy can be attributed to its modeling of lifetime exposure to smoking. PMID:22496387
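
As a reminder of what the reported AUC measures, here is a minimal, self-contained illustration with toy scores (not the EPIC data): the AUC is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case.

```python
def auc(case_scores, control_scores):
    """Mann-Whitney form of the AUC, with half credit for ties."""
    wins = ties = 0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

# Toy predicted risks for illustration only.
cases = [0.9, 0.8, 0.35]
controls = [0.7, 0.3, 0.2, 0.1]
print(auc(cases, controls))   # 11 of 12 case-control pairs correctly ordered
```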

  13. Veterans Affairs Health Care: Addition to GAO’s High Risk List and Actions Needed for Removal

    DTIC Science & Technology

    2015-04-29

    Veterans Affairs Health Care: Addition to GAO's High Risk List and Actions Needed for Removal. Statement of Debra A. Draper, Director, Health Care. Testimony before the Committee on Veterans' Affairs, U.S. Senate. For release on delivery, expected at 2:30 p.m. ET, April 29, 2015.

  14. Using Generalized Additive Models to Analyze Single-Case Designs

    ERIC Educational Resources Information Center

    Shadish, William; Sullivan, Kristynn

    2013-01-01

    Many analyses for single-case designs (SCDs)--including nearly all the effect size indicators--currently assume no trend in the data. Regression and multilevel models allow for trend, but usually test only linear trend and have no principled way of knowing if higher order trends should be represented in the model. This paper shows how Generalized…
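
The problem the abstract raises can be seen in a small synthetic example: a straight-line model fitted to single-case data with a curved trend leaves large systematic residuals, while adding a higher-order term (which a GAM would discover automatically via smoothing) removes them. Data below are invented for illustration.

```python
import numpy as np

t = np.arange(10, dtype=float)            # session number
y = 2.0 + 0.1 * t + 0.3 * t ** 2          # outcome with a quadratic trend

def rss(degree):
    """Residual sum of squares for a polynomial trend of the given degree."""
    coef = np.polyfit(t, y, degree)
    return float(np.sum((np.polyval(coef, t) - y) ** 2))

print(rss(1) > 1.0)    # True: a linear trend leaves large residuals
print(rss(2) < 1e-8)   # True: the quadratic term captures the trend
```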

  15. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data were considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, and implant design and simulation, showing the potential of FDM technology in the medical field. It will also improve communication between medical staff and patients. Current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models.

  16. Bladder explosion during transurethral resection of prostate: Bladder diverticula as an additional risk factor

    PubMed Central

    Vincent, D. Paul

    2017-01-01

    Vesical explosion during transurethral resection of the prostate (TURP) is a very rare occurrence. Very few cases have been reported in the literature. The literature was reviewed pertaining to the etiology of bladder explosion during transurethral resection. The underlying mechanism for intravesical explosion is the generation and trapping of explosive gases under the dome of the bladder, which eventually detonate on contact with the cautery electrode during TURP. Various techniques have been suggested to prevent this dreaded complication. A 75-year-old male with chronic retention of urine underwent TURP. There was Grade 2 trilobar enlargement of the prostate. There were multiple diverticula, with one large diverticulum in the dome of the bladder. During hemostasis, there was a loud pop and the bladder exploded. Lower midline laparotomy was performed and the intraperitoneal bladder rupture was repaired. He had an uneventful postoperative recovery, and he is asymptomatic at 6 months of follow-up. Even though all the precautions were taken to avoid this complication, bladder rupture was encountered. The presence of multiple diverticula is suggested as an additional risk factor for this complication, as the bladder wall is thinned out and air bubbles may be trapped within the diverticulum. In such cases where there are multiple bladder diverticula, the employment of a suprapubic trocar for continuous drainage of the air bubble could well be a practical consideration. PMID:28216933

  17. Modeling food spoilage in microbial risk assessment.

    PubMed

    Koutsoumanis, Konstantinos

    2009-02-01

    In this study, I describe a systematic approach for modeling food spoilage in microbial risk assessment that is based on the incorporation of kinetic spoilage modeling in exposure assessment by combining data and models for the specific spoilage organisms (SSO: fraction of the total microflora responsible for spoilage) with those for pathogens. The structure of the approach is presented through an exposure assessment application for Escherichia coli O157:H7 in ground beef. The proposed approach allows for identifying spoiled products at the time of consumption by comparing the estimated level of SSO (pseudomonads) with the spoilage level (level of SSO at which spoilage is observed). The results of the application indicate that ignoring spoilage in risk assessment could lead to significant overestimations of risk.
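
The kinetic spoilage idea in the abstract can be sketched as follows: simulate growth of the specific spoilage organisms (SSO, here pseudomonads) and flag a product as spoiled once the SSO level reaches the spoilage threshold, so that spoiled products are excluded from exposure. Parameter values below are illustrative, not those of the paper.

```python
import math

def log_count(n0_log, mu, hours):
    """Log10 count under first-order growth: N(t) = N0 * exp(mu * t)."""
    return n0_log + mu * hours / math.log(10)

SPOILAGE_LEVEL = 7.0          # log10 CFU/g at which spoilage is observed
initial, mu = 3.0, 0.12       # initial log10 count; growth rate per hour

storage_hours = [24, 48, 96]
spoiled = [log_count(initial, mu, h) >= SPOILAGE_LEVEL for h in storage_hours]
print(spoiled)   # only the product stored longest is spoiled at consumption
```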

  18. Non-additive model for specific heat of electrons

    NASA Astrophysics Data System (ADS)

    Anselmo, D. H. A. L.; Vasconcelos, M. S.; Silva, R.; Mello, V. D.

    2016-10-01

    By using the non-additive Tsallis entropy Sq, we demonstrate numerically that one-dimensional quasicrystals, whose energy spectra are multifractal Cantor sets, are characterized by an entropic parameter, and we calculate the electronic specific heat. In our method we consider energy spectra calculated using the one-dimensional tight-binding Schrödinger equation, with the bands (or levels) scaled onto the [0, 1] interval. The Tsallis formalism is applied to the energy spectra of Fibonacci and double-period one-dimensional quasiperiodic lattices. We analytically obtain an expression for the specific heat that we consider to be more appropriate for calculating this quantity in those quasiperiodic structures.
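
For reference, the non-additive entropy in question is the standard Tsallis form (a textbook statement, not reproduced from the paper itself):

```latex
S_q = k_B \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k_B \sum_i p_i \ln p_i ,
```

with the non-additive composition rule $S_q(A+B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B)/k_B$ for independent subsystems $A$ and $B$; the ordinary Boltzmann-Gibbs entropy is recovered in the limit $q \to 1$.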

  19. Modeling of additive manufacturing processes for metals: Challenges and opportunities

    DOE PAGES

    Francois, Marianne M.; Sun, Amy; King, Wayne E.; ...

    2017-01-09

    With the technology now being developed to manufacture metallic parts using increasingly advanced additive manufacturing processes, a new era has opened up for designing novel structural materials, from designing shapes and complex geometries to controlling the microstructure (alloy composition and morphology). The material properties used within specific structural components are also designable, in order to meet specific performance requirements that are not achievable with traditional metal forming and machining (subtractive) techniques.

  20. Additional Research Needs to Support the GENII Biosphere Models

    SciTech Connect

    Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen

    2013-11-30

    In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved, both to accommodate the locally significant pathways identified and to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models. It is recommended that priorities be set by NRC staff to guide selection of the most useful improvements in a cost-effective manner. Suggestions are made based on relatively easy and inexpensive changes, and on longer-term, more costly studies. In the short term, there are several improved model formulations that could be applied to the GENII suite of codes to make them more generally useful:
    • Implementation of the separation of the translocation and weathering processes
    • Implementation of an improved model for carbon-14 from non-atmospheric sources
    • Implementation of radon exposure pathway models
    • Development of a KML processor for the output report generator module, so that data calculated on a grid can be superimposed upon digital maps for easier presentation and display
    • Implementation of marine mammal models (manatees, seals, walrus, whales, etc.)
    Data needs in the longer term require extensive (and potentially expensive) research. Before picking any one radionuclide or food type, NRC staff should perform an in-house review of current and anticipated environmental analyses to select “dominant” radionuclides of interest, to allow setting of cost-effective priorities for radionuclide- and pathway-specific research. These include:
    • soil-to-plant uptake studies for oranges and other citrus fruits, and
    • development of models for evaluating radionuclide concentrations in highly-processed foods such as oils and sugars.
    Finally, renewed

  1. Suicide risk assessment and suicide risk formulation: essential components of the therapeutic risk management model.

    PubMed

    Silverman, Morton M

    2014-09-01

    Suicide and other suicidal behaviors are often associated with psychiatric disorders and dysfunctions. Therefore, psychiatrists have significant opportunities to identify at-risk individuals and offer treatment to reduce that risk. Although a suicide risk assessment is a core competency requirement, many clinical psychiatrists lack the requisite training and skills to appropriately assess for suicide risk. Moreover, the standard of care requires psychiatrists to foresee the possibility that a patient might engage in suicidal behavior, hence to conduct a suicide risk formulation sufficient to guide triage and treatment planning. Based on data collected via a suicide risk assessment, a suicide risk formulation is a process whereby the psychiatrist forms a judgment about a patient's foreseeable risk of suicidal behavior in order to inform triage decisions, safety and treatment planning, and interventions to reduce risk. This paper addresses the components of this process in the context of the model for therapeutic risk management of the suicidal patient developed at the Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education and Clinical Center by Wortzel et al.

  2. Addition of a Hydrological Cycle to the EPIC Jupiter Model

    NASA Astrophysics Data System (ADS)

    Dowling, T. E.; Palotai, C. J.

    2002-09-01

    We present a progress report on the development of the EPIC atmospheric model to include clouds, moist convection, and precipitation. Two major goals are: i) to study the influence that convective water clouds have on Jupiter's jets and vortices, such as those to the northwest of the Great Red Spot, and ii) to predict ammonia-cloud evolution for direct comparison to visual images (instead of relying on surrogates for clouds like potential vorticity). Data structures in the model are now set up to handle the vapor, liquid, and solid phases of the most common chemical species in planetary atmospheres. We have adapted the Prather conservation of second-order moments advection scheme to the model, which yields high accuracy for dealing with cloud edges. In collaboration with computer scientists H. Dietz and T. Mattox at the U. Kentucky, we have built a dedicated 40-node parallel computer that achieves 34 Gflops (double precision) at 74 cents per Mflop, and have updated the EPIC-model code to use cache-aware memory layouts and other modern optimizations. The latest test-case results of cloud evolution in the model will be presented. This research is funded by NASA's Planetary Atmospheres and EPSCoR programs.

  3. Modeling and Managing Risk in Billing Infrastructures

    NASA Astrophysics Data System (ADS)

    Baiardi, Fabrizio; Telmon, Claudio; Sgandurra, Daniele

    This paper discusses risk modeling and risk management in information and communications technology (ICT) systems for which the attack impact distribution is heavy tailed (e.g., power law distribution) and the average risk is unbounded. Systems with these properties include billing infrastructures used to charge customers for services they access. Attacks against billing infrastructures can be classified as peripheral attacks and backbone attacks. The goal of a peripheral attack is to tamper with user bills; a backbone attack seeks to seize control of the billing infrastructure. The probability distribution of the overall impact of an attack on a billing infrastructure also has a heavy-tailed curve. This implies that the probability of a massive impact cannot be ignored and that the average impact may be unbounded - thus, even the most expensive countermeasures would be cost effective. Consequently, the only strategy for managing risk is to increase the resilience of the infrastructure by employing redundant components.
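
The unbounded-average claim can be made concrete with a Pareto-type loss density f(x) = a·x^(-a-1) on x ≥ 1: the mean truncated at T is a/(a-1)·(1 - T^(1-a)) for a ≠ 1, which converges as T grows only when the tail exponent a exceeds 1. The values below are illustrative.

```python
def truncated_mean(a, T):
    """E[X; X <= T] for a Pareto density a * x**(-a-1) on x >= 1, a != 1."""
    return a / (a - 1.0) * (1.0 - T ** (1.0 - a))

light_tail = [truncated_mean(2.0, T) for T in (10, 1000, 100000)]
heavy_tail = [truncated_mean(0.5, T) for T in (10, 1000, 100000)]

print([round(m, 3) for m in light_tail])   # converges toward a/(a-1) = 2
print([round(m, 1) for m in heavy_tail])   # grows without bound with T
```

With a ≤ 1, no truncation level stabilizes the mean, which is why the paper argues that even the most expensive countermeasures can be cost effective and that resilience, not expected-loss minimization, is the right objective.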

  4. Generalized Additive Models, Cubic Splines and Penalized Likelihood.

    DTIC Science & Technology

    1987-05-22

    in case-control studies). All models in the table include a dummy variable to account for the matching. The first 3 lines of the table indicate that OA...Ausoc. Breslow, N. and Day, N. (1980). Statistical Methods in Cancer Research, Volume 1: The Analysis of Case-Control Studies. International Agency

  5. Predicting the Survival Time for Bladder Cancer Using an Additive Hazards Model in Microarray Data

    PubMed Central

    TAPAK, Leili; MAHJUB, Hossein; SADEGHIFAR, Majid; SAIDIJAM, Massoud; POOROLAJAL, Jalal

    2016-01-01

    Background: One substantial part of microarray studies is to predict patients’ survival based on their gene expression profile. Variable selection techniques are powerful tools for handling high dimensionality in the analysis of microarray data. However, these techniques have not been investigated in the competing risks setting. This study aimed to investigate the performance of four sparse variable selection methods in estimating the survival time. Methods: The data included 1381 gene expression measurements and clinical information from 301 patients with bladder cancer operated on in the years 1987 to 2000 in hospitals in Denmark, Sweden, Spain, France, and England. Four methods, the least absolute shrinkage and selection operator (lasso), smoothly clipped absolute deviation, the smooth integration of counting and absolute deviation, and the elastic net, were utilized for simultaneous variable selection and estimation under an additive hazards model. The criteria of area under the ROC curve, Brier score and c-index were used to compare the methods. Results: The median follow-up time for all patients was 47 months. The elastic net approach was found to outperform the other methods. The elastic net had the lowest integrated Brier score (0.137±0.07) and the greatest median over-time AUC and C-index (0.803±0.06 and 0.779±0.13, respectively). Five out of 19 genes selected by the elastic net were significant (P<0.05) under an additive hazards model. It was indicated that the expression of RTN4, SON, IGF1R and CDC20 decreases the survival time, while the expression of SMARCAD1 increases it. Conclusion: The elastic net had a higher capability than the other methods for the prediction of survival time in patients with bladder cancer in the presence of competing risks, based on an additive hazards model. PMID:27114989
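
A hedged sketch of the variable-selection step only (not the additive hazards fit itself): for an orthonormal design, the elastic net estimate has the closed form sign(b)·max(|b| - λ1, 0)/(1 + λ2) applied to the ordinary least-squares coefficients b, which zeroes out weak coefficients. This is how a handful of genes ends up selected from many; all numbers below are invented.

```python
import numpy as np

def elastic_net_orthonormal(b_ols, lam1, lam2):
    """Closed-form elastic net for an orthonormal design matrix:
    soft-threshold by lam1 (lasso part), then shrink by 1 + lam2 (ridge part)."""
    shrunk = np.sign(b_ols) * np.maximum(np.abs(b_ols) - lam1, 0.0)
    return shrunk / (1.0 + lam2)

b_ols = np.array([2.5, -1.8, 0.3, 0.05, -0.1])   # illustrative OLS estimates
b_en = elastic_net_orthonormal(b_ols, lam1=0.5, lam2=1.0)

print(b_en)                       # only the two strong effects survive
print(int(np.sum(b_en != 0)))     # 2 variables selected
```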

  6. Technical Work Plan for: Additional Multiscale Thermohydrologic Modeling

    SciTech Connect

    B. Kirstein

    2006-08-24

    The primary objective of Revision 04 of the MSTHM report is to provide TSPA with revised repository-wide MSTHM analyses that incorporate updated percolation flux distributions, revised hydrologic properties, updated IEDs, and information pertaining to the emplacement of transport, aging, and disposal (TAD) canisters. The updated design information is primarily related to the incorporation of TAD canisters, but also includes updates related to superseded IEDs describing emplacement drift cross-sectional geometry and layout. The intended use of the results of Revision 04 of the MSTHM report, as described in this TWP, is to predict the evolution of TH conditions (temperature, relative humidity, liquid-phase saturation, and liquid-phase flux) at specified locations within emplacement drifts and in the adjoining near-field host rock along all emplacement drifts throughout the repository. This information directly supports the TSPA for the nominal and seismic scenarios. The revised repository-wide analyses are required to incorporate updated parameters and design information and to extend those analyses out to 1,000,000 years. Note that the previous MSTHM analyses reported in Revision 03 of Multiscale Thermohydrologic Model (BSC 2005 [DIRS 173944]) only extend out to 20,000 years. The updated parameters are the percolation flux distributions, including incorporation of post-10,000-year distributions, and updated calibrated hydrologic property values for the host-rock units. The applied calibrated hydrologic properties will be an updated version of those available in Calibrated Properties Model (BSC 2004 [DIRS 169857]). These updated properties will be documented in an Appendix of Revision 03 of UZ Flow Models and Submodels (BSC 2004 [DIRS 169861]). The updated calibrated properties are applied because they represent the latest available information. The reasonableness of applying the updated calibrated properties to the prediction of near-field in-drift TH conditions

  7. Risk management model of winter navigation operations.

    PubMed

    Valdez Banda, Osiris A; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-07-15

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish-Swedish Winter Navigation System. This establishes the requirements and limitations for vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO), adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are most probable, while major oil spills are found to be very unlikely but possible.
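
The kind of marginal probability such a Bayesian network computes can be sketched by chaining conditional probabilities over operation types. All probabilities below are invented for illustration, not taken from the paper.

```python
operations = {
    # operation: (P(operation), P(collision | operation), P(spill | collision))
    "independent navigation": (0.50, 0.020, 0.10),
    "convoy":                 (0.30, 0.015, 0.10),
    "towing":                 (0.20, 0.005, 0.10),
}

# Marginal spill probability: sum over operation types of the chained terms.
p_spill = sum(p_op * p_col * p_sp for p_op, p_col, p_sp in operations.values())

# Contribution of each operation, to identify the riskiest one.
by_op = {op: p_op * p_col * p_sp
         for op, (p_op, p_col, p_sp) in operations.items()}

print(round(p_spill, 5))
print(max(by_op, key=by_op.get))   # operation contributing the most risk
```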

  8. Software reliability: Additional investigations into modeling with replicated experiments

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.

    1984-01-01

    The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.

  9. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight are described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  10. Mathematical modelling of risk reduction in reinsurance

    NASA Astrophysics Data System (ADS)

    Balashov, R. B.; Kryanev, A. V.; Sliva, D. E.

    2017-01-01

    The paper presents a mathematical model of efficient portfolio formation in the reinsurance markets. The presented approach provides the optimal ratio between the expected value of return and the risk of yield values below a certain level. The uncertainty in the return values is conditioned by use of expert evaluations and preliminary calculations, which result in expected return values and the corresponding risk levels. The proposed method allows for implementation of computationally simple schemes and algorithms for numerical calculation of the numerical structure of the efficient portfolios of reinsurance contracts of a given insurance company.
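
The efficient-portfolio idea can be sketched in classical mean-variance form (the paper's exact risk measure, yield falling below a level, is more elaborate): minimize portfolio variance wᵀCw subject to the weights summing to one, whose solution is w = C⁻¹1 / (1ᵀC⁻¹1). The covariance and returns below are invented.

```python
import numpy as np

# Illustrative covariance of the returns of two reinsurance contracts.
C = np.array([[0.04, 0.01],
              [0.01, 0.09]])
mu = np.array([0.08, 0.12])    # illustrative expected returns
ones = np.ones(2)

# Minimum-variance weights: w = C^-1 1 / (1' C^-1 1).
w = np.linalg.solve(C, ones)
w = w / w.sum()

exp_return = float(w @ mu)     # expected return of the efficient portfolio

print(np.round(w, 4))
print(round(exp_return, 4))
```

With more contracts, the same linear-algebra step generalizes directly, which is why such schemes remain computationally simple.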

  11. Landslide risk mapping and modeling in China

    NASA Astrophysics Data System (ADS)

    Li, W.; Hong, Y.

    2015-12-01

    Under circumstances of global climate change, tectonic stress and human activity, landslides are among the most frequent and severely widespread natural hazards on Earth, as demonstrated in the World Atlas of Natural Hazards (McGuire et al., 2004). Every year, landslide activity causes serious economic loss as well as casualties (Róbert et al., 2005). How landslides can be monitored and predicted is an urgent research topic for the international landslide research community. In particular, there is a lack of high-quality, updated landslide risk maps and guidelines that could be employed to better mitigate and prevent landslide disasters in many emerging regions, including China (Hong, 2007). Since the 1950s, landslide events have been recorded in statistical yearbooks, newspapers, and monographs in China. As disasters have become of increasing concern to the government and the public, information about landslide events is becoming available from online news reports (Liu et al., 2012). This study presents multi-scale landslide risk mapping and modeling in China. At the national scale, based on historical data and practical experience, we carry out landslide susceptibility and risk mapping by adopting a statistical approach and pattern recognition methods to construct empirical models. Over the identified landslide hot-spot areas, we further evaluate the slope stability of each individual site (Sidle and Hirotaka, 2006), with the ultimate goal of setting up a space-time, multi-scale coupled system of landslide risk mapping and modeling for landslide hazard monitoring and early warning.

  12. Risk management model in road transport systems

    NASA Astrophysics Data System (ADS)

    Sakhapov, R. L.; Nikolaeva, R. V.; Gatiyatullin, M. H.; Makhmutov, M. M.

    2016-08-01

    The article presents the results of a study of road safety indicators that influence the development and operation of the transport system. Road safety is considered as a continuous process of risk management. The authors constructed a model that relates social risk to a major road safety indicator: the level of motorization. The model gives a fairly accurate assessment of the level of social risk for any given level of motorization. The authors calculated the dependence of socio-economic costs on the number of accidents and the people injured in them. The applicability of the concept of socio-economic damage is justified by the presence of a linear relationship between the natural and economic indicators of damage from accidents. The optimization of social risk reduces to finding the extremum of an objective function that characterizes the economic effect of implementing measures to improve safety. The calculations make it possible to maximize the net present value, depending on the costs of improving road safety, taking into account the socio-economic damage caused by accidents. The proposed econometric models make it possible to quantify the efficiency of the transportation system and allow simulation of changes in road safety indicators.

  13. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
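
The risk equation implicit in the abstract, expected loss per zone = hazard (probability of occurrence) × vulnerability × value of elements at risk, can be sketched for a few hypothetical map zones. The zones and numbers are illustrative only.

```python
zones = [
    # (zone, P(landslide in period), vulnerability 0-1, exposed value in EUR)
    ("A", 0.30, 0.6, 2_000_000),
    ("B", 0.05, 0.9, 10_000_000),
    ("C", 0.60, 0.1, 500_000),
]

# Expected monetary loss per zone over the period.
expected_loss = {z: p * v * val for z, p, v, val in zones}

# Zone where mitigation effort would be most cost effective to target.
priority = max(expected_loss, key=expected_loss.get)

print(expected_loss)
print(priority)
```

Note how the highest-hazard zone (C) is not the priority: the low-probability, high-exposure zone (B) dominates once value and vulnerability enter the product, which is the point of risk zoning over pure hazard zoning.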

  14. Additional Developments in Atmosphere Revitalization Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.; Gomez, Carlos

    2013-01-01

    NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.


  16. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball.
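The paper's Bayesian network and Crystal Ball implementation are not reproduced here, but the core idea of combining the three strands can be sketched by Monte Carlo sampling: overall success requires clearing manufacture, clinical effectiveness, and financial return together. The probabilities below are invented for illustration.

```python
# Hedged sketch of the idea only: combine manufacture, clinical and
# financial success into one probability of overall success by Monte
# Carlo sampling. Probabilities are illustrative, not fitted values.
import random

random.seed(1)

def simulate_success(p_manufacture, p_clinical, p_financial, n=100_000):
    """Fraction of simulated compounds that clear all three strands."""
    wins = 0
    for _ in range(n):
        ok = (random.random() < p_manufacture and
              random.random() < p_clinical and
              random.random() < p_financial)
        wins += ok
    return wins / n

p = simulate_success(0.8, 0.3, 0.6)
print(round(p, 3))  # close to 0.8 * 0.3 * 0.6 = 0.144
```

With independent strands this reduces to a simple product; the value of the network formulation in the paper is that it can encode dependencies between the strands, which a plain product cannot.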

  17. Malignancy Risk Models for Oral Lesions

    PubMed Central

    Zarate, Ana M.; Brezzo, María M.; Secchi, Dante G.; Barra, José L.

    2013-01-01

    Objectives: The aim of this work was to assess risk habits, clinical and cellular phenotypes and TP53 DNA changes in oral mucosa samples from patients with Oral Potentially Malignant Disorders (OPMD), in order to create models that enable genotypic and phenotypic patterns to be obtained that determine the risk of lesions becoming malignant. Study Design: Clinical phenotypes, family history of cancer and risk habits were collected in clinical histories. TP53 gene mutation and morphometric-morphological features were studied, and multivariate models were applied. Three groups were established: a) oral cancer (OC) group (n=10), b) OPMD group (n=10), and c) control group (n=8). Results: An average of 50% of patients with malignancy were found to have smoking and drinking habits. A high percentage of TP53 mutations were observed in OC (30%) and OPMD (average 20%) lesions (p=0.000). The majority of these mutations were GC → TA transversion mutations (60%). However, patients with OC presented mutations in all the exons and introns studied. The highest diagnostic accuracy (p=0.0001) was observed when incorporating the alcohol and tobacco habit variables with TP53 mutations. Conclusions: Our results prove to be statistically reliable, with parameter estimates that are nearly unbiased even for small sample sizes. Models 2 and 3 were the most accurate for assessing the risk of an OPMD becoming cancerous. However, in a public health context, model 3 is the most recommended because the characteristics considered are easier and less costly to evaluate. Key words: TP53, oral potentially malignant disorders, risk factors, genotype, phenotype. PMID:23722122

  18. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF-Smad, and WNT signaling pathways.
We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  19. [Risk hidden in the small print? : Some food additives may trigger pseudoallergic reactions].

    PubMed

    Zuberbier, Torsten; Hengstenberg, Claudine

    2016-06-01

    Some food additives may trigger pseudoallergic reactions. However, despite the increasing number of food additives, the prevalence of such overreactions in the general population is rather low. The most common triggers of pseudoallergic reactions to food are naturally occurring ingredients. Symptoms in patients with chronic urticaria, however, should improve significantly on a pseudoallergen-free diet. In addition, some studies indicate that certain food additives may also have an impact on the symptoms of patients with neurodermatitis and asthma.

  20. Cryptosporidium Infection Risk: Results of New Dose-Response Modeling.

    PubMed

    Messner, Michael J; Berger, Philip

    2016-10-01

    Cryptosporidium human dose-response data from seven species/isolates are used to investigate six models of varying complexity that estimate infection probability as a function of dose. Previous models attempt to explicitly account for virulence differences among C. parvum isolates, using three or six species/isolates. Four (two new) models assume species/isolate differences are insignificant and three of these (all but the exponential) allow for variable human susceptibility. These three human-focused models (fractional Poisson, exponential with immunity and beta-Poisson) are relatively simple yet fit the data significantly better than the more complex isolate-focused models. Among these three, the one-parameter fractional Poisson model is the simplest but assumes that all Cryptosporidium oocysts used in the studies were capable of initiating infection. The exponential with immunity model does not require such an assumption and includes the fractional Poisson as a special case. The fractional Poisson model is an upper bound of the exponential with immunity model and applies when all oocysts are capable of initiating infection. The beta-Poisson model does not allow an immune human subpopulation; thus infection probability approaches 100% as dose becomes huge. All three of these models predict significantly (>10x) greater risk at the low doses that consumers might receive if exposed through drinking water or other environmental exposure (e.g., 72% vs. 4% infection probability for a one-oocyst dose) than previously predicted. This new insight into Cryptosporidium risk suggests additional inactivation and removal via treatment may be needed to meet any specified risk target, such as a suggested 10^-4 annual risk of Cryptosporidium infection.
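The three human-focused dose-response forms named above have standard closed-form expressions, sketched below. The parameter values are purely illustrative, not the fitted values from the paper; the point is the structural difference, especially the plateau of the fractional Poisson model at the susceptible fraction.

```python
# Sketch of three dose-response model forms (illustrative parameters).
import math

def exponential(dose, r):
    """Classic exponential model: every oocyst has hit probability r."""
    return 1.0 - math.exp(-r * dose)

def fractional_poisson(dose, frac, mu):
    """A fraction `frac` of people is susceptible; the rest are never
    infected, so probability plateaus at `frac` rather than at 1."""
    return frac * (1.0 - math.exp(-dose / mu))

def beta_poisson(dose, alpha, beta):
    """Approximate beta-Poisson: susceptibility varies continuously, but
    infection probability still approaches 1 at huge doses."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

dose = 1.0  # a single oocyst
low_exp = exponential(dose, r=0.04)
low_fp = fractional_poisson(dose, frac=0.74, mu=1.0)
print(round(low_exp, 3), round(low_fp, 3))
```

Even with these made-up parameters, the low-dose behaviour differs by an order of magnitude between the two forms, which is the kind of divergence the abstract highlights for drinking-water exposures.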

  1. Improving the predictive accuracy of hurricane power outage forecasts using generalized additive models.

    PubMed

    Han, Seung-Ryong; Guikema, Seth D; Quiring, Steven M

    2009-10-01

    Electric power is a critical infrastructure service after hurricanes, and rapid restoration of electric power is important in order to minimize losses in the impacted areas. However, rapid restoration of electric power after a hurricane depends on obtaining the necessary resources, primarily repair crews and materials, before the hurricane makes landfall and then appropriately deploying these resources as soon as possible after the hurricane. This, in turn, depends on having sound estimates of both the overall severity of the storm and the relative risk of power outages in different areas. Past studies have developed statistical, regression-based approaches for estimating the number of power outages in advance of an approaching hurricane. However, these approaches have either not been applicable for future events or have had lower predictive accuracy than desired. This article shows that a different type of regression model, a generalized additive model (GAM), can outperform the types of models used previously. This is done by developing and validating a GAM based on power outage data during past hurricanes in the Gulf Coast region and comparing the results from this model to the previously used generalized linear models.
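The defining feature of a GAM is that each predictor contributes its own smooth partial effect, fitted iteratively by backfitting. The toy below uses a crude bin-average smoother instead of the penalized splines a real GAM package would use, and synthetic data in place of actual hurricane outage records; it is a sketch of the additive structure only, not the authors' model.

```python
# Minimal backfitting sketch of the additive-model idea behind a GAM.
import math, random

random.seed(0)
n = 400
wind  = [random.uniform(20, 60) for _ in range(n)]  # hypothetical gust speed
trees = [random.uniform(0, 1) for _ in range(n)]    # hypothetical tree density
y = [0.05 * w + 3.0 * math.sin(3 * t) + random.gauss(0, 0.1)
     for w, t in zip(wind, trees)]

def bin_smooth(x, r, bins=10):
    """Crude smoother: average the partial residuals r within bins of x."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    idx = [min(int((xi - lo) / width), bins - 1) for xi in x]
    sums, counts = [0.0] * bins, [0] * bins
    for i, ri in zip(idx, r):
        sums[i] += ri
        counts[i] += 1
    means = [s / c if c else 0.0 for s, c in zip(sums, counts)]
    return [means[i] for i in idx]

mean_y = sum(y) / n
f1 = [0.0] * n
f2 = [0.0] * n
for _ in range(20):  # backfit: smooth each partial residual in turn
    f1 = bin_smooth(wind,  [yi - mean_y - b for yi, b in zip(y, f2)])
    f1 = [v - sum(f1) / n for v in f1]   # centre for identifiability
    f2 = bin_smooth(trees, [yi - mean_y - a for yi, a in zip(y, f1)])
    f2 = [v - sum(f2) / n for v in f2]

fitted = [mean_y + a + b for a, b in zip(f1, f2)]
sse = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
sst = sum((yi - mean_y) ** 2 for yi in y)
print("R^2 =", round(1 - sse / sst, 3))
```

The nonlinear tree-density effect is captured without being specified in advance, which is the practical advantage the abstract claims for GAMs over the generalized linear models used in earlier outage studies.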

  2. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwind effects.

  3. Risk modelling for vaccination: a risk assessment perspective.

    PubMed

    Wooldridge, M

    2007-01-01

    Any risk assessment involves a number of steps. First, the risk manager, in close liaison with the risk assessor, should identify the question of interest. Then, the hazards associated with each risk question should be identified. Only then can the risks themselves be assessed. Several questions may reasonably be asked about the risk associated with avian influenza vaccines and their use. Some apply to any vaccine, while others are specific to avian influenza. Risks may occur during manufacture and during use. Some concern the vaccines themselves, while others address the effect of failure on disease control. The hazards associated with each risk question are then identified. These may be technical errors in design, development or production, such as contamination or failure to inactivate appropriately. They may relate to the biological properties of the pathogens themselves displayed during manufacture or use, for example, reversion to virulence, shedding, or not being the right strain for the subsequent challenge. Following a consideration of risks and hazards, the information needed and an outline of the steps necessary to assess the risk are summarized for an illustrative risk question, using as an example the risks associated with the use of vaccines in the field. A brief consideration of the differences between qualitative and quantitative risk assessments is also included, and the potential effects of uncertainty and variability on the results are discussed.

  4. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.
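The hazard-to-loss chain such a model runs can be sketched as a toy Monte Carlo loop: simulate years of seismicity, pass each event's severity through a damage function, and summarise the resulting annual losses. The rates, magnitude range, and damage curve below are invented placeholders, not the model's calibrated components.

```python
# Toy Monte Carlo event-loss simulation (all numbers hypothetical).
import random

random.seed(42)

def damage_fraction(magnitude):
    """Hypothetical vulnerability curve: no damage below M5, then rising."""
    return max(0.0, min(1.0, (magnitude - 5.0) / 3.0))

def simulate_year(rate=2.0, exposed_value=1_000_000):
    """One simulated year: random event count, severity, and loss."""
    loss = 0.0
    n_events = sum(1 for _ in range(100) if random.random() < rate / 100)
    for _ in range(n_events):
        m = random.uniform(4.0, 7.5)  # stand-in for a recurrence model
        loss += damage_fraction(m) * exposed_value
    return loss

years = [simulate_year() for _ in range(5000)]
aal = sum(years) / len(years)  # average annual loss
print(round(aal))
```

A real model would replace the uniform magnitude draw with the zone-specific recurrence relations from the unified catalogue, and the single damage curve with the fitted set described in the abstract; the loop structure, however, is the same.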

  5. Human Plague Risk: Spatial-Temporal Models

    NASA Technical Reports Server (NTRS)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks in the human population. Using Earth observations from satellite remote sensing, there has been a systematic analysis and mapping of the close coupling between the vectors of the disease and climate variability. The overall result is that the incidence of plague is correlated with the positive phase of the El Niño/Southern Oscillation (ENSO).

  6. Risk-driven security testing using risk analysis with threat modeling approach.

    PubMed

    Palanivel, Maragathavalli; Selvadurai, Kanmani

    2014-01-01

    Security testing is a process of determining risks present in the system states and protecting them from vulnerabilities. However, security testing does not give due importance to threat modeling and risk analysis simultaneously, which affects the confidentiality and integrity of the system. Risk analysis includes identification, evaluation and assessment of risks. The threat modeling approach identifies threats associated with the system. Risk-driven security testing uses risk analysis results in test case identification, selection and assessment to prioritize and optimize the testing process. The threat modeling approach STRIDE is generally used to identify both technical and non-technical threats present in the system. Thus, a security testing mechanism based on risk analysis results using the STRIDE approach has been proposed for identifying high-risk states. Risk metrics considered for testing include risk impact, risk possibility and risk threshold. The risk threshold value is directly proportional to risk impact and risk possibility. Risk-driven security testing results in a reduced test suite, which in turn reduces test case selection time. Risk analysis optimizes the test case selection and execution process. For experimentation, the system models LMS, ATM, OBS, OSS and MTRS are considered. The performance of the proposed system is analyzed using Test Suite Reduction Rate (TSRR) and FSM coverage. TSRR varies from 13.16 to 21.43% whereas FSM coverage is achieved up to 91.49%. The results show that the proposed method, combining risk analysis with threat modeling, identifies states with high risks and improves testing efficiency.
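The prioritisation step described above can be sketched as scoring each system state by risk impact times risk possibility and keeping only states whose score clears a threshold, which is what shrinks the test suite. The states and scores below are hypothetical, and the paper's STRIDE threat analysis is not reproduced.

```python
# Hedged sketch of risk-driven test selection (hypothetical states).

states = {
    "login":        (0.9, 0.7),  # (impact, possibility), both in 0..1
    "logout":       (0.2, 0.9),
    "transfer":     (1.0, 0.4),
    "view_balance": (0.3, 0.2),
    "admin_panel":  (0.8, 0.6),
}

threshold = 0.30
risk = {s: impact * poss for s, (impact, poss) in states.items()}
selected = sorted(s for s, r in risk.items() if r >= threshold)
tsrr = 100 * (1 - len(selected) / len(states))  # Test Suite Reduction Rate

print(selected, f"TSRR={tsrr:.1f}%")
```

Here three of five states clear the threshold, so the reduced suite covers the high-risk states while dropping 40% of the test cases, mirroring the TSRR metric reported in the abstract.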

  7. Analysis of Time to Event Outcomes in Randomized Controlled Trials by Generalized Additive Models

    PubMed Central

    Argyropoulos, Christos; Unruh, Mark L.

    2015-01-01

    Background Randomized Controlled Trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. Methods By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks, or even differences in restricted mean survival time, between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO, a long-duration study conducted under evolving standards of care on a heterogeneous patient population. Findings PGAM can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, PGAMs supported not only unadjusted (overall treatment effect) but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. Conclusions By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial
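The data-splitting step behind the Poisson GAM approach can be illustrated directly: each subject's follow-up is cut at a grid of time nodes (the paper uses Gauss-Lobatto quadrature nodes; a plain grid is used here for simplicity), producing one pseudo-observation per interval with a person-time exposure offset and a 0/1 event indicator. A Poisson regression on this layout approximates the survival likelihood.

```python
# Sketch of follow-up splitting for the Poisson survival trick.

nodes = [0.0, 1.0, 2.0, 3.0, 4.0]  # simple grid standing in for quadrature nodes

def split_followup(time, event):
    """Turn one (time, event) record into Poisson pseudo-observations."""
    rows = []
    for lo, hi in zip(nodes, nodes[1:]):
        if time <= lo:
            break  # subject already exited before this interval
        exposure = min(time, hi) - lo          # person-time in the interval
        died_here = 1 if (event and lo < time <= hi) else 0
        rows.append({"start": lo, "exposure": exposure, "event": died_here})
    return rows

# One subject observed to fail at t = 2.5:
rows = split_followup(time=2.5, event=1)
for r in rows:
    print(r)
```

The exposures sum to the original follow-up time and exactly one pseudo-row carries the event, which is what makes the Poisson likelihood on this expanded data equivalent to a piecewise-constant-hazard survival likelihood.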

  8. Graphical models and Bayesian domains in risk modelling: application in microbiological risk assessment.

    PubMed

    Greiner, Matthias; Smid, Joost; Havelaar, Arie H; Müller-Graf, Christine

    2013-05-15

    Quantitative microbiological risk assessment (QMRA) models are used to reflect knowledge about complex real-world scenarios for the propagation of microbiological hazards along the feed and food chain. The aim is to provide insight into interdependencies among model parameters, typically with an interest in characterising the effect of risk mitigation measures. A particular requirement is to achieve clarity about the reliability of conclusions from the model in the presence of uncertainty. To this end, Monte Carlo (MC) simulation modelling has become a standard in so-called probabilistic risk assessment. In this paper, we elaborate on the application of Bayesian computational statistics in the context of QMRA. It is useful to explore the analogy between MC modelling and Bayesian inference (BI). This pertains in particular to the procedures for deriving prior distributions for model parameters. We illustrate, using a simple example, that the inability to cope with feedback among model parameters is a major limitation of MC modelling. However, BI models can be easily integrated into MC modelling to overcome this limitation. We refer to a BI submodel integrated into an MC model as a "Bayes domain". We also demonstrate that an entire QMRA model can be formulated as a Bayesian graphical model (BGM) and discuss the advantages of this approach. Finally, we show example graphs of MC, BI and BGM models, highlighting the similarities among the three approaches.
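One way to picture a "Bayes domain" inside an MC risk model: instead of plugging a fixed prevalence into the simulation, each iteration draws prevalence from its Bayesian posterior (here a conjugate Beta posterior from hypothetical survey data), so parameter uncertainty propagates into the risk estimate. The dose model and dose-response constant are likewise invented placeholders.

```python
# Sketch: a conjugate Bayesian submodel embedded in a Monte Carlo loop.
import math
import random

random.seed(7)

# Hypothetical survey: 12 contaminated samples out of 100, Beta(1,1) prior.
alpha, beta = 1 + 12, 1 + 88

def risk_iteration():
    prevalence = random.betavariate(alpha, beta)  # the Bayes domain
    dose = random.lognormvariate(0, 1)            # stand-in exposure model
    p_ill_given_exposed = 1 - math.exp(-0.1 * dose)
    return prevalence * p_ill_given_exposed

draws = [risk_iteration() for _ in range(20000)]
mean_risk = sum(draws) / len(draws)
print(round(mean_risk, 4))
```

The output distribution now reflects both the variability simulated by the MC loop and the posterior uncertainty about prevalence, which a fixed point estimate would hide.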

  9. Analysis and Modeling of soil hydrology under different soil additives in artificial runoff plots

    NASA Astrophysics Data System (ADS)

    Ruidisch, M.; Arnhold, S.; Kettering, J.; Huwe, B.; Kuzyakov, Y.; Ok, Y.; Tenhunen, J. D.

    2009-12-01

    The impact of monsoon events during June and July in the Korean project region, the Haean Basin, located in the northeastern part of South Korea, plays a key role in erosion, leaching and groundwater pollution risk from agrochemicals. The project therefore investigates the main hydrological processes in agricultural soils under field and laboratory conditions on different scales (plot, hillslope and catchment). Soil hydrological parameters were analysed depending on different soil additives, which are known to prevent soil erosion and nutrient loss as well as to increase water infiltration, aggregate stability and soil fertility. Hence, synthetic water-soluble polyacrylamides (PAM), biochar (black carbon mixed with organic fertilizer), and a combination of both PAM and biochar were applied in runoff plots at three agricultural field sites. Additionally, a control subplot was set up without any additives. The field sites were selected in areas with similar hillslope gradients and with emphasis on the dominant land management form of dryland farming in Haean, which is characterised by row planting and row covering with foil. Hydrological parameters such as saturated water conductivity, matrix potential and water content were analysed by infiltration experiments, continuous tensiometer measurements, time domain reflectometry, and pressure plates, to identify characteristic water retention curves for each horizon. Weather data were observed by three weather stations next to the runoff plots. The measured data also provide the input data for modeling water transport in the unsaturated zone of the runoff plots with HYDRUS 1D/2D/3D and SWAT (Soil & Water Assessment Tool).

  10. Accelerometry-based gait analysis, an additional objective approach to screen subjects at risk for falling.

    PubMed

    Senden, R; Savelberg, H H C M; Grimm, B; Heyligers, I C; Meijer, K

    2012-06-01

    This study investigated whether the Tinetti scale, as a subjective measure of fall risk, is associated with objectively measured gait characteristics. It was studied whether gait parameters differ between groups stratified for fall risk using the Tinetti scale. Moreover, the discriminative power of gait parameters to classify elderly subjects according to the Tinetti scale was investigated. Gait of 50 elderly subjects with a Tinetti score > 24 and 50 elderly subjects with a Tinetti score ≤ 24 was analyzed using acceleration-based gait analysis. Validated algorithms were used to derive spatio-temporal gait parameters, harmonic ratio, inter-stride amplitude variability and root mean square (RMS) from the accelerometer data. Clear differences in gait were found between the groups. All gait parameters correlated with the Tinetti scale (r-range: 0.20-0.73). Only walking speed, step length and RMS showed moderate to strong correlations and high discriminative power to classify elderly subjects according to the Tinetti scale. It is concluded that subtle gait changes that have previously been related to fall risk are not captured by the subjective assessment. It is therefore worthwhile to include objective gait assessment in fall risk screening.
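Two of the accelerometer-derived measures named above can be sketched on a synthetic vertical-acceleration trace: the RMS (overall movement intensity) and the inter-stride amplitude variability (how much peak amplitude differs between strides). The signal below is synthetic, and the validated algorithms from the paper are not reproduced.

```python
# Sketch of RMS and inter-stride variability on a synthetic gait signal.
import math

fs = 100       # Hz, hypothetical sampling rate
stride = fs    # one stride per second in this toy signal
# Sinusoidal "vertical acceleration" whose amplitude drifts stride to stride:
signal = [math.sin(2 * math.pi * i / stride) * (1.0 + 0.05 * (i // stride))
          for i in range(5 * stride)]

# Root mean square of the whole trace.
rms = math.sqrt(sum(a * a for a in signal) / len(signal))

# Peak amplitude of each stride, then its coefficient of variation (%).
peaks = [max(signal[k * stride:(k + 1) * stride]) for k in range(5)]
mean_peak = sum(peaks) / 5
sd_peak = math.sqrt(sum((p - mean_peak) ** 2 for p in peaks) / 5)
variability = 100 * sd_peak / mean_peak
print(round(rms, 3), round(variability, 2))
```

On real data the stride boundaries would come from a step-detection algorithm rather than a fixed window, but the summary statistics are computed the same way.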

  11. USING DOSE ADDITION TO ESTIMATE CUMULATIVE RISKS FROM EXPOSURES TO MULTIPLE CHEMICALS

    EPA Science Inventory

    The Food Quality Protection Act (FQPA) of 1996 requires the EPA to consider the cumulative risk from exposure to multiple chemicals that have a common mechanism of toxicity. Three methods, hazard index (HI), point-of-departure index (PODI), and toxicity equivalence factor (TEF), ...

  12. Polymorphism FXII 46C>T and cardiovascular risk: additional data from Spanish and Tunisian patients

    PubMed Central

    Athanasiadis, Georgios; Esteban, Esther; Vidal, Magdanela Gayà; Torres, Robert Carreras; Bahri, Raoudha; Moral, Pedro

    2009-01-01

    Background Previous studies showed an association between the Coagulation Factor XII 46C>T polymorphism and variation in FXII plasma levels, as 46C>T seems to affect translation efficiency. Case-control studies in Spanish samples indicated that genotype T/T is an independent risk factor for venous thrombosis, ischemic stroke and acute coronary artery disease. In this study, we tried to reaffirm the importance of 46C>T in two samples from Spain and Tunisia. Findings A Transmission Disequilibrium Test (TDT) based on 101 family trios from Barcelona with one offspring affected by ischemic heart disease, and a classical case-control study based on 76 patients with IHD and 118 healthy individuals from North and Centre-South Tunisia, were conducted. Subjects were genotyped for 46C>T and data were analyzed accordingly, revealing no association in either sample (TDT: P = 0.16, relative risk 1.17; case-control study: P = 0.59, odds ratio 1.36). Conclusion The results suggest that 46C>T is not a risk factor for ischemic heart disease in either of the two analyzed samples, and therefore the polymorphism does not seem to be a universal risk factor for cardiovascular diseases. PMID:19646235
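The two designs used above reduce to simple statistics: the TDT compares transmissions versus non-transmissions of the risk allele from heterozygous parents with McNemar's chi-square, and the case-control arm uses a 2x2 odds ratio. The counts below are hypothetical, since the paper's raw transmission and genotype tables are not reproduced here.

```python
# Sketch of the TDT statistic and a 2x2 odds ratio (hypothetical counts).

def tdt_chi2(b, c):
    """McNemar-type TDT statistic: b = T-allele transmissions from
    heterozygous parents, c = non-transmissions."""
    return (b - c) ** 2 / (b + c)

def odds_ratio(cases_exposed, cases_unexposed, ctrl_exposed, ctrl_unexposed):
    """Cross-product ratio of a 2x2 case-control table."""
    return (cases_exposed * ctrl_unexposed) / (cases_unexposed * ctrl_exposed)

chi2 = tdt_chi2(b=55, c=46)          # hypothetical transmissions
or_tt = odds_ratio(14, 62, 16, 102)  # hypothetical T/T carriage table
print(round(chi2, 3), round(or_tt, 2))
```

A chi-square this small (well under the 3.84 threshold for one degree of freedom) is the kind of null result the abstract reports for both samples.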

  13. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents.
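The Bayesian model averaging idea above can be sketched directly: maintain a posterior over candidate rationality models of the opponent and update it as their decisions are observed, then predict the next action by averaging the models. The three models and their action probabilities below are invented for illustration.

```python
# Sketch of Bayesian model averaging over opponent rationality models.

# Each candidate model assigns probabilities to the opponent's actions.
models = {
    "random":  {"A": 1 / 3, "B": 1 / 3, "C": 1 / 3},
    "nash":    {"A": 0.7, "B": 0.2, "C": 0.1},
    "level_k": {"A": 0.1, "B": 0.8, "C": 0.1},
}
posterior = {m: 1 / 3 for m in models}  # uniform prior over models

observed = ["B", "B", "A", "B"]         # hypothetical observed decisions
for action in observed:
    # Bayes update: weight each model by how well it predicted the action.
    posterior = {m: posterior[m] * models[m][action] for m in models}
    z = sum(posterior.values())
    posterior = {m: p / z for m, p in posterior.items()}

# Model-averaged prediction for the opponent's next action:
predictive = {a: sum(posterior[m] * models[m][a] for m in models)
              for a in ("A", "B", "C")}
best_model = max(posterior, key=posterior.get)
print(best_model, {m: round(p, 3) for m, p in posterior.items()})
```

The posterior probabilities play exactly the role the abstract describes: a running measure of each rationality model's validity given the opponent's observed behaviour.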

  14. Metal-Polycyclic Aromatic Hydrocarbon Mixture Toxicity in Hyalella azteca. 1. Response Surfaces and Isoboles To Measure Non-additive Mixture Toxicity and Ecological Risk.

    PubMed

    Gauthier, Patrick T; Norwood, Warren P; Prepas, Ellie E; Pyle, Greg G

    2015-10-06

    Mixtures of metals and polycyclic aromatic hydrocarbons (PAHs) occur ubiquitously in aquatic environments, yet relatively little is known regarding their potential to produce non-additive toxicity (i.e., antagonism or potentiation). A review of the lethality of metal-PAH mixtures in aquatic biota revealed that more-than-additive lethality is as common as strictly additive effects. Approaches to ecological risk assessment do not consider non-additive toxicity of metal-PAH mixtures. Forty-eight-hour water-only binary mixture toxicity experiments were conducted to determine the additive toxic nature of mixtures of Cu, Cd, V, or Ni with phenanthrene (PHE) or phenanthrenequinone (PHQ) using the aquatic amphipod Hyalella azteca. In cases where more-than-additive toxicity was observed, we calculated the possible mortality rates at Canada's environmental water quality guideline concentrations. We used a three-dimensional response surface isobole model-based approach to compare the observed co-toxicity in juvenile amphipods to predicted outcomes based on concentration addition or effects addition mixtures models. More-than-additive lethality was observed for all Cu-PHE, Cu-PHQ, and several Cd-PHE, Cd-PHQ, and Ni-PHE mixtures. Our analysis predicts Cu-PHE, Cu-PHQ, Cd-PHE, and Cd-PHQ mixtures at the Canadian Water Quality Guideline concentrations would produce 7.5%, 3.7%, 4.4% and 1.4% mortality, respectively.
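The concentration-addition baseline against which more-than-additive toxicity is judged has a simple form: express each mixture component in toxic units (its concentration divided by its own EC50) and sum them; a toxic-unit sum of 1 predicts the mixture's EC50-level effect. The EC50s and concentrations below are hypothetical, not the paper's fitted values.

```python
# Sketch of the toxic-unit / concentration-addition baseline.

def toxic_units(concentrations, ec50s):
    """Sum of C_i / EC50_i across mixture components."""
    return sum(c / e for c, e in zip(concentrations, ec50s))

# Hypothetical Cu-phenanthrene mixture (ug/L) against hypothetical EC50s.
tu = toxic_units([5.0, 40.0], [20.0, 80.0])  # 0.25 + 0.5 = 0.75
print(tu)
```

Under concentration addition, a toxic-unit sum of 0.75 predicts a sub-EC50 effect; observing near-EC50 mortality at that sum would therefore indicate more-than-additive (synergistic) toxicity of the kind the study reports for several metal-PAH pairs.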

  15. Risk Factors for Additional Surgery after Iatrogenic Perforations due to Endoscopic Submucosal Dissection

    PubMed Central

    Kim, Gi Jun; Ji, Jeong Seon; Kim, Byung Wook; Choi, Hwang

    2017-01-01

    Objectives. Endoscopic resection (ER) is commonly performed to treat gastric epithelial neoplasms and subepithelial tumors. The aim of this study was to predict the risk factors for surgery after ER-induced perforation. Methods. We retrospectively reviewed the data on patients who received gastric endoscopic submucosal dissection (ESD) or endoscopic mucosal resection (EMR) between January 2010 and March 2015. Patients who were confirmed to have perforation were classified into surgery and nonsurgery groups. We aimed to determine the risk factors for surgery in patients who developed iatrogenic gastric perforations. Results. A total of 1183 patients underwent ER. Perforation occurred in 69 (5.8%) patients, and 9 patients (0.8%) required surgery to manage the perforation. In univariate analysis, anterior location of the lesion, a subepithelial lesion, two or more postprocedure pain killers within 24 hrs, and increased heart rate within 24 hrs after the procedure were the factors related to surgery. In logistic regression analysis, the location of the lesion at the anterior wall and using two or more postprocedure pain killers within 24 hrs were risk factors for surgery. Conclusion. Most cases of perforations after ER can be managed conservatively. When a patient requires two or more postprocedure pain killers within 24 hrs and the lesion is located on the anterior wall, early surgery should be considered instead of conservative management. PMID:28316622

  16. Modeling Situation Awareness and Crash Risk

    PubMed Central

    Fisher, Donald L.; Strayer, David. L.

    2014-01-01

    In this article we develop a model of the relationship between crash risk and a driver’s situation awareness. We consider a driver’s situation awareness to reflect the dynamic mental model of the driving environment and to be dependent upon several psychological processes including Scanning the driving environment, Predicting and anticipating hazards, Identifying potential hazards in the driving scene as they occur, Deciding on an action, and Executing an appropriate Response (SPIDER). Together, SPIDER is important for establishing and maintaining good situation awareness of the driving environment and good situation awareness is important for coordinating and scheduling the SPIDER-relevant processes necessary for safe driving. An Order-of-Processing (OP) model makes explicit the SPIDER-relevant processes and how they predict the likelihood of a crash when the driver is or is not distracted by a secondary task. For example, the OP model shows how a small decrease in the likelihood of any particular SPIDER activity being completed successfully (because of a concurrent secondary task performance) would lead to a large increase in the relative risk of a crash. PMID:24776225
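The abstract's central quantitative point, that a small per-process decrement produces a large increase in relative crash risk, can be illustrated with a toy serial-success version of the OP model. The probabilities are illustrative only; the actual OP model structure in the paper is richer than a simple product of independent successes.

```python
# Toy version of the SPIDER / crash-risk point (illustrative numbers).

def crash_prob(p_success_each, n_processes=5):
    """P(crash) = 1 - P(all n SPIDER processes succeed), assuming the
    five processes succeed independently with equal probability."""
    return 1 - p_success_each ** n_processes

baseline = crash_prob(0.99)    # attentive driver
distracted = crash_prob(0.97)  # small per-process decrement under load
relative_risk = distracted / baseline
print(round(baseline, 4), round(distracted, 4), round(relative_risk, 2))
```

Dropping each process's success probability by just two percentage points roughly triples the crash probability in this toy, which is the qualitative behaviour the OP model formalises.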

  17. Use of generalised additive models to categorise continuous variables in clinical prediction

    PubMed Central

    2013-01-01

    Background In medical practice many, essentially continuous, clinical parameters tend to be categorised by physicians for ease of decision-making. Indeed, categorisation is a common practice both in medical research and in the development of clinical prediction rules, particularly where the ensuing models are to be applied in daily clinical practice to support clinicians in the decision-making process. Since the number of categories into which a continuous predictor must be categorised depends partly on the relationship between the predictor and the outcome, the need for more than two categories must be borne in mind. Methods We propose a categorisation methodology for clinical-prediction models, using Generalised Additive Models (GAMs) with P-spline smoothers to determine the relationship between the continuous predictor and the outcome. The proposed method consists of creating at least one average-risk category along with high- and low-risk categories based on the GAM smooth function. We applied this methodology to a prospective cohort of patients with exacerbated chronic obstructive pulmonary disease. The predictors selected were respiratory rate and partial pressure of carbon dioxide in the blood (PCO2), and the response variable was poor evolution. An additive logistic regression model was used to show the relationship between the covariates and the dichotomous response variable. The proposed categorisation was compared with the continuous predictor, taken as the reference option, using the AIC and AUC evaluation parameters. The sample was divided into derivation (60%) and validation (40%) samples. The first was used to obtain the cut points, while the second was used to validate the proposed methodology. Results The three-category proposal for the respiratory rate was ≤ 20;(20,24];> 24, for which the following values were obtained: AIC=314.5 and AUC=0.638. The respective values for the continuous predictor were AIC=317.1 and AUC=0.634, with no statistically
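    The cut-point construction described above can be sketched as follows. This is a hypothetical illustration rather than the authors' code: a synthetic monotone function stands in for the fitted GAM logit smooth, and the half-width of the average-risk band is an assumed tuning constant.

```python
import numpy as np

# Hypothetical sketch of the categorisation step: given the GAM smooth f(x)
# on the logit scale (a synthetic monotone function stands in for the fitted
# P-spline smoother here), carve the predictor range into low-, average- and
# high-risk categories around the average smooth value.

def categorise(x, smooth, band=0.5):
    """Return the cut points where the smooth leaves an average-risk band."""
    centre = smooth.mean()
    lo = x[smooth < centre - band]
    hi = x[smooth > centre + band]
    cut_low = lo.max() if lo.size else None    # upper edge of the low-risk region
    cut_high = hi.min() if hi.size else None   # lower edge of the high-risk region
    return cut_low, cut_high

x = np.linspace(10, 40, 301)       # e.g. respiratory rate (breaths/min)
f = 0.15 * (x - 22)                # synthetic smooth on the logit scale
low_cut, high_cut = categorise(x, f)
print(low_cut, high_cut)           # three categories: <=low, (low, high], >high
```

    With a real fit, f would come from the additive logistic model with P-spline smoothers, and the derivation sample would supply the cut points that the validation sample then checks.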

  18. A flexible count data regression model for risk analysis.

    PubMed

    Guikema, Seth D; Goffelt, Jeremy P

    2008-02-01

    In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLMs) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM can fit overdispersed data sets as well as the commonly used existing models while outperforming them on underdispersed data sets.
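    A minimal sketch of the COM distribution at the heart of the proposed GLM. The weights are evaluated in log space to avoid overflow of the factorial term, and the truncation point ymax is an assumption of this sketch, not something taken from the paper.

```python
import math

# Hedged sketch of the Conway-Maxwell Poisson (COM) distribution underlying
# the proposed GLM: P(Y = y) is proportional to lam**y / (y!)**nu.
# nu = 1 recovers the Poisson; nu > 1 gives underdispersion (variance < mean),
# nu < 1 overdispersion (variance > mean).  ymax is an assumed truncation point.

def com_poisson_pmf(lam, nu, ymax=200):
    # evaluate unnormalized weights in log space to avoid overflow of (y!)**nu
    logw = [y * math.log(lam) - nu * math.lgamma(y + 1) for y in range(ymax + 1)]
    mx = max(logw)
    w = [math.exp(v - mx) for v in logw]
    z = sum(w)
    return [wi / z for wi in w]

def mean_var(pmf):
    m = sum(y * p for y, p in enumerate(pmf))
    v = sum((y - m) ** 2 * p for y, p in enumerate(pmf))
    return m, v

for nu in (0.5, 1.0, 2.0):
    m, v = mean_var(com_poisson_pmf(5.0, nu))
    print(f"nu={nu}: mean={m:.2f} variance={v:.2f} var/mean={v/m:.2f}")
```

    In the full GLM, the rate parameter is linked to the explanatory variables through a regression equation, which is why a single family can cover both dispersion regimes.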

  19. Modeling risk of pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Nowak, J. Joshua; Lukacs, Paul M.; Anderson, Neil J.; Ramsey, Jennifer M.; Gude, Justin A.; Krausman, Paul R.

    2015-01-01

    Pneumonia epizootics are a major challenge for management of bighorn sheep (Ovis canadensis) affecting persistence of herds, satisfaction of stakeholders, and allocations of resources by management agencies. Risk factors associated with the disease are poorly understood, making pneumonia epizootics hard to predict; such epizootics are thus managed reactively rather than proactively. We developed a model for herds in Montana that identifies risk factors and addresses biological questions about risk. Using Bayesian logistic regression with repeated measures, we found that private land, weed control using domestic sheep or goats, pneumonia history, and herd density were positively associated with risk of pneumonia epizootics in 43 herds that experienced 22 epizootics out of 637 herd-years from 1979–2013. We defined an area of high risk for pathogen exposure as the area of each herd distribution plus a 14.5-km buffer from that boundary. Within this area, the odds of a pneumonia epizootic increased by >1.5 times per additional unit of private land (unit is the standardized % of private land, where the global mean = 25.58% and SD = 14.53%). Odds were >3.3 times greater if domestic sheep or goats were used for weed control in a herd's area of high risk. If a herd or its neighbors within the area of high risk had a history of a pneumonia epizootic, odds of a subsequent pneumonia epizootic were >10 times greater. Risk greatly increased when herds were at high density, with nearly 15 times greater odds of a pneumonia epizootic compared to when herds were at low density. Odds of a pneumonia epizootic also appeared to decrease following increased spring precipitation (odds = 0.41 per unit increase; global mean = 100.18% and SD = 26.97%). Risk was not associated with number of federal sheep and goat allotments, proximity to nearest herds of bighorn sheep, ratio of rams to ewes, percentage of average winter precipitation, or whether herds were of native versus mixed
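    The per-unit odds ratios quoted above compound multiplicatively on the odds scale, which a small worked example makes concrete. The "two units above average" scenario is hypothetical; only the odds-ratio values come from the abstract.

```python
import math

# Worked arithmetic with the odds ratios quoted in the abstract; the
# "two standardized units above average" scenario is hypothetical and only
# illustrates how per-unit odds ratios compound multiplicatively.

def compounded_odds_ratio(or_per_unit, units):
    return or_per_unit ** units

# >1.5 times the odds per standardized unit of private land, so a herd two
# units above average has its odds of an epizootic multiplied by:
print(compounded_odds_ratio(1.5, 2))

# the protective spring-precipitation effect (odds = 0.41 per unit increase)
# corresponds to a negative logistic-regression coefficient:
print(round(math.log(0.41), 3))
```

    The same conversion (coefficient = log of the odds ratio) links each reported odds ratio back to the underlying Bayesian logistic regression.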

  20. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2009-01-01

    The development of techniques devised for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often culminating in anaphylactic shock in a subset of allergic babies. Children with FA, even if subjected to preventative diets, continually face the risk of developing allergic manifestations after unintentional intake of a non-tolerated food in restaurant settings, with relatives or schoolmates, etc., where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can happen deliberately, by either substantially altering the food ingredients or by genetic manipulation that changes the composition or transfers allergens, or unintentionally, by quality-control failures due to contamination in the production process or to genetic mismanipulation. There is controversy between multinationals, often favored by governments, and resistant consumer associations, so an equidistant analysis faces some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen expressed in widely used infant formulas, paving the way to an often-reported multinational debacle. Genetic engineering poses novel ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. This paper emphasizes the practical concepts most crucial for pediatricians.

  1. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2006-01-01

    The development of techniques devised for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often culminating in anaphylactic shock in a subset of allergic babies. Children with FA, even if subjected to preventative diets, continually face the risk of developing allergic manifestations after unintentional intake of a non-tolerated food in restaurant settings, with relatives or schoolmates, etc., where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can happen deliberately, by either substantially altering the food ingredients or by genetic manipulation that changes the composition or transfers allergens, or unintentionally, by quality-control failures due to contamination in the production process or to genetic mismanipulation. There is controversy between multinationals, often favored by governments, and resistant consumer associations, so an equidistant analysis faces some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen expressed in widely used infant formulas, paving the way to an often-reported multinational debacle. Genetic engineering poses novel ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. This paper emphasizes the practical concepts most crucial for pediatricians.

  2. Fungal colonization - an additional risk factor for diseased dogs and cats?

    PubMed

    Biegańska, Małgorzata; Dardzińska, Weronika; Dworecka-Kaszak, Bożena

    2014-01-01

    This mini-review surveys the literature on opportunistic mycoses in pet dogs and cats suffering from other concurrent diseases, comparable to human medical disorders with a high risk of secondary mycoses. It also presents the preliminary results of a project aimed at understanding fungal colonization and the occurrence of secondary mycoses in pets suffering from metabolic disorders, neoplasms and viral infections. The incidence of opportunistic mycoses is higher in such individuals, mostly because of their impaired immunity. The main risk factors are primary and secondary types of immunodeficiency connected with anti-cancer treatment or the neoplastic disease itself. Moreover, literature data and the results of our investigations show that Candida yeasts are prevalent among diabetic animals and indicate that these fungi are the main etiological agents of secondary infections of the oral cavity, GI and urogenital tracts. Other important conditions possibly favoring the development of mycoses are concurrent infections of cats with FeLV and FIV viruses. Thus, in all cases of the mentioned underlying diseases, animals should be carefully monitored by repeated mycological examination, together with inspection of other parameters. The prophylaxis of opportunistic mycoses should also be carefully considered, as should other factors influencing the prognosis and the outcome of the primary disease.

  3. An animal model of differential genetic risk for methamphetamine intake

    PubMed Central

    Phillips, Tamara J.; Shabani, Shkelzen

    2015-01-01

    The question of whether genetic factors contribute to risk for methamphetamine (MA) use and dependence has not been intensively investigated. Compared to human populations, genetic animal models offer the advantages of control over genetic family history and drug exposure. Using selective breeding, we created lines of mice that differ in genetic risk for voluntary MA intake and identified the chromosomal addresses of contributory genes. A quantitative trait locus was identified on chromosome 10 that accounts for more than 50% of the genetic variance in MA intake in the selected mouse lines. In addition, behavioral and physiological screening identified differences corresponding with risk for MA intake that have generated hypotheses that are testable in humans. Heightened sensitivity to aversive and certain physiological effects of MA, such as MA-induced reduction in body temperature, are hallmarks of mice bred for low MA intake. Furthermore, unlike MA-avoiding mice, MA-preferring mice are sensitive to rewarding and reinforcing MA effects, and to MA-induced increases in brain extracellular dopamine levels. Gene expression analyses implicate the importance of a network enriched in transcription factor genes, some of which regulate the mu opioid receptor gene, Oprm1, in risk for MA use. Neuroimmune factors appear to play a role in differential response to MA between the mice bred for high and low intake. In addition, chromosome 10 candidate gene studies provide strong support for a trace amine-associated receptor 1 gene, Taar1, polymorphism in risk for MA intake. MA is a trace amine-associated receptor 1 (TAAR1) agonist, and a non-functional Taar1 allele segregates with high MA consumption. Thus, reduced TAAR1 function has the potential to increase risk for MA use. Overall, existing findings support the MA drinking lines as a powerful model for identifying genetic factors involved in determining risk for harmful MA use. Future directions include the development of a

  4. A Risk Assessment Model for Type 2 Diabetes in Chinese

    PubMed Central

    Luo, Senlin; Han, Longfei; Zeng, Ping; Chen, Feng; Pan, Limin; Wang, Shu; Zhang, Tiemei

    2014-01-01

    Aims To develop a risk assessment model for persons at risk of type 2 diabetes in Chinese populations. Materials and Methods The model was generated from cross-sectional data on 16246 persons aged 20 years and over. The C4.5 algorithm and multivariate logistic regression were used for variable selection. A relative risk value combined with expert decision was used to construct a comprehensive risk assessment for evaluating the individual risk category. The validity of the model was tested by cross-validation and by a survey performed six years later with some participants. Results Nine variables were selected as risk variables. A mathematical model was established to calculate the average probability of diabetes in each cluster's group divided by sex and age. A series of criteria combined with the relative risk value (2.2) and the level of risk variables stratified individuals into four risk groups (non, low, medium and high risk). The overall accuracy reached 90.99% evaluated by cross-validation inside the model population. The incidence of diabetes for each risk group increased from 1.5 (non-risk group) to 28.2 (high-risk group) per one thousand persons per year over six years of follow-up. Discussion The model could determine an individual's risk for type 2 diabetes on four risk degrees. It could be used as a technical tool not only to support screening persons at different risk, but also to evaluate the results of intervention. PMID:25101994

  5. Social models of HIV risk among young adults in Lesotho.

    PubMed

    Bulled, Nicola L

    2015-01-01

    Extensive research over the past 30 years has revealed that individual and social determinants impact HIV risk. Even so, prevention efforts focus primarily on individual behaviour change, with little recognition of the dynamic interplay of individual and social environment factors that further exacerbate risk engagement. Drawing on long-term research with young adults in Lesotho, I examine how social environment factors contribute to HIV risk. During preliminary ethnographic analysis, I developed novel scales to measure social control, adoption of modernity, and HIV knowledge. In survey research, I examined the effects of individual characteristics (i.e., socioeconomic status, HIV knowledge, adoption of modernity) and social environment (i.e., social control) on HIV risk behaviours. In addition, I measured the impact of altered environments by taking advantage of an existing situation whereby young adults attending a national college are assigned to either a main campus in a metropolitan setting or a satellite campus in a remote setting, irrespective of the environment in which they were socialised as youth. This arbitrary assignment process generates four distinct groups of young adults with altered or constant environments. Regression models show that lower levels of perceived social control and greater adoption of modernity are associated with HIV risk, controlling for other factors. The impact of social control and modernity varies with environment dynamics.

  6. A Risk Management Model for the Federal Acquisition Process.

    DTIC Science & Technology

    1999-06-01

    risk management in the acquisition process. This research explains the Federal Acquisition Process and each of the 78 tasks to be completed by the CO...and examines the concepts of risk and risk management . This research culminates in the development of a model that identifies prevalent risks in the...contracting professionals is used to gather opinions, ideas, and practical applications of risk management in the acquisition process, and refine the model

  7. Integrating Professional and Folk Models of HIV Risk: YMSM's Perceptions of High-Risk Sex

    ERIC Educational Resources Information Center

    Kubicek, Katrina; Carpineto, Julie; McDavitt, Bryce; Weiss, George; Iverson, Ellen F.; Au, Chi-Wai; Kerrone, Dustin; Martinez, Miguel; Kipke, Michele D.

    2008-01-01

    Risks associated with HIV are well documented in research literature. Although a great deal has been written about high-risk sex, little research has been conducted to examine how young men who have sex with men (YMSM) perceive and define high-risk sexual behavior. In this study, we compare the "professional" and "folk" models of HIV risk based on…

  8. The globalization of risk and risk perception: why we need a new model of risk communication for vaccines.

    PubMed

    Larson, Heidi; Brocard Paterson, Pauline; Erondu, Ngozi

    2012-11-01

    Risk communication about vaccines is complex, and the nature of risk perception is changing, with perceptions converging, evolving and having impacts well beyond specific geographic localities and points in time, especially when amplified through the Internet and other modes of global communication. This article examines the globalization of risk perceptions and their impacts, including the example of measles and the globalization of measles, mumps and rubella (MMR) vaccine risk perceptions, and calls for a new, more holistic model of risk assessment, risk communication and risk mitigation, embedded in an ongoing process of risk management for vaccines and immunization programmes. It envisions risk communication as an ongoing process that includes trust-building strategies hand-in-hand with the operational and policy strategies needed to mitigate and manage vaccine-related risks, as well as perceptions of risk.

  9. Personalized Predictive Modeling and Risk Factor Identification using Patient Similarity.

    PubMed

    Ng, Kenney; Sun, Jimeng; Hu, Jianying; Wang, Fei

    2015-01-01

    Personalized predictive models are customized for an individual patient and trained using information from similar patients. Compared to global models trained on all patients, they have the potential to produce more accurate risk scores and capture more relevant risk factors for individual patients. This paper presents an approach for building personalized predictive models and generating personalized risk factor profiles. A locally supervised metric learning (LSML) similarity measure is trained for diabetes onset and used to find clinically similar patients. Personalized risk profiles are created by analyzing the parameters of the trained personalized logistic regression models. A 15,000 patient data set, derived from electronic health records, is used to evaluate the approach. The predictive results show that the personalized models can outperform the global model. Cluster analysis of the risk profiles shows groups of patients with similar risk factors, differences in the top risk factors across groups of patients, and differences between the individual and global risk factors.
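    The "train on similar patients" idea can be sketched in a few lines. Everything here is a simplifying assumption: plain Euclidean distance stands in for the learned LSML metric, a neighbourhood outcome average stands in for the personalized logistic model, and the data are synthetic.

```python
import numpy as np

# Simplified sketch of the "train on similar patients" idea.  Assumptions:
# plain Euclidean distance replaces the learned LSML metric, a neighbourhood
# outcome average replaces the personalized logistic regression, and the
# patient data are synthetic.

def personalized_risk(x_new, X, y, k=50):
    """Estimate risk for x_new from the outcomes of its k most similar patients."""
    d = np.linalg.norm(X - x_new, axis=1)
    neighbours = np.argsort(d)[:k]
    return float(y[neighbours].mean())

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                    # 5 synthetic risk factors
y = (X[:, 0] + rng.normal(scale=0.5, size=1000) > 1.0).astype(float)  # onset labels
print(personalized_risk(np.array([2.0, 0, 0, 0, 0]), X, y))   # high-risk profile
print(personalized_risk(np.array([-2.0, 0, 0, 0, 0]), X, y))  # low-risk profile
```

    In the paper, the neighbourhood instead supplies the training set for a per-patient logistic regression, whose coefficients yield the personalized risk factor profile.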

  10. Concentration addition-based approach for aquatic risk assessment of realistic pesticide mixtures in Portuguese river basins.

    PubMed

    Silva, Emília; Cerejeira, Maria José

    2015-05-01

    A two-tiered outline for the predictive environmental risk assessment of chemical mixtures, with effect assessments based on concentration addition (CA) approaches as the first tier and consideration of independent action (IA) as the second tier, was applied to realistic pesticide mixtures measured in surface waters from 2002 to 2008 within three important Portuguese river basins ('Mondego', 'Sado' and 'Tejo'). The CA-based risk quotients, based on acute data and an assessment factor of 100, exceeded 1 in more than 39% of the 281 samples, indicating a potential risk for the aquatic environment, namely to algae. Seven herbicide compounds and three insecticides were the most toxic compounds in the pesticide mixtures and provided at least 50% of the mixture's toxicity in almost 100% of the samples with risk quotients based on the sum of toxic units (RQSTU) above 1. In eight samples, the maximum cumulative ratio (MCR) and Junghans ratio values indicated that a chemical-by-chemical approach underestimated the toxicity of the pesticide mixtures, and CA predicted higher mixture toxicity than IA. From a risk management perspective, the results indicated that directing appropriate programmes of measures at the limited number of pesticides contributing most to total mixture toxicity would also yield relevant benefits for mixture impact.
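    The first-tier screen reduces to simple arithmetic, sketched below. The concentrations and EC50 values are hypothetical; only the sum-of-toxic-units form and the assessment factor of 100 follow the abstract.

```python
# Hedged sketch of the first-tier concentration-addition screen: the risk
# quotient based on the sum of toxic units, RQ_STU = AF * sum_i(c_i / EC50_i).
# Concentrations and EC50 values below are hypothetical; the assessment
# factor of 100 follows the abstract.

def risk_quotient_stu(concentrations, ec50s, assessment_factor=100):
    toxic_units = [c / e for c, e in zip(concentrations, ec50s)]
    return assessment_factor * sum(toxic_units), toxic_units

conc = [0.12, 0.05, 0.30]     # measured concentrations (ug/L), hypothetical
ec50 = [40.0, 2.5, 150.0]     # acute algal EC50s (ug/L), hypothetical
rq, tu = risk_quotient_stu(conc, ec50)
print(f"RQ_STU = {rq:.2f}")            # > 1 flags a potential aquatic risk
share = max(tu) / sum(tu)              # contribution of the most toxic compound
print(f"dominant compound share: {share:.0%}")
```

    Here a single compound supplies most of the mixture toxicity, the situation the abstract reports for nearly all samples with RQSTU above 1, and the basis for targeting measures at a few dominant pesticides.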

  11. Response to selection in finite locus models with non-additive effects.

    PubMed

    Esfandyari, Hadi; Henryon, Mark; Berg, Peer; Thomasen, Jorn Rind; Bijma, Piter; Sørensen, Anders Christian

    2017-01-12

    Under the finite-locus model in the absence of mutation, the additive genetic variation is expected to decrease when directional selection acts on a population, according to quantitative-genetic theory. However, some theoretical studies of selection suggest that the level of additive variance can be sustained or even increased when non-additive genetic effects are present. We tested the hypothesis that finite-locus models with both additive and non-additive genetic effects maintain more additive genetic variance (V_A) and realize larger medium-to-long-term genetic gains than models with only additive effects when the trait under selection is subject to truncation selection. Four genetic models that included additive, dominance, and additive-by-additive epistatic effects were simulated. The simulated genome consisted of 25 chromosomes, each with a length of 1 Morgan. One hundred bi-allelic QTL, four on each chromosome, were considered. In each generation, 100 sires and 100 dams were mated, producing five progeny per mating. The population was selected for a single trait (h^2 = 0.1) for 100 discrete generations with selection on phenotype or BLUP-EBV. V_A decreased with directional truncation selection even in the presence of non-additive genetic effects. Non-additive effects influenced the long-term response to selection, and among the genetic models, additive gene action gave the highest response to selection. In addition, in all genetic models, BLUP-EBV resulted in a greater fixation of favourable and unfavourable alleles and a higher response than phenotypic selection. In conclusion, for the schemes we simulated, the presence of non-additive genetic effects had little effect on changes in additive variance, and V_A decreased under directional selection.
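    The quantity tracked in the study, V_A, can be illustrated with a stripped-down simulation. This is an additive-only sketch, not the authors' design (which also includes dominance, epistasis, BLUP-EBV selection, linked loci on 25 chromosomes and 100 generations); the population sizes, generation count and seed are assumptions.

```python
import numpy as np

# Minimal additive-only finite-locus sketch (the study also simulates
# dominance, epistasis and BLUP-EBV selection; population sizes, generation
# count and the seed here are assumptions).  It tracks
# V_A = sum_i 2 p_i (1 - p_i) a_i^2 under truncation selection on phenotype.

rng = np.random.default_rng(1)
n_loci, n_ind, h2 = 100, 500, 0.1
a = rng.normal(0.0, 1.0, n_loci)                  # additive effect per locus
geno = rng.binomial(2, 0.5, (n_ind, n_loci))      # allele counts 0/1/2

def additive_variance(geno, a):
    p = geno.mean(axis=0) / 2                     # current allele frequencies
    return float(np.sum(2 * p * (1 - p) * a ** 2))

va0 = additive_variance(geno, a)
ve = va0 * (1 - h2) / h2                          # environmental variance from h2
for _ in range(30):
    pheno = geno @ a + rng.normal(0.0, np.sqrt(ve), n_ind)
    parents = geno[np.argsort(pheno)[-100:]]      # truncation selection: best 20%
    sires = parents[rng.integers(0, 100, n_ind)]  # random mating with replacement
    dams = parents[rng.integers(0, 100, n_ind)]
    geno = rng.binomial(1, sires / 2) + rng.binomial(1, dams / 2)

print(f"V_A generation 0: {va0:.2f} -> generation 30: {additive_variance(geno, a):.2f}")
```

    Selection and drift push allele frequencies toward fixation, shrinking the 2p(1-p) terms, which is the mechanism behind the abstract's conclusion that V_A decreases under directional selection.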

  12. Assessing patients' risk of febrile neutropenia: is there a correlation between physician-assessed risk and model-predicted risk?

    PubMed

    Lyman, Gary H; Dale, David C; Legg, Jason C; Abella, Esteban; Morrow, Phuong Khanh; Whittaker, Sadie; Crawford, Jeffrey

    2015-08-01

    This study evaluated the correlation between the risk of febrile neutropenia (FN) estimated by physicians and the risk of severe neutropenia or FN predicted by a validated multivariate model in patients with nonmyeloid malignancies receiving chemotherapy. Before patient enrollment, physician and site characteristics were recorded, and physicians self-reported the FN risk at which they would typically consider granulocyte colony-stimulating factor (G-CSF) primary prophylaxis (FN risk intervention threshold). For each patient, physicians electronically recorded their estimated FN risk, orders for G-CSF primary prophylaxis (yes/no), and patient characteristics for model predictions. Correlations between physician-assessed FN risk and model-predicted risk (primary endpoints) and between physician-assessed FN risk and G-CSF orders were calculated. Overall, 124 community-based oncologists registered; 944 patients initiating chemotherapy with intermediate FN risk enrolled. Median physician-assessed FN risk over all chemotherapy cycles was 20.0%, and median model-predicted risk was 17.9%; the correlation was 0.249 (95% CI, 0.179-0.316). The correlation between physician-assessed FN risk and subsequent orders for G-CSF primary prophylaxis (n = 634) was 0.313 (95% CI, 0.135-0.472). Among patients with a physician-assessed FN risk ≥ 20%, 14% did not receive G-CSF orders. G-CSF was not ordered for 16% of patients at or above their physician's self-reported FN risk intervention threshold (median, 20.0%) and was ordered for 21% below the threshold. Physician-assessed FN risk and model-predicted risk correlated weakly; however, there was moderate correlation between physician-assessed FN risk and orders for G-CSF primary prophylaxis. Further research and education on FN risk factors and appropriate G-CSF use are needed.

  13. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  14. Urban Drainage Modeling and Flood Risk Management

    NASA Astrophysics Data System (ADS)

    Schmitt, Theo G.; Thomas, Martin

    The European research project RisUrSim (Σ!2255), carried out in the EUREKA framework by a consortium of industrial mathematics and water engineering research institutes, municipal drainage works and an insurance company, had as its overall objective the development of a simulation to allow flood risk analysis and cost-effective management for urban drainage systems. In view of the regulatory background of European Standard EN 752, the phenomenon of urban flooding caused by surcharged sewer systems in urban drainage systems is analyzed, leading to the necessity of dual drainage modeling. A detailed dual drainage simulation model is described, based upon hydraulic flow routing procedures for surface flow and pipe flow. Special consideration is given to the interaction between surface and sewer flow in order to most accurately compute water levels above ground as a basis for further assessment of possible damage costs. The model application is presented for a small case study in terms of data needs, model verification, and first simulation results.

  15. Insulin resistance: an additional risk factor in the pathogenesis of cardiovascular disease in type 2 diabetes.

    PubMed

    Patel, Tushar P; Rawal, Komal; Bagchi, Ashim K; Akolkar, Gauri; Bernardes, Nathalia; Dias, Danielle da Silva; Gupta, Sarita; Singal, Pawan K

    2016-01-01

    Sedentary lifestyles and high-calorie dietary habits are prominent causes of metabolic syndrome in the modern world. Obesity plays a central role in the occurrence of conditions such as hyperinsulinemia, hyperglycemia and hyperlipidemia, which lead to insulin resistance and metabolic derangements, including cardiovascular diseases (CVDs) mediated by oxidative stress. The mortality rate due to CVDs is on the rise in developing countries. Insulin resistance (IR) leads to micro- or macroangiopathy, peripheral arterial dysfunction, hampered blood flow, hypertension, and cardiomyocyte and endothelial cell dysfunctions, thus increasing the risk factors for coronary artery blockage, stroke and heart failure, suggesting a strong association between IR and CVDs. The plausible linkages between these two pathophysiological conditions are altered levels of insulin signaling proteins such as IR-β, IRS-1, PI3K, Akt, Glut4 and PGC-1α that hamper insulin-mediated glucose uptake as well as other functions of insulin in the cardiomyocytes and the endothelial cells of the heart. Reduced AMPK and PFK-2, elevated levels of NADP(H)-dependent oxidases produced by activated M1 macrophages of the adipose tissue, and elevated levels of circulating angiotensin are also causes of CVD in the diabetes mellitus condition. Insulin sensitizers, angiotensin blockers and superoxide scavengers are used as therapeutics in the amelioration of CVD. It is evidently important to unravel the mechanisms of the association between IR and CVDs in order to formulate novel, efficient drugs to treat patients suffering from insulin resistance-mediated cardiovascular diseases. The possible associations between insulin resistance and cardiovascular diseases are reviewed here.

  16. Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity

    DTIC Science & Technology

    2015-05-20

    Rockafellar, R. Tyrrell; Royset, Johannes O. Measures of residual risk are developed as an extension of measures of risk. They view a random variable of interest in concert with an auxiliary random...forecasting and generalized regression. We establish the fundamental properties in this framework and show that measures of residual risk along with

  17. A poultry-processing model for quantitative microbiological risk assessment.

    PubMed

    Nauta, Maarten; van der Fels-Klerx, Ine; Havelaar, Arie

    2005-02-01

    A poultry-processing model for a quantitative microbiological risk assessment (QMRA) of campylobacter is presented, which can also be applied to other QMRAs involving poultry processing. The same basic model is applied in each consecutive stage of industrial processing. It describes the effects of inactivation and removal of the bacteria, and the dynamics of cross-contamination in terms of the transfer of campylobacter from the intestines to the carcass surface and the environment, from the carcasses to the environment, and from the environment to the carcasses. From the model it can be derived that, in general, the effect of inactivation and removal is dominant for carcasses with high initial bacterial loads, and cross-contamination is dominant for those with low initial levels. In other QMRA poultry-processing models, the input-output relationship between the numbers of bacteria on the carcasses is usually assumed to be linear on a logarithmic scale. By including some basic mechanistics, it is shown that this may not be realistic. As nonlinear behavior may affect the predicted effects of risk mitigations, this finding is relevant for risk management. Good knowledge of the variability of bacterial loads on poultry entering the process is important. The common practice in microbiology of presenting only the geometric mean of bacterial counts is insufficient: arithmetic means are more suitable, in particular to describe the effect of cross-contamination. The effects of logistic slaughter (scheduled processing) as a risk mitigation strategy are predicted to be small. Some additional complications in applying microbiological data obtained in processing plants are discussed.
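    The basic stage model can be sketched as a discrete transfer/removal update applied carcass by carcass. All transfer and removal fractions and the example loads below are hypothetical, chosen only to show the qualitative point.

```python
# Hedged sketch of the basic stage model described above, applied carcass by
# carcass within one processing stage.  All transfer/removal fractions and
# the example loads are hypothetical.

def process_stage(carcass_loads, p_keep=0.1, p_shed=0.3, p_pickup=0.05, env=0.0):
    """Return post-stage carcass loads and the final environmental level."""
    out = []
    for n in carcass_loads:
        to_env = p_shed * n              # transfer: carcass -> environment
        kept = p_keep * (n - to_env)     # inactivation / removal of the rest
        gained = p_pickup * env          # transfer: environment -> carcass
        env = env + to_env - gained
        out.append(kept + gained)
    return out, env

# a high-load carcass followed by two low-load carcasses:
loads, env = process_stage([1e6, 1e2, 1e0])
print([f"{x:.3g}" for x in loads])
```

    The high-load carcass leaves the stage with over an order of magnitude fewer bacteria (removal dominates), while the low-load carcasses leave with far more than they carried in (cross-contamination dominates), so the log-scale input-output relation is not linear.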

  18. Re-sequencing of the APOL1-APOL4 and MYH9 gene regions in African Americans does not identify additional risks for CKD progression

    PubMed Central

    Hawkins, Gregory A.; Friedman, David J.; Lu, Lingyi; McWilliams, David R.; Chou, Jeff W.; Sajuthi, Satria; Divers, Jasmin; Parekh, Rulan; Li, Man; Genovese, Giulio; Pollak, Martin R.; Hicks, Pamela J.; Bowden, Donald W.; Ma, Lijun; Freedman, Barry I.; Langefeld, Carl D.

    2015-01-01

    Background: APOL1 G1 and G2 nephropathy risk variants are associated with non-diabetic end-stage kidney disease (ESKD) in African Americans (AAs) in an autosomal recessive pattern. Additional risk and protective genetic variants may be present near the APOL1 loci, since earlier-age ESKD is observed in some AAs with one APOL1 renal-risk variant and because the adjacent gene MYH9 is associated with nephropathy in populations lacking G1 and G2 variants. Methods: Re-sequencing was performed across a ~275 kb region encompassing the APOL1-APOL4 and MYH9 genes in 154 AA cases with non-diabetic ESKD and 38 controls without nephropathy who were heterozygous for a single APOL1 G1 or G2 risk variant. Results: Sequencing identified 3246 non-coding single nucleotide polymorphisms (SNPs), 55 coding SNPs, and 246 insertion/deletions (InDels). No new coding variations were identified. Eleven variants, including a rare APOL3 Gln58Ter null variant (rs11089781), were genotyped in a replication panel of 1571 AA ESKD cases and 1334 controls. After adjusting for APOL1 G1 and G2 risk effects, these variations were not significantly associated with ESKD. In subjects with <2 APOL1 G1 and/or G2 alleles (849 cases; 1139 controls), the APOL3 null variant was nominally associated with ESKD (recessive model, OR 1.81; p=0.026); however, analysis in 807 AA cases and 634 controls from the Family Investigation of Nephropathy and Diabetes (FIND) did not replicate this association. Conclusion: Additional common variants in the APOL1-APOL4-MYH9 region do not contribute significantly to ESKD risk beyond the APOL1 G1 and G2 alleles. PMID:26343748

  19. Risk assessment compatible fire models (RACFMs)

    SciTech Connect

    Lopez, A.R.; Gritzo, L.A.; Sherman, M.P.

    1998-07-01

    A suite of Probabilistic Risk Assessment Compatible Fire Models (RACFMs) has been developed to represent the hazard posed by a pool fire to weapon systems transported on the B-52H aircraft. These models represent both stand-off scenarios (i.e., the weapon system is outside the flame zone but exposed to the radiant heat load from the fire) and fully-engulfing scenarios (i.e., the object is fully covered by flames). The approach taken in developing the RACFMs for both scenarios was to consolidate, reconcile, and apply data and knowledge from all available resources, including data and correlations from the literature, data from an extensive full-scale fire test program at the Naval Air Warfare Center (NAWC) at China Lake, and results from a fire field model (VULCAN). In the past, a single effective temperature, T_f, was used to represent the fire, and the heat flux to an object exposed to the fire was estimated using the relationship for black-body radiation, σT_f^4. Significant improvements have been made by the present approach, which accounts for the presence of temperature distributions in fully-engulfing fires and uses the best available correlations to estimate heat fluxes in stand-off scenarios.
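    The black-body baseline mentioned in the abstract is a one-line calculation; the sketch below uses an arbitrary effective fire temperature of 1100 K purely for illustration.

    ```python
    # Black-body relation sigma * T_f**4, the older single-temperature
    # heat-flux estimate the abstract refers to. 1100 K is an arbitrary
    # illustration value, not a value from the report.
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def radiant_flux(t_fire_k):
        """Black-body radiant heat flux (W/m^2) at temperature t_fire_k (K)."""
        return SIGMA * t_fire_k ** 4

    print(f"{radiant_flux(1100.0) / 1000:.1f} kW/m^2")  # about 83 kW/m^2
    ```

    The T^4 dependence is what makes the single-temperature simplification so sensitive: small errors in the effective temperature produce large errors in the estimated flux, motivating the distribution-based approach the abstract describes.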

  20. A "CLIPER" Risk Model for Insured Losses From US Hurricane Landfalls and the Need for an Open-Source Risk Model

    NASA Astrophysics Data System (ADS)

    Murnane, R. J.

    2003-12-01

    To plan for the consequences of hurricanes, earthquakes, and other natural hazards the public and private sectors use a variety of risk models. Model output is tailored for specific users and includes a range of parameters including: damage to structures, insured losses, and estimates of shelter requirements to care for people displaced by the catastrophe. Extensive efforts are made to tune risk models to past events. However, model "forecasts" of losses are rarely verified through a comparison with new events. Instead, new events generally are used to further tune a new version of the model. In addition, there has been no public attempt to determine which model has the most predictive skill, in part because there is no agreed upon reference forecast, and in part because most risk models are proprietary. Here I describe a simple risk model that can be used to provide deterministic and probabilistic exceedance probabilities for insured losses caused by hurricanes striking the US coastline. I propose that loss estimates based on the approach used in this simple model can be used as a reference forecast for assessing the skill of more complex commercial models. I also suggest that an effort be initiated to promote the development of an open-source risk model. The simple risk model combines wind speed exceedance probabilities estimated using the historical record of maximum sustained winds for hurricanes at landfall, and a set of normalized insured losses produced by landfalling hurricanes. The approach is analogous to weather, or climate, forecasts based on a combination of CLImatology and PERsistence (CLIPER). The climatological component accounts for low frequency variability in weather due to factors such as seasonality. The analog to climatology in the simple risk model is the historical record of hurricane wind speeds and insured losses. The insured losses have been corrected for the effects of inflation, population growth, wealth, and other factors.
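    The climatology half of the CLIPER-style reference forecast described above reduces to empirical exceedance probabilities computed from a historical record of normalized insured losses. In the sketch below the loss values (in $bn) are invented purely for illustration.

    ```python
    # Empirical exceedance probability from a historical loss record:
    # P(loss >= threshold) = fraction of past events at or above threshold.
    # The "historical" losses here are made-up illustration data.
    def exceedance_prob(losses, threshold):
        """Empirical P(loss >= threshold) from the historical record."""
        return sum(1 for x in losses if x >= threshold) / len(losses)

    historical = [0.2, 0.5, 1.1, 2.0, 3.5, 5.0, 8.2, 12.0, 21.0, 40.0]
    print(exceedance_prob(historical, 10.0))  # 3 of 10 events exceed -> 0.3
    ```

    A reference forecast of this kind plays the same role as climatology in weather verification: a more complex commercial model earns skill only by beating it.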

  1. Adolescent mental health and academic functioning: empirical support for contrasting models of risk and vulnerability.

    PubMed

    Lucier-Greer, Mallory; O'Neal, Catherine W; Arnold, A Laura; Mancini, Jay A; Wickrama, Kandauda K A S

    2014-11-01

    Adolescents in military families contend with normative stressors that are universal and exist across social contexts (minority status, family disruptions, and social isolation) as well as stressors reflective of their military life context (e.g., parental deployment, school transitions, and living outside the United States). This study utilizes a social ecological perspective and a stress process lens to examine the relationship between multiple risk factors and relevant indicators of youth well-being, namely depressive symptoms and academic performance, as well as the mediating role of self-efficacy (N = 1,036). Three risk models were tested: an additive effects model (each risk factor uniquely influences outcomes), a full cumulative effects model (the collection of risk factors influences outcomes), and a comparative model (a cumulative effects model exploring the differential effects of normative and military-related risks). This design allowed for the simultaneous examination of multiple risk factors and a comparison of alternative perspectives on measuring risk. Each model was predictive of depressive symptoms and academic performance through persistence; however, each model provides unique findings about the relationship between risk factors and youth outcomes. Discussion is provided pertinent to service providers and researchers on how risk is conceptualized, along with suggestions for identifying at-risk youth.

  2. Using additive modelling to quantify the effect of chemicals on phytoplankton diversity and biomass.

    PubMed

    Viaene, K P J; De Laender, F; Van den Brink, P J; Janssen, C R

    2013-04-01

    Environmental authorities require the protection of biodiversity and other ecosystem properties such as biomass production. However, the endpoints listed in available ecotoxicological datasets generally do not contain these two ecosystem descriptors. Inferring the effects of chemicals on such descriptors from micro- or mesocosm experiments is often hampered by inherent differences in the initial biodiversity levels between experimental units or by delayed community responses. Here we introduce additive modelling to establish the effects of a chronic application of the herbicide linuron on 10 biodiversity indices and phytoplankton biomass in microcosms. We found that communities with a low (high) initial biodiversity subsequently became more (less) diverse, indicating an equilibrium biodiversity status in the communities considered here. Linuron adversely affected richness and evenness while dominance increased, but no biodiversity indices differed from the control treatment at linuron concentrations below 2.4 μg/L. Richness-related indices changed at lower linuron concentrations (effects noticeable from 2.4 μg/L) than the other biodiversity indices (effects noticeable from 14.4 μg/L) and, in contrast to the other indices, showed no signs of recovery following chronic exposure. Phytoplankton biomass was unaffected by linuron due to functional redundancy within the phytoplankton community. Comparing thresholds for biodiversity with conventional toxicity test results showed that standard ecological risk assessments also protect biodiversity in the case of linuron.

  3. Statistical inference for the additive hazards model under outcome-dependent sampling.

    PubMed

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo

    2015-09-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk associated with radon exposure.
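    Under the additive hazards model, the hazard is λ(t | z) = λ0(t) + βz; with a constant baseline hazard, event times are simply exponential with rate λ0 + βz, which makes the model easy to simulate. The sketch below uses hypothetical rates and does not reproduce the paper's weighted pseudo-score estimator or the ODS design.

    ```python
    # Simulation sketch under the additive hazards model with a constant
    # baseline: lambda(t | z) = lambda0 + beta * z, so event times are
    # exponential with rate lambda0 + beta * z. Rates are made up.
    import random

    random.seed(1)
    lambda0, beta = 0.05, 0.10  # baseline hazard and covariate effect

    def draw_event_time(z):
        return random.expovariate(lambda0 + beta * z)

    exposed = [draw_event_time(1.0) for _ in range(5000)]    # hazard 0.15
    unexposed = [draw_event_time(0.0) for _ in range(5000)]  # hazard 0.05

    mean_exp = sum(exposed) / len(exposed)        # expect about 1/0.15
    mean_unexp = sum(unexposed) / len(unexposed)  # expect about 1/0.05
    print(round(mean_exp, 2), round(mean_unexp, 2))
    ```

    Note the contrast with proportional hazards: here the covariate adds to the hazard rather than multiplying it, so the absolute hazard difference between groups is constant over time.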

  4. Statistical inference for the additive hazards model under outcome-dependent sampling

    PubMed Central

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P.; Zhou, Haibo

    2015-01-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the cancer risk associated with radon exposure. PMID:26379363

  5. Breast Cancer Risk Assessment SAS Macro (Gail Model)

    Cancer.gov

    A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.

  6. Meeting the Educational Needs of At-Risk Students: A Cost Analysis of Three Models.

    ERIC Educational Resources Information Center

    King, Jennifer A.

    1994-01-01

    The following models targeting at-risk elementary students are compared for cost: (1) Success for All (Robert Slavin), (2) Accelerated Schools (Henry Levin), and (3) School Development Program (James Comer). The Slavin model is the most costly in expenditure, and the Levin model requires the most additional staff time. (SLD)

  7. An introduction to modeling longitudinal data with generalized additive models: applications to single-case designs.

    PubMed

    Sullivan, Kristynn J; Shadish, William R; Steiner, Peter M

    2015-03-01

    Single-case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time in both the presence and absence of treatment. This article introduces a statistical technique for analyzing SCD data that has not been much used in psychological and educational research: generalized additive models (GAMs). In parametric regression, the researcher must choose a functional form to impose on the data, for example, that trend over time is linear. GAMs reverse this process by letting the data inform the choice of functional form. In this article we review the problem that trend poses in SCDs, discuss how current SCD analytic methods approach trend, describe GAMs as a possible solution, suggest a GAM model testing procedure for examining the presence of trend in SCDs, present a small simulation to show the statistical properties of GAMs, and illustrate the procedure on 3 examples of different lengths. Results suggest that GAMs may be very useful both as a form of sensitivity analysis for checking the plausibility of assumptions about trend and as a primary data analysis strategy for testing treatment effects. We conclude with a discussion of some problems with GAMs and some future directions for research on the application of GAMs to SCDs.
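    The abstract's core point can be shown with a toy example: a parametric (linear) fit imposes a functional form, while a flexible smoother lets the data shape the trend. A real GAM uses penalized splines (e.g., R's mgcv); the moving average below merely stands in for a fitted smooth term, and the data are artificial.

    ```python
    # Compare an imposed linear form with a data-driven smoother on a short
    # series with a clearly nonlinear "trend". Illustration only; this is
    # not a GAM fit, just a stand-in for the flexible-vs-parametric idea.
    import math

    t = list(range(20))
    y = [math.sin(x / 3.0) for x in t]   # short nonlinear series

    # Ordinary least-squares line: the "imposed" functional form.
    tbar, ybar = sum(t) / 20, sum(y) / 20
    slope = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
             / sum((ti - tbar) ** 2 for ti in t))
    intercept = ybar - slope * tbar
    linear_fit = [slope * ti + intercept for ti in t]

    # Centered moving average (window 5, truncated at the ends): the
    # data-driven stand-in for a smooth term.
    smooth_fit = [sum(y[max(0, i - 2):i + 3]) / len(y[max(0, i - 2):i + 3])
                  for i in range(20)]

    sse_linear = sum((yi - fi) ** 2 for yi, fi in zip(y, linear_fit))
    sse_smooth = sum((yi - fi) ** 2 for yi, fi in zip(y, smooth_fit))
    print(sse_linear > sse_smooth)  # the flexible fit tracks the trend
    ```

    In an SCD analysis the same comparison serves as a sensitivity check: if the flexible fit differs substantially from the assumed parametric form, the linearity assumption about trend is doubtful.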

  8. Risk assessment of nitrate and petroleum-derived hydrocarbon addition on Conticribra weissflogii biomass, lifetime, and nutritional value.

    PubMed

    Shun-Xing, Li; Feng-Jiao, Liu; Feng-Ying, Zheng; Xu-Guang, Huang; Yue-Gang, Zuo

    2014-03-15

    Coastal diatoms are often exposed to both petroleum-derived hydrocarbon pollution and eutrophication. How these exposures influence algal biomass, lifetime, and nutritional value is unknown. To enable a more accurate risk assessment of the pollutants' effects on the role of diatoms in coastal ecosystem functions, Conticribra weissflogii was maintained at different concentrations of nitrate (N) and/or water-soluble fractions of No. 0 diesel oil (WSF). Algal density, cell growth cycle, protein, chlorophyll a, superoxide dismutase (SOD) activity, and malonaldehyde (MDA) were determined for the assessment of algal biomass, lifetime, nutritional value, photosynthesis and respiration, antioxidant capacity, and lipid peroxidation, respectively. When N addition was combined with WSF pollution, the cell growth cycles were shortened by 27-44%; SOD activities were decreased by 1-64%; algal density and the concentrations of chlorophyll a, protein, and MDA varied between 38 and 310%, 62 and 712%, 4 and 124%, and 19 and 233% of the values observed in N addition experiments, respectively. Coastal ecosystem functions were severely weakened by N and WSF additions, and the influence increased in the order N < WSF < N + WSF. These findings should inform the risk assessment of petroleum-derived hydrocarbons on coastal ecosystem functions.

  9. Assessing the goodness of fit of personal risk models.

    PubMed

    Gong, Gail; Quante, Anne S; Terry, Mary Beth; Whittemore, Alice S

    2014-08-15

    We describe a flexible family of tests for evaluating the goodness of fit (calibration) of a pre-specified personal risk model to the outcomes observed in a longitudinal cohort. Such evaluation involves using the risk model to assign each subject an absolute risk of developing the outcome within a given time from cohort entry and comparing subjects' assigned risks with their observed outcomes. This comparison involves several issues. For example, subjects followed only for part of the risk period have unknown outcomes. Moreover, existing tests do not reveal the reasons for poor model fit when it occurs, which can reflect misspecification of the model's hazards for the competing risks of outcome development and death. To address these issues, we extend the model-specified hazards for outcome and death, and use score statistics to test the null hypothesis that the extensions are unnecessary. Simulated cohort data applied to risk models whose outcome and mortality hazards agreed and disagreed with those generating the data show that the tests are sensitive to poor model fit, provide insight into the reasons for poor fit, and accommodate a wide range of model misspecification. We illustrate the methods by examining the calibration of two breast cancer risk models as applied to a cohort of participants in the Breast Cancer Family Registry. The methods can be implemented using the Risk Model Assessment Program, an R package freely available at http://stanford.edu/~ggong/rmap/.
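    The calibration idea behind this kind of goodness-of-fit assessment can be sketched simply: group subjects by model-assigned absolute risk and compare the mean assigned risk with the observed outcome rate in each group. The toy below ignores censoring and competing risks, which the paper's score tests handle, and all data are simulated so the model is perfectly calibrated by construction.

    ```python
    # Calibration sketch: subjects sorted into quintiles of assigned risk;
    # in each quintile, mean assigned risk should match the event rate.
    # Simulated data only; not the paper's score-statistic tests.
    import random

    random.seed(0)
    n = 20000
    assigned = [random.uniform(0.01, 0.30) for _ in range(n)]  # assigned risks
    outcome = [random.random() < p for p in assigned]          # observed events

    pairs = sorted(zip(assigned, outcome))                     # sort by risk
    k = n // 5                                                 # quintile size
    for i in range(5):
        group = pairs[i * k:(i + 1) * k]
        mean_assigned = sum(a for a, _ in group) / k
        event_rate = sum(o for _, o in group) / k
        print(round(mean_assigned, 3), round(event_rate, 3))   # should match
    ```

    A mis-specified model shows up as systematic gaps between the two columns; the paper's extension-based tests additionally reveal whether the misfit lies in the outcome hazard or the competing mortality hazard.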

  10. Product versus additive threshold models for analysis of reproduction outcomes in animal genetics.

    PubMed

    David, I; Bodin, L; Gianola, D; Legarra, A; Manfredi, E; Robert-Granié, C

    2009-08-01

    The phenotypic observation of some reproduction traits (e.g., insemination success, interval from lambing to insemination) is the result of environmental and genetic factors acting on 2 individuals: the male and female involved in a mating couple. In animal genetics, the main approach (called additive model) proposed for studying such traits assumes that the phenotype is linked to a purely additive combination, either on the observed scale for continuous traits or on some underlying scale for discrete traits, of environmental and genetic effects affecting the 2 individuals. Statistical models proposed for studying human fecundability generally consider reproduction outcomes as the product of hypothetical unobservable variables. Taking inspiration from these works, we propose a model (product threshold model) for studying a binary reproduction trait that supposes that the observed phenotype is the product of 2 unobserved phenotypes, 1 for each individual. We developed a Gibbs sampling algorithm for fitting a Bayesian product threshold model including additive genetic effects and showed by simulation that it is feasible and that it provides good estimates of the parameters. We showed that fitting an additive threshold model to data that are simulated under a product threshold model provides biased estimates, especially for individuals with high breeding values. A main advantage of the product threshold model is that, in contrast to the additive model, it provides distinct estimates of fixed effects affecting each of the 2 unobserved phenotypes.
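    The contrast between the two models can be made concrete with a hypothetical simulation: in the product threshold model the couple's binary outcome is the product of two latent individual outcomes (male and female), while the additive threshold model thresholds a single combined liability. Thresholds and variances below are arbitrary, not estimates from the paper.

    ```python
    # Toy contrast of product vs additive threshold models for a binary
    # mating outcome. Parameters are arbitrary illustration values.
    import random

    random.seed(42)

    def product_outcome():
        male_ok = random.gauss(0, 1) > 0.0     # latent male phenotype
        female_ok = random.gauss(0, 1) > 0.0   # latent female phenotype
        return int(male_ok and female_ok)      # observed = product of the two

    def additive_outcome():
        liability = (random.gauss(0, 1) + random.gauss(0, 1)) / 2 ** 0.5
        return int(liability > 0.0)            # single combined liability

    n = 20000
    p_prod = sum(product_outcome() for _ in range(n)) / n  # expect ~0.25
    p_add = sum(additive_outcome() for _ in range(n)) / n  # expect ~0.50
    print(p_prod, p_add)
    ```

    With both thresholds at zero the product model succeeds only when both partners do (0.5 × 0.5 = 0.25), while the additive model succeeds half the time, illustrating why fitting the wrong structure biases estimates.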

  11. Addition of Ezetimibe to statins for patients at high cardiovascular risk: Systematic review of patient-important outcomes.

    PubMed

    Fei, Yutong; Guyatt, Gordon Henry; Alexander, Paul Elias; El Dib, Regina; Siemieniuk, Reed A C; Vandvik, Per Olav; Nunnally, Mark E; Gomaa, Huda; Morgan, Rebecca L; Agarwal, Arnav; Zhang, Ying; Bhatnagar, Neera; Spencer, Frederick A

    2017-01-16

    Ezetimibe is widely used in combination with statins to reduce low-density lipoprotein. We sought to examine the impact of ezetimibe when added to statins on patient-important outcomes. Medline, EMBASE, CINAHL, and CENTRAL were searched through July, 2016. Randomized controlled trials (RCTs) of ezetimibe combined with statins versus statins alone that followed patients for at least 6 months and reported on at least one of all-cause mortality, cardiovascular deaths, non-fatal myocardial infarctions (MI), and non-fatal strokes were included. Pairs of reviewers extracted study data and assessed risk of bias independently and in duplicate. Quality of evidence was assessed using the GRADE approach. We conducted a narrative review with complementary subgroup and sensitivity analyses. The IMPROVE-IT study enrolled 93% of all patients enrolled in the 8 included trials. Our analysis of the IMPROVE-IT study results showed that in patients at high risk of cardiovascular events, ezetimibe added to statins was associated with i) a likely reduction in non-fatal MI (17 fewer/1000 treated over 6 years, moderate certainty in evidence); ii) a possible reduction in non-fatal stroke (6 fewer/1000 treated over 6 years, low certainty); iii) no impact on myopathy (moderate certainty); iv) potentially no impact on all-cause mortality and cardiovascular death (both moderate certainty); and v) possibly no impact on cancer (low certainty). Addition of ezetimibe to moderate-dose statins is likely to result in 17 fewer MIs and possibly 6 fewer strokes/1000 treated over 6 years but is unlikely to reduce all-cause mortality or cardiovascular death. Patients who place a high value on a small absolute reduction in MI and are not averse to the use of an additional medication over a long duration may opt for ezetimibe in addition to statin therapy. Our analysis revealed no increased specific harms associated with addition of ezetimibe to statins.

  12. Intelligent Adversary Risk Analysis: A Bioterrorism Risk Management Model (PREPRINT)

    DTIC Science & Technology

    2009-02-20

    pseudomallei Emerging infectious disease threats such as Nipah virus and additional hantaviruses . • Coxiella burnetii (Q fever) • Clostridium...Waterborne Pathogens • Lassa Fever • Other Rickettsias • Bacteria • Bunyaviruses • Rabies • Diarrheagenic E.coli • Hantaviruses • Prions* • Pathogenic

  13. Genetic predisposition to coronary heart disease and stroke using an additive genetic risk score: a population-based study in Greece

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To determine the extent to which the risk for incident coronary heart disease (CHD) increases in relation to a genetic risk score (GRS) that additively integrates the influence of high-risk alleles in nine documented single nucleotide polymorphisms (SNPs) for CHD, and to examine whether t...

  14. Molecular basis of inherited antithrombin deficiency in Portuguese families: identification of genetic alterations and screening for additional thrombotic risk factors.

    PubMed

    David, Dezsö; Ribeiro, Sofia; Ferrão, Lénia; Gago, Teresa; Crespo, Francisco

    2004-06-01

    Antithrombin (AT), the most important coagulation serine protease inhibitor, plays an important role in maintaining the hemostatic balance. Inherited AT deficiency, mainly characterized by predisposition to recurrent venous thromboembolism, is transmitted in an autosomal dominant manner. In this study, we analyzed the underlying genetic alterations in 12 unrelated Portuguese thrombophilic families with AT deficiency. At the same time, the modulating effect of the FV Leiden mutation, PT 20210A, PAI-1 4G, and MTHFR 677T allelic variants on the thrombotic risk of AT deficient patients was also evaluated. Three novel frameshift alterations, a 4-bp deletion in exon 4 and two 1-bp insertions in exon 6, were identified in six unrelated type I AT deficient families. A novel missense mutation in exon 3a, which changes the highly conserved F147 residue, and a novel splice site mutation in the invariant acceptor AG dinucleotide of intron 2 were also identified in unrelated type I AT deficient families. In addition to these, two previously reported missense mutations changing the AT reactive site bond (R393-S394) and leading to type II-RS deficiency, and a previously reported cryptic splice site mutation (IVS4-14G-->A), were also identified. In these families, increased thrombotic risk associated with co-inheritance of the FV Leiden mutation and of the PAI-1 4G variant was also observed. In conclusion, we present the first data regarding the underlying genetic alterations in Portuguese thrombophilic families with AT deficiency, and confirm that the FV Leiden mutation and probably the PAI-1 4G variant represent additional thrombotic risk factors in these families.

  15. Addition of a fracture risk assessment to a coordinator's role improved treatment rates within 6 months of screening in a fragility fracture screening program.

    PubMed

    Beaton, D E; Vidmar, M; Pitzul, K B; Sujic, R; Rotondi, N K; Bogoch, E R; Sale, J E M; Jain, R; Weldon, J

    2017-03-01

    We evaluated the impact of a more intensive version of an existing post-fracture coordinator-based fracture prevention program and found that the addition of a full risk assessment improved treatment rates. These findings provide additional support for more intensive programs aimed at reducing the risk of re-fractures.

  16. Effects of additional food in a delayed predator-prey model.

    PubMed

    Sahoo, Banshidhar; Poria, Swarup

    2015-03-01

    We examine the effects of supplying additional food to the predator in a gestation-delay-induced predator-prey system with habitat complexity. Additional food works in favor of predator growth in our model, and its presence reduces the predatory attack rate on prey. By supplying additional food, the predator population can be controlled. Taking the time delay as bifurcation parameter, the stability of the coexisting equilibrium point is analyzed. Hopf bifurcation analysis is done with respect to time delay in the presence of additional food. The direction of Hopf bifurcations and the stability of bifurcated periodic solutions are determined by applying the normal form theory and the center manifold theorem. The qualitative dynamical behavior of the model is simulated using experimental parameter values. It is observed that fluctuations of the population size can be controlled either by supplying additional food suitably or by increasing the degree of habitat complexity. It is pointed out that Hopf bifurcation occurs in the system when the delay crosses some critical value, and this critical value depends strongly on the quality and quantity of the supplied additional food. Therefore, the variation of the predator population significantly affects the dynamics of the model. Model results are compared with experimental results, and biological implications of the analytical findings are discussed in the conclusion section.

  17. Modeling biotic habitat high risk areas

    USGS Publications Warehouse

    Despain, D.G.; Beier, P.; Tate, C.; Durtsche, B.M.; Stephens, T.

    2000-01-01

    Fire, especially stand-replacing fire, poses a threat to many threatened and endangered species as well as their habitat. On the other hand, fire is important in maintaining a variety of successional stages that can be important for these species. A risk-assessment approach can assist in prioritizing areas for allocation of fire mitigation funds. One example looks at assessing risk to the species and biotic communities of concern followed by the Colorado Natural Heritage Program. Another looks at the risk to Mexican spotted owls, a third at the risk to cutthroat trout, and a fourth considers the general effects of fire on elk.

  18. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  19. A new explained-variance based genetic risk score for predictive modeling of disease risk.

    PubMed

    Che, Ronglin; Motsinger-Reif, Alison A

    2012-09-25

    The goal of association mapping is to identify genetic variants that predict disease, and as the field of human genetics matures, the number of successful association studies is increasing. Many such studies have shown that for many diseases, risk is explained by a reasonably large number of variants that each explains a very small amount of disease risk. This is prompting the use of genetic risk scores in building predictive models, where information across several variants is combined for predictive modeling. In the current study, we compare the performance of four previously proposed genetic risk score methods and present a new method for constructing a genetic risk score that incorporates explained-variance information. The methods compared include: a simple count Genetic Risk Score, an odds ratio weighted Genetic Risk Score, a direct logistic regression Genetic Risk Score, a polygenic Genetic Risk Score, and the new explained variance weighted Genetic Risk Score. We compare the methods using a wide range of simulations in two steps, with a range of the number of deleterious single nucleotide polymorphisms (SNPs) explaining disease risk, genetic modes, baseline penetrances, sample sizes, relative risks (RR), and minor allele frequencies (MAF). Several measures of model performance were compared, including overall power, C-statistic, and Akaike's Information Criterion. Our results show that the relative performance of the methods differs significantly, with the new explained variance weighted GRS (EV-GRS) generally performing favorably relative to the other methods.
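    Two of the scores being compared can be sketched directly: a simple count GRS sums risk alleles, while an odds-ratio-weighted GRS weights each allele by its per-SNP log odds ratio. The genotype counts and odds ratios below are invented illustration values.

    ```python
    # Count GRS vs odds-ratio-weighted GRS for one individual. Genotypes
    # are risk-allele counts (0/1/2) per SNP; ORs are made-up values.
    import math

    genotypes = [0, 1, 2, 1, 0, 2]                  # risk-allele counts per SNP
    odds_ratios = [1.1, 1.3, 1.05, 1.6, 1.2, 1.15]  # per-allele ORs per SNP

    count_grs = sum(genotypes)
    weighted_grs = sum(g * math.log(r) for g, r in zip(genotypes, odds_ratios))

    print(count_grs, round(weighted_grs, 3))  # 6 1.109
    ```

    The count score treats every SNP as equally informative; the weighted score lets a strong SNP (here OR 1.6) contribute more, which is the basic motivation for the effect-size-informed scores the paper studies.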

  20. Modeling probability of additional cases of natalizumab-associated JCV sero-negative progressive multifocal leukoencephalopathy.

    PubMed

    Carruthers, Robert L; Chitnis, Tanuja; Healy, Brian C

    2014-05-01

    JCV serologic status is used to determine PML risk in natalizumab-treated patients. Given two cases of natalizumab-associated PML in JCV sero-negative patients and two publications that question the false negative rate of the JCV serologic test, clinicians may question whether our understanding of PML risk is adequate. Given that there is no gold standard for diagnosing previous JCV exposure, the test characteristics of the JCV serologic test are unknowable. We propose a model of PML risk in JCV sero-negative natalizumab patients. Using the numbers of JCV sero-positive and -negative patients from a study of PML risk by JCV serologic status (sero-positive: 13,950 and sero-negative: 11,414), we apply a range of sensitivities and specificities in order to calculate the number of JCV-exposed but JCV sero-negative patients (false negatives). We then apply a range of rates of developing PML in sero-negative patients to calculate the expected number of PML cases. By using the binomial function, we calculate the probability of a given number of JCV sero-negative PML cases. With this model, one has a means to establish a threshold number of JCV sero-negative natalizumab-associated PML cases at which it is improbable that our understanding of PML risk in JCV sero-negative patients is adequate.
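    The proposed calculation can be sketched as follows, using the study's sero-positive count plus an assumed sensitivity and PML rate (both illustration values from within the ranges the abstract describes; the false-negative estimate additionally assumes perfect specificity).

    ```python
    # Estimate exposed-but-sero-negative (false negative) patients from an
    # assumed test sensitivity, then use the binomial distribution for the
    # probability of observing k PML cases among them. Sensitivity and PML
    # rate are assumed illustration values, not the paper's estimates.
    from math import comb

    sero_positive = 13950   # observed sero-positive patients (from the study)
    sensitivity = 0.97      # assumed test sensitivity
    pml_rate = 1 / 10000    # assumed PML risk per truly exposed patient

    # With perfect specificity, exposed = positives / sensitivity, so the
    # expected number of exposed-but-sero-negative patients is:
    false_negatives = sero_positive * (1 - sensitivity) / sensitivity
    n_fn = round(false_negatives)

    def prob_k_cases(k, n, p):
        """Binomial probability of exactly k PML cases among n exposed."""
        return comb(n, k) * p ** k * (1 - p) ** (n - k)

    print(n_fn, prob_k_cases(0, n_fn, pml_rate))
    ```

    Summing prob_k_cases over k at or above an observed case count gives the tail probability the abstract uses to judge whether the sero-negative risk estimate is still tenable.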

  1. Development and Validation of a Lung Cancer Risk Prediction Model for African-Americans

    PubMed Central

    Etzel, Carol J.; Kachroo, Sumesh; Liu, Mei; D'Amelio, Anthony; Dong, Qiong; Cote, Michele L.; Wenzlaff, Angela S.; Hong, Waun Ki; Greisinger, Anthony J.; Schwartz, Ann G.; Spitz, Margaret R.

    2009-01-01

    Because existing risk prediction models for lung cancer were developed in white populations, they may not be appropriate for predicting risk among African-Americans. Therefore, a need exists to construct and validate a risk prediction model for lung cancer that is specific to African-Americans. We analyzed data from 491 African-Americans with lung cancer and 497 matched African-American controls to identify specific risks and incorporate them into a multivariable risk model for lung cancer and estimate the 5-year absolute risk of lung cancer. We performed internal and external validations of the risk model using data on additional cases and controls from the same ongoing multiracial/ethnic lung cancer case-control study from which the model-building data were obtained as well as data from two different lung cancer studies in metropolitan Detroit, respectively. We also compared our African-American model with our previously developed risk prediction model for whites. The final risk model included smoking-related variables [smoking status, pack-years smoked, age at smoking cessation (former smokers), and number of years since smoking cessation (former smokers)], self-reported physician diagnoses of chronic obstructive pulmonary disease or hay fever, and exposures to asbestos or wood dusts. Our risk prediction model for African-Americans exhibited good discrimination [75% (95% confidence interval, 0.67-0.82)] for our internal data and moderate discrimination [63% (95% confidence interval, 0.57-0.69)] for the external data group, which is an improvement over the Spitz model for white subjects. Existing lung cancer prediction models may not be appropriate for predicting risk for African-Americans because (a) they were developed using white populations, (b) the level of risk is different for risk factors that African-Americans share with whites, and (c) unique group-specific risk factors exist for African-Americans. This study developed and validated a risk prediction

  2. Meat and bone meal and mineral feed additives may increase the risk of oral prion disease transmission

    USGS Publications Warehouse

    Johnson, Christopher J.; McKenzie, Debbie; Pedersen, Joel A.; Aiken, Judd M.

    2011-01-01

    Ingestion of prion-contaminated materials is postulated to be a primary route of prion disease transmission. Binding of prions to soil (micro)particles dramatically enhances peroral disease transmission relative to unbound prions, and it was hypothesized that micrometer-sized particles present in other consumed materials may affect prion disease transmission via the oral route of exposure. Small, insoluble particles are present in many substances, including soil, human foods, pharmaceuticals, and animal feeds. It is known that meat and bone meal (MBM), a feed additive believed responsible for the spread of bovine spongiform encephalopathy (BSE), contains particles smaller than 20 μm and that the pathogenic prion protein binds to MBM. The potentiation of disease transmission via the oral route by exposure to MBM or three micrometer-sized mineral feed additives was determined. Data showed that when the disease agent was bound to any of the tested materials, the penetrance of disease was increased compared to unbound prions. Our data suggest that in feed or other prion-contaminated substances consumed by animals or, potentially, humans, the addition of MBM or the presence of microparticles could heighten risks of prion disease acquisition.

  3. Meat and bone meal and mineral feed additives may increase the risk of oral prion disease transmission

    USGS Publications Warehouse

    Johnson, C.J.; McKenzie, D.; Pedersen, J.A.; Aiken, Judd M.

    2011-01-01

    Ingestion of prion-contaminated materials is postulated to be a primary route of prion disease transmission. Binding of prions to soil (micro)particles dramatically enhances peroral disease transmission relative to unbound prions, and it was hypothesized that micrometer-sized particles present in other consumed materials may affect prion disease transmission via the oral route of exposure. Small, insoluble particles are present in many substances, including soil, human foods, pharmaceuticals, and animal feeds. It is known that meat and bone meal (MBM), a feed additive believed responsible for the spread of bovine spongiform encephalopathy (BSE), contains particles smaller than 20 μm and that the pathogenic prion protein binds to MBM. The potentiation of disease transmission via the oral route by exposure to MBM or three micrometer-sized mineral feed additives was determined. Data showed that when the disease agent was bound to any of the tested materials, the penetrance of disease was increased compared to unbound prions. Our data suggest that in feed or other prion-contaminated substances consumed by animals or, potentially, humans, the addition of MBM or the presence of microparticles could heighten risks of prion disease acquisition. Copyright © 2011 Taylor & Francis Group, LLC.

  4. MEAT AND BONE MEAL AND MINERAL FEED ADDITIVES MAY INCREASE THE RISK OF ORAL PRION DISEASE TRANSMISSION

    PubMed Central

    Johnson, Christopher J.; McKenzie, Debbie; Pedersen, Joel A.; Aiken, Judd M.

    2011-01-01

    Ingestion of prion-contaminated materials is postulated to be a primary route of prion disease transmission. Binding of prions to soil (micro)particles dramatically enhances peroral disease transmission relative to unbound prions, and it was hypothesized that micrometer–sized particles present in other consumed materials may affect prion disease transmission via the oral route of exposure. Small, insoluble particles are present in many substances, including soil, human foods, pharmaceuticals, and animal feeds. It is known that meat and bone meal (MBM), a feed additive believed responsible for the spread of bovine spongiform encephalopathy (BSE), contains particles smaller than 20 μm and that the pathogenic prion protein binds to MBM. The potentiation of disease transmission via the oral route by exposure to MBM or three micrometer-sized mineral feed additives was determined. Data showed that when the disease agent was bound to any of the tested materials, the penetrance of disease was increased compared to unbound prions. Our data suggest that in feed or other prion–contaminated substances consumed by animals or, potentially, humans, the addition of MBM or the presence of microparticles could heighten risks of prion disease acquisition. PMID:21218345

  5. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires a mathematical formulation of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, danger, and so on. To solve this problem, as a first step, fundamental theoretical relationships between risk and risk management were analyzed mathematically and then illustrated with charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On this basis, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, with emphasis on the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was examined, revealing that the FSA method, which the International Maritime Organization (IMO) is actively promoting, derives from risk management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  6. Concentration of Risk Model (CORM) Verification and Analysis

    DTIC Science & Technology

    2014-06-15

    Mental Health and using data from a repository at the University of Michigan, had attempted to identify soldiers at higher-than-average risk of suicide ... TRAC-M-TR-14-023, 15 June 2014. Concentration of Risk Model (CORM) Verification and Analysis. Edward M. Masotti, Sam Buttrey. TRADOC Analysis Center - Monterey, 700 Dyer Road, Monterey...

  7. An original traffic additional emission model and numerical simulation on a signalized road

    NASA Astrophysics Data System (ADS)

    Zhu, Wen-Xing; Zhang, Jing-Yu

    2017-02-01

    Based on the VSP (Vehicle Specific Power) model, real traffic emissions were theoretically classified into two parts: basic emission and additional emission. An original additional-emission model was presented to calculate a vehicle's emissions due to signal control effects. A car-following model was developed and used to describe traffic behavior, including cruising, accelerating, decelerating, and idling at a signalized intersection. Simulations were conducted under two situations: a single intersection, and two adjacent intersections with their respective control policies. Results are in good agreement with the theoretical analysis. It is also shown that the additional emission model may be used to design signal control policies in modern traffic systems to address serious environmental problems.

  8. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and it is therefore necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed that considers their uncertain behaviors on stochastic, statistical, and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given sequence of waste quality measurements. In this paper, (40)K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.

  9. [Model for early childhood caries risks].

    PubMed

    Dimitrova, M; Kukleva, M

    2008-01-01

    Risk factors for early childhood caries were studied in 406 children aged 12-47 months. The results showed that pathological pregnancy, sleeping with a bottle of formula or sweet liquid, consumption of candy and caramel on sticks, and sour-sweet fruit juices were significant factors leading to early childhood caries. When all these risk factors acted simultaneously, the use of sour-sweet fruit juices dominated. The probability of caries occurrence under the simultaneous action of all these risk factors was 62%.

  10. Additive Interaction of MTHFR C677T and MTRR A66G Polymorphisms with Being Overweight/Obesity on the Risk of Type 2 Diabetes

    PubMed Central

    Zhi, Xueyuan; Yang, Boyi; Fan, Shujun; Li, Yongfang; He, Miao; Wang, Da; Wang, Yanxun; Wei, Jian; Zheng, Quanmei; Sun, Guifan

    2016-01-01

    Although both methylenetetrahydrofolate reductase (MTHFR) C677T and methionine synthase reductase (MTRR) A66G polymorphisms have been associated with type 2 diabetes (T2D), their interactions with being overweight/obesity on T2D risk remain unclear. To evaluate the associations of the two polymorphisms with T2D and their interactions with being overweight/obesity on T2D risk, a case-control study of 180 T2D patients and 350 healthy controls was conducted in northern China. Additive interaction was estimated using relative excess risk due to interaction (RERI), attributable proportion due to interaction (AP) and synergy index (S). After adjustments for age and gender, borderline significant associations of the MTHFR C677T and MTRR A66G polymorphisms with T2D were observed under recessive (OR = 1.43, 95% CI: 0.98–2.10) and dominant (OR = 1.43, 95% CI: 1.00–2.06) models, respectively. There was a significant interaction between the MTHFR 677TT genotype and being overweight/obesity on T2D risk (AP = 0.404, 95% CI: 0.047–0.761), in addition to the MTRR 66AG/GG genotypes (RERI = 1.703, 95% CI: 0.401–3.004; AP = 0.528, 95% CI: 0.223–0.834). Our findings suggest that individuals with the MTHFR 677TT or MTRR 66AG/GG genotypes are more susceptible to the detrimental effect of being overweight/obesity on T2D. Further large-scale studies are still needed to confirm our findings. PMID:27983710
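
    The three additive-interaction measures reported above follow Rothman's standard formulas and can be computed directly from the joint and separate odds ratios. A minimal Python sketch, with purely illustrative odds ratios rather than the study's estimates:

```python
def additive_interaction(or10, or01, or11):
    """Rothman's additive-interaction measures from odds ratios.

    or10: OR for the first exposure alone
    or01: OR for the second exposure alone
    or11: OR for the joint exposure,
    all relative to the doubly unexposed reference group.
    """
    reri = or11 - or10 - or01 + 1.0                    # relative excess risk due to interaction
    ap = reri / or11                                   # attributable proportion due to interaction
    s = (or11 - 1.0) / ((or10 - 1.0) + (or01 - 1.0))   # synergy index
    return reri, ap, s

# Hypothetical odds ratios for illustration only:
reri, ap, s = additive_interaction(or10=1.4, or01=2.0, or11=4.2)
```

    Confidence intervals for RERI, AP, and S, as quoted in the abstract, additionally require the covariance matrix of the regression coefficients (e.g. via the delta method).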

  11. A Dual-Process Approach to Health Risk Decision Making: The Prototype Willingness Model

    ERIC Educational Resources Information Center

    Gerrard, Meg; Gibbons, Frederick X.; Houlihan, Amy E.; Stock, Michelle L.; Pomery, Elizabeth A.

    2008-01-01

    Although dual-process models in cognitive, personality, and social psychology have stimulated a large body of research about analytic and heuristic modes of decision making, these models have seldom been applied to the study of adolescent risk behaviors. In addition, the developmental course of these two kinds of information processing, and their…

  12. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  13. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2016-11-28

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
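
    The contrast between the first two decision models can be sketched compactly: expected utility weights a loss by its objective probability, while prospect theory applies a probability-weighting function and a loss-averse value function, so agents overweight rare floods. A toy illustration; the parameter values are the commonly cited Tversky-Kahneman estimates, assumed here rather than taken from the article's calibration:

```python
def pt_weight(p, gamma=0.69):
    """Probability weighting for losses; small p is overweighted (gamma assumed)."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def pt_value(loss, lam=2.25, beta=0.88):
    """Loss-averse prospect-theory value of a monetary loss (loss >= 0)."""
    return -lam * loss**beta

def eu_of_flood(p_flood, loss):
    """Risk-neutral expected-utility evaluation of the flood prospect."""
    return -p_flood * loss

def pt_of_flood(p_flood, loss):
    """Prospect-theory evaluation of the same prospect."""
    return pt_weight(p_flood) * pt_value(loss)

# A 1-in-100-year flood causing 100,000 in damage looks worse under
# prospect theory than under expected utility, so a loss-reducing
# investment clears the decision threshold sooner.
eu = eu_of_flood(0.01, 100_000.0)
pt = pt_of_flood(0.01, 100_000.0)
```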

  14. Estimate of influenza cases using generalized linear, additive and mixed models.

    PubMed

    Oviedo, Manuel; Domínguez, Ángela; Pilar Muñoz, M

    2015-01-01

    We investigated the relationship between reported cases of influenza and several covariates in Catalonia (Spain). The covariates analyzed were population, age, date of report of influenza, and health region during 2010-2014, using data obtained from the SISAP program (Institut Catala de la Salut - Generalitat of Catalonia). Reported cases were related to the covariates using a descriptive analysis. Generalized Linear Models, Generalized Additive Models, and Generalized Additive Mixed Models were used to estimate the evolution of the transmission of influenza. Additive models can estimate non-linear effects of the covariates using smooth functions, and mixed models can capture data dependence and variability in factor variables using correlation structures and random effects, respectively. The incidence rate of influenza was calculated as the incidence per 100 000 people. The mean rate was 13.75 (range 0-27.5) in the winter months (December, January, February) and 3.38 (range 0-12.57) in the remaining months. Statistical analysis showed that Generalized Additive Mixed Models were better adapted to the temporal evolution of influenza (serial correlation 0.59) than classical linear models.

  15. Extended risk-analysis model for activities of the project.

    PubMed

    Kušar, Janez; Rihar, Lidija; Zargi, Urban; Starbek, Marko

    2013-12-01

    Project management of product/service orders has become a mode of operation in many companies. Although these are mostly cyclically recurring projects, risk management is very important for them. An extended risk-analysis model for new product/service projects is presented in this paper, with emphasis on a solution developed at the Faculty of Mechanical Engineering in Ljubljana, Slovenia. The usual risk analysis of project activities is based on evaluating the probability that risk events occur and their consequences. A third parameter has been added in our model: an estimate of the incidence of risk events. On the basis of the calculated activity risk level, a project team prepares preventive and corrective measures to be taken according to the status indicators. An important advantage of the proposed solution is that the project manager and the team members are warned of risk events in time, so they can activate the envisaged preventive and corrective measures as necessary.
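
    The three-parameter idea can be sketched as a simple scoring function. The multiplicative combination and the threshold values below are assumptions for illustration, not the authors' exact formula:

```python
def activity_risk_level(probability, consequence, incidence):
    """Three-parameter activity risk score (hypothetical scales).

    probability:  chance the risk event occurs, 0..1
    consequence:  severity score, e.g. 1..10
    incidence:    estimated number of occurrences over the activity
    """
    return probability * consequence * incidence

def status_indicator(risk_level, warn=5.0, act=15.0):
    """Map a risk level to a traffic-light status (thresholds assumed)."""
    if risk_level >= act:
        return "red: activate corrective measures"
    if risk_level >= warn:
        return "yellow: prepare preventive measures"
    return "green: monitor"

# An activity with a 50% event probability, severity 8, expected twice:
level = activity_risk_level(0.5, 8, 2)
status = status_indicator(level)
```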

  16. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced, shifting the methodological focus from reliability and probabilities (expected values) to reliability, uncertainty, and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of a system's items, and the maintenance significant precipitating factors (MSPF) of items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third, the hazardous risk coefficient, accounts for anticipated hazards that may occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance, and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically combined to obtain the final risk coefficient of each critical item, from which the final rankings of critical items are estimated. The prioritization of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.

  17. Evaluation of cluster recovery for small area relative risk models.

    PubMed

    Rotejanaprasert, Chawarat

    2014-12-01

    The analysis of disease risk is often considered via relative risk. The comparison of relative risk estimation methods with "true risk" scenarios has been considered on various occasions. However, there has been little examination of how well competing methods perform when the focus is clustering of risk. In this paper, a simulated evaluation of a range of potential spatial risk models and a range of measures that can be used for (a) cluster goodness of fit, (b) cluster diagnostics are considered. Results suggest that exceedence probability is a poor measure of hot spot clustering because of model dependence, whereas residual-based methods are less model dependent and perform better. Local deviance information criteria measures perform well, but conditional predictive ordinate measures yield a high false positive rate.
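
    The exceedence probability discussed above is computed from posterior samples of an area's relative risk. A minimal sketch using hypothetical sampler output, not the paper's fitted models:

```python
def exceedence_probability(rr_samples, threshold=1.0):
    """Fraction of posterior relative-risk samples above the threshold.

    A value near 1 flags the area as a potential hot spot; the paper's
    point is that this quantity inherits the assumptions of whichever
    spatial risk model produced the samples.
    """
    return sum(1 for r in rr_samples if r > threshold) / len(rr_samples)

# Hypothetical posterior draws of relative risk for one small area:
samples = [0.8, 1.1, 1.3, 0.9, 1.6, 1.2, 0.7, 1.4]
p_exc = exceedence_probability(samples)
```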

  18. Modeling financial disaster risk management in developing countries

    NASA Astrophysics Data System (ADS)

    Mechler, R.; Hochrainer, S.; Pflug, G.; Linnerooth-Bayer, J.

    2005-12-01

    The public sector plays a major role in reducing the long-term economic repercussions of disasters by repairing damaged infrastructure and providing financial assistance to households and businesses. If critical infrastructure is not repaired in a timely manner, there can be serious effects on the economy and the livelihoods of the population. The repair of public infrastructure, however, can be a significant drain on public budgets especially in developing and transition countries. Developing country governments frequently lack the liquidity, even including international aid and loans, to fully repair damaged critical public infrastructure or provide sufficient support to households and businesses for their recovery. The earthquake in Gujarat, and other recent cases of government post-disaster liquidity crises, have sounded an alarm, prompting financial development organizations, such as the World Bank, among others, to call for greater attention to reducing financial vulnerability and increasing the resilience of the public sector. This talk reports on a model designed to illustrate the tradeoffs and choices a developing country must make in financially managing the economic risks due to natural disasters. Budgetary resources allocated to pre-disaster risk management strategies, such as loss mitigation measures, a catastrophe reserve fund, insurance and contingent credit arrangements for public assets, reduce the probability of financing gaps - the inability of governments to meet their full obligations in providing relief to private victims and restoring public infrastructure - or prevent the deterioration of the ability to undertake additional borrowing without incurring a debt crisis. The model - which is equipped with a graphical interface - can be a helpful tool for building capacity of policy makers for developing and assessing public financing strategies for disaster risk by indicating the respective costs and consequences of financing alternatives.

  19. A comprehensive Network Security Risk Model for process control networks.

    PubMed

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example.

  20. Intervention models for mothers and children at risk for injuries.

    PubMed

    Gulotta, C S; Finney, J W

    2000-03-01

    We review risk factors commonly associated with childhood unintentional injuries and highlight adolescent mothers and their young children as a high-risk group. Several intervention models of injury, including the epidemiological model, Peterson and Brown's "working model," and the socioecological model, have been proposed to explain the events that lead to injuries. These models are discussed, and a synthesis of the adolescent parenting model and the socioecological model of injury is suggested as a way to address the complex variables that lead to an injury-causing event for adolescent mothers and their young children. Finally, we suggest areas of future investigation and their implications for prevention and treatment.

  1. The effects of vehicle model and driver behavior on risk.

    PubMed

    Wenzel, Thomas P; Ross, Marc

    2005-05-01

    We study the dependence of risk on vehicle type and especially on vehicle model. Here, risk is measured by the number of driver fatalities per year per million vehicles registered. We analyze both the risk to the drivers of each vehicle model and the risk the vehicle model imposes on drivers of other vehicles with which it crashes. The "combined risk" associated with each vehicle model is simply the sum of the risk-to-drivers in all kinds of crashes and the risk-to-drivers-of-other-vehicles in two-vehicle crashes. We find that most car models are as safe to their drivers as most sport utility vehicles (SUVs); the increased risk of a rollover in a SUV roughly balances the higher risk for cars that collide with SUVs and pickup trucks. We find that SUVs, and to a greater extent pickup trucks, impose much greater risks than cars on drivers of other vehicles; and these risks increase with increasing pickup size. The higher aggressivity of SUVs and pickups makes their combined risk higher than that of almost all cars. Effects of light truck design on their risk are revealed by the analysis of specific models: new unibody (or "crossover") SUVs appear, in preliminary analysis, to have much lower risks than the most popular truck-based SUVs. Much has been made in the past about the high risk of low-mass cars in certain kinds of collisions. We find there are other plausible explanations for this pattern of risk, which suggests that mass may not be fundamental to safety. While not conclusive, this is potentially important because improvement in fuel economy is a major goal for designers of new vehicles. We find that accounting for the most risky drivers, young males and the elderly, does not change our general results. Similarly, we find with California data that the high risk of rural driving and the high level of rural driving by pickups does not increase the risk-to-drivers of pickups relative to that for cars. However, other more subtle differences in drivers and the

  2. Modelling the risk of noise-induced hearing loss among military pilots.

    PubMed

    Kuronen, Pentti; Toppila, Esko; Starck, Jukka; Pääkkönen, Rauno; Sorri, Martti J

    2004-02-01

    Noise is a significant risk factor in aviation, especially in military aviation. Even though our earlier studies have shown that the risk of noise-induced hearing loss (NIHL) among military pilots is small and the monitoring of their hearing is effective, we still need to develop methods of assessing the risk of NIHL more effectively at both the general and individual levels. In addition, many other risk factors are considered to contribute to the development of hearing impairment. The novel NoiseScan data management system enables assessment of the risk of developing hearing impairment on the basis of known risk factors. This study investigates the risk of hearing impairment among Finnish Air Force pilots using reasonably accurate noise exposure data and other risk factors for hearing impairment. This risk is also compared with that of industrial workers, whose risk followed the ISO 1999 prediction. Hearing among Finnish military pilots turned out to be better than predicted by the ISO 1999 model. The industrial workers had a larger number of risk factors than the pilots. Owing to the small number of risk factors, the hearing of pilots corresponds to approximately the 80th percentile, being 9-13 dB better than the 50th percentile obtained with the industrial population.

  3. A Multiple Risk Factors Model of the Development of Aggression among Early Adolescents from Urban Disadvantaged Neighborhoods

    ERIC Educational Resources Information Center

    Kim, Sangwon; Orpinas, Pamela; Kamphaus, Randy; Kelder, Steven H.

    2011-01-01

    This study empirically derived a multiple risk factors model of the development of aggression among middle school students in urban, low-income neighborhoods, using Hierarchical Linear Modeling (HLM). Results indicated that aggression increased from sixth to eighth grade. Additionally, the influences of four risk domains (individual, family,…

  4. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    SciTech Connect

    Doligez, B.; Eschard, R.; Geffroy, F.

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model informed with petrophysical properties. Scaling-up techniques then yield a reservoir model compatible with fluid flow simulators. Geostatistical modeling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability matches the variability computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with excellent areal coverage but poor vertical resolution. New advances in modeling techniques now allow this type of additional external information to be integrated in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  5. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones especially long-term work zones increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events--age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S)--in the event tree. Since the estimated value of probability for some intermediate event may have large uncertainty, the uncertainty can thus be characterized by a random variable. The consequence estimation model takes into account the combination effects of speed and emergency medical service response time (ERT) on the consequence of work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease of individual fatality risk and 44% reduction of individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction of individual fatality risk and 0.05% reduction of individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in the casualty risk mitigation.
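
    The event-tree part of such a QRA multiplies the crash frequency by the branch probabilities along each path to obtain a frequency for every accident scenario. A minimal sketch; the event names and probabilities below are illustrative, not the paper's calibrated values:

```python
from itertools import product

def scenario_frequencies(crash_frequency, branch_probs):
    """Enumerate event-tree scenarios and their expected frequencies.

    crash_frequency: expected work-zone crashes per year
    branch_probs: dict mapping each intermediate event to a dict of
        outcome probabilities, e.g.
        {"severity": {"fatal": 0.02, "injury": 0.30, "pdo": 0.68}}
    Returns a dict keyed by outcome tuples (one outcome per event).
    """
    events = list(branch_probs)
    scenarios = {}
    for outcomes in product(*(branch_probs[e].items() for e in events)):
        labels = tuple(name for name, _ in outcomes)
        p = 1.0
        for _, prob in outcomes:
            p *= prob                  # multiply probabilities along the path
        scenarios[labels] = crash_frequency * p
    return scenarios

# Hypothetical two-event tree:
tree = {"light": {"day": 0.7, "night": 0.3},
        "severity": {"fatal": 0.1, "injury": 0.9}}
freqs = scenario_frequencies(10.0, tree)
```

    Societal risk then follows by pairing each scenario frequency with a consequence estimate for that path.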

  6. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    PubMed

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights of measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.
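
    The regression calibration idea exploited above replaces the error-prone covariate W = X + U with its best linear predictor E[X | W]. The attenuation-and-correction mechanics can be illustrated in a simple linear regression with classical error; this is a sketch under assumed normality and known error variance, not the paper's additive-hazards estimator:

```python
import random

def regression_calibration(w, mu_x, var_x, var_u):
    """E[X | W] under classical error W = X + U, U independent of X."""
    lam = var_x / (var_x + var_u)     # attenuation (reliability) factor
    return mu_x + lam * (w - mu_x)

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

random.seed(1)
n = 20_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]   # true covariate
ws = [x + random.gauss(0.0, 1.0) for x in xs]     # error-prone surrogate
ys = [2.0 * x + random.gauss(0.0, 0.1) for x in xs]

naive = slope(ws, ys)        # attenuated toward 0 (about half the true slope)
xhat = [regression_calibration(w, 0.0, 1.0, 1.0) for w in ws]
corrected = slope(xhat, ys)  # approximately recovers the true slope of 2
```

    In survival models the effect of measurement error is generally not a simple attenuation, which is part of what the paper investigates for the additive hazards model.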

  7. Assessing Academic Risk of Student-Athletes: Applicability of the NCAA Graduation Risk Overview Model to GPA

    ERIC Educational Resources Information Center

    Johnson, James

    2013-01-01

    In an effort to standardize academic risk assessment, the NCAA developed the graduation risk overview (GRO) model. Although this model was designed to assess graduation risk, its ability to predict grade-point average (GPA) remained unknown. Therefore, 134 individual risk assessments were made to determine GRO model effectiveness in the…

  8. Developing risk prediction models for type 2 diabetes: a systematic review of methodology and reporting

    PubMed Central

    2011-01-01

    ten studies (26%). Twenty-one risk prediction models (49%) were developed by categorising all continuous risk predictors. The treatment and handling of missing data were not reported in 16 studies (41%). Conclusions We found widespread use of poor methods that could jeopardise model development, including univariate pre-screening of variables, categorisation of continuous risk predictors and poor handling of missing data. The use of poor methods affects the reliability of the prediction model and ultimately compromises the accuracy of the probability estimates of having undiagnosed type 2 diabetes or the predicted risk of developing type 2 diabetes. In addition, many studies were characterised by a generally poor level of reporting, with many key details to objectively judge the usefulness of the models often omitted. PMID:21902820

  9. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    SciTech Connect

    Carlson, T.M.; Gregory, S.D.

    1995-12-31

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs); however, a true impact on the burrowing populations has not yet been verified. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use the areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program which ensures that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  10. Comparing GWAS Results of Complex Traits Using Full Genetic Model and Additive Models for Revealing Genetic Architecture

    PubMed Central

    Monir, Md. Mamun; Zhu, Jun

    2017-01-01

    Most genome-wide association studies (GWASs) for human complex diseases have ignored dominance, epistasis and ethnic interactions. We conducted comparative GWASs for total cholesterol using a full genetic model and additive models, which illustrate the impact of ignoring these genetic effects on the analysis results and demonstrate how the genetic effects of multiple loci can differ across ethnic groups. There were 15 quantitative trait loci, comprising 13 individual loci and 3 pairs of epistatic loci, identified by the full model, whereas only 14 loci (9 common loci and 5 different loci) were identified by the multi-locus additive model. Moreover, 4 loci detected by the full model were not detected by the multi-locus additive model. PLINK analysis identified two loci, and GCTA analysis detected only one locus, with genome-wide significance. The full model identified three previously reported genes as well as several new genes. Bioinformatics analysis showed that some of the new genes are related to cholesterol-related chemicals and/or diseases. Analyses of the cholesterol data and simulation studies revealed that the full model performed better than the additive model in terms of detection power and unbiased estimation of the genetic effects of complex traits. PMID:28079101

  11. A second common mutation in the methylenetetrahydrofolate reductase gene: an additional risk factor for neural-tube defects?

    PubMed Central

    van der Put, N M; Gabreëls, F; Stevens, E M; Smeitink, J A; Trijbels, F J; Eskes, T K; van den Heuvel, L P; Blom, H J

    1998-01-01

    Recently, we showed that homozygosity for the common 677(C-->T) mutation in the methylenetetrahydrofolate reductase (MTHFR) gene, causing thermolability of the enzyme, is a risk factor for neural-tube defects (NTDs). We now report on another mutation in the same gene, the 1298(A-->C) mutation, which changes a glutamate into an alanine residue. This mutation destroys an MboII recognition site and has an allele frequency of .33. This 1298(A-->C) mutation results in decreased MTHFR activity (one-way analysis of variance [ANOVA] P < .0001), which is more pronounced in the homozygous than in the heterozygous state. Neither the homozygous nor the heterozygous state is associated with a higher plasma homocysteine (Hcy) or a lower plasma folate concentration, phenomena that are evident with homozygosity for the 677(C-->T) mutation. However, there appears to be an interaction between these two common mutations. When compared with heterozygosity for either the 677(C-->T) or the 1298(A-->C) mutation, combined heterozygosity for the 1298(A-->C) and 677(C-->T) mutations was associated with reduced MTHFR specific activity (ANOVA P < .0001), higher Hcy, and decreased plasma folate levels (ANOVA P < .03). Thus, combined heterozygosity for both MTHFR mutations results in features similar to those observed in homozygotes for the 677(C-->T) mutation. This combined heterozygosity was observed in 28% (n = 86) of the NTD patients, compared with 20% (n = 403) among controls, resulting in an odds ratio of 2.04 (95% confidence interval: .9-4.7). These data suggest that combined heterozygosity for the two common MTHFR mutations accounts for a proportion of folate-related NTDs that is not explained by homozygosity for the 677(C-->T) mutation, and can be an additional genetic risk factor for NTDs. PMID:9545395

  12. Risk Models to Predict Hypertension: A Systematic Review

    PubMed Central

    Echouffo-Tcheugui, Justin B.; Batty, G. David; Kivimäki, Mika; Kengne, Andre P.

    2013-01-01

    Background As well as being a risk factor for cardiovascular disease, hypertension is also a health condition in its own right. Risk prediction models may be of value in identifying those individuals at risk of developing hypertension who are likely to benefit most from interventions. Methods and Findings To synthesize existing evidence on the performance of these models, we searched MEDLINE and EMBASE; examined bibliographies of retrieved articles; contacted experts in the field; and searched our own files. Dual review of identified studies was conducted. Included studies had to report on the development, validation, or impact analysis of a hypertension risk prediction model. For each publication, information was extracted on study design and characteristics, predictors, model discrimination, calibration and reclassification ability, validation and impact analysis. Eleven studies reporting on 15 different hypertension risk prediction models were identified. Age, sex, body mass index, diabetes status, and blood pressure variables were the most common predictor variables included in models. Most risk models had acceptable-to-good discriminatory ability (C-statistic>0.70) in the derivation sample. Calibration was assessed less often but was generally acceptable. Two hypertension risk models, the Framingham and Hopkins, have been externally validated, displaying acceptable-to-good discrimination, with C-statistics ranging from 0.71 to 0.81. Lack of individual-level data precluded analyses of the risk models in subgroups. Conclusions The discrimination ability of existing hypertension risk prediction tools is acceptable, but the impact of using these tools on prescriptions and outcomes of hypertension prevention is unclear. PMID:23861760
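
    The C-statistic used above to summarize discrimination can be computed directly as a concordance probability: the chance that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case, with ties counted as one half. The predicted risks below are invented for illustration.

```python
# Pairwise computation of the C-statistic (equivalent to the ROC AUC
# for a binary outcome). Scores are hypothetical predicted risks.
def c_statistic(case_scores, control_scores):
    concordant = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                concordant += 1.0
            elif c == k:
                concordant += 0.5   # ties count as half-concordant
    return concordant / (len(case_scores) * len(control_scores))

cases = [0.82, 0.64, 0.55, 0.71]     # those who developed hypertension
controls = [0.20, 0.35, 0.55, 0.40]  # those who did not
print(c_statistic(cases, controls))
```

    A value of 0.5 means no discrimination and 1.0 perfect discrimination, which is why the review treats C > 0.70 as acceptable-to-good.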

  13. Risk models and scores for type 2 diabetes: systematic review

    PubMed Central

    Mathur, Rohini; Dent, Tom; Meads, Catherine; Greenhalgh, Trisha

    2011-01-01

    Objective To evaluate current risk models and scores for type 2 diabetes and inform selection and implementation of these in practice. Design Systematic review using standard (quantitative) and realist (mainly qualitative) methodology. Inclusion criteria Papers in any language describing the development or external validation, or both, of models and scores to predict the risk of an adult developing type 2 diabetes. Data sources Medline, PreMedline, Embase, and Cochrane databases were searched. Included studies were citation tracked in Google Scholar to identify follow-on studies of usability or impact. Data extraction Data were extracted on statistical properties of models, details of internal or external validation, and use of risk scores beyond the studies that developed them. Quantitative data were tabulated to compare model components and statistical properties. Qualitative data were analysed thematically to identify mechanisms by which use of the risk model or score might improve patient outcomes. Results 8864 titles were scanned, 115 full text papers considered, and 43 papers included in the final sample. These described the prospective development or validation, or both, of 145 risk prediction models and scores, 94 of which were studied in detail here. They had been tested on 6.88 million participants followed for up to 28 years. Heterogeneity of primary studies precluded meta-analysis. Some but not all risk models or scores had robust statistical properties (for example, good discrimination and calibration) and had been externally validated on a different population. Genetic markers added nothing to models over clinical and sociodemographic factors. Most authors described their score as “simple” or “easily implemented,” although few were specific about the intended users and under what circumstances. Ten mechanisms were identified by which measuring diabetes risk might improve outcomes. Follow-on studies that applied a risk score as part of an

  14. Sparse Additive Ordinary Differential Equations for Dynamic Gene Regulatory Network Modeling.

    PubMed

    Wu, Hulin; Lu, Tao; Xue, Hongqi; Liang, Hua

    2014-04-02

    The gene regulation network (GRN) is a high-dimensional complex system, which can be represented by various mathematical or statistical models. The ordinary differential equation (ODE) model is one of the popular dynamic GRN models. High-dimensional linear ODE models have been proposed to identify GRNs, but with the limitation of assuming linear regulation effects. In this article, we propose a sparse additive ODE (SA-ODE) model, coupled with ODE estimation methods and adaptive group LASSO techniques, to model dynamic GRNs that can flexibly accommodate nonlinear regulation effects. The asymptotic properties of the proposed method are established, and simulation studies are performed to validate the proposed approach. An application example for identifying the nonlinear dynamic GRN of T-cell activation is used to illustrate the usefulness of the proposed method.
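
    The additive-ODE view can be illustrated with a toy two-gene system: each gene's rate of change is a sum of one-dimensional, possibly nonlinear, effects of regulator expression levels. All dynamics below are invented; the paper estimates such component functions from data via adaptive group LASSO, which is not reproduced here.

```python
# Toy additive ODE for a two-gene network, integrated with Euler steps.
# f12 and f21 are assumed regulator-effect functions, not estimated ones.

def f12(x2):
    # Assumed Hill-type activating effect of gene 2 on gene 1.
    return x2 ** 2 / (1.0 + x2 ** 2)

def f21(x1):
    # Assumed linear inhibitory effect of gene 1 on gene 2.
    return -0.5 * x1

def step(x1, x2, dt=0.01):
    # Additive right-hand sides: self-decay plus regulator effects.
    dx1 = -0.3 * x1 + f12(x2)
    dx2 = -0.2 * x2 + 1.0 + f21(x1)
    return x1 + dt * dx1, x2 + dt * dx2

x1, x2 = 0.0, 0.0
for _ in range(5000):  # integrate towards the steady state
    x1, x2 = step(x1, x2)
print(round(x1, 3), round(x2, 3))
```

    In the SA-ODE setting, sparsity means most candidate regulator functions are estimated as exactly zero, so the fitted network keeps only a few such nonzero edges.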

  15. Sparse Additive Ordinary Differential Equations for Dynamic Gene Regulatory Network Modeling

    PubMed Central

    Wu, Hulin; Lu, Tao; Xue, Hongqi; Liang, Hua

    2014-01-01

    The gene regulation network (GRN) is a high-dimensional complex system, which can be represented by various mathematical or statistical models. The ordinary differential equation (ODE) model is one of the popular dynamic GRN models. High-dimensional linear ODE models have been proposed to identify GRNs, but with the limitation of assuming linear regulation effects. In this article, we propose a sparse additive ODE (SA-ODE) model, coupled with ODE estimation methods and adaptive group LASSO techniques, to model dynamic GRNs that can flexibly accommodate nonlinear regulation effects. The asymptotic properties of the proposed method are established, and simulation studies are performed to validate the proposed approach. An application example for identifying the nonlinear dynamic GRN of T-cell activation is used to illustrate the usefulness of the proposed method. PMID:25061254

  16. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data.

    PubMed

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2013-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice there is often prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while the asymptotic variance remains the same as that of the unguided estimator. We examine the performance of our method via a simulation study and demonstrate it by applying it to a real data set on mergers and acquisitions.
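
    The three-step guided procedure can be sketched on simulated data: (1) fit a parametric guide, (2) smooth the residuals nonparametrically, (3) add the smooth back to the parametric trend. The data, the deliberately crude linear guide, and the moving-average smoother standing in for a proper penalized-spline or kernel fit are all assumptions of this sketch, not the authors' implementation.

```python
# Parametric-guide sketch: guide fit -> residual smoothing -> recombine.
import math
import random

random.seed(0)
x = [i / 200 for i in range(200)]
y = [math.sin(2 * math.pi * xi) + 0.1 * random.gauss(0, 1) for xi in x]

# Step 1: least-squares linear guide y ~ a + b*x (a crude guide on purpose).
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Step 2: nonparametric smoothing of the remainder (moving average).
def smooth(v, h=10):
    out = []
    for i in range(len(v)):
        window = v[max(0, i - h):i + h + 1]
        out.append(sum(window) / len(window))
    return out

# Step 3: final estimate = parametric trend + smoothed remainder.
fit = [a + b * xi + ri for xi, ri in zip(x, smooth(resid))]
```

    With a good guide the remainder is close to flat, so the nonparametric step has little bias left to absorb; that is the intuition behind the variance-neutral bias reduction claimed in the abstract.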

  17. Representational Flexibility and Problem-Solving Ability in Fraction and Decimal Number Addition: A Structural Model

    ERIC Educational Resources Information Center

    Deliyianni, Eleni; Gagatsis, Athanasios; Elia, Iliada; Panaoura, Areti

    2016-01-01

    The aim of this study was to propose and validate a structural model in fraction and decimal number addition, which is founded primarily on a synthesis of major theoretical approaches in the field of representations in Mathematics and also on previous research on the learning of fractions and decimals. The study was conducted among 1,701 primary…

  18. Measuring Children's Proportional Reasoning, The "Tendency" for an Additive Strategy and The Effect of Models

    ERIC Educational Resources Information Center

    Misailadou, Christina; Williams, Julian

    2003-01-01

    We report a study of 10-14 year old children's use of additive strategies while solving ratio and proportion tasks. Rasch methodology was used to develop a diagnostic instrument that reveals children's misconceptions. Two versions of this instrument, one with "models" thought to facilitate proportional reasoning and one without were…

  19. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data

    PubMed Central

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2012-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice there is often prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while the asymptotic variance remains the same as that of the unguided estimator. We examine the performance of our method via a simulation study and demonstrate it by applying it to a real data set on mergers and acquisitions. PMID:23645976

  20. Modelling eutrophication and microbial risks in peri-urban river systems using discriminant function analysis.

    PubMed

    Pinto, U; Maheshwari, B; Shrestha, S; Morris, C

    2012-12-01

    The methodology currently available to river managers for assessment of river conditions for eutrophication and microbial risks is often time consuming and costly. There is a need for efficient predictive tools based on easily measured variables for implementing appropriate management strategies and providing advice to local river users on river health and associated risks. Using the Hawkesbury-Nepean River system in New South Wales, Australia as a case study, a stepwise discriminant function analysis was employed to develop two predictive models, one for river eutrophication risk and the other for microbial risk. The models are intended for a preliminary assessment of a river reach, particularly to assess the level of risk (high or low) of an algal bloom and whether the river water is suitable for primary contact activities such as swimming. The input variables for both models included saturated dissolved oxygen and turbidity, while the eutrophication risk model included temperature as an additional variable. When validated with an independent data set, both models predicted the observed risk category accurately in two out of three instances. Since the models developed in this study use only two or three easy-to-measure variables, their application can help in rapid assessment of river conditions, result in potential cost savings in river monitoring programs and assist in providing timely advice to the community and other users for a particular aspect of river use.
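
    A toy two-variable classifier conveys the idea of assigning a reach to a high- or low-risk class from easy-to-measure inputs. All readings below are invented, and a nearest-class-centroid rule is used as a simplified stand-in for the paper's stepwise discriminant function analysis.

```python
# Nearest-centroid classification from two water-quality variables.
# A simplified proxy for discriminant function analysis; data are made up.
def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(x, centroids):
    # Assign to the class whose centroid is closest in feature space.
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(x, centroids[c])))

# Features: [dissolved-oxygen saturation (%), turbidity (NTU)], hypothetical.
high_risk = [[45.0, 80.0], [50.0, 95.0], [40.0, 70.0]]
low_risk = [[92.0, 8.0], [88.0, 12.0], [95.0, 5.0]]
cents = {"high": centroid(high_risk), "low": centroid(low_risk)}
print(classify([48.0, 85.0], cents))
```

    A proper discriminant analysis additionally weights the features by their pooled covariance; with variables on very different scales (percent vs NTU), that weighting, or at least standardization, matters in practice.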

  1. Prediction models for cardiovascular disease risk in the general population: systematic review

    PubMed Central

    Hooft, Lotty; Schuit, Ewoud; Debray, Thomas P A; Collins, Gary S; Tzoulaki, Ioanna; Lassale, Camille M; Siontis, George C M; Chiocchia, Virginia; Roberts, Corran; Schlüssel, Michael Maia; Gerry, Stephen; Black, James A; Heus, Pauline; van der Schouw, Yvonne T; Peelen, Linda M; Moons, Karel G M

    2016-01-01

    Objective To provide an overview of prediction models for risk of cardiovascular disease (CVD) in the general population. Design Systematic review. Data sources Medline and Embase until June 2013. Eligibility criteria for study selection Studies describing the development or external validation of a multivariable model for predicting CVD risk in the general population. Results 9965 references were screened, of which 212 articles were included in the review, describing the development of 363 prediction models and 473 external validations. Most models were developed in Europe (n=167, 46%) and predicted the risk of fatal or non-fatal coronary heart disease (n=118, 33%) over a 10-year period (n=209, 58%). The most common predictors were smoking (n=325, 90%) and age (n=321, 88%), and most models were sex specific (n=250, 69%). Substantial heterogeneity in predictor and outcome definitions was observed between models, and important clinical and methodological information was often missing. The prediction horizon was not specified for 49 models (13%), and for 92 (25%) the information crucial to enable the model to be used for individual risk prediction was missing. Only 132 developed models (36%) were externally validated and only 70 (19%) by independent investigators. Model performance was heterogeneous, and measures such as discrimination and calibration were reported for only 65% and 58% of the external validations, respectively. Conclusions There is an excess of models predicting incident CVD in the general population. The usefulness of most of the models remains unclear owing to methodological shortcomings, incomplete presentation, and lack of external validation and model impact studies. Rather than developing yet another similar CVD risk prediction model, in this era of large datasets, future research should focus on externally validating and comparing head-to-head promising CVD risk models that already exist, on tailoring or even combining these models to local

  2. Does the model of additive effect in placebo research still hold true? A narrative review

    PubMed Central

    Berger, Bettina; Weger, Ulrich; Heusser, Peter

    2017-01-01

    Personalised and contextualised care has become a major demand among people involved in healthcare, suggesting a move toward person-centred medicine. The assessment of person-centred medicine can be achieved most effectively if treatments are investigated using ‘with versus without’ person-centredness or integrative study designs. However, this assumes that the components of an integrative or person-centred intervention combine additively to produce the total effect. Beecher’s model of additivity assumes an additive relation between placebo and drug effects and thus represents an arithmetic summation. So far, no review has assessed the validity of the additive model, which is therefore questioned and investigated more closely in this review. Initial searches for primary studies were undertaken in July 2016 using Pubmed and Google Scholar. In order to find matching publications of similar magnitude for the comparison part of this review, corresponding matches for all included reviews were sought. A total of 22 reviews and 3 clinical and experimental studies fulfilled the inclusion criteria. The results pointed to the following factors actively questioning the additive model: interactions of various effects, trial design, conditioning, context effects and factors, neurobiological factors, mechanism of action, statistical factors, intervention-specific factors (alcohol, caffeine), side-effects and type of intervention. All but one of the closely assessed publications questioned the additive model. A closer examination of study design is necessary. A more systematic approach geared towards solutions would be a suggestion for future research in this field. PMID:28321318

  3. The OPTIONS model of sexual risk assessment for adolescents.

    PubMed

    Lusczakoski, Kathryn D; Rue, Lisa A

    2012-03-01

    Typically, clinical evaluation of adolescents' sexual risk is based on inquiring about past sexual activity, an approach limited by its omission of the adolescent's cognitive decision making regarding past sexual decisions. This study describes the novel OPTIONS framework for assessing adolescent sexual risk, comprising three general categories of risk (primary, secondary, and tertiary), which is designed to overcome the limitations of action-based risk assessment and improve practitioners' ability to assess levels of sexual risk. A convenience sample of 201 older adolescents (18-19 years of age) completed an online version of the Relationship Options Survey (ROS), designed to measure the OPTIONS sexual risk assessment. Bivariate correlations among the subscales functioned in the hypothesized manner, with all correlations being statistically significant. Using the OPTIONS model, 22.4% of participants were classified as high-risk primary, 7.0% as high-risk secondary, and 27.4% as high-risk tertiary. The study provides preliminary evidence for the OPTIONS model of sexual risk assessment, which offers a more tailored evaluation by including cognitive decisions regarding an adolescent's sexual actions.

  4. Formation and reduction of carcinogenic furan in various model systems containing food additives.

    PubMed

    Kim, Jin-Sil; Her, Jae-Young; Lee, Kwang-Geun

    2015-12-15

    The aim of this study was to analyse and reduce furan in various model systems. Furan model systems consisting of monosaccharides (0.5M glucose and ribose), amino acids (0.5M alanine and serine) and/or 1.0M ascorbic acid were heated at 121°C for 25 min. The effects of food additives (each 0.1M) such as metal ions (iron sulphate, magnesium sulphate, zinc sulphate and calcium sulphate), antioxidants (BHT and BHA), and sodium sulphite on the formation of furan were measured. The level of furan formed in the model systems was 6.8-527.3 ng/ml. The level of furan in the glucose/serine and glucose/alanine model systems increased by 7-674% when food additives were added. In contrast, the level of furan decreased by 18-51% in the Maillard reaction model systems that included ribose and alanine/serine with food additives other than zinc sulphate.

  5. Back-end Science Model Integration for Ecological Risk Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  7. Risk Prediction Models for Other Cancers or Multiple Sites

    Cancer.gov

    Developing statistical models that estimate the probability of developing multiple other cancers over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing earlier or more frequent screening and counseling on behavioral changes to decrease risk.

  8. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    ERIC Educational Resources Information Center

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  9. Rare disasters and risk attitudes: international differences and implications for integrated assessment modeling.

    PubMed

    Ding, P; Gerst, M D; Bernstein, A; Howarth, R B; Borsuk, M E

    2012-11-01

    Evaluation of public policies with uncertain economic outcomes should consider society's preferences regarding risk. However, the preference models used in most integrated assessment models, including those commonly used to inform climate policy, do not adequately characterize the risk attitudes revealed by typical investment decisions. Here, we adopt an empirical approach to risk preference description using international historical data on investment returns and the occurrence of rare economic disasters. We improve on earlier analyses by employing a hierarchical Bayesian inference procedure that allows for nation-specific estimates of both disaster probabilities and preference parameters. This provides a stronger test of the underlying investment model than provided by previous calibrations and generates some compelling hypotheses for further study. Specifically, results suggest that society is substantially more averse to risk than typically assumed in integrated assessment models. In addition, there appear to be systematic differences in risk preferences among nations. These results are likely to have important implications for policy recommendations: higher aversion to risk increases the precautionary value of taking action to avoid low-probability, high-impact outcomes. However, geographically variable attitudes toward risk indicate that this precautionary value could vary widely across nations, thereby potentially complicating the negotiation of transboundary agreements focused on risk reduction.

  10. NB-PLC channel modelling with cyclostationary noise addition & OFDM implementation for smart grid

    NASA Astrophysics Data System (ADS)

    Thomas, Togis; Gupta, K. K.

    2016-03-01

    Power line communication (PLC) technology can be a viable solution for future ubiquitous networks because it provides a cheaper alternative to the other wired technologies currently used for communication. In the smart grid, PLC is used to support low-rate communication on the low voltage (LV) distribution network. In this paper, we propose a channel model for narrowband (NB) PLC in the frequency range 5 kHz to 500 kHz using ABCD parameters with cyclostationary noise addition. The behaviour of the channel was studied by adding an 11 kV/230 V transformer and by varying the load and its location. Bit error rate (BER) versus signal-to-noise ratio (SNR) was plotted for the proposed model by employing OFDM. Our simulation results based on the proposed channel model show acceptable performance in terms of bit error rate versus signal-to-noise ratio, enabling the communication required for smart grid applications.
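
    The ABCD (transmission-matrix) approach can be sketched briefly: cascaded two-port sections multiply, and the end-to-end transfer function follows from the overall matrix together with the source and load impedances. The component values below are invented, and the paper's distribution-network topology and cyclostationary noise model are not reproduced.

```python
# ABCD-matrix sketch of a cascaded channel section at one frequency.
# H = ZL / (A*ZL + B + C*Zs*ZL + D*Zs) for a two-port between a source
# with impedance Zs and a load ZL. All element values are hypothetical.
import cmath

def mat_mul(m1, m2):
    (a1, b1), (c1, d1) = m1
    (a2, b2), (c2, d2) = m2
    return ((a1 * a2 + b1 * c2, a1 * b2 + b1 * d2),
            (c1 * a2 + d1 * c2, c1 * b2 + d1 * d2))

def series_impedance(z):
    return ((1, z), (0, 1))

def shunt_admittance(y):
    return ((1, 0), (y, 1))

def transfer_function(abcd, zs, zl):
    (a, b), (c, d) = abcd
    return zl / (a * zl + b + c * zs * zl + d * zs)

f = 100e3  # 100 kHz, inside the 5-500 kHz NB-PLC band
line = series_impedance(2 + 1j * 2 * cmath.pi * f * 10e-6)  # R + jwL
tap = shunt_admittance(1 / 50)  # a 50-ohm shunt load at a tap point
chain = mat_mul(line, tap)
print(abs(transfer_function(chain, zs=50, zl=50)))
```

    Sweeping f over the band yields the magnitude response of the modelled channel; in the paper, the cascaded sections would represent cable segments, the transformer, and the varying tap loads.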

  11. Hemolysate-mediated platelet aggregation: an additional risk mechanism contributing to thrombosis of continuous flow ventricular assist devices.

    PubMed

    Tran, Phat L; Pietropaolo, Maria-Grazia; Valerio, Lorenzo; Brengle, William; Wong, Raymond K; Kazui, Toshinobu; Khalpey, Zain I; Redaelli, Alberto; Sheriff, Jawaad; Bluestein, Danny; Slepian, Marvin J

    2016-07-01

    Despite the clinical success and growth in the utilization of continuous flow ventricular assist devices (cfVADs) for the treatment of advanced heart failure, hemolysis and thrombosis remain major limitations. Inadequate and/or ineffective anticoagulation regimens, combined with high pump speed and non-physiological flow patterns, can result in hemolysis which often is accompanied by pump thrombosis. An unexpected increase in cfVADs thrombosis was reported by multiple major VAD implanting centers in 2014, highlighting the association of hemolysis and a rise in lactate dehydrogenase (LDH) presaging thrombotic events. It is well established that thrombotic complications arise from the abnormal shear stresses generated by cfVADs. What remains unknown is the link between cfVAD-associated hemolysis and pump thrombosis. Can hemolysis of red blood cells (RBCs) contribute to platelet aggregation, thereby, facilitating prothrombotic complications in cfVADs? Herein, we examine the effect of RBC-hemolysate and selected major constituents, i.e., lactate dehydrogenase (LDH) and plasma free hemoglobin (pHb) on platelet aggregation, utilizing electrical resistance aggregometry. Our hypothesis is that elements of RBCs, released as a result of shear-mediated hemolysis, will contribute to platelet aggregation. We show that RBC hemolysate and pHb, but not LDH, are direct contributors to platelet aggregation, posing an additional risk mechanism for cfVAD thrombosis.

  12. Cumulative effects of bamboo sawdust addition on pyrolysis of sewage sludge: Biochar properties and environmental risk from metals.

    PubMed

    Jin, Junwei; Wang, Minyan; Cao, Yucheng; Wu, Shengchun; Liang, Peng; Li, Yanan; Zhang, Jianyun; Zhang, Jin; Wong, Ming Hung; Shan, Shengdao; Christie, Peter

    2017-03-01

    A novel type of biochar was produced by mixing bamboo sawdust with sewage sludge (1:1, w/w) via a co-pyrolysis process at 400-600°C. Changes in physico-chemical properties and the intrinsic speciation of metals were investigated before and after pyrolysis. Co-pyrolysis resulted in a lower biochar yield but a higher C content in the end product compared with the use of sludge alone as the raw material. FT-IR analysis indicates that phosphine derivatives containing P-H bonds were formed in the co-pyrolyzed biochars. In addition, co-pyrolysis of sludge with bamboo sawdust transformed the potentially toxic metals in the sludge into more stable fractions, leading to a considerable decrease in their direct toxicity and bioavailability in the co-pyrolyzed biochar. In conclusion, co-pyrolysis provides a feasible method for the safe disposal of metal-contaminated sewage sludge in an attempt to minimize the environmental risk from potentially toxic metals after land application.

  13. Lymphatic Filariasis Transmission Risk Map of India, Based on a Geo-Environmental Risk Model

    PubMed Central

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-01-01

    The strategy adopted by a global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy by introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Therefore, identification of LF transmission risk areas in the entire country has become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and endemicity (historically known) of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was found to be correct in 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM developed on a geographic information system (GIS) platform is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful for the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas. PMID:23808973
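    The quantification step described above, relating a risk index to known endemicity by logistic regression, can be sketched as follows on synthetic data. The index range, the "true" coefficients used to generate labels, and the fitting hyperparameters are all invented for illustration:

```python
import math
import random

random.seed(7)

# synthetic survey: risk index x in [0, 10]; "historically endemic" labels are
# drawn with probability sigmoid(x - 5) (an assumed true relationship)
data = [(x / 10.0, 1 if random.random() < 1 / (1 + math.exp(-(x / 10.0 - 5.0))) else 0)
        for x in range(0, 101)]

def fit_logistic(points, lr=0.05, epochs=2000):
    # gradient ascent on the logistic log-likelihood (intercept + one slope)
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in points:
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / len(points)
        b1 += lr * g1 / len(points)
    return b0, b1

b0, b1 = fit_logistic(data)

def predicted_risk(x):
    # fitted probability that an area with index x is endemic
    return 1 / (1 + math.exp(-(b0 + b1 * x)))
```

    A positive fitted slope reproduces the "significant positive relationship" between the index and endemicity; the fitted curve can then score unsurveyed areas.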

  14. Validation of a novel air toxic risk model with air monitoring.

    PubMed

    Pratt, Gregory C; Dymond, Mary; Ellickson, Kristie; Thé, Jesse

    2012-01-01

    Three modeling systems were used to estimate human health risks from air pollution: two versions of MNRiskS (for Minnesota Risk Screening), and the USEPA National Air Toxics Assessment (NATA). MNRiskS is a unique cumulative risk modeling system used to assess risks from multiple air toxics, sources, and pathways on a local to a state-wide scale. In addition, ambient outdoor air monitoring data were available for estimation of risks and comparison with the modeled estimates of air concentrations. Highest air concentrations and estimated risks were generally found in the Minneapolis-St. Paul metropolitan area and lowest risks in undeveloped rural areas. Emissions from mobile and area (nonpoint) sources created greater estimated risks than emissions from point sources. Highest cancer risks were via ingestion pathway exposures to dioxins and related compounds. Diesel particles, acrolein, and formaldehyde created the highest estimated inhalation health impacts. Model-estimated air concentrations were generally highest for NATA and lowest for the AERMOD version of MNRiskS. This validation study showed reasonable agreement between available measurements and model predictions, although results varied among pollutants, and predictions were often lower than measurements. The results increased confidence in identifying pollutants, pathways, geographic areas, sources, and receptors of potential concern, and thus provide a basis for informing pollution reduction strategies and focusing efforts on specific pollutants (diesel particles, acrolein, and formaldehyde), geographic areas (urban centers), and source categories (nonpoint sources). The results heighten concerns about risks from food chain exposures to dioxins and PAHs. Risk estimates were sensitive to variations in methodologies for treating emissions, dispersion, deposition, exposure, and toxicity.
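    The model-versus-measurement comparison described above is often summarized with simple ratio and fractional-bias statistics. A sketch with hypothetical paired concentrations (not the study's data):

```python
# hypothetical paired annual-average concentrations in ug/m3: (measured, modeled)
pairs = {
    "formaldehyde": (2.1, 1.6),
    "acrolein": (0.30, 0.12),
    "benzene": (1.2, 1.0),
}

def fractional_bias(measured, modeled):
    # FB lies in [-2, 2]; negative values indicate model underprediction
    return 2 * (modeled - measured) / (modeled + measured)

stats = {name: {"ratio": mo / me, "fb": fractional_bias(me, mo)}
         for name, (me, mo) in pairs.items()}
```

    Ratios below one for every pollutant would mirror the study's finding that predictions were often lower than measurements.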

  15. Submission Form for Peer-Reviewed Cancer Risk Prediction Models

    Cancer.gov

    If you have information about a peer-reviewed cancer risk prediction model that you would like to be considered for inclusion on this list, submit as much information as possible through the form on this page.

  16. A model for assessing the risk of human trafficking on a local level

    NASA Astrophysics Data System (ADS)

    Colegrove, Amanda

    Human trafficking is a human rights violation that is difficult to quantify. Models for estimating the number of trafficking victims presented by previous researchers depend on inconsistent, poor-quality data. As an intermediate step to support current efforts by nonprofits to combat human trafficking, this project presents a model that does not depend on quantitative data specific to human trafficking, but instead profiles the risk of human trafficking at the local level through causative factors. Businesses identified in the literature were weighted based on the presence of characteristics that increase the likelihood of trafficking in persons. The mean risk was calculated by census tract to reveal the multiplicity of risk levels in both rural and urban settings. Results indicate that labor trafficking may be a more diffuse problem in Missouri than sex trafficking. Additionally, spatial patterns of risk remained largely the same regardless of adjustments made to the model.
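    The scoring scheme described above, averaging business risk weights within each census tract, can be sketched as follows; the tract names and weights are hypothetical:

```python
# hypothetical business records: (census_tract, risk_weight), where weights
# reflect characteristics the literature links to trafficking risk
businesses = [
    ("tract_A", 3), ("tract_A", 1), ("tract_A", 2),
    ("tract_B", 1), ("tract_B", 1),
    ("tract_C", 5),
]

def mean_risk_by_tract(records):
    # average the risk weights of the businesses within each census tract
    totals = {}
    for tract, w in records:
        s, n = totals.get(tract, (0, 0))
        totals[tract] = (s + w, n + 1)
    return {t: s / n for t, (s, n) in totals.items()}

risk = mean_risk_by_tract(businesses)
```

    Mapping these tract-level means is what reveals the mix of risk levels across rural and urban areas.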

  17. Predicting the occurrence of wildfires with binary structured additive regression models.

    PubMed

    Ríos-Pena, Laura; Kneib, Thomas; Cadarso-Suárez, Carmen; Marey-Pérez, Manuel

    2017-02-01

    Wildfires are one of the main environmental problems facing societies today, and in the case of Galicia (north-west Spain), they are the main cause of forest destruction. This paper used binary structured additive regression (STAR) for modelling the occurrence of wildfires in Galicia. Binary STAR models are a recent extension of classical logistic regression and binary generalized additive models. Their main advantage lies in their flexibility for modelling non-linear effects while simultaneously incorporating spatial and temporal variables directly, thereby making it possible to reveal relationships among the variables considered. The results showed that the occurrence of wildfires depends on many covariates which display variable behaviour across space and time, and which largely determine the likelihood of ignition of a fire. The ability to work at a spatial resolution of 1 × 1 km cells and to map predictions on a colour scale makes STAR models a useful tool for plotting and predicting wildfire occurrence. Lastly, it will facilitate the development of fire behaviour models, which can be invaluable when it comes to drawing up fire-prevention and firefighting plans.

  18. The effect of tailor-made additives on crystal growth of methyl paraben: Experiments and modelling

    NASA Astrophysics Data System (ADS)

    Cai, Zhihui; Liu, Yong; Song, Yang; Guan, Guoqiang; Jiang, Yanbin

    2017-03-01

    In this study, methyl paraben (MP) was selected as the model component, and acetaminophen (APAP), p-methyl acetanilide (PMAA) and acetanilide (ACET), which share a similar molecular structure with MP, were selected as the three tailor-made additives to study their effect on the crystal growth of MP. HPLC results indicated that the MP crystals induced by the three additives contained MP only. Photographs of the single crystals prepared indicated that the morphology of the MP crystals was greatly changed by the additives, but PXRD and single-crystal diffraction results illustrated that the MP crystals were the same polymorph, only with different crystal habits, and no new crystal form was found compared with previous reports. To investigate the effect of the additives on crystal growth, the interaction between additives and facets was discussed in detail using DFT methods and MD simulations. The results showed that APAP, PMAA and ACET would be selectively adsorbed on the growth surfaces of the crystal facets, which induced the change in MP crystal habits.

  19. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    SciTech Connect

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) To identify the key processes and chemical and/or environmental parameters that determine the exposure; thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. 
Therefore, to choose an appropriate model, the risk profile developer must evaluate how suitable an existing model is for a specific setting and whether its assumptions and input data are relevant in the context of the application.

  20. Regulatory network reconstruction using an integral additive model with flexible kernel functions

    PubMed Central

    Novikov, Eugene; Barillot, Emmanuel

    2008-01-01

    Background: Reconstruction of regulatory networks is one of the most challenging tasks of systems biology. A limited amount of experimental data and little prior knowledge make the problem difficult to solve. Although models that are currently used for inferring regulatory networks are sometimes able to make useful predictions about the structures and mechanisms of molecular interactions, there is still a strong demand for increasingly universal and accurate approaches to network reconstruction. Results: The additive regulation model is represented by a set of differential equations and is frequently used for network inference from time series data. Here we generalize this model by converting the differential equations into integral equations with adjustable kernel functions. These kernel functions can be selected based on prior knowledge or defined through iterative improvement in data analysis. This makes the integral model very flexible and thus capable of covering a broad range of biological systems more adequately and specifically than previous models. Conclusion: We reconstructed network structures from artificial and real experimental data using differential and integral inference models. The artificial data were simulated using mathematical models implemented in JDesigner. The real data were publicly available yeast cell cycle microarray time series. The integral model outperformed the differential one in all cases. In the integral model, we tested the zero-degree polynomial and single exponential kernels. Further improvements could be expected if the kernel were selected more specifically depending on the system. PMID:18218091
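    The generalization described above replaces the additive differential model, in which each gene's rate of change is a weighted sum of current expression levels, with an integral model in which regulation acts through a kernel-weighted convolution of past expression. A numerical sketch of the forward (simulation) direction, using a two-gene toy network whose weights and exponential kernel rate are assumed for illustration:

```python
import math

# two-gene toy network: gene 1 activates gene 2, gene 2 represses gene 1
# (weights and kernel rate are assumed, not inferred from data)
w = [[0.0, -0.8],
     [0.9, 0.0]]
dt, steps = 0.02, 400

def kernel(tau, rate=2.0):
    # single-exponential kernel; a zero-degree polynomial kernel would be constant
    return rate * math.exp(-rate * tau)

def simulate_integral(x0):
    # Euler integration: regulation enters through a convolution of each
    # regulator's past expression with the kernel, not its instantaneous value
    hist = [list(x0)]
    for n in range(1, steps):
        x = []
        for i in range(len(x0)):
            conv = sum(w[i][j] * sum(kernel((n - m) * dt) * hist[m][j] * dt
                                     for m in range(n))
                       for j in range(len(x0)))
            x.append(hist[-1][i] + dt * conv)
        hist.append(x)
    return hist

traj = simulate_integral([1.0, 0.0])
```

    Inference would run this in reverse, adjusting the weights (and possibly the kernel) so that simulated trajectories match the observed time series.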

  1. A Hybrid Tsunami Risk Model for Japan

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A. V.; Smith, D. F.; Khater, M.; Khemici, O.; Betov, B.; Scott, J.

    2014-12-01

    Around the margins of the Pacific Ocean, denser oceanic plates slipping under continental plates cause subduction earthquakes generating large tsunami waves. The subducting Pacific and Philippine Sea plates create damaging interplate earthquakes followed by huge tsunami waves. It was a rupture of the Japan Trench subduction zone (JTSZ) and the resultant M9.0 Tohoku-Oki earthquake that caused the unprecedented tsunami along the Pacific coast of Japan on March 11, 2011. EQECAT's Japan Earthquake model is a fully probabilistic model which includes a seismo-tectonic model describing the geometries, magnitudes, and frequencies of all potential earthquake events; a ground motion model; and a tsunami model. Within the much larger set of all modeled earthquake events, fault rupture parameters for about 24000 stochastic and 25 historical tsunamigenic earthquake events are defined to simulate tsunami footprints using the numerical tsunami model COMCOT. A hybrid approach using COMCOT simulated tsunami waves is used to generate inundation footprints, including the impact of tides and flood defenses. Modeled tsunami waves of major historical events are validated against observed data. Modeled tsunami flood depths on 30 m grids together with tsunami vulnerability and financial models are then used to estimate insured loss in Japan from the 2011 tsunami. The primary direct report of damage from the 2011 tsunami is in terms of the number of buildings damaged by municipality in the tsunami affected area. Modeled loss in Japan from the 2011 tsunami is proportional to the number of buildings damaged. A 1000-year return period map of tsunami waves shows high hazard along the west coast of southern Honshu, on the Pacific coast of Shikoku, and on the east coast of Kyushu, primarily associated with major earthquake events on the Nankai Trough subduction zone (NTSZ). The highest tsunami hazard of more than 20m is seen on the Sanriku coast in northern Honshu, associated with the JTSZ.

  2. Usefulness and limitations of global flood risk models

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface for other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  3. The effectiveness of a graphical presentation in addition to a frequency format in the context of familial breast cancer risk communication: a multicenter controlled trial

    PubMed Central

    2013-01-01

    Background: Inadequate understanding of risk among counselees is a common problem in familial cancer clinics. It has been suggested that graphical displays can help counselees understand cancer risks and subsequent decision-making. We evaluated the effects of a graphical presentation in addition to a frequency format on counselees’ understanding, psychological well-being, and preventive intentions. Design: Multicenter controlled trial. Setting: Three familial cancer clinics in the Netherlands. Methods: Participants: Unaffected women with a breast cancer family history (first-time attendees). Intervention: Immediately after standard genetic counseling, an additional consultation by a trained risk counselor took place where women were presented with their lifetime breast cancer risk in frequency format (X out of 100) (n = 63) or frequency format plus graphical display (10 × 10 human icons) (n = 91). Main outcome measures: understanding of risk (risk accuracy, risk perception), psychological well-being, and intentions regarding cancer prevention. Measurements were assessed using questionnaires at baseline, 2-week and 6-month follow-up. Results: Baseline participant characteristics did not differ between the two groups. In both groups there was an increase in women’s risk accuracy from baseline to follow-up. No significant differences were found between women who received the frequency format and those who received an additional graphical display in terms of understanding, psychological well-being and intentions regarding cancer prevention. The groups did not differ in their evaluation of the process of counseling. Conclusion: Women’s personal risk estimation accuracy was generally high at baseline and the results suggest that an additional graphical display does not lead to a significant benefit in terms of increasing understanding of risk, psychological well-being and preventive intentions. Trial registration: Current Controlled Trials http://ISRCTN14566836

  4. A Corrosion Risk Assessment Model for Underground Piping

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project: a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies for the underground pipes at ARC.
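    A Monte Carlo combination of sub-models of the kind described above can be sketched as follows. All distributions, the wall thickness, and the failure criterion here are hypothetical stand-ins, not the paper's calibrated models:

```python
import random

random.seed(42)

def corrosion_depth_mm(age_yr):
    # corrosion sub-model: rate with lognormal uncertainty (parameters assumed)
    rate_mm_per_yr = random.lognormvariate(-2.5, 0.5)
    return rate_mm_per_yr * age_yr

def wrap_intact():
    # pipe-wrap protection sub-model: assume an 80% chance the wrap still
    # blocks corrosion on a given segment
    return random.random() < 0.8

def segment_fails(age_yr, wall_mm=6.0, critical_fraction=0.5):
    # stress-model proxy: failure once unprotected corrosion consumes half the wall
    depth = 0.0 if wrap_intact() else corrosion_depth_mm(age_yr)
    return depth > critical_fraction * wall_mm

n = 20000
p_fail = sum(segment_fails(age_yr=40) for _ in range(n)) / n
```

    Repeating the simulation while holding one sub-model fixed at its mean is a simple way to see which source of uncertainty dominates the failure probability.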

  5. Sensitivity and uncertainty analysis of a regulatory risk model

    SciTech Connect

    Kumar, A.; Manocha, A.; Shenoy, T.

    1999-07-01

    Health risk assessments (HRAs) are increasingly being used in the environmental decision-making process, from problem identification to the final clean-up activities. A key issue concerning the results of these risk assessments is the uncertainty associated with them. This uncertainty has been attributed to highly conservative estimates of risk assessment parameters in past studies. The primary purpose of this study was to investigate error propagation through a risk model. A hypothetical glass plant situated in the state of California was studied. Air emissions from this plant were modeled using the ISCST2 model, and the risk was calculated using the ACE2588 model. Downwash was also considered during the concentration calculations. A sensitivity analysis of the risk computations identified the five parameters with the greatest impact on the calculated risk: mixing depth for human consumption, deposition velocity, weathering constant, interception factor for vine crops, and average leaf vegetable consumption. A Monte Carlo analysis using these five parameters resulted in a risk distribution whose percentage deviation was smaller than the percentage standard deviations of the input parameters.
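    The sensitivity-analysis step described above can be illustrated with a one-at-a-time perturbation of a toy multiplicative risk function. The function and parameter values are invented; they only mimic the direction of influence of the five inputs, and are not the ACE2588 formulation:

```python
# hypothetical baseline parameter values for a toy risk chain
base = {
    "mixing_depth": 0.15,
    "deposition_velocity": 0.01,
    "weathering_constant": 0.05,
    "interception_fraction": 0.4,
    "leaf_veg_consumption": 0.03,
}

def risk(p):
    # toy risk function: risk grows with deposition, interception and intake,
    # and falls with mixing depth and weathering losses
    return (p["deposition_velocity"] * p["interception_fraction"]
            * p["leaf_veg_consumption"] / (p["mixing_depth"] * p["weathering_constant"]))

def sensitivity(params, bump=0.10):
    # relative change in risk for a +10% change in each parameter, one at a time
    r0 = risk(params)
    out = {}
    for k in params:
        p = dict(params)
        p[k] *= 1 + bump
        out[k] = (risk(p) - r0) / r0
    return out

s = sensitivity(base)
```

    Ranking the parameters by the magnitude of these relative changes identifies the inputs worth sampling in the subsequent Monte Carlo step.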

  6. Test of the Additivity Principle for Current Fluctuations in a Model of Heat Conduction

    NASA Astrophysics Data System (ADS)

    Hurtado, Pablo I.; Garrido, Pedro L.

    2009-06-01

    The additivity principle makes it possible to compute the current distribution in many one-dimensional (1D) nonequilibrium systems. Using simulations, we confirm this conjecture in the 1D Kipnis-Marchioro-Presutti model of heat conduction for a wide current interval. The current distribution shows both Gaussian and non-Gaussian regimes, and obeys the Gallavotti-Cohen fluctuation theorem. We verify the existence of a well-defined temperature profile associated with a given current fluctuation. This profile is independent of the sign of the current, and this symmetry extends to higher-order profiles and spatial correlations. We also show that finite-time joint fluctuations of the current and the profile are described by the additivity functional. These results suggest the additivity hypothesis as a general and powerful tool to compute current distributions in many nonequilibrium systems.

  7. Test of the additivity principle for current fluctuations in a model of heat conduction.

    PubMed

    Hurtado, Pablo I; Garrido, Pedro L

    2009-06-26

    The additivity principle makes it possible to compute the current distribution in many one-dimensional (1D) nonequilibrium systems. Using simulations, we confirm this conjecture in the 1D Kipnis-Marchioro-Presutti model of heat conduction for a wide current interval. The current distribution shows both Gaussian and non-Gaussian regimes, and obeys the Gallavotti-Cohen fluctuation theorem. We verify the existence of a well-defined temperature profile associated with a given current fluctuation. This profile is independent of the sign of the current, and this symmetry extends to higher-order profiles and spatial correlations. We also show that finite-time joint fluctuations of the current and the profile are described by the additivity functional. These results suggest the additivity hypothesis as a general and powerful tool to compute current distributions in many nonequilibrium systems.

  8. Generalized Additive Mixed-Models for Pharmacology Using Integrated Discrete Multiple Organ Co-Culture.

    PubMed

    Ingersoll, Thomas; Cole, Stephanie; Madren-Whalley, Janna; Booker, Lamont; Dorsey, Russell; Li, Albert; Salem, Harry

    2016-01-01

    Integrated Discrete Multiple Organ Co-culture (IDMOC) is emerging as an in-vitro alternative to in-vivo animal models for pharmacology studies. IDMOC allows dose-response relationships to be investigated at the tissue and organoid levels, yet, these relationships often exhibit responses that are far more complex than the binary responses often measured in whole animals. To accommodate departure from binary endpoints, IDMOC requires an expansion of analytic techniques beyond simple linear probit and logistic models familiar in toxicology. IDMOC dose-responses may be measured at continuous scales, exhibit significant non-linearity such as local maxima or minima, and may include non-independent measures. Generalized additive mixed-modeling (GAMM) provides an alternative description of dose-response that relaxes assumptions of independence and linearity. We compared GAMMs to traditional linear models for describing dose-response in IDMOC pharmacology studies.

  9. Generalized Additive Mixed-Models for Pharmacology Using Integrated Discrete Multiple Organ Co-Culture

    PubMed Central

    Ingersoll, Thomas; Cole, Stephanie; Madren-Whalley, Janna; Booker, Lamont; Dorsey, Russell; Li, Albert; Salem, Harry

    2016-01-01

    Integrated Discrete Multiple Organ Co-culture (IDMOC) is emerging as an in-vitro alternative to in-vivo animal models for pharmacology studies. IDMOC allows dose-response relationships to be investigated at the tissue and organoid levels, yet, these relationships often exhibit responses that are far more complex than the binary responses often measured in whole animals. To accommodate departure from binary endpoints, IDMOC requires an expansion of analytic techniques beyond simple linear probit and logistic models familiar in toxicology. IDMOC dose-responses may be measured at continuous scales, exhibit significant non-linearity such as local maxima or minima, and may include non-independent measures. Generalized additive mixed-modeling (GAMM) provides an alternative description of dose-response that relaxes assumptions of independence and linearity. We compared GAMMs to traditional linear models for describing dose-response in IDMOC pharmacology studies. PMID:27110941

  10. Use of additive technologies for practical working with complex models for foundry technologies

    NASA Astrophysics Data System (ADS)

    Olkhovik, E.; Butsanets, A. A.; Ageeva, A. A.

    2016-07-01

    The article presents the results of research into the application of additive technology (3D printing) for developing geometrically complex casting models. Investment casting is a well-known and widely used technology for the production of complex parts. The work proposes the use of 3D printing to manufacture model parts, which are then removed by thermal destruction. Traditional methods of tooling production for investment casting involve manual labor, which has problems with dimensional accuracy, and CNC machining, which is less commonly used. Such a scheme has low productivity and demands considerable time. We have offered an alternative method which consists in printing the main components on a 3D printer (PLA and ABS) and subsequently producing casting models from them. In this article, the main technological methods are considered and their problems are discussed. The dimensional accuracy of the models in comparison with investment casting technology is considered as the main aspect.

  11. Evidence of thermal additivity during short laser pulses in an in vitro retinal model

    NASA Astrophysics Data System (ADS)

    Denton, Michael L.; Tijerina, Amanda J.; Dyer, Phillip N.; Oian, Chad A.; Noojin, Gary D.; Rickman, John M.; Shingledecker, Aurora D.; Clark, Clifton D.; Castellanos, Cherry C.; Thomas, Robert J.; Rockwell, Benjamin A.

    2015-03-01

    Laser damage thresholds were determined for exposure to 2.5-ms 532-nm pulses in an established in vitro retinal model. Single and multiple pulses (10, 100, 1000) were delivered to the cultured cells at three different pulse repetition frequency (PRF) values, and overt damage (membrane breach) was scored 1 hr post laser exposure. Trends in the damage data within and across the PRF range identified significant thermal additivity as PRF was increased, as evidenced by drastically reduced threshold values (< 40% of single-pulse value). Microthermography data that were collected in real time during each exposure also provided evidence of thermal additivity between successive laser pulses. Using thermal profiles simulated at high temporal resolution, damage threshold values were predicted by an in-house computational model. Our simulated ED50 value for a single 2.5-ms pulse was in very good agreement with experimental results, but ED50 predictions for multiple-pulse trains will require more refinement.

  12. Describing long-term trends in precipitation using generalized additive models

    NASA Astrophysics Data System (ADS)

    Underwood, Fiona M.

    2009-01-01

    With the current concern over climate change, descriptions of how rainfall patterns are changing over time can be useful. Observations of daily rainfall data over the last few decades provide information on these trends. Generalized linear models are typically used to model patterns in the occurrence and intensity of rainfall. These models describe rainfall patterns for an average year but are more limited when describing long-term trends, particularly when these are potentially non-linear. Generalized additive models (GAMs) provide a framework for modelling non-linear relationships by fitting smooth functions to the data. This paper describes how GAMs can extend the flexibility of models to describe seasonal patterns and long-term trends in the occurrence and intensity of daily rainfall using data from Mauritius from 1962 to 2001. Smoothed estimates from the models provide useful graphical descriptions of changing rainfall patterns over the last 40 years at this location. GAMs are particularly helpful when exploring non-linear relationships in the data. Care is needed to ensure the choice of smooth functions is appropriate for the data and modelling objectives.
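    In the additive spirit described above, a rainfall series can be decomposed into a smooth seasonal term plus a long-term trend. A stdlib-only sketch on synthetic monthly data (a real GAM would fit penalized splines rather than the closed-form estimates used here, and the data-generating numbers are invented):

```python
import math
import random

random.seed(1)

# synthetic monthly rainfall, 40 years: seasonal cycle + slow trend + noise
n = 40 * 12
rain = [80 + 30 * math.sin(2 * math.pi * (m % 12) / 12) + 0.05 * m
        + random.gauss(0, 5) for m in range(n)]

# linear trend by closed-form least squares (complete years keep the
# seasonal term from leaking into the slope estimate)
t_mean = (n - 1) / 2
slope = (sum((m - t_mean) * r for m, r in enumerate(rain))
         / sum((m - t_mean) ** 2 for m in range(n)))

# smooth seasonal term: mean of the detrended series for each calendar month
detrended = [r - slope * m for m, r in enumerate(rain)]
seasonal = [sum(detrended[k] for k in range(i, n, 12)) / (n // 12)
            for i in range(12)]
```

    The additive structure means each component can be inspected separately: the slope summarises the long-term change while the seasonal vector describes the within-year pattern.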

  13. Boosted structured additive regression for Escherichia coli fed-batch fermentation modeling.

    PubMed

    Melcher, Michael; Scharl, Theresa; Luchner, Markus; Striedner, Gerald; Leisch, Friedrich

    2017-02-01

    The quality of biopharmaceuticals and patient safety are of the highest priority, and there are tremendous efforts to replace empirical production process designs with knowledge-based approaches. The main challenge in this context is that real-time access to process variables related to product quality and quantity is severely limited. To date, comprehensive on- and offline monitoring platforms are used to generate process data sets that allow for the development of mechanistic and/or data-driven models for real-time prediction of these important quantities. The ultimate goal is to implement model-based feedback control loops that facilitate online control of product quality. In this contribution, we explore structured additive regression (STAR) models in combination with boosting as a variable selection tool for modeling the cell dry mass, product concentration, and optical density on the basis of online-available process variables and two-dimensional fluorescence spectroscopic data. STAR models are powerful extensions of linear models, allowing for the inclusion of smooth effects or interactions between predictors. Boosting constructs the final model in a stepwise manner and provides a variable importance measure via predictor selection frequencies. Our results show that the cell dry mass can be modeled with a relative error of about ±3%, the optical density with ±6%, the soluble protein with ±16%, and the insoluble product with an accuracy of ±12%. Biotechnol. Bioeng. 2017;114: 321-334. © 2016 Wiley Periodicals, Inc.
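    The boosting-with-variable-selection idea described above can be illustrated with componentwise L2 boosting on toy data: each iteration fits one least-squares base learner per predictor on the current residuals, keeps only the best, and the per-predictor selection counts serve as the importance measure. The data and coefficients below are invented, and the base learners are plain linear terms rather than the smooth STAR components:

```python
import random

random.seed(3)

# toy process data: the response depends on predictors 0 and 3 only
def make_row():
    x = [random.uniform(-1, 1) for _ in range(5)]
    y = 2.0 * x[0] - 1.5 * x[3] + random.gauss(0, 0.1)
    return x, y

rows = [make_row() for _ in range(200)]

def boost(rows, n_pred=5, steps=150, nu=0.1):
    # componentwise L2 boosting with shrinkage nu; counts = selection frequencies
    coefs = [0.0] * n_pred
    counts = [0] * n_pred
    resid = [y for _, y in rows]
    for _ in range(steps):
        best = None
        for j in range(n_pred):
            den = sum(x[j] ** 2 for x, _ in rows)
            b = sum(x[j] * r for (x, _), r in zip(rows, resid)) / den
            err = sum((r - b * x[j]) ** 2 for (x, _), r in zip(rows, resid))
            if best is None or err < best[0]:
                best = (err, j, b)
        _, j, b = best
        coefs[j] += nu * b          # update only the winning predictor
        counts[j] += 1
        resid = [r - nu * b * x[j] for (x, _), r in zip(rows, resid)]
    return coefs, counts

coefs, counts = boost(rows)
```

    With informative predictors, the selection counts concentrate on the variables that actually drive the response, which is exactly how boosting doubles as a variable-selection tool.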

  14. Addition of a 5/cm Spectral Resolution Band Model Option to LOWTRAN5.

    DTIC Science & Technology

    1980-10-01

    [Report documentation page garbled in extraction.] The recoverable fragments indicate the report (ARI-RR-232) covers: (1) the addition of a 5/cm spectral resolution band model option to LOWTRAN5; (2) the addition of temperature-dependent molecular absorption coefficients; and (3) the use of a multi-parameter Lorentz band model. Listed section headings include a comparison of LOWTRAN5 and LOWTRAN5(MOD), a comparison of LOWTRAN5 models to measurements, and modifications to LOWTRAN5.

  15. Model for Assembly Line Re-Balancing Considering Additional Capacity and Outsourcing to Face Demand Fluctuations

    NASA Astrophysics Data System (ADS)

    Samadhi, TMAA; Sumihartati, Atin

    2016-02-01

    The most critical stage in a garment industry is the sewing process, because it generally consists of a number of operations and a large number of sewing machines for each operation. Therefore, it requires a balancing method that can assign tasks to work stations with balanced workloads. Many studies on assembly line balancing assume a new assembly line, but in reality, due to demand fluctuations and increases, re-balancing is needed. To cope with these fluctuating demand changes, additional capacity can be provided by investing in spare sewing machines and by paying for sewing services through outsourcing. This study develops an assembly line balancing (ALB) model for an existing line to cope with fluctuating demand. Capacity redesign is decided if the fluctuating demand exceeds the available capacity, through a combination of investing in new machines and outsourcing, while minimizing the cost of future idle capacity. The objective of the model is to minimize the total cost of the assembly line, which consists of operating costs, machine costs, capacity-addition costs, losses due to idle capacity, and outsourcing costs. The model developed is an integer programming model. It is tested on a set of data covering one year of demand with an existing fleet of 41 sewing machines. The result shows that a maximum additional capacity of up to 76 machines is required when there is an increase of 60% over the average demand, with equal cost parameters.
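
The cost trade-off in the model (operating cost + machine investment + idle-capacity losses + outsourcing) can be illustrated with a brute-force search standing in for the paper's integer program; every cost figure below is hypothetical, not taken from the study:

```python
import math

# Hypothetical cost parameters; none are taken from the paper.
demand = 5200          # units demanded in the period
cap_per_machine = 100  # units one sewing machine produces per period
machines = 41          # existing machines
c_oper = 2.0           # in-house operating cost per unit
c_machine = 150.0      # cost of one additional machine per period
c_idle = 50.0          # loss per unit of idle capacity
c_out = 4.0            # cost per outsourced unit

def total_cost(extra, outsourced):
    """Operating + machine + idle-capacity + outsourcing cost,
    or infinity when demand cannot be met."""
    capacity = (machines + extra) * cap_per_machine
    in_house = demand - outsourced
    if in_house > capacity:
        return math.inf            # infeasible: unmet demand
    idle = capacity - in_house
    return (c_oper * in_house + c_machine * extra
            + c_idle * idle + c_out * outsourced)

# Brute force over machine additions and outsourcing (machine-sized lots)
best = min((total_cost(e, o), e, o)
           for e in range(40)
           for o in range(0, demand + 1, cap_per_machine))
cost, extra_machines, outsourced = best
```

With these numbers, adding 11 machines to cover the shortfall beats outsourcing, because the per-unit cost of an extra machine (150/100 = 1.5) is below the outsourcing premium (4.0 - 2.0 = 2.0) and leaves no idle capacity.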

  16. Patient-specific in vitro models for hemodynamic analysis of congenital heart disease - Additive manufacturing approach.

    PubMed

    Medero, Rafael; García-Rodríguez, Sylvana; François, Christopher J; Roldán-Alzate, Alejandro

    2017-03-21

    Non-invasive hemodynamic assessment of total cavopulmonary connection (TCPC) is challenging due to the complex anatomy. Additive manufacturing (AM) is a suitable alternative for creating patient-specific in vitro models for flow measurements using four-dimensional (4D) Flow MRI. These in vitro systems have the potential to serve as validation for computational fluid dynamics (CFD), simulating different physiological conditions. This study investigated three different AM technologies, stereolithography (SLA), selective laser sintering (SLS) and fused deposition modeling (FDM), to determine differences in hemodynamics when measuring flow using 4D Flow MRI. The models were created using patient-specific MRI data from an extracardiac TCPC. These models were connected to a perfusion pump circulating water at three different flow rates. Data were processed for visualization and quantification of velocity, flow distribution, vorticity and kinetic energy. These results were compared between each model. In addition, the flow distribution obtained in vitro was compared to in vivo. The results showed significant differences in the velocities measured at the outlets of the models that required internal support material during printing. Furthermore, an ultrasound flow sensor was used to validate flow measurements at the inlets and outlets of the in vitro models. These results were highly correlated to those measured with 4D Flow MRI. This study showed that commercially available AM technologies can be used to create patient-specific vascular models for in vitro hemodynamic studies at reasonable costs. However, technologies that do not require internal supports during manufacturing allow smoother internal surfaces, which makes them better suited for flow analyses.

  17. Dietary Information Improves Model Performance and Predictive Ability of a Noninvasive Type 2 Diabetes Risk Model

    PubMed Central

    Han, Tianshu; Tian, Shuang; Wang, Li; Liang, Xi; Cui, Hongli; Du, Shanshan; Na, Guanqiong; Na, Lixin; Sun, Changhao

    2016-01-01

    There is no diabetes risk model that includes dietary predictors in Asia. We sought to develop a diet-containing noninvasive diabetes risk model in Northern China and to evaluate whether dietary predictors can improve model performance and predictive ability. Cross-sectional data for 9,734 adults aged 20–74 years old were used as the derivation data, and results obtained for a cohort of 4,515 adults with 4.2 years of follow-up were used as the validation data. We used a logistic regression model to develop a diet-containing noninvasive risk model. Akaike’s information criterion (AIC), area under the curve (AUC), integrated discrimination improvement (IDI), net reclassification improvement (NRI) and calibration statistics were calculated to explicitly assess the effect of dietary predictors on a diabetes risk model. A diet-containing type 2 diabetes risk model was developed. Significant dietary predictors, including the consumption of staple foods, livestock, eggs, potato, dairy products, fresh fruit and vegetables, were included in the risk model. Dietary predictors improved the noninvasive diabetes risk model with a significant increase in the AUC (delta AUC = 0.03, P<0.001), an increase in relative IDI (24.6%, P-value for IDI <0.001), an increase in NRI (category-free NRI = 0.155, P<0.001), an increase in the sensitivity of the model of 7.3% and a decrease in AIC (delta AIC = 199.5). The results of the validation data were similar to the derivation data. The calibration of the diet-containing diabetes risk model was better than that of the risk model without dietary predictors in the validation data. Dietary information improves the model performance and predictive ability of a noninvasive type 2 diabetes risk model based on classic risk factors. Dietary information may be useful for developing a noninvasive diabetes risk model. PMID:27851788
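
The incremental-value statistics used here (delta AUC, category-free NRI) are straightforward to compute. The sketch below uses simulated data (not the study's data or model) to show how adding an informative predictor moves both:

```python
import numpy as np

def auc(y, score):
    """Rank-based AUC: probability a random case outranks a random control."""
    pos, neg = score[y == 1], score[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

def category_free_nri(y, p_old, p_new):
    """Continuous NRI: net upward movement of predicted risk for events
    plus net downward movement for non-events."""
    up = p_new > p_old
    ev, ne = y == 1, y == 0
    nri_events = up[ev].mean() - (~up)[ev].mean()
    nri_nonevents = (~up)[ne].mean() - up[ne].mean()
    return nri_events + nri_nonevents

rng = np.random.default_rng(2)
n = 2000
x1, x2 = rng.normal(size=n), rng.normal(size=n)   # x2 plays the "dietary" role
logit = -1.0 + 1.0 * x1 + 0.8 * x2
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)
p_old = 1 / (1 + np.exp(-(-1.0 + 1.0 * x1)))      # classic factors only
p_new = 1 / (1 + np.exp(-logit))                  # plus the new predictor
delta_auc = auc(y, p_new) - auc(y, p_old)
nri = category_free_nri(y, p_old, p_new)
```

Both `delta_auc` and `nri` come out positive, mirroring the direction of the improvements reported in the abstract.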

  18. Use of anatomical and kinetic models in the evaluation of human food additive safety.

    PubMed

    Roth, William L

    2005-09-22

    Toxicological testing in animals is relied upon as a surrogate for clinical testing of most food additives. Both animal and human clinical test results are generally available for direct additives when high levels of exposure are expected. Limited animal studies or in vitro test results may be the only sources of toxicological data available when low levels of exposure (microg/person/day) are expected and where no effects of the additive on the food itself are desired. Safety assessment of such materials for humans requires mathematical extrapolation from any effects observed in test animals to arrive at acceptable daily intakes (ADIs) for humans. Models of anatomy may be used to estimate tissue and organ weights where that information is missing and necessary for evaluation of a data set. The effect of growth on target tissue exposure during critical phases of organ development can be more accurately assessed when models of growth and known physiological changes are combined with pharmacokinetic results for test species. Kinetic models, when combined with limited chemical property, kinetic, and distribution data, can often be used to predict steady-state plasma and tissue levels of a test material over the range of doses employed in chronic studies to aid in interpretation of effects that are often nonlinear with respect to delivered dose. A better understanding of the reasons for nonlinearity of effects in animals improves our confidence in extrapolation to humans.
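
As a toy instance of the kinetic modeling described, a one-compartment model with first-order elimination gives the average steady-state plasma level under chronic daily intake; every parameter value below is hypothetical, chosen only to illustrate the arithmetic:

```python
import numpy as np

# Hypothetical one-compartment, first-order kinetics for a food additive
# ingested chronically; all parameter values are assumed for illustration.
dose = 50.0    # µg/day chronic intake
F = 0.6        # oral bioavailability (fraction absorbed)
CL = 4.0       # clearance, L/day
Vd = 10.0      # volume of distribution, L
k = CL / Vd    # first-order elimination rate constant, 1/day

css = F * dose / CL            # average steady-state concentration, µg/L
half_life = np.log(2) / k      # elimination half-life, days

# Fractional approach to steady state over a month of daily dosing;
# roughly five half-lives reach ~97% of the plateau.
t = np.arange(0.0, 30.0, 0.1)
approach = 1 - np.exp(-k * t)
```

Such steady-state predictions are what allow delivered dose, rather than administered dose, to be compared across species when setting an ADI.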

  19. Rain water transport and storage in a model sandy soil with hydrogel particle additives.

    PubMed

    Wei, Y; Durian, D J

    2014-10-01

    We study rain water infiltration and drainage in a dry model sandy soil with superabsorbent hydrogel particle additives by measuring the mass of retained water for non-ponding rainfall using a self-built 3D laboratory set-up. In the pure model sandy soil, the retained water curve measurements indicate that instead of a stable horizontal wetting front that grows downward uniformly, a narrow fingered flow forms under the top layer of water-saturated soil. This rain water channelization phenomenon not only further reduces the available rain water in the plant root zone, but also affects the efficiency of soil additives, such as superabsorbent hydrogel particles. Our studies show that the shape of the retained water curve for a soil packing with hydrogel particle additives strongly depends on the location and the concentration of the hydrogel particles in the model sandy soil. By carefully choosing the particle size and distribution methods, we may use the swollen hydrogel particles to modify the soil pore structure, to clog or extend the water channels in sandy soils, or to build water reservoirs in the plant root zone.

  20. Building a Better Model: A Comprehensive Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Apply Resources

    DTIC Science & Technology

    2012-10-01

    methods (CumulusV, Volpara), developed an automated area based 15. SUBJECT TERMS Breast cancer; risk model; mammography ; breast density 16...recommendations based on an individual’s risk beginning with personalized mammography screening decisions. This will be done by increasing the ability... mammography machine vendor. Once the model is complete, tested nationally, and proven accurate, it will be available for widespread use within five to six

  1. Can ligand addition to soil enhance Cd phytoextraction? A mechanistic model study.

    PubMed

    Lin, Zhongbing; Schneider, André; Nguyen, Christophe; Sterckeman, Thibault

    2014-11-01

    Phytoextraction is a potential method for cleaning Cd-polluted soils. Ligand addition to soil is expected to enhance Cd phytoextraction. However, experimental results show that this addition has contradictory effects on plant Cd uptake. A mechanistic model simulating the reaction kinetics (adsorption on solid phase, complexation in solution), transport (convection, diffusion) and root absorption (symplastic, apoplastic) of Cd and its complexes in soil was developed. This was used to calculate plant Cd uptake with and without ligand addition in a great number of combinations of soil, ligand and plant characteristics, varying the parameters within defined domains. Ligand addition generally strongly reduced hydrated Cd (Cd(2+)) concentration in soil solution through Cd complexation. Dissociation of Cd complex ([Formula: see text]) could not compensate for this reduction, which greatly lowered Cd(2+) symplastic uptake by roots. The apoplastic uptake of [Formula: see text] was not sufficient to compensate for the decrease in symplastic uptake. This explained why in the majority of the cases, ligand addition resulted in the reduction of the simulated Cd phytoextraction. A few results showed an enhanced phytoextraction in very particular conditions (strong plant transpiration with high apoplastic Cd uptake capacity), but this enhancement was very limited, making chelant-enhanced phytoextraction poorly efficient for Cd.
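
The central mechanism, complexation collapsing the free Cd(2+) pool, follows from a simple equilibrium calculation: with the ligand in large excess, the free-ion fraction is 1/(1 + K[L]). The constants below are assumed orders of magnitude for illustration, not values from the paper:

```python
# Hypothetical equilibrium Cd2+ + L <-> CdL with stability constant K;
# the constants are assumed orders of magnitude, not values from the paper.
K = 1e7        # L/mol, typical order for a strong chelant
cd_tot = 1e-7  # mol/L total dissolved Cd
l_tot = 1e-4   # mol/L added ligand, in large excess over Cd

# With ligand in large excess, free [L] ~ l_tot, so the fraction of
# dissolved Cd remaining as free Cd2+ is 1 / (1 + K [L]).
free_fraction = 1.0 / (1.0 + K * l_tot)
cd_free = cd_tot * free_fraction   # free Cd2+ available for symplastic uptake
```

Here the free Cd2+ pool drops by three orders of magnitude, which is why, unless apoplastic uptake of the complex compensates, ligand addition reduces rather than enhances uptake.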

  2. Default risk modeling beyond the first-passage approximation: extended Black-Cox model.

    PubMed

    Katz, Yuri A; Shokhirev, Nikolai V

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even if the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with the radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for the derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The closed formulas obtained fit the historical data on global corporate defaults well and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves the valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.
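
For context, the absorbing-boundary (classical Black-Cox first-passage) default probability that this model generalizes has a closed form. The sketch below implements that baseline, not the radiation-boundary extension of the paper; the parameter values are arbitrary:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def default_prob(b, mu, sigma, t):
    """First-passage probability that arithmetic Brownian motion with
    drift mu and volatility sigma falls a distance b > 0 below its
    starting point by time t (absorbing barrier, the classical case)."""
    st = sigma * math.sqrt(t)
    return (norm_cdf((-b - mu * t) / st)
            + math.exp(-2.0 * mu * b / sigma ** 2)
            * norm_cdf((-b + mu * t) / st))

# Cumulative default probability grows with horizon but vanishes as t -> 0,
# which is why the absorbing model yields near-zero short-maturity spreads.
p_1y = default_prob(1.0, 0.05, 0.3, 1.0)
p_5y = default_prob(1.0, 0.05, 0.3, 5.0)
```

The vanishing short-horizon probability of this baseline is exactly the shortcoming that the finite default rate at the boundary is introduced to repair.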

  3. Default risk modeling beyond the first-passage approximation: Extended Black-Cox model

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.; Shokhirev, Nikolai V.

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm’s ability to avoid default even if the company’s liabilities momentarily exceed its assets. Diffusion in a linear potential with the radiation boundary condition is used to mimic a company’s default process. The exact solution of the corresponding Fokker-Planck equation allows for the derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The closed formulas obtained fit the historical data on global corporate defaults well and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves the valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.

  4. Spectral prediction model for color prints on paper with fluorescent additives.

    PubMed

    Hersch, Roger David

    2008-12-20

    I propose a model for predicting the total reflectance of color halftones printed on paper incorporating fluorescent brighteners. The total reflectance is modeled as the additive superposition of the relative fluorescent emission and the pure reflectance of the color print. The fluorescent emission prediction model accounts for both the attenuation of light by the halftone within the excitation wavelength range and for the attenuation of the fluorescent emission by the same halftone within the emission wavelength range. The model's calibration relies on reflectance measurements of the optically brightened paper and of the solid colorant patches with two illuminants, one including and one excluding the UV components. The part of the model predicting the pure reflectance relies on an ink-spreading extended Clapper-Yule model. On uniformly distributed surface coverages of cyan, magenta, and yellow halftone patches, the proposed model predicts the relative fluorescent emission with a high accuracy (mean DeltaE(94)=0.42 under a D65 standard illuminant). For optically brightened paper exhibiting a moderate fluorescence, the total reflectance prediction improves the spectral reflectance prediction mainly for highlight color halftones, comprising a proportion of paper white above 12%. Applications include the creation of improved printer characterization tables for color management purposes and the prediction of color gamuts for new combinations of optically brightened papers and inks.

  5. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, A.; Edwards, T.C.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. ?? 2002 Elsevier Science B.V. All rights reserved.

  6. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  7. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  8. Application of the Beck model to stock markets: Value-at-Risk and portfolio risk assessment

    NASA Astrophysics Data System (ADS)

    Kozaki, M.; Sato, A.-H.

    2008-02-01

    We apply the Beck model, developed for turbulent systems that exhibit scaling properties, to stock markets. Our study reveals that the Beck model elucidates the properties of stock market returns and is applicable to practical use such as Value-at-Risk estimation and portfolio analysis. We perform empirical analysis with daily/intraday data of the S&P500 index return and find that the volatility fluctuation of real markets is consistent with the assumptions of the Beck model: the volatility fluctuates at a much larger time scale than the return itself, and the inverse of variance, or “inverse temperature”, β obeys a Γ-distribution. As predicted by the Beck model, the distribution of returns is well fitted by the q-Gaussian distribution of Tsallis statistics. The evaluation method for Value-at-Risk (VaR), one of the most significant indicators in risk management, is studied for the q-Gaussian distribution. Our proposed method enables VaR evaluation in consideration of tail risk, which is underestimated by the variance-covariance method. A framework for portfolio risk assessment under the existence of tail risk is considered. We propose a multi-asset model with a single volatility fluctuation shared by all assets, named the single-β model, and empirically examine the agreement between the model and an imaginary portfolio with Dow Jones indices. It turns out that the single-β model gives a good approximation to portfolios composed of assets with non-Gaussian and correlated returns.
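
The superstatistical picture (slowly fluctuating inverse variance β with a Γ-distribution, conditionally Gaussian returns, q-Gaussian marginal) and its effect on VaR can be simulated directly; the sketch below uses arbitrary parameters, not fitted S&P500 values:

```python
import numpy as np

# Superstatistics sketch: the inverse variance beta fluctuates slowly with
# a Gamma distribution; returns are conditionally Gaussian, so the marginal
# distribution is a q-Gaussian (Student-t). All parameters are arbitrary.
rng = np.random.default_rng(3)
n_periods, n_per_period = 250, 400          # volatility varies on the slow scale
beta = rng.gamma(shape=4.0, scale=0.25, size=n_periods)   # E[beta] = 1
returns = np.concatenate(
    [rng.normal(0.0, 1.0 / np.sqrt(b), size=n_per_period) for b in beta])

# Historical 99% VaR versus a Gaussian with the same overall variance:
var_99 = -np.quantile(returns, 0.01)
sigma = returns.std()
gauss_var_99 = 2.3263478740408408 * sigma   # 99% quantile of N(0, sigma^2)
# The Gaussian (variance-covariance) figure understates the tail risk.
```

The simulated heavy-tailed VaR exceeds the Gaussian figure, which is the tail-risk underestimation of the variance-covariance method that the abstract refers to.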

  9. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping resource re-allocation in healthcare linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources because aged people require more specialised medical care, due notably to cancer. We propose a method useful for monitoring changes of cancer incidence in space and time, taking into account two age categories according to the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, according to the methodology presented in Wood, Generalised additive models: an introduction with R. Chapman and Hall/CRC, 2006. Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase of the incidence of cancer over time and an obvious spatial pattern for people over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for the cancer care of older people in this sub-region.

  10. Quantum-chemical model evaluations of thermodynamics and kinetics of oxygen atom additions to narrow nanotubes.

    PubMed

    Slanina, Zdenĕk; Stobinski, Leszek; Tomasik, Piotr; Lin, Hong-Ming; Adamowicz, Ludwik

    2003-01-01

    This paper reports a computational study of oxygen additions to narrow nanotubes, a problem frequently studied with fullerenes. In fact, fullerene oxides were the first observed fullerene derivatives, and they have naturally attracted the attention of both experiment and theory. C60O had represented a long-standing case of experiment-theory disagreement, and there has been a similar problem with C60O2. The disagreement has been explained by kinetic rather than thermodynamic control. In this paper a similar computational approach is applied to narrow nanotubes. Recently, very narrow nanotubes have been observed with a diameter of 5 Å and even with a diameter of 4 Å. It has been supposed that the narrow nanotubes are closed by fragments of small fullerenes like C36 or C20. In this report we perform calculations for oxygen additions to such model nanotubes capped by fragments of D2d C36, D4d C32, and Ih C20 fullerenic cages (though the computational models have to be rather short). The three models have the following carbon contents: C84, C80, and C80. Both thermodynamic enthalpy changes and kinetic activation barriers for oxygen addition to six selected bonds are computed and analyzed. The lowest isomer (thermodynamically the most stable) is never of the 6/6 type, that is, the enthalpically favored structures are produced by oxygen additions to the nanotube tips. Interestingly enough, the lowest energy isomer has, for the D2d C36 and D4d C32 cases, the lowest kinetic activation barrier as well.

  11. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis.
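
One simple way to combine agency estimates with their track records, loosely in the spirit described (though not the paper's Bayesian machinery), is a precision-weighted Beta-binomial update in which each agency's estimate enters as a pseudo-sample whose size reflects its historical accuracy. All numbers below are hypothetical:

```python
# Hypothetical sketch: combine two rating agencies' default-probability
# estimates, weighted by historical accuracy, via a Beta-binomial update.
# This illustrates the flavor of the approach, not the paper's method.

def beta_pseudocounts(p, n_eff):
    """Encode an estimate p as Beta pseudo-counts; n_eff (the pseudo-sample
    size) is larger for an agency with a better track record."""
    return p * n_eff, (1.0 - p) * n_eff

a1, b1 = beta_pseudocounts(0.02, n_eff=400)   # accurate agency, PD 2%
a2, b2 = beta_pseudocounts(0.05, n_eff=100)   # less accurate agency, PD 5%

# Flat Beta(1, 1) prior, then update on observed data:
# say 3 defaults among 120 comparable firms.
alpha = 1.0 + a1 + a2 + 3
beta = 1.0 + b1 + b2 + 117
posterior_mean = alpha / (alpha + beta)       # blended default probability
```

The blended estimate lands between the two agency figures, pulled toward the more accurate agency and toward the observed default rate.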

  12. Testing departure from additivity in Tukey's model using shrinkage: application to a longitudinal setting.

    PubMed

    Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A; Park, Sung Kyun; Kardia, Sharon L R; Allison, Matthew A; Vokonas, Pantel S; Chen, Jinbo; Diez-Roux, Ana V

    2014-12-20

    While there has been extensive research developing gene-environment interaction (GEI) methods in case-control studies, little attention has been given to sparse and efficient modeling of GEI in longitudinal studies. In a two-way table for GEI with rows and columns as categorical variables, a conventional saturated interaction model involves estimation of a specific parameter for each cell, with constraints ensuring identifiability. The estimates are unbiased but are potentially inefficient because the number of parameters to be estimated can grow quickly with increasing categories of row/column factors. On the other hand, Tukey's one-degree-of-freedom model for non-additivity treats the interaction term as a scaled product of row and column main effects. Because of the parsimonious form of interaction, the interaction estimate leads to enhanced efficiency, and the corresponding test could lead to increased power. Unfortunately, Tukey's model gives biased estimates and low power if the model is misspecified. When screening multiple GEIs where each genetic and environmental marker may exhibit a distinct interaction pattern, a robust estimator for interaction is important for GEI detection. We propose a shrinkage estimator for interaction effects that combines estimates from both Tukey's and saturated interaction models and use the corresponding Wald test for testing interaction in a longitudinal setting. The proposed estimator is robust to misspecification of interaction structure. We illustrate the proposed methods using two longitudinal studies: the Normative Aging Study and the Multi-ethnic Study of Atherosclerosis.
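
Tukey's one-degree-of-freedom estimate and a shrinkage combination with the saturated (cell-residual) estimate can be sketched for a single two-way table; the fixed weight below is illustrative and the data are simulated, so this omits the paper's adaptive weighting and longitudinal machinery:

```python
import numpy as np

# One two-way table: Tukey's 1-df interaction versus the saturated
# (cell-residual) estimate, combined with an illustrative fixed weight.
rng = np.random.default_rng(4)
row = np.array([1.0, -0.5, 0.2, -0.7])   # row main effects (sum to 0)
col = np.array([0.8, -0.3, -0.5])        # column main effects (sum to 0)
mu, lam_true = 5.0, 0.6
Y = (mu + row[:, None] + col[None, :]
     + lam_true * row[:, None] * col[None, :]   # Tukey-type interaction
     + rng.normal(0.0, 0.05, size=(4, 3)))

# Additive fit from row/column means
g = Y.mean()
ri = Y.mean(axis=1) - g
cj = Y.mean(axis=0) - g
resid = Y - g - ri[:, None] - cj[None, :]     # saturated interaction estimate

# Tukey's lambda: regress the residuals on the product of main effects
prod = ri[:, None] * cj[None, :]
lam_hat = (prod * resid).sum() / (prod ** 2).sum()
tukey_int = lam_hat * prod                    # 1-df interaction estimate

w = 0.7                                       # illustrative fixed weight
shrunk = w * tukey_int + (1.0 - w) * resid    # shrinkage-combined estimate
```

When the true interaction really is a scaled product, the single parameter `lam_hat` captures it far more efficiently than the full grid of cell residuals; the shrinkage combination hedges against the case where it is not.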

  13. Testing Departure from Additivity in Tukey’s Model using Shrinkage: Application to a Longitudinal Setting

    PubMed Central

    Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A.; Park, Sung Kyun; Kardia, Sharon L.R.; Allison, Matthew A.; Vokonas, Pantel S.; Chen, Jinbo; Diez-Roux, Ana V.

    2014-01-01

    While there has been extensive research developing gene-environment interaction (GEI) methods in case-control studies, little attention has been given to sparse and efficient modeling of GEI in longitudinal studies. In a two-way table for GEI with rows and columns as categorical variables, a conventional saturated interaction model involves estimation of a specific parameter for each cell, with constraints ensuring identifiability. The estimates are unbiased but are potentially inefficient because the number of parameters to be estimated can grow quickly with increasing categories of row/column factors. On the other hand, Tukey’s one degree of freedom (df) model for non-additivity treats the interaction term as a scaled product of row and column main effects. Due to the parsimonious form of interaction, the interaction estimate leads to enhanced efficiency and the corresponding test could lead to increased power. Unfortunately, Tukey’s model gives biased estimates and low power if the model is misspecified. When screening multiple GEIs where each genetic and environmental marker may exhibit a distinct interaction pattern, a robust estimator for interaction is important for GEI detection. We propose a shrinkage estimator for interaction effects that combines estimates from both Tukey’s and saturated interaction models and use the corresponding Wald test for testing interaction in a longitudinal setting. The proposed estimator is robust to misspecification of interaction structure. We illustrate the proposed methods using two longitudinal studies — the Normative Aging Study and the Multi-Ethnic Study of Atherosclerosis. PMID:25112650

  14. Crisis and emergency risk communication as an integrative model.

    PubMed

    Reynolds, Barbara; W Seeger, Matthew

    2005-01-01

    This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.

  15. Empirical Analysis of Farm Credit Risk under the Structure Model

    ERIC Educational Resources Information Center

    Yan, Yan

    2009-01-01

    The study measures farm credit risk by using farm records collected by Farm Business Farm Management (FBFM) during the period 1995-2004. The study addresses the following questions: (1) whether a farm's financial position is fully described by the structure model, (2) what the determinants of farm capital structure are under the structure model, (3)…

  16. Dental caries: an updated medical model of risk assessment.

    PubMed

    Kutsch, V Kim

    2014-04-01

    Dental caries is a transmissible, complex biofilm disease that creates prolonged periods of low pH in the mouth, resulting in a net mineral loss from the teeth. Historically, the disease model for dental caries consisted of mutans streptococci and Lactobacillus species, and the dental profession focused on restoring the lesions/damage from the disease by using a surgical model. The current recommendation is to implement a risk-assessment-based medical model called CAMBRA (caries management by risk assessment) to diagnose and treat dental caries. Unfortunately, many of the suggestions of CAMBRA have been overly complicated and confusing for clinicians. The risk of caries, however, is usually related to just a few common factors, and these factors result in common patterns of disease. This article examines the biofilm model of dental caries, identifies the common disease patterns, and discusses their targeted therapeutic strategies to make CAMBRA more easily adaptable for the privately practicing professional.

  17. Modeling of Flood Risk for the Continental United States

    NASA Astrophysics Data System (ADS)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such, it depends on simulation techniques that integrate multiple disciplines, such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, and actuarial science, in virtually every field of technology. In this talk we explain the techniques and underlying assumptions behind building the RMS US flood risk model, paying particular attention to correlation (spatial and temporal), simulation, and uncertainty in each of the components of the development process. Recent extreme floods (e.g., the US Midwest flood of 2008 and the US Northeast flood of 2010) have heightened concern about flood risk, and consequently there is a growing need to assess it adequately. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte Carlo analogue technique, capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess antecedent conditions and determine the saturation area and runoff. The runoff is then routed downstream along the rivers by a routing model. Combined with the precipitation model, this allows us to correlate streamflow, and hence flooding, from different rivers, as well as low and high return periods across the continental US. (3) A flood inundation module. It transforms the discharge (the output of the flow routing) into water levels, which are further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data. …

  18. Model for Solar Proton Risk Assessment

    NASA Technical Reports Server (NTRS)

    Xapsos, M. A.; Stauffer, C.; Gee, G. B.; Barth, J. L.; Stassinopoulos, E. G.; McGuire, R. E.

    2004-01-01

    A statistical model for cumulative solar proton event fluences during space missions is presented that covers both the solar minimum and solar maximum phases of the solar cycle. It is based on data from the IMP and GOES series of satellites, integrated so that the best features of each data set can be exploited. This allows fluence-energy spectra to be extended out to energies of 327 MeV.
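
    A cumulative-fluence model of this kind can be sketched as a Monte Carlo over a mission window, assuming Poisson event counts and lognormally distributed per-event fluences. The rate and lognormal parameters below are invented placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters (not the paper's fitted values):
events_per_year = 6.0        # mean solar proton event rate near solar maximum
mu, sigma = 18.0, 1.6        # lognormal parameters for per-event fluence

def mission_fluence(years, n_sim=20_000):
    """Monte Carlo cumulative fluence: Poisson event count, lognormal sizes."""
    n_events = rng.poisson(events_per_year * years, size=n_sim)
    return np.array([rng.lognormal(mu, sigma, k).sum() for k in n_events])

flu = mission_fluence(2.0)
p95 = np.quantile(flu, 0.95)   # design-level fluence at 95% confidence
```

    A designer would read off an upper quantile (here the 95th percentile) rather than the mean, since the heavy-tailed event-size distribution makes the mean a poor design criterion.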

  19. Reduction of carcinogenic 4(5)-methylimidazole in a caramel model system: influence of food additives.

    PubMed

    Seo, Seulgi; Ka, Mi-Hyun; Lee, Kwang-Geun

    2014-07-09

    The effect of various food additives on the formation of carcinogenic 4(5)-methylimidazole (4-MI) in a caramel model system was investigated. The relationship between the levels of 4-MI and various pyrazines was studied. When glucose and ammonium hydroxide were heated, the amount of 4-MI was 556 ± 1.3 μg/mL, which increased to 583 ± 2.6 μg/mL by the addition of 0.1 M of sodium sulfite. When various food additives, such as 0.1 M of iron sulfate, magnesium sulfate, zinc sulfate, tryptophan, and cysteine were added, the amount of 4-MI was reduced to 110 ± 0.7, 483 ± 2.0, 460 ± 2.0, 409 ± 4.4, and 397 ± 1.7 μg/mL, respectively. The greatest reduction, 80%, occurred with the addition of iron sulfate. Among the 12 pyrazines, 2-ethyl-6-methylpyrazine with 4-MI showed the highest correlation (r = -0.8239).
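
    The reported correlation can be reproduced in form with a plain Pearson computation; the paired values below are hypothetical stand-ins, not the study's measurements.

```python
import numpy as np

# Hypothetical paired measurements across additive treatments (invented data):
# 4-MI concentration (ug/mL) and a pyrazine peak level in the same samples
four_mi = np.array([556.0, 483.0, 460.0, 409.0, 397.0, 110.0])
pyrazine = np.array([0.9, 1.4, 1.5, 1.9, 2.0, 3.1])

# Pearson correlation coefficient between 4-MI and the pyrazine
r = np.corrcoef(four_mi, pyrazine)[0, 1]
```

    A strongly negative `r`, as reported for 2-ethyl-6-methylpyrazine (r = -0.8239), suggests the pyrazine could serve as an inverse marker of 4-MI formation.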

  20. How pharmacokinetic modeling could improve a risk assessment for manganese

    EPA Science Inventory

    The neurotoxicity of manganese (Mn) is well established, yet the risk assessment of Mn is made complex by certain enigmas. These include apparently greater toxicity via inhalation compared to oral exposure and greater toxicity in humans compared to rats. In addition, until recentl...

  1. Parametric Estimation in a Recurrent Competing Risks Model.

    PubMed

    Taylor, Laura L; Peña, Edsel A

    2013-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators.
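
    Under a perfect-repair strategy with exponential competing risks, the maximum likelihood estimator of each risk's rate reduces to its failure count divided by total observation time. A minimal simulation sketch (rates and monitoring period invented, and far simpler than the RCRM's general marginal-distribution setting):

```python
import numpy as np

rng = np.random.default_rng(2)

# One system observed over [0, tau] with two competing exponential risks;
# perfect repair restores the system to "as new" after every failure
lam = np.array([0.5, 1.0])   # illustrative true rates for risks 1 and 2
tau = 200.0                  # random monitoring period, fixed here for clarity

t, counts = 0.0, np.zeros(2, dtype=int)
while True:
    gaps = rng.exponential(1.0 / lam)   # latent times to each risk
    t += gaps.min()
    if t > tau:
        break
    counts[gaps.argmin()] += 1          # observed cause = smallest latent time

lam_hat = counts / tau                  # MLE of each rate under perfect repair
```

    Observing recurrences is what buys the efficiency: a single-failure design would yield one data point per system, whereas here each repair cycle contributes information about both risks.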

  2. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. Prior-art transport codes calculate the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as bystander signals; these are ignored by, or impossible in, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of beam lines, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities include linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of traversals for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic…
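
    One of the biophysical quantities mentioned, the Poisson distribution of particle traversals for a specified cellular area, follows directly from fluence times area. A small sketch (the fluence and nuclear area values are illustrative):

```python
import math

# Poisson traversal model: at particle fluence F (particles/um^2), the number
# of hits to a cell nucleus of area A (um^2) is Poisson with mean F * A
def hit_probabilities(fluence, area, k_max=5):
    mean = fluence * area
    return [math.exp(-mean) * mean ** k / math.factorial(k)
            for k in range(k_max + 1)]

# Example: fluence 0.01 /um^2 and a 100 um^2 nucleus give one traversal on
# average, yet ~37% of cells receive no hit at all
probs = hit_probabilities(fluence=0.01, area=100.0)
```

    This is why average dose alone misleads at low fluence: a sizable fraction of cells is entirely unhit while others absorb multiple traversals, which is exactly the event-by-event structure a stochastic code tracks.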

  3. Model based climate information on drought risk in Africa

    NASA Astrophysics Data System (ADS)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa RiskView, which aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI) as the fundamental indicator of the performance of agriculture, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa RiskView it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step toward this challenging task is exploring the sources of uncertainty affecting assessments based on modeled climate change scenarios. For this purpose, a limited set of climate models was selected to verify the relevance of using climate model output data with Africa RiskView and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and relied on the model output available at the time; in particular, only one regional downscaling was available for the entire African continent, from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa RiskView risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed using the first data available under the CORDEX framework; in particular, we consider a set of simulations driven with boundary conditions from the ERA-Interim reanalysis to evaluate the skill of drought…

  4. Marginal regression approach for additive hazards models with clustered current status data.

    PubMed

    Su, Pei-Fang; Chi, Yunchan

    2014-01-15

    Current status data arise naturally in tumorigenicity experiments, epidemiology studies, biomedicine, econometrics, and demographic and sociological studies. Moreover, clustered current status data may occur with animals from the same litter in tumorigenicity experiments or with subjects from the same family in epidemiology studies. Because the only information extracted from current status data is whether the survival times are before or after the monitoring or censoring times, the nonparametric maximum likelihood estimator of the survival function converges at a rate of n^(1/3) to a complicated limiting distribution. Hence, semiparametric regression models such as the additive hazards model have been extended for independent current status data to derive test statistics, whose distributions converge at a rate of n^(1/2), for testing the regression parameters. However, a straightforward application of these statistical methods to clustered current status data is not appropriate because intracluster correlation needs to be taken into account. Therefore, this paper proposes two estimating functions for estimating the parameters in the additive hazards model for clustered current status data. Comparative results from simulation studies are presented, and the application of the proposed estimating functions to a real data set is illustrated.
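
    The structure of current status data can be sketched by simulation: with a constant additive hazard λ(t|Z) = λ0 + βZ, the only observable is whether failure has occurred by the monitoring time, and (λ0, β) can be recovered by maximising the induced binomial likelihood. This is a simplified sketch with invented parameters, not the paper's estimating functions, and it ignores clustering entirely.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Simulate current status data under a constant additive hazard:
# lambda(t | Z) = lam0 + beta * Z   (parameters illustrative)
n, lam0, beta = 2000, 0.5, 0.8
Z = rng.binomial(1, 0.5, n)                 # e.g., dose group indicator
T = rng.exponential(1.0 / (lam0 + beta * Z))
C = rng.uniform(0.1, 3.0, n)                # monitoring times
delta = (T <= C).astype(float)              # all we observe: status at C

def negloglik(par):
    """Binomial likelihood induced by P(event by C | Z) = 1 - exp(-rate*C)."""
    l0, b = par
    rate = l0 + b * Z
    if np.any(rate <= 0):
        return np.inf
    p = 1.0 - np.exp(-rate * C)
    return -(delta * np.log(p) + (1 - delta) * np.log(1 - p)).sum()

fit = minimize(negloglik, x0=[1.0, 0.0], method="Nelder-Mead")
lam0_hat, beta_hat = fit.x
```

    With litter-clustered data the working independence likelihood above still gives consistent point estimates, but its naive standard errors are wrong, which is the gap the paper's estimating functions address.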

  5. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from the increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempts of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.
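
    The way added vehicle traffic compounds station risk can be sketched with independent per-docking collision probabilities: the chance of at least one collision is one minus the product of the per-flight survival probabilities. All figures below are hypothetical illustrations, not ISS Program estimates.

```python
# Illustrative per-docking collision probabilities and annual flight counts
# (all numbers invented for the sketch)
vehicles = {
    "Soyuz":    (1e-4, 4),
    "Progress": (1e-4, 5),
    "ATV":      (2e-4, 1),
    "HTV":      (2e-4, 1),
}

# Assuming independent dockings, aggregate across the year's traffic
p_none = 1.0
for p_collision, n_flights in vehicles.values():
    p_none *= (1.0 - p_collision) ** n_flights

p_any = 1.0 - p_none   # probability of at least one collision this year
```

    For small probabilities `p_any` is close to the simple sum of per-flight risks, which is why every added vehicle type raises the station's aggregate exposure roughly linearly with its traffic.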

  6. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for the credible risk classes are interpreted.
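
    The limited fluctuation approach can be sketched directly: a full-credibility standard sets the expected claim count needed for full weight, and the class estimate is blended with the portfolio rate accordingly. The numbers below are illustrative, not the insurer's data.

```python
import math

# Limited fluctuation (classical) credibility for claim frequency
z = 1.96          # normal quantile for a 95% probability level
k = 0.05          # tolerated relative error around the true frequency
n_full = (z / k) ** 2            # expected claims needed for full credibility

n_observed = 600                 # claims observed in one risk class (invented)
Z = min(1.0, math.sqrt(n_observed / n_full))   # partial credibility factor

class_rate = 0.12                # observed class claim frequency (invented)
portfolio_rate = 0.10            # overall portfolio frequency (invented)
credibility_rate = Z * class_rate + (1 - Z) * portfolio_rate
```

    A class with fewer claims than `n_full` is pulled toward the portfolio rate, which stabilises pricing for thin risk classes at the cost of some responsiveness.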

  7. Model-based benefit-risk assessment: can Archimedes help?

    PubMed

    Krishna, R

    2009-03-01

    In December 2008, the US Food and Drug Administration issued a new draft Guidance for Industry on Diabetes Mellitus--evaluating cardiovascular risk in new antidiabetic therapies to treat Type 2 diabetes. This guidance comes at a time when recent discussions have focused on delineation of cardiovascular risk reduction for new antidiabetic drugs. Computational tools that can enable early prediction of cardiovascular risk are reviewed with specific reference to Archimedes (Kaiser Permanente), with an aim of proposing a model-based solution and enabling decisions to be made as early as possible in the drug development value chain.

  8. Risk Prediction Models for Lung Cancer: A Systematic Review.

    PubMed

    Gray, Eoin P; Teare, M Dawn; Stevens, John; Archer, Rachel

    2016-03-01

    Many lung cancer risk prediction models have been published, but there has been no systematic review or comprehensive assessment of these models to assess how they could be used in screening. We performed a systematic review of lung cancer prediction models and identified 31 articles relating to 25 distinct models, of which 11 considered epidemiological factors only and did not require a clinical input. Another 11 articles focused on models that required a clinical assessment such as a blood test or scan, and 8 articles considered the two-stage clonal expansion model. More of the epidemiological models had been externally validated than the more recent clinical assessment models. Discrimination, the ability of a model to distinguish between cases and controls, varied, with areas under the curve between 0.57 and 0.879, as did calibration, the model's ability to assign an accurate probability to an individual. In our review we found that further validation studies need to be considered, especially for the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial 2012 Model Version (PLCOM2012) and Hoggart models, which recorded the best overall performance. Future studies will need to focus on prediction rules, such as optimal risk thresholds, for models used in selective screening trials. Only 3 validation studies considered prediction rules when validating the models, and overall the models were validated using varied tests in distinct populations, which made direct comparisons difficult. To improve this, multiple models need to be tested on the same data set, with consideration of sensitivity, specificity, model accuracy, and positive predictive values at the optimal risk thresholds.
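
    Discrimination as reported here is the area under the ROC curve, which equals the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal pairwise computation (the scores are invented):

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """AUC = P(score of a random case > score of a random control),
    computed over all case-control pairs, counting ties as half."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

# Example: 11 of the 12 case-control pairs are correctly ordered -> 11/12
a = auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2, 0.1])
```

    AUC of 0.5 is coin-flipping and 1.0 is perfect separation, which puts the review's range of 0.57 to 0.879 in context: the weakest models barely discriminate at all.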

  9. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
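
    A response surface model of the kind described can be sketched by fitting a quadratic surrogate to point estimates computed on a grid of inputs (abort time, delay time). The underlying strike-probability function below is an invented placeholder, not the paper's debris model.

```python
import numpy as np

# Grid of inputs: abort time after liftoff and abort-to-destruction delay
t_abort = np.linspace(0, 120, 25)     # s (illustrative range)
t_delay = np.linspace(0, 5, 25)       # s (illustrative range)
T, D = np.meshgrid(t_abort, t_delay)

# Placeholder "point estimates" of strike probability on the grid;
# a real study would obtain these from the debris/trajectory model
p_strike = 0.02 * np.exp(-0.5 * D) * (1 + 0.01 * T)

# Fit a quadratic response surface in (T, D) by least squares
A = np.column_stack([np.ones(T.size), T.ravel(), D.ravel(),
                     T.ravel() ** 2, D.ravel() ** 2, (T * D).ravel()])
coef, *_ = np.linalg.lstsq(A, p_strike.ravel(), rcond=None)
p_fit = (A @ coef).reshape(T.shape)
rms_err = np.sqrt(np.mean((p_fit - p_strike) ** 2))
```

    Once fitted, the cheap polynomial surrogate can be evaluated thousands of times inside the overall ascent abort risk model in place of the expensive debris simulation.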

  10. Topsoil organic carbon content of Europe, a new map based on a generalised additive model

    NASA Astrophysics Data System (ADS)

    de Brogniez, Delphine; Ballabio, Cristiano; Stevens, Antoine; Jones, Robert J. A.; Montanarella, Luca; van Wesemael, Bas

    2014-05-01

    There is an increasing demand for up-to-date, spatially continuous organic carbon (OC) data for global environmental and climatic modeling. Whilst the current map of topsoil organic carbon content for Europe (Jones et al., 2005) was produced by applying expert-knowledge-based pedo-transfer rules to large soil mapping units, the aim of this study was to replace it by applying digital soil mapping techniques to the first European harmonised geo-referenced topsoil (0-20 cm) database, which arises from the LUCAS (land use/cover area frame statistical survey) survey. A generalized additive model (GAM) was calibrated on 85% of the dataset (ca. 17 000 soil samples), and a backward stepwise approach selected slope, land cover, temperature, net primary productivity, latitude and longitude as environmental covariates (500 m resolution). Validation of the model (applied to 15% of the dataset) gave an R2 of 0.27. We observed that most organic soils were under-predicted by the model and that soils of Scandinavia were also poorly predicted. The model showed an RMSE of 42 g kg-1 for mineral soils and of 287 g kg-1 for organic soils. The map of predicted OC content showed the lowest values in Mediterranean countries and in croplands across Europe, whereas the highest OC contents were predicted in wetlands, woodlands and mountainous areas. The map of the standard error of the OC model predictions showed high values in northern latitudes, wetlands, moors and heathlands, whereas low uncertainty was mostly found in croplands. A comparison of our results with the map of Jones et al. (2005) showed general agreement on the predicted OC content of mineral soils, most probably because the models use some common covariates, namely land cover and temperature. Our model however failed to predict OC contents greater than 200 g kg-1, which we explain by the imposed unimodal distribution of our model, whose mean is tilted towards the majority of soils, which are mineral. Finally, average…
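
    A GAM of this kind is typically fitted with penalised splines; the backfitting sketch below conveys the additive-model idea with a crude running-mean smoother on simulated covariates. Everything here is illustrative: these are not the LUCAS data, and the two covariates are invented stand-ins for predictors like slope or temperature.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy additive model: y = f1(x1) + f2(x2) + noise
n = 500
x1 = rng.uniform(-2, 2, n)
x2 = rng.uniform(-2, 2, n)
y = np.sin(x1) + 0.5 * x2 ** 2 + rng.normal(scale=0.1, size=n)

def smooth(x, r, window=51):
    """Running-mean smoother of residuals r against x (a crude stand-in
    for the penalised splines a real GAM would use)."""
    order = np.argsort(x)
    kernel = np.ones(window) / window
    sm = np.convolve(r[order], kernel, mode="same")
    out = np.empty_like(r)
    out[order] = sm
    return out

# Backfitting: cycle through components, each fitted to partial residuals
alpha = y.mean()
f1 = np.zeros(n)
f2 = np.zeros(n)
for _ in range(20):
    f1 = smooth(x1, y - alpha - f2)
    f1 -= f1.mean()                  # centre components for identifiability
    f2 = smooth(x2, y - alpha - f1)
    f2 -= f2.mean()

r2 = 1 - np.var(y - alpha - f1 - f2) / np.var(y)
```

    Because each component is a one-dimensional smooth, the fitted curves can be plotted and inspected covariate by covariate, which is the interpretability that motivates GAMs for soil mapping.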

  11. Risk Management Model in Surface Exploitation of Mineral Deposits

    NASA Astrophysics Data System (ADS)

    Stojanović, Cvjetko

    2016-06-01

    Risk management is an integral part of all types of project management. One of the main tasks of pre-investment studies and other project documentation is to protect investment projects as much as possible against investment risks. The provision and regulation of risk information therefore ensure the identification of the probability of the emergence of adverse events, their forms, causes and consequences, and provide timely measures of protection against risks. This means that risk management involves a set of management methods and techniques used to reduce the possibility of adverse events and their consequences, and thus to increase the possibility of achieving the planned results with minimal losses. Investments in mining projects are of capital importance because they are very complex projects and therefore very risky, owing to the influence of internal and external factors and limitations arising from the socio-economic environment. Due to the lack of a risk management system, numerous organizations worldwide have suffered significant financial losses. It is therefore necessary for any organization to establish a risk management system as a structural element of its management system as a whole. This paper presents an approach to a risk management model for the project of opening a surface coal mine, developed on the basis of an extensive study of the scientific literature and the personal experience of the author, which, with certain modifications, may find use in any investment project, both in the mining industry and in other areas.

  12. A first screening and risk assessment of pharmaceuticals and additives in personal care products in waste water, sludge, recipient water and sediment from Faroe Islands, Iceland and Greenland.

    PubMed

    Huber, Sandra; Remberger, Mikael; Kaj, Lennart; Schlabach, Martin; Jörundsdóttir, Hrönn Ó; Vester, Jette; Arnórsson, Mímir; Mortensen, Inge; Schwartson, Richard; Dam, Maria

    2016-08-15

    A screening of a broad range of pharmaceuticals and additives in personal care products (PPCPs) in sub-arctic locations of the Faroe Islands (FO), Iceland (IS) and Greenland (GL) was conducted. In total, 36 pharmaceuticals including some metabolites, and seven additives in personal care products, were investigated in influent and effluent waters as well as sludge of waste water treatment plants (WWTPs) and in water and sediment of recipients. Concentrations and distribution patterns for PPCPs discharged via sewage lines (SLs) to the marine environment were assessed. Of the 36 pharmaceuticals or metabolites analysed, 33 were found close to or above the limit of detection (LOD) in all or part of the samples. All seven investigated additives in personal care products were detected above the LOD. Some of the analysed PPCPs occurred in every or almost every sample, among them diclofenac, ibuprofen, lidocaine, naproxen, metformin, citalopram, venlafaxine, amiloride, furosemide, metoprolol, sodium dodecyl sulphate (SDS) and cetrimonium salt (ATAC-C16). Additionally, the study encompasses an ecotoxicological risk assessment of two-thirds of the analysed PPCPs in recipient and diluted effluent waters. For candesartan, only a small margin to levels with unacceptable risks was observed in diluted effluent waters at two locations (FO). Chronic risks for aquatic organisms staying and/or living around WWTP effluent pipe-outlets were indicated for 17β-estradiol and estriol in the three countries. Additives in PCPs were found to pose the largest risk to the aquatic environment. The surfactants CAPB and ATAC-C16 were found in concentrations resulting in risk factors of up to 375 for CAPB and 165 for ATAC-C16 in recipients of diluted effluents from Iggia, Nuuk (GL) and Torshavn (FO), respectively. These results demonstrate a potentially high ecological risk stemming from the discharge of surfactants used in household and industrial detergents as well as additives in personal care…

  13. Comparison of prosthetic models produced by traditional and additive manufacturing methods

    PubMed Central

    Park, Jin-Young; Kim, Hae-Young; Kim, Ji-Hwan; Kim, Jae-Hong

    2015-01-01

    PURPOSE The purpose of this study was to verify the clinical feasibility of additive manufacturing by comparing the accuracy of four different manufacturing methods for metal copings: the conventional lost wax technique (CLWT); a subtractive method, wax blank milling (WBM); and two additive methods, multi jet modeling (MJM) and micro-stereolithography (Micro-SLA). MATERIALS AND METHODS Thirty study models were created using an acrylic model with the maxillary upper right canine, first premolar, and first molar teeth. Based on the scan files from a non-contact blue light scanner (Identica; Medit Co. Ltd., Seoul, Korea), thirty copings were produced using the WBM, MJM, and Micro-SLA methods, respectively, and another thirty frameworks were produced using the CLWT method. To measure the marginal and internal gap, the silicone replica method was adopted, and the silicone images obtained were evaluated using a digital microscope (KH-7700; Hirox, Tokyo, Japan) at 140X magnification. Analyses were performed using two-way analysis of variance (ANOVA) and the Tukey post hoc test (α=.05). RESULTS The mean marginal and internal gaps showed significant differences according to tooth type (P<.001 and P<.001, respectively) and manufacturing method (P=.037 and P<.001, respectively). Micro-SLA did not show any significant difference from CLWT in mean marginal gap, in contrast to the WBM and MJM methods. CONCLUSION The mean gaps resulting from the four manufacturing methods were within a clinically allowable range, and thus the clinical use of additive manufacturing methods is acceptable as an alternative to the traditional lost wax technique and subtractive manufacturing. PMID:26330976

  14. Thermodynamic network model for predicting effects of substrate addition and other perturbations on subsurface microbial communities

    SciTech Connect

    Jack Istok; Melora Park; James McKinley; Chongxuan Liu; Lee Krumholz; Anne Spain; Aaron Peacock; Brett Baldwin

    2007-04-19

    The overall goal of this project is to develop and test a thermodynamic network model for predicting the effects of substrate additions and environmental perturbations on microbial growth, community composition and system geochemistry. The hypothesis is that a thermodynamic analysis of the energy-yielding growth reactions performed by defined groups of microorganisms can be used to make quantitative and testable predictions of the change in microbial community composition that will occur when a substrate is added to the subsurface or when environmental conditions change.

  15. Assessing risk factors for dental caries: a statistical modeling approach.

    PubMed

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and has received considerable attention in the scientific literature. On the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only a few candidate models are considered, and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables from the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which require inferences to be adjusted for 'optimism'.
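
    Exploring a model space with a standard selection criterion can be sketched as an exhaustive AIC search over predictor subsets. The design below is simulated with 8 hypothetical candidate predictors (2^8 = 256 models, tiny next to the paper's 2.6 million), of which only the first two truly matter.

```python
import itertools
import numpy as np

rng = np.random.default_rng(5)

# Simulated design: 8 candidate risk factors, 2 with real effects
n, p = 200, 8
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)

def aic(subset):
    """Gaussian AIC of an OLS fit using the given predictor subset."""
    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = ((y - Xs @ beta) ** 2).sum()
    k = Xs.shape[1] + 1                 # coefficients + error variance
    return n * np.log(rss / n) + 2 * k

# Exhaustive search over all 2^p candidate models
best_aic, best_subset = min(
    (aic(s), s)
    for r in range(p + 1)
    for s in itertools.combinations(range(p), r)
)
```

    For spaces as large as the paper's, the same idea is applied with branch-and-bound or stepwise shortcuts rather than full enumeration, and the winning model's apparent fit is then corrected for the optimism induced by the search itself.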

  16. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with opportunities for hands-on demonstrations. Brief descriptions of the tools follow: ARRBOD, for organ dose projection and acute radiation risk calculation from exposure to solar particle events; NSCR, for projection of cancer risk from exposure to space radiation; HemoDose, for retrospective dose estimation using multi-type blood cell counts; GERMcode, for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS, for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI, for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counts; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  17. Generalized additive models and Lucilia sericata growth: assessing confidence intervals and error rates in forensic entomology.

    PubMed

    Tarone, Aaron M; Foran, David R

    2008-07-01

    Forensic entomologists use blow fly development to estimate a postmortem interval. Although accurate, fly age estimates can be imprecise for older developmental stages and no standard means of assigning confidence intervals exists. Presented here is a method for modeling growth of the forensically important blow fly Lucilia sericata, using generalized additive models (GAMs). Eighteen GAMs were created to predict the extent of juvenile fly development, encompassing developmental stage, length, weight, strain, and temperature data, collected from 2559 individuals. All measures were informative, explaining up to 92.6% of the deviance in the data, though strain and temperature exerted negligible influences. Predictions made with an independent data set allowed for a subsequent examination of error. Estimates using length and developmental stage were within 5% of true development percent during the feeding portion of the larval life cycle, while predictions for postfeeding third instars were less precise, but within expected error.
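An additive model of the kind used here can be sketched with the classic backfitting algorithm. The minimal example below fits smooth functions of two synthetic stand-in predictors (not the paper's L. sericata length and temperature data) and reports deviance explained, the fit measure quoted in the abstract.

```python
# Minimal GAM-style additive model fit by backfitting with a kernel smoother.
# The two predictors and the response are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
n = 300
x1 = rng.uniform(0, 1, n)                 # stand-in for larval length
x2 = rng.uniform(0, 1, n)                 # stand-in for rearing temperature
y = np.sin(2 * np.pi * x1) + 0.5 * x2 + rng.normal(0, 0.1, n)

def smooth(x, r, bandwidth=0.1):
    """Nadaraya-Watson smoother of residuals r against x."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ r) / w.sum(axis=1)

alpha = y.mean()
f1 = np.zeros(n)
f2 = np.zeros(n)
for _ in range(20):                       # backfitting iterations
    f1 = smooth(x1, y - alpha - f2)
    f1 -= f1.mean()                       # identifiability constraint
    f2 = smooth(x2, y - alpha - f1)
    f2 -= f2.mean()

fitted = alpha + f1 + f2
deviance_explained = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"deviance explained: {deviance_explained:.3f}")
```

Production GAM software (e.g. mgcv in R, which is common in this literature) uses penalized splines rather than a fixed-bandwidth kernel, but the additive structure is the same.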

  18. Phase-Field Modeling of Microstructure Evolution in Electron Beam Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Gong, Xibing; Chou, Kevin

    2015-05-01

    In this study, the microstructure evolution in the powder-bed electron beam additive manufacturing (EBAM) process is studied using phase-field modeling. In essence, EBAM involves a rapid solidification process, and the properties of a build depend in part on the solidification behavior as well as the microstructure of the build material. Thus, prediction of microstructure evolution in EBAM is important for process optimization. Phase-field modeling was applied to study the microstructure evolution and solute concentration of the Ti-6Al-4V alloy in the EBAM process. The effect of undercooling was investigated through the simulations: the greater the undercooling, the faster the dendrite grows. The microstructure simulations show growth of multiple columnar grains, comparable with experimental results over the tested range.

  19. Robust estimation of mean and dispersion functions in extended generalized additive models.

    PubMed

    Croux, Christophe; Gijbels, Irène; Prosdocimi, Ilaria

    2012-03-01

    Generalized linear models are a widely used method to obtain parametric estimates for the mean function. They have been further extended to allow the relationship between the mean function and the covariates to be more flexible via generalized additive models. However, the fixed variance structure can in many cases be too restrictive. The extended quasilikelihood (EQL) framework allows for estimation of both the mean and the dispersion/variance as functions of covariates. As for other maximum likelihood methods though, EQL estimates are not resistant to outliers: we need methods to obtain robust estimates for both the mean and the dispersion function. In this article, we obtain functional estimates for the mean and the dispersion that are both robust and smooth. The performance of the proposed method is illustrated via a simulation study and some real data examples.

  20. Observations and model calculations of an additional layer in the topside ionosphere above Fortaleza, Brazil

    NASA Astrophysics Data System (ADS)

    Jenkins, B.; Bailey, G. J.; Abdu, M. A.; Batista, I. S.; Balan, N.

    1997-06-01

    Calculations using the Sheffield University plasmasphere ionosphere model have shown that under certain conditions an additional layer can form in the low latitude topside ionosphere. This layer (the F3 layer) has subsequently been observed in ionograms recorded at Fortaleza in Brazil. It has not been observed in ionograms recorded at the neighbouring station São Luis. Model calculations have shown that the F3 layer is most likely to form in summer at Fortaleza due to a combination of the neutral wind and the E×B drift acting to raise the plasma. At the location of São Luis, almost on the geomagnetic equator, the neutral wind has a smaller vertical component so the F3 layer does not form.

  1. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.

  2. The use of ecosystem models in risk assessment

    SciTech Connect

    Starodub, M.E.; Miller, P.A.; Willes, R.F.

    1994-12-31

    Ecosystem models, when used in conjunction with available environmental effects monitoring data, enable informed decisions regarding actions that should be taken to manage ecological risks from areas of localized chemical loadings and accumulation. These models provide quantitative estimates of chemical concentrations in various environmental media. The reliable application of these models as predictive tools for environmental assessment requires a thorough understanding of the theory and mathematical relationships described by the models and demands rigorous validation of input data and model results with field and laboratory data. Food chain model selection should be based on the ability to best simulate the interactions of the food web and the processes governing the transfer of chemicals from the dissolved and particulate phases to various trophic levels for the site in question. This requires that the user be familiar with the theories on which these models are based, and be aware of the merits and shortcomings of each prior to attempting to model food chain accumulation. Questions to be asked include: Are all potential exposure pathways addressed? Are omitted pathways critical to the risk assessment process? Is the model flexible? To answer these questions one must consider the chemical(s) of concern, site-specific ecosystem characteristics, risk assessment receptor (aquatic, wildlife, human) dietary habits, and the influence of effluent characteristics on food chain dynamics.

  3. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approaches was undertaken.
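The direct-computation approach described above can be sketched as a Monte Carlo estimate of the probability that a bondline temperature exceeds a limit. The surrogate "thermal response" below and all distributions and numbers are invented stand-ins for the paper's trajectory, heating, and one-dimensional thermal tools.

```python
# Monte Carlo estimate of failure probability against a temperature limit.
# All distributions, coefficients, and the limit are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
T_LIMIT = 515.0                             # assumed bondline temperature limit, K

# Sampled input uncertainties (invented distributions)
speed_dev = rng.normal(0.0, 1.0, n)         # normalized entry-speed deviation
heating = rng.lognormal(0.0, 0.1, n)        # aerothermal heating multiplier
conduct = rng.normal(1.0, 0.05, n)          # TPS conductivity multiplier

# Surrogate thermal response: a response-surface-style stand-in mapping
# sampled inputs to peak bondline temperature (K)
T_bond = 500.0 + 4.0 * speed_dev + 30.0 * (heating - 1.0) + 20.0 * (conduct - 1.0)

p_fail = np.mean(T_bond > T_LIMIT)          # direct-computation risk estimate
print(f"estimated failure probability: {p_fail:.4f}")
```

In the paper's response-surface variant, the expensive thermal solver is replaced by a fitted surrogate like the linear expression above, trading fidelity for the ability to draw many more samples.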

  4. A cooperative model for IS security risk management in distributed environment.

    PubMed

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. The actual case studied illustrates the cooperative model presented in this paper and how it can be exploited to manage the distributed IS security risk effectively.
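The role a Bayesian network plays here, updating a predicted risk level as security information arrives from allied organizations, can be sketched with a toy two-parent network and inference by enumeration. The structure, variables, and probabilities below are invented for illustration.

```python
# Toy Bayesian network for IS security risk: Threat -> Breach <- Vulnerability.
# All probabilities are invented for illustration.
P_threat = {True: 0.3, False: 0.7}                  # external threat present
P_vuln = {True: 0.2, False: 0.8}                    # unpatched vulnerability
P_breach = {                                        # P(breach | threat, vuln)
    (True, True): 0.9, (True, False): 0.3,
    (False, True): 0.2, (False, False): 0.01,
}

def risk_level(threat_evidence=None):
    """P(breach), optionally conditioning on shared threat intelligence."""
    num = den = 0.0
    for t in (True, False):
        if threat_evidence is not None and t != threat_evidence:
            continue                                # drop worlds ruled out by evidence
        for v in (True, False):
            joint = P_threat[t] * P_vuln[v]
            num += joint * P_breach[(t, v)]
            den += joint
    return num / den

prior = risk_level()                                # no shared information
posterior = risk_level(threat_evidence=True)        # an ally reports an active threat
print(f"risk rises from {prior:.3f} to {posterior:.3f}")
```

Exchanging evidence between organizations amounts to conditioning each network on findings observed elsewhere, which is how the proposed model lets a security manager re-rank safeguards dynamically.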

  5. A Cooperative Model for IS Security Risk Management in Distributed Environment

    PubMed Central

    Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. The actual case studied illustrates the cooperative model presented in this paper and how it can be exploited to manage the distributed IS security risk effectively. PMID:24563626

  6. Evaluating biomarkers to model cancer risk post cosmic ray exposure.

    PubMed

    Sridharan, Deepa M; Asaithamby, Aroumougame; Blattnig, Steve R; Costes, Sylvain V; Doetsch, Paul W; Dynan, William S; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D; Peterson, Leif E; Plante, Ianik; Ponomarev, Artem L; Saha, Janapriya; Snijders, Antoine M; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed at later time points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  7. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    NASA Astrophysics Data System (ADS)

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed at later time points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  8. Guarana provides additional stimulation over caffeine alone in the planarian model.

    PubMed

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R; Constable, Mic Andre; Mulligan, Margaret E; Voura, Evelyn B

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians following substance exposure have been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose.

  9. Guarana Provides Additional Stimulation over Caffeine Alone in the Planarian Model

    PubMed Central

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R.; Constable, Mic Andre; Mulligan, Margaret E.; Voura, Evelyn B.

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians following substance exposure have been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  10. Building a Better Model: A Comprehensive Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Apply Resources

    DTIC Science & Technology

    2014-10-01

    assessment model that includes automated measurement of breast density. Scope: Assemble a cohort of women with known breast cancer risk factors and...digital mammogram files for women diagnosed with breast cancer using existing data sources and match them to controls (Harvey/Knaus). Validate and...density will translate to changes in breast cancer risk. Therefore, noise in measurement should be minimal. Thirty women were recruited under this

  11. Skill of Generalized Additive Model to Detect PM2.5 Health ...

    EPA Pesticide Factsheets

    Summary. Measures of health outcomes are collinear with meteorology and air quality, making analysis of connections between human health and air quality difficult. The purpose of this analysis was to determine time scales and periods shared by the variables of interest (and by implication scales and periods that are not shared). Hospital admissions, meteorology (temperature and relative humidity), and air quality (PM2.5 and daily maximum ozone) for New York City during the period 2000-2006 were decomposed into temporal scales ranging from 2 days to greater than two years using a complex wavelet transform. Health effects were modeled as functions of the wavelet components of meteorology and air quality using the generalized additive model (GAM) framework. This simulation study showed that GAM is extremely successful at extracting and estimating a health effect embedded in a dataset. It also shows that, if the objective in mind is to estimate the health signal but not to fully explain this signal, a simple GAM model with a single confounder (calendar time) whose smooth representation includes a sufficient number of constraints is as good as a more complex model.Introduction. In the context of wavelet regression, confounding occurs when two or more independent variables interact with the dependent variable at the same frequency. Confounding also acts on a variety of time scales, changing the PM2.5 coefficient (magnitude and sign) and its significance ac

  12. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of

  13. Computation of octanol-water partition coefficients by guiding an additive model with knowledge.

    PubMed

    Cheng, Tiejun; Zhao, Yuan; Li, Xun; Lin, Fu; Xu, Yong; Zhang, Xinglong; Li, Yan; Wang, Renxiao; Lai, Luhua

    2007-01-01

    We have developed a new method, i.e., XLOGP3, for logP computation. XLOGP3 predicts the logP value of a query compound by using the known logP value of a reference compound as a starting point. The difference in the logP values of the query compound and the reference compound is then estimated by an additive model. The additive model implemented in XLOGP3 uses a total of 87 atom/group types and two correction factors as descriptors. It is calibrated on a training set of 8199 organic compounds with reliable logP data through a multivariate linear regression analysis. For a given query compound, the compound showing the highest structural similarity in the training set will be selected as the reference compound. Structural similarity is quantified based on topological torsion descriptors. XLOGP3 has been tested along with its predecessor, i.e., XLOGP2, as well as several popular logP methods on two independent test sets: one contains 406 small-molecule drugs approved by the FDA and the other contains 219 oligopeptides. On both test sets, XLOGP3 produces more accurate predictions than most of the other methods with average unsigned errors of 0.24-0.51 units. Compared to conventional additive methods, XLOGP3 does not rely on an extensive classification of fragments and correction factors in order to improve accuracy. It is also able to utilize the ever-increasing experimentally measured logP data more effectively.
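The reference-compound idea behind XLOGP3 (predict logP as a known reference value plus an additive correction from descriptor differences) can be sketched as follows. The atom types, counts, coefficients, and compounds below are invented; XLOGP3's actual model uses 87 atom/group types and two correction factors calibrated on 8199 compounds.

```python
# Sketch of reference-guided additive logP prediction.
# Atom types, coefficients, and compounds are invented for illustration.
coef = {"C.ar": 0.30, "O.hydroxyl": -0.40, "N.amine": -0.60}  # per-atom-type contributions

def featurize(counts):
    """Map an atom-type count dict onto the fixed descriptor order."""
    return [counts.get(k, 0) for k in coef]

def additive_logp_delta(query_counts, ref_counts):
    """Additive-model estimate of logP(query) - logP(reference)."""
    return sum(c * (q - r) for c, q, r in
               zip(coef.values(), featurize(query_counts), featurize(ref_counts)))

# Hypothetical reference compound (the most similar training compound)
ref_counts = {"C.ar": 6, "O.hydroxyl": 1}       # a phenol-like scaffold
ref_logp = 1.46                                 # its known experimental logP
query_counts = {"C.ar": 6, "O.hydroxyl": 2}     # query adds one hydroxyl group

pred = ref_logp + additive_logp_delta(query_counts, ref_counts)
print(f"predicted logP: {pred:.2f}")
```

Because only the descriptor *differences* enter the correction, systematic errors shared by structurally similar compounds cancel, which is why anchoring to a measured reference improves over a purely additive prediction from scratch.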

  14. A quality risk management model approach for cell therapy manufacturing.

    PubMed

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed.
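The failure mode and effect analysis scoring described above, direct estimation of severity, occurrence, and detection followed by a Pareto cut, can be sketched as below. The failure modes and all scores are invented; a real cell-manufacturing FMEA would be built by the process experts.

```python
# FMEA-style scoring: risk priority number (RPN) = severity x occurrence x
# detection, then a Pareto cut to choose which failure modes to mitigate.
# Failure modes and 1-10 scores are invented for illustration.
failure_modes = {
    "operator pipetting variation": (7, 6, 4),
    "collagenase lot variability":  (8, 5, 5),
    "incubator temperature drift":  (9, 2, 3),
    "mislabeled cryovial":          (10, 2, 2),
}

rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)

# Pareto analysis: act first on the modes carrying ~80% of the summed RPN.
total = sum(rpn.values())
cum, priority = 0, []
for name, score in ranked:
    priority.append(name)
    cum += score
    if cum / total >= 0.8:
        break

print("mitigate first:", priority)
```

Note the characteristic FMEA effect visible even in this toy table: a high-severity but well-detected mode ("mislabeled cryovial") ranks below moderate-severity modes that occur often and are hard to detect.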

  15. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.

  16. Assessing Landslide Risk Areas Using Statistical Models and Land Cover

    NASA Astrophysics Data System (ADS)

    Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.

    2015-12-01

    Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events related to climate change, such as typhoons and heavy rainfall, are the main drivers of this damage. Inje-gun, Gangwon-do, in particular suffered severe landslide damage in 2006 and 2007. Forest covers 91% of Inje-gun, so many land covers related to human activities lie adjacent to forest land, and the establishment of adaptation plans for landslides was urgently needed. Landslide risk assessment can provide useful information to policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans that reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence. Various SDMs were used to produce landslide probability maps, accounting for the uncertainty of individual SDMs. Land-cover types were classified into 5 grades according to their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk. The overlay analysis yielded landslide risk areas; agricultural and transportation areas in particular showed high risk over large areas of the risk map. In conclusion, policy makers in Inje-gun should consider the landslide risk map to establish adaptation plans effectively.
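The overlay step, combining a modeled landslide probability surface with graded land-cover vulnerability to map risk, can be sketched on toy rasters. The grids, the multiplicative combination rule, and the top-quartile threshold below are invented for illustration.

```python
# Raster-style overlay: landslide probability x land-cover vulnerability grade.
# Grids and the combination rule are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
prob = rng.uniform(0.0, 1.0, size=(4, 4))        # modeled landslide probability per cell
grade = rng.integers(1, 6, size=(4, 4))          # land-cover vulnerability grade, 1-5

risk = prob * grade                              # simple multiplicative overlay
high_risk = risk >= np.quantile(risk, 0.75)      # flag the top quartile of cells

print(f"{high_risk.sum()} of {high_risk.size} cells flagged high-risk")
```

In a GIS workflow the same operation runs cell-by-cell over full-resolution probability and land-cover rasters, and the flagged cells are what a risk map presents to planners.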

  17. A Model for Risk Analysis of Oil Tankers

    NASA Astrophysics Data System (ADS)

    Montewka, Jakub; Krata, Przemysław; Goerland, Floris; Kujala, Pentti

    2010-01-01

    The paper presents a model for risk analysis of marine traffic, with emphasis on the two most common types of marine accidents: collision and grounding. The focus is on oil tankers, as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. The model is thus regarded as a block-type model, consisting of blocks for estimating the probability of collision and grounding, respectively, as well as blocks for modelling the consequences of an accident. The probability of a collision is assessed by means of a Minimum Distance To Collision (MDTC) based model, which defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address the waterway crossing between Helsinki and Tallinn, where dense cross traffic is observed during certain hours. For assessment of grounding probability, a new approach is proposed, utilizing a newly developed model in which spatial interactions between objects in different locations are recognized. A ship on a seaway and navigational obstructions may be perceived as interacting objects, and their repulsion may be modelled by a deterministic formulation. The risk of tankers running aground is addressed for an approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms and concern the costs of an oil spill, based on statistics of compensation claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by the parties involved.

  18. The benefits of an additional worker are task-dependent: assessing low-back injury risks during prefabricated (panelized) wall construction.

    PubMed

    Kim, Sunwook; Nussbaum, Maury A; Jia, Bochen

    2012-09-01

    Team manual material handling is a common practice in residential construction where prefabricated building components (e.g., wall panels) are increasingly used. As part of a larger effort to enable proactive control of ergonomic exposures among workers handling panels, this study explored the effects of additional workers on injury risks during team-based panel erection tasks, specifically by quantifying how injury risks are affected by increasing the number of workers (by one, above the nominal or most common number). Twenty-four participants completed panel erection tasks with and without an additional worker under different panel mass and size conditions. Four risk assessment methods were employed that emphasized the low back. Though including an additional worker generally reduced injury risk across several panel masses and sizes, the magnitude of these benefits varied depending on the specific task and exhibited somewhat high variability within a given task. These results suggest that a simple, generalizable recommendation regarding team-based panel erection tasks is not warranted. Rather, a more systems-level approach accounting for both injury risk and productivity (a strength of panelized wall systems) should be undertaken.

  19. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. The debris field resulting from destruction of the launch vehicle is one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and of the velocity distributions on the strike probability and risk.

  20. Evaluation of Periodontal Risk in Adult Patients using Two Different Risk Assessment Models – A Pilot Study

    PubMed Central

    Bade, Shruthi; Bollepalli, Appaiah Chowdary; Katuri, Kishore Kumar; Devulapalli, Narasimha Swamy; Swarna, Chakrapani

    2015-01-01

    Objective: The aim of the present study was to evaluate the periodontal risk of individuals using the periodontal risk assessment (PRA) model and the modified PRA (mPRA) model. Materials and Methods: A total of 50 patients with chronic periodontitis, aged 30-60 years, were selected randomly; the periodontal status of each was charted, and those who met the inclusion criteria were enrolled in the study. Parameters recorded were: percentage of sites with bleeding on probing (BOP), number of sites with pocket depths (PD) ≥ 5 mm, number of teeth lost, bone loss (BL)/age ratio, clinical attachment loss (CAL)/age ratio, smoking status, dental status, and systemic factors such as diabetes. All risk factors were plotted on radar charts for the PRA and mPRA models using Microsoft Excel, and periodontal risk was categorized as low, moderate or high. Results: Among the 50 patients, 31 were classified as low risk, 9 as moderate risk, and 10 as high risk by the mPRA model, whereas 28 were classified as low risk, 13 as moderate risk, and 9 as high risk by the PRA model. Statistical analysis demonstrated no significant difference between the risk scores (χ² = 0.932, degrees of freedom = 2, P = 0.627). Conclusion: Both periodontal risk models are effective in evaluating risk factors and can be useful tools for proper diagnosis and for predicting disease progression and therapeutic strategies during supportive periodontal therapy. PMID:25859520
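    As a check on the reported statistic, the chi-squared test of independence on the low/moderate/high counts given above (mPRA: 31/9/10; PRA: 28/13/9) can be reproduced in a few lines:

```python
# Chi-squared test of independence for the 2x3 table of risk categories
# (rows: mPRA and PRA models; columns: low / moderate / high risk).

def chi_square(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

x2 = chi_square([[31, 9, 10], [28, 13, 9]])  # counts from the abstract
```

    Rounded to three decimals, this reproduces the abstract's value of 0.932.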

  1. Inaccuracy of Self-Evaluation as Additional Variable for Prediction of Students at Risk of Failing First-Year Chemistry

    ERIC Educational Resources Information Center

    Potgieter, Marietjie; Ackermann, Mia; Fletcher, Lizelle

    2010-01-01

    Early identification of students at risk of failing first-year chemistry allows timely intervention. Cognitive factors alone are insufficient predictors of success; however, non-cognitive factors are usually difficult to measure. We have explored the use of demographic and performance variables, as well as the accuracy of self-evaluation, as an…

  2. The biobehavioral family model: testing social support as an additional exogenous variable.

    PubMed

    Woods, Sarah B; Priest, Jacob B; Roush, Tara

    2014-12-01

    This study tests the inclusion of social support as a distinct exogenous variable in the Biobehavioral Family Model (BBFM). The BBFM is a biopsychosocial approach to health that proposes that biobehavioral reactivity (anxiety and depression) mediates the relationship between family emotional climate and disease activity. Data for this study included married, English-speaking adult participants (n = 1,321; 55% female; M age = 45.2 years) from the National Comorbidity Survey Replication, a nationally representative epidemiological study of the frequency of mental disorders in the United States. Participants reported their demographics, marital functioning, social support from friends and relatives, anxiety and depression (biobehavioral reactivity), number of chronic health conditions, and number of prescription medications. Confirmatory factor analyses supported the items used in the measures of negative marital interactions, social support, and biobehavioral reactivity, as well as the use of negative marital interactions, friends' social support, and relatives' social support as distinct factors in the model. Structural equation modeling indicated a good fit of the data to the hypothesized model (χ² = 846.04, p = .000, SRMR = .039, CFI = .924, TLI = .914, RMSEA = .043). Negative marital interactions predicted biobehavioral reactivity (β = .38, p < .001), as did relatives' social support, inversely (β = -.16, p < .001). Biobehavioral reactivity predicted disease activity (β = .40, p < .001) and was demonstrated to be a significant mediator through tests of indirect effects. Findings are consistent with previous tests of the BBFM with adult samples, and suggest the important addition of family social support as a predicting factor in the model.

  3. A habitat suitability model for Chinese sturgeon determined using the generalized additive method

    NASA Astrophysics Data System (ADS)

    Yi, Yujun; Sun, Jie; Zhang, Shanghong

    2016-03-01

    The Chinese sturgeon is a large anadromous fish that migrates between the ocean and rivers. Because of dam construction, this sturgeon's migration path has been cut off, and the species is currently on the verge of extinction. Simulating suitable environmental conditions for spawning, and then repairing or rebuilding its spawning grounds, are effective ways to protect this species. Various habitat suitability models based on expert knowledge have been used to evaluate the suitability of spawning habitat. In this study, a two-dimensional hydraulic simulation is used to inform a habitat suitability model based on the generalized additive method (GAM). The GAM is based on real data. The values of water depth and velocity are calculated first via the hydrodynamic model and later applied in the GAM. The final habitat suitability model is validated using catch per unit effort (CPUE) data from 1999 and 2003. The model results show that velocities of 1.06-1.56 m/s and depths of 13.33-20.33 m are highly suitable ranges for the Chinese sturgeon to spawn. The hydraulic habitat suitability indexes (HHSI) for seven discharges (4000, 9000, 12,000, 16,000, 20,000, 30,000, and 40,000 m³/s) are calculated to evaluate integrated habitat suitability. The results show that integrated habitat suitability reaches its highest value at a discharge of 16,000 m³/s. This study is the first to apply a GAM to evaluate the suitability of spawning grounds for the Chinese sturgeon. The study provides a reference for the identification of potential spawning grounds in the entire basin.
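    The fitted suitability ranges reported above (velocity 1.06-1.56 m/s, depth 13.33-20.33 m) can be applied cell-by-cell to hydrodynamic model output. A toy sketch with made-up grid cells; a real GAM returns a smooth suitability surface rather than this binary flag:

```python
# Flag hydraulic-model grid cells whose depth-averaged velocity (m/s) and
# depth (m) both fall in the highly suitable spawning ranges from the study.

def suitable(velocity: float, depth: float) -> bool:
    return 1.06 <= velocity <= 1.56 and 13.33 <= depth <= 20.33

# Hypothetical (velocity, depth) pairs for four grid cells.
cells = [(0.8, 10.0), (1.2, 15.0), (1.4, 19.0), (2.0, 18.0)]
flags = [suitable(v, d) for v, d in cells]
```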

  4. Modeling particulate matter concentrations measured through mobile monitoring in a deletion/substitution/addition approach

    NASA Astrophysics Data System (ADS)

    Su, Jason G.; Hopke, Philip K.; Tian, Yilin; Baldwin, Nichole; Thurston, Sally W.; Evans, Kristin; Rich, David Q.

    2015-12-01

    Land use regression (LUR) modeling with local-scale circular modeling domains has been used to predict traffic-related air pollution such as nitrogen oxides (NOX). LUR modeling for fine particulate matter (PM), which generally has smaller spatial gradients than NOX, has typically been applied in studies involving multiple study regions. To increase the spatial coverage for fine PM and key constituent concentrations, we designed a mobile monitoring network in Monroe County, New York to measure pollutant concentrations of black carbon (BC, wavelength 880 nm), ultraviolet black carbon (UVBC, wavelength 370 nm) and Delta-C (the difference between the UVBC and BC concentrations) using the Clarkson University Mobile Air Pollution Monitoring Laboratory (MAPL). A Deletion/Substitution/Addition (D/S/A) algorithm was applied, using circular buffers as the basis for the statistics. The algorithm maximizes the prediction accuracy for locations without measurements using V-fold cross-validation, and it reduces overfitting compared to other approaches. We found that the D/S/A LUR modeling approach achieved good results, with prediction powers of 60%, 63%, and 61% for BC, UVBC, and Delta-C, respectively. The advantage of mobile monitoring is that it can monitor pollutant concentrations at hundreds of spatial points in a region, rather than the fewer than 100 points typical of a fixed-site saturation monitoring network. This research indicates that a mobile saturation sampling network, when combined with proper modeling techniques, can uncover small-area variations (e.g., 10 m) in particulate matter concentrations.
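    The D/S/A search scores candidate models by V-fold cross-validated risk. A stripped-down sketch of that scoring step, using the training-fold mean as a stand-in "model"; the fold assignment and data are arbitrary illustrations:

```python
# V-fold cross-validated mean squared error: fit on V-1 folds, score on the
# held-out fold, average over folds. D/S/A uses this risk to compare the
# models produced by its deletion / substitution / addition moves.

def vfold_cv_mse(y, v=5):
    folds = [y[i::v] for i in range(v)]          # simple systematic folds
    total_sq_err, n = 0.0, 0
    for i in range(v):
        train = [x for j, fold in enumerate(folds) if j != i for x in fold]
        mu = sum(train) / len(train)             # placeholder "model" fit
        total_sq_err += sum((x - mu) ** 2 for x in folds[i])
        n += len(folds[i])
    return total_sq_err / n
```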

  5. Revisiting automated G-protein coupled receptor modeling: the benefit of additional template structures for a neurokinin-1 receptor model.

    PubMed

    Kneissl, Benny; Leonhardt, Bettina; Hildebrandt, Andreas; Tautermann, Christofer S

    2009-05-28

    The feasibility of automated procedures for the modeling of G-protein coupled receptors (GPCR) is investigated using the human neurokinin-1 (NK1) receptor as an example. We use a combined method of homology modeling and molecular docking and analyze the information content of the resulting docking complexes regarding the binding mode for further refinements. Moreover, we explore the impact of different template structures: the bovine rhodopsin structure, the human β2-adrenergic receptor, and in particular a combination of both templates to include backbone flexibility in the target conformational space. Our results for NK1 modeling demonstrate that model selection from a set of decoys cannot in general rely solely on docking experiments but still requires additional mutagenesis data. However, an enrichment factor of 2.6 in a nearly fully automated approach indicates that reasonable models can be created automatically if both available templates are used for model construction. Thus, the recently resolved GPCR structures open new ways to improve model building fundamentally.

  6. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    PubMed Central

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries-independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries-independent data set (SEAMAP) and climate-scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable for making predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including areas from which no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. This method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223

  7. Smoking and polymorphisms in xenobiotic metabolism and DNA repair genes are additive risk factors affecting bladder cancer in Northern Tunisia.

    PubMed

    Rouissi, Kamel; Ouerhani, Slah; Hamrita, Bechr; Bougatef, Karim; Marrakchi, Raja; Cherif, Mohamed; Ben Slama, Mohamed Riadh; Bouzouita, Mohamed; Chebil, Mohamed; Ben Ammar Elgaaied, Amel

    2011-12-01

    Cancer epidemiology has undergone marked development since the nineteen-fifties. One of the most spectacular and specific contributions was the demonstration of the massive effect of smoking and genetic polymorphisms on the occurrence of bladder cancer. Tobacco carcinogens are metabolized by various xenobiotic-metabolizing enzymes, such as the superfamilies of N-acetyltransferases (NAT) and glutathione S-transferases (GST). DNA repair is essential to an individual's ability to respond to damage caused by tobacco carcinogens, and alterations in DNA repair genes may affect cancer risk by influencing individual susceptibility to this environmental exposure. Polymorphisms in NAT2, GST and DNA repair genes alter the ability of these enzymes to metabolize carcinogens or to repair the alterations caused by this process. We have conducted a case-control study to assess the role of smoking, slow NAT2 variants, GSTM1 and GSTT1 null, and XPC, XPD and XPG nucleotide excision repair (NER) genotypes in bladder cancer development in Northern Tunisia. Taken alone, no gene except NAT2 appeared to be a factor affecting bladder cancer susceptibility. Among the NAT2 slow acetylator genotypes, the NAT2*5/*7 diplotype was found to confer a 7-fold increased risk of developing bladder cancer (OR = 7.14; 95% CI: 1.30-51.41). However, in tobacco consumers, we have shown that null GSTM1, wild-type GSTT1, slow NAT2, XPC (CC) and XPG (CC) genotypes are genetic risk factors for the disease. When combined in susceptible individuals, compared to protected individuals, these risk factors yield a markedly elevated OR (OR = 61). We have thus shown a strong cumulative effect of tobacco and different combinations of the studied genetic risk factors, which leads to great susceptibility to bladder cancer.
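    The reported effect sizes are odds ratios with 95% confidence intervals. A minimal sketch of the standard Woolf (log-OR) interval from a 2x2 exposure table; the counts below are synthetic, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI from a 2x2 table:
    a, b = exposed / unexposed cases; c, d = exposed / unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = or_ * math.exp(-z * se_log_or)
    upper = or_ * math.exp(z * se_log_or)
    return or_, (lower, upper)
```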

  8. Impact of an additional chronic BDNF reduction on learning performance in an Alzheimer mouse model

    PubMed Central

    Psotta, Laura; Rockahr, Carolin; Gruss, Michael; Kirches, Elmar; Braun, Katharina; Lessmann, Volkmar; Bock, Jörg; Endres, Thomas

    2015-01-01

    There is increasing evidence that brain-derived neurotrophic factor (BDNF) plays a crucial role in Alzheimer's disease (AD) pathology. A number of studies have demonstrated that AD patients exhibit reduced BDNF levels in the brain and the blood serum, and in addition, several animal-based studies indicated a potential protective effect of BDNF against Aβ-induced neurotoxicity. In order to further investigate the role of BDNF in the etiology of AD, we created a novel mouse model by crossing a well-established AD mouse model (APP/PS1) with a mouse exhibiting a chronic BDNF deficiency (BDNF+/−). This new triple transgenic mouse model enabled us to further analyze the role of BDNF in AD in vivo. We reasoned that, if BDNF has a protective effect against AD pathology, an AD-like phenotype in our new mouse model should occur earlier and/or with greater severity than in APP/PS1 mice. Indeed, the behavioral analysis revealed that the APP/PS1-BDNF+/−-mice show an earlier onset of learning impairments in a two-way active avoidance task in comparison to APP/PS1- and BDNF+/−-mice. However, in the Morris water maze (MWM) test we did not observe an overall aggravated impairment in spatial learning, and short-term memory in an object recognition task remained intact in all tested mouse lines. In addition to the behavioral experiments, we analyzed the amyloid plaque pathology in the APP/PS1 and APP/PS1-BDNF+/−-mice and observed a comparable plaque density in the two genotypes. Moreover, our results revealed a higher plaque density in prefrontal cortical compared to hippocampal brain regions. Our data reveal that higher cognitive tasks requiring the recruitment of cortical networks appear to be more severely affected in our new mouse model than learning tasks requiring mainly sub-cortical networks. Furthermore, our observation of an accelerated impairment in active avoidance learning in APP/PS1-BDNF+/−-mice further supports the hypothesis that BDNF deficiency

  9. Cognitive Processes that Account for Mental Addition Fluency Differences between Children Typically Achieving in Arithmetic and Children At-Risk for Failure in Arithmetic

    ERIC Educational Resources Information Center

    Berg, Derek H.; Hutchinson, Nancy L.

    2010-01-01

    This study investigated whether processing speed, short-term memory, and working memory accounted for the differential mental addition fluency between children typically achieving in arithmetic (TA) and children at-risk for failure in arithmetic (AR). Further, we drew attention to fluency differences in simple (e.g., 5 + 3) and complex (e.g., 16 +…

  10. Additive influence of genetic predisposition and conventional risk factors in the incidence of coronary heart disease: a population-based study in Greece

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An additive genetic risk score (GRS) for coronary heart disease (CHD) has previously been associated with incident CHD in the population-based Greek European Prospective Investigation into Cancer and nutrition (EPIC) cohort. In this study, we explore GRS-‘environment’ joint actions on CHD for severa...

  11. Risk of second primary cancer following prostate cancer radiotherapy: DVH analysis using the competitive risk model

    NASA Astrophysics Data System (ADS)

    Takam, R.; Bezak, E.; Yeoh, E. E.

    2009-02-01

    This study aimed to estimate the risk of developing second primary cancer (SPC) corresponding to various radiation treatment techniques for prostate cancer. Estimation of SPC was done by analysing differential dose-volume histograms (DDVH) of normal tissues such as rectum, bladder and urethra with the competitive risk model. Differential DVHs were obtained from treatment planning systems for external beam radiotherapy (EBRT), low-dose-rate (LDR) and high-dose-rate (HDR) brachytherapy techniques. The average risk of developing SPC was no greater than 0.6% for all treatment techniques but was lower with either LDR or HDR brachytherapy alone compared with any EBRT technique. For LDR and HDR brachytherapy alone, the risk of SPC for the rectum was 2.0 × 10⁻⁴% and 8.3 × 10⁻⁵% respectively, compared with 0.2% for EBRT using five-field 3D-CRT to a total dose of 74 Gy. Overall, the risk of developing SPC for the urethra following all radiation treatment techniques was very low compared with the rectum and bladder. Treatment plans which deliver equivalent doses of around 3-5 Gy to normal tissues were associated with higher risks of development of SPC.

  12. Spectral models of additive and modulation noise in speech and phonatory excitation signals

    NASA Astrophysics Data System (ADS)

    Schoentgen, Jean

    2003-01-01

    The article presents spectral models of additive and modulation noise in speech. The purpose is to learn about the causes of noise in the spectra of normal and disordered voices and to gauge whether the spectral properties of the perturbations of the phonatory excitation signal can be inferred from the spectral properties of the speech signal. The approach to modeling consists of deducing the Fourier series of the perturbed speech, assuming that the Fourier series of the noise and of the clean monocycle-periodic excitation are known. The models explain published data, take into account the effects of supraglottal tremor, demonstrate the modulation distortion owing to vocal tract filtering, establish conditions under which noise cues of different speech signals may be compared, and predict the impossibility of inferring the spectral properties of the frequency modulating noise from the spectral properties of the frequency modulation noise (e.g., phonatory jitter and frequency tremor). The general conclusion is that only phonatory frequency modulation noise is spectrally relevant. Other types of noise in speech are either epiphenomenal, or their spectral effects are masked by the spectral effects of frequency modulation noise.

  13. Mental self-government: development of the additional democratic learning style scale using Rasch measurement models.

    PubMed

    Nielsen, Tine; Kreiner, Svend; Styles, Irene

    2007-01-01

    This paper describes the development and validation of a democratic learning style scale intended to fill a gap in Sternberg's theory of mental self-government and the associated learning style inventory (Sternberg, 1988, 1997). The scale was constructed as an 8-item scale with a 7-category response scale, and was developed following an adapted version of DeVellis' (2003) guidelines for scale development. The validity of the Democratic Learning Style Scale was assessed by item analysis using graphical loglinear Rasch models (Kreiner and Christensen, 2002, 2004, 2006). The item analysis confirmed that the full 8-item revised Democratic Learning Style Scale fitted a graphical loglinear Rasch model with no differential item functioning but weak to moderate uniform local dependence between two items. In addition, a reduced 6-item version of the scale fitted the pure Rasch model with a rating scale parameterization. The revised Democratic Learning Style Scale can therefore be regarded as a sound measurement scale meeting the requirements of both construct validity and objectivity.

  14. A Bayesian additive model for understanding public transport usage in special events.

    PubMed

    Rodrigues, Filipe; Borysov, Stanislav; Ribeiro, Bernardete; Pereira, Francisco

    2016-12-02

    Public special events, such as sports games, concerts and festivals, are well known to create disruptions in transportation systems, often catching the operators by surprise. Although these events are usually planned well in advance, their impact is difficult to predict, even when organisers and transportation operators coordinate. The problem becomes considerably harder when several events happen concurrently. To address it, costly processes that rely heavily on manual search and personal experience are standard practice in large cities like Singapore, London or Tokyo. This paper presents a Bayesian additive model with Gaussian process components that combines smart card records from public transport with context information about events that is continuously mined from the Web. We develop an efficient approximate inference algorithm using expectation propagation, which allows us to predict the total number of public transportation trips to the special event areas, thereby contributing to a more adaptive transportation system. Furthermore, for multiple concurrent event scenarios, the proposed algorithm is able to disaggregate gross trip counts into their most likely components related to specific events and routine behavior. Using real data from Singapore, we show that the presented model outperforms the best baseline model by up to 26% in R² and also has explanatory power for its individual components.

  15. Risk assessment of consuming agricultural products irrigated with reclaimed wastewater: An exposure model

    NASA Astrophysics Data System (ADS)

    van Ginneken, Meike; Oron, Gideon

    2000-09-01

    This study assesses the health risks to consumers of agricultural products irrigated with reclaimed wastewater. The analysis is based on an exposure model which takes into account several parameters: (1) the quality of the applied wastewater, (2) the irrigation method, (3) the elapsed times between irrigation, harvest, and product consumption, and (4) the consumers' habits. The exposure model is used for numerical simulation of risks to human consumers using the Monte Carlo method. The results of the numerical simulation show large deviations, probably caused by uncertainty (imprecision in the quality of the input data) and by variability due to diversity among populations. There is a 10-orders-of-magnitude difference in the risk of infection between different exposure scenarios with the same water quality. This variation indicates the need for setting risk-based criteria for wastewater reclamation rather than single water quality guidelines. Additional data are required to decrease uncertainty in the risk assessment. Future research needs include the definition of acceptable risk criteria, more accurate dose-response modeling, information regarding pathogen survival in treated wastewater, additional data on the passage of pathogens onto and within plants during irrigation, and information regarding the behavior patterns of the community of human consumers.
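    The Monte Carlo step the abstract describes (propagating uncertain exposure inputs through a dose-response model) can be sketched as follows. The distributions, die-off rate and the exponential dose-response parameter r are all assumptions for illustration, not the paper's calibrated inputs:

```python
import math
import random

random.seed(1)  # reproducible draws

def mean_infection_risk(n=10_000, r=0.005):
    """Average single-exposure infection risk over n Monte Carlo draws."""
    total = 0.0
    for _ in range(n):
        conc = random.lognormvariate(0.0, 1.0)      # pathogens/gram on crop
        days = random.uniform(1.0, 14.0)            # irrigation-to-eating gap
        survival = math.exp(-0.5 * days)            # assumed pathogen die-off
        grams = random.uniform(50.0, 200.0)         # daily consumption
        dose = conc * survival * grams
        total += 1.0 - math.exp(-r * dose)          # exponential dose-response
    return total / n

mean_risk = mean_infection_risk()
```

    Replacing the point estimates of each input with sampled distributions is what produces the wide spread of scenario risks the abstract reports.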

  16. The cut-off value for interleukin 34 as an additional potential inflammatory biomarker for the prediction of the risk of diabetic complications.

    PubMed

    Zorena, Katarzyna; Jachimowicz-Duda, Olga; Wąż, Piotr

    2016-01-01

    In the present study, we evaluated whether serum interleukin 34 (IL-34) levels may have diagnostic value in predicting the risk of vascular diabetic complications. The study included 49 patients with type 2 diabetes mellitus (T2DM) and a high-risk group of 23 subjects. Receiver operating characteristic (ROC) curve analysis showed that IL-34 has more discriminatory power than C-reactive protein (CRP) for the risk of diabetic complications. The cut-off value for IL-34 was established as 91.2 pg/mL. The central outcome of our research is the identification of IL-34 as an additional potential inflammatory biomarker for predicting the risk of vascular diabetic complications.
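    A cut-off like the 91.2 pg/mL reported above is typically read off the ROC curve by maximising the Youden index (sensitivity + specificity − 1). A sketch on synthetic data; the IL-34-like values and outcome labels are invented:

```python
def youden_cutoff(values, labels):
    """Threshold maximising sensitivity + specificity - 1 (labels: 1 = event)."""
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):            # candidate thresholds
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y)
        tn = sum(1 for v, y in zip(values, labels) if v < t and not y)
        j = tp / n_pos + tn / n_neg - 1      # Youden index at this threshold
        if j > best_j:
            best_t, best_j = t, j
    return best_t

vals = [60, 70, 85, 95, 100, 120, 130, 150]   # synthetic biomarker levels
labs = [0, 0, 0, 1, 0, 1, 1, 1]               # 1 = complication developed
```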

  17. A risk management model for securing virtual healthcare communities.

    PubMed

    Chryssanthou, Anargyros; Varlamis, Iraklis; Latsiou, Charikleia

    2011-01-01

    Virtual healthcare communities aim to bring together healthcare professionals and patients, improve the quality of healthcare services and assist healthcare professionals and researchers in their everyday activities. In a secure and reliable environment, patients share their medical data with doctors, expect confidentiality and demand reliable medical consultation. Apart from a concrete policy framework, several ethical, legal and technical issues must be considered in order to build a trustworthy community. This research emphasises security issues that can arise inside a virtual healthcare community and that relate to the communication and storage of data. It capitalises on a standardised risk management methodology and a prototype architecture for healthcare community portals, and justifies a security model that allows the identification, estimation and evaluation of potential security risks for the community. A hypothetical virtual healthcare community is employed to portray security risks and the solutions that the security model provides.

  18. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    PubMed Central

    2010-01-01

    Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM), which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster, and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM permutation testing methods
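    The permutation tests evaluated here share one idea: recompute the statistic under random relabelling of case/control status and compare it to the observed value. A one-dimensional toy version with a mean-difference statistic; the real method refits the GAM's bivariate smooth at each permutation:

```python
import random

random.seed(0)  # reproducible permutations

def permutation_pvalue(cases, controls, n_perm=2000):
    """Two-sided permutation p-value for a difference in means."""
    obs = sum(cases) / len(cases) - sum(controls) / len(controls)
    pooled = list(cases) + list(controls)
    k = len(cases)
    hits = 0
    for _ in range(n_perm):
        random.shuffle(pooled)               # random case/control relabelling
        stat = sum(pooled[:k]) / k - sum(pooled[k:]) / (len(pooled) - k)
        if abs(stat) >= abs(obs):
            hits += 1
    return (hits + 1) / (n_perm + 1)         # add-one correction
```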

  19. Climate and weather risk in natural resource models

    NASA Astrophysics Data System (ADS)

    Merrill, Nathaniel Henry

    This work, consisting of three manuscripts, addresses natural resource management under risk due to variation in climate and weather. In three distinct but theoretically related applications, I quantify the role of natural resources in stabilizing economic outcomes. In Manuscript 1, we address policy designed to affect the risk of cyanobacteria blooms in a drinking water reservoir through watershed-wide policy. Combining a hydrologic and economic model for a watershed in Rhode Island, we solve for the efficient allocation of best management practices (BMPs) on livestock pastures to meet a monthly risk-based as well as mean-based water quality objective. In order to solve for the efficient allocations of nutrient control effort, we optimize a probabilistically constrained integer-programming problem representing the choices made on each farm and the resultant conditions that support cyanobacteria blooms. In doing so, we employ a genetic algorithm (GA). We hypothesize that management based on controlling the upper tail of the probability distribution of phosphorus loading implies different efficient management actions compared to controlling mean loading. We find a shift to more intense effort on fewer acres when a probabilistic objective is specified, with cost savings of up to 25% for meeting risk levels compared to mean-loading-based policies. Additionally, we illustrate the relative cost effectiveness of various policies designed to meet this risk-based objective. Rainfall and the subsequent overland runoff transport nutrients to a receiving water body, with larger amounts of phosphorus moving in more intense rainfall events. We highlight the importance of this transport mechanism by comparing policies under climate change scenarios, where the intensity of rainfall is projected to increase and the time series process of rainfall to change. In Manuscript 2, we introduce a new economic groundwater model that incorporates the gradual shift

  20. A Dual System Model of Preferences under Risk

    ERIC Educational Resources Information Center

    Mukherjee, Kanchan

    2010-01-01

    This article presents a dual system model (DSM) of decision making under risk and uncertainty according to which the value of a gamble is a combination of the values assigned to it independently by the affective and deliberative systems. On the basis of research on dual process theories and empirical research in Hsee and Rottenstreich (2004) and…

  1. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  2. Effective Genetic-Risk Prediction Using Mixed Models

    PubMed Central

    Golan, David; Rosset, Saharon

    2014-01-01

    For predicting genetic risk, we propose a statistical approach that is specifically adapted to dealing with the challenges imposed by disease phenotypes and case-control sampling. Our approach (termed Genetic Risk Scores Inference [GeRSI]), combines the power of fixed-effects models (which estimate and aggregate the effects of single SNPs) and random-effects models (which rely primarily on whole-genome similarities between individuals) within the framework of the widely used liability-threshold model. We demonstrate in extensive simulation that GeRSI produces predictions that are consistently superior to current state-of-the-art approaches. When applying GeRSI to seven phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) study, we confirm that the use of random effects is most beneficial for diseases that are known to be highly polygenic: hypertension (HT) and bipolar disorder (BD). For HT, there are no significant associations in the WTCCC data. The fixed-effects model yields an area under the ROC curve (AUC) of 54%, whereas GeRSI improves it to 59%. For BD, using GeRSI improves the AUC from 55% to 62%. For individuals ranked at the top 10% of BD risk predictions, using GeRSI substantially increases the BD relative risk from 1.4 to 2.5. PMID:25279982

  3. Asymptotic Theory for Weighted Least Squares Estimators in Aalen’s Additive Risk Model.

    DTIC Science & Technology

    1987-11-01

    [OCR-damaged excerpt; recoverable content:] Under the conditions that the bandwidth satisfies b_n -> 0 and n*b_n -> infinity, and the kernel function K has bounded variation, then for 0 < t_0 < 1 the weighted least squares estimator satisfies n^(1/2)(A_hat - A) converging weakly in D[t_0, 1]^p to a p-variate Gaussian martingale m'. Uniform consistency, sup over t in [t_0, 1] of |a_hat(t) - a(t)| -> 0 in probability, follows by integration by parts.

  4. Application of Catastrophe Risk Modelling to Evacuation Public Policy

    NASA Astrophysics Data System (ADS)

    Woo, G.

    2009-04-01

    The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally viewed as a hazard forecasting issue, civil authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane were appearing to head for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast, and chance of a false alarm. A systematic coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a
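
The cost-benefit framing in the abstract can be made explicit with a toy expected-loss comparison: call the evacuation when the probability-weighted losses averted exceed the certain cost of evacuating. The function and every figure below are hypothetical illustrations, not Woo's model.

```python
# Illustrative risk-based screen for an evacuation call. All figures
# (probabilities, populations, valuations, costs) are hypothetical.

def evacuate(p_event, lives_at_risk, value_per_life,
             protection_factor, evacuation_cost):
    """Return True if the expected loss averted by evacuating exceeds
    the (certain) cost of calling the evacuation.

    p_event            probability the hazard strikes the area
    protection_factor  fraction of exposed losses the evacuation averts
    """
    expected_benefit = p_event * lives_at_risk * value_per_life * protection_factor
    return expected_benefit > evacuation_cost

# Even a 10% strike probability can justify evacuating a large city...
print(evacuate(0.10, 500_000, 10_000_000, 0.8, 2_000_000_000))
# ...while a sparsely populated coast may not clear the same bar.
print(evacuate(0.10, 2_000, 10_000_000, 0.8, 2_000_000_000))
```

A full catastrophe-risk treatment would add false-alarm costs, uncertainty in the hazard footprint, and a stochastic event set; the point here is only that the decision rule is an explicit inequality rather than a forecast.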

  5. Design and tuning of standard additive model based fuzzy PID controllers for multivariable process systems.

    PubMed

    Harinath, Eranda; Mann, George K I

    2008-06-01

    This paper describes a design and two-level tuning method for fuzzy proportional-integral-derivative (FPID) controllers for a multivariable process, where the fuzzy inference uses the standard additive model. The proposed method can be used for any n x n multi-input-multi-output process and guarantees closed-loop stability. Tuning follows two steps: low-level tuning followed by high-level tuning. The low-level tuning adjusts apparent linear gains, whereas the high-level tuning changes the nonlinearity in the normalized fuzzy output. In this paper, two types of FPID configurations are considered, and their performances are evaluated by using a real-time multizone temperature control problem with a 3 x 3 process system.

  6. Modeling the flux of metabolites in the juvenile hormone biosynthesis pathway using generalized additive models and ordinary differential equations.

    PubMed

    Martínez-Rincón, Raúl O; Rivera-Pérez, Crisalejandra; Diambra, Luis; Noriega, Fernando G

    2017-01-01

    Juvenile hormone (JH) regulates development and reproductive maturation in insects. The corpora allata (CA) of female adult mosquitoes synthesize fluctuating levels of JH, which have been linked to ovarian development and are influenced by nutritional signals. The rate of JH biosynthesis is controlled by the rate of flux of isoprenoids in the pathway, which is the outcome of a complex interplay of changes in precursor pools and enzyme levels. A comprehensive study of the changes in enzymatic activities and precursor pool sizes has been previously reported for the mosquito Aedes aegypti JH biosynthesis pathway. In the present study, we used two different quantitative approaches to describe and predict how changes in the individual metabolic reactions in the pathway affect JH synthesis. First, we constructed generalized additive models (GAMs) that described the association between changes in specific metabolite concentrations and changes in enzymatic activities and substrate concentrations. Changes in substrate concentrations explained 50% or more of the model deviance in 7 of the 13 metabolic steps analyzed. Adding information on enzymatic activities almost always improved the fit of GAMs built solely on substrate concentrations. GAMs were validated using experimental data that were not included when the model was built. In addition, a system of ordinary differential equations (ODEs) was developed to describe the instantaneous changes in metabolites as a function of the levels of enzymatic catalytic activities. The results demonstrated the ability of the models to predict changes in the flux of metabolites in the JH pathway, and they can be used in the future to design and validate experimental manipulations of JH synthesis.
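
As a rough illustration of the GAM machinery invoked here, the following is a minimal backfitting sketch of an additive model (identity link) with a crude nearest-neighbour running-mean smoother; production GAM software uses penalized splines, and the data below are synthetic, not the mosquito measurements.

```python
import random
random.seed(0)

# Backfitting sketch of an additive model: y = alpha + f1(x1) + f2(x2) + noise,
# each f_j fitted with a k-nearest-neighbour running-mean smoother.

def smooth(x, r, k=15):
    """Fitted value at each x[i] is the mean of residuals r at the k
    nearest x-values (a crude scatterplot smoother)."""
    out = []
    for xi in x:
        nearest = sorted(range(len(x)), key=lambda j: abs(x[j] - xi))[:k]
        out.append(sum(r[j] for j in nearest) / k)
    return out

def backfit(xs, y, iters=10):
    n, p = len(y), len(xs)
    alpha = sum(y) / n
    f = [[0.0] * n for _ in range(p)]
    for _ in range(iters):
        for j in range(p):
            # partial residuals: remove intercept and all other components
            r = [y[i] - alpha - sum(f[k][i] for k in range(p) if k != j)
                 for i in range(n)]
            f[j] = smooth(xs[j], r)
            m = sum(f[j]) / n            # center each component for identifiability
            f[j] = [v - m for v in f[j]]
    return alpha, f

n = 200
x1 = [random.uniform(-2, 2) for _ in range(n)]
x2 = [random.uniform(-2, 2) for _ in range(n)]
y = [x1[i] ** 2 + 2.0 * x2[i] + random.gauss(0, 0.2) for i in range(n)]

alpha, f = backfit([x1, x2], y)
fitted = [alpha + f[0][i] + f[1][i] for i in range(n)]
sse = sum((y[i] - fitted[i]) ** 2 for i in range(n))
sst = sum((y[i] - sum(y) / n) ** 2 for i in range(n))
print(round(1 - sse / sst, 2))   # fraction of deviance explained, close to 1
```

The "deviance explained" printed at the end corresponds to the quantity the abstract reports per metabolic step (50% or more for 7 of 13 steps).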

  7. Modeling the flux of metabolites in the juvenile hormone biosynthesis pathway using generalized additive models and ordinary differential equations

    PubMed Central

    Martínez-Rincón, Raúl O.; Rivera-Pérez, Crisalejandra; Diambra, Luis; Noriega, Fernando G.

    2017-01-01

    Juvenile hormone (JH) regulates development and reproductive maturation in insects. The corpora allata (CA) of female adult mosquitoes synthesize fluctuating levels of JH, which have been linked to ovarian development and are influenced by nutritional signals. The rate of JH biosynthesis is controlled by the rate of flux of isoprenoids in the pathway, which is the outcome of a complex interplay of changes in precursor pools and enzyme levels. A comprehensive study of the changes in enzymatic activities and precursor pool sizes has been previously reported for the mosquito Aedes aegypti JH biosynthesis pathway. In the present study, we used two different quantitative approaches to describe and predict how changes in the individual metabolic reactions in the pathway affect JH synthesis. First, we constructed generalized additive models (GAMs) that described the association between changes in specific metabolite concentrations and changes in enzymatic activities and substrate concentrations. Changes in substrate concentrations explained 50% or more of the model deviance in 7 of the 13 metabolic steps analyzed. Adding information on enzymatic activities almost always improved the fit of GAMs built solely on substrate concentrations. GAMs were validated using experimental data that were not included when the model was built. In addition, a system of ordinary differential equations (ODEs) was developed to describe the instantaneous changes in metabolites as a function of the levels of enzymatic catalytic activities. The results demonstrated the ability of the models to predict changes in the flux of metabolites in the JH pathway, and they can be used in the future to design and validate experimental manipulations of JH synthesis. PMID:28158248

  8. [Integrated Management Area of Vascular Risk: A new organisational model for global control of risk factors].

    PubMed

    Armario, P; Jericó, C; Vila, L; Freixa, R; Martin-Castillejos, C; Rotllan, M

    2016-11-17

    Cardiovascular disease (CVD) is a major cause of morbidity and mortality and increases the cost of care. Despite a good therapeutic arsenal, control of the main cardiovascular risk factors currently remains poor. Improving this situation requires good coordination and multidisciplinary participation. The development of new organizational models such as the Integrated Management Area of Vascular Risk can facilitate therapeutic harmonization and unify the health messages offered by the different levels of care, based on clinical practice guidelines, in order to provide patient-centred integrated care.

  9. Simplified risk score models accurately predict the risk of major in-hospital complications following percutaneous coronary intervention.

    PubMed

    Resnic, F S; Ohno-Machado, L; Selwyn, A; Simon, D I; Popma, J J

    2001-07-01

    The objectives of this analysis were to develop and validate simplified risk score models for predicting the risk of major in-hospital complications after percutaneous coronary intervention (PCI) in the era of widespread stenting and use of glycoprotein IIb/IIIa antagonists. We then sought to compare the performance of these simplified models with those of full logistic regression and neural network models. From January 1, 1997 to December 31, 1999, data were collected on 4,264 consecutive interventional procedures at a single center. Risk score models were derived from multiple logistic regression models using the first 2,804 cases and then validated on the final 1,460 cases. The area under the receiver operating characteristic (ROC) curve for the risk score model that predicted death was 0.86 compared with 0.85 for the multiple logistic model and 0.83 for the neural network model (validation set). For the combined end points of death, myocardial infarction, or bypass surgery, the corresponding areas under the ROC curves were 0.74, 0.78, and 0.81, respectively. Previously identified risk factors were confirmed in this analysis. The use of stents was associated with a decreased risk of in-hospital complications. Thus, risk score models can accurately predict the risk of major in-hospital complications after PCI. Their discriminatory power is comparable to those of logistic models and neural network models. Accurate bedside risk stratification may be achieved with these simple models.
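
A simplified sketch of the abstract's idea: round logistic-regression coefficients into an integer bedside score, then measure discrimination with the ROC AUC via the concordance (rank-sum) identity. The coefficients and patients below are hypothetical, not the published PCI models.

```python
# Turning a fitted logistic model into a bedside integer risk score,
# then checking discrimination with the ROC AUC. All numbers invented.

coef = {"age>70": 0.9, "shock": 2.1, "renal_failure": 1.3, "stent": -0.6}

def score(patient):
    """Integer risk score: each log-odds coefficient rounded to points
    (1 point per 0.5 on the log-odds scale), summed over present factors."""
    return sum(round(b / 0.5) for k, b in coef.items() if patient.get(k))

def auc(scores, labels):
    """ROC AUC via the probability-of-concordance identity: the chance a
    random case outscores a random control (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

patients = [
    ({"age>70": 1, "shock": 1}, 1),          # died in hospital
    ({"age>70": 1, "renal_failure": 1}, 1),  # died in hospital
    ({"stent": 1}, 0),
    ({}, 0),
    ({"age>70": 1, "stent": 1}, 0),
]
scores = [score(p) for p, _ in patients]
labels = [y for _, y in patients]
print(scores, round(auc(scores, labels), 2))
```

Note the negative points for stenting, mirroring the abstract's finding that stent use was associated with decreased in-hospital risk.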

  10. Hospital-associated venous thromboembolism in pediatrics: a systematic review and meta-analysis of risk factors and risk-assessment models.

    PubMed

    Mahajerin, Arash; Branchford, Brian R; Amankwah, Ernest K; Raffini, Leslie; Chalmers, Elizabeth; van Ommen, C Heleen; Goldenberg, Neil A

    2015-08-01

    Hospital-associated venous thromboembolism, including deep vein thrombosis and pulmonary embolism, is increasing in pediatric centers. The objective of this work was to systematically review literature on pediatric hospital-acquired venous thromboembolism risk factors and risk-assessment models, to inform future prevention research. We conducted a literature search on pediatric venous thromboembolism risk via PubMed (1946-2014) and Embase (1980-2014). Data on risk factors and risk-assessment models were extracted from case-control studies, while prevalence data on clinical characteristics were obtained from registries, large (n>40) retrospective case series, and cohort studies. Meta-analyses were conducted for risk factors or clinical characteristics reported in at least three studies. Heterogeneity among studies was assessed with the Cochran Q test and quantified by the I² statistic. From 394 initial articles, 60 met the final inclusion criteria (20 case-control studies and 40 registries/large case series/cohort studies). Significant risk factors among case-control studies were: intensive care unit stay (OR: 2.14, 95% CI: 1.97-2.32); central venous catheter (OR: 2.12, 95% CI: 2.00-2.25); mechanical ventilation (OR: 1.56, 95% CI: 1.42-1.72); and length of stay in hospital (per each additional day, OR: 1.03, 95% CI: 1.03-1.03). Three studies developed/applied risk-assessment models from a combination of these risk factors. Fourteen significant clinical characteristics were identified through non-case-control studies. This meta-analysis confirms central venous catheter, intensive care unit stay, mechanical ventilation, and length of stay as risk factors. A few pediatric hospital-acquired venous thromboembolism risk scores have emerged employing these factors. Prospective validation is necessary to inform risk-stratified prevention trials.
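
Pooled odds ratios like those above come from standard inverse-variance meta-analysis. A minimal sketch with made-up study results, including Cochran's Q and the I² heterogeneity statistic:

```python
import math

# Fixed-effect inverse-variance pooling of odds ratios, with Cochran's Q
# and I^2. The three "studies" below are invented numbers, not data
# from the meta-analysis above.

studies = [  # (OR, lower 95% CI, upper 95% CI)
    (2.0, 1.5, 2.7),
    (2.3, 1.8, 2.9),
    (1.9, 1.2, 3.0),
]

logs, weights = [], []
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
    logs.append(math.log(or_))
    weights.append(1.0 / se ** 2)                    # inverse-variance weight

pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
q = sum(w * (l - pooled) ** 2 for w, l in zip(weights, logs))  # Cochran's Q
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0            # I^2 in percent

print(round(math.exp(pooled), 2), round(q, 2), round(i2, 1))
```

Pooling is done on the log-odds scale and exponentiated at the end; a random-effects model would additionally inflate the study variances by an estimate of between-study variance.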

  11. Hospital-associated venous thromboembolism in pediatrics: a systematic review and meta-analysis of risk factors and risk-assessment models

    PubMed Central

    Mahajerin, Arash; Branchford, Brian R.; Amankwah, Ernest K.; Raffini, Leslie; Chalmers, Elizabeth; van Ommen, C. Heleen; Goldenberg, Neil A.

    2015-01-01

    Hospital-associated venous thromboembolism, including deep vein thrombosis and pulmonary embolism, is increasing in pediatric centers. The objective of this work was to systematically review literature on pediatric hospital-acquired venous thromboembolism risk factors and risk-assessment models, to inform future prevention research. We conducted a literature search on pediatric venous thromboembolism risk via PubMed (1946–2014) and Embase (1980–2014). Data on risk factors and risk-assessment models were extracted from case-control studies, while prevalence data on clinical characteristics were obtained from registries, large (n>40) retrospective case series, and cohort studies. Meta-analyses were conducted for risk factors or clinical characteristics reported in at least three studies. Heterogeneity among studies was assessed with the Cochran Q test and quantified by the I² statistic. From 394 initial articles, 60 met the final inclusion criteria (20 case-control studies and 40 registries/large case series/cohort studies). Significant risk factors among case-control studies were: intensive care unit stay (OR: 2.14, 95% CI: 1.97–2.32); central venous catheter (OR: 2.12, 95% CI: 2.00–2.25); mechanical ventilation (OR: 1.56, 95% CI: 1.42–1.72); and length of stay in hospital (per each additional day, OR: 1.03, 95% CI: 1.03–1.03). Three studies developed/applied risk-assessment models from a combination of these risk factors. Fourteen significant clinical characteristics were identified through non-case-control studies. This meta-analysis confirms central venous catheter, intensive care unit stay, mechanical ventilation, and length of stay as risk factors. A few pediatric hospital-acquired venous thromboembolism risk scores have emerged employing these factors. Prospective validation is necessary to inform risk-stratified prevention trials. PMID:26001789

  12. Specific BACE1 genotypes provide additional risk for late-onset Alzheimer disease in APOE epsilon 4 carriers.

    PubMed

    Gold, Gabriel; Blouin, Jean-Louis; Herrmann, François R; Michon, Agnès; Mulligan, Reinhild; Duriaux Saïl, Geneviève; Bouras, Constantin; Giannakopoulos, Panteleimon; Antonarakis, Stylianos E

    2003-05-15

    Alzheimer disease (AD) is characterized neuropathologically by neurofibrillary tangles and senile plaques. A key component of plaques is A beta, a polypeptide derived from the A beta-precursor protein (APP) through proteolytic cleavage catalyzed by beta- and gamma-secretase. We hypothesized that sequence variation in the genes BACE1 (on chromosome 11q23.3) and BACE2 (on chromosome 21q22.3), which encode two closely related proteases that seem to act as the APP beta-secretase, may represent a genetic risk factor for AD. We analyzed the frequencies of single nucleotide polymorphisms (SNPs) in the BACE1 and BACE2 genes in a community-based sample of 96 individuals with late-onset AD and 170 controls selected randomly among residents of the same community. The genotype data in both study groups did not demonstrate any association between AD and BACE1 or BACE2. After stratification for APOE status, however, an association between a BACE1 polymorphism located within codon V262 and AD in APOE epsilon 4 carriers was observed (P = 0.03). We conclude that sequence variation in the BACE1 or BACE2 gene is not a significant risk factor for AD; however, a combination of a specific BACE1 allele and APOE epsilon 4 may increase the risk for Alzheimer disease over and above that attributed to APOE epsilon 4 alone.

  13. Equine seroprevalence rates as an additional indicator for a more accurate risk assessment of the West Nile virus transmission.

    PubMed

    Vignjević, Goran; Vrućina, Ivana; Šestak, Ivana; Turić, Nataša; Bogojević, Mirta Sudarić; Merdić, Enrih

    2013-09-01

    The West Nile virus (WNV) is a zoonotic arbovirus that has recently caused outbreaks in many countries in southern and Central Europe. In 2012, it caused an outbreak in eastern Croatia for the first time, with a total of 7 human clinical cases. To assist public health personnel in improving survey protocols and vector control, high-risk areas of WNV transmission were estimated and mapped. The study area included the cities of Osijek and Slavonski Brod and 8 municipalities in Vukovarsko-Srijemska County. The risk estimate was based on the seroprevalence of WNV infections in horses as an indicator of the virus presence, as well as on the presence of possible WNV mosquito vectors with corresponding vector competences. Four mosquito species considered possible WNV vectors were included in this study: Aedes vexans, Culex modestus, Culex pipiens and Ochlerotatus caspius. Mosquitoes were sampled using dry-ice-baited CDC traps, twice a month, between May and October. This study suggests that two mosquito species present the main risk of WNV transmission in eastern Croatia: Culex pipiens, because of its good vector competence, and Aedes vexans, because of its very high abundance. These two species should therefore be the focus of future mosquito surveillance and vector control management.

  14. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times (mutually dependent in successive steps of the chain) cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
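
A toy discrete-event view of the point made above: with a FIFO shelf, scheduled batch deliveries, and random demand, realized storage times emerge from the logistics rather than being drawn from an assumed distribution, and those times drive microbial growth. All rates below are illustrative, not parameters from the lettuce study.

```python
import random
random.seed(1)

# Minimal discrete-event sketch of a FIFO retail shelf: batches arrive on
# a fixed schedule, daily demand is random, and each sold unit's realized
# storage time feeds a simple exponential-growth model for L. monocytogenes.

DAYS = 365
BATCH = 40          # units per delivery
DELIVERY_EVERY = 2  # days between deliveries
GROWTH = 0.35       # assumed ln-CFU increase per day of storage

shelf = []          # FIFO queue of arrival days
storage_times = []

for day in range(DAYS):
    if day % DELIVERY_EVERY == 0:
        shelf.extend([day] * BATCH)          # delivery event
    demand = random.randint(10, 30)          # demand event
    for _ in range(min(demand, len(shelf))):
        arrived = shelf.pop(0)               # oldest unit sold first
        storage_times.append(day - arrived)

growth = [GROWTH * t for t in storage_times]   # per-unit ln-CFU increase
storage_times.sort()
p95 = storage_times[int(0.95 * len(storage_times))]
print(len(storage_times), p95, round(max(growth), 2))
```

The interesting output is the upper tail (here the 95th percentile of storage time): interventions such as shorter delivery intervals change that tail in ways a fixed storage-time distribution cannot capture.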

  15. Spin-probe ESR and molecular modeling studies on calcium carbonate dispersions in overbased detergent additives.

    PubMed

    Montanari, Luciano; Frigerio, Francesco

    2010-08-15

    Oil-soluble calcium carbonate colloids are used as detergent additives in lubricating oils. They are colloidal dispersions of calcium carbonate particles stabilized by different surfactants; in this study alkyl-aryl-sulfonates and sulfurized alkyl-phenates, widely used in the synthesis of these additives, are considered. The physical properties of surfactant layers surrounding the surfaces of calcium carbonate particles were analyzed by using some nitroxide spin-probes (stable free radicals) and observing the corresponding ESR spectra. The spin-probe molecules contain polar groups which tend to tether them to the carbonate particle polar surface. They can reach these surfaces only if the surfactant layers are not very compact, hence the relative amounts of spin-probe molecules accessing carbonate surfaces are an index of the compactness of the surfactant core. ESR signals of spin-probe molecules dissolved in oil or "locked" near the carbonate surfaces are different because of the different molecular mobility. Through deconvolution of the ESR spectra, the fraction of spin-probes penetrating surfactant shells has been calculated, and differences were observed according to the surfactant molecular structures. Moreover, by using specially labeled spin-probes based on stearic acids, functionalized at different separations from the carboxylic acid group, it was possible to interrogate the molecular physical behavior of surfactant shells at different distances from carbonate surfaces. Molecular modeling was applied to generate some three-dimensional micellar models of the colloidal stabilization of the stabilized carbonate particles with different molecular structures of the surfactant. The diffusion of spin-probe molecules into the surfactant shells was studied by applying a starting force to push the molecules towards the carbonate surfaces and then observing the ensuing behavior. The simulations are in accordance with the ESR data and show that the geometrical

  16. Modeling external carbon addition in biological nutrient removal processes with an extension of the international water association activated sludge model.

    PubMed

    Swinarski, M; Makinia, J; Stensel, H D; Czerwionka, K; Drewnowski, J

    2012-08-01

    The aim of this study was to expand the International Water Association Activated Sludge Model No. 2d (ASM2d) to account for a newly defined readily biodegradable substrate that can be consumed by polyphosphate-accumulating organisms (PAOs) under anoxic and aerobic conditions, but not under anaerobic conditions. The model change was to add a new substrate component and process terms for its use by PAOs and other heterotrophic bacteria under anoxic and aerobic conditions. The Gdansk (Poland) wastewater treatment plant (WWTP), which has a modified University of Cape Town (MUCT) process for nutrient removal, provided field data and mixed liquor for batch tests for model evaluation. The original ASM2d was first calibrated under dynamic conditions with the results of batch tests with settled wastewater and mixed liquor, in which nitrate-uptake rates, phosphorus-release rates, and anoxic phosphorus uptake rates were followed. Model validation was conducted with data from a 96-hour measurement campaign in the full-scale WWTP. The results of similar batch tests with ethanol and fusel oil as the external carbon sources were used to adjust kinetic and stoichiometric coefficients in the expanded ASM2d. Both models were compared based on their predictions of the effect of adding supplemental carbon to the anoxic zone of an MUCT process. In comparison with the ASM2d, the new model better predicted the anoxic behaviors of carbonaceous oxygen demand, nitrate-nitrogen (NO3-N), and phosphorous (PO4-P) in batch experiments with ethanol and fusel oil. However, when simulating ethanol addition to the anoxic zone of a full-scale biological nutrient removal facility, both models predicted similar effluent NO3-N concentrations (6.6 to 6.9 g N/m3). For the particular application, effective enhanced biological phosphorus removal was predicted by both models with external carbon addition but, for the new model, the effluent PO4-P concentration was approximately one-half of that found from

  17. An integrated model-based approach to the risk assessment of pesticide drift from vineyards

    NASA Astrophysics Data System (ADS)

    Pivato, Alberto; Barausse, Alberto; Zecchinato, Francesco; Palmeri, Luca; Raga, Roberto; Lavagnolo, Maria Cristina; Cossu, Raffaello

    2015-06-01

    The inhalation of pesticides in air is of particular concern for people living in close contact with intensive agricultural activities. This study aims to develop an integrated modelling methodology to assess whether pesticides pose a risk to the health of people living near vineyards, and to apply this methodology in the world-renowned Prosecco DOCG (Italian label for protection of origin and geographical indication of wines) region. A sample field in Bigolino di Valdobbiadene (North-Eastern Italy) was selected to perform the pesticide fate modelling and the consequent inhalation risk assessment for people living in the area. The model accounts for the direct pesticide loss during the treatment of vineyards and for the volatilization from soil after the end of the treatment. A fugacity model was used to assess the volatilization flux from soil. The Gaussian puff air dispersion model CALPUFF was employed to assess the airborne concentration of the emitted pesticide over the simulation domain. The subsequent risk assessment integrates the HArmonised environmental Indicators for pesticide Risk (HAIR) and US-EPA guidelines. In this case study the modelled situation turned out to be safe from the point of view of human health in the case of non-carcinogenic compounds, and additional improvements were suggested to further mitigate the effect of the most critical compound.
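
As a crude stand-in for the dispersion step (the study itself uses the puff model CALPUFF), a steady-state Gaussian plume formula illustrates how downwind airborne concentration is computed from source strength, wind, and dispersion coefficients. The power-law sigma fits and all parameter values below are assumed placeholders for a neutral atmosphere, not values from the study.

```python
import math

# Steady-state Gaussian plume with ground reflection. The sigma_y/sigma_z
# curves are crude power-law fits (assumed); a puff model like CALPUFF
# handles time-varying winds and terrain that this formula cannot.

def concentration(q, u, x, y, z, h):
    """Concentration (g/m^3) at downwind distance x (m), crosswind offset
    y (m), height z (m), for a continuous source of strength q (g/s),
    wind speed u (m/s), and effective release height h (m)."""
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)   # assumed stability fit
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2)) +
                math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))  # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Breathing-height concentration at houses 200 m and 800 m directly
# downwind of a hypothetical spraying plot:
c_near = concentration(q=0.05, u=2.0, x=200.0, y=0.0, z=1.5, h=2.0)
c_far = concentration(q=0.05, u=2.0, x=800.0, y=0.0, z=1.5, h=2.0)
print(c_near > c_far)   # concentration decays with downwind distance
```

The resulting concentration would then feed an inhalation-dose and risk calculation of the kind the abstract builds from HAIR and US-EPA guidance.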

  18. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    PubMed

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the EFSA Guidance pessimistic model may not always give an appropriate modelling of exposure.
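
The probabilistic cumulative assessment can be caricatured in a few lines of Monte Carlo: for each simulated person-day, sum residue intake over foods, weight each compound by a relative potency factor, normalize by body weight, and read off a high percentile of the exposure distribution. Every input below is invented for illustration, not an MCRA default.

```python
import random
random.seed(7)

# Monte Carlo sketch of a cumulative acute dietary exposure assessment.
# Foods, residues, RPFs, body-weight distribution, and the reference
# dose are all hypothetical placeholders.

FOODS = {           # food: (mean consumption g/day, residue mg/kg, RPF)
    "apple":   (120.0, 0.05, 1.0),
    "lettuce": (40.0,  0.20, 0.3),
    "grape":   (80.0,  0.10, 2.0),
}

def person_day():
    """One simulated person-day of exposure, in mg index compound per kg
    body weight."""
    bw = max(random.gauss(70.0, 12.0), 30.0)          # body weight, kg
    dose = 0.0
    for mean_c, residue, rpf in FOODS.values():
        eaten = max(0.0, random.gauss(mean_c, mean_c * 0.5))  # g/day
        dose += eaten / 1000.0 * residue * rpf        # mg, potency-weighted
    return dose / bw

exposures = sorted(person_day() for _ in range(20000))
p999 = exposures[int(0.999 * len(exposures))]         # 99.9th percentile
arfd = 0.01   # acute reference dose, mg/kg bw/day (assumed)
print(round(p999, 5), p999 < arfd)
```

Comparing a high percentile (rather than the mean) against the reference dose is what distinguishes the probabilistic approach from the worst-case point estimates the abstract criticizes.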

  19. Cumulative Incidence Association Models for Bivariate Competing Risks Data.

    PubMed

    Cheng, Yu; Fine, Jason P

    2012-03-01

    Association models, like frailty and copula models, are frequently used to analyze clustered survival data and evaluate within-cluster associations. The assumption of noninformative censoring is commonly applied to these models, though it may not be true in many situations. In this paper, we consider bivariate competing risk data and focus on association models specified for the bivariate cumulative incidence function (CIF), a nonparametrically identifiable quantity. Copula models are proposed which relate the bivariate CIF to its corresponding univariate CIFs, similarly to independently right censored data, and accommodate frailty models for the bivariate CIF. Two estimating equations are developed to estimate the association parameter, permitting the univariate CIFs to be estimated either parametrically or nonparametrically. Goodness-of-fit tests are presented for formally evaluating the parametric models. Both estimators perform well with moderate sample sizes in simulation studies. The practical use of the methodology is illustrated in an analysis of dementia associations.
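
For a concrete instance of the copula idea, the Clayton family admits a closed-form conditional-inversion sampler and a simple association check through Kendall's tau = theta / (theta + 2). This is an illustrative sketch of the copula mechanics only, not the estimating-equation method of the paper.

```python
import random
random.seed(3)

# Clayton copula C(u1, u2) = (u1^-theta + u2^-theta - 1)^(-1/theta):
# sample pairs by conditional inversion and verify the known relation
# Kendall's tau = theta / (theta + 2) empirically.

THETA = 2.0   # association parameter; tau should be 2 / (2 + 2) = 0.5

def sample_clayton(theta):
    """Draw (u1, u2) from the Clayton copula via conditional inversion."""
    u1, v = random.random(), random.random()
    u2 = (u1 ** -theta * (v ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    return u1, u2

pairs = [sample_clayton(THETA) for _ in range(1500)]

def kendall_tau(pairs):
    """Empirical Kendall's tau: concordant minus discordant pair fraction."""
    conc = disc = 0
    for i in range(len(pairs)):
        for j in range(i + 1, len(pairs)):
            s = (pairs[i][0] - pairs[j][0]) * (pairs[i][1] - pairs[j][1])
            if s > 0:
                conc += 1
            elif s < 0:
                disc += 1
    n = len(pairs)
    return (conc - disc) / (n * (n - 1) / 2)

tau = kendall_tau(pairs)
print(round(tau, 2))   # near 0.5
```

In the competing-risks setting of the paper, the uniform margins would be replaced by the univariate cumulative incidence functions rather than ordinary survival margins.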

  20. Advanced uncertainty modelling for container port risk analysis.

    PubMed

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of a CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in a CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis methods, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance.

  1. Additive surface complexation modeling of uranium(VI) adsorption onto quartz-sand dominated sediments.

    PubMed

    Dong, Wenming; Wan, Jiamin

    2014-06-17

    Many aquifers contaminated by U(VI)-containing acidic plumes are composed predominantly of quartz-sand sediments. The F-Area of the Savannah River Site (SRS) in South Carolina (USA) is an example. To predict U(VI) mobility and natural attenuation, we conducted U(VI) adsorption experiments using the F-Area plume sediments and reference quartz, goethite, and kaolinite. The sediments are composed of ∼96% quartz-sand and 3-4% fine fractions of kaolinite and goethite. We developed a new humic acid adsorption method for determining the relative surface area abundances of goethite and kaolinite in the fine fractions. This method is expected to be applicable to many other binary mineral pairs, and allows successful application of the component additivity (CA) approach based surface complexation modeling (SCM) at the SRS F-Area and other similar aquifers. Our experimental results indicate that quartz has stronger U(VI) adsorption ability per unit surface area than goethite and kaolinite at pH ≤ 4.0. Our modeling results indicate that the binary (goethite/kaolinite) CA-SCM under-predicts U(VI) adsorption to the quartz-sand dominated sediments at pH ≤ 4.0. The new ternary (quartz/goethite/kaolinite) CA-SCM provides excellent predictions. The contributions of quartz-sand, kaolinite, and goethite to U(VI) adsorption and the potential influences of dissolved Al, Si, and Fe are also discussed.

  2. Modeling and additive manufacturing of bio-inspired composites with tunable fracture mechanical properties.

    PubMed

    Dimas, Leon S; Buehler, Markus J

    2014-07-07

    Flaws, imperfections and cracks are ubiquitous in material systems and are commonly the catalysts of catastrophic material failure. As stresses and strains tend to concentrate around cracks and imperfections, structures tend to fail well before large regions of material have been subjected to significant loading. Therefore, a major challenge in material design is to engineer systems that perform on par with pristine structures despite the presence of imperfections. In this work we integrate knowledge of biological systems with computational modeling and state-of-the-art additive manufacturing to synthesize advanced composites with tunable fracture mechanical properties. Supported by extensive mesoscale computer simulations, we demonstrate the design and manufacturing of composites that exhibit deformation mechanisms characteristic of pristine systems, featuring flaw-tolerant properties. We analyze the results by directly comparing strain fields for the synthesized composites, obtained through digital image correlation (DIC), and the computationally tested composites. Moreover, we plot Ashby diagrams for the range of simulated and experimental composites. Our findings show good agreement between simulation and experiment, confirming that the proposed mechanisms have a significant potential for vastly improving the fracture response of composite materials. We elucidate the role of stiffness ratio variations of composite constituents as an important feature in determining the composite properties. Moreover, our work validates the predictive ability of our models, presenting them as useful tools for guiding further material design. This work enables the tailored design and manufacturing of composites assembled from inferior building blocks that obtain optimal combinations of stiffness and toughness.

  3. Generalized additive models reveal the intrinsic complexity of wood formation dynamics.

    PubMed

    Cuny, Henri E; Rathgeber, Cyrille B K; Kiessé, Tristan Senga; Hartmann, Felix P; Barbeito, Ignacio; Fournier, Meriem

    2013-04-01

    The intra-annual dynamics of wood formation, which involves the passage of newly produced cells through three successive differentiation phases (division, enlargement, and wall thickening) to reach the final functional mature state, has traditionally been described in conifers as three delayed bell-shaped curves followed by an S-shaped curve. Here the classical view represented by the 'Gompertz function (GF) approach' was challenged using two novel approaches based on parametric generalized linear models (GLMs) and 'data-driven' generalized additive models (GAMs). These three approaches (GFs, GLMs, and GAMs) were used to describe seasonal changes in cell numbers in each of the xylem differentiation phases and to calculate the timing of cell development in three conifer species [Picea abies (L.), Pinus sylvestris L., and Abies alba Mill.]. GAMs outperformed GFs and GLMs in describing intra-annual wood formation dynamics, showing two left-skewed bell-shaped curves for division and enlargement, and a right-skewed bimodal curve for thickening. Cell residence times progressively decreased through the season for enlargement, whilst increasing late but rapidly for thickening. These patterns match changes in cell anatomical features within a tree ring, which allows the separation of earlywood and latewood into two distinct cell populations. A novel statistical approach is presented which renews our understanding of xylogenesis, a dynamic biological process in which the rate of cell production interplays with cell residence times in each developmental phase to create complex seasonal patterns.
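    For contrast with the data-driven GAMs, the classical 'GF approach' fits cumulative cell counts with an S-shaped Gompertz curve. A minimal sketch of that baseline (parameter values are illustrative, not fitted to the paper's data):

```python
import math

def gompertz(t, A, beta, kappa):
    """Gompertz function A * exp(-exp(beta - kappa * t)):
    S-shaped cumulative cell count over the growing season (t in days)."""
    return A * math.exp(-math.exp(beta - kappa * t))

# Final ring cell number A, onset parameter beta, rate kappa (illustrative).
A, beta, kappa = 60.0, 5.0, 0.04
counts = [gompertz(day, A, beta, kappa) for day in range(0, 301, 50)]
```

The curve rises monotonically toward the asymptote A; the GAM approach replaces this fixed parametric shape with smooth functions estimated from the data.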

  4. Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand

    NASA Astrophysics Data System (ADS)

    Kaiser, G.; Kortenhaus, A.

    2009-04-01

    The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction or damage of life-supporting infrastructure, such as buildings, roads, and water and power supply, caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood-prone areas at the Andaman Sea coast in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as vulnerability analysis of the socio-economic and the ecological system, in order to determine the scenario-based, specific risk for the region. In this presentation results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have modelled tsunami wave generation and propagation in the Indian Ocean, detailed inundation patterns, i.e. water depth and flow dynamics, remain poorly determined. However, for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation strongly depends on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches as these data are free and almost world

  5. Water- and wastewater-related disease and infection risks: what is an appropriate value for the maximum tolerable additional burden of disease?

    PubMed

    Mara, Duncan

    2011-06-01

    The maximum additional burden of water- and wastewater-related disease of 10^-6 disability-adjusted life year (DALY) loss per person per year (pppy), used in the WHO Drinking-water Quality Guidelines and the WHO Guidelines for Wastewater Use in Agriculture, is based on the US EPA's acceptance of a 70-year lifetime waterborne cancer risk of 10^-5 per person, equivalent to an annual risk of 1.4x10^-7 per person, which is four orders of magnitude lower than the actual all-cancer incidence in the USA in 2009 of 1.8x10^-3 pppy. A maximum additional burden of 10^-4 DALY loss pppy would reduce this risk to a more cost-effective, but still low, risk of 1.4x10^-5 pppy. It would increase the DALY loss pppy in low- and middle-income countries due to diarrhoeal diseases from the current level of 0.0119 pppy to 0.0120 pppy, and that due to ascariasis from 0.0026 pppy to 0.0027 pppy, but neither increase is of public-health significance. It is therefore recommended that the maximum additional burden of disease from these activities be increased to a DALY loss of 10^-4 pppy, as this provides an adequate margin of public-health safety in relation to waterborne-cancer deaths, diarrhoeal disease and ascariasis in all countries.
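    The arithmetic behind these figures is a direct unit conversion; a short sketch using the numbers quoted in the abstract:

```python
# Convert a 70-year lifetime risk to an annual per-person risk, then scale
# the tolerable burden from a 1e-6 to a 1e-4 DALY loss pppy ceiling.
lifetime_risk = 1e-5                         # cancer risk over a 70-year lifetime
annual_risk = lifetime_risk / 70.0           # ~1.4e-7 per person per year

scale = 1e-4 / 1e-6                          # proposed vs current DALY ceiling
proposed_annual_risk = annual_risk * scale   # ~1.4e-5 pppy
```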

  6. Challenges of Modeling Flood Risk at Large Scales

    NASA Astrophysics Data System (ADS)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale such that all inter-dependencies in a river network are well understood. From an insurance perspective, the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at the scale suitable for a viable insurance market for the flood peril. In order to serve the insurance market, a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing
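    An event-simulation model of this kind ultimately reduces to an annual probability distribution of aggregate portfolio losses. A toy sketch of that output (Poisson event counts and lognormal per-event losses are illustrative placeholder assumptions, not the model's actual physically based components):

```python
import random

def simulate_annual_losses(n_years, event_rate, mu, sigma, seed=0):
    """Simulate total flood loss per simulated year: Poisson number of
    flood events, lognormal loss per event (both illustrative)."""
    rng = random.Random(seed)
    years = []
    for _ in range(n_years):
        # Poisson draw via exponential inter-arrival times within one year.
        n_events, t = 0, rng.expovariate(event_rate)
        while t < 1.0:
            n_events += 1
            t += rng.expovariate(event_rate)
        years.append(sum(rng.lognormvariate(mu, sigma) for _ in range(n_events)))
    return years

losses = simulate_annual_losses(10_000, event_rate=0.8, mu=0.0, sigma=1.0)
aal = sum(losses) / len(losses)   # average annual loss for the portfolio
```

Exceedance-probability curves and tail metrics used for pricing and reinsurance follow directly from sorting the simulated annual losses.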

  7. Additive Effects of the Risk Alleles of PNPLA3 and TM6SF2 on Non-alcoholic Fatty Liver Disease (NAFLD) in a Chinese Population

    PubMed Central

    Wang, Xiaoliang; Liu, Zhipeng; Wang, Kai; Wang, Zhaowen; Sun, Xing; Zhong, Lin; Deng, Guilong; Song, Guohe; Sun, Baining; Peng, Zhihai; Liu, Wanqing

    2016-01-01

    Recent genome-wide association studies have identified that variants in or near PNPLA3, NCAN, GCKR, LYPLAL1, and TM6SF2 are significantly associated with non-alcoholic fatty liver disease (NAFLD) in multiple ethnic groups. Studies on their impact on NAFLD in Han Chinese are still limited. In this study, we examined the relevance of these variants to NAFLD in a community-based Han Chinese population and further explored their potential joint effect on NAFLD. Six single nucleotide polymorphisms (SNPs) (PNPLA3 rs738409, rs2294918, NCAN rs2228603, GCKR rs780094, LYPLAL1 rs12137855, and TM6SF2 rs58542926), previously identified in genome-wide analyses to be associated with NAFLD, were genotyped in 384 NAFLD patients and 384 age- and gender-matched healthy controls. We found that two of the six polymorphisms, PNPLA3 rs738409 (OR = 1.52, 95%CI: 1.19–1.96; P = 0.00087) and TM6SF2 rs58542926 (OR = 2.11, 95%CI: 1.34–3.39; P = 0.0016), are independently associated with NAFLD after adjustment for the effects of age, gender, and BMI. Our analysis further demonstrated the strong additive effects of the risk alleles of PNPLA3 and TM6SF2, with an overall significant association between the number of risk alleles and NAFLD (OR = 1.64, 95%CI: 1.34–2.01; P = 1.4 × 10^-6). The OR for NAFLD increased in an additive manner, with an average increase in OR of 1.52 per additional risk allele. Our results confirmed that the PNPLA3 and TM6SF2 variants are the most significant risk alleles for NAFLD in the Chinese population. Therefore, genotyping these two genetic risk factors may help identify individuals at the highest risk of NAFLD. PMID:27532011
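    Odds ratios like those reported can be computed from a 2x2 table of allele-by-outcome counts. A minimal sketch with made-up counts (hypothetical, not the study's data):

```python
import math

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: a/b = risk allele/other allele counts in cases,
    c/d = the same counts in controls; Wald 95% CI on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical allele counts for one SNP in cases and controls.
or_, lo, hi = odds_ratio(300, 468, 220, 548)
```

The study's adjusted ORs come from logistic regression with covariates; the crude 2x2 estimate above is the unadjusted analogue.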

  8. Modeling of Radiation Risks for Human Space Missions

    NASA Technical Reports Server (NTRS)

    Fletcher, Graham

    2004-01-01

    Prior to any human space flight, calculations of radiation risks are used to determine the acceptable scope of astronaut activity. Using the supercomputing facilities at NASA Ames Research Center, Ames researchers have determined the damage probabilities of DNA functional groups by space radiation. The data supersede those used in the current Monte Carlo model for risk assessment. One example is the reaction of DNA with the hydroxyl radical produced by the interaction of highly energetic particles from space radiation with water molecules in the human body. This reaction is considered an important cause of DNA mutations, although its mechanism is not well understood.

  9. Multinomial additive hazard model to assess the disability burden using cross-sectional data.

    PubMed

    Yokota, Renata T C; Van Oyen, Herman; Looman, Caspar W N; Nusselder, Wilma J; Otava, Martin; Kifle, Yimer Wasihun; Molenberghs, Geert

    2017-03-23

    Population aging is accompanied by the burden of chronic diseases and disability. Chronic diseases are among the main causes of disability, which is associated with poor quality of life and high health care costs in the elderly. The identification of which chronic diseases contribute most to the disability prevalence is important to reduce the burden. Although longitudinal studies can be considered the gold standard for assessing the causes of disability, they are costly and often have restricted sample sizes. Thus, the use of cross-sectional data under certain assumptions has become a popular alternative. Among the existing methods based on cross-sectional data, the attribution method, which was originally developed for binary disability outcomes, is an attractive option, as it enables the partition of disability into the additive contributions of chronic diseases, taking into account multimorbidity and the fact that disability can be present even in the absence of disease. In this paper, we propose an extension of the attribution method to multinomial responses, since disability is measured as a multicategory variable in most surveys, representing different severity levels. The R function constrOptim is used to maximize the multinomial log-likelihood function subject to a linear inequality constraint. Our simulation study indicates overall good performance of the model, without convergence problems. However, the model must be used with care for populations with low marginal disability probabilities and a high sum of conditional probabilities, especially with small sample sizes. For illustration, we apply the model to data from the Belgian Health Interview Surveys.
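    In the binary version of the attribution method, the disability probability decomposes additively into a background rate plus disease-specific contributions, fitted by constrained maximum likelihood (the paper extends this to multinomial responses via R's constrOptim). A pure-Python sketch for one disease, using a coarse grid search in place of constrOptim (all counts illustrative):

```python
import math

def log_lik(a, b):
    """Binomial log-likelihood for the additive model p = a + b * disease,
    with illustrative counts: 20/100 disabled without the disease,
    60/100 disabled with it."""
    return (20 * math.log(a) + 80 * math.log(1 - a)
            + 60 * math.log(a + b) + 40 * math.log(1 - a - b))

# Coarse grid search subject to 0 < a and 0 < a + b < 1, the same
# linear inequality constraint constrOptim would enforce.
grid = [i / 200.0 for i in range(1, 200)]
a_hat, b_hat = max(
    ((a, b) for a in grid for b in grid if a + b < 1.0),
    key=lambda ab: log_lik(*ab),
)
# a_hat is the background disability rate, b_hat the disease's additive contribution.
```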

  10. Models for the risk of secondary cancers from radiation therapy.

    PubMed

    Dasu, Alexandru; Toma-Dasu, Iuliana

    2017-02-24

    The interest in the induction of secondary tumours following radiotherapy has greatly increased as developments in detecting and treating the primary tumours have improved the life expectancy of cancer patients. However, most of the knowledge on the current levels of risk comes from patients treated many decades ago. As developments of irradiation techniques take place at a much faster pace than the progression of the carcinogenesis process, the earlier results could not be easily extrapolated to modern treatments. Indeed, the patterns of irradiation from historically-used orthovoltage radiotherapy and from contemporary techniques like conformal radiotherapy with megavoltage radiation, intensity modulated radiation therapy with photons or with particles are quite different. Furthermore, the increased interest in individualised treatment options raises the question of evaluating and ranking the different treatment plan options from the point of view of the risk for cancer induction, in parallel with the quantification of other long-term effects. It is therefore inevitable that models for risk assessment will have to be used to complement the knowledge from epidemiological studies and to make predictions for newer forms of treatment for which clinical evidence is not yet available. This work reviews the mathematical models that could be used to predict the risk of secondary cancers from radiotherapy-relevant dose levels, as well as the approaches and factors that have to be taken into account when including these models in the clinical evaluation process. These include the effects of heterogeneous irradiation, secondary particles production, imaging techniques, interpatient variability and other confounding factors.
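    Many of the models reviewed in this area combine a term for radiation-induced mutation with an exponential term for cell kill, producing a bell-shaped risk-versus-dose curve. A sketch of one such generic linear-exponential form (the functional form and parameter values are illustrative, not a specific model from this review):

```python
import math

def secondary_cancer_risk(dose, alpha=0.1, beta=0.5):
    """Bell-shaped dose-response: linear induction (alpha * D) competing
    with exponential cell kill exp(-beta * D). The curve peaks at D = 1/beta."""
    return alpha * dose * math.exp(-beta * dose)

doses = [0.5 * i for i in range(0, 41)]          # 0-20 Gy grid, illustrative
risks = [secondary_cancer_risk(d) for d in doses]
peak_dose = doses[risks.index(max(risks))]       # near 1/beta = 2 Gy here
```

Competition between induction and cell kill is why risk estimates depend so strongly on the dose heterogeneity of a given treatment technique.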

  11. Reducing uncertainty in risk modeling for methylmercury exposure

    SciTech Connect

    Ponce, R.; Egeland, G.; Middaugh, J.; Lee, R.

    1995-12-31

    The biomagnification and bioaccumulation of methylmercury in marine species represents a challenge for risk assessment related to the consumption of subsistence foods in Alaska. Because of the profound impact that food consumption advisories have on indigenous peoples seeking to preserve a way of life, there is a need to reduce uncertainty in risk assessment. Thus, research was initiated to reduce the uncertainty in assessing the health risks associated with the consumption of subsistence foods. Because marine subsistence foods typically contain elevated levels of methylmercury, preliminary research efforts have focused on methylmercury as the principal chemical of concern. Of particular interest are the antagonistic effects of selenium on methylmercury toxicity. Because of this antagonism, methylmercury exposure through the consumption of marine mammal meat (with high selenium) may not be as toxic as comparable exposures through other sources of dietary intake, such as in the contaminated bread episode of Iraq (containing relatively low selenium). This hypothesis is supported by animal experiments showing reduced toxicity of methylmercury associated with marine mammal meat, by the antagonistic influence of selenium on methylmercury toxicity, and by negative clinical findings in adult populations exposed to methylmercury through a marine diet not subject to industrial contamination. Exploratory model development is underway to identify potential improvements and applications of current deterministic and probabilistic models, particularly by incorporating selenium as an antagonist in risk modeling methods.

  12. A Comparative Kirkwood-Buff Study of Aqueous Methanol Solutions Modeled by the CHARMM Additive and Drude Polarizable Force Fields

    PubMed Central

    Lin, Bin; He, Xibing; MacKerell, Alexander D.

    2013-01-01

    A comparative study on aqueous methanol solutions modeled by the CHARMM additive and Drude polarizable force fields was carried out by employing Kirkwood-Buff analysis. It was shown that both models reproduced the experimental Kirkwood-Buff integrals and excess coordination numbers adequately well over the entire concentration range. The Drude model showed significant improvement over the additive model in solution densities, partial molar volumes, excess molar volumes, concentration-dependent diffusion constants, and dielectric constants. However, the additive model performed somewhat better than the Drude model in reproducing the activity derivative, excess molar Gibbs energy and excess molar enthalpy of mixing. This is due to the additive achieving a better balance among solute-solute, solute-solvent, and solvent-solvent interactions, indicating the potential for improvements in the Drude polarizable alcohol model. PMID:23947568
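    The Kirkwood-Buff integral referenced here is G_ij = 4π ∫ (g_ij(r) − 1) r² dr over the radial distribution function. A numerical sketch with a synthetic RDF (trapezoidal rule; for the test function g(r) = 1 + exp(−r) the analytic value is 8π):

```python
import math

def kb_integral(r, g):
    """Kirkwood-Buff integral G = 4*pi * ∫ (g(r) - 1) r^2 dr, trapezoid rule."""
    total = 0.0
    for i in range(len(r) - 1):
        f0 = (g[i] - 1.0) * r[i] ** 2
        f1 = (g[i + 1] - 1.0) * r[i + 1] ** 2
        total += 0.5 * (f0 + f1) * (r[i + 1] - r[i])
    return 4.0 * math.pi * total

# Synthetic RDF g(r) = 1 + exp(-r): since ∫ exp(-r) r^2 dr = 2, G = 8*pi.
r = [0.01 * i for i in range(0, 4001)]   # 0 to 40 (arbitrary length units)
g = [1.0 + math.exp(-x) for x in r]
G = kb_integral(r, g)
```

In practice the RDFs come from the additive or Drude simulations, and finite-size truncation of the integral is a well-known source of error.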

  13. Perchloroethylene-contaminated drinking water and the risk of breast cancer: additional results from Cape Cod, Massachusetts, USA.

    PubMed Central

    Aschengrau, Ann; Rogers, Sarah; Ozonoff, David

    2003-01-01

    In 1998 we published the results of a study suggesting an association between breast cancer and perchloroethylene (PCE; also called tetrachloroethylene) exposure from public drinking water. The present case-control study was undertaken to evaluate this association further. The cases were composed of female residents of eight towns in the Cape Cod region of Massachusetts who had been diagnosed with breast cancer from 1987 through 1993 (n = 672). Controls were composed of demographically similar women from the same towns (n = 616). Women were exposed to PCE when it leached from the vinyl lining of water distribution pipes from the late 1960s through the early 1980s. A relative delivered dose of PCE that entered a home was estimated using an algorithm that took into account residential history, water flow, and pipe characteristics. Small to moderate elevations in risk were seen among women whose exposure levels were above the 75th and 90th percentiles when 0-15 years of latency were considered (adjusted odds ratios, 1.5-1.9 for > 75th percentile, 1.3-2.8 for > 90th percentile). When data from the present and prior studies were combined, small to moderate increases in risk were also seen among women whose exposure levels were above the 75th and 90th percentiles when 0-15 years of latency were considered (adjusted odds ratios, 1.6-1.9 for > 75th percentile, 1.3-1.9 for > 90th percentile). The results of the present study confirm those of the previous one and suggest that women with the highest PCE exposure levels have a small to moderate increased risk of breast cancer. PMID:12573900

  14. [Helicobacter pylori infection as an additional risk factor for the development of NSAID gastropathy in patients with osteoarthritis].

    PubMed

    Maev, I V; Samsonov, A A; Lezhneva, Iu A; Andreev, N G; Salova, L M

    2009-01-01

    The prevalence of osteoarthritis is high in the general population, and NSAIDs occupy a central place in the treatment of this pathology. NSAID intake can lead to the development of NSAID gastropathy. In recent years, H. pylori infection has been counted among the risk factors for the development of NSAID gastropathy. This review considers studies devoted to the relationship between H. pylori and NSAIDs. Data on the use of eradication therapy for the prevention and treatment of NSAID gastropathy associated with H. pylori are presented.

  15. A quantitative risk assessment model for Salmonella and whole chickens.

    PubMed

    Oscar, Thomas P

    2004-06-01

    Existing data and predictive models were used to define the input settings of a previously developed but modified quantitative risk assessment model (QRAM) for Salmonella and whole chickens. The QRAM was constructed in an Excel spreadsheet and was simulated using @Risk. The retail-to-table pathway was modeled as a series of unit operations and associated pathogen events that included initial contamination at retail, growth during consumer transport, thermal inactivation during cooking, cross-contamination during serving, and dose response after consumption. Published data as well as predictive models for growth and thermal inactivation of Salmonella were used to establish input settings. Noncontaminated chickens were simulated so that the QRAM could predict changes in the incidence of Salmonella contamination. The incidence of Salmonella contamination changed from 30% at retail to 0.16% after cooking to 4% at consumption. Salmonella growth on chickens during consumer transport was the only pathogen event that did not impact the risk of salmonellosis. For the scenario simulated, the QRAM predicted 0.44 cases of salmonellosis per 100,000 consumers, which was consistent with recent epidemiological data that indicate a rate of 0.66-0.88 cases of salmonellosis per 100,000 consumers of chicken. Although the QRAM was in agreement with the epidemiological data, surrogate data and models were used, assumptions were made, and potentially important unit operations and pathogen events were not included because of data gaps and thus, further refinement of the QRAM is needed.
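    The retail-to-table chain described above can be sketched as a Monte Carlo simulation (here in Python rather than Excel/@Risk; every distribution and rate below is an illustrative placeholder, not one of the QRAM's fitted inputs):

```python
import math
import random

def simulate_servings(n, seed=1):
    """Toy retail-to-table pathway: contamination at retail, growth in
    transport, log-reduction in cooking, cross-contamination, dose-response."""
    rng = random.Random(seed)
    illnesses = 0
    for _ in range(n):
        contaminated = rng.random() < 0.30          # retail prevalence
        if not contaminated:
            continue
        log_cfu = rng.uniform(0.0, 2.0)             # initial load, log10 CFU
        log_cfu += rng.uniform(0.0, 0.5)            # growth during transport
        log_cfu -= rng.uniform(4.0, 7.0)            # thermal inactivation
        if rng.random() < 0.10:                     # cross-contamination at serving
            log_cfu = max(log_cfu, rng.uniform(0.0, 1.0))
        dose = 10.0 ** log_cfu
        p_ill = 1.0 - math.exp(-2.3e-3 * dose)      # exponential dose-response
        if rng.random() < p_ill:
            illnesses += 1
    return illnesses / n

risk = simulate_servings(100_000)   # illness risk per serving
```

As in the QRAM, cross-contamination dominates the final risk because cooking drives the surviving dose to near zero.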

  16. Modeling insurer-homeowner interactions in managing natural disaster risk.

    PubMed

    Kesete, Yohannes; Peng, Jiazhen; Gao, Yang; Shan, Xiaojun; Davidson, Rachel A; Nozick, Linda K; Kruse, Jamie

    2014-06-01

    The current system for managing natural disaster risk in the United States is problematic for both homeowners and insurers. Homeowners are often uninsured or underinsured against natural disaster losses, and typically do not invest in retrofits that can reduce losses. Insurers often do not want to insure against these losses, which are some of their biggest exposures and can cause an undesirably high chance of insolvency. There is a need to design an improved system that acknowledges the different perspectives of the stakeholders. In this article, we introduce a new modeling framework to help understand and manage the insurer's role in catastrophe risk management. The framework includes a new game-theoretic optimization model of insurer decisions that interacts with a utility-based homeowner decision model and is integrated with a regional catastrophe loss estimation model. Reinsurer and government roles are represented as bounds on the insurer-insured interactions. We demonstrate the model for a full-scale case study for hurricane risk to residential buildings in eastern North Carolina; present the results from the perspectives of all stakeholders-primary insurers, homeowners (insured and uninsured), and reinsurers; and examine the effect of key parameters on the results.

  17. Modelling of fire count data: fire disaster risk in Ghana.

    PubMed

    Boadi, Caleb; Harvey, Simon K; Gyeke-Dako, Agyapomaa

    2015-01-01

    Stochastic dynamics involved in ecological count data require distribution-fitting procedures to model and make informed judgments. The study provides empirical research, focused on the provision of an early warning system and a spatial graph that can detect societal fire risks. It offers an opportunity for communities, organizations, risk managers, actuaries and governments to be aware of, and understand, fire risks, so that they will increase the direct tackling of the threats posed by fire. A statistical distribution-fitting method that best identifies the stochastic dynamics of fire count data is used. The aim is to provide a fire-prediction model and a fire spatial graph for observed fire count data. An empirical probability distribution model is fitted to the fire count data and compared to the theoretical probability distribution of the stochastic process of fire count data. The distribution fitted to the fire frequency count data helps identify the class of models exhibited by the fires and provides lead time for decisions. The research suggests that fire frequency and loss (fire fatality) count data in Ghana are best modelled with a negative binomial distribution. The spatial map of observed fire frequency and fatality measured over 5 years (2007-2011) offers in this study a first regional assessment of fire frequency and fire fatality in Ghana.
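    A negative binomial fit of the kind reported can be obtained by the method of moments: with sample mean m and variance v > m, the size parameter is r = m²/(v − m) and the success probability p = r/(r + m). A sketch on made-up monthly fire counts (hypothetical data, not the Ghana series):

```python
def neg_binom_moments(counts):
    """Method-of-moments negative binomial fit for overdispersed counts (v > m)."""
    n = len(counts)
    m = sum(counts) / n
    v = sum((x - m) ** 2 for x in counts) / (n - 1)
    if v <= m:
        raise ValueError("no overdispersion: a Poisson model may suffice")
    r = m * m / (v - m)        # size (dispersion) parameter
    p = r / (r + m)            # success probability
    return r, p

# Hypothetical monthly fire counts for one region.
counts = [3, 7, 2, 9, 4, 12, 5, 1, 8, 6, 15, 4]
r, p = neg_binom_moments(counts)
```

The overdispersion check (variance exceeding the mean) is exactly what distinguishes the negative binomial from the Poisson alternative the study rules out.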

  18. Framework for Risk Analysis in Multimedia Environmental Systems: Modeling Individual Steps of a Risk Assessment Process

    SciTech Connect

    Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.

    2004-06-01

    The study of the release and effects of chemicals in the environment and their associated risks to humans is central to public and private decision making. FRAMES 1.X, the Framework for Risk Analysis in Multimedia Environmental Systems, is a systems modeling software platform, developed by Pacific Northwest National Laboratory (PNNL), that helps scientists study the release and effects of chemicals on a source-to-outcome basis and create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X, that greatly improve the ability of module developers to “plug” their self-developed software modules into the system. The basic design, the underlying principles and a discussion of the guidelines for module developers are presented.

  19. Child Effortful Control, Teacher-student Relationships, and Achievement in Academically At-risk Children: Additive and Interactive Effects.

    PubMed

    Liew, Jeffrey; Chen, Qi; Hughes, Jan N

    2010-01-01

    The joint contributions of child effortful control (using inhibitory control and task accuracy as behavioral indices) and positive teacher-student relationships at first grade on reading and mathematics achievement at second grade were examined in 761 children who were predominantly from low-income and ethnic minority backgrounds and assessed to be academically at-risk at entry to first grade. Analyses accounted for clustering effects, covariates, baselines of effortful control measures, and prior levels of achievement. Even with such conservative statistical controls, interactive effects were found for task accuracy and positive teacher-student relationships on future achievement. Results suggest that task accuracy served as a protective factor so that children with high task accuracy performed well academically despite not having positive teacher-student relationships. Further, positive teacher-student relationships served as a compensatory factor so that children with low task accuracy performed just as well as those with high task accuracy if they were paired with a positive and supportive teacher. Importantly, results indicate that the influence of positive teacher-student relationships on future achievement was most pronounced for students with low effortful control on tasks that require fine motor skills, accuracy, and attention-related skills. Study results have implications for narrowing achievement disparities for academically at-risk children.

  20. Child Effortful Control, Teacher-student Relationships, and Achievement in Academically At-risk Children: Additive and Interactive Effects

    PubMed Central

    Liew, Jeffrey; Chen, Qi; Hughes, Jan N.

    2009-01-01

    The joint contributions of child effortful control (using inhibitory control and task accuracy as behavioral indices) and positive teacher-student relationships at first grade on reading and mathematics achievement at second grade were examined in 761 children who were predominantly from low-income and ethnic minority backgrounds and assessed to be academically at-risk at entry to first grade. Analyses accounted for clustering effects, covariates, baselines of effortful control measures, and prior levels of achievement. Even with such conservative statistical controls, interactive effects were found for task accuracy and positive teacher-student relationships on future achievement. Results suggest that task accuracy served as a protective factor so that children with high task accuracy performed well academically despite not having positive teacher-student relationships. Further, positive teacher-student relationships served as a compensatory factor so that children with low task accuracy performed just as well as those with high task accuracy if they were paired with a positive and supportive teacher. Importantly, results indicate that the influence of positive teacher-student relationships on future achievement was most pronounced for students with low effortful control on tasks that require fine motor skills, accuracy, and attention-related skills. Study results have implications for narrowing achievement disparities for academically at-risk children. PMID:20161421

  1. Evaluation of Major Online Diabetes Risk Calculators and Computerized Predictive Models

    PubMed Central

    Stiglic, Gregor; Pajnkihar, Majda

    2015-01-01

    Classical paper-and-pencil risk assessment questionnaires are often accompanied by online versions to reach a wider population. This study focuses on the loss, especially in risk estimation performance, that can be inflicted by direct transformation from paper to online versions of risk estimation calculators, ignoring the more complex and accurate calculations that online calculators could perform. We empirically compare the risk estimation performance of four major diabetes risk calculators and two more advanced predictive models. National Health and Nutrition Examination Survey (NHANES) data from 1999–2012 were used to evaluate the performance of detecting diabetes and pre-diabetes. The American Diabetes Association risk test achieved the best predictive performance in the category of classical paper-and-pencil tests, with an Area Under the ROC Curve (AUC) of 0.699 for undiagnosed diabetes (0.662 for pre-diabetes) and 47% (47% for pre-diabetes) of persons selected for screening. Our results demonstrate a significant difference in performance, with the additional benefit of fewer persons selected for screening, when statistical methods are used. The best AUC overall was obtained for diabetes risk prediction using logistic regression, with an AUC of 0.775 (0.734) and an average of 34% (48%) of persons selected for screening. However, generalized boosted regression models might be a better option from an economic point of view, as the proportion selected for screening, 30% (47%), is significantly lower for diabetes risk assessment than with logistic regression (p < 0.001), with a significantly higher AUC (p < 0.001) of 0.774 (0.740) for the pre-diabetes group. Our results demonstrate a serious lack of predictive performance in four major online diabetes risk calculators. Therefore, one should take great care and consider optimizing the online versions of questionnaires that were
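The AUC figures reported in this abstract have a standard rank interpretation: the AUC is the probability that a randomly chosen positive case receives a higher risk score than a randomly chosen negative one. A minimal pure-Python sketch of that interpretation, together with the "fraction selected for screening" at a cut-off; the scores below are invented for illustration, not NHANES data or the ADA risk test:

```python
# Rank-based AUC: probability that a random positive outscores a random negative
# (ties count half). Scores are illustrative placeholders only.

def auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def screened_fraction(scores, threshold):
    """Fraction of the population flagged for screening at a given cut-off."""
    return sum(s >= threshold for s in scores) / len(scores)

diabetic = [0.9, 0.8, 0.75, 0.6]       # hypothetical risk scores, positives
healthy = [0.7, 0.5, 0.4, 0.3, 0.2]    # hypothetical risk scores, negatives
print(auc(diabetic, healthy))                      # 0.95
print(screened_fraction(diabetic + healthy, 0.6))  # 5/9 of population flagged
```

A calculator with the same AUC can still flag very different fractions of the population depending on where the screening threshold is set, which is why the abstract reports both numbers.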

  2. A neural network model for credit risk evaluation.

    PubMed

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back-propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real-world credit applications from the Australian credit approval dataset. A comparison of the system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, following the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in automatic processing of credit applications.
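The approach described, a feed-forward network trained by back-propagation to emit an approve/reject decision, can be sketched in miniature. This is an illustrative toy only: a one-hidden-layer network on a synthetic accept/reject rule, not the paper's architecture, its seven learning schemes, or the Australian credit dataset:

```python
import math
import random

# Toy one-hidden-layer network trained with plain back-propagation (MSE loss)
# on a synthetic credit rule: approve (1.0) when "income" exceeds "debt".
# Illustrative sketch only; not the paper's model or data.

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    def __init__(self, n_in=2, n_hid=3, lr=0.5):
        rnd = lambda: random.uniform(-0.5, 0.5)
        self.w1 = [[rnd() for _ in range(n_in)] for _ in range(n_hid)]
        self.b1 = [rnd() for _ in range(n_hid)]
        self.w2 = [rnd() for _ in range(n_hid)]
        self.b2 = rnd()
        self.lr = lr

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.y

    def train_step(self, x, target):
        y = self.forward(x)
        d_out = (y - target) * y * (1.0 - y)            # output-layer delta
        for j, h in enumerate(self.h):
            d_hid = d_out * self.w2[j] * h * (1.0 - h)  # hidden-layer delta
            self.w2[j] -= self.lr * d_out * h
            for i, xi in enumerate(x):
                self.w1[j][i] -= self.lr * d_hid * xi
            self.b1[j] -= self.lr * d_hid
        self.b2 -= self.lr * d_out

# Synthetic applications: (income, debt) pairs in [0, 1].
data = []
for _ in range(200):
    x = (random.random(), random.random())
    data.append((x, 1.0 if x[0] > x[1] else 0.0))

net = TinyNet()
for _ in range(300):
    for x, t in data:
        net.train_step(x, t)

accuracy = sum((net.forward(x) > 0.5) == (t == 1.0) for x, t in data) / len(data)
```

After training, thresholding the output at 0.5 yields the approve/reject decision; varying the learning rate, epoch count, or hidden-layer count corresponds loosely to the "learning schemes" the paper compares.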

  3. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  4. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  5. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  6. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  7. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  8. Issues in risk assessment and modifications of the NRC health effects models

    SciTech Connect

    Gilbert, E.S.

    1992-06-01

    A report, Health Effects Models for Nuclear Power Plant Accident Consequence Analysis, was published by the US Nuclear Regulatory Commission, in 1985, and revised in 1989. These reports provided models for estimating health effects that would be expected to result from the radiation exposure received in a nuclear reactor accident. Separate models were given for early occurring effects, late somatic effects, and genetic effects; however, this paper addresses only late somatic effects, or the risk of cancer expected to occur in the lifetimes of exposed individuals. The 1989 revision was prepared prior to the publication of the BEIR V, 1988 UNSCEAR, and ICRP 60 reports. For this reason, an addendum was needed that would provide modified risk models that took into account these recent reports, and, more generally, any new evidence that had appeared since the 1989 publication. Of special importance was consideration of updated analyses of the Japanese A-bomb survivor study data based on revised DS86 dosimetry. The process of preparing the addendum required thorough review and evaluation of the models used by the BEIR V, UNSCEAR, and ICRP committees, and also required thorough consideration of the various decisions that must be made in any risk assessment effort. This paper emphasizes general issues and problems that arise in risk assessment, and also indicates areas where additional development and application of statistical methods may be fruitful.

  9. Issues in risk assessment and modifications of the NRC health effects models

    SciTech Connect

    Gilbert, E.S.

    1992-07-02

    A report, Health Effects Models for Nuclear Power Plant Accident Consequence Analysis, was published by the US Nuclear Regulatory Commission, in 1985, and revised in 1989. These reports provided models for estimating health effects that would be expected to result from the radiation exposure received in a nuclear reactor accident. Separate models were given for early occurring effects, late somatic effects, and genetic effects; however, this paper addresses only late somatic effects, or the risk of cancer expected to occur in the lifetimes of exposed individuals. The 1989 revision was prepared prior to the publication of the BEIR V, 1988 UNSCEAR, and ICRP 60 reports. For this reason, an addendum was needed that would provide modified risk models that took into account these recent reports, and, more generally, any new evidence that had appeared since the 1989 publication. Of special importance was consideration of updated analyses of the Japanese A-bomb survivor study data based on revised DS86 dosimetry. The process of preparing the addendum required thorough review and evaluation of the models used by the BEIR V, UNSCEAR, and ICRP committees, and also required thorough consideration of the various decisions that must be made in any risk assessment effort. This paper emphasizes general issues and problems that arise in risk assessment, and also indicates areas where additional development and application of statistical methods may be fruitful.

  10. Risk assessment and management: a community forensic mental health practice model.

    PubMed

    Kelly, Teresa; Simmons, Warren; Gregory, Esther

    2002-12-01

    In Victoria, the Crimes (Mental Impairment and Unfitness to be Tried) Act (1997) reformed legal practice in relation to the detention, management and release of persons found by a court to be not guilty on the grounds of insanity or unfit to be tried. This Act provides a legal structure for such 'forensic patients' to move from secure inpatient facilities into the community. This new legislative landscape has generated challenges for all stakeholders and has provided the impetus for the development of a risk assessment and management model. The key components of the model are the risk profile, assessment and management plan. The discussion comprises theory, legislation, practice implications and limitations of the model. Practice implications concern the provision of objective tools, which identify risk and document strategic interventions to support clinical management. Some of the practice limitations include the model's applicability to risk assessment and management and its dependence on a mercurial multi-service interface in after-hours crisis situations. In addition to this, the paper articulates human limitations implicit in the therapeutic relationship that necessarily underpins the model. The paper concludes with an exploration of the importance of evaluative processes as well as the need for formal support and education for clinicians.

  11. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    SciTech Connect

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion model (TSCE) of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute exposure modelling but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop the analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piece-wise constant parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures in an acute series, either only the first mutation rate is allowed to vary with the dose, or all the parameters are dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with the dose. With these analytical functions, it becomes easy to evaluate cancer risks and to deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure of plutonium, the multiple-exposure version of the TSCE model was to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed. A draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact

  12. Risk factors assessment and risk prediction models in lung cancer screening candidates

    PubMed Central

    Wachuła, Ewa; Szabłowska-Siwik, Sylwia; Boratyn-Nowicka, Agnieszka; Czyżewski, Damian

    2016-01-01

    From February 2015, low-dose computed tomography (LDCT) screening entered the armamentarium of diagnostic tools broadly available to individuals at high-risk of developing lung cancer. While a huge number of pulmonary nodules are identified, only a small fraction turns out to be early lung cancers. The majority of them constitute a variety of benign lesions. Although it entails a burden of the diagnostic work-up, the undisputable benefit emerges from: (I) lung cancer diagnosis at earlier stages (stage shift); (II) additional findings enabling the implementation of a preventive action beyond the realm of thoracic oncology. This review presents how to utilize the risk factors from distinct categories such as epidemiology, radiology and biomarkers to target the fraction of population, which may benefit most from the introduced screening modality. PMID:27195269

  13. Multistage Carcinogenesis Modelling of Low and Protracted Radiation Exposure for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Brugmans, M. J. P.; Bijwaard, H.

    Exposure to cosmic radiation in space poses an increased risk for radiation-induced cancer later in life. Modelling is essential to quantify these excess risks from low and protracted exposures to a mixture of radiation types, since they cannot be determined directly in epidemiological studies. Multistage carcinogenesis models provide a mechanistic basis for the extrapolation of epidemiological data to the regime that is relevant for radiation protection. In recent years, we have exploited the well-known two-mutation carcinogenesis model to bridge the gap between radiobiology and epidemiology. We have fitted this model to a number of animal and epidemiological data sets, using dose-response relationships for the mutational steps that are well established in cellular radiobiology. The methodology and implications for radiation risks are illustrated with analyses of two radiation-induced tumours: bone cancer from internal (high-LET and low-LET) emitters and lung cancer after radon exposure. For the risks of bone-seeking radionuclides (Ra-226, Sr-90, Pu-239), model fits to beagle data show that the dose-effect relationship for bone cancer at low intakes is linear-quadratic. This is due to a combination of equally strong linear dose-effects in the two subsequent mutational steps in the model. This supra-linear dose-effect relationship is also found in a model analysis of bone cancer in radium dial painters. This implies that at low intakes the risks from bone seekers are significantly lower than estimated from a linear extrapolation from high doses. Model analyses of radon-exposed rats and uranium miners show that lung-cancer induction is dominated by a linear radiation effect in the first mutational step. For two miner cohorts with significantly different lung cancer baselines a uniform description of the effect of radon is obtained in a joint analysis. This demonstrates the possibility to model risk transfer across populations. 
In addition to biologically based risk

  14. Individual-based model for radiation risk assessment

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, a respective mathematical model is used too. This approach is applied to describe the effects of low-level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on the mortality dynamics of mice in the absence of radiation, are utilized. The life span probability and life span shortening predicted by the model agree with corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe aftereffects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  15. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    NASA Astrophysics Data System (ADS)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They, therefore, require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems with an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  16. Field evaluation of an avian risk assessment model

    USGS Publications Warehouse

    Vyas, N.B.; Spann, J.W.; Hulse, C.S.; Borges, S.L.; Bennett, R.S.; Torrez, M.; Williams, B.I.; Leffel, R.

    2006-01-01

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in the field. We tested technical-grade diazinon and its DZN-50W (50% diazinon active ingredient wettable powder) formulation on Canada goose (Branta canadensis) goslings. Brain acetylcholinesterase activity was measured, and the feathers, skin, feet, and gastrointestinal contents were analyzed for diazinon residues. The dose-response curves showed that diazinon was significantly more toxic to goslings in the outdoor test than in the laboratory tests. The deterministic risk assessment method identified the potential for risk to birds in general, but the factors associated with extrapolating from the laboratory to the field, and from the laboratory test species to other species, resulted in the underestimation of risk to the goslings. The present study indicates that laboratory-based risk quotients should be interpreted with caution.

  17. Making Risk Models Operational for Situational Awareness and Decision Support

    SciTech Connect

    Paulson, Patrick R.; Coles, Garill A.; Shoemaker, Steven V.

    2012-06-12

    Modernization of nuclear power operations control systems, in particular the move to digital control systems, creates an opportunity to modernize existing legacy infrastructure and extend plant life. We describe here decision support tools that allow the assessment of different facets of risk and support the optimization of available resources to reduce risk as plants are upgraded and maintained. This methodology could become an integrated part of the design review process and a part of the operations management systems. The methodology can be applied to the design of new reactors such as small nuclear reactors (SMR), and be helpful in assessing the risks of different configurations of the reactors. Our tool provides a low-cost evaluation of alternative configurations and provides an expanded safety analysis by considering scenarios early in the implementation cycle, where cost impacts can be minimized. The effects of failures can be modeled and thoroughly vetted to understand their potential impact on risk. The process and tools presented here allow for an integrated assessment of risk by supporting traditional defense-in-depth approaches while taking into consideration the insertion of new digital instrument and control systems.

  18. Cancer risk estimation of genotoxic chemicals based on target dose and a multiplicative model

    SciTech Connect

    Granath, F.N. (Dept. of Mathematical Statistics, Karolinska Inst., Stockholm; Dept. of Medical Epidemiology); Vaca, C.E. (Dept. of Radiobiology, Casco Products AB, Stockholm); Ehrenberg, L.G.; Toernqvist, M.A.

    1999-04-01

    A mechanistic model and associated procedures are proposed for cancer risk assessment of genotoxic chemicals. As previously shown for ionizing radiation, a linear multiplicative model was found to be compatible with published experimental data for ethylene oxide, acrylamide, and butadiene. Concurrent analysis led to rejection of an additive model. A reanalysis of data for radiogenic cancer in mouse, dog and man shows that the relative risk coefficient is approximately the same for tumors induced in the three species. Doses in vivo, defined as the time-integrated concentrations of ultimate mutagens, expressed in millimol × kg⁻¹ × h (mMh), are, like radiation doses given in Gy or rad, proportional to frequencies of potentially mutagenic events. The radiation dose equivalents of chemical doses are calculated by multiplying chemical doses (in mMh) by the relative genotoxic potencies determined in vitro. In this way the relative cancer incidence increments in rats and mice exposed to ethylene oxide were shown to be about 0.4% per rad-equivalent, in agreement with the data for radiogenic cancer. The analyses suggest that values of the relative risk coefficients for genotoxic chemicals are independent of species and that relative cancer risks determined in animal tests apply also to humans. If reliable animal test data are not available, cancer risks may be estimated by the relative potency. In both cases exposure dose/target dose relationships, the latter via macromolecule adducts, should be determined.
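The rad-equivalent bookkeeping this abstract describes is simple arithmetic: a chemical target dose in mMh times an in-vitro relative potency gives rad-equivalents, which the ~0.4%-per-rad-equivalent coefficient converts to a relative incidence increment. A sketch with invented placeholder numbers, not values from the paper:

```python
# Chemical dose -> radiation dose equivalent -> relative incidence increment.
# All numeric inputs below are illustrative placeholders, not measured values.

def rad_equivalent(dose_mMh, relative_potency):
    """Chemical dose (mMh) x in-vitro relative genotoxic potency -> rad-equivalents."""
    return dose_mMh * relative_potency

def relative_incidence_increment(rad_eq, pct_per_rad_eq=0.4):
    """Relative cancer incidence increment (%), using the ~0.4%/rad-equivalent figure."""
    return rad_eq * pct_per_rad_eq

dose_eq = rad_equivalent(50.0, 2.0)                  # 100.0 rad-equivalents
increment = relative_incidence_increment(dose_eq)    # 40.0 % relative increment
```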

  19. Enhanced coding in a cochlear-implant model using additive noise: Aperiodic stochastic resonance with tuning

    NASA Astrophysics Data System (ADS)

    Morse, Robert P.; Roper, Peter

    2000-05-01

    Analog electrical stimulation of the cochlear nerve (the nerve of hearing) by a cochlear implant is an effective method of providing functional hearing to profoundly deaf people. Recent physiological and computational experiments have shown that analog cochlear implants are unlikely to convey certain speech cues by the temporal pattern of evoked nerve discharges. However, these experiments have also shown that the optimal addition of noise to cochlear implant signals can enhance the temporal representation of speech cues [R. P. Morse and E. F. Evans, Nature Medicine 2, 928 (1996)]. We present a simple model to explain this enhancement of temporal representation. Our model derives from a rate equation for the mean threshold-crossing rate of an infinite set of parallel discriminators (level-crossing detectors); a system that well describes the time coding of information by a set of nerve fibers. Our results show that the optimal transfer of information occurs when the threshold level of each discriminator is equal to the root-mean-square noise level. The optimal transfer of information by a cochlear implant is therefore expected to occur when the internal root-mean-square noise level of each stimulated fiber is approximately equal to the nerve threshold. When interpreted within the framework of aperiodic stochastic resonance, our results indicate therefore that for an infinite array of discriminators, a tuning of the noise is still necessary for optimal performance. This is in contrast to previous results [Collins, Chow, and Imhoff, Nature 376, 236 (1995); Chialvo, Longtin, and Müller-Gerking, Phys. Rev. E 55, 1798 (1997)] on arrays of FitzHugh-Nagumo neurons.
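The core effect behind the result above, that a level-crossing detector conveys a subthreshold signal only when noise is added, is easy to demonstrate in a toy simulation. The sketch below is not the authors' discriminator-array model; it merely shows that a subthreshold sinusoid alone produces no threshold crossings, while added Gaussian noise with RMS amplitude near the threshold (the optimum the abstract identifies) produces crossings whose timing can carry the signal:

```python
import math
import random

# Toy level-crossing detector: count upward crossings of a fixed threshold by
# signal + Gaussian noise. Illustrative only; parameters are invented.

random.seed(1)

def upward_crossings(signal, threshold, sigma):
    count = 0
    prev = signal[0] + random.gauss(0.0, sigma)
    for s in signal[1:]:
        cur = s + random.gauss(0.0, sigma)
        if prev < threshold <= cur:
            count += 1
        prev = cur
    return count

threshold = 1.0
# Subthreshold sinusoid: amplitude 0.5 never reaches the threshold of 1.0.
signal = [0.5 * math.sin(2 * math.pi * t / 100.0) for t in range(5000)]

silent = upward_crossings(signal, threshold, sigma=0.0)  # no noise: 0 crossings
noisy = upward_crossings(signal, threshold, sigma=1.0)   # RMS noise ~ threshold
```

With zero noise the detector is permanently silent; with noise at roughly the threshold level it fires frequently, and in the full stochastic-resonance treatment the firing rate is modulated by the signal.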

  20. The impact of consumer phase models in microbial risk analysis.

    PubMed

    Nauta, Maarten; Christensen, Bjarke

    2011-02-01

    In quantitative microbiological risk assessment (QMRA), the consumer phase model (CPM) describes the part of the food chain between purchase of the food product at retail and exposure. Construction of a CPM is complicated by the large variation in consumer food handling practices and a limited availability of data. Therefore, several subjective (simplifying) assumptions have to be made when a CPM is constructed, but with a single CPM their impact on the QMRA results is unclear. We therefore compared the performance of eight published CPMs for Campylobacter in broiler meat in an example of a QMRA, where all the CPMs were analyzed using one single input distribution of concentrations at retail, and the same dose-response relationship. It was found that, between CPMs, there may be a considerable difference in the estimated probability of illness per serving. However, the estimated relative risk reductions are less different for scenarios modeling the implementation of control measures. For control measures affecting the Campylobacter prevalence, the relative risk is proportional irrespective of the CPM used. However, for control measures affecting the concentration the CPMs show some difference in the estimated relative risk. This difference is largest for scenarios where the aim is to remove the highly contaminated portion from human exposure. Given these results, we conclude that for many purposes it is not necessary to develop a new detailed CPM for each new QMRA. However, more observational data on consumer food handling practices and their impact on microbial transfer and survival are needed to generalize this conclusion.

  1. Risk prediction models for contrast induced nephropathy: systematic review

    PubMed Central

    Silver, Samuel A; Shah, Prakesh M; Chertow, Glenn M; Wald, Ron

    2015-01-01

    Objectives To look at the available literature on validated prediction models for contrast induced nephropathy and describe their characteristics. Design Systematic review. Data sources Medline, Embase, and CINAHL (cumulative index to nursing and allied health literature) databases. Review methods Databases searched from inception to 2015, and the retrieved reference lists hand searched. Dual reviews were conducted to identify studies published in the English language of prediction models tested with patients that included derivation and validation cohorts. Data were extracted on baseline patient characteristics, procedural characteristics, modelling methods, metrics of model performance, risk of bias, and clinical usefulness. Eligible studies evaluated characteristics of predictive models that identified patients at risk of contrast induced nephropathy among adults undergoing a diagnostic or interventional procedure using conventional radiocontrast media (media used for computed tomography or angiography, and not gadolinium based contrast). Results 16 studies were identified, describing 12 prediction models. Substantial interstudy heterogeneity was identified, as a result of different clinical settings, cointerventions, and the timing of creatinine measurement to define contrast induced nephropathy. Ten models were validated internally and six were validated externally. Discrimination varied in studies that were validated internally (C statistic 0.61-0.95) and externally (0.57-0.86). Only one study presented reclassification indices. The majority of higher performing models included measures of pre-existing chronic kidney disease, age, diabetes, heart failure or impaired ejection fraction, and hypotension or shock. No prediction model evaluated its effect on clinical decision making or patient outcomes. 
Conclusions Most predictive models for contrast induced nephropathy in clinical use have modest ability, and are only relevant to patients receiving contrast for

  2. Inhibition of Ostwald ripening in model beverage emulsions by addition of poorly water soluble triglyceride oils.

    PubMed

    McClements, David Julian; Henson, Lulu; Popplewell, L Michael; Decker, Eric Andrew; Choi, Seung Jun

    2012-01-01

    Beverage emulsions containing flavor oils that have a relatively high water-solubility are unstable to droplet growth due to Ostwald ripening. The aim of this study was to improve the stability of model beverage emulsions to this kind of droplet growth by incorporating poorly water-soluble triglyceride oils. High pressure homogenization was used to prepare a series of 5 wt% oil-in-water emulsions stabilized by modified starch that had different lipid phase compositions (orange oil : corn oil). Emulsions prepared using only orange oil as the lipid phase were highly unstable to droplet growth during storage, which was attributed to Ostwald ripening resulting from the relatively high water-solubility of orange oil. Droplet growth could be effectively inhibited by incorporating ≥ 10% corn oil into the lipid phase prior to homogenization. In addition, creaming was also retarded because the lipid phase density was closer to that of the aqueous phase density. These results illustrate a simple method of improving the physical stability of orange oil emulsions for utilization in the food, beverage, and fragrance industries.
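The creaming observation in this abstract follows from Stokes' law: the creaming velocity of a dilute droplet scales with the density difference between the aqueous phase and the droplet, so blending the orange oil with a denser triglyceride slows creaming. A sketch with illustrative round-number densities (not measurements from the study):

```python
# Stokes' law creaming velocity for a dilute emulsion droplet:
#   v = 2 r^2 (rho_continuous - rho_droplet) g / (9 eta)
# Densities, viscosity, and droplet size below are illustrative round numbers.

def stokes_velocity(radius_m, rho_drop, rho_cont, eta=1.0e-3, g=9.81):
    """Upward creaming velocity (m/s) of a droplet in a Newtonian aqueous phase."""
    return 2.0 * radius_m**2 * (rho_cont - rho_drop) * g / (9.0 * eta)

r = 0.5e-6  # 0.5 micron droplet radius
v_orange = stokes_velocity(r, rho_drop=840.0, rho_cont=1000.0)  # orange oil alone
v_blend = stokes_velocity(r, rho_drop=900.0, rho_cont=1000.0)   # blended with denser oil
```

Raising the droplet density toward that of the aqueous phase shrinks the density difference and hence the creaming velocity, which is the retardation effect the abstract reports.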

  3. Influence of the heterogeneous reaction HCl + HOCl on an ozone hole model with hydrocarbon additions

    NASA Astrophysics Data System (ADS)

    Elliott, Scott; Cicerone, Ralph J.; Turco, Richard P.; Drdla, Katja; Tabazadeh, Azadeh

    1994-02-01

    Injection of ethane or propane has been suggested as a means for reducing ozone loss within the Antarctic vortex because alkanes can convert active chlorine radicals into hydrochloric acid. In kinetic models of vortex chemistry including as heterogeneous processes only the hydrolysis and HCl reactions of ClONO2 and N2O5, parts per billion by volume levels of the light alkanes counteract ozone depletion by sequestering chlorine atoms. Introduction of the surface reaction of HCl with HOCl causes ethane to deepen baseline ozone holes and generally works to impede any mitigation by hydrocarbons. The increased depletion occurs because HCl + HOCl can be driven by HOx radicals released during organic oxidation. Following initial hydrogen abstraction by chlorine, alkane breakdown leads to a net hydrochloric acid activation as the remaining hydrogen atoms enter the photochemical system. Lowering the rate constant for reactions of organic peroxy radicals with ClO to 10⁻¹³ cm³ molecule⁻¹ s⁻¹ does not alter results, and the major conclusions are insensitive to the timing of the ethane additions. Ignoring the organic peroxy radical plus ClO reactions entirely restores remediation capabilities by allowing HOx removal independent of HCl. Remediation also returns if early evaporation of polar stratospheric clouds leaves hydrogen atoms trapped in aldehyde intermediates, but real ozone losses are small in such cases.

  4. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    PubMed

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to data from the Childhood Cancer Survivor Study that motivated this work.
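The additive rate model referred to above is commonly written as follows (this is the general form from the recurrent-event literature, not reproduced from this abstract): given a covariate process Z(t), the mean function of the counting process N(t) satisfies

```latex
E\{\,dN(t)\mid Z(t)\,\} \;=\; d\mu_{0}(t) \;+\; \beta^{\top} Z(t)\,dt
```

so covariates shift the unspecified baseline rate \(d\mu_{0}(t)\) additively rather than multiplicatively (as a proportional rates model would), and \(\beta\) is estimated from estimating equations that can accommodate both continuously observed and panel-count segments of follow-up.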

  5. Enhancement of colour stability of anthocyanins in model beverages by gum arabic addition.

    PubMed

    Chung, Cheryl; Rojanasasithara, Thananunt; Mutilangi, William; McClements, David Julian

    2016-06-15

    This study investigated the potential of gum arabic to improve the stability of anthocyanins that are used in commercial beverages as natural colourants. The degradation of purple carrot anthocyanin in model beverage systems (pH 3.0) containing L-ascorbic acid proceeded with a first-order reaction rate during storage (40 °C for 5 days in light). The addition of gum arabic (0.05-5.0%) significantly enhanced the colour stability of anthocyanin, with the most stable systems observed at intermediate levels (1.5%). A further increase in concentration (>1.5%) reduced its efficacy due to a change in the conformation of the gum arabic molecules that hindered their exposure to the anthocyanins. Fluorescence quenching measurements showed that the anthocyanin could have interacted with the glycoprotein fractions of the gum arabic through hydrogen bonding, resulting in enhanced stability. Overall, this study provides valuable information about enhancing the stability of anthocyanins in beverage systems using natural ingredients.
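The first-order reaction rate reported here implies a constant fractional colour loss per unit time, so a rate constant and half-life summarize each formulation. A minimal sketch (the concentrations and storage time below are illustrative, not values from the study):

```python
import math

def rate_constant(c0, ct, t_days):
    """First-order rate constant k (per day) from C(t) = C0 * exp(-k * t)."""
    return math.log(c0 / ct) / t_days

def half_life(k):
    """Time for the anthocyanin colour to fall to half its initial value."""
    return math.log(2) / k

# e.g. 40% colour loss over 5 days of storage (illustrative numbers)
k = rate_constant(100.0, 60.0, 5.0)
t_half = half_life(k)
```

Comparing k across gum arabic levels (0.05-5.0%) would quantify the stabilisation reported at the 1.5% optimum.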

  6. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it by estimating the nuclear risk to the population of the Nordic countries in the case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation takes a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport to, and impact on, different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators and identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems, with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account socio-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  7. Recurrence models of volcanic events: Applications to volcanic risk assessment

    SciTech Connect

    Crowe, B.M.; Picard, R.; Valentine, G.; Perry, F.V.

    1992-03-01

    An assessment of the risk of future volcanism has been conducted for isolation of high-level radioactive waste at the potential Yucca Mountain site in southern Nevada. Risk used in this context refers to a combined assessment of the probability and consequences of future volcanic activity. Past studies established bounds on the probability of magmatic disruption of a repository. These bounds were revised as additional data were gathered from site characterization studies. The probability of direct intersection of a potential repository, located in an eight km^2 area of Yucca Mountain, by ascending basalt magma was bounded by the range of 10^-8 to 10^-10 yr^-1. The consequences of magmatic disruption of a repository were estimated in previous studies to be limited. The exact releases from such an event depend on the strike of an intruding basalt dike relative to the repository geometry, the timing of the basaltic event relative to the age of the radioactive waste, and the mechanisms of release and dispersal of the waste radionuclides in the accessible environment. The combined low probability of repository disruption and the limited releases associated with this event established the basis for the judgement that the risk of future volcanism was relatively low. It was reasoned that the risk of future volcanism was not likely to result in disqualification of the potential Yucca Mountain site.

  8. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Little, Mark P.; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of high-dose radiation-induced solid cancers included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed in addition to cellular initiation, inactivation, and proliferation a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed).
As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low

  9. Integrated Water and Sanitation Risk Assessment and Modeling in the Upper Sonora River basin (Northwest, Mexico)

    NASA Astrophysics Data System (ADS)

    Mayer, A. S.; Robles-Morua, A.; Halvorsen, K. E.; Vivoni, E. R.; Auer, M. T.

    2011-12-01

    explore the use of participatory modeling frameworks in less developed regions. Results indicate that respondents agreed strongly with the hydrologic and water quality modeling methodologies presented and considered the modeling results useful. Our results also show that participatory modeling approaches can have short term impacts as seen in the changes in water-related risk perceptions. In total, these projects revealed that water resources management solutions need to take into account variations across the human landscape (i.e. risk perceptions) and variations in the biophysical response of watersheds to natural phenomena (i.e. streamflow generation) and to anthropogenic activities (i.e. contaminant fate and transport). In addition, this work underscores the notion that sustainable water resources solutions need to contend with uncertainty in our understanding and predictions of human perceptions and biophysical systems.

  10. Detecting Departure From Additivity Along a Fixed-Ratio Mixture Ray With a Piecewise Model for Dose and Interaction Thresholds

    PubMed Central

    Gennings, Chris; Wagner, Elizabeth D.; Simmons, Jane Ellen; Plewa, Michael J.

    2010-01-01

    For mixtures of many chemicals, a ray design based on a relevant, fixed mixing ratio is useful for detecting departure from additivity. Methods for detecting departure involve modeling the response as a function of total dose along the ray. For mixtures with many components, the interaction may be dose dependent. Therefore, we have developed the use of a three-segment model containing both a dose threshold and an interaction threshold. Prior to the dose threshold, the response is that of background; between the dose threshold and the interaction threshold, an additive relationship exists; the model allows for departure from additivity beyond the interaction threshold. With such a model, we can conduct a hypothesis test of additivity, as well as a test for a region of additivity. The methods are illustrated with cytotoxicity data that arise when Chinese hamster ovary cells are exposed to a mixture of nine haloacetic acids. PMID:21359103
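The three-segment structure described above can be sketched as a continuous piecewise-linear mean function. All parameter names and values here are hypothetical, and this linear sketch stands in for the cited work's fitted threshold model:

```python
def three_segment_mean(d, b0, b1, b2, dose_thr, int_thr):
    """Mean response at total dose d along the fixed-ratio mixture ray.

    d <= dose_thr            : background response b0
    dose_thr < d <= int_thr  : additive region with slope b1
    d > int_thr              : extra slope b2 captures departure from additivity
    The segments join continuously at both thresholds (knots).
    """
    y = b0
    if d > dose_thr:
        y += b1 * (d - dose_thr)       # additive dose-response above threshold
    if d > int_thr:
        y += b2 * (d - int_thr)        # departure-from-additivity term
    return y

# Testing additivity amounts to testing H0: b2 = 0; testing for a "region of
# additivity" asks whether int_thr exceeds a dose of interest.
```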

  11. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
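The weights-of-evidence calculation underlying this kind of ignition model can be sketched from a 2x2 contingency table of ignition history against a binary evidence layer; the counts below are invented for illustration, not San Diego data:

```python
import math

def weights_of_evidence(n11, n10, n01, n00):
    """Bayesian weights for a binary evidence layer B against ignitions D.

    n11: cells with an ignition and the factor present (e.g. near a road)
    n10: cells with an ignition, factor absent
    n01: cells without an ignition, factor present
    n00: cells without an ignition, factor absent
    Returns (W+, W-, contrast C = W+ - W-); C > 0 means the factor is
    positively associated with ignition.
    """
    p_b_given_d    = n11 / (n11 + n10)      # P(B | ignition)
    p_b_given_notd = n01 / (n01 + n00)      # P(B | no ignition)
    w_plus  = math.log(p_b_given_d / p_b_given_notd)
    w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_notd))
    return w_plus, w_minus, w_plus - w_minus

wp, wm, contrast = weights_of_evidence(80, 20, 400, 600)
```

Summing the weights of several conditionally independent layers (roads, fuels, land use) onto the prior log-odds of ignition yields the posterior risk surface.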

  12. Network Dependence in Risk Trading Games: A Banking Regulation Model

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan

    2003-04-01

    In the quest to quantitatively understand the risk-regulatory behavior of financial agents, we propose a physical model of interacting agents where interactions are defined by trades of financial derivatives. Consequences arising from various types of interaction-network topologies are shown for system safety and efficiency. We demonstrate that the model yields characteristic features of actually observed wealth time series. Further, we study the dependence of global system safety on a risk-control parameter (Basle multiplier). We find a phase-transition-like phenomenon, where the Basle parameter plays the role of temperature and safety serves as the order parameter. This work was done together with R. Hanel and S. Pichler.

  13. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like other lines of insurance (fire, auto), historically adopted an actuarial approach: premium rates were determined from historical loss experience. Because an earthquake is a rare event with severe consequences, irrationally determined premium rates and a poor understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science, and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  14. FIRESTORM: Modelling the water quality risk of wildfire.

    NASA Astrophysics Data System (ADS)

    Mason, C. I.; Sheridan, G. J.; Smith, H. G.; Jones, O.; Chong, D.; Tolhurst, K.

    2012-04-01

    Following wildfire, loss of vegetation and changes to soil properties may result in decreases in infiltration rates, less rainfall interception, and higher overland flow velocities. Rainfall events affecting burn areas before vegetation recovers can cause high magnitude erosion events that impact on downstream water quality. For cities and towns that rely upon fire-prone forest catchments for water supply, wildfire impacts on water quality represent a credible risk to water supply security. Quantifying the risk associated with the occurrence of wildfires and the magnitude of water quality impacts has important implications for managing water supplies. At present, no suitable integrative model exists that considers the probabilistic nature of system inputs as well as the range of processes and scales involved in this problem. We present FIRESTORM, a new model currently in development that aims to determine the range of sediment and associated contaminant loads that may be delivered to water supply reservoirs from the combination of wildfire and subsequent rainfall events. This Monte Carlo model incorporates the probabilistic nature of fire ignition, fire weather and rainfall, and includes deterministic models for fire behaviour and locally dominant erosion processes. FIRESTORM calculates the magnitude and associated annual risk of catchment-scale sediment loads associated with the occurrence of wildfire and rainfall generated by two rain event types. The two event types are localised, high intensity, short-duration convective storms, and widespread, longer duration synoptic-scale rainfall events. Initial application and testing of the model will focus on the two main reservoirs supplying water to Melbourne, Australia, both of which are situated in forest catchments vulnerable to wildfire. Probabilistic fire ignition and weather scenarios have been combined using 40 years of fire records and weather observations. These are used to select from a dataset of over 80

  15. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    PubMed Central

    Mumtaz, Moiz; Fisher, Jeffrey; Blount, Benjamin; Ruiz, Patricia

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures. PMID:22523493
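A full PBPK model links many perfusion-limited tissue compartments; as a hedged, minimal stand-in, a one-compartment sketch shows the kind of internal-dose calculation these tools automate. Parameter names and values are illustrative only, not drawn from the ATSDR toolkit:

```python
def simulate_plasma_concentration(dose_mg, vd_l, cl_l_per_h, hours, dt=0.01):
    """Forward-Euler integration of dC/dt = -(CL/Vd) * C after an IV bolus.

    dose_mg     : administered dose (mg)
    vd_l        : volume of distribution (L)
    cl_l_per_h  : clearance (L/h)
    Returns a list of (time_h, concentration_mg_per_L) pairs.
    """
    conc = dose_mg / vd_l            # initial concentration from the bolus
    k_e = cl_l_per_h / vd_l          # first-order elimination rate (1/h)
    t, series = 0.0, [(0.0, conc)]
    while t < hours:
        conc += -k_e * conc * dt     # elimination over one time step
        t += dt
        series.append((t, conc))
    return series

profile = simulate_plasma_concentration(dose_mg=100, vd_l=42, cl_l_per_h=5,
                                        hours=24)
```

A screening-level PBPK model replaces the single compartment with blood-flow-limited organ compartments (liver, fat, kidney, ...) and routes elimination through metabolizing tissues.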

  16. Racial Differences in the Performance of Existing Risk Prediction Models for Incident Type 2 Diabetes: The CARDIA Study

    PubMed Central

    Wellenius, Gregory A.; Carnethon, Mercedes R.; Loucks, Eric B.; Carson, April P.; Luo, Xi; Kiefe, Catarina I.; Gjelsvik, Annie; Gunderson, Erica P.; Eaton, Charles B.; Wu, Wen-Chih

    2016-01-01

    OBJECTIVE In 2010, the American Diabetes Association (ADA) added hemoglobin A1c (A1C) to the guidelines for diagnosing type 2 diabetes. However, existing models for predicting diabetes risk were developed prior to the widespread adoption of A1C. Thus, it remains unknown how well existing diabetes risk prediction models predict incident diabetes defined according to the ADA 2010 guidelines. Accordingly, we examined the performance of an existing diabetes prediction model applied to a cohort of African American (AA) and white adults from the Coronary Artery Risk Development Study in Young Adults (CARDIA). RESEARCH DESIGN AND METHODS We evaluated the performance of the Atherosclerosis Risk in Communities (ARIC) diabetes risk prediction model among 2,456 participants in CARDIA free of diabetes at the 2005–2006 exam and followed for 5 years. We evaluated model discrimination, calibration, and integrated discrimination improvement with incident diabetes defined by ADA 2010 guidelines before and after adding baseline A1C to the prediction model. RESULTS In the overall cohort, re-estimating the ARIC model in the CARDIA cohort resulted in good discrimination for the prediction of 5-year diabetes risk (area under the curve [AUC] 0.841). Adding baseline A1C as a predictor improved discrimination (AUC 0.841 vs. 0.863, P = 0.03). In race-stratified analyses, model discrimination was significantly higher in whites than AA (AUC AA 0.816 vs. whites 0.902; P = 0.008). CONCLUSIONS Addition of A1C to the ARIC diabetes risk prediction model improved performance overall and in racial subgroups. However, for all models examined, discrimination was better in whites than AA. Additional studies are needed to further improve diabetes risk prediction among AA. PMID:26628420
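The discrimination statistic compared throughout this abstract (the AUC, or c-statistic) equals the probability that a randomly chosen incident case receives a higher predicted risk than a randomly chosen non-case. A minimal rank-based sketch (the scores below are made up, not CARDIA data):

```python
def c_statistic(case_scores, control_scores):
    """Mann-Whitney form of the AUC; ties contribute one half."""
    wins = 0.0
    for case in case_scores:
        for control in control_scores:
            if case > control:
                wins += 1.0
            elif case == control:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Perfect separation gives 1.0; an uninformative model hovers near 0.5.
auc = c_statistic([0.9, 0.8, 0.7], [0.2, 0.3, 0.1])
```

Computing this separately within each racial subgroup, as the study does, reveals discrimination differences (here, AUC 0.816 in AA vs. 0.902 in whites) that a pooled AUC would hide.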

  17. Modeling the Risk of Secondary Malignancies after Radiotherapy

    PubMed Central

    Schneider, Uwe

    2011-01-01

    In developed countries, more than half of all cancer patients receive radiotherapy at some stage in the management of their disease. However, a radiation-induced secondary malignancy can be the price of success if the primary cancer is cured or at least controlled. Therefore, there is increasing concern regarding radiation-related second cancer risks in long-term radiotherapy survivors and a corresponding need to be able to predict cancer risks at high radiation doses. Of particular interest are second cancer risk estimates for new radiation treatment modalities such as intensity modulated radiotherapy, intensity modulated arc-therapy, proton and heavy ion radiotherapy. The long term risks from such modern radiotherapy treatment techniques have not yet been determined and are unlikely to become apparent for many years, due to the long latency time for solid tumor induction. Most information on the dose-response of radiation-induced cancer is derived from data on the A-bomb survivors who were exposed to γ-rays and neutrons. Since, for radiation protection purposes, the dose span of main interest is between zero and one Gy, the analysis of the A-bomb survivors is usually focused on this range. With increasing cure rates, estimates of cancer risk for doses larger than one Gy are becoming more important for radiotherapy patients. Therefore in this review, emphasis was placed on doses relevant for radiotherapy with respect to radiation induced solid cancer. Simple radiation protection models should be used only with extreme care for risk estimates in radiotherapy, since they are developed exclusively for low dose. When applied to scatter radiation, such models can predict only a fraction of observed second malignancies. Better semi-empirical models include the effect of dose fractionation and represent the dose-response relationships more accurately. The involved uncertainties are still huge for most of the organs and tissues. A major reason for this is that the

  18. Risk factors correlated with risk of insulin resistance using homeostasis model assessment in adolescents in Taiwan.

    PubMed

    Lin, Shiyng-Yu; Su, Chien-Tien; Hsieh, Yi-Chen; Li, Yu-Ling; Chen, Yih-Ru; Cheng, Shu-Yun; Hu, Chien-Ming; Chen, Yi-Hua; Hsieh, Fang-I; Chiou, Hung-Yi

    2015-03-01

    The study aims to identify risk factors significantly correlated with insulin resistance (IR) among adolescents in Taiwan. A total of 339 study subjects were recruited in this cross-sectional study. A self-administered questionnaire and physical examinations, including anthropometrics and biochemistry profiles, were collected. Insulin resistance was assessed using the homeostasis model assessment for insulin resistance (HOMA-IR). Study subjects with abnormal levels of body mass index (odds ratio [OR] = 3.54; 95% confidence interval [CI] = 1.81-6.91), body fat (OR = 2.71; 95% CI = 1.25-5.88), or waist circumference (OR = 25.04; 95% CI = 2.93-214.14) had a significantly increased risk of IR compared with those with normal values. Furthermore, a significant joint effect of body fat, body mass index, and systolic blood pressure, conferring a 10.86-fold risk of HOMA-IR abnormality, was observed. Identifying risk factors significantly correlated with IR will be important for preventing metabolic syndrome-related diseases and complications later in adolescents' lives.
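The HOMA-IR index used above is a closed-form calculation: fasting glucose (mmol/L) times fasting insulin (µU/mL) divided by 22.5, or equivalently glucose in mg/dL divided by 405. The abstract does not report the cut-off used to define abnormality, so none is hard-coded here:

```python
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uu_ml):
    """Homeostasis model assessment of insulin resistance (SI glucose units)."""
    return fasting_glucose_mmol_l * fasting_insulin_uu_ml / 22.5

def homa_ir_conventional(fasting_glucose_mg_dl, fasting_insulin_uu_ml):
    """Same index with glucose in conventional units (mg/dL); 405 = 22.5 * 18."""
    return fasting_glucose_mg_dl * fasting_insulin_uu_ml / 405.0

# e.g. fasting glucose 5.0 mmol/L with fasting insulin 9 uU/mL -> HOMA-IR 2.0
index = homa_ir(5.0, 9.0)
```

Cut-offs marking IR vary by population and age group, which is why studies such as this one define abnormality against their own cohort.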

  19. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  20. Regime switching model for financial data: Empirical risk analysis

    NASA Astrophysics Data System (ADS)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for the univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, HMM is used to classify data in crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to rub out the delay between regime switching and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable, power laws and GARCH models. The empirical results show that the regime switching model increases predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of power laws model while remaining practical to implement the VaR measurement.
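The two-stage idea (classify regimes, then estimate the tail within the current regime) can be caricatured with a rolling-volatility classifier in place of the HMM and an empirical quantile in place of the EVT fit. Everything below is a simplified stand-in for the paper's method, with synthetic data:

```python
import statistics

def empirical_var(returns, alpha=0.01):
    """Empirical Value-at-Risk: the (1 - alpha) quantile of the loss (-return)."""
    losses = sorted(-r for r in returns)
    k = min(len(losses) - 1, int((1 - alpha) * len(losses)))
    return losses[k]

def regime_switching_var(returns, window=20, alpha=0.01):
    """Label each day 'crisis' or 'steady' by whether its rolling volatility
    exceeds the median rolling volatility (a crude stand-in for the HMM step),
    then estimate VaR from the returns sharing the latest day's regime
    (a crude stand-in for the EVT tail fit)."""
    vols = [statistics.pstdev(returns[i - window:i])
            for i in range(window, len(returns) + 1)]
    threshold = statistics.median(vols)
    labels = ['crisis' if v > threshold else 'steady' for v in vols]
    same_regime = [returns[i + window - 1]
                   for i, lab in enumerate(labels) if lab == labels[-1]]
    return labels[-1], empirical_var(same_regime, alpha)

# A calm spell followed by a turbulent one (synthetic illustration):
history = [0.001 * (-1) ** i for i in range(100)] + \
          [0.05 * (-1) ** i for i in range(30)]
regime, var_99 = regime_switching_var(history)
```

Conditioning the quantile on the current regime is what lets the estimator react quickly to volatility clustering instead of averaging crisis and steady periods together.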

  1. Model risk for European-style stock index options.

    PubMed

    Gençay, Ramazan; Gibson, Rajna

    2007-01-01

    In empirical modeling, there have been two strands for pricing in the options literature, namely the parametric and nonparametric models. Often, the support for the nonparametric methods is based on a benchmark such as the Black-Scholes (BS) model with constant volatility. In this paper, we study the stochastic volatility (SV) and stochastic volatility random jump (SVJ) models as parametric benchmarks against feedforward neural network (FNN) models, a class of neural network models. Our choice for FNN models is due to their well-studied universal approximation properties of an unknown function and its partial derivatives. Since the partial derivatives of an option pricing formula are risk pricing tools, an accurate estimation of the unknown option pricing function is essential for pricing and hedging. Our findings indicate that FNN models offer themselves as robust option pricing tools, over their sophisticated parametric counterparts in predictive settings. There are two routes to explain the superiority of FNN models over the parametric models in forecast settings. These are nonnormality of return distributions and adaptive learning.
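The constant-volatility Black-Scholes benchmark mentioned above has a closed form; a standard implementation follows (only the parameter values in the example call are invented):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(spot, strike, maturity, rate, sigma):
    """European call price under constant volatility sigma."""
    d1 = ((math.log(spot / strike) + (rate + 0.5 * sigma ** 2) * maturity)
          / (sigma * math.sqrt(maturity)))
    d2 = d1 - sigma * math.sqrt(maturity)
    return (spot * norm_cdf(d1)
            - strike * math.exp(-rate * maturity) * norm_cdf(d2))

price = black_scholes_call(spot=100, strike=100, maturity=1.0,
                           rate=0.0, sigma=0.2)
```

An FNN benchmarked against this formula is trained to approximate not only the pricing surface but also its partial derivatives (the Greeks), which is what makes it usable for hedging.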

  2. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements (NCRP, 2000, 2006). However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as opposed to the previous approach based on linear energy transfer, the development of a new solar particle event (SPE) model, and the updates to galactic cosmic ray (GCR) and shielding transport models. The newer epidemiological information includes updates to the cancer incidence rates from the life span study (LSS) of the Japanese atomic bomb survivors (Preston et al., 2007), transferred to the U.S. population and converted to cancer mortality rates from U.S. population statistics. In addition, the proposed model provides an alternative analysis applicable to lifetime never-smokers (NSs). Details of the uncertainty analysis in the model have also been updated and revised. NASA's proposed model and associated uncertainties are complex in their formulation and as such require a very clear and precise set of descriptions. The committee found the 2011 NASA report challenging to review largely because of the lack of clarity in the model descriptions and derivation of the various parameters used. The committee requested some clarifications from NASA throughout its review and was able to resolve many, but not all, of the ambiguities in the written description.

  3. Modeling the operational risk in Iranian commercial banks: case study of a private bank

    NASA Astrophysics Data System (ADS)

    Momen, Omid; Kimiagari, Alimohammad; Noorbakhsh, Eaman

    2012-08-01

    The Basel Committee on Banking Supervision of the Bank for International Settlements classifies banking risks into three main categories: credit risk, market risk, and operational risk. The focus of this study is the measurement of operational risk in Iranian banks. Issues arising when trying to implement operational risk models in Iran are therefore discussed, and some solutions are recommended. Moreover, all steps of operational risk measurement based on the Loss Distribution Approach, with Iran-specific modifications, are presented. We employed this approach to model the operational risk of an Iranian private bank. The results are quite reasonable given the scale of the bank and in comparison with the other risk categories.
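The Loss Distribution Approach compounds an annual event count with a severity distribution and reads off a high quantile of the aggregate loss. A Monte Carlo caricature with invented Poisson/lognormal parameters (the paper's Iran-specific modifications are not reproduced here):

```python
import math
import random

def poisson(rng, lam):
    """Poisson variate via Knuth's multiplication method."""
    limit, count, product = math.exp(-lam), 0, 1.0
    while True:
        product *= rng.random()
        if product < limit:
            return count
        count += 1

def annual_loss_quantile(lam, mu, sigma, q=0.999, n_sims=10000, seed=7):
    """q-quantile of the simulated annual aggregate operational loss:
    event frequency ~ Poisson(lam), severities ~ lognormal(mu, sigma)."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.lognormvariate(mu, sigma) for _ in range(poisson(rng, lam)))
        for _ in range(n_sims)
    )
    return totals[min(n_sims - 1, int(q * n_sims))]

# Basel-style capital charge at the 99.9% confidence level (toy parameters)
capital_charge = annual_loss_quantile(lam=25, mu=9.0, sigma=1.4)
```

In practice each business-line/event-type cell gets its own frequency and severity fit from internal loss data before the cells are aggregated.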

  4. Transmission of risk from parents with chronic pain to offspring: an integrative conceptual model.

    PubMed

    Stone, Amanda L; Wilson, Anna C

    2016-12-01

    Offspring of parents with chronic pain are at increased risk for pain and adverse mental and physical health outcomes (Higgins et al, 2015). Although the association between chronic pain in parents and offspring has been established, few studies have addressed why or how this relation occurs. Identifying mechanisms for the transmission of risk that leads to the development of chronic pain in offspring is important for developing preventive interventions targeted to decrease risk for chronic pain and related outcomes (eg, disability and internalizing symptoms). This review presents a conceptual model for the intergenerational transmission of chronic pain from parents to offspring with the goal of setting an agenda for future research and the development of preventive interventions. Our proposed model highlights 5 potential mechanisms for the relation between parental chronic pain and pediatric chronic pain and related adverse outcomes: (1) genetics, (2) alterations in early neurobiological development, (3) pain-specific social learning, (4) general parenting and family health, and (5) exposure to stressful environment. In addition, the model presents 3 potential moderators for the relation between parent and child chronic pain: (1) the presence of chronic pain in a second parent, (2) timing, course, and location of parental chronic pain, and (3) offspring's characteristics (ie, sex, developmental stage, race or ethnicity, and temperament). Such a framework highlights chronic pain as inherently familial and intergenerational, opening up avenues for new models of intervention and prevention that can be family centered and include at-risk children.

  5. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model

    PubMed Central

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-01-01

    The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazard model. Time-dependent receiver operating characteristic (ROC) curves and time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess calibration, adjusted for the competing risk of non-CAD death. The net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 bootstrap resamples. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs for men and women at t = 10 years were 0.841 [95% confidence interval (95% CI): 0.806–0.877] and 0.804 (95% CI: 0.768–0.839) in the Fine and Gray model, versus 0.784 (95% CI: 0.738–0.830) and 0.733 (95% CI: 0.692–0.775) in the Cox proportional hazard model. The competing risk model was significantly superior to the Cox proportional hazard model in discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, with good sensitivity and specificity. A sex-specific, multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify groups at high risk for CAD among the Chinese

  6. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model.

    PubMed

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-03-01

    The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazard model. Time-dependent receiver operating characteristic (ROC) curves and time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess calibration, adjusted for the competing risk of non-CAD death. The net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 bootstrap resamples. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs for men and women at t = 10 years were 0.841 [95% confidence interval (95% CI): 0.806-0.877] and 0.804 (95% CI: 0.768-0.839) in the Fine and Gray model, versus 0.784 (95% CI: 0.738-0.830) and 0.733 (95% CI: 0.692-0.775) in the Cox proportional hazard model. The competing risk model was significantly superior to the Cox proportional hazard model in discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, with good sensitivity and specificity. A sex-specific, multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify groups at high risk for CAD among the Chinese adults over 55
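
    The competing-risk adjustment in the records above can be illustrated with a nonparametric cumulative incidence estimator (Aalen-Johansen type): unlike 1 minus the Kaplan-Meier curve, it does not overestimate CAD risk when subjects can instead die of non-CAD causes. The data below are synthetic, and the implementation assumes distinct event times (no ties) for simplicity; it is a sketch of the idea, not the Fine and Gray regression model used in the study.

```python
# Nonparametric cumulative incidence for competing risks (no ties assumed).
# Event codes: 0 = censored, 1 = event of interest (CAD), 2 = competing
# event (non-CAD death). Data are synthetic.
import numpy as np

def cumulative_incidence(times, events, cause=1):
    """Return event times and the cumulative incidence of `cause` at each."""
    order = np.argsort(times)
    t = np.asarray(times, dtype=float)[order]
    e = np.asarray(events)[order]
    n = len(t)
    surv = 1.0          # overall event-free survival just before current time
    total = 0.0
    cif = np.empty(n)
    for i in range(n):
        at_risk = n - i
        if e[i] == cause:               # increment CIF by S(t-) * hazard of cause
            total += surv / at_risk
        if e[i] != 0:                   # any event reduces overall survival
            surv *= (at_risk - 1) / at_risk
        cif[i] = total
    return t, cif

times = [2, 5, 3, 8, 1, 7, 4, 6]
events = [1, 2, 1, 1, 2, 1, 2, 1]
_, cif_cad = cumulative_incidence(times, events, cause=1)
```

    With no censoring, the cause-specific cumulative incidences for all causes sum to 1 at the last event time, which is a quick sanity check on the estimator.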

  7. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    SciTech Connect

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Hu, Qinhong

    2015-09-28

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from the laboratory to the field. The additivity model assumes that reaction properties in a sediment, including surface area, reactive site concentration, reaction rate, and extent, can be predicted from the field-scale grain size distribution by linearly adding the reaction properties of individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants, using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of the multi-rate parameters for individual grain size fractions. These statistical properties were then used to analyze the additivity model's prediction of rate-limited U(VI) desorption in the composite sediment and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The results indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions has to be simulated in order to apply it. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated; it provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2-8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
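
    The additivity idea above can be sketched numerically: desorption is simulated separately for each grain-size fraction (here with a simple first-order kinetic stand-in for the multi-rate surface complexation model) and the composite response is the mass-fraction-weighted sum. All mass fractions, rate constants, and extents below are hypothetical.

```python
# Grain-size additivity sketch: simulate desorption per size fraction, then
# mass-weight the sum. First-order kinetics stand in for the study's multi-rate
# model; values are illustrative, not measured.
import numpy as np

mass_frac = np.array([0.15, 0.35, 0.30, 0.20])   # hypothetical mass fractions
k = np.array([0.50, 0.12, 0.03, 0.005])          # rate constants per fraction, 1/h
extent = np.array([5.0, 3.2, 1.4, 0.2])          # desorbable U(VI) per fraction, umol/g

def composite_desorption(t):
    """Additivity model: per-fraction desorption, then mass-weighted sum.
    Note the rate constants themselves are not directly additive; each
    fraction must be simulated before combining."""
    per_fraction = extent[:, None] * (1.0 - np.exp(-np.outer(k, t)))
    return mass_frac @ per_fraction

t = np.linspace(0.0, 48.0, 25)                   # hours
released = composite_desorption(t)
```

    The composite curve starts at zero, rises monotonically, and is bounded by the mass-weighted total desorbable U(VI), mirroring the study's point that fractions must be simulated individually before being combined.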

  8. A Spatio-temporal Model of African Animal Trypanosomosis Risk

    PubMed Central

    Dicko, Ahmadou H.; Percoma, Lassane; Sow, Adama; Adam, Yahaya; Mahama, Charles; Sidibé, Issa; Dayo, Guiguigbaza-Kossigan; Thévenon, Sophie; Fonta, William; Sanfo, Safietou; Djiteye, Aligui; Salou, Ernest; Djohan, Vincent; Cecchi, Giuliano; Bouyer, Jérémy

    2015-01-01

    Background African animal trypanosomosis (AAT) is a major constraint to the sustainable development of cattle farming in sub-Saharan Africa. The habitat of the tsetse fly vector is increasingly fragmented owing to demographic pressure and shifts in climate, which leads to heterogeneous risk of cyclical transmission both in space and time. In Burkina Faso and Ghana, the most important vectors are riverine species, namely Glossina palpalis gambiensis and G. tachinoides, which are more resilient to human-induced changes than the savannah and forest species. Although many authors have studied the distribution of AAT risk in space and time, spatio-temporal models allowing its prediction are lacking. Methodology/Principal Findings We used datasets generated by various projects, including two baseline surveys conducted in Burkina Faso and Ghana within PATTEC (Pan African Tsetse and Trypanosomosis Eradication Campaign) national initiatives. We computed the entomological inoculation rate (EIR), or tsetse challenge, using a range of environmental data. The tsetse apparent density and infection rate were estimated separately and subsequently combined to derive the EIR using a “one layer-one model” approach. The estimated EIR was then projected onto suitable habitat. This risk index was finally validated against data on bovine trypanosomosis. It allowed a good prediction of parasitological status (r2 = 67%) and showed a positive correlation, though with less predictive power, with serological status aggregated at the village level (r2 = 22%), but was not related to illness status (r2 = 2%). Conclusions/Significance The presented spatio-temporal model provides a fine-scale picture of the dynamics of AAT risk in sub-humid areas of West Africa. The estimated EIR was high in the proximity of rivers during the dry season and more widespread during the rainy season. The present analysis is a first step in a broader framework for an efficient risk management of climate
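
    The "one layer-one model" combination step described above amounts to multiplying separately modeled raster layers cell by cell: apparent tsetse density times infection rate, masked to suitable habitat. The tiny grids and values below are illustrative, not taken from the PATTEC surveys.

```python
# Sketch of combining separately modeled layers into a tsetse challenge (EIR)
# index per grid cell. All arrays are hypothetical 2x2 rasters.
import numpy as np

density = np.array([[12.0, 3.5],
                    [0.0,  7.2]])       # apparent density: flies/trap/day
infection = np.array([[0.04, 0.02],
                      [0.01, 0.06]])    # trypanosome infection rate per fly
suitable = np.array([[1, 1],
                     [0, 1]])           # habitat suitability mask (0/1)

# EIR proxy: infected-fly challenge, projected onto suitable habitat only
eir = density * infection * suitable
```

    Cells outside suitable habitat are zeroed out by the mask, which is what lets the index be "projected into suitable habitat" before validation against the parasitological data.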

  9. Peer Review of NRC Standardized Plant Analysis Risk Models

    SciTech Connect

    Anthony Koonce; James Knudsen; Robert Buell

    2011-03-01

    The Nuclear Regulatory Commission (NRC) Standardized Plant Analysis Risk (SPAR) models underwent a peer review using the ASME PRA standard (Addendum C) as endorsed by the NRC in Regulatory Guide (RG) 1.200. The review was performed by a mix of industry probabilistic risk analysis (PRA) experts and NRC PRA experts. Representative SPAR models, one PWR and one BWR, were reviewed against Capability Category I of the ASME PRA standard. Capability Category I was selected as the basis for review because of the specific uses/applications of the SPAR models. The BWR SPAR model was reviewed against 331 ASME PRA standard supporting requirements; however, based on the Capability Category I level of review and the absence of internal flooding and containment performance (LERF) logic, only 216 requirements were determined to be applicable. Based on the review, the BWR SPAR