Science.gov

Sample records for additive risk model

  1. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online. PMID:24729671
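
    A minimal Python sketch of the FGAM linear predictor may help fix ideas. It is an illustration only, not the authors' R implementation: the surface F is fixed to a toy function rather than estimated with penalized tensor-product B-splines, and all names and values are invented.

    ```python
    import numpy as np

    # Toy bivariate surface F(x, t). In the FGAM, F is unknown and estimated
    # with tensor-product B-splines under roughness penalties; it is fixed
    # here purely to illustrate the structure of the linear predictor.
    def F(x, t):
        return np.sin(np.pi * t) * x**2

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 101)          # grid on which X(t) is observed
    dt = t[1] - t[0]
    X = 0.1 * np.cumsum(rng.normal(size=(5, t.size)), axis=1)  # 5 toy curves

    # Linear predictor eta_i = integral over t of F{X_i(t), t},
    # approximated by a Riemann sum on the observation grid.
    eta = F(X, t[None, :]).sum(axis=1) * dt

    # For a binary response (e.g., case/control status), the mean response
    # is obtained through a logit link.
    mu = 1.0 / (1.0 + np.exp(-eta))
    print(np.round(mu, 3))
    ```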

  2. Refining Breast Cancer Risk Stratification: Additional Genes, Additional Information.

    PubMed

    Kurian, Allison W; Antoniou, Antonis C; Domchek, Susan M

    2016-01-01

    Recent advances in genomic technology have enabled far more rapid, less expensive sequencing of multiple genes than was possible only a few years ago. Advances in bioinformatics also facilitate the interpretation of large amounts of genomic data. New strategies for cancer genetic risk assessment include multiplex sequencing panels of 5 to more than 100 genes (in which rare mutations are often associated with at least two times the average risk of developing breast cancer) and panels of common single-nucleotide polymorphisms (SNPs), combinations of which are generally associated with more modest cancer risks (more than twofold). Although these new multiple-gene panel tests are used in oncology practice, questions remain about the clinical validity and the clinical utility of their results. To translate this increasingly complex genetic information for clinical use, cancer risk prediction tools are under development that consider the joint effects of all susceptibility genes, together with other established breast cancer risk factors. Risk-adapted screening and prevention protocols are underway, with ongoing refinement as genetic knowledge grows. Priority areas for future research include the clinical validity and clinical utility of emerging genetic tests; the accuracy of developing cancer risk prediction models; and the long-term outcomes of risk-adapted screening and prevention protocols, in terms of patients' experiences and survival. PMID:27249685

  3. Mixed additive models

    NASA Astrophysics Data System (ADS)

    Carvalho, Francisco; Covas, Ricardo

    2016-06-01

    We consider mixed models y = Σ_{i=0}^{w} X_i β_i with V(y) = Σ_{i=1}^{w} θ_i M_i, where M_i = X_i X_iᵀ, i = 1, ..., w, and μ = X_0 β_0. For these we estimate the variance components θ_1, ..., θ_w, as well as estimable vectors, through the decomposition of the initial model into sub-models y(h), h ∈ Γ, with V(y(h)) = γ(h) I_{g(h)}, h ∈ Γ. Moreover, we consider L extensions of these models, i.e., ẙ = Ly + ε, where L = D(1_{n_1}, ..., 1_{n_w}) and ε, independent of y, has null mean vector and variance-covariance matrix θ_{w+1} I_n, where n = Σ_{i=1}^{w} n_i.
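
    To make the covariance structure concrete, the following NumPy sketch builds V(y) = Σ θ_i M_i with M_i = X_i X_iᵀ and draws one realization of y. It illustrates the model form only, not the authors' estimation procedure; the designs, variance components, and the small stabilizing residual term are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, w = 12, 2

    # Fixed-effects part: mu = X0 @ beta0
    X0 = np.ones((n, 1))
    beta0 = np.array([5.0])

    # Random-effects design matrices X_1, ..., X_w and variance components
    Xs = [rng.integers(0, 2, size=(n, 3)).astype(float) for _ in range(w)]
    theta = np.array([2.0, 0.5])

    # V(y) = sum_i theta_i * M_i with M_i = X_i @ X_i.T; a small residual
    # term (invented here) keeps the covariance numerically positive definite.
    V = sum(th * Xi @ Xi.T for th, Xi in zip(theta, Xs)) + 0.1 * np.eye(n)

    y = rng.multivariate_normal(mean=X0 @ beta0, cov=V)
    print(V.shape, np.round(y[:4], 2))
    ```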

  4. Does early intensive multifactorial therapy reduce modelled cardiovascular risk in individuals with screen-detected diabetes? Results from the ADDITION-Europe cluster randomized trial

    PubMed Central

    Black, J A; Sharp, S J; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2014-01-01

    Aims Little is known about the long-term effects of intensive multifactorial treatment early in the diabetes disease trajectory. In the absence of long-term data on hard outcomes, we described change in 10-year modelled cardiovascular risk in the 5 years following diagnosis, and quantified the impact of intensive treatment on 10-year modelled cardiovascular risk at 5 years. Methods In a pragmatic, cluster-randomized, parallel-group trial in Denmark, the Netherlands and the UK, 3057 people with screen-detected Type 2 diabetes were randomized by general practice to receive (1) routine care of diabetes according to national guidelines (1379 patients) or (2) intensive multifactorial target-driven management (1678 patients). Ten-year modelled cardiovascular disease risk was calculated at baseline and 5 years using the UK Prospective Diabetes Study Risk Engine (version 3β). Results Among 2101 individuals with complete data at follow up (73.4%), 10-year modelled cardiovascular disease risk was 27.3% (sd 13.9) at baseline and 21.3% (sd 13.8) at 5-year follow-up (intensive treatment group difference –6.9, sd 9.0; routine care group difference –5.0, sd 12.2). Modelled 10-year cardiovascular disease risk was lower in the intensive treatment group compared with the routine care group at 5 years, after adjustment for baseline cardiovascular disease risk and clustering (–2.0; 95% CI –3.1 to –0.9). Conclusions Despite increasing age and diabetes duration, there was a decline in modelled cardiovascular disease risk in the 5 years following diagnosis. Compared with routine care, 10-year modelled cardiovascular disease risk was lower in the intensive treatment group at 5 years. Our results suggest that patients benefit from intensive treatment early in the diabetes disease trajectory, where the rate of cardiovascular disease risk progression may be slowed. PMID:24533664

  5. Biosafety Risk Assessment Model

    SciTech Connect

    2011-05-27

    Software tool based on a structured methodology for conducting laboratory biosafety risk assessments by biosafety experts. The software is based on an MCDA scheme and uses peer-reviewed criteria and weights; it was developed on Microsoft's .NET Framework. The methodology defines the likelihood and consequence of a laboratory exposure for thirteen unique scenarios and provides numerical relative risks for each relevant scenario. The software produces 2-D graphs reflecting the relative risk and a sensitivity analysis that highlights the overall importance of each factor. It works as a set of questions with absolute scales and uses a weighted additive model to calculate the likelihood and consequence.
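
    A weighted additive aggregation of this kind is easy to state in code. The sketch below is a generic illustration; the criteria, weights, and answer scores are invented placeholders, not the tool's peer-reviewed values.

    ```python
    # Generic weighted-additive MCDA scoring: each question is answered on an
    # absolute [0, 1] scale and contributes to the aggregate in proportion to
    # its weight. Criteria names and weights below are invented placeholders.
    weights = {"agent_hazard": 0.40, "aerosol_generation": 0.35, "containment": 0.25}
    answers = {"agent_hazard": 0.7, "aerosol_generation": 0.9, "containment": 0.3}

    def weighted_additive(weights, scores):
        assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights sum to one
        return sum(weights[k] * scores[k] for k in weights)

    likelihood = weighted_additive(weights, answers)
    print(f"relative likelihood score: {likelihood:.3f}")
    ```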

  6. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  7. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  8. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  9. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  10. 46 CFR 308.104 - Additional war risk insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk insurance. 308.104 Section 308.104 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Hull and Disbursements Insurance § 308.104 Additional war risk insurance. Owners or charterers...

  11. Radiation risk estimation models

    SciTech Connect

    Hoel, D.G.

    1987-11-01

    Cancer risk models and their relationship to ionizing radiation are discussed. There are many model assumptions and risk factors that have a large quantitative impact on the cancer risk estimates. Other health end points such as mental retardation may be an even more serious risk than cancer for those with in utero exposures. 8 references.

  12. Chemical Mixture Risk Assessment Additivity-Based Approaches

    EPA Science Inventory

    PowerPoint presentation includes additivity-based chemical mixture risk assessment methods. Basic concepts, theory and example calculations are included. Several slides discuss the use of "common adverse outcomes" in analyzing phthalate mixtures.

  13. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational process and material modeling of powder-bed additive manufacturing of IN 718. Goals: optimize material build parameters with reduced time and cost through modeling; increase understanding of build properties; increase reliability of builds; decrease time to adoption of the process for critical hardware; potentially decrease post-build heat treatments. Approach: conduct single-track and coupon builds at various build parameters; record build-parameter information and QM meltpool data; refine the Applied Optimization powder-bed AM process model using the data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run STK models using AO thermal profiles and report STK modeling results; validate the modeling with an additional build. Findings to date: photodiode intensity measurements are highly linear with power input; melt pool intensity is highly correlated with melt pool size; melt pool size and intensity increase with power. Applied Optimization will use these data to develop a powder-bed additive manufacturing process model.

  14. Melanoma Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Public risk perception of food additives and food scares. The case in Suzhou, China.

    PubMed

    Wu, Linhai; Zhong, Yingqi; Shan, Lijie; Qin, Wei

    2013-11-01

    This study examined the factors affecting public risk perception of food additive safety and possible resulting food scares, using a survey conducted in Suzhou, Jiangsu Province, China. The model was proposed based on literature relating to the role of risk perception and information perception in public purchase intention under food scares. Structural equation modeling (SEM) was used for data analysis. The results showed that attitude towards behavior, subjective norm and information perception exerted moderate to high effects on food scares, and these effects were also mediated by risk perceptions of additive safety. Significant covariance was observed between attitude towards behavior, subjective norm and information perception. Establishing an effective mechanism for food safety risk communication, releasing information on government supervision of food safety in a timely manner, curbing misleading media reports on public food safety risks, and enhancing public knowledge of food additives are key to the development and implementation of food safety risk management policies by the Chinese government. PMID:23831014

  16. Criteria for deviation from predictions by the concentration addition model.

    PubMed

    Takeshita, Jun-Ichi; Seki, Masanori; Kamo, Masashi

    2016-07-01

    Loewe's additivity (concentration addition) is a well-known model for predicting the toxic effects of chemical mixtures under the additivity assumption of toxicity. However, from the perspective of chemical risk assessment and/or management, it is important to identify chemicals whose toxicities are additive when present concurrently; that is, it should be established whether there are chemical mixtures to which the concentration addition predictive model can be applied. The objective of the present study was to develop criteria for judging test results that deviate from the predictions of the concentration addition mixture model. These criteria were based on the confidence interval of the concentration addition model's prediction and on estimates of the errors of the predicted concentration-effect curves from toxicity tests with single chemicals. A log-logit model with 2 parameters was assumed for the concentration-effect curve of each individual chemical. These parameters were determined by the maximum-likelihood method, and the criteria were defined using the variances and the covariance of the parameters. In addition, the criteria were applied to a toxicity test of a binary mixture of p-n-nonylphenol and p-n-octylphenol using the Japanese killifish, medaka (Oryzias latipes). The concentration addition model with the confidence interval was capable of predicting the test results at any level, and no reason for rejecting concentration addition was found. Environ Toxicol Chem 2016;35:1806-1814. © 2015 SETAC. PMID:26660330
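
    The concentration addition prediction itself is compact enough to sketch. Under Loewe additivity, the mixture effect x solves Σ_i c_i / EC_i(x) = 1, where EC_i(x) is the concentration of chemical i alone producing effect x. The sketch below uses the 2-parameter log-logit curve form named in the abstract, but all parameter values are invented, not the medaka study's estimates.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Two-parameter log-logit concentration-effect curve per chemical:
    #   effect(c) = 1 / (1 + exp(-(a + b * log10(c))))
    # Parameter values are invented for illustration.
    params = {"p-n-nonylphenol": (-4.0, 2.0), "p-n-octylphenol": (-5.0, 2.5)}

    def ec(effect, a, b):
        """Concentration of a single chemical producing the given effect."""
        logit = np.log(effect / (1.0 - effect))
        return 10.0 ** ((logit - a) / b)

    def ca_effect(conc):
        """Mixture effect predicted by concentration addition (Loewe):
        the effect x solving sum_i c_i / EC_i(x) = 1."""
        g = lambda x: sum(c / ec(x, a, b)
                          for c, (a, b) in zip(conc, params.values())) - 1.0
        return brentq(g, 1e-9, 1.0 - 1e-9)

    print(ca_effect([50.0, 20.0]))  # e.g., mixture concentrations in ug/L
    ```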

  17. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than from hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the lunar landing event, calculating the probability of loss of mission (LOM: an abort is required and is successful), loss of crew (LOC: the vehicle crashes or cannot reach orbit), and success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.

  18. Models for computing combat risk

    NASA Astrophysics Data System (ADS)

    Jelinek, Jan

    2002-07-01

    Combat always involves uncertainty, and uncertainty entails risk. To ensure that a combat task is prosecuted with the desired probability of success, the task commander has to devise an appropriate task force and then adjust it continuously in the course of battle. To do so, he has to evaluate how the probability of task success is related to the structure, capabilities and numerical strengths of the combatants. For this purpose, predictive models of combat dynamics, for combats in which the combatants fire asynchronously at random instants, are developed from first principles. Combats involving forces with both unlimited and limited ammunition supply are studied and modeled by stochastic Markov processes. In addition to the Markov models, another class of models, first proposed by Brown, was explored. These models compute directly the probability of win, in which we are primarily interested, without integrating the state probability equations. Experiments confirm that they produce exactly the same results at much lower computational cost.
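
    One classical Markov formulation consistent with this description is the stochastic square-law attrition model, in which each surviving shooter fires at exponential random times and the probability of win is computed directly by recursion rather than by simulation. The sketch below is such a formulation under assumed kill rates; it is illustrative, not the paper's exact model.

    ```python
    from functools import lru_cache

    # Stochastic square-law attrition: with m survivors on side A (rate alpha
    # per shooter) and n on side B (rate beta), the next casualty falls on
    # side B with probability m*alpha / (m*alpha + n*beta).
    alpha, beta = 1.0, 0.8  # assumed per-shooter kill rates (illustrative)

    @lru_cache(maxsize=None)
    def p_win_A(m, n):
        """Probability that side A annihilates side B from state (m, n)."""
        if n == 0:
            return 1.0
        if m == 0:
            return 0.0
        total = m * alpha + n * beta
        return (m * alpha / total) * p_win_A(m, n - 1) \
             + (n * beta / total) * p_win_A(m - 1, n)

    print(f"P(A wins | 10 vs 10) = {p_win_A(10, 10):.4f}")
    ```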

  19. Acute radiation risk models

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

    Biologically motivated mathematical models, which describe the dynamics of the major hematopoietic lineages (the thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems) in acutely/chronically irradiated humans, are developed. These models are implemented as systems of nonlinear differential equations whose variables and constant parameters have clear biological meaning. It is shown that the developed models are capable of reproducing clinical data on the dynamics of these systems in humans exposed to acute radiation as a result of incidents and accidents, as well as in humans exposed to low-level chronic radiation. Moreover, the averaged value of the "lethal" dose rates of chronic irradiation evaluated within the models of these four major hematopoietic lineages coincides with the real minimal dose rate of lethal chronic irradiation. The demonstrated ability of the models of the human thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems to predict the dynamical response of these systems to acute/chronic irradiation over wide ranges of doses and dose rates implies that these mathematical models form a universal tool for investigating and predicting the dynamics of the major human hematopoietic lineages under a wide variety of irradiation scenarios. In particular, these models could be applied to radiation risk assessment for the health of astronauts exposed to space radiation during long-term space missions, such as voyages to Mars or lunar colonies, as well as for the health of people exposed to acute/chronic irradiation due to environmental radiological events.

  1. Network Reconstruction Using Nonparametric Additive ODE Models

    PubMed Central

    Henderson, James; Michailidis, George

    2014-01-01

    Network representations of biological systems are widespread and reconstructing unknown networks from data is a focal problem for computational biologists. For example, the series of biochemical reactions in a metabolic pathway can be represented as a network, with nodes corresponding to metabolites and edges linking reactants to products. In a different context, regulatory relationships among genes are commonly represented as directed networks with edges pointing from influential genes to their targets. Reconstructing such networks from data is a challenging problem receiving much attention in the literature. There is a particular need for approaches tailored to time-series data and not reliant on direct intervention experiments, as the former are often more readily available. In this paper, we introduce an approach to reconstructing directed networks based on dynamic systems models. Our approach generalizes commonly used ODE models based on linear or nonlinear dynamics by extending the functional class for the functions involved from parametric to nonparametric models. Concomitantly, we limit the complexity by imposing an additive structure on the estimated slope functions. Thus the submodel associated with each node is a sum of univariate functions. These univariate component functions form the basis for a novel coupling metric that we define in order to quantify the strength of proposed relationships and hence rank potential edges. We show the utility of the method by reconstructing networks using simulated data from computational models for the glycolytic pathway of Lactococcus lactis and a gene network regulating the pluripotency of mouse embryonic stem cells. For purposes of comparison, we also assess reconstruction performance using gene networks from the DREAM challenges. We compare our method to those that similarly rely on dynamic systems models and use the results to attempt to disentangle the distinct roles of linearity, sparsity, and derivative

  2. Are major behavioral and sociodemographic risk factors for mortality additive or multiplicative in their effects?

    PubMed

    Mehta, Neil; Preston, Samuel

    2016-04-01

    All individuals are subject to multiple risk factors for mortality. In this paper, we consider the nature of interactions between certain major sociodemographic and behavioral risk factors associated with all-cause mortality in the United States. We develop the formal logic pertaining to two forms of interaction between risk factors, additive and multiplicative relations. We then consider the general circumstances in which additive or multiplicative relations might be expected. We argue that expectations about interactions among socio-demographic variables, and their relation to behavioral variables, have been stated in terms of additivity. However, the statistical models typically used to estimate the relation between risk factors and mortality assume that risk factors act multiplicatively. We examine empirically the nature of interactions among five major risk factors associated with all-cause mortality: smoking, obesity, race, sex, and educational attainment. Data were drawn from the cross-sectional NHANES III (1988-1994) and NHANES 1999-2010 surveys, linked to death records through December 31, 2011. Our analytic sample comprised 35,604 respondents and 5369 deaths. We find that obesity is additive with each of the remaining four variables. We speculate that its additivity is a reflection of the fact that obese status is generally achieved later in life. For all pairings of socio-demographic variables, risks are multiplicative. For survival chances, it is much more dangerous to be poorly educated if you are black or if you are male. And it is much riskier to be a male if you are black. These traits, established at birth or during childhood, literally result in deadly combinations. We conclude that the identification of interactions among risk factors can cast valuable light on the nature of the process being studied. It also has public health implications by identifying especially vulnerable groups and by properly identifying the proportion of deaths
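
    The arithmetic distinction between the two interaction benchmarks is worth making explicit. The sketch below contrasts the joint relative risk implied by multiplicative combination (the default in logistic and Cox regression) with additive combination of excess risks; all relative risk values are invented.

    ```python
    # Invented relative risks (RR) for two factors, each vs. baseline.
    rr_a, rr_b = 2.0, 1.5

    # Multiplicative benchmark (implicit in logistic/Cox models):
    rr_joint_mult = rr_a * rr_b                    # 3.0

    # Additive benchmark: excess risks add.
    rr_joint_add = 1 + (rr_a - 1) + (rr_b - 1)     # 2.5

    # Given an observed joint RR, the relative excess risk due to interaction
    # (RERI) measures departure from additivity; RERI > 0 is super-additive.
    rr_observed = 2.8                              # invented observation
    reri = rr_observed - rr_a - rr_b + 1
    print(rr_joint_mult, rr_joint_add, reri)
    ```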

  3. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-bed additive manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-schedule-lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for spaceflight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  4. Additional risk of end-of-the-pipe geoengineering technologies

    NASA Astrophysics Data System (ADS)

    Bohle, Martin

    2014-05-01

    qualitatively from the known successes. They do not tackle the initial cause, namely carbon-dioxide inputs that are too high; this is their additional specific risk. "The acceptability of geoengineering will be determined as much by social, legal and political issues as by scientific and technical factors", conclude Adam Corner and Nick Pidgeon (2010) in their review of the social and ethical implications of geoengineering the climate. It is in that context that the "end-of-the-pipe" character of most geoengineering technologies, and the additional specific risk this involves, should be debated: should these technologies be part of the toolbox to tackle anthropogenic climate change? Adam Corner and Nick Pidgeon 2010, Geoengineering the climate: The social and ethical implications, Environment Vol. 52.

  5. CREATION OF THE MODEL ADDITIONAL PROTOCOL

    SciTech Connect

    Houck, F.; Rosenthal, M.; Wulf, N.

    2010-05-25

    In 1991, the international nuclear nonproliferation community was dismayed to discover that the implementation of safeguards by the International Atomic Energy Agency (IAEA) under its NPT INFCIRC/153 safeguards agreement with Iraq had failed to detect Iraq's nuclear weapon program. It was now clear that ensuring that states were fulfilling their obligations under the NPT would require not just detecting diversion but also the ability to detect undeclared materials and activities. To achieve this, the IAEA initiated what would turn out to be a five-year effort to reappraise the NPT safeguards system. The effort engaged the IAEA and its Member States and led to agreement in 1997 on a new safeguards agreement, the Model Protocol Additional to the Agreement(s) between States and the International Atomic Energy Agency for the Application of Safeguards. The Model Protocol makes explicit that one IAEA goal is to provide assurance of the absence of undeclared nuclear material and activities. The Model Protocol requires an expanded declaration that identifies a State's nuclear potential, empowers the IAEA to raise questions about the correctness and completeness of the State's declaration, and, if needed, allows IAEA access to locations. The information required and the locations available for access are much broader than those provided for under INFCIRC/153. The negotiation was completed in quite a short time because it started with a relatively complete draft of an agreement prepared by the IAEA Secretariat. This paper describes how the Model Protocol was constructed and reviews key decisions that were made both during the five-year period and in the actual negotiation.

  6. Cardiovascular risk assessment: addition of CKD and race to the Framingham equation

    PubMed Central

    Drawz, Paul E.; Baraniuk, Sarah; Davis, Barry R.; Brown, Clinton D.; Colon, Pedro J.; Cuyjet, Aloysius B.; Dart, Richard A.; Graumlich, James F.; Henriquez, Mario A.; Moloo, Jamaluddin; Sakalayen, Mohammed G.; Simmons, Debra L.; Stanford, Carol; Sweeney, Mary Ellen; Wong, Nathan D.; Rahman, Mahboob

    2012-01-01

    Background/Aims The value of the Framingham equation in predicting cardiovascular risk in African Americans and patients with chronic kidney disease (CKD) is unclear. The purpose of the study was to evaluate whether the addition of CKD and race to the Framingham equation improves risk stratification in hypertensive patients. Methods Participants in the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) were studied. Those randomized to doxazosin, age greater than 74 years, and those with a history of coronary heart disease (CHD) were excluded. Two risk stratification models were developed using Cox proportional hazards models in a two-thirds developmental sample. The first model included the traditional Framingham risk factors. The second model included the traditional risk factors plus CKD, defined by eGFR categories, and stratification by race (Black vs. Non-Black). The primary outcome was a composite of fatal CHD, nonfatal MI, coronary revascularization, and hospitalized angina. Results There were a total of 19,811 eligible subjects. In the validation cohort, there was no difference in C-statistics between the Framingham equation and the ALLHAT model including CKD and race. This was consistent across subgroups by race and gender and among those with CKD. One exception was among Non-Black women where the C-statistic was higher for the Framingham equation (0.68 vs 0.65, P=0.02). Additionally, net reclassification improvement was not significant for any subgroup based on race and gender, ranging from −5.5% to 4.4%. Conclusion The addition of CKD status and stratification by race does not improve risk prediction in high-risk hypertensive patients. PMID:23194494

  7. Detecting contaminated birthdates using generalized additive models

    PubMed Central

    2014-01-01

    Background Erroneous patient birthdates are common in health databases. Detection of these errors usually involves manual verification, which can be resource intensive and impractical. By identifying a frequent manifestation of birthdate errors, this paper presents a principled and statistically driven procedure to identify erroneous patient birthdates. Results Generalized additive models (GAM) enabled explicit incorporation of known demographic trends and birth patterns. With false positive rates controlled, the method identified birthdate contamination with high accuracy. In the health data set used, of the 58 actual incorrect birthdates manually identified by the domain expert, the GAM-based method identified 51, with 8 false positives (resulting in a positive predictive value of 86.0% (51/59) and a false negative rate of 12.0% (7/58)). These results outperformed linear time-series models. Conclusions The GAM-based method is an effective approach to identify systemic birthdate errors, a common data quality issue in both clinical and administrative databases, with high accuracy. PMID:24923281
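
    A penalized regression spline, the building block of a GAM, is enough to convey the idea: fit a smooth seasonal trend to daily counts and flag days that deviate implausibly. The sketch below is a hand-rolled P-spline on simulated data, not the paper's model; the data, smoothing parameter, and flagging threshold are all invented.

    ```python
    import numpy as np
    from scipy.interpolate import BSpline

    rng = np.random.default_rng(2)
    x = np.arange(365.0)                          # day of year
    y = 100 + 20 * np.sin(2 * np.pi * x / 365) + rng.normal(0, 5, x.size)
    y[180] += 90                                  # implausible spike, e.g. a
                                                  # default-birthdate artifact

    # Cubic B-spline basis with a second-difference (P-spline) penalty.
    k, n_inner = 3, 20
    knots = np.r_[[x[0]] * k, np.linspace(x[0], x[-1], n_inner), [x[-1]] * k]
    B = BSpline.design_matrix(x, knots, k).toarray()
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)
    lam = 10.0                                    # fixed here; GAM software
    coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)  # selects it

    resid = y - B @ coef
    flags = np.abs(resid) > 4 * resid.std()
    print("suspect days:", x[flags])
    ```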

  8. Lightning Climatology with a Generalized Additive Model

    NASA Astrophysics Data System (ADS)

    Simon, Thorsten; Mayr, Georg; Umlauf, Nikolaus; Zeileis, Achim

    2016-04-01

    This study presents a lightning climatology on a 1 km x 1 km grid estimated via generalized additive models (GAM). GAMs provide a framework to account for non-linear effects in time and space and for non-linear spatio-temporal interaction terms simultaneously. The degrees of smoothness of the non-linear effects are selected automatically in our approach. Furthermore, the influence of topography is captured in the model by including a non-linear term. To illustrate our approach we use lightning data from the ALDIS network and a selected region in Southeastern Austria, where complex terrain extends from 200 to 3800 m asl and summertime lightning activity is high compared to other parts of the Eastern Alps. The temporal effect in the GAM shows a rapid increase in lightning activity in early July and a slow decay in activity afterwards. The estimated spatial effect is not very smooth and requires approximately 225 effective degrees of freedom. It reveals that lightning is more likely in the eastern and southern parts of the region of interest. This spatial effect only accounts for variability not already explained by the topography. The topography effect shows lightning to be more likely at higher altitudes. The effect describing the spatio-temporal interactions takes approximately 200 degrees of freedom and reveals local deviations from the climatology.

  9. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  10. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  11. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  12. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  13. 46 CFR 308.204 - Additional war risk protection and indemnity insurance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Additional war risk protection and indemnity insurance... OPERATIONS WAR RISK INSURANCE War Risk Protection and Indemnity Insurance § 308.204 Additional war risk protection and indemnity insurance. Owners or charterers may obtain, on an excess basis, additional war...

  14. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    ERIC Educational Resources Information Center

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  15. Mental Models of Security Risks

    NASA Astrophysics Data System (ADS)

    Asgharpour, Farzaneh; Liu, Debin; Camp, L. Jean

    In computer security, risk communication refers to informing computer users about the likelihood and magnitude of a threat. The efficacy of risk communication depends not only on the nature of the risk, but also on the alignment between the conceptual model embedded in the risk communication and the user's mental model of the risk. The gap between the mental models of security experts and non-experts can lead to ineffective risk communication. Our research shows that, for a variety of security risks, self-identified security experts and non-experts have different mental models. We propose that the design of risk communication methods should be based on the non-experts' mental models.

  16. Improving coeliac disease risk prediction by testing non-HLA variants additional to HLA variants

    PubMed Central

    Romanos, Jihane; Rosén, Anna; Kumar, Vinod; Trynka, Gosia; Franke, Lude; Szperl, Agata; Gutierrez-Achury, Javier; van Diemen, Cleo C; Kanninga, Roan; Jankipersadsing, Soesma A; Steck, Andrea; Eisenbarth, Georges; van Heel, David A; Cukrowska, Bozena; Bruno, Valentina; Mazzilli, Maria Cristina; Núñez, Concepcion; Bilbao, Jose Ramon; Mearin, M Luisa; Barisani, Donatella; Rewers, Marian; Norris, Jill M; Ivarsson, Anneli; Boezen, H Marieke; Liu, Edwin; Wijmenga, Cisca

    2014-01-01

    Background The majority of coeliac disease (CD) patients are not being properly diagnosed and therefore remain untreated, leading to a greater risk of developing CD-associated complications. The major genetic risk heterodimer, HLA-DQ2 and DQ8, is already used clinically to help exclude disease. However, approximately 40% of the population carry these alleles and the majority never develop CD. Objective We explored whether CD risk prediction can be improved by adding non-HLA susceptibility variants to common HLA testing. Design We developed an average weighted genetic risk score with 10, 26 and 57 single nucleotide polymorphisms (SNPs) in 2675 cases and 2815 controls and assessed the improvement in risk prediction provided by the non-HLA SNPs. Moreover, we assessed the transferability of the genetic risk model with 26 non-HLA variants to a nested case-control population (n=1709) and a prospective cohort (n=1245) and then tested how well this model predicted CD outcome for 985 independent individuals. Results Adding 57 non-HLA variants to HLA testing showed a statistically significant improvement compared to scores from models based on HLA only, HLA plus 10 SNPs and HLA plus 26 SNPs. With 57 non-HLA variants, the area under the receiver operator characteristic curve reached 0.854 compared to 0.823 for HLA only, and 11.1% of individuals were reclassified to a more accurate risk group. We show that the risk model with HLA plus 26 SNPs is useful in independent populations. Conclusions Predicting risk with 57 additional non-HLA variants improved the identification of potential CD patients. This demonstrates a possible role for combined HLA and non-HLA genetic testing in diagnostic work for CD. PMID:23704318
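
    The score construction itself is simple; what the paper adds is careful validation. Below is a minimal sketch of a weighted genetic risk score and its discrimination (AUC) on simulated data; the weights, genotypes, and the toy disease model are invented, not the study's SNP panel.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    n, n_snps = 2000, 57
    w = rng.normal(0.1, 0.05, n_snps)           # per-SNP log(OR) weights (toy)
    G = rng.integers(0, 3, size=(n, n_snps))    # risk-allele dosages 0/1/2

    # Average weighted genetic risk score, as described in the abstract.
    grs = (G * w).sum(axis=1) / w.sum()

    # Toy disease model so that the score carries signal.
    p = 1 / (1 + np.exp(-2.0 * (grs - grs.mean())))
    y = rng.binomial(1, p)

    # Discrimination, analogous to the paper's ROC comparison of models.
    print("AUC:", round(roc_auc_score(y, grs), 3))
    ```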

  17. Widespread non-additive and interaction effects within HLA loci modulate the risk of autoimmune diseases

    PubMed Central

    Lenz, Tobias L.; Deutsch, Aaron J.; Han, Buhm; Hu, Xinli; Okada, Yukinori; Eyre, Stephen; Knapp, Michael; Zhernakova, Alexandra; Huizinga, Tom W.J.; Abecasis, Goncalo; Becker, Jessica; Boeckxstaens, Guy E.; Chen, Wei-Min; Franke, Andre; Gladman, Dafna D.; Gockel, Ines; Gutierrez-Achury, Javier; Martin, Javier; Nair, Rajan P.; Nöthen, Markus M.; Onengut-Gumuscu, Suna; Rahman, Proton; Rantapää-Dahlqvist, Solbritt; Stuart, Philip E.; Tsoi, Lam C.; Van Heel, David A.; Worthington, Jane; Wouters, Mira M.; Klareskog, Lars; Elder, James T.; Gregersen, Peter K.; Schumacher, Johannes; Rich, Stephen S.; Wijmenga, Cisca; Sunyaev, Shamil R.; de Bakker, Paul I.W.; Raychaudhuri, Soumya

    2015-01-01

    Human leukocyte antigen (HLA) genes confer strong risk for autoimmune diseases on a log-additive scale. Here we speculated that differences in autoantigen binding repertoires between a heterozygote’s two expressed HLA variants may result in additional non-additive risk effects. We tested non-additive disease contributions of classical HLA alleles in patients and matched controls for five common autoimmune diseases: rheumatoid arthritis (RA, Ncases=5,337), type 1 diabetes (T1D, Ncases=5,567), psoriasis vulgaris (Ncases=3,089), idiopathic achalasia (Ncases=727), and celiac disease (Ncases=11,115). In four out of five diseases, we observed highly significant non-additive dominance effects (RA: P=2.5×10−12; T1D: P=2.4×10−10; psoriasis: P=5.9×10−6; celiac disease: P=1.2×10−87). In three of these diseases, the dominance effects were explained by interactions between specific classical HLA alleles (RA: P=1.8×10−3; T1D: P=8.6×10−27; celiac disease: P=6.0×10−100). These interactions generally increased disease risk and explained moderate but significant fractions of phenotypic variance (RA: 1.4%, T1D: 4.0%, and celiac disease: 4.1%, beyond a simple additive model). PMID:26258845
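
    The additive-versus-dominance distinction can be made concrete with one locus: code the genotype both as an allele dosage (additive term) and a heterozygote indicator (dominance term) and test the latter. The sketch below does this by logistic regression on simulated data; all effect sizes are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 5000
    dosage = rng.integers(0, 3, n).astype(float)   # copies of the risk allele
    het = (dosage == 1).astype(float)              # dominance deviation term

    # Simulate disease status with a genuine dominance effect (toy values).
    eta = -1.0 + 0.5 * dosage + 0.4 * het
    y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

    # Logistic model with additive plus dominance terms; a significant
    # dominance coefficient indicates departure from log-additivity.
    X = sm.add_constant(np.column_stack([dosage, het]))
    fit = sm.Logit(y, X).fit(disp=0)
    print(fit.params)      # const, additive, dominance
    print(fit.pvalues[2])  # p-value for the dominance term
    ```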

  18. New Zealand Seismic Risk Model

    NASA Astrophysics Data System (ADS)

    Molas, G.; Aslani, H.; Bryngelson, J.; Khan, Z.

    2006-12-01

    A seismic risk model for New Zealand has been developed to assist insurers and reinsurers in assessing the financial risk posed by earthquakes. This presentation summarizes the methodology and data within the model and includes a discussion of the key results from the hazard and risk perspectives. The earthquake risk-model framework has four components. First, the stochastic event set is determined, along with its associated event probabilities. A ground-motion model including geotechnical data is added to complete the seismic hazard model. To determine risk, regional building vulnerability curves and a financial model are incorporated. An insurer property exposure database was developed to determine the insured seismic risk in the country. Using this model, examination of the resulting hazard maps (200-, 475-, 1000- and 2500-year return periods) and of city-level hazard curves gives insight into the key drivers of risk across the region. Hazard de-aggregation allows for examination of the key drivers of risk in terms of seismic sources, event magnitudes and event types. Examination of loss costs for residential and commercial (short- and mid-rise) structures gives insight into the risk perspective for these various lines of business. Finally, incorporation of the insurer property exposure allows for an examination of the insured risk across the region and between exposure concentrations, including Wellington, Auckland and Christchurch.

  19. CHARACTERIZE AGGREGATE AND CUMULATIVE RISK TO MANAGE RISK TO HUMANS EXPOSED TO MULTIPLE ENVIRONMENTAL STRESSORS: DOSE ADDITIVITY FOR PESTICIDE MIXTURES: BEHAVIORAL AND NEUROCHEMICAL EFFECTS

    EPA Science Inventory

    SUMMARY: The Agency’s default assumption for the cumulative assessment of the risk of mixtures is additivity based on either single-chemical potency (dose addition) or single-chemical effects (effect addition). NTD is developing models to accurately predict effects of complex mix...

  20. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  1. Use of additive dentistry decreases risk by minimizing reduction.

    PubMed

    Palmer, K Michael

    2012-05-01

    This case required enhancement of esthetics and reduction of long-term risk of pathologic tooth wear and decay, as well as minimizing erosion caused by innate and environmental influences. The author weighed patient expectations, diet, treatment of teeth, and age to create a treatment plan that would conserve tooth structure while accomplishing the goals of the case. The patient's dentition was restored utilizing intact enamel, adhesive dentistry, and etchable ceramic materials that require less than 1 mm of occlusal reduction without a significant loss of strength. In this case, opening the vertical dimension of occlusion--which was done to increase the height of both the maxillary and mandibular arches, in keeping with the patient's esthetic desires--eliminated the need to remove excessive amounts of healthy tooth structure and facilitated treatment of the occlusal dysfunction. PMID:22616217

  2. Modeling techniques for gaining additional urban space

    NASA Astrophysics Data System (ADS)

    Thunig, Holger; Naumann, Simone; Siegmund, Alexander

    2009-09-01

    One of the major accompaniments of globalization is the rapid growth of urban areas. Urban sprawl is the main environmental problem affecting cities of different characteristics across continents. Various reasons for the increase in urban sprawl in the last 10 to 30 years have been proposed [1], and often depend on the socio-economic situation of the cities. The quantitative reduction and sustainable handling of land should be achieved by inner urban development instead of expanding urban regions. Following the principle "spare the urban fringe, develop the inner suburbs first" requires differentiated tools allowing for quantitative and qualitative appraisals of current building potentials. Using spatially high-resolution remote sensing data within an object-based approach enables the detection of potential areas, while GIS data provide information for the quantitative valuation. This paper presents techniques for modeling the urban environment and opportunities for utilizing the retrieved information for urban planners and their special needs.

  3. RISK 0301 - MOLECULAR MODELING

    EPA Science Inventory

    Risk assessment practices, in general, for a range of diseases now encourage the use of mechanistic data to enhance the ability to predict responses at low, environmental exposures. In particular, the pathway from normal biology to pathologic state can be described by a set of m...

  4. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Testicular Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize investment risk; its objective is to minimize the portfolio risk while achieving a target rate of return, with variance used as the risk measure. The purpose of this study is to compare the portfolio composition as well as the performance of the optimal portfolio from the mean-variance model and an equally weighted portfolio, in which the proportions invested in each asset are equal. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio differ. Moreover, the mean-variance optimal portfolio gives better performance, in that it achieves a higher performance ratio than the equally weighted portfolio.
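
    For the fully invested, unconstrained case, the mean-variance weights have a closed form via the KKT system; the sketch below compares them with equal weights. The expected returns, covariance matrix, and target return are invented inputs.

    ```python
    import numpy as np

    # Solve: min w' S w  s.t.  sum(w) = 1 and w' mu = r_target (no short-sale
    # constraint). All inputs below are invented for illustration.
    mu = np.array([0.08, 0.10, 0.12])
    S = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
    r_target = 0.10

    A = np.vstack([np.ones_like(mu), mu])          # constraint rows
    b = np.array([1.0, r_target])
    Sinv = np.linalg.inv(S)
    w = Sinv @ A.T @ np.linalg.solve(A @ Sinv @ A.T, b)   # KKT solution

    w_eq = np.full(mu.size, 1.0 / mu.size)         # equally weighted benchmark
    for name, wt in [("mean-variance", w), ("equal-weight", w_eq)]:
        ret, vol = wt @ mu, np.sqrt(wt @ S @ wt)
        print(f"{name}: return={ret:.4f} risk={vol:.4f} ratio={ret/vol:.3f}")
    ```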

  16. Estimation of radiation risk in presence of classical additive and Berkson multiplicative errors in exposure doses.

    PubMed

    Masiuk, S V; Shklyar, S V; Kukush, A G; Carroll, R J; Kovgan, L N; Likhtarov, I A

    2016-07-01

    In this paper, the influence of measurement errors in exposure doses in a regression model with binary response is studied. Recently, it has been recognized that uncertainty in exposure dose is characterized by errors of two types: classical additive errors and Berkson multiplicative errors. The combination of classical additive and Berkson multiplicative errors has not been considered in the literature previously. In a simulation study based on data from radio-epidemiological research of thyroid cancer in Ukraine caused by the Chornobyl accident, it is shown that ignoring measurement errors in doses leads to overestimation of background prevalence and underestimation of excess relative risk. In the work, several methods to reduce these biases are proposed. They are new regression calibration, an additive version of efficient SIMEX, and novel corrected score methods. PMID:26795191
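
    The direction of the biases is easy to reproduce in a toy simulation: generate a binary response from the true dose, then regress on an error-prone dose. The sketch below uses a plain logistic model (not the study's risk model or its correction methods); all parameter values are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 50_000
    beta0, beta1 = -3.0, 0.8     # invented background and dose effects

    def slope(dose_used, dose_true):
        """Simulate response from the true dose; regress on the dose used."""
        p = 1 / (1 + np.exp(-(beta0 + beta1 * dose_true)))
        y = rng.binomial(1, p)
        return sm.Logit(y, sm.add_constant(dose_used)).fit(disp=0).params[1]

    x = rng.lognormal(0.0, 0.7, n)                     # true doses
    print("no error:       ", round(slope(x, x), 3))

    # Classical additive error: observed = true + independent noise.
    print("classical error:", round(slope(x + rng.normal(0, 0.7, n), x), 3))

    # Berkson multiplicative error: the true dose scatters around the
    # assigned dose, x_true = w * u with u independent of w.
    w = rng.lognormal(0.0, 0.7, n)
    print("berkson error:  ", round(slope(w, w * rng.lognormal(0.0, 0.4, n)), 3))
    ```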

  17. Additive genetic risk from five serotonin system polymorphisms interacts with interpersonal stress to predict depression.

    PubMed

    Vrshek-Schallhorn, Suzanne; Stroud, Catherine B; Mineka, Susan; Zinbarg, Richard E; Adam, Emma K; Redei, Eva E; Hammen, Constance; Craske, Michelle G

    2015-11-01

    Behavioral genetic research supports polygenic models of depression in which many genetic variations each contribute a small amount of risk, and prevailing diathesis-stress models suggest gene-environment interactions (G×E). Multilocus profile scores of additive risk offer an approach that is consistent with polygenic models of depression risk. In a first demonstration of this approach in a G×E predicting depression, we created an additive multilocus profile score from 5 serotonin system polymorphisms (1 each in the genes HTR1A, HTR2A, HTR2C, and 2 in TPH2). Analyses focused on 2 forms of interpersonal stress as environmental risk factors. Using 5 years of longitudinal diagnostic and life stress interviews from 387 emerging young adults in the Youth Emotion Project, survival analyses show that this multilocus profile score interacts with major interpersonal stressful life events to predict major depressive episode onsets (hazard ratio [HR] = 1.815, p = .007). Simultaneously, there was a significant protective effect of the profile score without a recent event (HR = 0.83, p = .030). The G×E effect with interpersonal chronic stress was not significant (HR = 1.15, p = .165). Finally, effect sizes for genetic factors examined ignoring stress suggested such an approach could lead to overlooking or misinterpreting genetic effects. Both the G×E effect and the protective simple main effect were replicated in a sample of early adolescent girls (N = 105). We discuss potential benefits of the multilocus genetic profile score approach and caveats for future research. PMID:26595467

  18. The transport exponent in percolation models with additional loops

    NASA Astrophysics Data System (ADS)

    Babalievski, F.

    1994-10-01

    Several percolation models with additional loops were studied. The transport exponents for these models were estimated numerically by means of a transfer-matrix approach. It was found that the transport exponent has a drastically changed value for some of the models. This result supports some previous numerical studies on the vibrational properties of similar models (with additional loops).

  19. 42 CFR 417.442 - Risk HMO's and CMP's: Conditions for provision of additional benefits.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Risk HMO's and CMP's: Conditions for provision of... Medicare Contract § 417.442 Risk HMO's and CMP's: Conditions for provision of additional benefits. (a) General rule. Except as provided in paragraph (b) of this section, a risk HMO or CMP must, during...

  20. 42 CFR 417.442 - Risk HMO's and CMP's: Conditions for provision of additional benefits.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Risk HMO's and CMP's: Conditions for provision of... Medicare Contract § 417.442 Risk HMO's and CMP's: Conditions for provision of additional benefits. (a) General rule. Except as provided in paragraph (b) of this section, a risk HMO or CMP must, during...

  1. Multifractal Value at Risk model

    NASA Astrophysics Data System (ADS)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR). The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.

  2. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the research presentation will explore the appropriate usage of these standards. The paper will discuss standards approaches to conducting the risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach, a 3D visual presentation and development of the Information Security Model, will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and a defined risk and security space for modeling and measuring.

  3. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
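
    A toy Monte Carlo sketch of the buffering idea: loss of crew occurs only if the cabin degrades to a hazardous state before the crew can abort and return, rather than instantly at ECLSS failure. The distributions and times below are invented for illustration and bear no relation to the CEPR model's actual physics.

        import numpy as np

        rng = np.random.default_rng(2)
        trials = 100_000

        # Hypothetical: hours until the cabin becomes hazardous after an ECLSS
        # functional failure, and hours needed to abort from orbit and return
        t_hazard = rng.lognormal(mean=np.log(24.0), sigma=0.5, size=trials)
        t_return = rng.uniform(6.0, 18.0, size=trials)

        # Physics-of-failure view: LOC only if the environment degrades faster
        # than the crew can get home
        p_loc_buffered = np.mean(t_hazard < t_return)

        # Instant-LOC assumption: every crew-critical ECLSS failure counts as LOC
        p_loc_instant = 1.0

        print(f"P(LOC | ECLSS failure), buffered cabin: {p_loc_buffered:.3f}")
        print(f"P(LOC | ECLSS failure), instant-LOC assumption: {p_loc_instant:.3f}")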

  4. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  5. Evaluation of the Performance of Smoothing Functions in Generalized Additive Models for Spatial Variation in Disease

    PubMed Central

    Siangphoe, Umaporn; Wheeler, David C.

    2015-01-01

    Generalized additive models (GAMs) with bivariate smoothing functions have been applied to estimate spatial variation in risk for many types of cancers. Only a handful of studies have evaluated the performance of smoothing functions applied in GAMs with regard to different geographical areas of elevated risk and different risk levels. This study evaluates the ability of different smoothing functions to detect overall spatial variation of risk and elevated risk in diverse geographical areas at various risk levels in a simulation study. We created five scenarios with different true risk area shapes (circle, triangle, linear) in a square study region. We applied four different smoothing functions in the GAMs, including two types of thin plate regression splines (TPRS) and two versions of locally weighted scatterplot smoothing (loess). We tested the null hypothesis of constant risk and detected areas of elevated risk using analysis of deviance with permutation methods and assessed the performance of the smoothing methods based on the spatial detection rate, sensitivity, accuracy, precision, power, and false-positive rate. The results showed that all methods had a higher sensitivity and a consistently moderate-to-high accuracy rate when the true disease risk was higher. The models generally performed better in detecting elevated risk areas than detecting overall spatial variation. One of the loess methods had the highest precision in detecting overall spatial variation across scenarios and outperformed the other methods in detecting a linear elevated risk area. The TPRS methods outperformed loess in detecting elevated risk in two circular areas. PMID:25983545
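
    The TPRS and loess smoothers of the study are not reproduced here; as a minimal stand-in for bivariate smoothing of disease risk over space, the sketch below applies a Nadaraya-Watson kernel smoother to synthetic case-control data with one circular elevated-risk area. Locations, rates, and the bandwidth are all invented.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 2000

        # Hypothetical case-control data on a unit-square study region, with
        # elevated risk inside a circle centred at (0.3, 0.7)
        xy = rng.uniform(0, 1, size=(n, 2))
        in_circle = np.hypot(xy[:, 0] - 0.3, xy[:, 1] - 0.7) < 0.15
        p_case = np.where(in_circle, 0.6, 0.3)
        case = rng.random(n) < p_case

        def smooth_risk(grid_pt, xy, case, bandwidth=0.1):
            """Nadaraya-Watson estimate of P(case) at one location."""
            d2 = ((xy - grid_pt) ** 2).sum(axis=1)
            w = np.exp(-0.5 * d2 / bandwidth**2)    # Gaussian kernel weights
            return (w * case).sum() / w.sum()

        print(smooth_risk(np.array([0.3, 0.7]), xy, case))  # inside the risk area
        print(smooth_risk(np.array([0.8, 0.2]), xy, case))  # background risk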

  6. Common genetic variants, acting additively, are a major source of risk for autism

    PubMed Central

    2012-01-01

    Background Autism spectrum disorders (ASD) are early onset neurodevelopmental syndromes typified by impairments in reciprocal social interaction and communication, accompanied by restricted and repetitive behaviors. While rare and especially de novo genetic variation is known to affect liability, whether common genetic polymorphism plays a substantial role is an open question and the relative contribution of genes and environment is contentious. It is probable that the relative contributions of rare and common variation, as well as environment, differ between ASD families having only a single affected individual (simplex) versus multiplex families who have two or more affected individuals. Methods By using quantitative genetics techniques and the contrast of ASD subjects to controls, we estimate what portion of liability can be explained by additive genetic effects, known as narrow-sense heritability. We evaluate relatives of ASD subjects using the same methods to evaluate the assumptions of the additive model and partition families by simplex/multiplex status to determine how heritability changes with status. Results By analyzing common variation throughout the genome, we show that common genetic polymorphism exerts substantial additive genetic effects on ASD liability and that simplex/multiplex family status has an impact on the identified composition of that risk. As a fraction of the total variation in liability, the estimated narrow-sense heritability exceeds 60% for ASD individuals from multiplex families and is approximately 40% for simplex families. By analyzing parents, unaffected siblings and alleles not transmitted from parents to their affected children, we conclude that the data for simplex ASD families follow the expectation for additive models closely. The data from multiplex families deviate somewhat from an additive model, possibly due to parental assortative mating. Conclusions Our results, when viewed in the context of results from genome

  7. Fine-mapping in the MHC region accounts for 18% additional genetic risk for celiac disease

    PubMed Central

    Gutierrez-Achury, Javier; Zhernakova, Alexandra; Pulit, Sara L.; Trynka, Gosia; Hunt, Karen A.; Romanos, Jihane; Raychaudhuri, Soumya; van Heel, David A.; Wijmenga, Cisca; de Bakker, Paul I.W.

    2015-01-01

    Although dietary gluten is the trigger, celiac disease risk is strongly influenced by genetic variation in the major histocompatibility complex (MHC) region. We fine-mapped the MHC association signal to identify additional risk factors independent of the HLA-DQ alleles and observed five novel associations that account for 18% of the genetic risk. Together with the 57 known non-MHC loci, genetic variation can now explain up to 48% of celiac disease heritability. PMID:25894500

  8. Complex Modelling Scheme Of An Additive Manufacturing Centre

    NASA Astrophysics Data System (ADS)

    Popescu, Liliana Georgeta

    2015-09-01

    This paper presents a modelling scheme sustaining the development of an additive manufacturing research centre model and its processes. The modelling is performed using IDEF0, with the resulting model representing the basic processes required to develop such a centre in any university. While the activities presented in this study are those recommended in general, changes may occur in specific existing situations in a research centre.

  9. Concentration Addition, Independent Action and Generalized Concentration Addition Models for Mixture Effect Prediction of Sex Hormone Synthesis In Vitro

    PubMed Central

    Hadrup, Niels; Taxvig, Camilla; Pedersen, Mikael; Nellemann, Christine; Hass, Ulla; Vinggaard, Anne Marie

    2013-01-01

    Humans are concomitantly exposed to numerous chemicals. An infinite number of combinations and doses thereof can be imagined. For toxicological risk assessment, the mathematical prediction of mixture effects, using knowledge on single chemicals, is therefore desirable. We investigated pros and cons of the concentration addition (CA), independent action (IA) and generalized concentration addition (GCA) models. First, we measured effects of single chemicals and mixtures thereof on steroid synthesis in H295R cells. Then single-chemical data were applied to the models; predictions of mixture effects were calculated and compared to the experimental mixture data. Mixture 1 contained environmental chemicals adjusted in ratio according to human exposure levels. Mixture 2 was a potency-adjusted mixture containing five pesticides. Prediction of testosterone effects coincided with the experimental Mixture 1 data. In contrast, antagonism was observed for effects of Mixture 2 on this hormone. The mixtures contained chemicals exerting only limited maximal effects. This hampered prediction by the CA and IA models, whereas the GCA model could be used to predict a full dose-response curve. Regarding effects on progesterone and estradiol, some chemicals had stimulatory effects whereas others had inhibitory effects. The three models were not applicable in this situation and no predictions could be performed. Finally, the expected contributions of single chemicals to the mixture effects were calculated. Prochloraz was the predominant but not sole driver of the mixtures, suggesting that one chemical alone was not responsible for the mixture effects. In conclusion, the GCA model seemed to be superior to the CA and IA models for the prediction of testosterone effects. A situation with chemicals exerting opposing effects, for which the models could not be applied, was identified. In addition, the data indicate that in non-potency adjusted mixtures the effects cannot always be
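
    A minimal sketch of the CA and IA predictions for a mixture of full agonists with Hill slope 1 (an assumption made so that CA reduces to a ratio-weighted EC50; GCA, which extends CA to partial agonists with differing maximal effects, is not implemented here). All EC50s, mixture ratios, and concentrations are invented.

        import numpy as np

        def hill(c, ec50, top=1.0, n=1.0):
            """Hill dose-response: fractional effect at concentration c."""
            return top * c**n / (ec50**n + c**n)

        ec50 = np.array([1.0, 5.0, 20.0])    # hypothetical EC50s of 3 chemicals
        frac = np.array([0.2, 0.3, 0.5])     # mixture ratio (sums to 1)
        c_total = 4.0                        # total mixture concentration
        c_i = frac * c_total

        # Independent action: combine individual effects probabilistically
        e_ia = 1.0 - np.prod(1.0 - hill(c_i, ec50))

        # Concentration addition (equal Hill slopes): the mixture behaves like
        # a single chemical with a ratio-weighted EC50
        ec50_mix = 1.0 / np.sum(frac / ec50)
        e_ca = hill(c_total, ec50_mix)

        print(f"IA prediction: {e_ia:.3f}, CA prediction: {e_ca:.3f}")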

  10. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data. PMID:20002893
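
    A stripped-down, deterministic sketch of the defender-attacker-defender structure described above: the defender invests first, the intelligent adversary then chooses the attack that maximizes consequence, and the defender responds optimally after the attack. The paper's model is probabilistic with uncertain implementation, detection, and consequences; all options and numbers here are notional.

        from itertools import product

        # Hypothetical expected consequences for each combination of defender
        # investment, attacker agent choice, and defender post-attack response
        consequence = {
            ("no_detectors", "agent_A", "treat"):    900,
            ("no_detectors", "agent_A", "evacuate"): 700,
            ("no_detectors", "agent_B", "treat"):    400,
            ("no_detectors", "agent_B", "evacuate"): 600,
            ("detectors",    "agent_A", "treat"):    300,
            ("detectors",    "agent_A", "evacuate"): 350,
            ("detectors",    "agent_B", "treat"):    250,
            ("detectors",    "agent_B", "evacuate"): 200,
        }

        def risk(defense):
            """Attacker maximizes; for each attack the defender responds optimally."""
            return max(
                min(consequence[defense, attack, response]
                    for response in ("treat", "evacuate"))
                for attack in ("agent_A", "agent_B")
            )

        # First-stage defender decision: minimize the attacker-maximized consequence
        best = min(("no_detectors", "detectors"), key=risk)
        print(best, risk(best))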

  11. "The Dose Makes the Poison": Informing Consumers About the Scientific Risk Assessment of Food Additives.

    PubMed

    Bearth, Angela; Cousin, Marie-Eve; Siegrist, Michael

    2016-01-01

    Intensive risk assessment is required before the approval of food additives. During this process, based on the toxicological principle of "the dose makes the poison,ˮ maximum usage doses are assessed. However, most consumers are not aware of these efforts to ensure the safety of food additives and are therefore sceptical, even though food additives bring certain benefits to consumers. This study investigated the effect of a short video, which explains the scientific risk assessment and regulation of food additives, on consumers' perceptions and acceptance of food additives. The primary goal of this study was to inform consumers and enable them to construct their own risk-benefit assessment and make informed decisions about food additives. The secondary goal was to investigate whether people have different perceptions of food additives of artificial (i.e., aspartame) or natural origin (i.e., steviolglycoside). To attain these research goals, an online experiment was conducted on 185 Swiss consumers. Participants were randomly assigned to either the experimental group, which was shown a video about the scientific risk assessment of food additives, or the control group, which was shown a video about a topic irrelevant to the study. After watching the video, the respondents knew significantly more, expressed more positive thoughts and feelings, had less risk perception, and more acceptance than prior to watching the video. Thus, it appears that informing consumers about complex food safety topics, such as the scientific risk assessment of food additives, is possible, and using a carefully developed information video is a successful strategy for informing consumers. PMID:25951078

  12. How to interpret a small increase in AUC with an additional risk prediction marker: Decision analysis comes through

    PubMed Central

    Baker, Stuart G.; Schuit, Ewoud; Steyerberg, Ewout W.; Pencina, Michael J.; Vickers, Andrew; Moons, Karel G. M.; Mol, Ben W.J.; Lindeman, Karen S.

    2014-01-01

    An important question in the evaluation of an additional risk prediction marker is how to interpret a small increase in the area under the receiver operating characteristic curve (AUC). Many researchers believe that a change in AUC is a poor metric because it increases only slightly with the addition of a marker with a large odds ratio. Because it is not possible on purely statistical grounds to choose between the odds ratio and AUC, we invoke decision analysis, which incorporates costs and benefits. For example, a timely estimate of the risk of later non-elective operative delivery can help a woman in labor decide if she wants an early elective cesarean section to avoid greater complications from possible later non-elective operative delivery. A basic risk prediction model for later non-elective operative delivery involves only antepartum markers. Because adding intrapartum markers to this risk prediction model increases AUC by 0.02, we questioned whether this small improvement is worthwhile. A key decision-analytic quantity is the risk threshold, here the risk of later non-elective operative delivery at which a patient would be indifferent between an early elective cesarean section and usual care. For a range of risk thresholds, we found that an increase in the net benefit of risk prediction requires collecting intrapartum marker data on 68 to 124 women for every correct prediction of later non-elective operative delivery. Because data collection is non-invasive, this test tradeoff of 68 to 124 is clinically acceptable, indicating the value of adding intrapartum markers to the risk prediction model. PMID:24825728
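
    A sketch of the decision-analytic quantities involved, using the standard net-benefit formula (true positives minus threshold-weighted false positives, per patient) and the test tradeoff as the reciprocal of the gain in net benefit. The synthetic risks below are invented, so the resulting tradeoff will not match the paper's 68 to 124.

        import numpy as np

        def net_benefit(risk, event, threshold):
            """Net benefit of acting when predicted risk >= threshold."""
            act = risk >= threshold
            n = len(event)
            tp = np.sum(act & (event == 1)) / n
            fp = np.sum(act & (event == 0)) / n
            return tp - fp * threshold / (1 - threshold)

        rng = np.random.default_rng(4)
        n = 10_000
        event = rng.random(n) < 0.2            # later non-elective operative delivery
        # Hypothetical predicted risks from the two nested models; the model
        # with intrapartum markers tracks the outcome slightly better
        risk_ante  = np.clip(0.2 + 0.10 * (event - 0.2) + rng.normal(0, 0.08, n), 0.01, 0.99)
        risk_intra = np.clip(0.2 + 0.16 * (event - 0.2) + rng.normal(0, 0.08, n), 0.01, 0.99)

        pt = 0.3                               # risk threshold of indifference
        d_nb = net_benefit(risk_intra, event, pt) - net_benefit(risk_ante, event, pt)
        print(f"test tradeoff: marker data on ~{1/d_nb:.0f} women "
              f"per extra correct prediction")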

  13. Comprehensive European dietary exposure model (CEDEM) for food additives.

    PubMed

    Tennant, David R

    2016-05-01

    European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database. PMID:26987377
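
    The CEDEM algorithm itself is only summarized in the abstract; the sketch below shows the generic deterministic summation such intake models are built on, with notional food categories, consumption amounts, and use levels (none taken from EFSA data).

        import numpy as np

        # Hypothetical deterministic intake estimate for one additive:
        # exposure = sum over food categories of consumption x use level,
        # divided by body weight
        consumption_g_day = np.array([250.0, 40.0, 330.0])  # e.g. dairy, confectionery, drinks
        use_level_mg_kg   = np.array([100.0, 300.0, 150.0]) # maximum permitted levels
        body_weight_kg = 70.0

        exposure = np.sum(consumption_g_day / 1000.0 * use_level_mg_kg) / body_weight_kg
        print(f"{exposure:.2f} mg/kg bw/day")  # compared against the ADI in practice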

  14. Additive-multiplicative rates model for recurrent events.

    PubMed

    Liu, Yanyan; Wu, Yuanshan; Cai, Jianwen; Zhou, Haibo

    2010-07-01

    Recurrent events are frequently encountered in biomedical studies. Evaluating covariate effects on the marginal recurrent event rate is of practical interest. There are mainly two types of rate models for recurrent event data: the multiplicative rates model and the additive rates model. We consider a more flexible additive-multiplicative rates model for the analysis of recurrent event data, wherein some covariate effects are additive while others are multiplicative. We formulate estimating equations for estimating the regression parameters. The estimators for these regression parameters are shown to be consistent and asymptotically normally distributed under appropriate regularity conditions. Moreover, the estimator of the baseline mean function is proposed and its large sample properties are investigated. We also conduct simulation studies to evaluate the finite sample behavior of the proposed estimators. A medical study of patients with cystic fibrosis suffering from recurrent pulmonary exacerbations is provided as an illustration of the proposed method. PMID:20229314
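
    The abstract does not reproduce the model equation. In this literature, an additive-multiplicative rate specification for the recurrent-event counting process N(t) with covariate vectors Z_1 and Z_2 commonly takes a form like the following, where g and h are known link functions and \lambda_0 is an unspecified baseline rate; the authors' exact formulation may differ:

        E\{\mathrm{d}N(t) \mid Z_1, Z_2\}
            = \bigl[\, g\{\beta^{\top} Z_1(t)\} + \lambda_0(t)\, h\{\gamma^{\top} Z_2(t)\} \,\bigr]\,\mathrm{d}t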

  15. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  16. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification and Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data: Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  17. Transferability of regional permafrost disturbance susceptibility modelling using generalized linear and generalized additive models

    NASA Astrophysics Data System (ADS)

    Rudy, Ashley C. A.; Lamoureux, Scott F.; Treitz, Paul; van Ewijk, Karin Y.

    2016-07-01

    To effectively assess and mitigate risk of permafrost disturbance, disturbance-prone areas can be predicted through the application of susceptibility models. In this study we developed regional susceptibility models for permafrost disturbances using a field disturbance inventory to test the transferability of the model to a broader region in the Canadian High Arctic. Resulting maps of susceptibility were then used to explore the effect of terrain variables on the occurrence of disturbances within this region. To account for a large range of landscape characteristics, the model was calibrated using two locations: Sabine Peninsula, Melville Island, NU, and Fosheim Peninsula, Ellesmere Island, NU. Spatial patterns of disturbance were predicted with a generalized linear model (GLM) and generalized additive model (GAM), each calibrated using disturbed and randomized undisturbed locations from both locations and GIS-derived terrain predictor variables including slope, potential incoming solar radiation, wetness index, topographic position index, elevation, and distance to water. Each model was validated for the Sabine and Fosheim Peninsulas using independent data sets, while the transferability of the model to an independent site was assessed at Cape Bounty, Melville Island, NU. The regional GLM and GAM validated well for both calibration sites (Sabine and Fosheim), with areas under the receiver operating characteristic curve (AUROC) > 0.79. Both models were applied directly to Cape Bounty without calibration and validated equally, with AUROCs of 0.76; however, each model predicted disturbed and undisturbed samples differently. Additionally, the sensitivity of the transferred model was assessed using data sets with different sample sizes. Results indicated that models based on larger sample sizes transferred more consistently and captured the variability within the terrain attributes in the respective study areas. Terrain attributes associated with the initiation of disturbances were
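
    A minimal sketch of the calibrate-then-transfer workflow described above: fit a GLM (logistic regression) on pooled calibration sites, then score an independent site and report the transfer AUROC. The synthetic site generator, covariates, and coefficients are invented stand-ins for the study's terrain variables and field inventory.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(5)

        def site(n, shift):
            """Hypothetical terrain attributes and disturbance flags for one site."""
            X = rng.normal(shift, 1.0, size=(n, 5))
            logit = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.3 * X[:, 2]
            y = rng.random(n) < 1 / (1 + np.exp(-logit))
            return X, y.astype(int)

        X_cal, y_cal = site(800, 0.0)   # pooled calibration sites (Sabine + Fosheim)
        X_new, y_new = site(400, 0.3)   # independent transfer site (Cape Bounty)

        glm = LogisticRegression(max_iter=1000).fit(X_cal, y_cal)
        auc = roc_auc_score(y_new, glm.predict_proba(X_new)[:, 1])
        print(f"transfer AUROC: {auc:.2f}")  # the study reports ~0.76 on transfer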

  18. Accelerated Nucleation Due to Trace Additives: A Fluctuating Coverage Model.

    PubMed

    Poon, Geoffrey G; Peters, Baron

    2016-03-01

    We develop a theory to account for variable coverage of trace additives that lower the interfacial free energy for nucleation. The free energy landscape is based on classical nucleation theory and a statistical mechanical model for Langmuir adsorption. Dynamics are modeled by diffusion-controlled attachment and detachment of solutes and adsorbing additives. We compare the mechanism and kinetics from a mean-field model, a projection of the dynamics and free energy surface onto nucleus size, and a full two-dimensional calculation using Kramers-Langer-Berezhkovskii-Szabo theory. The fluctuating coverage model predicts rates more accurately than mean-field models of the same process primarily because it more accurately estimates the potential of mean force along the size coordinate. PMID:26485064
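
    A mean-field sketch of the two ingredients named in the abstract: a classical nucleation theory free-energy landscape whose surface term is reduced by Langmuir adsorption of the additive. The fluctuating-coverage treatment of the paper makes coverage a second dynamical coordinate; here it is held at its mean value, and all parameters are illustrative reduced units.

        import numpy as np

        dmu    = 0.3    # bulk driving force per solute attached
        gamma0 = 3.0    # bare surface free-energy coefficient
        kappa  = 0.5    # maximum fractional reduction of gamma at full coverage
        K      = 2.0    # Langmuir adsorption constant

        n = np.arange(1, 1000)

        def barrier(c_additive):
            theta = K * c_additive / (1 + K * c_additive)  # mean-field coverage
            gamma = gamma0 * (1 - kappa * theta)           # coverage-dependent term
            dG = -n * dmu + gamma * n ** (2.0 / 3.0)       # CNT landscape
            return dG.max(), n[np.argmax(dG)]              # barrier, critical size

        print("no additive  :", barrier(0.0))  # higher barrier, larger nucleus
        print("with additive:", barrier(1.0))  # adsorption lowers the barrier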

  19. Modeling the cardiovascular system using a nonlinear additive autoregressive model with exogenous input

    NASA Astrophysics Data System (ADS)

    Riedl, M.; Suhrbier, A.; Malberg, H.; Penzel, T.; Bretthauer, G.; Kurths, J.; Wessel, N.

    2008-07-01

    The parameters of heart rate variability and blood pressure variability have proved to be useful analytical tools in cardiovascular physics and medicine. Model-based analysis of these variabilities additionally leads to new prognostic information about mechanisms behind regulations in the cardiovascular system. In this paper, we analyze the complex interaction between heart rate, systolic blood pressure, and respiration by nonparametrically fitted nonlinear additive autoregressive models with external inputs. To this end, we consider measurements of healthy persons and patients suffering from obstructive sleep apnea syndrome (OSAS), with and without hypertension. It is shown that the proposed nonlinear models are capable of describing short-term fluctuations in heart rate as well as systolic blood pressure significantly better than similar linear ones, which confirms the assumption of nonlinearly controlled heart rate and blood pressure. Furthermore, the comparison of the nonlinear and linear approaches reveals that the heart rate and blood pressure variability in healthy subjects is caused by a higher level of noise as well as nonlinearity than in patients suffering from OSAS. The residual analysis points to a further source of heart rate and blood pressure variability in healthy subjects, in addition to heart rate, systolic blood pressure, and respiration. Comparison of the nonlinear models within and among the different groups of subjects suggests the ability to discriminate the cohorts, which could lead to a stratification of hypertension risk in OSAS patients.
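
    A crude sketch of the additive autoregressive-with-exogenous-input idea: each lagged input (past heart rate, past blood pressure, respiration) enters through its own nonlinear function, here approximated by a polynomial basis and least squares rather than the paper's nonparametric fits. The generating model and coefficients are invented.

        import numpy as np

        rng = np.random.default_rng(6)
        T = 500

        # Hypothetical beat-to-beat series: heart rate driven nonlinearly by its
        # own past, past systolic blood pressure, and respiration
        resp = np.sin(2 * np.pi * np.arange(T) / 15.0)
        sbp  = 120 + 5 * rng.standard_normal(T)
        hr = np.zeros(T)
        for t in range(1, T):
            hr[t] = (0.6 * hr[t-1] - 0.02 * hr[t-1]**2      # nonlinear self-feedback
                     + 0.05 * (sbp[t-1] - 120) + 0.8 * resp[t-1]
                     + 0.1 * rng.standard_normal())

        # Additive ARX fit: separate basis expansion per lagged input
        def basis(u):
            return np.column_stack([u, u**2, u**3])

        X = np.column_stack([basis(hr[:-1]), basis(sbp[:-1] - 120), basis(resp[:-1])])
        X = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(X, hr[1:], rcond=None)
        resid = hr[1:] - X @ coef
        print(f"residual std: {resid.std():.3f}")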

  20. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  1. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
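
    The core of crediting functional diversity is elementary redundancy arithmetic: a function remains available if any independent element providing it survives. A minimal sketch with invented availabilities for one hypothetical function:

        # Hypothetical element availabilities for one function (e.g. habitable
        # volume) provided independently by a habitat and a docked rover
        p_habitat, p_rover = 0.95, 0.90

        # Single-element view: credit only the primary provider
        a_single = p_habitat

        # Functional view: the function survives if any provider survives
        a_functional = 1 - (1 - p_habitat) * (1 - p_rover)

        print(f"single-element availability: {a_single:.4f}")
        print(f"with functional diversity credited: {a_functional:.4f}")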

  2. Risk assessment for the combinational effects of food color additives: neural progenitor cells and hippocampal neurogenesis.

    PubMed

    Park, Mikyung; Park, Hee Ra; Kim, So Jung; Kim, Min-Sun; Kong, Kyoung Hye; Kim, Hyun Soo; Gong, Ein Ji; Kim, Mi Eun; Kim, Hyung Sik; Lee, Byung Mu; Lee, Jaewon

    2009-01-01

    In 2006, the Korea Food and Drug Administration reported that combinations of dietary colors such as allura red AC (R40), tartrazine (Y4), sunset yellow FCF (Y5), amaranth (R2), and brilliant blue FCF (B1) are widely used in food manufacturing. Although individual tar food colors are controlled based on acceptable daily intake (ADI), there is no apparent information available on how combinations of these additives affect food safety. In the current study, the potencies of R40, Y4, Y5, R2, and B1, used singly and in combination, were examined with respect to neural progenitor cell (NPC) toxicity, a biomarker for the developmental stage, and neurogenesis, indicative of adult central nervous system (CNS) function. R40 and R2 reduced NPC proliferation and viability in mouse multipotent NPCs, a model of the developing CNS. Among several combinations tested in a mouse model, the combination of Y4 and B1 at 1000-fold the average daily intake in Korea significantly decreased the number of newly generated cells in the adult mouse hippocampus, indicating potent adverse actions on hippocampal neurogenesis. However, other combinations, including R40 and R2, did not affect adult hippocampal neurogenesis in the dentate gyrus. Evidence indicates that single and combination use of most tar food colors may be safe with respect to developmental NPC toxicity and adult hippocampal neurogenesis. However, the response to an excessively high-dose combination of Y4 and B1 suggests synergistic suppression of NPC proliferation in the adult hippocampus. The data indicate that combinations of tar colors may adversely affect both developmental and adult hippocampal neurogenesis; thus, further extensive studies are required to assess the safety of these additive combinations. PMID:20077213

  3. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skills. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and the lack of prediction capability. Therefore, the multiplicative error model is a better choice.
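
    A sketch of the contrast between the two error models on synthetic rain rates: if the measurement error is really multiplicative (measured = truth x exp(noise), as the letter concludes for satellite precipitation), then an additive reading of the residuals shows variance growing with rain rate, while the log-ratio errors are homoscedastic and split cleanly into systematic and random parts. All distributions and parameters are invented.

        import numpy as np

        rng = np.random.default_rng(7)
        truth = rng.gamma(shape=0.5, scale=8.0, size=5000)   # skewed "true" daily rain
        truth = truth[truth > 0.1]

        # Suppose the sensor error is multiplicative with a small positive bias
        measured = truth * np.exp(rng.normal(0.1, 0.3, truth.size))

        # Additive reading: residual spread grows with rain rate
        resid_add = measured - truth
        light = resid_add[truth < 5].std()
        heavy = resid_add[truth > 20].std()
        print(f"additive residual std, light vs heavy rain: {light:.2f} vs {heavy:.2f}")

        # Multiplicative reading: log-ratio errors are homoscedastic, and the
        # systematic part (the +0.1 bias) separates from the random part
        log_err = np.log(measured / truth)
        print(f"log-error mean {log_err.mean():+.3f} (systematic), "
              f"std {log_err.std():.3f} (random)")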

  4. Modelling the behaviour of additives in gun barrels

    NASA Astrophysics Data System (ADS)

    Rhodes, N.; Ludwig, J. C.

    1986-01-01

    A mathematical model which predicts the flow and heat transfer in a gun barrel is described. The model is transient and two-dimensional; equations are solved for the velocities and enthalpies of a gas phase, which arises from the combustion of the propellant and cartridge case, and of the particle additives released from the case, as well as for the volume fractions of the gas and particles. Closure of the equations is obtained using a two-equation turbulence model. Preliminary calculations are described in which the proportions of particle additives in the cartridge case were altered. The model gives a good prediction of the ballistic performance and the gas-to-wall heat transfer. However, the expected magnitude of the reduction in heat transfer when particles are present is not predicted. The predictions of gas flow invalidate some of the assumptions made regarding case and propellant behaviour during combustion, and further work is required to investigate these effects and other possible interactions, both chemical and physical, between gas and particles.

  5. An Additional Symmetry in the Weinberg-Salam Model

    SciTech Connect

    Bakker, B.L.G.; Veselov, A.I.; Zubkov, M.A.

    2005-06-01

    An additional Z₆ symmetry hidden in the fermion and Higgs sectors of the Standard Model has been found recently. It has a singular nature and is connected to the centers of the SU(3) and SU(2) subgroups of the gauge group. A lattice regularization of the Standard Model was constructed that possesses this symmetry. In this paper, we report our results on the numerical simulation of its electroweak sector.

  6. Modeling uranium transport in acidic contaminated groundwater with base addition

    SciTech Connect

    Zhang, Fan; Luo, Wensui; Parker, Jack C.; Brooks, Scott C; Watson, David B; Jardine, Philip; Gu, Baohua

    2011-01-01

    This study investigates reactive transport modeling in a column of uranium(VI)-contaminated sediments with base additions in the circulating influent. The groundwater and sediment exhibit oxic conditions with low pH, high concentrations of NO₃⁻, SO₄²⁻, U and various metal cations. Preliminary batch experiments indicate that additions of strong base induce rapid immobilization of U for this material. In the column experiment that is the focus of the present study, effluent groundwater was titrated with NaOH solution in an inflow reservoir before reinjection to gradually increase the solution pH in the column. An equilibrium hydrolysis, precipitation and ion exchange reaction model developed through simulation of the preliminary batch titration experiments predicted faster reduction of aqueous Al than observed in the column experiment. The model was therefore modified to consider reaction kinetics for the precipitation and dissolution processes which are the major mechanism for Al immobilization. The combined kinetic and equilibrium reaction model adequately described variations in pH, aqueous concentrations of metal cations (Al, Ca, Mg, Sr, Mn, Ni, Co), sulfate and U(VI). The experimental and modeling results indicate that U(VI) can be effectively sequestered with controlled base addition due to sorption by slowly precipitated Al with pH-dependent surface charge. The model may prove useful to predict field-scale U(VI) sequestration and remediation effectiveness.

  7. Using Set Model for Learning Addition of Integers

    ERIC Educational Resources Information Center

    Lestari, Umi Puji; Putri, Ratu Ilma Indra; Hartono, Yusuf

    2015-01-01

    This study aims to investigate how the set model can help students' understanding of addition of integers in fourth grade. The study was carried out with 23 students and a teacher of class IVC at SD Iba Palembang in January 2015. This study is design research that also promotes PMRI as the underlying design context and activity. Results showed that the…

  8. 42 CFR 417.442 - Risk HMO's and CMP's: Conditions for provision of additional benefits.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Risk HMO's and CMP's: Conditions for provision of additional benefits. 417.442 Section 417.442 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM (CONTINUED) HEALTH MAINTENANCE ORGANIZATIONS, COMPETITIVE MEDICAL PLANS,...

  9. 42 CFR 417.442 - Risk HMO's and CMP's: Conditions for provision of additional benefits.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Risk HMO's and CMP's: Conditions for provision of additional benefits. 417.442 Section 417.442 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM (CONTINUED) HEALTH MAINTENANCE ORGANIZATIONS, COMPETITIVE MEDICAL PLANS,...

  10. Estimating soil water retention using soil component additivity model

    NASA Astrophysics Data System (ADS)

    Zeiliger, A.; Ermolaeva, O.; Semenov, V.

    2009-04-01

    Soil water retention is a major soil hydraulic property that governs soil functioning in ecosystems and greatly affects soil management. Data on soil water retention are used in research and applications in hydrology, agronomy, meteorology, ecology, environmental protection, and many other soil-related fields. Soil organic matter content and composition affect both soil structure and adsorption properties; therefore water retention may be affected by changes in soil organic matter that occur because of both climate change and modifications of management practices. Thus, effects of organic matter on soil water retention should be understood and quantified. Measurement of soil water retention is relatively time-consuming and becomes impractical when soil hydrologic estimates are needed for large areas. One approach to soil water retention estimation from readily available data is based on the hypothesis that soil water retention may be estimated as an additive function obtained by summing up the water retention of pore subspaces associated with soil textural and/or structural components and organic matter. The additivity model was tested with 550 soil samples from the international database UNSODA and 2667 soil samples from the European database HYPRES, containing all textural soil classes of the USDA soil texture classification. The root mean square errors (RMSEs) of the volumetric water content estimates for UNSODA vary from 0.021 m³ m⁻³ for coarse sandy loam to 0.075 m³ m⁻³ for sandy clay. The obtained RMSEs are at the lower end of the RMSE range for regression-based water retention estimates found in the literature. Including retention estimates of organic matter significantly improved RMSEs. The attained accuracy warrants testing the additivity model with additional soil data and improving this model to accommodate various types of soil structure. Keywords: soil water retention, soil components, additive model, soil texture, organic matter.
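
    A sketch of the additivity hypothesis: the bulk retention curve is a mass-fraction-weighted sum of component retention curves, here represented by van Genuchten curves. All component parameters and fractions are invented for illustration, not fitted to UNSODA or HYPRES data.

        import numpy as np

        def van_genuchten(h, theta_r, theta_s, alpha, n):
            """Volumetric water content at suction h (cm)."""
            m = 1.0 - 1.0 / n
            return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

        # Hypothetical retention parameters for pore subspaces associated with
        # textural components and organic matter (illustrative values)
        components = {
            "sand":    dict(theta_r=0.03, theta_s=0.38, alpha=0.035, n=2.5),
            "silt":    dict(theta_r=0.05, theta_s=0.45, alpha=0.008, n=1.6),
            "clay":    dict(theta_r=0.10, theta_s=0.50, alpha=0.005, n=1.2),
            "organic": dict(theta_r=0.15, theta_s=0.70, alpha=0.010, n=1.3),
        }
        mass_fraction = {"sand": 0.55, "silt": 0.25, "clay": 0.15, "organic": 0.05}

        h = np.logspace(0, 4.2, 50)   # suctions from ~1 cm to ~15000 cm
        theta = sum(mass_fraction[k] * van_genuchten(h, **p)
                    for k, p in components.items())
        print(theta[[0, 24, -1]])     # near saturation, mid-range, near wilting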

  11. Cognitive vulnerability to depression: A comparison of the weakest link, keystone and additive models

    PubMed Central

    Reilly, Laura C.; Ciesla, Jeffrey A.; Felton, Julia W.; Weitlauf, Amy S.; Anderson, Nicholas L.

    2014-01-01

    Multiple theories of cognitive vulnerability to depression have been proposed, each focusing on different aspects of negative cognition and utilising different measures of risk. Various methods of integrating such multiple indices of risk have been examined in the literature, and each demonstrates some promise. Yet little is known about the interrelations among these methods, or their incremental validity in predicting changes in depression. The present study compared three integrative models of cognitive vulnerability: the additive, weakest link, and keystone models. Support was found for each model as predictive of depression over time, but only the weakest link model demonstrated incremental utility in predicting changes in depression beyond the other models. We also explored the correlations among these models and each model's unique contribution to predicting the onset of depressive symptoms. PMID:21851251
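
    A sketch of how the three composites might be operationalized from standardized vulnerability scores. The additive (mean) and weakest link (maximum) forms follow the usual definitions in this literature; the keystone composite is written here as one designated vulnerability amplifying the rest, which is an assumption made for illustration rather than the paper's exact operationalization.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 300

        # Hypothetical standardized scores on three cognitive vulnerability measures
        vuln = rng.standard_normal((n, 3))

        additive = vuln.mean(axis=1)        # additive model: overall mean level
        weakest_link = vuln.max(axis=1)     # weakest link: each person's most
                                            # elevated (riskiest) vulnerability
        # Keystone as an interaction-style composite (assumed form): column 0
        # is the designated keystone that amplifies the remaining measures
        keystone = vuln[:, 0] * vuln[:, 1:].mean(axis=1)

        for name, x in [("additive", additive), ("weakest link", weakest_link),
                        ("keystone", keystone)]:
            print(f"{name:>12}: mean {x.mean():+.2f}, sd {x.std():.2f}")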

  12. Possible effects of protracted exposure on the additivity of risks from space radiations

    NASA Technical Reports Server (NTRS)

    Curtis, S. B.

    1996-01-01

    Conventional radiation risk assessments are presently based on the additivity assumption. This assumption states that risks from individual components of a complex radiation field involving many different types of radiation can be added to yield the total risk of the complex radiation field. If the assumption is not correct, the summations and integrations performed to obtain the presently quoted risk estimates are not appropriate. This problem is particularly important in the area of space radiation risk evaluation because of the many different types of high- and low-LET radiation present in the galactic cosmic ray environment. For both low- and high-LET radiations at low enough dose rates, the present convention is that the additivity assumption holds. Mathematically, the total risk R_tot is assumed to be R_tot = Σ_i R_i, where the sum runs over the different types of radiation present. If the total dose (or fluence) from each component is such that the interaction between biological lesions caused by separate single-track traversals is negligible within a given cell, it is presently considered reasonable to accept the additivity assumption. However, when the exposure is protracted over many cell doubling times (as will be the case for extended missions to the Moon or Mars), the possibility exists that radiation effects that depend on multiple cellular events over a long time period, as is probably the case in radiation-induced carcinogenesis, may not be additive in the above sense, and the exposure interval may have to be included in the evaluation procedure. It is shown, however, that "inverse" dose-rate effects are not expected from intermediate-LET radiations arising from the galactic cosmic ray environment, due to the "sensitive-window-in-the-cell-cycle" hypothesis.

  13. Additions to Mars Global Reference Atmospheric Model (Mars-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.

    1991-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification has also been made which allows heights to go below local terrain height and return realistic pressure, density, and temperature (not the surface values) as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local valley areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch version of Mars-GRAM are presented.

  14. Additions to Mars Global Reference Atmospheric Model (MARS-GRAM)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; James, Bonnie

    1992-01-01

    Three major additions or modifications were made to the Mars Global Reference Atmospheric Model (Mars-GRAM): (1) in addition to the interactive version, a new batch version is available, which uses NAMELIST input, and is completely modular, so that the main driver program can easily be replaced by any calling program, such as a trajectory simulation program; (2) both the interactive and batch versions now have an option for treating local-scale dust storm effects, rather than just the global-scale dust storms in the original Mars-GRAM; and (3) the Zurek wave perturbation model was added, to simulate the effects of tidal perturbations, in addition to the random (mountain wave) perturbation model of the original Mars-GRAM. A minor modification was also made which allows heights to go 'below' local terrain height and return 'realistic' pressure, density, and temperature, and not the surface values, as returned by the original Mars-GRAM. This feature will allow simulations of Mars rover paths which might go into local 'valley' areas which lie below the average height of the present, rather coarse-resolution, terrain height data used by Mars-GRAM. Sample input and output of both the interactive and batch versions of Mars-GRAM are presented.

  15. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  16. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory.

    ERIC Educational Resources Information Center

    Karabatsos, George

    2001-01-01

    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)

  17. Modeling Research Project Risks with Fuzzy Maps

    ERIC Educational Resources Information Center

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risks evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taken into account their typical lifecycle. The model was applied to an e-testing research…

  18. Backbone additivity in the transfer model of protein solvation

    PubMed Central

    Hu, Char Y; Kokubo, Hironori; Lynch, Gillian C; Bolen, D Wayne; Pettitt, B Montgomery

    2010-01-01

    The transfer model implying additivity of the peptide backbone free energy of transfer is computationally tested. Molecular dynamics simulations are used to determine the extent of change in transfer free energy (ΔGtr) with increase in chain length of oligoglycine with capped end groups. Solvation free energies of oligoglycine models of varying lengths in pure water and in the osmolyte solutions, 2M urea and 2M trimethylamine N-oxide (TMAO), were calculated from simulations of all atom models, and ΔGtr values for peptide backbone transfer from water to the osmolyte solutions were determined. The results show that the transfer free energies change linearly with increasing chain length, demonstrating the principle of additivity, and provide values in reasonable agreement with experiment. The peptide backbone transfer free energy contributions arise from van der Waals interactions in the case of transfer to urea, but from electrostatics on transfer to TMAO solution. The simulations used here allow the solvation and transfer free energies of longer oligoglycine models to be evaluated than is currently possible through experiment. The computed transfer free energy of −54 cal/mol/M per peptide backbone unit compares quite favorably with the −43 cal/mol/M determined experimentally. PMID:20306490

  19. Backbone Additivity in the Transfer Model of Protein Solvation

    SciTech Connect

    Hu, Char Y.; Kokubo, Hironori; Lynch, Gillian C.; Bolen, D Wayne; Pettitt, Bernard M.

    2010-05-01

    The transfer model implying additivity of the peptide backbone free energy of transfer is computationally tested. Molecular dynamics simulations are used to determine the extent of change in transfer free energy (ΔGtr) with increase in chain length of oligoglycine with capped end groups. Solvation free energies of oligoglycine models of varying lengths in pure water and in the osmolyte solutions, 2M urea and 2M trimethylamine N-oxide (TMAO), were calculated from simulations of all atom models, and ΔGtr values for peptide backbone transfer from water to the osmolyte solutions were determined. The results show that the transfer free energies change linearly with increasing chain length, demonstrating the principle of additivity, and provide values in reasonable agreement with experiment. The peptide backbone transfer free energy contributions arise from van der Waals interactions in the case of transfer to urea, but from electrostatics on transfer to TMAO solution. The simulations used here allow the solvation and transfer free energies of longer oligoglycine models to be evaluated than is currently possible through experiment. The computed transfer free energy of −54 cal/mol/M per peptide backbone unit compares quite favorably with the −43 cal/mol/M determined experimentally.

  20. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods to provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate. Examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These types of additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions using numerical modeling as an effective prediction tool. The processing targets are multiple and span different spatial scales, and the associated physical phenomena are multiphysics and multiscale in nature. In this project, research work was developed to model AAM processes with a multiscale and multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested. In addition

  1. MTHFR homozygous mutation and additional risk factors for cerebral infarction in a large Italian family.

    PubMed

    Del Balzo, Francesca; Spalice, Alberto; Perla, Massimo; Properzi, Enrico; Iannetti, Paola

    2009-01-01

    Several cases with cerebral infarctions associated with the C677T mutation in the methylenetetrahydrofolate reductase gene (MTHFR) have been reported. Given the large number of asymptomatic individuals with the MTHFR mutation, additional risk factors for cerebral infarction should be considered. This study describes a large family with the MTHFR mutation and a combination of heterozygous factor V Leiden mutations and different additional exogenous and endogenous thrombogenic risk factors. Psychomotor retardation and a left fronto-insular infarct associated with the MTHFR mutation together with diminished factor VII and low level of protein C was documented in the first patient. In the second patient, generalized epilepsy and a malacic area in the right nucleus lenticularis was associated with the MTHFR mutation and a low level of protein C. In the third patient, right hemiparesis and a left fronto-temporal porencephalic cyst were documented, together with the MTHFR mutation and hyperhomocysteinemia. An extensive search of additional circumstantial and genetic thrombogenic risk factors should be useful for prophylaxis and prognosis of infants with cerebral infarctions associated with the MTHFR mutation and of their related family members. PMID:19068258

  2. Addition Table of Colours: Additive and Subtractive Mixtures Described Using a Single Reasoning Model

    ERIC Educational Resources Information Center

    Mota, A. R.; Lopes dos Santos, J. M. B.

    2014-01-01

    Students' misconceptions concerning colour phenomena and the apparent complexity of the underlying concepts--due to the different domains of knowledge involved--make its teaching very difficult. We have developed and tested a teaching device, the addition table of colours (ATC), that encompasses additive and subtractive mixtures in a single…

  3. Sensitivity analysis of geometric errors in additive manufacturing medical models.

    PubMed

    Pinto, Jose Miguel; Arrieta, Cristobal; Andia, Marcelo E; Uribe, Sergio; Ramos-Grez, Jorge; Vargas, Alex; Irarrazaval, Pablo; Tejos, Cristian

    2015-03-01

    Additive manufacturing (AM) models are used in medical applications for surgical planning, prosthesis design and teaching. For these applications, the accuracy of the AM models is essential. Unfortunately, this accuracy is compromised due to errors introduced by each of the building steps: image acquisition, segmentation, triangulation, printing and infiltration. However, the contribution of each step to the final error remains unclear. We performed a sensitivity analysis comparing errors obtained from a reference with those obtained modifying parameters of each building step. Our analysis considered global indexes to evaluate the overall error, and local indexes to show how this error is distributed along the surface of the AM models. Our results show that the standard building process tends to overestimate the AM models, i.e. models are larger than the original structures. They also show that the triangulation resolution and the segmentation threshold are critical factors, and that the errors are concentrated at regions with high curvatures. Errors could be reduced choosing better triangulation and printing resolutions, but there is an important need for modifying some of the standard building processes, particularly the segmentation algorithms. PMID:25649961

  4. Additive Manufacturing of Medical Models--Applications in Rhinology.

    PubMed

    Raos, Pero; Klapan, Ivica; Galeta, Tomislav

    2015-09-01

    In this paper we introduce guidelines and suggestions for the use of 3D image processing software in head pathology diagnostics, and procedures for obtaining physical medical models by additive manufacturing/rapid prototyping techniques, with the aim of improving surgical performance, maximizing its safety, and achieving faster postoperative recovery of patients. This approach has been verified in two case reports. In the treatment we used intelligent classifier schemes for abnormal patterns and a computer-based system for 3D-virtual and endoscopic assistance in rhinology, with appropriate visualization of anatomy and pathology within the nose, paranasal sinuses, and skull base area. PMID:26898064

  5. Genetic risks and genetic model specification.

    PubMed

    Zheng, Gang; Zhang, Wei; Xu, Jinfeng; Yuan, Ao; Li, Qizhai; Gastwirth, Joseph L

    2016-08-21

    Genetic risks and genetic models are often used in the design and analysis of genetic epidemiology studies. A genetic model is defined in terms of two genetic risk measures: the genotype relative risk and the odds ratio. The impact of the choice of risk measure on the resulting genetic models is studied in terms of the power to detect association and the deviation from Hardy-Weinberg equilibrium in cases. Extensive simulations demonstrate that the power of a study to detect association using the odds ratio is lower than that using the relative risk of the same value when other parameters are fixed. When Hardy-Weinberg equilibrium holds in the general population, the genetic model can be inferred from the deviation from Hardy-Weinberg equilibrium in cases only. Furthermore, this inference is more efficient than that based on the deviation from Hardy-Weinberg equilibrium in all cases and controls. PMID:27181372
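
    A sketch of the case-only idea: when the population is in Hardy-Weinberg equilibrium (HWE), the genotype frequencies among cases are the population frequencies reweighted by the genotype relative risks, and they remain in HWE only under a multiplicative model (r2 = r1²); dominant or recessive models leave a detectable deviation. Parameter values below are illustrative.

        import numpy as np

        def case_genotype_freqs(p, r1, r2):
            """Genotype frequencies among cases when the population is in HWE.

            p: risk-allele frequency; r1, r2: genotype relative risks of the
            heterozygote and risk homozygote versus the baseline homozygote."""
            w = np.array([p**2 * r2, 2 * p * (1 - p) * r1, (1 - p)**2 * 1.0])
            return w / w.sum()

        def hwe_deviation(f):
            """Observed risk-homozygote frequency minus its HWE expectation."""
            p_case = f[0] + f[1] / 2      # risk-allele frequency in cases
            return f[0] - p_case**2

        p = 0.3
        print(hwe_deviation(case_genotype_freqs(p, r1=2.0, r2=4.0)))  # multiplicative: ~0
        print(hwe_deviation(case_genotype_freqs(p, r1=2.0, r2=2.0)))  # dominant: != 0
        print(hwe_deviation(case_genotype_freqs(p, r1=1.0, r2=2.0)))  # recessive: != 0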

  6. Multiscale Modeling of Powder Bed–Based Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Markl, Matthias; Körner, Carolin

    2016-07-01

    Powder bed fusion processes are additive manufacturing technologies that are expected to induce the third industrial revolution. Components are built up layer by layer in a powder bed by selectively melting confined areas, according to sliced 3D model data. This technique allows for manufacturing of highly complex geometries hardly machinable with conventional technologies. However, the underlying physical phenomena are sparsely understood and difficult to observe during processing. Therefore, an intensive and expensive trial-and-error principle is applied to produce components with the desired dimensional accuracy, material characteristics, and mechanical properties. This review presents numerical modeling approaches on multiple length scales and timescales to describe different aspects of powder bed fusion processes. In combination with tailored experiments, the numerical results enlarge the process understanding of the underlying physical mechanisms and support the development of suitable process strategies and component topologies.

  7. Requirements based system risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements. We assume a complete list of the requirements, the relevant risk elements and their probability of occurrence and the quantified effect of the risk elements on the requirements. In order to assess the degree to which each requirement is satisfied, we need to determine the effect of the various risk elements on the requirement.

  8. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  9. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  10. Modeling extreme risks in ecology.

    PubMed

    Burgman, Mark; Franklin, James; Hayes, Keith R; Hosack, Geoffrey R; Peters, Gareth W; Sisson, Scott A

    2012-11-01

    Extreme risks in ecology are typified by circumstances in which data are sporadic or unavailable, understanding is poor, and decisions are urgently needed. Expert judgments are pervasive and disagreements among experts are commonplace. We outline approaches to evaluating extreme risks in ecology that rely on stochastic simulation, with a particular focus on methods to evaluate the likelihood of extinction and quasi-extinction of threatened species, and the likelihood of establishment and spread of invasive pests. We evaluate the importance of assumptions in these assessments and the potential of some new approaches to account for these uncertainties, including hierarchical estimation procedures and generalized extreme value distributions. We conclude by examining the treatment of consequences in extreme risk analysis in ecology and how expert judgment may better be harnessed to evaluate extreme risks. PMID:22817845
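
    One of the approaches flagged above, fitting a generalized extreme value (GEV) distribution to block maxima, takes only a few lines in SciPy. The data below are synthetic stand-ins; a real assessment would use observed extremes:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual maxima of some ecological quantity (hypothetical data).
annual_maxima = rng.gumbel(loc=10.0, scale=2.0, size=60)

# Fit the three GEV parameters by maximum likelihood.
shape, loc, scale = genextreme.fit(annual_maxima)

# Estimated probability that next year's maximum exceeds a critical level.
print(genextreme.sf(20.0, shape, loc=loc, scale=scale))
```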

  11. Risk assessment of additives through soft drinks and nectars consumption on Portuguese population: a 2010 survey.

    PubMed

    Diogo, Janina S G; Silva, Liliana S O; Pena, Angelina; Lino, Celeste M

    2013-12-01

    This study investigated whether the Portuguese population is at risk of exceeding ADI levels for acesulfame-K, saccharin, aspartame, caffeine, and benzoic and sorbic acid through an assessment of dietary intake of additives and specific consumption of four types of beverages: traditional soft drinks, soft drinks based on mineral waters, energetic drinks, and nectars. The highest mean levels of additives were found for caffeine in energetic drinks, 293.5 mg/L, for saccharin in traditional soft drinks, 18.4 mg/L, for acesulfame-K and aspartame in nectars, with 88.2 and 97.8 mg/L, respectively, for benzoic acid in traditional soft drinks, 125.7 mg/L, and for sorbic acid in soft drinks based on mineral water, 166.5 mg/L. Traditional soft drinks presented the highest acceptable daily intake percentages (ADIs%) for acesulfame-K, aspartame, and benzoic and sorbic acid, and a similar value for saccharin (0.5%), when compared with soft drinks based on mineral water: 0.7%, 0.08%, 7.3%, and 1.92% versus 0.2%, 0.053%, 0.6%, and 0.28%, respectively. However, for saccharin the highest percentage of ADI was obtained for nectars, 0.9%, in comparison with both types of soft drinks, 0.5%. Therefore, it is concluded that the Portuguese population is not at risk of exceeding the established ADIs for the studied additives. PMID:24036138

  12. [Critical of the additive model of the randomized controlled trial].

    PubMed

    Boussageon, Rémy; Gueyffier, François; Bejan-Angoulvant, Theodora; Felden-Dominiak, Géraldine

    2008-01-01

    Randomized, double-blind, placebo-controlled clinical trials are currently the best way to demonstrate the clinical effectiveness of drugs. Their methodology relies on the method of difference (John Stuart Mill), through which the observed difference between two groups (drug vs placebo) can be attributed to the pharmacological effect of the drug being tested. However, this additive model can be questioned in the event of statistical interactions between the pharmacological and the placebo effects. Evidence in different domains has shown that the placebo effect can influence the effect of the active ingredient. This article evaluates the methodological, clinical and epistemological consequences of this phenomenon. Topics treated include extrapolating results, accounting for heterogeneous results, demonstrating the existence of several factors in the placebo effect, the need to take these factors into account for given symptoms or pathologies, and the problem of the "specific" effect. PMID:18387273

  13. Relative Importance and Additive Effects of Maternal and Infant Risk Factors on Childhood Asthma

    PubMed Central

    Rosas-Salazar, Christian; James, Kristina; Escobar, Gabriel; Gebretsadik, Tebeb; Li, Sherian Xu; Carroll, Kecia N.; Walsh, Eileen; Mitchel, Edward; Das, Suman; Kumar, Rajesh; Yu, Chang; Dupont, William D.; Hartert, Tina V.

    2016-01-01

    Background Environmental exposures that occur in utero and during early life may contribute to the development of childhood asthma through alteration of the human microbiome. The objectives of this study were to estimate the cumulative effect and relative importance of environmental exposures on the risk of childhood asthma. Methods We conducted a population-based birth cohort study of mother-child dyads who were born between 1995 and 2003 and were continuously enrolled in the PRIMA (Prevention of RSV: Impact on Morbidity and Asthma) cohort. The individual and cumulative impact of maternal urinary tract infections (UTI) during pregnancy, maternal colonization with group B streptococcus (GBS), mode of delivery, infant antibiotic use, and older siblings at home, on the risk of childhood asthma were estimated using logistic regression. Dose-response effect on childhood asthma risk was assessed for continuous risk factors: number of maternal UTIs during pregnancy, courses of infant antibiotics, and number of older siblings at home. We further assessed and compared the relative importance of these exposures on the asthma risk. In a subgroup of children for whom maternal antibiotic use during pregnancy information was available, the effect of maternal antibiotic use on the risk of childhood asthma was estimated. Results Among 136,098 singleton birth infants, 13.29% developed asthma. In both univariate and adjusted analyses, maternal UTI during pregnancy (odds ratio [OR] 1.2, 95% confidence interval [CI] 1.18, 1.25; adjusted OR [AOR] 1.04, 95%CI 1.02, 1.07 for every additional UTI) and infant antibiotic use (OR 1.21, 95%CI 1.20, 1.22; AOR 1.16, 95%CI 1.15, 1.17 for every additional course) were associated with an increased risk of childhood asthma, while having older siblings at home (OR 0.92, 95%CI 0.91, 0.93; AOR 0.85, 95%CI 0.84, 0.87 for each additional sibling) was associated with a decreased risk of childhood asthma, in a dose-dependent manner. Compared with vaginal

  14. THE COMBINED CARCINOGENIC RISK FOR EXPOSURE TO MIXTURES OF DRINKING WATER DISINFECTION BY-PRODUCTS MAY BE LESS THAN ADDITIVE

    EPA Science Inventory

    The Combined Carcinogenic Risk for Exposure to Mixtures of Drinking Water Disinfection By-Products May be Less Than Additive

    Risk assessment methods for chemical mixtures in drinking water are not well defined. Current default risk assessments for chemical mixtures assume...

  15. A methodology for modeling regional terrorism risk.

    PubMed

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. PMID:21232062

  16. Statistical models for operational risk management

    NASA Astrophysics Data System (ADS)

    Cornalba, Chiara; Giudici, Paolo

    2004-07-01

    In recent years the Basel Committee on Banking Supervision has released recommendations for the correct determination of the risks to which a banking organization is subject. This concerns, in particular, operational risks: all those management events that may cause unexpected losses. It is necessary to develop valid statistical models to measure and, consequently, predict such operational risks. In the paper we present the possible approaches, including our own proposal, which is based on Bayesian networks.

  17. Addition of dipeptidyl peptidase-4 inhibitors to sulphonylureas and risk of hypoglycaemia: systematic review and meta-analysis

    PubMed Central

    Moore, Nicholas; Arnaud, Mickael; Robinson, Philip; Raschi, Emanuel; De Ponti, Fabrizio; Bégaud, Bernard; Pariente, Antoine

    2016-01-01

    Objective To quantify the risk of hypoglycaemia associated with the concomitant use of dipeptidyl peptidase-4 (DPP-4) inhibitors and sulphonylureas compared with placebo and sulphonylureas. Design Systematic review and meta-analysis. Data sources Medline, ISI Web of Science, SCOPUS, Cochrane Central Register of Controlled Trials, and clinicaltrial.gov were searched without any language restriction. Study selection Placebo controlled randomised trials comprising at least 50 participants with type 2 diabetes treated with DPP-4 inhibitors and sulphonylureas. Review methods Risk of bias in each trial was assessed using the Cochrane Collaboration tool. The risk ratio of hypoglycaemia with 95% confidence intervals was computed for each study and then pooled using fixed effect models (Mantel Haenszel method) or random effect models, when appropriate. Subgroup analyses were also performed (eg, dose of DPP-4 inhibitors). The number needed to harm (NNH) was estimated according to treatment duration. Results 10 studies were included, representing a total of 6546 participants (4020 received DPP-4 inhibitors plus sulphonylureas, 2526 placebo plus sulphonylureas). The risk ratio of hypoglycaemia was 1.52 (95% confidence interval 1.29 to 1.80). The NNH was 17 (95% confidence interval 11 to 30) for a treatment duration of six months or less, 15 (9 to 26) for 6.1 to 12 months, and 8 (5 to 15) for more than one year. In subgroup analysis, no difference was found between full and low doses of DPP-4 inhibitors: the risk ratio related to full dose DPP-4 inhibitors was 1.66 (1.34 to 2.06), whereas the increased risk ratio related to low dose DPP-4 inhibitors did not reach statistical significance (1.33, 0.92 to 1.94). Conclusions Addition of DPP-4 inhibitors to sulphonylurea to treat people with type 2 diabetes is associated with a 50% increased risk of hypoglycaemia and to one excess case of hypoglycaemia for every 17 patients in the first six months of treatment. This

  18. Quantitative risk modeling in aseptic manufacture.

    PubMed

    Tidswell, Edward C; McGarvey, Bernard

    2006-01-01

    Expedient risk assessment of aseptic manufacturing processes offers unique opportunities for improved and sustained assurance of product quality. Contemporary risk assessments applied to aseptic manufacturing processes, however, are commonly handicapped by assumptions and subjectivity, leading to inexactitude. Quantitative risk modeling augmented with Monte Carlo simulations represents a novel, innovative, and more efficient means of risk assessment. This technique relies upon fewer assumptions and removes subjectivity to more swiftly generate an improved, more realistic, quantitative estimate of risk. The fundamental steps and requirements for an assessment of the risk of bioburden ingress into aseptically manufactured products are described. A case study exemplifies how quantitative risk modeling and Monte Carlo simulations achieve a more rapid and improved determination of the risk of bioburden ingress during the aseptic filling of a parenteral product. Although application of quantitative risk modeling is described here purely for the purpose of process improvement, the technique has far wider relevance in the assisted disposition of batches, cleanroom management, and the utilization of real-time data from rapid microbial monitoring technologies. PMID:17089696
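
    Mechanically, such a model strings together step-level ingress probabilities (each uncertain, hence sampled) and propagates them by Monte Carlo. A toy NumPy sketch with invented distributions, not the authors' case-study parameters:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims = 100_000
vials_per_batch = 10_000

# Per-vial bioburden-ingress probabilities for three illustrative steps
# of an aseptic filling line; Beta distributions express our uncertainty.
p_stopper = rng.beta(2, 2_000_000, n_sims)   # stopper seating
p_transfer = rng.beta(1, 5_000_000, n_sims)  # open transfer
p_fill = rng.beta(3, 3_000_000, n_sims)      # filling needle

# P(a given vial is contaminated), assuming independent steps,
# then P(at least one contaminated vial in the batch).
p_vial = 1 - (1 - p_stopper) * (1 - p_transfer) * (1 - p_fill)
p_batch = 1 - (1 - p_vial) ** vials_per_batch

print(f"mean batch risk: {p_batch.mean():.2%}")
print(f"95th percentile: {np.quantile(p_batch, 0.95):.2%}")
```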

  19. PRISM: a planned risk information seeking model.

    PubMed

    Kahlor, LeeAnn

    2010-06-01

    Recent attention on health-related information seeking has focused primarily on information seeking within specific health and health risk contexts. This study attempts to shift some of that focus to individual-level variables that may impact health risk information seeking across contexts. To locate these variables, the researcher posits an integrated model, the Planned Risk Information Seeking Model (PRISM). The model, which treats risk information seeking as a deliberate (planned) behavior, maps variables found in the Theory of Planned Behavior (TPB; Ajzen, 1991) and the Risk Information Seeking and Processing Model (RISP; Griffin, Dunwoody, & Neuwirth, 1999), and posits linkages among those variables. This effort is further informed by Kahlor's (2007) Augmented RISP, the Theory of Motivated Information Management (Afifi & Weiner, 2004), the Comprehensive Model of Information Seeking (Johnson & Meischke, 1993), the Health Information Acquisition Model (Freimuth, Stein, & Kean, 1989), and the Extended Parallel Processing Model (Witte, 1998). The resulting integrated model accounted for 59% of the variance in health risk information-seeking intent and performed better than the TPB or the RISP alone. PMID:20512716

  20. Korean Risk Assessment Model for Breast Cancer Risk Prediction

    PubMed Central

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K.

    2013-01-01

    Purpose We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. Methods Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risk produced using the modified Gail model which applied Korean incidence and mortality data and the parameter estimators from the original Gail model with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and National Cancer Center (NCC) cohort. Results The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risk for the cases than for the controls (p<0.001 and <0.001, respectively). The observed incidence of breast cancer in the two cohorts was similar to the expected incidence from the KoBCRAT (KMCC, p = 0.880; NCC, p = 0.878). The AUC using the KoBCRAT was 0.61 for the KMCC and 0.89 for the NCC cohort. Conclusions Our findings suggest that the KoBCRAT is a better tool for predicting the risk of breast cancer in Korean women, especially urban women. PMID:24204664

  1. Modelling suicide risk in later life.

    PubMed

    Lo, C F; Kwok, Cordelia M Y

    2006-08-01

    Affective disorder is generally regarded as the prominent risk factor for suicide in the old age population. Despite the large number of empirical studies available in the literature, there has as yet been no attempt to model the dynamics of an individual's level of suicide risk theoretically. In particular, a dynamic model which can simulate the time evolution of an individual's level of risk for suicide and provide quantitative estimates of the probability of suicide risk is still lacking. In the present study we apply the contingent claims analysis of credit risk modelling in the field of quantitative finance to derive a theoretical stochastic model for estimation of the probability of suicide risk in later life in terms of a signalling index of affective disorder. Our model is based upon the hypothesis that the current state of affective disorder of a patient can be represented by a signalling index that exhibits stochastic movement, and that a threshold of affective disorder, which signifies the occurrence of suicide, exists. According to the numerical results, the implications of our model are consistent with the clinical findings. Hence, we believe that such a dynamic model will be essential to the design of effective suicide prevention strategies in the target population of older adults, especially in the primary care setting. PMID:16797044
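
    The borrowed machinery is a first-passage problem: a stochastic signalling index hitting a fixed threshold. A minimal Monte Carlo sketch of that idea, with illustrative drift, volatility, threshold and horizon (not the authors' calibration):

```python
import numpy as np

def first_passage_prob(x0=1.0, barrier=2.0, mu=0.05, sigma=0.3,
                       horizon=5.0, n_steps=500, n_paths=20_000, seed=0):
    """Estimate P(index reaches `barrier` within `horizon` years) for a
    geometric-Brownian-motion signalling index started at x0."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    x = np.full(n_paths, x0)
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        z = rng.standard_normal(n_paths)
        x *= np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
        hit |= x >= barrier   # record paths that have crossed the threshold
    return hit.mean()

print(first_passage_prob())
```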

  2. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  3. Major histocompatibility complex harbors widespread genotypic variability of non-additive risk of rheumatoid arthritis including epistasis.

    PubMed

    Wei, Wen-Hua; Bowes, John; Plant, Darren; Viatte, Sebastien; Yarwood, Annie; Massey, Jonathan; Worthington, Jane; Eyre, Stephen

    2016-01-01

    Genotypic variability based genome-wide association studies (vGWASs) can identify potentially interacting loci without prior knowledge of the interacting factors. We report a two-stage approach to make vGWAS applicable to diseases: firstly using a mixed model approach to partition dichotomous phenotypes into additive risk and non-additive environmental residuals on the liability scale and secondly using Levene's (Brown-Forsythe) test to assess equality of the residual variances across genotype groups per marker. We found widespread significant (P < 2.5e-05) vGWAS signals within the major histocompatibility complex (MHC) across all three study cohorts of rheumatoid arthritis. We further identified 10 epistatic interactions between the vGWAS signals independent of the MHC additive effects, each with a weak effect but jointly explaining 1.9% of phenotypic variance. PTPN22 was also identified in the discovery cohort but replicated in only one independent cohort. Combining the three cohorts boosted the power of vGWAS and additionally identified TYK2 and ANKRD55. Both PTPN22 and TYK2 had evidence of interactions reported elsewhere. We conclude that vGWAS can help discover interacting loci for complex diseases but requires large samples to find additional signals. PMID:27109064

  4. Major histocompatibility complex harbors widespread genotypic variability of non-additive risk of rheumatoid arthritis including epistasis

    PubMed Central

    Wei, Wen-Hua; Bowes, John; Plant, Darren; Viatte, Sebastien; Yarwood, Annie; Massey, Jonathan; Worthington, Jane; Eyre, Stephen

    2016-01-01

    Genotypic variability based genome-wide association studies (vGWASs) can identify potentially interacting loci without prior knowledge of the interacting factors. We report a two-stage approach to make vGWAS applicable to diseases: firstly using a mixed model approach to partition dichotomous phenotypes into additive risk and non-additive environmental residuals on the liability scale and secondly using Levene's (Brown-Forsythe) test to assess equality of the residual variances across genotype groups per marker. We found widespread significant (P < 2.5e-05) vGWAS signals within the major histocompatibility complex (MHC) across all three study cohorts of rheumatoid arthritis. We further identified 10 epistatic interactions between the vGWAS signals independent of the MHC additive effects, each with a weak effect but jointly explaining 1.9% of phenotypic variance. PTPN22 was also identified in the discovery cohort but replicated in only one independent cohort. Combining the three cohorts boosted the power of vGWAS and additionally identified TYK2 and ANKRD55. Both PTPN22 and TYK2 had evidence of interactions reported elsewhere. We conclude that vGWAS can help discover interacting loci for complex diseases but requires large samples to find additional signals. PMID:27109064
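
    Stage two of the approach, Levene's (Brown-Forsythe) test applied per marker to the stage-one residuals across genotype groups, is essentially a one-liner in SciPy; the residuals and genotypes below are simulated stand-ins:

```python
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(1)
n = 3000
genotype = rng.integers(0, 3, size=n)   # 0/1/2 copies of the risk allele
# Simulated stage-one residuals whose variance depends on genotype,
# i.e. a variance-heterogeneity signal at this marker.
resid = rng.normal(0.0, 1.0 + 0.15 * genotype, size=n)

groups = [resid[genotype == g] for g in (0, 1, 2)]
# center="median" gives the Brown-Forsythe variant of Levene's test.
stat, p = levene(*groups, center="median")
print(f"Brown-Forsythe W = {stat:.2f}, P = {p:.2e}")
```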

  5. Overpaying morbidity adjusters in risk equalization models.

    PubMed

    van Kleef, R C; van Vliet, R C J A; van de Ven, W P M M

    2016-09-01

    Most competitive social health insurance markets include risk equalization to compensate insurers for predictable variation in healthcare expenses. Empirical literature shows that even the most sophisticated risk equalization models--with advanced morbidity adjusters--substantially undercompensate insurers for selected groups of high-risk individuals. In the presence of premium regulation, these undercompensations confront consumers and insurers with incentives for risk selection. An important reason for the undercompensations is that not all information with predictive value regarding healthcare expenses is appropriate for use as a morbidity adjuster. To reduce incentives for selection regarding specific groups we propose overpaying morbidity adjusters that are already included in the risk equalization model. This paper illustrates the idea of overpaying by merging data on morbidity adjusters and healthcare expenses with health survey information, and derives three preconditions for meaningful application. Given these preconditions, we think overpaying may be particularly useful for pharmacy-based cost groups. PMID:26420555

  6. Risk Management in environmental geotechnical modelling

    NASA Astrophysics Data System (ADS)

    Tammemäe, Olavi; Torn, Hardi

    2008-01-01

    The objective of this article is to provide an overview of the basics of risk analysis, assessment and management, of the accompanying problems, and of the principles of risk management when drafting an environmental geotechnical model that enables the analysis of an entire territory or developed region as a whole. The goal is that the environmental impact remain within the limits of the criteria specified by the standards and be acceptable for human health and the environment. An essential part of the solution is an engineering-geological model based on risk analysis and on the assessment and forecasting of the mutual effects of the processes.

  7. Percolation model with an additional source of disorder.

    PubMed

    Kundu, Sumanta; Manna, S S

    2016-06-01

    The ranges of transmission of the mobiles in a mobile ad hoc network are not uniform in reality. They are affected by the temperature fluctuation in air, obstruction due to solid objects, even the humidity difference in the environment, etc. How the varying range of transmission of the individual active elements affects the global connectivity in the network may be an important practical question to ask. Here a model of percolation phenomena, with an additional source of disorder, is introduced for a theoretical understanding of this problem. As in ordinary percolation, sites of a square lattice are occupied randomly with probability p. Each occupied site is then assigned a circular disk of random value R for its radius. A bond is defined to be occupied if and only if the radii R_{1} and R_{2} of the disks centered at the ends satisfy a certain predefined condition. In a very general formulation, one divides the R_{1}-R_{2} plane into two regions by an arbitrary closed curve. One defines a point within one region as representing an occupied bond; otherwise it is a vacant bond. The study of three different rules under this general formulation indicates that the percolation threshold always varies continuously. This threshold has two limiting values, one is p_{c}(sq), the percolation threshold for the ordinary site percolation on the square lattice, and the other is unity. The approach of the percolation threshold to its limiting values is characterized by two exponents. In a special case, all lattice sites are occupied by disks of random radii R∈{0,R_{0}} and a percolation transition is observed with R_{0} as the control variable, similar to the site occupation probability. PMID:27415234

  8. Percolation model with an additional source of disorder

    NASA Astrophysics Data System (ADS)

    Kundu, Sumanta; Manna, S. S.

    2016-06-01

    The ranges of transmission of the mobiles in a mobile ad hoc network are not uniform in reality. They are affected by the temperature fluctuation in air, obstruction due to solid objects, even the humidity difference in the environment, etc. How the varying range of transmission of the individual active elements affects the global connectivity in the network may be an important practical question to ask. Here a model of percolation phenomena, with an additional source of disorder, is introduced for a theoretical understanding of this problem. As in ordinary percolation, sites of a square lattice are occupied randomly with probability p. Each occupied site is then assigned a circular disk of random value R for its radius. A bond is defined to be occupied if and only if the radii R1 and R2 of the disks centered at the ends satisfy a certain predefined condition. In a very general formulation, one divides the R1-R2 plane into two regions by an arbitrary closed curve. One defines a point within one region as representing an occupied bond; otherwise it is a vacant bond. The study of three different rules under this general formulation indicates that the percolation threshold always varies continuously. This threshold has two limiting values, one is pc(sq), the percolation threshold for the ordinary site percolation on the square lattice, and the other is unity. The approach of the percolation threshold to its limiting values is characterized by two exponents. In a special case, all lattice sites are occupied by disks of random radii R ∈ {0, R0} and a percolation transition is observed with R0 as the control variable, similar to the site occupation probability.
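
    A minimal simulation of this model is straightforward; the bond rule below (disks touch across the unit lattice spacing, R1 + R2 >= 1) is just one instance of the paper's general class of rules, and all parameters are illustrative:

```python
import numpy as np

def percolates(p=0.8, L=128, r0=1.0, seed=None):
    """One realization of the disk model: sites of an L x L square lattice
    are occupied with probability p, and each occupied site carries a
    random radius R ~ U(0, r0). Bond rule (illustrative): a nearest-
    neighbour bond is occupied iff R1 + R2 >= 1.
    Returns True if an occupied-bond cluster spans top to bottom."""
    rng = np.random.default_rng(seed)
    occ = rng.random((L, L)) < p
    R = rng.random((L, L)) * r0
    parent = np.arange(L * L)   # union-find forest over lattice sites

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    for x in range(L):
        for y in range(L):
            if not occ[x, y]:
                continue
            for x2, y2 in ((x + 1, y), (x, y + 1)):
                if x2 < L and y2 < L and occ[x2, y2] \
                        and R[x, y] + R[x2, y2] >= 1.0:
                    union(x * L + y, x2 * L + y2)

    top = {find(y) for y in range(L) if occ[0, y]}
    bottom = {find((L - 1) * L + y) for y in range(L) if occ[L - 1, y]}
    return bool(top & bottom)

# Spanning probability at this (p, r0), estimated over 50 realizations:
print(np.mean([percolates(seed=s) for s in range(50)]))
```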

  9. Predictive qualitative risk model of bovine rabies occurrence in Brazil.

    PubMed

    Braga, Guilherme Basseto; Grisi-Filho, José Henrique Hildebrand; Leite, Bruno Meireles; de Sena, Elaine Fátima; Dias, Ricardo Augusto

    2014-03-01

    Bovine rabies remains endemic in Brazil and despite control efforts, the disease still spreads insidiously. The main vector is the hematophagous bat, Desmodus rotundus. The present work aimed to create a predictive qualitative model of the occurrence of bovine rabies in each municipality in 25 of the 27 Brazilian States. The risk of rabies transmission from bats to bovine was estimated using decision-tree models of receptivity and vulnerability. Questionnaires, which covered a number of questions related to the surveillance of possible risk factors, such as bovine rabies outbreaks in the previous year, the presence of bat roosts, bat rabies positivity and environmental changes, were sent to the local veterinary units of each State. The bovine density and geomorphologic features were obtained from national databases and geographic information systems. Of the 433 municipalities presenting bovine rabies outbreaks in 2010, 178 (41.1%) were classified by the model as high risk, 212 (49.0%) were classified as moderate risk, 25 (5.8%) were classified as low risk, whereas the risk was undetermined in 18 municipalities (4.1%). An ROC curve was built to determine if the risk evaluated by the model could adequately discriminate between municipalities with and without rabies occurrence in future years. The risk estimator for the year 2011 was classified as moderately accurate. In the future, these models could allow the targeting of rabies control efforts, with the adoption of control measures directed to the higher risk locations and the optimization of the field veterinary staff deployment throughout the country. Additionally, efforts must be made to encourage continuous surveillance of risk factors. PMID:24433635

  10. Additive Synergism between Asbestos and Smoking in Lung Cancer Risk: A Systematic Review and Meta-Analysis

    PubMed Central

    Ngamwong, Yuwadee; Tangamornsuksan, Wimonchat; Lohitnavy, Ornrat; Chaiyakunapruk, Nathorn; Scholfield, C. Norman; Reisfeld, Brad; Lohitnavy, Manupat

    2015-01-01

    Smoking and asbestos exposure are important risks for lung cancer. Several epidemiological studies have linked asbestos exposure and smoking to lung cancer. To reconcile and unify these results, we conducted a systematic review and meta-analysis to provide a quantitative estimate of the increased risk of lung cancer associated with asbestos exposure and cigarette smoking and to classify their interaction. Five electronic databases were searched from inception to May 2015 for observational studies on lung cancer. All case-control (N = 10) and cohort (N = 7) studies were included in the analysis. We calculated pooled odds ratios (ORs), relative risks (RRs) and 95% confidence intervals (CIs) using a random-effects model for the association of asbestos exposure and smoking with lung cancer. Lung cancer patients who were not exposed to asbestos and non-smoking (A-S-) were compared with; (i) asbestos-exposed and non-smoking (A+S-), (ii) non-exposure to asbestos and smoking (A-S+), and (iii) asbestos-exposed and smoking (A+S+). Our meta-analysis showed a significant difference in risk of developing lung cancer among asbestos exposed and/or smoking workers compared to controls (A-S-), odds ratios for the disease (95% CI) were (i) 1.70 (A+S-, 1.31–2.21), (ii) 5.65; (A-S+, 3.38–9.42), (iii) 8.70 (A+S+, 5.8–13.10). The additive interaction index of synergy was 1.44 (95% CI = 1.26–1.77) and the multiplicative index = 0.91 (95% CI = 0.63–1.30). Corresponding values for cohort studies were 1.11 (95% CI = 1.00–1.28) and 0.51 (95% CI = 0.31–0.85). Our results point to an additive synergism for lung cancer with co-exposure of asbestos and cigarette smoking. Assessments of industrial health risks should take smoking and other airborne health risks into account when setting occupational asbestos exposure limits. PMID:26274395
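
    The reported interaction indices can be reproduced directly from the pooled odds ratios (standard definitions of the additive synergy index S and the multiplicative index M; the arithmetic is a consistency check, not a new result):

```latex
S = \frac{\mathrm{OR}_{A+S+} - 1}{(\mathrm{OR}_{A+S-} - 1) + (\mathrm{OR}_{A-S+} - 1)}
  = \frac{8.70 - 1}{(1.70 - 1) + (5.65 - 1)} = \frac{7.70}{5.35} \approx 1.44,
\qquad
M = \frac{\mathrm{OR}_{A+S+}}{\mathrm{OR}_{A+S-} \times \mathrm{OR}_{A-S+}}
  = \frac{8.70}{1.70 \times 5.65} \approx 0.91 .
```

    S > 1 indicates more-than-additive (synergistic) joint effects, while M near 1 is consistent with no interaction on the multiplicative scale.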

  11. Additive Genetic Variation in Schizophrenia Risk Is Shared by Populations of African and European Descent

    PubMed Central

    de Candia, Teresa R.; Lee, S. Hong; Yang, Jian; Browning, Brian L.; Gejman, Pablo V.; Levinson, Douglas F.; Mowry, Bryan J.; Hewitt, John K.; Goddard, Michael E.; O’Donovan, Michael C.; Purcell, Shaun M.; Posthuma, Danielle; Visscher, Peter M.; Wray, Naomi R.; Keller, Matthew C.

    2013-01-01

    To investigate the extent to which the proportion of schizophrenia’s additive genetic variation tagged by SNPs is shared by populations of European and African descent, we analyzed the largest combined African descent (AD [n = 2,142]) and European descent (ED [n = 4,990]) schizophrenia case-control genome-wide association study (GWAS) data set available, the Molecular Genetics of Schizophrenia (MGS) data set. We show how a method that uses genomic similarities at measured SNPs to estimate the additive genetic correlation (SNP correlation [SNP-rg]) between traits can be extended to estimate SNP-rg for the same trait between ethnicities. We estimated SNP-rg for schizophrenia between the MGS ED and MGS AD samples to be 0.66 (SE = 0.23), which is significantly different from 0 (p(SNP-rg = 0) = 0.0003), but not 1 (p(SNP-rg = 1) = 0.26). We re-estimated SNP-rg between an independent ED data set (n = 6,665) and the MGS AD sample to be 0.61 (SE = 0.21, p(SNP-rg = 0) = 0.0003, p(SNP-rg = 1) = 0.16). These results suggest that many schizophrenia risk alleles are shared across ethnic groups and predate African-European divergence. PMID:23954163

  12. Minimum risk route model for hazardous materials

    SciTech Connect

    Ashtakala, B.; Eno, L.A.

    1996-09-01

    The objective of this study is to determine, for a specific hazardous material (HM), the route between a point of origin and a point of destination (O-D pair) in the study area that minimizes risk to population and environment. The southern part of Quebec is chosen as the study area and major cities are identified as points of origin and destination on the highway network. Three classes of HM, namely chlorine gas, liquefied petroleum gas (LPG), and sulfuric acid, are chosen. A minimum risk route model has been developed to determine minimum risk routes between an O-D pair by using population or environment risk units as link impedances. The risk units for each link are computed by taking into consideration the probability of an accident and its consequences on that link. The results show that between the same O-D pair, the minimum risk routes are different for various HM. The concept of risk dissipation from origin to destination on the minimum risk route has been developed and dissipation curves are included.
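
    Once per-link risk units are assigned, finding the minimum risk route is an ordinary shortest-path computation with risk as the edge weight. A toy sketch (the network, risk units and node names are invented for illustration):

```python
import networkx as nx

# Toy highway network: edge weights are per-link risk units
# (probability of an accident on the link times its consequences).
G = nx.Graph()
G.add_weighted_edges_from(
    [("Origin", "A", 4.0), ("Origin", "B", 2.5),
     ("A", "B", 1.0), ("A", "Destination", 3.0),
     ("B", "Destination", 6.0)],
    weight="risk",
)

route = nx.shortest_path(G, "Origin", "Destination", weight="risk")
total = nx.shortest_path_length(G, "Origin", "Destination", weight="risk")
print(route, total)   # ['Origin', 'B', 'A', 'Destination'] 6.5
```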

  13. Development and validation of instantaneous risk model in nuclear power plant's risk monitor

    SciTech Connect

    Wang, J.; Li, Y.; Wang, F.; Wang, J.; Hu, L.

    2012-07-01

    The instantaneous risk model is the foundation of calculation and analysis in a risk monitor. This study focused on the development and validation of an instantaneous risk model. To this end, the principles for converting the baseline risk model into the instantaneous risk model were studied, and a modeling method for separated trains' failure modes was developed. The development and validation process in an operating nuclear power plant's risk monitor is also introduced. The correctness of the instantaneous risk model and the soundness of the conversion method were demonstrated by comparison with the results of the baseline risk model. (authors)

  14. The addition of whole soy flour to cafeteria diet reduces metabolic risk markers in wistar rats

    PubMed Central

    2013-01-01

    Background Soybean is termed a functional food because it contains bioactive compounds. However, its effects are not well known under unbalanced diet conditions. This work aimed to evaluate the effect of adding whole soy flour to a cafeteria diet on intestinal histomorphometry, metabolic risk and toxicity markers in rats. Methods In this study, 30 male adult Wistar rats were used, distributed among three groups (n = 10): AIN-93 M diet, cafeteria diet (CAF) and cafeteria diet with soy flour (CAFS), for 56 days. The following parameters were measured: food intake; weight gain; serum concentrations of triglycerides, total cholesterol, HDL-c, glycated hemoglobin (HbA1c), aspartate (AST) and alanine (ALT) aminotransferases and Thiobarbituric Acid Reactive Substances (TBARS); fecal moisture and lipid content; and liver weight and fat. The villous height, the crypt depth and the thickness of the duodenal and ileal circular and longitudinal muscle layers of the animals were also measured. Results There was a significant reduction in the food intake in the CAF group. The CAFS showed lower serum concentrations of triglycerides and serum TBARS and a lower percentage of hepatic fat, with a corresponding increase in thickness of the intestinal muscle layers. In the CAF group, an increase in the HbA1c, ALT, lipid excretion, liver TBARS and crypt depth was observed, associated with lower HDL-c and villous height. The addition of soy did not promote any change in these parameters. Conclusions The inclusion of whole soy flour in a high-fat diet may be helpful in reducing some markers of metabolic risk; however, more studies are required to clarify its effects on unbalanced diets. PMID:24119309

  15. Long range Ising model for credit risk modeling

    NASA Astrophysics Data System (ADS)

    Molins, Jordi; Vives, Eduard

    2005-07-01

    Within the framework of the maximum entropy principle, we show that the finite-size long-range Ising model is an adequate model for describing homogeneous credit portfolios and computing credit risk when default correlations between the borrowers are included. The exact analysis of the model suggests that when the correlation increases, a first-order-like transition may occur, inducing a sudden increase in risk.
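
    For orientation, a standard way to write the fully connected (long-range) finite-size Ising energy in this setting; the notation is generic rather than the authors' own:

```latex
H(s_1,\dots,s_N) = -\frac{J}{2N}\sum_{i \neq j} s_i s_j - h \sum_{i=1}^{N} s_i,
\qquad s_i \in \{-1,+1\},
```

    where a down spin marks a defaulted borrower, the field h sets the single-name default probability, the coupling J encodes the default correlation, and the portfolio loss is proportional to the number of down spins.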

  16. Conceptual models for cumulative risk assessment.

    PubMed

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects. PMID:22021317

  17. Using Generalized Additive Models to Analyze Single-Case Designs

    ERIC Educational Resources Information Center

    Shadish, William; Sullivan, Kristynn

    2013-01-01

    Many analyses for single-case designs (SCDs)--including nearly all the effect size indicators-- currently assume no trend in the data. Regression and multilevel models allow for trend, but usually test only linear trend and have no principled way of knowing if higher order trends should be represented in the model. This paper shows how Generalized…

  18. How much additional model complexity do the use of catchment hydrological signatures, additional data and expert knowledge warrant?

    NASA Astrophysics Data System (ADS)

    Hrachowitz, M.; Fovet, O.; RUIZ, L.; Gascuel-odoux, C.; Savenije, H.

    2013-12-01

    In the frequent absence of sufficient suitable data to constrain hydrological models, it is not uncommon to represent catchments at a range of scales by lumped model set-ups. Although process heterogeneity can average out on the catchment scale to generate simple catchment integrated responses whose general flow features can frequently be reproduced by lumped models, these models often fail to get details of the flow pattern as well as catchment internal dynamics, such as groundwater level changes, right to a sufficient degree, resulting in considerable predictive uncertainty. Traditionally, models are constrained by only one or two objective functions, which does not warrant more than a handful of parameters to avoid elevated predictive uncertainty, thereby preventing more complex model set-ups accounting for increased process heterogeneity. This study tested how much additional process heterogeneity is warranted in models when the calibration strategy is optimized using additional data and expert knowledge. Long-term time series of flow and groundwater levels for small nested experimental catchments in French Brittany with considerable differences in geology, topography and flow regime were used in this study to test which degree of model process heterogeneity is warranted with increased availability of information. In a first step, as a benchmark, the system was treated as one lumped entity and the model was trained based only on its ability to reproduce the hydrograph. Although it was found that the overall modelled flow generally reflects the observed flow response quite well, the internal system dynamics could not be reproduced. In further steps the complexity of this model was gradually increased, first by adding a separate riparian reservoir to the lumped set-up and then by a semi-distributed set-up, allowing for independent, parallel model structures, representing the contrasting nested catchments. Although calibration performance increased

  19. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that the 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, implant design and simulation, showing the potential of the FDM technology in the medical field. It will also improve communication between medical staff and patients. This result indicates that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models. PMID:26336695
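
    The CT-to-STL conversion step can be sketched generically in Python with marching cubes; this is a minimal stand-in for the dedicated imaging program used in the paper, assuming a CT volume already loaded as a 3D NumPy array in Hounsfield units:

```python
import numpy as np
from skimage import measure          # pip install scikit-image
from stl import mesh as stl_mesh     # pip install numpy-stl

def volume_to_stl(volume, iso_level, out_path="model.stl"):
    """Extract an iso-surface from a 3D CT volume and save it as STL."""
    # Marching cubes turns the voxel volume into a triangle mesh.
    verts, faces, _normals, _vals = measure.marching_cubes(volume, level=iso_level)
    solid = stl_mesh.Mesh(np.zeros(faces.shape[0], dtype=stl_mesh.Mesh.dtype))
    for i, face in enumerate(faces):
        solid.vectors[i] = verts[face]   # three xyz vertices per triangle
    solid.save(out_path)

# Example with a synthetic skull-like hollow sphere; ~300 HU is a common
# bone threshold (illustrative, not the paper's setting).
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
r = np.sqrt(x**2 + y**2 + z**2)
volume = np.where((r > 24) & (r < 30), 1000.0, 0.0)  # shell of bone-density voxels
volume_to_stl(volume, iso_level=300.0)
```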

  20. Suicide risk assessment and suicide risk formulation: essential components of the therapeutic risk management model.

    PubMed

    Silverman, Morton M

    2014-09-01

    Suicide and other suicidal behaviors are often associated with psychiatric disorders and dysfunctions. Therefore, psychiatrists have significant opportunities to identify at-risk individuals and offer treatment to reduce that risk. Although a suicide risk assessment is a core competency requirement, many clinical psychiatrists lack the requisite training and skills to appropriately assess for suicide risk. Moreover, the standard of care requires psychiatrists to foresee the possibility that a patient might engage in suicidal behavior, hence to conduct a suicide risk formulation sufficient to guide triage and treatment planning. Based on data collected via a suicide risk assessment, a suicide risk formulation is a process whereby the psychiatrist forms a judgment about a patient's foreseeable risk of suicidal behavior in order to inform triage decisions, safety and treatment planning, and interventions to reduce risk. This paper addresses the components of this process in the context of the model for therapeutic risk management of the suicidal patient developed at the Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education and Clinical Center by Wortzel et al. PMID:25226200

  1. Addition of Diffusion Model to MELCOR and Comparison with Data

    SciTech Connect

    Brad Merrill; Richard Moore; Chang Oh

    2004-06-01

    A chemical diffusion model was incorporated into the thermal-hydraulics package of the MELCOR Severe Accident code (Reference 1) for analyzing air ingress events for a very high temperature gas-cooled reactor.

  2. Modelling dissimilarity: generalizing ultrametric and additive tree representations.

    PubMed

    Hubert, L; Arabie, P; Meulman, J

    2001-05-01

    Methods for the hierarchical clustering of an object set produce a sequence of nested partitions such that object classes within each successive partition are constructed from the union of object classes present at the previous level. Any such sequence of nested partitions can in turn be characterized by an ultrametric. An approach to generalizing an (ultrametric) representation is proposed in which the nested character of the partition sequence is relaxed and replaced by the weaker requirement that the classes within each partition contain objects consecutive with respect to a fixed ordering of the objects. A method for fitting such a structure to a given proximity matrix is discussed, along with several alternative strategies for graphical representation. Using this same ultrametric extension, additive tree representations can also be generalized by replacing the ultrametric component in the decomposition of an additive tree (into an ultrametric and a centroid metric). A common numerical illustration is developed and maintained throughout the paper. PMID:11393895
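
    For reference, the two representations being generalized are characterized by well-known distance conditions (standard definitions, restated here for convenience):

```latex
\text{ultrametric:}\quad d(x,z) \le \max\{\, d(x,y),\; d(y,z) \,\} \quad \text{for all } x, y, z;
\qquad
\text{additive tree (four-point condition):}\quad
d(x,y) + d(z,w) \le \max\{\, d(x,z) + d(y,w),\; d(x,w) + d(y,z) \,\} \quad \text{for all } x, y, z, w.
```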

  3. Serum Total Bilirubin Levels Provide Additive Risk Information over the Framingham Risk Score for Identifying Asymptomatic Diabetic Patients at Higher Risk for Coronary Artery Stenosis

    PubMed Central

    Leem, Jaechan; Koh, Eun Hee; Jang, Jung Eun; Woo, Chang-Yun; Oh, Jin Sun; Lee, Min Jung; Kang, Joon-Won; Lim, Tae-Hwan; Jung, Chang Hee; Lee, Woo Je; Park, Joong-Yeol

    2015-01-01

    Background The diagnosis of coronary artery disease (CAD) is often delayed in patients with type 2 diabetes. Serum total bilirubin levels are inversely associated with CAD. However, no studies have examined whether this can be used as a biochemical marker for identifying asymptomatic diabetic patients at higher risk for having obstructive CAD. Methods We performed a cross-sectional study of 460 consecutive asymptomatic patients with type 2 diabetes. All patients underwent coronary computed tomographic angiography, and their serum total bilirubin levels were measured. Obstructive CAD was defined as ≥50% diameter stenosis in at least one coronary artery. Results Serum total bilirubin tertiles showed an inverse association with the prevalence of obstructive CAD. In multivariate logistic regression analysis, the odds ratio for the highest versus the lowest tertile of total bilirubin was 0.227 (95% confidence interval [CI], 0.130 to 0.398), and an increment of 1 µmol/L in serum total bilirubin level was associated with a 14.6% decrease in obstructive CAD after adjustment for confounding variables. Receiver operating characteristic curve analysis showed that the area under the curve for the Framingham Risk Score (FRS) plus serum total bilirubin level was 0.712 (95% CI, 0.668 to 0.753), which is significantly greater than that of the FRS alone (P=0.0028). Conclusion Serum total bilirubin level is inversely associated with obstructive CAD and provides additive risk information over the FRS. Serum total bilirubin may be helpful for identifying asymptomatic patients with type 2 diabetes who are at higher risk for obstructive CAD. PMID:26566499

  4. Additional Research Needs to Support the GENII Biosphere Models

    SciTech Connect

    Napier, Bruce A.; Snyder, Sandra F.; Arimescu, Carmen

    2013-11-30

    In the course of evaluating the current parameter needs for the GENII Version 2 code (Snyder et al. 2013), areas of possible improvement for both the data and the underlying models have been identified. As the data review was implemented, PNNL staff identified areas where the models can be improved, both to accommodate the locally significant pathways identified and to incorporate newer models. The areas are general data needs for the existing models and improved formulations for the pathway models. It is recommended that priorities be set by NRC staff to guide selection of the most useful improvements in a cost-effective manner. Suggestions are made based on relatively easy and inexpensive changes, and on longer-term, more costly studies. In the short term, there are several improved model formulations that could be applied to the GENII suite of codes to make them more generally useful:
    • Implementation of the separation of the translocation and weathering processes
    • Implementation of an improved model for carbon-14 from non-atmospheric sources
    • Implementation of radon exposure pathway models
    • Development of a KML processor for the output report generator module, so that data calculated on a grid can be superimposed upon digital maps for easier presentation and display
    • Implementation of marine mammal models (manatees, seals, walrus, whales, etc.)
    Data needs in the longer term require extensive (and potentially expensive) research. Before picking any one radionuclide or food type, NRC staff should perform an in-house review of current and anticipated environmental analyses to select “dominant” radionuclides of interest to allow setting of cost-effective priorities for radionuclide- and pathway-specific research. These include:
    • soil-to-plant uptake studies for oranges and other citrus fruits, and
    • development of models for evaluation of radionuclide concentration in highly-processed foods such as oils and sugars.
    Finally, renewed

  5. The risk of stillbirth and infant death by each additional week of expectant management stratified by maternal age

    PubMed Central

    Page, Jessica M.; Snowden, Jonathan M.; Cheng, Yvonne W.; Doss, Amy; Rosenstein, Melissa G.; Caughey, Aaron B.

    2016-01-01

    OBJECTIVE The objective of the study was to examine fetal/infant mortality by gestational age at term stratified by maternal age. STUDY DESIGN A retrospective cohort study was conducted using 2005 US national birth certificate data. For each week of term gestation, the risk of mortality associated with delivery was compared with composite mortality risk of expectant management. The expectant management measure included stillbirth and infant death. This expectant management risk was calculated to estimate the composite mortality risk with remaining pregnant an additional week by combining the risk of stillbirth during the additional week of pregnancy and infant death risk following delivery at the next week. Maternal age was stratified by 35 years or more compared with women younger than 35 years as well as subgroup analyses of younger than 20, 20–34, 35–39, or 40 years old or older. RESULTS The fetal/infant mortality risk of expectant management is greater than the risk of infant death at 39 weeks’ gestation in women 35 years old or older (15.2 vs 10.9 of 10,000, P < .05). In women younger than 35 years old, the risk of expectant management also exceeded that of infant death at 39 weeks (21.3 vs 18.8 of 10,000, P < .05). For women younger than 35 years old, the overall expectant management risk is influenced by higher infant death risk and does not rise significantly until 41 weeks compared with women 35 years old or older in which it increased at 40 weeks. CONCLUSION Risk varies by maternal age, and delivery at 39 weeks minimizes fetal/infant mortality for both groups, although the magnitude of the risk reduction is greater in older women. PMID:23707677
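
    Written out, the composite expectant-management measure described above is (notation mine, not the paper's):

```latex
R_{\mathrm{expectant}}(w) \;=\; P(\text{stillbirth during week } w)
\;+\; P(\text{infant death} \mid \text{delivery at week } w+1),
```

    and delivery at week w is favored whenever the delivery risk P(infant death | delivery at w) falls below R_expectant(w), which by these estimates first happens at 39 weeks in both maternal age strata.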

  6. Modeling and Managing Risk in Billing Infrastructures

    NASA Astrophysics Data System (ADS)

    Baiardi, Fabrizio; Telmon, Claudio; Sgandurra, Daniele

    This paper discusses risk modeling and risk management in information and communications technology (ICT) systems for which the attack impact distribution is heavy tailed (e.g., power law distribution) and the average risk is unbounded. Systems with these properties include billing infrastructures used to charge customers for services they access. Attacks against billing infrastructures can be classified as peripheral attacks and backbone attacks. The goal of a peripheral attack is to tamper with user bills; a backbone attack seeks to seize control of the billing infrastructure. The probability distribution of the overall impact of an attack on a billing infrastructure also has a heavy-tailed curve. This implies that the probability of a massive impact cannot be ignored and that the average impact may be unbounded - thus, even the most expensive countermeasures would be cost effective. Consequently, the only strategy for managing risk is to increase the resilience of the infrastructure by employing redundant components.

  7. Predicting the Survival Time for Bladder Cancer Using an Additive Hazards Model in Microarray Data

    PubMed Central

    TAPAK, Leili; MAHJUB, Hossein; SADEGHIFAR, Majid; SAIDIJAM, Massoud; POOROLAJAL, Jalal

    2016-01-01

    Background: One substantial part of microarray studies is to predict patients’ survival based on their gene expression profiles. Variable selection techniques are powerful tools for handling high dimensionality in the analysis of microarray data. However, these techniques have not been investigated in the competing risks setting. This study aimed to investigate the performance of four sparse variable selection methods in estimating survival time. Methods: The data included 1381 gene expression measurements and clinical information from 301 patients with bladder cancer operated on in the years 1987 to 2000 in hospitals in Denmark, Sweden, Spain, France, and England. Four methods, the least absolute shrinkage and selection operator (lasso), smoothly clipped absolute deviation, the smooth integration of counting and absolute deviation, and the elastic net, were utilized for simultaneous variable selection and estimation under an additive hazards model. The criteria of area under the ROC curve, Brier score and c-index were used to compare the methods. Results: The median follow-up time for all patients was 47 months. The elastic net approach outperformed the other methods. The elastic net had the lowest integrated Brier score (0.137±0.07) and the greatest median over-time AUC and c-index (0.803±0.06 and 0.779±0.13, respectively). Five of the 19 genes selected by the elastic net were significant (P<0.05) under an additive hazards model. The results indicated that the expression of RTN4, SON, IGF1R and CDC20 decreases survival time, while the expression of SMARCAD1 increases it. Conclusion: The elastic net had a higher capability than the other methods for predicting survival time in patients with bladder cancer in the presence of competing risks, based on an additive hazards model. PMID:27114989

  8. Risk management model of winter navigation operations.

    PubMed

    Valdez Banda, Osiris A; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-07-15

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish-Swedish Winter Navigation System. This establishes the requirements and limitations for vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO) and adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are the most probable, while major oil spills are found to be very unlikely but possible. PMID:27207023
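
    A minimal sketch of the chained-probability logic behind such a Bayesian network follows; all probabilities are invented for illustration, and the article's model has many more nodes and states.

    ```python
    # Toy FSA-style chain: operation type -> collision -> oil spill.
    # All probabilities below are invented, not the article's estimates.
    p_operation = {"independent": 0.5, "convoy": 0.3, "escort": 0.2}
    p_collision_given_op = {"independent": 0.012, "convoy": 0.010, "escort": 0.004}
    p_spill_given_collision = 0.08                 # a spill requires a collision first

    p_spill = sum(
        p_op * p_collision_given_op[op] * p_spill_given_collision
        for op, p_op in p_operation.items()
    )
    print(f"P(oil spill) = {p_spill:.5f}")
    ```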

  9. Increased Risk of Additional Cancers Among Patients with Gastrointestinal Stromal Tumors: A Population-Based Study

    PubMed Central

    Murphy, James D.; Ma, Grace L.; Baumgartner, Joel M.; Madlensky, Lisa; Burgoyne, Adam M.; Tang, Chih-Min; Martinez, Maria Elena; Sicklick, Jason K.

    2015-01-01

    Purpose Most gastrointestinal stromal tumors (GIST) are considered non-hereditary or sporadic. However, single-institution studies suggest that GIST patients develop additional malignancies with increased frequencies. We hypothesized that we could gain greater insight into possible associations between GIST and other malignancies using a national cancer database inquiry. Methods Patients diagnosed with GIST (2001–2011) in the Surveillance, Epidemiology, and End Results database were included. Standardized prevalence ratios (SPRs) and standardized incidence ratios (SIRs) were used to quantify cancer risks incurred by GIST patients before and after GIST diagnoses, respectively, when compared with the general U.S. population. Results Of 6,112 GIST patients, 1,047 (17.1%) had additional cancers. There were significant increases in overall cancer rates: 44% (SPR=1.44) before diagnosis and 66% (SIR=1.66) after GIST diagnoses. Malignancies with significantly increased occurrence both before/after diagnoses included other sarcomas (SPR=5.24/SIR=4.02), neuroendocrine-carcinoid tumors (SPR=3.56/SIR=4.79), non-Hodgkin’s lymphoma (SPR=1.69/SIR=1.76), and colorectal adenocarcinoma (SPR=1.51/SIR=2.16). Esophageal adenocarcinoma (SPR=12.0), bladder adenocarcinoma (SPR=7.51), melanoma (SPR=1.46), and prostate adenocarcinoma (SPR=1.20) were significantly more common only before GIST. Ovarian carcinoma (SIR=8.72), small intestine adenocarcinoma (SIR=5.89), papillary thyroid cancer (SIR=5.16), renal cell carcinoma (SIR=4.46), hepatobiliary adenocarcinomas (SIR=3.10), gastric adenocarcinoma (SIR=2.70), pancreatic adenocarcinoma (SIR=2.03), uterine adenocarcinoma (SIR=1.96), non-small cell lung cancer (SIR=1.74), and transitional cell carcinoma of the bladder (SIR=1.65) were significantly more common only after GIST. Conclusion This is the first population-based study to characterize the associations and temporal relationships between GIST and other cancers, both by site and
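
    For readers unfamiliar with standardized ratios: both SPR and SIR are observed/expected counts, and an exact Poisson confidence interval is available through chi-square quantiles. A sketch with invented counts, chosen only to land near the paper's overall post-diagnosis SIR of 1.66:

    ```python
    # Standardized incidence ratio (SIR = observed/expected) with an exact
    # Poisson 95% CI via chi-square quantiles; counts here are hypothetical.
    from scipy.stats import chi2

    def sir_with_ci(observed, expected, alpha=0.05):
        lo = chi2.ppf(alpha / 2, 2 * observed) / 2 if observed > 0 else 0.0
        hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2
        return observed / expected, lo / expected, hi / expected

    print(sir_with_ci(observed=33, expected=19.9))  # SIR ~ 1.66 plus CI bounds
    ```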

  10. Clinical Model for Suicide Risk Assessment.

    ERIC Educational Resources Information Center

    Kral, Michael J.; Sakinofsky, Isaac

    1994-01-01

    Presents suicide risk assessment in a two-tiered model comprising background/contextual factors and subjectivity. The subjectivity portion is formulated around Shneidman's concepts of perturbation and lethality. Discusses decision of hospital admission versus ambulatory care. Suggests that theoretically informed approach should serve both…

  11. The addition of algebraic turbulence modeling to program LAURA

    NASA Technical Reports Server (NTRS)

    Cheatwood, F. Mcneil; Thompson, R. A.

    1993-01-01

    The Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA) is modified to allow the calculation of turbulent flows. This is accomplished using the Cebeci-Smith and Baldwin-Lomax eddy-viscosity models in conjunction with the thin-layer Navier-Stokes options of the program. Turbulent calculations can be performed for both perfect-gas and equilibrium flows. However, a requirement of the models is that the flow be attached. It is seen that for slender bodies, adequate resolution of the boundary-layer gradients may require more cells in the normal direction than a laminar solution, even when grid stretching is employed. Results for axisymmetric and three-dimensional flows are presented. Comparison with experimental data and other numerical results reveal generally good agreement, except in the regions of detached flow.

  12. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  13. Determinants of Low Birth Weight in Malawi: Bayesian Geo-Additive Modelling.

    PubMed

    Ngwira, Alfred; Stanley, Christopher C

    2015-01-01

    Studies on factors of low birth weight in Malawi have neglected the flexible approach of using smooth functions for some covariates in models. Such a flexible approach reveals the detailed relationship of covariates with the response. The study aimed at investigating risk factors of low birth weight in Malawi by assuming a flexible approach for continuous covariates and a geographical random effect. A Bayesian geo-additive model for birth weight in kilograms and size of the child at birth (less than average, or average and higher), with district as a spatial effect, was adopted using the 2010 Malawi demographic and health survey data. A Gaussian model for birth weight in kilograms and a binary logistic model for the binary outcome (size of child at birth) were fitted. Continuous covariates were modelled by penalized (P-) splines and spatial effects were smoothed by a two-dimensional P-spline. The study found that child birth order and maternal weight and height are significant predictors of birth weight. Secondary education for the mother, birth order categories 2-3 and 4-5, a richer-family wealth index and maternal height were significant predictors of child size at birth. The area associated with low birth weight was Chitipa, and the areas with increased risk of less-than-average size at birth were Chitipa and Mchinji. The study found support for the flexible modelling of some covariates that clearly have nonlinear influences. Nevertheless, there is no strong support for the inclusion of geographical spatial analysis. The spatial patterns, though, point to the influence of omitted variables with some spatial structure, or possibly epidemiological processes that account for this spatial structure, and the maps generated could be used for targeting development efforts at a glance. PMID:26114866

  14. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful for identifying areas where mitigation efforts will be most cost-effective. It allows the identification of priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis. PMID:19732396
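
    The hazard x exposure x vulnerability logic reduces to simple per-zone arithmetic; a sketch with invented numbers:

    ```python
    # Back-of-envelope version of the risk equation the paper quantifies:
    # expected annual loss = hazard (annual probability) x value of elements
    # at risk x vulnerability (damage fraction). All numbers are invented.
    zones = [
        # (zone, annual hazard prob., exposed value in EUR, vulnerability 0-1)
        ("A", 0.020, 5_000_000, 0.30),
        ("B", 0.005, 20_000_000, 0.10),
    ]
    for name, hazard, value, vulnerability in zones:
        eal = hazard * value * vulnerability       # expected annual loss
        print(f"zone {name}: expected annual loss = {eal:,.0f} EUR")
    ```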

  15. Landslide risk mapping and modeling in China

    NASA Astrophysics Data System (ADS)

    Li, W.; Hong, Y.

    2015-12-01

    Under circumstances of global climate change, tectonic stress and human activity, landslides are among the most frequent and severely widespread natural hazards on Earth, as demonstrated in the World Atlas of Natural Hazards (McGuire et al., 2004). Every year, landslide activity causes serious economic loss as well as casualties (Róbert et al., 2005). How landslides can be monitored and predicted is an urgent research topic for the international landslide research community. In particular, there is a lack of high-quality and updated landslide risk maps and guidelines that can be employed to better mitigate and prevent landslide disasters in many emerging regions, including China (Hong, 2007). Since the 1950s, landslide events have been recorded in statistical yearbooks, newspapers, and monographs in China. As disasters have become of increasing concern to the government and the public, information about landslide events is becoming available from online news reports (Liu et al., 2012). This study presents multi-scale landslide risk mapping and modeling in China. At the national scale, based on historical data and practical experience, we carry out landslide susceptibility and risk mapping by adopting a statistical approach and pattern recognition methods to construct empirical models. Over the identified landslide hot-spot areas, we further evaluate the slope stability of each individual site (Sidle and Hirotaka, 2006), with the ultimate goal of setting up a space-time, multi-scale coupling system of landslide risk mapping and modeling for landslide hazard monitoring and early warning.

  16. Software reliability: Additional investigations into modeling with replicated experiments

    NASA Technical Reports Server (NTRS)

    Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.

    1984-01-01

    The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.

  17. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball. PMID:16257374

  18. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF-β/Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  19. A model of the holographic principle: Randomness and additional dimension

    NASA Astrophysics Data System (ADS)

    Boyarsky, Abraham; Góra, Paweł; Proppe, Harald

    2010-01-01

    In recent years an idea has emerged that a system in a 3-dimensional space can be described from an information point of view by a system on its 2-dimensional boundary. This mysterious correspondence is called the Holographic Principle and has had profound effects in string theory and our perception of space-time. In this note we describe a purely mathematical model of the Holographic Principle using ideas from nonlinear dynamical systems theory. We show that a random map on the surface S of a 3-dimensional open ball B has a natural counterpart in B, and the two maps acting in different dimensional spaces have the same entropy. We can reverse this construction if we start with a special 3-dimensional map in B called a skew product. The key idea is to use the randomness, as imbedded in the parameter of the 2-dimensional random map, to define a third dimension. The main result shows that if we start with an arbitrary dynamical system in B with entropy E we can construct a random map on S whose entropy is arbitrarily close to E.

  1. Additional Developments in Atmosphere Revitalization Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.; Gomez, Carlos

    2013-01-01

    NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.

  2. Novel methods to evaluate fracture risk models

    PubMed Central

    Donaldson, M.G.; Cawthon, P. M.; Schousboe, J.T.; Ensrud, K.E.; Lui, L.Y.; Cauley, J.A.; Hillier, T.A.; Taylor, B.C.; Hochberg, M.C.; Bauer, D.C.; Cummings, S.R.

    2013-01-01

    Fracture prediction models help identify individuals at high risk who may benefit from treatment. Area Under the Curve (AUC) is used to compare prediction models. However, the AUC has limitations and may miss important differences between models. Novel reclassification methods quantify how accurately models classify patients who benefit from treatment and the proportion of patients above/below treatment thresholds. We applied two reclassification methods, using the NOF treatment thresholds, to compare two risk models: femoral neck BMD and age (“simple model”) and FRAX (“FRAX model”). The Pepe method classifies based on case/non-case status and examines the proportion of each above and below thresholds. The Cook method examines fracture rates above and below thresholds. We applied these to the Study of Osteoporotic Fractures. There were 6036 (1037 fractures) and 6232 (389 fractures) participants with complete data for major osteoporotic and hip fracture respectively. Both models for major osteoporotic fracture (0.68 vs. 0.69) and hip fracture (0.75 vs. 0.76) had similar AUCs. In contrast, using reclassification methods, each model classified a substantial number of women differently. Using the Pepe method, the FRAX model (vs. simple model), missed treating 70 (7%) cases of major osteoporotic fracture but avoided treating 285 (6%) non-cases. For hip fracture, the FRAX model missed treating 31 (8%) cases but avoided treating 1026 (18%) non-cases. The Cook method (both models, both fracture outcomes) had similar fracture rates above/below the treatment thresholds. Compared with the AUC, new methods provide more detailed information about how models classify patients. PMID:21351143
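
    A Pepe-style comparison boils down to counting cases and non-cases above a treatment threshold under each model. A sketch on simulated scores (the threshold, score distributions and model names are invented, not the SOF data):

    ```python
    # Pepe-style reclassification sketch: for each model, count cases and
    # non-cases whose predicted risk exceeds a treatment threshold.
    import numpy as np

    rng = np.random.default_rng(1)
    is_case = rng.random(5000) < 0.1               # ~10% fracture cases (simulated)
    simple = np.clip(rng.normal(0.12, 0.08, 5000) + 0.05 * is_case, 0, 1)
    frax   = np.clip(rng.normal(0.12, 0.08, 5000) + 0.04 * is_case, 0, 1)
    threshold = 0.20                               # illustrative treatment cutoff

    for name, score in (("simple", simple), ("FRAX-like", frax)):
        treat = score >= threshold
        print(name,
              "cases treated:", int((treat & is_case).sum()),
              "non-cases treated:", int((treat & ~is_case).sum()))
    ```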

  3. [Risk hidden in the small print?: Some food additives may trigger pseudoallergic reactions].

    PubMed

    Zuberbier, Torsten; Hengstenberg, Claudine

    2016-06-01

    Some food additives may trigger pseudoallergic reactions. However, the prevalence of such an overreaction is, despite the increasing number of food additives, rather low in the general population. The most common triggers of pseudoallergic reactions to food are naturally occurring ingredients. However, symptoms in patients with chronic urticaria should improve significantly on a pseudoallergen-free diet. In addition, some studies indicate that certain food additives may also have an impact on the symptoms of patients with neurodermatitis and asthma. PMID:27173908

  4. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.

  5. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for the Western Balkans, covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units: the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.

  6. Predicting the Risk of Rheumatoid Arthritis and Its Age of Onset through Modelling Genetic Risk Variants with Smoking

    PubMed Central

    Scott, Ian C.; Seegobin, Seth D.; Steer, Sophia; Tan, Rachael; Forabosco, Paola; Hinks, Anne; Eyre, Stephen; Morgan, Ann W.; Wilson, Anthony G.; Hocking, Lynne J.; Wordsworth, Paul; Barton, Anne; Worthington, Jane; Cope, Andrew P.; Lewis, Cathryn M.

    2013-01-01

    The improved characterisation of risk factors for rheumatoid arthritis (RA) suggests they could be combined to identify individuals at increased disease risks in whom preventive strategies may be evaluated. We aimed to develop an RA prediction model capable of generating clinically relevant predictive data and to determine if it better predicted younger onset RA (YORA). Our novel modelling approach combined odds ratios for 15 four-digit/10 two-digit HLA-DRB1 alleles, 31 single nucleotide polymorphisms (SNPs) and ever-smoking status in males to determine risk using computer simulation and confidence interval based risk categorisation. Only males were evaluated in our models incorporating smoking as ever-smoking is a significant risk factor for RA in men but not women. We developed multiple models to evaluate each risk factor's impact on prediction. Each model's ability to discriminate anti-citrullinated protein antibody (ACPA)-positive RA from controls was evaluated in two cohorts: Wellcome Trust Case Control Consortium (WTCCC: 1,516 cases; 1,647 controls); UK RA Genetics Group Consortium (UKRAGG: 2,623 cases; 1,500 controls). HLA and smoking provided strongest prediction with good discrimination evidenced by an HLA-smoking model area under the curve (AUC) value of 0.813 in both WTCCC and UKRAGG. SNPs provided minimal prediction (AUC 0.660 WTCCC/0.617 UKRAGG). Whilst high individual risks were identified, with some cases having estimated lifetime risks of 86%, only a minority overall had substantially increased odds for RA. High risks from the HLA model were associated with YORA (P<0.0001); ever-smoking associated with older onset disease. This latter finding suggests smoking's impact on RA risk manifests later in life. Our modelling demonstrates that combining risk factors provides clinically informative RA prediction; additionally HLA and smoking status can be used to predict the risk of younger and older onset RA, respectively. PMID:24068971
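
    The general recipe, summing log odds ratios of carried risk factors into a score and checking discrimination with an AUC, can be sketched as follows. The odds ratios, carrier frequencies and intercept are invented, not the WTCCC/UKRAGG estimates.

    ```python
    # Combine per-factor log odds ratios into a risk score, then measure
    # discrimination with an AUC. All parameter values are illustrative.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    log_or = np.log(np.array([3.5, 2.0, 1.3, 1.2]))    # e.g. HLA allele, smoking, 2 SNPs
    carrier_freq = np.array([0.25, 0.45, 0.30, 0.40])

    n = 4000
    x = (rng.random((n, 4)) < carrier_freq).astype(float)  # who carries which factor
    score = x @ log_or                                  # combined score (log odds)
    p_case = 1.0 / (1.0 + np.exp(-(score - 2.0)))       # logistic link, arbitrary intercept
    y = rng.random(n) < p_case                          # simulated case/control status
    print("AUC:", round(roc_auc_score(y, score), 3))
    ```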

  7. Human Plague Risk: Spatial-Temporal Models

    NASA Technical Reports Server (NTRS)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks in the human population. Using Earth observations by satellite remote sensing, there has been a systematic analysis and mapping of the close coupling between the vectors of the disease and climate variability. The overall result is that the incidence of plague is correlated with the positive phase of the El Niño/Southern Oscillation (ENSO).

  8. Analysis of Time to Event Outcomes in Randomized Controlled Trials by Generalized Additive Models

    PubMed Central

    Argyropoulos, Christos; Unruh, Mark L.

    2015-01-01

    Background Randomized controlled trials almost invariably utilize the hazard ratio (HR) calculated with a Cox proportional hazards model as a treatment efficacy measure. Despite the widespread adoption of HRs, these provide a limited understanding of the treatment effect and may even provide a biased estimate when the assumption of proportional hazards in the Cox model is not verified by the trial data. Additional treatment effect measures on the survival probability or the time scale may be used to supplement HRs, but a framework for the simultaneous generation of these measures is lacking. Methods By splitting follow-up time at the nodes of a Gauss-Lobatto numerical quadrature rule, techniques for Poisson Generalized Additive Models (PGAM) can be adopted for flexible hazard modeling. Straightforward simulation post-estimation transforms PGAM estimates for the log hazard into estimates of the survival function. These in turn were used to calculate relative and absolute risks or even differences in restricted mean survival time between treatment arms. We illustrate our approach with extensive simulations and in two trials: IPASS (in which the proportionality of hazards was violated) and HEMO, a long-duration study conducted under evolving standards of care on a heterogeneous patient population. Findings PGAMs can generate estimates of the survival function and the hazard ratio that are essentially identical to those obtained by Kaplan-Meier curve analysis and the Cox model. PGAMs can simultaneously provide multiple measures of treatment efficacy after a single data pass. Furthermore, they supported not only unadjusted (overall treatment effect) but also subgroup and adjusted analyses, while incorporating multiple time scales and accounting for non-proportional hazards in survival data. Conclusions By augmenting the HR conventionally reported, PGAMs have the potential to support the inferential goals of multiple stakeholders involved in the evaluation and appraisal of clinical trial
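
    The core time-splitting trick can be shown in miniature: split follow-up into intervals and fit a Poisson GLM to interval event counts with log person-time as an offset, so the fitted rates approximate the hazard. The paper smooths over Gauss-Lobatto nodes with penalized splines; the plain-interval version below is a simplified stand-in using statsmodels, with all data simulated.

    ```python
    # Piecewise-constant hazard via the Poisson time-splitting trick that
    # underlies PGAMs (intervals here instead of the paper's spline basis).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    t = rng.exponential(scale=5.0, size=400)       # true constant hazard 0.2
    t_cens = np.minimum(t, 10.0)                   # administrative censoring at 10
    event = (t < 10.0).astype(float)

    edges = np.linspace(0, 10, 6)                  # 5 follow-up intervals
    rows, events, exposure = [], [], []
    for i in range(len(edges) - 1):
        lo, hi = edges[i], edges[i + 1]
        at_risk = np.clip(t_cens, lo, hi) - lo     # person-time spent in interval
        rows.append(np.eye(len(edges) - 1)[i])     # interval indicator
        events.append(((t_cens > lo) & (t_cens <= hi) & (event == 1)).sum())
        exposure.append(at_risk.sum())

    X = np.vstack(rows)
    fit = sm.GLM(np.array(events), X, family=sm.families.Poisson(),
                 offset=np.log(np.array(exposure))).fit()
    print(np.exp(fit.params))                      # interval hazards, all near 0.2
    ```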

  9. Analysis and Modeling of soil hydrology under different soil additives in artificial runoff plots

    NASA Astrophysics Data System (ADS)

    Ruidisch, M.; Arnhold, S.; Kettering, J.; Huwe, B.; Kuzyakov, Y.; Ok, Y.; Tenhunen, J. D.

    2009-12-01

    Monsoon events during June and July in the Korean project region, the Haean Basin, located in the northeastern part of South Korea, play a key role in erosion, leaching and groundwater pollution risk from agrochemicals. The project therefore investigates the main hydrological processes in agricultural soils under field and laboratory conditions on different scales (plot, hillslope and catchment). Soil hydrological parameters were analysed depending on different soil additives, which are known to prevent soil erosion and nutrient loss as well as to increase water infiltration, aggregate stability and soil fertility. Hence, synthetic water-soluble polyacrylamide (PAM), biochar (black carbon mixed with organic fertilizer), and a combination of both PAM and biochar were applied in runoff plots at three agricultural field sites. Additionally, a control subplot was set up without any additives. The field sites were selected in areas with similar hillslope gradients and with emphasis on the dominant land management form of dryland farming in Haean, which is characterised by row planting and row covering by foil. Hydrological parameters like saturated hydraulic conductivity, matric potential and water content were analysed by infiltration experiments, continuous tensiometer measurements and time domain reflectometry, as well as pressure plates to identify the characteristic water retention curves of each horizon. Weather data were recorded by three weather stations next to the runoff plots. The measured data also provide the input data for modeling water transport in the unsaturated zone of the runoff plots with HYDRUS 1D/2D/3D and SWAT (Soil & Water Assessment Tool).

  10. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    SciTech Connect

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
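
    A minimal Gaussian-copula sketch of the sampling idea described above: correlated normals are mapped to uniforms and then to exponential failure times, so the two components fail dependently. The correlation and failure rates are illustrative, not values from the paper.

    ```python
    # Gaussian copula: correlated normals -> uniforms -> dependent
    # exponential failure times for two redundant components.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    rho, n = 0.7, 100_000
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # correlated normals
    u = norm.cdf(z)                                       # copula step: uniforms
    rates = np.array([1e-3, 2e-3])                        # failure rates (per hour)
    times = -np.log(1.0 - u) / rates                      # exponential marginals

    # dependence survives the marginal transform:
    print(np.corrcoef(times, rowvar=False)[0, 1])
    ```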

  11. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. PMID:26133501
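
    The Bayesian model averaging step is an ordinary posterior update over candidate rationality models; a sketch with two invented paradigms and invented observation likelihoods:

    ```python
    # Update the probability of each opponent rationality model as his
    # decisions are observed. Models and likelihoods are illustrative.
    models = {"random": 1 / 2, "level-1": 1 / 2}          # prior over paradigms
    # P(observed action | model), for a sequence of 3 observed actions:
    likelihoods = [
        {"random": 0.25, "level-1": 0.60},
        {"random": 0.25, "level-1": 0.55},
        {"random": 0.25, "level-1": 0.10},
    ]
    post = dict(models)
    for lk in likelihoods:
        post = {m: post[m] * lk[m] for m in post}         # Bayes numerator
        z = sum(post.values())
        post = {m: p / z for m, p in post.items()}        # normalize
    print(post)   # posterior "validity" of each rationality model
    ```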

  12. USING DOSE ADDITION TO ESTIMATE CUMULATIVE RISKS FROM EXPOSURES TO MULTIPLE CHEMICALS

    EPA Science Inventory

    The Food Quality Protection Act (FQPA) of 1996 requires the EPA to consider the cumulative risk from exposure to multiple chemicals that have a common mechanism of toxicity. Three methods, hazard index (HI), point-of-departure index (PODI), and toxicity equivalence factor (TEF), ...
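
    Of the three methods named, the hazard index is the simplest to illustrate: HI is the sum of exposure-to-reference-dose ratios, with HI > 1 flagging potential concern. All chemicals and values below are hypothetical.

    ```python
    # Minimal hazard-index (HI) calculation: HI = sum(exposure / RfD).
    # Chemical names and numbers are invented for illustration.
    exposures = {"chem_A": 0.004, "chem_B": 0.010, "chem_C": 0.002}  # mg/kg-day
    rfd       = {"chem_A": 0.020, "chem_B": 0.030, "chem_C": 0.005}  # mg/kg-day

    hi = sum(exposures[c] / rfd[c] for c in exposures)
    print(f"hazard index = {hi:.2f}")   # 0.20 + 0.33 + 0.40 = 0.93, below 1
    ```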

  13. [Food additives and genetically modified food--a risk for allergic patients?].

    PubMed

    Wüthrich, B

    1999-04-01

    Adverse reactions to food and food additives must be classified according to pathogenic criteria. It is necessary to strictly differentiate between an allergy, triggered by a substance-specific immunological mechanism, and an intolerance, in which no specific immune reaction can be established. In contrast to views expressed in the media, by laymen and by patients, adverse reactions to additives are less frequent than is believed. Due to the frequent use of "alternative" methods of examination, an allergy to food additives is often wrongly blamed as the cause of a wide variety of symptoms and illnesses. Diagnosing an allergy or intolerance to additives normally involves carrying out double-blind, placebo-controlled oral provocation tests with food additives. Allergic reactions to food additives occur particularly against additives which are organic in origin. In principle, it is possible that during the manufacture of genetically modified plants and food, proteins are transferred which potentially create allergies. However, legislation exists both in the USA (Food and Drug Administration, FDA) and in Switzerland (Ordinance on the approval process for GM food, GM food additives and GM accessory agents for processing) which requires a careful analysis before a genetically modified product is launched, particularly where foreign genes are introduced. Products containing genetically modified organisms (GMO) as additives must be declared. In addition, the source of the foreign protein must be identified. The "Round-up ready" (RR) soya flour introduced in Switzerland is no different from natural soya flour in terms of its allergenic potential. Genetically modified food can be a blessing for allergic individuals if gene technology were to succeed in removing the allergen (e.g. such possibilities exist for rice). The same caution shown towards genetically modified food might also be advisable for foreign food in our diet. Luckily, the immune system of the digestive tract in healthy people

  14. Electricity market pricing, risk hedging and modeling

    NASA Astrophysics Data System (ADS)

    Cheng, Xu

    In this dissertation, we investigate the pricing, price risk hedging/arbitrage, and simplified system modeling for a centralized LMP-based electricity market. In an LMP-based market model, the full AC power flow model and the DC power flow model are most widely used to represent the transmission system. We investigate the differences of dispatching results, congestion pattern, and LMPs for the two power flow models. An appropriate LMP decomposition scheme to quantify the marginal costs of the congestion and real power losses is critical for the implementation of financial risk hedging markets. However, the traditional LMP decomposition heavily depends on the slack bus selection. In this dissertation we propose a slack-independent scheme to break LMP down into energy, congestion, and marginal loss components by analyzing the actual marginal cost of each bus at the optimal solution point. The physical and economic meanings of the marginal effect at each bus provide accurate price information for both congestion and losses, and thus the slack-dependency of the traditional scheme is eliminated. With electricity priced at the margin instead of the average value, the market operator typically collects more revenue from power sellers than that paid to power buyers. According to the LMP decomposition results, the revenue surplus is then divided into two parts: congestion charge surplus and marginal loss revenue surplus. We apply the LMP decomposition results to the financial tools, such as financial transmission right (FTR) and loss hedging right (LHR), which have been introduced to hedge against price risks associated to congestion and losses, to construct a full price risk hedging portfolio. The two-settlement market structure and the introduction of financial tools inevitably create market manipulation opportunities. We investigate several possible market manipulation behaviors by virtual bidding and propose a market monitor approach to identify and quantify such

  15. Modeling situation awareness and crash risk.

    PubMed

    Fisher, Donald L; Strayer, David L

    2014-01-01

    In this article we develop a model of the relationship between crash risk and a driver's situation awareness. We consider a driver's situation awareness to reflect the dynamic mental model of the driving environment and to be dependent upon several psychological processes including Scanning the driving environment, Predicting and anticipating hazards, Identifying potential hazards in the driving scene as they occur, Deciding on an action, and Executing an appropriate Response (SPIDER). Together, SPIDER is important for establishing and maintaining good situation awareness of the driving environment and good situation awareness is important for coordinating and scheduling the SPIDER-relevant processes necessary for safe driving. An Order-of-Processing (OP) model makes explicit the SPIDER-relevant processes and how they predict the likelihood of a crash when the driver is or is not distracted by a secondary task. For example, the OP model shows how a small decrease in the likelihood of any particular SPIDER activity being completed successfully (because of a concurrent secondary task performance) would lead to a large increase in the relative risk of a crash. PMID:24776225
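
    The closing point is easy to verify numerically: if a crash is avoided only when all five SPIDER stages succeed, a small per-stage decrement compounds into a much larger relative risk. The per-stage success probabilities below are illustrative, not values from the article.

    ```python
    # Worked version of the compounding argument: crash risk = probability
    # that at least one of the 5 SPIDER stages fails. Probabilities invented.
    p_ok, p_distracted = 0.99, 0.97       # per-stage success, attentive vs distracted
    risk = lambda p: 1 - p ** 5           # any stage failing -> crash opportunity
    print(risk(p_ok), risk(p_distracted), risk(p_distracted) / risk(p_ok))
    # ~0.049 vs ~0.141: a 2-point per-stage drop nearly triples relative risk
    ```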

  17. Modeling Flood Risk for South Korea

    NASA Astrophysics Data System (ADS)

    Mei, Y.; Li, S.

    2014-12-01

    Catastrophic flood events have caused significant losses for South Korea each year. It is very important to generate high-resolution flood return period maps for the government and insurance companies to evaluate flood risk; this research was initiated to achieve that goal. A 2,000-year spatially distributed stochastic rainfall series was generated by analyzing the historical rainfall of South Korea using principal component analysis. A rainfall-runoff model and a routing model were calibrated by driving them with historical forcing and comparing against gauge observations. The calibrated models were then driven with the stochastic forcing to generate 2,000 years of discharge and runoff. Flood maps with different return periods were generated by numerically solving the shallow water equations using a finite volume method on GPUs. The results of this research showed a reasonable flood map for South Korea compared with observed data. Further, this research could be used as an important reference for the government and insurance companies for risk management purposes.
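
    Once a long synthetic record exists, return levels follow from ranking the annual maxima, e.g. with the Weibull plotting position T = (N + 1)/rank. A sketch on a simulated 2,000-year series (the Gumbel parameters are invented, not the study's):

    ```python
    # Empirical return levels from a long synthetic record of annual maxima.
    import numpy as np

    rng = np.random.default_rng(5)
    annual_max = rng.gumbel(loc=800, scale=250, size=2000)  # synthetic maxima, m^3/s
    ranked = np.sort(annual_max)[::-1]                      # largest first
    for T in (10, 100, 1000):
        rank = int(round((len(ranked) + 1) / T))            # Weibull plotting position
        print(f"{T:>4}-yr flood ~ {ranked[rank - 1]:.0f} m^3/s")
    ```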

  18. Central hemodynamics in risk assessment strategies: additive value over and above brachial blood pressure.

    PubMed

    Yannoutsos, Alexandra; Rinaldi, Elisa R; Zhang, Yi; Protogerou, Athanassios D; Safar, Michel E; Blacher, Jacques

    2015-01-01

    Although the clinical relevance of brachial blood pressure (BP) measurement for cardiovascular (CV) risk stratification is nowadays widely accepted, this approach can nevertheless present several limitations. Pulse pressure (PP) amplification accounts for the notable increase in PP from central to peripheral arterial sites. Target organs are more greatly exposed to central hemodynamic changes than peripheral organs. The pathophysiological significance of local BP pulsatility, which has a role in the pathogenesis of target organ damage in both the macro- and the microcirculation, may therefore not be accurately captured by brachial BP as traditionally evaluated with cuff measurements. The predictive value of central systolic BP and PP over brachial BP for major clinical outcomes has been demonstrated in the general population, in elderly adults and in patients at high CV risk, irrespective of the invasive or non-invasive methods used to assess central BP. Aortic stiffness, timing and intensity of wave reflections, and cardiac performance appear as major factors influencing central PP. Great emphasis has been placed on the role of aortic stiffness, disturbed arterial wave reflections and their intercorrelation in the pathophysiological mechanisms of CV diseases as well as on their capacity to predict target organ damage and clinical events. Comorbidities and age-related changes, together with gender-related specificities of arterial and cardiac parameters, are known to affect the predictive ability of central hemodynamics on individual CV risk. PMID:25341861

  19. Hazard and risk assessment of a nanoparticulate cerium oxide-based diesel fuel additive - a case study.

    PubMed

    Park, Barry; Donaldson, Kenneth; Duffin, Rodger; Tran, Lang; Kelly, Frank; Mudway, Ian; Morin, Jean-Paul; Guest, Robert; Jenkinson, Peter; Samaras, Zissis; Giannouli, Myrsini; Kouridis, Haris; Martin, Patricia

    2008-04-01

    Envirox is a scientifically and commercially proven diesel fuel combustion catalyst based on nanoparticulate cerium oxide and has been demonstrated to reduce fuel consumption, greenhouse gas emissions (CO(2)), and particulate emissions when added to diesel at levels of 5 mg/L. Studies have confirmed the adverse effects of particulates on respiratory and cardiac health, and while the use of Envirox contributes to a reduction in the particulate content in the air, it is necessary to demonstrate that the addition of Envirox does not alter the intrinsic toxicity of particles emitted in the exhaust. The purpose of this study was to evaluate the safety in use of Envirox by addressing the classical risk paradigm. Hazard assessment has been addressed by examining a range of in vitro cell and cell-free endpoints to assess the toxicity of cerium oxide nanoparticles as well as particulates emitted from engines using Envirox. Exposure assessment has taken data from modeling studies and from airborne monitoring sites in London and Newcastle adjacent to routes where vehicles using Envirox passed. Data have demonstrated that for the exposure levels measured, the estimated internal dose for a referential human in a chronic exposure situation is much lower than the no-observed-effect level (NOEL) in the in vitro toxicity studies. Exposure to nano-size cerium oxide as a result of the addition of Envirox to diesel fuel at the current levels of exposure in ambient air is therefore unlikely to lead to pulmonary oxidative stress and inflammation, which are the precursors for respiratory and cardiac health problems. PMID:18444008

  20. Modeling risk of occupational zoonotic influenza infection in swine workers.

    PubMed

    Paccha, Blanca; Jones, Rachael M; Gibbs, Shawn; Kane, Michael J; Torremorell, Montserrat; Neira-Ramirez, Victor; Rabinowitz, Peter M

    2016-08-01

    Zoonotic transmission of influenza A virus (IAV) between swine and workers in swine production facilities may play a role in the emergence of novel influenza strains with pandemic potential. Guidelines to prevent transmission of influenza to swine workers have been developed but there is a need for evidence-based decision-making about protective measures such as respiratory protection. A mathematical model was applied to estimate the risk of occupational IAV exposure to swine workers by contact and airborne transmission, and to evaluate the use of respirators to reduce transmission.  The Markov model was used to simulate the transport and exposure of workers to IAV in a swine facility. A dose-response function was used to estimate the risk of infection. This approach is similar to methods previously used to estimate the risk of infection in human health care settings. This study uses concentration of virus in air from field measurements collected during outbreaks of influenza in commercial swine facilities, and analyzed by polymerase chain reaction.  It was found that spending 25 min working in a barn during an influenza outbreak in a swine herd could be sufficient to cause zoonotic infection in a worker. However, this risk estimate was sensitive to estimates of viral infectivity to humans. Wearing an excellent fitting N95 respirator reduced this risk, but with high aerosol levels the predicted risk of infection remained high under certain assumptions.  The results of this analysis indicate that under the conditions studied, swine workers are at risk of zoonotic influenza infection. The use of an N95 respirator could reduce such risk. These findings have implications for risk assessment and preventive programs targeting swine workers. The exact level of risk remains uncertain, since our model may have overestimated the viability or infectivity of IAV. Additionally, the potential for partial immunity in swine workers associated with repeated low
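
    A common way to couple such exposure estimates to infection risk is an exponential dose-response, with a respirator modeled as dividing the inhaled dose by its assigned protection factor. Whether this matches the paper's exact dose-response function is an assumption, and the infectivity parameter, dose and APF below are all invented.

    ```python
    # Exponential dose-response sketch: P(infection) = 1 - exp(-r * dose),
    # with an N95 dividing the inhaled dose by its protection factor.
    import math

    def p_infection(dose, r=0.01):                 # r: infectivity per virus copy (assumed)
        return 1 - math.exp(-r * dose)

    dose_25_min = 150.0                            # copies inhaled in 25 min (assumed)
    print("no respirator:", round(p_infection(dose_25_min), 3))
    print("with N95 (APF 10):", round(p_infection(dose_25_min / 10), 3))
    ```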

  1. A flexible count data regression model for risk analysis.

    PubMed

    Guikema, Seth D; Coffelt, Jeremy P; Goffelt, Jeremy P

    2008-02-01

    In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLM) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM can provide fits to data as good as those of the commonly used existing models for overdispersed data sets, while outperforming these commonly used models for underdispersed data sets. PMID:18304118
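
    The COM distribution underlying the model has pmf P(Y = k) proportional to lambda^k/(k!)^nu, with nu < 1 giving overdispersion, nu > 1 underdispersion, and nu = 1 recovering the Poisson. A log-space sketch of the pmf and its first two moments (parameter values are illustrative):

    ```python
    # Conway-Maxwell Poisson pmf, computed in log space for stability, and
    # a moment check showing var > mean iff nu < 1.
    import math

    def com_poisson_pmf(k, lam, nu, kmax=100):
        # normalizer Z(lam, nu) = sum_j lam^j / (j!)^nu, via log-sum-exp
        log_terms = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(kmax)]
        m = max(log_terms)
        log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))
        return math.exp(k * math.log(lam) - nu * math.lgamma(k + 1) - log_z)

    for nu in (0.5, 1.0, 2.0):
        pmf = [com_poisson_pmf(k, lam=3.0, nu=nu) for k in range(60)]
        mean = sum(k * p for k, p in enumerate(pmf))
        var = sum((k - mean) ** 2 * p for k, p in enumerate(pmf))
        print(f"nu={nu}: mean={mean:.2f}, variance={var:.2f}")
    ```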

  2. Use of generalised additive models to categorise continuous variables in clinical prediction

    PubMed Central

    2013-01-01

    Background In medical practice many, essentially continuous, clinical parameters tend to be categorised by physicians for ease of decision-making. Indeed, categorisation is a common practice both in medical research and in the development of clinical prediction rules, particularly where the ensuing models are to be applied in daily clinical practice to support clinicians in the decision-making process. Since the number of categories into which a continuous predictor must be categorised depends partly on the relationship between the predictor and the outcome, the need for more than two categories must be borne in mind. Methods We propose a categorisation methodology for clinical-prediction models, using Generalised Additive Models (GAMs) with P-spline smoothers to determine the relationship between the continuous predictor and the outcome. The proposed method consists of creating at least one average-risk category along with high- and low-risk categories based on the GAM smooth function. We applied this methodology to a prospective cohort of patients with exacerbated chronic obstructive pulmonary disease. The predictors selected were respiratory rate and partial pressure of carbon dioxide in the blood (PCO2), and the response variable was poor evolution. An additive logistic regression model was used to show the relationship between the covariates and the dichotomous response variable. The proposed categorisation was compared with the use of the continuous predictor as the best option, using the AIC and AUC evaluation parameters. The sample was divided into derivation (60%) and validation (40%) samples. The first was used to obtain the cut points while the second was used to validate the proposed methodology. Results The three-category proposal for the respiratory rate was ≤20, (20, 24], and >24, for which the following values were obtained: AIC=314.5 and AUC=0.638. The respective values for the continuous predictor were AIC=317.1 and AUC=0.634, with no statistically

  3. Modeling risk of pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Nowak, J. Joshua; Lukacs, Paul M.; Anderson, Neil J.; Ramsey, Jennifer M.; Gude, Justin A.; Krausman, Paul R.

    2015-01-01

    Pneumonia epizootics are a major challenge for management of bighorn sheep (Ovis canadensis) affecting persistence of herds, satisfaction of stakeholders, and allocations of resources by management agencies. Risk factors associated with the disease are poorly understood, making pneumonia epizootics hard to predict; such epizootics are thus managed reactively rather than proactively. We developed a model for herds in Montana that identifies risk factors and addresses biological questions about risk. Using Bayesian logistic regression with repeated measures, we found that private land, weed control using domestic sheep or goats, pneumonia history, and herd density were positively associated with risk of pneumonia epizootics in 43 herds that experienced 22 epizootics out of 637 herd-years from 1979–2013. We defined an area of high risk for pathogen exposure as the area of each herd distribution plus a 14.5-km buffer from that boundary. Within this area, the odds of a pneumonia epizootic increased by >1.5 times per additional unit of private land (unit is the standardized % of private land, where the global mean = 25.58% and SD = 14.53%). Odds were >3.3 times greater if domestic sheep or goats were used for weed control in a herd's area of high risk. If a herd or its neighbors within the area of high risk had a history of a pneumonia epizootic, odds of a subsequent pneumonia epizootic were >10 times greater. Risk greatly increased when herds were at high density, with nearly 15 times greater odds of a pneumonia epizootic compared to when herds were at low density. Odds of a pneumonia epizootic also appeared to decrease following increased spring precipitation (odds = 0.41 per unit increase, where the global mean = 100.18% and SD = 26.97%). Risk was not associated with number of federal sheep and goat allotments, proximity to nearest herds of bighorn sheep, ratio of rams to ewes, percentage of average winter precipitation, or whether herds were of native versus mixed

  4. An animal model of differential genetic risk for methamphetamine intake

    PubMed Central

    Phillips, Tamara J.; Shabani, Shkelzen

    2015-01-01

    The question of whether genetic factors contribute to risk for methamphetamine (MA) use and dependence has not been intensively investigated. Compared to human populations, genetic animal models offer the advantages of control over genetic family history and drug exposure. Using selective breeding, we created lines of mice that differ in genetic risk for voluntary MA intake and identified the chromosomal addresses of contributory genes. A quantitative trait locus was identified on chromosome 10 that accounts for more than 50% of the genetic variance in MA intake in the selected mouse lines. In addition, behavioral and physiological screening identified differences corresponding with risk for MA intake that have generated hypotheses that are testable in humans. Heightened sensitivity to aversive and certain physiological effects of MA, such as MA-induced reduction in body temperature, are hallmarks of mice bred for low MA intake. Furthermore, unlike MA-avoiding mice, MA-preferring mice are sensitive to rewarding and reinforcing MA effects, and to MA-induced increases in brain extracellular dopamine levels. Gene expression analyses implicate the importance of a network enriched in transcription factor genes, some of which regulate the mu opioid receptor gene, Oprm1, in risk for MA use. Neuroimmune factors appear to play a role in differential response to MA between the mice bred for high and low intake. In addition, chromosome 10 candidate gene studies provide strong support for a trace amine-associated receptor 1 gene, Taar1, polymorphism in risk for MA intake. MA is a trace amine-associated receptor 1 (TAAR1) agonist, and a non-functional Taar1 allele segregates with high MA consumption. Thus, reduced TAAR1 function has the potential to increase risk for MA use. Overall, existing findings support the MA drinking lines as a powerful model for identifying genetic factors involved in determining risk for harmful MA use. Future directions include the development of a

  5. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2006-01-01

    The development of techniques devised for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often complicated by anaphylactic shock, in a subset of allergic babies. Children with FA, even if subjected to preventative diets, continually face the risk of developing allergic manifestations after the unintentional intake of a non-tolerated food in restaurant settings, with relatives or schoolmates, etc., where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can occur deliberately, either by substantially altering the food ingredients or by genetic manipulations that change the composition or transfer allergens, or unintentionally, through quality-control failures due to contamination in the production process or to genetic mismanipulation. There is a controversy between multinationals, often favored by governments, and consumer-association resistance, so an equidistant analysis poses some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen expressed in widely used infant formulas, paving the way to an often-reported multinational debacle. Genetic engineering poses novel ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. This paper emphasizes the practical concepts most crucial for pediatricians. PMID:16910351

  6. Benefits and concerns associated with biotechnology-derived foods: can additional research reduce children health risks?

    PubMed

    Cantani, A

    2009-01-01

    The development of techniques devised for the genetic manipulation of foods poses new risks for children with food allergy (FA). The introduction of foreign allergenic proteins from different foods into previously tolerated foods may trigger allergic reactions, often complicated by anaphylactic shock, in a subset of allergic babies. Children with FA, even if subjected to preventative diets, continually face the risk of developing allergic manifestations after the unintentional intake of a non-tolerated food in restaurant settings, with relatives or schoolmates, etc., where product labelling is necessarily lacking. The introduction of potentially allergenic proteins into foods generally considered safe for allergic children can occur deliberately, either by substantially altering the food ingredients or by genetic manipulations that change the composition or transfer allergens, or unintentionally, through quality-control failures due to contamination in the production process or to genetic mismanipulation. There is a controversy between multinationals, often favored by governments, and consumer-association resistance, so an equidistant analysis poses some unprecedented impediments. The importance of FA and the potential of transgenic plants to bring food allergens into the food supply should not be disregarded. The expression in soybeans of a Brazil nut protein resulted in a food allergen expressed in widely used infant formulas, paving the way to an often-reported multinational debacle. Genetic engineering poses novel ethical and social concerns, as well as serious challenges to the environment, human health, animal welfare, and the future of agriculture. This paper emphasizes the practical concepts most crucial for pediatricians. PMID:19364084

  7. A hybrid likelihood algorithm for risk modelling.

    PubMed

    Kellerer, A M; Kreisheimer, M; Chmelevsky, D; Barclay, D

    1995-03-01

    The risk of radiation-induced cancer is assessed through the follow-up of large cohorts, such as atomic bomb survivors or underground miners who have been occupationally exposed to radon and its decay products. The models relate to the dose, age and time dependence of the excess tumour rates, and they contain parameters that are estimated in terms of maximum likelihood computations. The computations are performed with the software package EPI-CURE, which contains the two main options of person-by-person regression or of Poisson regression with grouped data. The Poisson regression is most frequently employed, but there are certain models that require an excessive number of cells when grouped data are used. One example involves computations that account explicitly for the temporal distribution of continuous exposures, as they occur with underground miners. In past work such models had to be approximated, but it is shown here that they can be treated explicitly in a suitably reformulated person-by-person computation of the likelihood. The algorithm uses the familiar partitioning of the log-likelihood into two terms, L1 and L0. The first term, L1, represents the contribution of the 'events' (tumours). It needs to be evaluated in the usual way, but constitutes no computational problem. The second term, L0, represents the event-free periods of observation. It is, in its usual form, unmanageable for large cohorts. However, it can be reduced to a simple form, in which the number of computational steps is independent of cohort size. The method requires less computing time and computer memory, but more importantly it leads to more stable numerical results by obviating the need for grouping the data. The algorithm may be most relevant to radiation risk modelling, but it can facilitate the modelling of failure-time data in general. PMID:7604154
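
    A generic sketch of the partition described, for the simplest case of a constant log-linear hazard per person: L1 collects the log-hazard at each observed event, and L0 collects the negative cumulative hazard over each person's event-free observation time, so the number of steps stays proportional to cohort size with no grouping. This shows the structural idea only, not the EPI-CURE reformulation.

    ```python
    import numpy as np

    def loglik(a, b, z, time, event):
        """L = L1 + L0 for hazard lambda_i = exp(a + b*z_i), person by person."""
        lam = np.exp(a + b * z)
        L1 = np.sum(event * np.log(lam))   # 'events' term: observed tumours/failures
        L0 = -np.sum(lam * time)           # event-free person-time term
        return L1 + L0

    rng = np.random.default_rng(2)
    z = rng.normal(size=1000)
    t_ev = rng.exponential(1 / np.exp(-1.0 + 0.5 * z))   # true a = -1, b = 0.5
    time = np.minimum(t_ev, 3.0)                         # administrative censoring
    event = (t_ev <= 3.0).astype(float)
    print(loglik(-1.0, 0.5, z, time, event))             # evaluate at the true values
    ```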

  8. Social models of HIV risk among young adults in Lesotho.

    PubMed

    Bulled, Nicola L

    2015-01-01

    Extensive research over the past 30 years has revealed that individual and social determinants impact HIV risk. Even so, prevention efforts focus primarily on individual behaviour change, with little recognition of the dynamic interplay of individual and social environment factors that further exacerbate risk engagement. Drawing on long-term research with young adults in Lesotho, I examine how social environment factors contribute to HIV risk. During preliminary ethnographic analysis, I developed novel scales to measure social control, adoption of modernity, and HIV knowledge. In survey research, I examined the effects of individual characteristics (i.e., socioeconomic status, HIV knowledge, adoption of modernity) and social environment (i.e., social control) on HIV risk behaviours. In addition, I measured the impact of altered environments by taking advantage of an existing situation whereby young adults attending a national college are assigned to either a main campus in a metropolitan setting or a satellite campus in a remote setting, irrespective of the environment in which they were socialised as youth. This arbitrary assignment process generates four distinct groups of young adults with altered or constant environments. Regression models show that lower levels of perceived social control and greater adoption of modernity are associated with HIV risk, controlling for other factors. The impact of social control and modernity varies with environment dynamics. PMID:26284999

  9. Extreme Earthquake Risk Estimation by Hybrid Modeling

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Garcia, S.; Emerson, D.; Perea, N.; Salazar, A.; Moulinec, C.

    2012-12-01

    The estimation of the hazard and the economic consequences, i.e. the risk, associated with the occurrence of extreme magnitude earthquakes in the neighborhood of urban or lifeline infrastructure, such as the 11 March 2011 Mw 9 Tohoku, Japan, event, represents a complex challenge, as it involves the propagation of seismic waves in large volumes of the earth's crust, from unusually large seismic source ruptures up to the infrastructure location. The large numbers of casualties and huge economic losses observed for those earthquakes, some of which have a frequency of occurrence of hundreds or thousands of years, call for the development of new paradigms and methodologies to generate better estimates of the seismic hazard as well as of its consequences and, if possible, to estimate the probability distributions of their ground intensities and of their economic impacts (direct and indirect losses), in order to implement technological and economic policies to mitigate and reduce those consequences as much as possible. Here we propose a hybrid model that uses 3D seismic wave propagation (3DWP) and neural network (NN) modeling to estimate the seismic risk of extreme earthquakes. The 3DWP modeling is achieved by using a 3D finite difference code run on the ~100,000-core Blue Gene Q supercomputer of the STFC Daresbury Laboratory in the UK, combined with empirical Green function (EGF) techniques and NN algorithms. In particular, the 3DWP is used to generate broadband samples of the 3D wave propagation of extreme (plausible) earthquake scenarios corresponding to synthetic seismic sources, and to enlarge those samples by using feed-forward NNs. We present the results of the validation of the proposed hybrid modeling for Mw 8 subduction events, and show examples of its application to the estimation of the hazard and the economic consequences for extreme Mw 8.5 subduction earthquake scenarios with seismic sources in the Mexican

  10. Arrest Histories of High-Risk Gay and Bisexual Men in Miami: Unexpected Additional Evidence For Syndemic Theory†

    PubMed Central

    Kurtz, Steven P.

    2009-01-01

    Gay and bisexual men continue to suffer the highest burden of HIV/AIDS in the U.S. Since the beginning of the epidemic, substance abuse has been shown to be one of the strongest predictors of sexual risk behaviors and seroconversion among this population. Recent research has focused on additional aspects of health risk disparities among gay and bisexual men, including depression and other mental health problems, childhood sexual abuse, and adult victimization, suggesting that these men are impacted by a syndemic of health risks. The involvement of gay and bisexual men with the criminal justice system is largely absent from the literature. This article describes the nature, extent and predictors of the arrest histories of a sample of gay and bisexual substance users at very high risk for HIV infection and/or transmission. These histories are surprisingly extensive, and are strongly associated with poverty, severe mental distress, substance abuse and dependence, and victimization. The involvement of gay and bisexual men in the criminal justice system deserves a stronger research focus because of the unique challenges facing such men and also because arrests are yet another marker for a host of health risks among them. PMID:19283955

  11. Multi-locus models of genetic risk of disease

    PubMed Central

    2010-01-01

    Background Evidence for genetic contribution to complex diseases is described by recurrence risks to relatives of diseased individuals. Genome-wide association studies allow a description of the genetics of the same diseases in terms of risk loci, their effects and allele frequencies. To reconcile the two descriptions requires a model of how risks from individual loci combine to determine an individual's overall risk. Methods We derive predictions of risk to relatives from risks at individual loci under a number of models and compare them with published data on disease risk. Results The model in which risks are multiplicative on the risk scale implies equality between the recurrence risk to monozygotic twins and the square of the recurrence risk to sibs, a relationship often not observed, especially for low prevalence diseases. We show that this theoretical equality is achieved by allowing impossible probabilities of disease. Other models, in which probabilities of disease are constrained to a maximum of one, generate results more consistent with empirical estimates for a range of diseases. Conclusions The unconstrained multiplicative model, often used in theoretical studies because of its mathematical tractability, is not a realistic model. We find three models, the constrained multiplicative, Odds (or Logit) and Probit (or liability threshold) models, all fit the data on risk to relatives. Currently, in practice it would be difficult to differentiate between these models, but this may become possible if genetic variants that explain the majority of the genetic variance are identified. PMID:20181060
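
    A numeric check of the abstract's central point: on the risk scale, the unconstrained multiplicative model forces the MZ-twin recurrence ratio to equal the square of the sibling recurrence ratio, which for low-prevalence, strongly familial diseases implies probabilities of disease above one. The prevalence and recurrence values below are illustrative only.

    ```python
    K = 0.005                              # illustrative population prevalence
    for lam_sib in (5.0, 10.0, 20.0):      # sibling recurrence risk ratios
        risk_mz = K * lam_sib ** 2         # implied MZ co-twin disease probability
        flag = "  <- impossible (> 1)" if risk_mz > 1 else ""
        print(f"lambda_S = {lam_sib:4.1f}: implied MZ risk = {risk_mz:.3f}{flag}")
    ```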

  12. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  13. Multiprocessing and Correction Algorithm of 3D-models for Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Anamova, R. R.; Zelenov, S. V.; Kuprikov, M. U.; Ripetskiy, A. V.

    2016-07-01

    This article addresses matters related to additive manufacturing preparation. A layer-by-layer model presentation was developed on the basis of a routing method. Methods for correction of errors in the layer-by-layer model presentation were developed. A multiprocessing algorithm for forming an additive manufacturing batch file was realized.

  14. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent brands or the per cent eating occasions within a food group that contained an additive. Since the three model components assumed two possible modes of input, the validity of eight (2^3) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty. PMID:14555358
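
    A minimal Monte Carlo sketch of one such model combination (lognormal food intake × probability of additive presence × lognormal concentration), compared against a conservative point estimate that assumes the additive is always present at the maximum permitted level. All distribution parameters, the 30% presence probability and the 250 mg/kg MPL are invented for illustration; the paper derives its inputs from a brand-level reference database.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    food_g = rng.lognormal(mean=4.0, sigma=0.6, size=n)   # daily intake of the food group, g
    present = rng.random(n) < 0.30                        # additive present (assumed 30%)
    conc = rng.lognormal(mean=2.0, sigma=0.5, size=n)     # additive concentration, mg/kg

    intake = food_g / 1000 * conc * present               # modelled additive intake, mg/day
    conservative = food_g / 1000 * 250.0                  # always present at an assumed 250 mg/kg MPL
    print("modelled 97.5th percentile: %.2f mg/day; conservative point estimate: %.2f mg/day"
          % (np.percentile(intake, 97.5), np.percentile(conservative, 97.5)))
    ```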

  15. Adolescent mental health and academic functioning: empirical support for contrasting models of risk and vulnerability.

    PubMed

    Lucier-Greer, Mallory; O'Neal, Catherine W; Arnold, A Laura; Mancini, Jay A; Wickrama, Kandauda K A S

    2014-11-01

    Adolescents in military families contend with normative stressors that are universal and exist across social contexts (minority status, family disruptions, and social isolation) as well as stressors reflective of their military life context (e.g., parental deployment, school transitions, and living outside the United States). This study utilizes a social ecological perspective and a stress process lens to examine the relationship between multiple risk factors and relevant indicators of youth well-being, namely depressive symptoms and academic performance, as well as the mediating role of self-efficacy (N = 1,036). Three risk models were tested: an additive effects model (each risk factor uniquely influences outcomes), a full cumulative effects model (the collection of risk factors influences outcomes), and a comparative model (a cumulative effects model exploring the differential effects of normative and military-related risks). This design allowed for the simultaneous examination of multiple risk factors and a comparison of alternative perspectives on measuring risk. Each model was predictive of depressive symptoms and academic performance through persistence; however, each model provides unique findings about the relationship between risk factors and youth outcomes. Discussion is provided pertinent to service providers and researchers on how risk is conceptualized, along with suggestions for identifying at-risk youth. PMID:25373055
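
    A schematic contrast of the first two designs on simulated data: the additive-effects model enters each risk factor separately, while the cumulative-effects model enters a single summed risk count. The four binary factors and their effect sizes are invented and stand in for the study's normative and military-related risks.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 1000
    risks = rng.binomial(1, 0.3, size=(n, 4))                 # four binary risk factors
    depress = risks @ np.array([0.5, 0.3, 0.4, 0.2]) + rng.normal(0, 1, n)

    additive = sm.OLS(depress, sm.add_constant(risks)).fit()           # unique effect of each factor
    cumulative = sm.OLS(depress, sm.add_constant(risks.sum(1))).fit()  # count of risks
    print(additive.params.round(2), cumulative.params.round(2))
    ```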

  16. Risk assessment compatible fire models (RACFMs)

    SciTech Connect

    Lopez, A.R.; Gritzo, L.A.; Sherman, M.P.

    1998-07-01

    A suite of Probabilistic Risk Assessment Compatible Fire Models (RACFMs) has been developed to represent the hazard posed by a pool fire to weapon systems transported on the B-52H aircraft. These models represent both stand-off (i.e., the weapon system is outside of the flame zone but exposed to the radiant heat load from the fire) and fully-engulfing scenarios (i.e., the object is fully covered by flames). The approach taken in developing the RACFMs for both scenarios was to consolidate, reconcile, and apply data and knowledge from all available resources, including: data and correlations from the literature, data from an extensive full-scale fire test program at the Naval Air Warfare Center (NAWC) at China Lake, and results from a fire field model (VULCAN). In the past, a single effective temperature, T_f, was used to represent the fire. The heat flux to an object exposed to a fire was estimated using the relationship for black body radiation, σT_f^4. Significant improvements have been made by employing the present approach, which accounts for the presence of temperature distributions in fully-engulfing fires and uses the best available correlations to estimate heat fluxes in stand-off scenarios.
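
    The improvement can be seen in a few lines: because radiant flux scales as T^4, averaging σT^4 over a temperature distribution gives a higher flux than applying σT^4 to the average temperature. The flame-temperature distribution below is invented for illustration.

    ```python
    import numpy as np

    SIGMA = 5.670e-8                        # Stefan-Boltzmann constant, W m^-2 K^-4
    rng = np.random.default_rng(5)
    T = rng.normal(1300.0, 150.0, 10_000)   # invented flame-temperature samples, K

    q_single = SIGMA * T.mean() ** 4        # single effective temperature
    q_dist = np.mean(SIGMA * T ** 4)        # accounts for the temperature distribution
    print(f"single-T flux {q_single/1e3:.0f} kW/m^2 vs distributed {q_dist/1e3:.0f} kW/m^2")
    ```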

  17. Aquatic models, genomics and chemical risk management.

    PubMed

    Cheng, Keith C; Hinton, David E; Mattingly, Carolyn J; Planchart, Antonio

    2012-01-01

    The 5th Aquatic Animal Models for Human Disease meeting follows four previous meetings (Nairn et al., 2001; Schmale, 2004; Schmale et al., 2007; Hinton et al., 2009) in which advances in aquatic animal models for human disease research were reported, and community discussion of future direction was pursued. At this meeting, discussion at a workshop entitled Bioinformatics and Computational Biology with Web-based Resources (20 September 2010) led to an important conclusion: Aquatic model research using feral and experimental fish, in combination with web-based access to annotated anatomical atlases and toxicological databases, yields data that advance our understanding of human gene function, and can be used to facilitate environmental management and drug development. We propose here that the effects of genes and environment are best appreciated within an anatomical context - the specifically affected cells and organs in the whole animal. We envision the use of automated, whole-animal imaging at cellular resolution and computational morphometry facilitated by high-performance computing and automated entry into toxicological databases, as anchors for genetic and toxicological data, and as connectors between human and model system data. These principles should be applied to both laboratory and feral fish populations, which have been virtually irreplaceable sentinels for environmental contamination that results in human morbidity and mortality. We conclude that automation, database generation, and web-based accessibility, facilitated by genomic/transcriptomic data and high-performance and cloud computing, will potentiate the unique and potentially key roles that aquatic models play in advancing systems biology, drug development, and environmental risk management. PMID:21763781

  18. Breast Cancer Risk Assessment SAS Macro (Gail Model)

    Cancer.gov

    A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.

  19. Model of meteoroid risk in near-Earth space

    NASA Astrophysics Data System (ADS)

    Mironov, V. V.; Murtazov, A. K.

    2015-11-01

    We present a model of the risk of meteoroid collision with spacecraft in near-Earth space. We assess the average risk of collision between spacecraft and bright meteoroids of the Perseids stream in 2007-2013.

  20. Statistical inference for the additive hazards model under outcome-dependent sampling

    PubMed Central

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P.; Zhou, Haibo

    2015-01-01

    Cost-effective study design and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating its relative efficiency against the simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study cancer risk from radon exposure. PMID:26379363
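
    For orientation, a compact sketch of the closed-form full-cohort estimator for the Lin-Ying additive hazards model lambda_i(t) = lambda_0(t) + beta'Z_i, the baseline against which designs like ODS are compared; this is background only, not the paper's weighted pseudo-score method.

    ```python
    import numpy as np

    def lin_ying(time, event, Z):
        """Closed-form estimator for the additive hazards model
        lambda_i(t) = lambda_0(t) + beta' Z_i (Lin & Ying, full cohort)."""
        order = np.argsort(time)
        time, event, Z = time[order], event[order], Z[order]
        p = Z.shape[1]
        A, b, t_prev = np.zeros((p, p)), np.zeros(p), 0.0
        for k in range(len(time)):                 # O(n^2); fine for a sketch
            at_risk = Z[k:]                        # subjects with T_i >= time[k]
            dc = at_risk - at_risk.mean(axis=0)    # Z - Zbar(t) over the risk set
            A += (time[k] - t_prev) * (dc.T @ dc)  # integral of centered outer products
            if event[k]:
                b += dc[0]                         # Z_k - Zbar(T_k) at an event
            t_prev = time[k]
        return np.linalg.solve(A, b)

    rng = np.random.default_rng(7)
    n = 2000
    Z = rng.uniform(0, 1, (n, 1))
    t_ev = rng.exponential(1 / (0.5 + 0.3 * Z[:, 0]))  # true baseline 0.5, beta 0.3
    cens = rng.uniform(0, 4, n)
    print(lin_ying(np.minimum(t_ev, cens), t_ev <= cens, Z))  # expect approx [0.3]
    ```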

  1. Modeling HIV Risk in Highly Vulnerable Youth

    ERIC Educational Resources Information Center

    Huba, G. J.; Panter, A. T.; Melchior, Lisa A.; Trevithick, Lee; Woods, Elizabeth R.; Wright, Eric; Feudo, Rudy; Tierney, Steven; Schneir, Arlene; Tenner, Adam; Remafedi, Gary; Greenberg, Brian; Sturdevant, Marsha; Goodman, Elizabeth; Hodgins, Antigone; Wallace, Michael; Brady, Russell E.; Singer, Barney; Marconi, Katherine

    2003-01-01

    This article examines the structure of several HIV risk behaviors in an ethnically and geographically diverse sample of 8,251 clients from 10 innovative demonstration projects intended for adolescents living with, or at risk for, HIV. Exploratory and confirmatory factor analyses identified 2 risk factors for men (sexual intercourse with men and a…

  2. An introduction to modeling longitudinal data with generalized additive models: applications to single-case designs.

    PubMed

    Sullivan, Kristynn J; Shadish, William R; Steiner, Peter M

    2015-03-01

    Single-case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time in both the presence and absence of treatment. This article introduces a statistical technique for analyzing SCD data that has not been much used in psychological and educational research: generalized additive models (GAMs). In parametric regression, the researcher must choose a functional form to impose on the data, for example, that trend over time is linear. GAMs reverse this process by letting the data inform the choice of functional form. In this article we review the problem that trend poses in SCDs, discuss how current SCD analytic methods approach trend, describe GAMs as a possible solution, suggest a GAM model testing procedure for examining the presence of trend in SCDs, present a small simulation to show the statistical properties of GAMs, and illustrate the procedure on 3 examples of different lengths. Results suggest that GAMs may be very useful both as a form of sensitivity analysis for checking the plausibility of assumptions about trend and as a primary data analysis strategy for testing treatment effects. We conclude with a discussion of some problems with GAMs and some future directions for research on the application of GAMs to SCDs. PMID:24885341

  3. Optimal dividends in the Brownian motion risk model with interest

    NASA Astrophysics Data System (ADS)

    Fang, Ying; Wu, Rong

    2009-07-01

    In this paper, we consider a Brownian motion risk model, and in addition, the surplus earns investment income at a constant force of interest. The objective is to find a dividend policy so as to maximize the expected discounted value of dividend payments. It is well known that optimality is achieved by using a barrier strategy for unrestricted dividend rate. However, ultimate ruin of the company is certain if a barrier strategy is applied. In many circumstances this is not desirable. This consideration leads us to impose a restriction on the dividend stream. We assume that dividends are paid to the shareholders according to admissible strategies whose dividend rate is bounded by a constant. Under this additional constraint, we show that the optimal dividend strategy is formed by a threshold strategy.
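
    A compact statement of the controlled surplus and objective described, as a sketch; the notation (mu, sigma, r, delta, M, b*) is ours and may differ from the paper's.

    ```latex
    % Surplus with interest r, dividends paid at rate a_t bounded by M:
    \[
      dX_t \;=\; \bigl(\mu + r X_t\bigr)\,dt \;+\; \sigma\,dW_t \;-\; a_t\,dt,
      \qquad 0 \le a_t \le M .
    \]
    % Maximize expected discounted dividends up to the ruin time tau:
    \[
      V(x) \;=\; \sup_{0 \le a_t \le M}\;
      \mathbb{E}_x\!\left[\int_0^{\tau} e^{-\delta t}\, a_t \,dt\right],
      \qquad \tau \;=\; \inf\{t \ge 0 : X_t < 0\}.
    \]
    % The optimal policy is a threshold strategy: pay at the maximal rate M
    % whenever X_t exceeds some level b*, and pay nothing below it.
    ```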

  4. Bankruptcy risk model and empirical tests

    PubMed Central

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M.; Urošević, Branko; Stanley, H. Eugene

    2010-01-01

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor—the debt-to-asset ratio R—in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes’s theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees—although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903
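
    One common way to estimate such a Zipf exponent is a rank-size regression: sort the ratios in descending order and regress log size on log rank. The paper does not spell out its fitting procedure, so the sketch below, on synthetic Pareto data, is only one reasonable choice.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    R = rng.pareto(1.5, 2000) + 1.0        # synthetic debt-to-asset ratios, Pareto tail
    R_desc = np.sort(R)[::-1]              # descending: rank 1 = largest ratio
    rank = np.arange(1, len(R_desc) + 1)
    slope, _ = np.polyfit(np.log(rank), np.log(R_desc), 1)
    print(f"Zipf exponent estimate: {-slope:.2f}")   # ~1/1.5 for this synthetic tail
    ```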

  5. Bankruptcy risk model and empirical tests.

    PubMed

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M; Urosevic, Branko; Stanley, H Eugene

    2010-10-26

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor--the debt-to-asset ratio R--in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes's theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees--although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903

  6. Structured additive regression modeling of age of menarche and menopause in a breast cancer screening program.

    PubMed

    Duarte, Elisa; de Sousa, Bruno; Cadarso-Suarez, Carmen; Rodrigues, Vitor; Kneib, Thomas

    2014-05-01

    Breast cancer risk is believed to be associated with several reproductive factors, such as early menarche and late menopause. This study is based on the registries of the first time a woman enters the screening program, and presents a spatio-temporal analysis of the variables age of menarche and age of menopause along with other reproductive and socioeconomic factors. The database was provided by the Portuguese Cancer League (LPCC), a private nonprofit organization dealing with multiple issues related to oncology of which the Breast Cancer Screening Program is one of its main activities. The registry consists of 259,652 records of women who entered the screening program for the first time between 1990 and 2007 (45-69-year age group). Structured Additive Regression (STAR) models were used to explore spatial and temporal correlations with a wide range of covariates. These models are flexible enough to deal with a variety of complex datasets, allowing us to reveal possible relationships among the variables considered in this study. The analysis shows that early menarche occurs in younger women and in municipalities located in the interior of central Portugal. Women living in inland municipalities register later ages for menopause, and those born in central Portugal after 1933 show a decreasing trend in the age of menopause. Younger ages of menarche and late menopause are observed in municipalities with a higher purchasing power index. The analysis performed in this study portrays the time evolution of the age of menarche and age of menopause and their spatial characterization, adding to the identification of factors that could be of the utmost importance in future breast cancer incidence research. PMID:24615881

  7. Research on R&D Project Risk Management Model

    NASA Astrophysics Data System (ADS)

    Gu, Xiaoyan; Cai, Chen; Song, Hao; Song, Juan

    An R&D project is an exploratory, high-risk investment activity with potential management flexibility. In the R&D project risk management process, risk is hard to quantify because very little past information is available. This paper introduces quality function deployment and real options into the traditional project risk management process. Through a waterfall decomposition mode, the R&D project risk management process is constructed step by step; through real options, the managerial flexibility inherent in an R&D project can be modeled. First, a risk priority list is obtained from the relation matrix between R&D project success factors and risk indexes. Then, the risk features of the various stages are analyzed. Finally, real options are embedded into the various stages of the R&D project according to those risk features. To manage R&D risk effectively in a dynamic cycle, these steps should be carried out repeatedly.

  8. An Integrated Approach to Managing Business Process Risk Using Rich Organizational Models

    NASA Astrophysics Data System (ADS)

    Islam, M. M. Zahidul; Bhuiyan, Moshiur; Krishna, Aneesh; Ghose, Aditya

    Business processes represent the operational capabilities of an organization. To ensure process continuity, the effective management of risks becomes an area of key concern. In this paper we propose an approach for supporting risk identification with the use of higher-level organizational models. We provide some intuitive metrics for extracting measures of actor criticality and vulnerability from organizational models. This helps direct risk management to areas of critical importance within organization models. Additionally, the information can be used to assess alternative organizational structures in domains where risk mitigation is crucial. At the process level, these measures can be used to help direct improvements to the robustness and failsafe capabilities of critical or vulnerable processes. We believe our novel approach will provide added benefits when used with other approaches to risk management during business process management that do not reference the greater organizational context during risk assessment.

  9. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Selection of risk model. 425.600 Section 425.600... Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  10. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Selection of risk model. 425.600 Section 425.600... Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  11. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Selection of risk model. 425.600 Section 425.600... Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  12. Genetic predisposition to coronary heart disease and stroke using an additive genetic risk score: a population-based study in Greece

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To determine the extent to which the risk for incident coronary heart disease (CHD) increases in relation to a genetic risk score (GRS) that additively integrates the influence of high-risk alleles in nine documented single nucleotide polymorphisms (SNPs) for CHD, and to examine whether t...
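
    A minimal sketch of an additive GRS of this kind: count the high-risk alleles (0, 1 or 2) at each SNP and sum, optionally weighting each count by the SNP's per-allele log odds ratio. The genotypes and odds ratios below are placeholders, not the study's nine documented SNPs.

    ```python
    import numpy as np

    genotypes = np.array([[0, 1, 2, 0, 1, 2, 1, 0, 2],   # risk-allele counts per SNP
                          [1, 1, 0, 0, 2, 1, 0, 0, 1]])  # one row per subject
    weights = np.log(np.array([1.2, 1.1, 1.3, 1.15, 1.1, 1.25, 1.1, 1.2, 1.1]))  # per-allele ORs

    unweighted_grs = genotypes.sum(axis=1)   # simple additive GRS
    weighted_grs = genotypes @ weights       # log-odds-weighted GRS
    print(unweighted_grs, weighted_grs.round(2))
    ```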

  13. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    SciTech Connect

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment, which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]

  14. Fitting additive hazards models for case-cohort studies: a multiple imputation approach.

    PubMed

    Jung, Jinhyouk; Harel, Ofer; Kang, Sangwook

    2016-07-30

    In this paper, we consider fitting semiparametric additive hazards models for case-cohort studies using a multiple imputation approach. In a case-cohort study, main exposure variables are measured only on some selected subjects, but other covariates are often available for the whole cohort. We consider this as a special case of a missing covariate by design. We propose to employ a popular incomplete data method, multiple imputation, for estimation of the regression parameters in additive hazards models. For imputation models, an imputation modeling procedure based on a rejection sampling is developed. A simple imputation modeling that can naturally be applied to a general missing-at-random situation is also considered and compared with the rejection sampling method via extensive simulation studies. In addition, a misspecification aspect in imputation modeling is investigated. The proposed procedures are illustrated using a cancer data example. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26194861
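
    A generic sketch of the multiple-imputation workflow the paper builds on: impute the design-missing covariate M times, fit the analysis model to each completed data set, and pool with Rubin's rules. For self-containment the analysis model here is a linear regression standing in for the additive hazards fit, and the imputation is simplified ('improper': imputation-model parameters are not redrawn); the paper's rejection-sampling imputation model is not reproduced.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n, M = 500, 20
    z = rng.normal(size=n)                      # covariate available for the whole cohort
    x = 0.8 * z + rng.normal(size=n)            # exposure measured only on selected subjects
    y = 1.0 + 0.5 * x + 0.3 * z + rng.normal(size=n)
    selected = rng.random(n) < 0.3              # missing-by-design indicator

    imp = sm.OLS(x[selected], sm.add_constant(z[selected])).fit()   # imputation model x | z
    betas, variances = [], []
    for _ in range(M):
        x_imp = x.copy()
        mu = imp.predict(sm.add_constant(z[~selected]))
        x_imp[~selected] = mu + rng.normal(0, np.sqrt(imp.scale), (~selected).sum())
        fit = sm.OLS(y, sm.add_constant(np.column_stack([x_imp, z]))).fit()
        betas.append(fit.params[1])
        variances.append(fit.bse[1] ** 2)

    pooled = np.mean(betas)                                          # Rubin's rules
    total_var = np.mean(variances) + (1 + 1 / M) * np.var(betas, ddof=1)
    print(f"pooled beta_x = {pooled:.3f} (SE {np.sqrt(total_var):.3f})")
    ```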

  15. A Social Ecological Model of Syndemic Risk affecting Women with and At-Risk for HIV in Impoverished Urban Communities.

    PubMed

    Batchelder, A W; Gonzalez, J S; Palma, A; Schoenbaum, E; Lounsbury, D W

    2015-12-01

    Syndemic risk is an ecological construct, defined by co-occurring interdependent socio-environmental, interpersonal and intrapersonal determinants. We posited syndemic risk to be a function of violence, substance use, perceived financial hardship, emotional distress and self-worth among women with and at-risk for HIV in an impoverished urban community. In order to better understand these interrelationships, we developed and validated a system dynamics (SD) model based upon peer-reviewed literature; secondary data analyses of a cohort dataset including women living with and at-risk of HIV in Bronx, NY (N = 620); and input from a Bronx-based community advisory board. Simulated model output revealed divergent levels and patterns of syndemic risk over time across different sample profiles. Outputs generated new insights about how to effectively explore multicomponent multi-level programs in order to strategically develop more effective services for this population. Specifically, the model indicated that effective multi-level interventions might bolster women's resilience by increasing self-worth, which may result in decreased perceived financial hardship and risk of violence. Overall, our stakeholder-informed model depicts how self-worth may be a major driver of vulnerability and a meaningful addition to syndemic theory affecting this population. PMID:26370203

  16. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  17. Modeling biotic habitat high risk areas

    USGS Publications Warehouse

    Despain, D.G.; Beier, P.; Tate, C.; Durtsche, B.M.; Stephens, T.

    2000-01-01

    Fire, especially stand-replacing fire, poses a threat to many threatened and endangered species as well as their habitat. On the other hand, fire is important in maintaining a variety of successional stages that can be important for these species. This paper describes a modeling approach to risk assessment to assist in prioritizing areas for allocation of fire mitigation funds. One example looks at assessing risk to the species and biotic communities of concern tracked by the Colorado Natural Heritage Program. One looks at the risk to Mexican spotted owls. Another looks at the risk to cutthroat trout, and a fourth considers the general effects of fire on elk.

  18. MULTIMEDIA HUMAN EXPOSURE AND RISK ASSESSMENT MODELING

    EPA Science Inventory

    Exposures and health risk comparisons from different sites may be used for allocating limited resources available for remedial action. It is important that comparisons between different sites use similar levels of site-specific data and/or screening level data. Risk assessment c...

  19. Uses and Abuses of Models in Radiation Risk Management

    SciTech Connect

    Strom, Daniel J.

    1998-12-10

    This paper is a high-level overview of managing risks to workers, public, and the environment. It discusses the difference between a model and a hypothesis. The need for models in risk assessment is justified, and then it is shown that radiation risk models that are usable in risk management are highly simplistic. The weight of evidence is considered for and against the linear non-threshold (LNT) model for carcinogenesis and heritable ill-health that is currently the basis for radiation risk management. Finally, uses and misuses of this model are considered. It is concluded that the LNT model continues to be suitable for use as the basis for radiation protection.

  20. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires the mathematization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, and danger. To address this problem, fundamental theoretical relationships between risk and risk management were first analyzed mathematically and illustrated with charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On this basis, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, with emphasis on the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was investigated, revealing that the FSA method, which the International Maritime Organization (IMO) is actively promoting, derives from risk management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as about areas where further research is required.

  1. Increased bioclogging and corrosion risk by sulfate addition during iodine recovery at a natural gas production plant.

    PubMed

    Lim, Choon-Ping; Zhao, Dan; Takase, Yuta; Miyanaga, Kazuhiko; Watanabe, Tomoko; Tomoe, Yasuyoshi; Tanji, Yasunori

    2011-02-01

    Iodine recovery at a natural gas production plant in Japan involved the addition of sulfuric acid for pH adjustment, resulting in approximately 200 mg/L of additional sulfate in the waste brine after iodine recovery. Bioclogging occurred at the waste brine injection well, causing a decrease in well injectivity. To examine the factors that contribute to bioclogging, an on-site experiment was conducted by amending 10 L of brine with different conditions and then incubating the brine for 5 months under open air. The control case was exposed to open air but did not receive additional chemicals. When sulfate addition was coupled with low iodine, there was a drastic increase in the total amount of accumulated biomass (and subsequently the risk of bioclogging) that was nearly six times higher than the control. The bioclogging-associated corrosion rate of carbon steel was 84.5 μm/year, which is four times higher than that observed under other conditions. Analysis of the microbial communities by denaturing gradient gel electrophoresis revealed that the additional sulfate established a sulfur cycle and induced the growth of phototrophic bacteria, including cyanobacteria and purple bacteria. In the presence of sulfate and low iodine levels, cyanobacteria and purple bacteria bloomed, and the accumulation of abundant biomass may have created a more conducive environment for anaerobic sulfate-reducing bacteria. It is believed that the higher corrosion rate was caused by a differential aeration cell that was established by the heterogeneous distribution of the biomass that covered the surface of the test coupons. PMID:20922384

  2. Building risk-on-a-chip models to improve breast cancer risk assessment and prevention

    PubMed Central

    Vidi, Pierre-Alexandre; Leary, James; Lelièvre, Sophie A.

    2013-01-01

    Summary Preventive actions for chronic diseases hold the promise of improving lives and reducing healthcare costs. For several diseases, including breast cancer, multiple risk and protective factors have been identified by epidemiologists. The impact of most of these factors has yet to be fully understood at the organism, tissue, cellular and molecular levels. Importantly, combinations of external and internal risk and protective factors involve cooperativity thus, synergizing or antagonizing disease onset. Models are needed to mechanistically decipher cancer risks under defined cellular and microenvironmental conditions. Here, we briefly review breast cancer risk models based on 3D cell culture and propose to improve risk modeling with lab-on-a-chip approaches. We suggest epithelial tissue polarity, DNA repair and epigenetic profiles as endpoints in risk assessment models and discuss the development of ‘risks-on-chips’ integrating biosensors of these endpoints and of general tissue homeostasis. Risks-on-chips will help identify biomarkers of risk, serve as screening platforms for cancer preventive agents, and provide a better understanding of risk mechanisms, hence resulting in novel developments in disease prevention. PMID:23681255

  3. On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis

    PubMed Central

    Li, Bing; Chun, Hyonho; Zhao, Hongyu

    2014-01-01

    We introduce a nonparametric method for estimating non-gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use one-dimensional kernel regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis. PMID:26401064

  4. Meat and bone meal and mineral feed additives may increase the risk of oral prion disease transmission

    USGS Publications Warehouse

    Johnson, Christopher J.; McKenzie, Debbie; Pedersen, Joel A.; Aiken, Judd M.

    2011-01-01

    Ingestion of prion-contaminated materials is postulated to be a primary route of prion disease transmission. Binding of prions to soil (micro)particles dramatically enhances peroral disease transmission relative to unbound prions, and it was hypothesized that micrometer-sized particles present in other consumed materials may affect prion disease transmission via the oral route of exposure. Small, insoluble particles are present in many substances, including soil, human foods, pharmaceuticals, and animal feeds. It is known that meat and bone meal (MBM), a feed additive believed responsible for the spread of bovine spongiform encephalopathy (BSE), contains particles smaller than 20 μm and that the pathogenic prion protein binds to MBM. The potentiation of disease transmission via the oral route by exposure to MBM or three micrometer-sized mineral feed additives was determined. Data showed that when the disease agent was bound to any of the tested materials, the penetrance of disease was increased compared to unbound prions. Our data suggest that in feed or other prion-contaminated substances consumed by animals or, potentially, humans, the addition of MBM or the presence of microparticles could heighten risks of prion disease acquisition.

  5. Meat and bone meal and mineral feed additives may increase the risk of oral prion disease transmission

    USGS Publications Warehouse

    Johnson, C.J.; McKenzie, D.; Pedersen, J.A.; Aiken, Judd M.

    2011-01-01

    Ingestion of prion-contaminated materials is postulated to be a primary route of prion disease transmission. Binding of prions to soil (micro)particles dramatically enhances peroral disease transmission relative to unbound prions, and it was hypothesized that micrometer-sized particles present in other consumed materials may affect prion disease transmission via the oral route of exposure. Small, insoluble particles are present in many substances, including soil, human foods, pharmaceuticals, and animal feeds. It is known that meat and bone meal (MBM), a feed additive believed responsible for the spread of bovine spongiform encephalopathy (BSE), contains particles smaller than 20 μm and that the pathogenic prion protein binds to MBM. The potentiation of disease transmission via the oral route by exposure to MBM or three micrometer-sized mineral feed additives was determined. Data showed that when the disease agent was bound to any of the tested materials, the penetrance of disease was increased compared to unbound prions. Our data suggest that in feed or other prion-contaminated substances consumed by animals or, potentially, humans, the addition of MBM or the presence of microparticles could heighten risks of prion disease acquisition. Copyright © 2011 Taylor & Francis Group, LLC.

  6. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    Normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of returns for FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two component univariate normal mixture distributions model. First, we present the application of normal mixture distributions model in empirical finance where we fit our real data. Second, we present the application of normal mixture distributions model in risk analysis where we apply the normal mixture distributions model to evaluate the value at risk (VaR) and conditional value at risk (CVaR) with model validation for both risk measures. The empirical results provide evidence that using the two components normal mixture distributions model can fit the data well and can perform better in estimating value at risk (VaR) and conditional value at risk (CVaR) where it can capture the stylized facts of non-normality and leptokurtosis in returns distribution.
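
    A sketch of the two steps on synthetic returns (not the FBMKLCI series): fit a two-component normal mixture, then obtain VaR as the alpha-quantile of the mixture CDF by root-finding and CVaR as the corresponding tail expectation by numerical integration.

    ```python
    import numpy as np
    from scipy import stats, optimize, integrate
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(8)
    r = np.concatenate([rng.normal(0.01, 0.02, 800),    # calm regime
                        rng.normal(-0.02, 0.06, 200)])  # turbulent regime

    gm = GaussianMixture(n_components=2, random_state=0).fit(r.reshape(-1, 1))
    w, mu, sd = gm.weights_, gm.means_.ravel(), np.sqrt(gm.covariances_.ravel())

    cdf = lambda x: np.sum(w * stats.norm.cdf(x, mu, sd))   # mixture CDF
    pdf = lambda x: np.sum(w * stats.norm.pdf(x, mu, sd))   # mixture density

    alpha = 0.05
    var = optimize.brentq(lambda x: cdf(x) - alpha, -1.0, 1.0)          # 5% VaR quantile
    cvar = integrate.quad(lambda x: x * pdf(x), -np.inf, var)[0] / alpha  # tail expectation
    print(f"VaR(5%) = {var:.4f}, CVaR(5%) = {cvar:.4f}")
    ```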

  7. A Dual-Process Approach to Health Risk Decision Making: The Prototype Willingness Model

    ERIC Educational Resources Information Center

    Gerrard, Meg; Gibbons, Frederick X.; Houlihan, Amy E.; Stock, Michelle L.; Pomery, Elizabeth A.

    2008-01-01

    Although dual-process models in cognitive, personality, and social psychology have stimulated a large body of research about analytic and heuristic modes of decision making, these models have seldom been applied to the study of adolescent risk behaviors. In addition, the developmental course of these two kinds of information processing, and their…

  8. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for risk assessment application after derivation of necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the forms of maps and three-dimensional surfaces. PMID:21571428

  9. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  10. A Process Model for Assessing Adolescent Risk for Suicide.

    ERIC Educational Resources Information Center

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level or risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  11. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. PMID:26010201
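    The paper's explicit formulas are not reproduced here; as a rough numerical stand-in, the sketch below estimates a finite-time failure probability by Monte Carlo for one assumed risk process (linear drift plus compound Poisson jumps) and a linear time-dependent critical level. The dynamics and all parameters are illustrative assumptions, not the authors' models.

        import numpy as np

        rng = np.random.default_rng(1)

        def failure_probability(T=10.0, drift=0.5, rate=1.0, jump_mean=1.0,
                                b0=5.0, b1=0.8, n_paths=20000):
            """P(R(t) >= b(t) for some t <= T), with R(t) = drift*t plus
            compound Poisson(rate) jumps with Exp(jump_mean) sizes, and
            critical level b(t) = b0 + b1*t. Because the jumps are upward and
            drift < b1, a crossing can only happen at a jump epoch, so it
            suffices to check the process right after each jump."""
            failures = 0
            for _ in range(n_paths):
                t, jumps = 0.0, 0.0
                while True:
                    t += rng.exponential(1.0 / rate)   # next jump time
                    if t > T:
                        break
                    jumps += rng.exponential(jump_mean)
                    if drift * t + jumps >= b0 + b1 * t:
                        failures += 1
                        break
            return failures / n_paths

        print(f"estimated failure probability: {failure_probability():.3f}")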

  12. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of RPN values of items of a system, and the maintenance significant precipitating factors (MSPF) of items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient, called the hazardous risk coefficient, covers anticipated hazards which may occur in the future; here the risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding cost for consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of critical items are then estimated. The prioritization in ranking of critical items using the developed mathematical model for risk assessment shall be useful in optimization of financial losses and timing of maintenance actions.

  13. Modelling public risk evaluation of natural hazards: a conceptual approach

    NASA Astrophysics Data System (ADS)

    Plattner, Th.

    2005-04-01

    In recent years, dealing with natural hazards in Switzerland has shifted away from being hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources, causes an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economical, ecological and social considerations. This modern practice requires an approach in which not only technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk are considered, but also the public concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they don't know what risk level the public is willing to accept. Consequently, there exists a need for the authorities to know what the society thinks about risks. A formalized model that allows at least a crude simulation of the public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach of such an evaluation model using perception affecting factors PAF, evaluation criteria EC and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by comparing the perceived risk R_i,perc with the acceptable risk R_i,acc.

  14. Modeling financial disaster risk management in developing countries

    NASA Astrophysics Data System (ADS)

    Mechler, R.; Hochrainer, S.; Pflug, G.; Linnerooth-Bayer, J.

    2005-12-01

    The public sector plays a major role in reducing the long-term economic repercussions of disasters by repairing damaged infrastructure and providing financial assistance to households and businesses. If critical infrastructure is not repaired in a timely manner, there can be serious effects on the economy and the livelihoods of the population. The repair of public infrastructure, however, can be a significant drain on public budgets especially in developing and transition countries. Developing country governments frequently lack the liquidity, even including international aid and loans, to fully repair damaged critical public infrastructure or provide sufficient support to households and businesses for their recovery. The earthquake in Gujarat, and other recent cases of government post-disaster liquidity crises, have sounded an alarm, prompting financial development organizations, such as the World Bank, among others, to call for greater attention to reducing financial vulnerability and increasing the resilience of the public sector. This talk reports on a model designed to illustrate the tradeoffs and choices a developing country must make in financially managing the economic risks due to natural disasters. Budgetary resources allocated to pre-disaster risk management strategies, such as loss mitigation measures, a catastrophe reserve fund, insurance and contingent credit arrangements for public assets, reduce the probability of financing gaps - the inability of governments to meet their full obligations in providing relief to private victims and restoring public infrastructure - or prevent the deterioration of the ability to undertake additional borrowing without incurring a debt crisis. The model - which is equipped with a graphical interface - can be a helpful tool for building capacity of policy makers for developing and assessing public financing strategies for disaster risk by indicating the respective costs and consequences of financing alternatives.
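    A minimal sketch of the kind of tradeoff such a model explores, with entirely hypothetical numbers: a reserve fund accumulates an annual budget net of insurance premiums, disasters arrive randomly with heavy-tailed losses, and a "financing gap" occurs when retained losses exhaust the fund.

        import numpy as np

        rng = np.random.default_rng(2)

        def financing_gap_probability(reserve=50.0, cover=50.0, premium_rate=0.05,
                                      budget=20.0, years=10, n_sims=20000):
            """P(the fund is exhausted at least once over the horizon).
            All monetary units and distributions are illustrative assumptions."""
            gaps = 0
            for _ in range(n_sims):
                fund = reserve
                for _ in range(years):
                    fund += budget - premium_rate * cover   # budget net of premium
                    if rng.random() < 0.2:                  # a disaster year
                        loss = rng.pareto(1.5) * 30.0       # heavy-tailed loss
                        fund -= max(loss - cover, 0.0)      # retained after insurance
                    if fund < 0:
                        gaps += 1
                        break
            return gaps / n_sims

        print(f"P(financing gap within 10 years): {financing_gap_probability():.3f}")

    Varying the cover and reserve parameters then traces out the cost/consequence tradeoff between financing alternatives that the talk describes.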

  15. A comprehensive Network Security Risk Model for process control networks.

    PubMed

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example. PMID:19000078

  16. Genomic prediction of growth in pigs based on a model including additive and dominance effects.

    PubMed

    Lopes, M S; Bastiaansen, J W M; Janss, L; Knol, E F; Bovenhuis, H

    2016-06-01

    Independent of whether prediction is based on pedigree or genomic information, the focus of animal breeders has been on additive genetic effects or 'breeding values'. However, when predicting phenotypes rather than breeding values of an animal, models that account for both additive and dominance effects might be more accurate. Our aim with this study was to compare the accuracy of predicting phenotypes using a model that accounts for only additive effects (MA) and a model that accounts for both additive and dominance effects simultaneously (MAD). Lifetime daily gain (DG) was evaluated in three pig populations (1424 Pietrain, 2023 Landrace, and 2157 Large White). Animals were genotyped using the Illumina SNP60K Beadchip and assigned to either a training data set to estimate the genetic parameters and SNP effects, or to a validation data set to assess the prediction accuracy. Models MA and MAD applied random regression on SNP genotypes and were implemented in the program Bayz. The additive heritability of DG across the three populations and the two models was very similar at approximately 0.26. The proportion of phenotypic variance explained by dominance effects ranged from 0.04 (Large White) to 0.11 (Pietrain), indicating that the importance of dominance might be breed-specific. Prediction accuracies were higher when predicting phenotypes using total genetic values (sum of breeding values and dominance deviations) from the MAD model compared to using breeding values from both MA and MAD models. The highest increase in accuracy (from 0.195 to 0.222) was observed in the Pietrain, and the lowest in Large White (from 0.354 to 0.359). Predicting phenotypes using total genetic values instead of breeding values in purebred data improved prediction accuracy and reduced the bias of genomic predictions. An additional benefit of the method is expected when it is applied to predict crossbred phenotypes, where dominance levels are expected to be higher. PMID:26676611
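    The study fits Bayesian random-regression models in the Bayz program; as a loose stand-in, the sketch below uses ridge regression to show the mechanics of the MA versus MAD comparison, with the usual additive ({0,1,2}) and dominance ({0,1,0}) genotype codings and prediction from total genetic values. Genotypes, effect sizes and the shrinkage parameter are all invented for illustration.

        import numpy as np
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(3)
        n_animals, n_snps = 500, 1000

        G = rng.integers(0, 3, size=(n_animals, n_snps))   # genotypes 0/1/2
        X_add = G.astype(float)                            # additive coding
        X_dom = (G == 1).astype(float)                     # dominance coding

        # Simulate a trait with additive and dominance effects plus noise.
        a = rng.normal(0, 0.05, n_snps)
        d = rng.normal(0, 0.03, n_snps)
        y = X_add @ a + X_dom @ d + rng.normal(0, 1.0, n_animals)

        train, test = slice(0, 400), slice(400, 500)

        # MA: additive effects only. MAD: additive + dominance jointly, so its
        # predictions are total genetic values (breeding value + dominance).
        X_mad = np.hstack([X_add, X_dom])
        ma = Ridge(alpha=50.0).fit(X_add[train], y[train])
        mad = Ridge(alpha=50.0).fit(X_mad[train], y[train])

        def accuracy(model, X):
            return np.corrcoef(model.predict(X[test]), y[test])[0, 1]

        print("MA accuracy :", round(accuracy(ma, X_add), 3))
        print("MAD accuracy:", round(accuracy(mad, X_mad), 3))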

  17. The effects of vehicle model and driver behavior on risk.

    PubMed

    Wenzel, Thomas P; Ross, Marc

    2005-05-01

    We study the dependence of risk on vehicle type and especially on vehicle model. Here, risk is measured by the number of driver fatalities per year per million vehicles registered. We analyze both the risk to the drivers of each vehicle model and the risk the vehicle model imposes on drivers of other vehicles with which it crashes. The "combined risk" associated with each vehicle model is simply the sum of the risk-to-drivers in all kinds of crashes and the risk-to-drivers-of-other-vehicles in two-vehicle crashes. We find that most car models are as safe to their drivers as most sport utility vehicles (SUVs); the increased risk of a rollover in a SUV roughly balances the higher risk for cars that collide with SUVs and pickup trucks. We find that SUVs, and to a greater extent pickup trucks, impose much greater risks than cars on drivers of other vehicles, and these risks increase with increasing pickup size. The higher aggressivity of SUVs and pickups makes their combined risk higher than that of almost all cars. Effects of light truck design on their risk are revealed by the analysis of specific models: new unibody (or "crossover") SUVs appear, in preliminary analysis, to have much lower risks than the most popular truck-based SUVs. Much has been made in the past about the high risk of low-mass cars in certain kinds of collisions. We find there are other plausible explanations for this pattern of risk, which suggests that mass may not be fundamental to safety. While not conclusive, this is potentially important because improvement in fuel economy is a major goal for designers of new vehicles. We find that accounting for the most risky drivers, young males and the elderly, does not change our general results. Similarly, we find with California data that the high risk of rural driving and the high level of rural driving by pickups does not increase the risk-to-drivers of pickups relative to that for cars. However, other more subtle differences in drivers and the…

  18. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk. PMID:19087232

  19. A Multiple Risk Factors Model of the Development of Aggression among Early Adolescents from Urban Disadvantaged Neighborhoods

    ERIC Educational Resources Information Center

    Kim, Sangwon; Orpinas, Pamela; Kamphaus, Randy; Kelder, Steven H.

    2011-01-01

    This study empirically derived a multiple risk factors model of the development of aggression among middle school students in urban, low-income neighborhoods, using Hierarchical Linear Modeling (HLM). Results indicated that aggression increased from sixth to eighth grade. Additionally, the influences of four risk domains (individual, family,…

  20. Injury count model for quantification of risk of occupational injury.

    PubMed

    Khanzode, Vivek V; Maiti, J; Ray, P K

    2011-06-01

    Reduction of risk of occupational injuries is one of the most challenging problems faced by industry. Assessing and comparing risks involved in different jobs is one of the important steps towards reducing injury risk. In this study, a comprehensive scheme is given for assessing and comparing injury risks with the development of an injury count model, an injury risk model, and derived statistics. The hazards present in a work system and the nature of the job carried out by workers are perceived as important drivers of the injury potential of a work system. A loglinear model is used to quantify injury counts, and the event-tree approach with joint, marginal and conditional probabilities is used to quantify injury risk. A case study was carried out in an underground coal mine. Finally, a number of indices are proposed for the case study mine to capture the risk of injury in different jobs. The findings of this study will help in designing injury intervention strategies for the mine studied. The job-wise risk profiles will be used to prioritise the jobs for redesign. The absolute indices can be applied for benchmarking job-wise risks and the relative indices can be used for comparing job-wise risks across work systems. PMID:21432706
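    A hedged sketch of the loglinear step with statsmodels, using invented counts cross-classified by job and hazard (not the mine's data); joint and conditional probabilities in the spirit of the event-tree quantities are then read off the fitted cells.

        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical injury counts by job and hazard type.
        df = pd.DataFrame({
            "job":    ["drilling", "drilling", "hauling", "hauling", "support", "support"],
            "hazard": ["roof_fall", "machinery"] * 3,
            "count":  [12, 7, 5, 14, 3, 4],
        })

        # Loglinear (Poisson) model for the injury counts.
        fit = smf.glm("count ~ job + hazard", data=df,
                      family=sm.families.Poisson()).fit()

        # Joint and conditional probabilities from the fitted counts.
        df["p_joint"] = fit.fittedvalues / fit.fittedvalues.sum()
        df["p_hazard_given_job"] = (df["p_joint"]
                                    / df.groupby("job")["p_joint"].transform("sum"))
        print(df[["job", "hazard", "p_hazard_given_job"]])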

  1. Modeling oxygen dissolution and biological uptake during pulse oxygen additions in oenological fermentations.

    PubMed

    Saa, Pedro A; Moenne, M Isabel; Pérez-Correa, J Ricardo; Agosin, Eduardo

    2012-09-01

    Discrete oxygen additions during oenological fermentations can have beneficial effects both on yeast performance and on the resulting wine quality. However, the amount and time of the additions must be carefully chosen to avoid detrimental effects. So far, most oxygen additions are carried out empirically, since the oxygen dynamics in the fermenting must are not completely understood. To efficiently manage oxygen dosage, we developed a mass balance model of the kinetics of oxygen dissolution and biological uptake during wine fermentation on a laboratory scale. Model calibration was carried out employing a novel dynamic desorption-absorption cycle based on two optical sensors able to generate enough experimental data for the precise determination of oxygen uptake and volumetric mass transfer coefficients. A useful system for estimating the oxygen solubility in defined medium and musts was also developed and incorporated into the mass balance model. Results indicated that several factors, such as the fermentation phase, wine composition, mixing and carbon dioxide concentration, must be considered when performing oxygen addition during oenological fermentations. The present model will help develop better oxygen addition policies in wine fermentations on an industrial scale. PMID:22349928
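    A toy version of such a mass balance, with one state and invented parameters: dissolved oxygen gains k_La(C* - C) from gas-liquid transfer, loses a first-order biological uptake term, and each pulse addition is modelled as an instantaneous jump in concentration. The authors' calibrated model is considerably richer than this.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters only: kla (1/h), saturation c_sat and dissolved
        # oxygen c (mg/L), first-order uptake constant k_up (1/h).
        kla, c_sat, k_up = 2.0, 8.0, 5.0

        def oxygen_balance(t, y):
            c = y[0]
            dissolution = kla * (c_sat - c)   # gas-liquid transfer
            uptake = k_up * c                 # biological consumption by yeast
            return [dissolution - uptake]

        # Integrate between pulses; a pulse instantly raises C by 2 mg/L.
        c0, segments = 0.5, []
        for t0, t1 in [(0.0, 2.0), (2.0, 4.0)]:
            seg = solve_ivp(oxygen_balance, (t0, t1), [c0], max_step=0.01)
            segments.append(seg)
            c0 = seg.y[0, -1] + 2.0           # pulse at the end of the segment

        print(f"O2 before first pulse: {segments[0].y[0, -1]:.2f} mg/L, "
              f"after: {segments[0].y[0, -1] + 2.0:.2f} mg/L")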

  2. Assessing Academic Risk of Student-Athletes: Applicability of the NCAA Graduation Risk Overview Model to GPA

    ERIC Educational Resources Information Center

    Johnson, James

    2013-01-01

    In an effort to standardize academic risk assessment, the NCAA developed the graduation risk overview (GRO) model. Although this model was designed to assess graduation risk, its ability to predict grade-point average (GPA) remained unknown. Therefore, 134 individual risk assessments were made to determine GRO model effectiveness in the…

  3. Ecological risk assessment of water environment for Luanhe River Basin based on relative risk model.

    PubMed

    Liu, Jingling; Chen, Qiuying; Li, Yongli

    2010-11-01

    The relative risk model (RRM) has been applied successfully in regional ecological risk assessments. In this study, the RRM was developed by increasing the data on risk sources and introducing the source-stressor-habitat exposure filter (SSH), the endpoint-habitat exposure filter (EH) and the stressor-endpoint effect filter (SE) to make the meaning of exposure and effect more explicit. The water environment, which includes water quality, water quantity and aquatic ecosystems, was selected as the ecological risk assessment endpoint. The Luanhe River Basin, located in North China, was selected as the model case. The results showed that there were three low-risk regions, one medium-risk region and two high-risk regions in the Luanhe River Basin. The results also indicated that habitat destruction was the largest stressor for the Luanhe water environment, with a risk score as high as 11.87; the second was oxygen-consuming organic pollutants (9.28) and the third was nutrients (7.78). These three stressors were thus the main influencing factors of the ecological pressure in the study area. Furthermore, animal husbandry was the biggest source, with a risk score as high as 20.38; the second was domestic sewage (14.00), and the third was polluting industry (9.96). Among habitats, waters and farmland were enduring the greatest pressure and deserve considerable attention. Water deterioration and damage to ecological service values faced the greatest risk pressure, followed by biodiversity decline and landscape fragmentation. PMID:20683654
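    The RRM's rank-times-filter arithmetic can be illustrated with invented ranks and 0/0.5/1 filter weights (the study's actual scores are not reproduced here):

        # Relative risk model sketch: regional risk = sum over source-habitat
        # pairs of rank(source) x rank(habitat) x exposure filter x effect filter.
        # All ranks and weights below are hypothetical.
        sources = {"animal_husbandry": 6, "domestic_sewage": 4}
        habitats = {"waters": 6, "farmland": 4}
        exposure = {("animal_husbandry", "waters"): 1.0,
                    ("animal_husbandry", "farmland"): 0.5,
                    ("domestic_sewage", "waters"): 1.0,
                    ("domestic_sewage", "farmland"): 0.0}
        effect = 0.5   # stressor-endpoint effect filter, held constant here

        risk = sum(s_rank * h_rank * exposure[(s, h)] * effect
                   for s, s_rank in sources.items()
                   for h, h_rank in habitats.items())
        print("regional relative risk score:", risk)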

  4. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  5. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    SciTech Connect

    Carlson, T.M.; Gregory, S.D.

    1995-12-31

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs). However, they have not yet verified a true impact on the burrowing populations. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use those areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program which ensures that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  6. Vector generalized additive models for extreme rainfall data analysis (study case rainfall data in Indramayu)

    NASA Astrophysics Data System (ADS)

    Utami, Eka Putri Nur; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall patterns are good indicators of potential disasters. A Global Circulation Model (GCM) contains global-scale information that can be used to predict rainfall data. Statistical downscaling (SD) utilizes the global-scale information to make inferences at the local scale. Essentially, SD can be used to predict local-scale variables based on global-scale variables. SD requires a method to accommodate nonlinear effects and extreme values. Extreme Value Theory (EVT) can be used to analyze the extreme values. One of the methods to identify extreme events is peaks over threshold, where the excesses follow the Generalized Pareto Distribution (GPD). The vector generalized additive model (VGAM) is an extension of the generalized additive model. It is able to accommodate linear or nonlinear effects by involving more than one additive predictor. The advantage of VGAM is its ability to handle multi-response models. The key ideas of VGAM are iteratively reweighted least squares for maximum likelihood estimation, penalized smoothing, Fisher scoring and additive models. This work aims to analyze extreme rainfall data in Indramayu using VGAM. The results show that the VGAM with GPD is able to predict extreme rainfall data accurately. The prediction in February is very close to the actual value at quantile 75.
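    VGAM itself is an R package, so no attempt is made to reproduce it here; the sketch below shows only the peaks-over-threshold/GPD ingredient with scipy on synthetic rainfall, including the standard formula that maps the fitted excess distribution back to a quantile on the original scale. Threshold choice and data are assumptions.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(4)
        # Toy daily rainfall (mm) standing in for the Indramayu series.
        rain = rng.gamma(shape=0.6, scale=12.0, size=5000)

        u = np.quantile(rain, 0.95)           # high threshold
        excess = rain[rain > u] - u           # peaks over threshold

        # Fit the GPD to the excesses (location fixed at 0).
        xi, _, sigma = genpareto.fit(excess, floc=0)

        # Quantile on the original scale implied by the POT model: for p above
        # the threshold probability, x_p = u + GPD quantile of the excess, with
        # zeta = P(X > u) estimated empirically.
        p, zeta = 0.99, excess.size / rain.size
        x_p = u + genpareto.ppf(1 - (1 - p) / zeta, xi, loc=0, scale=sigma)
        print(f"threshold u = {u:.1f} mm, 99th percentile from GPD: {x_p:.1f} mm")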

  7. HUMAN EXPOSURE MODELING FOR CUMULATIVE RISK

    EPA Science Inventory

    US EPA's Office of Research and Development (ORD) has identified cumulative risk assessment as a priority research area. This is because humans and other organisms are exposed to a multitude of chemicals, physical agents, and other stressors through multiple pathways, routes, an...

  8. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  9. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    SciTech Connect

    Doligez, B.; Eschard, R.; Geffroy, F.

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model which is informed with petrophysical properties. Scaling-up techniques then allow a reservoir model compatible with the fluid flow simulators to be obtained. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability is the same as the variability computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with an excellent areal coverage, but with a poor vertical resolution. New advances in modelling techniques now allow this type of additional external information to be integrated in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps coming from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  10. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    PubMed

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights of measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods. PMID:26328545

  11. User's guide for the Simplified Risk Model (SRM)

    SciTech Connect

    Peatross, R.G.; Eide, S.A.

    1996-10-01

    SRM can be used to quickly compare relative values relating to risk for many environmental management activities or alternatives at US DOE sites. The purpose of this guide is to provide the user with the essential values and decision points for each model variable. The numerical results are useful for ranking and screening purposes and should not be compared directly against absolute risk numerical results such as in CERCLA baseline risk assessments or Safety Analysis Reports. Implementing the SRM entails performing several preliminary steps, selecting values of the risk elements, calculating the risk equations, and checking the results. SRM considers two types of waste management states: inactive (rest) and active (transition). SRM considers risk from exposures to radionuclides and hazardous chemicals, as well as industrial hazards; however, this user's guide does not cover risk from industrial hazards (Section 10 of Eide et al. (1996) must be consulted).

  12. Risk models and scores for type 2 diabetes: systematic review

    PubMed Central

    Mathur, Rohini; Dent, Tom; Meads, Catherine; Greenhalgh, Trisha

    2011-01-01

    Objective To evaluate current risk models and scores for type 2 diabetes and inform selection and implementation of these in practice. Design Systematic review using standard (quantitative) and realist (mainly qualitative) methodology. Inclusion criteria Papers in any language describing the development or external validation, or both, of models and scores to predict the risk of an adult developing type 2 diabetes. Data sources Medline, PreMedline, Embase, and Cochrane databases were searched. Included studies were citation tracked in Google Scholar to identify follow-on studies of usability or impact. Data extraction Data were extracted on statistical properties of models, details of internal or external validation, and use of risk scores beyond the studies that developed them. Quantitative data were tabulated to compare model components and statistical properties. Qualitative data were analysed thematically to identify mechanisms by which use of the risk model or score might improve patient outcomes. Results 8864 titles were scanned, 115 full text papers considered, and 43 papers included in the final sample. These described the prospective development or validation, or both, of 145 risk prediction models and scores, 94 of which were studied in detail here. They had been tested on 6.88 million participants followed for up to 28 years. Heterogeneity of primary studies precluded meta-analysis. Some but not all risk models or scores had robust statistical properties (for example, good discrimination and calibration) and had been externally validated on a different population. Genetic markers added nothing to models over clinical and sociodemographic factors. Most authors described their score as "simple" or "easily implemented," although few were specific about the intended users and under what circumstances. Ten mechanisms were identified by which measuring diabetes risk might improve outcomes. Follow-on studies that applied a risk score as part of an…

  13. Experimental model and analytic solution for real-time observation of vehicle's additional steer angle

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolong; Li, Liang; Pan, Deng; Cao, Chengmao; Song, Jian

    2014-03-01

    Current research on real-time observation of the vehicle roll steer angle and compliance steer angle (both referred to collectively as the additional steer angle in this paper) mainly employs the linear vehicle dynamic model, in which only the lateral acceleration of the vehicle body is considered. The observation accuracy of this method cannot meet the requirements of real-time vehicle stability control, especially under extreme driving conditions. This paper explores a solution based on an experimental method. Firstly, a multi-body dynamic model of a passenger car is built with the ADAMS/Car software, whose dynamic accuracy is verified against the same vehicle's road test data from steady-state circular tests. Based on this simulation platform, several influencing factors of the additional steer angle under different driving conditions are quantitatively analyzed. Then the ε-SVR algorithm is employed to build the additional steer angle prediction model, whose input vectors mainly comprise the sensor information of a standard electronic stability control system (ESC). Typical slalom tests and FMVSS 126 tests are used to run simulations, train the model and test its generalization performance. The test results show that the influence of lateral acceleration on the additional steer angle is maximal (magnitude up to 1°), followed by longitudinal acceleration-deceleration and road wave amplitude (magnitude up to 0.3°). Moreover, both the prediction accuracy and the real-time computation of the model can meet the control requirements of ESC. This research expands the accurate observation methods for the additional steer angle under extreme driving conditions.
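    A hedged sketch of the ε-SVR step with scikit-learn: synthetic ESC-like sensor channels predict an additional steer angle whose toy ground truth mirrors the sensitivity ranking reported above (lateral acceleration dominant, up to about 1°). All signals and coefficients are invented.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(5)
        n = 2000
        # Assumed ESC-like sensor channels: lateral acceleration, longitudinal
        # acceleration, yaw rate, steering-wheel angle.
        X = np.column_stack([
            rng.uniform(-8, 8, n),       # a_y  [m/s^2]
            rng.uniform(-6, 4, n),       # a_x  [m/s^2]
            rng.uniform(-0.5, 0.5, n),   # yaw rate [rad/s]
            rng.uniform(-180, 180, n),   # steering angle [deg]
        ])
        # Toy ground truth: steer angle dominated by lateral acceleration,
        # with a smaller longitudinal term, plus sensor noise.
        y = 0.12 * X[:, 0] + 0.04 * X[:, 1] + rng.normal(0, 0.02, n)

        svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
        svr.fit(X[:1500], y[:1500])
        pred = svr.predict(X[1500:])
        rmse = np.sqrt(np.mean((pred - y[1500:]) ** 2))
        print(f"hold-out RMSE: {rmse:.3f} deg")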

  14. Prediction models for cardiovascular disease risk in the general population: systematic review

    PubMed Central

    Hooft, Lotty; Schuit, Ewoud; Debray, Thomas P A; Collins, Gary S; Tzoulaki, Ioanna; Lassale, Camille M; Siontis, George C M; Chiocchia, Virginia; Roberts, Corran; Schlüssel, Michael Maia; Gerry, Stephen; Black, James A; Heus, Pauline; van der Schouw, Yvonne T; Peelen, Linda M; Moons, Karel G M

    2016-01-01

    Objective To provide an overview of prediction models for risk of cardiovascular disease (CVD) in the general population. Design Systematic review. Data sources Medline and Embase until June 2013. Eligibility criteria for study selection Studies describing the development or external validation of a multivariable model for predicting CVD risk in the general population. Results 9965 references were screened, of which 212 articles were included in the review, describing the development of 363 prediction models and 473 external validations. Most models were developed in Europe (n=167, 46%), predicted risk of fatal or non-fatal coronary heart disease (n=118, 33%) over a 10 year period (n=209, 58%). The most common predictors were smoking (n=325, 90%) and age (n=321, 88%), and most models were sex specific (n=250, 69%). Substantial heterogeneity in predictor and outcome definitions was observed between models, and important clinical and methodological information were often missing. The prediction horizon was not specified for 49 models (13%), and for 92 (25%) crucial information was missing to enable the model to be used for individual risk prediction. Only 132 developed models (36%) were externally validated and only 70 (19%) by independent investigators. Model performance was heterogeneous and measures such as discrimination and calibration were reported for only 65% and 58% of the external validations, respectively. Conclusions There is an excess of models predicting incident CVD in the general population. The usefulness of most of the models remains unclear owing to methodological shortcomings, incomplete presentation, and lack of external validation and model impact studies. Rather than developing yet another similar CVD risk prediction model, in this era of large datasets, future research should focus on externally validating and comparing head-to-head promising CVD risk models that already exist, on tailoring or even combining these models to local…

  15. Antimicrobial combinations: Bliss independence and Loewe additivity derived from mechanistic multi-hit models.

    PubMed

    Baeder, Desiree Y; Yu, Guozhi; Hozé, Nathanaël; Rolff, Jens; Regoes, Roland R

    2016-05-26

    Antimicrobial peptides (AMPs) and antibiotics reduce the net growth rate of bacterial populations they target. It is relevant to understand if effects of multiple antimicrobials are synergistic or antagonistic, in particular for AMP responses, because naturally occurring responses involve multiple AMPs. There are several competing proposals describing how multiple types of antimicrobials add up when applied in combination, such as Loewe additivity or Bliss independence. These additivity terms are defined ad hoc from abstract principles explaining the supposed interaction between the antimicrobials. Here, we link these ad hoc combination terms to a mathematical model that represents the dynamics of antimicrobial molecules hitting targets on bacterial cells. In this multi-hit model, bacteria are killed when a certain number of targets are hit by antimicrobials. Using this bottom-up approach reveals that Bliss independence should be the model of choice if no interaction between antimicrobial molecules is expected. Loewe additivity, on the other hand, describes scenarios in which antimicrobials affect the same components of the cell, i.e. are not acting independently. While our approach idealizes the dynamics of antimicrobials, it provides a conceptual underpinning of the additivity terms. The choice of the additivity term is essential to determine synergy or antagonism of antimicrobials.This article is part of the themed issue 'Evolutionary ecology of arthropod antimicrobial peptides'. PMID:27160596
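    A numerical illustration of the two additivity terms with Hill-type dose-response curves (all parameters invented): Bliss independence combines kill fractions as independent probabilities, while the Loewe-additive effect is the E solving d_A/D_A(E) + d_B/D_B(E) = 1, where D(E) is the single-drug dose producing effect E.

        import numpy as np
        from scipy.optimize import brentq

        # Hill-type kill fraction for each antimicrobial (assumed toy parameters).
        def effect(dose, ec50, hill):
            return dose**hill / (ec50**hill + dose**hill)

        ec50_a, hill_a = 1.0, 2.0
        ec50_b, hill_b = 2.0, 1.5
        da, db = 0.6, 1.0                    # doses applied in combination

        ea, eb = effect(da, ec50_a, hill_a), effect(db, ec50_b, hill_b)

        # Bliss independence: the two kill fractions act independently.
        bliss = ea + eb - ea * eb

        # Loewe additivity: invert the Hill curve to get the single-drug dose
        # D(E), then solve the combination-index equation for E.
        def inv_effect(e, ec50, hill):
            return ec50 * (e / (1 - e)) ** (1 / hill)

        loewe = brentq(lambda e: da / inv_effect(e, ec50_a, hill_a)
                                 + db / inv_effect(e, ec50_b, hill_b) - 1,
                       1e-6, 1 - 1e-6)

        print(f"Bliss expected effect: {bliss:.3f}")
        print(f"Loewe expected effect: {loewe:.3f}")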

  16. Randomised Controlled Feasibility Trial of an Evidence-Informed Behavioural Intervention for Obese Adults with Additional Risk Factors

    PubMed Central

    Sniehotta, Falko F.; Dombrowski, Stephan U.; Avenell, Alison; Johnston, Marie; McDonald, Suzanne; Murchie, Peter; Ramsay, Craig R.; Robertson, Kim; Araujo-Soares, Vera

    2011-01-01

    Background Interventions for dietary and physical activity changes in obese adults may be less effective for participants with additional obesity-related risk factors and co-morbidities than for otherwise healthy individuals. This study aimed to test the feasibility and acceptability of the recruitment, allocation, measurement, retention and intervention procedures of a randomised controlled trial of an intervention to improve physical activity and dietary practices amongst obese adults with additional obesity related risk factors. Method Pilot single centre open-labelled outcome assessor-blinded randomised controlled trial of obese (Body Mass Index (BMI)≥30 kg/m2) adults (age≥18 y) with obesity related co-morbidities such as type 2 diabetes, impaired glucose tolerance or hypertension. Participants were randomly allocated to a manual-based group intervention or a leaflet control condition in accordance with a 2:1 allocation ratio. Primary outcome was acceptability and feasibility of trial procedures; secondary outcomes included measures of body composition, physical activity, food intake and psychological process measures. Results Out of 806 potentially eligible individuals identified through list searches in two primary care general medical practices, N = 81 participants (63% female; mean-age = 56.56(11.44); mean-BMI = 36.73(6.06)) with 2.35(1.47) co-morbidities were randomised. Scottish Index of Multiple Deprivation (SIMD) was the only significant predictor of providing consent to take part in the study (higher chances of consent for invitees with lower levels of deprivation). Participant flowcharts, qualitative and quantitative feedback suggested good acceptance and feasibility of intervention procedures, but 34.6% of randomised participants were lost to follow-up due to overly high measurement burden and sub-optimal retention procedures. Participants in the intervention group showed positive trends for most psychological, behavioural and body…

  17. Frailty Models for Familial Risk with Application to Breast Cancer.

    PubMed

    Gorfine, Malka; Hsu, Li; Parmigiani, Giovanni

    2013-12-01

    In evaluating familial risk for disease we have two main statistical tasks: assessing the probability of carrying an inherited genetic mutation conferring higher risk; and predicting the absolute risk of developing diseases over time, for those individuals whose mutation status is known. Despite substantial progress, much remains unknown about the role of genetic and environmental risk factors, about the sources of variation in risk among families that carry high-risk mutations, and about the sources of familial aggregation beyond major Mendelian effects. These sources of heterogeneity contribute substantial variation in risk across families. In this paper we present simple and efficient methods for accounting for this variation in familial risk assessment. Our methods are based on frailty models. We implemented them in the context of generalizing Mendelian models of cancer risk, and compared our approaches to others that do not consider heterogeneity across families. Our extensive simulation study demonstrates that when predicting the risk of developing a disease over time conditional on carrier status, accounting for heterogeneity results in a substantial improvement in the area under the curve of the receiver operating characteristic. On the other hand, the improvement for carriership probability estimation is more limited. We illustrate the utility of the proposed approach through the analysis of BRCA1 and BRCA2 mutation carriers in the Washington Ashkenazi Kin-Cohort Study of Breast Cancer. PMID:24678132
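    A tiny simulation of the core frailty idea (all rates invented): a shared gamma-distributed frailty multiplies each family member's hazard, which induces the familial aggregation in risk that the paper models.

        import numpy as np

        rng = np.random.default_rng(6)
        n_fam, fam_size, base_rate = 2000, 2, 0.01

        # Shared gamma frailty per family (mean 1) multiplies each member's hazard.
        z = rng.gamma(shape=2.0, scale=0.5, size=n_fam)

        # Ages at onset: exponential given the frailty-scaled hazard.
        t = rng.exponential(1.0 / (base_rate * z[:, None]), size=(n_fam, fam_size))

        # Frailty induces familial aggregation: onset ages correlate within family.
        print("within-family correlation:",
              np.corrcoef(t[:, 0], t[:, 1])[0, 1].round(2))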

  18. A risk analysis model in concurrent engineering product development.

    PubMed

    Wu, Desheng Dash; Kefan, Xie; Gang, Chen; Ping, Gui

    2010-09-01

    Concurrent engineering has been widely accepted as a viable strategy for companies to reduce time to market and achieve overall cost savings. This article analyzes various risks and challenges in product development under the concurrent engineering environment. A three-dimensional early warning approach for product development risk management is proposed by integrating graphical evaluation and review technique (GERT) and failure modes and effects analysis (FMEA). Simulation models are created to solve our proposed concurrent engineering product development risk management model. Solutions lead to identification of key risk controlling points. This article demonstrates the value of our approach to risk analysis as a means to monitor various risks typical in the manufacturing sector. This article has three main contributions. First, we establish a conceptual framework to classify various risks in concurrent engineering (CE) product development (PD). Second, we propose use of existing quantitative approaches for PD risk analysis purposes: GERT, FMEA, and product database management (PDM). Based on quantitative tools, we create our approach for risk management of CE PD and discuss solutions of the models. Third, we demonstrate the value of applying our approach using data from a typical Chinese motor company. PMID:20840492

  19. A Risk Score with Additional Four Independent Factors to Predict the Incidence and Recovery from Metabolic Syndrome: Development and Validation in Large Japanese Cohorts

    PubMed Central

    Obokata, Masaru; Negishi, Kazuaki; Ohyama, Yoshiaki; Okada, Haruka; Imai, Kunihiko; Kurabayashi, Masahiko

    2015-01-01

    Background Although many risk factors for Metabolic syndrome (MetS) have been reported, there is no clinical score that predicts its incidence. The purposes of this study were to create and validate a risk score for predicting both incidence of and recovery from MetS in a large cohort. Methods Subjects without MetS at enrollment (n = 13,634) were randomly divided into 2 groups and followed to record incidence of MetS. We also examined recovery from it in the remaining 2,743 individuals with prevalent MetS. Results During a median follow-up of 3.0 years, 878 subjects in the derivation and 757 in the validation cohorts developed MetS. Multiple logistic regression analysis identified 12 independent variables from the derivation cohort, and an initial score for subsequent MetS was created, which showed good discrimination in both the derivation (c-statistic 0.82) and validation cohorts (0.83). The predictability of the initial score for recovery from MetS was tested in the 2,743 MetS population (906 subjects recovered from MetS), where nine variables (including age, sex, γ-glutamyl transpeptidase, uric acid and five MetS diagnostic criteria constituents) remained significant. Then, the final score was created using the nine variables. This score significantly predicted both the recovery from MetS (c-statistic 0.70, p<0.001, 78% sensitivity and 54% specificity) and incident MetS (c-statistic 0.80), with an incremental discriminative ability over the model derived from the five factors used in the diagnosis of MetS (continuous net reclassification improvement: 0.35, p<0.001; integrated discrimination improvement: 0.01, p<0.001). Conclusions We identified four additional independent risk factors associated with subsequent MetS, and developed and validated a risk score to predict both incidence of and recovery from MetS. PMID:26230621
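    A hedged sketch of the score-building recipe (derivation/validation split, multiple logistic regression, c-statistic) on simulated data; the predictors stand in for a few of the paper's variables and every coefficient is invented.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        n = 13634
        # Hypothetical predictors: age, sex, gamma-glutamyl transpeptidase,
        # uric acid, and a waist-circumference criterion flag.
        X = np.column_stack([
            rng.normal(50, 12, n),          # age
            rng.integers(0, 2, n),          # sex
            rng.lognormal(3.2, 0.6, n),     # gamma-GT
            rng.normal(5.5, 1.3, n),        # uric acid
            rng.integers(0, 2, n),          # waist criterion met
        ])
        # Invented true model generating incident MetS.
        logit = (-8.0 + 0.05 * X[:, 0] + 0.4 * X[:, 1] + 0.01 * X[:, 2]
                 + 0.3 * X[:, 3] + 0.9 * X[:, 4])
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        # Random split into derivation and validation cohorts, as in the design.
        idx = rng.permutation(n)
        derive, validate = idx[: n // 2], idx[n // 2:]

        model = LogisticRegression(max_iter=1000).fit(X[derive], y[derive])
        c_stat = roc_auc_score(y[validate], model.predict_proba(X[validate])[:, 1])
        print(f"validation c-statistic: {c_stat:.2f}")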

  20. Sparse Additive Ordinary Differential Equations for Dynamic Gene Regulatory Network Modeling.

    PubMed

    Wu, Hulin; Lu, Tao; Xue, Hongqi; Liang, Hua

    2014-04-01

    The gene regulation network (GRN) is a high-dimensional complex system, which can be represented by various mathematical or statistical models. The ordinary differential equation (ODE) model is one of the popular dynamic GRN models. High-dimensional linear ODE models have been proposed to identify GRNs, but with a limitation of the linear regulation effect assumption. In this article, we propose a sparse additive ODE (SA-ODE) model, coupled with ODE estimation methods and adaptive group LASSO techniques, to model dynamic GRNs that could flexibly deal with nonlinear regulation effects. The asymptotic properties of the proposed method are established and simulation studies are performed to validate the proposed approach. An application example for identifying the nonlinear dynamic GRN of T-cell activation is used to illustrate the usefulness of the proposed method. PMID:25061254

  1. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data

    PubMed Central

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2012-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, often there is prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic bias of the estimates can be reduced significantly while keeping the asymptotic variance the same as that of the unguided estimator. We observe the performance of our method via a simulation study and demonstrate our method by applying it to a real data set on mergers and acquisitions. PMID:23645976
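    A one-covariate sketch of the guided idea using numpy only (the true curve, the guide family and the bandwidth are all invented): fit a parametric guide by least squares, smooth the residuals nonparametrically, and add the trend back. The paper's estimator for full additive models with link functions is more general than this.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 400
        x = np.sort(rng.uniform(-2, 2, n))
        f_true = np.sin(2 * x) + 0.5 * x**2          # true regression function
        y = f_true + rng.normal(0, 0.3, n)

        # Step 1: parametric guide, here a cubic polynomial fitted by least
        # squares (standing in for prior information about the curve's shape).
        coef = np.polyfit(x, y, 3)
        guide = np.polyval(coef, x)

        # Step 2: smooth the remainder nonparametrically (Nadaraya-Watson
        # kernel smoother), then add the parametric trend back.
        def kernel_smooth(x0, xs, r, h=0.2):
            w = np.exp(-0.5 * ((x0[:, None] - xs[None, :]) / h) ** 2)
            return (w * r).sum(axis=1) / w.sum(axis=1)

        fitted = guide + kernel_smooth(x, x, y - guide)
        print("RMSE guided fit :", np.sqrt(np.mean((fitted - f_true) ** 2)).round(3))
        print("RMSE guide alone:", np.sqrt(np.mean((guide - f_true) ** 2)).round(3))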

  2. Back-end Science Model Integration for Ecological Risk Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  3. Back-end Science Model Integration for Ecological Risk Assessment.

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  4. Risk prediction models for hepatocellular carcinoma in different populations

    PubMed Central

    Ma, Xiao; Yang, Yang; Tu, Hong; Gao, Jing; Tan, Yu-Ting; Zheng, Jia-Li; Bray, Freddie; Xiang, Yong-Bing

    2016-01-01

    Hepatocellular carcinoma (HCC) is a malignant disease with limited therapeutic options due to its aggressive progression. Treating HCC patients places a heavy burden on most low- and middle-income countries. Nowadays, accurate HCC risk predictions can help in making decisions on the need for HCC surveillance and antiviral therapy. HCC risk prediction models based on major risk factors of HCC are useful in providing adequate surveillance strategies to individuals with different risk levels. Several risk prediction models for estimating HCC incidence in cohorts of different populations have been presented recently, using simple, efficient, and ready-to-use parameters. Moreover, using predictive scoring systems to assess HCC development can suggest improvements to clinical and public health approaches, making them more cost-effective and effort-effective, and can support personalized surveillance programs according to risk stratification. In this review, the features of HCC risk prediction models across different populations are summarized, and the perspectives of these models are discussed as well. PMID:27199512

  5. Risk prediction models for hepatocellular carcinoma in different populations.

    PubMed

    Ma, Xiao; Yang, Yang; Tu, Hong; Gao, Jing; Tan, Yu-Ting; Zheng, Jia-Li; Bray, Freddie; Xiang, Yong-Bing

    2016-04-01

    Hepatocellular carcinoma (HCC) is a malignant disease with limited therapeutic options due to its aggressive progression. Treating HCC patients places a heavy burden on most low- and middle-income countries. Nowadays, accurate HCC risk predictions can help in making decisions on the need for HCC surveillance and antiviral therapy. HCC risk prediction models based on major risk factors of HCC are useful in providing adequate surveillance strategies to individuals with different risk levels. Several risk prediction models for estimating HCC incidence in cohorts of different populations have been presented recently, using simple, efficient, and ready-to-use parameters. Moreover, using predictive scoring systems to assess HCC development can suggest improvements to clinical and public health approaches, making them more cost-effective and effort-effective, and can support personalized surveillance programs according to risk stratification. In this review, the features of HCC risk prediction models across different populations are summarized, and the perspectives of these models are discussed as well. PMID:27199512

  6. Risk Prediction Models for Other Cancers or Multiple Sites

    Cancer.gov

    Developing statistical models that estimate the probability of developing other cancers or cancers at multiple sites over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Competing Risk Regression Models for Epidemiologic Data

    PubMed Central

    Cole, Stephen R.; Gange, Stephen J.

    2009-01-01

    Competing events can preclude the event of interest from occurring in epidemiologic data and can be analyzed by using extensions of survival analysis methods. In this paper, the authors outline 3 regression approaches for estimating 2 key quantities in competing risks analysis: the cause-specific relative hazard (csRH) and the subdistribution relative hazard (sdRH). They compare and contrast the structure of the risk sets and the interpretation of parameters obtained with these methods. They also demonstrate the use of these methods with data from the Women's Interagency HIV Study established in 1993, treating time to initiation of highly active antiretroviral therapy or to clinical disease progression as competing events. In our example, women with an injection drug use history were less likely than those without a history of injection drug use to initiate therapy prior to progression to acquired immunodeficiency syndrome or death by both measures of association (csRH = 0.67, 95% confidence interval: 0.57, 0.80 and sdRH = 0.60, 95% confidence interval: 0.50, 0.71). Moreover, the relative hazards for disease progression prior to treatment were elevated (csRH = 1.71, 95% confidence interval: 1.37, 2.13 and sdRH = 2.01, 95% confidence interval: 1.62, 2.51). Methods for competing risks should be used by epidemiologists, with the choice of method guided by the scientific question. PMID:19494242
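    For the cause-specific relative hazard, one standard approach consistent with the article is a Cox model that treats the competing event as censored; below is a hedged lifelines sketch on simulated data (all rates invented). The subdistribution hazard (Fine-Gray) step is omitted here since it needs dedicated tooling.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(9)
        n = 1000
        idu = rng.integers(0, 2, n)               # injection drug use history
        # Toy latent times for two competing events: therapy initiation (1) and
        # clinical progression (2); IDU delays initiation, speeds progression.
        t_start = rng.exponential(2.0 * np.where(idu == 1, 1.5, 1.0))
        t_prog = rng.exponential(6.0 / np.where(idu == 1, 1.7, 1.0))
        t_cens = rng.uniform(1, 8, n)             # administrative censoring

        time = np.minimum.reduce([t_start, t_prog, t_cens])
        event = np.select([t_start == time, t_prog == time], [1, 2], default=0)

        def cause_specific_fit(cause):
            # Cause-specific hazard: the competing event is treated as censored.
            df = pd.DataFrame({"time": time, "e": (event == cause).astype(int),
                               "idu": idu})
            return CoxPHFitter().fit(df, duration_col="time", event_col="e")

        for cause, label in [(1, "therapy initiation"), (2, "progression")]:
            fit = cause_specific_fit(cause)
            print(label, "csRH for IDU:",
                  round(float(fit.summary.loc["idu", "exp(coef)"]), 2))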

  8. Validation of a novel air toxic risk model with air monitoring.

    PubMed

    Pratt, Gregory C; Dymond, Mary; Ellickson, Kristie; Thé, Jesse

    2012-01-01

    Three modeling systems were used to estimate human health risks from air pollution: two versions of MNRiskS (for Minnesota Risk Screening), and the USEPA National Air Toxics Assessment (NATA). MNRiskS is a unique cumulative risk modeling system used to assess risks from multiple air toxics, sources, and pathways on a local to a state-wide scale. In addition, ambient outdoor air monitoring data were available for estimation of risks and comparison with the modeled estimates of air concentrations. Highest air concentrations and estimated risks were generally found in the Minneapolis-St. Paul metropolitan area and lowest risks in undeveloped rural areas. Emissions from mobile and area (nonpoint) sources created greater estimated risks than emissions from point sources. Highest cancer risks were via ingestion pathway exposures to dioxins and related compounds. Diesel particles, acrolein, and formaldehyde created the highest estimated inhalation health impacts. Model-estimated air concentrations were generally highest for NATA and lowest for the AERMOD version of MNRiskS. This validation study showed reasonable agreement between available measurements and model predictions, although results varied among pollutants, and predictions were often lower than measurements. The results increased confidence in identifying pollutants, pathways, geographic areas, sources, and receptors of potential concern, and thus provide a basis for informing pollution reduction strategies and focusing efforts on specific pollutants (diesel particles, acrolein, and formaldehyde), geographic areas (urban centers), and source categories (nonpoint sources). The results heighten concerns about risks from food chain exposures to dioxins and PAHs. Risk estimates were sensitive to variations in methodologies for treating emissions, dispersion, deposition, exposure, and toxicity. PMID:21651597

  9. Lymphatic Filariasis Transmission Risk Map of India, Based on a Geo-Environmental Risk Model

    PubMed Central

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-01-01

    The strategy adopted by a global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy by introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Therefore, identification of LF transmission risk areas in the entire country has become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and endemicity (historically known) of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was found to be correct in 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM developed on geographic information system (GIS) platform is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful for the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas. PMID:23808973
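
    The quantification step described above, logistic regression of historically known endemicity on the SFTRI, can be sketched as follows. The index values and endemic labels are invented for illustration, not the Indian district data.

```python
# Logistic regression of endemic status (0/1) on the SFTRI, the relationship
# the abstract quantifies. Values below are invented.
import numpy as np
import statsmodels.api as sm

sftri   = np.array([0.10, 0.25, 0.40, 0.55, 0.60, 0.70, 0.85, 0.90])  # per district
endemic = np.array([0, 0, 0, 1, 0, 1, 1, 1])          # historically known status

X = sm.add_constant(sftri)
fit = sm.Logit(endemic, X).fit(disp=0)
print(fit.params)               # intercept and SFTRI coefficient
print(fit.predict(X) >= 0.5)    # predicted endemic status, as applied to unsurveyed areas
```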

  10. Formation and reduction of carcinogenic furan in various model systems containing food additives.

    PubMed

    Kim, Jin-Sil; Her, Jae-Young; Lee, Kwang-Geun

    2015-12-15

    The aim of this study was to analyse and reduce furan in various model systems. Furan model systems consisting of monosaccharides (0.5M glucose and ribose), amino acids (0.5M alanine and serine) and/or 1.0M ascorbic acid were heated at 121°C for 25 min. The effects of food additives (each 0.1M) such as metal ions (iron sulphate, magnesium sulphate, zinc sulphate and calcium sulphate), antioxidants (BHT and BHA), and sodium sulphite on the formation of furan were measured. The level of furan formed in the model systems was 6.8-527.3 ng/ml. The level of furan in the model systems of glucose/serine and glucose/alanine increased 7-674% when food additives were added. In contrast, the level of furan decreased by 18-51% in the Maillard reaction model systems that included ribose and alanine/serine with food additives except zinc sulphate. PMID:26190608

  11. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    ERIC Educational Resources Information Center

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  12. NB-PLC channel modelling with cyclostationary noise addition & OFDM implementation for smart grid

    NASA Astrophysics Data System (ADS)

    Thomas, Togis; Gupta, K. K.

    2016-03-01

    Power line communication (PLC) technology can be a viable solution for the future ubiquitous networks because it provides a cheaper alternative to other wired technologies currently being used for communication. In smart grids, power line communication (PLC) is used to support low-rate communication on the low voltage (LV) distribution network. In this paper, we propose a channel model for narrowband (NB) PLC in the frequency range 5 kHz to 500 kHz using ABCD parameters with cyclostationary noise addition. The behaviour of the channel was studied by adding an 11 kV/230 V transformer and by varying the load and its location. Bit error rate (BER) versus signal-to-noise ratio (SNR) was plotted for the proposed model by employing OFDM. Our simulation results based on the proposed channel model show acceptable performance in terms of bit error rate versus signal-to-noise ratio, which enables the communication required for smart grid applications.
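
    The ABCD-parameter approach named in the abstract chains two-port matrices for line sections and tapped loads into a channel transfer function. A minimal sketch follows, with invented line parameters and topology rather than the paper's network:

```python
# Cascade of two-port ABCD matrices for a toy NB-PLC channel; all line
# parameters, lengths, and impedances are illustrative stand-ins.
import numpy as np

def line_abcd(gamma, z0, length):
    """ABCD matrix of a uniform transmission-line section."""
    return np.array([[np.cosh(gamma * length), z0 * np.sinh(gamma * length)],
                     [np.sinh(gamma * length) / z0, np.cosh(gamma * length)]])

def shunt_abcd(z_load):
    """ABCD matrix of a shunt impedance (e.g., a tapped load)."""
    return np.array([[1, 0], [1 / z_load, 1]])

f = 100e3                                  # 100 kHz, inside the 5-500 kHz NB band
gamma = 1e-5 + 1e-4j * np.sqrt(f)          # illustrative propagation constant (1/m)
z0 = 50                                    # illustrative characteristic impedance
total = line_abcd(gamma, z0, 200) @ shunt_abcd(30) @ line_abcd(gamma, z0, 100)
A, B = total[0, 0], total[0, 1]
zl = 50                                    # receiver impedance
H = zl / (A * zl + B)                      # voltage transfer function V_out/V_in
print(abs(H))
```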

  13. Hemolysate-mediated platelet aggregation: an additional risk mechanism contributing to thrombosis of continuous flow ventricular assist devices.

    PubMed

    Tran, Phat L; Pietropaolo, Maria-Grazia; Valerio, Lorenzo; Brengle, William; Wong, Raymond K; Kazui, Toshinobu; Khalpey, Zain I; Redaelli, Alberto; Sheriff, Jawaad; Bluestein, Danny; Slepian, Marvin J

    2016-07-01

    Despite the clinical success and growth in the utilization of continuous flow ventricular assist devices (cfVADs) for the treatment of advanced heart failure, hemolysis and thrombosis remain major limitations. Inadequate and/or ineffective anticoagulation regimens, combined with high pump speed and non-physiological flow patterns, can result in hemolysis which often is accompanied by pump thrombosis. An unexpected increase in cfVADs thrombosis was reported by multiple major VAD implanting centers in 2014, highlighting the association of hemolysis and a rise in lactate dehydrogenase (LDH) presaging thrombotic events. It is well established that thrombotic complications arise from the abnormal shear stresses generated by cfVADs. What remains unknown is the link between cfVAD-associated hemolysis and pump thrombosis. Can hemolysis of red blood cells (RBCs) contribute to platelet aggregation, thereby, facilitating prothrombotic complications in cfVADs? Herein, we examine the effect of RBC-hemolysate and selected major constituents, i.e., lactate dehydrogenase (LDH) and plasma free hemoglobin (pHb) on platelet aggregation, utilizing electrical resistance aggregometry. Our hypothesis is that elements of RBCs, released as a result of shear-mediated hemolysis, will contribute to platelet aggregation. We show that RBC hemolysate and pHb, but not LDH, are direct contributors to platelet aggregation, posing an additional risk mechanism for cfVAD thrombosis. PMID:26590166

  14. FREQUENCY ANALYSIS OF PESTICIDE CONCENTRATIONS FOR RISK ASSESSMENT (FRANCO MODEL)

    EPA Science Inventory

    This report describes a method for statistically characterizing the occurrence and duration of pesticide concentrations in surface waters receiving runoff from agricultural lands. The characterization bridges the gap between simulated instream pesticide modeling and the risk asse...

  15. Submission Form for Peer-Reviewed Cancer Risk Prediction Models

    Cancer.gov

    If you have information about a peer-reviewed cancer risk prediction model that you would like to be considered for inclusion on this list, submit as much information as possible through the form on this page.

  16. A model for assessing the risk of human trafficking on a local level

    NASA Astrophysics Data System (ADS)

    Colegrove, Amanda

    Human trafficking is a human rights violation that is difficult to quantify. Models for estimating the number of victims of trafficking presented by previous researchers depend on inconsistent, poor-quality data. As an intermediate step to help current efforts by nonprofits to combat human trafficking, this project presents a model that does not depend on quantitative data specific to human trafficking, but rather profiles the risk of human trafficking at the local level through causative factors. Businesses of the types indicated by the literature were weighted based on the presence of characteristics that increase the likelihood of trafficking in persons. The mean risk was calculated by census tract to reveal the multiplicity of risk levels in both rural and urban settings. Results indicate that labor trafficking may be a more diffuse problem in Missouri than sex trafficking. Additionally, spatial patterns of risk remained largely the same regardless of adjustments made to the model.
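
    A sketch of the aggregation step described above: tract-level risk as the mean of business-level risk weights. The weights and tract labels below are invented, not the Missouri data.

```python
# Mean weighted risk per census tract; the weight is a count of
# literature-indicated risk characteristics present at each business.
import pandas as pd

businesses = pd.DataFrame({
    "tract":  ["29A", "29A", "29B", "29B", "29B"],
    "weight": [3, 1, 0, 2, 4],    # invented risk-characteristic counts
})
risk_by_tract = businesses.groupby("tract")["weight"].mean()
print(risk_by_tract)              # e.g., 29A -> 2.0, 29B -> 2.0
```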

  17. Novel modelling solutions for debris risk reduction

    NASA Astrophysics Data System (ADS)

    Stokes, P. H.; Walker, R.; Wilkinson, J. E.; Swinerd, G. G.

    1999-01-01

    The Defence Evaluation & Research Agency (DERA) has a long association with the field of space debris research. Effort has focused on the development of software tools (IDES and SDS) to model the debris environment and its long- and short-term evolution. These models are now well established and recognised for their distinct capabilities. More recently, DERA has begun developing a new software tool called SHIELD. This is an innovative concurrent engineering model designed to assist engineers in identifying the most cost-effective debris protection strategy for a satellite. The model uses a novel survivability metric technique in conjunction with a genetic algorithm to search for the optimum choice and location of bumper shields, and the optimum arrangement of critical satellite components. This paper briefly summarises the unique aspects of the environment models and recent results, before describing the new SHIELD model in some detail.
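
    A toy genetic-algorithm sketch in the optimization style attributed to SHIELD: evolve a binary shield layout against a stand-in survivability score. The fitness function and every parameter here are invented, not DERA's model.

```python
# Toy GA: choose which satellite panels to shield, maximizing protected debris
# flux minus a mass/cost penalty. All numbers are illustrative stand-ins.
import random

random.seed(7)
N_PANELS, POP, GENS = 8, 20, 60
hit_prob = [0.30, 0.05, 0.20, 0.10, 0.02, 0.15, 0.08, 0.10]  # toy debris flux

def fitness(layout):
    protected = sum(p for p, s in zip(hit_prob, layout) if s)
    return protected - 0.04 * sum(layout)        # penalty per shield added

pop = [[random.randint(0, 1) for _ in range(N_PANELS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents, children = pop[:POP // 2], []
    while len(children) < POP - POP // 2:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N_PANELS)
        child = a[:cut] + b[cut:]                # one-point crossover
        if random.random() < 0.2:                # bit-flip mutation
            i = random.randrange(N_PANELS)
            child[i] = 1 - child[i]
        children.append(child)
    pop = parents + children
best = max(pop, key=fitness)
print(best, round(fitness(best), 3))
```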

  18. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    SciTech Connect

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) Identifying the key processes and chemical and/or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.

  19. Modeling exposure to persistent chemicals in hazard and risk assessment.

    PubMed

    Cowan-Ellsberry, Christina E; McLachlan, Michael S; Arnot, Jon A; Macleod, Matthew; McKone, Thomas E; Wania, Frank

    2009-10-01

    Fate and exposure modeling has not, thus far, been explicitly used in the risk profile documents prepared for evaluating the significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of persistent organic pollutants (POP) and persistent, bioaccumulative, and toxic (PBT) chemicals in the environment. The goal of this publication is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include 1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; 2) directly estimating the exposure of the environment, biota, and humans to provide information to complement measurements or where measurements are not available or are limited; 3) identifying the key processes and chemical or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and 4) forecasting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.
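
    A small worked version of the benchmarking idea in point 1: compare a candidate chemical's modeled exposure-to-emission ratio against the same ratio for a reference POP. All numbers are invented for illustration.

```python
# Benchmark a candidate's exposure-per-unit-emission against a known POP.
candidate_exposure, candidate_emissions = 4.0e-6, 120.0   # e.g., ng/kg-day and t/yr
reference_exposure, reference_emissions = 2.5e-6, 300.0   # known-POP benchmark

candidate_ratio = candidate_exposure / candidate_emissions
reference_ratio = reference_exposure / reference_emissions
relative_exposure_potential = candidate_ratio / reference_ratio
print(f"candidate: {relative_exposure_potential:.1f}x the exposure per unit emission")
# Combined with relative emissions and relative toxicity, this yields a
# measure of relative risk, as the abstract suggests.
```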

  20. An Integrated Risk Management Model for Source Water Protection Areas

    PubMed Central

    Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien

    2012-01-01

    Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that use water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for keeping drinking water safe for humans. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water quality. In this study, a quantitative risk model that uses hazard quotients for each water quality parameter was combined with a qualitative risk model that uses the relative risk level of potential pollution events in order to characterize the current condition and potential risk of watersheds providing drinking water. In a case study of the Taipei Source Water Area in northern Taiwan, total coliforms and total phosphorus were the top two pollutants of concern. Intensive tea-growing and recreational activities around the riparian zone may contribute the greatest pollution to the watershed. Our risk assessment tool may be enhanced by developing, recording, and updating information on pollution sources in the water supply watersheds. Moreover, management authorities could use the resultant information to create watershed risk management plans. PMID:23202770
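
    The quantitative half of the model reduces to a hazard quotient per water-quality parameter, HQ = observed concentration / benchmark. A minimal sketch, with invented benchmarks and observations rather than the Taipei data:

```python
# Hazard quotients per water-quality parameter; HQ > 1 flags a pollutant of
# concern. Benchmark and observed values are invented for illustration.
benchmarks = {"total_phosphorus": 0.02, "total_coliforms": 100.0}  # mg/L, CFU/100 mL
observed   = {"total_phosphorus": 0.05, "total_coliforms": 450.0}

hq = {p: observed[p] / benchmarks[p] for p in benchmarks}
print(hq)   # both parameters exceed 1 here, mirroring the case-study finding
```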

  1. A Hybrid Tsunami Risk Model for Japan

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A. V.; Smith, D. F.; Khater, M.; Khemici, O.; Betov, B.; Scott, J.

    2014-12-01

    Around the margins of the Pacific Ocean, denser oceanic plates slipping under continental plates cause subduction earthquakes generating large tsunami waves. The subducting Pacific and Philippine Sea plates create damaging interplate earthquakes followed by huge tsunami waves. It was a rupture of the Japan Trench subduction zone (JTSZ) and the resultant M9.0 Tohoku-Oki earthquake that caused the unprecedented tsunami along the Pacific coast of Japan on March 11, 2011. EQECAT's Japan Earthquake model is a fully probabilistic model which includes a seismo-tectonic model describing the geometries, magnitudes, and frequencies of all potential earthquake events; a ground motion model; and a tsunami model. Within the much larger set of all modeled earthquake events, fault rupture parameters for about 24000 stochastic and 25 historical tsunamigenic earthquake events are defined to simulate tsunami footprints using the numerical tsunami model COMCOT. A hybrid approach using COMCOT simulated tsunami waves is used to generate inundation footprints, including the impact of tides and flood defenses. Modeled tsunami waves of major historical events are validated against observed data. Modeled tsunami flood depths on 30 m grids together with tsunami vulnerability and financial models are then used to estimate insured loss in Japan from the 2011 tsunami. The primary direct report of damage from the 2011 tsunami is in terms of the number of buildings damaged by municipality in the tsunami affected area. Modeled loss in Japan from the 2011 tsunami is proportional to the number of buildings damaged. A 1000-year return period map of tsunami waves shows high hazard along the west coast of southern Honshu, on the Pacific coast of Shikoku, and on the east coast of Kyushu, primarily associated with major earthquake events on the Nankai Trough subduction zone (NTSZ). The highest tsunami hazard of more than 20m is seen on the Sanriku coast in northern Honshu, associated with the JTSZ.

  2. Usefulness and limitations of global flood risk models

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  3. Modeling the Risks of Geothermal Development

    SciTech Connect

    Golabi, K.; Nair, K.; Rothstein, S.; Sioshansi, F.

    1980-12-16

    Geothermal energy has emerged as a promising energy source in recent years and has received serious attention from developers and potential users. Despite the advantages of this resource, such as potential cost competitiveness, reliability, and public acceptance, the commercial development and use of geothermal energy has been slow. Impediments to the development of this resource include technical, financial, environmental and regulatory uncertainties. Since geothermal power is unique in that the generation facility is tied to a single fuel at a single site, these uncertainties are of particular concern to utility companies. The areas of uncertainty and potential risks are well known. This paper presents a method for quantifying the relevant uncertainties and a framework for aggregating the risks through the use of submodels. The objective submodels can be combined with subjective probabilities (when sufficient data are not available) to yield a probability distribution over a single criterion (levelized busbar cost) that can be used to compare the desirability of geothermal power development with respect to other alternatives.

  4. ARSENIC MODEL DEVELOPMENT FOR IMPROVED RISK ASSESSMENT

    EPA Science Inventory

    This project integrates research on the kinetic behavior and metabolism of arsenic at both the cellular and whole organism levels using a physiologically based pharmacokinetic (PBPK) modeling approach. The ultimate goal is development of a robust human PBPK model for arsenic met...

  5. Modeling Environment for Total Risk-2E

    EPA Science Inventory

    MENTOR-2E uses an integrated, mechanistically consistent source-to-dose-to-response modeling framework to quantify inhalation exposure and doses resulting from emergency events. It is an implementation of the MENTOR system that is focused towards modeling of the impacts of rele...

  6. A Corrosion Risk Assessment Model for Underground Piping

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model for pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.
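
    A minimal sketch of the Monte Carlo step described above: sample a corrosion rate and a stress-model thickness requirement, then count how often the remaining wall fails the check. The distributions and parameters are invented, not ARC's.

```python
# Monte Carlo estimate of a pipe segment's failure probability after 30 years.
# All distributions and constants below are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
t0 = 6.0                                                      # initial wall, mm
rate = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)    # corrosion, mm/yr
years = 30.0
t_min = rng.normal(2.5, 0.3, size=n)    # thickness required by the stress model

remaining = t0 - rate * years
p_fail = np.mean(remaining < t_min)
print(f"estimated 30-year failure probability: {p_fail:.4f}")
```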

  7. A Model for Risk Assessment in Health Care.

    PubMed

    Prijatelj, Vesna; Rajkovič, Vladislav; Šušteršič, Olga

    2016-01-01

    The purpose of our research is to reduce risks and hence prevent errors in the health care process. The aim is to design an organizational information model using error prevention methods for risk assessment in a clinical setting. The model is based on selected indicators of quality nursing care, drawing on well-known theoretical and practical models combined with experience in Slovenian health care. The proposed organizational information model and software solution have a significant impact on professional attention, communication and information, critical thinking, experience and knowledge. PMID:27332383

  8. Lung cancer in never smokers: Epidemiology and risk prediction models

    PubMed Central

    McCarthy, William J.; Meza, Rafael; Jeon, Jihyoun; Moolgavkar, Suresh

    2012-01-01

    In this chapter we review the epidemiology of lung cancer incidence and mortality among never smokers/nonsmokers and describe the never smoker lung cancer risk models used by CISNET modelers. Our review focuses on those influences likely to have a measurable population impact on never smoker risk, such as secondhand smoke, even though the individual-level impact may be small. Occupational exposures may also contribute importantly to the population attributable risk of lung cancer. We examine the following risk factors in this chapter: age, environmental tobacco smoke, cooking fumes, ionizing radiation including radon gas, inherited genetic susceptibility, selected occupational exposures, preexisting lung disease, and oncogenic viruses. We also compare the prevalence of never smokers between the three CISNET smoking scenarios and present the corresponding lung cancer mortality estimates among never smokers as predicted by a typical CISNET model. PMID:22882894

  9. Generalized Additive Mixed-Models for Pharmacology Using Integrated Discrete Multiple Organ Co-Culture.

    PubMed

    Ingersoll, Thomas; Cole, Stephanie; Madren-Whalley, Janna; Booker, Lamont; Dorsey, Russell; Li, Albert; Salem, Harry

    2016-01-01

    Integrated Discrete Multiple Organ Co-culture (IDMOC) is emerging as an in-vitro alternative to in-vivo animal models for pharmacology studies. IDMOC allows dose-response relationships to be investigated at the tissue and organoid levels, yet, these relationships often exhibit responses that are far more complex than the binary responses often measured in whole animals. To accommodate departure from binary endpoints, IDMOC requires an expansion of analytic techniques beyond simple linear probit and logistic models familiar in toxicology. IDMOC dose-responses may be measured at continuous scales, exhibit significant non-linearity such as local maxima or minima, and may include non-independent measures. Generalized additive mixed-modeling (GAMM) provides an alternative description of dose-response that relaxes assumptions of independence and linearity. We compared GAMMs to traditional linear models for describing dose-response in IDMOC pharmacology studies. PMID:27110941

  10. Generalized Additive Mixed-Models for Pharmacology Using Integrated Discrete Multiple Organ Co-Culture

    PubMed Central

    Ingersoll, Thomas; Cole, Stephanie; Madren-Whalley, Janna; Booker, Lamont; Dorsey, Russell; Li, Albert; Salem, Harry

    2016-01-01

    Integrated Discrete Multiple Organ Co-culture (IDMOC) is emerging as an in-vitro alternative to in-vivo animal models for pharmacology studies. IDMOC allows dose-response relationships to be investigated at the tissue and organoid levels, yet, these relationships often exhibit responses that are far more complex than the binary responses often measured in whole animals. To accommodate departure from binary endpoints, IDMOC requires an expansion of analytic techniques beyond simple linear probit and logistic models familiar in toxicology. IDMOC dose-responses may be measured at continuous scales, exhibit significant non-linearity such as local maxima or minima, and may include non-independent measures. Generalized additive mixed-modeling (GAMM) provides an alternative description of dose-response that relaxes assumptions of independence and linearity. We compared GAMMs to traditional linear models for describing dose-response in IDMOC pharmacology studies. PMID:27110941
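
    A sketch of a generalized additive dose-response fit of the kind motivated above, using statsmodels' GAM support. The random-effect (mixed) part of a full GAMM is omitted here, and the non-monotone dose-response data are simulated, not IDMOC measurements.

```python
# GAM fit of a continuous, non-linear dose-response curve; a simplified
# stand-in for the GAMM analysis described in the abstract.
import numpy as np
import pandas as pd
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(0)
dose = np.linspace(0, 10, 120)
response = np.sin(dose / 2) + 0.1 * dose + rng.normal(0, 0.2, dose.size)
df = pd.DataFrame({"dose": dose, "response": response})

bs = BSplines(df[["dose"]], df=[8], degree=[3])      # smooth term for dose
gam = GLMGam.from_formula("response ~ 1", data=df, smoother=bs).fit()
print(gam.summary())
```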

  11. Use of additive technologies for practical working with complex models for foundry technologies

    NASA Astrophysics Data System (ADS)

    Olkhovik, E.; Butsanets, A. A.; Ageeva, A. A.

    2016-07-01

    The article presents the results of research on the application of additive technology (3D printing) to developing geometrically complex models of cast parts. Investment casting is a well-known and widely used technology for the production of complex parts. The work proposes the use of 3D printing technology for manufacturing model parts, which are removed by thermal destruction. Traditional methods of equipment production for investment casting involve the use of manual labor, which has problems with dimensional accuracy, and CNC machining, which is less often used. Such a scheme has low productivity and demands considerable time. We offer an alternative method that consists of printing the main assemblies on a 3D printer (in PLA and ABS) and subsequently producing casting models from them. In this article, the main technological methods are considered and their problems are discussed. The dimensional accuracy of the models, in comparison with investment casting technology, is considered the main aspect.

  12. Forest fire risk assessment in Sweden using climate model data: bias correction and future changes

    NASA Astrophysics Data System (ADS)

    Yang, W.; Gardelin, M.; Olsson, J.; Bosshard, T.

    2015-01-01

    As the risk for a forest fire is largely influenced by weather, evaluating its tendency under a changing climate becomes important for management and decision making. Currently, biases in climate models make it difficult to realistically estimate the future climate and the consequent impact on fire risk. A distribution-based scaling (DBS) approach was developed as a post-processing tool that corrects systematic biases in climate modelling outputs. In this study, we used two projections, one driven by historical reanalysis (ERA40) and one from a global climate model (ECHAM5) for future projection, both dynamically downscaled by a regional climate model (RCA3). The effects of the post-processing tool on relative humidity and wind speed were studied in addition to the primary variables precipitation and temperature. Finally, the Canadian Fire Weather Index system was used to evaluate the influence of changing meteorological conditions on the moisture content in fuel layers and the fire-spread risk. Forest fire risk computed from DBS-corrected data reflects the risk computed from observations better than that computed from raw climate outputs. For future periods, southern Sweden is likely to have a higher fire risk than today, whereas northern Sweden will have a lower risk of forest fire.
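
    A simplified stand-in for the DBS idea (not the actual DBS algorithm): empirical quantile mapping, which replaces each raw model value by the observed value at the same quantile. The precipitation series are invented.

```python
# Empirical quantile mapping as a toy bias correction for model precipitation.
import numpy as np

rng = np.random.default_rng(2)
obs = rng.gamma(shape=0.8, scale=5.0, size=3000)   # "observed" daily precip, mm
mod = rng.gamma(shape=0.6, scale=8.0, size=3000)   # "biased model" output, mm

def quantile_map(x, model_ref, obs_ref):
    """Map each model value to the observed value at the same quantile."""
    q = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))

corrected = quantile_map(mod, mod, obs)
print(obs.mean(), mod.mean(), corrected.mean())    # corrected mean ≈ observed
```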

  13. Climate and weather risk in natural resource models

    NASA Astrophysics Data System (ADS)

    Merrill, Nathaniel Henry

    This work, consisting of three manuscripts, addresses natural resource management under risk due to variation in climate and weather. In three distinct but theoretically related applications, I quantify the role of natural resources in stabilizing economic outcomes. In Manuscript 1, we address policy designed to affect the risk of cyanobacteria blooms in a drinking water reservoir through watershed-wide policy. Combining a hydrologic and economic model for a watershed in Rhode Island, we solve for the efficient allocation of best management practices (BMPs) on livestock pastures to meet a monthly risk-based as well as mean-based water quality objective. In order to solve for the efficient allocations of nutrient control effort, we optimize a probabilistically constrained integer-programming problem representing the choices made on each farm and the resultant conditions that support cyanobacteria blooms. In doing so, we employ a genetic algorithm (GA). We hypothesize that management based on controlling the upper tail of the probability distribution of phosphorus loading implies different efficient management actions as compared to controlling mean loading. We find a shift to more intense effort on fewer acres when a probabilistic objective is specified, with cost savings of up to 25% for meeting risk levels relative to mean-loading-based policies. Additionally, we illustrate the relative cost effectiveness of various policies designed to meet this risk-based objective. Rainfall and the subsequent overland runoff is the source of transportation of nutrients to a receiving water body, with larger amounts of phosphorus moving in more intense rainfall events. We highlight the importance of this transportation mechanism by comparing policies under climate change scenarios, where the intensity of rainfall is projected to increase and the time series process of rainfall to change. In Manuscript 2, we introduce a new economic groundwater model that incorporates the gradual shift

  14. State to State and Charged Particle Kinetic Modeling of Time Filtering and Cs Addition

    SciTech Connect

    Capitelli, M.; Gorse, C.; Longo, S.; Diomede, P.; Pagano, D.

    2007-08-10

    We present here an account of the progress of kinetic simulation of non-equilibrium plasmas in conditions of interest for negative ion production, using the 1D Bari code for hydrogen plasma simulation. The model includes the state to state kinetics of the vibrational level population of hydrogen molecules, plus a PIC/MCC module for the multispecies dynamics of charged particles. In particular, we present new results for the modeling of two issues of great interest: time filtering and Cs addition via surface coverage.

  15. Additional interfacial force in lattice Boltzmann models for incompressible multiphase flows.

    PubMed

    Li, Q; Luo, K H; Gao, Y J; He, Y L

    2012-02-01

    The existing lattice Boltzmann models for incompressible multiphase flows are mostly constructed with two distribution functions: one is the order parameter distribution function, which is used to track the interface between different phases, and the other is the pressure distribution function for solving the velocity field. In this paper, it is shown that in these models the recovered momentum equation is inconsistent with the target one: an additional force is included in the recovered momentum equation. The additional force has the following features. First, it is proportional to the macroscopic velocity. Second, it is zero in every single-phase region but is nonzero in the interface. Therefore it can be interpreted as an interfacial force. To investigate the effects of the additional interfacial force, numerical simulations are carried out for the problem of Rayleigh-Taylor instability, droplet splashing on a thin liquid film, and the evolution of a falling droplet under gravity. Numerical results demonstrate that, with the increase of the velocity or the Reynolds number, the additional interfacial force will gradually have an important influence on the interface and affect the numerical accuracy. PMID:22463354

  16. Model for Assembly Line Re-Balancing Considering Additional Capacity and Outsourcing to Face Demand Fluctuations

    NASA Astrophysics Data System (ADS)

    Samadhi, TMAA; Sumihartati, Atin

    2016-02-01

    The most critical stage in a garment industry is the sewing process, because generally it consists of a number of operations and a large number of sewing machines for each operation. Therefore, it requires a balancing method that can assign tasks to work stations with balanced workloads. Many studies on assembly line balancing assume a new assembly line, but in reality, due to demand fluctuations and increases, re-balancing is needed. To cope with those fluctuating demand changes, additional capacity can be provided by investing in spare sewing machines and paying for sewing services through outsourcing. This study develops an assembly line balancing (ALB) model on an existing line to cope with fluctuating demand. Capacity redesign is decided if the fluctuating demand exceeds the available capacity, through a combination of investment in new machines and outsourcing, while minimizing the cost of idle capacity in the future. The objective of the model is to minimize the total cost of the assembly line, which consists of operating costs, machine cost, added capacity cost, losses due to idle capacity, and outsourcing costs. The model developed is based on an integer programming formulation. The model is tested on a set of data for one year's demand with an existing number of sewing machines of 41 units. The result shows that an additional maximum capacity of up to 76 machines is required when there is an increase of 60% over the average demand, at equal cost parameters.
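
    A sketch of the capacity decision as a small integer program in the spirit of the model described (not the paper's formulation): buy an integer number of machines and/or outsource units to cover a demand shortfall at minimum cost. All costs and capacities are invented.

```python
# Minimal machines-vs-outsourcing MILP; numbers are illustrative stand-ins.
import numpy as np
from scipy.optimize import milp, LinearConstraint

shortfall = 5200                # units above existing line capacity
per_machine = 400               # units one extra sewing machine adds
machine_cost, outsource_cost = 700.0, 2.0   # per machine, per outsourced unit

c = np.array([machine_cost, outsource_cost])           # minimize total cost
cover = LinearConstraint(np.array([[per_machine, 1.0]]), lb=shortfall)
res = milp(c=c, constraints=[cover], integrality=np.array([1, 0]))
print(res.x)   # here: 13 machines, 0 outsourced units (13 * 400 = 5200)
```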

  17. Default risk modeling beyond the first-passage approximation: extended Black-Cox model.

    PubMed

    Katz, Yuri A; Shokhirev, Nikolai V

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even if the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with the radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The obtained closed formulas fit well the historical data on global corporate defaults and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity. PMID:20866698

  18. Default risk modeling beyond the first-passage approximation: Extended Black-Cox model

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.; Shokhirev, Nikolai V.

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even if the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with the radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The obtained closed formulas fit well the historical data on global corporate defaults and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.
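
    For reference, a hedged sketch of the classical first-passage default probability that Black-Cox builds on, not the paper's extended model: the radiation boundary condition and linear potential of the extension are omitted, and the parameters are illustrative.

```python
# First-passage default probability for assets following geometric Brownian
# motion with a constant default barrier (the base case the paper extends).
import numpy as np
from scipy.stats import norm

def first_passage_default_prob(V0, D, mu, sigma, T):
    """P(asset value hits barrier D before time T)."""
    b = np.log(V0 / D)                 # initial log-distance to default
    m = mu - 0.5 * sigma**2            # drift of log-assets
    s = sigma * np.sqrt(T)
    return (norm.cdf((-b - m * T) / s)
            + np.exp(-2 * m * b / sigma**2) * norm.cdf((-b + m * T) / s))

for T in (0.25, 1.0, 5.0):
    print(T, first_passage_default_prob(V0=120, D=100, mu=0.05, sigma=0.25, T=T))
# Note the vanishing short-maturity default probability -- the behavior the
# extended model's finite boundary default rate is designed to correct.
```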

  19. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  20. Application of the Beck model to stock markets: Value-at-Risk and portfolio risk assessment

    NASA Astrophysics Data System (ADS)

    Kozaki, M.; Sato, A.-H.

    2008-02-01

    We apply the Beck model, developed for turbulent systems that exhibit scaling properties, to stock markets. Our study reveals that the Beck model elucidates the properties of stock market returns and is applicable to practical use such as the Value-at-Risk estimation and the portfolio analysis. We perform empirical analysis with daily/intraday data of the S&P500 index return and find that the volatility fluctuation of real markets is consistent with the assumptions of the Beck model: the volatility fluctuates at a much larger time scale than the return itself, and the inverse of the variance, or “inverse temperature”, β, obeys a Γ-distribution. As predicted by the Beck model, the distribution of returns is well fitted by the q-Gaussian distribution of Tsallis statistics. The evaluation method of Value-at-Risk (VaR), one of the most significant indicators in risk management, is studied for the q-Gaussian distribution. Our proposed method enables VaR evaluation in consideration of tail risk, which is underestimated by the variance-covariance method. A framework of portfolio risk assessment under the existence of tail risk is considered. We propose a multi-asset model with a single volatility fluctuation shared by all assets, named the single β model, and empirically examine the agreement between the model and an imaginary portfolio with Dow Jones indices. It turns out that the single β model gives a good approximation to portfolios composed of assets with non-Gaussian and correlated returns.
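
    A sketch of the VaR comparison described above, with simulated heavy-tailed returns standing in for S&P 500 data; a Student-t fit plays the role of the q-Gaussian (the two coincide for q > 1).

```python
# Heavy-tailed VaR (Student-t fit) versus the Gaussian variance-covariance VaR.
# Returns are simulated, not market data.
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(3)
returns = t.rvs(df=3, scale=0.01, size=5000, random_state=rng)  # fat tails

alpha = 0.01                                    # 99% one-day VaR
df_, loc, scale = t.fit(returns)
var_t = -t.ppf(alpha, df_, loc, scale)          # heavy-tailed estimate
var_gauss = -norm.ppf(alpha, returns.mean(), returns.std())  # variance-covariance
print(f"t-VaR {var_t:.4f} vs Gaussian VaR {var_gauss:.4f}")
# The Gaussian figure understates the 1% tail loss, as the abstract notes.
```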

  1. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  2. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, A.; Edwards, T.C., Jr.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.

  3. A patch-based cross masking model for natural images with detail loss and additive defects

    NASA Astrophysics Data System (ADS)

    Liu, Yucheng; Allebach, Jan P.

    2015-03-01

    Visual masking is an effect whereby the content of an image reduces the detectability of a given target signal hidden in that image. The effect of visual masking has found application in numerous image processing and vision tasks. In the past few decades, much research on visual masking has been based on models optimized for artificial targets placed upon unnatural masks. Over the years, there has been a tendency to apply masking models to predict natural image quality and the detection threshold of distortion presented in natural images. However, to our knowledge few studies have been conducted to understand the generalizability of masking models to different types of distortion presented in natural images. In this work, we measure the ability of natural image patches to mask three different types of distortion, and analyse the performance of the conventional gain-control model in predicting the distortion detection threshold. We then propose a new masking model, where detail loss and additive defects are modeled in two parallel vision channels and interact with each other via a cross-masking mechanism. We show that the proposed cross-masking model has better adaptability to various image structures and distortions in natural scenes.

  4. Injury prevention and risk communication: a mental models approach.

    PubMed

    Austin, Laurel C; Fischhoff, Baruch

    2012-04-01

    Individuals' decisions and behaviour can play a critical role in determining both the probability and severity of injury. Behavioural decision research studies people's decision-making processes in terms comparable to scientific models of optimal choices, providing a basis for focusing interventions on the most critical opportunities to reduce risks. That research often seeks to identify the 'mental models' that underlie individuals' interpretations of their circumstances and the outcomes of possible actions. In the context of injury prevention, a mental models approach would ask why people fail to see risks, do not make use of available protective interventions or misjudge the effectiveness of protective measures. If these misunderstandings can be reduced through context-appropriate risk communications, then their improved mental models may help people to engage more effectively in behaviours that they judge to be in their own best interest. If that proves impossible, then people may need specific instructions, not trusting to intuition or even paternalistic protection against situations that they cannot sufficiently control. The method entails working with domain specialists to elicit and create an expert model of the risk situation, interviewing lay people to elicit their comparable mental models, and developing and evaluating communication interventions designed to close the gaps between lay people and experts. This paper reviews the theory and method behind this research stream and uses examples to discuss how the approach can be used to develop scientifically validated context-sensitive injury risk communications. PMID:22088928

  5. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. PMID:23163724
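
    A toy Bayes update in the spirit of the framework described, not the authors' model: combine two agencies' high-risk flags, weighted by their historical hit and false-alarm rates, assuming conditional independence. All probabilities are invented.

```python
# Posterior default probability from two agency signals and a base rate.
prior = 0.02                       # baseline one-year default probability

# (P(flag | default), P(flag | no default)) from past performance data
agencies = [(0.80, 0.10),          # agency A flags the obligor as high risk
            (0.60, 0.20)]          # agency B also flags it

num, den = prior, 1.0 - prior
for hit_rate, false_alarm in agencies:
    num *= hit_rate                # joint likelihood given default
    den *= false_alarm             # joint likelihood given no default
posterior = num / (num + den)
print(f"posterior default probability: {posterior:.3f}")   # about 0.33
```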

  6. Modeling Environment for Total Risk-1A

    EPA Science Inventory

    MENTOR-1A uses an integrated, mechanistically consistent source-to-dose modeling framework to quantify inhalation exposure and dose for individuals and/or populations due to co-occurring air pollutants. It uses the "One Atmosphere" concept to characterize simultaneous exposures t...

  7. Modeling Environment for Total Risk-4M

    EPA Science Inventory

    MENTOR-4M uses an integrated, mechanistically consistent, source-to-dose modeling framework to quantify simultaneous exposures and doses of individuals and populations to multiple contaminants. It is an implementation of the MENTOR system for exposures to Multiple contaminants fr...

  8. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping resource re-allocation in healthcare linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources because aged people require more specialised medical care, due notably to cancer. We propose a method useful for monitoring changes in cancer incidence in space and time, taking into account two age categories according to the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, according to the methodology presented in Wood, Generalised Additive Models: An Introduction with R, Chapman and Hall/CRC, 2006. Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase in the incidence of cancer over time and an obvious spatial pattern for people over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for cancer care of older people in this sub-region. PMID:23242683

  9. The PRIMROSE cardiovascular risk prediction models for people with severe mental illness

    PubMed Central

    Osborn, David PJ; Hardoon, Sarah; Omar, Rumana Z; Holt, Richard IG; King, Michael; Larsen, John; Marston, Louise; Morris, Richard W; Nazareth, Irwin; Walters, Kate; Petersen, Irene

    2015-01-01

    Importance People with severe mental illness (SMI), including schizophrenia and bipolar disorder, have excess cardiovascular disease (CVD). Risk prediction models validated for the general population may not accurately estimate cardiovascular risk in this group. Objectives To develop and validate a risk model exclusively for predicting CVD events in people with SMI, using established cardiovascular risk factors and additional variables. Design Prospective cohort and risk score development study. Setting UK primary care. Participants 38,824 people with a diagnosis of SMI (schizophrenia, bipolar disorder or other non-organic psychosis) aged 30-90 years. Median follow-up 5.6 years with 2,324 CVD events (6%). Main outcomes and measures Ten-year risk of a first cardiovascular event (myocardial infarction, angina pectoris, cerebrovascular accident or major coronary surgery). Predictors included age, gender, height, weight, systolic blood pressure, diabetes, smoking, body mass index (BMI), lipid profile, social deprivation, SMI diagnosis, prescriptions of antidepressants and antipsychotics, and reports of heavy alcohol use. Results We developed two risk models for people with SMI: the PRIMROSE BMI model and a lipid model, which excluded lipids and BMI, respectively. From cross-validations, in terms of discrimination, for men the PRIMROSE lipid model D statistic was 1.92 (1.80-2.03) and the C statistic was 0.80 (0.76-0.83), compared to 1.74 (1.54-1.86) and 0.78 (0.75-0.82) for published Framingham risk scores; in women the corresponding results were 1.87 (1.76-1.98) and 0.80 (0.76-0.83) for the PRIMROSE lipid model and 1.58 (1.48-1.68) and 0.76 (0.72-0.80) for Framingham. Discrimination statistics for the PRIMROSE BMI model were comparable to those for the PRIMROSE lipid model. Calibration plots suggested that both PRIMROSE models were superior to the Framingham models. Conclusion and relevance The PRIMROSE BMI and lipid CVD risk prediction models performed better in SMI than models which only

  10. Empirical Analysis of Farm Credit Risk under the Structure Model

    ERIC Educational Resources Information Center

    Yan, Yan

    2009-01-01

    The study measures farm credit risk by using farm records collected by Farm Business Farm Management (FBFM) during the period 1995-2004. The study addresses the following questions: (1) whether farm's financial position is fully described by the structure model, (2) what are the determinants of farm capital structure under the structure model, (3)…

  11. Evaporation model for beam based additive manufacturing using free surface lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Klassen, Alexander; Scharowsky, Thorsten; Körner, Carolin

    2014-07-01

    Evaporation plays an important role in many technical applications including beam-based additive manufacturing processes, such as selective electron beam or selective laser melting (SEBM/SLM). In this paper, we describe an evaporation model which we employ within the framework of a two-dimensional free surface lattice Boltzmann method. With this method, we solve the hydrodynamics as well as thermodynamics of the molten material taking into account the mass and energy losses due to evaporation and the recoil pressure acting on the melt pool. Validation of the numerical model is performed by measuring maximum melt depths and evaporative losses in samples of pure titanium and Ti-6Al-4V molten by an electron beam. Finally, the model is applied to create processing maps for an SEBM process. The results predict that the penetration depth of the electron beam, which is a function of the acceleration voltage, has a significant influence on evaporation effects.
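
    A sketch of the evaporation ingredients such a model needs: a Clausius-Clapeyron saturation pressure, a Hertz-Knudsen mass flux, and the commonly used recoil-pressure approximation p_r ≈ 0.54 p_sat(T). The material constants are rough titanium values used only for illustration; this is not the paper's implementation.

```python
# Hertz-Knudsen evaporation flux and recoil pressure for a hot melt surface.
# Constants are approximate values for titanium, for illustration only.
import numpy as np

R = 8.314                    # J/(mol K)
M = 47.9e-3                  # kg/mol, titanium
Lv = 8.88e6 * M              # J/mol, latent heat of vaporization (approximate)
Tb, p0 = 3560.0, 101325.0    # boiling point at ambient pressure

def p_sat(T):
    """Clausius-Clapeyron saturation pressure referenced to the boiling point."""
    return p0 * np.exp(-Lv / R * (1.0 / T - 1.0 / Tb))

def evaporation_flux(T):
    """Hertz-Knudsen mass flux, kg/(m^2 s), with sticking coefficient 1."""
    return p_sat(T) * np.sqrt(M / (2.0 * np.pi * R * T))

T = 3800.0
print(p_sat(T), evaporation_flux(T), 0.54 * p_sat(T))  # recoil pressure estimate
```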

  12. Parity Symmetry and Parity Breaking in the Quantum Rabi Model with Addition of Ising Interaction

    NASA Astrophysics Data System (ADS)

    Wang, Qiong; He, Zhi; Yao, Chun-Mei

    2015-04-01

    We explore the possibility of generating a new parity symmetry in the quantum Rabi model after a bias is introduced. In contrast to the mathematical treatment in a previous publication [J. Phys. A 46 (2013) 265302], we consider a physically realistic method that couples an additional spin to the original spin of the quantum Rabi model through an Ising interaction; introducing a bias then breaks the parity symmetry and alters the scaling behavior of the ground state. A general rule emerges: the parity symmetry broken by the bias can be restored by adding new degrees of freedom. The experimental feasibility of realizing the models under discussion is investigated. Supported by the National Natural Science Foundation of China under Grant Nos. 61475045 and 11347142, the Natural Science Foundation of Hunan Province, China under Grant No. 2015JJ3092
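    The abstract does not spell out the Hamiltonian; a hedged reconstruction consistent with the setup described (the standard quantum Rabi model plus a bias and an Ising-coupled second spin, written in our own notation) is

        H = \omega\, a^{\dagger}a + \tfrac{\Delta}{2}\,\sigma_{1}^{z} + g\,\sigma_{1}^{x}\,(a + a^{\dagger}) + \epsilon\,\sigma_{1}^{x} + J\,\sigma_{1}^{z}\sigma_{2}^{z} .

    For \epsilon = J = 0 the model commutes with the parity operator \Pi = \sigma_{1}^{z}(-1)^{a^{\dagger}a}; the bias term \epsilon\,\sigma_{1}^{x} anticommutes with \Pi and therefore breaks the symmetry, which is the breaking that the added spin degree of freedom is intended to compensate.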

  13. Modeling of Flood Risk for the Continental United States

    NASA Astrophysics Data System (ADS)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more, in virtually every field of technology. In this talk we explain the techniques and underlying assumptions of building the RMS US flood risk model. We pay particular attention to correlation (spatial and temporal), simulation, and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. the US Midwest flood of 2008 and the US Northeast flood of 2010) have increased concern about flood risk; consequently, there is a growing need to assess flood risk adequately. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte Carlo analogue technique, capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, this allows us to correlate streamflow, and hence flooding, from different rivers, as well as low and high return periods across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data. Output from

  14. How pharmacokinetic modeling could improve a risk assessment for manganese

    EPA Science Inventory

    The neurotoxicity of manganese (Mn) is well established, yet the risk assessment of Mn is made complex by certain enigmas. These include apparently greater toxicity via inhalation compared to oral exposure and greater toxicity in humans compared to rats. In addition, until recentl...

  15. Hydrophobic interactions in model enclosures from small to large length scales: non-additivity in explicit and implicit solvent models

    PubMed Central

    Wang, Lingle; Friesner, Richard A.; Berne, B. J.

    2011-01-01

    The binding affinities between a united-atom methane and various model hydrophobic enclosures were studied through high accuracy free energy perturbation methods (FEP). We investigated the non-additivity of the hydrophobic interaction in these systems, measured by the deviation of its binding affinity from that predicted by the pairwise additivity approximation. While only small non-additivity effects were previously reported in the interactions in methane trimers, we found large cooperative effects (as large as −1.14 kcal mol−1 or approximately a 25% increase in the binding affinity) and anti-cooperative effects (as large as 0.45 kcal mol−1) for these model enclosed systems. Decomposition of the total potential of mean force (PMF) into increasing orders of multi-body interactions indicates that the contributions of the higher order multi-body interactions can be either positive or negative in different systems, and increasing the order of multi-body interactions considered did not necessarily improve the accuracy. A general correlation between the sign of the non-additivity effect and the curvature of the solute molecular surface was observed. We found that implicit solvent models based on the molecular surface area (MSA) performed much better, not only in predicting binding affinities, but also in predicting the non-additivity effects, compared with models based on the solvent accessible surface area (SASA), suggesting that MSA is a better descriptor of the curvature of the solutes. We also show how the non-additivity contribution changes as the hydrophobicity of the plate is decreased from the dewetting regime to the wetting regime. PMID:21043426
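    In our notation (the abstract reports the values but not the formula), the non-additivity measure described is the gap between the full binding free energy and its pairwise reconstruction:

        \Delta\Delta G_{\mathrm{nonadd}} \;=\; \Delta G_{\mathrm{bind}} \;-\; \sum_{k} \Delta G_{\mathrm{bind}}^{(k)},

    where the sum runs over the pairwise solute-enclosure contributions. \Delta\Delta G_{\mathrm{nonadd}} < 0 corresponds to the cooperative cases reported (down to about −1.14 kcal mol−1) and \Delta\Delta G_{\mathrm{nonadd}} > 0 to the anti-cooperative ones (up to 0.45 kcal mol−1).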

  16. Model for Solar Proton Risk Assessment

    NASA Technical Reports Server (NTRS)

    Xapos, M. A.; Stauffer, C.; Gee, G. B.; Barth, J. L.; Stassinopoulos, E. G.; McGuire, R. E.

    2004-01-01

    A statistical model for cumulative solar proton event fluences during space missions is presented that covers both the solar minimum and solar maximum phases of the solar cycle. It is based on data from the IMP and GOES series of satellites, integrated so that the best features of each data set can be exploited. This allows fluence-energy spectra to be extended out to energies of 327 MeV.

  17. Parametric Estimation in a Recurrent Competing Risks Model

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators. PMID:25346751
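    A hedged sketch of the RCRM data-generating mechanism and estimation under the simplest assumptions, exponential marginals with perfect repair over a fixed monitoring window, in which the cause-k maximum likelihood estimate reduces to its event count divided by the window length. This illustrates the model structure, not the paper's general estimators:

      import numpy as np

      rng = np.random.default_rng(1)
      K = 3                                    # number of competing risks
      lam_true = np.array([0.5, 1.0, 2.0])     # cause-specific rates
      tau = 10.0                               # monitoring window

      # simulate one system under perfect repair: after each failure all clocks restart
      causes, t = [], 0.0
      while True:
          candidate = rng.exponential(1.0 / lam_true)  # one latent draw per risk
          k = int(np.argmin(candidate))                # the first risk to fire wins
          t += candidate[k]
          if t > tau:
              break
          causes.append(k)

      causes = np.array(causes)
      # MLE under exponential marginals + perfect repair: n_k / total time on test
      lam_hat = np.array([(causes == k).sum() for k in range(K)]) / tau
      print("true:", lam_true, "estimated:", lam_hat)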

  18. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, raising the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.
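    A hedged toy aggregation of per-docking collision probabilities over a year of visiting-vehicle traffic, assuming independent attempts; every number below is invented for illustration, not a NASA value:

      # hypothetical per-docking collision probabilities for each vehicle type
      p_per_docking = {"Soyuz": 1e-4, "Progress": 1e-4, "ATV": 2e-4, "HTV": 2e-4}
      dockings_per_year = {"Soyuz": 4, "Progress": 4, "ATV": 1, "HTV": 1}

      p_no_collision = 1.0
      for vehicle, p in p_per_docking.items():
          # independence assumption: multiply survival probabilities per attempt
          p_no_collision *= (1.0 - p) ** dockings_per_year[vehicle]

      print(f"P(>=1 collision in a year) = {1.0 - p_no_collision:.2e}")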

  19. Testing Departure from Additivity in Tukey’s Model using Shrinkage: Application to a Longitudinal Setting

    PubMed Central

    Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A.; Park, Sung Kyun; Kardia, Sharon L.R.; Allison, Matthew A.; Vokonas, Pantel S.; Chen, Jinbo; Diez-Roux, Ana V.

    2014-01-01

    While there has been extensive research developing gene-environment interaction (GEI) methods in case-control studies, little attention has been given to sparse and efficient modeling of GEI in longitudinal studies. In a two-way table for GEI with rows and columns as categorical variables, a conventional saturated interaction model involves estimation of a specific parameter for each cell, with constraints ensuring identifiability. The estimates are unbiased but are potentially inefficient because the number of parameters to be estimated can grow quickly with increasing categories of row/column factors. On the other hand, Tukey’s one degree of freedom (df) model for non-additivity treats the interaction term as a scaled product of row and column main effects. Due to the parsimonious form of interaction, the interaction estimate leads to enhanced efficiency and the corresponding test could lead to increased power. Unfortunately, Tukey’s model gives biased estimates and low power if the model is misspecified. When screening multiple GEIs where each genetic and environmental marker may exhibit a distinct interaction pattern, a robust estimator for interaction is important for GEI detection. We propose a shrinkage estimator for interaction effects that combines estimates from both Tukey’s and saturated interaction models and use the corresponding Wald test for testing interaction in a longitudinal setting. The proposed estimator is robust to misspecification of interaction structure. We illustrate the proposed methods using two longitudinal studies — the Normative Aging Study and the Multi-Ethnic Study of Atherosclerosis. PMID:25112650
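    In our notation (the abstract does not give the formulas), Tukey's one-degree-of-freedom model constrains the two-way interaction to a scaled product of the main effects,

        y_{ij} = \mu + \alpha_i + \beta_j + \gamma\,\alpha_i \beta_j + \varepsilon_{ij},

    whereas the saturated model allows a free cell effect \delta_{ij} in place of \gamma\,\alpha_i\beta_j. A shrinkage estimator of the kind described combines the two fits,

        \hat{\delta}^{\mathrm{shrink}}_{ij} = w\,\hat{\gamma}\hat{\alpha}_i\hat{\beta}_j + (1 - w)\,\hat{\delta}^{\mathrm{sat}}_{ij}, \qquad 0 \le w \le 1,

    with a data-driven weight w that leans on the parsimonious Tukey form when it fits and falls back to the unbiased saturated estimate when it does not; the paper's specific choice of weight is not given in the abstract.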

  20. Wall-models for large eddy simulation based on a generic additive-filter formulation

    NASA Astrophysics Data System (ADS)

    Sanchez Rocha, Martin

    Based on the philosophy of resolving only the large scales of turbulent motion, Large Eddy Simulation (LES) has demonstrated the potential to provide high-fidelity turbulence simulations at low computational cost. However, when the scales that control the turbulence in a particular flow are not large, LES must significantly increase its computational cost to provide accurate predictions. This is the case in wall-bounded flows, where the grid resolution required by LES to resolve the near-wall structures is close to the requirement to resolve the smallest dissipative scales in turbulence. To relax this demanding requirement, it has been proposed to model the near-wall region with Reynolds-Averaged Navier-Stokes (RANS) models, in what is known as the hybrid RANS/LES approach. In this work, the mathematical implications of merging two different turbulence modeling approaches are addressed by deriving the exact hybrid RANS/LES Navier-Stokes equations. These equations are derived by introducing an additive-filter, which linearly combines the RANS and LES operators with a blending function. The equations derived with the additive-filter predict additional hybrid terms, which represent the interactions between the RANS and LES formulations. Theoretically, the prediction of the hybrid terms demonstrates that the hybridization of the two approaches cannot be accomplished by the turbulence model equations alone, as is claimed for current hybrid RANS/LES models. The importance of the exact hybrid RANS/LES equations is demonstrated by conducting numerical calculations on a turbulent flat-plate boundary layer. Results indicate that the hybrid terms help to maintain an equilibrated model transition when the hybrid formulation switches from RANS to LES. Results also indicate that, when the hybrid terms are not included, the accuracy of the calculations relies strongly on the blending function implemented in the additive-filter. On the other hand, if the exact equations are
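    In our notation, the additive filter blends the two operators,

        \mathcal{F}\,u \;=\; k\,\langle u \rangle_{\mathrm{RANS}} + (1-k)\,\overline{u}_{\mathrm{LES}}, \qquad 0 \le k \le 1,

    where k is the blending function. Because \mathcal{F} applied to the nonlinear Navier-Stokes terms does not factor the way each operator does alone, exact hybrid terms appear that involve products of k, (1-k) and differences between the RANS and LES fields; they vanish only where k is 0 or 1 or where the two fields coincide, which is the mathematical content of the statement that the turbulence model equations alone cannot accomplish the hybridization.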

  1. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. Prior-art transport codes calculate the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as signaling and bystander effects; these are ignored or impossible in the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of beam lines, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of hits for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes the numerical estimates of basic
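    The Poisson quantity mentioned is easy to make concrete: at fluence F, a nucleus of cross-sectional area A is traversed Poisson-many times with mean FA. A numeric sketch with assumed values, not GERM outputs:

      from math import exp, factorial

      fluence = 1.0e6        # particles per cm^2 (assumed)
      area = 100 * 1e-8      # nucleus cross-section: 100 um^2 expressed in cm^2
      mean_hits = fluence * area   # expected traversals per nucleus (= 1.0 here)

      for n in range(4):
          p_n = mean_hits ** n * exp(-mean_hits) / factorial(n)
          print(f"P({n} hits) = {p_n:.3f}")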

  2. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed on one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
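    A hedged sketch of the two ingredients on invented data: a Poisson GLM for claim frequency fitted with statsmodels, plus the classical limited-fluctuation full-credibility standard. The 5% / 90% standard below is the textbook choice, not necessarily the paper's:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf
      from scipy.stats import norm

      # hypothetical class-level data: claim counts with exposures by risk zone
      df = pd.DataFrame({
          "claims":   [12, 30, 7, 21, 15, 40],
          "exposure": [100, 180, 60, 150, 110, 220],
          "zone":     ["A", "B", "A", "B", "A", "B"],
      })
      glm = smf.glm("claims ~ zone", data=df, family=sm.families.Poisson(),
                    offset=np.log(df["exposure"])).fit()
      print(glm.summary().tables[1])

      # limited-fluctuation full-credibility standard for Poisson frequency:
      # expected claims needed to be within k of the mean with probability p
      k, prob = 0.05, 0.90
      n_full = (norm.ppf((1 + prob) / 2) / k) ** 2        # about 1082 claims
      Z = min(1.0, np.sqrt(df["claims"].sum() / n_full))  # square-root partial credibility
      print(f"n_full = {n_full:.0f}, Z = {Z:.2f}")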

  3. Positive autoantibodies to ZnT8 indicate elevated risk for additional autoimmune conditions in patients with Addison's disease.

    PubMed

    Fichna, Marta; Rogowicz-Frontczak, Anita; Żurawek, Magdalena; Fichna, Piotr; Gryczyńska, Maria; Zozulińska-Ziółkiewicz, Dorota; Ruchała, Marek

    2016-07-01

    Autoimmune Addison's disease (AAD) is associated with exceptional susceptibility to developing other autoimmune conditions, including type 1 diabetes (T1D), marked by positive serum autoantibodies to insulin (IAA), glutamic acid decarboxylase (GADA) and insulinoma-associated protein 2 (IA-2A). Zinc transporter 8 (ZnT8) is a new T1D autoantigen, encoded by the SLC30A8 gene. Its polymorphic variant rs13266634C/T seems associated with the occurrence of serum ZnT8 antibodies (ZnT8A). This study was designed to determine the prevalence of serum ZnT8A and their clinical implications in 140 AAD patients. Other beta cell and thyroid-specific autoantibodies were also investigated, and ZnT8A results were confronted with the rs13266634 genotype. ZnT8A were detectable in 8.5 %, GADA in 20.7 %, IA-2A in 5.7 %, IAA in 1.6 % and various anti-thyroid antibodies in 7.1-67.8 % of individuals. Type 1 diabetes was found in 10 % of AAD patients. ZnT8A were positive in 57.1 % of T1D patients and 3.4 % of non-diabetic AAD patients. Analysis of ZnT8A enabled the identification of autoimmunity in two (14.3 %) T1D individuals previously classified as autoantibody-negative. ZnT8A-positive patients revealed a significantly higher number of autoimmune conditions (p < 0.001), increased prevalence of T1D (p < 0.001) and other beta cell-specific autoantibodies. Carriers of the rs13266634 T-allele displayed increased frequency (p = 0.006) and higher titres of ZnT8A (p = 0.002). Our study demonstrates a high incidence of ZnT8A in AAD patients. ZnT8A are associated with coexisting T1D and predictive of T1D in non-diabetic subjects. Moreover, positive ZnT8A in AAD indicate elevated risk for additional autoimmune conditions. Autoantibodies to beta cell antigens, comprising ZnT8, could be included in routine screening panels in AAD. PMID:26972575

  4. Risk Management Model in Surface Exploitation of Mineral Deposits

    NASA Astrophysics Data System (ADS)

    Stojanović, Cvjetko

    2016-06-01

    Risk management is an integral part of all types of project management. One of the main tasks of pre-investment studies and other project documentation is to protect investment projects as much as possible against investment risks. The provision and regulation of risk information ensure the identification of the probability of adverse events, their forms, causes and consequences, and provide timely measures of protection against risks. This means that risk management involves a set of management methods and techniques used to reduce the possibility of adverse events and their consequences, and thus to increase the possibility of achieving the planned results with minimal losses. Investments in mining projects are of capital importance because they are very complex and therefore very risky, owing to the influence of internal and external factors and to limitations arising from the socio-economic environment. Due to the lack of a risk management system, numerous organizations worldwide have suffered significant financial losses. It is therefore necessary for any organization to establish risk management as a structural element of its management system as a whole. This paper presents an approach to a risk management model for the project of opening a surface coal mine, developed from studies of the extensive scientific literature and the author's personal experience, which, with certain modifications, may find use in any investment project, in the mining industry as well as in other areas.

  5. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.

  6. Modelling microbial health risk of wastewater reuse: A systems perspective.

    PubMed

    Beaudequin, Denise; Harden, Fiona; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Mengersen, Kerrie

    2015-11-01

    There is a widespread need for the use of quantitative microbial risk assessment (QMRA) to determine reclaimed water quality for specific uses; however, neither faecal indicator levels nor pathogen concentrations alone are adequate for assessing exposure health risk. The aim of this study was to build a conceptual model representing factors contributing to the microbiological health risks of reusing water treated in maturation ponds. This paper describes the development of an unparameterised model that provides a visual representation of theoretical constructs and variables of interest. Information was collected from the peer-reviewed literature and through consultation with experts from regulatory authorities and academic disciplines. We explore how, by considering microbial risk as a modular system, the QMRA framework enables incorporation of the many factors influencing human exposure and dose response, to better characterise likely human health impacts. By using and expanding upon the QMRA framework we deliver new insights into this important field of environmental exposures. We present a conceptual model of health risk of microbial exposure which can be used for maturation ponds and, more importantly, as a generic tool to assess health risk in diverse wastewater reuse scenarios. PMID:26277638
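    The dose-response step at the heart of a QMRA is compact enough to sketch. Below are the two standard single-hit models, the exponential and the approximate beta-Poisson; the dose and all parameter values are hypothetical placeholders, not values from this study:

      import numpy as np

      def p_inf_exponential(dose, r):
          # exponential model: each ingested organism independently infects with prob r
          return 1.0 - np.exp(-r * dose)

      def p_inf_beta_poisson(dose, alpha, n50):
          # approximate beta-Poisson model parameterised by the median infectious dose
          return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

      dose = 10.0  # organisms ingested per exposure event (assumed)
      print(p_inf_exponential(dose, r=0.004))              # hypothetical r
      print(p_inf_beta_poisson(dose, alpha=0.25, n50=900))  # hypothetical parameters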

  7. A first screening and risk assessment of pharmaceuticals and additives in personal care products in waste water, sludge, recipient water and sediment from Faroe Islands, Iceland and Greenland.

    PubMed

    Huber, Sandra; Remberger, Mikael; Kaj, Lennart; Schlabach, Martin; Jörundsdóttir, Hrönn Ó; Vester, Jette; Arnórsson, Mímir; Mortensen, Inge; Schwartson, Richard; Dam, Maria

    2016-08-15

    A screening of a broad range of pharmaceuticals and additives in personal care products (PPCPs) in sub-arctic locations of the Faroe Islands (FO), Iceland (IS) and Greenland (GL) was conducted. In total, 36 pharmaceuticals, including some metabolites, and seven additives in personal care products were investigated in influent and effluent waters as well as sludge of waste water treatment plants (WWTPs) and in water and sediment of recipients. Concentrations and distribution patterns for PPCPs discharged via sewage lines (SLs) to the marine environment were assessed. Of the 36 pharmaceuticals or metabolites analysed, 33 were found close to or above the limit of detection (LOD) in all or part of the samples. All seven investigated additives in personal care products were detected above the LOD. Some of the analysed PPCPs occurred in every or almost every sample. Among these were diclofenac, ibuprofen, lidocaine, naproxen, metformin, citalopram, venlafaxine, amiloride, furosemide, metoprolol, sodium dodecyl sulphate (SDS) and cetrimonium salt (ATAC-C16). Additionally, the study encompasses an ecotoxicological risk assessment of two-thirds of the analysed PPCPs in recipient and diluted effluent waters. For candesartan, only a small margin to levels with unacceptable risks was observed in diluted effluent waters at two locations (FO). Chronic risks for aquatic organisms staying and/or living around WWTP effluent pipe-outlets were indicated for 17β-estradiol and estriol in the three countries. Additives in PCPs were found to pose the largest risk to the aquatic environment. The surfactants CAPB and ATAC-C16 were found in concentrations resulting in risk factors up to 375 for CAPB and 165 for ATAC-C16 in recipients for diluted effluents from Iggia, Nuuk (GL) and Torshavn (FO), respectively. These results demonstrate a potentially high ecological risk stemming from discharge of surfactants as used in household and industrial detergents as well as additives in personal care
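    The risk factors quoted (e.g. up to 375 for CAPB) are risk quotients of the standard form RQ = MEC/PNEC, the measured environmental concentration over the predicted no-effect concentration. A minimal sketch with invented numbers, not the study's measurements:

      # measured environmental concentration (MEC) vs predicted no-effect
      # concentration (PNEC), both in ug/L -- hypothetical values
      substances = {
          "surfactant_X": (7.5, 0.02),
          "analgesic_Y":  (0.8, 1.0),
      }
      for name, (mec, pnec) in substances.items():
          rq = mec / pnec
          flag = "risk" if rq >= 1 else "no significant risk"
          print(f"{name}: RQ = {rq:.1f} ({flag})")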

  8. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in developing an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with the opportunity for hands-on demonstrations. Brief descriptions of each tool follow: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  9. Topsoil organic carbon content of Europe, a new map based on a generalised additive model

    NASA Astrophysics Data System (ADS)

    de Brogniez, Delphine; Ballabio, Cristiano; Stevens, Antoine; Jones, Robert J. A.; Montanarella, Luca; van Wesemael, Bas

    2014-05-01

    There is an increasing demand for up-to-date, spatially continuous organic carbon (OC) data for global environmental and climatic modeling. Whilst the current map of topsoil organic carbon content for Europe (Jones et al., 2005) was produced by applying expert-knowledge-based pedo-transfer rules on large soil mapping units, the aim of this study was to replace it by applying digital soil mapping techniques to the first European harmonised geo-referenced topsoil (0-20 cm) database, which arises from the LUCAS (land use/cover area frame statistical survey) survey. A generalized additive model (GAM) was calibrated on 85% of the dataset (ca. 17 000 soil samples), and a backward stepwise approach selected slope, land cover, temperature, net primary productivity, latitude and longitude as environmental covariates (500 m resolution). The validation of the model (applied to 15% of the dataset) gave an R2 of 0.27. We observed that most organic soils were under-predicted by the model and that soils of Scandinavia were also poorly predicted. The model showed an RMSE of 42 g kg-1 for mineral soils and of 287 g kg-1 for organic soils. The map of predicted OC content showed the lowest values in Mediterranean countries and in croplands across Europe, whereas the highest OC contents were predicted in wetlands, woodlands and mountainous areas. The map of the standard error of the OC model predictions showed high values in northern latitudes, wetlands, moors and heathlands, whereas low uncertainty was mostly found in croplands. A comparison of our results with the map of Jones et al. (2005) showed general agreement on the prediction of mineral soils' OC content, most probably because the models use some common covariates, namely land cover and temperature. Our model however failed to predict values of OC content greater than 200 g kg-1, which we explain by the imposed unimodal distribution of our model, whose mean is tilted towards the majority of soils, which are mineral. Finally, average
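    A hedged sketch of the modelling step, assuming the pyGAM package and synthetic data standing in for LUCAS: smooth terms for continuous covariates like those selected (slope, temperature, net primary productivity) and a factor term for land cover:

      import numpy as np
      from pygam import LinearGAM, s, f

      rng = np.random.default_rng(0)
      n = 2000
      slope = rng.uniform(0, 30, n)
      temp = rng.uniform(-5, 20, n)
      npp = rng.uniform(0, 1, n)
      landcover = rng.integers(0, 5, n)            # categorical covariate, 5 classes
      # synthetic OC content loosely mimicking the described dependencies
      oc = 60 - 1.5 * temp + 20 * npp + 3 * (landcover == 3) + rng.normal(0, 8, n)

      X = np.column_stack([slope, temp, npp, landcover])
      # smooth terms for continuous covariates, a factor term for land cover
      gam = LinearGAM(s(0) + s(1) + s(2) + f(3)).fit(X, oc)
      print(gam.statistics_["pseudo_r2"]["explained_deviance"])

    In practice one would mirror the paper's 85/15 calibration/validation split and report the hold-out R2 rather than the in-sample deviance explained.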

  10. Comparison of prosthetic models produced by traditional and additive manufacturing methods

    PubMed Central

    Park, Jin-Young; Kim, Hae-Young; Kim, Ji-Hwan; Kim, Jae-Hong

    2015-01-01

    PURPOSE The purpose of this study was to verify the clinical feasibility of additive manufacturing by comparing the accuracy of four different manufacturing methods for metal copings: the conventional lost wax technique (CLWT); a subtractive method with wax blank milling (WBM); and two additive methods, multi jet modeling (MJM) and micro-stereolithography (Micro-SLA). MATERIALS AND METHODS Thirty study models were created using an acrylic model with the maxillary upper right canine, first premolar, and first molar teeth. Based on the scan files from a non-contact blue light scanner (Identica; Medit Co. Ltd., Seoul, Korea), thirty cores were produced using each of the WBM, MJM, and Micro-SLA methods, and another thirty frameworks were produced using the CLWT method. To measure the marginal and internal gap, the silicone replica method was adopted, and the silicone images obtained were evaluated using a digital microscope (KH-7700; Hirox, Tokyo, Japan) at 140X magnification. Analyses were performed using two-way analysis of variance (ANOVA) and the Tukey post hoc test (α=.05). RESULTS The mean marginal gaps and internal gaps showed significant differences according to tooth type (P<.001 and P<.001, respectively) and manufacturing method (P<.037 and P<.001, respectively). Micro-SLA did not show any significant difference from CLWT regarding mean marginal gap, in contrast to the WBM and MJM methods. CONCLUSION The mean gap values resulting from the four different manufacturing methods were within a clinically allowable range; thus, the clinical use of additive manufacturing methods is acceptable as an alternative to the traditional lost wax technique and subtractive manufacturing. PMID:26330976
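    A hedged sketch of the analysis described (two-way ANOVA with a Tukey post hoc test) using statsmodels on synthetic gap measurements; the group means below are invented, not the study's data:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      rng = np.random.default_rng(42)
      methods = ["CLWT", "WBM", "MJM", "MicroSLA"]
      teeth = ["canine", "premolar", "molar"]
      # 10 synthetic gap measurements (um) per method x tooth cell
      rows = [(m, t, rng.normal(80 + 5 * i, 10))
              for i, m in enumerate(methods) for t in teeth for _ in range(10)]
      df = pd.DataFrame(rows, columns=["method", "tooth", "gap_um"])

      model = smf.ols("gap_um ~ C(method) * C(tooth)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))                # two-way ANOVA table
      print(pairwise_tukeyhsd(df["gap_um"], df["method"]))  # post hoc, alpha=0.05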

  11. Thermodynamic network model for predicting effects of substrate addition and other perturbations on subsurface microbial communities

    SciTech Connect

    Jack Istok; Melora Park; James McKinley; Chongxuan Liu; Lee Krumholz; Anne Spain; Aaron Peacock; Brett Baldwin

    2007-04-19

    The overall goal of this project is to develop and test a thermodynamic network model for predicting the effects of substrate additions and environmental perturbations on microbial growth, community composition and system geochemistry. The hypothesis is that a thermodynamic analysis of the energy-yielding growth reactions performed by defined groups of microorganisms can be used to make quantitative and testable predictions of the change in microbial community composition that will occur when a substrate is added to the subsurface or when environmental conditions change.
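    The thermodynamic bookkeeping underlying such a network model can be illustrated with the textbook relation dG = dG0 + RT ln Q evaluated for one candidate energy-yielding reaction. The standard free energy and the activities below are assumed illustrative values, not project data:

      import numpy as np

      R, T = 8.314e-3, 298.15  # kJ/(mol K), K

      def delta_g(dg0, activities, stoich):
          # dG = dG0 + RT ln Q; negative stoichiometric coefficients for reactants
          lnQ = sum(nu * np.log(a) for a, nu in zip(activities, stoich))
          return dg0 + R * T * lnQ

      # illustrative: acetate oxidation coupled to sulfate reduction (assumed dG0)
      dg0 = -47.6                              # kJ/mol, assumed standard value
      activities = [1e-3, 1e-3, 1e-6, 1e-2]    # acetate, sulfate, HS-, HCO3-
      stoich     = [-1,   -1,    1,    2]
      print(f"dG = {delta_g(dg0, activities, stoich):.1f} kJ/mol")  # < 0: growth-supporting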

  12. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  13. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.
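    A hedged toy version of the trade-off this record describes, reliability versus inspection interval and rejection crack size, using a geometric crack-growth stand-in with invented parameters (not the APU data):

      import numpy as np

      rng = np.random.default_rng(7)

      def failure_prob(n=100_000, life=10_000, inspect_every=1_000,
                       reject=1.0, critical=3.0):
          a0 = rng.lognormal(-4.0, 0.5, n)                 # initial crack size, mm
          g = rng.lognormal(np.log(7e-4), 0.3, n)          # per-cycle growth rate
          # geometric growth a(t) = a0 * (1 + g)^t, inverted for key crack sizes
          t_reject = np.log(reject / a0) / np.log1p(g)     # cycle when crack is rejectable
          t_crit = np.log(critical / a0) / np.log1p(g)     # cycle when the wheel bursts
          # first scheduled inspection at or after the crack becomes rejectable
          t_found = np.ceil(np.maximum(t_reject, 0.0) / inspect_every) * inspect_every
          # failure: burst before detection, and within the service life
          fails = t_crit <= np.minimum(t_found, life)
          return fails.mean()

      for interval in (500, 1000, 2000):
          print(interval, failure_prob(inspect_every=interval))

    Shorter inspection intervals and smaller rejection crack sizes both lower the simulated failure probability, which is the kind of policy trade-off the model is built to expose.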

  14. Development and Application of Chronic Disease Risk Prediction Models

    PubMed Central

    Oh, Sun Min; Stefani, Katherine M.

    2014-01-01

    Currently, non-communicable chronic diseases are a major cause of morbidity and mortality worldwide, and a large proportion of chronic diseases are preventable through risk factor management. However, the prevention efficacy at the individual level is not yet satisfactory. Chronic disease prediction models have been developed to assist physicians and individuals in clinical decision-making. A chronic disease prediction model assesses multiple risk factors together and estimates an absolute disease risk for the individual. Accurate prediction of an individual's future risk for a certain disease enables the comparison of benefits and risks of treatment, the costs of alternative prevention strategies, and selection of the most efficient strategy for the individual. A large number of chronic disease prediction models, especially targeting cardiovascular diseases and cancers, have been suggested, and some of them have been adopted in the clinical practice guidelines and recommendations of many countries. Although few chronic disease prediction tools have been suggested in the Korean population, their clinical utility is not as high as expected. This article reviews methodologies that are commonly used for developing and evaluating a chronic disease prediction model and discusses the current status of chronic disease prediction in Korea. PMID:24954311

  15. A new model for polluted soil risk assessment

    NASA Astrophysics Data System (ADS)

    Andretta, M.; Serra, R.; Villani, M.

    2006-08-01

    In this paper, we discuss the most important theoretical aspects of polluted-soil risk assessment methodologies, which have been developed to evaluate the risk to exposed people from residual contaminant concentrations in polluted soil, and we briefly survey the major kinds of risk assessment methodologies. We also underline the relevant role played in this kind of analysis by pollutant transport models. We then describe a new and innovative model, based on the general framework of so-called Cellular Automata (CA) and initially developed in the EU Esprit project COLOMBO for the simulation of bioremediation processes. Owing to their intrinsically "finite and discrete" character, such models seem very well suited to a detailed analysis of the shape of pollutant sources, contaminant fates and the evaluation of targets in the risk assessment. In particular, we describe the future research activities we are going to develop on the strict integration between pollutant fate and transport models and risk analysis methodologies.

  16. A Cooperative Model for IS Security Risk Management in Distributed Environment

    PubMed Central

    Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among interconnected IS in a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively. PMID:24563626
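    A minimal sketch of the BN idea, assuming a recent version of the pgmpy package (which names the class BayesianNetwork); the structure, states and probabilities are invented for illustration, not taken from the paper:

      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      # toy structure: shared threat intelligence and local vulnerability drive IS risk
      model = BayesianNetwork([("ThreatReported", "Risk"), ("Vulnerable", "Risk")])
      model.add_cpds(
          TabularCPD("ThreatReported", 2, [[0.8], [0.2]]),
          TabularCPD("Vulnerable", 2, [[0.7], [0.3]]),
          TabularCPD("Risk", 2,
                     # columns: (T=0,V=0) (T=0,V=1) (T=1,V=0) (T=1,V=1)
                     [[0.95, 0.6, 0.7, 0.1],    # Risk = low
                      [0.05, 0.4, 0.3, 0.9]],   # Risk = high
                     evidence=["ThreatReported", "Vulnerable"], evidence_card=[2, 2]),
      )
      assert model.check_model()
      # an allied organization shares evidence of an active threat
      posterior = VariableElimination(model).query(["Risk"],
                                                   evidence={"ThreatReported": 1})
      print(posterior)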

  17. The use of ecosystem models in risk assessment

    SciTech Connect

    Starodub, M.E.; Miller, P.A.; Willes, R.F.

    1994-12-31

    Ecosystem models, when used in conjunction with available environmental effects monitoring data, enable informed decisions regarding actions that should be taken to manage ecological risks from areas of localized chemical loadings and accumulation. These models provide quantitative estimates of chemical concentrations in various environmental media. The reliable application of these models as predictive tools for environmental assessment requires a thorough understanding of the theory and mathematical relationships described by the models and demands rigorous validation of input data and model results with field and laboratory data. Food chain model selection should be based on the ability to best simulate the interactions of the food web and the processes governing the transfer of chemicals from the dissolved and particulate phases to the various trophic levels at the site in question. This requires that the user be familiar with the theories on which these models are based, and be aware of the merits and shortcomings of each prior to attempting to model food chain accumulation. Questions to be asked include: are all potential exposure pathways addressed? Are omitted pathways critical to the risk assessment process? Is the model flexible? To answer these questions one must consider the chemical(s) of concern, site-specific ecosystem characteristics, the dietary habits of the risk assessment receptor (aquatic, wildlife, human), and the influence of effluent characteristics on food chain dynamics.

  18. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effect of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.
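    A hedged toy Monte Carlo of the failure criterion described, a bondline temperature limit, using a crude steady-conduction surrogate in place of the trajectory, aerothermal and thermal-response tools; every number below is a placeholder:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000

      # hypothetical uncertain inputs
      q = rng.normal(60.0, 8.0, n)         # integrated heat-load proxy, arbitrary units
      k = rng.normal(0.05, 0.005, n)       # TPS conductivity, scattered by material lot
      thick = rng.normal(0.05, 0.002, n)   # TPS thickness, m
      mmod = rng.random(n) < 0.01          # 1% of cases suffer MMOD impact damage
      thick[mmod] *= rng.uniform(0.5, 0.9, mmod.sum())   # damage thins the TPS locally

      # crude bondline-temperature surrogate: hotter with load, conductivity, thin TPS
      T_bond = 300.0 + q * k / thick * 4.0
      p_fail = (T_bond > 650.0).mean()     # assumed bondline temperature limit
      print(f"P(bondline over limit) = {p_fail:.4f}")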

  19. Evaluating biomarkers to model cancer risk post cosmic ray exposure.

    PubMed

    Sridharan, Deepa M; Asaithamby, Aroumougame; Blattnig, Steve R; Costes, Sylvain V; Doetsch, Paul W; Dynan, William S; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D; Peterson, Leif E; Plante, Ianik; Ponomarev, Artem L; Saha, Janapriya; Snijders, Antoine M; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  20. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    NASA Astrophysics Data System (ADS)

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  1. Additive influence of genetic predisposition and conventional risk factors in the incidence of coronary heart disease: a population-based study in Greece

    PubMed Central

    Yiannakouris, Nikos; Katsoulis, Michail; Trichopoulou, Antonia; Ordovas, Jose M; Trichopoulos, Dimitrios

    2014-01-01

    Objectives An additive genetic risk score (GRS) for coronary heart disease (CHD) has previously been associated with incident CHD in the population-based Greek European Prospective Investigation into Cancer and nutrition (EPIC) cohort. In this study, we explore GRS-‘environment’ joint actions on CHD for several conventional cardiovascular risk factors (ConvRFs), including smoking, hypertension, type-2 diabetes mellitus (T2DM), body mass index (BMI), physical activity and adherence to the Mediterranean diet. Design A case–control study. Setting The general Greek population of the EPIC study. Participants and outcome measures 477 patients with medically confirmed incident CHD and 1271 controls participated in this study. We estimated the ORs for CHD by dividing participants at higher or lower GRS and, alternatively, at higher or lower ConvRF, and calculated the relative excess risk due to interaction (RERI) as a measure of deviation from additivity. Results The joint presence of higher GRS and higher risk ConvRF was in all instances associated with an increased risk of CHD, compared with the joint presence of lower GRS and lower risk ConvRF. The OR (95% CI) was 1.7 (1.2 to 2.4) for smoking, 2.7 (1.9 to 3.8) for hypertension, 4.1 (2.8 to 6.1) for T2DM, 1.9 (1.4 to 2.5) for lower physical activity, 2.0 (1.3 to 3.2) for high BMI and 1.5 (1.1 to 2.1) for poor adherence to the Mediterranean diet. In all instances, RERI values were fairly small and not statistically significant, suggesting that the GRS and the ConvRFs do not have effects beyond additivity. Conclusions Genetic predisposition to CHD, operationalised through a multilocus GRS, and ConvRFs have essentially additive effects on CHD risk. PMID:24500614
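    The additivity measure used here has a closed form, RERI = RR11 − RR10 − RR01 + 1, which is zero for exactly additive effects. A sketch that plugs in the joint T2DM odds ratio from the abstract, with the two single-exposure relative risks assumed for illustration (the abstract reports only the joint estimates):

      def reri(rr11, rr10, rr01):
          # relative excess risk due to interaction; 0 means exactly additive effects
          return rr11 - rr10 - rr01 + 1.0

      # joint exposure (higher GRS + T2DM) OR from the abstract;
      # single-exposure values below are assumed placeholders
      rr11, rr10, rr01 = 4.1, 1.6, 2.9
      print(f"RERI = {reri(rr11, rr10, rr01):+.2f}")  # small value: roughly additive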

  2. Are Masking-Based Models of Risk Useful?

    PubMed

    Gisiner, Robert C

    2016-01-01

    As our understanding of directly observable effects from anthropogenic sound exposure has improved, concern about "unobservable" effects such as stress and masking have received greater attention. Equal energy models of masking such as power spectrum models have the appeal of simplicity, but do they offer biologically realistic assessments of the risk of masking? Data relevant to masking such as critical ratios, critical bandwidths, temporal resolution, and directional resolution along with what is known about general mammalian antimasking mechanisms all argue for a much more complicated view of masking when making decisions about the risk of masking inherent in a given anthropogenic sound exposure scenario. PMID:26610979

  3. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of

  4. Guarana provides additional stimulation over caffeine alone in the planarian model.

    PubMed

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R; Constable, Mic Andre; Mulligan, Margaret E; Voura, Evelyn B

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  5. Guarana Provides Additional Stimulation over Caffeine Alone in the Planarian Model

    PubMed Central

    Moustakas, Dimitrios; Mezzio, Michael; Rodriguez, Branden R.; Constable, Mic Andre; Mulligan, Margaret E.; Voura, Evelyn B.

    2015-01-01

    The stimulant effect of energy drinks is primarily attributed to the caffeine they contain. Many energy drinks also contain other ingredients that might enhance the tonic effects of these caffeinated beverages. One of these additives is guarana. Guarana is a climbing plant native to the Amazon whose seeds contain approximately four times the amount of caffeine found in coffee beans. The mix of other natural chemicals contained in guarana seeds is thought to heighten the stimulant effects of guarana over caffeine alone. Yet, despite the growing use of guarana as an additive in energy drinks, and a burgeoning market for it as a nutritional supplement, the science examining guarana and how it affects other dietary ingredients is lacking. To appreciate the stimulant effects of guarana and other natural products, a straightforward model to investigate their physiological properties is needed. The planarian provides such a system. The locomotor activity and convulsive response of planarians with substance exposure has been shown to provide an excellent system to measure the effects of drug stimulation, addiction and withdrawal. To gauge the stimulant effects of guarana we studied how it altered the locomotor activity of the planarian species Dugesia tigrina. We report evidence that guarana seeds provide additional stimulation over caffeine alone, and document the changes to this stimulation in the context of both caffeine and glucose. PMID:25880065

  6. Nonparametric Independence Screening in Sparse Ultra-High Dimensional Additive Models.

    PubMed

    Fan, Jianqing; Feng, Yang; Song, Rui

    2011-06-01

    A variable screening procedure via correlation learning was proposed in Fan and Lv (2008) to reduce dimensionality in sparse ultra-high dimensional models. Even when the true model is linear, the marginal regression can be highly nonlinear. To address this issue, we further extend the correlation learning to marginal nonparametric learning. Our nonparametric independence screening is called NIS, a specific member of the sure independence screening family. Several closely related variable screening procedures are proposed. For general nonparametric models, it is shown that, under some mild technical conditions, the proposed independence screening methods enjoy a sure screening property. The extent to which the dimensionality can be reduced by independence screening is also explicitly quantified. As a methodological extension, data-driven thresholding and iterative nonparametric independence screening (INIS) are also proposed to enhance the finite-sample performance for fitting sparse additive models. The simulation results and a real data analysis demonstrate that the proposed procedure works well with moderate sample size and large dimension and performs better than competing methods. PMID:22279246
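    A toy sketch of the marginal screening idea: regress the response nonparametrically on each candidate predictor and keep the top-ranked fits. A cubic polynomial stands in here for the spline smoothers used in the paper:

      import numpy as np

      rng = np.random.default_rng(0)
      n, p = 200, 1000                        # ultra-high dimensional: p >> n
      X = rng.normal(size=(n, p))
      # sparse truth: only features 0 and 1 matter, both nonlinearly
      y = 2 * np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.5 * rng.normal(size=n)

      def marginal_fit(x, y, deg=3):
          # R^2 of a univariate polynomial fit, a crude stand-in for a spline smoother
          resid = y - np.polyval(np.polyfit(x, y, deg), x)
          return 1.0 - resid.var() / y.var()

      scores = np.array([marginal_fit(X[:, j], y) for j in range(p)])
      top = np.argsort(scores)[::-1][:10]     # keep the top-ranked predictors
      print("top-ranked features:", top)      # features 0 and 1 should rank at the top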

  7. Creep damage in a localized load sharing fibre bundle model with additional ageing

    NASA Astrophysics Data System (ADS)

    Lennartz-Sassinek, Sabine; Danku, Zsuzsa; Main, Ian; Kun, Ferenc

    2013-04-01

    Many fields of science are interested in damage growth in earth materials. Often the damage does not propagate in large avalanches like the crack growth measured by acoustic emissions; "silent" damage may also occur, whose emissions are either too small to be detected or mix with background noise. These silent emissions may carry the majority of the overall damage in a system until failure. One well-known model for damage growth is the fibre bundle model. Here we consider an extended version of a localized load sharing fibre bundle model which incorporates additional time-dependent ageing of each fibre, motivated by a chemically active environment. We present the non-trivial time-dependent damage growth in this model in the low-load limit, representing creep damage far from failure. We show both numerical simulations and analytical equations describing the damage rate of silent events and the corresponding amount of triggered "acoustic" damage. The analytical description is in agreement with the numerical results.
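    A hedged minimal simulation of a fibre bundle with ageing. For brevity it uses equal load sharing rather than the localized scheme of this record, with an ageing variable that accumulates as a power of the fibre load; all parameters are invented:

      import numpy as np

      rng = np.random.default_rng(5)
      N, sigma0, dt = 10_000, 0.05, 1.0       # fibres, external load per fibre, time step
      strength = rng.weibull(2.0, N)          # quenched breaking thresholds
      age_thresh = rng.uniform(5.0, 50.0, N)  # ageing tolerance of each fibre
      gamma = 1.2                             # load-sensitivity exponent of the ageing law

      alive = np.ones(N, dtype=bool)
      aging = np.zeros(N)
      t, breaks_per_step = 0.0, []
      while alive.any():
          load = sigma0 * N / alive.sum()     # equal load sharing over surviving fibres
          aging[alive] += load ** gamma * dt  # ageing accumulates with the carried load
          # break either immediately (overload) or through accumulated ageing
          broke = alive & ((strength < load) | (aging > age_thresh))
          breaks_per_step.append(broke.sum())
          alive &= ~broke
          t += dt
      print(f"failure time = {t:.0f}, steps = {len(breaks_per_step)}")

    Long stretches of small, "silent" ageing-driven break counts followed by a final burst as the surviving fibres overload reproduce the qualitative creep behaviour the record describes.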

  8. Assessing Landslide Risk Areas Using Statistical Models and Land Cover

    NASA Astrophysics Data System (ADS)

    Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.

    2015-12-01

    Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events related to climate change, such as typhoons and heavy rainfall, are the main cause of this damage. In particular, Inje-gun in Gangwon-do suffered severe landslide damage in 2006 and 2007. In Inje-gun, 91% of the area is forest, so many land covers related to human activities are adjacent to forest land. Thus, the establishment of adaptation plans for landslides was urgently needed. Landslide risk assessment can provide useful information to policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans that reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence. Various SDMs were used to make landslide probability maps that account for the uncertainty of SDMs. The types of land cover were classified into 5 grades according to their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk. As a result of the overlay analysis, landslide risk areas were derived; agricultural and transportation areas in particular showed high risk over large areas in the risk map. In conclusion, policy makers in Inje-gun should consider the landslide risk map to establish adaptation plans effectively.
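
    The final overlay step is simple to illustrate. A minimal sketch, assuming a probability raster from an SDM and integer vulnerability grades (all values illustrative):

```python
# Combine a landslide-probability raster with land-cover vulnerability grades.
import numpy as np

prob = np.array([[0.1, 0.6], [0.8, 0.3]])   # SDM landslide probability
grade = np.array([[1, 5], [4, 2]])          # land-cover grade, 1 (low) .. 5 (high)
risk = prob * grade / grade.max()           # simple multiplicative risk index
print(np.round(risk, 2))
```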

  9. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs about these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low-risk, moderate-risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data were compared to an expert 'map' of risk perception to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exceptions, and verify the concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and on deliberate, innate, contextual, and learned processing evaluations and biases. PMID:24402720

  10. Model Scramjet Inlet Unstart Induced by Mass Addition and Heat Release

    NASA Astrophysics Data System (ADS)

    Im, Seong-Kyun; Baccarella, Damiano; McGann, Brendan; Liu, Qili; Wermer, Lydiy; Do, Hyungrok

    2015-11-01

    The inlet unstart phenomena in a model scramjet are investigated in an arc-heated hypersonic wind tunnel. Unstart events induced by nitrogen or ethylene jets at low- or high-enthalpy Mach 4.5 freestream flow conditions are compared. The jet injection pressurizes the downstream flow by mass addition and flow blockage. In the case of ethylene jet injection, heat release from combustion increases the backpressure further. Time-resolved schlieren imaging is performed at the jet and the lip of the model inlet to visualize the flow features during unstart. High-frequency pressure measurements provide information on pressure fluctuations at the scramjet wall. In both the mass- and heat-release-driven cases, similar transient and quasi-steady behaviors of the unstart shockwave system are observed during the unstart process. Combustion-driven unstart induces severe oscillatory motions of the jet and the unstart shock at the lip of the scramjet inlet after the unstart process is complete, while the unstarted flow induced solely by mass addition remains relatively steady. The discrepancies between the mass- and heat-release-driven unstart processes are explained by a flow choking mechanism.

  11. Analysis of Radiation Pneumonitis Risk Using a Generalized Lyman Model

    SciTech Connect

    Tucker, Susan L.; Liu, H. Helen; Liao, Zhongxing; Wei, Xiong; Wang, Shulian; Jin, Hekun; Komaki, Ritsuko; Martel, Mary K.; Mohan, Radhe

    2008-10-01

    Purpose: To introduce a version of the Lyman normal-tissue complication probability (NTCP) model adapted to incorporate censored time-to-toxicity data and clinical risk factors and to apply the generalized model to analysis of radiation pneumonitis (RP) risk. Methods and Materials: Medical records and radiation treatment plans were reviewed retrospectively for 576 patients with non-small cell lung cancer treated with radiotherapy. The time to severe (Grade ≥3) RP was computed, with event times censored at last follow-up for patients not experiencing this endpoint. The censored time-to-toxicity data were analyzed using the standard and generalized Lyman models with patient smoking status taken into account. Results: The generalized Lyman model with patient smoking status taken into account produced NTCP estimates up to 27 percentage points different from the model based on dose-volume factors alone. The generalized model also predicted that 8% of the expected cases of severe RP were unobserved because of censoring. The estimated volume parameter for lung was not significantly different from n = 1, corresponding to mean lung dose. Conclusions: NTCP models historically have been based solely on dose-volume effects and binary (yes/no) toxicity data. Our results demonstrate that inclusion of nondosimetric risk factors and censored time-to-event data can markedly affect outcome predictions made using NTCP models.
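
    For reference, the standard (uncensored, dose-volume-only) Lyman model underlying this generalization has a compact closed form. A minimal sketch with illustrative parameter values; the paper's generalized model additionally handles censored event times and smoking status, which this omits:

```python
# Standard Lyman NTCP: NTCP = Phi((gEUD - TD50) / (m * TD50)),
# where gEUD = (sum_i v_i * d_i^(1/n))^n; n = 1 reduces gEUD to mean dose.
import numpy as np
from scipy.stats import norm

def lyman_ntcp(doses, volumes, td50, m, n):
    geud = (np.sum(volumes * doses ** (1.0 / n))) ** n   # volumes sum to 1
    return norm.cdf((geud - td50) / (m * td50))

doses = np.array([5.0, 20.0, 40.0])     # Gy, dose-volume histogram bins
volumes = np.array([0.5, 0.3, 0.2])     # fractional volumes
print(lyman_ntcp(doses, volumes, td50=30.0, m=0.35, n=1.0))
```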

  12. A Model for Risk Analysis of Oil Tankers

    NASA Astrophysics Data System (ADS)

    Montewka, Jakub; Krata, Przemysław; Goerland, Floris; Kujala, Pentti

    2010-01-01

    The paper presents a model for risk analysis of marine traffic, with emphasis on the two most common types of marine accidents: collision and grounding. The focus is on oil tankers, as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. Thus the model is regarded as a block-type model, consisting of blocks for estimating the probability of collision and grounding, respectively, as well as blocks for modelling the consequences of an accident. The probability of a vessel colliding is assessed by means of a Minimum Distance To Collision (MDTC) based model. The model defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address the waterway crossing between Helsinki and Tallinn, where dense cross traffic is observed during certain hours. For the assessment of grounding probability, a new approach is proposed, which utilizes a newly developed model in which spatial interactions between objects in different locations are recognized. A ship on a seaway and navigational obstructions may be perceived as interacting objects, and their repulsion may be modelled by a deterministic formulation. The risk due to tankers running aground addresses the approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms and concern the costs of an oil spill, based on statistics of compensations claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by parties involved.
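
    The "well-founded formula" is the usual expected-consequence form, risk = probability × consequence, summed over accident types. A minimal sketch with illustrative numbers:

```python
# Expected annual loss from two accident types (all figures illustrative).
p_collision, p_grounding = 1.2e-3, 4.0e-4    # annual accident probabilities
c_collision, c_grounding = 25e6, 12e6        # oil-spill consequence costs, EUR
risk = p_collision * c_collision + p_grounding * c_grounding
print(f"expected annual loss: {risk:,.0f} EUR")
```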

  13. Evaluation of Periodontal Risk in Adult Patients using Two Different Risk Assessment Models – A Pilot Study

    PubMed Central

    Bade, Shruthi; Bollepalli, Appaiah Chowdary; Katuri, Kishore Kumar; Devulapalli, Narasimha Swamy; Swarna, Chakrapani

    2015-01-01

    Objective: The aim of the present study was to evaluate the periodontal risk of individuals using the periodontal risk assessment (PRA) model and the modified PRA (mPRA) model. Materials and Methods: A total of 50 patients with chronic periodontitis, aged 30-60 years, were selected randomly; charting of the periodontal status was performed, and those who met the inclusion criteria were enrolled in the study. Parameters recorded were: percentage of sites with bleeding on probing (BOP), number of sites with pocket depths (PD) ≥5 mm, number of teeth lost, bone loss (BL)/age ratio, clinical attachment loss (CAL)/age ratio, diabetic and smoking status, and dental status. All the risk factors were plotted on radar charts in the PRA and mPRA models using Microsoft Excel, and periodontal risk was categorized as low, moderate, or high. Results: Among the 50 patients, 31 were at low risk, 9 at moderate risk, and 10 at high risk as identified by the modified PRA model, whereas 28 patients were at low risk, 13 at moderate risk, and 9 at high risk as identified by the PRA model. Statistical analysis demonstrated no significant difference between the risk scores (χ² = 0.932, df = 2, P = 0.627). Conclusion: Both periodontal risk models are effective in evaluating the risk factors and can be useful tools for proper diagnosis and for predicting disease progression and therapeutic strategies during supportive periodontal therapy. PMID:25859520

  14. Evaluating the Risks: A Bernoulli Process Model of HIV Infection and Risk Reduction.

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Abramson, Paul R.

    1993-01-01

    A Bernoulli process model of human immunodeficiency virus (HIV) is used to evaluate infection risks associated with various sexual behaviors (condom use, abstinence, or monogamy). Results suggest that infection is best mitigated through measures that decrease infectivity, such as condom use. (SLD)
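
    The Bernoulli-process formulation has a standard closed form: with per-act infectivity alpha and n exposures, P(infection) = 1 - (1 - alpha)^n. A minimal sketch, with an illustrative infectivity value and condom effect:

```python
# Bernoulli-process infection risk; condoms modelled as reducing infectivity.
def p_infection(alpha, n):
    """Probability of infection after n independent exposures."""
    return 1.0 - (1.0 - alpha) ** n

alpha = 0.001                              # per-act transmission probability (illustrative)
print(p_infection(alpha, 100))             # unprotected exposures
print(p_infection(alpha * 0.1, 100))       # reduced infectivity, e.g. condom use
```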

  15. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. The debris field due to destruction of the launch vehicle is one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and the velocity distributions on the strike probability and risk.

  16. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes' theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes' theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that sample features better reflect and represent the population. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed for local conditions, with the aim of reducing heavy warnings to a lesser degree. This study takes the Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, reasonable in result and simple in structure, and it has strong logical superiority and regional adaptability, providing a new way to forewarn of water pollution risk. PMID:24194413
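
    The Bayes step reduces to a prior-times-likelihood update over warning levels, followed by the maximum-probability rule. A minimal sketch with illustrative numbers (not the Taihu Basin data):

```python
# Posterior over forewarning levels and the maximum-probability rule.
import numpy as np

levels = ["light", "medium", "severe"]
prior = np.array([0.5, 0.3, 0.2])           # prior distribution of levels
likelihood = np.array([0.1, 0.3, 0.8])      # P(observed indicators | level)
posterior = prior * likelihood
posterior /= posterior.sum()
print(levels[int(np.argmax(posterior))], np.round(posterior, 3))
```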

  17. The benefits of an additional worker are task-dependent: assessing low-back injury risks during prefabricated (panelized) wall construction.

    PubMed

    Kim, Sunwook; Nussbaum, Maury A; Jia, Bochen

    2012-09-01

    Team manual material handling is a common practice in residential construction where prefabricated building components (e.g., wall panels) are increasingly used. As part of a larger effort to enable proactive control of ergonomic exposures among workers handling panels, this study explored the effects of additional workers on injury risks during team-based panel erection tasks, specifically by quantifying how injury risks are affected by increasing the number of workers (by one, above the nominal or most common number). Twenty-four participants completed panel erection tasks with and without an additional worker under different panel mass and size conditions. Four risk assessment methods were employed that emphasized the low back. Though including an additional worker generally reduced injury risk across several panel masses and sizes, the magnitude of these benefits varied depending on the specific task and exhibited somewhat high variability within a given task. These results suggest that a simple, generalizable recommendation regarding team-based panel erection tasks is not warranted. Rather, a more systems-level approach accounting for both injury risk and productivity (a strength of panelized wall systems) should be undertaken. PMID:22226545

  18. Risk assessment of consuming agricultural products irrigated with reclaimed wastewater: An exposure model

    NASA Astrophysics Data System (ADS)

    van Ginneken, Meike; Oron, Gideon

    2000-09-01

    This study assesses health risks to consumers due to the use of agricultural products irrigated with reclaimed wastewater. The analysis is based on the definition of an exposure model which takes into account several parameters: (1) the quality of the applied wastewater, (2) the irrigation method, (3) the elapsed times between irrigation, harvest, and product consumption, and (4) the consumers' habits. The exposure model is used for numerical simulation of consumers' risks using the Monte Carlo simulation method. The results of the numerical simulation show large deviations, probably caused by uncertainty (imprecision in the quality of input data) and variability due to diversity among populations. There is a 10-order-of-magnitude difference in the risk of infection between the different exposure scenarios with the same water quality. This variation indicates the need for setting risk-based criteria for wastewater reclamation rather than single water quality guidelines. Extra data are required to decrease uncertainty in the risk assessment. Future research needs include the definition of acceptable risk criteria, more accurate dose-response modeling, information regarding pathogen survival in treated wastewater, additional data on the passage of pathogens into and within plants during irrigation, and information regarding the behavior patterns of the community of human consumers.
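
    A minimal Monte Carlo sketch of such an exposure simulation, assuming a lognormal ingested dose and a beta-Poisson dose-response curve (a common QMRA choice used here as a stand-in for the paper's model; all parameters are illustrative):

```python
# Monte Carlo exposure simulation with a beta-Poisson dose-response curve.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
dose = rng.lognormal(mean=-2.0, sigma=2.0, size=n)    # pathogens ingested per serving
alpha, beta = 0.25, 16.2                              # illustrative dose-response parameters
p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)         # beta-Poisson infection probability
print(np.percentile(p_inf, [5, 50, 95]))              # wide spread across scenarios
```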

  19. Risk modeling, assessment, and management of lahar flow threat.

    PubMed

    Leung, M F; Santos, J R; Haimes, Y Y

    2003-12-01

    The 1991 eruption of Mount Pinatubo in the Philippines is considered one of the most violent and destructive volcanic events of the 20th century. Lahar is the Indonesian term for volcanic mudflow, and lahar flows resulting from the massive amount of volcanic materials deposited on the mountain's slopes posed continued post-eruption threats to the surrounding areas, destroying lives, homes, agricultural products, and infrastructure. Risks of lahar flows were identified immediately after the eruption, with scientific data provided by the Philippine Institute of Volcanology, the U.S. Geological Survey, and other research institutions. However, competing political, economic, and social agendas subordinated the importance of scientific information to policy making. Using systemic risk analysis and management, this article addresses the issues of multiple objectives and the effective integration of scientific techniques into the decision-making process. It provides a modeling framework for identifying, prioritizing, and evaluating policies for managing risk. The major considerations are: (1) applying a holistic approach to risk analysis through hierarchical holographic modeling, (2) applying statistical methods to gain insight into the problem of uncertainty in risk assessment, (3) using multiobjective trade-off analysis to address the issue of multiple decisionmakers and stakeholders in the decision-making process, (4) using the conditional expected value of extreme events to complement and supplement the expected value in quantifying risk, and (5) assessing the impacts of multistage decisions. Numerical examples based on ex post data are formulated to illustrate applications to various problems. The resulting framework can serve as a general baseline model for assessing and managing risks of natural disasters, which the Philippines' lead agency, the National Disaster Coordinating Council (NDCC), and other related organizations can use for their decision
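
    Consideration (4) is easy to make concrete: alongside the unconditional expected loss, one reports the expected loss conditional on exceeding a high quantile. A minimal sketch on simulated losses (illustrative data, not the Pinatubo record):

```python
# Conditional expected value of extreme events vs. the ordinary expected value.
import numpy as np

rng = np.random.default_rng(7)
losses = rng.lognormal(mean=1.0, sigma=1.5, size=100_000)   # simulated lahar losses
q = np.quantile(losses, 0.95)
print(losses.mean())                  # unconditional expected loss
print(losses[losses > q].mean())      # expected loss given an extreme (top 5%) event
```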

  20. Were There "Additional Foreseeable Risks" in the SUPPORT Study? Lessons Not Learned from the ARDSnet Clinical Trials.

    PubMed

    Silverman, Henry J; Dreyfuss, Didier

    2015-01-01

    Even though the interventions were adapted from standard clinical practice, the way they were provided meant that the care given infants in the study was distinctly different from standard care, with different risk profiles. Parents should have been informed about those differences. PMID:25530226

  1. Inaccuracy of Self-Evaluation as Additional Variable for Prediction of Students at Risk of Failing First-Year Chemistry

    ERIC Educational Resources Information Center

    Potgieter, Marietjie; Ackermann, Mia; Fletcher, Lizelle

    2010-01-01

    Early identification of students at risk of failing first-year chemistry allows timely intervention. Cognitive factors alone are insufficient predictors of success; however, non-cognitive factors are usually difficult to measure. We have explored the use of demographic and performance variables, as well as the accuracy of self-evaluation as an…

  2. Cognitive Processes that Account for Mental Addition Fluency Differences between Children Typically Achieving in Arithmetic and Children At-Risk for Failure in Arithmetic

    ERIC Educational Resources Information Center

    Berg, Derek H.; Hutchinson, Nancy L.

    2010-01-01

    This study investigated whether processing speed, short-term memory, and working memory accounted for the differential mental addition fluency between children typically achieving in arithmetic (TA) and children at-risk for failure in arithmetic (AR). Further, we drew attention to fluency differences in simple (e.g., 5 + 3) and complex (e.g., 16 +…

  3. Additive influence of genetic predisposition and conventional risk factors in the incidence of coronary heart disease: a population-based study in Greece

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An additive genetic risk score (GRS) for coronary heart disease (CHD) has previously been associated with incident CHD in the population-based Greek European Prospective Investigation into Cancer and nutrition (EPIC) cohort. In this study, we explore GRS-‘environment’ joint actions on CHD for severa...
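
    An additive GRS of this kind is simply a (possibly weighted) sum of risk-allele counts. A minimal sketch with illustrative genotypes and weights:

```python
# Additive genetic risk score: weighted sum of risk-allele counts per SNP.
import numpy as np

genotypes = np.array([[0, 1, 2, 1],             # risk-allele counts, one row per subject
                      [2, 2, 0, 1]])
weights = np.array([0.12, 0.08, 0.20, 0.05])    # e.g. per-allele log odds ratios (illustrative)
grs = genotypes @ weights
print(grs)
```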

  4. Historical application of a social amplification of risk model: Economic impacts of risk events at nuclear weapons facilities

    SciTech Connect

    Metz, W.C.

    1996-04-01

    Public perception of risk is being cited as a documented reason to rethink a very contentious congressionally mandated process for siting interim storage and permanent disposal facilities for high-level radioactive waste. Rigorous survey research has shown that the public holds intense, negative images of "nuclear" and "radioactive" technologies, activities, and facilities. Potential host states and opponents claim that these negative images, coupled with an amplification of negative risk events, will potentially stigmatize the area surrounding such facilities and result in significant economic losses. At issue is whether a supporting social amplification of risk model is applicable to communities hosting facilities that are part of the U.S. Department of Energy Nuclear Weapons Complex. An initial assessment of high-profile discrete and cumulative key negative risk events at such nuclear facilities does not validate that there has been stigmatization or substantial social and economic consequences in the host areas. Before any changes to major national policy are implemented, additional research is required to determine if the nearby public's "pragmatic logic," based on practical knowledge and experience, attenuates the link between public opinion and demographic and economic behaviors. 40 refs.

  5. Smoking and polymorphisms in xenobiotic metabolism and DNA repair genes are additive risk factors affecting bladder cancer in Northern Tunisia.

    PubMed

    Rouissi, Kamel; Ouerhani, Slah; Hamrita, Bechr; Bougatef, Karim; Marrakchi, Raja; Cherif, Mohamed; Ben Slama, Mohamed Riadh; Bouzouita, Mohamed; Chebil, Mohamed; Ben Ammar Elgaaied, Amel

    2011-12-01

    Cancer epidemiology has undergone marked development since the nineteen-fifties. One of the most spectacular and specific contributions was the demonstration of the massive effect of smoking and genetic polymorphisms on the occurrence of bladder cancer. Tobacco carcinogens are metabolized by various xenobiotic-metabolizing enzymes, such as the superfamilies of N-acetyltransferases (NAT) and glutathione S-transferases (GST). DNA repair is essential to an individual's ability to respond to damage caused by tobacco carcinogens. Alterations in DNA repair genes may affect cancer risk by influencing individual susceptibility to this environmental exposure. Polymorphisms in NAT2, GST and DNA repair genes alter the ability of these enzymes to metabolize carcinogens or to repair alterations caused by this process. We conducted a case-control study to assess the role of smoking, slow NAT2 variants, GSTM1 and GSTT1 null, and XPC, XPD, XPG nucleotide excision-repair (NER) genotypes in bladder cancer development in Northern Tunisia. Taken alone, no gene except NAT2 appeared to be a factor affecting bladder cancer susceptibility. Among the NAT2 slow acetylator genotypes, the NAT2*5/*7 diplotype was found to confer a 7-fold increased risk of developing bladder cancer (OR = 7.14; 95% CI: 1.30-51.41). In tobacco consumers, however, we have shown that null GSTM1, wild-type GSTT1, slow NAT2, XPC (CC) and XPG (CC) are genetic risk factors for the disease. When combined in susceptible individuals compared to protected individuals, these risk factors give a greatly elevated odds ratio (OR = 61). Thus, we have shown a strong cumulative effect of tobacco and different combinations of the studied genetic risk factors, which leads to great susceptibility to bladder cancer. PMID:21647780
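
    The reported effect sizes are odds ratios from case-control counts. A minimal sketch of an odds ratio with a Woolf-type 95% confidence interval, using illustrative 2×2 counts (not the study's data):

```python
# Odds ratio and Woolf 95% CI from a 2x2 case-control table.
import numpy as np

a, b = 40, 10      # exposed cases, exposed controls
c, d = 60, 90      # unexposed cases, unexposed controls
odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(odds_ratio, np.round(ci, 2))
```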

  6. Climate and weather risk in natural resource models

    NASA Astrophysics Data System (ADS)

    Merrill, Nathaniel Henry

    This work, consisting of three manuscripts, addresses natural resource management under risk due to variation in climate and weather. In three distinct but theoretically related applications, I quantify the role of natural resources in stabilizing economic outcomes. In Manuscript 1, we address policy designed to affect the risk of cyanobacteria blooms in a drinking water reservoir through watershed-wide policy. Combining hydrologic and economic models for a watershed in Rhode Island, we solve for the efficient allocation of best management practices (BMPs) on livestock pastures to meet a monthly risk-based as well as a mean-based water quality objective. In order to solve for the efficient allocations of nutrient control effort, we optimize a probabilistically constrained integer-programming problem representing the choices made on each farm and the resultant conditions that support cyanobacteria blooms. In doing so, we employ a genetic algorithm (GA). We hypothesize that management based on controlling the upper tail of the probability distribution of phosphorus loading implies different efficient management actions compared to controlling mean loading. We find a shift to more intense effort on fewer acres when a probabilistic objective is specified, with cost savings of up to 25% for meeting risk levels relative to policies based on mean loading. Additionally, we illustrate the relative cost effectiveness of various policies designed to meet this risk-based objective. Rainfall and the subsequent overland runoff transport nutrients to the receiving water body, with larger amounts of phosphorus moving in more intense rainfall events. We highlight the importance of this transport mechanism by comparing policies under climate change scenarios, where the intensity of rainfall is projected to increase and the time series process of rainfall to change. In Manuscript 2, we introduce a new economic groundwater model that incorporates the gradual shift

  7. Modeling particulate matter concentrations measured through mobile monitoring in a deletion/substitution/addition approach

    NASA Astrophysics Data System (ADS)

    Su, Jason G.; Hopke, Philip K.; Tian, Yilin; Baldwin, Nichole; Thurston, Sally W.; Evans, Kristin; Rich, David Q.

    2015-12-01

    Land use regression (LUR) modeling with local-scale circular modeling domains has been used to predict traffic-related air pollution such as nitrogen oxides (NOX). LUR modeling for fine particulate matter (PM), which generally has smaller spatial gradients than NOX, has typically been applied in studies involving multiple study regions. To increase the spatial coverage for fine PM and key constituent concentrations, we designed a mobile monitoring network in Monroe County, New York to measure pollutant concentrations of black carbon (BC, wavelength of 880 nm), ultraviolet black carbon (UVBC, wavelength of 370 nm) and Delta-C (the difference between the UVBC and BC concentrations) using the Clarkson University Mobile Air Pollution Monitoring Laboratory (MAPL). A Deletion/Substitution/Addition (D/S/A) algorithm was applied, which used circular buffers as the basis for statistics. The algorithm maximizes the prediction accuracy for locations without measurements using V-fold cross-validation, and it reduces overfitting compared with other approaches. We found that the D/S/A LUR modeling approach achieved good results, with prediction powers of 60%, 63%, and 61%, respectively, for BC, UVBC, and Delta-C. The advantage of mobile monitoring is that it can monitor pollutant concentrations at hundreds of spatial points in a region, rather than the fewer than 100 points typical of a fixed-site saturation monitoring network. This research indicates that a mobile saturation sampling network, when combined with proper modeling techniques, can uncover small-area variations (e.g., 10 m) in particulate matter concentrations.

  8. The biobehavioral family model: testing social support as an additional exogenous variable.

    PubMed

    Woods, Sarah B; Priest, Jacob B; Roush, Tara

    2014-12-01

    This study tests the inclusion of social support as a distinct exogenous variable in the Biobehavioral Family Model (BBFM). The BBFM is a biopsychosocial approach to health that proposes that biobehavioral reactivity (anxiety and depression) mediates the relationship between family emotional climate and disease activity. Data for this study included married, English-speaking adult participants (n = 1,321; 55% female; M age = 45.2 years) from the National Comorbidity Survey Replication, a nationally representative epidemiological study of the frequency of mental disorders in the United States. Participants reported their demographics, marital functioning, social support from friends and relatives, anxiety and depression (biobehavioral reactivity), number of chronic health conditions, and number of prescription medications. Confirmatory factor analyses supported the items used in the measures of negative marital interactions, social support, and biobehavioral reactivity, as well as the use of negative marital interactions, friends' social support, and relatives' social support as distinct factors in the model. Structural equation modeling indicated a good fit of the data to the hypothesized model (χ² = 846.04, p < .001, SRMR = .039, CFI = .924, TLI = .914, RMSEA = .043). Negative marital interactions predicted biobehavioral reactivity (β = .38, p < .001), as did relatives' social support, inversely (β = -.16, p < .001). Biobehavioral reactivity predicted disease activity (β = .40, p < .001) and was demonstrated to be a significant mediator through tests of indirect effects. Findings are consistent with previous tests of the BBFM with adult samples, and suggest the important addition of family social support as a predicting factor in the model. PMID:24981970

  9. A habitat suitability model for Chinese sturgeon determined using the generalized additive method

    NASA Astrophysics Data System (ADS)

    Yi, Yujun; Sun, Jie; Zhang, Shanghong

    2016-03-01

    The Chinese sturgeon is a type of large anadromous fish that migrates between the ocean and rivers. Because of the construction of dams, this sturgeon's migration path has been cut off, and this species currently is on the verge of extinction. Simulating suitable environmental conditions for spawning followed by repairing or rebuilding its spawning grounds are effective ways to protect this species. Various habitat suitability models based on expert knowledge have been used to evaluate the suitability of spawning habitat. In this study, a two-dimensional hydraulic simulation is used to inform a habitat suitability model based on the generalized additive method (GAM). The GAM is based on real data. The values of water depth and velocity are calculated first via the hydrodynamic model and later applied in the GAM. The final habitat suitability model is validated using the catch per unit effort (CPUEd) data of 1999 and 2003. The model results show that a velocity of 1.06-1.56 m/s and a depth of 13.33-20.33 m are highly suitable ranges for the Chinese sturgeon to spawn. The hydraulic habitat suitability indexes (HHSI) for seven discharges (4000; 9000; 12,000; 16,000; 20,000; 30,000; and 40,000 m3/s) are calculated to evaluate integrated habitat suitability. The results show that the integrated habitat suitability reaches its highest value at a discharge of 16,000 m3/s. This study is the first to apply a GAM to evaluate the suitability of spawning grounds for the Chinese sturgeon. The study provides a reference for the identification of potential spawning grounds in the entire basin.

  10. State-of-the-Art in Tsunami Risk Modelling for a global perspective

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Tsunamis can be considered the natural hazard with the largest global spread resulting from a single event, with the exception of global extinction-level events (volcanoes, meteor impacts). Multiple extreme events have occurred during the last decade, including the devastating 2004 Sumatra tsunami and the events in Chile in 2010 and Japan in 2011. In general, the hazard and risk of tsunamis are investigated in regional or inter-regional projects, as in Japan, Indonesia or New Zealand, following different methodologies and investigating various source mechanisms. Thus, in this study, a review of the state-of-the-art in global tsunami risk modelling has been undertaken. The most recent and up-to-date methodologies and projects from all over the world have been assembled for a direct comparison, to provide a global perspective on a hazard that affects multiple countries at once to extreme magnitudes. The assemblage of these models provides insight into the temporal and spatial development of tsunami risk research and how it was adopted by research institutes and combined into official hazard modelling. A global map is assembled, indicating local and international case studies and projects with respect to their source model and date of development. In addition, the study also covers the development of software packages used to set up hazard and risk models, and it investigates the different source processes of tsunami generation and propagation. A comparison is made using a multicriteria approach to examine the physical models and capabilities of software packages as well as the source identification procedures in different hazard models. A complete and up-to-date overview of tsunami risk and hazard modelling is presented, compared and classified, which has far-reaching uses for the preparation of a tsunami risk assessment at any point on the earth.

  11. Generalized additive models used to predict species abundance in the Gulf of Mexico: an ecosystem modeling tool.

    PubMed

    Drexler, Michael; Ainsworth, Cameron H

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223

  12. Generalized Additive Models Used to Predict Species Abundance in the Gulf of Mexico: An Ecosystem Modeling Tool

    PubMed Central

    Drexler, Michael; Ainsworth, Cameron H.

    2013-01-01

    Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e. functional groups) across the Gulf of Mexico (GoM) using a large fisheries independent data set (SEAMAP) and climate scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable to make predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including those areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. The result of this method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist. PMID:23691223
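
    A GAM of the kind used in the two records above can be sketched compactly. The example below, assuming statsmodels' GLMGam with B-spline smoothers and a negative binomial family, runs on simulated stand-in data rather than SEAMAP:

```python
# Negative binomial GAM with spline smoothers (illustrative data and settings).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(3)
df = pd.DataFrame({"depth": rng.uniform(5, 200, 500),
                   "temp": rng.uniform(10, 30, 500)})
mu = np.exp(1.0 + 0.5 * np.sin(df["depth"] / 30.0) - 0.02 * (df["temp"] - 20) ** 2)
df["count"] = rng.poisson(np.asarray(mu))          # stand-in for SEAMAP counts

splines = BSplines(df[["depth", "temp"]], df=[6, 6], degree=[3, 3])
model = GLMGam.from_formula("count ~ 1", data=df, smoother=splines,
                            family=sm.families.NegativeBinomial())
print(model.fit().summary())
```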

  13. Nonlinear feedback in a six-dimensional Lorenz Model: impact of an additional heating term

    NASA Astrophysics Data System (ADS)

    Shen, B.-W.

    2015-03-01

    In this study, a six-dimensional Lorenz model (6DLM) is derived, based on a recent study using a five-dimensional (5-D) Lorenz model (LM), in order to examine the impact of an additional mode and its accompanying heating term on solution stability. The new mode added to improve the representation of the streamfunction is referred to as a secondary streamfunction mode, while the two additional modes, which appear in both the 6DLM and 5DLM but not in the original LM, are referred to as secondary temperature modes. Two energy conservation relationships of the 6DLM are first derived in the dissipationless limit. The impact of three additional modes on solution stability is examined by comparing numerical solutions and ensemble Lyapunov exponents of the 6DLM and 5DLM as well as the original LM. For the onset of chaos, the critical value of the normalized Rayleigh number (rc) is determined to be 41.1. The critical value is larger than that in the 3DLM (rc ~ 24.74), but slightly smaller than the one in the 5DLM (rc ~ 42.9). A stability analysis and numerical experiments obtained using generalized LMs, with or without simplifications, suggest the following: (1) negative nonlinear feedback in association with the secondary temperature modes, as first identified using the 5DLM, plays a dominant role in providing feedback for improving the stability of the 6DLM solution, (2) the additional heating term in association with the secondary streamfunction mode may destabilize the solution, and (3) overall feedback due to the secondary streamfunction mode is much smaller than the feedback due to the secondary temperature modes; therefore, the critical Rayleigh number of the 6DLM is comparable to that of the 5DLM. The 5DLM and 6DLM collectively suggest different roles for small-scale processes (i.e., stabilization vs. destabilization), consistent with the following statement by Lorenz (1972): If the flap of a butterfly's wings can be instrumental in generating a tornado, it can

  14. Nonlinear feedback in a six-dimensional Lorenz model: impact of an additional heating term

    NASA Astrophysics Data System (ADS)

    Shen, B.-W.

    2015-12-01

    In this study, a six-dimensional Lorenz model (6DLM) is derived, based on a recent study using a five-dimensional (5-D) Lorenz model (LM), in order to examine the impact of an additional mode and its accompanying heating term on solution stability. The new mode added to improve the representation of the streamfunction is referred to as a secondary streamfunction mode, while the two additional modes, which appear in both the 6DLM and 5DLM but not in the original LM, are referred to as secondary temperature modes. Two energy conservation relationships of the 6DLM are first derived in the dissipationless limit. The impact of three additional modes on solution stability is examined by comparing numerical solutions and ensemble Lyapunov exponents of the 6DLM and 5DLM as well as the original LM. For the onset of chaos, the critical value of the normalized Rayleigh number (rc) is determined to be 41.1. The critical value is larger than that in the 3DLM (rc ~ 24.74), but slightly smaller than the one in the 5DLM (rc ~ 42.9). A stability analysis and numerical experiments obtained using generalized LMs, with or without simplifications, suggest the following: (1) negative nonlinear feedback in association with the secondary temperature modes, as first identified using the 5DLM, plays a dominant role in providing feedback for improving the stability of the 6DLM solution, (2) the additional heating term in association with the secondary streamfunction mode may destabilize the solution, and (3) overall feedback due to the secondary streamfunction mode is much smaller than the feedback due to the secondary temperature modes; therefore, the critical Rayleigh number of the 6DLM is comparable to that of the 5DLM. The 5DLM and 6DLM collectively suggest different roles for small-scale processes (i.e., stabilization vs. destabilization), consistent with the following statement by Lorenz (1972): "If the flap of a butterfly's wings can be instrumental in generating a tornado, it can
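
    The generalized models above extend the classical three-mode system, whose equations are standard. As a reference point, a minimal sketch integrating the original 3DLM at r = 28, above its critical value rc ~ 24.74 (the parameter values are the classical ones, not the 6DLM's):

```python
# Classical 3-D Lorenz model integrated with scipy; r > rc ~ 24.74 gives chaos.
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (r - z) - y, x * y - b * z]

sol = solve_ivp(lorenz, (0.0, 50.0), [1.0, 1.0, 1.0])
print(sol.y[:, -1])   # final state on the chaotic attractor
```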

  15. Impact of an additional chronic BDNF reduction on learning performance in an Alzheimer mouse model

    PubMed Central

    Psotta, Laura; Rockahr, Carolin; Gruss, Michael; Kirches, Elmar; Braun, Katharina; Lessmann, Volkmar; Bock, Jörg; Endres, Thomas

    2015-01-01

    There is increasing evidence that brain-derived neurotrophic factor (BDNF) plays a crucial role in Alzheimer’s disease (AD) pathology. A number of studies demonstrated that AD patients exhibit reduced BDNF levels in the brain and the blood serum, and in addition, several animal-based studies indicated a potential protective effect of BDNF against Aβ-induced neurotoxicity. In order to further investigate the role of BDNF in the etiology of AD, we created a novel mouse model by crossing a well-established AD mouse model (APP/PS1) with a mouse exhibiting a chronic BDNF deficiency (BDNF+/−). This new triple transgenic mouse model enabled us to further analyze the role of BDNF in AD in vivo. We reasoned that, in case BDNF has a protective effect against AD pathology, an AD-like phenotype in our new mouse model should occur earlier and/or with more severity than in the APP/PS1-mice. Indeed, the behavioral analysis revealed that the APP/PS1-BDNF+/−-mice show an earlier onset of learning impairments in a two-way active avoidance task in comparison to APP/PS1- and BDNF+/−-mice. However, in the Morris water maze (MWM) test, we could not observe an overall aggravated impairment in spatial learning, and short-term memory in an object recognition task also remained intact in all tested mouse lines. In addition to the behavioral experiments, we analyzed the amyloid plaque pathology in the APP/PS1 and APP/PS1-BDNF+/−-mice and observed a comparable plaque density in the two genotypes. Moreover, our results revealed a higher plaque density in prefrontal cortical compared to hippocampal brain regions. Our data reveal that higher cognitive tasks requiring the recruitment of cortical networks appear to be more severely affected in our new mouse model than learning tasks requiring mainly sub-cortical networks. Furthermore, our observations of an accelerated impairment in active avoidance learning in APP/PS1-BDNF+/−-mice further supports the hypothesis that BDNF deficiency

  16. BIOLOGICALLY BASED DOSE RESPONSE MODELS FOR DEVELOPMENTAL TOXICITY RISK ASSESSMENT

    EPA Science Inventory

    Present risk assessment procedures for non-cancer endpoints generally rely on the determination of No Observed Adverse Effects Levels (NOAELS) in animal models followed by the application of various Uncertainty Factors (UFs) to account for unknowns in extrapolating high dose toxi...

  17. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  18. Surface Water Contamination Risk Assessment Modeled by Fuzzy-WRASTIC.

    PubMed

    Alavipoor, Fatemeh Sadat; Ghorbaninia, Zahra; Karimi, Saeed; Jafari, Hamidreza

    2016-07-01

    This research provides a new Fuzzy-WRASTIC model for water resource contamination risk assessment in a GIS (Geographic Information System) environment. First, within a multi-criteria evaluation (MCE) framework, the sub-criteria of each WRASTIC criterion were reviewed and mapped. Then, the related sub-layers were fuzzified in accordance with GIS environment standards. In the next step, the sub-layers were combined and the pollution risk status was modeled using a fuzzy overlay method, applying the OR, AND, SUM, PRODUCT and GAMMA operators together with the WLC (Weighted Linear Combination) method and the weights provided in the WRASTIC model. The results provide the best combination for modeling, with the percentages of the risk categories low, medium, high and very high being 1.8, 14.07, 51.43 and 32.7, respectively. Large areas are at severe risk due to the unbalanced arrangement and compactness of land uses around the compact surface water resources. PMID:27329055
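
    The named fuzzy overlay operators have standard definitions on membership rasters. A minimal sketch, with illustrative fuzzified sub-layers and a gamma of 0.9:

```python
# Fuzzy overlay operators applied to two fuzzified sub-layers.
import numpy as np

mu1 = np.array([[0.2, 0.9], [0.6, 0.4]])
mu2 = np.array([[0.5, 0.7], [0.8, 0.1]])
layers = np.stack([mu1, mu2])

f_and = layers.min(axis=0)                          # fuzzy AND
f_or = layers.max(axis=0)                           # fuzzy OR
f_prod = layers.prod(axis=0)                        # fuzzy PRODUCT
f_sum = 1.0 - (1.0 - layers).prod(axis=0)           # fuzzy SUM
gamma = 0.9
f_gamma = f_sum ** gamma * f_prod ** (1.0 - gamma)  # fuzzy GAMMA
print(np.round(f_gamma, 3))
```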

  19. A Dual System Model of Preferences under Risk

    ERIC Educational Resources Information Center

    Mukherjee, Kanchan

    2010-01-01

    This article presents a dual system model (DSM) of decision making under risk and uncertainty according to which the value of a gamble is a combination of the values assigned to it independently by the affective and deliberative systems. On the basis of research on dual process theories and empirical research in Hsee and Rottenstreich (2004) and…

  20. Job strain (demands and control model) as a predictor of cardiovascular risk factors among petrochemical personnel

    PubMed Central

    Habibi, Ehsanollah; Poorabdian, Siamak; Shakerian, Mahnaz

    2015-01-01

    Background: One of the practical models for assessing stressful working conditions due to job strain is the job demand and control model, which explains how adverse physical and psychological consequences, including cardiovascular risk factors, can arise from high work demands (the amount of workload, in addition to time limitations to complete that work) combined with low worker control over his/her work (lack of decision-making latitude) in the workplace. The aim of this study was to investigate how certain cardiovascular risk factors (including body mass index [BMI], heart rate, blood pressure, cholesterol and smoking) relate to job demand and job control. Materials and Methods: This prospective cohort study was conducted on 500 workers in the petrochemical industry in the south of Iran in 2009. The study population was selected using a simple random sampling method. Participants completed the job demand and control questionnaire. The cardiovascular risk factor data were extracted from the workers' hygiene profiles. The chi-square (χ²) test and the hypothesis test (η) were used to assess possible relationships between the quantified variables, individual demographics and cardiovascular risk factors. Results: The results of this study revealed a significant relationship between the job demand and control model and cardiovascular risk factors. The chi-square test showed the strongest relationship for heart rate (χ² = 145.078); the corresponding results for smoking and BMI were χ² = 85.652 and χ² = 30.941, respectively. The hypothesis test results for cholesterol and hypertension were 0.469 and 0.684, respectively. Discussion: Job strain is likely to be associated with an increased risk of cardiovascular risk factors among male staff in a petrochemical company in Iran. The parameters in the job demand and control model can act as acceptable predictors of the probability of job stress occurrence followed by showing

  1. Thermal Stability of Nanocrystalline Alloys by Solute Additions and A Thermodynamic Modeling

    NASA Astrophysics Data System (ADS)

    Saber, Mostafa

    and alpha → gamma phase transformation in Fe-Ni-Zr alloys. In addition to the experimental study of thermal stabilization of nanocrystalline Fe-Cr-Zr and Fe-Ni-Zr alloys, the thesis presented here developed a new predictive model, applicable to strongly segregating solutes, for thermodynamic stabilization of binary alloys. This model can serve as a benchmark for selecting solutes and evaluating their possible contribution to stabilization. Following a regular solution model, the chemical and elastic strain energy contributions are combined to obtain the mixing enthalpy. The total Gibbs free energy of mixing is then minimized with respect to simultaneous variations in the grain boundary volume fraction and the solute concentrations in the grain boundary and the grain interior. The Lagrange multiplier method was used to obtain numerical solutions. Applications are given for the temperature dependence of the grain size and the grain boundary solute excess for selected binary systems where experimental results imply that thermodynamic stabilization could be operative. This thesis also extends the binary model to a new model for thermodynamic stabilization of ternary nanocrystalline alloys. It is applicable to strongly segregating size-misfit solutes and uses input data available in the literature. In the same manner as the binary model, this model is based on a regular solution approach such that the chemical and elastic strain energy contributions are incorporated into the mixing enthalpy ΔHmix, and the mixing entropy ΔSmix is obtained using the ideal solution approximation. The Gibbs free energy of mixing ΔGmix is then minimized with respect to simultaneous variations in grain growth and solute segregation parameters. The Lagrange multiplier method is similarly used to obtain numerical solutions for the minimum ΔGmix. The temperature dependence of the nanocrystalline grain size and interfacial solute excess can be obtained for selected ternary systems. As
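
    The minimization scheme can be caricatured on a toy free energy. The sketch below minimizes a two-region (grain boundary vs. interior) regular-solution ΔGmix over the grain-boundary solute fraction under a solute mass balance; all energies and fractions are illustrative, and the thesis's full model (grain-size variation, Lagrange multipliers over several variables) is not reproduced.

```python
# Toy regular-solution Gibbs free energy of mixing, minimized over the
# grain-boundary (GB) solute fraction under a solute mass balance.
import numpy as np
from scipy.optimize import minimize

R, T = 8.314, 800.0                   # gas constant J/(mol K), temperature K
x_total, f_gb = 0.05, 0.10            # global solute fraction, GB volume fraction
omega_gb, omega_in = -20e3, 10e3      # interaction energies, J/mol (illustrative)

def s_ideal(x):
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -R * (x * np.log(x) + (1 - x) * np.log(1 - x))

def g_mix(x_gb):
    x_in = (x_total - f_gb * x_gb) / (1 - f_gb)          # solute mass balance
    g_gb = omega_gb * x_gb * (1 - x_gb) - T * s_ideal(x_gb)
    g_in = omega_in * x_in * (1 - x_in) - T * s_ideal(x_in)
    return f_gb * g_gb + (1 - f_gb) * g_in

res = minimize(lambda v: g_mix(v[0]), x0=[0.2],
               bounds=[(1e-6, x_total / f_gb - 1e-6)])   # keep x_in >= 0
print(res.x)   # equilibrium GB solute fraction at this temperature
```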

  2. Why we need new approaches to low-dose risk modeling

    SciTech Connect

    Alvarez, J.L.; Seiler, F.A.

    1996-06-01

    The linear no-threshold model for radiation effects was introduced as a conservative model for the design of radiation protection programs. The model has persisted not only as the basis for such programs, but has come to be treated as dogma and is often confused with scientific fact. In this examination, a number of serious problems with the linear no-threshold model of radiation carcinogenesis are demonstrated, many of them invalidating the hypothesis. It was shown that the relative risk formalism does not approach 1 as the dose approaches zero. When mortality ratios were used instead, the data in the region below 0.3 Sv were systematically below the predictions of the linear model. It was also shown that the data above 0.3 Sv are of little use in formulating a model at low doses. In addition, these data are valid only for doses accumulated at high dose rates, and there is no scientific justification for using the model in low-dose, low-dose-rate extrapolations for purposes of radiation protection. Further examination of model fits to the Japanese survivor data was attempted. Several such models were fit to the data, including an unconstrained linear, a linear-square-root, and a Weibull model, all of which fit the data better than the relative risk, linear no-threshold model. These fits were used to demonstrate that the linear model systematically overestimates the risk at low doses in the Japanese survivor data set. It is recommended here that an unbiased re-analysis of the data be undertaken and the results used to construct a new model, based on all pertinent data. This model could then form the basis for managing radiation risks in the appropriate regions of dose and dose rate.

  3. Generalized nonlinear models for rear-end crash risk analysis.

    PubMed

    Lao, Yunteng; Zhang, Guohui; Wang, Yinhai; Milton, John

    2014-01-01

    A generalized nonlinear model (GNM)-based approach for modeling highway rear-end crash risk is formulated using Washington State traffic safety data. Previous studies mainly focused on causal factor identification and crash risk modeling using generalized linear models (GLMs), such as Poisson regression, logistic regression, etc. However, their basic assumption of a generalized linear relationship between the dependent variable (for example, crash rate) and the independent variables (for example, contributing factors to crashes), established via a link function, can often be violated in reality. Consequently, GLM-based modeling results could provide biased findings and conclusions. In this research, a GNM-based approach is developed that utilizes a nonlinear regression function to better elaborate non-monotonic relationships between the independent and dependent variables, using rear-end accident data collected from 10 highway routes from 2002 through 2006. The results show, for example, that truck percentage and grade have a parabolic impact: they increase crash risks initially, but decrease them beyond certain thresholds. Such non-monotonic relationships cannot be captured by regular GLMs, which further demonstrates the flexibility of GNM-based approaches in modeling nonlinear relationships and providing more reasonable explanations. The superior GNM-based model interpretations help better understand the parabolic impacts of specific contributing factors when selecting and evaluating rear-end crash safety improvement plans. PMID:24125803
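
    One simple way to see a parabolic effect that a linear-predictor GLM misses is to add a quadratic term; the study's GNM replaces this with a general nonlinear regression function. A minimal sketch on simulated data (illustrative coefficients, not the Washington State data):

```python
# Poisson regression with a quadratic term to capture a rise-then-fall effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
truck_pct = rng.uniform(0, 40, 400)
mu = np.exp(0.5 + 0.08 * truck_pct - 0.002 * truck_pct ** 2)   # peak, then decline
crashes = rng.poisson(mu)

X = sm.add_constant(np.column_stack([truck_pct, truck_pct ** 2]))
fit = sm.GLM(crashes, X, family=sm.families.Poisson()).fit()
print(fit.params)                              # recovers the rise-then-fall shape
print(-fit.params[1] / (2 * fit.params[2]))    # estimated risk-maximizing threshold
```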

  4. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety. PMID:20055976
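
    The discrete-event idea can be sketched with a FIFO cold store: storage times emerge from the interplay of arrivals and orders rather than being sampled independently. A minimal simpy-based sketch with illustrative rates (not the paper's lettuce-chain model):

```python
# Discrete-event FIFO store: storage times emerge from arrivals vs. orders.
import simpy
import random

random.seed(0)
storage_times = []

def supplier(env, store):
    while True:
        yield env.timeout(random.expovariate(1.0))   # batches arrive, rate 1.0
        yield store.put(env.now)                     # record arrival time

def retailer(env, store):
    while True:
        yield env.timeout(random.expovariate(0.9))   # orders drain slightly slower
        arrived = yield store.get()                  # FIFO: oldest batch first
        storage_times.append(env.now - arrived)      # realized storage time

env = simpy.Environment()
store = simpy.Store(env)
env.process(supplier(env, store))
env.process(retailer(env, store))
env.run(until=10_000)
# Orders are slower than arrivals, so stock and storage times build up: the tail
# of the storage-time distribution, which drives the risk, grows over time.
print(sum(storage_times) / len(storage_times))
```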

  5. Development and Validation of Osteoporosis Risk-Assessment Model for Korean Men

    PubMed Central

    Oh, Sun Min; Song, Bo Mi; Nam, Byung-Ho; Rhee, Yumie; Moon, Seong-Hwan; Kim, Deog Young; Kang, Dae Ryong

    2016-01-01

    Purpose The aim of the present study was to develop an osteoporosis risk-assessment model to identify high-risk individuals among Korean men. Materials and Methods The study used data from 1340 and 1110 men ≥50 years of age who participated in the 2009 and 2010 Korean National Health and Nutrition Examination Survey, respectively, for development and validation of an osteoporosis risk-assessment model. Osteoporosis was defined as a T-score ≤-2.5 at either the femoral neck or lumbar spine. Performance of the candidate models and the Osteoporosis Self-Assessment Tool for Asians (OSTA) was compared using sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). A net reclassification improvement was further calculated to compare the developed Korean Osteoporosis Risk-Assessment Model for Men (KORAM-M) with OSTA. Results In the development dataset, the prevalence of osteoporosis was 8.1%. KORAM-M, consisting of age and body weight, had a sensitivity of 90.8%, a specificity of 42.4%, and an AUC of 0.666 with a cut-off score of -9. In the validation dataset, similar results were shown: sensitivity 87.9%, specificity 39.7%, and AUC 0.638. Additionally, risk categorization with KORAM-M showed improved reclassification over OSTA of up to 22.8%. Conclusion KORAM-M can be used simply as a pre-screening tool to identify candidates for dual energy X-ray absorptiometry tests. PMID:26632400
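
    For context, the published OSTA index is 0.2 × (body weight in kg − age in years); the KORAM-M coefficients are not reproduced in the record, so the sketch below evaluates only an OSTA-style screen on illustrative data, with an illustrative flagging cut-off:

```python
# Evaluating a weight/age screening index: sensitivity and specificity.
import numpy as np

age = np.array([55, 62, 70, 58])
weight = np.array([72, 55, 48, 80])          # kg
osteoporosis = np.array([0, 1, 1, 0])        # DXA T-score <= -2.5 (toy labels)

osta = 0.2 * (weight - age)                  # OSTA-style index
flagged = osta < -1                          # illustrative cut-off
sens = (flagged & (osteoporosis == 1)).sum() / (osteoporosis == 1).sum()
spec = (~flagged & (osteoporosis == 0)).sum() / (osteoporosis == 0).sum()
print(sens, spec)
```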

  6. Design and tuning of standard additive model based fuzzy PID controllers for multivariable process systems.

    PubMed

    Harinath, Eranda; Mann, George K I

    2008-06-01

    This paper describes a design and two-level tuning method for fuzzy proportional-integral-derivative (FPID) controllers for a multivariable process, where the fuzzy inference uses the standard additive model. The proposed method can be used for any n x n multi-input-multi-output process and guarantees closed-loop stability. In the two-level tuning scheme, the tuning follows two steps: low-level tuning followed by high-level tuning. The low-level tuning adjusts apparent linear gains, whereas the high-level tuning changes the nonlinearity in the normalized fuzzy output. In this paper, two types of FPID configurations are considered, and their performances are evaluated by using a real-time multizone temperature control problem having a 3 x 3 process system. PMID:18558531

  7. Estimation of the lag time in a subsequent monomer addition model for fibril elongation.

    PubMed

    Shoffner, Suzanne K; Schnell, Santiago

    2016-08-01

    Fibrillogenesis, the production or development of protein fibers, has been linked to protein folding diseases. The progress curve of fibrils or aggregates typically takes on a sigmoidal shape with a lag phase, a rapid growth phase, and a final plateau regime. The study of the lag phase and the estimation of its critical timescale provide insight into the factors regulating the fibrillation process. However, methods to estimate a quantitative expression for the lag time rely on empirical expressions, which cannot connect the lag time to kinetic parameters associated with the reaction mechanisms of protein fibrillation. Here we introduce an approach for the estimation of the lag time using the governing rate equations of the elementary reactions of a subsequent monomer addition model for protein fibrillation as a case study. We show that the lag time is given by the sum of the critical timescales for each fibril intermediate in the subsequent monomer addition mechanism and therefore reveals causal connectivity between intermediate species. Furthermore, we find that single-molecule assays of protein fibrillation can exhibit a lag phase without a nucleation process, while dye and extrinsic fluorescent probe bulk assays of protein fibrillation do not exhibit an observable lag phase during template-dependent elongation. Our approach could be valuable for investigating the effects of intrinsic and extrinsic factors on the protein fibrillation reaction mechanism and provides physicochemical insights into the parameters regulating the lag phase. PMID:27250246
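    A minimal numerical companion (my construction, with invented rate constants) integrates a linear chain of irreversible monomer-addition steps; the sigmoidal rise of the final species shows a lag close to the sum of the per-step timescales 1/k_i, in the spirit of the result described.

```python
# Sketch: linear chain y0 -> y1 -> ... -> y4 of first-order monomer-addition
# steps; the lag of the final species approximates sum(1/k_i).
import numpy as np
from scipy.integrate import solve_ivp

k = np.array([0.5, 0.4, 0.6, 0.3])         # hypothetical rate constants (1/h)

def chain(t, y):
    dy = np.zeros_like(y)
    dy[0] = -k[0] * y[0]
    for i in range(1, len(k)):
        dy[i] = k[i-1] * y[i-1] - k[i] * y[i]
    dy[-1] = k[-1] * y[-2]                  # final (fibril) species accumulates
    return dy

y0 = np.zeros(len(k) + 1)
y0[0] = 1.0
sol = solve_ivp(chain, (0.0, 40.0), y0, max_step=0.1)
print("sum of critical timescales (approx. lag):", np.sum(1.0 / k))
print("fibril fraction at t = 40 h:", sol.y[-1, -1])
```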

  8. Supra-additive effects of tramadol and acetaminophen in a human pain model.

    PubMed

    Filitz, Jörg; Ihmsen, Harald; Günther, Werner; Tröster, Andreas; Schwilden, Helmut; Schüttler, Jürgen; Koppert, Wolfgang

    2008-06-01

    The combination of analgesic drugs with different pharmacological properties may show better efficacy with fewer side effects. The aim of this study was to examine the analgesic and antihyperalgesic properties of the weak opioid tramadol and the non-opioid acetaminophen, alone as well as in combination, in an experimental pain model in humans. After approval of the local Ethics Committee, 17 healthy volunteers were enrolled in this double-blind, placebo-controlled cross-over study. Transcutaneous electrical stimulation at high current densities (29.6+/-16.2 mA) induced spontaneous acute pain (NRS=6 of 10) and distinct areas of hyperalgesia for painful mechanical stimuli (pinprick hyperalgesia). Pain intensities as well as the extent of the areas of hyperalgesia were assessed before, during, and 150 min after a 15-min intravenous infusion of acetaminophen (650 mg), tramadol (75 mg), a combination of both (325 mg acetaminophen and 37.5 mg tramadol), or saline 0.9%. Tramadol led to a maximum pain reduction of 11.7+/-4.2% with negligible antihyperalgesic properties. In contrast, acetaminophen led to a similar pain reduction (9.8+/-4.4%), but a sustained antihyperalgesic effect (34.5+/-14.0% reduction of the hyperalgesic area). The combination of both analgesics at half doses led to a supra-additive pain reduction of 15.2+/-5.7% and an enhanced antihyperalgesic effect (41.1+/-14.3% reduction of hyperalgesic areas) as compared to single administration of acetaminophen. Our study provides the first results on interactions of tramadol and acetaminophen on experimental pain and hyperalgesia in humans. Pharmacodynamic modeling combined with the isobolographic technique showed supra-additive effects of the combination of acetaminophen and tramadol concerning both analgesia and antihyperalgesia. The results might act as a rationale for combining both analgesics. PMID:17709207
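    For readers unfamiliar with the isobolographic bookkeeping, a hedged one-liner helps: the interaction index sums each drug's combination dose as a fraction of its equieffective single-agent dose; an index of 1 is the additivity line, and achieving a larger effect at index 1 (as reported here for the half-dose combination) indicates supra-additivity. Doses are taken from the abstract; the function itself is a generic textbook form, not the authors' model.

```python
# Generic isobolographic interaction index: dose_a/D_A + dose_b/D_B,
# where D_A and D_B are the single-agent doses producing the reference effect.
def interaction_index(dose_a, equieffective_a, dose_b, equieffective_b):
    return dose_a / equieffective_a + dose_b / equieffective_b

# Half-dose combination from the study vs. the full single doses:
idx = interaction_index(325.0, 650.0, 37.5, 75.0)
print(idx)   # 1.0: on the additivity line, so a larger observed effect than
             # either single agent alone implies supra-additivity
```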

  9. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    PubMed

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the pessimistic model of the EFSA Guidance may not always model exposure appropriately. PMID:25455888
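    The core of such probabilistic exposure assessment is a Monte Carlo loop over simulated person-days. The sketch below is not MCRA itself; all distributions and parameters are invented to show the shape of the computation.

```python
# Toy acute dietary exposure simulation: sample consumption, residue, and
# body weight per person-day, then inspect a high percentile of exposure.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
consumption = rng.lognormal(mean=3.0, sigma=0.8, size=n)   # g/day, hypothetical
residue = rng.lognormal(mean=-3.0, sigma=1.0, size=n)      # mg/kg, hypothetical
bodyweight = rng.normal(70.0, 12.0, size=n).clip(min=30.0) # kg

exposure = consumption / 1000.0 * residue / bodyweight      # mg/kg bw per day
print("median exposure:", np.median(exposure))
print("99.9th percentile:", np.quantile(exposure, 0.999))
```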

  10. An integrated model-based approach to the risk assessment of pesticide drift from vineyards

    NASA Astrophysics Data System (ADS)

    Pivato, Alberto; Barausse, Alberto; Zecchinato, Francesco; Palmeri, Luca; Raga, Roberto; Lavagnolo, Maria Cristina; Cossu, Raffaello

    2015-06-01

    The inhalation of pesticide in air is of particular concern for people living in close contact with intensive agricultural activities. This study aims to develop an integrated modelling methodology to assess whether pesticides pose a risk to the health of people living near vineyards, and apply this methodology in the world-renowned Prosecco DOCG (Italian label for protection of origin and geographical indication of wines) region. A sample field in Bigolino di Valdobbiadene (North-Eastern Italy) was selected to perform the pesticide fate modelling and the consequent inhalation risk assessment for people living in the area. The modelling accounts for the direct pesticide loss during the treatment of vineyards and for the volatilization from soil after the end of the treatment. A fugacity model was used to assess the volatilization flux from soil. The Gaussian puff air dispersion model CALPUFF was employed to assess the airborne concentration of the emitted pesticide over the simulation domain. The subsequent risk assessment integrates the HArmonised environmental Indicators for pesticide Risk (HAIR) and US-EPA guidelines. In this case study the modelled situation turned out to be safe from the point of view of human health in the case of non-carcinogenic compounds, and additional improvements were suggested to further mitigate the effect of the most critical compound.
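    As a stand-in for the dispersion step (CALPUFF itself is a full puff model and is not reproduced here), a steady-state Gaussian plume formula already conveys how source strength, wind speed, and release height shape ground-level concentration. All inputs below are invented.

```python
# Ground-level Gaussian plume concentration; sigma_y and sigma_z are taken to
# grow linearly with downwind distance, a crude but common approximation.
import numpy as np

def plume_conc(Q, u, x, y, H, a=0.08, b=0.06):
    """Q: emission rate (g/s), u: wind speed (m/s), x, y: receptor position (m),
    H: effective release height (m)."""
    sy, sz = a * x, b * x
    return (Q / (np.pi * u * sy * sz)
            * np.exp(-y**2 / (2 * sy**2))
            * np.exp(-H**2 / (2 * sz**2)))

print(plume_conc(Q=0.01, u=2.0, x=200.0, y=0.0, H=1.5), "g/m3")
```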

  11. Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand

    NASA Astrophysics Data System (ADS)

    Kaiser, G.; Kortenhaus, A.

    2009-04-01

    The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction or damage of life-supporting infrastructure, such as buildings, roads, and water and power supply, caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood-prone areas at the Andaman Sea coast in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as a vulnerability analysis of the socio-economic and the ecological system, in order to determine the scenario-based, specific risk for the region. In this presentation, results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have been made to model tsunami wave generation and propagation in the Indian Ocean, there is still a gap in determining detailed inundation patterns, i.e. water depth and flow dynamics. However, for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation is strongly dependent on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches as these data are free and almost world

  12. Development of distribution system reliability and risk analysis models

    NASA Astrophysics Data System (ADS)

    Northcote-Green, J. E. D.; Vismor, T. D.; Brooks, C. L.

    1981-08-01

    The overall objectives of a research project were to: determine distribution reliability assessment methods currently used by the industry; develop a general outage reporting scheme suitable for a wide variety of distribution utilities (reliability model); develop a model for predicting the reliability of future system configurations (risk model); and compile a handbook of reliability assessment methods designed specifically for use by the practicing distribution engineer. Emphasis was placed on compiling and organizing reliability assessment techniques presently used by the industry. The project examined reliability evaluation from two perspectives: historical and predictive assessment. Two reliability assessment models, HISRAM - the historical reliability assessment model and PRAM - the predictive reliability assessment model were developed. Each model was tested in a utility environment by the Duquesne Light Company and the Public Service Electric and Gas Company of New Jersey. A survey of 56 diverse utilities served as a basis for examining current distribution reliability assessment practices in the electric power industry.

  13. Challenges of Modeling Flood Risk at Large Scales

    NASA Astrophysics Data System (ADS)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale such that all inter-dependencies in a river network are well understood. From an insurance perspective, the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at the scale suitable for a viable insurance market for the flood peril. In order to serve the insurance market a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing

  14. Modeling of Radiation Risks for Human Space Missions

    NASA Technical Reports Server (NTRS)

    Fletcher, Graham

    2004-01-01

    Prior to any human space flight, calculations of radiation risks are used to determine the acceptable scope of astronaut activity. Using the supercomputing facilities at NASA Ames Research Center, Ames researchers have determined the damage probabilities of DNA functional groups by space radiation. The data supersede those used in the current Monte Carlo model for risk assessment. One example is the reaction of DNA with the hydroxyl radical produced by the interaction of highly energetic particles from space radiation with water molecules in the human body. This reaction is considered an important cause of DNA mutations, although its mechanism is not well understood.

  15. Additive Effects of the Risk Alleles of PNPLA3 and TM6SF2 on Non-alcoholic Fatty Liver Disease (NAFLD) in a Chinese Population

    PubMed Central

    Wang, Xiaoliang; Liu, Zhipeng; Wang, Kai; Wang, Zhaowen; Sun, Xing; Zhong, Lin; Deng, Guilong; Song, Guohe; Sun, Baining; Peng, Zhihai; Liu, Wanqing

    2016-01-01

    Recent genome-wide association studies have identified that variants in or near PNPLA3, NCAN, GCKR, LYPLAL1, and TM6SF2 are significantly associated with non-alcoholic fatty liver disease (NAFLD) in multiple ethnic groups. Studies on their impact on NAFLD in Han Chinese are still limited. In this study, we examined the relevance of these variants to NAFLD in a community-based Han Chinese population and further explored their potential joint effect on NAFLD. Six single nucleotide polymorphisms (SNPs) (PNPLA3 rs738409, rs2294918, NCAN rs2228603, GCKR rs780094, LYPLAL1 rs12137855, and TM6SF2 rs58542926), previously identified in genome-wide analyses to be associated with NAFLD, were genotyped in 384 NAFLD patients and 384 age- and gender-matched healthy controls. We found that two of the six polymorphisms, PNPLA3 rs738409 (OR = 1.52, 95%CI: 1.19–1.96; P = 0.00087) and TM6SF2 rs58542926 (OR = 2.11, 95%CI: 1.34–3.39; P = 0.0016), were independently associated with NAFLD after adjustment for the effects of age, gender, and BMI. Our analysis further demonstrated the strong additive effects of the risk alleles of PNPLA3 and TM6SF2, with an overall significant association between the number of risk alleles and NAFLD (OR = 1.64, 95%CI: 1.34–2.01; P = 1.4 × 10⁻⁶). The OR for NAFLD increased in an additive manner, with an average increase in OR of 1.52 per additional risk allele. Our results confirmed that the PNPLA3 and TM6SF2 variants are the most significant risk alleles for NAFLD in the Chinese population. Therefore, genotyping these two genetic risk factors may help identify individuals at the highest risk of NAFLD. PMID:27532011
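    The reported additive effect amounts to regressing case status on the count of risk alleles. The sketch below simulates such data (it is not the study's dataset; the assumed per-allele odds ratio of 1.52 is taken from the abstract) and recovers the per-allele OR by logistic regression.

```python
# Simulated per-allele odds-ratio recovery with statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 768                                    # the study had 384 cases + 384 controls
alleles = rng.binomial(2, 0.30, n) + rng.binomial(2, 0.07, n)  # PNPLA3 + TM6SF2 counts
logit_p = -1.0 + np.log(1.52) * alleles    # assumed OR of 1.52 per risk allele
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(alleles.astype(float))
res = sm.Logit(y, X).fit(disp=0)
print("estimated per-allele OR:", np.exp(res.params[1]))
```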

  16. Reducing uncertainty in risk modeling for methylmercury exposure

    SciTech Connect

    Ponce, R.; Egeland, G.; Middaugh, J.; Lee, R.

    1995-12-31

    The biomagnification and bioaccumulation of methylmercury in marine species represents a challenge for risk assessment related to the consumption of subsistence foods in Alaska. Because of the profound impact that food consumption advisories have on indigenous peoples seeking to preserve a way of life, there is a need to reduce uncertainty in risk assessment. Thus, research was initiated to reduce the uncertainty in assessing the health risks associated with the consumption of subsistence foods. Because marine subsistence foods typically contain elevated levels of methylmercury, preliminary research efforts have focused on methylmercury as the principal chemical of concern. Of particular interest are the antagonistic effects of selenium on methylmercury toxicity. Because of this antagonism, methylmercury exposure through the consumption of marine mammal meat (with high selenium) may not be as toxic as comparable exposures through other sources of dietary intake, such as in the contaminated bread episode of Iraq (containing relatively low selenium). This hypothesis is supported by animal experiments showing reduced toxicity of methylmercury associated with marine mammal meat, by the antagonistic influence of selenium on methylmercury toxicity, and by negative clinical findings in adult populations exposed to methylmercury through a marine diet not subject to industrial contamination. Exploratory model development is underway to identify potential improvements and applications of current deterministic and probabilistic models, particularly by incorporating selenium as an antagonist in risk modeling methods.

  17. Mental models in risk assessment: informing people about drugs.

    PubMed

    Jungermann, H; Schütz, H; Thüring, M

    1988-03-01

    One way to communicate about the risks of drugs is through the use of package inserts. The problems associated with this medium of informing patients have been investigated by several researchers who found that people require information about drugs they are using, including extensive risk information, and that they are willing to take this information into account in their usage of drugs. But empirical results also show that people easily misinterpret the information given. A conceptual framework is proposed that might be used for better understanding the cognitive processes involved in such a type of risk assessment and communication. It is based on the idea that people develop, through experience, a mental model of how a drug works, which effects it might produce, that contraindications have to be considered, etc. This mental model is "run" when a specific package insert has been read and a specific question arises such as, for example, whether certain symptoms can be explained as normal or whether they require special attention and action. We argue that the mental model approach offers a useful perspective for examining how people understand package inserts, and consequently for improving their content and design. The approach promises to be equally useful for other aspects of risk analysis that are dependent upon human judgment and decision making, e.g., threat diagnosis and human reliability analysis. PMID:3375502

  18. Modeling and additive manufacturing of bio-inspired composites with tunable fracture mechanical properties.

    PubMed

    Dimas, Leon S; Buehler, Markus J

    2014-07-01

    Flaws, imperfections and cracks are ubiquitous in material systems and are commonly the catalysts of catastrophic material failure. As stresses and strains tend to concentrate around cracks and imperfections, structures tend to fail far before large regions of material have ever been subjected to significant loading. Therefore, a major challenge in material design is to engineer systems that perform on par with pristine structures despite the presence of imperfections. In this work we integrate knowledge of biological systems with computational modeling and state-of-the-art additive manufacturing to synthesize advanced composites with tunable fracture mechanical properties. Supported by extensive mesoscale computer simulations, we demonstrate the design and manufacturing of composites that exhibit deformation mechanisms characteristic of pristine systems, featuring flaw-tolerant properties. We analyze the results by directly comparing strain fields for the synthesized composites, obtained through digital image correlation (DIC), and the computationally tested composites. Moreover, we plot Ashby diagrams for the range of simulated and experimental composites. Our findings show good agreement between simulation and experiment, confirming that the proposed mechanisms have a significant potential for vastly improving the fracture response of composite materials. We elucidate the role of stiffness ratio variations of composite constituents as an important feature in determining the composite properties. Moreover, our work validates the predictive ability of our models, presenting them as useful tools for guiding further material design. This work enables the tailored design and manufacturing of composites assembled from inferior building blocks that obtain optimal combinations of stiffness and toughness. PMID:24700202

  19. Effects of Mn addition on dislocation loop formation in A533B and model alloys

    NASA Astrophysics Data System (ADS)

    Watanabe, H.; Masaki, S.; Masubuchi, S.; Yoshida, N.; Dohi, K.

    2013-08-01

    It is well known that the radiation hardening or embrittlement of pressure vessel steels is very sensitive to the contents of minor solutes. To study the effect of dislocation loop formation on radiation hardening in these steels, in situ observation using a high-voltage electron microscope was conducted for the reference pressure vessel steel JRQ and Fe-based model alloys containing Mn, Si, and Ni. In the Fe-based model alloys, the addition of Mn was most effective for increasing dislocation loop density at 290 °C. Based on the assumption that a di-interstitial serves as the nucleus for the formation of an interstitial loop, a binding energy of 0.22 eV was obtained for the interaction of a Mn atom and an interstitial. The formation of Mn clusters, detected by three-dimensional atom probe, and of interstitial-type loops at room temperature clearly showed that the oversized Mn atoms migrate through an interstitial mechanism. The temperature and flux dependence of loop density in pressure vessel steels was very weak up to 290 °C. This suggests that interstitial atoms are deeply trapped by the radiation-induced solute clusters in pressure vessel steels.

  20. Modelling of fire count data: fire disaster risk in Ghana.

    PubMed

    Boadi, Caleb; Harvey, Simon K; Gyeke-Dako, Agyapomaa

    2015-01-01

    Stochastic dynamics involved in ecological count data require distribution fitting procedures to model and make informed judgments. The study provides empirical research focused on the provision of an early warning system and a spatial graph that can detect societal fire risks. It offers an opportunity for communities, organizations, risk managers, actuaries and governments to be aware of, and understand, fire risks, so that they will increase the direct tackling of the threats posed by fire. A statistical distribution fitting method that best helps identify the stochastic dynamics of fire count data is used. The aim is to provide a fire-prediction model and a fire spatial graph for observed fire count data. An empirical probability distribution model is fitted to the fire count data and compared to the theoretical probability distribution of the stochastic process of fire count data. The distribution fitted to the fire frequency count data helps identify the class of models exhibited by the fire process and provides lead time for decisions. The research suggests that fire frequency and loss (fire fatalities) count data in Ghana are best modelled with a negative binomial distribution. The spatial map of observed fire frequency and fatality measured over 5 years (2007-2011) offers in this study a first regional assessment of fire frequency and fire fatality in Ghana. PMID:26702383
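    The fitting step can be illustrated with a method-of-moments negative binomial fit. The counts below are simulated stand-ins, not the Ghana data.

```python
# Method-of-moments fit of a negative binomial to monthly fire counts.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
counts = rng.negative_binomial(n=2.0, p=0.25, size=60)   # stand-in data

m, v = counts.mean(), counts.var(ddof=1)
p_hat = m / v                        # valid when v > m (overdispersion)
r_hat = m * p_hat / (1.0 - p_hat)
loglik = stats.nbinom.logpmf(counts, r_hat, p_hat).sum()
print(f"fitted r = {r_hat:.2f}, p = {p_hat:.2f}, log-likelihood = {loglik:.1f}")
```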

  1. Evaluation of Major Online Diabetes Risk Calculators and Computerized Predictive Models

    PubMed Central

    Stiglic, Gregor; Pajnkihar, Majda

    2015-01-01

    Classical paper-and-pencil risk assessment questionnaires are often accompanied by online versions of the questionnaire to reach a wider population. This study focuses on the loss, especially in risk estimation performance, that can be inflicted by directly transforming the paper versions of risk estimation calculators into online versions, thereby ignoring the more complex and accurate calculations that online calculators can perform. We empirically compare the risk estimation performance between four major diabetes risk calculators and two more advanced predictive models. National Health and Nutrition Examination Survey (NHANES) data from 1999–2012 was used to evaluate the performance of detecting diabetes and pre-diabetes. The American Diabetes Association risk test achieved the best predictive performance in the category of classical paper-and-pencil based tests, with an Area Under the ROC Curve (AUC) of 0.699 for undiagnosed diabetes (0.662 for pre-diabetes) and 47% (47% for pre-diabetes) of persons selected for screening. Our results demonstrate a significant difference in performance, with additional benefits for a lower number of persons selected for screening when statistical methods are used. The best AUC overall was obtained in diabetes risk prediction using logistic regression, with an AUC of 0.775 (0.734) and an average of 34% (48%) of persons selected for screening. However, generalized boosted regression models might be a better option from the economical point of view, as the proportion of persons selected for screening of 30% (47%) lies significantly lower for diabetes risk assessment in comparison to logistic regression (p < 0.001), with a significantly higher AUC (p < 0.001) of 0.774 (0.740) for the pre-diabetes group. Our results demonstrate a serious lack of predictive performance in four major online diabetes risk calculators. Therefore, one should take great care and consider optimizing the online versions of questionnaires that were
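    The comparison the authors run (simple score vs. statistical learner, judged by AUC) is easy to mock up on synthetic data. The sketch below uses scikit-learn with invented data; it mirrors the shape of the evaluation, not its results.

```python
# Logistic regression vs. gradient boosting, scored by ROC AUC on a held-out set.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=4000, n_features=12,
                           weights=[0.85], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
for model in (LogisticRegression(max_iter=1000), GradientBoostingClassifier()):
    auc = roc_auc_score(yte, model.fit(Xtr, ytr).predict_proba(Xte)[:, 1])
    print(type(model).__name__, round(auc, 3))
```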

  2. Framework for Risk Analysis in Multimedia Environmental Systems: Modeling Individual Steps of a Risk Assessment Process

    SciTech Connect

    Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.

    2004-06-01

    The study of the release and effects of chemicals in the environment and their associated risks to humans is central to public and private decision making. FRAMES 1.X, Framework for Risk Analysis in Multimedia Environmental Systems, is a systems modeling software platform, developed by PNNL, Pacific Northwest National Laboratory, that helps scientists study the release and effects of chemicals on a source to outcome basis, create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X, that greatly improve the ability of the module developers to “plug” their self-developed software modules into the system. The basic design, the underlying principles and a discussion of the guidelines for module developers are presented.

  3. The determination of risk areas for muddy floods based on a worst-case erosion modelling

    NASA Astrophysics Data System (ADS)

    Saathoff, Ulfert; Schindewolf, Marcus; Annika Arévalo, Sarah

    2013-04-01

    Soil erosion and muddy floods are a frequently occurring hazard in the German state of Saxony, because of the topography and the high relief energy together with the high proportion of arable land. Still, the events are rather heterogeneously distributed and we do not know where damage is likely to occur. The goal of this study is to locate hot spots for the risk of muddy floods, with the objective to prevent high economic damage in future. We applied a soil erosion and deposition map of Saxony, calculated with the process based soil erosion model EROSION 3D. This map shows the potential soil erosion and transported sediment for worst case soil conditions and a 10 year rain storm event. Furthermore, a map of the current landuse in the state is used. From the landuse map, we extracted those areas that are especially vulnerable to muddy floods, like residential and industrial areas, infrastructural facilities (e.g. power plants, hospitals) and highways. In combination with the output of the soil erosion model, the amount of sediment, that enters each single landuse entity, is calculated. Based on this data, a state-wide map with classified risks is created. The results are furthermore used to identify the risk of muddy floods for each single municipality in Saxony. The results are evaluated with data of real occurred muddy flood events with documented locations during the period between 2000 and 2010. Additionally, plausibility tests are performed for selected areas (examination of landuse, topography and soil). The results prove to be plausible and most of the documented events can be explained by the modelled risk map. The created map can be used by different institutions like city and traffic planners, to estimate the risk of muddy flood occurrence at specific locations. Furthermore, the risk map can serve insurance companies to evaluate the insurance risk of a building. To make them easily accessible, the risk map will be published online via a web GIS

  4. Derivation of a risk assessment model for hospital-acquired venous thrombosis: the NAVAL score.

    PubMed

    de Bastos, Marcos; Barreto, Sandhi M; Caiafa, Jackson S; Boguchi, Tânia; Silva, José Luiz Padilha; Rezende, Suely M

    2016-05-01

    Venous thrombosis (VT) is a preventable cause of death in hospitalized patients. The main strategy to decrease VT incidence is timely thromboprophylaxis in at-risk patients. We sought to evaluate the reliability of risk assessment model (RAM) data, the incremental usefulness of additional variables and the modelling of an adjusted score (the NAVAL score). We used the RAM proposed by Caprini for initial assessment. A 5 % systematic sample of data was independently reviewed for reliability. We evaluated the incremental usefulness of six variables for VT during the score modelling by logistic regression. We then assessed the NAVAL score for calibration, reclassification and discrimination performances. We observed 11,091 patients with 37 (0.3 %) VT events. Using the Caprini RAM, high-risk and moderate-risk patients were respectively associated with a 17.4 (95 % confidence interval [CI] 6.1-49.9) and 4.2 (95 % CI 1.6-11.0) increased VT risk compared with low-risk patients. Four independent variables were selected for the NAVAL score: "Age", "Admission clinic", "History of previous VT event" and "History of thrombophilia". The area under the receiver-operating-characteristic curve for the NAVAL score was 0.72 (95 % CI 0.63-0.81). The Net Reclassification Index (NRI) for the NAVAL score compared with the Caprini RAM was -0.1 (95 % CI -0.3 to 0.1; p = 0.28). We conclude that the NAVAL score is a simplified tool for the stratification of VT risk in hospitalized patients. With only four variables, it demonstrated good performance and discrimination, but requires external validation before clinical application. We also confirm that the Caprini RAM can effectively stratify VT risk in hospitalized patients in our population. PMID:26446587
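    The net reclassification index quoted above has a simple category-based form; the sketch below implements it on invented toy data (the categories and outcomes are not the study's).

```python
# Category-based NRI: credit upward moves for events and downward moves for
# non-events, penalize the opposite moves.
import numpy as np

def net_reclassification_improvement(old_cat, new_cat, event):
    old_cat, new_cat = np.asarray(old_cat), np.asarray(new_cat)
    event = np.asarray(event, dtype=bool)
    up, down = new_cat > old_cat, new_cat < old_cat
    nri_events = up[event].mean() - down[event].mean()
    nri_nonevents = down[~event].mean() - up[~event].mean()
    return nri_events + nri_nonevents

old = [0, 1, 2, 1, 0, 2, 1, 0]      # risk categories: 0 low, 1 moderate, 2 high
new = [1, 1, 2, 0, 0, 2, 2, 0]
ev  = [1, 0, 1, 0, 0, 1, 1, 0]
print(net_reclassification_improvement(old, new, ev))
```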

  5. Mouse models of osteoarthritis: modelling risk factors and assessing outcomes.

    PubMed

    Fang, Hang; Beier, Frank

    2014-07-01

    Osteoarthritis (OA) is a prevalent musculoskeletal disease that results in pain and low quality of life for patients, as well as enormous medical and socioeconomic burdens. The molecular mechanisms responsible for the initiation and progression of OA are still poorly understood. As such, mouse models of the disease are having increasingly important roles in OA research owing to the advancements of microsurgical techniques and the use of genetically modified mice, as well as the development of novel assessment tools. In this Review, we discuss available mouse models of OA and applicable assessment tools in studies of experimental OA. PMID:24662645

  6. The elaboration likelihood model and communication about food risks.

    PubMed

    Frewer, L J; Howard, C; Hedderley, D; Shepherd, R

    1997-12-01

    Factors such as hazard type and source credibility have been identified as important in the establishment of effective strategies for risk communication. The elaboration likelihood model was adapted to investigate the potential impact of hazard type, information source, and persuasive content of information on individual engagement in elaborative, or thoughtful, cognitions about risk messages. One hundred sixty respondents were allocated to one of eight experimental groups, and the effects of source credibility, persuasive content of information and hazard type were systematically varied. The impact of the different factors on beliefs about the information and elaborative processing examined. Low credibility was particularly important in reducing risk perceptions, although persuasive content and hazard type were also influential in determining whether elaborative processing occurred. PMID:9463930

  7. Agents, Bayes, and Climatic Risks - a modular modelling approach

    NASA Astrophysics Data System (ADS)

    Haas, A.; Jaeger, C.

    2005-08-01

    When insurance firms, energy companies, governments, NGOs, and other agents strive to manage climatic risks, it is by no means clear what the aggregate outcome should and will be. As a framework for investigating this subject, we present the LAGOM model family. It is based on modules depicting learning social agents. For managing climate risks, our agents use second-order probabilities and update them by means of a Bayesian mechanism while differing in priors and risk aversion. The interactions between these modules and the aggregate outcomes of their actions are implemented using further modules. The software system is implemented as a series of parallel processes using the CIAMn approach. It is possible to couple modules irrespective of the language they are written in, the operating system under which they run, and the physical location of the machine.
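    The Bayesian mechanism sketched in the abstract can be miniaturized to a Beta-Bernoulli update: each agent holds a second-order belief (a density over the loss probability) and revises it with each observed outcome, starting from a different prior. The agent names and numbers below are invented.

```python
# Minimal Beta-Bernoulli sketch of heterogeneous Bayesian agents.
from dataclasses import dataclass

@dataclass
class Agent:
    alpha: float   # prior pseudo-counts of loss periods
    beta: float    # prior pseudo-counts of no-loss periods

    def update(self, loss_occurred: bool) -> None:
        if loss_occurred:
            self.alpha += 1
        else:
            self.beta += 1

    def mean_risk(self) -> float:
        return self.alpha / (self.alpha + self.beta)

insurer, ngo = Agent(1, 9), Agent(3, 3)        # differing priors
for observed_loss in [True, False, False, True]:
    insurer.update(observed_loss)
    ngo.update(observed_loss)
print(insurer.mean_risk(), ngo.mean_risk())    # beliefs converge as shared data accrue
```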

  8. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  9. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  10. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  11. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  12. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  13. Issues in risk assessment and modifications of the NRC health effects models

    SciTech Connect

    Gilbert, E.S.

    1992-07-02

    A report, Health Effects Models for Nuclear Power Plant Accident Consequence Analysis, was published by the US Nuclear Regulatory Commission in 1985 and revised in 1989. These reports provided models for estimating health effects that would be expected to result from the radiation exposure received in a nuclear reactor accident. Separate models were given for early occurring effects, late somatic effects, and genetic effects; however, this paper addresses only late somatic effects, i.e., the risk of cancer expected to occur in the lifetimes of exposed individuals. The 1989 revision was prepared prior to the publication of the BEIR V, 1988 UNSCEAR, and ICRP 60 reports. For this reason, an addendum was needed that would provide modified risk models taking into account these recent reports and, more generally, any new evidence that had appeared since the 1989 publication. Of special importance was consideration of updated analyses of the Japanese A-bomb survivor study data based on revised DS86 dosimetry. The process of preparing the addendum required thorough review and evaluation of the models used by the BEIR V, UNSCEAR, and ICRP committees, and also required thorough consideration of the various decisions that must be made in any risk assessment effort. This paper emphasizes general issues and problems that arise in risk assessment, and also indicates areas where additional development and application of statistical methods may be fruitful.

  14. Intrauterine diabetic environment confers risks for type 2 diabetes mellitus and obesity in the offspring, in addition to genetic susceptibility.

    PubMed

    Dabelea, D; Pettitt, D J

    2001-01-01

    Numerous studies have reported that offspring whose mothers had type 2 diabetes mellitus (DM) are more likely to develop type 2 DM, impaired glucose tolerance, and obesity at an early age than offspring whose fathers had DM. Exposure to the diabetic intrauterine environment has been shown to be an important risk factor for all these conditions. To what extent transmission of type 2 DM from mother to offspring is the effect of genetic inheritance, and to what extent it is the long-term consequence of exposure to maternal hyperglycemia, is still uncertain. There are, of course, interactions between the diabetic intrauterine environment and genetics. Several findings in experimental animals as well as in humans suggest, however, that exposure of the fetus to the mother's DM confers a risk for type 2 DM and obesity that is above any genetically transmitted susceptibility. In the Pima Indian population much of the increase in childhood type 2 DM can be attributed to the diabetic intrauterine environment. This suggests that intensive glucose control during pregnancy might have extended beneficial effects, contributing to a decrease in the prevalence of childhood type 2 DM. PMID:11592564

  15. Child Effortful Control, Teacher-student Relationships, and Achievement in Academically At-risk Children: Additive and Interactive Effects

    PubMed Central

    Liew, Jeffrey; Chen, Qi; Hughes, Jan N.

    2009-01-01

    The joint contributions of child effortful control (using inhibitory control and task accuracy as behavioral indices) and positive teacher-student relationships at first grade on reading and mathematics achievement at second grade were examined in 761 children who were predominantly from low-income and ethnic minority backgrounds and assessed to be academically at-risk at entry to first grade. Analyses accounted for clustering effects, covariates, baselines of effortful control measures, and prior levels of achievement. Even with such conservative statistical controls, interactive effects were found for task accuracy and positive teacher-student relationships on future achievement. Results suggest that task accuracy served as a protective factor so that children with high task accuracy performed well academically despite not having positive teacher-student relationships. Further, positive teacher-student relationships served as a compensatory factor so that children with low task accuracy performed just as well as those with high task accuracy if they were paired with a positive and supportive teacher. Importantly, results indicate that the influence of positive teacher-student relationships on future achievement was most pronounced for students with low effortful control on tasks that require fine motor skills, accuracy, and attention-related skills. Study results have implications for narrowing achievement disparities for academically at-risk children. PMID:20161421

  16. Development of Standardized Probabilistic Risk Assessment Models for Shutdown Operations Integrated in SPAR Level 1 Model

    SciTech Connect

    S. T. Khericha; J. Mitman

    2008-05-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during Modes 4, 5, and 6 at pressurized water reactors and Modes 4 and 5 at boiling water reactors can be significant. This paper describes using the U.S. Nuclear Regulatory Commission’s full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development of risk evaluation models for commercial nuclear power plants. The shutdown models are integrated with their respective internal event at-power SPAR model. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. Preliminary human reliability analysis results indicate that risk is dominated by the operator’s ability to correctly diagnose events and initiate systems.
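    The integration described (system fault trees feeding shutdown event-tree logic) reduces, in miniature, to multiplying an initiating-event frequency by branch failure or success probabilities. The sketch below is a toy with invented probabilities, not SPAR logic.

```python
# Toy event-tree sequence quantification on top of fault-tree top-event results.
p_fail = {"diagnose": 1e-2, "injection": 3e-3, "cooling": 5e-3}   # hypothetical

def sequence_frequency(initiator_freq, failed, succeeded):
    f = initiator_freq
    for branch in failed:
        f *= p_fail[branch]
    for branch in succeeded:
        f *= 1.0 - p_fail[branch]
    return f

# e.g. a sequence where the operator fails to diagnose and cooling then fails:
print(sequence_frequency(0.1, ["diagnose", "cooling"], ["injection"]))
```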

  17. Risk factors assessment and risk prediction models in lung cancer screening candidates

    PubMed Central

    Wachuła, Ewa; Szabłowska-Siwik, Sylwia; Boratyn-Nowicka, Agnieszka; Czyżewski, Damian

    2016-01-01

    From February 2015, low-dose computed tomography (LDCT) screening entered the armamentarium of diagnostic tools broadly available to individuals at high-risk of developing lung cancer. While a huge number of pulmonary nodules are identified, only a small fraction turns out to be early lung cancers. The majority of them constitute a variety of benign lesions. Although it entails a burden of the diagnostic work-up, the undisputable benefit emerges from: (I) lung cancer diagnosis at earlier stages (stage shift); (II) additional findings enabling the implementation of a preventive action beyond the realm of thoracic oncology. This review presents how to utilize the risk factors from distinct categories such as epidemiology, radiology and biomarkers to target the fraction of population, which may benefit most from the introduced screening modality. PMID:27195269

  18. Risk factors assessment and risk prediction models in lung cancer screening candidates.

    PubMed

    Adamek, Mariusz; Wachuła, Ewa; Szabłowska-Siwik, Sylwia; Boratyn-Nowicka, Agnieszka; Czyżewski, Damian

    2016-04-01

    From February 2015, low-dose computed tomography (LDCT) screening entered the armamentarium of diagnostic tools broadly available to individuals at high-risk of developing lung cancer. While a huge number of pulmonary nodules are identified, only a small fraction turns out to be early lung cancers. The majority of them constitute a variety of benign lesions. Although it entails a burden of the diagnostic work-up, the undisputable benefit emerges from: (I) lung cancer diagnosis at earlier stages (stage shift); (II) additional findings enabling the implementation of a preventive action beyond the realm of thoracic oncology. This review presents how to utilize the risk factors from distinct categories such as epidemiology, radiology and biomarkers to target the fraction of population, which may benefit most from the introduced screening modality. PMID:27195269

  19. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    NASA Astrophysics Data System (ADS)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They, therefore, require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems with an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  20. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    SciTech Connect

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion model (TSCE) of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute-exposure modeling but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop the analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piecewise-constant-parameter two-stage clonal expansion model of carcinogenesis. For a single acute exposure and for multiple exposures in an acute series, either only the first mutation rate is allowed to vary with dose, or all the parameters are dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with dose. With these analytical functions, it becomes easy to evaluate the risks of cancer and to deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple-exposure versions of the TSCE model were to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed. A draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact
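    Since the analytic expressions themselves are in the attached manuscript, a stochastic simulation conveys the model's structure instead: initiation creates premalignant cells, which divide, die, or convert, and the first conversion time defines the survival function. All parameters below are invented for illustration.

```python
# Gillespie-style simulation of the two-stage clonal expansion (TSCE) model
# with constant parameters; estimates P(first cancer cell by age t_max).
import random

random.seed(0)
NU, ALPHA, BETA, MU = 0.2, 0.25, 0.24, 1e-3   # initiation, division, death, conversion

def first_cancer_time(t_max=80.0):
    t, inter = 0.0, 0                 # time (yr), premalignant cell count
    while True:
        rate = NU + inter * (ALPHA + BETA + MU)
        t += random.expovariate(rate)
        if t >= t_max:
            return None               # censored: no malignant cell by t_max
        u = random.random() * rate
        if u < NU:
            inter += 1                # initiation of a new premalignant cell
        elif u < NU + inter * ALPHA:
            inter += 1                # clonal expansion (division)
        elif u < NU + inter * (ALPHA + BETA):
            inter -= 1                # premalignant cell death
        else:
            return t                  # first malignant conversion

times = [first_cancer_time() for _ in range(2000)]
print("P(cancer by age 80):", sum(t is not None for t in times) / len(times))
```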

  1. ByMuR model: interaction among risks and uncertainty treatment in long-term multi-hazard/risk assessments

    NASA Astrophysics Data System (ADS)

    Selva, J.

    2012-12-01

    Multi-risk approaches have been recently proposed to assess and compare different risks in the same target area. The key points of multi-risk assessment are the development of homogeneous risk definitions and the treatment of risk interaction. The lack of treatment of interaction may lead to significant biases and thus to erroneous risk hierarchization, which is one of the primary outputs of risk assessments for decision makers. Within the framework of the Italian project "ByMuR - Bayesian Multi-Risk assessment", a formal model (ByMuR model) to assess multi-risk for a target area is under development, aiming (i) to perform multi-risk analyses treating interaction between different hazardous phenomena, accounting for possible effects of interaction at the hazard, vulnerability and exposure levels, and (ii) to explicitly account for all uncertainties (aleatory and epistemic) through a Bayesian approach, allowing a meaningful comparison among different risks. The model is meant to be general, but it is targeted at the assessment of volcanic, seismic and tsunami risks for the city of Naples (Italy). Here, the preliminary development of the ByMuR model is presented. The applicability of the methodology is demonstrated through illustrative examples, in which the effects of uncertainties and the bias in single-risk estimation induced by the assumption of independence among risks are explicitly assessed. An extensive application of this methodology at regional and sub-regional scales would make it possible to identify where a given interaction has significant effects in long-term risk assessments, and thus when multi-risk analyses should be considered in order to provide unbiased risk estimations.

  2. Field evaluation of an avian risk assessment model

    USGS Publications Warehouse

    Vyas, N.B.; Spann, J.W.; Hulse, C.S.; Borges, S.L.; Bennett, R.S.; Torrez, M.; Williams, B.I.; Leffel, R.

    2006-01-01

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in the field. We tested technical-grade diazinon and its DZN-50W (50% diazinon active ingredient wettable powder) formulation on Canada goose (Branta canadensis) goslings. Brain acetylcholinesterase activity was measured, and the feathers and skin, feet, and gastrointestinal contents were analyzed for diazinon residues. The dose-response curves showed that diazinon was significantly more toxic to goslings in the outdoor test than in the laboratory tests. The deterministic risk assessment method identified the potential for risk to birds in general, but the factors associated with extrapolating from the laboratory to the field, and from the laboratory test species to other species, resulted in the underestimation of risk to the goslings. The present study indicates that laboratory-based risk quotients should be interpreted with caution.
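    The deterministic screening method being evaluated boils down to a risk quotient: estimated dietary exposure divided by a toxicity endpoint, compared with a level of concern. The numbers below are invented, not the study's.

```python
# Deterministic avian risk quotient sketch.
def risk_quotient(eec_ppm: float, lc50_ppm: float) -> float:
    """EEC: estimated environmental (dietary) concentration; LC50: dietary toxicity."""
    return eec_ppm / lc50_ppm

LEVEL_OF_CONCERN = 0.5                    # illustrative acute-risk trigger
rq = risk_quotient(eec_ppm=120.0, lc50_ppm=300.0)
print(rq, "exceeds LOC" if rq > LEVEL_OF_CONCERN else "below LOC")
```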

  3. Making Risk Models Operational for Situational Awareness and Decision Support

    SciTech Connect

    Paulson, Patrick R.; Coles, Garill A.; Shoemaker, Steven V.

    2012-06-12

    Modernization of nuclear power operations control systems, in particular the move to digital control systems, creates an opportunity to modernize existing legacy infrastructure and extend plant life. We describe here decision support tools that allow the assessment of different facets of risk and support the optimization of available resources to reduce risk as plants are upgraded and maintained. This methodology could become an integrated part of the design review process and a part of the operations management systems. The methodology can be applied to the design of new reactors such as small modular reactors (SMRs) and be helpful in assessing the risks of different reactor configurations. Our tool provides a low-cost evaluation of alternative configurations and provides an expanded safety analysis by considering scenarios early in the implementation cycle, where cost impacts can be minimized. The effects of failures can be modeled and thoroughly vetted to understand their potential impact on risk. The process and tools presented here allow for an integrated assessment of risk by supporting traditional defense-in-depth approaches while taking into consideration the insertion of new digital instrument and control systems.

  4. Survivorship models for estimating the risk of decompression sickness.

    PubMed

    Kumar, K V; Powell, M R

    1994-07-01

    Several approaches have been used for modeling the incidence of decompression sickness (DCS), such as Hill's dose-response and logistic regression. Most of these methods do not include the time-to-onset information in the model. Survival analysis (failure time analysis) is appropriate when the time to onset of an event is of interest. The applicability of survival analysis for modeling the risk of DCS is illustrated by using data obtained from hypobaric chamber exposures simulating extravehicular activities (n = 426). Univariate analyses of incidence-free survival proportions were obtained for Doppler-detectable circulating microbubbles (CMB), symptoms of DCS, and test aborts. A log-linear failure time regression model with the 360-min half-time tissue ratio (TR) as covariate was constructed, and estimated probabilities for various TR values were calculated. Further regression analysis including CMB status in this model showed significant improvement (p < 0.05) in the estimation of DCS over the previous model. Since DCS is dependent on the exposure pressure as well as the duration of exposure, we recommend the use of survival analysis for modeling the risk of DCS. PMID:7945136
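    A log-linear failure-time regression of the kind described can be sketched with the lifelines library on simulated data (the data-generating numbers below are invented; only the tissue-ratio covariate follows the abstract).

```python
# Weibull accelerated failure time (AFT) model: DCS onset time vs. tissue ratio,
# with censoring at the end of the simulated exposure.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(5)
n = 426
tr = rng.uniform(1.2, 2.2, n)             # 360-min half-time tissue ratio
scale = np.exp(6.0 - 1.5 * tr)            # assumed: higher TR, earlier onset
t = rng.weibull(1.3, n) * scale           # latent onset times (min)
observed = t < 360.0                      # DCS observed within the exposure
df = pd.DataFrame({"T": np.minimum(t, 360.0),
                   "E": observed.astype(int),
                   "TR": tr})

aft = WeibullAFTFitter().fit(df, duration_col="T", event_col="E")
aft.print_summary()
```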

  5. The impact of consumer phase models in microbial risk analysis.

    PubMed

    Nauta, Maarten; Christensen, Bjarke

    2011-02-01

    In quantitative microbiological risk assessment (QMRA), the consumer phase model (CPM) describes the part of the food chain between purchase of the food product at retail and exposure. Construction of a CPM is complicated by the large variation in consumer food handling practices and a limited availability of data. Therefore, several subjective (simplifying) assumptions have to be made when a CPM is constructed, but with a single CPM their impact on the QMRA results is unclear. We therefore compared the performance of eight published CPMs for Campylobacter in broiler meat in an example of a QMRA, where all the CPMs were analyzed using one single input distribution of concentrations at retail, and the same dose-response relationship. It was found that, between CPMs, there may be a considerable difference in the estimated probability of illness per serving. However, the estimated relative risk reductions are less different for scenarios modeling the implementation of control measures. For control measures affecting the Campylobacter prevalence, the relative risk is proportional irrespective of the CPM used. However, for control measures affecting the concentration the CPMs show some difference in the estimated relative risk. This difference is largest for scenarios where the aim is to remove the highly contaminated portion from human exposure. Given these results, we conclude that for many purposes it is not necessary to develop a new detailed CPM for each new QMRA. However, more observational data on consumer food handling practices and their impact on microbial transfer and survival are needed to generalize this conclusion. PMID:20738819

  6. Recurrence models of volcanic events: Applications to volcanic risk assessment

    SciTech Connect

    Crowe, B.M.; Picard, R.; Valentine, G.; Perry, F.V.

    1992-03-01

    An assessment of the risk of future volcanism has been conducted for isolation of high-level radioactive waste at the potential Yucca Mountain site in southern Nevada. Risk used in this context refers to a combined assessment of the probability and consequences of future volcanic activity. Past studies established bounds on the probability of magmatic disruption of a repository. These bounds were revised as additional data were gathered from site characterization studies. The probability of direct intersection of a potential repository located in an eight km² area of Yucca Mountain by ascending basalt magma was bounded by the range of 10⁻⁸ to 10⁻¹⁰ yr⁻¹. The consequences of magmatic disruption of a repository were estimated in previous studies to be limited. The exact releases from such an event depend on the strike of an intruding basalt dike relative to the repository geometry, the timing of the basaltic event relative to the age of the radioactive waste, and the mechanisms of release and dispersal of the waste radionuclides in the accessible environment. The combined low probability of repository disruption and the limited releases associated with this event established the basis for the judgement that the risk of future volcanism was relatively low. It was reasoned that the risk of future volcanism was not likely to result in disqualification of the potential Yucca Mountain site.

  7. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Little, Mark P.; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of high-dose radiation-induced solid cancers included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed, in addition to cellular initiation, inactivation, and proliferation, a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed). As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low

  8. Intensifying instruction: Does additional instructional time make a difference for the most at-risk first graders?

    PubMed

    Harn, Beth A; Linan-Thompson, Sylvia; Roberts, Gregory

    2008-01-01

    Research is clear on the benefit of early intervention efforts and the importance of intensive instructional supports; however, understanding which features to intensify is less clear. General intervention features of group size, instructional delivery, and time are areas schools can consider manipulating to intensify instruction. Each of these features can also vary along a continuum, making them easier or more challenging for schools to implement. What is unclear is whether implementing very intensive interventions early in school (first grade), which require significantly more school resources, produces correspondingly accelerated student learning. This article investigates the role of intensifying instructional time for the most at-risk first graders in schools implementing research-based instructional and assessment practices within multitiered instructional support systems. Results indicate that students receiving more intensive intervention made significantly more progress across a range of early reading measures. Intervention features, limitations, recommendations for practice, and implications for treatment resisters are discussed. PMID:18354932

  9. Integrated Water and Sanitation Risk Assessment and Modeling in the Upper Sonora River basin (Northwest, Mexico)

    NASA Astrophysics Data System (ADS)

    Mayer, A. S.; Robles-Morua, A.; Halvorsen, K. E.; Vivoni, E. R.; Auer, M. T.

    2011-12-01

    explore the use of participatory modeling frameworks in less developed regions. Results indicate that respondents agreed strongly with the hydrologic and water quality modeling methodologies presented and considered the modeling results useful. Our results also show that participatory modeling approaches can have short term impacts as seen in the changes in water-related risk perceptions. In total, these projects revealed that water resources management solutions need to take into account variations across the human landscape (i.e. risk perceptions) and variations in the biophysical response of watersheds to natural phenomena (i.e. streamflow generation) and to anthropogenic activities (i.e. contaminant fate and transport). In addition, this work underscores the notion that sustainable water resources solutions need to contend with uncertainty in our understanding and predictions of human perceptions and biophysical systems.

  10. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
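
    The weights-of-evidence calculation at the core of such a model is compact enough to sketch. Below, hypothetical cell counts relate a binary evidence layer (e.g., proximity to a road) to historical ignitions; W+ and W- are the positive and negative weights, and their difference is the contrast used to rank predictors.

```python
# Sketch: positive/negative weights of evidence for one binary predictor
# against historical ignition cells. Counts are hypothetical.
import math

def weights_of_evidence(n11, n10, n01, n00):
    """n11: ignition cells with factor present, n10: ignition cells without,
       n01: non-ignition cells with factor present, n00: without."""
    w_plus  = math.log((n11 / (n11 + n10)) / (n01 / (n01 + n00)))
    w_minus = math.log((n10 / (n11 + n10)) / (n00 / (n01 + n00)))
    return w_plus, w_minus, w_plus - w_minus      # contrast C = W+ - W-

w_plus, w_minus, contrast = weights_of_evidence(180, 70, 4000, 9000)
print(f"W+={w_plus:.2f}, W-={w_minus:.2f}, contrast={contrast:.2f}")
```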

  11. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like other lines of insurance (fire, auto), has historically adopted an actuarial approach: premium rates are determined from historical loss experience. Because earthquakes are rare events with severe consequences, poorly grounded premium rates and a limited understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  12. Enhancement of colour stability of anthocyanins in model beverages by gum arabic addition.

    PubMed

    Chung, Cheryl; Rojanasasithara, Thananunt; Mutilangi, William; McClements, David Julian

    2016-06-15

    This study investigated the potential of gum arabic to improve the stability of anthocyanins that are used in commercial beverages as natural colourants. The degradation of purple carrot anthocyanin in model beverage systems (pH 3.0) containing L-ascorbic acid proceeded with a first-order reaction rate during storage (40 °C for 5 days in light). The addition of gum arabic (0.05-5.0%) significantly enhanced the colour stability of anthocyanin, with the most stable systems observed at intermediate levels (1.5%). A further increase in concentration (>1.5%) reduced its efficacy due to a change in the conformation of the gum arabic molecules that hindered their exposure to the anthocyanins. Fluorescence quenching measurements showed that the anthocyanin could have interacted with the glycoprotein fractions of the gum arabic through hydrogen bonding, resulting in enhanced stability. Overall, this study provides valuable information about enhancing the stability of anthocyanins in beverage systems using natural ingredients. PMID:26868542
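
    Since the colour degradation above is first order, C(t) = C0·exp(-kt), the rate constant can be recovered as the slope of ln C versus t. A small sketch with hypothetical data points:

```python
# Sketch: recovering a first-order degradation rate constant k and the
# colour half-life from concentration measurements. Data are hypothetical.
import numpy as np

t = np.array([0, 1, 2, 3, 4, 5])                 # storage time, days
c = np.array([100, 82, 66, 55, 44, 37])          # % colour remaining

k = -np.polyfit(t, np.log(c), 1)[0]              # slope of ln C vs t gives -k
half_life = np.log(2) / k
print(f"k = {k:.3f} per day, half-life = {half_life:.1f} days")
```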

  13. Influence of the heterogeneous reaction HCl + HOCl on an ozone hole model with hydrocarbon additions

    SciTech Connect

    Elliott, S.; Cicerone, R.J.; Turco, R.P.

    1994-02-20

    Injection of ethane or propane has been suggested as a means for reducing ozone loss within the Antarctic vortex because alkanes can convert active chlorine radicals into hydrochloric acid. In kinetic models of vortex chemistry including as heterogeneous processes only the hydrolysis and HCl reactions of ClONO₂ and N₂O₅, parts-per-billion by volume levels of the light alkanes counteract ozone depletion by sequestering chlorine atoms. Introduction of the surface reaction of HCl with HOCl causes ethane to deepen baseline ozone holes and generally works to impede any mitigation by hydrocarbons. The increased depletion occurs because HCl + HOCl can be driven by HOₓ radicals released during organic oxidation. Following initial hydrogen abstraction by chlorine, alkane breakdown leads to a net hydrochloric acid activation as the remaining hydrogen atoms enter the photochemical system. Lowering the rate constant for reactions of organic peroxy radicals with ClO to 10⁻¹³ cm³ molecule⁻¹ s⁻¹ does not alter results, and the major conclusions are insensitive to the timing of the ethane additions. Ignoring the organic peroxy radical plus ClO reactions entirely restores remediation capabilities by allowing HOₓ removal independent of HCl. Remediation also returns if early evaporation of polar stratospheric clouds leaves hydrogen atoms trapped in aldehyde intermediates, but real ozone losses are small in such cases. 95 refs., 4 figs., 7 tabs.

  14. In vivo characterization of two additional Leishmania donovani strains using the murine and hamster model.

    PubMed

    Kauffmann, F; Dumetz, F; Hendrickx, S; Muraille, E; Dujardin, J-C; Maes, L; Magez, S; De Trez, C

    2016-05-01

    Leishmania donovani is a protozoan parasite causing the neglected tropical disease visceral leishmaniasis. One difficulty in studying the immunopathology of L. donovani infection is the limited adaptability of the strains to experimental mammalian hosts. Our knowledge about L. donovani infections relies on a restricted number of East African strains (LV9, 1S). Isolated from patients in the 1960s, these strains were described extensively in mice and Syrian hamsters and have consequently become 'reference' laboratory strains. L. donovani strains from the Indian subcontinent display distinct clinical features compared to East African strains, and some reports describing their in vivo immunopathology exist. This study comprises a comprehensive immunopathological characterization of infection with two additional strains, the Ethiopian L. donovani L82 strain and the Nepalese L. donovani BPK282 strain, in both Syrian hamsters and C57BL/6 mice. Parameters including parasitaemia levels, weight loss, hepatosplenomegaly and alterations in the cellular composition of the spleen and liver showed that the L82 strain generated an overall more virulent infection than the BPK282 strain. Altogether, both L. donovani strains are suitable and interesting for subsequent in vivo investigation of visceral leishmaniasis in the Syrian hamster and C57BL/6 mouse models. PMID:27012562

  15. FIRESTORM: Modelling the water quality risk of wildfire.

    NASA Astrophysics Data System (ADS)

    Mason, C. I.; Sheridan, G. J.; Smith, H. G.; Jones, O.; Chong, D.; Tolhurst, K.

    2012-04-01

    Following wildfire, loss of vegetation and changes to soil properties may result in decreases in infiltration rates, less rainfall interception, and higher overland flow velocities. Rainfall events affecting burn areas before vegetation recovers can cause high magnitude erosion events that impact on downstream water quality. For cities and towns that rely upon fire-prone forest catchments for water supply, wildfire impacts on water quality represent a credible risk to water supply security. Quantifying the risk associated with the occurrence of wildfires and the magnitude of water quality impacts has important implications for managing water supplies. At present, no suitable integrative model exists that considers the probabilistic nature of system inputs as well as the range of processes and scales involved in this problem. We present FIRESTORM, a new model currently in development that aims to determine the range of sediment and associated contaminant loads that may be delivered to water supply reservoirs from the combination of wildfire and subsequent rainfall events. This Monte Carlo model incorporates the probabilistic nature of fire ignition, fire weather and rainfall, and includes deterministic models for fire behaviour and locally dominant erosion processes. FIRESTORM calculates the magnitude and associated annual risk of catchment-scale sediment loads associated with the occurrence of wildfire and rainfall generated by two rain event types. The two event types are localised, high intensity, short-duration convective storms, and widespread, longer duration synoptic-scale rainfall events. Initial application and testing of the model will focus on the two main reservoirs supplying water to Melbourne, Australia, both of which are situated in forest catchments vulnerable to wildfire. Probabilistic fire ignition and weather scenarios have been combined using 40 years of fire records and weather observations. These are used to select from a dataset of over 80

  16. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    PubMed Central

    Mumtaz, Moiz; Fisher, Jeffrey; Blount, Benjamin; Ruiz, Patricia

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application; health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR-funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures. PMID:22523493
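
    At their core, PBPK models are systems of mass-balance ODEs for linked tissue compartments. The sketch below reduces this to a single well-stirred compartment with first-order uptake and elimination; the rate constants are illustrative and not taken from the toolkit.

```python
# Sketch: a minimal one-compartment pharmacokinetic ODE, the building block
# that full PBPK models replicate across many tissues.
import numpy as np
from scipy.integrate import solve_ivp

k_abs, k_elim = 1.2, 0.3      # 1/h: absorption into and elimination from tissue

def rhs(t, y):
    gut, tissue = y
    return [-k_abs * gut, k_abs * gut - k_elim * tissue]

sol = solve_ivp(rhs, (0.0, 24.0), [100.0, 0.0], dense_output=True)
t = np.linspace(0, 24, 7)
print(dict(zip(t, sol.sol(t)[1].round(2))))       # internal tissue dose over time
```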

  17. Modeling the risk of secondary malignancies after radiotherapy.

    PubMed

    Schneider, Uwe

    2011-01-01

    In developed countries, more than half of all cancer patients receive radiotherapy at some stage in the management of their disease. However, a radiation-induced secondary malignancy can be the price of success if the primary cancer is cured or at least controlled. Therefore, there is increasing concern regarding radiation-related second cancer risks in long-term radiotherapy survivors and a corresponding need to be able to predict cancer risks at high radiation doses. Of particular interest are second cancer risk estimates for new radiation treatment modalities such as intensity modulated radiotherapy, intensity modulated arc-therapy, proton and heavy ion radiotherapy. The long term risks from such modern radiotherapy treatment techniques have not yet been determined and are unlikely to become apparent for many years, due to the long latency time for solid tumor induction. Most information on the dose-response of radiation-induced cancer is derived from data on the A-bomb survivors who were exposed to γ-rays and neutrons. Since, for radiation protection purposes, the dose span of main interest is between zero and one Gy, the analysis of the A-bomb survivors is usually focused on this range. With increasing cure rates, estimates of cancer risk for doses larger than one Gy are becoming more important for radiotherapy patients. Therefore in this review, emphasis was placed on doses relevant for radiotherapy with respect to radiation induced solid cancer. Simple radiation protection models should be used only with extreme care for risk estimates in radiotherapy, since they are developed exclusively for low dose. When applied to scatter radiation, such models can predict only a fraction of observed second malignancies. Better semi-empirical models include the effect of dose fractionation and represent the dose-response relationships more accurately. The involved uncertainties are still huge for most of the organs and tissues. A major reason for this is that the

  18. A Family-Centered Model for Sharing Genetic Risk.

    PubMed

    Daly, Mary B

    2015-01-01

    The successes of the Human Genome Project have ushered in a new era of genomic science. To effectively translate these discoveries, it will be critical to improve the communication of genetic risk within families. This will require a systematic approach that accounts for the nature of family relationships and sociocultural beliefs. This paper proposes the application of the Family Systems Illness Model, used in the setting of cancer care, to the evolving field of genomics. PMID:26479564

  19. Modeling the operational risk in Iranian commercial banks: case study of a private bank

    NASA Astrophysics Data System (ADS)

    Momen, Omid; Kimiagari, Alimohammad; Noorbakhsh, Eaman

    2012-08-01

    The Basel Committee on Banking Supervision at the Bank for International Settlements classifies banking risks into three main categories: credit risk, market risk, and operational risk. The focus of this study is on operational risk measurement in Iranian banks. Issues arising when trying to implement operational risk models in Iran are therefore discussed, and some solutions are recommended. Moreover, all steps of operational risk measurement based on the Loss Distribution Approach, with modifications specific to Iran, are presented. We employed the approach of this study to model the operational risk of an Iranian private bank. The results are quite reasonable in comparison with the scale of the bank and its other risk categories.
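
    The Loss Distribution Approach described above combines a frequency distribution for the number of loss events with a severity distribution for their sizes, then reads a capital figure off a high quantile of the simulated aggregate annual loss. A minimal sketch with illustrative parameters:

```python
# Sketch of the Loss Distribution Approach (LDA): Poisson frequency,
# lognormal severity, aggregate annual loss by Monte Carlo, capital as
# a high quantile. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000
lam = 25.0                   # expected loss events per year
mu, sigma = 10.0, 1.8        # lognormal severity parameters (log scale)

counts = rng.poisson(lam, n_years)
annual_loss = np.array([
    rng.lognormal(mu, sigma, k).sum() for k in counts
])

var_999 = np.quantile(annual_loss, 0.999)    # 99.9% quantile, the Basel II level
print(f"expected annual loss = {annual_loss.mean():,.0f}")
print(f"99.9% VaR (capital charge proxy) = {var_999:,.0f}")
```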

  20. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements (NCRP, 2000, 2006). However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as opposed to the previous approach based on linear energy transfer, the development of a new solar particle event (SPE) model, and the updates to galactic cosmic ray (GCR) and shielding transport models. The newer epidemiological information includes updates to the cancer incidence rates from the life span study (LSS) of the Japanese atomic bomb survivors (Preston et al., 2007), transferred to the U.S. population and converted to cancer mortality rates from U.S. population statistics. In addition, the proposed model provides an alternative analysis applicable to lifetime never-smokers (NSs). Details of the uncertainty analysis in the model have also been updated and revised. NASA's proposed model and associated uncertainties are complex in their formulation and as such require a very clear and precise set of descriptions. The committee found the 2011 NASA report challenging to review largely because of the lack of clarity in the model descriptions and derivation of the various parameters used. The committee requested some clarifications from NASA throughout its review and was able to resolve many, but not all, of the ambiguities in the written description.

  1. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  2. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model

    PubMed Central

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-01-01

    Abstract The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors that were included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazards model. Time-dependent receiver operating characteristic (ROC) curves and time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess the calibration ability, adjusted for the competing risk of non-CAD death. The net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 rounds of bootstrap re-sampling. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs at t = 10 years for men and women were 0.841 [95% confidence interval (95% CI): 0.806–0.877] and 0.804 (95% CI: 0.768–0.839) in the Fine and Gray model, versus 0.784 (95% CI: 0.738–0.830) and 0.733 (95% CI: 0.692–0.775) in the Cox proportional hazards model. The competing risk model was significantly superior to the Cox proportional hazards model on discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, which have good sensitivity and specificity. A sex-specific multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify the groups at a high risk for CAD among the Chinese
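
    The competing-risks ingredients can be sketched in Python with lifelines: the Aalen-Johansen estimator gives the cumulative incidence of CAD while treating non-CAD death as a competing event, and a cause-specific Cox model covers the regression step (a Fine and Gray fit itself is most readily available in R's cmprsk::crr). The data and covariates below are simulated stand-ins, not the Beijing cohort.

```python
# Sketch: cumulative incidence under competing risks plus a cause-specific
# Cox regression for the event of interest.
import numpy as np
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "years": rng.exponential(12, n) + 0.1,
    # 0 = censored, 1 = incident CAD, 2 = death from other causes
    "event": rng.choice([0, 1, 2], n, p=[0.5, 0.3, 0.2]),
    "age":   rng.normal(70, 6, n),
    "sbp":   rng.normal(140, 18, n),
})

# Cumulative incidence of CAD, accounting for the competing risk of death
ajf = AalenJohansenFitter()
ajf.fit(df["years"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())

# Cause-specific Cox model for CAD (competing events treated as censored)
cs = df.assign(cad=(df["event"] == 1).astype(int))
cox = CoxPHFitter().fit(cs[["years", "cad", "age", "sbp"]],
                        duration_col="years", event_col="cad")
cox.print_summary()
```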

  3. A new model for polluted soil risk assessment

    NASA Astrophysics Data System (ADS)

    Andretta, M.; Villani, M.; Serra, R.

    2003-04-01

    In recent years, the problem of evaluating the risk related to soil pollution has become more and more important all over the world. The increasing number of polluted soils in all the industrialised countries has required the formalisation of well-defined methodologies for setting the technical and economic limits of soil remediation. Mainly, these limits are defined in terms of general threshold values that, in some cases, cannot be reached even with the so-called Best Available Technology (B.A.T.), due for example to the characteristics of the pollutants or of the affected soil, or to the extremely high cost or duration of the remedial intervention. For these reasons, both in the North American countries and in the European ones, many alternative methodologies based on systematic and scientifically well-founded approaches have been developed, in order to determine the real effects of the pollution on the receptor targets. Typically, these methodologies are organised into different levels of detail, the so-called "tiers". Tier 1 is based on a conservative estimation of the risk for the targets, derived from very general, "worst case" situations. Tier 2 is based on a more detailed and site-specific estimation of the hazard, evaluated by the use of semi-empirical, analytical formulas for source characterisation, pollutant transport, and target exposure evaluation. Tier 3 is the most detailed and site-specific level of application of the risk assessment methodologies and requires the use of numerical methods with much detailed information on the site and on the receptors (e.g., chemical/physical parameters of the pollutants, hydro-geological data, exposure data, etc.). In this paper, we describe the most important theoretical aspects of polluted soil risk assessment methodologies and the relevant role played, in this kind of analysis, by pollutant transport models. In particular, we describe a new and innovative

  4. Evaluation of data for Sinkhole-development risk models

    NASA Astrophysics Data System (ADS)

    Upchurch, Sam B.; Littlefield, James R.

    1988-10-01

    Before risk assessments for sinkhole damage and indemnification are developed, a database must be created to predict the occurrence and distribution of sinkholes. This database must be evaluated in terms of the following questions: (1) are available records of modern sinkhole development adequate, (2) can the distribution of ancient sinks be used for predictive purposes, and (3) at what areal scale must sinkhole occurrences be evaluated for predictive and risk analysis purposes? Twelve 7.5' quadrangles with varying karst development in Hillsborough County, Florida provide insight into these questions. The area includes 179 modern sinks that developed between 1964 and 1985 and 2,303 ancient sinks. The sinks occur in urban, suburban, agricultural, and major forest wetland areas. The density of ancient sinks ranges from 0.1 to 3.2/km² and averages 1.1/km² for the entire area. The quadrangle area occupied by ancient sinks ranges from 0.3 to 10.2 percent. The distribution of ancient sinkholes within a quadrangle ranges from 0 to over 25 percent of the land surface. In bare karst areas, the sinks are localized along major lineaments, especially at lineament intersections. Where there is covered karst, ancient sinks may be obscured. Modern sinkholes did not develop uniformly through time; annual counts ranged from 0 to 29/yr, with a regional occurrence rate of 7.6/yr. Most were reported in urban or suburban areas and their locations coincide with the lineament-controlled areas of ancient karst. Moving-average analysis indicates that the distribution of modern sinks is highly localized and ranges from 0 to 1.9/km². Chi-square tests show that the distribution of ancient sinks in bare karst areas significantly predicts the locations of modern sinks. In areas of covered karst, the locations of ancient sinkholes do not predict modern sinks. It appears that risk-assessment models for sinkhole development can use the distribution of ancient sinks where bare karst is present. In covered karst areas

  5. Time-based collision risk modeling for air traffic management

    NASA Astrophysics Data System (ADS)

    Bell, Alan E.

    Since the emergence of commercial aviation in the early part of last century, economic forces have driven a steadily increasing demand for air transportation. Increasing density of aircraft operating in a finite volume of airspace is accompanied by a corresponding increase in the risk of collision, and in response to a growing number of incidents and accidents involving collisions between aircraft, governments worldwide have developed air traffic control systems and procedures to mitigate this risk. The objective of any collision risk management system is to project conflicts and provide operators with sufficient opportunity to recognize potential collisions and take necessary actions to avoid them. It is therefore the assertion of this research that the currency of collision risk management is time. Future Air Traffic Management Systems are being designed around the foundational principle of four dimensional trajectory based operations, a method that replaces legacy first-come, first-served sequencing priorities with time-based reservations throughout the airspace system. This research will demonstrate that if aircraft are to be sequenced in four dimensions, they must also be separated in four dimensions. In order to separate aircraft in four dimensions, time must emerge as the primary tool by which air traffic is managed. A functional relationship exists between the time-based performance of aircraft, the interval between aircraft scheduled to cross some three dimensional point in space, and the risk of collision. This research models that relationship and presents two key findings. First, a method is developed by which the ability of an aircraft to meet a required time of arrival may be expressed as a robust standard for both industry and operations. Second, a method by which airspace system capacity may be increased while maintaining an acceptable level of collision risk is presented and demonstrated for the purpose of formulating recommendations for procedures

  6. Assessment of Yellow Fever Epidemic Risk: An Original Multi-criteria Modeling Approach

    PubMed Central

    Briand, Sylvie; Beresniak, Ariel; Nguyen, Tim; Yonli, Tajoua; Duru, Gerard; Kambire, Chantal; Perea, William

    2009-01-01

    Background Yellow fever (YF) virtually disappeared in francophone West African countries as a result of YF mass vaccination campaigns carried out between 1940 and 1953. However, because of the failure to continue mass vaccination campaigns, a resurgence of the deadly disease in many African countries began in the early 1980s. We developed an original modeling approach to assess YF epidemic risk (vulnerability) and to prioritize the populations to be vaccinated. Methods and Findings We chose a two-step assessment of vulnerability at district level consisting of a quantitative and qualitative assessment per country. Quantitative assessment starts with data collection on six risk factors: five risk factors associated with “exposure” to virus/vector and one with “susceptibility” of a district to YF epidemics. The multiple correspondence analysis (MCA) modeling method was specifically adapted to reduce the five exposure variables to one aggregated exposure indicator. Health districts were then projected onto a two-dimensional graph to define different levels of vulnerability. Districts are presented on risk maps for qualitative analysis in consensus groups, allowing the addition of factors, such as population migrations or vector density, that could not be included in MCA. The example of rural districts in Burkina Faso shows five distinct clusters of risk profiles. Based on this assessment, 32 of 55 districts comprising over 7 million people were prioritized for preventive vaccination campaigns. Conclusion This assessment of yellow fever epidemic risk at the district level includes MCA modeling and consensus group modification. MCA provides a standardized way to reduce complexity. It supports an informed public health decision-making process that empowers local stakeholders through the consensus group. This original approach can be applied to any disease with documented risk factors. PMID:19597548
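
    A sketch of the dimension-reduction step, assuming the third-party prince package for MCA; the district-level indicators below are hypothetical stand-ins for the paper's exposure variables.

```python
# Sketch: reducing several categorical exposure indicators to a 2-D plane
# with multiple correspondence analysis (MCA), so districts can be grouped
# into vulnerability clusters. Data are hypothetical.
import pandas as pd
import prince

districts = pd.DataFrame({
    "vector_presence":  ["high", "low", "high", "medium", "low"],
    "forest_cover":     ["yes", "no", "yes", "yes", "no"],
    "unvaccinated_pop": ["high", "medium", "high", "low", "low"],
    "case_history":     ["yes", "no", "yes", "no", "no"],
}, index=["D1", "D2", "D3", "D4", "D5"])

mca = prince.MCA(n_components=2).fit(districts)
coords = mca.row_coordinates(districts)   # districts on the 2-D risk plane
print(coords)
```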

  7. Model risk for European-style stock index options.

    PubMed

    Gençay, Ramazan; Gibson, Rajna

    2007-01-01

    In empirical modeling, there have been two strands for pricing in the options literature, namely the parametric and nonparametric models. Often, the support for the nonparametric methods is based on a benchmark such as the Black-Scholes (BS) model with constant volatility. In this paper, we study the stochastic volatility (SV) and stochastic volatility random jump (SVJ) models as parametric benchmarks against feedforward neural network (FNN) models, a class of neural network models. Our choice of FNN models is due to their well-studied ability to universally approximate an unknown function and its partial derivatives. Since the partial derivatives of an option pricing formula are risk pricing tools, an accurate estimation of the unknown option pricing function is essential for pricing and hedging. Our findings indicate that FNN models offer themselves as robust option pricing tools that outperform their more sophisticated parametric counterparts in predictive settings. Two routes explain the superiority of FNN models over the parametric models in forecast settings: the nonnormality of return distributions and adaptive learning. PMID:17278472
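
    For reference, a sketch of the constant-volatility Black-Scholes benchmark against which the parametric and FNN pricers are judged, together with delta, one of the partial derivatives used as a risk pricing tool. This is the standard closed form, with illustrative inputs.

```python
# Sketch: Black-Scholes European call price and delta.
from math import log, sqrt, exp
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """European call price under constant-volatility Black-Scholes."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

def bs_delta(S, K, T, r, sigma):
    """Hedge ratio dC/dS = N(d1)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return norm.cdf(d1)

print(bs_call(S=100, K=105, T=0.5, r=0.02, sigma=0.2))
print(bs_delta(S=100, K=105, T=0.5, r=0.02, sigma=0.2))
```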

  8. A Spatio-temporal Model of African Animal Trypanosomosis Risk

    PubMed Central

    Dicko, Ahmadou H.; Percoma, Lassane; Sow, Adama; Adam, Yahaya; Mahama, Charles; Sidibé, Issa; Dayo, Guiguigbaza-Kossigan; Thévenon, Sophie; Fonta, William; Sanfo, Safietou; Djiteye, Aligui; Salou, Ernest; Djohan, Vincent; Cecchi, Giuliano; Bouyer, Jérémy

    2015-01-01

    Background African animal trypanosomosis (AAT) is a major constraint to sustainable development of cattle farming in sub-Saharan Africa. The habitat of the tsetse fly vector is increasingly fragmented owing to demographic pressure and shifts in climate, which leads to heterogeneous risk of cyclical transmission both in space and time. In Burkina Faso and Ghana, the most important vectors are riverine species, namely Glossina palpalis gambiensis and G. tachinoides, which are more resilient to human-induced changes than the savannah and forest species. Although many authors have studied the distribution of AAT risk in both space and time, spatio-temporal models that allow it to be predicted are lacking. Methodology/Principal Findings We used datasets generated by various projects, including two baseline surveys conducted in Burkina Faso and Ghana within PATTEC (Pan African Tsetse and Trypanosomosis Eradication Campaign) national initiatives. We computed the entomological inoculation rate (EIR) or tsetse challenge using a range of environmental data. The tsetse apparent density and their infection rate were separately estimated and subsequently combined to derive the EIR using a “one layer-one model” approach. The estimated EIR was then projected into suitable habitat. This risk index was finally validated against data on bovine trypanosomosis. It allowed a good prediction of the parasitological status (r2 = 67%), showed a positive correlation but less predictive power with serological status (r2 = 22%) aggregated at the village level, but was not related to the illness status (r2 = 2%). Conclusions/Significance The presented spatio-temporal model provides a fine-scale picture of the dynamics of AAT risk in sub-humid areas of West Africa. The estimated EIR was high in the proximity of rivers during the dry season and more widespread during the rainy season. The present analysis is a first step in a broader framework for an efficient risk management of climate

  9. The Use of the Satellite Breakup Risk Assessment Model (SBRAM) to Characterize Collision Risk to Manned Spacecraft

    NASA Technical Reports Server (NTRS)

    Matney, Mark J.; Theall, Jeffrey R.; McKay, Gordon A. (Technical Monitor)

    1999-01-01

    NASA uses environment models such as ORDEM96 to characterize the long-term orbital debris collision hazard for spacecraft in LEO. Occasionally, however, there are breakups of satellites or rocket bodies that create enhanced collision hazard for a period of time. This enhanced collision hazard can pose increased risks to space operations - especially those involving manned missions where the tolerance for risk is very low. NASA has developed SBRAM to simulate the enhanced debris environment in the days and weeks that follow such a breakup. This simulation provides the kind of risk probabilities that can be used by mission planners to consider if changes are warranted for the mission. Announcements of breakups come to NASA from US Space Command as soon as they are identified. The pre-breakup orbit and time of breakup are used to determine the initial conditions of the explosion. SBRAM uses the latest explosion models developed at NASA to simulate a debris cloud for the breakup. The model uses a Monte Carlo technique to create a random debris cloud from the probability distributions in the breakup model. Each piece of debris randomly created in the cloud is propagated in a deterministic manner to include the effects of drag and other orbital perturbations. The detailed geometry of each simulated close approach to the target spacecraft is noted and logged and the collision probability is computed using an estimated probability density in down-range and cross-range positions of both the target spacecraft and debris object. The collision probability is computed from the overlap of these probability densities for each close-approach geometry and summed over all computed conjunctions. Cloud propagation runs over the desired time interval are then repeated until the scale of the collision risk can be estimated to a desired precision. This paper presents an overview of the SBRAM model and a number of examples, both real and hypothetical, to demonstrate its use. In addition

  10. Peer Review of NRC Standardized Plant Analysis Risk Models

    SciTech Connect

    Anthony Koonce; James Knudsen; Robert Buell

    2011-03-01

    The Nuclear Regulatory Commission (NRC) Standardized Plant Analysis Risk (SPAR) models underwent a peer review using the ASME PRA standard (Addendum C) as endorsed by NRC in Regulatory Guide (RG) 1.200. The review was performed by a mix of industry probabilistic risk analysis (PRA) experts and NRC PRA experts. Representative SPAR models, one PWR and one BWR, were reviewed against Capability Category I of the ASME PRA standard. Capability Category I was selected as the basis for review due to the specific uses/applications of the SPAR models. The BWR SPAR model was reviewed against 331 ASME PRA standard supporting requirements; however, based on the Capability Category I level of review and the absence of internal flooding and containment performance (LERF) logic, only 216 requirements were determined to be applicable. Based on the review, the BWR SPAR model met 139 of the 216 supporting requirements. The review also generated 200 findings or suggestions (142 findings and 58 suggestions). The PWR SPAR model was evaluated against the same 331 ASME PRA standard supporting requirements. Of these requirements, only 215 were deemed appropriate for the review (for the same reason as noted for the BWR). The PWR review determined that 125 of the 215 supporting requirements met Capability Category I or greater, and it identified 101 findings or suggestions (76 findings and 25 suggestions). These findings and suggestions were developed to identify areas where the SPAR models could be enhanced. A process to prioritize the findings and suggestions and incorporate them into the SPAR models is being developed. The prioritization process focuses on those findings that will enhance the accuracy, completeness and usability of the SPAR models.

  11. Applying risk and resilience models to predicting the effects of media violence on development.

    PubMed

    Prot, Sara; Gentile, Douglas A

    2014-01-01

    Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective. PMID:24851351

  12. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas

    PubMed Central

    Bedford, Tim; Daneshkhah, Alireza

    2015-01-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowicka, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that achieve the required level of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. PMID:26332240
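
    The bivariate (pair) copula is the building block a vine stacks into higher dimensions. Below is a sketch of a Gaussian pair copula with arbitrary heavy-tailed margins attached afterwards; all distributions and the correlation value are illustrative.

```python
# Sketch: sample a bivariate Gaussian copula, attach lognormal and Pareto
# margins, and measure the joint tail probability the copula induces.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
rho = 0.6
cov = [[1.0, rho], [rho, 1.0]]

z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
u = stats.norm.cdf(z)                     # uniform margins: the copula sample

# Attach arbitrary margins, e.g. two heavy-tailed loss distributions
loss1 = stats.lognorm(s=1.0).ppf(u[:, 0])
loss2 = stats.pareto(b=3.0).ppf(u[:, 1])

# Joint tail dependence induced by the copula
p_joint_tail = np.mean((u[:, 0] > 0.99) & (u[:, 1] > 0.99))
print(f"P(both losses above their 99th percentile) = {p_joint_tail:.4f}")
```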

  13. Integrated Assessment Modeling for Carbon Storage Risk and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Bromhal, G. S.; Dilmore, R.; Pawar, R.; Stauffer, P. H.; Gastelum, J.; Oldenburg, C. M.; Zhang, Y.; Chu, S.

    2013-12-01

    The National Risk Assessment Partnership (NRAP) has developed tools to perform quantitative risk assessment at site-specific locations for long-term carbon storage. The approach divides the storage and containment system into components (e.g., reservoirs, seals, wells, groundwater aquifers), develops detailed models for each component, generates reduced order models (ROMs) based on the detailed models, and reconnects the reduced order models within an integrated assessment model (IAM). CO2-PENS, developed at Los Alamos National Lab, is used as the IAM for the simulations in this study. The benefit of this approach is that simulations of the complete system can be generated rapidly enough for Monte Carlo simulation to be performed. In this study, hundreds of thousands of runs of the IAMs have been generated to estimate likelihoods of the quantity of CO2 released to the atmosphere, the size of aquifer impacted by pH, the size of aquifer impacted by TDS, and the size of aquifer with different metals concentrations. Correlations of the output variables with different reservoir, seal, wellbore, and aquifer parameters have been generated. Importance measures have been identified, and inputs have been ranked in order of their impact on the output quantities. The presentation describes the approach used, representative results, and implications for how the Monte Carlo analysis is implemented for uncertainty quantification.
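
    The workflow can be sketched end to end: sample uncertain inputs, push them through a reduced order model, and rank the inputs by a correlation-based importance measure. The stand-in ROM below is a simple polynomial, not one of NRAP's component models, and all parameter ranges are illustrative.

```python
# Sketch: Monte Carlo over a reduced order model (ROM) with Spearman
# rank correlation as a simple input-importance measure.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 50_000
perm_seal = rng.lognormal(-35, 1.0, n)       # seal permeability (arbitrary units)
well_leak = rng.uniform(0.0, 1.0, n)         # wellbore leakage factor
res_press = rng.normal(20.0, 2.0, n)         # reservoir overpressure, MPa

def rom_co2_release(perm, leak, press):
    # Stand-in ROM for CO2 reaching the atmosphere
    return 1e12 * perm * (0.2 + leak) * press

release = rom_co2_release(perm_seal, well_leak, res_press)

for name, x in [("seal perm", perm_seal), ("well leak", well_leak),
                ("pressure", res_press)]:
    rho, _ = spearmanr(x, release)
    print(f"{name:10s} Spearman rho vs release = {rho:+.2f}")
```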

  14. Relapse and Risk-taking among Iranian Methamphetamine Abusers Undergoing Matrix Treatment Model

    PubMed Central

    Taymoori, Parvaneh; Pashaei, Tahereh

    2016-01-01

    Background This study investigated the correlation between risk-taking and relapse among methamphetamine (MA) abusers undergoing the Matrix Model of treatment. Methods This cross-sectional study was conducted on male patients who were stimulant drug abusers undergoing matrix treatment at the National Center for Addiction Research. Convenience (availability) sampling was used, yielding 92 male patients. Demographic and drug abuse questionnaires were completed for each patient. Then, the Balloon Analogue Risk Task (BART) was administered to the patients. Findings Participants had a mean age ± standard deviation (SD) of 27.59 ± 6.60 years with an age range of 17-29 years. Unemployment, unmarried status, criminal offense, and a family history of addiction increased the probability of relapse. In addition, a greater adjusted score on the risk-taking task increased the odds of relapse by more than 97%. The simultaneous abuse of opium and stimulants, compared to the abuse of stimulants only, revealed no statistically significant differences in relapse. Patients with higher risk-taking behavior had a higher probability of relapse. Conclusion This finding indirectly implies the usefulness of the BART in assessing risk-taking behavior in stimulant drug abusers. PMID:27274793

  15. Predicting Risk of Type 2 Diabetes Mellitus with Genetic Risk Models on the Basis of Established Genome-wide Association Markers: A Systematic Review

    PubMed Central

    Bao, Wei; Hu, Frank B.; Rong, Shuang; Rong, Ying; Bowers, Katherine; Schisterman, Enrique F.; Liu, Liegang; Zhang, Cuilin

    2013-01-01

    This study aimed to evaluate the predictive performance of genetic risk models based on risk loci identified and/or confirmed in genome-wide association studies for type 2 diabetes mellitus. A systematic literature search was conducted in the PubMed/MEDLINE and EMBASE databases through April 13, 2012, and published data relevant to the prediction of type 2 diabetes based on genome-wide association marker–based risk models (GRMs) were included. Of the 1,234 potentially relevant articles, 21 articles representing 23 studies were eligible for inclusion. The median area under the receiver operating characteristic curve (AUC) among eligible studies was 0.60 (range, 0.55–0.68), which did not differ appreciably by study design, sample size, participants’ race/ethnicity, or the number of genetic markers included in the GRMs. In addition, the AUCs for type 2 diabetes did not improve appreciably with the addition of genetic markers into conventional risk factor–based models (median AUC, 0.79 (range, 0.63–0.91) vs. median AUC, 0.78 (range, 0.63–0.90), respectively). A limited number of included studies used reclassification measures and yielded inconsistent results. In conclusion, GRMs showed a low predictive performance for risk of type 2 diabetes, irrespective of study design, participants’ race/ethnicity, and the number of genetic markers included. Moreover, the addition of genome-wide association markers into conventional risk models produced little improvement in predictive performance. PMID:24008910
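
    The comparison the review performs can be reproduced on synthetic data: fit a conventional risk-factor model with and without a genetic risk score and compare out-of-sample AUCs. With a weak genetic effect, as below, the improvement is small, mirroring the review's finding. All data and effect sizes are simulated for illustration.

```python
# Sketch: AUC of a conventional risk-factor model vs. the same model plus
# a genetic risk score (GRS), on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)
n = 20_000
conventional = rng.normal(size=(n, 3))            # e.g. age, BMI, glucose
grs = rng.normal(size=(n, 1))                     # aggregated genetic score
logit = 1.2 * conventional[:, 0] + 0.8 * conventional[:, 1] + 0.15 * grs[:, 0]
y = rng.random(n) < 1 / (1 + np.exp(-logit))      # simulated diabetes status

Xc = conventional
Xg = np.hstack([conventional, grs])
Xc_tr, Xc_te, Xg_tr, Xg_te, y_tr, y_te = train_test_split(
    Xc, Xg, y, test_size=0.3, random_state=0)

auc_c = roc_auc_score(y_te, LogisticRegression().fit(Xc_tr, y_tr).predict_proba(Xc_te)[:, 1])
auc_g = roc_auc_score(y_te, LogisticRegression().fit(Xg_tr, y_tr).predict_proba(Xg_te)[:, 1])
print(f"conventional AUC = {auc_c:.3f}, + genetic markers = {auc_g:.3f}")
```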

  16. Future Bloom and Blossom Frost Risk for Malus domestica Considering Climate Model and Impact Model Uncertainties

    PubMed Central

    Hoffmann, Holger; Rath, Thomas

    2013-01-01

    The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and of model concatenation, the significance horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, to 13 climate realizations of the IPCC emission scenario A1B. An advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners was determined by six out of seven phenological models. Single-model/single-grid-point time series of bloom showed significant trends by 2021–2050 compared to 1971–2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, the joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence, however, it is unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078–2087. The projected phenophases advanced by 5.5 d K−1, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day. PMID:24116022

  17. A Risk Prediction Model for Smoking Experimentation in Mexican American Youth

    PubMed Central

    Talluri, Rajesh; Wilkinson, Anna V.; Spitz, Margaret R.; Shete, Sanjay

    2014-01-01

    Background Smoking experimentation in Mexican American youth is problematic. In light of research showing that preventing smoking experimentation is a valid strategy for smoking prevention, there is a need to identify Mexican American youth at high risk for experimentation. Methods A prospective population-based cohort of 1179 adolescents of Mexican descent was followed for 5 years starting in 2005–06. Participants completed a baseline interview at a home visit followed by three telephone interviews at intervals of approximately 6 months and additional interviews at two home visits in 2008–09 and 2010–11. The primary end point of interest in this study was smoking experimentation. Information regarding social, cultural, and behavioral factors (e.g., acculturation, susceptibility to experimentation, home characteristics, household influences) was collected at baseline using validated questionnaires. Results Age, sex, cognitive susceptibility, household smoking behavior, peer influence, neighborhood influence, acculturation, work characteristics, positive outcome expectations, family cohesion, degree of tension, ability to concentrate, and school discipline were found to be associated with smoking experimentation. In a validation dataset, the proposed risk prediction model had an AUC of 0.719 (95% confidence interval, 0.637 to 0.801) for predicting absolute risk for smoking experimentation within 1 year. Conclusions The proposed risk prediction model is able to quantify the risk of smoking experimentation in Mexican American adolescents. PMID:25063521

  18. Has Microsoft® Left Behind Risk Modeling in Cardiac and Thoracic Surgery?

    PubMed Central

    Poullis, Mike

    2011-01-01

    Abstract: This concept paper examines a number of key areas central to quality and risk assessment in cardiac surgery. The effect of surgeon and institutional factors on outcomes in cardiac surgery is used to demonstrate the need to sub-analyze cardiac surgeons' performance in a more sophisticated manner than just operation type and patient risk factors, as in current risk models. By utilizing the mathematical/engineering concept of Fourier analysis in the breakdown of cardiac surgical results, the effects of each of the core components that make up a patient's care package are examined. The core components examined include: institutional, regional, patient, and surgeon effects. The limitations of current additive (Parsonnet, EuroSCORE) and logistic (EuroSCORE, Southern Thoracic Society) regression risk analysis techniques are discussed. The inadequacy of current modeling techniques is demonstrated via known medical formulae for calculating flow in the internal mammary artery and for calculating blood pressure. By examining the fundamental limitations of current risk analysis techniques, a new technique is proposed that embraces modern computing technology via the use of structured query language (SQL). PMID:21449233

  19. Risk-Adjusted Models for Adverse Obstetric Outcomes and Variation in Risk Adjusted Outcomes Across Hospitals

    PubMed Central

    Bailit, Jennifer L.; Grobman, William A.; Rice, Madeline Murguia; Spong, Catherine Y.; Wapner, Ronald J.; Varner, Michael W.; Thorp, John M.; Leveno, Kenneth J.; Caritis, Steve N.; Shubert, Phillip J.; Tita, Alan T. N.; Saade, George; Sorokin, Yoram; Rouse, Dwight J.; Blackwell, Sean C.; Tolosa, Jorge E.; Van Dorsten, J. Peter

    2014-01-01

    Objective Regulatory bodies and insurers evaluate hospital quality using obstetrical outcomes; however, meaningful comparisons should take pre-existing patient characteristics into account. Furthermore, if risk-adjusted outcomes are consistent within a hospital, fewer measures and resources would be needed to assess obstetrical quality. Our objective was to establish risk-adjusted models for five obstetric outcomes and assess hospital performance across these outcomes. Study Design A cohort study of 115,502 women and their neonates born in 25 hospitals in the United States between March 2008 and February 2011. Hospitals were ranked according to their unadjusted and risk-adjusted frequency of venous thromboembolism, postpartum hemorrhage, peripartum infection, severe perineal laceration, and a composite neonatal adverse outcome. Correlations between hospital risk-adjusted outcome frequencies were assessed. Results Venous thromboembolism occurred too infrequently (0.03%, 95% CI 0.02%–0.04%) for meaningful assessment. Other outcomes occurred frequently enough for assessment (postpartum hemorrhage 2.29% (95% CI 2.20–2.38), peripartum infection 5.06% (95% CI 4.93–5.19), severe perineal laceration at spontaneous vaginal delivery 2.16% (95% CI 2.06–2.27), neonatal composite 2.73% (95% CI 2.63–2.84)). Although there was high concordance between unadjusted and adjusted hospital rankings, several individual hospitals had an adjusted rank that was substantially different (as much as 12 rank tiers) from their unadjusted rank. None of the correlations between hospital adjusted outcome frequencies was significant. For example, the hospital with the lowest adjusted frequency of peripartum infection had the highest adjusted frequency of severe perineal laceration. Conclusions Evaluations based on a single risk-adjusted outcome cannot be generalized to overall hospital obstetric performance. PMID:23891630

  20. The characteristics of lightning risk and zoning in Beijing simulated by a risk assessment model

    NASA Astrophysics Data System (ADS)

    Hu, H.; Wang, J.; Pan, J.

    2014-08-01

    In this study, the cloud-to-ground (CG) lightning flash/stroke density was derived from lightning location finder (LLF) data recorded between 2007 and 2011. The vulnerability of land surfaces was then assessed by classifying the study areas into buildings, outdoor areas under the building canopy, and open-field areas, which makes it convenient to deduce the location factor and confirm the protective capability. Subsequently, the potential number of dangerous lightning events at a location could be estimated as the product of the CG stroke density and the location's vulnerability. Although people and their material property are exposed to the same lightning hazard, casualty risk and property-loss risk were assessed separately because their vulnerabilities differ. Our analysis of the CG flash density in Beijing revealed that the valley of JuMaHe to the southwest, the ChangPing-ShunYi zone downwind of the Beijing metropolis, and the mountainous PingGu-MiYun zone near the coast are the most active lightning areas, with densities greater than 1.5 flashes km^-2 year^-1. Moreover, the mountainous northeastern, northern, and northwestern rural areas are relatively more vulnerable to lightning because the high-elevation terrain attracts lightning and there is little protection. In contrast, incidents caused by induced lightning are most likely to occur in densely populated urban areas, where lightning-related property damage is more extensive than in suburban and rural areas; casualty incidents caused by direct lightning strokes, however, seldom occur in urban areas. The simulation based on the lightning risk assessment model (LRAM) demonstrates that casualty risk is higher in rural areas, whereas property-loss risk is higher in urban areas, a conclusion also supported by historical casualty and damage reports.
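
    The risk logic described above reduces to a product of stroke density, exposed area, and vulnerability terms; a minimal sketch follows, with all numeric values illustrative rather than the LRAM parameterization.

        # Expected dangerous lightning events ~ density x area x vulnerability.
        # Numbers are illustrative only, not the LRAM parameterization.

        def dangerous_events_per_year(stroke_density, area_km2, location_factor,
                                      protection_efficiency):
            """Expected annual dangerous events at a site.

            stroke_density        -- CG strokes km^-2 year^-1 (from LLF data)
            area_km2              -- exposed collection area
            location_factor       -- relative exposure (e.g. >1 on high terrain)
            protection_efficiency -- fraction of strokes safely intercepted (0..1)
            """
            return (stroke_density * area_km2 * location_factor
                    * (1.0 - protection_efficiency))

        # Unprotected open field on high terrain vs. protected urban building:
        print(dangerous_events_per_year(1.5, 0.01, 2.0, 0.0))   # 0.03 events/yr
        print(dangerous_events_per_year(1.5, 0.01, 1.0, 0.98))  # 0.0003 events/yr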

  1. Quantifying Systemic Risk by Solutions of the Mean-Variance Risk Model

    PubMed Central

    Morgenstern, Ingo

    2016-01-01

    The world is still recovering from the financial crisis that peaked in September 2008. The triggering event was the bankruptcy of Lehman Brothers. To detect such turmoil, one can investigate the time-dependent behaviour of correlations between assets or indices. These cross-correlations have been connected to the systemic risks within markets by several studies in the aftermath of this crisis. We study 37 different US indices, which cover almost all aspects of the US economy, and show that monitoring an average investor's behaviour can be used to quantify times of increased risk. In this paper, the overall investing strategy is approximated by the ground states of the mean-variance model along the efficient frontier, bound to real-world constraints. Changes in the behaviour of the average investor are utilized as an early warning sign. PMID:27351482

  2. Quantifying Systemic Risk by Solutions of the Mean-Variance Risk Model.

    PubMed

    Jurczyk, Jan; Eckrot, Alexander; Morgenstern, Ingo

    2016-01-01

    The world is still recovering from the financial crisis that peaked in September 2008. The triggering event was the bankruptcy of Lehman Brothers. To detect such turmoil, one can investigate the time-dependent behaviour of correlations between assets or indices. These cross-correlations have been connected to the systemic risks within markets by several studies in the aftermath of this crisis. We study 37 different US indices, which cover almost all aspects of the US economy, and show that monitoring an average investor's behaviour can be used to quantify times of increased risk. In this paper, the overall investing strategy is approximated by the ground states of the mean-variance model along the efficient frontier, bound to real-world constraints. Changes in the behaviour of the average investor are utilized as an early warning sign. PMID:27351482
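
    The ground-state computation along the efficient frontier can be sketched as a constrained quadratic minimization; below, a long-only, fully invested mean-variance optimum is found by projected gradient descent on simulated returns. The solver and data are stand-ins for the authors' approach, not a reproduction of it.

        # Mean-variance "ground state" for one frontier point (long-only,
        # fully invested), via projected gradient descent on simulated data.
        import numpy as np

        def mean_variance_portfolio(returns, risk_aversion=10.0,
                                    steps=5000, lr=0.01):
            """Minimize w'Cw - mu'w/risk_aversion s.t. w >= 0, sum(w) = 1."""
            mu = returns.mean(axis=0)
            C = np.cov(returns, rowvar=False)
            w = np.full(len(mu), 1.0 / len(mu))
            for _ in range(steps):
                grad = 2.0 * C @ w - mu / risk_aversion
                w = np.clip(w - lr * grad, 0.0, None)  # long-only constraint
                w /= w.sum()                           # fully invested
            return w

        rng = np.random.default_rng(0)
        toy_index_returns = rng.normal(0.0005, 0.01, size=(500, 5))  # 5 indices
        print(mean_variance_portfolio(toy_index_returns).round(3))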

  3. Lung dosimetry and risk assessment of nanoparticles: Evaluating and extending current models in rats and humans

    SciTech Connect

    Kuempel, E.D.; Tran, C.L.; Castranova, V.; Bailer, A.J.

    2006-09-15

    Risk assessment of occupational exposure to nanomaterials is needed. Human data are limited, but quantitative data are available from rodent studies. To use these data in risk assessment, a scientifically reasonable approach for extrapolating the rodent data to humans is required. One approach is allometric adjustment for species differences in the relationship between airborne exposure and internal dose. Another approach is lung dosimetry modeling, which provides a biologically-based, mechanistic method to extrapolate doses from animals to humans. However, current mass-based lung dosimetry models may not fully account for differences in the clearance and translocation of nanoparticles. In this article, key steps in quantitative risk assessment are illustrated, using dose-response data in rats chronically exposed to either fine or ultrafine titanium dioxide (TiO{sub 2}), carbon black (CB), or diesel exhaust particulate (DEP). The rat-based estimates of the working lifetime airborne concentrations associated with 0.1% excess risk of lung cancer are approximately 0.07 to 0.3 mg/m{sup 3} for ultrafine TiO{sub 2}, CB, or DEP, and 0.7 to 1.3 mg/m{sup 3} for fine TiO{sub 2}. Comparison of observed versus model-predicted lung burdens in rats shows that the dosimetry models predict reasonably well the retained mass lung burdens of fine or ultrafine poorly soluble particles in rats exposed by chronic inhalation. Additional model validation is needed for nanoparticles of varying characteristics, as well as extension of these models to include particle translocation to organs beyond the lungs. Such analyses would provide improved prediction of nanoparticle dose for risk assessment.
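
    The extrapolation step (estimating the exposure associated with a target excess risk) amounts to fitting a dose-response model and inverting it. The sketch below uses a one-hit model and toy numbers, not the rat datasets analyzed above.

        # Fit a simple dose-response curve, then invert it for the dose at a
        # target excess risk. Toy data; not the TiO2/CB/DEP rat datasets.
        import numpy as np
        from scipy.optimize import brentq, curve_fit

        def one_hit(dose, b):
            return 1.0 - np.exp(-b * dose)  # P(tumor | dose), no background

        doses = np.array([0.0, 2.0, 5.0, 10.0])         # retained burden, mg (toy)
        incidence = np.array([0.0, 0.01, 0.03, 0.055])  # tumor proportions (toy)

        (b_hat,), _ = curve_fit(one_hit, doses, incidence, p0=[0.01])

        target = 0.001  # 0.1% excess risk
        dose_at_target = brentq(lambda d: one_hit(d, b_hat) - target, 0.0, 10.0)
        print(f"dose at {target:.1%} excess risk: {dose_at_target:.3f} mg")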

  4. Soil Cd availability to Indian mustard and environmental risk following EDTA addition to Cd-contaminated soil.

    PubMed

    Jiang, X J; Luo, Y M; Zhao, Q G; Baker, A J M; Christie, P; Wong, M H

    2003-02-01

    A pot experiment was conducted to investigate the influence of EDTA on the extractability of Cd in the soil and uptake of Cd by Indian mustard (Brassica juncea). Twenty levels of soil Cd concentration ranging from 10 to 200 mg kg(-1) were produced by spiking aliquots of a clay loam paddy soil with Cd(NO3)2. One week before the plants were harvested, EDTA was applied to pots in which the soil had been spiked with 20, 40, 60...200 mg Cd kg(-1). The EDTA was added at the rate calculated to complex with all of the Cd added at the 200 mg kg(-1) level. Control pots spiked with 10, 30, 50...190 mg Cd kg(-1) received no EDTA. The plants were harvested after 42 days' growth. Soil water- and NH4NO3-extractable Cd fractions increased rapidly following EDTA application. Root Cd concentrations decreased after EDTA application, but shoot concentrations increased when the soil Cd levels were >130 mg kg(-1) and Cd toxicity symptoms were observed. The increases in soil solution Cd induced by EDTA did not increase plant total Cd uptake but appeared to stimulate the translocation of the metal from roots to shoots when the plants appeared to be under Cd toxicity stress. The results are discussed in relation to the possible mechanisms by which EDTA may change the solubility and bioavailability of Cd in the soil and the potential for plant uptake and environmental risk due to leaching losses to groundwater. PMID:12688496

  5. Biomarkers of leukemia risk: benzene as a model.

    PubMed Central

    Smith, M T; Zhang, L

    1998-01-01

    Although relatively rare, leukemias place a considerable financial burden on society and cause psychologic trauma to many families. Leukemia is the most common cancer in children. The causes of leukemia in adults and children are largely unknown, but occupational and environmental factors are strongly suspected. Genetic predisposition may also play a major role. Our aim is to use molecular epidemiology and toxicology to find the cause of leukemia and develop biomarkers of leukemia risk. We have studied benzene as a model chemical leukemogen, and we have identified risk factors for susceptibility to benzene toxicity. Numerous studies have associated exposure to benzene with increased levels of chromosome aberrations in circulating lymphocytes of exposed workers. Increased levels of chromosome aberrations have, in turn, been correlated with a heightened risk of cancer, especially for hematologic malignancy, in two recent cohort studies in Europe. Conventional chromosome analysis is laborious, however, and requires highly trained personnel. Further, it lacks statistical power, as only a small number of cells can be examined. The recently developed fluorescence in situ hybridization (FISH) and polymerase chain reaction (PCR)-based technologies have allowed the detection of specific chromosome aberrations. These techniques are far less time consuming and are more sensitive than classical chromosomal analysis. Because leukemias commonly show a variety of specific chromosome aberrations, detection of these aberrations by FISH and PCR in peripheral blood may provide improved biomarkers of leukemia risk. PMID:9703476

  6. Risk analysis of nuclear safeguards regulations. [Aggregated Systems Model (ASM)

    SciTech Connect

    Al-Ayat, R.A.; Altman, W.D.; Judd, B.R.

    1982-06-01

    The Aggregated Systems Model (ASM), a probabilistic risk analysis tool for nuclear safeguards, was applied to determine benefits and costs of proposed amendments to NRC regulations governing nuclear material control and accounting systems. The objective of the amendments was to improve the ability to detect insiders attempting to steal large quantities of special nuclear material (SNM). Insider threats range from likely events with minor consequences to unlikely events with catastrophic consequences. Moreover, establishing safeguards regulations is complicated by uncertainties in threats, safeguards performance, and consequences, and by the subjective judgments and difficult trade-offs between risks and safeguards costs. The ASM systematically incorporates these factors in a comprehensive, analytical framework. The ASM was used to evaluate the effectiveness of current safeguards and to quantify the risk of SNM theft. Various modifications designed to meet the objectives of the proposed amendments to reduce that risk were analyzed. Safeguards effectiveness was judged in terms of the probability of detecting and preventing theft, the expected time to detection, and the expected quantity of SNM diverted in a year. Data were gathered in tours and interviews at NRC-licensed facilities. The assessment at each facility was begun by carefully selecting scenarios representing the range of potential insider threats. A team of analysts and facility managers assigned probabilities for detection and prevention events in each scenario. Using the ASM, we computed the measures of system effectiveness and identified cost-effective safeguards modifications that met the objectives of the proposed amendments.

  7. Modeling and managing risk early in software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  8. Guide for developing conceptual models for ecological risk assessments

    SciTech Connect

    Suter, G.W., II

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, routes, media, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; its use will standardize the models so that they are of high quality, useful to the assessment process, and sufficiently consistent that connections between sources of exposure and receptors can be extended across operable units (OUs). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.

  9. Predictive model of avian electrocution risk on overhead power lines.

    PubMed

    Dwyer, J F; Harness, R E; Donohue, K

    2014-02-01

    Electrocution on overhead power structures negatively affects avian populations in diverse ecosystems worldwide, contributes to the endangerment of raptor populations in Europe and Africa, and is a major driver of legal action against electric utilities in North America. We investigated factors associated with avian electrocutions so poles that are likely to electrocute a bird can be identified and retrofitted prior to causing avian mortality. We used historical data from southern California to identify patterns of avian electrocution by voltage, month, and year to identify species most often killed by electrocution in our study area and to develop a predictive model that compared poles where an avian electrocution was known to have occurred (electrocution poles) with poles where no known electrocution occurred (comparison poles). We chose variables that could be quantified by personnel with little training in ornithology or electric systems. Electrocutions were more common at distribution voltages (≤ 33 kV) and during breeding seasons and were more commonly reported after a retrofitting program began. Red-tailed Hawks (Buteo jamaicensis) (n = 265) and American Crows (Corvus brachyrhynchos) (n = 258) were the most commonly electrocuted species. In the predictive model, 4 of 14 candidate variables were required to distinguish electrocution poles from comparison poles: number of jumpers (short wires connecting energized equipment), number of primary conductors, presence of grounding, and presence of unforested unpaved areas as the dominant nearby land cover. When tested against a sample of poles not used to build the model, our model distributed poles relatively normally across electrocution-risk values and identified the average risk as higher for electrocution poles relative to comparison poles. Our model can be used to reduce avian electrocutions through proactive identification and targeting of high-risk poles for retrofitting. PMID:24033371
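
    A hedged sketch of how the four retained predictors could combine in a logistic scoring model for ranking poles follows; the coefficients are invented for illustration and are not the study's fitted values.

        # Hypothetical logistic scoring of poles from the four predictors.
        import math

        COEF = {  # invented coefficients, for illustration only
            "intercept": -3.0,
            "n_jumpers": 0.45,
            "n_primary_conductors": 0.30,
            "grounded": 0.8,
            "unforested_unpaved": 0.6,
        }

        def pole_risk(n_jumpers, n_primary_conductors, grounded,
                      unforested_unpaved):
            """Return P(electrocution pole) from the four pole attributes."""
            z = (COEF["intercept"]
                 + COEF["n_jumpers"] * n_jumpers
                 + COEF["n_primary_conductors"] * n_primary_conductors
                 + COEF["grounded"] * grounded
                 + COEF["unforested_unpaved"] * unforested_unpaved)
            return 1.0 / (1.0 + math.exp(-z))

        # Rank an inventory so the highest-risk poles are retrofitted first:
        poles = {"A": (0, 1, False, True), "B": (6, 3, True, True),
                 "C": (2, 2, False, False)}
        for pid in sorted(poles, key=lambda p: pole_risk(*poles[p]), reverse=True):
            print(f"pole {pid}: P = {pole_risk(*poles[pid]):.2f}")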

  10. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    PubMed

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, where the number of notified cases was the highest among developed countries in 2006. There is therefore a need for a model or tool that predicts the number of campylobacteriosis cases accurately, as the Microbial Risk Assessment Model previously used for this purpose failed to do so. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data involves additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from 1998 to 2008 were used to model the time series, with the year 2009 held out for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. The prediction from the additive ARIMA model with intervention was slightly better than that from the Holt-Winters multiplicative method for the annual total in 2010, the former predicting only 23 cases fewer than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts to address the campylobacteriosis epidemic. PMID:23551848
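
    Both model families are available in standard statistical software; the sketch below fits them with statsmodels on a simulated monthly series, with a placeholder intervention date standing in for the actual New Zealand interventions.

        # Holt-Winters and ARIMA-with-intervention on a simulated series.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        idx = pd.date_range("1998-01", periods=132, freq="MS")  # 1998-2008, monthly
        rng = np.random.default_rng(1)
        cases = pd.Series(200 + 50 * np.sin(2 * np.pi * idx.month / 12)
                          + rng.normal(0, 15, 132), index=idx)

        # Holt-Winters with multiplicative seasonality:
        hw = ExponentialSmoothing(cases, trend="add", seasonal="mul",
                                  seasonal_periods=12).fit()

        # ARIMA with a step-change regressor (hypothetical intervention date):
        step = (idx >= "2008-01-01").astype(float)
        arima = ARIMA(cases, exog=step, order=(1, 0, 1),
                      seasonal_order=(1, 0, 0, 12)).fit()

        print(hw.forecast(12).round(0))                      # hold-out forecast
        print(arima.forecast(12, exog=np.ones(12)).round(0))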

  11. A risk assessment example for soil invertebrates using spatially explicit agent-based models.

    PubMed

    Reed, Melissa; Alvarez, Tania; Chelinho, Sónia; Forbes, Valery; Johnston, Alice; Meli, Mattia; Voss, Frank; Pastorok, Rob

    2016-01-01

    refined risk assessment, using ABMs and population-level endpoints while yielding outputs that directly address the protection goals. We recommend choosing model outputs that are closely related to specific protection goals, using available toxicity data and accepted fate models to the extent possible in parameterizing models to minimize additional data needs and testing, evaluating, and documenting models following recent guidance. PMID:26411378

  12. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Astrophysics Data System (ADS)

    Grant, W.; Lutomski, M.

    2012-01-01

    The International Space Station (ISS) program is continuing to expand the use of Probabilistic Risk Assessments (PRAs). The use of PRAs in the ISS decision-making process has proven very successful over the past 8 years. PRAs are used in the decision-making process to address significant operational and design issues as well as to identify, communicate, and mitigate risks. Future PRAs are expected to have major impacts on not only the ISS, but also future NASA programs and projects. Many of these PRAs will have their foundation in the current ISS PRA model and in PRA trade studies that are being developed for the ISS Program. ISS PRAs have supported: development of reliability requirements for future NASA and commercial spacecraft; determination of inherent risk for visiting vehicles; evaluation of potential crew rescue scenarios; operational requirements and alternatives; planning of extravehicular activities (EVAs); and evaluation of robotics operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decisions that were made.

  13. Evolutionary systemic modelling of practices on flood risk

    NASA Astrophysics Data System (ADS)

    Khatibi, Rahman

    2011-04-01

    Summary: Over time since prehistory, interactions with floods have undergone evolutionary transitions, including aversion to flood risk, flood defence and flood risk management, each serving as a mindset or paradigm. Historic data describing these interactions are used in this paper to "model" these transitions and to explain them. This is a new bottom-up modelling capability based on a set of postulates integrating: (i) systemic thinking, where systems are affected by four types of feedback loops to be described in the paper, which include positive/negative feedback; and (ii) evolutionary thinking, where each feedback loop is associated with a "risk mindset." These mindsets can undergo evolutionary transition from one to the next, and the transition is largely driven by natural selection. After an evolutionary transition, lower mindsets do not necessarily disappear but can adapt and coexist with higher-order loops. Based on the insight gained, the paper argues that (i) as the loops coexist pluralistically, systems increase in their complexity; (ii) there may be unexpected dynamic behaviours when a system is subject to different types of feedback loops; and (iii) currently, these dynamic behaviours are overlooked, suggesting possible loopholes, bottlenecks or barriers, and hence the motivation for this paper.

  14. Modeling Linkage Disequilibrium Increases Accuracy of Polygenic Risk Scores.

    PubMed

    Vilhjálmsson, Bjarni J; Yang, Jian; Finucane, Hilary K; Gusev, Alexander; Lindström, Sara; Ripke, Stephan; Genovese, Giulio; Loh, Po-Ru; Bhatia, Gaurav; Do, Ron; Hayeck, Tristan; Won, Hong-Hee; Kathiresan, Sekar; Pato, Michele; Pato, Carlos; Tamimi, Rulla; Stahl, Eli; Zaitlen, Noah; Pasaniuc, Bogdan; Belbin, Gillian; Kenny, Eimear E; Schierup, Mikkel H; De Jager, Philip; Patsopoulos, Nikolaos A; McCarroll, Steve; Daly, Mark; Purcell, Shaun; Chasman, Daniel; Neale, Benjamin; Goddard, Michael; Visscher, Peter M; Kraft, Peter; Patterson, Nick; Price, Alkes L

    2015-10-01

    Polygenic risk scores have shown great promise in predicting complex disease risk and will become more accurate as training sample sizes increase. The standard approach for calculating risk scores involves linkage disequilibrium (LD)-based marker pruning and applying a p value threshold to association statistics, but this discards information and can reduce predictive accuracy. We introduce LDpred, a method that infers the posterior mean effect size of each marker by using a prior on effect sizes and LD information from an external reference panel. Theory and simulations show that LDpred outperforms the approach of pruning followed by thresholding, particularly at large sample sizes. Accordingly, predicted R(2) increased from 20.1% to 25.3% in a large schizophrenia dataset and from 9.8% to 12.0% in a large multiple sclerosis dataset. A similar relative improvement in accuracy was observed for three additional large disease datasets and for non-European schizophrenia samples. The advantage of LDpred over existing methods will grow as sample sizes increase. PMID:26430803
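
    Under its infinitesimal prior, LDpred's posterior mean has a closed form within each LD region: shrink the marginal effects jointly through the local LD matrix. A sketch with a toy two-marker region follows; all values are illustrative.

        # LDpred-style infinitesimal shrinkage (per the abstract's description):
        # posterior mean = (M/(N*h2) * I + D)^(-1) @ beta_marginal.
        import numpy as np

        def ldpred_inf(beta_marginal, D, n_samples, h2, n_markers):
            """Joint shrinkage of GWAS effects through a reference LD matrix D."""
            shrink = (n_markers / (n_samples * h2)) * np.eye(len(beta_marginal))
            return np.linalg.solve(shrink + D, beta_marginal)

        # Two markers in strong LD share one signal; shrinkage splits it:
        D = np.array([[1.0, 0.9],
                      [0.9, 1.0]])
        beta_hat = np.array([0.050, 0.045])
        print(ldpred_inf(beta_hat, D, n_samples=50_000, h2=0.5, n_markers=100_000))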

  15. Modeling Linkage Disequilibrium Increases Accuracy of Polygenic Risk Scores

    PubMed Central

    Vilhjálmsson, Bjarni J.; Yang, Jian; Finucane, Hilary K.; Gusev, Alexander; Lindström, Sara; Ripke, Stephan; Genovese, Giulio; Loh, Po-Ru; Bhatia, Gaurav; Do, Ron; Hayeck, Tristan; Won, Hong-Hee; Ripke, Stephan; Neale, Benjamin M.; Corvin, Aiden; Walters, James T.R.; Farh, Kai-How; Holmans, Peter A.; Lee, Phil; Bulik-Sullivan, Brendan; Collier, David A.; Huang, Hailiang; Pers, Tune H.; Agartz, Ingrid; Agerbo, Esben; Albus, Margot; Alexander, Madeline; Amin, Farooq; Bacanu, Silviu A.; Begemann, Martin; Belliveau, Richard A.; Bene, Judit; Bergen, Sarah E.; Bevilacqua, Elizabeth; Bigdeli, Tim B.; Black, Donald W.; Bruggeman, Richard; Buccola, Nancy G.; Buckner, Randy L.; Byerley, William; Cahn, Wiepke; Cai, Guiqing; Campion, Dominique; Cantor, Rita M.; Carr, Vaughan J.; Carrera, Noa; Catts, Stanley V.; Chambert, Kimberly D.; Chan, Raymond C.K.; Chen, Ronald Y.L.; Chen, Eric Y.H.; Cheng, Wei; Cheung, Eric F.C.; Chong, Siow Ann; Cloninger, C. Robert; Cohen, David; Cohen, Nadine; Cormican, Paul; Craddock, Nick; Crowley, James J.; Curtis, David; Davidson, Michael; Davis, Kenneth L.; Degenhardt, Franziska; Del Favero, Jurgen; DeLisi, Lynn E.; Demontis, Ditte; Dikeos, Dimitris; Dinan, Timothy; Djurovic, Srdjan; Donohoe, Gary; Drapeau, Elodie; Duan, Jubao; Dudbridge, Frank; Durmishi, Naser; Eichhammer, Peter; Eriksson, Johan; Escott-Price, Valentina; Essioux, Laurent; Fanous, Ayman H.; Farrell, Martilias S.; Frank, Josef; Franke, Lude; Freedman, Robert; Freimer, Nelson B.; Friedl, Marion; Friedman, Joseph I.; Fromer, Menachem; Genovese, Giulio; Georgieva, Lyudmila; Gershon, Elliot S.; Giegling, Ina; Giusti-Rodrguez, Paola; Godard, Stephanie; Goldstein, Jacqueline I.; Golimbet, Vera; Gopal, Srihari; Gratten, Jacob; Grove, Jakob; de Haan, Lieuwe; Hammer, Christian; Hamshere, Marian L.; Hansen, Mark; Hansen, Thomas; Haroutunian, Vahram; Hartmann, Annette M.; Henskens, Frans A.; Herms, Stefan; Hirschhorn, Joel N.; Hoffmann, Per; Hofman, Andrea; Hollegaard, Mads V.; Hougaard, David M.; Ikeda, Masashi; Joa, Inge; Julia, Antonio; Kahn, Rene S.; Kalaydjieva, Luba; Karachanak-Yankova, Sena; Karjalainen, Juha; Kavanagh, David; Keller, Matthew C.; Kelly, Brian J.; Kennedy, James L.; Khrunin, Andrey; Kim, Yunjung; Klovins, Janis; Knowles, James A.; Konte, Bettina; Kucinskas, Vaidutis; Kucinskiene, Zita Ausrele; Kuzelova-Ptackova, Hana; Kahler, Anna K.; Laurent, Claudine; Keong, Jimmy Lee Chee; Lee, S. Hong; Legge, Sophie E.; Lerer, Bernard; Li, Miaoxin; Li, Tao; Liang, Kung-Yee; Lieberman, Jeffrey; Limborska, Svetlana; Loughland, Carmel M.; Lubinski, Jan; Lnnqvist, Jouko; Macek, Milan; Magnusson, Patrik K.E.; Maher, Brion S.; Maier, Wolfgang; Mallet, Jacques; Marsal, Sara; Mattheisen, Manuel; Mattingsdal, Morten; McCarley, Robert W.; McDonald, Colm; McIntosh, Andrew M.; Meier, Sandra; Meijer, Carin J.; Melegh, Bela; Melle, Ingrid; Mesholam-Gately, Raquelle I.; Metspalu, Andres; Michie, Patricia T.; Milani, Lili; Milanova, Vihra; Mokrab, Younes; Morris, Derek W.; Mors, Ole; Mortensen, Preben B.; Murphy, Kieran C.; Murray, Robin M.; Myin-Germeys, Inez; Mller-Myhsok, Bertram; Nelis, Mari; Nenadic, Igor; Nertney, Deborah A.; Nestadt, Gerald; Nicodemus, Kristin K.; Nikitina-Zake, Liene; Nisenbaum, Laura; Nordin, Annelie; O’Callaghan, Eadbhard; O’Dushlaine, Colm; O’Neill, F. 
Anthony; Oh, Sang-Yun; Olincy, Ann; Olsen, Line; Van Os, Jim; Pantelis, Christos; Papadimitriou, George N.; Papiol, Sergi; Parkhomenko, Elena; Pato, Michele T.; Paunio, Tiina; Pejovic-Milovancevic, Milica; Perkins, Diana O.; Pietilinen, Olli; Pimm, Jonathan; Pocklington, Andrew J.; Powell, John; Price, Alkes; Pulver, Ann E.; Purcell, Shaun M.; Quested, Digby; Rasmussen, Henrik B.; Reichenberg, Abraham; Reimers, Mark A.; Richards, Alexander L.; Roffman, Joshua L.; Roussos, Panos; Ruderfer, Douglas M.; Salomaa, Veikko; Sanders, Alan R.; Schall, Ulrich; Schubert, Christian R.; Schulze, Thomas G.; Schwab, Sibylle G.; Scolnick, Edward M.; Scott, Rodney J.; Seidman, Larry J.; Shi, Jianxin; Sigurdsson, Engilbert; Silagadze, Teimuraz; Silverman, Jeremy M.; Sim, Kang; Slominsky, Petr; Smoller, Jordan W.; So, Hon-Cheong; Spencer, Chris C.A.; Stahl, Eli A.; Stefansson, Hreinn; Steinberg, Stacy; Stogmann, Elisabeth; Straub, Richard E.; Strengman, Eric; Strohmaier, Jana; Stroup, T. Scott; Subramaniam, Mythily; Suvisaari, Jaana; Svrakic, Dragan M.; Szatkiewicz, Jin P.; Sderman, Erik; Thirumalai, Srinivas; Toncheva, Draga; Tooney, Paul A.; Tosato, Sarah; Veijola, Juha; Waddington, John; Walsh, Dermot; Wang, Dai; Wang, Qiang; Webb, Bradley T.; Weiser, Mark; Wildenauer, Dieter B.; Williams, Nigel M.; Williams, Stephanie; Witt, Stephanie H.; Wolen, Aaron R.; Wong, Emily H.M.; Wormley, Brandon K.; Wu, Jing Qin; Xi, Hualin Simon; Zai, Clement C.; Zheng, Xuebin; Zimprich, Fritz; Wray, Naomi R.; Stefansson, Kari; Visscher, Peter M.; Adolfsson, Rolf; Andreassen, Ole A.; Blackwood, Douglas H.R.; Bramon, Elvira; Buxbaum, Joseph D.; Børglum, Anders D.; Cichon, Sven; Darvasi, Ariel; Domenici, Enrico; Ehrenreich, Hannelore; Esko, Tonu; Gejman, Pablo V.; Gill, Michael; Gurling, Hugh; Hultman, Christina M.; Iwata, Nakao; Jablensky, Assen V.; Jonsson, Erik G.; Kendler, Kenneth S.; Kirov, George; Knight, Jo; Lencz, Todd; Levinson, Douglas F.; Li, Qingqin S.; Liu, Jianjun; Malhotra, Anil K.; McCarroll, Steven A.; McQuillin, Andrew; Moran, Jennifer L.; Mortensen, Preben B.; Mowry, Bryan J.; Nthen, Markus M.; Ophoff, Roel A.; Owen, Michael J.; Palotie, Aarno; Pato, Carlos N.; Petryshen, Tracey L.; Posthuma, Danielle; Rietschel, Marcella; Riley, Brien P.; Rujescu, Dan; Sham, Pak C.; Sklar, Pamela; St. Clair, David; Weinberger, Daniel R.; Wendland, Jens R.; Werge, Thomas; Daly, Mark J.; Sullivan, Patrick F.; O’Donovan, Michael C.; Kraft, Peter; Hunter, David J.; Adank, Muriel; Ahsan, Habibul; Aittomäki, Kristiina; Baglietto, Laura; Berndt, Sonja; Blomquist, Carl; Canzian, Federico; Chang-Claude, Jenny; Chanock, Stephen J.; Crisponi, Laura; Czene, Kamila; Dahmen, Norbert; Silva, Isabel dos Santos; Easton, Douglas; Eliassen, A. 
Heather; Figueroa, Jonine; Fletcher, Olivia; Garcia-Closas, Montserrat; Gaudet, Mia M.; Gibson, Lorna; Haiman, Christopher A.; Hall, Per; Hazra, Aditi; Hein, Rebecca; Henderson, Brian E.; Hofman, Albert; Hopper, John L.; Irwanto, Astrid; Johansson, Mattias; Kaaks, Rudolf; Kibriya, Muhammad G.; Lichtner, Peter; Lindström, Sara; Liu, Jianjun; Lund, Eiliv; Makalic, Enes; Meindl, Alfons; Meijers-Heijboer, Hanne; Müller-Myhsok, Bertram; Muranen, Taru A.; Nevanlinna, Heli; Peeters, Petra H.; Peto, Julian; Prentice, Ross L.; Rahman, Nazneen; Sánchez, María José; Schmidt, Daniel F.; Schmutzler, Rita K.; Southey, Melissa C.; Tamimi, Rulla; Travis, Ruth; Turnbull, Clare; Uitterlinden, Andre G.; van der Luijt, Rob B.; Waisfisz, Quinten; Wang, Zhaoming; Whittemore, Alice S.; Yang, Rose; Zheng, Wei; Kathiresan, Sekar; Pato, Michele; Pato, Carlos; Tamimi, Rulla; Stahl, Eli; Zaitlen, Noah; Pasaniuc, Bogdan; Belbin, Gillian; Kenny, Eimear E.; Schierup, Mikkel H.; De Jager, Philip; Patsopoulos, Nikolaos A.; McCarroll, Steve; Daly, Mark; Purcell, Shaun; Chasman, Daniel; Neale, Benjamin; Goddard, Michael; Visscher, Peter M.; Kraft, Peter; Patterson, Nick; Price, Alkes L.

    2015-01-01

    Polygenic risk scores have shown great promise in predicting complex disease risk and will become more accurate as training sample sizes increase. The standard approach for calculating risk scores involves linkage disequilibrium (LD)-based marker pruning and applying a p value threshold to association statistics, but this discards information and can reduce predictive accuracy. We introduce LDpred, a method that infers the posterior mean effect size of each marker by using a prior on effect sizes and LD information from an external reference panel. Theory and simulations show that LDpred outperforms the approach of pruning followed by thresholding, particularly at large sample sizes. Accordingly, predicted R2 increased from 20.1% to 25.3% in a large schizophrenia dataset and from 9.8% to 12.0% in a large multiple sclerosis dataset. A similar relative improvement in accuracy was observed for three additional large disease datasets and for non-European schizophrenia samples. The advantage of LDpred over existing methods will grow as sample sizes increase. PMID:26430803

  16. An approach towards risk assessment for the use of a synergistic metallic diesel particulate filter (DPF) regeneration additive

    NASA Astrophysics Data System (ADS)

    Cook, S. L.; Richards, P. J.

    The motivations for legislation to set diesel emissions limits requiring the use of diesel particulate filters (DPF) are summarised. If the DPF is to be used, demonstration of regeneration (combustion of collected carbonaceous material) without additional emission problems is important. Potential metal emissions resulting from use of a synergistic Fe/Sr fuel-borne DPF regeneration catalyst are evaluated. Measurements over the legislated drive cycle estimate that the metals comprise 1-2% of the solid material emitted and that the DPF collects >99% of such material. Diesel particulate matter is used as a marker, and from existing air quality and emission inventory measurements, maximum conceivable increases of <1 ng m^-3 and <250 pg m^-3 for iron and strontium, respectively, are calculated. Against environmental assessment levels derived from occupational exposure limits, these increases are not significant. For humans, daily ingress of airborne Sr is estimated at 3.5 ng, which is small compared to the known Sr contents of lungs, blood and the daily diet. In the context of reductions of other metals, particulate matter and pollutant emissions, the overall assessment is that using these metals to enable a DPF yields a significant net environmental benefit.

  17. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
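
    The time line/queue/scheduler mechanics described above can be sketched as a small discrete-event simulation; the event names and rates below are hypothetical, not IMM medical conditions.

        # One Monte Carlo instance of a dynamic PRA: events queued on a time
        # line and fired in order, with later hazards depending on history.
        import heapq
        import random

        def simulate_mission(duration_days, rates, rng):
            queue = []  # the planner's time line: (time, event)
            for event, rate in rates.items():
                t = rng.expovariate(rate)
                if t < duration_days:
                    heapq.heappush(queue, (t, event))
            history = []
            while queue:
                t, event = heapq.heappop(queue)  # scheduler: earliest first
                history.append((round(t, 1), event))
                if event == "minor_illness":
                    # dependency along the time line: a prior event raises the
                    # hazard of a later complication (hypothetical factor of 5)
                    t2 = t + rng.expovariate(5 * rates["complication"])
                    if t2 < duration_days:
                        heapq.heappush(queue, (t2, "complication"))
            return history

        rng = random.Random(42)
        rates = {"minor_illness": 1 / 90, "complication": 1 / 1000}  # per day
        print(simulate_mission(180, rates, rng))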

  18. DOSE-RESPONSE MODELING FOR ASSESSING CUMULATIVE PESTICIDE RISK

    EPA Science Inventory

    This project is still in its early phases. Future work in this area will involve theoretical analyses of the limits of dose-additivity assumptions, and physiologically-based pharmacokinetic and pharmacodynamic (PBPK/PD) models for n-methyl carbamate and pyrethroid pesticides (in ...

  19. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    At the request of NASA, the National Research Council's (NRC's) Committee for Evaluation of Space Radiation Cancer Risk Model reviewed a number of changes that NASA proposes to make to its model for estimating the risk of radiation-induced cancer in astronauts. The NASA model in current use was last updated in 2005, and the proposed model would incorporate recent research directed at improving the quantification and understanding of the health risks posed by the space radiation environment. NASA's proposed model is defined by the 2011 NASA report Space Radiation Cancer Risk Projections and Uncertainties--2010. The committee's evaluation is based primarily on this source, which is referred to hereafter as the 2011 NASA report, with mention of specific sections or tables. The overall process for estimating cancer risks due to low linear energy transfer (LET) radiation exposure has been fully described in reports by a number of organizations. The approaches described in the reports from all of these expert groups are quite similar. NASA's proposed space radiation cancer risk assessment model calculates, as its main output, age- and gender-specific risk of exposure-induced death (REID) for use in the estimation of mission and astronaut-specific cancer risk. The model also calculates the associated uncertainties in REID. The general approach for estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements. However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as

  20. Additive and interaction effects at three amino acid positions in HLA-DQ and HLA-DR molecules drive type 1 diabetes risk

    PubMed Central

    Hu, Xinli; Deutsch, Aaron J; Lenz, Tobias L; Onengut-Gumuscu, Suna; Han, Buhm; Chen, Wei-Min; Howson, Joanna M M; Todd, John A; de Bakker, Paul I W; Rich, Stephen S; Raychaudhuri, Soumya

    2016-01-01

    Variation in the human leukocyte antigen (HLA) genes accounts for one-half of the genetic risk in type 1 diabetes (T1D). Amino acid changes in the HLA-DR and HLA-DQ molecules mediate most of the risk, but extensive linkage disequilibrium complicates the localization of independent effects. Using 18,832 case-control samples, we localized the signal to 3 amino acid positions in HLA-DQ and HLA-DR. HLA-DQβ1 position 57 (previously known; P = 1 × 10^-1355) by itself explained 15.2% of the total phenotypic variance. Independent effects at HLA-DRβ1 positions 13 (P = 1 × 10^-721) and 71 (P = 1 × 10^-95) increased the proportion of variance explained to 26.9%. The three positions together explained 90% of the phenotypic variance in the HLA-DRB1–HLA-DQA1–HLA-DQB1 locus. Additionally, we observed significant interactions for 11 of 21 pairs of common HLA-DRB1–HLA-DQA1–HLA-DQB1 haplotypes (P = 1.6 × 10^-64). HLA-DRβ1 positions 13 and 71 implicate the P4 pocket in the antigen-binding groove, thus pointing to another critical protein structure for T1D risk, in addition to the HLA-DQ P9 pocket. PMID:26168013

  1. Risk of adverse events with bevacizumab addition to therapy in advanced non-small-cell lung cancer: a meta-analysis of randomized controlled trials

    PubMed Central

    Lai, Xi-Xi; Xu, Ren-Ai; Yu-Ping, Li; Yang, Han

    2016-01-01

    Background Bevacizumab, a monoclonal antibody against vascular endothelial growth factor ligand, has shown survival benefits in the treatment of many types of malignant tumors, including non-small-cell lung cancer (NSCLC). We conducted this systematic review and meta-analysis to investigate the risk of the most clinically relevant adverse events related to bevacizumab in advanced NSCLC. Methods Databases from PubMed, Web of Science, and Cochrane Library up to August 2015, were searched to identify relevant studies. We included prospective randomized controlled Phase II/III clinical trials that compared therapy with or without bevacizumab for advanced NSCLC. Summary relative risk (RR) and 95% confidence intervals were calculated using random effects or fixed effects according to the heterogeneity among included trials. Results A total of 3,745 patients from nine clinical trials were included in the meta-analysis. Summary RRs showed a statistically significant bevacizumab-associated increased risk in three of the adverse outcomes studied: proteinuria (RR =7.55), hypertension (RR =5.34), and hemorrhagic events (RR =2.61). No statistically significant differences were found for gastrointestinal perforation (P=0.60), arterial and venous thromboembolic events (P=0.35 and P=0.92, respectively), or fatal events (P=0.29). Conclusion The addition of bevacizumab to therapy in advanced NSCLC did significantly increase the risk of proteinuria, hypertension, and hemorrhagic events but not arterial/venous thromboembolic events, gastrointestinal perforation, or fatal adverse events. PMID:27143937
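
    The summary RRs above come from pooling per-trial relative risks by inverse-variance weighting; a fixed-effect sketch on invented trial counts (not the nine included studies) follows.

        # Fixed-effect (inverse-variance) pooling of relative risks.
        import math

        # (events_bev, n_bev, events_ctrl, n_ctrl) per trial -- invented:
        trials = [(30, 400, 6, 390), (21, 350, 4, 360), (15, 250, 3, 240)]

        log_rrs, weights = [], []
        for a, n1, c, n2 in trials:
            log_rrs.append(math.log((a / n1) / (c / n2)))
            var = 1 / a - 1 / n1 + 1 / c - 1 / n2  # variance of log RR
            weights.append(1 / var)

        pooled = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
        se = math.sqrt(1 / sum(weights))
        lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
        print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")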

  2. Improving nutrient management practices in agriculture: The role of risk-based beliefs in understanding farmers' attitudes toward taking additional action

    NASA Astrophysics Data System (ADS)

    Wilson, Robyn S.; Howard, Gregory; Burnett, Elizabeth A.

    2014-08-01

    A recent increase in the amount of dissolved reactive phosphorus (DRP) entering the western Lake Erie basin is likely due to increased spring storm events in combination with issues related to fertilizer application and timing. These factors in combination with warmer lake temperatures have amplified the spread of toxic algal blooms. We assessed the attitudes of farmers in northwest Ohio toward taking at least one additional action to reduce nutrient loss on their farm. Specifically, we (1) identified to what extent farm and farmer characteristics (e.g., age, gross farm sales) as well as risk-based beliefs (e.g., efficacy, risk perception) influenced attitudes, and (2) assessed how these characteristics and beliefs differ in their predictive ability based on unobservable latent classes of farmers. Risk perception, or a belief that negative impacts to profit and water quality from nutrient loss were likely, was the most consistent predictor of farmer attitudes. Response efficacy, or a belief that taking action on one's farm made a difference, was found to significantly influence attitudes, although this belief was particularly salient for the minority class of farmers who were older and more motivated by profit. Communication efforts should focus on the negative impacts of nutrient loss to both the farm (i.e., profit) and the natural environment (i.e., water quality) to raise individual perceived risk among the majority, while the minority need higher perceived efficacy or more specific information about the economic effectiveness of particular recommended practices.

  3. GERMcode: A Stochastic Model for Space Radiation Risk Assessment

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2012-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal
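
    One biophysical property mentioned above, the Poisson distribution of particle hits for a given cellular area and fluence, can be written down directly; the area and fluence values below are illustrative.

        # P(n hits) for a cell nucleus: Poisson with mean = fluence x area.
        import math

        def hit_distribution(fluence_per_um2, area_um2, max_hits=5):
            lam = fluence_per_um2 * area_um2  # mean number of traversals
            return [lam ** n * math.exp(-lam) / math.factorial(n)
                    for n in range(max_hits + 1)]

        # ~100 um^2 nucleus at 0.02 particles/um^2 (mean of 2 hits):
        for n, p in enumerate(hit_distribution(0.02, 100.0)):
            print(f"P({n} hits) = {p:.3f}")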

  4. Factor analysis models for structuring covariance matrices of additive genetic effects: a Bayesian implementation

    PubMed Central

    de los Campos, Gustavo; Gianola, Daniel

    2007-01-01

    Multivariate linear models are increasingly important in quantitative genetics. In high-dimensional specifications, factor analysis (FA) may provide an avenue for structuring (co)variance matrices, thus reducing the number of parameters needed for describing (co)dispersion. We describe how FA can be used to model genetic effects in the context of a multivariate linear mixed model. An orthogonal common factor structure is used to model genetic effects under a Gaussian assumption, so that the marginal likelihood is multivariate normal with a structured genetic (co)variance matrix. Under standard prior assumptions, all fully conditional distributions have closed form, and samples from the joint posterior distribution can be obtained via Gibbs sampling. The model and the algorithm developed for its Bayesian implementation were used to describe five repeated records of milk yield in dairy cattle, and a one-common-factor FA model was compared with a standard multiple-trait model. The Bayesian Information Criterion favored the FA model. PMID:17897592
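
    The dimension reduction FA buys comes from structuring the genetic covariance as G = LL' + diag(psi), with loadings L and trait-specific variances psi; the sketch below builds such a matrix for one common factor with illustrative values.

        # Structured genetic covariance under a one-common-factor FA model.
        import numpy as np

        L = np.array([[0.90], [0.80], [0.85], [0.70], [0.75]])  # loadings (toy)
        psi = np.array([0.20, 0.25, 0.20, 0.30, 0.28])          # specifics (toy)

        G = L @ L.T + np.diag(psi)  # 5x5 genetic (co)variance matrix
        print(G.round(2))
        # 5 loadings + 5 specific variances = 10 parameters, versus 15 for an
        # unstructured 5x5 covariance; this is the reduction FA provides.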

  5. Collaborative modelling for interactive participation in urban flood risk management

    NASA Astrophysics Data System (ADS)

    Evers, M.

    2012-04-01

    This paper presents an attempt to enhance the role of local stakeholders in dealing with urban floods. The concept is based on the DIANE-CM project (Decentralised Integrated Analysis and Enhancement of Awareness through Collaborative Modelling and Management of Flood Risk) of the ERANET CRUE programme. The main objective of the project was to develop and test the advanced methodology for enhancing the resilience of the local communities to flooding by a participative and interactive approach. Through collaborative modelling, a social learning process was initiated which will enhance the social capacity of the stakeholders due to the interaction process. The other aim of the project was to better understand how data from hazard and vulnerability analyses and improved maps, as well as from the near real time flood prediction, can be used to initiate a public dialogue (i.e. collaborative mapping and planning activities) in order to carry out more informed and shared decision making processes and to enhance flood risk awareness - which will improve the flood resilience situation. The concept of collaborative modelling was applied in two case studies: (1) the Roding river/Cranbrook catchment in the UK, with focus on pluvial flooding, and (2) the Alster catchment in Germany, with focus on fluvial flooding.

  6. How TK-TD and population models for aquatic macrophytes could support the risk assessment for plant protection products.

    PubMed

    Hommen, Udo; Schmitt, Walter; Heine, Simon; Brock, Theo Cm; Duquesne, Sabine; Manson, Phil; Meregalli, Giovanna; Ochoa-Acuña, Hugo; van Vliet, Peter; Arts, Gertie

    2016-01-01

    This case study of the Society of Environmental Toxicology and Chemistry (SETAC) workshop MODELINK demonstrates the potential use of mechanistic effects models for macrophytes to extrapolate from effects of a plant protection product observed in laboratory tests to effects resulting from dynamic exposure on macrophyte populations in edge-of-field water bodies. A standard European Union (EU) risk assessment for an example herbicide based on macrophyte laboratory tests indicated risks for several exposure scenarios. Three of these scenarios are further analyzed using effect models for 2 aquatic macrophytes, the free-floating standard test species Lemna sp., and the sediment-rooted submerged additional standard test species Myriophyllum spicatum. Both models include a toxicokinetic (TK) part, describing uptake and elimination of the toxicant, a toxicodynamic (TD) part, describing the internal concentration-response function for growth inhibition, and a description of biomass growth as a function of environmental factors to allow simulating seasonal dynamics. The TK-TD models are calibrated and tested using laboratory tests, whereas the growth models were assumed to be fit for purpose based on comparisons of predictions with typical growth patterns observed in the field. For the risk assessment, biomass dynamics are predicted for the control situation and for several exposure levels. Based on specific protection goals for macrophytes, preliminary example decision criteria are suggested for evaluating the model outputs. The models refined the risk indicated by lower tier testing for 2 exposure scenarios, while confirming the risk associated for the third. Uncertainties related to the experimental and the modeling approaches and their application in the risk assessment are discussed. Based on this case study and the assumption that the models prove suitable for risk assessment once fully evaluated, we recommend that 1) ecological scenarios be developed that are also
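
    The TK part of such a model is a one-compartment uptake/elimination equation driven by the exposure profile, with growth inhibition tied to the internal concentration. The rate constants and concentration-response below are illustrative, not the calibrated values from the case study.

        # One-compartment TK with a pulsed exposure and a TD inhibition curve.
        import numpy as np
        from scipy.integrate import odeint

        K_IN, K_OUT = 0.5, 0.2   # uptake/elimination rates, 1/day (toy)
        EC50, SLOPE = 2.0, 2.0   # internal concentration-response (toy)

        def c_ext(t):
            return 5.0 if 2.0 <= t <= 5.0 else 0.0  # 3-day exposure pulse

        def dc_int(c_int, t):
            return K_IN * c_ext(t) - K_OUT * c_int  # dC_int/dt

        def growth_fraction(c_int):
            return 1.0 / (1.0 + (c_int / EC50) ** SLOPE)  # fraction of control

        t = np.linspace(0.0, 20.0, 201)
        c_int = odeint(dc_int, 0.0, t, hmax=0.1).ravel()
        print(f"peak internal conc: {c_int.max():.2f}; "
              f"max growth inhibition: {1 - growth_fraction(c_int.max()):.0%}")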

  7. Adaptation of the pore diffusion model to describe multi-addition batch uptake high-throughput screening experiments.

    PubMed

    Traylor, Steven J; Xu, Xuankuo; Li, Yi; Jin, Mi; Li, Zheng Jian

    2014-11-14

    Equilibrium isotherm and kinetic mass transfer measurements are critical to mechanistic modeling of binding and elution behavior within a chromatographic column. However, traditional methods of measuring these parameters are impractically time- and labor-intensive. While advances in high-throughput robotic liquid handling systems have created time and labor-saving methods of performing kinetic and equilibrium measurements of proteins on chromatographic resins in a 96-well plate format, these techniques continue to be limited by physical constraints on protein addition, incubation and separation times; the available concentration of protein stocks and process pools; and practical constraints on resin and fluid volumes in the 96-well format. In this stu