Population-based absolute risk estimation with survey data
Kovalchik, Stephanie A.; Pfeiffer, Ruth M.
2013-01-01
Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
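As a concrete illustration of the cause-specific hazard formulation described above, the following sketch computes an absolute risk over a 5-year horizon from piecewise-constant baseline hazards multiplied by individualized relative risks. It is a minimal, unweighted example with made-up hazard values; it does not reproduce the paper's survey-weighted estimation or its influence-function variance.

```python
import numpy as np

# Hypothetical piecewise-constant baseline hazards (per year) on yearly intervals
intervals = np.array([1.0, 1.0, 1.0, 1.0, 1.0])          # interval lengths (years)
h1_base = np.array([0.010, 0.012, 0.014, 0.016, 0.018])  # cause 1 (event of interest)
h2_base = np.array([0.020, 0.022, 0.025, 0.028, 0.030])  # cause 2 (competing event)

def absolute_risk(rr1, rr2):
    """Absolute risk of cause 1 over the full horizon in the presence of cause 2.

    rr1, rr2: individualized relative risks multiplying each cause-specific
    baseline hazard (the proportional-hazards form used in the abstract).
    """
    h1 = rr1 * h1_base
    h2 = rr2 * h2_base
    total = h1 + h2
    # All-cause survival at the start of each interval
    surv_start = np.exp(-np.concatenate(([0.0], np.cumsum(total * intervals)[:-1])))
    # Probability of a cause-1 event within each interval, given alive at its start
    p_event1 = (h1 / total) * (1.0 - np.exp(-total * intervals))
    return float(np.sum(surv_start * p_event1))

print(absolute_risk(rr1=2.0, rr2=1.0))   # higher exposure for the event of interest
print(absolute_risk(rr1=2.0, rr2=3.0))   # a stronger competing risk attenuates absolute risk
```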
Development of a GCR Event-based Risk Model
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee
2009-01-01
A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo-based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing it with physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charges and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how GCR event rates mapped to biological signaling induction and relaxation times. We considered several hypotheses related to signaling and cancer risk, and then performed simulations for conditions where aberrant or adaptive signaling would occur on long-duration space missions. Our results do not support the conventional assumptions of dose, linearity, and additivity. A discussion is given of how event-based systems biology models, which focus on biological signaling as the mechanism to propagate damage or adaptation, can be further developed for cancer and CNS space radiation risk projections.
Semicompeting risks in aging research: methods, issues and needs
Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen
2015-01-01
A semicompeting risks problem involves two types of events: a nonterminal and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of the multistate model, an area in which methodology development has been active. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (class L models). These advances notwithstanding, semicompeting risks methodology has not penetrated biomedical research in general, or gerontological research in particular. Possible reasons for this lack of uptake include: the methods are relatively new and sophisticated; the conceptual problems associated with potential failure time models are difficult to overcome; there is a paucity of expository articles aimed at educating practitioners; and readily usable software is not available. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of semicompeting risks methodology by the broader biomedical research community. PMID:24729136
Gaziano, Thomas A; Young, Cynthia R; Fitzmaurice, Garrett; Atwood, Sidney; Gaziano, J Michael
2008-01-01
Summary Background Around 80% of all cardiovascular deaths occur in developing countries. Assessment of those patients at high risk is an important strategy for prevention. Since developing countries have limited resources for prevention strategies that require laboratory testing, we assessed whether a risk prediction method that did not require any laboratory tests could be as accurate as one requiring laboratory information. Methods The National Health and Nutrition Examination Survey (NHANES) was a prospective cohort study of 14 407 US participants aged 25–74 years at the time they were first examined (between 1971 and 1975). Our follow-up study population included participants with complete information on these surveys who did not report a history of cardiovascular disease (myocardial infarction, heart failure, stroke, angina) or cancer, yielding an analysis dataset of N=6186. We compared how well either method could predict first-time fatal and non-fatal cardiovascular disease events in this cohort. For the laboratory-based model, which required blood testing, we used standard risk factors to assess risk of cardiovascular disease: age, systolic blood pressure, smoking status, total cholesterol, reported diabetes status, and current treatment for hypertension. For the non-laboratory-based model, we substituted body-mass index for cholesterol. Findings In the cohort of 6186, there were 1529 first-time cardiovascular events and 578 (38%) deaths due to cardiovascular disease over 21 years. In women, the laboratory-based model was useful for predicting events, with a c statistic of 0·829. The c statistic of the non-laboratory-based model was 0·831. In men, the results were similar (0·784 for the laboratory-based model and 0·783 for the non-laboratory-based model). Results were similar between the laboratory-based and non-laboratory-based models in both men and women when restricted to fatal events only. Interpretation A method that uses non-laboratory-based risk factors predicted cardiovascular events as accurately as one that relied on laboratory-based values. This approach could simplify risk assessment in situations where laboratory testing is inconvenient or unavailable. PMID:18342687
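A minimal sketch of the kind of comparison described above: two logistic models, one using cholesterol and one substituting body-mass index, are fit on synthetic data and compared by c statistic (AUC). The data, coefficients, and sample size are invented for illustration and are not the NHANES cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic risk factors (illustrative only, not NHANES data)
age = rng.uniform(25, 74, n)
sbp = rng.normal(130, 20, n)
smoker = rng.binomial(1, 0.3, n)
diabetes = rng.binomial(1, 0.08, n)
chol = rng.normal(5.5, 1.0, n)
bmi = 0.5 * chol + rng.normal(25, 4, n)       # BMI loosely correlated with cholesterol

# Synthetic event indicator driven by the usual risk factors
lin = -9 + 0.06 * age + 0.02 * sbp + 0.6 * smoker + 0.8 * diabetes + 0.3 * chol
y = rng.binomial(1, 1 / (1 + np.exp(-lin)))

X_lab = np.column_stack([age, sbp, smoker, diabetes, chol])   # laboratory-based model
X_nolab = np.column_stack([age, sbp, smoker, diabetes, bmi])  # non-laboratory model

for name, X in [("lab", X_lab), ("non-lab", X_nolab)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
    p = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(name, "c statistic:", round(roc_auc_score(y_te, p), 3))
```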
Multilevel joint competing risk models
NASA Astrophysics Data System (ADS)
Karunarathna, G. H. S.; Sooriyarachchi, M. R.
2017-09-01
Joint modeling approaches are often required when competing-risk time-to-event outcomes and count outcomes arise together in biomedical and epidemiological studies with clustered data. Hospital length of stay (LOS) is a widely used measure of hospital utilization and a benchmark outcome with multiple possible terminations during hospitalization, such as discharge, transfer, death, and patients who have not completed the event of interest by the end of follow-up (censored). Competing risk models provide a way of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple event types. In this study, the joint modeling approach is applied to dengue epidemiology in Sri Lanka, 2006-2008, to assess the relationship between the different LOS outcomes and the platelet count of dengue patients, with a district-level cluster effect. Two key approaches were applied to build the joint model. In the first approach, each competing risk is modeled separately using a binary logistic model, treating all other events as censored, under a multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results than fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
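The following sketch illustrates the general idea of a hierarchical Bayesian treatment of sparse event data with source-to-source variability, using a simple Poisson-Gamma model fitted by maximizing the marginal likelihood over a hyperparameter grid. The event counts, exposures, and model form are hypothetical and stand in for the article's fuller formulation.

```python
import numpy as np
from scipy import stats

# Hypothetical event counts and exposure times (facility-years) from several sources
events = np.array([0, 1, 0, 2, 1])
exposure = np.array([120.0, 95.0, 150.0, 80.0, 110.0])

# Two-stage hierarchical model: lambda_i ~ Gamma(alpha, beta), x_i ~ Poisson(lambda_i * T_i).
# A simple grid search over the hyperparameters stands in for a full MCMC treatment.
alphas = np.linspace(0.2, 5.0, 60)
betas = np.linspace(10.0, 500.0, 60)

def marginal_loglik(a, b):
    # Negative-binomial marginal of a Poisson-Gamma mixture, summed over sources
    return sum(stats.nbinom.logpmf(x, a, b / (b + t)) for x, t in zip(events, exposure))

ll = np.array([[marginal_loglik(a, b) for b in betas] for a in alphas])
ai, bi = np.unravel_index(np.argmax(ll), ll.shape)
a_hat, b_hat = alphas[ai], betas[bi]

# Posterior rate for each source shrinks toward the population-level estimate
post_mean = (a_hat + events) / (b_hat + exposure)
print("population rate:", a_hat / b_hat)
print("source-specific posterior rates:", np.round(post_mean, 5))
```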
Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.
2016-02-16
Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
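A toy version of the risk ratio and FAR calculation described above, including a bootstrap one-sided lower confidence bound that remains informative when the point estimate of the risk ratio is infinite. The ensembles and threshold are synthetic; the quantile-based rescaling used in the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical seasonal temperature anomalies from two model ensembles
factual = rng.normal(1.2, 1.0, 400)          # world with anthropogenic forcing
counterfactual = rng.normal(0.0, 1.0, 400)   # "natural" world
threshold = 2.5                              # observed event magnitude

p1 = np.mean(factual >= threshold)
p0 = np.mean(counterfactual >= threshold)
rr = np.inf if p0 == 0 else p1 / p0
far = 1 - 1 / rr                             # fraction of attributable risk
print("RR estimate:", rr, " FAR:", far)

# Bootstrap a one-sided lower confidence bound on RR; this stays finite and
# informative even when the point estimate of RR is infinite (p0 == 0).
B = 2000
rr_boot = []
for _ in range(B):
    f = rng.choice(factual, factual.size, replace=True)
    c = rng.choice(counterfactual, counterfactual.size, replace=True)
    pb1, pb0 = np.mean(f >= threshold), np.mean(c >= threshold)
    rr_boot.append(np.inf if pb0 == 0 else pb1 / pb0)
print("95% one-sided lower bound on RR:", np.quantile(rr_boot, 0.05))
```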
Oliva, Elizabeth M; Bowe, Thomas; Tavakoli, Sara; Martins, Susana; Lewis, Eleanor T; Paik, Meenah; Wiechers, Ilse; Henderson, Patricia; Harvey, Michael; Avoundjian, Tigran; Medhanie, Amanuel; Trafton, Jodie A
2017-02-01
Concerns about opioid-related adverse events, including overdose, prompted the Veterans Health Administration (VHA) to launch an Opioid Safety Initiative and Overdose Education and Naloxone Distribution program. To mitigate risks associated with opioid prescribing, a holistic approach that takes into consideration both risk factors (e.g., dose, substance use disorders) and risk mitigation interventions (e.g., urine drug screening, psychosocial treatment) is needed. This article describes the Stratification Tool for Opioid Risk Mitigation (STORM), a tool developed in VHA that reflects this holistic approach and facilitates patient identification and monitoring. STORM prioritizes patients for review and intervention according to their modeled risk for overdose/suicide-related events and displays risk factors and risk mitigation interventions obtained from VHA electronic medical record (EMR)-data extracts. Patients' estimated risk is based on a predictive risk model developed using fiscal year 2010 (FY2010: 10/1/2009-9/30/2010) EMR-data extracts and mortality data among 1,135,601 VHA patients prescribed opioid analgesics to predict risk for an overdose/suicide-related event in FY2011 (2.1% experienced an event). Cross-validation was used to validate the model, with receiver operating characteristic curves for the training and test data sets performing well (>.80 area under the curve). The predictive risk model distinguished patients based on risk for overdose/suicide-related adverse events, allowing for identification of high-risk patients and enrichment of target populations of patients with greater safety concerns for proactive monitoring and application of risk mitigation interventions. Results suggest that clinical informatics can leverage EMR-extracted data to identify patients at-risk for overdose/suicide-related events and provide clinicians with actionable information to mitigate risk. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Modeling hard clinical end-point data in economic analyses.
Kansal, Anuraag R; Zheng, Ying; Palencia, Roberto; Ruffolo, Antonio; Hass, Bastian; Sorensen, Sonja V
2013-11-01
The availability of hard clinical end-point data, such as that on cardiovascular (CV) events among patients with type 2 diabetes mellitus, is increasing, and as a result there is growing interest in using hard end-point data of this type in economic analyses. This study investigated published approaches for modeling hard end-points from clinical trials and evaluated their applicability in health economic models with different disease features. A review of cost-effectiveness models of interventions in clinically significant therapeutic areas (CV diseases, cancer, and chronic lower respiratory diseases) was conducted in PubMed and Embase using a defined search strategy. Only studies integrating hard end-point data from randomized clinical trials were considered. For each study included, clinical input characteristics and modeling approach were summarized and evaluated. A total of 33 articles (23 CV, eight cancer, two respiratory) were accepted for detailed analysis. Decision trees, Markov models, discrete event simulations, and hybrids were used. Event rates were incorporated either as constant rates, time-dependent risks, or risk equations based on patient characteristics. Risks dependent on time and/or patient characteristics were used where major event rates were >1%/year in models with fewer health states (<7). Models of infrequent events or with numerous health states generally preferred constant event rates. The detailed modeling information and terminology varied, sometimes requiring interpretation. Key considerations for cost-effectiveness models incorporating hard end-point data include the frequency and characteristics of the relevant clinical events and how the trial data is reported. When event risk is low, simplification of both the model structure and event rate modeling is recommended. When event risk is common, such as in high risk populations, more detailed modeling approaches, including individual simulations or explicitly time-dependent event rates, are more appropriate to accurately reflect the trial data.
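The sketch below shows the basic mechanics discussed above: a three-state Markov cohort model in which the annual event rate can be supplied either as a constant or as a time-dependent function. All transition probabilities are illustrative placeholders, not values from any reviewed study.

```python
import numpy as np

# Minimal three-state Markov cohort model: event-free, post-event, dead.
# Transition probabilities are illustrative placeholders only.
YEARS = 20
start = np.array([1.0, 0.0, 0.0])             # whole cohort starts event-free

def run(event_prob):
    """event_prob: callable mapping cycle number -> annual first-event probability."""
    state = start.copy()
    for t in range(YEARS):
        p_event = event_prob(t)
        p_die_free, p_die_post = 0.01, 0.04   # background and post-event mortality
        P = np.array([
            [1 - p_event - p_die_free, p_event,        p_die_free],
            [0.0,                      1 - p_die_post, p_die_post],
            [0.0,                      0.0,            1.0],
        ])
        state = state @ P                     # one annual cycle
    return state

constant = run(lambda t: 0.02)                  # constant annual event rate
time_dep = run(lambda t: 0.01 * (1 + 0.1 * t))  # event risk rising with time

print("occupancy after 20 years, constant rate:      ", np.round(constant, 3))
print("occupancy after 20 years, time-dependent rate:", np.round(time_dep, 3))
```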
Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.
2016-01-01
Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
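As a rough illustration of the dynamic element described above, the sketch below places randomly generated events on a single mission time line (a priority queue) and processes them in order, so that the outcome of a later event depends on resources consumed by earlier ones. Event types, rates, and the mitigation logic are hypothetical and are not drawn from the IMM.

```python
import heapq
import random

# Toy dynamic-PRA sketch: events are placed on a single mission time line (a
# priority queue) and processed in order, so later outcomes can depend on the
# state left behind by earlier events. Event names and rates are hypothetical.
random.seed(1)

MISSION_DAYS = 180
EVENT_RATES = {"minor_illness": 1 / 60.0, "equipment_fault": 1 / 90.0}  # per day

def simulate_mission():
    queue = []                       # (time, event) entries managed by the "planner"
    for name, rate in EVENT_RATES.items():
        t = random.expovariate(rate)
        while t < MISSION_DAYS:
            heapq.heappush(queue, (t, name))
            t += random.expovariate(rate)

    supplies, unresolved = 3, 0      # shared state that couples event outcomes
    while queue:                     # the "scheduler" walks the time line in order
        _, event = heapq.heappop(queue)
        if event == "minor_illness":
            if supplies > 0:
                supplies -= 1        # mitigated; consumes a limited resource
            else:
                unresolved += 1      # no mitigation left: outcome depends on history
        elif event == "equipment_fault":
            unresolved += random.random() < 0.2
    return unresolved

runs = [simulate_mission() for _ in range(5000)]
print("P(at least one unresolved event):", sum(r > 0 for r in runs) / len(runs))
```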
Pasea, Laura; Chung, Sheng-Chia; Pujades-Rodriguez, Mar; Moayyeri, Alireza; Denaxas, Spiros; Fox, Keith A.A.; Wallentin, Lars; Pocock, Stuart J.; Timmis, Adam; Banerjee, Amitava; Patel, Riyaz; Hemingway, Harry
2017-01-01
Aims The aim of this study is to develop models to aid the decision to prolong dual antiplatelet therapy (DAPT) that requires balancing an individual patient’s potential benefits and harms. Methods and results Using population-based electronic health records (EHRs) (CALIBER, England, 2000–10), of patients evaluated 1 year after acute myocardial infarction (MI), we developed (n = 12 694 patients) and validated (n = 5613) prognostic models for cardiovascular (cardiovascular death, MI or stroke) events and three different bleeding endpoints. We applied trial effect estimates to determine potential benefits and harms of DAPT and the net clinical benefit of individuals. Prognostic models for cardiovascular events (c-index: 0.75 (95% CI: 0.74, 0.77)) and bleeding (c index 0.72 (95% CI: 0.67, 0.77)) were well calibrated: 3-year risk of cardiovascular events was 16.5% overall (5.2% in the lowest- and 46.7% in the highest-risk individuals), while for major bleeding, it was 1.7% (0.3% in the lowest- and 5.4% in the highest-risk patients). For every 10 000 patients treated per year, we estimated 249 (95% CI: 228, 269) cardiovascular events prevented and 134 (95% CI: 87, 181) major bleeding events caused in the highest-risk patients, and 28 (95% CI: 19, 37) cardiovascular events prevented and 9 (95% CI: 0, 20) major bleeding events caused in the lowest-risk patients. There was a net clinical benefit of prolonged DAPT in 63–99% patients depending on how benefits and harms were weighted. Conclusion Prognostic models for cardiovascular events and bleeding using population-based EHRs may help to personalise decisions for prolonged DAPT 1-year following acute MI. PMID:28329300
Pasea, Laura; Chung, Sheng-Chia; Pujades-Rodriguez, Mar; Moayyeri, Alireza; Denaxas, Spiros; Fox, Keith A A; Wallentin, Lars; Pocock, Stuart J; Timmis, Adam; Banerjee, Amitava; Patel, Riyaz; Hemingway, Harry
2017-04-07
The aim of this study is to develop models to aid the decision to prolong dual antiplatelet therapy (DAPT) that requires balancing an individual patient's potential benefits and harms. Using population-based electronic health records (EHRs) (CALIBER, England, 2000-10), of patients evaluated 1 year after acute myocardial infarction (MI), we developed (n = 12 694 patients) and validated (n = 5613) prognostic models for cardiovascular (cardiovascular death, MI or stroke) events and three different bleeding endpoints. We applied trial effect estimates to determine potential benefits and harms of DAPT and the net clinical benefit of individuals. Prognostic models for cardiovascular events (c-index: 0.75 (95% CI: 0.74, 0.77)) and bleeding (c index 0.72 (95% CI: 0.67, 0.77)) were well calibrated: 3-year risk of cardiovascular events was 16.5% overall (5.2% in the lowest- and 46.7% in the highest-risk individuals), while for major bleeding, it was 1.7% (0.3% in the lowest- and 5.4% in the highest-risk patients). For every 10 000 patients treated per year, we estimated 249 (95% CI: 228, 269) cardiovascular events prevented and 134 (95% CI: 87, 181) major bleeding events caused in the highest-risk patients, and 28 (95% CI: 19, 37) cardiovascular events prevented and 9 (95% CI: 0, 20) major bleeding events caused in the lowest-risk patients. There was a net clinical benefit of prolonged DAPT in 63-99% patients depending on how benefits and harms were weighted. Prognostic models for cardiovascular events and bleeding using population-based EHRs may help to personalise decisions for prolonged DAPT 1-year following acute MI. © The Author 2017. Published on behalf of the European Society of Cardiology
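The net-clinical-benefit arithmetic described in the abstract can be sketched as follows, using the per-10,000 figures quoted above; the weight assigned to a major bleed relative to a cardiovascular event is a user choice, and the three weights shown are illustrative only.

```python
# Net clinical benefit sketch using the per-10,000-patient figures quoted in the
# abstract; the weight placed on a major bleed relative to a cardiovascular event
# is a user choice (the paper reports results across a range of weightings).
groups = {
    "highest-risk": {"cv_prevented": 249, "bleeds_caused": 134},
    "lowest-risk":  {"cv_prevented": 28,  "bleeds_caused": 9},
}

def net_clinical_benefit(cv_prevented, bleeds_caused, bleed_weight):
    """Events avoided per 10,000 treated per year, after weighting harms."""
    return cv_prevented - bleed_weight * bleeds_caused

for name, g in groups.items():
    for w in (0.5, 1.0, 2.0):       # hypothetical relative weights for a major bleed
        ncb = net_clinical_benefit(g["cv_prevented"], g["bleeds_caused"], w)
        print(f"{name:13s} bleed weight {w:>3}: net benefit {ncb:+.0f} per 10,000")
```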
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
Gordon, William J; Polansky, Jesse M; Boscardin, W John; Fung, Kathy Z; Steinman, Michael A
2010-11-01
US cholesterol guidelines use original and simplified versions of the Framingham model to estimate future coronary risk and thereby classify patients into risk groups with different treatment strategies. We sought to compare risk estimates and risk group classification generated by the original, complex Framingham model and the simplified, point-based version. We assessed 2,543 subjects age 20-79 from the 2001-2006 National Health and Nutrition Examination Surveys (NHANES) for whom Adult Treatment Panel III (ATP-III) guidelines recommend formal risk stratification. For each subject, we calculated the 10-year risk of major coronary events using the original and point-based Framingham models, and then compared differences in these risk estimates and whether these differences would place subjects into different ATP-III risk groups (<10% risk, 10-20% risk, or >20% risk). Using standard procedures, all analyses were adjusted for survey weights, clustering, and stratification to make our results nationally representative. Among 39 million eligible adults, the original Framingham model categorized 71% of subjects as having "moderate" risk (<10% risk of a major coronary event in the next 10 years), 22% as having "moderately high" (10-20%) risk, and 7% as having "high" (>20%) risk. Estimates of coronary risk by the original and point-based models often differed substantially. The point-based system classified 15% of adults (5.7 million) into different risk groups than the original model, with 10% (3.9 million) misclassified into higher risk groups and 5% (1.8 million) into lower risk groups, for a net impact of classifying 2.1 million adults into higher risk groups. These risk group misclassifications would impact guideline-recommended drug treatment strategies for 25-46% of affected subjects. Patterns of misclassifications varied significantly by gender, age, and underlying CHD risk. Compared to the original Framingham model, the point-based version misclassifies millions of Americans into risk groups for which guidelines recommend different treatment strategies.
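A small sketch of the reclassification comparison described above: risk estimates from an "original" and a "point-based" model are assigned to ATP-III groups and the share of subjects who change groups is tallied. The risk distributions are synthetic and the point-based error is simulated, so the numbers will not match the NHANES results.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2543                                   # sample size quoted in the abstract

def atp3_group(risk):
    """ATP-III style grouping of 10-year coronary risk."""
    return np.where(risk < 0.10, "<10%", np.where(risk <= 0.20, "10-20%", ">20%"))

# Synthetic stand-in for the two Framingham variants: the point-based estimate
# deviates from the "original" continuous estimate by a small random error.
risk_original = np.clip(rng.lognormal(-2.8, 0.9, n), 0.001, 0.6)
risk_points = np.clip(risk_original * rng.normal(1.0, 0.15, n), 0.001, 0.6)

g_orig, g_pts = atp3_group(risk_original), atp3_group(risk_points)
reclassified = g_orig != g_pts
higher = reclassified & (risk_points > risk_original)

print("share reclassified into a different ATP-III group:", round(reclassified.mean(), 3))
print("  ...of which into a higher-risk group:", round(higher[reclassified].mean(), 3))
```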
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as the basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.
2015-12-03
Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the electric power grid's security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated based on historic experience; however, there does exist a stable of risk modeling techniques for rare events that have proven of value across a wide range of engineering application domains. There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of this interest in the context of grid modernization activities. The availability of a grid HILF risk model, integrated across multi-hazard domains which, when interrogated, can support transparent, defensible and effective decisions, is an attractive prospect among these communities. In this report, we document an integrated HILF risk framework intended to inform the development of risk models. These models would be based on the systematic and comprehensive (to within scope) characterization of hazards to the level of detail required for modeling risk, identification of the stressors associated with the hazards (i.e., the means of impacting grid and supporting infrastructure), characterization of the vulnerability of assets to these stressors and the probabilities of asset compromise, the grid's dynamic response to the asset failures, and assessment of subsequent severities of consequence with respect to selected impact metrics, such as power outage duration and geographic reach. Specifically, the current framework is being developed to: (1) provide the conceptual and overarching technical paradigms for the development of risk models; (2) identify the classes of models required to implement the framework, providing examples of existing models and also identifying where modeling gaps exist; (3) identify the types of data required, addressing circumstances under which data are sparse and the formal elicitation of informed judgment might be required; and (4) identify means by which the resultant risk models might be interrogated to form the necessary basis for risk management.
Developing points-based risk-scoring systems in the presence of competing risks.
Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P
2016-09-30
Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
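The sketch below illustrates the general Sullivan-style bookkeeping behind a points-based system: regression coefficients are divided by a base unit and rounded to integer points. The coefficients and the base unit here are hypothetical; the paper's own development uses competing-risks (Fine-Gray) regression and provides R code.

```python
# Sullivan-style conversion of (hypothetical) regression coefficients into an
# integer points system: each factor's points are its coefficient contribution
# divided by a base unit B (here, the effect of a 5-year age increase), rounded.
# The same bookkeeping applies whether the coefficients come from a Cox model or
# from a competing-risks (Fine-Gray) model, which is the setting of the paper.
coefs = {                       # illustrative log-hazard-ratio contributions
    "age_per_5yr": 0.18,
    "prior_mi": 0.45,
    "diabetes": 0.38,
    "sbp_per_10mmHg": 0.12,
}
B = coefs["age_per_5yr"]        # one point = risk associated with 5 years of age

points = {k: int(round(v / B)) for k, v in coefs.items()}
print("points per unit of each factor:", points)

# Scoring a hypothetical 70-year-old (reference age 50) with diabetes and SBP 150
profile = {"age_per_5yr": (70 - 50) / 5, "prior_mi": 0, "diabetes": 1,
           "sbp_per_10mmHg": (150 - 120) / 10}
total = sum(points[k] * profile[k] for k in points)
print("total score:", total)
```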
Modeling the Impact of Control on the Attractiveness of Risk in a Prospect Theory Framework
Young, Diana L.; Goodie, Adam S.; Hall, Daniel B.
2010-01-01
Many decisions involve a degree of personal control over event outcomes, which is exerted through one’s knowledge or skill. In three experiments we investigated differences in decision making between prospects based on a) the outcome of random events and b) the outcome of events characterized by control. In Experiment 1, participants estimated certainty equivalents (CEs) for bets based on either random events or the correctness of their answers to U.S. state population questions across the probability spectrum. In Experiment 2, participants estimated CEs for bets based on random events, answers to U.S. state population questions, or answers to questions about 2007 NCAA football game results. Experiment 3 extended the same procedure as Experiment 1 using a within-subjects design. We modeled data from all experiments in a prospect theory framework to establish psychological mechanisms underlying decision behavior. Participants weighted the probabilities associated with bets characterized by control so as to reflect greater risk attractiveness relative to bets based on random events, as evidenced by more elevated weighting functions under conditions of control. This research elucidates possible cognitive mechanisms behind increased risk taking for decisions characterized by control, and implications for various literatures are discussed. PMID:21278906
Modeling the Impact of Control on the Attractiveness of Risk in a Prospect Theory Framework.
Young, Diana L; Goodie, Adam S; Hall, Daniel B
2011-01-01
Many decisions involve a degree of personal control over event outcomes, which is exerted through one's knowledge or skill. In three experiments we investigated differences in decision making between prospects based on a) the outcome of random events and b) the outcome of events characterized by control. In Experiment 1, participants estimated certainty equivalents (CEs) for bets based on either random events or the correctness of their answers to U.S. state population questions across the probability spectrum. In Experiment 2, participants estimated CEs for bets based on random events, answers to U.S. state population questions, or answers to questions about 2007 NCAA football game results. Experiment 3 extended the same procedure as Experiment 1 using a within-subjects design. We modeled data from all experiments in a prospect theory framework to establish psychological mechanisms underlying decision behavior. Participants weighted the probabilities associated with bets characterized by control so as to reflect greater risk attractiveness relative to bets based on random events, as evidenced by more elevated weighting functions under conditions of control. This research elucidates possible cognitive mechanisms behind increased risk taking for decisions characterized by control, and implications for various literatures are discussed.
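A minimal sketch of the prospect-theory machinery referred to above: a two-parameter (linear-in-log-odds) probability weighting function in which the elevation parameter is larger for bets characterized by control, as the abstract reports. The parameter values are hypothetical and are not the fitted estimates from the experiments.

```python
import numpy as np

def weighting(p, gamma, delta):
    """Linear-in-log-odds probability weighting function:
    w(p) = delta * p**gamma / (delta * p**gamma + (1 - p)**gamma).
    gamma controls curvature; delta controls elevation (overall attractiveness)."""
    num = delta * p**gamma
    return num / (num + (1 - p)**gamma)

p = np.linspace(0.01, 0.99, 9)

# Hypothetical parameters: a more elevated weighting function for bets whose
# outcomes the decision maker feels they control, as described in the abstract.
w_random = weighting(p, gamma=0.65, delta=0.80)
w_control = weighting(p, gamma=0.65, delta=1.10)

for pi, wr, wc in zip(p, w_random, w_control):
    print(f"p={pi:.2f}  w_random={wr:.3f}  w_control={wc:.3f}")
```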
Hsu, H E; Rydzak, C E; Cotich, K L; Wang, B; Sax, P E; Losina, E; Freedberg, K A; Goldie, S J; Lu, Z; Walensky, R P
2011-02-01
The aim of the study was to quantify the benefits (life expectancy gains) and risks (efavirenz-related teratogenicity) associated with using efavirenz in HIV-infected women of childbearing age in the USA. We used data from the Women's Interagency HIV Study in an HIV disease simulation model to estimate life expectancy in women who receive an efavirenz-based initial antiretroviral regimen compared with those who delay efavirenz use and receive a boosted protease inhibitor-based initial regimen. To estimate excess risk of teratogenic events with and without efavirenz exposure per 100,000 women, we incorporated literature-based rates of pregnancy, live births, and teratogenic events into a decision analytic model. We assumed a teratogenicity risk of 2.90 events/100 live births in women exposed to efavirenz during pregnancy and 2.68/100 live births in unexposed women. Survival for HIV-infected women who received an efavirenz-based initial antiretroviral therapy (ART) regimen was 0.89 years greater than for women receiving non-efavirenz-based initial therapy (28.91 vs. 28.02 years). The rate of teratogenic events was 77.26/100,000 exposed women, compared with 72.46/100,000 unexposed women. Survival estimates were sensitive to variations in treatment efficacy and AIDS-related mortality. Estimates of excess teratogenic events were most sensitive to pregnancy rates and number of teratogenic events/100 live births in efavirenz-exposed women. Use of non-efavirenz-based initial ART in HIV-infected women of childbearing age may reduce life expectancy gains from antiretroviral treatment, but may also prevent teratogenic events. Decision-making regarding efavirenz use presents a trade-off between these two risks; this study can inform discussions between patients and health care providers.
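A back-of-the-envelope version of the excess-teratogenicity calculation follows, applying the abstract's teratogenicity risks (2.90 vs. 2.68 events per 100 live births) to an assumed live-birth rate per 100,000 women. The live-birth rate is a hypothetical stand-in, so the output approximates but does not exactly reproduce the abstract's 77.26 and 72.46 figures, which come from a fuller decision-analytic model.

```python
# Back-of-the-envelope reproduction of the excess-teratogenicity calculation in
# the abstract. The live-birth rate is a hypothetical stand-in; the abstract's
# teratogenicity risks (2.90 vs 2.68 events per 100 live births) are used as is.
WOMEN = 100_000
live_birth_rate = 0.0266          # assumed live births per woman over the horizon
risk_exposed = 2.90 / 100         # teratogenic events per live birth, efavirenz
risk_unexposed = 2.68 / 100       # teratogenic events per live birth, no efavirenz

live_births = WOMEN * live_birth_rate
events_exposed = live_births * risk_exposed
events_unexposed = live_births * risk_unexposed

print(f"events per 100,000 women, exposed:   {events_exposed:.2f}")
print(f"events per 100,000 women, unexposed: {events_unexposed:.2f}")
print(f"excess teratogenic events:           {events_exposed - events_unexposed:.2f}")
```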
Antic, Darko; Milic, Natasa; Nikolovski, Srdjan; Todorovic, Milena; Bila, Jelena; Djurdjevic, Predrag; Andjelic, Bosko; Djurasinovic, Vladislava; Sretenovic, Aleksandra; Vukovic, Vojin; Jelicic, Jelena; Hayman, Suzanne; Mihaljevic, Biljana
2016-10-01
Lymphoma patients are at increased risk of thromboembolic events, but thromboprophylaxis in these patients is largely underused. We sought to develop and validate a simple model, based on individual clinical and laboratory patient characteristics, that would designate lymphoma patients at risk for a thromboembolic event. The study population included 1,820 lymphoma patients who were treated in the Lymphoma Departments at the Clinics of Hematology, Clinical Center of Serbia and Clinical Center Kragujevac. The model was developed using data from a derivation cohort (n = 1,236), and further assessed in a validation cohort (n = 584). Sixty-five patients (5.3%) in the derivation cohort and 34 (5.8%) patients in the validation cohort developed thromboembolic events. The variables independently associated with risk for thromboembolism were: previous venous and/or arterial events, mediastinal involvement, BMI > 30 kg/m2, reduced mobility, extranodal localization, development of neutropenia, and hemoglobin level < 100 g/L. Based on the risk model score, the population was divided into the following risk categories: low (score 0-1), intermediate (score 2-3), and high (score >3). For patients classified at risk (intermediate and high-risk scores), the model produced a negative predictive value of 98.5%, positive predictive value of 25.1%, sensitivity of 75.4%, and specificity of 87.5%. A high-risk score had a positive predictive value of 65.2%. The diagnostic performance measures retained similar values in the validation cohort. The resulting prognostic Thrombosis Lymphoma (ThroLy) score is more specific for lymphoma patients than any other available score targeting thrombosis in cancer patients. Am. J. Hematol. 91:1014-1019, 2016. © 2016 Wiley Periodicals, Inc.
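A sketch of how a ThroLy-style additive score and its risk categories could be computed is given below. The risk factors are the ones listed in the abstract, but the per-item weights are hypothetical placeholders rather than the published ThroLy weights.

```python
# Sketch of a ThroLy-style additive risk score. The risk factors are those listed
# in the abstract; the per-item weights here are hypothetical placeholders, not
# the published ThroLy weights.
WEIGHTS = {
    "prior_thromboembolism": 2,
    "mediastinal_involvement": 1,
    "bmi_over_30": 1,
    "reduced_mobility": 1,
    "extranodal_localization": 1,
    "neutropenia": 1,
    "hemoglobin_below_100": 1,
}

def throly_like_score(patient):
    return sum(w for factor, w in WEIGHTS.items() if patient.get(factor, False))

def risk_category(score):
    if score <= 1:
        return "low"
    if score <= 3:
        return "intermediate"
    return "high"

patient = {"mediastinal_involvement": True, "bmi_over_30": True,
           "neutropenia": True, "hemoglobin_below_100": True}
s = throly_like_score(patient)
print("score:", s, "->", risk_category(s))     # score 4 -> high risk
```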
An evaluation of Computational Fluid dynamics model for flood risk analysis
NASA Astrophysics Data System (ADS)
Di Francesco, Silvia; Biscarini, Chiara; Montesarchio, Valeria
2014-05-01
This work presents an analysis of the hydrological-hydraulic engineering requisites for risk evaluation and efficient flood damage reduction plans. Most research efforts have been dedicated to the scientific and technical aspects of risk assessment, providing estimates of possible alternatives and of the associated risk. In the decision-making process for a mitigation plan, the contribution of scientists is crucial, because risk-damage analysis is based on evaluation of the flow field and of hydraulic risk, as well as on economic and societal considerations. The present paper focuses on the first part of the process, the mathematical modelling of flood events, which is the basis for all further considerations. The evaluation of potential catastrophic damage consequent to a flood event, and in particular to a dam failure, requires modelling of the flood with sufficient detail to capture the spatial and temporal evolution of the event as well as the velocity field. Thus, the selection of an appropriate mathematical model to correctly simulate flood routing is an essential step. In this work we present the application of two 3D computational fluid dynamics models to a synthetic and a real case study in order to evaluate the evolution of the flow field and the associated flood risk. The first model is based on the open-source CFD platform OpenFOAM. Water flow is schematized with a classical continuum approach based on the Navier-Stokes equations coupled with the volume of fluid (VOF) method to take into account the multiphase character of river bottom-water-air systems. The second model is based on the Lattice Boltzmann method, an innovative numerical fluid dynamics scheme based on Boltzmann's kinetic equation that represents the flow dynamics at the macroscopic level by incorporating a microscopic kinetic approach; the fluid is treated as composed of particles that move and collide with one another. Simulation results from both models are promising and congruent with experimental results available in the literature, though the LBM model requires less computational effort than the Navier-Stokes-based one.
A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.
Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne
2011-05-01
To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.
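The following toy simulation conveys the cascade idea described above: each routine step of a transfer can be violated with some probability, an end-of-chain adverse outcome requires a particular combination of violated defences, and the variety of violation trajectories is tracked. Step names and probabilities are illustrative, not the observed values from the study.

```python
import random

random.seed(0)

# Toy cascade model of an in-patient transfer: each routine step can be violated
# with some probability, and an end-of-chain adverse outcome requires enough of
# the upstream defences to have failed. All probabilities are illustrative.
STEPS = {
    "check_wristband": 0.15,
    "verify_notes": 0.20,
    "confirm_with_ward": 0.10,
    "final_bedside_check": 0.05,
}

def simulate_transfer():
    violated = [step for step, p in STEPS.items() if random.random() < p]
    # End-of-chain risk: misidentification only reaches the patient if every
    # identification defence in the chain was violated on this transfer.
    end_of_chain = set(violated) >= {"check_wristband", "final_bedside_check"}
    return tuple(violated), end_of_chain

trajectories, hits = set(), 0
N = 20000
for _ in range(N):
    traj, bad = simulate_transfer()
    trajectories.add(traj)
    hits += bad

print("end-of-chain risk:", hits / N)
print("distinct violation trajectories observed:", len(trajectories))
```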
van Rosendael, Alexander R; Maliakal, Gabriel; Kolli, Kranthi K; Beecy, Ashley; Al'Aref, Subhi J; Dwivedi, Aeshita; Singh, Gurpreet; Panday, Mohit; Kumar, Amit; Ma, Xiaoyue; Achenbach, Stephan; Al-Mallah, Mouaz H; Andreini, Daniele; Bax, Jeroen J; Berman, Daniel S; Budoff, Matthew J; Cademartiri, Filippo; Callister, Tracy Q; Chang, Hyuk-Jae; Chinnaiyan, Kavitha; Chow, Benjamin J W; Cury, Ricardo C; DeLago, Augustin; Feuchtner, Gudrun; Hadamitzky, Martin; Hausleiter, Joerg; Kaufmann, Philipp A; Kim, Yong-Jin; Leipsic, Jonathon A; Maffei, Erica; Marques, Hugo; Pontone, Gianluca; Raff, Gilbert L; Rubinshtein, Ronen; Shaw, Leslee J; Villines, Todd C; Gransar, Heidi; Lu, Yao; Jones, Erica C; Peña, Jessica M; Lin, Fay Y; Min, James K
Machine learning (ML) is a field in computer science that has been shown to effectively integrate clinical and imaging data for the creation of prognostic scores. The current study investigated whether an ML score, incorporating only the 16-segment coronary tree information derived from coronary computed tomography angiography (CCTA), provides enhanced risk stratification compared with current CCTA-based risk scores. From the multi-center CONFIRM registry, patients were included with complete CCTA risk score information and ≥3-year follow-up for myocardial infarction and death (primary endpoint). Patients with prior coronary artery disease were excluded. Conventional CCTA risk scores (conventional CCTA approach, segment involvement score, Duke prognostic index, segment stenosis score, and the Leaman risk score) and a score created using ML were compared for the area under the receiver operating characteristic curve (AUC). Only 16-segment-based coronary stenosis (0%, 1-24%, 25-49%, 50-69%, 70-99% and 100%) and composition (calcified, mixed and non-calcified plaque) were provided to the ML model. A boosted ensemble algorithm (extreme gradient boosting; XGBoost) was used, and the entire data set was randomly split into a training set (80%) and testing set (20%). First, tuned hyperparameters were used to generate a trained model from the training data set (80% of data). Second, the performance of this trained model was independently tested on the unseen test set (20% of data). In total, 8844 patients (mean age 58.0 ± 11.5 years, 57.7% male) were included. During a mean follow-up time of 4.6 ± 1.5 years, 609 events occurred (6.9%). No CAD was observed in 48.7% (3.5% event), non-obstructive CAD in 31.8% (6.8% event), and obstructive CAD in 19.5% (15.6% event). Discrimination of events as expressed by AUC was significantly better for the ML-based approach (0.771) vs the other scores (ranging from 0.685 to 0.701), P < 0.001. Net reclassification improvement analysis showed that the improved risk stratification was the result of down-classification of risk among patients who did not experience events (non-events). A risk score created by an ML-based algorithm that utilizes standard 16-coronary-segment stenosis and composition information derived from detailed CCTA reading has greater prognostic accuracy than current CCTA integrated risk scores. These findings indicate that an ML-based algorithm can improve the integration of CCTA-derived plaque information to improve risk stratification. Published by Elsevier Inc.
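A compact sketch of the modelling pipeline described above: a gradient-boosted classifier (XGBoost) trained on 16-segment stenosis and composition features with an 80/20 split and test-set AUC. The features and outcome are synthetic stand-ins for the CONFIRM data, and the hyperparameters are arbitrary rather than the tuned values used in the study.

```python
import numpy as np
from xgboost import XGBClassifier          # boosted ensemble named in the abstract
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 8844                                    # cohort size quoted in the abstract

# Synthetic stand-in for the CCTA input: per-segment stenosis grade (0-5) and
# plaque composition (0 none, 1 calcified, 2 mixed, 3 non-calcified) for 16
# coronary segments. Real CONFIRM data are obviously not reproduced here.
stenosis = rng.integers(0, 6, size=(n, 16))
composition = rng.integers(0, 4, size=(n, 16))
X = np.hstack([stenosis, composition])

# Synthetic outcome loosely driven by total plaque burden
risk = 1 / (1 + np.exp(-(stenosis.sum(axis=1) * 0.06 - 4.5)))
y = rng.binomial(1, risk)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0,
                                           stratify=y)
model = XGBClassifier(n_estimators=300, max_depth=3, learning_rate=0.05,
                      subsample=0.8)
model.fit(X_tr, y_tr)
pred = model.predict_proba(X_te)[:, 1]
print("test-set AUC:", round(roc_auc_score(y_te, pred), 3))
```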
Wong, Man Sing; Ho, Hung Chak; Yang, Lin; Shi, Wenzhong; Yang, Jinxin; Chan, Ta-Chien
2017-07-24
Dust events have long been recognized to be associated with a higher mortality risk. However, no study has investigated how prolonged dust events affect the spatial variability of mortality across districts in a downwind city. In this study, we applied a spatial regression approach to estimate the district-level mortality during two extreme dust events in Hong Kong. We compared spatial and non-spatial models to evaluate the ability of each regression to estimate mortality. We also compared prolonged dust events with non-dust events to determine the influences of community factors on mortality across the city. The density of the built environment (estimated by the sky view factor) had a positive association with excess mortality in each district, while socioeconomic deprivation, reflected in lower income and lower education, was associated with a higher mortality impact in each territory planning unit during a prolonged dust event. Based on the model comparison, spatial error modelling with first-order queen contiguity consistently outperformed the other models. The high-risk areas with the greatest increases in mortality were located in urban high-density environments with higher socioeconomic deprivation. Our model design can predict the spatial variability of mortality risk during an extreme weather event, which cannot be estimated with traditional time-series analyses or ecological studies. Our spatial protocol can be used for public health surveillance, sustainable planning and disaster preparation when relevant data are available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.
2016-06-25
Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the security and resilience of the electric power grid. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be difficult, as the statistical basis needed to directly estimate the probabilities and consequences of their occurrence does not exist. Because risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model indicate where resources are most rationally and effectively expended. A risk-informed realization of designing and maintaining a grid resilient to HILFs will demand consideration of a spectrum of hazards/threats to infrastructure integrity, an understanding of their likelihoods of occurrence, treatment of the fragilities of critical assets to the stressors induced by such events, and through modeling grid network topology, the extent of damage associated with these scenarios. The model resulting from integration of these elements will allow sensitivity assessments based on optional risk management strategies, such as alternative pooling, staging and logistic strategies, and emergency contingency planning. This study is focused on the development of an end-to-end HILF risk-assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision-makers across numerous stakeholder groups in directing resources optimally towards the management of risks to operational continuity.
A probabilistic strategy for parametric catastrophe insurance
NASA Astrophysics Data System (ADS)
Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin
2017-04-01
Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss events. Due to the nature of parametric programmes, it is still necessary to clearly define when a payout is due or not, and so a decision threshold probability above which a loss event is considered to occur must be set, effectively converting the issued probabilities into deterministic binary outcomes. Model skill and value are evaluated over the range of possible threshold probabilities, with the objective of defining the optimal one. The predictive ability of the model is assessed. In terms of value assessment, a decision model is proposed, allowing users to quantify monetarily their expected expenses when different combinations of model event triggering and actual event occurrence take place, directly tackling the problem of basis risk.
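The sketch below mirrors the idea described above: a logistic regression issues daily probabilities of a loss event, and the payout decision threshold is chosen to minimize an expected basis-risk cost that prices false triggers and missed events differently. Data, model form, and costs are all synthetic assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 3000

# Synthetic daily loss index (e.g., transformed rainfall) and loss-event labels
index = rng.gamma(shape=2.0, scale=1.0, size=n)
p_true = 1 / (1 + np.exp(-(2.0 * index - 6.0)))
loss_event = rng.binomial(1, p_true)

model = LogisticRegression().fit(index.reshape(-1, 1), loss_event)
prob = model.predict_proba(index.reshape(-1, 1))[:, 1]

# Basis risk: a payout with no loss (false trigger) and a loss with no payout
# (missed event) carry different monetary costs; both values are hypothetical.
COST_FALSE_TRIGGER, COST_MISSED_EVENT = 1.0, 5.0

def expected_cost(threshold):
    payout = prob >= threshold
    false_triggers = np.sum(payout & (loss_event == 0))
    missed_events = np.sum(~payout & (loss_event == 1))
    return (COST_FALSE_TRIGGER * false_triggers +
            COST_MISSED_EVENT * missed_events) / n

thresholds = np.linspace(0.05, 0.95, 19)
costs = [expected_cost(t) for t in thresholds]
best = thresholds[int(np.argmin(costs))]
print("decision threshold minimising expected basis-risk cost:", round(best, 2))
```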
Baker, Simon; Priest, Patricia; Jackson, Rod
2000-01-01
Objective To estimate the impact of using thresholds based on absolute risk of cardiovascular disease to target drug treatment to lower blood pressure in the community. Design Modelling of three thresholds of treatment for hypertension based on the absolute risk of cardiovascular disease. 5 year risk of disease was estimated for each participant using an equation to predict risk. Net predicted impact of the thresholds on the number of people treated and the number of disease events averted over 5 years was calculated assuming a relative treatment benefit of one quarter. Setting Auckland, New Zealand. Participants 2158 men and women aged 35-79 years randomly sampled from the general electoral rolls. Main outcome measures Predicted 5 year risk of cardiovascular disease event, estimated number of people for whom treatment would be recommended, and disease events averted over 5 years at different treatment thresholds. Results 46 374 (12%) Auckland residents aged 35-79 receive drug treatment to lower their blood pressure, averting an estimated 1689 disease events over 5 years. Restricting treatment to individuals with blood pressure ⩾170/100 mm Hg and those with blood pressure between 150/90-169/99 mm Hg who have a predicted 5 year risk of disease ⩾10% would increase the net number for whom treatment would be recommended by 19 401. This 42% relative increase is predicted to avert 1139/1689 (68%) additional disease events overall over 5 years compared with current treatment. If the threshold for 5 year risk of disease is set at 15% the number recommended for treatment increases by <10% but about 620/1689 (37%) additional events can be averted. A 20% threshold decreases the net number of patients recommended for treatment by about 10% but averts 204/1689 (12%) more disease events than current treatment. Conclusions Implementing treatment guidelines that use treatment thresholds based on absolute risk could significantly improve the efficiency of drug treatment to lower blood pressure in primary care. PMID:10710577
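The threshold arithmetic in the abstract can be sketched as follows: for each absolute-risk treatment threshold, the number of people treated and the events averted over 5 years are computed assuming a relative treatment benefit of one quarter. The individual risk distribution is synthetic, so the counts are illustrative rather than the Auckland figures.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic 5-year cardiovascular risk for people with raised blood pressure
# (a stand-in for the Auckland cohort; a real analysis would use a risk equation).
risk_5yr = np.clip(rng.lognormal(mean=-2.6, sigma=0.8, size=50_000), 0, 0.6)

RELATIVE_BENEFIT = 0.25           # relative treatment benefit assumed in the abstract

for threshold in (0.10, 0.15, 0.20):
    treated = risk_5yr >= threshold
    n_treated = int(treated.sum())
    events_averted = float(risk_5yr[treated].sum() * RELATIVE_BENEFIT)
    print(f"threshold {threshold:.0%}: treat {n_treated:>6} people, "
          f"avert ~{events_averted:.0f} events over 5 years")
```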
Joint Modeling Approach for Semicompeting Risks Data with Missing Nonterminal Event Status
Hu, Chen; Tsodikov, Alex
2014-01-01
Semicompeting risks data, where a subject may experience sequential non-terminal and terminal events, and the terminal event may censor the non-terminal event but not vice versa, are widely available in many biomedical studies. We consider the situation when a proportion of subjects’ non-terminal events is missing, such that the observed data become a mixture of “true” semicompeting risks data and partially observed terminal event only data. An illness-death multistate model with proportional hazards assumptions is proposed to study the relationship between non-terminal and terminal events, and provide covariate-specific global and local association measures. Maximum likelihood estimation based on semiparametric regression analysis is used for statistical inference, and asymptotic properties of proposed estimators are studied using empirical process and martingale arguments. We illustrate the proposed method with simulation studies and data analysis of a follicular cell lymphoma study. PMID:24430204
Hsu, HE; Rydzak, CE; Cotich, KL; Wang, B; Sax, PE; Losina, E; Freedberg, KA; Goldie, SJ; Lu, Z; Walensky, RP
2010-01-01
Objectives We quantified the benefits (life expectancy gains) and harms (efavirenz-related teratogenicity) associated with using efavirenz in HIV-infected women of childbearing age in the United States. Methods We used data from the Women’s Interagency HIV Study in an HIV disease simulation model to estimate life expectancy in women who receive an efavirenz-based initial antiretroviral regimen compared with those who delay efavirenz use and receive a boosted protease inhibitor-based initial regimen. To estimate excess risk of teratogenic events with and without efavirenz exposure per 100,000 women, we incorporated literature-based rates of pregnancy, live births, and teratogenic events into a decision analytic model. We assumed a teratogenicity risk of 2.90 events/100 live births in women exposed to efavirenz during pregnancy and 2.68/100 live births in unexposed women. Results Survival for HIV-infected women who received an efavirenz-based initial antiretroviral therapy regimen was 0.89 years greater than for women receiving non-efavirenz-based initial therapy (28.91 vs. 28.02 years). The rate of teratogenic events was 77.26/100,000 exposed women, compared with 72.46/100,000 unexposed women. Survival estimates were sensitive to variations in treatment efficacy and AIDS-related mortality. Estimates of excess teratogenic events were most sensitive to pregnancy rates and number of teratogenic events/100 live births in efavirenz-exposed women. Conclusions Use of non-efavirenz-based initial antiretroviral therapy in HIV-infected women of childbearing age may reduce life expectancy gains from antiretroviral treatment, but may also prevent teratogenic events. Decision-making regarding efavirenz use presents a tradeoff between these two risks; this study can inform discussions between patients and health care providers. PMID:20561082
Validation in the Absence of Observed Events.
Lathrop, John; Ezell, Barry
2016-04-01
This article addresses the problem of validating models in the absence of observed events, in the area of weapons of mass destruction terrorism risk assessment. We address that problem with a broadened definition of "validation," based on stepping "up" a level to considering the reason why decisionmakers seek validation, and from that basis redefine validation as testing how well the model can advise decisionmakers in terrorism risk management decisions. We develop that into two conditions: validation must be based on cues available in the observable world; and it must focus on what can be done to affect that observable world, i.e., risk management. That leads to two foci: (1) the real-world risk generating process, and (2) best use of available data. Based on our experience with nine WMD terrorism risk assessment models, we then describe three best use of available data pitfalls: SME confidence bias, lack of SME cross-referencing, and problematic initiation rates. Those two foci and three pitfalls provide a basis from which we define validation in this context in terms of four tests--Does the model: … capture initiation? … capture the sequence of events by which attack scenarios unfold? … consider unanticipated scenarios? … consider alternative causal chains? Finally, we corroborate our approach against three validation tests from the DOD literature: Is the model a correct representation of the process to be simulated? To what degree are the model results comparable to the real world? Over what range of inputs are the model results useful? © 2015 Society for Risk Analysis.
Hamazaki, Yuko; Morikawa, Yuko; Nakamura, Koshi; Sakurai, Masaru; Miura, Katsuyuki; Ishizaki, Masao; Kido, Teruhiko; Naruse, Yuchi; Suwazono, Yasushi; Nakagawa, Hideaki
2011-09-01
Although previous epidemiological studies have investigated the relationship between sleep duration and various cardiovascular events, the results have been inconsistent. Accordingly, we conducted a follow-up survey to investigate the relationship between sleep duration and cardiovascular events among male workers, accounting for occupational factors that might confound the true relationship. A total of 2282 male employees aged 35-54 years based in a factory in Japan were followed for 14 years. The risk of cardiovascular events was compared among 4 groups stratified based on sleep duration at baseline (<6, 6-6.9, 7-7.9, and ≥8 hours). Cardiovascular events included stroke, coronary events and sudden cardiac death. The hazard ratios for events were calculated using a Cox proportional hazards model, with the 7-7.9-hour group serving as a reference. The model was adjusted for potential confounders including traditional cardiovascular risk factors and working characteristics. During 14 years of follow-up, 64 cardiovascular events were recorded including 30 strokes, 27 coronary events and 7 sudden cardiac deaths. After adjustment for possible confounders, the hazard ratios for cardiovascular and coronary events in the <6-hour group were 3.49 [95% confidence interval (95% CI) 1.30-9.40] and 4.95 (95% CI 1.31-18.73), respectively. There was no significant increment in the risk of stroke for any sleep duration groups. Short sleep duration (<6 hours) was a significant risk factor for coronary events in a Japanese male working population.
Forbes, David; Lewis, Virginia; Varker, Tracey; Phelps, Andrea; O'Donnell, Meaghan; Wade, Darryl J; Ruzek, Josef I; Watson, Patricia; Bryant, Richard A; Creamer, Mark
2011-01-01
International clinical practice guidelines for the management of psychological trauma recommend Psychological First Aid (PFA) as an early intervention for survivors of potentially traumatic events. These recommendations are consensus-based, and there is little published evidence assessing the effectiveness of PFA. This is not surprising given the nature of the intervention and the complicating factors involved in any evaluation of PFA. There is, nevertheless, an urgent need for stronger evidence evaluating its effectiveness. The current paper posits that the implementation and evaluation of PFA within high risk organizational settings is an ideal place to start. The paper provides a framework for a phasic approach to implementing PFA within such settings and presents a model for evaluating its effectiveness using a logic- or theory-based approach which considers both pre-event and post-event factors. Phases 1 and 2 of the PFA model are pre-event actions, and phases 3 and 4 are post-event actions. It is hoped that by using the Phased PFA model and evaluation method proposed in this paper, future researchers will begin to undertake the important task of building the evidence about the most effective approach to providing PFA in high risk organizational and community disaster settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redus, K.S.
2007-07-01
The foundation of statistics deals with (a) how to measure and collect data and (b) how to identify models using estimates of statistical parameters derived from the data. Risk is a term used by the statistical community and those that employ statistics to express the results of a statistically based study. Statistical risk is represented as a probability that, for example, a statistical model is sufficient to describe a data set; but risk is also interpreted as a measure of the worth of one alternative when compared to another. The common thread of any risk-based problem is the combination of (a) the chance an event will occur with (b) the value of the event. This paper presents an introduction to, and some examples of, statistical risk-based decision making in a quantitative, visual, and linguistic sense. This should help in understanding areas of radioactive waste management that can be suitably expressed using statistical risk and vice versa. (authors)
Validation in the Absence of Observed Events
Lathrop, John; Ezell, Barry
2015-07-22
Here our paper addresses the problem of validating models in the absence of observed events, in the area of Weapons of Mass Destruction terrorism risk assessment. We address that problem with a broadened definition of “Validation,” based on “backing up” to the reason why modelers and decision makers seek validation, and from that basis re-define validation as testing how well the model can advise decision makers in terrorism risk management decisions. We develop that into two conditions: Validation must be based on cues available in the observable world; and it must focus on what can be done to affect that observable world, i.e. risk management. That in turn leads to two foci: (1) the risk generating process, (2) best use of available data. Based on our experience with nine WMD terrorism risk assessment models, we then describe three best use of available data pitfalls: SME confidence bias, lack of SME cross-referencing, and problematic initiation rates. Those two foci and three pitfalls provide a basis from which we define validation in this context in terms of four tests -- Does the model: … capture initiation? … capture the sequence of events by which attack scenarios unfold? … consider unanticipated scenarios? … consider alternative causal chains? Finally, we corroborate our approach against three key validation tests from the DOD literature: Is the model a correct representation of the simuland? To what degree are the model results comparable to the real world? Over what range of inputs are the model results useful?
NASA Astrophysics Data System (ADS)
Gao, Lu; Zhang, Ying; Ding, Guoyong; Liu, Qiyong; Wang, Changke; Jiang, Baofa
2016-12-01
Assessing and responding to health risk of climate change is important because of its impact on the natural and societal ecosystems. More frequent and severe flood events will occur in China due to climate change. Given that population is projected to increase, more people will be vulnerable to flood events, which may lead to an increased incidence of HAV infection in the future. This population-based study is going to project the future health burden of HAV infection associated with flood events in Huai River Basin of China. The study area covered four cities of Anhui province in China, where flood events were frequent. A time-series adjusted Poisson regression model was developed to quantify the risks of flood events on HAV infection based on the number of daily cases during summer seasons from 2005 to 2010, controlling for other meteorological variables. Projections of HAV infection in 2020 and 2030 were estimated based on the scenarios of flood events and demographic data. The Poisson regression model suggested that compared with the periods without flood events, the risks of severe flood events for HAV infection were significant (OR = 1.28, 95 % CI 1.05-1.55), while risks were not significant from moderate flood events (OR = 1.16, 95 % CI 0.72-1.87) and mild flood events (OR = 1.14, 95 % CI 0.87-1.48). Using the 2010 baseline data and the flood event scenarios (one severe flood event), the increased incidence of HAV infection was estimated to be between 0.126/10⁵ and 0.127/10⁵ for 2020. Similarly, the increased HAV infection incidence for 2030 was projected to be between 0.382/10⁵ and 0.399/10⁵. Our study has, for the first time, quantified the increased incidence of HAV infection that will result from flood events in Anhui, China, in 2020 and 2030. The results have implications for public health preparation for developing public health responses to reduce HAV infection during future flood events.
Toth, Peter P; Danese, Mark; Villa, Guillermo; Qian, Yi; Beaubrun, Anne; Lira, Armando; Jansen, Jeroen P
2017-06-01
To estimate real-world cardiovascular disease (CVD) burden and value-based price range of evolocumab for a US-context, high-risk, secondary-prevention population. Burden of CVD was assessed using the UK-based Clinical Practice Research Datalink (CPRD) in order to capture complete CV burden including CV mortality. Patients on standard of care (SOC; high-intensity statins) in CPRD were selected based on eligibility criteria of FOURIER, a phase 3 CV outcomes trial of evolocumab, and categorized into four cohorts: high-risk prevalent atherosclerotic CVD (ASCVD) cohort (n = 1448), acute coronary syndrome (ACS) (n = 602), ischemic stroke (IS) (n = 151), and heart failure (HF) (n = 291) incident cohorts. The value-based price range for evolocumab was assessed using a previously published economic model. The model incorporated CPRD CV event rates and considered CV event reduction rate ratios per 1 mmol/L reduction in low-density lipoprotein-cholesterol (LDL-C) from a meta-analysis of statin trials by the Cholesterol Treatment Trialists Collaboration (CTTC), i.e. CTTC relationship. Multiple-event rates of composite CV events (ACS, IS, or coronary revascularization) per 100 patient-years were 12.3 for the high-risk prevalent ASCVD cohort, and 25.7, 13.3, and 23.3, respectively, for incident ACS, IS, and HF cohorts. Approximately one-half (42%) of the high-risk ASCVD patients with a new CV event during follow-up had a subsequent CV event. Combining these real-world event rates and the CTTC relationship in the economic model, the value-based price range (credible interval) under a willingness-to-pay threshold of $150,000/quality-adjusted life-year gained for evolocumab was $11,990 ($9,341-$14,833) to $16,856 ($12,903-$20,678) in ASCVD patients with baseline LDL-C levels ≥70 mg/dL and ≥100 mg/dL, respectively. Real-world CVD burden is substantial. Using the observed CVD burden in CPRD and the CTTC relationship, the cost-effectiveness analysis showed that, accounting for uncertainties, the expected value-based price for evolocumab is higher than its current annual cost, as long as the payer discount off list price is greater than 20%.
MERINOVA: Meteorological risks as drivers of environmental innovation in agro-ecosystem management
NASA Astrophysics Data System (ADS)
Gobin, Anne; Oger, Robert; Marlier, Catherine; Van De Vijver, Hans; Vandermeulen, Valerie; Van Huylenbroeck, Guido; Zamani, Sepideh; Curnel, Yannick; Mettepenningen, Evi
2013-04-01
The BELSPO-funded project 'MERINOVA' deals with risks associated with extreme weather phenomena and with risks of biological origin such as pests and diseases. The major objectives of the proposed project are to characterise extreme meteorological events, assess the impact on Belgian agro-ecosystems, characterise their vulnerability and resilience to these events, and explore innovative adaptation options for agricultural risk management. The project comprises five major parts that reflect the chain of risks: (i) Hazard: Assessing the likely frequency and magnitude of extreme meteorological events by means of probability density functions; (ii) Impact: Analysing the potential bio-physical and socio-economic impact of extreme weather events on agro-ecosystems in Belgium using process-based modelling techniques commensurate with the regional scale; (iii) Vulnerability: Identifying the most vulnerable agro-ecosystems using fuzzy multi-criteria and spatial analysis; (iv) Risk Management: Uncovering innovative risk management and adaptation options using actor-network theory and fuzzy cognitive mapping techniques; and, (v) Communication: Communicating to research, policy and practitioner communities using web-based techniques. The different tasks of the MERINOVA project require expertise in several scientific disciplines: meteorology, statistics, spatial database management, agronomy, bio-physical impact modelling, socio-economic modelling, actor-network theory, fuzzy cognitive mapping techniques. This expertise is shared by the four scientific partners, who each lead one work package. The MERINOVA project will concentrate on promoting a robust and flexible framework by demonstrating its performance across Belgian agro-ecosystems, and by ensuring its relevance to policy makers and practitioners. Impacts developed from physically based models will not only provide information on the state of the damage at any given time, but also assist in understanding the links between different factors causing damage and determining bio-physical vulnerability. Socio-economic impacts will enlarge the basis for vulnerability mapping, risk management and adaptation options. A strong expert and end-user network will be established to help disseminate and exploit project results to meet user needs.
Modeling Compound Flood Hazards in Coastal Embayments
NASA Astrophysics Data System (ADS)
Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.
2017-12-01
Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple factors threaten increasing flood hazards (e.g. sea level rise and river flooding). Quantitative risk assessment is required for administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels such as 100 and 500 year return period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood level intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach for flood risk assessment that coordinates multivariate extreme analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (with the highest joint probability density), preferred marginal scenario and reproduced time series of ensembles based on Monte Carlo sampling of bivariate hazard domain. The comparison between resulting extreme water dynamics under the compound hazard scenarios explained above provides an insight to the strengths/weaknesses of each approach and helps modelers choose the appropriate scenario that best fit to the needs of their project. The proposed risk assessment approach can help flood hazard modeling practitioners achieve a more reliable estimate of risk, by cautiously reducing the dimensionality of the hazard analysis.
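To make the dependence argument concrete, the sketch below fits a Gaussian copula (one possible choice; the study may use other copula families) to synthetic river discharge and sea-level maxima and compares the joint exceedance probability of a design pair with and without the fitted dependence structure. All data and parameter values are illustrative.

```python
# Illustrative Gaussian-copula treatment of compound flood drivers (synthetic data;
# not the study's copula family or design-event definition).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 500                                              # e.g. years of annual maxima
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
discharge = stats.gumbel_r.ppf(stats.norm.cdf(z[:, 0]), loc=800, scale=200)   # m^3/s
sea_level = stats.gumbel_r.ppf(stats.norm.cdf(z[:, 1]), loc=1.0, scale=0.3)   # m

# Empirical ranks -> pseudo-observations on [0,1]^2, then Gaussian copula parameter
u = stats.rankdata(discharge) / (n + 1)
v = stats.rankdata(sea_level) / (n + 1)
rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]

# Joint exceedance probability of a design pair, with and without dependence
q_d, q_s = np.quantile(discharge, 0.99), np.quantile(sea_level, 0.99)
p_d, p_s = np.mean(discharge > q_d), np.mean(sea_level > q_s)
cop = stats.multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
# P(D>q_d, S>q_s) = 1 - F_D - F_S + C(F_D, F_S) under the fitted copula
joint = 1 - (1 - p_d) - (1 - p_s) + cop.cdf([stats.norm.ppf(1 - p_d), stats.norm.ppf(1 - p_s)])
print(f"independent assumption: {p_d * p_s:.5f}   copula-based: {joint:.5f}")
```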
MobRISK: a model for assessing the exposure of road users to flash flood events
NASA Astrophysics Data System (ADS)
Shabou, Saif; Ruin, Isabelle; Lutoff, Céline; Debionne, Samuel; Anquetin, Sandrine; Creutin, Jean-Dominique; Beaufils, Xavier
2017-09-01
Recent flash flood impact studies highlight that road networks are often disrupted due to adverse weather and flash flood events. Road users are thus particularly exposed to road flooding during their daily mobility. Previous exposure studies, however, do not take into consideration population mobility. Recent advances in transportation research provide an appropriate framework for simulating individual travel-activity patterns using an activity-based approach. These activity-based mobility models enable the prediction of the sequence of activities performed by individuals and locating them with a high spatial-temporal resolution. This paper describes the development of the MobRISK microsimulation system: a model for assessing the exposure of road users to extreme hydrometeorological events. MobRISK aims at providing an accurate spatiotemporal exposure assessment by integrating travel-activity behaviors and mobility adaptation with respect to weather disruptions. The model is applied in a flash-flood-prone area in southern France to assess motorists' exposure to the September 2002 flash flood event. The results show that risk of flooding mainly occurs in principal road links with considerable traffic load. However, a lag time between the timing of the road submersion and persons crossing these roads contributes to reducing the potential vehicle-related fatal accidents. It is also found that sociodemographic variables have a significant effect on individual exposure. Thus, the proposed model demonstrates the benefits of considering spatiotemporal dynamics of population exposure to flash floods and presents an important improvement in exposure assessment methods. Such improved characterization of road user exposures can present valuable information for flood risk management services.
Estimating the Probability of Rare Events Occurring Using a Local Model Averaging.
Chen, Jin-Hua; Chen, Chun-Shu; Huang, Meng-Fan; Lin, Hung-Chih
2016-10-01
In statistical applications, logistic regression is a popular method for analyzing binary data accompanied by explanatory variables. But when one of the two outcomes is rare, the estimation of model parameters has been shown to be severely biased and hence estimating the probability of rare events occurring based on a logistic regression model would be inaccurate. In this article, we focus on estimating the probability of rare events occurring based on logistic regression models. Instead of selecting a best model, we propose a local model averaging procedure based on a data perturbation technique applied to different information criteria to obtain different probability estimates of rare events occurring. Then an approximately unbiased estimator of Kullback-Leibler loss is used to choose the best one among them. We design complete simulations to show the effectiveness of our approach. For illustration, a necrotizing enterocolitis (NEC) data set is analyzed. © 2016 Society for Risk Analysis.
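The article's procedure is a local, perturbation-based model averaging; as a simplified stand-in, the sketch below averages rare-event probabilities from several candidate logistic models using ordinary Akaike weights on synthetic data, only to illustrate the averaging idea.

```python
# Simplified stand-in for model-averaged rare-event probabilities: fit several
# candidate logistic models, weight them by Akaike weights, and average predicted
# probabilities. (The article's local, perturbation-based procedure is not reproduced.)
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
X = rng.normal(size=(n, 3))
eta = -4.5 + 1.2 * X[:, 0] + 0.5 * X[:, 1]            # rare outcome (~1-3% events)
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

candidates = [[0], [0, 1], [0, 1, 2]]                  # covariate subsets
fits, aics = [], []
for cols in candidates:
    design = sm.add_constant(X[:, cols])
    res = sm.Logit(y, design).fit(disp=0)
    fits.append((cols, res))
    aics.append(res.aic)

aics = np.array(aics)
weights = np.exp(-0.5 * (aics - aics.min()))
weights /= weights.sum()                               # Akaike weights

x_new = np.array([1.5, 0.0, 0.0])                      # a new observation
p_avg = sum(w * res.predict(sm.add_constant(x_new[cols].reshape(1, -1), has_constant='add'))[0]
            for w, (cols, res) in zip(weights, fits))
print(f"model-averaged event probability: {p_avg:.4f}")
```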
Simulating lifetime outcomes associated with complications for people with type 1 diabetes.
Lung, Tom W C; Clarke, Philip M; Hayes, Alison J; Stevens, Richard J; Farmer, Andrew
2013-06-01
The aim of this study was to develop a discrete-time simulation model for people with type 1 diabetes mellitus, to estimate and compare mean life expectancy and quality-adjusted life-years (QALYs) over a lifetime between intensive and conventional blood glucose treatment groups. We synthesized evidence on type 1 diabetes patients using several published sources. The simulation model was based on 13 equations to estimate risks of events and mortality. Cardiovascular disease (CVD) risk was obtained from results of the DCCT (diabetes control and complications trial). Mortality post-CVD event was based on a study using linked administrative data on people with diabetes from Western Australia. Information on incidence of renal disease and the progression to CVD was obtained from studies in Finland and Italy. Lower-extremity amputation (LEA) risk was based on the type 1 diabetes Swedish inpatient registry, and the risk of blindness was obtained from results of a German-based study. Where diabetes-specific data were unavailable, information from other populations was used. We examine the degree and source of parameter uncertainty and illustrate an application of the model in estimating lifetime outcomes of using intensive and conventional treatments for blood glucose control. From 15 years of age, male and female patients had an estimated life expectancy of 47.2 (95 % CI 35.2-59.2) and 52.7 (95 % CI 41.7-63.6) years in the intensive treatment group. The model produced estimates of the lifetime benefits of intensive treatment for blood glucose from the DCCT of 4.0 (95 % CI 1.2-6.8) QALYs for women and 4.6 (95 % CI 2.7-6.9) QALYs for men. Absolute risk per 1,000 person-years for fatal CVD events was simulated to be 1.37 and 2.51 in intensive and conventional treatment groups, respectively. The model incorporates diabetic complications risk data from a type 1 diabetes population and synthesizes other type 1-specific data to estimate long-term outcomes of CVD, end-stage renal disease, LEA and risk of blindness, along with life expectancy and QALYs. External validation was carried out using life expectancy and absolute risk for fatal CVD events. Because of the flexible and transparent nature of the model, it has many potential future applications.
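A toy discrete-time (annual-cycle) simulation in the spirit of such models is sketched below: it compares life expectancy and QALYs under two assumed annual CVD risks standing in for intensive and conventional treatment. All probabilities and utility weights are invented and do not reproduce the 13 published equations.

```python
# Toy discrete-time simulation of life-years and QALYs under two annual CVD risks
# (stand-ins for "intensive" vs "conventional" arms). Probabilities, utilities and
# the mortality model are illustrative only.
import numpy as np

def simulate(annual_cvd_risk, n_patients=10000, start_age=15, max_age=100, seed=4):
    rng = np.random.default_rng(seed)
    life_years = np.zeros(n_patients)
    qalys = np.zeros(n_patients)
    for i in range(n_patients):
        age, alive, had_cvd = start_age, True, False
        while alive and age < max_age:
            p_death = 0.001 * np.exp(0.09 * (age - start_age)) + (0.05 if had_cvd else 0.0)
            if rng.random() < p_death:
                alive = False
                break
            if not had_cvd and rng.random() < annual_cvd_risk:
                had_cvd = True
            life_years[i] += 1
            qalys[i] += 0.70 if had_cvd else 0.85      # illustrative utility weights
            age += 1
    return life_years.mean(), qalys.mean()

for label, risk in [("intensive", 0.004), ("conventional", 0.008)]:
    ly, q = simulate(risk)
    print(f"{label}: life expectancy ~{ly:.1f} years, QALYs ~{q:.1f}")
```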
VanWagner, Lisa B; Ning, Hongyan; Whitsett, Maureen; Levitsky, Josh; Uttal, Sarah; Wilkins, John T; Abecassis, Michael M; Ladner, Daniela P; Skaro, Anton I; Lloyd-Jones, Donald M
2017-12-01
Cardiovascular disease (CVD) complications are important causes of morbidity and mortality after orthotopic liver transplantation (OLT). There is currently no preoperative risk-assessment tool that allows physicians to estimate the risk for CVD events following OLT. We sought to develop a point-based prediction model (risk score) for CVD complications after OLT, the Cardiovascular Risk in Orthotopic Liver Transplantation risk score, among a cohort of 1,024 consecutive patients aged 18-75 years who underwent first OLT in a tertiary-care teaching hospital (2002-2011). The main outcome measures were major 1-year CVD complications, defined as death from a CVD cause or hospitalization for a major CVD event (myocardial infarction, revascularization, heart failure, atrial fibrillation, cardiac arrest, pulmonary embolism, and/or stroke). The bootstrap method yielded bias-corrected 95% confidence intervals for the regression coefficients of the final model. Among 1,024 first OLT recipients, major CVD complications occurred in 329 (32.1%). Variables selected for inclusion in the model (using model optimization strategies) included preoperative recipient age, sex, race, employment status, education status, history of hepatocellular carcinoma, diabetes, heart failure, atrial fibrillation, pulmonary or systemic hypertension, and respiratory failure. The discriminative performance of the point-based score (C statistic = 0.78, bias-corrected C statistic = 0.77) was superior to other published risk models for postoperative CVD morbidity and mortality, and it had appropriate calibration (Hosmer-Lemeshow P = 0.33). The point-based risk score can identify patients at risk for CVD complications after OLT surgery (available at www.carolt.us); this score may be useful for identification of candidates for further risk stratification or other management strategies to improve CVD outcomes after OLT. (Hepatology 2017;66:1968-1979). © 2017 by the American Association for the Study of Liver Diseases.
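For readers unfamiliar with point-based scores, the sketch below shows the generic Framingham/Sullivan-style construction: each regression coefficient is divided by a reference coefficient and rounded to integer points. The coefficients and the worked patient are hypothetical and are not the published risk score.

```python
# Generic sketch of turning regression coefficients into an integer point score
# (Framingham/Sullivan-style scaling: divide by a reference beta and round).
# All coefficients below are hypothetical.
betas = {                      # hypothetical log-odds (or log-hazard) coefficients
    "age_per_10yr": 0.35,
    "diabetes": 0.55,
    "heart_failure": 0.90,
    "atrial_fibrillation": 0.70,
    "pulmonary_hypertension": 0.60,
}
reference = betas["age_per_10yr"]          # 1 point = the effect of 10 years of age

points = {k: int(round(b / reference)) for k, b in betas.items()}
print(points)

# Score a hypothetical 60-year-old recipient with diabetes and atrial fibrillation
patient = {"age_per_10yr": (60 - 20) / 10, "diabetes": 1, "heart_failure": 0,
           "atrial_fibrillation": 1, "pulmonary_hypertension": 0}
total = sum(points[k] * patient[k] for k in points)
print(f"total points: {total:.0f}")
```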
How Confident can we be in Flood Risk Assessments?
NASA Astrophysics Data System (ADS)
Merz, B.
2017-12-01
Flood risk management should be based on risk analyses quantifying the risk and its reduction for different risk reduction strategies. However, validating risk estimates by comparing model simulations with past observations is hardly possible, since the assessment typically encompasses extreme events and their impacts that have not been observed before. Hence, risk analyses are strongly based on assumptions and expert judgement. This situation opens the door for cognitive biases, such as 'illusion of certainty', 'overconfidence' or 'recency bias'. Such biases operate specifically in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk analyses. This contribution discusses how confident we can be in flood risk assessments, and reflects on more rigorous approaches towards their validation.
Li, Jian; Zhang, Min; Loerbroks, Adrian; Angerer, Peter; Siegrist, Johannes
2015-01-01
Though much evidence indicates that work stress increases the risk of incident of coronary heart disease (CHD), little is known about the role of work stress in the development of recurrent CHD events. The objective of this study was to review and synthesize the existing epidemiological evidence on whether work stress increases the risk of recurrent CHD events in patients with the first CHD. A systematic literature search in the PubMed database (January 1990 - December 2013) for prospective studies was performed. Inclusion criteria included: peer-reviewed English papers with original data, studies with substantial follow-up (> 3 years), end points defined as cardiac death or nonfatal myocardial infarction, as well as work stress assessed with reliable and valid instruments. Meta-analysis using random-effects modeling was conducted in order to synthesize the observed effects across the studies. Five papers derived from 4 prospective studies conducted in Sweden and Canada were included in this systematic review. The measurement of work stress was based on the Demand- Control model (4 papers) or the Effort-Reward Imbalance model (1 paper). According to the estimation by meta-analysis based on 4 papers, a significant effect of work stress on the risk of recurrent CHD events (hazard ratio: 1.65, 95% confidence interval: 1.23-2.22) was observed. Our findings suggest that, in patients with the first CHD, work stress is associated with an increased relative risk of recurrent CHD events by 65%. Due to the limited literature, more well-designed prospective research is needed to examine this association, in particular, from other than western regions of the world. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
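For illustration, a generic DerSimonian-Laird random-effects pooling of log hazard ratios is sketched below; the four study-level estimates are placeholders, not the values extracted in this review.

```python
# Generic DerSimonian-Laird random-effects pooling of log hazard ratios.
# The study-level estimates below are illustrative placeholders.
import numpy as np

hr = np.array([1.4, 1.9, 1.5, 2.1])                 # study-level hazard ratios (hypothetical)
ci_upper = np.array([2.3, 3.5, 2.6, 4.0])           # upper 95% CI limits (hypothetical)

y = np.log(hr)                                      # log hazard ratios
se = (np.log(ci_upper) - y) / 1.96                  # standard errors from the CI width
w_fixed = 1 / se**2

# Between-study variance (tau^2) via DerSimonian-Laird
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
Q = np.sum(w_fixed * (y - y_fixed) ** 2)
df = len(y) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

w_rand = 1 / (se**2 + tau2)
pooled = np.sum(w_rand * y) / np.sum(w_rand)
se_pooled = np.sqrt(1 / np.sum(w_rand))
print(f"pooled HR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-{np.exp(pooled + 1.96 * se_pooled):.2f})")
```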
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how this may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global asset exposure to global flood hazard models, were also recently developed to improve the resolution of the risk analysis and applied through CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
Robinson, Tom; Elley, C Raina; Wells, Sue; Robinson, Elizabeth; Kenealy, Tim; Pylypchuk, Romana; Bramley, Dale; Arroll, Bruce; Crengle, Sue; Riddell, Tania; Ameratunga, Shanthi; Metcalf, Patricia; Drury, Paul L
2012-09-01
New Zealand (NZ) guidelines recommend treating people for cardiovascular disease (CVD) risk on the basis of five-year absolute risk using a NZ adaptation of the Framingham risk equation. A diabetes-specific Diabetes Cohort Study (DCS) CVD predictive risk model has been developed and validated using NZ Get Checked data. To revalidate the DCS model with an independent cohort of people routinely assessed using PREDICT, a web-based CVD risk assessment and management programme. People with Type 2 diabetes without pre-existing CVD were identified amongst people who had a PREDICT risk assessment between 2002 and 2005. From this group we identified those with sufficient data to allow estimation of CVD risk with the DCS models. We compared the DCS models with the NZ Framingham risk equation in terms of discrimination, calibration, and reclassification implications. Of 3044 people in our study cohort, 1829 people had complete data and therefore had CVD risks calculated. Of this group, 12.8% (235) had a cardiovascular event during the five-year follow-up. The DCS models had better discrimination than the currently used equation, with C-statistics being 0.68 for the two DCS models and 0.65 for the NZ Framingham model. The DCS models were superior to the NZ Framingham equation at discriminating people with diabetes who will have a cardiovascular event. The adoption of a DCS model would lead to a small increase in the number of people with diabetes who are treated with medication, but potentially more CVD events would be avoided.
A Global Drought and Flood Catalogue for the past 100 years
NASA Astrophysics Data System (ADS)
Sheffield, J.; He, X.; Peng, L.; Pan, M.; Fisher, C. K.; Wood, E. F.
2017-12-01
Extreme hydrological events cause the most impacts of natural hazards globally, impacting on a wide range of sectors including, most prominently, agriculture, food security and water availability and quality, but also on energy production, forestry, health, transportation and fisheries. Understanding how floods and droughts intersect, and have changed in the past provides the basis for understanding current risk and how it may change in the future. To do this requires an understanding of the mechanisms associated with events and therefore their predictability, attribution of long-term changes in risk, and quantification of projections of changes in the future. Of key importance are long-term records of relevant variables so that risk can be quantified more accurately, given the growing acknowledgement that risk is not stationary under long-term climate variability and climate change. To address this, we develop a catalogue of drought and flood events based on land surface and hydrodynamic modeling, forced by a hybrid meteorological dataset that draws from the continuity and coverage of reanalysis, and satellite datasets, merged with global gauge databases. The meteorological dataset is corrected for temporal inhomogeneities, spurious trends and variable inter-dependencies to ensure long-term consistency, as well as realistic representation of short-term variability and extremes. The VIC land surface model is run for the past 100 years at 0.25-degree resolution for global land areas. The VIC runoff is then used to drive the CaMa-Flood hydrodynamic model to obtain information on flood inundation risk. The model outputs are compared to satellite based estimates of flood and drought conditions and the observational flood record. The data are analyzed in terms of the spatio-temporal characteristics of large-scale flood and drought events with a particular focus on characterizing the long-term variability in risk. Significant changes in risk occur on multi-decadal time scales and are mostly associated with variability in the North Atlantic and Pacific. The catalogue can be used for analysis of extreme events, risk assessment, and as a benchmark for model evaluation.
History of Fire Events in the U.S. Commercial Nuclear Industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bijan Najafi; Joglar-Biloch, Francisco; Kassawara, Robert P.
2002-07-01
Over the past decade, interest in performance-based fire protection has increased within the nuclear industry. In support of this growing interest, in 1997 the Electric Power Research Institute (EPRI) developed a long-range plan to develop/improve data and tools needed to support Risk-Informed/Performance-Based fire protection. This plan calls for continued improvement in collection and use of information obtained from fire events at nuclear plants. The data collection process has the objectives of improving the insights gained from such data and reducing the uncertainty in fire risk and fire modeling methods in order to make them a more reliable basis for performance-based fire protection programs. In keeping with these objectives, EPRI continues to collect, review and analyze fire events in support of the nuclear industry. EPRI collects these records in cooperation with Nuclear Electric Insurance Limited (NEIL), by compiling public fire event reports and by direct solicitation of U.S. nuclear facilities. The EPRI fire data collection project is based on the principle that the understanding of history is one of the cornerstones of improving fire protection technology and practice. Therefore, the goal has been to develop and maintain a comprehensive database of fire events with flexibility to support various aspects of fire protection engineering. With more than 1850 fire records over a period of three decades and 2400 reactor years, this is the most comprehensive database of nuclear power industry fire events in existence today. In general, the frequency of fires in the U.S. commercial nuclear industry remains constant. In a few cases, e.g., transient fires and fires in BWR offgas/recombiner systems, where either increasing or decreasing trends are observed, these trends tend to slow after 1980. The key issues in improving the quality of the data remain the consistency of recording and reporting of fire events and the difficulty of collecting records. EPRI has made significant progress towards improving the quality of the fire events data through use of multiple collection methods as well as its review and verification. To date, EPRI has used these data to develop a generic fire ignition frequency model for the U.S. nuclear power industry (Ref. 1, 4 and 5) as well as to support other models in support of EPRI Fire Risk Methods such as a cable fire manual suppression model. EPRI will continue its effort to collect and analyze operating data to support risk informed/performance based fire safety engineering, including collection and analysis of impairment data for fire protection systems and features. This paper provides details on the collection and application of fire events to risk informed/performance based fire protection. The paper also provides valuable insights into improving both collection and use of fire events data. (authors)
The development of a simulation model of the treatment of coronary heart disease.
Cooper, Keith; Davies, Ruth; Roderick, Paul; Chase, Debbie; Raftery, James
2002-11-01
A discrete event simulation models the progress of patients who have had a coronary event through their treatment pathways and subsequent coronary events. The main risk factors in the model are age, sex, history of previous events and the extent of the coronary vessel disease. The model parameters are based on data collected from epidemiological studies of incidence and prognosis, efficacy studies, national surveys and treatment audits. The simulation results were validated against different sources of data. The initial results show that increasing revascularisation has considerable implications for resource use but has little impact on patient mortality.
Ray, Anne E; Stapleton, Jerod L; Turrisi, Rob; Mun, Eun-Young
2014-09-01
College students who play drinking games (DGs) more frequently report higher levels of alcohol use and experience more alcohol-related harm. However, the extent to which they are at risk for increased consumption and harm as a result of DG play on a given event after accounting for their typical DG participation, and typical and event drinking, is unclear. We examined whether first-year students consumed more alcohol and were more likely to experience consequences on drinking occasions when they played DGs. Participants (n = 336) completed up to six web-based surveys following weekend drinking events in their first semester. Alcohol use, DG play, and consequences were reported for the Friday and Saturday prior to each survey. Typical DG tendencies were controlled in all models. Typical and event alcohol use were controlled in models predicting risk for consequences. Participants consumed more alcohol on DG versus non-DG events. All students were more likely to experience blackout drinking consequences when they played DGs. Women were more likely to experience social-interpersonal consequences when they played DGs. DG play is an event-specific risk factor for increased alcohol use among first-year students, regardless of individual DG play tendencies. Further, event DG play signals increased risk for blackout drinking consequences for all students, and social-interpersonal consequences for women, aside from the amount of alcohol consumed on those occasions as well as typical drinking behaviors. Prevention efforts to reduce high-risk drinking may be strengthened by highlighting both event- and person-specific risks of DG play.
A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment
NASA Astrophysics Data System (ADS)
Wu, Wenyan; Westra, Seth; Leonard, Michael
2017-04-01
Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood producing mechanisms, such as extreme rainfall. Therefore, storm surge has always been a research focus in coastal flood risk assessment. Numerical models have often been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015) and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a large amount of input information, and difficulties arise when insufficient data are available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge based on historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine the long-term trend in storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). In these studies, often only the peak of surge events is used, which results in the loss of dynamic information within a tidal cycle or surge event (i.e. a time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine the different attributes (e.g. peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High-quality hourly storm surge records from 15 tide gauges around Australia were examined. It was found that there is significant location-to-location and seasonal variability in the peak and duration of storm surge events, which provides additional insight into coastal flood risk. In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability analysis for flood risk analysis considering multiple flood producing mechanisms. This is the first step in applying a Monte Carlo based joint probability method for flood risk assessment.
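As an illustration of the two-parameter basis-function idea, the sketch below fits a triangular and an exponential pulse to a synthetic surge hydrograph to recover a peak and a duration-like parameter; the exact parameterisation used in the study is not reproduced.

```python
# Sketch: fit two-parameter pulse shapes (triangular and exponential) to a synthetic
# storm-surge event to summarise its peak and duration. Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def triangular(t, peak, half_duration, t0=0.0):
    return np.maximum(0.0, peak * (1 - np.abs(t - t0) / half_duration))

def exponential(t, peak, tau, t0=0.0):
    return peak * np.exp(-np.abs(t - t0) / tau)

# Synthetic hourly surge residuals centred on the event peak (metres)
t = np.arange(-24, 25, 1.0)
rng = np.random.default_rng(5)
surge = triangular(t, 1.2, 15.0) + rng.normal(0, 0.05, t.size)

(tri_peak, tri_dur), _ = curve_fit(triangular, t, surge, p0=[1.0, 10.0])
(exp_peak, exp_tau), _ = curve_fit(exponential, t, surge, p0=[1.0, 10.0])
print(f"triangular fit: peak={tri_peak:.2f} m, half-duration={tri_dur:.1f} h")
print(f"exponential fit: peak={exp_peak:.2f} m, decay time={exp_tau:.1f} h")
```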
2014-01-01
Background Recurrent events data analysis is common in biomedicine. Literature review indicates that most statistical models used for such data are often based on time to the first event or consider events within a subject as independent. Even when taking into account the non-independence of recurrent events within subjects, data analyses are mostly done with continuous risk interval models, which may not be appropriate for treatments with sustained effects (e.g., drug treatments of malaria patients). Furthermore, results can be biased in cases of a confounding factor implying different risk exposure, e.g. in malaria transmission: if subjects are located at zones showing different environmental factors implying different risk exposures. Methods This work aimed to compare four different approaches by analysing recurrent malaria episodes from a clinical trial assessing the effectiveness of three malaria treatments [artesunate + amodiaquine (AS + AQ), artesunate + sulphadoxine-pyrimethamine (AS + SP) or artemether-lumefantrine (AL)], with continuous and discontinuous risk intervals: Andersen-Gill counting process (AG-CP), Prentice-Williams-Peterson counting process (PWP-CP), a shared gamma frailty model, and Generalized Estimating Equations model (GEE) using Poisson distribution. Simulations were also made to analyse the impact of the addition of a confounding factor on malaria recurrent episodes. Results Using the discontinuous interval analysis, AG-CP and Shared gamma frailty models provided similar estimations of treatment effect on malaria recurrent episodes when adjusted on age category. The patients had significant decreased risk of recurrent malaria episodes when treated with AS + AQ or AS + SP arms compared to AL arm; Relative Risks were: 0.75 (95% CI (Confidence Interval): 0.62-0.89), 0.74 (95% CI: 0.62-0.88) respectively for AG-CP model and 0.76 (95% CI: 0.64-0.89), 0.74 (95% CI: 0.62-0.87) for the Shared gamma frailty model. With both discontinuous and continuous risk intervals analysis, GEE Poisson distribution models failed to detect the effect of AS + AQ arm compared to AL arm when adjusted for age category. The discontinuous risk interval analysis was found to be the more appropriate approach. Conclusion Repeated event in infectious diseases such as malaria can be analysed with appropriate existing models that account for the correlation between multiple events within subjects with common statistical software packages, after properly setting up the data structures. PMID:25073652
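For readers who want to try the counting-process setup, the sketch below arranges recurrent-event data as (start, stop, event) rows and fits an Andersen-Gill-style model with the lifelines package (an assumed dependency); the records are toy data, and a discontinuous risk-interval analysis would simply omit the protected post-treatment windows from the at-risk rows.

```python
# Minimal Andersen-Gill-style counting-process setup for recurrent events, using
# lifelines' CoxTimeVaryingFitter on (start, stop, event) rows. Toy data only.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rows = [
    # id, start, stop, event, treated (1 = AS+AQ-like arm, 0 = reference arm)
    (1,  0, 35, 1, 1),
    (1, 35, 90, 0, 1),
    (2,  0, 28, 1, 0),
    (2, 28, 60, 1, 0),
    (2, 60, 90, 0, 0),
    (3,  0, 90, 0, 1),
    (4,  0, 45, 1, 0),
    (4, 45, 90, 0, 0),
    (5,  0, 50, 1, 1),
    (5, 50, 90, 0, 1),
    (6,  0, 90, 0, 0),
]
df = pd.DataFrame(rows, columns=["id", "start", "stop", "event", "treated"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()
```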
NASA Astrophysics Data System (ADS)
Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.
2016-04-01
Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (10'000s) of Global Climate Model (GCM) simulations, downscaled at 25km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its meteorological drivers, and how it can be expected to change in the future. Finally, we assess the applicability of this methodology to other regions. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.
Morshedi-Meibodi, Ali; Larson, Martin G; Levy, Daniel; O'Donnell, Christopher J; Vasan, Ramachandran S
2002-10-15
A delayed heart rate (HR) recovery after graded exercise testing has been associated with increased all-cause mortality in clinic-based samples. No prior study has examined the association of HR recovery after exercise with the incidence of coronary heart disease (CHD) and cardiovascular disease (CVD) events. We evaluated 2,967 Framingham study subjects (1,400 men, mean age 43 years) who were free of CVD and underwent a treadmill exercise test (Bruce protocol) at a routine examination. We examined the relations of HR recovery indexes (decrease in HR from peak exercise) to the incidence of a first CHD or CVD event and all-cause mortality, adjusting for established CVD risk factors. During follow-up (mean 15 years), 214 subjects experienced a CHD event (156 men), 312 developed a CVD event (207 men), and 167 died (105 men). In multivariable models, continuous HR recovery indexes were not associated with the incidence of CHD or CVD events, or with all-cause mortality. However, in models evaluating quintile-based cut points, the top quintile of HR recovery (greatest decline in HR) at 1-minute after exercise was associated with a lower risk of CHD (hazards ratio vs bottom 4 quintiles 0.54, 95% confidence interval [CI], 0.32 to 0.93) and CVD (hazards ratio 0.61, 95% CI 0.41 to 0.93), but not all-cause mortality (hazards ratio 0.99, 95% CI 0.60 to 1.62). In our community-based sample, HR recovery indexes were not associated with all-cause mortality. A very rapid HR recovery immediately after exercise was associated with lower risk of CHD and CVD events. These findings should be confirmed in other settings.
Talaei, Mohammad; Sadeghi, Masoumeh; Roohafza, Hamid Reza; Masoudkabir, Farzad; OveisGharan, Shahram; Mohebian, Mohammad Reza; Mañanas, Miquel Angel
2017-01-01
This study was designed to develop a risk assessment chart for the clinical management and prevention of the risk of cardiovascular disease (CVD) in Iranian population, which is vital for developing national prevention programs. The Isfahan Cohort Study (ICS) is a population-based prospective study of 6504 Iranian adults ≥35 years old, followed-up for ten years, from 2001 to 2010. Behavioral and cardiometabolic risk factors were examined every five years, while biennial follow-ups for the occurrence of the events was performed by phone calls or by verbal autopsy. Among these participants, 5432 (2784 women, 51.3%) were CVD free at baseline examination and had at least one follow-up. Cox proportional hazard regression was used to predict the risk of ischemic CVD events, including sudden cardiac death due to unstable angina, myocardial infarction, and stroke. The model fit statistics such as area under the receiver-operating characteristic (AUROC), calibration chi-square and the overall bias were used to assess the model performance. We also tested the Framingham model for comparison. Seven hundred and five CVD events occurred during 49452.8 person-years of follow-up. The event probabilities were calculated and presented color-coded on each gender-specific PARS chart. The AUROC and Harrell’s C indices were 0.74 (95% CI, 0.72–0.76) and 0.73, respectively. In the calibration, the Nam-D’Agostino χ2 was 10.82 (p = 0.29). The overall bias of the proposed model was 95.60%. PARS model was also internally validated using cross-validation. The Android app and the Web-based risk assessment tool were also developed as to have an impact on public health. In comparison, the refitted and recalibrated Framingham models, estimated the CVD incidence with the overall bias of 149.60% and 128.23% for men, and 222.70% and 176.07% for women, respectively. In conclusion, the PARS risk assessment chart is a simple, accurate, and well-calibrated tool for predicting a 10-year risk of CVD occurrence in Iranian population and can be used in an attempt to develop national guidelines for the CVD management. PMID:29261727
NASA Astrophysics Data System (ADS)
Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran
2016-04-01
This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flooding due to extreme hydro-meteorological events, with the ultimate purpose of disaster risk reduction by decreasing human exposure to the hazard. The first part of the paper covers the theory used to build the models, namely complex adaptive systems (CAS) and the principles and uses of ABM in this field. This section also outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scales. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioral model and the framework used in this research, the PECS reference model. The last part of this section covers the main attributes or characteristics of human beings used to describe the agents. The third part of the paper shows the methodology used to build and implement the ABM using Repast Simphony, an open-source agent-based modelling and simulation platform. The preliminary results for the first implementation in a region of the island of Sint-Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city.
Nijhuis, Rogier L; Stijnen, Theo; Peeters, Anna; Witteman, Jacqueline C M; Hofman, Albert; Hunink, M G Myriam
2006-01-01
The aim was to determine the apparent and internal validity of the Rotterdam Ischemic heart disease & Stroke Computer (RISC) model, a Monte Carlo-Markov model designed to evaluate the impact of cardiovascular disease (CVD) risk factors and their modification on life expectancy (LE) and cardiovascular disease-free LE (DFLE) in a general population (hereinafter, these will be referred to together as (DF)LE). The model is based on data from the Rotterdam Study, a cohort follow-up study of 6871 subjects aged 55 years and older who visited the research center for risk factor assessment at baseline (1990-1993) and completed a follow-up visit 7 years later (original cohort). The transition probabilities and risk factor trends used in the RISC model were based on data from 3501 subjects (the study cohort). To validate the RISC model, the numbers of simulated CVD events during 7 years of follow-up were compared with the observed numbers of events in the study cohort and the original cohort, respectively, and simulated (DF)LEs were compared with the (DF)LEs calculated from multistate life tables. Both in the study cohort and in the original cohort, the simulated distribution of CVD events was consistent with the observed number of events (CVD deaths: 7.1% v. 6.6% and 7.4% v. 7.6%, respectively; non-CVD deaths: 11.2% v. 11.5% and 12.9% v. 13.0%, respectively). The distribution of (DF)LEs estimated with the RISC model consistently encompassed the (DF)LEs calculated with multistate life tables. The simulated events and (DF)LE estimates from the RISC model are consistent with observed data from a cohort follow-up study.
Moran, Andrew; Gu, Dongfeng; Zhao, Dong; Coxson, Pamela; Wang, Y. Claire; Chen, Chung-Shiuan; Liu, Jing; Cheng, Jun; Bibbins-Domingo, Kirsten; Shen, Yu-Ming; He, Jiang; Goldman, Lee
2010-01-01
Background The relative effects of individual and combined risk factor trends on future cardiovascular disease in China have not been quantified in detail. Methods and Results Future risk factor trends in China were projected based on prior trends. Cardiovascular disease (coronary heart disease and stroke) in adults ages 35 to 84 years was projected from 2010 to 2030 using the Coronary Heart Disease Policy Model–China, a Markov computer simulation model. With risk factor levels held constant, projected annual cardiovascular events increased by >50% between 2010 and 2030 based on population aging and growth alone. Projected trends in blood pressure, total cholesterol, diabetes (increases), and active smoking (decline) would increase annual cardiovascular disease events by an additional 23%, an increase of approximately 21.3 million cardiovascular events and 7.7 million cardiovascular deaths over 2010 to 2030. Aggressively reducing active smoking in Chinese men to 20% prevalence in 2020 and 10% prevalence in 2030 or reducing mean systolic blood pressure by 3.8 mm Hg in men and women would counteract adverse trends in other risk factors by preventing cardiovascular events and 2.9 to 5.7 million total deaths over 2 decades. Conclusions Aging and population growth will increase cardiovascular disease by more than a half over the coming 20 years, and projected unfavorable trends in blood pressure, total cholesterol, diabetes, and body mass index may accelerate the epidemic. National policy aimed at controlling blood pressure, smoking, and other risk factors would counteract the expected future cardiovascular disease epidemic in China. PMID:20442213
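The Coronary Heart Disease Policy Model–China is a Markov state-transition simulation; the minimal cohort-level sketch below shows the basic mechanics (health states, an annual transition matrix, projected events per cycle). The states and transition probabilities are invented for illustration and are not the published model's parameters.

```python
# Minimal Markov cohort sketch: project annual CVD events under a fixed
# annual transition matrix. States and probabilities are illustrative only.
import numpy as np

states = ["well", "cvd_event", "post_cvd", "dead"]  # row/column order of P
P = np.array([
    [0.975, 0.015, 0.000, 0.010],   # well
    [0.000, 0.000, 0.850, 0.150],   # acute CVD event year
    [0.000, 0.040, 0.920, 0.040],   # post-CVD
    [0.000, 0.000, 0.000, 1.000],   # dead (absorbing)
])

cohort = np.array([1_000_000.0, 0.0, 0.0, 0.0])  # everyone starts 'well'
for year in range(2010, 2031):
    # new events this cycle = transitions into the 'cvd_event' state
    new_events = cohort[0] * P[0, 1] + cohort[2] * P[2, 1]
    print(year, round(new_events))
    cohort = cohort @ P
```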
Weather based risks and insurances for agricultural production
NASA Astrophysics Data System (ADS)
Gobin, Anne
2015-04-01
Extreme weather events such as frost, drought, heat waves and rain storms can have devastating effects on cropping systems. According to both the agriculture and finance sectors, a risk assessment of extreme weather events and their impact on cropping systems is needed. The principle of return periods or frequencies of natural hazards is adopted in many countries as the basis of eligibility for the compensation of associated losses. For adequate risk management and eligibility, hazard maps for events with a 20-year return period are often used. Damages due to extreme events are strongly dependent on crop type, crop stage, soil type and soil conditions. The impact of extreme weather events particularly during the sensitive periods of the farming calendar therefore requires a modelling approach to capture the mixture of non-linear interactions between the crop, its environment and the occurrence of the meteorological event in the farming calendar. Physically based crop models such as REGCROP (Gobin, 2010) assist in understanding the links between different factors causing crop damage. Subsequent examination of the frequency, magnitude and impacts of frost, drought, heat stress and soil moisture stress in relation to the cropping season and crop sensitive stages allows for risk profiles to be confronted with yields, yield losses and insurance claims. The methodology is demonstrated for arable food crops, bio-energy crops and fruit. The perspective of rising risk-exposure is exacerbated further by limited aid received for agricultural damage, an overall reduction of direct income support to farmers and projected intensification of weather extremes with climate change. Though average yields have risen continuously due to technological advances, there is no evidence that relative tolerance to adverse weather events has improved. The research is funded by the Belgian Science Policy Organisation (Belspo) under contract nr SD/RI/03A.
Risk analysis of urban gas pipeline network based on improved bow-tie model
NASA Astrophysics Data System (ADS)
Hao, M. J.; You, Q. J.; Yue, Z.
2017-11-01
The gas pipeline network is a major hazard source in urban areas, and an accident could have grave consequences. In order to understand more clearly the causes and consequences of gas pipeline network accidents, and to develop prevention and mitigation measures, the authors propose applying an improved bow-tie model to analyze the risks of urban gas pipeline networks. The improved bow-tie model analyzes accident causes from four aspects: human, materials, environment and management; it analyzes the consequences from four aspects: casualty, property loss, environment and society. It then quantifies the causes and consequences. Risk identification, risk analysis, risk assessment, risk control and risk management are shown clearly in the model figures, and prevention and mitigation measures can be suggested accordingly to help reduce the accident rate of the gas pipeline network. The results show that the whole process of an accident can be visually investigated using the bow-tie model, which can also provide reasons for, and predict the consequences of, an unfortunate event. The approach is of great significance for analyzing leakage failures of gas pipeline networks.
Gould, A Lawrence
2016-12-30
Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
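As a concrete illustration of control-chart monitoring of pooled adverse-event counts, the sketch below implements a simple frequentist Poisson CUSUM; the in-control rate, the allowance k, and the decision threshold h are illustrative assumptions, whereas the article derives its charts from informative prior distributions in a Bayesian framework.

```python
# Upper CUSUM sketch for monitoring pooled (blinded) adverse event counts.
# lambda0, k and h are illustrative; the paper builds its limits from priors.
import numpy as np

def poisson_cusum(counts, lambda0, k=0.5, h=5.0):
    """Return the CUSUM path and the first index exceeding the threshold h."""
    s = 0.0
    path, signal_at = [], None
    for i, c in enumerate(counts):
        s = max(0.0, s + (c - lambda0) - k)   # accumulate excess over expectation
        path.append(s)
        if signal_at is None and s > h:
            signal_at = i
    return np.array(path), signal_at

rng = np.random.default_rng(1)
baseline = rng.poisson(2.0, size=30)          # in-control monthly AE counts
shifted = rng.poisson(3.5, size=20)           # elevated risk after month 30
path, signal = poisson_cusum(np.concatenate([baseline, shifted]), lambda0=2.0)
print("signal at month:", signal)
```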
Grove, Erik L; Hansen, Peter Riis; Olesen, Jonas B; Ahlehoff, Ole; Selmer, Christian; Lindhardsen, Jesper; Madsen, Jan Kyst; Køber, Lars; Torp-Pedersen, Christian; Gislason, Gunnar H
2011-01-01
Objective To examine the effect of proton pump inhibitors on adverse cardiovascular events in aspirin treated patients with first time myocardial infarction. Design Retrospective nationwide propensity score matched study based on administrative data. Setting All hospitals in Denmark. Participants All aspirin treated patients surviving 30 days after a first myocardial infarction from 1997 to 2006, with follow-up for one year. Patients treated with clopidogrel were excluded. Main outcome measures The risk of the combined end point of cardiovascular death, myocardial infarction, or stroke associated with use of proton pump inhibitors was analysed using Kaplan-Meier analysis, Cox proportional hazard models, and propensity score matched Cox proportional hazard models. Results 3366 of 19 925 (16.9%) aspirin treated patients experienced recurrent myocardial infarction, stroke, or cardiovascular death. The hazard ratio for the combined end point in patients receiving proton pump inhibitors based on the time dependent Cox proportional hazard model was 1.46 (1.33 to 1.61; P<0.001) and for the propensity score matched model based on 8318 patients it was 1.61 (1.45 to 1.79; P<0.001). A sensitivity analysis showed no increase in risk related to use of H2 receptor blockers (1.04, 0.79 to 1.38; P=0.78). Conclusion In aspirin treated patients with first time myocardial infarction, treatment with proton pump inhibitors was associated with an increased risk of adverse cardiovascular events. PMID:21562004
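A minimal version of the analysis strategy described here (a logistic propensity model for PPI exposure, 1:1 nearest-neighbour matching, then a Cox model for the combined endpoint on the matched set) might look like the sketch below; the variable names, the caliper, and the matching details are assumptions, not the study's exact specification.

```python
# Sketch: propensity score matching followed by a Cox model on the matched set.
# Column names ('ppi', 'time', 'event', covariates) are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

def ps_match_and_fit(df, covariates, caliper=0.05):
    # 1. Propensity score: probability of PPI treatment given covariates.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["ppi"])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])

    treated = df[df["ppi"] == 1]
    control = df[df["ppi"] == 0]

    # 2. 1:1 nearest-neighbour matching on the propensity score
    #    (with replacement, for simplicity of the sketch).
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    dist, idx = nn.kneighbors(treated[["ps"]])
    keep = dist.ravel() <= caliper
    matched = pd.concat([treated[keep.tolist()],
                         control.iloc[idx.ravel()[keep]]])

    # 3. Cox model for the combined endpoint on the matched cohort.
    cph = CoxPHFitter()
    cph.fit(matched[["time", "event", "ppi"]],
            duration_col="time", event_col="event")
    return cph.hazard_ratios_["ppi"]
```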
Ray, Anne E.; Stapleton, Jerod L.; Turrisi, Rob; Mun, Eun-Young
2014-01-01
Background College students who play drinking games (DGs) more frequently report higher levels of alcohol use and experience more alcohol-related harm. However, the extent to which they are at risk for increased consumption and harm as a result of DG play on a given event after accounting for their typical DG participation, and typical and event drinking, is unclear. Objectives We examined whether first-year students consumed more alcohol and were more likely to experience consequences on drinking occasions when they played DGs. Methods Participants (N = 336) completed up to six web-based surveys following weekend drinking events in their first semester. Alcohol use, DG play, and consequences were reported for the Friday and Saturday prior to each survey. Typical DG tendencies were controlled in all models. Typical and event alcohol use were controlled in models predicting risk for consequences. Results Participants consumed more alcohol on DG versus non-DG events. All students were more likely to experience blackout drinking consequences when they played DGs. Women were more likely to experience social-interpersonal consequences when they played DGs. Conclusion DG play is an event-specific risk factor for increased alcohol use among first-year students, regardless of individual DG play tendencies. Further, event DG play signals increased risk for blackout drinking consequences for all students, and social-interpersonal consequences for women, aside from the amount of alcohol consumed on those occasions as well as typical drinking behaviors. Prevention efforts to reduce high-risk drinking may be strengthened by highlighting both event- and person-specific risks of DG play. PMID:25192202
Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng
2013-05-01
Multivariate meta-analysis is useful in combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models, which assume that the risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis in which the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared with the conventional multivariate generalized linear mixed effects models, including simplicity of the likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials.
A Spatial Framework to Map Heat Health Risks at Multiple Scales.
Ho, Hung Chak; Knudby, Anders; Huang, Wei
2015-12-18
In the last few decades extreme heat events have led to substantial excess mortality, most dramatically in Central Europe in 2003, in Russia in 2010, and even in typically cool locations such as Vancouver, Canada, in 2009. Heat-related morbidity and mortality are expected to increase over the coming centuries as the result of climate-driven global increases in the severity and frequency of extreme heat events. Spatial information on heat exposure and population vulnerability may be combined to map the areas of highest risk and focus mitigation efforts there. However, a mismatch in spatial resolution between heat exposure and vulnerability data can cause spatial scale issues such as the Modifiable Areal Unit Problem (MAUP). We used a raster-based model to integrate heat exposure and vulnerability data in a multi-criteria decision analysis, and compared it to the traditional vector-based model. We then used the Getis-Ord G(i) index to generate spatially smoothed heat risk hotspot maps from fine to coarse spatial scales. The raster-based model allowed production of maps at finer spatial resolution, better description of local-scale heat risk variability, and identification of heat-risk areas not identified with the vector-based approach. Spatial smoothing with the Getis-Ord G(i) index produced heat risk hotspots from local to regional spatial scale. The approach is a framework for reducing spatial scale issues in future heat risk mapping, and for identifying heat risk hotspots at spatial scales ranging from the block-level to the municipality level.
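The hotspot-detection step can be illustrated with a hand-rolled Getis-Ord Gi* statistic applied to a gridded risk surface; the synthetic raster and the square moving-window neighbourhood used below are illustrative assumptions rather than the authors' exact weighting scheme.

```python
# Hand-rolled Getis-Ord Gi* hotspot statistic on a gridded heat-risk surface.
# The synthetic raster and the square binary-weight window are illustrative.
import numpy as np
from scipy.ndimage import uniform_filter

def getis_ord_gi_star(raster, window=3):
    """Gi* z-scores per cell, using a square window of binary weights (incl. self)."""
    n = raster.size
    xbar = raster.mean()
    s = raster.std()                       # population standard deviation
    w_sum = float(window * window)         # sum of binary weights = sum of squared weights
    local_sum = uniform_filter(raster, size=window, mode="nearest") * w_sum
    num = local_sum - xbar * w_sum
    den = s * np.sqrt((n * w_sum - w_sum ** 2) / (n - 1))
    return num / den

rng = np.random.default_rng(0)
risk = rng.gamma(2.0, 1.0, size=(100, 100))   # synthetic heat-risk raster
risk[40:55, 60:75] += 3.0                      # embedded hotspot
z = getis_ord_gi_star(risk, window=5)
print("cells with z > 1.96:", int((z > 1.96).sum()))
```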
Limits of Risk Predictability in a Cascading Alternating Renewal Process Model.
Lin, Xin; Moussawi, Alaa; Korniss, Gyorgy; Bakdash, Jonathan Z; Szymanski, Boleslaw K
2017-07-27
Most risk analysis models systematically underestimate the probability and impact of catastrophic events (e.g., economic crises, natural disasters, and terrorism) by not taking into account interconnectivity and interdependence of risks. To address this weakness, we propose the Cascading Alternating Renewal Process (CARP) to forecast interconnected global risks. However, assessments of the model's prediction precision are limited by lack of sufficient ground truth data. Here, we establish prediction precision as a function of input data size by using alternative long ground truth data generated by simulations of the CARP model with known parameters. We illustrate the approach on a model of fires in artificial cities assembled from basic city blocks with diverse housing. The results confirm that parameter recovery variance exhibits power law decay as a function of the length of available ground truth data. Using CARP, we also demonstrate estimation using a disparate dataset that also has dependencies: real-world prediction precision for the global risk model based on the World Economic Forum Global Risk Report. We conclude that the CARP model is an efficient method for predicting catastrophic cascading events with potential applications to emerging local and global interconnected risks.
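The core experiment, simulating a renewal-type process with known parameters and measuring how the variance of the recovered parameter shrinks as the length of the ground-truth record grows, can be sketched as follows; exponential up/down durations and a simple mean estimator are illustrative assumptions rather than the CARP specification.

```python
# Sketch: recover the mean 'down' (failure) duration of an alternating renewal
# process from ground-truth records of increasing length, and track how the
# estimator variance shrinks. Exponential durations are an illustrative choice.
import numpy as np

rng = np.random.default_rng(42)
TRUE_UP, TRUE_DOWN = 10.0, 2.0

def simulate_down_durations(n_cycles):
    _ups = rng.exponential(TRUE_UP, n_cycles)      # 'up' phases, not needed here
    return rng.exponential(TRUE_DOWN, n_cycles)    # 'down' phases to be estimated

for n in [10, 100, 1000, 10000]:
    estimates = [simulate_down_durations(n).mean() for _ in range(500)]
    print(f"record length {n:>6}: var(estimate) = {np.var(estimates):.5f}")
# The variance of the recovered parameter decays roughly as 1/n (a power law),
# mirroring the finding that longer ground truth improves prediction precision.
```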
Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic met...
Calibration plots for risk prediction models in the presence of competing risks.
Gerds, Thomas A; Andersen, Per K; Kattan, Michael W
2014-08-15
A predicted risk of 17% can be called reliable if it can be expected that the event will occur to about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often encountered practical problems when the aim is to display or test if a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves for competing risks models based on jackknife pseudo-values that are combined with a nearest neighborhood smoother and a cross-validation approach to deal with all three problems. Copyright © 2014 John Wiley & Sons, Ltd.
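A minimal sketch of the proposed machinery, jackknife pseudo-values of the Aalen-Johansen cumulative incidence smoothed against predicted risks with a nearest-neighbour average, is shown below on synthetic data; the data-generating process and the smoother bandwidth are assumptions for illustration.

```python
# Sketch: calibration of a competing-risks prediction via jackknife pseudo-values
# of the Aalen-Johansen cumulative incidence, smoothed against predicted risks.
# Synthetic data; ties are handled one observation at a time for simplicity.
import numpy as np

def cuminc(time, status, t0, cause=1):
    """Aalen-Johansen cumulative incidence of `cause` at t0.
    status: 0 = censored, 1 = cause of interest, 2 = competing cause."""
    order = np.argsort(time)
    time, status = time[order], status[order]
    n_at_risk = len(time)
    surv, cif = 1.0, 0.0
    for t, s in zip(time, status):
        if t > t0:
            break
        if s == cause:
            cif += surv / n_at_risk        # increment uses survival just before t
        if s != 0:
            surv *= 1.0 - 1.0 / n_at_risk  # all-cause Kaplan-Meier update
        n_at_risk -= 1
    return cif

def jackknife_pseudo_values(time, status, t0, cause=1):
    n = len(time)
    theta_full = cuminc(time, status, t0, cause)
    pseudo = np.empty(n)
    for i in range(n):
        mask = np.ones(n, dtype=bool)
        mask[i] = False
        pseudo[i] = n * theta_full - (n - 1) * cuminc(time[mask], status[mask], t0, cause)
    return pseudo

# Synthetic example: predicted 5-year risks vs pseudo-value calibration curve.
rng = np.random.default_rng(0)
n = 400
pred = rng.uniform(0.05, 0.6, n)                     # model's predicted risks
time = rng.exponential(10.0 * (1 - pred), n)         # riskier -> earlier events
status = np.where(time < 5, rng.choice([1, 2], n, p=[0.7, 0.3]), 0)
time = np.minimum(time, 5.0)

pseudo = jackknife_pseudo_values(time, status, t0=5.0)
order = np.argsort(pred)
k = 50                                               # nearest-neighbour bandwidth
observed = np.convolve(pseudo[order], np.ones(k) / k, mode="valid")
predicted = np.convolve(pred[order], np.ones(k) / k, mode="valid")
# A well-calibrated model has observed ≈ predicted along this smoothed curve.
```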
Modeling Dynamic Food Choice Processes to Understand Dietary Intervention Effects.
Marcum, Christopher Steven; Goldring, Megan R; McBride, Colleen M; Persky, Susan
2018-02-17
Meal construction is largely governed by nonconscious and habit-based processes that can be represented as a collection of individual, micro-level food choices that eventually give rise to a final plate. Despite this, dietary behavior intervention research rarely captures these micro-level food choice processes, instead measuring outcomes at aggregated levels. This is due in part to a dearth of analytic techniques to model these dynamic time-series events. The current article addresses this limitation by applying a generalization of the relational event framework to model micro-level food choice behavior following an educational intervention. Relational event modeling was used to model the food choices that 221 mothers made for their child following receipt of an information-based intervention. Participants were randomized to receive either (a) control information; (b) childhood obesity risk information; (c) childhood obesity risk information plus a personalized family history-based risk estimate for their child. Participants then made food choices for their child in a virtual reality-based food buffet simulation. Micro-level aspects of the built environment, such as the ordering of each food in the buffet, were influential. Other dynamic processes such as choice inertia also influenced food selection. Among participants receiving the strongest intervention condition, choice inertia decreased and the overall rate of food selection increased. Modeling food selection processes can elucidate the points at which interventions exert their influence. Researchers can leverage these findings to gain insight into nonconscious and uncontrollable aspects of food selection that influence dietary outcomes, which can ultimately improve the design of dietary interventions.
A quantile-based Time at Risk: A new approach for assessing risk in financial markets
NASA Astrophysics Data System (ADS)
Bolgorian, Meysam; Raei, Reza
2013-11-01
In this paper, we provide a new measure for evaluating risk in financial markets, based on the return interval of critical events in financial markets or other investment situations. Our main goal was to devise a model analogous to Value at Risk (VaR). Whereas VaR, for a given financial asset, probability level and time horizon, gives a critical value such that the probability that the loss on the asset over the time horizon exceeds this value equals the given probability level, our concept of Time at Risk (TaR), using a probability distribution function of return intervals, provides a critical time such that the probability that the return interval of a critical event exceeds this time equals the given probability level. As an empirical application, we applied our model to data from the Tehran Stock Exchange Price Index (TEPIX) as a financial asset (market portfolio) and report the results.
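The TaR construction, taking the empirical distribution of return intervals between critical events and reading off the time exceeded with the given probability, can be sketched as follows; the loss threshold and the synthetic return series are illustrative assumptions.

```python
# Sketch of Time at Risk (TaR): the critical time such that the probability that
# the return interval between critical events exceeds it equals a given level.
# The loss threshold and synthetic return series are illustrative assumptions.
import numpy as np

def time_at_risk(returns, loss_threshold, prob_level=0.05):
    critical_days = np.flatnonzero(returns <= loss_threshold)  # "critical events"
    if len(critical_days) < 2:
        raise ValueError("not enough critical events to form return intervals")
    intervals = np.diff(critical_days)                          # waiting times (days)
    # P(interval > TaR) = prob_level  <=>  TaR is the (1 - prob_level) quantile
    return np.quantile(intervals, 1.0 - prob_level)

rng = np.random.default_rng(7)
daily_returns = rng.standard_t(df=4, size=5000) * 0.01          # heavy-tailed returns
print("TaR (days):", time_at_risk(daily_returns, loss_threshold=-0.03))
```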
BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods
NASA Astrophysics Data System (ADS)
Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.
2017-12-01
Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and an ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple waterlevel- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data was gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models were used to identify the most important loss influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. With the ability to cope with incomplete information and use expert knowledge, as well as inherently providing quantitative uncertainty information, it is shown that loss models based on BNs are superior to deterministic approaches for pluvial flood risk assessment.
2014-01-01
Background Cardiovascular diseases are the main cause of death worldwide, making their prevention a major health care challenge. In 2006, a German statutory health insurance company presented a novel individualised prevention programme (KardioPro), which focused on coronary heart disease (CHD) screening, risk factor assessment, early detection and secondary prevention. This study evaluates KardioPro in CHD risk subgroups, and analyses the cost-effectiveness of different individualised prevention strategies. Methods The CHD risk subgroups were assembled based on routine data from the statutory health insurance company, making use of a quasi-beta regression model for risk prediction. The control group was selected via propensity score matching based on logistic regression and an approximate nearest neighbour approach. The main outcome was cost-effectiveness. Effectiveness was measured as event-free time, and events were defined as myocardial infarction, stroke and death. Incremental cost-effectiveness ratios comparing participants with non-participants were calculated for each subgroup. To assess the uncertainty of results, a bootstrapping approach was applied. Results The cost-effectiveness of KardioPro in the group at high risk of CHD was €20,901 per event-free year; in the medium-risk group, €52,323 per event-free year; in the low-risk group, €186,074 per event-free year; and in the group with known CHD, €26,456 per event-free year. KardioPro was associated with a significant health gain but also a significant cost increase. However, statistical significance could not be shown for all subgroups. Conclusion The cost-effectiveness of KardioPro differs substantially according to the group being targeted. Depending on the willingness-to-pay, it may be reasonable to only offer KardioPro to patients at high risk of further cardiovascular events. This high-risk group could be identified from routine statutory health insurance data. However, the long-term consequences of KardioPro still need to be evaluated. PMID:24938674
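The cost-effectiveness calculation described here, an incremental cost-effectiveness ratio per event-free year with bootstrap uncertainty, can be sketched as below; the cost and event-free-time arrays are synthetic stand-ins for the matched participant and control data.

```python
# Sketch: incremental cost-effectiveness ratio (ICER), cost per event-free year
# gained, with bootstrap uncertainty. Data are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
cost_p, eft_p = rng.gamma(4, 500, 2000), rng.normal(6.3, 1.0, 2000)   # participants
cost_c, eft_c = rng.gamma(4, 400, 2000), rng.normal(6.2, 1.0, 2000)   # controls

def icer(cp, ep, cc, ec):
    return (cp.mean() - cc.mean()) / (ep.mean() - ec.mean())

point = icer(cost_p, eft_p, cost_c, eft_c)
boot = []
for _ in range(2000):
    ip = rng.integers(0, len(cost_p), len(cost_p))   # resample participants
    ic = rng.integers(0, len(cost_c), len(cost_c))   # resample controls
    boot.append(icer(cost_p[ip], eft_p[ip], cost_c[ic], eft_c[ic]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"ICER: {point:.0f} per event-free year (95% bootstrap CI {lo:.0f} to {hi:.0f})")
```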
A Review of Recent Advances in Research on Extreme Heat Events
NASA Technical Reports Server (NTRS)
Horton, Radley M.; Mankin, Justin S.; Lesk, Corey; Coffel, Ethan; Raymond, Colin
2016-01-01
Reviewing recent literature, we report that changes in extreme heat event characteristics such as magnitude, frequency, and duration are highly sensitive to changes in mean global-scale warming. Numerous studies have detected significant changes in the observed occurrence of extreme heat events, irrespective of how such events are defined. Further, a number of these studies have attributed present-day changes in the risk of individual heat events and the documented global-scale increase in such events to anthropogenic-driven warming. Advances in process-based studies of heat events have focused on the proximate land-atmosphere interactions through soil moisture anomalies, and changes in occurrence of the underlying atmospheric circulation associated with heat events in the mid-latitudes. While evidence for a number of hypotheses remains limited, climate change nevertheless points to tail risks of possible changes in heat extremes that could exceed estimates generated from model outputs of mean temperature. We also explore risks associated with compound extreme events and nonlinear impacts associated with extreme heat.
L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J
2000-03-15
The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts where estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the actual observed primary event reduction seen in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years old at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/ high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/ 40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), 4 categories of TC/HDL ratio (<5.5, 5.5 to <6.5, 6.5 to <7.5, > or = 7.5), 2 levels of diastolic blood pressure (<90, > or = 90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in the very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.
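The risk-tool arithmetic behind such a chart, a baseline 5-year survival raised to the exponential of the summed categorical coefficients, is sketched below; the coefficients and the baseline survival are illustrative placeholders and not the published WOSCOPS estimates.

```python
# Sketch of a Cox-model risk tool: 5-year event risk from categorical risk
# factors. Coefficients and baseline survival are illustrative placeholders,
# NOT the published WOSCOPS estimates.
import math

BASELINE_S5 = 0.97           # assumed baseline 5-year event-free probability
COEF = {                     # assumed log hazard ratios per category
    "age_60_64": 0.55,
    "tc_hdl_ge_7.5": 0.60,
    "dbp_ge_90": 0.25,
    "smoker": 0.45,
    "diabetes": 0.70,
    "family_history": 0.30,
    "nitrate_or_angina": 0.40,
    "pravastatin": -0.37,    # treatment effect, roughly a 31% hazard reduction
}

def five_year_risk(profile):
    """profile: iterable of category names that apply to the patient."""
    lp = sum(COEF[c] for c in profile)
    return 1.0 - BASELINE_S5 ** math.exp(lp)

print(f"{five_year_risk(['age_60_64', 'tc_hdl_ge_7.5', 'smoker']):.1%}")
print(f"{five_year_risk(['age_60_64', 'tc_hdl_ge_7.5', 'smoker', 'pravastatin']):.1%}")
```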
Weather based risks and insurances for crop production in Belgium
NASA Astrophysics Data System (ADS)
Gobin, Anne
2014-05-01
Extreme weather events such as late frosts, droughts, heat waves and rain storms can have devastating effects on cropping systems. Damages due to extreme events are strongly dependent on crop type, crop stage, soil type and soil conditions. The perspective of rising risk-exposure is exacerbated further by limited aid received for agricultural damage, an overall reduction of direct income support to farmers and projected intensification of weather extremes with climate change. According to both the agriculture and finance sectors, a risk assessment of extreme weather events and their impact on cropping systems is needed. The impact of extreme weather events particularly during the sensitive periods of the farming calendar requires a modelling approach to capture the mixture of non-linear interactions between the crop, its environment and the occurrence of the meteorological event. The risk of soil moisture deficit increases towards harvesting, such that drought stress occurs in spring and summer. Conversely, waterlogging occurs mostly during early spring and autumn. Risks of temperature stress appear during winter and spring for chilling and during summer for heat. Since crop development is driven by thermal time and photoperiod, the regional crop model REGCROP (Gobin, 2010) enabled to examine the likely frequency, magnitude and impacts of frost, drought, heat stress and waterlogging in relation to the cropping season and crop sensitive stages. The risk profiles were subsequently confronted with yields, yield losses and insurance claims for different crops. Physically based crop models such as REGCROP assist in understanding the links between different factors causing crop damage as demonstrated for cropping systems in Belgium. Extreme weather events have already precipitated contraction of insurance coverage in some markets (e.g. hail insurance), and the process can be expected to continue if the losses or damages from such events increase in the future. Climate change will stress this further and impacts on crop growth are expected to be twofold, owing to the sensitive stages occurring earlier during the growing season and to the changes in return period of extreme weather events. Though average yields have risen continuously due to technological advances, there is no evidence that relative tolerance to adverse weather events has improved. The research is funded by the Belgian Science Policy Organisation (Belspo) under contract nr SD/RI/03A.
An individual-based model for population viability analysis of humpback chub in Grand Canyon
Pine, William; Healy, Brian; Smith, Emily Omana; Trammell, Melissa; Speas, Dave; Valdez, Rich; Yard, Mike; Walters, Carl; Ahrens, Rob; Vanhaverbeke, Randy; Stone, Dennis; Wilson, Wade
2013-01-01
We developed an individual-based population viability analysis model (females only) for evaluating risk to populations from catastrophic events or conservation and research actions. This model tracks attributes (size, weight, viability, etc.) for individual fish through time and then compiles this information to assess the extinction risk of the population across large numbers of simulation trials. Using a case history for the Little Colorado River population of Humpback Chub Gila cypha in Grand Canyon, Arizona, we assessed extinction risk and resiliency to a catastrophic event for this population and then assessed a series of conservation actions related to removing specific numbers of Humpback Chub at different sizes for conservation purposes, such as translocating individuals to establish other spawning populations or hatchery refuge development. Our results suggested that the Little Colorado River population is generally resilient to a single catastrophic event and also to removals of larvae and juveniles for conservation purposes, including translocations to establish new populations. Our results also suggested that translocation success is dependent on similar survival rates in receiving and donor streams and low emigration rates from recipient streams. In addition, translocating either large numbers of larvae or small numbers of large juveniles has generally an equal likelihood of successful population establishment at similar extinction risk levels to the Little Colorado River donor population. Our model created a transparent platform to consider extinction risk to populations from catastrophe or conservation actions and should prove useful to managers assessing these risks for endangered species such as Humpback Chub.
NASA Astrophysics Data System (ADS)
Mueller, M.; Mahoney, K. M.; Holman, K. D.
2015-12-01
The Bureau of Reclamation (Reclamation) is responsible for the safety of Taylor Park Dam, located in central Colorado at an elevation of 9300 feet. A key aspect of dam safety is anticipating extreme precipitation, runoff and the associated inflow of water to the reservoir within a probabilistic framework for risk analyses. The Cooperative Institute for Research in Environmental Sciences (CIRES) has partnered with Reclamation to improve understanding and estimation of precipitation in the western United States, including the Taylor Park watershed. A significant challenge is that Taylor Park Dam is located in a relatively data-sparse region, surrounded by mountains exceeding 12,000 feet. To better estimate heavy precipitation events in this basin, a high-resolution modeling approach is used. The Weather Research and Forecasting (WRF) model is employed to simulate events that have produced observed peaks in streamflow at the location of interest. Importantly, an ensemble of model simulations is run for each event so that uncertainty bounds (i.e., forecast error) can be provided and the model outputs can be used more effectively in Reclamation's risk assessment framework. Model estimates of precipitation (and the uncertainty thereof) are then used in rainfall runoff models to determine the probability of inflows to the reservoir for use in Reclamation's dam safety risk analyses.
An Accident Precursor Analysis Process Tailored for NASA Space Systems
NASA Technical Reports Server (NTRS)
Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare
2010-01-01
Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.
Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method
Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan
2018-01-01
Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of a dam failure and the resulting life loss. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. The life loss associated with dam failure is summarized from previous studies and refined to be suitable for Chinese dams. The proposed method and model are applied to one reservoir dam in Jiangxi province. Both engineering and non-engineering measures are proposed to reduce the risk. Risk analysis of dam failure is essential for reducing the probability of dam failure and improving the level of dam risk management. PMID:29710824
Existential risks: exploring a robust risk reduction strategy.
Jebari, Karim
2015-06-01
A small but growing number of studies have aimed to understand, assess and reduce existential risks, or risks that threaten the continued existence of mankind. However, most attention has been focused on known and tangible risks. This paper proposes a heuristic for reducing the risk of black swan extinction events. These events are, as the name suggests, stochastic and unforeseen when they happen. Decision theory based on a fixed model of possible outcomes cannot properly deal with this kind of event. Neither can probabilistic risk analysis. This paper will argue that the approach that is referred to as engineering safety could be applied to reducing the risk from black swan extinction events. It will also propose a conceptual sketch of how such a strategy may be implemented: isolated, self-sufficient, and continuously manned underground refuges. Some characteristics of such refuges are also described, in particular the psychosocial aspects. Furthermore, it is argued that this implementation of the engineering safety strategy (safety barriers) would be effective and plausible and could reduce the risk of an extinction event in a wide range of possible (known and unknown) scenarios. Considering the staggering opportunity cost of an existential catastrophe, such strategies ought to be explored more vigorously.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
NASA Astrophysics Data System (ADS)
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
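The Poisson-Binomial calculation underlying the proposed verification test, the exact distribution of the number of observed events when each forecast carries its own probability, can be computed with a short dynamic-programming convolution; the forecast probabilities, the observed count, and the symmetric two-sided p-value below are illustrative assumptions.

```python
# Exact Poisson-Binomial PMF by dynamic programming: the distribution of the
# number of events when forecast i assigns probability p_i. Used to test whether
# the observed event count is consistent with reliable forecasts.
# Forecast probabilities and the observed count are synthetic illustrations.
import numpy as np

def poisson_binomial_pmf(p):
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])   # add one Bernoulli(p_i) trial
    return pmf

forecast_probs = np.random.default_rng(5).uniform(0.05, 0.6, size=200)
observed_events = 80
pmf = poisson_binomial_pmf(forecast_probs)
expected = forecast_probs.sum()
# Two-sided p-value: total probability of counts at least as far from the
# expected count as the observed count (a simple symmetric-distance convention).
p_value = pmf[np.abs(np.arange(len(pmf)) - expected) >=
              abs(observed_events - expected)].sum()
print(f"expected {expected:.1f} events, observed {observed_events}, p = {p_value:.4f}")
```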
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Jonathan; Thompson, Sandra E.; Brothers, Alan J.
The ability to estimate the likelihood of future events based on current and historical data is essential to the decision making process of many government agencies. Successful predictions related to terror events and characterizing the risks will support development of options for countering these events. The predictive tasks involve both technical and social component models. The social components have presented a particularly difficult challenge. This paper outlines some technical considerations of this modeling activity. Both data and predictions associated with the technical and social models will likely be known with differing certainties or accuracies – a critical challenge is linking across these model domains while respecting this fundamental difference in certainty level. This paper will describe the technical approach being taken to develop the social model and identification of the significant interfaces between the technical and social modeling in the context of analysis of diversion of nuclear material.
Hoefer, Imo E.; Eijkemans, Marinus J. C.; Asselbergs, Folkert W.; Anderson, Todd J.; Britton, Annie R.; Dekker, Jacqueline M.; Engström, Gunnar; Evans, Greg W.; de Graaf, Jacqueline; Grobbee, Diederick E.; Hedblad, Bo; Holewijn, Suzanne; Ikeda, Ai; Kitagawa, Kazuo; Kitamura, Akihiko; de Kleijn, Dominique P. V.; Lonn, Eva M.; Lorenz, Matthias W.; Mathiesen, Ellisiv B.; Nijpels, Giel; Okazaki, Shuhei; O’Leary, Daniel H.; Pasterkamp, Gerard; Peters, Sanne A. E.; Polak, Joseph F.; Price, Jacqueline F.; Robertson, Christine; Rembold, Christopher M.; Rosvall, Maria; Rundek, Tatjana; Salonen, Jukka T.; Sitzer, Matthias; Stehouwer, Coen D. A.; Bots, Michiel L.; den Ruijter, Hester M.
2015-01-01
Background Clinical manifestations and outcomes of atherosclerotic disease differ between ethnic groups. In addition, the prevalence of risk factors is substantially different. Primary prevention programs are based on data derived from almost exclusively White people. We investigated how race/ethnic differences modify the associations of established risk factors with atherosclerosis and cardiovascular events. Methods We used data from an ongoing individual participant meta-analysis involving 17 population-based cohorts worldwide. We selected 60,211 participants without cardiovascular disease at baseline with available data on ethnicity (White, Black, Asian or Hispanic). We generated a multivariable linear regression model containing risk factors and ethnicity predicting mean common carotid intima-media thickness (CIMT) and a multivariable Cox regression model predicting myocardial infarction or stroke. For each risk factor we assessed how the association with the preclinical and clinical measures of cardiovascular atherosclerotic disease was affected by ethnicity. Results Ethnicity appeared to significantly modify the associations between risk factors and CIMT and cardiovascular events. The association between age and CIMT was weaker in Blacks and Hispanics. Systolic blood pressure associated more strongly with CIMT in Asians. HDL cholesterol and smoking associated less with CIMT in Blacks. Furthermore, the association of age and total cholesterol levels with the occurrence of cardiovascular events differed between Blacks and Whites. Conclusion The magnitude of associations between risk factors and the presence of atherosclerotic disease differs between race/ethnic groups. These subtle, yet significant differences provide insight in the etiology of cardiovascular disease among race/ethnic groups. These insights aid the race/ethnic-specific implementation of primary prevention. PMID:26134404
Gijsberts, Crystel M; Groenewegen, Karlijn A; Hoefer, Imo E; Eijkemans, Marinus J C; Asselbergs, Folkert W; Anderson, Todd J; Britton, Annie R; Dekker, Jacqueline M; Engström, Gunnar; Evans, Greg W; de Graaf, Jacqueline; Grobbee, Diederick E; Hedblad, Bo; Holewijn, Suzanne; Ikeda, Ai; Kitagawa, Kazuo; Kitamura, Akihiko; de Kleijn, Dominique P V; Lonn, Eva M; Lorenz, Matthias W; Mathiesen, Ellisiv B; Nijpels, Giel; Okazaki, Shuhei; O'Leary, Daniel H; Pasterkamp, Gerard; Peters, Sanne A E; Polak, Joseph F; Price, Jacqueline F; Robertson, Christine; Rembold, Christopher M; Rosvall, Maria; Rundek, Tatjana; Salonen, Jukka T; Sitzer, Matthias; Stehouwer, Coen D A; Bots, Michiel L; den Ruijter, Hester M
2015-01-01
Clinical manifestations and outcomes of atherosclerotic disease differ between ethnic groups. In addition, the prevalence of risk factors is substantially different. Primary prevention programs are based on data derived from almost exclusively White people. We investigated how race/ethnic differences modify the associations of established risk factors with atherosclerosis and cardiovascular events. We used data from an ongoing individual participant meta-analysis involving 17 population-based cohorts worldwide. We selected 60,211 participants without cardiovascular disease at baseline with available data on ethnicity (White, Black, Asian or Hispanic). We generated a multivariable linear regression model containing risk factors and ethnicity predicting mean common carotid intima-media thickness (CIMT) and a multivariable Cox regression model predicting myocardial infarction or stroke. For each risk factor we assessed how the association with the preclinical and clinical measures of cardiovascular atherosclerotic disease was affected by ethnicity. Ethnicity appeared to significantly modify the associations between risk factors and CIMT and cardiovascular events. The association between age and CIMT was weaker in Blacks and Hispanics. Systolic blood pressure associated more strongly with CIMT in Asians. HDL cholesterol and smoking associated less with CIMT in Blacks. Furthermore, the association of age and total cholesterol levels with the occurrence of cardiovascular events differed between Blacks and Whites. The magnitude of associations between risk factors and the presence of atherosclerotic disease differs between race/ethnic groups. These subtle, yet significant differences provide insight in the etiology of cardiovascular disease among race/ethnic groups. These insights aid the race/ethnic-specific implementation of primary prevention.
Hahn, Ezra; Jiang, Haiyan; Ng, Angela; Bashir, Shaheena; Ahmed, Sameera; Tsang, Richard; Sun, Alexander; Gospodarowicz, Mary; Hodgson, David
2017-08-01
Mediastinal radiation therapy (RT) for Hodgkin lymphoma (HL) is associated with late cardiotoxicity, but there are limited data to indicate which dosimetric parameters are most valuable for predicting this risk. This study investigated which whole heart dosimetric measurements provide the most information regarding late cardiotoxicity, and whether coronary artery dosimetry was more predictive of this outcome than whole heart dosimetry. A random sample of 125 HL patients treated with mediastinal RT was selected, and 3-dimensional cardiac dose-volume data were generated from historical plans using validated methods. Cardiac events were determined by linking patients to population-based datasets of inpatient and same-day hospitalizations and same-day procedures. Variables collected for the whole heart and 3 coronary arteries included the following: Dmean, Dmax, Dmin, dose homogeneity, V5, V10, V20, and V30. Multivariable competing risk regression models were generated for the whole heart and coronary arteries. There were 44 cardiac events documented, of which 70% were ischemic. The best multivariable model included the following covariates: whole heart Dmean (hazard ratio [HR] 1.09, P=.0083), dose homogeneity (HR 0.94, P=.0034), male sex (HR 2.31, P=.014), and age (HR 1.03, P=.0049). When any adverse cardiac event was the outcome, models using coronary artery variables did not perform better than models using whole heart variables. However, in a subanalysis of ischemic cardiac events only, the model using coronary artery variables was superior to the whole heart model and included the following covariates: age (HR 1.05, P<.001), volume of left anterior descending artery receiving 5 Gy (HR 0.98, P=.003), and volume of left circumflex artery receiving 20 Gy (HR 1.03, P<.001). In addition to higher mean heart dose, increasing inhomogeneity in cardiac dose was associated with a greater risk of late cardiac effects. When all types of cardiotoxicity were evaluated, the whole heart variable model outperformed the coronary artery models. However, when events were limited to ischemic cardiotoxicity, the coronary artery-based model was superior. Copyright © 2017 Elsevier Inc. All rights reserved.
Spatio-temporal population estimates for risk management
NASA Astrophysics Data System (ADS)
Cockings, Samantha; Martin, David; Smith, Alan; Martin, Rebecca
2013-04-01
Accurate estimation of population at risk from hazards and effective emergency management of events require not just appropriate spatio-temporal modelling of hazards but also of population. While much recent effort has been focused on improving the modelling and predictions of hazards (both natural and anthropogenic), there has been little parallel advance in the measurement or modelling of population statistics. Different hazard types occur over diverse temporal cycles, are of varying duration and differ significantly in their spatial extent. Even events of the same hazard type, such as flood events, vary markedly in their spatial and temporal characteristics. Conceptually and pragmatically then, population estimates should also be available for similarly varying spatio-temporal scales. Routine population statistics derived from traditional censuses or surveys are usually static representations in both space and time, recording people at their place of usual residence on census/survey night and presenting data for administratively defined areas. Such representations effectively fix the scale of population estimates in both space and time, which is unhelpful for meaningful risk management. Over recent years, the Pop24/7 programme of research, based at the University of Southampton (UK), has developed a framework for spatio-temporal modelling of population, based on gridded population surfaces. Based on a data model which is fully flexible in terms of space and time, the framework allows population estimates to be produced for any time slice relevant to the data contained in the model. It is based around a set of origin and destination centroids, which have capacities, spatial extents and catchment areas, all of which can vary temporally, such as by time of day, day of week, season. A background layer, containing information on features such as transport networks and landuse, provides information on the likelihood of people being in certain places at specific times. Unusual patterns associated with special events can also be modelled and the framework is fully volume preserving. Outputs from the model are gridded population surfaces for the specified time slice, either for total population or by sub-groups (e.g. age). Software to implement the models (SurfaceBuilder247) has been developed and pre-processed layers for typical time slices for England and Wales in 2001 and 2006 are available for UK academic purposes. The outputs and modelling framework from the Pop24/7 programme provide significant opportunities for risk management applications. For estimates of mid- to long-term cumulative population exposure to hazards, such as in flood risk mapping, populations can be produced for numerous time slices and integrated with flood models. For applications in emergency response/ management, time-specific population models can be used as seeds for agent-based models or other response/behaviour models. Estimates for sub-groups of the population also permit exploration of vulnerability through space and time. This paper outlines the requirements for effective spatio-temporal population models for risk management. It then describes the Pop24/7 framework and illustrates its potential for risk management through presentation of examples from natural and anthropogenic hazard applications. The paper concludes by highlighting key challenges for future research in this area.
Predictions of Leukemia Risks to Astronauts from Solar Particle Events
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Atwell, W.; Kim, M. Y.; George, K. A.; Ponomarev, A.; Nikjoo, H.; Wilson, J. W.
2006-01-01
Leukemias consisting of acute and chronic myeloid leukemia and acute lymphatic lymphomas represent the earliest cancers that appear after radiation exposure, have a high lethality fraction, and make up a significant fraction of the overall fatal cancer risk from radiation for adults. Several considerations impact the recommendation of a preferred model for the estimation of leukemia risks from solar particle events (SPE's): The BEIR VII report recommends several changes to the method of calculation of leukemia risk compared to the methods recommended by the NCRP Report No. 132 including the preference of a mixture model with additive and multiplicative components in BEIR VII compared to the additive transfer model recommended by NCRP Report No. 132. Proton fluences and doses vary considerably across marrow regions because of the characteristic spectra of primary solar protons making the use of an average dose suspect. Previous estimates of bone marrow doses from SPE's have used an average body-shielding distribution for marrow based on the computerized anatomical man model (CAM). We have developed an 82-point body-shielding distribution that faithfully reproduces the mean and variance of SPE doses in the active marrow regions (head and neck, chest, abdomen, pelvis and thighs) allowing for more accurate estimation of linear- and quadratic-dose components of the marrow response. SPE's have differential dose-rates and a pseudo-quadratic dose response term is possible in the peak-flux period of an event. Also, the mechanistic basis for leukemia risk continues to improve allowing for improved strategies in choosing dose-rate modulation factors and radiation quality descriptors. We make comparisons of the various choices of the components in leukemia risk estimates in formulating our preferred model. A major finding is that leukemia could be the dominant risk to astronauts for a major solar particle event.
Lessons learnt from tropical cyclone losses
NASA Astrophysics Data System (ADS)
Honegger, Caspar; Wüest, Marc; Zimmerli, Peter; Schoeck, Konrad
2016-04-01
Swiss Re has a long history of developing natural catastrophe loss models. The tropical cyclone USA and China models are examples of second-generation event-based models. Both are based on basin-wide probabilistic track sets and explicitly calculate the losses from the sub-perils wind and storm surge in an insurance portfolio. Based on these models, we present two case studies. China: a view on recent typhoon loss history. Over the last 20 years only very few major tropical cyclones have caused severe insurance losses in the Pearl River Delta region and Shanghai, the two main exposure clusters along China's southeast coast. Several storms have made landfall in China every year, but most struck areas with relatively low insured values. With this study, we make the point that typhoon landfalls in China have a strong hit-or-miss character and the available insured loss experience is too short to form a representative view of risk. Historical storm tracks and a simple loss model applied to a market portfolio - all from publicly available data - are sufficient to illustrate this. An event-based probabilistic model is necessary for a reliable judgement of the typhoon risk in China. New York: current and future tropical cyclone risk. In the aftermath of hurricane Sandy in 2012, Swiss Re supported the City of New York in identifying ways to significantly improve its resilience to severe weather and climate change. Swiss Re provided a quantitative assessment of potential climate-related risks facing the city as well as measures that could reduce those impacts.
A risk-based coverage model for video surveillance camera control optimization
NASA Astrophysics Data System (ADS)
Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua
2015-12-01
Visual surveillance systems for law enforcement or police case investigation differ from traditional applications, as they are designed to monitor pedestrians, vehicles or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about the targets and events being monitored, and risk entropy is introduced to model the requirements of a police surveillance task on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.
Tools used by the insurance industry to assess risk from hydroclimatic extremes
NASA Astrophysics Data System (ADS)
Higgs, Stephanie; McMullan, Caroline
2016-04-01
Probabilistic catastrophe models are widely used within the insurance industry to assess and price the risk of natural hazards to individual residences through to portfolios of millions of properties. Over the relatively short period that catastrophe models have been available (almost 30 years), the insurance industry has built up a financial resilience to key natural hazards in certain areas (e.g. US tropical cyclone, European extra-tropical cyclone and flood). However, due to the rapidly expanding global population and increase in wealth, together with uncertainties in the behaviour of meteorological phenomena introduced by climate change, the domain in which natural hazards impact society is growing. As a result, the insurance industry faces new challenges in assessing the risk and uncertainty from natural hazards. As a catastrophe modelling company, AIR Worldwide has a toolbox of options available to help the insurance industry assess extreme climatic events and their associated uncertainty. Here we discuss several of these tools: from helping analysts understand how uncertainty is inherently built in to probabilistic catastrophe models, to understanding alternative stochastic catalogs for tropical cyclone based on climate conditioning; from the use of stochastic extreme disaster events, such as those provided through AIR's catalogs or through the Lloyd's of London marketplace (RDSs), as benchmarks for the loss exceedance probability and tail-risk metrics output by catastrophe models, to the visualisation of 1000+ year event footprints and hazard intensity maps. Ultimately the increased transparency of catastrophe models and the flexibility of a software platform that allows for customisation of modelled and non-modelled risks will drive a greater understanding of extreme hydroclimatic events within the insurance industry.
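As background to the loss exceedance and tail-risk benchmarks mentioned above, here is a minimal sketch of how such metrics can be read off a stochastic catalog of simulated annual losses; the losses below are synthetic placeholders, not AIR model output.

```python
import numpy as np

# Hypothetical annual losses simulated from a stochastic event catalog
# (one total loss per simulated year); real catalogs contain many thousands of years.
rng = np.random.default_rng(0)
annual_losses = rng.pareto(a=2.0, size=10_000) * 1e6   # USD, illustrative

def exceedance_probability(losses, threshold):
    """Fraction of simulated years whose total loss exceeds the threshold."""
    return np.mean(losses > threshold)

# Loss at a 1-in-250-year return period (a common solvency benchmark)
loss_250yr = np.quantile(annual_losses, 1 - 1 / 250)

# Tail metric: mean loss in years at or beyond that level
tail_mean_250yr = annual_losses[annual_losses >= loss_250yr].mean()

print(exceedance_probability(annual_losses, 5e6), loss_250yr, tail_mean_250yr)
```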
Common-Cause Failure Treatment in Event Assessment: Basis for a Proposed New Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dana Kelly; Song-Hua Shen; Gary DeMoss
2010-06-01
Event assessment is an application of probabilistic risk assessment in which observed equipment failures and outages are mapped into the risk model to obtain a numerical estimate of the event’s risk significance. In this paper, we focus on retrospective assessments to estimate the risk significance of degraded conditions such as equipment failure accompanied by a deficiency in a process such as maintenance practices. In modeling such events, the basic events in the risk model that are associated with observed failures and other off-normal situations are typically configured to be failed, while those associated with observed successes and unchallenged components are assumed capable of failing, typically with their baseline probabilities. This is referred to as the failure memory approach to event assessment. The conditioning of common-cause failure probabilities for the common cause component group associated with the observed component failure is particularly important, as it is insufficient to simply leave these probabilities at their baseline values, and doing so may result in a significant underestimate of risk significance for the event. Past work in this area has focused on the mathematics of the adjustment. In this paper, we review the Basic Parameter Model for common-cause failure, which underlies most current risk modelling, discuss the limitations of this model with respect to event assessment, and introduce a proposed new framework for common-cause failure, which uses a Bayesian network to model underlying causes of failure, and which has the potential to overcome the limitations of the Basic Parameter Model with respect to event assessment.
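A minimal numerical sketch of the conditioning issue described above, using a two-train common-cause group under the Basic Parameter Model; the probabilities are illustrative, not taken from any plant model.

```python
# Basic Parameter Model, two-redundant-train group (illustrative numbers).
# q1: probability a specific train fails independently
# q2: probability both trains fail from a shared (common) cause
q1, q2 = 4.5e-3, 5.0e-4

# Baseline (unconditional) total failure probability of one train
q_total = q1 + q2

# Event assessment: train A has been observed failed.
# Probability that the observed failure is a common-cause failure,
# i.e. that train B is failed by the same shared cause:
p_ccf_given_failure = q2 / q_total

# Leaving train B at its baseline probability would understate this by a large factor
print(q_total, p_ccf_given_failure)   # ~5e-3 baseline vs ~0.1 conditional
```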
Distributions of observed death tolls govern sensitivity to human fatalities
Olivola, Christopher Y.; Sagara, Namika
2009-01-01
How we react to humanitarian crises, epidemics, and other tragic events involving the loss of human lives depends largely on the extent to which we are moved by the size of their associated death tolls. Many studies have demonstrated that people generally exhibit a diminishing sensitivity to the number of human fatalities and, equivalently, a preference for risky (vs. sure) alternatives in decisions under risk involving human losses. However, the reason for this tendency remains unknown. Here we show that the distributions of event-related death tolls that people observe govern their evaluations of, and risk preferences concerning, human fatalities. In particular, we show that our diminishing sensitivity to human fatalities follows from the fact that these death tolls are approximately power-law distributed. We further show that, by manipulating the distribution of mortality-related events that people observe, we can alter their risk preferences in decisions involving fatalities. Finally, we show that the tendency to be risk-seeking in mortality-related decisions is lower in countries in which high-mortality events are more frequently observed. Our results support a model of magnitude evaluation based on memory sampling and relative judgment. This model departs from the utility-based approaches typically encountered in psychology and economics in that it does not rely on stable, underlying value representations to explain valuation and choice, or on choice behavior to derive value functions. Instead, preferences concerning human fatalities emerge spontaneously from the distributions of sampled events and the relative nature of the evaluation process. PMID:20018778
Vascular protection in peripheral artery disease: systematic review and modelling study.
Hackam, D G; Sultan, N M; Criqui, M H
2009-07-01
To ascertain the effectiveness of medical therapy for reducing risk in peripheral artery disease (PAD) and to model the potential impact of combining multiple efficacious approaches. 17 electronic databases, reference lists of primary studies, clinical practice guidelines, review articles, trial registries and conference proceedings from cardiology, vascular surgery and atherosclerosis meetings were screened. Eligible studies were randomized trials or meta-analyses of randomized trials of medical therapy for PAD which reported major cardiovascular events (myocardial infarction, stroke and cardiovascular death). Baseline event rates for modelling analyses were derived from published natural history cohorts. Overall, three strategies had persuasive evidence for reducing risk in PAD: antiplatelet agents (pooled RRR 26%, 95% CI 10 to 42), statins (pooled RRR 26%, 95% CI 18 to 33) and angiotensin-converting enzyme inhibitors (individual trial RRR 25%, 95% CI 8 to 39). The estimated cumulative relative risk reduction for all three strategies was 59% (CI 32 to 76). Given a 5-year major cardiovascular event rate of 25%, the corresponding absolute risk reduction and number needed to treat to prevent one event were 15% (CI 8 to 19) and 7 (CI 5 to 12), respectively. Population level analyses suggest that increased uptake of these modalities could prevent more than 200 000 events in patients with PAD each year. The use of multiple efficacious strategies has the potential to substantially reduce the cardiovascular burden of PAD. However, these data should be regarded as hypothetical, since they are based on mathematical modelling rather than factorial randomized trials.
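The arithmetic behind the pooled estimates can be reproduced directly from the figures quoted in the abstract, assuming independent multiplicative effects as in the authors' modelling approach.

```python
# Combining the three relative risk reductions reported in the abstract
# (antiplatelets 26%, statins 26%, ACE inhibitors 25%), assuming independent,
# multiplicative effects.
rrrs = [0.26, 0.26, 0.25]

residual_risk_fraction = 1.0
for rrr in rrrs:
    residual_risk_fraction *= (1.0 - rrr)

cumulative_rrr = 1.0 - residual_risk_fraction          # ~0.59

baseline_5yr_event_rate = 0.25                          # 25% over 5 years
arr = baseline_5yr_event_rate * cumulative_rrr          # absolute risk reduction ~0.15
nnt = 1.0 / arr                                         # number needed to treat ~7

print(round(cumulative_rrr, 2), round(arr, 2), round(nnt))
```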
Multifractal Value at Risk model
NASA Astrophysics Data System (ADS)
Lee, Hojin; Song, Jae Wook; Chang, Woojin
2016-06-01
In this paper, a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.
Mortality risks during extreme temperature events (ETEs) using a distributed lag non-linear model
NASA Astrophysics Data System (ADS)
Allen, Michael J.; Sheridan, Scott C.
2018-01-01
This study investigates the relationship between all-cause mortality and extreme temperature events (ETEs) from 1975 to 2004. For 50 U.S. locations, these heat and cold events were defined based on location-specific thresholds of daily mean apparent temperature. Heat days were defined by a 3-day mean apparent temperature greater than the 95th percentile while extreme heat days were greater than the 97.5th percentile. Similarly, calculations for cold and extreme cold days relied upon the 5th and 2.5th percentiles. A distributed lag non-linear model assessed the relationship between mortality and ETEs for a cumulative 14-day period following exposure. Subsets for season and duration effect denote the differences between early- and late-season as well as short and long ETEs. While longer-lasting heat days resulted in elevated mortality, early season events also impacted mortality outcomes. Over the course of the summer season, heat-related risk decreased, though prolonged heat days still had a greater influence on mortality. Unlike heat, cold-related risk was greatest in more southerly locations. Risk was highest for early season cold events and decreased over the course of the winter season. Statistically, short episodes of cold showed the highest relative risk, suggesting unsettled weather conditions may have some relationship to cold-related mortality. For both heat and cold, results indicate higher risk to the more extreme thresholds. Risk values provide further insight into the role of adaptation, geographical variability, and acclimatization with respect to ETEs.
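A minimal sketch of the ETE definitions described above (3-day mean apparent temperature against the 95th/97.5th and 5th/2.5th percentile thresholds); the temperature series is synthetic and the variable names are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical daily mean apparent temperature series for one location
dates = pd.date_range("1975-01-01", "2004-12-31", freq="D")
rng = np.random.default_rng(7)
at = pd.Series(20 + 10 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
               + rng.normal(0, 3, len(dates)), index=dates)

# 3-day mean apparent temperature
at3 = at.rolling(window=3).mean()

# Location-specific thresholds
heat_thr = at3.quantile(0.95)
extreme_heat_thr = at3.quantile(0.975)
cold_thr = at3.quantile(0.05)
extreme_cold_thr = at3.quantile(0.025)

events = pd.DataFrame({
    "heat_day": at3 > heat_thr,
    "extreme_heat_day": at3 > extreme_heat_thr,
    "cold_day": at3 < cold_thr,
    "extreme_cold_day": at3 < extreme_cold_thr,
})
print(events.sum())   # number of flagged days per event type
```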
Vistisen, Dorte; Andersen, Gregers Stig; Hansen, Christian Stevns; Hulman, Adam; Henriksen, Jan Erik; Bech-Nielsen, Henning; Jørgensen, Marit Eika
2016-03-15
Patients with type 1 diabetes mellitus are at increased risk of developing cardiovascular disease (CVD), but they are currently undertreated. There are no risk scores used on a regular basis in clinical practice for assessing the risk of CVD in type 1 diabetes mellitus. From 4306 clinically diagnosed adult patients with type 1 diabetes mellitus, we developed a prediction model for estimating the risk of first fatal or nonfatal CVD event (ischemic heart disease, ischemic stroke, heart failure, and peripheral artery disease). Detailed clinical data including lifestyle factors were linked to event data from validated national registers. The risk prediction model was developed by using a 2-stage approach. First, a nonparametric, data-driven approach was used to identify potentially informative risk factors and interactions (random forest and survival tree analysis). Second, based on results from the first step, Poisson regression analysis was used to derive the final model. The final CVD prediction model was externally validated in a different population of 2119 patients with type 1 diabetes mellitus. During a median follow-up of 6.8 years (interquartile range, 2.9-10.9) a total of 793 (18.4%) patients developed CVD. The final prediction model included age, sex, diabetes duration, systolic blood pressure, low-density lipoprotein cholesterol, hemoglobin A1c, albuminuria, glomerular filtration rate, smoking, and exercise. Discrimination was excellent for a 5-year CVD event with a C-statistic of 0.826 (95% confidence interval, 0.807-0.845) in the derivation data and a C-statistic of 0.803 (95% confidence interval, 0.767-0.839) in the validation data. The Hosmer-Lemeshow test showed good calibration (P>0.05) in both cohorts. This high-performing CVD risk model allows for the implementation of decision rules in a clinical setting. © 2016 American Heart Association, Inc.
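A minimal sketch of the second-stage Poisson regression, with log person-time as an offset so that coefficients are incidence-rate ratios; the data and covariates below are synthetic stand-ins, not the Danish cohort.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical follow-up data: one row per patient with follow-up time (years),
# an event indicator and a few of the risk factors named in the abstract.
rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "event": rng.binomial(1, 0.15, n),
    "years": rng.uniform(1, 10, n),
    "age": rng.normal(45, 12, n),
    "sbp": rng.normal(130, 15, n),
    "smoker": rng.binomial(1, 0.25, n),
})

X = sm.add_constant(df[["age", "sbp", "smoker"]])
# Poisson regression of event counts with log person-time as offset,
# so exponentiated coefficients are incidence-rate ratios.
model = sm.GLM(df["event"], X, family=sm.families.Poisson(),
               offset=np.log(df["years"]))
result = model.fit()
print(np.exp(result.params))
```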
Meteorological risks as drivers of innovation for agroecosystem management
NASA Astrophysics Data System (ADS)
Gobin, Anne; Van de Vyver, Hans; Zamani, Sepideh; Curnel, Yannick; Planchon, Viviane; Verspecht, Ann; Van Huylenbroeck, Guido
2015-04-01
Devastating weather-related events recorded in recent years have captured the interest of the general public in Belgium. The MERINOVA project tests the hypothesis that meteorological risks act as drivers of environmental innovation in agro-ecosystem management, using a "chain of risk" approach. The major objectives are to (1) assess the probability of extreme meteorological events by means of probability density functions; (2) analyse the impact of extreme events on agro-ecosystems using process-based bio-physical modelling methods; (3) identify the most vulnerable agro-ecosystems using fuzzy multi-criteria and spatial analysis; (4) uncover innovative risk management and adaptation options using actor-network theory and economic modelling; and, (5) communicate to research, policy and practitioner communities using web-based techniques. Generalized Extreme Value (GEV) theory was used to model annual rainfall maxima based on location-, scale- and shape-parameters that determine the centre of the distribution, the deviation of the location-parameter and the upper tail decay, respectively. Likewise, the distributions of consecutive rainy days, rainfall deficits and extreme 24-hour rainfall were modelled. Spatial interpolation of GEV-derived return levels resulted in maps of extreme precipitation, precipitation deficits and wet periods. The degree of temporal overlap between extreme weather conditions and sensitive periods in the agro-ecosystem was determined using a bio-physically based modelling framework that couples phenological models, a soil water balance, crop growth and environmental models. 20-year return values were derived for frost, heat stress, drought, waterlogging and field access during different sensitive stages for different arable crops. Extreme yield values were detected from detrended long term arable yields and relationships were found with soil moisture conditions, heat stress or other meteorological variables during the season. A methodology for identifying agro-ecosystem vulnerability was developed using spatially explicit information and was tested for arable crop production in Belgium. The different components of vulnerability for a region include spatial information on meteorology, soil available water content, soil erosion, the degree of waterlogging, crop share and the diversity of potato varieties. The level of vulnerability and resilience of an agro-ecosystem is also determined by risk management. The types of agricultural risk and their relative importance differ across sectors and farm types. Risk types are further distinguished according to production, market, institutional, financial and liability risks. Strategies are often combined in the risk management strategy of a farmer and include reduction and prevention, mitigation, coping and impact reduction. Based on an extensive literature review, a portfolio of potential strategies was identified at farm, market and policy level. Research hypotheses were tested using an on-line questionnaire on knowledge of agricultural risk, measuring the general risk aversion of the farmer and risk management strategies. The "chain of risk" approach adopted as a research methodology allows for investigating the hypothesis that meteorological risks act as drivers for agricultural innovation. Risks related to extreme weather events in Belgium are mainly caused by heat, frost, excess rainfall, drought and storms, and their impact is predominantly felt by arable, horticultural and extensive dairy farmers.
The risk is quantified in terms of probability of occurrence, magnitude, frequency and extent of impact on several agro-ecosystem services. The spatial extent of vulnerability is assessed by integrating different layers of geo-information, while risk management is analysed using questionnaires and economic modelling methods. Future work will concentrate on the further development and testing of the currently developed modelling methodologies. https://merinova.vito.be The research is funded by the Belgian Science Policy Organisation (Belspo) under contract nr SD/RI/03A.
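A minimal sketch of the GEV step described above, fitting annual maxima and deriving return levels with scipy; the series is synthetic, not data from the Belgian stations.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual 24-hour rainfall maxima (mm) for one station
annual_maxima = genextreme.rvs(c=-0.1, loc=40, scale=10, size=60, random_state=42)

# Fit shape, location and scale parameters of the GEV distribution
shape, loc, scale = genextreme.fit(annual_maxima)

def return_level(T):
    """T-year return level: value exceeded on average once every T years."""
    return genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)

print(return_level(20), return_level(100))
```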
From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model
NASA Astrophysics Data System (ADS)
Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter
2014-05-01
The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.
The comparison of various approach to evaluation erosion risks and design control erosion measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present, there is one established methodology in the Czech Republic for computing and comparing erosion risks. This methodology also contains a method for designing erosion control measures. Its basis is the Universal Soil Loss Equation (USLE), whose result is the long-term average annual rate of erosion (G). This methodology is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many problems and damages arise from local episodic erosion events. The extent of these events and their impact depend on the local precipitation event, the current plant phase and the soil conditions. Such erosion events can cause damage to agricultural land, municipal property and water infrastructure even at locations that appear to be in good condition from the point of view of the long-term average annual rate of erosion. An alternative way to compute and compare erosion risks is an episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events have been recorded. The study area is a simple agricultural parcel without any barriers that could strongly influence water flow and soil sediment transport. The computation of erosion risks (for all methodologies) was based on laboratory analysis of soil samples taken in the study area. Results of the USLE and MUSLE methodologies and results from the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the places with the highest soil erosion were compared and discussed. A further part presents differences in the erosion control measures designed on the basis of the different methodologies. The results show the variance in erosion risks computed by the different methodologies. These variances can open a discussion about different approaches to computing and evaluating erosion risks in areas of different importance.
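For reference, the USLE-based long-term estimate that the current methodology relies on reduces to a product of factors; the factor values below are illustrative only, not those of the study site.

```python
# Universal Soil Loss Equation: A = R * K * LS * C * P
# A  - long-term average annual soil loss (t ha^-1 yr^-1)
# R  - rainfall erosivity, K - soil erodibility, LS - slope length/steepness,
# C  - cover management, P - support practice factor.
# Values below are illustrative placeholders.
R, K, LS, C, P = 45.0, 0.35, 1.8, 0.25, 1.0

A = R * K * LS * C * P
print(A)   # compare against an erosion threshold to classify the parcel as at risk
```

Event-based approaches such as MUSLE or Erosion 3D instead work per rainfall episode, which is why their spatial pattern of highest erosion can differ from the long-term USLE picture.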
Gronich, Naomi; Lavi, Idit; Rennert, Gad
2011-01-01
Background: Combined oral contraceptives are a common method of contraception, but they carry a risk of venous and arterial thrombosis. We assessed whether use of drospirenone was associated with an increase in thrombotic risk relative to third-generation combined oral contraceptives. Methods: Using computerized records of the largest health care provider in Israel, we identified all women aged 12 to 50 years for whom combined oral contraceptives had been dispensed between Jan. 1, 2002, and Dec. 31, 2008. We followed the cohort until 2009. We used Poisson regression models to estimate the crude and adjusted rate ratios for risk factors for venous thrombotic events (specifically deep vein thrombosis and pulmonary embolism) and arterial thrombotic events (specifically transient ischemic attack and cerebrovascular accident). We performed multivariable analyses to compare types of contraceptives, with adjustment for the various risk factors. Results: We identified a total of 1017 (0.24%) venous and arterial thrombotic events among 431 223 use episodes during 819 749 woman-years of follow-up (6.33 venous events and 6.10 arterial events per 10 000 woman-years). In a multivariable model, use of drospirenone carried an increased risk of venous thrombotic events, relative to both third-generation combined oral contraceptives (rate ratio [RR] 1.43, 95% confidence interval [CI] 1.15–1.78) and second-generation combined oral contraceptives (RR 1.65, 95% CI 1.02–2.65). There was no increase in the risk of arterial thrombosis with drospirenone. Interpretation: Use of drospirenone-containing oral contraceptives was associated with an increased risk of deep vein thrombosis and pulmonary embolism, but not transient ischemic attack or cerebrovascular accident, relative to second- and third-generation combined oral contraceptives. PMID:22065352
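A sketch of the crude incidence-rate and rate-ratio arithmetic that underlies such cohort comparisons (the multivariable Poisson model additionally adjusts for risk factors); the counts below are hypothetical, not the study's.

```python
import math

# Hypothetical event counts and follow-up (woman-years) for two exposure groups
events_drsp, years_drsp = 120, 150_000      # drospirenone users
events_3rd,  years_3rd  = 170, 300_000      # third-generation users

rate_drsp = events_drsp / years_drsp * 10_000   # events per 10,000 woman-years
rate_3rd  = events_3rd  / years_3rd  * 10_000

rate_ratio = rate_drsp / rate_3rd

# Wald 95% confidence interval on the log rate ratio
se_log_rr = math.sqrt(1 / events_drsp + 1 / events_3rd)
ci = (rate_ratio * math.exp(-1.96 * se_log_rr),
      rate_ratio * math.exp(+1.96 * se_log_rr))

print(rate_drsp, rate_3rd, rate_ratio, ci)
```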
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications besides the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for a probabilistic scenario-based seismic risk analysis has been developed based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information which is necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common for applications in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing different probabilistic models for earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion to formulate different risk goals. The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
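A minimal sketch of how steps (1)-(4) combine into an annual damage frequency and a lifetime risk; the class frequencies, conditional probabilities and lifetime below are hypothetical.

```python
import numpy as np

# Steps (1)-(2): magnitude classes and their annual frequencies of occurrence
annual_freq = np.array([1e-2, 2e-3, 3e-4])      # events/year per class (illustrative)

# Steps (3)-(4): conditional probability that the bounding scenario of each class
# exceeds a critical design parameter of the structure (vulnerability analysis)
p_exceed = np.array([0.001, 0.05, 0.4])

# Annual frequency of exceeding the critical design parameter
annual_damage_freq = np.sum(annual_freq * p_exceed)

# Risk over the residual lifetime of the structure (e.g. 20 years),
# assuming Poisson-distributed event occurrences
lifetime = 20.0
lifetime_risk = 1.0 - np.exp(-annual_damage_freq * lifetime)

print(annual_damage_freq, lifetime_risk)
```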
Land-Use Portfolio Modeler, Version 1.0
Taketa, Richard; Hong, Makiko
2010-01-01
Natural hazards pose significant threats to the public safety and economic health of many communities throughout the world. Community leaders and decision-makers continually face the challenges of planning and allocating limited resources to invest in protecting their communities against catastrophic losses from natural-hazard events. Public efforts to assess community vulnerability and encourage loss-reduction measures through mitigation often focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. The site-specific method usually provided the most accurate estimates, but was prohibitively expensive, whereas regional risk assessments were often too general to be of practical use. Policy makers lacked a systematic and quantitative method for conducting a regional-scale risk assessment of natural hazards. In response, Bernknopf and others developed the portfolio model, an intermediate-scale approach to assessing natural-hazard risks and mitigation policy alternatives. The basis for the portfolio-model approach was inspired by financial portfolio theory, which prescribes a method of optimizing return on investment while reducing risk by diversifying investments in different security types. In this context, a security type represents a unique combination of features and hazard-risk level, while financial return is defined as the reduction in losses resulting from an investment in mitigation of chosen securities. Features are selected for mitigation and are modeled like investment portfolios. Earth-science and economic data for the features are combined and processed in order to analyze each of the portfolios, which are then used to evaluate the benefits of mitigating the risk in selected locations. Ultimately, the decision maker seeks to choose a portfolio representing a mitigation policy that maximizes the expected return-on-investment, while minimizing the uncertainty associated with that return-on-investment. The portfolio model, now known as the Land-Use Portfolio Model (LUPM), provided the framework for the development of the Land-Use Portfolio Modeler, Version 1.0 software (LUPM v1.0). The software provides a geographic information system (GIS)-based modeling tool for evaluating alternative risk-reduction mitigation strategies for specific natural-hazard events. The modeler uses information about a specific natural-hazard event and the features exposed to that event within the targeted study region to derive a measure of a given mitigation strategy's effectiveness. Harnessing the spatial capabilities of a GIS enables the tool to provide a rich, interactive mapping environment in which users can create, analyze, visualize, and compare different
Karmali, Kunal N.; Lloyd-Jones, Donald M.; Zanchetti, Alberto; Jackson, Rodney; Woodward, Mark; Neal, Bruce C.; Berge, Eivind; Teo, Koon; Davis, Barry R.; Pepine, Carl
2018-01-01
Background Clinical practice guidelines have traditionally recommended blood pressure treatment based primarily on blood pressure thresholds. In contrast, using predicted cardiovascular risk has been advocated as a more effective strategy to guide treatment decisions for cardiovascular disease (CVD) prevention. We aimed to compare outcomes from a blood pressure-lowering treatment strategy based on predicted cardiovascular risk with one based on systolic blood pressure (SBP) level. Methods and findings We used individual participant data from the Blood Pressure Lowering Treatment Trialists’ Collaboration (BPLTTC) from 1995 to 2013. Trials randomly assigned participants to either blood pressure-lowering drugs versus placebo or more intensive versus less intensive blood pressure-lowering regimens. We estimated 5-y risk of CVD events using a multivariable Weibull model previously developed in this dataset. We compared the two strategies at specific SBP thresholds and across the spectrum of risk and blood pressure levels studied in BPLTTC trials. The primary outcome was number of CVD events avoided per persons treated. We included data from 11 trials (47,872 participants). During a median of 4.0 y of follow-up, 3,566 participants (7.5%) experienced a major cardiovascular event. Areas under the curve comparing the two treatment strategies throughout the range of possible thresholds for CVD risk and SBP demonstrated that, on average, a greater number of CVD events would be avoided for a given number of persons treated with the CVD risk strategy compared with the SBP strategy (area under the curve 0.71 [95% confidence interval (CI) 0.70–0.72] for the CVD risk strategy versus 0.54 [95% CI 0.53–0.55] for the SBP strategy). Compared with treating everyone with SBP ≥ 150 mmHg, a CVD risk strategy would require treatment of 29% (95% CI 26%–31%) fewer persons to prevent the same number of events or would prevent 16% (95% CI 14%–18%) more events for the same number of persons treated. Compared with treating everyone with SBP ≥ 140 mmHg, a CVD risk strategy would require treatment of 3.8% (95% CI 12.5% fewer to 7.2% more) fewer persons to prevent the same number of events or would prevent 3.1% (95% CI 1.5%–5.0%) more events for the same number of persons treated, although the former estimate was not statistically significant. In subgroup analyses, the CVD risk strategy did not appear to be more beneficial than the SBP strategy in patients with diabetes mellitus or established CVD. Conclusions A blood pressure-lowering treatment strategy based on predicted cardiovascular risk is more effective than one based on blood pressure levels alone across a range of thresholds. These results support using cardiovascular risk assessment to guide blood pressure treatment decision-making in moderate- to high-risk individuals, particularly for primary prevention. PMID:29558462
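A minimal sketch of the comparison logic: given each person's predicted 5-year CVD risk, treat either by an SBP threshold or by ranking on predicted risk, and count the expected events avoided for the same number of persons treated. The population, risk model and treatment effect below are synthetic assumptions, not BPLTTC data.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000
sbp = rng.normal(140, 18, n)                                      # systolic blood pressure (mmHg)
risk5 = np.clip(rng.beta(2, 30, n) + (sbp - 120) * 0.001, 0, 1)   # predicted 5-y CVD risk

rrr = 0.20   # assumed relative risk reduction from blood pressure-lowering treatment

def events_avoided(treated):
    """Expected events prevented if everyone flagged as treated receives therapy."""
    return np.sum(risk5[treated] * rrr)

# SBP strategy: treat everyone with SBP >= 150 mmHg
sbp_strategy = sbp >= 150
n_treated = int(sbp_strategy.sum())

# CVD-risk strategy: treat the same number of persons, ranked by predicted risk
risk_strategy = np.zeros(n, dtype=bool)
risk_strategy[np.argsort(risk5)[-n_treated:]] = True

print(n_treated, events_avoided(sbp_strategy), events_avoided(risk_strategy))
```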
Assessment of cardiovascular risk based on a data-driven knowledge discovery approach.
Mendes, D; Paredes, S; Rocha, T; Carvalho, P; Henriques, J; Cabiddu, R; Morais, J
2015-01-01
The cardioRisk project addresses the development of personalized risk assessment tools for patients who have been admitted to the hospital with acute myocardial infarction. Although there are models available that assess the short-term risk of death/new events for such patients, these models were established in circumstances that do not take into account the present clinical interventions and, in some cases, the risk factors used by such models are not easily available in clinical practice. The integration of the existing risk tools (applied in the clinician's daily practice) with data-driven knowledge discovery mechanisms based on data routinely collected during hospitalizations, will be a breakthrough in overcoming some of these difficulties. In this context, the development of simple and interpretable models (based on recent datasets), unquestionably will facilitate and will introduce confidence in this integration process. In this work, a simple and interpretable model based on a real dataset is proposed. It consists of a decision tree model structure that uses a reduced set of six binary risk factors. The validation is performed using a recent dataset provided by the Portuguese Society of Cardiology (11113 patients), which originally comprised 77 risk factors. A sensitivity, specificity and accuracy of, respectively, 80.42%, 77.25% and 78.80% were achieved showing the effectiveness of the approach.
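A minimal sketch of the kind of shallow, interpretable decision tree described above, built on six binary risk factors and evaluated by sensitivity, specificity and accuracy; the data are synthetic, not the Portuguese Society of Cardiology registry.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(5)
n = 5000
# Six hypothetical binary risk factors and a synthetic outcome for illustration
X = rng.binomial(1, 0.3, size=(n, 6))
logit = -2.5 + X @ np.array([0.9, 0.7, 0.6, 0.5, 0.4, 0.3])
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=50,
                              class_weight="balanced", random_state=0)
tree.fit(X, y)

pred = tree.predict(X)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / n
print(sensitivity, specificity, accuracy)
print(export_text(tree, feature_names=[f"rf{i}" for i in range(1, 7)]))
```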
Lunar Landing Operational Risk Model
NASA Technical Reports Server (NTRS)
Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian
2010-01-01
Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the risk of Loss of Mission (LOM) - Abort Required and is Successful, Loss of Crew (LOC) - Vehicle Crashes or Cannot Reach Orbit, and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
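A conceptual Monte Carlo sketch of classifying simulated landings into Success, LOM (abort required and successful) and LOC; the event logic and probabilities are invented placeholders, not the actual LLORM models.

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 100_000

# Hypothetical per-landing draws: navigation error at the hazard-detection gate (m),
# available divert capability (m), and presence of a surface hazard in the footprint.
nav_error = np.abs(rng.normal(0, 15, n_trials))
divert_capability = rng.normal(60, 10, n_trials)
hazard_in_footprint = rng.random(n_trials) < 0.10

needs_divert = hazard_in_footprint
can_divert = divert_capability > nav_error + 30      # illustrative divert requirement

success = ~needs_divert | (needs_divert & can_divert)
abort_ok = rng.random(n_trials) < 0.95               # illustrative abort reliability

lom = ~success & abort_ok       # abort required and successful
loc = ~success & ~abort_ok      # vehicle crashes or cannot reach orbit

print(success.mean(), lom.mean(), loc.mean())
```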
NASA Astrophysics Data System (ADS)
Hsieh, Nan-Hung; Liao, Chung-Min
2013-04-01
Asian dust storm (ADS) events are seasonal meteorological phenomena that exacerbate chronic respiratory diseases. The purpose of this study was to assess human health risk from airborne dust exposure during ADS events in Taiwan. A probabilistic risk assessment framework was developed based on exposure and experimental data to quantify ADS event-induced lung function decrement. The study reanalyzed experimental data from aerosol challenge in asthmatic individuals to construct the dose-response relationship between inhaled dust aerosol dose and the percentage decrease of forced expiratory volume in 1 s (%FEV1). An empirical lung deposition model was used to predict the deposition fraction of size-specific dust aerosols in pulmonary regions. Toxicokinetic and toxicodynamic models were used to simulate dust aerosol binding kinetics in the lung airway, from which the %FEV1 change was also predicted. Mask respirators were considered as a means of controlling the inhaled dose during dust aerosol exposure. Our results show that there was only a 2% probability that mild ADS events would cause a %FEV1 decrement higher than 5%. There was a 50% probability of a %FEV1 decrease exceeding 16.9, 18.9, and 7.1% in northern, central, and southern Taiwan, respectively, under severe ADS events. Our results indicate that the use of activated carbon mask respirators has the best efficacy for reducing the inhaled dust aerosol dose, by which the %FEV1 decrement can be reduced to less than 1%.
Risk management of a fund for natural disasters
NASA Astrophysics Data System (ADS)
Flores, C.
2003-04-01
Mexico is a country which has to deal with several natural disaster risks: earthquakes, droughts, volcanic eruptions, floods, slides, wild fires, extreme temperatures, etc. In order to reduce the country's vulnerability to the impact of these natural disasters and to support rapid recovery when they occur, the government established in 1996 Mexico's Fund for Natural Disasters (FONDEN). Since its creation, its resources have been insufficient to meet all government obligations. The aim of this project is the development of a dynamic strategy to optimise the management of a fund for natural disasters starting from the example of FONDEN. The problem of budgetary planning is being considered for the modelling. We control the level of the fund's cash (R_t)0<= t
Shah, Ravi; Heydari, Bobak; Coelho-Filho, Otavio; Murthy, Venkatesh L; Abbasi, Siddique; Feng, Jiazhuo H; Pencina, Michael; Neilan, Tomas G; Meadows, Judith L; Francis, Sanjeev; Blankstein, Ron; Steigner, Michael; di Carli, Marcelo; Jerosch-Herold, Michael; Kwong, Raymond Y
2013-08-06
A recent large-scale clinical trial found that an initial invasive strategy does not improve cardiac outcomes beyond optimized medical therapy in patients with stable coronary artery disease. Novel methods to stratify at-risk patients may refine therapeutic decisions to improve outcomes. In a cohort of 815 consecutive patients referred for evaluation of myocardial ischemia, we determined the net reclassification improvement of the risk of cardiac death or nonfatal myocardial infarction (major adverse cardiac events) incremental to clinical risk models, using guideline-based low (<1%), moderate (1% to 3%), and high (>3%) annual risk categories. In the whole cohort, inducible ischemia demonstrated a strong association with major adverse cardiac events (hazard ratio=14.66; P<0.0001) with low negative event rates of major adverse cardiac events and cardiac death (0.6% and 0.4%, respectively). This prognostic robustness was maintained in patients with previous coronary artery disease (hazard ratio=8.17; P<0.0001; 1.3% and 0.6%, respectively). Adding inducible ischemia to the multivariable clinical risk model (adjusted for age and previous coronary artery disease) improved discrimination of major adverse cardiac events (C statistic, 0.81-0.86; P=0.04; adjusted hazard ratio=7.37; P<0.0001) and reclassified 91.5% of patients at moderate pretest risk (65.7% to low risk; 25.8% to high risk) with corresponding changes in the observed event rates (0.3%/y and 4.9%/y for low and high risk posttest, respectively). Categorical net reclassification index was 0.229 (95% confidence interval, 0.063-0.391). Continuous net reclassification improvement was 1.11 (95% confidence interval, 0.81-1.39). Stress cardiac magnetic resonance imaging effectively reclassifies patient risk beyond standard clinical variables, specifically in patients at moderate to high pretest clinical risk and in patients with previous coronary artery disease. http://www.clinicaltrials.gov. Unique identifier: NCT01821924.
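A minimal sketch of the categorical net reclassification improvement computed from pre- and post-test risk categories and observed outcomes; the data below are synthetic, not the study cohort.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 2000
event = rng.binomial(1, 0.08, n).astype(bool)

# Risk categories: 0 = low (<1%/y), 1 = moderate (1-3%/y), 2 = high (>3%/y)
pre = rng.integers(0, 3, n)
# Post-test categories after adding the imaging result (synthetic reclassification)
post = np.clip(pre + rng.choice([-1, 0, 1], n, p=[0.25, 0.5, 0.25]), 0, 2)

up, down = post > pre, post < pre

nri_events = up[event].mean() - down[event].mean()
nri_nonevents = down[~event].mean() - up[~event].mean()
categorical_nri = nri_events + nri_nonevents
print(nri_events, nri_nonevents, categorical_nri)
```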
2013-01-01
Background Comparison of outcomes between populations or centres may be confounded by any casemix differences and standardisation is carried out to avoid this. However, when the casemix adjustment models are large and complex, direct standardisation has been described as “practically impossible”, and indirect standardisation may lead to unfair comparisons. We propose a new method of directly standardising for risk rather than standardising for casemix which overcomes these problems. Methods Using a casemix model which is the same model as would be used in indirect standardisation, the risk in individuals is estimated. Risk categories are defined, and event rates in each category for each centre to be compared are calculated. A weighted sum of the risk category specific event rates is then calculated. We have illustrated this method using data on 6 million admissions to 146 hospitals in England in 2007/8 and an existing model with over 5000 casemix combinations, and a second dataset of 18,668 adult emergency admissions to 9 centres in the UK and overseas and a published model with over 20,000 casemix combinations and a continuous covariate. Results Substantial differences between conventional directly casemix standardised rates and rates from direct risk standardisation (DRS) were found. Results based on DRS were very similar to Standardised Mortality Ratios (SMRs) obtained from indirect standardisation, with similar standard errors. Conclusions Direct risk standardisation using our proposed method is as straightforward as using conventional direct or indirect standardisation, always enables fair comparisons of performance to be made, can use continuous casemix covariates, and was found in our examples to have similar standard errors to the SMR. It should be preferred when there is a risk that conventional direct or indirect standardisation will lead to unfair comparisons. PMID:24168424
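A minimal sketch of the proposed direct risk standardisation: estimate individual risk from the casemix model, form risk categories, compute each centre's category-specific event rates, and apply a common set of weights; the data are synthetic, not the English admissions dataset.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 20_000
df = pd.DataFrame({
    "centre": rng.choice(["A", "B", "C"], n),
    "risk": rng.beta(2, 20, n),          # predicted risk from the casemix model
})
# Synthetic outcomes, with centres performing slightly better or worse than predicted
df["event"] = rng.binomial(1, df["risk"] * rng.choice([0.8, 1.0, 1.2], n))

# Define risk categories from the pooled risk distribution
df["risk_cat"] = pd.qcut(df["risk"], q=5, labels=False)

# Common weights: share of all patients in each risk category
weights = df["risk_cat"].value_counts(normalize=True).sort_index()

# Directly risk-standardised event rate per centre
rates = df.groupby(["centre", "risk_cat"])["event"].mean().unstack()
drs = (rates * weights).sum(axis=1)
print(drs)
```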
Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island
NASA Astrophysics Data System (ADS)
Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark
2015-04-01
Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They, therefore, require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems with an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and policies on spatial planning rules are implemented. Preliminary results demonstrate the evolving nature of flood risks and describe the effectiveness of different planning policies to reduce risk and increase resilience.
A prediction model for colon cancer surveillance data.
Good, Norm M; Suresh, Krithika; Young, Graeme P; Lockett, Trevor J; Macrae, Finlay A; Taylor, Jeremy M G
2015-08-15
Dynamic prediction models make use of patient-specific longitudinal data to update individualized survival probability predictions based on current and past information. Colonoscopy (COL) and fecal occult blood test (FOBT) results were collected from two Australian surveillance studies on individuals characterized as high-risk based on a personal or family history of colorectal cancer. Motivated by a Poisson process, this paper proposes a generalized nonlinear model with a complementary log-log link as a dynamic prediction tool that produces individualized probabilities for the risk of developing advanced adenoma or colorectal cancer (AAC). This model allows predicted risk to depend on a patient's baseline characteristics and time-dependent covariates. Information on the dates and results of COLs and FOBTs was incorporated using time-dependent covariates that contributed to patient risk of AAC for a specified period following the test result. These covariates serve to update a person's risk as additional COL and FOBT test information becomes available. Model selection was conducted systematically through the comparison of Akaike information criterion. Goodness-of-fit was assessed with the use of calibration plots to compare the predicted probability of event occurrence with the proportion of events observed. Abnormal COL results were found to significantly increase risk of AAC for 1 year following the test. Positive FOBTs were found to significantly increase the risk of AAC for 3 months following the result. The covariates that incorporated the updated test results were of greater significance and had a larger effect on risk than the baseline variables. Copyright © 2015 John Wiley & Sons, Ltd.
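A minimal sketch of how the complementary log-log model updates risk from time-dependent test results; the coefficients and look-back windows below are invented for illustration, and only the link function and covariate structure follow the abstract.

```python
import math
from datetime import date

# Illustrative coefficients for a complementary log-log model on short person-periods:
# baseline, abnormal colonoscopy within the last year, positive FOBT within 3 months.
b0, b_col, b_fobt = -5.0, 1.2, 0.9

def aac_risk(period_start, last_abnormal_col=None, last_positive_fobt=None):
    """Predicted probability of AAC in the period starting at period_start."""
    recent_col = (last_abnormal_col is not None
                  and (period_start - last_abnormal_col).days <= 365)
    recent_fobt = (last_positive_fobt is not None
                   and (period_start - last_positive_fobt).days <= 90)
    eta = b0 + b_col * recent_col + b_fobt * recent_fobt
    # complementary log-log link: p = 1 - exp(-exp(eta))
    return 1.0 - math.exp(-math.exp(eta))

print(aac_risk(date(2015, 6, 1)))                                         # no recent tests
print(aac_risk(date(2015, 6, 1), last_abnormal_col=date(2015, 1, 10)))    # abnormal COL in window
```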
NASA Astrophysics Data System (ADS)
Ludwig, R.
2017-12-01
There is as yet no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events and how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High Performance Computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high resolution data (12km) of the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to account for the assessment of both natural climate variability and climate change. The large ensemble with several thousand model years provides the potential to catch rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec in high temporal (3h) and spatial (500m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. The specific data situation makes it possible to establish a new method for 'virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data which reveal preferential triggers of hydrological extreme events. The presentation will highlight first results from the analysis of the large scale ClimEx model ensemble, showing the current and future ratio of natural variability and climate change impacts on meteorological extreme events. Selected data from the ensemble is used to drive a hydrological model experiment to illustrate the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change.
Solar Energetic Particle Event Risks for Future Human Missions within the Inner Heliosphere
NASA Astrophysics Data System (ADS)
Over, S.; Ford, J.
2017-12-01
As astronauts travel beyond low-Earth orbit (LEO), space weather research will play a key role in determining risks from space radiation. Of interest are the rare, large solar energetic particle (SEP) events that can cause significant medical effects during flight. Historical SEP data were analyzed from the Geostationary Operational Environmental Satellites (GOES) program covering the time period of 1986 to 2016 for SEP events. The SEP event data were combined with a Monte Carlo approach to develop a risk model to determine maximum expected doses for missions within the inner heliosphere. Presented here are results from risk assessments for proposed Mars transits as compared to a geostationary Earth-bound mission. Overall, the greatest risk was for the return from Mars with a Venus swing-by, due to the additional transit length and decreased distance from the Sun as compared to traditional Hohmann transfers. The overall results do not indicate that the effects of SEP events alone would prohibit these missions based on current radiation limits alone, but the combination of doses from SEP events and galactic cosmic radiation may be significant, and should be considered in all phases of mission design.
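A Monte Carlo sketch of the mission-dose calculation: sample the number of large SEP events over the mission and a dose per event from a heavy-tailed distribution, then read off an upper percentile. The event rate and dose distribution below are assumptions for illustration, not the GOES-derived fits used in the study.

```python
import numpy as np

rng = np.random.default_rng(2016)
n_missions = 100_000

mission_years = 2.5                 # e.g. a Mars transit scenario (illustrative)
events_per_year = 4.0               # assumed rate of significant SEP events near solar maximum

doses = np.zeros(n_missions)
n_events = rng.poisson(events_per_year * mission_years, n_missions)
for i, k in enumerate(n_events):
    # Lognormal dose per event behind nominal shielding (cGy-Eq, illustrative)
    doses[i] = rng.lognormal(mean=np.log(0.5), sigma=1.2, size=k).sum()

# Maximum expected mission dose at chosen confidence levels
print(np.percentile(doses, 95), np.percentile(doses, 99))
```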
The Generation of a Stochastic Flood Event Catalogue for Continental USA
NASA Astrophysics Data System (ADS)
Quinn, N.; Wing, O.; Smith, A.; Sampson, C. C.; Neal, J. C.; Bates, P. D.
2017-12-01
Recent advances in the acquisition of spatiotemporal environmental data and improvements in computational capabilities have enabled the generation of large scale, even global, flood hazard layers which serve as a critical decision-making tool for a range of end users. However, these datasets are designed to indicate only the probability and depth of inundation at a given location and are unable to describe the likelihood of concurrent flooding across multiple sites. Recent research has highlighted that although the estimation of large, widespread flood events is of great value to the flood mitigation and insurance industries, to date it has been difficult to deal with this spatial dependence structure in flood risk over relatively large scales. Many existing approaches have been restricted to empirical estimates of risk based on historic events, limiting their capability of assessing risk over the full range of plausible scenarios. Therefore, this research utilises a recently developed model-based approach to describe the multisite joint distribution of extreme river flows across continental USA river gauges. Given an extreme event at a site, the model characterises the likelihood that neighbouring sites are also impacted. This information is used to simulate an ensemble of plausible synthetic extreme event footprints from which flood depths are extracted from an existing global flood hazard catalogue. Expected economic losses are then estimated by overlaying flood depths with national datasets defining asset locations, characteristics and depth damage functions. The ability of this approach to quantify probabilistic economic risk and rare threshold-exceeding events is expected to be of value to those interested in the flood mitigation and insurance sectors. This work describes the methodological steps taken to create the flood loss catalogue over a national scale; highlights the uncertainty in the expected annual economic vulnerability within the USA from extreme river flows; and presents future developments to the modelling approach.
NASA Astrophysics Data System (ADS)
Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai
2018-05-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with univariate and bivariate hydrological risk analyses. Two catchments located on the Loess plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The univariate and bivariate return periods, risk and reliability in the context of uncertainty for the purposes of flood control and sediment transport are assessed for the study regions. The results indicate that sedimentation triggers higher risks of damaging the safety of local flood control systems compared with the event that AMF exceeds the design flood of downstream hydraulic structures in the UCX and UCH. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges measures of future flood mitigation. In addition, results also confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios aiming for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
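A minimal sketch of a copula-based conditional exceedance probability, here with a Clayton copula as an example family (the abstract does not state which copula was fitted); the marginal probabilities and parameter are hypothetical.

```python
def clayton_copula(u, v, theta):
    """Clayton copula C(u, v) for theta > 0."""
    return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

# Marginal non-exceedance probabilities (from fitted univariate distributions):
# u for extreme precipitation, v for the flood variable of interest
u, v, theta = 0.95, 0.90, 2.0

# Joint probability that both variables exceed their thresholds
p_joint_exceed = 1.0 - u - v + clayton_copula(u, v, theta)

# Conditional probability that the flood variable exceeds its threshold
# given that precipitation exceeds its threshold
p_cond = p_joint_exceed / (1.0 - u)
print(p_joint_exceed, p_cond)
```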
Quasi-continuous stochastic simulation framework for flood modelling
NASA Astrophysics Data System (ADS)
Moustakis, Yiannis; Kossieris, Panagiotis; Tsoukalas, Ioannis; Efstratiadis, Andreas
2017-04-01
Typically, flood modelling in the context of everyday engineering practices is addressed through event-based deterministic tools, e.g., the well-known SCS-CN method. A major shortcoming of such approaches is that uncertainty is ignored, although it is associated with the variability of soil moisture conditions and the variability of rainfall during the storm event. In event-based modelling, the sole expression of uncertainty is the return period of the design storm, which is assumed to represent the acceptable risk of all output quantities (flood volume, peak discharge, etc.). On the other hand, the varying antecedent soil moisture conditions across the basin are represented by means of scenarios (e.g., the three AMC types defined by the SCS), while the temporal distribution of rainfall is represented through standard deterministic patterns (e.g., the alternating blocks method). In order to address these major inconsistencies, while preserving the simplicity and parsimony of the SCS-CN method, we have developed a quasi-continuous stochastic simulation approach comprising the following steps: (1) generation of synthetic daily rainfall time series; (2) update of the potential maximum soil moisture retention on the basis of accumulated five-day rainfall; (3) estimation of daily runoff through the SCS-CN formula, using as inputs the daily rainfall and the updated value of soil moisture retention; (4) selection of extreme events and application of the standard SCS-CN procedure for each specific event, on the basis of the synthetic rainfall. This scheme requires the use of two stochastic modelling components, namely the CastaliaR model, for the generation of synthetic daily data, and the HyetosMinute model, for the disaggregation of daily rainfall to finer temporal scales. The outcomes of this approach are a large number of synthetic flood events, allowing the design variables to be expressed in statistical terms and the flood risk to be properly evaluated.
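A minimal sketch of steps (2)-(3) of such a quasi-continuous scheme is given below, assuming the standard SCS-CN runoff formula and a textbook-style adjustment of the curve number for antecedent moisture; the AMC thresholds and conversion formulas are common approximations, not necessarily those used by the authors:

```python
def cn_for_amc(cn2, p5_mm, wet_mm=53.0, dry_mm=36.0):
    """Shift the AMC-II curve number to AMC I or III based on 5-day antecedent
    rainfall. Thresholds and conversion formulas are common textbook
    approximations, used here purely for illustration."""
    if p5_mm < dry_mm:                       # dry antecedent conditions
        return 4.2 * cn2 / (10.0 - 0.058 * cn2)
    if p5_mm > wet_mm:                       # wet antecedent conditions
        return 23.0 * cn2 / (10.0 + 0.13 * cn2)
    return cn2

def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
    """Daily runoff depth (mm) from the SCS-CN formula."""
    s = 25400.0 / cn - 254.0                 # potential maximum retention (mm)
    ia = ia_ratio * s                        # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: a 60 mm storm on a CN(II)=75 basin after a wet 5-day spell.
cn = cn_for_amc(75.0, p5_mm=70.0)
print(f"adjusted CN = {cn:.1f}, runoff = {scs_cn_runoff(60.0, cn):.1f} mm")
```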
A Predictive Model for Medical Events Based on Contextual Embedding of Temporal Sequences
Wang, Zhimu; Huang, Yingxiang; Wang, Shuang; Wang, Fei; Jiang, Xiaoqian
2016-01-01
Background Medical concepts are inherently ambiguous and error-prone due to human fallibility, which makes it hard for them to be fully used by classical machine learning methods (eg, for tasks like early stage disease prediction). Objective Our aim was to create a new machine-friendly representation that resembles the semantics of medical concepts. We then developed a sequential predictive model for medical events based on this new representation. Methods We developed novel contextual embedding techniques to combine different medical events (eg, diagnoses, prescriptions, and lab tests). Each medical event is converted into a numerical vector that resembles its "semantics," via which the similarity between medical events can be easily measured. We developed simple and effective predictive models based on these vectors to predict novel diagnoses. Results We evaluated our sequential prediction model (and standard learning methods) in estimating the risk of potential diseases based on our contextual embedding representation. Our model achieved an area under the receiver operating characteristic (ROC) curve (AUC) of 0.79 on chronic systolic heart failure and an average AUC of 0.67 (over the 80 most common diagnoses) using the Medical Information Mart for Intensive Care III (MIMIC-III) dataset. Conclusions We propose a general early prognosis predictor for 80 different diagnoses. Our method computes a numeric representation for each medical event to uncover the potential meaning of those events. Our results demonstrate the efficiency of the proposed method, which will benefit patients and physicians by offering more accurate diagnoses. PMID:27888170
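The following toy sketch conveys the flavour of representing medical events as vectors whose similarity can be measured; it uses simple co-occurrence counts rather than the learned contextual embeddings of the paper, and all event codes are invented:

```python
import numpy as np

def cooccurrence_vectors(sequences, window=2):
    """Build simple co-occurrence vectors for medical event codes.
    Each event is represented by how often it appears near other events,
    a crude stand-in for learned contextual embeddings."""
    vocab = sorted({code for seq in sequences for code in seq})
    index = {code: i for i, code in enumerate(vocab)}
    mat = np.zeros((len(vocab), len(vocab)))
    for seq in sequences:
        for i, code in enumerate(seq):
            for j in range(max(0, i - window), min(len(seq), i + window + 1)):
                if i != j:
                    mat[index[code], index[seq[j]]] += 1.0
    return vocab, mat

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Toy longitudinal records: diagnosis/lab/drug codes are made up.
records = [["dx:hf", "lab:bnp_high", "rx:furosemide"],
           ["dx:hf", "lab:bnp_high", "rx:lisinopril"],
           ["dx:copd", "rx:albuterol", "lab:spo2_low"]]
vocab, M = cooccurrence_vectors(records)
i, j = vocab.index("dx:hf"), vocab.index("lab:bnp_high")
print("similarity(dx:hf, lab:bnp_high) =", round(cosine(M[i], M[j]), 3))
```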
Dalton, Jarrod E; Perzynski, Adam T; Zidar, David A; Rothberg, Michael B; Coulton, Claudia J; Milinovich, Alex T; Einstadter, Douglas; Karichu, James K; Dawson, Neal V
2017-10-03
Inequality in health outcomes in relation to Americans' socioeconomic position is rising. Our objectives were, first, to evaluate the spatial relationship between neighborhood disadvantage and major atherosclerotic cardiovascular disease (ASCVD)-related events and, second, to evaluate the relative extent to which neighborhood disadvantage and physiologic risk account for neighborhood-level variation in ASCVD event rates. We performed an observational cohort analysis of geocoded longitudinal electronic health records from a single academic health center and surrounding neighborhoods in northeastern Ohio, covering 109 793 patients from the Cleveland Clinic Health System (CCHS) who had an outpatient lipid panel drawn between 2007 and 2010. The date of the first qualifying lipid panel served as the study baseline. The outcome was time from baseline to the first occurrence of a major ASCVD event (myocardial infarction, stroke, or cardiovascular death) within 5 years, modeled as a function of a locally derived neighborhood disadvantage index (NDI) and the predicted 5-year ASCVD event rate from the Pooled Cohort Equations Risk Model (PCERM) of the American College of Cardiology and American Heart Association. Outcome data were censored if no CCHS encounters occurred for 2 consecutive years or when state death data were no longer available (that is, from 2014 onward). The PCERM systematically underpredicted ASCVD event risk among patients from disadvantaged communities. Model discrimination was poorer among these patients (concordance index [C], 0.70 [95% CI, 0.67 to 0.74]) than among those from the most affluent communities (C, 0.80 [CI, 0.78 to 0.81]). The NDI alone accounted for 32.0% of census tract-level variation in ASCVD event rates, compared with 10.0% accounted for by the PCERM. Limitations are that patients from affluent communities were overrepresented, and that outcomes of patients who received treatment for cardiovascular disease at Cleveland Clinic were assumed to be independent of whether the patients came from a disadvantaged or an affluent neighborhood. Neighborhood disadvantage may be a powerful regulator of ASCVD event risk. In addition to supplemental risk models and clinical screening criteria, population-based solutions are needed to ameliorate the deleterious effects of neighborhood disadvantage on health outcomes. The study was funded by the Clinical and Translational Science Collaborative of Cleveland and the National Institutes of Health.
Stress testing hydrologic models using bottom-up climate change assessment
NASA Astrophysics Data System (ADS)
Stephens, C.; Johnson, F.; Marshall, L. A.
2017-12-01
Bottom-up climate change assessment is a promising approach for understanding the vulnerability of a system to potential future changes. The technique has been utilised successfully in risk-based assessments of future flood severity and infrastructure vulnerability. We find that it is also an ideal tool for assessing hydrologic model performance in a changing climate. In this study, we applied bottom-up climate change to compare the performance of two different hydrologic models (an event-based and a continuous model) under increasingly severe climate change scenarios. This allowed us to diagnose likely sources of future prediction error in the two models. The climate change scenarios were based on projections for southern Australia, which indicate drier average conditions with increased extreme rainfall intensities. We found that the key weakness in using the event-based model to simulate drier future scenarios was the model's inability to dynamically account for changing antecedent conditions. This led to increased variability in model performance relative to the continuous model, which automatically accounts for the wetness of a catchment through dynamic simulation of water storages. When considering more intense future rainfall events, representation of antecedent conditions became less important than assumptions around (non)linearity in catchment response. The linear continuous model we applied may underestimate flood risk in a future climate with greater extreme rainfall intensity. In contrast with the recommendations of previous studies, this indicates that continuous simulation is not necessarily the key to robust flood modelling under climate change. By applying bottom-up climate change assessment, we were able to understand systematic changes in relative model performance under changing conditions and deduce likely sources of prediction error in the two models.
NASA Astrophysics Data System (ADS)
Manan, Norhafizah A.; Abidin, Basir
2015-02-01
Five percent of patients who went through Percutaneous Coronary Intervention (PCI) experienced Major Adverse Cardiac Events (MACE) after the PCI procedure. Risk prediction of MACE following a PCI procedure is therefore helpful. This work describes a review of such prediction models currently in use. A literature search was done on the PubMed and SCOPUS databases. Thirty publications were found, but only 4 studies were chosen based on the data used, design, and outcome of the study. Particular emphasis was placed on the study design, population, sample size, modeling method, predictors, outcomes, discrimination and calibration of each model. All the models had acceptable discrimination ability (C-statistics >0.7) and good calibration (Hosmer-Lemeshow P-value >0.05). The most common model used was multivariate logistic regression and the most popular predictor was age.
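For readers unfamiliar with the two reported metrics, the sketch below computes a C-statistic and a Hosmer-Lemeshow statistic on simulated data; the grouping into risk deciles and the synthetic outcomes are illustrative only:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic and p-value, computed over
    risk-decile groups. A quick calibration check for a binary risk model."""
    order = np.argsort(p)
    y, p = np.asarray(y)[order], np.asarray(p)[order]
    stat = 0.0
    for chunk_y, chunk_p in zip(np.array_split(y, groups), np.array_split(p, groups)):
        n, obs, exp = len(chunk_y), chunk_y.sum(), chunk_p.sum()
        pbar = exp / n
        stat += (obs - exp) ** 2 / (n * pbar * (1 - pbar) + 1e-12)
    return stat, chi2.sf(stat, groups - 2)

# Synthetic example only: simulated MACE outcomes and predicted risks.
rng = np.random.default_rng(0)
risk = rng.uniform(0.01, 0.4, size=2000)
outcome = rng.binomial(1, risk)                  # well calibrated by construction
print("C-statistic:", round(roc_auc_score(outcome, risk), 3))
print("Hosmer-Lemeshow stat, p:", hosmer_lemeshow(outcome, risk))
```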
Characterizing Mega-Earthquake Related Tsunami on Subduction Zones without Large Historical Events
NASA Astrophysics Data System (ADS)
Williams, C. R.; Lee, R.; Astill, S.; Farahani, R.; Wilson, P. S.; Mohammed, F.
2014-12-01
Due to recent large tsunami events (e.g., Chile 2010 and Japan 2011), the insurance industry is very aware of the importance of managing its exposure to tsunami risk. There are currently few tools available to help establish policies for managing and pricing tsunami risk globally. As a starting point, and to help address this issue, Risk Management Solutions Inc. (RMS) is developing a global suite of tsunami inundation footprints. This dataset will include both representations of historical events and a series of M9 scenarios on subduction zones that have not historically generated mega-earthquakes. The latter set is included to address concerns about the completeness of the historical record for mega-earthquakes; this concern stems from the fact that the 2011 Tohoku, Japan, earthquake was considerably larger than anything observed in the historical record. Characterizing the source and rupture pattern for subduction zones without historical events is a poorly constrained process. In many cases, the subduction zones can be segmented based on changes in the characteristics of the subducting slab or major ridge systems. For this project, the unit sources from the NOAA propagation database are utilized to leverage the basin-wide modeling included in this dataset. The length of the rupture is characterized based on subduction zone segmentation, and the slip per unit source can be determined based on the event magnitude (i.e., M9) and moment balancing. As these events have not occurred historically, there is little to constrain the slip distribution. Sensitivity tests on the potential rupture pattern have been undertaken, comparing uniform slip to higher shallow slip and tapered slip models. Subduction zones examined include the Makran Trench, the Lesser Antilles and the Hikurangi Trench. The ultimate goal is to create a series of tsunami footprints to help insurers understand their exposures at risk to tsunami inundation around the world.
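The moment-balancing step mentioned above can be illustrated with the standard Hanks-Kanamori magnitude-moment relation; the rigidity and fault dimensions below are assumptions chosen only to show the arithmetic, not values from the RMS model:

```python
def moment_from_magnitude(mw):
    """Seismic moment (N·m) from moment magnitude (Hanks-Kanamori relation)."""
    return 10.0 ** (1.5 * mw + 9.05)

def average_slip(mw, length_km, width_km, rigidity_pa=4.0e10):
    """Average slip (m) implied by moment balance M0 = mu * A * D.
    The rigidity and fault dimensions are illustrative assumptions."""
    area_m2 = (length_km * 1e3) * (width_km * 1e3)
    return moment_from_magnitude(mw) / (rigidity_pa * area_m2)

# Example: a hypothetical M9 rupture 1000 km long and 120 km wide.
print(f"average slip ~ {average_slip(9.0, 1000.0, 120.0):.1f} m")
```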
NASA Astrophysics Data System (ADS)
Ludwig, Ralf; Baese, Frank; Braun, Marco; Brietzke, Gilbert; Brissette, Francois; Frigon, Anne; Giguère, Michel; Komischke, Holger; Kranzlmueller, Dieter; Leduc, Martin; Martel, Jean-Luc; Ricard, Simon; Schmid, Josef; von Trentini, Fabian; Turcotte, Richard; Weismueller, Jens; Willkofer, Florian; Wood, Raul
2017-04-01
The recent accumulation of extreme hydrological events in Bavaria and Québec has stimulated scientific and also societal interest. In addition to the challenges of an improved prediction of such situations and the implications for the associated risk management, there is, as yet, no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events and how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High Performance Computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high-resolution (12 km) data from the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to account for the assessment of both natural climate variability and climate change. The large ensemble with several thousand model years provides the potential to capture rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec at high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. The specific data situation enables the establishment of a new method for 'virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data which reveal preferential triggers of hydrological extreme events. The presentation will highlight first results from the analysis of the large-scale ClimEx model ensemble, showing the current and future ratio of natural variability and climate change impacts on meteorological extreme events. Selected data from the ensemble are used to drive a hydrological model experiment to illustrate the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change. [The authors acknowledge funding for the project from the Bavarian State Ministry for the Environment and Consumer Protection.]
NASA Astrophysics Data System (ADS)
Nunes, Ana
2015-04-01
Extreme meteorological events have played an important role in catastrophic occurrences observed in the past over densely populated areas in Brazil. This motivated the proposal of an integrated system for analysis and assessment of vulnerability and risk caused by extreme events in urban areas that are particularly affected by complex topography. That requires a multi-scale approach, centered on a regional modeling system consisting of a regional (spectral) climate model coupled to a land-surface scheme. This regional modeling system employs a boundary forcing method based on scale-selective bias correction and assimilation of satellite-based precipitation estimates. Scale-selective bias correction is a method similar to the spectral nudging technique for dynamical downscaling that allows internal modes to develop in agreement with the large-scale features, while the precipitation assimilation procedure improves the modeled deep convection and drives the land-surface scheme variables. Here, the scale-selective bias correction acts only on the rotational part of the wind field, allowing the precipitation assimilation procedure to correct moisture convergence, in order to reconstruct the current South American climate within the South American Hydroclimate Reconstruction Project. The hydroclimate reconstruction outputs might eventually provide improved initial conditions for high-resolution numerical integrations in metropolitan regions, generating more reliable short-term precipitation predictions and providing accurate hydrometeorological variables to higher-resolution geomorphological models. Better representation of deep convection at intermediate scales is relevant when the resolution of the regional modeling system is refined by any method to meet the scale of geomorphological dynamic models of stability and mass movement, assisting in the assessment of risk areas and the estimation of terrain stability over complex topography. The reconstruction of past extreme events also helps the development of a system for decision-making regarding natural and social disasters and for reducing impacts. Numerical experiments using this regional modeling system successfully modeled severe weather events in Brazil. Comparisons with the NCEP Climate Forecast System Reanalysis outputs were made at regional climate model resolutions of about 40 and 25 km.
Exaggerated risk: prospect theory and probability weighting in risky choice.
Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick
2009-11-01
In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and Kahneman's (1992) prospect theory, we found that the weighting function required to model precautionary decisions differed from that required for monetary gambles. This result indicates a failure of the descriptive invariance axiom of expected utility theory. For precautionary decisions, people overweighted small, medium-sized, and moderately large probabilities: they exaggerated risks. This effect is not anticipated by prospect theory or experience-based decision research (Hertwig, Barron, Weber, & Erev, 2004). We found evidence that exaggerated risk is caused by the accessibility of events in memory: the weighting function varies as a function of the accessibility of events. This suggests that people's experiences of events leak into decisions even when risk information is explicitly provided. Our findings highlight a need to investigate how variation in decision content produces variation in preferences for risk.
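For reference, the Tversky-Kahneman (1992) one-parameter weighting function has the form w(p) = p^γ / (p^γ + (1-p)^γ)^(1/γ). The short sketch below evaluates it for a few probabilities; γ = 0.69 is used as an assumed parameter value, not a value estimated in this study:

```python
def tk_weight(p, gamma=0.69):
    """Tversky-Kahneman (1992) probability weighting function.
    gamma < 1 overweights small probabilities and underweights large ones;
    0.69 is used here only as an illustrative parameter value."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

for p in (0.01, 0.05, 0.20, 0.50, 0.90):
    print(f"p = {p:.2f} -> w(p) = {tk_weight(p):.3f}")
```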
Earthquake Hazard and Risk in Alaska
NASA Astrophysics Data System (ADS)
Black Porto, N.; Nyst, M.
2014-12-01
Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model for Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 in the 2015 model, including the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth; in this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent occurrence of magnitude 7 to 8 events (Gutenberg-Richter distribution), while large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as much to the overall risk. We will review these recurrence rates and present the results and their impact on Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact that model changes have on Alaska earthquake risk. Risk metrics considered include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance: the Trans-Alaska pipeline, industrial facilities in Valdez, and typical residential wood buildings in Anchorage, Fairbanks and Juneau.
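As a simple illustration of how a Gutenberg-Richter recurrence model translates into hazard figures, the sketch below converts illustrative a- and b-values (not those of the RMS model) into annual rates and Poisson exceedance probabilities:

```python
import numpy as np

def gr_annual_rate(m, a=4.5, b=1.0):
    """Annual rate of earthquakes with magnitude >= m from a Gutenberg-Richter
    relation log10 N = a - b*m. The a and b values here are illustrative only."""
    return 10.0 ** (a - b * m)

def poisson_exceedance(rate, years=50.0):
    """Probability of at least one event in the given time window,
    assuming a stationary Poisson process."""
    return 1.0 - np.exp(-rate * years)

for m in (7.0, 7.5, 8.0):
    r = gr_annual_rate(m)
    print(f"M>={m}: rate {r:.3f}/yr, P(>=1 in 50 yr) = {poisson_exceedance(r):.2f}")
```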
Challenges of Modeling Flood Risk at Large Scales
NASA Astrophysics Data System (ADS)
Guin, J.; Simic, M.; Rowe, J.
2009-04-01
Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale, such that all inter-dependencies in a river network are well understood. From an insurance perspective, the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at a scale suitable for a viable insurance market for the flood peril. In order to serve the insurance market, a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing algorithm propagates the flows for each simulated event. The model incorporates a digital terrain model (DTM) at 10 m horizontal resolution, which is used to extract flood plain cross-sections such that a one-dimensional hydraulic model can be used to estimate the extent and elevation of flooding. In doing so, the effect of flood defenses in mitigating floods is accounted for. Finally, a suite of vulnerability relationships has been developed to estimate flood losses for a portfolio of properties that are exposed to flood hazard. Historical experience indicates that for recent floods in Great Britain more than 50% of insurance claims occurred outside the flood plain, primarily as a result of excess surface flow, hillside flooding, and flooding due to inadequate drainage. A sub-component of the model addresses this issue by considering several parameters that best explain the variability of claims off the flood plain. The challenges of modeling such a complex phenomenon at a large scale largely dictate the choice of modeling approaches that need to be adopted for each of these model components. While detailed numerically-based physical models exist and have been used for conducting flood hazard studies, they are generally restricted to small geographic regions.
In a probabilistic risk estimation framework like our current model, a blend of deterministic and statistical techniques have to be employed such that each model component is independent, physically sound and is able to maintain the statistical properties of observed historical data. This is particularly important because of the highly non-linear behavior of the flooding process. With respect to vulnerability modeling, both on and off the flood plain, the challenges include the appropriate scaling of a damage relationship when applied to a portfolio of properties. This arises from the fact that the estimated hazard parameter used for damage assessment, namely maximum flood depth has considerable uncertainty. The uncertainty can be attributed to various sources among which are imperfections in the hazard modeling, inherent errors in the DTM, lack of accurate information on the properties that are being analyzed, imperfections in the vulnerability relationships, inability of the model to account for local mitigation measures that are usually undertaken when a real event is unfolding and lack of details in the claims data that are used for model calibration. Nevertheless, the model once calibrated provides a very robust framework for analyzing relative and absolute risk. The paper concludes with key economic statistics of flood risk for Great Britain as a whole including certain large loss-causing scenarios affecting the greater London region. The model estimates a total financial loss of 5.6 billion GBP to all properties at a 1% annual aggregate exceedance probability level.
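The portfolio-level outputs described here, an average annual loss and a loss at a given annual aggregate exceedance probability, can be sketched from a catalogue of simulated yearly losses as follows; the loss distribution below is synthetic and the empirical plotting-position interpolation is one simple choice among several:

```python
import numpy as np

def aal_and_aep(annual_losses, probability=0.01):
    """Average annual loss and the aggregate loss exceeded with the given
    annual probability, from a catalogue of simulated yearly losses."""
    losses = np.sort(np.asarray(annual_losses))[::-1]              # descending
    exceedance_prob = np.arange(1, len(losses) + 1) / (len(losses) + 1)
    aal = losses.mean()
    loss_at_p = np.interp(probability, exceedance_prob, losses)
    return aal, loss_at_p

# Synthetic 10,000-year catalogue of annual aggregate losses (values made up).
rng = np.random.default_rng(1)
annual = rng.lognormal(mean=18.0, sigma=1.2, size=10_000)
aal, loss_1pct = aal_and_aep(annual, probability=0.01)
print(f"AAL ~ {aal:,.0f} GBP; 1%-AEP aggregate loss ~ {loss_1pct:,.0f} GBP")
```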
NASA Astrophysics Data System (ADS)
Balbus, J. M.; Kirsch, T.; Mitrani-Reiser, J.
2017-12-01
Over recent decades, natural disasters and mass-casualty events in the United States have repeatedly revealed the serious consequences of health care facility vulnerability for the subsequent ability to deliver care to affected people. Advances in predictive modeling and vulnerability assessment for health care facility failure, integrated infrastructure, and extreme weather events have now enabled a more rigorous scientific approach to evaluating health care system vulnerability and assessing impacts of natural and human disasters, as well as the value of specific interventions. Concurrent advances in computing capacity also allow, for the first time, full integration of these multiple individual models, along with the modeling of population behaviors and mass casualty responses during a disaster. A team of federal and academic investigators led by the National Center for Disaster Medicine and Public Health (NCDMPH) is developing a platform for integrating extreme event forecasts, health risk/impact assessment and population simulations, critical infrastructure (electrical, water, transportation, communication) impact and response models, health care facility-specific vulnerability and failure assessments, and health system/patient flow responses. The integration of these models is intended to develop much greater understanding of critical tipping points in the vulnerability of health systems during natural and human disasters and build an evidence base for specific interventions. Development of such a modeling platform will greatly facilitate the assessment of potential concurrent or sequential catastrophic events, such as a terrorism act following a severe heat wave or hurricane. This presentation will highlight the development of this modeling platform as well as applications not just for the US health system, but also for international science-based disaster risk reduction efforts, such as the Sendai Framework and the WHO SMART hospital project.
MONTEIRO, J.F.G.; ESCUDERO, D.J.; WEINREB, C.; FLANIGAN, T.; GALEA, S.; FRIEDMAN, S.R.; MARSHALL, B.D.L.
2017-01-01
SUMMARY We investigated how different models of HIV transmission, and assumptions regarding the distribution of unprotected sex and syringe-sharing events ('risk acts'), affect quantitative understanding of the HIV transmission process in people who inject drugs (PWID). The individual-based model simulated HIV transmission in a dynamic sexual and injecting network representing New York City. We constructed four HIV transmission models: model 1, constant probabilities; model 2, random number of sexual and parenteral acts; model 3, individually assigned viral load; and model 4, two groups of partnerships (low and high risk). Overall, models with less heterogeneity were more sensitive to changes in the number of risk acts, producing HIV incidence up to four times higher than that empirically observed. Although all models overestimated HIV incidence, micro-simulations with greater heterogeneity in the HIV transmission modelling process produced more robust results and better reproduced empirical epidemic dynamics. PMID:26753627
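A stripped-down illustration of why heterogeneity in the number of risk acts matters is given below: with the same mean number of acts per partnership, a dispersed distribution yields a lower mean per-partnership transmission probability than a constant one, because the per-partnership risk is a concave function of the number of acts. The per-act probability and the distributions are invented, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
p_act = 0.003                      # illustrative per-act transmission probability

def partnership_risk(n_acts):
    """Probability of at least one transmission in a partnership with n acts,
    assuming independent per-act risks."""
    return 1.0 - (1.0 - p_act) ** n_acts

# Same mean number of risk acts, different heterogeneity across partnerships.
constant_acts = np.full(100_000, 50)                                  # no spread
dispersed_acts = rng.negative_binomial(n=1, p=1 / 51, size=100_000)   # mean ~50

print("mean risk, constant acts :", partnership_risk(constant_acts).mean())
print("mean risk, dispersed acts:", partnership_risk(dispersed_acts).mean())
```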
A calibration hierarchy for risk models was defined: from utopia to empirical data.
Van Calster, Ben; Nieboer, Daan; Vergouwe, Yvonne; De Cock, Bavo; Pencina, Michael J; Steyerberg, Ewout W
2016-06-01
Calibrated risk models are vital for valid decision support. We define four levels of calibration and describe implications for model development and external validation of predictions. We present results based on simulated data sets. A common definition of calibration is "having an event rate of R% among patients with a predicted risk of R%," which we refer to as "moderate calibration." Weaker forms of calibration only require the average predicted risk (mean calibration) or the average prediction effects (weak calibration) to be correct. "Strong calibration" requires that the event rate equals the predicted risk for every covariate pattern. This implies that the model is fully correct for the validation setting. We argue that this is unrealistic: the model type may be incorrect, the linear predictor is only asymptotically unbiased, and all nonlinear and interaction effects would have to be correctly modeled. In addition, we prove that moderate calibration guarantees nonharmful decision making. Finally, results indicate that a flexible assessment of calibration in small validation data sets is problematic. Strong calibration is desirable for individualized decision support but is unrealistic and counterproductive, as it stimulates the development of overly complex models. Model development and external validation should focus on moderate calibration. Copyright © 2016 Elsevier Inc. All rights reserved.
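A common way to check the weaker calibration levels on a validation set is to estimate the calibration intercept and slope by regressing outcomes on the logit of the predicted risks; the sketch below does this on simulated data and assumes the statsmodels package is available:

```python
import numpy as np
import statsmodels.api as sm

def calibration_intercept_slope(y, p):
    """Mean/weak calibration summary: regress outcomes on the logit of the
    predicted risks. Slope near 1 and intercept near 0 indicate weak calibration."""
    logit_p = np.log(p / (1 - p))
    X = sm.add_constant(logit_p)
    fit = sm.Logit(y, X).fit(disp=False)
    return fit.params            # [intercept, slope]

# Simulated external validation set where the model overestimates risk.
rng = np.random.default_rng(3)
p_hat = rng.uniform(0.05, 0.6, 5000)
true_p = p_hat * 0.7                           # systematic overestimation
y = rng.binomial(1, true_p)
intercept, slope = calibration_intercept_slope(y, p_hat)
print(f"calibration intercept = {intercept:.2f}, slope = {slope:.2f}")
```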
Feinstein, Matthew; Ning, Hongyan; Kang, Joseph; Bertoni, Alain; Carnethon, Mercedes; Lloyd-Jones, Donald M
2012-07-03
No studies have compared first cardiovascular disease (CVD) events and non-CVD death between races in a competing risks framework, which examines risks for numerous events simultaneously. We used competing Cox models to estimate hazards for first CVD events and non-CVD death within and between races in 3 multicenter, National Heart, Lung, and Blood Institute-sponsored cohorts. Of 14 569 Atherosclerosis Risk in Communities (ARIC) study participants aged 45 to 64 years with mean follow-up of 10.5 years, 11.6% had CVD and 5.0% had non-CVD death as first events; among 4237 Cardiovascular Health Study (CHS) study participants aged 65 to 84 years and followed for 8.5 years, these figures were 43.2% and 15.7%, respectively. Middle-aged blacks were significantly more likely than whites to experience any CVD as a first event; this disparity disappeared by older adulthood and after adjustment for CVD risk factors. The pattern of results was similar for Multi-Ethnic Study of Atherosclerosis (MESA) participants. Traditional Cox and competing risks models yielded different results for coronary heart disease risk. Black men appeared somewhat more likely than white men to experience coronary heart disease with use of a standard Cox model (hazard ratio 1.06; 95% CI 0.90, 1.26), whereas they appeared less likely than white men to have a first coronary heart disease event with use of a competing risks model (hazard ratio, 0.77; 95% CI, 0.60, 1.00). CVD affects blacks at an earlier age than whites; this may be attributable in part to elevated CVD risk factor levels among blacks. Racial disparities in first CVD incidence disappear by older adulthood. Competing risks analyses may yield somewhat different results than traditional Cox models and provide a complementary approach to examining risks for first CVD events.
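For readers unfamiliar with the competing risks framework, the sketch below computes a nonparametric (Aalen-Johansen type) cumulative incidence function, which, unlike one minus a Kaplan-Meier curve, does not treat competing events as censoring; the follow-up data are invented:

```python
import numpy as np

def cumulative_incidence(time, event, cause):
    """Nonparametric cumulative incidence for one event type in the presence
    of competing events (Aalen-Johansen style estimate).
    event: 0 = censored, positive integers = competing event types.
    Returns (event_times, CIF evaluated at those times)."""
    time, event = np.asarray(time, float), np.asarray(event)
    uniq = np.unique(time[event > 0])
    surv, cif, out = 1.0, 0.0, []
    for t in uniq:
        at_risk = np.sum(time >= t)
        d_any = np.sum((time == t) & (event > 0))
        d_cause = np.sum((time == t) & (event == cause))
        cif += surv * d_cause / at_risk          # add mass from this cause
        surv *= 1.0 - d_any / at_risk            # all-cause survival update
        out.append(cif)
    return uniq, np.array(out)

# Toy follow-up data: cause 1 = first CVD event, cause 2 = non-CVD death.
t = [2, 3, 3, 5, 6, 7, 8, 9, 10, 12]
e = [1, 0, 2, 1, 0, 1, 2, 0, 1, 0]
times, cif_cvd = cumulative_incidence(t, e, cause=1)
print(dict(zip(times.tolist(), np.round(cif_cvd, 3))))
```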
NASA Astrophysics Data System (ADS)
Fraisse, C.; Pequeno, D.; Staub, C. G.; Perry, C.
2016-12-01
Climate variability, particularly the occurrence of extreme weather conditions such as dry spells and heat stress during sensitive crop developmental phases, can substantially increase the prospect of reduced crop yields. Yield losses or crop failure risk due to stressful weather conditions vary mainly with stress severity, exposure timing and duration. The magnitude of stress effects is also crop specific, differing in terms of thresholds and adaptation to environmental conditions. To help producers in the Southeast USA mitigate and monitor the risk of crop losses due to extreme weather events, we developed a web-based tool that evaluates the risk of extreme weather events during the season taking into account the crop development stages. Producers can enter their plans for the upcoming season in a given field (e.g. crop, variety, planting date, acreage), optionally select a specific El Nino Southern Oscillation (ENSO) phase, and will be presented with the probabilities (ranging from 0-100%) of extreme weather events occurring during sensitive phases of the growing season for the selected conditions. The phenology components of the DSSAT CERES-Maize, CROPGRO-Soybean, CROPGRO-Cotton, and N-Wheat models have been translated from FORTRAN to standalone versions in the R language. These models have been tested in collaboration with Extension faculty and producers during the 2016 season and their usefulness for risk mitigation and monitoring evaluated. A companion AgroClimate app was also developed to help producers track and monitor phenology development during the cropping season.
Developing a Malaysia flood model
NASA Astrophysics Data System (ADS)
Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina
2014-05-01
Faced with growing exposures in Malaysia, insurers need models to help them assess their exposure to flood losses. The need for an improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30 m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, hence enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being used by insurance companies in the Malaysian market to obtain reinsurance cover.
Modeling urban flood risk territories for Riga city
NASA Astrophysics Data System (ADS)
Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.
2012-04-01
Riga, the capital of Latvia, is located on the River Daugava at the Gulf of Riga. The main flooding risks for Riga city are: (1) storm-caused water setup in the southern part of the Gulf of Riga (storm event), (2) water level increase caused by Daugava River discharge maxima (spring snow melting event) and (3) strong rainfall or rapid snow melting in densely populated urban areas. The first two flooding factors were discussed previously (Piliksere et al., 2011). The aims of the study were (1) the identification of flood risk situations in densely populated areas, (2) the quantification of the flooding scenarios caused by rain and snow melting events of different return periods nowadays, in the near future (2021-2050) and in the far future (2071-2100), taking into account the projections of climate change, (3) the estimation of the groundwater level for Riga city, (4) the building and calibration of a hydrological mathematical model based on SWMM (EPA, 2004) for the domain potentially vulnerable to rain and snow melt flooding events, (5) the calculation of rain and snow melting flood events with different return periods, and (6) the mapping of potentially flooded areas on a fine grid. The time series of short-term precipitation events during the warm period of the year (i.e., rain events) were analysed for a 35-year period. Annual maxima of precipitation intensity for events of different duration (5 min; 15 min; 1 h; 3 h; 6 h; 12 h; 1 day; 2 days; 4 days; 10 days) were calculated. The time series of long-term simultaneous precipitation data and observations of the reduction of snow cover thickness were analysed for a 27-year period. Snow thawing periods were detected and maxima of snow melting intensity for events of different duration (1 day; 2 days; 4 days; 7 days; 10 days) were calculated. According to the occurrence probability, six scenarios for each event type for nowadays, the near future and the far future, with return periods of once in 5, 10, 20, 50, 100 and 200 years, were constructed based on Gumbel extreme value analysis. Hydrological modelling driven by temperature and precipitation data series from regional climate models was used for the evaluation of rain event maxima in the future periods. The usage of climate model data in hydrological models causes systematic errors; therefore a bias correction method (Sennikovs and Bethers, 2009) was applied to determine the future rainfall intensities. A SWMM model was built for the urban area. Objects of hydraulic importance (manifolds, penstocks, ditches, pumping stations, weirs, wells, catchment sub-basins etc.) were included in the model. Both a pure rain sewage system and a mixed rain-water/household sewage system exist in Riga. The sewage system, with wastewater load proportional to population density, was taken into account and calibrated. The model system was calibrated for a real rain event against the water flux time series into the sewage treatment plant of Riga. A high-resolution (~1.5 points per square metre) digital terrain map was used as the base for the finite element mesh for the geospatial mapping of the results of the hydraulic calculations. The main results of the study are (1) detection of the hot spots in densely populated urban areas; (2) identification of the weak links of the melioration and sewage systems; (3) mapping of the elevation of groundwater caused mainly by snow melting. References: Piliksere, A., Valainis, A., Seņņikovs, J. (2011). A flood risk assessment for Riga city taking account of climate changes. EGU, Vienna, Austria. EPA (2004). Storm water management model, user's manual version 5.0. US Environmental Protection Agency. Sennikovs, J., Bethers, U. (2009). Statistical downscaling method of regional climate model results for hydrological modelling. 18th World IMACS/MODSIM Congress, Cairns, Australia.
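A minimal sketch of the Gumbel extreme value step, fitting annual maxima and reading off design values for the stated return periods, is shown below; the synthetic annual maxima and the scipy-based fit are illustrative, not the study's data or code:

```python
from scipy import stats

# Synthetic annual maxima of 1-hour rainfall intensity (mm/h); values are made up.
annual_max = stats.gumbel_r.rvs(loc=20.0, scale=6.0, size=35, random_state=7)

loc, scale = stats.gumbel_r.fit(annual_max)          # fit returns (loc, scale)
for T in (5, 10, 20, 50, 100, 200):
    # Design value with return period T years: the (1 - 1/T) quantile of the fit.
    x_T = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"T = {T:>3} yr -> design intensity ~ {x_T:.1f} mm/h")
```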
Bestvina, Christine M; Wroblewski, Kristen E; Daly, Bobby; Beach, Brittany; Chow, Selina; Hantel, Andrew; Malec, Monica; Huber, Michael T; Polite, Blase N
2018-06-01
Accurate understanding of the prognosis of an advanced cancer patient can lead to decreased aggressive care at the end of life and earlier hospice enrollment. Our goal was to determine the association between high-risk clinical events identified by a simple, rules-based algorithm and decreased overall survival, to target poor prognosis cancer patients who would urgently benefit from advanced care planning. A retrospective analysis was performed on outpatient oncology patients with an index visit from April 1, 2015, through June 30, 2015. We examined a three-month window for "high-risk events," defined as (1) change in chemotherapy, (2) emergency department (ED) visit, and (3) hospitalization. Patients were followed until January 31, 2017. A total of 219 patients receiving palliative chemotherapy at the University of Chicago Medicine with a prognosis of ≤12 months were included. The main outcome was overall survival, and each "high-risk event" was treated as a time-varying covariate in a Cox proportional hazards regression model to calculate a hazard ratio (HR) of death. A change in chemotherapy regimen, ED visit, hospitalization, and at least one high-risk event occurred in 54% (118/219), 10% (22/219), 26% (57/219), and 67% (146/219) of patients, respectively. The adjusted HR of death for patients with a high-risk event was 1.72 (95% confidence interval [CI] 1.19-2.46, p = 0.003), with hospitalization reaching significance (HR 2.74, 95% CI 1.84-4.09, p < 0.001). The rules-based algorithm identified those with the greatest risk of death among a poor prognosis patient group. Implementation of this algorithm in the electronic health record can identify patients with increased urgency to address goals of care.
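A sketch of a Cox model with a time-varying "high-risk event" covariate is given below, assuming the lifelines package and using invented long-format follow-up data; a small ridge penalty is added only to stabilise the fit on the toy data:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Toy long-format data: each row is an interval during which the patient's
# covariates are constant; 'high_risk' flips to 1 after the first
# hospitalization / ED visit / chemotherapy change (all values invented).
df = pd.DataFrame({
    "id":        [1, 1, 2, 3, 3, 4],
    "start":     [0, 3, 0, 0, 5, 0],
    "stop":      [3, 9, 12, 5, 14, 7],
    "high_risk": [0, 1, 0, 0, 1, 0],
    "death":     [0, 1, 0, 0, 1, 1],
})

ctv = CoxTimeVaryingFitter(penalizer=0.1)   # ridge penalty for the tiny example
ctv.fit(df, id_col="id", event_col="death", start_col="start", stop_col="stop")
print(ctv.summary[["coef", "exp(coef)"]])   # exp(coef) is the hazard ratio
```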
Improving patient safety by optimizing the use of nursing human resources.
Rochefort, Christian M; Buckeridge, David L; Abrahamowicz, Michal
2015-06-14
Recent ecological studies have suggested that inadequate nurse staffing may contribute to the incidence of adverse events in acute care hospitals. However, longitudinal studies are needed to further examine these associations and to identify the staffing patterns that are of greatest risk. The aims of this study are to determine if (a) nurse staffing levels are associated with an increased risk of adverse events, (b) the risk of adverse events in relationship to nurse staffing levels is modified by the complexity of patient requirements, and (c) optimal nurse staffing levels can be established. A dynamic cohort of all adult medical, surgical, and intensive care unit patients admitted between 2010 and 2015 to a Canadian academic health center will be followed during the inpatient and 7-day post-discharge period to assess the occurrence and frequency of adverse events in relationship to antecedent nurse staffing levels. Four potentially preventable adverse events will be measured: (a) hospital-acquired pneumonia, (b) ventilator-associated pneumonia, (c) venous thromboembolism, and (d) in-hospital fall. These events were selected for their high incidence, morbidity and mortality rates, and because they are hypothesized to be related to nurse staffing levels. Adverse events will be ascertained from electronic health record data using validated automated detection algorithms. Patient exposure to nurse staffing will be measured on every shift of the hospitalization using electronic payroll records. To examine the association between nurse staffing levels and the risk of adverse events, four Cox proportional hazards regression models will be used (one for each adverse event), while adjusting for patient characteristics and risk factors of adverse event occurrence. To determine if the association between nurse staffing levels and the occurrence of adverse events is modified by the complexity of patient requirements, interaction terms will be included in the regression models, and their significance assessed. To assess for the presence of optimal nurse staffing levels, flexible nonlinear spline functions will be fitted. This study will likely generate evidence-based information that will assist managers in making the most effective use of scarce nursing resources and in identifying staffing patterns that minimize the risk of adverse events.
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Nounu, Hatem N.; Ponomarev, Artem L.; Cucinotta, Francis A.
2011-01-01
A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) [1] for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of heavy ions in tissue and shielding materials is made with a stochastic approach that includes both ion track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model [2]. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
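The sketch below conveys only the basic bookkeeping of combining a fault-tree-derived branch probability with an event-tree sequence; it uses static AND/OR gates for simplicity, whereas DEFT's dynamic gates (sequence dependencies, imperfect coverage, phased missions) require the fuller machinery described in the abstract. All probabilities are invented:

```python
from functools import reduce

def and_gate(probs):
    """Probability that all basic events fail (independent basic events)."""
    return reduce(lambda a, b: a * b, probs, 1.0)

def or_gate(probs):
    """Probability that at least one basic event fails."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Illustrative static stand-in for a pivot-node fault tree: the mitigation
# system fails if its controller fails OR both redundant pumps fail.
p_pivot_fail = or_gate([1e-3, and_gate([5e-2, 5e-2])])

# Event-tree sequence: initiating event followed by failure of the pivot node.
f_initiator = 1e-2                           # assumed per-year frequency
f_sequence = f_initiator * p_pivot_fail
print(f"pivot failure prob = {p_pivot_fail:.2e}, sequence freq = {f_sequence:.2e}/yr")
```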
Catastrophe loss modelling of storm-surge flood risk in eastern England.
Muir Wood, Robert; Drayton, Michael; Berger, Agnete; Burgess, Paul; Wright, Tom
2005-06-15
Probabilistic catastrophe loss modelling techniques, comprising a large stochastic set of potential storm-surge flood events, each assigned an annual rate of occurrence, have been employed for quantifying risk in the coastal flood plain of eastern England. Based on the tracks of the causative extratropical cyclones, historical storm-surge events are categorized into three classes, with distinct windfields and surge geographies. Extreme combinations of "tide with surge" are then generated for an extreme value distribution developed for each class. Fragility curves are used to determine the probability and magnitude of breaching relative to water levels and wave action for each section of sea defence. Based on the time-history of water levels in the surge, and the simulated configuration of breaching, flow is time-stepped through the defences and propagated into the flood plain using a 50 m horizontal-resolution digital elevation model. Based on the values and locations of the building stock in the flood plain, losses are calculated using vulnerability functions linking flood depth and flood velocity to measures of property loss. The outputs from this model for a UK insurance industry portfolio include "loss exceedence probabilities" as well as "average annualized losses", which can be employed for calculating coastal flood risk premiums in each postcode.
Denton, Brian T.; Hayward, Rodney A.
2017-01-01
Background Intensive blood pressure (BP) treatment can avert cardiovascular disease (CVD) events but can cause some serious adverse events. We sought to develop and validate risk models for predicting absolute risk difference (increased risk or decreased risk) for CVD events and serious adverse events from intensive BP therapy. A secondary aim was to test if the statistical method of elastic net regularization would improve the estimation of risk models for predicting absolute risk difference, as compared to a traditional backwards variable selection approach. Methods and findings Cox models were derived from SPRINT trial data and validated on ACCORD-BP trial data to estimate risk of CVD events and serious adverse events; the models included terms for intensive BP treatment and heterogeneous response to intensive treatment. The Cox models were then used to estimate the absolute reduction in probability of CVD events (benefit) and absolute increase in probability of serious adverse events (harm) for each individual from intensive treatment. We compared the method of elastic net regularization, which uses repeated internal cross-validation to select variables and estimate coefficients in the presence of collinearity, to a traditional backwards variable selection approach. Data from 9,069 SPRINT participants with complete data on covariates were utilized for model development, and data from 4,498 ACCORD-BP participants with complete data were utilized for model validation. Participants were exposed to intensive (goal systolic pressure < 120 mm Hg) versus standard (<140 mm Hg) treatment. Two composite primary outcome measures were evaluated: (i) CVD events/deaths (myocardial infarction, acute coronary syndrome, stroke, congestive heart failure, or CVD death), and (ii) serious adverse events (hypotension, syncope, electrolyte abnormalities, bradycardia, or acute kidney injury/failure). The model for CVD chosen through elastic net regularization included interaction terms suggesting that older age, black race, higher diastolic BP, and higher lipids were associated with greater CVD risk reduction benefits from intensive treatment, while current smoking was associated with fewer benefits. The model for serious adverse events chosen through elastic net regularization suggested that male sex, current smoking, statin use, elevated creatinine, and higher lipids were associated with greater risk of serious adverse events from intensive treatment. SPRINT participants in the highest predicted benefit subgroup had a number needed to treat (NNT) of 24 to prevent 1 CVD event/death over 5 years (absolute risk reduction [ARR] = 0.042, 95% CI: 0.018, 0.066; P = 0.001), those in the middle predicted benefit subgroup had a NNT of 76 (ARR = 0.013, 95% CI: −0.0001, 0.026; P = 0.053), and those in the lowest subgroup had no significant risk reduction (ARR = 0.006, 95% CI: −0.007, 0.018; P = 0.71). Those in the highest predicted harm subgroup had a number needed to harm (NNH) of 27 to induce 1 serious adverse event (absolute risk increase [ARI] = 0.038, 95% CI: 0.014, 0.061; P = 0.002), those in the middle predicted harm subgroup had a NNH of 41 (ARI = 0.025, 95% CI: 0.012, 0.038; P < 0.001), and those in the lowest subgroup had no significant risk increase (ARI = −0.007, 95% CI: −0.043, 0.030; P = 0.72). 
In ACCORD-BP, participants in the highest subgroup of predicted benefit had significant absolute CVD risk reduction, but the overall ACCORD-BP participant sample was skewed towards participants with less predicted benefit and more predicted risk than in SPRINT. The models chosen through traditional backwards selection had similar ability to identify absolute risk difference for CVD as the elastic net models, but poorer ability to correctly identify absolute risk difference for serious adverse events. A key limitation of the analysis is the limited sample size of the ACCORD-BP trial, which expanded confidence intervals for ARI among persons with type 2 diabetes. Additionally, it is not possible to mechanistically explain the physiological relationships explaining the heterogeneous treatment effects captured by the models, since the study was an observational secondary data analysis. Conclusions We found that predictive models could help identify subgroups of participants in both SPRINT and ACCORD-BP who had lower versus higher ARRs in CVD events/deaths with intensive BP treatment, and participants who had lower versus higher ARIs in serious adverse events. PMID:29040268
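The number-needed-to-treat and number-needed-to-harm figures quoted above are simply the reciprocals of the absolute risk differences, as the short check below shows:

```python
import math

def nnt(risk_difference):
    """Number needed to treat (or to harm, if the argument is a risk increase):
    the reciprocal of the absolute risk difference, rounded up."""
    return math.ceil(1.0 / abs(risk_difference))

# Figures quoted in the abstract for the highest-benefit and highest-harm subgroups.
print("NNT (ARR = 0.042):", nnt(0.042))   # 24
print("NNH (ARI = 0.038):", nnt(0.038))   # 27
```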
Effects of Aftershock Declustering in Risk Modeling: Case Study of a Subduction Sequence in Mexico
NASA Astrophysics Data System (ADS)
Kane, D. L.; Nyst, M.
2014-12-01
Earthquake hazard and risk models often assume that earthquake rates can be represented by a stationary Poisson process, and that aftershocks observed in historical seismicity catalogs represent a deviation from stationarity that must be corrected before earthquake rates are estimated. Algorithms for classifying individual earthquakes as independent mainshocks or as aftershocks vary widely, and analysis of a single catalog can produce considerably different earthquake rates depending on the declustering method implemented. As these rates are propagated through hazard and risk models, the modeled results will vary due to the assumptions implied by these choices. In particular, the removal of large aftershocks following a mainshock may lead to an underestimation of the rate of damaging earthquakes and potential damage due to a large aftershock may be excluded from the model. We present a case study based on the 1907 - 1911 sequence of nine 6.9 <= Mw <= 7.9 earthquakes along the Cocos - North American plate subduction boundary in Mexico in order to illustrate the variability in risk under various declustering approaches. Previous studies have suggested that subduction zone earthquakes in Mexico tend to occur in clusters, and this particular sequence includes events that would be labeled as aftershocks in some declustering approaches yet are large enough to produce significant damage. We model the ground motion for each event, determine damage ratios using modern exposure data, and then compare the variability in the modeled damage from using the full catalog or one of several declustered catalogs containing only "independent" events. We also consider the effects of progressive damage caused by each subsequent event and how this might increase or decrease the total losses expected from this sequence.
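A naive window-method declustering sketch is shown below to make the sensitivity concrete; the time and distance window functions are placeholders rather than any published coefficients, and the catalogue is invented:

```python
import numpy as np

def decluster(times_days, mags, time_window, dist_window, xs_km, ys_km):
    """Naive window-method declustering: an event is flagged as an aftershock
    if it falls inside the time/space window of an earlier, larger event.
    The window functions are placeholders, not published coefficients."""
    order = np.argsort(times_days)
    keep = np.ones(len(mags), dtype=bool)
    for i in order:
        if not keep[i]:
            continue
        for j in order:
            if times_days[j] <= times_days[i] or mags[j] > mags[i]:
                continue
            dt = times_days[j] - times_days[i]
            dr = np.hypot(xs_km[j] - xs_km[i], ys_km[j] - ys_km[i])
            if dt <= time_window(mags[i]) and dr <= dist_window(mags[i]):
                keep[j] = False          # j is treated as an aftershock of i
    return keep

# Placeholder windows: larger mainshocks get longer and wider windows.
t_win = lambda m: 10.0 * 2.0 ** (m - 6.0)       # days (illustrative)
d_win = lambda m: 30.0 * 2.0 ** (m - 6.0)       # km   (illustrative)

t = np.array([0.0, 3.0, 40.0, 400.0])
m = np.array([7.8, 7.0, 7.2, 7.5])
x = np.array([0.0, 20.0, 35.0, 150.0])
y = np.array([0.0, 10.0, 5.0, 80.0])
print("kept as independent:", decluster(t, m, t_win, d_win, x, y))
```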
12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk Adjustment
Code of Federal Regulations, 2010 CFR
2010-01-01
... impact of adverse market events on a bank's covered positions. Backtests provide information about the... meets such criteria as a consequence of accounting, operational, or similar considerations, and the OCC... must use its internal model to measure its daily VAR, in accordance with the requirements of this...
12 CFR Appendix C to Part 325 - Risk-Based Capital for State Non-Member Banks: Market Risk
Code of Federal Regulations, 2010 CFR
2010-01-01
... provide information about the impact of adverse market events on a bank's covered positions. Backtests provide information about the accuracy of an internal model by comparing a bank's daily VAR measures to... determines the bank meets such criteria as a consequence of accounting, operational, or similar...
Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought
NASA Astrophysics Data System (ADS)
Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.
2017-10-01
Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in the past years. The approach of "event attribution" compares the occurrence-probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
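Event attribution results of this kind are commonly summarised by a probability ratio (or the fraction of attributable risk) between the factual and counterfactual ensembles; the sketch below shows the calculation with invented ensemble counts:

```python
def attribution_metrics(n_exceed_factual, n_factual,
                        n_exceed_counterfactual, n_counterfactual):
    """Probability ratio and fraction of attributable risk from counts of
    ensemble members exceeding the event threshold in the factual and
    counterfactual (pre-industrial) simulations."""
    p1 = n_exceed_factual / n_factual
    p0 = n_exceed_counterfactual / n_counterfactual
    return p1 / p0, 1.0 - p0 / p1

# Illustrative counts only (not the paper's numbers).
pr, far = attribution_metrics(90, 1000, 30, 1000)
print(f"probability ratio = {pr:.1f}, fraction of attributable risk = {far:.2f}")
```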
NASA Astrophysics Data System (ADS)
Guillod, B. P.; Massey, N.; Otto, F. E. L.; Allen, M. R.; Jones, R.; Hall, J. W.
2016-12-01
Because extreme events are by definition rare, accurately quantifying the probabilities associated with a given event is difficult. This is particularly true for droughts, for which only a few events are available in the observational record owing to their long-lasting characteristics. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying present and future risks associated with droughts in the UK. To do so, a large number of modelled weather time series for "synthetic" drought events are being fed into hydrological and impact models to assess their impacts on various sectors (social sciences, economy, industry, agriculture, and ecosystems). Here, we present and analyse the hydro-meteorological drought event sets that have been produced with a new version of weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (tens of thousands) of Global Climate Model simulations, downscaled at 25 km over Europe by a nested Regional Climate Model. Simulations include the past 100 years as well as two future time slices (2030s and 2080s), and provide a large number of sequences of spatio-temporally coherent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. Besides presenting the methodology and validation of the event sets, we provide insights into drought risk in the UK and the drivers of drought. In particular, we examine their sensitivity to sea surface temperature and sea ice patterns, both in the recent past and for future projections. How drought risk in the UK can be expected to change in the future will also be discussed. Finally, we assess the applicability of this methodology to other regions. Reference: [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.
Veronesi, G; Gianfagna, F; Giampaoli, S; Chambless, L E; Mancia, G; Cesana, G; Ferrario, M M
2014-07-01
The aim of this study is to assess whether family history of coronary heart disease (CHD) and education as a proxy of social status improve long-term cardiovascular disease risk prediction in a low-incidence European population. The 20-year risk of first coronary or ischemic stroke events was estimated using sex-specific Cox models in 3956 participants of three population-based surveys in northern Italy, aged 35-69 years and free of cardiovascular disease at enrollment. The additional contribution of education and positive family history of CHD was defined as change in discrimination and Net Reclassification Improvement (NRI) over the model including 7 traditional risk factors. Kaplan-Meier 20-year risk was 16.8% in men (254 events) and 6.4% in women (102 events). Low education (hazard ratio=1.35, 95%CI 0.98-1.85) and family history of CHD (1.55; 1.19-2.03) were associated with the endpoint in men, but not in women. In men, the addition of education and family history significantly improved discrimination by 1%; NRI was 6% (95%CI: 0.2%-15.2%), rising to 20% (0.5%-44%) in those at intermediate risk. NRI in women at intermediate risk was 7%. In low-incidence populations, family history of CHD and education, easily assessed in clinical practice, should be included in long-term cardiovascular disease risk scores, at least in men. Copyright © 2014 Elsevier Inc. All rights reserved.
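For readers unfamiliar with the metric, a minimal sketch of the categorical Net Reclassification Improvement is given below; the reclassification counts are hypothetical and are not taken from the study.

    # Categorical Net Reclassification Improvement (NRI) from reclassification
    # counts; the counts below are hypothetical.
    def categorical_nri(up_events, down_events, n_events,
                        up_nonevents, down_nonevents, n_nonevents):
        """NRI = P(up|event) - P(down|event) + P(down|nonevent) - P(up|nonevent)."""
        return ((up_events - down_events) / n_events
                + (down_nonevents - up_nonevents) / n_nonevents)

    # e.g. adding education and family history moves subjects between 20-year
    # risk categories; the numbers are invented for illustration.
    print(categorical_nri(up_events=30, down_events=18, n_events=254,
                          up_nonevents=150, down_nonevents=190, n_nonevents=3702))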
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model
NASA Astrophysics Data System (ADS)
Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.
2014-12-01
European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new innovative approach for modeling the risk associated with European winter storms. The new innovative approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.
76 FR 53162 - Acceptance of Public Submissions Regarding the Study of Stable Value Contracts
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-25
... the risk of a run on a SVF? To the extent that SVC providers use value-at-risk (``VaR'') models, do such VaR models adequately assess the risk of loss resulting from such events or other possible but extremely unlikely events? Do other loss models more adequately assess the risk of loss, such as the...
Kälsch, Hagen; Lehmann, Nils; Mahabadi, Amir A; Bauer, Marcus; Kara, Kaffer; Hüppe, Patricia; Moebus, Susanne; Möhlenkamp, Stefan; Dragano, Nico; Schmermund, Axel; Stang, Andreas; Jöckel, Karl-Heinz; Erbel, Raimund
2014-06-01
Aortic valve calcification (AVC) is considered a manifestation of atherosclerosis. In this study, we investigated whether AVC adds to cardiovascular risk prediction beyond Framingham risk factors and coronary artery calcification (CAC). A total of 3944 subjects from the population-based Heinz Nixdorf Recall Study (59.3±7.7 years; 53% females) were evaluated for coronary events, stroke, and cardiovascular disease (CVD) events (including all plus CV death) over 9.1±1.9 years. CT scans were performed to quantify AVC. Cox proportional hazards regressions and Harrell's C were used to examine AVC as an event predictor in addition to risk factors and CAC. During follow-up, 138 (3.5%) subjects experienced coronary events, 101 (2.6%) had a stroke, and 257 (6.5%) experienced CVD events. In subjects with AVC>0 versus AVC=0 the incidence of coronary events was 8.0% versus 3.0% (p<0.001) and the incidence of CVD events was 13.0% versus 5.7% (p<0.001). The frequency of events increased significantly with increasing AVC scores (p<0.001). After adjustment for Framingham risk factors, high AVC scores (3rd tertile) remained independently associated with coronary events (HR 2.21, 95% CI 1.28 to 3.81) and CVD events (HR 1.67, 95% CI 1.08 to 2.58). After further adjustment for CAC score, HRs were attenuated (coronary events 1.55, 95% CI 0.89 to 2.69; CVD events 1.29, 95% CI 0.83 to 2.00). When adding AVC to the model containing traditional risk factors and CAC, Harrell's C indices did not increase for coronary events (from 0.744 to 0.744) or CVD events (from 0.759 to 0.759). AVC is associated with incident coronary and CVD events independent of Framingham risk factors. However, AVC fails to improve cardiovascular event prediction over Framingham risk factors and CAC. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
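A sketch of the kind of comparison reported above, fitting Cox models with and without an added marker and comparing Harrell's C, is shown below; it uses synthetic data and assumes the Python lifelines package is available.

    # Compare Harrell's C for Cox models with and without an added marker
    # (an AVC-like score on top of a risk factor). Synthetic data; assumes
    # the `lifelines` package is installed.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 2000
    age = rng.normal(60, 8, n)
    marker = rng.gamma(2.0, 1.0, n)              # stand-in for a calcification score
    hazard = np.exp(0.03 * (age - 60) + 0.15 * marker)
    time = rng.exponential(10.0 / hazard)
    df = pd.DataFrame({"time": np.minimum(time, 9.0),      # censor at 9 years
                       "event": (time < 9.0).astype(int),
                       "age": age, "marker": marker})

    base = CoxPHFitter().fit(df[["time", "event", "age"]], "time", "event")
    full = CoxPHFitter().fit(df, "time", "event")
    print("C (risk factor only):", round(base.concordance_index_, 3))
    print("C (risk factor + marker):", round(full.concordance_index_, 3))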
Global Distribution of Outbreaks of Water-Associated Infectious Diseases
Yang, Kun; LeJeune, Jeffrey; Alsdorf, Doug; Lu, Bo; Shum, C. K.; Liang, Song
2012-01-01
Background Water plays an important role in the transmission of many infectious diseases, which pose a great burden on global public health. However, the global distribution of these water-associated infectious diseases and underlying factors remain largely unexplored. Methods and Findings Based on the Global Infectious Disease and Epidemiology Network (GIDEON), a global database including water-associated pathogens and diseases was developed. In this study, reported outbreak events associated with corresponding water-associated infectious diseases from 1991 to 2008 were extracted from the database. The location of each reported outbreak event was identified and geocoded into a GIS database. The GIS database also included geo-referenced socio-environmental information, including population density (2000), annual accumulated temperature, surface water area, and average annual precipitation. Poisson models with Bayesian inference were developed to explore the association between these socio-environmental factors and the distribution of the reported outbreak events. Based on model predictions, a global relative risk map was generated. A total of 1,428 reported outbreak events were retrieved from the database. The analysis suggested that outbreaks of water-associated diseases are significantly correlated with socio-environmental factors. Population density is a significant risk factor for all categories of reported outbreaks of water-associated diseases; water-related diseases (e.g., vector-borne diseases) are associated with accumulated temperature; water-washed diseases (e.g., conjunctivitis) are inversely related to surface water area; both water-borne and water-related diseases are inversely related to average annual rainfall. Based on the model predictions, “hotspots” of risks for all categories of water-associated diseases were explored. Conclusions At the global scale, water-associated infectious diseases are significantly correlated with socio-environmental factors, impacting all regions, which are affected disproportionately by different categories of water-associated infectious diseases. PMID:22348158
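The study fits Poisson models with Bayesian inference; as a simpler stand-in, the sketch below fits an ordinary Poisson regression of gridded outbreak counts on two socio-environmental covariates using synthetic data and the statsmodels package.

    # Poisson regression of reported outbreak counts on socio-environmental
    # covariates; a non-Bayesian stand-in for the study's models. Synthetic data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n_cells = 500
    log_pop_density = rng.normal(4.0, 1.5, n_cells)
    annual_precip = rng.gamma(5.0, 200.0, n_cells)        # mm per year
    lam = np.exp(-3.0 + 0.5 * log_pop_density - 0.0005 * annual_precip)
    outbreaks = rng.poisson(lam)

    X = sm.add_constant(np.column_stack([log_pop_density, annual_precip]))
    fit = sm.GLM(outbreaks, X, family=sm.families.Poisson()).fit()
    print(fit.params)  # positive on population density, negative on rainfall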
Prognostic Utility of Novel Biomarkers of Cardiovascular Stress: The Framingham Heart Study
Wang, Thomas J.; Wollert, Kai C.; Larson, Martin G.; Coglianese, Erin; McCabe, Elizabeth L.; Cheng, Susan; Ho, Jennifer E.; Fradley, Michael G.; Ghorbani, Anahita; Xanthakis, Vanessa; Kempf, Tibor; Benjamin, Emelia J.; Levy, Daniel; Vasan, Ramachandran S.; Januzzi, James L.
2013-01-01
Background Biomarkers for predicting cardiovascular events in community-based populations have not consistently added information to standard risk factors. A limitation of many previously studied biomarkers is their lack of cardiovascular specificity. Methods and Results To determine the prognostic value of 3 novel biomarkers induced by cardiovascular stress, we measured soluble ST2, growth differentiation factor-15, and high-sensitivity troponin I in 3,428 participants (mean age 59, 53% women) in the Framingham Heart Study. We performed multivariable-adjusted proportional hazards models to assess the individual and combined ability of the biomarkers to predict adverse outcomes. We also constructed a “multimarker” score composed of the 3 biomarkers, in addition to B-type natriuretic peptide and high-sensitivity C-reactive protein. During a mean follow-up of 11.3 years, there were 488 deaths, 336 major cardiovascular events, 162 heart failure events, and 142 coronary events. In multivariable-adjusted models, the 3 new biomarkers were associated with each endpoint (p<0.001) except for coronary events. Individuals with multimarker scores in the highest quartile had a 3-fold risk of death (adjusted hazard ratio, 3.2, 95% CI, 2.2–4.7; p<0.001), 6-fold risk of heart failure (6.2, 95% CI, 2.6–14.8; p<0.001), and 2-fold risk of cardiovascular events (1.9, 95% CI, 1.3–2.7; p=0.001). Addition of the multimarker score to clinical variables led to significant increases in the c-statistic (p=0.007 or lower) and net reclassification improvement (p=0.001 or lower). Conclusions Multiple biomarkers of cardiovascular stress are detectable in ambulatory individuals, and add prognostic value to standard risk factors for predicting death, overall cardiovascular events, and heart failure. PMID:22907935
Prognostic utility of novel biomarkers of cardiovascular stress: the Framingham Heart Study.
Wang, Thomas J; Wollert, Kai C; Larson, Martin G; Coglianese, Erin; McCabe, Elizabeth L; Cheng, Susan; Ho, Jennifer E; Fradley, Michael G; Ghorbani, Anahita; Xanthakis, Vanessa; Kempf, Tibor; Benjamin, Emelia J; Levy, Daniel; Vasan, Ramachandran S; Januzzi, James L
2012-09-25
Biomarkers for predicting cardiovascular events in community-based populations have not consistently added information to standard risk factors. A limitation of many previously studied biomarkers is their lack of cardiovascular specificity. To determine the prognostic value of 3 novel biomarkers induced by cardiovascular stress, we measured soluble ST2, growth differentiation factor-15, and high-sensitivity troponin I in 3428 participants (mean age, 59 years; 53% women) in the Framingham Heart Study. We performed multivariable-adjusted proportional hazards models to assess the individual and combined ability of the biomarkers to predict adverse outcomes. We also constructed a "multimarker" score composed of the 3 biomarkers in addition to B-type natriuretic peptide and high-sensitivity C-reactive protein. During a mean follow-up of 11.3 years, there were 488 deaths, 336 major cardiovascular events, 162 heart failure events, and 142 coronary events. In multivariable-adjusted models, the 3 new biomarkers were associated with each end point (P<0.001) except coronary events. Individuals with multimarker scores in the highest quartile had a 3-fold risk of death (adjusted hazard ratio, 3.2; 95% confidence interval, 2.2-4.7; P<0.001), 6-fold risk of heart failure (6.2; 95% confidence interval, 2.6-14.8; P<0.001), and 2-fold risk of cardiovascular events (1.9; 95% confidence interval, 1.3-2.7; P=0.001). Addition of the multimarker score to clinical variables led to significant increases in the c statistic (P=0.005 or lower) and net reclassification improvement (P=0.001 or lower). Multiple biomarkers of cardiovascular stress are detectable in ambulatory individuals and add prognostic value to standard risk factors for predicting death, overall cardiovascular events, and heart failure.
Risk assessments: Validation, gut feeling and cognitive biases (Plinius Medal Lecture)
NASA Astrophysics Data System (ADS)
Merz, Bruno
2017-04-01
Risk management is ideally based on comprehensive risk assessments quantifying the current risk and its reduction for different mitigation strategies. Given the pivotal role of risk assessments, this contribution discusses the basis for our confidence in risk assessments. Traditional validation, i.e. comparing model simulations with past observations, is often not possible since the assessment typically contains extreme events and their impacts that have not been observed before. In this situation, the assessment is strongly based on assumptions, expert judgement and best guess. This is an unfavorable situation as humans fall prey to cognitive biases, such as 'illusion of certainty', 'overconfidence' or 'recency bias'. Such biases operate specifically in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk assessments. We reflect on the role of gut feeling in risk assessments, illustrate the pitfalls of cognitive biases, and discuss the possibilities for better understanding how confident we can be in the numbers resulting from risk assessments.
NASA Astrophysics Data System (ADS)
Enzenhoefer, R.; Binning, P. J.; Nowak, W.
2015-09-01
Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations to an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept, mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
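A much reduced Monte Carlo sketch of the aggregation step, combining several stochastically occurring spill hazards into an annual exceedance probability at the well, is given below; the spill rates, impact distributions and threshold are hypothetical.

    # Monte Carlo aggregation of stochastic spill events into an annual
    # exceedance probability at the well. All rates, impact distributions and
    # the concentration threshold are hypothetical illustration values.
    import numpy as np

    rng = np.random.default_rng(2)
    n_years = 100_000
    threshold = 10.0  # drinking-water limit, ug/L

    # (annual spill rate [1/yr], mean peak concentration at the well [ug/L])
    hazards = [(0.10, 3.0), (0.02, 20.0), (0.30, 0.5)]

    exceed = np.zeros(n_years, dtype=bool)
    for rate, mean_impact in hazards:
        n_spills = rng.poisson(rate, n_years)                       # spills per year
        impact = n_spills * rng.exponential(mean_impact, n_years)   # crude aggregation
        exceed |= impact > threshold

    print("Annual probability of exceeding the limit:", exceed.mean())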
Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M
2013-08-30
In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Nissen, Katrin; Ulbrich, Uwe
2016-04-01
An event-based detection algorithm for extreme precipitation is applied to a multi-model ensemble of regional climate model simulations. The algorithm determines extent, location, duration and severity of extreme precipitation events. We assume that precipitation in excess of the local present-day 10-year return value will potentially exceed the capacity of the drainage systems that protect critical infrastructure elements. This assumption is based on legislation for the design of drainage systems which is in place in many European countries. Thus, events exceeding the local 10-year return value are detected. In this study we distinguish between sub-daily events (3 hourly) with high precipitation intensities and long-duration events (1-3 days) with high precipitation amounts. The climate change simulations investigated here were conducted within the EURO-CORDEX framework and exhibit a horizontal resolution of approximately 12.5 km. The period 1971-2100, forced with observed and scenario (RCP 8.5 and RCP 4.5) greenhouse gas concentrations, was analysed. Examined are changes in event frequency, event duration and size. The simulations show an increase in the number of extreme precipitation events for the future climate period over most of the area, which is strongest in Northern Europe. Strength and statistical significance of the signal increase with increasing greenhouse gas concentrations. This work has been conducted within the EU project RAIN (Risk Analysis of Infrastructure Networks in response to extreme weather).
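The detection threshold in the study is the local present-day 10-year return value; a sketch of how such a value can be estimated from annual maxima by fitting a GEV distribution is shown below, using synthetic data and scipy.

    # Estimate a local 10-year return value of 3-hourly precipitation by
    # fitting a GEV distribution to annual maxima. Synthetic data.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)
    annual_max_3h = rng.gumbel(loc=20.0, scale=6.0, size=40)   # mm per 3 h, 40 years

    shape, loc, scale = genextreme.fit(annual_max_3h)
    rv10 = genextreme.ppf(1.0 - 1.0 / 10.0, shape, loc=loc, scale=scale)
    print(f"Estimated 10-year return value: {rv10:.1f} mm / 3 h")
    # Events exceeding rv10 would be flagged by the detection algorithm.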
Wassink, Annemarie M; van der Graaf, Yolanda; Janssen, Kristel J; Cook, Nancy R; Visseren, Frank L
2012-12-01
Although the overall average 10-year cardiovascular risk for patients with manifest atherosclerosis is considered to be more than 20%, actual risk for individual patients ranges from much lower to much higher. We investigated whether information on metabolic syndrome (MetS) or its individual components improves cardiovascular risk stratification in these patients. We conducted a prospective cohort study in 3679 patients with clinically manifest atherosclerosis from the Secondary Manifestations of ARTerial disease (SMART) study. Primary outcome was defined as any cardiovascular event (cardiovascular death, ischemic stroke or myocardial infarction). Three pre-specified prediction models were derived, all including information on established MetS components. The association between outcome and predictors was quantified using a Cox proportional hazard analysis. Model performance was assessed using global goodness of fit (χ²), discrimination (C-index) and ability to improve risk stratification. A total of 417 cardiovascular events occurred among 3679 patients with 15,102 person-years of follow-up (median follow-up 3.7 years, range 1.6-6.4 years). Compared to a model with age and gender only, all MetS-based models performed slightly better in terms of global model fit (χ²) but not C-index. The Net Reclassification Index associated with the addition of MetS (yes/no), the dichotomous MetS-components or the continuous MetS-components on top of age and gender was 2.1% (p = 0.29), 2.3% (p = 0.31) and 7.5% (p = 0.01), respectively. Prediction models incorporating age, gender and MetS can discriminate between patients with clinically manifest atherosclerosis at the highest vascular risk and those at lower risk. The addition of MetS components to a model with age and gender correctly reclassifies only a small proportion of patients into higher- and lower-risk categories. The clinical utility of a prediction model with MetS is therefore limited.
NASA Astrophysics Data System (ADS)
Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.
2017-12-01
Remotely sensed data has transformed the field of large scale hydraulic modelling. New digital elevation, hydrography and river width data has allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60 km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event 'footprint'. Despite this, many national scale risk analyses still use 'constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge to gauge correlations. We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in space. We undertake a number of quality checks of the stochastic model and compare real and simulated footprints to show that the method is able to re-create realistic patterns even at continental scales where there is large variation in flood generating mechanisms. We then show how these patterns can be used to drive a large-scale 2D hydraulic model to predict regional-scale flooding.
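The paper uses a conditional multivariate extremes model for this step; the sketch below is a much simpler Gaussian-copula stand-in that only illustrates how gauge-to-gauge correlation produces return periods that vary across gauges within a single simulated event footprint.

    # Simplified stand-in for spatially dependent event footprints: a Gaussian
    # copula with distance-decaying correlation between gauges. The paper uses
    # a conditional multivariate extremes model; this only illustrates how the
    # return period varies across gauges within one simulated event.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    n_gauges, n_events = 5, 10_000

    dist = np.abs(np.subtract.outer(np.arange(n_gauges), np.arange(n_gauges)))
    corr = np.exp(-dist / 2.0)                        # hypothetical correlation structure

    z = rng.multivariate_normal(np.zeros(n_gauges), corr, size=n_events)
    u = norm.cdf(z)                                   # correlated uniform marginals
    return_period = 1.0 / (1.0 - u)                   # per-gauge return period (years)

    worst = return_period.max(axis=1).argmax()
    print(np.round(return_period[worst], 1))          # varies strongly from gauge to gauge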
Strategic regulatory approaches for the qualification of a biomarker assay for safety use.
Valeri, Anna P; Beharry, Michelle; Jones, David R
2013-02-01
Biomarkers can be defined as key molecular or cellular events that link a specific biological event to a health outcome. As such, biomarkers play an important role in understanding the relationships between exposure to a xenobiotic, the development of chronic human diseases, and the identification of subgroups that are at increased risk of disease. Much progress has been made in identifying and validating new biomarkers to be used in population-based studies. The increasing availability and use of biomarkers to aid informed decision-making in risk-benefit decisions highlights the need for careful assessment of the validity of such models. In particular, models involving new biomarkers require careful validation and regulatory acceptance.
NASA Astrophysics Data System (ADS)
van der Wiel, K.; Kapnick, S. B.; Vecchi, G.; Smith, J. A.
2017-12-01
The Mississippi-Missouri river catchment houses millions of people and much of the U.S. national agricultural production. Severe flooding events can therefore have large negative societal, natural and economic impacts. GFDL FLOR, a global coupled climate model (atmosphere, ocean, land, sea ice with integrated river routing module), is used to investigate the characteristics of great Mississippi floods with an average return period of 100 years. Model experiments under pre-industrial greenhouse gas forcing were conducted for 3400 years, such that the most extreme flooding events were explicitly modeled and the land and/or atmospheric causes could be investigated. It is shown that melt of snow pack and frozen sub-surface water in the Missouri and Upper Mississippi basins prime the river system, subsequently sensitizing it to above average precipitation in the Ohio and Tennessee basins. The months preceding the greatest flooding events are above average wet, leading to moist sub-surface conditions. Anomalous melt depends on the availability of frozen water in the catchment, therefore anomalous amounts of sub-surface frozen water and an anomalously large snow pack in winter (Nov-Feb) make the river system susceptible to these great flooding events in spring (Feb-Apr). An additional experiment of 1200 years under transient greenhouse gas forcing (RCP4.5, 5 members) was done to investigate potential future change in flood risk. Based on a peak-over-threshold method, it is found that the number of great flooding events decreases in a warmer future. This decrease coincides with decreasing occurrence of large melt events, but is despite increasing numbers of large precipitation events. Though the model results indicate a decreasing risk for the greatest flooding events, the predictability of events might decrease in a warmer future given the changing character of melt and precipitation.
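The frequency comparison rests on simple peak-over-threshold bookkeeping; the sketch below reproduces that bookkeeping on synthetic annual peak discharges for a long control run and a shorter warmer-climate run.

    # Peak-over-threshold counting of "100-year" floods in a control run and a
    # warmer-climate run. The discharge series are synthetic.
    import numpy as np

    rng = np.random.default_rng(5)
    control = rng.gumbel(10_000, 3_000, 3400)   # annual peak discharge, 3400 model years
    warmer = rng.gumbel(9_500, 3_000, 1200)     # hypothetical shift in a warmer run

    # Threshold exceeded on average once per 100 years in the control run.
    threshold = np.quantile(control, 1.0 - 1.0 / 100.0)

    print("Exceedances per year, control:", (control > threshold).mean())
    print("Exceedances per year, warmer: ", (warmer > threshold).mean())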
Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan
2011-11-01
To explore the application of negative binomial regression and modified Poisson regression in analyzing the factors that influence injury frequency and the risk factors associated with increased injury frequency, 2917 primary and secondary school students were selected from Hefei by a cluster random sampling method and surveyed by questionnaire. Modified Poisson regression and negative binomial regression models were fitted to the count data of injury events. The risk factors associated with increased unintentional injury frequency among these students were explored, in order to compare the performance of the two models in studying the factors that influence injury frequency. A Lagrange multiplier test indicated over-dispersion in the Poisson model (P < 0.0001); the over-dispersed data were therefore better fitted by the modified Poisson regression and negative binomial regression models. Both models showed that male gender, younger age, a father working away from the hometown, a guardian educated above junior high school level, and smoking were associated with higher injury frequencies. For clustered count data on injury events, both modified Poisson regression and negative binomial regression can be used. However, for our data the modified Poisson regression fitted better and gave a more accurate interpretation of the factors affecting injury frequency.
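A sketch of the two modelling approaches on synthetic over-dispersed injury counts is given below; "modified Poisson" is implemented here as a Poisson GLM with robust (sandwich) standard errors, and all data and coefficients are invented.

    # Modified Poisson (Poisson GLM with robust standard errors) vs. negative
    # binomial regression for over-dispersed injury counts. Synthetic data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 2917
    male = rng.integers(0, 2, n)
    age = rng.integers(7, 18, n)

    mu = np.exp(-1.0 + 0.4 * male - 0.05 * (age - 12))
    counts = rng.negative_binomial(2, 2.0 / (2.0 + mu))   # over-dispersed counts

    X = sm.add_constant(np.column_stack([male, age]))
    poisson_robust = sm.GLM(counts, X, family=sm.families.Poisson()).fit(cov_type="HC1")
    negbin = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
    print(poisson_robust.params, negbin.params)           # similar rate ratios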
A Seismic Source Model for Central Europe and Italy
NASA Astrophysics Data System (ADS)
Nyst, M.; Williams, C.; Onur, T.
2006-12-01
We present a seismic source model for Central Europe (Belgium, Germany, Switzerland, and Austria) and Italy, as part of an overall seismic risk and loss modeling project for this region. A separate presentation at this conference discusses the probabilistic seismic hazard and risk assessment (Williams et al., 2006). Where available, we adopt regional consensus models and adjust these to fit our format; otherwise we develop our own model. Our seismic source model covers the whole region under consideration and consists of the following components: 1. A subduction zone environment in Calabria, SE Italy, with interface events between the Eurasian and African plates and intraslab events within the subducting slab. The subduction zone interface is parameterized as a set of dipping area sources that follow the geometry of the surface of the subducting plate, whereas intraslab events are modeled as plane sources at depth; 2. The main normal faults in the upper crust along the Apennines mountain range, in Calabria and Central Italy. Dipping faults and (sub-) vertical faults are parameterized as dipping plane and line sources, respectively; 3. The Upper and Lower Rhine Graben regime that runs from northern Italy into eastern Belgium, parameterized as a combination of dipping plane and line sources, and finally 4. Background seismicity, parameterized as area sources. The fault model is based on slip rates using characteristic recurrence. The modeling of background and subduction zone seismicity is based on a compilation of several national and regional historic seismic catalogs using a Gutenberg-Richter recurrence model. Merging the catalogs encompasses the deletion of duplicate, spurious and very old events and the application of a declustering algorithm (Reasenberg, 2000). The resulting catalog contains a little over 6000 events, has an average b-value of -0.9, is complete for moment magnitudes 4.5 and larger, and is used to compute a gridded a-value model (smoothed historical seismicity) for the region. The logic tree weights various completeness intervals and minimum magnitudes. Using a weighted scheme of European and global ground motion models together with a detailed site classification map for Europe based on Eurocode 8, we generate hazard maps for recurrence periods of 200, 475, 1000 and 2500 yrs.
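For the background-seismicity component, the Gutenberg-Richter parameters are estimated from the merged catalog; a sketch of the standard maximum-likelihood (Aki) b-value estimate and the corresponding annual a-value, on synthetic magnitudes, is given below.

    # Maximum-likelihood (Aki) estimate of the Gutenberg-Richter b-value and the
    # annual a-value for a catalog complete above Mc. Synthetic magnitudes.
    import numpy as np

    rng = np.random.default_rng(7)
    mc = 4.5                        # completeness magnitude
    years = 100.0
    mags = mc + rng.exponential(scale=1.0 / np.log(10.0), size=6000)   # true b ~ 1

    b = np.log10(np.e) / (mags.mean() - mc)        # Aki (1965) estimator
    a = np.log10(len(mags) / years) + b * mc       # so that N(M >= m)/yr = 10**(a - b*m)
    print(f"b = {b:.2f}, a = {a:.2f}")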
Predictors of heart failure in patients with stable coronary artery disease: a PEACE study.
Lewis, Eldrin F; Solomon, Scott D; Jablonski, Kathleen A; Rice, Madeline Murguia; Clemenza, Francesco; Hsia, Judith; Maggioni, Aldo P; Zabalgoitia, Miguel; Huynh, Thao; Cuddy, Thomas E; Gersh, Bernard J; Rouleau, Jean; Braunwald, Eugene; Pfeffer, Marc A
2009-05-01
Heart failure (HF) is a disease commonly associated with coronary artery disease. Most risk models for HF development have focused on patients with acute myocardial infarction. The Prevention of Events with Angiotensin-Converting Enzyme Inhibition population enabled the development of a risk model to predict HF in patients with stable coronary artery disease and preserved ejection fraction. In the 8290 Prevention of Events with Angiotensin-Converting Enzyme Inhibition patients without preexisting HF, new-onset HF hospitalizations and fatal HF were assessed over a median follow-up of 4.8 years. Covariates were evaluated and maintained in the Cox regression multivariable model using backward selection if P<0.05. A risk score was developed and converted to an integer-based scoring system. Among the Prevention of Events with Angiotensin-Converting Enzyme Inhibition population (age 64±8 years; female 18%; prior myocardial infarction 55%), there were 268 cases of fatal and nonfatal HF. Twelve characteristics were associated with increased risk of HF along with several baseline medications, including older age, history of hypertension, and diabetes. Randomization to trandolapril independently reduced the risk of HF. There was no interaction between trandolapril treatment and other risk factors for HF. The risk score (range, 0 to 21) demonstrated excellent discriminatory power (c-statistic 0.80). Risk of HF ranged from 1.75% in patients with a risk score of 0 to 33% in patients with a risk score ≥16. Among patients with stable coronary artery disease and preserved ejection fraction, traditional and newer factors were independently associated with increased risk of HF. Trandolapril decreased the risk of HF in these patients with preserved ejection fraction.
Risks from Solar Particle Events for Long Duration Space Missions Outside Low Earth Orbit
NASA Technical Reports Server (NTRS)
Over, S.; Myers, J.; Ford, J.
2016-01-01
The Integrated Medical Model (IMM) simulates the medical occurrences and mission outcomes for various mission profiles using probabilistic risk assessment techniques. As part of the work with the Integrated Medical Model (IMM), this project focuses on radiation risks from acute events during extended human missions outside low Earth orbit (LEO). Of primary importance in acute risk assessment are solar particle events (SPEs), which are low probability, high consequence events that could adversely affect mission outcomes through acute radiation damage to astronauts. SPEs can be further classified into coronal mass ejections (CMEs) and solar flares/impulsive events (Fig. 1). CMEs are an eruption of solar material and have shock enhancements that contribute to make these types of events higher in total fluence than impulsive events.
Dynamic TIMI Risk Score for STEMI
Amin, Sameer T.; Morrow, David A.; Braunwald, Eugene; Sloan, Sarah; Contant, Charles; Murphy, Sabina; Antman, Elliott M.
2013-01-01
Background Although there are multiple methods of risk stratification for ST‐elevation myocardial infarction (STEMI), this study presents a prospectively validated method for reclassification of patients based on in‐hospital events. A dynamic risk score provides an initial risk stratification and reassessment at discharge. Methods and Results The dynamic TIMI risk score for STEMI was derived in ExTRACT‐TIMI 25 and validated in TRITON‐TIMI 38. Baseline variables were from the original TIMI risk score for STEMI. New variables were major clinical events occurring during the index hospitalization. Each variable was tested individually in a univariate Cox proportional hazards regression. Variables with P<0.05 were incorporated into a full multivariable Cox model to assess the risk of death at 1 year. Each variable was assigned an integer value based on the odds ratio, and the final score was the sum of these values. The dynamic score included the development of in‐hospital MI, arrhythmia, major bleed, stroke, congestive heart failure, recurrent ischemia, and renal failure. The C‐statistic produced by the dynamic score in the derivation database was 0.76, with a net reclassification improvement (NRI) of 0.33 (P<0.0001) from the inclusion of dynamic events to the original TIMI risk score. In the validation database, the C‐statistic was 0.81, with a NRI of 0.35 (P=0.01). Conclusions This score is a prospectively derived, validated means of estimating 1‐year mortality of STEMI at hospital discharge and can serve as a clinically useful tool. By incorporating events during the index hospitalization, it can better define risk and help to guide treatment decisions. PMID:23525425
Dynamic TIMI risk score for STEMI.
Amin, Sameer T; Morrow, David A; Braunwald, Eugene; Sloan, Sarah; Contant, Charles; Murphy, Sabina; Antman, Elliott M
2013-01-29
Although there are multiple methods of risk stratification for ST-elevation myocardial infarction (STEMI), this study presents a prospectively validated method for reclassification of patients based on in-hospital events. A dynamic risk score provides an initial risk stratification and reassessment at discharge. The dynamic TIMI risk score for STEMI was derived in ExTRACT-TIMI 25 and validated in TRITON-TIMI 38. Baseline variables were from the original TIMI risk score for STEMI. New variables were major clinical events occurring during the index hospitalization. Each variable was tested individually in a univariate Cox proportional hazards regression. Variables with P<0.05 were incorporated into a full multivariable Cox model to assess the risk of death at 1 year. Each variable was assigned an integer value based on the odds ratio, and the final score was the sum of these values. The dynamic score included the development of in-hospital MI, arrhythmia, major bleed, stroke, congestive heart failure, recurrent ischemia, and renal failure. The C-statistic produced by the dynamic score in the derivation database was 0.76, with a net reclassification improvement (NRI) of 0.33 (P<0.0001) from the inclusion of dynamic events to the original TIMI risk score. In the validation database, the C-statistic was 0.81, with a NRI of 0.35 (P=0.01). This score is a prospectively derived, validated means of estimating 1-year mortality of STEMI at hospital discharge and can serve as a clinically useful tool. By incorporating events during the index hospitalization, it can better define risk and help to guide treatment decisions.
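A schematic of the scoring mechanics described above, converting hazard ratios for in-hospital events into integer points and summing them at discharge, is given below; the hazard ratios and point values are invented and are not the published TIMI weights.

    # Converting hazard ratios for in-hospital events into integer points and
    # summing them per patient. The values are invented, not the TIMI weights.
    import math

    hazard_ratios = {                    # hypothetical HRs from a multivariable Cox model
        "in_hospital_MI": 2.4, "arrhythmia": 1.8, "major_bleed": 2.0, "stroke": 3.1,
        "heart_failure": 2.7, "recurrent_ischemia": 1.5, "renal_failure": 2.2,
    }
    # One common convention: points proportional to log(HR), rounded to integers.
    points = {k: max(1, round(math.log(hr) / math.log(1.5))) for k, hr in hazard_ratios.items()}

    def dynamic_score(baseline_points: int, in_hospital_events: list) -> int:
        """Admission score plus points for events during the index hospitalization."""
        return baseline_points + sum(points[e] for e in in_hospital_events)

    print(points)
    print(dynamic_score(baseline_points=4, in_hospital_events=["arrhythmia", "major_bleed"]))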
Logue, Jennifer; Murray, Heather M; Welsh, Paul; Shepherd, James; Packard, Chris; Macfarlane, Peter; Cobbe, Stuart; Ford, Ian; Sattar, Naveed
2011-04-01
The effect of body mass index (BMI) on coronary heart disease (CHD) risk is attenuated when mediators of this risk (such as diabetes, hypertension and hyperlipidaemia) are accounted for. However, there is now evidence of a differential effect of risk factors on fatal and non-fatal CHD events, with markers of inflammation more strongly associated with fatal than non-fatal events. To describe the association with BMI separately for both fatal and non-fatal CHD risk after accounting for classical risk factors and to assess any independent effects of obesity on CHD risk. In the West of Scotland Coronary Prevention Study BMI in 6082 men (mean age 55 years) with hypercholesterolaemia, but no history of diabetes or CVD, was related to the risk of fatal and non-fatal CHD events. After excluding participants with any event in the first 2 years, 1027 non-fatal and 214 fatal CHD events occurred during 14.7 years of follow-up. A minimally adjusted model (age, sex, statin treatment) and a maximally adjusted model (including known CVD risk factors and deprivation) were compared, with BMI 25-27.4 kg/m² as referent. The risk of non-fatal events was similar across all BMI categories in both models. The risk of fatal CHD events was increased in men with BMI 30.0-39.9 kg/m² in both the minimally adjusted model (HR = 1.75 (95% CI 1.12 to 2.74)) and the maximally adjusted model (HR = 1.60 (95% CI 1.02 to 2.53)). These hypothesis generating data suggest that obesity is associated with fatal, but not non-fatal, CHD after accounting for known cardiovascular risk factors and deprivation. Clinical trial registration WOSCOPS was carried out and completed before the requirement for clinical trial registration.
Risk Assessment in Underground Coalmines Using Fuzzy Logic in the Presence of Uncertainty
NASA Astrophysics Data System (ADS)
Tripathy, Debi Prasad; Ala, Charan Kumar
2018-04-01
Fatal accidents occur every year as regular events in the Indian coal mining industry. To improve safety conditions, it has become essential to perform risk assessments of the various operations in mines. However, uncertain accident data make it hard to conduct such assessments. The objective of this study is to present a method to assess safety risks in underground coalmines. The assessment of safety risks is based on a fuzzy reasoning approach. A Mamdani fuzzy logic model is developed in the fuzzy logic toolbox of MATLAB. A case study is used to demonstrate the applicability of the developed model. The risk evaluation for the case study mine indicated that mine fire has the highest risk level among all the hazard factors. This study could help mine management prepare safety measures based on the risk rankings obtained.
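To show the flavour of Mamdani inference without the MATLAB toolbox, the sketch below implements a two-rule Mamdani system directly in numpy (triangular memberships, min/max inference, centroid defuzzification); the membership functions and rules are illustrative and are not those of the study.

    # Minimal Mamdani-style fuzzy inference in plain numpy: triangular
    # memberships, min/max rule firing and aggregation, centroid defuzzification.
    # Membership functions and rules are illustrative only.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership with feet at a and c and peak at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    risk_axis = np.linspace(0, 10, 1001)
    risk_low, risk_high = tri(risk_axis, 0, 2, 5), tri(risk_axis, 5, 8, 10)

    def assess(likelihood, severity):
        """Rule 1: low AND low -> low risk; Rule 2: high OR high -> high risk."""
        like_low, like_high = tri(likelihood, 0, 2, 5), tri(likelihood, 5, 8, 10)
        sev_low, sev_high = tri(severity, 0, 2, 5), tri(severity, 5, 8, 10)
        fire_low = min(like_low, sev_low)        # AND -> min
        fire_high = max(like_high, sev_high)     # OR  -> max
        agg = np.maximum(np.minimum(fire_low, risk_low),
                         np.minimum(fire_high, risk_high))
        return float((risk_axis * agg).sum() / agg.sum())   # centroid defuzzification

    print(round(assess(likelihood=8.0, severity=7.0), 2))   # a mine-fire-like hazard scores high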
Change of flood risk under climate change based on Discharge Probability Index in Japan
NASA Astrophysics Data System (ADS)
Nitta, T.; Yoshimura, K.; Kanae, S.; Oki, T.
2010-12-01
Water-related disasters under climate change have recently gained considerable interest, and there have been many studies referring to flood risk at the global scale (e.g. Milly et al., 2002; Hirabayashi et al., 2008). In order to build adaptive capacity, however, regional impact evaluation is needed. We thus focus on the flood risk over Japan in the present study. The output from the Regional Climate Model 20 (RCM20), which was developed by the Meteorological Research Institute, was used. The data was first compared with observed data based on Automated Meteorological Data Acquisition System and ground weather observations, and the model biases were corrected using the ratio and difference of the 20-year mean values. The bias-corrected RCM20 atmospheric data were then used to force a land surface model and a river routing model (Yoshimura et al., 2007; Ngo-Duc, T. et al. 2007) to simulate river discharge during 1981-2000, 2031-2050, and 2081-2100. Simulated river discharge was converted to the Discharge Probability Index (DPI), which was proposed by Yoshimura et al. based on a statistical approach. The bias and uncertainty of the models are already taken into account in the concept of DPI, so that DPI serves as a good indicator of flood risk. We estimated the statistical parameters for DPI using the river discharge for 1981-2000 with an assumption that the parameters stay the same in the different climate periods. We then evaluated the occurrence of flood events corresponding to DPI categories in each 20-year period and averaged them over 9 regions. The results indicate that low DPI flood events (return period of 2 years) will become more frequent in 2031-2050 and high DPI flood events (return period of 200 years) will become more frequent in 2081-2100 compared with the period of 1981-2000, though average precipitation will become larger during 2031-2050 than during 2081-2100 in most regions. This reflects the increased extreme precipitation during 2081-2100.
A regressive storm model for extreme space weather
NASA Astrophysics Data System (ADS)
Terkildsen, Michael; Steward, Graham; Neudegg, Dave; Marshall, Richard
2012-07-01
Extreme space weather events, while rare, pose significant risk to society in the form of impacts on critical infrastructure such as power grids, and the disruption of high end technological systems such as satellites and precision navigation and timing systems. There has been an increased focus on modelling the effects of extreme space weather, as well as improving the ability of space weather forecast centres to identify, with sufficient lead time, solar activity with the potential to produce extreme events. This paper describes the development of a data-based model for predicting the occurrence of extreme space weather events from solar observation. The motivation for this work was to develop a tool to assist space weather forecasters in early identification of solar activity conditions with the potential to produce extreme space weather, and with sufficient lead time to notify relevant customer groups. Data-based modelling techniques were used to construct the model, and an extensive archive of solar observation data used to train, optimise and test the model. The optimisation of the base model aimed to eliminate false negatives (missed events) at the expense of a tolerable increase in false positives, under the assumption of an iterative improvement in forecast accuracy during progression of the solar disturbance, as subsequent data becomes available.
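The optimisation described, trading additional false positives for zero missed events, can be illustrated with a few lines of code; the model scores below are synthetic.

    # Choose the alert threshold on a model score so that no historical extreme
    # event in the training archive is missed (zero false negatives), and report
    # the false-positive rate that this costs. Synthetic scores.
    import numpy as np

    rng = np.random.default_rng(8)
    scores_extreme = rng.normal(0.8, 0.10, 30)     # score on days with extreme events
    scores_quiet = rng.normal(0.4, 0.15, 5000)     # score on all other days

    threshold = scores_extreme.min()               # the lowest-scoring real event sets the bar
    fpr = (scores_quiet >= threshold).mean()
    print(f"threshold = {threshold:.2f}, false-positive rate = {fpr:.3f}")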
Coley, Rebecca Yates; Brown, Elizabeth R.
2016-01-01
Inconsistent results in recent HIV prevention trials of pre-exposure prophylactic interventions may be due to heterogeneity in risk among study participants. Intervention effectiveness is most commonly estimated with the Cox model, which compares event times between populations. When heterogeneity is present, this population-level measure underestimates intervention effectiveness for individuals who are at risk. We propose a likelihood-based Bayesian hierarchical model that estimates the individual-level effectiveness of candidate interventions by accounting for heterogeneity in risk with a compound Poisson-distributed frailty term. This model reflects the mechanisms of HIV risk and allows that some participants are not exposed to HIV and, therefore, have no risk of seroconversion during the study. We assess model performance via simulation and apply the model to data from an HIV prevention trial. PMID:26869051
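The sketch below simulates the compound-Poisson frailty idea, where a fraction of participants is never exposed and cannot seroconvert, and contrasts the individual-level effectiveness with a naive population-level estimate; all parameters are invented, and the incidence is deliberately exaggerated so that the attenuation is visible.

    # Compound-Poisson-style frailty: a Poisson number of exposure episodes, each
    # contributing Gamma-distributed risk, so some participants have zero frailty.
    # Parameters are invented and incidence is exaggerated to make the attenuation
    # of the population-level estimate visible.
    import numpy as np

    rng = np.random.default_rng(9)
    n, follow_up = 20_000, 2.0   # participants, years

    def simulate(base_rate, effectiveness):
        episodes = rng.poisson(1.5, n)
        frailty = rng.gamma(episodes + 1e-12, 0.7)       # essentially 0 when episodes == 0
        rate = base_rate * frailty * (1.0 - effectiveness)
        return rng.exponential(1.0 / np.maximum(rate, 1e-12)) < follow_up

    control = simulate(0.8, 0.0)
    treated = simulate(0.8, 0.6)                          # 60% individual-level effectiveness
    print("naive population-level estimate:", round(1 - treated.mean() / control.mean(), 2))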
Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.
Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H
2016-01-01
Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%, respectively. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite better performance of disease risk score methods than logistic regression and propensity score models in small events per coefficient settings, bias and coverage still deviated from nominal levels.
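Two of the compared approaches, a covariate-adjusted logistic regression and stabilized inverse probability weighting, are sketched below on synthetic confounded data; this is only an illustration of the methods, not a reproduction of the simulation study (and the two estimands, conditional and marginal, differ slightly by construction).

    # Outcome logistic regression vs. stabilized inverse probability weighting
    # (IPW) on synthetic confounded data. Illustrative only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(10)
    n = 5000
    confounder = rng.normal(size=n)
    treated = rng.binomial(1, 1.0 / (1.0 + np.exp(-0.8 * confounder)))
    event = rng.binomial(1, 1.0 / (1.0 + np.exp(-(-2.0 + 0.5 * treated + 0.8 * confounder))))

    # 1) outcome model adjusting for the confounder directly
    out = sm.Logit(event, sm.add_constant(np.column_stack([treated, confounder]))).fit(disp=0)

    # 2) stabilized IPW: propensity model, then a weighted treatment-only model
    ps = sm.Logit(treated, sm.add_constant(confounder)).fit(disp=0).predict()
    w = np.where(treated == 1, treated.mean() / ps, (1 - treated.mean()) / (1 - ps))
    ipw = sm.GLM(event, sm.add_constant(treated), family=sm.families.Binomial(),
                 freq_weights=w).fit()

    print("adjusted log-OR:", round(out.params[1], 2), "| IPW log-OR:", round(ipw.params[1], 2))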
Stiletto, R; Röthke, M; Schäfer, E; Lefering, R; Waydhas, Ch
2006-10-01
Patient safety has become one of the major aspects of clinical management in recent years. Research has focused mainly on malpractice. In contrast to process analysis in non-medical fields, the analysis of errors during inpatient treatment has been neglected. Patient risk management can be defined as a structured procedure in a clinical unit with the aim of reducing harmful events. A risk point model was created based on a Delphi process and founded on the DIVI data register. The risk point model was evaluated in clinically active ICU departments participating in the register database. The results of the risk point evaluation will be integrated into the next database update. This might be a step towards improving the reliability of the register for quality assessment in the ICU.
NASA Space Radiation Program Integrative Risk Model Toolkit
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris
2015-01-01
NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with an opportunity for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to solar particle events; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation by using multi-type blood cell counts; GERMcode for basic physical and biophysical properties for an ion beam, and biophysical and radiobiological properties for beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for the automated count; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.
Meier-Hirmer, Carolina; Schumacher, Martin
2013-06-20
The aim of this article is to propose several methods that allow one to investigate how and whether the shape of the hazard ratio after an intermediate event depends on the waiting time to occurrence of this event and/or the sojourn time in this state. A simple multi-state model, the illness-death model, is used as a framework to investigate the occurrence of this intermediate event. Several approaches are shown and their advantages and disadvantages are discussed. All these approaches are based on Cox regression. As different time-scales are used, these models go beyond Markov models. Different estimation methods for the transition hazards are presented. Additionally, time-varying covariates are included in the model using an approach based on fractional polynomials. The different methods of this article are then applied to a dataset consisting of four studies conducted by the German Breast Cancer Study Group (GBSG). The occurrence of the first isolated locoregional recurrence (ILRR) is studied. The results contribute to the debate on the role of the ILRR with respect to the course of breast cancer and the resulting prognosis. We have investigated different modelling strategies for the transition hazard after ILRR or in general after an intermediate event. Including time-dependent structures altered the resulting hazard functions considerably, and it was shown that this time-dependent structure has to be taken into account in the case of our breast cancer dataset. The results indicate that an early recurrence increases the risk of death. A late ILRR increases the hazard function much less and after the successful removal of the second tumour the risk of death is almost the same as before the recurrence. With respect to distant disease, the appearance of the ILRR only slightly increases the risk of death if the recurrence was treated successfully. It is important to realize that there are several modelling strategies for the intermediate event and that each of these strategies has restrictions and may lead to different results. Especially in the medical literature considering breast cancer development, the time-dependency is often neglected in the statistical analyses. We show that the time-varying variables cannot be neglected in the case of ILRR and that fractional polynomials are a useful tool for finding the functional form of these time-varying variables.
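The basic idea, letting the hazard after the intermediate event depend on the waiting time to that event, can be sketched in a few lines; here the waiting time enters as a single log-transformed covariate rather than through fractional polynomials, the data are synthetic, and the lifelines package is assumed to be available.

    # Transition hazard after an intermediate event (e.g. ILRR) with the waiting
    # time to that event as a covariate. The paper uses fractional polynomials;
    # here a single log term is used for brevity. Synthetic data; assumes lifelines.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(11)
    n = 1500
    wait = rng.uniform(0.5, 8.0, n)                 # years from surgery to recurrence
    hazard = 0.10 * wait ** -0.25                   # hypothetical: early recurrence, higher hazard
    t_death = rng.exponential(1.0 / hazard)

    df = pd.DataFrame({"time": np.minimum(t_death, 10.0),   # censor 10 years after recurrence
                       "event": (t_death <= 10.0).astype(int),
                       "log_wait": np.log(wait)})
    CoxPHFitter().fit(df, "time", "event").print_summary(decimals=2)  # negative coefficient on log_wait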
Targeting high-risk employees may reduce cardiovascular racial disparities.
Burke, James F; Vijan, Sandeep; Chekan, Lynette A; Makowiec, Ted M; Thomas, Laurita; Morgenstern, Lewis B
2014-09-01
A possible remedy for health disparities is for employers to promote cardiovascular health among minority employees. We sought to quantify the financial return to employers of interventions to improve minority health, and to determine whether a race- or risk-targeted strategy was better. Retrospective claims-based cohort analysis. Unconditional per-person costs attributable to stroke and myocardial infarction (MI) were estimated for University of Michigan employees from 2006 to 2009 using a 2-part model. The model was then used to predict the costs of cardiovascular disease to the University for 2 subgroups of employees (minorities and high-risk patients) and to calculate cost-savings thresholds: the point at which the costs of hypothetical interventions (eg, workplace fitness programs) would equal the cost savings from stroke/MI prevention. Of the 38,314 enrollees, 10% were African American. Estimated unconditional payments for stroke/MI were almost the same in African Americans ($128 per employee per year; 95% CI, $79-$177) and whites ($128 per employee per year; 95% CI, $101-$156), including higher event rates and lower payments per event in African Americans. Targeting the highest risk decile with interventions to reduce stroke/MI would result in a substantially higher cost-savings threshold ($81) compared with targeting African Americans ($13). An unanticipated consequence of risk-based targeting is that African Americans would substantially benefit: an intervention targeted at the top risk decile would prevent 75% of the events in African Americans, just as would an intervention that exclusively targeted African Americans. Targeting all high-risk employees for cardiovascular risk reduction may be a win-win-win situation for employers: improving health, decreasing costs, and reducing disparities.
Lu, Tao; Wang, Min; Liu, Guangying; Dong, Guang-Hui; Qian, Feng
2016-01-01
It is well known that there is a strong relationship between HIV viral load and CD4 cell counts in AIDS studies. However, the relationship between them changes during the course of treatment and may vary among individuals. During treatment, some individuals may experience terminal events such as death. Because the terminal event may be related to the individual's viral load measurements, the terminal mechanism is non-ignorable. Furthermore, there exist competing risks from multiple types of events, such as AIDS-related death and other death. Most joint models for the analysis of longitudinal-survival data developed in the literature have focused on constant coefficients and assume symmetric distributions for the endpoints, which does not meet the need to investigate the varying relationship between HIV viral load and CD4 cell counts in practice. We develop a mixed-effects varying-coefficient model with a skewed distribution, coupled with a cause-specific varying-coefficient hazard model with random effects, to deal with the varying relationship between the two endpoints for longitudinal-competing risks survival data. A fully Bayesian inference procedure is established to estimate parameters in the joint model. The proposed method is applied to a multicenter AIDS cohort study. Various candidate models that account for only some of these data features are compared. Some interesting findings are presented.
Alshehry, Zahir H; Mundra, Piyushkumar A; Barlow, Christopher K; Mellett, Natalie A; Wong, Gerard; McConville, Malcolm J; Simes, John; Tonkin, Andrew M; Sullivan, David R; Barnes, Elizabeth H; Nestel, Paul J; Kingwell, Bronwyn A; Marre, Michel; Neal, Bruce; Poulter, Neil R; Rodgers, Anthony; Williams, Bryan; Zoungas, Sophia; Hillis, Graham S; Chalmers, John; Woodward, Mark; Meikle, Peter J
2016-11-22
Clinical lipid measurements do not show the full complexity of the altered lipid metabolism associated with diabetes mellitus or cardiovascular disease. Lipidomics enables the assessment of hundreds of lipid species as potential markers for disease risk. Plasma lipid species (310) were measured by a targeted lipidomic analysis with liquid chromatography electrospray ionization-tandem mass spectrometry on a case-cohort (n=3779) subset from the ADVANCE trial (Action in Diabetes and Vascular Disease: Preterax and Diamicron-MR Controlled Evaluation). The case-cohort was 61% male with a mean age of 67 years. All participants had type 2 diabetes mellitus with ≥1 additional cardiovascular risk factor, and 35% had a history of macrovascular disease. Weighted Cox regression was used to identify lipid species associated with future cardiovascular events (nonfatal myocardial infarction, nonfatal stroke, and cardiovascular death) and cardiovascular death during a 5-year follow-up period. Multivariable models combining traditional risk factors with lipid species were optimized with the Akaike information criterion. C statistics and NRIs were calculated within a 5-fold cross-validation framework. Sphingolipids, phospholipids (including lyso- and ether-species), cholesteryl esters, and glycerolipids were associated with future cardiovascular events and cardiovascular death. The addition of 7 lipid species to a base model (14 traditional risk factors and medications) to predict cardiovascular events increased the C statistic from 0.680 (95% confidence interval [CI], 0.678-0.682) to 0.700 (95% CI, 0.698-0.702; P<0.0001) with a corresponding continuous NRI of 0.227 (95% CI, 0.219-0.235). The prediction of cardiovascular death was improved with the incorporation of 4 lipid species into the base model, showing an increase in the C statistic from 0.740 (95% CI, 0.738-0.742) to 0.760 (95% CI, 0.757-0.762; P<0.0001) and a continuous net reclassification index of 0.328 (95% CI, 0.317-0.339). The results were validated in a subcohort with type 2 diabetes mellitus (n=511) from the LIPID trial (Long-Term Intervention With Pravastatin in Ischemic Disease). The improvement in the prediction of cardiovascular events, above traditional risk factors, demonstrates the potential of plasma lipid species as biomarkers for cardiovascular risk stratification in diabetes mellitus. URL: https://clinicaltrials.gov. Unique identifier: NCT00145925. © 2016 American Heart Association, Inc.
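The cross-validated C statistic reported above can be reproduced in outline with standard survival tooling. The following Python sketch uses simulated data, an unweighted Cox model via the lifelines package, and generic covariates standing in for risk factors and lipid species; the weighted case-cohort analysis of the paper is not reproduced.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index
from sklearn.model_selection import KFold

rng = np.random.default_rng(21)

# Simulated stand-in data: 6 generic covariates playing the role of risk
# factors and lipid species, with survival times driven by three of them.
n = 2000
X = pd.DataFrame(rng.normal(size=(n, 6)), columns=[f"x{i}" for i in range(6)])
lin_pred = 0.5 * X["x0"] + 0.3 * X["x1"] - 0.2 * X["x5"]
time = rng.exponential(np.exp(-lin_pred))
df = X.assign(T=np.minimum(time, 5.0), E=(time < 5.0).astype(int))

# 5-fold cross-validated C statistic for an (unweighted) Cox model.
c_stats = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(df):
    train, test = df.iloc[train_idx], df.iloc[test_idx]
    cph = CoxPHFitter().fit(train, duration_col="T", event_col="E")
    risk = np.asarray(cph.predict_partial_hazard(test.drop(columns=["T", "E"]))).ravel()
    c_stats.append(concordance_index(test["T"], -risk, test["E"]))

print(f"cross-validated C statistic: {np.mean(c_stats):.3f} (SD {np.std(c_stats):.3f})")
```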
IT Operational Risk Measurement Model Based on Internal Loss Data of Banks
NASA Astrophysics Data System (ADS)
Hao, Xiaoling
Bank business operations rely increasingly on information technology (IT), and the most important role of IT is to guarantee the operational continuity of business processes. Therefore, IT risk management efforts need to be seen from the perspective of operational continuity. Traditional IT risk studies have focused on IT asset-based risk analysis and risk-matrix-based qualitative risk evaluation. In practice, IT risk management in the banking industry is still limited to the IT department and is not integrated into business risk management, which causes the two functions to work in isolation. This paper presents an improved methodology for dealing with IT operational risk. It adopts a quantitative measurement method based on internal business loss data about IT events and uses Monte Carlo simulation to predict potential losses. We establish the link between IT resources and business processes so that IT and business risk management can work synergistically.
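As a rough illustration of the quantitative, loss-data-driven approach described above, the sketch below simulates an annual aggregate IT loss distribution with a Poisson event frequency and lognormal per-event severities; all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions (not from the paper): the annual number of IT loss
# events is Poisson, and the per-event business loss is lognormal; both would
# in practice be fitted to the bank's internal loss data.
annual_event_rate = 12.0                  # mean number of loss events per year
severity_mu, severity_sigma = 9.0, 1.2    # lognormal parameters of per-event loss

n_sims = 100_000
annual_losses = np.empty(n_sims)
for i in range(n_sims):
    n_events = rng.poisson(annual_event_rate)
    annual_losses[i] = rng.lognormal(severity_mu, severity_sigma, n_events).sum()

print(f"expected annual loss: {annual_losses.mean():,.0f}")
print(f"99.9% quantile of annual loss: {np.quantile(annual_losses, 0.999):,.0f}")
```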
Migraine and risk of stroke in older adults
Gardener, Hannah; Rundek, Tatjana; Elkind, Mitchell S.V.; Sacco, Ralph L.
2015-01-01
Objective: To examine the association between migraine and stroke/vascular outcomes in a racially/ethnically diverse, older cohort. Methods: Participants from the Northern Manhattan Study, a population-based cohort study of stroke incidence, were assessed for migraine symptoms using a self-report questionnaire based on criteria from the International Classification of Headache Disorders, second edition. We estimated the association of migraine with combined vascular events (including stroke) and with stroke alone over a mean follow-up of 11 years, using Cox models adjusted for sociodemographic and vascular risk factors. Results: Of 1,292 participants (mean age 68 ± 9 years) with migraine data followed prospectively for vascular events, 262 participants (20%) had migraine and 75 (6%) had migraine with aura. No association was found between migraine (with or without aura) and risk of either stroke or combined cardiovascular events. There was an interaction between migraine and current smoking (p = 0.02 in relation to stroke and p = 0.03 for combined vascular events), such that those with migraine and smoking were at an increased risk. The hazard ratio of stroke for migraine among current smokers was 3.17 (95% confidence interval [CI] 1.13–8.85) and among current nonsmokers was 0.77 (95% CI 0.44–1.35). In relation to combined vascular events, the hazard ratio for migraine vs no migraine among current smokers was 1.83 (95% CI 0.89–3.75) and among current nonsmokers was 0.63 (95% CI 0.43–0.94). Conclusion: In our racially/ethnically diverse population-based cohort, migraine was associated with an increased risk of stroke among active smokers but not among nonsmokers. PMID:26203088
NASA Technical Reports Server (NTRS)
Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.
2012-01-01
The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
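The Bayesian updating described above can be illustrated with a conjugate Gamma-Poisson calculation for an event incidence rate; the prior parameters and observed counts below are hypothetical stand-ins, not IMM values.

```python
from scipy import stats

# Hypothetical prior for a medical-event incidence rate (events per person-year),
# e.g. encoded from general-population or submariner data: Gamma(alpha0, beta0),
# where beta0 acts like a prior number of person-years of observation.
alpha0, beta0 = 2.0, 400.0            # prior mean rate = alpha0 / beta0 = 0.005

# Hypothetical in-flight / analog observations: events and person-years at risk.
observed_events, person_years = 1, 150.0

# Conjugate Gamma-Poisson update.
alpha_post = alpha0 + observed_events
beta_post = beta0 + person_years
posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)

print(f"posterior mean rate: {posterior.mean():.4f} events per person-year")
print("95% credible interval:", posterior.ppf([0.025, 0.975]))
```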
Young, Jim; Xiao, Yongling; Moodie, Erica E M; Abrahamowicz, Michal; Klein, Marina B; Bernasconi, Enos; Schmid, Patrick; Calmy, Alexandra; Cavassini, Matthias; Cusini, Alexia; Weber, Rainer; Bucher, Heiner C
2015-08-01
Patients with HIV exposed to the antiretroviral drug abacavir may have an increased risk of cardiovascular disease (CVD). There is concern that this association arises because of a channeling bias. Even if exposure is a risk, it is not clear how that risk changes as exposure cumulates. We assess the effect of exposure to abacavir on the risk of CVD events in the Swiss HIV Cohort Study. We use a new marginal structural Cox model to estimate the effect of abacavir as a flexible function of past exposures while accounting for risk factors that potentially lie on a causal pathway between exposure to abacavir and CVD. A total of 11,856 patients were followed for a median of 6.6 years; 365 patients had a CVD event (4.6 events per 1000 patient-years). In a conventional Cox model, recent--but not cumulative--exposure to abacavir increased the risk of a CVD event. In the new marginal structural Cox model, continued exposure to abacavir during the past 4 years increased the risk of a CVD event (hazard ratio = 2.06; 95% confidence interval: 1.43 to 2.98). The estimated function for the effect of past exposures suggests that exposure during the past 6-36 months caused the greatest increase in risk. Abacavir increases the risk of a CVD event: the effect of exposure is not immediate, rather the risk increases as exposure cumulates over the past few years. This gradual increase in risk is not consistent with a rapidly acting mechanism, such as acute inflammation.
NASA Astrophysics Data System (ADS)
Hartmann, A. J.; Ireson, A. M.
2017-12-01
Chalk aquifers represent an important source of drinking water in the UK. Due to their fractured-porous structure, Chalk aquifers are characterized by highly dynamic groundwater fluctuations that enhance the risk of groundwater flooding. The risk of groundwater flooding can be assessed by physically based groundwater models, but for reliable results, a priori information about the distribution of hydraulic conductivities and porosities is necessary, which is often not available. For that reason, conceptual simulation models are often used to predict groundwater behaviour. They commonly require calibration against historic groundwater observations. Consequently, their prediction performance may degrade significantly for system states that did not occur within the calibration time series. In this study, we calibrate a conceptual model to groundwater level observations at several locations within a Chalk system in Southern England. During the calibration period, no groundwater flooding occurred. We then apply our model to predict the groundwater dynamics of the system at a time that includes a groundwater flooding event. We show that the calibrated model provides reasonable predictions before and after the flooding event but over-estimates groundwater levels during the event. After modifying the model structure to include topographic information, the model is capable of predicting the groundwater flooding event even though groundwater flooding never occurred in the calibration period. Although straightforward, our approach shows how conceptual process-based models can be applied to predict system states and dynamics that did not occur in the calibration period. We believe such an approach can be transferred to similar cases, especially to regions where rainfall intensities are expected to trigger processes and system states that may not yet have been observed.
People's Risk Recognition Preceding Evacuation and Its Role in Demand Modeling and Planning.
Urata, Junji; Pel, Adam J
2018-05-01
Evacuation planning and management involves estimating the travel demand in the event that such action is required. This is usually done as a function of people's decision to evacuate, which we show is strongly linked to their risk awareness. We use an empirical data set on tsunami evacuation behavior to demonstrate that risk recognition is not synonymous with objective risk, but is instead determined by a combination of factors including risk education, information, and sociodemographics, and that it changes dynamically over time. Based on these findings, we formulate an ordered logit model to describe risk recognition combined with a latent class model to describe evacuation choices. Our proposed evacuation choice model, along with a risk recognition class, can quantitatively evaluate the influence of disaster mitigation measures, risk education, and risk information. The results of the risk recognition model show that risk information has a strong impact on whether people recognize that they are at high risk. The results of the evacuation choice model show that people who are unaware of their risk take a longer time to evacuate. © 2017 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Green, Daniel; Yu, Dapeng; Pattison, Ian
2017-04-01
Surface water flooding occurs when intense precipitation events overwhelm the drainage capacity of an area and excess overland flow is unable to infiltrate into the ground or drain via natural or artificial drainage channels, such as river channels, manholes or SuDS. In the UK, over 3 million properties are at risk from surface water flooding alone, accounting for approximately one third of the UK's flood risk. The risk of surface water flooding is projected to increase due to several factors, including population increases, land-use alterations and future climatic changes in precipitation, resulting in an increased magnitude and frequency of intense precipitation events. Numerical inundation modelling is a well-established method of investigating surface water flood risk, allowing the researcher to gain a detailed understanding of the depth, velocity, discharge and extent of actual or hypothetical flood scenarios over a wide range of spatial scales. However, numerical models require calibration of key hydrological and hydraulic parameters (e.g. infiltration, evapotranspiration, drainage rate, roughness) to ensure model outputs adequately represent the flood event being studied. Furthermore, validation data such as crowdsourced images or spatially referenced flood depths collected during a flood event may provide a useful validation of inundation depth and extent for actual flood events. In this study, a simplified two-dimensional inertial-based flood inundation model requiring minimal pre-processing of data (FloodMap-HydroInundation) was used to model a short-duration, intense rainfall event (27.8 mm in 15 minutes) that occurred over the Loughborough University campus on 28 June 2012. High resolution (1 m horizontal, +/- 15 cm vertical) DEM data, rasterised Ordnance Survey topographic structures data and precipitation data recorded at the University weather station were used to conduct numerical modelling over the small (< 2 km²), contained urban catchment. To validate model outputs and allow a reconstruction of spatially referenced flood depth and extent during the flood event, crowdsourced images were obtained from social media (Twitter) and from individuals present during the flood event via the University noticeboards, as well as using dGPS flood depth data collected at one of the worst affected areas. An investigation into the sensitivity of key model parameters suggests that the numerical model code is highly sensitive to changes within the recommended range of roughness and infiltration values, as well as changes in DEM and building mesh resolutions, but less sensitive to changes in evapotranspiration and drainage capacity parameters. The study also demonstrates the potential of using crowdsourced images to validate urban surface water flood models and inform parameterisation when calibrating numerical inundation models.
Parikh, Nisha I.; Jeppson, Rebecca P.; Berger, Jeffrey S.; Eaton, Charles B.; Kroenke, Candyce H.; LeBlanc, Erin S.; Lewis, Cora E.; Loucks, Eric B.; Parker, Donna R.; Rillamas-Sun, Eileen; Ryckman, Kelli K; Waring, Molly E.; Schenken, Robert S.; Johnson, Karen C; Edstedt-Bonamy, Anna-Karin; Allison, Matthew A.; Howard, Barbara V.
2016-01-01
Background Reproductive factors provide an early window into a woman’s coronary heart disease (CHD) risk; however, their contribution to CHD risk stratification is uncertain. Methods and Results In the Women’s Health Initiative Observational Study, we constructed Cox proportional hazards models for CHD including age, pregnancy status, number of live births, age at menarche, menstrual irregularity, age at first birth, stillbirths, miscarriages, infertility ≥ 1 year, infertility cause, and breastfeeding. We next added each candidate reproductive factor to an established CHD risk factor model. A final model was then constructed with significant reproductive factors added to established CHD risk factors. Improvement in C-statistic, net reclassification index (or NRI with risk categories of <5%, 5–<10%, and ≥10% 10-year risk of CHD) and integrated discriminatory index (IDI) were assessed. Among 72,982 women [n=4607 CHD events, median follow-up=12.0 (IQR=8.3–13.7) years, mean (SD) age 63.2 (7.2) years], an age-adjusted reproductive risk factor model had a C-statistic of 0.675 for CHD. In a model adjusted for established CHD risk factors, younger age at first birth, number of stillbirths, number of miscarriages and lack of breastfeeding were positively associated with CHD. Reproductive factors modestly improved model discrimination (C-statistic increased from 0.726 to 0.730; IDI=0.0013, p-value < 0.0001). Net reclassification for women with events was not improved (NRI events=0.007, p-value=0.18); and for women without events was marginally improved (NRI non-events=0.002, p-value=0.04). Conclusions Key reproductive factors are associated with CHD independently of established CHD risk factors, very modestly improve model discrimination and do not materially improve net reclassification. PMID:27143682
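A categorical NRI of the kind reported above (risk categories <5%, 5-<10%, ≥10%) can be computed as in the following sketch; the data are simulated, not WHI data.

```python
import numpy as np

def categorical_nri(p_old, p_new, event, cuts=(0.05, 0.10)):
    """Event and non-event NRI for risk categories defined by the cut-points."""
    old_cat = np.digitize(p_old, cuts)
    new_cat = np.digitize(p_new, cuts)
    up, down = new_cat > old_cat, new_cat < old_cat
    ev, ne = event == 1, event == 0
    return up[ev].mean() - down[ev].mean(), down[ne].mean() - up[ne].mean()

# Simulated example: a true risk drives events; the "new" model is a slightly
# less noisy estimate of that risk than the "old" one.
rng = np.random.default_rng(0)
n = 20_000
true_risk = rng.beta(1.2, 12.0, n)
event = rng.binomial(1, true_risk)
p_old = np.clip(true_risk + rng.normal(0, 0.05, n), 0, 1)
p_new = np.clip(true_risk + rng.normal(0, 0.03, n), 0, 1)

nri_e, nri_ne = categorical_nri(p_old, p_new, event)
print(f"NRI events: {nri_e:.3f}, NRI non-events: {nri_ne:.3f}")
```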
Syndromic surveillance system based on near real-time cattle mortality monitoring.
Torres, G; Ciaravino, V; Ascaso, S; Flores, V; Romero, L; Simón, F
2015-05-01
Early detection of an infectious disease incursion will minimize the impact of outbreaks in livestock. Syndromic surveillance based on the analysis of readily available data can enhance traditional surveillance systems and allow veterinary authorities to react in a timely manner. This study was based on monitoring the number of cattle carcasses sent for rendering in the veterinary unit of Talavera de la Reina (Spain). The aim was to develop a system to detect deviations from expected values which would signal unexpected health events. A historical weekly collected dead cattle (WCDC) time series, stabilized by the Box-Cox transformation and fitted by the least squares method, was used to build a univariate cyclic regression model based on a Fourier transformation. Three different models, according to type of production system, were built to estimate the baseline expected number of WCDC. Two types of risk signals were generated: point risk signals, when the observed value was greater than the upper 95% confidence interval of the expected baseline, and cumulative risk signals, generated by a modified cumulative sum algorithm, when the cumulative sum of reported deaths was above the cumulative sum of expected deaths. Data from 2011 were used to prospectively validate the model, generating seven risk signals. None of them corresponded to infectious disease events, but some coincided in time with very high temperatures recorded in the region. The harvest effect was also observed during the first week of the study year. Establishing appropriate risk signal thresholds is a limiting factor of predictive models; thresholds need to be adjusted based on experience gained during use of the models. To increase the sensitivity and specificity of the predictions, epidemiological interpretation of non-specific risk signals should be complemented by other sources of information. The methodology developed in this study can enhance other existing early detection surveillance systems. Syndromic surveillance based on mortality monitoring can reduce the detection time for certain disease outbreaks associated with mild mortality that is only detectable at the regional level. The methodology can be adapted to monitor other parameters routinely collected at farm level which can be influenced by communicable diseases. Copyright © 2015 Elsevier B.V. All rights reserved.
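A minimal version of the detection scheme described above (a harmonic, Fourier-type baseline fitted by least squares, point signals above an upper bound, and cumulative signals from a simple one-sided CUSUM) is sketched below on simulated weekly counts; the Box-Cox step and the exact CUSUM variant used in the study are simplified assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated weekly dead-cattle counts with an annual cycle (illustrative only).
weeks = np.arange(52 * 5)
counts = rng.poisson(30 + 8 * np.sin(2 * np.pi * weeks / 52))

# Harmonic (cyclic) regression: intercept plus one annual sine/cosine pair,
# fitted by ordinary least squares.
X = np.column_stack([np.ones(len(weeks)),
                     np.sin(2 * np.pi * weeks / 52),
                     np.cos(2 * np.pi * weeks / 52)])
beta, *_ = np.linalg.lstsq(X, counts, rcond=None)
expected = X @ beta
resid_sd = np.std(counts - expected)

# Point risk signals: observations above an approximate upper 95% bound.
point_signals = np.where(counts > expected + 1.96 * resid_sd)[0]

# Cumulative risk signals: one-sided CUSUM of (observed - expected) with a
# small allowance k; an alarm fires when the statistic exceeds h.
k, h = 0.5 * resid_sd, 4.0 * resid_sd
s, cusum_signals = 0.0, []
for week, (obs, exp_val) in enumerate(zip(counts, expected)):
    s = max(0.0, s + (obs - exp_val) - k)
    if s > h:
        cusum_signals.append(week)
        s = 0.0

print("point signals at weeks:", point_signals[:10])
print("CUSUM signals at weeks:", cusum_signals[:10])
```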
Du, Mark; Chase, Monica; Oguz, Mustafa; Davies, Glenn
2017-09-01
To evaluate long-term health benefits and risks of adding vorapaxar (VOR) to the standard care antiplatelet therapy (SC) of aspirin and/or clopidogrel, among a population with a recent myocardial infarction (MI) and/or peripheral artery disease (PAD). In a state-transition model, patients transition between health states (event-free, recurrent MI, stroke, death), while at risk of experiencing non-transition-related revascularization and non-fatal bleeding events. Risk equations developed from the TRA 2°P-TIMI 50 trial's patient-level data were used to predict cardiovascular (CV) outcomes over longer time horizons. Additional sources, including trials and US-based observational studies, informed the inputs for short-term CV risk, non-CV death, and health-related quality of life. Survival and quality-adjusted life-years (QALYs) were estimated over a lifetime horizon, discounted at 3% per year. Within a cohort of 7361 patients with recent MI and/or PAD, VOR + SC relative to SC alone yielded 176 fewer CV events (MIs, strokes, or CV deaths), but 27 more major bleeding events. VOR + SC was associated with increased life expectancy and health benefits (19.93 undiscounted life-years [LYs], 9.57 discounted QALYs vs. 19.61 undiscounted LYs, 9.41 discounted QALYs). The results were most sensitive to scenarios varying the time of vorapaxar initiation and to the assumptions for the 90-day period post-MI. Additional analyses showed that add-on vorapaxar provides consistent incremental benefits in high-risk subgroups. This study contributes to the growing literature on secondary prevention add-on therapy, as results from these modeling analyses suggest that adding vorapaxar to SC for patients at high atherothrombotic risk can provide long-term health benefits.
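The state-transition logic described above can be sketched as a simple Markov cohort model with annual cycles and 3% discounting; the states, transition probabilities, and utilities below are purely illustrative and are not the risk equations derived from the TRA 2°P-TIMI 50 trial.

```python
import numpy as np

# Illustrative states: 0 event-free, 1 post-MI, 2 post-stroke, 3 dead.
P = np.array([
    [0.95, 0.02, 0.01, 0.02],   # annual transition probabilities (illustrative)
    [0.00, 0.93, 0.01, 0.06],
    [0.00, 0.00, 0.92, 0.08],
    [0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.85, 0.75, 0.65, 0.0])   # illustrative health-state utilities
discount = 0.03
horizon = 40                                  # years

cohort = np.array([1.0, 0.0, 0.0, 0.0])       # everyone starts event-free
life_years = qalys = 0.0
for year in range(horizon):
    life_years += cohort[:3].sum() / (1 + discount) ** year
    qalys += (cohort * utility).sum() / (1 + discount) ** year
    cohort = cohort @ P                       # advance one annual cycle

print(f"discounted life-years: {life_years:.2f}, discounted QALYs: {qalys:.2f}")
```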
Frolov, Alexander Vladimirovich; Vaikhanskaya, Tatjana Gennadjevna; Melnikova, Olga Petrovna; Vorobiev, Anatoly Pavlovich; Guel, Ludmila Michajlovna
2017-01-01
The development of prognostic factors for life-threatening ventricular tachyarrhythmias (VTA) and sudden cardiac death (SCD) remains a priority in cardiology. The development of a method of personalised prognosis based on multifactorial analysis of the risk factors associated with life-threatening heart rhythm disturbances is considered a key research and clinical task. To design a prognostic and mathematical model to define personalised risk for life-threatening VTA in patients with chronic heart failure (CHF). The study included 240 patients with CHF (mean age of 50.5 ± 12.1 years; left ventricular ejection fraction 32.8 ± 10.9%; follow-up period 36.8 ± 5.7 months). The participants received basic therapy for heart failure. The electrocardiogram (ECG) markers of myocardial electrical instability were assessed, including microvolt T-wave alternans, heart rate turbulence, heart rate deceleration, and QT dispersion. Additionally, echocardiography and Holter monitoring (HM) were performed. The cardiovascular events were considered as primary endpoints, including SCD, paroxysmal ventricular tachycardia/ventricular fibrillation (VT/VF) based on HM-ECG data, and data obtained from implantable device interrogation (CRT-D, ICD) as well as appropriate shocks. During the follow-up period, 66 (27.5%) subjects with CHF showed adverse arrhythmic events, including nine SCD events and 57 VTAs. Data from a stepwise discriminant analysis of cumulative ECG markers of myocardial electrical instability were used to build a mathematical model for preliminary VTA risk stratification. Uni- and multivariate Cox logistic regression analyses were performed to define an individualised risk stratification model for SCD/VTA. A binary logistic regression model demonstrated a high prognostic significance of the discriminant function, with a classification sensitivity of 80.8% and specificity of 99.1% (F = 31.2; χ2 = 143.2; p < 0.0001). The method of personalised risk stratification using Cox logistic regression allows correct classification of more than 93.9% of CHF cases. The robust prognostic performance of the logistic regression model for VTA risk supports including this method in the algorithm of subsequent monitoring and selection of the optimal treatment modality for patients with CHF.
Siberry, George K; Harris, D. Robert; Oliveira, Ricardo Hugo; Krauss, Margot R.; Hofer, Cristina B.; Tiraboschi, Adriana Aparecida; Marques, Heloisa; Succi, Regina C.; Abreu, Thalita; Negra, Marinella Della; Mofenson, Lynne M.; Hazra, Rohan
2012-01-01
Background This study evaluated a wide range of viral load (VL) thresholds to identify a cut-point that best predicts new clinical events in children on stable highly-active antiretroviral therapy (HAART). Methods Cox proportional hazards modeling was used to assess the adjusted risk of World Health Organization stage 3 or 4 clinical events (WHO events) as a function of time-varying CD4, VL, and hemoglobin values in a cohort study of Latin American children on HAART ≥ 6 months. Models were fit using different VL cut-points between 400 and 50,000 copies/mL, with model fit evaluated on the basis of the minimum Akaike Information Criterion (AIC) value, a standard model fit statistic. Results Models were based on 67 subjects with WHO events out of 550 subjects on study. The VL cut-points of > 2600 copies/mL and > 32,000 copies/mL corresponded to the lowest AIC values and were associated with the highest hazard ratios [2.0 (p = 0.015) and 2.1 (p = 0.0058), respectively] for WHO events. Conclusions In HIV-infected Latin American children on stable HAART, two distinct VL thresholds (> 2600 copies/mL and > 32,000 copies/mL) were identified for predicting children at significantly increased risk of HIV-related clinical illness, after accounting for CD4 level, hemoglobin level, and other significant factors. PMID:22343177
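The threshold-search idea (fit a model for each candidate viral load cut-point and compare AIC values) can be sketched as follows with simulated data and the lifelines package; the AIC here is computed from the Cox partial log-likelihood, and the time-varying covariates of the actual analysis are omitted.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)

# Simulated cohort (not the Latin American data): a higher viral load shortens
# the time to a WHO stage 3 or 4 event, with an underlying threshold near 10^3.8.
n = 600
log_vl = rng.normal(3.5, 1.0, n)                     # log10 copies/mL
hazard = 0.02 * np.exp(0.8 * (log_vl > 3.8))
time = rng.exponential(1.0 / hazard)
event = (time < 5.0).astype(int)
time = np.minimum(time, 5.0)

# Fit a Cox model for each candidate cut-point and compare AIC values
# (computed from the partial log-likelihood; smaller is better).
results = []
for cut in [400, 1000, 2600, 5000, 10_000, 32_000, 50_000]:
    df = pd.DataFrame({"T": time, "E": event,
                       "vl_above_cut": (10 ** log_vl > cut).astype(int)})
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    results.append((cut, -2.0 * cph.log_likelihood_ + 2.0 * len(cph.params_)))

best_cut, best_aic = min(results, key=lambda r: r[1])
print(f"cut-point with the lowest AIC: {best_cut} copies/mL (AIC = {best_aic:.1f})")
```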
GERMcode: A Stochastic Model for Space Radiation Risk Assessment
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.
2012-01-01
A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal aberrations, and representative mouse tumor induction curves are implemented into the GERMcode. Extension of these descriptions to other endpoints related to non-targeted effects and biochemical pathway responses will be discussed.
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel
2017-04-01
Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, the incomplete ensemble, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
Campbell, William; Ganna, Andrea; Ingelsson, Erik; Janssens, A Cecile J W
2016-01-01
We propose a new measure for assessing the performance of risk models, the area under the prediction impact curve (auPIC), which quantifies the performance of risk models in terms of their average health impact in the population. Using simulated data, we explain how the prediction impact curve (PIC) estimates the percentage of events prevented when a risk model is used to assign high-risk individuals to an intervention. We apply the PIC to the Atherosclerosis Risk in Communities (ARIC) Study to illustrate its application to the prevention of coronary heart disease. We estimated that if the ARIC cohort received statins at baseline, 5% of events would be prevented when the risk model was evaluated at a cutoff threshold of 20% predicted risk, compared to 1% when individuals were assigned to the intervention without the use of a model. By calculating the auPIC, we estimated that an average of 15% of events would be prevented when considering performance across the entire interval. We conclude that the PIC is a clinically meaningful measure for quantifying the expected health impact of risk models that supplements existing measures of model performance. Copyright © 2016 Elsevier Inc. All rights reserved.
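One possible reading of the prediction impact curve is sketched below: for each threshold, the share of expected events prevented if everyone at or above the threshold receives an intervention with an assumed relative risk reduction, with the auPIC taken as the average over the threshold interval. The data, the 30% risk reduction, and this exact construction are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated cohort: a true 10-year risk and a risk model that estimates it with
# noise (illustrative data, not the ARIC Study).
n = 50_000
true_risk = rng.beta(1.5, 10.0, n)
pred_risk = np.clip(true_risk + rng.normal(0, 0.04, n), 0, 1)
rrr = 0.30   # assumed relative risk reduction of the intervention (e.g. statins)

def pct_events_prevented(pred, truth, threshold, rrr):
    """Share of all expected events prevented if those at/above threshold are treated."""
    treated = pred >= threshold
    return rrr * truth[treated].sum() / truth.sum()

thresholds = np.linspace(0.01, 0.99, 99)
pic = np.array([pct_events_prevented(pred_risk, true_risk, t, rrr) for t in thresholds])
au_pic = pic.mean()   # average impact over the whole threshold interval

print(f"events prevented at a 20% threshold: "
      f"{pct_events_prevented(pred_risk, true_risk, 0.20, rrr):.1%}")
print(f"area under the prediction impact curve (auPIC): {au_pic:.1%}")
```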
A flexible count data regression model for risk analysis.
Guikema, Seth D; Coffelt, Jeremy P; Goffelt, Jeremy P
2008-02-01
In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLMs) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM fits overdispersed data sets as well as the commonly used existing models while outperforming those models for underdispersed data sets.
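The distributional building block of the proposed GLM, the Conway-Maxwell Poisson, is easy to sketch. The code below defines its log pmf with a truncated normalizing constant and fits lambda and nu by maximum likelihood to simulated underdispersed counts; the regression (covariate) part of the COM GLM is omitted, so this is only the distribution, not the paper's full model.

```python
import numpy as np
from scipy.optimize import minimize

def com_poisson_logpmf(y, lam, nu, max_terms=200):
    """Log pmf of the Conway-Maxwell Poisson distribution (truncated normalizer)."""
    j = np.arange(max_terms)
    log_factorials = np.cumsum(np.log(np.maximum(j, 1)))   # log(j!)
    log_z = np.logaddexp.reduce(j * np.log(lam) - nu * log_factorials)
    y = np.asarray(y)
    log_fact_y = np.cumsum(np.log(np.maximum(np.arange(y.max() + 1), 1)))[y]
    return y * np.log(lam) - nu * log_fact_y - log_z

def neg_loglik(params, y):
    lam, nu = np.exp(params)          # optimize on the log scale for positivity
    return -np.sum(com_poisson_logpmf(y, lam, nu))

# Simulated underdispersed counts (variance < mean), the case the COM GLM targets.
rng = np.random.default_rng(11)
y = rng.binomial(20, 0.15, size=500)  # mean 3.0, variance 2.55

res = minimize(neg_loglik, x0=np.log([3.0, 1.0]), args=(y,), method="Nelder-Mead")
lam_hat, nu_hat = np.exp(res.x)
print(f"lambda = {lam_hat:.2f}, nu = {nu_hat:.2f} (nu > 1 indicates underdispersion)")
```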
NASA Astrophysics Data System (ADS)
Okolelova, Ella; Shibaeva, Marina; Shalnev, Oleg
2018-03-01
The article analyses risks in high-rise construction in terms of investment value, taking into account the maximum probable loss in case of a risk event. The authors scrutinized the risks of high-rise construction in regions with various geographic, climatic and socio-economic conditions that may influence the project environment. Risk classification is presented in general terms, including aggregated characteristics of risks common to many regions. Cluster analysis tools, which allow generalized groups of risk to be considered according to their qualitative and quantitative features, were used to model the influence of the risk factors on the implementation of the investment project. For convenience of further calculations, each type of risk is assigned a separate code with the number of the cluster and the subtype of risk. This approach and the coding of risk factors make it possible to build a risk matrix, which greatly facilitates the task of determining the degree of impact of risks. The authors clarified and expanded the concept of price risk, defined as the expected value of the event, which extends the capabilities of the model and allows estimating an interval for the probability of occurrence as well as using other probabilistic methods of calculation.
ERMiT: Estimating Post-Fire Erosion in Probabilistic Terms
NASA Astrophysics Data System (ADS)
Pierson, F. B.; Robichaud, P. R.; Elliot, W. J.; Hall, D. E.; Moffet, C. A.
2006-12-01
Mitigating the impact of post-wildfire runoff and erosion on life, property, and natural resources has cost the United States government tens of millions of dollars over the past decade. The decision of where, when, and how to apply the most effective mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) is a web-based application that estimates erosion in probabilistic terms on burned and recovering forest, range, and chaparral lands. Unlike most erosion prediction models, ERMiT does not provide 'average annual erosion rates'; rather, it provides a distribution of erosion rates with the likelihood of their occurrence. ERMiT combines rain event variability with spatial and temporal variabilities of hillslope burn severity, soil properties, and ground cover to estimate Water Erosion Prediction Project (WEPP) model input parameter values. Based on 20 to 40 individual WEPP runs, ERMiT produces a distribution of rain event erosion rates with a probability of occurrence for each of five post-fire years. Over the 5 years of modeled recovery, the occurrence probability of the less erodible soil parameters is increased and the occurrence probability of the more erodible soil parameters is decreased. In addition, the occurrence probabilities and the four spatial arrangements of burn severity (arrangements of overland flow elements (OFEs)) are shifted toward lower burn severity with each year of recovery. These yearly adjustments are based on field measurements made through post-fire recovery periods. ERMiT also provides rain event erosion rate distributions for hillslopes that have been treated with seeding, straw mulch, straw wattles and contour-felled log erosion barriers. Such output can help managers make erosion mitigation treatment decisions based on the probability of high sediment yields occurring, the value of resources at risk for damage, cost, and other management considerations.
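The probabilistic output format described above (erosion rates paired with occurrence probabilities rather than a single average) can be summarized as an exceedance probability, as in this sketch; the rates and weights are invented for illustration, not ERMiT output.

```python
import numpy as np

# Hypothetical output of a batch of WEPP runs for one post-fire year: each run
# carries a rain-event erosion rate (t/ha) and an occurrence probability
# (weights sum to 1); the numbers are invented for illustration.
erosion_rates = np.array([0.0, 0.1, 0.3, 0.8, 1.5, 3.0, 6.0, 12.0, 20.0, 35.0])
probabilities = np.array([0.40, 0.18, 0.12, 0.09, 0.07, 0.05, 0.04, 0.03, 0.015, 0.005])

def exceedance_probability(threshold):
    """Probability that the event erosion rate exceeds the given threshold."""
    return probabilities[erosion_rates > threshold].sum()

for thr in (1.0, 5.0, 10.0):
    print(f"P(erosion > {thr:4.1f} t/ha) = {exceedance_probability(thr):.2f}")
```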
Sacco, Ralph L.; Khatri, Minesh; Rundek, Tatjana; Xu, Qiang; Gardener, Hannah; Boden-Albala, Bernadette; Di Tullio, Marco R.; Homma, Shunichi; Elkind, Mitchell SV; Paik, Myunghee C
2010-01-01
Objective To improve global vascular risk prediction with behavioral and anthropometric factors. Background Few cardiovascular risk models are designed to predict the global vascular risk of MI, stroke, or vascular death in multi-ethnic individuals, and existing schemes do not fully include behavioral risk factors. Methods A randomly-derived, population-based, prospective cohort of 2737 community participants free of stroke and coronary artery disease was followed annually for a median of 9.0 years in the Northern Manhattan Study (mean age 69 years; 63.2% women; 52.7% Hispanic, 24.9% African-American, 19.9% white). A global vascular risk score (GVRS) predictive of stroke, myocardial infarction, or vascular death was developed by adding variables to the traditional Framingham cardiovascular variables based on the likelihood ratio criterion. Model utility was assessed through receiver operating characteristics, calibration, and effect on reclassification of subjects. Results Variables which significantly added to the traditional Framingham profile included waist circumference, alcohol consumption, and physical activity. Continuous measures for blood pressure and fasting blood sugar were used instead of hypertension and diabetes. Ten-year event-free probabilities were 0.95 for the first quartile of GVRS, 0.89 for the second quartile, 0.79 for the third quartile, and 0.56 for the fourth quartile. The addition of behavioral factors in our model improved prediction of 10-year event rates compared to a model restricted to the traditional variables. Conclusion A global vascular risk score that combines traditional, behavioral, and anthropometric risk factors, uses continuous variables for physiological parameters, and is applicable to non-white subjects could improve primary prevention strategies. PMID:19958966
Seismic Risk Assessment for the Kyrgyz Republic
NASA Astrophysics Data System (ADS)
Pittore, Massimiliano; Sousa, Luis; Grant, Damian; Fleming, Kevin; Parolai, Stefano; Fourniadis, Yannis; Free, Matthew; Moldobekov, Bolot; Takeuchi, Ko
2017-04-01
The Kyrgyz Republic is one of the most socially and economically dynamic countries in Central Asia, and one of the most endangered by earthquake hazard in the region. In order to support the government of the Kyrgyz Republic in the development of a country-level Disaster Risk Reduction strategy, a comprehensive seismic risk study has been developed with the support of the World Bank. As part of this project, state-of-the-art hazard, exposure and vulnerability models have been developed and combined into the assessment of direct physical and economic risk on residential, educational and transportation infrastructure. The seismic hazard has been modelled with three different approaches, in order to provide a comprehensive overview of the possible consequences. A probabilistic seismic hazard assessment (PSHA) approach has been used to quantitatively evaluate the distribution of expected ground shaking intensity, as constrained by the compiled earthquake catalogue and associated seismic source model. A set of specific seismic scenarios based on events generated from known fault systems has also been considered, in order to provide insight on the expected consequences in case of strong events in proximity of densely inhabited areas. Furthermore, long-span catalogues of events have been generated stochastically and employed in the probabilistic analysis of expected losses over the territory of the Kyrgyz Republic. Damage and risk estimates have been computed by using an exposure model recently developed for the country, combined with the assignment of suitable fragility/vulnerability models. The risk estimation has been carried out with spatial aggregation at the district (rayon) level. The obtained results confirm the high level of seismic risk throughout the country, also pinpointing the location of several risk hotspots, particularly in the southern districts around the Ferghana valley. The outcome of this project will further support the local decision makers in implementing specific prevention and mitigation measures that are consistent with a broad risk reduction strategy.
Bayesian Approach for Flexible Modeling of Semicompeting Risks Data
Han, Baoguang; Yu, Menggang; Dignam, James J.; Rathouz, Paul J.
2016-01-01
Summary Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. PMID:25274445
J Waves for Predicting Cardiac Events in Hypertrophic Cardiomyopathy.
Tsuda, Toyonobu; Hayashi, Kenshi; Konno, Tetsuo; Sakata, Kenji; Fujita, Takashi; Hodatsu, Akihiko; Nagata, Yoji; Teramoto, Ryota; Nomura, Akihiro; Tanaka, Yoshihiro; Furusho, Hiroshi; Takamura, Masayuki; Kawashiri, Masa-Aki; Fujino, Noboru; Yamagishi, Masakazu
2017-10-01
This study sought to investigate whether the presence of J waves was associated with cardiac events in patients with hypertrophic cardiomyopathy (HCM). It has been uncertain whether the presence of J waves predicts life-threatening cardiac events in patients with HCM. This study evaluated 338 consecutive patients with HCM (207 men; mean age 61 ± 17 years). A J wave was defined as J-point elevation >0.1 mV in at least 2 contiguous inferior and/or lateral leads. Cardiac events were defined as sudden cardiac death, ventricular fibrillation or sustained ventricular tachycardia, or appropriate implantable cardiac defibrillator therapy. The study also investigated whether adding J waves to a conventional risk model improved the prediction of cardiac events. J waves were seen in 46 (13.6%) patients at registration. Cardiac events occurred in 31 patients (9.2%) during a median follow-up of 4.9 years (interquartile range: 2.6 to 7.1 years). In a Cox proportional hazards model, the presence of J waves was significantly associated with cardiac events (adjusted hazard ratio: 4.01; 95% confidence interval [CI]: 1.78 to 9.05; p = 0.001). Compared with the conventional risk model, the model using J waves in addition to conventional risks better predicted cardiac events (net reclassification improvement, 0.55; 95% CI: 0.20 to 0.90; p = 0.002). The presence of J waves was significantly associated with cardiac events in HCM. Adding J waves to conventional cardiac risk factors improved prediction of cardiac events. Further confirmatory studies are needed before considering J-point elevation as a marker of risk for use in making management decisions regarding risk in patients with HCM. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
A risk-based multi-objective model for optimal placement of sensors in water distribution system
NASA Astrophysics Data System (ADS)
Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein
2018-02-01
In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for optimal placement of sensors in a water distribution system (WDS). This model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using a CVaR approach. The CVaR approach treats the uncertainties of contamination injection as a probability distribution function and captures low-probability extreme events, which occur in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses from contamination injection (through the CVaR of affected population and detection time) together with the two other main criteria of optimal sensor placement: the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also performed to investigate the influence of each criterion on the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined through applying it to the Lamerd WDS in the southwestern part of Iran. The PROMETHEE analysis suggests 6 sensors with a suitable distribution that cover approximately all regions of the WDS. For the best solution, the CVaR of affected population, the CVaR of detection time, and the probability of undetected events are 17,055 persons, 31 minutes and 0.045%, respectively. The results for the Lamerd WDS show the applicability of the CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection and evaluating extreme losses in a WDS.
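The CVaR building block used above is straightforward to compute from a Monte Carlo sample of losses: the mean of the losses beyond the Value at Risk quantile. A minimal sketch with a hypothetical sample of affected-population losses:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Value at Risk and Conditional Value at Risk (mean of the worst 1 - alpha tail)."""
    losses = np.asarray(losses)
    var = np.quantile(losses, alpha)
    return var, losses[losses >= var].mean()

# Hypothetical Monte Carlo sample of the "affected population" consequence for
# one candidate sensor layout under random multi-point contamination injections.
rng = np.random.default_rng(5)
affected_population = rng.lognormal(mean=8.5, sigma=0.8, size=10_000)

var95, cvar95 = cvar(affected_population, alpha=0.95)
print(f"VaR(95%): {var95:,.0f} persons, CVaR(95%): {cvar95:,.0f} persons")
```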
NASA Astrophysics Data System (ADS)
Moser, M.
2009-04-01
The catchment Gadeinerbach in the District of Lungau/Salzburg/Austria is prone to debris flows. Large debris flow events date back to 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential; the catchment resembles a "sleeping torrential giant". To plan mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, field study and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the Gadeinerbach alluvial fan, but some important hazard indicators could nevertheless be found. With the hazard indicators and photo analysis from the large 1934 debris flow event, the character of the catchment could be established. With the help of these historical data sets (hazard indicators, sediment and debris amounts, etc.) it is possible to calibrate the numerical models and to gain useful knowledge about their pros, cons and application. The results were used to simulate the design event and furthermore to derive mitigation measures. The most effective protection against debris, a reduction of the high energy level to a lower level combined with a debris/bedload deposition area, was therefore implemented. Expert opinion, the study of historical data and field work are, in addition to numerical simulation techniques, essential for work in the field of natural hazard management.
Building a Database for a Quantitative Model
NASA Technical Reports Server (NTRS)
Kahn, C. Joseph; Kleinhammer, Roger
2014-01-01
A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not aid in linking the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian Updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
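The linking pattern described above (a unique metadata key shared by the model's Basic Events and the data-source table, plus the manipulations stored alongside the data) can be sketched outside Excel as well, for example with pandas; the identifiers, rates, and stressing factor below are hypothetical.

```python
import pandas as pd

# Hypothetical data-source table: one row per source, keyed by a unique
# metadata field that also appears on every Basic Event in the model.
data_sources = pd.DataFrame([
    {"source_id": "DS-001", "description": "valve actuator failures", "failures": 3, "hours": 1.2e6},
    {"source_id": "DS-002", "description": "pressure transducer drift", "failures": 7, "hours": 4.0e5},
])
# Manipulations stored with the data: a point estimate and an assumed
# duty-cycle stressing factor (illustrative values).
data_sources["rate_per_hour"] = data_sources["failures"] / data_sources["hours"]
data_sources["stressed_rate"] = data_sources["rate_per_hour"] * 1.5

# Hypothetical model-side table of Basic Events carrying the same key.
basic_events = pd.DataFrame([
    {"basic_event": "BE-PUMP-A-FTS", "source_id": "DS-001"},
    {"basic_event": "BE-XDCR-1-DRIFT", "source_id": "DS-002"},
])

# The join is the traceability link from each Basic Event back to its data
# source and all relevant calculations.
linked = basic_events.merge(data_sources, on="source_id", how="left")
print(linked[["basic_event", "source_id", "stressed_rate"]])
```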
A data-based model to locate mass movements triggered by seismic events in Sichuan, China.
de Souza, Fabio Teodoro
2014-01-01
Earthquakes affect the entire world and have catastrophic consequences. On May 12, 2008, an earthquake of magnitude 7.9 on the Richter scale occurred in the Wenchuan area of Sichuan province in China. This event, together with subsequent aftershocks, caused many avalanches, landslides, debris flows, collapses, and quake lakes and induced numerous unstable slopes. This work proposes a methodology that uses a data mining approach and geographic information systems to predict these mass movements based on their association with the main and aftershock epicenters, geologic faults, riverbeds, and topography. A dataset comprising 3,883 mass movements is analyzed, and some models to predict the location of these mass movements are developed. These predictive models could be used by the Chinese authorities as an important tool for identifying risk areas and rescuing survivors during similar events in the future.
Modeling the Historical Flood Events in France
NASA Astrophysics Data System (ADS)
Ali, Hani; Blaquière, Simon
2017-04-01
We will present the simulation results for different scenarios based on the flood model developed by the AXA Global P&C CAT Modeling team. The model uses a Digital Elevation Model (DEM) with 75 m resolution, a hydrographic system (DB Carthage), daily rainfall data from "Météo France", water levels from "HYDRO Banque", the French hydrological database (www.hydro.eaufrance.fr), for more than 1500 stations, a hydrological model from IRSTEA and an in-house hydraulic tool. In particular, the model re-simulates the most important and costly flood events that occurred during the past decade in France: we will present the re-simulated meteorological conditions since 1964 and estimate the insurance losses incurred on the current AXA portfolio of individual risks.
Risk of cardiac events in Long QT syndrome patients when taking antiseizure medications.
Auerbach, David S; Biton, Yitschak; Polonsky, Bronislava; McNitt, Scott; Gross, Robert A; Dirksen, Robert T; Moss, Arthur J
2018-01-01
Many antiseizure medications (ASMs) affect ion channel function. We investigated whether ASMs alter the risk of cardiac events in patients with corrected QT (QTc) prolongation. The study included people from the Rochester-based Long QT syndrome (LQTS) Registry with baseline QTc prolongation and a history of ASM therapy (n = 296). Using multivariate Anderson-Gill models, we assessed the risk of recurrent cardiac events associated with ASM therapy. We stratified by LQTS genotype and predominant mechanism of ASM action (Na+ channel blockers and gamma-aminobutyric acid modifiers). There was an increased risk of cardiac events when participants with QTc prolongation were taking vs off ASMs (HR 1.65, 95% confidence interval [CI] 1.36-2.00, P < 0.001). There was an increased risk of cardiac events when LQTS2 (HR 1.49, 95% CI 1.03-2.15, P = 0.036) but not LQTS1 participants were taking ASMs (interaction, P = 0.016). Na+ channel blocker ASMs were associated with an increased risk of cardiac events in participants with QTc prolongation, specifically LQTS2, but decreased risk in LQTS1. The increased risk when taking all ASMs and Na+ channel blocker ASMs was attenuated by concurrent beta-adrenergic blocker therapy (interaction, P < 0.001). Gamma-aminobutyric acid modifier ASMs were associated with an increased risk of events in patients not concurrently treated with beta-adrenergic blockers. Female participants were at an increased risk of cardiac events while taking all ASMs and each class of ASMs. Despite no change in overall QTc duration, pharmacogenomic analyses set the stage for future prospective clinical and mechanistic studies to validate that ASMs with predominantly Na+ channel blocking actions are deleterious in LQTS2, but protective in LQTS1. Copyright © 2017 Elsevier Inc. All rights reserved.
Tests of Theories of Crime in Female Prisoners.
Lindberg, Marc A; Fugett, April; Adkins, Ashtin; Cook, Kelsey
2017-02-01
Several general theories of crime were tested with path models on 293 female prisoners in a U.S. state prison. The theories tested included Social Bond and Control, Thrill/Risk Seeking, and a new attachment-based Developmental Dynamic Systems model. A large battery of instruments, ranging from measures of risk taking, to a crime addiction scale, to childhood adverse events, to attachments and clinical issues, was used. The older general theories of crime did not hold up well under the rigor of path modeling. Support was found for the new dynamic systems model, which incorporated adverse childhood events leading to (a) peer crime, (b) crime addiction, and (c) a measure derived from the Attachment and Clinical Issues Questionnaire (ACIQ) that takes individual differences in attachments and clinical issues into account. The results were discussed in terms of new approaches to the Research Domain Criteria (RDoC) and new approaches to intervention.
Net reclassification index at event rate: properties and relationships.
Pencina, Michael J; Steyerberg, Ewout W; D'Agostino, Ralph B
2017-12-10
The net reclassification improvement (NRI) is an attractively simple summary measure quantifying the improvement in performance due to the addition of new risk marker(s) to a prediction model. Originally proposed for settings with well-established classification thresholds, its use quickly extended to applications with no thresholds in common use. Here we aim to explore properties of the NRI at event rate. We express this NRI as a difference in performance measures for the new versus old model and show that the quantity underlying this difference is related to several global as well as decision analytic measures of model performance. It maximizes the relative utility (standardized net benefit) across all classification thresholds and can be viewed as the Kolmogorov-Smirnov distance between the distributions of risk among events and non-events. It can be expressed as a special case of the continuous NRI, measuring reclassification from the 'null' model with no predictors. It is also a criterion based on the value of information and quantifies the reduction in expected regret for a given regret function, casting the NRI at event rate as a measure of incremental reduction in expected regret. More generally, we find it informative to present plots of standardized net benefit/relative utility for the new versus old model across the domain of classification thresholds. Then, these plots can be summarized with their maximum values, and the increment in model performance can be described by the NRI at event rate. We provide theoretical examples and a clinical application on the evaluation of prognostic biomarkers for atrial fibrillation. Copyright © 2016 John Wiley & Sons, Ltd.
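The quantities discussed above can be computed directly from predicted risks. The sketch below evaluates the standardized net benefit of an old and a new model at the threshold equal to the event rate (where the weighting factor equals 1 and the expression reduces to TPR minus FPR) and reports their difference as the NRI at event rate; the data are simulated and the construction follows the description in the abstract rather than the paper's exact formulas.

```python
import numpy as np

def standardized_net_benefit(p_hat, event, t):
    """sNB(t) = TPR(t) - (t / (1 - t)) * ((1 - rho) / rho) * FPR(t), rho = event rate."""
    rho = event.mean()
    flagged = p_hat >= t
    tpr = flagged[event == 1].mean()
    fpr = flagged[event == 0].mean()
    return tpr - (t / (1 - t)) * ((1 - rho) / rho) * fpr

# Simulated risks for an old and a new model (illustrative data only).
rng = np.random.default_rng(13)
n = 30_000
true_risk = rng.beta(2.0, 18.0, n)
event = rng.binomial(1, true_risk)
p_old = np.clip(true_risk + rng.normal(0, 0.06, n), 1e-4, 1 - 1e-4)
p_new = np.clip(true_risk + rng.normal(0, 0.03, n), 1e-4, 1 - 1e-4)

# At t = event rate the weighting factor equals 1, so sNB reduces to TPR - FPR.
rho = event.mean()
snb_old = standardized_net_benefit(p_old, event, rho)
snb_new = standardized_net_benefit(p_new, event, rho)
print(f"event rate: {rho:.3f}")
print(f"sNB at event rate, old: {snb_old:.3f}, new: {snb_new:.3f}, "
      f"NRI at event rate: {snb_new - snb_old:.3f}")
```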
Hsu, Chien-Yi; Chen, Yung-Tai; Huang, Po-Hsun; Leu, Hsin-Bang; Su, Yu-Wen; Chiang, Chia-Hung; Chen, Jaw-Wen; Chen, Tzeng-Ji; Lin, Shing-Jong; Chan, Wan-Leong
2016-05-01
Although accumulating evidence suggests that urinary calculi may be associated with an increased risk of cardiovascular disease (CVD), the number of longitudinal studies linking urolithiasis to CVD events is limited. We investigated the association between urinary calculi and the risk of developing myocardial infarction (MI) and/or stroke in a nationwide, population-based cohort database in Taiwan. Our analyses were conducted using information from a random sample of 1 million people enrolled in the nationally representative Taiwan National Health Insurance Research Database. A total of 81,546 subjects aged 18 years or above, including 40,773 subjects diagnosed with urinary calculi during the study period and 40,773 propensity score-matched subjects without urinary calculi, were enrolled in our study. During a 10-year follow-up period, 501 MI events and 1295 stroke events were identified. The urinary calculi group had higher incidence rates of MI (11.79 vs 8.94 per 10,000 person-years) and stroke (31.41 vs 22.45 per 10,000 person-years). Cox proportional hazards regression analysis showed that the development of urinary calculi was independently associated with a higher risk of future MI (HR, 1.31; 95% CI, 1.09-1.56, p=0.003), stroke (HR, 1.39; 95% CI, 1.24-1.55, p<0.001), and total cardiovascular events (HR, 1.38; 95% CI, 1.25-1.51, p<0.001). Urinary calculi were associated with an increased risk of future cardiovascular events in this Asian population, consistent with recent epidemiologic evidence from Western countries. Copyright © 2015 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.
Hunt, Randall J.
2012-01-01
Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
Li, Zhi-Jun; Yi, Chen-Ju; Li, Jing; Tang, Na
2017-04-11
The role of uric acid as a risk factor for cardio-cerebrovascular diseases is controversial. In this study, we aimed to investigate the relationship between serum uric acid level and the risk of cardio-cerebrovascular events in middle-aged and non-obese Chinese men. We included 3152 participants from the health examination center of Tongji Hospital from June 2007 to June 2010. Clinical examination and medical records were collected at the annual health examination. The hazard ratios (HRs) of uric acid for cardio-cerebrovascular events were calculated by Cox proportional hazards models. A generalized additive model and threshold effect analysis were used to explore the non-linear relationship between serum uric acid level and the incidence of cardio-cerebrovascular events. The mean follow-up time was 52 months. When the participants were classified into four groups by serum uric acid quartile (Q1-Q4), the HRs (95% CI) of Q2-Q4 for cardio-cerebrovascular events were 1.26 (0.83, 1.92), 1.97 (1.33, 2.91) and 2.05 (1.40, 3.01), respectively, compared with the reference (Q1). The actual incidence and conditional incidence of cardio-cerebrovascular events in the high serum uric acid group were higher than those in the low serum uric acid group, stratified by the turning point (sUA = 372 μmol/L). We also showed strong prognostic accuracy of the multiple-variable-based score at 3 and 5 years, with areas under the receiver operating characteristic (ROC) curve of 0.790 (0.756-0.823) and 0.777 (0.749-0.804), respectively. Serum uric acid level is a strong risk factor for cardio-cerebrovascular events.
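The quartile-based Cox analysis described above can be sketched with the lifelines library as follows; the simulated cohort, covariates and effect sizes are hypothetical and only illustrate the structure of the computation, not the Tongji data.
```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: follow-up time in months, event indicator, serum uric acid (umol/L), age.
rng = np.random.default_rng(0)
n = 3000
ua = rng.normal(360, 60, n)
age = rng.normal(50, 6, n)
true_time = rng.exponential(200 / np.exp(0.004 * (ua - 360) + 0.03 * (age - 50)))
df = pd.DataFrame({
    "time": np.minimum(true_time, 60),          # administrative censoring at 60 months
    "event": (true_time < 60).astype(int),
    "age": age,
})

# Quartile indicators with Q1 as the reference category, as in the abstract.
quartile = pd.qcut(ua, 4, labels=["Q1", "Q2", "Q3", "Q4"])
df = pd.concat([df, pd.get_dummies(quartile, drop_first=True).astype(float)], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                              # hazard ratios (exp(coef)) and 95% CIs for Q2-Q4 vs Q1
```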
Lognormal Approximations of Fault Tree Uncertainty Distributions.
El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P
2018-01-26
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of the basic events that are input to the model. Typically, the basic event probabilities are not known exactly but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and the Wilks method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
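The article's closed-form result is not reproduced here, but the following sketch illustrates the setting: a toy fault tree with lognormally distributed basic-event probabilities is propagated by Monte Carlo sampling, and the resulting top-event distribution is summarized by a simple moment-matched lognormal on the log scale. All medians and error factors are invented for illustration.
```python
import numpy as np

rng = np.random.default_rng(42)

# Toy fault tree: TOP = (A AND B) OR C, with lognormal basic-event probabilities.
def sample_lognormal(median, error_factor, size):
    sigma = np.log(error_factor) / 1.645            # error factor defined at the 95th percentile
    return rng.lognormal(np.log(median), sigma, size)

n = 200_000
pA = sample_lognormal(1e-3, 3, n)
pB = sample_lognormal(2e-3, 5, n)
pC = sample_lognormal(5e-5, 3, n)
top = 1 - (1 - pA * pB) * (1 - pC)                  # OR of the two minimal cut sets

# Moment-matched lognormal approximation to the Monte Carlo top-event distribution.
mu, sigma = np.mean(np.log(top)), np.std(np.log(top))
for q, z in [(0.05, -1.645), (0.5, 0.0), (0.95, 1.645)]:
    print(f"q={q:4}: Monte Carlo {np.quantile(top, q):.2e}  lognormal approx {np.exp(mu + sigma * z):.2e}")
```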
Eriksson, Jonas K; Jacobsson, Lennart; Bengtsson, Karin; Askling, Johan
2017-02-01
To assess and compare the incidence of cardiovascular (CV) events, by CV phenotype, between patients with ankylosing spondylitis (AS), rheumatoid arthritis (RA) and the general population. Using linkages of national and population-based registers, we identified one cohort of prevalent patients with AS (n=5358), one with RA (n=37 245) and one with matched general population subjects (n=25 006). These cohorts were identified in 2006 through 2011 and were followed until 31 December 2012 for the first ever occurrence of acute coronary syndromes (ACS), deep venous thromboembolism, pulmonary embolism and stroke, respectively. For each outcome, we calculated incidence rates standardised to the age and sex distribution of the AS cohort, as well as relative risks using Cox proportional hazards models. Based on 69 ACS events during 20 251 person-years of follow-up of the patients with AS, and 966 events during 127 014 person-years in the RA cohort, the age/sex-adjusted relative risks for ACS compared with the general population were 1.3 (95% CI 1.0 to 1.7) for AS and 1.7 (1.4 to 2.0) for RA. For thromboembolic events, the corresponding risks were 1.4 (1.1 to 1.9) in AS and 1.8 (1.5 to 2.1) in RA. Finally, for stroke, the relative risks were 1.5 (1.1 to 2.0) in AS and 1.5 (1.2 to 1.8) in RA, compared with the general population. Prevalent patients with AS are at a 30%-50% increased risk of incident CV events. When compared with patients with RA, this level of increase was similar for stroke, but only half as high for ACS and thrombotic events. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Linking Adverse Outcome Pathways to Dynamic Energy Budgets: A Conceptual Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, Cheryl; Nisbet, Roger; Antczak, Philipp
Ecological risk assessment quantifies the likelihood of undesirable impacts of stressors, primarily at high levels of biological organization. Data used to inform ecological risk assessments come primarily from tests on individual organisms or from suborganismal studies, indicating a disconnect between primary data and protection goals. We know how to relate individual responses to population dynamics using individual-based models, and there are emerging ideas on how to make connections to ecosystem services. However, there is no established methodology to connect effects seen at higher levels of biological organization with suborganismal dynamics, despite progress made in identifying Adverse Outcome Pathways (AOPs) that link molecular initiating events to ecologically relevant key events. This chapter is a product of a working group at the National Center for Mathematical and Biological Synthesis (NIMBioS) that assessed the feasibility of using dynamic energy budget (DEB) models of individual organisms as a “pivot” connecting suborganismal processes to higher level ecological processes. AOP models quantify explicit molecular, cellular or organ-level processes, but do not offer a route to linking sub-organismal damage to adverse effects on individual growth, reproduction, and survival, which can be propagated to the population level through individual-based models. DEB models describe these processes, but use abstract variables with undetermined connections to suborganismal biology. We propose linking DEB and quantitative AOP models by interpreting AOP key events as measures of damage-inducing processes in a DEB model. Here, we present a conceptual model for linking AOPs to DEB models and review existing modeling tools available for both AOP and DEB.
Lin, Li-An; Luo, Sheng; Davis, Barry R.
2017-01-01
In the course of hypertension, cardiovascular disease events (e.g., stroke, heart failure) occur frequently and recurrently. The scientific interest in such studies may lie in the estimation of treatment effect while accounting for the correlation among event times. The correlation among recurrent event times comes from two sources: subject-specific heterogeneity (e.g., varied lifestyles, genetic variations, and other unmeasurable effects) and event dependence (i.e., event incidences may change the risk of future recurrent events). Moreover, event incidences may change the disease progression, so that there may exist event-varying covariate effects (the covariate effects may change after each event) and an event effect (the effect of prior events on future events). In this article, we propose a Bayesian regression model that not only accommodates correlation among recurrent events from both sources, but also explicitly characterizes the event-varying covariate effects and the event effect. This model is especially useful in quantifying how the incidences of events change the effects of covariates and the risk of future events. We compare the proposed model with several commonly used recurrent event models and apply our model to the motivating lipid-lowering trial (LLT) component of the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) (ALLHAT-LLT). PMID:29755162
Gomez, Céline; Mangeas, Morgan; Curt, Thomas; Ibanez, Thomas; Munzinger, Jérôme; Dumas, Pascal; Jérémy, André; Despinoy, Marc; Hély, Christelle
2015-01-01
Wildfire has been recognized as one of the most ubiquitous disturbance agents impacting natural environments. In this study, our main objective was to propose a modeling approach to investigate the potential impact of wildfire on biodiversity. The method is illustrated with an application example in New Caledonia, where conservation and sustainable biodiversity management represent an important challenge. Firstly, a biodiversity loss index, combining diversity and vulnerability indexes, was calculated for every vegetation unit in New Caledonia and mapped according to its distribution over the New Caledonian mainland. Then, based on spatially explicit fire behavior simulations (using the FLAMMAP software) and fire ignition probabilities, two original fire risk assessment approaches were proposed: a one-off event model and a multi-event burn probability model. The spatial distribution of fire risk across New Caledonia was similar for both indices, with very small localized spots having high risk. The patterns with the highest risk are all located around the remaining sclerophyll forest fragments, representing 0.012% of the mainland surface. A small part of the maquis and areas adjacent to dense humid forest on ultramafic substrates should also be monitored. Vegetation interfaces between secondary and primary units displayed high risk and should represent priority zones for the mitigation of fire effects. Low fire ignition probability in areas free of anthropogenic influence drastically decreases the risk. The one-off-event risk allowed localization of the most likely ignition areas with potential for extensive damage. Emergency actions could aim at limiting the spread of specific fires known to have high impact, or at targeting high-risk areas to limit one-off fire ignitions. Spatially explicit information on burning probability is necessary for strategic fire and fuel management planning. Both risk indices provide clues for preserving New Caledonia's biodiversity hotspot in the face of wildfires. PMID:25691965
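A minimal sketch of how the two ingredients above can be combined into a cell-level risk measure: an expected biodiversity loss obtained by multiplying a burn-probability raster with a biodiversity loss index raster and ranking the cells. The rasters below are randomly generated placeholders, not the New Caledonian data or the study's actual indices.
```python
import numpy as np

# Hypothetical rasters on the same grid: annual burn probability per cell and a
# biodiversity loss index (diversity x vulnerability) per vegetation unit.
rng = np.random.default_rng(7)
burn_probability = rng.beta(1, 50, size=(100, 100))     # multi-event burn probability
loss_index = rng.uniform(0.0, 1.0, size=(100, 100))     # biodiversity loss if burnt

# Risk as expected biodiversity loss per cell; rank cells to locate priority hotspots.
risk = burn_probability * loss_index
threshold = np.quantile(risk, 0.999)                     # top 0.1% of cells
priority_cells = np.argwhere(risk >= threshold)
print(f"{len(priority_cells)} priority cells, max expected loss {risk.max():.4f}")
```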
Ding, Fangyu; Ge, Quansheng; Jiang, Dong; Fu, Jingying; Hao, Mengmeng
2017-01-01
Terror events can cause profound consequences for the whole of society. Identifying regularities in terrorist attacks is important for global counter-terrorism strategy. In the present study, we demonstrate a novel method that uses relatively popular and robust machine learning methods to simulate the risk of terrorist attacks at a global scale, based on multiple resources, long time series and globally distributed datasets. Historical data from 1970 to 2015 were used to train and evaluate the machine learning models. The model performed fairly well in predicting the places where terror events might occur in 2015, with a success rate of 96.6%. Moreover, it is noteworthy that the model with optimized tuning parameter values successfully predicted 2,037 terrorism event locations where a terrorist attack had never happened before. PMID:28591138
Framework for probabilistic flood risk assessment in an Alpine region
NASA Astrophysics Data System (ADS)
Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann
2014-05-01
Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, especially for local authorities and insurance companies, in order to estimate possible flood losses. Therefore, a framework for assessing flood risk has been developed and is introduced in this contribution. Flood risk is thereby defined as the combination of the probability of flood events and the potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of floods. The potential flood damages are determined in the course of a vulnerability assessment, in which the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework which takes the probability of flood events and their spatio-temporal characteristics into account. Usually, the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation, which may lead to a misinterpretation of the flood risk. Within the presented framework, the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004), which is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes considering the spatial dependence structure of flood events. The inputs for this approach are time series derived from river gauging stations. In a next step, the historic and synthetic flood events are spatially interpolated from the point scale (i.e. river gauges) to the river network. For this purpose, topological kriging (Top-kriging) proposed by Skøien et al. (2006) is applied. Top-kriging considers the nested structure of river networks and is therefore suitable for regionalising flood characteristics. Thus, the characteristics of a large number of possible flood events can be transferred to arbitrary locations (e.g. community level) along the river network within a study region. This framework has been used to generate a set of spatially correlated river flood events in the Austrian Federal Province of Vorarlberg. In addition, loss-probability curves for each community have been calculated based on official inundation maps of public authorities, the elements at risk and their vulnerability. One location along the river network within each community serves as the interface between the set of flood events and the loss-probability relationships of the individual communities. Consequently, every flood event from the historic and synthetically generated dataset can be evaluated in monetary terms. Thus, a time series comprising a large number of flood events and their corresponding monetary losses serves as the basis for a probabilistic flood risk assessment, including expected annual losses and estimates of extreme event losses occurring over the course of a certain time period. The results provide essential decision support for primary insurers, reinsurance companies and public authorities in setting up risk management at an adequate scale.
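The last step described above, turning a large set of historic and synthetic flood events with monetary losses into expected annual losses and extreme-event loss estimates, can be sketched as follows; the event-loss table, the simulated period and the return periods are hypothetical placeholders, not the Vorarlberg event set.
```python
import numpy as np

# Hypothetical event-loss table from a combined historic + synthetic event set:
# total simulated period (years) and one aggregate monetary loss per flood event (EUR).
rng = np.random.default_rng(3)
simulated_years = 10_000
event_losses = rng.pareto(1.8, size=6_000) * 2e6        # heavy-tailed event losses

# Expected annual loss: sum of all event losses divided by the simulated period.
eal = event_losses.sum() / simulated_years

# Loss exceedance: annual exceedance frequency per loss level, summarised at return periods.
sorted_losses = np.sort(event_losses)[::-1]
exceedance_rate = np.arange(1, len(sorted_losses) + 1) / simulated_years
for rp in (30, 100, 300):
    loss_at_rp = sorted_losses[np.searchsorted(exceedance_rate, 1 / rp)]
    print(f"{rp:>4}-year loss: {loss_at_rp:,.0f} EUR")
print(f"Expected annual loss: {eal:,.0f} EUR")
```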
Quantifying the predictive accuracy of time-to-event models in the presence of competing risks.
Schoop, Rotraut; Beyersmann, Jan; Schumacher, Martin; Binder, Harald
2011-02-01
Prognostic models for time-to-event data play a prominent role in therapy assignment, risk stratification and inter-hospital quality assurance. The assessment of their prognostic value is vital not only for responsible resource allocation, but also for their widespread acceptance. The additional presence of competing risks to the event of interest requires proper handling not only on the model building side, but also during assessment. Research into methods for the evaluation of the prognostic potential of models accounting for competing risks is still needed, as most proposed methods measure either their discrimination or calibration, but do not examine both simultaneously. We adapt the prediction error proposal of Graf et al. (Statistics in Medicine 1999, 18, 2529–2545) and Gerds and Schumacher (Biometrical Journal 2006, 48, 1029–1040) to handle models with competing risks, i.e. more than one possible event type, and introduce a consistent estimator. A simulation study investigating the behaviour of the estimator in small sample size situations and for different levels of censoring together with a real data application follows.
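For orientation, the following simplified sketch computes a Brier-type prediction error for one competing event at a fixed evaluation time, ignoring censoring; the estimator proposed in the paper additionally handles censoring via inverse-probability-of-censoring weighting, which is omitted here. The data and the constant "null" prediction are simulated.
```python
import numpy as np

def prediction_error_competing(times, causes, pred_cif_1, eval_time):
    """Simplified Brier-type prediction error for cause 1 at eval_time (no censoring).

    times, causes: observed event times and event types (1, 2, ...).
    pred_cif_1: predicted cumulative incidence of cause 1 at eval_time per subject.
    A censored sample would require inverse-probability-of-censoring weights.
    """
    observed = ((times <= eval_time) & (causes == 1)).astype(float)
    return np.mean((observed - pred_cif_1) ** 2)

# Tiny illustration with simulated competing-risks data and a constant 'null' prediction.
rng = np.random.default_rng(0)
n = 5000
cause = rng.choice([1, 2], size=n, p=[0.6, 0.4])
times = rng.exponential(np.where(cause == 1, 5.0, 8.0))
null_cif = np.full(n, ((times <= 3) & (cause == 1)).mean())
print(prediction_error_competing(times, cause, null_cif, eval_time=3))
```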
A coupled physical and economic model of the response of coastal real estate to climate risk
NASA Astrophysics Data System (ADS)
McNamara, Dylan E.; Keeler, Andrew
2013-06-01
Barring an unprecedented large-scale effort to raise island elevation, barrier-island communities common along the US East Coast are likely to eventually face inundation of the existing built environment on a timescale that depends on uncertain climatic forcing. Between the present and when a combination of sea-level rise and erosion renders these areas uninhabitable, communities must choose levels of defensive expenditures to reduce risks and individual residents must assess whether and when risk levels are unacceptably high to justify investment in housing. We model the dynamics of coastal adaptation as the interplay of underlying climatic risks, collective actions to mitigate those risks, and individual risk assessments based on beliefs in model predictions and processing of past climate events. Efforts linking physical and behavioural models to explore shoreline dynamics have not yet brought together this set of essential factors. We couple a barrier-island model with an agent-based model of real-estate markets to show that, relative to people with low belief in model predictions about climate change, informed property owners invest heavily in defensive expenditures in the near term and then abandon coastal real estate at some critical risk threshold that presages a period of significant price volatility.
Tsunami Modeling of Hikurangi Trench M9 Events: Case Study for Napier, New Zealand
NASA Astrophysics Data System (ADS)
Williams, C. R.; Nyst, M.; Farahani, R.; Bryngelson, J.; Lee, R.; Molas, G.
2015-12-01
RMS has developed a tsunami model for New Zealand for the insurance industry to price and to manage their tsunami risks. A key tsunamigenic source for New Zealand is the Hikurangi Trench, which lies offshore on the east side of the North Island. The trench is the result of the subduction of the Pacific Plate beneath the North Island at a rate of 40-45 mm/yr. Though there have been no M9 historical events on the Hikurangi Trench, events in this magnitude range are considered in the latest version of the National Seismic Hazard Maps for New Zealand (Stirling et al., 2012). The RMS modeling approaches the tsunami lifecycle in three stages: event generation, ocean wave propagation, and coastal inundation. The tsunami event generation is modeled based on seafloor deformation resulting from an event rupture model. The ocean wave propagation and coastal inundation are modeled using an RMS-developed numerical solver, implemented on graphics processing units using a finite-volume approach to approximate the two-dimensional, shallow-water wave equations over the ocean and complex topography. As the tsunami waves enter shallow water and approach the coast, the RMS model calculates the propagation of the waves along the wet-dry interface considering variable land friction. The initiation and characteristics of the tsunami are based on the event rupture model. As there have been no historical M9 events on the Hikurangi Trench, this rupture characterization posed unique challenges. This study examined the impacts of a suite of event rupture models to understand the key drivers of the variations in the tsunami inundation footprints. The goal was to develop a suite of tsunamigenic event characterizations that represent a range of potential tsunami outcomes for M9 events on the Hikurangi Trench. The focus of this case study is the Napier region, as it represents an important exposure concentration and has experienced tsunami inundation in the past, including during the 1931 Ms 7.8 Hawke's Bay earthquake.
Using an extended 2D hydrodynamic model for evaluating damage risk caused by extreme rain events: Flash-Flood-Risk-Map (FFRM) Upper Austria
NASA Astrophysics Data System (ADS)
Humer, Günter; Reithofer, Andreas
2016-04-01
Considering the increase in flash flood events causing massive damage in recent years in urban but also rural areas [1-4], the need for hydrodynamic calculation of flash-flood-prone areas and possible countermeasures has arisen for many municipalities and local governments. Besides the German-based URBAS project [1], the EU-funded FP7 research project "SWITCH-ON" [5] also addresses the damage risk caused by flash floods: in the sub-project "FFRM" (Flash Flood Risk Map Upper Austria), the damage risk from flash-flood-driven inundation is calculated for buildings and vulnerable infrastructure such as schools and hospitals. While danger zones for riverine flooding are established as an integral part of spatial planning, flash floods caused by overland runoff from extreme rain events have long been an underrated safety hazard, not only for buildings and infrastructure but for people and animals as well. Based on the widespread 2D model "hydro_as-2D", an extension was developed which calculates runoff formation from spatially and temporally variable precipitation and determines the overland runoff and its concentration in two dimensions. The concept of the model is to preprocess the precipitation data and calculate the effective runoff volume for a short time step of, e.g., five minutes. This volume is applied to the nodes of the 2D model and the calculation of the hydrodynamic model is started. At the end of each time step, the model run is stopped, the preprocessing step is repeated and the hydraulic model calculation is continued. In view of the later use for the whole of Upper Austria (12,000 km²), a model grid of 25 x 25 m² was established using digital elevation data. Model parameters could be estimated for the small catchment of the river Ach, which was hit by an intense rain event with up to 109 mm per hour on 20 June 2012, based on open data sources for geology, soil and land use. The aim of FFRM is to provide an estimation of the damage risk caused by flash floods for the whole of Upper Austria. To address the hazard, inundation depths were calculated with the extended 2D model using design rains with a 100-year return period provided by the Environmental Ministry [7]. The potential damage was calculated using damage functions derived from our experience with damage surveys of past events in Austria and according to guidelines for the determination of cost-benefit ratios for flood protection measures [8]. The greatest difficulty was to obtain appropriate data on the distribution of houses and industrial plants. Zoning plans provide good information on the spatial distribution of residential, commercial and industrial areas, but do not contain information on the kind of industry, which is essential for estimating absolute damage values. To get a first estimate, detailed information from surveyed areas was intersected with the zoning plan, which yields an average damage for the respective zones. The first results can be found on www.waterviewer.com and will be updated as the project develops further.
[1] URBAS, risk management of extreme flooding events - prediction and management of flash floods in urban areas, www.urbanesturzfluten.de, accessed 13 November 2014 [2] Società Meteorologica Italiana (SMI), http://www.nimbus.it/eventi/2013/130624flashfloodRimini.pdf, accessed 13 November 2014 [3] Newspaper "Österreich", http://www.oe24.at/oesterreich/chronik/Sturzflut-Regen-legt-Ost-Oesterreich-lahm/1509113, accessed 13 November 2014 [4] Newspaper "Oberösterreichische Nachrichten", http://www.nachrichten.at/oberoesterreich/Unwetter-Mure-riss-Strasse-mit-Wohnhaus-in-Gosau-gefaehrdet;art4,911288, accessed 13 November 2014 [5] Sharing Water-related Information to Tackle Changes in the Hydrosphere - for Operational Needs (SWITCH-ON), http://water-switch-on.eu [6] European Commission, Directive 2007/60/EC of the European Parliament and the Council of 23 October 2007 on the assessment and management of flood risks: http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2007:288:0027:0034:en:PDF [7] http://ehyd.gv.at [8] Austrian Federal Ministry of Agriculture, Forestry, Environment and Water Management: „Kosten-Nutzen-Untersuchungen im Schutzwasserbau", July 2009
North, Frederick; Fox, Samuel; Chaudhry, Rajeev
2016-07-20
Risk calculation is increasingly used in lipid management, congestive heart failure, and atrial fibrillation. The risk scores are then used for decisions about statin use, anticoagulation, and implantable defibrillator use. Calculating risks for patients and making decisions based on these risks is often done at the point of care and is an additional time burden for clinicians that can be decreased by automating the tasks and using clinical decision support. Using Morae Recorder software, we timed 30 healthcare providers tasked with calculating the overall risk of cardiovascular events, sudden death in heart failure, and thrombotic event risk in atrial fibrillation. The risk calculators used were the American College of Cardiology/American Heart Association Atherosclerotic Cardiovascular Disease risk calculator (ACC/AHA-ASCVD risk), the Seattle Heart Failure Model (SHFM risk), and CHA2DS2-VASc. We also timed the 30 providers using Ask Mayo Expert care process models for lipid management, heart failure management, and atrial fibrillation management based on the calculated risk scores. We used the Mayo Clinic primary care panel to estimate the time required to calculate risk for an entire panel. Mean provider times to complete the CHA2DS2-VASc, ACC/AHA-ASCVD risk, and SHFM were 36, 45, and 171 s, respectively. For decision making about atrial fibrillation, lipids, and heart failure, the mean times (including risk calculations) were 85, 110, and 347 s, respectively. Even under best-case circumstances, providers take a significant amount of time to complete risk assessments. For a complete panel of patients this can lead to hours of time required to make decisions about prescribing statins, use of anticoagulation, and medications for heart failure. Informatics solutions are needed to capture data in the medical record and serve up automatically calculated risk assessments to physicians and other providers at the point of care.
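As an illustration of what such point-of-care calculators compute, the sketch below implements the standard CHA2DS2-VASc point assignment; the function and its argument names are hypothetical helpers, not the tools timed in the study.
```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """Standard CHA2DS2-VASc point assignment (illustrative helper)."""
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)   # A2 / A
    score += 1 if female else 0                             # Sc (sex category)
    score += 1 if chf else 0                                # C
    score += 1 if hypertension else 0                       # H
    score += 1 if diabetes else 0                           # D
    score += 2 if stroke_or_tia else 0                      # S2
    score += 1 if vascular_disease else 0                   # V
    return score

# Example: 72-year-old woman with hypertension and diabetes -> score 4.
print(cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                   diabetes=True, stroke_or_tia=False, vascular_disease=False))
```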
Abdel-Razeq, Hikmat; Mansour, Asem; Abdulelah, Hazem; Al-Shwayat, Anas; Makoseh, Mohammad; Ibrahim, Mohammad; Abunasser, Mahmoud; Rimawi, Dalia; Al-Rabaiah, Abeer; Alfar, Rozan; Abufara, Alaa'; Ibrahim, Alaa; Bawaliz, Anas; Ismael, Yousef
2018-01-01
The risk of thromboembolic events is higher among cancer patients, especially in patients undergoing chemotherapy. Cisplatin-based regimens have been reported to be associated with a particularly high thromboembolic rate. In this study, we report on our own experience with thrombosis among patients on active cisplatin-based chemotherapy. Medical records and hospital databases were searched for all patients treated with any cisplatin-based regimen for any kind of cancer. Thrombosis was considered cisplatin-related if diagnosed any time after the first dose and up to 4 weeks after the last. The Khorana risk assessment model was applied in all cases. A total of 1677 patients (65.5% males, median age: 50 years) treated with cisplatin-based regimens were identified. Head and neck (22.9%), lung (22.2%), lymphoma and gastric (11.4% each) were the most common primary tumors. Thromboembolic events were reported in 110 (6.6%); the highest rate was in patients with gastric cancer (20.9%) and the lowest in patients with head and neck cancers (2.3%) and lymphoma (1.6%). Thrombosis included deep vein thrombosis (DVT) in 69 (62.7%), pulmonary embolism (PE) in 18 (16.9%) and arterial thrombosis in 17 (15.6%). A majority (51.1%) of the patients had stage IV disease and only 16% had stage I or II. In a multivariate analysis, significantly higher rates of thrombosis were associated with gastric cancer as the primary tumor, advanced-stage disease and female sex, but not with age, the Khorana risk score or the type of cisplatin regimen. While the presence of a central venous catheter (CVC) was significantly associated with the risk of thrombosis (p < 0.0001) in the univariate analysis, this significance was lost in the multivariate analysis (odds ratio, 1.098; 95% CI, 0.603-1.999; p = 0.7599). Thromboembolic events in cancer patients on active cisplatin-based chemotherapy were commonly encountered. Gastric cancer, regardless of other clinical variables, was associated with the highest risk.
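For reference, the Khorana risk assessment model mentioned above assigns points as in the minimal sketch below; the helper function and its argument names are illustrative, not the study's implementation.
```python
def khorana_score(cancer_site, platelets_per_uL, hemoglobin_g_dL,
                  leukocytes_per_uL, bmi, on_esa=False):
    """Standard Khorana VTE risk score (illustrative helper)."""
    very_high_risk_sites = {"stomach", "pancreas"}
    high_risk_sites = {"lung", "lymphoma", "gynecologic", "bladder", "testicular"}
    score = 0
    if cancer_site in very_high_risk_sites:
        score += 2
    elif cancer_site in high_risk_sites:
        score += 1
    score += 1 if platelets_per_uL >= 350_000 else 0
    score += 1 if hemoglobin_g_dL < 10 or on_esa else 0   # anemia or erythropoiesis-stimulating agent
    score += 1 if leukocytes_per_uL > 11_000 else 0
    score += 1 if bmi >= 35 else 0
    return score                                          # 0 low, 1-2 intermediate, >=3 high risk

# Example: gastric cancer with thrombocytosis -> score 3 (high risk).
print(khorana_score("stomach", platelets_per_uL=400_000, hemoglobin_g_dL=12,
                    leukocytes_per_uL=8_000, bmi=27))
```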
Dubois, Carl-Ardy; D'amour, Danielle; Tchouaket, Eric; Clarke, Sean; Rivard, Michèle; Blais, Régis
2013-04-01
To examine the associations of four distinct nursing care organizational models with patient safety outcomes. Cross-sectional correlational study. Using a standardized protocol, patients' records were screened retrospectively to detect occurrences of patient safety-related events. Binary logistic regression was used to assess the associations of those events with the four nursing care organizational models. Twenty-two medical units in 11 hospitals in Quebec, Canada, were clustered into 4 nursing care organizational models: 2 professional models and 2 functional models. The sample comprised 2699 patients hospitalized for at least 48 h on the selected units. The outcome was a composite of six safety-related events widely considered sensitive to nursing care: medication administration errors, falls, pneumonia, urinary tract infection, unjustified restraints and pressure ulcers. Events were ultimately sorted into two categories: events 'without major' consequences for patients and events 'with' consequences. After controlling for patient characteristics, the patient risk of experiencing one or more events (of any severity) and of experiencing an event with consequences was significantly lower, by 25-52%, in both professional models than in the functional models. Event rates for the two functional models were statistically indistinguishable from each other. The data suggest that nursing care organizational models characterized by contrasting staffing, work environment and innovation characteristics may be associated with differential risk for hospitalized patients. The two professional models, which draw mainly on registered nurses (RNs) to deliver nursing services and reflect stronger support for nurses' professional practice, were associated with lower risks than the two functional models.
Leger, Stefan; Zwanenburg, Alex; Pilz, Karoline; Lohaus, Fabian; Linge, Annett; Zöphel, Klaus; Kotzerke, Jörg; Schreiber, Andreas; Tinhofer, Inge; Budach, Volker; Sak, Ali; Stuschke, Martin; Balermpas, Panagiotis; Rödel, Claus; Ganswindt, Ute; Belka, Claus; Pigorsch, Steffi; Combs, Stephanie E; Mönnich, David; Zips, Daniel; Krause, Mechthild; Baumann, Michael; Troost, Esther G C; Löck, Steffen; Richter, Christian
2017-10-16
Radiomics applies machine learning algorithms to quantitative imaging data to characterise the tumour phenotype and predict clinical outcome. For the development of radiomics risk models, a variety of different algorithms is available and it is not clear which one gives optimal results. Therefore, we assessed the performance of 11 machine learning algorithms combined with 12 feature selection methods by the concordance index (C-Index), to predict loco-regional tumour control (LRC) and overall survival for patients with head and neck squamous cell carcinoma. The considered algorithms are able to deal with continuous time-to-event survival data. Feature selection and model building were performed on a multicentre cohort (213 patients) and validated using an independent cohort (80 patients). We found several combinations of machine learning algorithms and feature selection methods which achieve similar results, e.g. C-Index = 0.71 and BT-COX: C-Index = 0.70 in combination with Spearman feature selection. Using the best performing models, patients were stratified into groups of low and high risk of recurrence. Significant differences in LRC were obtained between both groups on the validation cohort. Based on the presented analysis, we identified a subset of algorithms which should be considered in future radiomics studies to develop stable and clinically relevant predictive models for time-to-event endpoints.
Schilling, Peter L; Bozic, Kevin J
2016-01-06
Comparing outcomes across providers requires risk-adjustment models that account for differences in case mix. The burden of data collection from the clinical record can make risk-adjusted outcomes difficult to measure. The purpose of this study was to develop risk-adjustment models for hip fracture repair (HFR), total hip arthroplasty (THA), and total knee arthroplasty (TKA) that weigh adequacy of risk adjustment against data-collection burden. We used data from the American College of Surgeons National Surgical Quality Improvement Program to create derivation cohorts for HFR (n = 7000), THA (n = 17,336), and TKA (n = 28,661). We developed logistic regression models for each procedure using age, sex, American Society of Anesthesiologists (ASA) physical status classification, comorbidities, laboratory values, and vital signs-based comorbidities as covariates, and validated the models with use of data from 2012. The derivation models' C-statistics for mortality were 80%, 81%, 75%, and 92% and for adverse events were 68%, 68%, 60%, and 70% for HFR, THA, TKA, and combined procedure cohorts. Age, sex, and ASA classification accounted for a large share of the explained variation in mortality (50%, 58%, 70%, and 67%) and adverse events (43%, 45%, 46%, and 68%). For THA and TKA, these three variables were nearly as predictive as models utilizing all covariates. HFR model discrimination improved with the addition of comorbidities and laboratory values; among the important covariates were functional status, low albumin, high creatinine, disseminated cancer, dyspnea, and body mass index. Model performance was similar in validation cohorts. Risk-adjustment models using data from health records demonstrated good discrimination and calibration for HFR, THA, and TKA. It is possible to provide adequate risk adjustment using only the most predictive variables commonly available within the clinical record. This finding helps to inform the trade-off between model performance and data-collection burden as well as the need to define priorities for data capture from electronic health records. These models can be used to make fair comparisons of outcome measures intended to characterize provider quality of care for value-based-purchasing and registry initiatives. Copyright © 2016 by The Journal of Bone and Joint Surgery, Incorporated.
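The core of such a risk-adjustment model, a logistic regression on a handful of readily available covariates summarized by its C-statistic, can be sketched as follows; the simulated cohort, covariates and coefficients are hypothetical and not drawn from the NSQIP data.
```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical derivation cohort: age, sex, ASA class and a binary adverse-event outcome.
rng = np.random.default_rng(5)
n = 10_000
age = rng.normal(70, 12, n)
male = rng.integers(0, 2, n)
asa = rng.integers(1, 5, n)
logit = -6 + 0.04 * age + 0.2 * male + 0.5 * asa
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
X = np.column_stack([age, male, asa])

# Fit the risk-adjustment model and report its discrimination (C-statistic = AUC).
model = LogisticRegression(max_iter=1000).fit(X, y)
c_statistic = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"C-statistic on the derivation data: {c_statistic:.2f}")
```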
The role of building models in the evaluation of heat-related risks
NASA Astrophysics Data System (ADS)
Buchin, Oliver; Jänicke, Britta; Meier, Fred; Scherer, Dieter; Ziegler, Felix
2016-04-01
Hazard-risk relationships in epidemiological studies are generally based on the outdoor climate, despite the fact that most of humans' lifetime is spent indoors. By coupling indoor and outdoor climates with a building model, the risk concept developed can still be based on the outdoor conditions but also includes exposure to the indoor climate. The influence of non-linear building physics and the impact of air conditioning on heat-related risks can be assessed in a plausible manner using this risk concept. For proof of concept, the proposed risk concept is compared to a traditional risk analysis. As an example, daily and city-wide mortality data of the age group 65 and older in Berlin, Germany, for the years 2001-2010 are used. Four building models with differing complexity are applied in a time-series regression analysis. This study shows that indoor hazard better explains the variability in the risk data compared to outdoor hazard, depending on the kind of building model. Simplified parameter models include the main non-linear effects and are proposed for the time-series analysis. The concept shows that the definitions of heat events, lag days, and acclimatization in a traditional hazard-risk relationship are influenced by the characteristics of the prevailing building stock.
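The "simplified parameter models" referred to above can be as small as a first-order (RC) building model in which indoor temperature relaxes toward outdoor temperature with a single time constant. The sketch below is such a toy model with an invented heat episode; it is not one of the four building models evaluated in the study.
```python
import numpy as np

def indoor_temperature(t_out, tau_hours=36.0, dt_hours=1.0, t_init=22.0):
    """First-order (RC) building model: indoor temperature lags outdoor temperature.

    tau_hours is the building time constant; larger values mean heavier, slower buildings.
    """
    t_in = np.empty_like(t_out, dtype=float)
    t_in[0] = t_init
    for k in range(1, len(t_out)):
        t_in[k] = t_in[k - 1] + dt_hours / tau_hours * (t_out[k - 1] - t_in[k - 1])
    return t_in

# Hypothetical heat episode: a sinusoidal daily cycle with a three-day heat wave on top.
hours = np.arange(24 * 10)
t_out = 22 + 6 * np.sin(2 * np.pi * hours / 24) + np.where((hours // 24) % 10 < 3, 8.0, 0.0)
t_in = indoor_temperature(t_out)
print(f"outdoor max {t_out.max():.1f} degC, indoor max {t_in.max():.1f} degC (damped and delayed)")
```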
Wilke, Thomas; Boettger, Bjoern; Berg, Bjoern; Groth, Antje; Mueller, Sabrina; Botteman, Marc; Yu, Shengsheng; Fuchs, Andreas; Maywald, Ulf
2015-01-01
This analysis was conducted to investigate urinary tract infection (UTI) incidence among Type 2 Diabetes mellitus (T2DM) patients in Germany in a real-world setting and to identify risk factors associated with UTI incidence/recurrence. Our cohort study was conducted on an anonymized dataset from a regional German sickness fund (2010-2012). A UTI event was mainly identified through observed outpatient/inpatient UTI diagnoses. We reported the number of UTI events per 1000 patient-years. Furthermore, the proportions of patients affected by ≥1 and ≥2 UTI events in the observational period were reported separately. Finally, three multivariate Cox regression analyses were conducted to identify factors that may be associated with UTI event risk or recurrent UTI event risk. A total of 456,586 T2DM-prevalent patients were identified (mean age 72.8 years, 56.1% female, mean Charlson Comorbidity Index (CCI) of 7.3). Overall, the UTI event rate was 87.3 events per 1000 patient-years (111.8/55.8 per 1000 patient-years for women/men; p<0.001). The highest UTI event rates were observed for those aged >89 years. At 730 days after the first observed T2DM diagnosis, the proportion of women/men still UTI-event-free was 80.9%/90.2% (p<0.001). The most important factors associated with UTI risk in our three models were older age (hazard ratio (HR)=1.56-1.70 for >79 years), female gender (HR=1.38-1.57), UTIs in the previous two years (HR=2.77-5.94), the number of comorbidities as measured by the CCI (HR=1.32-1.52 for CCI>6) and at least one cystoscopy in the previous year (HR=2.06-5.48). Furthermore, high HbA1c values in the previous year (HR=1.29-1.4 for HbA1c>9.5%) and poor kidney function (HR=1.11-1.211 for glomerular filtration rate (GFR)<60 ml/min) increased the UTI event risk. Our study confirms that UTI event risk is high in T2DM patients. Older female patients who have experienced previous UTIs face an above-average UTI risk, especially if these risk factors are combined with poor glycemic control and poor kidney function. Copyright © 2015 Elsevier Inc. All rights reserved.
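The headline rate reported above, events per 1000 patient-years by sex, is computed as in the short sketch below; the simulated follow-up data are placeholders only loosely calibrated to the quoted rates, not the sickness-fund records.
```python
import numpy as np
import pandas as pd

# Hypothetical follow-up data: person-years observed and number of UTI events per patient, by sex.
rng = np.random.default_rng(11)
n = 50_000
female = rng.integers(0, 2, n)
py = rng.uniform(0.5, 3.0, n)
events = rng.poisson(py * np.where(female == 1, 0.112, 0.056))   # ~111.8 / 55.8 per 1000 PY

df = pd.DataFrame({"female": female, "events": events, "py": py})
grouped = df.groupby("female")[["events", "py"]].sum()
grouped["events_per_1000_py"] = 1000 * grouped["events"] / grouped["py"]
print(grouped.round(1))
```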
Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B
2008-01-01
Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense, so that it can accommodate the modeling of the multiple risk factors considered in offshore operations and can deal with different types of data that may come from different sources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and a Bayesian Network (BN) is tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level structure to address latent failures within the causal sequence of events. The five levels are the root causes level, trigger events level, incidents level, accidents level, and consequences level. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities in the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BNs can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs, which are capable of providing a graphical demonstration of inter-relationships as well as calculating numerical values of the occurrence likelihood for each failure event. The Bayesian inference mechanism also makes it possible to monitor how a safety situation changes as information flows forwards and backwards within the network. On the other hand, BN modeling relies heavily on experts' personal experience and is therefore highly domain specific; the "Swiss cheese" model, by contrast, is a theoretical framework grounded in behavioral theory and can therefore provide industry with a roadmap for BN modeling and its implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
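Not the paper's BN, but a minimal sketch of the arithmetic involved: forward propagation of occurrence likelihood through a five-level causal chain (root cause, trigger event, incident, accident, consequence) with invented conditional probabilities. A full Bayesian network would additionally support backward (diagnostic) inference.
```python
# Illustrative conditional probabilities for a five-level causal chain.
p_root = 0.10                          # P(root cause present)
p_cond = {                             # P(next level occurs | previous level state)
    "trigger":     {True: 0.30, False: 0.02},
    "incident":    {True: 0.25, False: 0.01},
    "accident":    {True: 0.20, False: 0.005},
    "consequence": {True: 0.50, False: 0.0},
}

def forward(p_prev, table):
    """Marginal probability of the next level, given the previous level's marginal."""
    return p_prev * table[True] + (1 - p_prev) * table[False]

p = p_root
for level in ["trigger", "incident", "accident", "consequence"]:
    p = forward(p, p_cond[level])
    print(f"P({level}) = {p:.5f}")
```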
An operational procedure for rapid flood risk assessment in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc
2017-07-01
The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. An extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
Extreme value modelling of Ghana stock exchange index.
Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe
2015-01-01
Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling the rare events leading to such crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred, and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedasticity present in the returns series before the EVT method was applied. The peaks-over-threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The sizes of extreme daily Ghanaian stock market movements were then computed using the value-at-risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
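A minimal peaks-over-threshold sketch of the kind of computation described above: a GPD is fitted to exceedances of simulated losses, and tail value-at-risk and expected shortfall are derived from the standard POT formulas. The simulated series stands in for the ARMA-GARCH-filtered Ghana index returns, and the threshold choice is arbitrary.
```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical daily losses (negative returns); in the paper these would be filtered residuals.
rng = np.random.default_rng(2024)
losses = rng.standard_t(df=4, size=2500) * 0.01

# Peaks-over-threshold: fit a GPD to exceedances above a high threshold.
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, loc, beta = genpareto.fit(exceedances, floc=0)      # shape xi, scale beta

# Tail quantile (VaR) and expected shortfall at level q from the fitted GPD.
n, n_u, q = len(losses), len(exceedances), 0.99
var_q = u + beta / xi * ((n / n_u * (1 - q)) ** (-xi) - 1)
es_q = (var_q + beta - xi * u) / (1 - xi)
print(f"VaR_{q:.0%} = {var_q:.4f}, ES_{q:.0%} = {es_q:.4f}")
```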
Yellow fever vaccination: some thoughts on how much is enough [Vaccine 23 (2005) 3908-3914].
Martins, Reinaldo M; Galler, Ricardo; Freire, Marcos Silva; Camacho, Luiz Antonio B; de Lourdes S Maia, Maria; Homma, Akira
2007-01-02
In a recently published article in this journal, Massad et al. contraindicate yellow fever vaccination for persons 60 years or older, considering that the risk of serious adverse events is higher for this age class. The conclusion was based on the input of available data on age-related probabilities of developing serious adverse events in the United States, as well as on other data not firmly established. We consider such a contraindication inadequate because of limitations in the data input, the higher lethality of wild-type yellow fever infection in older adults, the risk of introduction of yellow fever into new countries by travelers, the lower risk of vaccine adverse events in revaccinated or immune people in endemic countries, and the experience of Brazil, with only one suspected case of associated viscerotropic disease in an individual older than 60 years. The model proposed by Massad et al. is useful but can lead to different conclusions, depending on the epidemiological context and individual risk profile.
Analysis and modeling of a hail event consequences on a building portfolio
NASA Astrophysics Data System (ADS)
Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel
2014-05-01
North-West Switzerland was affected by a severe hailstorm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 cantons (out of 26) with public insurance. The aim of this project is to use the collected insurance data to better understand and estimate the risk of such events. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified. The geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians instead of 1D Gaussians on profiles, as was the case in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimate based on the radar image and the buildings' values and locations. In a further step, the geometric properties are used to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, it is expected that the orientation of the roof influences the aging of the material by controlling the frequency and amplitude of thaw-freeze cycles, thereby changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims. A last part proposes a model to study the hail impact on a building by modeling the different equipment on each facade, such as the number of windows or the material type. The goal of this part, which is more prospective, is to have a model which would allow the risk of a given building to be estimated quickly according to its physical characteristics and the local wind conditions during a hail event.
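The step of deriving a storm's geometric properties from the maximum-intensity radar image can be sketched as a 2D Gaussian fit; the synthetic "radar image", the axis-aligned Gaussian form and the scipy-based fit below are illustrative assumptions, not the simulator's actual parameterization.
```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_2d(coords, amp, x0, y0, sx, sy):
    """Axis-aligned 2D Gaussian evaluated on (x, y) coordinate arrays."""
    x, y = coords
    return amp * np.exp(-((x - x0) ** 2 / (2 * sx**2) + (y - y0) ** 2 / (2 * sy**2)))

# Hypothetical maximum-intensity radar image of a hail swath: one Gaussian cell plus noise.
nx, ny = 120, 80
x, y = np.meshgrid(np.arange(nx), np.arange(ny))
rng = np.random.default_rng(8)
image = gaussian_2d((x, y), 55, 70, 35, 18, 6) + rng.normal(0, 2, (ny, nx))

# Fit the 2D Gaussian to recover the storm cell's location and extent.
p0 = [image.max(), nx / 2, ny / 2, 10, 10]
popt, _ = curve_fit(gaussian_2d, (x.ravel(), y.ravel()), image.ravel(), p0=p0)
print("amp, x0, y0, sigma_x, sigma_y =", np.round(popt, 1))
```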
Sun, S; Cui, Z; Zhou, M; Li, R; Li, H; Zhang, S; Ba, Y; Cheng, G
2017-02-01
Proton pump inhibitors (PPIs) are commonly used as potent gastric acid secretion antagonists for gastro-esophageal disorders; their overall safety in patients with gastro-esophageal reflux disease (GERD) is considered good and they are well tolerated. However, recent studies have suggested that PPIs may be a potential independent risk factor for cardiovascular adverse events. The aim of our meta-analysis was to examine the association between PPI monotherapy and cardiovascular events in patients with GERD. The literature search covered relevant databases up to July 2015, including PubMed, the Cochrane Library, EMBASE, and ClinicalTrials.gov, and selected randomized controlled trials (RCTs) reporting cardiovascular events with PPI exposure in GERD patients. The pooled risk ratio (RR) and heterogeneity were assessed based on a fixed-effects meta-analysis model and the I² statistic, respectively. Seventeen RCTs covering 7540 patients were selected. The pooled data suggested that the use of PPIs was associated with a 70% increased cardiovascular risk (RR=1.70, 95% CI: [1.13-2.56], P=.01, I²=0%). Furthermore, higher risks of adverse cardiovascular events were found in the omeprazole subgroup (RR=3.17, 95% CI: [1.43-7.03], P=.004, I²=25%) and the long-term treatment subgroup (RR=2.33, 95% CI: [1.33-4.08], P=.003, I²=0%). PPI monotherapy can be a risk factor for cardiovascular adverse events. Omeprazole could significantly increase the risk of cardiovascular events and so should be used carefully. © 2016 John Wiley & Sons Ltd.
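A minimal sketch of the fixed-effect pooling used in such a meta-analysis (inverse-variance weighting of log risk ratios, with Cochran's Q and I² for heterogeneity); the three input studies are invented and do not reproduce the published pooled estimates.

```python
import numpy as np

def fixed_effect_pool(rr, lo, hi):
    """Inverse-variance fixed-effect pooling of study risk ratios.

    rr, lo, hi: arrays of point estimates and 95% CI bounds per study
    (illustrative inputs; the published pooled RR came from 17 RCTs).
    """
    log_rr = np.log(rr)
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)     # SE recovered from the CI width
    w = 1.0 / se ** 2
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    # Cochran's Q and the I^2 heterogeneity statistic
    q = np.sum(w * (log_rr - pooled) ** 2)
    i2 = max(0.0, (q - (len(rr) - 1)) / q) if q > 0 else 0.0
    ci = np.exp(pooled + np.array([-1.96, 1.96]) * pooled_se)
    return np.exp(pooled), ci, i2

rr, ci, i2 = fixed_effect_pool(np.array([1.5, 2.1, 1.2]),
                               np.array([0.8, 1.0, 0.6]),
                               np.array([2.8, 4.4, 2.4]))
print(f"pooled RR {rr:.2f}, 95% CI {ci.round(2)}, I^2 {i2:.0%}")
```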
Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.
Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah
2012-01-01
Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression. © 2011 Society for Risk Analysis.
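To make the distribution behind the COM-Poisson GLM concrete, the sketch below evaluates the COM-Poisson log-pmf via its truncated normalizing constant and fits λ and ν by maximum likelihood to a toy sample. The full GLM additionally links λ (or the mean) to covariates, which is not shown here; the data are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def com_poisson_logpmf(y, lam, nu, max_terms=200):
    """log pmf of the Conway-Maxwell Poisson: P(Y=y) proportional to lam**y / (y!)**nu.
    The normalizing constant Z(lam, nu) is approximated by truncating its series."""
    j = np.arange(max_terms)
    log_terms = j * np.log(lam) - nu * gammaln(j + 1)
    log_z = np.logaddexp.reduce(log_terms)
    return y * np.log(lam) - nu * gammaln(y + 1) - log_z

def neg_loglik(params, y):
    lam, nu = np.exp(params)          # keep both parameters positive
    return -np.sum(com_poisson_logpmf(y, lam, nu))

# Illustrative under-dispersed count sample (not data from the article)
y = np.array([2, 3, 3, 4, 2, 3, 4, 3, 2, 3])
fit = minimize(neg_loglik, x0=np.log([3.0, 1.0]), args=(y,), method="Nelder-Mead")
lam_hat, nu_hat = np.exp(fit.x)
print(lam_hat, nu_hat)   # nu > 1 suggests under-dispersion, nu < 1 over-dispersion
```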
Greving, Jacoba P; Diener, Hans-Christoph; Csiba, László; Hacke, Werner; Kappelle, L Jaap; Koudstaal, Peter J; Leys, Didier; Mas, Jean-Louis; Sacco, Ralph L; Sivenius, Juhani; Algra, Ale
2015-10-01
The Cerebrovascular Antiplatelet Trialists' Collaborative Group was formed to obtain and analyze individual patient data from the major randomized trials of common antiplatelet regimens after cerebral ischemia. Although the risk of stroke can be reduced by antiplatelet drugs, there continues to be uncertainty about the balance of risk and benefits of different antiplatelet regimens for an individual patient. Our aim is to provide clinicians with a thorough evidence-based answer on these therapeutic alternatives. We have identified six large randomized trials and plan to meta-analyze the data on an individual patient level. In total, these trials have enrolled 46 948 patients with cerebral ischemia. Uniquely, the Cerebrovascular Antiplatelet Trialists' Collaborative Group has secured access to the individual data of all of these trials, with the participation of key investigators and pharmaceutical companies. Our principal objective includes deriving a reliable estimate of the efficacy of different antiplatelet regimens on key outcomes including serious vascular events, major ischemic events, major bleeding, and intracranial hemorrhage. We propose to redefine composite outcome events, if necessary, to achieve comparability. Further, we aim to build and validate prognostic models for the risk of major bleeding and intracranial hemorrhage and to build a decision model that may support evidence-based decision making about which antiplatelet regimen would be most effective in different risk groups of patients. This paper outlines inclusion criteria, outcome measures, baseline characteristics, and planned statistical analysis. © 2015 World Stroke Organization.
A Bayes linear Bayes method for estimation of correlated event rates.
Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim
2013-12-01
Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
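The univariate building block of the full Bayesian comparator, a conjugate gamma-Poisson update, can be written in a few lines; the article's model extends this to correlated rates via a multivariate gamma prior, which this sketch does not attempt. The prior and data values are illustrative.

```python
import numpy as np
from scipy import stats

def gamma_poisson_update(alpha, beta, events, exposure):
    """Conjugate update for a homogeneous Poisson rate with a Gamma(alpha, beta) prior.

    Posterior: Gamma(alpha + events, beta + exposure); the posterior predictive for a
    future count is negative binomial. This is only the univariate building block of
    the correlated-rates problem discussed in the article."""
    return alpha + events, beta + exposure

alpha0, beta0 = 2.0, 4.0   # prior: mean rate 0.5 events per unit exposure (illustrative)
a_post, b_post = gamma_poisson_update(alpha0, beta0, events=7, exposure=10.0)
posterior = stats.gamma(a=a_post, scale=1.0 / b_post)
print(posterior.mean(), posterior.interval(0.95))
```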
Barrett, Tyler W; Storrow, Alan B; Jenkins, Cathy A; Abraham, Robert L; Liu, Dandan; Miller, Karen F; Moser, Kelly M; Russ, Stephan; Roden, Dan M; Harrell, Frank E; Darbar, Dawood
2015-03-15
There is wide variation in the management of patients with atrial fibrillation (AF) in the emergency department (ED). We aimed to derive and internally validate the first prospective, ED-based clinical decision aid to identify patients with AF at low risk for 30-day adverse events. We performed a prospective cohort study at a university-affiliated tertiary-care ED. Patients were enrolled from June 9, 2010, to February 28, 2013, and followed for 30 days. We enrolled a convenience sample of patients presenting to the ED with symptomatic AF. Candidate predictors were based on ED data available in the first 2 hours. The decision aid was derived using model approximation (preconditioning) followed by strong bootstrap internal validation. We used an ordinal outcome hierarchy defined as the incidence of the most severe adverse event within 30 days of the ED evaluation. Of 497 patients enrolled, stroke and AF-related death occurred in 13 (3%) and 4 (<1%) patients, respectively. The decision aid included the following: age, triage vitals (systolic blood pressure, temperature, respiratory rate, oxygen saturation, supplemental oxygen requirement), medical history (heart failure, home sotalol use, previous percutaneous coronary intervention, electrical cardioversion, cardiac ablation, frequency of AF symptoms), and ED data (2-hour heart rate, chest radiograph results, hemoglobin, creatinine, and brain natriuretic peptide). The decision aid's c-statistic in predicting any 30-day adverse event was 0.7 (95% confidence interval 0.65, 0.76). In conclusion, in patients with AF in the ED, Atrial Fibrillation and Flutter Outcome Risk Determination provides the first evidence-based decision aid for identifying patients who are at low risk for 30-day adverse events and candidates for safe discharge. Copyright © 2015 Elsevier Inc. All rights reserved.
Engineering models for catastrophe risk and their application to insurance
NASA Astrophysics Data System (ADS)
Dong, Weimin
2002-06-01
Internationally, earthquake insurance, like other lines of insurance (fire, auto), adopted an actuarial approach in the past, that is, insurance rates were determined from historical loss experience. Because earthquakes are rare events with severe consequences, irrational determination of premium rates and a lack of understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.
Sudden Cardiac Death After Non-ST-Segment Elevation Acute Coronary Syndrome.
Hess, Paul L; Wojdyla, Daniel M; Al-Khatib, Sana M; Lokhnygina, Yuliya; Wallentin, Lars; Armstrong, Paul W; Roe, Matthew T; Ohman, E Magnus; Harrington, Robert A; Alexander, John H; White, Harvey D; Van de Werf, Frans; Piccini, Jonathan P; Held, Claes; Aylward, Philip E; Moliterno, David J; Mahaffey, Kenneth W; Tricoci, Pierluigi
2016-04-01
In the current therapeutic era, the risk for sudden cardiac death (SCD) after non-ST-segment elevation acute coronary syndrome (NSTE ACS) has not been characterized completely. To determine the cumulative incidence of SCD during long-term follow-up after NSTE ACS, to develop a risk model and risk score for SCD after NSTE ACS, and to assess the association between recurrent events after the initial ACS presentation and the risk for SCD. This pooled cohort analysis merged individual data from 48 286 participants in 4 trials: the Apixaban for Prevention of Acute Ischemic Events 2 (APPRAISE-2), Study of Platelet Inhibition and Patient Outcomes (PLATO), Thrombin Receptor Antagonist for Clinical Event Reduction in Acute Coronary Syndrome (TRACER), and Targeted Platelet Inhibition to Clarify the Optimal Strategy to Medically Manage Acute Coronary Syndromes (TRILOGY ACS) trials. The cumulative incidence of SCD and cardiovascular death was examined according to time after NSTE ACS. Using competing risk and Cox proportional hazards models, clinical factors at baseline and after the index event that were associated with SCD after NSTE ACS were identified. Baseline factors were used to develop a risk model. Data were analyzed from January 2, 2014, to December 11, 2015. Sudden cardiac death. Of the initial 48 286 patients, 37 555 patients were enrolled after NSTE ACS (67.4% men; 32.6% women; median [interquartile range] age, 65 [57-72] years). Among these, 2109 deaths occurred after a median follow-up of 12.1 months. Of 1640 cardiovascular deaths, 513 (31.3%) were SCD. At 6, 18, and 30 months, the cumulative incidence estimates of SCD were 0.79%, 1.65%, and 2.37%, respectively. Reduced left ventricular ejection fraction, older age, diabetes mellitus, lower estimated glomerular filtration rate, higher heart rate, prior myocardial infarction, peripheral artery disease, Asian race, male sex, and high Killip class were significantly associated with SCD. A model developed to calculate the risk for SCD in trials with systematic collection of left ventricular ejection fraction had a C index of 0.77. An integer-based score was developed from this model and yielded a calculated SCD probability ranging from 0.1% to 56.7% (C statistic, 0.75). In a multivariable model that included time-dependent clinical events occurring after the index hospitalization for ACS, SCD was associated with recurrent myocardial infarction (hazard ratio [HR], 2.95; 95% CI, 2.29-3.80; P < .001) and any hospitalization (HR, 2.45; 95% CI, 1.98-3.03; P < .001), whereas coronary revascularization had a negative relationship with SCD (HR, 0.75; 95% CI, 0.58-0.98; P = .03). In the current therapeutic era, SCD accounts for about one-third of cardiovascular deaths after NSTE ACS. Risk stratification can be performed with good accuracy using commonly collected clinical variables. Clinical events occurring after the index hospitalization are underappreciated but important risk factors.
Including operational data in QMRA model: development and impact of model inputs.
Jaidi, Kenza; Barbeau, Benoit; Carrière, Annie; Desjardins, Raymond; Prévost, Michèle
2009-03-01
A Monte Carlo model, based on the Quantitative Microbial Risk Analysis approach (QMRA), has been developed to assess the relative risks of infection associated with the presence of Cryptosporidium and Giardia in drinking water. The impact of various approaches for modelling the initial parameters of the model on the final risk assessments is evaluated. The Monte Carlo simulations that we performed showed that the occurrence of parasites in raw water was best described by a mixed distribution: log-Normal for concentrations > detection limit (DL), and a uniform distribution for concentrations < DL. The selection of process performance distributions for modelling the performance of treatment (filtration and ozonation) influences the estimated risks significantly. The mean annual risks for conventional treatment are: 1.97E-03 (removal credit adjusted by log parasite = log spores), 1.58E-05 (log parasite = 1.7 x log spores) or 9.33E-03 (regulatory credits based on the turbidity measurement in filtered water). Using full scale validated SCADA data, the simplified calculation of CT performed at the plant was shown to largely underestimate the risk relative to a more detailed CT calculation, which takes into consideration the downtime and system failure events identified at the plant (1.46E-03 vs. 3.93E-02 for the mean risk).
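A hedged sketch of the Monte Carlo structure described above: raw-water concentrations drawn from the mixed distribution (uniform below the detection limit, log-normal above it), a sampled treatment log-removal, and an exponential dose-response. All parameter values, including the infectivity parameter r, are placeholders rather than the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000   # Monte Carlo iterations

# Raw-water oocyst concentration (per 100 L): uniform below the detection limit (DL),
# log-normal above it (mixture weight and parameters illustrative).
DL = 0.1
p_below_dl = 0.4
below = rng.uniform(0.0, DL, n)
above = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)
conc = np.where(rng.random(n) < p_below_dl, below, above)

# Treatment performance: log removal sampled to reflect process variability.
log_removal = rng.normal(loc=3.0, scale=0.5, size=n)
treated = conc * 10.0 ** (-log_removal)

# Exposure and exponential dose-response (r is a hypothetical infectivity parameter).
litres_per_day = 1.0
dose = treated / 100.0 * litres_per_day        # oocysts ingested per day
r = 0.004
daily_risk = 1.0 - np.exp(-r * dose)
annual_risk = 1.0 - (1.0 - daily_risk) ** 365
print("mean annual risk of infection:", annual_risk.mean())
```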
Tominaga, T; Miyazaki, S; Oniyama, Y; Weber, A D; Kondo, T
2017-08-01
The Japanese Postmarketing Relief System provides for compensation to patients with adverse reactions, based on the acknowledgment that unpredicted adverse events occur inevitably once a drug is marketed. The system also provides new knowledge about the benefit-risk profile of a drug that may be incorporated into product labeling. The system relies on causality assessments that are based on sound clinical pharmacology principles. The system may serve as a model for other countries' healthcare systems. © 2016 American Society for Clinical Pharmacology and Therapeutics.
Lai, Jyh-Mirn; Hwang, Yi-Ting; Chou, Chin-Cheng
2012-06-01
The highly pathogenic avian influenza virus (HPAIV) is able to survive in poultry products and could be carried into a country by air travelers. An assessment model was constructed to estimate the probability of the exotic viable HPAIV entering Taiwan from two neighboring areas through poultry products carried illegally by air passengers at Taiwan's main airports. The entrance risk was evaluated based on HPAIV-related factors (the prevalence and the incubation period of HPAIV; the manufacturing process of poultry products; and the distribution-storage-transportation factor event) and the passenger event. Distribution functions were adopted to simulate the probabilities of each HPAIV factor. The odds of passengers being intercepted with illegal poultry products were estimated by logistic regression. The Monte Carlo simulation established that the risk caused by HPAIV-related factors from area A was lower than area B, whereas the entrance risk by the passenger event from area A was similar to area B. Sensitivity analysis showed that the incubation period of HPAIV and the interception of passenger violations were major determinants. Although the result showed viable HPAIV was unlikely to enter Taiwan through meat illegally carried by air passengers, this low probability could be caused by incomplete animal disease data and modeling uncertainties. Considering the negative socioeconomic impacts of HPAIV outbreaks, strengthening airport quarantine measures is still necessary. This assessment provides a profile of HPAIV entrance risk through air travelers arriving from endemic areas and a feasible direction for quarantine and public health measures. © 2011 Society for Risk Analysis.
Assessing Risk Prediction Models Using Individual Participant Data From Multiple Studies
Pennells, Lisa; Kaptoge, Stephen; White, Ian R.; Thompson, Simon G.; Wood, Angela M.; Tipping, Robert W.; Folsom, Aaron R.; Couper, David J.; Ballantyne, Christie M.; Coresh, Josef; Goya Wannamethee, S.; Morris, Richard W.; Kiechl, Stefan; Willeit, Johann; Willeit, Peter; Schett, Georg; Ebrahim, Shah; Lawlor, Debbie A.; Yarnell, John W.; Gallacher, John; Cushman, Mary; Psaty, Bruce M.; Tracy, Russ; Tybjærg-Hansen, Anne; Price, Jackie F.; Lee, Amanda J.; McLachlan, Stela; Khaw, Kay-Tee; Wareham, Nicholas J.; Brenner, Hermann; Schöttker, Ben; Müller, Heiko; Jansson, Jan-Håkan; Wennberg, Patrik; Salomaa, Veikko; Harald, Kennet; Jousilahti, Pekka; Vartiainen, Erkki; Woodward, Mark; D'Agostino, Ralph B.; Bladbjerg, Else-Marie; Jørgensen, Torben; Kiyohara, Yutaka; Arima, Hisatomi; Doi, Yasufumi; Ninomiya, Toshiharu; Dekker, Jacqueline M.; Nijpels, Giel; Stehouwer, Coen D. A.; Kauhanen, Jussi; Salonen, Jukka T.; Meade, Tom W.; Cooper, Jackie A.; Cushman, Mary; Folsom, Aaron R.; Psaty, Bruce M.; Shea, Steven; Döring, Angela; Kuller, Lewis H.; Grandits, Greg; Gillum, Richard F.; Mussolino, Michael; Rimm, Eric B.; Hankinson, Sue E.; Manson, JoAnn E.; Pai, Jennifer K.; Kirkland, Susan; Shaffer, Jonathan A.; Shimbo, Daichi; Bakker, Stephan J. L.; Gansevoort, Ron T.; Hillege, Hans L.; Amouyel, Philippe; Arveiler, Dominique; Evans, Alun; Ferrières, Jean; Sattar, Naveed; Westendorp, Rudi G.; Buckley, Brendan M.; Cantin, Bernard; Lamarche, Benoît; Barrett-Connor, Elizabeth; Wingard, Deborah L.; Bettencourt, Richele; Gudnason, Vilmundur; Aspelund, Thor; Sigurdsson, Gunnar; Thorsson, Bolli; Kavousi, Maryam; Witteman, Jacqueline C.; Hofman, Albert; Franco, Oscar H.; Howard, Barbara V.; Zhang, Ying; Best, Lyle; Umans, Jason G.; Onat, Altan; Sundström, Johan; Michael Gaziano, J.; Stampfer, Meir; Ridker, Paul M.; Michael Gaziano, J.; Ridker, Paul M.; Marmot, Michael; Clarke, Robert; Collins, Rory; Fletcher, Astrid; Brunner, Eric; Shipley, Martin; Kivimäki, Mika; Ridker, Paul M.; Buring, Julie; Cook, Nancy; Ford, Ian; Shepherd, James; Cobbe, Stuart M.; Robertson, Michele; Walker, Matthew; Watson, Sarah; Alexander, Myriam; Butterworth, Adam S.; Angelantonio, Emanuele Di; Gao, Pei; Haycock, Philip; Kaptoge, Stephen; Pennells, Lisa; Thompson, Simon G.; Walker, Matthew; Watson, Sarah; White, Ian R.; Wood, Angela M.; Wormser, David; Danesh, John
2014-01-01
Individual participant time-to-event data from multiple prospective epidemiologic studies enable detailed investigation into the predictive ability of risk models. Here we address the challenges in appropriately combining such information across studies. Methods are exemplified by analyses of log C-reactive protein and conventional risk factors for coronary heart disease in the Emerging Risk Factors Collaboration, a collation of individual data from multiple prospective studies with an average follow-up duration of 9.8 years (dates varied). We derive risk prediction models using Cox proportional hazards regression analysis stratified by study and obtain estimates of risk discrimination, Harrell's concordance index, and Royston's discrimination measure within each study; we then combine the estimates across studies using a weighted meta-analysis. Various weighting approaches are compared and lead us to recommend using the number of events in each study. We also discuss the calculation of measures of reclassification for multiple studies. We further show that comparison of differences in predictive ability across subgroups should be based only on within-study information and that combining measures of risk discrimination from case-control studies and prospective studies is problematic. The concordance index and discrimination measure gave qualitatively similar results throughout. While the concordance index was very heterogeneous between studies, principally because of differing age ranges, the increments in the concordance index from adding log C-reactive protein to conventional risk factors were more homogeneous. PMID:24366051
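A minimal sketch of the recommended weighting step, pooling study-specific Harrell's C-indexes with weights proportional to the number of events in each study; the inputs are invented and the spread statistic is purely descriptive.

```python
import numpy as np

def pool_c_index(c_values, n_events):
    """Combine study-specific Harrell's C-indexes using event-count weights,
    as recommended in the article; inputs here are illustrative, not the
    Emerging Risk Factors Collaboration data."""
    c = np.asarray(c_values, dtype=float)
    w = np.asarray(n_events, dtype=float)
    pooled = np.sum(w * c) / np.sum(w)
    # descriptive between-study spread of the weighted estimates
    spread = np.sqrt(np.sum(w * (c - pooled) ** 2) / np.sum(w))
    return pooled, spread

pooled, spread = pool_c_index([0.72, 0.68, 0.75, 0.70], [150, 90, 210, 60])
print(f"pooled C-index {pooled:.3f}, between-study spread {spread:.3f}")
```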
NASA Technical Reports Server (NTRS)
Butler, Doug; Bauman, David; Johnson-Throop, Kathy
2011-01-01
The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts to the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meets the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.; Jones, Scott M.
2017-01-01
Aircraft flying in regions of high ice crystal concentrations are susceptible to the buildup of ice within the compression system of their gas turbine engines. This ice buildup can restrict engine airflow and cause an uncommanded loss of thrust, also known as engine rollback, which poses a potential safety hazard. The aviation community is conducting research to understand this phenomenon and to identify avoidance and mitigation strategies to address the concern. To support this research, a dynamic turbofan engine model has been created to enable the development and evaluation of engine icing detection and control-based mitigation strategies. This model captures the dynamic engine response due to high ice water ingestion and the buildup of ice blockage in the engine's low-pressure compressor. It includes a fuel control system allowing engine closed-loop control effects during engine icing events to be emulated. The model also includes bleed air valve and horsepower extraction actuators that, when modulated, change overall engine operating performance. This system-level model has been developed and compared against test data acquired from an aircraft turbofan engine undergoing engine icing studies in an altitude test facility and also against outputs from the manufacturer's customer deck. This paper will describe the model and show results of its dynamic response under open-loop and closed-loop control operating scenarios in the presence of ice blockage buildup, compared against engine test cell data. Planned follow-on use of the model for the development and evaluation of icing detection and control-based mitigation strategies will also be discussed. The intent is to combine the model and control mitigation logic with an engine icing risk calculation tool capable of predicting the risk of engine icing based on current operating conditions. Upon detection of an operating region of risk for engine icing events, the control mitigation logic will seek to change the engine's operating point to a region of lower risk through the modulation of available control actuators while maintaining the desired engine thrust output. Follow-on work will assess the feasibility and effectiveness of such control-based mitigation strategies.
NASA Technical Reports Server (NTRS)
Culpepper, William X.; ONeill, Pat; Nicholson, Leonard L.
2000-01-01
An internuclear cascade and evaporation model has been adapted to estimate the LET spectrum generated during testing with 200 MeV protons. The model-generated heavy ion LET spectrum is compared to the heavy ion LET spectrum seen on orbit. This comparison is the basis for predicting single event failure rates from heavy ions using results from a single proton test. Of equal importance, this spectral comparison also establishes an estimate of the risk of encountering a failure mode on orbit that was not detected during proton testing. Verification of the general results of the model is presented based on experiments, individual part test results, and flight data. Acceptance of this model and its estimate of remaining risk opens the hardware verification philosophy to the consideration of radiation testing with high-energy protons at the board and box level instead of the more standard method of individual part testing with low-energy heavy ions.
Comparing flood loss models of different complexity
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Riggelsen, Carsten; Scherbaum, Frank; Merz, Bruno
2013-04-01
Any deliberation on flood risk requires the consideration of potential flood losses. In particular, reliable flood loss models are needed to evaluate the cost-effectiveness of mitigation measures, to assess vulnerability, and for comparative risk analysis and financial appraisal during and after floods. In recent years, considerable improvements have been made both in the data basis and in the methodological approaches used for the development of flood loss models. Despite that, flood loss models remain an important source of uncertainty. Likewise, the temporal and spatial transferability of flood loss models is still limited. This contribution investigates the predictive capability of different flood loss models in a split-sample, cross-regional validation approach. For this purpose, flood loss models of different complexity, i.e. based on different numbers of explanatory variables, are learned from a set of damage records that was obtained from a survey after the Elbe flood in 2002. The validation of model predictions is carried out for different flood events in the Elbe and Danube river basins in 2002, 2005 and 2006, for which damage records are available from surveys after the flood events. The models investigated are a stage-damage model, the rule-based model FLEMOps+r, and novel model approaches derived using the data-mining techniques of regression trees and Bayesian networks. The Bayesian network approach to flood loss modelling provides attractive additional information concerning the probability distribution of both model predictions and explanatory variables.
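As an illustration of one of the data-mining approaches mentioned (regression trees), the sketch below fits a shallow tree to simulated damage records with scikit-learn; the predictors and the synthetic loss function are assumptions, not the surveyed Elbe/Danube data.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

# Hypothetical damage records: water depth (m), building value class, precaution
# indicator -> relative loss in [0, 1]. Real models are learned from post-flood surveys.
rng = np.random.default_rng(1)
n = 500
depth = rng.uniform(0, 3, n)
value_class = rng.integers(1, 4, n)
precaution = rng.integers(0, 2, n)
rel_loss = np.clip(0.15 * depth + 0.05 * value_class - 0.1 * precaution
                   + rng.normal(0, 0.05, n), 0, 1)

X = np.column_stack([depth, value_class, precaution])
X_train, X_test, y_train, y_test = train_test_split(X, rel_loss, random_state=0)

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20)  # shallow tree stays interpretable
tree.fit(X_train, y_train)
print("R^2 on held-out records:", tree.score(X_test, y_test))
```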
Quantitative risk assessment system (QRAS)
NASA Technical Reports Server (NTRS)
Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)
2001-01-01
A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
Hudson, Christopher D.; Huxley, Jonathan N.; Green, Martin J.
2014-01-01
The ever-growing volume of data routinely collected and stored in everyday life presents researchers with a number of opportunities to gain insight and make predictions. This study aimed to demonstrate the usefulness in a specific clinical context of a simulation-based technique called probabilistic sensitivity analysis (PSA) in interpreting the results of a discrete time survival model based on a large dataset of routinely collected dairy herd management data. Data from 12,515 dairy cows (from 39 herds) were used to construct a multilevel discrete time survival model in which the outcome was the probability of a cow becoming pregnant during a given two day period of risk, and presence or absence of a recorded lameness event during various time frames relative to the risk period amongst the potential explanatory variables. A separate simulation model was then constructed to evaluate the wider clinical implications of the model results (i.e. the potential for a herd’s incidence rate of lameness to influence its overall reproductive performance) using PSA. Although the discrete time survival analysis revealed some relatively large associations between lameness events and risk of pregnancy (for example, occurrence of a lameness case within 14 days of a risk period was associated with a 25% reduction in the risk of the cow becoming pregnant during that risk period), PSA revealed that, when viewed in the context of a realistic clinical situation, a herd’s lameness incidence rate is highly unlikely to influence its overall reproductive performance to a meaningful extent in the vast majority of situations. Construction of a simulation model within a PSA framework proved to be a very useful additional step to aid contextualisation of the results from a discrete time survival model, especially where the research is designed to guide on-farm management decisions at population (i.e. herd) rather than individual level. PMID:25101997
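A compressed sketch of the probabilistic sensitivity analysis idea: each outer draw samples one plausible set of parameter values (baseline pregnancy probability per risk period, effect of a recent lameness case, herd lameness incidence) and then simulates herd-level reproductive performance. The distributions and the simplified geometric-waiting-time herd simulation are assumptions for illustration only, not the published model.

```python
import numpy as np

rng = np.random.default_rng(7)
n_draws = 5000        # PSA iterations: each draw is one plausible "state of the world"
n_cows = 200          # simulated herd size

herd_means = []
for _ in range(n_draws):
    base_p = rng.beta(20, 380)                 # pregnancy probability per 2-day risk period
    risk_reduction = rng.normal(0.25, 0.05)    # effect of a recent lameness case (illustrative)
    lame_rate = rng.uniform(0.1, 0.6)          # annual lameness incidence per cow
    p_recently_lame = lame_rate * 14 / 365     # chance a given risk period follows a case

    # Effective per-period pregnancy probability, averaged over lameness exposure
    p_eff = base_p * (1 - risk_reduction * p_recently_lame)
    periods_to_conception = rng.geometric(p_eff, size=n_cows)
    days_open = np.minimum(2 * periods_to_conception, 365)
    herd_means.append(days_open.mean())

print("mean herd days-open across PSA draws:", np.mean(herd_means))
print("95% interval:", np.percentile(herd_means, [2.5, 97.5]))
```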
NASA Astrophysics Data System (ADS)
Florian, Ehmele; Michael, Kunz
2016-04-01
Several major flood events have occurred in Germany in the past 15-20 years, especially in the eastern parts along the rivers Elbe and Danube. Examples include the major floods of 2002 and 2013, with an estimated loss of about 2 billion Euros each. The last major flood events in the State of Baden-Württemberg in southwest Germany occurred in the years 1978 and 1993/1994 along the rivers Rhine and Neckar, with an estimated total loss of about 150 million Euros (converted) each. Flood hazard originates from a combination of different meteorological, hydrological and hydraulic processes. Currently there is no defined methodology available for evaluating and quantifying the flood hazard and the related risk for larger areas or whole river catchments instead of single gauges. In order to estimate the probable maximum loss for higher return periods (e.g. 200 years, PML200), a stochastic model approach is designed, since observational data are limited in time and space. In our approach, precipitation is linearly composed of three elements: background precipitation, orographically induced precipitation, and a convectively driven part. We use the linear theory of orographic precipitation formation for the stochastic precipitation model (SPM), which is based on fundamental statistics of relevant atmospheric variables. For an adequate number of historic flood events, the corresponding atmospheric conditions and parameters are determined in order to calculate a probability density function (pdf) for each variable. This method covers theoretically possible scenarios that may not have happened yet. This work is part of the FLORIS-SV (FLOod RISk Sparkassen Versicherung) project and establishes the first step of a complete modelling chain of the flood risk. On the basis of the generated stochastic precipitation event set, hydrological and hydraulic simulations will be performed to estimate discharge and water levels. The resulting stochastic flood event set will be used to quantify the flood risk and to estimate the probable maximum loss (e.g. PML200) for a given property (buildings, industry) portfolio.
Prediction of Coronary Artery Disease Risk Based on Multiple Longitudinal Biomarkers
Yang, Lili; Yu, Menggang; Gao, Sujuan
2016-01-01
In the last decade, few topics in the area of cardiovascular disease (CVD) research have received as much attention as risk prediction. One of the well documented risk factors for CVD is high blood pressure (BP). Traditional CVD risk prediction models consider BP levels measured at a single time and such models form the basis for current clinical guidelines for CVD prevention. However, in clinical practice, BP levels are often observed and recorded in a longitudinal fashion. Information on BP trajectories can be powerful predictors for CVD events. We consider joint modeling of time to coronary artery disease and individual longitudinal measures of systolic and diastolic BPs in a primary care cohort with up to 20 years of follow-up. We applied novel prediction metrics to assess the predictive performance of joint models. Predictive performances of proposed joint models and other models were assessed via simulations and illustrated using the primary care cohort. PMID:26439685
An Agent-Based Model of Evolving Community Flood Risk.
Tonn, Gina L; Guikema, Seth D
2018-06-01
Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent-based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near-miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high-risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in-depth behavioral and decision rules at the individual and community level. © 2017 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Augustin, C. M.
2015-12-01
Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occuring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various duration and volume. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings -analyzed in Slovic and Weber's 2002 framework - that show a high unknown, high dread risk perception of leaks from a CCS site. Secondary findings are a conflation of CCS with the more advanced, widespread technology hydraulic fracturing and corresponding strong risk associations. We conclude with suggestions on how to integrate modeling results into public conversations to improve risk awareness and we provide preliminary policy recommendations to increase public support for CCS.
Brain Perivascular Spaces as Biomarkers of Vascular Risk: Results from the Northern Manhattan Study.
Gutierrez, J; Elkind, M S V; Dong, C; Di Tullio, M; Rundek, T; Sacco, R L; Wright, C B
2017-05-01
Dilated perivascular spaces in the brain are associated with greater arterial pulsatility. We hypothesized that perivascular spaces identify individuals at higher risk for systemic and cerebral vascular events. Stroke-free participants in the population-based Northern Manhattan Study had brain MR imaging performed and were followed for myocardial infarction, any stroke, and death. Imaging analyses distinguished perivascular spaces from lesions presumably ischemic. Perivascular spaces were further subdivided into lesions with diameters of ≤3 mm (small perivascular spaces) and >3 mm (large perivascular spaces). We calculated relative rates of events with Poisson models and hazard ratios with Cox proportional models. The Northern Manhattan Study participants who had MR imaging data available for review ( n = 1228; 59% women, 65% Hispanic; mean age, 71 ± 9 years) were followed for an average of 9 ± 2 years. Participants in the highest tertile of the small perivascular space score had a higher relative rate of all deaths (relative rate, 1.38; 95% CI, 1.01-1.91), vascular death (relative rate, 1.87; 95% CI, 1.12-3.14), myocardial infarction (relative rate, 2.08; 95% CI, 1.01-4.31), any stroke (relative rate, 1.79; 95% CI, 1.03-3.11), and any vascular event (relative rate, 1.74; 95% CI, 1.18-2.56). After we adjusted for confounders, there was a higher risk of vascular death (hazard ratio, 1.06; 95% CI, 1.01-1.11), myocardial infarction (hazard ratio, 2.22; 95% CI, 1.12-4.42), and any vascular event (hazard ratio, 1.04; 95% CI, 1.01-1.08) with higher small perivascular space scores. In this multiethnic, population-based study, participants with a high burden of small perivascular spaces had increased risk of vascular events. By gaining pathophysiologic insight into the mechanism of perivascular space dilation, we may be able to propose novel therapies to better prevent vascular disorders in the population. © 2017 by American Journal of Neuroradiology.
NASA Astrophysics Data System (ADS)
Mortazavi-Naeini, M.; Bussi, G.; Hall, J. W.; Whitehead, P. G.
2016-12-01
The main aim of water companies is to have a reliable and safe water supply system. To fulfil this duty, water companies have to consider both water quality and water quantity issues and challenges. Climate change and population growth will have an impact on water resources, both in terms of available water and of river water quality. Traditionally, a distinct separation between water quality and abstraction has existed. However, water quality can be a bottleneck in a system, since water treatment works can only treat water if it meets certain standards. For instance, high turbidity and a large phytoplankton content can sharply increase the cost of treatment or even make river water unfit for human consumption. It is vital for water companies to be able to characterise the quantity and quality of water under extreme weather events and to consider the occurrence of periods when water abstraction has to cease owing to water quality constraints. This will give them the opportunity to make decisions on water resource planning and on potential changes to reduce the risk of system failure. We present a risk-based approach for incorporating extreme events, based on future climate change scenarios from a large ensemble of climate model realisations, into an integrated water resources model through the combined use of a water allocation model (WATHNET) and a water quality model (INCA). The annual frequency of imposed restrictions on demand is considered as the measure of reliability. We tested our approach on the Thames region, in the UK, with 100 extreme events. The results show an increase in the frequency of imposed restrictions when water quality constraints are considered, which indicates the importance of considering water quality issues in drought management plans.
Hoffmann, Udo; Massaro, Joseph M; D'Agostino, Ralph B; Kathiresan, Sekar; Fox, Caroline S; O'Donnell, Christopher J
2016-02-22
We determined whether vascular and valvular calcification predicted incident major coronary heart disease, cardiovascular disease (CVD), and all-cause mortality independent of Framingham risk factors in the community-based Framingham Heart Study. Coronary artery calcium (CAC), thoracic and abdominal aortic calcium, and mitral or aortic valve calcium were measured by cardiac computed tomography in participants free of CVD. Participants were followed for a median of 8 years. Multivariate Cox proportional hazards models were used to determine association of CAC, thoracic and abdominal aortic calcium, and mitral and aortic valve calcium with end points. Improvement in discrimination beyond risk factors was tested via the C-statistic and net reclassification index. In this cohort of 3486 participants (mean age 50±10 years; 51% female), CAC was most strongly associated with major coronary heart disease, followed by major CVD, and all-cause mortality independent of Framingham risk factors. Among noncoronary calcifications, mitral valve calcium was associated with major CVD and all-cause mortality independent of Framingham risk factors and CAC. CAC significantly improved discriminatory value beyond risk factors for coronary heart disease (area under the curve 0.78-0.82; net reclassification index 32%, 95% CI 11-53) but not for CVD. CAC accurately reclassified 85% of the 261 patients who were at intermediate (5-10%) 10-year risk for coronary heart disease based on Framingham risk factors to either low risk (n=172; no events observed) or high risk (n=53; observed event rate 8%). CAC improves discrimination and risk reclassification for major coronary heart disease and CVD beyond risk factors in asymptomatic community-dwelling persons and accurately reclassifies two-thirds of the intermediate-risk population. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Manzini, G; Ettrich, T J; Kremer, M; Kornmann, M; Henne-Bruns, D; Eikema, D A; Schlattmann, P; de Wreede, L C
2018-02-13
Standard survival analysis fails to give insight into what happens to a patient after a first outcome event (such as the first relapse of a disease). Multi-state models are a useful tool for analyzing survival data when different treatments and results (intermediate events) can occur. The aim of this study was to implement a multi-state model on data of patients with rectal cancer to illustrate the advantages of multi-state analysis in comparison with standard survival analysis. We re-analyzed data from the RCT FOGT-2 study using a multi-state model. Based on the results, we defined high-risk and low-risk reference patients. Using dynamic prediction, we estimated how the survival probability changes as more information about the clinical history of the patient becomes available. A patient with stage UICC IIIc (vs UICC II) has a higher risk of developing distant metastasis (DM) or both DM and local recurrence (LR) if he/she discontinues chemotherapy (CTx) within 6 months or between 6 and 12 months, as well as after completion of 12 months of CTx, with HR 3.55 (p = 0.026), 5.33 (p = 0.001) and 3.37 (p < 0.001), respectively. He/she also has a higher risk of dying after the development of DM (HR 1.72, p = 0.023). Anterior resection vs. abdominoperineal amputation is associated with a 63% reduction in the risk of developing DM or both DM and LR (HR 0.37, p = 0.003) after discontinuation of chemotherapy between 6 and 12 months. After the development of LR, a woman has a 4.62 times higher risk of dying (p = 0.006). A high-risk reference patient has an estimated 43% 5-year survival probability at the start of CTx, whereas for a low-risk patient this is 79%. After the development of DM 1 year later, the high-risk patient has an estimated 5-year survival probability of 11% and the low-risk patient one of 21%. Multi-state models help to gain additional insight into the complex events that occur after the start of treatment. Dynamic prediction shows how survival probabilities change with the progression of the clinical history.
Han, Yaling; Chen, Jiyan; Qiu, Miaohan; Li, Yi; Li, Jing; Feng, Yingqing; Qiu, Jian; Meng, Liang; Sun, Yihong; Tao, Guizhou; Wu, Zhaohui; Yang, Chunyu; Guo, Jincheng; Pu, Kui; Chen, Shaoliang; Wang, Xiaozeng
2018-06-05
The prognosis of patients with coronary artery disease (CAD) at hospital discharge varies considerably, and the post-discharge risk of ischemic events remains a concern. However, risk prediction tools to identify the risk of ischemia for these patients have not yet been reported. We sought to develop a scoring system for predicting long-term ischemic events in CAD patients receiving antiplatelet therapy that would support appropriate personalized decision-making for these patients. In this prospective Optimal antiPlatelet Therapy for Chinese patients with Coronary Artery Disease (OPT-CAD, NCT01735305) registry, a total of 14,032 patients with CAD receiving at least one kind of antiplatelet agent were enrolled from 107 centers across China, from January 2012 to March 2014. The risk scoring system was developed in a derivation cohort (the first 10,000 patients enrolled in the database) using a logistic regression model and was subsequently tested in a validation cohort (the last 4,032 patients). Points in the risk score were assigned based on the multivariable odds ratio of each factor. Ischemic events were defined as the composite of cardiac death, myocardial infarction or stroke. Ischemic events occurred in 342 (3.4%) patients in the derivation cohort and 160 (4.0%) patients in the validation cohort during 1-year follow-up. The OPT-CAD score, ranging from 0 to 257 points, consists of 10 independent risk factors: age (0-71 points), heart rate (0-36 points), hypertension (0-20 points), prior myocardial infarction (16 points), prior stroke (16 points), renal insufficiency (21 points), anemia (19 points), low ejection fraction (22 points), positive cardiac troponin (23 points) and ST-segment deviation (13 points). In predicting 1-year ischemic events, the areas under the receiver operating characteristic curve were 0.73 and 0.72 in the derivation and validation cohorts, respectively. The incidences of ischemic events in low- (0-90 points), medium- (91-150 points) and high-risk (≥151 points) patients were 1.6%, 5.5%, and 15.0%, respectively. Compared with the GRACE score, the OPT-CAD score had better discrimination in predicting ischemic events and all-cause mortality (ischemic events: 0.72 vs 0.65; all-cause mortality: 0.79 vs 0.72; both P<0.001). Among CAD patients, a risk score based on 10 baseline clinical variables performed better than the GRACE risk score in predicting long-term ischemic events. However, further research is needed to assess the value of the OPT-CAD score in guiding the management of antiplatelet therapy for patients with CAD. This article is protected by copyright. All rights reserved.
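The integer-point structure reported above can be sketched as a simple scoring function. The binary-factor points and the risk-tier cut-offs follow the abstract, but the abstract does not give the mapping from age, heart rate or hypertension to their point ranges, so those contributions are passed in as already-computed points; the example values are hypothetical.

```python
def opt_cad_score(age_points, heart_rate_points, hypertension_points,
                  st_deviation, prior_mi, prior_stroke, renal_insufficiency,
                  anemia, low_ef, positive_troponin):
    """Sum OPT-CAD points. Binary-factor points follow the abstract; how age, heart
    rate and hypertension map onto their 0-71, 0-36 and 0-20 point ranges is not
    specified here, so those contributions must be supplied directly."""
    score = age_points + heart_rate_points + hypertension_points
    score += 16 * prior_mi + 16 * prior_stroke + 21 * renal_insufficiency
    score += 19 * anemia + 22 * low_ef + 23 * positive_troponin
    score += 13 * st_deviation
    return score

def risk_tier(score):
    if score <= 90:
        return "low (observed 1-year ischemic event rate 1.6%)"
    if score <= 150:
        return "medium (5.5%)"
    return "high (15.0%)"

s = opt_cad_score(age_points=40, heart_rate_points=10, hypertension_points=10,
                  st_deviation=1, prior_mi=1, prior_stroke=0,
                  renal_insufficiency=0, anemia=0, low_ef=0, positive_troponin=1)
print(s, risk_tier(s))
```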
A formal framework of scenario creation and analysis of extreme hydrological events
NASA Astrophysics Data System (ADS)
Lohmann, D.
2007-12-01
We present a formal framework for a hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability. These are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation and also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
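A minimal sketch of how average annual loss (AAL) and an occurrence exceedance probability (OEP) curve can be computed from a simulated year-loss table; the toy event set (Poisson event counts, log-normal event losses) stands in for a 100,000-year scenario set and is not the RMS model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_years = 100_000

# Toy stochastic event set: Poisson number of flood events per year, log-normal
# loss per event (parameters purely illustrative).
n_events = rng.poisson(0.3, n_years)
year_max = np.zeros(n_years)          # largest single-event loss in each year
year_sum = np.zeros(n_years)          # total loss in each year
for y in range(n_years):
    if n_events[y]:
        losses = rng.lognormal(mean=16, sigma=1.2, size=n_events[y])
        year_max[y] = losses.max()
        year_sum[y] = losses.sum()

aal = year_sum.mean()                 # average annual loss
return_periods = np.array([10, 50, 100, 200, 500])
# OEP: the loss exceeded by the largest event of the year with probability 1/T,
# i.e. the (1 - 1/T) quantile of the annual maximum event loss.
oep = np.quantile(year_max, 1.0 - 1.0 / return_periods)
print("AAL:", round(aal))
for t, loss in zip(return_periods, oep):
    print(f"{t}-year OEP loss: {loss:,.0f}")
```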
The 10-year Absolute Risk of Cardiovascular (CV) Events in Northern Iran: a Population Based Study
Motamed, Nima; Mardanshahi, Alireza; Saravi, Benyamin Mohseni; Siamian, Hasan; Maadi, Mansooreh; Zamani, Farhad
2015-01-01
Background: The present study was conducted to estimate 10-year cardiovascular disease (CVD) event risk using three instruments in northern Iran. Material and methods: Baseline data of 3,201 participants aged 40-79 from a population-based cohort conducted in northern Iran were analyzed. The Framingham risk score (FRS), World Health Organization (WHO) risk prediction charts, and the American College of Cardiology/American Heart Association (ACC/AHA) tool were applied to assess 10-year CVD event risk. Agreement between the risk assessment instruments was determined using the kappa statistic. Results: Our study estimated that 53.5% of the male population aged 40-79 had a 10-year risk of CVD events ≥10% based on the ACC/AHA approach, 48.9% based on the FRS, and 11.8% based on the WHO risk charts. A 10-year risk ≥10% was estimated for 20.1% of women using the ACC/AHA approach, 11.9% using the FRS, and 5.7% using the WHO tool. The ACC/AHA and Framingham tools showed the closest agreement in estimating a 10-year risk ≥10% (κ=0.7757), whereas the ACC/AHA and WHO approaches displayed the highest agreement (κ=0.6123) in women. Conclusion: Different estimates of the 10-year risk of CVD events were provided by the ACC/AHA, FRS, and WHO approaches. PMID:26236160
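The agreement statistic used above can be computed directly; a small sketch with synthetic high-risk flags (not the study data):

```python
# Cohen's kappa between two binary classifications (10-year risk >= 10%) from
# two instruments; the per-subject flags below are synthetic, not study data.
from sklearn.metrics import cohen_kappa_score

acc_aha_high_risk = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
frs_high_risk     = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]

print(f"kappa = {cohen_kappa_score(acc_aha_high_risk, frs_high_risk):.3f}")
```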
A Robust Response of Precipitation to Global Warming from CMIP5 Models
NASA Technical Reports Server (NTRS)
Lau, K. -M.; Wu, H. -T.; Kim, K. -M.
2012-01-01
How precipitation responds to global warming is a major concern to society and a challenge to climate change research. Based on analyses of rainfall probability distribution functions of 14 state-of-the-art climate models, we find a robust, canonical global rainfall response to a triple CO2 warming scenario, featuring 100-250% more heavy rain, 5-10% less moderate rain, and 10-15% more very light or no-rain events. Regionally, a majority of the models project a consistent response with more heavy rain events over climatologically wet regions of the deep tropics, and more dry events over subtropical and tropical land areas. Results suggest that increased CO2 emissions induce basic structural changes in global rain systems, increasing risks of severe floods and droughts in preferred geographic locations worldwide.
Coronary artery calcium distributions in older persons in the AGES-Reykjavik study
Gudmundsson, Elias Freyr; Gudnason, Vilmundur; Sigurdsson, Sigurdur; Launer, Lenore J.; Harris, Tamara B.; Aspelund, Thor
2013-01-01
Coronary Artery Calcium (CAC) is a sign of advanced atherosclerosis and an independent risk factor for cardiac events. Here, we describe CAC distributions in an unselected aged population and compare modelling methods to characterize the CAC distribution. CAC is difficult to model because it has a skewed and zero-inflated distribution with over-dispersion. Data are from the AGES-Reykjavik sample, a large population-based study [2002-2006] of 5,764 persons aged 66-96 years in Iceland. Linear regressions using logarithmic and Box-Cox transformations on CAC+1, quantile regression, and a Zero-Inflated Negative Binomial model (ZINB) were applied. Methods were compared visually and with the PRESS statistic, R2, and the number of detected associations with concurrently measured variables. There were pronounced differences in CAC according to sex, age, history of coronary events, and presence of plaque in the carotid artery. Associations with conventional coronary artery disease (CAD) risk factors varied between the sexes. The ZINB model provided the best results with respect to the PRESS statistic, R2, and the predicted proportion of zero scores. The ZINB model detected a similar number of associations as the linear regression on ln(CAC+1), and usually with the same risk factors. PMID:22990371
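A hedged illustration of fitting a ZINB model of the kind compared above, using statsmodels on synthetic data (the covariates and coefficients are placeholders, and Agatston scores are treated here as counts purely to illustrate the model structure):

```python
# Zero-inflated negative binomial fit with statsmodels on synthetic data.
# Covariates (age, male) and coefficients are placeholders; Agatston scores
# are treated as counts here purely to illustrate the model structure.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(1)
n = 2000
age = rng.uniform(66, 96, n)
male = rng.integers(0, 2, n).astype(float)
X = sm.add_constant(np.column_stack([age, male]))

p_zero = 1 / (1 + np.exp(-(3.0 - 0.05 * age)))          # structural-zero probability
mu = np.exp(-2.0 + 0.06 * age + 0.8 * male)              # negative binomial mean
counts = rng.negative_binomial(1.0, 1.0 / (1.0 + mu))    # over-dispersed count draws
cac = np.where(rng.uniform(size=n) < p_zero, 0, counts)

model = ZeroInflatedNegativeBinomialP(cac, X, exog_infl=X, p=2)
result = model.fit(method="bfgs", maxiter=500, disp=False)
print(result.summary())
```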
Visualizing the uncertainty in the relationship between seasonal average climate and malaria risk.
MacLeod, D A; Morse, A P
2014-12-02
Around $1.6 billion per year is spent financing anti-malaria initiatives, and though malaria morbidity is falling, the impact of annual epidemics remains significant. Whilst malaria risk may increase with climate change, projections are highly uncertain; to sidestep this intractable uncertainty, adaptation efforts should improve societal ability to anticipate and mitigate individual events. Anticipation of climate-related events is made possible by seasonal climate forecasting, from which warnings of anomalous seasonal average temperature and rainfall are possible months in advance. Seasonal climate hindcasts have been used to drive climate-based models for malaria, showing significant skill for observed malaria incidence. However, the relationship between seasonal average climate and malaria risk remains unquantified. Here we explore this relationship, using a dynamic weather-driven malaria model. We also quantify key uncertainty in the malaria model, by introducing variability in one of the first-order uncertainties in model formulation. Results are visualized as location-specific impact surfaces: easily integrated with ensemble seasonal climate forecasts, and intuitively communicating quantified uncertainty. Methods are demonstrated for two epidemic regions, and are not limited to malaria modeling; the visualization method could be applied to any climate impact.
Greisenegger, Stefan; Segal, Helen C; Burgess, Annette I; Poole, Debbie L; Mehta, Ziyah; Rothwell, Peter M
2015-11-01
Copeptin, the c-terminal portion of provasopressin, is a useful prognostic marker in patients after myocardial infarction and heart failure. More recently, high levels of copeptin have also been associated with worse functional outcome and increased mortality within the first year after ischemic stroke and transient ischemic attack (TIA). However, to date, there are no published data on whether copeptin predicts long-term risk of vascular events after TIA and stroke. We measured copeptin levels in consecutive patients with TIA or ischemic stroke in a population-based study (Oxford Vascular Study) recruited from 2002 to 2007 and followed up to 2014. Associations with risk of recurrent vascular events were determined by Cox-regression. During ≈6000 patient-years in 1076 patients, there were 357 recurrent vascular events, including 174 ischemic strokes. After adjustment for age, sex, and risk factors, copeptin was predictive of recurrent vascular events (adjusted hazard ratio per SD, 1.47; 95% confidence interval, 1.31-1.64; P=0.0001), vascular death (1.85; 1.60-2.14; P<0.0001), all-cause death (1.75; 1.58-1.93; P<0.0001), and recurrent ischemic stroke (1.22; 1.04-1.44; P=0.017); and improved model-discrimination significantly: net reclassification improvement for recurrent vascular events (32%; P<0.0001), vascular death (55%; P<0.0001), death (66%; P<0.0001), and recurrent stroke (16%; P=0.044). The predictive value of copeptin was largest in patients with cardioembolic index events (adjusted hazard ratio, 1.84; 95% confidence interval, 1.53-2.20 versus 1.31, 1.14-1.50 in noncardioembolic stroke; P=0.0025). In patients with cardioembolic stroke, high copeptin levels were associated with a 4-fold increased risk of vascular events within the first year of follow-up (adjusted hazard ratio, 4.02; 95% confidence interval, 2.13-7.70). In patients with TIA and ischemic stroke, copeptin predicted recurrent vascular events and death, particularly after cardioembolic TIA/stroke. Further validation is required, in particular, in studies using more extensive cardiac evaluation. © 2015 American Heart Association, Inc.
NASA Astrophysics Data System (ADS)
Cheng, Chad Shouquan; Li, Qian; Li, Guilong
2010-05-01
The synoptic weather typing approach has become popular in evaluating the impacts of climate change on a variety of environmental problems. One of the reasons is its ability to categorize a complex set of meteorological variables as a coherent index, which can facilitate analyses of local climate change impacts. The weather typing method has been applied at Environment Canada to analyze climate change impacts on various meteorological/hydrological risks, such as freezing rain, heavy rainfall, high-/low-flow events, air pollution, and human health. These studies comprise three major parts: (1) historical simulation modeling to verify the hazardous events, (2) statistical downscaling to provide station-scale future climate information, and (3) estimates of changes in the frequency and magnitude of future hazardous meteorological/hydrological events in this century. To achieve these goals, in addition to synoptic weather typing, modeling conceptualizations in meteorology and hydrology and various linear/nonlinear regression techniques were applied. Furthermore, a formal model result verification process was built into the entire modeling exercise. The results of the verification, based on historical observations of the outcome variables predicted by the models, showed very good agreement. This paper briefly summarizes these research projects, focusing on the modeling exercise and results.
Taft, L M; Evans, R S; Shyu, C R; Egger, M J; Chawla, N; Mitchell, J A; Thornton, S N; Bray, B; Varner, M
2009-04-01
The IOM report, Preventing Medication Errors, emphasizes the overall lack of knowledge of the incidence of adverse drug events (ADE). Operating rooms, emergency departments and intensive care units are known to have a higher incidence of ADE. Labor and delivery (L&D) is an emergency care unit that could have an increased risk of ADE, where reported rates remain low and under-reporting is suspected. Risk factor identification with electronic pattern recognition techniques could improve ADE detection rates. The objective of the present study is to apply the Synthetic Minority Over-sampling Technique (SMOTE) as an enhanced sampling method in a sparse dataset to generate prediction models that identify ADE in women admitted for labor and delivery, based on patient risk factors and comorbidities. By creating synthetic cases with the SMOTE algorithm and using a 10-fold cross-validation technique, we demonstrated improved performance of the Naïve Bayes and decision tree algorithms. The true positive rate (TPR) of 0.32 in the raw dataset increased to 0.67 in the 800% over-sampled dataset. Enhanced performance from classification algorithms can be attained with the use of synthetic minority class oversampling techniques in sparse clinical datasets. Predictive models created in this manner can be used to develop evidence-based ADE monitoring systems.
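A short sketch of the over-sampling strategy described above, applying SMOTE inside cross-validation with a Naive Bayes classifier on synthetic, imbalanced data (the class ratio and oversampling level are illustrative, not the study's 800% setting):

```python
# SMOTE inside 10-fold cross-validation with Gaussian Naive Bayes; the data are
# a synthetic stand-in for the sparse ADE dataset (about 3% positive cases).
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.97, 0.03], random_state=0)

pipeline = Pipeline([
    ("smote", SMOTE(random_state=0)),  # oversample the minority class in each training fold
    ("nb", GaussianNB()),
])

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
recall = cross_val_score(pipeline, X, y, cv=cv, scoring="recall")
print("mean true positive rate (recall):", recall.mean().round(3))
```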
NASA Astrophysics Data System (ADS)
Ciabatta, Luca; Brocca, Luca; Ponziani, Francesco; Berni, Nicola; Stelluti, Marco; Moramarco, Tommaso
2014-05-01
The Umbria Region, located in Central Italy, is one of the most landslide-prone areas in Italy, affected almost every year by landslide events at different spatial scales. For early warning procedures aimed at assessing hydrogeological risk, rainfall thresholds are the main tool used by the Italian Civil Protection System. As shown in previous studies, soil moisture plays a key role in landslide triggering. By acting on pore water pressure, soil moisture influences the rainfall amount needed to trigger a landslide. In this work, an operational, physically based early warning system, named PRESSCA, that takes soil moisture into account in the definition of rainfall thresholds is presented. Specifically, soil moisture conditions are evaluated in PRESSCA by using a distributed soil water balance model, recently coupled with a near real-time satellite soil moisture product obtained from ASCAT (Advanced SCATterometer) and with in-situ monitoring data. The integration of three different sources of soil moisture information allows the most accurate possible estimate of soil moisture conditions. Then, both observed and forecast rainfall data are compared with the soil moisture-based thresholds in order to obtain risk indicators over a grid of ~5 km. These indicators are then used for daily hydrogeological risk evaluation and management by the regional Civil Protection service, through the sharing and delivery of near real-time landslide risk scenarios (also through an open source web platform: www.cfumbria.it). On 11-12 November 2013, the Umbria Region was hit by an exceptional rainfall event, with up to 430 mm in 72 hours, that resulted in significant economic damage but, fortunately, no casualties among the population. In this study, the performance of the PRESSCA system during this rainfall event is described, highlighting the model's capability to reproduce, two days in advance, landslide risk scenarios in good spatial and temporal agreement with the conditions that actually occurred. High-resolution risk scenarios (100 m x 100 m), obtained by coupling PRESSCA forecasts with susceptibility and vulnerability layers, are also produced. The results show a good relationship between the PRESSCA forecasts and the landslides reported to the Civil Protection Service during the rainfall event, confirming the robustness of the system. The accurate PRESSCA forecasts contributed to starting Civil Protection operations (alerting local authorities and the population) well in advance.
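A hedged sketch of the threshold comparison described above, in which the rainfall threshold is lowered as soil moisture increases and forecast rainfall is compared against it (the threshold form and all numbers are illustrative, not the PRESSCA parameterization):

```python
# Soil-moisture-dependent rainfall threshold: the triggering threshold is
# lowered as soil moisture rises, and forecast rainfall is compared against it
# to yield a per-cell risk indicator. All numbers are illustrative.
def rainfall_threshold_mm(soil_moisture_fraction,
                          dry_threshold_mm=80.0, wet_threshold_mm=30.0):
    """Linear interpolation between dry-soil and wet-soil rainfall thresholds."""
    return dry_threshold_mm - (dry_threshold_mm - wet_threshold_mm) * soil_moisture_fraction

def risk_indicator(forecast_rain_mm, soil_moisture_fraction):
    ratio = forecast_rain_mm / rainfall_threshold_mm(soil_moisture_fraction)
    if ratio < 0.8:
        return "ordinary"
    return "moderate" if ratio < 1.2 else "high"

print(risk_indicator(45.0, 0.3))  # drier soils: the same rainfall stays below threshold
print(risk_indicator(45.0, 0.9))  # wet soils: the same rainfall now exceeds threshold
```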
Mazzanti, Andrea; Maragna, Riccardo; Vacanti, Gaetano; Monteforte, Nicola; Bloise, Raffaella; Marino, Maira; Braghieri, Lorenzo; Gambelli, Patrick; Memmi, Mirella; Pagan, Eleonora; Morini, Massimo; Malovini, Alberto; Ortiz, Martin; Sacilotto, Luciana; Bellazzi, Riccardo; Monserrat, Lorenzo; Napolitano, Carlo; Bagnardi, Vincenzo; Priori, Silvia G
2018-04-17
Long QT syndrome (LQTS) is a common inheritable arrhythmogenic disorder, often secondary to mutations in the KCNQ1, KCNH2, and SCN5A genes. The disease is characterized by prolonged ventricular repolarization (QTc interval) that confers susceptibility to life-threatening arrhythmic events (LAEs). This study sought to create an evidence-based risk stratification scheme to personalize the quantification of arrhythmic risk in patients with LQTS. Data from 1,710 patients with LQTS followed up for a median of 7.1 years (interquartile range [IQR]: 2.7 to 13.4 years) were analyzed to estimate the 5-year risk of LAEs based on QTc duration and genotype and to assess the antiarrhythmic efficacy of beta-blockers. The relationship between QTc duration and risk of events was investigated by comparison of linear and cubic spline models, and the linear model provided the best fit. The 5-year risk of LAEs while patients were off therapy was then calculated in a multivariable Cox model with QTc and genotype considered as independent factors. The estimated risk of LAEs increased by 15% for every 10-ms increment of QTc duration for all genotypes. Intergenotype comparison showed that the risk for patients with LQT2 and LQT3 increased by 130% and 157%, respectively, at any QTc duration versus patients with LQT1. Analysis of response to beta-blockers showed that only nadolol reduced the arrhythmic risk significantly in all genotypes compared with no therapy (hazard ratio: 0.38; 95% confidence interval: 0.15 to 0.93; p = 0.03). The study provides an estimator of the risk of LAEs in LQTS that allows a granular estimate of 5-year arrhythmic risk and demonstrates the superiority of nadolol in reducing the risk of LAEs in LQTS. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
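A hedged sketch of how the multiplicative quantities reported above could be combined into an individual 5-year estimate under a proportional-hazards relation (the baseline survival and reference QTc are assumed placeholders, not published values):

```python
# Cox-type combination of the reported hazard ratios: 1.15 per 10 ms of QTc,
# 2.30 (LQT2) and 2.57 (LQT3) versus LQT1, and 0.38 for nadolol, applied to an
# assumed baseline 5-year event-free survival. Baseline survival and reference
# QTc are placeholders, not published quantities.
BASELINE_5Y_SURVIVAL = 0.97   # assumed S0 at 5 years for LQT1 at the reference QTc
REFERENCE_QTC_MS = 460        # assumed reference QTc

GENOTYPE_HR = {"LQT1": 1.0, "LQT2": 2.30, "LQT3": 2.57}
HR_PER_10MS = 1.15
NADOLOL_HR = 0.38

def five_year_risk(qtc_ms, genotype, on_nadolol=False):
    """Proportional-hazards style 5-year risk: 1 - S0 ** (combined hazard ratio)."""
    hr = GENOTYPE_HR[genotype] * HR_PER_10MS ** ((qtc_ms - REFERENCE_QTC_MS) / 10)
    if on_nadolol:
        hr *= NADOLOL_HR
    return 1 - BASELINE_5Y_SURVIVAL ** hr

print(f"LQT2, QTc 500 ms, off therapy: {five_year_risk(500, 'LQT2'):.1%}")
print(f"LQT2, QTc 500 ms, on nadolol:  {five_year_risk(500, 'LQT2', True):.1%}")
```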
Mothers' preferences regarding new combination vaccines for their children in Japan, 2014.
Shono, Aiko; Kondo, Masahide
2017-04-03
A number of new vaccines to prevent childhood diseases have been introduced globally over the last few decades. Only four combination vaccines are currently available in Japan (DTaP/sIPV, DTaP, DT, and MR), leading to complex infant vaccine scheduling. This study aims to investigate mothers' preferences with respect to combination vaccines for their children, should new combination vaccines become available that have not yet been launched in Japan or that will be developed in the future. We conducted a web-based, cross-sectional survey of 1,243 mothers who had at least one child between 2 months and 3 y of age. Mothers were recruited from an online survey panel of registered users. Their preferences were elicited using discrete choice experiments, and main effects and interactions were analyzed with a mixed logit model. Mothers showed a preference for vaccines that prevented multiple diseases, required fewer injections per doctor visit, had a lower price, and carried a lower risk of adverse events. Respondents valued a reduced risk of adverse events the most among all attributes in this study. The estimated monetary value of the willingness to pay for a 1% reduction in the risk of adverse events was ¥92,557 ($907). Therefore, if new combination vaccines are introduced, the risk of adverse events after vaccination is an especially important factor for mothers. While the safety of the vaccines themselves is required, health professionals should also inform mothers about the benefits and risks of vaccination to allay mothers' concerns about vaccine safety.
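The willingness-to-pay figure above is conventionally obtained in discrete choice analysis as minus the ratio of an attribute's utility coefficient to the price coefficient; a small sketch with illustrative coefficients chosen only to land near the reported value:

```python
# Willingness to pay (WTP) from discrete choice coefficients: WTP for a change
# in an attribute is -(beta_attribute * delta) / beta_price. The coefficients
# below are hypothetical, chosen only to land near the reported value.
beta_adverse_event_risk = -0.0926   # utility per +1 percentage point of risk (hypothetical)
beta_price = -1.0e-6                # utility per +1 yen (hypothetical)

delta_risk = -1.0                   # a one-percentage-point reduction in risk
wtp_yen = -(beta_adverse_event_risk * delta_risk) / beta_price
print(f"WTP for a 1% risk reduction: about {wtp_yen:,.0f} yen")
```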
NASA Astrophysics Data System (ADS)
Pearson, Callum; Reaney, Sim; Bracken, Louise; Butler, Lucy
2015-04-01
Flood risk is a growing problem throughout the United Kingdom, and a significant proportion of the population is at risk of flooding across the country. Across England and Wales over 5 million people are believed to be at risk from fluvial, pluvial or coastal flooding (DEFRA, 2013). Increasingly, communities that have not dealt with flooding before are experiencing significant flood events. The communities of Stockdalewath and Highbridge in the Roe catchment, a tributary of the River Eden in Cumbria, UK, are an excellent example. The River Roe has a normal flow of less than 5 m3 s-1 97 percent of the time; however, there have been two flash floods, of 98.8 m3 s-1 in January 2005 and 86.9 m3 s-1 in May 2013. These two flash flood events resulted in the inundation of numerous properties within the catchment, with the 2013 event prompting the creation of the Roe Catchment Community Water Management Group, which aims to deliver a sustainable approach to managing the flood risk. Because of its dispersed rural population, the community fails the cost-benefit analysis for a centrally funded flood risk mitigation scheme. Therefore the at-risk community within the Roe catchment has to look for cost-effective, sustainable techniques and interventions to reduce the potential negative impacts of future events; this has resulted in a focus on natural flood risk management. This research investigates the potential to reduce flood risk through natural catchment-based land management techniques and interventions within the Roe catchment, providing a scientific basis from which further action can be taken. These interventions include changes to land management and land use, such as soil aeration and targeted afforestation, the creation of runoff attenuation features and the construction of in-channel features, such as debris dams. Natural flood management (NFM) has been shown to be effective at reducing flood risk in smaller catchments, and the potential to transfer these benefits to the Roe catchment (~69 km2) has been assessed. Furthermore, these flood mitigation features have the potential to deliver wider environmental improvements throughout the catchment, and hence the potential for multiple benefits, such as diffuse pollution reduction and habitat creation, is considered. The research explores the impact of NFM techniques, flood storage areas or afforestation for example, with a view to enhancing local-scale habitats. The research combines innovative catchment modelling techniques, both risk-based approaches (SCIMAP Flood) and spatially distributed hydrological simulation modelling (CRUM3), with in-field monitoring and observation of flow pathways and tributary response to rainfall using time-lapse cameras. Additional work with the local community and stakeholders will identify the range and location of potential catchment-based land management techniques and interventions to be assessed; natural flood management implementation requires the participation and cooperation of landowners and the local community to be successful (Howgate and Kenyon, 2009).
A Hydrological Modeling Framework for Flood Risk Assessment for Japan
NASA Astrophysics Data System (ADS)
Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.
2016-12-01
Flooding has been the most frequent natural disaster that claims lives and imposes significant economic losses on human societies worldwide. Japan, with annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a Temperature-Index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational time and convergence of the parameters, a set of a priori parameters is obtained based on relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model that use high-resolution Digital Terrain Models to estimate time-related parameters of the model, such as the time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate a priori estimates of the maximum soil moisture capacity, an important parameter of the PDM model. Once the model is calibrated, its performance is examined for Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated for the extreme precipitation event in 2012 that affected Kyushu. In both cases, quantitative measures show that simulated streamflow is in good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for the whole of Japan, which provides a basis for comprehensive flood risk assessment and loss estimation for the flood insurance industry.
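A minimal sketch of the temperature-index (degree-day) snowmelt component mentioned above (the degree-day factor and threshold temperature are placeholders, not calibrated parameters of this model):

```python
# Degree-day snowmelt: daily melt is a degree-day factor times the temperature
# excess above a threshold, limited by the available snow water equivalent.
# The factor and threshold are placeholder values.
def degree_day_melt(temperature_c, swe_mm, ddf_mm_per_c_day=3.0, t_threshold_c=0.0):
    """Daily melt (mm) limited by the snow water equivalent on the ground."""
    potential_melt = max(temperature_c - t_threshold_c, 0.0) * ddf_mm_per_c_day
    return min(potential_melt, swe_mm)

swe = 120.0  # mm of snow water equivalent at the start of the melt season
for day_temp in [-3.0, 1.5, 4.0, 6.5, 2.0]:
    melt = degree_day_melt(day_temp, swe)
    swe -= melt
    print(f"T={day_temp:+.1f} C  melt={melt:.1f} mm  SWE left={swe:.1f} mm")
```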
Whitfield, Malcolm D; Gillett, Michael; Holmes, Michael; Ogden, Elaine
2006-12-01
The brief for this study was to produce a practical, evidence-based, financial planning tool, which could be used to present an economic argument for funding a public health-based prevention programme for coronary heart disease (CHD) related illness on the same basis as treatment interventions. To explore the possibility of using multivariate risk prediction equations, derived from the Framingham and other studies, to estimate how many people in a population are likely to be admitted to hospital in the next 5-10 years with cardiovascular disease (CVD) related events such as heart attacks, strokes, heart failure and kidney disease. To estimate the potential financial impact of reductions in hospital admissions, on an 'invest to save' basis, if primary care trusts (PCTs) were to invest in public health based interventions to reduce cardiovascular risk at a population level. The populations of five UK PCTs were entered into a spreadsheet-based decision tree model, in terms of age and sex (this equated to around 620,000 adults). An estimation was made to determine how many people, in each age group, were likely to be diabetic. Population risk factors such as smoking rates, mean body mass index (BMI), mean total cholesterol and mean systolic blood pressure were entered by age group. The spreadsheet then used a variant of the Framingham equation to calculate how many non-diabetic people in each age group were likely to have a heart attack or stroke in the next 5 years. In addition, heart failure and dialysis admission rates were estimated based upon risk factors for incidence. The United Kingdom Prospective Diabetes Study (UKPDS) risk engines 56 and 60 were used to calculate the risk of CHD and stroke, respectively, in people with type 2 diabetes. The spreadsheet deducted the number of people likely to die before reaching hospital and produced a predicted number of hospital admissions for each category over a 5-year period. The final part of the calculation attached a cost to the hospital activity using the UK Health Resource Grouping (HRG) tariffs. The predicted number of events in each of the primary care trusts was then compared with the actual number of events the previous year (2004/2005). The study used a decision tree type model, which was populated with data from the research literature. The model applied the risk equations to population data from five primary care trusts to estimate how many people would suffer from an acute CVD related event over the next 5 years. The predicted number of events was then compared with the actual number of acute admissions for heart attacks, strokes, heart failure, acute hypoglycaemic attacks, renal failure and coronary bypass surgery the previous year. The first outcome of the model was to compare the estimated number of people in each PCT likely to suffer from a heart attack, a stroke, heart failure or chronic kidney failure with the actual number the previous year (2004/2005). The predicted number was remarkably accurate in the case of heart attack and stroke. There was some over-prediction of chronic kidney disease (CKD), which could be accounted for by known under-diagnosis in this illness group and the inability of the model to pick up, at this stage, the fact that many CKD patients die of a CHD-related event before they reach the stage of requiring renal replacement. The second outcome of the model was to estimate the financial consequence of risk reduction. Moderate reductions in risk, in the order of around 2-4%, were estimated to lead to savings in acute admission costs of around £5.4 million over 5 years. More ambitious targets of risk reduction, in the order of 5-6%, led to estimated savings of around £8.7 million. This study is not presented as the definitive approach to predicting the economic consequences of investment in public health on the cost of secondary care. It is simply a logical, systematic approach to quantifying these issues in order to present a business case for such investment. The research team does not know whether the predicted savings would accrue from such investments; this remains theoretical at this stage. The point is, however, that if the predictions are correct then the savings would accrue from over 4,000 people, from an adult population of around 185,000, not having a heart attack or a stroke or an acute exacerbation of heart failure.
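A simplified sketch of the spreadsheet logic described above: an assumed 5-year event risk is applied to each age-sex band, pre-hospital deaths are deducted, a tariff cost is attached, and the effect of a population-level risk reduction is computed (all risks, fractions, and tariffs are placeholders, not Framingham/UKPDS outputs or HRG prices):

```python
# Apply an assumed 5-year event risk to each age-sex band, deduct pre-hospital
# deaths, attach a tariff cost, and compare against a scenario with a
# population-level relative risk reduction. All numbers are placeholders.
population_bands = [
    # (label, people, assumed 5-year MI risk, assumed 5-year stroke risk)
    ("men 45-54",   25_000, 0.030, 0.010),
    ("men 55-64",   20_000, 0.060, 0.020),
    ("women 45-54", 26_000, 0.015, 0.008),
    ("women 55-64", 21_000, 0.030, 0.015),
]
PREHOSPITAL_DEATH_FRACTION = 0.25                # assumed share never reaching hospital
TARIFF_GBP = {"mi": 2_500.0, "stroke": 4_000.0}  # placeholder admission costs

def admissions_and_cost(bands, relative_risk_reduction=0.0):
    admissions = cost = 0.0
    for _, n_people, mi_risk, stroke_risk in bands:
        for event, risk in (("mi", mi_risk), ("stroke", stroke_risk)):
            events = n_people * risk * (1.0 - relative_risk_reduction)
            admitted = events * (1.0 - PREHOSPITAL_DEATH_FRACTION)
            admissions += admitted
            cost += admitted * TARIFF_GBP[event]
    return admissions, cost

baseline_admissions, baseline_cost = admissions_and_cost(population_bands)
_, reduced_cost = admissions_and_cost(population_bands, relative_risk_reduction=0.04)
print(f"baseline 5-year admissions: {baseline_admissions:,.0f}, cost: £{baseline_cost:,.0f}")
print(f"saving from a 4% risk reduction: £{baseline_cost - reduced_cost:,.0f}")
```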
Kaasenbrood, Lotte; Poulter, Neil R; Sever, Peter S; Colhoun, Helen M; Livingstone, Shona J; Boekholdt, S Matthijs; Pressel, Sara L; Davis, Barry R; van der Graaf, Yolanda; Visseren, Frank L J
2016-05-01
In this study, we aimed to translate the average relative effect of statin therapy from trial data to the individual patient with type 2 diabetes mellitus by developing and validating a model to predict individualized absolute risk reductions (ARR) of cardiovascular events. Data of 2725 patients with type 2 diabetes mellitus from the Lipid Lowering Arm of the Anglo Scandinavian Cardiac Outcomes Trial (ASCOT-LLA) study (atorvastatin 10 mg versus placebo) were used for model derivation. The model was based on 8 clinical predictors including treatment allocation (statin/placebo). Ten-year individualized ARRs for major cardiovascular events with statin therapy were calculated for each patient by subtracting the estimated on-treatment risk from the estimated off-treatment risk. The predicted 10-year ARR with statin therapy was <2% for 13% of the patients. About 30% had an ARR of >4% (median ARR, 3.2%; interquartile range, 2.5%-4.3%; 95% confidence interval for 3.2% ARR, -1.4% to 6.8%). Addition of treatment interactions did not improve model performance. Therefore, the wide distribution in ARR was a consequence of the underlying distribution of cardiovascular risk among the patients enrolled in these trials. External validation of the model was performed in data from the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT-LLT; pravastatin 40 mg versus usual care) and the Collaborative Atorvastatin Diabetes Study (CARDS; atorvastatin 10 mg versus placebo) of 3878 and 2838 patients with type 2 diabetes mellitus, respectively. Model calibration was adequate in both external data sets; discrimination was moderate (ALLHAT-LLT: c-statistic, 0.64 [95% confidence interval, 0.61-0.67]; CARDS: 0.68 [95% confidence interval, 0.64-0.72]). ARRs of major cardiovascular events with statin therapy can be accurately estimated for individual patients with type 2 diabetes mellitus using a model based on routinely available patient characteristics. The wide distribution in ARR may complement informed decision making. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00327418 (CARDS) and NCT00000542 (ALLHAT). © 2016 American Heart Association, Inc.
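A sketch of the ARR calculation described above: a single prediction model containing the treatment indicator is evaluated twice per patient, once with the indicator set to treated and once to untreated, and the difference in predicted risk is the individualized ARR (the logistic model and data below are synthetic stand-ins, not the ASCOT-LLA model):

```python
# Individualized ARR: one model containing the treatment indicator is evaluated
# twice per patient (treated vs untreated) and the predicted risks subtracted.
# The logistic model and data are synthetic stand-ins for the trial model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 3000
age = rng.uniform(40, 75, n)
sbp = rng.normal(140, 15, n)
statin = rng.integers(0, 2, n)
linpred = -9.0 + 0.07 * age + 0.02 * sbp - 0.35 * statin
y = rng.binomial(1, 1 / (1 + np.exp(-linpred)))   # synthetic 10-year events

X = np.column_stack([age, sbp, statin])
model = LogisticRegression(max_iter=1000).fit(X, y)

X_treated = X.copy();   X_treated[:, 2] = 1
X_untreated = X.copy(); X_untreated[:, 2] = 0
arr = model.predict_proba(X_untreated)[:, 1] - model.predict_proba(X_treated)[:, 1]
print("median individualized ARR:", round(float(np.median(arr)), 4))
```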
Galeazzi, Roberta; Olivieri, Fabiola; Spazzafumo, Liana; Rose, Giuseppina; Montesanto, Alberto; Giovagnetti, Simona; Cecchini, Sara; Malatesta, Gelsomina; Di Pillo, Raffaele; Antonicelli, Roberto
2018-06-23
The clinical efficacy of clopidogrel in secondary prevention of vascular events is hampered by marked inter-patient variability in drug response, which partially depends on genetic make-up. The aim of this pilot prospective study was to evaluate 12-month cardiovascular outcomes in elderly patients with acute coronary syndrome (ACS) receiving dual antiplatelet therapy (aspirin and clopidogrel) according to the clustering of CYP2C19 and ABCB1 genetic variants. Participants were 100 consecutive ACS patients who were genotyped for CYP2C19 (G681A and C-806T) and ABCB1 (C3435T) polymorphisms, which affect clopidogrel metabolism and bioavailability, using PCR-restriction fragment length polymorphism. They were then grouped as poor, extensive and ultra-rapid metabolisers based on the combination of CYP2C19 loss-of-function (CYP2C19*2) and gain-of-function (CYP2C19*17) alleles and ABCB1 alleles. The predictive value of each phenotype for acute vascular events was estimated based on 12-month cardiovascular outcomes. The poor metabolisers were at an increased risk of thrombotic events (OR 1.26; 95% CI 1.099-1.45; χ2 = 5.676; p = 0.027), whereas the ultra-rapid metabolisers had a 1.31-fold increased risk of bleeding events compared with the poor and extensive metabolisers (OR 1.31; 95% CI 1.033-1.67; χ2 = 5.676; p = 0.048). A logistic regression model including age, sex, BMI and smoking habit confirmed the differential risk of major events in poor and ultra-rapid metabolisers. Our findings suggest that ACS patients classified as 'poor or ultra-rapid' metabolisers based on CYP2C19 and ABCB1 genotypes should receive alternative antiplatelet therapies to clopidogrel.
Moghavem, Nuriel; McDonald, Kathryn; Ratliff, John K; Hernandez-Boussard, Tina
2016-04-01
Patient Safety Indicators (PSIs) are administratively coded identifiers of potentially preventable adverse events. These indicators are used for multiple purposes, including benchmarking and quality improvement efforts. Baseline PSI evaluation in high-risk surgeries is fundamental to both purposes. Determine PSI rates and their impact on other outcomes in patients undergoing cranial neurosurgery compared with other surgeries. The Agency for Healthcare Research and Quality (AHRQ) PSI software was used to flag adverse events and determine risk-adjusted rates (RAR). Regression models were built to assess the association between PSIs and important patient outcomes. We identified cranial neurosurgeries based on International Classification of Diseases, Ninth Revision, Clinical Modification codes in California, Florida, New York, Arkansas, and Mississippi State Inpatient Databases, AHRQ, 2010-2011. PSI development, 30-day all-cause readmission, length of stay, hospital costs, and inpatient mortality. A total of 48,424 neurosurgical patients were identified. Procedure indication was strongly associated with PSI development. The neurosurgical population had significantly higher RAR of most PSIs evaluated compared with other surgical patients. Development of a PSI was strongly associated with increased length of stay and hospital cost and, in certain PSIs, increased inpatient mortality and 30-day readmission. In this population-based study, certain accountability measures proposed for use as value-based payment modifiers show higher RAR in neurosurgery patients compared with other surgical patients and were subsequently associated with poor outcomes. Our results indicate that for quality improvement efforts, the current AHRQ risk-adjustment models should be viewed in clinically meaningful stratified subgroups: for profiling and pay-for-performance applications, additional factors should be included in the risk-adjustment models. Further evaluation of PSIs in additional high-risk surgeries is needed to better inform the use of these metrics.
Predicting fire effects on water quality: a perspective and future needs
NASA Astrophysics Data System (ADS)
Smith, Hugh; Sheridan, Gary; Nyman, Petter; Langhans, Christoph; Noske, Philip; Lane, Patrick
2017-04-01
Forest environments are a globally significant source of drinking water. Fire presents a credible threat to the supply of high quality water in many forested regions. The post-fire risk to water supplies depends on storm event characteristics, vegetation cover and fire-related changes in soil infiltration and erodibility modulated by landscape position. The resulting magnitude of runoff generation, erosion and constituent flux to streams and reservoirs determines the severity of water quality impacts in combination with the physical and chemical composition of the entrained material. Research to date suggests that most post-fire water quality impacts are due to large increases in the supply of particulates (fine-grained sediment and ash) and particle-associated chemical constituents. The largest water quality impacts result from high magnitude erosion events, including debris flow processes, which typically occur in response to short duration, high intensity storm events during the recovery period. Most research to date focuses on impacts on water quality after fire. However, information on potential water quality impacts is required prior to fire events for risk planning. Moreover, changes in climate and forest management (e.g. prescribed burning) that affect fire regimes may alter water quality risks. Therefore, prediction requires spatial-temporal representation of fire and rainfall regimes coupled with information on fire-related changes to soil hydrologic parameters. Recent work has applied such an approach by combining a fire spread model with historic fire weather data in a Monte Carlo simulation to quantify probabilities associated with fire and storm events generating debris flows and fine sediment influx to a reservoir located in Victoria, Australia. Prediction of fire effects on water quality would benefit from further research in several areas. First, more work on regional-scale stochastic modelling of intersecting fire and storm events with landscape zones of erosion vulnerability is required to support quantitative evaluation of water quality risk and the effect of future changes in climate and land management. Second, we underscore previous calls for characterisation of landscape-scale domains to support regionalisation of parameter sets derived from empirical studies. Recent examples include work identifying aridity as a control of hydro-geomorphic response to fire and the use of spectral-based indices to predict spatial heterogeneity in ash loadings. Third, information on post-fire erosion from colluvial or alluvial stores is needed to determine their significance as both sediment-contaminant sinks and sources. Such sediment stores may require explicit spatial representation in risk models for some environments and sediment tracing can be used to determine their relative importance as secondary sources. Fourth, increased dating of sediment archives could provide regional datasets of fire-related erosion event frequency. Presently, the lack of such data hinders evaluation of risk models linking fire and storm events to erosion and water quality impacts.
A methodology for modeling regional terrorism risk.
Chatterjee, Samrat; Abkowitz, Mark D
2011-07-01
Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
Valence-Dependent Belief Updating: Computational Validation
Kuzmanovic, Bojana; Rigoux, Lionel
2017-01-01
People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments. PMID:28706499
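A minimal sketch of the asymmetric reinforcement-learning account described above, in which the self-risk estimate moves toward the presented base rate with a larger learning rate after good news than after bad news (the learning-rate values are illustrative, not fitted parameters):

```python
# Asymmetric belief updating: the self-risk estimate moves toward the presented
# base rate with a larger learning rate for good news (base rate lower than
# expected) than for bad news. Learning rates are illustrative.
def update_self_risk(prior_estimate, presented_base_rate, lr_good=0.6, lr_bad=0.3):
    """Return the updated self-risk estimate (in %) after one trial."""
    estimation_error = presented_base_rate - prior_estimate
    learning_rate = lr_good if estimation_error < 0 else lr_bad
    return prior_estimate + learning_rate * estimation_error

print(update_self_risk(40.0, 25.0))  # good news: large shift toward 25%
print(update_self_risk(40.0, 55.0))  # bad news: smaller shift toward 55%
```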
Use of documentary sources on past flood events for flood risk management and land planning
NASA Astrophysics Data System (ADS)
Cœur, Denis; Lang, Michel
2008-09-01
The knowledge of past catastrophic events can improve flood risk mitigation policy by raising awareness of risk. As such historical information is usually available in Europe for the past five centuries, historians are able to understand how past societies dealt with flood risk, and hydrologists can incorporate information on past floods into an adapted probabilistic framework. In France, Flood Risk Mitigation Maps are based either on the largest known historical flood event or on the 100-year flood event if it is greater. Two actions can be suggested to promote the use of historical information for flood risk management: (1) the development of a regional flood database, with both historical and current data, in order to obtain good feedback on recent events and to improve flood risk education and awareness; (2) the commitment to maintain persistent, long-term management of a reference network of hydrometeorological observations for climate change studies.
NASA Astrophysics Data System (ADS)
Mei, Chao; Liu, Jiahong; Wang, Hao; Shao, Weiwei; Xia, Lin; Xiang, Chenyao; Zhou, Jinjun
2018-06-01
Urban inundation is a serious challenge that increasingly confronts the residents of many cities, as well as policymakers, in the context of rapid urbanization and climate change worldwide. In recent years, source control measures (SCMs) such as green roofs, permeable pavements, rain gardens, and vegetative swales have been implemented to address flood inundation in urban settings, and have proven to be cost-effective and sustainable. To investigate the ability of SCMs to reduce inundation in a community-scale urban drainage system, a dynamic rainfall-runoff model of the system was developed based on SWMM. SCM implementation scenarios were modelled under six design rainstorm events with return periods ranging from 2 to 100 years, and the inundation risks of the drainage system were evaluated before and after the proposed implementation of SCMs with a risk-evaluation method based on SWMM and the analytic hierarchy process (AHP). The results show that SCM implementation significantly reduced the hydrological indexes related to inundation risk: the reduction rates of average flow, peak flow, and total flooded volume of the drainage system ranged over 28.1-72.1%, 19.0-69.2%, and 33.9-56.0%, respectively, under the six rainfall events with return periods from 2 to 100 years. Correspondingly, the inundation risks of the drainage system were significantly reduced after SCM implementation, with risk values falling below 0.2 when the rainfall return period was less than 10 years. The simulation results confirm the effectiveness of SCMs in mitigating inundation and quantify their potential to reduce inundation risks in the urban drainage system, providing a scientific reference for implementing SCMs for inundation control in the study area.
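A short sketch of the AHP step mentioned above: criterion weights are taken from the principal eigenvector of a pairwise-comparison matrix, with a consistency check (the matrix and criteria are illustrative, not those used in the study):

```python
# AHP criterion weights from the principal eigenvector of a pairwise-comparison
# matrix, with Saaty's consistency ratio (random index 0.58 for a 3x3 matrix).
# The matrix and criteria are illustrative.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # e.g. peak flow vs flooded volume vs flood depth
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
k = int(np.argmax(eigenvalues.real))
weights = np.abs(eigenvectors[:, k].real)
weights /= weights.sum()

n = A.shape[0]
consistency_index = (eigenvalues.real[k] - n) / (n - 1)
consistency_ratio = consistency_index / 0.58
print("weights:", np.round(weights, 3), " CR:", round(consistency_ratio, 3))
```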
Budoff, Matthew J; Nasir, Khurram; McClelland, Robyn L; Detrano, Robert; Wong, Nathan; Blumenthal, Roger S; Kondos, George; Kronmal, Richard A
2009-01-27
In this study, we aimed to establish whether age-sex-specific percentiles of coronary artery calcium (CAC) predict cardiovascular outcomes better than the actual (absolute) CAC score. The presence and extent of CAC correlates with the overall magnitude of coronary atherosclerotic plaque burden and with the development of subsequent coronary events. MESA (Multi-Ethnic Study of Atherosclerosis) is a prospective cohort study of 6,814 asymptomatic participants followed for coronary heart disease (CHD) events including myocardial infarction, angina, resuscitated cardiac arrest, or CHD death. Time to incident CHD was modeled with Cox regression, and we compared models with percentiles based on age, sex, and/or race/ethnicity to categories commonly used (0, 1 to 100, 101 to 400, 400+ Agatston units). There were 163 (2.4%) incident CHD events (median follow-up 3.75 years). Expressing CAC in terms of age- and sex-specific percentiles had significantly lower area under the receiver-operating characteristic curve (AUC) than when using absolute scores (women: AUC 0.73 versus 0.76, p = 0.044; men: AUC 0.73 versus 0.77, p < 0.001). Akaike's information criterion indicated better model fit with the overall score. Both methods robustly predicted events (>90th percentile associated with a hazard ratio [HR] of 16.4, 95% confidence interval [CI]: 9.30 to 28.9, and score >400 associated with HR of 20.6, 95% CI: 11.8 to 36.0). Within groups based on age-, sex-, and race/ethnicity-specific percentiles there remains a clear trend of increasing risk across levels of the absolute CAC groups. In contrast, once absolute CAC category is fixed, there is no increasing trend across levels of age-, sex-, and race/ethnicity-specific categories. Patients with low absolute scores are low-risk, regardless of age-, sex-, and race/ethnicity-specific percentile rank. Persons with an absolute CAC score of >400 are high risk, regardless of percentile rank. Using absolute CAC in standard groups performed better than age-, sex-, and race/ethnicity-specific percentiles in terms of model fit and discrimination. We recommend using cut points based on the absolute CAC amount, and the common CAC cut points of 100 and 400 seem to perform well.
Urbain, Jay
2015-12-01
We present the design, and analyze the performance of a multi-stage natural language processing system employing named entity recognition, Bayesian statistics, and rule logic to identify and characterize heart disease risk factor events in diabetic patients over time. The system was originally developed for the 2014 i2b2 Challenges in Natural Language in Clinical Data. The system's strengths included a high level of accuracy for identifying named entities associated with heart disease risk factor events. The system's primary weakness was due to inaccuracies when characterizing the attributes of some events. For example, determining the relative time of an event with respect to the record date, whether an event is attributable to the patient's history or the patient's family history, and differentiating between current and prior smoking status. We believe these inaccuracies were due in large part to the lack of an effective approach for integrating context into our event detection model. To address these inaccuracies, we explore the addition of a distributional semantic model for characterizing contextual evidence of heart disease risk factor events. Using this semantic model, we raise our initial 2014 i2b2 Challenges in Natural Language of Clinical data F1 score of 0.838 to 0.890 and increased precision by 10.3% without use of any lexicons that might bias our results. Copyright © 2015 Elsevier Inc. All rights reserved.
Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil.
Lowe, Rachel; Coelho, Caio As; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier
2016-02-24
Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics.
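A minimal sketch of the categorical verification used above, computing the hit rate as hits divided by hits plus misses for both the forecast and a null model (the yes/no flags are synthetic):

```python
# Hit rate for binary (high-risk yes/no) forecasts: hits / (hits + misses),
# computed for the forecast model and a null model. Flags are synthetic.
def hit_rate(forecast_high, observed_high):
    hits = sum(1 for f, o in zip(forecast_high, observed_high) if f and o)
    misses = sum(1 for f, o in zip(forecast_high, observed_high) if o and not f)
    return hits / (hits + misses) if hits + misses else float("nan")

observed  = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
forecast  = [1, 0, 0, 1, 0, 1, 0, 0, 1, 1]
null_fcst = [1, 0, 0, 0, 0, 1, 0, 0, 1, 0]
print("forecast model hit rate:", round(hit_rate(forecast, observed), 2))
print("null model hit rate:    ", round(hit_rate(null_fcst, observed), 2))
```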
Aiassa, E; Higgins, J P T; Frampton, G K; Greiner, M; Afonso, A; Amzal, B; Deeks, J; Dorne, J-L; Glanville, J; Lövei, G L; Nienstedt, K; O'connor, A M; Pullin, A S; Rajić, A; Verloo, D
2015-01-01
Food and feed safety risk assessment uses multi-parameter models to evaluate the likelihood of adverse events associated with exposure to hazards in human health, plant health, animal health, animal welfare, and the environment. Systematic review and meta-analysis are established methods for answering questions in health care, and can be implemented to minimize biases in food and feed safety risk assessment. However, no methodological frameworks exist for refining risk assessment multi-parameter models into questions suitable for systematic review, and the use of meta-analysis to estimate all parameters required by a risk model may not always be feasible. This paper describes novel approaches for determining question suitability and for prioritizing questions for systematic review in this area. Risk assessment questions that aim to estimate a parameter are likely to be suitable for systematic review. Such questions can be structured by their "key elements" [e.g., for intervention questions, the population(s), intervention(s), comparator(s), and outcome(s)]. Prioritization of questions to be addressed by systematic review relies on the likely impact and related uncertainty of individual parameters in the risk model. This approach to planning and prioritizing systematic review seems to have useful implications for producing evidence-based food and feed safety risk assessment.
Macera, Márcia A C; Louzada, Francisco; Cancho, Vicente G; Fontes, Cor J F
2015-03-01
In this paper, we introduce a new model for recurrent event data characterized by a fully parametric baseline rate function based on the exponential-Poisson distribution. The model arises from a latent competing risk scenario, in the sense that there is no information about which cause was responsible for the event occurrence. The time of each recurrence is then given by the minimum lifetime value among all latent causes. The new model has a particular case, which is the classical homogeneous Poisson process. The properties of the proposed model are discussed, including its hazard rate function, survival function, and ordinary moments. The inferential procedure is based on the maximum likelihood approach. We consider the important issue of model selection between the proposed model and its particular case via the likelihood ratio test and the score test. Goodness of fit of the recurrent event models is assessed using Cox-Snell residuals. A simulation study evaluates the performance of the estimation procedure for small and moderate sample sizes. Applications to two real data sets are provided to illustrate the proposed methodology. One of them, first analyzed by our team of researchers, concerns the recurrence of malaria, an infectious disease caused by a protozoan parasite that infects red blood cells. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
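A small sketch of the nested-model comparison mentioned above, a likelihood ratio test between the general model and its homogeneous-Poisson special case (the two maximized log-likelihoods and degrees of freedom are hypothetical placeholders):

```python
# Likelihood ratio test between the general model and its nested special case
# (the homogeneous Poisson process). The two maximized log-likelihoods and the
# degrees of freedom are hypothetical placeholders.
from scipy.stats import chi2

loglik_general = -512.4   # hypothetical maximized log-likelihood, full model
loglik_special = -516.9   # hypothetical, homogeneous-Poisson special case
df = 1                    # number of parameters fixed under the special case

lr_statistic = 2 * (loglik_general - loglik_special)
p_value = chi2.sf(lr_statistic, df)
print(f"LR statistic = {lr_statistic:.2f}, p = {p_value:.4f}")
```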
Disaster risk from a macroeconomic perspective: a metric for fiscal vulnerability evaluation.
Cardona, Omar D; Ordaz, Mario G; Marulanda, Mabel C; Carreño, Martha L; Barbat, Alex H
2010-10-01
The Disaster Deficit Index (DDI) measures macroeconomic and financial risk in a country according to possible catastrophic scenario events. Extreme disasters can generate financial deficit due to sudden and elevated need of resources to restore affected inventories. The DDI captures the relationship between the economic loss that a country could experience when a catastrophic event occurs and the availability of funds to address the situation. The proposed model utilises the procedures of the insurance industry in establishing probable losses, based on critical impacts during a given period of exposure; for economic resilience, the model allows one to calculate the country's financial ability to cope with a critical impact. There are limitations and costs associated with access to resources that one must consider as feasible values according to the country's macroeconomic and financial conditions. This paper presents the DDI model and the results of its application to 19 countries of the Americas and aims to guide governmental decision-making in disaster risk reduction. © 2010 The Author(s). Journal compilation © Overseas Development Institute, 2010.
NASA Astrophysics Data System (ADS)
Smith, Leonard A.
2010-05-01
This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given that we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run, in other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds on an event and the odds against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events. We ask whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggests this is not the case. Even with very good models (good in a root-mean-square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real-world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the Probability of a "Big Surprise").
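The probability-to-odds conversion used in this abstract ("odds-for" equal to one divided by the probability) and the resulting possibility of implied probabilities summing to more than one can be illustrated directly; the 5% margin applied below is an arbitrary assumption for illustration, not a value from the contribution.

```python
def odds_for(p):
    """'Odds-for' as defined in the abstract: the stake is returned, so odds = 1 / p.
    (Probability 0.5 gives odds of two for one, i.e. one to one.)"""
    return 1.0 / p

def implied_probabilities(offered_odds):
    """Invert a set of offered odds-for back to implied probabilities.
    If the odds-setter keeps a margin, these can sum to more than one."""
    return [1.0 / o for o in offered_odds]

probs = {"flood": 0.10, "no flood": 0.90}
offered = {k: odds_for(p) * 0.95 for k, p in probs.items()}   # illustrative 5% margin
implied = implied_probabilities(offered.values())
print(sum(implied))   # > 1: the implied probabilities exceed one, as discussed above
```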
Validation of a modified Medical Resource Model for mass gatherings.
Smith, Wayne P; Tuffin, Heather; Stratton, Samuel J; Wallis, Lee A
2013-02-01
A modified Medical Resource Model to predict the medical resources required at mass gatherings based on the risk profile of events has been developed. This study was undertaken to validate this tool using data from events held in both a developed and a developing country. A retrospective study was conducted utilizing prospectively gathered data from individual events at Old Trafford Stadium in Manchester, United Kingdom, and Ellis Park Stadium, Johannesburg, South Africa. Both stadia are similar in design and spectator capacity. Data for Professional Football as well as Rugby League and Rugby Union (respectively) matches were used for the study. The medical resources predicted for the events were determined by entering the risk profile of each of the events into the Medical Resource Model. A recently developed South African tool was used to predetermine medical staffing for mass gatherings. For the study, the medical resources actually required to deal with the patient load for events within the control sample from the two stadia were compared with the resources predicted by the Medical Resource Model when that tool was applied retrospectively to the study events. The comparison was used to determine if the newly developed tool was either over- or under-predicting the resource requirements. In the case of Ellis Park, the model under-predicted the basic life support (BLS) requirement for 1.5% of the events in the data set. Mean over-prediction was 209.1 minutes for BLS availability. Old Trafford had no events for which the Medical Resource Model under-predicted. The mean over-prediction of BLS availability for Old Trafford was 671.6 minutes. The intermediate life support (ILS) requirement for Ellis Park was under-predicted for seven of the total 66 events (10.6% of the events), all of which had one factor in common: relatively low spectator attendance. Modelling for ILS at Old Trafford did not under-predict for any events. The ILS requirements showed a mean over-prediction of 161.4 minutes ILS availability for Ellis Park compared with 425.2 minutes for Old Trafford. Of the events held at Ellis Park, the Medical Resource Model under-predicted the ambulance requirement in 4.5% of the events. For Old Trafford events, the under-prediction was higher: 7.5% of cases. The medical resources that are deployed at a mass gathering should best match the requirement for patient care at a particular event. An important consideration for any model is that it does not continually under-predict the resources required in relation to the actual requirement. With the exception of a specific subset of events at Ellis Park, the rate of under-prediction for this model was acceptable.
NASA Astrophysics Data System (ADS)
Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.
2011-08-01
Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecast to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine thresholds of non-exceedance during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that the sequence is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will last 31+29/-14 years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of the ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find the ETAS model accounting for the flow rate to perform best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
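The post-shut-in decay described above follows the Omori-Utsu law, and the time for the modelled rate to fall back to a background level follows from inverting the rate expression. The sketch below uses illustrative parameters only, not the fitted Basel values.

```python
def omori_utsu_rate(t, K, c, p):
    """Omori-Utsu aftershock rate n(t) = K / (c + t)**p (t in days since shut-in)."""
    return K / (c + t) ** p

def time_to_background(K, c, p, mu):
    """Time after shut-in at which the modelled rate decays to the background rate mu,
    obtained by solving K / (c + t)**p = mu for t."""
    return (K / mu) ** (1.0 / p) - c

# Illustrative parameters only -- not the values estimated in the paper.
K, c, p, mu = 50.0, 0.2, 1.1, 0.01        # events/day, days, -, events/day
print(omori_utsu_rate(10.0, K, c, p), "events/day ten days after shut-in (illustrative)")
print(time_to_background(K, c, p, mu) / 365.25, "years to reach background (illustrative)")
```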
Cost-effectiveness analysis of valsartan versus losartan and the effect of switching.
Baker, Timothy M; Goh, Jowern; Johnston, Atholl; Falvey, Heather; Brede, Yvonne; Brown, Ruth E
2012-01-01
Losartan will shortly become generic, and this may encourage switching to the generic drug. However, valsartan was shown in a meta-analysis to be statistically superior to losartan in lowering blood pressure (BP). This paper examines the costs of treatment with these two drugs and the potential consequences of switching established valsartan patients to generic losartan. A US payer cost-effectiveness model was developed incorporating the risk of cardiovascular disease (CVD) events related to systolic blood pressure (SBP) control, comparing valsartan to continual losartan and to switching from valsartan to generic losartan. The model, based upon a meta-analysis by Nixon et al. and Framingham equations, included first CVD event costs calculated from US administrative data sets and utility values from published sources. The modeled outcomes were number of CVD events, costs and incremental cost per quality-adjusted life-year (QALY) and life-year (LY). Fewer patients had fatal and non-fatal CVD events with valsartan therapy compared with continual losartan and with patients switched from valsartan to generic losartan. The base-case model results indicated that continued treatment with valsartan had an incremental cost-effectiveness ratio of $27,268 and $25,460 per life-year gained, and $32,313 and $30,170 per QALY gained, relative to continual losartan and switching treatments, respectively. Sensitivity analyses found that patient discontinuation post-switching was a sensitive parameter. Including efficacy offsets with lowered adherence or discontinuation resulted in more favorable ratios for valsartan compared to switching therapy. The model does not evaluate post-primary CVD events and considers change in SBP from baseline level as the sole predictor of CVD risk. Valsartan appears to be cost-effective compared with switching to generic losartan, and switching to the generic drug does not support a cost-offset argument over the longer term. Physicians should continue to consider the needs of the individual patient and not cost offsets.
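The base-case figures quoted above are incremental cost-effectiveness ratios in the standard sense: extra cost divided by extra effect (life-years or QALYs). A minimal sketch, with placeholder numbers rather than the model's inputs:

```python
def icer(cost_new, cost_comp, effect_new, effect_comp):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect
    (life-year or QALY) of the new strategy relative to the comparator."""
    return (cost_new - cost_comp) / (effect_new - effect_comp)

# Placeholder numbers for illustration only (not the model's inputs):
print(icer(cost_new=14_500, cost_comp=12_000, effect_new=8.70, effect_comp=8.62))
# -> 31250.0 dollars per QALY gained
```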
ARSENIC MODE OF ACTION AND DEVELOPING A BBDR MODEL
The current USEPA cancer risk assessment for inorganic arsenic is based on a linear extrapolation of the epidemiological data from exposed populations in Taiwan. However, proposed key events in the mode of action (MoA) for arsenic-induced cancer (which may include altered DNA me...
[The genetics of thrombosis in cancer].
Soria, José Manuel; López, Sonia
2015-01-01
Venous thromboembolism (VTE) is a multifactorial and complex disease in which the interaction of genetic factors (estimated at 60%) and environmental factors (e.g., the use of oral contraceptives, pregnancy, immobility and cancer) determines the risk of thrombosis for each individual. In particular, the association between thrombosis and cancer is well established. Approximately 20% of patients with cancer develop a thromboembolic event over the course of the natural history of the tumor process, with thrombosis being the second leading cause of death for these patients. One of the greatest challenges currently facing the field of oncology is the identification of patients at high risk of VTE who can benefit from thromboprophylaxis. Currently, there is a VTE risk prediction model for patients with cancer (the Khorana risk score); however, its ability to identify patients at high risk is very low. It is important to note that this score, which is based on five clinical parameters, ignores the genetic variability associated with VTE risk. In this article, we present the preliminary results of the Oncothromb study, whose objective is to develop an individual VTE risk prediction model for patients with cancer who are treated with outpatient chemotherapy. Our model includes the clinical and genetic data on each patient (Thrombo inCode® genetic profile). Only by integrating multiple layers of biological information (clinical, plasma and genetic) can we obtain models that provide accurate information as to which patients are at high risk of developing a thromboembolic event associated with cancer, so that appropriate prophylactic measures can be taken. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
Lehmann, Nils; Erbel, Raimund; Mahabadi, Amir A; Rauwolf, Michael; Möhlenkamp, Stefan; Moebus, Susanne; Kälsch, Hagen; Budde, Thomas; Schmermund, Axel; Stang, Andreas; Führer-Sakel, Dagmar; Weimar, Christian; Roggenbuck, Ulla; Dragano, Nico; Jöckel, Karl-Heinz
2018-02-13
Computed tomography (CT) allows estimation of coronary artery calcium (CAC) progression. We evaluated several progression algorithms in our unselected, population-based cohort for risk prediction of coronary and cardiovascular events. In 3281 participants (45-74 years of age), free from cardiovascular disease until the second visit, risk factors and CTs at baseline (yielding CACb) and after a mean of 5.1 years (CAC5y) were measured. Hard coronary and cardiovascular events, as well as total cardiovascular events including revascularization, were recorded during a follow-up time of 7.8±2.2 years after the second CT. The added predictive value of 10 CAC progression algorithms on top of risk factors including baseline CAC was evaluated by using survival analysis, C-statistics, net reclassification improvement, and integrated discrimination index. A subgroup analysis of risk in CAC categories was performed. We observed 85 (2.6%) hard coronary, 161 (4.9%) hard cardiovascular, and 241 (7.3%) total cardiovascular events. Absolute CAC progression was higher with versus without subsequent coronary events (median, 115 [Q1-Q3, 23-360] versus 8 [0-83], P<0.0001; similar for hard/total cardiovascular events). Some progression algorithms added to the predictive value of baseline CT and risk assessment in terms of C-statistic or integrated discrimination index, especially for total cardiovascular events. However, CAC progression did not improve models that already included CAC5y and 5-year risk factors. An excellent prognosis was found for 921 participants with double-zero CACb=CAC5y=0 (10-year coronary and hard/total cardiovascular risk: 1.4%, 2.0%, and 2.8%), compared with 1.8%, 3.8%, and 6.6%, respectively, for participants with incident CAC. When CACb progressed from 1-399 to CAC5y≥400, coronary and total cardiovascular risk were nearly 2-fold in comparison with subjects who remained below CAC5y=400. Participants with CACb≥400 had high rates of hard coronary and hard/total cardiovascular events (10-year risk: 12.0%, 13.5%, and 30.9%, respectively). CAC progression is associated with coronary and cardiovascular event rates, but adds only weakly to risk prediction. What counts is the most recent CAC value and risk factor assessment. Therefore, a repeat scan >5 years after the first scan may be of additional value, except when a double-zero CT scan is present or when the subjects are already at high risk. © 2017 The Authors.
Statistical surrogate models for prediction of high-consequence climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick
2011-09-01
In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
Defining clinical deterioration.
Jones, Daryl; Mitchell, Imogen; Hillman, Ken; Story, David
2013-08-01
To review literature reporting adverse events and physiological instability in order to develop frameworks that describe and define clinical deterioration in hospitalised patients. Literature review of publications from 1960 to August 2012. Conception and refinement of models to describe clinical deterioration based on prevailing themes that developed chronologically in adverse event literature. We propose four frameworks or models that define clinical deterioration and discuss the utility of each. Early attempts used retrospective chart review and focussed on the end result of deterioration (adverse events) and iatrogenesis. Subsequent models were also retrospective, but used discrete complications (e.g. sepsis, cardiac arrest) to define deterioration, had a more clinical focus, and identified the concept of antecedent physiological instability. Current models for defining clinical deterioration are based on the presence of abnormalities in vital signs and other clinical observations and attempt to prospectively assist clinicians in predicting subsequent risk. However, use of deranged vital signs in isolation does not consider important patient-, disease-, or system-related factors that are known to adversely affect the outcome of hospitalised patients. These include pre-morbid function, frailty, extent and severity of co-morbidity, nature of presenting illness, delays in responding to deterioration and institution of treatment, and patient response to therapy. There is a need to develop multiple-variable models for deteriorating ward patients similar to those used in intensive care units. Such models may assist clinician education, prospective and real-time patient risk stratification, and guide quality improvement initiatives that prevent and improve response to clinical deterioration. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
Stress and sleep reactivity: a prospective investigation of the stress-diathesis model of insomnia.
Drake, Christopher L; Pillai, Vivek; Roth, Thomas
2014-08-01
To prospectively assess sleep reactivity as a diathesis of insomnia, and to delineate the interaction between this diathesis and naturalistic stress in the development of insomnia among normal sleepers. Longitudinal. Community-based. 2,316 adults from the Evolution of Pathways to Insomnia Cohort (EPIC) with no history of insomnia or depression (46.8 ± 13.2 y; 60% female). None. Participants reported the number of stressful events they encountered at baseline (Time 1), as well as the level of cognitive intrusion they experienced in response to each stressor. Stressful events (OR = 1.13; P < 0.01) and stress-induced cognitive intrusion (OR = 1.61; P < 0.01) were significant predictors of risk for insomnia one year hence (Time 2). Intrusion mediated the effects of stressful events on risk for insomnia (P < 0.05). Trait sleep reactivity significantly increased risk for insomnia (OR = 1.78; P < 0.01). Further, sleep reactivity moderated the effects of stress-induced intrusion (P < 0.05), such that the risk for insomnia as a function of intrusion was significantly higher in individuals with high sleep reactivity. Trait sleep reactivity also constituted a significant risk for depression (OR = 1.67; P < 0.01) two years later (Time 3). Insomnia at Time 2 significantly mediated this effect (P < 0.05). This study suggests that premorbid sleep reactivity is a significant risk factor for incident insomnia, and that it triggers insomnia by exacerbating the effects of stress-induced intrusion. Sleep reactivity is also a precipitant of depression, as mediated by insomnia. These findings support the stress-diathesis model of insomnia, while highlighting sleep reactivity as an important diathesis. Drake CL, Pillai V, Roth T. Stress and sleep reactivity: a prospective investigation of the stress-diathesis model of insomnia.
Majeed, Ammar; Wallvik, Niklas; Eriksson, Joakim; Höijer, Jonas; Bottai, Matteo; Holmström, Margareta; Schulman, Sam
2017-02-28
The optimal timing of vitamin K antagonist (VKA) resumption after an upper gastrointestinal (GI) bleeding, in patients with a continued indication for oral anticoagulation, is uncertain. We retrospectively included consecutive cases of VKA-associated upper GI bleeding from three hospitals. Data on the bleeding location, timing of VKA resumption, recurrent GI bleeding and thromboembolic events were collected. A model was constructed to evaluate the 'total risk', based on the sum of the cumulative rates of recurrent GI bleeding and thromboembolic events, depending on the timing of VKA resumption. A total of 121 (58%) of 207 patients with VKA-associated upper GI bleeding were restarted on anticoagulation after a median (interquartile range) of one (0.2-3.4) week after the index bleeding. Restarting VKAs was associated with a reduced risk of thromboembolism (HR 0.19; 95% CI, 0.07-0.55) and death (HR 0.61; 95% CI, 0.39-0.94), but with an increased risk of recurrent GI bleeding (HR 2.5; 95% CI, 1.4-4.5). The composite risk obtained from the combined statistical model of recurrent GI bleeding and thromboembolism decreased if VKAs were resumed after three weeks and reached a nadir at six weeks after the index GI bleeding. Against this background, we discuss how the disutility of the outcomes may influence the decision regarding timing of resumption. In conclusion, the optimal timing of VKA resumption after VKA-associated upper GI bleeding appears to be between 3 and 6 weeks after the index bleeding event, but has to take into account the degree of thromboembolic risk, patient values and preferences.
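The 'total risk' construction described above adds the cumulative recurrent-bleeding and thromboembolism risks at each candidate resumption time and looks for the minimum. A minimal sketch follows; the two risk curves are illustrative stand-ins, not the fitted cohort model.

```python
import numpy as np

def total_risk(resume_week, bleed_risk, thrombo_risk):
    """Composite risk at a candidate VKA resumption time: cumulative recurrent-bleeding
    risk plus cumulative thromboembolism risk, as in the abstract's 'total risk'."""
    return bleed_risk(resume_week) + thrombo_risk(resume_week)

# Illustrative stand-ins: bleeding risk falls and thromboembolic risk rises with delay.
bleed = lambda w: 0.20 * np.exp(-w / 3.0)
thrombo = lambda w: 0.02 * w

weeks = np.arange(0, 13)
risks = [total_risk(w, bleed, thrombo) for w in weeks]
print("nadir at week", weeks[int(np.argmin(risks))])   # with these toy curves: week 4
```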
How Safe Are Common Analgesics for the Treatment of Acute Pain for Children? A Systematic Review.
Hartling, Lisa; Ali, Samina; Dryden, Donna M; Chordiya, Pritam; Johnson, David W; Plint, Amy C; Stang, Antonia; McGrath, Patrick J; Drendel, Amy L
2016-01-01
Background. Fear of adverse events and occurrence of side effects are commonly cited by families and physicians as obstructive to appropriate use of pain medication in children. We examined evidence comparing the safety profiles of three groups of oral medications, acetaminophen, nonsteroidal anti-inflammatory drugs, and opioids, to manage acute nonsurgical pain in children (<18 years) treated in ambulatory settings. Methods. A comprehensive search was performed to July 2015, including review of national data registries. Two reviewers screened articles for inclusion, assessed methodological quality, and extracted data. Risks (incidence rates) were pooled using a random effects model. Results. Forty-four studies were included; 23 reported on adverse events. Based on limited current evidence, acetaminophen, ibuprofen, and opioids have similar nausea and vomiting profiles. Opioids have the greatest risk of central nervous system adverse events. Dual therapy with a nonopioid/opioid combination resulted in a lower risk of adverse events than opioids alone. Conclusions. Ibuprofen and acetaminophen have similar reported adverse effects and notably fewer adverse events than opioids. Dual therapy with a nonopioid/opioid combination confers a protective effect for adverse events over opioids alone. This research highlights challenges in assessing medication safety, including lack of more detailed information in registry data, and inconsistent reporting in trials.
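The pooling step above ("Risks (incidence rates) were pooled using a random effects model") can be illustrated with a standard DerSimonian-Laird random-effects combination. The function below is a generic sketch; the per-study effects and variances are hypothetical, not data from the review.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird): estimate the between-study variance
    tau^2, then combine study effects with weights 1 / (v_i + tau^2)."""
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances
    fixed = np.sum(w * effects) / np.sum(w)           # fixed-effect estimate
    q = np.sum(w * (effects - fixed) ** 2)            # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                     # between-study variance
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

# Hypothetical per-study adverse-event incidence rates and their variances:
pooled, se = dersimonian_laird([0.08, 0.12, 0.05, 0.10], [0.0004, 0.0009, 0.0002, 0.0006])
print(pooled, se)
```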
Displacement, county social cohesion, and depression after a large-scale traumatic event.
Lê, Félice; Tracy, Melissa; Norris, Fran H; Galea, Sandro
2013-11-01
Depression is a common and potentially debilitating consequence of traumatic events. Mass traumatic events cause wide-ranging disruptions to community characteristics, influencing the population risk of depression. In the aftermath of such events, population displacement is common. Stressors associated with displacement may increase risk of depression directly. Indirectly, persons who are displaced may experience erosion in social cohesion, further exacerbating their risk for depression. Using data from a population-based cross-sectional survey of adults living in the 23 southernmost counties of Mississippi (N = 708), we modeled the independent and joint relations of displacement and county-level social cohesion with depression 18-24 months after Hurricane Katrina. After adjustment for individual- and county-level socio-demographic characteristics and county-level hurricane exposure, joint exposure to both displacement and low social cohesion was associated with substantially higher log-odds of depression (b = 1.34 [0.86-1.83]). Associations were much weaker for exposure only to low social cohesion (b = 0.28 [-0.35-0.90]) or only to displacement (b = 0.04 [-0.80-0.88]). The associations were robust to additional adjustment for individually perceived social cohesion and social support. Addressing the multiple, simultaneous disruptions that are a hallmark of mass traumatic events is important to identify vulnerable populations and understand the psychological ramifications of these events.
Risks of Mortality and Morbidity from Worldwide Terrorism: 1968-2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogen, K T; Jones, E D
Worldwide data on terrorist incidents between 1968 and 2004 gathered by the RAND Corporation and the Oklahoma City National Memorial Institute for the Prevention of Terrorism (MIPT) were assessed for patterns and trends in morbidity/mortality. Adjusted data analyzed involve a total of 19,828 events, 7,401 "adverse" events (each causing ≥1 victim), and 86,568 "casualties" (injuries) of which 25,408 were fatal. Most terror-related adverse events, casualties and deaths involved bombs and guns. Weapon-specific patterns and terror-related risk levels in Israel (IS) have differed markedly from those of all other regions combined (OR). IS had a fatal fraction of casualties about half that of OR, but has experienced relatively constant lifetime terror-related casualty risks on the order of 0.5%--a level 2 to 3 orders of magnitude more than those experienced in OR that increased approximately 100-fold over the same period. Individual event fatality has increased steadily, the median increasing from 14% to 50%. Lorenz curves obtained indicate substantial dispersion among victim/event rates: about half of all victims were caused by the top 2.5% (or 10%) of harm-ranked events in OR (or IS). Extreme values of victim/event rates were approximated fairly well by generalized Pareto models (typically used to fit to data on forest fires, sea levels, earthquakes, etc.). These results were in turn used to forecast maximum OR- and IS-specific victims/event rates through 2080, illustrating empirically based methods that could be applied to improve strategies to assess, prevent and manage terror-related risks and consequences.
Risks of mortality and morbidity from worldwide terrorism: 1968-2004.
Bogen, Kenneth T; Jones, Edwin D
2006-02-01
Worldwide data on terrorist incidents between 1968 and 2004 gathered by the RAND Corporation and the Oklahoma City National Memorial Institute for the Prevention of Terrorism (MIPT) were assessed for patterns and trends in morbidity/mortality. Adjusted data analyzed involve a total of 19,828 events, 7,401 "adverse" events (each causing >or= 1 victim), and 86,568 "casualties" (injuries), of which 25,408 were fatal. Most terror-related adverse events, casualties, and deaths involved bombs and guns. Weapon-specific patterns and terror-related risk levels in Israel (IS) have differed markedly from those of all other regions combined (OR). IS had a fatal fraction of casualties about half that of OR, but has experienced relatively constant lifetime terror-related casualty risks on the order of 0.5%--a level 2 to 3 orders of magnitude more than those experienced in OR that increased approximately 100-fold over the same period. Individual event fatality has increased steadily, the median increasing from 14% to 50%. Lorenz curves obtained indicate substantial dispersion among victim/event rates: about half of all victims were caused by the top 2.5% (or 10%) of harm-ranked events in OR (or IS). Extreme values of victim/event rates were approximated fairly well by generalized Pareto models (typically used to fit to data on forest fires, sea levels, earthquakes, etc.). These results were in turn used to forecast maximum OR- and IS-specific victims/event rates through 2080, illustrating empirically-based methods that could be applied to improve strategies to assess, prevent, and manage terror-related risks and consequences.
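The tail-modelling step mentioned in both records above (approximating extreme victim/event rates with generalized Pareto models) is the standard peaks-over-threshold approach. The sketch below uses synthetic heavy-tailed data rather than the RAND-MIPT counts and is an illustration of the technique, not a reproduction of the analysis.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Hypothetical victims-per-event counts (heavy-tailed, for illustration only).
victims = rng.pareto(1.5, size=5000) * 2.0

# Peaks-over-threshold: fit a generalized Pareto model to exceedances above a high threshold.
threshold = np.quantile(victims, 0.95)
exceedances = victims[victims > threshold] - threshold
shape, loc, scale = genpareto.fit(exceedances, floc=0.0)

# Tail estimate: probability that a single event exceeds the threshold by more than x.
x = 100.0
p_exceed = (len(exceedances) / len(victims)) * genpareto.sf(x, shape, loc=0.0, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.2f}, P(excess > {x}) ~= {p_exceed:.2e}")
```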
Satellite-Enhanced Dynamical Downscaling of Extreme Events
NASA Astrophysics Data System (ADS)
Nunes, A.
2015-12-01
Severe weather events can be the triggers of environmental disasters in regions particularly susceptible to changes in hydrometeorological conditions. In that regard, the reconstruction of past extreme weather events can help in the assessment of vulnerability and risk mitigation actions. Using novel modeling approaches, dynamical downscaling of long-term integrations from global circulation models can be useful for risk analysis, providing more accurate climate information at regional scales. Originally developed at the National Centers for Environmental Prediction (NCEP), the Regional Spectral Model (RSM) is being used in the dynamical downscaling of global reanalysis, within the South American Hydroclimate Reconstruction Project. Here, RSM combines scale-selective bias correction with assimilation of satellite-based precipitation estimates to downscale extreme weather occurrences. Scale-selective bias correction is a method employed in the downscaling, similar to the spectral nudging technique, in which the downscaled solution develops in agreement with its coarse boundaries. Precipitation assimilation acts on modeled deep-convection, drives the land-surface variables, and therefore the hydrological cycle. During the downscaling of extreme events that took place in Brazil in recent years, RSM continuously assimilated NCEP Climate Prediction Center morphing technique precipitation rates. As a result, RSM performed better than its global (reanalysis) forcing, showing more consistent hydrometeorological fields compared with more sophisticated global reanalyses. Ultimately, RSM analyses might provide better-quality initial conditions for high-resolution numerical predictions in metropolitan areas, leading to more reliable short-term forecasting of severe local storms.
Ervasti, Jenni; Virtanen, Marianna; Lallukka, Tea; Friberg, Emilie; Mittendorfer-Rutz, Ellenor; Lundström, Erik; Alexanderson, Kristina
2017-09-29
We examined the risk of disability pension before and after an ischaemic heart disease (IHD) or stroke event, the burden of stroke compared with IHD, and which factors predicted disability pension after either event. A population-based cohort study with follow-up 5 years before and after the event. Register data were analysed with general linear modelling with binary and Poisson distributions, including interaction tests for event type (IHD/stroke). All people living in Sweden, aged 25‒60 years at the first event year, who had been living in Sweden for 5 years before the event and had no indication of IHD or stroke prior to the index event in 2006‒2008 were included, except for cases in which death occurred within 30 days of the event. People with both IHD and stroke were excluded, resulting in 18 480 cases of IHD (65%) and 9750 stroke cases (35%). Disability pension. Of those who went on to suffer an IHD or stroke event, 25% were already on disability pension a year before the event. The adjusted OR for disability pension at the first postevent year was 2.64 (95% CI 2.25 to 3.11) for people with stroke compared with IHD. Economic inactivity predicted disability pension regardless of event type (OR=3.40; 95% CI 2.85 to 4.04). Comorbid mental disorder was associated with the greatest risk (OR=3.60; 95% CI 2.69 to 4.83) after an IHD event. Regarding stroke, undergoing a medical procedure (a proxy for event severity) was the largest contributor (OR=2.27, 95% CI 1.43 to 3.60). While IHD events were more common, stroke involved more permanent work disability. Demographic, socioeconomic and comorbidity-related factors were associated with disability pension both before and after the event. The results help occupational and other healthcare professionals to identify vulnerable groups at risk for permanent labour market exclusion after such an event. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Green, Linda E; Dinh, Tuan A; Hinds, David A; Walser, Bryan L; Allman, Richard
2014-04-01
Tamoxifen therapy reduces the risk of breast cancer but increases the risk of serious adverse events including endometrial cancer and thromboembolic events. The cost effectiveness of using a commercially available breast cancer risk assessment test (BREVAGen™) to inform the decision of which women should undergo chemoprevention by tamoxifen was modeled in a simulated population of women who had undergone biopsies but had no diagnosis of cancer. A continuous time, discrete event, mathematical model was used to simulate a population of white women aged 40-69 years, who were at elevated risk for breast cancer because of a history of benign breast biopsy. Women were assessed for clinical risk of breast cancer using the Gail model and for genetic risk using a panel of seven common single nucleotide polymorphisms. We evaluated the cost effectiveness of using genetic risk together with clinical risk, instead of clinical risk alone, to determine eligibility for 5 years of tamoxifen therapy. In addition to breast cancer, the simulation included health states of endometrial cancer, pulmonary embolism, deep-vein thrombosis, stroke, and cataract. Estimates of costs in 2012 US dollars were based on Medicare reimbursement rates reported in the literature and utilities for modeled health states were calculated as an average of utilities reported in the literature. A 50-year time horizon was used to observe lifetime effects including survival benefits. For those women at intermediate risk of developing breast cancer (1.2-1.66 % 5-year risk), the incremental cost-effectiveness ratio for the combined genetic and clinical risk assessment strategy over the clinical risk assessment-only strategy was US$47,000, US$44,000, and US$65,000 per quality-adjusted life-year gained, for women aged 40-49, 50-59, and 60-69 years, respectively (assuming a price of US$945 for genetic testing). Results were sensitive to assumptions about patient adherence, utility of life while taking tamoxifen, and cost of genetic testing. From the US payer's perspective, the combined genetic and clinical risk assessment strategy may be a moderately cost-effective alternative to using clinical risk alone to guide chemoprevention recommendations for women at intermediate risk of developing breast cancer.
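The study above combines a clinical (Gail) risk estimate with a seven-SNP panel. One common way such panels are folded into an absolute risk estimate, shown below as a hedged sketch, multiplies the clinical 5-year risk by a composite relative risk normalized to the population mean under Hardy-Weinberg assumptions. This is an illustrative approach, not necessarily the BREVAGen algorithm, and every odds ratio, allele frequency and genotype below is hypothetical.

```python
def snp_composite_relative_risk(genotypes, per_allele_or, risk_allele_freq):
    """Multiply per-SNP relative risks (per risk allele, each normalized so the
    population-average risk is unchanged) into one composite factor. Illustrative only."""
    rr = 1.0
    for g, or_, p in zip(genotypes, per_allele_or, risk_allele_freq):
        # Population-mean relative risk under Hardy-Weinberg genotype frequencies.
        mean_or = (1 - p) ** 2 + 2 * p * (1 - p) * or_ + p ** 2 * or_ ** 2
        rr *= (or_ ** g) / mean_or
    return rr

def combined_five_year_risk(gail_risk, composite_rr):
    """Scale the clinical (Gail) 5-year risk by the genetic composite factor."""
    return gail_risk * composite_rr

# Hypothetical 7-SNP panel: genotype = number of risk alleles (0/1/2).
genotypes = [1, 0, 2, 1, 1, 0, 1]
ors = [1.26, 1.20, 1.16, 1.13, 1.11, 1.08, 1.07]
freqs = [0.40, 0.30, 0.25, 0.50, 0.35, 0.45, 0.28]

risk = combined_five_year_risk(0.014, snp_composite_relative_risk(genotypes, ors, freqs))
print(f"combined 5-year risk: {risk:.3%}")   # e.g. compare against a chemoprevention threshold
```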
Liu, Hui; Waite, Linda; Shen, Shannon; Wang, Donna
2016-01-01
Working from a social relationship and life course perspective, we provide generalizable population-based evidence on partnered sexuality linked to cardiovascular risk in later life using national longitudinal data from the NSHAP (N=2204). We consider characteristics of partnered sexuality of older men and women, particularly sexual activity and sexual quality, as they affect cardiovascular risk. Cardiovascular risk is defined as hypertension, rapid heart rate, elevated CRP, and general cardiovascular events. We find that older men are more likely than older women to report being sexually active and to report having sex more often and more enjoyably. Results from cross-lagged models suggest that high frequency of sex is positively related to later risk of cardiovascular events for men but not women, whereas good sexual quality seems to protect women but not men from cardiovascular risk in later life. We find no evidence that poor cardiovascular health interferes with later sexuality for either gender. PMID:27601406
Flood damage: a model for consistent, complete and multipurpose scenarios
NASA Astrophysics Data System (ADS)
Menoni, Scira; Molinari, Daniela; Ballio, Francesco; Minucci, Guido; Mejri, Ouejdane; Atun, Funda; Berni, Nicola; Pandolfo, Claudia
2016-12-01
Effective flood risk mitigation requires the impacts of flood events to be much better and more reliably known than is currently the case. Available post-flood damage assessments usually supply only a partial vision of the consequences of the floods as they typically respond to the specific needs of a particular stakeholder. Consequently, they generally focus (i) on particular items at risk, (ii) on a certain time window after the occurrence of the flood, (iii) on a specific scale of analysis or (iv) on the analysis of damage only, without an investigation of damage mechanisms and root causes. This paper responds to the necessity of a more integrated interpretation of flood events as the base to address the variety of needs arising after a disaster. In particular, a model is supplied to develop multipurpose complete event scenarios. The model organizes available information after the event according to five logical axes. This way post-flood damage assessments can be developed that (i) are multisectoral, (ii) consider physical as well as functional and systemic damage, (iii) address the spatial scales that are relevant for the event at stake depending on the type of damage that has to be analyzed, i.e., direct, functional and systemic, (iv) consider the temporal evolution of damage and finally (v) allow damage mechanisms and root causes to be understood. All the above features are key for the multi-usability of resulting flood scenarios. The model allows, on the one hand, the rationalization of efforts currently implemented in ex post damage assessments, also with the objective of better programming financial resources that will be needed for these types of events in the future. On the other hand, integrated interpretations of flood events are fundamental to adapting and optimizing flood mitigation strategies on the basis of thorough forensic investigation of each event, as corroborated by the implementation of the model in a case study.
NASA Astrophysics Data System (ADS)
Green, Daniel; Pattison, Ian; Yu, Dapeng
2016-04-01
Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, with perceived risk in the UK appearing to have increased in recent years as surface water flood events have become more severe and frequent. Surface water flood risk currently accounts for 1/3 of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation are limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data on which numerical models are based are often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment to collect data within, creating a controlled, closed system where independent variables can be altered independently to investigate cause and effect relationships. A physical modelling environment provides a suitable platform to investigate rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9 m2, two-tiered 1:100 physical model consisting of (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled urban setting. Terrestrial factors investigated include altering the physical model's catchment slope (0°-20°), as well as simulating a number of spatially varied impermeability and building density/configuration scenarios. Additionally, the influence of different storm dynamics and intensities was investigated. Preliminary results demonstrate that rainfall-runoff responses in the physical modelling environment are highly sensitive to slight increases in catchment gradient and rainfall intensity, and that more densely distributed building layouts significantly increase peak flows recorded at the physical model outflow when compared to sparsely distributed building layouts under comparable simulated rainfall conditions.
Moving towards a new paradigm for global flood risk estimation
NASA Astrophysics Data System (ADS)
Troy, Tara J.; Devineni, Naresh; Lima, Carlos; Lall, Upmanu
2013-04-01
Traditional approaches to flood risk assessment are typically indexed to an instantaneous peak flow event at a specific recording gage on a river, and then extrapolated through hydraulic modeling of that peak flow to the potential area that is likely to be inundated. Recent research shows that property losses tend to be determined as much by the duration of flooding as by the depth and velocity of inundation. The existing notion of a flood return period based on just the instantaneous peak flow rate at a stream gauge consequently needs to be revisited, especially for floods due to persistent rainfall as seen recently in Thailand, Pakistan, the Ohio and the Mississippi Rivers, France, and Germany. Depending on the flood event type considered, different rainfall inducing mechanisms (tropical storm, local convection, frontal system, recurrent tropical waves) may be involved. Each of these will have a characteristic spatial scale, expression and orientation and temporal characteristics. We develop stochastic models that can reproduce these attributes with appropriate intensity-duration-frequency and spatial expression, and hence provide a basis for conditioning basin hydrologic attributes for flood risk assessment. Past work on Non-homogeneous Hidden Markov Models (NHMM) is used as a basis to develop this capability at regional scales. In addition, a dynamic hierarchical Bayesian network model that is continuous and not based on discretization to states is tested and compared against NHMM. The exogenous variables in these models come from the analysis of key synoptic circulation patterns which will be used as predictors for the regional spatio-temporal models. The stochastic simulations of rainfall are then used as input to a flood modeling system, which consists of a series of physically based models. Rainfall-runoff generation is produced by the Variable Infiltration Capacity (VIC) model. When the modeled streamflow crosses a threshold, a full kinematic wave routing model is implemented at a finer resolution (<=1 km) in order to more accurately model streamflow under flood conditions and estimate inundation. This approach allows for efficient computational simulation of the hydrology when not under potential for flooding, with high-resolution flood wave modeling when there is flooding potential. We demonstrate the results of this flood risk estimation system for the Ohio River basin in the United States, a large river basin that is historically prone to flooding, with the intention of using it to do global flood risk assessment.
Physiological pharmacokinetic modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menzel, D.B.
1987-10-01
Risk assessment often defines the approach and the degree of regulation, and decisions in risk assessment often have major regulatory impacts. Chemicals that have economic value or that are byproducts of the chemical industry are common subjects of such decisions. Regrettably, decisions related to risk assessment, science, or regulatory matters will frequently be made with incomplete information and on the basis of intuitive reasoning. Statistical fits to experimental data have been used to estimate risks in humans from experimental data in animals. These treatments have not taken into account the obvious differences in physiology, biochemistry, and size between animals and humans. In this article, the use of mathematical models based on continuous relationships, rather than quantal events, is discussed. The mathematical models can be used to adjust the dose in the quantal response model, but the emphasis will be on how these mathematical models are conceived and what implications their use holds for risk assessment. Experiments with humans that produce toxic effects cannot be done. Data for human toxicity will always be lacking.
Beauvais, W; Fournié, G; Jones, B A; Cameron, A; Njeumi, F; Lubroth, J; Pfeiffer, D U
2013-11-01
Now that we are in the rinderpest post-eradication era, attention is focused on the risk of re-introduction. A semi-quantitative risk assessment identified accidental use of rinderpest virus in laboratories as the most likely cause of re-introduction. However, there are few data available on the rates of laboratory biosafety breakdowns in general. In addition, any predictions based on past events are subject to various uncertainties. The aims of this study were therefore to investigate the potential usefulness of historical data for predicting the future risk of rinderpest release via laboratory biosafety breakdowns, and to investigate the impacts of the various uncertainties on these predictions. Data were collected using a worldwide online survey of laboratories, a structured search of ProMED reports and discussion with experts. A stochastic model was constructed to predict the number of laboratory biosafety breakdowns involving rinderpest that will occur over the next 10 years, based on (1) the historical rate of biosafety breakdowns and (2) the change in the number of laboratories expected to hold rinderpest virus over the next 10 years relative to the historical period. The search identified five breakdowns, all of which occurred during 1970-2000 and all of which were identified via discussions with experts. Assuming that our search for historical events had a sensitivity of over 60% and that there has been at least a 40% reduction in the underlying risk (attributable to decreased laboratory activity post-eradication), the most likely number of biosafety events worldwide was estimated to be zero over a 10-year period. However, the risk of at least one biosafety breakdown remains greater than 1 in 10,000 unless the sensitivity is at least 99% or the number of laboratories has decreased by at least 99% (based on 2000-2010, during which there were no biosafety breakdowns). Copyright © 2013 Elsevier B.V. All rights reserved.
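A toy version of the forecasting logic described above (correct the observed historical count for search sensitivity, scale the yearly rate by an assumed reduction in laboratory activity, and project forward with a Poisson draw) is sketched below. The inputs are placeholders and the simplifications are mine, so the output should not be read as reproducing the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_at_least_one_breakdown(observed, years_obs, sensitivity, reduction,
                                horizon=10, n_sim=100_000):
    """Monte Carlo sketch: correct the observed historical count for imperfect search
    sensitivity, scale the yearly rate by the assumed post-eradication reduction in
    laboratory activity, and draw the number of breakdowns over the horizon from a Poisson."""
    corrected_count = observed / sensitivity                  # expected true historical count
    rate_per_year = corrected_count / years_obs * (1.0 - reduction)
    future = rng.poisson(rate_per_year * horizon, size=n_sim)
    return float(np.mean(future >= 1))

# Placeholder inputs (not the study's): 5 events found over ~40 years of laboratory
# activity, 60% search sensitivity, 80% assumed reduction in laboratories holding virus.
print(prob_at_least_one_breakdown(observed=5, years_obs=40, sensitivity=0.6, reduction=0.8))
```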
Vincenzi, Simone; Crivelli, Alain J; Jesensek, Dusan; De Leo, Giulio A
2008-06-01
Theoretical and empirical models of population dynamics have paid little attention to the implications of density-dependent individual growth on the persistence and regulation of small freshwater salmonid populations. We have therefore designed a study aimed at testing our hypothesis that density-dependent individual growth is a process that enhances population recovery and reduces extinction risk in salmonid populations in a variable environment subject to disturbance events. This hypothesis was tested in two newly introduced marble trout (Salmo marmoratus) populations living in Slovenian streams (Zakojska and Gorska) subject to severe autumn floods. We developed a discrete-time stochastic individual-based model of population dynamics for each population with demographic parameters and compensatory responses tightly calibrated on data from individually tagged marble trout. The occurrence of severe flood events causing population collapses was explicitly accounted for in the model. We used the model in a population viability analysis setting to estimate the quasi-extinction risk and demographic indexes of the two marble trout populations when individual growth was density-dependent. We ran a set of simulations in which the effect of floods on population abundance was explicitly accounted for and another set of simulations in which flood events were not included in the model. These simulation results were compared with those of scenarios in which individual growth was modelled with density-independent von Bertalanffy growth curves. Our results show how density-dependent individual growth may confer remarkable resilience to marble trout populations in the case of major flood events. The resilience to flood events shown by the simulation results can be explained by the increase in size-dependent fecundity as a consequence of the drop in population size after a severe flood, which allows the population to quickly recover to the pre-event conditions. Our results suggest that density-dependent individual growth plays a potentially powerful role in the persistence of freshwater salmonids living in streams subject to recurrent yet unpredictable flood events.
Whiteley, William N; Adams, Harold P; Bath, Philip MW; Berge, Eivind; Sandset, Per Morten; Dennis, Martin; Murray, Gordon D; Wong, Ka-Sing Lawrence; Sandercock, Peter AG
2013-01-01
Background: Many international guidelines on the prevention of venous thromboembolism recommend targeting heparin treatment at patients with stroke who have a high risk of venous thrombotic events or a low risk of haemorrhagic events. We sought to identify reliable methods to target anticoagulant treatment and so improve the chance of avoiding death or dependence after stroke. Methods: We obtained individual patient data from the five largest randomised controlled trials in acute ischaemic stroke that compared heparins (unfractionated heparin, heparinoids, or low-molecular-weight heparin) with aspirin or placebo. We developed and evaluated statistical models for the prediction of thrombotic events (myocardial infarction, stroke, deep vein thrombosis, or pulmonary embolism) and haemorrhagic events (symptomatic intracranial or significant extracranial) in the first 14 days after stroke. We calculated the absolute risk difference for the outcome “dead or dependent” in patients grouped by quartiles of predicted risk of thrombotic and haemorrhagic events with random effect meta-analysis. Findings: Patients with ischaemic stroke who were of advanced age, had increased neurological impairment, or had atrial fibrillation had a high risk of both thrombotic and haemorrhagic events after stroke. Additionally, patients with CT-visible evidence of recent cerebral ischaemia were at increased risk of thrombotic events. In evaluation datasets, the area under a receiver operating curve for prediction models for thrombotic events was 0·63 (95% CI 0·59–0·67) and for haemorrhagic events was 0·60 (0·55–0·64). We found no evidence that the net benefit from heparins increased with either increasing risk of thrombotic events or decreasing risk of haemorrhagic events. Interpretation: There was no evidence that patients with ischaemic stroke who were at higher risk of thrombotic events or lower risk of haemorrhagic events benefited from heparins. We were therefore unable to define a targeted approach to select the patients who would benefit from treatment with early anticoagulant therapy. We recommend that guidelines for routine or selective use of heparin in stroke should be revised. Funding: MRC. PMID:23642343
Whiteley, William N; Adams, Harold P; Bath, Philip M W; Berge, Eivind; Sandset, Per Morten; Dennis, Martin; Murray, Gordon D; Wong, Ka-Sing Lawrence; Sandercock, Peter A G
2013-06-01
Many international guidelines on the prevention of venous thromboembolism recommend targeting heparin treatment at patients with stroke who have a high risk of venous thrombotic events or a low risk of haemorrhagic events. We sought to identify reliable methods to target anticoagulant treatment and so improve the chance of avoiding death or dependence after stroke. We obtained individual patient data from the five largest randomised controlled trials in acute ischaemic stroke that compared heparins (unfractionated heparin, heparinoids, or low-molecular-weight heparin) with aspirin or placebo. We developed and evaluated statistical models for the prediction of thrombotic events (myocardial infarction, stroke, deep vein thrombosis, or pulmonary embolism) and haemorrhagic events (symptomatic intracranial or significant extracranial) in the first 14 days after stroke. We calculated the absolute risk difference for the outcome "dead or dependent" in patients grouped by quartiles of predicted risk of thrombotic and haemorrhagic events with random effect meta-analysis. Patients with ischaemic stroke who were of advanced age, had increased neurological impairment, or had atrial fibrillation had a high risk of both thrombotic and haemorrhagic events after stroke. Additionally, patients with CT-visible evidence of recent cerebral ischaemia were at increased risk of thrombotic events. In evaluation datasets, the area under a receiver operating curve for prediction models for thrombotic events was 0·63 (95% CI 0·59-0·67) and for haemorrhagic events was 0·60 (0·55-0·64). We found no evidence that the net benefit from heparins increased with either increasing risk of thrombotic events or decreasing risk of haemorrhagic events. There was no evidence that patients with ischaemic stroke who were at higher risk of thrombotic events or lower risk of haemorrhagic events benefited from heparins. We were therefore unable to define a targeted approach to select the patients who would benefit from treatment with early anticoagulant therapy. We recommend that guidelines for routine or selective use of heparin in stroke should be revised. MRC. Copyright © 2013 Elsevier Ltd. All rights reserved.
A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis
NASA Astrophysics Data System (ADS)
Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann
2017-04-01
The evaluation of potential monetary damage of flooding is an essential part of flood risk management. One possibility to estimate the monetary risk is to analyze long time series of observed flood events and their corresponding damages. In reality, however, only a few flood events are documented. This limitation can be overcome by the generation of a set of synthetic, physically and spatially plausible flood events and subsequently the estimation of the resulting monetary damages. In the present work, a set of synthetic flood events is generated by a continuous rainfall-runoff simulation in combination with a coupled weather generator and temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach. A synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-Nearest-Neighbor model. Following the event generation procedure, the negative consequences of flooding are analyzed. The corresponding flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss probability relation for each community in the study area. The loss probability relation is based on exposure and susceptibility analyses on a single object basis (residential buildings) for certain return periods. For these impact analyses, official inundation maps of the study area are used. Finally, by analyzing the total event time series of damages, the expected annual damage or losses associated with a certain probability of occurrence can be estimated for the entire study area.
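The disaggregation step described above can be sketched with a simple k-nearest-neighbour resampler: for each synthetic daily total, an observed day with a similar total is sampled and its hourly pattern rescaled. This is a generic illustration under simplifying assumptions (similarity measured on daily totals only), not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

def knn_disaggregate(daily_total, obs_daily_totals, obs_hourly, k=5):
    """k-NN temporal disaggregation sketch: find the k observed days whose daily totals
    are closest to the synthetic daily total, sample one of them with weights favouring
    the closest neighbour, and rescale its hourly pattern to match the synthetic total."""
    dist = np.abs(obs_daily_totals - daily_total)
    neighbours = np.argsort(dist)[:k]
    weights = 1.0 / np.arange(1, k + 1)               # common kernel choice in k-NN resampling
    weights /= weights.sum()
    chosen = rng.choice(neighbours, p=weights)
    pattern = obs_hourly[chosen]
    fractions = pattern / pattern.sum() if pattern.sum() > 0 else np.full(24, 1 / 24)
    return daily_total * fractions

# Hypothetical observed archive: 1000 days of hourly rainfall (mm).
obs_hourly = rng.gamma(0.3, 1.5, size=(1000, 24))
obs_daily = obs_hourly.sum(axis=1)

hourly = knn_disaggregate(daily_total=35.0, obs_daily_totals=obs_daily, obs_hourly=obs_hourly)
print(hourly.sum())   # preserves the synthetic daily total (35.0 mm)
```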
Ndindjock, Roger; Gedeon, Jude; Mendis, Shanthi; Paccaud, Fred; Bovet, Pascal
2011-04-01
To assess the prevalence of cardiovascular (CV) risk factors in Seychelles, a middle-income African country, and compare the cost-effectiveness of single-risk-factor management (treating individuals with arterial blood pressure ≥ 140/90 mmHg and/or total serum cholesterol ≥ 6.2 mmol/l) with that of management based on total CV risk (treating individuals with a total CV risk ≥ 10% or ≥ 20%). CV risk factor prevalence and a CV risk prediction chart for Africa were used to estimate the 10-year risk of suffering a fatal or non-fatal CV event among individuals aged 40-64 years. These figures were used to compare single-risk-factor management with total risk management in terms of the number of people requiring treatment to avert one CV event and the number of events potentially averted over 10 years. Treatment for patients with high total CV risk (≥ 20%) was assumed to consist of a fixed-dose combination of several drugs (polypill). Cost analyses were limited to medication. A total CV risk of ≥ 10% and ≥ 20% was found among 10.8% and 5.1% of individuals, respectively. With single-risk-factor management, 60% of adults would need to be treated and 157 cardiovascular events per 100000 population would be averted per year, as opposed to 5% of adults and 92 events with total CV risk management. Management based on high total CV risk optimizes the balance between the number requiring treatment and the number of CV events averted. Total CV risk management is much more cost-effective than single-risk-factor management. These findings are relevant for all countries, but especially for those economically and demographically similar to Seychelles.
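The reported figures already allow a rough back-of-the-envelope comparison of the two strategies in terms of people treated per event averted; the short calculation below uses only the numbers given in the abstract.

```python
# Figures from the abstract: single-risk-factor management treats 60% of adults and averts
# 157 CV events per 100,000 population per year; total-CV-risk management treats 5% and
# averts 92 events per 100,000 per year.
strategies = {
    "single risk factor":   {"treated_per_100k": 60_000, "events_averted_per_year": 157},
    "total CV risk >= 20%": {"treated_per_100k": 5_000,  "events_averted_per_year": 92},
}
for name, s in strategies.items():
    per_event = s["treated_per_100k"] / s["events_averted_per_year"]
    print(f"{name}: ~{per_event:.0f} people treated for each CV event averted per year")
```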
Van Gent, Jan-Michael; Calvo, Richard Yee; Zander, Ashley L; Olson, Erik J; Sise, C Beth; Sise, Michael J; Shackford, Steven R
2017-12-01
Venous thromboembolism, including deep vein thrombosis (DVT) and pulmonary embolism (PE), is typically reported as a composite measure of the quality of trauma center care. Given recent data suggesting that postinjury DVT and PE are distinct clinical processes, a better understanding may result from analyzing them as independent, competing events. Using competing risks analysis, we evaluated our hypothesis that the risk factors and timing of postinjury DVT and PE are different. We examined all adult trauma patients admitted to our Level I trauma center from July 2006 to December 2011 who received at least one surveillance duplex ultrasound of the lower extremities and who were at high risk or greater for DVT. Outcomes included DVT and PE events, and time-to-event from admission. We used competing risks analysis to evaluate risk factors for DVT while accounting for PE as a competing event, and vice versa. Of 2,370 patients, 265 (11.2%) had at least one venous thromboembolism event: 235 DVT only, 19 PE only, and 11 both DVT and PE. Within 2 days of admission, 38% of DVT cases had occurred compared with 26% of PE cases. Competing risks modeling with DVT as the primary event identified older age, severe injury (Injury Severity Score ≥ 15), mechanical ventilation longer than 4 days, active cancer, history of DVT or PE, major venous repair, male sex, and prophylactic enoxaparin and prophylactic heparin as associated risk factors. Modeling of PE as the primary event showed younger age, nonsevere injury (Injury Severity Score < 15), central line placement, and prophylactic heparin as relevant factors. The risk factors for PE and DVT after injury were different, suggesting that they are clinically distinct events that merit independent consideration. Many DVT events occurred early despite prophylaxis, bringing into question the preventability of postinjury DVT. We recommend trauma center quality reporting program measures be revised to account for DVT and PE as unique events. Epidemiologic, level III.
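One common way to implement the competing-risks idea described above is a pair of cause-specific Cox models, in which the competing event is censored at its occurrence. The sketch below uses simulated data and a reduced covariate set; it is not the study's own analysis, and it does not reproduce a Fine-Gray-style subdistribution model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(45, 15, n).round(),
    "iss_ge_15": rng.integers(0, 2, n),   # severe injury indicator
    "male": rng.integers(0, 2, n),
})
# Simulated latent times (days) for DVT, PE and censoring; the earliest is observed
t_dvt = rng.exponential(60 / np.exp(0.02 * (df.age - 45) + 0.5 * df.iss_ge_15))
t_pe = rng.exponential(400 / np.exp(0.3 * df.iss_ge_15))
t_cens = rng.uniform(5, 30, n)
df["time"] = np.minimum.reduce([t_dvt, t_pe, t_cens])
df["event_type"] = np.select([t_dvt == df.time, t_pe == df.time], ["dvt", "pe"], default="none")

def cause_specific_fit(data, cause):
    """Cox model for one cause, treating the competing event as censored at its time."""
    d = data.copy()
    d["event"] = (d["event_type"] == cause).astype(int)
    return CoxPHFitter().fit(d[["time", "event", "age", "iss_ge_15", "male"]],
                             duration_col="time", event_col="event")

cause_specific_fit(df, "dvt").print_summary()
cause_specific_fit(df, "pe").print_summary()
```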
Sailer, Anna M; van Kuijk, Sander M J; Nelemans, Patricia J; Chin, Anne S; Kino, Aya; Huininga, Mark; Schmidt, Johanna; Mistelbauer, Gabriel; Bäumler, Kathrin; Chiu, Peter; Fischbein, Michael P; Dake, Michael D; Miller, D Craig; Schurink, Geert Willem H; Fleischmann, Dominik
2017-04-01
Medical treatment of initially uncomplicated acute Stanford type-B aortic dissection is associated with a high rate of late adverse events. Identification of individuals who potentially benefit from preventive endografting is highly desirable. The association of computed tomography imaging features with late adverse events was retrospectively assessed in 83 patients with acute uncomplicated Stanford type-B aortic dissection, followed over a median of 850 (interquartile range 247-1824) days. Adverse events were defined as fatal or nonfatal aortic rupture, rapid aortic growth (>10 mm/y), aneurysm formation (≥6 cm), organ or limb ischemia, or new uncontrollable hypertension or pain. Five significant predictors were identified using multivariable Cox regression analysis: connective tissue disease (hazard ratio [HR] 2.94, 95% confidence interval [CI]: 1.29-6.72; P =0.01), circumferential extent of false lumen in angular degrees (HR 1.03 per degree, 95% CI: 1.01-1.04, P =0.003), maximum aortic diameter (HR 1.10 per mm, 95% CI: 1.02-1.18, P =0.015), false lumen outflow (HR 0.999 per mL/min, 95% CI: 0.998-1.000; P =0.055), and number of intercostal arteries (HR 0.89 per n, 95% CI: 0.80-0.98; P =0.024). A prediction model was constructed to calculate patient specific risk at 1, 2, and 5 years and to stratify patients into high-, intermediate-, and low-risk groups. The model was internally validated by bootstrapping and showed good discriminatory ability with an optimism-corrected C statistic of 70.1%. Computed tomography imaging-based morphological features combined into a prediction model may be able to identify patients at high risk for late adverse events after an initially uncomplicated type-B aortic dissection. © 2017 American Heart Association, Inc.
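The bootstrap optimism correction mentioned for the C statistic can be sketched as follows: refit the Cox model in bootstrap resamples, compare each resample's apparent concordance with its concordance on the original data, and subtract the average optimism from the apparent C. The data, the covariate subset and the number of replicates below are illustrative assumptions, not the authors' code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(2)
n = 83                                          # cohort size reported in the abstract
df = pd.DataFrame({
    "ctd": rng.integers(0, 2, n),               # connective tissue disease (hypothetical coding)
    "fl_degrees": rng.uniform(30, 360, n),      # circumferential false-lumen extent
    "max_diameter": rng.normal(40, 6, n),       # mm
})
risk = 0.8 * df.ctd + 0.01 * df.fl_degrees + 0.05 * (df.max_diameter - 40)
df["time"] = rng.exponential(1500 / np.exp(risk))
df["event"] = (rng.random(n) < 0.5).astype(int)

cols = ["time", "event", "ctd", "fl_degrees", "max_diameter"]

def cindex(model, data):
    # higher partial hazard implies shorter time to event, hence the minus sign
    return concordance_index(data["time"], -model.predict_partial_hazard(data), data["event"])

full = CoxPHFitter().fit(df[cols], "time", "event")
apparent = cindex(full, df)

optimism = []
for _ in range(200):
    boot = df.sample(len(df), replace=True, random_state=int(rng.integers(2**31 - 1)))
    m = CoxPHFitter().fit(boot[cols], "time", "event")
    optimism.append(cindex(m, boot) - cindex(m, df))    # apparent minus test performance

print(f"apparent C = {apparent:.3f}, optimism-corrected C = {apparent - np.mean(optimism):.3f}")
```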
A framework for quantifying net benefits of alternative prognostic models.
Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G
2012-01-30
New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.
Latent variable model for suicide risk in relation to social capital and socio-economic status.
Congdon, Peter
2012-08-01
There is little evidence on the association between suicide outcomes (ideation, attempts, self-harm) and social capital. This paper investigates such associations using a structural equation model based on health survey data, and allowing for both individual and contextual risk factors. Social capital and other major risk factors for suicide, namely socioeconomic status and social isolation, are modelled as latent variables that are proxied (or measured) by observed indicators or question responses for survey subjects. These latent scales predict suicide risk in the structural component of the model. Also relevant to explaining suicide risk are contextual variables, such as area deprivation and region of residence, as well as the subject's demographic status. The analysis is based on the 2007 Adult Psychiatric Morbidity Survey and includes 7,403 English subjects. A Bayesian modelling strategy is used. Models with and without social capital as a predictor of suicide risk are applied. A benefit to statistical fit is demonstrated when social capital is added as a predictor. Social capital varies significantly by geographic context variables (neighbourhood deprivation, region), and this impacts on the direct effects of these contextual variables on suicide risk. In particular, area deprivation is not confirmed as a distinct significant influence. The model develops a suicidality risk score incorporating social capital, and the success of this risk score in predicting actual suicide events is demonstrated. Social capital as reflected in neighbourhood perceptions is a significant factor affecting risks of different types of self-harm and may mediate the effects of other contextual variables such as area deprivation.
Gooding, Holly C; Ning, Hongyan; Gillman, Matthew W; Shay, Christina; Allen, Norrina; Goff, David C; Lloyd-Jones, Donald; Chiuve, Stephanie
2017-09-01
Few tools exist for assessing the risk for early atherosclerotic cardiovascular disease (ASCVD) events in young adults. To assess the performance of the Healthy Heart Score (HHS), a lifestyle-based tool that estimates ASCVD events in older adults, for ASCVD events occurring before 55 years of age. This prospective cohort study included 4893 US adults aged 18 to 30 years from the Coronary Artery Risk Development in Young Adults (CARDIA) study. Participants underwent measurement of lifestyle factors from March 25, 1985, through June 7, 1986, and were followed up for a median of 27.1 years (interquartile range, 26.9-27.2 years). Data for this study were analyzed from February 24 through December 12, 2016. The HHS includes age, smoking status, body mass index, alcohol intake, exercise, and a diet score composed of self-reported daily intake of cereal fiber, fruits and/or vegetables, nuts, sugar-sweetened beverages, and red and/or processed meats. The HHS in the CARDIA study was calculated using sex-specific equations produced by its derivation cohorts. The ability of the HHS to assess the 25-year risk for ASCVD (death from coronary heart disease, nonfatal myocardial infarction, and fatal or nonfatal ischemic stroke) in the total sample, in race- and sex-specific subgroups, and in those with and without clinical ASCVD risk factors at baseline. Model discrimination was assessed with the Harrell C statistic; model calibration, with Greenwood-Nam-D'Agostino statistics. The study population of 4893 participants included 2205 men (45.1%) and 2688 women (54.9%) with a mean (SD) age at baseline of 24.8 (3.6) years; 2483 (50.7%) were black; and 427 (8.7%) had at least 1 clinical ASCVD risk factor (hypertension, hyperlipidemia, or diabetes types 1 and 2). Among these participants, 64 premature ASCVD events occurred in women and 99 in men. The HHS showed moderate discrimination for ASCVD risk assessment in this diverse population of mostly healthy young adults (C statistic, 0.71; 95% CI, 0.66-0.76); it performed better in men (C statistic, 0.74; 95% CI, 0.68-0.79) than in women (C statistic, 0.69; 95% CI, 0.62-0.75); in white (C statistic, 0.77; 95% CI, 0.71-0.84) than in black (C statistic, 0.66; 95% CI, 0.60-0.72) participants; and in those without (C statistic, 0.71; 95% CI, 0.66-0.76) vs with (C statistic, 0.64; 95% CI, 0.55-0.73) clinical risk factors at baseline. The HHS was adequately calibrated overall and within each subgroup. The HHS, when measured in younger persons without ASCVD risk factors, performs moderately well in assessing risk for ASCVD events by early middle age. Its reliance on self-reported, modifiable lifestyle factors makes it an attractive tool for risk assessment and counseling for early ASCVD prevention.
Pealing, Louise; Perel, Pablo; Prieto-Merino, David; Roberts, Ian
2012-01-01
Background Vascular occlusive events can complicate recovery following trauma. We examined risk factors for venous and arterial vascular occlusive events in trauma patients and the extent to which the risk of vascular occlusive events varies with the severity of bleeding. Methods and Findings We conducted a cohort analysis using data from a large international, double-blind, randomised, placebo-controlled trial (The CRASH-2 trial) [1]. We studied the association between patient demographic and physiological parameters at hospital admission and the risk of vascular occlusive events. To assess the extent to which risk of vascular occlusive events varies with severity of bleeding, we constructed a prognostic model for the risk of death due to bleeding and assessed the relationship between risk of death due to bleeding and risk of vascular occlusive events. There were 20,127 trauma patients with outcome data including 204 (1.01%) patients with a venous event (pulmonary embolism or deep vein thrombosis) and 200 (0.99%) with an arterial event (myocardial infarction or stroke). There were 81 deaths due to vascular occlusive events. Increasing age, decreasing systolic blood pressure, increased respiratory rates, longer central capillary refill times, higher heart rates and lower Glasgow Coma Scores (all p<0.02) were strong risk factors for venous and arterial vascular occlusive events. Patients with more severe bleeding as assessed by predicted risk of haemorrhage death had a greatly increased risk for all types of vascular occlusive event (all p<0.001). Conclusions Patients with severe traumatic bleeding are at greatly increased risk of venous and arterial vascular occlusive events. Older age and blunt trauma are also risk factors for vascular occlusive events. Effective treatment of bleeding may reduce venous and arterial vascular occlusive complications in trauma patients. PMID:23251374
2007-11-01
Since the completion of the program in 2003, OSA-CBM has been merged into the MIMOSA consortium. The following areas are covered by this standard ... • data architecture design based on the CRIS data model from MIMOSA • implementation guidance among available middleware technologies
Besner, Marie-Claude; Prévost, Michèle; Regli, Stig
2011-01-01
Low and negative pressure events in drinking water distribution systems have the potential to result in intrusion of pathogenic microorganisms if an external source of contamination is present (e.g., nearby leaking sewer main) and there is a pathway for contaminant entry (e.g., leaks in drinking water main). While the public health risk associated with such events is not well understood, quantitative microbial risk assessment can be used to estimate such risk. A conceptual model is provided and the state of knowledge, current assumptions, and challenges associated with the conceptual model parameters are presented. This review provides a characterization of the causes, magnitudes, durations and frequencies of low/negative pressure events; pathways for pathogen entry; pathogen occurrence in external sources of contamination; volumes of water that may enter through the different pathways; fate and transport of pathogens from the pathways of entry to customer taps; pathogen exposure to populations consuming the drinking water; and risk associated with pathogen exposure. Copyright © 2010 Elsevier Ltd. All rights reserved.
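The QMRA chain outlined in the review (source concentration, intruded volume, dilution, ingestion, dose-response) can be expressed as a short Monte Carlo calculation. All distributions and parameter values below are placeholders rather than estimates from the paper, and an exponential dose-response model is assumed.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000                                                    # Monte Carlo iterations

pathogen_conc = rng.lognormal(mean=2.0, sigma=1.0, size=n)     # organisms per litre in sewage
intruded_volume = rng.uniform(0.1, 10.0, size=n)               # litres entering the main per event
dilution = rng.uniform(1e4, 1e6, size=n)                       # dilution between intrusion point and tap
tap_water_ingested = 1.0                                       # litres per person per day

dose = pathogen_conc * intruded_volume / dilution * tap_water_ingested
r = 0.5                                                        # exponential dose-response parameter
p_infection = 1.0 - np.exp(-r * dose)
print(f"mean per-event infection risk: {p_infection.mean():.2e}")
```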
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.
Subramanyam, Rajeev; Yeramaneni, Samrat; Hossain, Mohamed Monir; Anneken, Amy M; Varughese, Anna M
2016-05-01
Perioperative respiratory adverse events (PRAEs) are the most common cause of serious adverse events in children receiving anesthesia. The primary aim of this study was to develop and validate a risk prediction tool for the occurrence of PRAE from the onset of anesthesia induction until discharge from the postanesthesia care unit in children younger than 18 years undergoing elective ambulatory anesthesia for surgery and radiology. The incidence of PRAE was studied. We analyzed data from 19,059 patients from our department's quality improvement database. The predictor variables were age, sex, ASA physical status, morbid obesity, preexisting pulmonary disorder, preexisting neurologic disorder, and location of ambulatory anesthesia (surgery or radiology). Composite PRAE was defined as the presence of any 1 of the following events: intraoperative bronchospasm, intraoperative laryngospasm, postoperative apnea, postoperative laryngospasm, postoperative bronchospasm, or postoperative prolonged oxygen requirement. Development and validation of the risk prediction tool for PRAE were performed with logistic regression, using a split-sampling technique to divide the database into 2 independent cohorts based on the year in which the patient received ambulatory anesthesia for surgery or radiology. A risk score was developed based on the regression coefficients from the validation tool. The performance of the risk prediction tool was assessed by using tests of discrimination and calibration. The overall incidence of composite PRAE was 2.8%. The derivation cohort included 8904 patients, and the validation cohort included 10,155 patients. The risk of PRAE was 3.9% in the development cohort and 1.8% in the validation cohort. Age ≤ 3 years (versus >3 years), ASA physical status II or III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) significantly predicted the occurrence of PRAE in a multivariable logistic regression model. A risk score in the range of 0 to 3 was assigned to each significant variable in the logistic regression model, and the final score for all risk factors ranged from 0 to 11. A cutoff score of 4 was derived from a receiver operating characteristic curve to determine the high-risk category. The model C-statistic and the corresponding SE for the derivation and validation cohorts were 0.64 ± 0.01 and 0.63 ± 0.02, respectively. The sensitivity (±SE) of the risk prediction tool to identify children at risk for PRAE was 77.6 ± 0.02 in the derivation cohort and 76.2 ± 0.03 in the validation cohort. The risk tool developed and validated from our study cohort identified 5 risk factors: age ≤ 3 years (versus >3 years), ASA physical status II and III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) for PRAE. This tool can be used to provide an individual risk score for each patient to predict the risk of PRAE in the preoperative period.
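A sketch of how such a tool can be built: fit a logistic regression, convert the coefficients into integer points (the scaling-and-rounding rule used here is a common convention, not necessarily the authors'), and pick a cut-off from the ROC curve. Data are simulated and variable names mirror the predictors listed in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_curve

rng = np.random.default_rng(4)
n = 19059
df = pd.DataFrame({
    "age_le_3": rng.integers(0, 2, n),
    "asa_2_or_3": rng.integers(0, 2, n),
    "morbid_obesity": (rng.random(n) < 0.05).astype(int),
    "pulmonary_disorder": (rng.random(n) < 0.15).astype(int),
    "surgery_vs_radiology": rng.integers(0, 2, n),
})
# Simulated outcome with assumed effect sizes (for illustration only)
logit = (-4.5 + 0.6 * df.age_le_3 + 0.5 * df.asa_2_or_3 + 0.9 * df.morbid_obesity
         + 0.8 * df.pulmonary_disorder + 0.4 * df.surgery_vs_radiology)
df["prae"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df.drop(columns="prae"))
fit = sm.Logit(df["prae"], X).fit(disp=0)

betas = fit.params.drop("const")
points = (betas / betas[betas > 0].min()).round().astype(int)   # integer points per predictor
df["score"] = df[points.index].mul(points).sum(axis=1)

fpr, tpr, thr = roc_curve(df["prae"], df["score"])
print("points:", points.to_dict(), "| high-risk cut-off (Youden):", thr[np.argmax(tpr - fpr)])
```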
Galper, Benjamin Z.; Wang, Y. Claire; Einstein, Andrew J.
2015-01-01
Background Several approaches have been proposed for risk-stratification and primary prevention of coronary heart disease (CHD), but their comparative and cost-effectiveness is unknown. Methods We constructed a state-transition microsimulation model to compare multiple approaches to the primary prevention of CHD in a simulated cohort of men aged 45–75 and women 55–75. Risk-stratification strategies included the 2013 American College of Cardiology/American Heart Association (ACC/AHA) guidelines on the treatment of blood cholesterol, the Adult Treatment Panel (ATP) III guidelines, and approaches based on coronary artery calcium (CAC) scoring and C-reactive protein (CRP). Additionally we assessed a treat-all strategy in which all individuals were prescribed either moderate-dose or high-dose statins and all males received low-dose aspirin. Outcome measures included CHD events, costs, medication-related side effects, radiation-attributable cancers, and quality-adjusted-life-years (QALYs) over a 30-year timeframe. Results Treat-all with high-dose statins dominated all other strategies for both men and women, gaining 15.7 million QALYs, preventing 7.3 million myocardial infarctions, and saving over $238 billion, compared to the status quo, far outweighing its associated adverse events including bleeding, hepatitis, myopathy, and new-onset diabetes. ACC/AHA guidelines were more cost-effective than ATP III guidelines for both men and women despite placing 8.7 million more people on statins. For women at low CHD risk, treat-all with high-dose statins was more likely to cause a statin-related adverse event than to prevent a CHD event. Conclusions Despite leading to a greater proportion of the population placed on statin therapy, the ACC/AHA guidelines are more cost-effective than ATP III. Even so, at generic prices, treating all men and women with statins and all men with low-dose aspirin appears to be more cost-effective than all risk-stratification approaches for the primary prevention of CHD. Especially for low-CHD risk women, decisions on the appropriate primary prevention strategy should be based on shared decision making between patients and healthcare providers. PMID:26422204
Ashburner, Jeffrey M.; Go, Alan S.; Chang, Yuchiao; Fang, Margaret C.; Fredman, Lisa; Applebaum, Katie M.; Singer, Daniel E.
2016-01-01
Background/Objectives: To date, studies examining the association between warfarin therapy and incidence of ischemic stroke among patients with atrial fibrillation (AF) have not accounted for the competing risk of death. Competing risk analysis may provide greater understanding of the “real world” impact of anticoagulation on stroke risk over a multiyear time span. Design: Cohort study. Setting: ATRIA Study community-based cohort. Participants: 13,559 adults with nonvalvular AF between 1996 and 2003. Measurements: All events were clinician-adjudicated. We used extended Cox regression with longitudinal warfarin exposure to estimate cause-specific hazard ratios (HR) for thromboembolism (TE) and the competing risk event (all-cause death). The Fine and Gray subdistribution regression approach was used to estimate this association while accounting for competing death events. As a secondary analysis, follow-up was limited to 1, 3, and 5 years. Results: The rate of death was much higher in the non-warfarin group (8.1 deaths/100 person-years) compared to the warfarin group (5.5 deaths/100 person-years). The cause-specific HR indicated a large reduction in TE with warfarin use (adjusted HR: 0.57, 95% CI: 0.50–0.65). However, after accounting for competing death events, this association was substantially attenuated (adjusted HR: 0.87, 95% CI: 0.77–0.99). In analyses limited to 1 year of follow-up with fewer competing death events, the results for models that did and did not account for competing risks were similar. Conclusion: Analyses accounting for competing death events may provide a more realistic estimate of the longer-term stroke prevention benefits of anticoagulants for patients with AF, particularly those who are not currently treated with anticoagulants. PMID:27861698
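The attenuation described above can be illustrated by contrasting a naive 1 − Kaplan-Meier estimate of thromboembolism risk with an Aalen-Johansen cumulative incidence estimate that treats death as a competing event. This is a simulated sketch, not the ATRIA analysis (which used Fine-Gray subdistribution regression for covariate effects).

```python
import numpy as np
from lifelines import AalenJohansenFitter, KaplanMeierFitter

rng = np.random.default_rng(3)
n = 5_000
t_te, t_death, t_cens = (rng.exponential(s, n) for s in (20.0, 12.0, 8.0))   # years (illustrative)
time = np.minimum.reduce([t_te, t_death, t_cens])
event = np.select([t_te == time, t_death == time], [1, 2], default=0)   # 1 = TE, 2 = death, 0 = censored

kmf = KaplanMeierFitter().fit(time, event == 1)                          # ignores competing death
ajf = AalenJohansenFitter().fit(time, event, event_of_interest=1)        # accounts for it

print("5-year TE risk, 1 - KM:        ", 1 - kmf.survival_function_at_times(5.0).iloc[0])
print("5-year TE risk, Aalen-Johansen:", ajf.cumulative_density_.loc[:5.0].iloc[-1, 0])
```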
Streaming data from a smartphone application: A new approach to mapping health during travel.
Farnham, Andrea; Röösli, Martin; Blanke, Ulf; Stone, Emily; Hatz, Christoph; Puhan, Milo A
New research methods offer opportunities to investigate the influence of environment on health during travel. Our study uses data from a smartphone application to describe spatial and environmental patterns in health among travellers. A prospective cohort of travellers to Thailand used a smartphone application during their trips to 1) answer a daily questionnaire about health behaviours and events, and 2) collect streaming data on environment, itinerary, and weather. Incidence of health events was described by region and trip type. The relationship between environmental factors and health events was modelled using a logistic mixed model. The 75/101 (74.3%) travellers that completed the study answered 940 questionnaires, 796 (84.7%) of which were geolocated to Southeast Asia. Accidents occurred to 20.0% of participants and were mainly in the Thai islands, while self-rated "severe" mental health events (21.3%) were centred in Bangkok. The odds of a health event were higher in Chiang Mai (2.34, 95% CI: 1.08, 5.08) and on rainy days (1.86, 95% CI: 1.03, 3.36). Distinct patterns in spatial and environmental risk factors emerged in travellers to Thailand. Location based tracking could identify "hotspots" for health problems and update travel advice to target specific risk groups and regions. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting as early as possible weak signals of emerging risks ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
Puddu, Paolo Emilio; Piras, Paolo; Menotti, Alessandro
2017-02-01
To study coronary heart disease (CHD) death versus 11 other causes of death using the cumulative incidence function (CIF) and competing risks procedures to disentangle the differential role of risk factors for different end-points. Standard Cox and Fine-Gray models among 1712 middle-aged men were compared during 50 years of follow-up. CHD death was the primary event, while deaths from 11 selected causes, mutually exclusive from the primary end-point, were considered as secondary events. Reverse solutions were also performed. We considered 10 selected risk factors. CHD death risk was the second highest among the 12 mostly specific causes of death. Some risk factors were specific, such as serum cholesterol for CHD death, whereas systolic blood pressure, cigarette smoking and age may have a differential role in other causes of death. Application of the Fine-Gray model based on the CIF made it possible to dissect, at least in part, the respective roles that baseline covariates may have in separating the probabilities of the two types of death from each other. The results also point to the absence of a significant contribution from some of the selected risk factors, which calls for a parsimonious approach to prediction. The relative neglect of competing risks when defining the long-term role of risk factors now needs to be corrected, since we have clearly shown with the Fine-Gray model, in direct or reverse use, that the choice of end-point heavily influences the predictive capacity of risk factors. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
CKD and Sudden Cardiac Death: Epidemiology, Mechanisms, and Therapeutic Approaches
Whitman, Isaac R.; Feldman, Harold I.
2012-01-01
Multiple studies demonstrate a strong independent association between CKD and cardiovascular events including death, heart failure, and myocardial infarction. This review focuses on recent clinical studies that expand this spectrum of adverse cardiovascular events to include ventricular arrhythmias and sudden cardiac death. In addition, experimental models suggest structural remodeling of the heart and electrophysiologic changes in this population. These processes may explain the increased arrhythmic risk in kidney disease and aid in identifying patients who are at higher risk for sudden cardiac death. Finally, we review here the data to support the use of pharmacologic and device-based therapies for both the primary and secondary prevention of sudden cardiac death. PMID:23100219
Hignett, Sue; Wolf, Laurie; Taylor, Ellen; Griffiths, Paula
2015-11-01
The aim of this study was to use a theoretical model (bench) for human factors and ergonomics (HFE) and a comparison with occupational slips, trips, and falls (STFs) risk management to discuss patient STF interventions (bedside). Risk factors for patient STFs have been identified and reported since the 1950s and are mostly unchanged in the 2010s. The prevailing clinical view has been that STF events indicate underlying frailty or illness, and so many of the interventions over the past 60 years have focused on assessing and treating physiological factors (dizziness, illness, vision/hearing, medicines) rather than designing interventions to reduce risk factors at the time of the STF. Three case studies are used to discuss how HFE has been, or could be, applied to STF risk management as (a) a design-based (building) approach to embed safety into the built environment, (b) a staff- (and organization-) based approach, and (c) a patient behavior-based approach to explore and understand patient perspectives of STF events. The results from the case studies suggest taking a similar HFE integration approach to other industries, that is, a sustainable design intervention for the person who experiences the STF event-the patient. This paper offers a proactive problem-solving approach to reduce STFs by patients in acute hospitals. Authors of the three case studies use HFE principles (bench/book) to understand the complex systems for facility and equipment design and include the perspective of all stakeholders (bedside). © 2015, Human Factors and Ergonomics Society.
Gandy, William M; Coberley, Carter; Pope, James E; Rula, Elizabeth Y
2014-02-01
The goal of this study was to determine the relationship between individual well-being and risk of a hospital event in the subsequent year. The authors hypothesized an inverse relationship in which low well-being predicts higher likelihood of hospital use. The study specifically sought to understand how well-being segments and demographic variables interact in defining risk of a hospital event (inpatient admission or emergency room visit) in an employed population. A retrospective study design was conducted with data from 8835 employees who completed a Well-Being Assessment questionnaire based on the Gallup-Healthways Well-Being Index. Cox proportional hazards models were used to examine the impact of Individual Well-Being Score (IWBS) segments and member demographics on hazard ratios (HRs) for a hospital event during the 12 months following assessment completion. Significant main effects were found for the influence of IWBS segments, sex, education, and relationship status on HRs of a hospital event, but not for age. However, further analysis revealed significant interactions between age and IWBS segments (P=0.005) and between age and sex (P<0.0001), indicating that the effects for IWBS segments and sex on HRs of a hospital event are mediated through their relationship with age. Overall, the strong relationship between low well-being and higher risk of an event in employees ages 44 years and older is mitigated in younger age groups. These results suggest that youth attenuates the risk engendered in poor well-being; therefore, methods to maintain or improve well-being as individuals age presents a strong opportunity for reducing hospital events.
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. It must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. Significance of multi-hazard risk is also illustrated for uncorrelated hazards of earthquakes and high winds which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
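A very small example of the Bayesian-updating ingredient: a component failure probability with a conjugate Beta prior is updated as successive batches of evidence (e.g., test runs or field observations) arrive. The prior and the observation counts are invented for illustration; the dissertation's framework operates on Bayesian networks over events and hazards rather than on a single parameter.

```python
from scipy import stats

a, b = 1.0, 99.0                           # Beta prior: mean failure probability ~0.01 (assumed)
evidence = [(0, 50), (1, 40), (0, 60)]     # (failures, trials) from successive evidence sources

for failures, trials in evidence:
    a += failures
    b += trials - failures                 # conjugate Beta-Binomial update
    post = stats.beta(a, b)
    print(f"posterior mean {post.mean():.4f}, "
          f"95% CI ({post.ppf(0.025):.4f}, {post.ppf(0.975):.4f})")
```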
Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez
2014-01-01
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event in the milk, the fresh cheese or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of the major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Goteti, G.; Kaheil, Y. H.; Katz, B. G.; Li, S.; Lohmann, D.
2011-12-01
In the United States, government agencies as well as the National Flood Insurance Program (NFIP) use flood inundation maps associated with the 100-year return period (base flood elevation, BFE), produced by the Federal Emergency Management Agency (FEMA), as the basis for flood insurance. A credibility check of the flood risk hydraulic models, often employed by insurance companies, is their ability to reasonably reproduce FEMA's BFE maps. We present results from the implementation of a flood modeling methodology aimed towards reproducing FEMA's BFE maps at a very fine spatial resolution using a computationally parsimonious, yet robust, hydraulic model. The hydraulic model used in this study has two components: one for simulating flooding of the river channel and adjacent floodplain, and the other for simulating flooding in the remainder of the catchment. The first component is based on a 1-D wave propagation model, while the second component is based on a 2-D diffusive wave model. The 1-D component captures the flooding from large-scale river transport (including upstream effects), while the 2-D component captures the flooding from local rainfall. The study domain consists of the contiguous United States, hydrologically subdivided into catchments averaging about 500 km2 in area, at a spatial resolution of 30 meters. Using historical daily precipitation data from the Climate Prediction Center (CPC), the precipitation associated with the 100-year return period event was computed for each catchment and was input to the hydraulic model. Flood extent from the FEMA BFE maps is reasonably replicated by the 1-D component of the model (riverine flooding). FEMA's BFE maps only represent the riverine flooding component and are unavailable for many regions of the USA. However, this modeling methodology (1-D and 2-D components together) covers the entire contiguous USA. This study is part of a larger modeling effort from Risk Management Solutions° (RMS) to estimate flood risk associated with extreme precipitation events in the USA. Towards this greater objective, state-of-the-art models of flood hazard and stochastic precipitation are being implemented over the contiguous United States. Results from the successful implementation of the modeling methodology will be presented.
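One common way to derive a 100-year return-period precipitation depth from a daily record is to fit an extreme-value distribution to annual maxima and read off the corresponding quantile. The abstract does not state which method was used for the CPC data, so the Gumbel fit below is only an illustration with simulated data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
daily_precip = rng.gamma(shape=0.4, scale=6.0, size=(60, 365))   # 60 years of daily totals (mm)
annual_max = daily_precip.max(axis=1)

loc, scale = stats.gumbel_r.fit(annual_max)                      # fit annual maxima
return_period = 100
p100 = stats.gumbel_r.ppf(1 - 1 / return_period, loc, scale)     # 1% annual exceedance quantile
print(f"estimated {return_period}-year daily precipitation: {p100:.1f} mm")
```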
O’Brien, Denzil
2016-01-01
Simple Summary This paper examines a number of methods for calculating injury risk for riders in the equestrian sport of eventing, and suggests that the primary locus of risk is the action of the horse jumping, and the jump itself. The paper argues that risk calculation should therefore focus first on this locus. Abstract All horse-riding is risky. In competitive horse sports, eventing is considered the riskiest, and is often characterised as very dangerous. But based on what data? There has been considerable research on the risks and unwanted outcomes of horse-riding in general, and on particular subsets of horse-riding such as eventing. However, there can be problems in accessing accurate, comprehensive and comparable data on such outcomes, and in using different calculation methods which cannot compare like with like. This paper critically examines a number of risk calculation methods used in estimating risk for riders in eventing, including one method which calculates risk based on hours spent in the activity and in one case concludes that eventing is more dangerous than motorcycle racing. This paper argues that the primary locus of risk for both riders and horses is the jump itself, and the action of the horse jumping. The paper proposes that risk calculation in eventing should therefore concentrate primarily on this locus, and suggests that eventing is unlikely to be more dangerous than motorcycle racing. The paper proposes avenues for further research to reduce the likelihood and consequences of rider and horse falls at jumps. PMID:26891334
Meissner, Y; Richter, A; Manger, B; Tony, H P; Wilden, E; Listing, J; Zink, A; Strangfeld, A
2017-09-01
In the general population, the incidence of stroke is increased following other serious events and hospitalisation. We investigated the impact of serious adverse events on the risk of stroke in patients with rheumatoid arthritis (RA), taking risk factors and treatment into account. Using data of the German biologics register RABBIT (Rheumatoid Arthritis: Observation of Biologic Therapy) with 12354 patients with RA, incidence rates (IRs) and risk factors for stroke were investigated using multi-state and Cox proportional hazard models. In addition, in a nested case-control study, all patients with stroke were matched 1:2 to patients with identical baseline risk profile and analysed using a shared frailty model. During follow-up, 166 strokes were reported. The overall IR was 3.2/1000 patient-years (PY) (95% CI 2.7 to 3.7). It was higher after a serious adverse event (IR: 9.0 (7.3 to 11.0)), particularly within 30 days after the event (IR: 94.9 (72.6 to 121.9)). The adjusted Cox model showed increased risks of age per 5 years (HR: 1.4 (1.3 to 1.5)), hyperlipoproteinaemia (HR: 1.6 (1.0 to 2.5)) and smoking (HR: 1.9 (1.3 to 2.6)). The risk decreased with better physical function (HR: 0.9 (0.8 to 0.96)). In the case-control study, 163 patients were matched to 326 controls. Major risk factors for stroke were untreated cardiovascular disease (HR: 3.3 (1.5 to 7.2)) and serious infections (HR:4.4 (1.6 to 12.5)) or other serious adverse events (HR: 2.6 (1.4 to 4.8)). Incident adverse events, in particular serious infections, and insufficient treatment of cardiovascular diseases are independent drivers of the risk of stroke. Physicians should be aware that patients who experience a serious event are at increased risk of subsequent stroke. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
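The headline incidence rate can be reproduced with an exact Poisson confidence interval; the person-year denominator below is back-calculated from the reported rate and is therefore approximate.

```python
from scipy import stats

events = 166
person_years = events / 3.2 * 1000        # ~51,900 PY implied by the reported overall IR

rate = events / person_years * 1000
lo = stats.chi2.ppf(0.025, 2 * events) / 2 / person_years * 1000        # exact (Garwood) limits
hi = stats.chi2.ppf(0.975, 2 * (events + 1)) / 2 / person_years * 1000
print(f"IR = {rate:.1f} per 1,000 PY (95% CI {lo:.1f} to {hi:.1f})")
```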
NASA Technical Reports Server (NTRS)
Guarro, Sergio B.
2010-01-01
This report validates and documents the detailed features and practical application of the framework for software intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioner. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
NASA Astrophysics Data System (ADS)
Kirkire, Milind Shrikant; Rane, Santosh B.; Jadhav, Jagdish Rajaram
2015-12-01
The medical product development (MPD) process is highly multidisciplinary in nature, which increases its complexity and the associated risks. Managing risks during the MPD process is crucial. The objective of this research is to explore risks during MPD in a dental product manufacturing company and to propose a model for risk mitigation during the MPD process that minimizes failure events. A case study approach is employed. The existing MPD process is mapped onto the five phases of a customized phase gate process. The activities during each phase of development and the risks associated with each activity are identified and categorized based on the source of occurrence. The risks are analyzed using traditional failure mode and effects analysis (FMEA) and fuzzy FMEA. A comparison of the two methods shows that the fuzzy approach avoids the duplication of risk priority numbers (RPNs) and is better at converting expert knowledge into information for estimating risk factor values. Critical, moderate, low-level and negligible risks are identified based on criticality, and risk treatments and a mitigation model are proposed. During the initial phases of MPD the risks are less severe, but as the process progresses the severity of risks increases. The MPD process should be carefully designed and simulated to minimize the number of risk events and their severity. To successfully develop products and devices within manufacturing companies, process risk management is essential. A systematic approach to managing risks during the MPD process will lead to the development of medical products with the expected quality and reliability. This is the first research of its kind focusing on MPD process risks and their management. The methodology adopted in this paper will help developers, managers and researchers gain a competitive edge over other companies by managing risks during the development process.
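The duplication problem with traditional RPNs that motivates the fuzzy treatment is easy to see in a toy calculation: failure modes with quite different severity/occurrence/detection profiles can receive identical RPNs. The failure modes and ratings below are invented for illustration.

```python
# Traditional FMEA: risk priority number = severity x occurrence x detection (1-10 scales)
failure_modes = {
    "ambiguous user requirement":     (8, 3, 2),
    "unvalidated sterilisation step": (4, 6, 2),   # same RPN as above, different profile
    "late supplier design change":    (6, 5, 4),
}
for name, (severity, occurrence, detection) in failure_modes.items():
    rpn = severity * occurrence * detection
    print(f"{name:32s} S={severity} O={occurrence} D={detection}  RPN={rpn}")
```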
Chatterjee, Satabdi; Chen, Hua; Johnson, Michael L; Aparasu, Rajender R
2012-10-01
Atypical antipsychotic agents have been associated with cerebrovascular adverse events, particularly in elderly dementia patients. However, limited evidence exists regarding comparative cerebrovascular profiles of individual atypical agents, particularly in community settings. The objective of this study was to evaluate the risk of cerebrovascular events associated with use of risperidone, olanzapine and quetiapine in community-dwelling older adults in the US. A propensity score-adjusted retrospective cohort design involving the IMS LifeLink™ Health Plan Claims Database was used for the study. The study population included all older adults (aged ≥50 years) who initiated risperidone, olanzapine or quetiapine anytime during 1 July 2000 to 30 June 2008. Patients were followed until hospitalization or an emergency room visit for a cerebrovascular event, or the end of the study period, whichever occurred earlier. The Cox proportional hazard regression model with time-varying covariates was used to evaluate the risk of cerebrovascular events during the follow-up period, using olanzapine as the reference. The covariates adjusted for in the final model included multiple propensity scores and exposure to other medications that could be associated with the risk of cerebrovascular events. A total of 2,458 cerebrovascular events were identified in the study cohort: 1,081 (21.38%) for risperidone users, 816 (18.75%) for olanzapine users and 561 (21.05%) for quetiapine users. After adjusting for propensity scores and other covariates, the Cox proportional hazard model revealed that use of quetiapine [hazard ratio (HR) 0.88; 95% CI 0.78, 0.99] but not risperidone (HR 1.05; 95% CI 0.95, 1.16) was associated with a decrease in the risk of cerebrovascular adverse events compared with olanzapine. The study suggested that quetiapine use may be associated with a moderately lower risk of cerebrovascular events than olanzapine in older adults. Prescribers should closely monitor the patients treated with atypical agents for the incidence of cerebrovascular adverse events.
A Research Roadmap for Computation-Based Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
Developing a phenological model for grapevine to assess future frost risk in Luxembourg
NASA Astrophysics Data System (ADS)
Caffarra, A.; Molitor, D.; Pertot, I.; Sinigoy, P.; Junk, J.
2012-04-01
Late frost damage represents a significant hazard to grape production in cool climate viticulture regions such as Luxembourg. The main aim of our study is to analyze the future frequency of these events for Luxembourg's winegrowing region. Spring frost injuries on grape may occur when young green parts are exposed to air temperatures below 0°C. The potential risk is determined by (i) minimum air temperature conditions and (ii) the timing of bud burst. Therefore, we developed and validated a budburst model for the grapevine (*Vitis vinifera*) cultivar Rivaner, the most widely grown local variety, based on multi-annual data from 7 different sites across Europe and the US. An advantage of this approach is that it can be applied to a wide range of climate conditions. Higher spring temperatures are projected for the future and could lead to earlier dates of budburst as well as earlier dates of the last frost events of the season. However, so far it is unknown whether this will increase or decrease the risk of severe late frost damage for Luxembourg's winegrowing region. To address this question, results of 10 regional climate change projections from the FP6 ENSEMBLES project (spatial resolution = 25 km; A1B emission scenario) were combined with the new bud burst model. The use of a multi-model ensemble of climate change projections allows for a better quantification of the uncertainties. A bias correction scheme, based on local observations, was applied to the model output. Projected daily minimum air temperatures, up to 2098, were compared to the projected date of bud burst in order to quantify the future frost risk for Luxembourg.
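A simplified thermal-time sketch of the frost-risk logic: budburst is predicted as the day on which accumulated forcing above a base temperature exceeds a threshold, and a year is flagged when a sub-zero minimum temperature occurs after budburst. The published Rivaner model and the bias-corrected ENSEMBLES projections are more elaborate; the parameters and the synthetic weather here are illustrative only.

```python
import numpy as np

BASE_T, FORCING_REQ = 5.0, 150.0      # degC and degree-days -- assumed parameters

def budburst_doy(tmean):
    """Day of year when accumulated forcing above BASE_T first exceeds FORCING_REQ."""
    forcing = np.cumsum(np.clip(tmean - BASE_T, 0.0, None))
    days = np.nonzero(forcing >= FORCING_REQ)[0]
    return int(days[0]) if days.size else None

def late_frost(tmean, tmin, window=60):
    """True if a sub-zero minimum temperature occurs within `window` days after budburst."""
    doy = budburst_doy(tmean)
    return doy is not None and np.any(tmin[doy:doy + window] < 0.0)

rng = np.random.default_rng(5)
doy = np.arange(365)
tmean = 10.0 - 10.0 * np.cos(2 * np.pi * doy / 365) + rng.normal(0, 3, 365)   # one synthetic year
tmin = tmean - 6.0
print("budburst DOY:", budburst_doy(tmean), "| late frost after budburst:", late_frost(tmean, tmin))
```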
Laukkanen, Jari A; Mäkikallio, Timo H; Kauhanen, Jussi; Kurl, Sudhir
2009-10-01
Adrenoceptors mediate contraction of vascular smooth muscle and induce coronary vasoconstriction in humans. A deletion variant of the human alpha(2B)-adrenoceptor lacking glutamic acid residues has been associated with impaired receptor desensitization. This receptor variant could, therefore, be involved in cardiovascular diseases associated with enhanced vasoconstriction. Our aim was to study whether an insertion/deletion (I/D) polymorphism in the alpha(2B)-adrenoceptor gene is associated with the risk of sudden cardiac death. This was a prospective population-based study investigating risk factors for cardiovascular diseases in middle-aged men aged 42 to 60 years from eastern Finland. The study is based on 1,606 men with complete DNA data who were followed for an average of 17 years. In this study population, 338 men (21%) had the D/D genotype, 467 (29%) had the I/I genotype, and 801 (50%) had a heterozygous genotype. There were 76 sudden cardiac deaths during follow-up (0.81 deaths/1,000 persons per year). In a Cox model adjusting for other coronary risk factors (age, systolic blood pressure, smoking, diabetes, serum low-density lipoprotein and high-density lipoprotein cholesterol, body mass index, and exercise-induced myocardial ischemia), men with the D/D or I/D genotype had a 1.97 times (95% CI 1.08-3.59, P = .026) higher risk of sudden cardiac death (20 events for the D/D genotype, 13 events for the I/I genotype, and 43 events for the I/D genotype) compared with men carrying the I/I genotype. In addition, the alpha(2B)-adrenoceptor D/D genotype was associated with the risk of coronary heart disease death and acute coronary events after adjusting for risk factors. The I/D polymorphism of the alpha(2B)-adrenoceptor is a genetic risk predictor of sudden cardiac death.
A methodology for evacuation design for urban areas: theoretical aspects and experimentation
NASA Astrophysics Data System (ADS)
Russo, F.; Vitetta, A.
2009-04-01
This paper proposes a unifying approach for the simulation and design of a transportation system under incoming safety and/or security conditions. Safety and security are concerned with threats generated by very different factors which, in turn, generate emergency conditions, such as the 9/11, Madrid and London attacks, the Asian tsunami, and Hurricane Katrina, to consider only the last five years. In transportation systems, when exogenous events happen and there is a sufficient time interval between the instant when the event happens and the instant when the event affects the population, it is possible to reduce the negative effects by evacuating the population. For such events it is always possible to prepare the evacuation in the short and long term; for other events it is also possible to plan real-time evacuation within the general risk methodology. The development of models for emergency conditions in transportation systems has not received much attention in the literature. The main findings in this area are limited to only a few public research centres and private companies. In general, there is no systematic analysis of risk theory applied to transportation systems. Very often, in practice, vulnerability and exposure in the transportation system are treated as similar variables or, worse, exposure variables are treated as vulnerability variables. Models and algorithms specified and calibrated under ordinary conditions cannot be directly applied to emergency conditions under the hypotheses usually considered. This paper is developed with the following main objectives: (a) to formalize the risk problem with a clear distinction (in terms of consequences) between the definitions of vulnerability and exposure in a transportation system; the paper thus offers improvements over consolidated quantitative risk analysis models, especially transportation risk analysis models (risk assessment); (b) to formalize a system of models for evacuation simulation; (c) to calibrate and validate the system of models for evacuation simulation using data from a real experiment. In relation to the proposed objectives: (a) a general framework for risk analysis is reported in the first part, with specific methods and models to analyze urban transportation system performance in emergency conditions when exogenous phenomena occur and for the specification of the risk function; (b) a formulation of the general evacuation problem in the standard "what if" simulation context is specified in the second part, with reference to the model considered for the simulation of the transportation system under ordinary conditions; (c) the set of models specified in the second part is calibrated and validated in the third part using data from a real experiment. The experiment was carried out in the central business district of an Italian village, where about 1,000 inhabitants were evacuated in order to construct a complete database. It required that socioeconomic information (population, number employed, public buildings, schools, etc.) and transport supply characteristics (infrastructures, etc.) be measured before and during the experiment. The evacuation was recorded with 30 video cameras for laboratory analysis.
The results are divided into six closely connected tasks: demand models; supply and supply-demand interaction models for users; simulation of refuge areas for users; design of path choice models for emergency vehicles; pedestrian outflow models in a building; and the planning process and guidelines.
NASA Astrophysics Data System (ADS)
Schindewolf, Marcus; Kaiser, Andreas; Buchholtz, Arno; Schmidt, Jürgen
2017-04-01
Extreme rainfall events and resulting flash floods led to massive devastation in Germany during spring 2016. The study presented aims at the development of an early warning system that allows negative effects on infrastructure to be simulated and assessed from radar-based heavy rainfall predictions, which serve as input data for the process-based soil loss and deposition model EROSION 3D. Our approach enables a detailed identification of runoff and sediment fluxes in agriculturally used landscapes. In a first step, documented historical events were analyzed with respect to the agreement between measured radar rainfall and large-scale erosion risk maps. A second step focused on small-scale erosion monitoring via UAV of the source areas of heavy flooding events and a model reconstruction of the processes involved. In all examples, damage was caused to local infrastructure. Both analyses are promising for detecting runoff- and sediment-delivering areas at high temporal and spatial resolution. The results confirm the important role of late-covering crops such as maize, sugar beet or potatoes in runoff generation. While, for example, winter wheat has a favourable effect on extensive runoff generation in undulating landscapes, massive soil loss and thus muddy flows are observed and reproduced in the model results. Future research aims at large-scale model parameterization and application in real time, uncertainty estimation of the precipitation forecast, and interface development.
Korostovtseva, Lyudmila S; Sviryaev, Yurii V; Zvartau, Nadezhda E; Konradi, Alexandra O; Kalinkin, Alexander L
2011-02-25
To assess the impact of obstructive sleep apnea-hypopnea syndrome (OSAHS) on prognosis and on cardiovascular morbidity and mortality in relation to other major cardiovascular risk factors. This prospective study recruited 234 patients from an out-patient clinic. Based on the Berlin questionnaire, 147 patients (90 males, mean age 52.1 ± 10.4 years) with highly suspected sleep breathing disorders were included in the study. Based on cardiorespiratory monitoring, patients were divided into two groups: 42 patients without sleep breathing disorders (SBD) and 105 patients with OSAHS. Among the latter, 12 patients started CPAP therapy and formed a third group. The mean follow-up period was 46.4 ± 14.3 months. Event-free survival was lowest in the untreated OSAHS patients (log-rank test 6.732, p = 0.035). In the non-adjusted regression model, OSAHS was also associated with a higher risk of cardiovascular events (OR = 8.557, 95% CI 1.142-64.131, p = 0.037). OSAHS patients demonstrated higher rates of hospitalization compared with the control group without SBD (OR = 2.750, 95% CI 1.100-6.873, p = 0.04). Hypertensive OSAHS patients and, in particular, according to our model, patients with severe OSAHS (AHI ≥ 30/h) are at higher risk of fatal and non-fatal cardiovascular events. Moreover, untreated OSAHS patients demonstrate higher rates of hospitalization caused by the onset or deterioration of cardiovascular disease.
Multi-hazard risk analysis for management strategies
NASA Astrophysics Data System (ADS)
Kappes, M.; Keiler, M.; Bell, R.; Glade, T.
2009-04-01
Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with a risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. However, these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, the modeling approaches, as well as the inconsistencies that arise when combining all these different aspects. Based on this concept, a flexible software package will be established, with ArcGIS as the central platform complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be newly developed; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.
Research and Evaluations of the Health Aspects of Disasters, Part IX: Risk-Reduction Framework.
Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Loretti, Alessandro
2016-06-01
A disaster is a failure of resilience to an event. Mitigating the risks that a hazard will progress into a destructive event, or increasing the resilience of a society-at-risk, requires careful analysis, planning, and execution. The Disaster Logic Model (DLM) is used to define the value (effects, costs, and outcome(s)), impacts, and benefits of interventions directed at risk reduction. A Risk-Reduction Framework, based on the DLM, details the processes involved in hazard mitigation and/or capacity-building interventions to augment the resilience of a community or to decrease the risk that a secondary event will develop. This Framework provides the structure to systematically undertake and evaluate risk-reduction interventions. It applies to all interventions aimed at hazard mitigation and/or increasing the absorbing, buffering, or response capacities of a community-at-risk for a primary or secondary event that could result in a disaster. The Framework utilizes the structure provided by the DLM and consists of 14 steps: (1) hazards and risks identification; (2) historical perspectives and predictions; (3) selection of hazard(s) to address; (4) selection of appropriate indicators; (5) identification of current resilience standards and benchmarks; (6) assessment of the current resilience status; (7) identification of resilience needs; (8) strategic planning; (9) selection of an appropriate intervention; (10) operational planning; (11) implementation; (12) assessments of outputs; (13) synthesis; and (14) feedback. Each of these steps is a transformation process that is described in detail. Emphasis is placed on the role of Coordination and Control during planning, implementation of risk-reduction/capacity building interventions, and evaluation. Birnbaum ML , Daily EK , O'Rourke AP , Loretti A . Research and evaluations of the health aspects of disasters, part IX: Risk-Reduction Framework. Prehosp Disaster Med. 2016;31(3):309-325.
Regional earthquake loss estimation in the Autonomous Province of Bolzano - South Tyrol (Italy)
NASA Astrophysics Data System (ADS)
Huttenlau, Matthias; Winter, Benjamin
2013-04-01
Besides storm events, geophysical events cause a majority of natural hazard losses on a global scale. However, in alpine regions with a moderate earthquake risk potential, such as the study area, and a correspondingly weak presence in the collective memory, this source of risk is often neglected in contrast to gravitational and hydrological hazard processes. In this context, the comparative analysis of potential disasters and emergencies at the national level in Switzerland (Katarisk study) has shown that earthquakes are the most serious source of risk in general. In order to estimate the potential losses of earthquake events for different return periods and the loss dimensions of extreme events, the following study was conducted in the Autonomous Province of Bolzano - South Tyrol (Italy). The applied methodology follows the generally accepted risk concept based on the risk components hazard, elements at risk and vulnerability, whereby risk is not defined holistically (direct, indirect, tangible and intangible) but with the risk category of losses to buildings and inventory as a general risk proxy. The hazard analysis is based on a regional macroseismic scenario approach. The settlement centre of each community (116 communities) is defined as a potential epicentre. For each epicentre, four epicentral scenarios (return periods of 98, 475, 975 and 2,475 years) are calculated based on the simple but proven and generally accepted attenuation law of Sponheuer (1960). The relevant input parameters for calculating the epicentral scenarios are (i) the macroseismic intensity and (ii) the focal depth. The macroseismic intensities are based on a probabilistic seismic hazard analysis (PSHA) of the Italian earthquake catalogue at the community level (Dipartimento della Protezione Civile). The relevant focal depths are taken as the mean, within a defined buffer, of the focal depths from the harmonized earthquake catalogues of Italy and Switzerland as well as earthquake data of the US Geological Survey (USGS). The asset database used to identify the elements at risk is developed from an address dataset, the land-use plan, official building footprints, building heights based on a normalized digital surface model, official construction costs for different building types (building gross cubatures), official statistical data on households at the community level, and insurance-data-based mean inventory values. To analyse the structural vulnerability, and consequently the potential structural losses, community-specific mean damage ratios based on the EMS-98 approach and the historic development of the building stock within the individual communities are estimated. Inventory losses are assumed to be 30 percent of the structural losses. Thus, for each epicentre a loss-frequency relationship can be calculated and the most severe epicentral scenarios can be identified.
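For illustration, macroseismic attenuation relations of the Sponheuer/Kövesligethy type express the decay of intensity with hypocentral distance from the epicentral intensity and the focal depth. The Python sketch below implements this commonly cited general form; the coefficients and the absorption value alpha are illustrative assumptions and not the calibrated values used in the study.

import math

def site_intensity(i0, epicentral_dist_km, focal_depth_km, alpha=0.001):
    """Attenuate the epicentral intensity i0 with hypocentral distance (sketch only)."""
    r = math.sqrt(epicentral_dist_km ** 2 + focal_depth_km ** 2)  # hypocentral distance
    return (i0
            - 3.0 * math.log10(r / focal_depth_km)
            - 3.0 * alpha * (r - focal_depth_km) * math.log10(math.e))

# Example: intensity felt 20 km from an epicentre with I0 = VIII and 8 km focal depth.
print(round(site_intensity(8.0, 20.0, 8.0), 2))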
Dietary flavonoid intake and cardiovascular risk: a population-based cohort study.
Ponzo, Valentina; Goitre, Ilaria; Fadda, Maurizio; Gambino, Roberto; De Francesco, Antonella; Soldati, Laura; Gentile, Luigi; Magistroni, Paola; Cassader, Maurizio; Bo, Simona
2015-07-08
The cardio-protective effects of flavonoids are still controversial; many studies have referred to the benefits of specific foods, such as soy, cocoa and tea. A population-based cohort of middle-aged adults, coming from a semi-rural area where the consumption of those foods is almost negligible, was studied. The primary objective was to establish whether flavonoid intake was inversely associated with the cardiovascular (CV) risk evaluated after a 12-year follow-up; the associations between flavonoid intake and CV incidence and mortality and all-cause mortality were also evaluated. In 2001-2003, a cohort of 1,658 individuals completed a validated food-frequency questionnaire. Anthropometric and laboratory measurements, medical history and vital status were collected at baseline and during 2014. The CV risk was estimated with the Framingham risk score. Individuals in the lowest tertile of flavonoid intake showed a worse metabolic pattern and less healthy lifestyle habits. The 2014 CV risk score and the increase in the risk score from baseline were significantly higher with the lowest intake of total flavonoids and of all subclasses except isoflavones, in a multiple regression model. During follow-up, 125 CV events and 220 deaths (84 of which were due to CV causes) occurred. CV non-fatal events were less frequent in individuals with higher flavonoid intake (HR = 0.64; 95% CI 0.42-1.00 and HR = 0.46; 95% CI 0.28-0.75 for the second and third tertiles, respectively) in Cox regression models, after multiple adjustments. All subclasses of flavonoids except flavones and isoflavones were inversely correlated with incident CV events, with HRs ranging from 0.42 (flavan-3-ols) to 0.56 (anthocyanidins). Being in the third tertile of flavan-3-ols (HR = 0.68; 95% CI 0.48-0.96), anthocyanidins (HR = 0.66; 95% CI 0.46-0.95) and flavanones (HR = 0.59; 95% CI 0.40-0.85) was inversely associated with all-cause mortality. Total and subclasses of flavonoids were not significantly associated with the risk of CV mortality. Flavonoid intake was inversely associated with CV risk, CV non-fatal events and all-cause mortality in a cohort with a low consumption of soy, tea and cocoa, which are typically viewed as the foods responsible for flavonoid-related benefits.
Baneshi, Mohammad Reza; Haghdoost, Ali Akbar; Zolala, Farzaneh; Nakhaee, Nouzar; Jalali, Maryam; Tabrizi, Reza; Akbari, Maryam
2017-04-01
This study aimed to assess, using tree-based models, the impact of different dimensions of religion and other risk factors on suicide attempts in the Islamic Republic of Iran. Three hundred patients who attempted suicide and 300 age- and sex-matched patient attendants with other types of disease, referred to Kerman Afzalipour Hospital, were recruited for this study by convenience sampling. Religiosity was assessed by the Duke University Religion Index. A tree-based model was constructed using the Gini index as the homogeneity criterion. A complementary discrimination analysis was also applied. The variables contributing to the construction of the tree were stressful life events, mental disorder, family support, and religious belief. Strong religious belief was a protective factor for those with a low number of stressful life events and those with a high mental disorder score; 72% of those who formed these two groups had not attempted suicide. Moreover, 63% of those with a high number of stressful life events, strong family support, strong problem-solving skills, and a low mental disorder score were less likely to attempt suicide. The significance of four other variables, GHQ, problem-coping skills, friend support, and neuroticism, was revealed in the discrimination analysis. Religious beliefs seem to be an independent factor that can predict the risk of suicidal behavior. Based on the decision tree, religious beliefs among people with a high number of stressful life events might not be a dissuading factor. Such subjects need more family support and problem-solving skills.
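The tree construction described above corresponds to a standard classification tree grown with the Gini impurity criterion. The following Python sketch, using scikit-learn, shows the general form of such a model; the variable names mirror those mentioned in the abstract, but the data values are invented for illustration and do not come from the study.

import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical predictors per subject; 1 = suicide attempt, 0 = control.
X = pd.DataFrame({
    "stressful_events": [1, 7, 2, 9, 0, 6],
    "mental_disorder":  [3, 12, 5, 14, 2, 10],
    "family_support":   [8, 2, 7, 3, 9, 4],
    "religiosity":      [22, 10, 25, 8, 27, 12],
})
y = [0, 1, 0, 1, 0, 1]

# Grow a shallow tree with Gini impurity as the split (homogeneity) criterion.
tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
tree.fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))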
How well does the Post-fire Erosion Risk Management Tool (ERMiT) really work?
NASA Astrophysics Data System (ADS)
Robichaud, Peter; Elliot, William; Lewis, Sarah; Miller, Mary Ellen
2016-04-01
The decision of where, when, and how to apply the most effective post-fire erosion mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) was developed to assist post-fire assessment teams in identifying high erosion risk areas and evaluating the effectiveness of various mitigation treatments to reduce that risk. ERMiT is a web-based application that uses the Water Erosion Prediction Project (WEPP) technology to estimate erosion, in probabilistic terms, on burned and recovering forest, range, and chaparral lands with and without the application of mitigation treatments. User inputs are processed by ERMiT to combine rain event variability with the spatial and temporal variability of hillslope burn severity and soil properties, which are then used as WEPP inputs. Since 2007, the model has been used in making hundreds of land management decisions in the US and elsewhere. We use eight published field study sites in the Western US to compare ERMiT predictions with observed hillslope erosion rates. Most sites experienced only a few rainfall events that produced runoff and sediment, except for a California site with a Mediterranean climate. When hillslope erosion occurred, the observed rates correlated significantly with those predicted by ERMiT. Significant correlations were also found for most mitigation treatments as well as for the five recovery years. These model validation results suggest reasonable estimates of probabilistic post-fire hillslope sediment delivery when compared with observations.
Fitch, Kathryn; Goldberg, Sara W.; Iwasaki, Kosuke; Pyenson, Bruce S.; Kuznik, Andreas; Solomon, Henry A.
2009-01-01
Objectives To model the financial and health outcomes impact of intensive statin therapy compared with usual care in a high-risk working-age population (actively employed, commercially insured health plan members and their adult dependents). The target population consists of working-age people who are considered at high risk for cardiovascular disease events because of a history of coronary heart disease. Study Design Three-year event forecast for a sample population generated from National Health and Nutrition Examination Survey data. Methods Using the Framingham risk scoring system, the probability of myocardial infarction or stroke events was calculated for a representative sample population, ages 35 to 69 years, of people at high risk for cardiovascular disease, with a history of coronary heart disease. The probability of events for each individual was used to project the number of events expected to be generated for this population. Reductions in cardiovascular and stroke events reported in clinical trials with aggressive statin therapy were applied to these cohorts. We used medical claims data to model the cohorts' event costs. All results are adjusted to reflect the demographics of a typical working-age population. Results The high-risk cohort (those with coronary heart disease) comprises 4% of the 35- to 69-year-old commercially insured population but generates 22% of the risk for coronary heart disease and stroke. Reduced event rates associated with intensive statin therapy yielded a $58 mean medical cost reduction per treated person per month; a typical payer cost for a 30-day supply of intensive statin therapy is approximately $57. Conclusions Aggressive low-density lipoprotein cholesterol–lowering therapy for working-age people at high risk for cardiovascular events and with a history of heart disease appears to have a significant potential to reduce the rate of clinical events and is cost-neutral for payers. PMID:25126293
A framework for quantifying net benefits of alternative prognostic models
Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G
2012-01-01
New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21905066
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption relates to the likelihood values of the input events, and the second concerns the interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, probability distributions of the input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework for a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
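As a simple illustration of propagating imprecise likelihoods, the sketch below pushes probability intervals (for example, an alpha-cut of a triangular fuzzy number) through AND and OR gates under an independence assumption; the dependency-coefficient treatment described in the article is not reproduced, and the basic-event intervals are hypothetical.

def and_gate(p, q):
    """Probability interval of the intersection of two independent events."""
    return (p[0] * q[0], p[1] * q[1])

def or_gate(p, q):
    """Probability interval of the union of two independent events."""
    return (1 - (1 - p[0]) * (1 - q[0]), 1 - (1 - p[1]) * (1 - q[1]))

# Hypothetical basic-event likelihood intervals (lower, upper).
pump_fails = (0.010, 0.030)
valve_fails = (0.005, 0.015)

# Either failure triggers the top event in this toy fault tree.
top_event = or_gate(pump_fails, valve_fails)
both_fail = and_gate(pump_fails, valve_fails)
print(top_event, both_fail)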
Non-stationary background intensity and Caribbean seismic events
NASA Astrophysics Data System (ADS)
Valmy, Larissa; Vaillant, Jean
2014-05-01
We consider seismic risk calculation based on models with non-stationary background intensity. The aim is to improve predictive strategies in the framework of seismic risk assessment using models that best describe the seismic activity in the Caribbean arc. Appropriate statistical methods are required for analyzing the volumes of data collected. The focus is on calculating earthquake occurrence probabilities and analyzing the spatiotemporal evolution of these probabilities. The main modeling tool is point process theory, which allows the past history prior to a given date to be taken into account. Thus, the conditional intensity of seismic events is expressed by means of the background intensity and the self-exciting component. This intensity can be interpreted as the expected event rate per unit time and/or surface area. The most popular intensity model in seismology is the ETAS (Epidemic Type Aftershock Sequence) model, introduced and then generalized by Ogata [2, 3]. We extended this model and performed a comparison of different probability density functions for the triggered event times [4]. We illustrate our model by considering the CDSA (Centre de Données Sismiques des Antilles) catalog [1], which contains more than 7,000 seismic events that occurred in the Lesser Antilles arc. Statistical tools for testing the stationarity of the background intensity and for dynamical segmentation are presented. [1] Bengoubou-Valérius M., Bazin S., Bertil D., Beauducel F. and Bosson A. (2008). CDSA: a new seismological data center for the French Lesser Antilles, Seismol. Res. Lett., 79 (1), 90-102. [2] Ogata Y. (1998). Space-time point-process models for earthquake occurrences, Annals of the Institute of Statistical Mathematics, 50 (2), 379-402. [3] Ogata Y. (2011). Significant improvements of the space-time ETAS model for forecasting of accurate baseline seismicity, Earth, Planets and Space, 63 (3), 217-229. [4] Valmy L. and Vaillant J. (2013). Statistical models in seismology: Lesser Antilles arc case, Bull. Soc. géol. France, 184 (1), 61-67.
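In its purely temporal form, the ETAS conditional intensity is lambda(t) = mu(t) + sum over past events of K * exp(alpha * (m_i - m0)) / (t - t_i + c)^p, where mu(t) is the background rate and the sum is the self-exciting component. The Python sketch below evaluates this expression with a non-stationary background term; the parameter values and the toy catalog are illustrative assumptions, not the values fitted to the Caribbean data.

import math

def etas_intensity(t, catalog, mu, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Temporal ETAS rate at time t; catalog is a list of (event_time, magnitude)."""
    triggered = sum(
        K * math.exp(alpha * (m - m0)) / (t - ti + c) ** p
        for ti, m in catalog if ti < t
    )
    return mu(t) + triggered

# A slowly varying (non-stationary) background intensity, events per day.
background = lambda t: 0.05 * (1.0 + 0.3 * math.sin(2 * math.pi * t / 365.0))
events = [(10.0, 4.2), (11.5, 3.6), (40.0, 5.1)]  # (time in days, magnitude)
print(etas_intensity(41.0, events, background))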
Developments in remote sensing technology enable more detailed urban flood risk analysis.
NASA Astrophysics Data System (ADS)
Denniss, A.; Tewkesbury, A.
2009-04-01
Spaceborne remote sensors have been allowing us to build up a profile of planet Earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher resolution imagery, and the ability to derive better elevation information. The combination of these geospatial data to create land cover and land use maps helps inform catastrophe modelling systems. From 30 m resolution Landsat to 2.44 m QuickBird multispectral imagery, and 1 m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly be joined by a twin satellite enabling the creation of elevation data, we are spoilt for choice in available data. However, just what is cost-effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn and over 58,000 homes and businesses were affected. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload localised drainage infrastructure, causing widespread flooding of properties and infrastructure. Creating a risk model that is able to simulate such an event requires much more accurate source data than can be provided by satellite or radar. As these flood events cause considerable damage within relatively small, complex urban environments, new high-resolution remote sensing techniques have to be applied to better model them. Detailed terrain data for England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest digital airborne sensors, both optical and lidar, to produce the input layer for surface water flood modelling. A national flood map product has been created. The new product utilises sophisticated modelling techniques, perfected over many years, which harness graphical processing power. This product will prove particularly valuable for risk assessment decision support within insurance/reinsurance, property/environmental, utilities, risk management and government agencies. However, it is not just the ground elevation that determines the behaviour of surface water. By combining height information (surface and terrain) with high resolution aerial photography and colour infrared imagery, a high definition land cover mapping dataset (LandBase) is being produced, which provides a precise measure of sealed versus non-sealed surface. This will allow even more sophisticated modelling of flood scenarios. Thus, the value of airborne survey data can be demonstrated by flood risk analysis down to individual addresses in urban areas. However, for some risks an even more detailed survey may be justified. In order to achieve this, Infoterra is testing new 360˚ mobile lidar technology. Collecting lidar data from a moving vehicle allows each street to be mapped in very high detail, allowing precise information about the location, size and shape of features such as kerbstones, gullies, road camber and building threshold level to be captured quickly and accurately.
These data can then be used to model the problem of overland flood risk at the scale of individual properties. Whilst at present it might be impractical to undertake such detailed modelling for all properties, these techniques can certainly be used to improve the flood risk analysis of key locations. This paper will demonstrate how these new high resolution remote sensing techniques can be combined to provide a new resolution of detail to aid urban flood modelling.
NASA Astrophysics Data System (ADS)
Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.
2016-11-01
Extreme rainfall events are the main triggering causes of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and the development of inventory maps, the definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared with the historical damage reports.
Cognitive complexity of the medical record is a risk factor for major adverse events.
Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot
2014-01-01
Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
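As a purely hypothetical illustration of measuring a record's data volume in bits, the sketch below serializes a toy record, counts its bits and compares the result against a placeholder cutoff; neither the record structure nor the threshold reflects the Complexity Ruler's actual implementation.

import json

def record_bits(record: dict) -> int:
    """Approximate the information volume of a record as bits of serialized text."""
    return len(json.dumps(record, ensure_ascii=False).encode("utf-8")) * 8

# Toy 24-hour slice of a medical record (invented content).
record_24h = {"notes": ["post-op day 1, stable"],
              "labs": {"Hgb": 10.2, "K": 4.1},
              "meds": ["morphine", "cefazolin"]}

HIGH_COMPLEXITY_CUTOFF_BITS = 5_000  # placeholder threshold, not the empirical cutoff
flagged = record_bits(record_24h) > HIGH_COMPLEXITY_CUTOFF_BITS
print(record_bits(record_24h), flagged)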
A descriptive model of preventability in maternal morbidity and mortality.
Geller, S E; Cox, S M; Kilpatrick, S J
2006-02-01
To develop a descriptive model of preventability for maternal morbidity and mortality that can be used in quality assurance and in morbidity and mortality review processes. This descriptive study was part of a larger case-control study conducted at the University of Illinois at Chicago in which maternal deaths were cases and women with severe maternal morbidity served as controls. Morbidities and mortalities were classified by a team of clinicians as preventable or not preventable. A qualitative analysis of the data was conducted to identify and categorize different types of preventable events. Of 237 women, 79 had preventable events attributable to provider or system factors. The most common types of preventable events were inadequate diagnosis/recognition of high risk (54.4%), treatment (38.0%), and documentation (30.7%). A descriptive model was illustrated that can be used to categorize preventable events in maternal morbidity and mortality and can be incorporated into quality assurance and clinical case review to enhance the monitoring of hospital-based obstetric care and to decrease medical error.
Investigating accident causation through information network modelling.
Griffin, T G C; Young, M S; Stanton, N A
2010-02-01
Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for finding agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system is dependent on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of a holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of the findings is discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction.
Trending in Pc Measurements via a Bayesian Zero-Inflated Mixed Model
NASA Technical Reports Server (NTRS)
Vallejo, Jonathon; Hejduk, Matthew; Stamey, James
2015-01-01
When two satellites are predicted to come within close proximity of one another, usually a high-value satellite and a piece of space debris, moving the active satellite is a means of reducing collision risk, but it also reduces satellite lifetime, perturbs the satellite mission, and introduces its own risks. It is therefore important to obtain a good statement of the risk of collision in order to determine whether a maneuver is truly necessary. Two aspects of this are considered: the calculation of the probability of collision (Pc) based on the most recent set of position, velocity and uncertainty data for both satellites, and the examination of the changes in the Pc value as the event develops. Events should follow a canonical development of Pc versus time to closest approach (TCA), and it is helpful to be able to judge where the present data point fits in that canonical development in order to guide the operational response.
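In the standard short-encounter formulation, Pc is the integral of the two-dimensional Gaussian distribution of the relative miss distance, taken in the encounter plane, over a disc whose radius is the combined hard-body radius. The Python sketch below evaluates that integral by simple numerical quadrature; the miss distance, covariance and radius are illustrative values, not data from a real conjunction, and the Bayesian zero-inflated mixed model named in the title is not reproduced here.

import numpy as np

def collision_probability(miss, cov, hard_body_radius, n=400):
    """Integrate a 2-D Gaussian (mean = miss vector) over the hard-body disc."""
    inv = np.linalg.inv(cov)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    xs = np.linspace(-hard_body_radius, hard_body_radius, n)
    dx = xs[1] - xs[0]
    xx, yy = np.meshgrid(xs, xs)
    inside = xx ** 2 + yy ** 2 <= hard_body_radius ** 2  # points within the disc
    dxv, dyv = xx - miss[0], yy - miss[1]
    quad = inv[0, 0] * dxv ** 2 + 2 * inv[0, 1] * dxv * dyv + inv[1, 1] * dyv ** 2
    pdf = norm * np.exp(-0.5 * quad)
    return float((pdf * inside).sum() * dx * dx)

# Example: ~117 m miss distance, anisotropic uncertainty, 20 m combined radius.
print(collision_probability(np.array([100.0, 60.0]),
                            np.array([[200.0 ** 2, 0.0], [0.0, 80.0 ** 2]]), 20.0))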
Evaluation of a model of violence risk assessment among forensic psychiatric patients.
Douglas, Kevin S; Ogloff, James R P; Hart, Stephen D
2003-10-01
This study tested the interrater reliability and criterion-related validity of structured violence risk judgments made by using one application of the structured professional judgment model of violence risk assessment, the HCR-20 violence risk assessment scheme, which assesses 20 key risk factors in three domains: historical, clinical, and risk management. The HCR-20 was completed for a sample of 100 forensic psychiatric patients who had been found not guilty by reason of a mental disorder and were subsequently released to the community. Violence in the community was determined from multiple file-based sources. Interrater reliability of structured final risk judgments of low, moderate, or high violence risk made on the basis of the structured professional judgment model was acceptable (weighted kappa=.61). Structured final risk judgments were significantly predictive of postrelease community violence, yielding moderate to large effect sizes. Event history analyses showed that final risk judgments made with the structured professional judgment model added incremental validity to the HCR-20 used in an actuarial (numerical) sense. The findings support the structured professional judgment model of risk assessment as well as the HCR-20 specifically and suggest that clinical judgment, if made within a structured context, can contribute in meaningful ways to the assessment of violence risk.
den Ruijter, H M; Peters, S A E; Groenewegen, K A; Anderson, T J; Britton, A R; Dekker, J M; Engström, G; Eijkemans, M J; Evans, G W; de Graaf, J; Grobbee, D E; Hedblad, B; Hofman, A; Holewijn, S; Ikeda, A; Kavousi, M; Kitagawa, K; Kitamura, A; Koffijberg, H; Ikram, M A; Lonn, E M; Lorenz, M W; Mathiesen, E B; Nijpels, G; Okazaki, S; O'Leary, D H; Polak, J F; Price, J F; Robertson, C; Rembold, C M; Rosvall, M; Rundek, T; Salonen, J T; Sitzer, M; Stehouwer, C D A; Witteman, J C; Moons, K G; Bots, M L
2013-07-01
The aim of this work was to investigate whether measurement of the mean common carotid intima-media thickness (CIMT) improves cardiovascular risk prediction in individuals with diabetes. We performed a subanalysis among 4,220 individuals with diabetes in a large ongoing individual participant data meta-analysis involving 56,194 subjects from 17 population-based cohorts worldwide. We first refitted the risk factors of the Framingham heart risk score on the individuals without previous cardiovascular disease (baseline model) and then expanded this model with the mean common CIMT (CIMT model). The absolute 10 year risk for developing a myocardial infarction or stroke was estimated from both models. In individuals with diabetes we compared discrimination and calibration of the two models. Reclassification of individuals with diabetes was based on allocation to another cardiovascular risk category when mean common CIMT was added. During a median follow-up of 8.7 years, 684 first-time cardiovascular events occurred among the population with diabetes. The C statistic was 0.67 for the Framingham model and 0.68 for the CIMT model. The absolute 10 year risk for developing a myocardial infarction or stroke was 16% in both models. There was no net reclassification improvement with the addition of mean common CIMT (1.7%; 95% CI -1.8, 3.8). There were no differences in the results between men and women. There is no improvement in risk prediction in individuals with diabetes when measurement of the mean common CIMT is added to the Framingham risk score. Therefore, this measurement is not recommended for improving individual cardiovascular risk stratification in individuals with diabetes.
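The net reclassification improvement referred to above compares how an added marker moves individuals between predefined risk categories, separately for those who do and do not experience an event. A minimal Python sketch of the categorical NRI is given below; the risk category cut-points and the toy arrays are illustrative assumptions, not values from the meta-analysis.

import numpy as np

def categorize(risk, cuts=(0.10, 0.20)):
    """Map absolute 10-year risks to low/intermediate/high categories (0, 1, 2)."""
    return np.digitize(risk, cuts)

def nri(risk_old, risk_new, event):
    """Categorical NRI = NRI(events) + NRI(non-events)."""
    old, new = categorize(np.asarray(risk_old)), categorize(np.asarray(risk_new))
    event = np.asarray(event).astype(bool)
    up, down = new > old, new < old
    nri_events = up[event].mean() - down[event].mean()
    nri_nonevents = down[~event].mean() - up[~event].mean()
    return nri_events + nri_nonevents

# Toy example: baseline-model risks, risks after adding a new marker, outcomes.
print(nri([0.08, 0.15, 0.25, 0.12], [0.12, 0.14, 0.22, 0.09], [1, 0, 1, 0]))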
Schechter, Clyde B; Near, Aimee M; Jayasekera, Jinani; Chandler, Young; Mandelblatt, Jeanne S
2018-04-01
The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. The model is a discrete-event microsimulation of single-life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on the dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++) and more efficient algorithms, along with hardware advances, have increased program efficiency, permitting simulations of large samples. The model results consistently match key temporal trends in US breast cancer incidence and mortality. The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.
Pollock, Benjamin D; Hu, Tian; Chen, Wei; Harville, Emily W; Li, Shengxu; Webber, Larry S; Fonseca, Vivian; Bazzano, Lydia A
2017-01-01
To evaluate several adult diabetes risk calculation tools for predicting the development of incident diabetes and pre-diabetes in a bi-racial, young adult population. Surveys beginning in young adulthood (baseline age ≥18) and continuing across multiple decades for 2,122 participants of the Bogalusa Heart Study were used to test the associations of five well-known adult diabetes risk scores with incident diabetes and pre-diabetes, using separate Cox models for each risk score. Racial differences were tested within each model. Predictive utility and discrimination were determined for each risk score using the Net Reclassification Index (NRI) and Harrell's c-statistic. All risk scores were strongly associated (p<.0001) with incident diabetes and pre-diabetes. The Wilson model indicated a greater risk of diabetes for blacks versus whites with equivalent risk scores (HR=1.59; 95% CI 1.11-2.28; p=.01). C-statistics for the diabetes risk models ranged from 0.79 to 0.83. Non-event NRIs indicated high specificity (non-event NRIs: 76%-88%), but poor sensitivity (event NRIs: -23% to -3%). Five diabetes risk scores established in middle-aged, racially homogeneous adult populations are generally applicable to younger adults with good specificity but poor sensitivity. The addition of race to these models did not result in greater predictive capabilities. A more sensitive risk score to predict diabetes in younger adults is needed. Copyright © 2017 Elsevier Inc. All rights reserved.
Predicting geomorphically-induced flood risk for the Nepalese Terai communities
NASA Astrophysics Data System (ADS)
Dingle, Elizabeth; Creed, Maggie; Attal, Mikael; Sinclair, Hugh; Mudd, Simon; Borthwick, Alistair; Dugar, Sumit; Brown, Sarah
2017-04-01
Rivers sourced from the Himalaya irrigate the Indo-Gangetic Plain via major river networks that support 10% of the global population. However, many of these rivers are also the source of devastating floods. During the 2014 Karnali River floods in west Nepal, the Karnali rose to around 16 m at Chisapani (where it enters the Indo-Gangetic Plain), 1 m higher than the previous record in 1983; the return interval for this event was estimated to be 1,000 years. Flood risk may currently be underestimated in this region, primarily because changes to the channel bed are not included when identifying areas at risk of flooding from events of varying recurrence intervals. Our observations in the field, corroborated by satellite imagery, show that river beds are highly mobile and constantly evolve through each monsoon. Increased bed levels due to sediment aggradation decrease the capacity of the river, significantly increasing the risk of devastating flood events; we refer to these as 'geomorphically-induced floods'. Major, short-lived episodes of sediment accumulation in channels are caused by stochastic variability in the sediment flux generated by storms, earthquakes and glacial outburst floods in upstream parts of the catchment. Here, we generate a field-calibrated, geomorphic flood risk model for varying upstream scenarios, and predict changing flood risk for the Karnali River. A numerical model is used to carry out a sensitivity analysis of changes in channel geometry (particularly aggradation or degradation) based on realistic flood scenarios. In these scenarios, water and sediment discharge are varied within a range of plausible values, up to the extreme sediment and water fluxes caused by widespread landsliding and/or intense monsoon precipitation based on existing records. The results of this sensitivity analysis will be used to inform flood hazard maps of the Karnali River floodplain and assess the vulnerability of the populations in the region.
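The mechanism described above, in which aggradation raises the bed and lowers the discharge the channel can convey before overtopping, can be illustrated with Manning's equation for a simplified rectangular cross-section; the geometry, roughness and slope values in the sketch below are illustrative assumptions, not calibrated Karnali River parameters.

import math

def bankfull_capacity(width_m, bank_height_m, aggradation_m, slope, n_manning=0.035):
    """Discharge (m3/s) that fills the channel to the bank top after aggradation."""
    depth = max(bank_height_m - aggradation_m, 0.0)  # remaining flow depth
    area = width_m * depth
    if area == 0.0:
        return 0.0
    wetted_perimeter = width_m + 2.0 * depth
    hydraulic_radius = area / wetted_perimeter
    return area * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope) / n_manning

# Capacity loss for 0-2 m of aggradation in a 500 m wide, 8 m deep reach, slope 0.0005.
for dz in (0.0, 0.5, 1.0, 2.0):
    print(dz, round(bankfull_capacity(500.0, 8.0, dz, 0.0005)))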
Schnell-Inderst, Petra; Schwarzer, Ruth; Göhler, Alexander; Grandi, Norma; Grabein, Kristin; Stollenwerk, Björn; Klauß, Volker; Wasem, Jürgen; Siebert, Uwe
2009-05-12
In a substantial proportion of patients (about 25%) with coronary heart disease (CHD), a myocardial infarction or sudden cardiac death without prior symptoms is the first manifestation of disease. The use of new risk predictors for CHD, such as high-sensitivity C-reactive protein (hs-CRP), in addition to established risk factors could improve the prediction of CHD. As a consequence of the altered risk assessment, modified preventive actions could reduce the number of cardiac deaths and non-fatal myocardial infarctions. Does the additional information gained through the measurement of hs-CRP in asymptomatic patients lead to a clinically relevant improvement in risk prediction compared with risk prediction based on traditional risk factors, and is this cost-effective? A literature search of the electronic databases of the German Institute of Medical Documentation and Information (DIMDI) was conducted. Selection, data extraction, assessment of study quality and synthesis of information were conducted according to the methods of evidence-based medicine. Eight publications on predictive value, one publication on clinical efficacy and three health-economic evaluations were included. In the seven study populations of the prediction studies, elevated CRP levels were almost always associated with a higher risk of cardiovascular events, non-fatal myocardial infarction or cardiac death, and severe cardiovascular events. The effect estimates (odds ratio (OR), relative risk (RR), hazard ratio (HR)), once adjusted for traditional risk factors, demonstrated a moderate, independent association between hs-CRP and cardiac and cardiovascular events that fell in the range of 0.7 to 2.47. In six of the seven studies, a moderate increase in the area under the curve (AUC) could be detected by adding hs-CRP as a predictor to regression models in addition to established risk factors, though in three cases this was not statistically significant. The difference in the AUC between the models with and without hs-CRP fell between 0.00 and 0.023, with a median of 0.003. A decision-analytic modeling study reported a gain in life expectancy with statin therapy for populations with elevated hs-CRP levels and normal lipid levels compared with statin therapy for those with elevated lipid levels (approximately 6.6 months gain in life expectancy for 58-year-olds). Two decision-analytic models (three publications) on cost-effectiveness reported incremental cost-effectiveness ratios between Euro 8,700 and 50,000 per life year gained for the German context and between 52,000 and 708,000 for the US context. The empirical input data for the models are highly uncertain. No sufficient evidence is available to support the notion that hs-CRP values should be measured during the global risk assessment for CAD or cardiovascular disease in addition to the traditional risk factors. The additional measurement of the hs-CRP level increases the incremental predictive value of the risk prediction. It has not yet been clarified whether this increase is clinically relevant, resulting in a reduction of cardiovascular morbidity and mortality. For people with medium cardiovascular risk (5 to 20% in ten years), the additional measurement of hs-CRP seems most likely to be clinically relevant in supporting the decision as to whether or not additional statin therapy should be initiated for primary prevention. Statin therapy can reduce the occurrence of cardiovascular events in asymptomatic individuals with normal lipid and elevated hs-CRP levels.
However, this is not enough to provide evidence for a clinical benefit of hs-CRP-screening. The cost-effectiveness of general hs-CRP-screening as well as screening among only those with normal lipid levels remains unknown at present.
Saigas on the brink: Multidisciplinary analysis of the factors influencing mass mortality events
Kock, Richard A.; Orynbayev, Mukhit; Robinson, Sarah; Zuther, Steffen; Singh, Navinder J.; Beauvais, Wendy; Morgan, Eric R.; Kerimbayev, Aslan; Khomenko, Sergei; Martineau, Henny M.; Rystaeva, Rashida; Omarova, Zamira; Wolfs, Sara; Hawotte, Florent; Radoux, Julien; Milner-Gulland, Eleanor J.
2018-01-01
In 2015, more than 200,000 saiga antelopes died in 3 weeks in central Kazakhstan. The proximate cause of death is confirmed as hemorrhagic septicemia caused by the bacterium Pasteurella multocida type B, based on multiple strands of evidence. Statistical modeling suggests that there was unusually high relative humidity and temperature in the days leading up to the mortality event; temperature and humidity anomalies were also observed in two previous similar events in the same region. The modeled influence of environmental covariates is consistent with known drivers of hemorrhagic septicemia. Given the saiga population’s vulnerability to mass mortality and the likely exacerbation of climate-related and environmental stressors in the future, management of risks to population viability such as poaching and viral livestock disease is urgently needed, as well as robust ongoing veterinary surveillance. A multidisciplinary approach is needed to research mass mortality events under rapid environmental change. PMID:29376120
An experimental system for flood risk forecasting at global scale
NASA Astrophysics Data System (ADS)
Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.
2016-12-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real-time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps, based on the predicted flow magnitude, the forecast lead time, and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructure and cities. To further increase the reliability of the proposed methodology, we integrated the model-based estimations with an innovative methodology for social media monitoring, which allows real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
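Conceptually, the impact assessment step combines a hazard layer (forecast inundation) with exposure and vulnerability layers on a common grid. The sketch below shows this combination for a toy 2x2 grid with a hypothetical depth-damage function; it is illustrative only and is not the GloFAS implementation.

import numpy as np

# Forecast hazard intensity (water depth in metres) and exposed asset value per cell.
flood_depth_m = np.array([[0.0, 0.4], [1.2, 2.5]])
exposed_value = np.array([[0.0, 3.0e6], [5.0e6, 1.0e6]])

def vulnerability(depth):
    """Toy depth-damage curve: fraction of value lost as a function of water depth."""
    return np.clip(depth / 3.0, 0.0, 1.0)

# Expected damage per cell and total impact for the forecast event.
expected_damage = exposed_value * vulnerability(flood_depth_m)
print(expected_damage)
print(expected_damage.sum())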
NASA Astrophysics Data System (ADS)
Huttenlau, Matthias; Stötter, Johann
2010-05-01
Reinsurance companies are reporting a strong increase in natural-hazard-related losses, both insured and economic, over recent decades on a global scale. This ongoing trend can be described as a product of the dynamics in both the natural sphere and the anthroposphere. To analyze the potential impact of natural hazard processes on a given insurance portfolio or on society in general, reinsurance companies and risk management consultants have developed loss models. However, those models generally do not fit the scale-dependent demands of regional analyses, such as (i) analyses at the scale of a specific province or (ii) portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, and therefore scientific evaluations of the methodological concepts are not possible (black box). This is contrary to the scientific principles of transparency and traceability. Especially in mountain regions like the European Alps, analyses must adequately consider (i) the specific characteristics at small scales, (ii) the generally high process dynamics, (iii) the occurrence of gravitational mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) the small proportion of permanent settlement area within the overall area, (v) the high concentration of values on the valley floors, (vi) the exposure of important infrastructures and lifelines, and other factors. Risk-based analyses therefore estimate, in a methodical and standardized way, the potential consequences of hazard processes for the built environment using the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress have been made in the field of hazard analyses, whereas the other two components are not developed to the same degree. Since these three components enter the risk concept as influencing factors without any weighting, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform especially decision makers in the insurance industry, the administration, and politicians about potential consequences and are the basis for appropriate risk management strategies. Results based on (i) an annual or probabilistic understanding of risk have to be distinguished from (ii) scenario-based analyses. The first are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied to extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focus on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss modelling approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components.
Besides the analysis introduced above, an additional portfolio analysis of a regional insurance company was carried out. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were combined with additional GIS and statistical data into a comprehensive property-by-property geodatabase of the existing elements and values. This stock of elements and values geodatabase is, furthermore, the consistent basis for all natural hazard analyses and enables the comparison of the results. The study follows the generally accepted modules of (i) hazard analysis, (ii) exposure analysis, and (iii) consequence analysis, where the exposure analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential for negative consequences on a regional scale. In this context, (i) floods, (ii) rockslides with the potential for corresponding consequence effects (backwater ponding and outburst floods), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps), concrete scenarios and the spatially affected areas were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and their implications for the results. Thus, no absolute loss values but probable loss ranges were estimated. It can be shown that the most serious losses would arise from extreme earthquake events, with loss burdens of more than €7 bn on buildings and inventory alone. Possible extreme flood events could lead to losses between €2 and 2.5 bn, whereas a severe hail swath affecting the central Inn valley could result in losses of about €455 million (of which €285 million on vehicles). The potentially most serious rockslide with additional consequence effects would result in losses of up to about €185 million, and extreme winter storms could induce losses between €100 million and €150 million.
Cardiovascular disease in live related renal transplantation.
Kaul, A; Sharm, R K; Gupta, A; Sinha, N; Singh, U
2011-11-01
Cardiovascular disease has become the leading cause of morbidity and mortality in renal transplant recipients, although its pathogenesis and treatment are poorly understood. Modifiable cardiovascular risk factors and graft dysfunction both play an important role in the development of post-transplant cardiovascular events. The prevalence of cardiovascular disease was studied in stable kidney transplant patients on cyclosporine-based triple immunosuppression in relation to various risk factors and post-transplant cardiovascular events. We analyzed 562 post-transplant patients with stable graft function for 6 months; patients were evaluated for cardiovascular events in the post-transplant period. Pre- and post-transplant risk factors were analyzed using the Cox proportional hazards model. 174 patients had undergone pre-transplant coronary angiography, and 15 of these patients underwent coronary revascularization (angioplasty in 12, CABG in 3). The prevalence of CAD was 7.2% in transplant recipients. Of 42 patients with CAD, 31 (73.8%) had a cardiovascular event in the post-transplant period. Age ≥ 40 years, male sex, graft dysfunction, diabetes as primary renal disease, pre-transplant cardiovascular event, and chronic rejection showed significant correlations in univariate analysis, and on multivariate analysis there were significant associations between post-transplant cardiovascular events and age ≥ 40 years (OR = 2.16, 95% CI 0.977-4.78), serum creatinine ≥ 1.4 mg% (OR = 2.40, 95% CI 1.20-4.82), diabetes as primary disease (OR = 3.67, 95% CI 3.2-14.82), PTDM (OR = 3.67, 95% CI 1.45-9.40), and pre-transplant cardiovascular disease (OR = 4.14, 95% CI 0.38-13.15). There was poor patient and graft survival among those who suffered a post-transplant cardiovascular event. The incidence of cardiovascular disease continues to be high after renal transplantation, and modifiable risk factors should be identified to prevent the occurrence of events in the post-transplant period.
NASA Astrophysics Data System (ADS)
Wahl, Thomas; Jensen, Jürgen; Mudersbach, Christoph
2010-05-01
Storm surges along the German North Sea coastline led to major damages in the past and the risk of inundation is expected to increase in the course of ongoing climate change. Knowledge of the characteristics of possible storm surges is essential for the performance of integrated risk analyses, e.g. based on the source-pathway-receptor concept. The latter includes the storm surge simulation/analyses (source), modelling of dike/dune breach scenarios (pathway) and the quantification of potential losses (receptor). In subproject 1b of the German joint research project XtremRisK (www.xtremrisk.de), a stochastic storm surge generator for the south-eastern North Sea area is developed. The input data for the multivariate model are high-resolution sea level observations from tide gauges during extreme events. Based on 25 parameters (19 sea level parameters and 6 time parameters), observed storm surge hydrographs consisting of three tides are parameterised. After fitting common parametric probability distributions and running a large number of Monte Carlo simulations, the final reconstruction yields a set of 100,000 (default) synthetic storm surge events with a one-minute resolution. Such a data set can potentially serve as the basis for a large number of applications. For risk analyses, storm surges with peak water levels exceeding the design water levels are of special interest. The occurrence probabilities of the simulated extreme events are estimated based on multivariate statistics, considering the parameters "peak water level" and "fullness/intensity". In the past, most studies considered only the peak water levels during extreme events, which might not be the most important parameter in all cases. Here, a 2D Archimedean copula model is used for the estimation of the joint probabilities of the selected parameters, accounting for the dependence structure separately from the marginal distributions. In coordination with subproject 1a, the results will be used as the input for the XtremRisK subprojects 2 to 4. The project is funded by the German Federal Ministry of Education and Research (BMBF) (Project No. 03 F 0483 B).
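To make the copula step concrete, the following minimal Python sketch evaluates the joint exceedance probability of the two surge parameters with a Gumbel copula, one common Archimedean family; the dependence parameter theta and the marginal non-exceedance probabilities are illustrative values, not results from the XtremRisK project.

```python
import numpy as np

def gumbel_copula_cdf(u, v, theta):
    """Gumbel (Archimedean) copula CDF C(u, v) for theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance(u, v, theta):
    """P(U > u, V > v) = 1 - u - v + C(u, v) for a bivariate copula."""
    return 1.0 - u - v + gumbel_copula_cdf(u, v, theta)

# Illustrative values: marginal non-exceedance probabilities of the
# "peak water level" and "fullness/intensity" of one synthetic surge,
# with a hypothetical dependence parameter theta.
u_peak, v_fullness, theta = 0.995, 0.98, 2.5
p_joint = joint_exceedance(u_peak, v_fullness, theta)
print(f"Joint exceedance probability: {p_joint:.2e}")
```

Scaling the reciprocal of this probability by the mean number of surge events per year then gives an approximate joint return period for the combined event.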
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.
2017-08-25
The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.
Stehle, Sebastian; Dabrowski, James Michael; Bangert, Uli; Schulz, Ralf
2016-03-01
Regulatory risk assessment considers vegetated buffer strips as effective risk mitigation measures for the reduction of runoff-related pesticide exposure of surface waters. However, apart from buffer strip widths, further characteristics such as vegetation density or the presence of erosion rills are generally neglected in the determination of buffer strip mitigation efficacies. This study conducted a field survey of fruit orchards (average slope 3.1-12.2%) of the Lourens River catchment, South Africa, which specifically focused on the characteristics and attributes of buffer strips separating orchard areas from tributary streams. In addition, in-stream and erosion rill water samples were collected during three runoff events and GIS-based modeling was employed to predict losses of pesticides associated with runoff. The results show that erosion rills are common in buffer strips (on average 13 to 24 m wide) of the tributaries (up to 6.5 erosion rills per km flow length) and that erosion rills represent concentrated entry pathways of pesticide runoff into the tributaries during rainfall events. Exposure modeling shows that measured pesticide surface water concentrations correlated significantly (R² = 0.626; p < 0.001) with runoff losses predicted by the modeling approach in which buffer strip width was set to zero at sites with erosion rills; in contrast, no relationship between predicted runoff losses and in-stream pesticide concentrations was detected in the modeling approach that neglected erosion rills and thus assumed efficient buffer strips. Overall, the results of our study show that erosion rills may substantially reduce buffer strip pesticide retention efficacies during runoff events and suggest that the capability of buffer strips as a risk mitigation tool for runoff is largely overestimated in current regulatory risk assessment procedures conducted for pesticide authorization. Copyright © 2015 Elsevier B.V. All rights reserved.
Failure of fertility therapy and subsequent adverse cardiovascular events
Udell, Jacob A.; Lu, Hong; Redelmeier, Donald A.
2017-01-01
BACKGROUND: Infertility may indicate an underlying predisposition toward premature cardiovascular disease, yet little is known about potential long-term cardiovascular events following fertility therapy. We investigated whether failure of fertility therapy is associated with subsequent adverse cardiovascular events. METHODS: We performed a population-based cohort analysis of women who received gonadotropin-based fertility therapy between Apr. 1, 1993, and Mar. 31, 2011, distinguishing those who subsequently gave birth and those who did not. Using multivariable Poisson regression models, we estimated the relative rate ratio of adverse cardiovascular events associated with fertility therapy failure, accounting for age, year, baseline risk factors, health care history and number of fertility cycles. The primary outcome was subsequent treatment for nonfatal coronary ischemia, stroke, transient ischemic attack, heart failure or thromboembolism. RESULTS: Of 28 442 women who received fertility therapy, 9349 (32.9%) subsequently gave birth and 19 093 (67.1%) did not. The median number of fertility treatments was 3 (interquartile range 1–5). We identified 2686 cardiovascular events over a median 8.4 years of follow-up. The annual rate of cardiovascular events was 19% higher among women who did not give birth after fertility therapy than among those who did (1.08 v. 0.91 per 100 patient-years, p < 0.001), equivalent to a 21% relative increase in the annual rate (95% confidence interval 13%–30%). We observed no association between event rates and number of treatment cycles. INTERPRETATION: Fertility therapy failure was associated with an increased risk of long-term adverse cardiovascular events. These women merit surveillance for subsequent cardiovascular events. PMID:28385819
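As an illustration of the modelling approach described above (not the study's actual data or covariate set), a multivariable Poisson model with a log person-years offset yields an adjusted event rate ratio; the sketch below uses synthetic data and statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
# Synthetic cohort: exposure = fertility therapy failure (no subsequent birth).
df = pd.DataFrame({
    "no_birth": rng.integers(0, 2, n),   # 1 = did not give birth after therapy
    "age": rng.normal(35, 5, n),
    "person_years": rng.uniform(1, 15, n),
})
rate = 0.009 * np.exp(0.2 * df["no_birth"] + 0.03 * (df["age"] - 35))
df["events"] = rng.poisson(rate * df["person_years"])

X = sm.add_constant(df[["no_birth", "age"]])
fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
             offset=np.log(df["person_years"])).fit()
print(np.exp(fit.params["no_birth"]))  # adjusted rate ratio for therapy failure
```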
NASA Astrophysics Data System (ADS)
Risser, Mark D.; Stone, Dáithí A.; Paciorek, Christopher J.; Wehner, Michael F.; Angélil, Oliver
2017-11-01
In recent years, the climate change research community has become highly interested in describing the anthropogenic influence on extreme weather events, commonly termed "event attribution." Limitations in the observational record and in computational resources motivate the use of uncoupled, atmosphere/land-only climate models with prescribed ocean conditions run over a short period, leading up to and including an event of interest. In this approach, large ensembles of high-resolution simulations can be generated under factual observed conditions and counterfactual conditions that might have been observed in the absence of human interference; these can be used to estimate the change in probability of the given event due to anthropogenic influence. However, using a prescribed ocean state ignores the possibility that estimates of attributable risk might be a function of the ocean state. Thus, the uncertainty in attributable risk is likely underestimated, implying an over-confidence in anthropogenic influence. In this work, we estimate the year-to-year variability in calculations of the anthropogenic contribution to extreme weather based on large ensembles of atmospheric model simulations. Our results both quantify the magnitude of year-to-year variability and categorize the degree to which conclusions of attributable risk are qualitatively affected. The methodology is illustrated by exploring extreme temperature and precipitation events for the northwest coast of South America and northern-central Siberia; we also provide results for regions around the globe. While it remains preferable to perform a full multi-year analysis, the results presented here can serve as an indication of where and when attribution researchers should be concerned about the use of atmosphere-only simulations.
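For readers unfamiliar with the attribution quantities involved, the short sketch below computes the risk ratio and fraction of attributable risk (FAR) from factual and counterfactual ensemble exceedance counts; the ensemble sizes and probabilities are hypothetical.

```python
import numpy as np

def attribution_metrics(factual_exceed, counterfactual_exceed):
    """Estimate p1 (factual), p0 (counterfactual), the risk ratio and the
    fraction of attributable risk (FAR) from boolean ensemble exceedances."""
    p1 = np.mean(factual_exceed)
    p0 = np.mean(counterfactual_exceed)
    rr = p1 / p0 if p0 > 0 else np.inf
    far = 1.0 - p0 / p1 if p1 > 0 else np.nan
    return p1, p0, rr, far

# Hypothetical example: 400-member ensembles; an "event" is exceeding a
# fixed temperature threshold in the region of interest.
rng = np.random.default_rng(1)
factual = rng.random(400) < 0.12         # simulated with observed (factual) conditions
counterfactual = rng.random(400) < 0.05  # simulated with counterfactual conditions
print(attribution_metrics(factual, counterfactual))
```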
Landslide risk mapping and modeling in China
NASA Astrophysics Data System (ADS)
Li, W.; Hong, Y.
2015-12-01
Under circumstances of global climate change, tectonic stress and human activity, landslides are among the most frequent and severely widespread natural hazards on Earth, as demonstrated in the World Atlas of Natural Hazards (McGuire et al., 2004). Every year, landslide activities cause serious economic losses as well as casualties (Róbert et al., 2005). How landslides can be monitored and predicted is an urgent research topic for the international landslide research community. In particular, there is a lack of high-quality and updated landslide risk maps and guidelines that can be employed to better mitigate and prevent landslide disasters in many emerging regions, including China (Hong, 2007). Since the 1950s, landslide events have been recorded in statistical yearbooks, newspapers, and monographs in China. As disasters have drawn increasing concern from the government and the public, information about landslide events is becoming available from online news reports (Liu et al., 2012). This study presents multi-scale landslide risk mapping and modeling in China. At the national scale, based on historical data and practical experience, we carry out landslide susceptibility and risk mapping by adopting a statistical approach and pattern recognition methods to construct empirical models. Over the identified landslide hot-spot areas, we further evaluate the slope stability of each individual site (Sidle and Hirotaka, 2006), with the ultimate goal of setting up a space-time multi-scale coupling system for landslide risk mapping and modeling for landslide hazard monitoring and early warning.
Landslide risk mitigation by means of early warning systems
NASA Astrophysics Data System (ADS)
Calvello, Michele
2017-04-01
Among the many options available to mitigate landslide risk, early warning systems may be used where, in specific circumstances, the risk to life increases above tolerable levels. A coherent framework to classify and analyse landslide early warning systems (LEWS) is herein presented. Once the objectives of an early warning strategy are defined depending on the scale of analysis and the type of landslides to address, the process of designing and managing a LEWS should synergistically employ technical and social skills. A classification scheme for the main components of LEWSs is proposed for weather-induced landslides. The scheme is based on a clear distinction among: i) the landslide model, i.e. a functional relationship between weather characteristics and landslide events considering the geotechnical, geomorphological and hydro-geological characterization of the area as well as an adequate monitoring strategy; ii) the warning model, i.e. the landslide model plus procedures to define the warning events and to issue the warnings; iii) the warning system, i.e. the warning model plus warning dissemination procedures, communication and education tools, strategies for community involvement and emergency plans. Each component of a LEWS is related to a number of actors involved with their deployment, operational activities and management. For instance, communication and education, community involvement and emergency plans are all significantly influenced by people's risk perception and by operational aspects system managers need to address in cooperation with scientists.
Managing wildfire events: risk-based decision making among a group of federal fire managers
Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy Ascher
2011-01-01
Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...
Bio-Terrorism: Steps to Effective Public Health Risk Communication and Fear Management
2004-06-01
outline the challenges of communicating risk prior to, during and following a bio-terrorism event as well as the relationship between the content of...particularly challenging for a system based on thorough research and data analysis. Risk communication in a bio-terrorism event will involve...Ultimately, the Anthrax events confirmed the difficulty in communicating risk when scientific data is not available. Adding to the challenges imposed by an
Shulman, Rayzel; Stukel, Therese A; Miller, Fiona A; Newman, Alice; Daneman, Denis; Wasserman, Jonathan D; Guttmann, Astrid
2016-01-01
Objective To describe adverse events in pediatric insulin pump users since universal funding in Ontario and to explore the role of socioeconomic status and 24-hour support. Research design and methods Population-based cohort study of youth (<19 years) with type 1 diabetes (n=3193) under a universal access program in Ontario, Canada, from 2006 to 2013. We linked 2012 survey data from 33 pediatric diabetes centers to health administrative databases. The relationship between patient and center-level characteristics and time to first diabetic ketoacidosis (DKA) admission or death was tested using a Cox proportional hazards model and the rate of diabetes-related emergency department visits and hospitalizations with a Poisson model, both using generalized estimating equations. Results The rate of DKA was 5.28/100 person-years and mortality 0.033/100 person-years. Compared with the least deprived quintile, the risk of DKA or death for those in the most deprived quintile was significantly higher (HR 1.58, 95% CI 1.05 to 2.38) as was the rate of diabetes-related acute care use (RR 1.60, 95% CI 1.27 to 2.00). 24-hour support was not associated with these outcomes. Higher glycated hemoglobin, prior DKA, older age, and higher nursing patient load were associated with a higher risk of DKA or death. Conclusions The safety profile of pump therapy in the context of universal funding is similar to other jurisdictions and unrelated to 24-hour support. Several factors including higher deprivation were associated with an increased risk of adverse events and could be used to inform the design of interventions aimed at preventing poor outcomes in high-risk individuals. PMID:27547416
NASA Technical Reports Server (NTRS)
Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis
2010-01-01
Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the moon; lunar surface activities; ascension from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components/systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.
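The following toy sketch illustrates how basic-event probabilities combine through AND/OR gates in a fault tree, assuming independent events; the events and numbers are purely illustrative and are not taken from the CEV PRA.

```python
def and_gate(probs):
    """All inputs must fail (independent events)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """At least one input fails (independent events)."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Illustrative mini fault tree for one mission phase:
# loss of phase = engine failure OR (both redundant avionics strings fail).
p_engine = 1e-3
p_avionics = and_gate([5e-3, 5e-3])       # redundant avionics strings
p_phase = or_gate([p_engine, p_avionics])
print(f"Phase failure probability: {p_phase:.2e}")
```

In a phased-mission tree, the phase-level probabilities are then combined in sequence (an event tree over the phases) to obtain mission-level risk.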
The impacts of climate change on scour-vulnerable bridges : an assessment based on HYRISK.
DOT National Transportation Integrated Search
2011-10-01
More than 20% of the bridges in the U.S. were built more than 50 years ago, at a time in which : intense precipitation events were much less common. However, very little work has been done : on the use of scour risk-assessment models to assess how cl...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.
The US Nuclear Regulatory Commission (NRC) has been using full-power, Level 1, limited-scope risk models for the Accident Sequence Precursor (ASP) program for over fifteen years. These models have evolved and matured over the years, as have probabilistic risk assessment (PRA) and computer technologies. Significant upgrading activities have been undertaken over the past three years, with involvement from the Offices of Nuclear Reactor Regulation (NRR), Analysis and Evaluation of Operational Data (AEOD), and Nuclear Regulatory Research (RES), and several national laboratories. Part of these activities was an RES-sponsored feasibility study investigating the ability to extend the ASP models to include contributors to core damage from events initiated with the reactor at low power or shutdown (LP/SD), both internal events and external events. This paper presents only the LP/SD internal event modeling efforts.
Shara, Nawar M; Wang, Hong; Mete, Mihriye; Al-Balha, Yaman Rai; Azalddin, Nameer; Lee, Elisa T; Franceschini, Nora; Jolly, Stacey E; Howard, Barbara V; Umans, Jason G
2012-11-01
In populations with high prevalences of diabetes and obesity, estimating glomerular filtration rate (GFR) by using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation may predict cardiovascular disease (CVD) risk better than by using the Modification of Diet in Renal Disease (MDRD) Study equation. Longitudinal cohort study comparing the association of GFR estimated using either the CKD-EPI or MDRD Study equation with incident CVD outcomes. American Indians participating in the Strong Heart Study, a longitudinal population-based cohort with high prevalences of diabetes, CVD, and CKD. Estimated GFR (eGFR) predicted using the CKD-EPI and MDRD Study equations. Fatal and nonfatal cardiovascular events, consisting of coronary heart disease, stroke, and heart failure. The association between eGFR and outcomes was explored in Cox proportional hazards models adjusted for traditional risk factors and albuminuria; the net reclassification index and integrated discrimination improvement were determined for the CKD-EPI versus MDRD Study equations. In 4,549 participants, diabetes was present in 45%; CVD, in 7%; and stages 3-5 CKD, in 10%. During a median of 15 years, there were 1,280 cases of incident CVD, 929 cases of incident coronary heart disease, 305 cases of incident stroke, and 381 cases of incident heart failure. Reduced eGFR (<90 mL/min/1.73 m2) was associated with adverse events in most models. Compared with the MDRD Study equation, the CKD-EPI equation correctly reclassified 17.0% of 2,151 participants without incident CVD to a lower risk (higher eGFR) category and 1.3% (n=28) were reclassified incorrectly to a higher risk (lower eGFR) category. Single measurements of eGFR and albuminuria at study visits. Although eGFR based on either equation had similar associations with incident CVD, coronary heart disease, stroke, and heart failure events, in those not having events, reclassification of participants to eGFR categories was superior using the CKD-EPI equation compared with the MDRD Study equation. Copyright © 2012 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
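For reference, a sketch of the 2009 CKD-EPI creatinine equation as it is commonly published is given below; coefficients should be verified against the original publication before any use.

```python
def egfr_ckd_epi_2009(scr_mg_dl, age, female, black):
    """CKD-EPI 2009 creatinine equation (mL/min/1.73 m^2), as commonly published."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example: serum creatinine 1.1 mg/dL, age 60, female, non-black.
print(round(egfr_ckd_epi_2009(1.1, 60, female=True, black=False), 1))
```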
Rasmussen, Cathrine Skovmand; Nielsen, Louise Gramstrup; Petersen, Dorthe Janne; Christiansen, Erik; Bilenberg, Niels
2014-04-01
The aim of the study was to identify risk factors for significant changes in emotional and behavioural problem load in a community-based cohort of Danish children aged 9-16 years, the risk factors being seven parental and two child-related adverse life events. Data on emotional and behavioural problems were obtained from parents filling in the Child Behavior Checklist (CBCL) when the child was 8-9 and again when 15 years old. Data on risk factors were drawn from Danish registers. Logistic regression was used to analyse crude and adjusted change. Parental divorce significantly raised the odds ratio of an increase in emotional and behavioural problems; furthermore, the risk of deterioration in problem behaviour rose significantly with an increasing number of adverse life events. By dividing the children into four groups based on the pathway in problem load (increasers, decreasers, high persisters and low persisters), we found that children with a consistently high level of behavioural problems also had the highest number of adverse life events compared with any other group. Family break-up was found to be a significant risk factor. This supports findings in previous studies. The fact that no other risk factor proved to be of significance might be due to lack of power in the study. Children experiencing high levels of adverse life events are at high risk of chronic problem behaviour. Thus these risk factors should be assessed in daily clinical practice.
Arterial stiffness and cardiovascular events: the Framingham Heart Study.
Mitchell, Gary F; Hwang, Shih-Jen; Vasan, Ramachandran S; Larson, Martin G; Pencina, Michael J; Hamburg, Naomi M; Vita, Joseph A; Levy, Daniel; Benjamin, Emelia J
2010-02-02
Various measures of arterial stiffness and wave reflection have been proposed as cardiovascular risk markers. Prior studies have not assessed relations of a comprehensive panel of stiffness measures to prognosis in the community. We used proportional hazards models to analyze first-onset major cardiovascular disease events (myocardial infarction, unstable angina, heart failure, or stroke) in relation to arterial stiffness (pulse wave velocity [PWV]), wave reflection (augmentation index, carotid-brachial pressure amplification), and central pulse pressure in 2232 participants (mean age, 63 years; 58% women) in the Framingham Heart Study. During median follow-up of 7.8 (range, 0.2 to 8.9) years, 151 of 2232 participants (6.8%) experienced an event. In multivariable models adjusted for age, sex, systolic blood pressure, use of antihypertensive therapy, total and high-density lipoprotein cholesterol concentrations, smoking, and presence of diabetes mellitus, higher aortic PWV was associated with a 48% increase in cardiovascular disease risk (95% confidence interval, 1.16 to 1.91 per SD; P=0.002). After PWV was added to a standard risk factor model, integrated discrimination improvement was 0.7% (95% confidence interval, 0.05% to 1.3%; P<0.05). In contrast, augmentation index, central pulse pressure, and pulse pressure amplification were not related to cardiovascular disease outcomes in multivariable models. Higher aortic stiffness assessed by PWV is associated with increased risk for a first cardiovascular event. Aortic PWV improves risk prediction when added to standard risk factors and may represent a valuable biomarker of cardiovascular disease risk in the community.
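A minimal sketch of the modelling strategy (Cox proportional hazards with a standardized exposure and covariate adjustment), using synthetic data and the lifelines package rather than the Framingham data, is shown below.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2000
# Synthetic cohort: standardized pulse wave velocity (PWV) plus standard risk factors.
df = pd.DataFrame({
    "pwv_sd": rng.normal(0, 1, n),
    "age": rng.normal(63, 10, n),
    "sbp": rng.normal(130, 18, n),
    "diabetes": rng.integers(0, 2, n),
})
hazard = 0.01 * np.exp(0.4 * df["pwv_sd"] + 0.03 * (df["age"] - 63))
df["time"] = rng.exponential(1.0 / hazard).clip(max=9.0)  # years of follow-up
df["event"] = (df["time"] < 9.0).astype(int)              # 1 = CVD event observed

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")       # adjusts for all other columns
print(np.exp(cph.params_["pwv_sd"]))                      # hazard ratio per SD of PWV
```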
NASA Astrophysics Data System (ADS)
Michel, G.; Gunasekera, R.; Werner, A.; Galy, H.
2012-04-01
Similar to 2001, 2004, and 2005, 2011 was another year of unexpected international catastrophe events, in which insured losses were more than twice the expected long-term annual average catastrophe losses of USD 30 to 40 bn. Key catastrophe events that significantly contributed to these losses included the Mw 9.0 Great Tohoku earthquake and tsunami, the January 2011 floods in Queensland, the October 2011 floods in Thailand, the Mw 6.1 Christchurch earthquake, and convective storm (tornado) events in the United States. However, despite considerable progress in catastrophe modelling, the advent of global catastrophe models, and increasing risk model coverage and skill in detailed modelling, the above-mentioned events were not satisfactorily modelled by the current mainstream Re/Insurance catastrophe models. This presentation therefore addresses problems in models and incomplete understanding identified from recent catastrophic events by considering: i) the current modelling environment, and ii) how the current processes could be improved via: a) the understanding of risk within science networks such as the Willis Research Network, and b) the integration of risk model results from available insurance catastrophe models and tools. This presentation aims to highlight the needed improvements in decision making and market practices, thereby advancing the current management of risk in the Re/Insurance industry. This also increases the need for better integration of Public-Private-Academic partnerships and tools to provide better estimates of not only financial losses but also humanitarian and infrastructure losses.
Displacement, county social cohesion and depression after a large-scale traumatic event
Lê, Félice; Tracy, Melissa; Norris, Fran H.; Galea, Sandro
2013-01-01
Background Depression is a common and potentially debilitating consequence of traumatic events. Mass traumatic events cause wide-ranging disruptions to community characteristics, influencing the population risk of depression. In the aftermath of such events, population displacement is common. Stressors associated with displacement may increase risk of depression directly. Indirectly, persons who are displaced may experience erosion in social cohesion, further exacerbating their risk for depression. Methods Using data from a population-based cross-sectional survey of adults living in the 23 southernmost counties of Mississippi (N = 708), we modeled the independent and joint relations of displacement and county-level social cohesion with depression 18–24 months after Hurricane Katrina. Results After adjustment for individual- and county-level sociodemographic characteristics and county-level hurricane exposure, joint exposure to both displacement and low social cohesion was associated with substantially higher log-odds of depression (b = 1.34 [0.86–1.83]). Associations were much weaker for exposure only to low social cohesion (b = 0.28 [−0.35–0.90]) or only to displacement (b = 0.04 [−0.80– 0.88]). The associations were robust to additional adjustment for individually perceived social cohesion and social support. Conclusion Addressing the multiple, simultaneous disruptions that are a hallmark of mass traumatic events is important to identify vulnerable populations and understand the psychological ramifications of these events. PMID:23644724
Bobdiwala, S.; Guha, S.; Van Calster, B.; Ayim, F.; Mitchell-Jones, N.; Al-Memar, M.; Mitchell, H.; Stalder, C.; Bottomley, C.; Kothari, A.; Timmerman, D.; Bourne, T.
2016-01-01
STUDY QUESTION What are the adverse outcomes associated with using the M4 model in everyday clinical practice for women with pregnancy of unknown location (PUL)? SUMMARY ANSWER There were 17/835 (2.0%) adverse events and no serious adverse events associated with the performance of the M4 model in clinical practice. WHAT IS KNOWN ALREADY The M4 model has previously been shown to stratify women classified as a PUL as at low or high risk of complications with a good level of test performance. The triage performance of the M4 model is better than single measurements of serum progesterone or the hCG ratio (serum hCG at 48 h/hCG at presentation). STUDY DESIGN, SIZE, DURATION A prospective multi-centre cohort study of 1022 women with a PUL carried out between August 2012 and December 2013 across 2 university teaching hospitals and 1 district general hospital. PARTICIPANTS/MATERIALS, SETTING, METHODS All women presenting with a PUL to the early pregnancy units of the three hospitals were recruited. The final outcome for PUL was either a failed PUL (FPUL), intrauterine pregnancy (IUP) or ectopic pregnancy (EP) (including persistent PUL (PPUL)), with EP and PPUL considered high-risk PUL. Their hCG results at 0 and 48 h were entered into the M4 model algorithm. If the risk of EP was ≥5%, the PUL was predicted to be high-risk and the participant was asked to re-attend 48 h later for a repeat hCG and transvaginal ultrasound scan by a senior clinician. If the PUL was classified as ‘low risk, likely failed PUL’, the participant was asked to perform a urinary pregnancy test 2 weeks later. If the PUL was classified as ‘low risk, likely intrauterine’, the participant was scheduled for a repeat scan in 1 week. Deviations from the management protocol were recorded as either an ‘unscheduled visit (participant reason)’, ‘unscheduled visit (clinician reason)’ or ‘differences in timing (blood test/ultrasound)’. Adverse events were assessed using definitions outlined in the UK Good Clinical Practice Guidelines' document. MAIN RESULTS AND THE ROLE OF CHANCE A total of 835 (82%) women classified as a PUL were managed according to the M4 model (9 met the exclusion criteria, 69 were lost to follow-up, 109 had no hCG result at 48 h). Of these, 443 (53%) had a final outcome of FPUL, 298 (36%) an IUP and 94 (11%) an EP. The M4 model predicted 70% (585/835) PUL as low risk, of which 568 (97%) were confirmed as FPUL or IUP. Of the 17 EP and PPUL misclassified as low risk, 5 had expectant management, 7 medical management with methotrexate and 5 surgical intervention. Nineteen PUL had an unscheduled visit (participant reason), 38 PUL had an unscheduled visit (clinician reason) and 68 PUL had deviations from protocol due to a difference in timing (blood test/ultrasound). Adverse events were reported in 26 PUL and 1 participant had a serious adverse event. A total of 17/26 (65%) adverse events were misclassifications of a high risk PUL as low risk by the M4 model, while 5/26 (19%) adverse events were related to incorrect clinical decisions. Four of the 26 adverse events (15%) were secondary to unscheduled admissions for pain/bleeding. The serious adverse event was due to an incorrect clinical decision. LIMITATIONS, REASONS FOR CAUTION A limitation of the study was that 69/1022 (7%) of PUL were lost to follow-up. A 48 h hCG level was missing for 109/1022 (11%) participants. WIDER IMPLICATIONS OF THE FINDINGS The low number of adverse events (2.0%) suggests that expectant management of PUL using the M4 prediction model is safe. 
The model is an effective way of triaging women with a PUL as being at high- and low-risk of complications and rationalizing follow-up. The multi-centre design of the study is more likely to make the performance of the M4 model generalizable in other populations. STUDY FUNDING/COMPETING INTEREST(S) None. TRIAL REGISTRATION NUMBER Not applicable. PMID:27165655
Bettiol, Alessandra; Lucenteforte, Ersilia; Vannacci, Alfredo; Lombardi, Niccolò; Onder, Graziano; Agabiti, Nera; Vitale, Cristiana; Trifirò, Gianluca; Corrao, Giovanni; Roberto, Giuseppe; Mugelli, Alessandro; Chinellato, Alessandro
2017-12-01
Antihypertensive treatment with calcium channel blockers (CCBs) is well established in clinical practice; however, several studies have observed increased risks of acute events for short-acting CCBs. This study aimed to provide real-world evidence on the risks of acute cardiovascular (CV) events, hospitalizations and mortality among users of different CCB classes in secondary CV prevention. Three case-control studies were nested in a cohort of Italian elderly hypertensive CV-compromised CCB users. Cases were subjects with CV events (n = 25,204), all-cause hospitalizations (n = 19,237), or all-cause mortality (n = 17,996) during the follow-up. Up to four controls were matched to each case. Current or past exposure to CCBs at the index date was defined based on the molecule, formulation and daily dose of the last CCB delivery. Odds ratios (OR) and 95% confidence intervals (CI) were estimated using conditional logistic regression models. Compared to past users, current CCB users had significant reductions in the risks of CV events [OR 0.88 (95% CI: 0.84-0.91)], hospitalization [0.90 (0.88-0.93)] and mortality [0.48 (0.47-0.49)]. Current users of long-acting dihydropyridines (DHPs) had the lowest risk [OR 0.87 (0.84-0.90), 0.86 (0.83-0.90), 0.55 (0.54-0.56) for acute CV events, hospitalizations and mortality], whereas current users of short-acting CCBs had an increased risk of acute CV events [OR 1.77 (1.13-2.78) for short-acting DHPs; 1.19 (1.07-1.31) for short-acting non-DHPs] and hospitalizations [OR 1.84 (0.96-3.51) and 1.23 (1.08-1.42)]. The existing warning on short-acting CCBs should be reinforced, directing clinicians towards the choice of long-acting formulations.
Venous thromboembolism prevention guidelines for medical inpatients: mind the (implementation) gap.
Maynard, Greg; Jenkins, Ian H; Merli, Geno J
2013-10-01
Hospital-associated nonsurgical venous thromboembolism (VTE) is an important problem addressed by new guidelines from the American College of Physicians (ACP) and American College of Chest Physicians (AT9). Narrative review and critique. Both guidelines discount asymptomatic VTE outcomes and caution against overprophylaxis, but have different methodologies and estimates of risk/benefit. Guideline complexity and lack of consensus on VTE risk assessment contribute to an implementation gap. Methods to estimate prophylaxis benefit have significant limitations because major trials included mostly screening-detected events. AT9 relies on a single Italian cohort study to conclude that those with a Padua score ≥4 have a very high VTE risk, whereas patients with a score <4 (60% of patients) have a very small risk. However, the cohort population has less comorbidity than US inpatients, and over 1% of patients with a score of 3 suffered pulmonary emboli. The ACP guideline does not endorse any risk-assessment model. AT9 includes the Padua model and Caprini point-based system for nonsurgical inpatients and surgical inpatients, respectively, but there is no evidence they are more effective than simpler risk-assessment models. New VTE prevention guidelines provide varied guidance on important issues including risk assessment. If Padua is used, a threshold of 3, as well as 4, should be considered. Simpler VTE risk-assessment models may be superior to complicated point-based models in environments without sophisticated clinical decision support. © 2013 Society of Hospital Medicine.
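To illustrate how a point-based risk-assessment model of this kind operates, the sketch below implements a Padua-style score with a configurable threshold; the item weights shown are the commonly cited values and should be checked against the original publication, and the patient data are hypothetical.

```python
# Commonly cited Padua items and weights (verify against the original publication).
PADUA_ITEMS = {
    "active_cancer": 3, "previous_vte": 3, "reduced_mobility": 3, "thrombophilia": 3,
    "recent_trauma_or_surgery": 2, "age_ge_70": 1, "heart_or_respiratory_failure": 1,
    "acute_mi_or_stroke": 1, "acute_infection_or_rheum": 1, "bmi_ge_30": 1,
    "hormonal_treatment": 1,
}

def padua_score(patient: dict) -> int:
    """Sum the weights of the items present for this patient."""
    return sum(w for item, w in PADUA_ITEMS.items() if patient.get(item, False))

def high_risk(patient: dict, threshold: int = 4) -> bool:
    """Flag patients at or above the chosen threshold; the guideline debate
    discussed here concerns using a threshold of 3 rather than 4."""
    return padua_score(patient) >= threshold

pt = {"recent_trauma_or_surgery": True, "age_ge_70": True}   # hypothetical patient, score 3
print(padua_score(pt), high_risk(pt, threshold=3), high_risk(pt, threshold=4))
```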
A Quantitative Risk-Benefit Analysis of Prophylactic Surgery Prior to Extended-Duration Spaceflight
NASA Technical Reports Server (NTRS)
Carroll, Danielle; Reyes, David; Kerstman, Eric; Walton, Marlei; Antonsen, Erik
2017-01-01
INTRODUCTION: Among otherwise healthy astronauts undertaking deep space missions, the risks for acute appendicitis (AA) and cholecystitis (AC) are not zero. If these conditions were to occur during spaceflight they may require surgery for definitive care. The proposed study quantifies and compares the risks of developing de novo AA and AC in-flight to the surgical risks of prophylactic laparoscopic appendectomy (LA) and cholecystectomy (LC) using NASA's Integrated Medical Model (IMM). METHODS: The IMM is a Monte Carlo simulation that forecasts medical events during spaceflight missions and estimates the impact of these medical events on crew health. In this study, four Design Reference Missions (DRMs) were created to assess the probability of an astronaut developing in-flight small-bowel obstruction (SBO) following prophylactic 1) LA, 2) LC, 3) LA and LC, or 4) neither surgery (SR# S-20160407-351). Model inputs were drawn from a large, population-based 2011 Swedish study that examined the incidence and risks of post-operative SBO over a 5-year follow-up period. The study group included 1,152 patients who underwent LA, and 16,371 who underwent LC. RESULTS: Preliminary results indicate that prophylactic LA may yield higher mission risks than the control DRM. Complete analyses are pending and will be subsequently available. DISCUSSION: The risk versus benefits of prophylactic surgery in astronauts to decrease the probability of acute surgical events during spaceflight has only been qualitatively examined in prior studies. Within the assumptions and limitations of the IMM, this work provides the first quantitative guidance that has previously been lacking to this important question for future deep space exploration missions.
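The comparison logic can be sketched with a toy Monte Carlo that contrasts the chance of an in-flight surgical event against the post-operative risk accepted by prophylactic surgery; the probabilities and mission duration below are hypothetical placeholders, not IMM inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000                 # Monte Carlo trials (missions)
mission_years = 2.5         # hypothetical deep-space mission duration

# Hypothetical annual probabilities -- illustrative only, not IMM inputs.
p_appendicitis_per_year = 1.1e-3   # de novo acute appendicitis
p_sbo_per_year_post_la = 0.4e-3    # small-bowel obstruction after laparoscopic appendectomy

def mission_events(p_per_year):
    """Bernoulli draw per mission with probability 1 - (1 - p)^t."""
    p_mission = 1.0 - (1.0 - p_per_year) ** mission_years
    return rng.random(N) < p_mission

control = mission_events(p_appendicitis_per_year)       # no prophylactic surgery
prophylactic = mission_events(p_sbo_per_year_post_la)   # prophylactic LA, SBO risk instead
print("control:", control.mean(), "prophylactic LA:", prophylactic.mean())
```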
A First Step towards a Clinical Decision Support System for Post-traumatic Stress Disorders.
Ma, Sisi; Galatzer-Levy, Isaac R; Wang, Xuya; Fenyö, David; Shalev, Arieh Y
2016-01-01
PTSD is distressing and debilitating, following a non-remitting course in about 10% to 20% of trauma survivors. Numerous risk indicators of PTSD have been identified, but individual-level prediction remains elusive. As an effort to bridge the gap between scientific discovery and practical application, we designed and implemented a clinical decision support pipeline to provide clinically relevant recommendations for trauma survivors. To meet the specific challenge of early prediction, this work uses data obtained within ten days of a traumatic event. The pipeline creates a personalized predictive model for each individual and computes quality metrics for each predictive model. Clinical recommendations are made based on both the prediction of the model and its quality, thus avoiding potentially detrimental recommendations based on insufficient information or a suboptimal model. The current pipeline outperforms acute stress disorder, a commonly used clinical risk factor for PTSD development, in terms of both sensitivity and specificity.
NASA Astrophysics Data System (ADS)
Keshtpoor, M.; Carnacina, I.; Yablonsky, R. M.
2016-12-01
Extratropical cyclones (ETCs) are the primary driver of storm surge events along the UK and northwest mainland Europe coastlines. In an effort to evaluate the storm surge risk in coastal communities in this region, a stochastic catalog is developed by perturbing the historical storm seeds of European ETCs to account for 10,000 years of possible ETCs. Numerical simulation of the storm surge generated by the full 10,000-year stochastic catalog, however, is computationally expensive and may take several months to complete with available computational resources. A new statistical regression model is developed to select the major surge-generating events from the stochastic ETC catalog. This regression model is based on the maximum storm surge, obtained via numerical simulations using a calibrated version of the Delft3D-FM hydrodynamic model with a relatively coarse mesh, of 1750 historical ETC events that occurred over the past 38 years in Europe. These numerically simulated surge values were regressed against the local sea level pressure and the U and V components of the wind field at the locations of 196 tide gauge stations near the UK and northwest mainland Europe coastal areas. The regression model suggests that storm surge values in the area of interest are highly correlated with the U and V components of wind speed, as well as the sea level pressure. Based on these correlations, the regression model was then used to select surge-generating storms from the 10,000-year stochastic catalog. Results suggest that roughly 105,000 events out of 480,000 stochastic storms are surge-generating events and need to be considered for numerical simulation using a hydrodynamic model. The selected stochastic storms were then simulated in Delft3D-FM, and the final refinement of the storm population was performed based on return period analysis of the 1750 historical event simulations at each of the 196 tide gauges in preparation for Delft3D-FM fine-mesh simulations.
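A simplified sketch of the screening idea, with synthetic predictors standing in for the gauge-level wind components and sea level pressure and an illustrative surge threshold, is given below.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)

# Training set: predictors at one tide gauge for ~1750 historical ETCs
# (U, V wind components in m/s and sea level pressure in hPa), and the
# maximum surge from coarse-mesh hydrodynamic simulations (synthetic here).
n_hist = 1750
X_hist = np.column_stack([rng.normal(8, 6, n_hist),      # U wind
                          rng.normal(10, 7, n_hist),     # V wind
                          rng.normal(990, 15, n_hist)])  # sea level pressure
surge_hist = (0.04 * X_hist[:, 0] + 0.06 * X_hist[:, 1]
              - 0.01 * (X_hist[:, 2] - 1013) + rng.normal(0, 0.15, n_hist))

reg = LinearRegression().fit(X_hist, surge_hist)

# Screen the stochastic catalog: run the hydrodynamic model only for storms
# whose predicted surge exceeds an illustrative threshold.
n_stoch = 480_000
X_stoch = np.column_stack([rng.normal(8, 6, n_stoch),
                           rng.normal(10, 7, n_stoch),
                           rng.normal(990, 15, n_stoch)])
keep = reg.predict(X_stoch) > 1.0          # illustrative surge threshold in metres
print(f"{keep.sum()} of {n_stoch} stochastic storms selected for hydrodynamic simulation")
```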
Ndindjock, Roger; Gedeon, Jude; Mendis, Shanthi; Paccaud, Fred
2011-01-01
Abstract Objective To assess the prevalence of cardiovascular (CV) risk factors in Seychelles, a middle-income African country, and compare the cost-effectiveness of single-risk-factor management (treating individuals with arterial blood pressure ≥ 140/90 mmHg and/or total serum cholesterol ≥ 6.2 mmol/l) with that of management based on total CV risk (treating individuals with a total CV risk ≥ 10% or ≥ 20%). Methods CV risk factor prevalence and a CV risk prediction chart for Africa were used to estimate the 10-year risk of suffering a fatal or non-fatal CV event among individuals aged 40–64 years. These figures were used to compare single-risk-factor management with total risk management in terms of the number of people requiring treatment to avert one CV event and the number of events potentially averted over 10 years. Treatment for patients with high total CV risk (≥ 20%) was assumed to consist of a fixed-dose combination of several drugs (polypill). Cost analyses were limited to medication. Findings A total CV risk of ≥ 10% and ≥ 20% was found among 10.8% and 5.1% of individuals, respectively. With single-risk-factor management, 60% of adults would need to be treated and 157 cardiovascular events per 100 000 population would be averted per year, as opposed to 5% of adults and 92 events with total CV risk management. Management based on high total CV risk optimizes the balance between the number requiring treatment and the number of CV events averted. Conclusion Total CV risk management is much more cost-effective than single-risk-factor management. These findings are relevant for all countries, but especially for those economically and demographically similar to Seychelles. PMID:21479093
Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil
Lowe, Rachel; Coelho, Caio AS; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier
2016-01-01
Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics. DOI: http://dx.doi.org/10.7554/eLife.11285.001 PMID:26910315
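The verification measures quoted above come from a standard 2 × 2 contingency table; the sketch below computes the hit rate and false alarm rate from such a table, using illustrative counts rather than the published verification numbers.

```python
def forecast_skill(hits, misses, false_alarms, correct_negatives):
    """Hit rate and false alarm rate from a 2x2 contingency table of
    forecast high-risk vs observed high-dengue-incidence microregions."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate, false_alarm_rate

# Illustrative counts only (not the published verification numbers).
print(forecast_skill(hits=40, misses=30, false_alarms=55, correct_negatives=433))
```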
Bliden, Kevin P; Chaudhary, Rahul; Navarese, Eliano P; Sharma, Tushar; Kaza, Himabindu; Tantry, Udaya S; Gurbel, Paul A
2018-01-01
Conventional cardiovascular risk estimators based on clinical demographics have limited ability to predict coronary events. Markers of thrombogenicity and vascular function have not been explored in risk estimation for high-risk patients with coronary artery disease. We aimed to develop a clinical and biomarker score to predict 3-year adverse cardiovascular events. Four hundred eleven patients with ejection fraction ≥40% who underwent coronary angiography and were found to have a luminal diameter stenosis ≥50% were included in the analysis. Thrombelastography indices and central pulse pressure (CPP) were determined at the time of catheterization. We identified predictors of death, myocardial infarction (MI) or stroke and developed a numerical ischemia risk score. The primary endpoint of cardiovascular death, MI or stroke occurred in 22 patients (5.4%). The factors associated with events were age, prior PCI or CABG, diabetes, CPP, and thrombin-induced platelet-fibrin clot strength, and these were included in the MAGMA-ischemia score. The MAGMA-ischemia score showed a c-statistic of 0.85 (95% confidence interval [CI] 0.80-0.87; p<0.001) for the primary endpoint. In the subset of patients who underwent revascularization, the c-statistic was 0.90 (p<0.001). Patients with a MAGMA-ischemia score greater than 5 had the highest risk of developing clinical events, with a hazard ratio of 13.9 (95% CI 5.8-33.1, p<0.001) for the primary endpoint and 4.8 (95% CI 2.3-9.6, p<0.001) for the secondary endpoint. Compared with previous models, the MAGMA-ischemia score yielded higher discrimination. Inclusion of CPP and assessment of thrombogenicity in a novel score for patients with documented CAD enhanced the prediction of events. Copyright © 2017 Elsevier B.V. All rights reserved.
Mining Rare Events Data for Assessing Customer Attrition Risk
NASA Astrophysics Data System (ADS)
Au, Tom; Chin, Meei-Ling Ivy; Ma, Guangqin
Customer attrition refers to the phenomenon whereby a customer leaves a service provider. As competition intensifies, preventing customers from leaving is a major challenge for many businesses, such as telecom service providers. Research has shown that retaining existing customers is more profitable than acquiring new customers, due primarily to savings on acquisition costs, the higher volume of service consumption, and customer referrals. For a large enterprise whose customer base consists of tens of millions of service subscribers, events such as switching to competitors or canceling services are often large in absolute number but rare in percentage terms, far less than 5%. Based on a simple random sample, popular statistical procedures, such as logistic regression, tree-based methods and neural networks, can sharply underestimate the probability of rare events and often result in a null model (no significant predictors). To improve the efficiency and accuracy of event probability estimation, a case-based data collection technique is then considered. A case-based sample is formed by taking all available events and a small, but representative, fraction of non-events from a dataset of interest. In this article we show a consistent prior correction method for event probability estimation and demonstrate the performance of the above data collection techniques in predicting customer attrition with actual telecommunications data.
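A sketch of the standard prior (intercept) correction for a logistic model fitted on a case-based sample is shown below, where tau is the true population event rate and ybar the event share in the sample; the coefficients and rates are illustrative.

```python
import numpy as np

def corrected_intercept(beta0_sample, tau, ybar):
    """Adjust the logistic intercept fitted on a case-based sample so that
    predicted probabilities reflect the true population event rate tau."""
    return beta0_sample - np.log(((1.0 - tau) / tau) * (ybar / (1.0 - ybar)))

def event_probability(x, beta, beta0_sample, tau, ybar):
    """Predicted attrition probability using the prior-corrected intercept."""
    eta = corrected_intercept(beta0_sample, tau, ybar) + np.dot(x, beta)
    return 1.0 / (1.0 + np.exp(-eta))

# Illustrative: population churn rate 2%, case-based sample built with 30% events.
print(event_probability(x=np.array([1.2, -0.5]),
                        beta=np.array([0.8, 0.3]),
                        beta0_sample=-1.0, tau=0.02, ybar=0.30))
```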
Modeling the risk of water pollution by pesticides from imbalanced data.
Trajanov, Aneta; Kuzmanovski, Vladimir; Real, Benoit; Perreau, Jonathan Marks; Džeroski, Sašo; Debeljak, Marko
2018-04-30
The pollution of ground and surface waters with pesticides is a serious ecological issue that requires adequate treatment. Most of the existing water pollution models are mechanistic mathematical models. While they have made a significant contribution to understanding the transfer processes, they face the problem of validation because of their complexity, the user subjectivity in their parameterization, and the lack of empirical data for validation. In addition, the data describing water pollution with pesticides are, in most cases, very imbalanced. This is due to strict regulations for pesticide applications, which lead to only a few pollution events. In this study, we propose the use of data mining to build models for assessing the risk of water pollution by pesticides in field-drained outflow water. Unlike the mechanistic models, the models generated by data mining are based on easily obtainable empirical data, while the parameterization of the models is not influenced by the subjectivity of ecological modelers. We used empirical data from field trials at the La Jaillière experimental site in France and applied the random forests algorithm to build predictive models that predict "risky" and "not-risky" pesticide application events. To address the problems of the imbalanced classes in the data, cost-sensitive learning and different measures of predictive performance were used. Despite the high imbalance between risky and not-risky application events, we managed to build predictive models that make reliable predictions. The proposed modeling approach can be easily applied to other ecological modeling problems where we encounter empirical data with highly imbalanced classes.
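A minimal sketch of the cost-sensitive approach, using scikit-learn's class weighting on a synthetic imbalanced dataset rather than the La Jaillière data, is shown below.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score, confusion_matrix

rng = np.random.default_rng(5)
n = 3000
# Synthetic application events: a few descriptors of each pesticide application,
# with only ~5% "risky" events (imbalanced classes).
X = rng.normal(size=(n, 6))
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 1.5, n) > 3.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Cost-sensitive learning: misclassifying a risky event is made more costly
# via class weights inversely proportional to class frequencies.
rf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
rf.fit(X_tr, y_tr)
pred = rf.predict(X_te)
print(balanced_accuracy_score(y_te, pred))
print(confusion_matrix(y_te, pred))
```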
NASA Astrophysics Data System (ADS)
Guler Yigitoglu, Askin
In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data which means that the true state of a specific plant is not reflected in a realistic manner on aging effects. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: The development of a methodology for the incorporation of aging modeling of passive SSC into a reactor simulation environment to provide a framework for evaluation of their risk contribution in both the dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe weld and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics based model is selected to represent the aging process. The model is modified via sojourn time approach to reflect the operational and maintenance history dependence of the transition rates. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment and uncertainties associated with both parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of : i) defining a process for selecting critical passive components and related aging mechanisms, ii) aging model selection, iii) calculating the probability that aging would cause the component to fail, iv) uncertainty/sensitivity analyses, v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures, and, vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
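The uncertainty-propagation step can be sketched as follows: Latin hypercube samples of uncertain inputs are pushed through a simple degradation model to estimate a failure probability; the toy crack-growth model and distributions below are assumptions for illustration, not the dissertation's physics models.

```python
import numpy as np
from scipy.stats import qmc, norm, lognorm

# Latin hypercube sample of two uncertain inputs of a toy degradation model:
# a crack growth rate and a critical crack size (illustrative distributions).
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=10_000)                            # uniform samples in [0, 1)^2
growth_rate = lognorm.ppf(u[:, 0], s=0.6, scale=0.2)    # mm per year
critical_size = norm.ppf(u[:, 1], loc=15.0, scale=2.0)  # mm

years = 40.0
initial_size = 3.0                                      # mm
final_size = initial_size + growth_rate * years

# Failure probability estimate: crack exceeds critical size within 40 years.
p_fail = np.mean(final_size > critical_size)
print(f"Estimated failure probability over {years:.0f} years: {p_fail:.3e}")
```

In a two-loop scheme, an outer loop of this kind would additionally sample the model (epistemic) uncertainties, with the inner loop handling parameter variability as above.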
Murphy, F Gregory; Swingler, Ashleigh J; Gerth, Wayne A; Howle, Laurens E
2018-01-01
Decompression sickness (DCS) in humans is associated with reductions in ambient pressure that occur during diving, aviation, or certain manned spaceflight operations. Its signs and symptoms can include, but are not limited to, joint pain, radiating abdominal pain, paresthesia, dyspnea, general malaise, cognitive dysfunction, cardiopulmonary dysfunction, and death. Probabilistic models of DCS allow the probability of DCS incidence and time of occurrence during or after a given hyperbaric or hypobaric exposure to be predicted based on how the gas contents or gas bubble volumes vary in hypothetical tissue compartments during the exposure. These models are calibrated using data containing the pressure and respired gas histories of actual exposures, some of which resulted in DCS, some of which did not, and others in which the diagnosis of DCS was not clear. The latter are referred to as marginal DCS cases. In earlier works, a marginal DCS event was typically weighted as 0.1, with a full DCS event being weighted as 1.0, and a non-event being weighted as 0.0. Recent work has shown that marginal DCS events should be weighted as 0.0 when calibrating gas content models. We confirm this indication in the present work by showing that such models have improved performance when calibrated to data with marginal DCS events coded as non-events. Further, we investigate the ramifications of derating marginal events on model-prescribed air diving no-stop limits. Copyright © 2017 Elsevier Ltd. All rights reserved.
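The role of the outcome weight in calibration can be sketched with a weighted binomial log-likelihood term, in which a full DCS case carries weight 1.0, a non-event 0.0, and a marginal case either 0.1 (older practice) or 0.0 (the coding supported here); the predicted probability below is illustrative.

```python
import numpy as np

def weighted_log_likelihood(p_dcs, outcome_weight):
    """Contribution of one exposure to the calibration log-likelihood.
    outcome_weight = 1.0 for full DCS, 0.0 for no DCS, and (historically)
    0.1 for a marginal case; weight 0.0 treats marginal cases as non-events."""
    p_dcs = np.clip(p_dcs, 1e-12, 1 - 1e-12)
    return outcome_weight * np.log(p_dcs) + (1.0 - outcome_weight) * np.log(1.0 - p_dcs)

# Illustrative: a model-predicted DCS probability of 4% for one dive profile.
p = 0.04
for w, label in [(1.0, "full DCS"), (0.1, "marginal (old coding)"), (0.0, "marginal as non-event")]:
    print(f"{label:>24}: log-likelihood contribution = {weighted_log_likelihood(p, w):+.3f}")
```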
Carotid Atherosclerosis Progression and Risk of Cardiovascular Events in a Community in Taiwan.
Chen, Pei-Chun; Jeng, Jiann-Shing; Hsu, Hsiu-Ching; Su, Ta-Chen; Chien, Kuo-Liong; Lee, Yuan-Teh
2016-05-12
The authors investigated the association between progression of carotid atherosclerosis and incidence of cardiovascular disease in a community cohort in Taiwan; such data have rarely been reported in Asian populations. Study subjects were 1,398 participants who underwent ultrasound measures of common carotid artery intima-media thickness (IMT) and extracranial carotid artery plaque score at both the 1994-1995 and 1999-2000 surveys. A Cox proportional hazards model was used to assess the risk of incident cardiovascular disease. During a median follow-up of 13 years (1999-2013), 71 strokes and 68 coronary events occurred. The 5-year individual IMT change was not associated with development of cardiovascular events in unadjusted and adjusted models. Among subjects without plaque in 1994-1995, we observed an elevated risk associated with the presence of new plaque (plaque score >0 in 1999-2000) in a dose-response manner in unadjusted and age- and sex-adjusted models. The associations attenuated and became statistically non-significant after controlling for cardiovascular risk factors (hazard ratio [95% confidence interval] for plaque score >2 vs. 0: stroke, 1.61 [0.79-3.27]; coronary events, 1.13 [0.48-2.69]). This study suggested that carotid plaque formation measured by ultrasound is associated with increased risk of developing cardiovascular disease, and that cardiovascular risk factors explain the associations to a large extent.
Lan, Chen-Chia; Tseng, Chun-Hung; Chen, Jiunn-Horng; Lan, Joung-Liang; Wang, Yu-Chiao; Tsay, Gregory J; Hsu, Chung-Yi
2016-11-01
An increased risk of suicide ideation and death has been reported in patients with fibromyalgia. This study aimed to evaluate the risk of a suicide event in patients with primary fibromyalgia and in fibromyalgia patients with comorbidities. We used the Longitudinal Health Insurance Database, a subset of the national insurance claim dataset, which enrolled 1 million Taiwanese people from 2000 to 2005, to identify 95,150 patients with incident fibromyalgia (ICD-9-CM 729.0-729.1) and 190,299 reference subjects matched by sex, age, and index date of diagnosis, with a mean of 8.46 ± 2.37 years of follow-up until 2011. The risk of a suicide event (ICD-9-CM, External-Cause Codes 950-959) was analyzed with a Cox proportional hazards model. Stratification analysis was performed by separating fibromyalgia patients and reference subjects with respect to each comorbidity to determine the risk of suicide in fibromyalgia patients with or without comorbidity relative to subjects who had neither fibromyalgia nor comorbidity. In this Taiwanese dataset, there were 347 suicide events in patients with fibromyalgia (4.16 per 10,000 person-years) and 424 in matched reference subjects (2.63 per 10,000 person-years), with a significant crude hazard ratio (HR) of 1.58 (95% confidence interval [CI] 1.38-1.83) and an adjusted HR of 1.38 (95% CI 1.17-1.71) for fibromyalgia patients relative to the matched reference subjects. According to the 2 × 2 stratification analysis, we found that fibromyalgia patients without comorbidity had an independent but mild risk of a suicide event, with adjusted HRs ranging from 1.33 to 1.69 relative to subjects with neither fibromyalgia nor comorbidity. Meanwhile, fibromyalgia patients with comorbidity had a markedly enhanced risk of a suicide event relative to the matched reference subjects, with adjusted HRs ranging from 1.51 to 8.23. Our analysis confirmed a mild-to-moderate risk of a suicide event in patients with primary fibromyalgia. Attention should be paid to the prevention of suicide in fibromyalgia patients with concomitant comorbidities.
Socioeconomic indicators of heat-related health risk supplemented with remotely sensed data
Johnson, Daniel P; Wilson, Jeffrey S; Luber, George C
2009-01-01
Background Extreme heat events are the number one cause of weather-related fatalities in the United States. The current system of alert for extreme heat events does not take into account intra-urban spatial variation in risk. The purpose of this study is to evaluate a potential method to improve spatial delineation of risk from extreme heat events in urban environments by integrating sociodemographic risk factors with estimates of land surface temperature derived from thermal remote sensing data. Results Comparison of logistic regression models indicates that supplementing known sociodemographic risk factors with remote sensing estimates of land surface temperature improves the delineation of intra-urban variations in risk from extreme heat events. Conclusion Thermal remote sensing data can be utilized to improve understanding of intra-urban variations in risk from extreme heat. The refinement of current risk assessment systems could increase the likelihood of survival during extreme heat events and assist emergency personnel in the delivery of vital resources during such disasters. PMID:19835578
An experimental system for flood risk forecasting and monitoring at global scale
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Alfieri, Lorenzo; Kalas, Milan; Lorini, Valerio; Salamon, Peter
2017-04-01
Global flood forecasting and monitoring systems are nowadays a reality and are being applied by a wide range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasting, combining streamflow estimations with expected inundated areas and flood impacts. Finally, emerging technologies such as crowdsourcing and social media monitoring can play a crucial role in flood disaster management and preparedness. Here, we present some recent advances in an experimental procedure for near-real-time flood mapping and impact assessment. The procedure translates in near real time the daily streamflow forecasts issued by the Global Flood Awareness System (GloFAS) into event-based flood hazard maps, which are then combined with exposure and vulnerability information at global scale to derive risk forecasts. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructures and cities. To increase the reliability of our forecasts we propose the integration of model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification and correction of impact forecasts. Finally, we present the results of preliminary tests which show the potential of the proposed procedure in supporting emergency response and management.
Tsertsvadze, Alexander; Yazdi, Fatemeh; Fink, Howard A; MacDonald, Roderick; Wilt, Timothy J; Bella, Anthony J; Ansari, Mohammed T; Garritty, Chantelle; Soares-Weiser, Karla; Daniel, Raymond; Sampson, Margaret; Moher, David
2009-10-01
To summarize and compare evidence on harms in sildenafil- and placebo-treated men with erectile dysfunction (ED) in a systematic review and meta-analysis. Randomized placebo-controlled trials (RCTs) were identified using an electronic search in MEDLINE, EMBASE, PsycINFO, SCOPUS, and Cochrane CENTRAL. The rates of any adverse events (AEs), most commonly reported AEs, withdrawals because of adverse events, and serious adverse events were ascertained and compared between sildenafil and placebo groups. Results for men with ED were stratified by clinical condition(s). Statistical heterogeneity was explored. Meta-analyses based on a random-effects model were also performed. A total of 49 RCTs were included. Sildenafil-treated men had a higher risk for all-cause AEs (RR = 1.56, 95% CI: 1.38, 1.76), headache, flushing, dyspepsia, and visual disturbances compared with placebo-treated men. The magnitude of excess risk was greater in fixed- than in flexible-dose trials. The rates of serious adverse events and withdrawals because of adverse events did not differ in sildenafil vs placebo groups. A higher dose of sildenafil corresponded to a greater risk of AEs. The increased risk of harms was observed within and across clinically defined specific groups of patients. There was a lack of RCTs reporting long-term (>6 months) harms data. In short-term trials, men with ED randomized to sildenafil had an increased risk of any all-cause AEs, headache, flushing, dyspepsia, and visual disturbances. The exploration of different modes of dose optimization of sildenafil may be warranted.
Risk Assessment of Bone Fracture During Space Exploration Missions to the Moon and Mars
NASA Technical Reports Server (NTRS)
Lewandowski, Beth E.; Myers, Jerry G.; Nelson, Emily S.; Licatta, Angelo; Griffin, Devon
2007-01-01
The possibility of a traumatic bone fracture in space is a concern due to the observed decrease in astronaut bone mineral density (BMD) during spaceflight and because of the physical demands of the mission. The Bone Fracture Risk Module (BFxRM) was developed to quantify the probability of fracture at the femoral neck and lumbar spine during space exploration missions. The BFxRM is scenario-based, providing predictions for specific activities or events during a particular space mission. The key elements of the BFxRM are the mission parameters, the biomechanical loading models, the bone loss and fracture models and the incidence rate of the activity or event. Uncertainties in the model parameters arise due to variations within the population and unknowns associated with the effects of the space environment. Consequently, parameter distributions were used in Monte Carlo simulations to obtain an estimate of fracture probability under real mission scenarios. The model predicts an increase in the probability of fracture as the mission length increases and fracture is more likely in the higher gravitational field of Mars than on the moon. The resulting probability predictions and sensitivity analyses of the BFxRM can be used as an engineering tool for mission operation and resource planning in order to mitigate the risk of bone fracture in space.
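The scenario-based Monte Carlo approach described above can be illustrated with a very small sketch: sample bone strength (declining with mission length) and event loading from assumed distributions, and count trials where load exceeds strength. The parameter values below are illustrative assumptions, not the BFxRM's.

```python
# Hypothetical sketch of a scenario-based fracture probability estimate. Strength decline,
# load distribution, and the fall scenario are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(1)

def fracture_probability(mission_days, n=100_000):
    bmd_loss = 0.0005 * mission_days                                # assumed ~1.5%/month BMD decline
    strength = rng.normal(9000.0 * (1 - bmd_loss), 900.0, n)        # femoral neck strength, N
    load = rng.lognormal(mean=np.log(3500.0), sigma=0.4, size=n)    # fall load, N
    return np.mean(load > strength)                                 # fraction of trials with fracture

for days in (30, 180, 360):
    print(days, "days:", fracture_probability(days))
```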
Friedman, Samuel R; Rossi, Diana; Braine, Naomi
2009-05-01
Political-economic transitions in the Soviet Union, Indonesia, and China, but not the Philippines, were followed by HIV epidemics among drug users. Wars also may sometimes increase HIV risk. Based on similarities in some of the causal pathways through which wars and transitions can affect HIV risk, we use the term "Big Events" to include both. We first critique several prior epidemiological models of Big Events as inadequately incorporating social agency and as somewhat imprecise and over-generalizing in their sociology. We then suggest a model using the following concepts: first, event-specific HIV transmission probabilities are functions of (a) the probability that partners are infection-discordant; (b) the infection-susceptibility of the uninfected partner; (c) the infectivity of the infected--as well as (d) the behaviours engaged in. These probabilities depend on the distributions of HIV and other variables in populations. Sexual or injection events incorporate risk behaviours and are embedded in sexual and injection partnership patterns and community networks, which in turn are shaped by the content of normative regulation in communities. Wars and transitions can change socio-economic variables that can sometimes precipitate increases in the numbers of people who engage in high-risk drug and sexual networks and behaviours and in the riskiness of what they do. These variables that Big Events affect may include population displacement; economic difficulties and policies; police corruption, repressiveness, and failure to preserve order; health services; migration; social movements; gender roles; and inter-communal violence--which, in turn, affect normative regulation, youth alienation, networks and behaviours. As part of these pathways, autonomous action by neighbourhood residents, teenagers, drug users and sex workers to maintain their economic welfare, health or happiness may affect many of these variables or otherwise mediate whether HIV epidemics follow transitions. We thus posit that research on whether and how these interacting causal pathways and autonomous actions are followed by drug-related harm and/or HIV or other epidemics can help us understand how to intervene to prevent or mitigate such harms.
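The event-level decomposition sketched in points (a)-(d) above can be written as a simple product. The snippet below is a minimal, hedged illustration of that idea; the parameter names and values (prevalence, per-act infectivity, susceptibility multiplier) are assumptions, not figures from the article.

```python
# A minimal sketch of the per-event decomposition: probability that a single sexual or
# injection event transmits infection, as the product of partner discordance, per-act
# infectivity given the behaviour, and the susceptibility of the uninfected partner.
def event_transmission_probability(p_discordant, infectivity_per_act, susceptibility_factor):
    return p_discordant * infectivity_per_act * susceptibility_factor

# Example (illustrative numbers): random mixing in a population with 20% prevalence,
# receptive syringe sharing with an assumed per-act infectivity of 0.8%, susceptibility 1.0.
p_discordant = 2 * 0.20 * 0.80      # one partner infected, the other uninfected
print(event_transmission_probability(p_discordant, 0.008, 1.0))
```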
Overview of EVA PRA for TPS Repair for Hubble Space Telescope Servicing Mission
NASA Technical Reports Server (NTRS)
Bigler, Mark; Duncan, Gary; Roeschel, Eduardo; Canga, Michael
2010-01-01
Following the Columbia accident in 2003, NASA developed techniques to repair the Thermal Protection System (TPS) in the event of damage to the TPS as one of several actions to reduce the risk to future flights from ascent debris, micro-meteoroid and/or orbital debris (MMOD). Other actions to help reduce the risk include improved inspection techniques, reduced shedding of debris from the External Tank and ability to rescue the crew with a launch on need vehicle. For the Hubble Space Telescope (HST) Servicing Mission the crew rescue capability was limited by the inability to safe haven on the International Space Station (ISS), resulting in a greater reliance on the repair capability. Therefore it was desirable to have an idea of the risk associated with conducting a repair, where the repair would have to be conducted using an Extra-Vehicular Activity (EVA). Previously, focused analyses had been conducted to quantify the risk associated with certain aspects of an EVA, for example the EVA Mobility Unit (EMU) or Space Suit; however, the analyses were somewhat limited in scope. A complete integrated model of an EVA which could quantify the risk associated with all of the major components of an EVA had never been done before. It was desired to have a complete integrated model to be able to assess the risks associated with an EVA to support the Space Shuttle Program (SSP) in making risk informed decisions. In the case of the HST Servicing Mission, this model was developed to assess specifically the risks associated with performing a TPS repair EVA. This paper provides an overview of the model that was developed to support the HST mission in the event of TPS damage. The HST Servicing Mission was successfully completed on May 24th 2009 with no critical TPS damage; therefore the model was not required for real-time mission support. However, it laid the foundation upon which future EVA quantitative risk assessments could be based.
Liu, Hui; Waite, Linda J; Shen, Shannon; Wang, Donna H
2016-09-01
Working from a social relationship and life course perspective, we provide generalizable population-based evidence on partnered sexuality linked to cardiovascular risk in later life using national longitudinal data from the National Social Life, Health and Aging Project (NSHAP) (N = 2,204). We consider characteristics of partnered sexuality of older men and women, particularly sexual activity and sexual quality, as they affect cardiovascular risk. Cardiovascular risk is defined as hypertension, rapid heart rate, elevated C-reactive protein (CRP), and general cardiovascular events. We find that older men are more likely to report being sexually active, having sex more often, and more enjoyably than are older women. Results from cross-lagged models suggest that high frequency of sex is positively related to later risk of cardiovascular events for men but not women, whereas good sexual quality seems to protect women but not men from cardiovascular risk in later life. We find no evidence that poor cardiovascular health interferes with later sexuality for either gender. © American Sociological Association 2016.
NASA Technical Reports Server (NTRS)
Kerstman, Eric; Saile, Lynn; Freire de Carvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma
2011-01-01
Introduction The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission managers and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight. Methods Stochastic computational methods are used to forecast probability distributions of medical events, crew health metrics, medical resource utilization, and probability estimates of medical evacuation and loss of crew life. The IMM can also optimize medical kits within the constraints of mass and volume for specified missions. The IMM was used to forecast medical evacuation and loss of crew life probabilities, as well as crew health metrics for a near-earth asteroid (NEA) mission. An optimized medical kit for this mission was proposed based on the IMM simulation. Discussion The IMM can provide information to the space program regarding medical risks, including crew medical impairment, medical evacuation and loss of crew life. This information is valuable to mission managers and the space medicine community in assessing risk and developing mitigation strategies. Exploration missions such as NEA missions will have significant mass and volume constraints applied to the medical system. Appropriate allocation of medical resources will be critical to mission success. The IMM capability of optimizing medical systems based on specific crew and mission profiles will be advantageous to medical system designers. Conclusion The IMM is a decision support tool that can provide estimates of the impact of medical events on human space flight missions, such as crew impairment, evacuation, and loss of crew life. It can be used to support the development of mitigation strategies and to propose optimized medical systems for specified space flight missions. Learning Objectives The audience will learn how an evidence-based decision support tool can be used to help assess risk, develop mitigation strategies, and optimize medical systems for exploration space flight missions.
Cognitive Complexity of the Medical Record Is a Risk Factor for Major Adverse Events
Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot
2014-01-01
Context: Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood because “patient complexity” has been difficult to quantify. Objective: We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. Design: The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. Main Outcome Measures: The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Results: Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. Conclusions: CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time. PMID:24626065
Wolfson, Julian; Vock, David M; Bandyopadhyay, Sunayan; Kottke, Thomas; Vazquez-Benitez, Gabriela; Johnson, Paul; Adomavicius, Gediminas; O'Connor, Patrick J
2017-04-24
Clinicians who are using the Framingham Risk Score (FRS) or the American College of Cardiology/American Heart Association Pooled Cohort Equations (PCE) to estimate risk for their patients based on electronic health data (EHD) face 4 questions. (1) Do published risk scores applied to EHD yield accurate estimates of cardiovascular risk? (2) Are FRS risk estimates, which are based on data that are up to 45 years old, valid for a contemporary patient population seeking routine care? (3) Do the PCE make the FRS obsolete? (4) Does refitting the risk score using EHD improve the accuracy of risk estimates? Data were extracted from the EHD of 84 116 adults aged 40 to 79 years who received care at a large healthcare delivery and insurance organization between 2001 and 2011. We assessed calibration and discrimination for 4 risk scores: published versions of FRS and PCE and versions obtained by refitting models using a subset of the available EHD. The published FRS was well calibrated (calibration statistic K=9.1, miscalibration ranging from 0% to 17% across risk groups), but the PCE displayed modest evidence of miscalibration (calibration statistic K=43.7, miscalibration from 9% to 31%). Discrimination was similar in both models (C-index=0.740 for FRS, 0.747 for PCE). Refitting the published models using EHD did not substantially improve calibration or discrimination. We conclude that published cardiovascular risk models can be successfully applied to EHD to estimate cardiovascular risk; the FRS remains valid and is not obsolete; and model refitting does not meaningfully improve the accuracy of risk estimates. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
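The two checks reported above, discrimination and calibration, can be illustrated in a few lines. The sketch below uses simulated data and a logistic model as a stand-in for FRS/PCE-style scores; all variable names and coefficients are assumptions for illustration, not the study's data.

```python
# Sketch: discrimination via the C-index (ROC AUC for a binary outcome) and calibration via
# observed vs. predicted event rates within risk-decile groups, on simulated data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 20_000
X = rng.normal(size=(n, 4))                                  # e.g. age, SBP, cholesterol, smoking (standardized)
true_lp = X @ np.array([0.8, 0.5, 0.4, 0.6]) - 2.5
y = rng.binomial(1, 1 / (1 + np.exp(-true_lp)))              # simulated 10-year events

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]

print("C-index (AUC):", round(roc_auc_score(y, risk), 3))    # discrimination

df = pd.DataFrame({"risk": risk, "event": y})
df["decile"] = pd.qcut(df["risk"], 10, labels=False)
calib = df.groupby("decile").agg(predicted=("risk", "mean"), observed=("event", "mean"))
print(calib)                                                  # close agreement => good calibration
```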
Risk Management Model in Surface Exploitation of Mineral Deposits
NASA Astrophysics Data System (ADS)
Stojanović, Cvjetko
2016-06-01
Risk management is an integrative part of all types of project management. One of the main tasks of pre-investment studies and other project documentation is to protect investment projects as much as possible against investment risks. Therefore, the provision and regulation of risk information ensures identification of the probability of adverse events, their forms, causes and consequences, and provides timely measures of protection against risks. This means that risk management involves a set of management methods and techniques used to reduce the possibility of adverse events and their consequences and thus increase the possibility of achieving the planned results with minimal losses. Investments in mining projects are of capital importance; because they are very complex, they are also very risky, owing to the influence of internal and external factors and limitations arising from the socio-economic environment. Due to the lack of a risk management system, numerous organizations worldwide have suffered significant financial losses. It is therefore necessary for any organization to establish a risk management system as a structural element of its management system as a whole. This paper presents an approach to a risk management model for the project of opening a surface coal mine, developed based on a study of the extensive scientific literature and the personal experience of the author, which, with certain modifications, may find use in any investment project, in the mining industry as well as in other areas.
Modeling logistic performance in quantitative microbial risk assessment.
Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke
2010-01-01
In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
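As a rough illustration of the point above, the sketch below simulates a retail shelf as a FIFO queue with a periodic order-up-to policy, producing a distribution of storage times that could then feed a microbial growth model. The delivery cycle, demand rate, and growth rate are assumptions for illustration, not the article's parameterization of the iceberg lettuce chain.

```python
# Illustrative discrete-event-style simulation of storage times in one step of a logistic chain.
import numpy as np
from collections import deque

rng = np.random.default_rng(3)

def simulate_storage_times(days=365, order_up_to=40, delivery_every=2, demand_mean=12):
    shelf = deque()                            # each item stores the day it arrived
    storage_times = []
    for day in range(days):
        if day % delivery_every == 0:          # ordering mechanism: replenish up to target level
            shelf.extend([day] * max(0, order_up_to - len(shelf)))
        for _ in range(rng.poisson(demand_mean)):
            if shelf:
                storage_times.append(day - shelf.popleft())   # FIFO sale records time on shelf
    return np.array(storage_times)

times = simulate_storage_times()
log_growth = 0.15 * times                      # assumed log10 CFU increase per day at retail temperature
print("mean storage (days):", times.mean(),
      "| 99th percentile log10 growth:", np.percentile(log_growth, 99))
```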
Solomon, Daniel H.; Kremer, Joel; Curtis, Jeffrey R; Hochberg, Marc C.; Reed, George; Tsao, Peter; Farkouh, Michael E.; Setoguchi, Soko; Greenberg, Jeffrey D.
2010-01-01
Background Cardiovascular (CV) disease has a major impact on patients with rheumatoid arthritis (RA), however, the relative contributions of traditional CV risk factors and markers of RA severity are unclear. We examined the relative importance of traditional CV risk factors and RA markers in predicting CV events. Methods A prospective longitudinal cohort study was conducted in the setting of the CORRONA registry in the United States. Baseline data from subjects with RA enrolled in the CORRONA registry were examined to determine predictors of CV outcomes, including myocardial infarction (MI), stroke or transient ischemic attack (TIA). Possible predictors were of two types: traditional CV risk factors and markers of RA severity. The discriminatory value of these variables was assessed by calculating the area under the receiver operating characteristic curve (c-statistic) in logistic regression. We then assessed the incidence rate for CV events among subjects with an increasing number of traditional CV risk factors and/or RA severity markers. Results The cohort consisted of 10,156 patients with RA followed for a median of 22 months. We observed 76 primary CV events during follow-up for a composite event rate of 3.98 (95% CI 3.08 – 4.88) per 1,000 patient-years. The c-statistic improved from 0.57 for models with only CV risk factors to 0.67 for models with CV risk factors plus age and gender. The c-statistic improved further to 0.71 when markers of RA severity were also added. The incidence rate for CV events was 0 (95% CI 0 – 5.98) for persons without any CV risk factors or markers of RA severity, while in the group with two or more CV risk factors and 3 or more markers of RA severity the incidence was 7.47 (95% CI 4.21–10.73) per 1,000 person-years. Conclusions Traditional CV risk factors and markers of RA severity both contribute to models predicting CV events. Increasing numbers of both types of factors are associated with greater risk. PMID:20444756
A framework for probabilistic pluvial flood nowcasting for urban areas
NASA Astrophysics Data System (ADS)
Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick
2016-04-01
Pluvial flood nowcasting is gaining ground not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members with a lead time of up to 2 hours using a 4 C-band radar composite as input. Forecasts' verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires to people in the region that were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the larger city of Gent, Belgium. After each of the different above-mentioned components were evaluated, they were combined and tested for recent historical flood events. The rainfall nowcasting, hydraulic sewer and 2D inundation modelling and socio-economical flood risk results each could be partly evaluated: the rainfall nowcasting results based on radar data and rain gauges; the hydraulic sewer model results based on water level and discharge data at pumping stations; the 2D inundation modelling results based on limited data on some recent flood locations and inundation depths; the results for the socio-economical flood consequences of the most extreme events based on claims in the database of the national disaster agency. Different methods for visualization of the probabilistic inundation results are proposed and tested.
Regulatory-Science: Biphasic Cancer Models or the LNT—Not Just a Matter of Biology!
Ricci, Paolo F.; Sammis, Ian R.
2012-01-01
There is no doubt that prudence and risk aversion must guide public decisions when the associated adverse outcomes are either serious or irreversible. With any carcinogen, the levels of risk and needed protection before and after an event occurs, are determined by dose-response models. Regulatory law should not crowd out the actual beneficial effects from low dose exposures—when demonstrable—that are inevitably lost when it adopts the linear non-threshold (LNT) as its causal model. Because regulating exposures requires planning and developing protective measures for future acute and chronic exposures, public management decisions should be based on minimizing costs and harmful exposures. We address the direct and indirect effects of causation when the danger consists of exposure to very low levels of carcinogens and toxicants. The societal consequences of a policy can be deleterious when that policy is based on a risk assumed by the LNT, in cases where low exposures are actually beneficial. Our work develops the science and the law of causal risk modeling: both are interwoven. We suggest how their relevant characteristics differ, but do not attempt to keep them separated; as we demonstrate, this union, however unsatisfactory, cannot be severed. PMID:22740778
Depta, Jeremiah P; Patel, Jayendrakumar S; Novak, Eric; Gage, Brian F; Masrani, Shriti K; Raymer, David; Facey, Gabrielle; Patel, Yogesh; Zajarias, Alan; Lasala, John M; Amin, Amit P; Kurz, Howard I; Singh, Jasvindar; Bach, Richard G
2015-02-21
Although lesions for which revascularization is deferred following fractional flow reserve (FFR) assessment have a low risk of adverse cardiac events, variability in risk for deferred lesion intervention (DLI) has not been previously evaluated. The aim of this study was to develop a prediction model to estimate 1-year risk of DLI for coronary lesions where revascularization was not performed following FFR assessment. A prediction model for DLI was developed from a cohort of 721 patients with 882 coronary lesions where revascularization was deferred based on FFR between 10/2002 and 7/2010. Deferred lesion intervention was defined as any revascularization of a lesion previously deferred following FFR. The final DLI model was developed using stepwise Cox regression and validated using bootstrapping techniques. An algorithm was constructed to predict the 1-year risk of DLI. During a mean (±SD) follow-up period of 4.0 ± 2.3 years, 18% of lesions deferred after FFR underwent DLI; the 1-year incidence of DLI was 5.3%, while the predicted risk of DLI varied from 1 to 40%. The final Cox model included the FFR value, age, current or former smoking, history of coronary artery disease (CAD) or prior percutaneous coronary intervention, multi-vessel CAD, and serum creatinine. The c statistic for the DLI prediction model was 0.66 (95% confidence interval, CI: 0.61-0.70). Patients for whom revascularization is deferred based on FFR vary in their risk of DLI. A clinical prediction model consisting of five clinical variables and the FFR value can help predict the risk of DLI in the first year following FFR assessment. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2014.
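The bootstrap validation mentioned above is typically done by estimating the "optimism" of the apparent discrimination. The sketch below illustrates that idea on simulated data with a logistic model standing in for the authors' Cox model; variables and coefficients are assumptions.

```python
# Sketch of optimism correction by bootstrapping: refit on each resample, compare the
# resample's apparent discrimination with its performance on the original data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n, p = 800, 6
X = rng.normal(size=(n, p))                                   # e.g. FFR value, age, smoking, CAD history, ...
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] * 0.9 - 1.5))))

apparent_model = LogisticRegression().fit(X, y)
apparent_c = roc_auc_score(y, apparent_model.predict_proba(X)[:, 1])

optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)                               # bootstrap resample
    m = LogisticRegression().fit(X[idx], y[idx])
    c_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    c_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(c_boot - c_orig)

print("apparent c:", round(apparent_c, 3),
      "| optimism-corrected c:", round(apparent_c - np.mean(optimism), 3))
```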
The c-index is not proper for the evaluation of $t$-year predicted risks.
Blanche, Paul; Kattan, Michael W; Gerds, Thomas A
2018-02-16
We show that the widely used concordance index for time to event outcome is not proper when interest is in predicting a $t$-year risk of an event, for example 10-year mortality. In the situation with a fixed prediction horizon, the concordance index can be higher for a misspecified model than for a correctly specified model. Impropriety happens because the concordance index assesses the order of the event times and not the order of the event status at the prediction horizon. The time-dependent area under the receiver operating characteristic curve does not have this problem and is proper in this context.
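A small worked example makes the distinction concrete: a score can rank 10-year event status perfectly yet order the event times imperfectly, so the two metrics answer different questions. The toy data below (uncensored, with made-up risks) are an illustration, not the paper's example.

```python
# Minimal sketch: Harrell-type concordance ranks event *times*, whereas a t-year AUC
# ranks event *status* at the horizon t.
import numpy as np
from itertools import combinations
from sklearn.metrics import roc_auc_score

def harrell_c(times, risk):
    num = den = 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] != times[j]:
            den += 1
            num += (risk[i] > risk[j]) == (times[i] < times[j])
    return num / den

times = np.array([2.0, 4.0, 6.0, 9.0, 12.0, 15.0])     # years to event (no censoring)
risk = np.array([0.6, 0.9, 0.7, 0.8, 0.3, 0.2])        # predicted 10-year risks
status_10y = (times <= 10).astype(int)                  # event by the 10-year horizon?

# The score separates 10-year cases from non-cases perfectly (AUC = 1.0),
# yet its concordance over event times is only about 0.73.
print("concordance over event times:", round(harrell_c(times, risk), 2))
print("AUC for 10-year event status:", round(roc_auc_score(status_10y, risk), 2))
```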
NASA Astrophysics Data System (ADS)
Abe, Steffen; Krieger, Lars; Deckert, Hagen
2017-04-01
The changes of fluid pressures related to the injection of fluids into the deep underground, for example during geothermal energy production, can potentially reactivate faults and thus cause induced seismic events. Therefore, an important aspect in the planning and operation of such projects, in particular in densely populated regions such as the Upper Rhine Graben in Germany, is the estimation and mitigation of the induced seismic risk. The occurrence of induced seismicity depends on a combination of hydraulic properties of the underground, mechanical and geometric parameters of the fault, and the fluid injection regime. In this study we are therefore employing a numerical model to investigate the impact of fluid pressure changes on the dynamics of the faults and the resulting seismicity. The approach combines a model of the fluid flow around a geothermal well based on a 3D finite difference discretisation of the Darcy-equation with a 2D block-slider model of a fault. The models are coupled so that the evolving pore pressure at the relevant locations of the hydraulic model is taken into account in the calculation of the stick-slip dynamics of the fault model. Our modelling approach uses two subsequent modelling steps. Initially, the fault model is run by applying a fixed deformation rate for a given duration and without the influence of the hydraulic model in order to generate the background event statistics. Initial tests have shown that the response of the fault to hydraulic loading depends on the timing of the fluid injection relative to the seismic cycle of the fault. Therefore, multiple snapshots of the fault's stress- and displacement state are generated from the fault model. In a second step, these snapshots are then used as initial conditions in a set of coupled hydro-mechanical model runs including the effects of the fluid injection. This set of models is then compared with the background event statistics to evaluate the change in the probability of seismic events. The event data such as location, magnitude, and source characteristics can be used as input for numerical wave propagation models. This allows the translation of seismic event statistics generated by the model into ground shaking probabilities.
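The coupling described above can be caricatured in a few lines: diffuse injection-induced pore pressure outward from a well and check a Coulomb failure criterion on a fault patch. The sketch below is a conceptual 1D stand-in for the authors' 3D Darcy/block-slider coupling, with all parameter values assumed for illustration.

```python
# Conceptual sketch: explicit finite-difference diffusion of pore pressure along a 1D profile,
# with a Coulomb criterion checked on a fault patch 300 m from the injection point.
import numpy as np

nx, dx, dt, days = 200, 10.0, 600.0, 30     # grid spacing (m), time step (s), duration (days)
D = 0.05                                    # hydraulic diffusivity, m^2/s (stable: D*dt/dx^2 <= 0.5)
p = np.zeros(nx)                            # overpressure, Pa
p_inj = 10e6                                # injection overpressure held at the well, Pa

mu, cohesion = 0.6, 0.0                     # fault friction coefficient and cohesion (Pa)
sigma_n, tau = 40e6, 22e6                   # normal and shear stress resolved on the fault, Pa
fault_cell = 30                             # fault patch at 300 m from the well

for _ in range(int(days * 86400 / dt)):
    p[0] = p_inj                            # constant-pressure boundary at the well
    p[1:-1] += D * dt / dx**2 * (p[2:] - 2 * p[1:-1] + p[:-2])

slips = tau > cohesion + mu * (sigma_n - p[fault_cell])    # effective-stress Coulomb check
print("pressure at fault (MPa):", round(p[fault_cell] / 1e6, 2), "| Coulomb failure:", slips)
```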
Analyses of risks associated with radiation exposure from past major solar particle events
NASA Technical Reports Server (NTRS)
Weyland, Mark D.; Atwell, William; Cucinotta, Francis A.; Wilson, John W.; Hardy, Alva C.
1991-01-01
Radiation exposures and cancer induction/mortality risks were investigated for several major solar particle events (SPEs). The SPEs included are: February 1956, November 1960, August 1972, October 1989, and the September, August, and October 1989 events combined. The three 1989 events were treated as one since all three could affect a single lunar or Mars mission. A baryon transport code was used to propagate particles through aluminum and tissue shield materials. A free space environment was utilized for all calculations. Results show the 30-day blood forming organs (BFO) limit of 25 rem was surpassed by all five events using 10 g/sq cm of shielding. The BFO limit is based on a depth dose at 5 cm of tissue, while here a more detailed shield distribution of the BFOs was utilized. A comparison between the 5 cm depth dose and the dose found using the BFO shield distribution shows that the 5 cm depth value is slightly higher than the BFO dose. The annual limit of 50 rem was exceeded by the August 1972, October 1989, and the three combined 1989 events with 5 g/sq cm of shielding. Cancer mortality risks ranged from 1.5 to 17 percent at 1 g/sq cm and 0.5 to 1.1 percent behind 10 g/sq cm of shielding for the five events. These ranges correspond to those for a 45-year-old male. It is shown that secondary particles comprise about 1/3 of the total risk at 10 g/sq cm of shielding. When a computerized Space Shuttle shielding model was used to represent a typical spacecraft configuration in free space during the August 1972 SPE, average crew doses exceeded the BFO dose limit.
Atherosclerosis profile and incidence of cardiovascular events: a population-based survey.
Robinson, Jennifer G; Fox, Kathleen M; Bullano, Michael F; Grandy, Susan
2009-09-15
Atherosclerosis is a chronic progressive disease often presenting as clinical cardiovascular disease (CVD) events. This study evaluated the characteristics of individuals with a diagnosis of atherosclerosis and estimated the incidence of CVD events to assist in the early identification of high-risk individuals. Respondents to the US SHIELD baseline survey were followed for 2 years to observe incident self-reported CVD. Respondents had subclinical atherosclerosis if they reported a diagnosis of narrow or blocked arteries/carotid artery disease without a past clinical CVD event (heart attack, stroke or revascularization). Characteristics of those with atherosclerosis and incident CVD were compared with those who did not report atherosclerosis at baseline but had CVD in the following 2 years using chi-square tests. Logistic regression model identified characteristics associated with atherosclerosis and incident events. Of 17,640 respondents, 488 (2.8%) reported having subclinical atherosclerosis at baseline. Subclinical atherosclerosis was associated with age, male gender, dyslipidemia, circulation problems, hypertension, past smoker, and a cholesterol test in past year (OR = 2.2) [all p < 0.05]. Incident CVD was twice as high in respondents with subclinical atherosclerosis (25.8%) as in those without atherosclerosis or clinical CVD (12.2%). In individuals with subclinical atherosclerosis, men (RR = 1.77, p = 0.050) and individuals with circulation problems (RR = 2.36, p = 0.003) were at greatest risk of experiencing CVD events in the next 2 years. Self-report of subclinical atherosclerosis identified an extremely high-risk group with a >25% risk of a CVD event in the next 2 years. These characteristics may be useful for identifying individuals for more aggressive diagnostic and therapeutic efforts.
Li, Wen; Zhao, Li-Zhong; Ma, Dong-Wang; Wang, De-Zheng; Shi, Lei; Wang, Hong-Lei; Dong, Mo; Zhang, Shu-Yi; Cao, Lei; Zhang, Wei-Hua; Zhang, Xi-Peng; Zhang, Qing-Huai; Yu, Lin; Qin, Hai; Wang, Xi-Mo; Chen, Sam Li-Sheng
2018-05-01
We aimed to predict colorectal cancer (CRC) based on the demographic features and clinical correlates of personal symptoms and signs from Tianjin community-based CRC screening data. A total of 891,199 residents who were aged 60 to 74 and were screened in 2012 were enrolled. The Lasso logistic regression model was used to identify the predictors for CRC. Predictive validity was assessed by the receiver operating characteristic (ROC) curve. A bootstrapping method was also performed to validate this prediction model. CRC was best predicted by a model that included age, sex, education level, occupation, diarrhea, constipation, colon mucosa and bleeding, gallbladder disease, a stressful life event, family history of CRC, and a positive fecal immunochemical test (FIT). The area under the curve (AUC) for the questionnaire with a FIT was 84% (95% CI: 82%-86%), followed by 76% (95% CI: 74%-79%) for a FIT alone, and 73% (95% CI: 71%-76%) for the questionnaire alone. With 500 bootstrap replications, the estimated optimism (<0.005) indicated good discrimination in the validation of the prediction model. A risk prediction model for CRC based on a series of symptoms and signs related to enteric diseases, in combination with a FIT, was developed from the first round of screening. The results of the current study are useful for increasing the awareness of high-risk subjects and for individual-risk-guided invitations or strategies to achieve mass screening for CRC.
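The modeling step described above, an L1-penalized (lasso) logistic regression combining questionnaire covariates with a FIT result and evaluated by ROC AUC, can be sketched as follows. The data, variable names, and coefficients are simulated illustrations, not the Tianjin screening data.

```python
# Sketch of a lasso logistic risk model with AUC evaluation on a held-out split.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 50_000
age = rng.uniform(60, 74, n)
fit_positive = rng.binomial(1, 0.07, n)
family_history = rng.binomial(1, 0.05, n)
constipation = rng.binomial(1, 0.15, n)
lp = -7.5 + 0.04 * age + 2.2 * fit_positive + 0.8 * family_history + 0.3 * constipation
y = rng.binomial(1, 1 / (1 + np.exp(-lp)))                 # simulated CRC outcomes

X = np.column_stack([age, fit_positive, family_history, constipation])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_tr, y_tr)
print("held-out AUC:", round(roc_auc_score(y_te, lasso.predict_proba(X_te)[:, 1]), 3))
```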
Lin, C Huie; Hegde, Sanjeet; Marshall, Audrey C; Porras, Diego; Gauvreau, Kimberlee; Balzer, David T; Beekman, Robert H; Torres, Alejandro; Vincent, Julie A; Moore, John W; Holzer, Ralf; Armsby, Laurie; Bergersen, Lisa
2014-01-01
Continued advancements in congenital cardiac catheterization and interventions have resulted in increased patient and procedural complexity. Anticipation of life-threatening events and required rescue measures is a critical component of preprocedural preparation. We sought to determine the incidence and nature of life-threatening adverse events in congenital and pediatric cardiac catheterization, risk factors, and resources necessary to anticipate and manage events. Data from 8905 cases performed at the 8 participating institutions of the Congenital Cardiac Catheterization Project on Outcomes were captured between 2007 and 2010 [median 1,095/site (range 133-3,802)]. The incidence of all life-threatening events was 2.1 % [95 % confidence interval (CI) 1.8-2.4 %], whereas mortality was 0.28 % (95 % CI 0.18-0.41 %). Fifty-seven life-threatening events required cardiopulmonary resuscitation, whereas 9 % required extracorporeal membrane oxygenation. Use of a risk adjustment model showed that age <1 year [odds ratio (OR) 1.9, 95 % CI 1.4-2.7, p < 0.001], hemodynamic vulnerability (OR 1.6, 95 % CI 1.1-2.3, p < 0.01), and procedure risk (category 3: OR 2.3, 95 % CI 1.3-4.1; category 4: OR 4.2, 95 % CI 2.4-7.4) were predictors of life-threatening events. Using this model, standardized life-threatening event ratios were calculated, showing that one institution had a life-threatening event rate greater than expected. Congenital cardiac catheterization and intervention can be performed safely with a low rate of life-threatening events and mortality; preprocedural evaluation of risk may optimize preparation of emergency rescue and bailout procedures. Risk predictors (age < 1, hemodynamic vulnerability, and procedure risk category) can enhance preprocedural patient risk stratification and planning.
Effects of protection forests on rockfall risks: implementation in the Swiss risk concept
NASA Astrophysics Data System (ADS)
Trappmann, Daniel; Moos, Christine; Fehlmann, Michael; Ernst, Jacqueline; Sandri, Arthur; Dorren, Luuk; Stoffel, Markus
2016-04-01
Forests growing on slopes below active rockfall cliffs can provide effective protection for human lives and infrastructures. The risk-based approach for natural hazards in Switzerland shall take such biological measures just like existing technical protective measures into account, provided that certain criteria regarding condition, maintenance and durability are met. This contribution describes a project in which we are investigating how the effects of protection forests can be considered in rockfall risk analyses in an appropriate way. In principle, protection forests reduce rockfall risks in three different ways: (i) reduction of the event magnitude (energy) due to collisions with tree stems; (ii) reduction of frequency of occurrence of a given scenario (block volume arriving at the damage potential); (iii) reduction of spatial probability of occurrence (spread and runout) of a given scenario in case of multiple fragments during one event. The aim of this work is to develop methods for adequately implementing these three effects of rockfall protection forests in risk calculations. To achieve this, we use rockfall simulations taking collisions with trees into account and detailed field validation. On five test sites, detailed knowledge on past rockfall activity is gathered by combining investigations of impacted trees, analysis of documented historical events, and deposits in the field. Based on this empirical data on past rockfalls, a methodology is developed that allows transferring real past rockfall activity to simulation results obtained with the three-dimensional, process-based model Rockyfor3D. Different ways of quantifying the protective role of forests will be considered by comparing simulation results with and without forest cover. Combining these different research approaches, systematic considerations shall lead to the development of methods for adequate inclusion of the protective effects of forests in risk calculations. The applicability of the developed methods will be tested on the case study slopes in order to ensure practical applicability to a broad range of rockfall situations on forested slopes.
Risk and the physics of clinical prediction.
McEvoy, John W; Diamond, George A; Detrano, Robert C; Kaul, Sanjay; Blaha, Michael J; Blumenthal, Roger S; Jones, Steven R
2014-04-15
The current paradigm of primary prevention in cardiology uses traditional risk factors to estimate future cardiovascular risk. These risk estimates are based on prediction models derived from prospective cohort studies and are incorporated into guideline-based initiation algorithms for commonly used preventive pharmacologic treatments, such as aspirin and statins. However, risk estimates are more accurate for populations of similar patients than they are for any individual patient. It may be hazardous to presume that the point estimate of risk derived from a population model represents the most accurate estimate for a given patient. In this review, we exploit principles derived from physics as a metaphor for the distinction between predictions regarding populations versus patients. We identify the following: (1) predictions of risk are accurate at the level of populations but do not translate directly to patients, (2) perfect accuracy of individual risk estimation is unobtainable even with the addition of multiple novel risk factors, and (3) direct measurement of subclinical disease (screening) affords far greater certainty regarding the personalized treatment of patients, whereas risk estimates often remain uncertain for patients. In conclusion, shifting our focus from prediction of events to detection of disease could improve personalized decision-making and outcomes. We also discuss innovative future strategies for risk estimation and treatment allocation in preventive cardiology. Copyright © 2014 Elsevier Inc. All rights reserved.
Willis, Michael; Asseburg, Christian; Nilsson, Andreas; Johnsson, Kristina; Kartman, Bernt
2017-03-01
Type 2 diabetes mellitus (T2DM) is chronic and progressive, and the cost-effectiveness of new treatment interventions must be established over long time horizons. Given the limited durability of drugs, assumptions regarding downstream rescue medication can drive results. Especially for insulin, for which treatment effects and adverse events are known to depend on patient characteristics, this can be problematic for health economic evaluation involving modeling. To estimate parsimonious multivariate equations of treatment effects and hypoglycemic event risks for use in parameterizing insulin rescue therapy in model-based cost-effectiveness analysis. Clinical evidence for insulin use in T2DM was identified in PubMed and from published reviews and meta-analyses. Study and patient characteristics and treatment effects and adverse event rates were extracted, and the data were used to estimate parsimonious treatment effect and hypoglycemic event risk equations using multivariate regression analysis. Data from 91 studies featuring 171 usable study arms were identified, mostly for premix and basal insulin types. Multivariate prediction equations for glycated hemoglobin A1c lowering and weight change were estimated separately for insulin-naive and insulin-experienced patients. Goodness of fit (R²) for both outcomes was generally good, ranging from 0.44 to 0.84. Multivariate prediction equations for symptomatic, nocturnal, and severe hypoglycemic events were also estimated, though considerable heterogeneity in definitions limits their usefulness. Parsimonious and robust multivariate prediction equations were estimated for glycated hemoglobin A1c and weight change, separately for insulin-naive and insulin-experienced patients. Using these in economic simulation modeling in T2DM can improve realism and flexibility in modeling insulin rescue medication. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Markov chains and semi-Markov models in time-to-event analysis.
Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J
2013-10-25
A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
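As a concrete illustration of the idea above, the sketch below sets up a discrete-time Markov chain with transient states and two competing absorbing outcomes and iterates the state occupancy forward; the states and transition probabilities are illustrative assumptions.

```python
# Minimal sketch: a discrete-time Markov chain for time-to-event analysis with competing outcomes.
import numpy as np

states = ["healthy", "impaired", "dementia", "death"]
P = np.array([
    [0.90, 0.06, 0.01, 0.03],   # from healthy
    [0.05, 0.80, 0.10, 0.05],   # from impaired
    [0.00, 0.00, 0.95, 0.05],   # from dementia (absorbing apart from death)
    [0.00, 0.00, 0.00, 1.00],   # death (absorbing)
])
assert np.allclose(P.sum(axis=1), 1.0)    # each row is a valid transition distribution

dist = np.array([1.0, 0.0, 0.0, 0.0])     # cohort starts healthy
for year in range(1, 21):
    dist = dist @ P                        # state occupancy after `year` annual cycles

print(dict(zip(states, np.round(dist, 3))))   # 20-year occupancy, incl. cumulative incidence of each outcome
```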
Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions
NASA Astrophysics Data System (ADS)
Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.
2016-02-01
The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time and space variable individual vessel accident risk levels and shoreline contamination risk from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching shoreline and its environmental and socio-economic vulnerabilities. The oil reaching shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels along time, or as an alternative, a correction factor based on vessel distance from coast. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate the risk properly sensitive to dynamic metocean conditions and to oil transport behaviour. The integration of meteo-oceanic + oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances the maritime situational awareness and the decision support model, providing a more realistic approach in the assessment of shoreline impacts. The risk assessment from historical data can help finding typical risk patterns ("hot spots") or developing sensitivity analysis to specific conditions, whereas real-time risk levels can be used in the prioritization of individual ships, geographical areas, strategic tug positioning and implementation of dynamic risk-based vessel traffic monitoring.
Binder, Harald; Porzelius, Christine; Schumacher, Martin
2011-03-01
Analysis of molecular data promises identification of biomarkers for improving prognostic models, thus potentially enabling better patient management. For identifying such biomarkers, risk prediction models can be employed that link high-dimensional molecular covariate data to a clinical endpoint. In low-dimensional settings, a multitude of statistical techniques already exists for building such models, e.g. allowing for variable selection or for quantifying the added value of a new biomarker. We provide an overview of techniques for regularized estimation that transfer this toward high-dimensional settings, with a focus on models for time-to-event endpoints. Techniques for incorporating specific covariate structure are discussed, as well as techniques for dealing with more complex endpoints. Employing gene expression data from patients with diffuse large B-cell lymphoma, some typical modeling issues from low-dimensional settings are illustrated in a high-dimensional application. First, the performance of classical stepwise regression is compared to stage-wise regression, as implemented by a component-wise likelihood-based boosting approach. A second issue arises when the response is artificially transformed into a binary variable. The effects of the resulting loss of efficiency and potential bias in a high-dimensional setting are illustrated, and a link to competing risks models is provided. Finally, we discuss conditions for adequately quantifying the added value of high-dimensional gene expression measurements, both at the stage of model fitting and when performing evaluation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High resolution tsunami modelling for the evaluation of potential risk areas in Setúbal (Portugal)
NASA Astrophysics Data System (ADS)
Ribeiro, J.; Silva, A.; Leitão, P.
2011-08-01
The use of high resolution hydrodynamic modelling to simulate the potential effects of tsunami events can provide relevant information about the most probable inundation areas. Moreover, considering complementary data such as building types, the location of priority equipment and road types enables mapping of the most vulnerable zones, computation of the expected damage to man-made structures, better definition of rescue areas and escape routes, adaptation of emergency plans and proper evaluation of the vulnerability associated with different areas and/or equipment. Such an approach was used to evaluate the specific risks associated with a potential occurrence of a tsunami event in the region of Setúbal (Portugal), which was one of the areas most seriously affected by the 1755 tsunami. In order to evaluate the hazard associated with the occurrence of a similar event, high resolution wave propagation simulations were performed considering different potential earthquake sources with different magnitudes. Based on these simulations, detailed inundation maps associated with the different events were produced. These results were combined with the available information on the vulnerability of the local infrastructures (building types, road and street characteristics, priority buildings) to produce large-scale potential damage maps and maps of escape and emergency routes.
Model for Solar Proton Risk Assessment
NASA Technical Reports Server (NTRS)
Xapos, M. A.; Stauffer, C.; Gee, G. B.; Barth, J. L.; Stassinopoulos, E. G.; McGuire, R. E.
2004-01-01
A statistical model for cumulative solar proton event fluences during space missions is presented that covers both the solar minimum and solar maximum phases of the solar cycle. It is based on data from the IMP and GOES series of satellites, integrated so that the best features of each data set can be exploited. This allows fluence-energy spectra to be extended out to energies of 327 MeV.
Assessment of Costs for a Global Climate Fund Against Public Sector Disaster Risks
NASA Astrophysics Data System (ADS)
Hochrainer-Stigler, Stefan; Mechler, Reinhard; Pflug, Georg; Williges, Keith
2013-04-01
National governments are key actors in managing climate variability and change, yet many countries, faced with exhausted tax bases, high levels of indebtedness and limited donor assistance, have been unable to raise sufficient and timely capital to replace or repair damaged assets and restore livelihoods following major disasters, exacerbating the impacts of disaster shocks on poverty and development. For weather extremes, which form a subset of the adaptation challenge and are expected to increase in intensity and frequency with a changing climate, we conduct an assessment of the costs of managing and financing today's public sector risks on a global scale for more than 180 countries. A country's financial vulnerability is defined as a function of its financial resilience and its exposure to disaster risk. While disaster risk is estimated in terms of asset loss distributions based on catastrophe modeling approaches, financial resilience is operationalized as the public sector's ability to pay for relief to the affected population and support the reconstruction of affected assets and infrastructure for a given event. We consider governments financially vulnerable to disasters if they cannot access sufficient funding after a disaster to cover their liabilities. We operationalize this concept by the term resource gap, which we define as the net loss associated with a disaster event after exhausting all possible ex post and ex ante financing sources. Extending this approach to all possible disaster events, the risk that a resource gap will occur over a given time span can be calculated for each country individually, and depending on the risk level, different risk instruments may have to be applied. Furthermore, our estimates may inform decisions pertaining to a "climate insurance fund" absorbing "high level" country risks exceeding the ability of any given country to pay in the case of an extreme event. Our estimates relate to today's climate, yet we suggest that estimates of current climate variability and related risks, although also associated with substantial uncertainty, can be interpreted as a baseline for very uncertain future projections.
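A minimal sketch of the resource-gap idea described above, assuming a simple list of financing sources and an illustrative loss distribution; none of the figures below come from the study.

```python
# Minimal sketch (illustrative numbers only): the "resource gap" for a single
# disaster event, and the annual probability of any gap -- losses are covered
# first by ex-ante and ex-post sources, and the remainder is the gap.
import numpy as np

def resource_gap(loss, reserves, insurance_payout, credit_capacity, aid):
    """Net loss remaining after exhausting ex-ante and ex-post financing."""
    financing = reserves + insurance_payout + credit_capacity + aid
    return max(0.0, loss - financing)

def gap_probability(loss_samples, **financing):
    """Probability that simulated annual losses exceed available financing."""
    gaps = np.array([resource_gap(l, **financing) for l in loss_samples])
    return (gaps > 0).mean()

rng = np.random.default_rng(1)
# Hypothetical annual loss distribution from a catastrophe model (lognormal, $m)
losses = rng.lognormal(mean=4.0, sigma=1.2, size=100_000)

print(gap_probability(losses, reserves=30.0, insurance_payout=50.0,
                      credit_capacity=80.0, aid=20.0))
```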
Quantitative assessment of changes in landslide risk using a regional scale run-out model
NASA Astrophysics Data System (ADS)
Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone
2015-04-01
The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve developed specifically for our case study area were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) by the building area and the number of floors. The risk was calculated by multiplying the vulnerability with the spatial probability and the building values. Changes in landslide risk were assessed using the loss estimates of four different periods: (1) pre-August 2003 disaster, (2) the August 2003 event, (3) post-August 2003 to 2011 and (4) smaller frequent events occurring over the entire 1996-2011 period. One of the major findings of our work was the calculation of a significant decrease in landslide risk after the 2003 disaster compared to the pre-disaster risk period. This indicates the importance of re-estimating risk a few years after a major event in order to avoid overestimation or exaggeration of future losses.
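The per-building risk calculation described above (vulnerability times spatial probability times building value, with the value range derived from land-use price, footprint and floors) can be written compactly; the numbers below are placeholders, not values from the Valtellina case study.

```python
# Minimal sketch (hypothetical values): per-building debris-flow risk as the
# product of vulnerability, spatial probability, and building value, with the
# value range from land-use price, footprint area and number of floors.
def building_value(price_per_m2, footprint_m2, n_floors):
    return price_per_m2 * footprint_m2 * n_floors

def building_risk(vulnerability, spatial_probability, value_eur):
    """Expected loss for one building and one scenario."""
    return vulnerability * spatial_probability * value_eur

# One building in a cadastral zone priced at 900-1400 EUR/m2 (illustrative)
v_min = building_value(900.0, footprint_m2=120.0, n_floors=2)
v_max = building_value(1400.0, footprint_m2=120.0, n_floors=2)

# Vulnerability from a flow-depth curve (e.g. 1.2 m depth -> 0.35), spatial
# probability from the run-out model
print(building_risk(0.35, 0.6, v_min), building_risk(0.35, 0.6, v_max))
```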
NASA Astrophysics Data System (ADS)
Bozza, Andrea; Durand, Arnaud; Allenbach, Bernard; Confortola, Gabriele; Bocchiola, Daniele
2013-04-01
We present a feasibility study exploring the potential of high-resolution imagery coupled with hydraulic flood modeling to predict flooding risks, applied to the case study of the Gonaives basins (585 km²), Haiti. We propose a methodology working at different scales, providing accurate results and a faster intervention during extreme flood events. The island of Hispaniola, in the Caribbean tropical zone, is often affected by extreme flood events. Floods are caused by tropical storms and hurricanes and may lead to several damages, including cholera epidemics, as recently occurred in the wake of the earthquake of January 12th, 2010 (magnitude 7.0). Flood studies based upon hydrological and hydraulic modeling are hampered by an almost complete lack of ground data. Hence, given the considerable cost involved in organizing field measurement campaigns, remote sensing data need to be exploited. HEC-RAS 1D modeling is carried out under different scenarios of available Digital Elevation Models. The DEMs are generated using optical remote sensing satellite imagery (WorldView-1) and SRTM, combined with information from an open source database (Open Street Map). We study two recent flood episodes for which flood maps from remote sensing were available. Flood extent and land use were assessed using data from the SPOT-5 satellite, after hurricane Jeanne in 2004 and hurricane Hanna in 2008. A semi-distributed, DEM-based hydrological model is used to simulate flood flows during the hurricanes. Precipitation input is taken from daily rainfall data derived from the TRMM satellite, plus proper downscaling. The hydraulic model is calibrated using floodplain friction as a tuning parameter against the observed flooded area. We compare different scenarios of flood simulation and the predictive power of model calibration. The method provides acceptable results in depicting flooded areas, especially considering the tremendous lack of ground data, and shows the potential of remote sensing information for the prediction of flood events in this area, for the purpose of risk assessment and land use planning, and possibly for flood forecasting during extreme events.
NASA Human Research Program Space Radiation Program Element
NASA Technical Reports Server (NTRS)
Chappell, Lori; Huff, Janice; Patel, Janapriya; Wang, Minli; Hu, Shaowwen; Kidane, Yared; Myung-Hee, Kim; Li, Yongfeng; Nounu, Hatem; Plante, Ianik;
2013-01-01
The goal of the NASA Human Research Program's Space Radiation Program Element is to ensure that crews can safely live and work in the space radiation environment. Current work is focused on developing the knowledge base and tools required for accurate assessment of health risks resulting from space radiation exposure including cancer and circulatory and central nervous system diseases, as well as acute risks from solar particle events. Division of Space Life Sciences (DSLS) Space Radiation Team scientists work at multiple levels to advance this goal, with major projects in biological risk research; epidemiology; and physical, biophysical, and biological modeling.
Modeling Payload Stowage Impacts on Fire Risks On-Board the International Space Station
NASA Technical Reports Server (NTRS)
Anton, Kellie E.; Brown, Patrick F.
2010-01-01
The purpose of this presentation is to determine the risks of fire on-board the ISS due to non-standard stowage. ISS stowage is constantly being reexamined for optimality. Non-standard stowage involves stowing items outside of rack drawers; fire risk is a key concern and is heavily mitigated. A methodology is needed to capture and account for the fire risk introduced by non-standard stowage. The contents include: 1) Fire Risk Background; 2) General Assumptions; 3) Modeling Techniques; 4) Event Sequence Diagram (ESD); 5) Qualitative Fire Analysis; 6) Sample Qualitative Results for Fire Risk; 7) Qualitative Stowage Analysis; 8) Sample Qualitative Results for Non-Standard Stowage; and 9) Quantitative Analysis Basic Event Data.
Risk of Cardiac Events Associated With Antidepressant Therapy in Patients With Long QT Syndrome.
Wang, Meng; Szepietowska, Barbara; Polonsky, Bronislava; McNitt, Scott; Moss, Arthur J; Zareba, Wojciech; Auerbach, David S
2018-01-15
Patients with long QT syndrome (LQTS) are at a high risk of cardiac events. Many patients with LQTS are treated with antidepressant drugs (ADs). We investigated the LQTS genotype-specific risk of recurrent cardiac arrhythmic events (CAEs) associated with AD therapy. The study included 59 LQT1 and 72 LQT2 patients from the Rochester-based LQTS Registry with corrected QT (QTc) prolongation and a history of AD therapy. Using multivariate Andersen-Gill models, we estimated the LQTS genotype-specific risk of recurrent CAEs (ventricular tachyarrhythmias, aborted cardiac arrest, or sudden cardiac death) associated with time-dependent ADs. Specifically, we examined the risk associated with all ADs, selective serotonin reuptake inhibitors (SSRIs), and ADs classified on the CredibleMeds list (www.CredibleMeds.org) as "Conditional" or "Known risk of Torsades de pointes (TdP)." After adjusting for baseline QTc duration, sex, and time-dependent beta-blocker usage, there was an increased risk of recurrent CAEs associated with ADs in LQT1 patients (hazard ratio = 3.67, 95% confidence interval 1.98-6.82, p < 0.001) but not in LQT2 patients (hazard ratio = 0.89, 95% confidence interval 0.49-1.64, p = 0.716; LQT1 vs LQT2 interaction, p < 0.001). Similarly, LQT1 patients who were on SSRIs or ADs with "Known risk of TdP" had a higher risk of recurrent CAEs than those patients off all ADs, whereas there was no association in LQT2 patients. ADs with "Conditional risk of TdP" were not associated with the risk of recurrent CAEs in any of the groups. In conclusion, the risk of recurrent CAEs associated with time-dependent ADs is higher in LQT1 patients but not in LQT2 patients. Results suggest an LQTS genotype-specific effect of ADs on the risk of arrhythmic events. Copyright © 2017 Elsevier Inc. All rights reserved.
Perel, P; Prieto-Merino, D; Shakur, H; Roberts, I
2013-06-01
Severe bleeding accounts for about one-third of in-hospital trauma deaths. Patients with a high baseline risk of death have the most to gain from the use of life-saving treatments. An accurate and user-friendly prognostic model to predict mortality in bleeding trauma patients could assist doctors and paramedics in pre-hospital triage and could shorten the time to diagnostic and life-saving procedures such as surgery and tranexamic acid (TXA). The aim of the study was to develop and validate a prognostic model for early mortality in patients with traumatic bleeding and to examine whether or not the effect of TXA on the risk of death and thrombotic events in bleeding adult trauma patients varies according to baseline risk. Multivariable logistic regression and risk-stratified analysis of a large international cohort of trauma patients. Two hundred and seventy-four hospitals in 40 high-, medium- and low-income countries. We derived prognostic models in a large placebo-controlled trial of the effects of early administration of a short course of TXA [Clinical Randomisation of an Antifibrinolytic in Significant Haemorrhage (CRASH-2) trial]. The trial included 20,127 trauma patients with, or at risk of, significant bleeding, within 8 hours of injury. We externally validated the model on 14,220 selected trauma patients from the Trauma Audit and Research Network (TARN), which included mainly patients from the UK. We examined the effect of TXA on all-cause mortality, death due to bleeding and thrombotic events (fatal and non-fatal myocardial infarction, stroke, deep-vein thrombosis and pulmonary embolism) within risk strata in the CRASH-2 trial data set and we estimated the proportion of premature deaths averted by applying the odds ratio (OR) from the CRASH-2 trial to each of the risk strata in TARN. For the stratified analysis according to baseline risk we considered the intervention TXA (1 g over 10 minutes followed by 1 g over 8 hours) or matching placebo. For the prognostic models we included predictors for death in hospital within 4 weeks of injury. For the stratified analysis we reported ORs for all causes of death, death due to bleeding, and fatal and non-fatal thrombotic events associated with the use of TXA according to baseline risk. A total of 3076 (15%) patients died in the CRASH-2 trial and 1705 (12%) in the TARN data set. Glasgow Coma Scale score, age and systolic blood pressure were the strongest predictors of mortality. Discrimination and calibration were satisfactory, with C-statistics > 0.80 in both the CRASH-2 trial and TARN data sets. A simple chart was constructed to readily provide the probability of death at the point of care, while a web-based calculator is available for a more detailed risk assessment. TXA reduced all-cause mortality and death due to bleeding in each stratum of baseline risk. There was no evidence of heterogeneity in the effect of TXA on all-cause mortality (p-value for interaction = 0.96) or death due to bleeding (p = 0.98). There was a significant reduction in the odds of fatal and non-fatal thrombotic events with TXA (OR = 0.69, 95% confidence interval 0.53 to 0.89; p = 0.005). There was no evidence of heterogeneity in the effect of TXA on the risk of thrombotic events (p = 0.74). This prognostic model can be used to obtain valid predictions of mortality in patients with traumatic bleeding. TXA can be administered safely to a wide spectrum of bleeding trauma patients and should not be restricted to the most severely injured.
Future research should evaluate whether or not the use of this prognostic model in clinical practice has an impact on the management and outcomes of trauma patients.
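A minimal sketch of the kind of three-predictor logistic prognostic model described above (Glasgow Coma Scale, age, systolic blood pressure), fitted to simulated data; the coefficients and data are illustrative and do not reproduce the published CRASH-2/TARN model.

```python
# Minimal sketch: a three-predictor logistic model for early trauma mortality
# (GCS, age, systolic blood pressure), fitted on simulated data. The coefficients
# and data are illustrative only, not the published model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
gcs = rng.integers(3, 16, n)           # Glasgow Coma Scale, 3-15
age = rng.uniform(16, 90, n)           # years
sbp = rng.uniform(60, 180, n)          # mmHg

# Simulated "true" risk: lower GCS, older age, lower SBP -> higher mortality
lp = -2.0 - 0.35 * (gcs - 15) + 0.03 * (age - 40) - 0.02 * (sbp - 120)
died = rng.binomial(1, 1 / (1 + np.exp(-lp)))

X = np.column_stack([gcs, age, sbp])
model = LogisticRegression().fit(X, died)

# Point-of-care style prediction: GCS 6, 70 years, SBP 80 mmHg
print(model.predict_proba([[6, 70, 80]])[0, 1])
```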
Microbial risk assessment in heterogeneous aquifers: 2. Infection risk sensitivity
NASA Astrophysics Data System (ADS)
Molin, S.; Cvetkovic, V.; Stenström, T. A.
2010-05-01
The entire chain of events of human disease transmitted through contaminated water, from pathogen introduction into the source (E. coli, rotavirus, and Hepatitis A), pathogen migration through the aquifer pathway, to ingestion via a supply well, and finally, the potential infection in the human host, is investigated. The health risk calculations are based on a relevant hazardous event with safe setback distances estimated by considering the infection risk from peak exposure in compliance with an acceptable level defined by a regulatory agency. A site-specific hypothetical scenario is illustrated for an aquifer with similar characteristics as the Cape Cod site, Massachusetts (United States). Relatively large variation of safe distances for the three index pathogens is found; individually, none of the index pathogens could predict the safe distance under the wide range of conditions investigated. It is shown that colloid filtration theory (CFT) with spatially variable attachment-detachment rates yields significantly different results from the effective CFT model (i.e., assuming spatially constant parameters).
Market-implied spread for earthquake CAT bonds: financial implications of engineering decisions.
Damnjanovic, Ivan; Aslan, Zafer; Mander, John
2010-12-01
In the event of natural and man-made disasters, owners of large-scale infrastructure facilities (assets) need contingency plans to effectively restore the operations within the acceptable timescales. Traditionally, the insurance sector provides the coverage against potential losses. However, there are many problems associated with this traditional approach to risk transfer including counterparty risk and litigation. Recently, a number of innovative risk mitigation methods, termed alternative risk transfer (ART) methods, have been introduced to address these problems. One of the most important ART methods is catastrophe (CAT) bonds. The objective of this article is to develop an integrative model that links engineering design parameters with financial indicators including spread and bond rating. The developed framework is based on a four-step structural loss model and transformed survival model to determine expected excess returns. We illustrate the framework for a seismically designed bridge using two unique CAT bond contracts. The results show a nonlinear relationship between engineering design parameters and market-implied spread. © 2010 Society for Risk Analysis.
A Systems Modeling Approach for Risk Management of Command File Errors
NASA Technical Reports Server (NTRS)
Meshkat, Leila
2012-01-01
Commanding errors are often (but not always) caused by problems with procedures: lack of maturity in the processes, incomplete requirements, or lack of compliance with these procedures. Other causes of commanding errors include lack of understanding of system states, inadequate communication, and making hasty changes to standard procedures in response to an unexpected event. In general, it is important to look at the big picture prior to taking corrective actions. In the case of errors traced back to procedures, considering the reliability of the process as a metric during its design may help to reduce risk. This metric is obtained by using data from the nuclear industry regarding human reliability. A structured method for the collection of anomaly data will help the operator think systematically about the anomaly and facilitate risk management. Formal models can be used for risk-based design and risk management. A generic set of models can be customized for a broad range of missions.
Tsao, Connie W; Gona, Philimon N; Salton, Carol J; Chuang, Michael L; Levy, Daniel; Manning, Warren J; O’Donnell, Christopher J
2015-01-01
Background Elevated left ventricular mass index (LVMI) and concentric left ventricular (LV) remodeling are related to adverse cardiovascular disease (CVD) events. The utility of LV concentric remodeling and LV mass in the prediction of CVD events is not well characterized. Methods and Results Framingham Heart Study Offspring Cohort members without prevalent CVD (n=1715, 50% men, aged 65±9 years) underwent cardiovascular magnetic resonance for LVMI and geometry (2002–2006) and were prospectively followed for incident CVD (myocardial infarction, coronary insufficiency, heart failure, stroke) or CVD death. Over 13 808 person-years of follow-up (median 8.4, range 0.0 to 10.5 years), 85 CVD events occurred. In multivariable-adjusted proportional hazards regression models, each 10-g/m2 increment in LVMI and each 0.1-unit increment in relative wall thickness were associated with 33% and 59% increased risk for CVD, respectively (P=0.004 and P=0.009, respectively). The association between LV mass/LV end-diastolic volume and incident CVD was borderline significant (P=0.053). Multivariable-adjusted risk reclassification models showed a modest improvement in CVD risk prediction with the incorporation of cardiovascular magnetic resonance LVMI and measures of LV concentricity (C-statistic 0.71 [95% CI 0.65 to 0.78] for the model with traditional risk factors only, improved to 0.74 [95% CI 0.68 to 0.80] for the risk factor model additionally including LVMI and relative wall thickness). Conclusions Among adults free of prevalent CVD in the community, greater LVMI and LV concentric hypertrophy are associated with a marked increase in adverse incident CVD events. The potential benefit of aggressive primary prevention to modify LV mass and geometry in these adults requires further investigation. PMID:26374295
Integration of expert knowledge and uncertainty in natural risk assessment
NASA Astrophysics Data System (ADS)
Baruffini, Mirko; Jaboyedoff, Michel
2010-05-01
Natural hazards occurring in alpine regions during the last decades, such as interruptions of the Swiss railway power supply and closures of the Gotthard highway, have increased awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge, experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge to make qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in analyzing complex systems and decisions. Uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probabilities based on triangular probability density functions (T-PDFs), which can follow the same flow chart as FR. We implemented the Swiss natural hazard recommendations with FR and with T-PDF-based probabilities in order to obtain hazard zoning and uncertainties. We followed the same approach for each term of risk, i.e. hazard, vulnerability, elements at risk and exposure. This risk approach can be achieved through a comprehensive use of several artificial intelligence (AI) technologies, for example: (1) GIS techniques; (2) FR or T-PDF for qualitatively predicting risks for possible review results; and (3) a multi-criteria evaluation for analyzing weak points. The main advantages of FR or T-PDF include the ability to express not fully formalized knowledge, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach points out a quite wide zone of uncertainty. REFERENCES Zadeh L.A. 1965: Fuzzy Sets. Information and Control, 8:338-353.
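One way to read the T-PDF alternative described above is as Monte Carlo propagation of expert-elicited triangular distributions through the risk terms; the sketch below assumes illustrative (min, most likely, max) values rather than the Swiss recommendation parameters.

```python
# Minimal sketch (illustrative parameters): propagating expert uncertainty into a
# risk estimate by sampling each risk term (hazard, vulnerability, value of the
# element at risk, exposure) from a triangular PDF.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Each term encoded as (min, most likely, max) elicited from experts
hazard        = rng.triangular(0.05, 0.10, 0.30, n)     # annual probability
vulnerability = rng.triangular(0.20, 0.40, 0.80, n)     # damage fraction
value         = rng.triangular(0.8e6, 1.0e6, 1.5e6, n)  # element-at-risk value
exposure      = rng.triangular(0.50, 0.70, 1.00, n)     # fraction of time exposed

risk = hazard * vulnerability * value * exposure   # expected annual loss samples

print("median:", np.median(risk))
print("90% interval:", np.percentile(risk, [5, 95]))  # width reflects uncertainty
```

The width of the resulting interval is one simple way to express the "wide zone of uncertainty" that the abstract reports.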
NASA Astrophysics Data System (ADS)
Burton, E. A.; Pickles, W. L.; Gouveia, F. J.; Bogen, K. T.; Rau, G. H.; Friedmann, J.
2006-12-01
Correct assessment of the potential for CO2 leakage to the atmosphere or near surface is key to managing the risk associated with CO2 storage. Catastrophic point-source leaks, diffuse seepage, and low leakage rates all merit assessment. Smaller leaks may be early warnings of catastrophic failures, and may be sufficient to damage natural vegetation or crops. Small leaks also may lead to cumulative build-up of lethal levels of CO2 in enclosed spaces, such as basements, groundwater-well head spaces, and caverns. Working with our ZERT partners, we are integrating a variety of monitoring and modeling approaches to understand how to assess potential health, property and environmental risks across this spectrum of leakage types. Remote sensing offers a rapid technique to monitor large areas for adverse environmental effects. If it can be deployed prior to the onset of storage operations, remote sensing also can document baseline conditions against which future claims of environmental damage can be compared. LLNL has been using hyperspectral imaging to detect plant stress associated with CO2 gas leakage, and has begun investigating use of NASA's new satellite or airborne instrumentation that directly measures gas compositions in the atmosphere. While remote sensing techniques have been criticized as lacking the necessary resolution to address environmental problems, new instruments and data processing techniques have been demonstrated to resolve environmental changes at the scale associated with gas-leakage scenarios. During the shallow low-flow CO2 release field experiments planned by ZERT, for the first time, we will have the opportunity to ground-truth hyperspectral data by simultaneous measurement of changes in hyperspectral readings, soil and root zone microbiology, ambient air, and soil and aquifer CO2 concentrations. When monitoring data appear to indicate a CO2 leakage event, risk assessment and mitigation of that event require a robust and nearly real-time method for estimating its associated risk, spatially and temporally. This requires integration of subsurface, surface and atmospheric data and models. To date, we have developed techniques to map risk based on predicted atmospheric plumes and GIS/MT (meteorologic-topographic) risk-indexing tools. This methodology was derived from study of large CO2 releases from an abandoned well penetrating a natural CO2 reservoir at Crystal Geyser, Utah. This integrated approach will provide a powerful tool to screen for high-risk zones at proposed sequestration sites, to design and optimize surface networks for site monitoring and/or to guide setting science-based regulatory compliance requirements for monitoring sequestration sites, as well as to target critical areas for first responders should a catastrophic-release event occur. This work was performed under the auspices of the U.S. Dept. of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
USDA-ARS?s Scientific Manuscript database
Rift Valley fever (RVF) is a vector-borne zoonotic disease which causes high morbidity and mortality in livestock. In the event Rift Valley fever virus is introduced to the United States or other non-endemic areas, understanding the potential patterns of spread and the areas at risk based on disease...
Interdisciplinary approach for disaster risk reduction in Valtellina Valley, northern Italy
NASA Astrophysics Data System (ADS)
Garcia, Carolina; Blahut, Jan; Luna, Byron Quan; Poretti, Ilaria; Camera, Corrado; de Amicis, Mattia; Sterlacchini, Simone
2010-05-01
Inside the framework of the European research network Mountain Risks, an interdisciplinary research group has been working in the Consortium of Mountain Municipalities of Valtellina di Tirano (northern Italy). This area has been continuously affected by several mountain hazards, such as landslides, debris flows and floods, that directly affect the population and in some cases have caused several deaths and millions of euros in losses. An aim of the interdisciplinary work in this study area is to integrate the different scientific products of the research group, in the areas of risk assessment, management and governance, in order to generate, among others, risk reduction tools addressed to the general public and stakeholders. Two types of phenomena have been particularly investigated: debris flows and floods. The scientific products range from modeling and mapping of hazard and risk, to emergency planning based on real-time decision support systems, to surveys for the evaluation of risk perception and preparedness, among others. Outputs from medium-scale hazard and risk modeling can be used by decision makers, spatial planners and civil protection authorities to obtain a general overview of the area and identify hot spots for further detailed analysis. Subsequently, local-scale analysis is necessary to define possible events and risk scenarios for emergency planning. For the modeling of past events and new scenarios of debris flows, physical outputs of dynamic run-out models were used as inputs into physical vulnerability assessment and quantitative risk analysis. In a pilot zone, the physical damage was quantified for each affected structure within the context of physical vulnerability, and different empirical vulnerability curves were obtained. Prospective direct economic losses were estimated. For flood hazard assessment, different approaches and models are being tested in order to produce flood maps for various return periods related to registered rainfalls. Regarding civil protection, the main aim is to set up and manage contingency plans in advance; that is, to identify and prepare the people in charge of taking action, to define the activities to be performed, to keep track of available resources and to optimize communication among the people involved, in order to efficiently face a prospective crisis phase. For this purpose, a real-time emergency plan has been developed based on GIS (Geographical Information Systems), DSS (Decision Support Systems) and ICT (Information & Communication Technology).
NASA Astrophysics Data System (ADS)
McNamara, D.; Keeler, A.
2011-12-01
Policy discussions of adaptation by coastal residents to increasing rates of sea level rise and the changing frequency of damaging storms have focused on community land use planning processes. This view neglects the role that market dynamics and climate change expectations play in the way coastal communities choose among risk mitigation options and manage land use decisions in an environment of escalating risks. We use a model coupling physical coastal processes with an agent-based model of behavior in real estate and mitigation markets to examine the interplay of climate-driven coastal hazards, collective mitigation decisions, and individual beliefs. The physical component model simulates barrier island processes that respond both to storms and to slow-scale dynamics associated with sea level rise. The economic component is an agent-based model of economic behavior in which agents are rational economic actors working from different assessments of future climate-driven events. Agents differentially update their beliefs based on a) how much emphasis they give to observed coastal changes and b) how much weight they give to scientific predictions. In essence, agents differ along a spectrum of how much they believe that the past is the best guide to the future and how quickly they react to new information. We use the coupled model to explore three questions of interest to coastal policy. First, how does the interplay of coastal processes, beliefs, and mitigation choices affect the level and stability of real estate prices? Second, how does this interplay affect the incentives for community investments in shoreline protection? Third, how do expectations and reactions to observed events, as well as mitigation investments, affect the built environment in circumstances when climate risks reach very high levels? This last question relates to a key aspect of climate change adaptation on the coast: when does mitigation give way to abandonment as an optimal adaptation strategy? Results suggest that subjective expectations about climate risk and about the effectiveness of mitigation in high-risk environments are critical in determining when the market starts to reflect the possibility that property might no longer be inhabitable. Results will be presented that contrast the dynamics of abandonment over a range of sea level rise and storminess scenarios.
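A minimal sketch of the kind of belief-updating rule described above, in which an agent weights observed coastal change against scientific projections; the update rule and the weights are hypothetical, not the authors' model.

```python
# Minimal sketch (hypothetical update rule): an agent's subjective expectation of
# storm-driven damage, updated by weighting recent observations against a
# scientific projection.
def update_belief(belief, observation, projection, w_obs, w_sci):
    """Blend the current belief with new evidence.
    w_obs: weight on the latest observed coastal change (reaction speed)
    w_sci: weight on the scientific projection (trust in forecasts)
    """
    belief += w_obs * (observation - belief) + w_sci * (projection - belief)
    return belief

# Two stylized agents: a "past is prologue" agent and a forecast-trusting agent
past_oriented, forecast_trusting = 0.05, 0.05   # initial expected annual damage rate
for observed, projected in [(0.04, 0.08), (0.06, 0.09), (0.12, 0.10)]:
    past_oriented = update_belief(past_oriented, observed, projected, w_obs=0.5, w_sci=0.05)
    forecast_trusting = update_belief(forecast_trusting, observed, projected, w_obs=0.1, w_sci=0.5)
    print(round(past_oriented, 3), round(forecast_trusting, 3))
```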
Flood Risk Assessments of Architectural Heritage - Case of Changgyeonggung Palace
NASA Astrophysics Data System (ADS)
Lee, Hyosang; Kim, Ji-sung; Lee, Ho-jin
2014-05-01
The risk of natural disasters such as floods and earthquakes has increased due to recent extreme weather events. Therefore, the necessity of a risk management system to protect architectural properties, a cultural heritage of humanity, from natural disasters has been consistently felt. Solutions for managing flood risk focusing on architectural heritage are suggested and applied to protect Changgyeonggung Palace, a major palace heritage in Seoul. Probable rainfall scenarios for risk assessment (return periods of 100, 200 and 500 years) and a probable maximum precipitation (PMP) scenario were constructed, and a previous rainfall event (July 26th-28th, 2011) was identified; these were used in hydrologic and drainage models (HEC-HMS, SWMM) to compute flood discharges for the area covering Changgyeonggung Palace. These flood discharges make it possible to identify inundation risks with GIS-based models and to assess the flood risk of individual architectural heritage. The results of this risk assessment are used to establish a disaster risk management system that managers of architectural properties can utilize. According to the results of assessing the flood risk of Changgyeonggung Palace, inundation occurs near the outlets of Changgyeonggung Palace and sections of the river channel for all flood risk scenarios, but the inundation risk of major architectural properties was estimated to be low. The methods for assessing flood risk of architectural heritage proposed in this study, and the risk management system for Changgyeonggung Palace using these methods, provide practical solutions for flood risk management, and their applicability appears high. A comprehensive management system for architectural heritage will be established in the future through a review of the diverse factors involved in disasters.
Louis R. Iverson; Stephen N. Matthews; Anantha M. Prasad; Matthew P. Peters; Gary W. Yohe
2012-01-01
We used a risk matrix to assess risk from climate change for multiple forest species by discussing an example that depicts a range of risk for three tree species of northern Wisconsin. Risk is defined here as the product of the likelihood of an event occurring and the consequences or effects of that event. In the context of species habitats, likelihood is related to...
ERIC Educational Resources Information Center
King, Kevin M.; Molina, Brooke S. G.; Chassin, Laurie
2008-01-01
Stressful life events are an important risk factor for psychopathology among children and adolescents. However, variation in life stress may be both stable and time-varying with associated differences in the antecedents. We tested, using latent variable modeling, a state-trait model of stressful life events in adolescence, and predictors of…
SEPEM: A tool for statistical modeling the solar energetic particle environment
NASA Astrophysics Data System (ADS)
Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain
2015-07-01
Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.
Ghasemzadeh, Nima; Hritani, Abdul Wahab; De Staercke, Christine; Eapen, Danny J; Veledar, Emir; Al Kassem, Hatem; Khayata, Mohamed; Zafari, A Maziar; Sperling, Laurence; Hooper, Craig; Vaccarino, Viola; Mavromatis, Kreton; Quyyumi, Arshed A
2015-01-01
Stromal derived factor-1α/CXCL12 is a chemoattractant responsible for homing of progenitor cells to ischemic tissues. We aimed to investigate the association of plasma CXCL12 with long-term cardiovascular outcomes in patients with coronary artery disease (CAD). 785 patients (aged 63 ± 12 years) undergoing coronary angiography were independently enrolled into discovery (N = 186) and replication (N = 599) cohorts. Baseline levels of plasma CXCL12 were measured using the Quantikine CXCL12 ELISA assay (R&D Systems). Patients were followed for cardiovascular death and/or myocardial infarction (MI) for a mean of 2.6 years. Cox proportional hazards regression was used to determine independent predictors of cardiovascular death/MI. The incidence of cardiovascular death/MI was 13% (N = 99). A high CXCL12 level, based on the best discriminatory threshold derived from the ROC analysis, predicted risk of cardiovascular death/MI (HR = 4.81, p = 1 × 10⁻⁶) independent of traditional risk factors in the pooled cohort. Addition of CXCL12 to a baseline model was associated with a significant improvement in c-statistic (AUC: 0.67-0.73, p = 0.03). Addition of CXCL12 was associated with correct risk reclassification of 40% of events and 10.5% of non-events. Similarly, for the outcome of cardiovascular death, the addition of CXCL12 to the baseline model was associated with correct reclassification of 20.7% of events and 9% of non-events. These results were replicated in two independent cohorts. Plasma CXCL12 level is a strong independent predictor of adverse cardiovascular outcomes in patients with CAD and improves risk reclassification. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Rodriguez, Christina M
2006-05-01
This study examined a model wherein children's attributional style mediates the relationship between parental physical child-abuse risk and children's internalizing problems. Using structural equation modeling, three indices of abuse risk were selected (child abuse potential, physical discipline use, and dysfunctional parenting style) and two indices of children's internalizing problems (depression and anxiety). The sample included 75 parent-child dyads, in which parents reported on their abuse risk and children independently completed measures of depressive and anxious symptomatology and a measure on their attributional style. Findings supported the model that children's attributional style for positive events (but not negative events) partially mediated the relationship between abuse risk and internalizing symptoms, with significant direct and indirect effects of abuse risk on internalizing symptomatology. Future directions to continue evaluating additional mediators and other possible contextual variables are discussed.
Greci, Marina; Manicardi, Valeria
2017-01-01
The aim of the study is to assess sex differences in the association between type 2 diabetes and the incidence of major cardiovascular events, that is, myocardial infarction, stroke, and heart failure, using information retrieved from a diabetes register. The inhabitants of Reggio Emilia (Italy) aged 30–84 were followed during 2012–2014. Incidence rate ratios and 95% confidence intervals were calculated using a multivariate Poisson model. The age- and sex-specific event rates were graphed. Subjects with type 2 diabetes had an excess risk compared to their counterparts without diabetes for all three major cardiovascular events. The excess risk is similar in women and men for stroke (1.8 times) and heart failure (2.7 times), while for myocardial infarction, the excess risk in women is greater than that observed in men (IRR 2.58, 95% CI 2.22–3.00 and IRR 1.78, 95% CI 1.60–2.00, resp.; P of interaction < 0.0001). Women always had a lower risk than men, but in the case of myocardial infarction, women with type 2 diabetes lost part of the advantage enjoyed by women free of diabetes (IRR 0.61, 95% CI 0.53–0.72 and IRR 0.36, 95% CI 0.33–0.39, resp.). In women with type 2 diabetes, the onset of major cardiovascular events is brought forward by 20–30 years, while in men it is brought forward by 15–20 years. PMID:28316624
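A minimal sketch of a Poisson rate model with a sex-by-diabetes interaction and a person-years offset, the type of model described above; the counts and person-years below are invented for illustration, not the Reggio Emilia register data.

```python
# Minimal sketch: Poisson regression of event counts on sex, diabetes and their
# interaction, with log person-years as offset, yielding incidence rate ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "events":       [120,  60,  300, 110],
    "person_years": [9e4,  1e4, 1.1e5, 1.5e4],
    "female":       [1,    1,   0,    0],
    "diabetes":     [0,    1,   0,    1],
})

model = smf.glm("events ~ female * diabetes",
                data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["person_years"])).fit()

print(np.exp(model.params))              # incidence rate ratios
print(model.pvalues["female:diabetes"])  # test of the sex-by-diabetes interaction
```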
Villalta, Elizabeth M; Peiris, Casey L
2013-01-01
To investigate whether early postoperative aquatic physical therapy is a low-risk and effective form of physical therapy to improve functional outcomes after orthopedic surgery. The databases MEDLINE, CINAHL, AMED, Embase, and PEDro were searched from the earliest date available until October 2011. Additional trials were identified by searching reference lists and citation tracking. Controlled trials evaluating the effects of aquatic physical therapy on adverse events in adults <3 months after orthopedic surgery. Two reviewers independently applied inclusion and exclusion criteria, and any disagreements were discussed until consensus could be reached. Searching identified 5069 potentially relevant articles, of which 8 controlled trials with 287 participants met inclusion criteria. A predefined data extraction form was completed in detail for each included study by 1 reviewer and checked for accuracy by another. Methodologic quality of included trials was assessed independently by 2 reviewers using the PEDro scale. Pooled analyses were performed using a random effects model with inverse-variance methods to calculate standardized mean differences (SMDs) and 95% confidence intervals (CIs) (continuous outcomes) and risk differences and 95% CIs (dichotomous outcomes). When compared with land-based physical therapy, early aquatic physical therapy does not increase the risk of wound-related adverse events (risk difference=.01, 95% CI -.05 to .07) and results in improved performance of activities of daily living (SMD=.33, 95% CI=.07-.58, I(2)=0%). There were no significant differences in edema (SMD=-.27, 95% CI=-.81 to .27, I(2)=58%) or pain (SMD=-.06, 95% CI=-.50 to .38, I(2)=32%). After orthopedic surgery, aquatic physical therapy improves function, does not increase the risk of wound-related adverse events, and is as effective as land-based therapy in terms of pain, edema, strength, and range of motion in the early postoperative period. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
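A minimal sketch of DerSimonian-Laird random-effects pooling with inverse-variance weights, the pooling approach described above; the per-trial effect sizes and variances are placeholders, not the review's extracted data.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of standardized mean
# differences (SMDs) with inverse-variance weights, reporting the pooled effect,
# its 95% CI and I-squared.
import numpy as np

def pool_random_effects(effects, variances):
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                          # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-trial variance
    w_star = 1.0 / (variances + tau2)            # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Placeholder per-trial SMDs and variances (not the review's data)
smd, ci, i2 = pool_random_effects([0.40, 0.25, 0.30], [0.04, 0.06, 0.05])
print(f"SMD={smd:.2f}, 95% CI=({ci[0]:.2f}, {ci[1]:.2f}), I2={i2:.0f}%")
```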
ERIC Educational Resources Information Center
Kimemia, Judy
2017-01-01
Purpose: The purpose of this project was to compare web-based to high-fidelity simulation training in the management of high risk/low occurrence anesthesia related events, to enhance knowledge acquisition for Certified Registered Nurse Anesthetists (CRNAs). This project was designed to answer the question: Is web-based training as effective as…
Tang, Yi-Da; Dewland, Thomas A; Wencker, Detlef; Katz, Stuart D
2009-12-01
Post-exercise heart rate recovery (HRR) is an index of parasympathetic function associated with clinical outcomes in populations with and without documented coronary heart disease. Decreased parasympathetic activity is thought to be associated with disease progression in chronic heart failure (HF), but an independent association between post-exercise HRR and clinical outcomes among such patients has not been established. We measured HRR (calculated as the difference between heart rate at peak exercise and after 1 minute of recovery) in 202 HF subjects and recorded 17 mortality and 15 urgent transplantation outcome events over 624 days of follow-up. Reduced post-exercise HRR was independently associated with increased event risk after adjusting for other exercise-derived variables (peak oxygen uptake and change in minute ventilation per change in carbon dioxide production slope), for the Heart Failure Survival Score (adjusted HR 1.09 for 1 beat/min reduction, 95% CI 1.05-1.13, P < .0001), and the Seattle Heart Failure Model score (adjusted HR 1.08 for one beat/min reduction, 95% CI 1.05-1.12, P < .0001). Subjects in the lowest risk tertile based on post-exercise HRR (>or=30 beats/min) had low risk of events irrespective of the risk predicted by the survival scores. In a subgroup of 15 subjects, reduced post-exercise HRR was associated with increased serum markers of inflammation (interleukin-6, r = 0.58, P = .024; high-sensitivity C-reactive protein, r = 0.66, P = .007). Post-exercise HRR predicts mortality risk in patients with HF and provides prognostic information independent of previously described survival models. Pathophysiologic links between autonomic function and inflammation may be mediators of this association.
Mode of action in relevance of rodent liver tumors to human cancer risk.
Holsapple, Michael P; Pitot, Henri C; Cohen, Samuel M; Cohen, Samuel H; Boobis, Alan R; Klaunig, James E; Pastoor, Timothy; Dellarco, Vicki L; Dragan, Yvonne P
2006-01-01
Hazard identification and risk assessment paradigms depend on the presumption of the similarity of rodents to humans, yet species-specific responses, and the extrapolation of high-dose effects to low-dose exposures, can affect the estimation of human risk from rodent data. As a consequence, a human relevance framework concept was developed by the International Programme on Chemical Safety (IPCS) and International Life Sciences Institute (ILSI) Risk Science Institute (RSI) with the central tenet being the identification of a mode of action (MOA). To perform a MOA analysis, the key biochemical, cellular, and molecular events need to first be established, and the temporal and dose-dependent concordance of each of the key events in the MOA can then be determined. The key events can be used to bridge species and dose for a given MOA. The next step in the MOA analysis is the assessment of biological plausibility for determining the relevance of the specified MOA in an animal model for human cancer risk based on kinetic and dynamic parameters. Using the framework approach, a MOA in animals could not be defined for metal overload. The MOA for phenobarbital (PB)-like P450 inducers was determined to be unlikely in humans after kinetic and dynamic factors were considered. In contrast, after these factors were considered with reference to estrogen, the conclusion was drawn that estrogen-induced tumors were plausible in humans. Finally, it was concluded that the induction of rodent liver tumors by porphyrogenic compounds followed a cytotoxic MOA, and that liver tumors formed as a result of sustained cytotoxicity and regenerative proliferation are considered relevant for evaluating human cancer risk if appropriate metabolism occurs in the animal models and in humans.
NASA Astrophysics Data System (ADS)
Gobin, Anne; Van de vijver, Hans; Zamani, Sepideh; Curnel, Yannick; Planchon, Viviane; Verspecht, Ann; Van Huylenbroeck, Guido
2014-05-01
Devastating weather-related events have captured the interest of the general public in Belgium. Extreme weather events such as droughts, heat waves and rain storms are projected to increase both in frequency and magnitude with climate change. Since more than half of the Belgian territory is managed by the agricultural sector, extreme events may have significant impacts on agro-ecosystem services and pose severe limitations to sustainable agricultural land management. The research hypothesis of the MERINOVA project is that meteorological risks act as drivers of environmental innovation in agro-ecosystem management. The major objectives are to characterise extreme meteorological events, assess the impact on Belgian agro-ecosystems, characterise their vulnerability and resilience to these events, and explore innovative adaptation options for agricultural risk management. The project comprises five major parts that reflect the chain of risks: the hazard, its impact on different agro-ecosystems, vulnerability, risk management and risk communication. Impacts developed from physically based models not only provide information on the state of the damage at any given time, but also assist in understanding the links between different factors causing damage and determining bio-physical vulnerability. Socio-economic impacts enlarge the basis for vulnerability mapping, risk management and adaptation options. The prospect of rising risk exposure is exacerbated further by tighter limits on aid for agricultural damage and an overall reduction of direct income support to farmers. The main findings of each of these project building blocks will be communicated. MERINOVA provides a robust and flexible framework by demonstrating its performance across Belgian agro-ecosystems, and by ensuring its relevance to policy makers and practitioners. A strong expert and end-user network is established to help disseminate and exploit project results to meet user needs. The research is funded by the Belgian Science Policy Organisation (Belspo) under contract nr SD/RI/03A. https://merinova.vito.be
Risk Importance Measures in the Design and Operation of Nuclear Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrbanic, I.; Samanta, P.; Basic, I.
This monograph presents and discusses risk importance measures as quantified by the probabilistic risk assessment (PRA) models of nuclear power plants (NPPs) developed according to current standards and practices. Usually, PRA tools calculate risk importance measures related to a single "basic event" representing a particular failure mode. This is, then, reflected in many current PRA applications. The monograph focuses on the concept of "component-level" importance measures that take into account different failure modes of the component, including common-cause failures (CCFs). In the opening sections, the role of risk assessment in the safety analysis of an NPP is introduced and a discussion is given of "traditional", mainly deterministic, design principles which have been established to assign a level of importance to a particular system, structure or component. This is followed by an overview of the main risk importance measures for risk increase and risk decrease from current PRAs. Basic relations which exist among the measures are shown. Some of the current practical applications of risk importance measures from the field of NPP design, operation and regulation are discussed. The core of the monograph provides a discussion of the theoretical background and practical aspects of the main risk importance measures at the level of a "component" as modeled in a PRA, starting from the simplest case, a single basic event, and going toward more complex cases with multiple basic events and involvement in CCF groups. The intent is to express the component-level importance measures via the importance measures and probabilities of the underlying single basic events, which are the inputs readily available from a PRA model and its results. Formulas are derived and discussed for some typical cases. The formulas and their results are demonstrated through practical examples, done by means of a simplified PRA model developed in and run by the RiskSpectrum tool, which are presented in the appendices. The monograph concludes with a discussion of limitations of the use of risk importance measures and a summary of the component-level importance cases evaluated.
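As a concrete, simplified companion to the monograph's topic, the sketch below applies the standard Fussell-Vesely, risk achievement worth (RAW) and risk reduction worth (RRW) definitions to a toy two-train system with a common-cause failure; it is not the monograph's derivation, and the probabilities are invented.

```python
# Minimal sketch: basic-event and "component-level" importance measures for a toy
# two-train system (trains A and B in parallel, failing either independently or
# through a common-cause failure, CCF).
def top(pA, pB, pCCF):
    """P(system failure): both trains fail independently OR common-cause failure."""
    return 1.0 - (1.0 - pA * pB) * (1.0 - pCCF)

base = {"A": 1e-2, "B": 1e-2, "CCF": 1e-4}
R0 = top(base["A"], base["B"], base["CCF"])   # baseline risk

def importance(event):
    hi = {**base, event: 1.0}   # basic event set to failed
    lo = {**base, event: 0.0}   # basic event set to perfect
    R_hi = top(hi["A"], hi["B"], hi["CCF"])
    R_lo = top(lo["A"], lo["B"], lo["CCF"])
    return {"FV": (R0 - R_lo) / R0,   # Fussell-Vesely
            "RAW": R_hi / R0,         # risk achievement worth
            "RRW": R0 / R_lo}         # risk reduction worth

for ev in base:
    print(ev, importance(ev))

# A "component-level" view of train A combines its independent failure and its
# CCF contribution: set both to their failed/perfect states together.
R_hi = top(1.0, base["B"], 1.0)
R_lo = top(0.0, base["B"], 0.0)
print("component A:", {"FV": (R0 - R_lo) / R0,
                       "RAW": R_hi / R0,
                       "RRW": float("inf") if R_lo == 0 else R0 / R_lo})
```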
Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R
2013-01-01
The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injury compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimation of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved useful for understanding the HCO risk structure in terms of frequency, severity, and expected and unexpected loss related to adverse events.
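A minimal frequency-severity Monte Carlo sketch of the loss-distribution idea described above, reporting the expected annual loss and tail percentiles; the Poisson and lognormal parameters are invented and not fitted to the Lodi claims.

```python
# Minimal sketch: frequency-severity Monte Carlo estimate of an HCO's annual
# compensation-claim loss distribution, with expected value and upper percentiles.
import numpy as np

rng = np.random.default_rng(7)
n_years = 100_000

lam = 25                      # mean number of claims per year (Poisson frequency)
mu, sigma = 9.5, 1.1          # lognormal severity of a single claim (EUR)

n_claims = rng.poisson(lam, n_years)
annual_loss = np.array([rng.lognormal(mu, sigma, k).sum() for k in n_claims])

print("expected annual loss:", annual_loss.mean())
print("95th / 99th percentiles (unexpected-loss tail):",
      np.percentile(annual_loss, [95, 99]))
```

Stratifying the frequency and severity parameters by department or hospital, as the study does with a Bayesian hierarchical model, would amount to running this simulation per stratum with partially pooled parameters.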
Stress and Sleep Reactivity: A Prospective Investigation of the Stress-Diathesis Model of Insomnia
Drake, Christopher L.; Pillai, Vivek; Roth, Thomas
2014-01-01
Study Objectives: To prospectively assess sleep reactivity as a diathesis of insomnia, and to delineate the interaction between this diathesis and naturalistic stress in the development of insomnia among normal sleepers. Design: Longitudinal. Setting: Community-based. Participants: 2,316 adults from the Evolution of Pathways to Insomnia Cohort (EPIC) with no history of insomnia or depression (46.8 ± 13.2 y; 60% female). Interventions: None. Measurements and Results: Participants reported the number of stressful events they encountered at baseline (Time 1), as well as the level of cognitive intrusion they experienced in response to each stressor. Stressful events (OR = 1.13; P < 0.01) and stress-induced cognitive intrusion (OR = 1.61; P < 0.01) were significant predictors of risk for insomnia one year hence (Time 2). Intrusion mediated the effects of stressful events on risk for insomnia (P < 0.05). Trait sleep reactivity significantly increased risk for insomnia (OR = 1.78; P < 0.01). Further, sleep reactivity moderated the effects of stress-induced intrusion (P < 0.05), such that the risk for insomnia as a function of intrusion was significantly higher in individuals with high sleep reactivity. Trait sleep reactivity also constituted a significant risk for depression (OR = 1.67; P < 0.01) two years later (Time 3). Insomnia at Time 2 significantly mediated this effect (P < 0.05). Conclusions: This study suggests that premorbid sleep reactivity is a significant risk factor for incident insomnia, and that it triggers insomnia by exacerbating the effects of stress-induced intrusion. Sleep reactivity is also a precipitant of depression, as mediated by insomnia. These findings support the stress-diathesis model of insomnia, while highlighting sleep reactivity as an important diathesis. Citation: Drake CL, Pillai V, Roth T. Stress and sleep reactivity: a prospective investigation of the stress-diathesis model of insomnia. SLEEP 2014;37(8):1295-1304. PMID:25083009
Comparison of Time-to-First Event and Recurrent Event Methods in Randomized Clinical Trials.
Claggett, Brian; Pocock, Stuart; Wei, L J; Pfeffer, Marc A; McMurray, John J V; Solomon, Scott D
2018-03-27
Background: Most Phase-3 trials feature time-to-first-event endpoints for their primary and/or secondary analyses. In chronic diseases where a clinical event can occur more than once, recurrent-event methods have been proposed to more fully capture disease burden and have been assumed to improve statistical precision and power compared to conventional time-to-first-event methods. Methods: To better characterize the factors that influence the statistical properties of recurrent-event and time-to-first-event methods in the evaluation of randomized therapy, we repeatedly simulated trials with 1:1 randomization of 4000 patients to active vs control therapy, with a true patient-level risk reduction of 20% (i.e., RR = 0.80). For patients who discontinued active therapy after a first event, we assumed their risk subsequently reverted to their original placebo-level risk. Through simulation, we varied (a) the degree of between-patient heterogeneity of risk and (b) the extent of treatment discontinuation. Findings were compared with those from actual randomized clinical trials. Results: As the degree of between-patient heterogeneity of risk increased, both time-to-first-event and recurrent-event methods lost statistical power to detect a true risk reduction and confidence intervals widened. The recurrent-event analyses continued to estimate the true RR = 0.80 as heterogeneity increased, while the Cox model produced attenuated estimates. The power of recurrent-event methods declined as the rate of study drug discontinuation post-event increased. Recurrent-event methods provided greater power than time-to-first-event methods in scenarios where drug discontinuation was ≤30% following a first event, lesser power with drug discontinuation rates of ≥60%, and comparable power otherwise. We confirmed in several actual trials in chronic heart failure that treatment effect estimates were attenuated when estimated via the Cox model and that the increase in statistical power from recurrent-event methods was most pronounced in trials with lower treatment discontinuation rates. Conclusions: We find that the statistical power of both recurrent-event and time-to-first-event methods is reduced by increasing heterogeneity of patient risk, a parameter not included in conventional power and sample size formulas. Data from real clinical trials are consistent with the simulation studies, confirming that the greatest statistical gains from use of recurrent-event methods occur in the presence of high patient heterogeneity and low rates of study drug discontinuation.
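A simplified sketch of this simulation design, using a gamma frailty for between-patient heterogeneity, a Cox model for the time-to-first-event analysis, and a negative binomial rate model as a stand-in for the recurrent-event analysis; treatment discontinuation is omitted for brevity, and all parameter values are illustrative assumptions rather than the published simulation settings.

```python
# Simulated 1:1 trial with between-patient heterogeneity (gamma frailty) and a
# true 20% rate reduction; compares a recurrent-event rate ratio with a
# time-to-first-event hazard ratio. Illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n, base_rate, rr, follow_up, het = 4000, 0.3, 0.80, 3.0, 1.0   # het = frailty variance

treat = rng.integers(0, 2, n)
frailty = rng.gamma(shape=1 / het, scale=het, size=n)           # mean 1, variance `het`
rate = base_rate * np.where(treat == 1, rr, 1.0) * frailty

# Recurrent-event view: event counts over follow-up, negative binomial rate model
counts = rng.poisson(rate * follow_up)
X = sm.add_constant(treat.astype(float))
nb = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=het),
            exposure=np.full(n, follow_up)).fit()
print("recurrent-event rate ratio:", np.exp(nb.params[1]))

# Time-to-first view: first-event times censored at end of follow-up, Cox model
t_first = rng.exponential(1 / np.maximum(rate, 1e-12))
df = pd.DataFrame({"T": np.minimum(t_first, follow_up),
                   "E": (t_first <= follow_up).astype(int),
                   "treat": treat})
cox = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print("time-to-first hazard ratio:", np.exp(cox.params_["treat"]))
```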
Future climate risk from compound events
NASA Astrophysics Data System (ADS)
Zscheischler, Jakob; Westra, Seth; van den Hurk, Bart J. J. M.; Seneviratne, Sonia I.; Ward, Philip J.; Pitman, Andy; AghaKouchak, Amir; Bresch, David N.; Leonard, Michael; Wahl, Thomas; Zhang, Xuebin
2018-06-01
Floods, wildfires, heatwaves and droughts often result from a combination of interacting physical processes across multiple spatial and temporal scales. The combination of processes (climate drivers and hazards) leading to a significant impact is referred to as a 'compound event'. Traditional risk assessment methods typically only consider one driver and/or hazard at a time, potentially leading to underestimation of risk, as the processes that cause extreme events often interact and are spatially and/or temporally dependent. Here we show how a better understanding of compound events may improve projections of potential high-impact events, and can provide a bridge between climate scientists, engineers, social scientists, impact modellers and decision-makers, who need to work closely together to understand these complex events.
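A toy numerical example of why ignoring dependence between drivers understates risk: the joint exceedance probability of two correlated drivers is compared with the product of their marginal exceedance probabilities (the independence assumption). The correlation, thresholds, and variable names are invented for illustration.

```python
# Joint vs "independent" exceedance probability for two correlated drivers.
import numpy as np

rng = np.random.default_rng(7)
cov = [[1.0, 0.6], [0.6, 1.0]]                      # correlated climate drivers
surge, discharge = rng.multivariate_normal([0, 0], cov, size=200_000).T

q = 0.95
t_s, t_d = np.quantile(surge, q), np.quantile(discharge, q)

p_joint = np.mean((surge > t_s) & (discharge > t_d))      # compound-event probability
p_indep = np.mean(surge > t_s) * np.mean(discharge > t_d)  # independence assumption
print(f"joint exceedance: {p_joint:.4f}  vs independent: {p_indep:.4f}")
```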
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Prescott, Steven; Coleman, Justin
This report describes the current progress and status of Industry Application #2, which focuses on external hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that evaluates external hazards such as flooding and earthquakes. The aims are to identify, model and analyze the physics that must be included to determine plant vulnerabilities related to external events; to manage the communication and interactions between different physics modeling and analysis technologies; and to develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will couple the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.
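A conceptual sketch of coupling a probabilistic scenario generator with a mechanistic response model, in the spirit of the approach described above: sampled flood scenarios are pushed through a stand-in "physics" function and compared against an assumed flood barrier to estimate the probability that the safety margin is exceeded. The distributions, barrier height, and flood_response function are invented placeholders, not the RISMC toolkits.

```python
# Probabilistic scenarios coupled with a toy mechanistic response; illustrative only.
import numpy as np

rng = np.random.default_rng(3)

def flood_response(river_stage_m, surge_m):
    """Stand-in for a physics simulation: site water level given the scenario."""
    return river_stage_m + 0.8 * surge_m         # toy superposition, not a real model

barrier_m = 6.0                                   # assumed design flood barrier
n = 50_000
river = rng.gumbel(loc=3.0, scale=0.8, size=n)    # sampled hazard scenarios
surge = rng.exponential(scale=0.5, size=n)

site_level = flood_response(river, surge)
margin = barrier_m - site_level
print("P(margin exceeded):", np.mean(margin < 0))
```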
Raffel, Katie E; Beach, Leila Y; Lin, John; Berchuck, Jacob E; Abram, Shelly; Markle, Elizabeth; Patel, Shalini
2018-01-01
Context Naloxone distribution has historically been implemented in a community-based, expanded public health model; however, there is now a need to further explore primary care clinic-based naloxone delivery to effectively address the nationwide opioid epidemic. Objective To create a general medicine infrastructure to identify patients with high-risk opioid use and provide 25% of this population with naloxone autoinjector prescription and training within a 6-month period. Design The quality improvement study was conducted at an outpatient clinic serving 1238 marginally housed veterans with high rates of comorbid substance use and mental health disorders. Patients at high risk of opioid-related adverse events were identified using the Stratification Tool for Opioid Risk Management and were contacted to participate in a one-on-one, 15-minute, hands-on naloxone training led by nursing staff. Main Outcome Measures The number of patients identified at high risk and rates of naloxone training/distribution. Results There were 67 patients identified as having high-risk opioid use. None of these patients had been prescribed naloxone at baseline. At the end of the intervention, 61 patients (91%) had been trained in the use of naloxone. Naloxone was primarily distributed by licensed vocational nurses (42/61, 69%). Conclusion This study demonstrates the feasibility of high-risk patient identification and of a primary care-based and nursing-championed naloxone distribution model. This delivery model has the potential to provide access to naloxone to a population of patients with opioid use who may not be engaged in mental health or specialty care. PMID:29616917
The impact of birth weight on cardiovascular disease risk in the Women's Health Initiative
Smith, CJ; Ryckman, KK; Barnabei, Vanessa M.; Howard, Barbara; Isasi, Carmen R.; Sarto, Gloria; Tom, Sarah E.; Van Horn, Linda; Wallace, Robert; Robinson, Jennifer G
2016-01-01
Background and Aims Cardiovascular disease (CVD) is among the leading causes of morbidity and mortality worldwide. Traditional risk factors predict 75-80% of an individual's risk of incident CVD. However, the role of early life experiences in future disease risk is gaining attention. The Barker hypothesis proposes fetal origins of adult disease, with consistent evidence demonstrating the deleterious consequences of birth weight outside the normal range. In this study, we investigate the role of birth weight in CVD risk prediction. Methods and Results The Women's Health Initiative (WHI) represents a large national cohort of post-menopausal women with 63 815 participants included in this analysis. Univariable proportional hazards regression analyses evaluated the association of 4 self-reported birth weight categories against 3 CVD outcome definitions, which included indicators of coronary heart disease, ischemic stroke, coronary revascularization, carotid artery disease and peripheral arterial disease. The role of birth weight was also evaluated for prediction of CVD events in the presence of traditional risk factors using 3 existing CVD risk prediction equations: one body mass index (BMI)-based and two laboratory-based models. Low birth weight (LBW) (< 6 lbs.) was significantly associated with all CVD outcome definitions in univariable analyses (HR=1.086, p=0.009). LBW was a significant covariate in the BMI-based model (HR=1.128, p<0.0001) but not in the lipid-based models. Conclusion LBW (<6 lbs.) is independently associated with CVD outcomes in the WHI cohort. This finding supports the role of the prenatal and postnatal environment in contributing to the development of adult chronic disease. PMID:26708645
Abdelbary, B E; Garcia-Viveros, M; Ramirez-Oropesa, H; Rahbar, M H; Restrepo, B I
2017-10-01
The purpose of this study was to develop a method for identifying newly diagnosed tuberculosis (TB) patients at risk for TB adverse events in Tamaulipas, Mexico. Surveillance data between 2006 and 2013 (8431 subjects) was used to develop risk scores based on predictive modelling. The final models revealed that TB patients failing their treatment regimen were more likely to have at most a primary school education, multi-drug resistance (MDR)-TB, and few to moderate bacilli on acid-fast bacilli smear. TB patients who died were more likely to be older males with MDR-TB, HIV, malnutrition, and reporting excessive alcohol use. Modified risk scores were developed with strong predictability for treatment failure and death (c-statistic 0·65 and 0·70, respectively), and moderate predictability for drug resistance (c-statistic 0·57). Among TB patients with diabetes, risk scores showed moderate predictability for death (c-statistic 0·68). Our findings suggest that in the clinical setting, the use of our risk scores for TB treatment failure or death will help identify these individuals for tailored management to prevent these adverse events. In contrast, the available variables in the TB surveillance dataset are not robust predictors of drug resistance, indicating the need for prompt testing at time of diagnosis.
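An illustrative sketch of deriving a points-based risk score from a logistic model and summarising its discrimination with the c-statistic, as described above; the predictors, simulated data, and point assignments are assumptions, not the Tamaulipas surveillance results.

```python
# Points-based risk score from a logistic model, with c-statistic; simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)
n = 8431
df = pd.DataFrame({
    "primary_school_only": rng.binomial(1, 0.4, n),
    "mdr_tb":              rng.binomial(1, 0.05, n),
    "low_bacillary_load":  rng.binomial(1, 0.5, n),
})
lp = -3.0 + 0.6 * df.primary_school_only + 1.2 * df.mdr_tb + 0.4 * df.low_bacillary_load
df["treatment_failure"] = rng.binomial(1, 1 / (1 + np.exp(-lp)))

fit = smf.logit("treatment_failure ~ primary_school_only + mdr_tb + low_bacillary_load",
                data=df).fit(disp=False)

# Convert coefficients to integer points (a common risk-score construction)
coefs = fit.params.drop("Intercept")
points = (coefs / coefs.min()).round()
df["score"] = df[points.index].to_numpy() @ points.to_numpy()
print("points per predictor:\n", points)
print("c-statistic:", round(roc_auc_score(df.treatment_failure, df.score), 2))
```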
Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A
2017-01-01
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums for rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by an integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve stability and robustness of the program towards floods of various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of systemic insolvency involving overpayments and underpayments by the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.
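A minimal sketch contrasting the traditional average-annual-loss premium with a quantile-based ("robust") premium that keeps the pool's insolvency probability low; the loss distribution, quantile level, and pool size are illustrative assumptions, not the ICRM model.

```python
# AAL premium vs quantile-based premium for a simulated flood-loss pool.
import numpy as np

rng = np.random.default_rng(13)
n_years, n_locations = 100_000, 50

# Heavy-tailed location losses occurring only in rare catastrophic flood years
flood_year = rng.random(n_years) < 0.02
losses = rng.lognormal(mean=10.0, sigma=1.0, size=(n_years, n_locations)) * flood_year[:, None]
pool_loss = losses.sum(axis=1)

aal_premium = pool_loss.mean() / n_locations                  # standard actuarial premium
robust_premium = np.quantile(pool_loss, 0.99) / n_locations   # covers 99% of flood scenarios

insolvency_aal = np.mean(pool_loss > aal_premium * n_locations)
insolvency_rob = np.mean(pool_loss > robust_premium * n_locations)
print(f"AAL premium {aal_premium:,.0f}, insolvency prob {insolvency_aal:.3f}")
print(f"robust premium {robust_premium:,.0f}, insolvency prob {insolvency_rob:.3f}")
```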
Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions
NASA Astrophysics Data System (ADS)
Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.
2015-07-01
The technological evolution in computational capacity, data acquisition systems, numerical modelling and operational oceanography is creating opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time- and space-varying shoreline risk levels from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking from automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese Continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels over time. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate risk levels that are properly sensitive to dynamic metocean conditions and to oil transport behaviour. The integration of meteo-oceanic and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and decision support, providing a more realistic assessment of shoreline impacts. Risk assessment from historical data can help identify typical risk patterns and "hot spots" or support sensitivity analyses for specific conditions, whereas real-time risk levels can be used in the prioritization of individual ships and geographical areas, strategic tug positioning, and the implementation of dynamic risk-based vessel traffic monitoring.
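A toy sketch of the likelihood-times-consequence risk rating per vessel; every input (spill probabilities, sea-state factor, fraction of oil reaching shore, vulnerability index) is an invented placeholder, not part of the operational system.

```python
# Per-vessel shoreline risk as likelihood x consequence; illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class Vessel:
    name: str
    base_spill_prob: float      # from accident statistics and vessel type
    oil_on_board_t: float       # tonnes

def shoreline_risk(v: Vessel, sea_state_factor: float,
                   fraction_reaching_shore: float, vulnerability: float) -> float:
    likelihood = min(1.0, v.base_spill_prob * sea_state_factor)
    consequence = v.oil_on_board_t * fraction_reaching_shore * vulnerability
    return likelihood * consequence

tanker = Vessel("tanker_A", base_spill_prob=1e-4, oil_on_board_t=80_000)
ferry = Vessel("ferry_B", base_spill_prob=5e-5, oil_on_board_t=900)

# fraction_reaching_shore would come from the oil spill fate model;
# vulnerability from the environmental/socio-economic shoreline index
for v in (tanker, ferry):
    print(v.name, shoreline_risk(v, sea_state_factor=2.5,
                                 fraction_reaching_shore=0.3, vulnerability=0.8))
```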
NASA Astrophysics Data System (ADS)
Vencloviene, J.; Babarskiene, R.; Milvidaite, I.; Kubilius, R.; Stasionyte, J.
2013-12-01
Some evidence indicates deterioration of the cardiovascular system during space storms. It is plausible that space weather conditions during and after hospital admission may affect the risk of coronary events in patients with acute coronary syndromes (ACS). We analyzed the data of 1400 ACS patients who were admitted to the Hospital of Lithuanian University of Health Sciences and who survived for more than 4 days. We evaluated the associations between geomagnetic storms (GS), solar proton events (SPE), and solar flares (SF) that occurred 0-3 days before and after hospital admission and the risk of cardiovascular death (CAD), non-fatal ACS, and coronary artery bypass grafting (CABG) during a period of 1 year; the evaluation was based on a multivariate logistic model, controlling for clinical data. After adjustment for clinical variables, a GS occurring in conjunction with an SF 1 day before admission increased the risk of CAD by over 2.5 times. A GS 2 days after an SPE that occurred 1 day after admission increased the risk of CAD and CABG by over 2.8 times. The risk of CABG increased by over 2 times in patients admitted on the day of a GS and 1 day after an SPE. The risk of ACS was over 1.63 times higher for patients admitted 1 day before or after solar flares.
A policy model of cardiovascular disease in moderate-to-advanced chronic kidney disease.
Schlackow, Iryna; Kent, Seamus; Herrington, William; Emberson, Jonathan; Haynes, Richard; Reith, Christina; Wanner, Christoph; Fellström, Bengt; Gray, Alastair; Landray, Martin J; Baigent, Colin; Mihaylova, Borislava
2017-12-01
To present a long-term policy model of cardiovascular disease (CVD) in moderate-to-advanced chronic kidney disease (CKD). A Markov model with transitions between CKD stages (3B, 4, 5, on dialysis, with kidney transplant) and cardiovascular events (major atherosclerotic events, haemorrhagic stroke, vascular death) was developed with individualised CKD and CVD risks estimated using the 5 years' follow-up data of the 9270 patients with moderate-to-severe CKD in the Study of Heart and Renal Protection (SHARP) and multivariate parametric survival analysis. The model was assessed in three further CKD cohorts and compared with currently used risk scores. Higher age, previous cardiovascular events and advanced CKD were the main contributors to increased individual disease risks. CKD and CVD risks predicted by the state-transition model corresponded well to risks observed in SHARP and external cohorts. The model's predictions of vascular risk and progression to end-stage renal disease were better than, or comparable to, those produced by other risk scores. As an illustration, at age 60-69 years, projected survival for SHARP participants in CKD stage 3B was 13.5 years (10.6 quality-adjusted life years (QALYs)) in men and 14.8 years (10.7 QALYs) in women. Corresponding projections for participants on dialysis were 7.5 (5.6 QALYs) and 7.8 years (5.4 QALYs). A non-fatal major atherosclerotic event reduced life expectancy by about 2 years in stage 3B and by 1 year in dialysis. The SHARP CKD-CVD model is a novel resource for evaluating health outcomes and cost-effectiveness of interventions in CKD. NCT00125593 and ISRCTN54137607; Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
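A simplified Markov cohort sketch of this kind of state-transition model, with CKD states, an absorbing death state, annual cycles, and discounted QALYs; the transition probabilities, utilities, and discount rate are invented placeholders, not the SHARP CKD-CVD model inputs.

```python
# Markov cohort model: CKD stages plus death, annual cycles, discounted QALYs.
import numpy as np

states = ["CKD 3B", "CKD 4", "CKD 5", "Dialysis", "Dead"]
# rows: from-state, columns: to-state (annual transition probabilities; each row sums to 1)
P = np.array([
    [0.85, 0.10, 0.00, 0.00, 0.05],
    [0.00, 0.80, 0.10, 0.05, 0.05],
    [0.00, 0.00, 0.75, 0.15, 0.10],
    [0.00, 0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.80, 0.75, 0.70, 0.60, 0.00])   # QALY weight per state
discount = 0.035

cohort = np.array([1.0, 0.0, 0.0, 0.0, 0.0])         # everyone starts in stage 3B
life_years, qalys = 0.0, 0.0
for year in range(40):
    life_years += cohort[:-1].sum()                  # person-years in alive states
    qalys += (cohort * utility).sum() / (1 + discount) ** year
    cohort = cohort @ P

print(f"undiscounted life expectancy: {life_years:.1f} years")
print(f"discounted QALYs: {qalys:.1f}")
```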
[Modeling in value-based medicine].
Neubauer, A S; Hirneiss, C; Kampik, A
2010-03-01
Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verification of internal validity, comparison with other models (external validity) and, ideally, validation of the model's predictive properties. The uncertainty inherent in any modeling should be clearly stated. This is true for economic modeling in VBM as well as for disease risk models used to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better-informed decisions than would be possible without this additional information.
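A small sketch of how a cost-effectiveness acceptability curve is obtained from probabilistic sensitivity analysis draws: for each willingness-to-pay threshold, it is the share of draws in which the intervention has positive incremental net monetary benefit. The cost and QALY distributions are illustrative assumptions.

```python
# Cost-effectiveness acceptability curve from probabilistic sensitivity analysis draws.
import numpy as np

rng = np.random.default_rng(17)
n_draws = 10_000

# Incremental cost and incremental QALYs of intervention vs comparator
d_cost = rng.normal(2000, 800, n_draws)
d_qaly = rng.normal(0.10, 0.06, n_draws)

for wtp in (10_000, 20_000, 50_000):                 # willingness to pay per QALY
    nmb = wtp * d_qaly - d_cost                      # incremental net monetary benefit
    print(f"WTP {wtp:>6}: P(cost-effective) = {np.mean(nmb > 0):.2f}")
```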
NASA Astrophysics Data System (ADS)
Frey, Holger; Buis, Daniel; Huggel, Christian; Bühler, Yves; Choquevilca, Walter; Fernandez, Felipe; García, Javier; Giráldez, Claudia; Loarte, Edwin; Masias, Paul; Portocarreo, César; Price, Karen; Walser, Marco
2015-04-01
The local center of Santa Teresa (Cusco Region, Peru, 7 km northwest of the ruins of Machu Picchu) has been affected by several large debris-flow events in the recent past. In January and February 1998, three events of extreme magnitude, with estimated total volumes of several tens of millions of cubic meters each, destroyed most parts of the municipality and resulted in the resettlement of the town on higher ground. Additionally, several settlements further upstream, as well as valuable infrastructure such as bridges, a railway, and a hydropower plant, were destroyed. Some events were related to large-scale slope instabilities and landslide processes in glacial sediments that transformed into highly mobile debris flows. However, the exact trigger mechanisms are still not entirely clear, and the potential role of glacial lakes in past and future mass flows remains to be analyzed. Here we applied RAMMS (RApid Mass Movement System), a physically based dynamic model, to reconstruct one of the 1998 events in the Sacsara catchment using the ASTER Global Digital Elevation Model (ASTER GDEM) with 30 m spatial resolution and a photogrammetric DEM compiled from ALOS PRISM data with 6 m spatial resolution. A sensitivity analysis for various model parameters such as friction and starting conditions was performed, along with an assessment of potential trigger factors. Based on these results, further potential debris flows for this catchment were modeled, including outburst scenarios of several glacial lakes. In combination with a vulnerability analysis, these hazard scenarios were then incorporated into a qualitative risk analysis. To further reduce the risk for the local communities, technical risk sheets were elaborated for each of the 17 local settlements in the catchment. Furthermore, an Early Warning System (EWS) has been designed. The modular structure of the EWS aims, as a first step, to install an inexpensive but efficient system that detects debris-flow-type mass movements and temporal damming of the river using trigger cables, geophones, and water level measurements. An independent energy supply, real-time data transfer to the data center in the municipality of Santa Teresa, and remote access to the system via the internet allow constant monitoring from within and outside the catchment. At a later stage the system can be enhanced by adding further sensors, cameras, meteorological stations, monitoring stations at glacier lakes, and related communication infrastructure. Risk management in such a context is a complex task: on the one hand, data and information scarcity as well as the environmental conditions challenge the scientific and technical aspects of debris-flow modeling and the design of the EWS. On the other hand, social aspects must be taken into account to make actions coherent with local risk perceptions and to achieve good preparedness of the population. For a successful realization of the EWS and the entire risk management scheme, the local and regional institutional framework must also be considered. This contribution thus illustrates the implementation of an integrated risk management strategy under the challenging conditions common to remote high-mountain regions.
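A hypothetical sketch of the simple detection logic such a modular EWS might start with, combining a trigger-cable break with geophone and water-level thresholds; the sensor names, thresholds, and combination rule are invented for illustration and are not the system described above.

```python
# Invented threshold-based alarm logic for a minimal debris-flow EWS sketch.
def debris_flow_alarm(cable_intact: bool, geophone_amp: float,
                      water_level_change_m: float,
                      geo_threshold: float = 5.0, level_threshold: float = 0.5) -> bool:
    if not cable_intact:
        return True                                    # cable break: direct detection
    # combined criterion: strong ground vibration plus abrupt change in river level
    return geophone_amp > geo_threshold and abs(water_level_change_m) > level_threshold

print(debris_flow_alarm(cable_intact=True, geophone_amp=7.2, water_level_change_m=0.8))
print(debris_flow_alarm(cable_intact=True, geophone_amp=2.0, water_level_change_m=0.1))
```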
Petoumenos, Kathy; Worm, Signe W; Fontas, Eric; Weber, Rainer; De Wit, Stephane; Bruyand, Mathias; Reiss, Peter; El-Sadr, Wafaa; Monforte, Antonella D'Arminio; Friis-Møller, Nina; Lundgren, Jens D; Law, Matthew G
2012-01-01
Introduction HIV-positive patients receiving combination antiretroviral therapy (cART) frequently experience metabolic complications such as dyslipidemia and insulin resistance, as well as lipodystrophy, increasing the risk of cardiovascular disease (CVD) and diabetes mellitus (DM). Rates of DM and other glucose-associated disorders among HIV-positive patients have been reported to range between 2 and 14%, and in an ageing HIV-positive population, the prevalence of DM is expected to continue to increase. This study aims to develop a model to predict the short-term (six-month) risk of DM in HIV-positive populations and to compare it with existing models developed in the general population. Methods All patients recruited to the Data Collection on Adverse events of Anti-HIV Drugs (D:A:D) study with follow-up data, without prior DM, myocardial infarction or other CVD events, and with a complete DM risk factor profile were included. Conventional risk factors identified in the general population as well as key HIV-related factors were assessed using Poisson regression methods. Expected probabilities of DM events were also determined based on the Framingham Offspring Study DM equation. The D:A:D and Framingham equations were then assessed using an internal-external validation process; the area under the receiver operating characteristic (AUROC) curve and predicted DM events were determined. Results Of 33,308 patients, 16,632 (50%) were included, with 376 cases of new-onset DM during 89,469 person-years (PY). Factors predictive of DM included higher glucose, body mass index (BMI) and triglyceride levels, and older age. Among HIV-related factors, recent CD4 counts of <200 cells/µL and lipodystrophy were predictive of new-onset DM. The mean performance of the D:A:D and Framingham equations yielded AUROCs of 0.894 (95% CI: 0.849, 0.940) and 0.877 (95% CI: 0.823, 0.932), respectively. The Framingham equation over-predicted DM events compared to D:A:D for lower glucose and lower triglyceride levels, and for BMI levels below 25 kg/m². Conclusions The D:A:D equation performed well in predicting the short-term onset of DM in the validation dataset and, for specific subgroups, provided better estimates of DM risk than the Framingham equation. PMID:23078769
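A hedged sketch of the modelling approach described: a Poisson model of short-term DM incidence with a person-time offset, with discrimination summarised by the AUROC; the predictors, effect sizes, and simulated data are assumptions, not the D:A:D equation.

```python
# Poisson incidence model with person-time offset and AUROC; simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(23)
n = 16_632
df = pd.DataFrame({
    "glucose":       rng.normal(5.2, 0.8, n),        # mmol/L
    "bmi":           rng.normal(25, 4, n),
    "age":           rng.integers(20, 75, n),
    "cd4_below_200": rng.binomial(1, 0.1, n),
    "py":            rng.uniform(0.3, 0.6, n),        # person-years in a 6-month window
})
rate = np.exp(-9 + 0.5 * df.glucose + 0.06 * df.bmi + 0.02 * df.age + 0.6 * df.cd4_below_200)
df["dm"] = rng.poisson(rate * df.py)

fit = smf.poisson("dm ~ glucose + bmi + age + cd4_below_200",
                  data=df, offset=np.log(df.py)).fit(disp=False)
pred_rate = fit.predict(df, offset=np.log(df.py))
print("incidence rate ratios:\n", np.exp(fit.params))
print("AUROC:", round(roc_auc_score((df.dm > 0).astype(int), pred_rate), 3))
```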