Yanagawa, T; Tokudome, S
1990-01-01
We developed methods to assess cancer risks using screening tests. These methods estimate the size of the high-risk group, adjusted for the characteristics of the screening tests, and estimate cancer incidence rates within that high-risk group, similarly adjusted. A method was also developed for selecting the cut-off point of a screening test. Finally, the methods were applied to estimate the risk of adult T-cell leukemia/lymphoma. PMID:2269244
Alternative evaluation metrics for risk adjustment methods.
Park, Sungchul; Basu, Anirban
2018-06-01
Risk adjustment is instituted to counter risk selection by accurately equating payments with expected expenditures. Traditional risk-adjustment methods are designed to estimate accurate payments at the group level. However, this generates residual risks at the individual level, especially for high-expenditure individuals, thereby inducing health plans to avoid those with high residual risks. To identify an optimal risk-adjustment method, we perform a comprehensive comparison of prediction accuracies at the group level, at the tail distributions, and at the individual level across 19 estimators: 9 parametric regression, 7 machine learning, and 3 distributional estimators. Using the 2013-2014 MarketScan database, we find that no one estimator performs best in all prediction accuracies. Generally, machine learning and distribution-based estimators achieve higher group-level prediction accuracy than parametric regression estimators. However, parametric regression estimators show higher tail distribution prediction accuracy and individual-level prediction accuracy, especially at the tails of the distribution. This suggests that there is a trade-off in selecting an appropriate risk-adjustment method between estimating accurate payments at the group level and lower residual risks at the individual level. Our results indicate that an optimal method cannot be determined solely on the basis of statistical metrics but rather needs to account for simulating plans' risk selective behaviors. Copyright © 2018 John Wiley & Sons, Ltd.
Salganik, Matthew J; Fazito, Dimitri; Bertoni, Neilane; Abdo, Alexandre H; Mello, Maeve B; Bastos, Francisco I
2011-11-15
One of the many challenges hindering the global response to the human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) epidemic is the difficulty of collecting reliable information about the populations most at risk for the disease. Thus, the authors empirically assessed a promising new method for estimating the sizes of most at-risk populations: the network scale-up method. Using 4 different data sources, 2 of which were from other researchers, the authors produced 5 estimates of the number of heavy drug users in Curitiba, Brazil. The authors found that the network scale-up and generalized network scale-up estimators produced estimates 5-10 times higher than estimates made using standard methods (the multiplier method and the direct estimation method using data from 2004 and 2010). Given that equally plausible methods produced such a wide range of results, the authors recommend that additional studies be undertaken to compare estimates based on the scale-up method with those made using other methods. If scale-up-based methods routinely produce higher estimates, this would suggest that scale-up-based methods are inappropriate for populations most at risk of HIV/AIDS or that standard methods may tend to underestimate the sizes of these populations.
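A minimal sketch of the basic network scale-up estimator described above (the generalized variant differs in its weighting; all survey numbers below are hypothetical):

```python
# Basic network scale-up estimator (a sketch; all numbers hypothetical).
# N_hidden ~= N_total * (hidden-population contacts reported)
#                     / (respondents' personal network sizes)

def scale_up_estimate(total_population, hidden_contacts, network_sizes):
    """Estimate a hidden population's size from survey reports.

    hidden_contacts[i]: how many members of the hidden population
                        respondent i reports knowing.
    network_sizes[i]:   respondent i's estimated personal network size.
    """
    return total_population * sum(hidden_contacts) / sum(network_sizes)

# Hypothetical survey of 5 respondents in a city of 1.8 million:
contacts = [2, 0, 1, 3, 0]             # heavy drug users known
networks = [300, 250, 400, 500, 350]   # personal network sizes

print(round(scale_up_estimate(1_800_000, contacts, networks)))
```

The estimator simply scales the observed proportion of hidden-population members within respondents' networks up to the whole city.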
What is the lifetime risk of developing cancer?: the effect of adjusting for multiple primaries
Sasieni, P D; Shelton, J; Ormiston-Smith, N; Thomson, C S; Silcocks, P B
2011-01-01
Background: The ‘lifetime risk' of cancer is generally estimated by combining current incidence rates with current all-cause mortality (‘current probability' method) rather than by describing the experience of a birth cohort. As individuals may get more than one type of cancer, what is generally estimated is the average (mean) number of cancers over a lifetime. This is not the same as the probability of getting cancer. Methods: We describe a method for estimating lifetime risk that corrects for the inclusion of multiple primary cancers in the incidence rates routinely published by cancer registries. The new method applies cancer incidence rates to the estimated probability of being alive without a previous cancer. The new method is illustrated using data from the Scottish Cancer Registry and is compared with ‘gold-standard' estimates that use (unpublished) data on first primaries. Results: The effect of this correction is to make the estimated ‘lifetime risk' smaller. The new estimates are extremely similar to those obtained using incidence based on first primaries. The usual ‘current probability' method considerably overestimates the lifetime risk of all cancers combined, although the correction for any single cancer site is minimal. Conclusion: Estimation of the lifetime risk of cancer should either be based on first primaries or should use the new method. PMID:21772332
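The correction can be illustrated with a toy life table: the usual 'current probability' method accumulates incidence against all-cause survival (yielding the mean number of cancers per lifetime), while the new method accumulates it against the probability of being alive and cancer-free. A sketch with hypothetical rates:

```python
# Sketch of the two lifetime-risk calculations on hypothetical 20-year
# age bands (rates are illustrative, not registry data).

incidence = [0.001, 0.005, 0.02, 0.05]   # annual cancer incidence per band
mortality = [0.001, 0.003, 0.01, 0.05]   # annual all-cause mortality per band
years = 20                               # width of each band

alive, alive_cancer_free = 1.0, 1.0
mean_cancers, corrected_risk = 0.0, 0.0
for inc, mort in zip(incidence, mortality):
    for _ in range(years):
        mean_cancers += alive * inc                # 'current probability' method
        corrected_risk += alive_cancer_free * inc  # new method: first cancers only
        alive *= (1 - mort)
        alive_cancer_free *= (1 - mort) * (1 - inc)

print(f"mean cancers per lifetime: {mean_cancers:.3f}")
print(f"corrected lifetime risk:   {corrected_risk:.3f}")  # always smaller
```

As the abstract notes, discounting people with a previous cancer can only shrink the estimate, so the corrected risk is necessarily below the mean-cancers figure.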
Landy, Rebecca; Cheung, Li C; Schiffman, Mark; Gage, Julia C; Hyun, Noorie; Wentzensen, Nicolas; Kinney, Walter K; Castle, Philip E; Fetterman, Barbara; Poitras, Nancy E; Lorey, Thomas; Sasieni, Peter D; Katki, Hormuzd A
2018-06-01
Electronic health records (EHR) are increasingly used by epidemiologists studying disease following surveillance testing to provide evidence for screening intervals and referral guidelines. Although cost-effective, undiagnosed prevalent disease and interval censoring (in which asymptomatic disease is only observed at the time of testing) raise substantial analytic issues when estimating risk that cannot be addressed using Kaplan-Meier methods. Based on our experience analysing EHR from cervical cancer screening, we previously proposed the logistic-Weibull model to address these issues. Here we demonstrate how the choice of statistical method can impact risk estimates. We use observed data on 41,067 women in the cervical cancer screening program at Kaiser Permanente Northern California, 2003-2013, as well as simulations, to evaluate the ability of different methods (Kaplan-Meier, Turnbull, Weibull and logistic-Weibull) to accurately estimate risk within a screening program. Cumulative risk estimates from the statistical methods varied considerably, with the largest differences occurring for prevalent disease risk when baseline disease ascertainment was random but incomplete. Kaplan-Meier underestimated risk at earlier times and overestimated risk at later times in the presence of interval censoring or undiagnosed prevalent disease. Turnbull performed well, though it was inefficient and not smooth. The logistic-Weibull model performed well, except when event times did not follow a Weibull distribution. We have demonstrated that methods for right-censored data, such as Kaplan-Meier, result in biased estimates of disease risks when applied to interval-censored data, such as screening programs using EHR data. The logistic-Weibull model is attractive, but the model fit must be checked against Turnbull non-parametric risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
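The logistic-Weibull idea, a logistic component for undiagnosed prevalent disease plus a Weibull component for incident disease over follow-up, can be sketched as a mixture cumulative-risk curve. The parameters below are hypothetical illustrations, not fitted values from the study:

```python
import math

def logistic_weibull_risk(t, intercept, shape, scale):
    """Cumulative disease risk at time t (years) under a
    prevalence + Weibull-incidence mixture (a sketch of the
    two-component idea; parameters are hypothetical)."""
    p_prev = 1 / (1 + math.exp(-intercept))              # logistic: prevalent disease at baseline
    weibull_cdf = 1 - math.exp(-((t / scale) ** shape))  # incident disease over follow-up
    return p_prev + (1 - p_prev) * weibull_cdf

# Risk at 0, 3 and 5 years with hypothetical parameters:
for t in [0.0, 3.0, 5.0]:
    print(f"{t:.0f}y: {logistic_weibull_risk(t, -4.0, 1.2, 40.0):.4f}")
```

Note the curve is positive at t = 0, capturing undiagnosed prevalent disease that a Kaplan-Meier estimate started at zero cannot represent.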
Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L
2016-02-10
Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure. Copyright © 2015 John Wiley & Sons, Ltd.
DeFelice, Nicholas B; Johnston, Jill E; Gibson, Jacqueline MacDonald
2015-08-18
The magnitude and spatial variability of acute gastrointestinal illness (AGI) cases attributable to microbial contamination of U.S. community drinking water systems are not well characterized. We compared three approaches (drinking water attributable risk, quantitative microbial risk assessment, and population intervention model) to estimate the annual number of emergency department visits for AGI attributable to microorganisms in North Carolina community water systems. All three methods used 2007-2013 water monitoring and emergency department data obtained from state agencies. The drinking water attributable risk method, which was the basis for previous U.S. Environmental Protection Agency national risk assessments, estimated that 7.9% of annual emergency department visits for AGI are attributable to microbial contamination of community water systems. However, the other methods' estimates were more than 2 orders of magnitude lower, each attributing 0.047% of annual emergency department visits for AGI to community water system contamination. The differences in results between the drinking water attributable risk method, which has been the main basis for previous national risk estimates, and the other two approaches highlight the need to improve methods for estimating endemic waterborne disease risks, in order to prioritize investments to improve community drinking water systems.
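The attributable-risk style of calculation can be illustrated with Levin's population attributable fraction; the inputs below are hypothetical, not the study's estimates:

```python
def attributable_fraction(exposed_prevalence, relative_risk):
    """Population attributable fraction (Levin's formula): the share of
    cases that would not occur absent the exposure."""
    excess = exposed_prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical inputs: 80% of the population served by community water
# systems, RR of 1.1 for AGI among the exposed.
paf = attributable_fraction(0.80, 1.1)
annual_ed_visits = 100_000
print(f"PAF = {paf:.3%}; attributable visits ~= {paf * annual_ed_visits:.0f}")
```

The large spread the study reports comes from how each method arrives at (or bypasses) quantities like this fraction, not from the arithmetic itself.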
A risk adjustment approach to estimating the burden of skin disease in the United States.
Lim, Henry W; Collins, Scott A B; Resneck, Jack S; Bolognia, Jean; Hodge, Julie A; Rohrer, Thomas A; Van Beek, Marta J; Margolis, David J; Sober, Arthur J; Weinstock, Martin A; Nerenz, David R; Begolka, Wendy Smith; Moyano, Jose V
2018-01-01
Direct insurance claims tabulation and risk adjustment statistical methods can be used to estimate health care costs associated with various diseases. In this third manuscript derived from the new national Burden of Skin Disease Report from the American Academy of Dermatology, a risk adjustment method that was based on modeling the average annual costs of individuals with or without specific diseases, and specifically tailored for 24 skin disease categories, was used to estimate the economic burden of skin disease. The results were compared with the claims tabulation method used in the first 2 parts of this project. The risk adjustment method estimated the direct health care costs of skin diseases to be $46 billion in 2013, approximately $15 billion less than estimates using claims tabulation. For individual skin diseases, the risk adjustment cost estimates ranged from 11% to 297% of those obtained using claims tabulation for the 10 most costly skin disease categories. Although either method may be used for purposes of estimating the costs of skin disease, the choice of method will affect the end result. These findings serve as an important reference for future discussions about the method chosen in health care payment models to estimate both the cost of skin disease and the potential cost impact of care changes. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
Assessment of Methods for Estimating Risk to Birds from ...
The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mortality to birds from ingestion of lead particles. Response to ERASC Request #16
Estimation model of life insurance claims risk for cancer patients by using Bayesian method
NASA Astrophysics Data System (ADS)
Sukono; Suyudi, M.; Islamiyati, F.; Supian, S.
2017-01-01
This paper discusses a model for estimating the risk of life insurance claims for cancer patients using a Bayesian method. To estimate claim risk, the insurance participant data are grouped into two counts: the number of policies issued and the number of claims incurred. The model is estimated using a Bayesian approach, and the resulting estimator is used to compute the claim-risk value for each age group and sex. The estimation results indicate that the risk premium for insured males aged under 30 years is 0.85; for ages 30 to 40, 3.58; for ages 41 to 50, 1.71; for ages 51 to 60, 2.96; and for those aged over 60, 7.82. For insured women, the corresponding values are 0.56 for ages under 30; 3.21 for ages 30 to 40; 0.65 for ages 41 to 50; 3.12 for ages 51 to 60; and 9.99 for those aged over 60. This study is useful in determining the risk premium in homogeneous groups based on gender and age.
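The grouped Bayesian estimation step can be sketched with a conjugate Beta-binomial model, a simplification of the paper's approach; the prior and the claim counts below are hypothetical:

```python
def bayes_claim_risk(claims, policies, prior_a=1.0, prior_b=1.0):
    """Posterior mean claim risk under a Beta(a, b) prior and a binomial
    likelihood: (a + claims) / (a + b + policies).
    A sketch of the Bayesian grouping idea; counts are hypothetical."""
    return (prior_a + claims) / (prior_a + prior_b + policies)

# Hypothetical portfolio, grouped by sex and age band:
groups = {
    ("male", "<30"): (17, 2000),
    ("male", "30-40"): (72, 2000),
    ("female", "<30"): (11, 2000),
}
for (sex, age), (claims, policies) in groups.items():
    print(sex, age, f"risk = {bayes_claim_risk(claims, policies):.4f}")
```

With a flat Beta(1, 1) prior the posterior mean shrinks each group's raw claim rate slightly toward 1/2, which stabilises estimates in small groups.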
NASA Astrophysics Data System (ADS)
Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao
2017-05-01
Cable overheating reduces the cable's insulation level, accelerates insulation aging, and can even cause short-circuit faults, so identification of and warning about cable overheating risk are necessary for distribution network operators. This paper proposes a cable overheating risk warning method based on impedance parameter estimation to improve the safety and reliability of distribution network operation. First, a cable impedance estimation model is established using the least-squares method on data from the distribution SCADA system, improving the accuracy of the impedance parameter estimate. Second, a threshold value of cable impedance is calculated from historical data, and a forecast value of cable impedance is calculated from SCADA forecast data. Third, a library of overheating risk warning rules is established; the forecast impedance and its rate of change are analyzed, and the overheating risk of a cable line is flagged according to the rules library, based on the relationship between impedance and line temperature rise. The method is simulated in the paper, and the simulation results show that it can accurately identify the impedance and forecast the temperature rise of cable lines in a distribution network. The resulting overheating risk warnings can provide a decision basis for operation, maintenance, and repair.
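The impedance-estimation step can be sketched as an origin-constrained least-squares fit of voltage drop against current. The SCADA-style samples and the threshold below are synthetic assumptions, not values from the paper:

```python
def estimate_resistance(currents, voltage_drops):
    """Least-squares fit of dV = R * I through the origin:
    R = sum(I * dV) / sum(I^2). A sketch of the impedance-estimation
    step; the samples below are synthetic."""
    num = sum(i * v for i, v in zip(currents, voltage_drops))
    den = sum(i * i for i in currents)
    return num / den

currents = [100.0, 150.0, 200.0, 250.0]   # A, synthetic SCADA samples
drops    = [5.1, 7.4, 10.2, 12.4]         # V (roughly a 0.05-ohm line)
r_hat = estimate_resistance(currents, drops)
threshold = 0.06                          # ohms, hypothetical historical threshold
print(f"R = {r_hat:.4f} ohm; overheating warning: {r_hat > threshold}")
```

A rising resistance estimate over successive windows would then feed the rules library, since conductor resistance grows with temperature.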
Are Structural Estimates of Auction Models Reasonable? Evidence from Experimental Data
ERIC Educational Resources Information Center
Bajari, Patrick; Hortacsu, Ali
2005-01-01
Recently, economists have developed methods for structural estimation of auction models. Many researchers object to these methods because they find the strict rationality assumptions to be implausible. Using bid data from first-price auction experiments, we estimate four alternative structural models: (1) risk-neutral Bayes-Nash, (2) risk-averse…
DOT National Transportation Integrated Search
2017-01-01
This report summarizes the variety of methods used to estimate and evaluate exposure to risk in pedestrian and bicyclist safety analyses. In the literature, the most common definition of risk was a measure of the probability of a crash to occur given...
Epidemiologic research using probabilistic outcome definitions.
Cai, Bing; Hennessy, Sean; Lo Re, Vincent; Small, Dylan S
2015-01-01
Epidemiologic studies using electronic healthcare data often define the presence or absence of binary clinical outcomes by using algorithms with imperfect specificity, sensitivity, and positive predictive value. This results in misclassification and bias in study results. We describe and evaluate a new method called probabilistic outcome definition (POD) that uses logistic regression to estimate the probability of a clinical outcome using multiple potential algorithms and then uses multiple imputation to make valid inferences about the risk ratio or other epidemiologic parameters of interest. We conducted a simulation to evaluate the performance of the POD method with two variables that can predict the true outcome and compared the POD method with the conventional method. The simulation results showed that when the true risk ratio is equal to 1.0 (null), the conventional method based on a binary outcome provides unbiased estimates. However, when the risk ratio is not equal to 1.0, the traditional method, whether using one predictive variable or both predictive variables to define the outcome, is biased when the positive predictive value is <100%, and the bias is very severe when the sensitivity or positive predictive value is poor (less than 0.75 in our simulation). In contrast, the POD method provides unbiased estimates of the risk ratio both when this measure of effect is equal to 1.0 and when it is not. Even when the sensitivity and positive predictive value are low, the POD method continues to provide unbiased estimates of the risk ratio. The POD method provides an improved way to define outcomes in database research. This method has a major advantage over the conventional method in that it provides unbiased estimates of risk ratios and is easy to use. Copyright © 2014 John Wiley & Sons, Ltd.
Estimating relative risks for common outcome using PROC NLP.
Yu, Binbing; Wang, Zhuoqiao
2008-05-01
In cross-sectional or cohort studies with binary outcomes, it is biologically interpretable and of interest to estimate the relative risk or prevalence ratio, especially when the response rates are not rare. Several methods have been used to estimate the relative risk, among which the log-binomial models yield the maximum likelihood estimate (MLE) of the parameters. Because of restrictions on the parameter space, the log-binomial models often run into convergence problems. Some remedies, e.g., the Poisson and Cox regressions, have been proposed. However, these methods may give out-of-bound predicted response probabilities. In this paper, a new computation method using the SAS Nonlinear Programming (NLP) procedure is proposed to find the MLEs. The proposed NLP method was compared to the COPY method, a modified method to fit the log-binomial model. Issues in the implementation are discussed. For illustration, both methods were applied to data on the prevalence of microalbuminuria (micro-protein leakage into urine) for kidney disease patients from the Diabetes Control and Complications Trial. The sample SAS macro for calculating relative risk is provided in the appendix.
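For a single binary exposure the log-binomial MLE reduces to the ratio of observed proportions, which makes the target quantity easy to illustrate without PROC NLP. The counts below are hypothetical:

```python
import math

def relative_risk_ci(a, n1, c, n0, z=1.96):
    """Relative risk and Wald CI on the log scale for a 2x2 table.
    a/n1: events/total among the exposed; c/n0: among the unexposed.
    (For this saturated one-exposure case, the log-binomial MLE is
    simply p1/p0.)"""
    rr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical common-outcome data (response rates far from rare):
rr, lo, hi = relative_risk_ci(a=120, n1=300, c=80, n0=300)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The convergence difficulties the abstract describes arise once covariates are added and the fitted probabilities must stay within [0, 1], which is where constrained optimisation such as PROC NLP earns its keep.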
Takeuchi, Yoshinori; Shinozaki, Tomohiro; Matsuyama, Yutaka
2018-01-08
Despite the frequent use of self-controlled methods in pharmacoepidemiological studies, the factors that may bias the estimates from these methods have not been adequately compared in real-world settings. Here, we comparatively examined the impact of a time-varying confounder and its interactions with time-invariant confounders, time trends in exposures and events, restrictions, and misspecification of risk period durations on the estimators from three self-controlled methods. This study analyzed self-controlled case series (SCCS), case-crossover (CCO) design, and sequence symmetry analysis (SSA) using simulated and actual electronic medical records datasets. We evaluated the performance of the three self-controlled methods in simulated cohorts for the following scenarios: 1) time-invariant confounding with interactions between the confounders, 2) time-invariant and time-varying confounding without interactions, 3) time-invariant and time-varying confounding with interactions among the confounders, 4) time trends in exposures and events, 5) restricted follow-up time based on event occurrence, and 6) patient restriction based on event history. The sensitivity of the estimators to misspecified risk period durations was also evaluated. As a case study, we applied these methods to evaluate the risk of macrolides on liver injury using electronic medical records. In the simulation analysis, time-varying confounding produced bias in the SCCS and CCO design estimates, which aggravated in the presence of interactions between the time-invariant and time-varying confounders. The SCCS estimates were biased by time trends in both exposures and events. Erroneously short risk periods introduced bias to the CCO design estimate, whereas erroneously long risk periods introduced bias to the estimates of all three methods. Restricting the follow-up time led to severe bias in the SSA estimates. The SCCS estimates were sensitive to patient restriction. 
The case study showed that although macrolide use was significantly associated with increased liver injury occurrence in all methods, the value of the estimates varied. The estimations of the three self-controlled methods depended on various underlying assumptions, and the violation of these assumptions may cause non-negligible bias in the resulting estimates. Pharmacoepidemiologists should select the appropriate self-controlled method based on how well the relevant key assumptions are satisfied with respect to the available data.
The lifetime risk of maternal mortality: concept and measurement
2009-01-01
Abstract Objective The lifetime risk of maternal mortality, which describes the cumulative loss of life due to maternal deaths over the female life course, is an important summary measure of population health. However, despite its interpretive appeal, the lifetime risk of dying from maternal causes can be defined and calculated in various ways. A clear and concise discussion of both its underlying concept and methods of measurement is badly needed. Methods I define and compare a variety of procedures for calculating the lifetime risk of maternal mortality. I use detailed survey data from Bangladesh in 2001 to illustrate these calculations and compare the properties of the various risk measures. Using official UN estimates of maternal mortality for 2005, I document the differences in lifetime risk derived with the various measures. Findings Taking sub-Saharan Africa as an example, the range of estimates for the 2005 lifetime risk extends from 3.41% to 5.76%, or from 1 in 29 to 1 in 17. The highest value resulted from the method used for producing official UN estimates for the year 2000. The measure recommended here has an intermediate value of 4.47%, or 1 in 22. Conclusion There are strong reasons to consider the calculation method proposed here more accurate and appropriate than earlier procedures. Accordingly, it was adopted for use in producing the 2005 UN estimates of the lifetime risk of maternal mortality. By comparison, the method used for the 2000 UN estimates appears to overestimate this important measure of population health by around 20%. PMID:19551233
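Two common closed-form approximations illustrate how the calculation method changes the answer; the inputs below are hypothetical, and the measure the paper actually recommends requires full life-table data rather than either shortcut:

```python
# Two back-of-envelope approximations to the lifetime risk of maternal
# death (illustrative only; the recommended life-table measure needs
# full survival data).

def ltr_linear(tfr, mm_ratio):
    """Naive: expected maternal deaths per woman = births x risk per birth."""
    return tfr * mm_ratio

def ltr_product(tfr, mm_ratio):
    """Treat each of TFR births as an independent survival trial."""
    return 1 - (1 - mm_ratio) ** tfr

tfr = 5.2          # hypothetical total fertility rate
mm_ratio = 0.009   # hypothetical maternal deaths per live birth (900 per 100,000)
print(f"linear:  {ltr_linear(tfr, mm_ratio):.4f}  (~1 in {1 / ltr_linear(tfr, mm_ratio):.0f})")
print(f"product: {ltr_product(tfr, mm_ratio):.4f} (~1 in {1 / ltr_product(tfr, mm_ratio):.0f})")
```

Even these two simple formulas disagree, mirroring the abstract's point that plausible definitions of the lifetime risk span a wide range.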
Petromilli Nordi Sasso Garcia, Patrícia; Polli, Gabriela Scatimburgo; Campos, Juliana Alvares Duarte Bonini
2013-01-01
As dentistry is a profession that demands manipulative precision of hand movements, musculoskeletal disorders are among its most common occupational diseases. This study estimated the risk of musculoskeletal disorders developing in dental students using the Ovako Working Analysis System (OWAS) and Rapid Upper Limb Assessment (RULA) methods, and estimated the diagnostic agreement between the 2 methods. Students (n = 75) enrolled in the final undergraduate year at the Araraquara School of Dentistry--UNESP--were studied. Photographs were taken of students while performing diverse clinical procedures (n = 283) using a digital camera, and were assessed using OWAS and RULA. A risk score was attributed to each procedure performed by the student. The prevalence of the risk of musculoskeletal disorders was estimated per point and for a 95% CI. To assess agreement between the 2 methods, Kappa statistics with linear weighting were used, with a level of significance of 5%. According to the OWAS method, there was a high prevalence of the medium risk score in the dental students evaluated (p = 97.88%; 95% CI: 96.20-99.56%); according to the RULA method, there was a high prevalence of the high risk score (p = 40.6%; 95% CI: 34.9-46.4%) and the extremely high risk score (p = 59.4%; 95% CI: 53.6-65.1%). Null agreement (k = 0) was verified in the risk diagnosis of the tested methods. The risk of musculoskeletal disorders in dental students estimated by the OWAS method was medium, whereas the same risk by the RULA method was extremely high. There was no diagnostic agreement between the OWAS and RULA methods.
Unmanned aircraft system sense and avoid integrity and continuity
NASA Astrophysics Data System (ADS)
Jamoom, Michael B.
This thesis describes new methods to guarantee safety of sense and avoid (SAA) functions for Unmanned Aircraft Systems (UAS) by evaluating integrity and continuity risks. Previous SAA efforts focused on relative safety metrics, such as risk ratios, comparing the risk of using an SAA system versus not using it. The methods in this thesis evaluate integrity and continuity risks as absolute measures of safety, as is the established practice in commercial aircraft terminal area navigation applications. The main contribution of this thesis is a derivation of a new method, based on a standard intruder relative constant velocity assumption, that uses hazard state estimates and estimate error covariances to establish (1) the integrity risk of the SAA system not detecting imminent loss of "well clear," which is the time and distance required to maintain safe separation from intruder aircraft, and (2) the probability of false alert, the continuity risk. Another contribution is applying these integrity and continuity risk evaluation methods to set quantifiable and certifiable safety requirements on sensors. A sensitivity analysis uses this methodology to evaluate the impact of sensor errors on integrity and continuity risks. The penultimate contribution is an integrity and continuity risk evaluation where the estimation model is refined to address realistic intruder relative linear accelerations, which goes beyond the current constant velocity standard. The final contribution is an integrity and continuity risk evaluation addressing multiple intruders. This evaluation is a new innovation-based method to determine the risk of mis-associating intruder measurements. A mis-association occurs when the SAA system incorrectly associates a measurement with the wrong intruder, causing large errors in the estimated intruder trajectories.
The new methods described in this thesis can help ensure safe encounters between aircraft and enable SAA sensor certification for UAS integration into the National Airspace System.
Reinforcing flood-risk estimation.
Reed, Duncan W
2002-07-15
Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin?
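A minimal sketch of single-site flood-frequency estimation from peak flows alone, using a method-of-moments Gumbel fit; the annual maxima below are synthetic, not gauged data:

```python
import math

def gumbel_fit_moments(peaks):
    """Method-of-moments Gumbel fit to annual maximum flows
    (a sketch; the peaks supplied below are synthetic)."""
    n = len(peaks)
    mean = sum(peaks) / n
    var = sum((x - mean) ** 2 for x in peaks) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)
    return mu, beta

def return_level(mu, beta, T):
    """Flow exceeded on average once every T years (Gumbel quantile)."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

peaks = [310, 280, 450, 390, 520, 300, 610, 340, 420, 480]  # m^3/s, synthetic
mu, beta = gumbel_fit_moments(peaks)
print(f"100-year flood estimate: {return_level(mu, beta, 100):.0f} m^3/s")
```

The paper's theme is precisely that such a gauged-data-only estimate should be checked against historical floods, rainfall information, and basin-wide consistency before it is trusted.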
Population-based absolute risk estimation with survey data
Kovalchik, Stephanie A.; Pfeiffer, Ruth M.
2013-01-01
Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
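The piecewise exponential cause-specific hazard formulation can be sketched numerically as follows; the hazards are hypothetical, and the survey weighting and influence-based variance estimation of the paper are omitted:

```python
import math

def absolute_risk(h_event, h_competing, widths):
    """Absolute risk of the event of interest under piecewise-constant
    cause-specific hazards (a sketch; rates are hypothetical).
    h_event[k], h_competing[k]: annual hazards in interval k of
    widths[k] years."""
    risk, log_surv = 0.0, 0.0
    for h1, h2, w in zip(h_event, h_competing, widths):
        total = h1 + h2
        surv_start = math.exp(log_surv)
        # P(event in interval) = S(start) * (h1/total) * (1 - exp(-total*w))
        risk += surv_start * (h1 / total) * (1 - math.exp(-total * w))
        log_surv -= total * w
    return risk

# 10-year cardiovascular-death risk with cancer death as a competing cause:
risk = absolute_risk(h_event=[0.01, 0.02], h_competing=[0.005, 0.01], widths=[5, 5])
print(f"10-year absolute risk: {risk:.4f}")
```

Because survival is depleted by both causes, this absolute risk is strictly below the naive 1 − exp(−∫h_event) figure that ignores the competing events.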
Kaplan-Meier Survival Analysis Overestimates the Risk of Revision Arthroplasty: A Meta-analysis.
Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter D; Ghali, William A; Marshall, Deborah A
2015-11-01
Although Kaplan-Meier survival analysis is commonly used to estimate the cumulative incidence of revision after joint arthroplasty, it theoretically overestimates the risk of revision in the presence of competing risks (such as death). Because the magnitude of overestimation is not well documented, the potential associated impact on clinical and policy decision-making remains unknown. We performed a meta-analysis to answer the following questions: (1) To what extent does the Kaplan-Meier method overestimate the cumulative incidence of revision after joint replacement compared with alternative competing-risks methods? (2) Is the extent of overestimation influenced by followup time or rate of competing risks? We searched Ovid MEDLINE, EMBASE, BIOSIS Previews, and Web of Science (1946, 1980, 1980, and 1899, respectively, to October 26, 2013) and included article bibliographies for studies comparing estimated cumulative incidence of revision after hip or knee arthroplasty obtained using both Kaplan-Meier and competing-risks methods. We excluded conference abstracts, unpublished studies, or studies using simulated data sets. Two reviewers independently extracted data and evaluated the quality of reporting of the included studies. Among 1160 abstracts identified, six studies were included in our meta-analysis. The principal reason for the steep attrition (1160 to six) was that the initial search was for studies in any clinical area that compared the cumulative incidence estimated using the Kaplan-Meier versus competing-risks methods for any event (not just the cumulative incidence of hip or knee revision); we did this to minimize the likelihood of missing any relevant studies. We calculated risk ratios (RRs) comparing the cumulative incidence estimated using the Kaplan-Meier method with the competing-risks method for each study and used DerSimonian and Laird random effects models to pool these RRs. 
Heterogeneity was explored using stratified meta-analyses and metaregression. The pooled cumulative incidence of revision after hip or knee arthroplasty obtained using the Kaplan-Meier method was 1.55 times higher (95% confidence interval, 1.43-1.68; p < 0.001) than that obtained using the competing-risks method. Longer followup times and higher proportions of competing risks were not associated with increases in the amount of overestimation of revision risk by the Kaplan-Meier method (all p > 0.10). This may be due to the small number of studies that met the inclusion criteria and conservative variance approximation. The Kaplan-Meier method overestimates risk of revision after hip or knee arthroplasty in populations where competing risks (such as death) might preclude the occurrence of the event of interest (revision). Competing-risks methods should be used to more accurately estimate the cumulative incidence of revision when the goal is to plan healthcare services and resource allocation for revisions.
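The contrast between the two estimators can be sketched in a few lines of pure Python. The toy event data below are illustrative and not taken from the meta-analysis (event code 0 = censored, 1 = revision, 2 = death); with distinct event times, the Kaplan-Meier complement and the Aalen-Johansen cumulative incidence reduce to simple running products:

```python
def km_cuminc(data, target=1):
    """1 - Kaplan-Meier survival, treating competing events as censored.
    data: list of (time, event) pairs with distinct times."""
    surv, at_risk = 1.0, len(data)
    for _, e in sorted(data):
        if e == target:
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1          # every subject leaves the risk set at its time
    return 1.0 - surv

def aj_cuminc(data, target=1):
    """Aalen-Johansen cumulative incidence, accounting for competing risks."""
    surv, cif, at_risk = 1.0, 0.0, len(data)
    for _, e in sorted(data):
        if e == target:
            cif += surv / at_risk     # probability of reaching and having the event
        if e != 0:                    # any event removes mass from overall survival
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return cif

# toy cohort: revisions (1) compete with deaths (2); 0 = censored
toy = list(zip(range(1, 11), [1, 2, 1, 0, 2, 1, 0, 1, 2, 0]))
print(km_cuminc(toy), aj_cuminc(toy))
```

On these invented data the Kaplan-Meier complement is 0.58 versus a cumulative incidence of about 0.47, a ratio of roughly 1.2, qualitatively in the same direction as the pooled overestimation reported above.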
Le Vu, Stéphane; Ratmann, Oliver; Delpech, Valerie; Brown, Alison E; Gill, O Noel; Tostevin, Anna; Fraser, Christophe; Volz, Erik M
2018-06-01
Phylogenetic clustering of HIV sequences from a random sample of patients can reveal epidemiological transmission patterns, but interpretation is hampered by limited theoretical support, and the statistical properties of clustering analysis remain poorly understood. Alternatively, source attribution methods allow fitting of HIV transmission models and can thereby quantify aspects of disease transmission. A simulation study was conducted to assess error rates of clustering methods for detecting transmission risk factors. We modeled HIV epidemics among men who have sex with men and generated phylogenies comparable to those that can be obtained from HIV surveillance data in the UK. Clustering and source attribution approaches were applied to evaluate their ability to identify patient attributes as transmission risk factors. We find that commonly used methods show a misleading association between cluster size or odds of clustering and covariates that are correlated with time since infection, regardless of their influence on transmission. Clustering methods usually have higher error rates and lower sensitivity than the source attribution method for identifying transmission risk factors, but neither approach provides robust estimates of transmission risk ratios. Source attribution can alleviate the drawbacks of phylogenetic clustering, but formal population genetic modeling may be required to estimate quantitative transmission risk factors. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Risk Classification with an Adaptive Naive Bayes Kernel Machine Model.
Minnier, Jessica; Yuan, Ming; Liu, Jun S; Cai, Tianxi
2015-04-22
Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single-marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper use of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set-specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene-set selection are derived, and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models.
Vascular disease, ESRD, and death: interpreting competing risk analyses.
Grams, Morgan E; Coresh, Josef; Segev, Dorry L; Kucirka, Lauren M; Tighiouart, Hocine; Sarnak, Mark J
2012-10-01
Vascular disease, a common condition in CKD, is a risk factor for mortality and ESRD. Optimal patient care requires accurate estimation and ordering of these competing risks. This is a prospective cohort study of screened (n=885) and randomized participants (n=837) in the Modification of Diet in Renal Disease study (original study enrollment, 1989-1992), evaluating the association of vascular disease with ESRD and pre-ESRD mortality using standard survival analysis and competing risk regression. The method of analysis resulted in markedly different estimates. Cumulative incidence by standard analysis (censoring at the competing event) implied that, with vascular disease, the 15-year incidence was 66% and 51% for ESRD and pre-ESRD death, respectively. A more accurate representation of absolute risk was estimated with competing risk regression: 15-year incidence was 54% and 29% for ESRD and pre-ESRD death, respectively. For the association of vascular disease with pre-ESRD death, estimates of relative risk by the two methods were similar (standard survival analysis adjusted hazard ratio, 1.63; 95% confidence interval, 1.20-2.20; competing risk regression adjusted subhazard ratio, 1.57; 95% confidence interval, 1.15-2.14). In contrast, the hazard and subhazard ratios differed substantially for other associations, such as GFR and pre-ESRD mortality. When competing events exist, absolute risk is better estimated using competing risk regression, but etiologic associations by this method must be carefully interpreted. The presence of vascular disease in CKD decreases the likelihood of survival to ESRD, independent of age and other risk factors.
Mobile Applications for Type 2 Diabetes Risk Estimation: a Systematic Review.
Fijacko, Nino; Brzan, Petra Povalej; Stiglic, Gregor
2015-10-01
Screening for chronic diseases such as type 2 diabetes can be done using different methods and various risk tests. This study presents a review of type 2 diabetes risk estimation mobile applications, focusing on their functionality and on the availability of information about the underlying risk calculators. Only 9 of the 31 reviewed mobile applications, featured in three major mobile application stores, disclosed the name of the risk calculator used for assessing the risk of type 2 diabetes. More concerning still, none of the reviewed applications mentioned collecting data from users to improve the performance of their risk calculators, or offered users descriptive statistics of the results from previous users. For that purpose, the questionnaires used for risk calculation should be upgraded to include the most recent blood sugar level measurements from users. Although mobile applications hold great potential for health applications, developers still do not put enough emphasis on informing users of the underlying methods used to estimate the risk of a specific clinical condition.
Multiple imputation for handling missing outcome data when estimating the relative risk.
Sullivan, Thomas R; Lee, Katherine J; Ryan, Philip; Salter, Amy B
2017-09-06
Multiple imputation is a popular approach to handling missing data in medical research, yet little is known about its applicability for estimating the relative risk. Standard methods for imputing incomplete binary outcomes involve logistic regression or an assumption of multivariate normality, whereas relative risks are typically estimated using log binomial models. It is unclear whether misspecification of the imputation model in this setting could lead to biased parameter estimates. Using simulated data, we evaluated the performance of multiple imputation for handling missing data prior to estimating adjusted relative risks from a correctly specified multivariable log binomial model. We considered an arbitrary pattern of missing data in both outcome and exposure variables, with missing data induced under missing at random mechanisms. Focusing on standard model-based methods of multiple imputation, missing data were imputed using multivariate normal imputation or fully conditional specification with a logistic imputation model for the outcome. Multivariate normal imputation performed poorly in the simulation study, consistently producing estimates of the relative risk that were biased towards the null. Despite outperforming multivariate normal imputation, fully conditional specification also produced somewhat biased estimates, with greater bias observed for higher outcome prevalences and larger relative risks. Deleting imputed outcomes from analysis datasets did not improve the performance of fully conditional specification. Both multivariate normal imputation and fully conditional specification produced biased estimates of the relative risk, presumably since both use a misspecified imputation model. Based on simulation results, we recommend researchers use fully conditional specification rather than multivariate normal imputation and retain imputed outcomes in the analysis when estimating relative risks. 
However, fully conditional specification is not without its shortcomings, so further research is needed to identify optimal approaches for relative risk estimation within the multiple imputation framework.
Estimating seat belt effectiveness using matched-pair cohort methods.
Cummings, Peter; Wells, James D; Rivara, Frederick P
2003-01-01
Using US data for 1986-1998 fatal crashes, we employed matched-pair analysis methods to estimate that the relative risk of death among belted compared with unbelted occupants was 0.39 (95% confidence interval (CI) 0.37-0.41). This differs from relative risk estimates of about 0.55 in studies that used crash data collected prior to 1986. Using 1975-1998 data, we examined and rejected three theories that might explain the difference between our estimate and older estimates: (1) differences in the analysis methods; (2) changes related to car model year; (3) changes in crash characteristics over time. A fourth theory, that the introduction of seat belt laws would induce some survivors to claim belt use when they were not restrained, could explain part of the difference between our estimate and older estimates; but even in states without seat belt laws, from 1986 through 1998, the relative risk estimate was 0.45 (95% CI 0.39-0.52). All of the difference between our estimate and older estimates could be explained by some misclassification of seat belt use. Relative risk estimates would move away from 1, toward their true value, if misclassification of both the belted and unbelted decreased over time, or if the degree of misclassification remained constant, as the prevalence of belt use increased. We conclude that estimates of seat belt effects based upon data prior to 1986 may be biased toward 1 by misclassification.
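For 1:1 matched pairs (one belted and one unbelted occupant in the same crash), the Mantel-Haenszel risk ratio collapses to a simple ratio of pair counts; pairs in which neither occupant died contribute nothing. The counts below are hypothetical, chosen only to illustrate the arithmetic, and are not the study's data:

```python
def matched_pair_rr(both_died, belted_only_died, unbelted_only_died):
    """Mantel-Haenszel risk ratio for 1:1 matched cohort pairs.
    Pairs with no deaths drop out of both numerator and denominator."""
    return (both_died + belted_only_died) / (both_died + unbelted_only_died)

# hypothetical pair counts, not taken from the paper
rr = matched_pair_rr(both_died=100, belted_only_died=56, unbelted_only_died=300)
print(rr)  # 156 / 400
```

Note that non-differential misclassification of belt use moves discordant pairs into the wrong cells, pushing this ratio toward 1, which is the bias mechanism the abstract describes.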
A forecasting method to reduce estimation bias in self-reported cell phone data.
Redmayne, Mary; Smith, Euan; Abramson, Michael J
2013-01-01
There is ongoing concern that extended exposure to cell phone electromagnetic radiation could be related to an increased risk of negative health effects. Epidemiological studies seek to assess this risk, usually relying on participants' recalled use, but recall is notoriously poor. Our objectives were primarily to produce a forecast method, for use by such studies, to reduce estimation bias in the recalled extent of cell phone use. The method we developed, using Bayes' rule, is modelled with data we collected in a cross-sectional cluster survey exploring cell phone user-habits among New Zealand adolescents. Participants recalled their recent extent of SMS-texting and retrieved from their provider the current month's actual use-to-date. Actual use was taken as the gold standard in the analyses. Estimation bias arose from a large random error, as observed in all cell phone validation studies. We demonstrate that this seriously exaggerates upper-end forecasts of use when used in regression models. This means that calculations using a regression model will lead to underestimation of heavy-users' relative risk. Our Bayesian method substantially reduces estimation bias. In cases where other studies' data conforms to our method's requirements, application should reduce estimation bias, leading to a more accurate relative risk calculation for mid-to-heavy users.
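The core of such a Bayesian correction can be sketched with made-up numbers: a prior over true usage bands and an assumed recall-error matrix (neither taken from the study) yield, via Bayes' rule, a posterior that shrinks extreme recalled values toward the center of the distribution:

```python
# hypothetical inputs: 3 usage bands (0 = low, 1 = mid, 2 = high)
prior = [0.5, 0.3, 0.2]                 # P(true band), assumed
recall_given_true = [                   # P(recalled band | true band), assumed
    [0.70, 0.20, 0.10],
    [0.25, 0.50, 0.25],
    [0.10, 0.30, 0.60],
]

def posterior_true_band(recalled):
    """Bayes' rule: P(true band | recalled band)."""
    joint = [prior[t] * recall_given_true[t][recalled] for t in range(3)]
    z = sum(joint)
    return [j / z for j in joint]

post = posterior_true_band(2)           # a respondent who recalls heavy use
```

Even for a respondent recalling heavy use, the posterior probability of truly being a heavy user here is below 0.5, illustrating how the forecast method damps exaggerated upper-end estimates and thus the attenuation of relative risks for heavy users.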
Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P
2014-06-26
To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination was low. The robust Poisson models are more robust (or less sensitive) to outliers than the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
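Whichever regression model is used, the target quantity is the risk ratio; as a baseline, the unadjusted relative risk and its large-sample Wald interval can be computed directly from a 2x2 table. The counts here are arbitrary, purely to show the arithmetic:

```python
import math

def relative_risk(a, b, c, d):
    """RR with 95% Wald CI from a 2x2 table.
    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = rr * math.exp(-1.96 * se_log)
    hi = rr * math.exp(1.96 * se_log)
    return rr, lo, hi

rr, lo, hi = relative_risk(20, 80, 10, 90)   # risks 0.20 vs 0.10, so RR = 2.0
```

Model-based estimators (log-binomial, robust Poisson) generalize this calculation to adjusted ratios with covariates; the simple table version is useful as a sanity check.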
Estimating Risk from Ambient Concentrations of Acrolein across the United States
Woodruff, Tracey J.; Wells, Ellen M.; Holt, Elizabeth W.; Burgin, Deborah E.; Axelrad, Daniel A.
2007-01-01
Background Estimated ambient concentrations of acrolein, a hazardous air pollutant, are greater than the U.S. Environmental Protection Agency (EPA) reference concentration throughout the United States, making it a concern for human health. However, there is no method for assessing the extent of risk under the U.S. EPA noncancer risk assessment framework. Objectives We estimated excess risks from ambient concentrations of acrolein based on dose–response modeling of a study in rats with a relationship between acrolein and residual volume/total lung capacity ratio (RV/TLC) and specific compliance (sCL), markers for altered lung function. Methods Based on existing literature, we defined values above the 90th percentile for controls as “adverse.” We estimated the increase over baseline response that would occur in the human population from estimated ambient concentrations of acrolein, taken from the U.S. EPA’s National-Scale Air Toxics Assessment for 1999, after standard animal-to-human conversions and extrapolating to doses below the experimental data. Results The estimated median additional number of adverse sCL outcomes across the United States was approximately 2.5 cases per 1,000 people. The estimated range of additional outcomes from the 5th to the 95th percentile of acrolein concentration levels across census tracts was 0.28–14 cases per 1,000. For RV/TLC, the median additional outcome was 0.002 per 1,000, and the additional outcome at the 95th percentile was 0.13 per 1,000. Conclusions Although there are uncertainties in estimating human risks from animal data, this analysis demonstrates a method for estimating health risks for noncancer effects and suggests that acrolein could be associated with decreased respiratory function in the United States. PMID:17431491
NASA Astrophysics Data System (ADS)
Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua
2017-05-01
With the purpose of reinforcing the correlation analysis of risk-assessment threat factors, a dynamic safety-risk assessment method based on particle filtering is proposed, with threat analysis at its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weight of each indicator, and determines information system risk levels by combining the results with state estimation theory. To improve the computational efficiency of the particle filtering algorithm, a k-means clustering algorithm is introduced: all particles are clustered, and each cluster's centroid serves as its representative in subsequent computations, reducing the computational load. Empirical results indicate that the method reasonably captures the mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management control strategy.
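A minimal sketch of the combination the abstract describes, with all dynamics, noise levels, and the cluster count invented for illustration: particles track a latent threat level, weights come from a Gaussian likelihood of an observed indicator, and one-dimensional k-means compresses the resampled cloud to a few representatives:

```python
import math
import random

random.seed(0)

def pf_step(particles, obs, process_sd=0.1, obs_sd=0.2):
    """One predict-weight-resample step of a bootstrap particle filter."""
    moved = [p + random.gauss(0, process_sd) for p in particles]
    weights = [math.exp(-(obs - p) ** 2 / (2 * obs_sd ** 2)) for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(moved, weights=weights, k=len(particles))

def kmeans_1d(points, k, iters=25):
    """Compress the particle cloud to k representative centroids."""
    centers = sorted(random.sample(points, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: abs(p - centers[j]))].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

particles = [random.gauss(0.0, 0.5) for _ in range(200)]
for _ in range(10):                      # repeated observations of a high threat level
    particles = pf_step(particles, obs=1.0)
estimate = sum(particles) / len(particles)
reps = kmeans_1d(particles, k=3)         # compact representation for cheaper updates
```

Operating on the k centroids instead of all particles is what reduces the per-step cost in the method described above; the state estimate here converges toward the observed threat level.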
Survival analysis with error-prone time-varying covariates: a risk set calibration approach
Liao, Xiaomei; Zucker, David M.; Li, Yi; Spiegelman, Donna
2010-01-01
Summary Occupational, environmental, and nutritional epidemiologists are often interested in estimating the prospective effect of time-varying exposure variables such as cumulative exposure or cumulative updated average exposure, in relation to chronic disease endpoints such as cancer incidence and mortality. From exposure validation studies, it is apparent that many of the variables of interest are measured with moderate to substantial error. Although the ordinary regression calibration approach is approximately valid and efficient for measurement error correction of relative risk estimates from the Cox model with time-independent point exposures when the disease is rare, it is not adaptable for use with time-varying exposures. By re-calibrating the measurement error model within each risk set, a risk set regression calibration method is proposed for this setting. An algorithm for a bias-corrected point estimate of the relative risk using an RRC approach is presented, followed by the derivation of an estimate of its variance, resulting in a sandwich estimator. Emphasis is on methods applicable to the main study/external validation study design, which arises in important applications. Simulation studies under several assumptions about the error model were carried out, which demonstrated the validity and efficiency of the method in finite samples. The method was applied to a study of diet and cancer from Harvard’s Health Professionals Follow-up Study (HPFS). PMID:20486928
NASA Astrophysics Data System (ADS)
Dewi Ratih, Iis; Sutijo Supri Ulama, Brodjol; Prastuti, Mike
2018-03-01
Value at Risk (VaR) is a statistical method used to measure market risk by estimating the worst loss over a given time period at a given confidence level. The accuracy of this measure is very important in determining the amount of capital a company must hold to cover possible losses, because greater risk implies greater potential losses at a given probability. For this reason, VaR calculation has been of particular concern to researchers and practitioners in the stock market, with the aim of obtaining more accurate estimates. In this research, a risk analysis of four banking sub-sector stocks is carried out: Bank Rakyat Indonesia, Bank Mandiri, Bank Central Asia, and Bank Negara Indonesia. Stock returns are expected to be influenced by exogenous variables, namely the ICI and the exchange rate. Therefore, stock risk estimation is performed using the VaR ARMAX-GARCHX method. Calculating the VaR value with the ARMAX-GARCHX approach using a window of 500 gives more accurate results. Overall, Bank Central Asia was the only bank whose estimated maximum loss fell in the 5% quantile.
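Setting the ARMAX-GARCHX machinery aside, the underlying VaR concept is simply a loss quantile of the return distribution. This stdlib sketch computes a historical-simulation VaR from a toy return series; the numbers are invented and the quantile convention is one of several in use:

```python
def historical_var(returns, alpha=0.05):
    """Historical-simulation VaR: loss at roughly the alpha-quantile of returns."""
    ordered = sorted(returns)           # worst returns first
    idx = int(alpha * len(ordered))     # position of the alpha-quantile
    return -ordered[idx]                # VaR is reported as a positive loss

# illustrative daily returns, not market data
toy_returns = [-0.08, -0.05, -0.03, -0.02, -0.01, 0.00, 0.00, 0.01,
               0.01, 0.02, 0.02, 0.02, 0.03, 0.03, 0.04, 0.04,
               0.05, 0.05, 0.06, 0.07]
var_5 = historical_var(toy_returns)     # 5% one-period VaR
```

Parametric approaches such as ARMAX-GARCHX replace the empirical quantile with a quantile of a fitted conditional distribution, which lets the VaR adapt to volatility clustering.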
Uncertainties in estimates of the risks of late effects from space radiation
NASA Astrophysics Data System (ADS)
Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.
2004-01-01
Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits.
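The Monte-Carlo propagation step in the linear-additivity framework can be illustrated in a few lines. The factor list, lognormal spreads, and nominal coefficient below are placeholders, not the paper's subjective uncertainty distributions:

```python
import random

random.seed(42)

def sampled_risk():
    """One draw of risk = nominal coefficient x product of uncertain factors."""
    nominal = 0.05                             # assumed nominal risk per unit dose
    physics = random.lognormvariate(0.0, 0.3)  # dosimetry/transport uncertainty
    quality = random.lognormvariate(0.0, 0.5)  # radiation quality factor uncertainty
    ddref = random.lognormvariate(0.0, 0.4)    # dose-rate effectiveness uncertainty
    return nominal * physics * quality / ddref

draws = sorted(sampled_risk() for _ in range(10000))
lo, hi = draws[249], draws[9749]               # 95% subjective confidence interval
```

The width of the resulting interval, here spanning well over a factor of three around the nominal value, is exactly the kind of spread that can mask modest risk reductions from shielding optimization.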
Pretest probability estimation in the evaluation of patients with possible deep vein thrombosis.
Vinson, David R; Patel, Jason P; Irving, Cedric S
2011-07-01
An estimation of pretest probability is integral to the proper interpretation of a negative compression ultrasound in the diagnostic assessment of lower-extremity deep vein thrombosis. We sought to determine the rate, method, and predictors of pretest probability estimation in such patients. This cross-sectional study of outpatients was conducted in a suburban community hospital in 2006. Estimation of pretest probability was done by enzyme-linked immunosorbent assay d-dimer, Wells criteria, and unstructured clinical impression. Using logistic regression analysis, we measured predictors of documented risk assessment. A cohort analysis was undertaken to compare 3-month thromboembolic outcomes between risk groups. Among 524 cases, 289 (55.2%) underwent pretest probability estimation using the following methods: enzyme-linked immunosorbent assay d-dimer (228; 43.5%), clinical impression (106; 20.2%), and Wells criteria (24; 4.6%), with 69 (13.2%) patients undergoing a combination of at least two methods. Patient factors were not predictive of pretest probability estimation, but the specialty of the clinician was predictive; emergency physicians (P < .0001) and specialty clinicians (P = .001) were less likely than primary care clinicians to perform risk assessment. Thromboembolic events within 3 months were experienced by 0 of 52 patients in the explicitly low-risk group, 4 (1.8%) of 219 in the explicitly moderate- to high-risk group, and 1 (0.4%) of 226 in the group that did not undergo explicit risk assessment. Negative ultrasounds in the workup of deep vein thrombosis are commonly interpreted in isolation apart from pretest probability estimations. Risk assessments varied by physician specialties. Opportunities exist for improvement in the diagnostic evaluation of these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
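A structured pretest probability assessment like the Wells criteria is, computationally, just a weighted checklist. The item names and weights below follow a commonly cited version of the Wells DVT score but are reproduced from memory for illustration only; this is not a clinical tool and the original publication should be consulted for the authoritative item list:

```python
# illustrative item weights for a Wells-style DVT score (assumed, not authoritative)
WELLS_ITEMS = {
    "active_cancer": 1, "paralysis_or_immobilization": 1,
    "bedridden_3days_or_major_surgery_12wk": 1, "localized_tenderness": 1,
    "entire_leg_swollen": 1, "calf_swelling_gt_3cm": 1,
    "pitting_edema": 1, "collateral_superficial_veins": 1,
    "alternative_diagnosis_as_likely": -2,
}

def wells_score(findings):
    """Sum item weights for the findings present (a set of item names)."""
    return sum(w for item, w in WELLS_ITEMS.items() if item in findings)

def risk_category(score):
    """Conventional three-level stratification of the score."""
    return "low" if score <= 0 else "moderate" if score <= 2 else "high"
```

The point of such explicit scoring, as the study above notes, is that a negative ultrasound is interpreted against a documented pretest category rather than in isolation.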
Xu, Fang; Wallace, Robyn C.; Garvin, William; Greenlund, Kurt J.; Bartoli, William; Ford, Derek; Eke, Paul; Town, G. Machell
2016-01-01
Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213
The estimation of time-varying risks in asset pricing modelling using B-Spline method
NASA Astrophysics Data System (ADS)
Nurjannah; Solimun; Rinaldo, Adji
2017-12-01
Asset pricing modelling has been extensively studied in the past few decades to explore the risk-return relationship. The asset pricing literature typically assumes a static risk-return relationship. However, several studies have found anomalies in asset pricing modelling that capture the presence of risk instability. Dynamic models have been proposed to offer a better alternative. The main problem highlighted in the dynamic model literature is that the set of conditioning information is unobservable, and therefore some assumptions have to be made; the estimation thus requires additional assumptions about the dynamics of risk. To overcome this problem, nonparametric estimators can be used as an alternative for estimating risk. The flexibility of the nonparametric setting avoids the misspecification that can arise from selecting a functional form. This paper investigates the estimation of a time-varying asset pricing model using the B-Spline, a nonparametric approach. The advantages of the spline method are its computational speed and simplicity, as well as the ability to control curvature directly. Three popular asset pricing models are investigated: the CAPM (Capital Asset Pricing Model), the Fama-French 3-factors model, and the Carhart 4-factors model. The results suggest that the estimated risks are time-varying and not stable over time, which confirms the risk-instability anomaly. This result is more pronounced in Carhart's 4-factors model.
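The B-spline basis underlying such an approach can be evaluated with the Cox-de Boor recursion; a time-varying beta is then modeled as a weighted sum of these basis functions. A minimal pure-Python version, with uniform knots chosen purely for illustration:

```python
def bspline_basis(i, degree, t, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of given degree."""
    if degree == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + degree] != knots[i]:
        left = ((t - knots[i]) / (knots[i + degree] - knots[i])
                * bspline_basis(i, degree - 1, t, knots))
    if knots[i + degree + 1] != knots[i + 1]:
        right = ((knots[i + degree + 1] - t) / (knots[i + degree + 1] - knots[i + 1])
                 * bspline_basis(i + 1, degree - 1, t, knots))
    return left + right

knots = [0, 1, 2, 3, 4, 5, 6]           # uniform knots, illustrative
values = [bspline_basis(i, 2, 3.0, knots) for i in range(4)]
```

Inside the valid span the bases form a partition of unity, which is what makes a spline-weighted beta a locally controlled, smooth function of time.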
Zeng, Chan; Newcomer, Sophia R; Glanz, Jason M; Shoup, Jo Ann; Daley, Matthew F; Hambidge, Simon J; Xu, Stanley
2013-12-15
The self-controlled case series (SCCS) method is often used to examine the temporal association between vaccination and adverse events using only data from patients who experienced such events. Conditional Poisson regression models are used to estimate incidence rate ratios, and these models perform well with large or medium-sized case samples. However, in some vaccine safety studies, the adverse events studied are rare and the maximum likelihood estimates may be biased. Several bias correction methods have been examined in case-control studies using conditional logistic regression, but none of these methods have been evaluated in studies using the SCCS design. In this study, we used simulations to evaluate two bias correction approaches, the Firth penalized maximum likelihood method and Cordeiro and McCullagh's bias reduction after maximum likelihood estimation, with small sample sizes in studies using the SCCS design. The simulations showed that the bias under the SCCS design with a small number of cases can be large and is also sensitive to a short risk period. The Firth correction method provides finite and less biased estimates than the maximum likelihood method and Cordeiro and McCullagh's method. However, limitations still exist when the risk period in the SCCS design is short relative to the entire observation period.
Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa
2015-01-01
The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure compared with the use of discrete point estimates as inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are affected by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health practice. PMID:26302336
Comparison of risk estimates using life-table methods.
Sullivan, R E; Weng, P S
1987-08-01
Risk estimates promulgated by various radiation protection authorities in recent years have become increasingly more complex. Early "integral" estimates in the form of health effects per 0.01 person-Gy (per person-rad) or per 10^4 person-Gy (per 10^6 person-rad) have tended to be replaced by "differential" estimates which are age- and sex-dependent and specify both minimum induction (latency) and duration of risk expression (plateau) periods. These latter types of risk estimate must be used in conjunction with a life table in order to reduce them to integral form. In this paper, the life table has been used to effect a comparison of the organ and tissue risk estimates derived in several recent reports. In addition, a brief review of life-table methodology is presented and some features of the models used in deriving differential coefficients are discussed. While the great number of permutations possible with dose-response models, detailed risk estimates and proposed projection models precludes any unique result, the reduced integral coefficients are required to conform to the linear, absolute-risk model recommended for use with the integral risk estimates reviewed.
Van der Bij, Sjoukje; Vermeulen, Roel C H; Portengen, Lützen; Moons, Karel G M; Koffijberg, Hendrik
2016-05-01
Exposure to asbestos fibres increases the risk of mesothelioma and lung cancer. Although the vast majority of mesothelioma cases are caused by asbestos exposure, the number of asbestos-related lung cancers is less clear. This number cannot be determined directly as lung cancer causes are not clinically distinguishable but may be estimated using varying modelling methods. We applied three different modelling methods to the Dutch population supplemented with uncertainty ranges (UR) due to uncertainty in model input values. The first method estimated asbestos-related lung cancer cases directly from observed and predicted mesothelioma cases in an age-period-cohort analysis. The second method used evidence on the fraction of lung cancer cases attributable (population attributable risk (PAR)) to asbestos exposure. The third method incorporated risk estimates and population exposure estimates to perform a life table analysis. The three methods varied substantially in incorporated evidence. Moreover, the estimated number of asbestos-related lung cancer cases in the Netherlands between 2011 and 2030 depended crucially on the actual method applied, as the mesothelioma method predicts 17 500 expected cases (UR 7000-57 000), the PAR method predicts 12 150 cases (UR 6700-19 000), and the life table analysis predicts 6800 cases (UR 6800-33 850). The three different methods described resulted in absolute estimates varying by a factor of ∼2.5. These results show that accurate estimation of the impact of asbestos exposure on the lung cancer burden remains a challenge. Published by the BMJ Publishing Group Limited.
Methods to Develop Inhalation Cancer Risk Estimates for ...
This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units. The purpose of this document is to discuss the methods used to develop inhalation cancer risk estimates associated with emissions of chromium and nickel compounds from coal- and oil-fired electric utility steam generating units (EGUs) in support of EPA's recently proposed Air Toxics Rule.
Uncertainties in Estimates of the Risks of Late Effects from Space Radiation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.
2002-01-01
The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including ISS, a lunar station, a deep space outpost, and Mars missions of 360, 660, and 1000 days duration. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction from the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits.
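The core of the approach, propagating subjective uncertainty distributions through a multiplicative risk model by Monte Carlo sampling, can be sketched as follows. The point risk, the factor names, and the lognormal widths below are purely illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical multiplicative risk model: a point risk estimate times
# uncertainty factors (here: dose-rate, radiation quality, population
# transfer), each assigned a subjective lognormal uncertainty distribution.
point_risk = 0.03                                   # e.g. 3% fatal-cancer risk
factors = {
    "dose_rate": rng.lognormal(mean=0.0, sigma=0.3, size=n),
    "quality":   rng.lognormal(mean=0.0, sigma=0.5, size=n),
    "transfer":  rng.lognormal(mean=0.0, sigma=0.2, size=n),
}
risk = point_risk * np.prod(list(factors.values()), axis=0)

lo, med, hi = np.percentile(risk, [2.5, 50.0, 97.5])
print(f"median {med:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

The resulting percentile band is the "probability distribution of risk" the abstract refers to; widening any single factor's distribution immediately shows which factor dominates the projection uncertainty.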
Empirical evaluation of the market price of risk using the CIR model
NASA Astrophysics Data System (ADS)
Bernaschi, M.; Torosantucci, L.; Uboldi, A.
2007-03-01
We describe a simple but effective method for the estimation of the market price of risk. The basic idea is to compare the results obtained by following two different approaches in the application of the Cox-Ingersoll-Ross (CIR) model. In the first case, we apply the non-linear least squares method to cross-sectional data (i.e., all rates of a single day). In the second case, we consider the short rate obtained by means of the first procedure as a proxy of the real market short rate. Starting from this new proxy, we evaluate the parameters of the CIR model by means of martingale estimation techniques. The estimate of the market price of risk is provided by comparing the results obtained with these two techniques, since this approach makes it possible to isolate the market price of risk and to evaluate, under the Local Expectations Hypothesis, the risk premium given by the market for different maturities. As a test case, we apply the method to data from the European Fixed Income Market.
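A minimal sketch of only the time-series leg of this procedure (estimating the CIR drift from a short-rate proxy); the cross-sectional least-squares step and the martingale machinery are not shown, and all parameter values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Euler-discretised CIR short rate: dr = kappa*(theta - r)dt + sigma*sqrt(r)dW.
kappa, theta, sigma, dt = 0.5, 0.04, 0.1, 1 / 252
n = 12_600                                   # roughly 50 years of daily rates
r = np.empty(n)
r[0] = theta
for t in range(n - 1):
    r[t + 1] = (r[t] + kappa * (theta - r[t]) * dt
                + sigma * np.sqrt(max(r[t], 0.0) * dt) * rng.standard_normal())

# Conditional least squares: regress dr on (1, r) to recover the drift,
# since dr ~ kappa*theta*dt - kappa*r*dt + noise.
dr = np.diff(r)
A = np.column_stack([np.ones(n - 1), r[:-1]])
a, b = np.linalg.lstsq(A, dr, rcond=None)[0]
kappa_hat, theta_hat = -b / dt, -a / b
print(f"kappa {kappa_hat:.2f}, theta {theta_hat:.4f}")
```

The long-run level theta is recovered well; the mean-reversion speed kappa is notoriously noisy and upward-biased in finite samples, which is one reason the paper cross-checks against a cross-sectional fit.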
McDonald, S A; Hutchinson, S J; Schnier, C; McLeod, A; Goldberg, D J
2014-01-01
In countries maintaining national hepatitis C virus (HCV) surveillance systems, a substantial proportion of individuals report no risk factors for infection. Our goal was to estimate the proportion of diagnosed HCV antibody-positive persons in Scotland (1991-2010) who probably acquired infection through injecting drug use (IDU), by combining data on IDU risk from four linked data sources using log-linear capture-recapture methods. Of 25,521 HCV-diagnosed individuals, 14,836 (58%) reported IDU risk with their HCV diagnosis. Log-linear modelling estimated a further 2484 HCV-diagnosed individuals with IDU risk, giving an estimated prevalence of 83%. Stratified analyses indicated variation across birth cohort, with estimated prevalence as low as 49% in persons born before 1960 and greater than 90% for those born since 1960. These findings provide public-health professionals with a more complete profile of Scotland's HCV-infected population in terms of transmission route, which is essential for targeting educational, prevention and treatment interventions.
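The study fits log-linear models to four linked sources; the simplest member of the capture-recapture family, the two-source Chapman estimator, conveys the idea. The register counts below are hypothetical, not the Scottish data:

```python
def chapman(n1: int, n2: int, m: int) -> float:
    """Chapman's nearly unbiased two-source capture-recapture estimate of
    total population size, given list sizes n1 and n2 and overlap m."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: 9000 IDU-flagged records in source A, 7000 in
# source B, 4200 appearing in both linked sources.
n_total = chapman(9000, 7000, 4200)
print(round(n_total))  # -> 14999
```

With four sources, a log-linear (Poisson) model on the 2^4 - 1 observable capture histories plays the same role while allowing source-dependence terms, which is why the paper uses it instead of a simple two-list estimate.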
Bergsvik, Daniel; Rogeberg, Ole
2018-04-01
The provision of accurate information on health damaging behaviours and products is a widely accepted and widespread governmental task. It is easily mismanaged. This study demonstrates a simple method which can help to evaluate whether such information corrects recipient risk beliefs. Participants assess risks numerically, before and after being exposed to a relevant risk communication. Accuracy is incentivised by awarding financial prizes to answers closest to a pursued risk belief. To illustrate this method, 228 students from the University of Oslo, Norway, were asked to estimate the mortality risk of Swedish snus and cigarettes twice, before and after being exposed to one of three risk communications with information on the health dangers of snus. The data allow us to measure how participants updated their risk beliefs after being exposed to different risk communications. Risk information from the government strongly distorted risk perceptions for snus. A newspaper article discussing the relative risks of cigarettes and snus reduced belief errors regarding snus risks, but increased belief errors regarding smoking. The perceived quality of the risk communication was not associated with decreased belief errors. Public health information can potentially make the public less informed on risks about harmful products or behaviours. This risk can be reduced by targeting identified, measurable belief errors and empirically assessing how alternative communications affect these. The proposed method of incentivised risk estimation might be helpful in future assessments of risk communications. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Huggins, Richard
2013-10-01
Precise estimation of the relative risk of motorcyclists being involved in a fatal accident compared to car drivers is difficult. Simple estimates based on the proportions of licenced drivers or riders that are killed in a fatal accident are biased as they do not take into account the exposure to risk. However, exposure is difficult to quantify. Here we adapt the ideas behind the well known induced exposure methods and use available summary data on speeding detections and fatalities for motorcycle riders and car drivers to estimate the relative risk of a fatality for motorcyclists compared to car drivers under mild assumptions. The method is applied to data on motorcycle riders and car drivers in Victoria, Australia in 2010 and a small simulation study is conducted. Copyright © 2013 Elsevier Ltd. All rights reserved.
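The induced-exposure idea reduces, in its simplest form, to using an exposure proxy (here speeding detections) in place of unobservable travel exposure. The counts below are hypothetical illustrations, not the Victorian data:

```python
def induced_exposure_rr(fatal_m: int, expos_m: int,
                        fatal_c: int, expos_c: int) -> float:
    """Relative fatality risk of motorcyclists vs car drivers, using an
    exposure proxy (e.g. speeding detections) as the denominator."""
    return (fatal_m / expos_m) / (fatal_c / expos_c)

# Hypothetical counts: 40 rider fatalities against 20,000 detections,
# versus 200 driver fatalities against 900,000 detections.
rr = induced_exposure_rr(40, 20_000, 200, 900_000)
print(f"{rr:.1f}")  # -> 9.0
```

The "mild assumptions" in the abstract amount to the proxy being proportional to true exposure in the same way for both groups; if detection intensity differs between riders and drivers, the ratio is biased accordingly.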
Uncertainties in estimates of the risks of late effects from space radiation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.
2004-01-01
Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. Published by Elsevier Ltd on behalf of COSPAR.
Allodji, Rodrigue S; Schwartz, Boris; Diallo, Ibrahima; Agbovon, Césaire; Laurier, Dominique; de Vathaire, Florent
2015-08-01
Analyses of the Life Span Study (LSS) of Japanese atomic bomb survivors have routinely incorporated corrections for additive classical measurement errors using regression calibration. Recently, several studies reported that the simulation-extrapolation method (SIMEX) is slightly more accurate than the simple regression calibration method (RCAL). In the present paper, the SIMEX and RCAL methods are used to address errors in atomic bomb survivor dosimetry in solid cancer and leukaemia mortality risk estimates. For instance, it is shown that with the SIMEX method the ERR/Gy is increased by about 29% for all solid cancer deaths using a linear model compared to the RCAL method; for leukaemia deaths based on a linear-quadratic model, the corrected linear term (EAR per 10^4 person-years at 1 Gy) is decreased by about 8%, while the corrected quadratic term (EAR per 10^4 person-years/Gy^2) is increased by about 65%. The results with the SIMEX method are slightly higher than published values. The observed differences are probably due to the fact that with the RCAL method the dosimetric data were only partially corrected, while all doses were considered with the SIMEX method. Therefore, one should be careful when comparing the estimated risks, and it may be useful to apply several correction techniques in order to obtain a range of corrected estimates, rather than to rely on a single technique. This work should help improve the risk estimates derived from LSS data and make the development of radiation protection standards more reliable.
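SIMEX itself is easy to sketch on a toy linear model with classical additive error: deliberately add extra error of variance zeta times the (assumed known) error variance, refit, and extrapolate the fitted trend back to zeta = -1. This is a generic illustration, not the LSS dose-response model:

```python
import numpy as np

rng = np.random.default_rng(4)
n, beta, var_u = 4000, 1.0, 0.5

x = rng.standard_normal(n)                       # true dose, unobserved
y = beta * x + 0.5 * rng.standard_normal(n)      # outcome
w = x + np.sqrt(var_u) * rng.standard_normal(n)  # error-prone dose

def slope(wv):
    return np.cov(wv, y)[0, 1] / np.var(wv)      # attenuated OLS slope

zetas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
means = []
for z in zetas:
    # B replicates with added error of variance z * var_u, then average.
    reps = [slope(w + np.sqrt(z * var_u) * rng.standard_normal(n))
            for _ in range(100)]
    means.append(np.mean(reps))

naive = means[0]
coef = np.polyfit(zetas, means, 2)     # quadratic extrapolant
simex = np.polyval(coef, -1.0)         # extrapolate to zeta = -1
print(f"naive {naive:.2f}, SIMEX {simex:.2f} (true {beta})")
```

The naive slope is attenuated toward zero; the quadratic extrapolation recovers most, though not all, of the attenuation, which matches SIMEX's known behaviour of reducing rather than eliminating bias.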
Braunstein, Sarah L; van de Wijgert, Janneke H; Vyankandondera, Joseph; Kestelyn, Evelyne; Ntirushwa, Justin; Nash, Denis
2012-01-01
Background: The epidemiologic utility of STARHS hinges not only on producing accurate estimates of HIV incidence, but also on identifying risk factors for recent HIV infection. Methods: As part of an HIV seroincidence study, 800 Rwandan female sex workers (FSW) were HIV tested, with those testing positive further tested by BED-CEIA (BED) and AxSYM Avidity Index (Ax-AI) assays. A sample of HIV-negative (N=397) FSW were followed prospectively for HIV seroconversion. We compared estimates of risk factors for: 1) prevalent HIV infection; 2) recently acquired HIV infection (RI) based on three different STARHS classifications (BED alone, Ax-AI alone, BED/Ax-AI combined); and 3) prospectively observed seroconversion. Results: There was mixed agreement in risk factors between methods. HSV-2 coinfection and recent STI treatment were associated with both prevalent HIV infection and all three measures of recent infection. A number of risk factors were associated only with prevalent infection, including widowhood, history of forced sex, regular alcohol consumption, prior imprisonment, and current breastfeeding. Number of sex partners in the last 3 months was associated with recent infection based on BED/Ax-AI combined, but not other STARHS-based recent infection outcomes or prevalent infection. Risk factor estimates for prospectively observed seroconversion differed in magnitude and direction from those for recent infection via STARHS. Conclusions: Differences in risk factor estimates by each method could reflect true differences in risk factors between the prevalent, recently, or newly infected populations, the effect of study interventions (among those followed prospectively), or assay misclassification. Similar investigations in other populations/settings are needed to further establish the epidemiologic utility of STARHS for identifying risk factors, in addition to incidence rate estimation. PMID:23056162
[Survival analysis with competing risks: estimating failure probability].
Llorca, Javier; Delgado-Rodríguez, Miguel
2004-01-01
To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
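The abstract's central point can be reproduced in a few lines of simulation. The exponential rates below are hypothetical, not the transplant data: treating pre-rejection death as censoring, 1 minus Kaplan-Meier overstates the probability of rejection relative to the multiple-decrement (cumulative incidence) estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
t_reject = rng.exponential(1 / 0.1, n)   # time to chronic rejection
t_death = rng.exponential(1 / 0.2, n)    # time to pre-rejection death
time = np.minimum(t_reject, t_death)     # first event observed
cause = np.where(t_reject <= t_death, 1, 2)   # 1 = rejection, 2 = death
tau = 10.0

# Naive: 1 - Kaplan-Meier, with deaths treated as censoring.
order = np.argsort(time)
t_sorted, c_sorted = time[order], cause[order]
at_risk = n - np.arange(n)               # risk set just before each event
in_window = t_sorted <= tau
km = np.prod(1 - (c_sorted[in_window] == 1) / at_risk[in_window])
naive = 1 - km

# Competing-risks (multiple decrement) cumulative incidence of rejection.
cif = np.mean((cause == 1) & (time <= tau))

print(f"1-KM: {naive:.3f}  CIF: {cif:.3f}")  # KM overestimates rejection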
Quantifying the hurricane catastrophe risk to offshore wind power.
Rose, Stephen; Jaramillo, Paulina; Small, Mitchell J; Apt, Jay
2013-12-01
The U.S. Department of Energy has estimated that over 50 GW of offshore wind power will be required for the United States to generate 20% of its electricity from wind. Developers are actively planning offshore wind farms along the U.S. Atlantic and Gulf coasts and several leases have been signed for offshore sites. These planned projects are in areas that are sometimes struck by hurricanes. We present a method to estimate the catastrophe risk to offshore wind power using simulated hurricanes. Using this method, we estimate the fraction of offshore wind power simultaneously offline and the cumulative damage in a region. In Texas, the most vulnerable region we studied, 10% of offshore wind power could be offline simultaneously because of hurricane damage with a 100-year return period and 6% could be destroyed in any 10-year period. We also estimate the risks to single wind farms in four representative locations; we find the risks are significant but lower than those estimated in previously published results. Much of the hurricane risk to offshore wind turbines can be mitigated by designing turbines for higher maximum wind speeds, ensuring that turbine nacelles can turn quickly to track the wind direction even when grid power is lost, and building in areas with lower risk. © 2013 Society for Risk Analysis.
Multiple imputation for estimating the risk of developing dementia and its impact on survival.
Yu, Binbing; Saczynski, Jane S; Launer, Lenore
2010-10-01
Dementia, Alzheimer's disease in particular, is one of the major causes of disability and decreased quality of life among the elderly and a leading obstacle to successful aging. Given the profound impact on public health, much research has focused on the age-specific risk of developing dementia and the impact on survival. Early work has discussed various methods of estimating age-specific incidence of dementia, among which the illness-death model is popular for modeling disease progression. In this article we use multiple imputation to fit multi-state models for survival data with interval censoring and left truncation. This approach allows semi-Markov models in which survival after dementia depends on onset age. Such models can be used to estimate the cumulative risk of developing dementia in the presence of the competing risk of dementia-free death. Simulations are carried out to examine the performance of the proposed method. Data from the Honolulu Asia Aging Study are analyzed to estimate the age-specific and cumulative risks of dementia and to examine the effect of major risk factors on dementia onset and death.
Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R
2013-01-01
The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management with proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injury compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on estimating a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved useful for understanding the HCO risk structure in terms of frequency, severity, and expected and unexpected loss related to adverse events.
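A minimal frequency-severity Monte Carlo sketch of the loss-distribution idea. The claim data are simulated stand-ins (the lognormal severity, Poisson frequency, and annual claim rate are all assumptions for illustration), and the Bayesian hierarchical stratification used in the paper is not shown:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical claim severities (kEUR) standing in for the compensation
# claims; lognormal severity with Poisson claim frequency is a common
# parametric choice, fitted here by maximum likelihood on the log scale.
claims = rng.lognormal(mean=3.0, sigma=1.0, size=206)
mu, sig = np.log(claims).mean(), np.log(claims).std()
freq = 25.0                               # assumed mean claims per year

annual_loss = np.array([
    rng.lognormal(mu, sig, rng.poisson(freq)).sum()
    for _ in range(20_000)
])

el = annual_loss.mean()                   # expected (pure-premium) loss
p95, p99 = np.percentile(annual_loss, [95, 99])
print(f"expected loss {el:.0f}, 95th pct {p95:.0f}, 99th pct {p99:.0f} kEUR")
```

The expected value and the tail percentiles are exactly the "expected and unexpected loss" indexes the abstract describes; stratifying the severity fit by department would reproduce the paper's per-department indexes.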
Fundamentals of health risk assessment. Use, derivation, validity and limitations of safety indices.
Putzrath, R M; Wilson, J D
1999-04-01
We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the "NAS paradigm." Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard, safety indices such as the "Acceptable Daily Intake" and "Reference Dose," usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these safety indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example, in California's "Proposition 65," where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: one that characterizes risk as a likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, such as in EPA's implementation of rules for conventional air pollutants. These estimates cannot be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and in how current practice is described.
Asbestos exposure--quantitative assessment of risk
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, J.M.; Weill, H.
Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past, relatively high asbestos concentration levels down to the usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber types (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison, e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), and 5 to 20 from whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
Estimating parameter of Rayleigh distribution by using Maximum Likelihood method and Bayes method
NASA Astrophysics Data System (ADS)
Ardianti, Fitri; Sutarman
2018-01-01
In this paper, we use maximum likelihood estimation and the Bayes method under several risk functions to estimate the parameter of the Rayleigh distribution and to determine which method performs best. The prior used in the Bayes method is Jeffreys' non-informative prior. Maximum likelihood estimation and Bayes estimation under the precautionary loss function, the entropy loss function, and the L1 loss function are compared by bias and MSE, computed using an R program. The results are then displayed in tables to facilitate the comparisons.
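The comparison can be sketched in a few lines. For a Rayleigh sample the ML estimate of the scale is sigma-hat = sqrt(sum(x^2)/(2n)); under the Jeffreys prior the posterior of sigma^2 is inverse-gamma, and its posterior mean gives one simple Bayes-flavoured point estimate. The specific loss functions of the paper are not reproduced here, and the sample sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_true, n, n_sims = 2.0, 10, 20_000

mle, bayes = [], []
for _ in range(n_sims):
    x = rng.rayleigh(sigma_true, n)
    s2 = np.sum(x ** 2)
    mle.append(np.sqrt(s2 / (2 * n)))        # ML estimate of sigma
    # Under the Jeffreys prior pi(sigma) ~ 1/sigma, the posterior of
    # sigma^2 is inverse-gamma(n, s2/2); taking the square root of its
    # posterior mean gives a simple Bayes-style comparator.
    bayes.append(np.sqrt(s2 / (2 * (n - 1))))

for name, est in [("MLE", np.array(mle)), ("Bayes", np.array(bayes))]:
    bias = est.mean() - sigma_true
    mse = np.mean((est - sigma_true) ** 2)
    print(f"{name}: bias {bias:+.4f}, MSE {mse:.4f}")
```

Repeating this over simulated samples and tabulating bias and MSE is exactly the comparison protocol the abstract describes, just with different estimators plugged in.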
Wang, Z.
2007-01-01
Although the causes of large intraplate earthquakes are still not fully understood, such earthquakes pose real hazard and risk to society. Estimating hazard and risk in these regions is difficult because of the lack of earthquake records. The New Madrid seismic zone is one such region where large and rare intraplate earthquakes (M = 7.0 or greater) pose significant hazard and risk. Many different definitions of hazard and risk have been used, and the resulting estimates differ dramatically. In this paper, seismic hazard is defined as the natural phenomenon generated by earthquakes, such as ground motion, and is quantified by two parameters: a level of hazard and its occurrence frequency or mean recurrence interval; seismic risk is defined as the probability of occurrence of a specific level of seismic hazard over a certain time and is quantified by three parameters: probability, a level of hazard, and exposure time. Probabilistic seismic hazard analysis (PSHA), a commonly used method for estimating seismic hazard and risk, derives a relationship between a ground motion parameter and its return period (hazard curve). The return period is not an independent temporal parameter but a mathematical extrapolation of the recurrence interval of earthquakes and the uncertainty of ground motion. Therefore, it is difficult to understand and use PSHA. A new method is proposed and applied here for estimating seismic hazard in the New Madrid seismic zone. This method provides hazard estimates that are consistent with the state of our knowledge and can be easily applied to other intraplate regions. © 2007 The Geological Society of America.
NASA Astrophysics Data System (ADS)
Paudel, Y.; Botzen, W. J. W.; Aerts, J. C. J. H.
2013-03-01
This study applies Bayesian Inference to estimate flood risk for 53 dyke ring areas in the Netherlands, and focuses particularly on the data scarcity and extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on these results, flood insurance premiums are estimated using two different practical methods that each account in different ways for an insurer's risk aversion and the dispersion rate of loss data. This study is of practical relevance because insurers have been considering the introduction of flood insurance in the Netherlands, which is currently not generally available.
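The premium-setting step can be illustrated with two textbook premium principles applied to a simulated loss distribution; these are simplified stand-ins for the paper's two methods, and the flood probability, damage distribution, and loading factors are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(11)

# Stylised dyke-ring loss model: flooding occurs with small annual
# probability; damage given flooding is lognormal (in millions of EUR).
p_flood = 1 / 1250                         # assumed annual exceedance probability
damage = rng.lognormal(mean=7.0, sigma=1.0, size=100_000)
annual_loss = np.where(rng.random(100_000) < p_flood, damage, 0.0)

# Two classic premium principles, reflecting risk aversion differently:
ev_premium = 1.3 * annual_loss.mean()                    # expected value + loading
sd_premium = annual_loss.mean() + 0.1 * annual_loss.std()  # standard deviation principle
print(f"EV principle {ev_premium:.2f}, SD principle {sd_premium:.2f} M EUR")
```

Because catastrophe losses are rare but extreme, the standard-deviation principle loads the premium far more heavily than a proportional loading on the mean, which is the kind of dispersion sensitivity the study examines.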
SOME PROBLEMS OF "SAFE DOSE" ESTIMATION
In environmental carcinogenic risk assessment, the usually defined "safe doses" appear subjective in some sense. In this paper a method of standardizing "safe doses" based on some objective parameters is introduced and a procedure of estimating safe doses under the competing risks...
Risk of Skin Cancer from Space Radiation. Chapter 11
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Kim, Myung-Hee Y.; George, Kerry A.; Wu, Hong-Lu
2003-01-01
We review the methods for estimating the probability of increased incidence of skin cancers from space radiation exposure and describe some of the individual factors that may contribute to risk projection models, including skin pigmentation and synergistic effects of combined ionizing and UV exposure. The steep dose gradients from trapped electrons, protons, and heavy-ion radiation during EVA, together with limitations in EVA dosimetry, are important factors for projecting the skin cancer risk of astronauts. We estimate that the probability of increased skin cancer risk varies more than 10-fold among individual astronauts and that the risk of skin cancer could exceed 1% for future lunar base operations for astronauts with light skin and hair color. Limitations of physical dosimetry in estimating the distribution of dose to the skin suggest that new biodosimetry methods be developed for responding to accidental overexposure of the skin during future space missions.
Protection of Space Vehicles from Micrometeoroid/Orbital Debris (MMOD) Damages
NASA Technical Reports Server (NTRS)
Barr, Stephanie
2007-01-01
As the environment that puts space vehicles at risk can never be eliminated, space vehicles must implement protection against the MMOD environment. In general, this protection has been designed on a risk-estimate basis, largely focused on estimates of impactor size and flux. However, there is some uncertainty in applying methods derived from data gathered in Earth orbit to excursions beyond it. This paper discusses past damage thresholds and processes and suggests additional refinements or methods that could be used for future space endeavors.
Yelland, Lisa N; Salter, Amy B; Ryan, Philip
2011-10-15
Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
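The independent-data core of modified Poisson regression is easy to show from scratch: fit a log-link Poisson model to a binary outcome by Newton-Raphson, then replace the (wrong) Poisson variance with the robust sandwich. The clustered version the article evaluates would additionally sum score contributions within clusters via GEE; the data below are simulated with a hypothetical relative risk of 2:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000
x = rng.binomial(1, 0.5, n).astype(float)
p = 0.2 * 2.0 ** x                       # true relative risk = 2
y = rng.binomial(1, p).astype(float)
X = np.column_stack([np.ones(n), x])

# Fit the log-link Poisson model to the binary outcome (Newton-Raphson).
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))

# Robust (sandwich) variance -- the "modified" part; the naive Poisson
# variance is misspecified for binary data.
mu = np.exp(X @ beta)
A = X.T @ (X * mu[:, None])
B = X.T @ (X * ((y - mu) ** 2)[:, None])
V = np.linalg.inv(A) @ B @ np.linalg.inv(A)

rr, se = np.exp(beta[1]), np.sqrt(V[1, 1])
print(f"RR {rr:.2f} (robust SE of log RR {se:.3f})")
```

Unlike a log-binomial fit, this model cannot fail to converge for admissible data, which is the practical advantage the article highlights; for clustered data one would use a GEE implementation with an exchangeable working correlation rather than this independence fit.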
A Robust Approach to Risk Assessment Based on Species Sensitivity Distributions.
Monti, Gianna S; Filzmoser, Peter; Deutsch, Roland C
2018-05-03
The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected belonging to a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation. © 2018 Society for Risk Analysis.
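A minimal sketch of the contrast between classical and robust HCp estimation under a log-normal SSD follows; the toxicity values are invented and include one deliberate outlier, and the robust variant simply swaps mean/SD for median/MAD (one of many possible robustifications, not necessarily the estimators examined in the paper).

```python
import numpy as np
from statistics import NormalDist

# Species toxicity values (e.g. EC50s, mg/L); the last value is an outlier.
tox = np.array([0.8, 1.1, 1.6, 2.0, 2.4, 3.1, 4.0, 5.5, 7.2, 90.0])
logt = np.log(tox)

def hcp(mu, sigma, p=0.05):
    # HCp: concentration hazardous to p*100% of species, log-normal SSD.
    return float(np.exp(mu + NormalDist().inv_cdf(p) * sigma))

# Classical estimates (mean / sd) vs robust estimates (median / scaled MAD).
hc5_classic = hcp(logt.mean(), logt.std(ddof=1))
mad = np.median(np.abs(logt - np.median(logt)))
hc5_robust = hcp(np.median(logt), 1.4826 * mad)
print(hc5_classic, hc5_robust)
```

The single outlier inflates the classical SD and drags the classical HC5 downward; the robust fit is far less affected, which is the behavior the abstract describes.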
Comparison of gestational dating methods and implications ...
OBJECTIVES: Estimating gestational age is usually based on date of last menstrual period (LMP) or clinical estimation (CE); both approaches introduce potential bias. Differences in methods of estimation may lead to misclassification and inconsistencies in risk estimates, particularly if exposure assignment is also gestation-dependent. This paper examines a 'what-if' scenario in which alternative methods are used and attempts to elucidate how method choice affects observed results. METHODS: We constructed two 20-week gestational age cohorts of pregnancies between 2000 and 2005 (New Jersey, Pennsylvania, Ohio, USA) using live birth certificates: one defined preterm birth (PTB) status using CE and one using LMP. Within these, we estimated risk for 4 categories of preterm birth (PTBs per 10⁶ pregnancies) and risk differences (RDs (95% CIs)) associated with exposure to particulate matter (PM2.5). RESULTS: More births were classified preterm using LMP (16%) compared with CE (8%). RD divergences increased between cohorts as exposure period approached delivery. Among births between 28 and 31 weeks, week 7 PM2.5 exposure conveyed RDs of 44 (21 to 67) for CE and 50 (18 to 82) for LMP populations, while week 24 exposure conveyed RDs of 33 (11 to 56) and -20 (-50 to 10), respectively. CONCLUSIONS: Different results from analyses restricted to births with both CE and LMP are most likely due to differences in dating methods rather than selection issues. Results are sensitive t
Project risk management in the construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya
2018-03-01
This paper presents project risk management methods that make it possible to better identify risks in the construction of high-rise buildings and to manage them throughout the life cycle of the project. One of the project risk management processes is quantitative risk analysis, which usually includes assessment of the potential impact of project risks and of their probabilities. The paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the regression analysis of project data. The suggested algorithms for estimating the parameters of statistical models yield reliable estimates. The theoretical problems of developing robust models built on the methodology of minimax estimation are reviewed, and an algorithm for the situation of asymmetric "contamination" is developed.
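A minimal sketch of Huber's M-estimation for regression, fit by iteratively reweighted least squares on simulated "contaminated" project data (the data, tuning constant, and scale estimator are standard textbook choices, not details from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated project data: linear trend with a few gross outliers.
x = np.linspace(0, 10, 60)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, 60)
y[::15] += 8.0                     # contaminating outliers
X = np.column_stack([np.ones_like(x), x])

def huber_fit(X, y, k=1.345, iters=50):
    """Huber M-estimator of regression coefficients via IRLS."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS starting values
    for _ in range(iters):
        r = y - X @ beta
        s = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust scale (MAD)
        a = np.maximum(np.abs(r), 1e-12)
        w = np.where(a <= k * s, 1.0, k * s / a)  # Huber weights
        Xw = X * w[:, None]
        beta = np.linalg.solve(Xw.T @ X, Xw.T @ y)
    return beta

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_huber = huber_fit(X, y)
print(beta_ols, beta_huber)
```

The outliers bias the OLS intercept upward, while the Huber fit downweights them and stays close to the true coefficients (2.0, 0.5).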
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
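The Monte Carlo propagation of uncertain event probabilities through a fault tree can be sketched as follows; the tree structure, Beta distributions, and downtime model below are invented for illustration and are much simpler than the Swedish system analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy fault tree for "no water delivered" (quantity failure):
#   TOP = (source failure AND reserve unavailable) OR treatment failure OR pipe failure
n = 100_000
p_source  = rng.beta(2, 50, n)     # uncertain probability of raw-water source failure
p_reserve = rng.beta(5, 20, n)     # probability the reserve supply is unavailable
p_treat   = rng.beta(1, 200, n)    # treatment plant failure
p_pipe    = rng.beta(3, 300, n)    # major distribution (pipe) failure

# Propagate uncertainty through the tree, assuming independent basic events.
p_and = p_source * p_reserve                             # AND gate
p_top = 1 - (1 - p_and) * (1 - p_treat) * (1 - p_pipe)   # OR gate

# Customer Minutes Lost per customer per year: failure probability x downtime.
downtime_min = rng.lognormal(np.log(600), 0.5, n)
cml = p_top * downtime_min

print(p_top.mean(), np.percentile(p_top, [5, 95]), cml.mean())
```

Each Monte Carlo draw represents one plausible set of event probabilities, so the output is a distribution of system failure probability and of CML, rather than a single point estimate.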
NASA Astrophysics Data System (ADS)
Samat, N. A.; Ma'arof, S. H. Mohd Imam
2015-05-01
Disease mapping is a method to display the geographical distribution of disease occurrence, which generally involves the usage and interpretation of a map to show the incidence of certain diseases. Relative risk (RR) estimation is one of the most important issues in disease mapping. This paper begins by providing a brief overview of Chikungunya disease. This is followed by a review of the classical model used in disease mapping, based on the standardized morbidity ratio (SMR), which we then apply to our Chikungunya data. We then fit an extension of the classical model, which we refer to as a Poisson-Gamma model, in which prior distributions for the relative risks are assumed known. Both sets of results are displayed and compared using maps, revealing a smoother map with fewer extreme values of estimated relative risk. Extensions of this paper will consider other methods that are relevant to overcome the drawbacks of the existing methods, in order to inform and direct government strategy for monitoring and controlling Chikungunya disease.
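The SMR and its Poisson-Gamma smoothing can be shown in a few lines; the district counts and Gamma prior below are illustrative, not the paper's Chikungunya data.

```python
# Relative risk per district: classical SMR vs Poisson-Gamma smoothing.
# Model: O_i ~ Poisson(E_i * r_i), r_i ~ Gamma(a, b), so the posterior mean
# of r_i is (O_i + a) / (E_i + b) -- a shrinkage estimator.

observed = [12, 0, 5, 30]          # observed cases per district (illustrative)
expected = [8.0, 2.5, 5.5, 24.0]   # expected cases from overall rates
a, b = 2.0, 2.0                    # Gamma prior: mean a/b = 1 (no excess risk a priori)

smr = [o / e for o, e in zip(observed, expected)]
rr_pg = [(o + a) / (e + b) for o, e in zip(observed, expected)]
print(smr)    # note the extreme SMR of 0 in the zero-count district
print(rr_pg)  # Poisson-Gamma estimates are shrunk toward the prior mean 1
```

Because (O + a)/(E + b) is a convex combination of the SMR and the prior mean, every smoothed estimate lies between the two, which is exactly why the resulting map has fewer extreme values.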
P. B. Woodbury; D. A. Weinstein
2010-01-01
We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...
Wang, Molin; Liao, Xiaomei; Laden, Francine; Spiegelman, Donna
2016-01-01
Identification of the latency period and age-related susceptibility, if any, is an important aspect of assessing risks of environmental, nutritional and occupational exposures. We consider estimation and inference for latency and age-related susceptibility in relative risk and excess risk models. We focus on likelihood-based methods for point and interval estimation of the latency period and age-related windows of susceptibility coupled with several commonly considered exposure metrics. The method is illustrated in a study of the timing of the effects of constituents of air pollution on mortality in the Nurses’ Health Study. PMID:26750582
Model estimation of claim risk and premium for motor vehicle insurance by using Bayesian method
NASA Astrophysics Data System (ADS)
Sukono; Riaman; Lesmana, E.; Wulandari, R.; Napitupulu, H.; Supian, S.
2018-01-01
Risk models need to be estimated by the insurance company in order to predict the magnitude of claims and determine the premiums charged to the insured. This is intended to prevent losses in the future. In this paper, we discuss the estimation of claim risk models and motor vehicle insurance premiums using a Bayesian approach. It is assumed that the frequency of claims follows a Poisson distribution, while the claim amounts are assumed to follow a Gamma distribution. The parameters of the frequency and claim-amount distributions are estimated by Bayesian methods. Furthermore, the estimated distributions of claim frequency and claim amount are used to estimate the aggregate risk model as well as its mean and variance. The estimated mean and variance of the aggregate risk were then used to predict the premium to be charged to the insured. Based on the analysis, the frequency of claims follows a Poisson distribution with parameter λ = 5.827, while the claim amounts follow a Gamma distribution with parameters p = 7.922 and θ = 1.414. The resulting mean and variance of the aggregate claims are IDR 32,667,489.88 and IDR 38,453,900,000,000.00, respectively. The predicted pure premium to be charged to the insured amounts to IDR 2,722,290.82. These aggregate claim and premium predictions can serve as a reference for the insurance company's decisions on reserves and premiums for motor vehicle insurance.
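The compound Poisson-Gamma moments behind such an aggregate risk model can be sketched directly; the λ, p, θ values are those reported in the abstract, but the monetary scale (and hence the IDR figures) depends on the units of the claim data, so the numbers below are on an arbitrary scale.

```python
import numpy as np

# Aggregate claims S = sum of N Gamma-distributed claims, N ~ Poisson(lam).
# For X ~ Gamma(shape=p, scale=theta): E[X] = p*theta, E[X^2] = p*(p+1)*theta^2,
# so E[S] = lam * E[X] and Var[S] = lam * E[X^2] (compound Poisson identities).
lam, p, theta = 5.827, 7.922, 1.414   # parameter values from the abstract

mean_S = lam * p * theta
var_S = lam * p * (p + 1) * theta ** 2

# Monte Carlo check of the closed-form moments
# (a sum of k independent Gamma(p, theta) claims is Gamma(p*k, theta)).
rng = np.random.default_rng(3)
n_sim = 200_000
N = rng.poisson(lam, n_sim)
S = rng.gamma(p * np.maximum(N, 1), theta)
S[N == 0] = 0.0
print(mean_S, S.mean(), var_S, S.var())

# A simple premium principle: mean plus a multiple of the standard deviation.
premium = mean_S + 0.5 * np.sqrt(var_S)
```

The standard-deviation loading used here is one common premium principle; the paper's pure premium may be derived differently.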
Safarnejad, Ali; Groot, Wim; Pavlova, Milena
2018-01-30
Estimation of the size of populations at risk of HIV is a key activity in the surveillance of the HIV epidemic. The existing framework for considering future research needs may provide decision-makers with a basis for a fair process of deciding on the methods of the estimation of the size of key populations at risk of HIV. This study explores the extent to which stakeholders involved with population size estimation agree with this framework, and thus, the study updates the framework. We conducted 16 in-depth interviews with key informants from city and provincial governments, NGOs, research institutes, and the community of people at risk of HIV. Transcripts were analyzed and reviewed for significant statements pertaining to criteria. Variations and agreement around criteria were analyzed, and emerging criteria were validated against the existing framework. Eleven themes emerged which are relevant to the estimation of the size of populations at risk of HIV in Viet Nam. Findings on missing criteria, inclusive participation, community perspectives and conflicting weight and direction of criteria provide insights for an improved framework for the prioritization of population size estimation methods. The findings suggest that the exclusion of community members from decision-making on population size estimation methods in Viet Nam may affect the validity, use, and efficiency of the evidence generated. However, a wider group of decision-makers, including community members among others, may introduce diverse definitions, weight and direction of criteria. Although findings here may not apply to every country with a transitioning economy or to every emerging epidemic, the principles of fair decision-making, value of community participation in decision-making and the expected challenges faced, merit consideration in every situation.
Amornsiripanitch, Nita; Mangano, Mark; Niell, Bethany L
2017-05-01
Many models exist to estimate a woman's risk of development of breast cancer. At screening mammography, many imaging centers collect data required for these models to identify women who may benefit from supplemental screening and referral for cancer risk assessment. The purpose of this study was to discern perceptions and preferences of screening mammography patients regarding communication of estimated breast cancer risk. An anonymous survey was distributed to screening and surveillance mammography patients between April and June 2015. Survey questions were designed to assess patient preferences regarding the receipt and complexity of risk estimate communication, including hypothetical scenarios with and without > 20% estimated risk of breast cancer. The McNemar test and the Wilcoxon signed rank test were used with p ≤ 0.05 considered statistically significant. The survey was distributed to 1061 screening and surveillance mammography patients, and 503 patients responded (response rate, 47%). Although 86% (431/503) of patients expressed interest in learning their estimated risk, only 8% (38/503) had undergone formal risk assessment. The preferred method (241 respondents [26%]) of communication of risk < 20% was a mailed letter accompanying annual mammogram results. For risk > 20%, patients preferred oral communication and were 10-fold as likely to choose only oral communication (p < 0.000001). For risk < 20% and > 20%, patients preferred to learn their estimated risk in great detail (69% and 85%), although women were significantly more likely to choose greater detail for risk > 20% (p < 0.00001). Screening mammography patients expressed interest in learning their estimated risk of breast cancer regardless of their level of hypothetical risk.
Risk-Stratified Imputation in Survival Analysis
Kennedy, Richard E.; Adragni, Kofi P.; Tiwari, Hemant K.; Voeks, Jenifer H.; Brott, Thomas G.; Howard, George
2013-01-01
Background: Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis; and while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. Purpose: We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Methods: Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. Results: In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Limitations: Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment, although its performance is superior when covariates are related to treatment. Risk-stratified imputation is intended for categorical covariates, and may be sensitive to the width of the matching window if continuous covariates are used. Conclusions: The use of risk-stratified imputation should facilitate the analysis of many clinical trials in which one group has a higher withdrawal rate that is related to treatment. PMID:23818434
Accounting for Selection Bias in Studies of Acute Cardiac Events.
Banack, Hailey R; Harper, Sam; Kaufman, Jay S
2018-06-01
In cardiovascular research, pre-hospital mortality represents an important potential source of selection bias. Inverse probability of censoring weighting is a method to account for this source of bias. The objective of this article is to examine and correct for the influence of selection bias due to pre-hospital mortality on the relationship between cardiovascular risk factors and all-cause mortality after an acute cardiac event. The relationship between the number of cardiovascular disease (CVD) risk factors (0-5; smoking status, diabetes, hypertension, dyslipidemia, and obesity) and all-cause mortality was examined using data from the Atherosclerosis Risk in Communities (ARIC) study. To illustrate the magnitude of selection bias, estimates from an unweighted generalized linear model with a log link and binomial distribution were compared with estimates from an inverse probability of censoring weighted model. In unweighted multivariable analyses the estimated risk ratio for mortality ranged from 1.09 (95% confidence interval [CI], 0.98-1.21) for 1 CVD risk factor to 1.95 (95% CI, 1.41-2.68) for 5 CVD risk factors. In the inverse probability of censoring weighted analyses, the risk ratios ranged from 1.14 (95% CI, 0.94-1.39) to 4.23 (95% CI, 2.69-6.66). Estimates from the inverse probability of censoring weighted model were substantially greater than unweighted, adjusted estimates across all risk factor categories. This shows the magnitude of selection bias due to pre-hospital mortality and its effect on estimates of the effect of CVD risk factors on mortality. Moreover, the results highlight the utility of using this method to address a common form of bias in cardiovascular research. Copyright © 2018 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.
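The mechanics of inverse probability of censoring weighting can be sketched with a simulated selection process; the probabilities below are invented, and the censoring model is taken as known here, whereas in practice it would be estimated (e.g. by logistic regression).

```python
import numpy as np

rng = np.random.default_rng(5)

# Subjects with a higher risk-factor burden are less likely to reach the
# hospital (and so are selected out of the observed sample).
n = 200_000
z = rng.integers(0, 2, n)                   # 1 = high risk-factor burden
p_death = np.where(z == 1, 0.40, 0.10)      # true post-event mortality risk
y = rng.random(n) < p_death
p_observed = np.where(z == 1, 0.5, 0.9)     # selective observation
obs = rng.random(n) < p_observed

# Naive estimate uses only the observed (hospitalized) subjects.
naive = y[obs].mean()

# IPCW: weight each observed subject by 1 / P(observed | covariates).
w = 1.0 / p_observed[obs]
ipcw = np.average(y[obs], weights=w)

truth = y.mean()
print(naive, ipcw, truth)
```

The naive estimate is biased downward because high-risk subjects are under-represented; the weighted estimate recovers the population risk, mirroring the direction of the correction reported in the abstract.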
Mackey, Dawn C.; Hubbard, Alan E.; Cawthon, Peggy M.; Cauley, Jane A.; Cummings, Steven R.; Tager, Ira B.
2011-01-01
Few studies have examined the relation between usual physical activity level and rate of hip fracture in older men or applied semiparametric methods from the causal inference literature that estimate associations without assuming a particular parametric model. Using the Physical Activity Scale for the Elderly, the authors measured usual physical activity level at baseline (2000–2002) in 5,682 US men ≥65 years of age who were enrolled in the Osteoporotic Fractures in Men Study. Physical activity levels were classified as low (bottom quartile of Physical Activity Scale for the Elderly score), moderate (middle quartiles), or high (top quartile). Hip fractures were confirmed by central review. Marginal associations between physical activity and hip fracture were estimated with 3 estimation methods: inverse probability-of-treatment weighting, G-computation, and doubly robust targeted maximum likelihood estimation. During 6.5 years of follow-up, 95 men (1.7%) experienced a hip fracture. The unadjusted risk of hip fracture was lower in men with a high physical activity level versus those with a low physical activity level (relative risk = 0.51, 95% confidence interval: 0.28, 0.92). In semiparametric analyses that controlled confounding, hip fracture risk was not lower with moderate (e.g., targeted maximum likelihood estimation relative risk = 0.92, 95% confidence interval: 0.62, 1.44) or high (e.g., targeted maximum likelihood estimation relative risk = 0.88, 95% confidence interval: 0.53, 2.03) physical activity relative to low. This study does not support a protective effect of usual physical activity on hip fracture in older men. PMID:21303805
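As a minimal sketch of G-computation (one of the three estimation methods mentioned), the example below fits a saturated outcome model on simulated data with one binary confounder and standardizes over the confounder distribution; none of this is the study's data, and the real analyses used far richer models.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated confounding: frail men exercise less AND fracture more.
n = 100_000
frail = rng.integers(0, 2, n)                               # confounder
active = rng.random(n) < np.where(frail == 1, 0.2, 0.6)     # exposure
p_frac = 0.02 * np.where(frail == 1, 2.0, 1.0) * np.where(active, 0.7, 1.0)
fracture = rng.random(n) < p_frac                           # true activity RR = 0.7

# Outcome model: P(fracture | activity, frailty) as stratum means (saturated).
risk = {(a, f): fracture[(active == a) & (frail == f)].mean()
        for a in (True, False) for f in (0, 1)}

# G-computation: predict everyone's risk under "all active" vs "all inactive"
# and average over the observed confounder distribution.
risk_if_active = np.mean([risk[(True, f)] for f in frail])
risk_if_inactive = np.mean([risk[(False, f)] for f in frail])
rr_gcomp = risk_if_active / risk_if_inactive

rr_crude = fracture[active].mean() / fracture[~active].mean()
print(rr_crude, rr_gcomp)   # crude RR exaggerates the benefit; G-comp recovers ~0.7
```

This is the same standardization logic the semiparametric estimators implement, except that they replace the saturated model with flexible, data-adaptive fits.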
Comparison of Time-to-First Event and Recurrent Event Methods in Randomized Clinical Trials.
Claggett, Brian; Pocock, Stuart; Wei, L J; Pfeffer, Marc A; McMurray, John J V; Solomon, Scott D
2018-03-27
Background: Most Phase-3 trials feature time-to-first event endpoints for their primary and/or secondary analyses. In chronic diseases where a clinical event can occur more than once, recurrent-event methods have been proposed to more fully capture disease burden and have been assumed to improve statistical precision and power compared to conventional "time-to-first" methods. Methods: To better characterize factors that influence statistical properties of recurrent-events and time-to-first methods in the evaluation of randomized therapy, we repeatedly simulated trials with 1:1 randomization of 4000 patients to active vs control therapy, with true patient-level risk reduction of 20% (i.e. RR=0.80). For patients who discontinued active therapy after a first event, we assumed their risk reverted subsequently to their original placebo-level risk. Through simulation, we varied a) the degree of between-patient heterogeneity of risk and b) the extent of treatment discontinuation. Findings were compared with those from actual randomized clinical trials. Results: As the degree of between-patient heterogeneity of risk was increased, both time-to-first and recurrent-events methods lost statistical power to detect a true risk reduction and confidence intervals widened. The recurrent-events analyses continued to estimate the true RR=0.80 as heterogeneity increased, while the Cox model produced estimates that were attenuated. The power of recurrent-events methods declined as the rate of study drug discontinuation post-event increased. Recurrent-events methods provided greater power than time-to-first methods in scenarios where drug discontinuation was ≤30% following a first event, lesser power with drug discontinuation rates of ≥60%, and comparable power otherwise. We confirmed in several actual trials in chronic heart failure that treatment effect estimates were attenuated when estimated via the Cox model and that increased statistical power from recurrent-events methods was most pronounced in trials with lower treatment discontinuation rates. Conclusions: We find that the statistical power of both recurrent-events and time-to-first methods is reduced by increasing heterogeneity of patient risk, a parameter not included in conventional power and sample size formulas. Data from real clinical trials are consistent with simulation studies, confirming that the greatest statistical gains from use of recurrent-events methods occur in the presence of high patient heterogeneity and low rates of study drug discontinuation.
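The attenuation mechanism can be reproduced in a few lines with a gamma-frailty simulation; this is a deliberately crude analogue (count ratios rather than Cox or Andersen-Gill fits, no discontinuation), with all parameter values invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Patient event counts are Poisson with gamma-distributed frailty
# (between-patient heterogeneity); true patient-level RR = 0.80.
def simulate(theta, n=50_000, base_rate=1.2, rr=0.80):
    """theta = frailty variance (0 would mean homogeneous risk)."""
    frailty = rng.gamma(1 / theta, theta, 2 * n)   # mean 1, variance theta
    treat = np.repeat([0, 1], n)
    counts = rng.poisson(frailty * base_rate * rr ** treat)
    # Recurrent-events analogue: ratio of mean event counts (rate ratio).
    rr_recurrent = counts[treat == 1].mean() / counts[treat == 0].mean()
    # Time-to-first analogue: ratio of P(at least one event).
    rr_first = (counts[treat == 1] > 0).mean() / (counts[treat == 0] > 0).mean()
    return rr_recurrent, rr_first

rec, first = simulate(theta=2.0)
print(rec, first)
```

With high heterogeneity the recurrent-events estimate stays near the true 0.80 while the "any event" contrast is pulled toward 1, because the highest-frailty patients dominate the first-event probability in both arms.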
[Theoretical model study about the application risk of high risk medical equipment].
Shang, Changhao; Yang, Fenghui
2014-11-01
To establish a theoretical model for monitoring the application risk of high-risk medical equipment at the site of use, the site is regarded as a system containing several sub-systems, each consisting of several risk-estimation indicators. After each indicator is quantified, the quantified values are multiplied by their corresponding weights and summed, yielding the risk estimate of each sub-system. Following the same calculation, the sub-system risk estimates are multiplied by their corresponding weights and summed; the cumulative sum is the status indicator of the high-risk medical equipment at the site of use, which reflects its application risk. The resulting theoretical model can monitor the application risk of high-risk medical equipment at the site of use dynamically and in a targeted way.
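The nested weighted-sum calculation can be sketched directly; the sub-system names, indicators, weights, and quantized values below are illustrative placeholders, not taken from the paper.

```python
# Site-of-use risk model: weighted sum of indicators within each sub-system,
# then a weighted sum of sub-system risks to get the status indicator.
subsystems = {
    # sub-system: (weight, {indicator: (weight, quantized value 0-100)})
    "maintenance": (0.40, {"overdue_service": (0.6, 70), "spare_parts": (0.4, 50)}),
    "operation":   (0.35, {"operator_training": (0.5, 80), "usage_load": (0.5, 60)}),
    "environment": (0.25, {"power_quality": (1.0, 90)}),
}

def subsystem_risk(indicators):
    # Risk estimate of one sub-system: weighted sum of quantized indicators.
    return sum(w * v for w, v in indicators.values())

def status_indicator(systems):
    # Site-level status indicator: weighted sum of sub-system risk estimates.
    return sum(w * subsystem_risk(ind) for w, ind in systems.values())

status = status_indicator(subsystems)
print(status)
```

Indicator weights within each sub-system (and sub-system weights overall) would normally sum to 1 so that the status indicator stays on the same 0-100 scale as the quantized inputs.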
AN INFORMATIC APPROACH TO ESTIMATING ECOLOGICAL RISKS POSED BY PHARMACEUTICAL USE
A new method for estimating risks of human prescription pharmaceuticals based on information found in regulatory filings as well as scientific and trade literature is described in a presentation at the Pharmaceuticals in the Environment Workshop in Las Vegas, NV, August 23-25, 20...
Beulens, Joline W J; van der Schouw, Yvonne T; Moons, Karel G M; Boshuizen, Hendriek C; van der A, Daphne L; Groenwold, Rolf H H
2013-04-01
Moderate alcohol consumption is associated with a reduced type 2 diabetes risk, but the biomarkers that explain this relation are unknown. The most commonly used method to estimate the proportion explained by a biomarker is the difference method. However, the influence of alcohol-biomarker interaction on its results is unclear. The G-estimation method has been proposed to assess the proportion explained accurately, but how it compares with the difference method is unknown. In a case-cohort study of 2498 controls and 919 incident diabetes cases, we estimated the proportion of the relation between alcohol consumption and diabetes explained by different biomarkers, using the difference method and the sequential G-estimation method. Using the difference method, high-density lipoprotein cholesterol explained the relation between alcohol and diabetes by 78% (95% confidence interval [CI], 41-243), whereas high-sensitivity C-reactive protein (-7.5%; -36.4 to 1.8) and blood pressure (-6.9; -26.3 to -0.6) did not explain the relation. Interaction between alcohol and liver enzymes led to bias in the proportion explained, with different outcomes for different levels of liver enzymes. The G-estimation method showed comparable results, but the proportions explained were lower. The relation between alcohol consumption and diabetes may be largely explained by increased high-density lipoprotein cholesterol but not by other biomarkers. Ignoring exposure-mediator interactions may result in bias. The difference and G-estimation methods provide similar results. Copyright © 2013 Elsevier Inc. All rights reserved.
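The difference method itself is a one-line calculation on the log relative-risk scale; the relative risks below are illustrative inputs, not the study's estimates.

```python
import math

# Difference method: proportion of the exposure-outcome association
# "explained" by a mediator = relative change in the log RR after
# adjusting for that mediator.
def proportion_explained(rr_crude, rr_adjusted):
    return 1 - math.log(rr_adjusted) / math.log(rr_crude)

rr_crude = 0.70     # moderate drinking vs none, not adjusted for the mediator
rr_adj_hdl = 0.93   # after adjusting for HDL cholesterol (hypothetical)
rr_adj_crp = 0.72   # after adjusting for hs-CRP (hypothetical)

print(proportion_explained(rr_crude, rr_adj_hdl))  # large share explained
print(proportion_explained(rr_crude, rr_adj_crp))  # little explained
```

The abstract's point is that this simple contrast becomes biased when the exposure and mediator interact, which is where sequential G-estimation is meant to help.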
Advanced uncertainty modelling for container port risk analysis.
Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin
2016-08-13
Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) to a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
Rücker, Viktoria; Keil, Ulrich; Fitzgerald, Anthony P; Malzahn, Uwe; Prugger, Christof; Ertl, Georg; Heuschmann, Peter U; Neuhauser, Hannelore
2016-01-01
Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008–11 (DEGS1) and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking, systolic blood pressure risk factor levels, sex and 5-year age-groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40–65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was lower by 29% and the estimated proportion of high risk people (10-year risk ≥ 5%) by 50% compared to the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk. PMID:27612145
Van Regenmortel, Tina; Nys, Charlotte; Janssen, Colin R; Lofts, Stephen; De Schamphelaere, Karel A C
2017-08-01
Although chemical risk assessment is still mainly conducted on a substance-by-substance basis, organisms in the environment are typically exposed to mixtures of substances. Risk assessment procedures should therefore be adapted to fit these situations. Four mixture risk assessment methodologies were compared for risk estimations of mixtures of copper (Cu), zinc (Zn), and nickel (Ni). The results showed that use of the log-normal species sensitivity distribution (SSD) instead of the best-fit distribution and sampling species sensitivities independently for each metal instead of using interspecies correlations in metal sensitivity had little impact on risk estimates. Across 4 different monitoring datasets, between 0% and 52% of the target water samples were estimated to be at risk, but only between 0% and 15% of the target water samples were at risk because of the mixture of metals and not any single metal individually. When a natural baseline database was examined, it was estimated that 10% of the target water samples were at risk because of single metals or their mixtures when the most conservative method was used (concentration addition applied directly to the SSD, i.e., CA-SSD). However, the issue of metal mixture risk at geochemical baseline concentrations became relatively small (2% of target water samples) when a theoretically more correct method was used (CA applied to individual dose-response curves, i.e., CA-DRC). Finally, across the 4 monitoring datasets, the following order of conservatism for the 4 methods was shown (from most to least conservative, with ranges of median margin of safety [MoS] relative to CA-SSD): CA-SSD > CA-DRC (MoS = 1.17-1.25) > IA-DRC (independent action [IA] applied to individual dose-response curves; MoS = 1.38-1.60) > IA-SSD (MoS = 1.48-1.72). Therefore, it is suggested that these 4 methods can be used in a general tiered scheme for the risk assessment of metal mixtures in a regulatory context. In this scheme, the CA-SSD method could serve as a first (conservative) tier to identify situations with likely no potential risk at all, regardless of the method used (the sum of toxic units expressed relative to the 5% hazardous concentration, SumTU_HC5 < 1), and the IA-SSD method could identify situations of potential risk, also regardless of the method used (the multisubstance potentially affected fraction of species using the IA-SSD method, msPAF_IA,SSD > 0.05). The CA-DRC and IA-DRC methods could be used for site-specific assessment in situations that fall in between (SumTU_HC5 > 1 and msPAF_IA,SSD < 0.05). Environ Toxicol Chem 2017;36:2123-2138. © 2017 SETAC.
Arriaga, Maria E; Vajdic, Claire M; Canfell, Karen; MacInnis, Robert; Hull, Peter; Magliano, Dianna J; Banks, Emily; Giles, Graham G; Cumming, Robert G; Byles, Julie E; Taylor, Anne W; Shaw, Jonathan E; Price, Kay; Hirani, Vasant; Mitchell, Paul; Adelstein, Barbara-Ann; Laaksonen, Maarit A
2017-06-14
To estimate the Australian cancer burden attributable to lifestyle-related risk factors and their combinations using a novel population attributable fraction (PAF) method that accounts for competing risk of death, risk factor interdependence and statistical uncertainty. 365 173 adults from seven Australian cohort studies. We linked pooled harmonised individual participant cohort data with population-based cancer and death registries to estimate exposure-cancer and exposure-death associations. Current Australian exposure prevalence was estimated from representative external sources. To illustrate the utility of the new PAF method, we calculated fractions of cancers causally related to body fatness or both tobacco and alcohol consumption avoidable in the next 10 years by risk factor modifications, comparing them with fractions produced by traditional PAF methods. Over 10 years of follow-up, we observed 27 483 incident cancers and 22 078 deaths. Of cancers related to body fatness (n=9258), 13% (95% CI 11% to 16%) could be avoided if those currently overweight or obese had body mass index of 18.5-24.9 kg/m². Of cancers causally related to both tobacco and alcohol (n=4283), current or former smoking explains 13% (11% to 16%) and consuming more than two alcoholic drinks per day explains 6% (5% to 8%). The two factors combined explain 16% (13% to 19%): 26% (21% to 30%) in men and 8% (4% to 11%) in women. Corresponding estimates using the traditional PAF method were 20%, 31% and 10%. Our PAF estimates translate to 74 000 avoidable body fatness-related cancers and 40 000 avoidable tobacco- and alcohol-related cancers in Australia over the next 10 years (2017-2026). Traditional PAF methods not accounting for competing risk of death and interdependence of risk factors may overestimate PAFs and avoidable cancers. We will rank the most important causal factors and their combinations for a spectrum of cancers and inform cancer control activities.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
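For contrast with the novel PAF method above, the traditional estimator it is compared against is commonly the Levin formula; a minimal sketch (the prevalence and relative-risk values are illustrative, not taken from the study):

```python
def levin_paf(prevalence, relative_risk):
    """Traditional (Levin) population attributable fraction.

    Ignores competing risk of death and risk-factor interdependence,
    which the study suggests can inflate PAF estimates.
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative numbers only (not from the study):
paf = levin_paf(0.63, 1.3)  # 63% exposed, relative risk 1.3
```

With these toy inputs, roughly 16% of cases would be attributed to the exposure; the pooled-cohort method above would typically return a smaller fraction because deaths compete with cancer incidence.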
NASA Astrophysics Data System (ADS)
Davis, Adam Christopher
This research develops a new framework for evaluating the occupational risks of exposure to hazardous substances in any setting where As Low As Reasonably Achievable (ALARA) practices are mandated or used. The evaluation is performed by developing a hypothesis-test-based procedure for evaluating the homogeneity of various epidemiological cohorts, and thus the appropriateness of applying aggregate data-pooling techniques to those cohorts. A statistical methodology is then developed as an alternative to aggregate pooling for situations in which individual cohorts show heterogeneity and are thus unsuitable for pooled analysis. These methods are then applied to estimate the all-cancer mortality risks incurred by workers at four Department of Energy nuclear weapons laboratories. Both linear no-threshold and dose-bin-averaged risks are calculated, and it is further shown that aggregate analysis tends to overestimate the risks relative to those calculated by the methods developed in this work. The risk estimates developed in Chapter 2 are, in Chapter 3, applied to assess the risks to workers engaged in americium recovery operations at Los Alamos National Laboratory. The work described in Chapter 3 develops a full radiological protection assessment for the new americium recovery project, including development of exposure cases, creation and modification of MCNP5 models, development of a time-and-motion study, and the final synthesis of all data. This work also develops a new risk-based method of determining whether administrative controls, such as staffing increases, are ALARA-optimized. The EPA's estimate of the value of a statistical life is applied to these risk estimates to determine a monetary value for risk.
The rate of change of this "risk value" (marginal risk) is then compared with the rate of change of workers' compensation as additional workers are added to the project to reduce the dose (and therefore, presumably, the risk) to each individual.
ERIC Educational Resources Information Center
Eaton, Danice K.; Brener, Nancy D.; Kann, Laura; Denniston, Maxine M.; McManus, Tim; Kyle, Tonja M.; Roberts, Alice M.; Flint, Katherine H.; Ross, James G.
2010-01-01
The authors examined whether paper-and-pencil and Web surveys administered in the school setting yield equivalent risk behavior prevalence estimates. Data were from a methods study conducted by the Centers for Disease Control and Prevention (CDC) in spring 2008. Intact classes of 9th- or 10th-grade students were assigned randomly to complete a…
Ambler, Gareth; Omar, Rumana Z; Royston, Patrick
2007-06-01
Risk models that aim to predict the future course and outcome of disease processes are increasingly used in health research, and it is important that they are accurate and reliable. Most of these risk models are fitted using routinely collected data in hospitals or general practices. Clinical outcomes such as short-term mortality will be near-complete, but many of the predictors may have missing values. A common approach to dealing with this is to perform a complete-case analysis. However, this may lead to overfitted models and biased estimates if entire patient subgroups are excluded. The aim of this paper is to investigate a number of methods for imputing missing data to evaluate their effect on risk model estimation and the reliability of the predictions. Multiple imputation methods, including hot-deck imputation and multiple imputation by chained equations (MICE), were investigated along with several single imputation methods. A large national cardiac surgery database was used to create simulated yet realistic datasets. The results suggest that complete case analysis may produce unreliable risk predictions and should be avoided. Conditional mean imputation performed well in our scenario, but may not be appropriate if using variable selection methods. MICE was amongst the best performing multiple imputation methods with regard to the quality of the predictions. Additionally, it produced the least biased estimates, with good coverage, and hence is recommended for use in practice.
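The single-imputation baseline can be illustrated with a stdlib-only sketch; `mean_impute` below is unconditional mean imputation, a simpler relative of the conditional mean and MICE approaches the paper compares, and the toy `data` rows are hypothetical:

```python
def complete_case(rows):
    """Complete-case analysis: keep only rows with no missing (None) values."""
    return [r for r in rows if None not in r]

def mean_impute(rows):
    """Single mean imputation: replace each missing value with the
    observed mean of its column. Unlike MICE, this ignores the other
    predictors and understates uncertainty in the imputed values."""
    cols = list(zip(*rows))
    means = []
    for col in cols:
        observed = [v for v in col if v is not None]
        means.append(sum(observed) / len(observed))
    return [[v if v is not None else means[j] for j, v in enumerate(r)]
            for r in rows]

# Hypothetical predictor rows with missing entries:
data = [[1.0, 2.0], [None, 4.0], [3.0, None]]
```

Complete-case analysis keeps only the first row here, discarding two-thirds of the sample; mean imputation retains all three rows at the cost of artificially reduced variability.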
Sankaranarayanan, K; Chakraborty, R
2000-10-16
This paper recapitulates the advances in the field of genetic risk estimation that have occurred during the past decade and, using them as a basis, presents revised estimates of genetic risks of exposure to radiation. The advances include: (i) an upward revision of the estimates of incidence for Mendelian diseases (2.4% now versus 1.25% in 1993); (ii) the introduction of a conceptual change for calculating doubling doses; (iii) the elaboration of methods to estimate the mutation component (i.e. the relative increase in disease frequency per unit relative increase in mutation rate) and the use of the estimates obtained through these methods for assessing the impact of induced mutations on the incidence of Mendelian and chronic multifactorial diseases; (iv) the introduction of an additional factor called the "potential recoverability correction factor" in the risk equation to bridge the gap between radiation-induced mutations that have been recovered in mice and the risk of radiation-inducible genetic disease in human live births and (v) the introduction of the concept that the adverse effects of radiation-induced genetic damage are likely to be manifest predominantly as multi-system developmental abnormalities in the progeny. For all classes of genetic disease (except congenital abnormalities), the estimates of risk have been obtained using a doubling dose of 1 Gy. For a population exposed to low-LET, chronic/low-dose irradiation, the current estimates for the first generation progeny are the following (all estimates per million live born progeny per Gy of parental irradiation): autosomal dominant and X-linked diseases, approximately 750-1500 cases; autosomal recessive, nearly zero; and chronic multifactorial diseases, approximately 250-1200 cases. For congenital abnormalities, the estimate is approximately 2000 cases and is based on mouse data on developmental abnormalities.
The total risk per Gy is of the order of approximately 3000-4700 cases which represent approximately 0.4-0.6% of the baseline frequency of these diseases (738,000 per million) in the population.
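The risk equation described above combines a baseline disease incidence, the doubling dose, the mutation component, and the potential recoverability correction factor; a sketch assuming the usual multiplicative structure (the factor values below are placeholders, not the paper's estimates):

```python
def genetic_risk_per_gy(baseline_per_million, doubling_dose_gy,
                        mutation_component, prcf):
    """Sketch of a doubling-dose risk equation:
    risk per Gy ~ baseline incidence x (1/DD) x MC x PRCF.
    The multiplicative form is an assumption for illustration."""
    return (baseline_per_million / doubling_dose_gy
            * mutation_component * prcf)

# Placeholder inputs: 16,500 cases/million baseline, DD = 1 Gy,
# MC = 0.3, PRCF = 0.3 (all illustrative, not the paper's values):
cases = genetic_risk_per_gy(16500.0, 1.0, 0.3, 0.3)
```

The point of the structure is that each correction factor scales the risk down from a naive doubling-dose extrapolation, which is why the totals above are small fractions of the baseline frequency.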
Application of geostatistics to risk assessment.
Thayer, William C; Griffith, Daniel A; Goodrum, Philip E; Diamond, Gary L; Hassett, James M
2003-10-01
Geostatistics offers two fundamental contributions to environmental contaminant exposure assessment: (1) a group of methods to quantitatively describe the spatial distribution of a pollutant and (2) the ability to improve estimates of the exposure point concentration by exploiting the geospatial information present in the data. The second contribution is particularly valuable when exposure estimates must be derived from small data sets, which is often the case in environmental risk assessment. This article addresses two topics related to the use of geostatistics in human and ecological risk assessments performed at hazardous waste sites: (1) the importance of assessing model assumptions when using geostatistics and (2) the use of geostatistics to improve estimates of the exposure point concentration (EPC) in the limited data scenario. The latter topic is approached here by comparing design-based estimators that are familiar to environmental risk assessors (e.g., Land's method) with geostatistics, a model-based estimator. In this report, we summarize the basics of spatial weighting of sample data, kriging, and geostatistical simulation. We then explore the two topics identified above in a case study, using soil lead concentration data from a Superfund site (a skeet and trap range). We also describe several areas where research is needed to advance the use of geostatistics in environmental risk assessment.
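The spatial weighting of sample data mentioned above can be illustrated in its simplest inverse-distance form; kriging, which the article reviews, replaces these ad hoc weights with weights derived from a fitted variogram model (the coordinates and concentrations below are hypothetical):

```python
def idw_estimate(samples, x0, y0, power=2.0):
    """Inverse-distance-weighted concentration estimate at (x0, y0)
    from (x, y, concentration) samples: nearby samples get larger
    weights, so spatial structure in the data informs the estimate."""
    num = den = 0.0
    for x, y, z in samples:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        if d2 == 0.0:
            return z  # estimator is exact at a sampled location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * z
        den += w
    return num / den

# Hypothetical soil-lead samples (x, y, ppm):
pts = [(0.0, 0.0, 100.0), (1.0, 0.0, 200.0)]
```

Midway between the two samples the estimate is the simple average; as the target point moves toward either sample, that sample dominates, which is the intuition kriging formalizes and optimizes.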
Uncertainties in Projecting Risks of Late Effects from Space Radiation
NASA Astrophysics Data System (ADS)
Cucinotta, F.; Schimmerling, W.; Peterson, L.; Wilson, J.; Saganti, P.; Dicello, J.
The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, CNS risks, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, lunar station, deep space outpost, and Mars missions of 360, 660, and 1000 days' duration. The major results are the quantification of the uncertainties in current risk estimates, the identification of the primary factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.
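The Monte-Carlo sampling approach described above can be sketched as a product of uncertain factors; the factor names, lognormal distributions, and spreads below are illustrative assumptions, not the study's subjective uncertainty distributions:

```python
import random

def risk_interval(nominal_risk=0.04, n=20000, seed=7):
    """Monte-Carlo propagation of per-factor uncertainty into an
    overall risk distribution. Each draw multiplies the nominal risk
    by sampled factors (names and spreads are assumptions):"""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        quality = rng.lognormvariate(0.0, 0.5)  # radiation quality factor
        physics = rng.lognormvariate(0.0, 0.2)  # dose/transport uncertainty
        ddref = rng.lognormvariate(0.0, 0.3)    # dose-rate effectiveness
        samples.append(nominal_risk * quality * physics / ddref)
    samples.sort()
    # Empirical 95% uncertainty interval on the projected risk:
    return samples[int(0.025 * n)], samples[int(0.975 * n)]

low, high = risk_interval()
```

The width of the resulting interval, rather than the nominal point estimate, is what drives the paper's conclusion that shielding "optimization" gains for GCR are masked by uncertainty.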
Does Static-99 predict recidivism among older sexual offenders?
Hanson, R K
2006-10-01
Static-99 (Hanson & Thornton, 2000) is the most commonly used actuarial risk tool for estimating sexual offender recidivism risk. Recent research has suggested that its methods of accounting for the offenders' ages may be insufficient to capture declines in recidivism risk associated with advanced age. Using data from 8 samples (combined size of 3,425 sexual offenders), the present study found that older offenders had lower Static-99 scores than younger offenders and that Static-99 was moderately accurate in estimating relative recidivism risk in all age groups. Older offenders, however, had lower sexual recidivism rates than would be expected based on their Static-99 risk categories. Consequently, evaluators using Static-99 should consider advanced age in their overall estimate of risk.
Wang, Molin; Liao, Xiaomei; Laden, Francine; Spiegelman, Donna
2016-06-15
Identification of the latency period and age-related susceptibility, if any, is an important aspect of assessing risks of environmental, nutritional, and occupational exposures. We consider estimation and inference for latency and age-related susceptibility in relative risk and excess risk models. We focus on likelihood-based methods for point and interval estimation of the latency period and age-related windows of susceptibility coupled with several commonly considered exposure metrics. The method is illustrated in a study of the timing of the effects of constituents of air pollution on mortality in the Nurses' Health Study. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Chappell, Lori J.; Cucinotta, Francis A.
2011-01-01
Radiation risks are estimated in a competing-risk formalism in which age- or time-after-exposure estimates of increased risk for cancer and circulatory diseases are folded with the probability of surviving to a given age. The survival function, also called the life-table, changes with calendar year, gender, smoking status and other demographic variables. An outstanding problem in risk estimation is the method of risk transfer between the exposed population and a second population in which risks are to be estimated. Approaches used to transfer risks are based on: 1) multiplicative risk transfer models, in which risks are proportional to background disease rates; and 2) additive risk transfer models, in which risks are independent of background rates. In addition, a mixture model is often considered, in which the multiplicative and additive transfer assumptions are given weighted contributions. We studied the influence of the survival probability on the risk of exposure-induced cancer and circulatory disease morbidity and mortality in the multiplicative transfer model and the mixture model. Risks for never-smokers (NS) compared to the average U.S. population are estimated to be reduced by between 30% and 60%, depending on model assumptions. Lung cancer is the major contributor to the reduction for NS, with additional contributions from circulatory diseases and cancers of the stomach, liver, bladder, oral cavity, esophagus, colon, a portion of the solid cancer remainder, and leukemia. Greater improvements in risk estimates for NS are possible, and would depend on improved understanding of risk transfer models and on elucidating the role of space radiation in the various stages of disease formation (e.g., initiation, promotion, and progression).
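The multiplicative, additive, and mixture transfer assumptions can be sketched as a weighted combination; the ERR, EAR, background-rate, and weight values below are illustrative only:

```python
def transfer_risk(err, ear, background_rate, v):
    """Mixture-model risk transfer: weight v on the multiplicative
    component (excess relative risk times the target population's
    background rate) and 1 - v on the additive component (excess
    absolute risk, independent of background). v=1 recovers the
    multiplicative model, v=0 the additive model."""
    return v * err * background_rate + (1.0 - v) * ear

# Never-smokers have lower background lung-cancer rates, so the
# multiplicative component shrinks (all numbers hypothetical):
avg = transfer_risk(err=0.5, ear=0.002, background_rate=0.07, v=0.7)
ns = transfer_risk(err=0.5, ear=0.002, background_rate=0.02, v=0.7)
```

Because only the multiplicative term depends on background rates, the size of the never-smoker reduction hinges directly on the transfer weight, which is why the abstract reports a 30-60% range across model assumptions.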
A new method for assessing the risk of accident associated with darkness.
Johansson, Osten; Wanvik, Per Ole; Elvik, Rune
2009-07-01
This paper presents a new method for assessing the risk of accidents associated with darkness. The method estimates the risk of accident associated with darkness in terms of an odds ratio, defined as follows: [(number of accidents in darkness in a given hour of the day)/(number of accidents in daylight in the same hour of the day)]/[(number of accidents in a given comparison hour when the case hour is dark)/(number of accidents in a given comparison hour when the case hour is in daylight)]. This estimate of the risk of accident associated with darkness does not require data on exposure, but relies on the count of accidents in the same pair of hours throughout the year. One of the hours is dark for part of the year, but has daylight the rest of the year. The comparison hour, which has daylight the whole year, is used to control for seasonal variations. The aim of relying on the same pair of hours throughout the year is to minimise the influence of potentially confounding factors. Estimates of the risk of injury accidents associated with darkness are developed on the basis of accident data for Norway, Sweden and the Netherlands. It is found that the risk of an injury accident increases by nearly 30% in darkness in urban areas, by nearly 50% in rural areas, and by about 40% for urban and rural areas combined (adjusted estimate).
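The odds-ratio definition above maps directly onto four accident counts; a minimal sketch with hypothetical counts:

```python
def darkness_odds_ratio(case_dark, case_light,
                        comp_when_case_dark, comp_when_case_light):
    """Odds ratio from the paper's definition: accident counts in the
    case hour (when dark vs. when daylight over the year), divided by
    the corresponding counts in the always-daylight comparison hour,
    which controls for seasonal variation without exposure data."""
    return ((case_dark / case_light)
            / (comp_when_case_dark / comp_when_case_light))

# Hypothetical counts: 30% more case-hour accidents in darkness,
# no seasonal shift in the comparison hour:
odds = darkness_odds_ratio(130, 100, 80, 80)  # -> 1.3
```

If the comparison-hour counts also shifted seasonally (say, more winter accidents regardless of light), the denominator would absorb that shift, leaving only the darkness effect in the ratio.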
NOVIN, Vahid; GIVEHCHI, Saeed; HOVEIDI, Hassan
2016-01-01
Background: Reliable methods are crucial for coping with uncertainties in the risk analysis process. The aim of this study was to develop an integrated approach to assessing the risks of benzene in a petrochemical plant that produces benzene. We offer an integrated system that incorporates imprecise variables into the health risk calculation. Methods: The project was conducted in Asaluyeh, southern Iran, from 2013 to 2014. The integrated method combines fuzzy logic and artificial neural networks, each with specific computational properties. Fuzzy logic was used to estimate the absorption rate. Artificial neural networks can reduce noise in the data and so were applied to predict benzene concentration. First, the actual exposure was calculated; it was then combined with Integrated Risk Information System (IRIS) toxicity factors to assess real health risks. Results: High correlation between the measured and predicted benzene concentrations was achieved (R² = 0.941). As for the variable distribution, the best estimate of risk in the population implied that 33% of workers were exposed at levels below 1×10−5 and 67% fell between the 1.0×10−5 and 9.8×10−5 risk levels. The average estimated risk of exposure to benzene across all work zones was 2.4×10−5, ranging from 1.5×10−6 to 6.9×10−5. Conclusion: The integrated model is highly flexible, and the rules can be modified according to the needs of the user in different circumstances. The measured exposures are reproduced well by the proposed model, and realistic risk assessment data can be produced. PMID:27957464
Practical Issues in Implementing Software Reliability Measurement
NASA Technical Reports Server (NTRS)
Nikora, Allen P.; Schneidewind, Norman F.; Everett, William W.; Munson, John C.; Vouk, Mladen A.; Musa, John D.
1999-01-01
Many ways of estimating software systems' reliability, or reliability-related quantities, have been developed over the past several years. Of particular interest are methods that can be used to estimate a software system's fault content prior to test, or to discriminate between components that are fault-prone and those that are not. The results of these methods can be used to: 1) More accurately focus scarce fault identification resources on those portions of a software system most in need of it. 2) Estimate and forecast the risk of exposure to residual faults in a software system during operation, and develop risk and safety criteria to guide the release of a software system to fielded use. 3) Estimate the efficiency of test suites in detecting residual faults. 4) Estimate the stability of the software maintenance process.
NASA Astrophysics Data System (ADS)
Kontos, Despina; Xing, Ye; Bakic, Predrag R.; Conant, Emily F.; Maidment, Andrew D. A.
2010-03-01
We performed a study to compare methods for volumetric breast density estimation in digital mammography (DM) and magnetic resonance imaging (MRI) for a high-risk population of women. DM and MRI images of the unaffected breast from 32 women with recently detected abnormalities and/or previously diagnosed breast cancer (age range 31-78 yrs, mean 50.3 yrs) were retrospectively analyzed. DM images were analyzed using Quantra™ (Hologic Inc.). The MRI images were analyzed using a fuzzy-C-means segmentation algorithm on the T1 map. Both methods were compared to Cumulus (Univ. Toronto). Volumetric breast density estimates from DM and MRI are highly correlated (r=0.90, p<=0.001). The correlation between the volumetric and the area-based density measures is lower and depends on the training background of the Cumulus software user (r=0.73-0.84, p<=0.001). In terms of absolute values, MRI provides the lowest volumetric estimates (mean=14.63%), followed by the DM volumetric (mean=22.72%) and area-based measures (mean=29.35%). The MRI estimates of the fibroglandular volume are statistically significantly lower than the DM estimates for women with very low-density breasts (p<=0.001). We attribute these differences to potential partial volume effects in MRI and differences in the computational aspects of the image analysis methods in MRI and DM. The good correlation between the volumetric and the area-based measures, shown to correlate with breast cancer risk, suggests that both DM and MRI volumetric breast density measures can aid in breast cancer risk assessment. Further work is underway to fully investigate the association between volumetric breast density measures and breast cancer risk.
Bunck, C.M.; Chen, C.-L.; Pollock, K.H.
1995-01-01
Traditional methods of estimating survival from radio-telemetry studies use either the Trent-Rongstad approach (Trent and Rongstad 1974, Heisey and Fuller 1985) or the Kaplan-Meier approach (Kaplan and Meier 1958; Pollock et al. 1989a,b). Both methods appear to require the assumption that relocation probability for animals with a functioning radio is 1. In practice this may not always be reasonable and, in fact, is unnecessary. The number of animals at risk (i.e., risk set) can be modified to account for uncertain relocation of individuals. This involves including only relocated animals in the risk set instead of also including animals not relocated but that were seen later. Simulation results show that estimators and tests for comparing survival curves should be based on this modification.
Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian
2016-07-22
Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 simulated data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.
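The complete-case risk difference examined in the trial simulations is simply the difference in observed event proportions after dropping participants with missing outcomes; a sketch with hypothetical counts:

```python
def cc_risk_difference(events_a, observed_a, events_b, observed_b):
    """Complete-case risk difference between trial arms: participants
    with missing outcomes are excluded and the observed event
    proportions compared directly. Under MCAR/MAR missingness this
    remains unbiased, consistent with the paper's findings."""
    return events_a / observed_a - events_b / observed_b

# Hypothetical arms of 100 with 10 missing outcomes each:
rd = cc_risk_difference(27, 90, 18, 90)  # ~ 0.10
```

Multiple imputation would instead fill in the 10 missing outcomes per arm before computing the difference; the paper's result is that, for MCAR/MAR data, this extra machinery offered no statistical advantage over the simple calculation above.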
Evaluation of volatile organic emissions from hazardous waste incinerators.
Sedman, R M; Esparza, J R
1991-01-01
Conventional methods of risk assessment typically employed to evaluate the impact of hazardous waste incinerators on public health must rely on somewhat speculative emissions estimates or on complicated and expensive sampling and analytical methods. The limited amount of toxicological information concerning many of the compounds detected in stack emissions also complicates the evaluation of the public health impacts of these facilities. An alternative approach aimed at evaluating the public health impacts associated with volatile organic stack emissions is presented that relies on a screening criterion to evaluate total stack hydrocarbon emissions. If the concentration of hydrocarbons in ambient air is below the screening criterion, volatile emissions from the incinerator are judged not to pose a significant threat to public health. Both the screening criterion and a conventional method of risk assessment were employed to evaluate the emissions from 20 incinerators. Use of the screening criterion always yielded a substantially greater estimate of risk than that derived by the conventional method. Since the use of the screening criterion always yielded estimates of risk that were greater than that determined by conventional methods and measuring total hydrocarbon emissions is a relatively simple analytical procedure, the use of the screening criterion would appear to facilitate the evaluation of operating hazardous waste incinerators. PMID:1954928
[Medium-term forecast of solar cosmic rays radiation risk during a manned Mars mission].
Petrov, V M; Vlasov, A G
2006-01-01
Medium-term forecasting of the radiation hazard from solar cosmic rays (SCR) will be vital in a manned Mars mission. Modern methods of space physics lack acceptable reliability in medium-term forecasting of SCR onset and parameters. The proposed estimation of average radiation risk from SCR during the manned Mars mission is made using existing SCR fluence and spectrum models and the correlation of solar particle event frequency with the predicted Wolf number. Radiation risk is considered an additional death probability from acute radiation reactions (ergonomic component) or acute radiation disease in flight. The algorithm for radiation risk calculation is described, and the resulting risk levels for various periods of the 23rd solar cycle are presented. Applicability of this method to advance forecasting and possible improvements are being investigated. Recommendations to the crew based on risk estimation are exemplified.
Monitoring risk-adjusted medical outcomes allowing for changes over time.
Steiner, Stefan H; Mackay, R Jock
2014-10-01
We consider the problem of monitoring and comparing medical outcomes, such as surgical performance, over time. Performance is subject to change due to a variety of reasons including patient heterogeneity, learning, deteriorating skills due to aging, etc. For instance, we expect inexperienced surgeons to improve their skills with practice. We propose a graphical method to monitor surgical performance that incorporates risk adjustment to account for patient heterogeneity. The procedure gives more weight to recent outcomes and down-weights the influence of outcomes further in the past. The chart is clinically interpretable as it plots an estimate of the failure rate for a "standard" patient. The chart also includes a measure of uncertainty in this estimate. We can implement the method using historical data or start from scratch. As the monitoring proceeds, we can base the estimated failure rate on a known risk model or use the observed outcomes to update the risk model as time passes. We illustrate the proposed method with an example from cardiac surgery. © The Author 2013. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
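One way to read the weighting scheme above is as an exponentially weighted update of the standard-patient failure rate; the update rule below is an assumed sketch for illustration, not the authors' exact procedure:

```python
def smoothed_standard_rate(outcomes, expected_risks, lam=0.05, p0=0.1):
    """Sketch of an exponentially weighted, risk-adjusted estimate of
    the failure rate for a 'standard' patient (assumed update rule).
    Each observed-minus-expected difference nudges the estimate;
    smoothing weight lam down-weights outcomes further in the past."""
    p = p0
    for y, r in zip(outcomes, expected_risks):
        # y is the binary outcome (1 = failure); r is the risk-model
        # prediction for this patient. Performance matching the risk
        # model leaves the estimate at the standard rate p0.
        p = (1.0 - lam) * p + lam * (p0 + (y - r))
        p = min(max(p, 0.0), 1.0)  # keep the rate a valid probability
    return p
```

With outcomes exactly matching the risk model, the estimate stays at the standard rate; a run of unexpected failures pulls it upward, with older outcomes fading geometrically, which is the clinically interpretable quantity the chart plots.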
On the difficulty to delimit disease risk hot spots
NASA Astrophysics Data System (ADS)
Charras-Garrido, M.; Azizi, L.; Forbes, F.; Doyle, S.; Peyrard, N.; Abrial, D.
2013-06-01
Representing the health state of a region is a helpful tool to highlight spatial heterogeneity and localize high risk areas. For ease of interpretation and to determine where to apply control procedures, we need to clearly identify and delineate homogeneous regions in terms of disease risk, and in particular disease risk hot spots. However, even if practical purposes require the delineation of different risk classes, such a classification does not correspond to a reality and is thus difficult to estimate. Working with grouped data, a first natural choice is to apply disease mapping models. We apply a usual disease mapping model, producing continuous risk estimates that require a post-processing classification step to obtain clearly delimited risk zones. We also apply a risk partition model that builds a classification of risk levels in a one-step procedure. Working with point data, we focus on the scan statistic clustering method. We illustrate our article with a real example concerning bovine spongiform encephalopathy (BSE), an animal disease whose zones at risk are well known to epidemiologists. We show that in this difficult case of a rare disease and a very heterogeneous population, the different methods provide risk zones that are globally coherent. But, reflecting the dichotomy between practical need and reality, the exact delimitation of the risk zones, as well as the corresponding estimated risks, are quite different.
Blaizot, Stéphanie; Kim, Andrea A; Zeh, Clement; Riche, Benjamin; Maman, David; De Cock, Kevin M; Etard, Jean-François; Ecochard, René
2017-05-01
Estimating HIV incidence is critical for identifying groups at risk for HIV infection, planning and targeting interventions, and evaluating these interventions over time. The use of reliable estimation methods for HIV incidence is thus of high importance. The aim of this study was to compare methods for estimating HIV incidence in a population-based cross-sectional survey. The incidence estimation methods evaluated included assay-derived methods, a testing history-derived method, and a probability-based method applied to data from the Ndhiwa HIV Impact in Population Survey (NHIPS). Incidence rates by sex and age and cumulative incidence as a function of age were presented. HIV incidence ranged from 1.38 [95% confidence interval (CI) 0.67-2.09] to 3.30 [95% CI 2.78-3.82] per 100 person-years overall; 0.59 [95% CI 0.00-1.34] to 2.89 [95% CI 0.86-6.45] in men; and 1.62 [95% CI 0.16-6.04] to 4.03 [95% CI 3.30-4.77] per 100 person-years in women. Women had higher incidence rates than men for all methods. Incidence rates were highest among women aged 15-24 and 25-34 years and highest among men aged 25-34 years. Comparison of different methods showed variations in incidence estimates, but they were in agreement to identify most-at-risk groups. The use and comparison of several distinct approaches for estimating incidence are important to provide the best-supported estimate of HIV incidence in the population.
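A naive form of the assay-derived estimators compared above approximates incidence as recent infections per susceptible person-time; the published estimators additionally correct for false-recent results, and the survey counts below are hypothetical:

```python
def recency_incidence(n_recent, n_negative, mdri_years):
    """Simplified assay-derived incidence estimator: recent infections
    divided by susceptible person-time, approximated as the number
    testing HIV-negative times the mean duration of recent infection
    (MDRI). A simplification of the assay-based estimators in the
    paper, which also adjust for the false-recent rate."""
    return n_recent / (n_negative * mdri_years)

# Hypothetical survey: 20 recent cases, 4000 HIV-negative, MDRI 0.5 y
rate = recency_incidence(20, 4000, 0.5)  # per person-year
```

Multiplying by 100 gives incidence per 100 person-years, the units used in the abstract; comparing such estimates across sex and age strata is how the survey identifies most-at-risk groups.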
A chance constraint estimation approach to optimizing resource management under uncertainty
Michael Bevers
2007-01-01
Chance-constrained optimization is an important method for managing risk arising from random variations in natural resource systems, but the probabilistic formulations often pose mathematical programming problems that cannot be solved with exact methods. A heuristic estimation method for these problems is presented that combines a formulation for order statistic...
Rappazzo, Kristen M; Lobdell, Danelle T; Messer, Lynne C; Poole, Charles; Daniels, Julie L
2017-02-01
Estimating gestational age is usually based on date of last menstrual period (LMP) or clinical estimation (CE); both approaches introduce potential bias. Differences in methods of estimation may lead to misclassification and inconsistencies in risk estimates, particularly if exposure assignment is also gestation-dependent. This paper examines a 'what-if' scenario in which alternative methods are used and attempts to elucidate how method choice affects observed results. We constructed two 20-week gestational age cohorts of pregnancies between 2000 and 2005 (New Jersey, Pennsylvania, Ohio, USA) using live birth certificates: one defined preterm birth (PTB) status using CE and one using LMP. Within these, we estimated risk for 4 categories of preterm birth (PTBs per 10 6 pregnancies) and risk differences (RD (95% CIs)) associated with exposure to particulate matter (PM 2.5 ). More births were classified preterm using LMP (16%) compared with CE (8%). RD divergences increased between cohorts as exposure period approached delivery. Among births between 28 and 31 weeks, week 7 PM 2.5 exposure conveyed RDs of 44 (21 to 67) for CE and 50 (18 to 82) for LMP populations, while week 24 exposure conveyed RDs of 33 (11 to 56) and -20 (-50 to 10), respectively. Different results from analyses restricted to births with both CE and LMP are most likely due to differences in dating methods rather than selection issues. Results are sensitive to choice of gestational age estimation, though degree of sensitivity can vary by exposure timing. When both outcome and exposure depend on estimate of gestational age, awareness of nuances in the method used for estimation is critical. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Reentry survivability modeling
NASA Astrophysics Data System (ADS)
Fudge, Michael L.; Maher, Robert L.
1997-10-01
Statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One finding of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by re-entering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the material that survives reentry to impact Earth's surface. This point was recently demonstrated in dramatic fashion by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This paper examines reentry survivability estimation methodology, including the specific methodology used by Caiman Sciences' 'Survive' model. Comparisons between empirical results (observations of objects recovered on Earth after surviving reentry) and Survive estimates are presented for selected upper stage and spacecraft components and for a Delta launch vehicle second stage.
Kim, Sunduk; Yang, Ji-Yeon; Kim, Ho-Hyun; Yeo, In-Young; Shin, Dong-Chun
2012-01-01
Objectives The purpose of this study was to assess the risk of lead ingestion exposure by particle size of crumb rubber used as artificial turf filling material, with consideration of bioavailability. Methods This study estimated ingestion exposure by particle size (more than 250 μm or less than 250 μm), focusing on recyclable ethylene propylene diene monomer crumb rubber used as artificial turf filling. The crumb rubber was analyzed using an ingestion exposure estimation method that reflects the total content test method, the acid extraction method, and the digestion extraction method. Bioavailability, a calibration factor, was incorporated into the ingestion exposure estimate and applied in the exposure and risk assessments. The two methods, using acid extraction and digestion extraction concentrations, were compared and evaluated. Results For ingestion exposure to the crumb rubber material, the average lead exposure based on the digestion extraction result was calculated to be 1.56×10⁻⁴ mg/kg-day for lower-grade elementary school students and 4.87×10⁻⁵ mg/kg-day for middle and high school students for the particle size of 250 μm or less, and exposure based on the acid extraction result was higher than that based on the digestion extraction result. For both digestion extraction and acid extraction, the hazard quotient was estimated to be about two times higher for particle sizes below 250 μm than for those above 250 μm. In one case, an elementary school student's hazard quotient exceeded 0.1. Conclusions The results of this study confirm that lead ingestion exposure and risk increase as the particle size of crumb rubber decreases. PMID:22355803
Lu, Chunling; Liu, Kai; Li, Lingling; Yang, Yuhong
2017-04-01
Reliable and comparable information on households with catastrophic health expenditure (HCHE) is crucial for monitoring and evaluating our progress towards achieving universal financial risk protection. This study aims to investigate the sensitivity of measuring the progress in financial risk protection to survey design and its socioeconomic and demographic determinants. Using the Rwanda Integrated Living Conditions Survey in 2005 and 2010/2011, we derived the level and trend of the percentage of the HCHE using out-of-pocket health spending data derived from (1) a health module with a two-week recall period and six (2005)/seven (2010/2011) survey questions (Method 1) and (2) a consumption module with a four-week/ten-/12-month recall period and 11 (2005)/24 (2010/2011) questions (Method 2). Using multilevel logistic regression analysis, we investigated the household socioeconomic and demographic characteristics that affected the sensitivity of estimating the HCHE to survey design. We found that Method 1 generated a significantly higher HCHE estimate (9.2%, 95% confidence interval 8.4%-10.0%) than Method 2 (7.4%, 6.6%-8.1%) in 2005 and a lower estimate (5.6%, 5.2%-6.1%) than Method 2 (8.2%, 7.6%-8.7%) in 2010/2011. The estimated trends of the HCHE using the two methods were not consistent between the two years. A household's size, its income quintile, having no under-five children, and educational level of its head were positively associated with the consistency of its HCHE status when using the two survey methods. Estimates of the progress in financial risk protection, especially among the most vulnerable households, are sensitive to survey design. These results are robust to various thresholds of catastrophic health spending. Future work must focus on mitigating survey effects through the development of statistical tools. Copyright © 2017 Elsevier Ltd. All rights reserved.
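The underlying classification is simple: a household is flagged as HCHE when its out-of-pocket health spending exceeds some threshold share of its resources, and the percentage of flagged households is reported. A minimal sketch, assuming a conventional 10%-of-total-consumption threshold (one common choice; not necessarily the thresholds used in the study, and the data below are hypothetical):

```python
def hche_share(households, threshold=0.10):
    """Fraction of households with catastrophic health expenditure.

    A household is flagged when out-of-pocket (OOP) health spending
    exceeds `threshold` times total consumption. The 10% default is
    one conventional definition, not the study's specific threshold.
    """
    flagged = sum(
        1 for oop, total in households
        if total > 0 and oop / total > threshold
    )
    return flagged / len(households)

# (OOP health spending, total consumption) pairs, hypothetical values
sample = [(50, 1000), (200, 900), (5, 400), (120, 1000)]
share = hche_share(sample)  # 2 of 4 households exceed 10%
```

The survey-design sensitivity the abstract describes enters entirely through how the OOP figure itself is measured (recall period, number of questions), not through this classification step.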
STIR Version 1.0 User's Guide for Pesticide Inhalation Risk
STIR estimates inhalation-type exposure based on pesticide-specific information. It also estimates spray droplet exposure using the application method and rate and then compares these exposure estimates to avian and mammalian toxicity data.
Introduction of risk size in the determination of uncertainty factor UFL in risk assessment
NASA Astrophysics Data System (ADS)
Xue, Jinling; Lu, Yun; Velasquez, Natalia; Yu, Ruozhen; Hu, Hongying; Liu, Zhengtao; Meng, Wei
2012-09-01
The methodology for using uncertainty factors in health risk assessment has been developed over several decades. A default value is usually applied for the uncertainty factor UFL, which is used to extrapolate from the LOAEL (lowest observed adverse effect level) to the NAEL (no adverse effect level). Here, we have developed a new method that establishes a linear relationship between UFL and the additional risk level at the LOAEL based on dose-response information, a very important factor that should be carefully considered. This linear formula makes it possible to select UFL properly over the additional risk range from 5.3% to 16.2%. The results also indicate that the default value of 10 may not be conservative enough when the additional risk level at the LOAEL exceeds 16.2%. Furthermore, this novel method not only provides a flexible UFL instead of the traditional default value, but also ensures a conservative estimate of UFL with fewer errors and avoids the benchmark response selection involved in the benchmark dose method. These advantages can improve the estimation of the extrapolation starting point in risk assessment.
Kashcheev, Valery V; Pryakhin, Evgeny A; Menyaylo, Alexander N; Chekin, Sergey Yu; Ivanov, Viktor K
2014-06-01
The current study has two aims: the first is to quantify the difference between radiation risks estimated with the use of organ or effective doses, particularly when planning pediatric and adult computed tomography (CT) examinations. The second aim is to determine the method of calculating organ doses and cancer risk using dose-length product (DLP) for typical routine CT examinations. In both cases, the radiation-induced cancer risks from medical CT examinations were evaluated as a function of gender and age. Lifetime attributable risk values from CT scanning were estimated with the use of ICRP (Publication 103) risk models and Russian national medical statistics data. For populations under the age of 50 y, the risk estimates based on organ doses usually are 30% higher than estimates based on effective doses. In older populations, the difference can be up to a factor of 2.5. The typical distributions of organ doses were defined for Chest Routine, Abdominal Routine, and Head Routine examinations. The distributions of organ doses were dependent on the anatomical region of scanning. The most exposed organs/tissues were thyroid, breast, esophagus, and lungs in cases of Chest Routine examination; liver, stomach, colon, ovaries, and bladder in cases of Abdominal Routine examination; and brain for Head Routine examinations. The conversion factors for calculation of typical organ doses or tissues at risk using DLP were determined. Lifetime attributable risk of cancer estimated with organ doses calculated from DLP was compared with the risk estimated on the basis of organ doses measured with the use of silicon photodiode dosimeters. The estimated difference in LAR is less than 29%.
Fire spread estimation on forest wildfire using ensemble kalman filter
NASA Astrophysics Data System (ADS)
Syarifah, Wardatus; Apriliani, Erna
2018-04-01
Wildfires are among the most frequent disasters in the world; forest wildfires, for example, reduce forest populations. Forest wildfires, whether naturally occurring or prescribed, are potential risks for ecosystems and human settlements. These risks can be managed by monitoring the weather, prescribing fires to limit available fuel, and creating firebreaks. With computer simulations we can predict and explore how fires may spread. A model of fire spread in forest wildfires was established to determine the fire properties; it is based on a diffusion-reaction equation. There are many methods to estimate the spread of fire. The Ensemble Kalman Filter is a modification of the Kalman filter algorithm that can be used to estimate both linear and non-linear system models. This research applies the Ensemble Kalman Filter (EnKF) method to estimate the spread of fire in a forest wildfire. Before the EnKF method is applied, the fire spread model is discretized using the finite difference method. Finally, the analysis is illustrated by numerical simulation. The simulation results show that the EnKF estimate is closer to the system model when the ensemble size is larger and when the covariance values of the system model and the measurements are smaller.
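The pipeline the abstract describes (finite-difference discretization of a diffusion-reaction model, then an EnKF analysis step against sparse observations) can be sketched as follows. This is a generic stochastic EnKF on a toy 1-D diffusion-reaction equation, not the authors' model; the grid size, diffusion and reaction coefficients, observation locations, and noise levels are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy 1-D diffusion-reaction model on a grid (hypothetical parameters) ---
n, dx, dt = 50, 1.0, 0.1
D, r = 1.0, 0.5          # diffusion coefficient and reaction (growth) rate

def step(u):
    """One explicit finite-difference step of du/dt = D*u_xx + r*u*(1-u)."""
    lap = (np.roll(u, 1, axis=0) - 2 * u + np.roll(u, -1, axis=0)) / dx**2
    return u + dt * (D * lap + r * u * (1 - u))

# --- EnKF setup: observe the field at a few grid points only ---
obs_idx = np.array([10, 25, 40])
H = np.zeros((len(obs_idx), n))
H[np.arange(len(obs_idx)), obs_idx] = 1.0
R = 0.01 * np.eye(len(obs_idx))          # measurement noise covariance

def enkf_update(ens, y):
    """Stochastic EnKF analysis step with perturbed observations."""
    m = ens.shape[1]
    X = ens - ens.mean(axis=1, keepdims=True)
    P = X @ X.T / (m - 1)                          # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    y_pert = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=m).T
    return ens + K @ (y_pert - H @ ens)

# Truth run and a 30-member ensemble started from a perturbed state
truth = np.exp(-0.05 * (np.arange(n) - 25.0) ** 2)
ens = truth[:, None] + 0.2 * rng.standard_normal((n, 30))
for _ in range(20):
    truth = step(truth)
    ens = step(ens)                                 # propagate each member
    y = H @ truth + rng.multivariate_normal(np.zeros(3), R)
    ens = enkf_update(ens, y)

err = np.abs(ens.mean(axis=1) - truth).mean()       # ensemble-mean error
```

Increasing the ensemble size improves the sample covariance `P`, which is the mechanism behind the abstract's observation that larger ensembles track the system model more closely.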
A study of lens opacification for a Mars mission
NASA Technical Reports Server (NTRS)
Shinn, J. L.; Wilson, J. W.; Cox, A. B.; Lett, J. T.
1991-01-01
A method based on risk-related cross sections is used to estimate risks of 'stationary' cataracts caused by radiation exposures during extended missions in deep space. Estimates of the even more important risk of late degenerative cataractogenesis are made on the basis of the limited data available. Data on lenticular opacification in the New Zealand white rabbit, an animal model from which such results can be extrapolated to humans, are analyzed by the Langley cosmic ray shielding code (HZETRN) to generate estimates of stationary cataract formation resulting from a Mars mission. The effects of the composition of shielding material and the relationship between risk and LET are given, and the effects of target fragmentation on the risk coefficients are evaluated explicitly.
Methods for Estimating the Social Benefits of EPA Land Cleanup and Reuse Programs (2007)
The Office of Policy, Economics and Innovation’s National Center for Environmental Economics, and the Office of Solid Waste and Emergency Response’s Land Revitalization Office convened a workshop on risk assessment and benefit estimation methods in 2006.
Oei, W; Lieshout-Krikke, R W; Kretzschmar, M E; Zaaijer, H L; Coutinho, R A; Eersel, M; Jubithana, B; Halabi, Y; Gerstenbluth, I; Maduro, E; Tromp, M; Janssen, M P
2016-05-01
The risk of dengue transmitted by travellers is known. Methods to estimate the transfusion-transmission (TT) risk from blood donors travelling to risk areas are available, for instance, the European Up-Front Risk Assessment Tool (EUFRAT). This study aimed to validate the estimated risk from travelling donors obtained from EUFRAT. Surveillance data on notified dengue cases in Suriname and the Dutch Caribbean islands (Aruba, Curaçao, St. Maarten, Bonaire, St. Eustatius and Saba) in 2001-2011 was used to calculate local incidence rates. Information on travel and donation behaviour of Dutch donors was collected. With the EUFRAT model, the TT risks from Dutch travelling donors were calculated. Model estimates were compared with the number of infections in Dutch travellers found by laboratory tests in the Netherlands. The expected cumulative number of donors becoming infected during travels to Suriname and the Dutch Caribbean from 2001 to 2011 was estimated at 5 (95% CI, 2-11) and 86 (45-179), respectively. The infection risk inferred from the laboratory-based study was 19 (9-61) and 28 (14-92). Given the independence of the data sources, these estimates are remarkably close. The model estimated that 0·02 (0·001-0·06) and 0·40 (0·01-1·4) recipients would have been infected by these travelling donors. The EUFRAT model provided an estimate close to the actual observed number of dengue infections. The dengue TT risk among Dutch travelling donors can be estimated using basic transmission, travel and donation information. The TT risk from Dutch donors travelling to Suriname and the Dutch Caribbean is small. © 2016 International Society of Blood Transfusion.
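The core input the abstract mentions, "basic transmission, travel and donation information", combines as local incidence multiplied by donor person-time in the risk area. A back-of-envelope sketch of that first step (EUFRAT itself layers donation timing, window periods, and recipient transmission on top of this; all numbers below are hypothetical, not the study's data):

```python
def expected_infected_donors(incidence_per_100k_py, trips, mean_stay_days):
    """Expected number of donors infected while travelling.

    Multiplies the local incidence rate (converted to per person-day)
    by the total person-time donors spend in the risk area.
    Hypothetical inputs; a simplification of the EUFRAT calculation.
    """
    rate_per_person_day = incidence_per_100k_py / 100_000 / 365
    return rate_per_person_day * trips * mean_stay_days

# e.g. 200 notified cases per 100,000 person-years,
# 10,000 donor trips averaging 14 days each
e = expected_infected_donors(200, 10_000, 14)
```

Comparing such model-based expectations against independently observed laboratory-confirmed infections is exactly the validation exercise the study performs.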
Eisenberg, Jonathan D.; Lee, Richard J.; Gilmore, Michael E.; Turan, Ekin A.; Singh, Sarabjeet; Kalra, Mannudeep K.; Liu, Bob; Kong, Chung Yin; Gazelle, G. Scott
2013-01-01
Purpose: To demonstrate a limitation of lifetime radiation-induced cancer risk metrics in the setting of testicular cancer surveillance—in particular, their failure to capture the delayed timing of radiation-induced cancers over the course of a patient’s lifetime. Materials and Methods: Institutional review board approval was obtained for the use of computed tomographic (CT) dosimetry data in this study. Informed consent was waived. This study was HIPAA compliant. A Markov model was developed to project outcomes in patients with testicular cancer who were undergoing CT surveillance in the decade after orchiectomy. To quantify effects of early versus delayed risks, life expectancy losses and lifetime mortality risks due to testicular cancer were compared with life expectancy losses and lifetime mortality risks due to radiation-induced cancers from CT. Projections of life expectancy loss, unlike lifetime risk estimates, account for the timing of risks over the course of a lifetime, which enabled evaluation of the described limitation of lifetime risk estimates. Markov chain Monte Carlo methods were used to estimate the uncertainty of the results. Results: As an example of evidence yielded, 33-year-old men with stage I seminoma who were undergoing CT surveillance were projected to incur a slightly higher lifetime mortality risk from testicular cancer (598 per 100 000; 95% uncertainty interval [UI]: 302, 894) than from radiation-induced cancers (505 per 100 000; 95% UI: 280, 730). However, life expectancy loss attributable to testicular cancer (83 days; 95% UI: 42, 124) was more than three times greater than life expectancy loss attributable to radiation-induced cancers (24 days; 95% UI: 13, 35). Trends were consistent across modeled scenarios. Conclusion: Lifetime radiation risk estimates, when used for decision making, may overemphasize radiation-induced cancer risks relative to short-term health risks. 
© RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12121015/-/DC1 PMID:23249573
Graphs to estimate an individualized risk of breast cancer.
Benichou, J; Gail, M H; Mulvihill, J J
1996-01-01
Clinicians who counsel women about their risk for developing breast cancer need a rapid method to estimate individualized risk (absolute risk), as well as the confidence limits around that point estimate. The Breast Cancer Detection Demonstration Project (BCDDP) model (sometimes called the Gail model) assumes no genetic model and simultaneously incorporates five risk factors, but involves cumbersome calculations and interpolations. This report provides graphs to estimate the absolute risk of breast cancer from the BCDDP model. The BCDDP recruited 280,000 women from 1973 to 1980 who were monitored for 5 years. From this cohort, 2,852 white women developed breast cancer and 3,146 controls were selected, all with complete risk-factor information. The BCDDP model, previously developed from these data, was used to prepare graphs that relate a specific summary relative-risk estimate to the absolute risk of developing breast cancer over intervals of 10, 20, and 30 years. Once a summary relative risk is calculated, the appropriate graph is chosen that shows the 10-, 20-, or 30-year absolute risk of developing breast cancer. A separate graph gives the 95% confidence limits around the point estimate of absolute risk. Once a clinician rules out a single gene trait that predisposes to breast cancer and elicits information on age and four risk factors, the tables and figures permit an estimation of a woman's absolute risk of developing breast cancer in the next three decades. These results are intended to be applied to women who undergo regular screening. They should be used only in a formal counseling program to maximize a woman's understanding of the estimates and the proper use of them.
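The conversion the graphs encode, from a summary relative risk to an absolute risk over an interval, can be illustrated with the simplest exponential survival formula. This sketch ignores competing mortality, which the actual BCDDP/Gail model accounts for, and the baseline cumulative hazard value is hypothetical:

```python
import math

def absolute_risk(summary_rr, baseline_cum_hazard):
    """Absolute risk over an interval from a summary relative risk.

    Simplified formula: AR = 1 - exp(-RR * H0), where H0 is the
    cumulative baseline hazard over the interval. Ignores competing
    causes of death, unlike the actual BCDDP model.
    """
    return 1.0 - math.exp(-summary_rr * baseline_cum_hazard)

# e.g. a summary RR of 2.0 against a hypothetical 10-year
# baseline cumulative breast-cancer hazard of 0.03
ar = absolute_risk(2.0, 0.03)
```

The published graphs perform this mapping (with the competing-risk correction) so that a counselor can read off the 10-, 20-, or 30-year absolute risk without doing the calculation by hand.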
Towards the estimation of effect measures in studies using respondent-driven sampling.
Rotondi, Michael A
2014-06-01
Respondent-driven sampling (RDS) is an increasingly common sampling technique to recruit hidden populations. Statistical methods for RDS are not straightforward due to the correlation between individual outcomes and subject weighting; thus, analyses are typically limited to estimation of population proportions. This manuscript applies the method of variance estimates recovery (MOVER) to construct confidence intervals for effect measures such as risk difference (difference of proportions) or relative risk in studies using RDS. To illustrate the approach, MOVER is used to construct confidence intervals for differences in the prevalence of demographic characteristics between an RDS study and convenience study of injection drug users. MOVER is then applied to obtain a confidence interval for the relative risk between education levels and HIV seropositivity and current infection with syphilis, respectively. This approach provides a simple method to construct confidence intervals for effect measures in RDS studies. Since it only relies on a proportion and appropriate confidence limits, it can also be applied to previously published manuscripts.
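The MOVER recipe builds a confidence interval for a difference of proportions directly from the point estimates and confidence limits of the two proportions. A minimal sketch using Wilson score intervals for the individual proportions (the paper's RDS application would instead plug in RDS-adjusted proportion estimates and their confidence limits; the counts below are hypothetical):

```python
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score interval for a single proportion: (p, lower, upper)."""
    p = x / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, center - half, center + half

def mover_risk_difference(x1, n1, x2, n2):
    """MOVER confidence interval for the risk difference p1 - p2.

    Recovers variance estimates from the distances between each point
    estimate and its confidence limits, then recombines them.
    """
    p1, l1, u1 = wilson_ci(x1, n1)
    p2, l2, u2 = wilson_ci(x2, n2)
    d = p1 - p2
    lower = d - math.sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
    upper = d + math.sqrt((u1 - p1) ** 2 + (p2 - l2) ** 2)
    return d, lower, upper

# hypothetical counts: 40/100 in one group vs. 25/100 in the other
d, lo, hi = mover_risk_difference(40, 100, 25, 100)
```

Because MOVER needs only a proportion and its confidence limits from each group, the same recombination works for previously published RDS estimates, which is the point the abstract makes.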
Satagopan, Jaya M; Sen, Ananda; Zhou, Qin; Lan, Qing; Rothman, Nathaniel; Langseth, Hilde; Engel, Lawrence S
2016-06-01
Matched case-control studies are popular designs used in epidemiology for assessing the effects of exposures on binary traits. Modern studies increasingly enjoy the ability to examine a large number of exposures in a comprehensive manner. However, several risk factors often tend to be related in a nontrivial way, undermining efforts to identify the risk factors using standard analytic methods due to inflated type-I errors and possible masking of effects. Epidemiologists often use data reduction techniques by grouping the prognostic factors using a thematic approach, with themes deriving from biological considerations. We propose shrinkage-type estimators based on Bayesian penalization methods to estimate the effects of the risk factors using these themes. The properties of the estimators are examined using extensive simulations. The methodology is illustrated using data from a matched case-control study of polychlorinated biphenyls in relation to the etiology of non-Hodgkin's lymphoma. © 2015, The International Biometric Society.
Size Estimation of Groups at High Risk of HIV/AIDS using Network Scale Up in Kerman, Iran
Shokoohi, Mostafa; Baneshi, Mohammad Reza; Haghdoost, Ali-Akbar
2012-01-01
Objective: To estimate the size of groups at high risk of HIV, Network Scale-Up (NSU), an indirect method, was used. Methods: 500 Kermanian males aged 18 to 45 were recruited. Eight groups at high risk of HIV were defined: users of opium, unknown drugs, ecstasy, and alcohol; intravenous drug users (IDUs); males who have extra-marital sex with females (MSF); males who have sex with female sex workers (MFSW); and males who have sex with other males (MSMs). We asked respondents whether they know anybody (probability method), and if yes, how many people (frequency method), in our target groups. Results: Estimates derived with the probability method were higher than with the frequency method. Based on the probability method, 13.7% (95% CI: 11.3%, 16.1%) of males used alcohol at least once in the last year; the corresponding percentage for opium was 13.1% (95% CI: 10.9%, 15.3%). In addition, 12% had extra-marital sex in the last year (95% CI: 10%, 14%), while 7% (95% CI: 5.8%, 8.2%) had sex with a female sex worker. Conclusion: We showed that drug use is more common among young and middle-aged males, although their sexual contacts were also considerable. These percentages show that a special preventive program is needed to control HIV transmission. Estimates derived from the probability method were comparable with data from external sources. The underestimation in the frequency method might be due to the fact that respondents are not aware of sensitive characteristics of everyone in their network, so underreporting is likely to occur. PMID:22891148
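The frequency-method form of the basic network scale-up estimator scales the proportion of target-group members among respondents' network contacts up to the whole population. A minimal sketch, with all numbers hypothetical rather than taken from the Kerman data:

```python
def nsu_size_estimate(reported_known, network_sizes, population):
    """Basic network scale-up estimator for a hidden subpopulation.

    e = (sum of target-group alters reported / sum of personal
    network sizes) * total population. This is the frequency-method
    form; the probability method instead uses only whether each
    respondent knows anyone in the group.
    """
    return sum(reported_known) / sum(network_sizes) * population

# 3 respondents reporting 2, 0 and 1 target-group members in personal
# networks of size 120, 90 and 150, scaled to a hypothetical
# population of 300,000 males
est = nsu_size_estimate([2, 0, 1], [120, 90, 150], 300_000)
```

The abstract's conjecture about underreporting maps directly onto this formula: if respondents are unaware that some contacts belong to a sensitive group, the numerator shrinks and the estimate falls, which is one explanation for the frequency method's lower estimates.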
Simon, Steven L.; Bouville, André; Kleinerman, Ruth
2009-01-01
Biodosimetry measurements can potentially be an important and integral part of the dosimetric methods used in long-term studies of health risk following radiation exposure. Such studies rely on accurate estimation of doses to the whole body or to specific organs of individuals in order to derive reliable estimates of cancer risk. However, dose estimates based on analytical dose reconstruction (i.e., models) or personnel monitoring measurements, e.g., film-badges, can have substantial uncertainty. Biodosimetry can potentially reduce uncertainty in health risk studies by corroborating model-based dose estimates or by using them to assess bias in dose models. While biodosimetry has begun to play a more significant role in long-term health risk studies, its use is still generally limited in that context due to one or more factors including inadequate limits of detection, large inter-individual variability of the signal measured, high per-sample cost, and invasiveness. Presently, the most suitable biodosimetry methods for epidemiologic studies are chromosome aberration frequencies from fluorescence in situ hybridization (FISH) of peripheral blood lymphocytes and electron paramagnetic resonance (EPR) measurements made on tooth enamel. Both types of measurements, however, are usually invasive and require difficult-to-obtain biological samples. Moreover, doses derived from these methods are not always directly relevant to the tissues of interest. To increase the value of biodosimetry to epidemiologic studies, a number of issues need to be considered including limits of detection, effects of inhomogeneous exposure of the body, how to extrapolate from the tissue sampled to the tissues of interest, and how to adjust dosimetry models applied to large populations based on sparse biodosimetry measurements. 
The requirements of health risk studies suggest a set of characteristics that, if satisfied by new biodosimetry methods, would increase the overall usefulness of biodosimetry to determining radiation health risks. PMID:20065672
Quantifying Cancer Risk from Radiation.
Keil, Alexander P; Richardson, David B
2017-12-06
Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposures. We describe and illustrate an approach to estimate population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projecting outside the range of current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on the use of modeling assumptions. © 2017 Society for Risk Analysis.
Integrating Phylodynamics and Epidemiology to Estimate Transmission Diversity in Viral Epidemics
Magiorkinis, Gkikas; Sypsa, Vana; Magiorkinis, Emmanouil; Paraskevis, Dimitrios; Katsoulidou, Antigoni; Belshaw, Robert; Fraser, Christophe; Pybus, Oliver George; Hatzakis, Angelos
2013-01-01
The epidemiology of chronic viral infections, such as those caused by Hepatitis C Virus (HCV) and Human Immunodeficiency Virus (HIV), is affected by the risk group structure of the infected population. Risk groups are defined by each of their members having acquired infection through a specific behavior. However, risk group definitions say little about the transmission potential of each infected individual. Variation in the number of secondary infections is extremely difficult to estimate for HCV and HIV but crucial in the design of efficient control interventions. Here we describe a novel method that combines epidemiological and population genetic approaches to estimate the variation in transmissibility of rapidly-evolving viral epidemics. We evaluate this method using a nationwide HCV epidemic and for the first time co-estimate viral generation times and superspreading events from a combination of molecular and epidemiological data. We anticipate that this integrated approach will form the basis of powerful tools for describing the transmission dynamics of chronic viral diseases, and for evaluating control strategies directed against them. PMID:23382662
A Robust Statistics Approach to Minimum Variance Portfolio Optimization
NASA Astrophysics Data System (ADS)
Yang, Liusha; Couillet, Romain; McKay, Matthew R.
2015-12-01
We study the design of portfolios under a minimum risk criterion. The performance of the optimized portfolio relies on the accuracy of the estimated covariance matrix of the portfolio asset returns. For large portfolios, the number of available market returns is often of similar order to the number of assets, so that the sample covariance matrix performs poorly as a covariance estimator. Additionally, financial market data often contain outliers which, if not correctly handled, may further corrupt the covariance estimation. We address these shortcomings by studying the performance of a hybrid covariance matrix estimator based on Tyler's robust M-estimator and on Ledoit-Wolf's shrinkage estimator while assuming samples with heavy-tailed distribution. Employing recent results from random matrix theory, we develop a consistent estimator of (a scaled version of) the realized portfolio risk, which is minimized by optimizing online the shrinkage intensity. Our portfolio optimization method is shown via simulations to outperform existing methods both for synthetic and real market data.
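The two ingredients the abstract combines, a shrinkage covariance estimate and a minimum-variance weight calculation, can be sketched as follows. This uses simple linear shrinkage toward a scaled identity with a fixed, hand-picked intensity; the paper's hybrid estimator instead combines Tyler's robust M-estimator with Ledoit-Wolf shrinkage and optimizes the intensity online, and the return data below are synthetic:

```python
import numpy as np

def min_variance_weights(returns, shrinkage=0.3):
    """Minimum-variance portfolio from a shrunk covariance estimate.

    sigma = (1 - rho) * S + rho * (tr(S)/n) * I, then
    w = sigma^{-1} 1 / (1' sigma^{-1} 1).
    The fixed shrinkage intensity is a placeholder for the paper's
    optimized intensity.
    """
    S = np.cov(returns, rowvar=False)            # sample covariance (n x n)
    n = S.shape[0]
    target = np.trace(S) / n * np.eye(n)         # scaled-identity target
    sigma = (1 - shrinkage) * S + shrinkage * target
    ones = np.ones(n)
    w = np.linalg.solve(sigma, ones)             # sigma^{-1} 1
    return w / w.sum()                           # normalize so w' 1 = 1

rng = np.random.default_rng(1)
r = rng.standard_normal((250, 10)) * 0.01        # 250 days, 10 synthetic assets
w = min_variance_weights(r)
```

Shrinkage matters precisely in the regime the abstract describes: when the number of return observations is of the same order as the number of assets, the raw sample covariance is ill-conditioned and its inverse amplifies estimation noise in the weights.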
Sacco, Ralph L
2007-06-01
By the year 2010, it is estimated that 18.1 million people worldwide will die annually of cardiovascular disease and stroke. "Global vascular risk" more broadly encompasses the multiple overlapping disease silos of stroke, myocardial infarction, peripheral arterial disease, and vascular death. Estimation of global vascular risk requires consideration of a variety of variables, including demographics, environmental behaviors, and risk factors. Data from multiple studies suggest continuous, linear relationships between vascular risk and the physiological modulators blood pressure, lipids, and blood glucose, arguing against treating these conditions as categorical risk factors. Constellations of risk factors may be more relevant than individual categorical components. Exciting work with novel risk factors may also have predictive value in estimates of global vascular risk. Advances in imaging have led to the measurement of subclinical conditions such as carotid intima-media thickness and subclinical brain conditions such as white matter hyperintensities and silent infarcts. These subclinical measurements may represent intermediate stages in the transition from asymptomatic to symptomatic vascular events, appear to be associated with the fundamental vascular risk factors, and offer opportunities to quantitate disease progression more precisely. The expansion of studies in molecular epidemiology and the detection of genetic markers underlying vascular risks also promise to extend the precision of global vascular risk estimation. Global vascular risk estimation will require quantitative methods that bundle these multi-dimensional data into more precise estimates of future risk. The power of genetic information coupled with data on demographics, risk-inducing behaviors, vascular risk modulators, biomarkers, and measures of subclinical conditions should provide the most realistic approximation of an individual's future global vascular risk.
The ultimate public health benefit, however, will depend on not only identification of global vascular risk but also the realization that we can modify this risk and prove the prediction models wrong.
The 10-year Absolute Risk of Cardiovascular (CV) Events in Northern Iran: a Population Based Study
Motamed, Nima; Mardanshahi, Alireza; Saravi, Benyamin Mohseni; Siamian, Hasan; Maadi, Mansooreh; Zamani, Farhad
2015-01-01
Background: The present study was conducted to estimate 10-year cardiovascular disease (CVD) event risk using three instruments in northern Iran. Material and methods: Baseline data of 3201 participants aged 40-79 from a population-based cohort conducted in northern Iran were analyzed. The Framingham risk score (FRS), World Health Organization (WHO) risk prediction charts and American College of Cardiology/American Heart Association (ACC/AHA) tool were applied to assess 10-year CVD event risk. Agreement between the risk assessment instruments was determined using the kappa statistic. Results: Our study estimated that 53.5% of the male population aged 40-79 had a 10-year risk of CVD events ≥10% based on the ACC/AHA approach, 48.9% based on the FRS and 11.8% based on the WHO risk charts. A 10-year risk ≥10% was estimated among 20.1% of women using the ACC/AHA approach, 11.9% using the FRS and 5.7% using the WHO tool. The ACC/AHA and Framingham tools had the closest agreement in the estimation of 10-year risk ≥10% in men (κ=0.7757), while the ACC/AHA and WHO approaches displayed the highest agreement in women (κ=0.6123). Conclusion: Different estimations of 10-year CVD event risk were provided by the ACC/AHA, FRS and WHO approaches. PMID:26236160
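The agreement values above are Cohen's kappa statistics between pairs of dichotomized (10-year risk ≥10% vs <10%) classifications. A minimal sketch, with hypothetical classifications rather than the study's data:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two binary (0/1) classifications of the same subjects."""
    a, b = np.asarray(a), np.asarray(b)
    po = float(np.mean(a == b))                      # observed agreement
    pe = float(np.mean(a) * np.mean(b)               # agreement expected by chance
               + np.mean(1 - a) * np.mean(1 - b))
    return (po - pe) / (1.0 - pe)

# Hypothetical high-risk (>=10%) flags from two instruments for the same 10 subjects
tool_a = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
tool_b = [1, 1, 0, 0, 0, 0, 0, 1, 1, 0]
kappa = cohens_kappa(tool_a, tool_b)
```

Kappa discounts the agreement two instruments would reach by chance alone, which is why it is preferred over raw percent agreement for comparing risk tools.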
ERIC Educational Resources Information Center
Rule, David L.
Several regression methods were examined within the framework of weighted structural regression (WSR), comparing their regression weight stability and score estimation accuracy in the presence of outlier contamination. The methods compared are: (1) ordinary least squares; (2) WSR ridge regression; (3) minimum risk regression; (4) minimum risk 2;…
Human Health Risk Assessment Strategic Research Action Plan 2016-2019
EPA has designed the HHRA program to develop and apply state-of-the-science risk assessment methods to estimate risks from exposures to individual chemicals, chemical mixtures, and mixtures of chemicals and non-chemical stressors.
Lopez-Quintero, Catalina; Anthony, James C.
2016-01-01
As epidemiologists studying foodborne illness outbreaks, we do not ask luncheon attendees to say which food caused their illnesses. Instead, we use measurement and analysis methods to estimate food-specific risk variations. Here, we adapt the foodborne outbreak approach to develop new estimates of drug use disorder risk for single-drug and polydrug users, without attributing the syndrome to a specific drug when multiple drugs have been used. We estimate drug use disorder risk for cannabis-only users as a reference value. We then derive comparative relative risk estimates for users of other drug subtypes, including polydrug combinations. Data are from the 2002 to 2003 U.S. National Comorbidity Survey Replication, a nationally representative sample of household residents (18+ years), with standardized drug use and drug dependence assessments. Multiple logistic regression provides odds ratio estimates of relative risk. With this approach, for every 1000 cannabis-only users, an estimated 17 had become cases (1.7%). By comparison, polydrug users and cocaine-only users had much greater cumulative incidence (>10%), even with adjustment for covariates and local area matching (P < 0.001). Using this approach, we find exceptionally low risk for cannabis-only users and greater risk for polydrug and cocaine-only users. PMID:26348487
Tuberculosis disease mapping in Kedah using standardized morbidity ratio
NASA Astrophysics Data System (ADS)
Diah, Ijlal Mohd; Aziz, Nazrina; Kasim, Maznah Mat
2017-10-01
This paper presents the results of relative risk estimation applied to tuberculosis (TB) data in Kedah using the most common approach, the Standardized Morbidity Ratio (SMR). Disease mapping is recognized as a method that governments and public health agencies can use to control disease, since it gives a clear picture of risk areas. To produce a good disease map, relative risk estimation is an important issue that needs to be considered. TB risk areas can be recognized through the map. From the results, Kulim shows the lowest risk of contracting TB, while Kota Setar has the highest.
Calculation of out-of-field dose distribution in carbon-ion radiotherapy by Monte Carlo simulation.
Yonai, Shunsuke; Matsufuji, Naruhiro; Namba, Masao
2012-08-01
Recent radiotherapy technologies, including carbon-ion radiotherapy, can improve the dose concentration in the target volume, thereby reducing not only side effects in organs at risk but also the secondary cancer risk within or near the irradiation field. However, the secondary cancer risk in the low-dose region is considered non-negligible, especially for younger patients. To achieve a dose estimation over the whole body of each patient receiving carbon-ion radiotherapy, which is essential for risk assessment and epidemiological studies, Monte Carlo simulation plays an important role because the treatment planning system can provide dose distributions only in or near the irradiation field and the measured data are limited. However, validation of Monte Carlo simulations is necessary. The primary purpose of this study was to establish a calculation method using a Monte Carlo code to estimate the dose and quality factor in the body, and to validate the proposed method by comparison with experimental data. Furthermore, we show the distributions of dose equivalent in a phantom and identify the partial contribution of each radiation type. We proposed a calculation method based on a Monte Carlo simulation using the PHITS code to estimate absorbed dose, dose equivalent, and dose-averaged quality factor, using the Q(L)-L relationship based on the ICRP 60 recommendation. The values obtained by this method in modeling the passive beam line at the Heavy-Ion Medical Accelerator in Chiba were compared with our previously measured data. It was shown that our calculation model can estimate the measured value within a factor of 2, reflecting not only the uncertainty of this calculation method but also that of the assumptions of the geometrical modeling and the PHITS code. We also showed the differences in the doses and the partial contributions of each radiation type between passive and active carbon-ion beams using this calculation method.
These results indicate that it is essential to include the dose from secondary neutrons in assessing the secondary cancer risk of patients receiving carbon-ion radiotherapy with active as well as passive beams. We established a calculation method using Monte Carlo simulation to estimate the distribution of dose equivalent in the body as a first step toward routine risk assessment and epidemiological study of carbon-ion radiotherapy at NIRS. This method has the advantage of being verifiable by measurement.
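The Q(L)-L relationship referred to above is the ICRP 60 piecewise quality factor. A minimal sketch of a dose-averaged quality factor computed from it, over a hypothetical LET spectrum (illustrative values, not HIMAC data):

```python
import numpy as np

def q_icrp60(L):
    """ICRP 60 quality factor vs unrestricted LET L (keV/um):
    Q = 1 for L < 10; Q = 0.32 L - 2.2 for 10 <= L <= 100; Q = 300 / sqrt(L) above."""
    L = np.asarray(L, dtype=float)
    return np.where(L < 10.0, 1.0,
                    np.where(L <= 100.0, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

def dose_averaged_q(doses, lets):
    """Dose-averaged quality factor: sum(D_i * Q(L_i)) / sum(D_i)."""
    doses = np.asarray(doses, dtype=float)
    return float(np.sum(doses * q_icrp60(lets)) / np.sum(doses))

# Hypothetical absorbed doses per LET bin (Gy) and the bins' LET values (keV/um)
q_bar = dose_averaged_q([0.5, 0.3, 0.2], [5.0, 50.0, 400.0])
dose_equivalent = q_bar * 1.0  # Sv, for 1 Gy total absorbed dose at a point
```

The dose equivalent at a point is then the absorbed dose times this dose-averaged Q, which is how the contributions of different radiation types (including secondary neutrons) are weighted in the out-of-field assessment.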
Estimation of the Dose and Dose Rate Effectiveness Factor
NASA Technical Reports Server (NTRS)
Chappell, L.; Cucinotta, F. A.
2013-01-01
Current models to estimate radiation risk use the Life Span Study (LSS) cohort, which received high doses and high dose rates of radiation. Transferring risks from these high dose rates to the low doses and dose rates received by astronauts in space is a source of uncertainty in our risk calculations. The solid cancer models recommended by BEIR VII [1], UNSCEAR [2], and Preston et al [3] are fitted adequately by a linear dose response model, which implies that low doses and dose rates would carry the same risk per unit dose as high doses and dose rates. However, animal and cell experiments imply there should be curvature in the dose response curve for tumor induction. Furthermore, animal experiments that directly compare chronic to acute exposures show lower increases in tumor induction for chronic exposures. A dose and dose rate effectiveness factor (DDREF) has been estimated and applied to transfer risks from the high doses and dose rates of the LSS cohort to low doses and dose rates such as those from missions in space. The BEIR VII committee [1] combined DDREF estimates from the LSS cohort and animal experiments using Bayesian methods to arrive at its recommended DDREF value of 1.5 with uncertainty. We reexamined the animal data considered by BEIR VII and included additional animal data and human chromosome aberration data to improve the DDREF estimate. Several experiments chosen by BEIR VII were deemed inappropriate for application to human models of solid cancer risk. Animal tumor experiments performed by Ullrich et al [4], Alpen et al [5], and Grahn et al [6] were analyzed to estimate the DDREF. Human chromosome aberration experiments performed on a sample of astronauts within NASA were also available to estimate the DDREF. The LSS cohort results reported by BEIR VII were combined with the new radiobiology results using Bayesian methods.
Improving Estimation of Ground Casualty Risk From Reentering Space Objects
NASA Technical Reports Server (NTRS)
Ostrom, Chris L.
2017-01-01
A recent improvement to the long-term estimation of ground casualties from reentering space debris is the further refinement and update to the human population distribution. Previous human population distributions were based on global totals with simple scaling factors for future years, or a coarse grid of population counts in a subset of the world's countries, each cell having its own projected growth rate. The newest population model includes a 5-fold refinement in both latitude and longitude resolution. All areas along a single latitude are combined to form a global population distribution as a function of latitude, creating a more accurate population estimation based on non-uniform growth at the country and area levels. Previous risk probability calculations used simplifying assumptions that did not account for the ellipsoidal nature of the Earth. The new method uses first, a simple analytical method to estimate the amount of time spent above each latitude band for a debris object with a given orbit inclination and second, a more complex numerical method that incorporates the effects of a non-spherical Earth. These new results are compared with the prior models to assess the magnitude of the effects on reentry casualty risk.
Nonparametric estimation of benchmark doses in environmental risk assessment
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits' small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
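The isotonic-regression idea can be illustrated with the classic pool-adjacent-violators algorithm (PAVA) plus linear interpolation of the fitted added risk. This is a bare-bones sketch with made-up quantal data, not the paper's estimator or its bootstrap confidence limits.

```python
def pava(y, w):
    """Pool-adjacent-violators: weighted least-squares fit that is
    non-decreasing in the dose index."""
    vals, wts, counts = [], [], []
    for yi, wi in zip(y, w):
        vals.append(float(yi)); wts.append(float(wi)); counts.append(1)
        # merge adjacent blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            merged = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / (wts[-2] + wts[-1])
            w2, c2 = wts[-2] + wts[-1], counts[-2] + counts[-1]
            vals[-2:] = [merged]; wts[-2:] = [w2]; counts[-2:] = [c2]
    out = []
    for v, c in zip(vals, counts):
        out.extend([v] * c)
    return out

def benchmark_dose(doses, fitted, bmr):
    """Smallest dose where the fitted added risk over background reaches bmr,
    by linear interpolation between dose groups."""
    target = fitted[0] + bmr
    for d0, f0, d1, f1 in zip(doses, fitted, doses[1:], fitted[1:]):
        if f1 >= target:
            return d0 if f1 == f0 else d0 + (target - f0) * (d1 - d0) / (f1 - f0)
    return None  # benchmark response not reached within the tested dose range

doses = [0.0, 1.0, 2.0, 4.0]
props = [0.04, 0.02, 0.20, 0.40]   # observed response proportions per dose group
fitted = pava(props, [50.0] * 4)   # e.g. 50 animals per group
bmd = benchmark_dose(doses, fitted, bmr=0.10)
```

Because the fit is only constrained to be monotone, no parametric dose-response shape can be misspecified, which is the motivation the abstract gives for the nonparametric approach.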
Comparison of methods for estimating the attributable risk in the context of survival analysis.
Gassama, Malamine; Bénichou, Jacques; Dartois, Laureen; Thiébaut, Anne C M
2017-01-23
The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating the AR defined in terms of survival functions: two nonparametric methods based on Kaplan-Meier's estimator, one semiparametric based on Cox's model, and one parametric based on the piecewise constant hazards model, as well as one simpler method based on estimated exposure prevalence at baseline and Cox's model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation, and 95% confidence interval coverage probability at four equally spaced time points. Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than the other approaches. All methods showed satisfactory coverage except the nonparametric methods at the end of follow-up, especially for a sample size of 1,000. With nonproportional hazards, nonparametric methods yielded results similar to those under proportional hazards, whereas the semiparametric and parametric approaches, which both relied on the proportional hazards assumption, performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort.
In practice, our study suggests using the semiparametric or parametric approach to estimate AR as a function of time in cohort studies when the proportional hazards assumption appears appropriate.
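For intuition about the simplest of the compared estimators, the prevalence-plus-hazard-ratio method can be sketched with a Levin-type formula. Note that this ignores the time dependence that the survival-based definitions in the study capture, and the numbers are illustrative.

```python
def attributable_risk(prevalence, hazard_ratio):
    """Levin-type attributable risk from baseline exposure prevalence p and
    a relative-risk-like measure (here a Cox hazard ratio HR):
    AR = p(HR - 1) / (1 + p(HR - 1))."""
    excess = prevalence * (hazard_ratio - 1.0)
    return excess / (1.0 + excess)

# Illustrative values: 30% of the cohort exposed, hazard ratio 1.5
ar = attributable_risk(prevalence=0.3, hazard_ratio=1.5)
```

With no excess hazard (HR = 1) the formula correctly attributes no cases to the exposure; as the HR or the prevalence grows, the attributable proportion approaches 1.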
Sign realized jump risk and the cross-section of stock returns: Evidence from China's stock market.
Chao, Youcong; Liu, Xiaoqun; Guo, Shijun
2017-01-01
Using 5-minute high-frequency data from the Chinese stock market, we employ a non-parametric method to estimate realized jumps in Fama-French portfolios and investigate whether the estimated positive, negative and sign realized jumps can forecast or explain cross-sectional stock returns. The Fama-MacBeth regression results show not only that the realized jump components and the continuous volatility are compensated with a risk premium, but also that the negative jump risk, the positive jump risk and the sign jump risk can, to some extent, explain the returns of the stock portfolios. Therefore, close attention should be paid to both the downside and the upside tail risk.
Ungers, L J; Moskowitz, P D; Owens, T W; Harmon, A D; Briggs, T M
1982-02-01
Determining occupational health and safety risks posed by emerging technologies is difficult because of limited statistics. Nevertheless, estimates of such risks must be constructed to permit comparison of various technologies to identify the most attractive processes. One way to estimate risks is to use statistics on related industries. Based on process labor requirements and associated occupational health data, risks to workers and to society posed by an emerging technology can be calculated. Using data from the California semiconductor industry, this study applies a five-step occupational risk assessment procedure to four processes for the fabrication of photovoltaic cells. The validity of the occupational risk assessment method is discussed.
ERIC Educational Resources Information Center
Johnson, Andrew O.; Mink, Michael D.; Harun, Nusrat; Moore, Charity G.; Martin, Amy B.; Bennett, Kevin J.
2008-01-01
Objectives: The purpose of this study was to compare national estimates of drug use and exposure to violence between rural and urban teens. Methods: Twenty-eight dependent variables from the 2003 Youth Risk Behavior Survey were used to compare violent activities, victimization, suicidal behavior, tobacco use, alcohol use, and illegal drug use…
Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations
NASA Astrophysics Data System (ADS)
Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.
2014-02-01
The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
Evidence That Breast Tissue Stiffness Is Associated with Risk of Breast Cancer
Boyd, Norman F.; Li, Qing; Melnichouk, Olga; Huszti, Ella; Martin, Lisa J.; Gunasekara, Anoma; Mawdsley, Gord; Yaffe, Martin J.; Minkin, Salomon
2014-01-01
Background: Evidence from animal models shows that tissue stiffness increases the invasion and progression of cancers, including mammary cancer. Here we use measurements of the volume and the projected area of the compressed breast during mammography to derive estimates of breast tissue stiffness and examine the relationship of stiffness to risk of breast cancer. Methods: Mammograms were used to measure the volume and projected areas of total and radiologically dense breast tissue in the unaffected breasts of 362 women with newly diagnosed breast cancer (cases) and 656 women of the same age who did not have breast cancer (controls). Measures of breast tissue volume and the projected area of the compressed breast during mammography were used to calculate the deformation of the breast during compression and, with the recorded compression force, to estimate the stiffness of breast tissue. Stiffness was compared in cases and controls, and associations with breast cancer risk were examined after adjustment for other risk factors. Results: After adjustment for percent mammographic density by area measurements and other risk factors, our estimate of breast tissue stiffness was significantly associated with breast cancer (odds ratio = 1.21, 95% confidence interval = 1.03, 1.43, p = 0.02) and improved breast cancer risk prediction in models with percent mammographic density, by both area and volume measurements. Conclusion: An estimate of breast tissue stiffness was associated with breast cancer risk and improved risk prediction based on mammographic measures and other risk factors. Stiffness may provide an additional mechanism by which breast tissue composition is associated with risk of breast cancer and merits examination using more direct methods of measurement. PMID:25010427
Assessing Hospital Performance After Percutaneous Coronary Intervention Using Big Data.
Spertus, Jacob V; T Normand, Sharon-Lise; Wolf, Robert; Cioffi, Matt; Lovett, Ann; Rose, Sherri
2016-11-01
Although risk adjustment remains a cornerstone for comparing outcomes across hospitals, optimal strategies continue to evolve in the presence of many confounders. We compared a conventional regression-based model with approaches particularly suited to leveraging big data. We assessed hospital all-cause 30-day excess mortality risk among 8952 adults undergoing percutaneous coronary intervention between October 1, 2011, and September 30, 2012, in 24 Massachusetts hospitals, using clinical registry data linked with billing data. We compared conventional logistic regression models with augmented inverse probability weighted estimators and targeted maximum likelihood estimators to generate more efficient and unbiased estimates of hospital effects. We also compared a clinically informed and a machine-learning approach to confounder selection, using elastic net penalized regression in the latter case. Hospital excess risk estimates ranged from -1.4% to 2.0% across methods and confounder sets. Some hospitals were consistently classified as low or as high excess mortality outliers; others changed classification depending on the method and confounder set used. Switching from the clinically selected list of 11 confounders to a full set of 225 confounders increased the estimation uncertainty by an average of 62% across methods, as measured by confidence interval length. Agreement among methods ranged from fair, with a κ statistic of 0.39 (SE: 0.16), to perfect, with a κ of 1 (SE: 0.0). Modern causal inference techniques should be more frequently adopted to leverage big data while minimizing bias in hospital performance assessments. © 2016 American Heart Association, Inc.
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Eckermann, Simon; Coory, Michael; Willan, Andrew R
2011-02-01
Economic analysis and assessment of net clinical benefit often require estimation of the absolute risk difference (ARD) for binary outcomes (e.g. survival, response, disease progression) given baseline epidemiological risk in a jurisdiction of interest and trial evidence of treatment effects. Typically, the assumption is made that relative treatment effects are constant across baseline risk, in which case relative risks (RR) or odds ratios (OR) can be applied to estimate ARD. The objective of this article is to establish whether such use of RR or OR allows consistent estimates of ARD. ARD is calculated under alternative framings of effects (e.g. mortality vs survival) by applying standard methods for translating evidence with RR and OR. For RR, the RR is applied to baseline risk in the jurisdiction to estimate treatment risk; for OR, the baseline risk is converted to odds, the OR applied and the resulting treatment odds converted back to risk. ARD is shown to be consistently estimated with OR but to change with the framing of effects using RR wherever there is a treatment effect and epidemiological risk differs from trial risk. Additionally, in indirect comparisons, ARD is shown to be consistently estimated with OR, while calculation with RR allows alternative framings of effects to produce inconsistency in the direction, let alone the extent, of ARD. OR ensures consistent calculation of ARD in translating evidence from trial settings and across trials in direct and indirect comparisons, avoiding the inconsistencies that arise from RR under alternative outcome framings and the associated biases. These findings are critical for consistently translating evidence to inform economic analysis and assessment of net clinical benefit, as translation of evidence is proposed precisely where the advantages of OR over RR arise.
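The OR translation described in this abstract (risk to odds, apply the OR, odds back to risk) can be sketched as follows. The consistency property is that reframing the outcome, e.g. survival instead of mortality with baseline risk 1 - r and odds ratio 1/OR, flips only the sign of the ARD, never its magnitude. Values are illustrative.

```python
def ard_from_or(baseline_risk, odds_ratio):
    """Absolute risk difference via an odds ratio:
    convert risk to odds, apply the OR, convert back to risk."""
    odds = baseline_risk / (1.0 - baseline_risk)
    treated_odds = odds * odds_ratio
    treated_risk = treated_odds / (1.0 + treated_odds)
    return baseline_risk - treated_risk

r, OR = 0.4, 0.5  # illustrative baseline mortality risk and treatment OR
ard_mortality = ard_from_or(r, OR)             # effect framed as mortality
ard_survival = ard_from_or(1.0 - r, 1.0 / OR)  # same effect framed as survival
```

Repeating the exercise with an RR instead of an OR (treated risk = RR times baseline risk) gives ARDs whose magnitude depends on the framing whenever the jurisdiction's baseline risk differs from the trial's, which is the inconsistency the article identifies.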
Grundy, Anne; Poirier, Abbey E.; Khandwala, Farah; Grevers, Xin; Friedenreich, Christine M.; Brenner, Darren R.
2017-01-01
Background: Estimates of the proportion of cancer cases that can be attributed to modifiable risk factors are not available for Canada and, more specifically, Alberta. The purpose of this study was to estimate the total proportion of cancer cases in Alberta in 2012 that could be attributed to a set of 24 modifiable lifestyle and environmental risk factors. Methods: We estimated summary population attributable risk estimates for 24 risk factors (smoking [both passive and active], overweight and obesity, inadequate physical activity, diet [inadequate fruit and vegetable consumption, inadequate fibre intake, excess red and processed meat consumption, salt consumption, inadequate calcium and vitamin D intake], alcohol, hormones [oral contraceptives and hormone therapy], infections [Epstein-Barr virus, hepatitis B and C viruses, human papillomavirus, Helicobacter pylori], air pollution, natural and artificial ultraviolet radiation, radon and water disinfection by-products) by combining population attributable risk estimates for each of the 24 factors that had been previously estimated. To account for the possibility that individual cancer cases were the result of a combination of multiple risk factors, we subtracted the population attributable risk for the first factor from 100% and then applied the population attributable risk for the second factor to the remaining proportion that was not attributable to the first factor. We repeated this process in sequential order for all relevant exposures. Results: Overall, an estimated 40.8% of cancer cases in Alberta in 2012 were attributable to modifiable lifestyle and environmental risk factors. The largest proportion of cancers were estimated to be attributable to tobacco smoking, physical inactivity and excess body weight. The summary population attributable risk estimate was slightly higher among women (42.4%) than among men (38.7%). 
Interpretation: About 41% of cancer cases in Alberta may be attributable to known modifiable lifestyle and environmental risk factors. Reducing the prevalence of these factors in the Alberta population has the potential to substantially reduce the provincial cancer burden. PMID:28687643
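The sequential combination described in the Methods above, where each factor's population attributable risk applies only to the proportion of cases not already attributed, can be sketched as:

```python
def combined_paf(pafs):
    """Combine individual population attributable risks multiplicatively:
    the unattributed proportion shrinks by a factor (1 - PAF) at each step."""
    remaining = 1.0
    for paf in pafs:
        remaining *= (1.0 - paf)
    return 1.0 - remaining

# Illustrative per-factor PAFs, not the study's actual estimates
summary = combined_paf([0.15, 0.07, 0.05])
```

The result is order-independent and can never exceed 100%, unlike a naive sum of the individual PAFs, which is why the sequential adjustment is used when cases may result from combinations of multiple risk factors.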
How are flood risk estimates affected by the choice of return-periods?
NASA Astrophysics Data System (ADS)
Ward, P. J.; de Moel, H.; Aerts, J. C. J. H.
2011-12-01
Flood management is increasingly adopting a risk-based approach, whereby flood risk is the product of the probability and consequences of flooding. One of the most common approaches in flood risk assessment is to estimate the damage that would occur for floods of several exceedance probabilities (or return periods), to plot these on an exceedance probability-loss curve (risk curve), and to estimate risk as the area under the curve. However, there is little insight into how the selection of the return periods (which ones and how many) used to calculate risk actually affects the final risk calculation. To gain such insights, we developed and validated an inundation model capable of rapidly simulating inundation extent and depth, and dynamically coupled it to an existing damage model. The method was applied to a section of the River Meuse in the southeast of the Netherlands. Firstly, we estimated risk based on a risk curve using yearly return periods from 2 to 10,000 yr (€34 million p.a.). We found that the overall risk is greatly affected by the number of return periods used to construct the risk curve, with over-estimations of annual risk between 33% and 100% when only three return periods are used. In addition, binary assumptions on dike failure can have a large effect (a factor of two difference) on risk estimates. The minimum and maximum return periods considered in the curve also affect the risk estimate considerably. The results suggest that more research is needed to develop relatively simple inundation models that can be used to produce large numbers of inundation maps, complementary to more complex 2-D/3-D hydrodynamic models. They also suggest that research into flood risk could benefit from paying more attention to the damage caused by relatively high probability floods.
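The risk-curve calculation described above (damages at several return periods, risk as the area under the exceedance probability-loss curve) can be sketched with the trapezoidal rule. The return periods and damages below are hypothetical, not the Meuse values.

```python
import numpy as np

def annual_risk(return_periods, damages):
    """Expected annual damage: trapezoidal area under the exceedance
    probability-loss curve, with exceedance probability = 1 / return period."""
    probs = 1.0 / np.asarray(return_periods, dtype=float)
    dmg = np.asarray(damages, dtype=float)
    order = np.argsort(probs)          # integrate over increasing probability
    p, d = probs[order], dmg[order]
    return float(np.sum((d[1:] + d[:-1]) / 2.0 * np.diff(p)))

# Hypothetical return periods (years) and corresponding damages (million EUR)
coarse = annual_risk([10, 100, 1000], [20.0, 150.0, 400.0])
fine = annual_risk([2, 10, 50, 100, 500, 1000],
                   [0.0, 20.0, 90.0, 150.0, 300.0, 400.0])
```

Adding or dropping return periods changes the trapezoidal approximation of the curve, and hence the risk estimate, which is exactly the sensitivity the study quantifies.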
NASA Astrophysics Data System (ADS)
Freeman, P. E.; Izbicki, R.; Lee, A. B.
2017-07-01
Photometric redshift estimation is an indispensable tool of precision cosmology. One problem that plagues the use of this tool in the era of large-scale sky surveys is that the bright galaxies that are selected for spectroscopic observation do not have properties that match those of (far more numerous) dimmer galaxies; thus, ill-designed empirical methods that produce accurate and precise redshift estimates for the former generally will not produce good estimates for the latter. In this paper, we provide a principled framework for generating conditional density estimates (i.e. photometric redshift PDFs) that takes into account selection bias and the covariate shift that this bias induces. We base our approach on the assumption that the probability that astronomers label a galaxy (i.e. determine its spectroscopic redshift) depends only on its measured (photometric and perhaps other) properties x and not on its true redshift. With this assumption, we can explicitly write down risk functions that allow us to both tune and compare methods for estimating importance weights (i.e. the ratio of densities of unlabelled and labelled galaxies for different values of x) and conditional densities. We also provide a method for combining multiple conditional density estimates for the same galaxy into a single estimate with better properties. We apply our risk functions to an analysis of ≈10⁶ galaxies, mostly observed by the Sloan Digital Sky Survey, and demonstrate through multiple diagnostic tests that our method achieves good conditional density estimates for the unlabelled galaxies.
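The importance-weight idea, i.e. the ratio of unlabelled to labelled densities at a given x, can be illustrated with a toy one-dimensional histogram estimator. The two distributions below are invented stand-ins for the "bright" (labelled) and "dim" (unlabelled) samples; the paper itself tunes such weights with purpose-built risk functions rather than this naive estimator:

```python
import random

random.seed(0)

# Toy covariate-shift setting: labelled ("bright") objects are drawn
# from a distribution skewed toward low x; unlabelled ("dim") ones
# toward high x. Both distributions are invented for illustration.
labelled   = [random.betavariate(2, 5) for _ in range(5000)]
unlabelled = [random.betavariate(5, 2) for _ in range(5000)]

# Histogram-based density-ratio estimate w(x) ~ f_unlab(x) / f_lab(x).
BINS = 10
def hist(xs):
    counts = [0] * BINS
    for x in xs:
        counts[min(int(x * BINS), BINS - 1)] += 1
    return [c / len(xs) for c in counts]

f_lab, f_unlab = hist(labelled), hist(unlabelled)
weights = [fu / fl if fl > 0 else 0.0 for fu, fl in zip(f_unlab, f_lab)]

# Weights up-weight the region the labelled sample under-covers.
print([round(w, 2) for w in weights])
```

The weights are small where labelled objects are plentiful and large where they are scarce, which is the correction a weighted conditional density estimator would apply.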
Galizzi, Matteo M; Miraldo, Marisa; Stavropoulou, Charitini
2016-05-01
We present results from a hypothetical framed field experiment assessing whether risk preferences significantly differ across the health and financial domains when they are elicited through the same multiple price list paired-lottery method. We consider a sample of 300 patients attending outpatient clinics in a university hospital in Athens during the Greek financial crisis. Risk preferences in finance were elicited using paired-lottery questions with hypothetical payments. The questions were adapted to the health domain by framing the lotteries as risky treatments in hypothetical health care scenarios. Using maximum likelihood methods, we estimated the degree of risk aversion, allowing for the estimates to be dependent on domain and individual characteristics. The subjects in our sample, who were exposed to both health and financial distress, tended to be less risk averse in the financial domain than in the health domain. © The Author(s) 2016.
Michael J. Firko; Jane Leslie Hayes
1990-01-01
Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and subsequent estimation of heritability (h2) of resistance. Sibling analysis and...
Estimating Stability Class in the Field
Leonidas G. Lavdas
1997-01-01
A simple and easily remembered method is described for estimating cloud ceiling height in the field. Estimating ceiling height provides the means to estimate stability class, a parameter used to help determine Dispersion Index and Low Visibility Occurrence Risk Index, indices used as smoke management aids. Stability class is also used as an input to VSMOKE, an...
Acevedo-Fontánez, Adrianna I; Suárez, Erick; Torres Cintrón, Carlos R; Ortiz, Ana P
2018-04-11
The aim of the study was to estimate the magnitude of the association between HPV-related gynecological neoplasms and secondary anal cancer among women in Puerto Rico (PR). We identified 9,489 women who had been diagnosed with a primary cervical, vaginal, or vulvar tumor during 1987-2013. To describe the trends of invasive cervical, vulvar, vaginal, and anal cancer, the age-adjusted incidence rates were estimated using the direct method (2000 US as Standard Population). Standardized incidence ratios (observed/expected) were computed using the indirect method; expected cases were calculated using 2 methods based on age-specific rates of anal cancer in PR. The ratio of standardized incidence ratios of anal cancer was estimated using the Poisson regression model to estimate the magnitude of the association between HPV-gynecologic neoplasms and secondary anal cancer. A significant increase in the incidence trend for anal cancer was observed from 1987 to 2013 (annual percent change = 1.1, p < .05), whereas from 2004 to 2013, an increase was observed for cervical cancer incidence (annual percent change = 3.3, p < .05). The risk of secondary anal cancer among women with HPV-related gynecological cancers was approximately 3 times that among women with non-HPV-related gynecological cancers (relative risk = 3.27, 95% CI = 1.37 to 7.79). Anal cancer is increasing among women in PR. Women with gynecological HPV-related tumors are at higher risk of secondary anal cancer as compared with women from the general population and with those with non-HPV-related gynecological cancers. Appropriate anal cancer screening guidelines for high-risk populations are needed, including women with HPV-related gynecological malignancies and potentially other cancer survivors.
Estimation and prediction under local volatility jump-diffusion model
NASA Astrophysics Data System (ADS)
Kim, Namhyoung; Lee, Younhee
2018-02-01
Volatility is an important factor in operating a company and managing risk. In the portfolio optimization and risk hedging using the option, the value of the option is evaluated using the volatility model. Various attempts have been made to predict option value. Recent studies have shown that stochastic volatility models and jump-diffusion models reflect stock price movements accurately. However, these models have practical limitations. Combining them with the local volatility model, which is widely used among practitioners, may lead to better performance. In this study, we propose a more effective and efficient method of estimating option prices by combining the local volatility model with the jump-diffusion model and apply it using both artificial and actual market data to evaluate its performance. The calibration process for estimating the jump parameters and local volatility surfaces is divided into three stages. We apply the local volatility model, stochastic volatility model, and local volatility jump-diffusion model estimated by the proposed method to KOSPI 200 index option pricing. The proposed method displays good estimation and prediction performance.
NASA Technical Reports Server (NTRS)
Edmonds, L. D.
2016-01-01
Since advancing technology has been producing smaller structures in electronic circuits, the floating gates in modern flash memories are becoming susceptible to prompt charge loss from ionizing radiation environments found in space. A method for estimating the risk of a charge-loss event is given.
The gender- and age-specific 10-year and lifetime absolute fracture risk in Tromsø, Norway.
Ahmed, Luai A; Schirmer, Henrik; Bjørnerem, Ashild; Emaus, Nina; Jørgensen, Lone; Størmer, Jan; Joakimsen, Ragnar M
2009-01-01
The aim of this study is to estimate the gender- and age-specific 10-year and lifetime absolute risks of non-vertebral and osteoporotic (including hip, distal forearm and proximal humerus) fractures in a large cohort of men and women. This is a population-based 10-year follow-up study of 26,891 subjects aged 25 years and older in Tromsø, Norway. All non-vertebral fractures were registered from 1995 through 2004 by computerized search in radiographic archives. Absolute risks were estimated by the life-table method, taking into account the competing risk of death. The absolute fracture risk at each year of age was estimated for the next 10 years (10-year risk) or up to the age of 90 years (lifetime risk). The estimated 10-year absolute risk of all non-vertebral fractures was higher in men than in women before, but not after, the age of 45 years. The 10-year absolute risk for non-vertebral and osteoporotic fractures exceeded 10% in men over 65 and 70 years of age, respectively, and in women over 45 and 50 years of age. The 10-year absolute risks of hip fracture at the ages of 65 and 80 years were 4.2 and 18.6% in men, and 9.0 and 24.0% in women, respectively. The risk estimates for distal forearm and proximal humerus fractures were under 5% in men and 13% in women. The estimated lifetime risks for all fracture locations were higher in women than in men at all ages. At the age of 50 years, the risks were 38.1 and 24.8% in men and 67.4 and 55.0% in women for all non-vertebral and osteoporotic fractures, respectively. The estimated gender- and age-specific 10-year and lifetime absolute fracture risks were higher in Tromsø than in other populations. The high lifetime fracture risk reflects the increased burden of fractures in this cohort.
Pletcher, Mark J; Tice, Jeffrey A; Pignone, Michael; McCulloch, Charles; Callister, Tracy Q; Browner, Warren S
2004-01-01
Background The coronary artery calcium (CAC) score is an independent predictor of coronary heart disease. We sought to combine information from the CAC score with information from conventional cardiac risk factors to produce post-test risk estimates, and to determine whether the score may add clinically useful information. Methods We measured the independent cross-sectional associations between conventional cardiac risk factors and the CAC score among asymptomatic persons referred for non-contrast electron beam computed tomography. Using the resulting multivariable models and published CAC score-specific relative risk estimates, we estimated post-test coronary heart disease risk in a number of different scenarios. Results Among 9341 asymptomatic study participants (age 35–88 years, 40% female), we found that conventional coronary heart disease risk factors including age, male sex, self-reported hypertension, diabetes and high cholesterol were independent predictors of the CAC score, and we used the resulting multivariable models for predicting post-test risk in a variety of scenarios. Our models predicted, for example, that a 60-year-old non-smoking non-diabetic woman with hypertension and high cholesterol would have a 47% chance of having a CAC score of zero, reducing her 10-year risk estimate from 15% (per Framingham) to 6–9%; if her score were over 100, however (a 17% chance), her risk estimate would be markedly higher (25–51% in 10 years). In low-risk scenarios, the CAC score is very likely to be zero or low, and unlikely to change management. Conclusion Combining information from the CAC score with information from conventional risk factors can change the assessment of coronary heart disease risk to an extent that may be clinically important, especially when the pre-test 10-year risk estimate is intermediate. The attached spreadsheet makes these calculations easy. PMID:15327691
Remarks on a financial inverse problem by means of Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Cuomo, Salvatore; Di Somma, Vittorio; Sica, Federica
2017-10-01
Estimating the price of a barrier option is a typical inverse problem. In this paper we present a numerical and statistical framework for a market with a risk-free interest rate and a risky asset, described by a Geometric Brownian Motion (GBM). After approximating the risky asset with a numerical method, we find the final option price by following an approach based on sequential Monte Carlo methods. All theoretical results are applied to the case of an option whose underlying is a real stock.
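The forward problem underlying this inverse problem can be sketched with plain Monte Carlo pricing of a barrier option under GBM. All parameters below are invented for illustration, and this is ordinary (not sequential) Monte Carlo:

```python
import math, random

random.seed(42)

# Illustrative parameters (not from the paper).
S0, K, B = 100.0, 100.0, 85.0   # spot, strike, down-and-out barrier
r, sigma, T = 0.03, 0.2, 1.0    # risk-free rate, volatility, maturity
steps, paths = 100, 5_000

dt = T / steps
payoffs = []
for _ in range(paths):
    S, knocked_out = S0, False
    for _ in range(steps):
        z = random.gauss(0.0, 1.0)
        # Exact GBM step: S <- S * exp((r - sigma^2/2) dt + sigma sqrt(dt) Z)
        S *= math.exp((r - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
        if S <= B:
            knocked_out = True   # barrier breached: option dies worthless
            break
    payoffs.append(0.0 if knocked_out else max(S - K, 0.0))

price = math.exp(-r * T) * sum(payoffs) / paths
print(f"Down-and-out call estimate: {price:.2f}")
```

The knock-out feature makes the payoff path-dependent, which is why simulation (rather than a closed-form vanilla formula) is the natural pricing tool here.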
Burgess, Stephen; Scott, Robert A; Timpson, Nicholas J; Davey Smith, George; Thompson, Simon G
2015-07-01
Finding individual-level data for adequately powered Mendelian randomization analyses may be problematic. As publicly available summarized data on genetic associations with disease outcomes from large consortia become more abundant, use of published data is an attractive analysis strategy for obtaining precise estimates of the causal effects of risk factors on outcomes. We detail the necessary steps for conducting Mendelian randomization investigations using published data, and present novel statistical methods for combining data on the associations of multiple (correlated or uncorrelated) genetic variants with the risk factor and outcome into a single causal effect estimate. A two-sample analysis strategy may be employed, in which evidence on the gene-risk factor and gene-outcome associations is taken from different data sources. These approaches allow the efficient identification of risk factors that are suitable targets for clinical intervention from published data, although the ability to assess the assumptions necessary for causal inference is diminished. Methods and guidance are illustrated using the example of the causal effect of serum calcium levels on fasting glucose concentrations. The estimated causal effect of a 1 standard deviation (0.13 mmol/L) increase in calcium levels on fasting glucose (mM) using a single lead variant from the CASR gene region is 0.044 (95% credible interval -0.002, 0.100). In contrast, using our method to account for the correlation between variants, the corresponding estimate using 17 genetic variants is 0.022 (95% credible interval 0.009, 0.035), a more clearly positive causal effect.
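For uncorrelated variants, the standard inverse-variance weighted (IVW) combination of per-variant summary statistics can be sketched as follows. The summary statistics are invented for illustration; they are not the CASR-region data:

```python
# Inverse-variance weighted (IVW) combination of per-variant ratio
# estimates, assuming uncorrelated variants. Summary statistics below
# are invented for illustration.
bx  = [0.12, 0.08, 0.10]     # gene -> risk factor associations
by  = [0.006, 0.003, 0.005]  # gene -> outcome associations
sey = [0.002, 0.002, 0.003]  # standard errors of the by estimates

# beta_IVW = sum(bx*by/se^2) / sum(bx^2/se^2)
num = sum(b_x * b_y / s**2 for b_x, b_y, s in zip(bx, by, sey))
den = sum(b_x**2 / s**2 for b_x, s in zip(bx, sey))
beta_ivw = num / den
se_ivw = den ** -0.5

print(f"IVW causal estimate: {beta_ivw:.3f} (SE {se_ivw:.4f})")
```

For correlated variants, as in the 17-variant analysis above, the scalar weights are replaced by weights built from the inverse of the between-variant correlation matrix; the principle of pooling per-variant evidence is the same.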
Technical Report on Ozone Exposure, Risk, and Impact Assessments for Vegetation
The report presents analyses of national ozone air quality, vegetation exposures and risk, and impact to economic benefits that incorporates improved methods for estimating ozone at unmonitored locations.
Chrubasik, Sigrun A; Chrubasik, Cosima A; Piper, Jörg; Schulte-Moenting, Juergen; Erne, Paul
2015-01-01
In models and scores for estimating cardiovascular risk (CVR), the relative weightings given to blood pressure measurements (BPMs) and to biometric and laboratory variables are such that even large differences in blood pressure lead to rather small differences in the resulting total risk when compared with other concurrent risk factors. We evaluated this phenomenon based on the PROCAM score, using BPMs made by volunteer subjects at home (HBPMs) and automated ambulatory BPMs (ABPMs) carried out in the same subjects. A total of 153 volunteers provided the data needed to estimate their CVR by means of the PROCAM formula. Differences (ΔCVR) between the risk estimated by entering the ABPM and that estimated with the HBPM were compared with the differences (ΔBPM) between the ABPM and the corresponding HBPM. In addition to the median values (= second quartile), the first and third quartiles of blood pressure profiles were also considered. PROCAM risk values were converted to European Society of Cardiology (ESC) risk values and all participants were assigned to the low-, medium- or high-risk group. Based on the PROCAM score, 132 participants had a low risk of suffering myocardial infarction, 16 a medium risk and 5 a high risk. The calculated ESC scores classified 125 participants into the low-risk group, 26 into the medium-risk and 2 into the high-risk group for death from a cardiovascular event. Mean ABPM tended to be higher than mean HBPM. Use of mean systolic ABPM or HBPM in the PROCAM formula had no major impact on the risk level. Our observations are in agreement with the rather low weighting of blood pressure as a risk determinant in the PROCAM score. BPMs assessed with different methods had relatively little impact on the estimation of cardiovascular risk in the given context of other important determinants. The risk calculations in our unselected population reflect the given classification of Switzerland as a so-called cardiovascular "low-risk country".
Risk analysis for autonomous underwater vehicle operations in extreme environments.
Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter
2010-12-01
Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
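The Kaplan-Meier product-limit estimator over mission distance can be sketched with a few made-up fault records; each record gives the distance at which a mission ended and whether it ended in a loss-type event (the data below are invented, not the Autosub3 history):

```python
# Kaplan-Meier product-limit estimate of survival over mission distance.
# (distance_km, event): event = 1 for a loss-type fault, 0 for a mission
# that ended without one (censored). Records are invented for illustration.
records = [(12, 1), (30, 0), (45, 1), (45, 0), (80, 1), (120, 0), (150, 0)]

# Process events before censorings at tied distances.
records.sort(key=lambda r: (r[0], -r[1]))

survival, n_at_risk, curve = 1.0, len(records), []
for dist, event in records:
    if event:
        survival *= (n_at_risk - 1) / n_at_risk   # product-limit step
        curve.append((dist, survival))
    n_at_risk -= 1

for dist, s in curve:
    print(f"S({dist} km) = {s:.3f}")
```

Censored missions still contribute to the risk set up to the distance they reached, which is what makes the nonparametric estimator suitable for the mixed fault/incident histories described above.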
A New Method for Assessing How Sensitivity and Specificity of Linkage Studies Affects Estimation
Moore, Cecilia L.; Amin, Janaki; Gidding, Heather F.; Law, Matthew G.
2014-01-01
Background While the importance of record linkage is widely recognised, few studies have attempted to quantify how linkage errors may have impacted their own findings and outcomes. Even where authors of linkage studies have attempted to estimate sensitivity and specificity based on subjects with known status, the effects of false negatives and positives on event rates and estimates of effect are not often described. Methods We present a quantification of the effect of the sensitivity and specificity of the linkage process on event rates and incidence, as well as the resultant effect on relative risks. Formulae to estimate the true number of events and the estimated relative risk adjusted for a given linkage sensitivity and specificity are then derived and applied to data from a prisoner mortality study. The implications of false positive and false negative matches are also discussed. Discussion Comparisons of the effect of sensitivity and specificity on incidence and relative risks indicate that it is more important for linkage to be highly specific than highly sensitive, particularly if true incidence rates are low. We recommend that, where possible, quantitative estimates of the sensitivity and specificity of the linkage process be made, allowing the effect of these quantities on observed results to be assessed. PMID:25068293
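The kind of back-correction such formulae perform can be sketched with the standard misclassification identity; the cohort size, observed count, sensitivity and specificity below are illustrative, not the prisoner-study values:

```python
# Back-correcting an observed event count for imperfect linkage, using
# the misclassification identity
#   observed = sens * true + (1 - spec) * (N - true)
# solved for the true count. All numbers are invented for illustration.
N = 10_000          # cohort size
observed = 450      # events found by the linkage
sens, spec = 0.95, 0.998

true_events = (observed - (1 - spec) * N) / (sens + spec - 1)
print(f"Estimated true events: {true_events:.0f}")
```

Even a 0.2% false-positive rate contributes (1 - spec) * N = 20 spurious events here, while the sensitivity correction adds back the roughly 5% of true matches the linkage missed. With rarer outcomes the (1 - spec) * N term dominates, which illustrates the paper's point that high specificity matters more than high sensitivity when incidence is low.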
Ma, Wenxia; Yin, Xuejun; Zhang, Ruijuan; Liu, Furong; Yang, Danrong; Fan, Yameng; Rong, Jie; Tian, Maoyi; Yu, Yan
2017-01-01
Background: 24-h urine collection is regarded as the "gold standard" for monitoring sodium intake at the population level, but ensuring high-quality urine samples is difficult to achieve. The Kawasaki, International Study of Sodium, Potassium, and Blood Pressure (INTERSALT) and Tanaka methods have been used to estimate 24-h urinary sodium excretion from spot urine samples in some countries, but few studies have been performed to compare and validate these methods in the Chinese population. Objective: To compare and validate the Kawasaki, INTERSALT and Tanaka formulas in predicting 24-h urinary sodium excretion using spot urine samples in 365 elderly patients at high risk of stroke from the rural areas of Shaanxi province. Methods: Data were collected from a sub-sample of the Salt Substitute and Stroke Study. 365 elderly patients at high risk of stroke from the rural areas of Shaanxi province participated, and their spot and 24-h urine specimens were collected. The concentrations of sodium, potassium and creatinine in spot and 24-h urine samples were analysed. Estimated 24-h sodium excretion was predicted from spot urine concentration using the Kawasaki, INTERSALT, and Tanaka formulas. Pearson correlation coefficients and agreement by the Bland-Altman method were computed for estimated and measured 24-h urinary sodium excretion. Results: The average 24-h urinary sodium excretion was 162.0 mmol/day, representing a salt intake of 9.5 g/day. All three predictive equations had low correlation with the measured 24-h sodium excretion (r = 0.38, p < 0.01, ICC = 0.38, p < 0.01 for the Kawasaki; r = 0.35, p < 0.01, ICC = 0.31, p < 0.01 for the INTERSALT; r = 0.37, p < 0.01, ICC = 0.34, p < 0.01 for the Tanaka). Significant biases between estimated and measured 24-h sodium excretion were observed (all p < 0.01 for the three methods). The Kawasaki method was the least biased of the three (mean bias: 31.90, 95% CI: 23.84, 39.97). Overestimation occurred when the Kawasaki and Tanaka methods were used, while the INTERSALT method underestimated 24-h sodium excretion. Conclusion: The Kawasaki, INTERSALT and Tanaka methods for estimating 24-h urinary sodium excretion from spot urine specimens were inadequate for assessing sodium intake at the population level in elderly patients at high risk of stroke from the rural areas of Shaanxi province, although the Kawasaki method was the least biased of the three. PMID:29019912
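The Bland-Altman agreement summary used above reduces to a mean difference (bias) and its 95% limits of agreement; the paired values below are invented for illustration, not the Shaanxi data:

```python
import math

# Bland-Altman agreement between estimated and measured 24-h sodium
# excretion (mmol/day). Paired values are invented for illustration.
measured  = [150, 170, 140, 200, 160, 180, 155, 175]
estimated = [180, 195, 175, 215, 190, 205, 185, 200]

diffs = [e - m for e, m in zip(estimated, measured)]
n = len(diffs)
bias = sum(diffs) / n                                  # mean difference
sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
loa = (bias - 1.96 * sd, bias + 1.96 * sd)             # 95% limits of agreement

print(f"Mean bias: {bias:.1f} mmol/day, 95% LoA: {loa[0]:.1f} to {loa[1]:.1f}")
```

A consistently positive bias, as in this toy example, corresponds to systematic overestimation of the kind reported for the Kawasaki and Tanaka formulas.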
[Application of Competing Risks Model in Predicting Smoking Relapse Following Ischemic Stroke].
Hou, Li-Sha; Li, Ji-Jie; Du, Xu-Dong; Yan, Pei-Jing; Zhu, Cai-Rong
2017-07-01
To determine factors associated with smoking relapse in men who survived their first stroke, data were collected through face-to-face interviews with stroke patients in the hospital, and then repeated every three months via telephone over the period from 2010 to 2014. The Kaplan-Meier method and a competing risks model were adopted to estimate and predict smoking relapse rates. The Kaplan-Meier method estimated a higher relapse rate than the competing risks model. The four-year relapse rate was 43.1% after adjustment for competing risks. Exposure to environmental tobacco smoke outside of the home and workplace (such as bars and restaurants) (P = 0.01), being single (P < 0.01), and a prior history of smoking at least 20 cigarettes per day (P = 0.02) were significant predictors of smoking relapse. When competing risks exist, a competing risks model should be used in data analyses. Smoking interventions should give priority to those without a spouse and those with a heavy smoking history. Smoking bans in public settings can reduce smoking relapse in stroke patients.
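The gap between the two estimators reported above can be reproduced on toy data: treating deaths as censoring (Kaplan-Meier) redistributes their probability mass onto relapse, while the Aalen-Johansen cumulative incidence does not. The follow-up records below are invented for illustration:

```python
# Toy follow-up data: (month, status), status 1 = smoking relapse,
# 2 = death (competing risk), 0 = censored. Invented for illustration.
data = [(3, 1), (6, 2), (9, 1), (12, 0), (15, 2), (18, 1), (24, 0), (30, 0)]
data.sort(key=lambda r: (r[0], -r[1]))

km_surv = 1.0      # KM "survival" from relapse, deaths treated as censored
aj_surv = 1.0      # all-cause survival, used by the Aalen-Johansen estimator
cif_relapse = 0.0  # cumulative incidence of relapse
at_risk = len(data)

for t, status in data:
    if status == 1:                        # relapse
        km_surv *= (at_risk - 1) / at_risk
        cif_relapse += aj_surv / at_risk   # S(t-) * hazard increment
        aj_surv *= (at_risk - 1) / at_risk
    elif status == 2:                      # competing death
        aj_surv *= (at_risk - 1) / at_risk
    at_risk -= 1

km_relapse = 1.0 - km_surv
print(f"1 - KM: {km_relapse:.3f}  vs  cumulative incidence: {cif_relapse:.3f}")
```

The complement of Kaplan-Meier always sits at or above the cumulative incidence when competing events are present, mirroring the study's finding that KM overstated the relapse rate.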
Sign realized jump risk and the cross-section of stock returns: Evidence from China's stock market
Chao, Youcong; Liu, Xiaoqun; Guo, Shijun
2017-01-01
Using 5-minute high frequency data from the Chinese stock market, we employ a non-parametric method to estimate Fama-French portfolio realized jumps and investigate whether the estimated positive, negative and sign realized jumps can forecast or explain cross-sectional stock returns. The Fama-MacBeth regression results show that not only are the realized jump components and the continuous volatility compensated with a risk premium, but the negative jump risk, the positive jump risk and the sign jump risk can also, to some extent, explain the returns of the stock portfolios. Therefore, we should pay close attention to both the downside tail risk and the upside tail risk. PMID:28771514
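One common non-parametric decomposition of signed jump risk uses realized semivariances (upside minus downside) alongside bipower variation for jump detection; the returns below are simulated, and this is a generic sketch of these estimators rather than the authors' exact specification:

```python
import math, random

random.seed(7)

# Signed jump variation from intraday returns (semivariance
# decomposition sketch; simulated returns, not Chinese market data).
returns = [random.gauss(0, 0.001) for _ in range(48)]
returns[10] += 0.02     # inject one positive jump

rv = sum(r * r for r in returns)                   # realized variance
rs_pos = sum(r * r for r in returns if r > 0)      # upside semivariance
rs_neg = sum(r * r for r in returns if r < 0)      # downside semivariance
signed_jump = rs_pos - rs_neg

# Bipower variation: robust to jumps, estimates the continuous part,
# so max(RV - BV, 0) isolates the jump contribution.
bv = (math.pi / 2) * sum(abs(returns[i]) * abs(returns[i - 1])
                         for i in range(1, len(returns)))
jump_component = max(rv - bv, 0.0)

print(f"RV={rv:.6f}  BV={bv:.6f}  signed jump={signed_jump:.6f}")
```

The injected positive jump inflates RV and the upside semivariance but barely moves BV, so both the jump component and the signed jump variation come out positive.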
Optimizing hidden layer node number of BP network to estimate fetal weight
NASA Astrophysics Data System (ADS)
Su, Juan; Zou, Yuanwen; Lin, Jiangli; Wang, Tianfu; Li, Deyu; Xie, Tao
2007-12-01
The ultrasonic estimation of fetal weight before delivery is of great significance for the obstetric clinic. Estimating fetal weight more accurately is crucial for prenatal care, obstetrical treatment, choosing appropriate delivery methods, monitoring fetal growth and reducing the risk of newborn complications. In this paper, we introduce a method which combines golden section search and an artificial neural network (ANN) to estimate fetal weight. The golden section search is employed to optimize the hidden-layer node number of the back propagation (BP) neural network. The method greatly improves the accuracy of fetal weight estimation, and simultaneously avoids choosing the hidden-layer node number by subjective experience. The estimation coincidence rate reaches 74.19%, and the mean absolute error is 185.83 g.
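Golden-section search over the hidden-layer size can be sketched against a mock validation-error curve; in practice each evaluation would retrain the BP network, and the curve and candidate range below are invented for illustration:

```python
def val_error(n_hidden):
    # Mock unimodal validation-error curve with a minimum at 12 nodes;
    # in practice each call would train and validate a BP network.
    return (n_hidden - 12) ** 2 / 100 + 0.5

phi = (5 ** 0.5 - 1) / 2          # golden ratio conjugate, ~0.618
lo, hi = 2, 30                    # candidate range of hidden-node counts
while hi - lo > 3:
    d = phi * (hi - lo)
    x1, x2 = round(hi - d), round(lo + d)
    if x1 == x2:                  # rounding collision on a small interval
        x1 -= 1
    if val_error(x1) < val_error(x2):
        hi = x2                   # minimum lies in [lo, x2]
    else:
        lo = x1                   # minimum lies in [x1, hi]

# Finish with a linear scan over the few remaining candidates.
best = min(range(lo, hi + 1), key=val_error)
print(f"Chosen hidden-layer size: {best}")
```

Compared with retraining at every candidate size, the bracketing strategy needs only a logarithmic number of (expensive) training runs, which is the appeal of combining it with an ANN.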
MeProRisk - a Joint Venture for Minimizing Risk in Geothermal Reservoir Development
NASA Astrophysics Data System (ADS)
Clauser, C.; Marquart, G.
2009-12-01
Exploration and development of geothermal reservoirs for the generation of electric energy involves high engineering and economic risks due to the need for 3-D geophysical surface surveys and deep boreholes. The MeProRisk project provides a strategy guideline for reducing these risks by combining cross-disciplinary information from different specialists: scientists from three German universities and two private companies contribute new methods in seismic modeling and interpretation, numerical reservoir simulation, estimation of petrophysical parameters, and 3-D visualization. The approach chosen in MeProRisk treats the prospecting and development of geothermal reservoirs as an iterative process. A first conceptual model for fluid flow and heat transport simulation can be developed based on the limited information on geology and rock properties initially available. In the next step, additional data are incorporated based on (a) new seismic interpretation methods designed for delineating fracture systems, (b) statistical studies on large numbers of rock samples for estimating reliable rock parameters, and (c) in situ estimates of the hydraulic conductivity tensor. This results in a continuous refinement of the reservoir model, where inverse modeling of fluid flow and heat transport allows inferring the uncertainty and resolution of the model at each iteration step. This finally yields a calibrated reservoir model which may be used to direct further exploration by optimizing additional borehole locations, to estimate the uncertainty of key operational and economic parameters, and to optimize the long-term operation of a geothermal reservoir.
Radiation Dose and Cancer Risk Estimates in 16-Slice Computed Tomography Coronary Angiography
Einstein, Andrew J.; Sanz, Javier; Dellegrottaglie, Santo; Milite, Margherita; Sirol, Marc; Henzlova, Milena; Rajagopalan, Sanjay
2008-01-01
Background Recent advances have led to a rapid increase in the number of computed tomography coronary angiography (CTCA) studies performed. While several studies have reported effective dose (E), no data are available on cancer risk for current CTCA protocols. Methods and Results E and organ doses were estimated, using scanner-derived parameters and Monte Carlo methods, for 50 patients having 16-slice CTCA performed for clinical indications. Lifetime attributable risks (LARs) were estimated with models developed in the National Academies’ Biological Effects of Ionizing Radiation VII report. E of a complete CTCA averaged 9.5 mSv, while that of a complete study, including calcium scoring when indicated, averaged 11.7 mSv. Calcium scoring increased E by 25%, while tube current modulation reduced it by 34% and was more effective at lower heart rates. Organ doses were highest to the lungs and female breast. LAR of cancer incidence from CTCA averaged approximately 1 in 1600, but varied widely between patients, being highest in younger women. For all patients, the greatest risk was from lung cancer. Conclusions CTCA is associated with a non-negligible risk of malignancy. Doses can be reduced by careful attention to scanning protocol. PMID:18371595
Industrial machine systems risk assessment: a critical review of concepts and methods.
Etherton, John R
2007-02-01
Reducing the risk of work-related death and injury to machine operators and maintenance personnel poses a continuing occupational safety challenge. The risk of injury from machinery in U.S. workplaces is high. Between 1992 and 2001, there were, on average, 520 fatalities per year involving machines and, on average, 3.8 cases per 10,000 workers of nonfatal caught-in-running-machine injuries involving lost workdays. A U.S. task group recently developed a technical reference guideline, ANSI B11 TR3, "A Guide to Estimate, Evaluate, & Reduce Risks Associated with Machine Tools," that is intended to bring machine tool risk assessment practice in the United States up to or above the level now required by the international standard, ISO 14121. The ANSI guideline emphasizes identifying tasks and hazards not previously considered, particularly those associated with maintenance; and it further emphasizes teamwork among line workers, engineers, and safety professionals. The value of this critical review of concepts and methods resides in (1) its linking current risk theory to machine system risk assessment and (2) its exploration of how various risk estimation tools translate into risk-informed decisions on industrial machine system design and use. The review was undertaken to set the stage for a field evaluation study on machine risk assessment among users of the ANSI B11 TR3 method.
Cancer-related risk indicators and preventive screening behaviors among lesbians and bisexual women.
Cochran, S D; Mays, V M; Bowen, D; Gage, S; Bybee, D; Roberts, S J; Goldstein, R S; Robison, A; Rankow, E J; White, J
2001-01-01
OBJECTIVES: This study examined whether lesbians are at increased risk for certain cancers as a result of an accumulation of behavioral risk factors and difficulties in accessing health care. METHODS: Prevalence estimates of behavioral risk factors (nulliparity, obesity, smoking, and alcohol use), cancer screening behaviors, and self-reported breast cancer histories derived from 7 independently conducted surveys of lesbians/bisexual women (n = 11,876) were compared with national estimates for women. RESULTS: In comparison with adjusted estimates for the US female population, lesbians/bisexual women exhibited greater prevalence rates of obesity, alcohol use, and tobacco use and lower rates of parity and birth control pill use. These women were also less likely to have health insurance coverage or to have had a recent pelvic examination or mammogram. Self-reported histories of breast cancer, however, did not differ from adjusted US female population estimates. CONCLUSIONS: Lesbians and bisexual women differ from heterosexual women in patterns of health risk. These women would be expected to be at especially greater risk for chronic diseases linked to smoking and obesity. PMID:11291371
SEMIPARAMETRIC ADDITIVE RISKS REGRESSION FOR TWO-STAGE DESIGN SURVIVAL STUDIES
Li, Gang; Wu, Tong Tong
2011-01-01
In this article we study a semiparametric additive risks model (McKeague and Sasieni (1994)) for two-stage design survival data where accurate information is available only on second stage subjects, a subset of the first stage study. We derive two-stage estimators by combining data from both stages. Large sample inferences are developed. As a by-product, we also obtain asymptotic properties of the single stage estimators of McKeague and Sasieni (1994) when the semiparametric additive risks model is misspecified. The proposed two-stage estimators are shown to be asymptotically more efficient than the second stage estimators. They also demonstrate smaller bias and variance for finite samples. The developed methods are illustrated using small intestine cancer data from the SEER (Surveillance, Epidemiology, and End Results) Program. PMID:21931467
Characterizing vaccine-associated risks using cubic smoothing splines.
Brookhart, M Alan; Walker, Alexander M; Lu, Yun; Polakowski, Laura; Li, Jie; Paeglow, Corrie; Puenpatom, Tosmai; Izurieta, Hector; Daniel, Gregory W
2012-11-15
Estimating risks associated with the use of childhood vaccines is challenging. The authors propose a new approach for studying short-term vaccine-related risks. The method uses a cubic smoothing spline to flexibly estimate the daily risk of an event after vaccination. The predicted incidence rates from the spline regression are then compared with the expected rates under a log-linear trend that excludes the days surrounding vaccination. The 2 models are then used to estimate the excess cumulative incidence attributable to the vaccination during the 42-day period after vaccination. Confidence intervals are obtained using a model-based bootstrap procedure. The method is applied to a study of known effects (positive controls) and expected noneffects (negative controls) of the measles, mumps, and rubella and measles, mumps, rubella, and varicella vaccines among children who are 1 year of age. The splines revealed well-resolved spikes in fever, rash, and adenopathy diagnoses, with the maximum incidence occurring between 9 and 11 days after vaccination. For the negative control outcomes, the spline model yielded a predicted incidence more consistent with the modeled day-specific risks, although there was evidence of increased risk of diagnoses of congenital malformations after vaccination, possibly because of a "provider visit effect." The proposed approach may be useful for vaccine safety surveillance.
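The core comparison in this approach (observed events in the post-vaccination window versus a log-linear baseline fitted to surrounding days) can be sketched as follows. This is a minimal illustration, not the authors' implementation: raw daily counts stand in for the cubic-spline fit, and all function names and data are invented for the example.

```python
import math

def loglinear_baseline(days, counts, exclude):
    """Least-squares fit of log(count) = a + b*day on days outside `exclude`."""
    pts = [(d, math.log(c)) for d, c in zip(days, counts)
           if d not in exclude and c > 0]
    n = len(pts)
    sx = sum(d for d, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(d * d for d, _ in pts)
    sxy = sum(d * y for d, y in pts)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return lambda d: math.exp(a + b * d)

def excess_cumulative_incidence(days, counts, window):
    """Observed minus baseline-expected events, summed over the risk window."""
    win = set(window)
    base = loglinear_baseline(days, counts, win)
    return sum(c - base(d) for d, c in zip(days, counts) if d in win)
```

For example, daily counts that are flat at 10 except for a spike of +20 on days 9-11 yield an excess cumulative incidence of about 60 events over a 42-day window.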
Prüss-Ustün, Annette; Bartram, Jamie; Clasen, Thomas; Colford, John M; Cumming, Oliver; Curtis, Valerie; Bonjour, Sophie; Dangour, Alan D; De France, Jennifer; Fewtrell, Lorna; Freeman, Matthew C; Gordon, Bruce; Hunter, Paul R; Johnston, Richard B; Mathers, Colin; Mäusezahl, Daniel; Medlicott, Kate; Neira, Maria; Stocks, Meredith; Wolf, Jennyfer; Cairncross, Sandy
2014-01-01
Objective To estimate the burden of diarrhoeal diseases from exposure to inadequate water, sanitation and hand hygiene in low- and middle-income settings and provide an overview of the impact on other diseases. Methods For estimating the impact of water, sanitation and hygiene on diarrhoea, we selected exposure levels with both sufficient global exposure data and a matching exposure-risk relationship. Global exposure data were estimated for the year 2012, and risk estimates were taken from the most recent systematic analyses. We estimated attributable deaths and disability-adjusted life years (DALYs) by country, age and sex for inadequate water, sanitation and hand hygiene separately, and as a cluster of risk factors. Uncertainty estimates were computed on the basis of uncertainty surrounding exposure estimates and relative risks. Results In 2012, 502 000 diarrhoea deaths were estimated to be caused by inadequate drinking water and 280 000 deaths by inadequate sanitation. The most likely estimate of disease burden from inadequate hand hygiene amounts to 297 000 deaths. In total, 842 000 diarrhoea deaths are estimated to be caused by this cluster of risk factors, which amounts to 1.5% of the total disease burden and 58% of diarrhoeal diseases. In children under 5 years old, 361 000 deaths could be prevented, representing 5.5% of deaths in that age group. Conclusions This estimate confirms the importance of improving water and sanitation in low- and middle-income settings for the prevention of diarrhoeal disease burden. It also underscores the need for better data on exposure and risk reductions that can be achieved with provision of reliable piped water, community sewage with treatment and hand hygiene. PMID:24779548
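The attributable-burden arithmetic behind estimates like these follows the standard population attributable fraction. A minimal sketch for a single exposure level (the study itself combines multiple exposures, countries, and uncertainty intervals; numbers below are illustrative only):

```python
def population_attributable_fraction(prevalence, relative_risk):
    """PAF = p(RR - 1) / (p(RR - 1) + 1) for a single dichotomous exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

def attributable_deaths(total_deaths, prevalence, relative_risk):
    """Deaths attributable to the exposure, given total cause-specific deaths."""
    return total_deaths * population_attributable_fraction(prevalence, relative_risk)
```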
Comparing biomarkers as principal surrogate endpoints.
Huang, Ying; Gilbert, Peter B
2011-12-01
Recently a new definition of surrogate endpoint, the "principal surrogate," was proposed based on causal associations between treatment effects on the biomarker and on the clinical endpoint. Despite its appealing interpretation, limited research has been conducted to evaluate principal surrogates, and existing methods focus on risk models that consider a single biomarker. How to compare principal surrogate value of biomarkers or general risk models that consider multiple biomarkers remains an open research question. We propose to characterize a marker or risk model's principal surrogate value based on the distribution of risk difference between interventions. In addition, we propose a novel summary measure (the standardized total gain) that can be used to compare markers and to assess the incremental value of a new marker. We develop a semiparametric estimated-likelihood method to estimate the joint surrogate value of multiple biomarkers. This method accommodates two-phase sampling of biomarkers and is more widely applicable than existing nonparametric methods by incorporating continuous baseline covariates to predict the biomarker(s), and is more robust than existing parametric methods by leaving the error distribution of markers unspecified. The methodology is illustrated using a simulated example set and a real data set in the context of HIV vaccine trials. © 2011, The International Biometric Society.
Tonda, Tetsuji; Satoh, Kenichi; Otani, Keiko; Sato, Yuya; Maruyama, Hirofumi; Kawakami, Hideshi; Tashiro, Satoshi; Hoshi, Masaharu; Ohtaki, Megu
2012-05-01
While there is a considerable number of studies on the relationship between the risk of disease or death and direct exposure from the atomic bomb in Hiroshima, the risk for indirect exposure caused by residual radioactivity has not yet been fully evaluated. One of the reasons is that risk assessments have utilized estimated radiation doses, but that it is difficult to estimate indirect exposure. To evaluate risks for other causes, including indirect radiation exposure, as well as direct exposure, a statistical method is described here that evaluates risk with respect to individual location at the time of atomic bomb exposure instead of radiation dose. In addition, it is also considered to split the risks into separate risks due to direct exposure and other causes using radiation dose. The proposed method is applied to a cohort study of Hiroshima atomic bomb survivors. The resultant contour map suggests that the region west to the hypocenter has a higher risk compared to other areas. This in turn suggests that there exists an impact on risk that cannot be explained by direct exposure.
Cheng, Xiaoya; Shaw, Stephen B; Marjerison, Rebecca D; Yearick, Christopher D; DeGloria, Stephen D; Walter, M Todd
2014-05-01
Predicting runoff producing areas and their corresponding risks of generating storm runoff is important for developing watershed management strategies to mitigate non-point source pollution. However, few methods for making these predictions have been proposed, especially operational approaches that would be useful in areas where variable source area (VSA) hydrology dominates storm runoff. The objective of this study is to develop a simple approach to estimate spatially-distributed risks of runoff production. By considering the development of overland flow as a bivariate process, we incorporated both rainfall and antecedent soil moisture conditions into a method for predicting VSAs based on the Natural Resource Conservation Service-Curve Number equation. We used base-flow immediately preceding storm events as an index of antecedent soil wetness status. Using nine sub-basins of the Upper Susquehanna River Basin, we demonstrated that our estimated runoff volumes and extent of VSAs agreed with observations. We further demonstrated a method for mapping these areas in a Geographic Information System using a Soil Topographic Index. The proposed methodology provides a new tool for watershed planners for quantifying runoff risks across watersheds, which can be used to target water quality protection strategies. Copyright © 2014 Elsevier Ltd. All rights reserved.
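The Natural Resource Conservation Service Curve Number equation at the heart of this approach is straightforward to compute. A minimal sketch of the base equation only; the study's key addition, conditioning storage on antecedent wetness via pre-storm baseflow, is not shown:

```python
def runoff_depth_mm(p_mm, cn):
    """NRCS curve-number runoff (mm) for storm rainfall p_mm and curve number cn.
    S is the maximum potential retention; Ia = 0.2*S is the initial abstraction."""
    s = 25400.0 / cn - 254.0   # metric form of S
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0             # all rainfall absorbed before runoff begins
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

A higher curve number (less pervious surface or wetter antecedent conditions) yields more runoff for the same storm.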
Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil
2014-08-01
We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922
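The "combined" cross-validation idea described above — pool held-out predictions across the folds, then score the pooled set once — can be sketched generically. This is an illustration of the pooling design only, not the authors' peeling estimator; the mean-model `fit` used in the example below is a stand-in.

```python
import random

def combined_cv_predictions(data, fit, predict, k=5, seed=0):
    """Pool out-of-fold predictions across k folds ("combined" CV), so the
    pooled vector can be scored once instead of averaging per-fold scores."""
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    pooled = [None] * len(data)
    for test_idx in folds:
        held_out = set(test_idx)
        train = [data[i] for i in idx if i not in held_out]
        model = fit(train)
        for i in test_idx:
            pooled[i] = predict(model, data[i])
    return pooled
```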
Vos, Janet R; Hsu, Li; Brohet, Richard M; Mourits, Marian J E; de Vries, Jakob; Malone, Kathleen E; Oosterwijk, Jan C; de Bock, Geertruida H
2015-08-10
Recommendations for treating patients who carry a BRCA1/2 mutation are mainly based on cumulative lifetime risks (CLTRs) of breast cancer determined from retrospective cohorts. These risks vary widely (27% to 88%), and it is important to understand why. We analyzed the effects of methods of risk estimation and bias correction and of population factors on CLTRs in this retrospective clinical cohort of BRCA1/2 carriers. The following methods to estimate the breast cancer risk of BRCA1/2 carriers were identified from the literature: Kaplan-Meier, frailty, and modified segregation analyses with bias correction consisting of including or excluding index patients combined with including or excluding first-degree relatives (FDRs) or different conditional likelihoods. These were applied to clinical data of BRCA1/2 families derived from our family cancer clinic for whom a simulation was also performed to evaluate the methods. CLTRs and 95% CIs were estimated and compared with the reference CLTRs. CLTRs ranged from 35% to 83% for BRCA1 and 41% to 86% for BRCA2 carriers at age 70 years (width of 95% CIs: 10% to 35% and 13% to 46%, respectively). Relative bias varied from -38% to +16%. Bias correction with inclusion of index patients and untested FDRs gave the smallest bias: +2% (SD, 2%) in BRCA1 and +0.9% (SD, 3.6%) in BRCA2. Much of the variation in breast cancer CLTRs in retrospective clinical BRCA1/2 cohorts is due to the bias-correction method, whereas a smaller part is due to population differences. Kaplan-Meier analyses with bias correction that includes index patients and a proportion of untested FDRs provide suitable CLTRs for carriers counseled in the clinic. © 2015 by American Society of Clinical Oncology.
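The plain Kaplan-Meier cumulative-risk estimate that these analyses build on can be sketched as follows. This shows only the uncorrected estimator; the index-patient/untested-FDR bias correction the authors recommend is not included, and the data below are invented for illustration.

```python
def kaplan_meier_cumulative_risk(times, events, horizon):
    """1 - S(horizon) from right-censored data.
    times: observed ages at event or censoring; events: 1 = cancer, 0 = censored."""
    s = 1.0
    for t in sorted(set(t for t, e in zip(times, events) if e == 1)):
        if t > horizon:
            break
        at_risk = sum(1 for ti in times if ti >= t)
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        s *= 1.0 - d / at_risk
    return 1.0 - s
```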
Assessment of risk due to the use of carbon fiber composites in commercial and general aviation
NASA Technical Reports Server (NTRS)
Fiksel, J.; Rosenfield, D.; Kalelkar, A.
1980-01-01
The development of a national risk profile for the total annual aircraft losses due to carbon fiber composite (CFC) usage through 1993 is discussed. The profile was developed using separate simulation methods for commercial and general aviation aircraft. A Monte Carlo method which was used to assess the risk in commercial aircraft is described. The method projects the potential usage of CFC through 1993, investigates the incidence of commercial aircraft fires, models the potential release and dispersion of carbon fibers from a fire, and estimates potential economic losses due to CFC damaging electronic equipment. The simulation model for the general aviation aircraft is described. The model emphasizes variations in facility locations and release conditions, estimates distribution of CFC released in general aviation aircraft accidents, and tabulates the failure probabilities and aggregate economic losses in the accidents.
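The commercial-aviation simulation described above chains incident probabilities to economic losses. A toy Monte Carlo version of that chain (all rates, the flight count, and the loss distribution are invented for illustration; the report's fire, release, and dispersion models are far more detailed):

```python
import random

def expected_annual_loss(n_trials, n_flights, fire_prob, release_prob,
                         loss_draw, seed=1):
    """Crude Monte Carlo: each flight may have a fire; each fire may release
    carbon fibers and incur an economic loss drawn from `loss_draw`."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        for _ in range(n_flights):
            if rng.random() < fire_prob and rng.random() < release_prob:
                total += loss_draw(rng)
    return total / n_trials   # mean simulated annual loss
```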
Kontos, Despina; Bakic, Predrag R.; Carton, Ann-Katherine; Troxel, Andrea B.; Conant, Emily F.; Maidment, Andrew D.A.
2009-01-01
Rationale and Objectives Studies have demonstrated a relationship between mammographic parenchymal texture and breast cancer risk. Although promising, texture analysis in mammograms is limited by tissue superimposition. Digital breast tomosynthesis (DBT) is a novel tomographic x-ray breast imaging modality that alleviates the effect of tissue superimposition, offering superior parenchymal texture visualization compared to mammography. Our study investigates the potential advantages of DBT parenchymal texture analysis for breast cancer risk estimation. Materials and Methods DBT and digital mammography (DM) images of 39 women were analyzed. Texture features, shown in studies with mammograms to correlate with cancer risk, were computed from the retroareolar breast region. We compared the relative performance of DBT and DM texture features in correlating with two measures of breast cancer risk: (i) the Gail and Claus risk estimates, and (ii) mammographic breast density. Linear regression was performed to model the association between texture features and increasing levels of risk. Results No significant correlation was detected between parenchymal texture and the Gail and Claus risk estimates. Significant correlations were observed between texture features and breast density. Overall, the DBT texture features demonstrated stronger correlations with breast percent density (PD) than DM (p ≤0.05). When dividing our study population in groups of increasing breast PD, the DBT texture features appeared to be more discriminative, having regression lines with overall lower p-values, steeper slopes, and higher R2 estimates. Conclusion Although preliminary, our results suggest that DBT parenchymal texture analysis could provide more accurate characterization of breast density patterns, which could ultimately improve breast cancer risk estimation. PMID:19201357
Leidner, Andrew J.
2014-01-01
This paper provides a demonstration of propensity-score matching estimation methods to evaluate the effectiveness of health-risk communication efforts. This study develops a two-stage regression model to investigate household and respondent characteristics as they contribute to aversion behavior to reduce exposure to arsenic-contaminated groundwater. The aversion activity under study is a household-level point-of-use filtration device. Since the acquisition of arsenic contamination information and the engagement in an aversion activity may be codetermined, a two-stage propensity-score model is developed. In the first stage, the propensity for households to acquire arsenic contamination information is estimated. Then, the propensity scores are used to weight observations in a probit regression on the decision to avert the arsenic-related health risk. Of four potential sources of information, utility, media, friend, or others, information received from a friend appears to be the source of information most associated with aversion behavior. Other statistically significant covariates in the household's decision to avert contamination include reported household income, the presence of children in household, and region-level indicator variables. These findings are primarily illustrative and demonstrate the usefulness of propensity-score methods to estimate health-risk communication effectiveness. They may also be suggestive of areas for future research. PMID:25349622
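In its simplest form, the propensity-score weighting step described above reduces to an inverse-propensity-weighted comparison of aversion rates between informed and uninformed households. This sketch substitutes a weighted risk difference for the paper's weighted probit regression, and all variable names and data are illustrative:

```python
def weighted_risk_difference(averted, informed, propensity):
    """IPW estimate of the effect of receiving contamination information.
    averted, informed: 0/1 lists; propensity: estimated P(informed | covariates)."""
    num1 = sum(a * i / p for a, i, p in zip(averted, informed, propensity))
    den1 = sum(i / p for i, p in zip(informed, propensity))
    num0 = sum(a * (1 - i) / (1 - p)
               for a, i, p in zip(averted, informed, propensity))
    den0 = sum((1 - i) / (1 - p) for i, p in zip(informed, propensity))
    return num1 / den1 - num0 / den0
```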
Reich, Christian G; Ryan, Patrick B; Schuemie, Martijn J
2013-10-01
A systematic risk identification system has the potential to test marketed drugs for important Health Outcomes of Interest or HOI. For each HOI, multiple definitions are used in the literature, and some of them are validated for certain databases. However, little is known about the effect of different definitions on the ability of methods to estimate their association with medical products. Alternative definitions of HOI were studied for their effect on the performance of analytical methods in observational outcome studies. A set of alternative definitions for three HOI were defined based on literature review and clinical diagnosis guidelines: acute kidney injury, acute liver injury and acute myocardial infarction. The definitions varied by the choice of diagnostic codes and the inclusion of procedure codes and lab values. They were then used to empirically study an array of analytical methods with various analytical choices in four observational healthcare databases. The methods were executed against predefined drug-HOI pairs to generate an effect estimate and standard error for each pair. These test cases included positive controls (active ingredients with evidence to suspect a positive association with the outcome) and negative controls (active ingredients with no evidence to expect an effect on the outcome). Three different performance metrics were used: (i) Area Under the Receiver Operating Characteristic (ROC) curve (AUC) as a measure of a method's ability to distinguish between positive and negative test cases, (ii) measure of bias by estimation of the distribution of observed effect estimates for the negative test pairs, where the true effect can be assumed to be one (no relative risk), and (iii) Minimal Detectable Relative Risk (MDRR) as a measure of whether there is sufficient power to generate effect estimates.
In the three outcomes studied, different definitions of outcomes show comparable ability to differentiate true from false control cases (AUC) and a similar bias estimation. However, broader definitions generating larger outcome cohorts allowed more drugs to be studied with sufficient statistical power. Broader definitions are preferred since they allow studying drugs with lower prevalence than the more precise or narrow definitions while showing comparable performance characteristics in differentiation of signal vs. no signal as well as effect size estimation.
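The AUC metric used above — the probability that a method scores a positive control higher than a negative control — can be computed directly from the two sets of effect estimates with the rank-based (Mann-Whitney) formula. A minimal sketch with illustrative scores:

```python
def auc(positive_scores, negative_scores):
    """Rank-based AUC: P(positive control outscores negative control),
    counting ties as half."""
    wins = ties = 0
    for p in positive_scores:
        for n in negative_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(positive_scores) * len(negative_scores))
```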
A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.
Faya, Paul; Stamey, James D; Seaman, John W
2017-01-01
For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known DT, z, and F0 values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance.
An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
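The classical point estimates that the Bayesian approach above replaces with distributions are easy to state: the D-value is the negative reciprocal slope of the log10 survivor curve, and F0 integrates the lethal rate 10^((T - 121.1)/z) over the cycle. A minimal sketch (illustrative data; the paper's fraction-negative method is not shown):

```python
import math

def d_value(times, log10_survivors):
    """D-value from the survivor curve: slope of log10 N(t) vs t is -1/D."""
    n = len(times)
    sx = sum(times)
    sy = sum(log10_survivors)
    sxx = sum(t * t for t in times)
    sxy = sum(t * y for t, y in zip(times, log10_survivors))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return -1.0 / slope

def f0(temps_c, dt_min, z=10.0, t_ref=121.1):
    """F0: equivalent minutes at 121.1 C, summing the lethal rate over the
    temperature profile sampled every dt_min minutes."""
    return sum(10.0 ** ((t - t_ref) / z) * dt_min for t in temps_c)
```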
Wheeler, Matthew W; Bailer, A John
2007-06-01
Model averaging (MA) has been proposed as a method of accounting for model uncertainty in benchmark dose (BMD) estimation. The technique has been used to average BMD dose estimates derived from dichotomous dose-response experiments, microbial dose-response experiments, as well as observational epidemiological studies. While MA is a promising tool for the risk assessor, a previous study suggested that the simple strategy of averaging individual models' BMD lower limits did not yield interval estimators that met nominal coverage levels in certain situations, and this performance was very sensitive to the underlying model space chosen. We present a different, more computationally intensive, approach in which the BMD is estimated using the average dose-response model and the corresponding benchmark dose lower bound (BMDL) is computed by bootstrapping. This method is illustrated with TiO(2) dose-response rat lung cancer data, and then systematically studied through an extensive Monte Carlo simulation. The results of this study suggest that the MA-BMD, estimated using this technique, performs better, in terms of bias and coverage, than the previous MA methodology. Further, the MA-BMDL achieves nominal coverage in most cases, and is superior to picking the "best fitting model" when estimating the benchmark dose. Although these results show utility of MA for benchmark dose risk estimation, they continue to highlight the importance of choosing an adequate model space as well as proper model fit diagnostics.
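The weighting step underlying model averaging is typically done with Akaike weights. A minimal sketch of that arithmetic only: the paper's MA-BMD is obtained from the *averaged dose-response model* with a bootstrapped BMDL, whereas the naive weighted average of per-model BMDs shown here is just a placeholder for illustration.

```python
import math

def akaike_weights(aics):
    """Akaike weights: w_i proportional to exp(-delta_AIC_i / 2)."""
    best = min(aics)
    raw = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

def naive_averaged_bmd(bmds, aics):
    """Placeholder: AIC-weighted average of per-model BMD estimates."""
    return sum(w * b for w, b in zip(akaike_weights(aics), bmds))
```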
Belief In Numbers: When and why women disbelieve tailored breast cancer risk statistics
Scherer, Laura D.; Ubel, Peter A.; McClure, Jennifer; Green, Sarah M.; Alford, Sharon Hensley; Holtzman, Lisa; Exe, Nicole; Fagerlin, Angela
2013-01-01
Objective To examine when and why women disbelieve tailored information about their risk of developing breast cancer. Methods 690 women participated in an online program to learn about medications that can reduce the risk of breast cancer. The program presented tailored information about each woman’s personal breast cancer risk. Half of women were told how their risk numbers were calculated, whereas the rest were not. Later, they were asked whether they believed that the program was personalized, and whether they believed their risk numbers. If a woman did not believe her risk numbers, she was asked to explain why. Results Beliefs that the program was personalized were enhanced by explaining the risk calculation methods in more detail. Nonetheless, nearly 20% of women did not believe their personalized risk numbers. The most common reason for rejecting the risk estimate was a belief that it did not fully account for personal and family history. Conclusions The benefits of tailored risk statistics may be attenuated by a tendency for people to be skeptical that these risk estimates apply to them personally. Practice Implications Decision aids may provide risk information that is not accepted by patients, but addressing the patients’ personal circumstances may lead to greater acceptance. PMID:23623330
Nguyen, Trang Quynh; Webb-Vargas, Yenny; Koning, Ina M; Stuart, Elizabeth A
We investigate a method to estimate the combined effect of multiple continuous/ordinal mediators on a binary outcome: 1) fit a structural equation model with probit link for the outcome and identity/probit link for continuous/ordinal mediators, 2) predict potential outcome probabilities, and 3) compute natural direct and indirect effects. Step 2 involves rescaling the latent continuous variable underlying the outcome to address residual mediator variance/covariance. We evaluate the estimation of risk-difference- and risk-ratio-based effects (RDs, RRs) using the ML, WLSMV and Bayes estimators in Mplus. Across most variations in path-coefficient and mediator-residual-correlation signs and strengths, and confounding situations investigated, the method performs well with all estimators, but favors ML/WLSMV for RDs with continuous mediators, and Bayes for RRs with ordinal mediators. Bayes outperforms WLSMV/ML regardless of mediator type when estimating RRs with small potential outcome probabilities and in two other special cases. An adolescent alcohol prevention study is used for illustration.
ERIC Educational Resources Information Center
Fleary, Sasha A.
2017-01-01
Background: Several studies have used latent class analyses to explore obesogenic behaviors and substance use in adolescents independently. We explored a variety of health risks jointly to identify distinct patterns of risk behaviors among adolescents. Methods: Latent class models were estimated using Youth Risk Behavior Surveillance System…
Jeon, Jihyoun; Hsu, Li; Gorfine, Malka
2012-07-01
Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low, however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.
Environmental health risk assessments of chemical mixtures that rely on component approaches often begin by grouping the chemicals of concern according to toxicological similarity. Approaches that assume dose addition typically are used for groups of similarly-acting chemicals an...
Review of methods for developing probabilistic risk assessments
D. A. Weinstein; P.B. Woodbury
2010-01-01
We describe methodologies currently in use or those under development containing features for estimating fire occurrence risk assessment. We describe two major categories of fire risk assessment tools: those that predict fire under current conditions, assuming that vegetation, climate, and the interactions between them and fire remain relatively similar to their...
Balzer, K; Kremer, L; Junghans, A; Halfens, R J G; Dassen, T; Kottner, J
2014-05-01
Nurses' clinical judgement plays a vital role in pressure ulcer risk assessment, but evidence is lacking on which patient characteristics are important for nurses' perception of patients' risk exposure. To explore which patient characteristics nurses employ when assessing pressure ulcer risk without use of a risk assessment scale. Mixed methods design triangulating observational data from the control group of a quasi-experimental trial and data from semi-structured interviews with nurses. Two traumatological wards at a university hospital. Quantitative data: A consecutive sample of 106 patients matching the eligibility criteria (age ≥ 18 years, no pressure ulcers category ≥ 2 at admission and ≥ 5 days expected length of stay). Qualitative data: A purposive sample of 16 nurses. Quantitative data: Predictor variables for pressure ulcer risk were measured by study assistants at the bedside each second day. Concurrently, nurses documented their clinical judgement on patients' pressure ulcer risk by means of a 4-step global judgement scale. Bivariate correlations between predictor variables and nurses' risk estimates were established. Qualitative data: In interviews, nurses were asked to assess fictitious patients' pressure ulcer risk and to justify their risk estimates. Patient characteristics perceived as relevant for nurses' judgements were thematically clustered. Triangulation: Firstly, predictors of nurses' risk estimates identified in bivariate analysis were cross-mapped with interview findings. Secondly, three models to predict nurses' risk estimates underwent multiple linear regression analysis. Nurses consider multiple patient characteristics for pressure ulcer risk assessment, but regard some conditions as more important than others. Triangulation showed that these are measures reflecting patients' exposure to pressure or overall care dependency.
Qualitative data furthermore indicate that nurses are likely to trade off risk-enhancing conditions against conditions perceived to be protective. Here, patients' mental capabilities, like willingness to engage in one's own care, seem to be particularly important. Because this information was missing from the quantitative data, these variables could not be incorporated into the triangulation. Nurses' clinical judgement draws on well-known aetiological factors and tends to extend beyond the conditions covered by risk assessment scales. Patients' care dependency and self-care abilities seem to be core concepts in nurses' risk assessment. Copyright © 2013 Elsevier Ltd. All rights reserved.
Methods to Develop Inhalation Cancer Risk Estimates for Chromium and Nickel Compounds
This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units.
Comparison of evidence on harms of medical interventions in randomized and nonrandomized studies
Papanikolaou, Panagiotis N.; Christidi, Georgia D.; Ioannidis, John P.A.
2006-01-01
Background Information on major harms of medical interventions comes primarily from epidemiologic studies performed after licensing and marketing. Comparison with data from large-scale randomized trials is occasionally feasible. We compared evidence from randomized trials with that from epidemiologic studies to determine whether they give different estimates of risk for important harms of medical interventions. Methods We targeted well-defined, specific harms of various medical interventions for which data were already available from large-scale randomized trials (> 4000 subjects). Nonrandomized studies involving at least 4000 subjects addressing these same harms were retrieved through a search of MEDLINE. We compared the relative risks and absolute risk differences for specific harms in the randomized and nonrandomized studies. Results Eligible nonrandomized studies were found for 15 harms for which data were available from randomized trials addressing the same harms. Comparisons of relative risks between the study types were feasible for 13 of the 15 topics, and of absolute risk differences for 8 topics. The estimated increase in relative risk differed more than 2-fold between the randomized and nonrandomized studies for 7 (54%) of the 13 topics; the estimated increase in absolute risk differed more than 2-fold for 5 (62%) of the 8 topics. There was no clear predilection for randomized or nonrandomized studies to estimate greater relative risks, but usually (75% [6/8]) the randomized trials estimated larger absolute excess risks of harm than the nonrandomized studies did. Interpretation Nonrandomized studies are often conservative in estimating absolute risks of harms. It would be useful to compare and scrutinize the evidence on harms obtained from both randomized and nonrandomized studies. PMID:16505459
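The study's "2-fold difference" criterion for relative risks and absolute risk differences can be illustrated with a short sketch; the 2×2 counts below are hypothetical, not data from the paper.

```python
# Sketch: comparing relative risk (RR) and absolute risk difference (ARD)
# for the same harm in a randomized and a nonrandomized study.
# All counts are hypothetical, purely for illustration.

def risk(events, total):
    return events / total

def relative_risk(e1, n1, e0, n0):
    return risk(e1, n1) / risk(e0, n0)

def absolute_risk_difference(e1, n1, e0, n0):
    return risk(e1, n1) - risk(e0, n0)

# Hypothetical (events, total) for exposed vs. control groups
rct = {"exposed": (60, 4000), "control": (30, 4000)}
obs = {"exposed": (90, 8000), "control": (60, 8000)}

rr_rct = relative_risk(*rct["exposed"], *rct["control"])            # 2.0
rr_obs = relative_risk(*obs["exposed"], *obs["control"])            # 1.5
ard_rct = absolute_risk_difference(*rct["exposed"], *rct["control"])

# The paper's criterion: estimates "differ more than 2-fold" between designs
differs_twofold = max(rr_rct, rr_obs) / min(rr_rct, rr_obs) > 2
```
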
NASA Astrophysics Data System (ADS)
Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Frush, Donald P.
2010-04-01
Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides radiation dose and cancer risk estimates specific to each patient and each CT examination. Recently, we reported a method for estimating patient-specific dose from pediatric chest CT. The purpose of this study is to extend that effort to patient-specific risk estimation and to a population of pediatric CT patients. Our study included thirty pediatric CT patients (16 males and 14 females; 0-16 years old), for whom full-body computer models were recently created based on the patients' clinical CT data. Using a validated Monte Carlo program, organ dose received by the thirty patients from a chest scan protocol (LightSpeed VCT, 120 kVp, 1.375 pitch, 40-mm collimation, pediatric body scan field-of-view) was simulated and used to estimate patient-specific effective dose. Risks of cancer incidence were calculated for radiosensitive organs using gender-, age-, and tissue-specific risk coefficients and were used to derive patient-specific effective risk. The thirty patients had normalized effective dose of 3.7-10.4 mSv/100 mAs and normalized effective risk of 0.5-5.8 cases/1000 exposed persons/100 mAs. Normalized lung dose and risk of lung cancer correlated strongly with average chest diameter (correlation coefficient: r = -0.98 to -0.99). Normalized effective risk also correlated strongly with average chest diameter (r = -0.97 to -0.98). These strong correlations can be used to estimate patient-specific dose and risk prior to or after an imaging study to potentially guide healthcare providers in justifying CT examinations and to guide individualized protocol design and optimization.
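The weighted-sum logic behind effective dose and effective risk can be sketched as follows; the tissue weights, organ doses, and risk coefficients below are hypothetical placeholders, not the study's values.

```python
# Sketch of the weighted-sum logic behind effective dose and "effective risk".
# The weights, organ doses, and risk coefficients are hypothetical.

tissue_weights = {"lung": 0.12, "breast": 0.12, "thyroid": 0.04}  # ICRP-style w_T (subset)
organ_dose = {"lung": 9.0, "breast": 8.5, "thyroid": 6.0}         # H_T in mSv/100 mAs, hypothetical
risk_coeff = {"lung": 1.5e-3, "breast": 1.2e-3, "thyroid": 0.3e-3}  # cases per person-mSv, hypothetical

# Effective dose over the subset shown: E = sum_T w_T * H_T
effective_dose = sum(tissue_weights[t] * organ_dose[t] for t in organ_dose)

# Effective risk: weight each organ dose by its age-/sex-/tissue-specific
# risk coefficient instead of the detriment-based w_T
effective_risk = sum(risk_coeff[t] * organ_dose[t] for t in organ_dose)
```
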
Rosenbaum, Janet E
2009-06-01
Surveys are the primary information source about adolescents' health risk behaviors, but adolescents may not report their behaviors accurately. Survey data are used for formulating adolescent health policy, and inaccurate data can cause mistakes in policy creation and evaluation. The author used test-retest data from the Youth Risk Behavior Survey (United States, 2000) to compare adolescents' responses to 72 questions about their risk behaviors at a 2-week interval. Each question was evaluated for prevalence change and 3 measures of unreliability: inconsistency (retraction and apparent initiation), agreement measured as tetrachoric correlation, and estimated error due to inconsistency assessed with a Bayesian method. Results showed that adolescents report their sex, drug, alcohol, and tobacco histories more consistently than other risk behaviors in a 2-week period, opposite their tendency over longer intervals. Compared with other Youth Risk Behavior Survey topics, most sex, drug, alcohol, and tobacco items had stable prevalence estimates, higher average agreement, and lower estimated measurement error. Adolescents reported their weight control behaviors more unreliably than other behaviors, particularly problematic because of the increased investment in adolescent obesity research and reliance on annual surveys for surveillance and policy evaluation. Most weight control items had unstable prevalence estimates, lower average agreement, and greater estimated measurement error than other topics.
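The agreement and inconsistency measures mentioned above can be sketched with Pearson's classic cosine approximation to the tetrachoric correlation; the abstract does not say this approximation was used, and the counts below are hypothetical.

```python
import math

# Sketch: test-retest agreement for a yes/no survey item using the classic
# cosine approximation to the tetrachoric correlation,
# r ~ cos(pi / (1 + sqrt(ad/bc))). Cells b and c correspond to retraction
# and apparent initiation. All counts are hypothetical.

def tetrachoric_approx(a, b, c, d):
    """a = yes/yes, b = yes/no, c = no/yes, d = no/no."""
    odds_ratio = (a * d) / (b * c)
    return math.cos(math.pi / (1 + math.sqrt(odds_ratio)))

def inconsistency(a, b, c, d):
    """Fraction of respondents who changed their answer between waves."""
    return (b + c) / (a + b + c + d)

r = tetrachoric_approx(a=120, b=5, c=8, d=400)  # high agreement
p_change = inconsistency(120, 5, 8, 400)
```
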
Burch, Tucker R.; Spencer, Susan K.; Stokdyk, Joel P.; Kieke, Burney A.; Larson, Rebecca A.; Firnstahl, Aaron D.; Rule, Ana M.
2017-01-01
Background: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. Objectives: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. Methods: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irrigation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. Results: Median risk estimates from Monte Carlo simulations ranged from 10⁻⁵ to 10⁻² and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as a significant predictor more often than relative humidity, solar irradiation, or temperature. Conclusions: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk.
https://doi.org/10.1289/EHP283 PMID:28885976
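The Monte Carlo step of a QMRA like the one above can be sketched as follows, assuming a lognormal airborne concentration and an exponential dose-response model; all parameter values (k, lognormal parameters, breathing rate, duration) are hypothetical.

```python
import math
import random
import statistics

# Sketch of the Monte Carlo step in a QMRA: sample an airborne pathogen
# concentration, convert to an inhaled dose, and apply an exponential
# dose-response model P(ill) = 1 - exp(-k * dose). All parameters are
# hypothetical, not the study's fitted values.

def simulate_agi_risk(n=10_000, k=5e-4, mu=-2.0, sigma=1.0,
                      breathing_rate=0.83, hours=1.0, seed=1):
    """Median per-event probability of AGI over n Monte Carlo draws."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n):
        conc = rng.lognormvariate(mu, sigma)   # organisms per m^3 of air
        dose = conc * breathing_rate * hours   # organisms inhaled (m^3/h x h)
        risks.append(1 - math.exp(-k * dose))  # exponential dose-response
    return statistics.median(risks)

median_risk = simulate_agi_risk()
```
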
Creutzfeldt-Jakob disease in United Kingdom patients treated with human pituitary growth hormone.
Swerdlow, A J; Higgins, C D; Adlard, P; Jones, M E; Preece, M A
2003-09-23
To investigate risk factors for Creutzfeldt-Jakob disease (CJD) in patients in the United Kingdom treated with human pituitary growth hormone (hGH). Incidence rates of CJD, based on person-year denominators, were assessed in a cohort of 1,848 patients treated with hGH in the United Kingdom from 1959 through 1985 and followed to the end of 2000. CJD developed in 38 patients. Risk of CJD was significantly increased by treatment with hGH prepared by the Wilhelmi method of extraction from human pituitaries. Risk was further raised if this treatment was administered at ages 8 to 10 years. The peak risk of CJD was estimated to occur 20 years after first exposure, and the estimated lifetime cumulative risk of CJD in Wilhelmi-treated patients was 4.5%. Size-exclusion chromatography, used in non-Wilhelmi preparation methods, may prevent CJD infection. Susceptibility to CJD may vary with age, and susceptibility may be present in only a few percent of the population.
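The person-year incidence rates and cumulative risk figures above rest on standard survival arithmetic, sketched here with hypothetical counts and a constant-hazard assumption.

```python
import math

# Sketch: person-year incidence rates and cumulative risk over a follow-up
# window, assuming a constant hazard (exponential model). The counts are
# hypothetical, not the cohort's actual data.

def incidence_rate(cases, person_years):
    return cases / person_years

def cumulative_risk(rate, years):
    # 1 - S(t) under a constant hazard
    return 1 - math.exp(-rate * years)

rate = incidence_rate(cases=38, person_years=30_000)  # per person-year
risk_30y = cumulative_risk(rate, years=30)
```
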
Mammographers’ Perception of Women’s Breast Cancer Risk
Egger, Joseph R.; Cutter, Gary R.; Carney, Patricia A.; Taplin, Stephen H.; Barlow, William E.; Hendrick, R. Edward; D’Orsi, Carl J.; Fosse, Jessica S.; Abraham, Linn; Elmore, Joann G.
2011-01-01
Objective To understand mammographers’ perception of individual women’s breast cancer risk. Materials and Methods Radiologists interpreting screening mammography examinations completed a mailed survey consisting of questions pertaining to demographic and clinical practice characteristics, as well as 2 vignettes describing different risk profiles of women. Respondents were asked to estimate the probability of a breast cancer diagnosis in the next 5 years for each vignette. Vignette responses were plotted against mean recall rates in actual clinical practice. Results The survey was returned by 77% of eligible radiologists. Ninety-three percent of radiologists overestimated risk in the vignette involving a 70-year-old woman; 96% overestimated risk in the vignette involving a 41-year-old woman. Radiologists who more accurately estimated breast cancer risk were younger, worked full-time, were affiliated with an academic medical center, had fellowship training, had fewer than 10 years' experience interpreting mammograms, and worked more than 40% of the time in breast imaging. However, only age was statistically significant. No association was found between radiologists’ risk estimates and their recall rates. Conclusion U.S. radiologists have a heightened perception of breast cancer risk. PMID:15951455
Danaei, Goodarz; Rimm, Eric B.; Oza, Shefali; Kulkarni, Sandeep C.; Murray, Christopher J. L.; Ezzati, Majid
2010-01-01
Background There has been substantial research on psychosocial and health care determinants of health disparities in the United States (US) but less on the role of modifiable risk factors. We estimated the effects of smoking, high blood pressure, elevated blood glucose, and adiposity on national life expectancy and on disparities in life expectancy and disease-specific mortality among eight subgroups of the US population (the “Eight Americas”) defined on the basis of race and the location and socioeconomic characteristics of county of residence, in 2005. Methods and Findings We combined data from the National Health and Nutrition Examination Survey and the Behavioral Risk Factor Surveillance System to estimate unbiased risk factor levels for the Eight Americas. We used data from the National Center for Health Statistics to estimate age–sex–disease-specific number of deaths in 2005. We used systematic reviews and meta-analyses of epidemiologic studies to obtain risk factor effect sizes for disease-specific mortality. We used epidemiologic methods for multiple risk factors to estimate the effects of current exposure to these risk factors on death rates, and life table methods to estimate effects on life expectancy. Asians had the lowest mean body mass index, fasting plasma glucose, and smoking; whites had the lowest systolic blood pressure (SBP). SBP was highest in blacks, especially in the rural South—5–7 mmHg higher than whites. The other three risk factors were highest in Western Native Americans, Southern low-income rural blacks, and/or low-income whites in Appalachia and the Mississippi Valley. Nationally, these four risk factors reduced life expectancy at birth in 2005 by an estimated 4.9 y in men and 4.1 y in women. Life expectancy effects were smallest in Asians (M, 4.1 y; F, 3.6 y) and largest in Southern rural blacks (M, 6.7 y; F, 5.7 y). 
Standard deviation of life expectancies in the Eight Americas would decline by 0.50 y (18%) in men and 0.45 y (21%) in women if these risks had been reduced to optimal levels. Disparities in the probabilities of dying from cardiovascular diseases and diabetes at different ages would decline by 69%–80%; the corresponding reduction for probabilities of dying from cancers would be 29%–50%. Individually, smoking and high blood pressure had the largest effect on life expectancy disparities. Conclusions Disparities in smoking, blood pressure, blood glucose, and adiposity explain a significant proportion of disparities in mortality from cardiovascular diseases and cancers, and some of the life expectancy disparities in the US. Please see later in the article for the Editors' Summary PMID:20351772
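The multiple-risk-factor logic can be sketched with Levin's attributable-fraction formula combined multiplicatively across factors; this is a simplification of the epidemiologic methods the authors used, and the prevalences and relative risks below are hypothetical.

```python
# Sketch of the multiple-risk-factor logic: per-factor population
# attributable fractions (PAFs) combined as PAF_combined = 1 - prod(1 - PAF_i).
# Prevalences and relative risks are hypothetical.

def paf(prevalence, rr):
    # Levin's formula for a dichotomous exposure
    return prevalence * (rr - 1) / (prevalence * (rr - 1) + 1)

factors = {
    "smoking": (0.25, 2.0),
    "high_SBP": (0.30, 1.8),
    "elevated_glucose": (0.10, 1.5),
    "adiposity": (0.35, 1.4),
}

survive = 1.0
for prev, rr in factors.values():
    survive *= 1 - paf(prev, rr)
paf_combined = 1 - survive
```
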
Wang, Ying; Liu, Jing; Wang, Wei; Wang, Miao; Qi, Yue; Xie, Wuxiang; Li, Yan; Sun, Jiayi; Liu, Jun; Zhao, Dong
2016-01-01
Objective: Stroke is a major cause of premature death in China. Early prevention of stroke requires a more effective method of differentiating stroke risk among young-aged and middle-aged individuals than the 10-year risk of cardiovascular disease. This study aimed to establish a lifetime stroke risk model and risk charts for the young-aged and middle-aged population in China. Methods: The Chinese Multi-Provincial Cohort Study participants (n = 21 953) aged 35–84 years without cardiovascular disease at baseline were followed for 18 years (263 016 person-years). A modified Kaplan–Meier method was used to estimate the mean lifetime stroke risk up to age 80 years and the lifetime stroke risk according to major stroke risk factors for the population aged 35–60 years. Results: A total of 917 participants developed first-ever strokes. For the participants aged 35–40 years (98 stroke cases), the lifetime stroke risk was 18.0 and 14.7% in men and women, respectively. Blood pressure most effectively discriminated the lifetime stroke risk. The lifetime risk of stroke for individuals with all risk factors optimal was 8–10 times lower than for those with two or more high risk factors at age 35–60 years at baseline. Conclusion: In the young-aged and middle-aged population, the lifetime stroke risk remains very low if major risk factors, especially blood pressure, are at optimal levels, but the risk substantially increases even with a slight elevation of major risk factors, which could not be identified using 10-year risk estimation. PMID:27512963
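The competing-risk adjustment behind a modified Kaplan-Meier lifetime-risk estimate can be sketched as a cumulative-incidence calculation in which death from other causes removes people from risk without counting as a stroke; the per-interval counts below are hypothetical.

```python
# Sketch of a cumulative-incidence ("adjusted Kaplan-Meier") calculation
# treating death from other causes as a competing risk, the key idea behind
# lifetime-risk estimation. The interval counts are hypothetical.

def lifetime_risk(intervals):
    """intervals: list of (n_at_risk, strokes, competing_deaths) per age band."""
    surv = 1.0    # probability of being event-free at interval start
    cuminc = 0.0  # cumulative incidence of first-ever stroke
    for n, strokes, deaths in intervals:
        cuminc += surv * strokes / n
        surv *= 1 - (strokes + deaths) / n
    return cuminc

# Hypothetical 10-year age bands from 35 to 75
risk = lifetime_risk([(1000, 5, 10), (985, 15, 30), (940, 40, 80), (820, 70, 150)])
```
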
Bobb, Jennifer F; Dominici, Francesca; Peng, Roger D
2011-12-01
Estimating the risks heat waves pose to human health is a critical part of assessing the future impact of climate change. In this article, we propose a flexible class of time series models to estimate the relative risk of mortality associated with heat waves and conduct Bayesian model averaging (BMA) to account for the multiplicity of potential models. Applying these methods to data from 105 U.S. cities for the period 1987-2005, we identify those cities having a high posterior probability of increased mortality risk during heat waves, examine the heterogeneity of the posterior distributions of mortality risk across cities, assess sensitivity of the results to the selection of prior distributions, and compare our BMA results to a model selection approach. Our results show that no single model best predicts risk across the majority of cities, and that for some cities heat-wave risk estimation is sensitive to model choice. Although model averaging leads to posterior distributions with increased variance as compared to statistical inference conditional on a model obtained through model selection, we find that the posterior mean of heat wave mortality risk is robust to accounting for model uncertainty over a broad class of models. © 2011, The International Biometric Society.
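The model-averaging step can be sketched with BIC-based approximate posterior model weights, w_m proportional to exp(-0.5 ΔBIC_m); the paper's BMA is over a richer class of time-series models, and the estimates and BIC values here are hypothetical.

```python
import math

# Sketch of Bayesian model averaging with BIC-approximated posterior model
# weights. Estimates and BIC values are hypothetical.

def bma(estimates, bics):
    best = min(bics)
    weights = [math.exp(-0.5 * (b - best)) for b in bics]
    z = sum(weights)
    weights = [w / z for w in weights]
    post_mean = sum(w * e for w, e in zip(weights, estimates))
    # between-model variance contribution (ignores within-model variance)
    post_var = sum(w * (e - post_mean) ** 2 for w, e in zip(weights, estimates))
    return post_mean, post_var, weights

# Hypothetical heat-wave relative-risk estimates from three candidate models
rr_hat, var_between, w = bma(estimates=[1.05, 1.10, 1.20],
                             bics=[210.0, 211.5, 215.0])
```
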
Zhang, Xu; Zhang, Mei-Jie; Fine, Jason
2012-01-01
With competing risks failure time data, one often needs to assess covariate effects on the cumulative incidence probabilities. Fine and Gray proposed a proportional hazards regression model to directly model the subdistribution of a competing risk. They developed the estimating procedure for right-censored competing risks data based on inverse probability of censoring weighting. Right-censored and left-truncated competing risks data sometimes occur in biomedical research. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with right-censored and left-truncated data. We adopt a new weighting technique to estimate the parameters in this model. We have derived the large sample properties of the proposed estimators. To illustrate the application of the new method, we analyze failure time data for children with acute leukemia. In this example, the failure times for children who had bone marrow transplants were left truncated. PMID:21557288
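The inverse-probability-of-censoring weighting (IPCW) idea underlying the Fine-Gray estimating procedure can be sketched as weighting by the reciprocal of a Kaplan-Meier estimate of the censoring distribution; the times below are hypothetical and ties are handled naively, so this is illustrative only.

```python
# Sketch of the IPCW idea: weight each still-informative subject at time t
# by 1 / G(t), where G is the Kaplan-Meier estimate of the censoring
# survival function. Times and censoring indicators are hypothetical.

def censoring_survival(times, censored, t):
    """Kaplan-Meier estimate of P(censoring time > t).

    times: observed times; censored: 1 if the observation was censored
    (an "event" for the censoring distribution), 0 otherwise.
    Ties are handled naively for simplicity.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    g = 1.0
    at_risk = len(times)
    for i in order:
        if times[i] > t:
            break
        if censored[i]:
            g *= 1 - 1 / at_risk
        at_risk -= 1
    return g

# Weight for a subject evaluated at t = 6 in a tiny hypothetical sample
w = 1 / censoring_survival(times=[2, 3, 5, 7, 9], censored=[0, 1, 0, 1, 0], t=6)
```
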
Application of Fuzzy Logic in Oral Cancer Risk Assessment
SCROBOTĂ, Ioana; BĂCIUȚ, Grigore; FILIP, Adriana Gabriela; TODOR, Bianca; BLAGA, Florin; BĂCIUȚ, Mihaela Felicia
2017-01-01
Background: The mapping of the malignization mechanism is still incomplete, but oxidative stress is strongly correlated with carcinogenesis. In our research, using fuzzy logic, we aimed to estimate the oxidative stress-related cancerization risk of oral potentially malignant disorders. Methods: Serum from 16 patients diagnosed (clinically and histopathologically) with oral potentially malignant disorders (Dept. of Cranio-Maxillofacial Surgery and Radiology, ”Iuliu Hațieganu” University of Medicine and Pharmacy, Cluj-Napoca, Romania) was processed fluorometrically for malondialdehyde and proton donor assays (Dept. of Physiology, ”Iuliu Hațieganu” University of Medicine and Pharmacy, Cluj-Napoca, Romania). The values were used as inputs, linguistic terms were assigned to them using the MIN-MAX method, and 25 IF-THEN inference rules were generated to estimate the output value, the cancerization risk, rated on a scale from 1 to 10 - e.g., IF malondialdehyde is very high AND proton donors are very low THEN the cancer risk reaches the maximum value (Dept. of Industrial Engineering, Faculty of Managerial and Technological Engineering, University of Oradea, Oradea, Romania) (2012–2014). Results: We estimated the cancerization risk of the oral potentially malignant disorders by implementing the multi-criteria decision support system based on serum malondialdehyde and proton donor values. The risk was estimated as a concrete numerical value on a scale from 1 to 10 depending on the input numerical/linguistic value. Conclusion: The multi-criteria decision support system proposed by us, integrated into a more complex computerized decision support system, could be used as an important aid in oral cancer screening and in establishing future medical decisions in oral potentially malignant disorders. PMID:28560191
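The MIN-MAX inference pattern described above can be sketched as follows: triangular memberships for the two inputs, rule strength by MIN, and a weighted-average defuzzification to a 1-10 risk score. The membership breakpoints, normalized scales, and rule outputs are hypothetical, not the authors' calibration.

```python
# Sketch of MIN-MAX fuzzy inference for a two-input risk score.
# All breakpoints and rule outputs are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cancer_risk(mda, proton_donors):
    # Fuzzify inputs (hypothetical normalized 0-1 scales)
    mda_high = tri(mda, 0.4, 1.0, 1.6)
    mda_low = tri(mda, -0.6, 0.0, 0.6)
    pd_low = tri(proton_donors, -0.6, 0.0, 0.6)
    pd_high = tri(proton_donors, 0.4, 1.0, 1.6)
    # IF-THEN rules: strength = MIN of antecedents; consequent = risk level
    rules = [
        (min(mda_high, pd_low), 10.0),  # IF MDA high AND donors low THEN max risk
        (min(mda_high, pd_high), 6.0),
        (min(mda_low, pd_low), 5.0),
        (min(mda_low, pd_high), 1.0),   # IF MDA low AND donors high THEN min risk
    ]
    num = sum(strength * level for strength, level in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den else 0.0

risk = cancer_risk(mda=0.9, proton_donors=0.2)
```
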
Applications of Principled Search Methods in Climate Influences and Mechanisms
NASA Technical Reports Server (NTRS)
Glymour, Clark
2005-01-01
Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of locales and times for which wildfire is most probable, preferably with a two- to four-week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS) and other experts and agencies have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases (the MAPSS forecasts [23, 24], for example), forecasting accuracy has been estimated only with simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season year, April-October 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.
Gan, Ryan W; Ford, Bonne; Lassman, William; Pfister, Gabriele; Vaidyanathan, Ambarish; Fischer, Emily; Volckens, John; Pierce, Jeffrey R; Magzamen, Sheryl
2017-03-01
Climate forecasts predict an increase in frequency and intensity of wildfires. Associations between health outcomes and population exposure to smoke from Washington 2012 wildfires were compared using surface monitors, chemical-weather models, and a novel method blending three exposure information sources. The association between smoke particulate matter ≤2.5 μm in diameter (PM2.5) and cardiopulmonary hospital admissions occurring in Washington from 1 July to 31 October 2012 was evaluated using a time-stratified case-crossover design. Hospital admissions aggregated by ZIP code were linked with population-weighted daily average concentrations of smoke PM2.5 estimated using three distinct methods: a simulation with the Weather Research and Forecasting with Chemistry (WRF-Chem) model, a kriged interpolation of PM2.5 measurements from surface monitors, and a geographically weighted ridge regression (GWR) that blended inputs from WRF-Chem, satellite observations of aerosol optical depth, and kriged PM2.5. A 10 μg/m³ increase in GWR smoke PM2.5 was associated with an 8% increased risk of asthma-related hospital admissions (odds ratio (OR): 1.076, 95% confidence interval (CI): 1.019-1.136); other smoke estimation methods yielded similar results. However, point estimates for chronic obstructive pulmonary disease (COPD) differed by smoke PM2.5 exposure method: a 10 μg/m³ increase using GWR was significantly associated with increased risk of COPD (OR: 1.084, 95% CI: 1.026-1.145) and not significant using WRF-Chem (OR: 0.986, 95% CI: 0.931-1.045). The magnitude (OR) and uncertainty (95% CI) of associations between smoke PM2.5 and hospital admissions were dependent on the estimation method used and the outcome evaluated. The choice of smoke exposure estimation method can impact the overall conclusion of the study.
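A reported "OR per 10 μg/m³ increase" is a rescaling of a log-odds regression coefficient; the sketch below uses a hypothetical coefficient and standard error chosen to land near the scale of the asthma estimate quoted above, not the study's fitted values.

```python
import math

# Sketch: converting a log-odds coefficient (per 1 unit of exposure) into an
# odds ratio per 10-unit increment with a Wald 95% CI. The coefficient and
# standard error are hypothetical.

def or_per_increment(beta, se, increment=10.0, z=1.96):
    or_point = math.exp(beta * increment)
    lo = math.exp((beta - z * se) * increment)
    hi = math.exp((beta + z * se) * increment)
    return or_point, lo, hi

or_point, lo, hi = or_per_increment(beta=0.00732, se=0.00277)
```
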
Advanced space-based InSAR risk analysis of planned and existing transportation infrastructure.
DOT National Transportation Integrated Search
2017-03-21
The purpose of this document is to summarize activities by Stanford University and MDA Geospatial Services Inc. (MDA) to estimate surface deformation and associated risk to transportation infrastructure using SAR Interferometric methods for the ...
Web-enabling Ecological Risk Assessment for Accessibility and Transparency
Ecological risk methods and tools are necessarily diverse to account for different combinations of receptors, exposure processes, effects estimation, and degree of conservatism/realism necessary to support chemical-based assessments. These tools have been continuously developed s...
Predictions of Leukemia Risks to Astronauts from Solar Particle Events
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Atwell, W.; Kim, M. Y.; George, K. A.; Ponomarev, A.; Nikjoo, H.; Wilson, J. W.
2006-01-01
Leukemias consisting of acute and chronic myeloid leukemia and acute lymphatic lymphomas represent the earliest cancers that appear after radiation exposure, have a high lethality fraction, and make up a significant fraction of the overall fatal cancer risk from radiation for adults. Several considerations impact the recommendation of a preferred model for the estimation of leukemia risks from solar particle events (SPEs): The BEIR VII report recommends several changes to the method of calculation of leukemia risk compared to the methods recommended by NCRP Report No. 132, including the preference for a mixture model with additive and multiplicative components in BEIR VII compared to the additive transfer model recommended by NCRP Report No. 132. Proton fluences and doses vary considerably across marrow regions because of the characteristic spectra of primary solar protons, making the use of an average dose suspect. Previous estimates of bone marrow doses from SPEs have used an average body-shielding distribution for marrow based on the computerized anatomical man (CAM) model. We have developed an 82-point body-shielding distribution that faithfully reproduces the mean and variance of SPE doses in the active marrow regions (head and neck, chest, abdomen, pelvis and thighs), allowing for more accurate estimation of linear- and quadratic-dose components of the marrow response. SPEs have differential dose-rates, and a pseudo-quadratic dose response term is possible in the peak-flux period of an event. Also, the mechanistic basis for leukemia risk continues to improve, allowing for improved strategies in choosing dose-rate modulation factors and radiation quality descriptors. We make comparisons of the various choices of the components in leukemia risk estimates in formulating our preferred model. A major finding is that leukemia could be the dominant risk to astronauts for a major solar particle event.
2013-01-01
Background Making evidence-based decisions often requires comparison of two or more options. Research-based evidence may exist which quantifies how likely the outcomes are for each option. Understanding these numeric estimates improves patients’ risk perception and leads to better informed decision making. This paper summarises current “best practices” in communication of evidence-based numeric outcomes for developers of patient decision aids (PtDAs) and other health communication tools. Method An expert consensus group of fourteen researchers from North America, Europe, and Australasia identified eleven main issues in risk communication. Two experts for each issue wrote a “state of the art” summary of best evidence, drawing on the PtDA, health, psychological, and broader scientific literature. In addition, commonly used terms were defined, and a set of guiding principles and key messages were derived from the results. Results The eleven key components of risk communication were: 1) Presenting the chance an event will occur; 2) Presenting changes in numeric outcomes; 3) Outcome estimates for test and screening decisions; 4) Numeric estimates in context and with evaluative labels; 5) Conveying uncertainty; 6) Visual formats; 7) Tailoring estimates; 8) Formats for understanding outcomes over time; 9) Narrative methods for conveying the chance of an event; 10) Important skills for understanding numerical estimates; and 11) Interactive web-based formats. Guiding principles from the evidence summaries advise that risk communication formats should reflect the task required of the user, should always define a relevant reference class (i.e., denominator) over time, should aim to use a consistent format throughout documents, should avoid “1 in x” formats and variable denominators, should consider the magnitude of numbers used and the possibility of format bias, and should take into account the numeracy and graph literacy of the audience.
Conclusion A substantial and rapidly expanding evidence base exists for risk communication. Developers of tools to facilitate evidence-based decision making should apply these principles to improve the quality of risk communication in practice. PMID:24625237
Estimating the health consequences of replacing cigarettes with nicotine inhalers
Sumner, W
2003-01-01
Background: A fast acting, clean nicotine delivery system might substantially displace cigarettes. Public health consequences would depend on the subsequent prevalence of nicotine use, hazards of delivery systems, and intrinsic hazards of nicotine. Methods: A spreadsheet program, DEMANDS, estimates differences in expected mortality, adjusted for nicotine delivery system features and prevalence of nicotine use, by extending the data and methods of the SAMMEC 3 software from the US Centers for Disease Control and Prevention. The user estimates disease risks attributable to nicotine, other smoke components, and risk factors that coexist with smoking. The public health consequences of a widely used clean nicotine inhaler replacing cigarettes were compared to historical observations and public health goals, using four different risk attribution scenarios and nicotine use prevalence from 0–100%. Main outcome measures: Changes in years of potential life before age 85 (YPL85). Results: If nicotine accounts for less than a third of smokers' excess risk of SAMMEC diseases, as it most likely does, then even with very widespread use of clean nicotine DEMANDS predicts public health gains, relative to current tobacco use. Public health benefits accruing from a widely used clean nicotine inhaler probably equal or exceed the benefits of achieving Healthy People 2010 goals. Conclusions: Clean nicotine inhalers might improve public health as much as any feasible tobacco control effort. Although the relevant risk estimates are somewhat uncertain, partial nicotine deregulation deserves consideration as part of a broad tobacco control policy. PMID:12773720
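The outcome measure above, years of potential life before age 85 (YPL85), reduces to a weighted sum over deaths by age at death; the death counts and scenario below are hypothetical illustrations, not DEMANDS outputs.

```python
# Sketch of the YPL85-style outcome: years of potential life lost before
# age 85, summed over deaths by age at death. All counts are hypothetical.

def ypll85(deaths_by_age):
    """deaths_by_age: mapping of age at death -> number of deaths."""
    return sum(n * (85 - age) for age, n in deaths_by_age.items() if age < 85)

# Hypothetical smoking-attributable deaths under current tobacco use
baseline = ypll85({55: 100, 65: 200, 75: 300})
# A hypothetical scenario (e.g., wide nicotine-inhaler uptake) shifting
# the same deaths to older ages
scenario = ypll85({65: 100, 75: 200, 84: 300})
ypll_gained = baseline - scenario
```
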
Breivik, Knut; Arnot, Jon A; Brown, Trevor N; McLachlan, Michael S; Wania, Frank
2012-08-01
Quantitative knowledge of organic chemical release into the environment is essential to understand and predict human exposure as well as to develop rational control strategies for any substances of concern. While significant efforts have been invested to characterize and screen organic chemicals for hazardous properties, relatively less effort has been directed toward estimating emissions and hence also risks. Here, a rapid throughput method to estimate emissions of discrete organic chemicals in commerce has been developed, applied and evaluated to support screening studies aimed at ranking and identifying chemicals of potential concern. The method builds upon information in the European Union Technical Guidance Document and utilizes information on quantities in commerce (production and/or import rates), chemical function (use patterns) and physical-chemical properties to estimate emissions to air, soil and water within the OECD for five stages of the chemical life-cycle. The method is applied to 16,029 discrete substances (identified by CAS numbers) from five national and international high production volume lists. As access to consistent input data remains fragmented or even impossible, particular attention is given to estimating, evaluating and discussing uncertainties in the resulting emission scenarios. The uncertainty for individual substances typically spans 3 to 4 orders of magnitude for this initial tier screening method. Information on uncertainties in emissions is useful as any screening or categorization methods which solely rely on threshold values are at risk of leading to a significant number of either false positives or false negatives. 
A limited evaluation of the screening method's estimates for a sub-set of about 100 substances, compared against independent and more detailed emission scenarios presented in various European Risk Assessment Reports, highlights that up-to-date and accurate information on quantities in commerce as well as a detailed breakdown on chemical function are critically needed for developing more realistic emission scenarios.
SURE Estimates for a Heteroscedastic Hierarchical Model
Xie, Xianchao; Kou, S. C.; Brown, Lawrence D.
2014-01-01
Hierarchical models are extensively studied and widely used in statistics and many other scientific areas. They provide an effective tool for combining information from similar resources and achieving partial pooling of inference. Since the seminal work by James and Stein (1961) and Stein (1962), shrinkage estimation has become one major focus for hierarchical models. For the homoscedastic normal model, it is well known that shrinkage estimators, especially the James-Stein estimator, have good risk properties. The heteroscedastic model, though more appropriate for practical applications, is less well studied, and it is unclear what types of shrinkage estimators are superior in terms of the risk. We propose in this paper a class of shrinkage estimators based on Stein’s unbiased estimate of risk (SURE). We study asymptotic properties of various common estimators as the number of means to be estimated grows (p → ∞). We establish the asymptotic optimality property for the SURE estimators. We then extend our construction to create a class of semi-parametric shrinkage estimators and establish corresponding asymptotic optimality results. We emphasize that though the form of our SURE estimators is partially obtained through a normal model at the sampling level, their optimality properties do not heavily depend on such distributional assumptions. We apply the methods to two real data sets and obtain encouraging results. PMID:25301976
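The SURE construction described above can be sketched in its simplest form: shrink each observation toward zero with a common parameter λ chosen to minimize Stein's unbiased risk estimate. This is a minimal illustration of the idea under assumed inputs, not the authors' exact estimator; the grid search and the shrink-toward-zero form are simplifying assumptions.

```python
import numpy as np

def sure(lam, x, a):
    """Stein's unbiased risk estimate for the shrinkage estimator
    theta_hat_i = lam / (lam + a_i) * x_i (shrinking toward zero),
    with observations x_i ~ N(theta_i, a_i) and known variances a_i."""
    b = lam / (lam + a)
    return np.sum(a + (1 - b) ** 2 * x ** 2 - 2 * a * (1 - b))

def sure_estimate(x, a, grid=None):
    """Pick the shrinkage parameter minimizing SURE over a grid, then shrink."""
    if grid is None:
        grid = np.linspace(0.0, 10 * a.max(), 2001)
    lam = min(grid, key=lambda l: sure(l, x, a))
    return lam / (lam + a) * x

rng = np.random.default_rng(0)
p = 200
theta = rng.normal(0.0, 1.0, p)        # true means
a = rng.uniform(0.1, 2.0, p)           # known, unequal (heteroscedastic) variances
x = rng.normal(theta, np.sqrt(a))      # observed data
theta_hat = sure_estimate(x, a)
```

Because the shrinkage factor lies in [0, 1], every coordinate of the SURE estimate is pulled toward zero, more strongly where the variance a_i is large relative to λ.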
A new technique for fire risk estimation in the wildland urban interface
NASA Astrophysics Data System (ADS)
Dasgupta, S.; Qu, J. J.; Hao, X.
A novel technique based on the physical variable of pre-ignition energy is proposed for assessing fire risk in the Grassland-Urban-Interface. The physical basis lends the index meaning, a site- and season-independent applicability, and possibilities for computing spread rates and ignition probabilities, features that contemporary fire risk indices usually lack. The method requires estimates of grass moisture content and temperature. A constrained radiative-transfer inversion scheme on MODIS NIR-SWIR reflectances, which reduces solution ambiguity, is used for grass moisture retrieval, while MODIS land surface temperature and emissivity products are used for retrieving grass temperature. Subpixel urban contamination of the MODIS reflective and thermal signals over a Grassland-Urban-Interface pixel is corrected using periodic estimates of urban influence from high spatial resolution ASTER data.
Toward risk reduction: predicting the future burden of occupational cancer.
Hutchings, Sally; Rushton, Lesley
2011-05-01
Interventions to reduce cancers related to certain occupations should be evidence-based. The authors have developed a method for forecasting the future burden of occupational cancer to inform strategies for risk reduction. They project risk exposure periods, accounting for cancer latencies of up to 50 years, forward in time to estimate attributable fractions for a series of forecast target years given past and projected exposure trends and under targeted reduction scenarios. Adjustment factors for changes in exposed numbers and levels are applied in estimation intervals within the risk-exposure periods. The authors illustrate the methods by using a range of scenarios for reducing lung cancer due to occupational exposure to respirable crystalline silica. Attributable fractions for lung cancer due to respirable crystalline silica could be potentially reduced from 2.07% in 2010 to nearly 0% by 2060, depending on the timing and success of interventions. Focusing on achieving compliance with current exposure standards in small industries can be more effective than setting standards at a lower level. The method can be used to highlight high-risk carcinogens, industries, and occupations. It is adaptable for other countries and other exposure situations in the general environment and can be extended to include socioeconomic impact assessment.
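The attributable-fraction arithmetic underlying such projections can be illustrated with Levin's formula. The prevalence and relative-risk values below are hypothetical inputs for illustration, not figures taken from the paper.

```python
def attributable_fraction(prevalence, relative_risk):
    """Levin's population attributable fraction:
    AF = p(RR - 1) / (p(RR - 1) + 1)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# Hypothetical scenario: exposure prevalence falls as controls take effect.
rr = 1.3                                         # assumed relative risk
baseline = attributable_fraction(0.07, rr)       # before intervention
after_intervention = attributable_fraction(0.01, rr)
```

Lowering the exposed fraction drives the attributable fraction toward zero, which is the mechanism behind the projected decline from about 2% toward 0%.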
Perini, Wilco; Agyemang, Charles; Snijder, Marieke B.; Peters, Ron J.G.; Kunst, Anton E.
2017-01-01
Background: European societies are becoming increasingly ethnically diverse. This may have important implications for socio-economic inequalities in health due to the often disadvantaged position of ethnic minority groups in both socio-economic status (SES) and disease, especially cardiovascular disease (CVD). Objective: The aim of this study was to determine whether the socio-economic gradient of estimated CVD risk differs between ethnic groups. Methods: Using the Healthy Life in an Urban Setting study, we obtained data on SES and CVD risk factors among participants from six ethnic backgrounds residing in Amsterdam. SES was measured using educational level and occupational level. CVD risk was estimated based on the occurrence of CVD risk factors using the Dutch version of the systematic coronary risk evaluation algorithm. Ethnic disparities in socio-economic gradients for estimated CVD risk were determined using the relative index of inequality (RII). Results: Among Dutch-origin men, the RII for estimated CVD risk according to educational level was 6.15% (95% confidence interval [CI] 4.35–7.96%), indicating that those at the bottom of the educational hierarchy had a 6.15% higher estimated CVD risk relative to those at the top. Among Dutch-origin women, the RII was 4.49% (CI 2.45–6.52%). The RII was lower among ethnic minority groups, ranging from 0.83% to 3.13% among men and −0.29% to 5.12% among women, indicating weaker associations among these groups. Results were similar based on occupational level. Conclusions: Ethnic background needs to be considered in associations between SES and disease. The predictive value of SES varies between ethnic groups and may be quite poor for some groups. PMID:28699411
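The relative index of inequality used in this study can be sketched as the slope of a regression of group-level risk on each group's ridit score (the midpoint of its cumulative population share, ordered from highest to lowest SES). The group sizes and risks below are invented for illustration.

```python
import numpy as np

def relative_index_of_inequality(group_sizes, group_risks):
    """Slope of risk regressed on the ridit score (midpoint of the
    cumulative population share), groups ordered from highest to lowest SES.
    The slope is the modelled risk gap between rank 1 (bottom of the
    hierarchy) and rank 0 (top)."""
    shares = np.asarray(group_sizes, float) / np.sum(group_sizes)
    cum = np.cumsum(shares)
    ridit = cum - shares / 2.0      # midpoint of each group's rank interval
    # weighted least squares: weight each group by its population share
    slope, intercept = np.polyfit(ridit, group_risks, 1, w=shares)
    return slope

# Hypothetical education groups (highest first) with estimated CVD risk (%)
sizes = [300, 500, 200]
risks = [4.0, 7.0, 10.0]
rii = relative_index_of_inequality(sizes, risks)
```

A positive slope means estimated risk rises toward the bottom of the SES hierarchy; a slope near zero corresponds to the weak gradients reported for some minority groups.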
Disease mapping based on stochastic SIR-SI model for Dengue and Chikungunya in Malaysia
NASA Astrophysics Data System (ADS)
Samat, N. A.; Ma'arof, S. H. Mohd Imam
2014-12-01
This paper describes and demonstrates a method for relative risk estimation based on the stochastic SIR-SI vector-borne infectious disease transmission model, applied specifically to Dengue and Chikungunya in Malaysia. Firstly, the common compartmental model for vector-borne infectious disease transmission, the SIR-SI model (susceptible-infective-recovered for the human population; susceptible-infective for the vector population), is presented. This is followed by an explanation of the stochastic SIR-SI model, which involves a Bayesian formulation. This stochastic model is then used in the relative risk formulation to obtain posterior relative risk estimates. The approach is demonstrated using Dengue and Chikungunya data for Malaysia. The viruses of both diseases are transmitted by the same female vector mosquitoes, Aedes aegypti and Aedes albopictus. Finally, the findings of the relative risk estimation for both diseases are presented, compared and displayed in graphs and maps. The risk maps show the high- and low-risk areas of Dengue and Chikungunya occurrence and can be used as a tool for prevention and control strategies for both diseases.
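A minimal deterministic sketch of the SIR-SI compartmental structure (Euler time-stepping, frequency-dependent transmission) is shown below. The paper's model is stochastic and Bayesian; all parameter values here are assumptions chosen only to produce a plausible epidemic curve.

```python
def simulate_sir_si(days=200.0, dt=0.05, n_h=10_000.0, n_v=50_000.0,
                    beta_hv=0.3, beta_vh=0.2, gamma=1 / 7, mu_v=1 / 14,
                    i_h0=10.0, i_v0=50.0):
    """Euler integration of a deterministic SIR (human) / SI (vector) model.
    beta_hv: vector-to-human transmission rate, beta_vh: human-to-vector,
    gamma: human recovery rate, mu_v: vector birth/death (turnover) rate."""
    s_h, i_h, r_h = n_h - i_h0, i_h0, 0.0
    s_v, i_v = n_v - i_v0, i_v0
    for _ in range(int(days / dt)):
        new_h = beta_hv * s_h * i_v / n_v   # vector -> human infections
        new_v = beta_vh * s_v * i_h / n_h   # human -> vector infections
        d_sh, d_ih, d_rh = -new_h, new_h - gamma * i_h, gamma * i_h
        d_sv = mu_v * (n_v - s_v) - new_v   # vector births replace deaths
        d_iv = new_v - mu_v * i_v
        s_h += dt * d_sh; i_h += dt * d_ih; r_h += dt * d_rh
        s_v += dt * d_sv; i_v += dt * d_iv
    return s_h, i_h, r_h, s_v, i_v

s_h, i_h, r_h, s_v, i_v = simulate_sir_si()
```

Both populations are conserved by construction (recovered humans stay in R; dead vectors are replaced by susceptible births), which is a quick sanity check on any discretization of this model.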
ERIC Educational Resources Information Center
Brammeier, Monique; Chow, Joan M.; Samuel, Michael C.; Organista, Kurt C.; Miller, Jamie; Bolan, Gail
2008-01-01
Context: The prevalence of sexually transmitted diseases and associated risk behaviors among California farmworkers is not well described. Purpose: To estimate the prevalence of sexually transmitted diseases (STDs) and associated risk behaviors among California farmworkers. Methods: Cross-sectional analysis of population-based survey data from 6…
Aviation Security, Risk Assessment, and Risk Aversion for Public Decisionmaking
ERIC Educational Resources Information Center
Stewart, Mark G.; Mueller, John
2013-01-01
This paper estimates risk reductions for each layer of security designed to prevent commercial passenger airliners from being commandeered by terrorists, kept under control for some time, and then crashed into specific targets. Probabilistic methods are used to characterize the uncertainty of rates of deterrence, detection, and disruption, as well…
Environmental health risk assessments of chemical mixtures that rely on component approaches often begin by grouping the chemicals of concern according to toxicological similarity. Approaches that assume dose addition typically are used for groups of similarly-acting chemicals an...
12 CFR Appendix C to Part 325 - Risk-Based Capital for State Nonmember Banks: Market Risk
Code of Federal Regulations, 2013 CFR
2013-01-01
... Section 10: Standardized Measurement Method for Specific Risk; Section 11: Simplified Supervisory Formula Approach; Section ... apply: Affiliate, with respect to a company, means any company that controls, is controlled by, or is under common control with, the company. Backtesting means the comparison of a bank's internal estimates...
12 CFR Appendix C to Part 325 - Risk-Based Capital for State Nonmember Banks: Market Risk
Code of Federal Regulations, 2014 CFR
2014-01-01
... Section 10: Standardized Measurement Method for Specific Risk; Section 11: Simplified Supervisory Formula Approach; Section ... apply: Affiliate, with respect to a company, means any company that controls, is controlled by, or is under common control with, the company. Backtesting means the comparison of a bank's internal estimates...
French, David P; Cameron, Elaine; Benton, Jack S; Deaton, Christi; Harvie, Michelle
2017-10-01
The assessment and communication of disease risk that is personalised to the individual is widespread in healthcare contexts. Despite several systematic reviews of RCTs, it is unclear under what circumstances personalised risk estimates promote change in four key health-related behaviours: smoking, physical activity, diet and alcohol consumption. The present research aims to systematically identify, evaluate and synthesise the findings of existing systematic reviews. This systematic review of systematic reviews followed published guidance. A search of four databases and a two-stage screening procedure with good reliability identified nine eligible systematic reviews. The nine reviews each included between three and 15 primary studies, containing 36 unique studies. Methods of personalising risk feedback included imaging/visual feedback, genetic testing, and numerical estimation from risk algorithms. The reviews were generally high quality. For a broad range of methods of estimating and communicating risk, the reviews found no evidence that risk information had strong or consistent effects on health-related behaviours. The most promising effects came from interventions using visual or imaging techniques and with smoking cessation and dietary behaviour as outcomes, but with inconsistent results. Few interventions explicitly used theory, few targeted self-efficacy or response efficacy, and a limited range of Behaviour Change Techniques were used. Presenting risk information on its own, even when highly personalised, does not produce strong effects on health-related behaviours or changes which are sustained. Future research in this area should build on the existing knowledge base about increasing the effects of risk communication on behaviour.
Jia, Yuling; Stone, Dave; Wang, Wentao; Schrlau, Jill; Tao, Shu; Massey Simonich, Staci L.
2011-01-01
Background The 2008 Beijing Olympic Games provided a unique case study to investigate the effect of source control measures on the reduction in air pollution, and associated inhalation cancer risk, in a Chinese megacity. Objectives We measured 17 carcinogenic polycyclic aromatic hydrocarbons (PAHs) and estimated the lifetime excess inhalation cancer risk during different periods of the Beijing Olympic Games, to assess the effectiveness of source control measures in reducing PAH-induced inhalation cancer risks. Methods PAH concentrations were measured in samples of particulate matter ≤ 2.5 μm in aerodynamic diameter (PM2.5) collected during the Beijing Olympic Games, and the associated inhalation cancer risks were estimated using a point-estimate approach based on relative potency factors. Results We estimated the number of lifetime excess cancer cases due to exposure to the 17 carcinogenic PAHs [12 priority pollutant PAHs and five high-molecular-weight (302 Da) PAHs (MW 302 PAHs)] to range from 6.5 to 518 per million people for the source control period concentrations and from 12.2 to 964 per million people for the nonsource control period concentrations. This would correspond to a 46% reduction in estimated inhalation cancer risk due to source control measures, if these measures were sustained over time. Benzo[b]fluoranthene, dibenz[a,h]anthracene, benzo[a]pyrene, and dibenzo[a,l]pyrene were the most carcinogenic PAH species evaluated. Total excess inhalation cancer risk would be underestimated by 23% if we did not include the five MW 302 PAHs in the risk calculation. Conclusions Source control measures, such as those imposed during the 2008 Beijing Olympics, can significantly reduce the inhalation cancer risk associated with PAH exposure in Chinese megacities similar to Beijing. MW 302 PAHs are a significant contributor to the estimated overall inhalation cancer risk. PMID:21632310
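The relative-potency-factor calculation described in the Methods can be sketched as follows: weight each PAH concentration by its potency relative to benzo[a]pyrene, sum into a BaP-equivalent concentration, and multiply by BaP's inhalation unit risk. The concentrations, RPF values and unit risk below are illustrative assumptions, not the study's inputs.

```python
def lifetime_excess_risk(concentrations_ng_m3, rpf, unit_risk_per_ng_m3):
    """Sum PAH concentrations weighted by relative potency factors into a
    benzo[a]pyrene-equivalent concentration, then multiply by an assumed
    BaP inhalation unit risk to get lifetime excess cancer risk."""
    bap_eq = sum(concentrations_ng_m3[p] * rpf[p] for p in rpf)
    return bap_eq * unit_risk_per_ng_m3

# Illustrative inputs only (not the study's measurements):
conc = {"BaP": 5.0, "DBahA": 1.0, "BbF": 8.0}   # ng/m3 in PM2.5
rpf = {"BaP": 1.0, "DBahA": 10.0, "BbF": 0.8}   # potency relative to BaP
risk = lifetime_excess_risk(conc, rpf, unit_risk_per_ng_m3=8.7e-5)
```

Multiplying the per-person risk by 1,000,000 gives the "excess cases per million people" form quoted in the abstract.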
Kandhasamy, Chandrasekaran; Ghosh, Kaushik
2017-02-01
Indian states are currently classified into HIV-risk categories based on the observed prevalence counts, percentage of infected attendees in antenatal clinics, and percentage of infected high-risk individuals. This method, however, does not account for the spatial dependence among the states nor does it provide any measure of statistical uncertainty. We provide an alternative model-based approach to address these issues. Our method uses Poisson log-normal models having various conditional autoregressive structures with neighborhood-based and distance-based weight matrices and incorporates all available covariate information. We use R and WinBUGS software to fit these models to the 2011 HIV data. Based on the Deviance Information Criterion, the convolution model using a distance-based weight matrix and covariate information on female sex workers, literacy rate and intravenous drug users is found to have the best fit. The relative risk of HIV for the various states is estimated using the best model and the states are then classified into the risk categories based on these estimated values. An HIV risk map of India is constructed based on these results. The choice of the final model suggests that an HIV control strategy which focuses on the female sex workers, intravenous drug users and literacy rate would be most effective. Copyright © 2017 Elsevier Ltd. All rights reserved.
Prieto, M.L.; Cuéllar-Barboza, A.B.; Bobo, W.V.; Roger, V.L.; Bellivier, F.; Leboyer, M.; West, C.P.; Frye, M.A.
2016-01-01
Objective To review the evidence on and estimate the risk of myocardial infarction and stroke in bipolar disorder. Method A systematic search using MEDLINE, EMBASE, PsycINFO, Web of Science, Scopus, Cochrane Database of Systematic Reviews, and bibliographies (1946 – May, 2013) was conducted. Case-control and cohort studies of bipolar disorder patients age 15 or older with myocardial infarction or stroke as outcomes were included. Two independent reviewers extracted data and assessed quality. Estimates of effect were summarized using random-effects meta-analysis. Results Five cohort studies including 13 115 911 participants (27 092 bipolar) were included. Due to the use of registers, different statistical methods, and inconsistent adjustment for confounders, there was significant methodological heterogeneity among studies. The exploratory meta-analysis yielded no evidence for a significant increase in the risk of myocardial infarction: [relative risk (RR): 1.09, 95% CI 0.96–1.24, P = 0.20; I2 = 6%]. While there was evidence of significant study heterogeneity, the risk of stroke in bipolar disorder was significantly increased (RR 1.74, 95% CI 1.29–2.35; P = 0.0003; I2 = 83%). Conclusion There may be a differential risk of myocardial infarction and stroke in patients with bipolar disorder. Confidence in these pooled estimates was limited by the small number of studies, significant heterogeneity and dissimilar methodological features. PMID:24850482
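Random-effects pooling of this kind is commonly done with the DerSimonian-Laird estimator on the log relative-risk scale. The sketch below uses invented study-level relative risks and standard errors, not the five cohort studies' data.

```python
import math

def dersimonian_laird(log_rr, se):
    """DerSimonian-Laird random-effects pooled estimate on the log scale.
    Returns (pooled RR, lower 95% CI, upper 95% CI)."""
    w = [1 / s**2 for s in se]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = [1 / (s**2 + tau2) for s in se]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_rr)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical study-level relative risks for illustration:
rrs = [1.5, 1.9, 2.1, 1.3, 2.4]
ses = [0.20, 0.25, 0.15, 0.30, 0.35]
pooled, lo, hi = dersimonian_laird([math.log(r) for r in rrs], ses)
```

When Q exceeds its degrees of freedom (heterogeneity, as with the stroke analysis above), tau² grows, the weights even out, and the confidence interval widens accordingly.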
2011-01-01
Background Tetrachloroethylene (PCE) is an important occupational chemical used in metal degreasing and drycleaning and a prevalent drinking water contaminant. Exposure often occurs with other chemicals but it occurred alone in a pattern that reduced the likelihood of confounding in a unique scenario on Cape Cod, Massachusetts. We previously found a small to moderate increased risk of breast cancer among women with the highest exposures using a simple exposure model. We have taken advantage of technical improvements in publicly available software to incorporate a more sophisticated determination of water flow and direction to see if previous results were robust to more accurate exposure assessment. Methods The current analysis used PCE exposure estimates generated with the addition of water distribution modeling software (EPANET 2.0) to test model assumptions, compare exposure distributions to prior methods, and re-examine the risk of breast cancer. In addition, we applied data smoothing to examine nonlinear relationships between breast cancer and exposure. We also compared a set of measured PCE concentrations in water samples collected in 1980 to modeled estimates. Results Thirty-nine percent of individuals considered unexposed in prior epidemiological analyses were considered exposed using the current method, but mostly at low exposure levels. As a result, the exposure distribution was shifted downward resulting in a lower value for the 90th percentile, the definition of "high exposure" in prior analyses. The current analyses confirmed a modest increase in the risk of breast cancer for women with high PCE exposure levels defined by either the 90th percentile (adjusted ORs 1.0-1.5 for 0-19 year latency assumptions) or smoothing analysis cut point (adjusted ORs 1.3-2.0 for 0-15 year latency assumptions). 
Current exposure estimates had a higher correlation with PCE concentrations in water samples (Spearman correlation coefficient = 0.65, p < 0.0001) than estimates generated using the prior method (0.54, p < 0.0001). Conclusions The incorporation of sophisticated flow estimates in the exposure assessment method shifted the PCE exposure distribution downward, but did not meaningfully affect the exposure ranking of subjects or the strength of the association with the risk of breast cancer found in earlier analyses. Thus, the current analyses show a slightly elevated breast cancer risk for highly exposed women, with strengthened exposure assessment and minimization of misclassification by using the latest technology. PMID:21600013
Di Salvo, Francesca; Meneghini, Elisabetta; Vieira, Veronica; Baili, Paolo; Mariottini, Mauro; Baldini, Marco; Micheli, Andrea; Sant, Milena
2015-01-01
Introduction The study investigated the geographic variation of mortality risk for hematological malignancies (HMs) in order to identify potential high-risk areas near an Italian petrochemical refinery. Material and methods A population-based case-control study was conducted and residential histories for 171 cases and 338 sex- and age-matched controls were collected. Confounding factors were obtained from interviews with consenting relatives for 109 HM deaths and 267 controls. To produce risk mortality maps, two different approaches were applied. We mapped (1) adaptive kernel density relative risk estimation (KDE) for case-control studies, which estimates a spatial relative risk function using the ratio between the cases’ and controls’ densities, and (2) estimated odds ratios for case-control study data using generalized additive models (GAMs) to smooth the effect of location, a proxy for exposure, while adjusting for confounding variables. Results No high-risk areas for HM mortality were identified among all subjects (men and women combined) by applying both approaches. Using the adaptive KDE approach, we found a significant increase in death risk only among women in a large area 2–6 km southeast of the refinery, and the application of GAMs also identified a similarly-located significant high-risk area among women only (global p-value<0.025). Potential confounding risk factors we considered in the GAM did not alter the results. Conclusion Both approaches identified a high-risk area close to the refinery among women only. Those spatial methods are useful tools for public policy management to determine priority areas for intervention. Our findings suggest several directions for further research in order to identify other potential environmental exposures that may be assessed in forthcoming studies based on detailed exposure modeling. PMID:26073202
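The kernel density ratio approach (method 1 above) can be sketched with Gaussian KDEs: estimate separate densities for case and control locations and map their log ratio. The coordinates below are synthetic, and scipy's fixed-bandwidth `gaussian_kde` stands in for the adaptive-bandwidth estimator the study used.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Synthetic residential coordinates (km): cases cluster near (2, -4)
controls = rng.normal(0.0, 3.0, size=(2, 400))
cases = np.hstack([rng.normal(0.0, 3.0, size=(2, 120)),
                   rng.normal([[2.0], [-4.0]], 0.8, size=(2, 80))])

f_cases = gaussian_kde(cases)        # density of case locations
f_controls = gaussian_kde(controls)  # density of control locations

def log_relative_risk(xy):
    """Log ratio of case to control densities at locations xy (2 x n)."""
    return np.log(f_cases(xy)) - np.log(f_controls(xy))

inside = log_relative_risk(np.array([[2.0], [-4.0]]))[0]   # inside the cluster
outside = log_relative_risk(np.array([[-4.0], [4.0]]))[0]  # away from it
```

Evaluating `log_relative_risk` on a grid and contouring where it significantly exceeds zero is the essence of the risk maps described above.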
Assessing risks and preventing disease from environmental chemicals.
Dunnette, D A
1989-01-01
In the last 25 years there has been considerable concern expressed about the extent to which chemical agents in the ambient and work environments are contributing to the causation of disease. This concern is a logical extension of our increased knowledge of the real and potential effects of environmental chemicals and the methodological difficulties in applying new knowledge that could help prevent environmentally induced disease. Chemical risk assessment offers an approach to estimating risks and involves consideration of relevant information including identification of chemical hazards, evaluation of the dose-response relationship, estimation of exposure and finally, risk characterization. Particularly significant uncertainties which are inherent in use of this and other risk models include animal-human and low dose-high dose extrapolation and estimation of exposure. Community public health risks from exposure to environmental chemicals appear to be small relative to other public health risks based on information related to cancer trends, dietary intake of synthetic chemicals, assessment data on substances such as DDT and "dioxin," public health effects of hazardous waste sites and contextual considerations. Because of inherent uncertainty in the chemical risk assessment process, however, we need to apply what methods are available in our efforts to prevent disease induced by environmental chemicals. There are a number of societal strategies which can contribute to overall reduction of risk from environmental chemicals. These include acquisition of information on environmental risk including toxicity, intensity and extensity of exposure, biological monitoring, disease surveillance, improvement in epidemiological methods, control of environmental chemical exposures, and dissemination of hazardous chemical information. 
Responsible environmental risk communication and information transfer appear to be among the most important of the available strategies for preventing disease induced by chemicals in the environment.
NASA Astrophysics Data System (ADS)
Cifelli, R.; Mahoney, K. M.; Webb, R. S.; McCormick, B.
2017-12-01
To ensure structural and operational safety of dams and other water management infrastructure, water resources managers and engineers require information about the potential for heavy precipitation. The methods and data used to estimate extreme rainfall amounts for managing risk are based on 40-year-old science and in need of improvement. The need to evaluate new approaches based on the best science available has led the states of Colorado and New Mexico to engage a body of scientists and engineers in an innovative "ensemble approach" to updating extreme precipitation estimates. NOAA is at the forefront of one of three technical approaches that make up the "ensemble study"; the three approaches are conducted concurrently and in collaboration with each other. One approach is the conventional deterministic, "storm-based" method, another is a risk-based regional precipitation frequency estimation tool, and the third is an experimental approach utilizing NOAA's state-of-the-art High Resolution Rapid Refresh (HRRR) physically-based dynamical weather prediction model. The goal of the overall project is to use the individual strengths of these different methods to define an updated and broadly acceptable state of the practice for evaluation and design of dam spillways. This talk will highlight the NOAA research and NOAA's role in the overarching goal to better understand and characterize extreme precipitation estimation uncertainty. The research led by NOAA explores a novel high-resolution dataset and post-processing techniques using a super-ensemble of hourly forecasts from the HRRR model. We also investigate how this rich dataset may be combined with statistical methods to optimally cast the data in probabilistic frameworks. NOAA expertise in the physical processes that drive extreme precipitation is also employed to develop careful testing and improved understanding of the limitations of older estimation methods and assumptions. 
The process of decision making in the midst of uncertainty is a major part of this study. We will speak to how the three approaches may be used in concert with one another to manage risk and enhance resiliency under uncertainty. Finally, the presentation will also address the implications of including climate change in future extreme precipitation estimation studies.
Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura
2008-02-10
Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study results of microbiological analyses were used to develop a robust single plant level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2×10⁶, with a 95% credible interval (CI) of 6.7×10⁶–7.7×10⁶. That would be 34% ± 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by-date was 2 CFU/g, with a 95% CI of 0–14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently there was no need for a thorough national level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of single producer level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. 
The risk-based approach presented in this work can provide estimation of public health risk on which control measures at the plant level can be based.
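The Monte Carlo step for the prevalence estimate can be sketched with a conjugate beta-binomial model: the survey counts, prior and annual sales volume below are assumptions chosen only to land near the magnitudes quoted, and plain NumPy sampling stands in for WinBUGS.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical survey: 63 positive legs among 186 sampled (~34%)
positives, sampled = 63, 186
annual_sales = 21_000_000            # assumed annual number of legs sold

# Beta(1,1) prior + binomial likelihood -> Beta posterior for prevalence
prev = rng.beta(1 + positives, 1 + sampled - positives, size=100_000)
positive_legs = prev * annual_sales  # posterior draws of annual positive legs

mean_est = positive_legs.mean()
ci_lo, ci_hi = np.percentile(positive_legs, [2.5, 97.5])
```

The posterior mean and 2.5th/97.5th percentiles play the role of the point estimate and 95% credible interval reported in the abstract.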
Community Health Risk Assessment of Primary Aluminum Smelter Emissions
Larivière, Claude
2014-01-01
Objective: Primary aluminum production is an industrial process with high potential health risk for workers. We consider in this article how to assess community health risks associated with primary aluminum smelter emissions. Methods: We reviewed the literature on health effects, community exposure data, and dose–response relationships of the principal hazardous agents emitted. Results: On the basis of representative measured community exposure levels, we were able to make rough estimates on health risks associated with specific agents and categorize these as none, low, medium, or high. Conclusions: It is possible to undertake a rough-estimate community Health Risk Assessment for individual smelters on the basis of information available in the epidemiological literature and local community exposure data. PMID:24806724
Evaluation of risk communication in a mammography patient decision aid
Klein, Krystal A.; Watson, Lindsey; Ash, Joan S.; Eden, Karen B.
2016-01-01
Objectives We characterized patients’ comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Methods Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest–posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Results Participants’ positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Conclusions Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Practice implications Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. PMID:26965020
Accelerated Monte Carlo Simulation for Safety Analysis of the Advanced Airspace Concept
NASA Technical Reports Server (NTRS)
Thipphavong, David
2010-01-01
Safe separation of aircraft is a primary objective of any air traffic control system. An accelerated Monte Carlo approach was developed to assess the level of safety provided by a proposed next-generation air traffic control system. It combines features of fault tree and standard Monte Carlo methods. It runs more than one order of magnitude faster than the standard Monte Carlo method while providing risk estimates that only differ by about 10%. It also preserves component-level model fidelity that is difficult to maintain using the standard fault tree method. This balance of speed and fidelity allows sensitivity analysis to be completed in days instead of weeks or months with the standard Monte Carlo method. Results indicate that risk estimates are sensitive to transponder, pilot visual avoidance, and conflict detection failure probabilities.
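The fault-tree/Monte Carlo combination described above can be illustrated with a toy model: enumerate the discrete component-failure combinations exactly (the fault-tree part) and spend Monte Carlo samples only on the uncertain encounter outcome within each combination. All component names, probabilities, and the encounter model below are invented for illustration, not the Advanced Airspace Concept model:

```python
import random
from itertools import product

# Hypothetical component failure probabilities
p_fail = {"transponder": 1e-3, "conflict_detection": 1e-2, "visual": 0.1}

def collision_given(state, rng):
    # Toy encounter model: a collision is possible only if both automated
    # protections have failed; pilot visual avoidance reduces residual risk.
    if state["transponder"] and state["conflict_detection"]:
        return rng.random() < (0.5 if state["visual"] else 0.05)
    return False

def conditioned_mc(samples_per_combo=2000, seed=6):
    """Weight each enumerated failure combination by its exact probability,
    estimating only the conditional collision probability by simulation."""
    rng = random.Random(seed)
    total = 0.0
    for combo in product([False, True], repeat=3):
        state = dict(zip(p_fail, combo))           # True = component failed
        weight = 1.0
        for name, failed in state.items():
            weight *= p_fail[name] if failed else 1 - p_fail[name]
        hits = sum(collision_given(state, rng) for _ in range(samples_per_combo))
        total += weight * hits / samples_per_combo
    return total

risk = conditioned_mc()
```

Because the rare component failures are handled analytically, a few thousand samples per combination resolve a risk on the order of 10⁻⁶ that naive Monte Carlo would need tens of millions of samples to estimate — the kind of speedup the abstract reports.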
Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter; Ghali, William A; Marshall, Deborah A
2018-01-01
Kaplan-Meier survival analysis overestimates cumulative incidence in competing risks (CRs) settings. The extent of overestimation (or its clinical significance) has been questioned, and CRs methods are infrequently used. This meta-analysis compares the Kaplan-Meier method to the cumulative incidence function (CIF), a CRs method. We searched MEDLINE, EMBASE, BIOSIS Previews, Web of Science (1992-2016), and article bibliographies for studies estimating cumulative incidence using the Kaplan-Meier method and CIF. For studies with sufficient data, we calculated pooled risk ratios (RRs) comparing Kaplan-Meier and CIF estimates using DerSimonian and Laird random effects models. We performed stratified meta-analyses by clinical area, rate of CRs (CRs/events of interest), and follow-up time. Of 2,192 identified abstracts, we included 77 studies in the systematic review and meta-analyzed 55. The pooled RR demonstrated the Kaplan-Meier estimate was 1.41 [95% confidence interval (CI): 1.36, 1.47] times higher than the CIF. Overestimation was highest among studies with high rates of CRs [RR = 2.36 (95% CI: 1.79, 3.12)], studies related to hepatology [RR = 2.60 (95% CI: 2.12, 3.19)], and obstetrics and gynecology [RR = 1.84 (95% CI: 1.52, 2.23)]. The Kaplan-Meier method overestimated the cumulative incidence across 10 clinical areas. Using CRs methods will ensure accurate results inform clinical and policy decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
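The overestimation compared in this meta-analysis can be reproduced on toy data: the Kaplan-Meier complement treats competing events as censoring, while the Aalen-Johansen cumulative incidence function accounts for them. A pure-Python sketch (the data are invented; magnitudes are only illustrative):

```python
def km_complement(data, cause=1):
    # 1 - Kaplan-Meier survival for `cause`, treating competing events as censoring
    surv = 1.0
    for t in sorted({ti for ti, ci in data if ci == cause}):
        at_risk = sum(1 for ti, ci in data if ti >= t)
        d = sum(1 for ti, ci in data if ti == t and ci == cause)
        surv *= 1 - d / at_risk
    return 1 - surv

def cif(data, cause=1):
    # Aalen-Johansen cumulative incidence for `cause`
    overall_surv, inc = 1.0, 0.0
    for t in sorted({ti for ti, ci in data if ci != 0}):
        at_risk = sum(1 for ti, ci in data if ti >= t)
        d_cause = sum(1 for ti, ci in data if ti == t and ci == cause)
        d_all = sum(1 for ti, ci in data if ti == t and ci != 0)
        inc += overall_surv * d_cause / at_risk
        overall_surv *= 1 - d_all / at_risk
    return inc

# Toy data: (time, cause); 0 = censored, 1 = event of interest, 2 = competing event
data = [(1, 1), (2, 2), (3, 1), (4, 2), (5, 1),
        (6, 0), (7, 2), (8, 1), (9, 0), (10, 2)]
```

On this sample the Kaplan-Meier complement exceeds the CIF (0.5625 vs 0.425), mirroring the pooled risk ratio above 1 reported in the meta-analysis.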
Cost Risk Analysis Based on Perception of the Engineering Process
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.
1986-01-01
In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of costs for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of the expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This introduces errors from at least three sources. The historical cost data may be in error by some unknown amount. The manager's evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed.
The engineering process parameters are elicited from the engineer/expert on the project and are based on that expert's technical knowledge. These are converted by a parametric cost model into a cost estimate. The method discussed makes no assumptions about the form of the distribution of possible costs, and is not tied to the analysis of previous projects, except through the expert calibrations performed by the parametric cost analyst.
Regression dilution bias: tools for correction methods and sample size calculation.
Berglund, Lars
2012-08-01
Random errors in measurement of a risk factor will introduce downward bias of an estimated association to a disease or a disease marker. This phenomenon is called regression dilution bias. A bias correction may be made with data from a validity study or a reliability study. In this article we give a non-technical description of designs of reliability studies with emphasis on selection of individuals for a repeated measurement, assumptions of measurement error models, and correction methods for the slope in a simple linear regression model where the dependent variable is a continuous variable. Also, we describe situations where correction for regression dilution bias is not appropriate. The methods are illustrated with the association between insulin sensitivity measured with the euglycaemic insulin clamp technique and fasting insulin, where measurement of the latter variable carries noticeable random error. We provide software tools for estimation of a corrected slope in a simple linear regression model assuming data for a continuous dependent variable and a continuous risk factor from a main study and an additional measurement of the risk factor in a reliability study. Also, we supply programs for estimation of the number of individuals needed in the reliability study and for choice of its design. Our conclusion is that correction for regression dilution bias is seldom applied in epidemiological studies. This may cause important effects of risk factors with large measurement errors to be neglected.
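The slope correction described above can be sketched numerically: with a repeated measurement of the risk factor from a reliability study, the reliability ratio is estimated and the attenuated slope divided by it. A simulation in Python under classical measurement-error assumptions (all numbers synthetic, not the insulin-clamp data):

```python
import random

random.seed(2)
n = 500
true_x = [random.gauss(0, 1) for _ in range(n)]
x1 = [x + random.gauss(0, 1) for x in true_x]        # error-prone measurement
x2 = [x + random.gauss(0, 1) for x in true_x]        # repeat from reliability study
y = [2.0 * x + random.gauss(0, 0.5) for x in true_x]  # true slope = 2

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)

naive_slope = cov(x1, y) / cov(x1, x1)        # attenuated toward zero
reliability = cov(x1, x2) / cov(x1, x1)       # estimated reliability ratio
corrected_slope = naive_slope / reliability   # regression dilution correction
```

With equal true and error variances the reliability ratio is about 0.5, so the naive slope of roughly 1 is corrected back toward the true value of 2 — the downward bias the abstract describes.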
Estimating gestational age is usually based on last menstrual period date (LMP) or clinical estimation (CGA); both approaches introduce error and potential bias. Differences in the two methods of gestational age assignment may lead to misclassification and differences in risk est...
Comparison of risk estimates for selected diseases and causes of death.
Merrill, R M; Kessler, L G; Udler, J M; Rasband, G C; Feuer, E J
1999-02-01
Lifetime risk estimates of disease are limited by long-term data extrapolations and are less relevant to individuals who have already lived a period of time without the disease, but are approaching the age at which the disease risk becomes common. In contrast, short-term age-conditional risk estimates, such as the risk of developing a disease in the next 10 years among those alive and free of the disease at a given age, are less restricted by long-term extrapolation of current rates and can present patients with risk information tailored to their age. This study focuses on short-term age-conditional risk estimates for a broad set of important chronic diseases and nondisease causes of death among white and black men and women. The Feuer et al. (1993, Journal of the National Cancer Institute) [15] method was applied to data from a variety of sources to obtain risk estimates for select cancers, myocardial infarction, diabetes mellitus, multiple sclerosis, Alzheimer's, and death from motor vehicle accidents, homicide or legal intervention, and suicide. Acute deaths from suicide, homicide or legal intervention, and fatal motor vehicle accidents dominate the risk picture for persons in their 20s, with only diabetes mellitus and end-stage renal disease therapy (for blacks only) having similar levels of risk in this age range. Late in life, cancer, acute myocardial infarction, Alzheimer's, and stroke become most common. The chronic diseases affecting the population later in life present the most likely diseases someone will face. Several interesting differences in disease and death risks were derived and reported among age-specific race and gender subgroups of the population. 
Presentation of risk estimates for a broad set of chronic diseases and nondisease causes of death within short-term age ranges among population subgroups provides tailored information that may lead to better educated prevention, screening, and control behaviors and more efficient allocation of health resources.
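A short-term age-conditional risk of the kind described above can be computed from age-specific hazards by tracking the probability of remaining alive and disease-free. A simplified sketch (the annual rates are invented placeholders, not values from the study's data sources):

```python
# Hypothetical annual hazards by decade of attained age
incidence = {40: 0.002, 50: 0.005, 60: 0.010}   # disease of interest
mortality = {40: 0.003, 50: 0.008, 60: 0.020}   # competing death from other causes

def ten_year_risk(start_age):
    """Risk of developing the disease in the next 10 years, conditional on
    being alive and disease-free at start_age."""
    alive_free, risk = 1.0, 0.0
    for age in range(start_age, start_age + 10):
        decade = (age // 10) * 10
        risk += alive_free * incidence[decade]
        alive_free *= 1 - incidence[decade] - mortality[decade]
    return risk

r40, r50 = ten_year_risk(40), ten_year_risk(50)   # risk rises with attained age
```

Conditioning on survival to the starting age is what tailors the estimate to an individual's current age, in contrast to a lifetime risk extrapolated from birth.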
Forecasting extinction risk with nonstationary matrix models.
Gotelli, Nicholas J; Ellison, Aaron M
2006-02-01
Matrix population growth models are standard tools for forecasting population change and for managing rare species, but they are less useful for predicting extinction risk in the face of changing environmental conditions. Deterministic models provide point estimates of lambda, the finite rate of increase, as well as measures of matrix sensitivity and elasticity. Stationary matrix models can be used to estimate extinction risk in a variable environment, but they assume that the matrix elements are randomly sampled from a stationary (i.e., non-changing) distribution. Here we outline a method for using nonstationary matrix models to construct realistic forecasts of population fluctuation in changing environments. Our method requires three pieces of data: (1) field estimates of transition matrix elements, (2) experimental data on the demographic responses of populations to altered environmental conditions, and (3) forecasting data on environmental drivers. These three pieces of data are combined to generate a series of sequential transition matrices that emulate a pattern of long-term change in environmental drivers. Realistic estimates of population persistence and extinction risk can be derived from stochastic permutations of such a model. We illustrate the steps of this analysis with data from two populations of Sarracenia purpurea growing in northern New England. Sarracenia purpurea is a perennial carnivorous plant that is potentially at risk of local extinction because of increased nitrogen deposition. Long-term monitoring records or models of environmental change can be used to generate time series of driver variables under different scenarios of changing environments. Both manipulative and natural experiments can be used to construct a linking function that describes how matrix parameters change as a function of the environmental driver. 
This synthetic modeling approach provides quantitative estimates of extinction probability that have an explicit mechanistic basis.
Weng, Lu-Chen; Roetker, Nicholas S; Lutsey, Pamela L; Alonso, Alvaro; Guan, Weihua; Pankow, James S; Folsom, Aaron R; Steffen, Lyn M; Pankratz, Nathan; Tang, Weihong
2018-01-01
Studies have reported that higher circulating levels of total cholesterol (TC) and low-density lipoprotein (LDL) cholesterol and lower levels of high-density lipoprotein (HDL) cholesterol may be associated with increased risk of abdominal aortic aneurysm (AAA). Whether dyslipidemia causes AAA is still unclear and is potentially testable using a Mendelian randomization (MR) approach. We investigated the associations between blood lipids and AAA using two-sample MR analysis with SNP-lipids association estimates from a published genome-wide association study of blood lipids (n = 188,577) and SNP-AAA association estimates from European Americans (EAs) of the Atherosclerosis Risk in Communities (ARIC) study (n = 8,793). We used inverse variance weighted (IVW) MR as the primary method and MR-Egger regression and weighted median MR estimation as sensitivity analyses. Over a median of 22.7 years of follow-up, 338 of 8,793 ARIC participants experienced incident clinical AAA. Using the IVW method, we observed positive associations of plasma LDL cholesterol and TC with the risk of AAA (odds ratio (OR) = 1.55, P = 0.02 for LDL cholesterol and OR = 1.61, P = 0.01 for TC per 1 standard deviation of lipid increment). Using the MR-Egger regression and weighted median methods, we were able to validate the association of AAA risk with TC, although the associations were less consistent for LDL cholesterol due to wider confidence intervals. Triglycerides and HDL cholesterol were not associated with AAA in any of the MR methods. Assuming instrumental variable assumptions are satisfied, our finding suggests that higher plasma TC and LDL cholesterol are causally associated with the increased risk of AAA in EAs.
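The inverse variance weighted (IVW) estimator used as the primary method above combines per-SNP effect estimates from the two samples. A minimal sketch with invented summary statistics (not the published GWAS or ARIC estimates):

```python
# Two-sample MR, IVW estimator on illustrative summary statistics.
# beta_exp: SNP-lipid associations; beta_out / se_out: SNP-AAA associations.
beta_exp = [0.12, 0.20, 0.15, 0.30, 0.25]
beta_out = [0.055, 0.082, 0.070, 0.135, 0.110]
se_out = [0.030, 0.040, 0.035, 0.050, 0.045]

# IVW: weighted regression of outcome effects on exposure effects through the
# origin, weights 1/se_out^2 (equivalently, a weighted average of Wald ratios)
num = sum(bx * by / s**2 for bx, by, s in zip(beta_exp, beta_out, se_out))
den = sum(bx**2 / s**2 for bx, s in zip(beta_exp, se_out))
ivw_beta = num / den
ivw_se = den ** -0.5
```

The estimate is valid only under the instrumental-variable assumptions the abstract flags; MR-Egger and weighted-median estimators are the usual sensitivity checks when some instruments may be pleiotropic.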
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumann, Brian C.; He, Jiwei; Hwang, Wei-Ting
Purpose: To inform prospective trials of adjuvant radiation therapy (adj-RT) for bladder cancer after radical cystectomy, a locoregional failure (LF) risk stratification was proposed. This stratification was developed and validated using surgical databases that may not reflect the outcomes expected in prospective trials. Our purpose was to assess sources of bias that may affect the stratification model's validity or alter the LF risk estimates for each subgroup: time bias due to evolving surgical techniques; trial accrual bias due to inclusion of patients who would be ineligible for adj-RT trials because of early disease progression, death, or loss to follow-up shortly after cystectomy; bias due to different statistical methods to estimate LF; and subgrouping bias due to different definitions of the LF subgroups. Methods and Materials: The LF risk stratification was developed using a single-institution cohort (n=442, 1990-2008) and the multi-institutional SWOG 8710 cohort (n=264, 1987-1998) treated with radical cystectomy with or without chemotherapy. We evaluated the sensitivity of the stratification to sources of bias using Fine-Gray regression and Kaplan-Meier analyses. Results: Year of radical cystectomy was not associated with LF risk on univariate or multivariate analysis after controlling for risk group. By use of more stringent inclusion criteria, 26 SWOG patients (10%) and 60 patients from the single-institution cohort (14%) were excluded. Analysis of the remaining patients confirmed 3 subgroups with significantly different LF risks with 3-year rates of 7%, 17%, and 36%, respectively (P<.01), nearly identical to the rates without correcting for trial accrual bias. Kaplan-Meier techniques estimated higher subgroup LF rates than competing risk analysis. The subgroup definitions used in the NRG-GU001 adj-RT trial were validated.
Conclusions: These sources of bias did not invalidate the LF risk stratification or substantially change the model's LF estimates.
Estimating aquatic hazards posed by prescription pharmaceutical residues from municipal wastewater
Risks posed by pharmaceuticals in the environment are hard to estimate due to limited monitoring capacity and difficulty interpreting monitoring results. In order to partially address these issues, we suggest a method for prioritizing pharmaceuticals for monitoring, and a framewo...
Stram, Daniel O; Leigh Pearce, Celeste; Bretsky, Phillip; Freedman, Matthew; Hirschhorn, Joel N; Altshuler, David; Kolonel, Laurence N; Henderson, Brian E; Thomas, Duncan C
2003-01-01
The US National Cancer Institute has recently sponsored the formation of a Cohort Consortium (http://2002.cancer.gov/scpgenes.htm) to facilitate the pooling of data on very large numbers of people, concerning the effects of genes and environment on cancer incidence. One likely goal of these efforts will be to generate a large population-based case-control series for which a number of candidate genes will be investigated using SNP haplotype as well as genotype analysis. The goal of this paper is to outline the issues involved in choosing a method of obtaining haplotype-specific risk estimates for such data that is technically appropriate and yet attractive to epidemiologists who are already comfortable with odds ratios and logistic regression. Our interest is to develop and evaluate extensions of methods, based on haplotype imputation, that have been recently described (Schaid et al., Am J Hum Genet, 2002, and Zaykin et al., Hum Hered, 2002) as providing score tests of the null hypothesis of no effect of SNP haplotypes upon risk, which may be used for more complex tasks, such as providing confidence intervals, and tests of equivalence of haplotype-specific risks in two or more separate populations. In order to do so we (1) develop a cohort approach towards odds ratio analysis by expanding the E-M algorithm to provide maximum likelihood estimates of haplotype-specific odds ratios as well as genotype frequencies; (2) show how to correct the cohort approach, to give essentially unbiased estimates for population-based or nested case-control studies by incorporating the probability of selection as a case or control into the likelihood, based on a simplified model of case and control selection, and (3) finally, in an example data set (CYP17 and breast cancer, from the Multiethnic Cohort Study) we compare likelihood-based confidence interval estimates from the two methods with each other, and with the use of the single-imputation approach of Zaykin et al. 
applied under both null and alternative hypotheses. We conclude that so long as haplotypes are well predicted by SNP genotypes (we use the Rh2 criteria of Stram et al. [1]) the differences between the three methods are very small and in particular that the single imputation method may be expected to work extremely well. Copyright 2003 S. Karger AG, Basel
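The E-M machinery underlying these haplotype methods can be illustrated in its simplest form: estimating two-SNP haplotype frequencies from unphased genotypes, where only double heterozygotes are phase-ambiguous. A self-contained sketch with a toy sample (not the CYP17 data, and without the odds-ratio extension the paper develops):

```python
def em_haplotypes(genotypes, iters=100):
    """E-M for haplotype frequencies over two biallelic SNPs: '00','01','10','11'.
    Each genotype is (copies of allele '1' at SNP1, copies at SNP2)."""
    freqs = dict.fromkeys(("00", "01", "10", "11"), 0.25)
    for _ in range(iters):
        counts = dict.fromkeys(freqs, 0.0)
        for g1, g2 in genotypes:
            if (g1, g2) == (1, 1):
                # E-step: split double heterozygotes between the two phasings
                p_cis = freqs["00"] * freqs["11"]
                p_trans = freqs["01"] * freqs["10"]
                w = p_cis / (p_cis + p_trans) if p_cis + p_trans else 0.5
                for h, add in (("00", w), ("11", w), ("01", 1 - w), ("10", 1 - w)):
                    counts[h] += add
            else:
                # At most one heterozygous SNP: the haplotype pair is unambiguous
                a1 = ["0"] * (2 - g1) + ["1"] * g1
                a2 = ["0"] * (2 - g2) + ["1"] * g2
                counts[a1[0] + a2[0]] += 1
                counts[a1[1] + a2[1]] += 1
        total = sum(counts.values())
        freqs = {h: c / total for h, c in counts.items()}   # M-step
    return freqs

# Toy sample in strong linkage disequilibrium (mostly 00/11 haplotypes)
sample = [(0, 0)] * 30 + [(2, 2)] * 30 + [(1, 1)] * 30 + [(1, 0)] * 5 + [(0, 1)] * 5
f = em_haplotypes(sample)
```

The paper's cohort approach expands exactly this kind of E-M step to also carry haplotype-specific odds ratios, and then corrects the likelihood for case-control sampling.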
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using the Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies this method to daily river flow rate forecasting and uncertainty quantification for Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a relatively narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. The Bayesian MCMC method might be more favorable in uncertainty analysis and risk management.
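The core of the approach, random-walk Metropolis-Hastings sampling from a posterior, can be sketched on a deliberately simple problem: inferring the mean of Gaussian observations with known variance and a flat prior (synthetic data standing in for flow rates; this is not the paper's hydrological model):

```python
import math
import random

random.seed(4)
data = [random.gauss(3.0, 1.0) for _ in range(50)]   # synthetic observations

def log_post(mu):
    # Flat prior on mu; Gaussian likelihood with known unit variance
    return -0.5 * sum((x - mu) ** 2 for x in data)

def metropolis_hastings(n_iter=5000, step=0.3, start=0.0, burn_in=1000):
    mu, lp = start, log_post(start)
    chain = []
    for _ in range(n_iter):
        prop = mu + random.gauss(0, step)            # random-walk proposal
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:  # accept/reject
            mu, lp = prop, lp_prop
        chain.append(mu)
    return chain[burn_in:]

chain = metropolis_hastings()
post_mean = sum(chain) / len(chain)
```

Credible intervals then come directly from quantiles of the retained chain, which is how the Bayesian method delivers the uncertainty quantification compared against MLE confidence intervals in the paper.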
Hsu, Chiu-Hsieh; Li, Yisheng; Long, Qi; Zhao, Qiuhong; Lance, Peter
2011-01-01
In colorectal polyp prevention trials, estimation of the rate of recurrence of adenomas at the end of the trial may be complicated by dependent censoring, that is, time to follow-up colonoscopy and dropout may be dependent on time to recurrence. Assuming that the auxiliary variables capture the dependence between recurrence and censoring times, we propose to fit two working models with the auxiliary variables as covariates to define risk groups and then extend an existing weighted logistic regression method for independent censoring to each risk group to accommodate potential dependent censoring. In a simulation study, we show that the proposed method results in both a gain in efficiency and reduction in bias for estimating the recurrence rate. We illustrate the methodology by analyzing a recurrent adenoma dataset from a colorectal polyp prevention trial. PMID:22065985
Competing risks regression for clustered data
Zhou, Bingqing; Fine, Jason; Latouche, Aurelien; Labopin, Myriam
2012-01-01
A population average regression model is proposed to assess the marginal effects of covariates on the cumulative incidence function when there is dependence across individuals within a cluster in the competing risks setting. This method extends the Fine–Gray proportional hazards model for the subdistribution to situations where individuals within a cluster may be correlated due to unobserved shared factors. Estimators of the regression parameters in the marginal model are developed under an independence working assumption where the correlation across individuals within a cluster is completely unspecified. The estimators are consistent and asymptotically normal, and variance estimation may be achieved without specifying the form of the dependence across individuals. A simulation study demonstrates that the inferential procedures perform well with realistic sample sizes. The practical utility of the methods is illustrated with data from the European Bone Marrow Transplant Registry. PMID:22045910
Assessing Interval Estimation Methods for Hill Model ...
The Hill model of concentration-response is ubiquitous in toxicology, perhaps because its parameters directly relate to biologically significant metrics of toxicity such as efficacy and potency. Point estimates of these parameters obtained through least squares regression or maximum likelihood are commonly used in high-throughput risk assessment, but such estimates typically fail to include reliable information concerning confidence in (or precision of) the estimates. To address this issue, we examined methods for assessing uncertainty in Hill model parameter estimates derived from concentration-response data. In particular, using a sample of ToxCast concentration-response data sets, we applied four methods for obtaining interval estimates that are based on asymptotic theory, bootstrapping (two varieties), and Bayesian parameter estimation, and then compared the results. These interval estimation methods generally did not agree, so we devised a simulation study to assess their relative performance. We generated simulated data by constructing four statistical error models capable of producing concentration-response data sets comparable to those observed in ToxCast. We then applied the four interval estimation methods to the simulated data and compared the actual coverage of the interval estimates to the nominal coverage (e.g., 95%) in order to quantify performance of each of the methods in a variety of cases (i.e., different values of the true Hill model parameters).
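One of the bootstrap varieties compared above, case resampling, can be sketched end to end for a Hill curve. This illustration fixes the Hill slope at 1 and fits by crude grid-search least squares to stay self-contained (synthetic data; not ToxCast's fitting pipeline):

```python
import random

def hill(c, top, ac50):
    # Hill model with slope fixed at 1: response = top * c / (ac50 + c)
    return top * c / (ac50 + c)

def fit(concs, resps):
    # Crude grid-search least squares over (top, ac50), illustrative only
    best = None
    for top in [i * 0.05 for i in range(1, 41)]:            # 0.05 .. 2.0
        for ac50 in [10 ** (e / 10) for e in range(-20, 21)]:  # 0.01 .. 100
            sse = sum((r - hill(c, top, ac50)) ** 2 for c, r in zip(concs, resps))
            if best is None or sse < best[0]:
                best = (sse, top, ac50)
    return best[1], best[2]

random.seed(5)
concs = [0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30]
resps = [hill(c, 1.0, 1.0) + random.gauss(0, 0.05) for c in concs]

top_hat, ac50_hat = fit(concs, resps)

# Nonparametric (case-resampling) bootstrap percentile interval for the AC50
boot = []
for _ in range(100):
    idx = [random.randrange(len(concs)) for _ in concs]
    _, a = fit([concs[i] for i in idx], [resps[i] for i in idx])
    boot.append(a)
boot.sort()
ci = (boot[2], boot[97])   # approximate 95% percentile interval
```

Comparing the actual coverage of such intervals to their nominal 95% over many simulated data sets is exactly the performance check the simulation study above carries out.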
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
Stability basin estimates fall risk from observed kinematics, demonstrated on the Sit-to-Stand task.
Shia, Victor; Moore, Talia Yuki; Holmes, Patrick; Bajcsy, Ruzena; Vasudevan, Ram
2018-04-27
The ability to quantitatively measure stability is essential to ensuring the safety of locomoting systems. While the response to perturbation directly reflects the stability of a motion, this experimental method puts human subjects at risk. Unfortunately, existing indirect methods for estimating stability from unperturbed motion have been shown to have limited predictive power. This paper leverages recent advances in dynamical systems theory to accurately estimate the stability of human motion without requiring perturbation. This approach relies on kinematic observations of a nominal Sit-to-Stand motion to construct an individual-specific dynamic model, input bounds, and feedback control that are then used to compute the set of perturbations from which the model can recover. This set, referred to as the stability basin, was computed for 14 individuals, and was able to successfully differentiate between less and more stable Sit-to-Stand strategies for each individual with greater accuracy than existing methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
Increased Cancer Risks in Myotonic Dystrophy
Win, Aung Ko; Perattur, Promilla G.; Pulido, Jose S.; Pulido, Christine M.; Lindor, Noralane M.
2012-01-01
Objective To estimate cancer risks for patients with myotonic dystrophy, given that increased risks for neoplasms in association with myotonic dystrophy type 1 and type 2 have been suggested in several studies but the risks of cancers have not been quantified. Patients and Methods A cohort of 307 patients with myotonic dystrophy identified from medical records of Mayo Clinic in Rochester, MN, from January 1, 1993, through May 28, 2010, was retrospectively analyzed. We estimated standardized incidence ratios (SIRs) of specific cancers for patients with myotonic dystrophy compared with age- and sex-specific cancer incidences of the general population. Age-dependent cumulative risks were calculated using the Kaplan-Meier method. Results A total of 53 cancers were observed at a median age at diagnosis of 55 years. Patients with myotonic dystrophy had an increased risk of thyroid cancer (SIR, 5.54; 95% confidence interval [CI], 1.80-12.93; P=.001) and choroidal melanoma (SIR, 27.54; 95% CI, 3.34-99.49; P<.001). They may also have an increased risk of testicular cancer (SIR, 5.09; 95% CI, 0.62-18.38; P=.06) and prostate cancer (SIR, 2.21; 95% CI, 0.95-4.35; P=.05). The estimated cumulative risks at age 50 years were 1.72% (95% CI, 0.64%-4.55%) for thyroid cancer and 1.00% (95% CI, 0.25%-3.92%) for choroidal melanoma. There was no statistical evidence of an increased risk of brain, breast, colorectal, lung, renal, bladder, endometrial, or ovarian cancer; lymphoma; leukemia; or multiple myeloma. Conclusion Patients with myotonic dystrophy may have an increased risk of thyroid cancer and choroidal melanoma and, possibly, testicular and prostate cancers. PMID:22237010
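A standardized incidence ratio of the kind reported above is the observed case count divided by the count expected from population rates; its confidence interval follows from the Poisson distribution of the observed count. A sketch using Byar's approximation to the exact Poisson limits, with hypothetical counts chosen to resemble the thyroid-cancer result (not the study's actual data):

```python
import math

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio with Byar's approximate 95% CI.
    Byar's method approximates exact Poisson limits on the observed count."""
    o = observed
    sir = o / expected
    lo = (o / expected) * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3
    hi = ((o + 1) / expected) * (1 - 1 / (9 * (o + 1)) + z / (3 * math.sqrt(o + 1))) ** 3
    return sir, lo, hi

# Hypothetical: 6 thyroid cancers observed vs 1.08 expected from population rates
sir, lo, hi = sir_with_ci(6, 1.08)
```

A lower confidence limit above 1 indicates a statistically significant excess over the general-population incidence, which is how the elevated SIRs in the abstract are read.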
Simmons, Rebecca K.; Coleman, Ruth L.; Price, Hermione C.; Holman, Rury R.; Khaw, Kay-Tee; Wareham, Nicholas J.; Griffin, Simon J.
2009-01-01
OBJECTIVE The purpose of this study was to examine the performance of the UK Prospective Diabetes Study (UKPDS) Risk Engine (version 3) and the Framingham risk equations (2008) in estimating cardiovascular disease (CVD) incidence in three populations: 1) individuals with known diabetes; 2) individuals with nondiabetic hyperglycemia, defined as A1C ≥6.0%; and 3) individuals with normoglycemia defined as A1C <6.0%. RESEARCH DESIGN AND METHODS This was a population-based prospective cohort (European Prospective Investigation of Cancer-Norfolk). Participants aged 40–79 years recruited from U.K. general practices attended a health examination (1993–1998) and were followed for CVD events/death until April 2007. CVD risk estimates were calculated for 10,137 individuals. RESULTS Over 10.1 years, there were 69 CVD events in the diabetes group (25.4%), 160 in the hyperglycemia group (17.7%), and 732 in the normoglycemia group (8.2%). Estimated CVD 10-year risk in the diabetes group was 33 and 37% using the UKPDS and Framingham equations, respectively. In the hyperglycemia group, estimated CVD risks were 31 and 22%, respectively, and for the normoglycemia group risks were 20 and 14%, respectively. There were no significant differences in the ability of the risk equations to discriminate between individuals at different risk of CVD events in each subgroup; both equations overestimated CVD risk. The Framingham equations performed better in the hyperglycemia and normoglycemia groups as they did not overestimate risk as much as the UKPDS Risk Engine, and they classified more participants correctly. CONCLUSIONS Both the UKPDS Risk Engine and Framingham risk equations were moderately effective at ranking individuals and are therefore suitable for resource prioritization. However, both overestimated true risk, which is important when one is using scores to communicate prognostic information to individuals. PMID:19114615
Trends in Worker Hearing Loss by Industry Sector, 1981–2010
Masterson, Elizabeth A.; Deddens, James A.; Themann, Christa L.; Bertke, Stephen; Calvert, Geoffrey M.
2015-01-01
Background The purpose of this study was to estimate the incidence and prevalence of hearing loss for noise-exposed U.S. workers by industry sector and 5-year time period, covering 30 years. Methods Audiograms for 1.8 million workers from 1981–2010 were examined. Incidence and prevalence were estimated by industry sector and time period. The adjusted risk of incident hearing loss within each time period and industry sector as compared with a reference time period was also estimated. Results The adjusted risk for incident hearing loss decreased over time when all industry sectors were combined. However, the risk remained high for workers in Healthcare and Social Assistance, and the prevalence was consistently high for Mining and Construction workers. Conclusions While progress has been made in reducing the risk of incident hearing loss within most industry sectors, additional efforts are needed within Mining, Construction and Healthcare and Social Assistance. PMID:25690583
Stupp, Paul; Okoroh, Ekwutosi; Besera, Ghenet; Goodman, David; Danel, Isabella
2016-01-01
Objectives In 1996, the U.S. Congress passed legislation making female genital mutilation/cutting (FGM/C) illegal in the United States. CDC published the first estimates of the number of women and girls at risk for FGM/C in 1997. Since 2012, various constituencies have again raised concerns about the practice in the United States. We updated an earlier estimate of the number of women and girls in the United States who were at risk for FGM/C or its consequences. Methods We estimated the number of women and girls who were at risk for undergoing FGM/C or its consequences in 2012 by applying country-specific prevalence of FGM/C to the estimated number of women and girls living in the United States who were born in that country or who lived with a parent born in that country. Results Approximately 513,000 women and girls in the United States were at risk for FGM/C or its consequences in 2012, which was more than three times higher than the earlier estimate, based on 1990 data. The increase in the number of women and girls younger than 18 years of age at risk for FGM/C was more than four times that of previous estimates. Conclusion The estimated increase was wholly a result of rapid growth in the number of immigrants from FGM/C-practicing countries living in the United States and not from increases in FGM/C prevalence in those countries. Scientifically valid information regarding whether women or their daughters have actually undergone FGM/C and related information that can contribute to efforts to prevent the practice in the United States and provide needed health services to women who have undergone FGM/C are needed. PMID:26957669
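A minimal sketch of the estimation rule described in the Methods: country-specific FGM/C prevalence applied to the corresponding U.S. resident counts and summed. The two countries and all figures below are hypothetical, chosen only to illustrate the arithmetic.

```python
# Apply country-specific prevalence to the number of US women/girls born in
# (or living with a parent born in) each country, then sum across countries.
def total_at_risk(prevalence, population):
    return sum(prevalence[c] * population[c] for c in prevalence)

# Hypothetical example: two origin countries with different prevalences
n = total_at_risk({"A": 0.90, "B": 0.50}, {"A": 100_000, "B": 200_000})
```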
Information technologies for taking into account risks in business development programme
NASA Astrophysics Data System (ADS)
Kalach, A. V.; Khasianov, R. R.; Rossikhina, L. V.; Zybin, D. G.; Melnik, A. A.
2018-05-01
The paper describes information technologies for accounting for risk in a business development programme. They rely on an algorithm for assessing the risks of programme projects and an algorithm for forming the programme under constrained financing of high-risk projects. A lower-bound estimation method is proposed for subsets of solutions; the corresponding theorem and lemma are stated and proved.
Risk Decision Making Model for Reservoir Floodwater resources Utilization
NASA Astrophysics Data System (ADS)
Huang, X.
2017-12-01
Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but it carries risks. To utilize floodwater resources safely and efficiently, the risk of reservoir FRU must be studied. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safe discharge are estimated. Based on the principles of minimum risk and maximum benefit of FRU, a multi-objective risk decision making model for FRU is constructed. Probability theory and mathematical statistics are used to calculate the risk rate; the C-D production function method and emergy analysis are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision making problem is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution for FRU of the Shilianghe reservoir is found using the risk decision making model, and the validity and applicability of the model are verified.
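A hedged sketch of the first quantity in the abstract, the risk rate of exceeding the design flood water level, assuming a lognormal flood-peak distribution. The distribution choice and parameters are illustrative assumptions, not Shilianghe reservoir data.

```python
import math
from statistics import NormalDist

# P(flood peak > level) for a lognormal peak distribution: transform to a
# standard normal tail probability.
def exceedance_risk(level, mu_log, sigma_log):
    z = (math.log(level) - mu_log) / sigma_log
    return 1.0 - NormalDist().cdf(z)

# Illustrative design level one log-standard-deviation above the log-mean
p_exceed = exceedance_risk(math.exp(2.5), mu_log=2.0, sigma_log=0.5)
```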
Thomas, H L; Andrews, N; Green, H K; Boddington, N L; Zhao, H; Reynolds, A; McMenamin, J; Pebody, R G
2014-01-01
Methods for estimating vaccine effectiveness (VE) against severe influenza are not well established. We used the screening method to estimate VE against influenza resulting in intensive care unit (ICU) admission in England and Scotland in 2011/2012. We extracted data on confirmed influenza ICU cases from severe influenza surveillance systems, and obtained their 2011/2012 trivalent influenza vaccine (TIV) status from primary care. We compared case vaccine uptake with population vaccine uptake obtained from routine monitoring systems, adjusting for age group, specific risk group, region and week. Of 60 influenza ICU cases reported, vaccination status was available for 56 (93%). Adjusted VE against ICU admission for those aged ≥ 65 years was -10% [95% confidence interval (CI) -207 to 60], consistent with evidence of poor protection from the 2011/2012 TIV. Adjusted VE for those aged <65 years in risk groups was -296% (95% CI -930 to -52), suggesting significant residual confounding using the screening method in those subject to selective vaccination.
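The screening method compares vaccine uptake in cases with uptake in the population. A minimal sketch of the unadjusted calculation (the study adjusted for age, risk group, region and week; the uptake figures below are illustrative, not the study's data):

```python
# Screening-method VE: 1 - [PCV/(1-PCV)] / [PPV/(1-PPV)], where PCV is the
# proportion of cases vaccinated and PPV the population vaccine uptake.
def screening_ve(pcv, ppv):
    return 1.0 - (pcv / (1.0 - pcv)) / (ppv / (1.0 - ppv))

# Illustrative: cases less vaccinated than the population gives VE > 0;
# cases more vaccinated than the population (as here) gives negative VE.
ve = screening_ve(0.50, 0.60)
```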
Raghavan, Sridharan; Porneala, Bianca; McKeown, Nicola; Fox, Caroline S.; Dupuis, Josée; Meigs, James B.
2015-01-01
Aims/hypothesis Type 2 diabetes mellitus in parents is a strong determinant of diabetes risk in their offspring. We hypothesise that offspring diabetes risk associated with parental diabetes is mediated by metabolic risk factors. Methods We studied initially non-diabetic participants of the Framingham Offspring Study. Metabolic risk was estimated using beta cell corrected insulin response (CIR), HOMA-IR or a count of metabolic syndrome components (metabolic syndrome score [MSS]). Dietary risk and physical activity were estimated using questionnaire responses. Genetic risk score (GRS) was estimated as the count of 62 type 2 diabetes risk alleles. The outcome of incident diabetes in offspring was examined across levels of parental diabetes exposure, accounting for sibling correlation and adjusting for age, sex and putative mediators. The proportion mediated was estimated by comparing regression coefficients for parental diabetes with (βadj) and without (βunadj) adjustments for CIR, HOMA-IR, MSS and GRS (percentage mediated = 1 – βadj / βunadj). Results Metabolic factors mediated 11% of offspring diabetes risk associated with parental diabetes, corresponding to a reduction in OR per diabetic parent from 2.13 to 1.96. GRS mediated 9% of risk, corresponding to a reduction in OR per diabetic parent from 2.13 to 1.99. Conclusions/interpretation Metabolic risk factors partially mediated offspring type 2 diabetes risk conferred by parental diabetes to a similar magnitude as genetic risk. However, a substantial proportion of offspring diabetes risk associated with parental diabetes remains unexplained by metabolic factors, genetic risk, diet and physical activity, suggesting that important familial influences on diabetes risk remain undiscovered. PMID:25619168
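The abstract's mediation measure can be sketched directly from its formula (percentage mediated = 1 − β_adj / β_unadj) with the betas taken as log-odds ratios. The ORs 2.13, 1.96 and 1.99 are the paper's reported values; treating log(OR) as the regression coefficient is a simplifying assumption here.

```python
import math

# Proportion of the parental-diabetes effect mediated, on the log-odds scale
def percent_mediated(or_unadj, or_adj):
    return 1.0 - math.log(or_adj) / math.log(or_unadj)

pm_metabolic = percent_mediated(2.13, 1.96)  # ~0.11, the reported 11%
pm_genetic = percent_mediated(2.13, 1.99)    # ~0.09, the reported 9%
```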
This project summary highlights recent findings from research undertaken to develop improved methods to assess potential human health risks related to drinking water disinfection byproduct (DBP) exposures.
DEVELOPMENT OF ENVIRONMENTAL INDICES FOR GREEN CHEMICAL PRODUCTION AND USE
Chemical production, use and disposal cause adverse impacts on the environment. Consequently, much research has been conducted to develop methods for estimating the risk of chemicals and to screen them based on environmental impact. Risk assessment may be subdivided...
Acute Gastroenteritis and Recreational Water: Highest Burden Among Young US Children
OBJECTIVES: To provide summary estimates of gastroenteritis risks and illness burden associated with recreational water exposure and determine whether children have higher risks and burden. METHODS: We combined individual participant data from 13 prospective cohorts at marine a...
Suicidal behaviour across the African continent: a review of the literature
2014-01-01
Background Suicide is a major cause of premature mortality worldwide, but data on its epidemiology in Africa, the world’s second most populous continent, are limited. Methods We systematically reviewed published literature on suicidal behaviour in African countries. We searched PubMed, Web of Knowledge, PsycINFO, African Index Medicus, Eastern Mediterranean Index Medicus and African Journals OnLine and carried out citation searches of key articles. We crudely estimated the incidence of suicide and suicide attempts in Africa based on country-specific data and compared these with published estimates. We also describe common features of suicide and suicide attempts across the studies, including information related to age, sex, methods used and risk factors. Results Regional or national suicide incidence data were available for less than one third (16/53) of African countries containing approximately 60% of Africa’s population; suicide attempt data were available for <20% of countries (7/53). Crude estimates suggest there are over 34,000 (inter-quartile range 13,141 to 63,757) suicides per year in Africa, with an overall incidence rate of 3.2 per 100,000 population. The recent Global Burden of Disease (GBD) estimate of 49,558 deaths is somewhat higher, but falls within the inter-quartile range of our estimate. Suicide rates in men are typically at least three times higher than in women. The most frequently used methods of suicide are hanging and pesticide poisoning. Reported risk factors are similar for suicide and suicide attempts and include interpersonal difficulties, mental and physical health problems, socioeconomic problems and drug and alcohol use/abuse. Qualitative studies are needed to identify additional culturally relevant risk factors and to understand how risk factors may be connected to suicidal behaviour in different socio-cultural contexts. 
Conclusions Our estimate is somewhat lower than GBD, but still clearly indicates suicidal behaviour is an important public health problem in Africa. More regional studies, in both urban and rural areas, are needed to more accurately estimate the burden of suicidal behaviour across the continent. Qualitative studies are required in addition to quantitative studies. PMID:24927746
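The crude incidence rate underlying the Africa-wide estimate is a simple events-per-population calculation. The population figure below is an assumption implied by the abstract's own numbers (~34,000 suicides at 3.2 per 100,000), not a reported value.

```python
# Crude incidence rate per 100,000 population
def incidence_per_100k(events, population):
    return events / population * 100_000

rate = incidence_per_100k(34_000, 1.06e9)  # roughly 3.2 per 100,000
```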
Gać, Paweł; Jaźwiec, Przemysław; Poręba, Małgorzata; Mazur, Grzegorz; Pawlas, Krystyna; Sobieszczańska, Małgorzata; Poręba, Rafał
2017-12-01
The relationship between environmental exposure of non-smokers to cigarette smoke and the coronary artery calcium scores has not been sufficiently documented. The aim of the study was to identify the relationship between environmental exposure to cigarette smoke and the risk of coronary artery disease (CAD) estimated non-invasively through measurement of coronary artery calcium score by computed tomography in patients with essential hypertension. The study was conducted on 67 patients with essential hypertension, non-smokers environmentally exposed to cigarette smoke (group A) and on 67 patients with essential hypertension, non-smokers not exposed to cigarette smoke (group B), selected using case-to-case matching. Environmental exposure to cigarette smoke was evaluated using a questionnaire. The risk of development of coronary artery disease was estimated non-invasively through measurement of coronary artery calcium score (CACS) by computed tomography. Group A was characterised by significantly higher CACS and left anterior descending (LADCS) calcium scores than group B. Compared to group B, group A had significantly higher percentage of patients with significant risk of CAD estimated on the basis of CACS values, and significantly lower percentage of patients with practically no risk of CAD estimated with the same method. Advanced age, peripheral artery diseases and environmental exposure to cigarette smoke are independent risk factors associated with increased CACS and LADCS values. In addition, higher BMI and hypercholesterolemia are independent risk factors for increased values of LADCS. In patients with essential hypertension environmental exposure to cigarette smoke may result in elevated risk of coronary artery disease estimated non-invasively through measurement of coronary artery calcium score by computed tomography. Copyright © 2017 Elsevier B.V. All rights reserved.
Neighbors, Charles J; Barnett, Nancy P; Rohsenow, Damaris J; Colby, Suzanne M; Monti, Peter M
2010-05-01
Brief interventions in the emergency department targeting risk-taking youth show promise to reduce alcohol-related injury. This study models the cost-effectiveness of a motivational interviewing-based intervention relative to brief advice to stop alcohol-related risk behaviors (standard care). Average cost-effectiveness ratios were compared between conditions. In addition, a cost-utility analysis examined the incremental cost of motivational interviewing per quality-adjusted life year gained. Microcosting methods were used to estimate marginal costs of motivational interviewing and standard care as well as two methods of patient screening: standard emergency-department staff questioning and proactive outreach by counseling staff. Average cost-effectiveness ratios were computed for drinking and driving, injuries, vehicular citations, and negative social consequences. Using estimates of the marginal effect of motivational interviewing in reducing drinking and driving, estimates of traffic fatality risk from drinking-and-driving youth, and national life tables, the societal costs per quality-adjusted life year saved by motivational interviewing relative to standard care were also estimated. Alcohol-attributable traffic fatality risks were estimated using national databases. Intervention costs per participant were $81 for standard care, $170 for motivational interviewing with standard screening, and $173 for motivational interviewing with proactive screening. The cost-effectiveness ratios for motivational interviewing were more favorable than standard care across all study outcomes and better for men than women. The societal cost per quality-adjusted life year of motivational interviewing was $8,795. Sensitivity analyses indicated that results were robust in terms of variability in parameter estimates. This brief intervention represents a good societal investment compared with other commonly adopted medical interventions.
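The cost-utility analysis reduces to an incremental cost-effectiveness ratio (extra cost per QALY gained). The per-participant costs below are the abstract's reported figures; the QALY gain is hypothetical, chosen only to illustrate how a ratio near the reported $8,795 arises.

```python
# Incremental cost-effectiveness ratio (ICER): incremental cost divided by
# incremental quality-adjusted life years (QALYs)
def icer(cost_new, cost_std, qaly_new, qaly_std):
    return (cost_new - cost_std) / (qaly_new - qaly_std)

# $170 (MI, standard screening) vs $81 (standard care); hypothetical per-person
# QALY gain of 0.0101
cost_per_qaly = icer(170.0, 81.0, 0.0101, 0.0)
```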
Zablotska, L B; Lane, R S D; Thompson, P A
2014-01-01
Background: A 15-country study of nuclear workers reported significantly increased radiation-related risks of all cancers excluding leukaemia, with Canadian data a major factor behind the pooled results. We analysed mortality (1956–1994) in the updated Canadian cohort and provided revised risk estimates. Methods: Employment records were searched to verify and revise exposure data and to restore missing socioeconomic status. Excess relative risks per sievert (ERR/Sv) of recorded radiation dose and 95% confidence intervals (CIs) were estimated using Poisson regression. Results: A significant heterogeneity of the dose–response for solid cancer was identified (P=0.02), with 3088 early (1956–1964) Atomic Energy of Canada Limited (AECL) workers having a significant increase (ERR/Sv=7.87, 95% CI: 1.88, 19.5), and no evidence of radiation risk for 42 228 workers employed by three nuclear power plant companies and post-1964 AECL (ERR/Sv=−1.20, 95% CI: <−1.47, 2.39). Radiation risks of leukaemia were negative in early AECL workers and non-significantly increased in other workers. In analyses with separate terms for tritium and gamma doses, there was no evidence of increased risk from tritium exposure. All workers had mortality lower than the general population. Conclusion: Significantly increased risks for early AECL workers are most likely due to incomplete transfer of AECL dose records to the National Dose Registry. Analyses of the remainder of the Canadian nuclear workers (93.2%) provided no evidence of increased risk, but the risk estimate was compatible with estimates that form the basis of radiation protection standards. Study findings suggest that the revised Canadian cohort, with the exclusion of early AECL workers, would likely have an important effect on the 15-country pooled risk estimate of radiation-related risks of all cancer excluding leukaemia by substantially reducing the size of the point estimate and its significance. PMID:24231946
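The reported ERR/Sv figures plug into the standard linear excess relative risk model, RR(d) = 1 + (ERR/Sv)·d. A minimal sketch using the paper's point estimate for early AECL workers; the 0.1 Sv dose is illustrative.

```python
# Linear excess relative risk model used in radiation epidemiology
def relative_risk(dose_sv, err_per_sv):
    return 1.0 + err_per_sv * dose_sv

# Early AECL workers: ERR/Sv = 7.87, evaluated at an illustrative 100 mSv
rr_early_aecl = relative_risk(0.1, 7.87)
```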
Risk-Based Sampling: I Don't Want to Weight in Vain.
Powell, Mark R
2015-12-01
Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
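The contrast the article draws, mean-variance optimization versus the equal-allocation heuristic, can be sketched for a toy inspection "portfolio". All risk and covariance figures below are invented for illustration; with noisy estimates of them, the optimized weights become unstable while the heuristic does not.

```python
import numpy as np

# Toy three-producer portfolio: estimated mean risk signals and covariance
mu = np.array([0.05, 0.07, 0.06])
cov = np.array([[0.010, 0.002, 0.001],
                [0.002, 0.015, 0.003],
                [0.001, 0.003, 0.012]])

raw = np.linalg.solve(cov, mu)          # tangency direction ~ cov^{-1} mu
w_opt = raw / raw.sum()                 # mean-variance sampling weights
w_eq = np.full(len(mu), 1.0 / len(mu))  # heuristic: equal allocation
```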
Nguyen, Trang Quynh; Webb-Vargas, Yenny; Koning, Ina M.; Stuart, Elizabeth A.
2016-01-01
We investigate a method to estimate the combined effect of multiple continuous/ordinal mediators on a binary outcome: 1) fit a structural equation model with probit link for the outcome and identity/probit link for continuous/ordinal mediators, 2) predict potential outcome probabilities, and 3) compute natural direct and indirect effects. Step 2 involves rescaling the latent continuous variable underlying the outcome to address residual mediator variance/covariance. We evaluate the estimation of risk-difference- and risk-ratio-based effects (RDs, RRs) using the ML, WLSMV and Bayes estimators in Mplus. Across most variations in path-coefficient and mediator-residual-correlation signs and strengths, and confounding situations investigated, the method performs well with all estimators, but favors ML/WLSMV for RDs with continuous mediators, and Bayes for RRs with ordinal mediators. Bayes outperforms WLSMV/ML regardless of mediator type when estimating RRs with small potential outcome probabilities and in two other special cases. An adolescent alcohol prevention study is used for illustration. PMID:27158217
Mapping Urban Risk: Flood Hazards, Race, & Environmental Justice In New York”
Maantay, Juliana; Maroko, Andrew
2009-01-01
This paper demonstrates the importance of disaggregating population data aggregated by census tracts or other units, for more realistic population distribution/location. A newly-developed mapping method, the Cadastral-based Expert Dasymetric System (CEDS), calculates population in hyper-heterogeneous urban areas better than traditional mapping techniques. A case study estimating population potentially impacted by flood hazard in New York City compares the impacted population determined by CEDS with that derived by centroid-containment method and filtered areal weighting interpolation. Compared to CEDS, 37 percent and 72 percent fewer people are estimated to be at risk from floods city-wide, using conventional areal weighting of census data, and centroid-containment selection, respectively. Undercounting of impacted population could have serious implications for emergency management and disaster planning. Ethnic/racial populations are also spatially disaggregated to determine any environmental justice impacts with flood risk. Minorities are disproportionately undercounted using traditional methods. Underestimating more vulnerable sub-populations impairs preparedness and relief efforts. PMID:20047020
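The conventional areal-weighting baseline that CEDS is compared against assumes population is uniformly distributed within each census tract, so the flood-zone population is just the tract population scaled by area overlap. A minimal sketch with a toy tract (figures hypothetical):

```python
# Filtered areal weighting interpolation, simplified to a single tract:
# population in the flood zone = tract population * (overlap area / tract area)
def areal_weighting(tract_pop, tract_area, flood_overlap_area):
    return tract_pop * flood_overlap_area / tract_area

at_risk = areal_weighting(4_000, 2.0, 0.5)  # toy tract: 4,000 people, 25% in zone
```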
Assessment of in silico methods to estimate aquatic species sensitivity
Determining the sensitivity of a diversity of species to environmental contaminants continues to be a significant challenge in ecological risk assessment because toxicity data are generally limited to a few standard species. In many cases, QSAR models are used to estimate toxici...
Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.
Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H
2016-01-01
Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite better performance of disease risk score methods than logistic regression and propensity score models in small events per coefficient settings, bias and coverage still deviated from nominal levels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu
2015-01-15
Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field illustrated by choosing selected treatment plans for brain cancer patients. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.
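The standard error propagation rule for a ratio r = a/b with independent errors is sketched below with illustrative numbers. Note the abstract's key point cuts the other way: when two modalities share the same risk-model parameters, the correlated part of the error largely cancels in the ratio, which is why the LAR ratio is far more precise than either LAR alone.

```python
import math

# First-order error propagation for r = a/b, assuming independent errors:
# (sr/r)^2 = (sa/a)^2 + (sb/b)^2
def ratio_with_uncertainty(a, sa, b, sb):
    r = a / b
    sr = r * math.sqrt((sa / a) ** 2 + (sb / b) ** 2)
    return r, sr

# Illustrative: two risks each with 10% relative uncertainty
r, sr = ratio_with_uncertainty(2.0, 0.2, 4.0, 0.4)
```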
Perin, Jamie; Walker, Neff
2015-01-01
Background Recent steep declines in child mortality have been attributed in part to increased use of contraceptives and the resulting change in fertility behaviour, including an increase in the time between births. Previous observational studies have documented strong associations between short birth spacing and an increase in the risk of neonatal, infant, and under-five mortality, compared to births with longer preceding birth intervals. In this analysis, we compare two methods to estimate the association between short birth intervals and mortality risk to better inform modelling efforts linking family planning and mortality in children. Objectives Our goal was to estimate the mortality risk for neonates, infants, and young children by preceding birth space using household survey data, controlling for mother-level factors and to compare the results to those from previous analyses with survey data. Design We assessed the potential for confounding when estimating the relative mortality risk by preceding birth interval and estimated mortality risk by birth interval in four categories: less than 18 months, 18–23 months, 24–35 months, and 36 months or longer. We estimated the relative risks among women who were 35 and older at the time of the survey with two methods: in a Cox proportional hazards regression adjusting for potential confounders and also by stratifying Cox regression by mother, to control for all factors that remain constant over a woman's childbearing years. We estimated the overall effects for birth spacing in a meta-analysis with random survey effects. Results We identified several factors known for their associations with neonatal, infant, and child mortality that are also associated with preceding birth interval. When estimating the effect of birth spacing on mortality, we found that regression adjustment for these factors does not substantially change the risk ratio for short birth intervals compared to an unadjusted mortality ratio. 
For birth intervals less than 18 months, standard regression adjustment for confounding factors estimated a risk ratio for neonatal mortality of 2.28 (95% confidence interval: 2.18–2.37). This same effect estimated within mother is 1.57 (95% confidence interval: 1.52–1.63), a decline of almost one-third in the effect on neonatal mortality. Conclusions Neonatal, infant, and child mortality are strongly and significantly related to preceding birth interval, where births within a short interval of time after the previous birth have increased mortality. Previous analyses have demonstrated this relationship on average across all births; however, women who have short spaces between births are different from women with long spaces. Among women 35 years and older where a comparison of birth spaces within mother is possible, we find a much reduced although still significant effect of short birth spaces on child mortality. PMID:26562139
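The "decline of almost one-third" quoted above is the within-mother risk ratio relative to the conventionally adjusted one, using the paper's reported values:

```python
# Relative decline in the risk ratio when estimated within mother rather
# than by standard regression adjustment
def relative_decline(rr_adjusted, rr_within_mother):
    return 1.0 - rr_within_mother / rr_adjusted

decline = relative_decline(2.28, 1.57)  # ~0.31, "almost one-third"
```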
Lindqvist, R
2006-07-01
Turbidity methods offer possibilities for generating data required for addressing microorganism variability in risk modeling given that the results of these methods correspond to those of viable count methods. The objectives of this study were to identify the best approach for determining growth parameters based on turbidity data and use of a Bioscreen instrument and to characterize variability in growth parameters of 34 Staphylococcus aureus strains of different biotypes isolated from broiler carcasses. Growth parameters were estimated by fitting primary growth models to turbidity growth curves or to detection times of serially diluted cultures either directly or by using an analysis of variance (ANOVA) approach. The maximum specific growth rates in chicken broth at 17 degrees C estimated by time to detection methods were in good agreement with viable count estimates, whereas growth models (exponential and Richards) underestimated growth rates. Time to detection methods were selected for strain characterization. The variation of growth parameters among strains was best described by either the logistic or lognormal distribution, but definitive conclusions require a larger data set. The distribution of the physiological state parameter ranged from 0.01 to 0.92 and was not significantly different from a normal distribution. Strain variability was important, and the coefficient of variation of growth parameters was up to six times larger among strains than within strains. It is suggested to apply a time to detection (ANOVA) approach using turbidity measurements for convenient and accurate estimation of growth parameters. The results emphasize the need to consider implications of strain variability for predictive modeling and risk assessment.
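The time-to-detection idea can be sketched as follows: under exponential growth, an f-fold dilution delays the detection time by ln(f)/μ, so the maximum specific growth rate can be recovered from two detection times. The dilution factor and detection times below are hypothetical, not the study's measurements.

```python
import math

# Max specific growth rate (per h) from detection times of two serial
# dilutions: mu = ln(dilution factor) / (TTD_diluted - TTD_undiluted)
def growth_rate_from_ttd(dilution_factor, ttd_undiluted, ttd_diluted):
    return math.log(dilution_factor) / (ttd_diluted - ttd_undiluted)

# Hypothetical: a 10-fold dilution detected 4.6 h later
mu_max = growth_rate_from_ttd(10, 8.0, 12.6)
```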
Estimating Radiation Dose Metrics for Patients Undergoing Tube Current Modulation CT Scans
NASA Astrophysics Data System (ADS)
McMillan, Kyle Lorin
Computed tomography (CT) has long been a powerful tool in the diagnosis of disease, identification of tumors and guidance of interventional procedures. With CT examinations comes the concern of radiation exposure and the associated risks. In order to properly understand those risks on a patient-specific level, organ dose must be quantified for each CT scan. Some of the most widely used organ dose estimates are derived from fixed tube current (FTC) scans of a standard-sized idealized patient model. However, in current clinical practice, patient size varies from neonates weighing just a few kg to morbidly obese patients weighing over 200 kg, and nearly all CT exams are performed with tube current modulation (TCM), a scanning technique that adjusts scanner output according to changes in patient attenuation. Methods to account for TCM in CT organ dose estimates have been previously demonstrated, but these methods are limited in scope and/or restricted to idealized TCM profiles that are not based on physical observations and are not scanner specific (e.g., they do not account for tube limits or other scanner-specific effects). The goal of this work was to develop methods to estimate organ doses to patients undergoing CT scans that account for both patient size and the effects of TCM. This work started with the development and validation of methods to estimate scanner-specific TCM schemes for any voxelized patient model. An approach was developed to generate estimated TCM schemes that match actual TCM schemes that would have been acquired on the scanner for any patient model. Using this approach, TCM schemes were then generated for a variety of body CT protocols for a set of reference voxelized phantoms for which TCM information does not currently exist. These are whole body patient models representing a variety of sizes, ages and genders, with all radiosensitive organs identified. 
TCM schemes for these models facilitated Monte Carlo-based estimates of fully-, partially- and indirectly-irradiated organ dose from TCM CT exams. By accounting for the effects of patient size in the organ dose estimates, a comprehensive set of patient-specific dose estimates from TCM CT exams was developed. These patient-specific organ dose estimates from TCM CT exams will provide a more complete understanding of the dose impact and risks associated with modern body CT scanning protocols.
Longitudinal study of mammographic density measures that predict breast cancer risk
Krishnan, Kavitha; Baglietto, Laura; Stone, Jennifer; Simpson, Julie A; Severi, Gianluca; Evans, Christopher F; MacInnis, Robert J; Giles, Graham G; Apicella, Carmel; Hopper, John L
2016-01-01
Background After adjusting for age and body mass index (BMI), mammographic measures - dense area (DA), percent dense area (PDA) and non-dense area (NDA) - are associated with breast cancer risk. Our aim was to use longitudinal data to estimate the extent to which these risk-predicting measures track over time. Methods We collected 4,320 mammograms (age range, 24-83 years) from 970 women in the Melbourne Collaborative Cohort Study and the Australian Breast Cancer Family Registry. Women had on average 4.5 mammograms (range, 1-14). DA, PDA and NDA were measured using the Cumulus software and normalised using the Box-Cox method. Correlations in the normalised risk-predicting measures over time intervals of different lengths were estimated using nonlinear mixed-effects modelling of Gompertz curves. Results Mean normalised DA and PDA were constant with age to the early 40s, decreased over the next two decades, and were almost constant from the mid 60s onwards. Mean normalised NDA increased non-linearly with age. After adjusting for age and BMI, the within-woman correlation estimates for normalised DA were 0.94, 0.93, 0.91, 0.91 and 0.91 for mammograms taken 2, 4, 6, 8 and 10 years apart, respectively. Similar correlations were estimated for the age- and BMI-adjusted normalised PDA and NDA. Conclusion The mammographic measures that predict breast cancer risk are highly correlated over time. Impact This has implications for etiologic research and clinical management whereby women at increased risk could be identified at a young age (e.g. early 40s or even younger) and recommended appropriate screening and prevention strategies. PMID:28062399
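The Box-Cox normalisation step used on the density measures can be sketched generically. The code below is a plain profile-likelihood grid search on simulated skewed data, labelled clearly as an illustration; it is not the Cumulus pipeline, and the simulated "dense area" values are made up.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform: (x^lam - 1)/lam, or ln(x) when lam == 0."""
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def boxcox_mle(x, lams):
    """Pick the lambda maximizing the Box-Cox profile log-likelihood."""
    n = len(x)
    logs = np.log(x).sum()
    def ll(lam):
        y = boxcox(x, lam)
        return -0.5 * n * np.log(y.var()) + (lam - 1.0) * logs
    return max(lams, key=ll)

def skew(x):
    d = x - x.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

rng = np.random.default_rng(0)
dense_area = rng.lognormal(mean=3.0, sigma=0.5, size=500)  # skewed, like raw DA

lam = boxcox_mle(dense_area, np.linspace(-2, 2, 81))
normalised = boxcox(dense_area, lam)
print(abs(skew(normalised)) < abs(skew(dense_area)))  # skewness is reduced
```

For lognormal-like data the selected lambda lands near zero, i.e. close to a log transform, which is why density measures are often analysed on that scale.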
Roy, Kakoli; Lang, Jason E.; Payne, Rebecca L.; Howard, David H.
2016-01-01
Introduction Employers may incur costs related to absenteeism among employees who have chronic diseases or unhealthy behaviors. We examined the association between employee absenteeism and 5 conditions: 3 risk factors (smoking, physical inactivity, and obesity) and 2 chronic diseases (hypertension and diabetes). Methods We identified 5 chronic diseases or risk factors from 2 data sources: MarketScan Health Risk Assessment and the Medical Expenditure Panel Survey (MEPS). Absenteeism was measured as the number of workdays missed because of sickness or injury. We used zero-inflated Poisson regression to estimate excess absenteeism as the difference in the number of days missed from work by those who reported having a risk factor or chronic disease and those who did not. Covariates included demographics (eg, age, education, sex) and employment variables (eg, industry, union membership). We quantified absenteeism costs in 2011 and adjusted them to reflect growth in employment costs to 2015 dollars. Finally, we estimated absenteeism costs for a hypothetical small employer (100 employees) and a hypothetical large employer (1,000 employees). Results Absenteeism estimates ranged from 1 to 2 days per individual per year depending on the risk factor or chronic disease. Except for the physical inactivity and obesity estimates, disease- and risk-factor–specific estimates were similar in MEPS and MarketScan. Absenteeism increased with the number of risk factors or diseases reported. Nationally, each risk factor or disease was associated with annual absenteeism costs greater than $2 billion. Absenteeism costs ranged from $16 to $81 (small employer) and $17 to $286 (large employer) per employee per year. Conclusion Absenteeism costs associated with chronic diseases and health risk factors can be substantial. Employers may incur these costs through lower productivity, and employees could incur costs through lower wages. PMID:27710764
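The zero-inflated structure behind the excess-absenteeism estimate can be illustrated with a small simulation. This is a hedged sketch with made-up prevalence and rates (not the MarketScan or MEPS estimates), and it uses a simple difference in group means rather than the covariate-adjusted zero-inflated Poisson regression the authors fit.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000

# Zero-inflated Poisson: many employees miss zero days; those who do
# miss days follow a Poisson whose mean is higher in the risk group.
risk = rng.random(n) < 0.3                  # 30% have the risk factor (assumed)
p_any = np.where(risk, 0.55, 0.45)          # P(miss at least one day), assumed
lam = np.where(risk, 5.0, 3.0)              # mean days given any absence, assumed
days = np.where(rng.random(n) < p_any, rng.poisson(lam), 0)

# Excess absenteeism: difference in mean missed days, risk vs no-risk
excess = days[risk].mean() - days[~risk].mean()
print(round(excess, 2))  # roughly 0.55*5 - 0.45*3 = 1.4 days per year
```

Multiplying such a per-employee excess by a wage-based cost per day and by workforce size yields the kind of employer-level cost figures reported above.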
WRF-based fire risk modelling and evaluation for years 2010 and 2012 in Poland
NASA Astrophysics Data System (ADS)
Stec, Magdalena; Szymanowski, Mariusz; Kryza, Maciej
2016-04-01
Wildfires are one of the main ecosystem disturbances in forested, seminatural and agricultural areas. They generate significant economic loss, especially in forest management and agriculture. Forest fire risk modeling is therefore essential, e.g. for the forestry administration. In August 2015 a new method of forest fire risk forecasting entered into force in Poland. The method predicts the fire risk level on a 4-degree scale (0, no risk, to 3, highest risk) and consists of a set of linearized regression equations. Meteorological variables serve as predictors in the regression equations: air temperature, relative humidity, average wind speed, cloudiness and rainfall. The equations also include pine litter humidity as a measure of potential fuel characteristics. All these parameters are measured routinely in Poland at 42 basic and 94 auxiliary sites. The fire risk level is estimated for the current day (based on morning measurements) or the next day (based on midday measurements). The entire country is divided into 42 prognostic zones, and the fire risk level for each zone is taken from the closest measuring site. The first goal of this work is to assess whether the measurements needed for fire risk forecasting may be replaced by data from a mesoscale meteorological model. Additionally, a meteorological model captures the spatial differentiation of the weather elements determining fire risk much more realistically than discrete point measurements. Meteorological data have been calculated using the Weather Research and Forecasting model (WRF). For the purpose of this study the WRF model is run in reanalysis mode, providing all required meteorological data on a 5-kilometer grid. The only parameter that cannot be directly calculated using WRF is the litter humidity, which has been estimated using an empirical formula developed by Sakowska (2007). The experiments are carried out for two selected years: 2010 and 2012. 
The year 2010 had the smallest number of wildfires and the smallest burnt area, whereas 2012 had the largest number of fires and the largest burnt area. Data on the time, location, scale and causes of individual wildfires in these years are taken from the National Forest Fire Information System (KSIPL), administered by the Forest Fire Protection Department of the Polish Forest Research Institute. The database is a part of the European Forest Fire Information System (EFFIS). Based on these data and on the WRF-based fire risk modelling, we pursue the second goal of the study: evaluating the forecasted fire risk against the occurrence of wildfires. Special attention is paid to the number, timing and spatial distribution of wildfires that occurred when a low fire risk level was predicted. The results reveal the effectiveness of the new forecasting method, and our investigation suggests adjustments that could further improve the fire-risk estimation method.
NASA Astrophysics Data System (ADS)
Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.
2013-12-01
In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods have historically been preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The objective of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992) and the Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013; Garavaglia et al., 2010) and the Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with the return period, remaining relatively moderate up to 100-year return levels. 
Results and discussion are illustrated throughout with the example of five watersheds located in the south of France. References: O. CAYLA: Probability calculation of design floods and inflows - SPEED. Waterpower 1995, San Francisco, California, 1995. CFGB: Design flood determination by the gradex method. Bulletin du Comité Français des Grands Barrages, 18th congress CIGB-ICOLD, no. 2, nov. 1994. F. GARAVAGLIA et al.: Introducing a rainfall compound distribution model based on weather patterns subsampling. Hydrology and Earth System Sciences, 14, 951-964, 2010. J. LAVABRE et al.: SHYREG: une méthode pour l'estimation régionale des débits de crue, application aux régions méditerranéennes françaises. Ingénierie EAT, 97-111, 2003. M. MARGOUM: Estimation des crues rares et extrêmes : le modèle AGREGEE. Conceptions et premières validations. PhD thesis, Ecole des Mines de Paris, 1992. R. NAULET et al.: Flood frequency analysis on the Ardèche river using French documentary sources from the two last centuries. Journal of Hydrology, 313:58-78, 2005. E. PAQUET et al.: The SCHADEX method: A semi-continuous rainfall-runoff simulation for extreme flood estimation. Journal of Hydrology, 495, 23-37, 2013.
Wu, Jun; Wilhelm, Michelle; Chung, Judith; Ritz, Beate
2011-01-01
Background Previous studies reported adverse impacts of traffic-related air pollution exposure on pregnancy outcomes. Yet, little information exists on how effect estimates are impacted by the different exposure assessment methods employed in these studies. Objectives To compare effect estimates for traffic-related air pollution exposure and preeclampsia, preterm birth (gestational age less than 37 weeks), and very preterm birth (gestational age less than 30 weeks) based on four commonly-used exposure assessment methods. Methods We identified 81,186 singleton births during 1997–2006 at four hospitals in Los Angeles and Orange Counties, California. Exposures were assigned to individual subjects based on residential address at delivery using the nearest ambient monitoring station data [carbon monoxide (CO), nitrogen dioxide (NO2), nitric oxide (NO), nitrogen oxides (NOx), ozone (O3), and particulate matter less than 2.5 (PM2.5) or less than 10 (PM10) μm in aerodynamic diameter], both unadjusted and temporally-adjusted land-use regression (LUR) model estimates (NO, NO2, and NOx), CALINE4 line-source air dispersion model estimates (NOx and PM2.5), and a simple traffic-density measure. We employed unconditional logistic regression to analyze preeclampsia in our birth cohort, while for gestational age-matched risk sets with preterm and very preterm birth we employed conditional logistic regression. Results We observed elevated risks for preeclampsia, preterm birth, and very preterm birth from maternal exposures to traffic air pollutants measured at ambient stations (CO, NO, NO2, and NOx) and modeled through CALINE4 (NOx and PM2.5) and LUR (NO2 and NOx). Increased risk of preterm birth and very preterm birth were also positively associated with PM10 and PM2.5 air pollution measured at ambient stations. For LUR-modeled NO2 and NOx exposures, elevated risks for all the outcomes were observed in Los Angeles only – the region for which the LUR models were initially developed. 
Unadjusted LUR models often produced odds ratios somewhat larger in size than temporally-adjusted models. The size of effect estimates was smaller for exposures based on simpler traffic density measures than the other exposure assessment methods. Conclusion We generally confirmed that traffic-related air pollution was associated with adverse reproductive outcomes regardless of the exposure assessment method employed, yet the size of the estimated effect depended on how both temporal and spatial variations were incorporated into exposure assessment. The LUR model was not transferable even between two contiguous areas within the same large metropolitan area in Southern California. PMID:21453913
The burden of disease from indoor air pollution in developing countries: comparison of estimates.
Smith, Kirk R; Mehta, Sumi
2003-08-01
Four different methods have been applied to estimate the burden of disease due to indoor air pollution from household solid fuel use in developing countries (LDCs). The largest number of estimates involves applying exposure-response information from urban ambient air pollution studies to estimate indoor exposure concentrations of particulate air pollution. Another approach is to construct child survival curves using the results of large-scale household surveys, as has been done for India. A third approach involves cross-national analyses of child survival and household fuel use. The fourth method, referred to as the 'fuel-based' approach, which is explored in more depth here, involves applying relative risk estimates from epidemiological studies that use exposure surrogates, such as fuel type, to estimates of household solid fuel use to determine population attributable fractions by disease and age group. With this method and conservative assumptions about relative risks, 4-5 percent of the global LDC totals for both deaths and DALYs (disability adjusted life years) from acute respiratory infections, chronic obstructive pulmonary disease, tuberculosis, asthma, lung cancer, ischaemic heart disease, and blindness can be attributed to solid fuel use in developing countries. Acute respiratory infections in children under five years of age are the largest single category of deaths (64%) and DALYs (81%) from indoor air pollution, apparently being responsible globally for about 1.2 million premature deaths annually in the early 1990s.
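The "fuel-based" approach rests on the standard population attributable fraction. A minimal sketch using Levin's formula follows; the exposure prevalence and relative risk are hypothetical values for illustration only, not taken from the study.

```python
def attributable_fraction(p_exposed, rr):
    """Levin's population attributable fraction for a dichotomous
    exposure: PAF = p(RR - 1) / (1 + p(RR - 1))."""
    x = p_exposed * (rr - 1.0)
    return x / (1.0 + x)

# Hypothetical inputs: 70% of households use solid fuel, and the
# relative risk of acute respiratory infection in children is 2.3.
paf = attributable_fraction(0.70, 2.3)
print(round(paf, 3))  # 0.476
```

Multiplying such a fraction by the total deaths or DALYs for each disease and age group gives the burden attributed to solid fuel use.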
Estimated value of insurance premium due to Citarum River flood by using Bayesian method
NASA Astrophysics Data System (ADS)
Sukono; Aisah, I.; Tampubolon, Y. R. H.; Napitupulu, H.; Supian, S.; Subiyanto; Sidi, P.
2018-03-01
Flooding of the Citarum river in South Bandung, West Java, Indonesia, occurs almost every year. It causes property damage, producing economic loss. The risk of loss can be mitigated through a flood insurance program. In this paper, we discuss the estimation of insurance premiums for Citarum river floods using the Bayesian method. The flood-loss risk data are assumed to follow a Pareto distribution with a heavy right tail. The distribution parameters are estimated by the Bayesian method. First, parameters are estimated under the assumption that the prior comes from the Gamma distribution family, while the observed data follow a Pareto distribution. Second, flood-loss data are simulated based on the probability of damage in each flood-affected area. The analysis yields the following estimated premiums under the pure premium principle: for a loss of IDR 629.65 million, a premium of IDR 338.63 million; for a loss of IDR 584.30 million, a premium of IDR 314.24 million; and for a loss of IDR 574.53 million, a premium of IDR 308.95 million. The premium estimator can serve as a reference for setting a reasonable premium that neither burdens the insured nor causes losses to the insurer.
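The Gamma-prior/Pareto-likelihood setup described here is conjugate, which makes the posterior update a one-liner. The sketch below assumes the simplest version of that model (known minimum loss, Gamma prior on the Pareto shape); the simulated losses, prior parameters and scale are illustrative, not the paper's data.

```python
import numpy as np

def posterior_shape(losses, x_m, a0, b0):
    """Gamma(a0, b0) prior on the Pareto shape alpha (scale x_m known)
    is conjugate: posterior is Gamma(a0 + n, b0 + sum(ln(x_i / x_m)))."""
    losses = np.asarray(losses, dtype=float)
    a_post = a0 + losses.size
    b_post = b0 + np.log(losses / x_m).sum()
    return a_post, b_post

def pure_premium(alpha, x_m):
    """Expected loss of a Pareto(alpha, x_m) risk, finite for alpha > 1."""
    assert alpha > 1.0
    return alpha * x_m / (alpha - 1.0)

rng = np.random.default_rng(7)
x_m = 50.0                                   # minimum loss (IDR million), assumed
losses = x_m * (1.0 + rng.pareto(3.0, 200))  # simulated flood losses, true alpha = 3

a_post, b_post = posterior_shape(losses, x_m, a0=2.0, b0=1.0)
alpha_hat = a_post / b_post                  # posterior mean of alpha
print(round(pure_premium(alpha_hat, x_m), 1))
```

The pure premium principle then sets the premium equal to the expected loss under the posterior; loadings for expenses and risk margin would be added on top in practice.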
Assessing the risk of Nipah virus establishment in Australian flying-foxes.
Roche, S E; Costard, S; Meers, J; Field, H E; Breed, A C
2015-07-01
Nipah virus (NiV) is a recently emerged zoonotic virus that causes severe disease in humans. The reservoir hosts for NiV, bats of the genus Pteropus (known as flying-foxes) are found across the Asia-Pacific including Australia. While NiV has not been detected in Australia, evidence for NiV infection has been found in flying-foxes in some of Australia's closest neighbours. A qualitative risk assessment was undertaken to assess the risk of NiV establishing in Australian flying-foxes through flying-fox movements from nearby regions. Events surrounding the emergence of new diseases are typically uncertain and in this study an expert opinion workshop was used to address gaps in knowledge. Given the difficulties in combining expert opinion, five different combination methods were analysed to assess their influence on the risk outcome. Under the baseline scenario where the median was used to combine opinions, the risk was estimated to be very low. However, this risk increased when the mean and linear opinion pooling combination methods were used. This assessment highlights the effects that different methods for combining expert opinion have on final risk estimates and the caution needed when interpreting these outcomes given the high degree of uncertainty in expert opinion. This work has provided a flexible model framework for assessing the risk of NiV establishment in Australian flying-foxes through bat movements which can be updated when new data become available.
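The sensitivity of a risk estimate to the opinion-combination rule can be shown with a toy example. The expert probabilities below are entirely hypothetical; the sketch compares the per-category median against an equal-weight linear opinion pool, two of the combination rules the assessment examined.

```python
import numpy as np

# Five hypothetical experts each assign probabilities to four risk
# levels (negligible, very low, low, moderate); rows sum to 1.
experts = np.array([
    [0.70, 0.20, 0.08, 0.02],
    [0.50, 0.30, 0.15, 0.05],
    [0.80, 0.15, 0.04, 0.01],
    [0.30, 0.40, 0.20, 0.10],
    [0.60, 0.25, 0.10, 0.05],
])

median_combined = np.median(experts, axis=0)
median_combined /= median_combined.sum()       # renormalise to a distribution
linear_pool = experts.mean(axis=0)             # equal-weight linear opinion pool

levels = ["negligible", "very low", "low", "moderate"]
print("median:", levels[int(np.argmax(median_combined))])
print("pool:  ", levels[int(np.argmax(linear_pool))])
```

With these toy numbers both rules pick the same modal category, but the combined distributions place different weight on the higher risk levels, which is exactly the kind of divergence that drove the differing risk outcomes in the assessment.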
Developing an objective evaluation method to estimate diabetes risk in community-based settings.
Kenya, Sonjia; He, Qing; Fullilove, Robert; Kotler, Donald P
2011-05-01
Exercise interventions often aim to affect abdominal obesity and glucose tolerance, two significant risk factors for type 2 diabetes. Because of limited financial and clinical resources in community and university-based environments, intervention effects are often measured with interviews or questionnaires and correlated with weight loss or body fat indicated by body bioimpedance analysis (BIA). However, self-reported assessments are subject to high levels of bias and low levels of reliability. Because obesity and body fat are correlated with diabetes at different levels in various ethnic groups, data reflecting changes in weight or fat do not necessarily indicate changes in diabetes risk. To determine how exercise interventions affect diabetes risk in community and university-based settings, improved evaluation methods are warranted. We compared a noninvasive, objective measurement technique--regional BIA--with whole-body BIA for its ability to assess abdominal obesity and predict glucose tolerance in 39 women. To determine regional BIA's utility in predicting glucose, we tested the association between the regional BIA method and blood glucose levels. Regional BIA estimates of abdominal fat area were significantly correlated (r = 0.554, P < 0.003) with fasting glucose. When waist circumference and family history of diabetes were added to abdominal fat in multiple regression models, the association with glucose increased further (r = 0.701, P < 0.001). Regional BIA estimates of abdominal fat may predict fasting glucose better than whole-body BIA as well as provide an objective assessment of changes in diabetes risk achieved through physical activity interventions in community settings.
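The step from a single-predictor correlation to a multiple correlation (r rising from 0.554 to 0.701 when predictors are added) can be sketched as follows. All data here are simulated with made-up coefficients; the point is the mechanics of the multiple correlation, not a reproduction of the study's numbers.

```python
import numpy as np

def multiple_r(X, y):
    """Multiple correlation: the correlation between y and its
    least-squares prediction from the columns of X (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return np.corrcoef(y, X1 @ beta)[0, 1]

rng = np.random.default_rng(1)
n = 39                                         # same sample size as the study
abd_fat = rng.normal(120, 30, n)               # abdominal fat area (cm^2), simulated
waist = 0.3 * abd_fat + rng.normal(60, 5, n)   # waist (cm), correlated with fat
family_hx = rng.random(n) < 0.4                # family history indicator, simulated
glucose = 70 + 0.15 * abd_fat + 6 * family_hx + rng.normal(0, 8, n)

r1 = multiple_r(abd_fat[:, None], glucose)
r3 = multiple_r(np.column_stack([abd_fat, waist, family_hx]), glucose)
print(round(r1, 2), round(r3, 2))  # adding predictors cannot lower r
```

Because the models are nested, the multiple correlation is guaranteed not to decrease when predictors are added, so the interesting question in the study is how much it increases.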
Knapp, Julika; Allesch, Astrid; Müller, Wolfgang; Bockreis, Anke
2017-11-01
Recycling of waste materials is desirable to reduce the consumption of limited primary resources, but also includes the risk of recycling unwanted, hazardous substances. In Austria, the legal framework demands secondary products must not present a higher risk than comparable products derived from primary resources. However, the act provides no definition on how to assess this risk potential. This paper describes the development of different quantitative and qualitative methods to estimate the transfer of contaminants in recycling processes. The quantitative methods comprise the comparison of concentrations of harmful substances in recycling products to corresponding primary products and to existing limit values. The developed evaluation matrix, which considers further aspects, allows for the assessment of the qualitative risk potential. The results show that, depending on the assessed waste fraction, particular contaminants can be critical. Their concentrations were higher than in comparable primary materials and did not comply with existing limit values. On the other hand, the results show that a long-term, well-established quality control system can assure compliance with the limit values. The results of the qualitative assessment obtained with the evaluation matrix support the results of the quantitative assessment. Therefore, the evaluation matrix can be suitable to quickly screen waste streams used for recycling to estimate their potential environmental and health risks. To prevent the transfer of contaminants into product cycles, improved data of relevant substances in secondary resources are necessary. In addition, regulations for material recycling are required to assure adequate quality control measures, including limit values. Copyright © 2017 Elsevier Ltd. All rights reserved.
Kelly, Christopher; Pashayan, Nora; Munisamy, Sreetharan; Powles, John W
2009-01-01
Background Our aim was to estimate the burden of fatal disease attributable to excess adiposity in England and Wales in 2003 and 2015 and to explore the sensitivity of the estimates to the assumptions and methods used. Methods A spreadsheet implementation of the World Health Organization's (WHO) Comparative Risk Assessment (CRA) methodology for continuously distributed exposures was used. For our base case, adiposity-related risks were assumed to be minimal with a mean (SD) BMI of 21 (1) kg m-2. All-cause mortality risks for 2015 were taken from the Government Actuary, and alternative compositions by cause were derived. Disease-specific relative risks by BMI were taken from the CRA project and varied in sensitivity analyses. Results Under base case methods and assumptions for 2003, approximately 41,000 deaths and a loss of 1.05 years of life expectancy were attributed to excess adiposity. Seventy-seven percent of all diabetic deaths, 23% of all ischaemic heart disease deaths and 14% of all cerebrovascular disease deaths were attributed to excess adiposity. Predictions for 2015 were found to be more sensitive to assumptions about the future course of mortality risks for diabetes than to variation in the assumed trend in BMI. On less favourable assumptions the attributable loss of life expectancy in 2015 would rise modestly to 1.28 years. Conclusion Excess adiposity appears to contribute materially but modestly to mortality risks in England and Wales and this contribution is likely to increase in the future. Uncertainty centres on future trends of associated diseases, especially diabetes. The robustness of these estimates is limited by the lack of control for correlated risks by stratification and by the empirical uncertainty surrounding the effects of prolonged excess adiposity beginning in adolescence. PMID:19566928
Serrier, Hassan; Sultan-Taieb, Hélène; Luce, Danièle; Bejean, Sophie
2014-07-01
The objective of this article was to estimate the social cost of respiratory cancer cases attributable to occupational risk factors in France in 2010. Using the attributable fraction method and available epidemiological data from the literature, we estimated the number of respiratory cancer cases due to each identified risk factor. We used the cost-of-illness method with a prevalence-based approach, taking into account both direct and indirect costs. We estimated the cost of production losses due to morbidity (absenteeism and presenteeism) and mortality (years of production lost) in the market and nonmarket spheres. The social cost of lung, larynx and sinonasal cancers and of mesothelioma caused by exposure to asbestos, chromium, diesel engine exhaust, paint, crystalline silica, and wood and leather dust in France in 2010 was estimated at between 917 and 2,181 million euros. Between 795 and 2,011 million euros (87-92%) of total costs were due to lung cancer alone. Asbestos was by far the risk factor representing the greatest cost to French society in 2010, at between 531 and 1,538 million euros (58-71%), ahead of diesel engine exhaust, with an estimated social cost of between 233 and 336 million euros, and crystalline silica (119-229 million euros). Indirect costs represented about 66% of total costs. Our assessment shows the magnitude of the economic impact of occupational respiratory cancers. It allows comparisons between countries and provides valuable information for policy-makers responsible for defining public health priorities.
NASA Astrophysics Data System (ADS)
Harmon, T. C.; Conde, D.; Villamizar, S. R.; Reid, B.; Escobar, J.; Rusak, J.; Hoyos, N.; Scordo, F.; Perillo, G. M.; Piccolo, M. C.; Zilio, M.; Velez, M.
2015-12-01
Assessing risks to aquatic ecosystems services (ES) is challenging and time-consuming, and effective strategies for prioritizing more detailed assessment efforts are needed. We propose a screening-level risk analysis (SRA) approach that scales ES risk using socioeconomic and environmental indices to capture anthropic and climatic pressures, as well as the capacity for institutional responses to those pressures. The method considers ES within a watershed context, and uses expert input to prioritize key services and the associated pressures that threaten them. The SRA approach focuses on estimating ES risk effect factors, which are the sum of the intensity factors for all hazards or pressures affecting the ES. We estimate the pressure intensity factors in a novel manner, basing them on the nation's (i) human development (proxied by the Inequality-adjusted Human Development Index, IHDI), (ii) environmental regulatory and monitoring state (Environmental Performance Index, EPI) and (iii) the current level of water stress in the watershed (baseline water stress, BWS). Anthropic intensity factors for future conditions are derived from the baseline values based on the nation's 10-year trend in IHDI and EPI; ES risks in nations with stronger records of change are rewarded more/penalized less in estimates for good/poor future management scenarios. Future climatic intensity factors are tied to water stress estimates based on two general circulation model (GCM) outcomes. We demonstrate the method for an international array of six sites representing a wide range of socio-environmental settings. The outcomes illustrate novel consequences of the scaling scheme. Risk effect factors may be greater in a highly developed region under intense climatic pressure, or in less well-developed regions due to human factors (e.g., poor environmental records). 
As a screening-level tool, the SRA approach offers considerable promise for ES risk comparisons among watersheds and regions so that detailed assessment, management and mitigation efforts can be effectively prioritized.
Estimating high-risk cannabis and opiate use in Ankara, Istanbul and Izmir.
Kraus, Ludwig; Hay, Gordon; Richardson, Clive; Yargic, Ilhan; Ilhan, Mustafa Necmi; Ay, Pinar; Karasahin, Füsun; Pinarci, Mustafa; Tuncoglu, Tolga; Piontek, Daniela; Schulte, Bernd
2017-09-01
Information on high-risk drug use in Turkey, particularly at the regional level, is lacking. The present analysis aims at estimating high-risk cannabis use (HRCU) and high-risk opiate use (HROU) in the cities of Ankara, Istanbul and Izmir. Capture-recapture and multiplier methods were applied based on treatment and police data stratified by age and gender in the years 2009 and 2010. Case definitions refer to ICD-10 cannabis (F.12) and opiate (F.11) disorder diagnoses from outpatient and inpatient treatment records and illegal possession of these drugs as recorded by the police. High-risk cannabis use was estimated at 28 500 (8.5 per 1000; 95% confidence interval 7.3-10.3) and 33 400 (11.9 per 1000; 95% confidence interval 10.7-13.5) in Ankara and Izmir, respectively. Using multipliers based on capture-recapture estimates for Izmir, HRCU in Istanbul was estimated up to 166 000 (18.0 per 1000; range: 2.8-18.0). Capture-recapture estimates of HROU resulted in 4800 (1.4 per 1000; 95% confidence interval 0.9-1.9) in Ankara and multipliers based on these gave estimates up to 20 000 (2.2 per 1000; range: 0.9-2.2) in Istanbul. HROU in Izmir was not estimated due to the low absolute numbers of opiate users. While HRCU prevalence in both Ankara and Izmir was considerably lower in comparison to an estimate for Berlin, the rate for Istanbul was only slightly lower. Compared with the majority of European cities, HROU in these three Turkish cities may be considered rather low. [Kraus L, Hay G, Richardson C, Yargic I, Ilhan N M, Ay P, Karasahin F, Pinarci M, Tuncoglu T, Piontek D, Schulte B Estimating high-risk cannabis and opiate use in Ankara, Istanbul and Izmir Drug Alcohol Rev 2016;00:000-000]. © 2016 Australasian Professional Society on Alcohol and other Drugs.
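The two estimation techniques named here have simple closed forms in their basic versions. The sketch below shows the two-source Chapman capture-recapture estimator and the benchmark-multiplier method; all counts and the treatment-coverage fraction are hypothetical, not the study's treatment and police data.

```python
def chapman(n1, n2, m):
    """Two-source Chapman capture-recapture estimator of a hidden
    population: N = (n1 + 1)(n2 + 1)/(m + 1) - 1, where n1 and n2 are
    the source counts and m the number appearing in both sources."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

def multiplier(benchmark, coverage):
    """Multiplier method: scale a known benchmark count (e.g. people in
    treatment) by the inverse of the fraction of users it covers."""
    return benchmark / coverage

# Hypothetical counts: 1200 in treatment records, 900 in police
# records, 150 appearing in both.
n_hat = chapman(1200, 900, 150)
print(round(n_hat))                  # capture-recapture estimate

# Multiplier: 1200 in treatment, assuming 17% of users reach treatment
print(round(multiplier(1200, 0.17)))
```

In the study, coverage fractions derived from the capture-recapture estimates in one city were applied as multipliers to the benchmarks of another (Istanbul), which is why those estimates carry wider ranges.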
An Integrated Web-Based Assessment Tool for Assessing Pesticide Exposure and Risks
Background/Question/Methods We have created an integrated web-based tool designed to estimate exposure doses and ecological risks under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Species Act. This involved combining a number of disparat...
Historically, risk assessment has relied upon toxicological data to obtain hazard-based reference levels, which are subsequently compared to exposure estimates to determine whether an unacceptable risk to public health may exist. Recent advances in analytical methods, biomarker ...
Carroll, Rachel; Lawson, Andrew B; Kirby, Russell S; Faes, Christel; Aregay, Mehreteab; Watjou, Kevin
2017-01-01
Many types of cancer have an underlying spatiotemporal distribution. Spatiotemporal mixture modeling can offer a flexible approach to risk estimation via the inclusion of latent variables. In this article, we examine the application and benefits of using four different spatiotemporal mixture modeling methods in the modeling of cancer of the lung and bronchus as well as "other" respiratory cancer incidences in the state of South Carolina. Of the methods tested, no single method outperforms the other methods; which method is best depends on the cancer under consideration. The lung and bronchus cancer incidence outcome is best described by the univariate modeling formulation, whereas the "other" respiratory cancer incidence outcome is best described by the multivariate modeling formulation. Spatiotemporal multivariate mixture methods can aid in the modeling of cancers with small and sparse incidences when including information from a related, more common type of cancer. Copyright © 2016 Elsevier Inc. All rights reserved.
Kuempel, Eileen D.; Sweeney, Lisa M.; Morris, John B.; Jarabek, Annie M.
2015-01-01
The purpose of this article is to provide an overview and practical guide to occupational health professionals concerning the derivation and use of dose estimates in risk assessment for development of occupational exposure limits (OELs) for inhaled substances. Dosimetry is the study and practice of measuring or estimating the internal dose of a substance in individuals or a population. Dosimetry thus provides an essential link to understanding the relationship between an external exposure and a biological response. Use of dosimetry principles and tools can improve the accuracy of risk assessment, and reduce the uncertainty, by providing reliable estimates of the internal dose at the target tissue. This is accomplished through specific measurement data or predictive models, when available, or the use of basic dosimetry principles for broad classes of materials. Accurate dose estimation is essential not only for dose-response assessment, but also for interspecies extrapolation and for risk characterization at given exposures. Inhalation dosimetry is the focus of this paper since it is a major route of exposure in the workplace. Practical examples of dose estimation and OEL derivation are provided for inhaled gases and particulates. PMID:26551218
Ahlborn, W; Tuz, H J; Uberla, K
1990-03-01
In cohort studies the Mantel-Haenszel estimator OR_MH is computed from sample data and used as a point estimator of relative risk. Test-based confidence intervals are estimated with the help of the asymptotically chi-squared distributed MH statistic chi²_MHS. The Mantel extension chi-squared is used as a test statistic for a dose-response relationship. Both test statistics -- the Mantel-Haenszel chi as well as the Mantel extension chi -- assume homogeneity of risk across strata, which is rarely present. An extended nonparametric statistic proposed by Terpstra, based on the Mann-Whitney statistic, likewise assumes homogeneity of risk across strata. We have earlier defined four risk measures RR_kj (k = 1, 2, ..., 4) in the population and considered their estimates and the corresponding asymptotic distributions. To overcome the homogeneity assumption we use the delta method to obtain "test-based" confidence intervals. Because the four risk measures RR_kj are expressed as functions of four weights g_ik, we give the asymptotic variances of these risk estimators in closed form, also as functions of the weights g_ik; approximations to these variances are provided. For testing a dose-response relationship we propose a new class of chi²(1)-distributed global measures G_k and the corresponding global chi²-test. In contrast to the Mantel extension chi, homogeneity of risk across strata need not be assumed. These global test statistics are of the Wald type for composite hypotheses. (ABSTRACT TRUNCATED AT 250 WORDS)
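For orientation, the basic Mantel-Haenszel pooled odds ratio that this work builds on can be computed in a few lines. The strata below are hypothetical; the paper's delta-method variances and global G_k statistics go beyond this sketch.

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel pooled odds ratio over a list of 2x2 strata.

    Each stratum is (a, b, c, d) = (exposed cases, exposed non-cases,
    unexposed cases, unexposed non-cases).
    """
    numerator = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    denominator = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return numerator / denominator

# Two hypothetical strata with similar stratum-specific odds ratios
pooled_or = mantel_haenszel_or([(10, 90, 5, 95), (20, 80, 10, 90)])
```

The pooled estimate is a weighted compromise between the stratum-specific odds ratios, which is exactly why its interpretation becomes delicate when risk is heterogeneous across strata.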
Quantifying the predictive accuracy of time-to-event models in the presence of competing risks.
Schoop, Rotraut; Beyersmann, Jan; Schumacher, Martin; Binder, Harald
2011-02-01
Prognostic models for time-to-event data play a prominent role in therapy assignment, risk stratification and inter-hospital quality assurance. The assessment of their prognostic value is vital not only for responsible resource allocation, but also for their widespread acceptance. The additional presence of competing risks to the event of interest requires proper handling not only on the model building side, but also during assessment. Research into methods for the evaluation of the prognostic potential of models accounting for competing risks is still needed, as most proposed methods measure either their discrimination or calibration, but do not examine both simultaneously. We adapt the prediction error proposal of Graf et al. (Statistics in Medicine 1999, 18, 2529–2545) and Gerds and Schumacher (Biometrical Journal 2006, 48, 1029–1040) to handle models with competing risks, i.e. more than one possible event type, and introduce a consistent estimator. A simulation study investigating the behaviour of the estimator in small sample size situations and for different levels of censoring together with a real data application follows.
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.
2012-01-01
Cancer risk is an important concern for International Space Station (ISS) missions and future exploration missions. An important question concerns the likelihood of a causal association between a crew member's radiation exposure and the occurrence of cancer. The probability of causation (PC), also denoted as attributable risk, is used to make such an estimate. This report summarizes the NASA model of space radiation cancer risks and uncertainties, including improvements to represent uncertainties in tissue-specific cancer incidence models for never-smokers and the U.S. average population. We report on tissue-specific cancer incidence estimates and PC for different post-mission times for ISS and exploration missions. An important conclusion from our analysis is that the NASA policy to limit the risk of exposure-induced death to 3% at the 95% confidence level largely ensures that estimates of the PC for most cancer types would not reach a level of significance. Reducing uncertainties through radiobiological research remains the most efficient method to extend mission length and establish effective mitigators for cancer risks. Efforts to establish biomarkers of space radiation-induced tumors and to estimate PC for rarer tumor types are briefly discussed.
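The probability of causation has a standard attributable-risk form, PC = (RR - 1)/RR = ERR/(1 + ERR). A toy Python sketch follows; the lognormal uncertainty model and its parameters here are illustrative assumptions only, not NASA's actual uncertainty analysis.

```python
import math
import random

def probability_of_causation(rr):
    """Attributable-risk form of the probability of causation:
    PC = (RR - 1) / RR, equivalently ERR / (1 + ERR)."""
    return (rr - 1.0) / rr

def pc_interval(median_rr, log_sigma, n=20000, seed=3):
    """Propagate lognormal uncertainty in the relative risk into a
    95% interval for PC. A toy stand-in for a full uncertainty
    analysis; negative PC draws simply reflect RR draws below 1."""
    random.seed(seed)
    pcs = sorted(
        probability_of_causation(
            math.exp(random.gauss(math.log(median_rr), log_sigma)))
        for _ in range(n)
    )
    return pcs[int(0.025 * n)], pcs[int(0.975 * n)]
```

When the interval for PC comfortably contains zero, the causal attribution fails to reach significance, which mirrors the report's conclusion about the 3% risk limit at the 95% confidence level.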
Zhang, Wei; Regterschot, G Ruben H; Wahle, Fabian; Geraedts, Hilde; Baldus, Heribert; Zijlstra, Wiebren
2014-01-01
Falls result in substantial disability, morbidity, and mortality among older people. Early detection of fall risks and timely intervention can prevent falls and injuries due to falls. Simple field tests, such as repeated chair rise, are used in clinical assessment of fall risks in older people. Development of on-body sensors introduces potential beneficial alternatives for traditional clinical methods. In this article, we present a pendant sensor based chair rise detection and analysis algorithm for fall risk assessment in older people. The recall and the precision of the transfer detection were 85% and 87% in standard protocol, and 61% and 89% in daily life activities. Estimation errors of chair rise performance indicators: duration, maximum acceleration, peak power and maximum jerk were tested in over 800 transfers. Median estimation error in transfer peak power ranged from 1.9% to 4.6% in various tests. Among all the performance indicators, maximum acceleration had the lowest median estimation error of 0% and duration had the highest median estimation error of 24% over all tests. The developed algorithm might be feasible for continuous fall risk assessment in older people.
Wang, Hongrui; Wang, Cheng; Wang, Ying; ...
2017-04-05
This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo algorithm and applies it to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of the daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. As a result, the Bayesian MCMC method may be more favorable for uncertainty analysis and risk management.
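A minimal random-walk Metropolis-Hastings sampler of the kind applied here can be sketched in a few lines. The target below is a toy one-parameter posterior (a mean with a flat prior and unit-variance data), not the paper's hydrological model.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_iter=20000, step=0.5, seed=1):
    """Random-walk Metropolis sampler for a one-dimensional posterior.
    Returns the second half of the chain (first half treated as burn-in)."""
    random.seed(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_iter):
        proposal = x + random.gauss(0.0, step)
        lp_prop = log_post(proposal)
        # Accept with probability min(1, posterior ratio)
        if lp_prop >= lp or random.random() < math.exp(lp_prop - lp):
            x, lp = proposal, lp_prop
        chain.append(x)
    return chain[n_iter // 2:]

# Toy target: posterior of a mean, flat prior, data ~ N(mu, 1)
data = [2.9, 3.1, 3.0, 2.8, 3.2]
log_post = lambda mu: -0.5 * sum((d - mu) ** 2 for d in data)
draws = metropolis_hastings(log_post, x0=0.0)
posterior_mean = sum(draws) / len(draws)
s = sorted(draws)
credible = (s[len(s) // 40], s[-len(s) // 40])  # ~95% credible interval
```

The credible interval comes directly from the empirical quantiles of the chain, which is how an MCMC analysis yields the interval estimates the paper compares against MLE confidence intervals.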
Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.
2012-01-01
Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. 
The benefits of increased precision should be weighed against those risks. The analysis framework presented here will be useful for other species exhibiting heterogeneity by detection method.
Bayesian updating in a fault tree model for shipwreck risk assessment.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M
2017-07-15
Shipwrecks containing oil and other hazardous substances have been deteriorating on the seabeds of the world for many years and are threatening to pollute the marine environment. The status of the wrecks and the potential volume of harmful substances present in the wrecks are affected by a multitude of uncertainties. Each shipwreck poses a unique threat, the nature of which is determined by the structural status of the wreck and possible damage resulting from hazardous activities that could potentially cause a discharge. Decision support is required to ensure the efficiency of the prioritisation process and the allocation of resources required to carry out risk mitigation measures. Whilst risk assessments can provide the requisite decision support, comprehensive methods that take into account key uncertainties related to shipwrecks are limited. The aim of this paper was to develop a method for estimating the probability of discharge of hazardous substances from shipwrecks. The method is based on Bayesian updating of generic information on the hazards posed by different activities in the surroundings of the wreck, with information on site-specific and wreck-specific conditions in a fault tree model. Bayesian updating is performed using Monte Carlo simulations for estimating the probability of a discharge of hazardous substances and formal handling of intrinsic uncertainties. An example application involving two wrecks located off the Swedish coast is presented. Results show the estimated probability of opening, discharge and volume of the discharge for the two wrecks and illustrate the capability of the model to provide decision support. Together with consequence estimations of a discharge of hazardous substances, the suggested model enables comprehensive and probabilistic risk assessments of shipwrecks to be made. Copyright © 2017 Elsevier B.V. All rights reserved.
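The core computation, Monte Carlo propagation of uncertain basic-event probabilities through a fault tree, can be illustrated with a toy two-level tree: a discharge occurs if the hull opens AND at least one surrounding activity triggers a release. The gate structure, event names, and beta priors below are illustrative assumptions, not the paper's model.

```python
import random

def discharge_probability(n_draws=20000, seed=11):
    """Monte Carlo over a toy two-level fault tree. The beta priors are
    stand-ins for the generic hazard information that site- and
    wreck-specific evidence would update in the Bayesian step."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_draws):
        p_open = random.betavariate(2, 8)      # structural opening of the hull
        p_fishing = random.betavariate(1, 9)   # fishing-gear impact
        p_anchor = random.betavariate(1, 19)   # anchor strike
        # OR-gate over the two activities, AND-gate with hull opening
        total += p_open * (1.0 - (1.0 - p_fishing) * (1.0 - p_anchor))
    return total / n_draws

p_discharge = discharge_probability()
```

Averaging the top-event probability over draws from the basic-event distributions carries the intrinsic uncertainty through to the final discharge-probability estimate, rather than collapsing each input to a point value.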
The role of plant phenology in stomatal ozone flux modeling.
Anav, Alessandro; Liu, Qiang; De Marco, Alessandra; Proietti, Chiara; Savi, Flavia; Paoletti, Elena; Piao, Shilong
2018-01-01
Plant phenology plays a pivotal role in the climate system as it regulates gas exchange between the biosphere and the atmosphere. The uptake of ozone by forests is estimated through several meteorological variables and a specific function describing the beginning and termination of the plant growing season; currently, in many risk assessment studies, this function is based on a simple latitude and topography model. In this study, using two satellite datasets, we apply and compare six methods to estimate the start and end dates of the growing season across a large region covering all of Europe for the year 2011. Results show a large variability between the green-up and dormancy dates estimated using the six different methods, with differences greater than one month. Interestingly, however, all the methods display a common spatial pattern in the uptake of ozone by forests, with a marked change in magnitude, up to 1.9 TgO3/year, corresponding to a difference of 25% in the amount of ozone that enters the leaves. Our results indicate that improved estimates of ozone fluxes require a better representation of plant phenology in the models used for O3 risk assessment. © 2017 John Wiley & Sons Ltd.
McElvaine, M D; McDowell, R M; Fite, R W; Miller, L
1993-12-01
The United States Department of Agriculture, Animal and Plant Health Inspection Service (USDA-APHIS) has been exploring methods of quantitative risk assessment to support decision-making, provide risk management options and identify research needs. With current changes in world trade, regulatory decisions must have a scientific basis which is transparent, consistent, documentable and defensible. These quantitative risk assessment methods are described in an accompanying paper in this issue. In the present article, the authors provide an illustration by presenting an application of these methods. Prior to proposing changes in regulations, USDA officials requested an assessment of the risk of introduction of foreign animal disease to the United States of America through garbage from Alaskan cruise ships. The risk assessment team used a combination of quantitative and qualitative methods to evaluate this question. Quantitative risk assessment methods were used to estimate the amount of materials of foreign origin being sent to Alaskan landfills. This application of quantitative risk assessment illustrates the flexibility of the methods in addressing specific questions. By applying these methods, specific areas were identified where more scientific information and research were needed. Even with limited information, the risk assessment provided APHIS management with a scientific basis for a regulatory decision.
Kawada, Tomoyuki; Otsuka, Toshiaki; Inagaki, Hirofumi; Wakayama, Yoko; Li, Qing; Katsumata, Masao
2009-10-01
The Framingham Risk Score (FRS) has frequently been used in the United States to predict the 10-year risk of coronary heart disease (CHD). Components of the metabolic syndrome and several lifestyle factors have also been evaluated to estimate the risk of CHD. To determine the relationship between the FRS and components of metabolic syndrome as coronary risk indicators, the authors conducted a cross-sectional study of 2,619 Japanese male workers, ranging in age from 40 to 64 years, at a single workplace. Although the estimation by the FRS and metabolic syndrome involved some different factors, a significant association between the risks estimated by the 2 methods was observed. When logistic regression analysis was conducted with adjustment for several lifestyle factors, the FRS and serum insulin were found to be significantly associated with the likelihood of metabolic syndrome. The odds ratios and 95% confidence intervals of the FRS per standard deviation increment and of serum insulin per 1 microIU/mL increase for the prediction of metabolic syndrome were 2.50 (2.17-2.88) and 1.24 (1.20-1.27), respectively. A preventive effect of abstaining from drinking every day and of eating breakfast almost daily against the likelihood of metabolic syndrome was also observed. In conclusion, the FRS and insulin were found to be significantly associated with the likelihood of metabolic syndrome, even after controlling for weight change.
Afari-Dwamena, Nana Ama; Li, Ji; Chen, Rusan; Feinleib, Manning; Lamm, Steven H.
2016-01-01
Background. To examine whether the US EPA (2010) lung cancer risk estimate derived from the high arsenic exposures (10–934 µg/L) in southwest Taiwan accurately predicts the US experience from low arsenic exposures (3–59 µg/L). Methods. Analyses have been limited to US counties solely dependent on underground sources for their drinking water supply with median arsenic levels of ≥3 µg/L. Results. Cancer risks (slopes) were found to be indistinguishable from zero for males and females. The addition of arsenic level did not significantly increase the explanatory power of the models. Stratified, or categorical, analysis yielded relative risks that hover about 1.00. The unit risk estimates were nonpositive and not significantly different from zero, and the maximum (95% UCL) unit risk estimates for lung cancer were lower than those in US EPA (2010). Conclusions. These data do not demonstrate an increased risk of lung cancer associated with median drinking water arsenic levels in the range of 3–59 µg/L. The upper-bound estimates of the risks are lower than the risks predicted from the SW Taiwan data and do not support those predictions. These results are consistent with a recent metaregression that indicated no increased lung cancer risk for arsenic exposures below 100–150 µg/L. PMID:27382373
Serum Iron Levels and the Risk of Parkinson Disease: A Mendelian Randomization Study
Gögele, Martin; Lill, Christina M.; Bertram, Lars; Do, Chuong B.; Eriksson, Nicholas; Foroud, Tatiana; Myers, Richard H.; Nalls, Michael; Keller, Margaux F.; Benyamin, Beben; Whitfield, John B.; Pramstaller, Peter P.; Hicks, Andrew A.; Thompson, John R.; Minelli, Cosetta
2013-01-01
Background Although levels of iron are known to be increased in the brains of patients with Parkinson disease (PD), epidemiological evidence on a possible effect of iron blood levels on PD risk is inconclusive, with effects reported in opposite directions. Epidemiological studies suffer from problems of confounding and reverse causation, and mendelian randomization (MR) represents an alternative approach to provide unconfounded estimates of the effects of biomarkers on disease. We performed a MR study where genes known to modify iron levels were used as instruments to estimate the effect of iron on PD risk, based on estimates of the genetic effects on both iron and PD obtained from the largest sample meta-analyzed to date. Methods and Findings We used as instrumental variables three genetic variants influencing iron levels, HFE rs1800562, HFE rs1799945, and TMPRSS6 rs855791. Estimates of their effect on serum iron were based on a recent genome-wide meta-analysis of 21,567 individuals, while estimates of their effect on PD risk were obtained through meta-analysis of genome-wide and candidate gene studies with 20,809 PD cases and 88,892 controls. Separate MR estimates of the effect of iron on PD were obtained for each variant and pooled by meta-analysis. We investigated heterogeneity across the three estimates as an indication of possible pleiotropy and found no evidence of it. The combined MR estimate showed a statistically significant protective effect of iron, with a relative risk reduction for PD of 3% (95% CI 1%–6%; p = 0.001) per 10 µg/dl increase in serum iron. Conclusions Our study suggests that increased iron levels are causally associated with a decreased risk of developing PD. Further studies are needed to understand the pathophysiological mechanism of action of serum iron on PD risk before recommendations can be made. Please see later in the article for the Editors' Summary PMID:23750121
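The MR machinery described here reduces, per variant, to a Wald ratio estimate (gene-outcome effect divided by gene-exposure effect) pooled by inverse-variance weighting. A minimal sketch with hypothetical summary statistics (not the paper's values) follows.

```python
def wald_ratio(beta_gx, beta_gy, se_gy):
    """Per-variant Mendelian randomization estimate: the ratio of the
    gene-outcome association to the gene-exposure association, with a
    first-order delta-method standard error (ignoring uncertainty in
    beta_gx, a common simplification)."""
    return beta_gy / beta_gx, abs(se_gy / beta_gx)

def ivw_pool(estimates):
    """Fixed-effect inverse-variance-weighted pooling of per-variant
    (estimate, standard error) pairs, as in a standard MR meta-analysis."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * est for w, (est, _) in zip(weights, estimates)) / sum(weights)
    return pooled, (1.0 / sum(weights)) ** 0.5

# Hypothetical per-variant summary statistics (not the paper's data):
variants = [wald_ratio(0.50, 0.10, 0.02), wald_ratio(0.25, 0.075, 0.02)]
pooled_effect, pooled_se = ivw_pool(variants)
```

Agreement among the per-variant ratios is what licenses pooling; marked heterogeneity across variants would instead suggest pleiotropy, the check the authors report performing.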
Maciosek, Michael V.; Xu, Xin; Butani, Amy L.; Pechacek, Terry F.
2015-01-01
Objective To accurately assess the benefits of tobacco control interventions and to better inform decision makers, knowledge of medical expenditures by age, gender, and smoking status is essential. Method We propose an approach to distribute smoking-attributable expenditures by age, gender, and cigarette smoking status to reflect the known risks of smoking. We distribute hospitalization days for smoking-attributable diseases according to relative risks of smoking-attributable mortality, and use the method to determine national estimates of smoking-attributable expenditures by age, sex, and cigarette smoking status. Sensitivity analyses explored assumptions of the method. Results Both current and former smokers ages 75 and over have about 12 times the smoking-attributable expenditures of their current and former smoker counterparts 35–54 years of age. Within each age group, the expenditures of former smokers are about 70% lower than those of current smokers. In sensitivity analysis, these results were not robust to large changes to the relative risks of smoking-attributable mortality which were used in the calculations. Conclusion Sex- and age-group-specific smoking expenditures reflect observed disease risk differences between current and former cigarette smokers and indicate that about 70% of current smokers' excess medical care costs are preventable by quitting. PMID:26051203
USING DOSE ADDITION TO ESTIMATE CUMULATIVE RISKS FROM EXPOSURES TO MULTIPLE CHEMICALS
The Food Quality Protection Act (FQPA) of 1996 requires the EPA to consider the cumulative risk from exposure to multiple chemicals that have a common mechanism of toxicity. Three methods, hazard index (HI), point-of-departure index (PODI), and toxicity equivalence factor (TEF), ...
Relative risk of fatal crash involvement by BAC, age, and gender
DOT National Transportation Integrated Search
2000-04-01
The objective of this study was to re-examine and refine estimates for alcohol-related relative risk of driver involvement in fatal crashes by age and gender as a function of blood alcohol concentration (BAC) using recent data. The method of study wa...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, Edward T.
Purpose: To develop a robust method for deriving dose-painting prescription functions using spatial information about the risk for disease recurrence. Methods: Spatial distributions of radiobiological model parameters are derived from distributions of recurrence risk after uniform irradiation. These model parameters are then used to derive optimal dose-painting prescription functions given a constant mean biologically effective dose. Results: An estimate for the optimal dose distribution can be derived based on spatial information about recurrence risk. Dose painting based on imaging markers that are moderately or poorly correlated with recurrence risk is predicted to potentially result in inferior disease control when compared with the same mean biologically effective dose delivered uniformly. A robust optimization approach may partially mitigate this issue. Conclusions: The methods described here can be used to derive an estimate for a robust, patient-specific prescription function for use in dose painting. Two approximate scaling relationships were observed. First, the optimal choice for the maximum dose differential when using either a linear or two-compartment prescription function is proportional to R, where R is the Pearson correlation coefficient between a given imaging marker and recurrence risk after uniform irradiation. Second, the predicted maximum possible gain in tumor control probability for any robust optimization technique is nearly proportional to the square of R.
Ancestry estimation and control of population stratification for sequence-based association studies.
Wang, Chaolong; Zhan, Xiaowei; Bragg-Gresham, Jennifer; Kang, Hyun Min; Stambolian, Dwight; Chew, Emily Y; Branham, Kari E; Heckenlively, John; Fulton, Robert; Wilson, Richard K; Mardis, Elaine R; Lin, Xihong; Swaroop, Anand; Zöllner, Sebastian; Abecasis, Gonçalo R
2014-04-01
Estimating individual ancestry is important in genetic association studies where population structure leads to false positive signals, although assigning ancestry remains challenging with targeted sequence data. We propose a new method for the accurate estimation of individual genetic ancestry, based on direct analysis of off-target sequence reads, and implement our method in the publicly available LASER software. We validate the method using simulated and empirical data and show that the method can accurately infer worldwide continental ancestry when used with sequencing data sets with whole-genome shotgun coverage as low as 0.001×. For estimates of fine-scale ancestry within Europe, the method performs well with coverage of 0.1×. On an even finer scale, the method improves discrimination between exome-sequenced study participants originating from different provinces within Finland. Finally, we show that our method can be used to improve case-control matching in genetic association studies and to reduce the risk of spurious findings due to population structure.
NASA Astrophysics Data System (ADS)
Al-Akad, S.; Akensous, Y.; Hakdaoui, M.
2017-11-01
This article summarizes the application of remote sensing and GIS to the study of urban flood risk in Al Mukalla. Satellite acquisition of a flood event in October 2015 in Al Mukalla (Yemen), combined with flood risk mapping techniques, illustrates the potential risk present in this city. Satellite images (Landsat and DEM data, atmospherically and radiometrically corrected, with geometric and topographic distortions rectified) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is derived by applying image-processing techniques within a geographic information system (GIS) environment, together with the NDVI and NDWI indices and a method to estimate flood-hazard areas. Five factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology and elevation. A multi-criteria analysis addresses vulnerability to flooding and maps the areas of the city of Al Mukalla at risk of flooding. The main objective of this research is to provide a simple and rapid method to reduce and manage flood risk in Yemen, taking the city of Al Mukalla as an example.
A Machine Learning Framework for Plan Payment Risk Adjustment.
Rose, Sherri
2016-12-01
To introduce cross-validation and a nonparametric machine learning framework for plan payment risk adjustment and then assess whether they have the potential to improve risk adjustment. 2011-2012 Truven MarketScan database. We compare the performance of multiple statistical approaches within a broad machine learning framework for estimation of risk adjustment formulas. Total annual expenditure was predicted using age, sex, geography, inpatient diagnoses, and hierarchical condition category variables. The methods included regression, penalized regression, decision trees, neural networks, and an ensemble super learner, all in concert with screening algorithms that reduce the set of variables considered. The performance of these methods was compared based on cross-validated R². Our results indicate that a simplified risk adjustment formula selected via this nonparametric framework maintains much of the efficiency of a traditional larger formula. The ensemble approach also outperformed classical regression and all other algorithms studied. The implementation of cross-validated machine learning techniques provides novel insight into risk adjustment estimation, possibly allowing for a simplified formula, thereby reducing incentives for increased coding intensity as well as the ability of insurers to "game" the system with aggressive diagnostic upcoding. © Health Research and Educational Trust.
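The comparison metric used here, cross-validated R², is easy to make concrete. The sketch below scores one candidate estimator (a simple least-squares line, standing in for the paper's far richer candidates) on synthetic noiseless data; the fold scheme and estimator are illustrative assumptions.

```python
def cv_r2(xs, ys, fit, k=5):
    """k-fold cross-validated R^2 for a fitting routine.
    `fit(train_xs, train_ys)` must return a prediction function."""
    n = len(xs)
    folds = [list(range(i, n, k)) for i in range(k)]
    mean_y = sum(ys) / n
    sst = sum((y - mean_y) ** 2 for y in ys)
    sse = 0.0
    for fold in folds:
        held_out = set(fold)
        train_xs = [xs[i] for i in range(n) if i not in held_out]
        train_ys = [ys[i] for i in range(n) if i not in held_out]
        predict = fit(train_xs, train_ys)  # fit without the held-out fold
        sse += sum((ys[i] - predict(xs[i])) ** 2 for i in fold)
    return 1.0 - sse / sst

def ols_fit(xs, ys):
    """Simple least-squares line, standing in for one candidate estimator."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return lambda x: my + slope * (x - mx)

# Sanity check: on noiseless linear data the cross-validated R^2 is ~1
xs = list(range(20))
ys = [2 * x + 1 for x in xs]
score = cv_r2(xs, ys, ols_fit)
```

Because every prediction is made on data the estimator never saw, ranking candidate formulas by this score guards against the overfitting that in-sample R² rewards, which is the core of the framework described above.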
Cheng, Hui G.; Anthony, James C.
2016-01-01
Purpose We seek answers to three questions about adolescent risk of starting to drink alcoholic beverages: (1) In new United States (US) data, can we reproduce a recently discovered female excess risk? (2) Has a female excess risk emerged in European countries? and (3) Might the size of country-level female-male differences (FMD) be influenced by macro-level gender equality and development processes? Methods Estimates are from US and European surveys of adolescents, 2010–2014. For US estimates, newly incident drinking refers to consuming the first full drink during the 12-month interval just prior to assessment. For all countries, lifetime cumulative incidence of drinking refers to any drinking before assessment of the sampled 15-to-16-year-olds. Results Cumulative meta-analysis summary estimates from the US show a highly reproducible female excess in newly incident drinking among 12-to-17-year-olds (final estimated female-male difference in risk, FMD = 2.1%; 95% confidence interval = 1.5%, 2.7%). Several European countries show female excess risk, estimated as lifetime cumulative incidence of drinking onsets before age 17 years. At the country level, the observed magnitude of FMD in risk is positively associated with the Gender Development Index (especially facets related to education and life expectancy of females relative to males), and with residence in a higher income European country. Conclusions New FMD estimates support reproducibility of a female excess risk in the US. In Europe, evidence of a female excess is modest. Educational attainment, life expectancies, and income merit attention in future FMD research on suspected macro-level processes that influence drinking onsets. PMID:27915406
Linear and nonlinear variable selection in competing risks data.
Ren, Xiaowei; Li, Shanshan; Shen, Changyu; Yu, Zhangsheng
2018-06-15
The subdistribution hazard model for competing risks data has been applied extensively in clinical research. Variable selection methods for linear effects in competing risks data have been studied in the past decade, but there is no existing work on selecting potential nonlinear effects for the subdistribution hazard model. We propose a two-stage procedure to select linear and nonlinear covariates simultaneously and estimate the selected covariate effects. We use a spectral decomposition approach to distinguish the linear and nonlinear parts of each covariate and adaptive LASSO to select each of the two components. Extensive numerical studies demonstrate that the proposed procedure achieves good selection accuracy in the first stage and small estimation biases in the second stage. The proposed method is applied to analyze a cardiovascular disease data set with competing death causes. Copyright © 2018 John Wiley & Sons, Ltd.
Nonparametric analysis of bivariate gap time with competing risks.
Huang, Chiung-Yu; Wang, Chenguang; Wang, Mei-Cheng
2016-09-01
This article considers nonparametric methods for studying recurrent disease and death with competing risks. We first point out that comparisons based on the well-known cumulative incidence function can be confounded by different prevalence rates of the competing events, and that comparisons of the conditional distribution of the survival time given the failure event type are more relevant for investigating the prognosis of different patterns of recurrent disease. We then propose nonparametric estimators for the conditional cumulative incidence function as well as the conditional bivariate cumulative incidence function for the bivariate gap times, that is, the time to disease recurrence and the residual lifetime after recurrence. To quantify the association between the two gap times in the competing risks setting, a modified Kendall's tau statistic is proposed. The proposed estimators for the conditional bivariate cumulative incidence distribution and the association measure account for the induced dependent censoring for the second gap time. Uniform consistency and weak convergence of the proposed estimators are established. Hypothesis testing procedures for two-sample comparisons are discussed. Numerical simulation studies with practical sample sizes are conducted to evaluate the performance of the proposed nonparametric estimators and tests. An application to data from a pancreatic cancer study is presented to illustrate the methods developed in this article. © 2016, The International Biometric Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leung, K; Wong, M; Ng, Y
Purpose: Interventional cardiac procedures utilize frequent fluoroscopy and cineangiography, which impose considerable radiation risk to patients, especially pediatric patients. Accurate calculation of effective dose is important in order to estimate cancer risk over the rest of their lifetime. This study evaluates the difference in effective dose calculated by Monte Carlo simulation with those estimated by locally-derived conversion factors (CF-local) and by commonly quoted conversion factors from Karambatsakidou et al (CF-K). Methods: Effective doses (E) of 12 pediatric patients, aged between 2.5 and 19 years, who had undergone interventional cardiac procedures, were calculated using PCXMC-2.0 software. Tube spectrum, irradiation geometry, exposure parameters and dose-area product (DAP) of each projection were included in the software calculation. Effective doses for each patient were also estimated by two methods: 1) CF-local: a conversion factor derived locally by generalizing the results of the 12 patients, multiplied by the DAP of each patient, gives E-local. 2) CF-K: a selected factor from the above-mentioned literature, multiplied by the DAP of each patient, gives E-K. Results: Means of E, E-local and E-K were 16.01 mSv, 16.80 mSv and 22.25 mSv respectively. A deviation of −29.35% to +34.85% was observed between E and E-local, and a greater deviation of −28.96% to +60.86% between E and E-K. E-K overestimated the effective dose for patients aged 7.5–19. Conclusion: Estimating effective dose by conversion factors is a simple and quick way to assess the radiation risk of pediatric patients. This study showed that estimation by CF-local may bear an error of 35% when compared with Monte Carlo calculation. Using conversion factors derived from other studies may result in an even greater error, of up to 60%, due to factors that are not catered for in the estimation, including patient size, projection angles, exposure parameters, tube filtration, etc.
Users must be aware of these potential inaccuracies when a simple conversion method is employed.
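The conversion-factor shortcut described in this abstract is a single multiplication, E = CF × DAP. The sketch below illustrates it alongside the percent-deviation comparison against a Monte Carlo reference; the DAP, conversion factor, and reference dose values are illustrative assumptions, not data from the study.

```python
def effective_dose(dap_gy_cm2: float, cf_msv_per_gy_cm2: float) -> float:
    """Effective dose estimate E = CF * DAP."""
    return cf_msv_per_gy_cm2 * dap_gy_cm2

def percent_deviation(estimate: float, reference: float) -> float:
    """Signed percent deviation of a CF-based estimate from a reference
    value (e.g. a PCXMC Monte Carlo calculation)."""
    return 100.0 * (estimate - reference) / reference

# Illustrative patient: DAP = 40 Gy*cm2, local CF = 0.42 mSv per Gy*cm2,
# Monte Carlo reference dose = 15.0 mSv (all values hypothetical).
e_local = effective_dose(40.0, 0.42)
dev = percent_deviation(e_local, 15.0)
print(f"E-local = {e_local:.1f} mSv, deviation = {dev:+.1f}%")
```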
A Bayes linear Bayes method for estimation of correlated event rates.
Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim
2013-12-01
Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
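The method-of-moments empirical alternative to a subjective prior mentioned in this abstract can be sketched for the simplest case: independent Poisson counts with a gamma prior fitted by moments, posterior mean (α + x)/(β + t). This is a simplified sketch assuming equal exposure for every unit; it does not reproduce the paper's Bayes linear Bayes machinery or the homogenization factors.

```python
import statistics

def empirical_bayes_poisson(counts, exposure):
    """Method-of-moments empirical Bayes for Poisson event rates.

    Fits a gamma prior Gamma(alpha, beta) to the observed rates x/t and
    returns posterior-mean rate estimates (alpha + x) / (beta + t).
    Uses Var(x/t) = Var(lambda) + E[lambda]/t to back out the prior
    variance of the true rates.
    """
    rates = [x / exposure for x in counts]
    m = statistics.mean(rates)
    v = statistics.variance(rates)
    var_lambda = max(v - m / exposure, 1e-12)  # prior variance estimate
    alpha, beta = m * m / var_lambda, m / var_lambda
    return [(alpha + x) / (beta + exposure) for x in counts]

# Illustrative counts over t = 10 time units for five users.
print(empirical_bayes_poisson([0, 1, 2, 8, 9], exposure=10.0))
```

Note how the extreme raw rates (0.0 and 0.9 events per unit time) are shrunk toward the pooled mean, which is the practical payoff of borrowing strength across correlated units.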
Time Dependence of Collision Probabilities During Satellite Conjunctions
NASA Technical Reports Server (NTRS)
Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.
2017-01-01
The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (Pc) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D Pc” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D Pc” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, Rc. For close-proximity satellites, such as those orbiting in formations or clusters, Rc variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, Rc analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D Pc” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., Pc < 10⁻¹⁰), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., Pc ≥ 10⁻⁵). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-Pc screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
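The traditional "2D Pc" approximation the abstract contrasts against reduces to the probability that the relative position in the encounter plane falls inside the combined hard-body radius. A minimal Monte Carlo sketch of that idea is below; it is not CARA's numerical-integration implementation, and the miss distance, covariance, and hard-body radius are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

def pc_2d_monte_carlo(mean_miss, cov, hbr, n=200_000, rng=rng):
    """Monte Carlo estimate of a 2D collision probability: the fraction of
    encounter-plane position samples, drawn from a bivariate normal, that
    land within the combined hard-body radius."""
    samples = rng.multivariate_normal(mean_miss, cov, size=n)
    inside = np.hypot(samples[:, 0], samples[:, 1]) < hbr
    return inside.mean()

# Illustrative conjunction: 200 m nominal miss, 100 m position sigmas,
# 20 m combined hard-body radius (assumed values, not archive data).
pc = pc_2d_monte_carlo(mean_miss=[200.0, 0.0],
                       cov=np.diag([100.0**2, 100.0**2]),
                       hbr=20.0)
print(f"Pc ~= {pc:.2e}")
```

The 3D method replaces this snapshot with a time-resolved integral, which is what makes the probability rate Rc available.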
[The methods of assessment of health risk from exposure to radon and radon daughters].
Demin, V F; Zhukovskiy, M V; Kiselev, S M
2014-01-01
A critical analysis of existing dose-effect relationship (RDE) models for the effect of radon exposure on human health has been performed, and a conclusion is drawn about the necessity and possibility of improving these models. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating radon exposure doses, the improved RDE model, and a proper risk assessment methodology. The methodology is proposed for use in the territory of Russia.
A systematic review of waterborne disease burden methodologies from developed countries.
Murphy, H M; Pintar, K D M; McBean, E A; Thomas, M K
2014-12-01
The true incidence of endemic acute gastrointestinal illness (AGI) attributable to drinking water in Canada is unknown. Using a systematic review framework, the literature was evaluated to identify methods used to attribute AGI to drinking water. Several strategies have been suggested or applied to quantify AGI attributable to drinking water at a national level. These vary from simple point estimates, to quantitative microbial risk assessment, to Monte Carlo simulations, which rely on assumptions and epidemiological data from the literature. Using two methods proposed by researchers in the USA, this paper compares the current approaches and key assumptions. Knowledge gaps are identified to inform future waterborne disease attribution estimates. To improve future estimates, there is a need for robust epidemiological studies that quantify the health risks associated with small, private water systems, groundwater systems and the influence of distribution system intrusions on risk. Quantification of the occurrence of enteric pathogens in water supplies, particularly for groundwater, is needed. In addition, there are unanswered questions regarding the susceptibility of vulnerable sub-populations to these pathogens and the influence of extreme weather events (precipitation) on AGI-related health risks. National centralized data to quantify the proportions of the population served by different water sources, by treatment level, source water quality, and the condition of the distribution system infrastructure, are needed.
Lu, Chunling; Black, Maureen M; Richter, Linda M
2018-01-01
Background: A 2007 study published in The Lancet estimated that approximately 219 million children aged younger than 5 years were exposed to stunting or extreme poverty in 2004. We updated the 2004 estimates with the use of improved data and methods and generated estimates for 2010. Methods: We used country-level prevalence of stunting in children younger than 5 years based on the 2006 Growth Standards proposed by WHO and poverty ratios from the World Bank to estimate children who were either stunted or lived in extreme poverty for 141 low-income and middle-income countries in 2004 and 2010. To avoid counting the same children twice, we excluded children jointly exposed to stunting and extreme poverty from children living in extreme poverty. To examine the robustness of estimates, we also used moderate poverty measures. Findings: The 2007 study underestimated children at risk of poor development. The estimated number of children exposed to the two risk factors in low-income and middle-income countries decreased from 279·1 million (95% CI 250·4 million–307·4 million) in 2004 to 249·4 million (209·3 million–292·6 million) in 2010; prevalence of children at risk fell from 51% (95% CI 46–56) to 43% (36–51). The decline occurred in all income groups and regions with south Asia experiencing the largest drop. Sub-Saharan Africa had the highest prevalence in both years. These findings were robust to variations in poverty measures. Interpretation: Progress has been made in reducing the number of children exposed to stunting or poverty between 2004 and 2010, but this is still not enough. Scaling up of effective interventions targeting the most vulnerable children is urgently needed. Funding: National Institutes of Health, Bill & Melinda Gates Foundation, Hilton Foundation, and WHO. PMID:27717632
A Tactical Database for the Low Cost Combat Direction System
1990-12-01
another object. Track is a representation of some environmental phenomena converted into accurate estimates of geographical position with respect to...by the method CALCULATE RELATIVE POSITION. In order to obtain a better similarity of methods, the methods OWNSHIP DISTANCE TO PIM, ESTIMATED TIME OF...this mechanism entails the risk that the user will lose all of the work that was done if conflicts are detected and the transaction cannot be committed
Tosteson, Tor D.; Morden, Nancy E.; Stukel, Therese A.; O'Malley, A. James
2014-01-01
The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival. PMID:25506259
The economic value of life: linking theory to practice.
Landefeld, J S; Seskin, E P
1982-01-01
Human capital estimates of the economic value of life have been routinely used in the past to perform cost-benefit analyses of health programs. Recently, however, serious questions have been raised concerning the conceptual basis for valuing human life by applying these estimates. Most economists writing on these issues tend to agree that a more conceptually correct method to value risks to human life in cost-benefit analyses would be based on individuals' "willingness to pay" for small changes in their probability of survival. Attempts to implement the willingness-to-pay approach using survey responses or revealed-preference estimates have produced a confusing array of values fraught with statistical problems and measurement difficulties. As a result, economists have searched for a link between willingness to pay and standard human capital estimates and have found that for most individuals a lower bound for valuing risks to life can be based on their willingness to pay to avoid the expected economic losses associated with death. However, while these studies provide support for using individuals' private valuation of forgone income in valuing risks to life, it is also clear that standard human capital estimates cannot be used for this purpose without reformulation. After reviewing the major approaches to valuing risks to life, this paper concludes that estimates based on the human capital approach--reformulated using a willingness-to-pay criterion--produce the only clear, consistent, and objective values for use in cost-benefit analyses of policies affecting risks to life. The paper presents the first empirical estimates of such adjusted willingness-to-pay/human capital values. PMID:6803602
[Study of lung cancer risk in the electroplating industry in Lombardy based on the OCCAM method].
Panizza, C; Bai, E; Oddone, E; Scaburri, Alessandra; Massari, Stefania; Modonesi, C; Crosignani, P
2011-01-01
The OCCAM method consists of case-control studies aimed at estimating occupational risks by cancer site, by area and by economic sector, using available archives to identify cases and controls; for exposure definition each subject is assigned to the category code of the economic sector or company where he/she worked the longest, obtained by automatic link with the Social Security Institute (INPS) files. The reference category (unexposed) consists of service industry workers. The economic sector is given by the ATECO category that INPS assigns to each firm. In the Lombardy Region, lung cancer risk evaluated for the "metal treatment" industry as a whole was 1.32 (90% CI 1.33-3.10, 67 cases) for males and 1.33 (90% CI 0.51-3.59, 10 cases) for females. The aim of the study was to estimate lung cancer risk among metal electroplating workers only. The metal electroplating firms were identified according to the detailed description of production, information also contained in the INPS files, instead of using the "metal treatment" ATECO code. Lung cancer risk was evaluated using 2001-2008 incident cases identified from hospital discharge records of residents in the Lombardy Region. Controls were a sample from National Health Service files. For the group of firms identified as metal electroplating industries the risk was 2.03 (90% CI 1.69-8.32, 18 cases) for males and 3.75 (90% CI 1.38-9.03, 4 cases) for females. Focusing on the true electroplating firms increased the risk estimates. Even though these risks were due to past exposures, case histories and recent acute effects indicate that, at least in some factories, a carcinogenic hazard still exists.
Kelly, Christopher; Pashayan, Nora; Munisamy, Sreetharan; Powles, John W
2009-06-30
Our aim was to estimate the burden of fatal disease attributable to excess adiposity in England and Wales in 2003 and 2015 and to explore the sensitivity of the estimates to the assumptions and methods used. A spreadsheet implementation of the World Health Organization's (WHO) Comparative Risk Assessment (CRA) methodology for continuously distributed exposures was used. For our base case, adiposity-related risks were assumed to be minimal with a mean (SD) BMI of 21 (1) kg m⁻². All cause mortality risks for 2015 were taken from the Government Actuary and alternative compositions by cause derived. Disease-specific relative risks by BMI were taken from the CRA project and varied in sensitivity analyses. Under base case methods and assumptions for 2003, approximately 41,000 deaths and a loss of 1.05 years of life expectancy were attributed to excess adiposity. Seventy-seven percent of all diabetic deaths, 23% of all ischaemic heart disease deaths and 14% of all cerebrovascular disease deaths were attributed to excess adiposity. Predictions for 2015 were found to be more sensitive to assumptions about the future course of mortality risks for diabetes than to variation in the assumed trend in BMI. On less favourable assumptions the attributable loss of life expectancy in 2015 would rise modestly to 1.28 years. Excess adiposity appears to contribute materially but modestly to mortality risks in England and Wales and this contribution is likely to increase in the future. Uncertainty centres on future trends of associated diseases, especially diabetes. The robustness of these estimates is limited by the lack of control for correlated risks by stratification and by the empirical uncertainty surrounding the effects of prolonged excess adiposity beginning in adolescence.
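The CRA calculation for a continuously distributed exposure compares the mean relative risk under the observed exposure distribution with that under the theoretical-minimum-risk distribution (here BMI ~ N(21, 1)). The sketch below illustrates the attributable-fraction arithmetic; the multiplicative RR-per-unit form and all numeric inputs are assumptions, standing in for the disease-specific RR curves the paper takes from the CRA project.

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def attributable_fraction(mu_obs, sd_obs, rr_per_unit, mu_ref=21.0, sd_ref=1.0,
                          bmi_lo=10.0, bmi_hi=60.0, step=0.1):
    """CRA-style attributable fraction for a continuous exposure:
    AF = (E_obs[RR] - E_ref[RR]) / E_obs[RR], with RR growing
    multiplicatively per BMI unit above the minimum-risk mean."""
    def mean_rr(mu, sd):
        total, x = 0.0, bmi_lo
        while x <= bmi_hi:
            rr = rr_per_unit ** max(0.0, x - mu_ref)
            total += rr * normal_pdf(x, mu, sd) * step  # Riemann sum
            x += step
        return total
    obs, ref = mean_rr(mu_obs, sd_obs), mean_rr(mu_ref, sd_ref)
    return (obs - ref) / obs

# Illustrative inputs: observed BMI ~ N(27, 4.5), RR 1.10 per unit above 21.
paf = attributable_fraction(27.0, 4.5, 1.10)
print(f"attributable fraction ~= {paf:.2f}")
```

Multiplying such a fraction by cause-specific death counts gives the attributable deaths that the abstract reports.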
The effects of spatial population dataset choice on estimates of population at risk of disease
2011-01-01
Background: The spatial modeling of infectious disease distributions and dynamics is increasingly being undertaken for health services planning and disease control monitoring, implementation, and evaluation. Where risks are heterogeneous in space or dependent on person-to-person transmission, spatial data on human population distributions are required to estimate infectious disease risks, burdens, and dynamics. Several different modeled human population distribution datasets are available and widely used, but the disparities among them and the implications for enumerating disease burdens and populations at risk have not been considered systematically. Here, we quantify some of these effects using global estimates of populations at risk (PAR) of P. falciparum malaria as an example. Methods: The recent construction of a global map of P. falciparum malaria endemicity enabled the testing of different gridded population datasets for providing estimates of PAR by endemicity class. The estimated population numbers within each class were calculated for each country using four different global gridded human population datasets: GRUMP (~1 km spatial resolution), LandScan (~1 km), UNEP Global Population Databases (~5 km), and GPW3 (~5 km). More detailed assessments of PAR variation and accuracy were conducted for three African countries where census data were available at a higher administrative-unit level than used by any of the four gridded population datasets. Results: The estimates of PAR based on the datasets varied by more than 10 million people for some countries, even accounting for the fact that estimates of population totals made by different agencies are used to correct national totals in these datasets and can vary by more than 5% for many low-income countries. In many cases, these variations in PAR estimates comprised more than 10% of the total national population.
The detailed country-level assessments suggested that none of the datasets was consistently more accurate than the others in estimating PAR. The sizes of such differences among modeled human populations were related to variations in the methods, input resolution, and date of the census data underlying each dataset. Data quality varied from country to country within the spatial population datasets. Conclusions: Detailed, highly spatially resolved human population data are an essential resource for planning health service delivery for disease control, for the spatial modeling of epidemics, and for decision-making processes related to public health. However, our results highlight that for the low-income regions of the world where disease burden is greatest, existing datasets display substantial variations in estimated population distributions, resulting in uncertainty in disease assessments that utilize them. Increased efforts are required to gather contemporary and spatially detailed demographic data to reduce this uncertainty, particularly in Africa, and to develop population distribution modeling methods that match the rigor, sophistication, and ability to handle uncertainty of contemporary disease mapping and spread modeling. In the meantime, studies that utilize a particular spatial population dataset need to acknowledge the uncertainties inherent within them and consider how the methods and data that comprise each will affect conclusions. PMID:21299885
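The core PAR calculation the abstract describes is a masked sum: population in each grid cell, tallied by the endemicity class of the co-registered cell. A toy sketch, with made-up 3×3 arrays standing in for GRUMP/LandScan-scale rasters:

```python
import numpy as np

# Hypothetical co-registered rasters (illustrative values only).
population = np.array([[120.0,  80.0,   0.0],
                       [ 45.0, 300.0,  60.0],
                       [ 10.0,   5.0, 200.0]])   # people per grid cell
endemicity = np.array([[0, 1, 0],
                       [2, 2, 1],
                       [0, 1, 2]])               # 0=free, 1=low, 2=high

def par_by_class(pop, endem, n_classes=3):
    """Sum population over cells falling in each endemicity class."""
    return [float(pop[endem == k].sum()) for k in range(n_classes)]

print(par_by_class(population, endemicity))
```

Swapping in a different population raster for the same endemicity surface is exactly the substitution that produces the multi-million-person PAR discrepancies the study reports.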
Gan, Ryan W.; Ford, Bonne; Lassman, William; Pfister, Gabriele; Vaidyanathan, Ambarish; Fischer, Emily; Volckens, John; Pierce, Jeffrey R.; Magzamen, Sheryl
2017-01-01
Climate forecasts predict an increase in frequency and intensity of wildfires. Associations between health outcomes and population exposure to smoke from Washington 2012 wildfires were compared using surface monitors, chemical-weather models, and a novel method blending three exposure information sources. The association between smoke particulate matter ≤2.5 μm in diameter (PM2.5) and cardiopulmonary hospital admissions occurring in Washington from 1 July to 31 October 2012 was evaluated using a time-stratified case-crossover design. Hospital admissions aggregated by ZIP code were linked with population-weighted daily average concentrations of smoke PM2.5 estimated using three distinct methods: a simulation with the Weather Research and Forecasting with Chemistry (WRF-Chem) model, a kriged interpolation of PM2.5 measurements from surface monitors, and a geographically weighted ridge regression (GWR) that blended inputs from WRF-Chem, satellite observations of aerosol optical depth, and kriged PM2.5. A 10 μg/m3 increase in GWR smoke PM2.5 was associated with an 8% increased risk in asthma-related hospital admissions (odds ratio (OR): 1.076, 95% confidence interval (CI): 1.019–1.136); other smoke estimation methods yielded similar results. However, point estimates for chronic obstructive pulmonary disease (COPD) differed by smoke PM2.5 exposure method: a 10 μg/m3 increase using GWR was significantly associated with increased risk of COPD (OR: 1.084, 95% CI: 1.026–1.145) and not significant using WRF-Chem (OR: 0.986, 95% CI: 0.931–1.045). The magnitude (OR) and uncertainty (95% CI) of associations between smoke PM2.5 and hospital admissions were dependent on estimation method used and outcome evaluated. Choice of smoke exposure estimation method used can impact the overall conclusion of the study. PMID:28868515
Evidence that breast tissue stiffness is associated with risk of breast cancer.
Boyd, Norman F; Li, Qing; Melnichouk, Olga; Huszti, Ella; Martin, Lisa J; Gunasekara, Anoma; Mawdsley, Gord; Yaffe, Martin J; Minkin, Salomon
2014-01-01
Evidence from animal models shows that tissue stiffness increases the invasion and progression of cancers, including mammary cancer. Here we use measurements of the volume and the projected area of the compressed breast during mammography to derive estimates of breast tissue stiffness and examine the relationship of stiffness to risk of breast cancer. Mammograms were used to measure the volume and projected areas of total and radiologically dense breast tissue in the unaffected breasts of 362 women with newly diagnosed breast cancer (cases) and 656 women of the same age who did not have breast cancer (controls). Measures of breast tissue volume and the projected area of the compressed breast during mammography were used to calculate the deformation of the breast during compression and, with the recorded compression force, to estimate the stiffness of breast tissue. Stiffness was compared in cases and controls, and associations with breast cancer risk were examined after adjustment for other risk factors. After adjustment for percent mammographic density by area measurements, and other risk factors, our estimate of breast tissue stiffness was significantly associated with breast cancer (odds ratio = 1.21, 95% confidence interval = 1.03, 1.43, p = 0.02) and improved breast cancer risk prediction in models with percent mammographic density, by both area and volume measurements. An estimate of breast tissue stiffness was associated with breast cancer risk and improved risk prediction based on mammographic measures and other risk factors. Stiffness may provide an additional mechanism by which breast tissue composition is associated with risk of breast cancer and merits examination using more direct methods of measurement.
Ensemble of trees approaches to risk adjustment for evaluating a hospital's performance.
Liu, Yang; Traskin, Mikhail; Lorch, Scott A; George, Edward I; Small, Dylan
2015-03-01
A commonly used method for evaluating a hospital's performance on an outcome is to compare the hospital's observed outcome rate to the hospital's expected outcome rate given its patient (case) mix and service. The process of calculating the hospital's expected outcome rate given its patient mix and service is called risk adjustment (Iezzoni 1997). Risk adjustment is critical for accurately evaluating and comparing hospitals' performances since we would not want to unfairly penalize a hospital just because it treats sicker patients. The key to risk adjustment is accurately estimating the probability of an outcome given patient characteristics. For cases with binary outcomes, the method that is commonly used in risk adjustment is logistic regression. In this paper, we consider ensemble of trees methods as alternatives for risk adjustment, including random forests and Bayesian additive regression trees (BART). Both random forests and BART are modern machine learning methods that have been shown recently to have excellent performance for prediction of outcomes in many settings. We apply these methods to carry out risk adjustment for the performance of neonatal intensive care units (NICU). We show that these ensemble of trees methods outperform logistic regression in predicting mortality among babies treated in NICU, and provide a superior method of risk adjustment compared to logistic regression.
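The observed-versus-expected comparison that risk adjustment feeds into can be sketched as an O/E ratio: observed outcomes divided by the sum of model-predicted outcome probabilities for the same cases. The model supplying those probabilities can be logistic regression, random forests, or BART; the toy case data below are made up.

```python
def standardized_outcome_ratio(observed, expected_probs):
    """Observed-to-expected (O/E) outcome ratio for one hospital.

    `observed` is a 0/1 outcome per case; `expected_probs` is the
    risk-adjusted probability of the outcome per case from any risk
    model. O/E > 1 suggests worse-than-expected performance for the
    hospital's case mix; O/E < 1 suggests better-than-expected.
    """
    return sum(observed) / sum(expected_probs)

# Illustrative example: 3 events in 6 cases, with case-level risk
# predictions from some fitted model (probabilities are hypothetical).
ratio = standardized_outcome_ratio([1, 0, 1, 0, 1, 0],
                                   [0.9, 0.2, 0.8, 0.3, 0.7, 0.1])
print(f"O/E = {ratio:.2f}")
```

Better probability estimates (the paper's argument for ensembles of trees) make the denominator, and hence the fairness of the comparison, more trustworthy.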
Influence of safety measures on the risks of transporting dangerous goods through road tunnels.
Saccomanno, Frank; Haastrup, Palle
2002-12-01
Quantitative risk assessment (QRA) models are used to estimate the risks of transporting dangerous goods and to assess the merits of introducing alternative risk reduction measures for different transportation scenarios and assumptions. A comprehensive QRA model recently was developed in Europe for application to road tunnels. This model can assess the merits of a limited number of "native safety measures." In this article, we introduce a procedure for extending its scope to include the treatment of a number of important "nonnative safety measures" of interest to tunnel operators and decisionmakers. Nonnative safety measures were not included in the original model specification. The suggested procedure makes use of expert judgment and Monte Carlo simulation methods to model uncertainty in the revised risk estimates. The results of a case study application are presented that involve the risks of transporting a given volume of flammable liquid through a 10-km road tunnel.
Haloacetic acids in drinking water and risk for stillbirth
King, W; Dodds, L; Allen, A; Armson, B; Fell, D; Nimrod, C
2005-01-01
Aims: To investigate the effects of haloacetic acid (HAA) compounds in drinking water on stillbirth risk. Methods: A population based case-control study was conducted in Nova Scotia and Eastern Ontario, Canada. Estimates of daily exposure to total and specific HAAs were based on household water samples and questionnaire information on water consumption at home and work. Results: The analysis included 112 stillbirth cases and 398 live birth controls. In analysis without adjustment for total THM exposure, a relative risk greater than 2 was observed for an intermediate exposure category for total HAA and dichloroacetic acid measures. After adjustment for total THM exposure, the risk estimates for intermediate exposure categories were diminished, the relative risk associated with the highest category was in the direction of a protective effect, and all confidence intervals included the null value. Conclusions: No association was observed between HAA exposures and stillbirth risk after controlling for THM exposures. PMID:15657195
In vitro bioaccessibility assays (IVBA) estimate arsenic (As) relative bioavailability (RBA) in contaminated soils to improve the accuracy of site-specific human exposure assessments and risk calculations. For an IVBA assay to gain acceptance for use in risk assessment, it must ...
Goovaerts, Pierre
2006-01-01
Boundary analysis of cancer maps may highlight areas where causative exposures change through geographic space, the presence of local populations with distinct cancer incidences, or the impact of different cancer control methods. Too often, such analysis ignores the spatial pattern of incidence or mortality rates and overlooks the fact that rates computed from sparsely populated geographic entities can be very unreliable. This paper proposes a new methodology that accounts for the uncertainty and spatial correlation of rate data in the detection of significant edges between adjacent entities or polygons. Poisson kriging is first used to estimate the risk value and the associated standard error within each polygon, accounting for the population size and the risk semivariogram computed from raw rates. The boundary statistic is then defined as half the absolute difference between kriged risks. Its reference distribution, under the null hypothesis of no boundary, is derived through the generation of multiple realizations of the spatial distribution of cancer risk values. This paper presents three types of neutral models generated using methods of increasing complexity: the common random shuffle of estimated risk values, a spatial re-ordering of these risks, or p-field simulation that accounts for the population size within each polygon. The approach is illustrated using age-adjusted pancreatic cancer mortality rates for white females in 295 US counties of the Northeast (1970–1994). Simulation studies demonstrate that Poisson kriging yields more accurate estimates of the cancer risk and how its value changes between polygons (i.e. boundary statistic), relatively to the use of raw rates or local empirical Bayes smoother. When used in conjunction with spatial neutral models generated by p-field simulation, the boundary analysis based on Poisson kriging estimates minimizes the proportion of type I errors (i.e. 
edges wrongly declared significant) while the frequency of these errors is predicted well by the p-value of the statistical test. PMID:19023455
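The boundary statistic and the simplest neutral model described above (a random shuffle of the estimated risk values) can be sketched as follows. The polygon IDs, kriged risk values, and adjacency list below are invented for illustration; Poisson kriging itself and the spatially structured neutral models (spatial re-ordering, p-field simulation) are omitted.

```python
import random

# Hypothetical kriged risk estimates for five polygons and their adjacency
risks = {"A": 8.2, "B": 8.5, "C": 12.9, "D": 8.1, "E": 13.4}
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E")]

def boundary_stat(r, i, j):
    """Half the absolute difference between kriged risks of two polygons."""
    return 0.5 * abs(r[i] - r[j])

observed = {e: boundary_stat(risks, *e) for e in edges}

# Reference distribution under the null hypothesis of no boundary:
# shuffle the estimated risk values across polygons many times.
random.seed(42)
n_sim = 999
exceed = {e: 0 for e in edges}
ids, values = list(risks), list(risks.values())
for _ in range(n_sim):
    random.shuffle(values)
    shuffled = dict(zip(ids, values))
    for e in edges:
        if boundary_stat(shuffled, *e) >= observed[e]:
            exceed[e] += 1

# One-sided permutation p-value for each candidate edge
p_values = {e: (exceed[e] + 1) / (n_sim + 1) for e in edges}
for e in edges:
    print(e, f"stat={observed[e]:.2f}", f"p={p_values[e]:.3f}")
```

Edges whose observed statistic is rarely exceeded under the shuffled realizations would be declared significant boundaries.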
NASA Astrophysics Data System (ADS)
Che Awang, Aznida; Azah Samat, Nor
2017-09-01
Leptospirosis is a disease caused by infection with pathogenic species of the genus Leptospira. Humans can be infected through direct or indirect exposure to the urine of infected animals. Urine excreted by animal hosts carrying pathogenic Leptospira contaminates soil and water, so people can become infected when contaminated soil or water reaches a cut or open wound on the skin. The bacteria can also enter the body through mucous membranes such as the nose, eyes and mouth, for example when contaminated water or urine splashes into the eyes or contaminated water or food is swallowed. There is currently no vaccine available for the prevention or treatment of leptospirosis, but the disease can be treated if diagnosed early enough to avoid complications. Disease risk mapping is important for the control and prevention of disease, and a good choice of statistical model produces a good disease risk map. The aim of this study is therefore to estimate the relative risk of leptospirosis, based initially on the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and on the Poisson-gamma model. This paper begins with a review of the SMR method and the Poisson-gamma model, which we then apply to leptospirosis data from Kelantan, Malaysia. The results are displayed and compared using graphs, tables and maps. They show that the Poisson-gamma model produces better relative risk estimates than the SMR method, because it overcomes the drawback of the SMR, whose relative risk estimate becomes zero whenever a region records no observed leptospirosis cases. However, the Poisson-gamma model has its own limitations: covariate adjustment is difficult, and the model does not allow for spatial correlation between risks in neighbouring areas.
These limitations have motivated many researchers to introduce alternative methods for estimating the risk.
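The two estimators compared above can be sketched in a few lines. The observed (O) and expected (E) case counts per district are invented for illustration (not the Kelantan data), and the Gamma prior parameters are assumed rather than estimated:

```python
# Invented observed (O) and expected (E) leptospirosis counts per district
O = [0, 3, 7, 1, 12]
E = [2.1, 2.8, 5.5, 1.9, 9.7]

# Standardized Morbidity Ratio: the relative risk estimate is exactly 0
# wherever O = 0 -- the drawback noted above.
smr = [o / e for o, e in zip(O, E)]

# Poisson-gamma model: O_i ~ Poisson(E_i * theta_i), theta_i ~ Gamma(a, b).
# The posterior mean (O_i + a) / (E_i + b) shrinks each raw SMR toward the
# prior mean a/b and stays positive even for zero-count districts.
a, b = 2.0, 2.0
pg = [(o + a) / (e + b) for o, e in zip(O, E)]

for i, (s, p) in enumerate(zip(smr, pg)):
    print(f"district {i}: SMR={s:.2f}  Poisson-gamma={p:.2f}")
```

Note how district 0 gets a zero relative risk from the SMR but a positive, shrunken estimate from the Poisson-gamma model.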
On cancer risk estimation of urban air pollution.
Törnqvist, M; Ehrenberg, L
1994-01-01
The usefulness of data from various sources for a cancer risk estimation of urban air pollution is discussed. Considering the irreversibility of initiations, a multiplicative model is preferred for solid tumors. As has been concluded for exposure to ionizing radiation, the multiplicative model, in comparison with the additive model, predicts a relatively larger number of cases at high ages, with enhanced underestimation of risks by short follow-up times in disease-epidemiological studies. For related reasons, the extrapolation of risk from animal tests on the basis of daily absorbed dose per kilogram body weight or per square meter surface area without considering differences in life span may lead to an underestimation, and agreements with epidemiologically determined values may be fortuitous. Considering these possibilities, the most likely lifetime risks of cancer death at the average exposure levels in Sweden were estimated for certain pollution fractions or indicator compounds in urban air. The risks amount to approximately 50 deaths per 100,000 for inhaled particulate organic material (POM), with a contribution from ingested POM about three times larger, while alkenes and butadiene each account for about 20 deaths per 100,000 individuals. Also, benzene and formaldehyde are expected to be associated with considerable risk increments. Comparative potency methods were applied for POM and alkenes. Due to the incompleteness of the list of compounds considered and the uncertainties of the above estimates, a calculation of the total risk from urban air has not been attempted here. PMID:7821292
A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels
NASA Astrophysics Data System (ADS)
Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian
2016-08-01
Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research presents a methodology for dynamic risk assessment and management of rockbursts in drill-and-blast (D&B) tunnels. The basic idea of dynamic risk assessment and management of rockbursts is set out, and methods associated with each step in the assessment and management process are given. The main parts include a microseismic method for early warning of the occurrence probability of rockbursts, an estimation method for assessing the potential consequences of rockburst risk, an evaluation method using a new quantitative index that combines occurrence probability and consequences to determine the level of rockburst risk, and a procedure for dynamic updating. Specifically, this research briefly describes the referenced microseismic method for rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified with cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a total length of 11.6 km of D&B tunnels).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and a 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
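The basic coverage-versus-sample-size tradeoff studied in such reports can be illustrated with one elementary bounding rule: use the sample maximum of n draws as an upper bound on a tail quantile. For any continuous distribution the chance the bound holds is exactly 1 - q**n. This is only a sketch of the general idea, not one of the report's specific estimators; the distribution and sample size are chosen arbitrarily.

```python
import random
import statistics

# Coverage of the sample maximum as a bound on the q-th quantile:
# analytically 1 - q**n for n i.i.d. draws from a continuous distribution.
random.seed(3)
n, trials, q = 10, 20_000, 0.975
true_quantile = statistics.NormalDist().inv_cdf(q)

hits = sum(
    max(random.gauss(0, 1) for _ in range(n)) >= true_quantile
    for _ in range(trials)
)
empirical = hits / trials
analytic = 1 - q ** n
print(f"coverage: empirical={empirical:.3f}, analytic={analytic:.3f}")
```

With only 10 samples the bound holds barely a quarter of the time, which is why sparse-data strategies must trade extra conservatism for reliability.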
Joint-specific risk of impaired function in fibrodysplasia ossificans progressiva (FOP).
Pignolo, Robert J; Durbin-Johnson, Blythe P; Rocke, David M; Kaplan, Frederick S
2018-04-01
Fibrodysplasia ossificans progressiva (FOP) causes progressive disability due to heterotopic ossification from episodic flare-ups. Using data from 500 FOP patients (representing 63% of all known patients world-wide), age- and joint-specific risks of new joint involvement were estimated using parametric and nonparametric statistical methods. Compared to data from a 1994 survey of 44 individuals with FOP, our current estimates of age- and joint-specific risks of new joint involvement are more accurate (narrower confidence limits), are based on a wider range of ages, and are less biased owing to their greater comprehensiveness (capturing over three-fifths of the known FOP patients worldwide). For the neck, chest, abdomen, and upper back, the estimated hazard decreases over time. For the jaw, lower back, shoulder, elbow, wrist, fingers, hip, knee, ankle, and foot, the estimated hazard increases initially then either plateaus or decreases. At any given time and for any anatomic site, the data indicate which joints are at risk. This study of approximately 63% of the world's known population of FOP patients provides a refined estimate of risk for new involvement at any joint at any age, as well as the proportion of patients with uninvolved joints at any age. Importantly, these joint-specific survival curves can be used to facilitate clinical trial design and to determine if potential treatments can modify the predicted trajectory of progressive joint dysfunction. Copyright © 2017 Elsevier Inc. All rights reserved.
Gruber, Joshua S; Arnold, Benjamin F; Reygadas, Fermin; Hubbard, Alan E; Colford, John M
2014-05-01
Complier average causal effects (CACE) estimate the impact of an intervention among treatment compliers in randomized trials. Methods used to estimate CACE have been outlined for parallel-arm trials (e.g., using an instrumental variables (IV) estimator) but not for other randomized study designs. Here, we propose a method for estimating CACE in randomized stepped wedge trials, where experimental units cross over from control conditions to intervention conditions in a randomized sequence. We illustrate the approach with a cluster-randomized drinking water trial conducted in rural Mexico from 2009 to 2011. Additionally, we evaluated the plausibility of assumptions required to estimate CACE using the IV approach, which are testable in stepped wedge trials but not in parallel-arm trials. We observed small increases in the magnitude of CACE risk differences compared with intention-to-treat estimates for drinking water contamination (risk difference (RD) = -22% (95% confidence interval (CI): -33, -11) vs. RD = -19% (95% CI: -26, -12)) and diarrhea (RD = -0.8% (95% CI: -2.1, 0.4) vs. RD = -0.1% (95% CI: -1.1, 0.9)). Assumptions required for IV analysis were probably violated. Stepped wedge trials allow investigators to estimate CACE with an approach that avoids the stronger assumptions required for CACE estimation in parallel-arm trials. Inclusion of CACE estimates in stepped wedge trials with imperfect compliance could enhance reporting and interpretation of the results of such trials.
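In its simplest form, the IV estimator referenced above is a Wald-type ratio: the intention-to-treat risk difference scaled by the difference in treatment uptake between arms. A minimal sketch under the standard IV assumptions, with invented inputs (not the trial's data):

```python
# Wald-type instrumental variables estimator for the complier average
# causal effect (CACE), expressed as a risk difference.

def cace_wald(risk_treat, risk_control, uptake_treat, uptake_control):
    """CACE risk difference under the standard IV assumptions
    (randomization, exclusion restriction, monotonicity)."""
    itt_rd = risk_treat - risk_control            # intention-to-treat effect
    compliance_diff = uptake_treat - uptake_control
    return itt_rd / compliance_diff

# A 19-point ITT reduction diluted by 86% net uptake implies a somewhat
# larger effect among compliers:
print(round(cace_wald(0.41, 0.60, 0.86, 0.00), 3))
```

Because compliance is below 100%, the CACE is larger in magnitude than the ITT estimate, mirroring the pattern the trial reports.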
Strak, Maciej; Janssen, Nicole; Beelen, Rob; Schmitz, Oliver; Karssenberg, Derek; Houthuijs, Danny; van den Brink, Carolien; Dijst, Martin; Brunekreef, Bert; Hoek, Gerard
2017-07-01
Cohorts based on administrative data have size advantages over individual cohorts in investigating air pollution risks, but often lack in-depth information on individual risk factors related to lifestyle. If there is a correlation between lifestyle and air pollution, omitted lifestyle variables may result in biased air pollution risk estimates. Correlations between lifestyle and air pollution can be induced by socio-economic status affecting both lifestyle and air pollution exposure. Our overall aim was to assess potential confounding by missing lifestyle factors on air pollution mortality risk estimates. The first aim was to assess associations between long-term exposure to several air pollutants and lifestyle factors. The second aim was to assess whether these associations were sensitive to adjustment for individual and area-level socioeconomic status (SES), and whether they differed between subgroups of the population. Using the obtained air pollution-lifestyle associations and indirect adjustment methods, our third aim was to investigate the potential bias due to missing lifestyle information on air pollution mortality risk estimates in administrative cohorts. We used a recent Dutch national health survey of 387,195 adults to investigate the associations of PM10, PM2.5, PM2.5-10, PM2.5 absorbance, OPDTT, OPESR and NO2 annual average concentrations at the residential address from land use regression models with individual smoking habits, alcohol consumption, physical activity and body mass index. We assessed the associations with and without adjustment for neighborhood and individual SES characteristics typically available in administrative data cohorts. We illustrated the effect of including lifestyle information on the air pollution mortality risk estimates in administrative cohort studies using a published indirect adjustment method. Current smoking and alcohol consumption were generally positively associated with air pollution.
Physical activity and overweight were negatively associated with air pollution. The effect estimates were small (mostly <5% of the air pollutant standard deviations). Direction and magnitude of the associations depended on the pollutant, use of continuous vs. categorical scale of the lifestyle variable, and level of adjustment for individual and area-level SES. Associations further differed between subgroups (age, sex) in the population. Despite the small associations between air pollution and smoking intensity, indirect adjustment resulted in considerable changes of air pollution risk estimates for cardiovascular and especially lung cancer mortality. Individual lifestyle-related risk factors were weakly associated with long-term exposure to air pollution in the Netherlands. Indirect adjustment for missing lifestyle factors in administrative data cohort studies may substantially affect air pollution mortality risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Health Risk of Exposure to Atmospheric Pollutant Particles
In relation to multi-component mixture nature of atmospheric PM, this presentation will discuss methods for estimating the respiratory internal dose by experiment and mathematical modeling, limitations of each method and interpretations of the results in the context of health ris...
Contraceptive use and risk of unintended pregnancy in California.
Foster, Diana G; Bley, Julia; Mikanda, John; Induni, Marta; Arons, Abigail; Baumrind, Nikki; Darney, Philip D; Stewart, Felicia
2004-07-01
California is home to more than one out of eight American women of reproductive age. Because California has a large, diverse and growing population, national statistics do not necessarily describe the reproductive health of California women. This article presents risk for pregnancy and sexually transmitted infections among women in California based on the California Women's Health Survey. Over 8900 women of reproductive age who participated in this survey between 1998 and 2001 provide estimates of access to care and use of family-planning methods in the state. We find that 49% of the female population aged 18-44 in California is at risk of unintended pregnancy. Nine percent of women at risk of an unintended pregnancy are not using any method of contraception, primarily for method-related reasons, such as a concern about side effects or a dislike of available contraceptive methods. Among women at risk for unintended pregnancy, we find disparities by race/ethnicity and education in the use of contraceptive methods.
Dawe, Russell Eric; Bishop, Jessica; Pendergast, Amanda; Avery, Susan; Monaghan, Kelly; Duggan, Norah; Aubrey-Bassler, Kris
2017-01-01
Background: Previous research suggests that family physicians have rates of cesarean delivery that are lower than or equivalent to those for obstetricians, but adjustments for risk differences in these analyses may have been inadequate. We used an econometric method to adjust for observed and unobserved factors affecting the risk of cesarean delivery among women attended by family physicians versus obstetricians. Methods: This retrospective population-based cohort study included all Canadian (except Quebec) hospital deliveries by family physicians and obstetricians between Apr. 1, 2006, and Mar. 31, 2009. We excluded women with multiple gestations, and newborns with a birth weight less than 500 g or gestational age less than 20 weeks. We estimated the relative risk of cesarean delivery using both instrumental-variable-adjusted regression and logistic regression. Results: The final cohort included 776 299 women who gave birth in 390 hospitals. The risk of cesarean delivery was 27.3%, and the mean proportion of deliveries by family physicians was 26.9% (standard deviation 23.8%). The relative risk of cesarean delivery for family physicians versus obstetricians was 0.48 (95% confidence interval [CI] 0.41-0.56) with logistic regression and 1.27 (95% CI 1.02-1.57) with instrumental-variable-adjusted regression. Interpretation: Our conventional analyses suggest that family physicians have a lower rate of cesarean delivery than obstetricians, but instrumental variable analyses suggest the opposite. Because instrumental variable methods adjust for unmeasured factors and traditional methods do not, the large discrepancy between these estimates of risk suggests that clinical and/or sociocultural factors affecting the decision to perform cesarean delivery may not be accounted for in our database. PMID:29233843
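The logic of instrumental-variable adjustment can be sketched with two-stage least squares on a linear probability scale. Everything below is simulated for illustration: the instrument (a hospital-level share of family-physician deliveries, a common choice in such analyses), the unmeasured confounder, and all effect sizes are invented, not the study's data.

```python
import numpy as np

# Simulated delivery data with an unmeasured risk factor u that affects
# both who attends the delivery (x) and the cesarean outcome (y).
rng = np.random.default_rng(0)
n = 50_000
z = rng.uniform(0, 1, n)                      # instrument, independent of u
u = (rng.normal(0, 1, n) > 1).astype(float)   # unmeasured risk factor
x = (rng.uniform(0, 1, n) < 0.2 + 0.6 * z - 0.1 * u).astype(float)
y = (rng.uniform(0, 1, n) < 0.25 + 0.03 * x + 0.10 * u).astype(float)

# Stage 1: regress treatment x on the instrument.
# Stage 2: regress the outcome y on the fitted values of x.
Z1 = np.column_stack([np.ones(n), z])
x_hat = Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]
X2 = np.column_stack([np.ones(n), x_hat])
beta = np.linalg.lstsq(X2, y, rcond=None)[0]
print(f"IV-adjusted risk difference: {beta[1]:.3f}")
```

Because the instrument is independent of u, the stage-2 slope recovers the causal risk difference (0.03 in this simulation) even though a naive regression of y on x would be confounded by u.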
Boden-Albala, Bernadette; Carman, Heather; Moran, Megan; Doyle, Margaret; Paik, Myunghee C.
2011-01-01
Objectives Risk modification through behavior change is critical for primary and secondary stroke prevention. Theories of health behavior identify perceived risk as an important component to facilitate behavior change; however, little is known about perceived risk of vascular events among stroke survivors. Methods The SWIFT (Stroke Warning Information and Faster Treatment) study includes a prospective population-based ethnically diverse cohort of ischemic stroke and transient ischemic attack survivors. We investigate the baseline relationship between demographics, health beliefs, and knowledge on risk perception. Regression models examined predictors of inaccurate perception. Results Of the 817 study participants, only 20% accurately estimated their risk, 10% underestimated it, and 70% significantly overestimated their risk for a recurrent stroke. The mean perceived likelihood of recurrent ischemic stroke in the next 10 years was 51 ± 7%. We found no significant differences by race-ethnicity with regard to accurate estimation of risk. Inaccurate estimation of risk was associated with attitudes and beliefs [worry (p < 0.04), fatalism (p < 0.07)] and memory problems (p < 0.01), but not with history or knowledge of vascular risk factors. Conclusion This paper provides a unique perspective on how factors such as belief systems influence risk perception in a diverse population at high stroke risk. There is a need for future research on how risk perception can inform primary and secondary stroke prevention. Copyright © 2011 S. Karger AG, Basel PMID:21894045
Socioeconomic indicators of heat-related health risk supplemented with remotely sensed data
Johnson, Daniel P; Wilson, Jeffrey S; Luber, George C
2009-01-01
Background Extreme heat events are the number one cause of weather-related fatalities in the United States. The current system of alert for extreme heat events does not take into account intra-urban spatial variation in risk. The purpose of this study is to evaluate a potential method to improve spatial delineation of risk from extreme heat events in urban environments by integrating sociodemographic risk factors with estimates of land surface temperature derived from thermal remote sensing data. Results Comparison of logistic regression models indicates that supplementing known sociodemographic risk factors with remote sensing estimates of land surface temperature improves the delineation of intra-urban variations in risk from extreme heat events. Conclusion Thermal remote sensing data can be utilized to improve understanding of intra-urban variations in risk from extreme heat. The refinement of current risk assessment systems could increase the likelihood of survival during extreme heat events and assist emergency personnel in the delivery of vital resources during such disasters. PMID:19835578
Relative cancer risks of chemical contaminants in the great lakes
NASA Astrophysics Data System (ADS)
Bro, Kenneth M.; Sonzogni, William C.; Hanson, Mark E.
1987-08-01
Anyone who drinks water or eats fish from the Great Lakes consumes potentially carcinogenic chemicals. In choosing how to respond to such pollution, it is important to put the risks these contaminants pose in perspective. Based on recent measurements of carcinogens in Great Lakes fish and water, calculations of lifetime risks of cancer indicate that consumers of sport fish face cancer risks from Great Lakes contaminants that are several orders of magnitude higher than the risks posed by drinking Great Lakes water. But drinking urban groundwater and breathing urban air may be as hazardous as frequent consumption of sport fish from the Great Lakes. Making such comparisons is difficult because of variation in types and quality of information available and in the methods for estimating risk. Much uncertainty pervades the risk assessment process in such areas as estimating carcinogenic potency and human exposure to contaminants. If risk assessment is to be made more useful, it is important to quantify this uncertainty.
Moura, Erly Catarina; Claro, Rafael Moreira; Bernal, Regina; Ribeiro, Juliano; Malta, Deborah Carvalho; Morais Neto, Otaliba
2011-02-01
The study objective was to evaluate the feasibility of interviews by cell phone as a complement to interviews by landline to estimate risk and protection factors for chronic non-communicable diseases. Adult cell phone users were evaluated by random digit dialing. Questions asked were: age, sex, education, race, marital status, ownership of landline and cell phones, health condition, weight and height, medical diagnosis of hypertension and diabetes, physical activity, diet, binge drinking and smoking. The estimates were calculated using post-stratification weights. The cell phone interview system showed a reduced capacity to reach elderly and low educated populations. The estimates of the risk and protection factors for chronic non-communicable diseases in cell phone interviews were equal to the estimates obtained by landline phone. Eligibility, success and refusal rates using the cell phone system were lower than those of the landline system, but loss and cost were much higher, suggesting it is unsatisfactory as a complementary method in such a context.
Moojong, Park; Hwandon, Jun; Minchul, Shin
2008-01-01
Sediments entering the sewer in urban areas reduce the conveyance in sewer pipes, which increases inundation risk. To estimate sediment yields, individual landuse areas in each sub-basin should be obtained. However, because of the complex nature of an urban area, this is almost impossible to obtain manually. Thus, a methodology to obtain individual landuse areas for each sub-basin has been suggested for estimating sediment yields. Using GIS, an urban area is divided into sub-basins with respect to the sewer layout, with the area of individual landuse estimated for each sub-basin. The sediment yield per unit area for each sub-basin is then calculated. The suggested method was applied to the GunJa basin in Seoul. For a relation analysis between sediments and inundation risk, sub-basins were ordered by the sediment yields per unit area and compared with historical inundation areas. From this analysis, sub-basins with higher order were found to match the historical inundation areas. Copyright IWA Publishing 2008.
Electronic health record-based cardiac risk assessment and identification of unmet preventive needs.
Persell, Stephen D; Dunne, Alexis P; Lloyd-Jones, Donald M; Baker, David W
2009-04-01
Cardiac risk assessment may not be routinely performed. Electronic health records (EHRs) offer the potential to automate risk estimation. We compared EHR-based assessment with manual chart review to determine the accuracy of automated cardiac risk estimation and determination of candidates for antiplatelet or lipid-lowering interventions. We performed an observational retrospective study of 23,111 adults aged 20 to 79 years, seen in a large urban primary care group practice. Automated assessments classified patients into 4 cardiac risk groups or as unclassifiable and determined candidates for antiplatelet or lipid-lowering interventions based on current guidelines. A blinded physician manually reviewed 100 patients from each risk group and the unclassifiable group. We determined the agreement between full review and automated assessments for cardiac risk estimation and identification of which patients were candidates for interventions. By automated methods, 9.2% of the population were candidates for lipid-lowering interventions, and 8.0% were candidates for antiplatelet medication. Agreement between automated risk classification and manual review was high (kappa = 0.91; 95% confidence interval [CI], 0.88-0.93). Automated methods accurately identified candidates for antiplatelet therapy [sensitivity, 0.81 (95% CI, 0.73-0.89); specificity, 0.98 (95% CI, 0.96-0.99); positive predictive value, 0.86 (95% CI, 0.78-0.94); and negative predictive value, 0.98 (95% CI, 0.97-0.99)] and lipid lowering [sensitivity, 0.92 (95% CI, 0.87-0.96); specificity, 0.98 (95% CI, 0.97-0.99); positive predictive value, 0.94 (95% CI, 0.89-0.99); and negative predictive value, 0.99 (95% CI, 0.98 to ≥0.99)]. EHR data can be used to automatically perform cardiovascular risk stratification and identify patients in need of risk-lowering interventions. This could improve detection of high-risk patients of whom physicians would otherwise be unaware.
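The agreement metrics reported above come from a 2x2 confusion matrix of the automated flag against manual chart review. A minimal sketch, with invented counts (chosen only to reproduce a sensitivity of 0.81, not taken from the study):

```python
# Screening-test metrics from a 2x2 confusion matrix:
# tp/fp/fn/tn = true positives, false positives, false negatives, true negatives.

def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and predictive values of a binary flag."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

m = screening_metrics(tp=81, fp=13, fn=19, tn=387)
for name, value in m.items():
    print(f"{name}: {value:.2f}")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common intervention candidates are in the reviewed sample.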
Avalos, Marta; Adroher, Nuria Duran; Lagarde, Emmanuel; Thiessard, Frantz; Grandvalet, Yves; Contrand, Benjamin; Orriols, Ludivine
2012-09-01
Large data sets with many variables provide particular challenges when constructing analytic models. Lasso-related methods provide a useful tool, although one that remains unfamiliar to most epidemiologists. We illustrate the application of lasso methods in an analysis of the impact of prescribed drugs on the risk of a road traffic crash, using a large French nationwide database (PLoS Med 2010;7:e1000366). In the original case-control study, the authors analyzed each exposure separately. We use the lasso method, which can simultaneously perform estimation and variable selection in a single model. We compare point estimates and confidence intervals using (1) a separate logistic regression model for each drug with a Bonferroni correction and (2) lasso shrinkage logistic regression analysis. Shrinkage regression had little effect on (bias corrected) point estimates, but led to less conservative results, noticeably for drugs with moderate levels of exposure. Carbamates, carboxamide derivative and fatty acid derivative antiepileptics, drugs used in opioid dependence, and mineral supplements of potassium showed stronger associations. Lasso is a relevant method in the analysis of databases with a large number of exposures and can be recommended as an alternative to conventional strategies.
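A minimal sketch of lasso-penalized logistic regression, fit by proximal gradient descent (ISTA). The sparse binary design matrix stands in for drug-exposure indicators; all data are simulated for illustration, and this hand-rolled solver is a teaching sketch rather than the study's actual pipeline.

```python
import numpy as np

# Simulated exposures: 20 binary drug indicators, 3 of them truly associated
rng = np.random.default_rng(1)
n, p = 2000, 20
X = (rng.uniform(size=(n, p)) < 0.1).astype(float)
true_beta = np.zeros(p)
true_beta[:3] = [1.0, 0.8, -0.9]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-(X @ true_beta)))).astype(float)

def lasso_logistic(X, y, lam, iters=3000, lr=0.1):
    """L1-penalized logistic regression via proximal gradient descent."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-(X @ beta)))       # predicted probabilities
        beta = beta - lr * (X.T @ (mu - y) / n)  # gradient step
        # Soft-thresholding: the proximal operator of the L1 penalty,
        # which sets weakly supported coefficients exactly to zero.
        beta = np.sign(beta) * np.maximum(np.abs(beta) - lr * lam, 0.0)
    return beta

beta_hat = lasso_logistic(X, y, lam=0.01)
print("selected exposures:", np.flatnonzero(beta_hat))
```

Estimation and variable selection happen in one model: most null coefficients are driven exactly to zero, while the true signals survive (shrunken toward zero, which is the price of the penalty).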
Oldmeadow, Christopher; Hure, Alexis; Luu, Judy; Loxton, Deborah
2017-01-01
Background Type 2 diabetes is associated with significant morbidity and mortality. Modifiable risk factors have been found to contribute up to 60% of type 2 diabetes risk. However, type 2 diabetes continues to rise despite implementation of interventions based on traditional risk factors. There is a clear need to identify additional risk factors for chronic disease prevention. The aim of this study was to examine the relationship between perceived stress and type 2 diabetes onset, and partition the estimates into direct and indirect effects. Methods and findings Women born in 1946–1951 (n = 12,844) completed surveys for the Australian Longitudinal Study on Women’s Health in 1998, 2001, 2004, 2007 and 2010. The total causal effect was estimated using logistic regression and marginal structural modelling. Controlled direct effects were estimated through conditioning in the regression model. A graded association was found between perceived stress and all mediators in the multivariate time lag analyses. A significant association was found between hypertension, as well as physical activity and body mass index, and diabetes, but not smoking or diet quality. Moderate/high stress levels were associated with a 2.3-fold increase in the odds of diabetes three years later, for the total estimated effect. Results were only slightly attenuated when the direct and indirect effects of perceived stress on diabetes were partitioned, with the mediators only explaining 10–20% of the excess variation in diabetes. Conclusions Perceived stress is a strong risk factor for type 2 diabetes. The majority of the effect estimate of stress on diabetes risk is not mediated by the traditional risk factors of hypertension, physical activity, smoking, diet quality, and body mass index. This gives a new pathway for diabetes prevention trials and clinical practice. PMID:28222165
Lifetime Risk of Symptomatic Hand Osteoarthritis: The Johnston County Osteoarthritis Project
Qin, Jin; Barbour, Kamil E.; Murphy, Louise B.; Nelson, Amanda E.; Schwartz, Todd A.; Helmick, Charles G.; Allen, Kelli D.; Renner, Jordan B.; Baker, Nancy A; Jordan, Joanne M.
2017-01-01
Objective Symptomatic hand osteoarthritis (SHOA) is a common condition that affects hand strength and function, and causes disability in activities of daily living. Prior studies have estimated lifetime risk for symptomatic knee and hip osteoarthritis to be 45% and 25% respectively. The objective of this study is to estimate overall lifetime risk for SHOA and stratified lifetime risk by potential risk factors. Methods We analyzed data for 2,218 adults ≥ 45 years in the Johnston County Osteoarthritis Project, a population-based prospective cohort study in residents of Johnston County, North Carolina. Data were collected in two cycles (1999–2004 and 2005–2010). SHOA was defined as having both self-reported symptoms and radiographic OA in the same hand. Lifetime risk, defined as the proportion of the population who will develop SHOA in at least one hand by age 85, was estimated from models using generalized estimating equations methodology. Results Overall, the lifetime risk of SHOA is 39.8% (95% confidence interval (CI): 34.4, 45.3). Nearly one in two women (47.2%; 95% CI: 40.6, 53.9) will develop SHOA by age 85 compared with one in four men (24.6%; 95% CI: 19.5, 30.5). Race-specific estimates are 41.4% (95% CI: 35.5, 47.6) among whites and 29.2% (95% CI: 20.5, 39.7) among blacks. Lifetime risk among individuals with obesity (47.1%, 95% CI: 37.8, 56.7) is 11 percentage points higher than among those without obesity (36.1%, 95% CI: 29.7, 42.9). Conclusion These findings demonstrate the substantial burden of SHOA overall and in subgroups. Increased use of public health and clinical interventions is needed to address its impact. PMID:28470947
Earthquake Loss Estimates in Near Real-Time
NASA Astrophysics Data System (ADS)
Wyss, Max; Wang, Rongjiang; Zschau, Jochen; Xia, Ye
2006-10-01
Near-real-time loss estimates after major earthquakes are rapidly becoming more useful to rescue teams. The difference in the quality of data available in highly developed compared with developing countries dictates that different approaches be used to maximize mitigation efforts. In developed countries, extensive information from tax and insurance records, together with accurate census figures, furnishes detailed data on the fragility of buildings and on the number of people at risk. For example, these data are exploited by the loss-estimation method used in the Hazards U.S. Multi-Hazard (HAZUS-MH) software program (http://www.fema.gov/plan/prevent/hazus/). However, in developing countries, the population at risk is estimated from inferior data sources, and the fragility of the building stock often is derived empirically, using past disastrous earthquakes for calibration [Wyss, 2004].
Mediators of Childhood Sexual Abuse and High-Risk Sex among Men-Who-Have-Sex-with-Men
ERIC Educational Resources Information Center
Catania, Joseph A.; Paul, Jay; Osmond, Dennis; Folkman, Susan; Pollack, Lance; Canchola, Jesse; Chang, Jason; Neilands, Torsten
2008-01-01
Objective: Mediators of childhood sexual abuse (CSA) and HIV risk behavior were examined for men-who-have-sex-with-men (MSM). Method: Data from a dual frame survey of urban MSM (N = 1078) provided prevalence estimates of CSA, and a test of two latent variable models (defined by partner type) of CSA-risk behavior mediators. Results: A 20%…
The pollution and the potential ecological risk of heavy metals in swan lake wetland of Sanmenxia
NASA Astrophysics Data System (ADS)
Li, Jifeng
2018-04-01
Soil samples were collected from the Swan Lake wetland and digested by the national standard method. The contents of Pb, Cr, Cu, Zn and Mn were determined, and the potential ecological risk was estimated using the potential ecological risk index. The results show that the wetland is slightly ecologically hazardous and that the ecosystem has been affected by heavy metals.
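The potential ecological risk index referred to here (Hakanson, 1980) is a simple weighted sum: each metal's measured concentration is divided by a background value and multiplied by a toxic-response factor, and the per-metal factors are summed into an overall index RI. The sketch below uses commonly cited toxic-response factors and invented concentrations; the background values and measurements are not the paper's data:

```python
# toxic-response factors (Tr) commonly attributed to Hakanson (1980)
TR = {"Pb": 5, "Cr": 2, "Cu": 5, "Zn": 1, "Mn": 1}

def potential_ecological_risk(measured, background, tr=TR):
    """Per-metal risk factors Er_i = Tr_i * C_i / C_background, and RI = sum."""
    er = {m: tr[m] * measured[m] / background[m] for m in measured}
    return er, sum(er.values())

# hypothetical soil concentrations and regional background values (mg/kg)
measured = {"Pb": 30.0, "Cr": 70.0, "Cu": 28.0, "Zn": 90.0, "Mn": 600.0}
background = {"Pb": 20.0, "Cr": 60.0, "Cu": 23.0, "Zn": 70.0, "Mn": 580.0}
er, ri = potential_ecological_risk(measured, background)
```

An RI below the conventional "low risk" cut-off (often taken as 150) would correspond to the slight hazard the abstract reports, though the thresholds vary between studies.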
Towards a global water scarcity risk assessment framework: using scenarios and risk distributions
NASA Astrophysics Data System (ADS)
Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Philip
2016-04-01
Over the past decades, changing hydro-climatic and socioeconomic conditions have led to increased water scarcity problems. A large number of studies have shown that these water scarcity conditions will worsen in the near future. Despite numerous calls for risk-based assessments of water scarcity, a framework that includes UNISDR's definition of risk does not yet exist at the global scale. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change projections and socioeconomic scenarios. Our study highlights that water scarcity risk increases under all future scenarios, affecting up to >56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity in terms of Expected Annual Exposed Population, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels. Covering hazard, exposure, and vulnerability, risk-based methods are well-suited to assess water scarcity adaptation. Completing the presented risk framework therefore offers water managers a promising perspective to increase water security in a well-informed and adaptive manner.
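The Expected Annual Exposed Population idea can be sketched by modelling annual per-capita water availability as a Gamma distribution, as the abstract describes, and integrating the probability of falling below a scarcity threshold. The Gamma parameters, population, and the 1000 m3/capita/yr threshold below are illustrative assumptions, estimated here by Monte Carlo:

```python
import numpy as np

def expected_annual_exposed(pop, shape, scale, threshold,
                            n_draws=200_000, seed=1):
    """Monte Carlo Expected Annual Exposed Population.

    Annual per-capita availability ~ Gamma(shape, scale); a year counts as
    water-scarce when availability drops below `threshold`.
    """
    rng = np.random.default_rng(seed)
    avail = rng.gamma(shape, scale, size=n_draws)
    p_scarce = float(np.mean(avail < threshold))
    return pop * p_scarce, p_scarce

# hypothetical region: mean availability 1600 m3/capita/yr, 1M inhabitants
eaep, p = expected_annual_exposed(pop=1_000_000, shape=4.0, scale=400.0,
                                  threshold=1000.0)
```

Because the whole distribution is retained, the estimate degrades gracefully as the threshold moves, which is the robustness-to-fixed-thresholds point the abstract makes.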
Herrera, Sabina; Guelar, Ana; Sorlì, Luisa; Vila, Joan; Molas, Ema; Grau, María; Marrugat, Jaume; Esteve, Erika; Güerri-Fernández, Roberto; Montero, Milagro; Knobel, Hernando
2016-07-01
Cardiovascular risk (CVR) assessment helps to identify patients at high CVR. The Framingham CVR score (FRS) is the most widely used method but may overestimate risk in regions with a low incidence of cardiovascular disease. The objective was to compare the 10-year performance of the original and the adapted REGICOR-Framingham CVR functions in HIV-infected individuals. We carried out a longitudinal study of HIV-infected patients with CVR evaluation in a hospital in Barcelona between 2003 and 2013. Risk probability was calculated using the Framingham function and its REGICOR adaptation to the Spanish population, and individuals were categorized in three groups (low, <5%; moderate, 5-10%; and high, >10%). For each risk group, the number of events over 10 years was calculated using the Kaplan-Meier method, and the expected number of events was calculated by multiplying the frequency of participants in the group by the mean of the probabilities from the risk function. We used the χ² goodness-of-fit test to assess agreement between observed and expected events. Six hundred and forty-one patients were followed up for a median of 10.2 years, and 20 ischemic heart events (IHE) were observed. The mean (95% CI) number of IHEs per 1000 person-years was 3.7 (2.06-5.27). The estimates from the Framingham and REGICOR functions were 40 and 14 IHEs, respectively. The estimate from the original Framingham function differed significantly from the observed incidence (p < 0.001), whereas that from the REGICOR-adapted function did not (p = 0.15). In terms of the number of cardiovascular events (38 events observed), the REGICOR function significantly underestimated risk (p = 0.01), whereas the estimate from the Framingham function was similar to observed (p = 0.93). The FRS significantly overestimates risk of IHE events in our HIV-infected patients, while the REGICOR function is a better predictor of these events.
In terms of cardiovascular events, the REGICOR function significantly underestimates risk, whereas the FRS is a better estimator. We recommend using CVR scales and adjusting them to the origin of the population being studied.
Flood risk and cultural heritage: the case study of Florence (Italy)
NASA Astrophysics Data System (ADS)
Arrighi, Chiara; Castelli, Fabio; Brugioni, Marcello; Franceschini, Serena; Mazzanti, Bernardo
2016-04-01
Cultural heritage plays a key role for communities in terms of both identity and economic value. It is often under serious threat from natural hazards; nevertheless, quantitative risk assessments are quite uncommon. This work addresses flood risk assessment for cultural heritage in an exemplary art city: Florence, Italy. The risk assessment method here adopted borrows the most common definition of flood risk as the product of hazard, vulnerability and exposure, with some necessary adjustments. The risk estimation is carried out at the building scale for the whole UNESCO site, which coincides with the historical centre of the city. A distinction in macro- and micro-damage categories has been made according to the vulnerability of the objects at risk. Two damage macro-categories are selected, namely cultural buildings and contents. Cultural buildings are classified in damage micro-categories as churches/religious complexes, libraries/archives and museums. The damages to the contents are estimated for four micro-categories: paintings, sculptures, books/prints and goldsmith's art. Data from hydraulic simulations for different recurrence scenarios, historical reports of the devastating 1966 flood and the cultural heritage recognition sheets allow estimating and mapping the annual expected number of works of art lost in absence of risk mitigation strategies.
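The "risk = hazard × vulnerability × exposure" definition and the annual expected loss can be illustrated with a crude sketch that weights each recurrence scenario by its annual probability (approximated as 1/return-period) rather than properly integrating the exceedance curve. All figures are invented for illustration and are not the Florence study's data:

```python
def expected_annual_loss(scenarios, vulnerability, exposure):
    """Crude expected annual number of works of art lost.

    `scenarios` maps a return period T (years) to the fraction of the exposed
    stock flooded in that scenario; annual probability is taken as 1/T.
    """
    return sum((1.0 / T) * flooded_frac * vulnerability * exposure
               for T, flooded_frac in scenarios.items())

# hypothetical 30-, 100- and 200-year floods reaching 10%, 40% and 60%
# of a stock of 1,000 items, each flooded item lost with probability 0.5
loss = expected_annual_loss({30: 0.10, 100: 0.40, 200: 0.60},
                            vulnerability=0.5, exposure=1000)
```

A production assessment would integrate damage over the full exceedance-probability curve instead of summing discrete scenarios, but the product structure of the three risk factors is the same.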
Kohler, Kathryn A.; Banerjee, Kaushik; Gary Hlady, W.; Andrus, Jon K.; Sutter, Roland W.
2002-01-01
OBJECTIVE: Vaccine-associated paralytic poliomyelitis (VAPP) is a rare but serious consequence of the administration of oral polio vaccine (OPV). Intensified OPV administration has reduced wild poliovirus transmission in India but VAPP is becoming a matter of concern. METHODS: We analysed acute flaccid paralysis (AFP) surveillance data in order to estimate the VAPP risk in this country. VAPP was defined as occurring in AFP cases with onset of paralysis in 1999, residual weakness 60 days after onset, and isolation of vaccine-related poliovirus. Recipient VAPP cases were a subset with onset of paralysis between 4 and 40 days after receipt of OPV. FINDINGS: A total of 181 AFP cases met the case definition. The following estimates of VAPP risk were made: overall risk, 1 case per 4.1 to 4.6 million OPV doses administered; recipient risk, 1 case per 12.2 million; first-dose recipient risk, 1 case per 2.8 million; and subsequent-dose recipient risk, 1 case per 13.9 million. CONCLUSION: On the basis of data from a highly sensitive surveillance system the estimated VAPP risk in India is evidently lower than that in other countries, notwithstanding the administration of multiple OPV doses to children in mass immunization campaigns. PMID:11984607
Wu, Cai; Li, Liang
2018-05-15
This paper focuses on quantifying and estimating the predictive accuracy of prognostic models for time-to-event outcomes with competing events. We consider the time-dependent discrimination and calibration metrics, including the receiver operating characteristics curve and the Brier score, in the context of competing risks. To address censoring, we propose a unified nonparametric estimation framework for both discrimination and calibration measures, by weighting the censored subjects with the conditional probability of the event of interest given the observed data. The proposed method can be extended to time-dependent predictive accuracy metrics constructed from a general class of loss functions. We apply the methodology to a data set from the African American Study of Kidney Disease and Hypertension to evaluate the predictive accuracy of a prognostic risk score in predicting end-stage renal disease, accounting for the competing risk of pre-end-stage renal disease death, and evaluate its numerical performance in extensive simulation studies. Copyright © 2018 John Wiley & Sons, Ltd.
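The censoring-weighting idea described here (weighting observed subjects so they also represent the censored ones) can be sketched for a time-dependent Brier score. The sketch below weights events by 1/G(T) and horizon survivors by 1/G(t), where G is a Kaplan-Meier estimate of the censoring distribution; competing events count as free of the event of interest but keep their weight. It is a simplified illustration (for instance, it uses G(T) rather than the left limit G(T-)), not the paper's estimator:

```python
import numpy as np

def censor_km(time, censored):
    """Kaplan-Meier estimate of the censoring survival G(t) = P(C > t)."""
    uniq = np.unique(time)
    g, steps = 1.0, {}
    for u in uniq:
        at_risk = np.sum(time >= u)
        d = np.sum((time == u) & censored)
        g *= 1.0 - d / at_risk
        steps[u] = g
    def G(t):
        past = uniq[uniq <= t]
        return steps[past[-1]] if len(past) else 1.0
    return G

def ipcw_brier(time, event, pred, t):
    """Censoring-weighted Brier score for cause 1 at horizon t.

    `event` is 0 for censored, 1 for the event of interest, 2 for the
    competing event; `pred` is the predicted cause-1 risk by time t.
    """
    time, event, pred = map(np.asarray, (time, event, pred))
    G = censor_km(time, event == 0)
    score = 0.0
    for Ti, ei, pi in zip(time, event, pred):
        if Ti <= t and ei == 1:      # cause-1 event observed by t
            score += (1.0 - pi) ** 2 / G(Ti)
        elif Ti <= t and ei == 2:    # competing event: no cause-1 event by t
            score += pi ** 2 / G(Ti)
        elif Ti > t:                 # event-free and uncensored at t
            score += pi ** 2 / G(t)
        # censored before t: weight 0, mass redistributed via G
    return score / len(time)

bs = ipcw_brier(time=[2, 3, 5, 7, 9], event=[1, 0, 2, 1, 0],
                pred=[0.5] * 5, t=6)
```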
Co-morbidities associated with influenza-attributed mortality, 1994-2000, Canada.
Schanzer, Dena L; Langley, Joanne M; Tam, Theresa W S
2008-08-26
The elderly and persons with specific chronic conditions are known to face elevated morbidity and mortality risks resulting from an influenza infection, and hence are routinely recommended for annual influenza vaccination. However, risk-specific mortality rates have not been established. We estimated age-specific influenza-attributable mortality rates stratified by the presence of chronic conditions and type of residence based on deaths of persons who were admitted to hospital with a respiratory complication captured in our national database. The majority of patients had chronic heart or respiratory conditions (80%) and were admitted from the community (80%). Influenza-attributable mortality rates clearly increase with age for all risk groups. Our influenza-specific estimates identified higher risk ratios for chronic lung or heart disease than have been suggested by other methods. These estimates identify groups most in need of improved vaccines and for whom the use of additional strategies, such as immunization of household contacts or caregivers, should be considered.
Modeling risk of occupational zoonotic influenza infection in swine workers.
Paccha, Blanca; Jones, Rachael M; Gibbs, Shawn; Kane, Michael J; Torremorell, Montserrat; Neira-Ramirez, Victor; Rabinowitz, Peter M
2016-08-01
Zoonotic transmission of influenza A virus (IAV) between swine and workers in swine production facilities may play a role in the emergence of novel influenza strains with pandemic potential. Guidelines to prevent transmission of influenza to swine workers have been developed but there is a need for evidence-based decision-making about protective measures such as respiratory protection. A mathematical model was applied to estimate the risk of occupational IAV exposure to swine workers by contact and airborne transmission, and to evaluate the use of respirators to reduce transmission. The Markov model was used to simulate the transport and exposure of workers to IAV in a swine facility. A dose-response function was used to estimate the risk of infection. This approach is similar to methods previously used to estimate the risk of infection in human health care settings. This study uses concentrations of virus in air from field measurements collected during outbreaks of influenza in commercial swine facilities and analyzed by polymerase chain reaction. It was found that spending 25 min working in a barn during an influenza outbreak in a swine herd could be sufficient to cause zoonotic infection in a worker. However, this risk estimate was sensitive to estimates of viral infectivity to humans. Wearing an excellent-fitting N95 respirator reduced this risk, but with high aerosol levels the predicted risk of infection remained high under certain assumptions. The results of this analysis indicate that under the conditions studied, swine workers are at risk of zoonotic influenza infection. The use of an N95 respirator could reduce such risk. These findings have implications for risk assessment and preventive programs targeting swine workers. The exact level of risk remains uncertain, since our model may have overestimated the viability or infectivity of IAV.
Additionally, the potential for partial immunity in swine workers associated with repeated low-dose exposures or from previous infection with other influenza strains was not considered. Further studies should explore these uncertainties.
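The dose-response step of such a model can be sketched with the standard exponential form, P(infection) = 1 - exp(-r × dose), with the inhaled dose reduced by a respirator's protection factor. Every number below (airborne concentration, breathing rate, infectivity parameter r, and the protection factor of ~10 often credited to a well-fitting N95) is an illustrative assumption, not a value from the study:

```python
import math

def infection_risk(concentration, breathing_rate, minutes, r, fit_factor=1.0):
    """Exponential dose-response risk of infection, P = 1 - exp(-r * dose).

    concentration: airborne virus (arbitrary units / m3, hypothetical)
    breathing_rate: m3 of air inhaled per hour
    fit_factor: respirator protection factor (1.0 = no respirator)
    """
    dose = concentration * breathing_rate * (minutes / 60.0) / fit_factor
    return 1.0 - math.exp(-r * dose)

# 25 minutes in a barn during an outbreak, with and without an N95
no_rpe = infection_risk(concentration=500.0, breathing_rate=1.2,
                        minutes=25, r=0.01)
with_n95 = infection_risk(concentration=500.0, breathing_rate=1.2,
                          minutes=25, r=0.01, fit_factor=10.0)
```

The sensitivity the abstract highlights is visible here: the risk estimate depends directly on r (viral infectivity to humans), which is the most uncertain input.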
Projecting Individualized Absolute Invasive Breast Cancer Risk in US Hispanic Women
John, Esther M.; Slattery, Martha L.; Gomez, Scarlett Lin; Yu, Mandi; LaCroix, Andrea Z.; Pee, David; Chlebowski, Rowan T.; Hines, Lisa M.; Thompson, Cynthia A.; Gail, Mitchell H.
2017-01-01
Background: There is no model to estimate absolute invasive breast cancer risk for Hispanic women. Methods: The San Francisco Bay Area Breast Cancer Study (SFBCS) provided data on Hispanic breast cancer case patients (533 US-born, 553 foreign-born) and control participants (464 US-born, 947 foreign-born). These data yielded estimates of relative risk (RR) and attributable risk (AR) separately for US-born and foreign-born women. Nativity-specific absolute risks were estimated by combining RR and AR information with nativity-specific invasive breast cancer incidence and competing mortality rates from the California Cancer Registry and Surveillance, Epidemiology, and End Results program to develop the Hispanic risk model (HRM). In independent data, we assessed model calibration through observed/expected (O/E) ratios, and we estimated discriminatory accuracy with the area under the receiver operating characteristic curve (AUC) statistic. Results: The US-born HRM included age at first full-term pregnancy, biopsy for benign breast disease, and family history of breast cancer; the foreign-born HRM also included age at menarche. The HRM estimated lower risks than the National Cancer Institute’s Breast Cancer Risk Assessment Tool (BCRAT) for US-born Hispanic women, but higher risks in foreign-born women. In independent data from the Women’s Health Initiative, the HRM was well calibrated for US-born women (observed/expected [O/E] ratio = 1.07, 95% confidence interval [CI] = 0.81 to 1.40), but seemed to overestimate risk in foreign-born women (O/E ratio = 0.66, 95% CI = 0.41 to 1.07). The AUC was 0.564 (95% CI = 0.485 to 0.644) for US-born and 0.625 (95% CI = 0.487 to 0.764) for foreign-born women. Conclusions: The HRM is the first absolute risk model that is based entirely on data specific to Hispanic women by nativity. Further studies in Hispanic women are warranted to evaluate its validity. PMID:28003316
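The way such absolute-risk models combine relative risk, attributable risk, registry incidence, and competing mortality can be sketched as a year-by-year accumulation, in the spirit of Gail-type models. The hazard rates, RR, and AR values below are invented for illustration and are not SFBCS or registry values:

```python
import numpy as np

def absolute_risk(incidence, mortality, rr, attrib_risk):
    """Absolute risk of a cancer diagnosis over consecutive one-year intervals.

    Baseline hazard = population incidence * (1 - AR), individualized by the
    woman's relative risk; competing mortality discounts the probability of
    still being alive and cancer-free when each year begins.
    """
    alive_free, risk = 1.0, 0.0
    for h_pop, m in zip(incidence, mortality):
        h = h_pop * (1.0 - attrib_risk) * rr       # individualized hazard
        total = h + m
        # probability of a diagnosis this year, given alive and cancer-free
        risk += alive_free * (h / total) * (1.0 - np.exp(-total))
        alive_free *= np.exp(-total)               # survive the year cancer-free
    return risk

# ten years at a population incidence of 2/1000/yr, competing mortality
# of 10/1000/yr, relative risk 2, attributable risk 0.5
risk10 = absolute_risk([0.002] * 10, [0.010] * 10, rr=2.0, attrib_risk=0.5)
```

Multiplying the population incidence by (1 - AR) converts it to the baseline (lowest-risk-profile) hazard, so that applying the individual RR does not double-count risk factors already reflected in the population rate.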
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brodin, N. Patrik, E-mail: nils.patrik.brodin@rh.dk; Niels Bohr Institute, University of Copenhagen, Copenhagen; Vogelius, Ivan R.
2013-10-01
Purpose: As pediatric medulloblastoma (MB) is a relatively rare disease, it is important to extract the maximum information from trials and cohort studies. Here, a framework was developed for modeling tumor control with multiple modes of failure and time-to-progression for standard-risk MB, using published pattern of failure data. Methods and Materials: Outcome data for standard-risk MB published after 1990 with pattern of relapse information were used to fit a tumor control dose-response model addressing failures in both the high-dose boost volume and the elective craniospinal volume. Estimates of 5-year event-free survival from 2 large randomized MB trials were used to model the time-to-progression distribution. Uncertainty in freedom from progression (FFP) was estimated by Monte Carlo sampling over the statistical uncertainty in input data. Results: The estimated 5-year FFP (95% confidence intervals [CI]) for craniospinal doses of 15, 18, 24, and 36 Gy while maintaining 54 Gy to the posterior fossa was 77% (95% CI, 70%-81%), 78% (95% CI, 73%-81%), 79% (95% CI, 76%-82%), and 80% (95% CI, 77%-84%), respectively. The uncertainty in FFP was considerably larger for craniospinal doses below 18 Gy, reflecting the lack of data in the lower dose range. Conclusions: Estimates of tumor control and time-to-progression for standard-risk MB provide a data-driven setting for hypothesis generation or power calculations for prospective trials, taking the uncertainties into account. The presented methods can also be applied to incorporate further risk-stratification, for example based on molecular biomarkers, when the necessary data become available.
Hamada, Tsuyoshi; Nakai, Yousuke; Isayama, Hiroyuki; Togawa, Osamu; Kogure, Hirofumi; Kawakubo, Kazumichi; Tsujino, Takeshi; Sasahira, Naoki; Hirano, Kenji; Yamamoto, Natsuyo; Ito, Yukiko; Sasaki, Takashi; Mizuno, Suguru; Toda, Nobuo; Tada, Minoru; Koike, Kazuhiko
2014-03-01
Self-expandable metallic stent (SEMS) placement is widely carried out for distal malignant biliary obstruction, and survival analysis is used to evaluate the cumulative incidences of SEMS dysfunction (e.g. the Kaplan-Meier [KM] method and the log-rank test). However, these statistical methods might be inappropriate in the presence of 'competing risks' (here, death without SEMS dysfunction), which affects the probability of experiencing the event of interest (SEMS dysfunction); that is, SEMS dysfunction can no longer be observed after death. A competing risk analysis has rarely been done in studies on SEMS. We introduced the concept of a competing risk analysis and illustrated its impact on the evaluation of SEMS outcomes using hypothetical and actual data. Our illustrative study included 476 consecutive patients who underwent SEMS placement for unresectable distal malignant biliary obstruction. A significant difference between cumulative incidences of SEMS dysfunction in male and female patients via the KM method (P = 0.044 by the log-rank test) disappeared after applying a competing risk analysis (P = 0.115 by Gray's test). In contrast, although cumulative incidences of SEMS dysfunction via the KM method were similar with and without chemotherapy (P = 0.647 by the log-rank test), cumulative incidence of SEMS dysfunction in the non-chemotherapy group was shown to be significantly lower (P = 0.031 by Gray's test) in a competing risk analysis. Death as a competing risk event needs to be appropriately considered in estimating a cumulative incidence of SEMS dysfunction, otherwise analytical results may be biased. © 2013 The Authors. Digestive Endoscopy © 2013 Japan Gastroenterological Endoscopy Society.
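The bias the authors describe can be demonstrated in a few lines: the nonparametric cumulative incidence function (Aalen-Johansen form) lets competing events deplete overall survival, whereas the naive 1 - KM estimate treats them as censoring and therefore overestimates the incidence of the event of interest. A minimal numpy sketch on toy data, not the paper's dataset:

```python
import numpy as np

def cuminc(time, event, cause, t):
    """Cumulative incidence of `cause` at t, accounting for competing risks.

    `event` is 0 for censored, otherwise the failure cause. Competing events
    deplete the overall survival S(t-) instead of being treated as censored.
    """
    time, event = np.asarray(time), np.asarray(event)
    s, ci = 1.0, 0.0
    for u in np.unique(time[(event != 0) & (time <= t)]):
        n = np.sum(time >= u)                          # risk set at u
        ci += s * np.sum((time == u) & (event == cause)) / n
        s *= 1.0 - np.sum((time == u) & (event != 0)) / n
    return ci

def one_minus_km(time, event, cause, t):
    """Naive 1 - KM that (incorrectly) censors competing events."""
    time, event = np.asarray(time), np.asarray(event)
    s = 1.0
    for u in np.unique(time[(event == cause) & (time <= t)]):
        s *= 1.0 - np.sum((time == u) & (event == cause)) / np.sum(time >= u)
    return 1.0 - s

time, event = [1, 2, 3, 4, 5], [1, 2, 1, 2, 0]   # cause 2 competes with cause 1
ci = cuminc(time, event, cause=1, t=5)
naive = one_minus_km(time, event, cause=1, t=5)
```

Even in this five-subject example the naive estimate exceeds the competing-risks estimate, the direction of bias the abstract warns about.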
Hoover, D R; Peng, Y; Saah, A J; Detels, R R; Day, R S; Phair, J P
A simple non-parametric approach is developed to simultaneously estimate net incidence and morbidity time from specific AIDS illnesses in populations at high risk for death from these illnesses and other causes. The disease-death process has four stages that can be recast as two sandwiching three-state multiple decrement processes. Non-parametric estimation of net incidence and morbidity time with error bounds are achieved from these sandwiching models through modification of methods from Aalen and Greenwood, and bootstrapping. An application to immunosuppressed HIV-1 infected homosexual men reveals that cytomegalovirus disease, Kaposi's sarcoma and Pneumocystis pneumonia are likely to occur and cause significant morbidity time.
Mt-Isa, Shahrul; Hallgreen, Christine E; Wang, Nan; Callréus, Torbjörn; Genov, Georgy; Hirsch, Ian; Hobbiger, Stephen F; Hockley, Kimberley S; Luciani, Davide; Phillips, Lawrence D; Quartey, George; Sarac, Sinan B; Stoeckert, Isabelle; Tzoulaki, Ioanna; Micaleff, Alain; Ashby, Deborah
2014-07-01
The need for formal and structured approaches for benefit-risk assessment of medicines is increasing, as is the complexity of the scientific questions addressed before making decisions on the benefit-risk balance of medicines. We systematically collected, appraised and classified available benefit-risk methodologies to facilitate and inform their future use. A systematic review of publications identified benefit-risk assessment methodologies. Methodologies were appraised on their fundamental principles, features, graphical representations, assessability and accessibility. We created a taxonomy of methodologies to facilitate understanding and choice. We identified 49 methodologies, critically appraised and classified them into four categories: frameworks, metrics, estimation techniques and utility survey techniques. Eight frameworks describe qualitative steps in benefit-risk assessment and eight quantify benefit-risk balance. Nine metric indices include threshold indices to measure either benefit or risk; health indices measure quality-of-life over time; and trade-off indices integrate benefits and risks. Six estimation techniques support benefit-risk modelling and evidence synthesis. Four utility survey techniques elicit robust value preferences from relevant stakeholders to the benefit-risk decisions. Methodologies to help benefit-risk assessments of medicines are diverse and each is associated with different limitations and strengths. There is not a 'one-size-fits-all' method, and a combination of methods may be needed for each benefit-risk assessment. The taxonomy introduced herein may guide choice of adequate methodologies. Finally, we recommend 13 of 49 methodologies for further appraisal for use in the real-life benefit-risk assessment of medicines. Copyright © 2014 John Wiley & Sons, Ltd.
An, Lihua; Fung, Karen Y; Krewski, Daniel
2010-09-01
Spontaneous adverse event reporting systems are widely used to identify adverse reactions to drugs following their introduction into the marketplace. In this article, a James-Stein type shrinkage estimation strategy was developed in a Bayesian logistic regression model to analyze pharmacovigilance data. This method is effective in detecting signals as it combines information and borrows strength across medically related adverse events. Computer simulation demonstrated that the shrinkage estimator is uniformly better than the maximum likelihood estimator in terms of mean squared error. This method was used to investigate the possible association of a series of diabetic drugs and the risk of cardiovascular events using data from the Canada Vigilance Online Database.
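The borrowing-strength idea behind such shrinkage can be sketched with a simple empirical-Bayes (James-Stein type) estimator: each event's log odds ratio is pulled toward the precision-weighted pooled mean of the medically related events, by an amount that grows with its own sampling variance. This is a sketch of the principle, not the paper's Bayesian logistic regression model; the estimates and standard errors below are invented:

```python
import numpy as np

def shrink_to_pooled(est, se):
    """Empirical-Bayes shrinkage of related estimates toward a pooled mean."""
    est, se = np.asarray(est, float), np.asarray(se, float)
    pooled = np.average(est, weights=1.0 / se**2)
    # method-of-moments estimate of between-event variance, floored at 0
    tau2 = max(float(np.mean((est - pooled) ** 2 - se**2)), 0.0)
    b = se**2 / (se**2 + tau2)          # per-event shrinkage factor
    return b * pooled + (1.0 - b) * est

# hypothetical log odds ratios for four related adverse events
log_or = np.array([1.2, 0.3, 0.8, -0.1])
se = np.array([0.6, 0.2, 0.5, 0.4])
shrunk = shrink_to_pooled(log_or, se)
```

Noisy, extreme estimates are pulled in the most, which is what reduces mean squared error relative to the unshrunk maximum likelihood estimates.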
2011-01-01
Background Previous research has documented heterogeneity in the effects of maternal education on adverse birth outcomes by nativity and Hispanic subgroup in the United States. In this article, we considered the risk of preterm birth (PTB) using 9 years of vital statistics birth data from New York City. We employed finer categorizations of exposure than used previously and estimated the risk dose-response across the range of education by nativity and ethnicity. Methods Using Bayesian random effects logistic regression models with restricted quadratic spline terms for years of completed maternal education, we calculated and plotted the estimated posterior probabilities of PTB (gestational age < 37 weeks) for each year of education by ethnic and nativity subgroups adjusted for only maternal age, as well as with more extensive covariate adjustments. We then estimated the posterior risk difference between native and foreign born mothers by ethnicity over the continuous range of education exposures. Results The risk of PTB varied substantially by education, nativity and ethnicity. Native born groups showed higher absolute risk of PTB and declining risk associated with higher levels of education beyond about 10 years, as did foreign-born Puerto Ricans. For most other foreign born groups, however, risk of PTB was flatter across the education range. For Mexicans, Central Americans, Dominicans, South Americans and "Others", the protective effect of foreign birth diminished progressively across the educational range. Only for Puerto Ricans was there no nativity advantage for the foreign born, although small numbers of foreign born Cubans limited precision of estimates for that group. Conclusions Using flexible Bayesian regression models with random effects allowed us to estimate absolute risks without strong modeling assumptions. Risk comparisons for any sub-groups at any exposure level were simple to calculate. 
Shrinkage of posterior estimates through the use of random effects allowed for finer categorization of exposures without restricting joint effects to follow a fixed parametric scale. Although foreign born Hispanic women with the least education appeared to generally have low risk, this seems likely to be a marker for unmeasured environmental and behavioral factors, rather than a causally protective effect of low education itself. PMID:21504612
Ng, S K; McLachlan, G J
2003-04-15
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.
Cook, D A
2006-04-01
Models that estimate the probability of death of intensive care unit patients can be used to stratify patients according to the severity of their condition and to control for casemix and severity of illness. These models have been used for risk adjustment in quality monitoring, administration, management and research and as an aid to clinical decision making. Models such as the Mortality Prediction Model family, SAPS II, APACHE II, APACHE III and the organ system failure models provide estimates of the probability of in-hospital death of ICU patients. This review examines methods to assess the performance of these models. The key attributes of a model are discrimination (the accuracy of the ranking in order of probability of death) and calibration (the extent to which the model's prediction of probability of death reflects the true risk of death). These attributes should be assessed in existing models that predict the probability of patient mortality, and in any subsequent model that is developed for the purposes of estimating these probabilities. The literature contains a range of approaches for assessment which are reviewed and a survey of the methodologies used in studies of intensive care mortality models is presented. The systematic approach used by Standards for Reporting Diagnostic Accuracy provides a framework to incorporate these theoretical considerations of model assessment and recommendations are made for evaluation and presentation of the performance of models that estimate the probability of death of intensive care patients.
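The two attributes the review centres on can each be computed in a few lines: discrimination as the c-statistic (the probability that a randomly chosen death received a higher predicted probability than a randomly chosen survivor, equal to the area under the ROC curve), and calibration as an observed-versus-expected comparison within risk strata, in the spirit of a Hosmer-Lemeshow table. A sketch on invented predictions, not any of the named models:

```python
import numpy as np

def c_statistic(pred, died):
    """Discrimination: P(random death ranked above random survivor)."""
    pred, died = np.asarray(pred, float), np.asarray(died, bool)
    cases, controls = pred[died], pred[~died]
    wins = 0.0
    for c in cases:
        wins += np.sum(c > controls) + 0.5 * np.sum(c == controls)
    return wins / (len(cases) * len(controls))

def calibration_table(pred, died, n_bins=2):
    """Calibration: (observed deaths, model-expected deaths) per risk stratum."""
    pred, died = np.asarray(pred, float), np.asarray(died, int)
    strata = np.array_split(np.argsort(pred), n_bins)
    return [(int(np.sum(died[i])), float(np.sum(pred[i]))) for i in strata]

pred = [0.1, 0.2, 0.3, 0.8, 0.9, 0.7]   # predicted probabilities of death
died = [0, 0, 1, 1, 1, 0]               # in-hospital outcome
auc = c_statistic(pred, died)
table = calibration_table(pred, died)
```

A model can discriminate well but calibrate poorly (or vice versa), which is why the review insists both attributes be assessed.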
Population viability analysis for endangered Roanoke logperch
Roberts, James H.; Angermeier, Paul; Anderson, Gregory B.
2016-01-01
A common strategy for recovering endangered species is ensuring that populations exceed the minimum viable population size (MVP), a demographic benchmark that theoretically ensures low long-term extinction risk. One method of establishing MVP is population viability analysis, a modeling technique that simulates population trajectories and forecasts extinction risk based on a series of biological, environmental, and management assumptions. Such models also help identify key uncertainties that have a large influence on extinction risk. We used stochastic count-based simulation models to explore extinction risk, MVP, and the possible benefits of alternative management strategies in populations of Roanoke logperch Percina rex, an endangered stream fish. Estimates of extinction risk were sensitive to the assumed population growth rate and model type, carrying capacity, and catastrophe regime (frequency and severity of anthropogenic fish kills), whereas demographic augmentation did little to reduce extinction risk. Under density-dependent growth, the estimated MVP for Roanoke logperch ranged from 200 to 4200 individuals, depending on the assumed severity of catastrophes. Thus, depending on the MVP threshold, anywhere from two to all five of the logperch populations we assessed were projected to be viable. Despite this uncertainty, these results help identify populations with the greatest relative extinction risk, as well as management strategies that might reduce this risk the most, such as increasing carrying capacity and reducing fish kills. Better estimates of population growth parameters and catastrophe regimes would facilitate the refinement of MVP and extinction-risk estimates, and they should be a high priority for future research on Roanoke logperch and other imperiled stream-fish species.
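A stochastic count-based simulation of the kind described can be sketched directly: lognormal growth capped at carrying capacity, a yearly probability of a catastrophic fish kill, and extinction risk measured as the fraction of trajectories falling below a quasi-extinction threshold. All parameter values below are illustrative assumptions, not the study's estimates for Roanoke logperch:

```python
import numpy as np

def extinction_risk(n0, mean_r, sd_r, k, p_kill, kill_frac,
                    years=50, threshold=50, n_sims=5000, seed=42):
    """Fraction of simulated trajectories dropping below `threshold` in `years`."""
    rng = np.random.default_rng(seed)
    extinct = 0
    for _ in range(n_sims):
        n = float(n0)
        for _ in range(years):
            n = min(n * np.exp(rng.normal(mean_r, sd_r)), k)  # capped growth
            if rng.uniform() < p_kill:                        # catastrophe
                n *= 1.0 - kill_frac
            if n < threshold:                                 # quasi-extinction
                extinct += 1
                break
    return extinct / n_sims

risk_small = extinction_risk(n0=200, mean_r=0.02, sd_r=0.3, k=2000,
                             p_kill=0.05, kill_frac=0.9)
risk_large = extinction_risk(n0=2000, mean_r=0.02, sd_r=0.3, k=2000,
                             p_kill=0.05, kill_frac=0.9)
```

Re-running the function over a grid of initial sizes is exactly how an MVP threshold is located, and varying p_kill or kill_frac reproduces the sensitivity to catastrophe regime the abstract reports.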
Stock price estimation using ensemble Kalman Filter square root method
NASA Astrophysics Data System (ADS)
Karya, D. F.; Katias, P.; Herlambang, T.
2018-04-01
Shares are securities that evidence an individual's or corporation's ownership stake in an enterprise, particularly in public companies whose shares are publicly traded. Investment in stocks is often the option of choice for investors, as stock trading offers attractive profits. In choosing a safe investment in stocks, investors need a way of assessing the prices of the stocks they intend to buy so as to optimize their profits. An effective method of analysis that reduces the risk investors may bear is predicting, or estimating, the stock price; such estimation problems can often be solved using previous data relevant to the problem. The contribution of this paper is that the estimates of stock prices in the high, low, and close categories can be used to inform investors' decision making. In this paper, stock price estimation was performed using the Ensemble Kalman Filter Square Root method (EnKF-SR) and the Ensemble Kalman Filter method (EnKF). The simulation results showed that the estimates obtained with the EnKF method were more accurate than those from the EnKF-SR, with an estimation error of about 0.2% for EnKF and 2.6% for EnKF-SR.
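A minimal scalar ensemble Kalman filter, assuming a random-walk state observed directly (H = 1) with noise, gives the flavor of the approach; the process/observation variances and the toy price series are assumptions, and this is not the authors' EnKF-SR implementation.

```python
import random

def enkf_scalar(observations, q=0.5, r=2.0, n_ens=100, seed=0):
    """Minimal scalar Ensemble Kalman Filter sketch: random-walk state,
    direct noisy observations; q = process var, r = observation var."""
    rng = random.Random(seed)
    ens = [observations[0] + rng.gauss(0, r ** 0.5) for _ in range(n_ens)]
    estimates = []
    for y in observations:
        # forecast step: propagate each member with process noise
        ens = [x + rng.gauss(0, q ** 0.5) for x in ens]
        m = sum(ens) / n_ens
        var = sum((x - m) ** 2 for x in ens) / (n_ens - 1)
        gain = var / (var + r)                     # Kalman gain for H = 1
        # analysis step: update each member with a perturbed observation
        ens = [x + gain * (y + rng.gauss(0, r ** 0.5) - x) for x in ens]
        estimates.append(sum(ens) / n_ens)
    return estimates

prices = [100, 101, 103, 102, 104, 106, 105, 107]  # hypothetical closes
est = enkf_scalar(prices)
```

The square-root variant (EnKF-SR) replaces the perturbed-observation analysis step with a deterministic ensemble transform, avoiding the sampling noise introduced here.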
Forder, Julien; Malley, Juliette; Towers, Ann-Marie; Netten, Ann
2014-08-01
The aim is to describe and trial a pragmatic method to produce estimates of the incremental cost-effectiveness of care services from survey data. The main challenge is in estimating the counterfactual; that is, what the patient's quality of life would be if they did not receive that level of service. A production function method is presented, which seeks to distinguish the variation in care-related quality of life in the data that is due to service use as opposed to other factors. A problem is that relevant need factors also affect the amount of service used and therefore any missing factors could create endogeneity bias. Instrumental variable estimation can mitigate this problem. This method was applied to a survey of older people using home care as a proof of concept. In the analysis, we were able to estimate a quality-of-life production function using survey data with the expected form and robust estimation diagnostics. The practical advantages with this method are clear, but there are limitations. It is computationally complex, and there is a risk of misspecification and biased results, particularly with IV estimation. One strategy would be to use this method to produce preliminary estimates, with a full trial conducted thereafter, if indicated. Copyright © 2013 John Wiley & Sons, Ltd.
Bayesian averaging over Decision Tree models for trauma severity scoring.
Schetinin, V; Jakaite, L; Krzanowski, W
2018-01-01
Health care practitioners analyse the possible risks of misleading decisions and need to estimate and quantify the uncertainty in their predictions. We have examined the "gold" standard of screening a patient's conditions for predicting survival probability, based on logistic regression modelling, which is used in trauma care for clinical purposes and quality audit. This methodology rests on theoretical assumptions about the data and uncertainties. Models induced within such an approach have exhibited a number of problems, including unexplained fluctuation of predicted survival and low accuracy of the uncertainty intervals within which predictions are made. The Bayesian method, which in theory is capable of providing accurate predictions and uncertainty estimates, has been adopted in our study using Decision Tree models. Our approach has been tested on a large set of patients registered in the US National Trauma Data Bank and has outperformed the standard method in terms of prediction accuracy, thereby providing practitioners with accurate estimates of the predictive posterior densities of interest that are required for making risk-aware decisions. Copyright © 2017 Elsevier B.V. All rights reserved.
CULTURE-INDEPENDENT MOLECULAR METHODS FOR FECAL SOURCE IDENTIFICATION
Fecal contamination is widespread in the waterways of the United States. Both to correct the problem, and to estimate public health risk, it is necessary to identify the source of the contamination. Several culture-independent molecular methods for fecal source identification hav...
Shield, Kevin D; Rehm, Jürgen
2015-05-10
Alcohol consumption is a major risk factor for the burden of disease globally. This burden is estimated using Relative Risk (RR) functions for alcohol from meta-analyses that use data from all countries; however, for Russia and surrounding countries, country-specific risk data may need to be used. The objective of this paper is to compare the estimated burden of alcohol consumption calculated using Russia-specific alcohol RRs with the estimated burden of alcohol consumption calculated using alcohol RRs from meta-analyses. Data for 2012 on drinking indicators were calculated based on the Global Information System on Alcohol and Health. Data for 2012 on mortality, Years of Life Lost, Years Lived with Disability, and Disability-Adjusted Life Years (DALYs) lost by cause were obtained by country from the World Health Organization. Alcohol Population-Attributable Fractions (PAFs) were calculated based on a risk modelling methodology from Russia. These PAFs were compared to PAFs calculated using methods applied for all other countries. The 95% Uncertainty Intervals (UIs) for the alcohol PAFs were calculated using a Monte Carlo-like method. Using Russia-specific alcohol RR functions, in Russia in 2012 alcohol caused an estimated 231,900 deaths (95% UI: 185,600 to 278,200) (70,800 deaths among women and 161,100 deaths among men) and 13,295,000 DALYs lost (95% UI: 11,242,000 to 15,348,000) (3,670,000 DALYs lost among women and 9,625,000 DALYs lost among men) among people 0 to 64 years of age. This compares to an estimated 165,600 deaths (95% UI: 97,200 to 228,100) (29,700 deaths among women and 135,900 deaths among men) and 10,623,000 DALYs lost (95% UI: 7,265,000 to 13,754,000) (1,783,000 DALYs lost among women and 8,840,000 DALYs lost among men) among people 0 to 64 years of age caused by alcohol when non-Russia-specific alcohol RRs were used. 
Results indicate that if the Russia-specific RRs are used when estimating the health burden attributable to alcohol consumption in Russia, the total estimated burden is greater than if RRs from meta-analyses are used. Furthermore, additional research is needed to understand which aspects of the Russian style of drinking cause the most harm.
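The PAF computation behind such burden estimates can be sketched with Levin's formula plus a Monte Carlo uncertainty interval over a log-normally distributed RR; the prevalence and RR values below are hypothetical, and this is not the Russia-specific risk model.

```python
import math
import random

def paf(p_exp, rr):
    """Levin's population-attributable fraction for one exposed group."""
    return p_exp * (rr - 1.0) / (p_exp * (rr - 1.0) + 1.0)

def paf_uncertainty(p_exp, rr, rr_lo, rr_hi, draws=10000, seed=0):
    """Monte Carlo 95% uncertainty interval for the PAF, assuming the RR
    is log-normally distributed with the given 95% CI."""
    rng = random.Random(seed)
    mu = math.log(rr)
    sd = (math.log(rr_hi) - math.log(rr_lo)) / (2 * 1.96)
    sims = sorted(paf(p_exp, math.exp(rng.gauss(mu, sd)))
                  for _ in range(draws))
    return sims[int(0.025 * draws)], sims[int(0.975 * draws)]

# hypothetical inputs: 30% exposed, RR 2.0 (95% CI 1.5-2.7)
point = paf(0.30, 2.0)
lo, hi = paf_uncertainty(0.30, 2.0, 1.5, 2.7)
```

Attributable deaths or DALYs then follow by multiplying each cause-specific total by its PAF.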
Bahouth, George; Digges, Kennerly; Schulman, Carl
2012-01-01
This paper presents methods to estimate crash injury risk based on crash characteristics captured by some passenger vehicles equipped with Advanced Automatic Crash Notification technology. The resulting injury risk estimates could be used within an algorithm to optimize rescue care. Regression analysis was applied to the National Automotive Sampling System / Crashworthiness Data System (NASS/CDS) to determine how variations in a specific injury risk threshold would influence the accuracy of predicting crashes with serious injuries. The recommended thresholds for classifying crashes with severe injuries are 0.10 for frontal crashes and 0.05 for side crashes. The regression analysis of NASS/CDS indicates that these thresholds will provide sensitivity above 0.67 while maintaining a positive predictive value in the range of 0.20. PMID:23169132
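The threshold-based classification the paper evaluates reduces to flagging crashes whose predicted serious-injury risk exceeds a cut-off and then checking sensitivity and positive predictive value; the toy data below are hypothetical, not NASS/CDS.

```python
def classify_metrics(pred_risk, severe, threshold):
    """Sensitivity and positive predictive value of flagging crashes
    whose predicted serious-injury risk meets or exceeds a threshold."""
    tp = sum(1 for p, s in zip(pred_risk, severe) if p >= threshold and s)
    fp = sum(1 for p, s in zip(pred_risk, severe) if p >= threshold and not s)
    fn = sum(1 for p, s in zip(pred_risk, severe) if p < threshold and s)
    sens = tp / (tp + fn) if tp + fn else 0.0
    ppv = tp / (tp + fp) if tp + fp else 0.0
    return sens, ppv

# hypothetical predicted risks and outcomes for eight crashes
risks  = [0.02, 0.08, 0.12, 0.30, 0.05, 0.15, 0.01, 0.25]
severe = [False, False, True, True, False, True, False, False]
sens, ppv = classify_metrics(risks, severe, threshold=0.10)
```

Sweeping the threshold over such data is how one trades sensitivity against positive predictive value, as in the 0.10/0.05 recommendations above.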
Deterministic SLIR model for tuberculosis disease mapping
NASA Astrophysics Data System (ADS)
Aziz, Nazrina; Diah, Ijlal Mohd; Ahmad, Nazihah; Kasim, Maznah Mat
2017-11-01
Tuberculosis (TB) occurs worldwide. It can be transmitted to others directly through air when active TB persons sneeze, cough or spit. In Malaysia, it was reported that TB cases had been recognized as one of the most infectious disease that lead to death. Disease mapping is one of the methods that can be used as the prevention strategies since it can displays clear picture for the high-low risk areas. Important thing that need to be considered when studying the disease occurrence is relative risk estimation. The transmission of TB disease is studied through mathematical model. Therefore, in this study, deterministic SLIR models are used to estimate relative risk for TB disease transmission.
Lee, Peter N; Fry, John S; Thornton, Alison J
2014-02-01
We attempted to quantify the decline in stroke risk following quitting using the negative exponential model, with methodology previously employed for IHD. We identified 22 blocks of RRs (from 13 studies) comparing current smokers, former smokers (by time quit) and never smokers. Corresponding pseudo-numbers of cases and controls/at risk formed the data for model-fitting. We tried to estimate the half-life (H, time since quit when the excess risk becomes half that for a continuing smoker) for each block. The method failed to converge or produced very variable estimates of H in nine blocks with a current smoker RR <1.40. Rejecting these, and combining blocks by amount smoked in one study where problems arose in model-fitting, the final analyses used 11 blocks. Goodness-of-fit was adequate for each block, the combined estimate of H being 4.78(95%CI 2.17-10.50) years. However, considerable heterogeneity existed, unexplained by any factor studied, with the random-effects estimate 3.08(1.32-7.16). Sensitivity analyses allowing for reverse causation or differing assumed times for the final quitting period gave similar results. The estimates of H are similar for stroke and IHD, and the individual estimates similarly heterogeneous. Fitting the model is harder for stroke, due to its weaker association with smoking. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
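A negative exponential decline model of the kind described can be sketched as follows; the grid-search least-squares fit and the example former-smoker RRs are illustrative stand-ins for the paper's pseudo-likelihood fitting.

```python
import math

def rr_at(t, rr_current, half_life):
    """Negative exponential model: the excess risk of a former smoker
    halves every `half_life` years since quitting."""
    return 1.0 + (rr_current - 1.0) * math.exp(-math.log(2) * t / half_life)

def fit_half_life(times, rrs, rr_current):
    """Grid-search least-squares estimate of the half-life H
    (illustrative; not the paper's fitting procedure)."""
    best_h, best_sse = None, float("inf")
    for h10 in range(1, 301):                      # H from 0.1 to 30.0 years
        h = h10 / 10.0
        sse = sum((rr_at(t, rr_current, h) - rr) ** 2
                  for t, rr in zip(times, rrs))
        if sse < best_sse:
            best_h, best_sse = h, sse
    return best_h

# hypothetical former-smoker RRs by years quit (current-smoker RR = 1.8)
times = [1, 3, 5, 10, 20]
rrs = [1.70, 1.52, 1.40, 1.20, 1.05]
H = fit_half_life(times, rrs, rr_current=1.8)
```

The convergence failures noted above arise when the current-smoker RR is close to 1: the excess risk is then too small for its decay to be pinned down.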
Joshi, Amit D; Kim, Andre; Lewinger, Juan Pablo; Ulrich, Cornelia M; Potter, John D; Cotterchio, Michelle; Le Marchand, Loic; Stern, Mariana C
2015-06-01
Diets high in red meat and processed meats are established colorectal cancer (CRC) risk factors. However, it is still not well understood what explains this association. We conducted comprehensive analyses of CRC risk and red meat and poultry intakes, taking into account cooking methods, level of doneness, estimated intakes of heterocyclic amines (HCAs) that accumulate during meat cooking, tumor location, and tumor mismatch repair proficiency (MMR) status. We analyzed food frequency and portion size data including a meat cooking module for 3364 CRC cases, 1806 unaffected siblings, 136 unaffected spouses, and 1620 unaffected population-based controls, recruited into the CRC Family Registry. Odds ratios (OR) and 95% confidence intervals (CI) for nutrient density variables were estimated using generalized estimating equations. We found no evidence of an association between total nonprocessed red meat or total processed meat and CRC risk. Our main finding was a positive association with CRC for pan-fried beefsteak (P(trend) < 0.001), which was stronger among MMR deficient cases (heterogeneity P = 0.059). Other noteworthy associations, of borderline statistical significance after multiple testing correction, were a positive association between diets high in oven-broiled short ribs or spareribs and CRC risk (P(trend) = 0.002), which was also stronger among MMR-deficient cases, and an inverse association with grilled hamburgers (P(trend) = 0.002). Our results support the role of specific meat types and cooking practices as possible sources of human carcinogens relevant for CRC risk. © 2015 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.
Guo, How-Ran
2011-10-20
Despite its limitations, ecological study design is widely applied in epidemiology. In most cases, adjustment for age is necessary, but different methods may lead to different conclusions. To compare three methods of age adjustment, a study on the associations between arsenic in drinking water and incidence of bladder cancer in 243 townships in Taiwan was used as an example. A total of 3068 cases of bladder cancer, including 2276 men and 792 women, were identified during a ten-year study period in the study townships. Three methods were applied to analyze the same data set on the ten-year study period. The first (Direct Method) applied direct standardization to obtain standardized incidence rate and then used it as the dependent variable in the regression analysis. The second (Indirect Method) applied indirect standardization to obtain standardized incidence ratio and then used it as the dependent variable in the regression analysis instead. The third (Variable Method) used proportions of residents in different age groups as a part of the independent variables in the multiple regression models. All three methods showed a statistically significant positive association between arsenic exposure above 0.64 mg/L and incidence of bladder cancer in men and women, but different results were observed for the other exposure categories. In addition, the risk estimates obtained by different methods for the same exposure category were all different. Using an empirical example, the current study confirmed the argument made by other researchers previously that whereas the three different methods of age adjustment may lead to different conclusions, only the third approach can obtain unbiased estimates of the risks. The third method can also generate estimates of the risk associated with each age group, but the other two are unable to evaluate the effects of age directly.
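The first two age-adjustment methods rest on direct and indirect standardization, which can be sketched with hypothetical two-age-group numbers:

```python
def direct_standardized_rate(rates_by_age, std_pop_by_age):
    """Direct standardization: apply the area's age-specific rates to a
    standard population's age structure."""
    total = sum(std_pop_by_age)
    return sum(r * p for r, p in zip(rates_by_age, std_pop_by_age)) / total

def standardized_incidence_ratio(observed, pop_by_age, ref_rates_by_age):
    """Indirect standardization: observed cases over the cases expected
    if reference age-specific rates applied to the area's population."""
    expected = sum(r * p for r, p in zip(ref_rates_by_age, pop_by_age))
    return observed / expected

# hypothetical township with two age groups
rates = [0.0002, 0.0010]        # township age-specific incidence rates
std_pop = [60000, 40000]        # standard population age structure
dsr = direct_standardized_rate(rates, std_pop)

pop = [5000, 2000]              # township population by age group
ref = [0.0001, 0.0008]          # reference (e.g. national) rates
sir = standardized_incidence_ratio(3, pop, ref)
```

The third method instead enters the age-group proportions directly as regression covariates, which is what allows the age effects themselves to be estimated.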
Analysis of dengue fever risk using geostatistics model in bone regency
NASA Astrophysics Data System (ADS)
Amran, Stang, Mallongi, Anwar
2017-03-01
This research aims to analyze dengue fever risk based on a geostatistics model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a Binomial distribution. The effects of temperature, rainfall, elevation, and larval abundance are investigated through the geostatistics model. A Bayesian hierarchical method is used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.
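As a simplified stand-in for the Bayesian hierarchical estimation described, a conjugate beta-binomial update for a single location's binomial risk parameter looks like this (the uniform prior and the case counts are hypothetical):

```python
def beta_binomial_posterior(cases, population, a=1.0, b=1.0):
    """Conjugate update: a Beta(a, b) prior on the binomial risk
    parameter combined with `cases` out of `population` trials."""
    return a + cases, b + population - cases

def posterior_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# hypothetical location: 12 dengue cases among 10,000 residents
a_post, b_post = beta_binomial_posterior(12, 10000)
risk = posterior_mean(a_post, b_post)
```

The full hierarchical model additionally links the per-location parameters to covariates (temperature, rainfall, elevation, larval abundance) and lets locations borrow strength from one another.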
Provision of a wildfire risk map: informing residents in the wildland urban interface.
Mozumder, Pallab; Helton, Ryan; Berrens, Robert P
2009-11-01
Wildfires in the wildland urban interface (WUI) are an increasing concern throughout the western United States and elsewhere. WUI communities continue to grow and thus increase the wildfire risk to human lives and property. Information such as a wildfire risk map can inform WUI residents of potential risks and may help to efficiently sort mitigation efforts. This study uses the survey-based contingent valuation (CV) method to examine annual household willingness to pay (WTP) for the provision of a wildfire risk map. Data were collected through a mail survey of the East Mountain WUI area in the State of New Mexico (USA). The integrated empirical approach includes a system of equations that involves joint estimation of WTP values, along with measures of a respondent's risk perception and risk mitigation behavior. The median estimated WTP is around U.S. $12 for the annual wildfire risk map, which covers at least the costs of producing and distributing available risk information. Further, providing a wildfire risk map can help address policy goals emphasizing information gathering and sharing among stakeholders to mitigate the effects of wildfires.
McMillan, Garnett P; Hanson, Tim; Bedrick, Edward J; Lapham, Sandra C
2005-09-01
This study demonstrates the usefulness of the Bivariate Dale Model (BDM) as a method for estimating the relationship between risk factors and the quantity and frequency of alcohol use, as well as the degree of association between these highly correlated drinking measures. The BDM is used to evaluate childhood sexual abuse, along with age and gender, as risk factors for the quantity and frequency of beer consumption in a sample of driving-while-intoxicated (DWI) offenders (N = 1,964; 1,612 men). The BDM allows one to estimate the relative odds of drinking up to each level of ordinal-scaled quantity and frequency of alcohol use, as well as model the degree of association between quantity and frequency of alcohol consumption as a function of covariates. Individuals who experienced childhood sexual abuse have increased risks of higher quantity and frequency of beer consumption. History of childhood sexual abuse has a greater effect on women, causing them to drink higher quantities of beer per drinking occasion. The BDM is a useful method for evaluating predictors of the quantity-frequency of alcohol consumption. SAS macrocode for fitting the BDM model is provided.
Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model
NASA Astrophysics Data System (ADS)
Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley
2017-05-01
Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area; it carries the greatest uncertainty when the disease is rare or the geographical area is small. Bayesian models, or statistical smoothing based on the log-normal model, have therefore been introduced to address this weakness of the SMR. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, fitted to the data using WinBUGS software. The study starts with a brief review of these models, first the SMR method and then the log-normal model, which are then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method, and that it can overcome the SMR's problem when no bladder cancer is observed in an area.
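The two estimators compared can be sketched as follows; the shrinkage step is a crude empirical-Bayes approximation to log-normal smoothing, not the paper's WinBUGS model, and the counts are hypothetical.

```python
import math

def smr(observed, expected):
    """Standardized Morbidity Ratio: observed / expected cases."""
    return observed / expected

def lognormal_shrunk_rr(observed, expected, prior_mean=0.0, prior_var=0.5):
    """Crude empirical-Bayes sketch of log-normal smoothing: shrink
    log(SMR) toward a prior mean, weighting by approximate variances
    (var(log SMR) ~= 1/observed)."""
    if observed == 0:
        return math.exp(prior_mean)        # fully shrunk when no cases
    log_smr = math.log(observed / expected)
    data_var = 1.0 / observed
    w = prior_var / (prior_var + data_var)
    return math.exp(w * log_smr + (1 - w) * prior_mean)

raw = smr(2, 0.8)                  # unstable: few cases, small area
smooth = lognormal_shrunk_rr(2, 0.8)
```

Note how the zero-case branch returns a finite relative risk where the raw SMR would be exactly zero, the failure mode the abstract highlights.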
Dinitz, Laura B.
2008-01-01
With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. 
HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.
Lung cancer risk from PAHs emitted from biomass combustion.
Sarigiannis, Dimosthenis Α; Karakitsios, Spyros P; Zikopoulos, Dimitrios; Nikolaki, Spyridoula; Kermenidou, Marianthi
2015-02-01
This study deals with the assessment of the cancer risk attributable to PAH exposure arising from the increased use of biomass for space heating in Greece in the winter of 2012-2013. Three fractions of particulates (PM1, PM2.5 and PM10) were measured at two sampling sites (urban/residential and traffic-influenced), followed by chemical analysis of 19 PAHs and levoglucosan (used as a biomarker tracer). PAH-induced lung cancer risk was estimated by a comprehensive methodology that incorporated human respiratory tract deposition modelling in order to estimate the toxic equivalent concentration (TEQ) at each target tissue. This allowed us to further differentiate internal exposure and risk by age group. Results showed that all PM fractions are higher in Greece during the cold months of the year, mainly due to biomass use for space heating. PAH and levoglucosan levels were highly correlated, indicating that particles emitted from biomass combustion are more toxic than PM emitted from other sources. The estimated lung cancer risk was non-negligible for residents close to the urban background monitoring site. Higher risk was estimated for infants and children, due to the higher bodyweight-normalized dose and the human respiratory tract (HRT) physiology. HRT structure and physiology in youngsters favor deposition of particles that are smaller and more toxic per unit mass. In all cases, the estimated risk (5.7E-07 and 1.4E-06 for the urban background site and 1.4E-07 to 5.0E-07 for the traffic site) was lower than the one estimated by the conventional methodology (2.8E-06 and 9.7E-07 for the urban background and the traffic site respectively) that is based on the Inhalation Unit Risk; the latter assumes that all PAHs adsorbed on particles are taken up by humans. With the methodology proposed herein, the estimated risk presents a 5-7 times difference between the two sampling sites (depending on the age group).
These differences could not have been identified had we relied only on the conventional risk assessment method. Consequently, the actual cancer risk attributable to PAHs on PM emitted from biomass burning would have been significantly underestimated. Copyright © 2014 Elsevier Inc. All rights reserved.
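The conventional TEQ-times-unit-risk calculation that the paper refines can be sketched as follows; the concentrations and potency factors below are hypothetical, and the WHO benzo[a]pyrene unit risk is used only for illustration.

```python
def teq(concentrations, potency_factors):
    """Benzo[a]pyrene toxic-equivalent (TEQ) concentration: each PAH's
    concentration weighted by its relative potency factor."""
    return sum(c * f for c, f in zip(concentrations, potency_factors))

def screening_lung_cancer_risk(teq_ng_m3, unit_risk=8.7e-5):
    """Conventional screening estimate: TEQ (ng/m3) times an inhalation
    unit risk (WHO's 8.7e-5 per ng/m3 for B[a]P used here as an
    illustrative value)."""
    return teq_ng_m3 * unit_risk

# hypothetical ambient concentrations (ng/m3) and potency factors
concs = [0.5, 0.2, 1.0]            # e.g. B[a]P, B[b]F, chrysene
tefs  = [1.0, 0.1, 0.01]           # B[a]P-relative potencies
risk = screening_lung_cancer_risk(teq(concs, tefs))
```

The paper's refinement replaces the "all adsorbed PAHs are taken up" assumption of this screening formula with age-specific respiratory-tract deposition modelling.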
A Figure of Merit: Quantifying the Probability of a Nuclear Reactor Accident.
Wellock, Thomas R
In recent decades, probabilistic risk assessment (PRA) has become an essential tool in risk analysis and management in many industries and government agencies. The origins of PRA date to the 1975 publication of the U.S. Nuclear Regulatory Commission's (NRC) Reactor Safety Study led by MIT professor Norman Rasmussen. The "Rasmussen Report" inspired considerable political and scholarly disputes over the motives behind it and the value of its methods and numerical estimates of risk. The Report's controversies have overshadowed the deeper technical origins of risk assessment. Nuclear experts had long sought to express risk in a "figure of merit" to verify the safety of weapons and, later, civilian reactors. By the 1970s, technical advances in PRA gave the methodology the potential to serve political ends, too. The Report, it was hoped, would prove nuclear power's safety to a growing chorus of critics. Subsequent attacks on the Report's methods and numerical estimates damaged the NRC's credibility. PRA's fortunes revived when the 1979 Three Mile Island accident demonstrated PRA's potential for improving the safety of nuclear power and other technical systems. Nevertheless, the Report's controversies endure in mistrust of PRA and its experts.
Burger, Andrew E; Reither, Eric N
2014-06-30
Despite the availability of vaccines that mitigate the health risks associated with seasonal influenza, most individuals in the U.S. remain unvaccinated. Monitoring vaccination uptake for seasonal influenza, especially among disadvantaged or high-risk groups, is therefore an important public health activity. The Behavioral Risk Factor Surveillance System (BRFSS) - the largest telephone-based health surveillance system in the world - is an important resource in monitoring population health trends, including influenza vaccination. However, due to limitations in the question that measures influenza vaccination status, difficulties arise in estimating seasonal vaccination rates. Although researchers have proposed various methodologies to address this issue, no systematic review of these methodologies exists. By subjecting these methods to tests of sensitivity and specificity, we identify their strengths and weaknesses and advance a new method for estimating national and state-level vaccination rates with BRFSS data. To ensure that our findings are not anomalous to the BRFSS, we also analyze data from the National Health Interview Survey (NHIS). For both studies, we find that restricting the sample to interviews conducted between January and September offers the best balance of sensitivity (>90% on average), specificity (>90% on average), and statistical power (retention of 92.2% of vaccinations from the target flu season) over other proposed methods. We conclude that including survey participants from these months provides a simple and effective way to estimate seasonal influenza vaccination rates with BRFSS and NHIS data, and we discuss potential ways to better estimate vaccination rates in future epidemiologic surveys. Copyright © 2014 Elsevier Ltd. All rights reserved.
Estimating risk and rate levels, ratios and differences in case-control studies.
King, Gary; Zeng, Langche
2002-05-30
Classic (or 'cumulative') case-control sampling designs do not admit inferences about quantities of interest other than risk ratios, and then only by making the rare events assumption. Probabilities, risk differences and other quantities cannot be computed without knowledge of the population incidence fraction. Similarly, density (or 'risk set') case-control sampling designs do not allow inferences about quantities other than the rate ratio. Rates, rate differences, cumulative rates, risks, and other quantities cannot be estimated unless auxiliary information about the underlying cohort such as the number of controls in each full risk set is available. Most scholars who have considered the issue recommend reporting more than just risk and rate ratios, but auxiliary population information needed to do this is not usually available. We address this problem by developing methods that allow valid inferences about all relevant quantities of interest from either type of case-control study when completely ignorant of or only partially knowledgeable about relevant auxiliary population information.
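Given the kind of auxiliary population information the paper discusses (here, an externally known baseline risk), an odds ratio from a cumulative case-control design can be converted into risk quantities. A minimal sketch with hypothetical numbers:

```python
def risks_from_odds_ratio(odds_ratio, baseline_risk):
    """Convert a case-control odds ratio into a risk ratio and a risk
    difference, given external knowledge of the baseline (unexposed)
    risk in the population."""
    p0 = baseline_risk
    odds1 = odds_ratio * p0 / (1 - p0)   # exposed odds
    p1 = odds1 / (1 + odds1)             # exposed risk
    return p1 / p0, p1 - p0

# hypothetical: OR = 2.0, baseline risk 5%
rr, rd = risks_from_odds_ratio(2.0, 0.05)
```

As the baseline risk approaches zero the risk ratio approaches the odds ratio, which is exactly the rare events assumption the abstract mentions; at a 5% baseline the two already differ noticeably (1.90 vs 2.0).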
Xue, Xiaonan; Shore, Roy E; Ye, Xiangyang; Kim, Mimi Y
2004-10-01
Occupational exposures are often recorded as zero when the exposure is below the minimum detection level (BMDL). This can lead to an underestimation of the doses received by individuals and can lead to biased estimates of risk in occupational epidemiologic studies. The extent of the exposure underestimation is increased with the magnitude of the minimum detection level (MDL) and the frequency of monitoring. This paper uses multiple imputation methods to impute values for the missing doses due to BMDL. A Gibbs sampling algorithm is developed to implement the method, which is applied to two distinct scenarios: when dose information is available for each measurement (but BMDL is recorded as zero or some other arbitrary value), or when the dose information available represents the summation of a series of measurements (e.g., only yearly cumulative exposure is available but based on, say, weekly measurements). Then the average of the multiple imputed exposure realizations for each individual is used to obtain an unbiased estimate of the relative risk associated with exposure. Simulation studies are used to evaluate the performance of the estimators. As an illustration, the method is applied to a sample of historical occupational radiation exposure data from the Oak Ridge National Laboratory.
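A simplified version of the imputation idea, assuming a known lognormal exposure distribution rather than the paper's Gibbs-sampled parameters, can be sketched as:

```python
import math
import random

def impute_bmdl(doses, mdl, mu, sigma, n_imputations=5, seed=0):
    """Multiple-imputation sketch: replace zero-recorded doses (below
    the minimum detection level) with draws from an assumed lognormal
    exposure distribution truncated above at the MDL, via rejection
    sampling; mu and sigma are taken as known here for illustration."""
    rng = random.Random(seed)

    def draw_below(limit):
        while True:                        # rejection sampling
            x = math.exp(rng.gauss(mu, sigma))
            if x < limit:
                return x

    completed = [[d if d > 0 else draw_below(mdl) for d in doses]
                 for _ in range(n_imputations)]
    # average the imputed realizations for each measurement
    return [sum(vals) / n_imputations for vals in zip(*completed)]

doses = [0.0, 1.2, 0.0, 3.4]       # zeros are below-MDL records
filled = impute_bmdl(doses, mdl=0.5, mu=0.0, sigma=1.0)
```

Averaging the multiple imputed realizations per individual, as above, is what the abstract describes feeding into the relative-risk estimation.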
Concepts in ecological risk assessment. Professional paper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, R.K.; Seligman, P.F.
1991-05-01
Assessing the risk of impact to natural ecosystems from xenobiotic compounds requires an accurate characterization of the threatened ecosystem, direct measures or estimates of environmental exposure, and a comprehensive evaluation of the biological effects from actual and potential contamination. Field and laboratory methods have been developed to obtain direct measures of environmental health. These methods have been implemented in monitoring programs to assess and verify the ecological risks of contamination from past events, such as hazardous waste disposal sites, as well as future scenarios, such as the environmental consequences from the use of biocides in antifouling bottom paints for ships.
Ezoe, Satoshi; Morooka, Takeo; Noda, Tatsuya; Sabin, Miriam Lewis; Koike, Soichi
2012-01-01
Men who have sex with men (MSM) are one of the groups most at risk for HIV infection in Japan. However, size estimates of MSM populations have not been conducted with sufficient frequency and rigor because of the difficulty, high cost and stigma associated with reaching such populations. This study examined an innovative and simple method for estimating the size of the MSM population in Japan. We combined an internet survey with the network scale-up method, a social network method for estimating the size of hard-to-reach populations, for the first time in Japan. An internet survey was conducted among 1,500 internet users who registered with a nationwide internet-research agency. The survey participants were asked how many members of particular groups with known population sizes (firefighters, police officers, and military personnel) they knew as acquaintances. The participants were also asked to identify the number of their acquaintances whom they understood to be MSM. Using these survey results with the network scale-up method, the personal network size and MSM population size were estimated. The personal network size was estimated to be 363.5 regardless of the sex of the acquaintances and 174.0 for only male acquaintances. The estimated MSM prevalence among the total male population in Japan was 0.0402% without adjustment, and 2.87% after adjusting for the transmission error of MSM. The estimated personal network size and MSM prevalence seen in this study were comparable to those from previous survey results based on the direct-estimation method. Estimating population sizes through combining an internet survey with the network scale-up method appeared to be an effective method from the perspectives of rapidity, simplicity, and low cost as compared with more conventional methods.
Estimating the probability of rare events: addressing zero failure data.
Quigley, John; Revie, Matthew
2011-07-01
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
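The closing approximation is simple enough to state in a few lines. The sketch below applies the 1/(2.5n) rule for zero-event data and falls back to the MLE otherwise, and compares it against the classical "rule of three" 95% upper bound (3/n), which is one of the standard zero-event approaches the abstract argues is pessimistic; the fallback structure is my simplification, not the paper's full minimax procedure.

```python
def event_probability(events, trials):
    """Point estimate of an event probability. For zero observed events,
    use the minimax-motivated approximation 1/(2.5 n); otherwise use the
    MLE events/trials, which the minimax estimate closely approximates."""
    if events == 0:
        return 1.0 / (2.5 * trials)
    return events / trials

# Zero failures in 1000 trials: the MLE would say 0, which is unhelpful
print(event_probability(0, 1000))
# Classical "rule of three" 95% upper bound for comparison: 3/n
print(3 / 1000)
```

Note the minimax point estimate is much smaller than the rule-of-three bound, reflecting that the latter is a conservative upper confidence limit rather than a point estimate.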
Analysis of percent density estimates from digital breast tomosynthesis projection images
NASA Astrophysics Data System (ADS)
Bakic, Predrag R.; Kontos, Despina; Zhang, Cuiping; Yaffe, Martin J.; Maidment, Andrew D. A.
2007-03-01
Women with dense breasts have an increased risk of breast cancer. Breast density is typically measured as the percent density (PD), the percentage of non-fatty (i.e., dense) tissue in breast images. Mammographic PD estimates vary, in part, due to the projective nature of mammograms. Digital breast tomosynthesis (DBT) is a novel radiographic method in which 3D images of the breast are reconstructed from a small number of projection (source) images, acquired at different positions of the x-ray focus. DBT provides superior visualization of breast tissue and has improved sensitivity and specificity as compared to mammography. Our long-term goal is to test the hypothesis that PD obtained from DBT is superior in estimating cancer risk compared with other modalities. As a first step, we have analyzed the PD estimates from DBT source projections since the results would be independent of the reconstruction method. We estimated PD from MLO mammograms (PD_M) and from individual DBT projections (PD_T). We observed good agreement between PD_M and PD_T from the central projection images of 40 women. This suggests that variations in breast positioning, dose, and scatter between mammography and DBT do not negatively affect PD estimation. The PD_T estimated from individual DBT projections of nine women varied with the angle between the projections. This variation is caused by the 3D arrangement of the breast dense tissue and the acquisition geometry.
Estimating the asbestos-related lung cancer burden from mesothelioma mortality
McCormack, V; Peto, J; Byrnes, G; Straif, K; Boffetta, P
2012-01-01
Background: Quantifying the asbestos-related lung cancer burden is difficult in the presence of this disease's multiple causes. We explore two methods to estimate this burden using mesothelioma deaths as a proxy for asbestos exposure. Methods: From the follow-up of 55 asbestos cohorts, we estimated ratios of (i) absolute number of asbestos-related lung cancers to mesothelioma deaths; (ii) excess lung cancer relative risk (%) to mesothelioma mortality per 1000 non-asbestos-related deaths. Results: Ratios varied by asbestos type; there were a mean 0.7 (95% confidence interval 0.5, 1.0) asbestos-related lung cancers per mesothelioma death in crocidolite cohorts (n=6 estimates), 6.1 (3.6, 10.5) in chrysotile (n=16), 4.0 (2.8, 5.9) in amosite (n=4) and 1.9 (1.4, 2.6) in mixed asbestos fibre cohorts (n=31). In a population with 2 mesothelioma deaths per 1000 deaths at ages 40–84 years (e.g., US men), the lung cancer population attributable fraction due to mixed asbestos was estimated to be 4.0%. Conclusion: All types of asbestos fibres kill at least twice as many people through lung cancer as through mesothelioma, except for crocidolite. For chrysotile, widely consumed today, asbestos-related lung cancers cannot be robustly estimated from the few mesothelioma deaths, and the latter cannot be used to infer no excess risk of lung or other cancers. PMID:22233924
NASA Astrophysics Data System (ADS)
Wang, Xuchu; Niu, Yanmin
2011-02-01
Automatic measurement of vessels from fundus images is a crucial step for assessing vessel anomalies in the ophthalmological community, where the change in retinal vessel diameters is believed to be indicative of the risk level of diabetic retinopathy. In this paper, a new retinal vessel diameter measurement method combining vessel orientation estimation and filter response is proposed. Its interesting characteristics include: (1) different from the methods that only fit the vessel profiles, the proposed method extracts a more stable and accurate vessel diameter by casting this problem as a maximal response problem of a variant of the Gabor filter; (2) the proposed method can directly and efficiently estimate the vessel's orientation, which is usually captured by time-consuming multi-orientation fitting techniques in many existing methods. Experimental results show that the proposed method retains computational simplicity while achieving stable and accurate estimation results.
Bowman, Gene L.; Shannon, Jackilen; Ho, Emily; Traber, Maret G.; Frei, Balz; Oken, Barry S.; Kaye, Jeffery A.; Quinn, Joseph F.
2010-01-01
Introduction There is great interest in nutritional strategies for the prevention of age-related cognitive decline, yet the best methods for nutritional assessment in populations at risk for dementia are still evolving. Our study objective was to test the reliability and validity of two common nutritional assessments (plasma nutrient biomarkers and Food Frequency Questionnaire) in people at risk for dementia. Methods Thirty-eight elders, half with amnestic Mild Cognitive Impairment (MCI) and half with intact cognition, were recruited. Nutritional assessments were collected together at baseline and again at 1 month. Intraclass and Pearson correlation coefficients quantified reliability and validity. Results Twenty-six nutrients were examined and reliability was very good or better for 77% (20/26, ICC ≥ .75) of the plasma nutrient biomarkers and for 88% of the FFQ estimates. Twelve of the plasma nutrient estimates were as reliable as the commonly measured plasma cholesterol (ICC = .92). FFQ and plasma long-chain fatty acids (docosahexaenoic acid, r = .39; eicosapentaenoic acid, r = .39) and carotenoids (α-carotene, r = .49; lutein + zeaxanthin, r = .48; β-carotene, r = .43; β-cryptoxanthin, r = .41) were correlated, but no other FFQ estimates correlated with respective nutrient biomarkers. Correlations between FFQ and plasma fatty acids and carotenoids were significantly stronger after removing subjects with MCI. Conclusion The reliability and validity of plasma and FFQ nutrient estimates vary according to the nutrient of interest. Memory deficit attenuates FFQ estimate validity and inflates FFQ estimate reliability. Many plasma nutrient biomarkers have very good reliability over 1 month regardless of memory state. This method can circumvent sources of error seen in other less direct methods of nutritional assessment. PMID:20856100
Nishino, Jo; Kochi, Yuta; Shigemizu, Daichi; Kato, Mamoru; Ikari, Katsunori; Ochi, Hidenori; Noma, Hisashi; Matsui, Kota; Morizono, Takashi; Boroevich, Keith A.; Tsunoda, Tatsuhiko; Matsui, Shigeyuki
2018-01-01
Genome-wide association studies (GWAS) suggest that the genetic architecture of complex diseases consists of unexpectedly numerous variants with small effect sizes. However, the polygenic architectures of many diseases have not been well characterized due to lack of simple and fast methods for unbiased estimation of the underlying proportion of disease-associated variants and their effect-size distribution. Applying empirical Bayes estimation of semi-parametric hierarchical mixture models to GWAS summary statistics, we confirmed that schizophrenia was extremely polygenic (~40% of independent genome-wide SNPs are risk variants, most with small effect sizes, OR ≤ 1.03), whereas rheumatoid arthritis was less polygenic (~4 to 8% risk variants, with a significant portion reaching OR = 1.05 to 1.1). For rheumatoid arthritis, stratified estimations revealed that expression quantitative trait loci in blood explained a large portion of the genetic variance, and low- and high-frequency derived alleles were prone to be risk and protective, respectively, suggesting a predominance of deleterious-risk and advantageous-protective mutations. Despite genetic correlation, effect-size distributions for schizophrenia and bipolar disorder differed across allele frequency. These analyses distinguished disease polygenic architectures and provided clues for etiological differences in complex diseases. PMID:29740473
Artificial Intelligence Estimation of Carotid-Femoral Pulse Wave Velocity using Carotid Waveform.
Tavallali, Peyman; Razavi, Marianne; Pahlevan, Niema M
2018-01-17
In this article, we offer an artificial intelligence method to estimate the carotid-femoral Pulse Wave Velocity (PWV) non-invasively from one uncalibrated carotid waveform measured by tonometry and a few routine clinical variables. Since the signal processing inputs to this machine learning algorithm are sensor agnostic, the presented method can accompany any medical instrument that provides a calibrated or uncalibrated carotid pressure waveform. Our results show that, for an unseen held-back test set population in the age range of 20 to 69, our model can estimate PWV with a Root-Mean-Square Error (RMSE) of 1.12 m/sec compared to the reference method. The results convey the fact that this model is a reliable surrogate of PWV. Our study also showed that estimated PWV was significantly associated with an increased risk of cardiovascular diseases (CVDs).
Dynamic drought risk assessment using crop model and remote sensing techniques
NASA Astrophysics Data System (ADS)
Sun, H.; Su, Z.; Lv, J.; Li, L.; Wang, Y.
2017-02-01
Drought risk assessment is of great significance for reducing agricultural drought losses and ensuring food security. The conventional drought risk assessment method evaluates a region's exposure to the hazard and its vulnerability to extended periods of water shortage, which is a static evaluation. Dynamic Drought Risk Assessment (DDRA) instead estimates drought risk according to crop growth and water stress conditions in real time. In this study, a DDRA method using a crop model and remote sensing techniques was proposed. The crop model employed is the DeNitrification and DeComposition (DNDC) model. Drought risk was quantified by the yield losses predicted by the crop model in a scenario-based approach. The crop model was re-calibrated to improve its performance using the Leaf Area Index (LAI) retrieved from MODerate Resolution Imaging Spectroradiometer (MODIS) data. The in-situ, station-based crop model was then extended to assess regional drought risk by integrating crop planting maps. The crop planted area was extracted from MODIS data with the extended CPPI method. This study was implemented and validated on the maize crop in Liaoning province, China.
Barker, S Fiona; Amoah, Philip; Drechsel, Pay
2014-07-15
With a rapidly growing urban population in Kumasi, Ghana, the consumption of street food is increasing. Raw salads, which often accompany street food dishes, are typically composed of perishable vegetables that are grown in close proximity to the city using poor quality water for irrigation. This study assessed the risk of gastroenteritis illness (caused by rotavirus, norovirus and Ascaris lumbricoides) associated with the consumption of street food salads using Quantitative Microbial Risk Assessment (QMRA). Three different risk assessment models were constructed, based on availability of microbial concentrations: 1) Water - starting from irrigation water quality, 2) Produce - starting from the quality of produce at market, and 3) Street - using microbial quality of street food salad. In the absence of viral concentrations, published ratios between faecal coliforms and viruses were used to estimate the quality of water, produce and salad, and annual disease burdens were determined. Rotavirus dominated the estimates of annual disease burden (~10^-3 Disability-Adjusted Life Years per person per year (DALYs pppy)), although norovirus also exceeded the 10^-4 DALY threshold for both Produce and Street models. The Water model ignored other on-farm and post-harvest sources of contamination and consistently produced lower estimates of risk; it likely underestimates disease burden and therefore is not recommended. Required log reductions of up to 5.3 (95th percentile) for rotavirus were estimated for the Street model, demonstrating that significant interventions are required to protect the health and safety of street food consumers in Kumasi. Estimates of virus concentrations were a significant source of model uncertainty, and more data on pathogen concentrations are needed to refine QMRA estimates of disease burden. Copyright © 2014 Elsevier B.V. All rights reserved.
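A QMRA burden calculation of the kind described chains a dose-response model into an annual risk and then into DALYs. The sketch below uses an approximate beta-Poisson dose-response with rotavirus-like parameters and made-up dose, illness-probability, and DALY-per-case inputs; none of these values are taken from the study, which does not report them in the abstract.

```python
def beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson dose-response: P(infection) at a mean dose,
    parameterized by alpha and the median infectious dose N50."""
    return 1.0 - (1.0 + dose * (2 ** (1 / alpha) - 1) / n50) ** (-alpha)

def annual_disease_burden(dose_per_serving, alpha, n50, p_ill_given_inf,
                          dalys_per_case, servings_per_year=365):
    """Annual DALY burden per person: per-serving infection risk ->
    annual infection risk -> illness risk -> DALYs."""
    p_inf = beta_poisson(dose_per_serving, alpha, n50)
    p_inf_annual = 1.0 - (1.0 - p_inf) ** servings_per_year
    return p_inf_annual * p_ill_given_inf * dalys_per_case

# Illustrative rotavirus-like dose-response parameters and burden inputs
# (assumed for this sketch, not the study's values)
burden = annual_disease_burden(dose_per_serving=0.01, alpha=0.253, n50=6.17,
                               p_ill_given_inf=0.5, dalys_per_case=0.02)
print(f"{burden:.1e} DALYs per person per year")
```

Comparing the resulting burden against the 10^-4 DALY pppy threshold is exactly the kind of judgement the abstract describes for the Produce and Street models.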
Patient-specific Radiation Dose and Cancer Risk for Pediatric Chest CT
Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Frush, Donald P.
2011-01-01
Purpose: To estimate patient-specific radiation dose and cancer risk for pediatric chest computed tomography (CT) and to evaluate factors affecting dose and risk, including patient size, patient age, and scanning parameters. Materials and Methods: The institutional review board approved this study and waived informed consent. This study was HIPAA compliant. The study included 30 patients (0–16 years old), for whom full-body computer models were recently created from clinical CT data. A validated Monte Carlo program was used to estimate organ dose from eight chest protocols, representing clinically relevant combinations of bow tie filter, collimation, pitch, and tube potential. Organ dose was used to calculate effective dose and risk index (an index of total cancer incidence risk). The dose and risk estimates before and after normalization by volume-weighted CT dose index (CTDIvol) or dose–length product (DLP) were correlated with patient size and age. The effect of each scanning parameter was studied. Results: Organ dose normalized by tube current–time product or CTDIvol decreased exponentially with increasing average chest diameter. Effective dose normalized by tube current–time product or DLP decreased exponentially with increasing chest diameter. Chest diameter was a stronger predictor of dose than weight and total scan length. Risk index normalized by tube current–time product or DLP decreased exponentially with both chest diameter and age. When normalized by DLP, effective dose and risk index were independent of collimation, pitch, and tube potential (<10% variation). Conclusion: The correlations of dose and risk with patient size and age can be used to estimate patient-specific dose and risk. They can further guide the design and optimization of pediatric chest CT protocols. © RSNA, 2011 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11101900/-/DC1 PMID:21467251
Ramírez, Noelia; Cuadras, Anna; Marcé, Rosa Maria
2011-01-01
Background: Inhalation is one of the main means of human exposure to polycyclic aromatic hydrocarbons (PAHs) because of their ubiquitous presence in the atmosphere. However, most studies have considered only PAHs found in the particle phase and have omitted the contribution of the gas-phase PAHs to the risk. Objective: We estimated the lifetime lung cancer risk from PAH exposure by inhalation in people living next to the largest chemical site in Southern Europe and the Mediterranean area. Methods: We determined 18 PAHs in the atmospheric gas and particle phase. We monitored the PAHs for 1 year in three locations near the chemical site in different seasons. We used toxic equivalence factors to calculate benzo[a]pyrene (BaP) equivalents (BaP-eq) for individual PAHs and applied the World Health Organization unit risk (UR) for BaP (UR = 8.7 × 10^-5) to estimate lifetime cancer risks due to PAH exposures. Results: We observed some spatial and seasonal variability in PAH concentrations. The contribution of gas-phase PAHs to the total BaP-eq value was between 34% and 86%. The total estimated average lifetime lung cancer risk due to PAH exposure in the study area was 1.2 × 10^-4. Conclusions: The estimated risk was higher than values recommended by the World Health Organization and U.S. Environmental Protection Agency but lower than the threshold value of 10^-3 that is considered an indication of definite risk according to similar risk studies. The results also showed that risk may be underestimated if the contributions of gas-phase PAHs are not considered. PMID:21478082
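The BaP-equivalent calculation is a weighted sum, and the risk step is a single multiplication by the unit risk. The sketch below uses a small subset of commonly cited toxic equivalence factors and hypothetical concentrations; the study's full 18-PAH TEF list and measured concentrations are not given in the abstract.

```python
# Toxic equivalence factors relative to benzo[a]pyrene; an illustrative
# subset of commonly used values, not the study's full 18-PAH list.
TEF = {"benzo[a]pyrene": 1.0, "dibenz[a,h]anthracene": 1.0,
       "benzo[a]anthracene": 0.1, "chrysene": 0.01, "pyrene": 0.001}

WHO_UNIT_RISK = 8.7e-5   # lifetime lung cancer risk per ng/m^3 of BaP

def bap_equivalent(conc_ng_m3):
    """BaP-eq = sum over PAHs of concentration * TEF."""
    return sum(c * TEF[pah] for pah, c in conc_ng_m3.items())

def lifetime_risk(conc_ng_m3):
    """Lifetime lung cancer risk = WHO unit risk * BaP-eq concentration."""
    return WHO_UNIT_RISK * bap_equivalent(conc_ng_m3)

# Hypothetical annual-mean concentrations (ng/m^3), gas + particle phase
sample = {"benzo[a]pyrene": 0.9, "benzo[a]anthracene": 1.5,
          "chrysene": 2.0, "pyrene": 10.0}
print(f"BaP-eq = {bap_equivalent(sample):.3f} ng/m^3")
print(f"lifetime risk = {lifetime_risk(sample):.1e}")
```

Because low-TEF compounds can dominate gas-phase mass, omitting the gas phase can still remove a large share of the BaP-eq total, which is the underestimation the authors warn about.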
BLASER, Nello; WETTSTEIN, Celina; ESTILL, Janne; VIZCAYA, Luisa SALAZAR; WANDELER, Gilles; EGGER, Matthias; KEISER, Olivia
2014-01-01
Objectives HIV ‘treatment as prevention’ (TasP) describes early treatment of HIV-infected patients intended to reduce viral load (VL) and transmission. Crucial assumptions for estimating TasP's effectiveness are the underlying estimates of transmission risk. We aimed to determine transmission risk during primary infection and the relation of HIV transmission risk to VL. Design Systematic review and meta-analysis. Methods We searched PubMed and Embase databases for studies that established a relationship between VL and transmission risk, or primary infection and transmission risk, in serodiscordant couples. We analyzed assumptions about the relationship between VL and transmission risk, and between duration of primary infection and transmission risk. Results We found 36 eligible articles, based on six different study populations. Studies consistently found that larger VLs lead to higher HIV transmission rates, but assumptions about the shape of this increase varied from exponential increase to saturation. The assumed duration of primary infection ranged from 1.5 to 12 months; for each additional month, the log10 transmission rate ratio between primary and asymptomatic infection decreased by 0.40. Conclusions Assumptions and estimates of the relationship between VL and transmission risk, and the relationship between primary infection and transmission risk, vary substantially, and predictions of TasP's effectiveness should take this uncertainty into account. PMID:24691205
Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh
2017-03-01
Rapid population growth of major urban centres in many developing countries has created massive landfills with extraordinary heights and steep side-slopes, which are frequently surrounded by illegal low-income residential settlements developed too close to them. These extraordinary landfills face high risks of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario are then used to analyse the probability of slope failure and the resulting run-out length to calculate the potential risk of fatalities. In comparison with existing methods, which are solely based on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.
Osawa, Takeshi; Okawa, Shigenori; Kurokawa, Shunji; Ando, Shinichiro
2016-12-01
In this study, we propose a method for estimating the risk of agricultural damage caused by an invasive species when species-specific information is lacking. We defined the "risk" as the product of the invasion probability and the area of crop production potentially damaged. As a case study, we estimated the risk imposed by an invasive weed, Sicyos angulatus, based on simple cellular simulations and governmental data on the area of crop that could potentially be damaged in Miyagi Prefecture, Japan. Simulation results revealed that the current distribution range was sufficiently accurate for practical purposes. Using these results and records of crop areas, we present risk maps for S. angulatus in agricultural fields. Managers will be able to use these maps to rapidly establish a management plan with minimal cost. Our approach will be valuable for establishing a management plan before or during the early stages of invasion.
A fault tree model to assess probability of contaminant discharge from shipwrecks.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I
2014-11-15
Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
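The probability arithmetic behind a fault tree reduces to combining independent basic-event probabilities through OR and AND gates. The sketch below is a generic illustration of that algebra; the tree structure and annual event probabilities are invented for this example and are not the wreck-specific hazards or values from the study.

```python
from functools import reduce

def p_or(probs):
    """OR gate: the intermediate event occurs if any independent basic
    event occurs, so P = 1 - product(1 - p_i)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def p_and(probs):
    """AND gate: all independent basic events must occur, so P = product(p_i)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Hypothetical annual probabilities for one wreck: discharge requires a
# hull breach AND hazardous substance still on board, where a breach can
# come from corrosion, trawling damage, or an anchor strike.
p_breach = p_or([0.02, 0.005, 0.001])     # any breach mechanism
p_discharge = p_and([p_breach, 0.8])      # substance remaining on board
print(f"annual discharge probability: {p_discharge:.4f}")
```

Because each basic-event probability enters the top-event formula explicitly, sensitivity analysis (which input most changes the result) and uncertainty propagation follow naturally, which is the transparency the authors emphasize.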
Common Clinical Practice versus new PRIM Score in Predicting Coronary Heart Disease Risk
Frikke-Schmidt, Ruth; Tybjærg-Hansen, Anne; Schnohr, Peter; Jensen, Gorm B.; Nordestgaard, Børge G.
2011-01-01
Objectives To compare the new Patient Rule Induction Method (PRIM) Score and common clinical practice with the Framingham Point Score for classification of individuals with respect to coronary heart disease (CHD) risk. Methods and Results PRIM Score and the Framingham Point Score were estimated for 11,444 participants from the Copenhagen City Heart Study. Gender-specific cumulative incidences and 10 year absolute CHD risks were estimated for subsets defined by age, total cholesterol, high-density lipoprotein (HDL) cholesterol, blood pressure, diabetes and smoking categories. PRIM defined seven mutually exclusive subsets in women and men, with cumulative incidences of CHD from 0.01 to 0.22 in women, and from 0.03 to 0.26 in men. PRIM versus Framingham Point Score found 11% versus 4% of all women, and 31% versus 35% of all men to have 10 year CHD risks >20%. Among women ≥65 years with hypertension and/or with diabetes, 10 year CHD risk >20% was found for 100% with PRIM scoring but for only 18% with the Framingham Point Score. Conclusion Compared to the PRIM Score, common clinical practice with the Framingham Point Score underestimates CHD risk in women, especially in women ≥65 years with hypertension and/or with diabetes. PMID:20728887
Effectiveness of repeated examination to diagnose enterobiasis in nursery school groups.
Remm, Mare; Remm, Kalle
2009-09-01
The aim of this study was to estimate the benefit from repeated examinations in the diagnosis of enterobiasis in nursery school groups, and to test the effectiveness of individual-based risk predictions using different methods. A total of 604 children were examined using double, and 96 using triple, anal swab examinations. The questionnaires for parents, structured observations, and interviews with supervisors were used to identify factors of possible infection risk. In order to model the risk of enterobiasis at individual level, a similarity-based machine learning and prediction software Constud was compared with data mining methods in the Statistica 8 Data Miner software package. Prevalence according to a single examination was 22.5%; the increase as a result of double examinations was 8.2%. Single swabs resulted in an estimated prevalence of 20.1% among children examined 3 times; double swabs increased this by 10.1%, and triple swabs by 7.3%. Random forest classification, boosting classification trees, and Constud correctly predicted about 2/3 of the results of the second examination. Constud estimated a mean prevalence of 31.5% in groups. Constud was able to yield the highest overall fit of individual-based predictions while boosting classification tree and random forest models were more effective in recognizing Enterobius positive persons. As a rule, the actual prevalence of enterobiasis is higher than indicated by a single examination. We suggest using either the values of the mean increase in prevalence after double examinations compared to single examinations or group estimations deduced from individual-level modelled risk predictions.
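The gain from repeated examinations can be illustrated with a simple independence model: if each swab detects an infected child with a fixed per-test sensitivity, the detected prevalence after k examinations is the true prevalence times 1 - (1 - s)^k. This model and the per-swab sensitivity below are assumptions for illustration, calibrated so a single exam finds roughly 20% when the true prevalence is near the 31.5% group-level estimate; they are not the study's fitted values, and real repeat swabs are unlikely to be fully independent.

```python
def apparent_prevalence(true_prevalence, sensitivity, n_exams):
    """Prevalence detected after n examinations, assuming each anal swab
    independently detects an infected child with fixed sensitivity."""
    p_detect = 1.0 - (1.0 - sensitivity) ** n_exams
    return true_prevalence * p_detect

# Assumed per-swab sensitivity, chosen so one exam detects ~20% when the
# true prevalence is ~31.5% (illustrative calibration only)
s = 0.2 / 0.315
for k in (1, 2, 3):
    print(f"{k} exam(s): detected prevalence {apparent_prevalence(0.315, s, k):.3f}")
```

The diminishing increments from the second and third examinations mirror the pattern reported in the abstract (+10.1% then +7.3%), which is why a single examination systematically understates the actual prevalence.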
Cox, Louis Anthony Tony
2017-08-01
Concentration-response (C-R) functions relating concentrations of pollutants in ambient air to mortality risks or other adverse health effects provide the basis for many public health risk assessments, benefits estimates for clean air regulations, and recommendations for revisions to existing air quality standards. The assumption that C-R functions relating levels of exposure and levels of response estimated from historical data usefully predict how future changes in concentrations would change risks has seldom been carefully tested. This paper critically reviews literature on C-R functions for fine particulate matter (PM2.5) and mortality risks. We find that most of them describe historical associations rather than valid causal models for predicting effects of interventions that change concentrations. The few papers that explicitly attempt to model causality rely on unverified modeling assumptions, casting doubt on their predictions about effects of interventions. A large literature on modern causal inference algorithms for observational data has been little used in C-R modeling. Applying these methods to publicly available data from Boston and the South Coast Air Quality Management District around Los Angeles shows that C-R functions estimated for one do not hold for the other. Changes in month-specific PM2.5 concentrations from one year to the next do not help to predict corresponding changes in average elderly mortality rates in either location. Thus, the assumption that estimated C-R relations predict effects of pollution-reducing interventions may not be true. Better causal modeling methods are needed to better predict how reducing air pollution would affect public health.
Johnston, Lisa G; McLaughlin, Katherine R; Rhilani, Houssine El; Latifi, Amina; Toufik, Abdalla; Bennani, Aziza; Alami, Kamal; Elomari, Boutaina; Handcock, Mark S
2015-01-01
Background Respondent-driven sampling is used worldwide to estimate the population prevalence of characteristics such as HIV/AIDS and associated risk factors in hard-to-reach populations. Estimating the total size of these populations is of great interest to national and international organizations; however, reliable measures of population size often do not exist. Methods Successive Sampling-Population Size Estimation (SS-PSE) along with network size imputation allows population size estimates to be made without relying on separate studies or additional data (as in network scale-up, multiplier and capture-recapture methods), which may be biased. Results Ten population size estimates were calculated for people who inject drugs, female sex workers, men who have sex with men, and migrants from sub-Saharan Africa in six different cities in Morocco. SS-PSE estimates fell within or very close to the likely values provided by experts and the estimates from previous studies using other methods. Conclusions SS-PSE is an effective method for estimating the size of hard-to-reach populations that leverages important information within respondent-driven sampling studies. The addition of a network size imputation method helps to smooth network sizes, allowing for more accurate results. However, caution should be used particularly when there is reason to believe that clustered subgroups may exist within the population of interest or when the sample size is small in relation to the population. PMID:26258908
Heidari, Zahra; Feizi, Awat; Azadbakht, Leila; Sarrafzadegan, Nizal
2015-01-01
Minerals are required for the body's normal function. The current study assessed the intake distribution of minerals and estimated the prevalence of inadequate and excessive intake among a representative sample of healthy middle-aged and elderly Iranian people. In this cross-sectional study, the second follow-up to the Isfahan Cohort Study (ICS), 1922 generally healthy people aged 40 and older were investigated. Dietary intakes were collected using 24-hour recalls and two or more consecutive food records. The distribution of mineral intake was estimated using the traditional (averaging dietary intake days) and National Cancer Institute (NCI) methods, and the results obtained from the two methods were compared. The prevalence of inadequate or excessive mineral intake was estimated using the estimated average requirement (EAR) cut-point method, the probability approach and the tolerable upper intake levels (UL). There were remarkable differences between values obtained using the traditional and NCI methods, particularly in the lower and upper percentiles of the estimated intake distributions. A high prevalence of inadequate intake of magnesium (50 - 100 %), calcium (21 - 93 %) and zinc (30 - 55 % for males > 50 years) was observed. Significant gender differences were found regarding inadequate intakes of calcium (21 - 76 % for males vs. 45 - 93 % for females), magnesium (92 % vs. 100 %), iron (0 vs. 15 % for the 40 - 50 years age group) and zinc (29 - 55 % vs. 0 %) (all p < 0.05). Severely imbalanced intakes of magnesium, calcium and zinc were observed among the middle-aged and elderly Iranian population. Nutritional interventions and population-based education to promote healthy diets among the studied population at risk are needed.
The Role of Psychological and Physiological Factors in Decision Making under Risk and in a Dilemma
Fooken, Jonas; Schaffner, Markus
2016-01-01
Different methods of eliciting the risk attitudes of individuals often provide differing results despite a common underlying theory. One reason for such inconsistencies may be the differing influence of underlying factors across risk-taking decisions. To evaluate this conjecture, a better understanding of the underlying factors across methods and decision contexts is desirable. In this paper we study the differences in the results of two risk elicitation methods by linking estimates of risk attitudes to gender, age, and personality traits, which have previously been shown to be related to risk attitudes. We also investigate the role of these factors during decision-making in a dilemma situation. For both decision contexts we additionally examine the decision-maker's physiological state during the decision, measured by heart rate variability (HRV), which we use as an indicator of emotional involvement. We found that the two elicitation methods provide different individual risk attitude measures, which is partly reflected in a differing gender effect between the methods. Personality traits explain relatively little of the variation in risk attitudes or of the difference between methods. We also found that risk taking and the physiological state are related for one of the methods, suggesting that more emotionally involved individuals are more risk averse in the experiment. Finally, we found evidence that personality traits are connected to whether individuals made a decision in the dilemma situation, but risk attitudes and the physiological state were not indicative of the ability to decide in this decision context. PMID:26834591
Twins less frequent than expected among male births in risk averse populations.
Karasek, Deborah; Goodman, Julia; Gemmill, Alison; Falconi, April; Hartig, Terry; Magganas, Aristotle; Catalano, Ralph
2015-06-01
Male twin gestations exhibit higher incidence of fetal morbidity and mortality than singleton gestations. From an evolutionary perspective, the relatively high rates of infant and child mortality among male twins born into threatening environments reduce the fitness of these gestations, making them more vulnerable to fetal loss. Women do not consciously choose to spontaneously abort gestations, although the outcome may result from estimates, made without awareness, of the risks of continuing a pregnancy. Here, we examine whether the non-conscious decisional biology of gestation can be linked to conscious risk aversion. We test this speculation by measuring the association between household surveys in Sweden that gauge financial risk aversion in the population and the frequency of twins among live male births. We used time-series regression methods to estimate the suspected associations and Box-Jenkins modeling to ensure that autocorrelation did not confound the estimation or reduce its efficiency. We found, consistent with theory, that financial risk aversion in the population correlates inversely with the odds of a twin among Swedish males born two months later. The odds of a twin among males fell by approximately 3.5% two months after unexpectedly great risk aversion in the population. This work implies that shocks affecting population risk aversion carry implications for fetal loss in vulnerable twin pregnancies.
Cancer risk from incidental ingestion exposures to PAHs associated with coal-tar-sealed pavement
Williams, E. Spencer; Mahler, Barbara J.; Van Metre, Peter C.
2012-01-01
Recent (2009-10) studies documented significantly higher concentrations of polycyclic aromatic hydrocarbons (PAHs) in settled house dust in living spaces and soil adjacent to parking lots sealed with coal-tar-based products. To date, no studies have examined the potential human health effects of PAHs from these products in dust and soil. Here we present the results of an analysis of potential cancer risk associated with incidental ingestion exposures to PAHs in settings near coal-tar-sealed pavement. Exposures to benzo[a]pyrene equivalents were characterized across five scenarios. The central tendency estimate of excess cancer risk resulting from lifetime exposures to soil and dust from nondietary ingestion in these settings exceeded 1 × 10⁻⁴, as determined using deterministic and probabilistic methods. Soil was the primary driver of risk, but according to probabilistic calculations, reasonable maximum exposure to affected house dust in the first 6 years of life was sufficient to generate an estimated excess lifetime cancer risk of 6 × 10⁻⁵. Our results indicate that the presence of coal-tar-based pavement sealants is associated with significant increases in estimated excess lifetime cancer risk for nearby residents. Much of this calculated excess risk arises from exposures to PAHs in early childhood (i.e., 0–6 years of age).
Caccamo, Alexandra; Kachur, Rachel; Williams, Samantha P.
2018-01-01
Background Homelessness affects an estimated 1.6 million US youth annually. Compared with housed youth, homeless youth are more likely to engage in high-risk behaviors, including inconsistent condom use, multiple sex partners, survival sex, and alcohol/drug use, putting them at increased sexually transmitted disease (STD) risk. However, there is no national estimate of STD prevalence among this population. Methods We identified 10 peer-reviewed articles (9 unique studies) reporting STD prevalence among homeless US youth (2000–2015). Descriptive and qualitative analyses identified STD prevalence ranges and risk factors among youth. Results Eight studies reported specific STD prevalence estimates, mainly chlamydia, gonorrhea, and syphilis. Overall STD prevalence among homeless youth ranged from 6% to 32%. STD rates ranged from 16.7% to 46% for girls and from 9% to 13.1% for boys. Most studies were conducted in the Western United States, with no studies from the Southeast or Northeast. Youths who experienced longer periods of homelessness were more likely to engage in high-risk sexual behaviors. Girls had lower rates of condom use and higher rates of STDs; boys were more likely to engage in anal and anonymous sex. Additionally, peer social networks contributed protective effects on individual sexual risk behavior. Conclusions Sexually transmitted disease prevalence estimates among homeless youth varied greatly by study. Sexually transmitted disease risk behaviors are associated with unmet survival needs, length of homelessness, and influence of social networks. To promote sexual health and reduce STD rates, we need better estimates of STD prevalence, more geographic diversity of studies, and interventions addressing the behavioral associations identified in our review. PMID:28703725
Cheung, Li C; Pan, Qing; Hyun, Noorie; Schiffman, Mark; Fetterman, Barbara; Castle, Philip E; Lorey, Thomas; Katki, Hormuzd A
2017-09-30
For cost-effectiveness and efficiency, many large-scale general-purpose cohort studies are being assembled within large health-care providers who use electronic health records. Two key features of such data are that incident disease is interval-censored between irregular visits and there can be pre-existing (prevalent) disease. Because prevalent disease is not always immediately diagnosed, some disease diagnosed at later visits is actually undiagnosed prevalent disease. We consider prevalent disease as a point mass at time zero for clinical applications where there is no interest in time of prevalent disease onset. We demonstrate that the naive Kaplan-Meier cumulative risk estimator underestimates risks at early time points and overestimates later risks. We propose a general family of mixture models for undiagnosed prevalent disease and interval-censored incident disease that we call prevalence-incidence models. Parameters for parametric prevalence-incidence models, such as the logistic regression and Weibull survival (logistic-Weibull) model, are estimated by direct likelihood maximization or by EM algorithm. Non-parametric methods are proposed to calculate cumulative risks for cases without covariates. We compare naive Kaplan-Meier, logistic-Weibull, and non-parametric estimates of cumulative risk in the cervical cancer screening program at Kaiser Permanente Northern California. Kaplan-Meier provided poor estimates, while the logistic-Weibull model was a close fit to the non-parametric estimates. Our findings support our use of logistic-Weibull models to develop the risk estimates that underlie current US risk-based cervical cancer screening guidelines. Published 2017. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
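The prevalence-incidence mixture described above places a point mass of undiagnosed prevalent disease at time zero plus a survival distribution for incident disease. A minimal sketch with a Weibull incidence component; the parameters are illustrative, not the values fitted to the Kaiser data:

```python
import math

def weibull_cdf(t, shape, scale):
    """Weibull cumulative distribution function for incident disease."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def prevalence_incidence_risk(t, prev, shape, scale):
    """Cumulative disease risk by time t: a point mass `prev` of
    undiagnosed prevalent disease at time zero, plus Weibull-distributed
    incident disease among the remaining (1 - prev) fraction."""
    return prev + (1.0 - prev) * weibull_cdf(t, shape, scale)

# Illustrative parameters (hypothetical): unlike a naive Kaplan-Meier
# curve, the risk starts at the prevalence rather than at zero.
risk_0 = prevalence_incidence_risk(0.0, prev=0.02, shape=1.5, scale=10.0)
risk_5 = prevalence_incidence_risk(5.0, prev=0.02, shape=1.5, scale=10.0)
```

This captures why naive Kaplan-Meier misbehaves: it spreads the time-zero point mass across the follow-up period, underestimating early risk and overestimating later risk.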
NASA Astrophysics Data System (ADS)
Parekh, Vishwa S.; Jacobs, Jeremy R.; Jacobs, Michael A.
2014-03-01
The evaluation and treatment of acute cerebral ischemia requires a technique that can determine the total area of tissue at risk for infarction using diagnostic magnetic resonance imaging (MRI) sequences. Typical MRI data sets consist of T1- and T2-weighted imaging (T1WI, T2WI) along with advanced MRI parameters of diffusion-weighted imaging (DWI) and perfusion-weighted imaging (PWI) methods. Each of these parameters has distinct radiological-pathological meaning. For example, DWI interrogates the movement of water in the tissue and PWI gives an estimate of the blood flow; both are critical measures during the evolution of stroke. In order to integrate these data and give an estimate of the tissue at risk or damaged, we have developed advanced machine learning methods based on unsupervised non-linear dimensionality reduction (NLDR) techniques. NLDR methods are a class of algorithms that use mathematically defined manifolds for statistical sampling of multidimensional classes to generate a discrimination rule of guaranteed statistical accuracy, and they can generate a two- or three-dimensional map that represents the prominent structures of the data and provides an embedded image of meaningful low-dimensional structures hidden in the high-dimensional observations. In this manuscript, we develop NLDR methods on high dimensional MRI data sets of preclinical animals and clinical patients with stroke. On analyzing the performance of these methods, we observed a high degree of similarity between the multiparametric embedded images from the NLDR methods and the ADC and perfusion maps. We also observed that the embedded scattergram of abnormal (infarcted or at-risk) tissue can be visualized, providing a mechanism for automatic methods to delineate potential stroke volumes and early tissue at risk.
NASA Astrophysics Data System (ADS)
Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad
2016-09-01
Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying Bayesian theory to identify plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use a Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event, based on its evidence, in an epistemic estimation format by building on two elements: the Bayesian network probabilities of the hazard-promoting factors and Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
NASA Astrophysics Data System (ADS)
Miftahurrohmah, Brina; Iriawan, Nur; Fithriasari, Kartika
2017-06-01
Stocks are financial instruments traded in the capital market that carry a high level of risk. Their risk is reflected in the uncertainty of the returns that investors must accept in the future. The higher the risk to be faced, the higher the return that may be gained. Measurements therefore need to be made of this risk. Value at Risk (VaR), the most popular risk measurement method, is frequently unreliable when the pattern of returns is not unimodal Normal. The calculation of risk using the VaR method with the Normal Mixture Autoregressive (MNAR) approach has previously been considered. This paper proposes a VaR method coupled with the Mixture Laplace Autoregressive (MLAR) model, implemented to analyse the returns of the three largest-capitalization Islamic stocks in JII, namely PT. Astra International Tbk (ASII), PT. Telekomunikasi Indonesia Tbk (TLMK), and PT. Unilever Indonesia Tbk (UNVR). Parameter estimation is performed by employing a Bayesian Markov Chain Monte Carlo (MCMC) approach.
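Monte Carlo VaR under a Laplace mixture can be sketched as follows. The component weights, locations, and scales below are hypothetical, and the autoregressive structure of the MLAR model is omitted; this shows only the mixture-quantile idea:

```python
import math
import random

def sample_laplace(loc, scale, rng):
    """Inverse-CDF draw from a Laplace(loc, scale) distribution."""
    u = rng.random() - 0.5          # u in [-0.5, 0.5)
    if abs(u) >= 0.5:               # guard against log(0) at the boundary
        u = 0.0
    return loc - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def mixture_var(weights, locs, scales, alpha=0.05, n=100_000, seed=42):
    """Monte Carlo Value at Risk at level alpha for a finite mixture of
    Laplace return distributions: the empirical alpha-quantile of
    simulated returns (a loss threshold exceeded with probability alpha)."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        # pick a mixture component, then draw a return from it
        k = rng.choices(range(len(weights)), weights=weights)[0]
        draws.append(sample_laplace(locs[k], scales[k], rng))
    draws.sort()
    return draws[int(alpha * n)]

# hypothetical two-component mixture of daily returns
var_5 = mixture_var([0.6, 0.4], [0.001, -0.002], [0.01, 0.03])
```

The heavier-tailed Laplace components let the mixture place more mass on extreme losses than a single Normal would, which is the motivation for moving beyond uni-modal Normal VaR.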
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stick, Line B., E-mail: line.bjerregaard.stick@regionh.dk; Niels Bohr Institute, Faculty of Science, University of Copenhagen, Copenhagen; Yu, Jen
Purpose: The study aims to perform joint estimation of the risk of recurrence caused by inadequate radiation dose coverage of lymph node targets and the risk of cardiac toxicity caused by radiation exposure of the heart. Delivered photon plans are compared with realistic proton plans, thereby providing evidence-based estimates of the heterogeneity of treatment effects in consecutive cases for the 2 radiation treatment modalities. Methods and Materials: Forty-one patients referred for postlumpectomy comprehensive nodal photon irradiation for left-sided breast cancer were included. Comparative proton plans were optimized by a spot scanning technique with single-field optimization from 2 en face beams. Cardiotoxicity risk was estimated with the model of Darby et al, and the risk of recurrence following a compromise of lymph node coverage was estimated by a linear dose-response model fitted to the recurrence data from the recently published EORTC (European Organisation for Research and Treatment of Cancer) 22922/10925 and NCIC-CTG (National Cancer Institute of Canada Clinical Trials Group) MA.20 randomized controlled trials. Results: Excess absolute risk of cardiac morbidity with photon therapy was small at an attained age of 80 years, with median values of 1.0% (range, 0.2%-2.9%) and 0.5% (range, 0.03%-1.0%) with and without cardiac risk factors, respectively, and even lower with proton therapy (0.13% [range, 0.02%-0.5%] and 0.06% [range, 0.004%-0.3%], respectively). The median estimated excess absolute risk of breast cancer recurrence after 10 years was 0.10% (range, 0.0%-0.9%) with photons and 0.02% (range, 0.0%-0.07%) with protons. The association between patient age and benefit from proton therapy was weak, almost non-existent (Spearman rank correlations of −0.15 and −0.30 with and without cardiac risk factors, respectively).
Conclusions: Modern photon therapy yields limited risk of cardiac toxicity in most patients, but proton therapy can reduce the predicted risk of cardiac toxicity by up to 2.9% and the risk of breast cancer recurrence by 0.9% in individual patients. The predicted benefit correlates weakly with age. Combined assessment of the risks from cardiac exposure and inadequate target coverage is desirable for rational comparison of competing photon and proton therapy plans.
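The cardiotoxicity model of Darby et al referenced above treats risk as increasing linearly with mean heart dose (the original report found roughly a 7.4% relative increase in major coronary events per Gy). A hedged sketch of the resulting excess-absolute-risk calculation, using hypothetical baseline risk and dose values:

```python
def excess_cardiac_risk(baseline_risk, mean_heart_dose_gy, rr_per_gy=0.074):
    """Excess absolute risk of a major coronary event under a linear
    relative-risk model: excess = baseline * (RR - 1), with
    RR = 1 + rr_per_gy * mean heart dose. The 7.4%/Gy slope is from
    Darby et al; `baseline_risk` is the absolute risk without irradiation."""
    return baseline_risk * rr_per_gy * mean_heart_dose_gy

# e.g. a 5% baseline cardiac risk and a 3 Gy mean heart dose (illustrative)
excess = excess_cardiac_risk(0.05, 3.0)
```

Because the excess scales with the baseline, patients with cardiac risk factors (higher baseline) see larger absolute gains from dose reduction, matching the pattern in the Results above.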
The link between judgments of comparative risk and own risk: further evidence.
Gold, Ron S
2007-03-01
Individuals typically believe that they are less likely than the average person to experience negative events, a phenomenon termed "unrealistic optimism". The direct method of assessing unrealistic optimism employs a question of the form, "Compared with the average person, what is the chance that X will occur to you?". However, it has been proposed that responses to such a question (direct-estimates) are based essentially just on estimates that X will occur to the self (self-estimates). If this is so, any factors that affect one of these estimates should also affect the other. This prediction was tested in two experiments. In each, direct- and self-estimates for an unfamiliar health threat - homocysteine-related heart problems - were recorded. It was found that both types of estimate were affected in the same way by varying the stated probability of having unsafe levels of homocysteine (Study 1, N=149) and varying the stated probability that unsafe levels of homocysteine will lead to heart problems (Study 2, N=111). The results are consistent with the proposal that direct-estimates are constructed just from self-estimates.
Beach, Jeremy; Burstyn, Igor; Cherry, Nicola
2012-07-01
We previously described a method to identify the incidence of new-onset adult asthma (NOAA) in Alberta by industry and occupation, utilizing Workers' Compensation Board (WCB) and physician billing data. The aim of this study was to extend this method to data from British Columbia (BC) so as to compare the two provinces and to incorporate Bayesian methodology into estimates of risk. WCB claims for any reason 1995-2004 were linked to physician billing data. NOAA was defined as a billing for asthma (ICD-9 493) in the 12 months before a WCB claim without asthma in the previous 3 years. Incidence was calculated by occupation and industry. In a matched case-referent analysis, associations with exposures were examined using an asthma-specific job exposure matrix (JEM). Posterior distributions from the Alberta analysis and estimated misclassification parameters were used as priors in the Bayesian analysis of the BC data. Among 1 118 239 eligible WCB claims the incidence of NOAA was 1.4%. Sixteen occupations and 44 industries had a significantly increased risk; six industries had a decreased risk. The JEM identified wood dust [odds ratio (OR) 1.55, 95% confidence interval (CI) 1.08-2.24] and animal antigens (OR 1.66, 95% CI 1.17-2.36) as related to an increased risk of NOAA. Exposure to isocyanates was associated with decreased risk (OR 0.57, 95% CI 0.39-0.85). Bayesian analyses taking account of exposure misclassification and informative priors resulted in posterior distributions of ORs with lower boundary of 95% credible intervals >1.00 for almost all exposures. The distribution of NOAA in BC appeared somewhat similar to that in Alberta, except for isocyanates. Bayesian analyses allowed incorporation of prior evidence into risk estimates, permitting reconsideration of the apparently protective effect of isocyanate exposure.
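The study's Bayesian analysis incorporates informative priors (the Alberta posteriors) and misclassification adjustment via MCMC. As a much simpler stand-in for the general idea, a prior odds ratio and a new study's estimate can be combined by precision weighting on the log scale (a normal-normal conjugate update; this is not the authors' full model):

```python
import math

def combine_log_or(prior_or, prior_ci, data_or, data_ci):
    """Precision-weighted combination of two odds-ratio estimates on the
    log scale (normal-normal conjugate update). Each CI is a (lower,
    upper) 95% interval; returns the posterior OR and its 95% interval."""
    def to_normal(or_, ci):
        mu = math.log(or_)
        se = (math.log(ci[1]) - math.log(ci[0])) / (2.0 * 1.96)
        return mu, se
    m0, s0 = to_normal(prior_or, prior_ci)
    m1, s1 = to_normal(data_or, data_ci)
    w0, w1 = 1.0 / s0**2, 1.0 / s1**2          # inverse-variance weights
    mu = (w0 * m0 + w1 * m1) / (w0 + w1)
    se = (w0 + w1) ** -0.5
    return math.exp(mu), (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se))

# hypothetical prior (e.g. an earlier-province posterior) and new estimate
post_or, post_ci = combine_log_or(1.3, (0.9, 1.9), 1.55, (1.08, 2.24))
```

The posterior shrinks toward the prior and has a narrower interval than either input, which is how informative priors can push a credible interval's lower bound above 1.00 as described above.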
Davies, John R; Chang, Yu-mei; Bishop, D Timothy; Armstrong, Bruce K; Bataille, Veronique; Bergman, Wilma; Berwick, Marianne; Bracci, Paige M; Elwood, J Mark; Ernstoff, Marc S; Green, Adele; Gruis, Nelleke A; Holly, Elizabeth A; Ingvar, Christian; Kanetsky, Peter A; Karagas, Margaret R; Lee, Tim K; Le Marchand, Loïc; Mackie, Rona M; Olsson, Håkan; Østerlind, Anne; Rebbeck, Timothy R; Reich, Kristian; Sasieni, Peter; Siskind, Victor; Swerdlow, Anthony J; Titus, Linda; Zens, Michael S; Ziegler, Andreas; Gallagher, Richard P.; Barrett, Jennifer H; Newton-Bishop, Julia
2015-01-01
Background We report the development of a cutaneous melanoma risk algorithm based upon 7 factors; hair colour, skin type, family history, freckling, nevus count, number of large nevi and history of sunburn, intended to form the basis of a self-assessment webtool for the general public. Methods Predicted odds of melanoma were estimated by analysing a pooled dataset from 16 case-control studies using logistic random coefficients models. Risk categories were defined based on the distribution of the predicted odds in the controls from these studies. Imputation was used to estimate missing data in the pooled datasets. The 30th, 60th and 90th centiles were used to distribute individuals into four risk groups for their age, sex and geographic location. Cross-validation was used to test the robustness of the thresholds for each group by leaving out each study one by one. Performance of the model was assessed in an independent UK case-control study dataset. Results Cross-validation confirmed the robustness of the threshold estimates. Cases and controls were well discriminated in the independent dataset (area under the curve 0.75, 95% CI 0.73-0.78). 29% of cases were in the highest risk group compared with 7% of controls, and 43% of controls were in the lowest risk group compared with 13% of cases. Conclusion We have identified a composite score representing an estimate of relative risk and successfully validated this score in an independent dataset. Impact This score may be a useful tool to inform members of the public about their melanoma risk. PMID:25713022
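The grouping step described above, assigning individuals to four risk groups by the 30th, 60th and 90th centiles of the predicted odds among controls, can be sketched as follows (the scoring model itself and the control distribution are hypothetical):

```python
def risk_group(score, control_scores):
    """Assign a risk group (1 = lowest, 4 = highest) by where `score`
    falls relative to the 30th, 60th and 90th centiles of the predicted
    odds among controls, mirroring the grouping rule described above."""
    s = sorted(control_scores)
    cuts = [s[int(q * len(s))] for q in (0.30, 0.60, 0.90)]
    return 1 + sum(score > c for c in cuts)

# hypothetical control scores: a simple 0..99 grid for illustration
groups = [risk_group(x, range(100)) for x in (10, 45, 65, 95)]
```

Defining thresholds from the control distribution (for a given age, sex and location) is what makes the groups interpretable as relative risk within a stratum rather than absolute risk.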
Robust estimation of simulated urinary volume from camera images under bathroom illumination.
Honda, Chizuru; Bhuiyan, Md Shoaib; Kawanaka, Haruki; Watanabe, Eiichi; Oguri, Koji
2016-08-01
General uroflowmetry involves a risk of nosocomial infection or demands time and effort for the recording. Medical institutions therefore need a simple and hygienic way to measure voided volume. A multiple cylindrical model that can estimate the fluid flow rate from images photographed with a camera was proposed in an earlier study. This study implemented flow rate estimation using a general-purpose camera system (Raspberry Pi Camera Module) and the multiple cylindrical model. However, large amounts of noise arise in extracting the liquid region because of illumination changes when measuring in the bathroom, so the estimation error becomes very large. In other words, the earlier study's camera specifications regarding shutter type and frame rate were too strict. In this study, we relax these specifications to achieve flow rate estimation with a general-purpose camera. To determine an appropriate approximate curve, we propose a binarization method using background subtraction at each scanning row and a curve approximation method using RANSAC. Finally, by evaluating the estimation accuracy of our experiment and comparing it with the earlier study's results, we show the effectiveness of our proposed method for flow rate estimation.
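A minimal RANSAC sketch in the spirit of the robust curve approximation described above. For brevity it fits a straight line from two-point samples, whereas the authors fit a curve to the extracted liquid-region contour; the point set is synthetic:

```python
import random

def ransac_line(points, n_iter=200, tol=1.0, seed=1):
    """Minimal RANSAC: repeatedly fit a line through 2 randomly sampled
    points and keep the candidate with the most inliers, so that gross
    outliers (e.g. segmentation noise) do not corrupt the fit."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                      # vertical pair; skip this sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = sum(1 for x, y in points if abs(y - (a * x + b)) < tol)
        if inliers > best_inliers:
            best, best_inliers = (a, b), inliers
    return best, best_inliers

# points on y = 2x + 1 plus two gross outliers, as from a noisy binarization
pts = [(float(x), 2.0 * x + 1.0) for x in range(10)] + [(3.0, 40.0), (7.0, -20.0)]
(a, b), n_in = ransac_line(pts)
```

An ordinary least-squares fit would be dragged toward the two outliers; the consensus criterion ignores them entirely.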
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diez, Patricia; Vogelius, Ivan S.; Department of Human Oncology, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792
2010-07-15
Purpose: A new method is presented for synthesizing dose-response data for biochemical control of prostate cancer according to study design (randomized vs. nonrandomized) and risk group (low vs. intermediate-high). Methods and Materials: Nine published prostate cancer dose escalation studies including 6,539 patients were identified in the MEDLINE and CINAHL databases and reviewed to assess the relationship between dose and biochemical control. A novel method of analysis is presented in which the normalized dose-response gradient, γ50, is estimated for each study and subsequently synthesized across studies. Our method does not assume that biochemical control rates are directly comparable between studies. Results: Nonrandomized studies produced a statistically significantly higher γ50 than randomized studies for intermediate- to high-risk patients (γ50 = 1.63 vs. γ50 = 0.93, p = 0.03) and a borderline significantly higher γ50 (γ50 = 1.78 vs. γ50 = 0.56, p = 0.08) for low-risk patients. No statistically significant difference in γ50 was found between low- and intermediate- to high-risk patients (p = 0.31). From the pooled data of low and intermediate- to high-risk patients in randomized trials, we obtain the overall best estimate of γ50 = 0.84 with 95% confidence interval 0.54-1.15. Conclusions: Nonrandomized studies overestimate the steepness of the dose-response curve as compared with randomized trials. This is probably the result of stage migration, improved treatment techniques, and a shorter follow-up in higher dose patients that were typically entered more recently. This overestimation leads to inflated expectations regarding the benefit from dose escalation and could lead to underpowered clinical trials. There is no evidence of a steeper dose response for intermediate- to high-risk compared with low-risk patients.
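One common parameterization in which γ50 appears directly is a logistic dose-response curve anchored at the dose giving 50% control; the paper's synthesis method itself is not reproduced here, and the 70 Gy anchor dose below is hypothetical:

```python
import math

def tcp_logistic(dose, d50, gamma50):
    """Logistic dose-response curve parameterized by the dose giving 50%
    control (d50) and the normalized dose-response gradient gamma50:
    at dose = d50 the response is 0.5 and the slope, normalized by d50,
    equals gamma50."""
    return 1.0 / (1.0 + math.exp(4.0 * gamma50 * (1.0 - dose / d50)))

# gamma50 = 0.84 is the pooled randomized estimate from the abstract;
# d50 = 70 Gy is a hypothetical anchor dose for illustration
control_78 = tcp_logistic(78.0, 70.0, 0.84)
```

The practical point of the abstract follows directly: with γ50 = 0.84 rather than 1.63, the control gained per Gy of escalation near the 50% point is roughly halved, so trials powered on the inflated nonrandomized slope would be undersized.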
Evaluation of multiple tracer methods to estimate low groundwater flow velocities.
Reimus, Paul W; Arnold, Bill W
2017-04-01
Four different tracer methods were used to estimate groundwater flow velocity at a multiple-well site in the saturated alluvium south of Yucca Mountain, Nevada: (1) two single-well tracer tests with different rest or "shut-in" periods, (2) a cross-hole tracer test with an extended flow interruption, (3) a comparison of two tracer decay curves in an injection borehole with and without pumping of a downgradient well, and (4) a natural-gradient tracer test. Such tracer methods are potentially very useful for estimating groundwater velocities when hydraulic gradients are flat (and hence uncertain) and also when water level and hydraulic conductivity data are sparse, both of which were the case at this test location. The purpose of the study was to evaluate the first three methods for their ability to provide reasonable estimates of relatively low groundwater flow velocities in such low-hydraulic-gradient environments. The natural-gradient method is generally considered to be the most robust and direct method, so it was used to provide a "ground truth" velocity estimate. However, this method usually requires several wells, so it is often not practical in systems with large depths to groundwater and correspondingly high well installation costs. The fact that a successful natural gradient test was conducted at the test location offered a unique opportunity to compare the flow velocity estimates obtained by the more easily deployed and lower risk methods with the ground-truth natural-gradient method. The groundwater flow velocity estimates from the four methods agreed very well with each other, suggesting that the first three methods all provided reasonably good estimates of groundwater flow velocity at the site. The advantages and disadvantages of the different methods, as well as some of the uncertainties associated with them are discussed. Published by Elsevier B.V.
Analyzing semi-competing risks data with missing cause of informative terminal event.
Zhou, Renke; Zhu, Hong; Bondy, Melissa; Ning, Jing
2017-02-28
Cancer studies frequently yield multiple event times that correspond to landmarks in disease progression, including non-terminal events (i.e., cancer recurrence) and an informative terminal event (i.e., cancer-related death). Hence, we often observe semi-competing risks data. Work on such data has focused on scenarios in which the cause of the terminal event is known. However, in some circumstances, the information on cause for patients who experience the terminal event is missing; consequently, we are not able to differentiate an informative terminal event from a non-informative terminal event. In this article, we propose a method to handle missing data regarding the cause of an informative terminal event when analyzing the semi-competing risks data. We first consider the nonparametric estimation of the survival function for the terminal event time given missing cause-of-failure data via the expectation-maximization algorithm. We then develop an estimation method for semi-competing risks data with missing cause of the terminal event, under a pre-specified semiparametric copula model. We conduct simulation studies to investigate the performance of the proposed method. We illustrate our methodology using data from a study of early-stage breast cancer. Copyright © 2016 John Wiley & Sons, Ltd.
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
A review of methods to estimate cause-specific mortality in presence of competing risks
Heisey, Dennis M.; Patterson, Brent R.
2006-01-01
Estimating cause-specific mortality is often of central importance for understanding the dynamics of wildlife populations. Despite such importance, methodology for estimating and analyzing cause-specific mortality has received little attention in wildlife ecology during the past 20 years. The issue of analyzing cause-specific, mutually exclusive events in time is not unique to wildlife. In fact, this general problem has received substantial attention in human biomedical applications within the context of biostatistical survival analysis. Here, we consider cause-specific mortality from a modern biostatistical perspective. This requires carefully defining what we mean by cause-specific mortality and then providing an appropriate hazard-based representation as a competing risks problem. This leads to the general solution of cause-specific mortality as the cumulative incidence function (CIF). We describe the appropriate generalization of the fully nonparametric staggered-entry Kaplan–Meier survival estimator to cause-specific mortality via the nonparametric CIF estimator (NPCIFE), which in many situations offers an attractive alternative to the Heisey–Fuller estimator. An advantage of the NPCIFE is that it lends itself readily to risk factor analysis with standard software for the Cox proportional hazards model. The competing risks–based approach also clarifies issues regarding another intuitive but erroneous "cause-specific mortality" estimator based on the Kaplan–Meier survival estimator and commonly seen in the life sciences literature.
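The NPCIFE described in this abstract is, in essence, the Aalen-Johansen estimator restricted to the cause of interest: at each event time, the overall Kaplan–Meier survival just before that time is multiplied by the cause-specific hazard increment. A minimal numpy sketch on toy data (not the authors' wildlife data):

```python
import numpy as np

def cumulative_incidence(times, events):
    """Nonparametric CIF estimator (Aalen-Johansen) for cause 1.
    events: 0 = censored, 1 = cause of interest, 2 = competing cause."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv = 1.0        # overall KM survival just before the current time
    cif = 0.0
    at_risk = len(times)
    out = []
    for t in np.unique(times):
        mask = times == t
        d1 = np.sum(events[mask] == 1)      # cause-1 deaths at t
        d_all = np.sum(events[mask] > 0)    # deaths from any cause at t
        cif += surv * d1 / at_risk          # S(t-) * cause-1 hazard increment
        surv *= 1.0 - d_all / at_risk       # update overall survival
        at_risk -= np.sum(mask)
        out.append((float(t), cif))
    return out

# toy data: time, event code (0 censored, 1 cause of interest, 2 competing)
t = [2, 3, 3, 5, 7, 8, 10, 12]
e = [1, 2, 0, 1, 2, 1, 0, 1]
for time, F1 in cumulative_incidence(t, e):
    print(time, round(F1, 3))
```

Unlike 1 minus the cause-specific Kaplan–Meier curve (the erroneous estimator mentioned above), this CIF properly discounts by the probability of having survived all competing causes.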
Risk estimation using probability machines.
Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D
2014-03-01
Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from.
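The counterfactual effect-size idea can be sketched with a random forest "probability machine": obtain conditional probabilities from the forest, then compare the average predicted risk with the binary predictor forced to each of its levels. The following is a minimal scikit-learn sketch under an assumed logistic data-generating model, not the authors' implementation:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000
x1 = rng.integers(0, 2, n)           # binary exposure of interest
x2 = rng.normal(size=n)              # continuous covariate
logit = -1.0 + 1.0 * x1 + 0.5 * x2   # assumed logistic model
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([x1, x2]).astype(float)
rf = RandomForestClassifier(n_estimators=200, min_samples_leaf=25,
                            random_state=0).fit(X, y)

# Counterfactual effect size: average predicted risk with x1 forced
# to 1 versus forced to 0 (a nonparametric risk difference).
X1 = X.copy(); X1[:, 0] = 1.0
X0 = X.copy(); X0[:, 0] = 0.0
rd = rf.predict_proba(X1)[:, 1].mean() - rf.predict_proba(X0)[:, 1].mean()
print(f"estimated risk difference: {rd:.3f}")
```

Because the machine only needs the predictors, not a model form, the same two-line counterfactual contrast works unchanged if the true data-generating process is not logistic.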
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.
2007-01-01
Space radiation presents major challenges to astronauts on the International Space Station and for future missions to the Earth's moon or Mars. Methods used to project risks on Earth need to be modified because of the large uncertainties in projecting cancer risks from space radiation, and these uncertainties in turn affect safety factors. We describe NASA's unique approach to radiation safety that applies uncertainty-based criteria within the occupational health program for astronauts: the two terrestrial criteria of a point estimate of a maximum acceptable level of risk and application of the principle of As Low As Reasonably Achievable (ALARA) are supplemented by a third requirement that protects against risk projection uncertainties using the upper 95% confidence level (CL) in the radiation cancer projection model. NASA's acceptable level of risk for the ISS and the new lunar program has been set at a point estimate of a 3-percent risk of exposure-induced death (REID). Tissue-averaged organ dose equivalents are combined with age at exposure and gender-dependent risk coefficients to project the cumulative occupational radiation risks incurred by astronauts. The 95% CL criterion is in practice stronger than ALARA, but not an absolute cut-off as is applied to a point projection of a 3% REID. We describe the most recent astronaut dose limits, present a historical review of astronaut organ dose estimates from the Mercury program through the current ISS program, and give future projections for lunar and Mars missions. NASA's 95% CL criterion is linked to a vibrant ground-based radiobiology program investigating the radiobiology of high-energy protons and heavy ions. The near-term goal of this research is new knowledge leading to the reduction of uncertainties in projection models. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge.
The current model for projecting space radiation cancer risk relies on the three assumptions of linearity, additivity, and scaling along with the use of population averages. We describe uncertainty estimates for this model, and new experimental data that sheds light on the accuracy of the underlying assumptions. These methods make it possible to express risk management objectives in terms of quantitative metrics, i.e., the number of days in space without exceeding a given risk level within well defined confidence limits. The resulting methodology is applied to several human space exploration mission scenarios including lunar station, deep space outpost, and a Mars mission. Factors that dominate risk projection uncertainties and application of this approach to assess candidate mitigation approaches are described.
NASA Technical Reports Server (NTRS)
Peterson, L. E.; Cucinotta, F. A.; Wilson, J. W. (Principal Investigator)
1999-01-01
Estimating uncertainty in lifetime cancer risk for human exposure to space radiation is a unique challenge. Conventional risk assessment with low-linear-energy-transfer (LET)-based risk from Japanese atomic bomb survivor studies may be inappropriate for relativistic protons and nuclei in space due to track structure effects. This paper develops a Monte Carlo mixture model (MCMM) for transferring additive, National Institutes of Health multiplicative, and multiplicative excess cancer incidence risks based on Japanese atomic bomb survivor data to determine excess incidence risk for various US astronaut exposure profiles. The MCMM serves as an anchor point for future risk projection methods involving biophysical models of DNA damage from space radiation. Lifetime incidence risks of radiation-induced cancer for the MCMM based on low-LET Japanese data for nonleukemia (all cancers except leukemia) were 2.77 (90% confidence limit, 0.75-11.34) for males exposed to 1 Sv at age 45 and 2.20 (90% confidence limit, 0.59-10.12) for males exposed at age 55. For females, mixture model risks for nonleukemia exposed separately to 1 Sv at ages of 45 and 55 were 2.98 (90% confidence limit, 0.90-11.70) and 2.44 (90% confidence limit, 0.70-10.30), respectively. Risks for high-LET 200 MeV protons (LET=0.45 keV/micrometer), 1 MeV alpha-particles (LET=100 keV/micrometer), and 600 MeV iron particles (LET=180 keV/micrometer) were scored on a per particle basis by determining the particle fluence required for an average of one particle per cell nucleus of area 100 micrometer(2). Lifetime risk per proton was 2.68x10(-2)% (90% confidence limit, 0.79x10(-3)%-0.514x10(-2)%). For alpha-particles, lifetime risk was 14.2% (90% confidence limit, 2.5%-31.2%). Conversely, lifetime risk per iron particle was 23.7% (90% confidence limit, 4.5%-53.0%). Uncertainty in the DDREF for high-LET particles may be less than that for low-LET radiation because typically there is very little dose-rate dependence.
Probability density functions for high-LET radiation quality and dose-rate may be preferable to conventional risk assessment approaches. Nuclear reactions and track structure effects in tissue may not be properly estimated by existing data using in vitro models for estimating RBEs. The method used here is being extended to estimate uncertainty in spacecraft shielding effectiveness in various space radiation environments.
Dielectric coagulometry: a new approach to estimate venous thrombosis risk.
Hayashi, Yoshihito; Katsumoto, Yoichi; Omori, Shinji; Yasuda, Akio; Asami, Koji; Kaibara, Makoto; Uchimura, Isao
2010-12-01
We present dielectric coagulometry as a new technique to estimate the risk of venous thrombosis by measuring the permittivity change associated with the blood coagulation process. The method was first tested for a simple system of animal erythrocytes suspended in fibrinogen solution, where the coagulation rate was controlled by changing the amount of thrombin added to the suspension. Second, the method was applied to a more realistic system of human whole blood, and the inherent coagulation process was monitored without artificial acceleration by a coagulation initiator. The time dependence of the permittivity at a frequency around 1 MHz showed a distinct peak at a time that corresponds to the clotting time. Our theoretical modeling revealed that the evolution of heterogeneity and the sedimentation in the system cause the peak of the permittivity.
Huang, Biao; Zhao, Yongcun
2014-01-01
Estimating standard-exceeding probabilities of toxic metals in soil is crucial for environmental evaluation. Because soil pH and land use types have strong effects on the bioavailability of trace metals in soil, they were taken into account by some environmental protection agencies in making composite soil environmental quality standards (SEQSs) that contain multiple metal thresholds under different pH and land use conditions. This study proposed a method for estimating the standard-exceeding probability map of soil cadmium using a composite SEQS. The spatial variability and uncertainty of soil pH and site-specific land use type were incorporated through simulated realizations by sequential Gaussian simulation. A case study was conducted using a sample data set from a 150 km2 area in Wuhan City and the composite SEQS for cadmium, recently set by the State Environmental Protection Administration of China. The method may be useful for evaluating the pollution risks of trace metals in soil with composite SEQSs. PMID:24672364
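The core of the approach, propagating joint realizations of soil pH and cadmium concentration through a pH-dependent standard, can be sketched with plain Monte Carlo draws standing in for the sequential-Gaussian-simulation realizations at one grid node. All thresholds and distribution parameters below are illustrative, not the SEPA composite standard or the Wuhan data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical composite standard for Cd (mg/kg) by soil-pH class,
# loosely mimicking standards that tighten at low pH.
def cd_threshold(ph):
    if ph < 6.5:
        return 0.30
    elif ph < 7.5:
        return 0.45
    return 0.60

# Simulated realizations at one grid node (stand-ins for the
# geostatistical simulation draws of pH and log-Cd).
n = 10_000
ph = rng.normal(6.8, 0.5, n)                    # pH realizations
cd = np.exp(rng.normal(np.log(0.35), 0.4, n))   # lognormal Cd realizations

exceed = np.array([c > cd_threshold(p) for c, p in zip(cd, ph)])
print(f"P(exceed composite standard) ~ {exceed.mean():.2f}")
```

Evaluating each realization against the threshold implied by its own simulated pH is what lets the map reflect uncertainty in both the metal concentration and the applicable standard.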
Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.
Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah
2012-01-01
Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression. © 2011 Society for Risk Analysis.
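For readers unfamiliar with the distribution, a small sketch of the COM-Poisson pmf and its dispersion behavior follows; the normalizing constant is truncated numerically, and this is a didactic sketch rather than the authors' MLE implementation:

```python
import math

def com_poisson_pmf(y, lam, nu, max_j=200):
    """COM-Poisson P(Y=y): proportional to lam^y / (y!)^nu.
    nu > 1 gives underdispersion, nu < 1 overdispersion,
    and nu = 1 recovers the ordinary Poisson."""
    logZ = math.log(sum(
        math.exp(j * math.log(lam) - nu * math.lgamma(j + 1))
        for j in range(max_j)))
    return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - logZ)

def moments(lam, nu, max_j=200):
    p = [com_poisson_pmf(y, lam, nu, max_j) for y in range(max_j)]
    mean = sum(y * py for y, py in enumerate(p))
    var = sum((y - mean) ** 2 * py for y, py in enumerate(p))
    return mean, var

for nu in (0.5, 1.0, 2.0):
    m, v = moments(3.0, nu)
    print(f"nu={nu}: mean={m:.2f}, var/mean={v/m:.2f}")
```

The single shape parameter nu is what gives the GLM its flexibility: the same likelihood accommodates under-, equi-, and overdispersed counts, which the abstract identifies as the limitation of traditional count regression models.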
Stark, John D; Vargas, Roger I; Banks, John E
2015-07-01
Historically, point estimates such as the median lethal concentration (LC50) have been instrumental in assessing risks associated with toxicants to rare or economically important species. In recent years, growing awareness of the shortcomings of this approach has led to an increased focus on analyses using population endpoints. However, risk assessment of pesticides still relies heavily on large amounts of LC50 data amassed over decades in the laboratory. Despite the fact that these data are generally well replicated, little or no attention has been given to the sometime high levels of variability associated with the generation of point estimates. This is especially important in agroecosystems where arthropod predator-prey interactions are often disrupted by the use of pesticides. Using laboratory derived data of 4 economically important species (2 fruit fly pest species and 2 braconid parasitoid species) and matrix based population models, the authors demonstrate in the present study a method for bridging traditional point estimate risk assessments with population outcomes. The results illustrate that even closely related species can show strikingly divergent responses to the same exposures to pesticides. Furthermore, the authors show that using different values within the 95% confidence intervals of LC50 values can result in very different population outcomes, ranging from quick recovery to extinction for both pest and parasitoid species. The authors discuss the implications of these results and emphasize the need to incorporate variability and uncertainty in point estimates for use in risk assessment. © 2015 SETAC.
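The bridging idea, feeding different points of an LC50 confidence interval into a matrix population model and comparing the resulting growth rates, can be sketched with a hypothetical two-stage model. All vital rates and mortality fractions below are invented for illustration and are not the fruit fly or parasitoid data:

```python
import numpy as np

def growth_rate(survival_mult):
    """Dominant eigenvalue (lambda) of a 2-stage (juvenile, adult)
    projection matrix with pesticide-induced mortality applied as a
    multiplier on the survival rates."""
    fecundity = 5.0
    s_juv, s_adult = 0.3 * survival_mult, 0.6 * survival_mult
    A = np.array([[0.0,   fecundity],
                  [s_juv, s_adult]])
    return float(max(abs(np.linalg.eigvals(A))))

# Hypothetical kill fractions at the LC50 point estimate and at the
# ends of its 95% confidence interval.
for label, kill in [("CI lower", 0.30), ("point", 0.50), ("CI upper", 0.75)]:
    lam = growth_rate(1.0 - kill)
    print(f"{label}: lambda = {lam:.2f} "
          f"({'grows' if lam > 1 else 'declines'})")
```

Even in this toy model the two ends of the confidence interval straddle lambda = 1, the boundary between recovery and decline, which is the population-level ambiguity the authors argue a bare LC50 point estimate conceals.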
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dennis, Kristopher; Zhang Liying; Lutz, Stephen
Purpose: To investigate international patterns of practice in the management of radiation therapy-induced nausea and vomiting (RINV). Methods and Materials: Oncologists prescribing radiation therapy in the United States, Canada, The Netherlands, Australia, New Zealand, Spain, Italy, France, Hong Kong, Singapore, Cyprus, and Israel completed a Web-based survey that was based on 6 radiation therapy-only clinical cases modeled after the minimal-, low-, moderate-, and high-emetic risk levels defined in the antiemetic guidelines of the American Society of Clinical Oncology and the Multinational Association of Supportive Care in Cancer. For each case, respondents estimated the risks of nausea and vomiting separately and committed to an initial management approach. Results: In total, 1022 responses were received. Risk estimates and management decisions for the minimal- and high-risk cases varied little and were in line with guideline standards, whereas those for the low- and moderate-risk cases varied greatly. The most common initial management strategies were as follows: rescue therapy for a minimal-risk case (63% of respondents), 2 low-risk cases (56% and 80%), and 1 moderate-risk case (66%); and prophylactic therapy for a second moderate-risk case (75%) and a high-risk case (95%). The serotonin (5-HT)3 receptor antagonists were the most commonly recommended prophylactic agents. On multivariate analysis, factors predictive of a decision for prophylactic or rescue therapy were risk estimates of nausea and vomiting, awareness of the American Society of Clinical Oncology antiemetic guideline, and European Society for Therapeutic Radiology and Oncology membership. Conclusions: Risk estimates and management strategies for RINV varied, especially for low- and moderate-risk radiation therapy cases. Radiation therapy-induced nausea and vomiting are under-studied treatment sequelae.
New observational and translational studies are needed to allow for individual patient risk assessment and to refine antiemetic guideline management recommendations.
Childhood Secondhand Smoke Exposure and ADHD-Attributable Costs to the Health and Education System
ERIC Educational Resources Information Center
Max, Wendy; Sung, Hai-Yen; Shi, Yanling
2014-01-01
Background: Children exposed to secondhand smoke (SHS) have higher rates of behavioral and cognitive effects, including attention deficit hyperactivity disorder (ADHD), but the costs to the health care and education systems have not been estimated. We estimate these costs for school-aged children aged 5-15. Methods: The relative risk (RR) of ADHD…
Operational Risk Measurement of Chinese Commercial Banks Based on Extreme Value Theory
NASA Astrophysics Data System (ADS)
Song, Jiashan; Li, Yong; Ji, Feng; Peng, Cheng
Financial institutions and supervisory bodies agree on the need to strengthen the measurement and management of operational risk. This paper builds a model of operational risk losses based on the Peak Over Threshold (POT) model, emphasizing a weighted-least-squares refinement of Hill's estimation method, discussing the small-sample situation, and fixing the sample threshold more objectively, using media-published data on operational risk losses at primary banks from 1994 to 2007.
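The classical Hill estimator that the paper refines (its weighted-least-squares variant is not reproduced here) estimates the tail index of heavy-tailed losses from the k largest order statistics. A minimal sketch on simulated Pareto losses:

```python
import numpy as np

def hill_estimator(losses, k):
    """Classical Hill estimate of the tail index alpha from the
    k largest observations above the (k+1)-th order statistic."""
    x = np.sort(np.asarray(losses, float))[::-1]   # descending order
    return 1.0 / np.mean(np.log(x[:k]) - np.log(x[k]))

# Pareto(alpha=2) losses: the estimator should recover alpha near 2.
rng = np.random.default_rng(2)
sample = rng.pareto(2.0, 5000) + 1.0   # Pareto with x_min = 1
print(f"Hill tail index (k=200): {hill_estimator(sample, 200):.2f}")
```

The choice of k (equivalently, of the threshold in the POT model) drives the bias-variance trade-off the paper addresses, which is why threshold selection in small samples is the difficult part.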
The global distribution of risk factors by poverty level.
Blakely, Tony; Hales, Simon; Kieft, Charlotte; Wilson, Nick; Woodward, Alistair
2005-01-01
OBJECTIVE: To estimate the individual-level association of income poverty with being underweight, using tobacco, drinking alcohol, having access only to unsafe water and sanitation, being exposed to indoor air pollution and being obese. METHODS: Using survey data for as many countries as possible, we estimated the relative risk association between income or assets and risk factors at the individual level within 11 medium- and low-income subregions of WHO. WHO and The World Bank data on the prevalence of risk factors and income poverty (defined as living on < US$ 1.00 per day, US$ 1-2.00 per day and > US$ 2.00 per day) were analysed to impute the association between poverty and risk factors for each subregion. The possible effect of poverty reduction on the prevalence of risk factors was estimated using population-attributable risk percentages. FINDINGS: There were strong associations between poverty and malnutrition among children, having access only to unsafe water and sanitation, and being exposed to indoor air pollution within each subregion (relative risks were twofold to threefold greater for those living on < US$ 1.00 per day compared with those living on > US$ 2.00 per day). Associations between poverty and obesity, tobacco use and alcohol use varied across subregions. If everyone living on < US$ 2.00 per day had the risk factor profile of those living on > US$ 2.00 per day, 51% of exposures to unimproved water and sanitation could be avoided as could 37% of malnutrition among children and 38% of exposure to indoor air pollution. The more realistic, but still challenging, Millennium Development Goal of halving the number of people living on < US$ 1.00 per day would achieve much smaller reductions. CONCLUSION: To achieve large gains in global health requires both poverty eradication and public health action. The methods used in this study may be useful for monitoring pro-equity progress towards Millennium Development Goals. PMID:15744404
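The population-attributable risk percentage used above is Levin's formula, which combines exposure prevalence with the relative risk. A sketch with illustrative numbers, not the paper's survey data:

```python
def attributable_fraction(p_exposed, rr):
    """Levin's population-attributable fraction: the share of cases
    avoided if the exposed group took on the risk-factor profile of
    the unexposed group."""
    return p_exposed * (rr - 1.0) / (1.0 + p_exposed * (rr - 1.0))

# Illustrative only: 40% of a subregion living on < US$ 2.00 per day,
# with a threefold relative risk of access only to unsafe water.
paf = attributable_fraction(0.40, 3.0)
print(f"PAF = {paf:.0%}")
```

With the twofold to threefold relative risks and high poverty prevalences reported in the abstract, this formula produces attributable fractions in the 35-50% range quoted for unsafe water, child malnutrition, and indoor air pollution.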
Seitz, Holli H.; Gibson, Laura; Skubisz, Christine; Forquer, Heather; Mello, Susan; Schapira, Marilyn M.; Armstrong, Katrina; Cappella, Joseph N.
2016-01-01
Objective This experiment tested the effects of an individualized risk-based online mammography decision intervention. The intervention employs exemplification theory and the Elaboration Likelihood Model of persuasion to improve the match between breast cancer risk and mammography intentions. Methods 2,918 women ages 35-49 were stratified into two levels of 10-year breast cancer risk (< 1.5%; ≥ 1.5%) then randomly assigned to one of eight conditions: two comparison conditions and six risk-based intervention conditions that varied according to a 2 (amount of content: brief vs. extended) × 3 (format: expository vs. untailored exemplar [example case] vs. tailored exemplar) design. Outcomes included mammography intentions and accuracy of perceived breast cancer risk. Results Risk-based intervention conditions improved the match between objective risk estimates and perceived risk, especially for high-numeracy women with a 10-year breast cancer risk <1.5%. For women with a risk < 1.5%, exemplars improved accuracy of perceived risk and all risk-based interventions increased intentions to wait until age 50 to screen. Conclusion A risk-based mammography intervention improved accuracy of perceived risk and the match between objective risk estimates and mammography intentions. Practice Implications Interventions could be applied in online or clinical settings to help women understand risk and make mammography decisions. PMID:27178707
Land, Charles E; Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian; Drozdovitch, Vladimir; Bouville, André; Beck, Harold; Luckyanov, Nicholas; Weinstock, Robert M; Simon, Steven L
2015-02-01
Dosimetric uncertainties, particularly those that are shared among subgroups of a study population, can bias, distort or reduce the slope or significance of a dose response. Exposure estimates in studies of health risks from environmental radiation exposures are generally highly uncertain and thus susceptible to these methodological limitations. An analysis was published in 2008 concerning radiation-related thyroid nodule prevalence in a study population of 2,994 villagers who were under the age of 21 years between August 1949 and September 1962 and who lived downwind from the Semipalatinsk Nuclear Test Site in Kazakhstan. This dose-response analysis identified a statistically significant association between thyroid nodule prevalence and reconstructed doses of fallout-related internal and external radiation to the thyroid gland; however, the effects of dosimetric uncertainty were not evaluated since the doses were simple point "best estimates". In this work, we revised the 2008 study by a comprehensive treatment of dosimetric uncertainties. Our present analysis improves upon the previous study, specifically by accounting for shared and unshared uncertainties in dose estimation and risk analysis, and differs from the 2008 analysis in the following ways: (1) the study population size was reduced from 2,994 to 2,376 subjects, removing 618 persons with uncertain residence histories; (2) simulation of multiple population dose sets (vectors) was performed using a two-dimensional Monte Carlo dose estimation method; and (3) a Bayesian model averaging approach was employed for evaluating the dose response, explicitly accounting for large and complex uncertainty in dose estimation. The results were compared against conventional regression techniques. The Bayesian approach utilizes 5,000 independent realizations of population dose vectors, each of which corresponds to a set of conditional individual median internal and external doses for the 2,376 subjects.
These 5,000 population dose vectors reflect uncertainties in dosimetric parameters, partly shared and partly independent, among individual members of the study population. Risk estimates for thyroid nodules from internal irradiation were higher than those published in 2008, which results, to the best of our knowledge, from explicitly accounting for dose uncertainty. In contrast to earlier findings, the use of Bayesian methods led to the conclusion that the biological effectiveness for internal and external dose was similar. Estimates of excess relative risk per unit dose (ERR/Gy) for males (177 thyroid nodule cases) were almost 30 times those for females (571 cases) and were similar to those reported for thyroid cancers related to childhood exposures to external and internal sources in other studies. For confirmed cases of papillary thyroid cancers (3 in males, 18 in females), the ERR/Gy was also comparable to risk estimates from other studies, but not significantly different from zero. These findings represent the first reported dose response for a radiation epidemiologic study considering all known sources of shared and unshared errors in dose estimation and using a Bayesian model averaging (BMA) method for analysis of the dose response.
Value drivers: an approach for estimating health and disease management program savings.
Phillips, V L; Becker, Edmund R; Howard, David H
2013-12-01
Health and disease management (HDM) programs have faced challenges in documenting savings related to their implementation. The objective of this study was to describe OptumHealth's (Optum) methods for estimating anticipated savings from HDM programs using Value Drivers. Optum's general methodology was reviewed, along with details of 5 high-use Value Drivers. The results showed that the Value Driver approach offers an innovative method for estimating savings associated with HDM programs. The authors demonstrated how real-time savings can be estimated for 5 Value Drivers commonly used in HDM programs: (1) use of beta-blockers in treatment of heart disease, (2) discharge planning for high-risk patients, (3) decision support related to chronic low back pain, (4) obesity management, and (5) securing transportation for primary care. The validity of savings estimates is dependent on the type of evidence used to gauge the intervention effect, generating changes in utilization and, ultimately, costs. The savings estimates derived from the Value Driver method are generally reasonable to conservative and provide a valuable framework for estimating financial impacts from evidence-based interventions.
Hulin, Anne; Blanchet, Benoît; Audard, Vincent; Barau, Caroline; Furlan, Valérie; Durrbach, Antoine; Taïeb, Fabrice; Lang, Philippe; Grimbert, Philippe; Tod, Michel
2009-04-01
A significant relationship between mycophenolic acid (MPA) area under the plasma concentration-time curve (AUC) and the risk for rejection has been reported. Based on 3 concentration measurements, 3 approaches have been proposed for the estimation of MPA AUC, involving either a multilinear regression approach model (MLRA) or a Bayesian estimation using either gamma absorption or zero-order absorption population models. The aim of the study was to compare the 3 approaches for the estimation of MPA AUC in 150 renal transplant patients treated with mycophenolate mofetil and tacrolimus. The population parameters were determined in 77 patients (learning study). The AUC estimation methods were compared in the learning population and in 73 patients from another center (validation study). In the latter study, the reference AUCs were estimated by the trapezoidal rule on 8 measurements. MPA concentrations were measured by liquid chromatography. The gamma absorption model gave the best fit. In the learning study, the AUCs estimated by both Bayesian methods were very similar, whereas the multilinear approach was highly correlated but yielded estimates about 20% lower than Bayesian methods. This resulted in dosing recommendations differing by 250 mg/12 h or more in 27% of cases. In the validation study, AUC estimates based on the Bayesian method with gamma absorption model and multilinear regression approach model were, respectively, 12% higher and 7% lower than the reference values. To conclude, the bicompartmental model with gamma absorption rate gave the best fit. The 3 AUC estimation methods are highly correlated but not concordant. For a given patient, the same estimation method should always be used.
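The reference AUC by the trapezoidal rule, and a limited-sampling multilinear estimate of the kind compared here, can be sketched as follows. The concentration profile and the regression coefficients are placeholders for illustration, not the study's measurements or the published MLRA model:

```python
import numpy as np

def auc_trapezoid(t, c):
    """AUC by the trapezoidal rule over the full sampling schedule."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    return float(np.sum((c[1:] + c[:-1]) / 2.0 * np.diff(t)))

# Hypothetical 8-point MPA profile (time in h, concentration in mg/L)
# standing in for the reference schedule.
t8 = [0, 0.5, 1, 2, 4, 6, 9, 12]
c8 = [1.0, 8.0, 6.0, 3.5, 2.0, 1.5, 1.2, 1.0]
print(f"reference AUC0-12: {auc_trapezoid(t8, c8):.1f} mg*h/L")

# Limited-sampling multilinear estimate from 3 concentrations:
# AUC ~ b0 + b1*C(t1) + b2*C(t2) + b3*C(t3). The coefficients are
# placeholders, not the published multilinear regression model.
b = [5.0, 1.2, 2.5, 6.0]
c3 = [8.0, 3.5, 1.5]   # the 3 samples drawn from the same profile
auc3 = b[0] + sum(bi * ci for bi, ci in zip(b[1:], c3))
print(f"limited-sampling AUC estimate: {auc3:.1f} mg*h/L")
```

The study's point is visible even in this toy setup: two estimators can correlate strongly across patients yet differ by a roughly constant percentage, which shifts dosing recommendations when a fixed AUC target is used.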
Cohen, Mark E; Ko, Clifford Y; Bilimoria, Karl Y; Zhou, Lynn; Huffman, Kristopher; Wang, Xue; Liu, Yaoming; Kraemer, Kari; Meng, Xiangju; Merkow, Ryan; Chow, Warren; Matel, Brian; Richards, Karen; Hart, Amy J; Dimick, Justin B; Hall, Bruce L
2013-08-01
The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) collects detailed clinical data from participating hospitals using standardized data definitions, analyzes these data, and provides participating hospitals with reports that permit risk-adjusted comparisons with a surgical quality standard. Since its inception, the ACS NSQIP has worked to refine surgical outcomes measurements and enhance statistical methods to improve the reliability and validity of this hospital profiling. From an original focus on controlling for between-hospital differences in patient risk factors with logistic regression, ACS NSQIP has added a variable to better adjust for the complexity and risk profile of surgical procedures (procedure mix adjustment) and stabilized estimates derived from small samples by using a hierarchical model with shrinkage adjustment. New models have been developed focusing on specific surgical procedures (eg, "Procedure Targeted" models), which provide opportunities to incorporate indication and other procedure-specific variables and outcomes to improve risk adjustment. In addition, comparative benchmark reports given to participating hospitals have been expanded considerably to allow more detailed evaluations of performance. Finally, procedures have been developed to estimate surgical risk for individual patients. This article describes the development of, and justification for, these new statistical methods and reporting strategies in ACS NSQIP. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Elashoff, Robert M.; Li, Gang; Li, Ning
2009-01-01
Summary In this article we study a joint model for longitudinal measurements and competing risks survival data. Our joint model provides a flexible approach to handle possible nonignorable missing data in the longitudinal measurements due to dropout. It is also an extension of previous joint models with a single failure type, offering a possible way to model informatively censored events as a competing risk. Our model consists of a linear mixed effects submodel for the longitudinal outcome and a proportional cause-specific hazards frailty submodel (Prentice et al., 1978, Biometrics 34, 541-554) for the competing risks survival data, linked together by some latent random effects. We propose to obtain the maximum likelihood estimates of the parameters by an expectation maximization (EM) algorithm and estimate their standard errors using a profile likelihood method. The developed method works well in our simulation studies and is applied to a clinical trial for the scleroderma lung disease. PMID:18162112
Benchmark dose analysis via nonparametric regression modeling
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose-response modeling. It is a well-known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low-dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal-response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap-based confidence limits for the BMD. We explore the confidence limits’ small-sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty. PMID:23683057
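The isotonic approach can be sketched with scikit-learn: fit a monotone dose-response curve to the quantal data, then invert it at the benchmark response level. The dose groups and response counts below are invented, and the bootstrap confidence limit step is omitted:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Quantal dose-response data: dose and observed response proportion
# (number responding out of 25 animals per group; invented data).
dose = np.array([0.0, 0.25, 0.5, 1.0, 2.0])
resp = np.array([2, 4, 6, 12, 20]) / 25.0

iso = IsotonicRegression(increasing=True)
fit = iso.fit_transform(dose, resp)      # monotone (isotonic) fit

# Benchmark dose for a 10% extra risk over background: solve
# (P(d) - P(0)) / (1 - P(0)) = 0.10 by interpolating the fit.
bmr = 0.10
target = fit[0] + bmr * (1.0 - fit[0])
bmd = np.interp(target, fit, dose)
print(f"BMD (extra risk 10%): {bmd:.2f}")
```

Because no parametric curve is assumed, the estimate cannot be distorted by model misspecification; the price is that inference (the BMDL) must come from resampling, which is the bootstrap step studied in the paper.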
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, S.C.; Rowe, M.D.; Holtzman, S.
1992-11-01
The Environmental Protection Agency (EPA) has proposed regulations for allowable levels of radioactive material in drinking water (40 CFR Part 141, 56 FR 33050, July 18, 1991). This review examined the assumptions and methods used by EPA in calculating risks that would be avoided by implementing the proposed Maximum Contaminant Levels for uranium, radium, and radon. Proposed limits on gross alpha and beta-gamma emitters were not included in this review.
Empirical Estimates of 0Day Vulnerabilities in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miles A. McQueen; Wayne F. Boyer; Sean M. McBride
2009-01-01
We define a 0Day vulnerability to be any vulnerability, in deployed software, which has been discovered by at least one person but has not yet been publicly announced or patched. These 0Day vulnerabilities are of particular interest when assessing the risk to well managed control systems which have already effectively mitigated the publicly known vulnerabilities. In these well managed systems the risk contribution from 0Days will have proportionally increased. To aid understanding of how great a risk 0Days may pose to control systems, an estimate of how many are in existence is needed. Consequently, using the 0Day definition given above, we developed and applied a method for estimating how many 0Day vulnerabilities are in existence on any given day. The estimate is made by: empirically characterizing the distribution of the lifespans, measured in days, of 0Day vulnerabilities; determining the number of vulnerabilities publicly announced each day; and applying a novel method for estimating the number of 0Day vulnerabilities in existence on any given day using the number of vulnerabilities publicly announced each day and the previously derived distribution of 0Day lifespans. The method was first applied to a general set of software applications by analyzing the 0Day lifespans of 491 software vulnerabilities and using the daily rate of vulnerability announcements in the National Vulnerability Database. This led to a conservative estimate that in the worst year there were, on average, 2500 0Day software related vulnerabilities in existence on any given day. Using a smaller but intriguing set of 15 0Day software vulnerability lifespans representing the actual time from discovery to public disclosure, we then made a more aggressive estimate. In this case, we estimated that in the worst year there were, on average, 4500 0Day software vulnerabilities in existence on any given day.
We then proceeded to identify the subset of software applications likely to be used in some control systems, analyzed the associated subset of vulnerabilities, and characterized their lifespans. Using the previously developed method of analysis, we very conservatively estimated 250 control system related 0Day vulnerabilities in existence on any given day. While reasonable, this first order estimate for control systems is probably far more conservative than those made for general software systems, since the estimate did not include vulnerabilities unique to control system specific components. These control system specific vulnerabilities could not be included in the estimate for a variety of reasons, the most problematic being that public announcements of unique control system vulnerabilities are very sparse. Consequently, with the intent to improve the above 0Day estimate for control systems, we first identified the additional vulnerability estimation constraints unique to control systems and then investigated new mechanisms which may be useful for estimating the number of unique 0Day software vulnerabilities found in control system components. We proceeded to identify a number of new mechanisms and approaches for estimating and incorporating control system specific vulnerabilities into an improved 0Day estimation method. These new mechanisms and approaches appear promising and will be more rigorously evaluated during the course of the next year.
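The core accounting step behind such estimates resembles Little's law: in steady state, the expected number of 0Days alive on a given day is the daily announcement rate times the mean 0Day lifespan. A minimal sketch with hypothetical values (neither number is taken from the paper's 491-vulnerability dataset):

```python
# Hypothetical inputs, for illustration only.
announcements_per_day = 15.0   # vulnerabilities publicly disclosed per day
mean_lifespan_days = 300.0     # mean days from discovery to public disclosure

# Little's-law-style steady-state estimate of 0Days in existence on a given day:
n_zero_days = announcements_per_day * mean_lifespan_days
```

The paper's method is finer-grained, using the full lifespan distribution rather than its mean, but the product above captures why both the announcement rate and the lifespan distribution are needed inputs.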
Risk communication methods in hip fracture prevention: a randomised trial in primary care.
Hudson, Ben; Toop, Les; Mangin, Dee; Pearson, John
2011-08-01
Treatment acceptance by patients is influenced by the way treatment effects are presented. Presentation of benefits using relative risk increases treatment acceptance compared to the use of absolute risk. It is not known whether this effect is modified by prior presentation of a patient's individualised risk estimate, or how presentation of treatment harms by relative or absolute risk affects acceptance. To compare acceptance of a hypothetical treatment to prevent hip fracture after presentation of the treatment's benefit in relative or absolute terms in the context of a personal fracture risk estimate, and to reassess acceptance following subsequent presentation of harm in relative or absolute terms. Randomised controlled trial of patients recruited from 10 GPs' lists in Christchurch, New Zealand. Women aged ≥ 50 years were invited to participate. Participants were given a personal 10-year hip fracture risk estimate and randomised to receive information on a hypothetical treatment's benefit and harm in relative or absolute terms. Of the 1140 women invited to participate, 393 (34%) took part. Treatment acceptance was greater following presentation of benefit in absolute terms than in relative terms after adjustment for age, education, previous osteoporosis diagnosis, and self-reported risk (OR 1.73, 95% confidence interval [CI] = 1.10 to 2.73, P = 0.018). Presentation of the treatment's harmful effect in relative terms led to a greater proportion of participants declining treatment than did presentation in absolute terms (OR 4.89, 95% CI = 2.3 to 11.0, P<0.001). Presentation of treatment benefit and harm using absolute risk estimates led to greater treatment acceptance than presentation of the same information in relative terms.
NASA Astrophysics Data System (ADS)
Kim, T.; Brauman, K. A.; Schmitt, J.; Goodkind, A. L.; Smith, T. M.
2016-12-01
Water scarcity in US corn farming regions is a significant risk consideration for the ethanol and meat production sectors, which comprise 80% of all US corn demand. Water supply risk can lead to effects across the supply chain, affecting annual corn yields. The purpose of our study is to assess the water risk to the US's most corn-intensive sectors and companies by linking watershed depletion estimates with corn production and tracing that corn to downstream companies through a corn transport model. We use a water depletion index as an improved metric for seasonal water scarcity and a corn sourcing supply chain model based on economic cost minimization. Water depletion was calculated as the fraction of renewable (ground and surface) water consumption, with estimates of more than 75% depletion on an annual average basis indicating a significant water risk. We estimated company water risk as the amount of embedded corn coming from three categories of water stressed counties. The ethanol sector had 3.1% of sourced corn grown in counties that were more than 75% depleted, while the beef sector had 14.0%. From a firm perspective, Tyson, JBS, and Cargill, the top three US corn-demanding companies, had 4.5%, 9.6%, and 12.8% of their sourced corn, respectively, coming from watersheds that are more than 75% depleted. These numbers are significantly higher than the global average of 2.2% of watersheds being classified as more than 75% depleted. Our model enables corn-using industries to evaluate their supply chain risk of water scarcity through modeling corn sourcing and watershed depletion, providing the private sector with a new method for risk estimation. Our results suggest corn-dependent industries are already linked to water scarcity risk in disproportionate amounts due to the spatial heterogeneity of corn sourcing and water scarcity.
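The company-level exposure metric reduces to a depletion-weighted sourcing share: the fraction of a firm's corn sourced from watersheds above the 75% depletion threshold. A toy version with hypothetical counties and volumes (the paper's transport and cost-minimization model is far richer):

```python
# (tons of corn sourced, watershed depletion fraction) per county -- all hypothetical
sourcing = {
    "county_A": (120_000, 0.85),
    "county_B": (400_000, 0.40),
    "county_C": (80_000, 0.78),
}

total = sum(tons for tons, _ in sourcing.values())
at_risk = sum(tons for tons, depl in sourcing.values() if depl > 0.75)  # >75% depleted
share_at_risk = at_risk / total
```

A firm would compare `share_at_risk` against the global baseline (2.2% of watersheds more than 75% depleted) to judge whether its sourcing is disproportionately exposed.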
[FMEA applied to the radiotherapy patient care process].
Meyrieux, C; Garcia, R; Pourel, N; Mège, A; Bodez, V
2012-10-01
Failure modes and effects analysis (FMEA) is a risk analysis method used at the Radiotherapy Department of Institute Sainte-Catherine as part of a strategy seeking to continuously improve the quality and safety of treatments. The method comprises several steps: definition of the main processes; for each of them, description of every step of prescription, treatment preparation and treatment application; identification of the possible risks, their consequences and their origins; search for existing safety elements which may avoid these risks; and grading of risks to assign a criticality score, resulting in a numerical ranking of the risks. Finally, the impact of proposed corrective actions was estimated by a new grading round. For each process studied, a detailed map of the risks was obtained, facilitating the identification of priority actions to be undertaken. For example, five steps in patient treatment planning carried an unacceptable level of risk, 62 a moderate level and 31 an acceptable level. The FMEA method, used in the industrial domain and applied here to health care, is an effective tool for the management of risks in patient care. However, the time and training requirements necessary to implement this method should not be underestimated. Copyright © 2012 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
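In many FMEA implementations the criticality score is a risk priority number: the product of severity, occurrence, and detectability grades. A sketch with hypothetical process steps and 1-10 scales (the department's actual grading grid is not given in the abstract):

```python
def criticality(severity, occurrence, detectability):
    # Classic FMEA risk priority number on 1-10 scales (a common convention,
    # not necessarily the grid used at Institute Sainte-Catherine).
    return severity * occurrence * detectability

steps = {  # hypothetical scores for three radiotherapy process steps
    "prescription entry": (8, 3, 4),   # RPN 96
    "dose calculation":   (9, 2, 2),   # RPN 36
    "patient setup":      (6, 4, 5),   # RPN 120
}
# Rank steps by criticality to identify priority corrective actions.
ranked = sorted(steps, key=lambda s: criticality(*steps[s]), reverse=True)
```

Re-grading after a corrective action and recomputing the RPN is what the abstract's "new grading round" corresponds to.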
What controls the maximum magnitude of injection-induced earthquakes?
NASA Astrophysics Data System (ADS)
Eaton, D. W. S.
2017-12-01
Three different approaches for estimation of maximum magnitude are considered here, along with their implications for managing risk. The first approach is based on a deterministic limit for seismic moment proposed by McGarr (1976), which was originally designed for application to mining-induced seismicity. This approach has since been reformulated for earthquakes induced by fluid injection (McGarr, 2014). In essence, this method assumes that the upper limit for seismic moment release is constrained by the pressure-induced stress change. A deterministic limit is given by the product of shear modulus and the net injected fluid volume. This method is based on the assumptions that the medium is fully saturated and in a state of incipient failure. An alternative geometrical approach was proposed by Shapiro et al. (2011), who postulated that the rupture area for an induced earthquake falls entirely within the stimulated volume. This assumption reduces the maximum-magnitude problem to one of estimating the largest potential slip surface area within a given stimulated volume. Finally, van der Elst et al. (2016) proposed that the maximum observed magnitude, statistically speaking, is the expected maximum value for a finite sample drawn from an unbounded Gutenberg-Richter distribution. These three models imply different approaches for risk management. The deterministic method proposed by McGarr (2014) implies that a ceiling on the maximum magnitude can be imposed by limiting the net injected volume, whereas the approach developed by Shapiro et al. (2011) implies that the time-dependent maximum magnitude is governed by the spatial size of the microseismic event cloud. Finally, the sample-size hypothesis of Van der Elst et al. (2016) implies that the best available estimate of the maximum magnitude is based upon observed seismicity rate. The latter two approaches suggest that real-time monitoring is essential for effective management of risk. 
A reliable estimate of maximum plausible magnitude would clearly be beneficial for quantitative risk assessment of injection-induced seismicity.
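McGarr's deterministic bound translates directly into a maximum moment magnitude via the standard Hanks-Kanamori relation. A minimal sketch; the shear modulus and injected volume below are illustrative values rather than a specific case study, and `mcgarr_max_magnitude` is a name invented here:

```python
import math

def mcgarr_max_magnitude(injected_volume_m3, shear_modulus_pa=3.0e10):
    # Upper bound on cumulative seismic moment: M0 <= G * delta-V (McGarr, 2014),
    # with G the shear modulus and delta-V the net injected fluid volume.
    m0 = shear_modulus_pa * injected_volume_m3          # N*m
    # Hanks-Kanamori moment magnitude.
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

m_max = mcgarr_max_magnitude(1.0e5)  # 100,000 m^3 net injected volume
```

This is what makes the approach attractive for regulation: capping the net injected volume caps `m_max` directly, whereas the Shapiro et al. and van der Elst et al. estimates evolve with the observed microseismicity.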
Neocleous, A C; Syngelaki, A; Nicolaides, K H; Schizas, C N
2018-04-01
To estimate the risk of fetal trisomy 21 (T21) and other chromosomal abnormalities (OCA) at 11-13 weeks' gestation using computational intelligence classification methods. As a first step, a training dataset consisting of 72 054 euploid pregnancies, 295 cases of T21 and 305 cases of OCA was used to train an artificial neural network. Then, a two-stage approach was used for stratification of risk and diagnosis of cases of aneuploidy in the blind set. In Stage 1, using four markers, pregnancies in the blind set were classified into no risk and risk. No-risk pregnancies were not examined further, whereas the risk pregnancies were forwarded to Stage 2 for further examination. In Stage 2, using seven markers, pregnancies were classified into three types of risk, namely no risk, moderate risk and high risk. Of the 36 328 pregnancies unseen by the system (blind set), 17 512 euploid, two T21 and 18 OCA were classified as no risk in Stage 1. The remaining 18 796 cases were forwarded to Stage 2, of which 7895 euploid, two T21 and two OCA cases were classified as no risk, 10 464 euploid, 83 T21 and 61 OCA as moderate risk and 187 euploid, 50 T21 and 52 OCA as high risk. The sensitivity and the specificity for T21 in Stage 2 were 97.1% and 99.5%, respectively, and the false-positive rate from Stage 1 to Stage 2 was reduced from 51.4% to ∼1%, assuming that the cell-free DNA test could identify all euploid and aneuploid cases. We propose a method for early diagnosis of chromosomal abnormalities that ensures that most T21 cases are classified as high risk at any stage. At the same time, the number of euploid cases subjected to invasive or cell-free DNA examinations was minimized through a routine procedure offered in two stages. Our method is minimally invasive and of relatively low cost, highly effective at T21 identification, and it performs better than other existing statistical methods. Copyright © 2017 ISUOG. Published by John Wiley & Sons Ltd.
Task exposures in an office environment: a comparison of methods.
Van Eerd, Dwayne; Hogg-Johnson, Sheilah; Mazumder, Anjali; Cole, Donald; Wells, Richard; Moore, Anne
2009-10-01
Task-related factors such as frequency and duration are associated with musculoskeletal disorders in office settings. The primary objective was to compare various task recording methods as measures of exposure in an office workplace. A total of 41 workers from different jobs were recruited from a large urban newspaper (71% female; mean age 41 years, SD 9.6). Questionnaire, task diary, direct observation and video methods were used to record tasks. A common set of task codes was used across methods. Different estimates of task duration, number of tasks and task transitions arose from the different methods. Self-report methods did not consistently result in longer task duration estimates. Methodological issues could explain some of the differences in estimates observed between methods. It was concluded that different task recording methods result in different estimates of exposure, likely due to different exposure constructs. This work addresses issues of exposure measurement in office environments. It is of relevance to ergonomists/researchers interested in how to best assess the risk of injury among office workers. The paper discusses the trade-offs between precision, accuracy and burden in the collection of computer task-based exposure measures and the different underlying constructs captured by each method.
NASA Astrophysics Data System (ADS)
Dai, Jun; Zhou, Haigang; Zhao, Shaoquan
2017-01-01
This paper considers a multi-scale futures hedging strategy that minimizes lower partial moments (LPM). To do this, wavelet analysis is adopted to decompose time series data into different components. Next, different parametric estimation methods with known distributions are applied to calculate the LPM of hedged portfolios, which is the key to determining multi-scale hedge ratios over different time scales. These parametric methods are then compared with the prevailing nonparametric kernel metric method. Empirical results indicate that in the China Securities Index 300 (CSI 300) index futures and spot markets, hedge ratios and hedge efficiency estimated by the nonparametric kernel metric method are inferior to those estimated by parametric hedging models based on the features of sequence distributions. In addition, if minimum-LPM is selected as the hedge target, the hedging period, degree of risk aversion, and target return can each affect the multi-scale hedge ratios and hedge efficiency.
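A lower partial moment penalizes only returns below a target, which is what makes it a downside-risk hedge objective. A minimal empirical sketch with hypothetical return series; the paper additionally applies wavelet decomposition and parametric distribution fits, both omitted here:

```python
def lpm(returns, target=0.0, order=2):
    # Lower partial moment: mean of shortfalls below `target`, raised to `order`.
    return sum(max(target - r, 0.0) ** order for r in returns) / len(returns)

spot = [0.011, -0.020, 0.015, -0.012, 0.005]   # hypothetical spot returns
fut = [0.012, -0.018, 0.014, -0.011, 0.004]    # hypothetical futures returns

# Grid-search the hedge ratio h that minimizes the LPM of the hedged portfolio
# (the paper instead derives h from fitted parametric or kernel distributions).
best_h = min((i / 100 for i in range(151)),
             key=lambda h: lpm([s - h * f for s, f in zip(spot, fut)]))
```

In the multi-scale version, `spot` and `fut` would be the wavelet components of the return series at one time scale, and the search would be repeated per scale.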
Ward, Zachary J.; Long, Michael W.; Resch, Stephen C.; Gortmaker, Steven L.; Cradock, Angie L.; Giles, Catherine; Hsiao, Amber; Wang, Y. Claire
2016-01-01
Background State-level estimates from the Centers for Disease Control and Prevention (CDC) underestimate the obesity epidemic because they use self-reported height and weight. We describe a novel bias-correction method and produce corrected state-level estimates of obesity and severe obesity. Methods Using non-parametric statistical matching, we adjusted self-reported data from the Behavioral Risk Factor Surveillance System (BRFSS) 2013 (n = 386,795) using measured data from the National Health and Nutrition Examination Survey (NHANES) (n = 16,924). We validated our national estimates against NHANES and estimated bias-corrected state-specific prevalence of obesity (BMI≥30) and severe obesity (BMI≥35). We compared these results with previous adjustment methods. Results Compared to NHANES, self-reported BRFSS data underestimated national prevalence of obesity by 16% (28.67% vs 34.01%) and severe obesity by 23% (11.03% vs 14.26%). Our method was not significantly different from NHANES for obesity or severe obesity, while previous methods underestimated both. Only four states had a corrected obesity prevalence below 30%, with four exceeding 40%; in contrast, most states were below 30% in CDC maps. Conclusions Twelve million adults with obesity (including 6.7 million with severe obesity) were misclassified by CDC state-level estimates. Previous bias-correction methods also resulted in underestimates. Accurate state-level estimates are necessary to plan for resources to address the obesity epidemic. PMID:26954566
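One simple member of the statistical-matching family is percentile mapping: replace a self-reported value with the measured-survey value at the same percentile rank. This is only a toy stand-in for the authors' covariate-conditioned non-parametric matching; all arrays below are hypothetical:

```python
def quantile_map(v, self_reported, measured):
    # Replace v with the measured-survey value at v's percentile rank
    # in the self-reported distribution (integer ranks avoid float edge cases).
    src, dst = sorted(self_reported), sorted(measured)
    rank = sum(1 for s in src if s <= v)
    return dst[min(rank * len(dst) // len(src), len(dst) - 1)]

self_rep = [22, 24, 26, 28, 31, 34]   # hypothetical self-reported BMIs (BRFSS-like)
measured = [23, 25, 27, 30, 33, 36]   # hypothetical measured BMIs (NHANES-like)
corrected = quantile_map(28, self_rep, measured)
```

Because self-reported BMI is typically biased downward, the corrected value sits higher than the reported one, which is the direction of the adjustment the paper describes.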
Che, W W; Frey, H Christopher; Lau, Alexis K H
2014-12-01
Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5) are quantified via case studies for children in Wake County, NC for July 2002. The estimated mean daily average exposure is 12.9 μg/m³ for simulated children using the stratified population sampling method, and 12.2 μg/m³ using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m³ due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In case studies evaluated here, the MCC method led to 10% higher estimation of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.
Ecological risk assessment in a large river-reservoir. 5: Aerial insectivorous wildlife
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baron, L.A.; Sample, B.E.; Suter, G.W. II
Risks to aerial insectivores (e.g., rough-winged swallows, little brown bats, and endangered gray bats) were assessed for the remedial investigation of the Clinch River/Poplar Creek (CR/PC) system. Adult mayflies and sediment were collected from three locations and analyzed for contaminants. Sediment-to-mayfly contaminant uptake factors were generated from these data and used to estimate contaminant concentrations in mayflies from 13 additional locations. Contaminants of potential ecological concern (COPECs) were identified by comparing exposure estimates generated using point estimates of parameter values to NOAELs. To incorporate the variation in exposure parameters and to provide a better estimate of the potential exposure, the exposure model was recalculated using Monte Carlo methods. The potential for adverse effects was estimated based on the comparison of the exposure distribution and the LOAEL. The results of this assessment suggested that population-level effects on rough-winged swallows and little brown bats are considered unlikely. However, because gray bats are endangered, effects on individuals may be significant from foraging in limited subreaches of the CR/PC system. This assessment illustrates the advantage of an iterative approach to ecological risk assessments, using fewer conservative assumptions and more realistic modeling of exposure.
High risk of tuberculous infection in North Sulawesi Province of Indonesia.
Bachtiar, A; Miko, T Y; Machmud, R; Mehta, F; Chadha, V K; Yudarini, P; Loprang, F; Fahmi, S; Jitendra, R
2009-12-01
Of all the provinces in Indonesia, the highest tuberculosis (TB) case notification rates are reported from North Sulawesi Province. To estimate the annual risk of tuberculous infection (ARTI) among schoolchildren in the 6-9 year age group. A cross-sectional survey was carried out in 99 schools selected by a two-stage sampling process. Children attending grades 1-4 in the selected schools were administered intradermally with 2 tuberculin units (TUs) of purified protein derivative RT23 with Tween 80, and the maximum transverse diameter of induration was measured about 72 h later. A total of 6557 children in the 6-9 year age group were satisfactorily test-read, irrespective of their bacille Calmette-Guérin (BCG) vaccination status. Based on the frequency distribution of reaction sizes obtained among satisfactorily test-read children (without and with BCG scar), the estimated ARTI rates when estimated by different methods (anti-mode, mirror-image and mixture model) varied between 1.9% and 2.5%. BCG-induced tuberculin sensitivity was not found to influence the ARTI estimates, as the differences in estimates between children without and with BCG scar were not statistically significant. TB control efforts should be further intensified to reduce the risk of tuberculous infection.
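ARTI estimates of this kind rest on the standard relation between infection prevalence at a given age and a constant annual risk, P = 1 − (1 − r)^a. A sketch with a hypothetical reactor prevalence chosen to land in the reported 1.9-2.5% band (the survey's anti-mode, mirror-image, and mixture-model estimators differ only in how they derive the prevalence from the induration distribution):

```python
prevalence = 0.16   # hypothetical fraction of tuberculin reactors among children
mean_age = 8.0      # mean years of exposure in the 6-9 year age group

# Solve P = 1 - (1 - r)**a for the annual risk of infection r:
arti = 1.0 - (1.0 - prevalence) ** (1.0 / mean_age)
```

With these inputs the annual risk comes out near 2.2%, i.e., roughly one in fifty children newly infected each year.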
Fernández-Feito, Ana; Antón-Fernández, Raquel; Paz-Zulueta, María
2018-05-01
To estimate the association between the human papillomavirus (HPV) vaccine and sexual risk behaviour, as well as participation in the Cervical Cancer Screening Program (CCSP). Cross-sectional study. School of Medicine and Health Sciences, School of Law, and School of Economics and Business (University of Oviedo). Female university students. Information was collected about contraceptive methods, sexual behaviours, HPV knowledge, and participation in the CCSP. Furthermore, proportions and odds ratios (OR) were estimated with their corresponding 95% confidence intervals (95% CI). Approximately two-thirds (67.7%) of the sample was vaccinated against HPV, and 216 women (65.3%) were sexually active. Barrier contraceptive methods were used by 67.6% during their current intimate relationships, being less frequent among non-vaccinated women (54.9% vs. 75.4% in vaccinated female students) (P=.002). The risk of having at least one sexual risk behaviour was higher in non-vaccinated women: OR 2.29 (95% CI: 1.29-4.07). In addition, the probability of having a PAP test within the CCSP was higher in non-vaccinated women: OR 2.18 (95% CI: 1.07-4.47). The prevalence of sexual risk behaviours in non-vaccinated women is elevated, and it is related to the lack of use of barrier contraceptive methods. Vaccination against HPV could affect sexual behaviours and participation in the CCSP. Therefore, the information received by young people about contraceptive methods, sexually transmitted diseases, and cancer prevention should be reinforced. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
Risk Assessment Methodology Based on the NISTIR 7628 Guidelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R
2013-01-01
Earlier work describes computational models of critical infrastructure that allow an analyst to estimate the security of a system in terms of the impact of loss per stakeholder resulting from security breakdowns. Here, we consider how to identify, monitor and estimate risk impact and probability for different smart grid stakeholders. Our constructive method leverages currently available standards and defined failure scenarios. We utilize the National Institute of Standards and Technology (NIST) Interagency or Internal Reports (NISTIR) 7628 as a basis to apply Cyberspace Security Econometrics system (CSES) for comparing design principles and courses of action in making security-related decisions.
Woo, Hae Dong; Kim, Jeongseon
2012-01-01
Good biomarkers for early detection of cancer lead to better prognosis. However, harvesting tumor tissue is invasive and cannot be routinely performed. Global DNA methylation of peripheral blood leukocyte DNA was evaluated as a biomarker for cancer risk. We performed a meta-analysis to estimate overall cancer risk according to global DNA hypomethylation levels among studies with various cancer types and analytical methods used to measure DNA methylation. Studies were systematically searched via PubMed with no language limitation up to July 2011. Summary estimates were calculated using a fixed effects model. Subgroup analyses by the experimental method used to determine DNA methylation level were performed due to heterogeneity within the selected studies (p<0.001, I²: 80%). Heterogeneity was not found in the subgroup of %5-mC (p = 0.393, I²: 0%) and LINE-1 using the same target sequence (p = 0.097, I²: 49%), whereas considerable variance remained in LINE-1 (p<0.001, I²: 80%) and bladder cancer studies (p = 0.016, I²: 76%). These results suggest that the experimental methods used to quantify global DNA methylation levels are important factors in the association study between hypomethylation levels and cancer risk. Overall, cancer risks of the group with the lowest DNA methylation levels were significantly higher compared to the group with the highest methylation levels [OR (95% CI): 1.48 (1.28-1.70)]. Global DNA hypomethylation in peripheral blood leukocytes may be a suitable biomarker for cancer risk. However, the association between global DNA methylation and cancer risk may differ based on the experimental method and the region of DNA targeted for measuring global hypomethylation levels, as well as the cancer type. Therefore, it is important to select a precise and accurate surrogate marker for global DNA methylation levels in association studies between global DNA methylation levels in peripheral leukocytes and cancer risk.
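Fixed-effects pooling of this kind is inverse-variance weighting on the log-OR scale, with each study's standard error back-computed from its confidence interval. A sketch with hypothetical per-study odds ratios and 95% CIs (not the studies in this meta-analysis):

```python
import math

# (OR, CI lower, CI upper) per study -- hypothetical values
studies = [(1.6, 1.2, 2.1), (1.3, 1.0, 1.8), (1.5, 1.1, 2.0)]

w_sum = wy_sum = 0.0
for or_, lo, hi in studies:
    y = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
    w = 1.0 / se ** 2                                # inverse-variance weight
    w_sum += w
    wy_sum += w * y

pooled_or = math.exp(wy_sum / w_sum)
ci = (math.exp(wy_sum / w_sum - 1.96 / math.sqrt(w_sum)),
      math.exp(wy_sum / w_sum + 1.96 / math.sqrt(w_sum)))
```

The I² heterogeneity statistic the abstract reports would be computed from Cochran's Q over the same weighted residuals; when I² is high, a random-effects model is usually considered instead.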
Lin, Tin-Chi; Marucci-Wellman, Helen R; Willetts, Joanna L; Brennan, Melanye J; Verma, Santosh K
2016-12-01
A common issue in descriptive injury epidemiology is that in order to calculate injury rates that account for the time spent in an activity, both injury cases and exposure time of specific activities need to be collected. In reality, few national surveys have this capacity. To address this issue, we combined statistics from two different national complex surveys as inputs for the numerator and denominator to estimate injury rate, accounting for the time spent in specific activities and included a procedure to estimate variance using the combined surveys. The 2010 National Health Interview Survey (NHIS) was used to quantify injuries, and the 2010 American Time Use Survey (ATUS) was used to quantify time of exposure to specific activities. The injury rate was estimated by dividing the average number of injuries (from NHIS) by average exposure hours (from ATUS), both measured for specific activities. The variance was calculated using the 'delta method', a general method for variance estimation with complex surveys. Among the five types of injuries examined, 'sport and exercise' had the highest rate (12.64 injuries per 100 000 h), followed by 'working around house/yard' (6.14), driving/riding a motor vehicle (2.98), working (1.45) and sleeping/resting/eating/drinking (0.23). The results show a ranking of injury rate by activity quite different from estimates using population as the denominator. Our approach produces an estimate of injury risk which includes activity exposure time and may more reliably reflect the underlying injury risks, offering an alternative method for injury surveillance and research. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
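For two independent surveys, the delta-method variance of the ratio simplifies to the sum of squared relative standard errors. A sketch with hypothetical numerator and denominator estimates (not the NHIS/ATUS figures, which also involve survey replicate weights):

```python
import math

# Hypothetical survey estimates for one activity (independent surveys, so
# the covariance term in the delta method is taken as zero).
injuries, injuries_se = 4.2e5, 3.0e4   # estimated annual injuries (NHIS-like)
hours, hours_se = 3.3e9, 1.1e8         # estimated annual exposure hours (ATUS-like)

rate_per_100k_h = injuries / hours * 1e5

# Delta method for a ratio R = N/H of independent estimates:
#   var(R) ~= R^2 * (var(N)/N^2 + var(H)/H^2)
rel_var = (injuries_se / injuries) ** 2 + (hours_se / hours) ** 2
rate_se = rate_per_100k_h * math.sqrt(rel_var)
```

Dividing by exposure hours rather than population is what re-ranks activities: a rare activity with few total injuries can still carry a high per-hour risk.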
Goetzel, Ron Z.; Tabrizi, Maryam; Henke, Rachel Mosher; Benevent, Richele; Brockbank, Claire v. S.; Stinson, Kaylan; Trotter, Margo; Newman, Lee S.
2015-01-01
Objective To determine whether changes in health risks for workers in small businesses can produce medical and productivity cost savings. Methods A 1-year pre- and posttest study tracked changes in 10 modifiable health risks for 2458 workers at 121 Colorado businesses that participated in a comprehensive worksite health promotion program. Risk reductions were entered into a return-on-investment (ROI) simulation model. Results Reductions were recorded in 10 risk factors examined, including obesity (−2.0%), poor eating habits (−5.8%), poor physical activity (−6.5%), tobacco use (−1.3%), high alcohol consumption (−1.7%), high stress (−3.5%), depression (−2.3%), high blood pressure (−0.3%), high total cholesterol (−0.9%), and high blood glucose (−0.2%). The ROI model estimated medical and productivity savings of $2.03 for every $1.00 invested. Conclusions Pooled data suggest that small businesses can realize a positive ROI from effective risk reduction programs. PMID:24806569
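The ROI arithmetic itself is a savings-to-cost ratio; the per-employee figures below are hypothetical, chosen only so the ratio reproduces the reported $2.03 per $1.00:

```python
program_cost = 150.00          # hypothetical per-employee program cost, $
medical_savings = 190.00       # hypothetical per-employee medical savings, $
productivity_savings = 115.00  # hypothetical per-employee productivity savings, $

roi = (medical_savings + productivity_savings) / program_cost  # $ returned per $1
```

In the paper the savings side is not observed directly: it is simulated from the measured risk-factor reductions via a published risk-to-cost model.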
Risk factors for UK Plasmodium falciparum cases
2014-01-01
Background An increasing proportion of malaria cases diagnosed in UK residents with a history of travel to malaria endemic areas are due to Plasmodium falciparum. Methods In order to identify travellers at most risk of acquiring malaria, a proportional hazards model was used to estimate the risk of acquiring malaria stratified by purpose of travel and age whilst adjusting for entomological inoculation rate (EIR) and duration of stay in endemic countries. Results Travellers visiting friends and relatives and business travellers were found to have significantly higher hazard of acquiring malaria (adjusted hazard ratio (HR) relative to that of holiday makers 7.4, 95% CI 6.4–8.5, p < 0.0001 and HR 3.4, 95% CI 2.9-3.8, p < 0.0001, respectively). All age-groups were at lower risk than children aged 0–15 years. Conclusions These estimates of the increased risk for business travellers and those visiting friends and relatives should be used to inform programmes to improve awareness of the risks of malaria when travelling. PMID:25091803
The problem of estimating recent genetic connectivity in a changing world.
Samarasin, Pasan; Shuter, Brian J; Wright, Stephen I; Rodd, F Helen
2017-02-01
Accurate understanding of population connectivity is important to conservation because dispersal can play an important role in population dynamics, microevolution, and assessments of extirpation risk and population rescue. Genetic methods are increasingly used to infer population connectivity because advances in technology have made them more advantageous (e.g., cost effective) relative to ecological methods. Given the reductions in wildlife population connectivity since the Industrial Revolution, and more recent drastic reductions from habitat loss, it is important to know the accuracy of and biases in genetic connectivity estimators when connectivity has declined recently. Using simulated data, we investigated the accuracy and bias of 2 common estimators of migration rate (the movement of individuals among populations). We focused on the effects of the timing and magnitude of the connectivity change on the estimates of migration, using a coalescent-based method (Migrate-n) and a disequilibrium-based method (BayesAss). Contrary to expectations, when historically high connectivity had declined recently: (i) both methods overestimated recent migration rates; (ii) the coalescent-based method (Migrate-n) provided better estimates of recent migration rate than the disequilibrium-based method (BayesAss); and (iii) the coalescent-based method did not accurately reflect long-term genetic connectivity. Overall, our results highlight the problems with comparing coalescent and disequilibrium estimates to make inferences about the effects of recent landscape change on genetic connectivity among populations. We found that contrasting these 2 estimates to make inferences about genetic-connectivity changes over time could lead to inaccurate conclusions. © 2016 Society for Conservation Biology.
Platt, Lauren R; Estívariz, Concepción F; Sutter, Roland W
2014-11-01
Vaccine-associated paralytic poliomyelitis (VAPP) is a rare adverse event associated with oral poliovirus vaccine (OPV). This review summarizes the epidemiology and provides a global burden estimate. A literature review was conducted to abstract the epidemiology and calculate the risk of VAPP. A bootstrap method was applied to calculate global VAPP burden estimates. Trends in VAPP epidemiology varied by country income level. In low-income countries, the majority of cases occurred in individuals who had received >3 doses of OPV (63%), whereas in middle- and high-income countries, most cases occurred in recipients of their first OPV dose or in unvaccinated contacts (81%). Using all risk estimates, VAPP risk was 4.7 cases per million births (range, 2.4-9.7), leading to a global annual burden estimate of 498 cases (range, 255-1018). If the analysis is limited to estimates from countries that currently use OPV, the VAPP risk is 3.8 cases per million births (range, 2.9-4.7) and the burden 399 cases (range, 306-490). Because many high-income countries have replaced OPV with inactivated poliovirus vaccine, the VAPP burden is concentrated in lower-income countries. The planned universal introduction of inactivated poliovirus vaccine is likely to decrease the global VAPP burden substantially, by 80%-90%. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
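The percentile bootstrap named in the abstract can be sketched as resampling risk estimates and scaling by births. The per-million-birth risk values and the 125 million annual births below are illustrative assumptions, not the review's data:

```python
import random
import statistics

# Sketch of a percentile bootstrap for a burden estimate, in the spirit of
# the VAPP calculation: resample per-million-birth risk estimates with
# replacement and multiply by global annual births (in millions).
# All numeric inputs are illustrative assumptions.

def bootstrap_burden(risks, births_millions, n_boot=2000, rng=None):
    """Return (median, 2.5th percentile, 97.5th percentile) of burden draws."""
    rng = rng or random.Random(0)
    draws = []
    for _ in range(n_boot):
        sample = [rng.choice(risks) for _ in risks]  # resample with replacement
        draws.append(statistics.mean(sample) * births_millions)
    draws.sort()
    return draws[len(draws) // 2], draws[int(0.025 * n_boot)], draws[int(0.975 * n_boot)]

risks = [2.4, 3.1, 3.8, 4.7, 5.5, 7.2, 9.7]  # hypothetical per-million-birth risks
mid, lo, hi = bootstrap_burden(risks, 125)
print(round(lo), round(mid), round(hi))
```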
Perceived Versus Objective Breast Cancer Risk in Diverse Women
Fehniger, Julia; Livaudais-Toman, Jennifer; Karliner, Leah; Kerlikowske, Karla; Tice, Jeffrey A.; Quinn, Jessica; Ozanne, Elissa
2014-01-01
Abstract Background: Prior research suggests that women do not accurately estimate their risk for breast cancer. Estimating women's risk and informing them of it is essential for tailoring appropriate screening and risk reduction strategies. Methods: Data were collected for BreastCARE, a randomized controlled trial designed to evaluate a PC tablet-based intervention providing multiethnic women and their primary care physicians with tailored information about breast cancer risk. We included women ages 40–74 visiting general internal medicine primary care clinics at one academic practice and one safety net practice who spoke English, Spanish, or Cantonese and had no personal history of breast cancer. We collected baseline information regarding risk perception and concern. Women were categorized as high risk (vs. average risk) if their family history met criteria for referral to genetic counseling or if they were in the top 5% of risk for their age based on the Gail model or the Breast Cancer Surveillance Consortium (BCSC) breast cancer risk model. Results: Of 1,261 participants, 25% (N=314) were classified as high risk. More average risk than high risk women had correct risk perception (72% vs. 18%); 25% of both average and high risk women reported being very concerned about breast cancer. Average risk women with correct risk perception were less likely to be concerned about breast cancer (odds ratio [OR]=0.3; 95% confidence interval [CI]=0.2–0.4), while high risk women with correct risk perception were more likely to be concerned about breast cancer (OR=5.1; 95%CI=2.7–9.6). Conclusions: Many women did not accurately perceive their risk for breast cancer. Women with accurate risk perception had an appropriate level of concern about breast cancer. Improved methods of assessing and informing women of their breast cancer risk could motivate high risk women to apply appropriate prevention strategies and allay unnecessary concern among average risk women. PMID:24372085
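Odds ratios such as the OR=5.1 above come from the cross-product ratio of a 2x2 table. A minimal sketch with hypothetical cell counts, chosen only so the ratio lands near the reported value:

```python
# Sketch of the odds ratio (OR) computation behind results like OR=5.1:
# the cross-product ratio from a 2x2 table of outcome (concerned yes/no)
# by exposure (correct risk perception yes/no). Cell counts are hypothetical.

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """a,b = exposed with/without outcome; c,d = unexposed with/without."""
    return (a * d) / (b * c)

# hypothetical table: concerned / not concerned among women with vs.
# without correct risk perception
print(round(odds_ratio(30, 20, 25, 85), 2))  # 5.1
```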
Re-entry survivability and risk
NASA Astrophysics Data System (ADS)
Fudge, Michael L.
1998-11-01
This paper is the culmination of the research effort that was reported on last year while still in progress. As previously reported, statistical methods for expressing the impact risk posed to space systems in general [and the International Space Station (ISS) in particular] by other resident space objects have been examined. One of the findings of this investigation is that there are legitimate physical modeling reasons for the common statistical expression of the collision risk. A combination of statistical methods and physical modeling is also used to express the impact risk posed by reentering space systems to objects of interest (e.g., people and property) on Earth. One of the largest uncertainties in expressing this risk is the estimation of the material that survives reentry to impact Earth's surface. This point was demonstrated in dramatic fashion in January 1997 by the impact of an intact expendable launch vehicle (ELV) upper stage near a private residence in the continental United States. Since approximately half of the missions supporting ISS will utilize ELVs, it is appropriate to examine the methods used to estimate the amount and physical characteristics of ELV debris surviving reentry to impact Earth's surface. This report details reentry survivability estimation methodology, including the specific methodology used by ITT Systems' (formerly Kaman Sciences) 'SURVIVE' model. The major change to the model in the last twelve months has been the increase in the fidelity with which upper-atmospheric aerodynamics has been modeled. This has resulted in an adjustment in the factor relating the amount of kinetic energy loss to the amount of heating entering the reentering body, and has also validated and removed the necessity for certain empirically-based adjustments made to the theoretical heating expressions.
Comparisons between empirical results (observations of objects which have been recovered on Earth after surviving reentry) and SURVIVE estimates are presented for selected generic upper stage or spacecraft components, a Soyuz launch vehicle second stage, and for a Delta II launch vehicle second stage and its significant components. Significant similarity is demonstrated between the type and dispersion pattern of the recovered debris from the January 1997 Delta II 2nd stage event and the simulation of that reentry and breakup.
Adjustments of the Pesticide Risk Index Used in Environmental Policy in Flanders
Fevery, Davina; Peeters, Bob; Lenders, Sonia; Spanoghe, Pieter
2015-01-01
Indicators are used to quantify the pressure of pesticides on the environment. Pesticide risk indicators typically require weighting environmental exposure by a no-effect concentration. An indicator based on spread equivalents (ΣSeq) is used in environmental policy in Flanders (Belgium). The pesticide risk for aquatic life is estimated by weighting active ingredient usage by the ratio of their maximum allowable concentration and their soil half-life. Accurate estimates of total pesticide usage in the region are essential in such calculations. Up to 2012, the environmental impact of pesticides was estimated from sales figures provided by the Federal Government. Since 2013, pesticide use has been calculated based on results from the Farm Accountancy Data Network (FADN). The estimation of pesticide use was supplemented with data for non-agricultural use, based on sales figures of amateur use provided by industry and data obtained from public services. The Seq-indicator was modified to better reflect reality. This method was applied for the period 2009-2012 and showed differences between estimated use and sales figures of pesticides. The estimated use of pesticides based on accountancy data is more accurate compared with sales figures. This approach resulted in a better view of pesticide use and its respective environmental impact in Flanders. PMID:26046655
van Dijk, Joris D; Groothuis-Oudshoorn, Catharina G M; Marshall, Deborah A; IJzerman, Maarten J
2016-06-01
Previous studies have been inconclusive regarding the validity and reliability of preference elicitation methods. The aim of this study was to compare the metrics obtained from a discrete choice experiment (DCE) and profile-case best-worst scaling (BWS) with respect to hip replacement. We surveyed men aged 45 to 65 years from the general US population who were potentially eligible for hip replacement surgery. The survey included sociodemographic questions, eight DCE questions, and twelve BWS questions. Attributes were the probability of a first and second revision, pain relief, ability to participate in sports and perform daily activities, and length of hospital stay. Conditional logit analysis was used to estimate attribute weights, level preferences, and the maximum acceptable risk (MAR) of undergoing revision surgery in six hypothetical treatment scenarios with different attribute levels. A total of 429 (96%) respondents were included. Comparable attribute weights and level preferences were found for both BWS and DCE. Preferences were greatest for hip replacement surgery with high pain relief and the ability to participate in sports and perform daily activities. Although the estimated MARs for revision surgery followed the same trend, the MARs were systematically higher in five of the six scenarios using DCE. This study confirms previous findings that BWS and DCEs are comparable in estimating attribute weights and level preferences. However, the risk tolerance threshold based on the estimation of MAR differs between these methods, possibly leading to inconsistency in comparing treatment scenarios. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Frederiksen, Kirsten; Deltour, Isabelle; Schüz, Joachim
2012-12-10
Estimating exposure-outcome associations using laterality information on both exposure and outcome is an issue when assessing the association between mobile phone use and brain tumour risk. The exposure is localized; a potential risk is therefore expected to exist primarily on the side of the head where the phone is usually held (ipsilateral exposure), and to a lesser extent on the opposite side of the head (contralateral exposure). Several measures of the associations with ipsilateral and contralateral exposure, dealing with different sampling designs, have been presented in the literature. This paper presents a general framework for the analysis of such studies using a likelihood-based approach in a competing risks model setting. The approach clarifies the implicit assumptions required for the validity of the presented estimators, particularly that in some approaches the risk with contralateral exposure is assumed to be zero. The performance of the estimators is illustrated in a simulation study, showing for instance that while some scenarios entail a loss of statistical power, others - in the case of a positive ipsilateral exposure-outcome association - would result in a negatively biased estimate of the contralateral exposure parameter, irrespective of any additional recall bias. In conclusion, our theoretical evaluations and results from the simulation study emphasize the importance of setting up a formal model, which furthermore allows for estimation in more complicated and perhaps more realistic exposure settings, such as taking into account exposure to both sides of the head. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Seo, Bumsuk; Lee, Jihye; Kang, Sinkyu
2017-04-01
The weather-related risks in crop production are crucial not only for farmers but also for market participants and policy makers, since securing the food supply is an important issue for society. While crop growth condition and phenology are essential information about such risks, extensive observations of them are often non-existent in many parts of the world. In this study, we developed a novel integrative approach to remotely sense crop growth condition and phenology at a large scale. For corn and soybeans in Iowa and Illinois, USA (2003-2014), we assessed crop growth condition and crop phenology from Earth observation data and validated the results against United States Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) crop statistics. For growth condition, we used two distinct approaches to derive crop condition indicators: a process-based crop growth model and a satellite-NDVI-based method. Based on their pixel-wise historic distributions, we determined relative growth conditions and scaled them to the state level. For crop phenology, we calculated three phenology metrics [start of season (SOS), end of season (EOS), and peak of season (POS)] at the pixel level from the MODIS 8-day Normalized Difference Vegetation Index (NDVI). The estimates were compared with the Crop Progress and Condition (CPC) data of NASS. For condition, the state-level 10-day estimates showed moderate agreement (RMSE < 15.0%) and the average accuracy of the normal/bad year classification was good (>70%). Notably, the condition estimates captured the severe soybean disease outbreak in 2003 and the drought in 2012 for both crops. For phenology, the average RMSE of the estimates was 8.6 days across the three metrics, and the average |ME| was smaller than 1.0 day after bias correction. The proposed method enables crop growth to be evaluated at any given period and place. Global climate change is increasing risks in agricultural production, such as long-term drought. We hope that the presented remote sensing method for crop condition and phenology contributes to reducing these growing risks.
Suicidal behaviour across the African continent: a review of the literature.
Mars, Becky; Burrows, Stephanie; Hjelmeland, Heidi; Gunnell, David
2014-06-14
Suicide is a major cause of premature mortality worldwide, but data on its epidemiology in Africa, the world's second most populous continent, are limited. We systematically reviewed published literature on suicidal behaviour in African countries. We searched PubMed, Web of Knowledge, PsycINFO, African Index Medicus, Eastern Mediterranean Index Medicus and African Journals OnLine and carried out citation searches of key articles. We crudely estimated the incidence of suicide and suicide attempts in Africa based on country-specific data and compared these with published estimates. We also describe common features of suicide and suicide attempts across the studies, including information related to age, sex, methods used and risk factors. Regional or national suicide incidence data were available for less than one third (16/53) of African countries containing approximately 60% of Africa's population; suicide attempt data were available for <20% of countries (7/53). Crude estimates suggest there are over 34,000 (inter-quartile range 13,141 to 63,757) suicides per year in Africa, with an overall incidence rate of 3.2 per 100,000 population. The recent Global Burden of Disease (GBD) estimate of 49,558 deaths is somewhat higher, but falls within the inter-quartile range of our estimate. Suicide rates in men are typically at least three times higher than in women. The most frequently used methods of suicide are hanging and pesticide poisoning. Reported risk factors are similar for suicide and suicide attempts and include interpersonal difficulties, mental and physical health problems, socioeconomic problems and drug and alcohol use/abuse. Qualitative studies are needed to identify additional culturally relevant risk factors and to understand how risk factors may be connected to suicidal behaviour in different socio-cultural contexts. Our estimate is somewhat lower than GBD, but still clearly indicates suicidal behaviour is an important public health problem in Africa. 
More regional studies, in both urban and rural areas, are needed to more accurately estimate the burden of suicidal behaviour across the continent. Qualitative studies are required in addition to quantitative studies.
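The crude incidence rate quoted in the review above is simply annual cases divided by population at risk, scaled to 100,000. A minimal sketch; the ~1.06 billion population figure is an illustrative assumption implied by the reported numbers, not a figure from the review:

```python
# Sketch of the crude incidence-rate arithmetic used in the review:
# annual cases divided by population at risk, expressed per 100,000.
# The population figure is an illustrative assumption (~1.06 billion).

def rate_per_100k(cases: float, population: float) -> float:
    return cases / population * 100_000

print(round(rate_per_100k(34_000, 1.06e9), 1))  # 3.2, matching the reported rate
```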
Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.
Avramenko, M; Bolyatko, V; Kosterev, V
2005-01-01
Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed taking into account the different natures of risk-affecting parameters (probabilistic and not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.
Breast density estimation from high spectral and spatial resolution MRI
Li, Hui; Weiss, William A.; Medved, Milica; Abe, Hiroyuki; Newstead, Gillian M.; Karczmar, Gregory S.; Giger, Maryellen L.
2016-01-01
Abstract. A three-dimensional breast density estimation method is presented for high spectral and spatial resolution (HiSS) MR imaging. Twenty-two patients were recruited (under an Institutional Review Board-approved, Health Insurance Portability and Accountability Act-compliant protocol) for high-risk breast cancer screening. Each patient received standard-of-care clinical digital x-ray mammograms and MR scans, as well as HiSS scans. The algorithm for breast density estimation includes breast mask generation, breast skin removal, and percentage breast density calculation. The inter- and intra-user variabilities of the HiSS-based density estimation were determined using correlation analysis and limits of agreement. Correlation analysis was also performed between the HiSS-based density estimation and radiologists' breast imaging-reporting and data system (BI-RADS) density ratings. A correlation coefficient of 0.91 (p<0.0001) was obtained between left and right breast density estimations. An intraclass correlation coefficient of 0.99 (p<0.0001) indicated high reliability for the inter-user variability of the HiSS-based breast density estimations. A moderate correlation coefficient of 0.55 (p=0.0076) was observed between HiSS-based breast density estimations and radiologists' BI-RADS ratings. In summary, an objective density estimation method using HiSS spectral data from breast MRI was developed. The high reproducibility, with low inter- and intra-user variabilities, shown in this preliminary study suggests that such a HiSS-based density metric may be beneficial in programs requiring breast density, such as breast cancer risk assessment and monitoring effects of therapy. PMID:28042590
Cancer incidence attributable to insufficient fibre consumption in Alberta in 2012
Grundy, Anne; Poirier, Abbey E.; Khandwala, Farah; McFadden, Alison; Friedenreich, Christine M.; Brenner, Darren R.
2017-01-01
Background: Insufficient fibre consumption has been associated with an increased risk of colorectal cancer. The purpose of this study was to estimate the proportion and absolute number of cancers in Alberta that could be attributed to insufficient fibre consumption in 2012. Methods: The number and proportion of colorectal cancers in Alberta attributable to insufficient fibre consumption were estimated using the population attributable risk. Relative risks were obtained from the World Cancer Research Fund's 2011 Continuous Update Project on colorectal cancer, and the prevalence of insufficient fibre consumption (< 23 g/d) was estimated using dietary data from Alberta's Tomorrow Project. Age- and sex-specific colorectal cancer incidence data for 2012 were obtained from the Alberta Cancer Registry. Results: Between 66% and 67% of men and between 73% and 78% of women reported a diet with insufficient fibre consumption. Population attributable risk estimates for colorectal cancer were marginally higher in men, ranging from 6.3% to 6.8% across age groups, whereas in women they ranged from 5.0% to 5.5%. Overall, 6.0% of colorectal cancers, or 0.7% of all cancers, in Alberta in 2012 were estimated to be attributable to insufficient fibre consumption. Interpretation: Insufficient fibre consumption accounted for 6.0% of colorectal cancers in Alberta in 2012. Increasing fibre consumption in Alberta has the potential to reduce the future burden of colorectal cancer in the province. PMID:28401112
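The population attributable risk (PAR) used above follows Levin's formula, PAR = p(RR − 1) / (1 + p(RR − 1)), where p is the exposure prevalence and RR the relative risk. A minimal sketch with illustrative values, not the study's inputs:

```python
# Sketch of Levin's population attributable risk (PAR) formula underlying
# estimates like those above: PAR = p(RR - 1) / (1 + p(RR - 1)), with p the
# prevalence of insufficient fibre intake and RR the relative risk.
# The numeric inputs are illustrative assumptions.

def par(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g. ~70% prevalence and a modest RR give a PAR in the reported 5-7% range
print(round(par(0.70, 1.09) * 100, 1))  # 5.9
```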
Calleja, Jesus Maria Garcia; Zhao, Jinkou; Reddy, Amala; Seguy, Nicole
2014-01-01
Problem Size estimates of key populations at higher risk of HIV exposure are recognized as critical for understanding the trajectory of the HIV epidemic and planning and monitoring an effective response, especially for countries with concentrated and low epidemics such as those in Asia. Context To help countries estimate population sizes of key populations, global guidelines were updated in 2011 to reflect new technical developments and recent field experiences in applying these methods. Action In September 2013, a meeting of programme managers and experts experienced with population size estimates (PSE) for key populations was held for 13 Asian countries. This article summarizes the key results presented, shares practical lessons learnt and reviews the methodological approaches from implementing PSE in 13 countries. Lessons learnt It is important to build capacity to collect, analyse and use PSE data; establish a technical review group; and implement a transparent, well documented process. Countries should adapt global PSE guidelines and maintain operational definitions that are more relevant and useable for country programmes. Development of methods for non-venue-based key populations requires more investment and collaborative efforts between countries and among partners. PMID:25320676
Peykari, Niloofar; Sepanlou, Sadaf Ghajarieh; Djalalinia, Shirin; Kasaeian, Amir; Parsaeian, Mahboubeh; Ahmadvand, Alireza; Koohpayehzadeh, Jalil; Damari, Behzad; Jamshidi, Hamid Reza; Larijani, Bagher; Farzadfar, Farshad
2014-01-01
Non-communicable diseases (NCDs) and their risk factors are major public health problems. There are some documented trend and point estimates of metabolic risk factors (MRFs) for the Iranian population, but there is little information about their exposure distribution at the sub-national level and no information about their trends or their effects on population health. The present study protocol aims to provide the standard structure, definitions, organization, data sources, methods of data gathering or generation, and data on trend analysis of the metabolic risk factors in the NASBOD study. We will estimate 1990 to 2013 trends in prevalence, years of life lost due to premature mortality (YLLs), years lived with disability (YLDs), and disability-adjusted life years (DALYs) for MRFs by gender, age group, and province. We will also quantify the uncertainty intervals for the estimates of interest. The findings of the study could provide practical information regarding metabolic risk factors and their burden, supporting better health policy to reduce the burden of disease and to plan cost-effective preventive strategies. The results could also be used for future complementary global, regional, national, and sub-national studies.
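The DALY aggregation named in the protocol is the sum of YLLs and YLDs, with YLD commonly computed as cases times a disability weight times average duration. A minimal sketch with made-up inputs; real GBD-style calculations add age-, sex-, and severity-specific detail:

```python
# Sketch of the DALY aggregation: disability-adjusted life years are the
# sum of years of life lost (YLL) and years lived with disability (YLD).
# All numeric inputs below are illustrative assumptions.

def yld(cases: float, disability_weight: float, avg_duration_years: float) -> float:
    """Years lived with disability, weighted by severity and duration."""
    return cases * disability_weight * avg_duration_years

def dalys(yll: float, yld_value: float) -> float:
    return yll + yld_value

print(dalys(1200.0, yld(5000, 0.05, 2.0)))  # 1200 YLL + 500 YLD = 1700.0
```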
Aid decision algorithms to estimate the risk in congenital heart surgery.
Ruiz-Fernández, Daniel; Monsalve Torra, Ana; Soriano-Payá, Antonio; Marín-Alonso, Oscar; Triana Palencia, Eddy
2016-04-01
In this paper, we have tested the suitability of using different artificial intelligence-based algorithms for decision support when classifying the risk of congenital heart surgery. Classification of surgical risk provides enormous benefits, such as the a priori estimation of surgical outcomes depending on the type of disease, the type of repair, and other elements that influence the final result. This preventive estimation may help to avoid future complications, or even death. We evaluated four machine learning algorithms to achieve our objective: multilayer perceptron, self-organizing map, radial basis function networks, and decision trees. The architectures implemented have the aim of classifying among three types of surgical risk: low complexity, medium complexity, and high complexity. Accuracy outcomes achieved ranged between 80% and 99%, with the multilayer perceptron offering the highest hit ratio. According to the results, it is feasible to develop a clinical decision support system using the evaluated algorithms. Such a system would help cardiology specialists, paediatricians and surgeons to forecast the level of risk related to congenital heart disease surgery. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
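As a hedged illustration of the three-class task described above (not the paper's neural-network methods), a nearest-centroid classifier over two hypothetical numeric features shows the shape of such a decision rule:

```python
# Toy illustration of three-level surgical risk classification via nearest
# centroid. This is NOT the paper's method (which used MLPs, SOMs, RBF
# networks, and decision trees); features and centroids are made up.

def nearest_centroid(x, centroids):
    """Return the label whose centroid is closest to feature vector x."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

centroids = {
    "low":    (0.2, 0.1),  # hypothetical (lesion complexity, comorbidity score)
    "medium": (0.5, 0.5),
    "high":   (0.9, 0.8),
}
print(nearest_centroid((0.85, 0.75), centroids))  # high
```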
Evaluating changes to reservoir rule curves using historical water-level data
Mower, Ethan; Miranda, Leandro E.
2013-01-01
Flood control reservoirs are typically managed through rule curves (i.e. target water levels) which control the storage and release timing of flood waters. Changes to rule curves are often contemplated and requested by various user groups and management agencies with no information available about the actual flood risk of such requests. Methods of estimating flood risk in reservoirs are not easily available to those unfamiliar with hydrological models that track water movement through a river basin. We developed a quantile regression model that uses readily available daily water-level data to estimate risk of spilling. Our model provided a relatively simple process for estimating the maximum applicable water level under a specific flood risk for any day of the year. This water level represents an upper-limit umbrella under which water levels can be operated in a variety of ways. Our model allows the visualization of water-level management under a user-specified flood risk and provides a framework for incorporating the effect of a changing environment on water-level management in reservoirs, but is not designed to replace existing hydrological models. The model can improve communication and collaboration among agencies responsible for managing natural resources dependent on reservoir water levels.
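The core idea of the quantile model above can be illustrated with an empirical quantile: the water level not exceeded on a given fraction of historical days bounds the spill risk. A minimal sketch with hypothetical daily levels; the paper fits this quantile smoothly as a function of day of year rather than pooling all days:

```python
# Sketch of the idea behind the quantile-regression flood-risk model: the
# q-quantile of historical daily water levels gives the maximum level
# consistent with a (1 - q) exceedance risk. Here a simple nearest-rank
# empirical quantile over a hypothetical pooled series stands in for the
# day-of-year quantile regression described in the abstract.

def empirical_quantile(values, q):
    xs = sorted(values)
    k = max(0, min(len(xs) - 1, int(q * len(xs))))  # nearest-rank index
    return xs[k]

levels = [71.2, 71.5, 70.9, 72.3, 71.8, 73.0, 72.6, 71.1, 72.0, 71.7]
# level not exceeded on ~90% of historical days -> ~10% spill-risk ceiling
print(empirical_quantile(levels, 0.9))  # 73.0
```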
Quantifying Soiling Loss Directly From PV Yield
Deceglie, Michael G.; Micheli, Leonardo; Muller, Matthew
2018-01-23
Soiling of photovoltaic (PV) panels is typically quantified through the use of specialized sensors. Here, we describe and validate a method for estimating soiling loss experienced by PV systems directly from system yield, without the need for precipitation data. The method, termed the stochastic rate and recovery (SRR) method, automatically detects soiling intervals in a dataset, then stochastically generates a sample of possible soiling profiles based on the observed characteristics of each interval. In this paper, we describe the method, validate it against soiling station measurements, and compare it with other PV-yield-based soiling estimation methods. The broader application of the SRR method will enable fleet-scale assessment of soiling loss to facilitate mitigation planning and risk assessment.
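The interval-detection step of the SRR method can be sketched as a scan for declining runs in a normalized performance index that end with a sharp recovery (cleaning or rain). This is a toy approximation under assumed thresholds, not the published algorithm:

```python
# Minimal sketch of soiling-interval detection in the spirit of the SRR
# method: find runs of decline in a daily performance index that end with
# a sharp upward jump (a cleaning/rain event). The series and the
# recovery threshold are illustrative assumptions.

def soiling_intervals(perf, recovery_jump=0.03):
    intervals, start = [], None
    for i in range(1, len(perf)):
        if perf[i] < perf[i - 1]:            # still declining
            if start is None:
                start = i - 1
        elif start is not None and perf[i] - perf[i - 1] >= recovery_jump:
            intervals.append((start, i - 1))  # decline ended with a recovery
            start = None
        else:
            start = None                      # flat or small rise: not a recovery
    return intervals

perf = [1.00, 0.99, 0.97, 0.96, 0.94, 1.00, 0.99, 0.98, 1.00]
print(soiling_intervals(perf))  # [(0, 4)]
```

Each detected interval would then feed the stochastic rate-sampling step described in the abstract.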