Rauch, Geraldine; Brannath, Werner; Brückner, Matthias; Kieser, Meinhard
2018-05-01
In many clinical trial applications, the endpoint of interest corresponds to a time-to-event endpoint. In this case, group differences are usually expressed by the hazard ratio. Group differences are commonly assessed by the logrank test, which is optimal under the proportional hazard assumption. However, there are many situations in which this assumption is violated. Especially in applications where a full population and several subgroups or a composite time-to-first-event endpoint and several components are considered, the proportional hazard assumption usually does not simultaneously hold true for all test problems under investigation. As an alternative effect measure, Kalbfleisch and Prentice proposed the so-called 'average hazard ratio'. The average hazard ratio is based on a flexible weighting function to modify the influence of time and has a meaningful interpretation even in the case of non-proportional hazards. Despite this favorable property, it is hardly ever used in practice, whereas the standard hazard ratio is commonly reported in clinical trials regardless of whether the proportional hazard assumption holds true or not. There exist two main approaches to construct corresponding estimators and tests for the average hazard ratio, where the first relies on weighted Cox regression and the second on a simple plug-in estimator. The aim of this work is to give a systematic comparison of these two approaches and the standard logrank test for different time-to-event settings with proportional and non-proportional hazards and to illustrate the pros and cons in application. We conduct a systematic comparative study based on Monte-Carlo simulations and on a real clinical trial example. Our results suggest that the properties of the average hazard ratio depend on the underlying weighting function. The two approaches to construct estimators and related tests show very similar performance for adequately chosen weights. In general, the average hazard ratio defines a more valid effect measure than the standard hazard ratio under non-proportional hazards, and the corresponding tests provide a power advantage over the common logrank test. As non-proportional hazards are often met in clinical practice and the average hazard ratio tests often outperform the common logrank test, this approach should be used more routinely in applications. Schattauer GmbH.
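The following R sketch is an illustration only, not the estimators studied in this paper: it simulates two Weibull arms with non-proportional hazards, fits a standard Cox model and logrank test, and computes a weighted average of the true time-varying hazard ratio. The S1(t)S2(t) weight and the simplified averaging are assumptions made for illustration; the Kalbfleisch-Prentice average hazard ratio is defined differently in detail.

```r
library(survival)
set.seed(1)

n <- 200
t_ctl <- rweibull(n, shape = 0.8, scale = 6)   # control arm
t_trt <- rweibull(n, shape = 1.4, scale = 6)   # treatment arm (non-proportional)
time   <- c(t_ctl, t_trt)
status <- rep(1, 2 * n)                        # no censoring, for simplicity
group  <- rep(0:1, each = n)

fit <- coxph(Surv(time, status) ~ group)
exp(coef(fit))                          # the "standard" hazard ratio
cox.zph(fit)                            # proportional hazards check
survdiff(Surv(time, status) ~ group)    # common logrank test

# True time-varying hazard ratio h_trt(t)/h_ctl(t) and an assumed weighted
# average of it, weighted here by S_ctl(t) * S_trt(t).
haz <- function(t, shape, scale)
  dweibull(t, shape, scale) / pweibull(t, shape, scale, lower.tail = FALSE)
hr <- function(t) haz(t, 1.4, 6) / haz(t, 0.8, 6)
w  <- function(t) pweibull(t, 0.8, 6, lower.tail = FALSE) *
                  pweibull(t, 1.4, 6, lower.tail = FALSE)
avg_hr <- integrate(function(t) hr(t) * w(t), 0.01, 30)$value /
          integrate(w, 0.01, 30)$value
avg_hr
```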
2009-01-01
Background During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects. Both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain). In that case one may derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods Cubic splines were used to smooth data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios. The model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain Catalan estimations. Results We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia. PMID:19331670
Two models for evaluating landslide hazards
Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.
2006-01-01
Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards. © 2006.
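As a rough illustration of the logistic-discriminant side of this comparison, the hedged R sketch below fits a logistic regression to hypothetical grid-cell data; every column name and value is made up, and the rank-based "relative hazard" is only one way to read the proportion-of-map-area interpretation described above.

```r
set.seed(42)
cells <- data.frame(
  landslide = rbinom(500, 1, 0.2),                      # 1 = mapped landslide
  elevation = runif(500, 250, 350),                     # m, from a 30-m DEM
  slope     = runif(500, 0, 25),                        # slope angle, degrees
  aspect    = runif(500, 0, 360),                       # slope aspect, degrees
  lithology = factor(sample(1:4, 500, replace = TRUE))  # coded bedrock units
)

# Multivariate logistic discriminant side of the comparison.
fit <- glm(landslide ~ elevation + slope + aspect + lithology,
           data = cells, family = binomial)
summary(fit)

# Fraction of cells with a higher predicted probability than a given cell,
# i.e. a relative hazard expressed as a proportion of total map area.
p <- predict(fit, type = "response")
cells$relative_hazard <- rank(-p) / length(p)
head(cells)
```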
Gaw, Sally; Brooks, Bryan W
2016-04-01
Pharmaceuticals are ubiquitous contaminants in aquatic ecosystems. Adaptive monitoring, assessment, and management programs will be required to reduce the environmental hazards of pharmaceuticals of concern. Potentially underappreciated factors that drive the environmental dose of pharmaceuticals include regulatory approvals, marketing campaigns, pharmaceutical subsidies and reimbursement schemes, and societal acceptance. Sales data for 5 common antidepressants (duloxetine [Cymbalta], escitalopram [Lexapro], venlafaxine [Effexor], bupropion [Wellbutrin], and sertraline [Zoloft]) in the United States from 2004 to 2008 were modeled to explore how environmental hazards in aquatic ecosystems changed after patents were obtained or expired. Therapeutic hazard ratios for Effexor and Lexapro did not exceed 1; however, the therapeutic hazard ratio for Zoloft declined whereas the therapeutic hazard ratio for Cymbalta increased as a function of patent protection and sale patterns. These changes in therapeutic hazard ratios highlight the importance of considering current and future drivers of pharmaceutical use when prioritizing pharmaceuticals for water quality monitoring programs. When urban systems receiving discharges of environmental contaminants are examined, water quality efforts should identify, prioritize, and select target analytes presently in commerce for effluent monitoring and surveillance. © 2015 SETAC.
Delayed seizures after intracerebral haemorrhage
Rattani, Abbas; Anderson, Christopher D.; Ayres, Alison M.; Gurol, Edip M.; Greenberg, Steven M.; Rosand, Jonathan; Viswanathan, Anand
2016-01-01
Late seizures after intracerebral haemorrhage occur after the initial acute haemorrhagic insult subsides, and represent one of its most feared long-term sequelae. Both susceptibility to late seizures and their functional impact remain poorly characterized. We sought to: (i) compare patients with new-onset late seizures (i.e. delayed seizures) with those who experienced a recurrent late seizure following an immediately post-haemorrhagic seizure; and (ii) investigate the effect of late seizures on long-term functional performance after intracerebral haemorrhage. We performed prospective longitudinal follow-up of consecutive intracerebral haemorrhage survivors presenting to a single tertiary care centre. We tested for association with seizures the following neuroimaging and genetic markers of cerebral small vessel disease: APOE variants ε2/ε4, computed tomography-defined white matter disease, magnetic resonance imaging-defined white matter hyperintensities volume and cerebral microbleeds. Cognitive performance was measured using the Modified Telephone Interview for Cognitive Status, and functional performance using structured questionnaires obtained every 6 months. We performed time-to-event analysis using separate Cox models for the risk of developing delayed and recurrent seizures, as well as for functional decline risk (mortality, incident dementia, and loss of functional independence) after intracerebral haemorrhage. A total of 872 survivors of intracerebral haemorrhage were enrolled and followed for a median of 3.9 years. Early seizures developed in 86 patients, 42 of whom went on to experience recurrent seizures. Admission Glasgow Coma Scale, increasing haematoma volume and cortical involvement were associated with recurrent seizure risk (all P < 0.01). Recurrent seizures were not associated with long-term functional outcome (P = 0.67). Delayed seizures occurred in 37 patients, corresponding to an estimated incidence of 0.8% per year (95% confidence interval 0.5–1.2%). Factors associated with delayed seizures included cortical involvement on index haemorrhage (hazard ratio 1.63, P = 0.036), pre-haemorrhage dementia (hazard ratio 1.36, P = 0.044), history of multiple prior lobar haemorrhages (hazard ratio 2.50, P = 0.038), exclusively lobar microbleeds (hazard ratio 2.22, P = 0.008) and presence of ≥ 1 APOE ε4 copies (hazard ratio 1.95, P = 0.020). Delayed seizures were associated with worse long-term functional outcome (hazard ratio 1.83, P = 0.005), but the association was no longer present after adjusting for neuroimaging and genetic markers of cerebral small vessel disease. Delayed seizures after intracerebral haemorrhage are associated with different risk factors when compared with recurrent seizures. They are also associated with worse functional outcome, but this finding appears to be related to underlying small vessel disease. Further investigations into the connections between small vessel disease and delayed seizures are warranted. PMID:27497491
Confidence intervals for the first crossing point of two hazard functions.
Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng
2009-12-01
The phenomenon of crossing hazard rates is common in clinical trials with time-to-event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing hazards alternative. However, there have been relatively few approaches available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazards modeling with a Box-Cox transformation of the time to event, a nonparametric procedure using a kernel smoothing estimate of the hazard ratio is proposed. Both procedures are evaluated by Monte-Carlo simulations and applied to two clinical trial datasets.
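A minimal R sketch of the crossing-point idea, assuming two known Weibull hazards rather than hazards estimated from data as in the paper's procedures:

```r
# Hazard of a Weibull(shape, scale) distribution.
haz <- function(t, shape, scale)
  dweibull(t, shape, scale) / pweibull(t, shape, scale, lower.tail = FALSE)

h1 <- function(t) haz(t, shape = 0.8, scale = 5)   # decreasing hazard
h2 <- function(t) haz(t, shape = 1.5, scale = 5)   # increasing hazard

# First crossing point = first root of the log hazard ratio.
crossing <- uniroot(function(t) log(h2(t) / h1(t)),
                    interval = c(0.01, 20))$root
crossing

# With hazards estimated from data (e.g. by kernel smoothing), repeating this
# root search over bootstrap resamples would yield an interval estimate in the
# spirit of the nonparametric procedure proposed in the paper.
```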
Semi-parametric regression model for survival data: graphical visualization with R
2016-01-01
The Cox proportional hazards model is a semi-parametric model that leaves its baseline hazard function unspecified. The rationale for using the Cox proportional hazards model is that (I) specifying the underlying form of the hazard function is stringent and unrealistic, and (II) researchers are often only interested in estimating how the hazard changes with covariates (the relative hazard). The Cox regression model can easily be fitted with the coxph() function in the survival package. A stratified Cox model may be used for a covariate that violates the proportional hazards assumption. The relative importance of covariates in the population can be examined with the rankhazard package in R. Hazard ratio curves for continuous covariates can be visualized using the smoothHR package; these curves help to better understand the effect that each continuous covariate has on the outcome. The population attributable fraction is a classic quantity in epidemiology for evaluating the impact of a risk factor on the occurrence of an event in the population. In survival analysis, the adjusted/unadjusted attributable fraction can be plotted against survival time to obtain the attributable fraction function. PMID:28090517
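A minimal R sketch of the core workflow described above (Cox fit, proportional hazards check, stratified Cox), using the survival package's lung data as a stand-in example; the rankhazard and smoothHR plotting calls mentioned in the abstract are not reproduced here.

```r
library(survival)

# Semi-parametric Cox model: baseline hazard left unspecified; exp(coef)
# in the summary is the hazard ratio for each covariate.
fit <- coxph(Surv(time, status) ~ age + sex + ph.ecog, data = lung)
summary(fit)
cox.zph(fit)   # Schoenfeld-residual test of the proportional hazards assumption

# Stratified Cox model: a separate baseline hazard per stratum for a covariate
# that violates proportional hazards (sex is used here only as an example).
fit_strat <- coxph(Surv(time, status) ~ age + ph.ecog + strata(sex), data = lung)
summary(fit_strat)
```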
Batty, G David; Deary, Ian J; Zaninotto, Paola
2016-02-01
We examined the little-tested associations between general cognitive function in middle and older age and later risk of death from chronic diseases. In the English Longitudinal Study of Ageing (2002-2012), 11,391 study participants who were 50-100 years of age at study induction underwent a battery of cognitive tests and provided a range of collateral data. In an analytical sample of 9,204 people (4,982 women), there were 1,488 deaths during follow-up (mean duration, 9.0 years). When we combined scores from 4 cognition tests that represented 3 acknowledged key domains of cognitive functioning (memory, executive function, and processing speed), cognition was inversely associated with deaths from cancer (per each 1-standard-deviation decrease in general cognitive function score, hazard ratio = 1.21, 95% CI: 1.10, 1.33), cardiovascular disease (hazard ratio = 1.71, 95% CI: 1.55, 1.89), other causes (hazard ratio = 2.07, 95% CI: 1.79, 2.40), and respiratory illness (hazard ratio = 2.48, 95% CI: 2.12, 2.90). Controlling for a range of covariates, such as health behaviors and socioeconomic status, and left-censoring to explore reverse causality had very little impact on the strength of these relationships. These findings indicate that cognitive test scores can provide relatively simple indicators of the risk of death from an array of chronic diseases and that these associations appear to be independent of other commonly assessed risk factors. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Marui, Akira; Kimura, Takeshi; Nishiwaki, Noboru; Mitsudo, Kazuaki; Komiya, Tatsuhiko; Hanyu, Michiya; Shiomi, Hiroki; Tanaka, Shiro; Sakata, Ryuzo
2014-10-01
Coronary heart disease is a major risk factor for left ventricular (LV) systolic dysfunction. However, limited data are available regarding long-term benefits of percutaneous coronary intervention (PCI) in the era of drug-eluting stent or coronary artery bypass grafting (CABG) in patients with LV systolic dysfunction with severe coronary artery disease. We identified 3,584 patients with 3-vessel and/or left main disease of 15,939 patients undergoing first myocardial revascularization enrolled in the CREDO-Kyoto PCI/CABG Registry Cohort-2. Of them, 2,676 patients had preserved LV systolic function, defined as an LV ejection fraction (LVEF) of >50% and 908 had impaired LV systolic function (LVEF≤50%). In patients with preserved LV function, 5-year outcomes were not different between PCI and CABG regarding propensity score-adjusted risk of all-cause and cardiac deaths. In contrast, in patients with impaired LV systolic function, the risks of all-cause and cardiac deaths after PCI were significantly greater than those after CABG (hazard ratio 1.49, 95% confidence interval 1.04 to 2.14, p=0.03 and hazard ratio 2.39, 95% confidence interval 1.43 to 3.98, p<0.01). In both patients with moderate (35%
Mace Firebaugh, Casey; Moyes, Simon; Jatrana, Santosh; Rolleston, Anna; Kerse, Ngaire
2018-01-18
The relationship between physical activity, function, and mortality is not established in advanced age. Physical activity, function, and mortality were followed in a cohort of Māori and non-Māori adults living in advanced age for a period of six years. Generalised linear regression models were used to analyse the association between physical activity and NEADL, while Kaplan-Meier survival analysis and Cox proportional hazards models were used to assess the association between physical activity and mortality. The hazard ratio for mortality for those in the least active physical activity quartile was 4.1 for Māori and 1.8 for non-Māori compared with the most active physical activity quartile. There was an inverse relationship between physical activity and mortality, with lower hazard ratios for mortality at all levels of physical activity. Higher levels of physical activity were associated with lower mortality and higher functional status in adults of advanced age.
Shaikh, Amir Y; Wang, Na; Yin, Xiaoyan; Larson, Martin G; Vasan, Ramachandran S; Hamburg, Naomi M; Magnani, Jared W; Ellinor, Patrick T; Lubitz, Steven A; Mitchell, Gary F; Benjamin, Emelia J; McManus, David D
2016-09-01
The relations of measures of arterial stiffness, pulsatile hemodynamic load, and endothelial dysfunction to atrial fibrillation (AF) remain poorly understood. To better understand the pathophysiology of AF, we examined associations between noninvasive measures of vascular function and new-onset AF. The study sample included participants aged ≥45 years from the Framingham Heart Study offspring and third-generation cohorts. Using Cox proportional hazards regression models, we examined relations between incident AF and tonometry measures of arterial stiffness (carotid-femoral pulse wave velocity), wave reflection (augmentation index), pressure pulsatility (central pulse pressure), endothelial function (flow-mediated dilation), resting brachial arterial diameter, and hyperemic flow. AF developed in 407/5797 participants in the tonometry sample and 270/3921 participants in the endothelial function sample during follow-up (median 7.1 years, maximum 10 years). Higher augmentation index (hazard ratio, 1.16; 95% confidence interval, 1.02-1.32; P=0.02), baseline brachial artery diameter (hazard ratio, 1.20; 95% confidence interval, 1.01-1.43; P=0.04), and lower flow-mediated dilation (hazard ratio, 0.79; 95% confidence interval, 0.63-0.99; P=0.04) were associated with increased risk of incident AF. Central pulse pressure, when adjusted for age, sex, and hypertension (hazard ratio, 1.14; 95% confidence interval, 1.02-1.28; P=0.02) was associated with incident AF. Higher pulsatile load assessed by central pulse pressure and greater apparent wave reflection measured by augmentation index were associated with increased risk of incident AF. Vascular endothelial dysfunction may precede development of AF. These measures may be additional risk factors or markers of subclinical cardiovascular disease associated with increased risk of incident AF. © 2016 American Heart Association, Inc.
Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido
2016-09-01
Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
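A minimal R sketch of the constant cause-specific-hazards working model described above; the hazard values, follow-up time, and two-event setting are assumptions made only for illustration.

```r
# Cumulative incidence of event type k under constant cause-specific hazards:
# CIF_k(t) = (lambda_k / lambda_all) * (1 - exp(-lambda_all * t))
cif <- function(t, lambda_k, lambda_all)
  (lambda_k / lambda_all) * (1 - exp(-lambda_all * t))

csh_trt <- c(0.06, 0.04)   # treatment arm: event type 1, event type 2 (assumed)
csh_ctl <- c(0.10, 0.03)   # control arm (assumed)
t <- 5                     # follow-up time (assumed)

cif_ratio_1 <- cif(t, csh_trt[1], sum(csh_trt)) / cif(t, csh_ctl[1], sum(csh_ctl))
cif_ratio_2 <- cif(t, csh_trt[2], sum(csh_trt)) / cif(t, csh_ctl[2], sum(csh_ctl))
hr_1 <- csh_trt[1] / csh_ctl[1]   # cause-specific hazard ratio for event type 1

# Unlike the hazard ratio, the CIF ratios change with the chosen follow-up time.
c(cif_ratio_1 = cif_ratio_1, cif_ratio_2 = cif_ratio_2, hr_1 = hr_1)
```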
Habibi, Mohammadali; Samiei, Sanaz; Ambale Venkatesh, Bharath; Opdahl, Anders; Helle-Valle, Thomas M; Zareian, Mytra; Almeida, Andre L C; Choi, Eui-Young; Wu, Colin; Alonso, Alvaro; Heckbert, Susan R; Bluemke, David A; Lima, João A C
2016-08-01
Early detection of structural changes in the left atrium (LA) before atrial fibrillation (AF) development could be helpful in identifying those at higher risk for AF. Using cardiac magnetic resonance imaging, we examined the association of LA volume and function with incident AF in a multiethnic population free of clinical cardiovascular diseases. In a case-cohort study embedded in MESA (Multi-Ethnic Study of Atherosclerosis), baseline LA size and function assessed by cardiac magnetic resonance feature-tracking were compared between 197 participants with incident AF and 322 participants randomly selected from the whole MESA cohort. Participants were followed up for 8 years. Incident AF cases had a larger LA volume and decreased passive, active, and total LA emptying fractions and peak global LA longitudinal strain (peak LA strain) at baseline. In multivariable analysis, elevated LA maximum volume index (hazard ratio, 1.38 per SD; 95% confidence interval, 1.01-1.89), decreased peak LA strain (hazard ratio, 0.68 per SD; 95% confidence interval, 0.48-0.96), and decreased passive and total LA emptying fractions (hazard ratio for passive LA emptying fraction, 0.55 per SD; 95% confidence interval, 0.40-0.75, and hazard ratio for total LA emptying fraction, 0.70 per SD; 95% confidence interval, 0.52-0.95), but not active LA emptying fraction, were associated with incident AF. Elevated LA volumes and decreased passive and total LA emptying fractions were independently associated with incident AF in an asymptomatic multiethnic population. Including LA functional variables along with other risk factors of AF may help to better risk stratify individuals at risk of AF development. © 2016 American Heart Association, Inc.
Molnar, Amber O; Eddeen, Anan Bader; Ducharme, Robin; Garg, Amit X; Harel, Ziv; McCallum, Megan K; Perl, Jeffrey; Wald, Ron; Zimmerman, Deborah; Sood, Manish M
2017-07-06
Early evidence suggests proteinuria is independently associated with incident atrial fibrillation (AF). We sought to investigate whether the association of proteinuria with incident AF is altered by kidney function. Retrospective cohort study using administrative healthcare databases in Ontario, Canada (2002-2015). A total of 736 666 patients aged ≥40 years not receiving dialysis and with no previous history of AF were included. Proteinuria was defined using the urine albumin-to-creatinine ratio (ACR) and kidney function by the estimated glomerular filtration rate (eGFR). The primary outcome was time to AF. Cox proportional hazards models were used to determine the hazard ratio for AF censored for death, dialysis, kidney transplant, or end of follow-up. Fine and Gray models were used to determine the subdistribution hazard ratio for AF, with death as a competing event. Median follow-up was 6 years, and 44 809 patients developed AF. In adjusted models, ACR and eGFR were associated with AF (P<0.0001). The association of proteinuria with AF differed based on kidney function (ACR × eGFR interaction, P<0.0001). Overt proteinuria (ACR, 120 mg/mmol) was associated with greater AF risk in patients with intact (eGFR, 120) versus reduced (eGFR, 30) kidney function (adjusted hazard ratios, 4.5 [95% CI, 4.0-5.1] and 2.6 [95% CI, 2.4-2.8], respectively; referent ACR 0 and eGFR 120). Results were similar in competing risk analyses. Proteinuria increases the risk of incident AF markedly in patients with intact kidney function compared with those with decreased kidney function. Screening and preventative strategies should consider proteinuria as an independent risk factor for AF. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Jafri, Nazia F; Newitt, David C; Kornak, John; Esserman, Laura J; Joe, Bonnie N; Hylton, Nola M
2014-08-01
To evaluate optimal contrast kinetics thresholds for measuring functional tumor volume (FTV) by breast magnetic resonance imaging (MRI) for assessment of recurrence-free survival (RFS). In this Institutional Review Board (IRB)-approved retrospective study of 64 patients (ages 29-72, median age 48.6) undergoing neoadjuvant chemotherapy (NACT) for breast cancer, all patients underwent pre-chemotherapy (MRI1) and post-chemotherapy (MRI4) breast MRI. Tumor was defined as voxels meeting thresholds for early percent enhancement (PEthresh) and early-to-late signal enhancement ratio (SERthresh), and FTV(PEthresh, SERthresh) was computed by summing all voxels meeting the threshold criteria and minimum connectivity requirements. Ranges of PEthresh from 50% to 220% and SERthresh from 0.0 to 2.0 were evaluated. A Cox proportional hazards model determined associations between change in FTV over treatment and RFS at different PE and SER thresholds. The plot of hazard ratios for change in FTV from MRI1 to MRI4 showed a broad peak, with the maximum hazard ratio and highest significance occurring at a PE threshold of 70% and an SER threshold of 1.0 (hazard ratio = 8.71, 95% confidence interval 2.86-25.5, P < 0.00015), indicating optimal model fit. Enhancement thresholds affect the ability of MRI tumor volume to predict RFS. The value is robust over a wide range of thresholds, supporting the use of FTV as a biomarker. © 2013 Wiley Periodicals, Inc.
Felker, G Michael; Fiuzat, Mona; Thompson, Vivian; Shaw, Linda K; Neely, Megan L; Adams, Kirkwood F; Whellan, David J; Donahue, Mark P; Ahmad, Tariq; Kitzman, Dalane W; Piña, Ileana L; Zannad, Faiez; Kraus, William E; O'Connor, Christopher M
2013-11-01
ST2 is involved in cardioprotective signaling in the myocardium and has been identified as a potentially promising biomarker in heart failure (HF). We evaluated ST2 levels and their association with functional capacity and long-term clinical outcomes in a cohort of ambulatory patients with HF enrolled in the Heart Failure: A Controlled Trial Investigating Outcomes of Exercise Training (HF-ACTION) study, a multicenter, randomized study of exercise training in HF. HF-ACTION randomized 2331 patients with left ventricular ejection fraction <0.35 and New York Heart Association class II to IV HF to either exercise training or usual care. ST2 was analyzed in a subset of 910 patients with evaluable plasma samples. Correlations and Cox models were used to assess the relationship among ST2, functional capacity, and long-term outcomes. The median baseline ST2 level was 23.7 ng/mL (interquartile range, 18.6-31.8). ST2 was modestly associated with measures of functional capacity. In univariable analysis, ST2 was significantly associated with death or hospitalization (hazard ratio, 1.48; P<0.0001), cardiovascular death or HF hospitalization (hazard ratio, 2.14; P<0.0001), and all-cause mortality (hazard ratio, 2.33; P<0.0001; all hazard ratios per log2 ng/mL). In multivariable models, ST2 remained independently associated with outcomes after adjustment for clinical variables and amino-terminal pro-B-type natriuretic peptide. However, ST2 did not add significantly to reclassification of risk as assessed by changes in the C statistic, net reclassification improvement, and integrated discrimination improvement. ST2 was modestly associated with functional capacity and was significantly associated with outcomes in a well-treated cohort of ambulatory patients with HF, although it did not significantly affect reclassification of risk. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00047437.
A 10-Year Follow-Up Study of Social Ties and Functional Health among the Old: The AGES Project
Murata, Chiyoe; Saito, Tami; Tsuji, Taishi; Saito, Masashige
2017-01-01
In Asian nations, family ties are considered important. However, it is not clear what happens among older people with no such ties. To investigate the association, we used longitudinal data from the Aichi Gerontological Evaluation Study (AGES) project. Functionally independent older people at baseline (N = 14,088) in 10 municipalities were followed from 2003 to 2013. Social ties were assessed by asking about their social support exchange with family, relatives, friends, or neighbors. Cox proportional hazard models were employed to investigate the association between social ties and the onset of functional disability, adjusting for age, health status, and living arrangement. We found that social ties with co-residing family members and those with friends or neighbors independently protected functional health, with hazard ratios of 0.81 and 0.85 among men. Among women, ties with friends or neighbors had a stronger effect on health compared with their male counterparts, with a hazard ratio of 0.89. The fact that social ties with friends or neighbors are associated with a lower risk of functional decline, independent of family support, serves to underscore the importance of promoting social ties, especially among those lacking family ties. PMID:28671627
Extended Cox regression model: The choice of time function
NASA Astrophysics Data System (ADS)
Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu
2017-07-01
The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely applied models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the underlying assumption of the CRM. Using an extended CRM, which includes a time-dependent covariate, provides both a test of the PH assumption and an alternative model in case of non-proportional hazards. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
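A minimal R sketch of an extended Cox model with a time-dependent coefficient via the tt() mechanism of survival::coxph, using the package's lung data; the log(t) time function is just one of the candidate choices discussed above.

```r
library(survival)

# Standard Cox model and a proportional hazards diagnostic.
fit_ph <- coxph(Surv(time, status) ~ age + sex, data = lung)
cox.zph(fit_ph)   # small p-values suggest non-proportional hazards

# Extended Cox model: the effect of sex is allowed to vary with log(time).
fit_ext <- coxph(Surv(time, status) ~ age + sex + tt(sex), data = lung,
                 tt = function(x, t, ...) x * log(t))
summary(fit_ext)  # a significant tt(sex) term points to a time-varying effect

# Swapping log(t) for t, sqrt(t), etc. in the tt= function is exactly the
# "choice of time function" examined in the study.
```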
History of Childhood Kidney Disease and Risk of Adult End-Stage Renal Disease.
Calderon-Margalit, Ronit; Golan, Eliezer; Twig, Gilad; Leiba, Adi; Tzur, Dorit; Afek, Arnon; Skorecki, Karl; Vivante, Asaf
2018-02-01
The long-term risk associated with childhood kidney disease that had not progressed to chronic kidney disease in childhood is unclear. We aimed to estimate the risk of future end-stage renal disease (ESRD) among adolescents who had normal renal function and a history of childhood kidney disease. We conducted a nationwide, population-based, historical cohort study of 1,521,501 Israeli adolescents who were examined before compulsory military service in 1967 through 1997; data were linked to the Israeli ESRD registry. Kidney diseases in childhood included congenital anomalies of the kidney and urinary tract, pyelonephritis, and glomerular disease; all participants included in the primary analysis had normal renal function and no hypertension in adolescence. Cox proportional-hazards models were used to estimate the hazard ratio for ESRD associated with a history of childhood kidney disease. During 30 years of follow-up, ESRD developed in 2490 persons. A history of any childhood kidney disease was associated with a hazard ratio for ESRD of 4.19 (95% confidence interval [CI], 3.52 to 4.99). The associations between each diagnosis of kidney disease in childhood (congenital anomalies of the kidney and urinary tract, pyelonephritis, and glomerular disease) and the risk of ESRD in adulthood were similar in magnitude (multivariable-adjusted hazard ratios of 5.19 [95% CI, 3.41 to 7.90], 4.03 [95% CI, 3.16 to 5.14], and 3.85 [95% CI, 2.77 to 5.36], respectively). A history of kidney disease in childhood was associated with younger age at the onset of ESRD (hazard ratio for ESRD among adults <40 years of age, 10.40 [95% CI, 7.96 to 13.59]). A history of clinically evident kidney disease in childhood, even if renal function was apparently normal in adolescence, was associated with a significantly increased risk of ESRD, which suggests that kidney injury or structural abnormality in childhood has long-term consequences.
Lee, Mi Jung; Park, Jung Tak; Park, Kyoung Sook; Kwon, Young Eun; Oh, Hyung Jung; Yoo, Tae-Hyun; Kim, Yong-Lim; Kim, Yon Su; Yang, Chul Woo; Kim, Nam-Ho; Kang, Shin-Wook; Han, Seung Hyeok
2017-03-07
Residual kidney function can be assessed by simply measuring urine volume, calculating GFR using 24-hour urine collection, or estimating GFR using the proposed equation (eGFR). We aimed to investigate the relative prognostic value of these residual kidney function parameters in patients on dialysis. Using the database from a nationwide prospective cohort study, we compared differential implications of the residual kidney function indices in 1946 patients on dialysis at 36 dialysis centers in Korea between August 1, 2008 and December 31, 2014. Residual GFR calculated using 24-hour urine collection was determined by an average of renal urea and creatinine clearance on the basis of 24-hour urine collection. eGFR-urea, creatinine and eGFR-β2-microglobulin were calculated from the equations using serum urea and creatinine and β2-microglobulin, respectively. The primary outcome was all-cause death. During a mean follow-up of 42 months, 385 (19.8%) patients died. In multivariable Cox analyses, residual urine volume (hazard ratio, 0.96 per 0.1-L/d higher volume; 95% confidence interval, 0.94 to 0.98) and GFR calculated using 24-hour urine collection (hazard ratio, 0.98; 95% confidence interval, 0.95 to 0.99) were independently associated with all-cause mortality. In 1640 patients who had eGFR-β2-microglobulin data, eGFR-β2-microglobulin (hazard ratio, 0.98; 95% confidence interval, 0.96 to 0.99) was also significantly associated with all-cause mortality as well as residual urine volume (hazard ratio, 0.96 per 0.1-L/d higher volume; 95% confidence interval, 0.94 to 0.98) and GFR calculated using 24-hour urine collection (hazard ratio, 0.97; 95% confidence interval, 0.95 to 0.99). When each residual kidney function index was added to the base model, only urine volume improved the predictability for all-cause mortality (net reclassification index = 0.11, P = 0.01; integrated discrimination improvement = 0.01, P = 0.01). Higher residual urine volume was significantly associated with a lower risk of death and exhibited a stronger association with mortality than GFR calculated using 24-hour urine collection and eGFR-urea, creatinine. These results suggest that determining residual urine volume may be beneficial to predict patient survival in patients on dialysis. Copyright © 2017 by the American Society of Nephrology.
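A minimal R sketch of the 24-hour-urine calculation described above (residual GFR as the mean of renal urea and creatinine clearances); the units, variable names, and absence of body-surface-area normalization are simplifying assumptions, not the study protocol.

```r
# Residual GFR from a 24-hour urine collection: mean of renal urea and
# creatinine clearances. Urine and serum concentrations must share units
# within each pair; urine volume is given in mL per 24 h.
residual_gfr <- function(u_urea, s_urea, u_creat, s_creat, urine_ml) {
  flow_ml_min <- urine_ml / (24 * 60)                 # urine flow, mL/min
  cl_urea     <- u_urea  / s_urea  * flow_ml_min      # urea clearance, mL/min
  cl_creat    <- u_creat / s_creat * flow_ml_min      # creatinine clearance, mL/min
  (cl_urea + cl_creat) / 2                            # residual GFR, mL/min
}

# Example with made-up values:
residual_gfr(u_urea = 120, s_urea = 40, u_creat = 30, s_creat = 4, urine_ml = 800)
```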
Bancks, Michael P; Alonso, Alvaro; Gottesman, Rebecca F; Mosley, Thomas H; Selvin, Elizabeth; Pankow, James S
2017-12-01
Diabetes is prospectively associated with cognitive decline. Whether lower cognitive function and worse brain structure are prospectively associated with incident diabetes is unclear. We analyzed data for 10,133 individuals with cognitive function testing (1990-1992) and 1212 individuals with brain magnetic resonance imaging (1993-1994) from the Atherosclerosis Risk in Communities cohort. We estimated hazard ratios for incident diabetes through 2014 after adjustment for traditional diabetes risk factors and cohort attrition. Higher level of baseline cognitive function was associated with lower risk for diabetes (per 1 standard deviation, hazard ratio = 0.94; 95% confidence interval = 0.90, 0.98). This association did not persist after accounting for baseline glucose level, case ascertainment methods, and cohort attrition. No association was observed between any brain magnetic resonance imaging measure and incident diabetes. This is one of the first studies to prospectively evaluate the association between both cognitive function and brain structure and the incidence of diabetes. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
The Association Between Kidney Disease and Cardiovascular Risk in a Multiethnic Cohort
Nickolas, Thomas L.; Khatri, Minesh; Boden-Albala, Bernadette; Kiryluk, Krzysztof; Luo, Xiaodong; Gervasi-Franklin, Palma; Paik, Myunghee; Sacco, Ralph L.
2011-01-01
Background and Purpose The objective of this study was to determine the relationship between chronic kidney disease (CKD), race–ethnicity, and vascular outcomes. Methods A prospective, multiracial cohort of 3298 stroke-free subjects with 6.5 years of mean follow-up time for vascular outcomes (stroke, myocardial infarction, vascular death) was used. Kidney function was estimated from serum creatinine using the Cockcroft-Gault formula. Cox proportional hazards models were fitted to evaluate the relationship between kidney function and vascular outcomes. Results In multivariate analysis, a Cockcroft-Gault estimated creatinine clearance between 15 and 59 mL/min was associated with a significant 43% increased stroke risk in the overall cohort. Blacks with a Cockcroft-Gault estimated creatinine clearance between 15 and 59 mL/min had significantly increased risk of both stroke (hazard ratio, 2.65; 95% CI, 1.47 to 4.77) and combined vascular outcomes (hazard ratio, 1.59; 95% CI, 1.10–2.92). Conclusion Chronic kidney disease is a significant risk factor for stroke and combined vascular events, especially in blacks. PMID:18617655
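For reference, a hedged R sketch of the Cockcroft-Gault estimate used above to categorize kidney function (the 15-59 mL/min band); the example inputs are made up.

```r
# Cockcroft-Gault estimated creatinine clearance (mL/min); serum creatinine
# in mg/dL, weight in kg, with the 0.85 correction factor for women.
cockcroft_gault <- function(age, weight_kg, scr_mg_dl, female) {
  crcl <- (140 - age) * weight_kg / (72 * scr_mg_dl)
  ifelse(female, 0.85 * crcl, crcl)
}

# Made-up example falling in the 15-59 mL/min band analyzed above:
cockcroft_gault(age = 78, weight_kg = 60, scr_mg_dl = 1.8, female = TRUE)
```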
Survivorship analysis when cure is a possibility: a Monte Carlo study.
Goldman, A I
1984-01-01
Parametric survivorship analyses of clinical trials commonly involve the assumption of a hazard function constant with time. When the empirical curve obviously levels off, one can modify the hazard function model by use of a Gompertz or Weibull distribution with hazard decreasing over time. Some cancer treatments are thought to cure some patients within a short time of initiation. Then, instead of all patients having the same hazard, decreasing over time, a biologically more appropriate model assumes that an unknown proportion (1 - pi) have a constant high risk whereas the remaining proportion (pi) have essentially no risk. This paper discusses the maximum likelihood estimation of pi and the power curves of the likelihood ratio test. Monte Carlo studies provide results for a variety of simulated trials; empirical data illustrate the methods.
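A minimal R sketch of the mixture ("cure") model described above, fitted by maximum likelihood on simulated, right-censored data; the parameter values and the logistic/log reparameterization are illustrative assumptions, not the paper's simulation design.

```r
set.seed(2)
n        <- 300
pi_true  <- 0.3                      # cured fraction (essentially no risk)
lambda   <- 0.15                     # constant hazard among the susceptible
cured    <- rbinom(n, 1, pi_true)
event_t  <- ifelse(cured == 1, Inf, rexp(n, lambda))
cens_t   <- runif(n, 0, 20)          # administrative censoring
time     <- pmin(event_t, cens_t)
status   <- as.numeric(event_t <= cens_t)

# Negative log-likelihood of the mixture cure model:
#   event:    (1 - pi) * lambda * exp(-lambda * t)
#   censored: pi + (1 - pi) * exp(-lambda * t)
negloglik <- function(par) {
  pi <- plogis(par[1]); lam <- exp(par[2])
  -sum(status       * log((1 - pi) * lam * exp(-lam * time)) +
       (1 - status) * log(pi + (1 - pi) * exp(-lam * time)))
}

fit <- optim(c(0, log(0.1)), negloglik)
c(pi_hat = plogis(fit$par[1]), lambda_hat = exp(fit$par[2]))
```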
Jensen, Annette S; Broberg, Craig S; Rydman, Riikka; Diller, Gerhard-Paul; Li, Wei; Dimopoulos, Konstantinos; Wort, Stephen J; Pennell, Dudley J; Gatzoulis, Michael A; Babu-Narayan, Sonya V
2015-12-01
Patients with Eisenmenger syndrome (ES) have better survival, despite similar pulmonary vascular pathology, compared with other patients with pulmonary arterial hypertension. Cardiovascular magnetic resonance (CMR) is useful for risk stratification in idiopathic pulmonary arterial hypertension, whereas it has not been evaluated in ES. We studied CMR together with other noninvasive measurements in ES to evaluate its potential role as a noninvasive risk stratification test. Between 2003 and 2005, 48 patients with ES, all with a post-tricuspid shunt, were enrolled in a prospective, longitudinal, single-center study. All patients underwent a standardized baseline assessment with CMR, blood test, echocardiography, and 6-minute walk test and were followed up for mortality until the end of December 2013. Twelve patients (25%) died during follow-up, mostly from heart failure (50%). Impaired ventricular function (right or left ventricular ejection fraction) was associated with increased risk of mortality (lowest quartile: right ventricular ejection fraction, <40%; hazard ratio, 4.4 [95% confidence interval, 1.4-13.5]; P=0.01 and left ventricular ejection fraction, <50%; hazard ratio, 6.6 [95% confidence interval, 2.1-20.8]; P=0.001). Biventricular impairment (lowest quartile left ventricular ejection fraction, <50% and right ventricular ejection fraction, <40%) conveyed an even higher risk of mortality (hazard ratio, 8.0 [95% confidence interval, 2.5-25.1]; P=0.0004). No other CMR or noninvasive measurement besides resting oxygen saturation (hazard ratio, 0.90 [0.83-0.97]/%; P=0.007) was associated with mortality. Impaired right, left, or biventricular systolic function derived from baseline CMR and resting oxygen saturation are associated with mortality in adult patients with ES. CMR is a useful noninvasive tool, which may be incorporated in the risk stratification assessment of ES during lifelong follow-up. © 2015 American Heart Association, Inc.
Parametric regression model for survival data: Weibull regression model as an example
2016-01-01
The Weibull regression model is one of the most popular parametric regression models in that it provides an estimate of the baseline hazard function as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared with the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge of the model and then illustrates how to fit it with the R software. The SurvRegCensCov package is useful for converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by categorical variables. The eha package provides an alternative way to fit the Weibull regression model, and its check.dist() function helps to assess the goodness-of-fit of the model. Variable selection is based on the importance of a covariate, which can be tested using the anova() function. Alternatively, backward elimination starting from a full model is an efficient way to develop a model. Visualizing the Weibull regression model after model development provides another way to report the findings. PMID:28149846
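A minimal R sketch of the Weibull fit and the conversion from AFT coefficients to hazard ratio (HR) and event time ratio (ETR) described above, using survival::survreg and the lung data; the manual conversion formulas stand in for what the abstract attributes to the SurvRegCensCov package.

```r
library(survival)

fit <- survreg(Surv(time, status) ~ age + sex, data = lung, dist = "weibull")
summary(fit)

beta  <- coef(fit)[-1]      # AFT coefficients on the log-time scale
sigma <- fit$scale          # survreg's scale parameter

etr <- exp(beta)            # event time ratio: multiplicative effect on survival time
hr  <- exp(-beta / sigma)   # hazard ratio implied by the Weibull model's PH form
rbind(ETR = etr, HR = hr)

# Model adequacy can be eyeballed against Kaplan-Meier curves, e.g.:
km <- survfit(Surv(time, status) ~ sex, data = lung)
plot(km)   # overlaying the fitted Weibull survival curves is left out here
```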
Loss-of-function mutations in APOC3 and risk of ischemic vascular disease.
Jørgensen, Anders Berg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G; Tybjærg-Hansen, Anne
2014-07-03
High plasma levels of nonfasting triglycerides are associated with an increased risk of ischemic cardiovascular disease. Whether lifelong low levels of nonfasting triglycerides owing to mutations in the gene encoding apolipoprotein C3 (APOC3) are associated with a reduced risk of ischemic cardiovascular disease in the general population is unknown. Using data from 75,725 participants in two general-population studies, we first tested whether low levels of nonfasting triglycerides were associated with reduced risks of ischemic vascular disease and ischemic heart disease. Second, we tested whether loss-of-function mutations in APOC3, which were associated with reduced levels of nonfasting triglycerides, were also associated with reduced risks of ischemic vascular disease and ischemic heart disease. During follow-up, ischemic vascular disease developed in 10,797 participants, and ischemic heart disease developed in 7557 of these 10,797 participants. Participants with nonfasting triglyceride levels of less than 1.00 mmol per liter (90 mg per deciliter) had a significantly lower incidence of cardiovascular disease than those with levels of 4.00 mmol per liter (350 mg per deciliter) or more (hazard ratio for ischemic vascular disease, 0.43; 95% confidence interval [CI], 0.35 to 0.54; hazard ratio for ischemic heart disease, 0.40; 95% CI, 0.31 to 0.52). Heterozygosity for loss-of-function mutations in APOC3, as compared with no APOC3 mutations, was associated with a mean reduction in nonfasting triglyceride levels of 44% (P<0.001). The cumulative incidences of ischemic vascular disease and ischemic heart disease were reduced in heterozygotes as compared with noncarriers of APOC3 mutations (P=0.009 and P=0.05, respectively), with corresponding risk reductions of 41% (hazard ratio, 0.59; 95% CI, 0.41 to 0.86; P=0.007) and 36% (hazard ratio, 0.64; 95% CI, 0.41 to 0.99; P=0.04). Loss-of-function mutations in APOC3 were associated with low levels of triglycerides and a reduced risk of ischemic cardiovascular disease. (Funded by the European Union and others.).
Chatterjee, Neal A; Shah, Ravi V; Murthy, Venkatesh L; Praestgaard, Amy; Shah, Sanjiv J; Ventetuolo, Corey E; Barr, R Graham; Kronmal, Richard; Lima, Joao A C; Bluemke, David A; Jerosch-Herold, Michael; Alonso, Alvaro; Kawut, Steven M
2017-01-01
Right ventricular (RV) morphology has been associated with drivers of atrial fibrillation (AF) risk, including left ventricular and pulmonary pathology, systemic inflammation, and neurohormonal activation. The aim of this study was to investigate the association between RV morphology and risk of incident AF. We interpreted cardiac magnetic resonance imaging in 4204 participants free of clinical cardiovascular disease in the MESA (Multi-Ethnic Study of Atherosclerosis). Incident AF was determined using hospital discharge records, study electrocardiograms, and Medicare claims data. The study sample (n=3819) was 61±10 years old and 47% male with 47.2% current/former smokers. After adjustment for demographics and clinical factors, including incident heart failure, higher RV ejection fraction (hazard ratio, 1.16 per SD; 95% confidence interval, 1.03-1.32; P=0.02) and greater RV mass (hazard ratio, 1.25 per SD; 95% confidence interval, 1.08-1.44; P=0.002) were significantly associated with incident AF. After additional adjustment for the respective left ventricular parameter, higher RV ejection fraction remained significantly associated with incident AF (hazard ratio, 1.15 per SD; 95% confidence interval, 1.01-1.32; P=0.04), whereas the association was attenuated for RV mass (hazard ratio, 1.16 per SD; 95% confidence interval, 0.99-1.35; P=0.07). In a subset of patients with available spirometry (n=2540), higher RV ejection fraction and mass remained significantly associated with incident AF after additional adjustment for lung function (P=0.02 for both). Higher RV ejection fraction and greater RV mass were associated with an increased risk of AF in a multiethnic population free of clinical cardiovascular disease at baseline. © 2017 American Heart Association, Inc.
Effect and clinical prediction of worsening renal function in acute decompensated heart failure.
Breidthardt, Tobias; Socrates, Thenral; Noveanu, Markus; Klima, Theresia; Heinisch, Corinna; Reichlin, Tobias; Potocki, Mihael; Nowak, Albina; Tschung, Christopher; Arenja, Nisha; Bingisser, Roland; Mueller, Christian
2011-03-01
We aimed to establish the prevalence of worsening renal function (WRF) and its effect on survival among patients with acute decompensated heart failure. Furthermore, we sought to establish a risk score for the prediction of WRF and to externally validate the previously established Forman risk score. A total of 657 consecutive patients with acute decompensated heart failure presenting to the emergency department and undergoing serial creatinine measurements were enrolled. The potential of the clinical parameters at admission to predict WRF was assessed as the primary end point. The secondary end point was all-cause mortality at 360 days. Of the 657 patients, 136 (21%) developed WRF, and 220 patients had died during the first year. WRF was more common in the nonsurvivors (30% vs 41%, p = 0.03). Multivariate regression analysis found WRF to independently predict mortality (hazard ratio 1.92, p <0.01). In a single-parameter model, previously diagnosed chronic kidney disease was the only independent predictor of WRF and achieved an area under the receiver operating characteristic curve of 0.60. After the inclusion of the blood gas analysis parameters into the model, a history of chronic kidney disease (hazard ratio 2.13, p = 0.03), outpatient diuretics (hazard ratio 5.75, p <0.01), and bicarbonate (hazard ratio 0.91, p <0.01) were all predictive of WRF. A risk score was developed using these predictors. On receiver operating characteristic curve analysis, the Forman and Basel prediction rules achieved areas under the curve of 0.65 and 0.71, respectively. In conclusion, WRF was common in patients with acute decompensated heart failure and was linked to significantly worse outcomes. However, the clinical parameters failed to adequately predict its occurrence, making a tailored therapy approach impossible. Copyright © 2011 Elsevier Inc. All rights reserved.
Murakami, Keiko; Asayama, Kei; Satoh, Michihiro; Hosaka, Miki; Matsuda, Ayako; Inoue, Ryusuke; Tsubota-Utsugi, Megumi; Murakami, Takahisa; Nomura, Kyoko; Kikuya, Masahiro; Metoki, Hirohito; Imai, Yutaka; Ohkubo, Takayoshi
2017-12-01
Several observational studies have found modifying effects of functional status on the association between conventional office blood pressure (BP) and adverse outcomes. We aimed to examine whether the association between higher BP and stroke was attenuated or inverted among older adults with impaired function using self-measured home BP measurements. We followed 501 Japanese community-dwelling adults aged at least 60 years (mean age, 68.6 years) with no history of stroke. Multivariate-adjusted hazard ratios for 1-SD increase in home BP and office BP measurements were calculated by the Cox proportional hazards model. Functional status was assessed by self-reported physical function. During a median follow-up of 11.5 years, first strokes were observed in 47 participants. Higher home SBP, but not office SBP, was significantly associated with increased risk of stroke among both 349 participants with normal physical function and 152 participants with impaired physical function [hazard ratio (95% confidence interval) per 14.4-mmHg increase: 1.74 (1.12-2.69) and 1.77 (1.06-2.94), respectively], with no significant interaction for physical function (P = 0.56). Higher home DBP, but not office DBP, was also significantly associated with increased risk of stroke (P ≤ 0.029) irrespective of physical function (all P > 0.05 for interaction). Neither home BP nor office BP was significantly associated with all-cause mortality irrespective of physical function. Higher home BP was associated with increased risk of stroke even among those with impaired physical function. Measurements of home BP would be useful for stroke prevention, even after physical function decline.
Katz, Ronit; Dalrymple, Lorien; de Boer, Ian; DeFilippi, Christopher; Kestenbaum, Bryan; Park, Meyeon; Sarnak, Mark; Seliger, Stephen; Shlipak, Michael
2015-01-01
Background and objectives Elevations in N-terminal pro–B-type natriuretic peptide and high-sensitivity troponin T are associated with poor cardiovascular outcomes. Whether elevations in these cardiac biomarkers are associated with decline in kidney function was evaluated. Design, setting, participants, & measurements N-terminal pro–B-type natriuretic peptide and troponin T were measured at baseline in 3752 participants free of heart failure in the Cardiovascular Health Study. eGFR was determined from the Chronic Kidney Disease Epidemiology Collaboration equation using serum cystatin C. Rapid decline in kidney function was defined as decline in serum cystatin C eGFR≥30%, and incident CKD was defined as the onset of serum cystatin C eGFR<60 among those without CKD at baseline (n=2786). Cox regression models were used to examine the associations of each biomarker with kidney function decline adjusting for demographics, baseline serum cystatin C eGFR, diabetes, and other CKD risk factors. Results In total, 503 participants had rapid decline in serum cystatin C eGFR over a mean follow-up time of 6.41 (1.81) years, and 685 participants developed incident CKD over a mean follow-up time of 6.41 (1.74) years. Participants in the highest quartile of N-terminal pro–B-type natriuretic peptide (>237 pg/ml) had a 67% higher risk of rapid decline and a 38% higher adjusted risk of incident CKD compared with participants in the lowest quartile (adjusted hazard ratio for serum cystatin C eGFR rapid decline, 1.67; 95% confidence interval, 1.25 to 2.23; hazard ratio for incident CKD, 1.38; 95% confidence interval, 1.08 to 1.76). Participants in the highest category of troponin T (>10.58 pg/ml) had an 80% greater risk of rapid decline compared with participants in the lowest category (adjusted hazard ratio, 1.80; 95% confidence interval, 1.35 to 2.40). The association of troponin T with incident CKD was not statistically significant (hazard ratio, 1.17; 95% confidence interval, 0.92 to 1.50). Conclusions Elevated N-terminal pro–B-type natriuretic peptide and troponin T are associated with rapid decline of kidney function and incident CKD. Additional studies are needed to evaluate the mechanisms that may explain this association. PMID:25605700
Bamoulid, Jamal; Courivaud, Cécile; Crepin, Thomas; Carron, Clémence; Gaiffe, Emilie; Roubiou, Caroline; Laheurte, Caroline; Moulin, Bruno; Frimat, Luc; Rieu, Philippe; Mousson, Christiane; Durrbach, Antoine; Heng, Anne-Elisabeth; Rebibou, Jean-Michel; Saas, Philippe; Ducloux, Didier
2016-05-01
Lack of clear identification of patients at high risk of acute rejection hampers the ability to individualize immunosuppressive therapy. Here we studied whether thymic function may predict acute rejection in antithymocyte globulin (ATG)-treated renal transplant recipients, using 482 patients prospectively studied during the first year post-transplant, of whom 86 experienced acute rejection. Only CD45RA(+)CD31(+)CD4(+) T cell (recent thymic emigrant [RTE]) frequency (RTE%) was marginally associated with acute rejection in the whole population. This T-cell subset accounts for 26% of CD4(+) T cells. Pretransplant RTE% was significantly associated with acute rejection in ATG-treated patients (hazard ratio, 1.04; 95% confidence interval, 1.01-1.08, for each one-percent increase in RTE/CD4(+) T cells), but not in anti-CD25 monoclonal antibody (αCD25 mAb)-treated patients. Acute rejection was significantly more frequent in ATG-treated patients with a high pretransplant RTE% (31.2% vs. 16.4%) or absolute number of RTE/mm(3) (31.7 vs. 16.1). This difference was not found in αCD25 mAb-treated patients. The highest values of both RTE% (>31%; hazard ratio, 2.50; 95% confidence interval, 1.09-5.74) and RTE/mm(3) (>200/mm(3); hazard ratio, 3.71; 95% confidence interval, 1.59-8.70) were predictive of acute rejection in ATG-treated patients but not in patients having received αCD25 mAb. Results were confirmed in a retrospective cohort using T-cell receptor excision circle levels as a marker of thymic function. Thus, pretransplant thymic function predicts acute rejection in ATG-treated patients. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
Biganzoli, Laura; Mislang, Anna Rachelle; Di Donato, Samantha; Becheri, Dimitri; Biagioni, Chiara; Vitale, Stefania; Sanna, Giuseppina; Zafarana, Elena; Gabellini, Stefano; Del Monte, Francesca; Mori, Elena; Pozzessere, Daniele; Brunello, Antonella; Luciani, Andrea; Laera, Letizia; Boni, Luca; Di Leo, Angelo; Mottino, Giuseppe
2017-07-01
Frailty increases the risk of adverse health outcomes and/or dying when exposed to a stressor, and routine frailty assessment is recommended to guide treatment decision. The Balducci frailty criteria (BFC) and Fried frailty criteria (FFC) are commonly used, but these are time consuming. Vulnerable Elders Survey-13 (VES-13) score of ≥7, a simple and resource conserving function-based scoring system, may be used instead. This prospective study evaluates the performance of VES-13 in parallel with BFC and FFC, to identify frailty in elderly patients with early-stage cancer. Patients aged ≥70 years with early-stage solid tumors were classified as frail/nonfrail based on BFC (≥1 criteria), FFC (≥3 criteria), and VES-13 (score ≥ 7). All patients were assessed for functional decline and death. We evaluated 185 patients. FFC had a 17% frailty rate, whereas BFC and VES-13 both had 25%, with poor concordance seen between the three geriatric tools. FFC (hazard ratio = 1.99, p = .003) and VES-13 (hazard ratio = 2.81, p < .001) strongly discriminated for functional decline, whereas BFC (hazard ratio = 3.29, p < .001) had the highest discriminatory rate for deaths. BFC and VES-13 remained prognostic for overall survival in multivariate analysis correcting for age, tumor type, stage, and systemic treatment. A VES-13 score of ≥7 is a valuable discriminating tool for predicting functional decline or death and can be used as a frailty-screening tool among older cancer patients in centers with limited resources to conduct a comprehensive geriatric assessment. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Right ventricular dysfunction affects survival after surgical left ventricular restoration.
Couperus, Lotte E; Delgado, Victoria; Palmen, Meindert; van Vessem, Marieke E; Braun, Jerry; Fiocco, Marta; Tops, Laurens F; Verwey, Harriëtte F; Klautz, Robert J M; Schalij, Martin J; Beeres, Saskia L M A
2017-04-01
Several clinical and left ventricular parameters have been associated with prognosis after surgical left ventricular restoration in patients with ischemic heart failure. The aim of this study was to determine the prognostic value of right ventricular function. A total of 139 patients with ischemic heart failure (62 ± 10 years; 79% were male; left ventricular ejection fraction 27% ± 7%) underwent surgical left ventricular restoration. Biventricular function was assessed with echocardiography before surgery. The independent association between all-cause mortality and right ventricular fractional area change, tricuspid annular plane systolic excursion, and right ventricular longitudinal peak systolic strain was assessed. The additive effect of multiple impaired right ventricular parameters on mortality also was assessed. Baseline right ventricular fractional area change was 42% ± 9%, tricuspid annular plane systolic excursion was 18 ± 3 mm, and right ventricular longitudinal peak systolic strain was -24% ± 7%. Within 30 days after surgery, 15 patients died. Right ventricular fractional area change (hazard ratio, 0.93; 95% confidence interval, 0.88-0.98; P < .01), tricuspid annular plane systolic excursion (hazard ratio, 0.80; 95% confidence interval, 0.66-0.96; P = .02), and right ventricular longitudinal peak systolic strain (hazard ratio, 1.15; 95% confidence interval, 1.05-1.26; P < .01) were independently associated with 30-day mortality, after adjusting for left ventricular ejection fraction and aortic crossclamping time. Right ventricular function was impaired in 21%, 20%, and 27% of patients on the basis of right ventricular fractional area change, tricuspid annular plane systolic excursion, and right ventricular longitudinal peak systolic strain, respectively. Any echocardiographic parameter of right ventricular dysfunction was present in 39% of patients. The coexistence of several impaired right ventricular parameters per patient was independently associated with increased 30-day mortality (hazard ratio, 2.83; 95% confidence interval, 1.64-4.87, P < .01 per additional impaired parameter). Baseline right ventricular systolic dysfunction is independently associated with increased mortality in patients with ischemic heart failure undergoing surgical left ventricular restoration. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Salisbury, Margaret L; Xia, Meng; Zhou, Yueren; Murray, Susan; Tayob, Nabihah; Brown, Kevin K; Wells, Athol U; Schmidt, Shelley L; Martinez, Fernando J; Flaherty, Kevin R
2016-02-01
Idiopathic pulmonary fibrosis is a progressive lung disease with variable course. The Gender-Age-Physiology (GAP) Index and staging system uses clinical variables to stage mortality risk. It is unknown whether clinical staging predicts future decline in pulmonary function. We assessed whether the GAP stage predicts future pulmonary function decline and whether interval pulmonary function change predicts mortality after accounting for stage. Patients with idiopathic pulmonary fibrosis (N = 657) were identified retrospectively at three tertiary referral centers, and baseline GAP stages were assessed. Mixed models were used to describe average trajectories of FVC and diffusing capacity of the lung for carbon monoxide (Dlco). Multivariable Cox proportional hazards models were used to assess whether declines in pulmonary function ≥ 10% in 6 months predict mortality after accounting for GAP stage. Over a 2-year period, GAP stage was not associated with differences in yearly lung function decline. After accounting for stage, a 10% decrease in FVC or Dlco over 6 months independently predicted death or transplantation (FVC hazard ratio, 1.37; Dlco hazard ratio, 1.30; both, P ≤ .03). Patients with GAP stage 2 with declining pulmonary function experienced a survival profile similar to patients with GAP stage 3, with 1-year event-free survival of 59.3% (95% CI, 49.4-67.8) vs 56.9% (95% CI, 42.2-69.1). Baseline GAP stage predicted death or lung transplantation but not the rate of future pulmonary function decline. After accounting for GAP stage, a decline of ≥ 10% over 6 months independently predicted death or lung transplantation. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
Scherrer, B; Andrieu, S; Ousset, P J; Berrut, G; Dartigues, J F; Dubois, B; Pasquier, F; Piette, F; Robert, P; Touchon, J; Garnier, P; Mathiex-Fortunet, H; Vellas, B
2015-12-01
Time-to-event analysis is frequently used in medical research to investigate potential disease-modifying treatments in neurodegenerative diseases. Potential treatment effects are generally evaluated using the logrank test, which has optimal power and sensitivity when the treatment effect (hazard ratio) is constant over time. However, there is generally no prior information as to how the hazard ratio for the event of interest actually evolves. In these cases, the logrank test is not necessarily the most appropriate to use. When the hazard ratio is expected to decrease or increase over time, alternative statistical tests, such as the Fleming-Harrington test, provide better sensitivity. An example of this comes from a large, five-year randomised, placebo-controlled prevention trial (GuidAge) in 2854 community-based subjects making spontaneous memory complaints to their family physicians, which evaluated whether treatment with EGb761 can modify the risk of developing Alzheimer's disease (AD). The primary outcome measure was the time to conversion from memory complaint to Alzheimer's type dementia. Although there was no significant difference in the hazard function of conversion between the two treatment groups according to the preplanned logrank test, a significant treatment-by-time interaction for the incidence of AD was observed in a protocol-specified subgroup analysis, suggesting that the hazard ratio is not constant over time. For this reason, additional post hoc analyses were performed using the Fleming-Harrington test to evaluate whether there was a signal of a late effect of EGb761. Applying the Fleming-Harrington test, the hazard function for conversion to dementia in the placebo group was significantly different from that in the EGb761 treatment group (p = 0.0054), suggesting a late effect of EGb761. Since this was a post hoc analysis, no definitive conclusions can be drawn as to the effectiveness of the treatment. This post hoc analysis illustrates the value of performing another randomised clinical trial of EGb761 explicitly testing the hypothesis of a late treatment effect, as well as the value of using better-adapted statistical approaches for long-term preventive trials in which prevention is expected to have not an immediate effect but rather a delayed effect that increases over time.
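For readers unfamiliar with the Fleming-Harrington family of tests discussed above, the sketch below implements the G(ρ, γ) weighted logrank statistic directly (an illustration under stated assumptions, not the GuidAge analysis code). Setting ρ = γ = 0 recovers the ordinary logrank test, while ρ = 0, γ = 1 up-weights late separation of the survival curves, matching the late-effect hypothesis described here.

```python
import numpy as np
from scipy.stats import chi2

def fleming_harrington_test(time, event, group, rho=0.0, gamma=1.0):
    """Weighted logrank test with Fleming-Harrington G(rho, gamma) weights.

    time  : follow-up times
    event : 1 = event observed, 0 = censored
    group : 1 = treatment, 0 = control (0/1 coding assumed)
    """
    time, event, group = (np.asarray(a, dtype=float) for a in (time, event, group))
    numerator, variance, surv = 0.0, 0.0, 1.0   # surv = pooled Kaplan-Meier S(t-)
    for t in np.unique(time[event == 1]):       # distinct event times, ascending
        at_risk = time >= t
        n = at_risk.sum()
        n1 = (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        w = (surv ** rho) * ((1.0 - surv) ** gamma)
        numerator += w * (d1 - d * n1 / n)
        if n > 1:
            variance += w ** 2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
        surv *= 1.0 - d / n                     # update pooled KM after time t
    stat = numerator ** 2 / variance
    return stat, chi2.sf(stat, df=1)

# e.g. stat, p = fleming_harrington_test(months, converted, treatment_arm, rho=0, gamma=1)
```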
Anderson, Todd J; Charbonneau, Francois; Title, Lawrence M; Buithieu, Jean; Rose, M Sarah; Conradson, Heather; Hildebrand, Kathy; Fung, Marinda; Verma, Subodh; Lonn, Eva M
2011-01-18
Biomarkers of atherosclerosis may refine clinical decision making in individuals at risk of cardiovascular disease. The purpose of the study was to determine the prognostic significance of endothelial function and other vascular markers in apparently healthy men. The cohort consisted of 1574 men (age, 49.4 years) free of vascular disease. Measurements included flow-mediated dilation and its microvascular stimulus, hyperemic velocity, carotid intima-media thickness, and C-reactive protein. Cox proportional hazard models evaluated the relationship between vascular markers, Framingham risk score, and time to a first composite cardiovascular end point of vascular death, revascularization, myocardial infarction, angina, and stroke. Subjects had low median Framingham risk score (7.9%). Cardiovascular events occurred in 71 subjects (111 events) over a mean follow-up of 7.2±1.7 years. Flow-mediated dilation was not associated with subsequent cardiovascular events (hazard ratio, 0.92; P=0.54). Both hyperemic velocity (hazard ratio, 0.70; 95% confidence interval, 0.54 to 0.90; P=0.006) and carotid intima-media thickness (hazard ratio, 1.45; confidence interval, 1.15 to 1.83; P=0.002) but not C-reactive protein (P=0.35) were related to events in a multivariable analysis that included Framingham risk score (per unit SD). Furthermore, the addition of hyperemic velocity to Framingham risk score resulted in a net clinical reclassification improvement of 28.7% (P<0.001) after 5 years of follow-up in the intermediate-risk group. Overall net reclassification improvement for hyperemic velocity was 6.9% (P=0.24). In men, hyperemic velocity, the stimulus for flow-mediated dilation, but not flow-mediated dilation itself was a significant risk marker for adverse cardiovascular outcomes. The prognostic value was additive to traditional risk factors and carotid intima-media thickness. Hyperemic velocity, a newly described marker of microvascular function, is a novel tool that may improve risk stratification of lower-risk healthy men.
Mohammed, Selma F; Hussain, Imad; AbouEzzeddine, Omar F; Takahama, Hiroyuki; Kwon, Susan H; Forfia, Paul; Roger, Véronique L; Redfield, Margaret M
2014-12-23
The prevalence and clinical significance of right ventricular (RV) systolic dysfunction (RVD) in patients with heart failure and preserved ejection fraction (HFpEF) are not well characterized. Consecutive, prospectively identified HFpEF (Framingham HF criteria, ejection fraction ≥50%) patients (n=562) from Olmsted County, Minnesota, underwent echocardiography at HF diagnosis and follow-up for cause-specific mortality and HF hospitalization. RV function was categorized by tertiles of tricuspid annular plane systolic excursion and by semiquantitative (normal, mild RVD, or moderate to severe RVD) 2-dimensional assessment. Whether RVD was defined by semiquantitative assessment or tricuspid annular plane systolic excursion ≤15 mm, HFpEF patients with RVD were more likely to have atrial fibrillation, pacemakers, and chronic diuretic therapy. At echocardiography, patients with RVD had slightly lower left ventricular ejection fraction, worse diastolic dysfunction, lower blood pressure and cardiac output, higher pulmonary artery systolic pressure, and more severe RV enlargement and tricuspid valve regurgitation. After adjustment for age, sex, pulmonary artery systolic pressure, and comorbidities, the presence of any RVD by semiquantitative assessment was associated with higher all-cause (hazard ratio=1.35; 95% confidence interval, 1.03-1.77; P=0.03) and cardiovascular (hazard ratio=1.85; 95% confidence interval, 1.20-2.80; P=0.006) mortality and higher first (hazard ratio=1.99; 95% confidence interval, 1.35-2.90; P=0.0006) and multiple (hazard ratio=1.81; 95% confidence interval, 1.18-2.78; P=0.007) HF hospitalization rates. RVD defined by tricuspid annular plane systolic excursion values showed similar but weaker associations with mortality and HF hospitalizations. In the community, RVD is common in HFpEF patients, is associated with clinical and echocardiographic evidence of more advanced HF, and is predictive of poorer outcomes. © 2014 American Heart Association, Inc.
Remontet, L; Bossard, N; Belot, A; Estève, J
2007-05-10
Relative survival provides a measure of the proportion of patients dying from the disease under study without requiring knowledge of the cause of death. We propose an overall strategy based on regression models to estimate the relative survival and model the effects of potential prognostic factors. The baseline hazard was modelled up to 10 years of follow-up using parametric continuous functions. Six models including cubic regression splines were considered and the Akaike Information Criterion was used to select the final model. This approach yielded smooth and reliable estimates of mortality hazard and allowed us to deal with sparse data while taking into account all the available information. Splines were also used to model simultaneously non-linear effects of continuous covariates and time-dependent hazard ratios. This led to a graphical representation of the hazard ratio that can be useful for clinical interpretation. Estimates of these models were obtained by likelihood maximization. We showed that these estimates could also be obtained using standard algorithms for Poisson regression. Copyright 2006 John Wiley & Sons, Ltd.
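The closing remark about Poisson regression can be made concrete with a small sketch (illustrative, made-up data and covariates; not the authors' implementation): follow-up is split into intervals, event counts are modelled as Poisson with a log person-time offset, and exponentiated coefficients are read as hazard ratios. Spline terms in time could be added to the design matrix in the same way.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# One row per follow-up interval (and covariate pattern): events and person-time at risk.
d = pd.DataFrame({
    "t_mid":       [0.5, 1.5, 2.5, 0.5, 1.5, 2.5],   # interval mid-point, years since diagnosis
    "age_group":   [0,   0,   0,   1,   1,   1],     # 0 = younger, 1 = older (illustrative)
    "deaths":      [4,   6,   9,   10,  14,  22],
    "person_time": [400, 380, 350, 300, 270, 230],
})

X = sm.add_constant(d[["t_mid", "age_group"]])        # spline bases for time could be added here
model = sm.GLM(d["deaths"], X, family=sm.families.Poisson(),
               offset=np.log(d["person_time"]))
fit = model.fit()
print(np.exp(fit.params))   # exp(coef) for age_group is interpreted as a hazard ratio
```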
Site Transfer Functions of Three-Component Ground Motion in Western Turkey
NASA Astrophysics Data System (ADS)
Ozgur Kurtulmus, Tevfik; Akyol, Nihal; Camyildiz, Murat; Gungor, Talip
2015-04-01
Because of the high seismicity that accommodates crustal deformation and the deep graben structures on which the large urbanized and industrialized cities of western Turkey are built, site-specific seismic hazard assessment is increasingly crucial. Characterizing source, site and path effects is important both for assessing the seismic hazard in a specific region and for generating building codes or renewing previous ones. In this study, we evaluated three-component recordings of micro- and moderate-size earthquakes with local magnitudes ranging between 2.0 and 5.6. This dataset was used for site transfer function estimation, utilizing two spectral ratio approaches, the standard spectral ratio (SSR) and the horizontal-to-vertical spectral ratio (HVSR), and a generalized inversion technique (GIT), to highlight the site-specific seismic hazard potential of the deep basin structures of the region. The obtained transfer functions revealed that the sites located near the basin edges are characterized by broader HVSR curves. Broad HVSR peaks could be attributed to the complexity of wave propagation related to significant 2D/3D velocity variations at the sediment-bedrock interface near the basin edges. Comparison of HVSR and SSR estimates for the sites located on the grabens showed that SSR estimates give larger values at lower frequencies, which could be attributed to lateral variations in regional velocity and attenuation values caused by basin geometry and edge effects. However, large amplitude values of vertical-component GIT site transfer functions were observed at varying frequency ranges for some of the stations. These results imply that the vertical component of ground motion is not amplification-free. Contamination of HVSR site transfer function estimates at different frequency bands could be related to complexities in the wave field caused by deep or shallow heterogeneities in the region, such as differences in the basin geometries, fracturing and fluid saturation along different propagation paths. The results also show that, even if a site is located on a horst, the presence of weathered zones near the surface could cause moderate frequency-dependent site effects.
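As a rough, simplified sketch of the HVSR approach named above (not the processing chain used in the study, which adds windowing choices, instrument correction, Konno-Ohmachi smoothing and averaging over many events and stations), the ratio can be formed from Fourier amplitude spectra of a single three-component S-wave window:

```python
import numpy as np

def hvsr(ns, ew, z, dt):
    """H/V spectral ratio for one S-wave window (ns, ew, z: equal-length numpy arrays)."""
    n = len(z)
    taper = np.hanning(n)
    amp = {name: np.abs(np.fft.rfft(trace * taper))
           for name, trace in {"ns": ns, "ew": ew, "z": z}.items()}
    horizontal = np.sqrt(amp["ns"] * amp["ew"])        # geometric mean of the horizontals
    ratio = horizontal / np.maximum(amp["z"], 1e-12)   # guard against division by zero
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, ratio

# usage: freqs, hv = hvsr(ns_window, ew_window, z_window, dt=0.01)
```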
Putcha, Nirupama; Crainiceanu, Ciprian; Norato, Gina; Samet, Jonathan; Quan, Stuart F; Gottlieb, Daniel J; Redline, Susan; Punjabi, Naresh M
2016-10-15
Whether sleep-disordered breathing (SDB) severity and diminished lung function act synergistically to heighten the risk of adverse health outcomes remains a topic of significant debate. The current study sought to determine whether the association between lower lung function and mortality would be stronger in those with increasing severity of SDB in a community-based cohort of middle-aged and older adults. Full montage home sleep testing and spirometry data were analyzed on 6,173 participants of the Sleep Heart Health Study. Proportional hazards models were used to calculate risk for all-cause mortality, with FEV1 and apnea-hypopnea index (AHI) as the primary exposure indicators along with several potential confounders. All-cause mortality rate was 26.9 per 1,000 person-years in those with SDB (AHI ≥5 events/h) and 18.2 per 1,000 person-years in those without (AHI <5 events/h). For every 200-ml decrease in FEV1, all-cause mortality increased by 11.0% in those without SDB (hazard ratio, 1.11; 95% confidence interval, 1.08-1.13). In contrast, for every 200-ml decrease in FEV1, all-cause mortality increased by only 6.0% in participants with SDB (hazard ratio, 1.06; 95% confidence interval, 1.04-1.09). Additionally, the incremental influence of lung function on all-cause mortality was less with increasing severity of SDB (P value for interaction between AHI and FEV1, 0.004). Lung function was associated with risk for all-cause mortality. The incremental contribution of lung function to mortality diminishes with increasing severity of SDB.
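The quoted "P value for interaction between AHI and FEV1" is, in a Cox model, the p-value on a product term. A minimal sketch follows (hypothetical column names, with df standing for any subject-level survival dataset; this is not the Sleep Heart Health Study code):

```python
from lifelines import CoxPHFitter

def fit_with_interaction(df):
    """df: one row per subject with columns time, death, ahi, fev1 (hypothetical names)."""
    df = df.copy()
    df["ahi_x_fev1"] = df["ahi"] * df["fev1"]          # product term carries the interaction
    cph = CoxPHFitter()
    cph.fit(df[["time", "death", "ahi", "fev1", "ahi_x_fev1"]],
            duration_col="time", event_col="death")
    return cph   # cph.summary.loc["ahi_x_fev1", "p"] is the interaction p-value
```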
Nijenhuis, Vincent Johan; Peper, Joyce; Vorselaars, Veronique M M; Swaans, Martin J; De Kroon, Thom; Van der Heyden, Jan A S; Rensing, Benno J W M; Heijmen, Robin; Bos, Willem-Jan W; Ten Berg, Jurrien M
2018-05-15
Transcatheter aortic valve implantation (TAVI) is associated with acute kidney injury (AKI), but can also improve kidney function (IKF). We assessed the effects of kidney function changes in relation to baseline kidney function on 2-year clinical outcomes after TAVI. In total, 639 consecutive patients with aortic stenosis who underwent TAVI were stratified into 3 groups according to the ratio of serum creatinine post- to pre-TAVI: IKF (≤0.80; n = 95 [15%]), stable kidney function (0.80 to 1.5; n = 477 [75%]), and AKI (≥1.5; n = 67 [10%]). Different AKI risk scores were compared using receiver-operating characteristic curves. Median follow-up was 24 (8 to 44) months. At 3 months, the increase in estimated glomerular filtration rate in the IKF group remained, and the decreased estimated glomerular filtration rate in the AKI group recovered. Compared with stable kidney function, AKI was associated with a higher 2-year mortality rate (adjusted hazard ratio [HR] 3.69, 95% confidence interval [CI] 2.43 to 5.62) and IKF with a lower mortality rate (adjusted HR 0.53, 95% CI 0.30 to 0.93). AKI also predicted major and life-threatening bleeding (adjusted odds ratio 2.94, 95% CI 1.27 to 6.78). Independent predictors of AKI were chronic kidney disease and pulmonary hypertension. Independent predictors of IKF were female gender, preserved kidney function, absence of atrial fibrillation, and hemoglobin level. Established AKI risk scores performed moderately and did not differentiate between AKI and IKF. In conclusion, AKI is transient and is independently associated with a higher mortality rate, whereas IKF is sustained and is associated with a lower mortality rate. These effects are independent of baseline kidney function. Further studies are warranted to investigate the role of IKF and generate a dedicated prediction model. Copyright © 2018 Elsevier Inc. All rights reserved.
Roldan-Valadez, Ernesto; Rios, Camilo; Motola-Kuba, Daniel; Matus-Santos, Juan; Villa, Antonio R; Moreno-Jimenez, Sergio
2016-11-01
A long-standing concern has been the identification of predictive biomarkers for high-grade gliomas (HGGs) using MRI. However, a consensus on which imaging parameters make up a significant survival model is still missing in the literature; we investigated the significant positive or negative contributions of several MR biomarkers to the prognosis of these tumours. A retrospective cohort of supratentorial HGGs [11 glioblastoma multiforme (GBM) and 17 anaplastic astrocytomas] included 28 patients (9 females and 19 males, with a mean age of 50.4 years, standard deviation: 16.28 years; range: 13-85 years). Oedema and viable tumour measurements were acquired using regions of interest in T1 weighted, T2 weighted, fluid-attenuated inversion recovery, apparent diffusion coefficient (ADC) and MR spectroscopy (MRS). We calculated Kaplan-Meier curves and fitted Cox proportional hazards models. During the follow-up period (3-98 months), 17 deaths were recorded. The median survival time was 1.73 years (range, 0.287-8.947 years). Only 3 out of 20 covariates (the choline-to-N-acetyl aspartate and lipids-lactate-to-creatine ratios and age) were significant in explaining the variability in the survival hazards model; score test: χ2(3) = 9.098, p = 0.028. MRS metabolites outperformed the volumetric parameters of peritumoral oedema and viable tumour, as well as tumour-region ADC measurements. Specific MRS ratios (Cho/Naa, L-L/Cr) might be considered in regular follow-up for these tumours. Advances in knowledge: The Cho/Naa ratio is the strongest survival predictor, with a log-hazard function of 2.672 in GBM. Low levels of the lipids-lactate/Cr ratio represent up to a 41.6% reduction in the risk of death in GBM.
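A minimal sketch of the Kaplan-Meier step mentioned above, using the Python lifelines package with toy follow-up data (months of follow-up; 1 = death observed, 0 = censored) rather than the study's measurements:

```python
from lifelines import KaplanMeierFitter

months = [3, 7, 9, 12, 20, 26, 40, 55, 60, 98]   # toy follow-up times (months)
died   = [1, 1, 0, 1,  1,  0,  1,  1,  0,  1]    # 1 = death, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(months, event_observed=died)
print(kmf.median_survival_time_)   # median survival of the toy data (months)
print(kmf.survival_function_)      # step-function estimate, ready for plotting
```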
Oh, Jee-Young; Allison, Matthew A; Barrett-Connor, Elizabeth
2017-01-01
Although the prevalence rates of hypertension (HTN) and diabetes mellitus are slowing in some high-income countries, HTN and diabetes mellitus remain as the two major risk factors for atherosclerotic cardiovascular disease (CVD), the leading cause of death in the United States and worldwide. We aimed to observe the association of HTN and diabetes mellitus with all-cause and CVD mortality in older white adults. All community-dwelling Rancho Bernardo Study participants who were at least 55 years old and had carefully measured blood pressure and plasma glucose from 75-g oral glucose tolerance test at the baseline visit (1984-1987, n = 2186) were followed up until death or the last clinic visit in 2013 (median 14.3 years, interquartile range 8.4-21.3). In unadjusted analyses, diabetes mellitus was associated with all-cause mortality [hazard ratio 1.40, 95% confidence interval (CI) 1.23-1.60] and CVD mortality (hazard ratio 1.67, 95% CI 1.39-2.00); HTN with all-cause mortality [hazard ratio 1.93 (1.73-2.15)] and CVD mortality [hazard ratio 2.45 (2.10-2.93)]. After adjustment for cardiovascular risk factors, including age, BMI, triglycerides, HDL-cholesterol, smoking, exercise, and alcohol consumption, diabetes mellitus was associated with CVD mortality only (hazard ratio 1.25, P = 0.0213). Conversely, HTN was associated with both all-cause (hazard ratio 1.34, P < 0.0001) and CVD mortality (hazard ratio 1.40, P = 0.0003). Having both diabetes mellitus and HTN was associated with all-cause (hazard ratio 1.38, P = 0.0002) and CVD mortality (hazard ratio 1.70, P < 0.0001). We report the novel finding that HTN is more strongly associated with all-cause and CVD mortality than diabetes mellitus. Having both confers a modest increase in the hazards for these types of mortality.
An interdisciplinary perspective on social and physical determinants of seismic risk
NASA Astrophysics Data System (ADS)
Lin, K.-H.; Chang, Y.-C.; Liu, G.-Y.; Chan, C.-H.; Lin, T.-H.; Yeh, C.-H.
2015-01-01
While disaster studies researchers usually view risk as a function of hazard, exposure, and vulnerability, few studies have systematically examined the relationships among the various physical and socioeconomic determinants underlying disasters, and fewer have done so through seismic risk analysis. In the context of the 1999 Chi-Chi earthquake in Taiwan, this study constructs five hypothetical models to test different determinants that affect disaster fatality at the village level, namely seismic hazard intensity, population, building fragility, demographics and socioeconomics. A Poisson regression model is used to estimate the impact of natural hazards and social factors on fatalities. Results indicate that although all of the determinants have an impact on some dimension of seismic fatality, certain indicators of social inequality, such as the gender ratio, dependency ratio, and income and its standard deviation, are the driving determinants that exacerbate vulnerability to seismic risk. These findings have strong social implications for policy interventions to mitigate such disasters. This study presents an interdisciplinary investigation into the social and physical determinants of seismic risk.
Opioid Analgesics and Adverse Outcomes among Hemodialysis Patients.
Ishida, Julie H; McCulloch, Charles E; Steinman, Michael A; Grimes, Barbara A; Johansen, Kirsten L
2018-05-07
Patients on hemodialysis frequently experience pain and may be particularly vulnerable to opioid-related complications. However, data evaluating the risks of opioid use in patients on hemodialysis are limited. Using the US Renal Data System, we conducted a cohort study evaluating the association between opioid use (modeled as a time-varying exposure and expressed in standardized oral morphine equivalents) and time to first emergency room visit or hospitalization for altered mental status, fall, and fracture among 140,899 Medicare-covered adults receiving hemodialysis in 2011. We evaluated risk according to average daily total opioid dose (>60 mg, ≤60 mg, and per 60-mg dose increment) and specific agents (per 60-mg dose increment). The median age was 61 years old, 52% were men, and 50% were white. Sixty-four percent received opioids, and 17% had an episode of altered mental status (15,658 events), fall (7646 events), or fracture (4151 events) in 2011. Opioid use was associated with risk for all outcomes in a dose-dependent manner: altered mental status (lower dose: hazard ratio, 1.28; 95% confidence interval, 1.23 to 1.34; higher dose: hazard ratio, 1.67; 95% confidence interval, 1.56 to 1.78; hazard ratio, 1.29 per 60 mg; 95% confidence interval, 1.26 to 1.33), fall (lower dose: hazard ratio, 1.28; 95% confidence interval, 1.21 to 1.36; higher dose: hazard ratio, 1.45; 95% confidence interval, 1.31 to 1.61; hazard ratio, 1.04 per 60 mg; 95% confidence interval, 1.03 to 1.05), and fracture (lower dose: hazard ratio, 1.44; 95% confidence interval, 1.33 to 1.56; higher dose: hazard ratio, 1.65; 95% confidence interval, 1.44 to 1.89; hazard ratio, 1.04 per 60 mg; 95% confidence interval, 1.04 to 1.05). All agents were associated with a significantly higher hazard of altered mental status, and several agents were associated with a significantly higher hazard of fall and fracture. Opioids were associated with adverse outcomes in patients on hemodialysis, and this risk was present even at lower dosing and for agents that guidelines have recommended for use. Copyright © 2018 by the American Society of Nephrology.
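Modeling opioid dose "as a time-varying exposure" means splitting each patient's follow-up into start/stop intervals within which the dose is constant. A hedged sketch with lifelines follows (hypothetical column names and made-up rows, not the US Renal Data System analysis); exp(coef) for the dose column corresponds to the hazard ratio per 60-mg morphine-equivalent increment quoted above.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":            [1,   1,   2,   2,   3],
    "start":         [0,   90,  0,   60,  0],      # days since cohort entry
    "stop":          [90,  180, 60,  365, 365],
    "dose_per_60mg": [0.0, 1.0, 0.5, 2.0, 0.0],    # average daily dose / 60 mg morphine equivalents
    "event":         [0,   1,   0,   1,   0],      # e.g. fall, fracture, or altered mental status
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for dose_per_60mg ~ hazard ratio per 60-mg increment
```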
Ruilope, Luis M; Zanchetti, Alberto; Julius, Stevo; McInnes, Gordon T; Segura, Julian; Stolt, Pelle; Hua, Tsushung A; Weber, Michael A; Jamerson, Ken
2007-07-01
Reduced renal function is predictive of poor cardiovascular outcomes but the predictive value of different measures of renal function is uncertain. We compared the value of estimated creatinine clearance, using the Cockcroft-Gault formula, with that of estimated glomerular filtration rate (GFR), using the Modification of Diet in Renal Disease (MDRD) formula, as predictors of cardiovascular outcome in 15 245 high-risk hypertensive participants in the Valsartan Antihypertensive Long-term Use Evaluation (VALUE) trial. For the primary end-point, the three secondary end-points and for all-cause death, outcomes were compared for individuals with baseline estimated creatinine clearance and estimated GFR < 60 ml/min and > or = 60 ml/min using hazard ratios and 95% confidence intervals. Coronary heart disease, left ventricular hypertrophy, age, sex and treatment effects were included as covariates in the model. For each end-point considered, the risk in individuals with poor renal function at baseline was greater than in those with better renal function. Estimated creatinine clearance (Cockcroft-Gault) was significantly predictive only of all-cause death [hazard ratio = 1.223, 95% confidence interval (CI) = 1.076-1.390; P = 0.0021] whereas estimated GFR was predictive of all outcomes except stroke. Hazard ratios (95% CIs) for estimated GFR were: primary cardiac end-point, 1.497 (1.332-1.682), P < 0.0001; myocardial infarction, 1.501 (1.254-1.796), P < 0.0001; congestive heart failure, 1.699 (1.435-2.013), P < 0.0001; stroke, 1.152 (0.952-1.394) P = 0.1452; and all-cause death, 1.231 (1.098-1.380), P = 0.0004. These results indicate that estimated glomerular filtration rate calculated with the MDRD formula is more informative than estimated creatinine clearance (Cockcroft-Gault) in the prediction of cardiovascular outcomes.
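The two renal-function estimates being compared are simple closed-form equations; a sketch of both follows (as commonly written; the MDRD constant 175 assumes IDMS-standardized creatinine, whereas 186 is used with older assays, so check local laboratory conventions before reuse).

```python
def cockcroft_gault_crcl(age, weight_kg, scr_mg_dl, female):
    """Estimated creatinine clearance (ml/min), Cockcroft-Gault formula."""
    crcl = (140 - age) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd_egfr(age, scr_mg_dl, female, black):
    """Estimated GFR (ml/min/1.73 m2), 4-variable MDRD study equation (constant 175)."""
    egfr = 175 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: a 70-year-old, 80-kg man with serum creatinine 1.4 mg/dl
print(cockcroft_gault_crcl(70, 80, 1.4, female=False))   # ~55.6 ml/min
print(mdrd_egfr(70, 1.4, female=False, black=False))     # ~50 ml/min/1.73 m2
```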
McDermott, Mary McGrae; Ferrucci, Luigi; Simonsick, Eleanor M; Balfour, Jennifer; Fried, Linda; Ling, Shari; Gibson, Daniel; Guralnik, Jack M
2002-02-01
To define the association between baseline ankle brachial index (ABI) level and subsequent onset of severe disability. Prospective cohort study. Baltimore community. Eight hundred forty-seven disabled women aged 65 and older participating in the Women's Health and Aging Study. At baseline, participants underwent measurement of ABI and lower extremity functioning. Measures of lower extremity functioning included patient's report of their ability to walk one-quarter of a mile, number of city blocks walked last week, number of stair flights climbed last week, and performance-based measures including walking speed over 4 meters, five repeated chair stands, and a summary performance score. Functioning was remeasured every 6 months for 3 years. Definitions of severe disability were developed a priori, and participants who met these definitions at baseline were excluded from subsequent analyses. Participants with an ABI of less than 0.60 at baseline had significantly higher cumulative probabilities of developing severe disability than participants with a baseline ABI of 0.90 to 1.50 for walking-specific outcomes (ability to walk a quarter of a mile, number of city blocks walked last week, and walking velocity) but not for the remaining functional outcomes. In age-adjusted Cox proportional hazards analyses, hazard ratios for participants with a baseline ABI of less than 0.60 were 1.63 for becoming unable to walk a quarter of a mile (P = .044), 2.00 for developing severe disability in the number of blocks walked last week (P = .004), and 1.61 for developing severe disability in walking speed (P = .041), compared with participants with a baseline ABI of 0.90 to 1.50. Adjusting for age, race, baseline performance, and comorbidities, an ABI of less than 0.60 remained associated with becoming severely disabled in the number of blocks walked last week (hazard ratio = 1.97, P = .009) and nearly significantly associated with becoming unable to walk a quarter of a mile (hazard ratio = 1.54, P = .09). In fully adjusted random effects models, a baseline ABI of less than 0.60 was associated with significantly greater decline in walking speed per year (P = .019) and nearly significantly greater decline in number of blocks walked last week per year (P = .053) compared with a baseline ABI of 0.90 to 1.50. In community-dwelling disabled older women, a low ABI is associated with a greater incidence of severe disability in walking-specific but not other lower extremity functional outcomes, compared with persons with a normal ABI over 3 years.
Brisco, Meredith A; Coca, Steven G; Chen, Jennifer; Owens, Anjali Tiku; McCauley, Brian D; Kimmel, Stephen E; Testani, Jeffrey M
2013-03-01
Identifying reversible renal dysfunction (RD) in the setting of heart failure is challenging. The goal of this study was to evaluate whether elevated admission blood urea nitrogen/creatinine ratio (BUN/Cr) could identify decompensated heart failure patients likely to experience improvement in renal function (IRF) with treatment. Consecutive hospitalizations with a discharge diagnosis of heart failure were reviewed. IRF was defined as ≥20% increase and worsening renal function as ≥20% decrease in estimated glomerular filtration rate. IRF occurred in 31% of the 896 patients meeting eligibility criteria. Higher admission BUN/Cr was associated with in-hospital IRF (odds ratio, 1.5 per 10 increase; 95% confidence interval [CI], 1.3-1.8; P<0.001), an association persisting after adjustment for baseline characteristics (odds ratio, 1.4; 95% CI, 1.1-1.8; P=0.004). However, higher admission BUN/Cr was also associated with post-discharge worsening renal function (odds ratio, 1.4; 95% CI, 1.1-1.8; P=0.011). Notably, in patients with an elevated admission BUN/Cr, the risk of death associated with RD (estimated glomerular filtration rate <45) was substantial (hazard ratio, 2.2; 95% CI, 1.6-3.1; P<0.001). However, in patients with a normal admission BUN/Cr, RD was not associated with increased mortality (hazard ratio, 1.2; 95% CI, 0.67-2.0; P=0.59; p interaction=0.03). An elevated admission BUN/Cr identifies decompensated patients with heart failure likely to experience IRF with treatment, providing proof of concept that reversible RD may be a discernible entity. However, this improvement seems to be largely transient, and RD, in the setting of an elevated BUN/Cr, remains strongly associated with death. Further research is warranted to develop strategies for the optimal detection and treatment of these high-risk patients.
Nagai, Shunji; Safwan, Mohamed; Collins, Kelly; Schilke, Randolph E; Rizzari, Michael; Moonka, Dilip; Brown, Kimberly; Patel, Anita; Yoshida, Atsushi; Abouljoud, Marwan
2018-05-02
The new Organ Procurement and Transplantation Network/United Network for Organ Sharing (OPTN/UNOS) simultaneous liver-kidney transplant (SLK) policy has been implemented. The aim of this study was to review liver transplant outcomes under the new SLK policy. Liver transplant alone (LTA) and SLK patients between 2009 and 2015 were reviewed. Graft survival and post-transplant kidney function were investigated among LTA patients meeting the chronic kidney disease (CKD) criteria of the new policy (LTA-CKD group). To validate our findings, we reviewed the OPTN/UNOS registry and applied our analysis to it. A total of 535 patients were eligible from our series. The LTA-CKD group (n = 27) showed worse 1-year graft survival than the SLK group (n = 44), although the difference was not significant (81% vs. 93%, P = 0.15). The LTA-CKD group had a significantly increased risk of post-transplant dialysis (odds ratio = 5.59 [95% CI = 1.27-24.7], P = 0.02 [Ref. normal kidney function]). Post-transplant dialysis was an independent risk factor for graft loss (hazard ratio = 7.25, 95% CI = 3.3-15.91, P < 0.001 [Ref. SLK]). In the validation analysis based on the OPTN/UNOS registry, the hazard of 1-year graft loss in the LTA-CKD group (n = 751) was 34.8% higher than in the SLK group (n = 2856) (hazard ratio = 1.348, 95% CI = 1.157-1.572, P < 0.001). Indicating SLK for patients who meet the CKD criteria may significantly improve transplant outcomes. © 2018 Steunstichting ESOT.
Mocroft, Amanda; Sterne, Jonathan A C; Egger, Matthias; May, Margaret; Grabar, Sophie; Furrer, Hansjakob; Sabin, Caroline; Fatkenheuer, Gerd; Justice, Amy; Reiss, Peter; d'Arminio Monforte, Antonella; Gill, John; Hogg, Robert; Bonnet, Fabrice; Kitahata, Mari; Staszewski, Schlomo; Casabona, Jordi; Harris, Ross; Saag, Michael
2009-04-15
The extent to which mortality differs following individual acquired immunodeficiency syndrome (AIDS)-defining events (ADEs) has not been assessed among patients initiating combination antiretroviral therapy. We analyzed data from 31,620 patients with no prior ADEs who started combination antiretroviral therapy. Cox proportional hazards models were used to estimate mortality hazard ratios for each ADE that occurred in >50 patients, after stratification by cohort and adjustment for sex, HIV transmission group, number of antiretroviral drugs initiated, regimen, age, date of starting combination antiretroviral therapy, and CD4+ cell count and HIV RNA load at initiation of combination antiretroviral therapy. ADEs that occurred in <50 patients were grouped together to form a "rare ADEs" category. During a median follow-up period of 43 months (interquartile range, 19-70 months), 2880 ADEs were diagnosed in 2262 patients; 1146 patients died. The most common ADEs were esophageal candidiasis (in 360 patients), Pneumocystis jiroveci pneumonia (320 patients), and Kaposi sarcoma (308 patients). The greatest mortality hazard ratio was associated with non-Hodgkin's lymphoma (hazard ratio, 17.59; 95% confidence interval, 13.84-22.35) and progressive multifocal leukoencephalopathy (hazard ratio, 10.0; 95% confidence interval, 6.70-14.92). Three groups of ADEs were identified on the basis of the ranked hazard ratios with bootstrapped confidence intervals: severe (non-Hodgkin's lymphoma and progressive multifocal leukoencephalopathy [hazard ratio, 7.26; 95% confidence interval, 5.55-9.48]), moderate (cryptococcosis, cerebral toxoplasmosis, AIDS dementia complex, disseminated Mycobacterium avium complex, and rare ADEs [hazard ratio, 2.35; 95% confidence interval, 1.76-3.13]), and mild (all other ADEs [hazard ratio, 1.47; 95% confidence interval, 1.08-2.00]). In the combination antiretroviral therapy era, mortality rates subsequent to an ADE depend on the specific diagnosis. The proposed classification of ADEs may be useful in clinical end point trials, prognostic studies, and patient management.
Fredman, Lisa; Lyons, Jennifer G; Cauley, Jane A; Hochberg, Marc; Applebaum, Katie M
2015-09-01
Previous studies have shown inconsistent associations between caregiving and mortality. This may be because caregiver status is analyzed at baseline only, and because better health is probably related to taking on caregiving responsibilities and continuing in that role. The latter is termed the Healthy Caregiver Hypothesis, similar to the Healthy Worker Effect in occupational epidemiology. We applied common approaches from occupational epidemiology to evaluate the association between caregiving and mortality, including treating caregiving as time-varying and lagging exposure by up to 5 years. Caregiving status among 1,068 women (baseline mean age = 81.0 years; 35% caregivers) participating in the Caregiver-Study of Osteoporotic Fractures study was assessed at five interviews conducted between 1999 and 2009. Mortality was determined through January 2012. Cox proportional hazards models were used to estimate hazard ratios and 95% confidence intervals adjusted for sociodemographics, perceived stress, and functional limitations. A total of 483 participants died during follow-up (38.8% and 48.7% of baseline caregivers and noncaregivers, respectively). Using baseline caregiving status, the hazard ratio for mortality was 0.77 (0.62-0.95). Models of time-varying caregiving status showed a more pronounced reduction in mortality in current caregivers (hazard ratio = 0.54, 0.38-0.75), which diminished with longer lag periods (3-year lag hazard ratio = 0.68, 0.52-0.88; 5-year lag hazard ratio = 0.76, 0.60-0.95). Overall, caregivers had lower mortality rates than noncaregivers in all analyses. These associations were sensitive to the lagged period, indicating that the timing of leaving caregiving does influence this relationship and should be considered in future investigations. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
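The "lagged exposure" device borrowed from occupational epidemiology amounts to relating mortality in a given window to caregiving status reported one or more interview waves earlier. A tiny pandas sketch with hypothetical columns (not the study's data handling):

```python
import pandas as pd

visits = pd.DataFrame({
    "id":        [1, 1, 1, 2, 2, 2],
    "wave_year": [1999, 2002, 2005, 1999, 2002, 2005],
    "caregiver": [1, 1, 0, 0, 1, 1],
})

# ~3-year lag: exposure analysed at wave k is the status reported one wave earlier
visits["caregiver_lag1wave"] = visits.groupby("id")["caregiver"].shift(1)
print(visits)
```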
Adverse health outcomes in women exposed in utero to diethylstilbestrol.
Hoover, Robert N; Hyer, Marianne; Pfeiffer, Ruth M; Adam, Ervin; Bond, Brian; Cheville, Andrea L; Colton, Theodore; Hartge, Patricia; Hatch, Elizabeth E; Herbst, Arthur L; Karlan, Beth Y; Kaufman, Raymond; Noller, Kenneth L; Palmer, Julie R; Robboy, Stanley J; Saal, Robert C; Strohsnitter, William; Titus-Ernstoff, Linda; Troisi, Rebecca
2011-10-06
Before 1971, several million women were exposed in utero to diethylstilbestrol (DES) given to their mothers to prevent pregnancy complications. Several adverse outcomes have been linked to such exposure, but their cumulative effects are not well understood. We combined data from three studies initiated in the 1970s with continued long-term follow-up of 4653 women exposed in utero to DES and 1927 unexposed controls. We assessed the risks of 12 adverse outcomes linked to DES exposure, including cumulative risks to 45 years of age for reproductive outcomes and to 55 years of age for other outcomes, and their relationships to the baseline presence or absence of vaginal epithelial changes, which are correlated with a higher dose of, and earlier exposure to, DES in utero. Cumulative risks in women exposed to DES, as compared with those not exposed, were as follows: for infertility, 33.3% vs. 15.5% (hazard ratio, 2.37; 95% confidence interval [CI], 2.05 to 2.75); spontaneous abortion, 50.3% vs. 38.6% (hazard ratio, 1.64; 95% CI, 1.42 to 1.88); preterm delivery, 53.3% vs. 17.8% (hazard ratio, 4.68; 95% CI, 3.74 to 5.86); loss of second-trimester pregnancy, 16.4% vs. 1.7% (hazard ratio, 3.77; 95% CI, 2.56 to 5.54); ectopic pregnancy, 14.6% vs. 2.9% (hazard ratio, 3.72; 95% CI, 2.58 to 5.38); preeclampsia, 26.4% vs. 13.7% (hazard ratio 1.42; 95% CI, 1.07 to 1.89); stillbirth, 8.9% vs. 2.6% (hazard ratio, 2.45; 95% CI, 1.33 to 4.54); early menopause, 5.1% vs. 1.7% (hazard ratio, 2.35; 95% CI, 1.67 to 3.31); grade 2 or higher cervical intraepithelial neoplasia, 6.9% vs. 3.4% (hazard ratio, 2.28; 95% CI, 1.59 to 3.27); and breast cancer at 40 years of age or older, 3.9% vs. 2.2% (hazard ratio, 1.82; 95% CI, 1.04 to 3.18). For most outcomes, the risks among exposed women were higher for those with vaginal epithelial changes than for those without such changes. In utero exposure of women to DES is associated with a high lifetime risk of a broad spectrum of adverse health outcomes. (Funded by the National Cancer Institute.).
Makizako, Hyuma; Shimada, Hiroyuki; Doi, Takehiko; Tsutsumimoto, Kota; Nakakubo, Sho; Hotta, Ryo; Suzuki, Takao
2017-04-01
Lower extremity functioning is important for maintaining activity in elderly people. Optimal cutoff points for standard measurements of lower extremity functioning would help identify elderly people who are not disabled but have a high risk of developing disability. The purposes of this study were: (1) to determine the optimal cutoff points of the Five-Times Sit-to-Stand Test and the Timed "Up & Go" Test for predicting the development of disability and (2) to examine the impact of poor performance on both tests on the prediction of the risk of disability in elderly people dwelling in the community. This was a prospective cohort study. A population of 4,335 elderly people dwelling in the community (mean age = 71.7 years; 51.6% women) participated in baseline assessments. Participants were monitored for 2 years for the development of disability. During the 2-year follow-up period, 161 participants (3.7%) developed disability. The optimal cutoff points of the Five-Times Sit-to-Stand Test and the Timed "Up & Go" Test for predicting the development of disability were greater than or equal to 10 seconds and greater than or equal to 9 seconds, respectively. Participants with poor performance on the Five-Times Sit-to-Stand Test (hazard ratio = 1.88; 95% CI = 1.11-3.20), the Timed "Up & Go" Test (hazard ratio = 2.24; 95% CI = 1.42-3.53), or both tests (hazard ratio = 2.78; 95% CI = 1.78-4.33) at the baseline assessment had a significantly higher risk of developing disability than participants who had better lower extremity functioning. All participants had good initial functioning and participated in assessments on their own. Causes of disability were not assessed. Assessments of lower extremity functioning with the Five-Times Sit-to-Stand Test and the Timed "Up & Go" Test, especially poor performance on both tests, were good predictors of future disability in elderly people dwelling in the community. © 2017 American Physical Therapy Association
Assessing the Impact of Analytical Error on Perceived Disease Severity.
Kroll, Martin H; Garber, Carl C; Bi, Caixia; Suffin, Stephen C
2015-10-01
The perception of the severity of disease from laboratory results assumes that the results are free of analytical error; however, analytical error creates a spread of results into a band and thus a range of perceived disease severity. To assess the impact of analytical errors by calculating the change in perceived disease severity, represented by the hazard ratio, using non-high-density lipoprotein (nonHDL) cholesterol as an example. We transformed nonHDL values into ranges using the assumed total allowable errors for total cholesterol (9%) and high-density lipoprotein cholesterol (13%). Using a previously determined relationship between the hazard ratio and nonHDL, we calculated a range of hazard ratios for specified nonHDL concentrations affected by analytical error. Analytical error, within allowable limits, created a band of values of nonHDL, with a width spanning 30 to 70 mg/dL (0.78-1.81 mmol/L), depending on the cholesterol and high-density lipoprotein cholesterol concentrations. Hazard ratios ranged from 1.0 to 2.9, a 16% to 50% error. Increased bias widens this range and decreased bias narrows it. Error-transformed results produce a spread of values that straddle the various cutoffs for nonHDL. The range of the hazard ratio obscures the meaning of results, because the spread of ratios at different cutoffs overlap. The magnitude of the perceived hazard ratio error exceeds that for the allowable analytical error, and significantly impacts the perceived cardiovascular disease risk. Evaluating the error in the perceived severity (eg, hazard ratio) provides a new way to assess the impact of analytical error.
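A worked example of the error band described above (my own illustrative numbers; the hazard-ratio relationship used at the end is a made-up log-linear stand-in, not the paper's fitted model):

```python
import math

tc, hdl = 220.0, 45.0                        # mg/dL (hypothetical laboratory result)
nonhdl = tc - hdl                            # reported nonHDL = 175 mg/dL

low  = tc * (1 - 0.09) - hdl * (1 + 0.13)    # worst-case low:  149.35 mg/dL
high = tc * (1 + 0.09) - hdl * (1 - 0.13)    # worst-case high: 200.65 mg/dL
print(low, high, high - low)                 # band width of about 51 mg/dL here

# Made-up log-linear risk model for illustration: HR doubles per 40 mg/dL of nonHDL
hr = lambda x: math.exp(math.log(2) / 40 * (x - 130))
print(hr(low), hr(nonhdl), hr(high))         # spread of perceived hazard ratios
```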
Pancreatic β-Cell Function and Prognosis of Nondiabetic Patients With Ischemic Stroke.
Pan, Yuesong; Chen, Weiqi; Jing, Jing; Zheng, Huaguang; Jia, Qian; Li, Hao; Zhao, Xingquan; Liu, Liping; Wang, Yongjun; He, Yan; Wang, Yilong
2017-11-01
Pancreatic β-cell dysfunction is an important factor in the development of type 2 diabetes mellitus. This study aimed to estimate the association between β-cell dysfunction and prognosis of nondiabetic patients with ischemic stroke. Patients with ischemic stroke without a history of diabetes mellitus in the ACROSS-China (Abnormal Glucose Regulation in Patients with Acute Stroke across China) registry were included. The disposition index was estimated, using the computer-based homeostatic model assessment 2 (HOMA2) based on fasting C-peptide levels, as HOMA2-β% divided by HOMA2-insulin resistance. Outcomes included stroke recurrence, all-cause death, and dependency (modified Rankin Scale, 3-5) at 12 months after onset. Of the 1171 patients, 37.2% were women, and the mean age was 62.4 years. At 12 months, 167 (14.8%) patients had recurrent stroke, 110 (9.4%) had died, and 184 (16.0%) were dependent. The first quartile of the disposition index was associated with an increased risk of stroke recurrence (adjusted hazard ratio, 3.57; 95% confidence interval, 2.13-5.99) and dependency (adjusted hazard ratio, 2.30; 95% confidence interval, 1.21-4.38); both the first and second quartiles of the disposition index were associated with an increased risk of death (adjusted hazard ratio, 5.09; 95% confidence interval, 2.51-10.33; adjusted hazard ratio, 2.42; 95% confidence interval, 1.17-5.03) compared with the fourth quartile. Using a multivariable regression model with restricted cubic splines, we observed an L-shaped association between the disposition index and the risk of each end point. In this large-scale registry, β-cell dysfunction was associated with an increased risk of 12-month poor prognosis in nondiabetic patients with ischemic stroke. © 2017 American Heart Association, Inc.
Falkstedt, Daniel; Wolff, Valerie; Allebeck, Peter; Hemmingsson, Tomas; Danielsson, Anna-Karin
2017-02-01
Current knowledge on cannabis use in relation to stroke is based almost exclusively on clinical reports. By using a population-based cohort, we aimed to find out whether there was an association between cannabis use and early-onset stroke, when accounting for the use of tobacco and alcohol. The cohort comprises 49 321 Swedish men, born between 1949 and 1951, who were conscripted into compulsory military service between the ages of 18 and 20. All men answered 2 detailed questionnaires at conscription and were subject to examinations of physical aptitude, psychological functioning, and medical status. Information on stroke events up to ≈60 years of age was obtained from national databases; this includes strokes experienced before 45 years of age. No associations between cannabis use in young adulthood and strokes experienced ≤45 years of age or beyond were found in multivariable models: cannabis use >50 times, hazard ratios=0.93 (95% confidence interval [CI], 0.34-2.57) and 0.95 (95% CI, 0.59-1.53). Although an almost doubled risk of ischemic stroke was observed in those with cannabis use >50 times, this risk was attenuated when adjusted for tobacco usage: hazard ratio=1.47 (95% CI, 0.83-2.56). Smoking ≥20 cigarettes per day was clearly associated both with strokes before 45 years of age, hazard ratio=5.04 (95% CI, 2.80-9.06), and with strokes throughout the follow-up, hazard ratio=2.15 (95% CI, 1.61-2.88). We found no evident association between cannabis use in young adulthood and stroke, including strokes before 45 years of age. Tobacco smoking, however, showed a clear, dose-response-shaped association with stroke. © 2016 American Heart Association, Inc.
de León, A Cabrera; Coello, S Domínguez; González, D Almeida; Díaz, B Brito; Rodríguez, J C del Castillo; Hernández, A González; Aguirre-Jaime, A; Pérez, M del Cristo Rodríguez
2012-03-01
To estimate the incidence rate and risk factors for diabetes in the Canary Islands. A total of 5521 adults without diabetes were followed for a median of 3.5 years. Incident cases of diabetes were self-declared and validated in medical records. The following factors were assessed by Cox regression to estimate the hazard ratios for diabetes: impaired fasting glucose (5.6 mmol/l ≤ fasting glucose ≤ 6.9 mmol/l), BMI, waist-to-height ratio (≥ 0.55), insulin resistance (defined as triglycerides/HDL cholesterol ≥ 3), familial antecedents of diabetes, Canarian ancestry, smoking, alcohol intake, sedentary lifestyle, Mediterranean diet, social class and the metabolic syndrome. The incidence rate was 7.5/10(3) person-years (95% CI 6.4-8.8). The greatest risks were obtained for impaired fasting glucose (hazard ratio 2.6; 95% CI 1.8-3.8), Canarian ancestry (hazard ratio 1.9; 95% CI 1.0-3.4), waist-to-height ratio (hazard ratio 1.7; 95% CI 1.1-2.5), insulin resistance (hazard ratio 1.5; 95% CI 1.0-2.2) and paternal history of diabetes (hazard ratio 1.5; 95% CI 1.0-2.3). The metabolic syndrome (hazard ratio 1.9; 95% CI 1.3-2.8) and BMI ≥ 30 kg/m(2) (hazard ratio 1.7; 95% CI 1.0-2.7) were significant only when their effects were not adjusted for impaired fasting glucose and waist-to-height ratio, respectively. The incidence of diabetes in the Canary Islands is 1.5-fold higher than that in continental Spain and 1.7-fold higher than in the UK. The main predictors of diabetes were impaired fasting glucose, Canarian ancestry, waist-to-height ratio and insulin resistance. The metabolic syndrome predicted diabetes only when its effect was not adjusted for impaired fasting glucose. In individuals with Canarian ancestry, genetic susceptibility studies may be advisable. In order to propose preventive strategies, impaired fasting glucose, waist-to-height ratio and triglyceride/HDL cholesterol should be used to identify subjects with an increased risk of developing diabetes. © 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
Zhang, Kun; Gao, Baoshan; Wang, Yuantao; Wang, Gang; Wang, Weigang; Zhu, Yaxiang; Yao, Liyu; Gu, Yiming; Chen, Mo; Zhou, Honglan; Fu, Yaowen
2015-01-01
Since the association between serum uric acid and kidney transplant graft outcome remains disputable, we sought to evaluate the predictive value of the uric acid level for graft survival/function and the factors that could affect uric acid over time. A consecutive cohort of 573 recipients transplanted between January 2008 and December 2011 was recruited. Data and laboratory values of interest were collected at 1, 3, 6, 12, 24 and 36 months post-transplant for analysis. Cox proportional hazards models and multiple regression equations were built to adjust for possible confounding variables as appropriate. The cohort was followed for 41.86 ± 15.49 months. Uric acid level was negatively associated with eGFR at each time point after adjustment for age, body mass index and male gender (standardized β ranged from -0.15 to -0.30, all P<0.001). Male recipients with low eGFR but high TG levels who were on CSA, diuretics and RAS inhibitors, who experienced at least one episode of acute rejection, or who had diabetes had a higher mean uric acid level. Hyperuricemia was a significant independent predictor of pure graft failure (hazard ratio=4.01, 95% CI: 1.25-12.91, P=0.02) after adjustment, but it was no longer an independent risk factor for overall graft loss after adjustment. Interestingly, a higher triglyceride level made graft loss (hazard ratio=1.442 per mmol/l increase, 95% CI: 1.008-2.061, P=0.045) and death (hazard ratio=1.717, 95% CI: 1.105-2.665, P=0.016) more likely. The results of our study suggest that an elevated post-transplant serum uric acid level is an independent predictor of long-term graft survival and graft function. Together with the impact of high TG levels on poor outcomes, further investigation of therapeutic interventions is needed. PMID:26208103
Fitzhugh, Courtney D; Hsieh, Matthew M; Allen, Darlene; Coles, Wynona A; Seamon, Cassie; Ring, Michael; Zhao, Xiongce; Minniti, Caterina P; Rodgers, Griffin P; Schechter, Alan N; Tisdale, John F; Taylor, James G
2015-01-01
Adults with sickle cell anemia (HbSS) are inconsistently treated with hydroxyurea. We retrospectively evaluated the effects of elevating fetal hemoglobin with hydroxyurea on organ damage and survival in patients enrolled in our screening study between 2001 and 2010. An electronic medical record facilitated development of a database for comparison of study parameters based on hydroxyurea exposure and dose. This study is registered with ClinicalTrials.gov, number NCT00011648. Three hundred eighty-three adults with homozygous sickle cell disease were analyzed with 59 deaths during study follow-up. Cox regression analysis revealed deceased subjects had more hepatic dysfunction (elevated alkaline phosphatase, Hazard Ratio = 1.005, 95% CI 1.003-1.006, p<0.0001), kidney dysfunction (elevated creatinine, Hazard Ratio = 1.13, 95% CI 1.00-1.27, p = 0.043), and cardiopulmonary dysfunction (elevated tricuspid jet velocity on echocardiogram, Hazard Ratio = 2.22, 95% CI 1.23-4.02, p = 0.0082). Sixty-six percent of subjects were treated with hydroxyurea, although only 66% of those received a dose within the recommended therapeutic range. Hydroxyurea use was associated with improved survival (Hazard Ratio = 0.58, 95% CI 0.34-0.97, p = 0.040). This effect was most pronounced in those taking the recommended dose of 15-35 mg/kg/day (Hazard Ratio 0.36, 95% CI 0.17-0.73, p = 0.0050). Hydroxyurea use was not associated with changes in organ function over time. Further, subjects with higher fetal hemoglobin responses to hydroxyurea were more likely to survive (p = 0.0004). While alkaline phosphatase was lowest in patients with the best fetal hemoglobin response (95.4 versus 123.6, p = 0.0065 and 96.1 versus 113.6 U/L, p = 0.041 at first and last visits, respectively), other markers of organ damage were not consistently improved over time in patients with the highest fetal hemoglobin levels. Our data suggest that adults should be treated with the maximum tolerated hydroxyurea dose, ideally before organ damage occurs. Prospective studies are indicated to validate these findings.
Fitzhugh, Courtney D.; Hsieh, Matthew M.; Allen, Darlene; Coles, Wynona A.; Seamon, Cassie; Ring, Michael; Zhao, Xiongce; Minniti, Caterina P.; Rodgers, Griffin P.; Schechter, Alan N.; Tisdale, John F.; Taylor, James G.
2015-01-01
Background Adults with sickle cell anemia (HbSS) are inconsistently treated with hydroxyurea. Objectives We retrospectively evaluated the effects of elevating fetal hemoglobin with hydroxyurea on organ damage and survival in patients enrolled in our screening study between 2001 and 2010. Methods An electronic medical record facilitated development of a database for comparison of study parameters based on hydroxyurea exposure and dose. This study is registered with ClinicalTrials.gov, number NCT00011648. Results Three hundred eighty-three adults with homozygous sickle cell disease were analyzed with 59 deaths during study follow-up. Cox regression analysis revealed deceased subjects had more hepatic dysfunction (elevated alkaline phosphatase, Hazard Ratio = 1.005, 95% CI 1.003–1.006, p<0.0001), kidney dysfunction (elevated creatinine, Hazard Ratio = 1.13, 95% CI 1.00–1.27, p = 0.043), and cardiopulmonary dysfunction (elevated tricuspid jet velocity on echocardiogram, Hazard Ratio = 2.22, 95% CI 1.23–4.02, p = 0.0082). Sixty-six percent of subjects were treated with hydroxyurea, although only 66% of those received a dose within the recommended therapeutic range. Hydroxyurea use was associated with improved survival (Hazard Ratio = 0.58, 95% CI 0.34–0.97, p = 0.040). This effect was most pronounced in those taking the recommended dose of 15–35 mg/kg/day (Hazard Ratio 0.36, 95% CI 0.17–0.73, p = 0.0050). Hydroxyurea use was not associated with changes in organ function over time. Further, subjects with higher fetal hemoglobin responses to hydroxyurea were more likely to survive (p = 0.0004). While alkaline phosphatase was lowest in patients with the best fetal hemoglobin response (95.4 versus 123.6, p = 0.0065 and 96.1 versus 113.6 U/L, p = 0.041 at first and last visits, respectively), other markers of organ damage were not consistently improved over time in patients with the highest fetal hemoglobin levels. Conclusions Our data suggest that adults should be treated with the maximum tolerated hydroxyurea dose, ideally before organ damage occurs. Prospective studies are indicated to validate these findings. PMID:26576059
Correction of Selection Bias in Survey Data: Is the Statistical Cure Worse Than the Bias?
Hanley, James A
2017-04-01
In previous articles in the American Journal of Epidemiology (Am J Epidemiol. 2013;177(5):431-442) and American Journal of Public Health (Am J Public Health. 2013;103(10):1895-1901), Masters et al. reported age-specific hazard ratios for the contrasts in mortality rates between obesity categories. They corrected the observed hazard ratios for selection bias caused by what they postulated was the nonrepresentativeness of the participants in the National Health Interview Study that increased with age, obesity, and ill health. However, it is possible that their regression approach to remove the alleged bias has not produced, and in general cannot produce, sensible hazard ratio estimates. First, we must consider how many nonparticipants there might have been in each category of obesity and of age at entry and how much higher the mortality rates would have to be in nonparticipants than in participants in these same categories. What plausible set of numerical values would convert the ("biased") decreasing-with-age hazard ratios seen in the data into the ("unbiased") increasing-with-age ratios that they computed? Can these values be encapsulated in (and can sensible values be recovered from) one additional internal variable in a regression model? Second, one must examine the age pattern of the hazard ratios that have been adjusted for selection. Without the correction, the hazard ratios are attenuated with increasing age. With it, the hazard ratios at older ages are considerably higher, but those at younger ages are well below one. Third, one must test whether the regression approach suggested by Masters et al. would correct the nonrepresentativeness that increased with age and ill health that I introduced into real and hypothetical data sets. I found that the approach did not recover the hazard ratio patterns present in the unselected data sets: the corrections overshot the target at older ages and undershot it at lower ages.
Association of serum bicarbonate with incident functional limitation in older adults.
Yenchek, Robert; Ix, Joachim H; Rifkin, Dena E; Shlipak, Michael G; Sarnak, Mark J; Garcia, Melissa; Patel, Kushang V; Satterfield, Suzanne; Harris, Tamara B; Newman, Anne B; Fried, Linda F
2014-12-05
Cross-sectional studies have found that low serum bicarbonate is associated with slower gait speed. Whether bicarbonate levels independently predict the development of functional limitation has not been previously studied. Whether bicarbonate was associated with incident persistent lower extremity functional limitation and whether the relationship differed in individuals with and without CKD were assessed in participants in the Health, Aging, and Body Composition study, a prospective study of well-functioning older individuals. Functional limitation was defined as difficulty in walking 0.25 miles or up 10 stairs on two consecutive reports 6 months apart in the same activity (stairs or walking). Kidney function was measured using eGFR by the Chronic Kidney Disease Epidemiology Collaboration creatinine equation, and CKD was defined as an eGFR<60 ml/min per 1.73 m². Serum bicarbonate was measured using arterialized venous blood gas. Cox proportional hazards analysis was used to assess the association of bicarbonate (<23, 23-25.9, and ≥26 mEq/L) with functional limitation. Mixed model linear regression was performed to assess the association of serum bicarbonate with change in gait speed over time. Of 1544 participants, 412 participants developed incident persistent functional limitation events over a median 4.4 years (interquartile range, 3.1 to 4.5). Compared with ≥26 mEq/L, lower serum bicarbonate was associated with functional limitation. After adjustment for demographics, CKD, diabetes, body mass index, smoking, diuretic use, and gait speed, lower serum bicarbonate was significantly associated with functional limitation (hazard ratio, 1.35; 95% confidence interval, 1.08 to 1.68 and hazard ratio, 1.58; 95% confidence interval, 1.12 to 2.22 for bicarbonate levels from 23 to 25.9 and <23, respectively). There was not a significant interaction of bicarbonate with CKD. In addition, bicarbonate was not significantly associated with change in gait speed. Lower serum bicarbonate was associated with greater risk of incident, persistent functional limitation. This association was present in individuals with and without CKD. Copyright © 2014 by the American Society of Nephrology.
Association of Serum Bicarbonate with Incident Functional Limitation in Older Adults
Yenchek, Robert; Ix, Joachim H.; Rifkin, Dena E.; Shlipak, Michael G.; Sarnak, Mark J.; Garcia, Melissa; Patel, Kushang V.; Satterfield, Suzanne; Harris, Tamara B.; Newman, Anne B.
2014-01-01
Background and objectives Cross-sectional studies have found that low serum bicarbonate is associated with slower gait speed. Whether bicarbonate levels independently predict the development of functional limitation has not been previously studied. Whether bicarbonate was associated with incident persistent lower extremity functional limitation and whether the relationship differed in individuals with and without CKD were assessed in participants in the Health, Aging, and Body Composition study, a prospective study of well-functioning older individuals. Design, setting, participants, & measurements Functional limitation was defined as difficulty in walking 0.25 miles or up 10 stairs on two consecutive reports 6 months apart in the same activity (stairs or walking). Kidney function was measured using eGFR by the Chronic Kidney Disease Epidemiology Collaboration creatinine equation, and CKD was defined as an eGFR<60 ml/min per 1.73 m². Serum bicarbonate was measured using arterialized venous blood gas. Cox proportional hazards analysis was used to assess the association of bicarbonate (<23, 23–25.9, and ≥26 mEq/L) with functional limitation. Mixed model linear regression was performed to assess the association of serum bicarbonate with change in gait speed over time. Results Of 1544 participants, 412 participants developed incident persistent functional limitation events over a median 4.4 years (interquartile range, 3.1 to 4.5). Compared with ≥26 mEq/L, lower serum bicarbonate was associated with functional limitation. After adjustment for demographics, CKD, diabetes, body mass index, smoking, diuretic use, and gait speed, lower serum bicarbonate was significantly associated with functional limitation (hazard ratio, 1.35; 95% confidence interval, 1.08 to 1.68 and hazard ratio, 1.58; 95% confidence interval, 1.12 to 2.22 for bicarbonate levels from 23 to 25.9 and <23, respectively). There was not a significant interaction of bicarbonate with CKD. In addition, bicarbonate was not significantly associated with change in gait speed. Conclusions Lower serum bicarbonate was associated with greater risk of incident, persistent functional limitation. This association was present in individuals with and without CKD. PMID:25381341
Moss, Arthur J.; Shimizu, Wataru; Wilde, Arthur A.M.; Towbin, Jeffrey A.; Zareba, Wojciech; Robinson, Jennifer L.; Qi, Ming; Vincent, G. Michael; Ackerman, Michael J.; Kaufman, Elizabeth S.; Hofman, Nynke; Seth, Rahul; Kamakura, Shiro; Miyamoto, Yoshihiro; Goldenberg, Ilan; Andrews, Mark L.; McNitt, Scott
2012-01-01
Background Type-1 long-QT syndrome (LQTS) is caused by loss-of-function mutations in the KCNQ1-encoded IKs cardiac potassium channel. We evaluated the effect of location, coding type, and biophysical function of KCNQ1 mutations on the clinical phenotype of this disorder. Methods and Results We investigated the clinical course in 600 patients with 77 different KCNQ1 mutations in 101 proband-identified families derived from the US portion of the International LQTS Registry (n=425), the Netherlands’ LQTS Registry (n=93), and the Japanese LQTS Registry (n=82). The Cox proportional hazards survivorship model was used to evaluate the independent contribution of clinical and genetic factors to the first occurrence of time-dependent cardiac events from birth through age 40 years. The clinical characteristics, distribution of mutations, and overall outcome event rates were similar in patients enrolled from the 3 geographic regions. Biophysical function of the mutations was categorized according to dominant-negative (>50%) or haploinsufficiency (≤50%) reduction in cardiac repolarizing IKs potassium channel current. Patients with transmembrane versus C-terminus mutations (hazard ratio, 2.06; P<0.001) and those with mutations having dominant-negative versus haploinsufficiency ion channel effects (hazard ratio, 2.26; P<0.001) were at increased risk for cardiac events, and these genetic risks were independent of traditional clinical risk factors. Conclusions This genotype–phenotype study indicates that in type-1 LQTS, mutations located in the transmembrane portion of the ion channel protein and the degree of ion channel dysfunction caused by the mutations are important independent risk factors influencing the clinical course of this disorder. PMID:17470695
Rios, Camilo; Motola-Kuba, Daniel; Matus-Santos, Juan; Villa, Antonio R; Moreno-Jimenez, Sergio
2016-01-01
Objective: A long-lasting concern has prevailed for the identification of predictive biomarkers for high-grade gliomas (HGGs) using MRI. However, a consensus of which imaging parameters assemble a significant survival model is still missing in the literature; we investigated the significant positive or negative contribution of several MR biomarkers in this tumour prognosis. Methods: A retrospective cohort of supratentorial HGGs [11 glioblastoma multiforme (GBM) and 17 anaplastic astrocytomas] included 28 patients (9 females and 19 males, with a mean age of 50.4 years, standard deviation: 16.28 years; range: 13–85 years). Oedema and viable tumour measurements were acquired using regions of interest in T1 weighted, T2 weighted, fluid-attenuated inversion recovery, apparent diffusion coefficient (ADC) and MR spectroscopy (MRS). We calculated Kaplan–Meier curves and obtained Cox proportional hazards estimates. Results: During the follow-up period (3–98 months), 17 deaths were recorded. The median survival time was 1.73 years (range, 0.287–8.947 years). Only 3 out of 20 covariates (choline-to-N-acetyl aspartate and lipids-lactate-to-creatine ratios and age) showed significance in explaining the variability in the survival hazards model; score test: χ2 (3) = 9.098, p = 0.028. Conclusion: MRS metabolites outperform volumetric parameters of peritumoral oedema and viable tumour, as well as tumour-region ADC measurements. Specific MRS ratios (Cho/Naa, L-L/Cr) might be considered in a regular follow-up for these tumours. Advances in knowledge: Cho/Naa ratio is the strongest survival predictor with a log-hazard function of 2.672 in GBM. Low levels of lipids–lactate/Cr ratio represent up to a 41.6% reduction in the risk of death in GBM. PMID:27626830
A balanced hazard ratio for risk group evaluation from survival data.
Branders, Samuel; Dupont, Pierre
2015-07-30
Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high or low risk groups. Whenever applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to select the most appropriate therapy accordingly. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated to it; and (3) some marginal changes between risk group proportions may lead to highly different hazard ratio values. Those issues could lead to inappropriate comparisons between various prognostic factors. Next, we propose the balanced hazard ratio to solve those issues. This new performance metric keeps an intuitive interpretation and is as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice to define risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are also reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic markers. Copyright © 2015 John Wiley & Sons, Ltd.
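To make the first limitation above concrete, the following sketch (simulated data, assuming the open-source Python package lifelines; it does not implement the balanced hazard ratio itself) dichotomizes one and the same continuous risk score at two different cut-offs and fits a standard Cox model to the resulting group indicator. The group hazard ratio and the group sizes can change substantially with the cut-off alone.

```python
# Illustrative sketch only: the group hazard ratio for a dichotomized risk score
# depends strongly on the chosen cut-off, one of the limitations motivating the
# balanced hazard ratio. Simulated data; lifelines assumed installed.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
score = rng.normal(size=n)                        # continuous prognostic risk score
T = rng.exponential(scale=np.exp(-0.7 * score))   # higher score -> shorter survival
C = rng.exponential(scale=2.0, size=n)            # censoring times
df = pd.DataFrame({"time": np.minimum(T, C), "event": (T <= C).astype(int)})

for cutoff in (np.median(score), np.percentile(score, 90)):
    df["high_risk"] = (score > cutoff).astype(int)
    cph = CoxPHFitter().fit(df[["time", "event", "high_risk"]],
                            duration_col="time", event_col="event")
    hr = float(np.exp(cph.params_["high_risk"]))
    sizes = df["high_risk"].value_counts().to_dict()
    print(f"cut-off={cutoff:.2f}  group sizes={sizes}  group HR={hr:.2f}")
```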
Tsiliyannis, Christos Aristeides
2013-09-01
Hazardous waste incinerators (HWIs) differ substantially from thermal power facilities, since instead of maximizing energy production with the minimum amount of fuel, they aim at maximizing throughput. Variations in quantity or composition of received waste loads may significantly diminish HWI throughput (the decisive profit factor) from its nominal design value. A novel formulation of combustion balance is presented, based on linear operators, which isolates the wastefeed vector from the invariant combustion stoichiometry kernel. Explicit expressions for the throughput are obtained, in terms of incinerator temperature, fluegas heat recuperation ratio and design parameters, for an arbitrary number of wastes, based on fundamental principles (mass and enthalpy balances). The impact of waste variations, of recuperation ratio and of furnace temperature is explicitly determined. It is shown that in the presence of waste uncertainty, the throughput may be a decreasing or increasing function of incinerator temperature and recuperation ratio, depending on the sign of a dimensionless parameter related only to the uncertain wastes. The dimensionless parameter is proposed as a sharp a priori waste 'fingerprint', determining the necessary increase or decrease of manipulated variables (recuperation ratio, excess air, auxiliary fuel feed rate, auxiliary air flow) in order to balance the HWI and maximize throughput under uncertainty in received wastes. A 10-step procedure is proposed for direct application subject to process capacity constraints. The results may be useful for efficient HWI operation and for preparing hazardous waste blends. Copyright © 2013 Elsevier Ltd. All rights reserved.
Drazner, Mark H; Velez-Martinez, Mariella; Ayers, Colby R; Reimold, Sharon C; Thibodeau, Jennifer T; Mishkin, Joseph D; Mammen, Pradeep P A; Markham, David W; Patel, Chetan B
2013-03-01
Although right atrial pressure (RAP) and pulmonary capillary wedge pressure (PCWP) are correlated in heart failure, in a sizeable minority of patients, the RAP and PCWP are not tightly coupled. The basis of this variability in the RAP/PCWP ratio, and whether it conveys prognostic value, is not known. We analyzed the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial database. Baseline characteristics, including echocardiographic assessment of right ventricular (RV) structure and function, and invasively measured hemodynamic parameters, were compared among tertiles of the RAP/PCWP ratio. Multivariable Cox proportional hazard models assessed the association of RAP/PCWP ratio with the primary ESCAPE outcome (6-month death or hospitalization [days]) adjusting for systolic blood pressure, blood urea nitrogen, 6-minute walk distance, and PCWP. The RAP/PCWP tertiles were 0.27 to 0.4 (tertile 1), 0.41 to 0.615 (tertile 2), and 0.62 to 1.21 (tertile 3). Increasing RAP/PCWP was associated with increasing median right atrial area (23, 26, 29 cm2, respectively; P<0.005), RV area in diastole (21, 27, 27 cm2, respectively; P<0.005), and pulmonary vascular resistance (2.4, 2.9, 3.6 Wood units, respectively; P=0.003), and lower RV stroke work index (8.6, 8.4, 5.5 g·m/m2 per beat, respectively; P<0.001). RAP/PCWP ratio was associated with death or hospitalization within 6 months (hazard ratio, 1.16 [1, 1.4]; P<0.05). Increased RAP/PCWP ratio was associated with higher pulmonary vascular resistance, reduced RV function (manifest as a larger right atrium and ventricle and lower RV stroke work index), and an increased risk of adverse outcomes in patients with advanced heart failure.
The association between mental health, physical function, and hemodialysis mortality.
Knight, Eric L; Ofsthun, Norma; Teng, Ming; Lazarus, J Michael; Curhan, Gary C
2003-05-01
Mortality rates for individuals on chronic hemodialysis remain very high; therefore, strategies are needed to identify individuals at greatest risk for mortality so preventive strategies can be implemented. One such approach is to stratify individuals by self-reported mental health and physical function. Examining these parameters at baseline, and over time, may help identify individuals at greater risk for mortality. We enrolled 14,815 individuals with end-stage renal disease (ESRD) and followed these individuals for up to 2 years. The mean age was 61.0 ± 15.4 years (range, 20 to 96 years) and 31% were African Americans. The SF-36 Health Survey was administered 1 to 3 months after hemodialysis initiation and 6 months later. We examined the associations between the initial SF-36 Health Survey mental component summary (MCS) and physical component summary (PCS) scores and mortality during the follow-up period, and examined the associations between 6-month decline in PCS and MCS scores and subsequent mortality. We also examined the interactions between age and MCS and PCS scores. The general population-based mean of each of these scores was 50 with a standard deviation of 10. The main outcome measurement was death. Self-reported baseline mental health (MCS score) and physical function (PCS score) were both independently associated with increased mortality, and 6-month decline in these parameters was also associated with increased mortality. The multivariate hazard ratios for 1-year mortality for MCS scores of less than 30, 30 to 39, and 40 to 49 were 1.48 (95% CI, 1.32 to 1.64), 1.23 (95% CI, 1.14 to 1.32) and 1.18 (95% CI, 1.10 to 1.26) compared with a MCS score of 50 or more. The hazard ratios for PCS scores of less than 20, 20 to 29, and 30 to 39 were 1.97 (95% CI, 1.64 to 2.36), 1.62 (95% CI, 1.36 to 1.92), and 1.32 (95% CI, 1.11 to 1.57) compared with a PCS score of 50 or more. Six-month decline in self-reported mental health (hazard ratio, 1.07; 95% CI, 1.02 to 1.12, per 10-point decline in MCS score) and physical function (hazard ratio, 1.25; 95% CI, 1.18 to 1.33, per 10-point decline in PCS score) were also both significantly associated with an additional increase in mortality beyond baseline risk. We also found a significant interaction between age and physical function (P = 0.02). Specifically, there was a graded response between the PCS score category and mortality in most age strata, but this relationship was not observed in the oldest age group (≥85 years). In individuals newly initiated on chronic hemodialysis, self-reported baseline mental health and physical function are important, independent predictors of mortality, and there is a graded relationship between these parameters and mortality risk. Following these parameters over time provides additional information on mortality risk. One must also consider age when interpreting the relationship between physical function and mortality.
Fujino, Yoshihisa; Shazuki, Shuichiro; Izumi, Hiroyuki; Uehara, Masamichi; Muramatsu, Keiji; Kubo, Tatsuhiko; Oyama, Ichiro; Matsuda, Shinya
2016-07-01
This study examined the association between work functioning impairment, as measured by the Work Functioning Impairment Scale (WFun), and subsequent sick leave. A prospective cohort study was conducted at a manufacturer in Japan, and 1263 employees participated. Information on sick leave was gathered during an 18-month follow-up period. The hazard ratios (HRs) of long-term sick leave were substantially increased for those with a WFun score greater than 25 (HR = 3.99, P = 0.003). The incidence rate ratios (IRRs) of days of short-term absence gradually increased as WFun scores increased (IRR = 1.18, P < 0.001 for subjects with a WFun score over 25 compared with those with a score of 14 or less). Assessing work functioning impairment is a useful way of classifying risk for future sick leave among employees.
Townsend, Raymond R; Anderson, Amanda Hyre; Chirinos, Julio A; Feldman, Harold I; Grunwald, Juan E; Nessel, Lisa; Roy, Jason; Weir, Matthew R; Wright, Jackson T; Bansal, Nisha; Hsu, Chi-Yuan
2018-06-01
Patients with chronic kidney diseases (CKDs) are at risk for further loss of kidney function and death, which occur despite reasonable blood pressure treatment. To determine whether arterial stiffness influences CKD progression and death, independent of blood pressure, we conducted a prospective cohort study of CKD patients enrolled in the CRIC study (Chronic Renal Insufficiency Cohort). Using carotid-femoral pulse wave velocity (PWV), we examined the relationship between PWV and end-stage kidney disease (ESRD), ESRD or halving of estimated glomerular filtration rate, or death from any cause. The 2795 participants we enrolled had a mean age of 60 years, 56.4% were men, 47.3% had diabetes mellitus, and the average estimated glomerular filtration rate at entry was 44.4 mL/min per 1.73 m². During follow-up, there were 504 ESRD events, 628 ESRD or halving of estimated glomerular filtration rate events, and 394 deaths. Patients with the highest tertile of PWV (>10.3 m/s) were at higher risk for ESRD (hazard ratio [95% confidence interval], 1.37 [1.05-1.80]), ESRD or 50% decline in estimated glomerular filtration rate (hazard ratio [95% confidence interval], 1.25 [0.98-1.58]), or death (hazard ratio [95% confidence interval], 1.72 [1.24-2.38]). PWV is a significant predictor of CKD progression and death in people with impaired kidney function. Incorporation of PWV measurements may help define better the risks for these important health outcomes in patients with CKDs. Interventions that reduce aortic stiffness deserve study in people with CKD. © 2018 American Heart Association, Inc.
Cigarette Smoking and Incident Heart Failure: Insights From the Jackson Heart Study.
Kamimura, Daisuke; Cain, Loretta R; Mentz, Robert J; White, Wendy B; Blaha, Michael J; DeFilippis, Andrew P; Fox, Ervin R; Rodriguez, Carlos J; Keith, Rachel J; Benjamin, Emelia J; Butler, Javed; Bhatnagar, Aruni; Robertson, Rose M; Winniford, Michael D; Correa, Adolfo; Hall, Michael E
2018-06-12
Cigarette smoking has been linked with several factors associated with cardiac dysfunction. We hypothesized that cigarette smoking is associated with left ventricular (LV) structure and function, and incident heart failure (HF) hospitalization. We investigated 4129 (never smoker n=2884, current smoker n=503, and former smoker n=742) black participants (mean age, 54 years; 63% women) without a history of HF or coronary heart disease at baseline in the Jackson Heart Study. We examined the relationships between cigarette smoking and LV structure and function by using cardiac magnetic resonance imaging among 1092 participants, cigarette smoking and brain natriuretic peptide levels among 3325 participants, and incident HF hospitalization among 3633 participants with complete data. After adjustment for confounding factors, current smoking was associated with higher mean LV mass index and lower mean LV circumferential strain (P<0.05 for both) in comparison with never smoking. Smoking status, intensity, and burden were associated with higher mean brain natriuretic peptide levels (all P<0.05). Over a median follow-up of 8.0 years (7.7-8.0), there were 147 incident HF hospitalizations. After adjustment for traditional risk factors and incident coronary heart disease, current smoking (hazard ratio, 2.82; 95% confidence interval, 1.71-4.64), smoking intensity among current smokers (≥20 cigarettes/d: hazard ratio, 3.48; 95% confidence interval, 1.65-7.32), and smoking burden among ever smokers (≥15 pack-years: hazard ratio, 2.06; 95% confidence interval, 1.29-3.3) were significantly associated with incident HF hospitalization in comparison with never smoking. In blacks, cigarette smoking is an important risk factor for LV hypertrophy, systolic dysfunction, and incident HF hospitalization even after adjusting for effects on coronary heart disease. © 2018 American Heart Association, Inc.
Covariate Measurement Error Correction Methods in Mediation Analysis with Failure Time Data
Zhao, Shanshan
2014-01-01
Summary Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This paper focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error and error associated with temporal variation. The underlying model with the ‘true’ mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling design. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. PMID:25139469
Covariate measurement error correction methods in mediation analysis with failure time data.
Zhao, Shanshan; Prentice, Ross L
2014-12-01
Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This article focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error, and error associated with temporal variation. The underlying model with the "true" mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling designs. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. © 2014, The International Biometric Society.
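A minimal sketch of the generic regression-calibration idea that underlies both proposed approaches, assuming the lifelines package and simulated data; the calibration model, variable names and design below are hypothetical stand-ins, not the authors' mean-variance or follow-up-time estimators.

```python
# Generic regression calibration for an error-prone covariate in a Cox model.
# Sketch of the idea only; the paper's mean-variance and follow-up-time
# calibration approaches are more refined.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
x_true = rng.normal(size=n)                 # "true" mediator (unobserved in practice)
w = x_true + rng.normal(scale=0.8, size=n)  # observed, error-prone measurement
z = rng.binomial(1, 0.5, size=n)            # treatment / exposure indicator
T = rng.exponential(scale=np.exp(-(0.5 * z + 0.8 * x_true)))
C = rng.exponential(scale=2.0, size=n)
time, event = np.minimum(T, C), (T <= C).astype(int)

# Calibration step: E[X | W, Z] estimated by ordinary least squares.
# Here we can use x_true because the data are simulated; in practice this
# regression comes from a reliability or replicate-measurement subsample.
design = np.column_stack([np.ones(n), w, z])
beta, *_ = np.linalg.lstsq(design, x_true, rcond=None)
x_cal = design @ beta

df = pd.DataFrame({"time": time, "event": event, "z": z, "x_cal": x_cal})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)"]])
```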
Navar, Ann Marie; Gallup, Dianne S; Lokhnygina, Yuliya; Green, Jennifer B; McGuire, Darren K; Armstrong, Paul W; Buse, John B; Engel, Samuel S; Lachin, John M; Standl, Eberhard; Van de Werf, Frans; Holman, Rury R; Peterson, Eric D
2017-11-01
Systolic blood pressure (SBP) treatment targets for adults with diabetes mellitus remain unclear. SBP levels among 12 275 adults with diabetes mellitus, prior cardiovascular disease, and treated hypertension were evaluated in the TECOS (Trial Evaluating Cardiovascular Outcomes With Sitagliptin) randomized trial of sitagliptin versus placebo. The association between baseline SBP and recurrent cardiovascular disease was evaluated using multivariable Cox proportional hazards modeling with restricted cubic splines, adjusting for clinical characteristics. Kaplan-Meier curves by baseline SBP were created to assess time to cardiovascular disease and 2 potential hypotension-related adverse events: worsening kidney function and fractures. The association between time-updated SBP and outcomes was examined using multivariable Cox proportional hazards models. Overall, 42.2% of adults with diabetes mellitus, cardiovascular disease, and hypertension had an SBP ≥140 mm Hg. The association between SBP and cardiovascular disease risk was U shaped, with a nadir ≈130 mm Hg. When the analysis was restricted to those with baseline SBP of 110 to 150 mm Hg, the adjusted association between SBP and cardiovascular disease risk was flat (hazard ratio per 10-mm Hg increase, 0.96; 95% confidence interval, 0.91-1.02). There was no association between SBP and risk of fracture. Above 150 mm Hg, higher SBP was associated with increasing risk of worsening kidney function (hazard ratio per 10-mm Hg increase, 1.10; 95% confidence interval, 1.02-1.18). Many patients with diabetes mellitus have uncontrolled hypertension. The U-shaped association between SBP and cardiovascular disease events was largely driven by those with very high or low SBP, with no difference in cardiovascular disease risk between 110 and 150 mm Hg. Lower SBP was not associated with higher risks of fractures or worsening kidney function. © 2017 American Heart Association, Inc.
Correction of Selection Bias in Survey Data: Is the Statistical Cure Worse Than the Bias?
Hanley, James A
2017-03-15
In previous articles in the American Journal of Epidemiology (Am J Epidemiol. 2013;177(5):431-442) and American Journal of Public Health (Am J Public Health. 2013;103(10):1895-1901), Masters et al. reported age-specific hazard ratios for the contrasts in mortality rates between obesity categories. They corrected the observed hazard ratios for selection bias caused by what they postulated was the nonrepresentativeness of the participants in the National Health Interview Study that increased with age, obesity, and ill health. However, it is possible that their regression approach to remove the alleged bias has not produced, and in general cannot produce, sensible hazard ratio estimates. First, one must consider how many nonparticipants there might have been in each category of obesity and of age at entry and how much higher the mortality rates would have to be in nonparticipants than in participants in these same categories. What plausible set of numerical values would convert the ("biased") decreasing-with-age hazard ratios seen in the data into the ("unbiased") increasing-with-age ratios that they computed? Can these values be encapsulated in (and can sensible values be recovered from) 1 additional internal variable in a regression model? Second, one must examine the age pattern of the hazard ratios that have been adjusted for selection. Without the correction, the hazard ratios are attenuated with increasing age. With it, the hazard ratios at older ages are considerably higher, but those at younger ages are well below 1. Third, one must test whether the regression approach suggested by Masters et al. would correct the nonrepresentativeness that increased with age and ill health that I introduced into real and hypothetical data sets. I found that the approach did not recover the hazard ratio patterns present in the unselected data sets: The corrections overshot the target at older ages and undershot it at lower ages. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
Lee, Sang-Uk; Oh, In-Hwan; Jeon, Hong Jin; Roh, Sungwon
2017-06-01
The relation of income and socioeconomic status with suicide rates remains unclear. Most previous studies have focused on the relationship between suicide rates and macroeconomic factors (e.g., economic growth rate). Therefore, we aimed to identify the relationship between individuals' socioeconomic position and suicide risk. We analyzed suicide mortality rates across socioeconomic positions to identify potential trends using observational data on suicide mortality collected between January 2003 and December 2013 from 1,025,340 national health insurance enrollees. We followed the subjects for 123.5 months on average. Socioeconomic position was estimated using insurance premium levels. To examine the hazard ratios of suicide mortality in various socioeconomic positions, we used Cox proportional hazard models. We found that the hazard ratios of suicide showed an increasing trend as socioeconomic position decreased. After adjusting for gender, age, geographic location, and disability level, Medicaid recipients had the highest suicide hazard ratio (2.28; 95% CI, 1.87-2.77). Among the Medicaid recipients, men had higher hazard ratios than women (2.79; 95% CI, 2.17-3.59 vs. 1.71; 95% CI, 1.25-2.34). Hazard ratios also varied across age groups. The highest hazard ratio was found in the 40-59-year-old group (3.19; 95% CI, 2.31-4.43), whereas the lowest ratio was found in those 60 years and older (1.44; 95% CI, 1.09-1.87). Our results illuminate the relationship between socioeconomic position and suicide rates and can be used to design and implement future policies on suicide prevention. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
Flegal, Katherine M; Kit, Brian K; Graubard, Barry I
2018-01-01
Misclassification of body mass index (BMI) categories arising from self-reported weight and height can bias hazard ratios in studies of BMI and mortality. We examined the effects on hazard ratios of such misclassification using national US survey data for 1976 through 2010 that had both measured and self-reported weight and height along with mortality follow-up for 48,763 adults and a subset of 17,405 healthy never-smokers. BMI was categorized as <22.5 (low), 22.5-24.9 (referent), 25.0-29.9 (overweight), 30.0-34.9 (class I obesity), and ≥35.0 (class II-III obesity). Misreporting at higher BMI categories tended to bias hazard ratios upwards for those categories, but that effect was augmented, counterbalanced, or even reversed by misreporting in other BMI categories, in particular those that affected the reference category. For example, among healthy male never-smokers, misclassifications affecting the overweight and the reference categories changed the hazard ratio for overweight from 0.85 with measured data to 1.24 with self-reported data. Both the magnitude and direction of bias varied according to the underlying hazard ratios in measured data, showing that findings on bias from one study should not be extrapolated to a study with different underlying hazard ratios. Because of misclassification effects, self-reported weight and height cannot reliably indicate the lowest-risk BMI category. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Brunstein, Claudio; Zhang, Mei-Jie; Barker, Juliet; St. Martin, Andrew; Bashey, Asad; de Lima, Marcos; Dehn, Jason; Hematti, Peiman; Perales, Miguel-Angel; Rocha, Vanderson; Territo, Mary; Weisdorf, Daniel; Eapen, Mary
2017-01-01
The effects of inter-unit HLA matching on early outcomes after double cord blood transplantation have not been established. Therefore, we studied the effect of inter-unit HLA mismatching on the outcomes of 449 patients with acute leukemia after double cord blood transplantation. Patients were divided into two groups: one group that included transplantations with inter-unit mismatch at 2 or fewer HLA loci (n=381) and the other group with inter-unit mismatch at 3 or 4 HLA loci (n=68). HLA matching considered low-resolution matching at the HLA-A and -B loci and allele-level matching at HLA-DRB1, the accepted standard for selecting units for double cord blood transplants. Patient, disease, and transplant characteristics were similar in the two groups. We observed no effect of the degree of inter-unit HLA mismatch on neutrophil (Hazard Ratio 1.27, P=0.11) or platelet (Hazard Ratio 1.13, P=0.42) recovery, acute graft-versus-host disease (Hazard Ratio 1.17, P=0.36), treatment-related mortality (Hazard Ratio 0.92, P=0.75), relapse (Hazard Ratio 1.18, P=0.49), treatment failure (Hazard Ratio 0.99, P=0.98), or overall survival (Hazard Ratio 0.98, P=0.91). There were no differences in the proportion of transplants with engraftment of both units by three months (5% after transplantation of units with inter-unit mismatch at ≤2 HLA loci and 4% after transplantation of units with inter-unit mismatch at 3 or 4 HLA loci). Our observations support the elimination of the inter-unit HLA-mismatch criterion when selecting cord blood units in favor of optimizing selection based on individual unit characteristics. PMID:28126967
Muruet, Walter; Rudd, Anthony; Wolfe, Charles D A; Douiri, Abdel
2018-03-01
Intravenous thrombolysis with alteplase is one of the few approved treatments for acute ischemic stroke; nevertheless, little is known about its long-term effects on survival and recovery because clinical trial follow-up times are limited. We studied patients registered between January 2005 and December 2015 in the population-based South London Stroke Register of first-ever strokes. Propensity scores were used to match thrombolyzed cases to controls at a 1:2 ratio on demographic and clinical covariates. The primary outcome was survival up to 10 years using Kaplan-Meier estimates, Cox proportional hazards, and restricted mean survival time. Secondary outcomes included stroke recurrence and functional status (Barthel Index and Frenchay Activities Index scores) at 5 years. From 2052 ischemic strokes, 246 treated patients were matched to 492 controls. Median follow-up time was 5.45 years (interquartile range, 4.56). Survival was higher in the treatment group (median, 5.72 years) compared with the control group (4.98 years; stratified log-rank test P<0.001). The number needed to treat to prevent 1 death was 12 at 5 years and 20 at 10 years. After Cox regression analysis, thrombolysis reduced risk of mortality by 37% (hazard ratio, 0.63; 95% confidence interval [CI], 0.48-0.82) at 10 years; however, after introducing a multiplicative interaction term into the model, mortality risk reduction was 42% (hazard ratio, 0.58; 95% CI, 0.40-0.82) at 10 years for those arriving at the hospital within 3 hours. On average, in a 10-year period, treated patients lived 1 year longer than controls. At 5 years, thrombolysis was associated with independence (Barthel Index≥90; odds ratio, 3.76; 95% CI, 1.22-13.34) and increased odds of a higher Frenchay Activities Index (proportional odds ratio, 2.37; 95% CI, 1.16-4.91). There was no difference in stroke recurrence. Thrombolysis with intravenous alteplase is associated with improved long-term survival and functional status after ischemic stroke. © 2018 The Authors.
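As a hedged illustration of the restricted mean survival time quantity used above (generic Kaplan-Meier integration on simulated data, assuming the lifelines package; not the registry analysis itself): the RMST up to a horizon tau is the area under the survival curve on [0, tau], and the between-group difference estimates the extra time lived within that window.

```python
# Restricted mean survival time (RMST) up to horizon tau: area under the
# Kaplan-Meier step function. Simulated two-group data, not the stroke cohort.
import numpy as np
from lifelines import KaplanMeierFitter

def rmst(durations, events, tau):
    """Area under the Kaplan-Meier step function on [0, tau]."""
    kmf = KaplanMeierFitter().fit(durations, event_observed=events)
    sf = kmf.survival_function_.iloc[:, 0]         # S(t) at the estimated step times
    times = np.clip(sf.index.values, 0, tau)       # truncate the timeline at tau
    widths = np.diff(np.append(times, tau))        # length of each step, up to tau
    return float(np.sum(widths * sf.values))       # sum of step areas

rng = np.random.default_rng(4)
n = 700
treated = rng.binomial(1, 1 / 3, size=n)           # roughly 1:2 treated:control
T = rng.exponential(scale=np.where(treated == 1, 6.0, 5.0))
C = rng.uniform(1, 10, size=n)
time, event = np.minimum(T, C), (T <= C).astype(int)

tau = 10.0
diff = rmst(time[treated == 1], event[treated == 1], tau) - \
       rmst(time[treated == 0], event[treated == 0], tau)
print(f"RMST difference (treated - control) up to {tau} years: {diff:.2f} years")
```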
Debette-Gratien, Marilyne; Woillard, Jean-Baptiste; Picard, Nicolas; Sebagh, Mylène; Loustaud-Ratti, Véronique; Sautereau, Denis; Samuel, Didier; Marquet, Pierre
2016-10-01
This study investigated the influence of the CYP3A4*22, CYP3A5*3, and ABCB1 exons 12, 21, and 26 polymorphisms in donors and recipients on clinical outcomes and renal function in 170 liver transplant patients on cyclosporin A (CsA) or tacrolimus (Tac). Allelic discrimination assays were used for genotyping. Multivariate time-dependent Cox proportional hazard models, multiple linear regression using the generalized estimating equation and linear mixed-effect models were used for statistical analysis. Expression of CYP3A5 by either or both the donor and the recipient was significantly associated with lower Tac, but not CsA, dose-normalized trough levels. In the whole population, graft loss was only significantly associated with longer exposure to high calcineurin inhibitor (CNI) concentrations (hazard ratio, 6.93; 95% confidence interval, 2.13-22.55; P = 0.00129), whereas in the Tac subgroup, the risk of graft loss was significantly higher in recipient CYP3A5*1 expressers (hazard ratio, 3.39; 95% confidence interval, 1.52-7.58; P = 0.0028). Renal function was significantly associated with: (1) baseline modification of diet in renal disease (β = 0.51 ± 0.05; P < 0.0001); (2) duration of patient follow-up (per visit, β = -0.98 ± 0.22; P < 0.0001); and (3) CNI exposure (per quantile increase, β = -2.42 ± 0.59; P < 0.0001). No genetic factor was associated with patient survival, acute rejection, liver function test results, recurrence of viral or other initial liver disease, or renal function. This study confirms the effect of CYP3A5*3 on tacrolimus dose requirement in liver transplantation and shows unexpected associations between the type of, and exposure to, CNI and either chronic rejection or graft loss. None of the genetic polymorphisms studied had a noticeable impact on renal function degradation at 10 years.
On the Interpretation of the Hazard Ratio and Communication of Survival Benefit.
Sashegyi, Andreas; Ferry, David
2017-04-01
This brief communication will clarify the difference between a relative hazard and a relative risk. We highlight the importance of this difference, and demonstrate in practical terms that 1 minus the hazard ratio should not be interpreted as a risk reduction in the commonly understood sense of the term. This article aims to provide a better understanding of the type of risk reduction that a hazard ratio implies, thereby clarifying the intent in the communication among practitioners and researchers and establishing an accurate and realistic foundation for communicating with patients. The Oncologist 2017;22:484-486. © AlphaMed Press 2017.
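The distinction can be made explicit with standard proportional-hazards algebra (a reconstruction for orientation, not text from the article): under proportional hazards the treated survival curve is a power of the control curve, so the risk reduction at a fixed time depends on the control risk and generally differs from 1 minus the hazard ratio.

```latex
% Under proportional hazards with hazard ratio HR, S_1(t) = S_0(t)^{HR}.
% The time-t relative risk of the event and the associated risk reduction are
\mathrm{RR}(t) \;=\; \frac{1 - S_1(t)}{1 - S_0(t)} \;=\; \frac{1 - S_0(t)^{\mathrm{HR}}}{1 - S_0(t)},
\qquad
1 - \mathrm{RR}(t) \;\neq\; 1 - \mathrm{HR} \ \text{in general.}
% Example: HR = 0.5 and S_0(t) = 0.3 give S_1(t) = 0.3^{0.5} \approx 0.548,
% so RR(t) \approx 0.452 / 0.700 \approx 0.65, i.e. a 35% risk reduction at time t,
% not the 50% that a naive reading of 1 - HR would suggest.
```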
Jarvis, J; Seed, M; Elton, R; Sawyer, L; Agius, R
2005-01-01
Aims: To investigate quantitatively, relationships between chemical structure and reported occupational asthma hazard for low molecular weight (LMW) organic compounds; to develop and validate a model linking asthma hazard with chemical substructure; and to generate mechanistic hypotheses that might explain the relationships. Methods: A learning dataset used 78 LMW chemical asthmagens reported in the literature before 1995, and 301 control compounds with recognised occupational exposures and hazards other than respiratory sensitisation. The chemical structures of the asthmagens and control compounds were characterised by the presence of chemical substructure fragments. Odds ratios were calculated for these fragments to determine which were associated with a likelihood of being reported as an occupational asthmagen. Logistic regression modelling was used to identify the independent contribution of these substructures. A post-1995 set of 21 asthmagens and 77 controls were selected to externally validate the model. Results: Nitrogen or oxygen containing functional groups such as isocyanate, amine, acid anhydride, and carbonyl were associated with an occupational asthma hazard, particularly when the functional group was present twice or more in the same molecule. A logistic regression model using only statistically significant independent variables for occupational asthma hazard correctly assigned 90% of the model development set. The external validation showed a sensitivity of 86% and specificity of 99%. Conclusions: Although a wide variety of chemical structures are associated with occupational asthma, bifunctional reactivity is strongly associated with occupational asthma hazard across a range of chemical substructures. This suggests that chemical cross-linking is an important molecular mechanism leading to the development of occupational asthma. The logistic regression model is freely available on the internet and may offer a useful but inexpensive adjunct to the prediction of occupational asthma hazard. PMID:15778257
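A hedged sketch of how such a substructure-based logistic regression hazard model can be fitted (scikit-learn on simulated data; the fragment indicators, effect sizes and labels below are hypothetical, not the published model or its coefficients).

```python
# Hypothetical sketch of a structure-activity logistic regression for
# occupational asthma hazard; substructure indicators and effects are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
n = 400
# Binary indicators for chemical substructure fragments (hypothetical columns):
# [isocyanate, amine, acid_anhydride, carbonyl, two_or_more_reactive_groups]
X = rng.binomial(1, [0.1, 0.2, 0.05, 0.3, 0.1], size=(n, 5))
logit = -3.0 + X @ np.array([2.0, 1.0, 1.5, 0.5, 2.5])    # assumed effects
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))             # 1 = reported asthmagen

model = LogisticRegression(max_iter=1000).fit(X, y)
odds_ratios = np.exp(model.coef_.ravel())                 # per-fragment odds ratios
print("odds ratios per fragment:", np.round(odds_ratios, 2))

# Sensitivity / specificity on the training data; an external validation set
# (as in the paper's post-1995 compounds) would be used in practice.
tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
print(f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")
```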
Royston, Patrick; Parmar, Mahesh K B
2014-08-07
Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
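The two ingredients of the proposed joint test can be assembled as follows (a sketch assuming the lifelines package; the Bonferroni-style combination of the two p-values is an illustrative stand-in, not the joint test statistic derived in the paper).

```python
# Sketch: logrank test plus a Grambsch-Therneau-type (scaled Schoenfeld residual)
# test of proportional hazards, combined here with a simple Bonferroni rule.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test, proportional_hazard_test

rng = np.random.default_rng(3)
n = 400
arm = rng.binomial(1, 0.5, size=n)
# Simulated two-arm trial data (exponential here, so hazards are in fact proportional).
T = rng.exponential(scale=np.where(arm == 1, 1.6, 1.0))
C = rng.uniform(0.5, 3.0, size=n)
df = pd.DataFrame({"time": np.minimum(T, C), "event": (T <= C).astype(int), "arm": arm})

lr = logrank_test(df.loc[df.arm == 1, "time"], df.loc[df.arm == 0, "time"],
                  event_observed_A=df.loc[df.arm == 1, "event"],
                  event_observed_B=df.loc[df.arm == 0, "event"])

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
ph = proportional_hazard_test(cph, df, time_transform="rank")

alpha = 0.05
p_lr = float(lr.p_value)
p_ph = float(np.asarray(ph.p_value).ravel()[0])
reject_joint_null = (p_lr < alpha / 2) or (p_ph < alpha / 2)   # Bonferroni stand-in
print(f"logrank p={p_lr:.3f}, PH-test p={p_ph:.3f}, joint rejection: {reject_joint_null}")
```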
Mutambudzi, Miriam; Chen, Nai-Wei; Markides, Kyriakos S; Al Snih, Soham
2016-11-01
To examine the effect of co-occurring depressive symptoms and functional disability on mortality in older Mexican-American adults with diabetes mellitus. Longitudinal cohort study. Hispanic Established Populations for the Epidemiological Study of the Elderly (HEPESE) survey conducted in the southwestern United States (Texas, Colorado, Arizona, New Mexico, California). Community-dwelling Mexican Americans with self-reported diabetes mellitus participating in the HEPESE survey (N = 624). Functional disability was assessed using a modified version of the Katz activity of daily living scale. Depressive symptoms were measured using the Center for Epidemiologic Studies Depression Scale. Mortality was determined by examining death certificates and reports from relatives. Cox proportional hazards regression analyses were used to examine the hazard of mortality as a function of co-occurring depressive symptoms and functional disability. Over a 9.2-year follow-up, 391 participants died. Co-occurring high depressive symptoms and functional disability increased the risk of mortality (hazard ratio (HR) = 3.02, 95% confidence interval (CI) = 2.11-4.34). Risk was greater in men (HR = 8.11, 95% CI = 4.34-16.31) than women (HR = 2.21, 95% CI = 1.42-3.43). Co-occurring depressive symptoms and functional disability in older Mexican-American adults with diabetes mellitus increases mortality risk, especially in men. These findings have important implications for research, practice, and public health interventions. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
Edoxaban versus warfarin in patients with atrial fibrillation.
Giugliano, Robert P; Ruff, Christian T; Braunwald, Eugene; Murphy, Sabina A; Wiviott, Stephen D; Halperin, Jonathan L; Waldo, Albert L; Ezekowitz, Michael D; Weitz, Jeffrey I; Špinar, Jindřich; Ruzyllo, Witold; Ruda, Mikhail; Koretsune, Yukihiro; Betcher, Joshua; Shi, Minggao; Grip, Laura T; Patel, Shirali P; Patel, Indravadan; Hanyok, James J; Mercuri, Michele; Antman, Elliott M
2013-11-28
Edoxaban is a direct oral factor Xa inhibitor with proven antithrombotic effects. The long-term efficacy and safety of edoxaban as compared with warfarin in patients with atrial fibrillation is not known. We conducted a randomized, double-blind, double-dummy trial comparing two once-daily regimens of edoxaban with warfarin in 21,105 patients with moderate-to-high-risk atrial fibrillation (median follow-up, 2.8 years). The primary efficacy end point was stroke or systemic embolism. Each edoxaban regimen was tested for noninferiority to warfarin during the treatment period. The principal safety end point was major bleeding. The annualized rate of the primary end point during treatment was 1.50% with warfarin (median time in the therapeutic range, 68.4%), as compared with 1.18% with high-dose edoxaban (hazard ratio, 0.79; 97.5% confidence interval [CI], 0.63 to 0.99; P<0.001 for noninferiority) and 1.61% with low-dose edoxaban (hazard ratio, 1.07; 97.5% CI, 0.87 to 1.31; P=0.005 for noninferiority). In the intention-to-treat analysis, there was a trend favoring high-dose edoxaban versus warfarin (hazard ratio, 0.87; 97.5% CI, 0.73 to 1.04; P=0.08) and an unfavorable trend with low-dose edoxaban versus warfarin (hazard ratio, 1.13; 97.5% CI, 0.96 to 1.34; P=0.10). The annualized rate of major bleeding was 3.43% with warfarin versus 2.75% with high-dose edoxaban (hazard ratio, 0.80; 95% CI, 0.71 to 0.91; P<0.001) and 1.61% with low-dose edoxaban (hazard ratio, 0.47; 95% CI, 0.41 to 0.55; P<0.001). The corresponding annualized rates of death from cardiovascular causes were 3.17% versus 2.74% (hazard ratio, 0.86; 95% CI, 0.77 to 0.97; P=0.01), and 2.71% (hazard ratio, 0.85; 95% CI, 0.76 to 0.96; P=0.008), and the corresponding rates of the key secondary end point (a composite of stroke, systemic embolism, or death from cardiovascular causes) were 4.43% versus 3.85% (hazard ratio, 0.87; 95% CI, 0.78 to 0.96; P=0.005), and 4.23% (hazard ratio, 0.95; 95% CI, 0.86 to 1.05; P=0.32). Both once-daily regimens of edoxaban were noninferior to warfarin with respect to the prevention of stroke or systemic embolism and were associated with significantly lower rates of bleeding and death from cardiovascular causes. (Funded by Daiichi Sankyo Pharma Development; ENGAGE AF-TIMI 48 ClinicalTrials.gov number, NCT00781391.).
2013-06-01
Our data indicated a significant association of IL-6 with patient outcome: rising IL-6 levels portended worse overall survival (hazard ratio = 1.525, P = 0.02).
Exercise capacity and all-cause mortality in male veterans with hypertension aged ≥70 years.
Faselis, Charles; Doumas, Michael; Pittaras, Andreas; Narayan, Puneet; Myers, Jonathan; Tsimploulis, Apostolos; Kokkinos, Peter
2014-07-01
Aging, even in otherwise healthy subjects, is associated with declines in muscle mass, strength, and aerobic capacity. Older individuals respond favorably to exercise, suggesting that physical inactivity plays an important role in age-related functional decline. Conversely, physical activity and improved exercise capacity are associated with lower mortality risk in hypertensive individuals. However, the effect of exercise capacity in older hypertensive individuals has not been investigated extensively. A total of 2153 men with hypertension, aged ≥70 years (mean, 75 ± 4) from the Washington, DC, and Palo Alto Veterans Affairs Medical Centers, underwent routine exercise tolerance testing. Peak workload was estimated in metabolic equivalents (METs). Fitness categories were established based on peak METs achieved, adjusted for age: very-low-fit, 2.0 to 4.0 METs (n=386); low-fit, 4.1 to 6.0 METs (n=1058); moderate-fit, 6.1 to 8.0 METs (n=495); high-fit >8.0 METs (n=214). Cox proportional hazard models were applied after adjusting for age, body mass index, race, cardiovascular disease, cardiovascular medications, and risk factors. All-cause mortality was quantified during a mean follow-up period of 9.0 ± 5.5 years. There were a total of 1039 deaths or 51.2 deaths per 1000 person-years of follow-up. Mortality risk was 11% lower (hazard ratio, 0.89; 95% confidence interval, 0.86-0.93; P<0.001) for every 1-MET increase in exercise capacity. When compared with those achieving ≤4.0 METs, mortality risk was 18% lower (hazard ratio, 0.82; 95% confidence interval, 0.70-0.95; P=0.011) for the low-fit, 36% for the moderate-fit (hazard ratio, 0.64; 95% confidence interval, 0.52-0.78; P<0.001), and 48% for the high-fit individuals (hazard ratio, 0.52; 95% confidence interval, 0.39-0.69; P<0.001). These findings suggest that exercise capacity is associated with lower mortality risk in elderly men with hypertension. © 2014 American Heart Association, Inc.
Hanley, James A
2008-01-01
Most survival analysis textbooks explain how the hazard ratio parameters in Cox's life table regression model are estimated. Fewer explain how the components of the nonparametric baseline survivor function are derived. Those that do often relegate the explanation to an "advanced" section and merely present the components as algebraic or iterative solutions to estimating equations. None comment on the structure of these estimators. This note brings out a heuristic representation that may help to de-mystify the structure.
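To make the structure of such estimators concrete, here is a minimal numpy sketch of the Breslow-type estimator of the baseline cumulative hazard and survivor function, given event data and already-estimated Cox coefficients. This is a standard textbook construction shown for orientation only; it does not reproduce the specific heuristic representation discussed in the note, and the function name and toy numbers are made up for illustration.

    import numpy as np

    def breslow_baseline_survival(times, events, X, beta):
        """Breslow estimate of the baseline cumulative hazard and survivor function.

        times  : (n,) array of follow-up times
        events : (n,) array, 1 if the event was observed, 0 if censored
        X      : (n, p) covariate matrix
        beta   : (p,) fitted Cox regression coefficients
        """
        risk = np.exp(X @ beta)                       # exp(x_j' beta) for each subject
        event_times = np.unique(times[events == 1])
        cum_hazard, cum = [], 0.0
        for t in event_times:
            d = np.sum((times == t) & (events == 1))  # events at time t
            at_risk = risk[times >= t].sum()          # risk-set denominator
            cum += d / at_risk
            cum_hazard.append(cum)
        H0 = np.array(cum_hazard)
        S0 = np.exp(-H0)                              # baseline survivor function
        return event_times, H0, S0

    # Toy usage with made-up numbers
    t = np.array([2.0, 3.5, 3.5, 5.0, 7.1])
    e = np.array([1, 1, 0, 1, 0])
    X = np.array([[0.0], [1.0], [0.0], [1.0], [0.0]])
    print(breslow_baseline_survival(t, e, X, np.array([0.3])))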
Comparison of methods for estimating the attributable risk in the context of survival analysis.
Gassama, Malamine; Bénichou, Jacques; Dartois, Laureen; Thiébaut, Anne C M
2017-01-23
The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating AR defined in terms of survival functions: two nonparametric methods based on Kaplan-Meier's estimator, one semiparametric based on Cox's model, and one parametric based on the piecewise constant hazards model, as well as one simpler method based on estimated exposure prevalence at baseline and Cox's model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation and 95% confidence interval coverage probability at four equally spaced time points. Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than other approaches. All methods showed satisfactory coverage, except for the nonparametric methods, especially at the end of follow-up with a sample size of 1,000. With nonproportional hazards, nonparametric methods yielded similar results to those under proportional hazards, whereas semiparametric and parametric approaches that both relied on the proportional hazards assumption performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort. In practice, our study suggests using the semiparametric or parametric approaches to estimate AR as a function of time in cohort studies if the proportional hazards assumption appears appropriate.
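As an aside for readers unfamiliar with the simplest of these approaches, one common way to combine baseline exposure prevalence with a Cox hazard ratio is the Levin-type plug-in shown below. This is a generic illustration in Python with hypothetical numbers, not the time-dependent, survival-function-based definitions studied in the paper.

    def attributable_risk(prevalence, hazard_ratio):
        # Levin-type formula: AR = p(HR - 1) / (1 + p(HR - 1)),
        # treating the hazard ratio as an approximation to the relative risk.
        excess = prevalence * (hazard_ratio - 1.0)
        return excess / (1.0 + excess)

    # Hypothetical example: 30% exposed, hazard ratio 1.5 -> AR of about 13%
    print(attributable_risk(0.30, 1.5))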
Abdel Raheem, Ali; Shin, Tae Young; Chang, Ki Don; Santok, Glen Denmer R; Alenzi, Mohamed Jayed; Yoon, Young Eun; Ham, Won Sik; Han, Woong Kyu; Choi, Young Deuk; Rha, Koon Ho
2018-06-19
To develop a predictive nomogram for chronic kidney disease-free survival probability in the long term after partial nephrectomy. A retrospective analysis was carried out of 698 patients with T1 renal tumors undergoing partial nephrectomy at a tertiary academic institution. A multivariable Cox regression analysis was carried out based on parameters proven to have an impact on postoperative renal function. Patients with incomplete data, <12 months follow up and preoperative chronic kidney disease stage III or greater were excluded. The study end-points were to identify independent risk factors for new-onset chronic kidney disease development, as well as to construct a predictive model for chronic kidney disease-free survival probability after partial nephrectomy. The median age was 52 years, median tumor size was 2.5 cm and mean warm ischemia time was 28 min. A total of 91 patients (13.1%) developed new-onset chronic kidney disease at a median follow up of 60 months. The chronic kidney disease-free survival rates at 1, 3, 5 and 10 years were 97.1%, 94.4%, 85.3% and 70.6%, respectively. On multivariable Cox regression analysis, age (hazard ratio 1.041, P = 0.001), male sex (hazard ratio 1.653, P < 0.001), diabetes mellitus (hazard ratio 1.921, P = 0.046), tumor size (hazard ratio 1.331, P < 0.001) and preoperative estimated glomerular filtration rate (hazard ratio 0.937, P < 0.001) were independent predictors for new-onset chronic kidney disease. The C-index for chronic kidney disease-free survival was 0.853 (95% confidence interval 0.815-0.895). We developed a novel nomogram for predicting the 5-year chronic kidney disease-free survival probability after on-clamp partial nephrectomy. This model might have an important role in partial nephrectomy decision-making and follow-up plan after surgery. External validation of our nomogram in a larger cohort of patients should be considered. © 2018 The Japanese Urological Association.
Circulating endothelial progenitor cells and cardiovascular outcomes.
Werner, Nikos; Kosiol, Sonja; Schiegl, Tobias; Ahlers, Patrick; Walenta, Katrin; Link, Andreas; Böhm, Michael; Nickenig, Georg
2005-09-08
Endothelial progenitor cells derived from bone marrow are believed to support the integrity of the vascular endothelium. The number and function of endothelial progenitor cells correlate inversely with cardiovascular risk factors, but the prognostic value associated with circulating endothelial progenitor cells has not been defined. The number of endothelial progenitor cells positive for CD34 and kinase insert domain receptor (KDR) was determined with the use of flow cytometry in 519 patients with coronary artery disease as confirmed on angiography. After 12 months, we evaluated the association between baseline levels of endothelial progenitor cells and death from cardiovascular causes, the occurrence of a first major cardiovascular event (myocardial infarction, hospitalization, revascularization, or death from cardiovascular causes), revascularization, hospitalization, and death from all causes. A total of 43 participants died, 23 from cardiovascular causes. A first major cardiovascular event occurred in 214 patients. The cumulative event-free survival rate increased stepwise across three increasing baseline levels of endothelial progenitor cells in an analysis of death from cardiovascular causes, a first major cardiovascular event, revascularization, and hospitalization. After adjustment for age, sex, vascular risk factors, and other relevant variables, increased levels of endothelial progenitor cells were associated with a reduced risk of death from cardiovascular causes (hazard ratio, 0.31; 95 percent confidence interval, 0.16 to 0.63; P=0.001), a first major cardiovascular event (hazard ratio, 0.74; 95 percent confidence interval, 0.62 to 0.89; P=0.002), revascularization (hazard ratio, 0.77; 95 percent confidence interval, 0.62 to 0.95; P=0.02), and hospitalization (hazard ratio, 0.76; 95 percent confidence interval, 0.63 to 0.94; P=0.01). Endothelial progenitor-cell levels were not predictive of myocardial infarction or of death from all causes. The level of circulating CD34+KDR+ endothelial progenitor cells predicts the occurrence of cardiovascular events and death from cardiovascular causes and may help to identify patients at increased cardiovascular risk. Copyright 2005 Massachusetts Medical Society.
Cardiovascular Safety of Febuxostat or Allopurinol in Patients with Gout.
White, William B; Saag, Kenneth G; Becker, Michael A; Borer, Jeffrey S; Gorelick, Philip B; Whelton, Andrew; Hunt, Barbara; Castillo, Majin; Gunawardhana, Lhanoo
2018-03-29
Cardiovascular risk is increased in patients with gout. We compared cardiovascular outcomes associated with febuxostat, a nonpurine xanthine oxidase inhibitor, with those associated with allopurinol, a purine base analogue xanthine oxidase inhibitor, in patients with gout and cardiovascular disease. We conducted a multicenter, double-blind, noninferiority trial involving patients with gout and cardiovascular disease; patients were randomly assigned to receive febuxostat or allopurinol and were stratified according to kidney function. The trial had a prespecified noninferiority margin of 1.3 for the hazard ratio for the primary end point (a composite of cardiovascular death, nonfatal myocardial infarction, nonfatal stroke, or unstable angina with urgent revascularization). In total, 6190 patients underwent randomization, received febuxostat or allopurinol, and were followed for a median of 32 months (maximum, 85 months). The trial regimen was discontinued in 56.6% of patients, and 45.0% discontinued follow-up. In the modified intention-to-treat analysis, a primary end-point event occurred in 335 patients (10.8%) in the febuxostat group and in 321 patients (10.4%) in the allopurinol group (hazard ratio, 1.03; upper limit of the one-sided 98.5% confidence interval [CI], 1.23; P=0.002 for noninferiority). All-cause and cardiovascular mortality were higher in the febuxostat group than in the allopurinol group (hazard ratio for death from any cause, 1.22 [95% CI, 1.01 to 1.47]; hazard ratio for cardiovascular death, 1.34 [95% CI, 1.03 to 1.73]). The results with regard to the primary end point and all-cause and cardiovascular mortality in the analysis of events that occurred while patients were being treated were similar to the results in the modified intention-to-treat analysis. In patients with gout and major cardiovascular coexisting conditions, febuxostat was noninferior to allopurinol with respect to rates of adverse cardiovascular events. All-cause mortality and cardiovascular mortality were higher with febuxostat than with allopurinol. (Funded by Takeda Development Center Americas; CARES ClinicalTrials.gov number, NCT01101035 .).
Gupta, Nishant; Lee, Hye-Seung; Ryu, Jay H; Taveira-DaSilva, Angelo M; Beck, Gerald J; Lee, Jar-Chi; McCarthy, Kevin; Finlay, Geraldine A; Brown, Kevin K; Ruoss, Stephen J; Avila, Nilo A; Moss, Joel; McCormack, Francis X
2018-06-22
The natural history of lymphangioleiomyomatosis is mainly derived from retrospective cohort analyses and remains incompletely understood. A National Institutes of Health LAM Registry was established to define the natural history and identify prognostic biomarkers that can help guide management and decision-making in patients with LAM. A linear mixed effects model was employed to compute the rate of decline of FEV1, and identify variables impacting FEV1 decline among 217 registry patients who enrolled from 1998-2001. Prognostic variables associated with progression to death/lung transplantation were identified using a Cox proportional hazard model. Mean annual decline of FEV1 was 89±53 ml/year, and remained remarkably constant regardless of baseline lung function. FEV1 decline was more rapid in those with greater cyst profusion on CT scan (p=0.02), and in premenopausal subjects (118 ml/year) compared to postmenopausal subjects (74 ml/year; p=0.003). There were 26 deaths and 43 lung transplants during the evaluation period. Estimated 5-, 10-, 15-, and 20-year transplant-free survival rates were 95%, 85%, 75%, and 64%, respectively. Postmenopausal status (hazard ratio 0.30, p=0.0002) and higher baseline FEV1 (hazard ratio 0.97, p=0.008) or DLCO (hazard ratio 0.97, p=0.001) were independently associated with a lower risk of progression to death or lung transplantation. The median transplant-free survival in patients with LAM is greater than 20 years. Menopausal status as well as structural and physiological markers of disease severity significantly affect the rate of decline of FEV1 and progression to death or lung transplantation in LAM. Copyright © 2018. Published by Elsevier Inc.
Menotti, Alessandro; Puddu, Paolo Emilio; Maiani, Giuseppe; Catasta, Giovina
2016-05-01
To relate major causes of death with lifestyle habits in an almost extinct male middle-aged population. A male population of 1712 subjects aged 40-59 years was examined and followed up for 50 years. Baseline smoking habits, working physical activity and dietary habits were related to 50-year mortality subdivided into 12 simple and 3 composite causes of death by Cox proportional hazard models. Duration of survival was related to the same characteristics by a multiple linear regression model. The death rate over 50 years was 97.5%. Out of 12 simple groups of causes of death, 6 were related to smoking habits, 3 to physical activity and 4 to dietary habits. Among composite groups of causes of death, hazard ratios (and their 95% confidence limits) of never smokers versus smokers were 0.68 (0.57-0.81) for major cardiovascular diseases; 0.65 (0.52-0.81) for all cancers; and 0.72 (0.64-0.81) for all-cause deaths. Hazard ratios of vigorous physical activity at work versus sedentary physical activity were 0.63 (0.49-0.80) for major cardiovascular diseases; 1.01 (0.72-1.41) for all cancers; and 0.76 (0.64-0.90) for all-cause deaths. Hazard ratios of Mediterranean Diet versus non-Mediterranean Diet were 0.68 (0.54-0.86) for major cardiovascular diseases; 0.54 (0.40-0.73) for all cancers; and 0.67 (0.57-0.78) for all-cause deaths. Expectancy of life was 12 years longer for men with the 3 best behaviors than for those with the 3 worst behaviors. Some lifestyle habits are strongly related to lifetime mortality. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Loftfield, Erikka; Freedman, Neal D; Graubard, Barry I; Guertin, Kristin A; Black, Amanda; Huang, Wen-Yi; Shebl, Fatma M; Mayne, Susan T; Sinha, Rashmi
2015-12-15
Concerns about high caffeine intake and coffee as a vehicle for added fat and sugar have raised questions about the net impact of coffee on health. Although inverse associations have been observed for overall mortality, data for cause-specific mortality are sparse. Additionally, few studies have considered exclusively decaffeinated coffee intake or use of coffee additives. Coffee intake was assessed at baseline by self-report in the Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial. Hazard ratios were estimated using Cox proportional hazards models. Among 90,317 US adults without cancer at study baseline (1998-2001) or history of cardiovascular disease at study enrollment (1993-2001), 8,718 deaths occurred during 805,644 person-years of follow-up from 1998 through 2009. Following adjustment for smoking and other potential confounders, coffee drinkers, as compared with nondrinkers, had lower hazard ratios for overall mortality (<1 cup/day: hazard ratio (HR) = 0.99 (95% confidence interval (CI): 0.92, 1.07); 1 cup/day: HR = 0.94 (95% CI: 0.87, 1.02); 2-3 cups/day: HR = 0.82 (95% CI: 0.77, 0.88); 4-5 cups/day: HR = 0.79 (95% CI: 0.72, 0.86); ≥6 cups/day: HR = 0.84 (95% CI: 0.75, 0.95)). Similar findings were observed for decaffeinated coffee and coffee additives. Inverse associations were observed for deaths from heart disease, chronic respiratory diseases, diabetes, pneumonia and influenza, and intentional self-harm, but not cancer. Coffee may reduce mortality risk by favorably affecting inflammation, lung function, insulin sensitivity, and depression. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Vu, Thuy C.; Nutt, John G.; Holford, Nicholas H. G.
2012-01-01
AIM To describe the time to clinical events (death, disability, cognitive impairment and depression) in Parkinson's disease using the time course of disease status and treatment as explanatory variables. METHODS Disease status based on the Unified Parkinson's Disease Rating Scale (UPDRS) and the time to clinical outcome events were obtained from 800 patients who initially had early Parkinson's disease. Parametric hazard models were used to describe the time to the events of interest. RESULTS Time course of disease status (severity) was an important predictor of clinical outcome events. There was an increased hazard ratio for death 1.4 (95% CI 1.31, 1.49), disability 2.75 (95% CI 2.30, 3.28), cognitive impairment 4.35 (95% CI 1.94, 9.74), and depressive state 1.43 (95% CI 1.26, 1.63) with each 10 unit increase of UPDRS. Age at study entry increased the hazard with hazard ratios of 49.1 (95% CI 8.7, 278) for death, 4.76 (95% CI 1.10, 20.6) for disability and 90.0 (95% CI 63.3, 128) for cognitive impairment at age 60 years. Selegiline treatment had independent effects as a predictor of death at 8 year follow-up with a hazard ratio of 2.54 (95% CI 1.51, 4.25) but had beneficial effects on disability with a hazard ratio of 0.363 (95% CI 0.132, 0.533) and depression with a hazard ratio of 0.372 (95% CI 0.12, 0.552). CONCLUSIONS Our findings show that the time course of disease status based on UPDRS is a much better predictor of future clinical events than any baseline disease characteristic. Continued selegiline treatment appears to increase the hazard of death. PMID:22300470
Health Literacy and Access to Kidney Transplantation
Grubbs, Vanessa; Gregorich, Steven E.; Perez-Stable, Eliseo J.; Hsu, Chi-yuan
2009-01-01
Background and objectives: Few studies have examined health literacy in patients with end stage kidney disease. We hypothesized that inadequate health literacy in a hemodialysis population is common and is associated with poorer access to kidney transplant wait-lists. Design, setting, participants, & measurements: We enrolled 62 Black and White maintenance hemodialysis patients aged 18 to 75. We measured health literacy using the short form Test of Functional Health Literacy in Adults. Our primary outcomes were (1) time from dialysis start date to referral date for kidney transplant evaluation and (2) time from referral date to date placed on kidney transplant wait-list. We used Cox proportional hazard models to examine the association between health literacy (adequate versus inadequate) and our outcomes after controlling for demographics and co-morbid conditions. Results: Roughly one third (32.3%) of participants had inadequate health literacy. Forty-seven participants (75.8%) were referred for transplant evaluation. Among those referred, 40 (85.1%) were wait-listed. Participants with inadequate health literacy had 78% lower hazard of referral for transplant evaluation than those with adequate health literacy (adjusted hazard ratio [AHR] 0.22; 95% confidence interval 0.08, 0.60; P = 0.003). The hazard ratio of being wait-listed by health literacy was not statistically different (AHR 0.80; 95% CI 0.39, 1.61; P = 0.5). Conclusions: Inadequate health literacy is common in our hemodialysis patient population and is associated with a lower hazard of referral for transplant evaluation. Strategies to reduce the impact of health literacy on the kidney transplant process should be explored. PMID:19056617
Kendler, Kenneth S; Lönn, Sara Larsson; Salvatore, Jessica; Sundquist, Jan; Sundquist, Kristina
2017-05-01
The purpose of this study was to clarify the magnitude and nature of the relationship between divorce and risk for alcohol use disorder (AUD). In a population-based Swedish sample of married individuals (N=942,366), the authors examined the association between divorce or widowhood and risk for first registration for AUD. AUD was assessed using medical, criminal, and pharmacy registries. Divorce was strongly associated with risk for first AUD onset in both men (hazard ratio=5.98, 95% CI=5.65-6.33) and women (hazard ratio=7.29, 95% CI=6.72-7.91). The hazard ratio was estimated for AUD onset given divorce among discordant monozygotic twins to equal 3.45 and 3.62 in men and women, respectively. Divorce was also associated with an AUD recurrence in those with AUD registrations before marriage. Furthermore, widowhood increased risk for AUD in men (hazard ratio=3.85, 95% CI=2.81-5.28) and women (hazard ratio=4.10, 95% CI=2.98-5.64). Among divorced individuals, remarriage was associated with a large decline in AUD in both sexes (men: hazard ratio=0.56, 95% CI=0.52-0.64; women: hazard ratio=0.61, 95% CI=0.55-0.69). Divorce produced a greater increase in first AUD onset in those with a family history of AUD or with prior externalizing behaviors. Spousal loss through divorce or bereavement is associated with a large enduring increased AUD risk. This association likely reflects both causal and noncausal processes. That the AUD status of the spouse alters this association highlights the importance of spouse characteristics for the behavioral health consequences of spousal loss. The pronounced elevation in AUD risk following divorce or widowhood, and the protective effect of remarriage against subsequent AUD, speaks to the profound impact of marriage on problematic alcohol use.
Lim, Wendy; Meade, Maureen; Lauzier, Francois; Zarychanski, Ryan; Mehta, Sangeeta; Lamontagne, Francois; Dodek, Peter; McIntyre, Lauralyn; Hall, Richard; Heels-Ansdell, Diane; Fowler, Robert; Pai, Menaka; Guyatt, Gordon; Crowther, Mark A; Warkentin, Theodore E; Devereaux, P J; Walter, Stephen D; Muscedere, John; Herridge, Margaret; Turgeon, Alexis F; Geerts, William; Finfer, Simon; Jacka, Michael; Berwanger, Otavio; Ostermann, Marlies; Qushmaq, Ismael; Friedrich, Jan O; Cook, Deborah J
2015-02-01
To identify risk factors for failure of anticoagulant thromboprophylaxis in critically ill patients in the ICU. Multivariable regression analysis of thrombosis predictors from a randomized thromboprophylaxis trial. Sixty-seven medical-surgical ICUs in six countries. Three thousand seven hundred forty-six medical-surgical critically ill patients. All patients received anticoagulant thromboprophylaxis with low-molecular-weight heparin or unfractionated heparin at standard doses. Independent predictors for venous thromboembolism, proximal leg deep vein thrombosis, and pulmonary embolism developing during critical illness were assessed. A total of 289 patients (7.7%) developed venous thromboembolism. Predictors of thromboprophylaxis failure as measured by development of venous thromboembolism included a personal or family history of venous thromboembolism (hazard ratio, 1.64; 95% CI, 1.03-2.59; p = 0.04) and body mass index (hazard ratio, 1.18 per 10-point increase; 95% CI, 1.04-1.35; p = 0.01). Increasing body mass index was also a predictor for developing proximal leg deep vein thrombosis (hazard ratio, 1.25; 95% CI, 1.06-1.46; p = 0.007), which occurred in 182 patients (4.9%). Pulmonary embolism occurred in 47 patients (1.3%) and was associated with body mass index (hazard ratio, 1.37; 95% CI, 1.02-1.83; p = 0.035) and vasopressor use (hazard ratio, 1.84; 95% CI, 1.01-3.35; p = 0.046). Low-molecular-weight heparin (in comparison to unfractionated heparin) thromboprophylaxis lowered pulmonary embolism risk (hazard ratio, 0.51; 95% CI, 0.27-0.95; p = 0.034) while statin use in the preceding week lowered the risk of proximal leg deep vein thrombosis (hazard ratio, 0.46; 95% CI, 0.27-0.77; p = 0.004). Failure of standard thromboprophylaxis using low-molecular-weight heparin or unfractionated heparin is more likely in ICU patients with elevated body mass index, those with a personal or family history of venous thromboembolism, and those receiving vasopressors. Alternate management or incremental risk reduction strategies may be needed in such patients.
Executive function, but not memory, associates with incident coronary heart disease and stroke.
Rostamian, Somayeh; van Buchem, Mark A; Westendorp, Rudi G J; Jukema, J Wouter; Mooijaart, Simon P; Sabayan, Behnam; de Craen, Anton J M
2015-09-01
To evaluate the association of performance in cognitive domains executive function and memory with incident coronary heart disease and stroke in older participants without dementia. We included 3,926 participants (mean age 75 years, 44% male) at risk for cardiovascular diseases from the Prospective Study of Pravastatin in the Elderly at Risk (PROSPER) with Mini-Mental State Examination score ≥24 points. Scores on the Stroop Color-Word Test (selective attention) and the Letter Digit Substitution Test (processing speed) were converted to Z scores and averaged into a composite executive function score. Likewise, scores of the Picture Learning Test (immediate and delayed memory) were transformed into a composite memory score. Associations of executive function and memory were longitudinally assessed with risk of coronary heart disease and stroke using multivariable Cox regression models. During 3.2 years of follow-up, incidence rates of coronary heart disease and stroke were 30.5 and 12.4 per 1,000 person-years, respectively. In multivariable models, participants in the lowest third of executive function, as compared to participants in the highest third, had 1.85-fold (95% confidence interval [CI] 1.39-2.45) higher risk of coronary heart disease and 1.51-fold (95% CI 0.99-2.30) higher risk of stroke. Participants in the lowest third of memory had no increased risk of coronary heart disease (hazard ratio 0.99, 95% CI 0.74-1.32) or stroke (hazard ratio 0.87, 95% CI 0.57-1.32). Lower executive function, but not memory, is associated with higher risk of coronary heart disease and stroke. Lower executive function, as an independent risk indicator, might better reflect brain vascular pathologies. © 2015 American Academy of Neurology.
Albuminuria and Rapid Loss of GFR and Risk of New Hip and Pelvic Fractures
Gao, Peggy; Clase, Catherine M.; Mente, Andrew; Mann, Johannes F.E.; Sleight, Peter; Yusuf, Salim; Teo, Koon K.
2013-01-01
Summary Background and objectives The microvascular circulation plays an important role in bone health. This study examines whether albuminuria, a marker of renal microvascular disease, is associated with incident hip and pelvic fractures. Design, setting, participants, & measurements This study reanalyzed data from the Ongoing Telmisartan Alone and in combination with Ramipril Global End Point Trial/Telmisartan Randomized Assessment Study in Angiotensin-Converting Enzyme Intolerant Subjects with Cardiovascular Disease trials, which examined the impact of renin angiotensin system blockade on cardiovascular outcomes (n=28,601). Albuminuria was defined as an albumin-to-creatinine ratio≥30 mg/g (n=4597). Cox proportional hazards models were used to determine the association of albuminuria with fracture risk adjusted for known risk factors for fractures, estimated GFR, and rapid decline in estimated GFR (≥5%/yr). Results There were 276 hip and pelvic fractures during a mean of 4.6 years of follow-up. Participants with baseline albuminuria had a significantly increased risk of fracture compared with participants without albuminuria (unadjusted hazard ratio=1.62 [1.22, 2.15], P<0.001; adjusted hazard ratio=1.36 [1.01, 1.84], P=0.05). A dose-dependent relationship was observed, with macroalbuminuria having a large fracture risk (unadjusted hazard ratio=2.01 [1.21, 3.35], P=0.007; adjusted hazard ratio=1.71 [1.007, 2.91], P=0.05) and microalbuminuria associating with borderline or no statistical significance (unadjusted hazard ratio=1.52 [1.10, 2.09], P=0.01; adjusted hazard ratio=1.28 [0.92, 1.78], P=0.15). Estimated GFR was not a predictor of fracture in any model, but rapid loss of estimated GFR over the first 2 years of follow-up predicted subsequent fracture (adjusted hazard ratio=1.47 [1.05, 2.04], P=0.02). Conclusions Albuminuria, especially macroalbuminuria, and rapid decline of estimated GFR predict hip and pelvic fractures. These findings support a theoretical model of a relationship between underlying causes of microalbuminuria and bone disease. PMID:23184565
Tominaga, K; Andow, J; Koyama, Y; Numao, S; Kurokawa, E; Ojima, M; Nagai, M
1998-01-01
Many psychosocial factors have been reported to influence the duration of survival of breast cancer patients. We have studied how family members, hobbies and habits of the patients may alter their psychosocial status. Female patients with surgically treated breast cancer diagnosed between 1986 and 1995 at the Tochigi Cancer Center Hospital, who provided information on the above-mentioned factors, were included. Their subsequent physical status was followed up in the outpatient clinic. The Cox regression model was used to evaluate the relationship between the results of the factors examined and the duration of the patients' survival, adjusting for the patients' age, stage of disease at diagnosis and curability, as judged by the physician in charge after the treatment. The following factors were revealed to be significant with regard to the survival of surgically treated breast cancer patients: being a widow (hazard ratio 3.29; 95% confidence interval 1.32-8.20), having a hobby (hazard ratio 0.43; 95% confidence interval 0.23-0.82), number of hobbies (hazard ratio 0.64; 95% confidence interval 0.41-1.00), number of female children (hazard ratio 0.64; 95% confidence interval 0.42-0.98), smoker (hazard ratio 2.08; 95% confidence interval 1.02-4.26) and alcohol consumption (hazard ratio 0.10; 95% confidence interval 0.01-0.72). These results suggest that psychosocial factors, including the family environment, where patients receive emotional support from their spouse and children, hobbies and the patients' habits, may influence the duration of survival in surgically treated breast cancer patients.
Letang, Emilio; Lewis, James J; Bower, Mark; Mosam, Anisa; Borok, Margareth; Campbell, Thomas B; Naniche, Denise; Newsom-Davis, Tom; Shaik, Fahmida; Fiorillo, Suzanne; Miro, Jose M; Schellenberg, David; Easterbrook, Philippa J
2013-06-19
To assess the incidence, predictors, and outcomes of Kaposi sarcoma-associated paradoxical immune reconstitution inflammatory syndrome (KS-IRIS) in antiretroviral therapy (ART)-naive HIV-infected patients with Kaposi sarcoma initiating ART in both well resourced and limited-resourced settings. Pooled analysis of three prospective cohorts of ART-naive HIV-infected patients with Kaposi sarcoma from sub-Saharan Africa (SSA) and one from the UK. KS-IRIS case definition was standardized across sites. Cox regression and Kaplan-Meier survival analysis were used to identify the incidence and predictors of KS-IRIS and Kaposi sarcoma-associated mortality. Fifty-eight of 417 (13.9%) eligible individuals experienced KS-IRIS with an incidence 2.5 times higher in the African vs. European cohorts (P=0.001). ART alone as initial Kaposi sarcoma treatment (hazard ratio 2.97, 95% confidence interval (CI) 1.02-8.69); T1 Kaposi sarcoma stage (hazard ratio 2.96, 95% CI 1.26-6.94); and plasma HIV-1 RNA more than 5 log₁₀ copies/ml (hazard ratio 2.14, 95% CI 1.25-3.67) independently predicted KS-IRIS at baseline. Detectable plasma Kaposi sarcoma-associated herpes virus (KSHV) DNA additionally predicted KS-IRIS among the 259 patients with KSHV DNA assessed (hazard ratio 2.98, 95% CI 1.23-7.19). Nineteen KS-IRIS patients died, all in SSA. Kaposi sarcoma mortality was 3.3-fold higher in Africa, and was predicted by KS-IRIS (hazard ratio 19.24, CI 7.62-48.58), lack of chemotherapy (hazard ratio 2.35, 95% CI 1.09-5.05), pre-ART CD4 cell count less than 200 cells/μl (hazard ratio 2.04, 95% CI 0.99-4.2), and detectable baseline KSHV DNA (hazard ratio 2.12, 95% CI 0.94-4.77). KS-IRIS incidence and mortality are higher in SSA than in the UK. This is largely explained by the more advanced Kaposi sarcoma disease and lower chemotherapy availability. KS-IRIS is a major contributor to Kaposi sarcoma-associated mortality in Africa. Our results support the need to increase awareness of KS-IRIS, encourage earlier presentation, referral and diagnosis of Kaposi sarcoma, and advocate for access to systemic chemotherapy in Africa. © 2013 Wolters Kluwer Health | Lippincott Williams & Wilkins
Chronic Use of Theophylline and Mortality in Chronic Obstructive Pulmonary Disease: A Meta-analysis.
Horita, Nobuyuki; Miyazawa, Naoki; Kojima, Ryota; Inoue, Miyo; Ishigatsubo, Yoshiaki; Kaneko, Takeshi
2016-05-01
Theophylline has been shown to improve respiratory function and oxygenation in patients with chronic obstructive pulmonary disease (COPD). However, the impact of theophylline on mortality in COPD patients has not been sufficiently evaluated. Two investigators independently searched for eligible articles in 4 databases. The eligibility criterion for this meta-analysis was an original research article that provided a hazard ratio for theophylline for all-cause mortality of COPD patients. Both randomized controlled trials and observational studies were accepted. After we confirmed no substantial heterogeneity (I²<50%), the fixed-model method with generic inverse variance was used for meta-analysis to estimate the pooled hazard ratio. We screened 364 potentially eligible articles. Of the 364 articles, 259 were excluded on the basis of title and abstract, and 99 were excluded after examination of the full text. Our final analysis included 6 observational studies and no randomized controlled trials. One study reported 2 cohorts. The number of patients in each cohort ranged from 47 to 46,403. Heterogeneity (I²=42%, P=.11) and publication bias (Begg's test r=0.21, P=.662) were not substantial. Fixed-model meta-analysis yielded a pooled hazard ratio for theophylline for all-cause death of 1.07 (95% confidence interval: 1.02-1.13, P=.003). This meta-analysis of 7 observational cohorts suggests that theophylline slightly increases all-cause death in COPD patients. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.
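For orientation, fixed-effect pooling with generic inverse variance (the method named above) amounts to averaging log hazard ratios weighted by the inverse of their squared standard errors. The sketch below implements that calculation in Python on hypothetical cohort-level hazard ratios and 95% confidence intervals; the numbers are not those of the included studies.

    import numpy as np

    def pool_hazard_ratios(hrs, ci_lower, ci_upper):
        log_hr = np.log(hrs)
        se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)  # SE recovered from the 95% CI
        w = 1.0 / se**2                                          # inverse-variance weights
        pooled_log = np.sum(w * log_hr) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        ci = np.exp(pooled_log + np.array([-1.96, 1.96]) * pooled_se)
        return np.exp(pooled_log), ci

    # Hypothetical study-level hazard ratios with 95% confidence limits
    print(pool_hazard_ratios([1.05, 1.10, 1.02], [0.98, 1.01, 0.95], [1.12, 1.20, 1.09]))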
Dynamic frailty models based on compound birth-death processes.
Putter, Hein; van Houwelingen, Hans C
2015-07-01
Frailty models are used in survival analysis to model unobserved heterogeneity. They accommodate such heterogeneity by the inclusion of a random term, the frailty, which is assumed to multiply the hazard of a subject (individual frailty) or the hazards of all subjects in a cluster (shared frailty). Typically, the frailty term is assumed to be constant over time. This is a restrictive assumption and extensions to allow for time-varying or dynamic frailties are of interest. In this paper, we extend the auto-correlated frailty models of Henderson and Shimakura and of Fiocco, Putter and van Houwelingen, developed for longitudinal count data and discrete survival data, to continuous survival data. We present a rigorous construction of the frailty processes in continuous time based on compound birth-death processes. When the frailty processes are used as mixtures in models for survival data, we derive the marginal hazards and survival functions and the marginal bivariate survival functions and cross-ratio function. We derive distributional properties of the processes, conditional on observed data, and show how to obtain the maximum likelihood estimators of the parameters of the model using a (stochastic) expectation-maximization algorithm. The methods are applied to a publicly available data set. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
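The compound birth-death construction above is more elaborate than can be summarized here, but the basic mixing idea can be illustrated with the standard time-constant gamma frailty (not the model of the paper). With conditional survival given the frailty Z and a unit-mean gamma frailty with variance \theta,

    S(t \mid Z) = \exp\{-Z\,H(t)\}, \qquad Z \sim \mathrm{Gamma}(1/\theta,\, 1/\theta),

    S_{\mathrm{marg}}(t) = E\left[e^{-Z H(t)}\right] = \bigl(1 + \theta\,H(t)\bigr)^{-1/\theta}, \qquad h_{\mathrm{marg}}(t) = \frac{h(t)}{1 + \theta\,H(t)},

so the marginal hazard is attenuated over time relative to the conditional hazard; dynamic frailty models generalize this by letting the mixing distribution itself evolve with time.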
Woodward, M; Zhang, X; Barzi, F; Pan, W; Ueshima, H; Rodgers, A; MacMahon, S
2003-02-01
To provide reliable age- and region-specific estimates of the associations between diabetes and major cardiovascular diseases and death in populations from the Asia-Pacific region. Twenty-four cohort studies from Asia, Australia, and New Zealand (median follow-up, 5.4 years) provided individual participant data from 161,214 people (58% from Asia) of whom 4,873 had a history of diabetes at baseline. The associations of diabetes with the risks of coronary heart disease, stroke, and cause-specific mortality during follow-up were estimated using time-dependent Cox models, stratified by study cohort and sex and adjusted for age at risk. In all, 9,277 deaths occurred (3,635 from cardiovascular disease). The hazard ratio (95% CI) associated with diabetes was 1.97 (1.72-2.25) for fatal cardiovascular disease; there were similar hazard ratios for fatal coronary heart disease, fatal stroke, and composites of fatal and nonfatal outcomes. For all cardiovascular outcomes, hazard ratios were similar in Asian and non-Asian populations and in men and women, but were greater in younger than older individuals. For noncardiovascular death, the hazard ratio was 1.56 (1.38-1.77), with separately significant increases in the risks of death from renal disease, cancer, respiratory infections, and other infective causes. The hazard ratio for all-cause mortality was 1.68 (1.55-1.84), with similar ratios in Asian and non-Asian populations, but with significantly higher ratios in younger than older individuals. The relative effect of diabetes on the risks of cardiovascular disease and death in Asian populations is much the same as that in the largely Caucasian populations of Australia and New Zealand. Hazard ratios were severalfold greater in younger people than in older people. The rapidly growing prevalence of diabetes in Asia heralds a large increase in the incidence of diabetes-related death in the coming decades.
Prentice, Ross L.; Chlebowski, Rowan T.; Stefanick, Marcia L.; Manson, JoAnn E.; Langer, Robert D.; Pettinger, Mary; Hendrix, Susan L.; Hubbell, F. Allan; Kooperberg, Charles; Kuller, Lewis H.; Lane, Dorothy S.; McTiernan, Anne; O’Sullivan, Mary Jo; Rossouw, Jacques E.; Anderson, Garnet L.
2009-01-01
The Women's Health Initiative randomized controlled trial found a trend (p = 0.09) toward a lower breast cancer risk among women assigned to daily 0.625-mg conjugated equine estrogens (CEEs) compared with placebo, in contrast to an observational literature that mostly reports a moderate increase in risk with estrogen-alone preparations. In 1993–2004 at 40 US clinical centers, breast cancer hazard ratio estimates for this CEE regimen were compared between the Women's Health Initiative clinical trial and observational study toward understanding this apparent discrepancy and refining hazard ratio estimates. After control for prior use of postmenopausal hormone therapy and for confounding factors, CEE hazard ratio estimates were higher from the observational study compared with the clinical trial by 43% (p = 0.12). However, after additional control for time from menopause to first use of postmenopausal hormone therapy, the hazard ratios agreed closely between the two cohorts (p = 0.82). For women who begin use soon after menopause, combined analyses of clinical trial and observational study data do not provide clear evidence of either an overall reduction or an increase in breast cancer risk with CEEs, although hazard ratios appeared to be relatively higher among women having certain breast cancer risk factors or a low body mass index. PMID:18448442
Preclinical Alzheimer's disease and longitudinal driving decline.
Roe, Catherine M; Babulal, Ganesh M; Head, Denise M; Stout, Sarah H; Vernon, Elizabeth K; Ghoshal, Nupur; Garland, Brad; Barco, Peggy P; Williams, Monique M; Johnson, Ann; Fierberg, Rebecca; Fague, M Scot; Xiong, Chengjie; Mormino, Elizabeth; Grant, Elizabeth A; Holtzman, David M; Benzinger, Tammie L S; Fagan, Anne M; Ott, Brian R; Carr, David B; Morris, John C
2017-01-01
Links between preclinical AD and driving difficulty onset would support the use of driving performance as an outcome in primary and secondary prevention trials among older adults (OAs). We examined whether AD biomarkers predicted the onset of driving difficulties among OAs. 104 OAs (65+ years) with normal cognition took part in biomarker measurements, a road test, and clinical and psychometric batteries, and self-reported their driving habits. Higher values of CSF tau/Aβ42 and ptau181/Aβ42 ratios, but not uptake on PIB amyloid imaging (p=.12), predicted time to a rating of Marginal or Fail on the driving test using Cox proportional hazards models. Hazard ratios (95% confidence interval) were 5.75 (1.70-19.53), p=.005 for CSF tau/Aβ42, and 6.19 (1.75-21.88), p=.005 for CSF ptau181/Aβ42. Preclinical AD predicted time to receiving a Marginal or Fail rating on an on-road driving test. Driving performance shows promise as a functional outcome in AD prevention trials.
Retirement as Meaningful: Positive Retirement Stereotypes Associated with Longevity
Ng, Reuben; Allore, Heather G.; Monin, Joan K.; Levy, Becca R.
2016-01-01
Studies examining the association between retirement and health have produced mixed results. This may be due to previous studies treating retirement as merely a change in job status rather than a transition associated with stereotypes or societal beliefs (e.g., retirement is a time of mental decline or retirement is a time of growth). To examine whether these stereotypes are associated with health, we studied retirement stereotypes and survival over a 23-year period among 1,011 older adults. As predicted by stereotype embodiment theory, it was found that positive stereotypes about physical health during retirement showed a survival advantage of 4.5 years (hazard ratio = 0.88, p = .022) and positive stereotypes about mental health during retirement tended to show a survival advantage of 2.5 years (hazard ratio = 0.87, p = .034). Models adjusted for relevant covariates such as age, gender, race, employment status, functional health, and self-rated health. These results suggest that retirement preparation could benefit from considering retirement stereotypes. PMID:27346893
Panotopoulos, Joannis; Posch, Florian; Funovics, Philipp T; Willegger, Madeleine; Scharrer, Anke; Lamm, Wolfgang; Brodowicz, Thomas; Windhager, Reinhard; Ay, Cihan
2016-03-01
Low serum albumin levels and impaired kidney function have been associated with decreased survival in patients with a variety of cancer types. In a retrospective cohort study, we analyzed 84 patients with liposarcoma treated from May 1994 to October 2011. Uni- and multivariable Cox proportional hazard models and competing risk analyses were performed to evaluate the association between putative biomarkers with disease-specific and overall survival. The median age of the study population was 51.7 (range 19.6-83.8) years. In multivariable analysis adjusted for AJCC tumor stage, serum creatinine was highly associated with disease-specific survival (subdistribution hazard ratio (SHR) per 1 mg/dl increase = 2.94; 95%CI 1.39-6.23; p = 0.005). High albumin was associated with improved overall and disease-specific survival (hazard ratio (HR) per 10 units increase = 0.50; 95%CI 0.26-0.95; p = 0.033 and SHR = 0.64; 95%CI 0.42-1.00; p = 0.049). The serum albumin-creatinine ratio was associated with both overall and disease-specific survival after adjusting for AJCC tumor stage (HR = 0.95; 95%CI 0.92-0.99; p = 0.011 and SHR = 0.96; 95%CI 0.93-0.99; p = 0.08). Our study provides evidence for a tumor-stage-independent association between higher creatinine and lower albumin with worse disease-specific survival. Low albumin and a high albumin-creatinine ratio independently predict poor overall survival. Our work identified novel prognostic biomarkers for prognosis of patients with liposarcoma. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
Regression dilution in the proportional hazards model.
Hughes, M D
1993-12-01
The problem of regression dilution arising from covariate measurement error is investigated for survival data using the proportional hazards model. The naive approach to parameter estimation is considered whereby observed covariate values are used, inappropriately, in the usual analysis instead of the underlying covariate values. A relationship between the estimated parameter in large samples and the true parameter is obtained showing that the bias does not depend on the form of the baseline hazard function when the errors are normally distributed. With high censorship, adjustment of the naive estimate by the factor 1 + lambda, where lambda is the ratio of within-person variability about an underlying mean level to the variability of these levels in the population sampled, removes the bias. As censorship increases, the adjustment required increases and when there is no censorship is markedly higher than 1 + lambda and depends also on the true risk relationship.
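In the notation suggested by the abstract (symbols added here for clarity), the high-censorship correction can be written as

    \lambda = \frac{\sigma^2_{\mathrm{within}}}{\sigma^2_{\mathrm{between}}}, \qquad \beta \approx (1 + \lambda)\,\hat{\beta}_{\mathrm{naive}},

equivalently, the naive log hazard ratio is attenuated toward zero by roughly the reliability ratio 1/(1 + \lambda) = \sigma^2_{\mathrm{between}} / (\sigma^2_{\mathrm{between}} + \sigma^2_{\mathrm{within}}); as the abstract notes, with little or no censoring the required correction exceeds 1 + \lambda.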
Age Variation in the Association Between Obesity and Mortality in Adults.
Wang, Zhiqiang; Peng, Yang; Liu, Meina
2017-12-01
The aim of this study was to evaluate the previously reported finding that the association between obesity and mortality strengthens with increasing age. The data were derived from the National Health Interview Survey. Age-specific hazard ratios of mortality for grade 2/3 obesity (BMI ≥ 35 kg/m2), relative to a BMI of 18.5 kg/m2 to < 25 kg/m2, were calculated by using a flexible parametric survival model (240,184 white men) and Cox proportional hazard models (51,697 matched pairs). When the model included interaction terms between obesity and age at the survey, hazard ratios appeared to increase with age if those interaction terms were ignored by fixing age at the survey as a single value. However, when recalculated for adults with various ages at the survey, according to model specifications, hazard ratios were higher for younger adults than for older adults with the same follow-up duration. Based on matched data, hazard ratios were also higher for younger adults (2.14 [95% CI: 1.90-2.40] for those 40-49 years of age) than for older adults (1.22 [95% CI: 0.91-1.63] for those 90+ years of age). For any given follow-up duration, the association between obesity and mortality weakens with age. The previously reported strengthening of the obesity-mortality association with increasing age was caused by the failure to take all the model specifications into consideration when calculating adjusted hazard ratios. © 2017 The Obesity Society.
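The role of the interaction terms can be made explicit with a generic proportional hazards specification (illustrative notation, not the exact model of the study). With an obesity indicator, age at survey a, and an obesity-by-age interaction,

    \log h(t \mid \mathrm{obese}, a) = \log h_0(t) + \beta_1\,\mathrm{obese} + \beta_2\,a + \beta_3\,(\mathrm{obese} \times a) \quad\Rightarrow\quad \mathrm{HR}(a) = \exp(\beta_1 + \beta_3\,a),

so quoting exp(\beta_1) alone implicitly fixes age at the survey at a single reference value, which is the omission the authors identify as the source of the apparent strengthening with age.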
Chalmers, Samuel; Fuller, Joel T; Debenedictis, Thomas A; Townsley, Samuel; Lynagh, Matthew; Gleeson, Cara; Zacharia, Andrew; Thomson, Stuart; Magarey, Mary
2017-07-01
The Functional Movement Screen (FMS) is a popular screening tool; however, the postulated relationship between prospective injury and FMS scoring remains sparsely explored in adolescent athletes. The aim of the study was to examine the association between pre-season FMS scores and injuries sustained during one regular season competition in elite adolescent Australian football players. Prospective cohort study. 237 elite junior Australian football players completed FMS testing during the late pre-season phase and had their weekly playing status monitored during the regular season. The definition of an injury was 'a trauma which caused a player to miss a competitive match'. The median composite FMS score was 14 (mean=13.5±2.3). An a priori analysis revealed that the presence of ≥1 asymmetrical sub-test was associated with a moderate increase in the risk of injury (hazard ratio=2.2 [1.0-4.8]; relative risk=1.9; p=0.047; sensitivity=78.4%; specificity=41.0%). Notably, post-hoc analysis identified that the presence of ≥2 asymmetrical sub-tests was associated with an even greater increase in risk of prospective injury (hazard ratio=3.7 [1.6-8.6]; relative risk=2.8; p=0.003; sensitivity=66.7%; specificity=78.0%). Achieving a composite score of ≤14 did not substantially increase the risk of prospective injury (hazard ratio=1.1 [0.5-2.1]; p=0.834). Junior Australian football players demonstrating asymmetrical movement during pre-season FMS testing were more likely to sustain an injury during the regular season than players without asymmetry. Findings suggest that the commonly reported composite FMS threshold score of ≤14 was not associated with injury in elite junior AF players. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Prognostic impact of metastatic pattern in stage IV breast cancer at initial diagnosis.
Leone, Bernardo Amadeo; Vallejo, Carlos Teodoro; Romero, Alberto Omar; Machiavelli, Mario Raúl; Pérez, Juan Eduardo; Leone, Julieta; Leone, José Pablo
2017-02-01
To analyze the prognostic influence of metastatic pattern (MP) compared with other biologic and clinical factors in stage IV breast cancer at initial diagnosis (BCID) and evaluate factors associated with specific sites of metastases (SSM). We evaluated women with stage IV BCID with known metastatic sites, reported to the Surveillance, Epidemiology and End Results program from 2010 to 2013. MP was categorized as bone-only, visceral, bone and visceral (BV), and other. Univariate and multivariate analyses determined the effects of each variable on overall survival (OS). Logistic regression examined factors associated with SSM. We included 9143 patients. Bone represented 37.5% of patients, visceral 21.9%, BV 28.8%, and other 11.9%. Median OS by MP was as follows: bone 38 months, visceral 21 months, BV 19 months, and other 33 months (P < 0.0001). Univariate analysis showed that higher number of metastatic sites had worse prognosis. In multivariate analysis, older age (hazard ratio 1.9), black race (hazard ratio 1.17), grade 3/4 tumors (hazard ratio 1.6), triple-negative (hazard ratio 2.24), BV MP (hazard ratio 2.07), and unmarried patients (hazard ratio 1.25) had significantly shorter OS. As compared with HR+/HER2- tumors, triple-negative and HR-/HER2+ had higher odds of brain, liver, lung, and other metastases. HR+/HER2+ had higher odds of liver metastases. All three subtypes had lower odds of bone metastases. There were substantial differences in OS according to MP. Tumor subtypes have a clear influence among other factors on SSM. We identified several prognostic factors that could guide therapy selection in treatment naïve patients.
Brasky, Theodore M.; Sponholtz, Todd R.; Palmer, Julie R.; Rosenberg, Lynn; Ruiz-Narváez, Edward A.; Wise, Lauren A.
2016-01-01
Dietary long-chain (LC) ω-3 polyunsaturated fatty acids (PUFAs), which derive primarily from intakes of fatty fish, are thought to inhibit inflammation and de novo estrogen synthesis. This study prospectively examined the associations of dietary LC ω-3 PUFAs and fish with endometrial cancer risk in 47,602 African-American women living in the United States, aged 21–69 years at baseline in 1995, and followed them until 2013 (n = 282 cases). Multivariable-adjusted Cox regression models estimated hazard ratios and 95% confidence intervals for associations of LC ω-3 PUFA (quintiled) and fish (quartiled) intake with endometrial cancer risk, overall and by body mass index (BMI; weight (kg)/height (m)2). The hazard ratio for quintile 5 of total dietary LC ω-3 PUFAs versus quintile 1 was 0.79 (95% confidence interval (CI): 0.51, 1.24); there was no linear trend. Hazard ratios for the association were smaller among normal-weight women (BMI <25: hazard ratio (HR) = 0.53, 95% CI: 0.18, 1.58) than among overweight/obese women (BMI ≥25: HR = 0.88, 95% CI: 0.54, 1.43), but these differences were not statistically significant. Fish intake was also not associated with risk (quartile 4 vs. quartile 1: HR = 0.86, 95% CI: 0.56, 1.31). Again hazard ratios were smaller among normal-weight women (HR = 0.65) than among overweight/obese women (HR = 0.94). While compatible with no association, the hazard ratios observed among leaner African-American women are similar to those from recent prospective studies conducted in predominantly white populations. PMID:26755676
Wicks, Susanne; Hjern, Anders; Dalman, Christina
2010-10-01
Recent studies suggest a role for social factors during childhood in the later development of schizophrenia. Since social conditions in childhood are closely related to parental psychiatric illness, there is a need to disentangle how genes and social environmental factors interact. A total of 13,163 children born in Sweden between 1955 and 1984 and reared in Swedish adoptive families were linked to the National Patient Register until 2006 regarding admissions for non-affective psychoses, including schizophrenia. Hazard ratios for nonaffective psychoses were estimated in relation to three indicators of socioeconomic position in childhood (household data of the rearing family obtained via linkage to the National Censuses of 1960-1985) and in relation to an indicator of genetic liability (biological parental inpatient care for psychosis). In addition, the total Swedish-born population was investigated. Increased risks for nonaffective psychosis were found among adoptees (without biological parental history of psychosis) reared in families with disadvantaged socioeconomic position, which consisted of adoptive parental unemployment (hazard ratio=2.0), single-parent household (hazard ratio=1.2), and living in apartments (hazard ratio=1.3). The risk was also increased among persons with genetic liability for psychosis alone (hazard ratio=4.7). Among those exposed to both genetic liability and a disadvantaged socioeconomic situation in childhood, the risk was considerably higher (hazard ratio=15.0, 10.3, and 5.7 for parental unemployment, single-parent household, and apartment living, respectively). Analyses in the larger population supported these results. The results indicate that children reared in families with a disadvantaged socioeconomic position have an increased risk for psychosis. There was also some support for an interaction effect, suggesting that social disadvantage increases this risk more in children with genetic liability for psychosis.
Relationships between exercise, smoking habit and mortality in more than 100,000 adults.
O'Donovan, Gary; Hamer, Mark; Stamatakis, Emmanuel
2017-04-15
Exercise is associated with reduced risks of all-cause, cardiovascular disease (CVD) and cancer mortality; however, the benefits in smokers and ex-smokers are unclear. The aim of this study was to investigate associations between exercise, smoking habit and mortality. Self-reported exercise and smoking, and all-cause, CVD and cancer mortality were assessed in 106,341 adults in the Health Survey for England and the Scottish Health Survey. There were 9149 deaths from all causes, 2839 from CVD and 2634 from cancer during 999,948 person-years of follow-up. Greater amounts of exercise were associated with decreases and greater amounts of smoking were associated with increases in the risks of mortality from all causes, CVD and cancer. There was no statistically significant evidence of biological interaction; rather, the relative risks of all-cause mortality were additive. In the subgroup of 26,768 ex-smokers, the all-cause mortality hazard ratio was 0.70 (95% CI 0.60, 0.80), the CVD mortality hazard ratio was 0.71 (0.55, 0.92) and the cancer mortality hazard ratio was 0.66 (0.52, 0.84) in those who exercised compared to those who did not. In the subgroup of 28,440 smokers, the all-cause mortality hazard ratio was 0.69 (0.57, 0.83), the CVD mortality hazard ratio was 0.66 (0.45, 0.96) and the cancer mortality hazard ratio was 0.69 (0.51, 0.94) in those who exercised compared to those who did not. Given that an outright ban is unlikely, this study is important because it suggests exercise reduces the risks of all-cause, CVD and cancer mortality by around 30% in smokers and ex-smokers. © 2017 UICC.
Olsen, Morten; Hjortdal, Vibeke E; Mortensen, Laust H; Christensen, Thomas D; Sørensen, Henrik T; Pedersen, Lars
2011-04-01
Congenital heart defect patients may experience neurodevelopmental impairment. We investigated their educational attainments from basic schooling to higher education. Using administrative databases, we identified all Danish patients with a cardiac defect diagnosis born from 1 January, 1977 to 1 January, 1991 and alive at age 13 years. As a comparison cohort, we randomly sampled 10 persons per patient. We obtained information on educational attainment from Denmark's Database for Labour Market Research. The study population was followed until achievement of educational levels, death, emigration, or 1 January, 2006. We estimated the hazard ratio of attaining given educational levels, conditional on completing preceding levels, using discrete-time Cox regression and adjusting for socio-economic factors. Analyses were repeated for a sub-cohort of patients and controls born at term and without extracardiac defects or chromosomal anomalies. We identified 2986 patients. Their probability of completing compulsory basic schooling was approximately 10% lower than that of control individuals (adjusted hazard ratio = 0.79; 95% confidence interval: 0.75-0.82). Their subsequent probability of completing secondary school was lower than that of the controls, both for all patients (adjusted hazard ratio = 0.74; 95% confidence interval: 0.69-0.80) and for the sub-cohort (adjusted hazard ratio = 0.80; 95% confidence interval: 0.73-0.86). The probability of attaining a higher degree, conditional on completion of youth education, was affected both for all patients (adjusted hazard ratio = 0.88; 95% confidence interval: 0.76-1.01) and for the sub-cohort (adjusted hazard ratio = 0.92; 95% confidence interval: 0.79-1.07). The probability of educational attainment was reduced among long-term congenital heart defect survivors.
Association between GFR Estimated by Multiple Methods at Dialysis Commencement and Patient Survival
Wong, Muh Geot; Pollock, Carol A.; Cooper, Bruce A.; Branley, Pauline; Collins, John F.; Craig, Jonathan C.; Kesselhut, Joan; Luxton, Grant; Pilmore, Andrew; Harris, David C.
2014-01-01
Summary Background and objectives The Initiating Dialysis Early and Late study showed that planned early or late initiation of dialysis, based on the Cockcroft and Gault estimation of GFR, was associated with identical clinical outcomes. This study examined the association of all-cause mortality with estimated GFR at dialysis commencement, which was determined using multiple formulas. Design, setting, participants, & measurements Initiating Dialysis Early and Late trial participants were stratified into tertiles according to the estimated GFR measured by Cockcroft and Gault, Modification of Diet in Renal Disease, or Chronic Kidney Disease-Epidemiology Collaboration formula at dialysis commencement. Patient survival was determined using multivariable Cox proportional hazards model regression. Results Only Initiating Dialysis Early and Late trial participants who commenced on dialysis were included in this study (n=768). A total of 275 patients died during the study. After adjustment for age, sex, racial origin, body mass index, diabetes, and cardiovascular disease, no significant differences in survival were observed between estimated GFR tertiles determined by Cockcroft and Gault (lowest tertile adjusted hazard ratio, 1.11; 95% confidence interval, 0.82 to 1.49; middle tertile hazard ratio, 1.29; 95% confidence interval, 0.96 to 1.74; highest tertile reference), Modification of Diet in Renal Disease (lowest tertile hazard ratio, 0.88; 95% confidence interval, 0.63 to 1.24; middle tertile hazard ratio, 1.20; 95% confidence interval, 0.90 to 1.61; highest tertile reference), and Chronic Kidney Disease-Epidemiology Collaboration equations (lowest tertile hazard ratio, 0.93; 95% confidence interval, 0.67 to 1.27; middle tertile hazard ratio, 1.15; 95% confidence interval, 0.86 to 1.54; highest tertile reference). Conclusion Estimated GFR at dialysis commencement was not significantly associated with patient survival, regardless of the formula used. However, a clinically important association cannot be excluded, because observed confidence intervals were wide. PMID:24178976
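For readers unfamiliar with the three estimating equations compared above, the sketch below gives commonly published forms of Cockcroft-Gault creatinine clearance, the 4-variable IDMS-traceable MDRD equation, and the 2009 CKD-EPI creatinine equation. It is illustrative only: it is not the trial's analysis code, and the example patient values are hypothetical.

```python
# Illustrative implementations of the three GFR estimating equations named above
# (not the study's code). Inputs: serum creatinine in mg/dL, age in years,
# weight in kg. Race coefficients follow the originally published equations.
def cockcroft_gault(scr_mg_dl, age, weight_kg, female):
    """Creatinine clearance (mL/min), Cockcroft & Gault."""
    crcl = (140 - age) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd_4v(scr_mg_dl, age, female, black):
    """eGFR (mL/min/1.73 m^2), 4-variable IDMS-traceable MDRD equation."""
    egfr = 175 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def ckd_epi_2009(scr_mg_dl, age, female, black):
    """eGFR (mL/min/1.73 m^2), CKD-EPI creatinine equation (2009)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Hypothetical patient at dialysis commencement: male, 70 years, 80 kg, Scr 6.5 mg/dL.
print(round(cockcroft_gault(6.5, 70, 80, female=False), 1))
print(round(mdrd_4v(6.5, 70, female=False, black=False), 1))
print(round(ckd_epi_2009(6.5, 70, female=False, black=False), 1))
```

For this hypothetical patient the three formulas give roughly 12 mL/min, 8.5 mL/min/1.73 m², and 8 mL/min/1.73 m², respectively, which illustrates why tertile assignment at dialysis commencement can depend on the formula used.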
Holma, K Mikael; Melartin, Tarja K; Haukka, Jari; Holma, Irina A K; Sokero, T Petteri; Isometsä, Erkki T
2010-07-01
Prospective long-term studies of risk factors for suicide attempts among patients with major depressive disorder have not investigated the course of illness and state at the time of the act. Therefore, the importance of state factors, particularly time spent in risk states, for overall risk remains unknown. In the Vantaa Depression Study, a longitudinal 5-year evaluation of psychiatric patients with major depressive disorder, prospective information on 249 patients (92.6%) was available. Time spent in depressive states and the timing of suicide attempts were investigated with life charts. During the follow-up assessment period, there were 106 suicide attempts during 1,018 patient-years. Compared with full remission (incidence rate of 16 per 1,000 patient-years [95% confidence interval (CI)=11.2-40.2]), the incidence rate was 21-fold during major depressive episodes (332 per 1,000 patient-years [95% CI=258.6-419.2]) and fourfold during partial remission (62 per 1,000 patient-years [95% CI=34.6-92.4]). In the Cox proportional hazards model, suicide attempts were predicted by the months spent in a major depressive episode (hazard ratio=7.74 [95% CI=3.40-17.6]) or in partial remission (hazard ratio=4.20 [95% CI=1.71-10.3]), history of suicide attempts (hazard ratio=4.39 [95% CI=1.78-10.8]), age (hazard ratio=0.94 [95% CI=0.91-0.98]), lack of a partner (hazard ratio=2.33 [95% CI=0.97-5.56]), and low perceived social support (hazard ratio=3.57 [95% CI=1.09-11.1]). The adjusted population attributable fraction of the time spent depressed for suicide attempts was 78%. Among patients with major depressive disorder, incidence of suicide attempts varies markedly depending on the level of depression, being highest during major depressive episodes. Although previous attempts and poor social support also indicate risk, the time spent depressed is likely the major factor determining overall long-term risk.
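As a quick orientation to the quantities reported above, the snippet below shows the arithmetic behind an incidence rate per 1,000 patient-years and an unadjusted population attributable fraction (Miettinen's formula). The exposure proportion used in the last step is a hypothetical value chosen only to illustrate the formula; the study's 78% figure is a covariate-adjusted, model-based estimate.

```python
# Back-of-envelope arithmetic for the quantities reported above (illustrative only).

# 1) Incidence rate per 1,000 patient-years = events / patient-years * 1,000.
attempts, patient_years = 106, 1018              # overall figures from the abstract
print(round(attempts / patient_years * 1000))    # ~104 per 1,000 patient-years

# 2) Rate ratio for major depressive episodes vs. full remission, from the
#    reported state-specific rates per 1,000 patient-years.
rate_mde, rate_full_remission = 332, 16
rate_ratio = rate_mde / rate_full_remission
print(round(rate_ratio))                         # ~21-fold

# 3) Miettinen's unadjusted population attributable fraction:
#    PAF = p_c * (RR - 1) / RR, where p_c is the proportion of cases occurring
#    while exposed (here, while depressed). p_c below is a HYPOTHETICAL value
#    used only to show the formula; the study reports a model-adjusted PAF of 78%.
p_c = 0.80
paf = p_c * (rate_ratio - 1) / rate_ratio
print(f"illustrative PAF = {paf:.0%}")           # ~76% under these assumptions
```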
Hansen, Richard A.; Khodneva, Yulia; Glasser, Stephen P.; Qian, Jingjing; Redmond, Nicole; Safford, Monika M.
2018-01-01
Background Mixed evidence suggests second-generation antidepressants may increase risk of cardiovascular and cerebrovascular events. Objective Assess whether antidepressant use is associated with acute coronary heart disease, stroke, cardiovascular disease death, and all-cause mortality. Methods Secondary analyses of the Reasons for Geographic and Racial Differences in Stroke (REGARDS) longitudinal cohort study were conducted. Use of selective serotonin reuptake inhibitors, serotonin and norepinephrine reuptake inhibitors, bupropion, nefazodone, and trazodone was measured during the baseline (2003-2007) in-home visit. Outcomes of coronary heart disease, stroke, cardiovascular disease death, and all-cause mortality were assessed every 6 months and adjudicated by medical record review. Cox proportional hazards time-to-event analysis followed patients until their first event on or before December 31, 2011, iteratively adjusting for covariates. Results Among 29,616 participants, 3,458 (11.7%) used an antidepressant of interest. Intermediate models adjusting for everything but physical and mental health found an increased risk of acute coronary heart disease (Hazard Ratio=1.21; 95% CI 1.04-1.41), stroke (Hazard Ratio=1.28; 95% CI 1.02-1.60), cardiovascular disease death (Hazard Ratio =1.29; 95% CI 1.09-1.53), and all-cause mortality (Hazard Ratio=1.27; 95% CI 1.15-1.41) for antidepressant users. Risk estimates trended in this direction for all outcomes in the fully adjusted model, but only remained statistically associated with increased risk of all-cause mortality (Hazard Ratio=1.12; 95% CI 1.01-1.24). This risk was attenuated in sensitivity analyses censoring follow-up time at 2-years (Hazard Ratio=1.37; 95% CI 1.11-1.68). Conclusions In fully adjusted models antidepressant use was associated with a small increase in all-cause mortality. PMID:26783360
Risk of Cause-Specific Death in Individuals With Diabetes: A Competing Risks Analysis.
Baena-Díez, Jose Miguel; Peñafiel, Judit; Subirana, Isaac; Ramos, Rafel; Elosua, Roberto; Marín-Ibañez, Alejandro; Guembe, María Jesús; Rigo, Fernando; Tormo-Díaz, María José; Moreno-Iribas, Conchi; Cabré, Joan Josep; Segura, Antonio; García-Lareo, Manel; Gómez de la Cámara, Agustín; Lapetra, José; Quesada, Miquel; Marrugat, Jaume; Medrano, Maria José; Berjón, Jesús; Frontera, Guiem; Gavrila, Diana; Barricarte, Aurelio; Basora, Josep; García, Jose María; Pavone, Natalia C; Lora-Pablos, David; Mayoral, Eduardo; Franch, Josep; Mata, Manel; Castell, Conxa; Frances, Albert; Grau, María
2016-11-01
Diabetes is a common cause of shortened life expectancy. We aimed to assess the association between diabetes and cause-specific death. We used the pooled analysis of individual data from 12 Spanish population cohorts with 10-year follow-up. Participants had no previous history of cardiovascular diseases and were 35-79 years old. Diabetes status was self-reported or defined as glycemia >125 mg/dL at baseline. Vital status and causes of death were ascertained by medical records review and linkage with the official death registry. The hazard ratios and cumulative mortality function were assessed with two approaches, with and without competing risks: proportional subdistribution hazard (PSH) and cause-specific hazard (CSH), respectively. Multivariate analyses were fitted for cardiovascular, cancer, and noncardiovascular noncancer deaths. We included 55,292 individuals (15.6% with diabetes and overall mortality of 9.1%). The adjusted hazard ratios showed that diabetes increased mortality risk: 1) cardiovascular death, CSH = 2.03 (95% CI 1.63-2.52) and PSH = 1.99 (1.60-2.49) in men; and CSH = 2.28 (1.75-2.97) and PSH = 2.23 (1.70-2.91) in women; 2) cancer death, CSH = 1.37 (1.13-1.67) and PSH = 1.35 (1.10-1.65) in men; and CSH = 1.68 (1.29-2.20) and PSH = 1.66 (1.25-2.19) in women; and 3) noncardiovascular noncancer death, CSH = 1.53 (1.23-1.91) and PSH = 1.50 (1.20-1.89) in men; and CSH = 1.89 (1.43-2.48) and PSH = 1.84 (1.39-2.45) in women. In all instances, the cumulative mortality function was significantly higher in individuals with diabetes. Diabetes is associated with premature death from cardiovascular disease, cancer, and noncardiovascular noncancer causes. The use of CSH and PSH provides a comprehensive view of mortality dynamics in a population with diabetes. © 2016 by the American Diabetes Association.
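A minimal sketch of the cause-specific hazard (CSH) approach mentioned above is given below, using simulated data and Python's statsmodels: for each cause, a Cox model is fitted with deaths from competing causes treated as censoring. The proportional subdistribution hazard (PSH, Fine-Gray) model instead keeps subjects who die of competing causes in the risk set and is usually fitted with dedicated routines (for example R's cmprsk); it is not reproduced here. Variable names, baseline hazards, and effect sizes are illustrative assumptions, not the study's data or code.

```python
# Cause-specific hazard (CSH) sketch on synthetic data: one Cox model per cause,
# with the competing cause treated as censoring. Not the study's analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
diabetes = rng.integers(0, 2, size=n).astype(float)

# Simulate two competing causes of death; diabetes doubles the CVD hazard.
t_cvd = rng.exponential(1 / (0.02 * np.exp(0.7 * diabetes)))  # true CVD HR ~ 2
t_cancer = rng.exponential(1 / 0.02, size=n)
t_censor = np.full(n, 10.0)                      # 10-year administrative censoring

time = np.minimum.reduce([t_cvd, t_cancer, t_censor])
cause = np.select([t_cvd == time, t_cancer == time], [1, 2], default=0)  # 0 = censored

def cause_specific_hr(event_code):
    """Cox hazard ratio for one cause; competing causes are censored."""
    status = (cause == event_code).astype(int)
    res = sm.PHReg(time, diabetes.reshape(-1, 1), status=status).fit()
    return float(np.exp(res.params[0]))

print("CSH hazard ratio, CVD death:   ", round(cause_specific_hr(1), 2))
print("CSH hazard ratio, cancer death:", round(cause_specific_hr(2), 2))
```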
Pérez de Prado, Armando; López-Gómez, Juan M.; Quiroga, Borja; Goicoechea, Marian; García-Prieto, Ana; Torres, Esther; Reque, Javier; Luño, José
2016-01-01
Background and objectives Supraventricular arrhythmias are associated with high morbidity and mortality. Nevertheless, this condition has received little attention in patients on hemodialysis. The objective of this study was to analyze the incidence of intradialysis supraventricular arrhythmia and its long–term prognostic value. Design, setting, participants, & measurements We designed an observational and prospective study in a cohort of patients on hemodialysis with a 10-year follow-up period. All patients were recruited for study participation and were not recruited for clinical indications. The study population comprised 77 patients (42 men and 35 women; mean age =58±15 years old) with sinus rhythm monitored using a Holter electrocardiogram over six consecutive hemodialysis sessions at recruitment. Results Hypertension was present in 68.8% of patients, and diabetes was present in 29.9% of patients. Supraventricular arrhythmias were recorded in 38 patients (49.3%); all of these were short, asymptomatic, and self-limiting. Age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08) and right atrial enlargement (hazard ratio, 4.29; 95% confidence interval, 1.30 to 14.09) were associated with supraventricular arrhythmia in the multivariate analysis. During a median follow-up of 40 months, 57 patients died, and cardiovascular disease was the main cause of death (52.6%). The variables associated with all-cause mortality in the Cox model were age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08), C-reactive protein (hazard ratio, 1.04 per 1 mg/L; 95% confidence interval, 1.00 to 1.08), and supraventricular arrhythmia (hazard ratio, 3.21; 95% confidence interval, 1.29 to 7.96). Patients with supraventricular arrhythmia also had a higher risk of nonfatal cardiovascular events (hazard ratio, 4.32; 95% confidence interval, 2.11 to 8.83) and symptomatic atrial fibrillation during follow-up (hazard ratio, 17.19; 95% confidence interval, 2.03 to 145.15). Conclusions The incidence of intradialysis supraventricular arrhythmia was high in our hemodialysis study population. Supraventricular arrhythmias were short, asymptomatic, and self-limiting, and although silent, these arrhythmias were independently associated with mortality and cardiovascular events. PMID:27697781
Verde, Eduardo; Pérez de Prado, Armando; López-Gómez, Juan M; Quiroga, Borja; Goicoechea, Marian; García-Prieto, Ana; Torres, Esther; Reque, Javier; Luño, José
2016-12-07
Supraventricular arrhythmias are associated with high morbidity and mortality. Nevertheless, this condition has received little attention in patients on hemodialysis. The objective of this study was to analyze the incidence of intradialysis supraventricular arrhythmia and its long-term prognostic value. We designed an observational and prospective study in a cohort of patients on hemodialysis with a 10-year follow-up period. All patients were recruited for study participation and were not recruited for clinical indications. The study population comprised 77 patients (42 men and 35 women; mean age =58±15 years old) with sinus rhythm monitored using a Holter electrocardiogram over six consecutive hemodialysis sessions at recruitment. Hypertension was present in 68.8% of patients, and diabetes was present in 29.9% of patients. Supraventricular arrhythmias were recorded in 38 patients (49.3%); all of these were short, asymptomatic, and self-limiting. Age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08) and right atrial enlargement (hazard ratio, 4.29; 95% confidence interval, 1.30 to 14.09) were associated with supraventricular arrhythmia in the multivariate analysis. During a median follow-up of 40 months, 57 patients died, and cardiovascular disease was the main cause of death (52.6%). The variables associated with all-cause mortality in the Cox model were age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08), C-reactive protein (hazard ratio, 1.04 per 1 mg/L; 95% confidence interval, 1.00 to 1.08), and supraventricular arrhythmia (hazard ratio, 3.21; 95% confidence interval, 1.29 to 7.96). Patients with supraventricular arrhythmia also had a higher risk of nonfatal cardiovascular events (hazard ratio, 4.32; 95% confidence interval, 2.11 to 8.83) and symptomatic atrial fibrillation during follow-up (hazard ratio, 17.19; 95% confidence interval, 2.03 to 145.15). The incidence of intradialysis supraventricular arrhythmia was high in our hemodialysis study population. Supraventricular arrhythmias were short, asymptomatic, and self-limiting, and although silent, these arrhythmias were independently associated with mortality and cardiovascular events. Copyright © 2016 by the American Society of Nephrology.
LeBlanc, John C.; Pless, I. Barry; King, W. James; Bawden, Harry; Bernard-Bonnin, Anne-Claude; Klassen, Terry; Tenenbein, Milton
2006-01-01
Background Young children may sustain injuries when exposed to certain hazards in the home. To better understand the relation between several childproofing strategies and the risk of injuries to children in the home, we undertook a multicentre case–control study in which we compared hazards in the homes of children with and without injuries. Methods We conducted this case-control study using records from 5 pediatric hospital emergency departments for the 2-year period 1995–1996. The 351 case subjects were children aged 7 years and less who presented with injuries from falls, burns or scalds, ingestions or choking. The matched control subjects were children who presented during the same period with acute non-injury-related conditions. A home visitor, blinded to case-control status, assessed 19 injury hazards at the children's homes. Results Hazards found in the homes included baby walkers (21% of homes with infants), no functioning smoke alarm (17% of homes) and no fire extinguisher (51% of homes). Cases did not differ from controls in the mean proportion of home hazards. After controlling for siblings, maternal education and employment, we found that cases differed from controls for 5 hazards: the presence of a baby walker (odds ratio [OR] 9.0, 95% confidence interval [CI] 1.1–71.0), the presence of choking hazards within a child's reach (OR 2.0, 95% CI 1.0–3.7), no child-resistant lids in bathroom (OR 1.6, 95% CI 1.0–2.5), no smoke alarm (OR 3.2, 95% CI 1.4–7.7) and no functioning smoke alarm (OR 1.7, 95% CI 1.0–2.8). Interpretation Homes of children with injuries differed from those of children without injuries in the proportions of specific hazards for falls, choking, poisoning and burns, with a striking difference noted for the presence of a baby walker. In addition to counselling parents about specific hazards, clinicians should consider that the presence of some hazards may indicate an increased risk for home injuries beyond those directly related to the hazard found. Families with any home hazard may be candidates for interventions to childproof against other types of home hazards. PMID:16998079
Hazard ratio estimation and inference in clinical trials with many tied event times.
Mehrotra, Devan V; Zhang, Yiwei
2018-06-13
The medical literature contains numerous examples of randomized clinical trials with time-to-event endpoints in which large numbers of events accrued over relatively short follow-up periods, resulting in many tied event times. A generally common feature across such examples was that the logrank test was used for hypothesis testing and the Cox proportional hazards model was used for hazard ratio estimation. We caution that this common practice is particularly risky in the setting of many tied event times for two reasons. First, the estimator of the hazard ratio can be severely biased if the Breslow tie-handling approximation for the Cox model (the default in SAS and Stata software) is used. Second, the 95% confidence interval for the hazard ratio can include one even when the corresponding logrank test p-value is less than 0.05. To help establish a better practice, with applicability for both superiority and noninferiority trials, we use theory and simulations to contrast Wald and score tests based on well-known tie-handling approximations for the Cox model. Our recommendation is to report the Wald test p-value and corresponding confidence interval based on the Efron approximation. The recommended test is essentially as powerful as the logrank test, the accompanying point and interval estimates of the hazard ratio have excellent statistical properties even in settings with many tied event times, inferential alignment between the p-value and confidence interval is guaranteed, and implementation is straightforward using commonly used software. Copyright © 2018 John Wiley & Sons, Ltd.
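The contrast described above can be reproduced in a few lines. The hedged sketch below (not the authors' code) simulates a randomized trial with a coarse event-time grid, so that many event times are tied, and fits the Cox model with the Breslow and Efron approximations via statsmodels' PHReg, whose ties argument defaults to "breslow"; the true hazard ratio of 0.7 and the censoring scheme are assumptions made for illustration.

```python
# Breslow vs. Efron tie handling for the Cox model on simulated data with
# many tied event times (illustrative sketch only).
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 4000
treat = rng.integers(0, 2, size=n).astype(float)    # 1:1 randomized arm indicator
true_hr = 0.7                                       # assumed true hazard ratio

latent = rng.exponential(1.0 / np.where(treat == 1, true_hr, 1.0))
time = np.ceil(latent * 4)                          # coarse grid -> heavy ties
status = (time <= 12).astype(int)                   # administrative censoring
time = np.minimum(time, 12.0)

for method in ("breslow", "efron"):
    res = sm.PHReg(time, treat.reshape(-1, 1), status=status, ties=method).fit()
    hr = float(np.exp(res.params[0]))
    lo, hi = np.exp(res.conf_int()[0])
    wald_p = 2 * norm.sf(abs(res.params[0] / res.bse[0]))
    print(f"{method:8s} HR={hr:.3f}  95% CI=({lo:.3f}, {hi:.3f})  Wald p={wald_p:.2g}")
# With this many ties the Breslow fit is typically pulled toward HR=1,
# while the Efron fit stays closer to the assumed true value of 0.7.
```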
Allopurinol and Cardiovascular Outcomes in Adults With Hypertension.
MacIsaac, Rachael L; Salatzki, Janek; Higgins, Peter; Walters, Matthew R; Padmanabhan, Sandosh; Dominiczak, Anna F; Touyz, Rhian M; Dawson, Jesse
2016-03-01
Allopurinol lowers blood pressure in adolescents and has other vasoprotective effects. Whether similar benefits occur in older individuals remains unclear. We hypothesized that allopurinol is associated with improved cardiovascular outcomes in older adults with hypertension. Data from the United Kingdom Clinical Research Practice Datalink were used. Multivariate Cox-proportional hazard models were applied to estimate hazard ratios for stroke and cardiac events (defined as myocardial infarction or acute coronary syndrome) associated with allopurinol use over a 10-year period in adults aged >65 years with hypertension. A propensity-matched design was used to reduce potential for confounding. Allopurinol exposure was a time-dependent variable and was defined as any exposure and then as high (≥300 mg daily) or low-dose exposure. A total of 2032 allopurinol-exposed patients and 2032 matched nonexposed patients were studied. Allopurinol use was associated with a significantly lower risk of both stroke (hazard ratio, 0.50; 95% confidence interval, 0.32-0.80) and cardiac events (hazard ratio, 0.61; 95% confidence interval, 0.43-0.87) than nonexposed control patients. In exposed patients, high-dose treatment with allopurinol (n=1052) was associated with a significantly lower risk of both stroke (hazard ratio, 0.58; 95% confidence interval, 0.36-0.94) and cardiac events (hazard ratio, 0.65; 95% confidence interval, 0.46-0.93) than low-dose treatment (n=980). Allopurinol use is associated with lower rates of stroke and cardiac events in older adults with hypertension, particularly at higher doses. Prospective clinical trials are needed to evaluate whether allopurinol improves cardiovascular outcomes in adults with hypertension. © 2016 American Heart Association, Inc.
Neighborhood disadvantage and ischemic stroke: the Cardiovascular Health Study (CHS).
Brown, Arleen F; Liang, Li-Jung; Vassar, Stefanie D; Stein-Merkin, Sharon; Longstreth, W T; Ovbiagele, Bruce; Yan, Tingjian; Escarce, José J
2011-12-01
Neighborhood characteristics may influence the risk of stroke and contribute to socioeconomic disparities in stroke incidence. The objectives of this study were to examine the relationship between neighborhood socioeconomic status and incident ischemic stroke and examine potential mediators of these associations. We analyzed data from 3834 whites and 785 blacks enrolled in the Cardiovascular Health Study, a multicenter, population-based, longitudinal study of adults ages≥65 years from 4 US counties. The primary outcome was adjudicated incident ischemic stroke. Neighborhood socioeconomic status was measured using a composite of 6 census tract variables. Race-stratified multilevel Cox proportional hazard models were constructed adjusted for sociodemographic, behavioral, and biological risk factors. Among whites, in models adjusted for sociodemographic characteristics, stroke hazard was significantly higher among residents of neighborhoods in the lowest compared with the highest neighborhood socioeconomic status quartile (hazard ratio, 1.32; 95% CI, 1.01-1.72) with greater attenuation of the hazard ratio after adjustment for biological risk factors (hazard ratio, 1.16; 0.88-1.52) than for behavioral risk factors (hazard ratio, 1.30; 0.99-1.70). Among blacks, we found no significant associations between neighborhood socioeconomic status and ischemic stroke. Higher risk of incident ischemic stroke was observed in the most disadvantaged neighborhoods among whites, but not among blacks. The relationship between neighborhood socioeconomic status and stroke among whites appears to be mediated more strongly by biological than behavioral risk factors.
Association Between Cortisol to DHEA-s Ratio and Sickness Absence in Japanese Male Workers.
Hirokawa, Kumi; Fujii, Yasuhito; Taniguchi, Toshiyo; Takaki, Jiro; Tsutsumi, Akizumi
2018-06-01
This study aimed to investigate the association between serum levels of cortisol and dehydroepiandrosterone sulfate (DHEA-s) and sickness absence over 2 years in Japanese male workers. A baseline survey including questions about health behavior, along with blood sampling for cortisol and DHEA-s, was conducted in 2009. In total, 429 men (mean ± SD age, 52.9 ± 8.6 years) from whom blood samples were collected at baseline were followed until December 31, 2011. The hazard ratios (HR) and 95% confidence intervals (CI) for sickness absence were calculated using a Cox proportional hazard model, adjusted for potential confounders. Among 35 workers who took sickness absences, 31 had physical illness. A high cortisol to DHEA-s ratio increased the risk of sickness absence (crude HR = 2.68, 95% CI 1.12-6.41; adjusted HR = 3.33, 95% CI 1.35-8.20). The cortisol to DHEA-s ratio was linearly associated with an increased risk of sickness absence (p for trend < .050). Single effects of cortisol and DHEA-s levels were not associated with sickness absences. This trend did not change when limited to absences resulting from physical illness. Hormonal conditions related to the hypothalamus-pituitary-adrenocortical axis and adrenal function should be considered when predicting sickness absence. The cortisol to DHEA-s ratio may be more informative than single effects of cortisol and DHEA-s levels.
Formiga, Francesc; Ferrer, Assumpta; Padros, Gloria; Montero, Abelardo; Gimenez-Argente, Carme; Corbella, Xavier
2016-01-01
Objective To investigate the predictive value of functional impairment, chronic conditions, and laboratory biomarkers of aging for predicting 5-year mortality in the elderly aged 85 years. Methods Predictive value for mortality of different geriatric assessments carried out during the OCTABAIX study was evaluated after 5 years of follow-up in 328 subjects aged 85 years. Measurements included assessment of functional status comorbidity, along with laboratory tests on vitamin D, cholesterol, CD4/CD8 ratio, hemoglobin, and serum thyrotropin. Results Overall, the mortality rate after 5 years of follow-up was 42.07%. Bivariate analysis showed that patients who survived were predominantly female (P=0.02), and they showed a significantly better baseline functional status for both basic (P<0.001) and instrumental (P<0.001) activities of daily living (Barthel and Lawton index), better cognitive performance (Spanish version of the Mini-Mental State Examination) (P<0.001), lower comorbidity conditions (Charlson) (P<0.001), lower nutritional risk (Mini Nutritional Assessment) (P<0.001), lower risk of falls (Tinetti gait scale) (P<0.001), less percentage of heart failure (P=0.03) and chronic obstructive pulmonary disease (P=0.03), and took less chronic prescription drugs (P=0.002) than nonsurvivors. Multivariate Cox regression analysis identified a decreased score in the Lawton index (hazard ratio 0.86, 95% confidence interval: 0.78–0.91) and higher comorbidity conditions (hazard ratio 1.20, 95% confidence interval: 1.08–1.33) as independent predictors of mortality at 5 years in the studied population. Conclusion The ability to perform instrumental activities of daily living and the global comorbidity assessed at baseline were the predictors of death, identified in our 85-year-old community-dwelling subjects after 5 years of follow-up. PMID:27143867
Association between Late-Life Social Activity and Motor Decline in Older Adults
Buchman, Aron S.; Boyle, Patricia A.; Wilson, Robert S.; Fleischman, Debra A.; Leurgans, Sue; Bennett, David A.
2009-01-01
Background Loss of motor function is a common consequence of aging, but little is known about factors that predict idiopathic motor decline. Methods We studied 906 persons without dementia, history of stroke or Parkinson's disease participating in the Rush Memory and Aging Project. At baseline, they rated their frequency of participation in common social activities. Outcome was annual change in global motor function, based on nine measures of muscle strength and nine motor performances. Results Mean social activity score at baseline was 2.6 (SD=0.58), with higher scores indicating more frequent participation in social activities. In a generalized estimating equation model, controlling for age, sex and education, motor function declined by about 0.05 unit/year [Estimate, -0.049; 95%CI (-0.057, -0.041)]. Each 1-point decrease in social activity was associated with about a 33% more rapid rate of decline in motor function [Estimate, 0.016; 95%CI (0.003, 0.029); p=0.017]. This amount of annual motor decline was associated with a more than 40% increased risk of death (Hazard Ratio: 1.44; 95%CI: 1.30, 1.60) and 65% increased risk of incident Katz disability (Hazard Ratio: 1.65; 95%CI: 1.48, 1.83). The association of social activity with change in motor function did not vary along demographic lines and was unchanged after controlling for potential confounders including late-life physical and cognitive activity, disability, global cognition, depressive symptoms, body composition and chronic medical conditions [Estimate, 0.025; 95%CI (0.005, 0.045); p=0.010]. Conclusion Less frequent participation in social activities is associated with a more rapid rate of motor decline in old age. PMID:19546415
Alashi, Alaa; Mentias, Amgad; Patel, Krishna; Gillinov, A Marc; Sabik, Joseph F; Popović, Zoran B; Mihaljevic, Tomislav; Suri, Rakesh M; Rodriguez, L Leonardo; Svensson, Lars G; Griffin, Brian P; Desai, Milind Y
2016-07-01
In asymptomatic patients with ≥3+ mitral regurgitation and preserved left ventricular (LV) ejection fraction who underwent mitral valve surgery, we sought to discover whether baseline LV global longitudinal strain (LV-GLS) and brain natriuretic peptide provided incremental prognostic utility. Four hundred and forty-eight asymptomatic patients (61±12 years and 69% men) with ≥3+ primary mitral regurgitation and preserved left ventricular ejection fraction, who underwent mitral valve surgery (92% repair) at our center between 2005 and 2008, were studied. Baseline clinical and echocardiographic data (including LV-GLS using Velocity Vector Imaging, Siemens, PA) were recorded. The Society of Thoracic Surgeons score was calculated. The primary outcome was death. Mean Society of Thoracic Surgeons score, left ventricular ejection fraction, mitral effective regurgitant orifice, indexed LV end-diastolic volume, and right ventricular systolic pressure were 4±1%, 62±3%, 0.55±0.2 cm(2), 58±13 cc/m(2), and 37±15 mm Hg, respectively. Forty-five percent of patients had flail. Median log-transformed BNP and LV-GLS were 4.04 (absolute brain natriuretic peptide: 60 pg/dL) and -20.7%. At 7.7±2 years, death occurred in 41 patients (9%; 0% at 30 days). On Cox analysis, a higher Society of Thoracic Surgeons score (hazard ratio 1.55), higher baseline right ventricular systolic pressure (hazard ratio 1.11), more abnormal LV-GLS (hazard ratio 1.17), and higher median log-transformed BNP (hazard ratio 2.26) were associated with worse longer-term survival (all P<0.01). Addition of LV-GLS and median log-transformed BNP to a clinical model (Society of Thoracic Surgeons score and baseline right ventricular systolic pressure) provided incremental prognostic utility (χ(2) for longer-term mortality increased from 31-47 to 61; P<0.001). In asymptomatic patients with significant primary mitral regurgitation and preserved left ventricular ejection fraction who underwent mitral valve surgery, brain natriuretic peptide and LV-GLS provided synergistic risk stratification, independent of established factors. © 2016 American Heart Association, Inc.
Feng, Xiaming; Wang, Xin; Cai, Wei; Qiu, Shuilai; Hu, Yuan; Liew, Kim Meow
2016-09-28
Practical application of functionalized graphene in polymeric nanocomposites is hampered by the lack of cost-effective and eco-friendly methods for its production. Here, we reported a facile and green electrochemical approach for preparing ferric phytate functionalized graphene (f-GNS) by simultaneously utilizing biobased phytic acid as electrolyte and modifier for the first time. Due to the presence of phytic acid, electrochemical exfoliation leads to low oxidized graphene sheets (a C/O ratio of 14.8) that are tens of micrometers large. Successful functionalization of graphene was confirmed by the appearance of phosphorus and iron peaks in the X-ray photoelectron spectrum. Further, high-performance polylactic acid/f-GNS nanocomposites are readily fabricated by a convenient masterbatch strategy. Notably, inclusion of well-dispersed f-GNS resulted in dramatic suppression of the fire hazards of polylactic acid in terms of reduced peak heat-release rate (decreased by 40%), low CO yield, and formation of a highly graphitized protective char layer. Moreover, obvious improvements in crystallization rate and thermal conductivities of polylactic acid nanocomposites were observed, highlighting its promising potential in practical application. This novel strategy toward the simultaneous exfoliation and functionalization of graphene demonstrates a simple yet very effective approach for fabricating graphene-based flame retardants.
Serum metabolomic profiling and incident CKD among African Americans.
Yu, Bing; Zheng, Yan; Nettleton, Jennifer A; Alexander, Danny; Coresh, Josef; Boerwinkle, Eric
2014-08-07
Novel biomarkers that more accurately reflect kidney function and predict future CKD are needed. The human metabolome is the product of multiple physiologic or pathophysiologic processes and may provide novel insight into disease etiology and progression. This study investigated whether estimated kidney function would be associated with multiple metabolites and whether selected metabolomic factors would be independent risk factors for incident CKD. In total, 1921 African Americans free of CKD with a median of 19.6 years follow-up among the Atherosclerosis Risk in Communities Study were included. A total of 204 serum metabolites quantified by untargeted gas chromatography-mass spectrometry and liquid chromatography-mass spectrometry was analyzed by both linear regression for the cross-sectional associations with eGFR (specified by the Chronic Kidney Disease Epidemiology Collaboration equation) and Cox proportional hazards model for the longitudinal associations with incident CKD. Forty named and 34 unnamed metabolites were found to be associated with eGFR specified by the Chronic Kidney Disease Epidemiology Collaboration equation with creatine and 3-indoxyl sulfate showing the strongest positive (2.8 ml/min per 1.73 m(2) per +1 SD; 95% confidence interval, 2.1 to 3.5) and negative association (-14.2 ml/min per 1.73 m(2) per +1 SD; 95% confidence interval, -17.0 to -11.3), respectively. Two hundred four incident CKD events with a median follow-up time of 19.6 years were included in the survival analyses. Higher levels of 5-oxoproline (hazard ratio, 0.70; 95% confidence interval, 0.60 to 0.82) and 1,5-anhydroglucitol (hazard ratio, 0.68; 95% confidence interval, 0.58 to 0.80) were significantly related to lower risk of incident CKD, and the associations did not appreciably change when mutually adjusted. These data identify a large number of metabolites associated with kidney function as well as two metabolites that are candidate risk factors for CKD and may provide new insights into CKD biomarker identification. Copyright © 2014 by the American Society of Nephrology.
Konishi, S; Ng, C F S; Stickley, A; Watanabe, C
2016-08-01
Having an allergic disease may have health implications beyond those more commonly associated with allergy given that previous epidemiological studies have suggested that both atopy and allergy are linked to mortality. More viable immune functioning among the elderly, as indicated by the presence of an allergic disease, might therefore be associated with differences in all-cause mortality. Using data from a Japanese cohort, this study examined whether having pollinosis (a form of allergic rhinitis) in a follow-up survey could predict all-cause and cause-specific mortality. Data came from the Komo-Ise cohort, which at its 1993 baseline recruited residents aged 40-69 years from two areas in Gunma prefecture, Japan. The current study used information on pollinosis that was obtained from the follow-up survey in 2000. Mortality and migration data were obtained throughout the follow-up period up to December 2008. Proportional hazard models were used to examine the relation between pollinosis and mortality. At the 2000 follow-up survey, 12% (1088 of 8796) of respondents reported that they had pollinosis symptoms in the past 12 months. During the 76 186 person-years of follow-up, 748 died from all causes. Among these, there were 37 external, 208 cardiovascular, 74 respiratory, and 329 neoplasm deaths. After adjusting for potential confounders, pollinosis was associated with significantly lower all-cause [hazard ratio 0.57 (95% confidence interval = 0.38-0.87)] and neoplasms mortality [hazard ratio 0.48 (95% confidence interval = 0.26-0.92)]. Having an allergic disease (pollinosis) at an older age may be indicative of more viable immune functioning and be protective against certain causes of death. Further research is needed to determine the possible mechanisms underlying the association between pollinosis and mortality. © 2015 John Wiley & Sons Ltd.
Hypermetabolism is a deleterious prognostic factor in patients with amyotrophic lateral sclerosis.
Jésus, P; Fayemendy, P; Nicol, M; Lautrette, G; Sourisseau, H; Preux, P-M; Desport, J-C; Marin, B; Couratier, P
2018-01-01
The aim of this study was to investigate patients with amyotrophic lateral sclerosis in order to determine their nutritional, neurological and respiratory parameters, and survival according to metabolic level. Nutritional assessment included resting energy expenditure (REE) measured by indirect calorimetry [hypermetabolism if REE variation (ΔREE) > 10%] and fat mass (FM) using impedancemetry. Neurological assessment included the Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised score. Survival analysis used the Kaplan-Meier method and multivariate Cox model. A total of 315 patients were analysed. Median age at diagnosis was 65.9 years and 55.2% of patients were hypermetabolic. With regard to the metabolic level (ΔREE: < 10%, 10-20% and >20%), patients with ΔREE > 20% initially had a lower FM(29.7% vs. 32.1% in those with ΔREE ≤10%; P = 0.0054). During follow-up, the median slope of Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised tended to worsen more in patients with ΔREE > 20% (-1.4 vs. -1.0 points/month in those with ΔREE ≤10%; P = 0.07). Overall median survival since diagnosis was 18.4 months. ΔREE > 20% tended to increase the risk of dying compared with ΔREE ≤10% (hazard ratio, 1.33; P = 0.055). In multivariate analysis, an increased REE:FM ratio was independently associated with death (hazard ratio, 1.005; P = 0.001). Hypermetabolism is present in more than half of patients with amyotrophic lateral sclerosis. It modifies the body composition at diagnosis, and patients with hypermetabolism >20% have a worse prognosis than those without hypermetabolism. © 2017 EAN.
Apolipoprotein E and mortality in African-Americans and Yoruba.
Lane, Kathleen A; Gao, Sujuan; Hui, Siu L; Murrell, Jill R; Hall, Kathleen S; Hendrie, Hugh C
2003-10-01
The literature on the association between apolipoprotein E (ApoE) and mortality across ethnic and age groups has been inconsistent. No studies have looked at this association in developing countries. We used data from the Indianapolis-Ibadan Dementia study to examine this association between APOE and mortality in 354 African-Americans from Indianapolis and 968 Yoruba from Ibadan, Nigeria. Participants were followed up to 9.5 years for Indianapolis and 8.7 years for Ibadan. Subjects from both sites were divided into 2 groups based upon age at baseline. A Cox proportional hazards regression model adjusting for age at baseline, education, hypertension, smoking history and gender in addition to time-dependent covariates of cancer, diabetes, heart disease, stroke, and dementia was fit for each cohort and age group. Having ApoE epsilon4 alleles significantly increased mortality risk in Indianapolis subjects under age 75 (hazard ratio: 2.00; 95% CI: 1.19-3.35; p = 0.0089). No association was found in Indianapolis subjects 75 and older (hazard ratio: 0.71; 95% CI: 0.45-1.10; p = 0.1238), Ibadan subjects under 75 (hazard ratio: 1.04; 95% CI: 0.78 to 1.40; p = 0.7782), or Ibadan subjects over 75 (hazard ratio: 1.21; 95% CI: 0.83 to 1.75; p = 0.3274).
Apolipoprotein E and mortality in African-Americans and Yoruba
Lane, Kathleen A.; Gao, Sujuan; Hui, Siu L.; Murrell, Jill R.; Hall, Kathleen S.; Hendrie, Hugh C.
2011-01-01
The literature on the association between apolipoprotein E (ApoE) and mortality across ethnic and age groups has been inconsistent. No studies have looked at this association in developing countries. We used data from the Indianapolis-Ibadan Dementia study to examine this association between APOE and mortality in 354 African-Americans from Indianapolis and 968 Yoruba from Ibadan, Nigeria. Participants were followed up to 9.5 years for Indianapolis and 8.7 years for Ibadan. Subjects from both sites were divided into 2 groups based upon age at baseline. A Cox proportional hazards regression model adjusting for age at baseline, education, hypertension, smoking history and gender in addition to time-dependent covariates of cancer, diabetes, heart disease, stroke, and dementia was fit for each cohort and age group. Having ApoE ε4 alleles significantly increased mortality risk in Indianapolis subjects under age 75 ( hazard ratio: 2.00; 95% CI: 1.19–3.35; p = 0.0089). No association was found in Indianapolis subjects 75 and older (hazard ratio: 0.71; 95% CI: 0.45–1.10; p = 0.1238), Ibadan subjects under 75 (hazard ratio: 1.04; 95% CI: 0.78 to 1.40; p = 0.7782), or Ibadan subjects over 75 (hazard ratio: 1.21; 95% CI: 0.83 to 1.75; p = 0.3274). PMID:14646029
Coffee and risk of death from hepatocellular carcinoma in a large cohort study in Japan.
Kurozawa, Y; Ogimoto, I; Shibata, A; Nose, T; Yoshimura, T; Suzuki, H; Sakata, R; Fujita, Y; Ichikawa, S; Iwai, N; Tamakoshi, A
2005-09-05
We examined the relation between coffee drinking and hepatocellular carcinoma (HCC) mortality in the Japan Collaborative Cohort Study for Evaluation of Cancer Risk (JACC Study). In total, 110,688 cohort members (46,399 male and 64,289 female subjects) aged 40-79 years were grouped by coffee intake into three categories: one or more cups per day, less than one cup per day and non-coffee drinkers. Cox proportional hazards model by SAS was used to obtain hazard ratio of HCC mortality for each coffee consumption categories. The hazard ratios were adjusted for age, gender, educational status, history of diabetes and liver diseases, smoking habits and alcohol. The hazard ratio of death due to HCC for drinkers of one and more cups of coffee per day, compared with non-coffee drinkers, was 0.50 (95% confidence interval 0.31-0.79), and the ratio for drinkers of less than one cup per day was 0.83 (95% confidence interval 0.54-1.25). Our data confirmed an inverse association between coffee consumption and HCC mortality.
Coffee and risk of death from hepatocellular carcinoma in a large cohort study in Japan
Kurozawa, Y; Ogimoto, I; Shibata, A; Nose, T; Yoshimura, T; Suzuki, H; Sakata, R; Fujita, Y; Ichikawa, S; Iwai, N; Tamakoshi, A
2005-01-01
We examined the relation between coffee drinking and hepatocellular carcinoma (HCC) mortality in the Japan Collaborative Cohort Study for Evaluation of Cancer Risk (JACC Study). In total, 110 688 cohort members (46 399 male and 64 289 female subjects) aged 40–79 years were grouped by coffee intake into three categories: one or more cups per day, less than one cup per day and non-coffee drinkers. Cox proportional hazards model by SAS was used to obtain hazard ratio of HCC mortality for each coffee consumption categories. The hazard ratios were adjusted for age, gender, educational status, history of diabetes and liver diseases, smoking habits and alcohol. The hazard ratio of death due to HCC for drinkers of one and more cups of coffee per day, compared with non-coffee drinkers, was 0.50 (95% confidence interval 0.31–0.79), and the ratio for drinkers of less than one cup per day was 0.83 (95% confidence interval 0.54–1.25). Our data confirmed an inverse association between coffee consumption and HCC mortality. PMID:16091758
Suicide Following Deliberate Self-Harm.
Olfson, Mark; Wall, Melanie; Wang, Shuai; Crystal, Stephen; Gerhard, Tobias; Blanco, Carlos
2017-08-01
The authors sought to identify risk factors for repeat self-harm and completed suicide over the following year among adults with deliberate self-harm. A national cohort of Medicaid-financed adults clinically diagnosed with deliberate self-harm (N=61,297) was followed for up to 1 year. Repeat self-harm per 1,000 person-years and suicide rates per 100,000 person-years (based on cause of death information from the National Death Index) were determined. Hazard ratios of repeat self-harm and suicide were estimated by Cox proportional hazard models. During the 12 months after nonfatal self-harm, the rate of repeat self-harm was 263.2 per 1,000 person-years and the rate of completed suicide was 439.1 per 100,000 person-years, or 37.2 times higher than in a matched general population cohort. The hazard of suicide was higher after initial self-harm events involving violent as compared with nonviolent methods (hazard ratio=7.5, 95% CI=5.5-10.1), especially firearms (hazard ratio=15.86, 95% CI=10.7-23.4; computed with poisoning as reference), and to a lesser extent after events of patients who had recently received outpatient mental health care (hazard ratio=1.6, 95% CI=1.2-2.0). Compared with self-harm patients using nonviolent methods, those who used violent methods were at significantly increased risk of suicide during the first 30 days after the initial event (hazard ratio=17.5, 95% CI=11.2-27.3), but not during the following 335 days. Adults treated for deliberate self-harm frequently repeat self-harm in the following year. Patients who use a violent method for their initial self-harm, especially firearms, have an exceptionally high risk of suicide, particularly right after the initial event, which highlights the importance of careful assessment and close follow-up of this group.
Chuang, Michael L.; Gona, Philimon; Salton, Carol J.; Yeon, Susan B.; Kissinger, Kraig V.; Blease, Susan J.; Levy, Daniel; O'Donnell, Christopher J.; Manning, Warren J.
2013-01-01
We sought to determine whether depressed myocardial contraction fraction (MCF, the ratio of left ventricular (LV) stroke volume to myocardial volume) predicts cardiovascular disease (CVD) events in initially healthy adults. A subset (N=318, 60±9 yrs, 158 men) of the Framingham Heart Study Offspring cohort free of clinical CVD underwent volumetric cardiovascular magnetic resonance (CMR) imaging in 1998–1999. LV ejection fraction (EF), mass and MCF were determined. "Hard" CVD events comprised cardiovascular death, myocardial infarction, stroke or new heart failure. A Cox proportional hazards model adjusting for Framingham Coronary Risk Score (FCRS) was used to estimate hazard ratios for incident hard CVD events for sex-specific quartiles of MCF, LV mass and LVEF. The lowest quartile of LV mass and highest quartiles of MCF and EF served as referent. Kaplan-Meier survival plots and the log rank test were used to compare event-free survival. MCF was greater in women (0.58±0.13) than men (0.52±0.11), p<0.01. Nearly all (99%) participants had EF ≥ 0.55. Over up to 9-year (median 5.2) follow-up, 31 participants (10%) experienced an incident hard CVD event. Participants in the lowest MCF quartile were 7 times more likely to develop hard CVD (hazard ratio 7.11, p=0.010) compared with the referent quartile, and the elevated hazards persisted even after adjustment for LV mass (hazard ratio=6.09, p=0.020). The highest quartile of LV mass/height^2.7 had a nearly five-fold risk (hazard ratio 4.68, p=0.016). Event-free survival was shorter in lowest-quartile MCF, p = 0.0006, but not in lowest-quartile LVEF. Conclusion: In a cohort of adults initially without clinical CVD, lowest-quartile MCF conferred an increased hazard for hard CVD events after adjustment for traditional CVD risk factors and LV mass. PMID:22381161
Chuang, Michael L; Gona, Philimon; Salton, Carol J; Yeon, Susan B; Kissinger, Kraig V; Blease, Susan J; Levy, Daniel; O'Donnell, Christopher J; Manning, Warren J
2012-05-15
We sought to determine whether depressed myocardial contraction fraction (MCF; ratio of left ventricular [LV] stroke volume to myocardial volume) predicts cardiovascular disease (CVD) events in initially healthy adults. A subset (n = 318, 60 ± 9 years old, 158 men) of the Framingham Heart Study Offspring cohort free of clinical CVD underwent volumetric cardiovascular magnetic resonance imaging in 1998 through 1999. LV ejection fraction (EF), mass, and MCF were determined. "Hard" CVD events consisted of cardiovascular death, myocardial infarction, stroke, or new heart failure. A Cox proportional hazards model adjusting for Framingham Coronary Risk Score was used to estimate hazard ratios for incident hard CVD events for gender-specific quartiles of MCF, LV mass, and LVEF. The lowest quartile of LV mass and highest quartiles of MCF and EF served as referents. Kaplan-Meier survival plots and log-rank test were used to compare event-free survival. MCF was greater in women (0.58 ± 0.13) than in men (0.52 ± 0.11, p <0.01). Nearly all participants (99%) had EF ≥0.55. During an up to 9-year follow-up (median 5.2), 31 participants (10%) developed an incident hard CVD event. Participants in the lowest MCF quartile were 7 times more likely to develop a hard CVD event (hazard ratio 7.11, p = 0.010) compared to the remaining quartiles, and increased hazards persisted even after adjustment for LV mass (hazard ratio 6.09, p = 0.020). The highest quartile of LV mass/height^2.7 had a nearly fivefold risk (hazard ratio 4.68, p = 0.016). Event-free survival was shorter in lowest-quartile MCF (p = 0.0006) but not in lowest-quartile LVEF. In conclusion, in a cohort of adults initially without clinical CVD, lowest-quartile MCF conferred an increased hazard for hard CVD events after adjustment for traditional CVD risk factors and LV mass. Copyright © 2012 Elsevier Inc. All rights reserved.
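Since both records above hinge on the definition of the myocardial contraction fraction, a short worked example may help: MCF is LV stroke volume divided by LV myocardial volume, with myocardial volume obtained here from LV mass under an assumed myocardial density of about 1.05 g/mL. The input values below are hypothetical and are not Framingham Offspring data.

```python
# Worked example of the myocardial contraction fraction (MCF) defined above:
# MCF = LV stroke volume / LV myocardial volume. Myocardial volume is derived
# from LV mass using an ASSUMED myocardial density of 1.05 g/mL; the inputs
# are hypothetical, not cohort data.
MYOCARDIAL_DENSITY_G_PER_ML = 1.05

def mcf(edv_ml: float, esv_ml: float, lv_mass_g: float) -> float:
    stroke_volume = edv_ml - esv_ml                               # mL
    myocardial_volume = lv_mass_g / MYOCARDIAL_DENSITY_G_PER_ML   # mL
    return stroke_volume / myocardial_volume

# Hypothetical participant: EDV 130 mL, ESV 50 mL, LV mass 145 g.
print(round(mcf(130, 50, 145), 2))   # ~0.58, i.e. near the mean reported for women above
```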
Socioeconomic disparities in outcomes after acute myocardial infarction.
Bernheim, Susannah M; Spertus, John A; Reid, Kimberly J; Bradley, Elizabeth H; Desai, Rani A; Peterson, Eric D; Rathore, Saif S; Normand, Sharon-Lise T; Jones, Philip G; Rahimi, Ali; Krumholz, Harlan M
2007-02-01
Patients of low socioeconomic status (SES) have higher mortality after acute myocardial infarction (AMI). Little is known about the underlying mechanisms or the relationship between SES and rehospitalization after AMI. We analyzed data from the PREMIER observational study, which included 2142 patients hospitalized with AMI from 18 US hospitals. Socioeconomic status was measured by self-reported household income and education level. Sequential multivariable modeling assessed the relationship of socioeconomic factors with 1-year all-cause mortality and all-cause rehospitalization after adjustment for demographics, clinical factors, and quality-of-care measures. Both household income and education level were associated with higher risk of mortality (hazard ratio 2.80, 95% CI 1.37-5.72, lowest to highest income group) and rehospitalization after AMI (hazard ratio 1.55, 95% CI 1.17-2.05). Patients with low SES had worse clinical status at admission and received poorer quality of care. In multivariable modeling, the relationship between household income and mortality was attenuated by adjustment for demographic and clinical factors (hazard ratio 1.19, 95% CI 0.54-2.62), with a further small decrement in the hazard ratio after adjustment for quality of care. The relationship between income and rehospitalization was only partly attenuated by demographic and clinical factors (hazard ratio 1.38, 95% CI 1.01-1.89) and was not influenced by adjustment for quality of care. Patients' baseline clinical status largely explained the relationship between SES and mortality, but not rehospitalization, among patients with AMI.
Transplantation Outcomes for Children with Hypodiploid Acute Lymphoblastic Leukemia.
Mehta, Parinda A; Zhang, Mei-Jie; Eapen, Mary; He, Wensheng; Seber, Adriana; Gibson, Brenda; Camitta, Bruce M; Kitko, Carrie L; Dvorak, Christopher C; Nemecek, Eneida R; Frangoul, Haydar A; Abdel-Azim, Hisham; Kasow, Kimberly A; Lehmann, Leslie; Gonzalez Vicent, Marta; Diaz Pérez, Miguel A; Ayas, Mouhab; Qayed, Muna; Carpenter, Paul A; Jodele, Sonata; Lund, Troy C; Leung, Wing H; Davies, Stella M
2015-07-01
Children with hypodiploid acute lymphoblastic leukemia (ALL) have inferior outcomes despite intensive risk-adapted chemotherapy regimens. We describe 78 children with hypodiploid ALL who underwent hematopoietic stem cell transplantation between 1990 and 2010. Thirty-nine (50%) patients had ≤ 43 chromosomes, 12 (15%) had 44 chromosomes, and 27 (35%) had 45 chromosomes. Forty-three (55%) patients underwent transplantation in first remission (CR1) and 35 (45%) underwent transplantation in ≥ second remission (CR2). Twenty-nine patients (37%) received a graft from a related donor and 49 (63%) from an unrelated donor. All patients received a myeloablative conditioning regimen. The 5-year probabilities of leukemia-free survival, overall survival, relapse, and treatment-related mortality for the entire cohort were 51%, 56%, 27%, and 22%, respectively. Multivariate analysis confirmed that mortality risks were higher for patients who underwent transplantation in CR2 (hazard ratio, 2.16; P = .05), with number of chromosomes ≤ 43 (hazard ratio, 2.15; P = .05), and for those who underwent transplantation in the first decade of the study period (hazard ratio, 2.60; P = .01). Similarly, treatment failure risks were higher with number of chromosomes ≤ 43 (hazard ratio, 2.28; P = .04) and the earlier transplantation period (hazard ratio, 2.51; P = .01). Although survival is better with advances in donor selection and supportive care, disease-related risk factors significantly influence transplantation outcomes. Copyright © 2015 American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
Association between chronic azotemic kidney disease and the severity of periodontal disease in dogs.
Glickman, Lawrence T; Glickman, Nita W; Moore, George E; Lund, Elizabeth M; Lantz, Gary C; Pressler, Barrak M
2011-05-01
Naturally occurring periodontal disease affects >75% of dogs and has been associated with cardiac lesions and presumptive endocarditis. However, the relationships between periodontal disease and chronic kidney disease (CKD) in dogs have not been studied. In a retrospective longitudinal study the incidence of azotemic CKD was compared between a cohort of 164,706 dogs with periodontal disease and a cohort of age-matched dogs with no periodontal disease from a national primary care practice. These dogs contributed 415,971 dog-years of follow-up from 2002 to 2008. Hazard ratios and 95% confidence intervals from Cox regression were used to compare the incidence of azotemic CKD in dogs with stage 1, 2, or 3/4 periodontal disease to dogs with no periodontal disease. The hazard ratio for azotemic CKD increased with increasing severity of periodontal disease (stage 1 hazard ratio=1.8, 95% confidence interval: 1.6, 2.1; stage 2 hazard ratio=2.0, 95% confidence interval: 1.7, 2.3; stage 3/4 hazard ratio=2.7, 95% confidence interval: 2.3, 3.0; P(trend)=<0.0001) after adjustment for age, gender, neuter status, breed, body weight, number of hospital visits, and dental procedures. Increasing severity of periodontal disease was also associated with serum creatinine >1.4 mg/dl and blood urea nitrogen >36 mg/dl, independent of a veterinarian's clinical diagnosis of CKD. Copyright © 2011 Elsevier B.V. All rights reserved.
Feng, Tom; Howard, Lauren E; Vidal, Adriana C; Moreira, Daniel M; Castro-Santamaria, Ramiro; Andriole, Gerald L; Freedland, Stephen J
2017-02-01
To determine if cholesterol is a risk factor for the development of lower urinary tract symptoms in asymptomatic men. A post-hoc analysis of the Reduction by Dutasteride of Prostate Cancer Events (REDUCE) study was carried out in 2323 men with baseline International Prostate Symptom Score <8 and not taking benign prostatic hyperplasia or cholesterol medications. Cox proportional hazards models were used to test the association between cholesterol, high-density lipoprotein, low-density lipoprotein and the cholesterol : high-density lipoprotein ratio with incident lower urinary tract symptoms, defined as first report of medical treatment, surgery or two reports of an International Prostate Symptom Score >14. A total of 253 men (10.9%) developed incident lower urinary tract symptoms. On crude analysis, higher high-density lipoprotein was associated with a decreased lower urinary tract symptoms risk (hazard ratio 0.89, P = 0.024), whereas total cholesterol and low-density lipoprotein showed no association. After multivariable adjustment, the association between high-density lipoprotein and incident lower urinary tract symptoms remained significant (hazard ratio 0.89, P = 0.044), whereas no association was observed for low-density lipoprotein (P = 0.611). There was a trend for higher cholesterol to be linked with higher lower urinary tract symptoms risk, though this was not statistically significant (hazard ratio 1.04, P = 0.054). A higher cholesterol : high-density lipoprotein ratio was associated with increased lower urinary tract symptoms risk on crude (hazard ratio 1.11, P = 0.016) and adjusted models (hazard ratio 1.12, P = 0.012). Among asymptomatic men participating in the REDUCE study, higher cholesterol was associated with increased incident lower urinary tract symptoms risk, though the association was not significant. A higher cholesterol : high-density lipoprotein ratio was associated with increased incident lower urinary tract symptoms, whereas higher high-density lipoprotein was protective. These findings suggest dyslipidemia might play a role in lower urinary tract symptoms progression. © 2016 The Japanese Urological Association.
Mackenzie, P; Pryor, D; Burmeister, E; Foote, M; Panizza, B; Burmeister, B; Porceddu, S
2014-10-01
To determine prognostic factors for locoregional relapse (LRR), distant relapse and all-cause death in a contemporary cohort of locoregionally advanced oropharyngeal squamous cell carcinoma (OSCC) treated with definitive chemoradiotherapy or radiotherapy alone. OSCC patients treated with definitive radiotherapy between 2005 and 2010 were identified from a prospective head and neck database. Patient age, gender, smoking history, human papillomavirus (HPV) status, T- and N-category, lowest involved nodal level and gross tumour volume of the primary (GTV-p) and nodal (GTV-n) disease were analysed in relation to LRR, distant relapse and death by way of univariate and multivariate analysis. In total, 130 patients were identified, 88 HPV positive, with a median follow-up of 42 months. On multivariate analysis HPV status was a significant predictor of LRR (hazard ratio 0.15; 95% confidence interval 0.05-0.51) and death (hazard ratio 0.29; 95% confidence interval 0.14-0.59) but not distant relapse (hazard ratio 0.53, 95% confidence interval 0.22-1.27). Increasing T-category was associated with a higher risk of LRR (hazard ratio 1.80 for T3/4 versus T1/2; 95% confidence interval 1.08-2.99), death (hazard ratio 1.37, 95% confidence interval 1.06-1.77) and distant relapse (hazard ratio 1.35; 95% confidence interval 1.00-1.83). Increasing GTV-p was associated with increased risk of distant relapse and death. N3 disease and low neck nodes were significant for LRR, distant relapse and death on univariate analysis only. Tumour HPV status was the strongest predictor of LRR and death. T-category is more predictive of distant relapse and may provide additional prognostic value for LRR and death when accounting for HPV status. Copyright © 2014 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Giebel, Sebastian; Labopin, Myriam; Socié, Gerard; Beelen, Dietrich; Browne, Paul; Volin, Liisa; Kyrcz-Krzemien, Slawomira; Yakoub-Agha, Ibrahim; Aljurf, Mahmoud; Wu, Depei; Michallet, Mauricette; Arnold, Renate; Mohty, Mohamad; Nagler, Arnon
2017-01-01
Allogeneic hematopoietic cell transplantation is widely used to treat adults with high-risk acute lymphoblastic leukemia. The aim of this study was to analyze whether the results changed over time and to identify prognostic factors. Adult patients treated between 1993 and 2012 with myeloablative allogeneic hematopoietic cell transplantation from HLA matched sibling (n=2681) or unrelated (n=2178) donors in first complete remission were included. For transplantations from sibling donors performed between 2008 and 2012, 2-year probabilities of overall survival were: 76% (18–25 years old), 69% (26–35 and 36–45 years old) and 60% (46–55 years old). Among recipients of transplantations from unrelated donors, the respective survival rates were 66%, 70%, 61%, and 62%. In comparison with the 1993–2007 period, significant improvements were observed for all age groups except for the 26–35-year old patients. In a multivariate model, transplantations performed between 2008 and 2012, when compared to 1993–2007, were associated with significantly reduced risks of non-relapse mortality (Hazard Ratio 0.77, P=0.00006), relapse (Hazard Ratio 0.85, P=0.007), treatment failure (Hazard Ratio 0.81, P<0.00001), and overall mortality (Hazard Ratio 0.79, P<0.00001). In the analysis restricted to transplantations performed between 2008 and 2012, the use of total body irradiation-based conditioning was associated with reduced risk of relapse (Hazard Ratio 0.48, P=0.004) and treatment failure (Hazard Ratio 0.63, P=0.02). We conclude that results of allogeneic hematopoietic cell transplantation for adults with acute lymphoblastic leukemia improved significantly over time. Total body irradiation should be considered as the preferable type of myeloablative conditioning. PMID:27686376
Predicting risk of cancer during HIV infection: the role of inflammatory and coagulation biomarkers.
Borges, Álvaro H; Silverberg, Michael J; Wentworth, Deborah; Grulich, Andrew E; Fätkenheuer, Gerd; Mitsuyasu, Ronald; Tambussi, Giuseppe; Sabin, Caroline A; Neaton, James D; Lundgren, Jens D
2013-06-01
To investigate the relationship between inflammatory [interleukin-6 (IL-6) and C-reactive protein (CRP)] and coagulation (D-dimer) biomarkers and cancer risk during HIV infection. A prospective cohort. HIV-infected patients on continuous antiretroviral therapy (ART) in the control arms of three randomized trials (N=5023) were included in an analysis of predictors of cancer (any type, infection-related or infection-unrelated). Hazard ratios for IL-6, CRP and D-dimer levels (log2-transformed) were calculated using Cox models stratified by trial and adjusted for demographics and CD4+ cell counts and adjusted also for all biomarkers simultaneously. To assess the possibility that biomarker levels were elevated at entry due to undiagnosed cancer, analyses were repeated excluding early cancer events (i.e. diagnosed during first 2 years of follow-up). During approximately 24,000 person-years of follow-up (PYFU), 172 patients developed cancer (70 infection-related; 102 infection-unrelated). The risk of developing cancer was associated with higher levels (per doubling) of IL-6 (hazard ratio 1.38, P<0.001), CRP (hazard ratio 1.16, P=0.001) and D-dimer (hazard ratio 1.17, P=0.03). However, only IL-6 (hazard ratio 1.29, P=0.003) remained associated with cancer risk when all biomarkers were considered simultaneously. Results for infection-related and infection-unrelated cancers were similar to results for any cancer. Hazard ratios excluding 69 early cancer events were 1.31 (P=0.007), 1.14 (P=0.02) and 1.07 (P=0.49) for IL-6, CRP and D-dimer, respectively. Activated inflammation and coagulation pathways are associated with increased cancer risk during HIV infection. This association was stronger for IL-6 and persisted after excluding early cancer. Trials of interventions may be warranted to assess whether cancer risk can be reduced by lowering IL-6 levels in HIV-positive individuals.
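The per-doubling hazard ratios above come from entering log2-transformed biomarker values into Cox models stratified by trial. A minimal sketch of that kind of modelling step, using the Python lifelines library; all column names (time_to_cancer, cancer, trial, il6, crp, ddimer, age, cd4) are hypothetical, not the trial data.

```python
# Sketch: hazard ratios per doubling of a biomarker from a Cox model stratified
# by trial. Column names are hypothetical; the data file is assumed to exist.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("biomarkers.csv")

# log2-transform so that exp(coef) is the hazard ratio per doubling of the biomarker
for col in ["il6", "crp", "ddimer"]:
    df["log2_" + col] = np.log2(df[col])

cph = CoxPHFitter()
cph.fit(
    df[["time_to_cancer", "cancer", "trial", "log2_il6", "log2_crp",
        "log2_ddimer", "age", "cd4"]],
    duration_col="time_to_cancer",
    event_col="cancer",
    strata=["trial"],   # separate baseline hazard for each parent trial
)
cph.print_summary()     # the exp(coef) column gives the per-doubling hazard ratios
```

Fitting all three log2-transformed biomarkers in the same model is what yields mutually adjusted per-doubling hazard ratios, analogous to the simultaneous adjustment described above.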
Matarraz, Sergio; Leoz, Pilar; Fernández, Carlos; Colado, Enrique; Chillón, María Carmen; Vidriales, María Belén; González, Marcos; Rivera, Daniel; Osuna, Carlos Salvador; Caballero-Velázquez, Teresa; Van Der Velden, Vincent; Jongen-Lavrencic, Mojca; Gutiérrez, Oliver; Bermejo, Ana Yeguas; Alonso, Luis García; García, Monique Bourgeois; De Ramón Sánchez, Cristina; García-Donas, Gloria; Mateo, Aránzazu García; Recio, Isabel; Sánchez-Real, Javier; Mayado, Andrea; Gutiérrez, María Laura; Bárcena, Paloma; Barrena, Susana; López, Antonio; Van Dongen, Jacques; Orfao, Alberto
2018-03-23
Severe hemorrhagic events occur in a significant fraction of acute promyelocytic leukemia patients, at presentation and/or early after starting therapy, leading to treatment failure and early deaths. However, identification of independent predictors of a high risk of severe bleeding at diagnosis remains a challenge. Here, we investigated the immunophenotype of bone marrow leukemic cells from 109 newly diagnosed acute promyelocytic leukemia patients, particularly focusing on the identification of basophil-related features and their potential association with severe bleeding episodes and patient overall survival. From all phenotypes investigated on leukemic cells, expression of the CD203c and/or CD22 basophil-associated markers showed the strongest association with the occurrence and severity of bleeding (p ≤ 0.007); moreover, aberrant expression of CD7, coexpression of CD34+/CD7+ and lack of CD71 were also more frequently found among patients with (mild and severe) bleeding at baseline and/or after starting treatment (p ≤ 0.009). Multivariate analysis showed that CD203c expression (hazard ratio: 26.4; p = 0.003) and older age (hazard ratio: 5.4; p = 0.03) were the best independent predictors for cumulative incidence of severe bleeding after starting therapy. In addition, CD203c expression on leukemic cells (hazard ratio: 4.4; p = 0.01), low fibrinogen levels (hazard ratio: 8.8; p = 0.001), older age (hazard ratio: 9.0; p = 0.002), and high leukocyte count (hazard ratio: 5.6; p = 0.02) were the most informative independent predictors for overall survival. In summary, our results show that the presence of basophil-associated phenotypic characteristics on leukemic cells from acute promyelocytic leukemia patients at diagnosis is a powerful independent predictor for severe bleeding and overall survival, which might contribute in the future to (early) risk-adapted therapy decisions.
Rasouli, B; Ahlbom, A; Andersson, T; Grill, V; Midthjell, K; Olsson, L; Carlsson, S
2013-01-01
We investigated the influence of different aspects of alcohol consumption on the risk of Type 2 diabetes and autoimmune diabetes in adults. We used data from the Nord-Trøndelag Health Survey (HUNT) study, in which all adults aged ≥ 20 years from Nord-Trondelag County were invited to participate in three surveys in 1984-1986, 1995-1997 and 2006-2008. Patients with diabetes were identified using self-reports, and participants with onset age ≥ 35 years were classified as having Type 2 diabetes if they were negative for anti-glutamic acid decarboxylase (n = 1841) and as having autoimmune diabetes if they were positive for anti-glutamic acid decarboxylase (n = 140). Hazard ratios of amount and frequency of alcohol use, alcoholic beverage choice, and binge drinking and alcohol use disorders were estimated. Moderate alcohol consumption (adjusted for confounders) was associated with a reduced risk of Type 2 diabetes in men, but not in women (hazard ratio for men 10-15 g/day 0.48, 95% CI 0.28-0.77; hazard ratio for women ≥ 10 g/day 0.81, 95% CI 0.33-1.96). The reduced risk was primarily linked to consumption of wine [hazard ratio 0.93, 95% CI 0.87-0.99 (per g/day)]. No increased risk was seen in participants reporting binge drinking or in problem drinkers. The results were also compatible with a reduced risk of autoimmune diabetes associated with alcohol consumption [hazard ratio 0.70, 95% CI 0.45-1.08 (frequent consumption) and hazard ratio 0.36, 95% CI 0.13-0.97 (2-7 g/day)]. Moderate alcohol consumption associates with reduced risk of both Type 2 diabetes and autoimmune diabetes. A protective effect of alcohol intake may be limited to men. High alcohol consumption does not seem to carry an increased risk of diabetes. © 2012 The Authors. Diabetic Medicine © 2012 Diabetes UK.
Cancer Survival Estimates Due to Non-Uniform Loss to Follow-Up and Non-Proportional Hazards
K M, Jagathnath Krishna; Mathew, Aleyamma; Sara George, Preethi
2017-06-25
Background: Cancer survival estimates depend on loss to follow-up (LFU) and non-proportional hazards (non-PH). If LFU is high, survival will be over-estimated. If a hazard is non-PH, rank tests will provide biased inference and the Cox model will provide a biased hazard ratio. We assessed the bias due to LFU and a non-PH factor in cancer survival and provided alternate methods for unbiased inference and hazard ratios. Materials and Methods: Kaplan-Meier survival curves were plotted using a realistic breast cancer (BC) data-set with >40% 5-year LFU and compared with another BC data-set with <15% 5-year LFU to assess the bias in survival due to high LFU. Age at diagnosis of the latter data set was used to illustrate the bias due to a non-PH factor. The log-rank test was employed to assess the bias in the p-value and the Cox model was used to assess the bias in the hazard ratio for the non-PH factor. The Schoenfeld statistic was used to test the non-PH of age. For the non-PH factor, we employed the Renyi statistic for inference and a time-dependent Cox model for the hazard ratio. Results: Five-year BC survival was 69% (SE: 1.1%) vs. 90% (SE: 0.7%) for data with low vs. high LFU, respectively. Age (<45, 46-54 & >54 years) was a non-PH factor (p-value: 0.036). Survival by age was significant using the log-rank test (p-value: 0.026), but not significant using the Renyi statistic (p=0.067). The hazard ratio (HR) for age using the Cox model was 1.012 (95% CI: 1.004-1.019), and the same using the time-dependent Cox model was in the other direction (HR: 0.997; 95% CI: 0.997-0.998). Conclusion: Over-estimated survival was observed for cancer with high LFU. The log-rank statistic and Cox model provided biased results for the non-PH factor. For data with non-PH factors, the Renyi statistic and time-dependent Cox model can be used as alternate methods to obtain unbiased inference and estimates. Creative Commons Attribution License
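The Schoenfeld-residual diagnostic described above can be run directly in Python with the lifelines library; the Renyi (supremum) statistic itself is not part of that library, so the sketch below covers only the proportional-hazards check. Column names (time, event, age_group) and the data file are assumptions for illustration.

```python
# Sketch: Schoenfeld-residual test of the proportional-hazards assumption for a
# Cox model. Column names are hypothetical; the data frame is assumed to contain
# only time, event, and age_group.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

df = pd.read_csv("breast_cancer.csv")   # assumed columns: time, event, age_group

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")

# A small p-value flags a non-proportional hazard for that covariate
results = proportional_hazard_test(cph, df, time_transform="rank")
results.print_summary()

# check_assumptions() wraps the same test and prints suggested remedies
# (stratification, time-varying effects) when violations are found
cph.check_assumptions(df, p_value_threshold=0.05)
```

When the check fails, a time-dependent coefficient (as the abstract does with a time-dependent Cox model) or stratification on the offending covariate are the usual remedies.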
Guertler, Diana; Vandelanotte, Corneel; Kirwan, Morwenna; Duncan, Mitch J
2015-07-15
Data from controlled trials indicate that Web-based interventions generally suffer from low engagement and high attrition. This is important because the level of exposure to intervention content is linked to intervention effectiveness. However, data from real-life Web-based behavior change interventions are scarce, especially when looking at physical activity promotion. The aims of this study were to (1) examine the engagement with the freely available physical activity promotion program 10,000 Steps, (2) examine how the use of a smartphone app may be helpful in increasing engagement with the intervention and in decreasing nonusage attrition, and (3) identify sociodemographic- and engagement-related determinants of nonusage attrition. Users (N=16,948) were grouped based on which platform (website, app) they logged their physical activity: Web only, app only, or Web and app. Groups were compared on sociodemographics and engagement parameters (duration of usage, number of individual and workplace challenges started, and number of physical activity log days) using ANOVA and chi-square tests. For a subsample of users that had been members for at least 3 months (n=11,651), Kaplan-Meier survival curves were estimated to plot attrition over the first 3 months after registration. A Cox regression model was used to determine predictors of nonusage attrition. In the overall sample, user groups differed significantly in all sociodemographics and engagement parameters. Engagement with the program was highest for Web-and-app users. In the subsample, 50.00% (5826/11,651) of users stopped logging physical activity through the program after 30 days. Cox regression showed that user group predicted nonusage attrition: Web-and-app users (hazard ratio=0.86, 95% CI 0.81-0.93, P<.001) and app-only users (hazard ratio=0.63, 95% CI 0.58-0.68, P<.001) showed a reduced attrition risk compared to Web-only users. Further, having a higher number of individual challenges (hazard ratio=0.62, 95% CI 0.59-0.66, P<.001), workplace challenges (hazard ratio=0.94, 95% CI 0.90-0.97, P<.001), physical activity logging days (hazard ratio=0.921, 95% CI 0.919-0.922, P<.001), and steps logged per day (hazard ratio=0.99999, 95% CI 0.99998-0.99999, P<.001) were associated with reduced nonusage attrition risk as well as older age (hazard ratio=0.992, 95% CI 0.991-0.994, P<.001), being male (hazard ratio=0.85, 95% CI 0.82-0.89, P<.001), and being non-Australian (hazard ratio=0.87, 95% CI 0.82-0.91, P<.001). Compared to other freely accessible Web-based health behavior interventions, the 10,000 Steps program showed high engagement. The use of an app alone or in addition to the website can enhance program engagement and reduce risk of attrition. Better understanding of participant reasons for reducing engagement can assist in clarifying how to best address this issue to maximize behavior change.
Saulnier, Pierre-Jean; Gand, Elise; Velho, Gilberto; Mohammedi, Kamel; Zaoui, Philippe; Fraty, Mathilde; Halimi, Jean Michel; Roussel, Ronan; Ragot, Stéphanie; Hadjadj, Samy
2017-03-01
We explored the prognostic value of three circulating candidate biomarkers (midregional proadrenomedullin [MR-proADM], soluble tumor necrosis factor receptor 1 [sTNFR1], and N-terminal prohormone brain natriuretic peptide [NT-proBNP]) for change in renal function in patients with type 2 diabetes. Outcomes were defined as renal function loss (RFL), a ≥40% decline of estimated glomerular filtration rate (eGFR) from baseline, and rapid renal function decline (RRFD), an absolute annual eGFR slope <-5 mL/min/year. We used a proportional hazards model for RFL and a logistic model for RRFD. Adjustments were performed for established risk factors (age, sex, diabetes duration, HbA1c, blood pressure, baseline eGFR, and urinary albumin-to-creatinine ratio [uACR]). C-statistics were used to assess the incremental predictive value of the biomarkers to these risk factors. Among 1,135 participants (mean eGFR 76 mL/min, median uACR 2.6 mg/mmol, and median GFR slope -1.6 mL/min/year), RFL occurred in 397, RRFD developed in 233, and 292 died during follow-up. Each biomarker predicted RFL and RRFD. When combined, MR-proADM, sTNFR1, and NT-proBNP predicted RFL independently from the established risk factors (adjusted hazard ratio 1.59 [95% CI 1.34-1.89], P < 0.0001; 1.33 [1.14-1.55], P = 0.0003; and 1.22 [1.07-1.40], P = 0.004, respectively) and RRFD (adjusted odds ratio 1.56 [95% CI 1.7-2.09], P = 0.003; 1.72 [1.33-2.22], P < 0.0001; and 1.28 [1.03-1.59], P = 0.02, respectively). The combination of the three biomarkers yielded the highest discrimination (difference in C-statistic = 0.054, P < 0.0001; 0.067, P < 0.0001 for RFL; and 0.027, P < 0.0001 for RRFD). In addition to established risk factors, MR-proADM, sTNFR1, and NT-proBNP improve risk prediction of loss of renal function in patients with type 2 diabetes. © 2017 by the American Diabetes Association.
Health Insurance Trajectories and Long-Term Survival After Heart Transplantation.
Tumin, Dmitry; Foraker, Randi E; Smith, Sakima; Tobias, Joseph D; Hayes, Don
2016-09-01
Health insurance status at heart transplantation influences recipient survival, but implications of change in insurance for long-term outcomes are unclear. Adults aged 18 to 64 receiving first-time orthotopic heart transplants between July 2006 and December 2013 were identified in the United Network for Organ Sharing registry. Patients surviving >1 year were categorized according to trajectory of insurance status (private compared with public) at wait listing, transplantation, and 1-year follow-up. The most common insurance trajectories were continuous private coverage (44%), continuous public coverage (27%), and transition from private to public coverage (11%). Among patients who survived to 1 year (n=9088), continuous public insurance (hazard ratio =1.36; 95% confidence interval 1.19, 1.56; P<0.001) and transition from private to public insurance (hazard ratio =1.25; 95% confidence interval 1.04, 1.50; P=0.017) were associated with increased mortality hazard relative to continuous private insurance. Supplementary analyses of 11 247 patients included all durations of post-transplant survival and examined post-transplant private-to-public and public-to-private transitions as time-varying covariates. In these analyses, transition from private to public insurance was associated with increased mortality hazard (hazard ratio =1.25; 95% confidence interval 1.07, 1.47; P=0.005), whereas transition from public to private insurance was associated with lower mortality hazard (hazard ratio =0.78; 95% confidence interval 0.62, 0.97; P=0.024). Transition from private to public insurance after heart transplantation is associated with worse long-term outcomes, compounding disparities in post-transplant survival attributed to insurance status at transplantation. By contrast, post-transplant gain of private insurance among patients receiving publicly funded heart transplants was associated with improved outcomes. © 2016 American Heart Association, Inc.
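Handling a covariate that changes value during follow-up, such as the post-transplant insurance transitions analysed above, requires the data in start-stop (counting-process) form. The sketch below builds a small synthetic data set of that shape and fits it with lifelines' CoxTimeVaryingFitter; the column names, switch times, and simulated effect size are illustrative assumptions, not the registry analysis itself.

```python
# Sketch: a time-varying binary covariate (e.g., a switch from private to public
# coverage) in start-stop format, fitted with a time-varying Cox model.
# Entirely synthetic data; column names and hazards are invented.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for i in range(300):
    age = float(rng.integers(25, 65))
    switch = rng.uniform(3, 36) if rng.random() < 0.5 else np.inf  # month of switch
    h_private, h_public = 1 / 80.0, 1.6 / 80.0                     # monthly hazards
    t = rng.exponential(1 / h_private)
    if t > switch:                     # survived to the switch, then the new hazard applies
        t = switch + rng.exponential(1 / h_public)
    event = int(t <= 60.0)             # administrative censoring at 60 months
    t = min(t, 60.0)
    if switch < t:                     # two intervals: before and after the switch
        rows.append((i, 0.0, switch, 0, age, 0))
        rows.append((i, switch, t, 1, age, event))
    else:                              # covariate never changed before exit
        rows.append((i, 0.0, t, 0, age, event))

long_df = pd.DataFrame(
    rows, columns=["id", "start", "stop", "public_insurance", "age", "died"]
)

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="died", start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for public_insurance estimates the post-switch hazard ratio
```

The key design choice is that each row covers an interval over which the covariate is constant and the event indicator is set only on a subject's final interval, which is what lets the fitter update risk sets as covariates change.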
May, Heidi T; Nelson, John R; Lirette, Seth T; Kulkarni, Krishnaji R; Anderson, Jeffrey L; Griswold, Michael E; Horne, Benjamin D; Correa, Adolfo; Muhlestein, Joseph B
2016-05-01
Dyslipidemia plays a significant role in the progression of cardiovascular disease. The apolipoprotein (apo) A1 remnant ratio (apo A1/VLDL3-C + IDL-C) has recently been shown to be a strong predictor of death/myocardial infarction risk among women >50 years undergoing angiography. However, whether this ratio is associated with coronary heart disease risk among other populations is unknown. We evaluated the apo A1 remnant ratio and its components for coronary heart disease incidence. Observational. Participants (N = 4722) of the Jackson Heart Study were evaluated. Baseline clinical characteristics and lipoprotein subfractions (Vertical Auto Profile method) were collected. Cox hazard regression analysis, adjusted by standard cardiovascular risk factors, was utilized to determine associations of lipoproteins with coronary heart disease. Those with new-onset coronary heart disease were older, diabetic, smokers, had less education, used more lipid-lowering medication, and had a more atherogenic lipoprotein profile. After adjustment, the apo A1 remnant ratio (hazard ratio = 0.67 per 1-SD, p = 0.002) was strongly associated with coronary heart disease incidence. This association appears to be driven by the IDL-C denominator (hazard ratio = 1.23 per 1-SD, p = 0.007). Remnants (hazard ratio = 1.21 per 1-SD, p = 0.017), but not apo A1 (hazard ratio = 0.85 per 1-SD, p = 0.121) or VLDL3-C (hazard ratio = 1.13 per 1-SD, p = 0.120) were associated with coronary heart disease. Standard lipids were not associated with coronary heart disease incidence. We found the apo A1 remnant ratio to be strongly associated with coronary heart disease. This ratio appears to better stratify risk than standard lipids, apo A1, and remnants among a primary prevention cohort of African Americans. Its utility requires further study as a lipoprotein management target for risk reduction. © The European Society of Cardiology 2015.
The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single component of a chemical mixture drives the cumulative risk of a receptor.1 This study used the MCR, the Hazard Index (HI) and Hazard Quotient (HQ) to evaluate co-exposures to six phthalates using biomonito...
Memory Hazard Functions: A Vehicle for Theory Development and Test
ERIC Educational Resources Information Center
Chechile, Richard A.
2006-01-01
A framework is developed to rigorously test an entire class of memory retention functions by examining hazard properties. Evidence is provided that the memory hazard function is not monotonically decreasing. Yet most of the proposals for retention functions, which have emerged from the psychological literature, imply that memory hazard is…
Is Functional Independence Associated With Improved Long-Term Survival After Lung Transplantation?
Osho, Asishana; Mulvihill, Michael; Lamba, Nayan; Hirji, Sameer; Yerokun, Babatunde; Bishawi, Muath; Spencer, Philip; Panda, Nikhil; Villavicencio, Mauricio; Hartwig, Matthew
2018-07-01
Existing research demonstrates superior short-term outcomes (length of stay, 1-year survival) after lung transplantation in patients with preoperative functional independence. The aim of this study was to determine whether advantages remain significant in the long-term. The United Network for Organ Sharing database was queried for adult, first-time, isolated lung transplantation records from January 2005 to December 2015. Stratification was performed based on Karnofsky Performance Status Score (3 groups) and on employment at the time of transplantation (2 groups). Kaplan-Meier and Cox analyses were performed to determine the association between these factors and survival in the long-term. Of 16,497 patients meeting criteria, 1,581 (9.6%) were almost completely independent at the time of transplant vs 5,662 (34.3%) who were disabled (completely reliant on others for activities of daily living). Cox models adjusting for recipient, donor, and transplant factors demonstrated a statistically significant association between disability at the time of transplant and long-term death (hazard ratio, 1.26; 95% confidence interval, 1.14 to 1.40; p < 0.001). There were 15,931 patients with available data on paid employment at the time of transplantation. Multivariable analysis demonstrated a statistically significant association between employment at the time of transplantation and death (hazard ratio, 0.86; 95% confidence interval, 0.75 to 0.91; p < 0.001). Preoperative functional independence and maintenance of employment are associated with superior long-term outcomes in lung recipients. The results highlight potential benefits of pretransplant functional rehabilitation for patients on the waiting list for lungs. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Romito, Giovanni; Guglielmini, Carlo; Diana, Alessia; Pelle, Nazzareno G.; Contiero, Barbara; Cipone, Mario
2018-01-01
Background The prognostic relevance of left atrial (LA) morphological and functional variables, including those derived from speckle tracking echocardiography (STE), has been little investigated in veterinary medicine. Objectives To assess the prognostic value of several echocardiographic variables, with a focus on LA morphological and functional variables in dogs with myxomatous mitral valve disease (MMVD). Animals One‐hundred and fifteen dogs of different breeds with MMVD. Methods Prospective cohort study. Conventional morphologic and echo‐Doppler variables, LA areas and volumes, and STE‐based LA strain analysis were performed in all dogs. A survival analysis was performed to test for the best echocardiographic predictors of cardiac‐related death. Results Most of the tested variables, including all LA STE‐derived variables were univariate predictors of cardiac death in Cox proportional hazard analysis. Because of strong correlation between many variables, only left atrium to aorta ratio (LA/Ao > 1.7), mitral valve E wave velocity (MV E vel > 1.3 m/s), LA maximal volume (LAVmax > 3.53 mL/kg), peak atrial longitudinal strain (PALS < 30%), and contraction strain index (CSI per 1% increase) were entered in the univariate analysis, and all were predictors of cardiac death. However, only the MV E vel (hazard ratio [HR], 4.45; confidence interval [CI], 1.76‐11.24; P < .001) and LAVmax (HR, 2.32; CI, 1.10‐4.89; P = .024) remained statistically significant in the multivariable analysis. Conclusions and Clinical Importance The assessment of LA dimension and function provides useful prognostic information in dogs with MMVD. Considering all the LA variables, LAVmax appears the strongest predictor of cardiac death, being superior to LA/Ao and STE‐derived variables. PMID:29572938
Anema, Johannes R; Steenstra, Ivan A; Bongers, Paulien M; de Vet, Henrica C W; Knol, Dirk L; Loisel, Patrick; van Mechelen, Willem
2007-02-01
Population-based randomized controlled trial. To assess the effectiveness of workplace intervention and graded activity, separately and combined, for multidisciplinary rehabilitation of low back pain (LBP). Effective components for multidisciplinary rehabilitation of LBP are not yet established. Participants sick-listed 2 to 6 weeks due to nonspecific LBP were randomized to workplace intervention (n = 96) or usual care (n = 100). Workplace intervention consisted of workplace assessment, work modifications, and case management involving all stakeholders. Participants still sick-listed at 8 weeks were randomized for graded activity (n = 55) or usual care (n = 57). Graded activity comprised biweekly 1-hour exercise sessions based on operant-conditioning principles. Outcomes were lasting return to work, pain intensity and functional status, assessed at baseline, and at 12, 26, and 52 weeks after the start of sick leave. Time until return to work for workers with workplace intervention was 77 versus 104 days (median) for workers without this intervention (P = 0.02). Workplace intervention was effective on return to work (hazard ratio = 1.7; 95% CI, 1.2-2.3; P = 0.002). Graded activity had a negative effect on return to work (hazard ratio = 0.4; 95% CI, 0.3-0.6; P < 0.001) and functional status. Combined intervention had no effect. Workplace intervention is advised for multidisciplinary rehabilitation of subacute LBP. Graded activity or combined intervention is not advised.
Lerma, Claudia; Gorelick, Alexander; Ghanem, Raja N; Glass, Leon; Huikuri, Heikki V
2013-09-01
To identify potential new markers for assessing the risk of sudden arrhythmic events based on a method that captures features of premature ventricular complexes (PVCs) in relation to sinus RR intervals in Holter recordings (heartprint). Holter recordings obtained 6 weeks after acute myocardial infarction from 227 patients with reduced ventricular function (left ventricular ejection fraction ≤ 40%) were used to produce heartprints. Measured indices were: PVCs per hour, standard deviation of coupling interval (SDCI), and the number of occurrences of the most prevalent form of PVCs (SNIB). Predictive values, survival analysis, and Cox regression with adjustment for clinical variables were performed based on primary endpoint, defined as an electrocardiogram-documented fatal or near-fatal arrhythmic event, death from any cause, and cardiac death. High ectopy (PVCs per hour ≥10) was a predictor of all endpoints. Repeating forms of PVCs (SNIB ≥ 83) was a predictor of primary endpoint, hazard ratio = 3.5 (1.3-9.5), and all-cause death, hazard ratio = 2.8 (1.1-7.3), but not cardiac death. SDCI ≤ 80 ms was a predictor of all-cause death and cardiac death, but not of primary endpoint. High ectopy, prevalence of repeating forms of PVCs, and low coupling interval variability are potentially useful risk markers of fatal or near-fatal arrhythmias after myocardial infarction.
Ji, W H; Jiang, Y H; Ji, Y L; Li, B; Mao, W M
2016-07-01
The study aimed to evaluate the prognostic significance of prechemotherapy neutrophil to lymphocyte ratio and platelet to lymphocyte ratio, and preoperative neutrophil to lymphocyte ratio and platelet to lymphocyte ratio in locally advanced esophageal squamous cell cancer. We retrospectively analyzed locally advanced esophageal squamous cell cancer patients who had received neoadjuvant chemotherapy before undergoing a radical esophagectomy between 2009 and 2012. Neutrophil to lymphocyte ratio and platelet to lymphocyte ratio before chemotherapy and before the surgery were calculated. Univariate analyses showed that prechemotherapy neutrophil to lymphocyte ratio >5 (P = 0.048, hazard ratio = 2.86; 95% confidence interval: 1.01-8.12) and prechemotherapy platelet to lymphocyte ratio >130 (P = 0.025, hazard ratio = 5.50; 95% confidence interval: 1.23-24.55) were associated significantly with overall survival (OS), and prechemotherapy platelet to lymphocyte ratio >130 (P = 0.026, hazard ratio = 3.18; 95% confidence interval: 1.15-8.85) was associated significantly with progression-free survival. However, only prechemotherapy neutrophil to lymphocyte ratio >5 (P = 0.024, hazard ratio = 3.50; 95% confidence interval: 1.18-10.40) remained significantly associated with OS in multivariate analyses. Neither preoperative neutrophil to lymphocyte ratio nor platelet to lymphocyte ratio was associated with OS or progression-free survival. The prechemotherapy neutrophil to lymphocyte ratio >5 to preoperative neutrophil to lymphocyte ratio ≤5 group showed significantly worse OS than the prechemotherapy neutrophil to lymphocyte ratio ≤5 to preoperative neutrophil to lymphocyte ratio ≤5 group (P = 0.050). The prechemotherapy platelet to lymphocyte ratio >130 to preoperative platelet to lymphocyte ratio ≤130 group (P = 0.016) and platelet to lymphocyte ratio >130 to preoperative platelet to lymphocyte ratio >130 group (P = 0.042) showed significantly worse OS than the prechemotherapy platelet to lymphocyte ratio ≤130 to preoperative platelet to lymphocyte ratio ≤130 group. In conclusion, prechemotherapy neutrophil to lymphocyte ratio is an independent prognostic factor for OS in patients with advanced esophageal squamous cell cancer treated with neoadjuvant chemotherapy, and, as an adverse prognostic predictor, increased prechemotherapy neutrophil to lymphocyte ratio is superior to platelet to lymphocyte ratio. Maintaining a low neutrophil to lymphocyte ratio and platelet to lymphocyte ratio throughout treatment is a predictor of better OS. © 2015 International Society for Diseases of the Esophagus.
2013-01-01
Background Designs and analyses of clinical trials with a time-to-event outcome almost invariably rely on the hazard ratio to estimate the treatment effect and implicitly, therefore, on the proportional hazards assumption. However, the results of some recent trials indicate that there is no guarantee that the assumption will hold. Here, we describe the use of the restricted mean survival time as a possible alternative tool in the design and analysis of these trials. Methods The restricted mean is a measure of average survival from time 0 to a specified time point, and may be estimated as the area under the survival curve up to that point. We consider the design of such trials according to a wide range of possible survival distributions in the control and research arm(s). The distributions are conveniently defined as piecewise exponential distributions and can be specified through piecewise constant hazards and time-fixed or time-dependent hazard ratios. Such designs can embody proportional or non-proportional hazards of the treatment effect. Results We demonstrate the use of restricted mean survival time and a test of the difference in restricted means as an alternative measure of treatment effect. We support the approach through the results of simulation studies and in real examples from several cancer trials. We illustrate the required sample size under proportional and non-proportional hazards, also the significance level and power of the proposed test. Values are compared with those from the standard approach which utilizes the logrank test. Conclusions We conclude that the hazard ratio cannot be recommended as a general measure of the treatment effect in a randomized controlled trial, nor is it always appropriate when designing a trial. Restricted mean survival time may provide a practical way forward and deserves greater attention. PMID:24314264
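Because the restricted mean survival time described above is simply the area under the Kaplan-Meier curve up to a chosen horizon t*, it can be computed without specialised machinery. The following self-contained sketch uses synthetic exponential survival times, not data from the trials discussed in the abstract.

```python
# Sketch: restricted mean survival time (RMST) as the area under the
# Kaplan-Meier step function up to a truncation time t_star (synthetic data).
import numpy as np

def kaplan_meier(time, event):
    """Return distinct event times and the KM survival estimate just after each."""
    uniq = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(time >= t)
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk
        surv.append(s)
    return uniq, np.array(surv)

def rmst(time, event, t_star):
    """Area under the KM step function from 0 to t_star."""
    times, surv = kaplan_meier(time, event)
    grid = np.concatenate(([0.0], times[times < t_star], [t_star]))
    values = np.concatenate(([1.0], surv[times < t_star]))  # S(t)=1 before first event
    return float(np.sum(values * np.diff(grid)))

rng = np.random.default_rng(1)
t_ctrl = rng.exponential(12, 200)          # control arm, mean 12 months
t_trt = rng.exponential(18, 200)           # research arm, mean 18 months
c = rng.uniform(0, 36, 200)                # random censoring times

time_c, event_c = np.minimum(t_ctrl, c), (t_ctrl <= c).astype(int)
time_t, event_t = np.minimum(t_trt, c), (t_trt <= c).astype(int)

print("RMST control (0-24 months):", round(rmst(time_c, event_c, 24.0), 2))
print("RMST treated (0-24 months):", round(rmst(time_t, event_t, 24.0), 2))
print("Difference in RMST:",
      round(rmst(time_t, event_t, 24.0) - rmst(time_c, event_c, 24.0), 2))
```

The between-arm difference in RMST is interpretable as months of survival gained up to t*, which is what makes it attractive when hazards are non-proportional.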
Imamura, Fumiaki; Lichtenstein, Alice H; Dallal, Gerard E; Meigs, James B; Jacques, Paul F
2009-07-01
The ability to interpret epidemiologic observations is limited because of potential residual confounding by correlated dietary components. Dietary pattern analyses by factor analysis or partial least squares may overcome the limitation. To examine confounding by dietary pattern as well as standard risk factors and selected nutrients, the authors modeled the longitudinal association between alcohol consumption and 7-year risk of type 2 diabetes mellitus in 2,879 healthy adults enrolled in the Framingham Offspring Study (1991-2001) by Cox proportional hazard models. After adjustment for standard risk factors, consumers of > or =9.0 drinks/week had a significantly lower risk of type 2 diabetes mellitus compared with abstainers (hazard ratio = 0.47, 95% confidence interval (CI): 0.27, 0.81). Adjustment for selected nutrients had little effect on the hazard ratio, whereas adjustment for dietary pattern variables by factor analysis significantly shifted the hazard ratio away from null (hazard ratio = 0.33, 95% CI: 0.17, 0.64) by 40.0% (95% CI: 16.8, 57.0; P = 0.002). Dietary pattern variables by partial least squares showed similar results. Therefore, the observed inverse association, consistent with past studies, was confounded by dietary patterns, and this confounding was not captured by individual nutrient adjustment. The data suggest that alcohol intake, not dietary patterns associated with alcohol intake, is responsible for the observed inverse association with type 2 diabetes mellitus risk.
Bajwa, Ednan K; Yu, Chu-Ling; Gong, Michelle N; Thompson, B Taylor; Christiani, David C
2007-05-01
Pre-B-cell colony-enhancing factor (PBEF) levels are elevated in bronchoalveolar lavage fluid and serum of patients with acute lung injury. There are several suspected functional polymorphisms of the corresponding PBEF gene. We hypothesized that variations in PBEF gene polymorphisms alter the risk of developing acute respiratory distress syndrome (ARDS). Nested case-control study. Tertiary academic medical center. We studied 375 patients with ARDS and 787 at-risk controls genotyped for the PBEF T-1001G and C-1543T polymorphisms. Patients with the -1001G (variant) allele had significantly greater odds of developing ARDS than wild-type homozygotes (odds ratio, 1.35; 95% confidence interval, 1.02-1.78). Patients with the -1543T (variant) allele did not have significantly different odds of developing ARDS than wild-type homozygotes (odds ratio, 0.86; 95% confidence interval, 0.65-1.13). When analysis was stratified by ARDS risk factor, -1543T was associated with decreased odds of developing ARDS in septic shock patients (odds ratio, 0.66; 95% confidence interval, 0.45-0.97). Also, -1001G was associated with increased hazard of intensive care unit mortality, whereas -1543T was associated with decreased hazard of 28-day and 60-day ARDS mortality, as well as shorter duration of mechanical ventilation. Similar results were found in analyses of the related GC (-1001G:-1543C) and TT (-1001T:-1543T) haplotypes. The PBEF T-1001G variant allele and related haplotype are associated with increased odds of developing ARDS and increased hazard of intensive care unit mortality among at-risk patients, whereas the C-1543T variant allele and related haplotype are associated with decreased odds of ARDS among patients with septic shock and better outcomes among patients with ARDS.
Aspergillus sensitization or carriage in cystic fibrosis patients.
Fillaux, Judith; Brémont, François; Murris, Marlène; Cassaing, Sophie; Tétu, Laurent; Segonds, Christine; Pipy, Bernard; Magnaval, Jean-François
2014-07-01
Aspergillus fumigatus (Af) sensitization and persistent carriage are deleterious to lung function, but no consensus has been reached defining these medical entities. This work aimed to identify possible predictive factors for patients who become sensitized to Af, compared with a control group of non-sensitized Af carriers. Between 1995 and 2007, 117 pediatric patients were evaluated. Demographic data, CFTR gene mutations, body mass index and FEV1 were recorded. The presence of Af in sputum and the levels of Af-precipitin, total IgE (t-IgE) and specific IgE to Af (Af-IgE) were determined. Patients were divided into 2 groups: (1) "sensitization": Af-IgE level > 0.35 IU/mL with t-IgE level < 500 IU/mL and (2) "persistent or transient carriage": Af-IgE level ≤ 0.35 IU/mL with either a transient or persistent positive Af culture. A survival analysis was performed with the appearance of Af-IgE in serum as the outcome variable. Severe mutation (hazard ratio = 3.2), FEV1 baseline over 70% of the theoretical value (hazard ratio = 4.9), absence of Pa colonization, catalase activity and previous azithromycin administration (hazard ratio = 9.8, 4.1 and 1.9, respectively) were predictive factors for sensitization. We propose a timeline of the biological events and a tree diagram for risk calculation. Two profiles of cystic fibrosis patients can be envisaged: (1) patients with a nonsevere mutation but a low FEV1 baseline, who become colonized with Af, or (2) patients with a severe mutation and a high FEV1 baseline, who are more susceptible to Af sensitization and, subsequently, to an allergic bronchopulmonary aspergillosis event.
Knight, Richard J; Graviss, Edward A; Nguyen, Duc T; Kuten, Samantha A; Patel, Samir J; Gaber, Lillian; Gaber, A Osama
2018-04-19
We sought to determine whether conversion from tacrolimus/mycophenolate mofetil (TAC-MMF) into tacrolimus/mTOR inhibitor (TAC-mTOR) immunosuppression would reduce the incidences of BK and CMV viremia after kidney/pancreas (KP) transplantation. In this single-center review, the TAC-mTOR cohort (n = 39) was converted at 1 month post-transplant to an mTOR inhibitor and reduced-dose tacrolimus. Outcomes were compared to a cohort of KP recipients (n = 40) maintained on TAC-MMF. At 3 years post-transplant, KP survivals and incidences of kidney/pancreas rejection were equivalent between mTOR and MMF-treated cohorts. (P = ns). BK viremia-free survival was better for the mTOR vs MMF-treated group (P = .004). In multivariate analysis, MMF vs mTOR immunosuppression was an independent risk factor for BK viremia (hazard ratio 12.27, P = .02). Similarly, mTOR-treated recipients displayed better CMV infection-free survival compared to the MMF-treated cohort (P = .01). MMF vs mTOR immunosuppression (hazard ratio 18.77, P = .001) and older recipient age (hazard ratio 1.13 per year, P = .006) were independent risk factors for CMV viremia. Mean estimated GFR and HgbA1c levels were equivalent between groups at 1, 2, and 3 years post-transplantation. Conversion from TAC/MMF into TAC/mTOR immunosuppression after KP transplantation reduced the incidences of BK and CMV viremia with an equivalent risk of acute rejection and similar renal/pancreas function. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Past Decline Versus Current eGFR and Subsequent ESRD Risk.
Kovesdy, Csaba P; Coresh, Josef; Ballew, Shoshana H; Woodward, Mark; Levin, Adeera; Naimark, David M J; Nally, Joseph; Rothenbacher, Dietrich; Stengel, Benedicte; Iseki, Kunitoshi; Matsushita, Kunihiro; Levey, Andrew S
2016-08-01
eGFR is a robust predictor of ESRD risk. However, the prognostic information gained from the past trajectory (slope) beyond that of the current eGFR is unclear. We examined 22 cohorts to determine the association of past slopes and current eGFR level with subsequent ESRD. We modeled hazard ratios as a spline function of slopes, adjusting for demographic variables, eGFR, and comorbidities. We used random effects meta-analyses to combine results across studies stratified by cohort type. We calculated the absolute risk of ESRD at 5 years after the last eGFR using the weighted average baseline risk. Overall, 1,080,223 participants experienced 5163 ESRD events during a mean follow-up of 2.0 years. In CKD cohorts, a slope of -6 versus 0 ml/min per 1.73 m(2) per year over the previous 3 years (a decline of 18 ml/min per 1.73 m(2) versus no decline) associated with an adjusted hazard ratio of ESRD of 2.28 (95% confidence interval, 1.88 to 2.76). In contrast, a current eGFR of 30 versus 50 ml/min per 1.73 m(2) (a difference of 20 ml/min per 1.73 m(2)) associated with an adjusted hazard ratio of 19.9 (95% confidence interval, 13.6 to 29.1). Past decline contributed more to the absolute risk of ESRD at lower than higher levels of current eGFR. In conclusion, during a follow-up of 2 years, current eGFR associates more strongly with future ESRD risk than the magnitude of past eGFR decline, but both contribute substantially to the risk of ESRD, especially at eGFR<30 ml/min per 1.73 m(2). Copyright © 2016 by the American Society of Nephrology.
Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach
van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.
2015-01-01
Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focusses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding, current, wave impacts, and erosion related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October, 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0.17 when only one hazard is considered and a score of 0.37 when multiple hazards are considered simultaneously. The LHIs with the most predictive skill were ‘Inundation depth’ and ‘Wave attack’. The Bayesian Network approach has several advantages over the market-standard stage-damage functions: the predictive capacity of multiple indicators can be combined; probabilistic predictions can be obtained, which include uncertainty; and quantitative as well as descriptive information can be used simultaneously.
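As a rough illustration of the coupling step described above, the sketch below estimates a conditional probability table for a single damage node given two discretized hazard indicators. It is a deliberately simplified, fully synthetic stand-in for the paper's Bayesian Network, which used more indicators plus building characteristics; all variable names, bin edges, and data are invented.

```python
# Sketch: a one-node "damage given discretized hazard indicators" table,
# a drastically simplified stand-in for the Bayesian Network described above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 5000
inund = rng.gamma(2.0, 0.6, n)     # synthetic inundation depth (m)
wave = rng.gamma(1.5, 0.4, n)      # synthetic wave-attack proxy

# synthetic "observed" damage class, more likely severe when both hazards are high
p_severe = 1 / (1 + np.exp(-(1.2 * inund + 1.5 * wave - 3.0)))
damage = np.where(rng.random(n) < p_severe, "major", "minor")

df = pd.DataFrame({
    "inund_bin": pd.cut(inund, [0, 0.5, 1.5, np.inf], labels=["low", "med", "high"]),
    "wave_bin": pd.cut(wave, [0, 0.3, 0.8, np.inf], labels=["low", "med", "high"]),
    "damage": damage,
})

# Conditional probability table P(damage | inundation bin, wave bin)
cpt = (
    df.groupby(["inund_bin", "wave_bin"], observed=True)["damage"]
      .value_counts(normalize=True)
      .rename("probability")
      .reset_index()
)
print(cpt)

# Probabilistic "prediction" for a building exposed to high inundation and medium waves
print(cpt[(cpt.inund_bin == "high") & (cpt.wave_bin == "med")][["damage", "probability"]])
```

The output is a probability distribution over damage classes rather than a single deterministic stage-damage value, which is the essential difference the abstract highlights.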
Sun, Ming-Hui; Liao, Yaping Joyce; Lin, Che-Chen; Chiang, Rayleigh Ping-Ying; Wei, James Cheng-Chung
2018-04-26
Obstructive sleep apnea (OSA) is associated with many systemic diseases including diabetes, hypertension, stroke, and cardiovascular disease. The aim of our study was to investigate the association between OSA and optic neuropathy (ON), and to evaluate the efficacy of treatment for OSA on the risk of ON. We used data from the Longitudinal Health Insurance Database, which involved one million insurants from the Taiwan National Health Insurance program (Taiwan NHI). OSA patients had a 1.95-fold higher risk of ON compared with non-OSA patients across all age groups. The risk was significantly higher in the group aged <45 years (adjusted hazard ratio: 4.21) and in male individuals (adjusted hazard ratio: 1.93). Meanwhile, sleep apnea was associated with ON regardless of the presence of comorbidity. OSA patients treated with continuous positive airway pressure (CPAP) had an adjusted 2.31-fold higher hazard of developing ON compared to controls, and those without any treatment had an adjusted 1.82-fold higher hazard of developing ON compared to controls. Moreover, ON patients had a 1.45-fold higher risk of OSA, with the highest risk in those aged between 45 and 64 years (hazard ratio: 1.76) and in male individuals (hazard ratio: 1.55). Our study showed that OSA increased the risk of developing ON after controlling for comorbidities; however, treatment with CPAP did not reduce the risk of ON. Further large population-based studies with access to medical records on the severity and treatment of OSA are needed to clarify the efficacy of OSA treatment in reducing the risk of ON.
Potanas, Christopher P; Padgett, Sheldon; Gamblin, Rance M
2015-04-15
Objective: To identify variables associated with prognosis in dogs undergoing surgical excision of anal sac apocrine gland adenocarcinomas (ASACs) with and without adjunctive chemotherapy. Design: Retrospective case series. Animals: 42 dogs with ASACs. Procedures: Information on signalment, clinical signs, diagnostic procedures, surgical procedures, adjunctive therapies, survival time, and disease-free interval was obtained from the medical records. Results: Survival time was significantly associated with the presence of sublumbar lymphadenopathy and sublumbar lymph node extirpation, with median survival time significantly shorter for dogs with sublumbar lymphadenopathy (hazard ratio, 2.31) than for those without and for dogs that underwent lymph node extirpation (hazard ratio, 2.31) than for those that did not. Disease-free interval was significantly associated with the presence of sublumbar lymphadenopathy, lymph node extirpation, and administration of platinum-containing chemotherapeutic agents, with median disease-free interval significantly shorter for dogs with sublumbar lymphadenopathy (hazard ratio, 2.47) than for those without, for dogs that underwent lymph node extirpation (hazard ratio, 2.47) than for those that did not, and for dogs that received platinum-containing chemotherapeutic agents (hazard ratio, 2.69) than for those that did not. Survival time and disease-free interval did not differ among groups when dogs were grouped on the basis of histopathologic margins (complete vs marginal vs incomplete excision). Conclusions and Clinical Relevance: Results suggested that in dogs with ASAC undergoing surgical excision, the presence of sublumbar lymphadenopathy and lymph node extirpation were both negative prognostic factors. However, completeness of surgical excision was not associated with survival time or disease-free interval.
Sood, Anshuman; Hakim, David N; Hakim, Nadey S
2016-04-01
The prevalence of obesity is increasing rapidly and globally, yet systematic reviews on this topic are scarce. Our meta-analysis and systematic review aimed to assess how obesity affects 5 postoperative outcomes: biopsy-proven acute rejection, patient death, allograft loss, type 2 diabetes mellitus after transplant, and delayed graft function. We evaluated peer-reviewed literature from 22 medical databases. Studies were included if they were conducted in accordance with the Meta-analysis of Observational Studies in Epidemiology criteria, only examined postoperative outcomes in adult patients, only examined the relation between recipient obesity at time of transplant and our 5 postoperative outcomes, and had a minimum score of > 5 stars on the Newcastle-Ottawa scale for nonrandomized studies. Reliable conclusions were ensured by having our studies examined against 2 internationally known scoring systems. Obesity was defined in accordance with the World Health Organization as having a body mass index of > 30 kg/m(2). All obese recipients were compared versus "healthy" recipients (body mass index of 18.5-24.9 kg/m(2)). Hazard ratios were calculated for biopsy-proven acute rejection, patient death, allograft loss, and type 2 diabetes mellitus after transplant. An odds ratio was calculated for delayed graft function. We assessed 21 retrospective observational studies in our meta-analysis (N = 241 381 patients). In obese transplant recipients, hazard ratios were 1.51 (95% confidence interval, 1.24-1.78) for presence of biopsy-proven acute rejection, 1.19 (95% confidence interval, 1.10-1.31) for patient death, 1.54 (95% confidence interval, 1.38-1.68) for allograft loss, and 1.01 (95% confidence interval, 0.98-1.07) for development of type 2 diabetes mellitus. The odds ratio for delayed graft function was 1.81 (95% confidence interval, 1.51-2.13). Our meta-analysis clearly demonstrated greater risks for obese renal transplant recipients and poorer postoperative outcomes with obesity. We confidently recommend renal transplant candidates seek medically supervised weight loss before transplant.
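Pooled hazard ratios of the kind reported above are conventionally obtained by inverse-variance weighting of the log hazard ratios, with each study's standard error recovered from its confidence interval. A generic fixed-effect sketch follows; the numbers are invented and are not the studies in this review.

```python
# Sketch: fixed-effect, inverse-variance pooling of log hazard ratios, with each
# study's standard error recovered from its 95% CI. Illustrative numbers only.
import numpy as np

# (hazard ratio, lower 95% limit, upper 95% limit) for hypothetical studies
studies = np.array([
    [1.40, 1.10, 1.78],
    [1.62, 1.21, 2.17],
    [1.35, 0.98, 1.86],
    [1.55, 1.30, 1.85],
])

log_hr = np.log(studies[:, 0])
se = (np.log(studies[:, 2]) - np.log(studies[:, 1])) / (2 * 1.96)  # from CI width
w = 1.0 / se**2                                                    # inverse-variance weights

pooled_log_hr = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp(pooled_log_hr + np.array([-1.96, 1.96]) * pooled_se)

print(f"Pooled HR = {np.exp(pooled_log_hr):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

A random-effects pooling would additionally estimate between-study heterogeneity and widen the weights accordingly; the fixed-effect version above is the simplest case.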
Hardy, Dale S; Stallings, Devita T; Garvin, Jane T; Xu, Hongyan; Racette, Susan B
2017-01-01
To determine which anthropometric measures are the strongest discriminators of incident type 2 diabetes (T2DM) among White and Black males and females in a large U.S. cohort. We used Atherosclerosis Risk in Communities study data from 12,121 participants aged 45-64 years without diabetes at baseline who were followed for over 11 years. Anthropometric measures included a body shape index (ABSI), body adiposity index (BAI), body mass index (BMI), waist circumference (WC), waist to hip ratio (WHR), waist to height ratio (WHtR), and waist to hip to height ratio (WHHR). All anthropometric measures were repeated at each visit and converted to Z-scores. Hazard ratios and 95% confidence intervals adjusted for age were calculated using repeated measures Cox proportional hazards regression analysis. The Akaike Information Criterion (AIC) was used to select best-fit models. The magnitude of the hazard ratio effect sizes and the Harrell's C-indexes were used to rank the highest associations and discriminators, respectively. There were 1,359 incident diabetes cases. Higher values of all anthropometric measures increased the risk for development of T2DM (p < 0.0001) except ABSI, which was not significant in White and Black males. Statistically significant hazard ratios ranged from 1.26-1.63 for males and 1.15-1.88 for females. In general, the largest hazard ratios were those that corresponded to the highest Harrell's C-index and lowest AIC values. Among White and Black males and females, BMI, WC, WHR, and WHtR were comparable in discriminating cases from non-cases of T2DM. ABSI, BAI, and WHHR were inferior discriminators of incident T2DM across all race-gender groups. BMI, the most commonly used anthropometric measure, and three anthropometric measures that included waist circumference (i.e., WC, WHR, WHtR) were the best anthropometric discriminators of incident T2DM across all race-gender groups in the ARIC cohort.
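Harrell's C-index, used above to rank the measures as discriminators, can be computed from follow-up time, event status, and a model-based risk score. A hedged sketch with the lifelines library and invented column names (followup_years, diabetes, plus covariate Z-scores); lifelines also reports a concordance value in the Cox fit summary.

```python
# Sketch: Harrell's C-index for a fitted Cox model's risk score. Column names
# are hypothetical; higher C means better discrimination of incident cases.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

df = pd.read_csv("cohort_like.csv")   # assumed columns: followup_years, diabetes,
                                      # age, bmi_z, wc_z, whr_z (all numeric)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="diabetes")

# Partial hazard = exp(linear predictor). Negate it so that higher "scores"
# correspond to longer survival, which is what concordance_index expects.
risk = cph.predict_partial_hazard(df)
c_index = concordance_index(df["followup_years"], -risk, df["diabetes"])
print(f"Harrell's C-index: {c_index:.3f}")
```

Comparing the C-index of models built on different anthropometric Z-scores is the kind of head-to-head ranking described in the abstract.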
Load Adaptability in Patients With Pulmonary Arterial Hypertension.
Amsallem, Myriam; Boulate, David; Aymami, Marie; Guihaire, Julien; Selej, Mona; Huo, Jennie; Denault, Andre Y; McConnell, Michael V; Schnittger, Ingela; Fadel, Elie; Mercier, Olaf; Zamanian, Roham T; Haddad, Francois
2017-09-01
Right ventricular (RV) adaptation to pressure overload is a major prognostic factor in patients with pulmonary arterial hypertension (PAH). The objectives were first to define the relation between RV adaptation and load using allometric modeling, then to compare the prognostic value of different indices of load adaptability in PAH. Both a derivation (n = 85) and a validation cohort (n = 200) were included. Load adaptability was assessed using 3 approaches: (1) surrogates of ventriculo-arterial coupling (e.g., RV area change/end-systolic area), (2) simple ratios of function and load (e.g., tricuspid annular plane systolic excursion/right ventricular systolic pressure), and (3) indices assessing the proportionality of adaptation using allometric pressure-function or size modeling. Proportional hazards modeling was used to compare the hazard ratio for the outcome of death or lung transplantation. The mean age of the derivation cohort was 44 ± 11 years, with 80% female and 74% in New York Heart Association class III or IV. Mean pulmonary vascular resistance index (PVRI) was 24 ± 11 with a wide distribution (1.6 to 57.5 WU/m²). Allometric relations were observed between PVRI and RV fractional area change (R² = 0.53, p < 0.001) and RV end-systolic area indexed to body surface area (RVESAI) (R² = 0.29, p < 0.001), allowing the derivation of simple ratiometric load-specific indices of RV adaptation. Among right heart parameters, RVESAI was the strongest predictor of outcomes (hazard ratio per SD = 1.93, 95% confidence interval 1.37 to 2.75, p < 0.001). Although RVESAI/PVRI^0.35 provided small incremental discrimination on multivariate modeling, none of the load-adaptability indices provided stronger discrimination of outcome than simple RV adaptation metrics in either the derivation or the validation cohort. In conclusion, allometric modeling enables quantification of the proportionality of RV load adaptation but offers small incremental prognostic value to RV end-systolic dimension in PAH. Copyright © 2017 Elsevier Inc. All rights reserved.
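Allometric (power-law) relations of the kind fitted above, i.e. function proportional to load^b, reduce to straight-line fits on log-log axes. A generic sketch with synthetic values standing in for PVRI and RV fractional area change; the exponent, noise level, and sample size are invented for illustration.

```python
# Sketch: fitting an allometric relation y = a * x**b by linear regression on
# log-transformed values (synthetic stand-ins for load and RV function).
import numpy as np

rng = np.random.default_rng(3)
pvri = rng.uniform(2, 55, 120)                                    # synthetic load values
rvfac = 55.0 * pvri**-0.35 * np.exp(rng.normal(0, 0.15, 120))     # true exponent -0.35 + noise

b, log_a = np.polyfit(np.log(pvri), np.log(rvfac), 1)   # slope = allometric exponent
a = np.exp(log_a)

pred = a * pvri**b
ss_res = np.sum((np.log(rvfac) - np.log(pred)) ** 2)
ss_tot = np.sum((np.log(rvfac) - np.log(rvfac).mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"fitted exponent b = {b:.2f}, R^2 on the log scale = {r2:.2f}")
# A "load-specific" ratiometric index then follows as rvfac / pvri**b,
# analogous in spirit to the RVESAI/PVRI^0.35 index mentioned above.
```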
Fan, Fangfang; Yuan, Ziwen; Qin, Xianhui; Li, Jianping; Zhang, Yan; Li, Youbao; Yu, Tao; Ji, Meng; Ge, Junbo; Zheng, Meili; Yang, Xinchun; Bao, Huihui; Cheng, Xiaoshu; Gu, Dongfeng; Zhao, Dong; Wang, Jiguang; Sun, Ningling; Chen, Yundai; Wang, Hong; Wang, Xiaobin; Parati, Gianfranco; Hou, Fanfan; Xu, Xiping; Wang, Xian; Zhao, Gang; Huo, Yong
2017-04-01
We aimed to investigate the relationship of time-averaged on-treatment systolic blood pressure (SBP) with the risk of first stroke in the CSPPT (China Stroke Primary Prevention Trial). A post hoc analysis was conducted using data from 17 720 hypertensive adults without cardiovascular disease, diabetes mellitus, and renal function decline from the CSPPT, a randomized double-blind controlled trial. The primary outcome was first stroke. Over a median follow-up duration of 4.5 years, the association between time-averaged on-treatment SBP and risk of first stroke followed a U-shaped curve, with increased risk above and below the reference range of 120 to 130 mm Hg. Compared with participants with time-averaged on-treatment SBP at 120 to 130 mm Hg (mean, 126.2 mm Hg), the risk of first stroke was not only increased in participants with SBP at 130 to 135 mm Hg (mean, 132.6 mm Hg; 1.5% versus 0.8%; hazard ratio, 1.63; 95% confidence interval, 1.01-2.63) or 135 to 140 mm Hg (mean, 137.5 mm Hg; 1.9% versus 0.8%; hazard ratio, 1.85; 95% confidence interval, 1.17-2.93), but also increased in participants with SBP <120 mm Hg (mean, 116.7 mm Hg; 3.1% versus 0.8%; hazard ratio, 4.37; 95% confidence interval, 2.10-9.07). Similar results were found in various subgroups stratified by age, sex, and treatment group. Furthermore, lower diastolic blood pressure was associated with lower risk of stroke, with a plateau at a time-averaged on-treatment diastolic blood pressure <80 mm Hg. In conclusion, among adults with hypertension and without a history of stroke or myocardial infarction, diabetes mellitus, or renal function decline, a lower SBP goal of 120 to 130 mm Hg, as compared with a target SBP of 130 to 140 mm Hg or <120 mm Hg, resulted in the lowest risk of first stroke. © 2017 American Heart Association, Inc.
Prediction of functional loss in glaucoma from progressive optic disc damage.
Medeiros, Felipe A; Alencar, Luciana M; Zangwill, Linda M; Bowd, Christopher; Sample, Pamela A; Weinreb, Robert N
2009-10-01
To evaluate the ability of progressive optic disc damage detected by assessment of longitudinal stereophotographs to predict future development of functional loss in those with suspected glaucoma. The study included 639 eyes of 407 patients with suspected glaucoma followed up for an average of 8.0 years with annual standard automated perimetry visual field and optic disc stereophotographs. All patients had normal and reliable standard automated perimetry results at baseline. Conversion to glaucoma was defined as development of 3 consecutive abnormal visual fields during follow-up. Presence of progressive optic disc damage was evaluated by grading longitudinally acquired simultaneous stereophotographs. Other predictive factors included age, intraocular pressure, central corneal thickness, pattern standard deviation, and baseline stereophotograph grading. Hazard ratios for predicting visual field loss were obtained by extended Cox models, with optic disc progression as a time-dependent covariate. Predictive accuracy was evaluated using a modified R(2) index. Progressive optic disc damage had a hazard ratio of 25.8 (95% confidence interval, 16.0-41.7) and was the most important risk factor for development of visual field loss with an R(2) of 79%. The R(2)s for other predictive factors ranged from 6% to 26%. Presence of progressive optic disc damage on stereophotographs was a highly predictive factor for future development of functional loss in glaucoma. These findings suggest the importance of careful monitoring of the optic disc appearance and a potential role for longitudinal assessment of the optic disc as an end point in clinical trials and as a reference for evaluation of diagnostic tests in glaucoma.
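Treating progression as a time-dependent covariate, as in the extended Cox models above, requires the data in a long (start, stop) format. The snippet below is a generic lifelines illustration using its bundled Stanford heart transplant dataset; it is not the study's data or code, only a sketch of the modeling machinery.

```python
# Generic sketch of an extended Cox model with time-varying covariates (lifelines).
from lifelines import CoxTimeVaryingFitter
from lifelines.datasets import load_stanford_heart_transplants

df = load_stanford_heart_transplants()   # long format: id, start, stop, event + covariates

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # the exp(coef) column reports hazard ratios
```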
Lindsay, Alistair C; Harron, Katie; Jabbour, Richard J; Kanyal, Ritesh; Snow, Thomas M; Sawhney, Paramvir; Alpendurada, Francisco; Roughton, Michael; Pennell, Dudley J; Duncan, Alison; Di Mario, Carlo; Davies, Simon W; Mohiaddin, Raad H; Moat, Neil E
2016-07-01
Cardiovascular magnetic resonance (CMR) can provide important structural information in patients undergoing transcatheter aortic valve implantation. Although CMR is considered the standard of reference for measuring ventricular volumes and mass, the relationship between CMR findings of right ventricular (RV) function and outcomes after transcatheter aortic valve implantation has not previously been reported. A total of 190 patients underwent 1.5 Tesla CMR before transcatheter aortic valve implantation. Steady-state free precession sequences were used for aortic valve planimetry and to assess ventricular volumes and mass. Semiautomated image analysis was performed by 2 specialist reviewers blinded to patient treatment. Patient follow-up was obtained from the Office of National Statistics mortality database. The median age was 81.0 (interquartile range, 74.9-85.5) years; 50.0% were women. Impaired RV function (RV ejection fraction ≤50%) was present in 45 (23.7%) patients. Patients with RV dysfunction had poorer left ventricular ejection fractions (42% versus 69%), higher indexed left ventricular end-systolic volumes (96 versus 40 mL), and greater indexed left ventricular mass (101 versus 85 g/m(2); P<0.01 for all) than those with normal RV function. Median follow-up was 850 days; 21 of 45 (46.7%) patients with RV dysfunction died, compared with 43 of 145 (29.7%) patients with normal RV function (P=0.035). After adjustment for significant baseline variables, both RV ejection fraction ≤50% (hazard ratio, 2.12; P=0.017) and indexed aortic valve area (hazard ratio, 4.16; P=0.025) were independently associated with survival. RV function, measured on preprocedural CMR, is an independent predictor of mortality after transcatheter aortic valve implantation. CMR assessment of RV function may be important in the risk stratification of patients undergoing transcatheter aortic valve implantation. © 2016 American Heart Association, Inc.
Maroules, Christopher D; Rosero, Eric; Ayers, Colby; Peshock, Ronald M; Khera, Amit
2013-10-01
To determine the value of two abdominal aortic atherosclerosis measurements at magnetic resonance (MR) imaging for predicting future cardiovascular events. This study was approved by the institutional review board and complied with HIPAA regulations. The study consisted of 2122 participants from the multiethnic, population-based Dallas Heart Study who underwent abdominal aortic MR imaging at 1.5 T. Aortic atherosclerosis was measured by quantifying mean aortic wall thickness (MAWT) and aortic plaque burden. Participants were monitored for cardiovascular death, nonfatal cardiac events, and nonfatal extracardiac vascular events over a mean period of 7.8 years ± 1.5 (standard deviation [SD]). Cox proportional hazards regression was used to assess independent associations of aortic atherosclerosis and cardiovascular events. Increasing MAWT was positively associated with male sex (odds ratio, 3.66; P < .0001), current smoking (odds ratio, 2.53; P < .0001), 10-year increase in age (odds ratio, 2.24; P < .0001), and hypertension (odds ratio, 1.66; P = .0001). A total of 143 participants (6.7%) experienced a cardiovascular event. MAWT conferred an increased risk for composite events (hazard ratio, 1.28 per 1 SD; P = .001). Aortic plaque was not associated with increased risk for composite events. Increasing MAWT and aortic plaque burden both conferred an increased risk for nonfatal extracardiac events (hazard ratio of 1.52 per 1 SD [P < .001] and hazard ratio of 1.46 per 1 SD [P = .03], respectively). MR imaging measures of aortic atherosclerosis are predictive of future adverse cardiovascular events. © RSNA, 2013.
Hakeem, Abdul; Bhatti, Sabha; Dillie, Kathryn Sullivan; Cook, Jeffrey R; Samad, Zainab; Roth-Cline, Michelle D; Chang, Su Min
2008-12-09
Patients with chronic kidney disease (CKD) have worse cardiovascular outcomes than those without CKD. The prognostic utility of myocardial perfusion single-photon emission CT (MPS) in patients with varying degrees of renal dysfunction and the impact of CKD on cardiac death prediction in patients undergoing MPS have not been investigated. We followed up 1652 consecutive patients who underwent stress MPS (32% exercise, 95% gated) for cardiac death for a mean of 2.15 ± 0.8 years. MPS defects were defined with a summed stress score (normal: summed stress score <4; abnormal: summed stress score ≥4). Ischemia was defined as a summed stress score ≥4 plus a summed difference score ≥2, and scar was defined as a summed difference score <2 plus a summed stress score ≥4. Renal function was calculated with the Modification of Diet in Renal Disease equation. CKD (estimated glomerular filtration rate <60 mL/min/1.73 m²) was present in 36%. Cardiac death increased with worsening levels of perfusion defects across the entire spectrum of renal function. Presence of ischemia was independently predictive of cardiac death, all-cause mortality, and nonfatal myocardial infarction. Patients with normal MPS and CKD had higher unadjusted cardiac death event rates than those with no CKD and normal MPS (2.7% versus 0.8%, P=0.001). Multivariate Cox proportional hazards models revealed that both perfusion defects (hazard ratio 1.90, 95% CI 1.47 to 2.46) and CKD (hazard ratio 1.96, 95% CI 1.29 to 2.95) were independent predictors of cardiac death after accounting for risk factors, left ventricular dysfunction, pharmacological stress, and symptom status. Both MPS and CKD had incremental power for cardiac death prediction over baseline risk factors and left ventricular dysfunction (global chi-square 207.5 versus 169.3, P<0.0001). MPS provides effective risk stratification across the entire spectrum of renal function. Renal dysfunction is also an important independent predictor of cardiac death in patients undergoing MPS. Renal function and MPS have additive value in risk stratifying patients with suspected coronary artery disease. Patients with CKD appear to have a less benign prognosis than those without CKD, even in the presence of a normal scan.
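The "incremental power over baseline risk factors" comparison above is typically a likelihood ratio test between nested Cox models, with and without the added predictors. The sketch below shows that mechanic on simulated data with invented variable names; it is not the study's analysis.

```python
# Sketch: likelihood ratio test for the incremental value of added predictors in a Cox model.
import numpy as np
import pandas as pd
from scipy import stats
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1500
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "lv_dysfunction": rng.binomial(1, 0.2, n).astype(float),
    "ckd": rng.binomial(1, 0.36, n).astype(float),
    "perfusion_defect": rng.binomial(1, 0.3, n).astype(float),
})
risk = 0.03 * df["age"] + 0.6 * df["ckd"] + 0.6 * df["perfusion_defect"]
df["T"] = rng.exponential(10 * np.exp(-risk))      # shorter survival with higher risk (simulated)
df["E"] = rng.binomial(1, 0.5, n)                  # crude censoring indicator (simulated)

base = CoxPHFitter().fit(df[["age", "lv_dysfunction", "T", "E"]], "T", "E")
full = CoxPHFitter().fit(df, "T", "E")

lr_chi2 = 2 * (full.log_likelihood_ - base.log_likelihood_)
print(f"LR chi-square = {lr_chi2:.1f}, p = {stats.chi2.sf(lr_chi2, df=2):.2g}")  # 2 added predictors
```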
Dhruva, Sanket S; Huang, Chenxi; Spatz, Erica S; Coppi, Andreas C; Warner, Frederick; Li, Shu-Xia; Lin, Haiqun; Xu, Xiao; Furberg, Curt D; Davis, Barry R; Pressel, Sara L; Coifman, Ronald R; Krumholz, Harlan M
2017-07-01
Randomized trials of hypertension have seldom examined heterogeneity in response to treatments over time and the implications for cardiovascular outcomes. Understanding this heterogeneity, however, is a necessary step toward personalizing antihypertensive therapy. We applied trajectory-based modeling to data on 39 763 study participants of the ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) to identify distinct patterns of systolic blood pressure (SBP) response to randomized medications during the first 6 months of the trial. Two trajectory patterns were identified: immediate responders (85.5%), on average, had a decreasing SBP, whereas nonimmediate responders (14.5%), on average, had an initially increasing SBP followed by a decrease. Compared with those randomized to chlorthalidone, participants randomized to amlodipine (odds ratio, 1.20; 95% confidence interval [CI], 1.10-1.31), lisinopril (odds ratio, 1.88; 95% CI, 1.73-2.03), and doxazosin (odds ratio, 1.65; 95% CI, 1.52-1.78) had higher adjusted odds ratios associated with being a nonimmediate responder (versus immediate responder). After multivariable adjustment, nonimmediate responders had a higher hazard ratio of stroke (hazard ratio, 1.49; 95% CI, 1.21-1.84), combined cardiovascular disease (hazard ratio, 1.21; 95% CI, 1.11-1.31), and heart failure (hazard ratio, 1.48; 95% CI, 1.24-1.78) during follow-up between 6 months and 2 years. The SBP response trajectories provided superior discrimination for predicting downstream adverse cardiovascular events than classification based on difference in SBP between the first 2 measurements, SBP at 6 months, and average SBP during the first 6 months. Our findings demonstrate heterogeneity in response to antihypertensive therapies and show that chlorthalidone is associated with more favorable initial response than the other medications. © 2017 American Heart Association, Inc.
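The core of a trajectory analysis is grouping patients by the shape of their early response curves rather than by single measurements. As a loose stand-in for the formal group-based trajectory models used in the trial, the sketch below clusters simulated 6-month SBP curves with k-means; curve shapes, sample sizes, and cluster labels are all invented.

```python
# Crude illustration of grouping SBP response trajectories (k-means on simulated curves).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
months = np.arange(7)                                                       # visits at months 0..6
immediate = 150 - 4 * months + rng.normal(0, 3, (850, 7))                   # steady decline
delayed = 150 + 3 * months - 1.2 * months ** 2 + rng.normal(0, 3, (150, 7)) # rise, then fall
sbp_curves = np.vstack([immediate, delayed])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(sbp_curves)
for k in range(2):
    print(f"cluster {k}: n={np.sum(labels == k)}, mean curve={sbp_curves[labels == k].mean(axis=0).round(1)}")
```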
Dai, Jie; Liu, Ming; Swensen, Stephen J; Stoddard, Shawn M; Wampfler, Jason A; Limper, Andrew H; Jiang, Gening; Yang, Ping
2017-05-01
Pulmonary emphysema is a frequent comorbidity in lung cancer, but its role in tumor prognosis remains obscure. Our aim was to evaluate the impact of the regional emphysema score (RES) on a patient's overall survival, quality of life (QOL), and recovery of pulmonary function in stage I to II lung cancer. Between 1997 and 2009, a total of 1073 patients were identified and divided into two surgical groups-cancer in the emphysematous (group 1 [n = 565]) and nonemphysematous (group 2 [n = 435]) regions-and one nonsurgical group (group 3 [n = 73]). RES was derived from the emphysematous region and categorized as mild (≤5%), moderate (6%-24%), or severe (25%-60%). In group 1, patients with a moderate or severe RES experienced slight decreases in postoperative forced expiratory volume in 1 second, but increases in the ratio of forced expiratory volume in 1 second to forced vital capacity compared with those with a mild RES (p < 0.01); however, this correlation was not observed in group 2. Posttreatment QOL was lower in patients with higher RESs in all groups, mainly owing to dyspnea (p < 0.05). Cox regression analysis revealed that patients with a higher RES had significantly poorer survival in both surgical groups, with adjusted hazard ratios of 1.41 and 1.43 for a moderate RES and 1.63 and 2.04 for a severe RES, respectively; however, this association was insignificant in the nonsurgical group (adjusted hazard ratio of 0.99 for a moderate or severe RES). In surgically treated patients with cancer in the emphysematous region, RES is associated with postoperative changes in lung function. RES is also predictive of posttreatment QOL related to dyspnea in early-stage lung cancer. In both surgical groups, RES is an independent predictor of survival. Copyright © 2017 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
Germline PARP4 mutations in patients with primary thyroid and breast cancers.
Ikeda, Yuji; Kiyotani, Kazuma; Yew, Poh Yin; Kato, Taigo; Tamura, Kenji; Yap, Kai Lee; Nielsen, Sarah M; Mester, Jessica L; Eng, Charis; Nakamura, Yusuke; Grogan, Raymon H
2016-03-01
Germline mutations in the PTEN gene, which cause Cowden syndrome, are known to be one of the genetic factors for primary thyroid and breast cancers; however, PTEN mutations are found in only a small subset of research participants with non-syndrome breast and thyroid cancers. In this study, we aimed to identify germline variants that may be related to genetic risk of primary thyroid and breast cancers. Genomic DNAs extracted from peripheral blood of 14 PTEN wild-type female research participants with primary thyroid and breast cancers were analyzed by whole-exome sequencing. Gene-based case-control association analysis using data on 406 Europeans obtained from the 1000 Genomes Project database identified 34 genes possibly associated with the phenotype with P < 1.0 × 10(-3). Among them, rare variants in the PARP4 gene were detected at a significantly higher frequency (odds ratio = 5.2; P = 1.0 × 10(-5)). The variants, G496V and T1170I, were found in six of the 14 study participants (43%) while their frequencies were only 0.5% in controls. Functional analysis using the HCC1143 cell line showed that knockdown of PARP4 with siRNA significantly enhanced cell proliferation compared with cells transfected with siControl (P = 0.02). Kaplan-Meier analysis using Gene Expression Omnibus (GEO), European Genome-phenome Archive (EGA) and The Cancer Genome Atlas (TCGA) datasets showed poor relapse-free survival (P < 0.001, hazard ratio 1.27) and overall survival (P = 0.006, hazard ratio 1.41) in a PARP4 low-expression group, suggesting that PARP4 may function as a tumor suppressor. In conclusion, we identified PARP4 as a possible susceptibility gene for primary thyroid and breast cancer. © 2016 Society for Endocrinology.
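For a rough feel of the carrier-frequency comparison behind such a gene-based association, a 2x2 Fisher's exact test can be used. The counts below are invented for illustration and will not reproduce the published odds ratio, which came from a formal gene-based test against 1000 Genomes data.

```python
# Illustrative 2x2 carrier comparison with Fisher's exact test (invented counts).
from scipy.stats import fisher_exact

case_carriers, case_total = 6, 14          # variant carriers among affected participants
control_carriers, control_total = 4, 406   # variant carriers among reference controls

table = [[case_carriers, case_total - case_carriers],
         [control_carriers, control_total - control_carriers]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.1e}")
```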
Germline PARP4 mutations in patients with primary thyroid and breast cancers
Ikeda, Yuji; Kiyotani, Kazuma; Yew, Poh Yin; Kato, Taigo; Tamura, Kenji; Yap, Kai-Lee; Nielsen, Sarah M.; Mester, Jessica L; Eng, Charis; Nakamura, Yusuke; Grogan, Raymon H.
2016-01-01
Germline mutations in the PTEN gene, which cause Cowden syndrome (CS), are known to be one of the genetic factors for primary thyroid and breast cancers; however, PTEN mutations are found in only a small subset of research participants with non-syndrome breast and thyroid cancers. In this study, we aimed to identify germline variants that may be related to genetic risk of primary thyroid and breast cancers. Genomic DNAs extracted from peripheral blood of 14 PTEN-wild-type female research participants with primary thyroid and breast cancers were analyzed by whole-exome sequencing. Gene-based case-control association analysis using data on 406 Europeans obtained from the 1000 Genomes Project database identified 34 genes possibly associated with the phenotype with P<1.0×10−3. Among them, rare variants in the PARP4 gene were detected at a significantly higher frequency (odds ratio = 5.2, P = 1.0×10−5). The variants, G496V and T1170I, were found in 6 of the 14 study participants (43%) while their frequencies were only 0.5% in controls. Functional analysis using the HCC1143 cell line showed that knockdown of PARP4 with siRNA significantly enhanced cell proliferation compared with cells transfected with siControl (P = 0.02). Kaplan-Meier analysis using GEO, EGA and TCGA datasets showed poor progression-free survival (P = 0.006, hazard ratio 0.71) and overall survival (P < 0.0001, hazard ratio 0.79) in a PARP4 low-expression group, suggesting that PARP4 may function as a tumor suppressor. In conclusion, we identified PARP4 as a possible susceptibility gene for primary thyroid and breast cancer. PMID:26699384
Tomioka, Kimiko; Kurumatani, Norio; Hosoi, Hiroshi
2016-01-01
Background This study’s aim was to clarify the relationship of having hobbies and a purpose in life (PIL; in Japanese, ikigai) with mortality and a decline in the activities of daily living (ADL) and instrumental ADL (IADL) among the community-dwelling elderly. Methods Prospective observational data from residents aged ≥65 years who were at increased risk for death (n = 1853) and developing a decline in ADL (n = 1254) and IADL (n = 1162) were analyzed. Cox proportional hazard models were used for mortality analysis of data from February 2011 to November 2014. ADL and IADL were evaluated using the Barthel Index and the Tokyo Metropolitan Institute of Gerontology Index of Competence, respectively. ADL and IADL were assessed at baseline and follow-up and were evaluated using logistic regression models. Fully adjusted models included terms for age, gender, BMI, income, alcohol intake, smoking history, number of chronic diseases, cognitive function, and depression. Results During the follow-up of eligible participants, 248 had died, 119 saw a decline in ADL, and 178 saw a decline in IADL. In fully adjusted models, having neither hobbies nor PIL was significantly associated with an increased risk of mortality (hazard ratio 2.08; 95% confidence interval [CI], 1.47–2.94), decline in ADL (odds ratio 2.74; 95% CI, 1.44–5.21), and decline in IADL (odds ratio 1.89; 95% CI, 1.01–3.55) compared to having both hobbies and PIL. Conclusions Although effect modifications by cognitive functioning and depression cannot be ruled out, our findings suggest that having hobbies and PIL may extend not only longevity, but also healthy life expectancy among community-dwelling older adults. PMID:26947954
Tomioka, Kimiko; Kurumatani, Norio; Hosoi, Hiroshi
2016-07-05
This study's aim was to clarify the relationship of having hobbies and a purpose in life (PIL; in Japanese, ikigai) with mortality and a decline in the activities of daily living (ADL) and instrumental ADL (IADL) among the community-dwelling elderly. Prospective observational data from residents aged ≥65 years who were at increased risk for death (n = 1853) and developing a decline in ADL (n = 1254) and IADL (n = 1162) were analyzed. Cox proportional hazard models were used for mortality analysis of data from February 2011 to November 2014. ADL and IADL were evaluated using the Barthel Index and the Tokyo Metropolitan Institute of Gerontology Index of Competence, respectively. ADL and IADL were assessed at baseline and follow-up and were evaluated using logistic regression models. Fully adjusted models included terms for age, gender, BMI, income, alcohol intake, smoking history, number of chronic diseases, cognitive function, and depression. During the follow-up of eligible participants, 248 had died, 119 saw a decline in ADL, and 178 saw a decline in IADL. In fully adjusted models, having neither hobbies nor PIL was significantly associated with an increased risk of mortality (hazard ratio 2.08; 95% confidence interval [CI], 1.47-2.94), decline in ADL (odds ratio 2.74; 95% CI, 1.44-5.21), and decline in IADL (odds ratio 1.89; 95% CI, 1.01-3.55) compared to having both hobbies and PIL. Although effect modifications by cognitive functioning and depression cannot be ruled out, our findings suggest that having hobbies and PIL may extend not only longevity, but also healthy life expectancy among community-dwelling older adults.
Nishioka, Shinta; Okamoto, Takatsugu; Takayama, Masako; Urushihara, Maki; Watanabe, Misuzu; Kiriya, Yumiko; Shintani, Keiko; Nakagomi, Hiromi; Kageyama, Noriko
2017-08-01
Whether malnutrition risk correlates with recovery of swallowing function in convalescent stroke patients is unknown. This study was conducted to clarify whether malnutrition risk predicts achievement of full oral intake in convalescent stroke patients undergoing enteral nutrition. We conducted a secondary analysis of 466 convalescent stroke patients, aged 65 years or over, who were undergoing enteral nutrition. Patients were extracted from the "Algorithm for Post-stroke Patients to improve oral intake Level; APPLE" study database compiled at the Kaifukuki (convalescent) rehabilitation wards. Malnutrition risk was determined by the Geriatric Nutritional Risk Index as follows: severe (<82), moderate (82 to <92), mild (92 to <98), and no malnutrition risk (≥98). Swallowing function was assessed by Fujishima's swallowing grade (FSG) on admission and discharge. The primary outcome was achievement of full oral intake, indicated by FSG ≥ 7. Binary logistic regression analysis was performed to identify predictive factors, including malnutrition risk, for achieving full oral intake. Hazard ratios were estimated with a Cox proportional hazards model. Of the 466 individuals, 264 were ultimately included in this study. Participants with severe malnutrition risk achieved full oral intake at a significantly lower rate than less severe groups (P = 0.001). After adjusting for potential confounders, binary logistic regression analysis showed that patients with severe malnutrition risk were less likely to achieve full oral intake (adjusted odds ratio: 0.232, 95% confidence interval [95% CI]: 0.047-1.141). The Cox proportional hazards model revealed that severe malnutrition risk was an independent predictor of full oral intake (adjusted hazard ratio: 0.374, 95% CI: 0.166-0.842). Compared to patients who did not achieve full oral intake, patients who achieved full oral intake had significantly higher energy intake, but there was no difference in protein intake or weight change. Severe malnutrition risk independently predicts the achievement of full oral intake in convalescent stroke patients undergoing enteral nutrition. Copyright © 2016 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
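The GNRI cut points used above translate directly into a small categorization helper; the thresholds come from the abstract, while the function itself is only an illustration.

```python
# GNRI categorization per the thresholds quoted in the abstract.
def gnri_category(gnri: float) -> str:
    if gnri < 82:
        return "severe"
    if gnri < 92:
        return "moderate"
    if gnri < 98:
        return "mild"
    return "no malnutrition risk"

print([gnri_category(v) for v in (75.0, 85.0, 95.0, 102.0)])
# ['severe', 'moderate', 'mild', 'no malnutrition risk']
```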
Sickle Cell Trait, Rhabdomyolysis, and Mortality among U.S. Army Soldiers
Nelson, D. Alan; Deuster, Patricia A.; Carter, Robert; Hill, Owen T.; Wolcott, Vickee L.; Kurina, Lianne M.
2016-01-01
Background Studies have suggested that sickle cell trait elevates the risks of exertional rhabdomyolysis and death. We conducted a study of sickle cell trait in relation to these outcomes, controlling for known risk factors for exertional rhabdomyolysis, in a large population of active persons who had undergone laboratory tests for hemoglobin AS (HbAS) and who were subject to exertional-injury precautions. Methods We used Cox proportional-hazards models to test whether the risks of exertional rhabdomyolysis and death varied according to sickle cell trait status among 47,944 black soldiers who had undergone testing for HbAS and who were on active duty in the U.S. Army between January 2011 and December 2014. We used the Stanford Military Data Repository, which contains comprehensive medical and administrative data on all active-duty soldiers. Results There was no significant difference in the risk of death among soldiers with sickle cell trait, as compared with those without the trait (hazard ratio, 0.99; 95% confidence interval [CI], 0.46 to 2.13; P = 0.97), but the trait was associated with a significantly higher adjusted risk of exertional rhabdomyolysis (hazard ratio, 1.54; 95% CI, 1.12 to 2.12; P = 0.008). This effect was similar in magnitude to that associated with tobacco use, as compared with no use (hazard ratio, 1.54; 95% CI, 1.23 to 1.94; P<0.001), and to that associated with having a body-mass index (BMI; the weight in kilograms divided by the square of the height in meters) of 30.0 or more, as compared with a BMI of less than 25.0 (hazard ratio, 1.39; 95% CI, 1.04 to 1.86; P = 0.03). The effect was less than that associated with recent use of a statin, as compared with no use (hazard ratio, 2.89; 95% CI, 1.51 to 5.55; P = 0.001), or an antipsychotic agent (hazard ratio, 3.02; 95% CI, 1.34 to 6.82; P = 0.008). Conclusions Sickle cell trait was not associated with a higher risk of death than absence of the trait, but it was associated with a significantly higher risk of exertional rhabdomyolysis. (Funded by the National Heart, Lung, and Blood Institute and the Uniformed Services University of the Health Sciences.) PMID:27518662
Mori, Kentaro; Kaiho, Yu; Tomata, Yasutake; Narita, Mamoru; Tanji, Fumiya; Sugiyama, Kemmyo; Sugawara, Yumi; Tsuji, Ichiro
2017-04-01
To test the hypothesis that elderly persons who feel ikigai (a sense of life worth living) have a lower risk of incident functional disability than those who do not. Recent studies have suggested that ikigai has an impact on mortality. However, its impact upon disability is unknown. The aim of the present study was to investigate the association between ikigai and incident functional disability among elderly persons. We conducted a prospective cohort study of 830 Japanese elderly persons aged ≥70 years who underwent a comprehensive geriatric assessment in 2003. Information on ikigai was collected by self-reported questionnaire. Data on functional disability were retrieved from the public Long-term Care Insurance database, in which participants were followed up for 11 years. Hazard ratios (HRs) and 95% confidence intervals (CIs) for incidence of functional disability were calculated for three groups delineated according to the presence of ikigai (“no”, “uncertain” or “yes”) using the Cox proportional hazards regression model. The 11-year incidence of functional disability was 53.3% (442 cases). As compared with the “no” group, the multiple-adjusted HR (95% CI) of incident functional disability was 0.61 (0.36–1.02) for the “uncertain” group and 0.50 (0.30–0.84) for the “yes” group. A stronger degree of ikigai is significantly associated with a lower risk of incident functional disability. Copyright © 2017 Elsevier Inc. All rights reserved.
Matsuo, Koji; Machida, Hiroko; Horowitz, Max P; Shahzad, Mian M K; Guntupalli, Saketh R; Roman, Lynda D; Wright, Jason D
2017-11-01
While there is an increasing trend of ovarian conservation at the time of surgical treatment for young women with stage I cervical cancer, the risk for subsequent ovarian cancer after ovarian conservation has not been well studied. We sought to examine the incidence of and risk factors for metachronous ovarian cancer among young women with stage I cervical cancer who had ovarian conservation at the time of hysterectomy. The Surveillance, Epidemiology, and End Results Program was used to identify women aged <50 years who underwent hysterectomy with ovarian conservation for stage I cervical cancer from 1983 through 2013 (n = 4365). Time-dependent analysis was performed for ovarian cancer risk after cervical cancer diagnosis. Mean age at cervical cancer diagnosis was 37 years, and the majority of patients had stage IA disease (68.2%) and squamous histology (72.9%). Median follow-up time was 10.8 years, and there were 13 women who developed metachronous ovarian cancer. The 10- and 20-year cumulative incidences of metachronous ovarian cancer were 0.2% (95% confidence interval, 0.1-0.4) and 0.5% (95% confidence interval, 0.2-0.8), respectively. Mean age at the time of diagnosis of metachronous ovarian cancer was 47.5 years, and stage III-IV disease was seen in 55.6%. Age (≥45 vs <45 years, hazard ratio, 4.22; 95% confidence interval, 1.16-15.4; P = .018), ethnicity (non-white vs white, hazard ratio, 4.29; 95% confidence interval, 1.31-14.0; P = .009), cervical cancer histology (adenocarcinoma or adenosquamous vs squamous, hazard ratio, 3.50; 95% confidence interval, 1.17-10.5; P = .028), and adjuvant radiotherapy use (yes vs no, hazard ratio, 3.69; 95% confidence interval, 1.01-13.4; P = .034) were significantly associated with metachronous ovarian cancer risk. The presence of multiple risk factors was associated with a significantly increased risk of metachronous ovarian cancer compared to the no risk factor group: 1 risk factor (hazard ratio range, 2.96-8.43), 2 risk factors (hazard ratio range, 16.6-31.0), and 3-4 risk factors (hazard ratio range, 62.3-109), respectively. Metachronous ovarian cancer risk after ovarian conservation for women with stage I cervical cancer is <1%. Older age, non-white ethnicity, adenocarcinoma or adenosquamous histology, and adjuvant radiotherapy may be associated with an increased metachronous ovarian cancer risk. Copyright © 2017 Elsevier Inc. All rights reserved.
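Cumulative incidence figures like the 10- and 20-year estimates above are often read off as 1 minus a Kaplan-Meier survival estimate at the chosen horizon (ignoring competing risks, which a full analysis would handle). The sketch below does this on simulated data with lifelines; it is not the SEER analysis.

```python
# Sketch: horizon-specific cumulative incidence as 1 - S(t) from a Kaplan-Meier fit.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(4)
n = 4365
time_to_event = rng.exponential(4000, n)        # rare event: very long mean time (simulated)
censor_time = rng.uniform(1, 30, n)             # administrative censoring (simulated)
T = np.minimum(time_to_event, censor_time)
E = (time_to_event <= censor_time).astype(int)

kmf = KaplanMeierFitter().fit(T, event_observed=E)
for horizon in (10, 20):
    print(f"{horizon}-year cumulative incidence ~ {1 - kmf.predict(horizon):.2%}")
```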
Intensive blood glucose control and vascular outcomes in patients with type 2 diabetes.
Patel, Anushka; MacMahon, Stephen; Chalmers, John; Neal, Bruce; Billot, Laurent; Woodward, Mark; Marre, Michel; Cooper, Mark; Glasziou, Paul; Grobbee, Diederick; Hamet, Pavel; Harrap, Stephen; Heller, Simon; Liu, Lisheng; Mancia, Giuseppe; Mogensen, Carl Erik; Pan, Changyu; Poulter, Neil; Rodgers, Anthony; Williams, Bryan; Bompoint, Severine; de Galan, Bastiaan E; Joshi, Rohina; Travert, Florence
2008-06-12
In patients with type 2 diabetes, the effects of intensive glucose control on vascular outcomes remain uncertain. We randomly assigned 11,140 patients with type 2 diabetes to undergo either standard glucose control or intensive glucose control, defined as the use of gliclazide (modified release) plus other drugs as required to achieve a glycated hemoglobin value of 6.5% or less. Primary end points were composites of major macrovascular events (death from cardiovascular causes, nonfatal myocardial infarction, or nonfatal stroke) and major microvascular events (new or worsening nephropathy or retinopathy), assessed both jointly and separately. After a median of 5 years of follow-up, the mean glycated hemoglobin level was lower in the intensive-control group (6.5%) than in the standard-control group (7.3%). Intensive control reduced the incidence of combined major macrovascular and microvascular events (18.1%, vs. 20.0% with standard control; hazard ratio, 0.90; 95% confidence interval [CI], 0.82 to 0.98; P=0.01), as well as that of major microvascular events (9.4% vs. 10.9%; hazard ratio, 0.86; 95% CI, 0.77 to 0.97; P=0.01), primarily because of a reduction in the incidence of nephropathy (4.1% vs. 5.2%; hazard ratio, 0.79; 95% CI, 0.66 to 0.93; P=0.006), with no significant effect on retinopathy (P=0.50). There were no significant effects of the type of glucose control on major macrovascular events (hazard ratio with intensive control, 0.94; 95% CI, 0.84 to 1.06; P=0.32), death from cardiovascular causes (hazard ratio with intensive control, 0.88; 95% CI, 0.74 to 1.04; P=0.12), or death from any cause (hazard ratio with intensive control, 0.93; 95% CI, 0.83 to 1.06; P=0.28). Severe hypoglycemia, although uncommon, was more common in the intensive-control group (2.7%, vs. 1.5% in the standard-control group; hazard ratio, 1.86; 95% CI, 1.42 to 2.40; P<0.001). A strategy of intensive glucose control, involving gliclazide (modified release) and other drugs as required, that lowered the glycated hemoglobin value to 6.5% yielded a 10% relative reduction in the combined outcome of major macrovascular and microvascular events, primarily as a consequence of a 21% relative reduction in nephropathy. (ClinicalTrials.gov number, NCT00145925.) 2008 Massachusetts Medical Society
Nochioka, Kotaro; Biering-Sørensen, Tor; Hansen, Kim Wadt; Sørensen, Rikke; Pedersen, Sune; Jørgensen, Peter Godsk; Iversen, Allan; Shimokawa, Hiroaki; Jeger, Raban; Kaiser, Christoph; Pfisterer, Matthias; Galatius, Søren
2017-12-01
Rheumatologic disorders are characterised by inflammation and an increased risk of coronary artery disease (CAD). However, the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing percutaneous coronary intervention (PCI) is unknown. Thus, we aimed to examine the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing PCI. A post-hoc analysis was performed in 4605 patients (age: 63.3 ± 11.0 years; male: 76.6%) with ST-segment elevation myocardial infarction (STEMI; n = 1396), non-STEMI (n = 1541), and stable CAD (n = 1668) from the all-comer stent trials, the BAsel Stent Kosten-Effektivitäts Trial-PROspective Validation Examination (BASKET-PROVE) I and II trials. We evaluated the association between rheumatologic disorders and 2-year major adverse cardiac events (MACEs; cardiac death, nonfatal myocardial infarction (MI), and target vessel revascularisation (TVR)) by Cox regression analysis. Patients with rheumatologic disorders (n = 197) were older, more often female, had a higher prevalence of renal disease, multi-vessel coronary disease, and bifurcation lesions, and had longer total stent lengths. During the 2-year follow-up, the MACE rate was 8.6% in the total cohort. After adjustment for potential confounders, rheumatologic disorders were associated with MACEs in the total cohort (adjusted hazard ratio: 1.55; 95% confidence interval (CI): 1.04-2.31) driven by the STEMI subgroup (adjusted hazard ratio: 2.38; 95% CI: 1.26-4.51). In all patients, rheumatologic disorders were associated with all-cause death (adjusted hazard ratio: 2.05; 95% CI: 1.14-3.70), cardiac death (adjusted hazard ratio: 2.63; 95% CI: 1.27-5.43), and non-fatal MI (adjusted hazard ratio: 2.64; 95% CI: 1.36-5.13), but not with TVR (adjusted hazard ratio: 0.81; 95% CI: 0.41-1.58). The presence of rheumatologic disorders appears to be independently associated with worse outcome in CAD patients undergoing PCI. This calls for further studies and focus on this high-risk group of patients following PCI.
Prednisolone and Mycobacterium indicus pranii in Tuberculous Pericarditis
Mayosi, Bongani M; Ntsekhe, Mpiko; Bosch, Jackie; Pandie, Shaheen; Jung, Hyejung; Gumedze, Freedom; Pogue, Janice; Thabane, Lehana; Smieja, Marek; Francis, Veronica; Joldersma, Laura; Thomas, Kandithalal M.; Thomas, Baby; Awotedu, Abolade A.; Magula, Nombulelo P.; Naidoo, Datshana P.; Damasceno, Albertino; Banda, Alfred Chitsa; Brown, Basil; Manga, Pravin; Kirenga, Bruce; Mondo, Charles; Mntla, Phindile; Tsitsi, Jacob M.; Peters, Ferande; Essop, Mohammed R.; Russell, James B.W.; Hakim, James; Matenga, Jonathan; Barasa, Ayub F.; Sani, Mahmoud U.; Olunuga, Taiwo; Ogah, Okechukwu; Ansa, Victor; Aje, Akinyemi; Danbauchi, Solomon; Ojji, Dike; Yusuf, Salim
2016-01-01
BACKGROUND Tuberculous pericarditis is associated with high morbidity and mortality even if antituberculosis therapy is administered. We evaluated the effects of adjunctive glucocorticoid therapy and Mycobacterium indicus pranii immunotherapy in patients with tuberculous pericarditis. METHODS Using a 2-by-2 factorial design, we randomly assigned 1400 adults with definite or probable tuberculous pericarditis to either prednisolone or placebo for 6 weeks and to either M. indicus pranii or placebo, administered in five injections over the course of 3 months. Two thirds of the participants had concomitant human immunodeficiency virus (HIV) infection. The primary efficacy outcome was a composite of death, cardiac tamponade requiring pericardiocentesis, or constrictive pericarditis. RESULTS There was no significant difference in the primary outcome between patients who received prednisolone and those who received placebo (23.8% and 24.5%, respectively; hazard ratio, 0.95; 95% confidence interval [CI], 0.77 to 1.18; P = 0.66) or between those who received M. indicus pranii immunotherapy and those who received placebo (25.0% and 24.3%, respectively; hazard ratio, 1.03; 95% CI, 0.82 to 1.29; P = 0.81). Prednisolone therapy, as compared with placebo, was associated with significant reductions in the incidence of constrictive pericarditis (4.4% vs. 7.8%; hazard ratio, 0.56; 95% CI, 0.36 to 0.87; P = 0.009) and hospitalization (20.7% vs. 25.2%; hazard ratio, 0.79; 95% CI, 0.63 to 0.99; P = 0.04). Both prednisolone and M. indicus pranii, each as compared with placebo, were associated with a significant increase in the incidence of cancer (1.8% vs. 0.6%; hazard ratio, 3.27; 95% CI, 1.07 to 10.03; P = 0.03, and 1.8% vs. 0.5%; hazard ratio, 3.69; 95% CI, 1.03 to 13.24; P = 0.03, respectively), owing mainly to an increase in HIV-associated cancer. CONCLUSIONS In patients with tuberculous pericarditis, neither prednisolone nor M. indicus pranii had a significant effect on the composite of death, cardiac tamponade requiring pericardiocentesis, or constrictive pericarditis. (Funded by the Canadian Institutes of Health Research and others; IMPI ClinicalTrials.gov number, NCT00810849.) PMID:25178809
Long-term mortality risk and life expectancy following recurrent hypertensive disease of pregnancy.
Theilen, Lauren H; Meeks, Huong; Fraser, Alison; Esplin, M Sean; Smith, Ken R; Varner, Michael W
2018-04-07
Women with a history of hypertensive disease of pregnancy have increased risks for early mortality from multiple causes. The effect of recurrent hypertensive disease of pregnancy on mortality risk and life expectancy is unknown. We sought to determine whether recurrent hypertensive disease of pregnancy is associated with increased mortality risks. In this retrospective cohort study, we used birth certificate data to determine the number of pregnancies affected by hypertensive disease of pregnancy for each woman delivering in Utah from 1939 through 2012. We assigned women to 1 of 3 groups based on number of affected pregnancies: 0, 1, or ≥2. Exposed women had ≥1 affected singleton pregnancy and lived in Utah for ≥1 year postpartum. Exposed women were matched 1:2 to unexposed women by age, year of childbirth, and parity. Underlying cause of death was determined from death certificates. Mortality risks by underlying cause of death were compared between exposed and unexposed women as a function of number of affected pregnancies. Cox regressions controlled for infant sex, gestational age, parental education, ethnicity, and marital status. We identified 57,384 women with ≥1 affected pregnancy (49,598 women with 1 affected pregnancy and 7786 women with ≥2 affected pregnancies). These women were matched to 114,768 unexposed women. As of 2016, 11,894 women were deceased: 4722 (8.2%) exposed and 7172 (6.3%) unexposed. Women with ≥2 affected pregnancies had increased mortality from all causes (adjusted hazard ratio, 2.04; 95% confidence interval, 1.76-2.36), diabetes (adjusted hazard ratio, 4.33; 95% confidence interval, 2.21-8.47), ischemic heart disease (adjusted hazard ratio, 3.30; 95% confidence interval, 2.02-5.40), and stroke (adjusted hazard ratio, 5.10; 95% confidence interval, 2.62-9.92). For women whose index pregnancy delivered from 1939 through 1959 (n = 10,488), those with ≥2 affected pregnancies had shorter additional life expectancies than mothers who had only 1 or 0 hypertensive pregnancies (48.92 vs 51.91 vs 55.48 years, respectively). Hypertensive diseases of pregnancy are associated with excess risks for early all-cause mortality and some cause-specific mortality, and these risks increase further with recurrent disease. Copyright © 2018 Elsevier Inc. All rights reserved.
Pandhi, Jay; Gottdiener, John S.; Bartz, Traci M.; Kop, Willem J.; Mehra, Mandeep R.
2014-01-01
Although asymptomatic left ventricular (LV) systolic dysfunction (ALVSD) is common, its phenotype and prognosis for incident heart failure (HF) and mortality are insufficiently understood. Echocardiography was done in 5,649 participants in the Cardiovascular Health Study (age 73.0 ± 5.6 years, 57.6% women). The clinical characteristics and cardiovascular risk factors of the participants with ALVSD were compared to those with normal LV function (ejection fraction ≥55%) and with symptomatic LV systolic dysfunction (SLVSD; ejection fraction <55% and a history of HF). Cox proportional hazards models were used to estimate the risk of incident HF and mortality in those with ALVSD. Also, comparisons were made among the LV ejection fraction subgroups using previously validated cutoff values (<45% and 45% to 55%), adjusting for the demographic and cardiovascular disease risk factors. Those with ALVSD (7.3%) were more likely to have cardiovascular risk factors than those in the reference group (without LV dysfunction or symptomatic HF) but less likely than those with SLVSD. The HF rate was 24 occurrences per 1,000 person-years in the reference group and 57 occurrences per 1,000 person-years in those with ALVSD. The HF rate was 45 occurrences per 1,000 person-years for those with ALVSD and mildly impaired LV dysfunction and 93 occurrences per 1,000 person-years for those with ALVSD and moderate to severe LV dysfunction. The mortality rate was 51 deaths per 1,000 person-years in the reference group, 90 deaths per 1,000 person-years in the ALVSD group, and 156 deaths per 1,000 person-years in the SLVSD group. Adjusting for covariates, compared to the reference group, ALVSD was associated with an increased risk of incident HF (hazard ratio 1.60, 95% confidence interval 1.35 to 1.91), cardiovascular mortality (hazard ratio 2.13, 95% confidence interval 1.81 to 2.51), and all-cause mortality (hazard ratio 1.46, 95% confidence interval 1.29 to 1.64). In conclusion, subjects with ALVSD are characterized by a greater prevalence of cardiovascular risk factors and co-morbidities than those with normal LV function and without HF. However, the prevalence is lower than in those with SLVSD. Patients with ALVSD are at an increased risk of HF and mortality, particularly those with greater severity of LV impairment. PMID:21575752
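The "per 1,000 person-years" rates quoted above follow from a one-line calculation: events divided by total follow-up time, scaled to 1,000. A trivial helper, with invented numbers:

```python
# Incidence rate per 1,000 person-years (invented example numbers).
def rate_per_1000_person_years(events: int, person_years: float) -> float:
    return 1000.0 * events / person_years

print(f"{rate_per_1000_person_years(120, 5000):.0f} per 1,000 person-years")  # 24
```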
Pandhi, Jay; Gottdiener, John S; Bartz, Traci M; Kop, Willem J; Mehra, Mandeep R
2011-06-01
Although asymptomatic left ventricular (LV) systolic dysfunction (ALVSD) is common, its phenotype and prognosis for incident heart failure (HF) and mortality are insufficiently understood. Echocardiography was done in 5,649 participants in the Cardiovascular Health Study (age 73.0 ± 5.6 years, 57.6% women). The clinical characteristics and cardiovascular risk factors of the participants with ALVSD were compared to those with normal LV function (ejection fraction ≥55%) and with symptomatic LV systolic dysfunction (SLVSD; ejection fraction <55% and a history of HF). Cox proportional hazards models were used to estimate the risk of incident HF and mortality in those with ALVSD. Also, comparisons were made among the LV ejection fraction subgroups using previously validated cutoff values (<45% and 45% to 55%), adjusting for the demographic and cardiovascular disease risk factors. Those with ALVSD (7.3%) were more likely to have cardiovascular risk factors than those in the reference group (without LV dysfunction or symptomatic HF) but less likely than those with SLVSD. The HF rate was 24 occurrences per 1,000 person-years in the reference group and 57 occurrences per 1,000 person-years in those with ALVSD. The HF rate was 45 occurrences per 1,000 person-years for those with ALVSD and mildly impaired LV dysfunction and 93 occurrences per 1,000 person-years for those with ALVSD and moderate to severe LV dysfunction. The mortality rate was 51 deaths per 1,000 person-years in the reference group, 90 deaths per 1,000 person-years in the ALVSD group, and 156 deaths per 1,000 person-years in the SLVSD group. Adjusting for covariates, compared to the reference group, ALVSD was associated with an increased risk of incident HF (hazard ratio 1.60, 95% confidence interval 1.35 to 1.91), cardiovascular mortality (hazard ratio 2.13, 95% confidence interval 1.81 to 2.51), and all-cause mortality (hazard ratio 1.46, 95% confidence interval 1.29 to 1.64). In conclusion, subjects with ALVSD are characterized by a greater prevalence of cardiovascular risk factors and co-morbidities than those with normal LV function and without HF. However, the prevalence is lower than in those with SLVSD. Patients with ALVSD are at an increased risk of HF and mortality, particularly those with greater severity of LV impairment. Copyright © 2011 Elsevier Inc. All rights reserved.
Towards developing drought impact functions to advance drought monitoring and early warning
NASA Astrophysics Data System (ADS)
Bachmair, Sophie; Stahl, Kerstin; Hannaford, Jamie; Svoboda, Mark
2015-04-01
In natural hazard analysis, damage functions (also referred to as vulnerability or susceptibility functions) relate hazard intensity to the negative effects of the hazard event, often expressed as damage ratio or monetary loss. While damage functions for floods and seismic hazards have gained considerable attention, there is little knowledge on how drought intensity translates into ecological and socioeconomic impacts. One reason for this is the multifaceted nature of drought affecting different domains of the hydrological cycle and different sectors of human activity (for example, recognizing meteorological - agricultural - hydrological - socioeconomic drought) leading to a wide range of drought impacts. Moreover, drought impacts are often non-structural and hard to quantify or monetarize (e.g. impaired navigability of streams, bans on domestic water use, increased mortality of aquatic species). Knowledge on the relationship between drought intensity and drought impacts, i.e. negative environmental, economic or social effects experienced under drought conditions, however, is vital to identify critical thresholds for drought impact occurrence. Such information may help to improve drought monitoring and early warning (M&EW), one goal of the international DrIVER project (Drought Impacts: Vulnerability thresholds in monitoring and Early-warning Research). The aim of this study is to test the feasibility of designing "drought impact functions" for case study areas in Europe (Germany and UK) and the United States to derive thresholds meaningful for drought impact occurrence; to account for the multidimensionality of drought impacts, we use the broader term "drought impact function" over "damage function". First steps towards developing empirical drought impact functions are (1) to identify meaningful indicators characterizing the hazard intensity (e.g. indicators expressing a precipitation or streamflow deficit), (2) to identify suitable variables representing impacts, damage, or loss due to drought, and (3) to test different statistical models to link drought intensity with drought impact information to derive meaningful thresholds. While the focus regarding drought impact variables lies on text-based impact reports from the European Drought Impact report Inventory (EDII) and the US Drought Impact Reporter (DIR), the information gain through exploiting other variables such as agricultural yield statistics and remotely sensed vegetation indices is explored. First results reveal interesting insights into the complex relationship between drought indicators and impacts and highlight differences among drought impact variables and geographies. Although a simple intensity threshold evoking specific drought impacts cannot be identified, developing drought impact functions helps to elucidate how drought conditions relate to ecological or socioeconomic impacts. Such knowledge may provide guidance for inferring meaningful triggers for drought M&EW and could have potential for a wide range of drought management applications (for example, building drought scenarios for testing the resilience of drought plans or water supply systems).
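A minimal empirical "drought impact function" of the kind described above can be sketched as a logistic regression from a drought indicator to the probability of reported impacts, from which an indicator value at a chosen impact probability can be read off. The snippet below uses entirely synthetic data and assumes SPI-like indicator values; it illustrates the idea, not the DrIVER methodology.

```python
# Hedged sketch: logistic "drought impact function" on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
spi = rng.normal(0, 1, 600)                                # drought indicator (e.g., SPI), simulated
p_true = 1 / (1 + np.exp(-(-1.0 - 2.5 * spi)))             # drier (more negative SPI) -> more impacts
impact_reported = rng.binomial(1, p_true)                  # simulated impact reports

model = LogisticRegression().fit(spi.reshape(-1, 1), impact_reported)
b0, b1 = model.intercept_[0], model.coef_[0, 0]

# Indicator value at which the modeled impact probability crosses 50%.
threshold = -b0 / b1
print(f"indicative SPI threshold for impact occurrence ~ {threshold:.2f}")
```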
Lee, Sang-Uk; Roh, Sungwon; Kim, Young-Eun; Park, Jong-Ik; Jeon, Boyoung; Oh, In-Hwan
2017-01-01
The elevated risk of suicide in people with disability has been suggested in previous studies; however, the majority of study results have been limited to specific disability types, and there is a lack of research comparing the risk of suicide in people with disability in general. The aim was to examine the hazard ratio of suicide according to the presence and type of disability and to identify patterns in the results. In this study, we used National Health Insurance Service-National Sample Cohort data on 990,598 people, and analyzed causes of death from 2003 through 2013. A Cox proportional hazards model was used to estimate the hazard ratio of suicide associated with disability and its types. The hazard ratio of suicide among people with disability was 1.9-fold higher than among people without disability. Among disability types, the risk of suicide was higher for mental disorder, renal failure, brain injury, and physical disability. The hazard ratio of suicide in people with disability did not vary by income. The time to death by suicide for people with disability from the onset of their disability was 39.8 months on average. Our findings suggest that when the government plans suicide prevention policies, early and additional interventions specific to people with disability are needed. Disability due to mental disorder or renal failure should be given priority. Copyright © 2016 Elsevier Inc. All rights reserved.
Cole, Stephen R.; Hudgens, Michael G.; Tien, Phyllis C.; Anastos, Kathryn; Kingsley, Lawrence; Chmiel, Joan S.; Jacobson, Lisa P.
2012-01-01
To estimate the association of antiretroviral therapy initiation with incident acquired immunodeficiency syndrome (AIDS) or death while accounting for time-varying confounding in a cost-efficient manner, the authors combined a case-cohort study design with inverse probability-weighted estimation of a marginal structural Cox proportional hazards model. A total of 950 adults who were positive for human immunodeficiency virus type 1 were followed in 2 US cohort studies between 1995 and 2007. In the full cohort, 211 AIDS cases or deaths occurred during 4,456 person-years. In an illustrative 20% random subcohort of 190 participants, 41 AIDS cases or deaths occurred during 861 person-years. Accounting for measured confounders and determinants of dropout by inverse probability weighting, the full cohort hazard ratio was 0.41 (95% confidence interval: 0.26, 0.65) and the case-cohort hazard ratio was 0.47 (95% confidence interval: 0.26, 0.83). Standard multivariable-adjusted hazard ratios were closer to the null, regardless of study design. The precision lost with the case-cohort design was modest given the cost savings. Results from Monte Carlo simulations demonstrated that the proposed approach yields approximately unbiased estimates of the hazard ratio with appropriate confidence interval coverage. Marginal structural model analysis of case-cohort study designs provides a cost-efficient design coupled with an accurate analytic method for research settings in which there is time-varying confounding. PMID:22302074
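The analytic core described above, a Cox model fitted with inverse-probability weights, maps onto a weighted Cox fit. The lifelines-based sketch below uses random stand-in weights and invented columns purely to show the mechanics; constructing proper stabilized weights from treatment and dropout models is the substantive part the abstract refers to.

```python
# Hedged sketch: weighted Cox fit standing in for an IPW marginal structural model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 950
df = pd.DataFrame({
    "art_initiated": rng.binomial(1, 0.5, n).astype(float),  # exposure of interest (simulated)
    "T": rng.exponential(6, n),                               # follow-up time (simulated)
    "E": rng.binomial(1, 0.22, n),                            # AIDS or death indicator (simulated)
    "sw": rng.uniform(0.5, 2.0, n),                           # stand-in stabilized IP weights
})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E", weights_col="sw", robust=True)
print(np.exp(cph.params_["art_initiated"]))   # weighted (marginal) hazard ratio
```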
Pigmentation Traits, Sun Exposure, and Risk of Incident Vitiligo in Women.
Dunlap, Rachel; Wu, Shaowei; Wilmer, Erin; Cho, Eunyoung; Li, Wen-Qing; Lajevardi, Newsha; Qureshi, Abrar
2017-06-01
Vitiligo is the most common cutaneous depigmentation disorder worldwide, yet little is known about specific risk factors for disease development. Using data from the Nurses' Health Study, a prospective cohort study of 51,337 white women, we examined the associations of (i) pigmentary traits and (ii) reactions to sun exposure with the risk of incident vitiligo. Nurses' Health Study participants responded to a question about clinician-diagnosed vitiligo and year of diagnosis (2001 or before, 2002-2005, 2006-2009, 2010-2011, or 2012+). We used Cox proportional hazards regression models to estimate the multivariate-adjusted hazard ratios and 95% confidence intervals of incident vitiligo associated with exposure variables, adjusting for potential confounders. We documented 271 cases of incident vitiligo over 835,594 person-years. Vitiligo risk was higher in women who had at least one mole larger than 3 mm in diameter on their left arms (hazard ratio = 1.37, 95% confidence interval = 1.02-1.83). Additionally, vitiligo risk was higher among women with better tanning ability (hazard ratio = 2.59, 95% confidence interval = 1.21-5.54) and in women who experienced at least one blistering sunburn (hazard ratio = 2.17, 95% confidence interval = 1.15-4.10). In this study, upper extremity moles, a higher ability to achieve a tan, and history of a blistering sunburn were associated with a higher risk of developing vitiligo in a population of white women. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Hlatky, Mark A; Ray, Roberta M; Burwen, Dale R; Margolis, Karen L; Johnson, Karen C; Kucharska-Newton, Anna; Manson, JoAnn E; Robinson, Jennifer G; Safford, Monika M; Allison, Matthew; Assimes, Themistocles L; Bavry, Anthony A; Berger, Jeffrey; Cooper-DeHoff, Rhonda M; Heckbert, Susan R; Li, Wenjun; Liu, Simin; Martin, Lisa W; Perez, Marco V; Tindle, Hilary A; Winkelmayer, Wolfgang C; Stefanick, Marcia L
2015-01-01
Background Data collected as part of routine clinical practice could be used to detect cardiovascular outcomes in pragmatic clinical trials, or in clinical registry studies. The reliability of claims data for documenting outcomes is unknown. Methods and Results We linked records of Women's Health Initiative (WHI) participants aged 65 years and older to Medicare claims data, and compared hospitalizations that had diagnosis codes for acute myocardial infarction (MI) or coronary revascularization with WHI outcomes adjudicated by study physicians. We then compared the hazard ratios for active versus placebo hormone therapy based solely on WHI adjudicated events with corresponding hazard ratios based solely on claims data for the same hormone trial participants. Agreement between WHI adjudicated outcomes and Medicare claims was good for the diagnosis for MI (kappa = 0.71 to 0.74), and excellent for coronary revascularization (kappa=0.88 to 0.91). The hormone:placebo hazard ratio for clinical MI was 1.31 (95% confidence interval (CI) 1.03 to 1.67) based on WHI outcomes, and 1.29 (CI 1.00 to 1.68) based on Medicare data. The hazard ratio for coronary revascularization was 1.09 (CI 0.88 to 1.35) based on WHI outcomes and 1.10 (CI 0.89 to 1.35) based on Medicare data. The differences between hazard ratios derived from WHI and Medicare data were not significant in 1,000 bootstrap replications. Conclusion Medicare claims may provide useful data on coronary heart disease outcomes among patients aged 65 years and older in clinical research studies. Clinical Trials Registration Information www.clinicaltrials.gov, Trial Number NCT00000611 PMID:24399330
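The agreement statistic reported above is Cohen's kappa between adjudicated and claims-based outcome labels for the same participants. A minimal computation, with invented labels:

```python
# Cohen's kappa between two sets of outcome labels (invented data).
from sklearn.metrics import cohen_kappa_score

adjudicated  = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
claims_based = [1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 1]
print(f"kappa = {cohen_kappa_score(adjudicated, claims_based):.2f}")
```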
Hamilton, S.J.; Buhl, K.J.
1997-01-01
Larval flannelmouth sucker (Catostomus latipinnis) were exposed to arsenate, boron, copper, molybdenum, selenate, selenite, uranium, vanadium, and zinc singly, and to five mixtures of five to nine inorganics. The exposures were conducted in reconstituted water representative of the San Juan River near Shiprock, New Mexico. The mixtures simulated environmental ratios reported for sites along the San Juan River (San Juan River backwater, Fruitland marsh, Hogback East Drain, Mancos River, and McElmo Creek). The rank order of the individual inorganics, from most to least toxic, was: copper > zinc > vanadium > selenite > selenate > arsenate > uranium > boron > molybdenum. All five mixtures exhibited additive toxicity to flannelmouth sucker. In a limited number of tests, 44-day-old and 13-day-old larvae exhibited no difference in sensitivity to three mixtures. Copper was the major toxic component in four mixtures (San Juan backwater, Hogback East Drain, Mancos River, and McElmo Creek), whereas zinc was the major toxic component in the Fruitland marsh mixture, which did not contain copper. The Hogback East Drain was the most toxic mixture tested. Comparison of 96-h LC50 values with reported environmental water concentrations from the San Juan River revealed low hazard ratios for arsenic, boron, molybdenum, selenate, selenite, uranium, and vanadium, moderate hazard ratios for zinc and the Fruitland marsh mixture, and high hazard ratios for copper at three sites and four environmental mixtures representing a San Juan backwater, Hogback East Drain, Mancos River, and McElmo Creek. The high hazard ratios suggest that inorganic contaminants could adversely affect larval flannelmouth sucker in the San Juan River at four sites receiving elevated inorganics.
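The hazard ratios in this toxicological sense are hazard quotients: the measured environmental concentration divided by the 96-h LC50, with mixture toxicity screened by summing the constituents' toxic units. The concentrations below are invented placeholders, not the San Juan River data.

```python
# Hazard quotient (environmental concentration / LC50) and mixture toxic-unit sum (invented values).
def hazard_quotient(env_conc_mg_l: float, lc50_mg_l: float) -> float:
    return env_conc_mg_l / lc50_mg_l

site_water = {"copper": (0.02, 0.05), "zinc": (0.3, 1.5), "boron": (1.0, 50.0)}  # (env conc, 96-h LC50) in mg/L
toxic_units = {name: hazard_quotient(conc, lc50) for name, (conc, lc50) in site_water.items()}
print(toxic_units)
print("mixture toxic-unit sum:", round(sum(toxic_units.values()), 3))
```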
Gratwohl, Alois; Brand, Ronald; McGrath, Eoin; van Biezen, Anja; Sureda, Anna; Ljungman, Per; Baldomero, Helen; Chabannon, Christian; Apperley, Jane
2014-05-01
Competent authorities, healthcare payers and hospitals devote increasing resources to quality management systems but scientific analyses searching for an impact of these systems on clinical outcome remain scarce. Earlier data indicated a stepwise improvement in outcome after allogeneic hematopoietic stem cell transplantation with each phase of the accreditation process for the quality management system "JACIE". We therefore tested the hypothesis that working towards and achieving "JACIE" accreditation would accelerate improvement in outcome over calendar time. Overall mortality of the entire cohort of 107,904 patients who had a transplant (41,623 allogeneic, 39%; 66,281 autologous, 61%) between 1999 and 2006 decreased over the 14-year observation period by a factor of 0.63 per 10 years (hazard ratio: 0.63; 0.58-0.69). Considering "JACIE"-accredited centers as those with programs having achieved accreditation by November 2012, at the latest, this improvement was significantly faster in "JACIE"-accredited centers than in non-accredited centers (approximately 5.3% per year for 49,459 patients versus approximately 3.5% per year for 58,445 patients, respectively; hazard ratio: 0.83; 0.71-0.97). As a result, relapse-free survival (hazard ratio 0.85; 0.75-0.95) and overall survival (hazard ratio 0.86; 0.76-0.98) were significantly higher at 72 months for those patients transplanted in the 162 "JACIE"-accredited centers. No significant effects were observed after autologous transplants (hazard ratio 1.06; 0.99-1.13). Hence, working towards implementation of a quality management system triggers a dynamic process associated with a steeper reduction in mortality over the years and a significantly improved survival after allogeneic stem cell transplantation. Our data support the use of a quality management system for complex medical procedures.
Filippini, Graziella; Falcone, Chiara; Boiardi, Amerigo; Broggi, Giovanni; Bruzzone, Maria G; Caldiroli, Dario; Farina, Rita; Farinotti, Mariangela; Fariselli, Laura; Finocchiaro, Gaetano; Giombini, Sergio; Pollo, Bianca; Savoiardo, Mario; Solero, Carlo L; Valsecchi, Maria G
2008-02-01
Reliable data on large cohorts of patients with glioblastoma are needed because such studies differ importantly from trials that have a strong bias toward the recruitment of younger patients with a higher performance status. We analyzed the outcome of 676 patients with histologically confirmed newly diagnosed glioblastoma who were treated consecutively at a single institution over a 7-year period (1997-2003) with follow-up to April 30, 2006. Survival probabilities were 57% at 1 year, 16% at 2 years, and 7% at 3 years. Progression-free survival was 15% at 1 year. Prolongation of survival was significantly associated with surgery in patients with a good performance status, whatever the patient's age, with an adjusted hazard ratio of 0.55 (p < 0.001) or a 45% relative decrease in the risk of death. Radiotherapy and chemotherapy improved survival, with adjusted hazard ratios of 0.61 (p = 0.001) and 0.89 (p = 0.04), respectively, regardless of age, performance status, or residual tumor volume. Recurrence occurred in 99% of patients throughout the follow-up. Reoperation was performed in one-fourth of these patients but was not effective, whether performed within 9 months (hazard ratio, 0.86; p = 0.256) or after 9 months (hazard ratio, 0.98; p = 0.860) of initial surgery, whereas second-line chemotherapy with procarbazine, lomustine, and vincristine (PCV) or with temozolomide improved survival (hazard ratio, 0.77; p = 0.008). Surgery followed by radiotherapy and chemotherapy should be considered in all patients with glioblastoma, and these treatments should not be withheld because of increasing age alone. The benefit of second surgery at recurrence is uncertain, and new trials are needed to assess its effectiveness. Chemotherapy with PCV or temozolomide seems to be a reasonable option at tumor recurrence.
Leone, José Pablo; Leone, Julieta; Zwenger, Ariel Osvaldo; Iturbe, Julián; Leone, Bernardo Amadeo; Vallejo, Carlos Teodoro
2017-03-01
The presence of brain metastases at the time of initial breast cancer diagnosis (BMIBCD) is uncommon. Hence, the prognostic assessment and management of these patients is very challenging. The aim of this study was to analyse the influence of tumour subtype compared with other prognostic factors in the survival of patients with BMIBCD. We evaluated women with BMIBCD, reported to Surveillance, Epidemiology and End Results program from 2010 to 2013. Patients with other primary malignancy were excluded. Univariate and multivariate analyses were performed to determine the effects of each variable on overall survival (OS). We included 740 patients. Median OS for the whole population was 10 months, and 20.7% of patients were alive at 36 months. Tumour subtype distribution was: 46.6% hormone receptor (HR)+/HER2-, 17% HR+/HER2+, 14.1% HR-/HER2+ and 22.3% triple-negative. Univariate analysis showed that the presence of liver metastases, lung metastases and triple-negative patients (median OS 6 months) had worse prognosis. The HR+/HER2+ subtype had the longest OS with a median of 22 months. In multivariate analysis, older age (hazard ratio 1.8), lobular histology (hazard ratio 2.08), triple-negative subtype (hazard ratio 2.25), liver metastases (hazard ratio 1.6) and unmarried patients (hazard ratio 1.39) had significantly shorter OS. Although the prognosis of patients with BMIBCD is generally poor, 20.7% were still alive 3 years after the diagnosis. There were substantial differences in OS according to tumour subtype. In addition to tumour subtype, other independent predictors of OS are age at diagnosis, marital status, histology and liver metastases. Copyright © 2017 Elsevier Ltd. All rights reserved.
Wakai, Kenji; Sugawara, Yumi; Tsuji, Ichiro; Tamakoshi, Akiko; Shimazu, Taichi; Matsuo, Keitaro; Nagata, Chisato; Mizoue, Tetsuya; Tanaka, Keitaro; Inoue, Manami; Tsugane, Shoichiro; Sasazuki, Shizuka
2015-08-01
International reviews have concluded that consumption of fruit and vegetables might decrease the risk of lung cancer. However, the relevant epidemiological evidence still remains insufficient in Japan. Therefore, we performed a pooled analysis of data from four population-based cohort studies in Japan with >200 000 participants and >1700 lung cancer cases. We computed study-specific hazard ratios by quintiles of vegetable and fruit consumption as assessed by food frequency questionnaires. Summary hazard ratios were estimated by pooling the study-specific hazard ratios with a fixed-effect model. In men, we found inverse associations between fruit consumption and the age-adjusted and area-adjusted risk of mortality or incidence of lung cancer. However, the associations were largely attenuated after adjustment for smoking and energy intake. The significant decrease in risk among men remained only for a moderate level of fruit consumption; the lowest summary hazard ratios were found in the third quintile of intake (mortality: 0.71, 95% confidence interval 0.60-0.84; incidence: 0.83, 95% confidence interval 0.70-0.98). This decrease in risk was mainly detected in ever smokers. Conversely, vegetable intake was positively correlated with the risk of incidence of lung cancer after adjustment for smoking and energy intake in men (trend P, 0.024); the summary hazard ratio for the highest quintile was 1.26 (95% confidence interval 1.05-1.50). However, a similar association was not detected for mortality from lung cancer. In conclusion, a moderate level of fruit consumption is associated with a decreased risk of lung cancer in men among the Japanese population. © 2015 The Authors. Cancer Science published by Wiley Publishing Asia Pty Ltd on behalf of Japanese Cancer Association.
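The pooling step described above is a standard inverse-variance fixed-effect meta-analysis of log hazard ratios. The sketch below shows the arithmetic on made-up study-specific hazard ratios and confidence intervals (not the values from the four cohorts); it assumes 95% confidence intervals, hence the 1.96 factor.

```python
import numpy as np

def pool_fixed_effect(hrs, ci_lo, ci_hi):
    """Fixed-effect (inverse-variance) pooling of hazard ratios on the log scale."""
    log_hr = np.log(hrs)
    se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from each 95% CI
    w = 1.0 / se ** 2                                    # inverse-variance weights
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
    return np.exp(pooled), (lo, hi)

# Illustrative study-specific HRs for one intake quintile (invented numbers)
hr, ci = pool_fixed_effect([0.75, 0.68, 0.80, 0.71],
                           [0.55, 0.48, 0.60, 0.52],
                           [1.02, 0.96, 1.07, 0.97])
print(round(hr, 2), tuple(round(x, 2) for x in ci))
```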
Abdolvahabi, Alireza; Shi, Yunhua; Rasouli, Sanaz; Croom, Corbin M; Aliyan, Amir; Martí, Angel A; Shaw, Bryan F
2017-06-21
Over 150 mutations in SOD1 (superoxide dismutase-1) cause amyotrophic lateral sclerosis (ALS), presumably by accelerating SOD1 amyloidogenesis. Like many nucleation processes, SOD1 fibrillization is stochastic (in vitro), which inhibits the determination of aggregation rates (and obscures whether rates correlate with patient phenotypes). Here, we diverged from classical chemical kinetics and used Kaplan-Meier estimators to quantify the probability of apo-SOD1 fibrillization (in vitro) from ∼10³ replicate amyloid assays of wild-type (WT) SOD1 and nine ALS variants. The probability of apo-SOD1 fibrillization (expressed as a hazard ratio) is increased by certain ALS-linked SOD1 mutations but is decreased or remains unchanged by other mutations. Despite this diversity, hazard ratios of fibrillization correlated linearly with (and for three mutants, approximately equaled) hazard ratios of patient survival (R² = 0.67; Pearson's r = 0.82). No correlation exists between hazard ratios of fibrillization and age of initial onset of ALS (R² = 0.09). Thus, hazard ratios of fibrillization might explain rates of disease progression but not onset. Classical kinetic metrics of fibrillization, i.e., mean lag time and propagation rate, did not correlate as strongly with phenotype (and ALS mutations did not uniformly accelerate mean rate of nucleation or propagation). A strong correlation was found, however, between mean ThT fluorescence at lag time and patient survival (R² = 0.93); oligomers of SOD1 with weaker fluorescence correlated with shorter survival. This study suggests that SOD1 mutations trigger ALS by altering a property of SOD1 or its oligomers other than the intrinsic rate of amyloid nucleation (e.g., oligomer stability; rates of intercellular propagation; affinity for membrane surfaces; and maturation rate).
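A minimal sketch of this survival-analysis framing, using simulated lag times for replicate wells rather than the study's data; the censoring horizon, group sizes, and the use of the lifelines library are assumptions of the example, not details from the paper.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(42)

def simulate_wells(n, scale_hours, assay_end=150.0):
    """Simulated lag times for n replicate wells, censored at the end of the assay."""
    t = rng.exponential(scale_hours, size=n)
    observed = t <= assay_end
    return pd.DataFrame({"lag_hours": np.minimum(t, assay_end),
                         "fibrillized": observed.astype(int)})

wt = simulate_wells(96, scale_hours=80.0).assign(variant=0)   # wild-type SOD1
mut = simulate_wells(96, scale_hours=50.0).assign(variant=1)  # hypothetical ALS variant
wells = pd.concat([wt, mut], ignore_index=True)

# Kaplan-Meier estimate: probability that a wild-type well has fibrillized by 100 h
km = KaplanMeierFitter().fit(wt["lag_hours"], event_observed=wt["fibrillized"])
print("P(WT fibrillized by 100 h):", round(1 - float(km.predict(100.0)), 2))

# Cox model: hazard ratio of fibrillization for the variant relative to wild type
cph = CoxPHFitter().fit(wells, duration_col="lag_hours", event_col="fibrillized")
print(cph.hazard_ratios_)  # HR > 1 means fibrillization is more probable per unit time
```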
Devillier, Raynier; Dalle, Jean-Hugues; Kulasekararaj, Austin; D'aveni, Maud; Clément, Laurence; Chybicka, Alicja; Vigouroux, Stéphane; Chevallier, Patrice; Koh, Mickey; Bertrand, Yves; Michallet, Mauricette; Zecca, Marco; Yakoub-Agha, Ibrahim; Cahn, Jean-Yves; Ljungman, Per; Bernard, Marc; Loiseau, Pascale; Dubois, Valérie; Maury, Sébastien; Socié, Gérard; Dufour, Carlo; Peffault de Latour, Regis
2016-07-01
Unrelated allogeneic transplantation for severe aplastic anemia is a treatment option after immunosuppressive treatment failure in the absence of a matched sibling donor. Age, delay between disease diagnosis and transplantation, and HLA matching are the key factors in transplantation decisions, but their combined impact on patient outcomes remains unclear. Using the French Society of Bone Marrow Transplantation and Cell Therapies registry, we analyzed all consecutive patients (n=139) who underwent a first allogeneic transplantation for idiopathic severe aplastic anemia from an unrelated donor between 2000 and 2012. In an adjusted multivariate model, age over 30 years (Hazard Ratio=2.39; P=0.011), time from diagnosis to transplantation over 12 months (Hazard Ratio=2.18; P=0.027) and the use of a 9/10 mismatched unrelated donor (Hazard Ratio=2.14; P=0.036) were independent risk factors that significantly worsened overall survival. Accordingly, we built a predictive score using these three parameters, considering patients at low (zero or one risk factor, n=94) or high (two or three risk factors, n=45) risk. High-risk patients had significantly shorter survival (Hazard Ratio=3.04; P<0.001). The score was then confirmed on an independent cohort from the European Group for Blood and Marrow Transplantation database of 296 patients, with shorter survival in patients with at least 2 risk factors (Hazard Ratio=2.13; P=0.005). In conclusion, a simple score using age, transplantation timing and HLA matching would appear useful to help physicians in the daily care of patients with severe aplastic anemia. Copyright © Ferrata Storti Foundation.
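The score itself is a simple count of the three reported risk factors, with zero or one factor classed as low risk and two or three as high risk. A small sketch of that classification rule (the function name and example values are illustrative, not from the paper):

```python
def aplastic_anemia_risk_group(age_years, months_diagnosis_to_transplant, donor_9_of_10):
    """Count the three adverse factors reported above and classify the patient.

    Low risk = 0-1 factors; high risk = 2-3 factors.
    """
    score = sum([
        age_years > 30,                        # hazard ratio ~2.39
        months_diagnosis_to_transplant > 12,   # hazard ratio ~2.18
        bool(donor_9_of_10),                   # 9/10 mismatched unrelated donor, ~2.14
    ])
    return score, ("high" if score >= 2 else "low")

print(aplastic_anemia_risk_group(45, 18, False))  # -> (2, 'high')
```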
Kim, Jin Sug; Kim, Weon; Woo, Jong Shin; Lee, Tae Won; Ihm, Chun Gyoo; Kim, Yang Gyoon; Moon, Joo Young; Lee, Sang Ho; Jeong, Myung Ho; Jeong, Kyung Hwan
2016-01-01
A high serum triglyceride to high-density lipoprotein cholesterol (TG/HDL-C) ratio has been reported as an independent predictor for cardiovascular events in the general population. However, the prognostic value of this ratio in patients with renal dysfunction is unclear. We examined the association of the TG/HDL-C ratio with major adverse cardiovascular events (MACEs) according to renal function in patients with acute myocardial infarction (AMI). This study was based on the Korea Acute Myocardial Infarction Registry database. Of 13,897 patients who were diagnosed with AMI, the study population included the 7,016 patients with available TG/HDL-C ratio data. Patients were stratified into three groups according to their estimated glomerular filtration rate (eGFR), and the TG/HDL-C ratio was categorized into tertiles. We investigated 12-month MACEs, which included cardiac death, myocardial infarction, and repeated percutaneous coronary intervention or coronary artery bypass grafting. During the 12-month follow up period, 593 patients experienced MACEs. There was a significant association between the TG/HDL-C ratio and MACEs (p<0.001) in the entire study cohort. Having a TG/HDL-C ratio value in the highest tertile of TG/HDL-C ratio was an independent factor associated with increased risk of MACEs (hazard ratio [HR], 1.56; 95% confidence interval [CI], 1.26-1.93; p<0.001). Then we performed subgroup analyses according to renal function. In patients with normal renal function (eGFR ≥ 90 ml/min/1.73m2) and mild renal dysfunction (eGFR ≥ 60 to < 90ml/min/1.73m2), a higher TG/HDL-C ratio was significantly associated with increased risk of MACEs (HR, 1.64; 95% CI, 1.04-2.60; p = 0.035; and HR, 1.56; 95% CI, 1.14-2.12; p = 0.005, respectively). However, in patients with moderate renal dysfunction (eGFR < 60 ml/min/1.73m2), TG/HDL-C ratio lost its predictive value on the risk of MACEs (HR, 1.23; 95% CI, 0.82-1.83; p = 0.317). In patients with AMI, TG/HDL-C ratio is a useful independent predictor of 12-month MACEs. However, this ratio does not have predictive power in patients with moderate renal dysfunction.
da Cunha-Bang, Caspar; Sørensen, Søren S; Iversen, Martin; Sengeløv, Henrik; Hillingsø, Jens G; Rasmussen, Allan; Mortensen, Svend A; Fox, Zoe V; Kirkby, Nikolai S; Christiansen, Claus B; Lundgren, Jens D
2011-05-01
Infection with cytomegalovirus (CMV) remains a potentially serious complication in transplant patients. In this study we explored the risk factors for CMV infection in the 12 months following a solid organ transplantation (n = 242) in patients monitored for CMV infection from 2004 to 2007. CMV infection was defined as 2 consecutive quantifiable CMV-polymerase chain reaction (PCR) values or 1 measurement of >3000 copies/ml. Data describing pre- and post-transplantation variables were extracted from electronic health records. Time to CMV infection was investigated using Cox proportional hazards analysis. Overall, 31% (75/242) of solid organ transplant patients developed CMV infection: 4/8 (50.0%) heart, 15/43 (34.9%) liver, 30/89 (33.7%) lung and 26/102 (25.5%) kidney transplant patients. The risk of CMV infection according to donor (D)/recipient (R) CMV serostatus (positive [+] or negative [-]) was highest for D+/R- (adjusted hazard ratio 2.6, 95% confidence interval 1.6-4.2) vs D+/R+, and was reduced for D-/R+ (adjusted hazard ratio 0.2, 95% confidence interval 0.2-0.8) vs D+/R+. Positive donor CMV serostatus is a major risk factor for CMV infection in CMV-naïve recipients, but also in recipients with positive CMV serostatus. Conversely, if the donor's CMV serostatus is negative, the risk of CMV infection is low, irrespective of the recipient's CMV serostatus. These findings suggest poorer immune function towards donor-induced strains of CMV than towards the recipient's own latent strains.
Remarriage after divorce and depression risk.
Hiyoshi, A; Fall, K; Netuveli, G; Montgomery, S
2015-09-01
As marriage is associated with lower depression rates compared with being single in men, we aimed to examine if remarriage compared with remaining divorced is also associated with a reduced depression risk. Swedish register data were used to define a cohort of men who were born between 1952 and 1956 and underwent a compulsory military conscription assessment in adolescence. This study population comprised men who were divorced in 1985 (n = 72,246). The risk of pharmaceutically treated depression from 2005 to 2009 was compared for those who remarried or remained divorced between 1986 and 2004. Cox proportional hazards analysis was used to estimate hazard ratios for the risk of depression identified by pharmaceutical treatment, with adjustment for a range of potential confounding factors including childhood and adulthood socioeconomic circumstances, cognitive, physical, psychological and medical characteristics at the conscription assessment. The results showed that, even though divorced men who remarried had markers of lower depression risk in earlier life such as higher cognitive and physical function, higher stress resilience and socioeconomic advantages than men who remained divorced, remarriage was associated with a statistically significant elevated risk of depression with an adjusted hazard ratio (and 95% confidence interval) of 1.27 (1.03-1.55), compared with men who remained divorced. Remarriage following divorce is not associated with a reduced risk of depression identified by pharmaceutical treatment, compared with remaining divorced. Interpersonal or financial difficulties resulting from remarriage may outweigh the benefits of marriage in terms of depression risk. Copyright © 2015 Elsevier Ltd. All rights reserved.
Calcium and Vitamin D Supplementation and Cognitive Impairment in the Women’s Health Initiative
Rossom, Rebecca C.; Espeland, Mark A.; Manson, JoAnn E.; Dysken, Maurice W.; Johnson, Karen C.; Lane, Dorothy S.; LeBlanc, Erin S.; Lederle, Frank A.; Masaki, Kamal H.; Margolis, Karen L.
2012-01-01
Background Calcium and vitamin D are thought to play important roles in neuronal functioning. Studies have found associations between low serum vitamin D levels and reduced cognitive functioning, as well as high serum calcium levels and reduced cognitive functioning. Objectives To examine the effects of vitamin D and calcium on cognitive outcomes in elderly women. Design Post-hoc analysis of a randomized double-blinded placebo-controlled trial. Setting 40 Women’s Health Initiative clinical centers across the U.S. Participants 4143 women aged 65 years and older without probable dementia at baseline who participated in the WHI Calcium and Vitamin D trial and the Women’s Health Initiative Memory Study. Interventions 2034 women were randomized to 1000 mg of calcium carbonate combined with 400 IU of vitamin D3; 2109 women were randomized to placebo. Measurements Primary: classifications of probable dementia or mild cognitive impairment via a 4-phase protocol that included central adjudication. Secondary: global cognitive function and individual cognitive subtests. Results Mean age of participants was 71 years. During mean follow-up of 7.8 years, there were 39 cases of incident dementia among calcium plus vitamin D subjects compared to 37 cases among placebo subjects (hazard ratio=1.11, 95% CI: 0.71–1.74, p=0.64). Likewise, there were 98 cases of incident mild cognitive impairment among calcium plus vitamin D subjects compared to 108 cases among placebo subjects (hazard ratio=0.95, 95% CI: 0.72–1.25, p=0.72). There were no significant differences in incident dementia or mild cognitive impairment, or in global or domain-specific cognitive function between groups. Conclusion There was no association between treatment assignment and incident cognitive impairment. Further studies are needed to investigate the effects of vitamin D and calcium separately, on men and in other age and ethnic groups, and with other doses. PMID:23176129
Suzuki, Takeki; Agarwal, Sunil K; Deo, Rajat; Sotoodehnia, Nona; Grams, Morgan E; Selvin, Elizabeth; Calkins, Hugh; Rosamond, Wayne; Tomaselli, Gordon; Coresh, Josef; Matsushita, Kunihiro
2016-10-01
Individuals with chronic kidney disease, particularly those requiring dialysis, are at high risk of sudden cardiac death (SCD). However, comprehensive data for the full spectrum of kidney function and SCD risk in the community are sparse. Furthermore, newly developed equations for estimated glomerular filtration rate (eGFR) and novel filtration markers might add further insight to the role of kidney function in SCD. We investigated the associations of baseline eGFRs using serum creatinine, cystatin C, or both (eGFRcr, eGFRcys, and eGFRcr-cys); cystatin C itself; and β2-microglobulin (B2M) with SCD (205 cases through 2001) among 13,070 black and white ARIC participants at baseline during 1990-1992 using Cox regression models accounting for potential confounders. Low eGFR was independently associated with SCD risk: for example, hazard ratio for eGFR <45 versus ≥90 mL/min/1.73 m2 was 3.71 (95% CI 1.74-7.90) with eGFRcr, 5.40 (2.97-9.83) with eGFRcr-cys, and 5.24 (3.01-9.11) with eGFRcys. When eGFRcr and eGFRcys were included together in a single model, the association was only significant for eGFRcys. When three eGFRs, cystatin C, and B2M were divided into quartiles, B2M demonstrated the strongest association with SCD (hazard ratio for fourth quartile vs first quartile 3.48 (2.03-5.96) vs ≤2.7 for the other kidney markers). Kidney function was independently and robustly associated with SCD in the community, particularly when cystatin C or B2M was used. These results suggest the potential value of kidney function as a risk factor for SCD and the advantage of novel filtration markers over eGFRcr in this context. Copyright © 2016 Elsevier Inc. All rights reserved.
Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi
2017-12-01
Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Then, each model was prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure score. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence intervals, 0.858-0.910) for model 1 and 0.913 (95% confidence intervals, 0.884-0.942) for model 2. The Hosmer-Lemeshow χ² was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence intervals, 0.840-0.892) for model 1 and 0.850 (95% confidence intervals, 0.815-0.885) for model 2. The Hosmer-Lemeshow χ² was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratios, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratios, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
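For readers wanting to reproduce this style of validation, the sketch below fits a multivariable logistic model on simulated data and reports the two statistics used above: the area under the ROC curve for discrimination and the Hosmer-Lemeshow χ² for calibration. The predictors are invented stand-ins for the study's variables, not its actual data.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 717
# Four invented predictors standing in for age, onset-to-admission time, and
# admission motor/cognitive scores; the outcome is a simulated binary response.
X = rng.normal(size=(n, 4))
true_logit = -0.5 + X @ np.array([0.8, -0.4, 1.1, 0.6])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]
print("AUC:", round(roc_auc_score(y, p), 3))          # discrimination

def hosmer_lemeshow(y_true, p_hat, groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic (chi-square with g-2 df)."""
    bins = pd.qcut(p_hat, groups, labels=False, duplicates="drop")
    stat = 0.0
    for g in np.unique(bins):
        idx = bins == g
        obs, exp, n_g = y_true[idx].sum(), p_hat[idx].sum(), idx.sum()
        stat += (obs - exp) ** 2 / exp + ((n_g - obs) - (n_g - exp)) ** 2 / (n_g - exp)
    dof = len(np.unique(bins)) - 2
    return stat, chi2.sf(stat, dof)

stat, pval = hosmer_lemeshow(y, p)
print("Hosmer-Lemeshow chi2 = %.2f, P = %.3f" % (stat, pval))  # calibration
```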
Willey, Joshua; Gardener, Hannah; Cespedes, Sandino; Cheung, Ying K; Sacco, Ralph L; Elkind, Mitchell S V
2017-11-01
There is growing evidence that increased dietary sodium (Na) intake increases the risk of vascular diseases, including stroke, at least in part via an increase in blood pressure. Higher dietary potassium (K), seen with increased intake of fruits and vegetables, is associated with lower blood pressure. The goal of this study was to determine the association of a dietary Na:K with risk of stroke in a multiethnic urban population. Stroke-free participants from the Northern Manhattan Study, a population-based cohort study of stroke incidence, were followed-up for incident stroke. Baseline food frequency questionnaires were analyzed for Na and K intake. We estimated the hazard ratios and 95% confidence intervals for the association of Na:K with incident total stroke using multivariable Cox proportional hazards models. Among 2570 participants with dietary data (mean age, 69±10 years; 64% women; 21% white; 55% Hispanic; 24% black), the mean Na:K ratio was 1.22±0.43. Over a mean follow-up of 12 years, there were 274 strokes. In adjusted models, a higher Na:K ratio was associated with increased risk for stroke (hazard ratio, 1.6; 95% confidence interval, 1.2-2.1) and specifically ischemic stroke (hazard ratio, 1.6; 95% confidence interval, 1.2-2.1). Na:K intake is an independent predictor of stroke risk. Further studies are required to understand the joint effect of Na and K intake on risk of cardiovascular disease. © 2017 American Heart Association, Inc.
Candida transmission and sexual behaviors as risks for a repeat episode of Candida vulvovaginitis.
Reed, Barbara D; Zazove, Philip; Pierson, Carl L; Gorenflo, Daniel W; Horrocks, Julie
2003-12-01
To assess associations between female and male factors and the risk of recurring Candida vulvovaginitis. A prospective cohort study of 148 women with Candida vulvovaginitis and 78 of their male sexual partners was conducted at two primary care practices in the Ann Arbor, Michigan, area. Thirty-three of 148 women developed at least one further episode of Candida albicans vulvovaginitis within 1 year of follow-up. Cultures of Candida species from various sites of the woman (tongue, feces, vulva, and vagina) and from her partner (tongue, feces, urine, and semen) did not predict recurrences. Female factors associated with recurrence included recent masturbating with saliva (hazard ratio 2.66 [95% CI 1.17-6.06]) or cunnilingus (hazard ratio 2.94 [95% CI 1.12-7.68]) and ingestion of two or more servings of bread per day (p = 0.05). Male factors associated with recurrences in the woman included history of the male masturbating with saliva in the previous month (hazard ratio 3.68 [95% CI 1.24-10.87]) and lower age at first intercourse (hazard ratio 0.83 [95% CI 0.71-0.96]). Sexual behaviors, rather than the presence of Candida species at various body locations of the male partner, are associated with recurrences of C. albicans vulvovaginitis.
Serum Metabolomic Profiling and Incident CKD among African Americans
Yu, Bing; Zheng, Yan; Nettleton, Jennifer A.; Alexander, Danny; Coresh, Josef
2014-01-01
Background and objectives Novel biomarkers that more accurately reflect kidney function and predict future CKD are needed. The human metabolome is the product of multiple physiologic or pathophysiologic processes and may provide novel insight into disease etiology and progression. This study investigated whether estimated kidney function would be associated with multiple metabolites and whether selected metabolomic factors would be independent risk factors for incident CKD. Design, setting, participants, & measurements In total, 1921 African Americans free of CKD with a median of 19.6 years follow-up among the Atherosclerosis Risk in Communities Study were included. A total of 204 serum metabolites quantified by untargeted gas chromatography–mass spectrometry and liquid chromatography–mass spectrometry was analyzed by both linear regression for the cross-sectional associations with eGFR (specified by the Chronic Kidney Disease Epidemiology Collaboration equation) and Cox proportional hazards model for the longitudinal associations with incident CKD. Results Forty named and 34 unnamed metabolites were found to be associated with eGFR specified by the Chronic Kidney Disease Epidemiology Collaboration equation with creatine and 3-indoxyl sulfate showing the strongest positive (2.8 ml/min per 1.73 m2 per +1 SD; 95% confidence interval, 2.1 to 3.5) and negative association (−14.2 ml/min per 1.73 m2 per +1 SD; 95% confidence interval, −17.0 to −11.3), respectively. Two hundred four incident CKD events with a median follow-up time of 19.6 years were included in the survival analyses. Higher levels of 5-oxoproline (hazard ratio, 0.70; 95% confidence interval, 0.60 to 0.82) and 1,5-anhydroglucitol (hazard ratio, 0.68; 95% confidence interval, 0.58 to 0.80) were significantly related to lower risk of incident CKD, and the associations did not appreciably change when mutually adjusted. Conclusions These data identify a large number of metabolites associated with kidney function as well as two metabolites that are candidate risk factors for CKD and may provide new insights into CKD biomarker identification. PMID:25011442
Quantifying the relative risk of sex offenders: risk ratios for static-99R.
Hanson, R Karl; Babchishin, Kelly M; Helmus, Leslie; Thornton, David
2013-10-01
Given the widespread use of empirical actuarial risk tools in corrections and forensic mental health, it is important that evaluators and decision makers understand how scores relate to recidivism risk. In the current study, we found strong evidence for a relative risk interpretation of Static-99R scores using 8 samples from Canada, United Kingdom, and Western Europe (N = 4,037 sex offenders). Each increase in Static-99R score was associated with a stable and consistent increase in relative risk (as measured by an odds ratio or hazard ratio of approximately 1.4). Hazard ratios from Cox regression were used to calculate risk ratios that can be reported for Static-99R. We recommend that evaluators consider risk ratios as a useful, nonarbitrary metric for quantifying and communicating risk information. To avoid misinterpretation, however, risk ratios should be presented with recidivism base rates.
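A constant per-point hazard ratio of roughly 1.4 implies that relative risk scales geometrically with the score. The sketch below shows only that arithmetic; the reference score of 2 is an arbitrary illustrative choice, not the scale's published reference category.

```python
def relative_risk_ratio(score, reference_score=2, hr_per_point=1.4):
    """Risk ratio implied by a constant per-point hazard ratio of ~1.4.

    Each one-point increase multiplies the hazard by hr_per_point, so a score
    k points above the reference carries roughly hr_per_point**k the hazard.
    """
    return hr_per_point ** (score - reference_score)

for s in range(-1, 7):
    print(s, round(relative_risk_ratio(s), 2))
```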
Vegetables, unsaturated fats, moderate alcohol intake, and mild cognitive impairment.
Roberts, Rosebud O; Geda, Yonas E; Cerhan, James R; Knopman, David S; Cha, Ruth H; Christianson, Teresa J H; Pankratz, V Shane; Ivnik, Robert J; Boeve, Bradley F; O'Connor, Helen M; Petersen, Ronald C
2010-01-01
To investigate associations of the Mediterranean diet (MeDi) components and the MeDi score with mild cognitive impairment (MCI). Participants (aged 70-89 years) were clinically evaluated to assess MCI and dementia, and completed a 128-item food frequency questionnaire. 163 of 1,233 nondemented persons had MCI. The odds ratio of MCI was reduced for high vegetable intake [0.66 (95% CI = 0.44-0.99), p = 0.05] and for high mono- plus polyunsaturated fatty acid to saturated fatty acid ratio [0.52 (95% CI = 0.33-0.81), p = 0.007], adjusted for confounders. The risk of incident MCI or dementia was reduced in subjects with a high MeDi score [hazard ratio = 0.75 (95% CI = 0.46-1.21), p = 0.24]. Vegetables, unsaturated fats, and a high MeDi score may be beneficial to cognitive function.
van der Velde, A Rogier; Gullestad, Lars; Ueland, Thor; Aukrust, Pål; Guo, Yu; Adourian, Aram; Muntendam, Pieter; van Veldhuisen, Dirk J; de Boer, Rudolf A
2013-03-01
In several cross-sectional analyses, circulating baseline levels of galectin-3, a protein involved in myocardial fibrosis and remodeling, have been associated with increased risk for morbidity and mortality in patients with heart failure (HF). The importance and clinical use of repeated measurements of galectin-3 have not yet been reported. Plasma galectin-3 was measured at baseline and at 3 months in patients enrolled in the Controlled Rosuvastatin Multinational Trial in Heart Failure (CORONA) trial (n=1329), and at baseline and at 6 months in patients enrolled in the Coordinating Study Evaluating Outcomes of Advising and Counseling Failure (COACH) trial (n=324). Patient results were analyzed by categorical and percentage changes in galectin-3 level. A threshold value of 17.8 ng/mL or 15% change from baseline was used to categorize patients. Increasing galectin-3 levels over time, from a low to high galectin-3 category, were associated with significantly more HF hospitalization and mortality compared with stable or decreasing galectin-3 levels (hazard ratio in CORONA, 1.60; 95% confidence interval, 1.13-2.25; P=0.007; hazard ratio in COACH, 2.38; 95% confidence interval, 1.02-5.55; P=0.046). In addition, patients whose galectin-3 increased by >15% between measurements had a 50% higher relative hazard of adverse event than those whose galectin-3 stayed within ±15% of the baseline value, independent of age, sex, diabetes mellitus, left ventricular ejection fraction, renal function, medication (β-blocker, angiotensin converting enzyme inhibitor, and angiotensin receptor blocker), and N-terminal probrain natriuretic peptide (hazard ratio in CORONA, 1.50; 95% confidence interval, 1.17-1.92; P=0.001). The impact of changing galectin-3 levels on other secondary end points was comparable. In 2 large cohorts of patients with chronic and acute decompensated HF, repeated measurements of galectin-3 level provided important and significant prognostic value in identifying patients with HF at elevated risk for subsequent HF morbidity and mortality.
Markers of nutritional status and mortality in older adults: The role of anemia and hypoalbuminemia.
Corona, Ligiana Pires; de Oliveira Duarte, Yeda Aparecida; Lebrão, Maria Lúcia
2018-01-01
The aim of the present study was to analyze the impact of anemia and hypoalbuminemia on mortality over a 5-year period. This was a longitudinal, population-based observational survey, part of the Saúde, Bem-Estar e Envelhecimento (Health, Well-being and Aging) study, carried out with 1256 older adults from the third wave of the cohort in Sao Paulo, Brazil, who were followed for 5 years until they were contacted for the fourth wave. Anemia was defined as hemoglobin <12 g/dL for women and <13 g/dL for men, and hypoalbuminemia as serum albumin <3.5 g/dL. Survival functions were estimated according to nutritional status in four groups: (i) without nutritional alteration; (ii) anemia only; (iii) hypoalbuminemia only; and (iv) anemia and hypoalbuminemia. Hazard ratios were calculated using the Cox proportional hazards model, controlling for baseline covariates. All analyses considered sample weights and were carried out using Stata 12. After the 5-year period, 12.3% of the participants had died, and 8.2% were lost to follow-up. Those who died had lower hemoglobin and albumin concentrations (13.4 g/dL and 3.7 g/dL) compared with survivors (14.3 g/dL and 3.9 g/dL; P < 0.001). The crude death rate was 27.6/1000 person-years for participants in group i, 124.3 in group ii, 116.0 in group iii and 222.8 in group iv (P < 0.001). In the final Cox models, groups ii and iii had a similar effect (hazard ratio 2.23, P = 0.020; and 2.53, P = 0.005, respectively) and group iv had a higher risk (hazard ratio 3.36; P = 0.004). Anemia and hypoalbuminemia are important markers for death in older adults, and have an additive effect on mortality. Because they are common and cost-effective biomarkers, their use should be encouraged in geriatric evaluation by all health professionals and in population settings, such as primary care. Geriatr Gerontol Int 2018; 18: 177-182. © 2017 Japan Geriatrics Society.
Allen, M B; Billig, E; Reese, P P; Shults, J; Hasz, R; West, S; Abt, P L
2016-01-01
Donation after cardiac death is an important source of transplantable organs, but evidence suggests donor warm ischemia contributes to inferior outcomes. Attempts to predict recipient outcome using donor hemodynamic measurements have not yielded statistically significant results. We evaluated novel measures of donor hemodynamics as predictors of delayed graft function and graft failure in a cohort of 1050 kidneys from 566 donors. Hemodynamics were described using regression line slopes, areas under the curve, and time beyond thresholds for systolic blood pressure, oxygen saturation, and shock index (heart rate divided by systolic blood pressure). A logistic generalized estimation equation model showed that area under the curve for systolic blood pressure was predictive of delayed graft function (above median: odds ratio 1.42, 95% confidence interval [CI] 1.06-1.90). Multivariable Cox regression demonstrated that slope of oxygen saturation during the first 10 minutes after extubation was associated with graft failure (below median: hazard ratio 1.30, 95% CI 1.03-1.64), with 5-year graft survival of 70.0% (95%CI 64.5%-74.8%) for donors above the median versus 61.4% (95%CI 55.5%-66.7%) for those below the median. Among older donors, increased shock index slope was associated with increased hazard of graft failure. Validation of these findings is necessary to determine the utility of characterizing donor warm ischemia to predict recipient outcome. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
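The hemodynamic summaries named above (regression-line slope, area under the curve, and time beyond a threshold) can be computed directly from a donor's vital-sign trace. The sketch below does this for a made-up systolic blood pressure series; the 80 mmHg threshold and the sampling times are illustrative, not the study's definitions.

```python
import numpy as np

def hemodynamic_features(t_min, sbp, threshold=80.0):
    """Slope (mmHg/min), trapezoidal AUC (mmHg*min), and time below threshold (min)."""
    slope = np.polyfit(t_min, sbp, deg=1)[0]
    dt = np.diff(t_min)
    auc = float(np.sum((sbp[:-1] + sbp[1:]) / 2.0 * dt))     # trapezoid rule
    time_below = float(np.sum(dt[sbp[:-1] < threshold]))     # crude interval count
    return slope, auc, time_below

t = np.arange(0.0, 31.0, 5.0)                      # minutes after extubation
sbp = np.array([120, 110, 95, 85, 70, 60, 55.0])   # systolic blood pressure trace
print(hemodynamic_features(t, sbp))
```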
Evaluating MoE and its Uncertainty and Variability for Food Contaminants (EuroTox presentation)
Margin of Exposure (MoE) is a metric for quantifying the relationship between exposure and hazard. Ideally, it is the ratio of the dose associated with hazard and an estimate of exposure. For example, hazard may be characterized by a benchmark dose (BMD), and, for food contami...
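As a minimal illustration of the ratio just described, with invented numbers and assuming both quantities are expressed in the same dose units:

```python
def margin_of_exposure(benchmark_dose, exposure):
    """MoE: the hazard-characterizing dose (e.g. a BMD) divided by estimated exposure."""
    return benchmark_dose / exposure

# Invented numbers: a BMD of 0.5 mg/kg bw per day and an estimated dietary
# exposure of 0.002 mg/kg bw per day give an MoE of 250.
print(margin_of_exposure(0.5, 0.002))
```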
Ferguson, Kelly K; Meeker, John D; McElrath, Thomas F; Mukherjee, Bhramar; Cantonwine, David E
2017-05-01
Preeclampsia is a prevalent and enigmatic disease, in part characterized by poor remodeling of the spiral arteries. However, preeclampsia does not always clinically present when remodeling has failed to occur. Hypotheses surrounding the "second hit" that is necessary for the clinical presentation of the disease focus on maternal inflammation and oxidative stress. Yet, the studies to date that have investigated these factors have used cross-sectional study designs or small study populations. In the present study, we sought to explore longitudinal trajectories, beginning early in gestation, of a panel of inflammation and oxidative stress markers in women who went on to have preeclamptic or normotensive pregnancies. We examined 441 subjects from the ongoing LIFECODES prospective birth cohort, which included 50 mothers who experienced preeclampsia and 391 mothers with normotensive pregnancies. Participants provided urine and plasma samples at 4 time points during gestation (median, 10, 18, 26, and 35 weeks) that were analyzed for a panel of oxidative stress and inflammation markers. Oxidative stress biomarkers included 8-isoprostane and 8-hydroxydeoxyguanosine. Inflammation biomarkers included C-reactive protein, the cytokines interleukin-1β, -6, and -10, and tumor necrosis factor-α. We created Cox proportional hazard models to calculate hazard ratios based on time of preeclampsia diagnosis in association with biomarker concentrations at each of the 4 study visits. In adjusted models, hazard ratios of preeclampsia were significantly (P<.01) elevated in association with all inflammation biomarkers that were measured at visit 2 (median, 18 weeks; hazard ratios, 1.31-1.83, in association with an interquartile range increase in biomarker). Hazard ratios at this time point were the most elevated for C-reactive protein, for interleukin-1β, -6, and -10, and for the oxidative stress biomarker 8-isoprostane (hazard ratio, 1.68; 95% confidence interval, 1.14-2.48) compared to other time points. Hazard ratios for tumor necrosis factor-α were consistently elevated at all 4 of the study visits (hazard ratios, 1.49-1.63; P<.01). In sensitivity analyses, we observed that these associations were attenuated within groups typically at higher risk of experiencing preeclampsia, which include African American mothers, mothers with higher body mass index at the beginning of gestation, and pregnancies that ended preterm. This study provides the most robust data to date on repeated measures of inflammation and oxidative stress in preeclamptic compared with normotensive pregnancies. Within these groups, inflammation and oxidative stress biomarkers show different patterns across gestation, beginning as early as 10 weeks. The start of the second trimester appears to be a particularly important time point for the measurement of these biomarkers. Although biomarkers alone do not appear to be useful in the prediction of preeclampsia, these data are useful in understanding the maternal inflammatory profile in pregnancy before the development of the disease and may be used to further develop an understanding of potentially preventative measures. Published by Elsevier Inc.
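Hazard ratios "per interquartile range increase" come from rescaling each biomarker by its IQR before fitting the Cox model. A sketch on toy data follows; the biomarker values, follow-up times, and event rates are simulated, and the lifelines library is an assumed choice rather than the study's software.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 441
# Toy cohort: gestational week at diagnosis or censoring, an event flag, and
# one inflammation biomarker measured at ~18 weeks (all simulated).
biomarker = rng.lognormal(mean=1.0, sigma=0.8, size=n)
weeks = rng.uniform(20, 40, size=n)
event = rng.binomial(1, 0.11, size=n)

# Dividing the biomarker by its interquartile range before fitting makes
# exp(coef) the hazard ratio per IQR increase in the biomarker.
iqr = np.percentile(biomarker, 75) - np.percentile(biomarker, 25)
cohort = pd.DataFrame({"T": weeks, "E": event, "biomarker_iqr": biomarker / iqr})

cph = CoxPHFitter().fit(cohort, duration_col="T", event_col="E")
print(cph.hazard_ratios_["biomarker_iqr"])
```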
Stallings, Devita T.; Garvin, Jane T.; Xu, Hongyan; Racette, Susan B.
2017-01-01
Objective To determine which anthropometric measures are the strongest discriminators of incident type 2 diabetes (T2DM) among White and Black males and females in a large U.S. cohort. Methods We used Atherosclerosis Risk in Communities study data from 12,121 participants aged 45–64 years without diabetes at baseline who were followed for over 11 years. Anthropometric measures included a body shape index (ABSI), body adiposity index (BAI), body mass index (BMI), waist circumference (WC), waist to hip ratio (WHR), waist to height ratio (WHtR), and waist to hip to height ratio (WHHR). All anthropometric measures were repeated at each visit and converted to Z-scores. Hazard ratios and 95% confidence intervals adjusted for age were calculated using repeated measures Cox proportional hazard regression analysis. Akaike Information Criteria was used to select best-fit models. The magnitude of the hazard ratio effect sizes and the Harrell’s C-indexes were used to rank the highest associations and discriminators, respectively. Results There were 1,359 incident diabetes cases. Higher values of all anthropometric measures increased the risk for development of T2DM (p < 0.0001) except ABSI, which was not significant in White and Black males. Statistically significant hazard ratios ranged from 1.26–1.63 for males and 1.15–1.88 for females. In general, the largest hazard ratios were those that corresponded to the highest Harrell’s C-Index and lowest Akaike Information Criteria values. Among White and Black males and females, BMI, WC, WHR, and WHtR were comparable in discriminating cases from non-cases of T2DM. ABSI, BAI, and WHHR were inferior discriminators of incident T2DM across all race-gender groups. Conclusions BMI, the most commonly used anthropometric measure, and three anthropometric measures that included waist circumference (i.e., WC, WHR, WHtR) were the best anthropometric discriminators of incident T2DM across all race-gender groups in the ARIC cohort. PMID:28141847
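The model-ranking logic described above can be sketched as follows: fit one Cox model per standardized measure and compare hazard ratios, Harrell's C-index, and AIC. The data below are simulated stand-ins for two measures, and the single-visit Cox model deliberately ignores the repeated-measures aspect of the actual analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000
# Simulated follow-up (years) to incident T2DM and two standardized measures
# (Z-scores) standing in for BMI and waist circumference.
bmi_z = rng.normal(size=n)
wc_z = 0.8 * bmi_z + 0.6 * rng.normal(size=n)          # correlated, unit variance
time = rng.exponential(10.0 / np.exp(0.4 * wc_z))      # higher WC -> earlier onset
event = (time < 11.0).astype(int)
cohort = pd.DataFrame({"T": np.minimum(time, 11.0), "E": event,
                       "bmi_z": bmi_z, "wc_z": wc_z})

for measure in ["bmi_z", "wc_z"]:
    cph = CoxPHFitter().fit(cohort[["T", "E", measure]],
                            duration_col="T", event_col="E")
    aic = -2.0 * cph.log_likelihood_ + 2.0 * len(cph.params_)  # partial-likelihood AIC
    print(measure, "HR per SD = %.2f" % cph.hazard_ratios_[measure],
          "C-index = %.3f" % cph.concordance_index_, "AIC = %.1f" % aic)
```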
A rainfall risk analysis thanks to a GIS-based estimation of urban vulnerability
NASA Astrophysics Data System (ADS)
Renard, Florent; Pierre-Marie, Chapon
2010-05-01
The urban community of Lyon, situated in France in the north of the Rhône valley, comprises 1.2 million inhabitants within 515 km². With such a concentration of assets, policy makers and local elected officials attach great importance to the management of hydrological risks, particularly given the inherent characteristics of the territory. While the hazards associated with these risks in the Lyon area have been the subject of numerous analyses, studies on the vulnerability of Greater Lyon are rare and share shortcomings that impair their validity. We recall that risk is seen as the classic relationship between the probability of occurrence of a hazard and vulnerability. In this article, vulnerability is composed of two parts. The first is the sensitivity of the stakes exposed to hydrological hazards such as urban runoff, that is, their propensity to suffer damage during a flood (Gleize and Reghezza, 2007). The second is their relative importance in the functioning of the community: not all stakes play the same role or make the same contribution to Greater Lyon. For example, damage to urban furniture such as a bus shelter seems less harmful to the activities of the urban area than damage to transport infrastructure (Renard and Chapon, 2010). This communication proposes to assess the vulnerability of the Lyon urban area to hydrological hazards. The territory comprises human, environmental and material stakes. The first part of this work is to identify all these stakes as exhaustively as possible. It is then required to build a "vulnerability index" (Tixier et al., 2006), which makes it necessary to use multicriteria decision-aid methods to evaluate the two components of vulnerability: the sensitivity and the contribution to the functioning of the community. Finally, the results of the overall vulnerability are presented and then coupled to various water-related hazards, such as runoff associated with heavy rains, to locate areas of risk in the urban area. Stakes that share the same rank of this vulnerability index do not possess the same importance, nor the same sensitivity to the flood hazard. Therefore, the second part of this work is to define the priorities and sensitivities of the different stakes based on expert judgment. Multicriteria decision methods are used to prioritize elements and are therefore suited to modelling the sensitivity of the stakes of Greater Lyon (Griot, 2008). The purpose of these methods is the assessment of priorities between the different components of the situation; Thomas Saaty's analytic hierarchy process (1980) is the most frequently used because of its many advantages. On this basis, formal calculations of the priorities and sensitivities of the elements were conducted, based on the judgments of experts. During semi-structured interviews, the 38 experts in our sample judged, by pairwise comparison, which stakes seemed relatively more important than others, and proceeded in the same way to determine the sensitivity of the stakes to flood hazard. Finally, the consistency of the answers given by the experts is validated by calculating a consistency ratio, and their results are aggregated to provide priority functions (based on the relative importance of each stake) and sensitivity functions (based on the relative sensitivity of each stake). From these priority and sensitivity functions, the general vulnerability function is obtained.
The vulnerability functions make it possible to define the importance of the stakes of Greater Lyon and their sensitivity to hydrological hazards. The global vulnerability function is obtained from the sensitivity and priority functions and shows the great importance of human stakes (75%). Environmental stakes account for 12% of the global vulnerability function, as do material stakes. However, the environmental and material stakes do not carry the same weight in the priority and sensitivity functions: environmental stakes appear more important than material ones (17% versus 5% in the priority function), but less sensitive to a hydrological hazard (6% versus 20% in the sensitivity function). Priority and sensitivity functions are established in the same way for all stakes at all levels. The stakes are then converted onto a 100-meter grid, which standardizes the collection framework and the heterogeneous nature of the data to allow their comparison. The result is a detailed, consistent and objective picture of the vulnerability of the territory of Greater Lyon. Finally, to obtain a direct reading of risk, the combination of hazard and vulnerability, the two maps are overlaid.
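For readers unfamiliar with Saaty's analytic hierarchy process, the sketch below computes a priority vector and consistency ratio from a single reciprocal pairwise-comparison matrix. The 3x3 judgment matrix is invented for illustration and is not taken from the 38 expert interviews.

```python
import numpy as np

# Saaty's random consistency indices for matrices of order 1-9
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
                6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_priorities(pairwise):
    """Priority vector (principal eigenvector) and consistency ratio of one matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                              # normalized priorities
    ci = (eigvals[k].real - n) / (n - 1)         # consistency index
    cr = ci / RANDOM_INDEX[n]                    # consistency ratio; accept if < 0.10
    return w, cr

# Invented 3x3 judgment matrix: human vs environmental vs material stakes
judgments = [[1.0, 5.0, 7.0],
             [1/5, 1.0, 3.0],
             [1/7, 1/3, 1.0]]
weights, cr = ahp_priorities(judgments)
print(np.round(weights, 2), round(cr, 3))
```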
Permuth-Wey, Jennifer; Thompson, Reid C.; Nabors, L. Burton; Olson, Jeffrey J.; Browning, James E.; Madden, Melissa H.; Chen, Y. Ann
2011-01-01
MicroRNAs (miRNAs) are non-coding RNAs that function as post-transcriptional regulators of tumor suppressors and oncogenes. Single nucleotide polymorphisms (SNPs) in miRNAs may contribute to carcinogenesis by altering expression of miRNAs and their targets. A G>C polymorphism (rs2910164) in the miR-146a precursor sequence leads to a functional change associated with the risk for numerous malignancies. A role for this SNP in glioma pathogenesis has not yet been examined. We investigated whether rs2910164 genotypes influence glioma risk and prognosis in a multi-center case–control study comprising 593 Caucasian glioma cases and 614 community-based controls. Unconditional logistic regression was used to estimate odds ratios (OR) and 95% confidence intervals (CI) for rs2910164 genotypes according to case status. Cox proportional hazards regression modeling was used to estimate hazard ratios (HR) and 95% CIs according to genotype among glioblastomas, the most lethal glioma subtype. An increased glioma risk was observed among rs2910164 minor allele (C) carriers (per allele OR (95% CI) = 1.22 (1.01–1.46, ptrend = 0.039)). The association was stronger among older subjects carrying at least one copy of the C allele (OR (95% CI) = 1.38 (1.04–1.83, P = 0.026)). Mortality was increased among minor allele carriers (HR (95% CI) = 1.33 (1.03–1.72, P = 0.029)), with the association largely restricted to females (HR (95% CI) = 2.02 (1.28–3.17, P = 0.002)). We provide novel data suggesting rs2910164 genotype may contribute to glioma susceptibility and outcome. Future studies are warranted to replicate these findings and characterize mechanisms underlying these associations. PMID:21744077
Lu, Lingeng; Katsaros, Dionyssios; Mayne, Susan T; Risch, Harvey A; Benedetto, Chiara; Canuto, Emilie Marion; Yu, Herbert
2012-11-01
Several single-nucleotide polymorphisms (SNPs) of the stem cell-associated gene lin-28B have been identified in association with ovarian cancer and ovarian cancer-related risk factors. However, whether these SNPs are functional or might be potential biomarkers for ovarian cancer prognosis remains unknown. The purposes of this study were to investigate the functional relevance of the identified lin-28B SNPs, as well as the associations of genotype and phenotype with epithelial ovarian cancer (EOC) survival. We analyzed five SNPs and mRNA levels of lin-28B in 211 primary EOC tissues using Taqman(®) SNP genotyping assays and SYBR green-based real-time PCR, respectively. The RNA secondary structures at the region of a genome-wide association-identified intronic rs314276 were analyzed theoretically with mfold and experimentally with circular dichroism spectroscopy. We found that rs314276 was a cis-acting expression quantitative trait locus (eQTL) in both additive and dominant models, while rs7759938 and rs314277 were significant or of borderline significance in dominant models only. The rs314276 variant significantly affects RNA secondary structure. No SNPs alone were associated with patient survival. However, we found that among patients initially responding to chemotherapy, those with higher lin-28B expression had higher mortality risk (hazard ratio =3.27, 95% confidence interval: 1.63-6.56) and relapse risk (hazard ratio = 2.53, 95% confidence interval: 1.41-4.54) than those with lower expression, and these associations remained in multivariate analyses. These results suggest that rs314276 alters RNA secondary structure and thereby influences gene expression, and that lin-28B is a cancer stem cell-associated marker, which may be a pharmaceutical target in the management of EOC.
Schaubel, Douglas E; Wei, Guanghui
2011-03-01
In medical studies of time-to-event data, nonproportional hazards and dependent censoring are very common issues when estimating the treatment effect. A traditional method for dealing with time-dependent treatment effects is to model the time-dependence parametrically. Limitations of this approach include the difficulty to verify the correctness of the specified functional form and the fact that, in the presence of a treatment effect that varies over time, investigators are usually interested in the cumulative as opposed to instantaneous treatment effect. In many applications, censoring time is not independent of event time. Therefore, we propose methods for estimating the cumulative treatment effect in the presence of nonproportional hazards and dependent censoring. Three measures are proposed, including the ratio of cumulative hazards, relative risk, and difference in restricted mean lifetime. For each measure, we propose a double inverse-weighted estimator, constructed by first using inverse probability of treatment weighting (IPTW) to balance the treatment-specific covariate distributions, then using inverse probability of censoring weighting (IPCW) to overcome the dependent censoring. The proposed estimators are shown to be consistent and asymptotically normal. We study their finite-sample properties through simulation. The proposed methods are used to compare kidney wait-list mortality by race. © 2010, The International Biometric Society.
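The double weighting idea can be sketched as follows: a propensity model supplies inverse-probability-of-treatment weights, and a survival model for the censoring process supplies inverse-probability-of-censoring weights evaluated at each subject's observed time; their product then weights event contributions in the cumulative-hazard, relative-risk, or restricted-mean estimators. The code below only builds the weights, on simulated data, and uses a simple marginal Kaplan-Meier for censoring, which is a simplification of the authors' estimator rather than a faithful implementation of it.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 3000
# Simulated data: one confounder, a binary treatment, and censored event times.
x = rng.normal(size=n)
treat = rng.binomial(1, 1.0 / (1.0 + np.exp(-x)))               # confounded treatment
event_time = rng.exponential(5.0 * np.exp(-0.3 * treat - 0.2 * x))
censor_time = rng.exponential(6.0, size=n)
time = np.minimum(event_time, censor_time)
event = (event_time <= censor_time).astype(int)

# IPTW: balance the treatment-specific covariate distributions.
ps = LogisticRegression().fit(x.reshape(-1, 1), treat).predict_proba(x.reshape(-1, 1))[:, 1]
w_treat = np.where(treat == 1, 1.0 / ps, 1.0 / (1.0 - ps))

# IPCW: weight each observed event by the inverse probability of still being
# uncensored at its event time (marginal Kaplan-Meier of the censoring times).
km_cens = KaplanMeierFitter().fit(time, event_observed=1 - event)
surv_cens = km_cens.survival_function_at_times(time).to_numpy()
w_cens = np.where(event == 1, 1.0 / np.clip(surv_cens, 1e-3, None), 0.0)

double_weights = w_treat * w_cens   # enter the weighted cumulative-hazard estimators
print("mean IPTW weight:", round(w_treat.mean(), 2),
      "mean event weight:", round(double_weights[event == 1].mean(), 2))
```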
Janghorbani, Mohsen; Amini, Masoud
2016-09-01
In this study, we evaluate the association between the triglyceride to high-density lipoprotein cholesterol (TG/HDL) ratio and the total cholesterol (TC) to HDL (TC/HDL) ratio and the risk of type 2 diabetes (T2D) in an Iranian high-risk population. We analysed 7-year follow-up data (n = 1771) in non-diabetic first-degree relatives (aged 30-70 years) of consecutive patients with T2D. The primary outcome was the diagnosis of T2D based on repeated oral glucose tolerance tests. We used Cox proportional hazard models to estimate hazard ratios for incident T2D across tertiles of the TG/HDL and TC/HDL ratios and plotted a receiver operating characteristic (ROC) curve to assess discrimination. The highest tertile of the TG/HDL and TC/HDL ratios compared with the lowest tertile was not associated with T2D in age- and gender-adjusted models (HR 0.99, 95% CI: 0.88, 1.11 for TG/HDL ratio and 1.10, 95% CI: 0.97, 1.23 for TC/HDL ratio). Further adjustment for waist circumference or body mass index, fasting plasma glucose, and low-density lipoprotein cholesterol did not appreciably alter the hazard ratio compared with the age- and gender-adjusted model. The area under the ROC curve was 57.7% (95% CI: 54.0, 61.5) for the TG/HDL ratio and 55.1% (95% CI: 51.2, 59.0) for the TC/HDL ratio. TG/HDL and TC/HDL ratios were not robust predictors of T2D in high-risk individuals in Iran. Copyright © 2015 John Wiley & Sons, Ltd.
Frontal Lobe Function and Risk of Hip Fracture in Patient With Alzheimer Disease
Roh, Hyun Woong; Hong, Chang Hyung; Lee, SooJin; Lee, Yunhwan; Lee, Kang Soo; Chang, Ki Jung; Oh, Byoung Hoon; Choi, Seong Hye; Kim, Seong Yoon; Back, Joung Hwan; Chung, Young Ki; Lim, Ki Young; Noh, Jai Sung; Son, Sang Joon
2015-01-01
Abstract To determine the association between frontal lobe function and risk of hip fracture in patients with Alzheimer disease (AD). Retrospective cohort study using multicenter hospital-based dementia registry and national health insurance claim data was done. Participants who had available data of neuropsychological test, national health insurance claim, and other covariates were included. A total of 1660 patients with AD were included based on Stroop Test results. A total of 1563 patients with AD were included based on the Controlled Oral Word Association Test (COWAT) results. Hip fracture was measured by validated identification criteria using national health insurance claim data. Frontal lobe function was measured by Stroop Test and COWAT at baseline. After adjusting for potential covariates, including cognitive function in other domains (language, verbal and nonverbal memory, and attention), the Cox proportional hazard regression analysis revealed that risk of a hip fracture was decreased with a hazard ratio (HR) of 0.98 per one point of increase in the Stroop Test (adjusted HR = 0.98, 95% confidence interval [CI]: 0.97–1.00) and 0.93 per one point increase in COWAT (adjusted HR = 0.93, 95% CI: 0.88–0.99). The risk of hip fracture in AD patients was associated with baseline frontal lobe function. The result of this research presents evidence of association between frontal lobe function and risk of hip fracture in patients with AD. PMID:26559259
Preconception B-vitamin and homocysteine status, conception, and early pregnancy loss.
Ronnenberg, Alayne G; Venners, Scott A; Xu, Xiping; Chen, Changzhong; Wang, Lihua; Guang, Wenwei; Huang, Aiqun; Wang, Xiaobin
2007-08-01
Maternal vitamin status contributes to clinical spontaneous abortion, but the role of B-vitamin and homocysteine status in subclinical early pregnancy loss is unknown. Three-hundred sixty-four textile workers from Anqing, China, who conceived at least once during prospective observation (1996-1998), provided daily urine specimens for up to 1 year, and urinary human chorionic gonadotropin was assayed to detect conception and early pregnancy loss. Homocysteine, folate, and vitamins B6 and B12 were measured in preconception plasma. Relative to women in the lowest quartile of vitamin B6, those in the third and fourth quartiles had higher adjusted proportional hazard ratios of conception (hazard ratio (HR)=2.2, 95% confidence interval (CI): 1.3, 3.4; HR=1.6, 95% CI: 1.1, 2.3, respectively), and the adjusted odds ratio for early pregnancy loss in conceptive cycles was lower in the fourth quartile (odds ratio=0.5, 95% CI: 0.3, 1.0). Women with sufficient vitamin B6 had a higher adjusted hazard ratio of conception (HR=1.4, 95% CI: 1.1, 1.9) and a lower adjusted odds ratio of early pregnancy loss in conceptive cycles (odds ratio=0.7, 95% CI: 0.4, 1.1) than did women with vitamin B6 deficiency. Poor vitamin B6 status appears to decrease the probability of conception and to contribute to the risk of early pregnancy loss in this population.
Life-Time Risk for Substance Use Among Offspring of Abusive Family Environment From the Community
Nomura, Yoko; Hurd, Yasmin L.; Pilowsky, Daniel J.
2018-01-01
The current study examined the cumulative risk, age of initiation, and functional impairments among adults with substance use problems (N = 1748) by child abuse status. Child abuse was associated with earlier initiation of marijuana, cocaine, and heroin use, and had greater risks for all the drugs studied (hazard ratios, 1.7–3.2). Furthermore, child abuse was associated with increased medical and functional impairments, including ER visits, health problems, drug dealing, drug dependence, and drug cravings. Provision of social services and parenting education during the perinatal period may prevent the long-term impact of child abuse on substance use and related impairments. The study’s limitations are noted. PMID:22780838
Malhotra, Konark; Katsanos, Aristeidis H; Bilal, Mohammad; Ishfaq, Muhammad Fawad; Goyal, Nitin; Tsivgoulis, Georgios
2018-02-01
Pharmacokinetic and prior studies on thienopyridine and proton pump inhibitors (PPI) coadministration provide conflicting data for cardiovascular outcomes, whereas there is no established evidence on the association of concomitant use of PPI and thienopyridines with adverse cerebrovascular outcomes. We conducted a systematic review and meta-analysis of randomized controlled trials and cohort studies from inception to July 2017, reporting the following outcomes among patients treated with thienopyridine and PPI versus thienopyridine alone: (1) ischemic stroke, (2) combined ischemic or hemorrhagic stroke, (3) composite outcome of stroke, myocardial infarction (MI), and cardiovascular death, (4) MI, (5) all-cause mortality, and (6) major or minor bleeding events. After the unadjusted analyses of risk ratios, we performed additional analyses of studies reporting hazard ratios adjusted for potential confounders. We identified 22 studies (12 randomized controlled trials and 10 cohort studies) comprising 131 714 patients. Concomitant use of PPI with thienopyridines was associated with increased risk of ischemic stroke (risk ratio, 1.74; 95% confidence interval [CI], 1.41-2.16; P <0.001), composite stroke/MI/cardiovascular death (risk ratio, 1.14; 95% CI, 1.01-1.29; P =0.04), and MI (risk ratio, 1.19; 95% CI, 1.00-1.40; P =0.05). Likewise, in adjusted analyses, concomitant use of PPI with thienopyridines was again associated with increased risk of stroke (adjusted hazard ratio, 1.30; 95% CI, 1.04-1.61; P =0.02) and composite stroke/MI/cardiovascular death (adjusted hazard ratio, 1.23; 95% CI, 1.03-1.47; P =0.02), but not with MI (adjusted hazard ratio, 1.19; 95% CI, 0.93-1.52; P =0.16). Co-prescription of PPI and thienopyridines increases the risk of incident ischemic strokes and composite stroke/MI/cardiovascular death. Our findings corroborate the current guidelines for PPI deprescription and pharmacovigilance, especially in patients treated with thienopyridines. © 2018 American Heart Association, Inc.
Ma, Yunsheng; Hébert, James R.; Balasubramanian, Raji; Wedick, Nicole M.; Howard, Barbara V.; Rosal, Milagros C.; Liu, Simin; Bird, Chloe E.; Olendzki, Barbara C.; Ockene, Judith K.; Wactawski-Wende, Jean; Phillips, Lawrence S.; LaMonte, Michael J.; Schneider, Kristin L.; Garcia, Lorena; Ockene, Ira S.; Merriam, Philip A.; Sepavich, Deidre M.; Mackey, Rachel H.; Johnson, Karen C.; Manson, JoAnn E.
2013-01-01
Using data from the Women's Health Initiative (1993–2009; n = 158,833 participants, of whom 84.1% were white, 9.2% were black, 4.1% were Hispanic, and 2.6% were Asian), we compared all-cause, cardiovascular, and cancer mortality rates in white, black, Hispanic, and Asian postmenopausal women with and without diabetes. Cox proportional hazard models were used for the comparison from which hazard ratios and 95% confidence intervals were computed. Within each racial/ethnic subgroup, women with diabetes had an approximately 2–3 times higher risk of all-cause, cardiovascular, and cancer mortality than did those without diabetes. However, the hazard ratios for mortality outcomes were not significantly different between racial/ethnic subgroups. Population attributable risk percentages (PARPs) take into account both the prevalence of diabetes and hazard ratios. For all-cause mortality, whites had the lowest PARP (11.1, 95% confidence interval (CI): 10.1, 12.1), followed by Asians (12.9, 95% CI: 4.7, 20.9), blacks (19.4, 95% CI: 15.0, 23.7), and Hispanics (23.2, 95% CI: 14.8, 31.2). To our knowledge, the present study is the first to show that hazard ratios for mortality outcomes were not significantly different between racial/ethnic subgroups when stratified by diabetes status. Because of the “amplifying” effect of diabetes prevalence, efforts to reduce racial/ethnic disparities in the rate of death from diabetes should focus on prevention of diabetes. PMID:24045960
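The population attributable risk percent (PARP) calculation referred to above combines subgroup-specific diabetes prevalence with the hazard ratio. A common form is Levin's formula; the sketch below applies it to illustrative numbers (not the study's actual prevalences), assuming the hazard ratio can stand in for the relative risk.

```python
# Levin-type attributable fraction: PARP = 100 * p*(RR-1) / (1 + p*(RR-1)),
# where p is exposure (diabetes) prevalence and RR the relative risk / hazard ratio.
def parp(prevalence, relative_risk):
    excess = prevalence * (relative_risk - 1.0)
    return 100.0 * excess / (1.0 + excess)

# Two hypothetical subgroups with the same hazard ratio but different prevalence:
print(parp(prevalence=0.06, relative_risk=3.0))   # lower prevalence -> lower PARP (~10.7)
print(parp(prevalence=0.15, relative_risk=3.0))   # higher prevalence -> higher PARP (~23.1)
```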
Ishikawa, Joji; Ishikawa, Shizukiyo; Kario, Kazuomi
2015-03-01
We attempted to evaluate whether subjects who exhibit prolonged corrected QT (QTc) interval (≥440 ms in men and ≥460 ms in women) on ECG, with and without ECG-diagnosed left ventricular hypertrophy (ECG-LVH; Cornell product, ≥244 mV×ms), are at increased risk of stroke. Among the 10 643 subjects, there were a total of 375 stroke events during the follow-up period (128.7±28.1 months; 114 142 person-years). The subjects with prolonged QTc interval (hazard ratio, 2.13; 95% confidence interval, 1.22-3.73) had an increased risk of stroke even after adjustment for ECG-LVH (hazard ratio, 1.71; 95% confidence interval, 1.22-2.40). When we stratified the subjects into those with neither a prolonged QTc interval nor ECG-LVH, those with a prolonged QTc interval but without ECG-LVH, and those with ECG-LVH, multivariate-adjusted Cox proportional hazards analysis demonstrated that the subjects with prolonged QTc intervals but not ECG-LVH (1.2% of all subjects; incidence, 10.7%; hazard ratio, 2.70, 95% confidence interval, 1.48-4.94) and those with ECG-LVH (incidence, 7.9%; hazard ratio, 1.83; 95% confidence interval, 1.31-2.57) had an increased risk of stroke events, compared with those with neither a prolonged QTc interval nor ECG-LVH. In conclusion, prolonged QTc interval was associated with stroke risk even among patients without ECG-LVH in the general population. © 2014 American Heart Association, Inc.
Ethnic Differences in Incidence and Outcomes of Childhood Nephrotic Syndrome.
Banh, Tonny H M; Hussain-Shamsy, Neesha; Patel, Viral; Vasilevska-Ristovska, Jovanka; Borges, Karlota; Sibbald, Cathryn; Lipszyc, Deborah; Brooke, Josefina; Geary, Denis; Langlois, Valerie; Reddon, Michele; Pearl, Rachel; Levin, Leo; Piekut, Monica; Licht, Christoph P B; Radhakrishnan, Seetha; Aitken-Menezes, Kimberly; Harvey, Elizabeth; Hebert, Diane; Piscione, Tino D; Parekh, Rulan S
2016-10-07
Ethnic differences in outcomes among children with nephrotic syndrome are unknown. We conducted a longitudinal study at a single regional pediatric center comparing ethnic differences in incidence from 2001 to 2011 census data and longitudinal outcomes, including relapse rates, time to first relapse, frequently relapsing disease, and use of cyclophosphamide. Among 711 children, 24% were European, 33% were South Asian, 10% were East/Southeast Asian, and 33% were of other origins. Over 10 years, the overall incidence increased from 1.99/100,000 to 4.71/100,000 among children ages 1-18 years old. In 2011, South Asians had a higher incidence rate ratio of 6.61 (95% confidence interval, 3.16 to 15.1) compared with Europeans. East/Southeast Asians had a similar incidence rate ratio (0.76; 95% confidence interval, 0.13 to 2.94) to Europeans. We determined outcomes in 455 children from the three largest ethnic groups with steroid-sensitive disease over a median of 4 years. South Asian and East/Southeast Asian children had significantly lower odds of frequently relapsing disease at 12 months (South Asian: adjusted odds ratio; 0.55; 95% confidence interval, 0.39 to 0.77; East/Southeast Asian: adjusted odds ratio; 0.42; 95% confidence interval, 0.34 to 0.51), fewer subsequent relapses (South Asian: adjusted odds ratio; 0.64; 95% confidence interval, 0.50 to 0.81; East/Southeast Asian: adjusted odds ratio; 0.47; 95% confidence interval, 0.24 to 0.91), lower risk of a first relapse (South Asian: adjusted hazard ratio, 0.74; 95% confidence interval, 0.67 to 0.83; East/Southeast Asian: adjusted hazard ratio, 0.65; 95% CI, 0.63 to 0.68), and lower use of cyclophosphamide (South Asian: adjusted hazard ratio, 0.82; 95% confidence interval, 0.53 to 1.28; East/Southeast Asian: adjusted hazard ratio, 0.54; 95% confidence interval, 0.41 to 0.71) compared with European children. Despite the higher incidence among South Asians, South and East/Southeast Asian children have significantly less complicated clinical outcomes compared with Europeans. Copyright © 2016 by the American Society of Nephrology.
A Bayesian Hybrid Adaptive Randomisation Design for Clinical Trials with Survival Outcomes.
Moatti, M; Chevret, S; Zohar, S; Rosenberger, W F
2016-01-01
Response-adaptive randomisation designs have been proposed to improve the efficiency of phase III randomised clinical trials and improve the outcomes of the clinical trial population. In the setting of failure time outcomes, Zhang and Rosenberger (2007) developed a response-adaptive randomisation approach that targets an optimal allocation, based on a fixed sample size. The aim of this research is to propose a response-adaptive randomisation procedure for survival trials with an interim monitoring plan, based on the following optimal criterion: for fixed variance of the estimated log hazard ratio, what allocation minimizes the expected hazard of failure? We demonstrate the utility of the design by redesigning a clinical trial on multiple myeloma. To handle continuous monitoring of data, we propose a Bayesian response-adaptive randomisation procedure, where the log hazard ratio is the effect measure of interest. Combining the prior with the normal likelihood, the mean posterior estimate of the log hazard ratio allows derivation of the optimal target allocation. We perform a simulation study to assess and compare the performance of this proposed Bayesian hybrid adaptive design to those of fixed, sequential or adaptive - either frequentist or fully Bayesian - designs. Non informative normal priors of the log hazard ratio were used, as well as mixture of enthusiastic and skeptical priors. Stopping rules based on the posterior distribution of the log hazard ratio were computed. The method is then illustrated by redesigning a phase III randomised clinical trial of chemotherapy in patients with multiple myeloma, with mixture of normal priors elicited from experts. As expected, there was a reduction in the proportion of observed deaths in the adaptive vs. non-adaptive designs; this reduction was maximized using a Bayes mixture prior, with no clear-cut improvement by using a fully Bayesian procedure. The use of stopping rules allows a slight decrease in the observed proportion of deaths under the alternate hypothesis compared with the adaptive designs with no stopping rules. Such Bayesian hybrid adaptive survival trials may be promising alternatives to traditional designs, reducing the duration of survival trials, as well as optimizing the ethical concerns for patients enrolled in the trial.
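The normal-prior/normal-likelihood update described above has a closed form. The sketch below (Python, illustrative numbers only) shows the conjugate posterior for the log hazard ratio and a posterior probability that could feed a stopping rule or the adaptive allocation; it is a schematic, not the authors' implementation.

```python
# Conjugate normal update for the log hazard ratio: prior N(m0, s0^2) combined with
# an approximately normal interim estimate N(loghr_hat, se^2). Numbers are made up.
import numpy as np
from scipy.stats import norm

prior_mean, prior_sd = 0.0, 1.0            # weakly informative prior on log HR
obs_loghr, obs_se = -0.35, 0.20            # interim estimate and its standard error

post_var = 1.0 / (1.0 / prior_sd**2 + 1.0 / obs_se**2)
post_mean = post_var * (prior_mean / prior_sd**2 + obs_loghr / obs_se**2)
post_sd = np.sqrt(post_var)

# Posterior probability that the experimental arm reduces the hazard (log HR < 0),
# which could drive a stopping rule or tilt the allocation ratio.
print(post_mean, post_sd, norm.cdf(0.0, loc=post_mean, scale=post_sd))
```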
Effect of Long Working Hours on Self-reported Hypertension among Middle-aged and Older Wage Workers.
Yoo, Dong Hyun; Kang, Mo-Yeol; Paek, Domyung; Min, Bokki; Cho, Sung-Il
2014-01-01
Many studies have reported an association between overwork and hypertension. However, research on the health effects of long working hours has yielded inconclusive results. The objective of this study was to identify an association between overtime work and hypertension in wage workers 45 years and over of age using prospective data. Wage workers in Korea aged 45 years and over were selected for inclusion in this study from among 10,254 subjects from the Korean Longitudinal Study of Ageing. Workers with baseline hypertension and those with other major diseases were excluded. In the end, a total of 1,079 subjects were included. A Cox proportional hazards model was used to calculate hazard ratios and adjust for baseline characteristics such as sex, age, education, income, occupation, form of employment, body mass index, alcohol habit, smoking habit, regular exercise, and number of working days per week. Additional models were used to calculate hazard ratios after gender stratification. Among the 1,079 subjects, 85 workers were diagnosed with hypertension during 3974.2 person-months. The average number of working hours per week for all subjects was 47.68. The proportion of overtime workers was 61.0% (cutoff, 40 h per week). Compared with those working 40 h and less per week, the hazard ratio of subjects in the final model, which adjusted for all selected variables, working 41-50 h per week was 2.20 (95% confidence interval [CI], 1.19-4.06), that of subjects working 51-60 h per week was 2.40 (95% CI, 1.07-5.39), and that of subjects working 61 h and over per week was 2.87 (95% CI, 1.33-6.20). In gender stratification models, the hazard ratio of the females tended to be higher than that of the males. As the number of working hours per week increased, the hazard ratio for diagnosis of hypertension significantly increased. This result suggests a positive association between overtime work and the risk of hypertension.
Mental Health Symptoms Among Student Service Members/Veterans and Civilian College Students.
Cleveland, Sandi D; Branscum, Adam J; Bovbjerg, Viktor E; Thorburn, Sheryl
2015-01-01
The aim of this study was to investigate whether and to what extent student service members/veterans differ from civilian college students in the prevalence of self-reported symptoms of poor mental health. The Fall 2011 implementation of the American College Health Association-National College Health Assessment included 27,774 respondents from 44 colleges and universities. Participants were matched using propensity scores, and the prevalence of symptoms was compared using logistic regression and zero-inflated negative binomial regression models. The odds of feeling overwhelmed in the last 12 months were significantly lower among student service members/veterans with a history of hazardous duty (odds ratio [OR] = 0.46, adjusted p value <.05) compared with civilian students. Military service, with and without hazardous duty deployment, was not a significant predictor of the total number of symptoms of poor mental health. Current student service members/veterans may not be disproportionately affected by poor psychological functioning.
Induced abortion and breast cancer among parous women: a Danish cohort study.
Braüner, Christina Marie; Overvad, Kim; Tjønneland, Anne; Attermann, Jørn
2013-06-01
We investigated whether induced abortion is associated with breast cancer when lifestyle confounders, including smoking and alcohol consumption, are adjusted for. The design was a prospective cohort study of 25,576 Danish women from the Diet, Cancer and Health study. We obtained exposure data from baseline questionnaires filled in by the women between 1993 and 1997. Information on breast cancer and emigration was retrieved from Danish national registries. The study power was approximately 85% when applying a minimum detection hazard ratio of 1.2. The main outcome was the long-term effect of induced abortion on the risk of breast cancer among women above 50 years of age. During a follow-up of approximately 12 years, 1215 women were diagnosed with breast cancer. When comparing parous women who had an abortion with parous women who never had an abortion, there was no association between breast cancer risk and induced abortion (ever vs. never), with a hazard ratio of 0.95 (95% confidence interval 0.83-1.09), regardless of whether the abortion occurred before the first birth (hazard ratio 0.86; 95% confidence interval 0.65-1.14) or after the first birth (hazard ratio 0.97; 95% confidence interval 0.84-1.13). Our study did not show evidence of an association between induced abortion and breast cancer risk. © 2013 The Authors Acta Obstetricia et Gynecologica Scandinavica © 2013 Nordic Federation of Societies of Obstetrics and Gynecology.
Gender related Long-term Differences after Open Infrainguinal Surgery for Critical Limb Ischemia.
Lejay, A; Schaeffer, M; Georg, Y; Lucereau, B; Roussin, M; Girsowicz, E; Delay, C; Schwein, A; Thaveau, F; Geny, B; Chakfe, N
2015-10-01
The role of gender in long-term infrainguinal open surgery outcomes remains uncertain in critical limb ischemia patients. The aim of this study was to evaluate gender-specific differences in patient characteristics and long-term clinical outcomes in terms of survival, primary patency and limb salvage among patients undergoing infrainguinal open surgery for critical limb ischemia. All consecutive patients undergoing infrainguinal open surgery for critical limb ischemia between 2003 and 2012 were included. Survival, limb salvage and primary patency rates were assessed. Independent outcome determinants were identified by Cox proportional hazards regression using age and gender as adjustment factors. A total of 584 patients (269 women and 315 men; mean age 76 and 71 years, respectively) underwent 658 infrainguinal open procedures (313 in women and 345 in men). Survival at 6 years was lower among women than men (53.5% vs 70.9%, p < 0.001). The same applied to primary patency (35.9% vs 52.4%, p < 0.001) and limb salvage (54.3% vs 81.1%, p < 0.001) at 6 years. Female gender was an independent factor predicting death (hazard ratio 1.50), thrombosis (hazard ratio 2.37) and limb loss (hazard ratio 7.05) in the age- and gender-adjusted analysis. Gender-related disparity in critical limb ischemia open surgical revascularization outcomes still remains. Copyright © 2015 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
Thomas, Laine; Svetkey, Laura; Brancati, Frederick L.; Califf, Robert M.; Edelman, David
2013-01-01
BACKGROUND Low and low-normal serum potassium is associated with an increased risk of diabetes. We hypothesized that the protective effect of valsartan on diabetes risk could be mediated by its effect of raising serum potassium. METHODS We analyzed data from the Nateglinide and Valsartan in Impaired Glucose Tolerance Outcomes Research (NAVIGATOR) trial, which randomized participants at risk for diabetes to either valsartan (up to 160mg daily) or no valsartan. Using Cox models, we evaluated the effect of valsartan on diabetes risk over a median of 4 years of follow-up and calculated the mediation effect of serum potassium as the difference in treatment hazard ratios from models excluding and including 1-year change in serum potassium. The 95% confidence interval (CI) for the difference in log hazard ratios was computed by bootstrapping. RESULTS The hazard ratio for developing diabetes among those on valsartan vs. no valsartan was 0.866 (95% CI = 0.795–0.943) vs. 0.868 (95% CI = 0.797–0.945), after controlling for 1-year change in potassium. The bootstrap 95% CI for a difference in these log hazard ratios was not statistically significant (−0.003 to 0.009). CONCLUSIONS Serum potassium does not appear to significantly mediate the protective effect of valsartan on diabetes risk. PMID:23417031
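A minimal sketch of the mediation contrast described above: the difference between the treatment log hazard ratio without and with adjustment for 1-year change in potassium, with a bootstrap percentile interval. Data, column names, and the number of bootstrap replicates are hypothetical.

```python
# Sketch: difference in treatment log hazard ratios before vs after adjusting for
# the candidate mediator (1-year potassium change), with a bootstrap interval.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({"valsartan": rng.integers(0, 2, n)})
df["delta_k_1yr"] = 0.1 * df["valsartan"] + rng.normal(0, 0.3, n)   # mediator
df["years"] = rng.exponential(4, n)
df["diabetes"] = rng.binomial(1, 0.25, n)

def loghr_difference(d):
    unadjusted = CoxPHFitter().fit(d[["years", "diabetes", "valsartan"]],
                                   duration_col="years", event_col="diabetes")
    adjusted = CoxPHFitter().fit(d[["years", "diabetes", "valsartan", "delta_k_1yr"]],
                                 duration_col="years", event_col="diabetes")
    return unadjusted.params_["valsartan"] - adjusted.params_["valsartan"]

boot = [loghr_difference(df.sample(frac=1.0, replace=True)) for _ in range(200)]
print(loghr_difference(df), np.percentile(boot, [2.5, 97.5]))
```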
Magvanjav, Oyunbileg; McDonough, Caitrin W; Gong, Yan; McClure, Leslie A; Talbert, Robert L; Horenstein, Richard B; Shuldiner, Alan R; Benavente, Oscar R; Mitchell, Braxton D; Johnson, Julie A
2017-05-01
Functional polymorphisms (Ser49Gly and Arg389Gly) in ADRB1 have been associated with cardiovascular and β-blocker response outcomes. Herein we examined associations of these polymorphisms with major adverse cardiovascular events (MACE), with and without stratification by β-blocker treatment in patients with a history of stroke. Nine hundred and twenty-six participants of the SPS3 trial's (Secondary Prevention of Small Subcortical Strokes) genetic substudy with hypertension were included. MACE included stroke, myocardial infarction, and all-cause death. Kaplan-Meier and multivariable Cox regression analyses were used. Because the primary component of MACE was ischemic stroke, we tested the association of Ser49Gly with ischemic stroke among 41 475 individuals of European and African ancestry in the NINDS (National Institute of Neurological Disorders and Stroke) SiGN (Stroke Genetics Network). MACE was higher in carriers of the Gly49 allele than in those with the Ser49Ser genotype (10.5% versus 5.4%, log-rank P =0.005). Gly49 carrier status was associated with MACE (hazard ratio, 1.62; 95% confidence interval, 1.00-2.68) and ischemic stroke (hazard ratio, 1.81; 95% confidence interval, 1.01-3.23) in SPS3 and with small artery ischemic stroke (odds ratio, 1.14; 95% confidence interval, 1.03-1.26) in SiGN. In SPS3, β-blocker-treated Gly49 carriers had increased MACE versus non-β-blocker-treated individuals and noncarriers (hazard ratio, 2.03; 95% confidence interval, 1.20-3.45). No associations were observed with the Arg389Gly polymorphism. Among individuals with previous small artery ischemic stroke, the ADRB1 Gly49 polymorphism was associated with MACE, particularly small artery ischemic stroke, a risk that may be increased among β-blocker-treated individuals. Further research is needed to define β-blocker benefit among ischemic stroke patients by ADRB1 genotype. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00059306. © 2017 American Heart Association, Inc.
Hazardous drinking and military community functioning: identifying mediating risk factors.
Foran, Heather M; Heyman, Richard E; Slep, Amy M Smith
2011-08-01
Hazardous drinking is a serious societal concern in military populations. Efforts to reduce hazardous drinking among military personnel have been limited in effectiveness. There is a need for a deeper understanding of how community-based prevention models apply to hazardous drinking in the military. Community-wide prevention efforts may be most effective in targeting community functioning (e.g., support from formal agencies, community cohesion) that impacts hazardous drinking via other proximal risk factors. The goal of the current study is to inform community-wide prevention efforts by testing a model of community functioning and mediating risk factors of hazardous drinking among active duty U.S. Air Force personnel. A large, representative survey sample of U.S. Air Force active duty members (N = 52,780) was collected at 82 bases worldwide. Hazardous drinking was assessed with the widely used Alcohol Use Disorders Identification Test (Saunders, Aasland, Babor, de la Fuente, & Grant, 1993). A variety of individual, family, and community measures were also assessed. Structural equation modeling was used to test a hypothesized model of community functioning, mediating risk factors and hazardous drinking. Depressive symptoms, perceived financial stress, and satisfaction with the U.S. Air Force were identified as significant mediators of the link between community functioning and hazardous drinking for men and women. Relationship satisfaction was also identified as a mediator for men. These results provide a framework for further community prevention research and suggest that prevention efforts geared at increasing aspects of community functioning (e.g., the U.S. Air Force Community Capacity model) may indirectly lead to reductions in hazardous drinking through other proximal risk factors.
A retrospective study of two populations to test a simple rule for spirometry.
Ohar, Jill A; Yawn, Barbara P; Ruppel, Gregg L; Donohue, James F
2016-06-04
Chronic lung disease is common and often under-diagnosed. To test a simple rule for conducting spirometry, we reviewed spirograms from two populations: occupational medicine evaluations (OME) conducted by Saint Louis and Wake Forest Universities at 3 sites (n = 3260, mean age 64.14 years, 95% CI 58.94-69.34, 97% men) and preoperative clinic (POC) evaluations conducted by Wake Forest University at one site (n = 845, mean age 62.10 years, 95% CI 50.46-73.74, 57% men). In this retrospective review of database information collected prospectively by the first author, we identified the rates, types, sensitivity, specificity, and positive and negative predictive values for lung function abnormalities, and the associated mortality, when spirometry was conducted according to the 20/40 rule (≥20 years of smoking in those aged ≥40 years) in the OME population. To determine the reproducibility of the 20/40 rule for conducting spirometry, the rule was applied to the POC population. A lung function abnormality was found in 74% of the OME population and 67% of the POC population. Sensitivity of the rule was 85% for an obstructive pattern and 77% for any abnormality on spirometry. Positive and negative predictive values of the rule for a spirometric abnormality were 74% and 55%, respectively. Patients with an obstructive pattern were at greater risk of coronary heart disease (odds ratio (OR) 1.39 [confidence interval (CI) 1.00-1.93] vs. normal) and death (hazard ratio (HR) 1.53, 95% CI 1.20-1.84) than subjects with normal spirometry. Restricted spirometry patterns were also associated with greater risk of coronary disease (OR 1.7 [CI 1.23-2.35]) and death (HR 1.40, 95% CI 1.08-1.72). Smokers (≥20 pack-years) aged ≥40 years are at increased risk for lung function abnormalities, and those abnormalities are associated with a greater presence of coronary heart disease and increased all-cause mortality. Use of the 20/40 rule could provide a simple method to enhance selection of candidates for spirometry evaluation in the primary care setting.
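The screening-rule arithmetic quoted above (sensitivity, specificity, and predictive values of the 20/40 rule) reduces to a 2x2 table. The sketch below works through it on simulated data with hypothetical column names; the counts are not the study's.

```python
# 2x2-table arithmetic for a screening rule such as the 20/40 rule
# (>=20 years of smoking and age >=40); simulated data, hypothetical columns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 3000
df = pd.DataFrame({
    "smoking_years": rng.gamma(2.0, 12.0, n),
    "age": rng.normal(60, 10, n),
    "abnormal_spirometry": rng.binomial(1, 0.6, n),
})
df["rule_positive"] = ((df["smoking_years"] >= 20) & (df["age"] >= 40)).astype(int)

tp = int(((df["rule_positive"] == 1) & (df["abnormal_spirometry"] == 1)).sum())
fp = int(((df["rule_positive"] == 1) & (df["abnormal_spirometry"] == 0)).sum())
fn = int(((df["rule_positive"] == 0) & (df["abnormal_spirometry"] == 1)).sum())
tn = int(((df["rule_positive"] == 0) & (df["abnormal_spirometry"] == 0)).sum())

sensitivity = tp / (tp + fn)     # rule-positive among those with an abnormality
specificity = tn / (tn + fp)     # rule-negative among those without an abnormality
ppv = tp / (tp + fp)             # abnormality among the rule-positive
npv = tn / (tn + fn)             # no abnormality among the rule-negative
print(sensitivity, specificity, ppv, npv)
```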
Aga, Cathrine; Kartus, Jüri-Tomas; Lind, Martin; Lygre, Stein Håkon Låstad; Granan, Lars-Petter; Engebretsen, Lars
2017-10-01
Double-bundle anterior cruciate ligament (ACL) reconstruction has demonstrated improved biomechanical properties and moderately better objective outcomes compared with single-bundle reconstructions. This could make an impact on the rerupture rate and reduce the risk of revisions in patients undergoing double-bundle ACL reconstruction compared with patients reconstructed with a traditional single-bundle technique. The National Knee Ligament Registers in Scandinavia provide information that can be used to evaluate the revision outcome after ACL reconstructions. The purposes of the study were (1) to compare the risk of revision between double-bundle and single-bundle reconstructions, reconstructed with autologous hamstring tendon grafts; (2) to compare the risk of revision between double-bundle hamstring tendon and single-bundle bone-patellar tendon-bone autografts; and (3) to compare the hazard ratios for the same two research questions after Cox regression analysis was performed. Data collection of primary ACL reconstructions from the National Knee Ligament Registers in Denmark, Norway, and Sweden from July 1, 2005, to December 31, 2014, was retrospectively analyzed. A total of 60,775 patients were included in the study; 994 patients were reconstructed with double-bundle hamstring tendon grafts, 51,991 with single-bundle hamstring tendon grafts, and 7790 with single-bundle bone-patellar tendon-bone grafts. The double-bundle ACL-reconstructed patients were compared with the two other groups. The risk of revision for each research question was detected by the risk ratio, hazard ratio, and the corresponding 95% confidence intervals. Kaplan-Meier analysis was used to estimate survival at 1, 2, and 5 years for the three different groups. Furthermore, a Cox proportional hazard regression model was applied and the hazard ratios were adjusted for country, age, sex, meniscal or chondral injury, and utilized fixation devices on the femoral and tibial sides. There were no differences in the crude risk of revision between the patients undergoing the double-bundle technique and the two other groups. A total of 3.7% patients were revised in the double-bundle group (37 of 994 patients) versus 3.8% in the single-bundle hamstring tendon group (1952 of 51,991; risk ratio, 1.01; 95% confidence interval (CI), 0.73-1.39; p = 0.96), and 2.8% of the patients were revised in the bone-patellar tendon-bone group (219 of the 7790 bone-patellar tendon-bone patients; risk ratio, 0.76; 95% CI, 0.54-1.06; p = 0.11). Cox regression analysis with adjustment for country, age, sex, menisci or cartilage injury, and utilized fixation device on the femoral and tibial sides, did not reveal any further difference in the risk of revision between the single-bundle hamstring tendon and double-bundle hamstring tendon groups (hazard ratio, 1.18; 95% CI, 0.85-1.62; p = 0.33), but the adjusted hazard ratio showed a lower risk of revision in the single-bundle bone-patellar tendon-bone group compared with the double-bundle group (hazard ratio, 0.62; 95% CI, 0.43-0.90; p = 0.01). Comparisons of the graft revision rates reported separately for each country revealed that double-bundle hamstring tendon reconstructions in Sweden had a lower hazard ratio compared with the single-bundle hamstring tendon reconstructions (hazard ratio, 1.00 versus 1.89; 95% CI, 1.09-3.29; p = 0.02). 
Survival at 5 years after index surgery was 96.0% for the double-bundle group, 95.4% for the single-bundle hamstring tendon group, and 97.0% for the single-bundle bone-patellar tendon-bone group. Based on the data from all three national registers, the risk of revision was not influenced by the reconstruction technique in terms of using single- or double-bundle hamstring tendons, although national differences in survival existed. Using bone-patellar tendon-bone grafts lowered the risk of revision compared with double-bundle hamstring tendon grafts. These findings should be considered when deciding what reconstruction technique to use in ACL-deficient knees. Future studies identifying the reasons for graft rerupture in single- and double-bundle reconstructions would be of interest to understand the findings of the present study. Level III, therapeutic study.
Espelt, Albert; Marí-Dell'Olmo, Marc; Penelo, Eva; Bosque-Prous, Marina
2016-06-14
To examine the differences between the Prevalence Ratio (PR) and the Odds Ratio (OR) in a cross-sectional study and to provide tools to calculate the PR using two statistical packages widely used in substance use research (STATA and R). We used cross-sectional data from 41,263 participants from 16 European countries participating in the Survey on Health, Ageing and Retirement in Europe (SHARE). The dependent variable, hazardous drinking, was calculated using the Alcohol Use Disorders Identification Test - Consumption (AUDIT-C). The main independent variable was gender. Other variables used were age, educational level and country of residence. The PR of hazardous drinking in men relative to women was estimated using the Mantel-Haenszel method, log-binomial regression models and Poisson regression models with robust variance. These estimates were compared with the OR calculated using logistic regression models. The prevalence of hazardous drinkers varied among countries. Generally, men had a higher prevalence of hazardous drinking than women [PR=1.43 (1.38-1.47)]. The estimated PR was identical regardless of the method and the statistical package used. However, the OR overestimated the PR, depending on the prevalence of hazardous drinking in the country. In cross-sectional studies, where comparisons between countries with differences in the prevalence of the disease or condition are made, it is advisable to use the PR instead of the OR.
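The abstract's point that the OR overestimates the PR for common outcomes can be reproduced with a Poisson model using a robust (sandwich) variance versus a logistic model. The sketch below uses Python's statsmodels on simulated data with hypothetical column names; the paper itself provides tools in STATA and R.

```python
# PR vs OR on the same simulated data: Poisson GLM with robust variance gives a
# prevalence ratio, a logistic GLM gives an odds ratio. Hypothetical column names.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({"male": rng.integers(0, 2, n), "age": rng.normal(65, 9, n)})
p = 0.15 + 0.07 * df["male"]                       # fairly common outcome
df["hazardous_drinking"] = rng.binomial(1, p)

pr_model = smf.glm("hazardous_drinking ~ male + age", data=df,
                   family=sm.families.Poisson()).fit(cov_type="HC0")
or_model = smf.glm("hazardous_drinking ~ male + age", data=df,
                   family=sm.families.Binomial()).fit()

print(np.exp(pr_model.params["male"]))   # prevalence ratio, men vs women
print(np.exp(or_model.params["male"]))   # odds ratio, larger than the PR here
```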
Näslund-Koch, Charlotte; Nordestgaard, Børge G; Bojesen, Stig E
2016-04-10
CHEK2 is a cell cycle checkpoint regulator, and the CHEK2*1100delC germline mutation leads to loss of function and increased breast cancer risk. It seems plausible that this mutation could also predispose to other cancers. Therefore, we tested the hypothesis that CHEK2*1100delC heterozygosity is associated with increased risk for other cancers in addition to breast cancer in the general population. We examined 86,975 individuals from the Copenhagen General Population Study, recruited from 2003 through 2010. The participants completed a questionnaire on health and lifestyle, were examined physically, had blood drawn for DNA extraction, were tested for presence of CHEK2*1100delC using Taqman assays and sequencing, and were linked over 1943 through 2011 to the Danish Cancer Registry. Incidences and risks of individual cancer types, including breast cancer, were calculated using Kaplan-Meier estimates, Fine and Gray competing-risks regressions, and stratified analyses with interaction tests. Among 86,975 individuals, 670 (0.8%) were CHEK2*1100delC heterozygous, 2,442 developed breast cancer, and 6,635 developed other cancers. The age- and sex-adjusted hazard ratio for CHEK2*1100delC heterozygotes compared with noncarriers was 2.08 (95% CI, 1.51 to 2.85) for breast cancer and 1.45 (95% CI, 1.15 to 1.82) for other cancers. When stratifying for sex, the age-adjusted hazard ratios for other cancers were 1.54 (95% CI, 1.08 to 2.18) for women and 1.37 (95% CI, 1.01 to 1.85) for men (sex difference: P = .63). For CHEK2*1100delC heterozygotes compared with noncarriers, the age- and sex-adjusted hazard ratios were 5.76 (95% CI, 2.12 to 15.6) for stomach cancer, 3.61 (95% CI, 1.33 to 9.79) for kidney cancer, 3.45 (95% CI, 1.09 to 10.9) for sarcoma, and 1.60 (95% CI, 1.00 to 2.56) for prostate cancer. CHEK2*1100delC heterozygosity is associated with 15% to 82% increased risk for at least some cancers in addition to breast cancer. This information may be useful in clinical counseling of patients with this loss-of-function mutation. © 2016 by American Society of Clinical Oncology.
Husser, Oliver; Monmeneu, Jose V; Bonanad, Clara; Lopez-Lereu, Maria P; Nuñez, Julio; Bosch, Maria J; Garcia, Carlos; Sanchis, Juan; Chorro, Francisco J; Bodi, Vicente
2014-09-01
The incremental prognostic value of inducible myocardial ischemia over necrosis derived by stress cardiac magnetic resonance in depressed left ventricular function is unknown. We determined the prognostic value of necrosis and ischemia in patients with depressed left ventricular function referred for dipyridamole stress perfusion magnetic resonance. In a multicenter registry using stress magnetic resonance, the presence (≥ 2 segments) of late enhancement and perfusion defects and their association with major events (cardiac death and nonfatal infarction) was determined. In 391 patients, perfusion defect or late enhancement were present in 224 (57%) and 237 (61%). During follow-up (median, 96 weeks), 47 major events (12%) occurred: 25 cardiac deaths and 22 myocardial infarctions. Patients with major events displayed a larger extent of perfusion defects (6 segments vs 3 segments; P <.001) but not late enhancement (5 segments vs 3 segments; P =.1). Major event rate was significantly higher in the presence of perfusion defects (17% vs 5%; P =.0005) but not of late enhancement (14% vs 9%; P =.1). Patients were categorized into 4 groups: absence of perfusion defect and absence of late enhancement (n = 124), presence of late enhancement and absence of perfusion defect (n = 43), presence of perfusion defect and presence of late enhancement (n = 195), absence of late enhancement and presence of perfusion defect (n = 29). Event rate was 5%, 7%, 16%, and 24%, respectively (P for trend = .003). In a multivariate regression model, only perfusion defect (hazard ratio = 2.86; 95% confidence interval, 1.37-5.95; P = .002) but not late enhancement (hazard ratio = 1.70; 95% confidence interval, 0.90-3.22; P =.105) predicted events. In depressed left ventricular function, the presence of inducible ischemia is the strongest predictor of major events. Copyright © 2014 Sociedad Española de Cardiología. Published by Elsevier España. All rights reserved.
Baron Toaldo, Marco; Romito, Giovanni; Guglielmini, Carlo; Diana, Alessia; Pelle, Nazzareno G; Contiero, Barbara; Cipone, Mario
2018-05-01
The prognostic relevance of left atrial (LA) morphological and functional variables, including those derived from speckle tracking echocardiography (STE), has been little investigated in veterinary medicine. The aim was to assess the prognostic value of several echocardiographic variables, with a focus on LA morphological and functional variables, in dogs with myxomatous mitral valve disease (MMVD). One hundred and fifteen dogs of different breeds with MMVD were enrolled in a prospective cohort study. Conventional morphologic and echo-Doppler variables, LA areas and volumes, and STE-based LA strain analysis were performed in all dogs. A survival analysis was performed to test for the best echocardiographic predictors of cardiac-related death. Most of the tested variables, including all LA STE-derived variables, were univariate predictors of cardiac death in Cox proportional hazard analysis. Because of strong correlation between many variables, only left atrium to aorta ratio (LA/Ao > 1.7), mitral valve E wave velocity (MV E vel > 1.3 m/s), LA maximal volume (LAVmax > 3.53 mL/kg), peak atrial longitudinal strain (PALS < 30%), and contraction strain index (CSI per 1% increase) were entered in the multivariable analysis, and all were predictors of cardiac death. However, only the MV E vel (hazard ratio [HR], 4.45; confidence interval [CI], 1.76-11.24; P < .001) and LAVmax (HR, 2.32; CI, 1.10-4.89; P = .024) remained statistically significant in the multivariable analysis. The assessment of LA dimension and function provides useful prognostic information in dogs with MMVD. Considering all the LA variables, LAVmax appears to be the strongest predictor of cardiac death, being superior to LA/Ao and STE-derived variables. Copyright © 2018 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
Outcome of intracerebral hemorrhage associated with different oral anticoagulants.
Wilson, Duncan; Seiffge, David J; Traenka, Christopher; Basir, Ghazala; Purrucker, Jan C; Rizos, Timolaos; Sobowale, Oluwaseun A; Sallinen, Hanne; Yeh, Shin-Joe; Wu, Teddy Y; Ferrigno, Marc; Houben, Rik; Schreuder, Floris H B M; Perry, Luke A; Tanaka, Jun; Boulanger, Marion; Al-Shahi Salman, Rustam; Jäger, Hans R; Ambler, Gareth; Shakeshaft, Clare; Yakushiji, Yusuke; Choi, Philip M C; Staals, Julie; Cordonnier, Charlotte; Jeng, Jiann-Shing; Veltkamp, Roland; Dowlatshahi, Dar; Engelter, Stefan T; Parry-Jones, Adrian R; Meretoja, Atte; Werring, David J
2017-05-02
In an international collaborative multicenter pooled analysis, we compared mortality, functional outcome, intracerebral hemorrhage (ICH) volume, and hematoma expansion (HE) between non-vitamin K antagonist oral anticoagulation-related ICH (NOAC-ICH) and vitamin K antagonist-associated ICH (VKA-ICH). We compared all-cause mortality within 90 days for NOAC-ICH and VKA-ICH using a Cox proportional hazards model adjusted for age; sex; baseline Glasgow Coma Scale score, ICH location, and log volume; intraventricular hemorrhage volume; and intracranial surgery. We addressed heterogeneity using a shared frailty term. Good functional outcome was defined as discharge modified Rankin Scale score ≤2 and investigated in multivariable logistic regression. ICH volume was measured by ABC/2 or a semiautomated planimetric method. HE was defined as an ICH volume increase >33% or >6 mL from baseline within 72 hours. We included 500 patients (97 NOAC-ICH and 403 VKA-ICH). Median baseline ICH volume was 14.4 mL (interquartile range [IQR] 3.6-38.4) for NOAC-ICH vs 10.6 mL (IQR 4.0-27.9) for VKA-ICH ( p = 0.78). We did not find any difference between NOAC-ICH and VKA-ICH for all-cause mortality within 90 days (33% for NOAC-ICH vs 31% for VKA-ICH [ p = 0.64]; adjusted Cox hazard ratio (for NOAC-ICH vs VKA-ICH) 0.93 [95% confidence interval (CI) 0.52-1.64] [ p = 0.79]), the rate of HE (NOAC-ICH n = 29/48 [40%] vs VKA-ICH n = 93/140 [34%] [ p = 0.45]), or functional outcome at hospital discharge (NOAC-ICH vs VKA-ICH odds ratio 0.47; 95% CI 0.18-1.19 [ p = 0.11]). In our international collaborative multicenter pooled analysis, baseline ICH volume, hematoma expansion, 90-day mortality, and functional outcome were similar following NOAC-ICH and VKA-ICH. Copyright © 2017 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the American Academy of Neurology.
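Two of the definitions used in this record, the ABC/2 volume approximation and the hematoma-expansion rule (>33% relative or >6 mL absolute growth within 72 hours), are simple enough to express directly; the sketch below uses made-up measurements.

```python
# ABC/2 ellipsoid approximation of ICH volume and the hematoma-expansion rule
# (>33% relative or >6 mL absolute growth). Measurements are illustrative only.
def abc2_volume_ml(a_cm, b_cm, c_cm):
    """Approximate hematoma volume (mL) from three orthogonal diameters in cm."""
    return (a_cm * b_cm * c_cm) / 2.0

def hematoma_expansion(baseline_ml, followup_ml):
    growth = followup_ml - baseline_ml
    return growth > 6.0 or (baseline_ml > 0 and growth / baseline_ml > 0.33)

v0 = abc2_volume_ml(4.0, 3.0, 2.5)      # 15.0 mL at baseline
v1 = abc2_volume_ml(4.5, 3.4, 2.8)      # about 21.4 mL at 72 h
print(v0, v1, hematoma_expansion(v0, v1))
```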
Physical activity, function, and longevity among the very old.
Stessman, Jochanan; Hammerman-Rozenberg, Robert; Cohen, Aaron; Ein-Mor, Eliana; Jacobs, Jeremy M
2009-09-14
Recommendations encouraging physical activity (PA) set no upper age limit, yet evidence supporting the benefits of PA among the very old is sparse. We examined the effects of continuing, increasing, or decreasing PA levels on survival, function, and health status among the very old. Mortality data from ages 70 to 88 years and health, comorbidity, and functional status at ages 70, 78, and 85 years were assessed through the Jerusalem Longitudinal Cohort Study (1990-2008). A representative sample of 1861 people born in 1920 and 1921 enrolled in this prospective study, resulting in 17 109 person-years of follow-up for all-cause mortality. Among physically active vs sedentary participants, respectively, at age 70, the 8-year mortality was 15.2% vs 27.2% (P < .001); at age 78, the 8-year mortality was 26.1% vs 40.8% (P <.001); and at age 85 years, the 3-year mortality was 6.8% vs 24.4% (P < .001). In Cox proportional-hazards models adjusting for mortality risk factors, lower mortality was associated with PA level at ages 70 (hazard ratio, 0.61; 95% confidence interval, 0.38-0.96), 78 (0.69; 0.48-0.98), and 85 (0.42; 0.25-0.68). A significant survival benefit was associated with initiating PA between ages 70 and 78 years (P = .04) and ages 78 and 85 years (P < .001). Participation in higher levels of PA, compared with being sedentary, did not show a dose-dependent association with mortality. The PA level at age 78 was associated with remaining independent while performing activities of daily living at age 85 (odds ratio, 1.92; 95% confidence interval, 1.11-3.33). Among the very old, not only continuing but also initiating PA was associated with better survival and function. This finding supports the encouragement of PA into advanced old age.
Nadeau-Fredette, Annie-Claire; Hawley, Carmel M.; Pascoe, Elaine M.; Chan, Christopher T.; Clayton, Philip A.; Polkinghorne, Kevan R.; Boudville, Neil; Leblanc, Martine
2015-01-01
Background and objectives Home dialysis is often recognized as a first-choice therapy for patients initiating dialysis. However, studies comparing clinical outcomes between peritoneal dialysis and home hemodialysis have been very limited. Design, setting, participants, & measurements This Australia and New Zealand Dialysis and Transplantation Registry study assessed all Australian and New Zealand adult patients receiving home dialysis on day 90 after initiation of RRT between 2000 and 2012. The primary outcome was overall survival. The secondary outcomes were on-treatment survival, patient and technique survival, and death-censored technique survival. All results were adjusted with three prespecified models: multivariable Cox proportional hazards model (main model), propensity score quintile–stratified model, and propensity score–matched model. Results The study included 10,710 patients on incident peritoneal dialysis and 706 patients on incident home hemodialysis. Treatment with home hemodialysis was associated with better patient survival than treatment with peritoneal dialysis (5-year survival: 85% versus 44%, respectively; log-rank P<0.001). Using multivariable Cox proportional hazards analysis, home hemodialysis was associated with superior patient survival (hazard ratio for overall death, 0.47; 95% confidence interval, 0.38 to 0.59) as well as better on-treatment survival (hazard ratio for on-treatment death, 0.34; 95% confidence interval, 0.26 to 0.45), composite patient and technique survival (hazard ratio for death or technique failure, 0.34; 95% confidence interval, 0.29 to 0.40), and death-censored technique survival (hazard ratio for technique failure, 0.34; 95% confidence interval, 0.28 to 0.41). Similar results were obtained with the propensity score models as well as sensitivity analyses using competing risks models and different definitions for technique failure and lag period after modality switch, during which events were attributed to the initial modality. Conclusions Home hemodialysis was associated with superior patient and technique survival compared with peritoneal dialysis. PMID:26068181
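One of the prespecified adjustment strategies described above, a propensity score split into quintiles and used to stratify a Cox model, can be sketched as follows; the data, covariates, and column names are simulated stand-ins, not registry variables.

```python
# Propensity-score quintile-stratified Cox model for home hemodialysis vs
# peritoneal dialysis; simulated data and hypothetical column names.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 4000
df = pd.DataFrame({"age": rng.normal(60, 12, n), "diabetic": rng.integers(0, 2, n)})
df["home_hd"] = rng.binomial(1, 0.06 + 0.001 * (70 - df["age"]).clip(0))
df["years"] = rng.exponential(5, n)
df["died"] = rng.binomial(1, 0.4, n)

# Propensity score for receiving home hemodialysis, then quintile strata
ps = LogisticRegression().fit(df[["age", "diabetic"]], df["home_hd"]).predict_proba(
    df[["age", "diabetic"]])[:, 1]
df["ps_quintile"] = pd.qcut(ps, 5, labels=False)

cph = CoxPHFitter().fit(df[["years", "died", "home_hd", "ps_quintile"]],
                        duration_col="years", event_col="died",
                        strata=["ps_quintile"])
print(cph.hazard_ratios_["home_hd"])    # overall-death HR for home HD vs PD
```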
NASA Astrophysics Data System (ADS)
Babaie Mahani, A.; Eaton, D. W.
2013-12-01
Ground Motion Prediction Equations (GMPEs) are widely used in Probabilistic Seismic Hazard Assessment (PSHA) to estimate ground-motion amplitudes at Earth's surface as a function of magnitude and distance. Certain applications, such as hazard assessment for caprock integrity in the case of underground storage of CO2, waste disposal sites, and underground pipelines, require subsurface estimates of ground motion; at present, such estimates depend upon theoretical modeling and simulations. The objective of this study is to derive correction factors for GMPEs to enable estimation of amplitudes in the subsurface. We use a semi-analytic approach along with finite-difference simulations of ground-motion amplitudes for surface and underground motions. Spectral ratios of underground to surface motions are used to calculate the correction factors. Two predictive methods are used. The first is a semi-analytic approach based on a quarter-wavelength method that is widely used for earthquake site-response investigations; the second is a numerical approach based on elastic finite-difference simulations of wave propagation. Both methods are evaluated using recordings of regional earthquakes by broadband seismometers installed at the surface and at depths of 1400 m and 2100 m in the Sudbury Neutrino Observatory, Canada. Overall, both methods provide a reasonable fit to the peaks and troughs observed in the ratios of real data. The finite-difference method, however, has the capability to simulate ground motion ratios more accurately than the semi-analytic approach.
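The spectral-ratio step described above (underground over surface amplitude spectra, used to derive depth correction factors for surface-calibrated GMPEs) can be sketched with synthetic records standing in for the borehole and surface seismograms.

```python
# Spectral ratio of a borehole (underground) record to the co-located surface
# record; synthetic signals stand in for real seismograms.
import numpy as np

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(6)
surface = rng.normal(0, 1, t.size)
underground = 0.5 * surface + rng.normal(0, 0.2, t.size)   # attenuated, noisier copy

freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spec_surface = np.abs(np.fft.rfft(surface))
spec_underground = np.abs(np.fft.rfft(underground))

ratio = spec_underground / spec_surface      # depth correction factor vs frequency
band = (freqs > 0.5) & (freqs < 10.0)
print(ratio[band].mean())                    # average correction in the 0.5-10 Hz band
```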
Renal Salvage with Renal Artery Stenting Improves Long-term Survival.
Modrall, J Gregory; Trimmer, Clayton; Tsai, Shirling; Kirkwood, Melissa L; Ali, Mujtaba; Rectenwald, John E; Timaran, Carlos H; Rosero, Eric B
2017-11-01
The Cardiovascular Outcomes in Renal Atherosclerotic Lesions (CORAL) Trial cast doubt on the benefits of renal artery stenting (RAS). However, the outcomes for patients with chronic kidney disease (CKD) were not analyzed separately in the CORAL Trial. We hypothesized that patients who experienced a significant improvement in renal function after RAS would have improved long-term survival, compared with patients whose renal function was not improved by stenting. This single-center retrospective study included 60 patients with stage 3 or worse CKD and renal artery occlusive disease who were treated with RAS for renal salvage. Patients were categorized as "responders" or "nonresponders" based on postoperative changes in estimated glomerular filtration rate (eGFR) after RAS. "Responders" were those patients with an improvement of at least 20% in eGFR over baseline; all others were categorized as "nonresponders." Survival was analyzed using the Kaplan-Meier method. Cox proportional hazards regression was used to identify predictors of long-term survival. The median age of the cohort was 66 years (interquartile range [IQR], 60-73). Median preoperative eGFR was 34 mL/min/1.73 m² (IQR, 24-45). At late follow-up (median 35 months; IQR, 22-97 months), 16 of 60 patients (26.7%) were categorized as "responders" with a median increase in postoperative eGFR of 40% (IQR, 21-67). Long-term survival was superior for responders, compared with nonresponders (P = 0.046 by log-rank test). Cox proportional hazards regression identified improved renal function after RAS as the only significant predictor of increased long-term survival (hazard ratio = 0.235, 95% confidence interval = 0.075-0.733; P = 0.0126 for improved versus worsened renal function after RAS). Successful salvage of renal function by RAS is associated with improved long-term survival. These data provide an important counter argument to the prior negative clinical trials that found no benefit to RAS. Published by Elsevier Inc.
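A minimal sketch of the survival comparison described in this record: a log-rank test for responders (≥20% eGFR improvement) versus nonresponders and a Cox model for the responder indicator, on simulated data with hypothetical column names.

```python
# Responders vs nonresponders after renal artery stenting: log-rank test and Cox
# model on simulated data (hypothetical column names, not the study's dataset).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(7)
n = 60
df = pd.DataFrame({
    "egfr_pre": rng.normal(34, 8, n),
    "egfr_post": rng.normal(38, 10, n),
    "months": rng.exponential(40, n),
    "died": rng.binomial(1, 0.5, n),
})
df["responder"] = (df["egfr_post"] >= 1.2 * df["egfr_pre"]).astype(int)  # >=20% gain

resp = df[df["responder"] == 1]
nonresp = df[df["responder"] == 0]
lr = logrank_test(resp["months"], nonresp["months"], resp["died"], nonresp["died"])
print(lr.p_value)

cph = CoxPHFitter().fit(df[["months", "died", "responder"]],
                        duration_col="months", event_col="died")
print(cph.hazard_ratios_["responder"])   # HR for responders vs nonresponders
```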
Yoon, Chang-Yun; Noh, Juhwan; Jhee, Jong Hyun; Chang, Tae Ik; Kang, Ea Wha; Kee, Youn Kyung; Kim, Hyoungnae; Park, Seohyun; Yun, Hae-Ryong; Jung, Su-Young; Oh, Hyung Jung; Park, Jung Tak; Han, Seung Hyeok; Kang, Shin-Wook; Kim, Changsoo; Yoo, Tae-Hyun
2017-09-01
The aim of this study is to elucidate the effects of warfarin use in patients with atrial fibrillation undergoing dialysis using a population-based Korean registry. Data were extracted from the Health Insurance Review and Assessment Service, which is a nationwide, mandatory social insurance database of all Korean citizens enrolled in the National Health Information Service between 2009 and 2013. Thromboembolic and hemorrhagic outcomes were analyzed according to warfarin use. Overall and propensity score-matched cohorts were analyzed by Cox proportional hazards models. Among 9974 hemodialysis patients with atrial fibrillation, the mean age was 66.6±12.2 years, 5806 (58.2%) were men, and 2921 (29.3%) used warfarin. After propensity score matching to adjust for all described baseline differences, 5548 subjects remained, and differences in baseline variables were distributed equally between warfarin users and nonusers. During a mean follow-up duration of 15.9±11.1 months, ischemic and hemorrhagic stroke occurred in 678 (6.8%) and 227 (2.3%) patients, respectively. In a multiple Cox model, warfarin use was significantly associated with an increased risk of hemorrhagic stroke (hazard ratio, 1.44; 95% confidence interval, 1.09-1.91; P =0.010) in the overall cohort. Furthermore, a significant relationship between warfarin use and hemorrhagic stroke was found in propensity-matched subjects (hazard ratio, 1.56; 95% confidence interval, 1.10-2.22; P =0.013). However, the ratios for ischemic stroke were not significantly different in either the propensity-matched (hazard ratio, 0.95; 95% confidence interval, 0.78-1.15; P =0.569) or overall cohort (hazard ratio, 1.06; 95% confidence interval, 0.90-1.26; P =0.470). Our findings suggest that warfarin should be used carefully in hemodialysis patients, given the higher risk of hemorrhagic events and the lack of ability to prevent thromboembolic complications. © 2017 American Heart Association, Inc.
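The propensity-matching step described above can be sketched as a two-stage procedure: a logistic model for the probability of warfarin use given baseline covariates, followed by 1:1 nearest-neighbour matching on the logit of that probability within a caliper. The snippet below is a minimal illustration on simulated data; the covariates, the 0.2-SD caliper, and all variable names are assumptions rather than the registry's actual specification.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Simulated stand-in for the dialysis cohort (variable names are assumptions).
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "warfarin": rng.binomial(1, 0.3, n),
    "age": rng.normal(67, 12, n),
    "male": rng.binomial(1, 0.58, n),
    "diabetes": rng.binomial(1, 0.4, n),
})

# Stage 1: propensity score = P(warfarin use | baseline covariates).
X = df[["age", "male", "diabetes"]]
ps = LogisticRegression(max_iter=1000).fit(X, df["warfarin"]).predict_proba(X)[:, 1]
logit_ps = np.log(ps / (1 - ps))

# Stage 2: greedy 1:1 nearest-neighbour matching on the logit of the score,
# within a caliper of 0.2 standard deviations of the logit.
caliper = 0.2 * logit_ps.std()
treated = list(df.index[df["warfarin"] == 1])
controls = set(df.index[df["warfarin"] == 0])
pairs = []
for t in treated:
    if not controls:
        break
    best = min(controls, key=lambda c: abs(logit_ps[t] - logit_ps[c]))
    if abs(logit_ps[t] - logit_ps[best]) <= caliper:
        pairs.append((t, best))
        controls.remove(best)

matched = df.loc[[idx for pair in pairs for idx in pair]]
print(f"{len(pairs)} matched pairs; matched cohort size {len(matched)}")
```

Hemorrhagic and ischemic outcomes in the matched cohort would then be compared with Cox models, as in the abstract.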
Hamer, Mark; Batty, G David; Stamatakis, Emmanuel; Kivimaki, Mika
2010-12-01
Common mental disorders, such as anxiety and depression, are risk factors for mortality among cardiac patients, although this topic has gained little attention in individuals with hypertension. We examined the combined effects of hypertension and common mental disorder on mortality in participants with both treated and untreated hypertension. In a representative, prospective study of 31 495 adults (aged 52.5 ± 12.5 years, 45.7% men) we measured baseline levels of common mental disorder using the 12-item General Health Questionnaire (GHQ-12) and collected data on blood pressure, history of hypertension diagnosis, and medication use. High blood pressure (systolic/diastolic >140/90 mmHg) in study members with an existing diagnosis of hypertension indicated uncontrolled hypertension and, in undiagnosed individuals, untreated hypertension. There were 3200 deaths from all causes [943 cardiovascular disease (CVD)] over 8.4 years follow-up. As expected, the risk of CVD was elevated in participants with controlled [multivariate hazard ratio = 1.63, 95% confidence interval (CI) 1.26-2.12] and uncontrolled (multivariate hazard ratio = 1.57, 95% CI 1.08-2.27) hypertension compared with normotensive participants. Common mental disorder (GHQ-12 score of ≥4) was also associated with CVD death (multivariate hazard ratio = 1.60, 95% CI 1.35-1.90). The risk of CVD death was highest in participants with both diagnosed hypertension and common mental disorder, especially in study members with controlled (multivariate hazard ratio = 2.32, 95% CI 1.70-3.17) hypertension but also in uncontrolled hypertension (multivariate hazard ratio = 1.90, 95% CI 1.18-3.05). The combined effect of common mental disorder was also apparent in participants with undiagnosed (untreated) hypertension, especially for all-cause mortality. These findings suggest that the association of hypertension with total and CVD mortality is stronger when combined with common mental disorder.
Patel, Siddharth; Kwak, Lucia; Agarwal, Sunil K; Tereshchenko, Larisa G; Coresh, Josef; Soliman, Elsayed Z; Matsushita, Kunihiro
2017-11-03
A few studies have recently reported clockwise and counterclockwise rotations of QRS transition zone as predictors of mortality. However, their prospective correlates and associations with individual cardiovascular disease (CVD) outcomes are yet to be investigated. Among 13 567 ARIC (Atherosclerosis Risk in Communities) study participants aged 45 to 64 years, we studied key correlates of changes in the status of clockwise and counterclockwise rotation over time as well as the association of rotation status with incidence of coronary heart disease (2408 events), heart failure (2196 events), stroke (991 events), composite CVD (4124 events), 898 CVD deaths, and 3469 non-CVD deaths over 23 years of follow-up. At baseline, counterclockwise rotation was most prevalent (52.9%), followed by no (40.5%) and clockwise (6.6%) rotation. Of patients with no rotation, 57.9% experienced counterclockwise or clockwise rotation during follow-up, with diabetes mellitus and black race significantly predicting clockwise and counterclockwise conversion, respectively. Clockwise rotation was significantly associated with higher risk of heart failure (hazard ratio, 1.20; 95% confidence interval [CI], 1.02-1.41) and non-CVD death (hazard ratio, 1.28; 95% CI, 1.12-1.46) after adjusting for potential confounders including other ECG parameters. In contrast, counterclockwise rotation was significantly related to lower risk of composite CVD (hazard ratio, 0.93; 95% CI, 0.87-0.99), CVD mortality (hazard ratio, 0.76; 95% CI, 0.65-0.88), and non-CVD deaths (hazard ratio, 0.92; 95% CI, 0.85-0.99 [borderline significance with heart failure]). Counterclockwise rotation, the most prevalent QRS transition zone pattern, demonstrated the lowest risk of CVD and mortality, whereas clockwise rotation was associated with the highest risk of heart failure and non-CVD mortality. These results have implications for how to interpret QRS transition zone rotation when an ECG is recorded. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Aspirin and extended-release dipyridamole versus clopidogrel for recurrent stroke.
Sacco, Ralph L; Diener, Hans-Christoph; Yusuf, Salim; Cotton, Daniel; Ounpuu, Stephanie; Lawton, William A; Palesch, Yuko; Martin, Reneé H; Albers, Gregory W; Bath, Philip; Bornstein, Natan; Chan, Bernard P L; Chen, Sien-Tsong; Cunha, Luis; Dahlöf, Björn; De Keyser, Jacques; Donnan, Geoffrey A; Estol, Conrado; Gorelick, Philip; Gu, Vivian; Hermansson, Karin; Hilbrich, Lutz; Kaste, Markku; Lu, Chuanzhen; Machnig, Thomas; Pais, Prem; Roberts, Robin; Skvortsova, Veronika; Teal, Philip; Toni, Danilo; Vandermaelen, Cam; Voigt, Thor; Weber, Michael; Yoon, Byung-Woo
2008-09-18
Recurrent stroke is a frequent, disabling event after ischemic stroke. This study compared the efficacy and safety of two antiplatelet regimens--aspirin plus extended-release dipyridamole (ASA-ERDP) versus clopidogrel. In this double-blind, 2-by-2 factorial trial, we randomly assigned patients to receive 25 mg of aspirin plus 200 mg of extended-release dipyridamole twice daily or to receive 75 mg of clopidogrel daily. The primary outcome was first recurrence of stroke. The secondary outcome was a composite of stroke, myocardial infarction, or death from vascular causes. Sequential statistical testing of noninferiority (margin of 1.075), followed by superiority testing, was planned. A total of 20,332 patients were followed for a mean of 2.5 years. Recurrent stroke occurred in 916 patients (9.0%) receiving ASA-ERDP and in 898 patients (8.8%) receiving clopidogrel (hazard ratio, 1.01; 95% confidence interval [CI], 0.92 to 1.11). The secondary outcome occurred in 1333 patients (13.1%) in each group (hazard ratio for ASA-ERDP, 0.99; 95% CI, 0.92 to 1.07). There were more major hemorrhagic events among ASA-ERDP recipients (419 [4.1%]) than among clopidogrel recipients (365 [3.6%]) (hazard ratio, 1.15; 95% CI, 1.00 to 1.32), including intracranial hemorrhage (hazard ratio, 1.42; 95% CI, 1.11 to 1.83). The net risk of recurrent stroke or major hemorrhagic event was similar in the two groups (1194 ASA-ERDP recipients [11.7%], vs. 1156 clopidogrel recipients [11.4%]; hazard ratio, 1.03; 95% CI, 0.95 to 1.11). The trial did not meet the predefined criteria for noninferiority but showed similar rates of recurrent stroke with ASA-ERDP and with clopidogrel. There is no evidence that either of the two treatments was superior to the other in the prevention of recurrent stroke. (ClinicalTrials.gov number, NCT00153062.) 2008 Massachusetts Medical Society
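The noninferiority logic in this trial reduces to comparing the confidence interval for the hazard ratio against the prespecified margin; the short snippet below reproduces that check from the reported figures (the back-calculated standard error is an approximation, not a quantity reported by the trial).

```python
import math

# Reported primary result: HR 1.01 (95% CI 0.92 to 1.11); noninferiority margin 1.075.
hr, ci_low, ci_high, margin = 1.01, 0.92, 1.11, 1.075

# Approximate standard error of log(HR), back-calculated from the reported CI.
se_log_hr = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

# Noninferiority of ASA-ERDP vs clopidogrel requires the entire CI to lie below the margin.
noninferior = ci_high < margin
print(f"SE(log HR) ~ {se_log_hr:.3f}; upper CI {ci_high} vs margin {margin} -> noninferior: {noninferior}")
```

With an upper confidence bound of 1.11 lying above the 1.075 margin, the prespecified noninferiority criterion is not met, which is exactly the conclusion the abstract reports.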
Low-Risk Lifestyle, Coronary Calcium, Cardiovascular Events, and Mortality: Results From MESA
Ahmed, Haitham M.; Blaha, Michael J.; Nasir, Khurram; Jones, Steven R.; Rivera, Juan J.; Agatston, Arthur; Blankstein, Ron; Wong, Nathan D.; Lakoski, Susan; Budoff, Matthew J.; Burke, Gregory L.; Sibley, Christopher T.; Ouyang, Pamela; Blumenthal, Roger S.
2013-01-01
Unhealthy lifestyle habits are a major contributor to coronary artery disease. The purpose of the present study was to investigate the associations of smoking, weight maintenance, physical activity, and diet with coronary calcium, cardiovascular events, and mortality. US participants who were 44–84 years of age (n = 6,229) were followed in the Multi-Ethnic Study of Atherosclerosis from 2000 to 2010. A lifestyle score ranging from 0 to 4 was created using diet, exercise, body mass index, and smoking status. Coronary calcium was measured at baseline and a mean of 3.1 (standard deviation, 1.3) years later to assess calcium progression. Participants who experienced coronary events or died were followed for a median of 7.6 (standard deviation, 1.5) years. Participants with lifestyle scores of 1, 2, 3, and 4 were found to have mean adjusted annual calcium progressions that were 3.5 (95% confidence interval (CI): 0.0, 7.0), 4.2 (95% CI: 0.6, 7.9), 6.8 (95% CI: 2.0, 11.5), and 11.1 (95% CI: 2.2, 20.1) points per year slower, respectively, relative to the reference group (P = 0.003). Unadjusted hazard ratios for death by lifestyle score were as follows: for a score of 1, the hazard ratio was 0.79 (95% CI: 0.61, 1.03); for a score of 2, the hazard ratio was 0.61 (95% CI: 0.46, 0.81); for a score of 3, the hazard ratio was 0.49 (95% CI: 0.32, 0.75); and for a score of 4, the hazard ratio was 0.19 (95% CI: 0.05, 0.75) (P < 0.001 by log-rank test). In conclusion, a combination of regular exercise, healthy diet, smoking avoidance, and weight maintenance was associated with lower coronary calcium incidence, slower calcium progression, and lower all-cause mortality over 7.6 years. PMID:23733562
Gratwohl, Alois; Brand, Ronald; McGrath, Eoin; van Biezen, Anja; Sureda, Anna; Ljungman, Per; Baldomero, Helen; Chabannon, Christian; Apperley, Jane
2014-01-01
Competent authorities, healthcare payers and hospitals devote increasing resources to quality management systems but scientific analyses searching for an impact of these systems on clinical outcome remain scarce. Earlier data indicated a stepwise improvement in outcome after allogeneic hematopoietic stem cell transplantation with each phase of the accreditation process for the quality management system "JACIE". We therefore tested the hypothesis that working towards and achieving "JACIE" accreditation would accelerate improvement in outcome over calendar time. Overall mortality of the entire cohort of 107,904 patients who had a transplant (41,623 allogeneic, 39%; 66,281 autologous, 61%) between 1999 and 2006 decreased over the 14-year observation period by a factor of 0.63 per 10 years (hazard ratio: 0.63; 0.58–0.69). Considering "JACIE"-accredited centers as those with programs having achieved accreditation by November 2012, at the latest, this improvement was significantly faster in "JACIE"-accredited centers than in non-accredited centers (approximately 5.3% per year for 49,459 patients versus approximately 3.5% per year for 58,445 patients, respectively; hazard ratio: 0.83; 0.71–0.97). As a result, relapse-free survival (hazard ratio 0.85; 0.75–0.95) and overall survival (hazard ratio 0.86; 0.76–0.98) were significantly higher at 72 months for those patients transplanted in the 162 "JACIE"-accredited centers. No significant effects were observed after autologous transplants (hazard ratio 1.06; 0.99–1.13). Hence, working towards implementation of a quality management system triggers a dynamic process associated with a steeper reduction in mortality over the years and a significantly improved survival after allogeneic stem cell transplantation. Our data support the use of a quality management system for complex medical procedures. PMID:24488562
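The per-decade and per-year figures quoted above are linked by the proportional hazards assumption; a worked conversion for the overall estimate (a reading of the reported numbers, not an additional analysis) is

$$
\mathrm{HR}_{1\,\mathrm{yr}} = \mathrm{HR}_{10\,\mathrm{yr}}^{1/10} = 0.63^{1/10} \approx 0.955,
$$

i.e. roughly a 4.5% reduction in the overall mortality hazard per calendar year, which sits between the approximately 3.5% (non-accredited) and 5.3% ("JACIE"-accredited) per-year improvements quoted for the two groups.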
Yang, Shih-Neng (Department of Biomedical Imaging and Radiological Science, China Medical University, Taichung, Taiwan); Liao, Chih-Ying
2011-03-15
Purpose: To investigate the prognostic value of the volume reduction rate (VRR) in patients with head-and-neck cancer treated with intensity-modulated radiotherapy (IMRT). Methods and Materials: Seventy-six patients with oropharyngeal cancer (OPC) and another 76 with hypopharyngeal cancer (HPC) were enrolled in volumetric analysis. All patients received allocated radiotherapy courses. Adaptive computed tomography was done 4 to 5 weeks after the start of IMRT. Primary tumor volume measurement was derived using separate images for the pretreatment gross tumor volume (pGTV) and the interval gross tumor volume. Results: In the OPC group, the pGTV ranged from 6.6 to 242.6 mL (mean, 49.9 mL), whereas the value of the VRR ranged from 0.014 to 0.74 (mean, 0.43). In HPC patients, the pGTV ranged from 4.1 to 152.4 mL (mean, 35.6 mL), whereas the VRR ranged from -1.15 to 0.79 (mean, 0.33). Multivariate analysis of the primary tumor relapse-free survival for OPC revealed three prognostic factors: T4 tumor (p = 0.0001, hazard ratio 7.38), pGTV ≥20 mL (p = 0.01, hazard ratio 10.61), and VRR <0.5 (p = 0.001, hazard ratio 6.49). Multivariate analysis of the primary tumor relapse-free survival for HPC showed two prognostic factors: pGTV ≥30 mL (p = 0.001, hazard ratio 2.87) and VRR <0.5 (p = 0.03, hazard ratio 2.25). Conclusion: The VRR is an outcome predictor for local control in OPC and HPC patients treated with IMRT. Those with large tumor volumes or a VRR <0.5 should be considered for a salvage operation or a dose-escalation scheme.
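The abstract does not spell out how the VRR is computed; on the usual definition (an assumption here), it is the fractional shrinkage of the primary tumor between the pretreatment and interval scans:

$$
\mathrm{VRR} = \frac{\mathrm{pGTV} - \mathrm{iGTV}}{\mathrm{pGTV}},
$$

so a VRR of 0.5 means the interval gross tumor volume is half the pretreatment volume, and the negative minimum of -1.15 in the HPC group corresponds to a tumor that more than doubled in volume during treatment.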
Dronedarone in high-risk permanent atrial fibrillation.
Connolly, Stuart J; Camm, A John; Halperin, Jonathan L; Joyner, Campbell; Alings, Marco; Amerena, John; Atar, Dan; Avezum, Álvaro; Blomström, Per; Borggrefe, Martin; Budaj, Andrzej; Chen, Shih-Ann; Ching, Chi Keong; Commerford, Patrick; Dans, Antonio; Davy, Jean-Marc; Delacrétaz, Etienne; Di Pasquale, Giuseppe; Diaz, Rafael; Dorian, Paul; Flaker, Greg; Golitsyn, Sergey; Gonzalez-Hermosillo, Antonio; Granger, Christopher B; Heidbüchel, Hein; Kautzner, Josef; Kim, June Soo; Lanas, Fernando; Lewis, Basil S; Merino, Jose L; Morillo, Carlos; Murin, Jan; Narasimhan, Calambur; Paolasso, Ernesto; Parkhomenko, Alexander; Peters, Nicholas S; Sim, Kui-Hian; Stiles, Martin K; Tanomsup, Supachai; Toivonen, Lauri; Tomcsányi, János; Torp-Pedersen, Christian; Tse, Hung-Fat; Vardas, Panos; Vinereanu, Dragos; Xavier, Denis; Zhu, Jun; Zhu, Jun-Ren; Baret-Cormel, Lydie; Weinling, Estelle; Staiger, Christoph; Yusuf, Salim; Chrolavicius, Susan; Afzal, Rizwan; Hohnloser, Stefan H
2011-12-15
Dronedarone restores sinus rhythm and reduces hospitalization or death in intermittent atrial fibrillation. It also lowers heart rate and blood pressure and has antiadrenergic and potential ventricular antiarrhythmic effects. We hypothesized that dronedarone would reduce major vascular events in high-risk permanent atrial fibrillation. We assigned patients who were at least 65 years of age with at least a 6-month history of permanent atrial fibrillation and risk factors for major vascular events to receive dronedarone or placebo. The first coprimary outcome was stroke, myocardial infarction, systemic embolism, or death from cardiovascular causes. The second coprimary outcome was unplanned hospitalization for a cardiovascular cause or death. After the enrollment of 3236 patients, the study was stopped for safety reasons. The first coprimary outcome occurred in 43 patients receiving dronedarone and 19 receiving placebo (hazard ratio, 2.29; 95% confidence interval [CI], 1.34 to 3.94; P=0.002). There were 21 deaths from cardiovascular causes in the dronedarone group and 10 in the placebo group (hazard ratio, 2.11; 95% CI, 1.00 to 4.49; P=0.046), including death from arrhythmia in 13 patients and 4 patients, respectively (hazard ratio, 3.26; 95% CI, 1.06 to 10.00; P=0.03). Stroke occurred in 23 patients in the dronedarone group and 10 in the placebo group (hazard ratio, 2.32; 95% CI, 1.11 to 4.88; P=0.02). Hospitalization for heart failure occurred in 43 patients in the dronedarone group and 24 in the placebo group (hazard ratio, 1.81; 95% CI, 1.10 to 2.99; P=0.02). Dronedarone increased rates of heart failure, stroke, and death from cardiovascular causes in patients with permanent atrial fibrillation who were at risk for major vascular events. Our data show that this drug should not be used in such patients. (Funded by Sanofi-Aventis; PALLAS ClinicalTrials.gov number, NCT01151137.).
Negatively-Biased Credulity and the Cultural Evolution of Beliefs
Fessler, Daniel M. T.; Pisor, Anne C.; Navarrete, Carlos David
2014-01-01
The functions of cultural beliefs are often opaque to those who hold them. Accordingly, to benefit from cultural evolution’s ability to solve complex adaptive problems, learners must be credulous. However, credulity entails costs, including susceptibility to exploitation, and effort wasted due to false beliefs. One determinant of the optimal level of credulity is the ratio between the costs of two types of errors: erroneous incredulity (failing to believe information that is true) and erroneous credulity (believing information that is false). This ratio can be expected to be asymmetric when information concerns hazards, as the costs of erroneous incredulity will, on average, exceed the costs of erroneous credulity; no equivalent asymmetry characterizes information concerning benefits. Natural selection can therefore be expected to have crafted learners’ minds so as to be more credulous toward information concerning hazards. This negatively-biased credulity extends general negativity bias, the adaptive tendency for negative events to be more salient than positive events. Together, these biases constitute attractors that should shape cultural evolution via the aggregated effects of learners’ differential retention and transmission of information. In two studies in the U.S., we demonstrate the existence of negatively-biased credulity, and show that it is most pronounced in those who believe the world to be dangerous, individuals who may constitute important nodes in cultural transmission networks. We then document the predicted imbalance in cultural content using a sample of urban legends collected from the Internet and a sample of supernatural beliefs obtained from ethnographies of a representative collection of the world’s cultures, showing that beliefs about hazards predominate in both. PMID:24736596
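The cost-ratio argument can be written as a standard expected-cost threshold (a sketch of the logic, not the authors' formal model). With $C_I$ the cost of erroneous incredulity and $C_C$ the cost of erroneous credulity, believing a claim that is true with probability $p$ has lower expected cost whenever

$$
(1-p)\,C_C < p\,C_I \quad\Longleftrightarrow\quad p > \frac{C_C}{C_C + C_I}.
$$

Because $C_I$ typically far exceeds $C_C$ for information about hazards, the threshold probability falls and the optimal learner becomes more credulous toward hazard claims, which is the negatively-biased credulity the studies test.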
LGE Provides Incremental Prognostic Information Over Serum Biomarkers in AL Cardiac Amyloidosis.
Boynton, Samuel J; Geske, Jeffrey B; Dispenzieri, Angela; Syed, Imran S; Hanson, Theodore J; Grogan, Martha; Araoz, Philip A
2016-06-01
This study sought to determine the prognostic value of cardiac magnetic resonance (CMR) late gadolinium enhancement (LGE) in amyloid light chain (AL) cardiac amyloidosis. Cardiac involvement is the major determinant of mortality in AL amyloidosis. CMR LGE is a marker of amyloid infiltration of the myocardium. The purpose of this study was to evaluate retrospectively the prognostic value of CMR LGE for determining all-cause mortality in AL amyloidosis and to compare the prognostic power with the biomarker stage. Seventy-six patients with histologically proven AL amyloidosis underwent CMR LGE imaging. LGE was categorized as global, focal patchy, or none. Global LGE was considered present if it was visualized on LGE images or if the myocardium nulled before the blood pool on a cine multiple inversion time (TI) sequence. CMR morphologic and functional evaluation, echocardiographic diastolic evaluation, and cardiac biomarker staging were also performed. Subjects' charts were reviewed for all-cause mortality. Cox proportional hazards analysis was used to evaluate survival in univariate and multivariate analysis. There were 40 deaths, and the median study follow-up period was 34.4 months. Global LGE was associated with all-cause mortality in univariate analysis (hazard ratio = 2.93; p < 0.001). In multivariate modeling with biomarker stage, global LGE remained prognostic (hazard ratio = 2.43; p = 0.01). Diffuse LGE provides incremental prognosis over cardiac biomarker stage in patients with AL cardiac amyloidosis. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
... Compatibility Group S indicates that hazardous effects from accidental functioning are limited to the extent the... package is capable of containing any hazardous effects in the event of an accidental functioning of its... demonstrate that any hazardous effects are confined within a package. In the ANPRM, we invited commenters to...
Grewal, Jasmine; McKelvie, Robert S; Persson, Hans; Tait, Peter; Carlsson, Jonas; Swedberg, Karl; Ostergren, Jan; Lonn, Eva
2008-09-15
More than 40% of patients hospitalized with heart failure have preserved left ventricular ejection fraction (HF-PLVEF) and are at high risk for cardiovascular (CV) events. The purpose of this study was to determine the value of N-terminal pro-brain natriuretic peptide (NT-proBNP) and brain natriuretic peptide (BNP) in predicting CV outcomes in patients with HF-PLVEF. Participants with an ejection fraction >40% in the prospective CHARM Echocardiographic Substudy were included in this analysis. Plasma NT-proBNP levels were measured, and 2 cut-offs were selected prospectively at 300 pg/ml and 600 pg/ml. BNP cut-off was set at 100 pg/ml. Clinical characteristics were recorded, and systolic and diastolic function were evaluated by echocardiography. The primary substudy outcome was the composite of CV mortality, hospitalization for heart failure, and myocardial infarction or stroke. A total of 181 patients were included, and there were 17 primary CV events (9.4%) during a median follow-up time of 524 days. In a model including clinical characteristics, echocardiographic measures, and BNP or NT-proBNP, the composite CV event outcome was best predicted by NT-proBNP >300 pg/ml (hazard ratio 5.8, 95% confidence intervals [CI] 1.3 to 26.4, p = 0.02) and moderate or severe diastolic dysfunction on echocardiography. When NT-proBNP >600 pg/ml was used in the model, it was the sole independent predictor of primary CV events (hazard ratio 8.0, 95% CI 2.6 to 24.8, p = 0.0003) as was BNP >100 pg/ml (hazard ratio 3.1, 95% CI 1.2 to 8.2, p = 0.02) in the BNP model. In conclusion, both elevated NT-proBNP and BNP are strong independent predictors of clinical events in patients with HF-PLVEF.
Schaumeier, Maria Johanna; Nagy, Alexandra; Dell-Kuster, Salome; Rosenthal, Rachel; Schaub, Stefan; Dickenmann, Michael; Gurke, Lorenz; Wolff, Thomas
2017-09-05
Right-sided retroperitoneoscopic living donor nephrectomy (LDN) has been shown to be safe for the donor but it is unknown whether the short renal vein is associated with complications or an impaired long-term outcome in the recipient. In this retrospective cohort study, consecutive transplant recipients after retroperitoneoscopic LDN were enrolled. Complications occurring within 1 year were classified according to the Clavien-Dindo Classification for Surgical Complications and analysed using multivariable logistic regression. Predictors of 1-year creatinine clearance were analysed with multivariable linear regression. Cox proportional hazard models were used to analyse graft survival. Of the 251 recipients, 193 (77%) received a left kidney and 58 (23%) a right kidney. Surgical complications of Clavien-Dindo grade 3 or higher were comparable in recipients of right and left kidneys (33% vs 29%, odds ratio 0.98, 95% confidence interval [CI] 0.50, 1.94). The occurrence of a surgical complication had a significant impact on creatinine clearance at 1 year (decrease of 6 ml/min/m2, p = 0.016). Vascular complications in right kidneys were more common but were all corrected without impact on graft survival. One-year graft-survival was similar in recipients of right (98.3%) and left (96.9%) kidneys, as was creatinine clearance one year after transplantation (mean difference 3.3 ml/min/m2, 95% CI -1.5, 8.1; p = 0.175). After a median follow-up of 5 years, neither the side (hazard ratio 1.56, 95% CI 0.67, 3.63) nor surgical complications (hazard ratio 1.44, 95% CI 0.65, 3.19) were associated with graft failure. Right retroperitoneoscopic LDN does not compromise the outcome of transplantation. Surgical complications, long-term graft function and graft survival were comparable in right and left kidneys.
Age at menopause and incident heart failure: the Multi-Ethnic Study of Atherosclerosis.
Ebong, Imo A; Watson, Karol E; Goff, David C; Bluemke, David A; Srikanthan, Preethi; Horwich, Tamara; Bertoni, Alain G
2014-06-01
This study aims to evaluate the associations of early menopause (menopause occurring before age 45 years) and age at menopause with incident heart failure (HF) in postmenopausal women. We also explored the associations of early menopause and age at menopause with left ventricular (LV) measures of structure and function in postmenopausal women. We included 2,947 postmenopausal women, aged 45 to 84 years without known cardiovascular disease (2000-2002), from the Multi-Ethnic Study of Atherosclerosis. Cox proportional hazards models were used to examine the associations of early menopause and age at menopause with incident HF. In 2,123 postmenopausal women in whom cardiac magnetic resonance imaging was obtained at baseline, we explored the associations of early menopause and age at menopause with LV measures using multivariable linear regression. Across a median follow-up of 8.5 years, we observed 71 HF events. There were no significant interactions with ethnicity for incident HF (P for interaction > 0.05). In adjusted analysis, early menopause was associated with an increased risk of incident HF (hazard ratio, 1.66; 95% CI, 1.01-2.73), whereas every 1-year increase in age at menopause was associated with a decreased risk of incident HF (hazard ratio, 0.96; 95% CI, 0.94-0.99). We observed significant interactions between early menopause and ethnicity for LV mass-to-volume ratio (LVMVR; P for interaction = 0.02). In Chinese-American women, early menopause was associated with a higher LVMVR (+0.11; P = 0.0002), whereas every 1-year increase in age at menopause was associated with a lower LVMVR (-0.004; P = 0.04) at baseline. Older age at menopause is independently associated with a decreased risk of incident HF. Concentric LV remodeling, indicated by a higher LVMVR, is present in Chinese-American women who experienced early menopause at baseline.
Liu, Gang; Ding, Ming; Chiuve, Stephanie E; Rimm, Eric B; Franks, Paul W; Meigs, James B; Hu, Frank B; Sun, Qi
2016-11-01
To examine select adipokines, including fatty acid-binding protein 4, retinol-binding protein 4, and high-molecular-weight (HMW) adiponectin in relation to cardiovascular disease (CVD) mortality among patients with type 2 diabetes mellitus. Plasma levels of fatty acid-binding protein 4, retinol-binding protein 4, and HMW adiponectin were measured in 950 men with type 2 diabetes mellitus in the Health Professionals Follow-up Study. After an average of 22 years of follow-up (1993-2015), 580 deaths occurred, 220 of which were from CVD. After multivariate adjustment for covariates, higher levels of fatty acid-binding protein 4 were significantly associated with a higher CVD mortality: comparing extreme tertiles, the hazard ratio and 95% confidence interval of CVD mortality was 1.78 (1.22-2.59; P trend=0.001). A positive association was also observed for HMW adiponectin: the hazard ratio (95% confidence interval) was 2.07 (1.42-3.06; P trend=0.0002), comparing extreme tertiles, whereas higher retinol-binding protein 4 levels were nonsignificantly associated with a decreased CVD mortality with a hazard ratio (95% confidence interval) of 0.73 (0.50-1.07; P trend=0.09). A Mendelian randomization analysis suggested that the causal relationships of HMW adiponectin and retinol-binding protein 4 would be directionally opposite to those observed based on the biomarkers, although none of the Mendelian randomization associations achieved statistical significance. These data suggest that higher levels of fatty acid-binding protein 4 and HMW adiponectin are associated with elevated CVD mortality among men with type 2 diabetes mellitus. Biological mechanisms underlying these observations deserve elucidation, but the associations of HMW adiponectin may partially reflect altered adipose tissue functionality among patients with type 2 diabetes mellitus. © 2016 American Heart Association, Inc.
Exercise Is Associated with Lower Long-Term Risk of Olfactory Impairment in Older Adults
Schubert, Carla R.; Cruickshanks, Karen J.; Nondahl, David M.; Klein, Barbara EK; Klein, Ronald; Fischer, Mary E.
2013-01-01
Importance: The prevalence of olfactory impairment is high in older adults and this decline in olfactory ability may pose health and safety risks, affect nutrition and decrease quality of life. It is important to identify modifiable risk factors to reduce the burden of olfactory impairment in aging populations. Objectives: To determine if exercise is associated with the 10-year cumulative incidence of olfactory impairment. Design, Setting and Participants: Observational longitudinal population-based Epidemiology of Hearing Loss Study. Participants without olfactory impairment (n=1611) were ages 53-97 years at baseline and were followed for up to ten years (1998-2010). Interventions: None. Main Outcome and Measures: Olfaction was measured with the San Diego Odor Identification Test at three examinations (1998-2000, 2003-2005, 2009-2010) of the Epidemiology of Hearing Loss Study. The main outcome was the incidence of olfactory impairment five (2003-2005) or ten (2009-2010) years later and the association of baseline exercise with the long-term risk of developing olfactory impairment. Results: The 10-year cumulative incidence of olfactory impairment was 27.6% (95% confidence interval =25.3, 29.9) and rates varied by age and sex; those who were older (Hazard Ratio =1.88, 95% Confidence Interval=1.74, 2.03, for every 5 years) or male (Hazard Ratio=1.27, 95% Confidence Interval=1.00, 1.61) had an increased risk of olfactory impairment. Participants who reported exercising at least once a week long enough to work up a sweat had a decreased risk of olfactory impairment (age and sex adjusted Hazard Ratio= 0.76, 95% CI= 0.60, 0.97). Increasing frequency of exercise was associated with decreasing risk of developing olfactory impairment (p for trend = 0.02). Conclusion and Relevance: Regular exercise was associated with lower 10-year cumulative incidence of olfactory impairment. Older adults who exercise may be able to retain olfactory function with age. PMID:24135745
Bruun, C; Guassora, A D; Nielsen, A B S; Siersma, V; Holstein, P E; de Fine Olivarius, N
2014-11-01
To investigate the predictive value of both patients' motivation and effort in their management of Type 2 diabetes and their life circumstances for the development of foot ulcers and amputations. This study was based on the Diabetes Care in General Practice study and Danish population and health registers. The associations between patient motivation, effort and life circumstances and foot ulcer prevalence 6 years after diabetes diagnosis and the incidence of amputation in the following 13 years were analysed using odds ratios from logistic regression and hazard ratios from Cox regression models, respectively. Foot ulcer prevalence 6 years after diabetes diagnosis was 2.93% (95% CI 1.86-4.00) among 956 patients. General practitioners' indication of 'poor' vs 'very good' patient motivation for diabetes management was associated with higher foot ulcer prevalence (odds ratio 6.11, 95% CI 1.22-30.61). The same trend was seen for 'poor' vs 'good' influence of the patient's own effort in diabetes treatment (odds ratio 7.06, 95% CI 2.65-18.84). Of 1058 patients examined at 6-year follow-up, 45 experienced amputation during the following 13 years. 'Poor' vs 'good' influence of the patients' own effort was associated with amputation (hazard ratio 7.12, 95% CI 3.40-14.92). When general practitioners assessed the influence of patients' life circumstances as 'poor' vs 'good', the amputation incidence increased (hazard ratio 2.97, 95% CI 1.22-7.24). 'Poor' vs 'very good' patient motivation was also associated with a higher amputation incidence (hazard ratio 7.57, 95% CI 2.43-23.57), although not in fully adjusted models. General practitioners' existing knowledge of patients' life circumstances, motivation and effort in diabetes management should be included in treatment strategies to prevent foot complications. © 2014 The Authors. Diabetic Medicine © 2014 Diabetes UK.
Fealy, Nigel; Aitken, Leanne; du Toit, Eugene; Lo, Serigne; Baldwin, Ian
2017-10-01
To determine whether blood flow rate influences circuit life in continuous renal replacement therapy. Prospective randomized controlled trial. Single center tertiary level ICU. Critically ill adults requiring continuous renal replacement therapy. Patients were randomized to receive one of two blood flow rates: 150 or 250 mL/min. The primary outcome was circuit life measured in hours. Circuit and patient data were collected until each circuit clotted or was ceased electively for nonclotting reasons. Data for clotted circuits are presented as median (interquartile range) and compared using the Mann-Whitney U test. Survival probability for clotted circuits was compared using the log-rank test. Circuit clotting data were analyzed for repeated events using hazards ratios. One hundred patients were randomized with 96 completing the study (150 mL/min, n = 49; 250 mL/min, n = 47) using 462 circuits (245 run at 150 mL/min and 217 run at 250 mL/min). Median circuit life for first circuit (clotted) was similar for both groups (150 mL/min: 9.1 hr [5.5-26 hr] vs 10 hr [4.2-17 hr]; p = 0.37). Continuous renal replacement therapy using a blood flow rate set at 250 mL/min was not more likely to cause clotting compared with 150 mL/min (hazards ratio, 1.00 [0.60-1.69]; p = 0.68). Gender, body mass index, weight, vascular access type, length, site, and mode of continuous renal replacement therapy or international normalized ratio had no effect on clotting risk. Continuous renal replacement therapy without anticoagulation was more likely to cause clotting compared with use of heparin strategies (hazards ratio, 1.62; p = 0.003). Longer activated partial thromboplastin time (hazards ratio, 0.98; p = 0.002) and decreased platelet count (hazards ratio, 1.19; p = 0.03) were associated with a reduced likelihood of circuit clotting. There was no difference in circuit life whether using blood flow rates of 250 or 150 mL/min during continuous renal replacement therapy.
Nam, Jin Young; Choi, Young; Kim, Juyeong; Cho, Kyoung Hee; Park, Eun-Cheol
2017-08-15
The relationships between breastfeeding discontinuation and cesarean section delivery, and the occurrence of postpartum depression (PPD) remain unclear. Therefore, we aimed to investigate the association of breastfeeding discontinuation and cesarean section delivery with PPD during the first 6 months after delivery. Data were extracted from the Korean National Health Insurance Service-National Sample Cohort for 81,447 women who delivered during 2004-2013. PPD status was determined using the diagnosis code at outpatient or inpatient visit during the 6-month postpartum period. Breastfeeding discontinuation and cesarean section delivery were identified from prescription of lactation suppression drugs and diagnosis, respectively. Cox proportional hazards models were used to calculate adjusted hazard ratios. Of the 81,447 women, 666 (0.82%) had PPD. PPD risk was higher in women who discontinued breastfeeding than in those who continued breastfeeding (hazard ratio=3.23, P<0.0001), in women with cesarean section delivery than in those with vaginal delivery (hazard ratio=1.26, P=0.0040), and in women with cesarean section delivery who discontinued breastfeeding than in those with vaginal delivery who continued breastfeeding (hazard ratio=4.92, P<0.0001). Study limitations include low PPD incidence; use of indirect indicators for PPD, breastfeeding discontinuation, and working status, which could introduce selection bias and errors due to miscoding; and potential lack of adjustment for important confounders. Breastfeeding discontinuation and cesarean section delivery were associated with PPD during the 6-month postpartum period. Our results support the implementation of breastfeeding promoting policies, and PPD screening and treatment programs during the early postpartum period. Copyright © 2017 Elsevier B.V. All rights reserved.
Association of Modality with Mortality among Canadian Aboriginals
Hemmelgarn, Brenda; Rigatto, Claudio; Komenda, Paul; Yeates, Karen; Promislow, Steven; Mojica, Julie; Tangri, Navdeep
2012-01-01
Background and objectives: Previous studies have shown that Aboriginals and Caucasians experience similar outcome on dialysis in Canada. Using the Canadian Organ Replacement Registry, this study examined whether dialysis modality (peritoneal or hemodialysis) impacted mortality in Aboriginal patients. Design, setting, participants, & measurements: This study identified 31,576 adult patients (hemodialysis: Aboriginal=1839, Caucasian=21,430; peritoneal dialysis: Aboriginal=554, Caucasian=6769) who initiated dialysis between January of 2000 and December of 2009. Aboriginal status was identified by self-report. Dialysis modality was determined 90 days after dialysis initiation. Multivariate Cox proportional hazards and competing risk models were constructed to determine the association between race and mortality by dialysis modality. Results: During the study period, 939 (51.1%) Aboriginals and 12,798 (53.3%) Caucasians initiating hemodialysis died, whereas 166 (30.0%) and 2037 (30.1%), respectively, initiating peritoneal dialysis died. Compared with Caucasians, Aboriginals on hemodialysis had a comparable risk of mortality (adjusted hazards ratio=1.04, 95% confidence interval=0.96–1.11, P=0.37). However, on peritoneal dialysis, Aboriginals experienced a higher risk of mortality (adjusted hazards ratio=1.36, 95% confidence interval=1.13–1.62, P=0.001) and technique failure (adjusted hazards ratio=1.29, 95% confidence interval=1.03–1.60, P=0.03) than Caucasians. The risk of technique failure varied by patient age, with younger Aboriginals (<50 years old) more likely to develop technique failure than Caucasians (adjusted hazards ratio=1.76, 95% confidence interval=1.23–2.52, P=0.002). Conclusions: Aboriginals on peritoneal dialysis experience higher mortality and technique failure relative to Caucasians. Reasons for this race disparity in peritoneal dialysis outcomes are unclear. PMID:22997343
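The competing-risk models mentioned above estimate cumulative incidence rather than the complement of the Kaplan-Meier curve; a minimal NumPy sketch of the Aalen-Johansen cumulative incidence estimator is shown below (the event coding and toy data are assumptions for illustration, not the registry analysis).

```python
import numpy as np

def cumulative_incidence(time, event):
    """Aalen-Johansen cumulative incidence for event type 1 in the presence
    of a competing event type 2 (event: 0 = censored, 1 = event of interest,
    2 = competing event). Returns (event_times, CIF values)."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    event_times = np.unique(time[event > 0])
    surv = 1.0                      # all-cause Kaplan-Meier S(t-)
    cif = 0.0
    cif_values = []
    for t in event_times:
        n_at_risk = np.sum(time >= t)
        d_interest = np.sum((time == t) & (event == 1))
        d_any = np.sum((time == t) & (event > 0))
        cif += surv * d_interest / n_at_risk   # mass added at t for event type 1
        surv *= 1.0 - d_any / n_at_risk        # update all-cause survival past t
        cif_values.append(cif)
    return event_times, np.array(cif_values)

# Toy example: follow-up in months; 1 = death, 2 = competing event, 0 = censored.
t_months = [3, 5, 5, 8, 12, 14, 20, 22]
status = [1, 0, 2, 1, 0, 2, 1, 0]
times, cif = cumulative_incidence(t_months, status)
```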
Olsen, Anne-Marie Schjerning; Fosbøl, Emil L; Lindhardsen, Jesper; Folke, Fredrik; Charlot, Mette; Selmer, Christian; Bjerring Olesen, Jonas; Lamberts, Morten; Ruwald, Martin H; Køber, Lars; Hansen, Peter R; Torp-Pedersen, Christian; Gislason, Gunnar H
2012-10-16
The cardiovascular risk after the first myocardial infarction (MI) declines rapidly during the first year. We analyzed whether the cardiovascular risk associated with using nonsteroidal anti-inflammatory drugs (NSAIDs) was associated with the time elapsed following first-time MI. We identified patients aged 30 years or older admitted with first-time MI in 1997 to 2009 and subsequent NSAID use by individual-level linkage of nationwide registries of hospitalization and drug dispensing from pharmacies in Denmark. We calculated the incidence rates of death and a composite end point of coronary death or nonfatal recurrent MIs associated with NSAID use in 1-year time intervals up to 5 years after inclusion and analyzed risk by using multivariable adjusted time-dependent Cox proportional hazards models. Of the 99 187 patients included, 43 608 (44%) were prescribed NSAIDs after the index MI. There were 36 747 deaths and 28 693 coronary deaths or nonfatal recurrent MIs during the 5 years of follow-up. Relative to noncurrent treatment with NSAIDs, the use of any NSAID in the years following MI was persistently associated with an increased risk of death (hazard ratio 1.59 [95% confidence interval, 1.49-1.69] after 1 year and hazard ratio 1.63 [95% confidence interval, 1.52-1.74] after 5 years) and coronary death or nonfatal recurrent MI (hazard ratio, 1.30 [95% confidence interval, 1.22-1.39] and hazard ratio, 1.41 [95% confidence interval, 1.28-1.55]). The use of NSAIDs is associated with persistently increased coronary risk regardless of time elapsed after first-time MI. We advise long-term caution in the use of NSAIDs for patients after MI.
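The time-dependent Cox analysis described above treats NSAID exposure as a covariate that switches on and off over follow-up, which is usually handled with data in long (start-stop) format. The sketch below uses lifelines' CoxTimeVaryingFitter on a tiny illustrative data set; the column names and values are assumptions, not the Danish registry data.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long-format data: one row per interval of constant NSAID exposure per patient.
long_df = pd.DataFrame({
    "id":    [1, 1, 2, 3, 3, 4, 5, 5, 6],
    "start": [0, 100, 0, 0, 150, 0, 0, 90, 0],
    "stop":  [100, 250, 300, 150, 500, 400, 90, 600, 200],
    "nsaid": [0, 1, 0, 0, 1, 0, 1, 0, 1],   # current NSAID exposure in the interval
    "event": [0, 1, 1, 0, 0, 1, 0, 0, 1],   # death / coronary event at 'stop'
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for 'nsaid' is the time-dependent hazard ratio
```

In the actual analysis each patient contributes many such rows, one per year of follow-up and per change in exposure, with the additional adjustment covariates included as extra columns.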
Grønhøj, C; Jensen, D; Dehlendorff, C; Nørregaard, C; Andersen, E; Specht, L; Charabi, B; von Buchwald, C
2018-06-01
The distinct difference in disease phenotype of human papillomavirus-positive (HPV+) and -negative (HPV-) oropharyngeal squamous cell cancer (OPSCC) patients might also be apparent when assessing the effect of time to treatment initiation (TTI). We assessed the overall survival and progression-free survival (PFS) effect from increasing TTI for HPV+ and HPV- OPSCC patients. We examined patients who received curative-intended therapy for OPSCC in eastern Denmark between 2000 and 2014. TTI was the number of days from diagnosis to the initiation of curative treatment. Overall survival and PFS were measured from the start of treatment and estimated with the Kaplan-Meier estimator. Hazard ratios and 95% confidence intervals were estimated with Cox proportional hazard regression. At a median follow-up of 3.6 years (interquartile range 1.86-6.07 years), 1177 patients were included (59% HPV+). In the adjusted analysis for the HPV+ and HPV- patient population, TTI influenced overall survival and PFS, most evident in the HPV- group, where TTI >60 days statistically significantly influenced overall survival but not PFS (overall survival: hazard ratio 1.60; 95% confidence interval 1.04-2.45; PFS: hazard ratio 1.46; 95% confidence interval 0.96-2.22). For patients with a TTI >60 days in the HPV+ group, TTI affected overall survival and PFS similarly, with slightly lower hazard ratio estimates of 1.44 (95% confidence interval 0.83-2.51) and 1.15 (95% confidence interval 0.70-1.88), respectively. For patients treated for a HPV+ or HPV- OPSCC, TTI affects outcome, with the strongest effect for overall survival among HPV- patients. Reducing TTI is an important tool to improve the prognosis. Copyright © 2018. Published by Elsevier Ltd.
Müller, G; Wellmann, J; Hartwig, S; Greiser, K H; Moebus, S; Jöckel, K-H; Schipf, S; Völzke, H; Maier, W; Meisinger, C; Tamayo, T; Rathmann, W; Berger, K
2015-08-01
To analyse the association of neighbourhood unemployment with incident self-reported physician-diagnosed Type 2 diabetes in a population aged 45-74 years from five German regions. Study participants were linked via their addresses at baseline to particular neighbourhoods. Individual-level data from five population-based studies were pooled and combined with contextual data on neighbourhood unemployment. Type 2 diabetes was assessed according to a self-reported physician diagnosis of diabetes. We estimated proportional hazard models (Weibull distribution) in order to obtain hazard ratios and 95% CIs of Type 2 diabetes mellitus, taking into account interval-censoring and clustering. We included 7250 participants residing in 228 inner city neighbourhoods in five German regions in our analysis. The incidence rate was 12.6 per 1000 person-years (95% CI 11.4-13.8). The risk of Type 2 diabetes mellitus was higher in men [hazard ratio 1.79 (95% CI 1.47-2.18)] than in women and higher in people with a low education level [hazard ratio 1.55 (95% CI 1.18-2.02)] than in those with a high education level. Independently of individual-level characteristics, we found a higher risk of Type 2 diabetes mellitus in neighbourhoods with high levels of unemployment [quintile 5; hazard ratio 1.72 (95% CI 1.23-2.42)] than in neighbourhoods with low unemployment (quintile 1). Low education level and high neighbourhood unemployment were independently associated with an elevated risk of Type 2 diabetes mellitus. Studies examining the impact of the residential environment on Type 2 diabetes mellitus will provide knowledge that is essential for the identification of high-risk populations. © 2014 The Authors. Diabetic Medicine © 2014 Diabetes UK.
Farré, Núria; Aranyó, Júlia; Enjuanes, Cristina; Verdú-Rotellar, José María; Ruiz, Sonia; Gonzalez-Robledo, Gina; Meroño, Oona; de Ramon, Marta; Moliner, Pedro; Bruguera, Jordi; Comin-Colet, Josep
2015-02-15
Obese patients with chronic heart failure (HF) have better outcomes than their lean counterparts, although little is known about the pathophysiology of this obesity paradox. Our aim was to evaluate the hypothesis that patients with chronic HF and obesity (defined as body mass index (BMI) ≥30 kg/m2) may have an attenuated neurohormonal activation in comparison with non-obese patients. The present study is a post-hoc analysis of a cohort of 742 chronic HF patients from a single-center study evaluating sympathetic activation by measuring baseline levels of norepinephrine (NE). Obesity was present in 33% of patients. Higher BMI and obesity were significantly associated with lower NE levels in multivariable linear regression models adjusted for covariates (p<0.001). Adding NE to multivariate Cox proportional hazards models attenuated the prognostic impact of BMI on outcomes. Finally, when we explored the prognostic impact of raised NE levels (>70th percentile) in separate analyses of obese and non-obese patients, we found that in both groups NE remained a significant independent predictor of poorer outcomes, despite the lower NE levels in patients with chronic HF and obesity: all-cause mortality hazard ratio=2.37 (95% confidence interval, 1.14-4.94) and hazard ratio=1.59 (95% confidence interval, 1.05-2.4) in obese and non-obese patients, respectively; and cardiovascular mortality hazard ratio=3.08 (95% confidence interval, 1.05-9.01) in obese patients and hazard ratio=2.08 (95% confidence interval, 1.42-3.05) in non-obese patients. Patients with chronic HF and obesity have significantly lower sympathetic activation. This finding may partially explain the obesity paradox described in chronic HF patients. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Baeten, Jared M; Donnell, Deborah; Kapiga, Saidi H; Ronald, Allan; John-Stewart, Grace; Inambao, Mubiana; Manongi, Rachel; Vwalika, Bellington; Celum, Connie
2010-03-13
Male circumcision reduces female-to-male HIV-1 transmission risk by approximately 60%. Data assessing the effect of circumcision on male-to-female HIV-1 transmission are conflicting, with one observational study among HIV-1-serodiscordant couples showing reduced transmission but a randomized trial suggesting no short-term benefit of circumcision. Data collected as part of a prospective study among African HIV-1-serodiscordant couples were analyzed for the relationship between circumcision status of HIV-1-seropositive men and risk of HIV-1 acquisition among their female partners. Circumcision status was determined by physical examination. Cox proportional hazards analysis was used. A total of 1096 HIV-1-serodiscordant couples in which the male partner was HIV-1-infected were followed for a median of 18 months; 374 (34%) male partners were circumcised. Sixty-four female partners seroconverted to HIV-1 (incidence 3.8 per 100 person-years). Circumcision of the male partner was associated with a nonstatistically significant approximately 40% lower risk of HIV-1 acquisition by the female partner (hazard ratio 0.62, 95% confidence interval 0.35-1.10, P = 0.10). The magnitude of this effect was similar when restricted to the subset of HIV-1 transmission events confirmed by viral sequencing to have occurred within the partnership (n = 50, hazard ratio 0.57, P = 0.11), after adjustment for male partner plasma HIV-1 concentrations (hazard ratio 0.60, P = 0.13), and when excluding follow-up time for male partners who initiated antiretroviral therapy (hazard ratio 0.53, P = 0.07). Among HIV-1-serodiscordant couples in which the HIV-1-seropositive partner was male, we observed no increased risk and potentially decreased risk from circumcision on male-to-female transmission of HIV-1.
Satoh, Michihiro; Ohkubo, Takayoshi; Asayama, Kei; Murakami, Yoshitaka; Sakurai, Masaru; Nakagawa, Hideaki; Iso, Hiroyasu; Okayama, Akira; Miura, Katsuyuki; Imai, Yutaka; Ueshima, Hirotsugu; Okamura, Tomonori
2015-03-01
No large-scale, longitudinal studies have examined the combined effects of blood pressure (BP) and total cholesterol levels on long-term risks for subtypes of cardiovascular death in an Asian population. To investigate these relationships, a meta-analysis of individual participant data, which included 73 916 Japanese subjects (age, 57.7 years; men, 41.1%) from 11 cohorts, was conducted. During a mean follow-up of 15.0 years, deaths from coronary heart disease, ischemic stroke, and intraparenchymal hemorrhage occurred in 770, 724, and 345 cases, respectively. Cohort-stratified Cox proportional hazard models were used. After stratifying the participants by 4 systolic BP ×4 total cholesterol categories, the group with systolic BP ≥160 mm Hg with total cholesterol ≥5.7 mmol/L had the greatest risk for coronary heart disease death (adjusted hazard ratio, 4.39; P<0.0001 versus group with systolic BP <120 mm Hg and total cholesterol <4.7 mmol/L). The adjusted hazard ratios of systolic BP (per 20 mm Hg) increased with increases in total cholesterol categories (hazard ratio, 1.52; P<0.0001 in group with total cholesterol ≥5.7 mmol/L). Similarly, the adjusted hazard ratios of total cholesterol increased with increases in systolic BP categories (P for interaction ≤0.04). Systolic BP was positively associated with ischemic stroke and intraparenchymal hemorrhage death, and total cholesterol was inversely associated with intraparenchymal hemorrhage, but no significant interactions between BP and total cholesterol were observed for stroke. High BP and high total cholesterol can synergistically increase the risk for coronary heart disease death but not for stroke in the Asian population. © 2015 American Heart Association, Inc.
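The cohort-stratified Cox model with a BP-by-cholesterol interaction can be sketched as follows (Python, lifelines, on simulated data; all variable names, the simulated effect sizes, and the censoring scheme are assumptions, not the pooled Japanese data).

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated pooled individual-participant data from 11 cohorts (illustrative only).
rng = np.random.default_rng(2)
n = 2000
cohort = rng.integers(0, 11, n)
sbp20 = rng.normal(7.0, 1.0, n)          # systolic BP in units of 20 mm Hg
tc = rng.normal(5.2, 0.9, n)             # total cholesterol, mmol/L
lin = 0.4 * (sbp20 - 7.0) + 0.3 * (tc - 5.2) + 0.2 * (sbp20 - 7.0) * (tc - 5.2)
time_to_event = rng.exponential(scale=20.0 * np.exp(-lin))
censor = rng.uniform(5.0, 25.0, n)
df = pd.DataFrame({
    "cohort": cohort,
    "sbp20": sbp20,
    "tc": tc,
    "sbp20_x_tc": sbp20 * tc,            # product term for the interaction
    "time": np.minimum(time_to_event, censor),
    "chd_death": (time_to_event <= censor).astype(int),
})

# Cohort-stratified Cox model: each cohort keeps its own baseline hazard.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="chd_death", strata=["cohort"])
cph.print_summary()   # exp(coef) of sbp20_x_tc indexes the BP-by-cholesterol interaction
```

Stratifying on cohort, rather than including it as a covariate, is what allows the baseline hazard to differ across studies while the covariate effects are pooled.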
Association between divorce and risks for acute myocardial infarction.
Dupre, Matthew E; George, Linda K; Liu, Guangya; Peterson, Eric D
2015-05-01
Divorce is a major life stressor that can have economic, emotional, and physical health consequences. However, the cumulative association between divorce and risks for acute myocardial infarction (AMI) is unknown. This study investigated the association between lifetime exposure to divorce and the incidence of AMI in US adults. We used nationally representative data from a prospective cohort of ever-married adults aged 45 to 80 years (n=15,827) who were followed biennially from 1992 to 2010. Approximately 14% of men and 19% of women were divorced at baseline and more than one third of the cohort had ≥1 divorce in their lifetime. In 200,524 person-years of follow-up, 8% (n=1211) of the cohort had an AMI and age-specific rates of AMI were consistently higher in those who were divorced compared with those who were continuously married (P<0.05). Results from competing-risk hazard models showed that AMI risks were significantly higher in women who had 1 divorce (hazard ratio, 1.24; 95% confidence interval, 1.01-1.55), ≥2 divorces (hazard ratio, 1.77; 95% confidence interval, 1.30-2.41), and among the remarried (hazard ratio, 1.35; 95% confidence interval, 1.07-1.70) compared with continuously married women after adjusting for multiple risk factors. Multivariable-adjusted risks were elevated only in men with a history of ≥2 divorces (hazard ratio, 1.30; 95% confidence interval, 1.02-1.66) compared with continuously married men. Men who remarried had no significant risk for AMI. Interaction terms for sex were not statistically significant. Divorce is a significant risk factor for AMI. The risks associated with multiple divorces are especially high in women and are not reduced with remarriage. © 2015 American Heart Association, Inc.
Liu, Jui-Ming; Hsu, Ren-Jun; Chang, Fung-Wei; Chiu, Feng-Hsiang; Yeh, Chia-Lun; Huang, Chun-Fa; Chang, Shu-Ting; Lee, Hung-Chang; Chi, Hsin; Lin, Chien-Yu
2017-01-01
Scabies is a common and annoying disorder. Pernicious anemia (PA) is a serious disease which, when untreated, leads to death. Mounting evidence suggests that immune-mediated inflammatory processes play a role in the pathophysiology of both diseases. The relationship between these two diseases has not been investigated. We conducted this study to explore the potential relationship between scabies and PA. This nationwide, population-based study was conducted using the National Health Insurance Research Database of Taiwan. In total, 5,407 patients with scabies were identified as a study group and 20,089 matched patients were randomly selected as a control group. We tracked patients in both groups for a 7-year period to identify the incidence of PA. The demographic characteristics and comorbidities of the patients were analyzed, and Cox proportional hazards regression was used to calculate the hazard ratios for PA. Of the 25,496 patients in this study, 183 (0.7%) patients with newly diagnosed PA were identified during the 7-year follow-up period; 71 of 5,407 (1.3%) from the scabies group and 112 of 20,089 (0.6%) from the control group. Patients with scabies had a higher risk of subsequent PA, with a crude hazard ratio of 2.368. After adjusting for covariates, the adjusted hazard ratio was 1.51 (95% confidence interval: 1.09-2.08). This study demonstrated an increased risk of PA (adjusted hazard ratio 1.51) among patients with scabies. Immune-mediated inflammatory processes may contribute to this association. Further studies are warranted to investigate the full pathological mechanisms linking these two diseases. Physicians should pay attention to patients with a history of scabies who present with anemia. Further confirmatory testing for PA may contribute to correct diagnosis and the initiation of vitamin B12 supplementation.
Woo, Jong Shin; Lee, Tae Won; Ihm, Chun Gyoo; Kim, Yang Gyoon; Moon, Joo Young; Lee, Sang Ho; Jeong, Myung Ho; Jeong, Kyung Hwan
2016-01-01
Objective A high serum triglyceride to high-density lipoprotein cholesterol (TG/HDL-C) ratio has been reported as an independent predictor for cardiovascular events in the general population. However, the prognostic value of this ratio in patients with renal dysfunction is unclear. We examined the association of the TG/HDL-C ratio with major adverse cardiovascular events (MACEs) according to renal function in patients with acute myocardial infarction (AMI). Method This study was based on the Korea Acute Myocardial Infarction Registry database. Of 13,897 patients who were diagnosed with AMI, the study population included the 7,016 patients with available TG/HDL-C ratio data. Patients were stratified into three groups according to their estimated glomerular filtration rate (eGFR), and the TG/HDL-C ratio was categorized into tertiles. We investigated 12-month MACEs, which included cardiac death, myocardial infarction, and repeated percutaneous coronary intervention or coronary artery bypass grafting. Results During the 12-month follow-up period, 593 patients experienced MACEs. There was a significant association between the TG/HDL-C ratio and MACEs (p<0.001) in the entire study cohort. A TG/HDL-C ratio in the highest tertile was an independent factor associated with an increased risk of MACEs (hazard ratio [HR], 1.56; 95% confidence interval [CI], 1.26–1.93; p<0.001). We then performed subgroup analyses according to renal function. In patients with normal renal function (eGFR ≥ 90 ml/min/1.73m2) and mild renal dysfunction (eGFR ≥ 60 to < 90 ml/min/1.73m2), a higher TG/HDL-C ratio was significantly associated with increased risk of MACEs (HR, 1.64; 95% CI, 1.04–2.60; p = 0.035; and HR, 1.56; 95% CI, 1.14–2.12; p = 0.005, respectively). However, in patients with moderate renal dysfunction (eGFR < 60 ml/min/1.73m2), the TG/HDL-C ratio lost its predictive value for the risk of MACEs (HR, 1.23; 95% CI, 0.82–1.83; p = 0.317). Conclusions In patients with AMI, the TG/HDL-C ratio is a useful independent predictor of 12-month MACEs. However, this ratio does not have predictive power in patients with moderate renal dysfunction. PMID:27788233
Parashos, Sotirios A; Luo, Sheng; Biglan, Kevin M; Bodis-Wollner, Ivan; He, Bo; Liang, Grace S; Ross, G Webster; Tilley, Barbara C; Shulman, Lisa M
2014-06-01
Optimizing assessments of rate of progression in Parkinson disease (PD) is important in designing clinical trials, especially of potential disease-modifying agents. To examine the value of measures of impairment, disability, and quality of life in assessing progression in early PD. Inception cohort analysis of data from 413 patients with early, untreated PD who were enrolled in 2 multicenter, randomized, double-blind clinical trials. Participants were randomly assigned to 1 of 5 treatments (67 received creatine, 66 received minocycline, 71 received coenzyme Q10, 71 received GPI-1485, and 138 received placebo). We assessed the association between the rates of change in measures of impairment, disability, and quality of life and time to initiation of symptomatic treatment. Time between baseline assessment and need for the initiation of symptomatic pharmaceutical treatment for PD was the primary indicator of disease progression. After adjusting for baseline confounding variables with regard to the Unified Parkinson's Disease Rating Scale (UPDRS) Part II score, the UPDRS Part III score, the modified Rankin Scale score, level of education, and treatment group, we assessed the rate of change for the following measurements: the UPDRS Part II score; the UPDRS Part III score; the Schwab and England Independence Scale score (which measures activities of daily living); the Total Functional Capacity scale; the 39-item Parkinson's Disease Questionnaire, summary index, and activities of daily living subscale; and version 2 of the 12-item Short Form Health Survey Physical Summary and Mental Summary. Variables reaching the statistical threshold in univariate analysis were entered into a multivariable Cox proportional hazards model using time to symptomatic treatment as the dependent variable. More rapid change (ie, worsening) in the UPDRS Part II score (hazard ratio, 1.15 [95% CI, 1.08-1.22] for 1 scale unit change per 6 months), the UPDRS Part III score (hazard ratio, 1.09 [95% CI, 1.06-1.13] for 1 scale unit change per 6 months), and the Schwab and England Independence Scale score (hazard ratio, 1.29 [95% CI, 1.12-1.48] for 5 percentage point change per 6 months) was associated with earlier need for symptomatic therapy. In early PD, the UPDRS Part II score and Part III score and the Schwab and England Independence Scale score can be used to measure disease progression, whereas the 39-item Parkinson's Disease Questionnaire and summary index, Total Functional Capacity scale, and the 12-item Short Form Health Survey Physical Summary and Mental Summary are not sensitive to change. clinicaltrials.gov Identifiers: NCT00063193 and NCT00076492.
Development of vulnerability curves to typhoon hazards based on insurance policy and claim dataset
NASA Astrophysics Data System (ADS)
Mo, Wanmei; Fang, Weihua; Li, Xinze; Wu, Peng; Tong, Xingwei
2016-04-01
Vulnerability refers to the characteristics and circumstances of an exposure that make it susceptible to the effects of certain hazards. It can be divided into physical vulnerability, social vulnerability, economic vulnerability and environmental vulnerability. Physical vulnerability indicates the potential physical damage to exposure caused by natural hazards. Vulnerability curves, which quantify the loss ratio against hazard intensity with a horizontal axis for the intensity and a vertical axis for the Mean Damage Ratio (MDR), are essential to the vulnerability assessment and quantitative evaluation of disasters. Fragility refers to the probability of reaching different damage states under a given hazard intensity, and is a characteristic of the exposure. Fragility curves are often used to quantify the probability of a given set of exposure reaching or exceeding a certain damage state. The development of quantitative fragility and vulnerability curves is the basis of catastrophe modeling. Generally, methods for quantitative fragility and vulnerability assessment can be categorized into empirical, analytical and expert opinion or judgment-based ones. The empirical method is one of the most popular and relies heavily on the availability and quality of historical hazard and loss datasets, which has always been a great challenge. The analytical method is usually based on engineering experiments; it is time-consuming and lacks built-in validation, so its credibility is also sometimes criticized. The expert opinion or judgment-based method is quite effective in the absence of data, but the results can be too subjective, so the uncertainty is likely to be underestimated. In this study, we present fragility and vulnerability curves developed with the empirical method based on simulated historical typhoon wind, rainfall and induced flood, and insurance policy and claim datasets of more than 100 historical typhoon events. Firstly, an insurance exposure classification system is built according to structure type, occupation type and insurance coverage. Then an MDR estimation method that accounts for insurance policy structure and claim information is proposed and validated. Following that, fragility and vulnerability curves of the major exposure types for construction, homeowner insurance and enterprise property insurance are fitted with empirical functions based on the historical dataset. The results of this study can not only help in understanding catastrophe risk and managing insured disaster risks, but can also be applied in other disaster risk reduction efforts.
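To make the curve-fitting step concrete: one common empirical choice (an assumption here, not the authors' stated functional form) is a bounded sigmoid for MDR against wind speed, fitted by non-linear least squares. A minimal Python sketch with invented observations:

    import numpy as np
    from scipy.optimize import curve_fit

    # Illustrative (invented) observations: typhoon wind speed (m/s) vs. mean damage ratio.
    wind = np.array([20, 25, 30, 35, 40, 45, 50, 55, 60], dtype=float)
    mdr  = np.array([0.001, 0.004, 0.012, 0.04, 0.09, 0.18, 0.30, 0.42, 0.55])

    def vulnerability(v, a, b, vmax=1.0):
        """Logistic-shaped vulnerability curve bounded by vmax (assumed functional form)."""
        return vmax / (1.0 + np.exp(-(v - a) / b))

    # Only a (midpoint) and b (slope) are fitted; vmax is held at its default of 1.0.
    params, cov = curve_fit(vulnerability, wind, mdr, p0=[55.0, 8.0])
    a_hat, b_hat = params
    print(f"fitted midpoint a = {a_hat:.1f} m/s, slope b = {b_hat:.1f}")
    print("predicted MDR at 50 m/s:", vulnerability(50.0, a_hat, b_hat))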
Tada, Toshifumi; Kumada, Takashi; Toyoda, Hidenori; Kiriyama, Seiki; Tanikawa, Makoto; Hisanaga, Yasuhiro; Kanamori, Akira; Kitabatake, Shusuke; Yama, Tsuyoki
2015-09-01
It has been reported that the branched-chain amino acid (BCAA) to tyrosine ratio (BTR) is a useful indicator of liver function and BCAA therapy is associated with a decreased incidence of hepatocellular carcinoma (HCC). However, there has not been sufficient research on the relationship between BTR and the effects of BCAA therapy after initial treatment of HCC. We investigated the impact of BTR and BCAA therapy on survival in patients with HCC. A total of 315 patients with HCC who were treated (n = 66) or not treated (n = 249) with BCAA were enrolled; of these, 66 were selected from each group using propensity score matching. Survival from liver-related mortality was analyzed. In patients who did not receive BCAA therapy (n = 249), multivariate analysis for factors associated with survival indicated that low BTR (≤ 4.4) was independently associated with poor prognosis in patients with HCC (hazard ratio, 1.880; 95% confidence interval, 1.125-3.143; P = 0.016). In addition, among patients selected by propensity score matching (n = 132), multivariate analysis indicated that BCAA therapy was independently associated with good prognosis in patients with HCC (hazard ratio, 0.524; 95% confidence interval, 0.282-0.973; P = 0.041). BTR was not significantly associated with survival. Intervention involving BCAA therapy improved survival in patients with HCC versus untreated controls, regardless of BTR. In addition, low BTR was associated with poor prognosis in patients who did not receive BCAA therapy. © 2015 Journal of Gastroenterology and Hepatology Foundation and Wiley Publishing Asia Pty Ltd.
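Propensity score matching of the kind used above is typically implemented by modelling treatment assignment from baseline covariates and pairing each treated patient with the nearest-score control. The following is a simplified, generic sketch with invented covariates (the study's actual matching variables are not listed in the abstract); it uses a greedy 1:1 nearest-neighbour match without a caliper, so controls may be reused.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(1)
    n = 315
    df = pd.DataFrame({
        "age": rng.normal(70, 9, n),         # invented covariate
        "btr": rng.normal(4.5, 1.2, n),      # invented covariate
        "treated": rng.integers(0, 2, n),    # BCAA therapy yes/no (simulated)
    })

    # Propensity score: probability of treatment given covariates.
    X = df[["age", "btr"]].values
    ps_model = LogisticRegression().fit(X, df["treated"])
    df["ps"] = ps_model.predict_proba(X)[:, 1]

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # Match each treated patient to the control with the closest propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched_controls = control.iloc[idx.ravel()]
    matched = pd.concat([treated, matched_controls])
    print(matched.groupby("treated")[["age", "btr", "ps"]].mean())  # crude balance check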
de Jongh, Renate T; Lips, Paul; van Schoor, Natasja M; Rijs, Kelly J; Deeg, Dorly J H; Comijs, Hannie C; Kramer, Mark H H; Vandenbroucke, Jan P; Dekkers, Olaf M
2011-10-01
To what extent endogenous subclinical thyroid disorders contribute to impaired physical and cognitive function, depression, and mortality in older individuals remains a matter of debate. A population-based, prospective cohort of the Longitudinal Aging Study Amsterdam. TSH and, if necessary, thyroxine and triiodothyronine levels were measured in individuals aged 65 years or older. Participants were classified according to clinical categories of thyroid function. Participants with overt thyroid disease or use of thyroid medication were excluded, leaving 1219 participants for analyses. Outcome measures were physical and cognitive function, depressive symptoms (cross-sectional), and mortality (longitudinal). Sixty-four (5.3%) individuals had subclinical hypothyroidism and 34 (2.8%) individuals had subclinical hyperthyroidism. Compared with euthyroidism (n=1121), subclinical hypo- and hyperthyroidism were not significantly associated with impairment of physical or cognitive function, or depression. On the contrary, participants with subclinical hypothyroidism less often reported more than one activity limitation (odds ratio 0.44, 95% confidence interval (CI) 0.22-0.86). After a median follow-up of 10.7 years, 601 participants were deceased. Subclinical hypo- and hyperthyroidism were not associated with increased overall mortality risk (hazard ratio 0.89, 95% CI 0.59-1.35 and 0.69, 95% CI 0.40-1.20, respectively). This study does not support disadvantageous effects of subclinical thyroid disorders on physical or cognitive function, depression, or mortality in an older population.
Erythropoietin improves cardiac wasting and outcomes in a rat model of liver cancer cachexia.
Saitoh, Masakazu; Hatanaka, Michiyoshi; Konishi, Masaaki; Ishida, Junichi; Palus, Sandra; Ebner, Nicole; Döhner, Wolfram; von Haehling, Stephan; Anker, Stefan D; Springer, Jochen
2016-09-01
Erythropoietin administration, which is clinically used in cancer patients with cancer-induced anemia, also has potentially beneficial effects on nonhematopoietic organs. We assessed the effects of erythropoietin on cancer cachexia progression and cardiac wasting compared with placebo using the Yoshida hepatoma model. Wistar rats were divided into a sham group (n=10) and a tumor-bearing group (n=60). The tumor-bearing group was further randomized to placebo (n=28), 500 Units/kg/day (n=16) or 5000 Units/kg/day of erythropoietin (n=16). Body composition was measured using nuclear magnetic resonance spectroscopy, cardiac function using echocardiography, and physical activity using an infrared monitoring system. High-dose erythropoietin led to a significant improvement in survival of tumor-bearing rats compared with placebo (hazard ratio: 0.43, 95%CI: 0.20-0.92, p=0.030), whereas low-dose erythropoietin did not reach significance (hazard ratio: 0.46, 95%CI: 0.22-1.02, p=0.056). Loss of body weight, wasting of lean mass, fat mass, and reduced physical activity were ameliorated in rats treated with both low and high doses of erythropoietin (all p<0.05). Moreover, reduced left ventricular mass and left ventricular systolic function were also ameliorated in rats treated with low and high doses of erythropoietin (both p<0.05). Overall, the present data support the view that cardiac wasting induced by cancer cachexia plays an important role in impaired survival, and suggest that erythropoietin could be an effective therapeutic approach for cancer cachexia progression and cardiac wasting. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Speers, Corey; Liu, Meilan; Wilder-Romans, Kari; Lawrence, Theodore S.; Pierce, Lori J.; Feng, Felix Y.
2015-01-01
Purpose The molecular drivers of metastasis in breast cancer are not well understood. Therefore, we sought to identify the biological processes underlying distant progression and define a prognostic signature for metastatic potential in breast cancer. Experimental design In vivo screening for metastases was performed using Chick Chorioallantoic Membrane assays in 21 preclinical breast cancer models. Expressed genes associated with metastatic potential were identified using high-throughput analysis. Correlations with biological function were determined using the Database for Annotation, Visualization and Integrated Discovery. Results We identified a broad range of metastatic potential that was independent of intrinsic breast cancer subtypes. 146 genes were significantly associated with metastasis progression and were linked to cancer-related biological functions, including cell migration/adhesion, Jak-STAT, TGF-beta, and Wnt signaling. These genes were used to develop a platform-independent gene expression signature (M-Sig), which was trained and subsequently validated on 5 independent cohorts totaling nearly 1800 breast cancer patients with all p-values < 0.005 and hazard ratios ranging from approximately 2.5 to 3. On multivariate analysis accounting for standard clinicopathologic prognostic variables, M-Sig remained the strongest prognostic factor for metastatic progression, with p-values < 0.001 and hazard ratios > 2 in three different cohorts. Conclusion M-Sig is strongly prognostic for metastatic progression, and may provide clinical utility in combination with treatment prediction tools to better guide patient care. In addition, the platform-independent nature of the signature makes it an excellent research tool as it can be directly applied onto existing, and future, datasets. PMID:25974184
Kang, Jeehoon; Park, Jin Joo; Cho, Young-Jin; Oh, Il-Young; Park, Hyun-Ah; Lee, Sang Eun; Kim, Min-Seok; Cho, Hyun-Jai; Lee, Hae-Young; Choi, Jin Oh; Hwang, Kyung-Kuk; Kim, Kye Hun; Yoo, Byung-Su; Kang, Seok-Min; Baek, Sang Hong; Jeon, Eun-Seok; Kim, Jae-Joong; Cho, Myeong-Chan; Chae, Shung Chull; Oh, Byung-Hee; Choi, Dong-Ju
2018-03-13
Worsening renal function (WRF) is associated with adverse outcomes in patients with heart failure. We investigated the predictors and prognostic value of WRF during admission in patients with preserved ejection fraction (HFpEF) versus those with reduced ejection fraction (HFrEF). A total of 5625 patients were enrolled in the KorAHF (Korean Acute Heart Failure) registry. WRF was defined as an absolute increase in creatinine of ≥0.3 mg/dL. Transient WRF was defined as recovery of creatinine at discharge, whereas persistent WRF was indicated by a nonrecovered creatinine level. HFpEF and HFrEF were defined as a left ventricular ejection fraction ≥50% and ≤40%, respectively. Among the total population, WRF occurred in 3101 patients (55.1%). By heart failure subgroup, WRF occurred more frequently in HFrEF than in HFpEF (57.0% versus 51.3%; P<0.001). Prevalence of WRF increased as creatinine clearance decreased in both heart failure subgroups. Among various predictors of WRF, chronic renal failure was the strongest predictor. WRF was an independent predictor of adverse in-hospital outcomes (HFrEF: odds ratio, 2.75; 95% confidence interval, 1.50-5.02; P=0.001; HFpEF: odds ratio, 9.48; 95% confidence interval, 1.19-75.89; P=0.034) and 1-year mortality (HFrEF: hazard ratio, 1.41; 95% confidence interval, 1.12-1.78; P=0.004 versus HFpEF: hazard ratio, 1.72; 95% confidence interval, 1.23-2.42; P=0.002). Transient WRF was a risk factor for 1-year mortality, whereas persistent WRF had no additive risk compared to transient WRF. In patients with acute heart failure, WRF is an independent predictor of adverse in-hospital and follow-up outcomes in both HFrEF and HFpEF, though with a different effect size. URL: https://www.clinicaltrials.gov. Unique identifier: NCT01389843. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Aedo, Sócrates; Cavada, Gabriel; Blümel, Juan E; Chedraui, Peter; Fica, Juan; Barriga, Patricio; Brantes, Sergio; Irribarra, Cristina; Vallejo, María; Campodónico, Ítalo
2015-12-01
This study aims to determine time differences (differences in restricted mean survival times [RMSTs]) in the onset of invasive breast cancer, coronary heart disease, stroke, pulmonary embolism, colorectal cancer, and hip fracture between the placebo group and the conjugated equine estrogens 0.625 mg plus medroxyprogesterone acetate 2.5 mg group of the Women's Health Initiative (WHI) trial based on survival curves of the original report and to provide adequate interpretation of the clinical effects of a given intervention. Distribution of survival function was obtained from cumulative hazard plots of the WHI report; Monte Carlo simulation was performed to obtain censored observations for each outcome, in which assumptions of the Cox model were evaluated once corresponding hazard ratios had been estimated. Using estimation methods such as numerical integration, pseudovalues, and flexible parametric modeling, we determined differences in RMSTs for each outcome. Obtained cumulative hazard plots, hazard ratios, and outcome rates from the simulated model did not show differences in relation to the original WHI report. The differences in RMST between placebo and conjugated equine estrogens 0.625 mg plus medroxyprogesterone acetate 2.5 mg (in flexible parametric modeling) were 1.17 days (95% CI, -2.25 to 4.59) for invasive breast cancer, 7.50 days (95% CI, 2.90 to 12.11) for coronary heart disease, 2.75 days (95% CI, -0.84 to 6.34) for stroke, 4.23 days (95% CI, 1.82 to 6.64) for pulmonary embolism, -2.73 days (95% CI, -5.32 to -0.13) for colorectal cancer, and -2.77 days (95% CI, -5.44 to -0.1) for hip fracture. The differences in RMST for the outcomes of the WHI study are too small to establish clinical risks related to hormone therapy use.
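Restricted mean survival time is the area under the survival curve up to a chosen horizon tau, so a difference in RMST between arms can be obtained by integrating Kaplan-Meier estimates. A minimal sketch of the numerical-integration approach on simulated arms (the rates, horizon and arm labels are invented and are not the WHI data):

    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(2)

    def simulate_arm(rate, n=8000, horizon=5.0):
        """Exponential event times with uniform administrative censoring (illustrative only)."""
        t_event = rng.exponential(1 / rate, n)
        t_cens = rng.uniform(0, horizon, n)
        return np.minimum(t_event, t_cens), (t_event <= t_cens).astype(int)

    def rmst(durations, events, tau):
        """RMST = area under the Kaplan-Meier step function from 0 to tau."""
        km = KaplanMeierFitter().fit(durations, events)
        sf = km.survival_function_
        t = sf.index.values.astype(float)
        s = sf.iloc[:, 0].values.astype(float)
        keep = t < tau
        t_grid = np.append(t[keep], tau)               # close the step function at tau
        return float(np.sum(s[keep] * np.diff(t_grid)))

    t0, e0 = simulate_arm(rate=0.05)   # "placebo" arm
    t1, e1 = simulate_arm(rate=0.06)   # "treatment" arm
    tau = 5.0
    print("RMST difference (arm 0 minus arm 1) up to tau:", rmst(t0, e0, tau) - rmst(t1, e1, tau))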
Serum Urate at Trial Entry and ALS Progression in EMPOWER
O'Reilly, Éilis J.; Liu, Dawei; Johns, Donald R.; Cudkowicz, Merit E.; Paganoni, Sabrina; Schwarzschild, Michael A.; Leitner, Melanie; Ascherio, Alberto
2017-01-01
Objective Determine whether serum urate predicts ALS progression. Methods The study population comprised adult participants of EMPOWER (n=942), a phase III clinical trial to evaluate the efficacy of dexpramipexole to treat ALS. Urate was measured in blood samples collected during enrollment as part of the routine blood chemistry. Outcomes Combined Assessment of Function and Survival (CAFS) rank, and time to death, by 12 months. Results In women, there was no significant relation between urate and outcomes. In men, outcomes improved with increasing urate (comparing highest to lowest urate quartile: CAFS was 53 points better with p for trend=0.04; and hazard ratio for death was 0.60 with p for trend=0.07), but with adjustment for body mass index (BMI) at baseline, a predictor of both urate levels and prognosis, associations were attenuated and no longer statistically significant. Overall, participants with urate levels equal to or above the median (5.1 mg/dL) appeared to have a survival advantage compared to those below (hazard ratio adjusted for BMI: 0.67; 95% confidence interval: 0.47 to 0.95). Conclusion These findings suggest that while the association between urate at baseline and ALS progression is partially explained by BMI, there may be an independent beneficial effect of urate. PMID:27677562
Bouchi, Ryotaro; Fukuda, Tatsuya; Takeuchi, Takato; Minami, Isao; Yoshimoto, Takanobu; Ogawa, Yoshihiro
2017-11-01
Sarcopenia, defined as age-related loss of skeletal muscle mass and function, increases the risk of albuminuria. However, it is still unknown whether sarcopenia increases the risk of progression of albuminuria. A total of 238 patients with type 2 diabetes (mean age 64 ± 12 years; 39.2% women) were studied in the present retrospective observational study. The prevalence of sarcopenia was 17.6%. During the median follow-up period of 2.6 years, albuminuria was measured 5.8 ± 1.8 times, and progression of albuminuria was observed in 14.9% of patients with normoalbuminuria and in 11.5% of those with microalbuminuria. Sarcopenia was significantly associated with both progression (hazard ratio 2.61, 95% confidence interval 1.08-6.31, P = 0.034) and regression (hazard ratio 0.23, 95% confidence interval 0.05-0.98, P = 0.048) of albuminuria by multivariate Cox regression analysis. The present data suggest that sarcopenia is an important determinant of both progression and regression of albuminuria in patients with type 2 diabetes. © 2017 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.
UCH-L1 acts as a novel prognostic biomarker in gastric cardiac adenocarcinoma.
Yang, Honghong; Zhang, Chunhong; Fang, Shan; Ou, Rongying; Li, Wenfeng; Xu, Yunsheng
2015-01-01
Gastric cardiac adenocarcinoma (GCA) accounts for a majority of the gastric cancer population and carries an unfavorable outcome. Ubiquitin C-terminal hydrolase L1 (UCH-L1) belongs to the deubiquitinating enzyme family, which can regulate cell growth in human cancers. In the present study, expression of UCH-L1 was evaluated in 196 GCAs by immunohistochemistry using a tissue microarray, and its function in gastric cancer cells was assessed. UCH-L1 expression was increased in GCA specimens compared with the corresponding normal tissues, and UCH-L1 overexpression was tightly correlated with tumor size and overall TNM stage. Log-rank analysis showed that UCH-L1 positivity was inversely associated with cumulative survival (P<0.001). A multivariate Cox regression model showed that UCH-L1 overexpression was a marked negative predictor of GCA prognosis (hazard ratio=0.53, P<0.01), along with advanced TNM stage, a known negative factor in gastric cancers (hazard ratio=0.33, P<0.05). Silencing of UCH-L1 reduced the proliferation, colony formation, migration and invasion of gastric cancer cells. Our findings suggest that UCH-L1 is a promising prognostic biomarker for GCAs and might play an important role in the carcinogenesis of gastric cancer.
Kang, Minyong; Yu, Jiwoong; Sung, Hyun Hwan; Jeon, Hwang Gyun; Jeong, Byong Chang; Park, Se Hoon; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han Yong; Seo, Seong Il
2018-05-13
To examine the prognostic role of the pretreatment aspartate transaminase/alanine transaminase or De Ritis ratio in patients with metastatic renal cell carcinoma receiving first-line systemic tyrosine kinase inhibitor therapy. We retrospectively searched the medical records of 579 patients with metastatic renal cell carcinoma who visited Samsung Medical Center, Seoul, Korea, from January 2001 through August 2016. After excluding 210 patients, we analyzed 360 patients who received first-line tyrosine kinase inhibitor therapy. Cancer-specific survival and overall survival were defined as the primary and secondary end-points, respectively. A multivariate Cox proportional hazards regression model was used to identify independent prognosticators of survival outcomes. The overall population was divided into two groups according to the pretreatment De Ritis ratio as an optimal cut-off value of 1.2, which was determined by a time-dependent receiver operating characteristic curve analysis. Patients with a higher pretreatment De Ritis ratio (≥1.2) had worse cancer-specific survival and overall survival outcomes, compared with those with a lower De Ritis ratio (<1.2). Notably, a higher De Ritis ratio (≥1.2) was found to be an independent predictor of both cancer-specific survival (hazard ratio 1.61, 95% confidence interval 1.13-2.30) and overall survival outcomes (hazard ratio 1.69, 95% confidence interval 1.19-2.39), along with male sex, multiple metastasis (≥2), non-clear cell histology, advanced pT stage (≥3), previous metastasectomy and the Memorial Sloan Kettering Cancer Center risk classification. Our findings show that the pretreatment De Ritis ratio can provide valuable information about the survival outcomes of metastatic renal cell carcinoma patients receiving first-line tyrosine kinase inhibitor therapy. © 2018 The Japanese Urological Association.
Christensen, Bianca; Qin, Zijian; Byrd, Desiree A; Yu, Fang; Morgello, Susan; Gelman, Benjamin B; Moore, David J; Grant, Igor; Singer, Elyse J; Fox, Howard S; Baccaglini, Lorena
2017-10-01
With the transition of HIV infection from an acute to a chronic disease after the introduction of antiretroviral medications, there has been an increased focus on long-term neurocognitive and other functional outcomes of HIV patients. Thus, we assessed factors, particularly history of a substance use disorder, associated with time to loss of measures of physical or mental independence among HIV-positive individuals. Data were obtained from the National NeuroAIDS Tissue Consortium. Kaplan-Meier and Cox proportional hazards regression analyses were used to estimate the time since HIV diagnosis to loss of independence, and to identify associated risk factors. HIV-positive participants who self-identified as physically (n = 698) or mentally (n = 616) independent on selected activities of daily living at baseline were eligible for analyses. A history of substance use disorder was associated with a higher hazard of loss of both physical and mental independence [adjusted hazard ratio (HR) = 1.71, 95% confidence interval (95% CI): 1.07-2.78; adjusted HR = 1.67, 95% CI: 1.11-2.52, respectively]. After adjusting for substance use disorder and other covariates, older age at diagnosis and female gender were associated with higher hazards of loss of both physical and mental independence, non-white participants had higher hazards of loss of physical independence, whereas participants with an abnormal neurocognitive diagnosis and fewer years of education had higher hazards of loss of mental independence. In summary, history of substance use disorder was associated with loss of measures of both physical and mental independence. The nature of this link and the means to prevent such loss of independence need further investigation.
Cole, Stephen R.; Jacobson, Lisa P.; Tien, Phyllis C.; Kingsley, Lawrence; Chmiel, Joan S.; Anastos, Kathryn
2010-01-01
To estimate the net effect of imperfectly measured highly active antiretroviral therapy on incident acquired immunodeficiency syndrome or death, the authors combined inverse probability-of-treatment-and-censoring weighted estimation of a marginal structural Cox model with regression-calibration methods. Between 1995 and 2007, 950 human immunodeficiency virus–positive men and women were followed in 2 US cohort studies. During 4,054 person-years, 374 initiated highly active antiretroviral therapy, 211 developed acquired immunodeficiency syndrome or died, and 173 dropped out. Accounting for measured confounders and determinants of dropout, the weighted hazard ratio for acquired immunodeficiency syndrome or death comparing use of highly active antiretroviral therapy in the prior 2 years with no therapy was 0.36 (95% confidence limits: 0.21, 0.61). This association was relatively constant over follow-up (P = 0.19) and stronger than crude or adjusted hazard ratios of 0.75 and 0.95, respectively. Accounting for measurement error in reported exposure using external validation data on 331 men and women provided a hazard ratio of 0.17, with bias shifted from the hazard ratio to the estimate of precision as seen by the 2.5-fold wider confidence limits (95% confidence limits: 0.06, 0.43). Marginal structural measurement-error models can simultaneously account for 3 major sources of bias in epidemiologic research: validated exposure measurement error, measured selection bias, and measured time-fixed and time-varying confounding. PMID:19934191
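The weighted (marginal structural) Cox model used above is commonly implemented by estimating each subject's probability of the treatment actually received, forming stabilized inverse-probability weights, and fitting a weighted Cox regression with a robust variance. The sketch below is a deliberately simplified point-treatment version with invented variable names; the published analysis additionally used time-varying treatment, censoring weights and regression calibration, none of which are shown here.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 950
    cd4 = rng.normal(350, 120, n)                                        # confounder (illustrative)
    haart = rng.binomial(1, 1 / (1 + np.exp(-(350 - cd4) / 100)), n)     # treatment depends on CD4
    rate = 0.08 * np.exp(-0.8 * haart + 0.003 * (350 - cd4))
    time = rng.exponential(1 / rate)
    event = (time < 12).astype(int)                                      # events within 12 years
    time = np.minimum(time, 12)

    df = pd.DataFrame({"time": time, "event": event, "haart": haart, "cd4": cd4})

    # Stabilized weights: numerator = marginal P(treatment), denominator = P(treatment | confounders).
    p_denom = LogisticRegression().fit(df[["cd4"]], df["haart"]).predict_proba(df[["cd4"]])[:, 1]
    p_num = df["haart"].mean()
    df["sw"] = np.where(df["haart"] == 1, p_num / p_denom, (1 - p_num) / (1 - p_denom))

    # Weighted Cox model with treatment only; the weights stand in for the measured confounders.
    msm = CoxPHFitter()
    msm.fit(df[["time", "event", "haart", "sw"]], duration_col="time",
            event_col="event", weights_col="sw", robust=True)
    msm.print_summary()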
Todd, Jonathan V.; Cole, Stephen R.; Pence, Brian W.; Lesko, Catherine R.; Bacchetti, Peter; Cohen, Mardge H.; Feaster, Daniel J.; Gange, Stephen; Griswold, Michael E.; Mack, Wendy; Rubtsova, Anna; Wang, Cuiwei; Weedon, Jeremy; Anastos, Kathryn; Adimora, Adaora A.
2017-01-01
Abstract Depression affects up to 30% of human immunodeficiency virus (HIV)-infected individuals. We estimated joint effects of antiretroviral therapy (ART) initiation and depressive symptoms on time to death using a joint marginal structural model and data from a cohort of HIV-infected women from the Women's Interagency HIV Study (conducted in the United States) from 1998–2011. Among 848 women contributing 6,721 years of follow-up, 194 participants died during follow-up, resulting in a crude mortality rate of 2.9 per 100 women-years. Cumulative mortality curves indicated greatest mortality for women who reported depressive symptoms and had not initiated ART. The hazard ratio for depressive symptoms was 3.38 (95% confidence interval (CI): 2.15, 5.33) and for ART was 0.47 (95% CI: 0.31, 0.70). Using a reference category of women without depressive symptoms who had initiated ART, the hazard ratio for women with depressive symptoms who had initiated ART was 3.60 (95% CI: 2.02, 6.43). For women without depressive symptoms who had not started ART, the hazard ratio was 2.36 (95% CI: 1.16, 4.81). Among women reporting depressive symptoms who had not started ART, the hazard ratio was 7.47 (95% CI: 3.91, 14.3). We found a protective effect of ART initiation on mortality, as well as a harmful effect of depressive symptoms, in a cohort of HIV-infected women. PMID:28430844
Kim, Sung-Yong; Le Rademacher, Jennifer; Antin, Joseph H; Anderlini, Paolo; Ayas, Mouhab; Battiwalla, Minoo; Carreras, Jeanette; Kurtzberg, Joanne; Nakamura, Ryotaro; Eapen, Mary; Deeg, H Joachim
2014-12-01
A proportion of patients with aplastic anemia who are treated with immunosuppressive therapy develop clonal hematologic disorders, including post-aplastic anemia myelodysplastic syndrome. Many will proceed to allogeneic hematopoietic stem cell transplantation. We identified 123 patients with post-aplastic anemia myelodysplastic syndrome who from 1991 through 2011 underwent allogeneic hematopoietic stem cell transplantation, and in a matched-pair analysis compared outcome to that in 393 patients with de novo myelodysplastic syndrome. There was no difference in overall survival. There were no significant differences with regard to 5-year probabilities of relapse, non-relapse mortality, relapse-free survival and overall survival; these were 14%, 40%, 46% and 49% for post-aplastic anemia myelodysplastic syndrome, and 20%, 33%, 47% and 49% for de novo myelodysplastic syndrome, respectively. In multivariate analysis, relapse (hazard ratio 0.71; P=0.18), non-relapse mortality (hazard ratio 1.28; P=0.18), relapse-free survival (hazard ratio 0.97; P=0.80) and overall survival (hazard ratio 1.02; P=0.88) of post-aplastic anemia myelodysplastic syndrome were similar to those of patients with de novo myelodysplastic syndrome. Cytogenetic risk was independently associated with overall survival in both groups. Thus, transplant success in patients with post-aplastic anemia myelodysplastic syndrome was similar to that in patients with de novo myelodysplastic syndrome, and cytogenetics was the only significant prognostic factor for post-aplastic anemia myelodysplastic syndrome patients. Copyright© Ferrata Storti Foundation.
Ren, J S; Freedman, N D; Kamangar, F; Dawsey, S M; Hollenbeck, A R; Schatzkin, A; Abnet, C C
2010-07-01
The authors investigated the relationship between hot tea, iced tea, coffee and carbonated soft drinks consumption and upper gastrointestinal tract cancers risk in the NIH-AARP Study. During 2,584,953 person-years of follow-up on 481,563 subjects, 392 oral cavity, 178 pharynx, 307 larynx, 231 gastric cardia, 224 gastric non-cardia cancer, 123 Oesophageal Squamous Cell Carcinoma (ESCC) and 305 Oesophageal Adenocarcinoma (EADC) cases were accrued. Hazard ratios (HRs) and 95% confidence intervals (95% CIs) were calculated by multivariate-adjusted Cox regression. Compared to non-drinking, the hazard ratio for hot tea intake of > or =1 cup/day was 0.37 (95% CI: 0.20, 0.70) for pharyngeal cancer. The authors also observed a significant association between coffee drinking and risk of gastric cardia cancer (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 1.57 (95% CI: 1.03, 2.39)), and an inverse association between coffee drinking and EADC for the cases occurring in the last 3 years of follow-up (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 0.54 (95% CI: 0.31, 0.92)), but no association in earlier follow-up. In summary, hot tea intake was inversely associated with pharyngeal cancer, and coffee was directly associated with gastric cardia cancer, but was inversely associated with EADC during some follow-up periods. Published by Elsevier Ltd.
Anthropometry and the Risk of Lung Cancer in EPIC.
Dewi, Nikmah Utami; Boshuizen, Hendriek C; Johansson, Mattias; Vineis, Paolo; Kampman, Ellen; Steffen, Annika; Tjønneland, Anne; Halkjær, Jytte; Overvad, Kim; Severi, Gianluca; Fagherazzi, Guy; Boutron-Ruault, Marie-Christine; Kaaks, Rudolf; Li, Kuanrong; Boeing, Heiner; Trichopoulou, Antonia; Bamia, Christina; Klinaki, Eleni; Tumino, Rosario; Palli, Domenico; Mattiello, Amalia; Tagliabue, Giovanna; Peeters, Petra H; Vermeulen, Roel; Weiderpass, Elisabete; Torhild Gram, Inger; Huerta, José María; Agudo, Antonio; Sánchez, María-José; Ardanaz, Eva; Dorronsoro, Miren; Quirós, José Ramón; Sonestedt, Emily; Johansson, Mikael; Grankvist, Kjell; Key, Tim; Khaw, Kay-Tee; Wareham, Nick; Cross, Amanda J; Norat, Teresa; Riboli, Elio; Fanidi, Anouar; Muller, David; Bueno-de-Mesquita, H Bas
2016-07-15
The associations of body mass index (BMI) and other anthropometric measurements with lung cancer were examined in 348,108 participants in the European Prospective Investigation into Cancer and Nutrition (EPIC) between 1992 and 2010. The study population included 2,400 case patients with incident lung cancer, and the average length of follow-up was 11 years. Hazard ratios were calculated using Cox proportional hazards models in which we modeled smoking variables with cubic splines. Overall, there was a significant inverse association between BMI (weight (kg)/height (m)²) and the risk of lung cancer after adjustment for smoking and other confounders (for BMI of 30.0-34.9 versus 18.5-25.0, hazard ratio = 0.72, 95% confidence interval: 0.62, 0.84). The strength of the association declined with increasing follow-up time. Conversely, after adjustment for BMI, waist circumference and waist-to-height ratio were significantly positively associated with lung cancer risk (for the highest category of waist circumference vs. the lowest, hazard ratio = 1.25, 95% confidence interval: 1.05, 1.50). Given the decline of the inverse association between BMI and lung cancer over time, the association is likely at least partly due to weight loss resulting from preclinical lung cancer that was present at baseline. Residual confounding by smoking could also have influenced our findings. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
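Modelling a continuous confounder such as smoking with cubic splines inside a Cox model can be done by generating a spline basis and entering its columns as adjustment covariates. A generic sketch using patsy's natural cubic spline basis and lifelines on simulated data (the variable names, degrees of freedom and effect sizes are invented, not the EPIC specification):

    import numpy as np
    import pandas as pd
    from patsy import dmatrix
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(4)
    n = 4000
    bmi = rng.normal(26, 4, n)
    pack_years = rng.gamma(2.0, 10.0, n)                 # invented smoking-intensity variable
    rate = 0.005 * np.exp(0.03 * pack_years - 0.02 * (bmi - 26))
    t_event = rng.exponential(1 / rate)
    t_cens = rng.uniform(0, 15, n)
    df = pd.DataFrame({
        "time": np.minimum(t_event, t_cens),
        "event": (t_event <= t_cens).astype(int),
        "bmi": bmi,
    })

    # Natural cubic spline basis (4 df) for smoking, entered as adjustment covariates.
    spline = dmatrix("cr(x, df=4) - 1", {"x": pack_years}, return_type="dataframe")
    spline.columns = [f"smoke_s{i}" for i in range(spline.shape[1])]
    model_df = pd.concat([df, spline], axis=1)

    cph = CoxPHFitter()
    cph.fit(model_df, duration_col="time", event_col="event")
    cph.print_summary()   # the bmi row gives the BMI hazard ratio adjusted for the smoothed smoking term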
Pulmonary function and adverse cardiovascular outcomes: Can cardiac function explain the link?
Burroughs Peña, Melissa S; Dunning, Allison; Schulte, Phillip J; Durheim, Michael T; Kussin, Peter; Checkley, William; Velazquez, Eric J
2016-12-01
The complex interaction between pulmonary function, cardiac function and adverse cardiovascular events has only been partially described. We sought to describe the association between pulmonary function with left heart structure and function, all-cause mortality and incident cardiovascular hospitalization. This study is a retrospective analysis of patients evaluated in a single tertiary care medical center. We used multivariable linear regression analyses to examine the relationship between FVC and FEV1 with left ventricular ejection fraction (LVEF), left ventricular internal dimension in systole and diastole (LVIDS, LVIDD) and left atrial diameter, adjusting for baseline characteristics, right ventricular function and lung hyperinflation. We also used Cox proportional hazards models to examine the relationship between FVC and FEV1 with all-cause mortality and cardiac hospitalization. A total of 1807 patients were included in this analysis with a median age of 61 years and 50% were female. Decreased FVC and FEV1 were both associated with decreased LVEF. In individuals with FVC less than 2.75 L, decreased FVC was associated with increased all-cause mortality after adjusting for left and right heart echocardiographic variables (hazard ratio [HR] 0.49, 95% CI 0.29, 0.82, respectively). Decreased FVC was associated with increased cardiac hospitalization after adjusting for left heart size (HR 0.80, 95% CI 0.67, 0.96), even in patients with normal LVEF (HR 0.75, 95% CI 0.57, 0.97). In a tertiary care center reduced pulmonary function was associated with adverse cardiovascular events, a relationship that is not fully explained by left heart remodeling or right heart dysfunction. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kidane, Biniam; Sulman, Joanne; Xu, Wei; Kong, Qin Quinn; Wong, Rebecca; Knox, Jennifer J; Darling, Gail E
2016-06-01
Functional Assessment of Cancer Therapy-Esophagus is a health-related quality of life instrument validated in patients with esophageal cancer. It is composed of a general component and an esophageal cancer subscale. Our objective was to determine whether the baseline Functional Assessment of Cancer Therapy-Esophagus and esophageal cancer subscale scores are associated with survival in patients with stage II and III cancer of the gastroesophageal junction or thoracic esophagus. Data from 4 prospective studies in Canadian academic hospitals were combined. These included consecutive patients with stage II and III esophageal cancer who received neoadjuvant therapy followed by surgery or chemoradiation/radiation alone. All patients completed baseline Functional Assessment of Cancer Therapy-Esophagus. Functional Assessment of Cancer Therapy-Esophagus and esophageal cancer subscale scores were dichotomized on the basis of median scores. Cox regression analyses were performed. There were 207 patients treated between 1996 and 2014. Mean age was 61 ± 10.6 years. Approximately 69.6% of patients (n = 144) had adenocarcinoma. All patients had more than 9 months of follow-up. In patients with stage II and III, 93 deaths were observed. When treated as continuous variables, baseline Functional Assessment of Cancer Therapy-Esophagus and esophageal cancer subscale were associated with survival with hazard ratios of 0.89 (95% confidence interval [CI], 0.81-0.96; P = .005) and 0.68 (95% CI, 0.56-0.82; P < .001), respectively. When dichotomized, they were also associated with survival with a hazard ratio of 0.58 (95% CI, 0.38-0.89; P = .01) and 0.43 (95% CI, 0.28-0.67; P < .001), respectively. In patients with stage II and III esophageal cancer being considered for therapy, higher baseline Functional Assessment of Cancer Therapy-Esophagus and esophageal cancer subscale were independently associated with longer survival, even after adjusting for age, stage, histology, and therapy received. Further study is needed, but Functional Assessment of Cancer Therapy-Esophagus may be useful as a prognostic tool to inform patient decision-making and patient selection criteria for studies. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Failing, Jarrett J; Yan, Yiyi; Porrata, Luis F; Markovic, Svetomir N
2017-12-01
The peripheral blood lymphocyte-to-monocyte ratio (LMR) has been associated with prognosis in many malignancies including metastatic melanoma. However, it has not been studied in patients treated with immune checkpoint inhibitors. In this study, we analyzed the association of baseline LMR with progression-free survival (PFS) and overall survival (OS) in metastatic melanoma patients treated with pembrolizumab. A total of 133 patients with metastatic melanoma treated with pembrolizumab were included in this retrospective study. LMR was calculated from pretherapy peripheral blood counts and the optimal cutoff value was determined by a receiver operating characteristic curve. PFS and OS were evaluated using the Kaplan-Meier method and multivariate Cox proportional hazards modeling. Patients with an LMR of at least 1.7 showed improved PFS (hazard ratio=0.55; 95% confidence interval: 0.34-0.92; P=0.024) and OS (hazard ratio=0.29; 95% confidence interval: 0.15-0.59; P=0.0007). The baseline LMR is associated with PFS and OS in metastatic melanoma patients treated with pembrolizumab, and could represent a convenient and cost-effective prognostic biomarker. Validation of these findings in an independent cohort is needed.
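Choosing an "optimal" biomarker cutoff from a receiver operating characteristic curve is often done with Youden's J statistic, after which survival can be compared across the resulting groups. A schematic Python sketch on simulated data (the 1.7 cutoff above comes from the study; the data, event definition and Youden criterion here are assumptions made only for illustration):

    import numpy as np
    from sklearn.metrics import roc_curve
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(5)
    n = 133
    lmr = rng.lognormal(0.6, 0.5, n)                                   # invented pre-therapy LMR values
    died = rng.binomial(1, 1 / (1 + np.exp(2.0 * (lmr - 1.7))), n)     # higher LMR -> lower simulated risk
    time = rng.exponential(12, n)                                      # follow-up in months, illustrative

    # Optimal cutoff by Youden's J; LMR is protective here, so -lmr is used as the risk score.
    fpr, tpr, thresholds = roc_curve(died, -lmr)
    cutoff = -thresholds[np.argmax(tpr - fpr)]
    print("optimal LMR cutoff:", round(cutoff, 2))

    # Compare survival above vs. below the cutoff with a log-rank test.
    high = lmr >= cutoff
    result = logrank_test(time[high], time[~high], died[high], died[~high])
    print("log-rank p-value:", result.p_value)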
Palatini, Paolo; Reboldi, Gianpaolo; Beilin, Lawrence J; Eguchi, Kazuo; Imai, Yutaka; Kario, Kazuomi; Ohkubo, Takayoshi; Pierdomenico, Sante D; Saladini, Francesca; Schwartz, Joseph E; Wing, Lindon; Verdecchia, Paolo
2013-09-30
Data from prospective cohort studies regarding the association between ambulatory heart rate (HR) and cardiovascular events (CVE) are conflicting. To investigate whether ambulatory HR predicts CVE in hypertension, we performed 24-hour ambulatory blood pressure and HR monitoring in 7600 hypertensive patients aged 52 ± 16 years from Italy, U.S.A., Japan, and Australia, included in the 'ABP-International' registry. All were untreated at baseline examination. Standardized hazard ratios for ambulatory HRs were computed, stratifying for cohort, and adjusting for age, gender, blood pressure, smoking, diabetes, serum total cholesterol and serum creatinine. During a median follow-up of 5.0 years, there were 639 fatal and nonfatal CVE. In a multivariable Cox model, night-time HR predicted fatal combined with nonfatal CVE more closely than 24h HR (p=0.007 and p=0.03, respectively). Daytime HR and the night:day HR ratio were not associated with CVE (p=0.07 and p=0.18, respectively). The hazard ratio of the fatal combined with nonfatal CVE for a 10-beats/min increment of the night-time HR was 1.13 (95% CI, 1.04-1.22). This relationship remained significant when subjects taking beta-blockers during the follow-up (hazard ratio, 1.15; 95% CI, 1.05-1.25) or subjects who had an event within 5 years after enrollment (hazard ratio, 1.23; 95% CI, 1.05-1.45) were excluded from analysis. At variance with previous data obtained from general populations, ambulatory HR added to the risk stratification for fatal combined with nonfatal CVE in the hypertensive patients from the ABP-International study. Night-time HR was a better predictor of CVE than daytime HR. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Tracking Temporal Hazard in the Human Electroencephalogram Using a Forward Encoding Model
2018-01-01
Abstract Human observers automatically extract temporal contingencies from the environment and predict the onset of future events. Temporal predictions are modeled by the hazard function, which describes the instantaneous probability for an event to occur given it has not occurred yet. Here, we tackle the question of whether and how the human brain tracks continuous temporal hazard on a moment-to-moment basis, and how flexibly it adjusts to strictly implicit variations in the hazard function. We applied an encoding-model approach to human electroencephalographic data recorded during a pitch-discrimination task, in which we implicitly manipulated temporal predictability of the target tones by varying the interval between cue and target tone (i.e. the foreperiod). Critically, temporal predictability either was driven solely by the passage of time (resulting in a monotonic hazard function) or was modulated to increase at intermediate foreperiods (resulting in a modulated hazard function with a peak at the intermediate foreperiod). Forward-encoding models trained to predict the recorded EEG signal from different temporal hazard functions were able to distinguish between experimental conditions, showing that implicit variations of temporal hazard bear tractable signatures in the human electroencephalogram. Notably, this tracking signal was reconstructed best from the supplementary motor area, underlining this area’s link to cognitive processing of time. Our results underline the relevance of temporal hazard to cognitive processing and show that the predictive accuracy of the encoding-model approach can be utilized to track abstract time-resolved stimuli. PMID:29740594
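A forward encoding model of this kind is, at its core, a regularized linear regression from a hypothesized time-resolved regressor (here, temporal hazard) to the recorded signal at each channel, scored by how well held-out data are predicted. The sketch below is schematic, with simulated "EEG" and invented dimensions; it is not the authors' pipeline.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(6)
    n_trials, n_times, n_channels = 200, 150, 64

    # Hypothesized hazard regressor per trial and time point (monotonic here; a modulated
    # hazard function would simply change this array).
    hazard = np.tile(np.linspace(0, 1, n_times), (n_trials, 1))
    # Simulated EEG: a channel-specific projection of the hazard plus noise.
    weights = rng.normal(0, 1, n_channels)
    eeg = hazard[..., None] * weights + rng.normal(0, 2.0, (n_trials, n_times, n_channels))

    X = hazard.reshape(-1, 1)                 # predictor: hazard value at each sample
    Y = eeg.reshape(-1, n_channels)           # response: all channels at the same samples

    # Forward encoding model: ridge regression per channel, scored by cross-validated R^2.
    model = Ridge(alpha=1.0)
    scores = [cross_val_score(model, X, Y[:, ch], cv=5, scoring="r2").mean()
              for ch in range(n_channels)]
    print("mean cross-validated R^2 across channels:", np.mean(scores))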
Rosuvastatin to prevent vascular events in men and women with elevated C-reactive protein.
Ridker, Paul M; Danielson, Eleanor; Fonseca, Francisco A H; Genest, Jacques; Gotto, Antonio M; Kastelein, John J P; Koenig, Wolfgang; Libby, Peter; Lorenzatti, Alberto J; MacFadyen, Jean G; Nordestgaard, Børge G; Shepherd, James; Willerson, James T; Glynn, Robert J
2008-11-20
Increased levels of the inflammatory biomarker high-sensitivity C-reactive protein predict cardiovascular events. Since statins lower levels of high-sensitivity C-reactive protein as well as cholesterol, we hypothesized that people with elevated high-sensitivity C-reactive protein levels but without hyperlipidemia might benefit from statin treatment. We randomly assigned 17,802 apparently healthy men and women with low-density lipoprotein (LDL) cholesterol levels of less than 130 mg per deciliter (3.4 mmol per liter) and high-sensitivity C-reactive protein levels of 2.0 mg per liter or higher to rosuvastatin, 20 mg daily, or placebo and followed them for the occurrence of the combined primary end point of myocardial infarction, stroke, arterial revascularization, hospitalization for unstable angina, or death from cardiovascular causes. The trial was stopped after a median follow-up of 1.9 years (maximum, 5.0). Rosuvastatin reduced LDL cholesterol levels by 50% and high-sensitivity C-reactive protein levels by 37%. The rates of the primary end point were 0.77 and 1.36 per 100 person-years of follow-up in the rosuvastatin and placebo groups, respectively (hazard ratio for rosuvastatin, 0.56; 95% confidence interval [CI], 0.46 to 0.69; P<0.00001), with corresponding rates of 0.17 and 0.37 for myocardial infarction (hazard ratio, 0.46; 95% CI, 0.30 to 0.70; P=0.0002), 0.18 and 0.34 for stroke (hazard ratio, 0.52; 95% CI, 0.34 to 0.79; P=0.002), 0.41 and 0.77 for revascularization or unstable angina (hazard ratio, 0.53; 95% CI, 0.40 to 0.70; P<0.00001), 0.45 and 0.85 for the combined end point of myocardial infarction, stroke, or death from cardiovascular causes (hazard ratio, 0.53; 95% CI, 0.40 to 0.69; P<0.00001), and 1.00 and 1.25 for death from any cause (hazard ratio, 0.80; 95% CI, 0.67 to 0.97; P=0.02). Consistent effects were observed in all subgroups evaluated. The rosuvastatin group did not have a significant increase in myopathy or cancer but did have a higher incidence of physician-reported diabetes. In this trial of apparently healthy persons without hyperlipidemia but with elevated high-sensitivity C-reactive protein levels, rosuvastatin significantly reduced the incidence of major cardiovascular events. (ClinicalTrials.gov number, NCT00239681.) 2008 Massachusetts Medical Society
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
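For reference, the quantities discussed in this introduction can be written compactly as follows (standard textbook notation, not the article's own):

    h(t) = \lim_{\Delta t \to 0} \frac{\Pr(t \le T < t + \Delta t \mid T \ge t)}{\Delta t}
         = \frac{f(t)}{S(t)} = -\frac{d}{dt}\log S(t),
    \qquad
    S(t) = \exp\!\left(-\int_0^t h(u)\,du\right).

    Exponential case: h(t) = \lambda,\; S(t) = e^{-\lambda t},\; E[T] = 1/\lambda.

    Proportional hazards: h(t \mid x) = h_0(t)\,e^{\beta^\top x}, so the hazard ratio between
    covariate patterns x_1 and x_0 is \exp\{\beta^\top (x_1 - x_0)\}, constant over time.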
Li, Wenjin; Ray, Roberta M.; Thomas, David B.; Yost, Michael; Davis, Scott; Breslow, Norman; Gao, Dao Li; Fitzgibbons, E. Dawn; Camp, Janice E.; Wong, Eva; Wernli, Karen J.; Checkoway, Harvey
2013-01-01
Exposure to magnetic fields (MFs) is hypothesized to increase the risk of breast cancer by reducing production of melatonin by the pineal gland. A nested case-cohort study was conducted to investigate the association between occupational exposure to MFs and the risk of breast cancer within a cohort of 267,400 female textile workers in Shanghai, China. The study included 1,687 incident breast cancer cases diagnosed from 1989 to 2000 and 4,702 noncases selected from the cohort. Subjects’ complete work histories were linked to a job–exposure matrix developed specifically for the present study to estimate cumulative MF exposure. Hazard ratios and 95% confidence intervals were calculated using Cox proportional hazards modeling that was adapted for the case-cohort design. Hazard ratios were estimated in relation to cumulative exposure during a woman's entire working years. No association was observed between cumulative exposure to MFs and overall risk of breast cancer. The hazard ratio for the highest compared with the lowest quartile of cumulative exposure was 1.03 (95% confidence interval: 0.87, 1.21). Similar null findings were observed when exposures were lagged and stratified by age at breast cancer diagnosis. The findings do not support the hypothesis that MF exposure increases the risk of breast cancer. PMID:24043439
Pole, Jason D.; Mustard, Cameron A.; To, Teresa; Beyene, Joseph; Allen, Alexander C.
2010-01-01
This study was designed to test the hypothesis that fetal exposure to corticosteroids in the antenatal period is an independent risk factor for the development of asthma in early childhood with little or no effect in later childhood. A population-based cohort study of all pregnant women who resided in Nova Scotia, Canada, and gave birth to a singleton fetus between 1989 and 1998 was undertaken. After a priori specified exclusions, 80,448 infants were available for analysis. Using linked health care utilization records, incident asthma cases that developed after 36 months of age were identified. Extended Cox proportional hazards models were used to estimate hazard ratios while controlling for confounders. Exposure to corticosteroids during pregnancy was associated with a risk of asthma in childhood between 3 and 5 years of age: adjusted hazard ratio of 1.19 (95% confidence interval: 1.03, 1.39), with no association noted after 5 years of age: the adjusted hazard ratio for 5–7 years was 1.06 (95% confidence interval: 0.86, 1.30) and for 8 years or more was 0.74 (95% confidence interval: 0.54, 1.03). Antenatal steroid therapy appears to be an independent risk factor for the development of asthma between 3 and 5 years of age. PMID:21490744
Optimal Scaling of Aftershock Zones using Ground Motion Forecasts
NASA Astrophysics Data System (ADS)
Wilson, John Max; Yoder, Mark R.; Rundle, John B.
2018-02-01
The spatial distribution of aftershocks following major earthquakes has received significant attention due to the shaking hazard these events pose for structures and populations in the affected region. Forecasting the spatial distribution of aftershock events is an important part of the estimation of future seismic hazard. A simple spatial shape for the zone of activity has often been assumed in the form of an ellipse having semimajor axis to semiminor axis ratio of 2.0. However, since an important application of these calculations is the estimation of ground shaking hazard, an effective criterion for forecasting future aftershock impacts is to use ground motion prediction equations (GMPEs) in addition to the more usual approach of using epicentral or hypocentral locations. Based on these ideas, we present an aftershock model that uses self-similarity and scaling relations to constrain parameters as an option for such hazard assessment. We fit the spatial aspect ratio to previous earthquake sequences in the studied regions, and demonstrate the effect of the fitting on the likelihood of post-disaster ground motion forecasts for eighteen recent large earthquakes. We find that the forecasts in most geographic regions studied benefit from this optimization technique, while some are better suited to the use of the a priori aspect ratio.
Rates of Atrial Fibrillation in Black Versus White Patients With Pacemakers.
Kamel, Hooman; Kleindorfer, Dawn O; Bhave, Prashant D; Cushman, Mary; Levitan, Emily B; Howard, George; Soliman, Elsayed Z
2016-02-12
Black US residents experience higher rates of ischemic stroke than white residents but have lower rates of clinically apparent atrial fibrillation (AF), a strong risk factor for stroke. It is unclear whether black persons truly have less AF or simply more undiagnosed AF. We obtained administrative claims data from state health agencies regarding all emergency department visits and hospitalizations in California, Florida, and New York. We identified a cohort of patients with pacemakers, the regular interrogation of which reduces the likelihood of undiagnosed AF. We compared rates of documented AF or atrial flutter at follow-up visits using Kaplan-Meier survival statistics and Cox proportional hazards models adjusted for demographic characteristics and vascular risk factors. We identified 10,393 black and 91,380 white patients without documented AF or atrial flutter before or at the index visit for pacemaker implantation. During 3.7 (±1.8) years of follow-up, black patients had a significantly lower rate of AF (21.4%; 95% CI 19.8-23.2) than white patients (25.5%; 95% CI 24.9-26.0). After adjustment for demographic characteristics and comorbidities, black patients had a lower hazard of AF (hazard ratio 0.91; 95% CI 0.86-0.96), a higher hazard of atrial flutter (hazard ratio 1.29; 95% CI 1.11-1.49), and a lower hazard of the composite of AF or atrial flutter (hazard ratio 0.94; 95% CI 0.88-0.99). In a population-based sample of patients with pacemakers, black patients had a lower rate of AF compared with white patients. These findings indicate that the persistent racial disparities in rates of ischemic stroke are likely to be related to factors other than undiagnosed AF. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Lee, D; Porter, J; Hertel, N; Hatswell, A J; Briggs, A
2016-08-01
In the absence of head-to-head data, a common method for modelling comparative survival for cost-effectiveness analysis is estimating hazard ratios from trial publications. This assumes that the hazards of mortality are proportional between treatments and that outcomes are not polluted by subsequent therapy use. Newer techniques that compare treatments where the proportional hazards assumption is violated and adjust for use of subsequent therapies often require patient-level data, which are rarely available for all treatments. The objective of this study was to provide a comparison of overall survival data for ipilimumab, vemurafenib and dacarbazine using data from three trials lacking a common comparator arm and confounded by the use of subsequent treatment. We compared three estimated overall survival curves for vemurafenib and the resulting differences relative to ipilimumab and dacarbazine. We performed a naïve comparison and adjusted it for heterogeneity between the ipilimumab and vemurafenib trials, including differences in prognostic characteristics and subsequent therapy, using a published hazard function for the impact of prognostic characteristics in melanoma and trial data on the impact of second-line use of ipilimumab. The mean incremental life-years gained for patients receiving ipilimumab compared with vemurafenib were 0.34 (95% confidence interval [CI] -0.24 to 0.84) using the naïve comparison and 0.51 (95% CI -0.08 to 0.99) using the covariate-adjusted survival curve. The analyses estimated the comparative efficacy of ipilimumab and vemurafenib in the absence of head-to-head data, patient-level data for all trials, and proportional hazards in overall survival.
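As a rough illustration of how a survival curve can be shifted for a prognostic imbalance under a proportional-hazards assumption, the sketch below applies the relation S_adj(t) = S(t)^HR to hypothetical digitised survival probabilities. This is only a simplified stand-in for the published-hazard-function adjustment described above; the time grid, survival values and hazard ratio are assumptions.

```python
# Simplified sketch of covariate adjustment under a proportional-hazards
# assumption: if a prognostic imbalance corresponds to hazard ratio `hr`,
# the adjusted curve is S_adj(t) = S(t) ** hr. This is only a stand-in for
# the published-hazard-function adjustment described above; values are made up.
import numpy as np

t = np.array([0, 6, 12, 18, 24])                          # months (hypothetical grid)
s_vemurafenib = np.array([1.0, 0.80, 0.60, 0.45, 0.35])   # digitised S(t), hypothetical

hr_imbalance = 1.15   # assumed net hazard ratio for prognostic differences
s_adjusted = s_vemurafenib ** hr_imbalance

# Restricted mean survival within 24 months via trapezoidal integration:
rmst_naive = np.trapz(s_vemurafenib, t)
rmst_adjusted = np.trapz(s_adjusted, t)
print(rmst_naive / 12, rmst_adjusted / 12)  # life-years within 24 months
```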
Environmental Risk Assessment Strategy for Nanomaterials.
Scott-Fordsmand, Janeck J; Peijnenburg, Willie J G M; Semenzin, Elena; Nowack, Bernd; Hunt, Neil; Hristozov, Danail; Marcomini, Antonio; Irfan, Muhammad-Adeel; Jiménez, Araceli Sánchez; Landsiedel, Robert; Tran, Lang; Oomen, Agnes G; Bos, Peter M J; Hund-Rinke, Kerstin
2017-10-19
An Environmental Risk Assessment (ERA) for nanomaterials (NMs) is outlined in this paper. Contrary to other recent papers on the subject, the main data requirements, models and advancement within each of the four risk assessment domains are described, i.e., in the: (i) materials, (ii) release, fate and exposure, (iii) hazard and (iv) risk characterisation domains. The material, which is obviously the foundation for any risk assessment, should be described according to the legislatively required characterisation data. Characterisation data will also be used at various levels within the ERA, e.g., exposure modelling. The release, fate and exposure data and models cover the input for environmental distribution models in order to identify the potential exposure scenarios (PES) and relevant exposure scenarios (RES) and, subsequently, the possible release routes, both with regard to which compartment(s) the NMs are distributed to and the factors determining their fate within each environmental compartment. The initial outcome in the risk characterisation will be a generic Predicted Environmental Concentration (PEC), but a refined PEC can be obtained by applying specific exposure models for relevant media. The hazard information covers a variety of representative, relevant and reliable organisms and/or functions, relevant for the RES and enabling a hazard characterisation. The initial outcome will be a hazard characterisation in test systems allowing estimation of a Predicted No-Effect Concentration (PNEC), either based on uncertainty factors or on an NM-adapted version of the Species Sensitivity Distributions approach. The risk characterisation will either be based on a deterministic risk ratio approach (i.e., PEC/PNEC) or an overlay of probability distributions, i.e., exposure and hazard distributions, using nano-relevant models.
Environmental Risk Assessment Strategy for Nanomaterials
Scott-Fordsmand, Janeck J.; Nowack, Bernd; Hunt, Neil; Hristozov, Danail; Marcomini, Antonio; Irfan, Muhammad-Adeel; Jiménez, Araceli Sánchez; Landsiedel, Robert; Tran, Lang; Oomen, Agnes G.; Bos, Peter M. J.
2017-01-01
An Environmental Risk Assessment (ERA) for nanomaterials (NMs) is outlined in this paper. Contrary to other recent papers on the subject, the main data requirements, models and advancement within each of the four risk assessment domains are described, i.e., in the: (i) materials, (ii) release, fate and exposure, (iii) hazard and (iv) risk characterisation domains. The material, which is obviously the foundation for any risk assessment, should be described according to the legislatively required characterisation data. Characterisation data will also be used at various levels within the ERA, e.g., exposure modelling. The release, fate and exposure data and models cover the input for environmental distribution models in order to identify the potential exposure scenarios (PES) and relevant exposure scenarios (RES) and, subsequently, the possible release routes, both with regard to which compartment(s) the NMs are distributed to and the factors determining their fate within each environmental compartment. The initial outcome in the risk characterisation will be a generic Predicted Environmental Concentration (PEC), but a refined PEC can be obtained by applying specific exposure models for relevant media. The hazard information covers a variety of representative, relevant and reliable organisms and/or functions, relevant for the RES and enabling a hazard characterisation. The initial outcome will be a hazard characterisation in test systems allowing estimation of a Predicted No-Effect Concentration (PNEC), either based on uncertainty factors or on an NM-adapted version of the Species Sensitivity Distributions approach. The risk characterisation will either be based on a deterministic risk ratio approach (i.e., PEC/PNEC) or an overlay of probability distributions, i.e., exposure and hazard distributions, using nano-relevant models. PMID:29048395
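The two risk-characterisation options mentioned above (a deterministic PEC/PNEC ratio versus an overlay of exposure and hazard distributions) can be sketched as follows; all concentrations and distributional assumptions are hypothetical placeholders, not values from the strategy itself.

```python
# Minimal sketch of the two risk-characterisation options described above:
# (i) a deterministic risk ratio PEC/PNEC and (ii) an overlay of exposure and
# hazard probability distributions. All numbers are hypothetical placeholders.
import numpy as np

# (i) Deterministic: risk is flagged when PEC/PNEC >= 1.
pec_ug_per_l = 0.8      # predicted environmental concentration (assumed)
pnec_ug_per_l = 2.0     # predicted no-effect concentration (assumed)
risk_ratio = pec_ug_per_l / pnec_ug_per_l
print(f"PEC/PNEC = {risk_ratio:.2f} -> {'risk' if risk_ratio >= 1 else 'no indicated risk'}")

# (ii) Probabilistic overlay: probability that a random exposure concentration
# exceeds a random no-effect concentration (lognormal assumptions).
rng = np.random.default_rng(1)
exposure = rng.lognormal(mean=np.log(0.8), sigma=0.6, size=100_000)
no_effect = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=100_000)
print("P(exposure > no-effect) =", np.mean(exposure > no_effect))
```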
Lundin, A; Modig, K; Halldin, J; Carlsson, A C; Wändell, P; Theobald, H
2016-08-01
An increased mortality risk associated with mental disorder has been reported for patients, but few studies are based on random samples with interview-based psychiatric diagnoses. Part of the increased mortality for those with mental disorder may be attributable to worse somatic health or hazardous health behaviour - consequences of the disorder - but somatic health information is commonly lacking in psychiatric samples. This study aims to examine long-term mortality risk for psychiatric diagnoses in a general population sample and to assess mediation by somatic ill health and hazardous health behaviour. We used a double-phase stratified random sample of individuals aged 18-65 in Stockholm County 1970-1971 linked to vital records. The first phase sample was 32 186 individuals screened with a postal questionnaire and the second phase was 1896 individuals (920 men and 976 women) who participated in a full-day examination (participation rate 88%). The baseline examination included both a semi-structured interview with a psychiatrist, with mental disorders set according to the 8th version of the International Classification of Disease (ICD-8), and a clinical somatic examination, including measures of body composition (BMI), hypertension, fasting blood glucose, pulmonary function and self-reported tobacco smoking. Information on vital status was obtained from the Total Population Register for the years 1970-2011. Associations with mortality were studied with Cox proportional hazard analyses. A total of 883 deaths occurred among the participants during the 41-year follow-up. Increased mortality rates were found for ICD-8 functional psychoses (hazard ratio, HR = 2.22, 95% confidence interval (95% CI): 1.15-4.30); psycho-organic symptoms (HR = 1.94, 95% CI: 1.31-2.87); depressive neuroses (HR = 1.71, 95% CI: 1.23-2.39); alcohol use disorder (HR = 1.91, 95% CI: 1.40-2.61); drug dependence (HR = 3.71, 95% CI: 1.80-7.65) and psychopathy (HR = 2.88, 95% CI: 1.02-8.16). Non-participants (n = 349) had mortality rates similar to participants (HR = 0.98, 95% CI: 0.81-1.18). In subgroup analyses of those with psychoses, depression or alcohol use disorder, adjusting for the potential mediators smoking and pulmonary function produced only slight changes in the HRs. This study confirms the increased risk of mortality for several psychiatric diagnoses found in follow-up studies of American, Finnish and Swedish population-based samples. Only a small part of the increased mortality hazard was attributable to differences in somatic health or hazardous health behaviour measured at baseline.
Simpson, Colin R; Steiner, Markus Fc; Cezard, Genevieve; Bansal, Narinder; Fischbacher, Colin; Douglas, Anne; Bhopal, Raj; Sheikh, Aziz
2015-10-01
There is evidence of substantial ethnic variations in asthma morbidity and the risk of hospitalisation, but the picture in relation to lower respiratory tract infections is unclear. We carried out an observational study to identify ethnic group differences for lower respiratory tract infections. This was a retrospective cohort study set in Scotland, covering 4.65 million people on whom information was available from the 2001 census, followed from May 2001 to April 2010. Hospitalisations and deaths (at any time following first hospitalisation) from lower respiratory tract infections were identified, and adjusted risk ratios and hazard ratios by ethnicity and sex were calculated. We multiplied ratios and confidence intervals by 100, so the reference Scottish White population's risk ratio and hazard ratio was 100. Among men, adjusted risk ratios for lower respiratory tract infection hospitalisation were lower in Other White British (80, 95% confidence interval 73-86) and Chinese (69, 95% confidence interval 56-84) populations and higher in Pakistani groups (152, 95% confidence interval 136-169). In women, results were mostly similar to those in men (e.g. Chinese 68, 95% confidence interval 56-82), although higher adjusted risk ratios were found among women of the Other South Asians group (145, 95% confidence interval 120-175). Survival (adjusted hazard ratio) following lower respiratory tract infection for Pakistani men (54, 95% confidence interval 39-74) and women (31, 95% confidence interval 18-53) was better than in the reference population. Substantial differences in the rates of lower respiratory tract infections amongst different ethnic groups in Scotland were found. Pakistani men and women had particularly high rates of lower respiratory tract infection hospitalisation. The reasons behind the high rates of lower respiratory tract infection in the Pakistani community now need to be investigated. © The Royal Society of Medicine.
Hazard Function Estimation with Cause-of-Death Data Missing at Random.
Wang, Qihua; Dinse, Gregg E; Liu, Chunling
2012-04-01
Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data.
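A minimal sketch of the inverse-probability-weighting idea for a cause-specific kernel hazard estimate is given below. It assumes a simple logistic model for the probability that the cause label is observed and an Epanechnikov kernel; variable names, the data-generating demonstration and the modelling choices are illustrative and do not reproduce the authors' exact estimators.

```python
# Minimal sketch (under simplifying assumptions) of an inverse-probability-
# weighted kernel estimate of a cause-specific hazard when some cause-of-death
# labels are missing at random. Not the authors' exact estimator.
import numpy as np

def epanechnikov(u):
    return 0.75 * np.clip(1 - u**2, 0, None)

def ipw_kernel_hazard(grid, time, death, cause1, observed, bandwidth):
    """grid: evaluation times; time: follow-up times; death: 1 if died;
    cause1: 1 if death from the cause of interest (np.nan when missing);
    observed: 1 if the cause label is available; bandwidth: kernel bandwidth."""
    d = death == 1
    # Estimate P(label observed | death time) with a crude Newton-fitted logistic model.
    X = np.column_stack([np.ones(d.sum()), time[d]])
    beta = np.zeros(2)
    for _ in range(25):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (observed[d] - p))
    pi_hat = 1 / (1 + np.exp(-(np.column_stack([np.ones(len(time)), time]) @ beta)))

    at_risk = np.array([(time >= t).sum() for t in time])   # Y(T_i)
    keep = d & (observed == 1) & (cause1 == 1)
    haz = np.zeros_like(grid, dtype=float)
    for i in np.where(keep)[0]:
        # Each observed cause-1 death is up-weighted by 1 / pi_hat at its time.
        haz += epanechnikov((grid - time[i]) / bandwidth) / (
            bandwidth * pi_hat[i] * at_risk[i])
    return haz

# Small synthetic demonstration (assumed data-generating process):
rng = np.random.default_rng(0)
n = 400
time = rng.exponential(5.0, n)
death = rng.binomial(1, 0.8, n)                                  # 1 = died, 0 = censored
cause1 = np.where(death == 1, rng.binomial(1, 0.6, n), np.nan)   # label only if died
observed = rng.binomial(1, 1 / (1 + np.exp(-(1.0 - 0.1 * time))), n)  # MAR given time
grid = np.linspace(0.5, 10, 20)
print(np.round(ipw_kernel_hazard(grid, time, death, cause1, observed, bandwidth=1.5), 3))
```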
Meier-Hirmer, Carolina; Schumacher, Martin
2013-06-20
The aim of this article is to propose several methods that allow investigation of how and whether the shape of the hazard ratio after an intermediate event depends on the waiting time to occurrence of this event and/or the sojourn time in this state. A simple multi-state model, the illness-death model, is used as a framework to investigate the occurrence of this intermediate event. Several approaches are shown and their advantages and disadvantages are discussed. All these approaches are based on Cox regression. As different time-scales are used, these models go beyond Markov models. Different estimation methods for the transition hazards are presented. Additionally, time-varying covariates are included in the model using an approach based on fractional polynomials. The different methods of this article are then applied to a dataset consisting of four studies conducted by the German Breast Cancer Study Group (GBSG). The occurrence of the first isolated locoregional recurrence (ILRR) is studied. The results contribute to the debate on the role of the ILRR with respect to the course of the breast cancer disease and the resulting prognosis. We have investigated different modelling strategies for the transition hazard after ILRR or in general after an intermediate event. Including time-dependent structures altered the resulting hazard functions considerably, and it was shown that this time-dependent structure has to be taken into account in the case of our breast cancer dataset. The results indicate that an early recurrence increases the risk of death. A late ILRR increases the hazard function much less, and after the successful removal of the second tumour the risk of death is almost the same as before the recurrence. With respect to distant disease, the appearance of the ILRR only slightly increases the risk of death if the recurrence was treated successfully. It is important to realize that there are several modelling strategies for the intermediate event and that each of these strategies has restrictions and may lead to different results. Especially in the medical literature on breast cancer development, time-dependency is often neglected in the statistical analyses. We show that the time-varying variables cannot be neglected in the case of ILRR and that fractional polynomials are a useful tool for finding the functional form of these time-varying variables.
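One of the modelling strategies referred to above, a Cox model with a time-dependent indicator for the intermediate event in a counting-process (start, stop] data layout, can be sketched as follows. The simulated data, column names and use of the lifelines package are assumptions for illustration; the sketch does not include the fractional-polynomial terms discussed in the article.

```python
# Sketch of a Cox model with a time-dependent indicator for an intermediate
# event (here labelled `ilrr`), using a counting-process (start, stop] layout.
# Data are simulated so that the true post-event hazard ratio is 2.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(2)
rows = []
for pid in range(300):
    age = rng.normal(55, 8)
    base = 0.05 * np.exp(0.02 * (age - 55))       # pre-event hazard
    t_ilrr = rng.exponential(5.0)                  # time of intermediate event
    t_death_pre = rng.exponential(1 / base)
    if t_death_pre < t_ilrr:                       # died before the intermediate event
        rows.append((pid, 0.0, t_death_pre, 0, age, 1))
    else:                                          # event occurred, hazard doubles afterwards
        rows.append((pid, 0.0, t_ilrr, 0, age, 0))
        t_death_post = t_ilrr + rng.exponential(1 / (2.0 * base))
        rows.append((pid, t_ilrr, t_death_post, 1, age, 1))

df = pd.DataFrame(rows, columns=["id", "start", "stop", "ilrr", "age", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
print(ctv.summary[["coef", "exp(coef)"]])  # exp(coef) for `ilrr` should be near 2
```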
Inverse odds ratio-weighted estimation for causal mediation analysis.
Tchetgen Tchetgen, Eric J
2013-11-20
An important scientific goal of studies in the health and social sciences is increasingly to determine to what extent the total effect of a point exposure is mediated by an intermediate variable on the causal pathway between the exposure and the outcome. A causal framework has recently been proposed for mediation analysis, which gives rise to new definitions, formal identification results and novel estimators of direct and indirect effects. In the present paper, the author describes a new inverse odds ratio-weighted approach to estimate so-called natural direct and indirect effects. The approach, which uses as a weight the inverse of an estimate of the odds ratio function relating the exposure and the mediator, is universal in that it can be used to decompose total effects in a number of regression models commonly used in practice. Specifically, the approach may be used for effect decomposition in generalized linear models with a nonlinear link function, and in a number of other commonly used models such as the Cox proportional hazards regression for a survival outcome. The approach is simple and can be implemented in standard software provided a weight can be specified for each observation. An additional advantage of the method is that it easily incorporates multiple mediators of a categorical, discrete or continuous nature. Copyright © 2013 John Wiley & Sons, Ltd.
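A hedged sketch of the inverse odds ratio-weighting recipe for a continuous outcome and a single continuous mediator is shown below: fit an exposure model given the mediator and confounder, weight exposed subjects by the inverse of the fitted odds ratio function, and compare the weighted (direct) with the unweighted (total) outcome regression. The simulation, variable names and use of statsmodels are illustrative assumptions, not the paper's application.

```python
# Hedged sketch of inverse odds ratio weighting (IORW) for a continuous outcome
# with one continuous mediator; the simulation and names are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
c = rng.normal(size=n)                                   # baseline confounder
a = rng.binomial(1, 1 / (1 + np.exp(-0.2 * c)))          # exposure
m = 0.8 * a + 0.3 * c + rng.normal(size=n)               # mediator
y = 1.0 * a + 1.2 * m + 0.5 * c + rng.normal(size=n)     # outcome (indirect part = 1.2 * 0.8)

# Step 1: exposure model given mediator and confounder -> odds ratio function.
X_am = sm.add_constant(np.column_stack([m, c]))
fit_a = sm.Logit(a, X_am).fit(disp=False)
log_or_m = fit_a.params[1] * m                 # log OR relating exposure and mediator
w = np.where(a == 1, np.exp(-log_or_m), 1.0)   # inverse OR weight for the exposed

# Step 2: total effect (unweighted) and natural direct effect (weighted).
X_y = sm.add_constant(np.column_stack([a, c]))
total = sm.OLS(y, X_y).fit().params[1]
direct = sm.WLS(y, X_y, weights=w).fit().params[1]
print(f"total={total:.2f}  direct={direct:.2f}  indirect={total - direct:.2f}")
# Under this simulation, total should be roughly 1.96 and direct roughly 1.0.
```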
Association between late-life social activity and motor decline in older adults.
Buchman, Aron S; Boyle, Patricia A; Wilson, Robert S; Fleischman, Debra A; Leurgans, Sue; Bennett, David A
2009-06-22
Loss of motor function is a common consequence of aging, but little is known about the factors that predict idiopathic motor decline. Our objective was to test the hypothesis that late-life social activity is related to the rate of change in motor function in old age. This was a longitudinal cohort study with a mean follow-up of 4.9 years involving 906 persons without stroke, Parkinson disease, or dementia participating in the Rush Memory and Aging Project. At baseline, participants rated the frequency of their current participation in common social activities from which a summary measure of social activity was derived. The main outcome measure was annual change in a composite measure of global motor function, based on 9 measures of muscle strength and 9 motor performances. Mean (SD) social activity score at baseline was 2.6 (0.58), with higher scores indicating more frequent participation in social activities. In a generalized estimating equation model, controlling for age, sex, and education, global motor function declined by approximately 0.05 U/y (estimate, 0.016; 95% confidence interval [CI], -0.057 to 0.041 [P = .02]). Each 1-point decrease in social activity was associated with approximately a 33% more rapid rate of decline in motor function (estimate, 0.016; 95% CI, 0.003 to 0.029 [P = .02]). The effect of each 1-point decrease in the social activity score at baseline on the rate of change in global motor function was the same as being approximately 5 years older at baseline (age estimate, -0.003; 95% CI, -0.004 to -0.002 [P<.001]). Furthermore, this amount of motor decline per year was associated with a more than 40% increased risk of death (hazard ratio, 1.44; 95% CI, 1.30 to 1.60) and a 65% increased risk of incident Katz disability (hazard ratio, 1.65; 95% CI, 1.48 to 1.83). The association of social activity with the rate of global motor decline did not vary along demographic lines and was unchanged (estimate, 0.025; 95% CI, 0.005 to 0.045 [P = .01]) after controlling for potential confounders including late-life physical and cognitive activity, disability, global cognition, depressive symptoms, body composition, and chronic medical conditions. Less frequent participation in social activities is associated with a more rapid rate of motor function decline in old age.
Bowden, Jack; Seaman, Shaun; Huang, Xin; White, Ian R
2016-04-30
In randomised controlled trials of treatments for late-stage cancer, it is common for control arm patients to receive the experimental treatment around the point of disease progression. This treatment switching can dilute the estimated treatment effect on overall survival and impact the assessment of a treatment's benefit on health economic evaluations. The rank-preserving structural failure time model of Robins and Tsiatis (Comm. Stat., 20:2609-2631) offers a potential solution to this problem and is typically implemented using the logrank test. However, in the presence of substantial switching, this test can have low power because the hazard ratio is not constant over time. Schoenfeld (Biometrika, 68:316-319) showed that when the hazard ratio is not constant, weighted versions of the logrank test become optimal. We present a weighted logrank test statistic for the late stage cancer trial context given the treatment switching pattern and working assumptions about the underlying hazard function in the population. Simulations suggest that the weighted approach can lead to large efficiency gains in either an intention-to-treat or a causal rank-preserving structural failure time model analysis compared with the unweighted approach. Furthermore, violation of the working assumptions used in the derivation of the weights only affects the efficiency of the estimates and does not induce bias or inflate the type I error rate. The weighted logrank test statistic should therefore be considered for use as part of a careful secondary, exploratory analysis of trial data affected by substantial treatment switching. ©2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
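The weighting idea can be illustrated with a generic Fleming-Harrington-type weighted logrank statistic, w(t) = S(t-)^rho (1 - S(t-))^gamma, computed from the pooled Kaplan-Meier estimate. This is not the trial-specific weight derived in the paper; the data and parameter values below are hypothetical.

```python
# Sketch of a two-sample weighted logrank Z statistic with Fleming-Harrington
# weights w(t) = S(t-)^rho * (1 - S(t-))^gamma (a generic illustration only).
import numpy as np

def weighted_logrank(time, event, group, rho=0.0, gamma=1.0):
    """Two-sample weighted logrank Z statistic (group coded 0/1)."""
    order = np.argsort(time)
    time, event, group = time[order], event[order], group[order]
    uniq = np.unique(time[event == 1])

    # Left-continuous Kaplan-Meier of the pooled sample, giving S(t-) at each event time.
    surv, s_minus = 1.0, {}
    for t in uniq:
        s_minus[t] = surv
        n_t = np.sum(time >= t)
        d_t = np.sum((time == t) & (event == 1))
        surv *= 1.0 - d_t / n_t

    num, var = 0.0, 0.0
    for t in uniq:
        at_risk = time >= t
        n_t, n1_t = at_risk.sum(), (at_risk & (group == 1)).sum()
        d_t = np.sum((time == t) & (event == 1))
        d1_t = np.sum((time == t) & (event == 1) & (group == 1))
        w = s_minus[t] ** rho * (1.0 - s_minus[t]) ** gamma
        num += w * (d1_t - d_t * n1_t / n_t)                       # observed minus expected
        if n_t > 1:
            var += w**2 * d_t * (n1_t / n_t) * (1 - n1_t / n_t) * (n_t - d_t) / (n_t - 1)
    return num / np.sqrt(var)

# Example with hypothetical data in which the group difference emerges late:
rng = np.random.default_rng(4)
t0, t1 = rng.exponential(10, 200), rng.exponential(10, 200) + rng.exponential(3, 200)
time = np.concatenate([t0, t1]); event = np.ones(400, dtype=int)
group = np.concatenate([np.zeros(200, dtype=int), np.ones(200, dtype=int)])
print(round(weighted_logrank(time, event, group, rho=0, gamma=1), 2))
```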
Glucose Levels and Risk of Frailty.
Zaslavsky, Oleg; Walker, Rod L; Crane, Paul K; Gray, Shelly L; Larson, Eric B
2016-09-01
The association between glucose levels and incident frailty in older persons remains unclear. We examined the extent to which higher glucose levels in older adults with and without diabetes are related to risk of frailty. The data are from the Adult Changes in Thought study. We identified 1,848 individuals aged 65+ without dementia for whom glucose levels from laboratory measurements of glucose and glycated hemoglobin were available. Physical frailty using modified Fried's criteria was determined from biennial assessments. Frailty hazard was modeled as a function of time-varying measures of diabetes and average glucose levels using Cox regression. A total of 578 incident frailty cases (94 with diabetes, 484 without) occurred during a median follow-up of 4.8 years. The adjusted hazard ratio for frailty comparing those with and without diabetes was 1.52 (95% confidence interval = 1.19-1.94). In participants without diabetes, modeling suggested elevated frailty risk with greater average glucose levels (p = .019); for example, a glucose level of 110mg/dL compared with 100mg/dL yielded a hazard ratio of 1.32 (95% confidence interval = 1.09-1.59). In participants with diabetes, glucose levels less than 160mg/dL and greater than 180mg/dL were related to increased risk of frailty (p = .001). Higher glucose levels may be a risk factor for frailty in older adults without diabetes. The apparent U-shape association between glucose levels and frailty in people with diabetes is consistent with the literature on glycemia and mortality and deserves further examination. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Portman, Michael A.; Slee, April; Olson, Aaron K.; Cohen, Gordon; Karl, Tom; Tong, Elizabeth; Hastings, Laura; Patel, Hitendra; Reinhartz, Olaf; Mott, Antonio R.; Mainwaring, Richard; Linam, Justin; Danzi, Sara
2011-01-01
Background Triiodothyronine levels decrease in infants and children after cardiopulmonary bypass. We tested the primary hypothesis that triiodothyronine (T3) repletion is safe in this population and produces improvements in postoperative clinical outcome. Methods and Results The TRICC study was a prospective, multicenter, double-blind, randomized, placebo-controlled trial in children younger than 2 years old undergoing heart surgery with cardiopulmonary bypass. Enrollment was stratified by surgical diagnosis. Time to extubation (TTE) was the primary outcome. Patients received intravenous T3 as Triostat (n=98) or placebo (n=95), and data were analyzed using Cox proportional hazards. Overall, TTE was similar between groups. There were no differences in adverse event rates, including arrhythmia. Prespecified analyses showed a significant interaction between age and treatment (P=0.0012). For patients younger than 5 months, the hazard ratio (chance of extubation) for Triostat was 1.72 (P=0.0216). Median TTE was 98 hours (95% confidence interval [CI], 71 to 142) with placebo compared to 55 hours (CI, 44 to 92) with Triostat. TTE shortening corresponded to a reduction in inotropic agent use and improvement in cardiac function. For children 5 months of age or older, Triostat produced a significant delay in median TTE: 16 hours (CI, 7–22) for placebo versus 20 hours (CI, 16–45) for Triostat (hazard ratio, 0.60; P=0.0220). Conclusions T3 supplementation is safe. Analyses using age stratification indicate that T3 supplementation provides clinical advantages in patients younger than 5 months and no benefit for those older than 5 months. Clinical Trial Registration URL: http://www.clinicaltrials.gov. Unique identifier: NCT00027417. PMID:20837917
Elevated Plasma CXCL12α Is Associated with a Poorer Prognosis in Pulmonary Arterial Hypertension
Li, Lili; O’Connell, Caroline; Codd, Mary; Lawrie, Allan; Morton, Allison; Kiely, David G.; Condliffe, Robin; Elliot, Charles; McLoughlin, Paul; Gaine, Sean
2015-01-01
Rationale Recent work in preclinical models suggests that signalling via the pro-angiogenic and pro-inflammatory cytokine, CXCL12 (SDF-1), plays an important pathogenic role in pulmonary hypertension (PH). The objective of this study was to establish whether circulating concentrations of CXCL12α were elevated in patients with PAH and related to mortality. Methods Plasma samples were collected from patients with idiopathic pulmonary arterial hypertension (IPAH) and PAH associated with connective tissue diseases (CTD-PAH) attending two pulmonary hypertension referral centres (n = 95) and from age and gender matched healthy controls (n = 44). Patients were subsequently monitored throughout a period of five years. Results CXCL12α concentrations were elevated in PAH groups compared to controls (P<0.05) and receiver-operating-characteristic analysis showed that plasma CXCL12α concentrations discriminated patients from healthy controls (AUC 0.80, 95% confidence interval 0.73-0.88). Kaplan Meier analysis indicated that elevated plasma CXCL12α concentration was associated with reduced survival (P<0.01). Multivariate Cox proportional hazards model showed that elevated CXCL12α independently predicted (P<0.05) earlier death in PAH with a hazard ratio (95% confidence interval) of 2.25 (1.01-5.00). In the largest subset by WHO functional class (Class 3, 65% of patients) elevated CXCL12α independently predicted (P<0.05) earlier death, hazard ratio 2.27 (1.05-4.89). Conclusions Our data show that elevated concentrations of circulating CXCL12α in PAH predicted poorer survival. Furthermore, elevated circulating CXCL12α was an independent risk factor for death that could potentially be included in a prognostic model and guide therapy. PMID:25856504
Elevated plasma CXCL12α is associated with a poorer prognosis in pulmonary arterial hypertension.
McCullagh, Brian N; Costello, Christine M; Li, Lili; O'Connell, Caroline; Codd, Mary; Lawrie, Allan; Morton, Allison; Kiely, David G; Condliffe, Robin; Elliot, Charles; McLoughlin, Paul; Gaine, Sean
2015-01-01
Recent work in preclinical models suggests that signalling via the pro-angiogenic and pro-inflammatory cytokine, CXCL12 (SDF-1), plays an important pathogenic role in pulmonary hypertension (PH). The objective of this study was to establish whether circulating concentrations of CXCL12α were elevated in patients with PAH and related to mortality. Plasma samples were collected from patients with idiopathic pulmonary arterial hypertension (IPAH) and PAH associated with connective tissue diseases (CTD-PAH) attending two pulmonary hypertension referral centres (n = 95) and from age and gender matched healthy controls (n = 44). Patients were subsequently monitored throughout a period of five years. CXCL12α concentrations were elevated in PAH groups compared to controls (P<0.05) and receiver-operating-characteristic analysis showed that plasma CXCL12α concentrations discriminated patients from healthy controls (AUC 0.80, 95% confidence interval 0.73-0.88). Kaplan Meier analysis indicated that elevated plasma CXCL12α concentration was associated with reduced survival (P<0.01). Multivariate Cox proportional hazards model showed that elevated CXCL12α independently predicted (P<0.05) earlier death in PAH with a hazard ratio (95% confidence interval) of 2.25 (1.01-5.00). In the largest subset by WHO functional class (Class 3, 65% of patients) elevated CXCL12α independently predicted (P<0.05) earlier death, hazard ratio 2.27 (1.05-4.89). Our data show that elevated concentrations of circulating CXCL12α in PAH predicted poorer survival. Furthermore, elevated circulating CXCL12α was an independent risk factor for death that could potentially be included in a prognostic model and guide therapy.
Finkelstein, Murray M; Chapman, Kenneth R; McIvor, R Andrew; Sears, Malcolm R
2011-01-01
BACKGROUND: Chronic obstructive pulmonary disease (COPD) and asthma are common; however, mortality rates among individuals with these diseases are not well studied in North America. OBJECTIVE: To investigate mortality rates and risk factors for premature death among subjects with COPD. METHODS: Subjects were identified from the lung function testing databases of two academic respiratory disease clinics in Hamilton and Toronto, Ontario. Mortality was ascertained by linkage to the Ontario mortality registry between 1992 and 2002, inclusive. Standardized mortality ratios were computed. Poisson regression of standardized mortality ratios and proportional hazards regression were performed to examine the multivariate effect of risk factors on the standardized mortality ratios and mortality hazards. RESULTS: Compared with the Ontario population, all-cause mortality was approximately doubled among subjects with COPD, but was lower than expected among subjects with asthma. The risk of mortality in patients with COPD was related to cigarette smoking, to the presence of comorbid conditions of ischemic heart disease and diabetes, and to Global Initiative for Chronic Obstructive Lung Disease severity scores. Individuals living closer to traffic sources showed an elevated risk of death compared with those who lived further away from traffic sources. CONCLUSIONS: Mortality rates among subjects diagnosed with COPD were substantially elevated. There were several deaths attributed to asthma among subjects in the present study; however, overall, patients with asthma demonstrated lower mortality rates than the general population. Subjects with COPD need to be managed with attention devoted to both their respiratory disorders and related comorbidities. PMID:22187688
Mueller, C.S.
2010-01-01
I analyze the sensitivity of seismic-hazard estimates in the central and eastern United States (CEUS) to maximum magnitude (mmax) by exercising the U.S. Geological Survey (USGS) probabilistic hazard model with several mmax alternatives. Seismicity-based sources control the hazard in most of the CEUS, but data seldom provide an objective basis for estimating mmax. The USGS uses preferred mmax values of moment magnitude 7.0 and 7.5 for the CEUS craton and extended margin, respectively, derived from data in stable continental regions worldwide. Other approaches, for example analysis of local seismicity or judgment about a source's seismogenic potential, often lead to much smaller mmax. Alternative models span the mmax ranges from the 1980s Electric Power Research Institute/Seismicity Owners Group (EPRI/SOG) analysis. Results are presented as hazard ratios relative to the USGS national seismic hazard maps. One alternative model specifies mmax equal to moment magnitude 5.0 and 5.5 for the craton and margin, respectively, similar to EPRI/SOG for some sources. For 2% probability of exceedance in 50 years (about 0.0004 annual probability), the strong mmax truncation produces hazard ratios equal to 0.35-0.60 for 0.2-sec spectral acceleration, and 0.15-0.35 for 1.0-sec spectral acceleration. Hazard-controlling earthquakes interact with mmax in complex ways. There is a relatively weak dependence on probability level: hazard ratios increase 0-15% for 0.002 annual exceedance probability and decrease 5-25% for 0.00001 annual exceedance probability. Although differences at some sites are tempered when faults are added, mmax clearly accounts for some of the discrepancies that are seen in comparisons between USGS-based and EPRI/SOG-based hazard results.
Yang, Hyun; Woo, Hyun Young; Lee, Soon Kyu; Han, Ji Won; Jang, Bohyun; Nam, Hee Chul; Lee, Hae Lim; Lee, Sung Won; Song, Do Seon; Song, Myeong Jun; Oh, Jung Suk; Chun, Ho Jong; Jang, Jeong Won; Lozada, Angelo; Bae, Si Hyun; Choi, Jong Young; Yoon, Seung Kew
2017-06-01
Metronomic chemotherapy (MET) involves the frequent, continuous administration of a chemotherapeutic agent at comparatively low doses. The aim of this study was to evaluate the feasibility and overall survival (OS) of MET compared to sorafenib for advanced hepatocellular carcinoma (HCC) patients with portal vein tumor thrombosis (PVTT). A total of 54 patients with advanced HCC and PVTT who had undergone MET were analyzed between 2005 and 2013. A total of 53 patients who had undergone sorafenib therapy were analyzed as the control group. The primary endpoint of this study was OS. The median number of MET cycles was two (1-15). The OS values for the MET group and sorafenib group were 158 days (132-184) and 117 days (92-142), respectively (P=0.029). The Cox proportional-hazard model showed that a higher risk of death was correlated with higher serum alpha fetoprotein level (≥400 mg/dL, hazard ratio [HR]=1.680, P=0.014) and Child-Pugh class B (HR=1.856, P=0.008). MET was associated with more favorable outcomes in terms of overall survival than was sorafenib in patients with advanced HCC with PVTT, especially in patients with poor liver function. Therefore, MET can be considered as a treatment option in patients with advanced HCC with PVTT and poor liver function.
Cooper, Leroy L; Palmisano, Joseph N; Benjamin, Emelia J; Larson, Martin G; Vasan, Ramachandran S; Mitchell, Gary F; Hamburg, Naomi M
2016-12-01
Arterial dysfunction contributes to cardiovascular disease (CVD) progression and clinical events. Inter-relations of aortic stiffness and vasodilator function with incident CVD remain incompletely studied. We used proportional hazards models to relate individual measures of vascular function to incident CVD in 4547 participants (mean age, 51±11 years; 54% women) in 2 generations of Framingham Heart Study participants. During follow-up (0.02-13.83 years), 232 participants (5%) experienced new-onset CVD events. In multivariable models adjusted for cardiovascular risk factors, both higher carotid-femoral pulse wave velocity (hazard ratio [HR], 1.32; 95% confidence interval [CI], 1.07-1.63; P=0.01) and lower hyperemic mean flow velocity (HR, 0.84; 95% CI, 0.71-0.99; P=0.04) were associated significantly with incident CVD, whereas primary pressure wave amplitude (HR, 1.12; 95% CI, 0.99-1.27; P=0.06), baseline brachial diameter (HR, 1.09; 95% CI, 0.90-1.31; P=0.39), and flow-mediated vasodilation (HR, 0.85; 95% CI, 0.69-1.04; P=0.12) were not. In mediation analyses, 8% to 13% of the relation between aortic stiffness and CVD events was mediated by hyperemic mean flow velocity. Our results suggest that associations between aortic stiffness and CVD events are mediated by pathways that include microvascular damage and remodeling. © 2016 American Heart Association, Inc.
Information processing speed and 8-year mortality among community-dwelling elderly Japanese.
Iwasa, Hajime; Kai, Ichiro; Yoshida, Yuko; Suzuki, Takao; Kim, Hunkyung; Yoshida, Hideyo
2014-01-01
Cognitive function is an important contributor to health among elderly adults. One reliable measure of cognitive functioning is information processing speed, which can predict incident dementia and is longitudinally related to the incidence of functional dependence. Few studies have examined the association between information processing speed and mortality. This 8-year prospective cohort study with mortality surveillance examined the longitudinal relationship between information processing speed and all-cause mortality among community-dwelling elderly Japanese. A total of 440 men and 371 women aged 70 years or older participated in this study. The Digit Symbol Substitution Test (DSST) was used to assess information processing speed. DSST score was used as an independent variable, and age, sex, education level, depressive symptoms, chronic disease, sensory deficit, instrumental activities of daily living, walking speed, and cognitive impairment were used as covariates. During the follow-up period, 182 participants (133 men and 49 women) died. A multivariate Cox proportional hazards model showed that lower DSST score was associated with increased risk of mortality (hazard ratio [HR] = 1.62, 95% CI = 0.97-2.72; HR = 1.73, 95% CI = 1.05-2.87; and HR = 2.55, 95% CI = 1.51-4.29, for the third, second, and first quartiles of DSST score, respectively). Slower information processing speed was associated with shorter survival among elderly Japanese.
Ashida, Toyo; Kondo, Naoki; Kondo, Katsunori
2016-08-01
The impact of social participation on older adults' health may differ by individual socioeconomic status (SES). Consequently, we examined SES effect modification on the associations between types of social activity participation and incident functional disability. Cohort data from the 2003 Japan Gerontological Evaluation Study (JAGES) were utilized. This included individuals who were aged 65 or older and functionally independent at baseline. Analysis was carried out on 12,991 respondents after acquisition of information about their long-term care (LTC) status in Japan. Incident functional disability was defined based on medical certification, and LTC information was obtained from municipal insurance databases. Cox proportional hazard regression was conducted for analysis. Results indicated that participants in a sport (hazard ratio [HR]: 0.66; 95% confidence interval [CI]: 0.51, 0.85) or hobby group (HR: 0.69; 95% CI: 0.55, 0.87), or who had a group facilitator role (HR: 0.82; 95% CI: 0.66, 1.02) were less likely to become disabled. While men with 13 or more years of education were less likely to become disabled if they held facilitator roles, this association was weak among men with 0-5 years of education (HR for the interaction term between 0-5 years of education and the facilitator role dummy variable = 3.95; 95% CI: 1.30, 12.05). In conclusion, the association between group participation and a lower risk of functional disability was stronger among highly educated older adults. Intervention programs promoting social participation should consider participants' socioeconomic backgrounds. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Enzalutamide in metastatic prostate cancer before chemotherapy.
Beer, Tomasz M; Armstrong, Andrew J; Rathkopf, Dana E; Loriot, Yohann; Sternberg, Cora N; Higano, Celestia S; Iversen, Peter; Bhattacharya, Suman; Carles, Joan; Chowdhury, Simon; Davis, Ian D; de Bono, Johann S; Evans, Christopher P; Fizazi, Karim; Joshua, Anthony M; Kim, Choung-Soo; Kimura, Go; Mainwaring, Paul; Mansbach, Harry; Miller, Kurt; Noonberg, Sarah B; Perabo, Frank; Phung, De; Saad, Fred; Scher, Howard I; Taplin, Mary-Ellen; Venner, Peter M; Tombal, Bertrand
2014-07-31
Enzalutamide is an oral androgen-receptor inhibitor that prolongs survival in men with metastatic castration-resistant prostate cancer in whom the disease has progressed after chemotherapy. New treatment options are needed for patients with metastatic prostate cancer who have not received chemotherapy, in whom the disease has progressed despite androgen-deprivation therapy. In this double-blind, phase 3 study, we randomly assigned 1717 patients to receive either enzalutamide (at a dose of 160 mg) or placebo once daily. The coprimary end points were radiographic progression-free survival and overall survival. The study was stopped after a planned interim analysis, conducted when 540 deaths had been reported, showed a benefit of the active treatment. The rate of radiographic progression-free survival at 12 months was 65% among patients treated with enzalutamide, as compared with 14% among patients receiving placebo (81% risk reduction; hazard ratio in the enzalutamide group, 0.19; 95% confidence interval [CI], 0.15 to 0.23; P<0.001). A total of 626 patients (72%) in the enzalutamide group, as compared with 532 patients (63%) in the placebo group, were alive at the data-cutoff date (29% reduction in the risk of death; hazard ratio, 0.71; 95% CI, 0.60 to 0.84; P<0.001). The benefit of enzalutamide was shown with respect to all secondary end points, including the time until the initiation of cytotoxic chemotherapy (hazard ratio, 0.35), the time until the first skeletal-related event (hazard ratio, 0.72), a complete or partial soft-tissue response (59% vs. 5%), the time until prostate-specific antigen (PSA) progression (hazard ratio, 0.17), and a rate of decline of at least 50% in PSA (78% vs. 3%) (P<0.001 for all comparisons). Fatigue and hypertension were the most common clinically relevant adverse events associated with enzalutamide treatment. Enzalutamide significantly decreased the risk of radiographic progression and death and delayed the initiation of chemotherapy in men with metastatic prostate cancer. (Funded by Medivation and Astellas Pharma; PREVAIL ClinicalTrials.gov number, NCT01212991.).
Peritoneal dialysis in rural Australia.
Gray, Nicholas A; Grace, Blair S; McDonald, Stephen P
2013-12-20
Australians living in rural areas have lower incidence rates of renal replacement therapy and poorer dialysis survival compared with urban dwellers. This study compares peritoneal dialysis (PD) patient characteristics and outcomes in rural and urban Australia. Non-indigenous Australian adults who commenced chronic dialysis between 1 January 2000 and 31 December 2010 according to the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) were investigated. Each patient's residence was classified according to the Australian Bureau of Statistics remote area index as major city (MC), inner regional (IR), outer regional (OR), or remote/very remote (REM). A total of 7657 patients underwent PD treatment during the study period. Patient distribution was 69.0% MC, 19.6% IR, 9.5% OR, and 1.8% REM. PD uptake increased with increasing remoteness. Compared with MC, sub-hazard ratios [95% confidence intervals] for commencing PD were 1.70 [1.61-1.79] IR, 2.01 [1.87-2.16] OR, and 2.60 [2.21-3.06] REM. During the first 6 months of PD, technique failure was less likely outside MC (sub-hazard ratio 0.47 [95% CI: 0.35-0.62], P < 0.001), but no difference was seen after 6 months (sub-hazard ratio 1.05 [95% CI: 0.84-1.32], P = 0.6). Technique failure due to technical (sub-hazard ratio 0.57 [95% CI: 0.38-0.84], P = 0.005) and non-medical causes (sub-hazard ratio 0.52 [95% CI: 0.31-0.87], P = 0.01) was less likely outside MC. Time to first peritonitis episode was not associated with remoteness (P = 0.8). Patient survival while on PD or within 90 days of stopping PD did not differ by region (P = 0.2). PD uptake increases with increasing remoteness. In rural areas, PD technique failure is less likely during the first 6 months and time to first peritonitis is comparable to urban areas. Mortality while on PD does not differ by region. PD is therefore a good dialysis modality choice for rural patients in Australia.
Peritoneal dialysis in rural Australia
2013-01-01
Background Australians living in rural areas have lower incidence rates of renal replacement therapy and poorer dialysis survival compared with urban dwellers. This study compares peritoneal dialysis (PD) patient characteristics and outcomes in rural and urban Australia. Methods Non-indigenous Australian adults who commenced chronic dialysis between 1 January 2000 and 31 December 2010 according to the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) were investigated. Each patient’s residence was classified according to the Australian Bureau of Statistics remote area index as major city (MC), inner regional (IR), outer regional (OR), or remote/very remote (REM). Results A total of 7657 patients underwent PD treatment during the study period. Patient distribution was 69.0% MC, 19.6% IR, 9.5% OR, and 1.8% REM. PD uptake increased with increasing remoteness. Compared with MC, sub-hazard ratios [95% confidence intervals] for commencing PD were 1.70 [1.61-1.79] IR, 2.01 [1.87-2.16] OR, and 2.60 [2.21-3.06] REM. During the first 6 months of PD, technique failure was less likely outside MC (sub-hazard ratio 0.47 [95% CI: 0.35-0.62], P < 0.001), but no difference was seen after 6 months (sub-hazard ratio 1.05 [95% CI: 0.84-1.32], P = 0.6). Technique failure due to technical (sub-hazard ratio 0.57 [95% CI: 0.38-0.84], P = 0.005) and non-medical causes (sub-hazard ratio 0.52 [95% CI: 0.31-0.87], P = 0.01) was less likely outside MC. Time to first peritonitis episode was not associated with remoteness (P = 0.8). Patient survival while on PD or within 90 days of stopping PD did not differ by region (P = 0.2). Conclusions PD uptake increases with increasing remoteness. In rural areas, PD technique failure is less likely during the first 6 months and time to first peritonitis is comparable to urban areas. Mortality while on PD does not differ by region. PD is therefore a good dialysis modality choice for rural patients in Australia. PMID:24359341
Effects of clopidogrel added to aspirin in patients with recent lacunar stroke.
Benavente, Oscar R; Hart, Robert G; McClure, Leslie A; Szychowski, Jeffrey M; Coffey, Christopher S; Pearce, Lesly A
2012-08-30
Lacunar infarcts are a frequent type of stroke caused mainly by cerebral small-vessel disease. The effectiveness of antiplatelet therapy for secondary prevention has not been defined. We conducted a double-blind, multicenter trial involving 3020 patients with recent symptomatic lacunar infarcts identified by magnetic resonance imaging. Patients were randomly assigned to receive 75 mg of clopidogrel or placebo daily; patients in both groups received 325 mg of aspirin daily. The primary outcome was any recurrent stroke, including ischemic stroke and intracranial hemorrhage. The participants had a mean age of 63 years, and 63% were men. After a mean follow-up of 3.4 years, the risk of recurrent stroke was not significantly reduced with aspirin and clopidogrel (dual antiplatelet therapy) (125 strokes; rate, 2.5% per year) as compared with aspirin alone (138 strokes, 2.7% per year) (hazard ratio, 0.92; 95% confidence interval [CI], 0.72 to 1.16), nor was the risk of recurrent ischemic stroke (hazard ratio, 0.82; 95% CI, 0.63 to 1.09) or disabling or fatal stroke (hazard ratio, 1.06; 95% CI, 0.69 to 1.64). The risk of major hemorrhage was almost doubled with dual antiplatelet therapy (105 hemorrhages, 2.1% per year) as compared with aspirin alone (56, 1.1% per year) (hazard ratio, 1.97; 95% CI, 1.41 to 2.71; P<0.001). Among classifiable recurrent ischemic strokes, 71% (133 of 187) were lacunar strokes. All-cause mortality was increased among patients assigned to receive dual antiplatelet therapy (77 deaths in the group receiving aspirin alone vs. 113 in the group receiving dual antiplatelet therapy) (hazard ratio, 1.52; 95% CI, 1.14 to 2.04; P=0.004); this difference was not accounted for by fatal hemorrhages (9 in the group receiving dual antiplatelet therapy vs. 4 in the group receiving aspirin alone). Among patients with recent lacunar strokes, the addition of clopidogrel to aspirin did not significantly reduce the risk of recurrent stroke and did significantly increase the risk of bleeding and death. (Funded by the National Institute of Neurological Disorders and Stroke and others; SPS3 ClinicalTrials.gov number, NCT00059306.).
Apixaban versus warfarin in patients with atrial fibrillation.
Granger, Christopher B; Alexander, John H; McMurray, John J V; Lopes, Renato D; Hylek, Elaine M; Hanna, Michael; Al-Khalidi, Hussein R; Ansell, Jack; Atar, Dan; Avezum, Alvaro; Bahit, M Cecilia; Diaz, Rafael; Easton, J Donald; Ezekowitz, Justin A; Flaker, Greg; Garcia, David; Geraldes, Margarida; Gersh, Bernard J; Golitsyn, Sergey; Goto, Shinya; Hermosillo, Antonio G; Hohnloser, Stefan H; Horowitz, John; Mohan, Puneet; Jansky, Petr; Lewis, Basil S; Lopez-Sendon, Jose Luis; Pais, Prem; Parkhomenko, Alexander; Verheugt, Freek W A; Zhu, Jun; Wallentin, Lars
2011-09-15
Vitamin K antagonists are highly effective in preventing stroke in patients with atrial fibrillation but have several limitations. Apixaban is a novel oral direct factor Xa inhibitor that has been shown to reduce the risk of stroke in a similar population in comparison with aspirin. In this randomized, double-blind trial, we compared apixaban (at a dose of 5 mg twice daily) with warfarin (target international normalized ratio, 2.0 to 3.0) in 18,201 patients with atrial fibrillation and at least one additional risk factor for stroke. The primary outcome was ischemic or hemorrhagic stroke or systemic embolism. The trial was designed to test for noninferiority, with key secondary objectives of testing for superiority with respect to the primary outcome and to the rates of major bleeding and death from any cause. The median duration of follow-up was 1.8 years. The rate of the primary outcome was 1.27% per year in the apixaban group, as compared with 1.60% per year in the warfarin group (hazard ratio with apixaban, 0.79; 95% confidence interval [CI], 0.66 to 0.95; P<0.001 for noninferiority; P=0.01 for superiority). The rate of major bleeding was 2.13% per year in the apixaban group, as compared with 3.09% per year in the warfarin group (hazard ratio, 0.69; 95% CI, 0.60 to 0.80; P<0.001), and the rates of death from any cause were 3.52% and 3.94%, respectively (hazard ratio, 0.89; 95% CI, 0.80 to 0.99; P=0.047). The rate of hemorrhagic stroke was 0.24% per year in the apixaban group, as compared with 0.47% per year in the warfarin group (hazard ratio, 0.51; 95% CI, 0.35 to 0.75; P<0.001), and the rate of ischemic or uncertain type of stroke was 0.97% per year in the apixaban group and 1.05% per year in the warfarin group (hazard ratio, 0.92; 95% CI, 0.74 to 1.13; P=0.42). In patients with atrial fibrillation, apixaban was superior to warfarin in preventing stroke or systemic embolism, caused less bleeding, and resulted in lower mortality. (Funded by Bristol-Myers Squibb and Pfizer; ARISTOTLE ClinicalTrials.gov number, NCT00412984.).
Salim, Agus; Tai, E Shyong; Tan, Vincent Y; Welsh, Alan H; Liew, Reginald; Naidoo, Nasheen; Wu, Yi; Yuan, Jian-Min; Koh, Woon P; van Dam, Rob M
2016-08-01
In western populations, high-sensitivity C-reactive protein (hsCRP), and to a lesser degree serum creatinine and haemoglobin A1c, predict risk of coronary heart disease (CHD). However, data on Asian populations that are increasingly affected by CHD are sparse and it is not clear whether these biomarkers can be used to improve CHD risk classification. We conducted a nested case-control study within the Singapore Chinese Health Study cohort, with incident 'hard' CHD (myocardial infarction or CHD death) as an outcome. We used data from 965 men (298 cases, 667 controls) and 528 women (143 cases, 385 controls) to examine the utility of hsCRP, serum creatinine and haemoglobin A1c in improving the prediction of CHD risk over and above traditional risk factors for CHD included in the ATP III model. For each sex, the performance of models with only traditional risk factors used in the ATP III model was compared with models with the biomarkers added using weighted Cox proportional hazards analysis. The impact of adding these biomarkers was assessed using the net reclassification improvement index. For men, the natural logarithm of hsCRP (hazard ratio 1.25, 95% confidence interval: 1.05; 1.49) and the natural logarithm of serum creatinine (hazard ratio 4.82, 95% confidence interval: 2.10; 11.04) showed statistically significant associations with CHD risk when added to the ATP III model. We did not observe a significant association between the natural logarithm of haemoglobin A1c and CHD risk (hazard ratio 1.83, 95% confidence interval: 0.21; 16.06). Adding hsCRP and serum creatinine to the ATP III model improved risk classification in men, with a net gain of 6.3% of cases (p-value = 0.001) being reclassified to a higher risk category, while it did not significantly reduce the accuracy of classification for non-cases. For women, squared hsCRP was borderline significantly (hazard ratio 1.01, 95% confidence interval: 1.00; 1.03) and squared serum creatinine was significantly (hazard ratio 1.81, 95% confidence interval: 1.49; 2.21) associated with CHD risk. However, the association between squared haemoglobin A1c and CHD risk was not significant (hazard ratio 1.05, 95% confidence interval: 0.99; 1.12). The addition of hsCRP and serum creatinine to the ATP III model resulted in 3.7% of future cases being reclassified to a higher risk category (p-value = 0.025), while it did not significantly reduce the accuracy of classification for non-cases. Adding hsCRP and serum creatinine, but not haemoglobin A1c, to traditional risk factors improved CHD risk prediction among non-diabetic Singaporean Chinese. The improved risk estimates will allow better identification of individuals at high risk of CHD than existing risk calculators such as the ATP III model. © The European Society of Cardiology 2016.
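The category-based net reclassification improvement used above can be computed as the net proportion of cases moved to a higher risk category plus the net proportion of non-cases moved to a lower one. The sketch below uses hypothetical cut-points and simulated predicted risks, not the study's data.

```python
# Minimal sketch of the category-based net reclassification improvement (NRI);
# the risk-category cut-points and the simulated data are hypothetical.
import numpy as np

def category_nri(p_old, p_new, is_case, cuts=(0.05, 0.20)):
    """NRI components for cases and non-cases given two sets of predicted risks."""
    old_cat = np.digitize(p_old, cuts)
    new_cat = np.digitize(p_new, cuts)
    up, down = new_cat > old_cat, new_cat < old_cat
    case, ctrl = is_case == 1, is_case == 0
    nri_cases = up[case].mean() - down[case].mean()   # net fraction of cases moved up
    nri_ctrls = down[ctrl].mean() - up[ctrl].mean()   # net fraction of non-cases moved down
    return nri_cases, nri_ctrls, nri_cases + nri_ctrls

# Hypothetical risks from a baseline model and a model with added biomarkers:
rng = np.random.default_rng(5)
is_case = rng.binomial(1, 0.2, 2000)
p_old = np.clip(rng.beta(2, 8, 2000) + 0.05 * is_case, 0, 1)
p_new = np.clip(p_old + rng.normal(0.03, 0.02, 2000) * (2 * is_case - 1), 0, 1)
print(category_nri(p_old, p_new, is_case))
```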
Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J
2012-02-01
The results of randomized controlled trials (RCTs) on time-to-event outcomes are usually reported as the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on the number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside KM curves.
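The core of the reconstruction idea is inverting the Kaplan-Meier product-limit relation step by step. The toy sketch below assumes no censoring between digitised steps, so it is far simpler than the published algorithm; the digitised coordinates and the initial number at risk are made up for illustration.

```python
# Simplified sketch of the idea behind reconstructing individual patient data
# from a digitised Kaplan-Meier curve: invert S(t_i) = S(t_{i-1}) * (1 - d_i/n_i)
# step by step. Unlike the full published algorithm, this toy version assumes no
# censoring between steps; the digitised coordinates below are hypothetical.
import numpy as np

def reconstruct_events(step_times, step_surv, n_at_risk_start):
    """Return approximate event times implied by digitised KM steps."""
    n = n_at_risk_start
    prev_s = 1.0
    events = []
    for t, s in zip(step_times, step_surv):
        d = int(round(n * (1.0 - s / prev_s)))   # deaths implied by the drop
        events.extend([t] * d)
        n -= d                                   # no interval censoring assumed
        prev_s = s
    return np.array(events)

# Hypothetical digitised curve (times in months, survival after each drop):
times = [2.0, 5.0, 9.0, 14.0, 20.0]
surv = [0.95, 0.85, 0.70, 0.55, 0.40]
ipd_event_times = reconstruct_events(times, surv, n_at_risk_start=100)
print(len(ipd_event_times), "events reconstructed; median ~", np.median(ipd_event_times))
```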
Phenotype at diagnosis predicts recurrence rates in Crohn's disease.
Wolters, F L; Russel, M G; Sijbrandij, J; Ambergen, T; Odes, S; Riis, L; Langholz, E; Politi, P; Qasim, A; Koutroubakis, I; Tsianos, E; Vermeire, S; Freitas, J; van Zeijl, G; Hoie, O; Bernklev, T; Beltrami, M; Rodriguez, D; Stockbrügger, R W; Moum, B
2006-08-01
In Crohn's disease (CD), studies associating phenotype at diagnosis and subsequent disease activity are important for patient counselling and health care planning. To calculate disease recurrence rates and to correlate these with phenotypic traits at diagnosis. A prospectively assembled, uniformly diagnosed European population based inception cohort of CD patients was classified according to the Vienna classification for disease phenotype at diagnosis. Surgical and non-surgical recurrence rates throughout a 10 year follow up period were calculated. Multivariate analysis was performed to classify risk factors present at diagnosis for recurrent disease. A total of 358 patients were classified for phenotype at diagnosis, of whom 262 (73.2%) had a first recurrence and 113 patients (31.6%) a first surgical recurrence during the first 10 years after diagnosis. Patients with upper gastrointestinal disease at diagnosis had an excess risk of recurrence (hazard ratio 1.54 (95% confidence interval (CI) 1.13-2.10)) whereas age ≥40 years at diagnosis was protective (hazard ratio 0.82 (95% CI 0.70-0.97)). Colonic disease was a protective characteristic for resective surgery (hazard ratio 0.38 (95% CI 0.21-0.69)). More frequent resective surgical recurrences were reported from Copenhagen (hazard ratio 3.23 (95% CI 1.32-7.89)). A mild course of disease in terms of disease recurrence was observed in this European cohort. Phenotype at diagnosis had predictive value for disease recurrence, with upper gastrointestinal disease being the most important positive predictor. A phenotypic North-South gradient in CD may be present, illustrated by higher surgery risks in some of the Northern European centres.
Hansen, Morten; Nyby, Sebastian; Eifer Møller, Jacob; Videbæk, Lars; Kassem, Moustapha; Barington, Torben; Thayssen, Per; Diederichsen, Axel Cosmus Pyndt
2014-01-01
Seven years ago, the DanCell study was carried out to test the hypothesis of improvement in left ventricular ejection fraction (LVEF) following repeated intracoronary injections of autologous bone marrow-derived stem cells (BMSCs) in patients suffering from chronic ischemic heart failure. In this post hoc analysis, the long-term effect of therapy is assessed. 32 patients [mean age 61 (SD ± 9), 81% males] with systolic dysfunction (LVEF 33 ± 9%) received two repeated intracoronary infusions (4 months apart) of autologous BMSCs (1,533 ± 765 × 10⁶ BMSCs including 23 ± 11 × 10⁶ CD34⁺ cells and 14 ± 7 × 10⁶ CD133⁺ cells). Patients were followed for 7 years and deaths were recorded. During follow-up, 10 patients died (31%). In univariate regression analysis, the total number of BMSCs, CD34⁺ cell count and CD133⁺ cell count did not significantly correlate with survival (hazard ratio: 0.999, 95% CI: 0.998-1.000, p = 0.24; hazard ratio: 0.94, 95% CI: 0.88-1.01, p = 0.10; and hazard ratio: 0.96, 95% CI: 0.87-1.07, p = 0.47, respectively). After adjustment for baseline variables in multivariate regression analysis, the CD34⁺ cell count was significantly associated with survival (hazard ratio: 0.90, 95% CI: 0.82-1.00, p = 0.04). Intracoronary injections of a high number of CD34⁺ cells may have a beneficial effect on chronic ischemic heart failure in terms of long-term survival.
Cardiac rehabilitation attendance and outcomes in coronary artery disease patients.
Martin, Billie-Jean; Hauer, Trina; Arena, Ross; Austford, Leslie D; Galbraith, P Diane; Lewin, Adriane M; Knudtson, Merril L; Ghali, William A; Stone, James A; Aggarwal, Sandeep G
2012-08-07
Cardiac rehabilitation (CR) is an efficacious yet underused treatment for patients with coronary artery disease. The objective of this study was to determine the association between CR completion and mortality and resource use. We conducted a prospective cohort study of 5886 subjects (20.8% female; mean age, 60.6 years) who had undergone angiography and were referred for CR in Calgary, AB, Canada, between 1996 and 2009. Outcomes of interest included freedom from emergency room visits, hospitalization, and survival in CR completers versus noncompleters, adjusted for clinical covariates, treatment strategy, and coronary anatomy. Hazard ratios for events for CR completers versus noncompleters were also constructed. A propensity model was used to match completers to noncompleters on baseline characteristics, and each outcome was compared between propensity-matched groups. Of the subjects referred for CR, 2900 (49.3%) completed the program, and an additional 554 subjects started but did not complete CR. CR completion was associated with a lower risk of death, with an adjusted hazard ratio of 0.59 (95% confidence interval, 0.49-0.70). CR completion was also associated with a decreased risk of all-cause hospitalization (adjusted hazard ratio, 0.77; 95% confidence interval, 0.71-0.84) and cardiac hospitalization (adjusted hazard ratio, 0.68; 95% confidence interval, 0.55-0.83) but not with emergency room visits. Propensity-matched analysis demonstrated a persistent association between CR completion and reduced mortality. Among those coronary artery disease patients referred, CR completion is associated with improved survival and decreased hospitalization. There is a need to explore reasons for nonattendance and to test interventions to improve attendance after referral.
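A minimal sketch of the general analytic approach described above (covariate-adjusted Cox hazard ratios plus a propensity-matched comparison), assuming simulated data and hypothetical variable names (completed, age, female); it uses greedy 1:1 nearest-neighbour matching on a logistic-regression propensity score and is an illustration only, not the authors' actual matching algorithm or code.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)

# Hypothetical cohort: 'completed' is the exposure (CR completion),
# 'age'/'female' are baseline covariates, 'time'/'death' the outcome.
n = 2000
df = pd.DataFrame({
    "age": rng.normal(60, 10, n),
    "female": rng.integers(0, 2, n),
})
df["completed"] = rng.binomial(1, 1 / (1 + np.exp(-0.03 * (df.age - 60))), n)
df["time"] = rng.exponential(10, n)
df["death"] = rng.binomial(1, 0.3, n)

# 1) Propensity score: probability of completing CR given baseline covariates.
ps_model = LogisticRegression().fit(df[["age", "female"]], df["completed"])
df["ps"] = ps_model.predict_proba(df[["age", "female"]])[:, 1]

# 2) Greedy 1:1 nearest-neighbour matching on the propensity score.
completers = df[df.completed == 1]
pool = df[df.completed == 0].copy()
matched_idx = []
for i, row in completers.iterrows():
    j = (pool.ps - row.ps).abs().idxmin()   # closest non-completer
    matched_idx.extend([i, j])
    pool = pool.drop(j)
    if pool.empty:
        break
matched = df.loc[matched_idx]

# 3) Cox model in the matched sample: HR for completers vs non-completers.
cph = CoxPHFitter()
cph.fit(matched[["time", "death", "completed"]],
        duration_col="time", event_col="death")
print(cph.hazard_ratios_)
```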
Leigh, Lucy; Hudson, Irene L; Byles, Julie E
2015-12-01
The aim of this study is to identify patterns of sleep difficulty in older women, to investigate whether sleep difficulty is an indicator for poorer survival, and to determine whether sleep difficulty modifies the association between disease and death. Data were from the Australian Longitudinal Study on Women's Health, a 15-year longitudinal cohort study, with 10 721 women aged 70-75 years at baseline. Repeated-measures latent class analysis identified four classes of persistent sleep difficulty: troubled sleepers (N = 2429, 22.7%); early wakers (N = 3083, 28.8%); trouble falling asleep (N = 1767, 16.5%); and untroubled sleepers (N = 3442, 32.1%). Sleep difficulty was an indicator for mortality. Compared with untroubled sleepers, hazard ratios and 95% confidence intervals for troubled sleepers, early wakers, and trouble falling asleep were 1.12 (1.03, 1.23), 0.81 (0.75, 0.91) and 0.89 (0.79, 1.00), respectively. Sleep difficulty may modify the prognosis of women with chronic diseases. Hazard ratios (and 95% confidence intervals) for having three or more diseases (compared with 0 diseases) were enhanced for untroubled sleepers, early wakers and trouble falling asleep [hazard ratio = 1.86 (1.55, 2.22), 1.91 (1.56, 2.35) and 1.98 (1.47, 2.66), respectively], and reduced for troubled sleepers [hazard ratio = 1.57 (1.24, 1.98)]. Sleep difficulty in older women is more complex than the presence or absence of sleep difficulty, and should be considered when assessing the risk of death associated with disease. © 2015 European Sleep Research Society.
Step 1: Human System Integration Pilot-Technology Interface Requirements for Weather Management
NASA Technical Reports Server (NTRS)
2005-01-01
This document involves definition of technology interface requirements for Hazardous Weather Avoidance. Technology concepts in use by the Access 5 Weather Management Work Package were considered. Beginning with the Human System Integration (HSI) high-level functional requirement for Hazardous Weather Avoidance, and the Hazardous Weather Avoidance technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of hazardous weather, and (2) the control capability needed by the pilot to obtain hazardous weather information. Fundamentally, these requirements provide the candidate Hazardous Weather Avoidance technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Hazardous Weather Avoidance operations and functions should interface with the pilot to provide the necessary Weather Management functionality to the UA-pilot system. Requirements and guidelines for Hazardous Weather Avoidance are partitioned into four categories: (1) Planning En Route, (2) Encountering Hazardous Weather En Route, (3) Planning to Destination, and (4) Diversion Planning to an Alternate Airport. Each requirement is stated and is supported with a rationale and associated reference(s).
Hazard Function Estimation with Cause-of-Death Data Missing at Random
Wang, Qihua; Dinse, Gregg E.; Liu, Chunling
2010-01-01
Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data. PMID:22267874
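For orientation, the sketch below shows a basic kernel-smoothed hazard estimate obtained by smoothing Nelson-Aalen increments with a Gaussian kernel, under the simplifying assumption that censoring indicators are fully observed; the paper's regression-surrogate, imputation, and inverse-probability-weighted adjustments for missing causes of death are not implemented here, and the data and bandwidth are hypothetical.

```python
import numpy as np

def kernel_hazard(times, events, grid, bandwidth):
    """Kernel smoothing of Nelson-Aalen increments:
    hazard(t) ~ sum_i K_h(t - T_i) * dN_i / Y(T_i), Gaussian kernel K_h.
    Complete-data version; a missing-cause method would replace dN_i by
    estimated or weighted quantities."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)          # Y(T_i): subjects still at risk
    increments = events / at_risk       # Nelson-Aalen jump sizes
    diffs = (grid[:, None] - times[None, :]) / bandwidth
    weights = np.exp(-0.5 * diffs**2) / (np.sqrt(2 * np.pi) * bandwidth)
    return weights @ increments

# Hypothetical censored sample: exponential event times, uniform censoring.
rng = np.random.default_rng(1)
t_event = rng.exponential(5.0, 300)
t_cens = rng.uniform(0, 10, 300)
obs = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(int)

grid = np.linspace(0.5, 8.0, 16)
print(np.round(kernel_hazard(obs, delta, grid, bandwidth=1.0), 3))
```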
Attrition in Psychotherapy: A Survival Analysis
ERIC Educational Resources Information Center
Roseborough, David John; McLeod, Jeffrey T.; Wright, Florence I.
2016-01-01
Purpose: Attrition is a common problem in psychotherapy and can be defined as clients ending treatment before achieving an optimal response. Method: This longitudinal, archival study utilized data for 3,728 clients, using the Outcome Questionnaire 45.2. A Cox regression proportional hazards (hazard ratios) model was used in order to better…
Wieder, Robert; Shafiq, Basit; Adam, Nabil
2016-01-01
BACKGROUND: African American race negatively impacts survival from localized breast cancer but co-variable factors confound the impact. METHODS: Data sets were analyzed from the Surveillance, Epidemiology and End Results (SEER) directories from 1973 to 2011 consisting of patients with designated diagnosis of breast adenocarcinoma, race as White or Caucasian, Black or African American, Asian, American Indian or Alaskan Native, Native Hawaiian or Pacific Islander, age, stage I, II or III, grade 1, 2 or 3, estrogen receptor or progesterone receptor positive or negative, marital status as single, married, separated, divorced or widowed and laterality as right or left. The Cox Proportional Hazards Regression model was used to determine hazard ratios for survival. Chi square test was applied to determine the interdependence of variables found significant in the multivariable Cox Proportional Hazards Regression analysis. Cells with stratified data of patients with identical characteristics except African American or Caucasian race were compared. RESULTS: Age, stage, grade, ER and PR status and marital status significantly co-varied with race and with each other. Stratifications by single co-variables demonstrated worse hazard ratios for survival for African Americans. Stratification by three and four co-variables demonstrated worse hazard ratios for survival for African Americans in most subgroupings with sufficient numbers of values. Differences in some subgroupings containing poor prognostic co-variables did not reach significance, suggesting that race effects may be partly overcome by additional poor prognostic indicators. CONCLUSIONS: African American race is a poor prognostic indicator for survival from breast cancer independent of 6 associated co-variables with prognostic significance. PMID:27698895
Heymans, Martijn W; de Vet, Henrica C W; Bongers, Paulien M; Knol, Dirk L; Koes, Bart W; van Mechelen, Willem
2006-05-01
Randomized controlled trial. To compare high- and low-intensity back schools with usual care in occupational health care. The content and intensity of back schools vary widely and the methodologic quality of randomized controlled trials is generally weak. Until now, no back school has proven to be superior for workers sick-listed because of subacute nonspecific low back pain. Workers (n = 299) sick-listed for a period of 3 to 6 weeks because of nonspecific low back pain were recruited by the occupational physician and randomly assigned to a high-intensity back school, a low-intensity back school, or care as usual. Outcome measures were days until return to work, total days of sick-leave, pain, functional status, kinesiophobia, and perceived recovery and were assessed at baseline and at 3 and 6 months of follow-up. Principal analyses were performed according to the intention-to-treat principle. We randomly allocated 299 workers. Workers in the low-intensity back school returned to work faster compared with usual care and the high-intensity back school, with hazard ratios of 1.4 (P = 0.06) and 1.3 (P = 0.09), respectively. The comparison between high-intensity back school and usual care resulted in a hazard ratio of 1.0 (P = 0.83). The median number of sick-leave days was 68, 75, and 85 in the low-intensity back school, usual care, and high-intensity back school, respectively. Beneficial effects on functional status and kinesiophobia were found at 3 months in favor of the low-intensity back school. No substantial differences on pain and perceived recovery were found between groups. The low-intensity back school was most effective in reducing work absence, functional disability, and kinesiophobia, and more workers in this group scored a higher perceived recovery during the 6-month follow-up.
Durheim, Michael T; Smith, Patrick J; Babyak, Michael A; Mabe, Stephanie K; Martinu, Tereza; Welty-Wolf, Karen E; Emery, Charles F; Palmer, Scott M; Blumenthal, James A
2015-03-01
The 2011 combined Global Initiative for Chronic Obstructive Lung Disease (GOLD) assessment incorporates symptoms, exacerbation history, and spirometry in discriminating risk of exacerbations in patients with chronic obstructive pulmonary disease (COPD). Six-minute-walk distance (6MWD) and accelerometry also have been used to assess disease severity in COPD. The association between these measures and the risks of hospitalization and mortality in the context of GOLD 2011 is unknown. To describe changes in exercise tolerance and physical activity over time in patients with COPD and to test the hypothesis that lower baseline 6MWD or accelerometry step count is associated with increased risk of COPD-related hospitalization or all-cause mortality, independent of GOLD 2011 group. Physical function and medical outcomes were prospectively assessed in 326 patients with moderate to severe COPD in INSPIRE-II, a randomized controlled trial of a coping skills training intervention. Cox models were used to determine if GOLD 2011 group, 6MWD, or accelerometry steps were associated with risk of COPD-related hospitalization or all-cause mortality. Physical function declined over time in GOLD group D but remained stable in groups A, B, and C. GOLD classification was associated with time to death or first COPD-related hospitalization. Baseline 6MWD was more strongly associated with time to death or first COPD-related hospitalization (hazard ratio, 0.50 [95% confidence interval, 0.34, 0.73] per 150 m, P=0.0003) than GOLD 2011 classification. A similar relationship was observed for accelerometry steps (hazard ratio, 0.80 [95% confidence interval, 0.70, 0.92] per 1,000 steps, P=0.002). Exercise tolerance and daily physical activity are important predictors of hospitalization and mortality in COPD, independent of GOLD 2011 classification. Physical function may represent a modifiable risk factor that warrants increased attention as a target for interventions to improve clinically meaningful outcomes in COPD.
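The "per 150 m" and "per 1,000 steps" hazard ratios above are simple rescalings of the Cox coefficient for a continuous predictor; the tiny illustration below uses made-up numbers chosen only to match a hazard ratio of 0.50 per 150 m.

```python
import numpy as np

# If beta is the log-hazard per 1 m of 6MWD, then HR per k metres = exp(k * beta).
beta_per_metre = np.log(0.50) / 150          # implied by HR 0.50 per 150 m
hr_per_150m = np.exp(150 * beta_per_metre)   # recovers 0.50
hr_per_50m = np.exp(50 * beta_per_metre)     # same model, different increment
print(round(hr_per_150m, 2), round(hr_per_50m, 2))
```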
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boelling, Tobias, E-mail: Tobias.Boelling@uni-muenster.de; Department of Radiotherapy, Paracelsus Clinic Osnabrueck, Osnabrueck; Geisenheiser, Alina
Purpose: The 'Registry for the Evaluation of Side Effects After Radiotherapy in Childhood and Adolescence' (RiSK) has been established to prospectively characterize dose-volume effects of radiation in terms of side effects. The aim of this analysis was to characterize the function of the thyroid gland after radiotherapy to the head-and-neck region in children and adolescents. Methods and Materials: Detailed information regarding radiation doses to at-risk organs has been collected across Germany since 2001. Thyroid function was evaluated by blood value examinations of thyroid-stimulating hormone, triiodothyronine, and thyroxine. Information regarding thyroid hormone substitution was requested from the treating physicians. Results: Until May 2009, 1,086 patients from 62 centers were recruited, including 404 patients (median age, 10.9 years) who had received radiotherapy to the thyroid gland and/or hypophysis. Follow-up information was available for 264 patients (60.9%; median follow-up, 40 months), with 60 patients (22.7%) showing pathologic values. In comparison to patients treated with prophylactic cranial irradiation (median dose, 12 Gy), patients with radiation doses of 15 to 25 Gy to the thyroid gland had a hazard ratio of 3.072 (p = 0.002) for the development of pathologic thyroid blood values. Patients with greater than 25 Gy to the thyroid gland and patients who underwent craniospinal irradiation had hazard ratios of 3.768 (p = 0.009) and 5.674 (p < 0.001), respectively. The cumulative incidence of thyroid hormone substitution therapy did not differ between defined subgroups. Conclusions: Radiation-induced thyroid function impairment, including damage to the thyroid gland and/or hypophysis, can frequently be observed after radiotherapy in children. A structured follow-up examination is advised.
Concealed renal failure and adverse drug reactions in older patients with type 2 diabetes mellitus.
Corsonello, Andrea; Pedone, Claudio; Corica, Francesco; Mazzei, Bruno; Di Iorio, Angelo; Carbonin, Pierugo; Incalzi, Raffaele Antonelli
2005-09-01
In elderly patients, serum creatinine may be normal despite decreased glomerular filtration rate (GFR). The aim of this study was to evaluate the prevalence of this "concealed" renal failure, i.e., renal failure with normal serum creatinine levels, in elderly diabetic patients, and to verify whether it is a risk factor for adverse drug reactions (ADR) to hydrosoluble drugs. We used data on 2257 hospitalized patients with type 2 diabetes mellitus enrolled in the Gruppo Italiano di Farmacovigilanza nell'Anziano study. On the basis of serum creatinine and calculated GFR, patients were grouped as follows: normal renal function (normal serum creatinine levels and normal GFR), concealed (normal serum creatinine levels and reduced GFR), or overt (increased creatinine levels and reduced GFR) renal failure. GFR was calculated using the Modification of Diet in Renal Disease (MDRD) equation. The outcome of the study was the incidence of ADR to hydrosoluble drugs during the hospital stay. The relationship between renal function and ADR was evaluated using Cox regression analysis including potential confounders. Concealed renal failure was observed in 363 (16.1%) of the patients studied. Patients with concealed or overt renal failure were older, more frequently had cognitive impairment and polypharmacy, and had lower serum albumin levels than did those with normal renal function. Both concealed (hazard ratio = 1.90; 95% confidence interval, 1.04-3.48; p = .036) and overt (hazard ratio = 2.23; 95% confidence interval, 1.40-3.55; p = .001) renal failure were significantly associated with ADR to hydrosoluble drugs. The use of more than four drugs also qualified as an independent risk factor for ADRs to hydrosoluble drugs during hospital stay. Older diabetic patients should be systematically screened to ascertain the presence of concealed renal failure in an attempt to optimize the pharmacological treatment and reduce the risk of ADRs.
How much does a reminder letter increase cervical screening among under-screened women in NSW?
Morrell, Stephen; Taylor, Richard; Zeckendorf, Sue; Niciak, Amanda; Wain, Gerard; Ross, Jayne
2005-02-01
To evaluate a direct mail-out campaign to increase Pap screening rates in women who have not had a test in 48 months. Ninety thousand under-screened women were randomised to be mailed a 48-month reminder letter to have a Pap test (n=60,000), or not to be mailed a letter (n=30,000). Differences in Pap test rates were assessed by Kaplan-Meier survival analysis, by χ² tests of significance between Pap test rates in letter versus no-letter groups, and by proportional hazards regression modelling of predictors of a Pap test with letter versus no-letter as the main study variable. T-tests were conducted on mean time to Pap test to assess whether time to Pap test was significantly different between the intervention and control groups. After 90 days following each mail-out, Pap test rates in the letter group were significantly higher than in the non-letter group, by approximately two percentage points. After controlling for potential confounders, the hazard ratio of a Pap test within 90 days of a mail-out in the letter group was 1.5 compared with 1.0 in the no-letter group. Hazard ratios of having a Pap test within 90 days decreased significantly with time since last Pap test (p<0.0001); were significantly higher than 1.0 for most non-metropolitan areas of NSW compared with metropolitan areas; and increased significantly with age (p<0.0001). Pap test hazard ratios were not associated with socio-economic status of area of residence, but the hazard ratio was significantly higher than 1.0 if the reminder letter was sent after the Christmas/New Year break. No significant differences in mean time to Pap test were found between the letter and no-letter groups. Being sent a reminder letter is associated with higher Pap testing rates in under-screened women.
Ganz, Peter; Amarenco, Pierre; Goldstein, Larry B; Sillesen, Henrik; Bao, Weihang; Preston, Gregory M; Welch, K Michael A
2017-12-01
Established risk factors do not fully identify patients at risk for recurrent stroke. The SPARCL trial (Stroke Prevention by Aggressive Reduction in Cholesterol Levels) evaluated the effect of atorvastatin on stroke risk in patients with a recent stroke or transient ischemic attack and no known coronary heart disease. This analysis explored the relationships between 13 plasma biomarkers assessed at trial enrollment and the occurrence of outcome strokes. We conducted a case-cohort study of 2176 participants; 562 had outcome strokes and 1614 were selected randomly from those without outcome strokes. Time to stroke was evaluated by Cox proportional hazards models. There was no association between time to stroke and lipoprotein-associated phospholipase A2, monocyte chemoattractant protein-1, resistin, matrix metalloproteinase-9, N-terminal fragment of pro-B-type natriuretic peptide, soluble vascular cell adhesion molecule-1, soluble intercellular adhesion molecule-1, or soluble CD40 ligand. In adjusted analyses, osteopontin (hazard ratio per SD change, 1.362; P<0.0001), neopterin (hazard ratio, 1.137; P=0.0107), myeloperoxidase (hazard ratio, 1.177; P=0.0022), and adiponectin (hazard ratio, 1.207; P=0.0013) were independently associated with outcome strokes. After adjustment for the Stroke Prognostic Instrument-II and treatment, osteopontin, neopterin, and myeloperoxidase remained independently associated with outcome strokes. The addition of these 3 biomarkers to Stroke Prognostic Instrument-II increased the area under the receiver operating characteristic curve by 0.023 (P=0.015) and yielded a continuous net reclassification improvement (29.1%; P<0.0001) and an integrated discrimination improvement (42.3%; P<0.0001). Osteopontin, neopterin, and myeloperoxidase were independently associated with the risk of recurrent stroke and improved risk classification when added to a clinical risk algorithm. URL: http://www.clinicaltrials.gov. Unique Identifier: NCT00147602. © 2017 American Heart Association, Inc.
Meguid, Robert A.; Hooker, Craig M.; Harris, James; Xu, Li; Westra, William H.; Sherwood, J. Timothy; Sussman, Marc; Cattaneo, Stephen M.; Shin, James; Cox, Solange; Christensen, Joani; Prints, Yelena; Yuan, Nance; Zhang, Jennifer; Yang, Stephen C.
2010-01-01
Background: Survival outcomes of never smokers with non-small cell lung cancer (NSCLC) who undergo surgery are poorly characterized. This investigation compared surgical outcomes of never and current smokers with NSCLC. Methods: This investigation was a single-institution retrospective study of never and current smokers with NSCLC from 1975 to 2004. From an analytic cohort of 4,546 patients with NSCLC, we identified 724 never smokers and 3,822 current smokers. Overall, 1,142 patients underwent surgery with curative intent. For survival analysis by smoking status, hazard ratios (HRs) were estimated using Cox proportional hazard modeling and then further adjusted by other covariates. Results: Never smokers were significantly more likely than current smokers to be women (P < .01), older (P < .01), and to have adenocarcinoma (P < .01) and bronchioloalveolar carcinoma (P < .01). No statistically significant differences existed in stage distribution at presentation for the analytic cohort (P = .35) or for the subgroup undergoing surgery (P = .24). The strongest risk factors of mortality among patients with NSCLC who underwent surgery were advanced stage (adjusted hazard ratio, 3.43; 95% CI, 2.32-5.07; P < .01) and elevated American Society of Anesthesiologists classification (adjusted hazard ratio, 2.18; 95% CI, 1.40-3.40; P < .01). The minor trend toward an elevated risk of death on univariate analysis for current vs never smokers in the surgically treated group (hazard ratio, 1.20; 95% CI, 0.98-1.46; P = .07) was completely eliminated when the model was adjusted for covariates (P = .97). Conclusions: Our findings suggest that smoking status at time of lung cancer diagnosis has little impact on the long-term survival of patients with NSCLC, especially after curative surgery. Despite different etiologies between lung cancer in never and current smokers the prognosis is equally dismal. PMID:20507946
Whitson, Bryan A; Groth, Shawn S; Andrade, Rafael S; Mitiek, Mohi O; Maddaus, Michael A; D'Cunha, Jonathan
2012-03-01
We used a population-based data set to assess the association between the extent of pulmonary resection for bronchoalveolar carcinoma and survival. The reports thus far have been limited to small, institutional series. Using the Surveillance, Epidemiology, and End Results database (1988-2007), we identified patients with bronchoalveolar carcinoma who had undergone wedge resection, segmentectomy, or lobectomy. The bronchoalveolar carcinoma histologic findings were mucinous, nonmucinous, mixed, not otherwise specified, and alveolar carcinoma. To adjust for potential confounders, we used a Cox proportional hazards regression model. A total of 6810 patients met the inclusion criteria. Compared with the sublobar resections (wedge resections and segmentectomies), lobectomy conferred superior 5-year overall (59.5% vs 43.9%) and cancer-specific (67.1% vs 53.1%) survival (P < .0001). After adjusting for potential confounding patient and tumor characteristics, we found that patients who underwent an anatomic resection had significantly better overall (segmentectomy: hazard ratio, 0.59; 95% confidence interval, 0.43-0.81; lobectomy: hazard ratio, 0.50; 95% confidence interval, 0.44-0.57) and cancer-specific (segmentectomy: hazard ratio, 0.51; 95% confidence interval, 0.34-0.75; lobectomy: hazard ratio, 0.46; 95% confidence interval, 0.40-0.53) survival compared with patients who underwent wedge resection. Additionally, gender, race, tumor size, and degree of tumor de-differentiation were negative prognostic factors. Our results were unchanged when we limited our analysis to early-stage disease. Using a population-based data set, we found that anatomic resections for bronchoalveolar carcinoma conferred superior overall and cancer-specific survival rates compared with wedge resection. Bronchoalveolar carcinoma's propensity for intraparenchymal spread might be the underlying biologic basis of our observation of improved survival after anatomic resection. Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Budhathoki, Sanjeev; Hidaka, Akihisa; Sawada, Norie; Tanaka-Mizuno, Sachiko; Kuchiba, Aya; Charvat, Hadrien; Goto, Atsushi; Kojima, Satoshi; Sudo, Natsuki; Shimazu, Taichi; Sasazuki, Shizuka; Inoue, Manami; Tsugane, Shoichiro; Iwasaki, Motoki
2018-01-01
Objective: To evaluate the association between pre-diagnostic circulating vitamin D concentration and the subsequent risk of overall and site specific cancer in a large cohort study. Design: Nested case-cohort study within the Japan Public Health Center-based Prospective Study cohort. Setting: Nine public health centre areas across Japan. Participants: 3301 incident cases of cancer and 4044 randomly selected subcohort participants. Exposure: Plasma concentration of 25-hydroxyvitamin D measured by enzyme immunoassay. Participants were divided into quarters based on the sex and season specific distribution of 25-hydroxyvitamin D among subcohorts. Weighted Cox proportional hazard models were used to calculate the multivariable adjusted hazard ratios for overall and site specific cancer across categories of 25-hydroxyvitamin D concentration, with the lowest quarter as the reference. Main outcome measure: Incidence of overall or site specific cancer. Results: Plasma 25-hydroxyvitamin D concentration was inversely associated with the risk of total cancer, with multivariable adjusted hazard ratios for the second to fourth quarters compared with the lowest quarter of 0.81 (95% confidence interval 0.70 to 0.94), 0.75 (0.65 to 0.87), and 0.78 (0.67 to 0.91), respectively (P for trend=0.001). Among the findings for cancers at specific sites, an inverse association was found for liver cancer, with corresponding hazard ratios of 0.70 (0.44 to 1.13), 0.65 (0.40 to 1.06), and 0.45 (0.26 to 0.79) (P for trend=0.006). A sensitivity analysis showed that alternately removing cases of cancer at one specific site from total cancer cases did not substantially change the overall hazard ratios. Conclusions: In this large prospective study, higher vitamin D concentration was associated with lower risk of total cancer. These findings support the hypothesis that vitamin D has protective effects against cancers at many sites. PMID:29514781
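A minimal sketch of a weighted Cox fit for a case-cohort sample, assuming simulated data and a simple inverse-sampling-fraction weighting of non-case subcohort members (one of several possible case-cohort weighting schemes, not necessarily the one used in this study); it relies on lifelines' weights_col with robust standard errors.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)

# Hypothetical full cohort; in a real case-cohort design only cases and a
# random subcohort have the exposure measured, so we mimic that selection.
n_cohort = 20000
full = pd.DataFrame({
    "vitd_quarter": rng.integers(1, 5, n_cohort),   # 25(OH)D quarter, 1 = lowest
    "time": rng.exponential(15, n_cohort),
    "cancer": rng.binomial(1, 0.05, n_cohort),
})

subcohort = full.sample(n=2000, random_state=2)
cases = full[full.cancer == 1]
sample = pd.concat([subcohort, cases])
sample = sample[~sample.index.duplicated()]   # cases already in the subcohort

# Inverse sampling-fraction weights: cases weight 1, non-case subcohort
# members weight (cohort size / subcohort size). Illustration only.
frac = len(subcohort) / n_cohort
sample["w"] = np.where(sample.cancer == 1, 1.0, 1.0 / frac)

sample["low_vitd"] = (sample.vitd_quarter == 1).astype(int)
cph = CoxPHFitter()
cph.fit(sample[["time", "cancer", "low_vitd", "w"]],
        duration_col="time", event_col="cancer",
        weights_col="w", robust=True)
print(cph.hazard_ratios_)   # HR for lowest vs higher vitamin D quarters
```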
Sáez, María E; González-Pérez, Antonio; Johansson, Saga; Himmelmann, Anders; García Rodríguez, Luis A
2016-07-01
In secondary cardiovascular prevention, discontinuation of acetylsalicylic acid (ASA) is associated with an increased risk of cardiovascular events. This study assessed the impact of ASA reinitiation on the risk of myocardial infarction and coronary heart disease death. Patients prescribed ASA for secondary cardiovascular prevention and who had had a period of ASA discontinuation of ≥90 days in 2000-2007 were identified from The Health Improvement Network (N = 10,453). Incidence of myocardial infarction/coronary heart disease death was calculated. Survival analyses using adjusted Cox proportional hazard models were performed to calculate hazard ratios and 95% confidence intervals for the risk of myocardial infarction/coronary heart disease death associated with ASA use patterns after the initial period of discontinuation. Individuals who were prescribed ASA during follow-up were considered reinitiators. The incidence of myocardial infarction/coronary heart disease death was 8.90 cases per 1000 person-years. Risk of myocardial infarction/coronary heart disease death was similar for current ASA users, who had been continuously exposed since reinitiation, and patients who had not reinitiated ASA (hazard ratio 1.27, 95% confidence interval 0.93-1.73). Among reinitiators, an additional period of ASA discontinuation was associated with increased risk of myocardial infarction/coronary heart disease death compared with no reinitiation (current users: hazard ratio 1.46, 95% confidence interval 1.13-1.90; noncurrent users: hazard ratio 1.70, 95% confidence interval 1.31-2.21). ASA reinitiation was not associated with a decreased risk of myocardial infarction/coronary heart disease death. This may be explained by confounding by indication/comorbidity, whereby higher-risk patients are more likely to reinitiate therapy. An additional period of ASA discontinuation among reinitiators was associated with an increased risk of myocardial infarction/coronary heart disease death. © The European Society of Cardiology 2015.
Hayes, J F; Bhaskaran, K; Batterham, R; Smeeth, L; Douglas, I
2015-01-01
Background/Objectives: The marketing authorization for the weight loss drug sibutramine was suspended in 2010 following a major trial that showed increased rates of non-fatal myocardial infarction and cerebrovascular events in patients with pre-existing cardiovascular disease. In routine clinical practice, sibutramine was already contraindicated in patients with cardiovascular disease and so the relevance of these influential clinical trial findings to the ‘real World' population of patients receiving or eligible for the drug is questionable. We assessed rates of myocardial infarction and cerebrovascular events in a cohort of patients prescribed sibutramine or orlistat in the United Kingdom. Subjects/Methods: A cohort of patients prescribed weight loss medication was identified within the Clinical Practice Research Datalink. Rates of myocardial infarction or cerebrovascular event, and all-cause mortality were compared between patients prescribed sibutramine and similar patients prescribed orlistat, using both a multivariable Cox proportional hazard model, and propensity score-adjusted model. Possible effect modification by pre-existing cardiovascular disease and cardiovascular risk factors was assessed. Results: Patients prescribed sibutramine (N=23 927) appeared to have an elevated rate of myocardial infarction or cerebrovascular events compared with those taking orlistat (N=77 047; hazard ratio 1.69, 95% confidence interval 1.12–2.56). However, subgroup analysis showed the elevated rate was larger in those with pre-existing cardiovascular disease (hazard ratio 4.37, 95% confidence interval 2.21–8.64), compared with those with no cardiovascular disease (hazard ratio 1.52, 95% confidence interval 0.92–2.48, P-interaction=0.0076). All-cause mortality was not increased in those prescribed sibutramine (hazard ratio 0.67, 95% confidence interval 0.34–1.32). Conclusions: Sibutramine was associated with increased rates of acute cardiovascular events in people with pre-existing cardiovascular disease, but there was a low absolute risk in those without. Sibutramine's marketing authorization may have, therefore, been inappropriately withdrawn for people without cardiovascular disease. PMID:25971925
Yen, Amy Ming-Fang; Boucher, Barbara J; Chiu, Sherry Yueh-Hsia; Fann, Jean Ching-Yuan; Chen, Sam Li-Sheng; Huang, Kuo-Chin; Chen, Hsiu-Hsi
2016-08-02
Transgenerational effects of paternal Areca catechu nut chewing on offspring metabolic syndrome (MetS) risk in humans, on obesity and diabetes mellitus experimentally, and of paternal smoking on offspring obesity, are reported, likely attributable to genetic and epigenetic effects previously reported in betel-associated disease. We aimed to determine the effects of paternal smoking, and betel chewing, on the risks of early MetS in human offspring. The 13 179 parent-child trios identified from 238 364 Taiwanese aged ≥20 years screened at 2 community-based integrated screening sessions were tested for the effects of paternal smoking, areca nut chewing, and their duration prefatherhood on age of detecting offspring MetS at screen by using a Cox proportional hazards regression model. Offspring MetS risks increased with prefatherhood paternal areca nut usage (adjusted hazard ratio, 1.77; 95% confidence interval [CI], 1.23-2.53) versus nonchewing fathers (adjusted hazard ratio, 3.28; 95% CI, 1.67-6.43) with >10 years paternal betel chewing, 1.62 (95% CI, 0.88-2.96) for 5 to 9 years, and 1.42 (95% CI, 0.80-2.54) for <5 years betel usage prefatherhood (Ptrend=0.0002), with increased risk (adjusted hazard ratio, 1.95; 95% CI, 1.26-3.04) for paternal areca nut usage from 20 to 29 years of age, versus from >30 years of age (adjusted hazard ratio, 1.61; 95% CI, 0.22-11.69). MetS offspring risk for paternal smoking increased dosewise (Ptrend<0.0001) with earlier age of onset (Ptrend=0.0009), independently. Longer duration of paternal betel quid chewing and smoking, prefatherhood, independently predicted early occurrence of incident MetS in offspring, corroborating previously reported transgenerational effects of these habits, and supporting the need for habit-cessation program provision. © 2016 American Heart Association, Inc.
Damman, Peter; Wallentin, Lars; Fox, Keith A A; Windhausen, Fons; Hirsch, Alexander; Clayton, Tim; Pocock, Stuart J; Lagerqvist, Bo; Tijssen, Jan G P; de Winter, Robbert J
2012-01-31
The present study was designed to investigate the long-term prognostic impact of procedure-related and spontaneous myocardial infarction (MI) on cardiovascular mortality in patients with non-ST-elevation acute coronary syndrome. Five-year follow-up after procedure-related or spontaneous MI was investigated in the individual patient pooled data set of the FRISC-II (Fast Revascularization During Instability in Coronary Artery Disease), ICTUS (Invasive Versus Conservative Treatment in Unstable Coronary Syndromes), and RITA-3 (Randomized Intervention Trial of Unstable Angina 3) non-ST-elevation acute coronary syndrome trials. The principal outcome was cardiovascular death up to 5 years of follow-up. Cumulative event rates were estimated by the Kaplan-Meier method; hazard ratios were calculated with time-dependent Cox proportional hazards models. Adjustments were made for the variables associated with long-term outcomes. Among the 5467 patients, 212 experienced a procedure-related MI within 6 months after enrollment. A spontaneous MI occurred in 236 patients within 6 months. The cumulative cardiovascular death rate was 5.2% in patients who had a procedure-related MI, comparable to that for patients without a procedure-related MI (hazard ratio 0.66; 95% confidence interval, 0.36-1.20, P=0.17). In patients who had a spontaneous MI within 6 months, the cumulative cardiovascular death rate was 22.2%, higher than for patients without a spontaneous MI (hazard ratio 4.52; 95% confidence interval, 3.37-6.06, P<0.001). These hazard ratios did not change materially after risk adjustments. Five-year follow-up of patients with non-ST-elevation acute coronary syndrome from the 3 trials showed no association between a procedure-related MI and long-term cardiovascular mortality. In contrast, there was a substantial increase in long-term mortality after a spontaneous MI.
Buys, Roselien; Coeckelberghs, Ellen; Cornelissen, Véronique A; Goetschalckx, Kaatje; Vanhees, Luc
2016-09-01
Peak oxygen uptake is an independent predictor of mortality in patients with coronary artery disease (CAD). However, patients with CAD are not always capable of reaching peak effort, and therefore submaximal gas exchange variables such as the oxygen uptake efficiency slope (OUES) have been introduced. Baseline exercise capacity as expressed by OUES provides prognostic information and this parameter responds to training. Therefore, we aimed to assess the prognostic value of post-training OUES in patients with CAD. We included 960 patients with CAD (age 60.6 ± 9.5 years; 853 males) who completed a cardiac rehabilitation program between 2000 and 2011. The OUES was calculated before and after cardiac rehabilitation and information on mortality was obtained. The relationships of post-training OUES with all-cause and cardiovascular (CV) mortality were assessed by Cox proportional hazards regression analyses. Receiver operating characteristic curve analysis was performed in order to obtain the optimal cut-off value. During 7.37 ± 3.20 years of follow-up (range: 0.45-13.75 years), 108 patients died, among whom 47 died due to CV reasons. The post-training OUES was related to all-cause (hazard ratio: 0.50, p < 0.001) and CV (hazard ratio: 0.40, p < 0.001) mortality. When significant covariates, including baseline OUES, were entered into the Cox regression analysis, post-training OUES remained related to all-cause and CV mortality (hazard ratio: 0.40, p < 0.01 and 0.26, p < 0.01, respectively). In addition, the change in OUES due to exercise training was related to mortality (hazard ratio: 0.49, p < 0.01). Post-training OUES has stronger prognostic value compared to baseline OUES. The lack of improvement in exercise capacity expressed by OUES after an exercise training program relates to a worse prognosis and can help distinguish patients with favorable and unfavorable prognoses. © The European Society of Cardiology 2016.
Jonkman, Nini H; Westland, Heleen; Groenwold, Rolf H H; Ågren, Susanna; Atienza, Felipe; Blue, Lynda; Bruggink-André de la Porte, Pieta W F; DeWalt, Darren A; Hebert, Paul L; Heisler, Michele; Jaarsma, Tiny; Kempen, Gertrudis I J M; Leventhal, Marcia E; Lok, Dirk J A; Mårtensson, Jan; Muñiz, Javier; Otsu, Haruka; Peters-Klimm, Frank; Rich, Michael W; Riegel, Barbara; Strömberg, Anna; Tsuyuki, Ross T; van Veldhuisen, Dirk J; Trappenburg, Jaap C A; Schuurmans, Marieke J; Hoes, Arno W
2016-03-22
Self-management interventions are widely implemented in the care for patients with heart failure (HF). However, trials show inconsistent results, and whether specific patient groups respond differently is unknown. This individual patient data meta-analysis assessed the effectiveness of self-management interventions in patients with HF and whether subgroups of patients respond differently. A systematic literature search identified randomized trials of self-management interventions. Data from 20 studies, representing 5624 patients, were included and analyzed with the use of mixed-effects models and Cox proportional-hazard models, including interaction terms. Self-management interventions reduced the risk of time to the combined end point of HF-related hospitalization or all-cause death (hazard ratio, 0.80; 95% confidence interval [CI], 0.71-0.89), time to HF-related hospitalization (hazard ratio, 0.80; 95% CI, 0.69-0.92), and improved 12-month HF-related quality of life (standardized mean difference, 0.15; 95% CI, 0.00-0.30). Subgroup analysis revealed a protective effect of self-management on the number of HF-related hospital days in patients <65 years of age (mean, 0.70 versus 5.35 days; interaction P=0.03). Patients without depression did not show an effect of self-management on survival (hazard ratio for all-cause mortality, 0.86; 95% CI, 0.69-1.06), whereas in patients with moderate/severe depression, self-management reduced survival (hazard ratio, 1.39; 95% CI, 1.06-1.83, interaction P=0.01). This study shows that self-management interventions had a beneficial effect on time to HF-related hospitalization or all-cause death and HF-related hospitalization alone and elicited a small increase in HF-related quality of life. The findings do not endorse limiting self-management interventions to subgroups of patients with HF, but increased mortality in depressed patients warrants caution in applying self-management strategies in these patients. © 2016 American Heart Association, Inc.
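A minimal sketch of testing a treatment-by-subgroup interaction in a Cox model, as in the depression subgroup analysis described above; the data, variable names, and effect sizes are hypothetical, and the pooled one-stage individual patient data structure (study-level stratification, mixed effects) is not modelled here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)

# Hypothetical pooled data: 'selfmgmt' is the intervention arm, 'depressed'
# flags moderate/severe depression at baseline.
n = 5000
ipd = pd.DataFrame({
    "selfmgmt": rng.integers(0, 2, n),
    "depressed": rng.binomial(1, 0.25, n),
    "time": rng.exponential(24, n),     # months to death/censoring (assumed)
    "died": rng.binomial(1, 0.2, n),
})

# Treatment-by-subgroup interaction, coded explicitly as a product term.
ipd["selfmgmt_x_depressed"] = ipd["selfmgmt"] * ipd["depressed"]

cph = CoxPHFitter()
cph.fit(ipd[["time", "died", "selfmgmt", "depressed", "selfmgmt_x_depressed"]],
        duration_col="time", event_col="died")
cph.print_summary()

# HR for self-management in non-depressed patients: exp(beta_selfmgmt);
# in depressed patients: exp(beta_selfmgmt + beta_interaction).
beta = cph.params_
print("HR (no depression):", np.exp(beta["selfmgmt"]))
print("HR (depression):   ", np.exp(beta["selfmgmt"] + beta["selfmgmt_x_depressed"]))
```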
Law, Sabrina P; Oron, Assaf P; Kemna, Mariska S; Albers, Erin L; McMullan, D Michael; Chen, Jonathan M; Law, Yuk M
2018-05-01
Ventricular assist devices have gained popularity in the management of refractory heart failure in children listed for heart transplantation. Our primary aim was to compare the composite endpoint of all-cause pretransplant mortality and loss of transplant eligibility in children who were treated with a ventricular assist device versus a medically managed cohort. This was a retrospective cohort analysis. Data were obtained from the Scientific Registry of Transplant Recipients. The at-risk population (n = 1,380) was less than 18 years old, either on a ventricular assist device (605 cases) or an equivalent-severity, intensively medically treated group (referred to as MED, 775 cases). None. The impact of ventricular assist devices was estimated via Cox proportional hazards regression (hazard ratio), dichotomizing 1-year outcomes to "poor" (22%: 193 deaths, 114 too sick) versus all others (940 successful transplants, 41 too healthy, 90 censored), while adjusting for conventional risk factors. Among children 0-12 months old, ventricular assist device was associated with a higher risk of poor outcomes (hazard ratio, 2.1; 95% CI, 1.5-3.0; p < 0.001). By contrast, ventricular assist device was associated with improved outcomes for ages 12-18 (hazard ratio, 0.3; 95% CI, 0.1-0.7; p = 0.003). For candidates 1-5 and 6-11 years old, there were no differences in outcomes between the ventricular assist device and MED groups (hazard ratio, 0.8 and 1.0, p = 0.43 and 0.9). The interaction between ventricular assist devices and age group was strongly significant (p < 0.001). This is a comparative study of ventricular assist devices versus medical therapy in children. Age is a significant modulator of waitlist outcomes for children with end-stage heart failure supported by ventricular assist device, with the impact of ventricular assist devices being more beneficial in adolescents.
Chang, Fung-Wei; Chiu, Feng-Hsiang; Yeh, Chia-Lun; Huang, Chun-Fa; Chang, Shu-Ting; Lee, Hung-Chang; Chi, Hsin; Lin, Chien-Yu
2017-01-01
Objectives: Scabies is a common and annoying disorder. Pernicious anemia (PA) is a serious disease which, when untreated, leads to death. Mounting evidence suggests that immune-mediated inflammatory processes play a role in the pathophysiology of both diseases. The relationship between these two diseases has not been investigated. We conducted this study to explore the potential relationship between scabies and PA. Materials and methods: This nationwide, population-based study was conducted using the National Health Insurance Research Database of Taiwan. In total, 5,407 patients with scabies were identified as a study group and 20,089 matched patients were randomly selected as a control group. We tracked patients in both groups for a 7-year period to identify the incidence of PA. The demographic characteristics and comorbidities of the patients were analyzed, and Cox proportional hazards regression was used to calculate the hazard ratios for PA. Results: Of the 25,496 patients in this study, 183 (0.7%) patients with newly diagnosed PA were identified during the 7-year follow-up period; 71 of 5,407 (1.3%) from the scabies group and 112 of 20,089 (0.6%) from the control group. Patients with scabies had a higher risk of subsequent PA, with a crude hazard ratio of 2.368. After adjusting for covariates, the adjusted hazard ratio was 1.51 (95% confidence interval: 1.09–2.08). Conclusion: This study demonstrated an increased risk of PA (adjusted hazard ratio 1.51) among patients with scabies. Immune-mediated inflammatory processes may contribute to this association. Further studies are warranted to investigate the pathological mechanisms linking these two diseases. Physicians should pay attention to patients with a history of scabies who present with anemia. Further confirmative tests for PA may contribute to correct diagnosis and initiation of vitamin B12 supplementation. PMID:29066901
Serum Uromodulin: A Biomarker of Long-Term Kidney Allograft Failure.
Bostom, Andrew; Steubl, Dominik; Garimella, Pranav S; Franceschini, Nora; Roberts, Mary B; Pasch, Andreas; Ix, Joachim H; Tuttle, Katherine R; Ivanova, Anastasia; Shireman, Theresa; Kim, S Joseph; Gohh, Reginald; Weiner, Daniel E; Levey, Andrew S; Hsu, Chi-Yuan; Kusek, John W; Eaton, Charles B
2018-01-01
Uromodulin is a kidney-derived glycoprotein and putative tubular function index. Lower serum uromodulin was recently associated with increased risk for kidney allograft failure in a preliminary, longitudinal single-center European study involving 91 kidney transplant recipients (KTRs). The Folic Acid for Vascular Outcome Reduction in Transplantation (FAVORIT) trial is a completed, large, multiethnic controlled clinical trial cohort, which studied chronic, stable KTRs. We conducted a case-cohort analysis using a randomly selected subset of patients (random subcohort, n = 433), and all individuals who developed kidney allograft failure (cases, n = 226) during follow-up. Serum uromodulin was determined in this total of n = 613 FAVORIT trial participants at randomization. Death-censored kidney allograft failure was the study outcome. The 226 kidney allograft failures occurred during a median surveillance of 3.2 years. Unadjusted, weighted Cox proportional hazards modeling revealed that lower serum uromodulin, tertile 1 vs. tertile 3, was associated with a threefold greater risk for kidney allograft failure (hazard ratio [HR] 3.20, 95% CI 2.05-5.01). This association was attenuated but persisted at twofold greater risk for allograft failure, after adjustment for age, sex, smoking, allograft type and vintage, prevalent diabetes mellitus and cardiovascular disease (CVD), total/high-density lipoprotein cholesterol ratio, systolic blood pressure, estimated glomerular filtration rate, and natural log urinary albumin/creatinine: HR 2.00, 95% CI 1.06-3.77. Lower serum uromodulin, a possible indicator of less well-preserved renal tubular function, remained associated with greater risk for kidney allograft failure, after adjustment for major, established clinical kidney allograft failure and CVD risk factors, in a large, multiethnic cohort of long-term, stable KTRs. © 2018 S. Karger AG, Basel.
Dietary fat, fat subtypes and hepatocellular carcinoma in a large European cohort.
Duarte-Salles, Talita; Fedirko, Veronika; Stepien, Magdalena; Aleksandrova, Krasimira; Bamia, Christina; Lagiou, Pagona; Laursen, Anne Sofie Dam; Hansen, Louise; Overvad, Kim; Tjønneland, Anne; Boutron-Ruault, Marie-Christine; Fagherazzi, Guy; His, Mathilde; Boeing, Heiner; Katzke, Verena; Kühn, Tilman; Trichopoulou, Antonia; Valanou, Elissavet; Kritikou, Maria; Masala, Giovanna; Panico, Salvatore; Sieri, Sabina; Ricceri, Fulvio; Tumino, Rosario; Bueno-de-Mesquita, H B As; Peeters, Petra H; Hjartåker, Anette; Skeie, Guri; Weiderpass, Elisabete; Ardanaz, Eva; Bonet, Catalina; Chirlaque, Maria-Dolores; Dorronsoro, Miren; Quirós, J Ramón; Johansson, Ingegerd; Ohlsson, Bodil; Sjöberg, Klas; Wennberg, Maria; Khaw, Kay-Tee; Travis, Ruth C; Wareham, Nick; Ferrari, Pietro; Freisling, Heinz; Romieu, Isabelle; Cross, Amanda J; Gunter, Marc; Lu, Yunxia; Jenab, Mazda
2015-12-01
The role of amount and type of dietary fat consumption in the etiology of hepatocellular carcinoma (HCC) is poorly understood, despite suggestive biological plausibility. The associations of total fat, fat subtypes and fat sources with HCC incidence were investigated in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, which includes 191 incident HCC cases diagnosed between 1992 and 2010. Diet was assessed by country-specific, validated dietary questionnaires. A single 24-hr diet recall from a cohort subsample was used for measurement error calibration. Hazard ratios (HR) and 95% confidence intervals (95% CI) were estimated from Cox proportional hazard models. Hepatitis B and C viruses (HBV/HCV) status and biomarkers of liver function were assessed separately in a nested case-control subset with available blood samples (HCC = 122). In multivariable calibrated models, there was a statistically significant inverse association between total fat intake and risk of HCC (per 10 g/day, HR = 0.80, 95% CI: 0.65-0.99), which was mainly driven by monounsaturated fats (per 5 g/day, HR = 0.71, 95% CI: 0.55-0.92) rather than polyunsaturated fats (per 5 g/day, HR = 0.92, 95% CI: 0.68-1.25). There was no association between saturated fats (HR = 1.08, 95% CI: 0.88-1.34) and HCC risk. The ratio of polyunsaturated/monounsaturated fats to saturated fats was not significantly associated with HCC risk (per 0.2 point, HR = 0.86, 95% CI: 0.73-1.01). Restriction of analyses to HBV/HCV free participants or adjustment for liver function did not substantially alter the findings. In this large prospective European cohort, higher consumption of monounsaturated fats is associated with lower HCC risk. © 2015 UICC.
Lim, Wai H; McDonald, Stephen P; Russ, Graeme R; Chapman, Jeremy R; Ma, Maggie Km; Pleass, Henry; Jaques, Bryon; Wong, Germaine
2017-06-01
Delayed graft function (DGF) is an established complication after donation after cardiac death (DCD) kidney transplants, but the impact of DGF on graft outcomes is uncertain. To minimize donor variability and bias, a paired donor kidney analysis was undertaken, using data from the Australia and New Zealand Dialysis and Transplant Registry, in which 1 kidney developed DGF and the other did not. Using these paired DCD kidney data, we examined the association between DGF and graft and patient outcomes between 1994 and 2012 using adjusted Cox regression models. Of the 74 pairs of DCD kidneys followed for a median of 1.9 years (408 person-years), a greater proportion of recipients with DGF had experienced overall graft loss and death-censored graft loss at 3 years compared with those without DGF (14% vs 4%, P = 0.04 and 11% vs 0%, P < 0.01, respectively). Compared with recipients without DGF, the adjusted hazard ratio for overall graft loss at 3 years for recipients with DGF was 4.31 (95% confidence interval [95% CI], 1.13-16.44). The adjusted hazard ratios for acute rejection and all-cause mortality at 3 years in recipients who experienced DGF were 0.98 (95% CI, 0.96-1.01) and 1.70 (95% CI, 0.36-7.93), respectively, compared with recipients without DGF. Recipients of DCD kidneys with DGF experienced a higher incidence of overall and death-censored graft loss compared with those without DGF. Strategies aimed at reducing the risk of DGF could potentially improve graft survival in DCD kidney transplants.
The effectiveness of graded activity for low back pain in occupational healthcare.
Steenstra, I A; Anema, J R; Bongers, P M; de Vet, H C W; Knol, D L; van Mechelen, W
2006-11-01
Low back pain is a common medical and social problem associated with disability and absence from work. Knowledge on effective return to work (RTW) interventions is scarce. To determine the effectiveness of graded activity as part of a multistage RTW programme. Randomised controlled trial. Occupational healthcare. 112 workers absent from work for more than eight weeks due to low back pain were randomised to either graded activity (n = 55) or usual care (n = 57). Graded activity, a physical exercise programme aimed at RTW based on operant-conditioning behavioural principles. The number of days off work until first RTW lasting more than 28 days, total number of days on sick leave during follow-up, functional status, and severity of pain. Follow-up was 26 weeks. Graded activity prolonged RTW. Median time until RTW was equal to the total number of days on sick leave and was 139 (IQR = 69) days in the graded activity group and 111 (IQR = 76) days in the usual care group (hazard ratio = 0.52, 95% CI 0.32 to 0.86). An interaction between a prior workplace intervention and graded activity, together with a delay in the start of the graded activity intervention, explained most of the delay in RTW (hazard ratio = 0.86, 95% CI 0.40 to 1.84 without prior intervention and 0.39, 95% CI 0.19 to 0.81 with prior intervention). Graded activity did not improve pain or functional status clinically significantly. Graded activity was not effective for any of the outcome measures. Different interventions combined can lead to a delay in RTW. Delay in referral to graded activity delays RTW. In implementing graded activity, special attention should be paid to the structure and process of care.
Gill, Jagbir; Dong, Jianghu; Rose, Caren; Gill, John S
2016-06-01
Concern about the long-term impact of delayed graft function (DGF) may limit the use of high-risk organs for kidney transplantation. To understand this better, we analyzed 29,598 mate kidney transplants from the same deceased donor where only 1 transplant developed DGF. The DGF-associated risk of graft failure was greatest in the first posttransplant year, and in patients with concomitant acute rejection (hazard ratio: 8.22, 95% confidence interval: 4.76-14.21). In contrast, the DGF-associated risk of graft failure after the first posttransplant year in patients without acute rejection was far lower (hazard ratio: 1.15, 95% confidence interval: 1.02-1.29). In subsequent analysis, recipients of transplants complicated by DGF still derived a survival benefit when compared with patients who received treatment with dialysis irrespective of donor quality as measured by the Kidney Donor Profile Index (KDPI). The difference in the time required to derive a survival benefit was longer in transplants with DGF than in transplants without DGF, and this difference was greatest in recipients of lower quality kidneys (difference: 250-279 days for KDPI 20%-60% vs. 809 days for KDPI over 80%). Thus, the association of DGF with graft failure is primarily limited to the first posttransplant year. Transplants complicated by DGF provide a survival benefit compared to treatment with dialysis, but the survival benefit is lower in kidney transplants with higher KDPI (lower-quality kidneys). This information may increase acceptance of kidneys at high risk for DGF and inform strategies to minimize the risk of death in the setting of DGF. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
Association of Physical Activity History With Physical Function and Mortality in Old Age
Koster, Annemarie; Valkeinen, Heli; Patel, Kushang V.; Bandinelli, Stefania; Guralnik, Jack M.; Ferrucci, Luigi
2016-01-01
Background. We examined whether physical activity in early adulthood, late midlife, and old age as well as cumulative physical activity history are associated with changes in physical functioning and mortality in old age. Methods. Data are from participants aged 65 years or older enrolled in the InCHIANTI study who were followed up from 1998–2000 to 2007–2008 (n = 1,149). At baseline, participants recalled their physical activity levels at ages 20–40, 40–60, and in the previous year, and they were categorized as physically inactive, moderately active, and physically active. Physical performance was assessed with the Short Physical Performance Battery and self-reported mobility disability was evaluated at the 3-, 6- and 9-year follow-up. Mortality follow-up was assessed until the end of 2010. Results. Physical inactivity at baseline was associated with greater decline in Short Physical Performance Battery score (mean 9-year change: −2.72, 95% CI: −3.08, −2.35 vs −0.98, 95% CI: −1.57, −0.39) and greater rate of incident mobility disability (hazard ratio 4.66, 95% CI 1.14–19.07) and mortality (hazard ratio 2.18, 95% CI 1.01–4.70) compared to physically active participants at baseline. Being physically active throughout adulthood was associated with smaller decline in physical performance as well as with lower risk of incident mobility disability and premature death compared with those who had been less active during their adult life. Conclusions. Higher cumulative physical activity over the life course was associated with less decline in physical performance and reduced rate of incident mobility disability and mortality in older ages. PMID:26290538
Chan, Y-H; Lau, K-K; Yiu, K-H; Siu, C-W; Chan, H-T; Li, S-W; Tam, S; Lam, T-H; Lau, C-P; Tse, H-F
2012-04-01
Whether isoflavone has any effect on recurrent cardiovascular events is unknown. To investigate the relations between isoflavone intake and the risk of stroke recurrence, we recruited 127 consecutive patients with a prior history of atherothrombotic/hemorrhagic stroke (mean age: 67 ± 11 years, 69% male) and prospectively followed them for a mean duration of 30 months. Stroke recurrence and major adverse cardiovascular events (MACE) were documented. Brachial flow-mediated dilatation (FMD) was measured using high-resolution ultrasound. Isoflavone intake was estimated using a validated food frequency questionnaire. Median isoflavone intake was 6.9 (range: 2.1 - 14.5) mg/day. Isoflavone intake was independently associated with increased FMD (Pearson r=0.23, p=0.012). At 30 months, there were 10 stroke recurrences and 12 MACE. Kaplan-Meier analysis showed that patients with isoflavone intake higher than the median value had significantly longer median stroke recurrence-free survival time (19.0 [range: 10.4 - 27.6] months versus 5.0 [range: 4.1 - 5.9] months, p=0.021) and MACE-free survival time (19.0 [range: 10.4 - 27.6] months versus 4.0 [range: 2.4 - 5.6] months, p=0.013). In multivariate Cox regression, higher isoflavone intake was an independent predictor of lower risk of stroke recurrence (hazard ratio 0.18 [95%CI: 0.03 - 0.95], risk reduction 82%, p=0.043) and MACE (hazard ratio 0.16 [95%CI: 0.03 - 0.84], risk reduction 84%, p=0.030). Higher isoflavone intake in stroke patients was associated with prolonged recurrence-free survival and reduced risk of stroke recurrence and MACE, independent of baseline vascular function. Whether isoflavone may confer clinically significant secondary protection in stroke patients should be further investigated in a randomized controlled trial.
Wang, Jiajun; Liu, Li; Qu, Yang; Xi, Wei; Xia, Yu; Bai, Qi; Xiong, Ying; Long, Qilai; Xu, Jiejie; Guo, Jianming
2018-01-01
Classical HLA class I antigen is highly involved in antigen presentation and the adaptive immune response against tumors. In this study, we explored its predictive value for treatment response and survival in metastatic renal-cell carcinoma (mRCC) patients. A TKI cohort of 111 mRCC patients treated with sunitinib or sorafenib and a non-TKI cohort of 160 mRCC patients treated with interleukin-2 or interferon-α-based immunotherapy at a single institution were retrospectively enrolled. HLA class I expression and cytotoxic T lymphocyte (CTL) density were assessed by immunohistochemistry on tissue microarrays. The association between HLA class I and CTL density was also assessed in the TCGA KIRC cohort. In the TKI cohort, down-regulated HLA class I was associated with a lower objective response rate to TKI therapy (P = 0.004), shorter overall survival (OS) (P = 0.001), and shorter progression-free survival (PFS) (P < 0.001). A multivariate Cox regression model identified HLA expression as an independent prognostic factor for both OS [hazard ratio 1.687 (95% CI 1.045-2.724), P = 0.032] and PFS [hazard ratio 2.139 (95% CI 1.376-3.326), P = 0.001]. In the non-TKI cohort, HLA class I was not significantly associated with survival. HLA class I expression was associated with CTL infiltration and function, and its prognostic value was more predominant in CTL high-density tumors (P < 0.001) than in CTL low-density tumors (P = 0.294). Classical HLA class I expression can serve as a potential predictive biomarker for TKI therapy in mRCC patients. Its predictive value was restricted to CTL high-density tumors. However, further external validations and functional investigations are still required.
Pulmonary Function, Muscle Strength, and Incident Mobility Disability in Elders
Buchman, Aron S.; Boyle, Patricia A.; Leurgans, Sue E.; Evans, Denis A.; Bennett, David A.
2009-01-01
Measures of muscle strength, including leg strength and respiratory muscle strength, are relatively independently associated with mobility disability in elders. However, the factors linking muscle strength with mobility disability are unknown. To test the hypothesis that pulmonary function mediates the association of muscle strength with the development of mobility disability in elders, we used data from a longitudinal cohort study of 844 ambulatory elders without dementia participating in the Rush Memory and Aging Project with a mean follow-up of 4.0 years (SD = 1.39). A composite measure of pulmonary function was based on spirometric measures of forced vital capacity, forced expiratory volume, and peak expiratory flow. Respiratory muscle strength was based on maximal inspiratory and expiratory pressures, and leg strength on hand-held dynamometry. Mobility disability was defined as a gait speed less than or equal to 0.55 m/s based on annual assessment of timed walk. Secondary analyses considered time to loss of the ability to ambulate. In separate proportional hazards models which controlled for age, sex, and education, composite measures of pulmonary function, respiratory muscle strength, and leg strength were each associated with incident mobility disability (all P values < 0.001). Further, all three were related to the development of incident mobility disability when considered together in a single model (pulmonary function: hazard ratio [HR], 0.721; 95% confidence interval [CI], 0.577, 0.902; respiratory muscle strength: HR, 0.732; 95% CI, 0.593, 0.905; leg strength: HR, 0.791; 95% CI, 0.640, 0.976). Secondary analyses examining incident loss of the ability to ambulate revealed similar findings. Overall, these findings suggest that lower levels of pulmonary function and muscle strength are relatively independently associated with the development of mobility disability in the elderly. PMID:19934353
Jahangiri, Younes; Kerrigan, Timothy; Li, Lei; Prosser, Dominik; Brar, Anantnoor; Righetti, Johnathan; Schenning, Ryan C; Kaufman, John A; Farsad, Khashayar
2017-12-01
To identify risk factors of stent graft thrombosis after transjugular intrahepatic portosystemic shunt (TIPS) creation, patients who underwent TIPS creation between June 2003 and January 2016 and who had follow-up assessing stent graft patency were included (n=174). Baseline comorbidities, liver function, procedural details, and follow-up liver function tests were analyzed in association with hazards of thrombosis on follow-up. Competing-risk Cox regression models were used, with liver transplant after TIPS creation treated as the competing risk. One-, 2- and 5-year primary patency rates were 94.1%, 91.7% and 78.2%, respectively. Patient age [sub-hazard ratio (sHR): 1.13; P=0.001], body mass index (BMI) <30 (sHR: 33.08; P=0.008) and a higher post-TIPS portosystemic pressure gradient (sHR: 1.14; P=0.023) were significantly associated with TIPS thrombosis in multivariate analysis. A higher rate of TIPS thrombosis was observed in those for whom the procedure was clinically unsuccessful (P=0.014). A significant increase in incidence of thrombosis was noted with increasing tertiles of post-TIPS portosystemic gradients (P value for trend=0.017). Older age, lower BMI and higher post-TIPS portosystemic gradients were associated with higher hazards of shunt thrombosis after TIPS creation using stent grafts. Higher rates of shunt thrombosis were seen in patients for whom TIPS creation was clinically unsuccessful. The association between TIPS thrombosis and higher post-TIPS portosystemic gradients may indicate impaired flow through the shunt, a finding which may be technical or anatomic in nature and should be assessed before procedure completion.
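The sub-hazard ratios reported above come from a competing-risk (Fine-Gray-type) model. As a simpler, related illustration only, the sketch below fits a cause-specific Cox model in which liver transplant is treated as censoring; the input file and column names are hypothetical, and a true subdistribution-hazard analysis would require a dedicated competing-risks implementation.

```python
# Cause-specific Cox sketch: stent graft thrombosis is the event of interest;
# liver transplant and administrative censoring both count as non-events here.
# This is NOT the Fine-Gray subdistribution model behind the abstract's sHRs.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("tips_followup.csv")  # hypothetical input file
df["thrombosis"] = (df["event_type"] == "thrombosis").astype(int)

cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "thrombosis", "age", "bmi_lt_30", "post_tips_gradient"]],
    duration_col="followup_days",
    event_col="thrombosis",
)
cph.print_summary()
```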
Tuliani, Tushar A; Shenoy, Maithili; Belgrave, Kevin; Deshmukh, Abhishek; Pant, Sadip; Hilliard, Anthony; Afonso, Luis
2017-09-01
Studies suggest that subclinical hypothyroidism (SCH) is related to cardiovascular mortality (CVM). We explored the role of microalbuminuria (MIA) as a predictor of long-term CVM in populations with and without SCH with normal kidney function. We examined the Third National Health and Nutrition Examination Survey (NHANES III) database (n = 6,812). Individuals younger than 40 years, with thyroid-stimulating hormone levels ≥20 or ≤0.35 mIU/L, estimated glomerular filtration rate <60 mL/minute/1.73 m2, or urine albumin-to-creatinine ratio of >250 mg/g in men and >355 mg/g in women were excluded. SCH was defined as thyroid-stimulating hormone levels between 5 and 19.99 mIU/L and serum T4 levels between 5 and 12 µg/dL. MIA was defined as a urine albumin-to-creatinine ratio of 17-250 mg/g in men and 25-355 mg/g in women. Patients were categorized into the following 4 groups: (1) no SCH or MIA, (2) MIA, but no SCH, (3) SCH, but no MIA and (4) both SCH and MIA. Prevalence of MIA in the subclinical hypothyroid cohort was 21% compared to 16.4% in those without SCH (P = 0.03). SCH was a significant independent predictor of MIA (n = 6,812), after adjusting for traditional risk factors (unadjusted odds ratio = 1.75; 95% CI: 1.24-2.48; P = 0.002 and adjusted odds ratio = 1.83; 95% CI: 1.2-2.79; P = 0.006). MIA was a significant independent predictor of long-term all-cause mortality (adjusted hazard ratio = 1.7, 95% CI: 1.24-2.33) and CVM (adjusted hazard ratio = 1.72, 95% CI: 1.07-2.76) in subclinical hypothyroid individuals. In a cohort of subclinical hypothyroid individuals, the presence of MIA predicts increased risk of CVM compared with nonmicroalbuminuric individuals with SCH. Further randomized trials are needed to assess the benefits of treating microalbuminuric subclinical hypothyroid individuals and the impact on CVM. Copyright © 2017 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
Impact of scalp location on survival in head and neck melanoma: A retrospective cohort study.
Xie, Charles; Pan, Yan; McLean, Catriona; Mar, Victoria; Wolfe, Rory; Kelly, John
2017-03-01
Scalp melanomas have more aggressive clinicopathological features than other melanomas and mortality rates more than twice those of melanomas located elsewhere. We sought to describe the survival of patients with scalp melanoma versus other cutaneous head and neck melanoma (CHNM), and to explore a possible independent negative impact of scalp location on CHNM survival. A retrospective cohort study was performed of all invasive primary CHNM cases seen at a tertiary referral center over a 20-year period. Melanoma-specific survival (MSS) was compared between scalp melanoma and other invasive CHNM. Multivariable Cox proportional hazards regression was performed to determine associations with survival. On univariate analysis, patients with scalp melanoma had worse MSS than those with other CHNM (hazard ratio 2.22, 95% confidence interval 1.59-3.11). Scalp location was not associated with MSS in CHNM on multivariable analysis (hazard ratio 1.11, 95% confidence interval 0.77-1.61) for all tumors together, but remained independently associated with MSS for the 0.76- to 1.50-mm thickness stratum (hazard ratio 5.51, 95% confidence interval 1.55-19.59). Disease recurrence was not assessed because of unavailable data. The poorer survival of scalp melanoma is largely explained by greater Breslow thickness and a higher proportion of male patients. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
Weight Cycling and Cancer Incidence in a Large Prospective US Cohort
Stevens, Victoria L.; Jacobs, Eric J.; Patel, Alpa V.; Sun, Juzhong; McCullough, Marjorie L.; Campbell, Peter T.; Gapstur, Susan M.
2015-01-01
Weight cycling, which consists of repeated cycles of intentional weight loss and regain, is common among individuals who try to lose weight. Some evidence suggests that weight cycling may affect biological processes that could contribute to carcinogenesis, but whether it is associated with cancer risk is unclear. Using 62,792 men and 69,520 women enrolled in the Cancer Prevention Study II Nutrition Cohort in 1992, we examined the association between weight cycling and cancer incidence. Weight cycles were defined by using baseline questions that asked the number of times ≥10 pounds (4.54 kg) was purposely lost and later regained. Multivariable-adjusted hazard ratios and 95% confidence intervals for all cancer and 15 individual cancers were estimated by using Cox proportional hazards regression. During up to 17 years of follow-up, 15,333 men and 9,984 women developed cancer. Weight cycling was not associated with overall risk of cancer in men (hazard ratio = 0.96, 95% confidence interval: 0.83, 1.11 for ≥20 cycles vs. no weight cycles) or women (hazard ratio = 0.96, 95% confidence interval: 0.86, 1.08) in models that adjusted for body mass index and other covariates. Weight cycling was also not associated with any individual cancer investigated. These results suggest that weight cycling, independent of body weight, is unlikely to influence subsequent cancer risk. PMID:26209523
Lin, Chien-Yu; Chang, Fung-Wei; Yang, Jing-Jung; Chang, Chun-Hung; Yeh, Chia-Lun; Lei, Wei-Te; Huang, Chun-Fa; Liu, Jui-Ming; Hsu, Ren-Jun
2017-11-01
Both scabies and bipolar disorder (BD) are common and troublesome disorders. There are several similarities in both diseases: pruritus, a higher prevalence in crowded environments, and cytokine-mediated inflammatory processes in the pathophysiology. We conducted this nationwide population-based study to investigate the possible relationship between scabies and BD. Based on the National Health Insurance Research Database (NHIRD) of Taiwan, a total of 7096 patients with scabies were identified as a study group and 28,375 matched patients as a control. We tracked the patients in both groups for a 7-year period to identify those newly diagnosed with BD. The demographic characteristics and comorbidities of the patients were analyzed, and Cox proportional hazard regressions were performed to calculate the hazard ratio (HR) of BD. Of the 35,471 patients in this study, 183 (0.5%) patients with newly diagnosed BD were identified, with 58 (0.8%) from the scabies group and 125 (0.4%) from the control group. The patients with scabies had a higher risk of subsequent BD, with a crude hazard ratio of 1.86 and an adjusted hazard ratio of 1.55 (95% confidence interval: 1.12-2.09, P < 0.05). This study shows there is an increased risk for BD among patients with scabies. Immunopathology may contribute to this association. Copyright © 2017 Elsevier B.V. All rights reserved.
Li, Wen; Wang, Lei; Xiong, Chang-Ming; Yang, Tao; Zhang, Yan; Gu, Qing; Yang, Yong; Ni, Xin-Hai; Liu, Zhi-Hong; Fang, Wei; He, Jian-Guo
2015-11-01
Metabolic changes occur in the right ventricle (RV) under increased afterload in pulmonary arterial hypertension. FDG PET imaging has the potential to assess RV function. In this study, we aimed to determine the prognostic value of metabolic changes of the RV using FDG PET imaging in idiopathic pulmonary arterial hypertension (IPAH). In this prospective investigation, patients newly diagnosed with IPAH were recruited. Patients underwent right heart catheterization, FDG PET imaging, and cardiac MR (CMR) within 1 week. Right ventricular hemodynamics, glucose metabolism derived from FDG uptake levels, and functional parameters were obtained. The FDG uptake ratio between the RV and the left ventricle (LV) and its relation with the patients' survival were analyzed. A total of 45 IPAH patients were enrolled in this study, including 13 men (28.9%) and 32 women (71.1%). The median follow-up time of this study was 1043 days. At the end of the follow-up, 36 patients survived, whereas 9 patients had died of right heart failure. Multivariate Cox proportional hazard analysis showed that the ratio between the corrected RV and LV FDG uptake (cRV/LV) in both glucose-loading (cRV/LVg) and fasting (cRV/LVf) conditions independently predicted mortality after adjusting for pulmonary vascular resistance index, mean right atrial pressure, and World Health Organization functional class. Kaplan-Meier survival analysis showed that patients with cRV/LVf greater than 143.65% in the fasting condition (log rank, P = 0.030) or cRV/LVg greater than 120.55% in the glucose-loading condition (log rank, P = 0.014) had worse prognosis. The FDG uptake ratio between the RV and LV can be an independent predictor of long-term prognosis in IPAH patients.
Relationship between Clinic and Ambulatory Blood-Pressure Measurements and Mortality.
Banegas, José R; Ruilope, Luis M; de la Sierra, Alejandro; Vinyoles, Ernest; Gorostidi, Manuel; de la Cruz, Juan J; Ruiz-Hurtado, Gema; Segura, Julián; Rodríguez-Artalejo, Fernando; Williams, Bryan
2018-04-19
Evidence for the influence of ambulatory blood pressure on prognosis derives mainly from population-based studies and a few relatively small clinical investigations. This study examined the associations of blood pressure measured in the clinic (clinic blood pressure) and 24-hour ambulatory blood pressure with all-cause and cardiovascular mortality in a large cohort of patients in primary care. We analyzed data from a registry-based, multicenter, national cohort that included 63,910 adults recruited from 2004 through 2014 in Spain. Clinic and 24-hour ambulatory blood-pressure data were examined in the following categories: sustained hypertension (elevated clinic and elevated 24-hour ambulatory blood pressure), "white-coat" hypertension (elevated clinic and normal 24-hour ambulatory blood pressure), masked hypertension (normal clinic and elevated 24-hour ambulatory blood pressure), and normotension (normal clinic and normal 24-hour ambulatory blood pressure). Analyses were conducted with Cox regression models, adjusted for clinic and 24-hour ambulatory blood pressures and for confounders. During a median follow-up of 4.7 years, 3808 patients died from any cause, and 1295 of these patients died from cardiovascular causes. In a model that included both 24-hour and clinic measurements, 24-hour systolic pressure was more strongly associated with all-cause mortality (hazard ratio, 1.58 per 1-SD increase in pressure; 95% confidence interval [CI], 1.56 to 1.60, after adjustment for clinic blood pressure) than the clinic systolic pressure (hazard ratio, 1.02; 95% CI, 1.00 to 1.04, after adjustment for 24-hour blood pressure). Corresponding hazard ratios per 1-SD increase in pressure were 1.55 (95% CI, 1.53 to 1.57, after adjustment for clinic and daytime blood pressures) for nighttime ambulatory systolic pressure and 1.54 (95% CI, 1.52 to 1.56, after adjustment for clinic and nighttime blood pressures) for daytime ambulatory systolic pressure. These relationships were consistent across subgroups of age, sex, and status with respect to obesity, diabetes, cardiovascular disease, and antihypertensive treatment. Masked hypertension was more strongly associated with all-cause mortality (hazard ratio, 2.83; 95% CI, 2.12 to 3.79) than sustained hypertension (hazard ratio, 1.80; 95% CI, 1.41 to 2.31) or white-coat hypertension (hazard ratio, 1.79; 95% CI, 1.38 to 2.32). Results for cardiovascular mortality were similar to those for all-cause mortality. Ambulatory blood-pressure measurements were a stronger predictor of all-cause and cardiovascular mortality than clinic blood-pressure measurements. White-coat hypertension was not benign, and masked hypertension was associated with a greater risk of death than sustained hypertension. (Funded by the Spanish Society of Hypertension and others.).
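One practical detail in reports like this is expressing hazard ratios per 1-SD increase in pressure. A minimal sketch of that step, with hypothetical column names and a 0/1-coded sex variable, is to z-score each pressure before fitting so that exp(coef) is directly the per-SD hazard ratio, each pressure adjusted for the other.

```python
# Sketch: HR per 1-SD increase via standardized covariates in a Cox model.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("bp_registry.csv")  # hypothetical input; sex coded 0/1
for col in ["sbp_24h", "sbp_clinic"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "died", "sbp_24h_z", "sbp_clinic_z", "age", "sex"]],
    duration_col="followup_years",
    event_col="died",
)
print(cph.hazard_ratios_)  # HR per 1-SD, each pressure adjusted for the other
```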
Sloane, Philip D; Zimmerman, Sheryl; Ward, Kimberly; Reed, David; Preisser, John S; Weber, David J
2017-09-01
Pneumonia is the leading infectious cause of hospitalization and death for nursing home (NH) residents; however, diagnosis is often delayed because classic signs of infection are not present. We sought to identify NH residents at high risk for pneumonia, in order to target persons for more intensive surveillance and preventive measures. Based on a literature review, we identified key risk factors for pneumonia and compiled them into a prediction tool, limiting risk factors to those available on the Minimum Data Set (MDS). Next, we tested the tool's ability to predict 6-month pneumonia incidence and mortality rates in a sample of 674 residents from 7 NHs, evaluating it both as a continuous and as a dichotomous variable, and applying both logistic regression and survival analysis to calculate estimates. NH Pneumonia Risk Index scores ranged from -1 to 6, with a mean of 2.1, a median of 2, and a mode of 2. For the outcome of pneumonia, a 1-point increase in the index was associated with a risk odds ratio of 1.26 (P = .038) or a hazard ratio of 1.24 (P = .037); using it as a dichotomous variable (≤2 vs ≥3), the corresponding figures were a risk odds ratio of 1.78 (P = .045) and a hazard ratio of 1.82 (P = .025). For the outcome of mortality, a 1-point increase in the NH Pneumonia Risk Index was associated with a risk odds ratio of 1.58 (P = .002) and a hazard ratio of 1.45 (P = .013); using the index as a dichotomous variable, the corresponding figures were a risk odds ratio of 3.71 (P < .001) and a hazard ratio of 3.29 (P = .001). The NH Pneumonia Risk Index can be used by NH staff to identify residents for whom to apply especially intensive preventive measures and surveillance. Because of its strong association with mortality, the index may also be valuable in care planning and discussion of advance directives. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
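A minimal sketch of evaluating such an index both as a continuous score and as a dichotomized flag, using a logistic model for the odds ratio and a Cox model for the hazard ratio, is shown below. The data file and column names are hypothetical, and the ≥3 cut point simply mirrors the dichotomization described above.

```python
# Sketch: risk index evaluated two ways (continuous and dichotomized) and with
# two models (logistic regression and Cox regression).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

df = pd.read_csv("nh_residents.csv")  # hypothetical input file
df["index_high"] = (df["risk_index"] >= 3).astype(int)  # dichotomize at >= 3

# Logistic regression: odds ratio per 1-point increase in the index
logit = sm.Logit(df["pneumonia_6mo"], sm.add_constant(df[["risk_index"]])).fit()
print(np.exp(logit.params))

# Cox model: hazard ratio for the dichotomized index over 6 months of follow-up
cph = CoxPHFitter()
cph.fit(
    df[["days_to_pneumonia", "pneumonia_6mo", "index_high"]],
    duration_col="days_to_pneumonia",
    event_col="pneumonia_6mo",
)
cph.print_summary()
```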
Health Literacy, Cognitive Abilities, and Mortality Among Elderly Persons
Wolf, Michael S.; Feinglass, Joseph; Thompson, Jason A.
2008-01-01
Background Low health literacy and low cognitive abilities both predict mortality, but no study has jointly examined these relationships. Methods We conducted a prospective cohort study of 3,260 community-dwelling adults age 65 and older. Participants were interviewed in 1997 and administered the Short Test of Functional Health Literacy in Adults and the Mini Mental Status Examination. Mortality was determined using the National Death Index through 2003. Measurements and Main Results In multivariate models with only literacy (not cognition), the adjusted hazard ratio was 1.50 (95% confidence interval [CI] 1.24–1.81) for inadequate versus adequate literacy. In multivariate models without literacy, delayed recall of 3 items and the ability to serially subtract numbers were associated with higher mortality (e.g., adjusted hazard ratios [AHR] 1.74 [95% CI 1.30–2.34] for recall of zero versus 3 items, and 1.32 [95% CI 1.09–1.60] for 0–2 vs 5 correct subtractions). In multivariate analysis with both literacy and cognition, the AHRs for the cognition items were similar, but the AHR for inadequate literacy decreased to 1.27 (95% CI 1.03 – 1.57). Conclusions Both health literacy and cognitive abilities independently predict mortality. Interventions to improve patient knowledge and self-management skills should consider both the reading level and cognitive demands of the materials. PMID:18330654
Serum urate at trial entry and ALS progression in EMPOWER.
O'Reilly, ÉIlis J; Liu, Dawei; Johns, Donald R; Cudkowicz, Merit E; Paganoni, Sabrina; Schwarzschild, Michael A; Leitner, Melanie; Ascherio, Alberto
2017-02-01
Our objective was to determine whether serum urate predicts ALS progression. The study population comprised adult participants in EMPOWER (n = 942), a phase III clinical trial to evaluate the efficacy of dexpramipexole to treat ALS. Urate was measured in blood samples collected during enrollment as part of the routine blood chemistry. We measured outcomes by the combined assessment of function and survival rank (CAFS), and time to death, by 12 months. Results showed that in females there was not a significant relation between urate and outcomes. In males, outcomes improved with increasing urate (comparing the highest to the lowest urate quartile: CAFS was 53 points better with p for trend = 0.04; and the hazard ratio for death was 0.60 with p for trend = 0.07), but with adjustment for body mass index (BMI) at baseline, a predictor of both urate levels and prognosis, associations were attenuated and no longer statistically significant. Overall, participants with urate levels equal to or above the median (5.1 mg/dl) appeared to have a survival advantage compared to those below (hazard ratio adjusted for BMI: 0.67; 95% confidence interval 0.47-0.95). In conclusion, these findings suggest that while the association between urate at baseline and ALS progression is partially explained by BMI, there may be an independent beneficial effect of urate.
NASA Technical Reports Server (NTRS)
Hilado, C. J.; Brandt, D. L.; Brauer, D. P.
1978-01-01
An apparatus and procedure are described for evaluating the toxicity of the gases evolved from the smoldering combustion of seating and bedding materials. The method combines initiation of smoldering combustion in fabric/cushion combinations by a lighted cigarette and exposure of laboratory animals to the gases evolved. The ratio of the surface available for smoldering to the compartment volume in this apparatus is approximately five times the ratio expected in a California living room, and 100 times the ratio expected in a wide-body aircraft passenger cabin. Based on fabric/cushion combinations tested, the toxicity of gases from smoldering combustion does not appear to be a significant hazard in aircraft passenger cabins, but seems to be a basis for careful selection of materials for residential environments.
40 CFR 270.66 - Permits for boilers and industrial furnaces burning hazardous waste.
Code of Federal Regulations, 2011 CFR
2011-07-01
... blended, and blending ratios. (3) A detailed engineering description of the boiler or industrial furnace... 40 Protection of Environment 27 2011-07-01 2011-07-01 false Permits for boilers and industrial... PROGRAM Special Forms of Permits § 270.66 Permits for boilers and industrial furnaces burning hazardous...
A new modeling and inference approach for the Systolic Blood Pressure Intervention Trial outcomes.
Yang, Song; Ambrosius, Walter T; Fine, Lawrence J; Bress, Adam P; Cushman, William C; Raj, Dominic S; Rehman, Shakaib; Tamariz, Leonardo
2018-06-01
Background/aims In clinical trials with time-to-event outcomes, usually the significance tests and confidence intervals are based on a proportional hazards model. Thus, the temporal pattern of the treatment effect is not directly considered. This could be problematic if the proportional hazards assumption is violated, as such violation could impact both interim and final estimates of the treatment effect. Methods We describe the application of inference procedures developed recently in the literature for time-to-event outcomes when the treatment effect may or may not be time-dependent. The inference procedures are based on a new model which contains the proportional hazards model as a sub-model. The temporal pattern of the treatment effect can then be expressed and displayed. The average hazard ratio is used as the summary measure of the treatment effect. The test of the null hypothesis uses adaptive weights that often lead to improvement in power over the log-rank test. Results Without needing to assume proportional hazards, the new approach yields results consistent with previously published findings in the Systolic Blood Pressure Intervention Trial. It provides a visual display of the time course of the treatment effect. At four of the five scheduled interim looks, the new approach yields smaller p values than the log-rank test. The average hazard ratio and its confidence interval indicates a treatment effect nearly a year earlier than a restricted mean survival time-based approach. Conclusion When the hazards are proportional between the comparison groups, the new methods yield results very close to the traditional approaches. When the proportional hazards assumption is violated, the new methods continue to be applicable and can potentially be more sensitive to departure from the null hypothesis.
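To make the idea of an average hazard ratio concrete, the sketch below computes one simple plug-in variant: a ratio of Nelson-Aalen hazard increments weighted by the product of the two Kaplan-Meier survival curves. This is only an illustration under that particular weighting choice, not the estimator or the adaptively weighted test used in the trial analysis.

```python
# Minimal plug-in "average hazard ratio" sketch with weight w(t) ~ S1(t)*S2(t).
import numpy as np
from lifelines import KaplanMeierFitter

def average_hazard_ratio(t1, e1, t2, e2):
    """t*, e* are 1-d arrays of follow-up times and 0/1 event indicators."""
    t1, e1, t2, e2 = map(np.asarray, (t1, e1, t2, e2))
    km1 = KaplanMeierFitter().fit(t1, e1)
    km2 = KaplanMeierFitter().fit(t2, e2)
    grid = np.unique(np.concatenate([t1[e1 == 1], t2[e2 == 1]]))  # event times

    def na_increments(t, e):
        d = np.array([np.sum((t == u) & (e == 1)) for u in grid])  # events at u
        n = np.array([np.sum(t >= u) for u in grid])               # at risk at u
        return np.where(n > 0, d / np.maximum(n, 1), 0.0)

    # KM value at u is used as an approximation to the left limit S(u-)
    w = np.array([float(km1.predict(u) * km2.predict(u)) for u in grid])
    return np.sum(w * na_increments(t1, e1)) / np.sum(w * na_increments(t2, e2))
```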
Development of hazard-compatible building fragility and vulnerability models
Karaca, E.; Luco, N.
2008-01-01
We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
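As a small illustration of the kind of fragility model described here, the sketch below fits a lognormal fragility curve P(damage state reached | Sa) by maximum likelihood to binary analysis outcomes. The spectral-acceleration values and damage flags are made-up stand-ins for response-history analysis results, not HAZUS data.

```python
# Sketch: lognormal fragility curve fit by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

sa = np.array([0.1, 0.2, 0.3, 0.5, 0.8, 1.2, 1.6, 2.0])  # spectral accel. (g)
damaged = np.array([0, 0, 1, 0, 1, 1, 1, 1])              # 1 = damage state reached

def neg_log_lik(params):
    median, beta = np.exp(params)                # enforce positive parameters
    p = norm.cdf(np.log(sa / median) / beta)     # lognormal fragility function
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(damaged * np.log(p) + (1 - damaged) * np.log(1 - p))

res = minimize(neg_log_lik, x0=np.log([0.5, 0.5]))
median_hat, beta_hat = np.exp(res.x)
print(f"median Sa = {median_hat:.2f} g, dispersion beta = {beta_hat:.2f}")
```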
NASA Technical Reports Server (NTRS)
Kumar, K. V.; Calkins, Dick S.; Waligora, James M.; Gilbert, John H., III; Powell, Michael R.
1992-01-01
This study investigated the association between the time at onset of circulating microbubbles (CMB) and symptoms of altitude decompression sickness (DCS), using Cox proportional hazard regression models. The study population consisted of 125 individuals who participated in direct-ascent, simulated extravehicular activity profiles. Using individual CMB status as a time-dependent variable, we found that the hazard for symptoms increased significantly (at the end of 180 min at altitude) in the presence of CMB (hazard ratio = 29.59; 95 percent confidence interval (95 percent CI) = 7.66-114.27), compared to no CMB. Further examination was conducted on the subgroup of individuals who developed microbubbles during the test (n = 49), by using Cox regression. Individuals with late onset of CMB (greater than 60 min at altitude) showed a significantly reduced risk of symptoms (hazard ratio = 0.92; 95 percent CI = 0.89-0.95), compared to those with early onset (equal to or less than 60 min), while controlling for other risk factors. We conclude that time to detection of circulating microbubbles is an independent determinant of symptoms of DCS.
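A time-dependent covariate analysis of this kind is typically set up in a long (start/stop) format in which the CMB indicator switches from 0 to 1 at bubble onset. The sketch below shows that structure with a few hypothetical rows; it illustrates the data layout and model call, not the study's actual data or model specification.

```python
# Time-dependent covariate sketch: each subject's follow-up is split into
# intervals, and 'cmb' flips from 0 to 1 once microbubbles are detected.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":    [1, 1, 2, 3, 3, 4],
    "start": [0, 75, 0, 0, 40, 0],
    "stop":  [75, 180, 180, 40, 150, 120],
    "cmb":   [0, 1, 0, 0, 1, 0],   # microbubbles detected during this interval?
    "dcs":   [0, 1, 0, 0, 0, 1],   # DCS symptoms at the end of this interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="dcs", start_col="start", stop_col="stop")
ctv.print_summary()  # exp(coef) for 'cmb' ~ hazard ratio for symptoms given CMB
```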
Morbidity, mortality and economic burden of renal impairment in cardiac intensive care.
Chew, D P; Astley, C; Molloy, D; Vaile, J; De Pasquale, C G; Aylward, P
2006-03-01
Moderate to severe impairment of renal function has emerged as a potent risk factor for adverse short- and long-term outcomes among patients presenting with cardiac disease. We sought to define the clinical, late mortality and economic burden of this risk factor among patients presenting to cardiac intensive care. A clinical audit of patients presenting to cardiac intensive care was undertaken between July 2002 and June 2003. All patients presenting with cardiac diagnoses were included in the study. Baseline creatinine levels were assessed in all patients. Late mortality was assessed by the interrogation of the National Death Register. Renal impairment was defined as estimated glomerular filtration rate <60 mL/min per 1.73 m2, as calculated by the Modified Diet in Renal Disease formula. In-hospital and late outcomes were compared by Cox proportional hazards modelling, adjusting for known confounders. A matched analysis and attributable risk calculation were undertaken to assess the proportion of late mortality accounted for by impairment of renal function and other known negative prognostic factors. The in-hospital total cost associated with renal impairment was assessed by linear regression. Glomerular filtration rate <60 mL/min per 1.73 m2 was evident in 33.0% of this population. Among these patients, in-hospital and late mortality were substantially increased: risk ratio 13.2; 95% CI 3.0-58.1; P < 0.001 and hazard ratio 6.2; 95% CI 3.6-10.7; P < 0.001, respectively. In matched analysis, renal impairment to this level was associated with 42.1% of all the late deaths observed. Paradoxically, patients with renal impairment were more conservatively managed, but their hospitalizations were associated with an excess adjusted in-hospital cost of $A1676. Impaired renal function is associated with a striking clinical and economic burden among patients presenting to cardiac intensive care. As a marker for future risk, renal function accounts for a substantial proportion of the burden of late mortality. The burden of risk suggests a greater potential opportunity for improvement of outcomes through optimisation of therapeutic strategies.
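The attributable-risk idea used here can be illustrated with Levin's population attributable fraction, PAF = p(HR - 1) / (1 + p(HR - 1)). The sketch below plugs in the 33% prevalence and the late-mortality hazard ratio of 6.2 purely to show the arithmetic; it is a generic formula and is not claimed to reproduce the study's matched-analysis figure of 42.1%.

```python
# Levin's population attributable fraction from exposure prevalence and a
# relative hazard; generic formula for illustration only.
def levin_paf(prevalence: float, hazard_ratio: float) -> float:
    excess = prevalence * (hazard_ratio - 1.0)
    return excess / (1.0 + excess)

print(f"PAF = {levin_paf(0.33, 6.2):.1%}")  # prevalence 33%, late-mortality HR 6.2
```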
Smail, Nassima; Tchervenkov, Jean; Paraskevas, Steven; Baran, Dana; Mucsi, Istvan; Hassanain, Mazen; Chaudhury, Prosanto; Cantarovich, Marcelo
2013-07-27
The use of kidneys from expanded-criteria donors (ECD) is regarded with caution. We compared 279 kidney transplant recipients (KTxR) from standard-criteria donors (SCD) and 237 from ECD, transplanted between January 1990 and December 2006. We evaluated the impact of immediate graft function (IGF), slow graft function (SGF), and delayed graft function (DGF) and the drop in estimated glomerular filtration rate (ΔeGFR) ≤ 30% or > 30% during the first year after transplantation on long-term patient and death-censored graft survival (DCGS). Ten-year patient survival was similar in SCD- or ECD-KTxR (P = 0.38). DCGS was better in SCD-KTxR versus ECD-KTxR (77.3% vs. 67.3%; P = 0.01). DCGS did not differ in either group experiencing IGF (P = 0.17) or DGF (P = 0.12). However, DCGS was worse in ECD-KTxR experiencing SGF (84.9% vs. 73.7%; P = 0.04). Predictors of DCGS were 1-year serum creatinine (hazard ratio, 1.03; P < 0.0001) and ΔeGFR > 30% between 1 and 12 months (Δ1-12eGFR) after transplantation (hazard ratio, 2.2; P = 0.02). In ECD-KTxR with IGF and more than 1-year follow-up, 10-year DCGS was better in those with Δ1-12eGFR ≤ 30% versus those with Δ1-12eGFR > 30% (83.8% vs. 53.6%; P = 0.01). Recipients of SCD or ECD kidneys with IGF or DGF had similar 10-year patient survival and DCGS. SGF had a worse impact on DCGS in ECD-KTxR. In addition to 1-year serum creatinine, Δ1-12eGFR > 30% is a negative predictor of DCGS. Larger studies should confirm if increasing the use of ECD, avoiding factors that contribute to SGF or DGF, and/or a decline in eGFR during the first year after transplantation may expand the donor pool and result in acceptable long-term outcomes.
Neuropsychological Correlates of Hazard Perception in Older Adults.
McInerney, Katalina; Suhr, Julie
2016-03-01
Hazard perception, the ability to identify and react to hazards while driving, is of growing importance in driving research, given its strong relationship to real-world driving variables. Furthermore, although poor hazard perception is associated with novice drivers, recent research suggests that it also declines with advanced age. In the present study, we examined the neuropsychological correlates of hazard perception in a healthy older adult sample. A total of 68 adults aged 60 and older who showed no signs of dementia and were active drivers completed a battery of neuropsychological tests as well as a hazard perception task. Tests included the Repeatable Battery for the Assessment of Neuropsychological Status, Wechsler Test of Adult Reading, Trail Making Test, Block Design, Useful Field of View, and the Delis-Kaplan Executive Function System Color Word Interference Test. Hazard perception errors were related to visuospatial/constructional skills, processing speed, memory, and executive functioning skills, with a battery of tests across these domains accounting for 36.7% of the variance in hazard perception errors. Executive functioning, particularly Trail Making Test part B, emerged as a strong predictor of hazard perception ability. Consistent with prior work showing the relationship of neuropsychological performance to other measures of driving ability, neuropsychological performance was associated with hazard perception skill. Future studies should examine the relationship of neuropsychological changes in adults who are showing driving impairment and/or cognitive changes associated with Mild Cognitive Impairment or dementia.
Models and analysis for multivariate failure time data
NASA Astrophysics Data System (ADS)
Shih, Joanna Huang
The goal of this research is to develop and investigate models and analytic methods for multivariate failure time data. We compare models in terms of direct modeling of the margins, flexibility of dependency structure, local vs. global measures of association, and ease of implementation. In particular, we study copula models, and models produced by right neutral cumulative hazard functions and right neutral hazard functions. We examine the changes of association over time for families of bivariate distributions induced from these models by displaying their density contour plots, conditional density plots, correlation curves of Doksum et al, and local cross ratios of Oakes. We know that bivariate distributions with the same margins might exhibit quite different dependency structures. In addition to modeling, we study estimation procedures. For copula models, we investigate three estimation procedures. The first procedure is full maximum likelihood. The second procedure is two-stage maximum likelihood: at stage 1, we estimate the parameters in the margins by maximizing the marginal likelihood; at stage 2, we estimate the dependency structure by fixing the margins at the estimated ones. The third procedure is two-stage partially parametric maximum likelihood. It is similar to the second procedure, but we estimate the margins by the Kaplan-Meier estimate. We derive asymptotic properties for these three estimation procedures and compare their efficiency by Monte-Carlo simulations and direct computations. For models produced by right neutral cumulative hazards and right neutral hazards, we derive the likelihood and investigate the properties of the maximum likelihood estimates. Finally, we develop goodness of fit tests for the dependency structure in the copula models. We derive a test statistic and its asymptotic properties based on the test of homogeneity of Zelterman and Chen (1988), and a graphical diagnostic procedure based on the empirical Bayes approach. We study the performance of these two methods using actual and computer-generated data.
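A minimal sketch of the two-stage idea (estimate the margins first, then the dependence parameter with the margins held fixed) is given below for a Clayton copula with exponential margins on uncensored toy data. It ignores censoring and the Kaplan-Meier-based third procedure, so it is only a simplified illustration of the estimation strategy, and with independent toy data the dependence estimate should land near 0.

```python
# Two-stage (margins first, then dependence) sketch for a Clayton copula.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
t1 = rng.exponential(scale=2.0, size=500)  # independent toy failure times
t2 = rng.exponential(scale=3.0, size=500)

# Stage 1: marginal MLEs (exponential rate = 1 / sample mean)
rate1, rate2 = 1.0 / t1.mean(), 1.0 / t2.mean()
u = 1.0 - np.exp(-rate1 * t1)  # probability-integral transforms
v = 1.0 - np.exp(-rate2 * t2)

# Stage 2: maximize the Clayton copula log-density with margins held fixed
def neg_loglik(theta):
    logc = (np.log1p(theta) - (1 + theta) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(u ** -theta + v ** -theta - 1))
    return -np.sum(logc)

theta_hat = minimize_scalar(neg_loglik, bounds=(1e-4, 20), method="bounded").x
print(f"estimated Clayton dependence parameter: {theta_hat:.3f}")
```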
Cremer, Paul C; Zhang, Yiran; Alu, Maria; Rodriguez, L Leonardo; Lindman, Brian R; Zajarias, Alan; Hahn, Rebecca T; Lerakis, Stamatios; Malaisrie, S Chris; Douglas, Pamela S; Pibarot, Philippe; Svensson, Lars G; Leon, Martin B; Jaber, Wael A
2018-05-08
In patients randomized to transcatheter or surgical aortic valve replacement (TAVR, SAVR), we sought to determine whether SAVR is associated with worsening right ventricular (RV) function and whether RV deterioration is associated with mortality. In 1376 patients from PARTNER IIA with paired baseline and 30-day core lab echocardiograms, worsening RV function was defined as a decline by at least one grade from baseline to 30 days. Our primary outcome was all-cause mortality from 30 days to 2 years. Among 744 patients with TAVR, 62 (8.3%) had worsening RV function, compared with 156 of 632 patients with SAVR (24.7%) (P < 0.0001). In a multivariable model, SAVR [odds ratio (OR) 4.05, 95% confidence interval (CI) 2.55-6.44], a dilated RV (OR 2.38, 95% CI 1.37-4.14), and more than mild tricuspid regurgitation (TR) (OR 2.58, 95% CI 1.25-5.33) were associated with worsening RV function. There were 169 deaths, and patients with worsening RV function had higher all-cause mortality [hazard ratio (HR) 1.98, 95% CI 1.40-2.79]. This association remained robust after adjusting for clinical and echocardiographic variables. Among patients with worsening RV function, there was no mortality difference between TAVR and SAVR (HR 1.16, 95% CI 0.61-2.18). The development of moderate or severe RV dysfunction from baseline normal RV function conferred the worst prognosis (HR 2.87, 95% CI 1.40-5.89). After aortic valve replacement, worsening RV function is more common in patients with baseline RV dilation, more than mild TR, and in patients treated with SAVR. Worsening RV function and the magnitude of deterioration have important prognostic implications.
Wedderburn, Catherine J; van Beijnum, Janneke; Bhattacharya, Jo J; Counsell, Carl E; Papanastassiou, Vakis; Ritchie, Vaughn; Roberts, Richard C; Sellar, Robin J; Warlow, Charles P; Al-Shahi Salman, Rustam
2008-03-01
The decision about whether to treat an unruptured brain arteriovenous malformation (AVM) depends on a comparison of the estimated lifetime risk of intracranial haemorrhage with the risks of interventional treatment. We aimed to test whether outcome differs between adults who had interventional AVM treatment and those who did not. All adults in Scotland who were first diagnosed with an unruptured AVM during 1999-2003 (n=114) entered our prospective, population-based study. We compared the baseline characteristics and 3-year outcome of adults who received interventional treatment for their AVM (n=63) with those who did not (n=51). At presentation, adults who were treated were younger (mean 40 vs 55 years of age, 95% CI for difference 9-20; p<0.0001), more likely to present with a seizure (odds ratio 2.4, 95% CI 1.1-5.0), and had fewer comorbidities (median 3 vs 4, p=0.03) than those who were not treated. Despite these baseline imbalances, treated and untreated groups did not differ in progression to Oxford Handicap Scale (OHS) scores of 2-6 (log-rank p=0.12) or 3-6 (log-rank p=0.98) in survival analyses. In a multivariable Cox proportional hazards analysis, the risk of poor outcome (OHS 2-6) was greater in patients who had interventional treatment than in those who did not (hazard ratio 2.5, 95% CI 1.1-6.0) and was greater in patients with a larger AVM nidus (hazard ratio 1.3, 95% CI 1.1-1.7). The treated and untreated groups did not differ in time to an OHS score of 2 or more that was sustained until the end of the third year of follow-up, or in the spectrum of dependence as measured by the OHS at 1, 2, and 3 years of follow-up. Greater AVM size and interventional treatment were associated with worse short-term functional outcome for unruptured AVMs, but the longer-term effects of intervention are unclear.
Ruptured Tendons in Anabolic-Androgenic Steroid Users: A Cross-Sectional Cohort Study.
Kanayama, Gen; DeLuca, James; Meehan, William P; Hudson, James I; Isaacs, Stephanie; Baggish, Aaron; Weiner, Rory; Micheli, Lyle; Pope, Harrison G
2015-11-01
Accumulating case reports have described tendon rupture in men who use anabolic-androgenic steroids (AAS). However, no controlled study has assessed the history of tendon rupture in a large cohort of AAS users and comparison nonusers. We hypothesized that men reporting long-term AAS abuse would report an elevated lifetime incidence of tendon rupture compared with non-AAS-using bodybuilders. Cohort study; Level of evidence, 3. Medical histories were obtained from 142 experienced male bodybuilders aged 35 to 55 years recruited in the course of 2 studies. Of these men, 88 reported at least 2 years of cumulative lifetime AAS use, and 54 reported no history of AAS use. In men reporting a history of tendon rupture, the circumstances of the injury, prodromal symptoms, concomitant drug or alcohol use, and details of current and lifetime AAS use (if applicable) were recorded. Surgical records were obtained for most participants. Nineteen (22%) of the AAS users, but only 3 (6%) of the nonusers, reported at least 1 lifetime tendon rupture. The hazard ratio for a first ruptured tendon in AAS users versus nonusers was 9.0 (95% CI, 2.5-32.3; P < .001). Several men reported 2 or more independent lifetime tendon ruptures. Interestingly, upper-body tendon ruptures occurred exclusively in the AAS group (15 [17%] AAS users vs 0 nonusers; risk difference, 0.17 [95% CI, 0.09-0.25]; P < .001 [hazard ratio not estimable]), whereas there was no significant difference between users and nonusers in risk for lower-body ruptures (6 [7%] AAS users, 3 [6%] nonusers; hazard ratio, 3.1 [95% CI, 0.7-13.8]; P = .13). Of 31 individual tendon ruptures assessed, only 6 (19%) occurred while weightlifting, with the majority occurring during other sports activities. Eight (26%) ruptures followed prodromal symptoms of nonspecific pain in the region. Virtually all ruptures were treated surgically, with complete or near-complete ultimate restoration of function. AAS abusers, compared with otherwise similar bodybuilders, showed a markedly increased risk of tendon ruptures, particularly upper-body tendon rupture. © 2015 The Author(s).
Greisenegger, Stefan; Segal, Helen C; Burgess, Annette I; Poole, Debbie L; Mehta, Ziyah; Rothwell, Peter M
2015-11-01
Copeptin, the c-terminal portion of provasopressin, is a useful prognostic marker in patients after myocardial infarction and heart failure. More recently, high levels of copeptin have also been associated with worse functional outcome and increased mortality within the first year after ischemic stroke and transient ischemic attack (TIA). However, to date, there are no published data on whether copeptin predicts long-term risk of vascular events after TIA and stroke. We measured copeptin levels in consecutive patients with TIA or ischemic stroke in a population-based study (Oxford Vascular Study) recruited from 2002 to 2007 and followed up to 2014. Associations with risk of recurrent vascular events were determined by Cox-regression. During ≈6000 patient-years in 1076 patients, there were 357 recurrent vascular events, including 174 ischemic strokes. After adjustment for age, sex, and risk factors, copeptin was predictive of recurrent vascular events (adjusted hazard ratio per SD, 1.47; 95% confidence interval, 1.31-1.64; P=0.0001), vascular death (1.85; 1.60-2.14; P<0.0001), all-cause death (1.75; 1.58-1.93; P<0.0001), and recurrent ischemic stroke (1.22; 1.04-1.44; P=0.017); and improved model-discrimination significantly: net reclassification improvement for recurrent vascular events (32%; P<0.0001), vascular death (55%; P<0.0001), death (66%; P<0.0001), and recurrent stroke (16%; P=0.044). The predictive value of copeptin was largest in patients with cardioembolic index events (adjusted hazard ratio, 1.84; 95% confidence interval, 1.53-2.20 versus 1.31, 1.14-1.50 in noncardioembolic stroke; P=0.0025). In patients with cardioembolic stroke, high copeptin levels were associated with a 4-fold increased risk of vascular events within the first year of follow-up (adjusted hazard ratio, 4.02; 95% confidence interval, 2.13-7.70). In patients with TIA and ischemic stroke, copeptin predicted recurrent vascular events and death, particularly after cardioembolic TIA/stroke. Further validation is required, in particular, in studies using more extensive cardiac evaluation. © 2015 American Heart Association, Inc.
Revascularization versus medical therapy for renal-artery stenosis.
Wheatley, Keith; Ives, Natalie; Gray, Richard; Kalra, Philip A; Moss, Jonathan G; Baigent, Colin; Carr, Susan; Chalmers, Nicholas; Eadington, David; Hamilton, George; Lipkin, Graham; Nicholson, Anthony; Scoble, John
2009-11-12
Percutaneous revascularization of the renal arteries improves patency in atherosclerotic renovascular disease, yet evidence of a clinical benefit is limited. In a randomized, unblinded trial, we assigned 806 patients with atherosclerotic renovascular disease either to undergo revascularization in addition to receiving medical therapy or to receive medical therapy alone. The primary outcome was renal function, as measured by the reciprocal of the serum creatinine level (a measure that has a linear relationship with creatinine clearance). Secondary outcomes were blood pressure, the time to renal and major cardiovascular events, and mortality. The median follow-up was 34 months. During a 5-year period, the rate of progression of renal impairment (as shown by the slope of the reciprocal of the serum creatinine level) was -0.07x10(-3) liters per micromole per year in the revascularization group, as compared with -0.13x10(-3) liters per micromole per year in the medical-therapy group, a difference favoring revascularization of 0.06x10(-3) liters per micromole per year (95% confidence interval [CI], -0.002 to 0.13; P=0.06). Over the same time, the mean serum creatinine level was 1.6 micromol per liter (95% CI, -8.4 to 5.2 [0.02 mg per deciliter; 95% CI, -0.10 to 0.06]) lower in the revascularization group than in the medical-therapy group. There was no significant between-group difference in systolic blood pressure; the decrease in diastolic blood pressure was smaller in the revascularization group than in the medical-therapy group. The two study groups had similar rates of renal events (hazard ratio in the revascularization group, 0.97; 95% CI, 0.67 to 1.40; P=0.88), major cardiovascular events (hazard ratio, 0.94; 95% CI, 0.75 to 1.19; P=0.61), and death (hazard ratio, 0.90; 95% CI, 0.69 to 1.18; P=0.46). Serious complications associated with revascularization occurred in 23 patients, including 2 deaths and 3 amputations of toes or limbs. We found substantial risks but no evidence of a worthwhile clinical benefit from revascularization in patients with atherosclerotic renovascular disease. (Current Controlled Trials number, ISRCTN59586944.) 2009 Massachusetts Medical Society
Jabs, Douglas A.; Drye, Lea; Van Natta, Mark L.; Thorne, Jennifer E.; Holland, Gary N.
2014-01-01
Objectives Patients with the acquired immunodeficiency syndrome (AIDS) have an abnormality of retina/optic nerve function, manifested as decreased contrast sensitivity (in the absence of ocular opportunistic infections or media opacity), abnormalities on automated perimetry, and loss of retinal nerve fiber layer, even among those with good visual acuity, termed the HIV-neuroretinal disorder. The objectives of this study were to determine the prevalence, incidence, risk factors for, and outcomes of HIV-neuroretinal disorder. Design Prospective cohort study Participants 1822 patients with AIDS without ocular infections or media opacities. Methods Patients with HIV-neuroretinal disorder were identified by a contrast sensitivity < 1.50 log units in either eye in the absence of ocular opportunistic infections or media opacity. Main outcome measures Incidence of HIV-neuroretinal disorder, mortality, visual impairment (visual acuity 20/50 or worse), and blindness (20/200 or worse) on logarithmic visual acuity charts. Results Sixteen percent of participants had HIV-neuroretinal disorder at enrollment. The estimated cumulative incidence by 20 years after AIDS diagnosis was 51% (95% confidence interval [CI] 46%–55%). HIV-neuroretinal disorder was more common in women and African American persons. Risk factors for it included hepatitis C infection, low CD4+ T cells, and detectable HIV RNA in the blood. Patients with HIV neuroretinal disorder had a 70% excess mortality vs. those without it, even after adjusting for CD4+ T cells and HIV load (hazard ratio=1.7, 95% CI= 1.3–2.1, P<0.0001). Patients with HIV-neuroretinal disorder had increased risks of bilateral visual impairment (hazard ratio=6.5, 95% CI=2.6–10.6, P<0.0001) and blindness (hazard ratio=5.9, 95% CI=2.8–13.7, P=0.01) vs. those without HIV neuroretinal disorder. Conclusions HIV-neuroretinal disorder is a common finding among patients with AIDS, and it is associated with an increased mortality and an increased risk of visual impairment. Successful antiretroviral therapy decreases but does not eliminate the risk of HIV-neuroretinal disorder. PMID:25600199
Preston, Ioana R.; Roberts, Kari E.; Miller, Dave P.; Sen, Ginny P.; Selej, Mona; Benton, Wade W.; Hill, Nicholas S.
2015-01-01
Background— Long-term anticoagulation is recommended in idiopathic pulmonary arterial hypertension (IPAH). In contrast, limited data support anticoagulation in pulmonary arterial hypertension (PAH) associated with systemic sclerosis (SSc-PAH). We assessed the effect of warfarin anticoagulation on survival in IPAH and SSc-PAH patients enrolled in Registry to Evaluate Early and Long-term PAH Disease Management (REVEAL), a longitudinal registry of group I PAH. Methods and Results— Patients who initiated warfarin on study (n=187) were matched 1:1 with patients never on warfarin, by enrollment site, etiology, and diagnosis status. Descriptive analyses were conducted to compare warfarin users and nonusers by etiology. Survival analyses with and without risk adjustment were performed from the time of warfarin initiation or a corresponding quarterly update in matched pairs to avoid immortal time bias. Time-varying covariate models were used as sensitivity analyses. Mean warfarin treatment was 1 year; mean international normalized ratios were 1.9 (IPAH) and 2.0 (SSc-PAH). Two-thirds of patients initiating warfarin discontinued treatment before the last study assessment. There was no survival difference with warfarin in IPAH patients (adjusted hazard ratio, 1.37; P=0.21) or in SSc-PAH patients (adjusted hazard ratio, 1.60; P=0.15) in comparison with matched controls. However, SSc-PAH patients receiving warfarin within the previous year (hazard ratio, 1.57; P=0.031) or any time postbaseline (hazard ratio, 1.49; P=0.046) had increased mortality in comparison with warfarin-naïve patients. Conclusions— No significant survival advantage was observed in IPAH patients who started warfarin. In SSc-PAH patients, long-term warfarin was associated with poorer survival than in patients not receiving warfarin, even after adjusting for confounders. Clinical Trial Registration— URL: http://www.clinicaltrials.gov. Unique identifier: NCT00370214. PMID:26510696
Preston, Ioana R; Roberts, Kari E; Miller, Dave P; Sen, Ginny P; Selej, Mona; Benton, Wade W; Hill, Nicholas S; Farber, Harrison W
2015-12-22
Long-term anticoagulation is recommended in idiopathic pulmonary arterial hypertension (IPAH). In contrast, limited data support anticoagulation in pulmonary arterial hypertension (PAH) associated with systemic sclerosis (SSc-PAH). We assessed the effect of warfarin anticoagulation on survival in IPAH and SSc-PAH patients enrolled in Registry to Evaluate Early and Long-term PAH Disease Management (REVEAL), a longitudinal registry of group I PAH. Patients who initiated warfarin on study (n=187) were matched 1:1 with patients never on warfarin, by enrollment site, etiology, and diagnosis status. Descriptive analyses were conducted to compare warfarin users and nonusers by etiology. Survival analyses with and without risk adjustment were performed from the time of warfarin initiation or a corresponding quarterly update in matched pairs to avoid immortal time bias. Time-varying covariate models were used as sensitivity analyses. Mean warfarin treatment was 1 year; mean international normalized ratios were 1.9 (IPAH) and 2.0 (SSc-PAH). Two-thirds of patients initiating warfarin discontinued treatment before the last study assessment. There was no survival difference with warfarin in IPAH patients (adjusted hazard ratio, 1.37; P=0.21) or in SSc-PAH patients (adjusted hazard ratio, 1.60; P=0.15) in comparison with matched controls. However, SSc-PAH patients receiving warfarin within the previous year (hazard ratio, 1.57; P=0.031) or any time postbaseline (hazard ratio, 1.49; P=0.046) had increased mortality in comparison with warfarin-naïve patients. No significant survival advantage was observed in IPAH patients who started warfarin. In SSc-PAH patients, long-term warfarin was associated with poorer survival than in patients not receiving warfarin, even after adjusting for confounders. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00370214. © 2015 The Authors.
Assessment of Three Flood Hazard Mapping Methods: A Case Study of Perlis
NASA Astrophysics Data System (ADS)
Azizat, Nazirah; Omar, Wan Mohd Sabki Wan
2018-03-01
Flooding is a common natural disaster that affects all states in Malaysia. According to the Drainage and Irrigation Department (DID) in 2007, about 29,270 km2, or 9 percent of the country's area, is prone to flooding. Floods can be devastating catastrophes affecting people, the economy, and the environment. Flood hazard mapping is an important part of flood assessment, used to delineate high-risk areas prone to flooding. The purposes of this study are to prepare flood hazard maps of Perlis and to evaluate flood hazard using the frequency ratio, statistical index, and Poisson methods. Six factors affecting the occurrence of flooding (elevation, distance from the drainage network, rainfall, soil texture, geology, and erosion) were prepared using ArcGIS 10.1 software. The flood location map in this study was generated from the DID record of areas flooded in 2010. These factors and the flood location map were analysed to produce flood hazard maps representing the probability of flooding. The results of the analysis were verified using flood location data from 2013, 2014, and 2015. The comparison showed that the statistical index method predicted flood areas better than the frequency ratio and Poisson methods.
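As a minimal sketch of how the frequency ratio method weights one conditioning factor, the example below computes class-wise frequency ratios for a single hypothetical factor (elevation); the class boundaries and cell counts are invented for illustration and are not taken from the study.

```python
import pandas as pd

# Hypothetical raster tabulation for one conditioning factor (elevation).
# 'cells' = number of pixels in each class; 'flood_cells' = pixels flagged
# as flooded in the historical flood inventory (counts are invented).
classes = pd.DataFrame({
    "elev_class": ["<10 m", "10-30 m", "30-100 m", ">100 m"],
    "cells":       [120_000, 260_000, 340_000, 280_000],
    "flood_cells": [  9_500,   6_200,   1_800,     500],
})

# Frequency ratio = (share of flooded cells in the class) /
#                   (share of all cells in the class); FR > 1 means the
# class is positively associated with flooding.
classes["pct_of_floods"] = classes["flood_cells"] / classes["flood_cells"].sum()
classes["pct_of_area"] = classes["cells"] / classes["cells"].sum()
classes["frequency_ratio"] = classes["pct_of_floods"] / classes["pct_of_area"]

print(classes[["elev_class", "frequency_ratio"]].round(2))

# A hazard index is then typically obtained by summing the FR values of the
# classes each pixel falls into across all factors; the related statistical
# index method commonly uses the natural log of this same ratio per class.
```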
NASA Astrophysics Data System (ADS)
Liu, D. L.; Li, Y.
2015-11-01
Evaluating social vulnerability is a crucial issue in risk and disaster management. In this study, a household social vulnerability index (HSVI) to flood hazards was developed and used to assess the social vulnerability of rural households in the western mountainous regions of Henan province, China. Eight key indicators were identified through interactive discussions with multidisciplinary specialists and local farmers, and their weights were determined using principal component analysis (PCA). The results showed that (1) the proportion of household members perennially working away from home, hazard-related training, and the illiteracy ratio (ages 15+) were the most dominant factors in social vulnerability. (2) The numbers of households with high, moderate and low vulnerability were 14, 64 and 16, accounting for 14.9, 68.1, and 17.0% of the interviewed rural households, respectively. (3) The correlation coefficient between household social vulnerability scores and casualties in a storm flood in July 2010 was significant at the 0.05 level (r = 0.248), indicating that the selected indicators and their weights were valid. (4) Mitigation strategies to reduce household social vulnerability to flood hazards were proposed based on the assessment results. The results provide useful information for rural households and local governments to prepare for, mitigate, and respond to flood hazards.
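One common way to turn PCA output into indicator weights for a composite index is sketched below; this is an assumed, generic scheme for illustration (the authors' exact weighting procedure may differ), and the household-by-indicator matrix is simulated.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated household-by-indicator matrix: 94 households x 8 indicators
# (real indicators would include the out-migration ratio, hazard-related
# training, illiteracy ratio, and so on).
X = rng.normal(size=(94, 8))

# Standardize so indicators measured on different scales are comparable.
Z = StandardScaler().fit_transform(X)

# Generic scheme: weight each indicator by the variance-weighted magnitude
# of its loadings on the retained components, then normalize.
pca = PCA(n_components=3).fit(Z)
loadings = pca.components_                      # shape (3, 8)
expl = pca.explained_variance_ratio_[:, None]   # shape (3, 1)
weights = (np.abs(loadings) * expl).sum(axis=0)
weights /= weights.sum()

# Composite household social vulnerability score (assuming indicators are
# oriented so that larger values mean greater vulnerability).
hsvi = Z @ weights
print(weights.round(3))
print(hsvi[:5].round(2))
```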
Clemens, Michael S; Stewart, Ian J; Sosnov, Jonathan A; Howard, Jeffrey T; Belenkiy, Slava M; Sine, Christy R; Henderson, Jonathan L; Buel, Allison R; Batchinsky, Andriy I; Cancio, Leopoldo C; Chung, Kevin K
2016-10-01
To evaluate the association between acute respiratory distress syndrome and acute kidney injury with respect to their contributions to mortality in critically ill patients. Retrospective analysis of consecutive adult burn patients requiring mechanical ventilation. A 16-bed burn ICU at a tertiary military teaching hospital. Adult patients more than 18 years old requiring mechanical ventilation during their initial admission to our burn ICU from January 1, 2003, to December 31, 2011. None. A total of 830 patients were included, of whom 48.2% had acute kidney injury (n = 400). These patients had a 73% increased risk of developing acute respiratory distress syndrome after controlling for age, gender, total body surface area burned, and inhalation injury (hazard ratio, 1.73; 95% CI, 1.18-2.54; p = 0.005). In a reciprocal multivariate analysis, acute respiratory distress syndrome (n = 299; 36%) demonstrated a strong trend toward developing acute kidney injury (hazard ratio, 1.39; 95% CI, 0.99-1.95; p = 0.05). There was a 24% overall in-hospital mortality (n = 198). After adjusting for the aforementioned confounders, both acute kidney injury (hazard ratio, 3.73; 95% CI, 2.39-5.82; p < 0.001) and acute respiratory distress syndrome (hazard ratio, 2.16; 95% CI, 1.58-2.94; p < 0.001) significantly contributed to mortality. Age, total body surface area burned, and inhalation injury were also significantly associated with increased mortality. Acute kidney injury increases the risk of acute respiratory distress syndrome in mechanically ventilated burn patients, whereas acute respiratory distress syndrome similarly demonstrates a strong trend toward the development of acute kidney injury. Acute kidney injury and acute respiratory distress syndrome are both independent risk factors for subsequent death. Future research should look at this interplay for possible early interventions.
Conen, David; Arendacká, Barbora; Röver, Christian; Bergau, Leonard; Munoz, Pascal; Wijers, Sofieke; Sticherling, Christian; Zabel, Markus; Friede, Tim
2016-01-01
Some but not all prior studies have shown that women receiving a primary prophylactic implantable cardioverter defibrillator (ICD) have a lower risk of death and appropriate shocks than men. We aimed to evaluate the effect of gender on the risk of appropriate shock, all-cause mortality and inappropriate shock in contemporary studies of patients receiving a primary prophylactic ICD. PubMed, LIVIVO and Cochrane CENTRAL were searched between 2010 and 2016 for studies providing at least 1 gender-specific risk estimate for the outcomes of interest. Abstracts were screened independently for potentially eligible studies, with each abstract reviewed by at least two authors. Out of 680 abstracts retained by our search strategy, 20 studies including 46,657 patients had gender-specific information on at least one of the relevant endpoints. Mean age across the individual studies varied between 58 and 69 years. The proportion of women enrolled ranged from 10% to 30%. Across 6 available studies, women had a significantly lower risk of first appropriate shock compared with men (pooled multivariable adjusted hazard ratio 0.62, 95% CI 0.44-0.88). Across 14 studies reporting multivariable adjusted gender-specific hazard ratio estimates for all-cause mortality, women had a lower risk of death than men (pooled hazard ratio 0.75, 95% CI 0.66-0.86). There was no statistically significant difference for the incidence of first inappropriate shocks (3 studies, pooled hazard ratio 0.99, 95% CI 0.56-1.73). Individual patient data were not available for most studies. In this large contemporary meta-analysis, women had a significantly lower risk of appropriate shocks and death than men, but a similar risk of inappropriate shocks. These data may help to select patients who benefit from primary prophylactic ICD implantation.
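The pooled estimates above are inverse-variance summaries of study-specific log hazard ratios. As a sketch of the standard fixed-effect calculation (the review may also have used random-effects models, which add a between-study variance term to the weights):

\[
\hat{\theta}_{\mathrm{pooled}} = \frac{\sum_i w_i \ln(\mathrm{HR}_i)}{\sum_i w_i},
\qquad
w_i = \frac{1}{\widehat{\mathrm{SE}}_i^{\,2}},
\qquad
\widehat{\mathrm{SE}}_i = \frac{\ln(\mathrm{UCL}_i) - \ln(\mathrm{LCL}_i)}{2 \times 1.96},
\]

with pooled hazard ratio \(\exp(\hat{\theta}_{\mathrm{pooled}})\) and standard error \((\sum_i w_i)^{-1/2}\), where \(\mathrm{LCL}_i\) and \(\mathrm{UCL}_i\) are the limits of the reported 95% confidence interval of study \(i\).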
Chitasombat, Maria N.; Kofteridis, Diamantis P.; Jiang, Ying; Tarrand, Jeffrey; Lewis, Russell E.; Kontoyiannis, Dimitrios P.
2013-01-01
Background Rare opportunistic (non-Candida, non-Cryptococcus) yeast bloodstream infections (ROYBSIs) are uncommon, even in cancer patients. Methods We retrospectively reviewed all episodes of ROYBSIs occurring from 1998 to 2010 in our cancer center. Results Of 2984 blood cultures positive for Candida and non-Candida yeasts, 94 (3.1%) were positive for non-Candida yeasts, representing 41 ROYBSIs (incidence, 2.1 cases/100,000 patient-days). Catheter-associated fungemia occurred in 21 (51%) patients. Breakthrough ROYBSIs occurred in 20 (49%) patients. The yeast species distribution was Rhodotorula in 21 (51%) patients, Trichosporon in 8 (20%) patients, Saccharomyces cerevisiae in 8 (20%) patients, Geotrichum in 2 (5%) patients, and Pichia anomala and Malassezia furfur in 1 patient each. All tested Trichosporon, Geotrichum, and Pichia isolates were azole-susceptible, whereas the Rhodotorula isolates were mostly azole-resistant. We noted echinocandin nonsusceptibility (minimal inhibitory concentration ≥ 2 mg/L) in all but the S. cerevisiae isolates. Most of the isolates (28/33 [85%]) were susceptible to amphotericin B. The mortality rate in all patients at 30 days after ROYBSI diagnosis was 34%. Multivariate survival analysis revealed increased risk of death in patients with S. cerevisiae infections (hazard ratio, 3.7), Geotrichum infections (hazard ratio, 111.3), or disseminated infections (hazard ratio, 33.4) and reduced risk in patients who had catheter removal (hazard ratio, 0.1). Conclusions ROYBSIs are uncommon in patients with cancer, and catheters are common sources of them. Half of the ROYBSIs occurred as breakthrough infections, and in vitro species-specific resistance to echinocandins and azoles was common. Disseminated infections were associated with a high mortality rate. PMID:22101079
Red meat, chicken, and fish consumption and risk of colorectal cancer.
English, Dallas R; MacInnis, Robert J; Hodge, Allison M; Hopper, John L; Haydon, Andrew M; Giles, Graham G
2004-09-01
Red meat and processed meat consumption have been associated with increased risk of colorectal cancer in some, but not all, relevant cohort studies. Evidence on the relationship between risk of colorectal cancer and poultry and fish consumption is inconsistent. We conducted a prospective cohort study of 37,112 residents of Melbourne, Australia recruited from 1990 to 1994. Diet was measured with a food frequency questionnaire. We categorized the frequency of fresh red meat, processed meat, chicken, and fish consumption into approximate quartiles. Adenocarcinomas of the colon or rectum were ascertained via the Victorian Cancer Registry. We identified 283 colon cancers and 169 rectal cancers in an average of 9 years of follow-up. For rectal cancer, the hazard ratios [95% confidence intervals (95% CI)] in the highest quartile of consumption of fresh red meat and processed meat were 2.3 (1.2-4.2; P for trend = 0.07) and 2.0 (1.1-3.4; P for trend = 0.09), respectively. The corresponding hazard ratios (95% CIs) for colon cancer were 1.1 (0.7-1.6; P for trend = 0.9) and 1.3 (0.9-1.9; P for trend = 0.06). However, for neither type of meat was the heterogeneity between subsites significant. Chicken consumption was weakly negatively associated with colorectal cancer (hazard ratio highest quartile, 0.7; 95% CI, 0.6-1.0; P for trend = 0.03), whereas hazard ratios for fish consumption were close to unity. Consumption of fresh red meat and processed meat seemed to be associated with an increased risk of rectal cancer. Consumption of chicken and fish did not increase risk.
Leng, Yue; Wainwright, Nick W. J.; Cappuccio, Francesco P.; Surtees, Paul G.; Hayat, Shabina; Luben, Robert; Brayne, Carol; Khaw, Kay-Tee
2014-01-01
Epidemiologic studies have reported conflicting results on the relationship between daytime napping and mortality risk, and there are few data on the potential association in the British population. We investigated the associations between daytime napping and all-cause or cause-specific mortality in the European Prospective Investigation Into Cancer-Norfolk study, a British population-based cohort study. Among the 16,374 men and women who answered questions on napping habits between 1998 and 2000, a total of 3,251 died during the 13-year follow-up. Daytime napping was associated with an increased risk of all-cause mortality (for napping less than 1 hour per day on average, hazard ratio = 1.14, 95% confidence interval: 1.02, 1.27; for napping 1 hour or longer per day on average, hazard ratio = 1.32, 95% confidence interval: 1.04, 1.68), independent of age, sex, social class, educational level, marital status, employment status, body mass index, physical activity level, smoking status, alcohol intake, depression, self-reported general health, use of hypnotic drugs or other medications, time spent in bed at night, and presence of preexisting health conditions. This association was more pronounced for death from respiratory diseases (for napping less than 1 hour, hazard ratio = 1.40, 95% confidence interval: 0.95, 2.05; for napping 1 hour or more, hazard ratio = 2.56, 95% confidence interval: 1.34, 4.86) and in individuals 65 years of age or younger. Excessive daytime napping might be a useful marker of underlying health risk, particularly of respiratory problems, especially among those 65 years of age or younger. Further research is required to clarify the nature of the observed association. PMID:24685532
Kim, Sung Han; Park, Boram; Joo, Jungnam; Joung, Jae Young; Seo, Ho Kyung; Chung, Jinsoo; Lee, Kang Hyun
2017-01-01
Objective To evaluate predictive factors for retrograde ureteral stent failure in patients with non-urological malignant ureteral obstruction. Materials and methods Between 2005 and 2014, medical records of 284 malignant ureteral obstruction patients with 712 retrograde ureteral stent trials, including 63 patients (22.2%) with bilateral malignant ureteral obstruction, were retrospectively reviewed. Retrograde ureteral stent failure was defined as the inability to place ureteral stents by cystoscopy, recurrent stent obstruction within one month, or non-relief of azotemia within one week from the prior retrograde ureteral stent. The clinicopathological parameters and first retrograde pyelography (RGP) findings were analyzed in multivariate analysis (statistical significance, p < 0.05) to investigate the predictive factors for retrograde ureteral stent failure and conversion to percutaneous nephrostomy. Results Retrograde ureteral stent failure was detected in 14.1% of patients. The mean number of retrograde ureteral stent placements and indwelling duration of the ureteral stents were 2.5 ± 2.6 times and 8.6 ± 4.0 months, respectively. Multivariate analyses identified several specific RGP findings as significant predictive factors for retrograde ureteral stent failure (p < 0.05). The significant RGP findings included grade 4 hydronephrosis (hazard ratio 4.10, 95% confidence interval 1.39–12.09), irreversible ureteral kinking (hazard ratio 2.72, confidence interval 1.03–7.18), presence of bladder invasion (hazard ratio 4.78, confidence interval 1.81–12.63), and multiple lesions of ureteral stricture (hazard ratio 3.46, confidence interval 1.35–8.83) (p < 0.05). Conclusion Retrograde pyelography might prevent unnecessary and ineffective retrograde ureteral stent trials in patients with advanced non-urological malignant ureteral obstruction. PMID:28931043
Jakobsen, Lars; Niemann, Troels; Thorsgaard, Niels; Thuesen, Leif; Lassen, Jens F; Jensen, Lisette O; Thayssen, Per; Ravkilde, Jan; Tilsted, Hans H; Mehnert, Frank; Johnsen, Søren P
2012-10-01
The association between low socioeconomic status (SES) and high mortality from coronary heart disease is well-known. However, the role of SES in relation to the clinical outcome after primary percutaneous coronary intervention remains poorly understood. We studied 7385 patients treated with primary percutaneous coronary intervention. Participants were divided into high-SES and low-SES groups according to income, education, and employment status. The primary outcome was major adverse cardiac events (cardiac death, recurrent myocardial infarction, and target vessel revascularization) at maximum follow-up (mean, 3.7 years). Low-SES patients had more adverse baseline risk profiles than high-SES patients. The cumulative risk of major adverse cardiac events after maximum follow-up was higher among low-income patients and unemployed patients compared with their counterparts (income: hazard ratio, 1.68; 95% CI, 1.47-1.92; employment status: hazard ratio, 1.75; 95% CI, 1.46-2.10). After adjustment for patient characteristics, these differences were substantially attenuated (income: hazard ratio, 1.12; 95% CI, 0.93-1.33; employment status: hazard ratio, 1.27; 95% CI, 1.03-1.56). Further adjustment for admission findings, procedure-related data, and medical treatment during follow-up did not significantly affect the associations. With education as the SES indicator, no between-group differences were observed in the risk of the composite end point. Even in a tax-financed healthcare system, low-SES patients treated with primary percutaneous coronary intervention face a worse prognosis than high-SES patients. The poor outcome seems to be largely explained by differences in baseline patient characteristics. Employment status and income (but not education level) were associated with clinical outcomes.
Todd, Jonathan V; Cole, Stephen R; Pence, Brian W; Lesko, Catherine R; Bacchetti, Peter; Cohen, Mardge H; Feaster, Daniel J; Gange, Stephen; Griswold, Michael E; Mack, Wendy; Rubtsova, Anna; Wang, Cuiwei; Weedon, Jeremy; Anastos, Kathryn; Adimora, Adaora A
2017-05-15
Depression affects up to 30% of human immunodeficiency virus (HIV)-infected individuals. We estimated joint effects of antiretroviral therapy (ART) initiation and depressive symptoms on time to death using a joint marginal structural model and data from a cohort of HIV-infected women from the Women's Interagency HIV Study (conducted in the United States) from 1998-2011. Among 848 women contributing 6,721 years of follow-up, 194 participants died during follow-up, resulting in a crude mortality rate of 2.9 per 100 women-years. Cumulative mortality curves indicated greatest mortality for women who reported depressive symptoms and had not initiated ART. The hazard ratio for depressive symptoms was 3.38 (95% confidence interval (CI): 2.15, 5.33) and for ART was 0.47 (95% CI: 0.31, 0.70). Using a reference category of women without depressive symptoms who had initiated ART, the hazard ratio for women with depressive symptoms who had initiated ART was 3.60 (95% CI: 2.02, 6.43). For women without depressive symptoms who had not started ART, the hazard ratio was 2.36 (95% CI: 1.16, 4.81). Among women reporting depressive symptoms who had not started ART, the hazard ratio was 7.47 (95% CI: 3.91, 14.3). We found a protective effect of ART initiation on mortality, as well as a harmful effect of depressive symptoms, in a cohort of HIV-infected women. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
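Joint marginal structural models of this kind are typically fitted as a weighted Cox model in the two exposures and their interaction. The display below is a sketch of one common specification, not necessarily the exact model used by the authors, with \(a(t)\) indicating ART initiation by time \(t\) and \(d(t)\) indicating depressive symptoms:

\[
\lambda(t \mid a, d) \;=\; \lambda_0(t)\,\exp\!\big(\beta_1\, a(t) + \beta_2\, d(t) + \beta_3\, a(t)\, d(t)\big),
\]

estimated with stabilized inverse-probability-of-treatment-and-censoring weights so the coefficients carry a marginal (population-level) interpretation. The category-specific hazard ratios reported above are contrasts of this linear predictor; for example, women with depressive symptoms who had not initiated ART versus the reference group (ART, no depressive symptoms) corresponds to \(\exp(\beta_2 - \beta_1)\).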
Very-high-risk localized prostate cancer: definition and outcomes.
Sundi, D; Wang, V M; Pierorazio, P M; Han, M; Bivalacqua, T J; Ball, M W; Antonarakis, E S; Partin, A W; Schaeffer, E M; Ross, A E
2014-03-01
Outcomes in men with National Comprehensive Cancer Network (NCCN) high-risk prostate cancer (PCa) can vary substantially: some will have excellent cancer-specific survival, whereas others will experience early metastasis even after aggressive local treatments. Current nomograms, which yield continuous risk probabilities, do not separate high-risk PCa into distinct sub-strata. Here, we derive a binary definition of very-high-risk (VHR) localized PCa to aid in risk stratification at diagnosis and selection of therapy. We queried the Johns Hopkins radical prostatectomy database to identify 753 men with NCCN high-risk localized PCa (Gleason sum 8-10, PSA >20 ng/ml, or clinical stage ≥T3). Twenty-eight alternate permutations of adverse grade, stage and cancer volume were compared by their hazard ratios for metastasis and cancer-specific mortality. VHR criteria with top-ranking hazard ratios were further evaluated by multivariable analyses and inclusion of a clinically meaningful proportion of the high-risk cohort. The VHR cohort was best defined by primary pattern 5 present on biopsy, or ≥5 cores with Gleason sum 8-10, or multiple NCCN high-risk features. These criteria encompassed 15.1% of the NCCN high-risk cohort. Compared with other high-risk men, VHR men were at significantly higher risk for metastasis (hazard ratio 2.75) and cancer-specific mortality (hazard ratio 3.44) (P<0.001 for both). Among high-risk men, VHR men also had significantly worse 10-year metastasis-free survival (37% vs 78%) and cancer-specific survival (62% vs 90%). Men who meet VHR criteria form a subgroup within the current NCCN high-risk classification who have particularly poor oncological outcomes. Use of these characteristics to distinguish VHR localized PCa may help in counseling and selecting optimal candidates for multimodal treatments or clinical trials.
Phenotype at diagnosis predicts recurrence rates in Crohn's disease
Wolters, F L; Russel, M G; Sijbrandij, J; Ambergen, T; Odes, S; Riis, L; Langholz, E; Politi, P; Qasim, A; Koutroubakis, I; Tsianos, E; Vermeire, S; Freitas, J; van Zeijl, G; Hoie, O; Bernklev, T; Beltrami, M; Rodriguez, D; Stockbrügger, R W; Moum, B
2006-01-01
Background In Crohn's disease (CD), studies associating phenotype at diagnosis and subsequent disease activity are important for patient counselling and health care planning. Aims To calculate disease recurrence rates and to correlate these with phenotypic traits at diagnosis. Methods A prospectively assembled uniformly diagnosed European population based inception cohort of CD patients was classified according to the Vienna classification for disease phenotype at diagnosis. Surgical and non‐surgical recurrence rates throughout a 10 year follow up period were calculated. Multivariate analysis was performed to identify risk factors present at diagnosis for recurrent disease. Results A total of 358 patients were classified by phenotype at diagnosis, of whom 262 (73.2%) had a first recurrence and 113 patients (31.6%) a first surgical recurrence during the first 10 years after diagnosis. Patients with upper gastrointestinal disease at diagnosis had an excess risk of recurrence (hazard ratio 1.54 (95% confidence interval (CI) 1.13–2.10)) whereas age ⩾40 years at diagnosis was protective (hazard ratio 0.82 (95% CI 0.70–0.97)). Colonic disease was a protective characteristic for resective surgery (hazard ratio 0.38 (95% CI 0.21–0.69)). More frequent resective surgical recurrences were reported from Copenhagen (hazard ratio 3.23 (95% CI 1.32–7.89)). Conclusions A mild course of disease in terms of disease recurrence was observed in this European cohort. Phenotype at diagnosis had predictive value for disease recurrence with upper gastrointestinal disease being the most important positive predictor. A phenotypic North‐South gradient in CD may be present, illustrated by higher surgery risks in some of the Northern European centres. PMID:16361306
Mortality after a diagnosis of dementia in a population aged 75 and over in Spain.
Llinàs-Regla, Jordi; López-Pousa, Secundino; Vilalta-Franch, Joan; Garre-Olmo, Josep; Román, Gustavo C
2008-01-01
To examine the impact of incident dementia on the risk of death, taking into account other chronic illnesses potentially related to death. Six-year, prospective, two-phase, observational cohort study. 8 municipalities from a rural area in Girona (Spain). A representative community-based cohort of 1,153 adults aged over 70 living at home at study enrolment. Surviving participants underwent detailed clinical evaluation and were assessed by means of the Cambridge Examination for Mental Disorders of the Elderly. Relatives of deceased participants were interviewed using the Retrospective Collateral Dementia Interview. Mortality rates and relative risk of death for subjects with a diagnosis of dementia were calculated. The Cox proportional hazards regression model was used to assess the relationship between mortality and the diagnosis of dementia. In this cohort, 40.0% (n = 49) of the subjects with a diagnosis of dementia died. The mortality rate specific to dementia was 1.0 per 100 person-years. Mortality risk ratios for dementia were 1.79 in men [95% confidence interval (CI) = 1.06-3.02], and 3.14 in women (95% CI = 2.04-4.85). The population death risk attributable to the diagnosis of dementia in our cohort was 11.8%. The most important mortality risks were severe dementia (hazard ratio = 5.7, 95% CI = 3.7-8.6), cancer (hazard ratio = 3.2, 95% CI = 2.2-4.5), heart disease, and an age over 85 (hazard ratio = 1.4, 95% CI = 1.1-1.9). Dementia is a major risk factor for death in advanced age, with the highest mortality rates in women. Moderate and severe dementia was associated with an increased mortality risk even after appropriate control of comorbid conditions. Copyright 2008 S. Karger AG, Basel.
Mortality in former Olympic athletes: retrospective cohort analysis
Zwiers, R; Zantvoord, F W A; van Bodegom, D; van der Ouderaa, F J G; Westendorp, R G J
2012-01-01
Objective To assess the mortality risk in subsequent years (adjusted for year of birth, nationality, and sex) of former Olympic athletes from disciplines with different levels of exercise intensity. Design Retrospective cohort study. Setting Former Olympic athletes. Participants 9889 athletes (with a known age at death) who participated in the Olympic Games between 1896 and 1936, representing 43 types of disciplines with different levels of cardiovascular, static, and dynamic intensity exercise; high or low risk of bodily collision; and different levels of physical contact. Main outcome measure All cause mortality. Results Hazard ratios for mortality among athletes from disciplines with moderate cardiovascular intensity (1.01, 95% confidence interval 0.96 to 1.07) or high cardiovascular intensity (0.98, 0.92 to 1.04) were similar to those in athletes from disciplines with low cardiovascular intensity. The underlying static and dynamic components in exercise intensity showed similar non-significant results. Increased mortality was seen among athletes from disciplines with a high risk of bodily collision (hazard ratio 1.11, 1.06 to 1.15) and with high levels of physical contact (1.16, 1.11 to 1.22). In a multivariate analysis, the effect of high cardiovascular intensity remained similar (hazard ratio 1.05, 0.89 to 1.25); the increased mortality associated with high physical contact persisted (hazard ratio 1.13, 1.06 to 1.21), but that for bodily collision became non-significant (1.03, 0.98 to 1.09) as a consequence of its close relation with physical contact. Conclusions Among former Olympic athletes, engagement in disciplines with high intensity exercise did not bring a survival benefit compared with disciplines with low intensity exercise. Those who engaged in disciplines with high levels of physical contact had higher mortality than other Olympians later in life. PMID:23241269
Pregnancy during breast cancer: does a mother's parity status modify an offspring's mortality risk?
Simonella, Leonardo; Verkooijen, Helena M; Edgren, Gustaf; Liu, Jenny; Hui, Miao; Salim, Agus; Czene, Kamila; Hartman, Mikael
2014-07-01
To assess whether children born to primiparous women around the time of a breast cancer diagnosis have an increased mortality risk. From the merged Swedish Multi-Generation and Cancer Registers, we identified 49,750 eligible children whose mother was diagnosed with breast cancer between 1958 and 2010. Mortality rates in offspring were compared to the background population using standardized mortality ratios (SMR), adjusted for calendar year of birth, attained age, and sex, and calculated for each category of timing of delivery (before, around, or after mother's diagnosis) and mother's parity status. Hazard ratios were assessed using a Cox proportional hazards model and adjusted for socioeconomic status, year of birth and mother's age at birth. Children born to a primiparous woman around a breast cancer diagnosis had a mortality rate five times that of the background population (SMR 5.26, 95 % CI 1.93-11.5), whereas children born to a multiparous woman had a twofold increase (SMR 2.40, 95 % CI 1.10-4.55). Children of primiparous women born around diagnosis had an adjusted hazard ratio four times that of children of primiparous women born before their mother's diagnosis (HR 4.29, 95 % CI 1.68-8.91), whereas hazard ratios for children of primiparous or multiparous women born at other times were not statistically significant. Children born to primiparous women around a breast cancer diagnosis have an increased relative mortality risk. Although relative risk is increased, in absolute terms children born from a cancer-complicated pregnancy do relatively well. Additional investigations are needed to elucidate the reason(s) underlying this observation before the information can be used to inform patient counseling and clinical care.
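The standardized mortality ratios above are observed-to-expected ratios. As a brief sketch of the standard calculation, with strata defined by the adjustment variables listed (calendar year of birth, attained age, and sex):

\[
\mathrm{SMR} \;=\; \frac{O}{E} \;=\; \frac{\sum_{s} d_{s}}{\sum_{s} n_{s}\,\lambda_{s}},
\]

where, within each stratum \(s\), \(d_{s}\) is the observed number of offspring deaths, \(n_{s}\) the person-years they contribute, and \(\lambda_{s}\) the mortality rate of the background population; confidence limits are conventionally obtained from exact Poisson limits for \(O\), divided by \(E\).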
Ade, Carl J; Broxterman, Ryan M; Charvat, Jacqueline M; Barstow, Thomas J
2017-08-07
It is unknown whether the astronaut occupation or exposure to microgravity influences the risk of long-term cardiovascular disease (CVD). This study explored the effects of being a career National Aeronautics and Space Administration (NASA) astronaut on the risk for clinical CVD end points. During the Longitudinal Study of Astronaut Health, data were collected on 310 NASA astronauts and 981 nonastronaut NASA employees. The nonastronauts were matched to the astronauts on age, sex, and body mass index, to evaluate acute and chronic morbidity and mortality. The primary outcomes were composites of clinical CVD end points (myocardial infarction, congestive heart failure, stroke, and coronary artery bypass surgery) or coronary artery disease (CAD) end points (myocardial infarction and coronary artery bypass surgery). Of the astronauts, 5.2% had a clinical CVD end point and 2.9% had a CAD end point, compared with 4.7% and 3.1% of the nonastronaut comparison group, respectively. In the multivariate models adjusted for traditional risk factors, astronauts had a similar risk of CVD compared with nonastronauts (adjusted hazard ratio, 1.08; 95% CI, 0.60-1.93; P=0.80). Risk of a CAD end point was similar between groups (hazard ratio, 0.97; 95% CI, 0.45-2.08; P=0.93). In astronauts with early spaceflight experience, the risks of CVD (hazard ratio, 0.80; 95% CI, 0.25-2.56; P=0.71) and CAD (hazard ratio, 1.23; 95% CI, 0.27-5.61; P=0.79) did not differ from those of astronauts with no spaceflight experience. These findings suggest that being an astronaut is not associated with increased long-term risk of CVD development. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Feldman, C H; Liu, J; Feldman, S; Solomon, D H; Kim, S C
2017-06-01
Objective Prior studies suggest an increased risk of cervical cancer among women with systemic lupus erythematosus. However, the relationship with immunosuppressive drugs is not well studied in US nationwide cohorts. We compared the risk of high-grade cervical dysplasia and cervical cancer among women with systemic lupus erythematosus who started immunosuppressive drugs versus hydroxychloroquine. Methods We identified systemic lupus erythematosus patients initiating immunosuppressive drugs or hydroxychloroquine using claims data from two US commercial health plans and Medicaid (2000-2012). We used a validated claims-based algorithm to identify high-grade cervical dysplasia or cervical cancer. To account for potential confounders, including demographic factors, comorbidities, medication use, HPV vaccination status, and health care utilization, immunosuppressive drug and hydroxychloroquine initiators were matched 1:1 on the propensity score. We used inverse variance-weighted, fixed effect models to pool hazard ratios from the propensity score-matched Medicaid and commercial cohorts. Results We included 2451 matched pairs of immunosuppressive drug and hydroxychloroquine new users in the commercial cohort and 7690 matched pairs in Medicaid. In the commercial cohort, there were 14 cases of cervical dysplasia or cervical cancer among immunosuppressive drug users and five cases among hydroxychloroquine users (hazard ratio 2.47, 95% CI 0.89-6.85, hydroxychloroquine = ref). In Medicaid, there were 46 cases among immunosuppressive drug users and 29 cases in hydroxychloroquine users (hazard ratio 1.24, 95% CI 0.78-1.98, hydroxychloroquine = ref). The pooled hazard ratio of immunosuppressive drugs was 1.40 (95% CI 0.92-2.12). Conclusion Among women with systemic lupus erythematosus, immunosuppressive drugs may be associated with a greater, albeit not statistically significant, risk of high-grade cervical dysplasia and cervical cancer compared to patients receiving hydroxychloroquine alone.
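As an illustration only (not the authors' code), the pooled estimate quoted above can be approximately reproduced by inverse-variance, fixed-effect pooling of the two cohort-specific log hazard ratios, recovering each standard error from its reported confidence interval:

```python
import math

# Cohort-specific hazard ratios and 95% CIs quoted in the abstract.
estimates = {
    "commercial": (2.47, 0.89, 6.85),
    "medicaid":   (1.24, 0.78, 1.98),
}

num = den = 0.0
for hr, lcl, ucl in estimates.values():
    se = (math.log(ucl) - math.log(lcl)) / (2 * 1.96)  # SE of the log HR
    w = 1.0 / se ** 2                                  # inverse-variance weight
    num += w * math.log(hr)
    den += w

log_pooled = num / den
se_pooled = den ** -0.5
print(f"pooled HR {math.exp(log_pooled):.2f} "
      f"(95% CI {math.exp(log_pooled - 1.96 * se_pooled):.2f}-"
      f"{math.exp(log_pooled + 1.96 * se_pooled):.2f})")
# Prints roughly 1.40 (0.91-2.13), close to the 1.40 (0.92-2.12) reported
# above; small differences reflect rounding of the published inputs.
```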
Hsiao, Yi-Han; Chen, Yung-Tai; Tseng, Ching-Ming; Wu, Li-An; Perng, Diahn-Warng; Chen, Yuh-Min; Chen, Tzeng-Ji; Chang, Shi-Chuan; Chou, Kun-Ta
2017-10-01
Sleep disorders are common non-motor symptoms in patients with Parkinson's disease. Our study aims to explore the relationship between non-apnea sleep disorders and future Parkinson's disease. This is a cohort study using a nationwide database. The participants were recruited from the Taiwan National Health Insurance Research Database between 2000 and 2003. A total of 91 273 adult patients who had non-apnea sleep disorders without pre-existing Parkinson's disease were enrolled. An age-, gender-, income-, urbanization- and Charlson comorbidity index score-matched control cohort consisting of 91 273 participants was selected for comparison. The two cohorts were followed for the occurrence of Parkinson's disease, death or until the end of 2010, whichever came first. The Kaplan-Meier analyses revealed patients with non-apnea sleep disorders tended to develop Parkinson's disease (log-rank test, P < 0.001). After a multivariate adjustment in a Cox regression model, non-apnea sleep disorders was an independent risk factor for the development of Parkinson's disease [crude hazard ratio: 1.63, 95% confidence interval (CI): 1.54-1.73, P < 0.001; adjusted hazard ratio: 1.18, 95% CI: 1.11-1.26, P < 0.001]. In the subgroup analysis, patients with chronic insomnia (lasting more than 3 months) had the greatest risk (crude hazard ratio: 2.91, 95% CI: 2.59-3.26, P < 0.001; adjusted hazard ratio: 1.37, 95% CI: 1.21-1.55, P < 0.001). In conclusion, this study revealed that non-apnea sleep disorders, especially chronic insomnia, are associated with a higher risk for future Parkinson's disease. © 2017 European Sleep Research Society.
Zhang, Ruoxi; Chen, Shuyuan; Zhang, Hui; Wang, Wei; Xing, Jianpang; Wang, Yu; Yu, Bo; Hou, Jingbo
2016-09-28
This retrospective study aimed to investigate the predictive value of biomarkers for in-hospital mortality in patients with Stanford type A acute aortic dissection (AAD). AAD is a life-threatening disease with an incidence of about 2.6-3.6 cases per 100,000/year. A total of 67 consecutive Stanford type A AAD patients admitted to hospital were divided into a deceased group and a survival group. The baseline information of the two groups was systematically compared, followed by examination of the electrocardiograms (ECG). Based on the follow-up during hospitalization, we simultaneously assessed indexes including fragmented QRS complex (fQRS), admission systolic blood pressure (SBP), aortic diameter, surgical management, troponin I (TnI), white blood cell (WBC) count, N-terminal pro-brain natriuretic peptide (NT-proBNP), and D-dimer. The levels of TnI and NT-proBNP, WBC counts, and the rate of fQRS (+) in the deceased group were significantly higher than those in the survival group. Male sex (hazard ratio, 10.88; P = 0.001), admission SBP (hazard ratio, 0.98; P = 0.012), NT-proBNP (hazard ratio, 1.00; P = 0.001), and WBC count (hazard ratio, 1.10; P = 0.033) were independently related to in-hospital death. As a single marker, WBC count had the highest sensitivity at 84.6% (specificity 65.9%). Admission SBP, NT-proBNP, and WBC count were potential independent risk factors for in-hospital death in Stanford type A AAD patients, and WBC count may be the most accurate single predictor among them.
Denburg, Michelle R.; Jemielita, Thomas; Tasian, Gregory; Haynes, Kevin; Mucksavage, Phillip; Shults, Justine; Copelovitch, Lawrence
2015-01-01
In this study we sought to determine whether, among individuals with urolithiasis, extracorporeal shock wave lithotripsy (SWL) and ureteroscopy (URS) are associated with a higher risk of incident arterial hypertension (HTN) and/or chronic kidney disease (CKD). This was assessed in a population-based retrospective study of 11,570 participants with incident urolithiasis and 127,464 without urolithiasis in The Health Improvement Network. Patients with pre-existing HTN and CKD were excluded. The study included 1319 and 919 urolithiasis patients with at least one SWL or URS procedure, respectively. Multivariable Cox regression was used to estimate the hazard ratio for incident CKD stage 3–5 and HTN in separate analyses. Over a median of 3.7 and 4.1 years, 1423 and 595 urolithiasis participants developed HTN and CKD, respectively. Urolithiasis was associated with significantly increased hazards of both HTN (hazard ratio 1.42; 95% CI: 1.35, 1.51) and CKD (1.82; 1.67, 1.98). SWL was associated with a significantly increased risk of HTN (1.34; 1.15, 1.57), while ureteroscopy was not. When further stratified as SWL to the kidney or ureter, only SWL to the kidney was significantly and independently associated with HTN (1.40; 1.19, 1.66). Neither SWL nor ureteroscopy was associated with incident CKD. Since urolithiasis itself was associated with a hazard ratio of 1.42 for HTN, an individual who undergoes SWL to the kidney can be expected to have a significantly increased hazard ratio for HTN of 1.96 (1.67, 2.29) compared to an individual without urolithiasis. PMID:26509587
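The combined estimate in the final sentence is consistent with treating the urolithiasis and SWL-to-kidney effects as multiplicative on the hazard scale; as a rounding-sensitive back-of-the-envelope check (the reported 1.96 comes from the fitted models rather than this shortcut):

\[
\mathrm{HR}_{\text{SWL to kidney vs. no urolithiasis}} \;\approx\; 1.42 \times 1.40 \;\approx\; 1.99,
\]

close to the reported 1.96 (1.67, 2.29).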
Kuo, Ling; Chao, Tze-Fan; Liu, Chia-Jen; Lin, Yenn-Jiang; Chang, Shih-Lin; Lo, Li-Wei; Hu, Yu-Feng; Tuan, Ta-Chuan; Liao, Jo-Nan; Chung, Fa-Po; Chen, Tzeng-Ji; Lip, Gregory Y H; Chen, Shih-Ann
2017-06-23
Patients with liver cirrhosis have been excluded from randomized clinical trials of oral anticoagulation therapy for stroke prevention in atrial fibrillation. We hypothesized that patients with liver cirrhosis would have a positive net clinical benefit for oral anticoagulation when used for stroke prevention in atrial fibrillation. This study used the National Health Insurance Research Database in Taiwan. Among 289,559 atrial fibrillation patients aged ≥20 years, there were 10,336 with liver cirrhosis, and 9056 of them having a CHA2DS2-VASc score ≥2 were divided into 3 groups, that is, no treatment, antiplatelet therapy, and warfarin. Patients with liver cirrhosis had a higher risk of ischemic stroke (hazard ratio=1.10, P=0.046) and intracranial hemorrhage (hazard ratio=1.20, P=0.043) compared with those without. Among patients with liver cirrhosis, patients taking antiplatelet therapy had a similar risk of ischemic stroke (hazard ratio=1.02, 95% CI=0.88-1.18) compared to those without antithrombotic therapies, but the risk was significantly lowered among warfarin users (hazard ratio=0.76, 95% CI=0.58-0.99). For intracranial hemorrhage, there were no significant differences between those untreated and those taking antiplatelet therapy or warfarin. The use of warfarin was associated with a positive net clinical benefit compared with being untreated or receiving only antiplatelet therapy. For atrial fibrillation patients with liver cirrhosis in the current analysis of an observational study, warfarin use was associated with a lower risk of ischemic stroke and a positive net clinical benefit compared with nontreatment, and thus, thromboprophylaxis should be considered for such patients. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Gozé, Catherine; Blonski, Marie; Le Maistre, Guillaume; Bauchet, Luc; Dezamis, Edouard; Page, Philippe; Varlet, Pascale; Capelle, Laurent; Devaux, Bertrand; Taillandier, Luc; Duffau, Hugues; Pallud, Johan
2014-01-01
Background We explored whether spontaneous imaging tumor growth (estimated by the velocity of diametric expansion) and isocitrate dehydrogenase 1 (IDH1) mutation (estimated by IDH1 immunoexpression) were independent predictors of long-term outcomes of diffuse low-grade gliomas in adults. Methods One hundred thirty-one adult patients with newly diagnosed supratentorial diffuse low-grade gliomas were retrospectively studied. Results Isocitrate dehydrogenase 1 mutations were present in 107 patients. The mean spontaneous velocity of diametric expansion was 5.40 ± 5.46 mm/y. During follow-up (mean, 70 ± 54.7 mo), 56 patients presented a malignant transformation and 23 died. The median malignant progression-free survival and the overall survival were significantly longer in cases of slow velocity of diametric expansion (149 and 198 mo, respectively) than in cases of fast velocity of diametric expansion (46 and 82 mo; P < .001 and P < .001, respectively) and in cases with IDH1 mutation (100 and 198 mo, respectively) than in cases without IDH1 mutation (72 mo and not reached; P = .028 and P = .001, respectively). In multivariate analyses, spontaneous velocity of diametric expansion and IDH1 mutation were independent prognostic factors for malignant progression-free survival (P < .001; hazard ratio, 4.23; 95% CI, 1.81–9.40 and P = .019; hazard ratio, 2.39; 95% CI, 1.19–4.66, respectively) and for overall survival (P < .001; hazard ratio, 26.3; 95% CI, 5.42–185.2 and P = .007; hazard ratio, 17.89; 95% CI, 2.15–200.1, respectively). Conclusions The spontaneous velocity of diametric expansion and IDH1 mutation status are 2 independent prognostic values that should be obtained at the beginning of the management of diffuse low-grade gliomas in adults. PMID:24847087
van Kleef, Monique E A M; Visseren, Frank L J; Vernooij, Joris W P; Nathoe, Hendrik M; Cramer, Maarten-Jan M; Bemelmans, Remy H H; van der Graaf, Yolanda; Spiering, Wilko
2018-06-06
The relation between different electrocardiographic left ventricular hypertrophy (ECG-LVH) criteria and cardiovascular risk in patients with clinically manifest arterial disease is unclear. Therefore, we determined the association between four ECG-LVH criteria (Sokolow-Lyon, Cornell product, Cornell/strain index and Framingham criterion) and the risk of cardiovascular events and mortality in this population. Risk of cardiovascular events was estimated in 6913 adult patients with clinically manifest arterial disease originating from the Secondary Manifestations of ARTerial disease (SMART) cohort. Cox proportional hazards regression was used to estimate the associations between the four ECG-LVH criteria and the primary composite outcome (myocardial infarction (MI), stroke or cardiovascular death) and the secondary outcomes (MI, stroke and all-cause mortality), adjusted for confounders. The highest prevalence of ECG-LVH was observed for Cornell product (10%) and Cornell/strain index (9%). All four ECG-LVH criteria were associated with an increased risk of the primary composite endpoint: Sokolow-Lyon (hazard ratio 1.37, 95% CI 1.13-1.66), Cornell product (hazard ratio 1.54, 95% CI 1.30-1.82), Cornell/strain index (hazard ratio 1.70, 95% CI 1.44-2.00) and Framingham criterion (hazard ratio 1.78, 95% CI 1.21-2.62). Cornell product, Cornell/strain index and Framingham criterion ECG-LVH were additionally associated with an elevated risk of the secondary outcomes. Cardiovascular risk increased when two, or three or more, ECG-LVH criteria were present concurrently. All four ECG-LVH criteria are associated with an increased risk of cardiovascular events. As Cornell/strain index is both highly prevalent and carries a high cardiovascular risk, it is likely the most relevant ECG-LVH criterion for clinical practice.
Racial disparity in cardiac procedures and mortality among long-term survivors of cardiac arrest.
Groeneveld, Peter W; Heidenreich, Paul A; Garber, Alan M
2003-07-22
It is unknown whether white and black Medicare beneficiaries have different rates of cardiac procedure utilization or long-term survival after cardiac arrest. A total of 5948 elderly Medicare beneficiaries (5429 white and 519 black) were identified who survived to hospital discharge between 1990 and 1999 after admission for cardiac arrest. Demographic, socioeconomic, and clinical information about these patients was obtained from Medicare administrative files, the US census, and the American Hospital Association's annual institutional survey. A Cox proportional hazard model that included demographic and clinical predictors indicated a hazard ratio for mortality of 1.30 (95% CI 1.09 to 1.55) for blacks aged 66 to 74 years compared with whites of the same age. The addition of cardiac procedures to this model lowered the hazard ratio for blacks to 1.23 (95% CI 1.03 to 1.46). In analyses stratified by race, implantable cardioverter-defibrillators (ICDs) had a mortality hazard ratio of 0.53 (95% CI 0.45 to 0.62) for white patients and 0.50 (95% CI 0.27 to 0.91) for black patients. Logistic regression models that compared procedure rates between races indicated odds ratios for blacks aged 66 to 74 years of 0.58 (95% CI 0.36 to 0.94) to receive an ICD and 0.50 (95% CI 0.34 to 0.75) to receive either revascularization or an ICD. There is racial disparity in long-term mortality among elderly cardiac arrest survivors. Both black and white patients benefited from ICD implantation, but blacks were less likely to undergo this potentially life-saving procedure. Lower rates of cardiac procedures may explain in part the lower survival rates among black patients.
Thrombin-receptor antagonist vorapaxar in acute coronary syndromes.
Tricoci, Pierluigi; Huang, Zhen; Held, Claes; Moliterno, David J; Armstrong, Paul W; Van de Werf, Frans; White, Harvey D; Aylward, Philip E; Wallentin, Lars; Chen, Edmond; Lokhnygina, Yuliya; Pei, Jinglan; Leonardi, Sergio; Rorick, Tyrus L; Kilian, Ann M; Jennings, Lisa H K; Ambrosio, Giuseppe; Bode, Christoph; Cequier, Angel; Cornel, Jan H; Diaz, Rafael; Erkan, Aycan; Huber, Kurt; Hudson, Michael P; Jiang, Lixin; Jukema, J Wouter; Lewis, Basil S; Lincoff, A Michael; Montalescot, Gilles; Nicolau, José Carlos; Ogawa, Hisao; Pfisterer, Matthias; Prieto, Juan Carlos; Ruzyllo, Witold; Sinnaeve, Peter R; Storey, Robert F; Valgimigli, Marco; Whellan, David J; Widimsky, Petr; Strony, John; Harrington, Robert A; Mahaffey, Kenneth W
2012-01-05
Vorapaxar is a new oral protease-activated-receptor 1 (PAR-1) antagonist that inhibits thrombin-induced platelet activation. In this multinational, double-blind, randomized trial, we compared vorapaxar with placebo in 12,944 patients who had acute coronary syndromes without ST-segment elevation. The primary end point was a composite of death from cardiovascular causes, myocardial infarction, stroke, recurrent ischemia with rehospitalization, or urgent coronary revascularization. Follow-up in the trial was terminated early after a safety review. After a median follow-up of 502 days (interquartile range, 349 to 667), the primary end point occurred in 1031 of 6473 patients receiving vorapaxar versus 1102 of 6471 patients receiving placebo (Kaplan-Meier 2-year rate, 18.5% vs. 19.9%; hazard ratio, 0.92; 95% confidence interval [CI], 0.85 to 1.01; P=0.07). A composite of death from cardiovascular causes, myocardial infarction, or stroke occurred in 822 patients in the vorapaxar group versus 910 in the placebo group (14.7% and 16.4%, respectively; hazard ratio, 0.89; 95% CI, 0.81 to 0.98; P=0.02). Rates of moderate and severe bleeding were 7.2% in the vorapaxar group and 5.2% in the placebo group (hazard ratio, 1.35; 95% CI, 1.16 to 1.58; P<0.001). Intracranial hemorrhage rates were 1.1% and 0.2%, respectively (hazard ratio, 3.39; 95% CI, 1.78 to 6.45; P<0.001). Rates of nonhemorrhagic adverse events were similar in the two groups. In patients with acute coronary syndromes, the addition of vorapaxar to standard therapy did not significantly reduce the primary composite end point but significantly increased the risk of major bleeding, including intracranial hemorrhage. (Funded by Merck; TRACER ClinicalTrials.gov number, NCT00527943.).
Liu, Shu-Guang; Gao, Chao; Zhang, Rui-Dong; Zhao, Xiao-Xi; Cui, Lei; Li, Wei-Jing; Chen, Zhen-Ping; Yue, Zhi-Xia; Zhang, Yuan-Yuan; Wu, Min-Yuan; Wang, Jian-Xiang; Li, Zhi-Gang; Zheng, Hu-Yong
2017-06-06
High-dose methotrexate (HDMTX) plays an important role in the treatment of acute lymphoblastic leukemia (ALL) although there is great inter-patient variability in the efficacy and toxicity of MTX. The relationship between polymorphisms in genes encoding MTX transporters and MTX response is controversial. In the present study, 322 Chinese children with standard- and intermediate-risk ALL were genotyped for 12 polymorphisms. SLCO1B1 rs10841753 showed a significant association with plasma MTX levels at 48 h (P = 0.017). Patients who had the ABCB1 rs1128503 C allele had longer duration of hospitalization than did those with the TT genotype (P = 0.006). No association was found between oral mucositis and any polymorphism. Long-term outcome was worse in patients with the SLCO1B1 rs4149056 CC genotype than in patients with TT or TC (5-year event-free survival [EFS] 33.3 ± 19.2% vs. 90.5 ± 1.7%, P < 0.001), and was worse in patients with the SLC19A1 rs2838958 AA genotype than in patients with AG or GG (5-year EFS 78.5 ± 4.6% vs. 92.2 ± 1.8%, P = 0.008). Multiple Cox regression analyses revealed associations of minimal residual disease (MRD) at day 33 (hazard ratio 3.458; P = 0.002), MRD at day 78 (hazard ratio 6.330; P = 0.001), SLCO1B1 rs4149056 (hazard ratio 12.242; P < 0.001), and SLC19A1 rs2838958 (hazard ratio 2.324; P = 0.019) with EFS. Our findings show that polymorphisms in genes encoding MTX transporters substantially influence the kinetics of and response to HDMTX therapy in childhood ALL.
The performance of different propensity score methods for estimating marginal hazard ratios.
Austin, Peter C
2013-07-20
Propensity score methods are increasingly being used to reduce or minimize the effects of confounding when estimating the effects of treatments, exposures, or interventions when using observational or non-randomized data. Under the assumption of no unmeasured confounders, previous research has shown that propensity score methods allow for unbiased estimation of linear treatment effects (e.g., differences in means or proportions). However, in biomedical research, time-to-event outcomes occur frequently. There is a paucity of research into the performance of different propensity score methods for estimating the effect of treatment on time-to-event outcomes. Furthermore, propensity score methods allow for the estimation of marginal or population-average treatment effects. We conducted an extensive series of Monte Carlo simulations to examine the performance of propensity score matching (1:1 greedy nearest-neighbor matching within propensity score calipers), stratification on the propensity score, inverse probability of treatment weighting (IPTW) using the propensity score, and covariate adjustment using the propensity score to estimate marginal hazard ratios. We found that both propensity score matching and IPTW using the propensity score allow for the estimation of marginal hazard ratios with minimal bias. Of these two approaches, IPTW using the propensity score resulted in estimates with lower mean squared error when estimating the effect of treatment in the treated. Stratification on the propensity score and covariate adjustment using the propensity score result in biased estimation of both marginal and conditional hazard ratios. Applied researchers are encouraged to use propensity score matching and IPTW using the propensity score when estimating the relative effect of treatment on time-to-event outcomes. Copyright © 2012 John Wiley & Sons, Ltd.
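A minimal sketch of the IPTW approach recommended above, under assumed variable names and simulated data (this is not the paper's simulation code): estimate a propensity score, form stabilized inverse-probability-of-treatment weights, and fit a weighted Cox model containing only the treatment indicator, with a robust variance estimator.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)

# Simulated observational data: two confounders, a binary treatment whose
# probability depends on them, and an exponential event time that depends
# on both the treatment (conditional log hazard ratio 0.7) and confounders.
n = 5000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
treat = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * x1 - 0.5 * x2))))
rate = 0.1 * np.exp(0.7 * treat + 0.5 * x1 + 0.5 * x2)
time = rng.exponential(1 / rate)
event = (time < 5).astype(int)          # administrative censoring at t = 5
time = np.minimum(time, 5)

df = pd.DataFrame({"x1": x1, "x2": x2, "treat": treat,
                   "time": time, "event": event})

# 1) Propensity score model.
ps = (LogisticRegression()
      .fit(df[["x1", "x2"]], df["treat"])
      .predict_proba(df[["x1", "x2"]])[:, 1])

# 2) Stabilized inverse-probability-of-treatment weights (ATE weights).
p_marg = df["treat"].mean()
df["w"] = np.where(df["treat"] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

# 3) Weighted Cox model with only the treatment indicator; robust=True
#    requests a sandwich variance that accounts for the weighting.
cph = CoxPHFitter()
cph.fit(df[["treat", "time", "event", "w"]], duration_col="time",
        event_col="event", weights_col="w", robust=True)
cph.print_summary()
# The estimated marginal hazard ratio is attenuated toward 1 relative to
# exp(0.7) because the hazard ratio is non-collapsible, which is exactly
# the marginal-versus-conditional distinction discussed above.
```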
Bunch, T Jared; May, Heidi T; Bair, Tami L; Anderson, Jeffrey L; Crandall, Brian G; Cutler, Michael J; Jacobs, Victoria; Mallender, Charles; Muhlestein, Joseph B; Osborn, Jeffrey S; Weiss, J Peter; Day, John D
2015-12-01
There is a paucity of data about the long-term natural history of adult Wolff-Parkinson-White syndrome (WPW) patients in regard to risk of mortality and atrial fibrillation. We sought to describe the long-term outcomes of WPW patients and ascertain the impact of ablation on the natural history. Three groups of patients were studied: 2 WPW populations (ablation: 872, no ablation: 1461) and a 1:5 control population (n=11 175). Long-term mortality and atrial fibrillation rates were determined. The average follow-up for the WPW group was 7.9±5.9 (median: 6.9) years and was similar between the ablation and nonablation groups. Death rates were similar between the WPW group and the control group (hazard ratio, 0.96; 95% confidence interval, 0.83-1.11; P=0.56). Nonablated WPW patients had a higher long-term death risk compared with ablated WPW patients (hazard ratio, 2.10; 95% confidence interval: 1.50-20.93; P<0.0001). Incident atrial fibrillation risk was higher in the WPW group compared with the control population (hazard ratio, 1.55; 95% confidence interval, 1.29-1.87; P<0.0001). Nonablated WPW patients had a lower incident atrial fibrillation risk than ablated patients (hazard ratio, 0.39; 95% confidence interval, 0.28-0.53; P<0.0001). Long-term mortality rates in WPW patients are low and similar to an age-matched and gender-matched control population. WPW patients who underwent the multifactorial process of ablation had a lower mortality compared with nonablated WPW patients. Atrial fibrillation rates are high long-term, and ablation does not reduce this risk. © 2015 American Heart Association, Inc.
Wu, Shu-I; Chen, Su-Chiu; Liu, Shen-Ing; Sun, Fang-Ju; Juang, Jimmy J M; Lee, Hsin-Chien; Kao, Kai-Liang; Dewey, Michael E; Prince, Martin; Stewart, Robert
2015-01-01
Despite high mortality associated with serious mental illness, risk of acute myocardial infarction (AMI) remains unclear, especially for patients with bipolar disorder. The main objective was to investigate the relative risk of AMI associated with schizophrenia and bipolar disorders in a national sample. Using nationwide administrative data, an 11-year historic cohort study was assembled, comprising cases aged 18 and above who had received a diagnosis of schizophrenia or bipolar disorder, compared to a random sample of all other adults excluding those with diagnoses of serious mental illness. Incident AMI as a primary diagnosis was ascertained. Hazard ratios stratified by age and gender were calculated and Cox regression models were used to adjust for other covariates. A total of 70,225 people with schizophrenia or bipolar disorder and 207,592 people without serious mental illness were compared. Hazard ratios in men adjusted for age, income and urbanization were 1.15 (95% CI 1.01-1.32) for schizophrenia and 1.37 (1.08-1.73) for bipolar disorder, and in women, 1.85 (1.58-2.18) and 1.88 (1.47-2.41) respectively. Further adjustment for treated hypertension, diabetes and hyperlipidaemia attenuated the hazard ratio for men with schizophrenia but not the other comparison groups. Hazard ratios were significantly stronger in women than men and were stronger in younger compared to older age groups for both disorders; however, gender modification was only significant in people with schizophrenia, and age modification only significant in people with bipolar disorder. In this large national sample, schizophrenia and bipolar disorder were associated with raised risk of AMI in women and in the younger age groups, although the two disorders differed in potential confounding and modifying factors.
Propensity-Matched Mortality Comparison of Incident Hemodialysis and Peritoneal Dialysis Patients
Weinhandl, Eric D.; Gilbertson, David T.; Arneson, Thomas J.; Snyder, Jon J.; Collins, Allan J.
2010-01-01
Contemporary comparisons of mortality in matched hemodialysis and peritoneal dialysis patients are lacking. We aimed to compare survival of incident hemodialysis and peritoneal dialysis patients by intention-to-treat analysis in a matched-pair cohort and in subsets defined by age, cardiovascular disease, and diabetes. We matched 6337 patient pairs from a retrospective cohort of 98,875 adults who initiated dialysis in 2003 in the United States. In the primary intention-to-treat analysis of survival from day 0, cumulative survival was higher for peritoneal dialysis patients than for hemodialysis patients (hazard ratio 0.92; 95% CI 0.86 to 1.00, P = 0.04). Cumulative survival probabilities for peritoneal dialysis versus hemodialysis were 85.8% versus 80.7% (P < 0.01), 71.1% versus 68.0% (P < 0.01), 58.1% versus 56.7% (P = 0.25), and 48.4% versus 47.3% (P = 0.50) at 12, 24, 36, and 48 months, respectively. Peritoneal dialysis was associated with improved survival compared with hemodialysis among subgroups with age <65 years, no cardiovascular disease, and no diabetes. In a sensitivity analysis of survival from 90 days after initiation, we did not detect a difference in survival between modalities overall (hazard ratio 1.05; 95% CI 0.96 to 1.16), but hemodialysis was associated with improved survival among subgroups with cardiovascular disease and diabetes. In conclusion, despite hazard ratio heterogeneity across patient subgroups and nonconstant hazard ratios during the follow-up period, the overall intention-to-treat mortality risk after dialysis initiation was 8% lower for peritoneal dialysis than for matched hemodialysis patients. These data suggest that increased use of peritoneal dialysis may benefit incident ESRD patients. PMID:20133483
Carlsson, Axel C; Li, Xinjun; Holzmann, Martin J; Ärnlöv, Johan; Wändell, Per; Gasevic, Danijela; Sundquist, Jan; Sundquist, Kristina
2017-10-01
Objective We aimed to study the association between neighborhood socioeconomic status at the age of 40 years and risk of ischemic stroke before the age of 50 years. Methods All individuals in Sweden were included if their 40th birthday occurred between 1998 and 2010. National registers were used to categorize neighborhood socioeconomic status into high, middle, and low and to retrieve information on incident ischemic strokes. Hazard ratios and their 95% confidence intervals were estimated. Results A total of 1,153,451 adults (women 48.9%) were followed for a mean of 5.5 years (SD 3.5 years), during which 1777 (0.30%) strokes among men and 1374 (0.24%) strokes among women were recorded. After adjustment for sex, marital status, education level, immigrant status, region of residence, and neighborhood services, there was a lower risk of stroke in residents from high-socioeconomic status neighborhoods (hazard ratio 0.87, 95% confidence interval 0.78-0.96), and an increased risk of stroke in adults from low-socioeconomic status neighborhoods (hazard ratio 1.16, 95% confidence interval 1.06-1.27), compared to their counterparts living in middle-socioeconomic status neighborhoods. After further adjustment for hospital diagnoses of hypertension, diabetes, heart failure, and atrial fibrillation prior to the age of 40, the higher risk in neighborhoods with low socioeconomic status was attenuated, but remained significant (hazard ratio 1.12, 95% confidence interval 1.02-1.23). Conclusions In a nationwide study of individuals between 40 and 50 years, we found that the risk of ischemic stroke differed depending on neighborhood socioeconomic status, which calls for increased efforts to prevent cardiovascular diseases in low socioeconomic status neighborhoods.
Rubio-Tapia, Alberto; Malamut, Georgia; Verbeek, Wieke H.M.; van Wanrooij, Roy L.J.; Leffler, Daniel A.; Niveloni, Sonia I.; Arguelles-Grande, Carolina; Lahr, Brian D.; Zinsmeister, Alan R.; Murray, Joseph A.; Kelly, Ciaran P.; Bai, Julio C.; Green, Peter H.; Daum, Severin; Mulder, Chris J.J.; Cellier, Christophe
2016-01-01
Background Refractory coeliac disease is a severe complication of coeliac disease with heterogeneous outcome. Aim To create a prognostic model to estimate survival of patients with refractory coeliac disease. Methods We evaluated predictors of 5-year mortality using Cox proportional hazards regression on subjects from a multinational registry. Bootstrap re-sampling was used to internally validate the individual factors and overall model performance. The mean of the estimated regression coefficients from 400 bootstrap models was used to derive a risk score for 5-year mortality. Results The multinational cohort was composed of 232 patients diagnosed with refractory coeliac disease across 7 centers (range of 11–63 cases per center). The median age was 53 years and 150 (64%) were women. A total of 51 subjects died during 5-year follow-up (cumulative 5-year all-cause mortality = 30%). From a multiple variable Cox proportional hazards model, the following variables were significantly associated with 5-year mortality: age at refractory coeliac disease diagnosis (per 20 year increase, hazard ratio = 2.21; 95% confidence interval: 1.38, 3.55), abnormal intraepithelial lymphocytes (hazard ratio = 2.85; 95% confidence interval: 1.22, 6.62), and albumin (per 0.5 unit increase, hazard ratio = 0.72; 95% confidence interval: 0.61, 0.85). A simple weighted 3-factor risk score was created to estimate 5-year survival. Conclusions Using data from a multinational registry and previously-reported risk factors, we create a prognostic model to predict 5-year mortality among patients with refractory coeliac disease. This new model may help clinicians to guide treatment and follow-up. PMID:27485029
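As an illustration of how a weighted score can be assembled from the hazard ratios quoted in this abstract, here is a hedged sketch. The coefficients below are simply the natural logarithms of the reported hazard ratios; the exact point weights of the published three-factor score are not reproduced here, and the variable names, scaling, and albumin units are illustrative assumptions.

```python
import math

# Hedged sketch of a linear predictor (log-hazard scale) built from the
# hazard ratios reported above; names and scaling are illustrative only.
BETA_AGE_PER_20Y = math.log(2.21)    # per 20-year increase in age at diagnosis
BETA_ABNORMAL_IEL = math.log(2.85)   # abnormal intraepithelial lymphocytes (yes/no)
BETA_ALB_PER_HALF_UNIT = math.log(0.72)  # per 0.5-unit increase in albumin

def linear_predictor(age_years: float, abnormal_iel: bool, albumin: float) -> float:
    """Higher values indicate a higher estimated 5-year mortality hazard."""
    return (BETA_AGE_PER_20Y * (age_years / 20.0)
            + BETA_ABNORMAL_IEL * (1.0 if abnormal_iel else 0.0)
            + BETA_ALB_PER_HALF_UNIT * (albumin / 0.5))

# Example: an older patient with abnormal IELs and low albumin scores higher
# than a younger patient with normal IELs and higher albumin.
# linear_predictor(70, True, 3.0) > linear_predictor(45, False, 4.0)
```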
Austin, Peter C; Wagner, Philippe; Merlo, Juan
2017-03-15
Multilevel data occurs frequently in many research areas like health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models (MLRM). MLRM incorporate cluster-specific random effects which allow one to partition the total individual variance into between-cluster variation and between-individual variation. Statistically, MLRM account for the dependency of the data within clusters and provide correct estimates of uncertainty around regression coefficients. Substantively, the magnitude of the effect of clustering provides a measure of the General Contextual Effect (GCE). When outcomes are binary, the GCE can also be quantified by measures of heterogeneity like the Median Odds Ratio (MOR) calculated from a multilevel logistic regression model. Time-to-event outcomes within a multilevel structure occur commonly in epidemiological and medical research. However, the Median Hazard Ratio (MHR) that corresponds to the MOR in multilevel (i.e., 'frailty') Cox proportional hazards regression is rarely used. Analogously to the MOR, the MHR is the median relative change in the hazard of the occurrence of the outcome when comparing identical subjects from two randomly selected different clusters that are ordered by risk. We illustrate the application and interpretation of the MHR in a case study analyzing the hazard of mortality in patients hospitalized for acute myocardial infarction at hospitals in Ontario, Canada. We provide R code for computing the MHR. The MHR is a useful and intuitive measure for expressing cluster heterogeneity in the outcome and, thereby, estimating general contextual effects in multilevel survival analysis. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
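For reference, a sketch of the closed form commonly used for the MHR (this is not the authors' R code): assuming cluster random effects that are normally distributed on the log-hazard scale with variance sigma^2, the MHR takes the same form as the MOR.

```latex
% Sketch: MHR for a Cox frailty model with cluster random effects
% u_j ~ N(0, sigma^2) on the log-hazard scale, by analogy with the MOR.
\[
  \mathrm{MHR} \;=\; \exp\!\Bigl(\sqrt{2\sigma^{2}}\,\Phi^{-1}(0.75)\Bigr)
  \;\approx\; \exp\!\bigl(0.954\,\sigma\bigr),
\]
% where \Phi^{-1} is the standard-normal quantile function; for example,
% sigma^2 = 0.25 gives MHR \approx exp(0.954 \times 0.5) \approx 1.61.
```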
Turner, Caitlin M; Coffin, Phillip; Santos, Deirdre; Huffaker, Shannon; Matheson, Tim; Euren, Jason; DeMartini, Anna; Rowe, Chris; Batki, Steven; Santos, Glenn-Milo
2017-04-01
Ecological momentary assessments (EMA) are data collection approaches that characterize behaviors in real-time. However, EMA is underutilized in alcohol and substance use research among men who have sex with men (MSM). The aim of this analysis is to explore the correlates of engagement in EMA text messages among substance-using MSM in San Francisco. The present analysis uses data collected from the Project iN pilot study (n=30). Over a two-month period, participants received and responded to EMA daily text messages inquiring about their study medication, alcohol, and methamphetamine use. Baseline characteristics including demographics, alcohol use, and substance use were examined as potential correlates of engagement in EMA text messages in logistic regression and proportional hazards models. Participants had a 74% response rate to EMA text messages over the study period. MSM of color had significantly lower adjusted odds of responding to EMA texts 80% of the time or more, compared to white MSM (adjusted odds ratio=0.05, 95%CI=0.01-0.38). College-educated MSM had a lower adjusted hazard of week-long discontinuation in EMA texts (adjusted hazard ratio=0.12, 95%CI=0.02-0.63). Older MSM had a higher adjusted hazard of week-long discontinuation in EMA texts (adjusted hazard ratio=1.15, 95%CI=1.01-1.31). Differences in engagement in EMA text prompts were discovered for MSM with different racial/ethnic backgrounds, ages, and education levels. Substance use variables were not correlated with engagement in text messages, suggesting that EMA may be a useful research tool among actively substance-using MSM in San Francisco. Published by Elsevier Inc.
How do outcomes compare between women and men living with HIV in Australia? An observational study.
Giles, Michelle L; Zapata, Marin C; Wright, Stephen T; Petoumenos, Kathy; Grotowski, Miriam; Broom, Jennifer; Law, Matthew G; O'Connor, Catherine C
2016-04-01
Background Gender differences vary across geographical settings and are poorly reported in the literature. The aim of this study was to evaluate demographics and clinical characteristics of participants from the Australian HIV Observational Database (AHOD), and to explore any differences between females and males in the rate of new clinical outcomes, as well as initial immunological and virological response to antiretroviral therapy. Time to a new clinical end-point, all-cause mortality and/or AIDS illness was analysed using standard survival methods. Univariate and covariate-adjusted Cox proportional hazards models were used to evaluate the time to plasma viral load suppression in all patients who initiated antiretroviral therapy (ART) and time to switching from a first-line ART to a second-line ART regimen. There was no significant difference between females and males for the hazard of all-cause mortality [adjusted hazard ratio: 0.98 (0.51, 1.55), P=0.67], new AIDS illness [adjusted hazard ratio: 0.75 (0.38, 1.48), P=0.41] or a composite end-point [adjusted hazard ratio: 0.74 (0.45, 1.21), P=0.23]. Incidence rates of all-cause mortality were similar between females and males; 1.14 (0.61, 1.95) vs 1.28 (1.12, 1.45) per 100 person years. Virological response to ART was similar for females and males when measured as time to viral suppression and/or time to virological failure. This study supports current Australian HIV clinical care as providing equivalent standards of care for male and female HIV-positive patients. Future studies should compare ART-associated toxicity differences between men and women living with HIV in Australia.
How do outcomes compare between women and men living with HIV in Australia? An observational study
Giles, Michelle L.; Zapata, Marin C.; Wright, Stephen T.; Petoumenos, Kathy; Grotowski, Miriam; Broom, Jennifer; Law, Matthew G.; O’Connor, Catherine C.
2018-01-01
Background Gender differences vary across geographical settings and are poorly reported in the literature. The aim of this study was to evaluate demographics and clinical characteristics of participants from the Australian HIV Observational Database (AHOD), and to explore any differences between females and males in the rate of new clinical outcomes, as well as initial immunological and virological response to antiretroviral therapy. Methods Time to a new clinical end-point, all-cause mortality and/or AIDS illness was analysed using standard survival methods. Univariate and covariate-adjusted Cox proportional hazards models were used to evaluate the time to plasma viral load suppression in all patients who initiated antiretroviral therapy (ART) and time to switching from a first-line ART to a second-line ART regimen. Results There was no significant difference between females and males for the hazard of all-cause mortality [adjusted hazard ratio: 0.98 (0.51, 1.55), P = 0.67], new AIDS illness [adjusted hazard ratio: 0.75 (0.38, 1.48), P = 0.41] or a composite end-point [adjusted hazard ratio: 0.74 (0.45, 1.21), P = 0.23]. Incidence rates of all-cause mortality were similar between females and males; 1.14 (0.61, 1.95) vs 1.28 (1.12, 1.45) per 100 person years. Virological response to ART was similar for females and males when measured as time to viral suppression and/or time to virological failure. Conclusion This study supports current Australian HIV clinical care as providing equivalent standards of care for male and female HIV-positive patients. Future studies should compare ART-associated toxicity differences between men and women living with HIV in Australia. PMID:26827052
Wagner, Philippe; Merlo, Juan
2016-01-01
Multilevel data occurs frequently in many research areas like health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models (MLRM). MLRM incorporate cluster‐specific random effects which allow one to partition the total individual variance into between‐cluster variation and between‐individual variation. Statistically, MLRM account for the dependency of the data within clusters and provide correct estimates of uncertainty around regression coefficients. Substantively, the magnitude of the effect of clustering provides a measure of the General Contextual Effect (GCE). When outcomes are binary, the GCE can also be quantified by measures of heterogeneity like the Median Odds Ratio (MOR) calculated from a multilevel logistic regression model. Time‐to‐event outcomes within a multilevel structure occur commonly in epidemiological and medical research. However, the Median Hazard Ratio (MHR) that corresponds to the MOR in multilevel (i.e., ‘frailty’) Cox proportional hazards regression is rarely used. Analogously to the MOR, the MHR is the median relative change in the hazard of the occurrence of the outcome when comparing identical subjects from two randomly selected different clusters that are ordered by risk. We illustrate the application and interpretation of the MHR in a case study analyzing the hazard of mortality in patients hospitalized for acute myocardial infarction at hospitals in Ontario, Canada. We provide R code for computing the MHR. The MHR is a useful and intuitive measure for expressing cluster heterogeneity in the outcome and, thereby, estimating general contextual effects in multilevel survival analysis. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27885709
Forbang, Nketi I; Michos, Erin D; McClelland, Robyn L; Remigio-Baker, Rosemay A; Allison, Matthew A; Sandfort, Veit; Ix, Joachim H; Thomas, Isac; Rifkin, Dena E; Criqui, Michael H
2016-11-01
Abdominal aortic calcium (AAC) and coronary artery calcium (CAC) independently and similarly predict cardiovascular disease (CVD) events. The standard AAC and CAC score, the Agatston method, upweights for greater calcium density, thereby modeling higher calcium density as a CVD hazard. Computed tomography scans were used to measure AAC and CAC volume and density in a multiethnic cohort of community-dwelling individuals, and Cox proportional hazards models were used to determine their independent association with incident coronary heart disease (CHD, defined as myocardial infarction, resuscitated cardiac arrest, or CHD death), cardiovascular disease (CVD, defined as CHD plus stroke and stroke death), and all-cause mortality. In 997 participants with Agatston AAC and CAC scores >0, the mean age was 66±9 years, and 58% were men. During an average follow-up of 9 years, there were 77 CHD, 118 CVD, and 169 all-cause mortality events. In mutually adjusted models, additionally adjusted for CVD risk factors, an increase in ln(AAC volume) per standard deviation was significantly associated with increased all-cause mortality (hazard ratio=1.20; 95% confidence interval, 1.08-1.33; P<0.01) and an increase in ln(CAC volume) per standard deviation was significantly associated with CHD (hazard ratio=1.17; 95% confidence interval, 1.04-1.59; P=0.02) and CVD (hazard ratio=1.20; 95% confidence interval, 1.05-1.36; P<0.01). In contrast, neither AAC nor CAC density was significantly associated with CVD events. The Agatston method of upweighting calcium scores for greater density may be inappropriate for CVD risk prediction in both the abdominal aorta and coronary arteries. © 2016 American Heart Association, Inc.
Hazard Regression Models of Early Mortality in Trauma Centers
Clark, David E; Qian, Jing; Winchell, Robert J; Betensky, Rebecca A
2013-01-01
Background Factors affecting early hospital deaths after trauma may be different from factors affecting later hospital deaths, and the distribution of short and long prehospital times may vary among hospitals. Hazard regression (HR) models may therefore be more useful than logistic regression (LR) models for analysis of trauma mortality, especially when treatment effects at different time points are of interest. Study Design We obtained data for trauma center patients from the 2008–9 National Trauma Data Bank (NTDB). Cases were included if they had complete data for prehospital times, hospital times, survival outcome, age, vital signs, and severity scores. Cases were excluded if pulseless on admission, transferred in or out, or ISS<9. Using covariates proposed for the Trauma Quality Improvement Program and an indicator for each hospital, we compared LR models predicting survival at 8 hours after injury to HR models with survival censored at 8 hours. HR models were then modified to allow time-varying hospital effects. Results 85,327 patients in 161 hospitals met inclusion criteria. Crude hazards peaked initially, then steadily declined. When hazard ratios were assumed constant in HR models, they were similar to odds ratios in LR models associating increased mortality with increased age, firearm mechanism, increased severity, more deranged physiology, and estimated hospital-specific effects. However, when hospital effects were allowed to vary by time, HR models demonstrated that hospital outliers were not the same at different times after injury. Conclusions HR models with time-varying hazard ratios reveal inconsistencies in treatment effects, data quality, and/or timing of early death among trauma centers. HR models are generally more flexible than LR models, can be adapted for censored data, and potentially offer a better tool for analysis of factors affecting early death after injury. PMID:23036828
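To reproduce the flavor of the time-varying hospital effects described above, here is a hedged sketch using episode splitting at 8 hours; it is not the authors' code, and the column names (`id`, `time` in hours from injury, `died`, `hospital_A`, `age`) are hypothetical.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

def split_at(df: pd.DataFrame, cut: float = 8.0) -> pd.DataFrame:
    """Split each patient's follow-up into (0, cut] and (cut, time] episodes
    so a covariate's hazard ratio can differ between the two periods."""
    rows = []
    for _, r in df.iterrows():
        rec = r.to_dict()
        if rec["time"] <= cut:
            rows.append({**rec, "start": 0.0, "stop": rec["time"], "died_ep": rec["died"], "late": 0})
        else:
            rows.append({**rec, "start": 0.0, "stop": cut, "died_ep": 0, "late": 0})
            rows.append({**rec, "start": cut, "stop": rec["time"], "died_ep": rec["died"], "late": 1})
    out = pd.DataFrame(rows)
    # The interaction lets the hospital effect change after the cut point; a
    # 'late' main effect is absorbed by the baseline hazard, so it is omitted.
    out["hospital_A_late"] = out["hospital_A"] * out["late"]
    return out

# episodes = split_at(trauma_df)
# ctv = CoxTimeVaryingFitter()
# ctv.fit(episodes[["id", "start", "stop", "died_ep", "age", "hospital_A", "hospital_A_late"]],
#         id_col="id", event_col="died_ep", start_col="start", stop_col="stop")
# ctv.print_summary()  # exp(coef) of hospital_A: early effect; *_late: change after 8 h
```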
Stroke Risk and Mortality in Patients With Ventricular Assist Devices.
Parikh, Neal S; Cool, Joséphine; Karas, Maria G; Boehme, Amelia K; Kamel, Hooman
2016-11-01
Ventricular assist devices (VADs) have advanced the management of end-stage heart failure. However, these devices are associated with hemorrhagic and thrombotic complications, including stroke. We assessed the incidence, risk factors, and outcomes of ischemic and hemorrhagic stroke after VAD placement. Using administrative claims data from acute care hospitals in California, Florida, and New York from 2005 to 2013, we identified patients who underwent VAD placement, defined by the International Classification of Diseases, Ninth Revision, Clinical Modification code 37.66. Ischemic and hemorrhagic strokes were identified by previously validated coding algorithms. We used survival statistics to determine the incidence rates and Cox proportional hazard analyses to examine the associations. Among 1813 patients, we identified 201 ischemic strokes and 116 hemorrhagic strokes during 3.4 (±2.0) years of follow-up after implantation of a VAD. The incidence of stroke was 8.7% per year (95% confidence interval [CI], 7.7-9.7). The annual incidence of ischemic stroke (5.5%; 95% CI, 4.8-6.4) was nearly double that of hemorrhagic stroke (3.1%; 95% CI, 2.6-3.8). Women faced a higher hazard of stroke than men (hazard ratio, 1.6; 95% CI, 1.2-2.1), particularly hemorrhagic stroke (hazard ratio, 2.2; 95% CI, 1.4-3.4). Stroke was strongly associated with subsequent in-hospital mortality (hazard ratio, 6.1; 95% CI, 4.6-7.9). The incidence of stroke after VAD implantation was 8.7% per year, and incident stroke was strongly associated with subsequent in-hospital mortality. Notably, ischemic stroke occurred at nearly twice the rate of hemorrhagic stroke. Women seemed to face a higher risk for hemorrhagic stroke than men. © 2016 American Heart Association, Inc.
Choudhary, Gaurav; Jankowich, Matthew; Wu, Wen-Chih
2014-07-01
Although elevated pulmonary artery systolic pressure (PASP) is associated with heart failure (HF), whether PASP measurement can help predict future HF admissions is not known, especially in African Americans who are at increased risk for HF. We hypothesized that elevated PASP is associated with increased risk of HF admission and improves HF prediction in an African American population. We conducted a longitudinal analysis using the Jackson Heart Study cohort (n=3125; 32.2% men) with baseline echocardiography-derived PASP and follow-up for HF admissions. The hazard ratio for HF admission was estimated using a Cox proportional hazards model adjusted for variables in the Atherosclerosis Risk in Communities (ARIC) HF prediction model. During a median follow-up of 3.46 years, 3.42% of the cohort was admitted for HF. Subjects with HF had a higher PASP (35.6±11.4 versus 27.6±6.9 mm Hg; P<0.001). The hazard of HF admission increased with higher baseline PASP (adjusted hazard ratio per 10 mm Hg increase in PASP: 2.03; 95% confidence interval, 1.67-2.48; adjusted hazard ratio for highest [≥33 mm Hg] versus lowest quartile [<24 mm Hg] of PASP: 2.69; 95% confidence interval, 1.43-5.06) and remained significant irrespective of history of HF or preserved/reduced ejection fraction. Addition of PASP to the ARIC model resulted in a significant improvement in model discrimination (area under the curve=0.82 before versus 0.84 after; P=0.03) and improved net reclassification index (11-15%) using PASP as a continuous or dichotomous (cutoff=33 mm Hg) variable. Elevated PASP predicts HF admissions in African Americans and may aid in early identification of at-risk subjects for aggressive risk factor modification. © 2014 American Heart Association, Inc.
Rubio-Tapia, A; Malamut, G; Verbeek, W H M; van Wanrooij, R L J; Leffler, D A; Niveloni, S I; Arguelles-Grande, C; Lahr, B D; Zinsmeister, A R; Murray, J A; Kelly, C P; Bai, J C; Green, P H; Daum, S; Mulder, C J J; Cellier, C
2016-10-01
Refractory coeliac disease is a severe complication of coeliac disease with heterogeneous outcome. To create a prognostic model to estimate survival of patients with refractory coeliac disease. We evaluated predictors of 5-year mortality using Cox proportional hazards regression on subjects from a multinational registry. Bootstrap resampling was used to internally validate the individual factors and overall model performance. The mean of the estimated regression coefficients from 400 bootstrap models was used to derive a risk score for 5-year mortality. The multinational cohort was composed of 232 patients diagnosed with refractory coeliac disease across seven centres (range of 11-63 cases per centre). The median age was 53 years and 150 (64%) were women. A total of 51 subjects died during a 5-year follow-up (cumulative 5-year all-cause mortality = 30%). From a multiple variable Cox proportional hazards model, the following variables were significantly associated with 5-year mortality: age at refractory coeliac disease diagnosis (per 20 year increase, hazard ratio = 2.21; 95% confidence interval, CI: 1.38-3.55), abnormal intraepithelial lymphocytes (hazard ratio = 2.85; 95% CI: 1.22-6.62), and albumin (per 0.5 unit increase, hazard ratio = 0.72; 95% CI: 0.61-0.85). A simple weighted three-factor risk score was created to estimate 5-year survival. Using data from a multinational registry and previously reported risk factors, we create a prognostic model to predict 5-year mortality among patients with refractory coeliac disease. This new model may help clinicians to guide treatment and follow-up. © 2016 John Wiley & Sons Ltd.
Wong, C K H; Wong, W C W; Wan, Y F; Chan, A K C; Chan, F W K; Lam, C L K
2016-10-01
To assess whether a structured diabetes education programme, the Patient Empowerment Programme, was associated with a lower rate of all-cause hospitalization and emergency department visits in a population-based cohort of patients with Type 2 diabetes mellitus in primary care. A cohort of 24 250 patients was evaluated using a linked administrative database during 2009-2013. We selected 12 125 patients with Type 2 diabetes who had at least one Patient Empowerment Programme session attendance. Patients who did not participate in the Patient Empowerment Programme were matched one-to-one with patients who did, using the propensity score method. Hospitalization events and emergency department visits were the events of interest. Cox proportional hazard and negative binomial regressions were performed to estimate the hazard ratios for the initial event, and incidence rate ratios for the number of events. During a median 30.5 months of follow-up, participants in the Patient Empowerment Programme had a lower incidence of an initial hospitalization event (22.1 vs 25.2%; hazard ratio 0.879; P < 0.001) and emergency department visit (40.5 vs 44%; hazard ratio 0.901; P < 0.001) than those who did not participate in the Patient Empowerment Programme. Participation in the Patient Empowerment Programme was associated with a significantly lower number of emergency department visits (incidence rate ratio 0.903; P < 0.001): 40.4 visits per 100 patients annually in those who did not participate in the Patient Empowerment Programme vs. 36.2 per 100 patients annually in those who did. There were significantly fewer hospitalization episodes (incidence rate ratio 0.854; P < 0.001): 20.0 hospitalizations per 100 patients annually in those who did not participate in the Patient Empowerment Programme vs. 16.9 hospitalizations per 100 patients annually in those who did. Among patients with Type 2 diabetes, the Patient Empowerment Programme was shown to be effective in delaying the initial hospitalization event and in reducing their frequency. © 2015 Diabetes UK.
Cooking frequency may enhance survival in Taiwanese elderly.
Chen, Rosalind Chia-Yu; Lee, Meei-Shyuan; Chang, Yu-Hung; Wahlqvist, Mark L
2012-07-01
To investigate the association between cooking behaviour and long-term survival among elderly Taiwanese. Cohort study. The duration of follow-up was the interval between the date of interview and the date of death or 31 December 2008, when censored for survivors. Information used included demographics, socio-economic status, health behaviours, cooking frequencies, physical function, cognitive function, nutrition knowledge awareness, eating out habits and food and nutrient intakes. These data were linked to death records. Cox proportional-hazards models were used to evaluate the effect of cooking frequency on death from 1999 to 2008 with related covariate adjustments. Elderly Nutrition and Health Survey in Taiwan, 1999-2000. Nationally representative free-living elderly people aged ≥65 years (n 1888). During a 10-year follow-up, 695 participants died. Those who cooked most frequently were more likely to be younger, female, unmarried, less educated, non-drinkers of alcohol and non-smokers, to be free of chewing difficulty, to have a spouse as dinner companion and normal cognition, to walk or shop more than twice weekly, and to eat less meat and more vegetables. Highly frequent cooking (>5 times/week, compared with never) predicted survival (hazard ratio (HR) = 0·47; 95 % CI, 0·36, 0·61); with adjustment for physical function, cognitive function, nutrition knowledge awareness and other covariates, HR was 0·59 (95 % CI, 0·41, 0·86). Women benefited more from cooking more frequently than did men, with decreased HR, 51 % v. 24 %, when most was compared with least. A 2-year delay in the assessment of survivorship led to similar findings. Cooking behaviour favourably predicts survivorship. Highly frequent cooking may favour women more than men.
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.
Lindman, Brian R; Maniar, Hersh S; Jaber, Wael A; Lerakis, Stamatios; Mack, Michael J; Suri, Rakesh M; Thourani, Vinod H; Babaliaros, Vasilis; Kereiakes, Dean J; Whisenant, Brian; Miller, D Craig; Tuzcu, E Murat; Svensson, Lars G; Xu, Ke; Doshi, Darshan; Leon, Martin B; Zajarias, Alan
2015-04-01
Tricuspid regurgitation (TR) and right ventricular (RV) dysfunction adversely affect outcomes in patients with heart failure or mitral valve disease, but their impact on outcomes in patients with aortic stenosis treated with transcatheter aortic valve replacement has not been well characterized. Among 542 patients with symptomatic aortic stenosis treated in the Placement of Aortic Transcatheter Valves (PARTNER) II trial (inoperable cohort) with a Sapien or Sapien XT valve via a transfemoral approach, baseline TR severity, right atrial and RV size and RV function were evaluated by echocardiography according to established guidelines. One-year mortality was 16.9%, 17.2%, 32.6%, and 61.1% for patients with no/trace (n=167), mild (n=205), moderate (n=117), and severe (n=18) TR, respectively (P<0.001). Increasing severity of RV dysfunction as well as right atrial and RV enlargement were also associated with increased mortality (P<0.001). After multivariable adjustment, severe TR (hazard ratio, 3.20; 95% confidence interval, 1.50-6.82; P=0.003) and moderate TR (hazard ratio, 1.60; 95% confidence interval, 1.02-2.52; P=0.042) remained associated with increased mortality as did right atrial and RV enlargement, but not RV dysfunction. There was an interaction between TR and mitral regurgitation severity (P=0.04); the increased hazard of death associated with moderate/severe TR only occurred in those with no/trace/mild mitral regurgitation. In inoperable patients treated with transcatheter aortic valve replacement, moderate or severe TR and right heart enlargement are independently associated with increased 1-year mortality; however, the association between moderate or severe TR and an increased hazard of death was only found in those with minimal mitral regurgitation at baseline. These findings may improve our assessment of anticipated benefit from transcatheter aortic valve replacement and support the need for future studies on TR and the right heart, including whether concomitant treatment of TR in operable but high-risk patients with aortic stenosis is warranted. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01314313. © 2015 American Heart Association, Inc.
Disseminating Landslide Hazard Information for California Local Government
NASA Astrophysics Data System (ADS)
Wills, C. J.
2010-12-01
Since 1969, the California Geological Survey has produced numerous maps showing landslide features and delineating potential slope-stability problem areas. These maps have been provided to local governments to encourage consideration of landslide hazards in planning and development decisions. Maps produced from 1986 through 1995 under the Landslide Hazard Mapping Act were advisory only, and their use by local government was never consistent. By contrast, maps of Zones of Required Investigation for seismically induced landslides produced under the Seismic Hazard Zoning Act since 1997 come with detailed guidelines and legal requirements. A legislative act that required landslide hazards be mapped and hazard maps disseminated to local government proved ineffective in landslide hazard mitigation. A later act with requirements that the hazard zone maps be used by local government proved more effective. Planning scenarios have proven to be an effective way of transmitting scientific information about natural hazards to emergency response professionals. Numerous earthquake planning scenarios have been prepared and used as the basis for emergency response exercises. An advantage of scenarios that include loss estimates is that the effects can be put in units of measure that everyone understands, principally deaths and dollars. HAZUS software available from FEMA allows calculation of losses for earthquake scenarios, but similar methods for landslides have not been developed. As part of the USGS Multi-Hazard Demonstration Project, we have estimated the landslide losses for a major west-coast winter storm scenario by developing a system based loosely on HAZUS. Data on landslide damage in past storms have been sparse and inconsistent, but a few data sets are available. The most detailed and complete available data on landslide damage were gathered by the City of Los Angeles following the 1978 storms. We extrapolated from those data to the entire state by first generalizing a landslide susceptibility map to give a single value of susceptibility for each census tract. We then calculated the loss ratio, the cost of landslide damage from the 1978 storms divided by the value of light wood frame structures in the census tract. The comparison suggests three general categories of damage: tracts with low landslide susceptibility have no landslide damage; tracts with moderate susceptibility have loss ratios of about 0.016%; and tracts with high susceptibility have loss ratios of 0.096%. Using these values, the susceptibility map becomes a landslide loss ratio map for the average storm intensity and landslide vulnerability of Los Angeles in 1978. Generalization to other storm intensities uses differences in storm intensity and landslide damage data from the 1982 storm in the Bay Area. In Santa Cruz County, that storm had a recurrence interval of over 100 years and caused over 3 times the damage we projected from the 1978 data. In Sonoma County, that storm had a recurrence interval of only 10 years and damage that was only 2% of our projection. If a relationship between storm intensity and the projections from the 1978 Los Angeles data can be developed, we may be able to estimate landslide losses for any projected storm intensity.
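A compact way to write the tract-level calculation sketched above; the scenario-scaling term is a hedged reading of the abstract, not a formula the author states.

```latex
% Tract-level loss ratio from the 1978 Los Angeles data, and a hedged
% scenario estimate that scales it by tract exposure and storm intensity.
\[
  R_{\text{tract}} \;=\;
  \frac{\text{landslide damage cost in tract (1978 storms)}}
       {\text{value of light wood-frame structures in tract}},
  \qquad
  \widehat{L}_{\text{scenario}} \;\approx\; \sum_{\text{tracts}}
  R(\text{susceptibility class}) \times V_{\text{tract}} \times f(\text{storm intensity}),
\]
% with R roughly 0 (low), 0.016% (moderate), and 0.096% (high susceptibility),
% and f an intensity adjustment still to be calibrated, per the closing point.
```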
Steiner, Markus FC; Cezard, Genevieve; Bansal, Narinder; Fischbacher, Colin; Douglas, Anne; Bhopal, Raj; Sheikh, Aziz
2015-01-01
Objective There is evidence of substantial ethnic variations in asthma morbidity and the risk of hospitalisation, but the picture in relation to lower respiratory tract infections is unclear. We carried out an observational study to identify ethnic group differences for lower respiratory tract infections. Design A retrospective, cohort study. Setting Scotland. Participants 4.65 million people on whom information was available from the 2001 census, followed from May 2001 to April 2010. Main outcome measures Hospitalisations and deaths (any time following first hospitalisation) from lower respiratory tract infections; adjusted risk ratios and hazard ratios by ethnicity and sex were calculated. We multiplied ratios and confidence intervals by 100, so the reference Scottish White population's risk ratio and hazard ratio were 100. Results Among men, adjusted risk ratios for lower respiratory tract infection hospitalisation were lower in Other White British (80, 95% confidence interval 73–86) and Chinese (69, 95% confidence interval 56–84) populations and higher in Pakistani groups (152, 95% confidence interval 136–169). In women, results were mostly similar to those in men (e.g. Chinese 68, 95% confidence interval 56–82), although higher adjusted risk ratios were found among women of the Other South Asians group (145, 95% confidence interval 120–175). Survival (adjusted hazard ratio) following lower respiratory tract infection for Pakistani men (54, 95% confidence interval 39–74) and women (31, 95% confidence interval 18–53) was better than in the reference population. Conclusions Substantial differences in the rates of lower respiratory tract infections amongst different ethnic groups in Scotland were found. Pakistani men and women had particularly high rates of lower respiratory tract infection hospitalisation. Research into the reasons behind the high rates of lower respiratory tract infection in the Pakistani community is now required. PMID:26152675
Change in plasma Aβ peptides and onset of dementia in adults with Down syndrome.
Schupf, N; Zigman, W B; Tang, M-X; Pang, D; Mayeux, R; Mehta, P; Silverman, W
2010-11-02
To examine changes in levels of plasma amyloid-β (Aβ) peptides, Aβ42 and Aβ40, in relation to onset of Alzheimer disease (AD) in adults with Down syndrome (DS). Plasma Aβ42 and Aβ40 were measured at initial examination and at follow-up in a community-based cohort of 225 adults with DS who did not have dementia at baseline and were assessed for cognitive/functional abilities and health status and followed at 14- to 20-month intervals. We used Cox proportional hazards modeling to estimate the cumulative incidence of AD by Aβ peptide change group (increasing, no change, or decreasing), adjusting for covariates. Sixty-one (27.1%) of the participants developed AD. At follow-up, a decrease in Aβ42 levels, a decrease in the Aβ42/Aβ40 ratio, and an increase in Aβ40 levels were related to conversion to AD. Compared with the group with increasing levels of Aβ42, the likelihood of developing AD was 5 times higher for those whose plasma Aβ42 levels decreased over follow-up (hazard ratio [HR] = 4.9, 95% confidence interval [CI] 2.1-11.4). Decreasing Aβ42/Aβ40 was also strongly related to AD risk (HR = 4.9, 95% CI 1.8-13.2), while decreasing Aβ40 was associated with lower risk (HR = 0.4, 95% CI 0.2-0.9). Among adults with DS, decreasing levels of plasma Aβ42, a decline in the Aβ42/Aβ40 ratio, or increasing levels of Aβ40 may be sensitive indicators of conversion to AD, possibly reflecting compartmentalization of Aβ peptides in the brain.
Querin, G; El Mendili, M M; Lenglet, T; Delphine, S; Marchand-Pauvert, V; Benali, H; Pradat, P-F
2017-08-01
Assessing survival is a critical issue in patients with amyotrophic lateral sclerosis (ALS). Neuroimaging seems to be promising in the assessment of disease severity and several studies also suggest a strong relationship between spinal cord (SC) atrophy described by magnetic resonance imaging (MRI) and disease progression. The aim of the study was to determine the predictive added value of multimodal SC MRI on survival. Forty-nine ALS patients were recruited and clinical data were collected. Patients were scored on the Revised ALS Functional Rating Scale and manual muscle testing. They were followed longitudinally to assess survival. The cervical SC was imaged using the 3 T MRI system. Cord volume and cross-sectional area (CSA) at each vertebral level were computed. Diffusion tensor imaging metrics were measured. Imaging metrics and clinical variables were used as inputs for a multivariate Cox regression survival model. On building a multivariate Cox regression model with clinical and MRI parameters, fractional anisotropy, magnetization transfer ratio and CSA at C2-C3, C4-C5, C5-C6 and C6-C7 vertebral levels were significant. Moreover, the hazard ratio calculated for CSA at the C3-C4 and C5-C6 levels indicated an increased risk for patients with SC atrophy (respectively 0.66 and 0.68). In our cohort, MRI parameters seem to be more predictive than clinical variables, which had a hazard ratio very close to 1. It is suggested that multimodal SC MRI could be a useful tool in survival prediction especially if used at the beginning of the disease and when combined with clinical variables. To validate it as a biomarker, confirmation of the results in bigger independent cohorts of patients is warranted. © 2017 EAN.
Lanfear, David E; Peterson, Edward L; Campbell, Janis; Phatak, Hemant; Wu, David; Wells, Karen; Spertus, John A; Williams, L Keoki
2011-01-01
Worsened renal function (WRF) during heart failure (HF) hospitalization is associated with in-hospital mortality, but there are limited data regarding its relation to long-term outcomes after discharge. The influence of WRF resolution is also unknown. This retrospective study analyzed patients who received care from a large health system and had a primary hospital discharge diagnosis of HF from January 2000 to June 2008. Renal function was estimated from creatinine levels during hospitalization. The first available value was considered baseline. WRF was defined as a creatinine increase ≥ 0.3 mg/dl on any subsequent hospital day compared to baseline. Persistent WRF was defined as having WRF at discharge. Proportional hazards regression, adjusting for baseline renal function and potential confounding factors, was used to assess time to rehospitalization or death. Of 2,465 patients who survived to discharge, 887 (36%) developed WRF. Median follow-up was 2.1 years. In adjusted models, WRF was associated with higher rates of postdischarge death or rehospitalization (hazard ratio [HR] 1.12, 95% confidence interval [CI] 1.02 to 1.22). Of those with WRF, 528 (60%) had persistent WRF, whereas 359 (40%) recovered. Persistent WRF was significantly associated with higher postdischarge event rates (HR 1.14, 95% CI 1.02 to 1.27), whereas transient WRF showed only a nonsignificant trend toward risk (HR 1.09, 95% CI 0.96 to 1.24). In conclusion, in patients surviving hospitalization for HF, WRF was associated with increased long-term mortality and rehospitalization, particularly if renal function did not recover by the time of discharge. Copyright © 2011 Elsevier Inc. All rights reserved.
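A minimal sketch of the WRF definition used above, assuming a hypothetical long-format table of in-hospital creatinine draws with columns `admission_id`, `draw_time`, and `creatinine`; this is illustrative, not the study's code.

```python
import pandas as pd

def flag_wrf(labs: pd.DataFrame, threshold: float = 0.3) -> pd.Series:
    """Return one boolean per admission: True if any in-hospital creatinine
    value rose by >= `threshold` mg/dL above the first (baseline) value."""
    labs = labs.sort_values(["admission_id", "draw_time"])
    baseline = labs.groupby("admission_id")["creatinine"].transform("first")
    rise = labs["creatinine"] - baseline
    return rise.ge(threshold).groupby(labs["admission_id"]).any()

# wrf = flag_wrf(labs)  # Series indexed by admission_id
# Persistent WRF, as described above, would additionally require the value at
# discharge to remain >= baseline + threshold.
```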
Amin, Amit P; Spertus, John A; Reid, Kimberly J; Lan, Xiao; Buchanan, Donna M; Decker, Carole; Masoudi, Frederick A
2010-12-01
Although an acute worsening in renal function (WRF) commonly occurs among patients hospitalized for acute myocardial infarction (AMI), its long-term prognostic significance is unknown. We examined predictors of WRF and its association with 4-year mortality. Acute myocardial infarction patients from the multicenter PREMIER study (N=2,098) who survived to hospital discharge were followed for at least 4 years. Worsening in renal function was defined as an increase in creatinine during hospitalization of ≥0.3 mg/dL above the admission value. Correlates of WRF were determined with multivariable logistic regression models and used, along with other important clinical covariates, in Cox proportional hazards models to define the independent association between WRF and mortality. Worsening in renal function was observed in 393 (18.7%) of AMI survivors. Diabetes, left ventricular systolic dysfunction, and a history of chronic kidney disease (documented history of renal failure with baseline creatinine >2.5 mg/dL) were independently associated with WRF. During 4-year follow-up, 386 (18.6%) patients died. Mortality was significantly higher in the WRF group (36.6% vs 14.4% in those without WRF, P<.001). After adjusting for other factors associated with WRF and long-term mortality, including baseline creatinine, WRF was independently associated with a higher risk of death (hazard ratio=1.64, 95% CI 1.23-2.19). Worsening in renal function occurs in approximately 1 of 6 AMI survivors and is independently associated with an adverse long-term prognosis. Interventions to minimize WRF, or to treat more aggressively those patients who develop WRF, should be tested in further studies. Copyright © 2010 Mosby, Inc. All rights reserved.
Saito, Yoshihiko; Okada, Sadanori; Ogawa, Hisao; Soejima, Hirofumi; Sakuma, Mio; Nakayama, Masafumi; Doi, Naofumi; Jinnouchi, Hideaki; Waki, Masako; Masuda, Izuru; Morimoto, Takeshi
2017-02-14
The long-term efficacy and safety of low-dose aspirin for primary prevention of cardiovascular events in patients with type 2 diabetes mellitus are still inconclusive. The JPAD trial (Japanese Primary Prevention of Atherosclerosis With Aspirin for Diabetes) was a randomized, open-label, standard care-controlled trial examining whether low-dose aspirin affected cardiovascular events in 2539 Japanese patients with type 2 diabetes mellitus and without preexisting cardiovascular disease. Patients were randomly allocated to receive aspirin (81 or 100 mg daily; aspirin group) or no aspirin (no-aspirin group) in the JPAD trial. After that trial ended in 2008, we followed up with the patients until 2015, with no attempt to change the previously assigned therapy. Primary end points were cardiovascular events, including sudden death, fatal or nonfatal coronary artery disease, fatal or nonfatal stroke, and peripheral vascular disease. For the safety analysis, hemorrhagic events, consisting of gastrointestinal bleeding, hemorrhagic stroke, and bleeding from any other sites, were also analyzed. The primary analysis was conducted for cardiovascular events among patients who retained their original allocation (a per-protocol cohort). Analyses on an intention-to-treat cohort were conducted for hemorrhagic events and statistical sensitivity. The median follow-up period was 10.3 years; 1621 patients (64%) were followed up throughout the study; and 2160 patients (85%) retained their original allocation. Low-dose aspirin did not reduce cardiovascular events in the per-protocol cohort (hazard ratio, 1.14; 95% confidence interval, 0.91-1.42). A multivariable Cox proportional hazards model adjusted for age, sex, glycemic control, kidney function, smoking status, hypertension, and dyslipidemia showed similar results (hazard ratio, 1.04; 95% confidence interval, 0.83-1.30), with no heterogeneity of efficacy in subgroup analyses stratified by each of these factors (all interaction P>0.05). Sensitivity analyses on the intention-to-treat cohort yielded consistent results (hazard ratio, 1.01; 95% confidence interval, 0.82-1.25). Gastrointestinal bleeding occurred in 25 patients (2%) in the aspirin group and 12 (0.9%) in the no-aspirin group (P=0.03), and the incidence of hemorrhagic stroke was not different between groups. Low-dose aspirin did not affect the risk for cardiovascular events but increased risk for gastrointestinal bleeding in patients with type 2 diabetes mellitus in a primary prevention setting. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00110448. © 2016 American Heart Association, Inc.
Agiasotelli, Danai; Alexopoulou, Alexandra; Vasilieva, Larisa; Kalpakou, Georgia; Papadaki, Sotiria; Dourakis, Spyros P
2016-05-01
Acute-on-chronic liver failure (ACLF) is defined as an acute deterioration of liver disease with high mortality in patients with cirrhosis. The early mortality in ACLF is associated with organ failure and high leukocyte count. The time needed to reverse this condition and the factors affecting mortality after the early 30-day period were evaluated. One hundred and ninety-seven consecutive patients with cirrhosis were included. Patients were prospectively followed up for 180 days. ACLF was diagnosed in 54.8% of the patients. Infection was the most common precipitating event in patients with ACLF. On multivariate analysis, only the neutrophil/leukocyte ratio and Chronic Liver Failure Consortium Organ Failure (CLIF-C OF) score were associated with mortality. Hazard ratios for mortality in patients with ACLF compared with those without ACLF, estimated at different time end-points post-enrollment, revealed that the relative risk of death in the ACLF group was 8.54 during the first 30-day period and declined to 1.94 during the second period of observation. The time-varying effect of the neutrophil/leukocyte ratio and CLIF-C score was negative (1% and 18% decline in the hazard ratio per month) while that of the Model for End-Stage Liver Disease (MELD) was positive (3% increase in the hazard ratio per month). The condition of ACLF was reversible in patients who survived. During the 30-180-day period following the acute event, the probability of death in ACLF became gradually similar to that in the non-ACLF group. The impact of inflammatory response and organ failure on survival is powerful during the first 30-day period and weakens thereafter, while that of MELD increases. © 2015 The Japan Society of Hepatology.
Poppe, Katrina K; Doughty, Robert N; Yu, Cheuk-Man; Quintana, Miguel; Møller, Jacob E; Klein, Allan L; Gamble, Greg D; Dini, Frank L; Whalley, Gillian A
2011-04-14
Meta-analyses are increasingly used to summarise observational data; however, a literature meta-analysis (LMA) may give different results from the corresponding individual patient meta-analysis (IPMA). This study compares the published results of equivalent LMAs and IPMAs, highlighting factors that can affect the results and therefore impact on clinical interpretation of meta-analyses. Univariate results from published meta-analyses of prospective observational outcome data were compared, as were the number of studies, patients and length of follow-up. The absolute difference in survival was calculated. The associations between severe diastolic dysfunction (RFP) and death after acute myocardial infarction (AMI) and in chronic heart failure (HF) were used as clinical examples. The IPMA hazard ratio was lower than the LMA odds ratio: AMI hazard ratio 2.67 (95% confidence interval 2.23 to 3.20), odds ratio 4.10 (3.38 to 4.99); HF hazard ratio 2.42 (2.06 to 2.83), odds ratio 4.36 (3.60 to 5.04). The IPMAs contained most of the studies from the LMAs as well as additional unpublished data, and a longer length of follow-up was available in the IPMAs (AMI 3.7 vs 2.6 yr, HF 4.0 vs 1.5 yr). Restricting analysis to the same studies in both the LMA and IPMA resulted in a difference in effect sizes between methods similar to that found in the published analyses. The result of a meta-analysis is affected by whether study-level or individual patient data have been used, and by the variant of analysis that is required. Awareness and consideration of these factors are important for clinical interpretation of meta-analyses. Copyright © 2009 Elsevier B.V. All rights reserved.
Bechard, Lori J; Duggan, Christopher; Touger-Decker, Riva; Parrott, J Scott; Rothpletz-Puglia, Pamela; Byham-Gray, Laura; Heyland, Daren; Mehta, Nilesh M
2016-08-01
To determine the influence of admission anthropometry on clinical outcomes in mechanically ventilated children in the PICU. Data from two multicenter cohort studies were compiled to examine the unique contribution of nutritional status, defined by body mass index z score, to 60-day mortality, hospital-acquired infections, length of hospital stay, and ventilator-free days, using multivariate analysis. Ninety PICUs from 16 countries with eight or more beds. Children aged 1 month to 18 years, admitted to each participating PICU and requiring mechanical ventilation for more than 48 hours. Data from 1,622 eligible patients, 54.8% male and mean (SD) age 4.5 years (5.1), were analyzed. Subjects were classified as underweight (17.9%), normal weight (54.2%), overweight (14.5%), and obese (13.4%) based on body mass index z score at admission. After adjusting for severity of illness and site, the odds of 60-day mortality were higher in underweight (odds ratio, 1.53; p < 0.001) children. The odds of hospital-acquired infections were higher in underweight (odds ratio, 1.88; p = 0.008) and obese (odds ratio, 1.64; p < 0.001) children. Hazard ratios for hospital discharge were lower among underweight (hazard ratio, 0.71; p < 0.001) and obese (hazard ratio, 0.82; p = 0.04) children. Underweight was associated with 1.3 (p = 0.001) and 1.6 (p < 0.001) fewer ventilator-free days than normal weight and overweight, respectively. Malnutrition is prevalent in mechanically ventilated children on admission to PICUs worldwide. Classification as underweight or obese was associated with higher risk of hospital-acquired infections and lower likelihood of hospital discharge. Underweight children had a higher risk of mortality and fewer ventilator-free days.
Dietary acrylamide intake and risk of breast cancer in the UK women's cohort
Burley, V J; Greenwood, D C; Hepworth, S J; Fraser, L K; de Kok, T M; van Breda, S G; Kyrtopoulos, S A; Botsivali, M; Kleinjans, J; McKinney, P A; Cade, J E
2010-01-01
Background: No studies to date have demonstrated a clear association between breast cancer risk and dietary exposure to acrylamide. Methods: A 217-item food frequency questionnaire was used to estimate dietary acrylamide intake in 33 731 women aged 35–69 years from the UK Women's Cohort Study followed up for a median of 11 years. Results: In all, 1084 incident breast cancers occurred during follow-up. There was no evidence of an overall association between acrylamide intake and breast cancer (hazard ratio=1.08 per 10 μg/day, 95% CI: 0.98–1.18, Ptrend=0.1). There was a suggestion of a possible weak positive association between dietary acrylamide intake and premenopausal breast cancer after adjustment for potential confounders (hazard ratio=1.2, 95% CI: 1.0–1.3, Ptrend=0.008). There was no suggestion of any association for postmenopausal breast cancer (hazard ratio=1.0, 95% CI: 0.9–1.1, Ptrend=0.99). Conclusions: There is no evidence of an association between dietary acrylamide intake and breast cancer. A weak association may exist with premenopausal breast cancer, but requires further investigation. PMID:20959829
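As a small aside on how a "per 10 μg/day" hazard ratio is reported from a Cox model fitted on a per-unit scale: the per-10-unit estimate is exp(10·β), and the confidence limits rescale in the same way. The coefficient and standard error below are assumptions chosen only to roughly reproduce the reported overall estimate, not values taken from the study.

```python
import numpy as np

# Hypothetical Cox coefficient per 1 microgram/day of acrylamide intake.
beta = 0.0077          # assumed log-hazard coefficient per 1 ug/day
se_beta = 0.0047       # assumed standard error

# HR per 10 ug/day increase is exp(10 * beta); the 95% CI scales the same way.
hr10 = np.exp(10 * beta)
ci10 = np.exp(10 * (beta + np.array([-1.96, 1.96]) * se_beta))
print(f"HR per 10 ug/day: {hr10:.2f} (95% CI {ci10[0]:.2f}-{ci10[1]:.2f})")
```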
Esserman, Laura J.; Berry, Donald A.; DeMichele, Angela; Carey, Lisa; Davis, Sarah E.; Buxton, Meredith; Hudis, Cliff; Gray, Joe W.; Perou, Charles; Yau, Christina; Livasy, Chad; Krontiras, Helen; Montgomery, Leslie; Tripathy, Debasish; Lehman, Constance; Liu, Minetta C.; Olopade, Olufunmilayo I.; Rugo, Hope S.; Carpenter, John T.; Dressler, Lynn; Chhieng, David; Singh, Baljit; Mies, Carolyn; Rabban, Joseph; Chen, Yunn-Yi; Giri, Dilip; van 't Veer, Laura; Hylton, Nola
2012-01-01
Purpose Neoadjuvant chemotherapy for breast cancer provides critical information about tumor response; how best to leverage this for predicting recurrence-free survival (RFS) is not established. The I-SPY 1 TRIAL (Investigation of Serial Studies to Predict Your Therapeutic Response With Imaging and Molecular Analysis) was a multicenter breast cancer study integrating clinical, imaging, and genomic data to evaluate pathologic response, RFS, and their relationship and predictability based on tumor biomarkers. Patients and Methods Eligible patients had tumors ≥ 3 cm and received neoadjuvant chemotherapy. We determined associations between pathologic complete response (pCR; defined as the absence of invasive cancer in breast and nodes) and RFS, overall and within receptor subsets. Results In 221 evaluable patients (median tumor size, 6.0 cm; median age, 49 years; 91% classified as poor risk on the basis of the 70-gene prognosis profile), 41% were hormone receptor (HR) negative, and 31% were human epidermal growth factor receptor 2 (HER2) positive. For 190 patients treated without neoadjuvant trastuzumab, pCR was highest for HR-negative/HER2-positive patients (45%) and lowest for HR-positive/HER2-negative patients (9%). Achieving pCR predicted favorable RFS. For 172 patients treated without trastuzumab, the hazard ratio for RFS of pCR versus no pCR was 0.29 (95% CI, 0.07 to 0.82). pCR was more predictive of RFS by multivariate analysis when subtype was taken into account, and point estimates of hazard ratios within the HR-positive/HER2-negative (hazard ratio, 0.00; 95% CI, 0.00 to 0.93), HR-negative/HER2-negative (hazard ratio, 0.25; 95% CI, 0.04 to 0.97), and HER2-positive (hazard ratio, 0.14; 95% CI, 0.01 to 1.0) subtypes are lower. Ki67 further improved the prediction of pCR within subsets. Conclusion In this biologically high-risk group, pCR differs by receptor subset. pCR is more highly predictive of RFS within every established receptor subset than overall, demonstrating that the extent of outcome advantage conferred by pCR is specific to tumor biology. PMID:22649152
Geri, Guillaume; Dumas, Florence; Chenevier-Gobeaux, Camille; Bouglé, Adrien; Daviaud, Fabrice; Morichau-Beauchant, Tristan; Jouven, Xavier; Mira, Jean-Paul; Pène, Frédéric; Empana, Jean-Philippe; Cariou, Alain
2015-02-01
Identifying, at an early stage, those out-of-hospital cardiac arrest survivors who are at increased risk of long-term mortality using circulating biomarkers remains challenging. Our aim was to prospectively study the association between copeptin and 1-year mortality in patients with out-of-hospital cardiac arrest admitted to a tertiary cardiac arrest center. Retrospective monocenter study. Tertiary cardiac arrest center in Paris, France. Copeptin was assessed at admission and day 3. Pre- and intrahospital factors associated with 1-year mortality were analyzed by multivariate Cox proportional hazards analysis. None. Two hundred ninety-eight consecutive out-of-hospital cardiac arrest patients (70.3% male; median age, 60.2 yr [49.9-71.4]) were admitted to a tertiary cardiac arrest center in Paris (France). After multivariate analysis, higher admission copeptin was associated with 1-year mortality with a threshold effect (hazard ratio(5th vs 1st quintile) = 1.64; 95% CI, 1.05-2.58; p = 0.03). Day 3 copeptin was associated with 1-year mortality in a dose-dependent manner (hazard ratio(2nd vs 1st quintile) = 1.87; 95% CI, 1.00-3.49; p = 0.05; hazard ratio(3rd vs 1st quintile) = 1.92; 95% CI, 1.02-3.64; p = 0.04; hazard ratio(4th vs 1st quintile) = 2.12; 95% CI, 1.14-3.93; p = 0.02; and hazard ratio(5th vs 1st quintile) = 2.75; 95% CI, 1.47-5.15; p < 0.01; p for trend < 0.01). For both admission and day 3 copeptin, the association with 1-year mortality existed for out-of-hospital cardiac arrest of cardiac origin only (p for interaction = 0.05 and < 0.01, respectively). When admission and day 3 copeptin were mutually adjusted, only day 3 copeptin remained associated with 1-year mortality in a dose-dependent manner (p for trend = 0.01). High levels of copeptin were associated with 1-year mortality independently of prehospital and intrahospital risk factors, especially in out-of-hospital cardiac arrest of cardiac origin. Day 3 copeptin was superior to admission copeptin: this could permit identification of out-of-hospital cardiac arrest survivors at increased risk of mortality and allow for close observation of such patients.
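A minimal sketch of the quintile-based analysis described here: quintile indicators with the lowest quintile as reference for the quintile-specific hazard ratios, plus a single ordinal term for the test for trend, fitted with the lifelines CoxPHFitter on simulated data. All variable names and values are illustrative assumptions, and the prehospital/intrahospital covariate adjustment is omitted.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
copeptin = rng.lognormal(mean=3.0, sigma=0.8, size=n)   # simulated biomarker
time = rng.exponential(scale=365, size=n)               # simulated follow-up (days)
event = rng.binomial(1, 0.5, size=n)                    # simulated death indicator

df = pd.DataFrame({"time": time, "event": event})
df["quintile"] = pd.qcut(copeptin, 5, labels=False) + 1  # quintiles 1..5

# Indicator coding (quintile 1 as reference) gives quintile-specific hazard ratios.
for q in range(2, 6):
    df[f"q{q}"] = (df["quintile"] == q).astype(float)

cph = CoxPHFitter()
cph.fit(df[["time", "event", "q2", "q3", "q4", "q5"]],
        duration_col="time", event_col="event")
print(np.exp(cph.params_))   # hazard ratios vs quintile 1

# Test for trend: enter the quintile number as a single ordinal covariate.
cph_trend = CoxPHFitter()
cph_trend.fit(df[["time", "event", "quintile"]].astype(float),
              duration_col="time", event_col="event")
print(cph_trend.summary[["coef", "p"]])
```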
Five-Year Outcomes with PCI Guided by Fractional Flow Reserve.
Xaplanteris, Panagiotis; Fournier, Stephane; Pijls, Nico H J; Fearon, William F; Barbato, Emanuele; Tonino, Pim A L; Engstrøm, Thomas; Kääb, Stefan; Dambrink, Jan-Henk; Rioufol, Gilles; Toth, Gabor G; Piroth, Zsolt; Witt, Nils; Fröbert, Ole; Kala, Petr; Linke, Axel; Jagic, Nicola; Mates, Martin; Mavromatis, Kreton; Samady, Habib; Irimpen, Anand; Oldroyd, Keith; Campo, Gianluca; Rothenbühler, Martina; Jüni, Peter; De Bruyne, Bernard
2018-05-22
Background We hypothesized that fractional flow reserve (FFR)-guided percutaneous coronary intervention (PCI) would be superior to medical therapy as initial treatment in patients with stable coronary artery disease. Methods Among 1220 patients with angiographically significant stenoses, those in whom at least one stenosis was hemodynamically significant (FFR, ≤0.80) were randomly assigned to FFR-guided PCI plus medical therapy or to medical therapy alone. Patients in whom all stenoses had an FFR of more than 0.80 received medical therapy and were entered into a registry. The primary end point was a composite of death, myocardial infarction, or urgent revascularization. Results A total of 888 patients underwent randomization (447 patients in the PCI group and 441 in the medical-therapy group). At 5 years, the rate of the primary end point was lower in the PCI group than in the medical-therapy group (13.9% vs. 27.0%; hazard ratio, 0.46; 95% confidence interval [CI], 0.34 to 0.63; P<0.001). The difference was driven by urgent revascularizations, which occurred in 6.3% of the patients in the PCI group as compared with 21.1% of those in the medical-therapy group (hazard ratio, 0.27; 95% CI, 0.18 to 0.41). There were no significant differences between the PCI group and the medical-therapy group in the rates of death (5.1% and 5.2%, respectively; hazard ratio, 0.98; 95% CI, 0.55 to 1.75) or myocardial infarction (8.1% and 12.0%; hazard ratio, 0.66; 95% CI, 0.43 to 1.00). There was no significant difference in the rate of the primary end point between the PCI group and the registry cohort (13.9% and 15.7%, respectively; hazard ratio, 0.88; 95% CI, 0.55 to 1.39). Relief from angina was more pronounced after PCI than after medical therapy. Conclusions In patients with stable coronary artery disease, an initial FFR-guided PCI strategy was associated with a significantly lower rate of the primary composite end point of death, myocardial infarction, or urgent revascularization at 5 years than medical therapy alone. Patients without hemodynamically significant stenoses had a favorable long-term outcome with medical therapy alone. (Funded by St. Jude Medical and others; FAME 2 ClinicalTrials.gov number, NCT01132495 .).
Semaglutide and Cardiovascular Outcomes in Patients with Type 2 Diabetes.
Marso, Steven P; Bain, Stephen C; Consoli, Agostino; Eliaschewitz, Freddy G; Jódar, Esteban; Leiter, Lawrence A; Lingvay, Ildiko; Rosenstock, Julio; Seufert, Jochen; Warren, Mark L; Woo, Vincent; Hansen, Oluf; Holst, Anders G; Pettersson, Jonas; Vilsbøll, Tina
2016-11-10
Regulatory guidance specifies the need to establish cardiovascular safety of new diabetes therapies in patients with type 2 diabetes in order to rule out excess cardiovascular risk. The cardiovascular effects of semaglutide, a glucagon-like peptide 1 analogue with an extended half-life of approximately 1 week, in type 2 diabetes are unknown. We randomly assigned 3297 patients with type 2 diabetes who were on a standard-care regimen to receive once-weekly semaglutide (0.5 mg or 1.0 mg) or placebo for 104 weeks. The primary composite outcome was the first occurrence of cardiovascular death, nonfatal myocardial infarction, or nonfatal stroke. We hypothesized that semaglutide would be noninferior to placebo for the primary outcome. The noninferiority margin was 1.8 for the upper boundary of the 95% confidence interval of the hazard ratio. At baseline, 2735 of the patients (83.0%) had established cardiovascular disease, chronic kidney disease, or both. The primary outcome occurred in 108 of 1648 patients (6.6%) in the semaglutide group and in 146 of 1649 patients (8.9%) in the placebo group (hazard ratio, 0.74; 95% confidence interval [CI], 0.58 to 0.95; P<0.001 for noninferiority). Nonfatal myocardial infarction occurred in 2.9% of the patients receiving semaglutide and in 3.9% of those receiving placebo (hazard ratio, 0.74; 95% CI, 0.51 to 1.08; P=0.12); nonfatal stroke occurred in 1.6% and 2.7%, respectively (hazard ratio, 0.61; 95% CI, 0.38 to 0.99; P=0.04). Rates of death from cardiovascular causes were similar in the two groups. Rates of new or worsening nephropathy were lower in the semaglutide group, but rates of retinopathy complications (vitreous hemorrhage, blindness, or conditions requiring treatment with an intravitreal agent or photocoagulation) were significantly higher (hazard ratio, 1.76; 95% CI, 1.11 to 2.78; P=0.02). Fewer serious adverse events occurred in the semaglutide group, although more patients discontinued treatment because of adverse events, mainly gastrointestinal. In patients with type 2 diabetes who were at high cardiovascular risk, the rate of cardiovascular death, nonfatal myocardial infarction, or nonfatal stroke was significantly lower among patients receiving semaglutide than among those receiving placebo, an outcome that confirmed the noninferiority of semaglutide. (Funded by Novo Nordisk; SUSTAIN-6 ClinicalTrials.gov number, NCT01720446 .).
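The noninferiority criterion used here is that the upper bound of the 95% confidence interval of the hazard ratio stays below the prespecified margin of 1.8. A short sketch of that check, using the primary-outcome numbers quoted above and a normal approximation on the log scale to reconstruct the standard error (illustration only):

```python
import numpy as np

# Reported primary-outcome result from the abstract: HR 0.74, 95% CI 0.58 to 0.95.
hr, lo, hi = 0.74, 0.58, 0.95
margin = 1.8   # prespecified noninferiority margin for the upper CI bound

# Noninferiority is declared if the upper bound of the 95% CI is below the margin.
print("noninferior:", hi < margin)

# The same check on the log scale, recovering the standard error from the CI width.
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)
upper = np.exp(np.log(hr) + 1.96 * se)
print(f"reconstructed upper bound: {upper:.2f}")
```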
Dose comparisons of clopidogrel and aspirin in acute coronary syndromes.
Mehta, Shamir R; Bassand, Jean-Pierre; Chrolavicius, Susan; Diaz, Rafael; Eikelboom, John W; Fox, Keith A A; Granger, Christopher B; Jolly, Sanjit; Joyner, Campbell D; Rupprecht, Hans-Jurgen; Widimsky, Petr; Afzal, Rizwan; Pogue, Janice; Yusuf, Salim
2010-09-02
Clopidogrel and aspirin are widely used for patients with acute coronary syndromes and those undergoing percutaneous coronary intervention (PCI). However, evidence-based guidelines for dosing have not been established for either agent. We randomly assigned, in a 2-by-2 factorial design, 25,086 patients with an acute coronary syndrome who were referred for an invasive strategy to either double-dose clopidogrel (a 600-mg loading dose on day 1, followed by 150 mg daily for 6 days and 75 mg daily thereafter) or standard-dose clopidogrel (a 300-mg loading dose and 75 mg daily thereafter) and either higher-dose aspirin (300 to 325 mg daily) or lower-dose aspirin (75 to 100 mg daily). The primary outcome was cardiovascular death, myocardial infarction, or stroke at 30 days. The primary outcome occurred in 4.2% of patients assigned to double-dose clopidogrel as compared with 4.4% assigned to standard-dose clopidogrel (hazard ratio, 0.94; 95% confidence interval [CI], 0.83 to 1.06; P=0.30). Major bleeding occurred in 2.5% of patients in the double-dose group and in 2.0% in the standard-dose group (hazard ratio, 1.24; 95% CI, 1.05 to 1.46; P=0.01). Double-dose clopidogrel was associated with a significant reduction in the secondary outcome of stent thrombosis among the 17,263 patients who underwent PCI (1.6% vs. 2.3%; hazard ratio, 0.68; 95% CI, 0.55 to 0.85; P=0.001). There was no significant difference between higher-dose and lower-dose aspirin with respect to the primary outcome (4.2% vs. 4.4%; hazard ratio, 0.97; 95% CI, 0.86 to 1.09; P=0.61) or major bleeding (2.3% vs. 2.3%; hazard ratio, 0.99; 95% CI, 0.84 to 1.17; P=0.90). In patients with an acute coronary syndrome who were referred for an invasive strategy, there was no significant difference between a 7-day, double-dose clopidogrel regimen and the standard-dose regimen, or between higher-dose aspirin and lower-dose aspirin, with respect to the primary outcome of cardiovascular death, myocardial infarction, or stroke. (Funded by Sanofi-Aventis and Bristol-Myers Squibb; ClinicalTrials.gov number, NCT00335452.)
Accelerated decline of renal function in type 2 diabetes following severe hypoglycemia.
Tsujimoto, Tetsuro; Yamamoto-Honda, Ritsuko; Kajio, Hiroshi; Kishimoto, Miyako; Noto, Hiroshi; Hachiya, Remi; Kimura, Akio; Kakei, Masafumi; Noda, Mitsuhiko
2016-01-01
This study aimed to evaluate whether the pronounced elevation in blood pressure during severe hypoglycemia is associated with subsequent renal insufficiency. We conducted a 3-year cohort study to assess the clinical course of renal function in type 2 diabetes patients with or without a blood pressure surge during severe hypoglycemia. Of 111 type 2 diabetes patients with severe hypoglycemia, 76 exhibited an extremely high systolic blood pressure before treatment, whereas 35 demonstrated no such increase (179.1 ± 27.7 mmHg vs. 131.1 ± 20.2 mmHg, P<0.001). At 12 h after treatment, systolic blood pressure did not differ significantly (131.5 ± 30.7 mmHg vs. 123.5 ± 20.7 mmHg; P=0.39). The estimated glomerular filtration rate (GFR) before and at the time of severe hypoglycemia did not differ significantly between the two groups. A multivariate Cox proportional hazards regression analysis revealed that a blood pressure surge during severe hypoglycemia was independently associated with a composite outcome of a more than 15 mL/min/1.73 m2 decrease in the estimated GFR and initiation of chronic dialysis (hazard ratio, 2.68; 95% confidence interval, 1.12-6.38; P=0.02). Renal function after severe hypoglycemia was significantly worse in type 2 diabetes patients with a blood pressure surge during severe hypoglycemia than in those without a blood pressure surge. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
[Relations of landslide and debris flow hazards to environmental factors].
Zhang, Guo-ping; Xu, Jing; Bi, Bao-gui
2009-03-01
Clarifying the relations of landslide and debris flow hazards to environmental factors is important for the prediction and evaluation of these hazards. Based on the latitude and longitude of 18431 landslide and debris flow hazards in China, and on 1 km x 1 km grid data of elevation, elevation difference, slope, slope aspect, vegetation type, and vegetation coverage, this paper analyzed the relations of landslide and debris flow hazards in China to the above-mentioned environmental factors using the frequency ratio method. The results showed that landslide and debris flow hazards in China occurred more often in the lower-elevation areas of the first and second transitional zones. The possibility of landslide and debris flow hazards was greatest when the elevation difference within a 1 km x 1 km grid cell was about 300 m and the slope was around 30 degrees. Mountain forest land and slope cropland were the two land types on which the hazards most easily occurred. The occurrence frequency of the hazards was highest when the vegetation coverage was about 80%-90%.
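For readers unfamiliar with the frequency ratio method used here: for each class of an environmental factor, the frequency ratio is the share of hazard occurrences falling in that class divided by the share of grid cells in that class, with values above 1 marking classes where hazards occur more often than expected from area alone. A minimal sketch with hypothetical counts (not the study's data):

```python
import numpy as np
import pandas as pd

# Hypothetical counts per slope class: number of landslide/debris-flow points
# and number of 1 km x 1 km grid cells in each class (illustrative only).
classes = pd.DataFrame({
    "slope_class": ["0-10", "10-20", "20-30", "30-40", ">40"],
    "hazard_points": [800, 3200, 7400, 5200, 1831],
    "grid_cells": [2_500_000, 2_000_000, 1_500_000, 900_000, 400_000],
})

classes["pct_hazards"] = classes["hazard_points"] / classes["hazard_points"].sum()
classes["pct_area"] = classes["grid_cells"] / classes["grid_cells"].sum()
classes["frequency_ratio"] = classes["pct_hazards"] / classes["pct_area"]
print(classes[["slope_class", "frequency_ratio"]])
```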
Shih, Cheng-Ping; Lin, Hung-Che; Chung, Chi-Hsiang; Hsiao, Po-Jen; Wang, Chih-Hung; Lee, Jih-Chin; Chien, Wu-Chien
2017-01-01
Tinnitus mostly results from central and peripheral auditory pathology. Chronic kidney disease (CKD) is a major risk factor for cerebrovascular disease. However, no studies have evaluated the association between tinnitus and CKD. The aim of this study was to investigate the risk of tinnitus in patients with CKD. This retrospective cohort study was conducted using the Taiwan National Health Insurance Research Database from 2000 to 2010. We established a CKD group (n = 185,430) and a non-CKD comparison group (n = 556,290) to investigate the incidence of tinnitus. Cox proportional hazard regression analysis was used to evaluate the effects of CKD on tinnitus risk. The results showed that CKD significantly increased the risk of tinnitus (adjusted hazard ratio, 3.02; 95% CI, 2.655-3.456, P<0.001). A subgroup analysis revealed that the increase in tinnitus risk was greater in CKD patients with heart failure (adjusted hazard ratio, 9.975; 95% CI, 5.001-18.752) and diabetes mellitus (adjusted hazard ratio, 3.712; 95% CI, 2.856-5.007). Furthermore, compared with non-CKD patients, the risk of tinnitus was increased 4.586-fold (95% CI, 2.399-6.7) in CKD patients on dialysis and 2.461-fold (95% CI, 1.033-3.454) in CKD patients not on dialysis. This study is the first to report that CKD is associated with an increased risk of tinnitus. Within the CKD cohort, patients on dialysis were at higher risk of tinnitus than those not on dialysis.
Circulating active serum calcium reduces the risk of hypertension.
Kunutsor, Setor K; Laukkanen, Jari A
2017-02-01
Purpose Calcium, which is one of the most abundant mineral elements in the body, has been suggested to be involved in blood pressure regulation. We aimed to assess the association of active serum calcium (the ionised and physiologically active form of serum calcium) with the future risk of hypertension. Methods The active serum calcium concentration was assessed at baseline in the Finnish Kuopio Ischemic Heart Disease population-based prospective cohort study of 1562 normotensive men aged 42-61 years at baseline. Cox proportional hazard models were used to assess the hazard ratios (95% confidence intervals (CIs)) for incident hypertension. Results During a median follow-up of 24.9 years, 247 men developed new-onset hypertension. Active serum calcium was inversely associated with incident hypertension in an approximately linear fashion. In age-adjusted analysis, the hazard ratio for hypertension per 1 standard deviation increase in active serum calcium was 0.86 (95% CI 0.76-0.98), which remained consistent after adjustment for several established risk factors and potential confounders (0.82, 95% CI 0.71-0.94). In a comparison of extreme quintiles of active serum calcium levels, the corresponding adjusted hazard ratios were 0.59 (95% CI 0.39-0.90) and 0.54 (95% CI 0.35-0.82), respectively. Conclusion Active serum calcium was protective against future hypertension in this middle-aged male Caucasian population. Further research is needed to confirm these findings and help unravel the mechanistic pathways of calcium in the pathogenesis of hypertension.
Levitan, Emily B; Wolk, Alicja; Mittleman, Murray A
2009-06-01
Fatty fish and marine omega-3 fatty acids were associated with lower rates of heart failure (HF) among US elderly, but this has not been confirmed in broader age ranges or other populations where source and type of fish may differ. We therefore conducted a population-based, prospective study of 39 367 middle-aged and older Swedish men. Diet was measured using food-frequency questionnaires. Men were followed for HF through Swedish inpatient and cause-of-death registers from 1 January 1998 to 31 December 2004. We used proportional hazards models adjusted for age and other covariates to estimate hazard ratios (HR). Compared with no consumption, men who ate fatty fish once per week had an HR of 0.88 (95% CI 0.68-1.13). Hazard ratios for consumption two times per week and > or =3 times per week were 0.99 and 0.97, respectively. Hazard ratios across quintiles of marine omega-3 were 1, 0.94 (95% CI 0.74-1.20), 0.67 (95% CI 0.50-0.90), 0.89 (95% CI 0.68-1.16), 1.00 (95% CI 0.77-1.29). In this population, moderate intake of fatty fish and marine omega-3 fatty acids was associated with lower rates of HF, though the association for fish intake was not statistically significant; higher intake was not associated with additional benefit.
Clark, Daniel O; Gao, Sujuan; Lane, Kathleen A; Callahan, Christopher M; Baiyewu, Olusegun; Ogunniyi, Adesola; Hendrie, Hugh C
2014-09-01
To compare the effect of obesity and related risk factors on 10-year mortality in two cohorts of older adults of African descent: one from the United States and one from Nigeria. Study participants were community residents aged 70 or older of African descent living in Indianapolis, Indiana (N = 1,269) or Ibadan, Nigeria (N = 1,197). We compared survival curves between the two cohorts by obesity class and estimated the effect of obesity class on mortality in Cox proportional hazards models controlling for age, gender, alcohol use, and smoking history, and the cardiometabolic biomarkers blood pressure, triglycerides, high-density lipoprotein, low-density lipoprotein, and C-reactive protein. We found that underweight was associated with an increased risk of death in both the Yoruba (hazard ratio = 1.35, 95% confidence interval: 1.12-1.63) and African American samples (hazard ratio = 2.49, 95% confidence interval: 1.40-4.43) compared with those with normal weight. The overweight and obese participants in both cohorts experienced survival similar to the normal weight participants. Controlling for cardiometabolic biomarkers had little effect on the obesity-specific hazard ratios in either cohort. Despite significant differences across these two cohorts in terms of obesity and biomarker levels, overall 10-year survival and obesity class-specific survival were remarkably similar. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
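A minimal sketch of the survival-curve comparison described here: Kaplan-Meier estimates by group and a log-rank test, computed with lifelines on simulated data. The group labels, survival scales, and censoring scheme are assumptions for illustration only.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
n = 400
group = rng.choice(["normal", "underweight"], size=n)
# Simulated 10-year follow-up: underweight given a shorter mean survival time.
scale = np.where(group == "underweight", 8.0, 12.0)
time = np.minimum(rng.exponential(scale), 10.0)   # administrative censoring at 10 years
event = (time < 10.0).astype(int)

kmf = KaplanMeierFitter()
for g in ["normal", "underweight"]:
    mask = group == g
    kmf.fit(time[mask], event_observed=event[mask], label=g)
    print(g, "10-year survival ~", round(float(kmf.survival_function_.iloc[-1, 0]), 2))

res = logrank_test(time[group == "normal"], time[group == "underweight"],
                   event_observed_A=event[group == "normal"],
                   event_observed_B=event[group == "underweight"])
print("log-rank p-value:", res.p_value)
```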
Association between parity and risk of suicide among parous women.
Yang, Chun-Yuh
2010-04-06
There are limited empirical data to support the theory of a protective effect of parenthood against suicide, as proposed by Durkheim in 1897. I conducted this study to examine whether there is an association between parity and risk of death from suicide among women. The study cohort consisted of 1,292,462 women in Taiwan who had a first live birth between Jan. 1, 1978, and Dec. 31, 1987. The women were followed up from the date of their first birth to Dec. 31, 2007. Their vital status was ascertained by means of linking records with data from a computerized mortality database. Cox proportional hazard regression models were used to estimate hazard ratios of death from suicide associated with parity. There were 2252 deaths from suicide during 32 464 187 person-years of follow-up. Suicide-related mortality was 6.94 per 100,000 person-years. After adjustment for age at first birth, marital status, years of schooling and place of delivery, the adjusted hazard ratio was 0.61 (95% confidence interval [CI] 0.54-0.68) among women with two live births and 0.40 (95% CI 0.35-0.45) among those with three or more live births, compared with women who had one live birth. I observed a significantly decreasing trend in adjusted hazard ratios of suicide with increasing parity. This study provides evidence to support Durkheim's hypothesis that parenthood confers a protective effect against suicide.
Kriege, M; Hollestelle, A; Jager, A; Huijts, P E A; Berns, E M; Sieuwerts, A M; Meijer-van Gelder, M E; Collée, J M; Devilee, P; Hooning, M J; Martens, J W M; Seynaeve, C
2014-08-26
We assessed the sensitivity to adjuvant chemotherapy in cell cycle checkpoint kinase 2 (CHEK2) vs non-CHEK2 breast cancer patients by comparing the contralateral breast cancer incidence and distant disease-free and breast cancer-specific survival between both groups, stratified for adjuvant chemotherapy. One Dutch hereditary non-BRCA1/2 breast cancer patient cohort (n=1220) and two Dutch cohorts unselected for family history (n=1014 and n=2488, respectively) were genotyped for CHEK2 1100delC. Hazard ratios for contralateral breast cancer, distant disease-free and breast cancer-specific death for mutation carriers vs noncarriers were calculated using the Cox proportional hazard method, stratified for adjuvant chemotherapy. The CHEK2 mutation carriers (n=193) had an increased incidence of contralateral breast cancer (multivariate hazard ratio 3.97, 95% confidence interval 2.59-6.07). Distant disease-free and breast cancer-specific survival were similar in the first 6 years in mutation carriers compared with noncarriers, but diverged from 6 years after breast cancer diagnosis onwards (multivariate hazard ratios and 95% confidence intervals 2.65 (1.79-3.93) and 2.05 (1.41-2.99), respectively). No significant interaction between CHEK2 and adjuvant chemotherapy was observed. CHEK2 1100delC-associated breast cancer is associated with a higher contralateral breast cancer rate as well as worse survival measures beyond 6 years after diagnosis. No differential sensitivity to adjuvant chemotherapy was observed in CHEK2 patients.
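Differential chemotherapy sensitivity of the kind examined here is typically assessed with a mutation-by-chemotherapy interaction term in a Cox model. A minimal sketch on simulated data with lifelines; the prevalences and effect sizes are assumptions, no true interaction is built into the simulation, and nothing below reproduces the study's actual model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 800
df = pd.DataFrame({
    "chek2": rng.binomial(1, 0.07, n).astype(float),   # mutation carrier indicator
    "chemo": rng.binomial(1, 0.5, n).astype(float),    # adjuvant chemotherapy indicator
})
df["chek2_x_chemo"] = df["chek2"] * df["chemo"]

# Simulated survival times with main effects only (no true interaction).
hazard = 0.05 * np.exp(0.5 * df["chek2"] - 0.3 * df["chemo"])
df["time"] = rng.exponential(1.0 / hazard)
df["event"] = rng.binomial(1, 0.7, n)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
# A small p-value for the interaction term would suggest that the effect of
# CHEK2 status on the hazard differs between treated and untreated patients.
print(cph.summary.loc["chek2_x_chemo", ["exp(coef)", "p"]])
```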
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-04-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
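As one concrete piece of the machinery described in the two entries above: under the standard parameterisation of the generalized Pareto model with shape ξ and scale σ, the hazard rate of the POT magnitude variable has the closed form h(x) = f(x)/[1 − F(x)] = 1/(σ + ξx) on its support. The quick numerical check below uses scipy with arbitrary parameter values; it is an illustration of this identity, not necessarily the exact quantity derived in the papers.

```python
import numpy as np
from scipy.stats import genpareto

xi, sigma = 0.2, 10.0          # assumed shape and scale of the POT exceedances
x = np.linspace(0.0, 50.0, 6)

# Hazard function h(x) = f(x) / S(x) computed numerically from scipy...
h_numeric = genpareto.pdf(x, c=xi, scale=sigma) / genpareto.sf(x, c=xi, scale=sigma)
# ...and the closed form 1 / (sigma + xi * x) for the generalized Pareto model.
h_closed = 1.0 / (sigma + xi * x)

print(np.allclose(h_numeric, h_closed))   # True
```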
Shirakabe, Akihiro; Hata, Noritake; Kobayashi, Nobuaki; Okazaki, Hirotake; Matsushita, Masato; Shibata, Yusaku; Nishigoori, Suguru; Uchiyama, Saori; Asai, Kuniya; Shimizu, Wataru
2018-06-01
Whether or not the definition of a worsening renal function (WRF) is adequate for the evaluation of acute renal failure in patients with acute heart failure is unclear. One thousand and eighty-three patients with acute heart failure were analysed. A WRF, indicated by a change in serum creatinine ≥0.3 mg/dL during the first 5 days, occurred in 360 patients, while no-WRF, indicated by a change <0.3 mg/dL, occurred in 723 patients. Acute kidney injury (AKI) upon admission was defined based on the ratio of the serum creatinine value recorded on admission to the baseline creatinine value and placed into groups based on the degree of AKI: no-AKI (n = 751), Class R (risk; n = 193), Class I (injury; n = 41), or Class F (failure; n = 98). The patients were assigned to another set of four groups: no-WRF/no-AKI (n = 512), no-WRF/AKI (n = 211), WRF/no-AKI (n = 239), and WRF/AKI (n = 121). A multivariate logistic regression model found that no-WRF/AKI and WRF/AKI were independently associated with 365-day mortality (hazard ratio: 1.916; 95% confidence interval: 1.234-2.974 and hazard ratio: 3.622; 95% confidence interval: 2.332-5.624). Kaplan-Meier survival curves showed that the rate of any-cause death during 1 year was significantly poorer in the no-WRF/AKI and WRF/AKI groups than in the WRF/no-AKI and no-WRF/no-AKI groups and in Class I and Class F than in Class R and the no-AKI group. The presence of AKI on admission, especially Class I and Class F status, is associated with a poor prognosis despite the lack of a WRF within the first 5 days. The prognostic ability of AKI on admission may be superior to WRF within the first 5 days. © 2018 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.
Hata, Noritake; Kobayashi, Nobuaki; Okazaki, Hirotake; Matsushita, Masato; Shibata, Yusaku; Nishigoori, Suguru; Uchiyama, Saori; Asai, Kuniya; Shimizu, Wataru
2018-01-01
Abstract Aims Whether or not the definition of a worsening renal function (WRF) is adequate for the evaluation of acute renal failure in patients with acute heart failure is unclear. Methods and results One thousand and eighty‐three patients with acute heart failure were analysed. A WRF, indicated by a change in serum creatinine ≥0.3 mg/dL during the first 5 days, occurred in 360 patients, while no‐WRF, indicated by a change <0.3 mg/dL, occurred in 723 patients. Acute kidney injury (AKI) upon admission was defined based on the ratio of the serum creatinine value recorded on admission to the baseline creatinine value and placed into groups based on the degree of AKI: no‐AKI (n = 751), Class R (risk; n = 193), Class I (injury; n = 41), or Class F (failure; n = 98). The patients were assigned to another set of four groups: no‐WRF/no‐AKI (n = 512), no‐WRF/AKI (n = 211), WRF/no‐AKI (n = 239), and WRF/AKI (n = 121). A multivariate logistic regression model found that no‐WRF/AKI and WRF/AKI were independently associated with 365‐day mortality (hazard ratio: 1.916; 95% confidence interval: 1.234–2.974 and hazard ratio: 3.622; 95% confidence interval: 2.332–5.624). Kaplan–Meier survival curves showed that the rate of any‐cause death during 1 year was significantly poorer in the no‐WRF/AKI and WRF/AKI groups than in the WRF/no‐AKI and no‐WRF/no‐AKI groups and in Class I and Class F than in Class R and the no‐AKI group. Conclusions The presence of AKI on admission, especially Class I and Class F status, is associated with a poor prognosis despite the lack of a WRF within the first 5 days. The prognostic ability of AKI on admission may be superior to WRF within the first 5 days. PMID:29388735
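A sketch of admission AKI grading from the ratio of admission to baseline serum creatinine, as described in the two records above. The cutoffs below are RIFLE-style assumptions (×1.5 Risk, ×2 Injury, ×3 Failure); the abstracts do not state the exact thresholds used.

```python
def aki_class(cr_admission: float, cr_baseline: float) -> str:
    """Grade admission AKI from the admission/baseline creatinine ratio.
    Cutoffs are RIFLE-style assumptions (x1.5 Risk, x2 Injury, x3 Failure)."""
    ratio = cr_admission / cr_baseline
    if ratio >= 3.0:
        return "Class F (failure)"
    if ratio >= 2.0:
        return "Class I (injury)"
    if ratio >= 1.5:
        return "Class R (risk)"
    return "no-AKI"

print(aki_class(2.4, 1.0))   # -> Class I (injury)
```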
Interval Estimation of Seismic Hazard Parameters
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanislaw
2017-03-01
The paper considers Poisson temporal occurrence of earthquakes and presents a way to integrate the uncertainties of the estimates of the mean activity rate and of the magnitude cumulative distribution function into the interval estimation of the most widely used seismic hazard functions, such as the exceedance probability and the mean return period. The proposed algorithm can be used either when the Gutenberg-Richter model of magnitude distribution is accepted or when nonparametric estimation is in use. When the Gutenberg-Richter model of magnitude distribution is used, the interval estimation of its parameters is based on the asymptotic normality of the maximum likelihood estimator. When the nonparametric kernel estimation of magnitude distribution is used, we propose the iterated bias-corrected and accelerated method for interval estimation based on the smoothed bootstrap and second-order bootstrap samples. The changes resulting from the integrated approach to the interval estimation of the seismic hazard functions, relative to the approach that neglects the uncertainty of the mean activity rate estimates, have been studied using Monte Carlo simulations and two real dataset examples. The results indicate that the uncertainty of the mean activity rate significantly affects the interval estimates of the hazard functions only when the product of the activity rate and the time period for which the hazard is estimated is no more than 5.0. When this product becomes greater than 5.0, the impact of the uncertainty of the cumulative distribution function of magnitude dominates the impact of the uncertainty of the mean activity rate in the aggregated uncertainty of the hazard functions, and the interval estimates with and without inclusion of the uncertainty of the mean activity rate converge. The presented algorithm is generic and can also be applied to capture the propagation of the uncertainty of estimates that are parameters of a multiparameter function onto this function.
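The two hazard functions named here take simple forms under the Poisson occurrence model: with mean activity rate λ and magnitude distribution function F, the probability of at least one event exceeding magnitude m within time T is 1 − exp(−λT[1 − F(m)]), and the mean return period of such events is 1/(λ[1 − F(m)]). A numerical sketch with an unbounded Gutenberg-Richter (exponential) magnitude model and assumed parameter values, not taken from the paper:

```python
import numpy as np

# Assumed parameters (illustration only): mean activity rate of events with
# magnitude >= m_min, Gutenberg-Richter b-value, and completeness magnitude.
lam, b, m_min = 5.0, 1.0, 2.0
beta = b * np.log(10.0)

def exceedance_prob(m, T):
    """P(at least one event with magnitude >= m within time T), Poisson occurrence,
    unbounded Gutenberg-Richter (exponential) magnitude distribution."""
    tail = np.exp(-beta * (m - m_min))       # 1 - F(m)
    return 1.0 - np.exp(-lam * T * tail)

def mean_return_period(m):
    tail = np.exp(-beta * (m - m_min))
    return 1.0 / (lam * tail)

print(exceedance_prob(m=4.0, T=1.0))     # annual exceedance probability of M >= 4
print(mean_return_period(m=4.0))         # mean return period (years) of M >= 4
```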
Rotator cuff strength balance in glovebox workers
Lawton, Cindy M.; Weaver, Amelia M.; Chan, Martha Kwan Yi; ...
2016-11-23
Gloveboxes are essential to the pharmaceutical, semi-conductor, nuclear, and biochemical industries. While gloveboxes serve as effective containment systems, they are often difficult to work in and present a number of ergonomic hazards. One such hazard is injury to the rotator cuff, a group of tendons and muscles in the shoulder, connecting the upper arm to the shoulder blade. Rotator cuff integrity is critical to shoulder health. This study compared the rotator cuff muscle strength ratios of glovebox workers to the healthy norm. Descriptive statistics were collected using a short questionnaire. Handheld dynamometry was used to quantify the ratio of forces produced for shoulder internal and external rotation. Results showed this population to have shoulder strength ratios significantly different from the healthy norm. Strength ratios were found to be a sound predictor of symptom incidence. The deviation from the normal ratio demonstrates the need for solutions designed to reduce the workload on the rotator cuff musculature in order to improve health and safety. Assessment of strength ratios can be used to screen for risk of symptom development. As a result, this increases technical knowledge and augments operational safety.
Hanai, Ko; Tauchi, Eriko; Nishiwaki, Yui; Mori, Tomomi; Yokoyama, Yoichi; Uchigata, Yasuko; Babazono, Tetsuya
2018-05-30
Most existing data regarding the effects of uric acid (UA) on diabetic kidney disease have considered patients with preserved kidney function. We examined the hypothesis that there are differences in the effects of serum UA levels on the decline in kidney function depending on baseline kidney function in diabetic patients. In this historical cohort study, 7033 type 2 diabetic patients were analyzed and classified into two groups as follows: nonchronic kidney disease (non-CKD), with an estimated glomerular filtration rate (eGFR) ≥60 mL/min/1.73 m2 (n = 4994), and CKD, with an eGFR <60 mL/min/1.73 m2 (n = 2039). The composite endpoint was a ≥30% decrease in eGFR from baseline or the initiation of renal replacement therapy. The hazard ratio (HR) of serum UA levels at baseline was estimated using multivariate Cox proportional hazards models. There was a significant interaction between UA levels and baseline eGFR with respect to the endpoint (P < 0.001). The HRs of a 1 mg/dL increase in UA levels were 1.13 [95% confidence interval (CI) 1.05-1.22, P = 0.002] and 0.93 (95% CI 0.88-0.99, P = 0.02) in the non-CKD and CKD groups, respectively. When patients were classified by quintile of UA levels, the HRs of those in the 5th quintile (versus the 1st quintile) were 1.64 (95% CI 1.23-2.18, P < 0.001) and 0.76 (95% CI 0.58-0.99, P = 0.05) in the non-CKD and CKD groups, respectively. The effects of UA on kidney function decline might differ depending on baseline kidney function in type 2 diabetic patients. High UA levels were a prognostic factor only in patients with preserved kidney function.
2014-01-01
Background Home hazards are associated with unintentional home injuries (UHI) among toddlers. These injuries result not only in physical and psychological difficulties for children, but also in economic losses and additional stress for their families. Few researchers have paid attention to predictors of home hazards among toddlers in a systematic way. The purpose of this study is firstly to describe the characteristics of homes with hazards and secondly to explore the predictive relationship of child, parent and family factors to home hazards among toddlers aged 24–47 months in Wenzhou, China. Methods Random cluster sampling was employed to select 366 parents of children aged 24–47 months from 13 kindergartens between March and April of 2012. Four instruments assessed home hazards, demographics, parents' awareness of UHI, and family functioning. Results Descriptive statistics showed that the mean number of home hazards was 12.29 (SD = 6.39). Nine kinds of home hazards were identified in over 50% of households, including plastic bags (74.3%), coin buttons (69.1%), and toys with small components (66.7%). Multivariate linear regression revealed that the predictors of home hazards were the child's age, the child's residential status and family functioning (b = .19, 2.02, -.07; p < .01, < .05 and < .01, respectively). Conclusions The results showed that a higher number of home hazards was significantly attributed to older toddlers, migrant toddlers and poorer family functioning. This result suggests that health care providers should focus on vulnerable families and help parents assess home hazards. Further study is needed to identify interventions for managing home hazards for toddlers in China. PMID:24953678
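A minimal sketch of the kind of multivariate linear regression reported here (number of home hazards regressed on child age, residential status, and family functioning), using statsmodels on simulated data. The variable coding, ranges, and coefficients are illustrative assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 366
df = pd.DataFrame({
    "child_age_months": rng.integers(24, 48, n).astype(float),
    "migrant": rng.binomial(1, 0.3, n).astype(float),        # residential status (assumed coding)
    "family_functioning": rng.normal(130.0, 15.0, n),
})
# Simulated outcome loosely shaped like the reported pattern (older age and
# migrant status increase hazards, better family functioning decreases them).
df["home_hazards"] = (8.0 + 0.2 * df["child_age_months"] + 2.0 * df["migrant"]
                      - 0.05 * df["family_functioning"] + rng.normal(0, 5, n))

X = sm.add_constant(df[["child_age_months", "migrant", "family_functioning"]])
fit = sm.OLS(df["home_hazards"], X).fit()
print(fit.params)   # unstandardised regression coefficients (b)
```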
Khan, Muhammad; Lin, Jie; Liao, Guixiang; Li, Rong; Wang, Baiyao; Xie, Guozhu; Zheng, Jieling; Yuan, Yawei
2017-07-01
Whole brain radiotherapy has been a standard treatment for brain metastases. Stereotactic radiosurgery provides more focal and aggressive radiation and normal tissue sparing but worse local and distant control. This meta-analysis was performed to assess and compare the effectiveness of whole brain radiotherapy alone, stereotactic radiosurgery alone, and their combination in the treatment of brain metastases based on randomized controlled trial studies. Electronic databases (PubMed, MEDLINE, Embase, and Cochrane Library) were searched to identify randomized controlled trials that compared the treatment outcomes of whole brain radiotherapy and stereotactic radiosurgery. This meta-analysis was performed using the Review Manager (RevMan) software (version 5.2) provided by the Cochrane Collaboration. The data used were hazard ratios with 95% confidence intervals calculated for time-to-event data extracted from survival curves and local tumor control rate curves. Odds ratios with 95% confidence intervals were calculated for dichotomous data, while mean differences with 95% confidence intervals were calculated for continuous data. Fixed-effects or random-effects models were adopted according to heterogeneity. Five studies (n = 763) meeting the inclusion criteria were included in this meta-analysis. All the included studies were randomized controlled trials. The sample size ranged from 27 to 331. In total, 202 patients (26%) received whole brain radiotherapy alone, 196 (26%) received stereotactic radiosurgery alone, and 365 (48%) received whole brain radiotherapy plus stereotactic radiosurgery. No significant survival benefit was observed for any treatment approach; the hazard ratio was 1.19 (95% confidence interval: 0.96-1.43, p = 0.12) based on three randomized controlled trials for whole brain radiotherapy only compared to whole brain radiotherapy plus stereotactic radiosurgery, and the hazard ratio was 1.03 (95% confidence interval: 0.82-1.29, p = 0.81) for stereotactic radiosurgery only compared to the combined approach. Local control was best achieved when whole brain radiotherapy was combined with stereotactic radiosurgery. Hazard ratios of 2.05 (95% confidence interval: 1.36-3.09, p = 0.0006) and 1.84 (95% confidence interval: 1.26-2.70, p = 0.002) were obtained from comparing whole brain radiotherapy only and stereotactic radiosurgery only to whole brain radiotherapy + stereotactic radiosurgery, respectively. There was no difference in adverse events between treatments; odds ratio 1.16 (95% confidence interval: 0.77-1.76, p = 0.48) and odds ratio 0.92 (95% confidence interval: 0.59-1.42, p = 0.71) for whole brain radiotherapy + stereotactic radiosurgery versus whole brain radiotherapy only and whole brain radiotherapy + stereotactic radiosurgery versus stereotactic radiosurgery only, respectively. Adding stereotactic radiosurgery to whole brain radiotherapy provides better local control as compared to whole brain radiotherapy only and stereotactic radiosurgery only, with no difference in radiation-related toxicities.
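The step "fixed-effects or random-effects models were adopted according to heterogeneity" is conventionally based on Cochran's Q and I²; when heterogeneity is substantial, a DerSimonian-Laird random-effects model adds a between-study variance to the weights. A minimal sketch on hypothetical log hazard ratios (not the trial data), as one common way this choice is made:

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical study-level log hazard ratios and standard errors (not trial data).
y = np.log(np.array([1.9, 2.3, 1.6, 2.8]))
se = np.array([0.30, 0.25, 0.35, 0.40])
w = 1.0 / se**2

# Fixed-effect pooled estimate and Cochran's Q / I^2 heterogeneity statistics.
mu_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - mu_fe) ** 2)
k = len(y)
I2 = max(0.0, (Q - (k - 1)) / Q) * 100.0
p_het = chi2.sf(Q, df=k - 1)

# DerSimonian-Laird between-study variance and random-effects pooling.
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (se**2 + tau2)
mu_re = np.sum(w_re * y) / np.sum(w_re)

print(f"Q={Q:.2f} (p={p_het:.2f}), I^2={I2:.0f}%")
print(f"fixed-effect HR={np.exp(mu_fe):.2f}, random-effects HR={np.exp(mu_re):.2f}")
```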