Socioeconomic disparities in outcomes after acute myocardial infarction.
Bernheim, Susannah M; Spertus, John A; Reid, Kimberly J; Bradley, Elizabeth H; Desai, Rani A; Peterson, Eric D; Rathore, Saif S; Normand, Sharon-Lise T; Jones, Philip G; Rahimi, Ali; Krumholz, Harlan M
2007-02-01
Patients of low socioeconomic status (SES) have higher mortality after acute myocardial infarction (AMI). Little is known about the underlying mechanisms or the relationship between SES and rehospitalization after AMI. We analyzed data from the PREMIER observational study, which included 2142 patients hospitalized with AMI from 18 US hospitals. Socioeconomic status was measured by self-reported household income and education level. Sequential multivariable modeling assessed the relationship of socioeconomic factors with 1-year all-cause mortality and all-cause rehospitalization after adjustment for demographics, clinical factors, and quality-of-care measures. Both lower household income and lower education level were associated with higher risk of mortality (hazard ratio 2.80, 95% CI 1.37-5.72, lowest vs highest income group) and rehospitalization after AMI (hazard ratio 1.55, 95% CI 1.17-2.05). Patients with low SES had worse clinical status at admission and received poorer quality of care. In multivariable modeling, the relationship between household income and mortality was attenuated by adjustment for demographic and clinical factors (hazard ratio 1.19, 95% CI 0.54-2.62), with a further small decrement in the hazard ratio after adjustment for quality of care. The relationship between income and rehospitalization was only partly attenuated by demographic and clinical factors (hazard ratio 1.38, 95% CI 1.01-1.89) and was not influenced by adjustment for quality of care. Patients' baseline clinical status largely explained the relationship between SES and mortality, but not rehospitalization, among patients with AMI.
Shiraishi, Takuya; Ishikawa, Shizukiyo; Kario, Kazuomi; Kayaba, Kazunori; Kajii, Eiji
2017-11-01
The role of factor VII (FVII) as a risk factor in myocardial infarction (MI) has been the subject of numerous studies. However, it remains uncertain whether FVII levels are associated with the development of MI. The subjects were 4142 men and women whose activated FVII (FVIIa) and FVII coagulant (FVIIc) levels were measured in the Jichi Medical School Cohort Study. Subjects were divided into tertiles by FVIIa and FVIIc levels, and Cox's proportional hazard model was used to calculate hazard ratios (HRs) for MI. The multivariate-adjusted HRs (95% confidence interval [CI]) for FVIIa in men were 0.67 (0.67-1.78) in tertile 2 (T2), and 0.52 (0.17-1.60) in T3. In women, the multivariate-adjusted HRs (95% CI) were 0.18 (0.02-1.60) in T2, and 0.39 (0.07-2.20) in T3. The multivariate-adjusted HRs (95% CI) for FVIIc in men were 0.54 (0.21-1.36) in T2, and 0.20 (0.04-0.91) in T3. In women, the multivariate-adjusted HRs (95% CI) were 0.44 (0.07-2.85) in T2, and 0.35 (0.06-2.22) in T3. We used T1 as a reference for all measures. Our findings revealed a significant association between low FVIIc level and incidence of MI in men. FVIIa and FVIIc levels were otherwise inversely related to MI risk, but these associations did not reach statistical significance. Future studies are needed to confirm this association. © 2017 Wiley Periodicals, Inc.
Association of Race With Mortality and Cardiovascular Events in a Large Cohort of US Veterans.
Kovesdy, Csaba P; Norris, Keith C; Boulware, L Ebony; Lu, Jun L; Ma, Jennie Z; Streja, Elani; Molnar, Miklos Z; Kalantar-Zadeh, Kamyar
2015-10-20
In the general population, blacks experience higher mortality than their white peers, attributed in part to their lower socioeconomic status, reduced access to care, and possibly intrinsic biological factors. Patients with kidney disease are a notable exception, among whom blacks experience lower mortality. It is unclear if similar differences affecting outcomes exist in patients with no kidney disease but with equal or similar access to health care. We compared all-cause mortality, incident coronary heart disease, and incident ischemic stroke using multivariable-adjusted Cox models in a nationwide cohort of 547 441 black and 2 525 525 white patients with baseline estimated glomerular filtration rate ≥ 60 mL·min⁻¹·1.73 m⁻² receiving care from the US Veterans Health Administration. In parallel analyses, we compared outcomes in black versus white individuals in the National Health and Nutrition Examination Survey (NHANES) 1999 to 2004. After multivariable adjustments in veterans, black race was associated with 24% lower all-cause mortality (adjusted hazard ratio, 0.76; 95% confidence interval, 0.75-0.77; P<0.001) and 37% lower incidence of coronary heart disease (adjusted hazard ratio, 0.63; 95% confidence interval, 0.62-0.65; P<0.001) but a similar incidence of ischemic stroke (adjusted hazard ratio, 0.99; 95% confidence interval, 0.97-1.01; P=0.3). Black race was associated with a 42% higher adjusted mortality among individuals with estimated glomerular filtration rate ≥ 60 mL·min⁻¹·1.73 m⁻² in NHANES (adjusted hazard ratio, 1.42; 95% confidence interval, 1.09-1.87). Black veterans with normal estimated glomerular filtration rate and equal access to healthcare have lower all-cause mortality and incidence of coronary heart disease and a similar incidence of ischemic stroke. These associations are in contrast to the higher mortality experienced by black individuals in the general US population. © 2015 American Heart Association, Inc.
Lee, Jane J.; Yin, Xiaoyan; Hoffmann, Udo; Fox, Caroline S.; Benjamin, Emelia J.
2016-01-01
Obesity is associated with increased risk of developing atrial fibrillation (AF). Different fat depots may have differential associations with cardiac pathology. We examined the longitudinal associations of pericardial, intrathoracic, and visceral fat with incident AF. We studied Framingham Heart Study Offspring and Third Generation Cohort participants who underwent the multi-detector computed tomography sub-study examination 1. We constructed multivariable-adjusted Cox proportional hazard models for risk of incident AF. Body mass index (BMI) was included in the multivariable-adjusted model as a secondary adjustment. We included 2,135 participants (53.3% women; mean age 58.8 years). During a median follow-up of 9.7 years, we identified 162 cases of incident AF. Across increasing tertiles of pericardial fat volume, the age- and sex-adjusted incident AF rates per 1000 person-years of follow-up were 8.4, 7.5, and 10.2. In an age- and sex-adjusted model, greater pericardial fat [hazard ratio (HR) 1.17, 95% confidence interval (CI) 1.03-1.34] and intrathoracic fat (HR 1.24, 95% CI 1.06-1.45) were associated with increased risk of incident AF. After multivariable adjustment, the HRs (95% CI) for incident AF were 1.13 (0.99-1.30) for pericardial fat, 1.19 (1.01-1.40) for intrathoracic fat, and 1.09 (0.93-1.28) for abdominal visceral fat. After additional adjustment for BMI, none of the associations remained significant (all p>0.05). Our findings suggest that cardiac ectopic fat depots may share common risk factors with AF, which may explain the lack of an independent association between pericardial fat and incident AF. PMID:27666172
Meat, Dietary Heme Iron, and Risk of Type 2 Diabetes Mellitus: The Singapore Chinese Health Study.
Talaei, Mohammad; Wang, Ye-Li; Yuan, Jian-Min; Pan, An; Koh, Woon-Puay
2017-10-01
We evaluated the relationships of red meat, poultry, fish, and shellfish intakes, as well as heme iron intake, with the risk of type 2 diabetes mellitus (T2D). The Singapore Chinese Health Study is a population-based cohort study that recruited 63,257 Chinese adults aged 45-74 years from 1993 to 1998. Usual diet was evaluated using a validated 165-item semiquantitative food frequency questionnaire at recruitment. Physician-diagnosed T2D was self-reported during 2 follow-up interviews in 1999-2004 and 2006-2010. During a mean follow-up of 10.9 years, 5,207 incident cases of T2D were reported. When comparing persons in the highest intake quartiles with those in the lowest, the multivariate-adjusted hazard ratio for T2D was 1.23 (95% confidence interval (CI): 1.14, 1.33) for red meat intake (P for trend < 0.001), 1.15 (95% CI: 1.06, 1.24) for poultry intake (P for trend = 0.004), and 1.07 (95% CI: 0.99, 1.16) for fish/shellfish intake (P for trend = 0.12). After additional adjustment for heme iron, only red meat intake remained significantly associated with T2D risk (multivariate-adjusted hazard ratio = 1.13, 95% CI: 1.01, 1.25; P for trend = 0.02). Heme iron was associated with a higher risk of T2D even after additional adjustment for red meat intake (multivariate-adjusted hazard ratio = 1.14, 95% CI: 1.02, 1.28; P for trend = 0.03). In conclusion, red meat and poultry intakes were associated with a higher risk of T2D. These associations were mediated completely for poultry and partially for red meat by heme iron intake. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
Lai, Shih-Wei; Lin, Cheng-Li; Liao, Kuan-Fu
2017-09-01
We assessed the association between diabetes mellitus and the risk of pleural empyema in Taiwan. A population-based retrospective cohort study was conducted using the database of the Taiwan National Health Insurance Program. There were 28,802 subjects aged 20 to 84 years who were newly diagnosed with diabetes mellitus from 2000 to 2010 as the diabetes group and 114,916 randomly selected subjects without diabetes mellitus as the non-diabetes group. The diabetes group and the non-diabetes group were matched by sex, age, comorbidities, and the year of index date. The incidence of pleural empyema at the end of 2011 was estimated. A multivariable Cox proportional hazards regression model was used to estimate the hazard ratio (HR) and 95% confidence interval (95% CI) for pleural empyema associated with diabetes mellitus. The overall incidence of pleural empyema was 1.65-fold higher in the diabetes group than in the non-diabetes group (1.58 vs 0.96 per 10,000 person-years, 95% CI 1.57-1.72). After adjusting for confounders, the multivariable Cox proportional hazards regression model revealed that the adjusted HR of pleural empyema was 1.71 in subjects with diabetes mellitus (95% CI 1.16-2.51), compared with those without diabetes mellitus. In further analysis, even in the absence of any comorbidity, the adjusted HR was 1.99 for subjects with diabetes mellitus alone (95% CI 1.18-3.38). Patients with diabetes mellitus thus have a 1.71-fold increased hazard of developing pleural empyema, and this risk persists even in the absence of any comorbidity.
Rivera, Andrew; Nan, Hongmei; Li, Tricia; Qureshi, Abrar; Cho, Eunyoung
2016-01-01
Background Alcohol consumption is associated with increased risk of numerous cancers, but existing evidence for an association with melanoma is equivocal. No study has evaluated the association with different anatomic locations of melanoma. Methods We used data from three large prospective cohort studies to investigate whether alcohol intake was associated with risk of melanoma. Alcohol intake was assessed repeatedly by food-frequency questionnaires. A Cox proportional hazards model was used to calculate multivariate-adjusted hazard ratios (HRs). Results A total of 1,374 cases of invasive melanoma were documented during 3,855,706 person-years of follow-up. There was an association between higher alcohol intake and incidence of invasive melanoma (pooled multivariate HR 1.14; 95% confidence interval [CI]: 1.00–1.29 per drink/d; p trend = 0.04). Among alcoholic beverages, white wine consumption was associated with an increased risk of melanoma (pooled multivariate HR 1.13 [95% CI: 1.04–1.24] per drink/d; p trend < 0.01) after adjusting for other alcoholic beverages. The association between alcohol consumption and melanoma risk was stronger for melanoma in relatively UV-spared sites (trunk) versus more UV-exposed sites (head, neck, or extremities). Compared to non-drinkers, the pooled multivariate-adjusted HRs for ≥20 g/d of alcohol were 1.02 (95% CI: 0.64–1.62; p trend = 0.25) for melanomas of the head, neck, and extremities and 1.73 (95% CI: 1.25–2.38; p trend = 0.02) for melanomas of the trunk. Conclusions Alcohol intake was associated with a modest increase in the risk of melanoma, particularly in UV-protected sites. Impact These findings further support American Cancer Society Guidelines for Cancer Prevention to limit alcohol intake. PMID:27909090
Li, Shijun; Barywani, Salim; Fu, Michael
2017-01-01
Patients with acute coronary syndrome (ACS) aged ≥ 80 years are underrepresented in clinical trials, and the association of heart rate (HR) with mortality in this population is unclear. We therefore aimed to investigate the association of HR in atrial fibrillation (AF) versus sinus rhythm (SR) with all-cause mortality in octogenarian patients with ACS. A total of 336 patients with ACS aged ≥ 80 years were enrolled into the current study. The end point of interest was death from any cause. The association of HR in AF versus SR with mortality was analyzed by Kaplan-Meier curves with the log-rank test and multivariable Cox regression analysis. In total, 63 (87.5%) patients with AF and 147 (59.8%) patients with SR died during the follow-up period. The best cut-off for HR was 80 bpm, with a sensitivity of 62% and specificity of 66%. HR ≤ 80 bpm in SR but not in AF was associated with better outcome as compared with HR > 80 bpm (Chi-Square = 26.55, Log rank P < 0.001). In the SR subgroup, the hazard ratios for HR ≤ 80 bpm were 0.51 (95% CI 0.37-0.70, P < 0.001) adjusted for age, 0.46 (95% CI 0.33-0.63, P < 0.001) adjusted for gender, and 0.62 (95% CI 0.42-0.93, P = 0.020) after multivariable adjustment. In the AF subgroup, the hazard ratios for HR ≤ 80 bpm were 0.83 (95% CI 0.49-1.38, P = 0.464) adjusted for age, 0.96 (95% CI 0.59-1.58, P = 0.882) adjusted for gender, and 0.72 (95% CI 0.41-1.26, P = 0.249) after multivariable adjustment. The current study demonstrates that heart rate is an independent prognostic predictor for all-cause mortality, and HR ≤ 80 bpm is associated with improved outcome in SR but not in AF in octogenarian patients with ACS.
Pfister, Roman; Michels, Guido; Sharp, Stephen J; Luben, Robert; Wareham, Nick J; Khaw, Kay-Tee
2014-04-01
Interventional trials provide evidence for a beneficial effect of reduced dietary sodium intake on blood pressure. The association of sodium intake with heart failure, which is a long-term complication of hypertension, has not been examined. Hazard ratios [HRs, 95% confidence interval (CI)] of heart failure comparing quintiles of estimated 24 h urinary sodium excretion (USE) were calculated in apparently healthy men (n = 9017) and women (n = 10,840) aged 39–79 years participating in the EPIC study in Norfolk. During a mean follow-up of 12.9 years, 1210 incident cases of heart failure occurred. Compared with the reference category (128 mmol/day ≤ USE ≤ 148 mmol/day), the top quintile (USE ≥ 191 mmol/day) was associated with a significantly increased hazard of heart failure (1.32, 1.07–1.62) in multivariable analysis adjusting for age, sex, body mass index, diabetes, cholesterol, social class, educational level, smoking, physical activity, and alcohol consumption, with a marked attenuation (1.21, 0.98–1.49) when further adjusting for blood pressure. The bottom quintile (USE ≤ 127 mmol/day) was also associated with an increased hazard of heart failure (1.29, 1.04–1.60) in multivariable analysis without relevant attenuation by blood pressure adjustment (1.26, 1.02–1.56), but with substantial attenuation when adjusting for interim ischaemic heart disease and baseline C-reactive protein levels and excluding events during the first 2 years (1.18, 0.96–1.47). We demonstrate a U-shaped association between USE and heart failure risk in an apparently healthy middle-aged population. The risk associated with the high range of USE was attenuated after adjustment for blood pressure, whereas the risk associated with the low range of USE was attenuated after adjustment for pre-existing disease processes. © 2014 The Authors. European Journal of Heart Failure © 2014 European Society of Cardiology.
Herpes zoster correlates with increased risk of Parkinson's disease in older people
Lai, Shih-Wei; Lin, Chih-Hsueh; Lin, Hsien-Feng; Lin, Cheng-Li; Lin, Cheng-Chieh; Liao, Kuan-Fu
2017-01-01
Abstract Little is known about the relationship between herpes zoster and Parkinson's disease in older people. This study aimed to explore whether herpes zoster could be associated with Parkinson's disease in older people in Taiwan. We conducted a retrospective cohort study using the claims data of the Taiwan National Health Insurance Program. There were 10,296 subjects aged 65 years and older with newly diagnosed herpes zoster as the herpes zoster group and 39,405 randomly selected subjects aged 65 years and older without a diagnosis of herpes zoster as the nonherpes zoster group from 1998 to 2010. Both groups were followed up until subjects received a diagnosis of Parkinson's disease, allowing us to explore whether subjects with herpes zoster were at an increased risk of Parkinson's disease. Relative risks were estimated by adjusted hazard ratio (HR) and 95% confidence interval (CI) using the multivariable Cox proportional hazards regression model. The incidence of Parkinson's disease was higher in the herpes zoster group than in the nonherpes zoster group (4.86 vs 4.00 per 1000 person-years, 95% CI 1.14, 1.29). After adjustment for confounding factors, the multivariable Cox proportional hazards regression model revealed that the adjusted HR of Parkinson's disease was 1.17 for the herpes zoster group (95% CI 1.10, 1.25), compared with the nonherpes zoster group. Older people with herpes zoster thus had a slightly increased hazard of developing Parkinson's disease compared with those without herpes zoster, indicating that herpes zoster correlates with an increased risk of Parkinson's disease in older people. When older people with herpes zoster seek help, clinicians should pay more attention to the development of the cardinal symptoms of Parkinson's disease. PMID:28207515
Nagar, Himanshu; Yan, Weisi; Christos, Paul; Chao, K S Clifford; Nori, Dattatreyudu; Ravi, Akkamma
2017-06-01
Studies have shown that older women are undertreated for breast cancer. Few data are available on cancer-related death in elderly women aged 70 years and older with pathologic stage T1a-b N0 breast cancer and the impact of prognostic factors on cancer-related death. The Surveillance, Epidemiology, and End Results (SEER) database was queried for women aged 70 years or above diagnosed with pT1a or pT1b, N0 breast cancer who underwent breast conservation surgery from 1999 to 2003. Kaplan-Meier survival analysis was performed to evaluate breast cause-specific survival (CSS) and overall survival (OS), and the log-rank test was employed to compare CSS/OS between different groups of interest. Multivariable analysis (MVA), using Cox proportional hazards regression analysis, was performed to evaluate the independent effect of age, race, stage, grade, ER status, and radiation treatment on CSS. Adjusted hazard ratios were calculated from the MVA and reflect the increased risk of breast cancer death. Competing-risks survival regression was also performed to adjust the univariate and multivariable CSS hazard ratios for the competing event of death due to causes other than breast cancer. Patients aged 85 and above had a greater risk of breast cancer death compared with patients aged 70 to 74 years (referent category) (adjusted hazard ratio [HR] = 1.98). Race had no effect on CSS. Patients with stage T1bN0 breast cancer had a greater risk of breast cancer death compared with stage T1aN0 patients (adjusted HR = 1.35; P = 0.09). ER-negative patients had a greater risk of breast cancer death compared with ER-positive patients (adjusted HR = 1.59; P < 0.017). Patients with higher grade tumors had a greater risk of breast cancer death compared with patients with grade 1 tumors (referent category) (adjusted HRs = 1.69 and 2.96 for grades 2 and 3, respectively). Patients who underwent radiation therapy had a lower risk of breast cancer death compared with patients who did not (adjusted HR = 0.55; P < 0.0001). Older patients with higher grade, pT1b, ER-negative breast cancer had an increased risk of breast cancer-related death. Adjuvant radiation therapy may provide a CSS benefit in this elderly patient population.
Vitamin D and the risk of dementia and Alzheimer disease.
Littlejohns, Thomas J; Henley, William E; Lang, Iain A; Annweiler, Cedric; Beauchet, Olivier; Chaves, Paulo H M; Fried, Linda; Kestenbaum, Bryan R; Kuller, Lewis H; Langa, Kenneth M; Lopez, Oscar L; Kos, Katarina; Soni, Maya; Llewellyn, David J
2014-09-02
To determine whether low vitamin D concentrations are associated with an increased risk of incident all-cause dementia and Alzheimer disease. One thousand six hundred fifty-eight elderly ambulatory adults free from dementia, cardiovascular disease, and stroke who participated in the US population-based Cardiovascular Health Study between 1992-1993 and 1999 were included. Serum 25-hydroxyvitamin D (25(OH)D) concentrations were determined by liquid chromatography-tandem mass spectrometry from blood samples collected in 1992-1993. Incident all-cause dementia and Alzheimer disease status were assessed during follow-up using National Institute of Neurological and Communicative Disorders and Stroke/Alzheimer's Disease and Related Disorders Association criteria. During a mean follow-up of 5.6 years, 171 participants developed all-cause dementia, including 102 cases of Alzheimer disease. Using Cox proportional hazards models, the multivariate adjusted hazard ratios (95% confidence interval [CI]) for incident all-cause dementia in participants who were severely 25(OH)D deficient (<25 nmol/L) and deficient (≥25 to <50 nmol/L) were 2.25 (95% CI: 1.23-4.13) and 1.53 (95% CI: 1.06-2.21) compared to participants with sufficient concentrations (≥50 nmol/L). The multivariate adjusted hazard ratios for incident Alzheimer disease in participants who were severely 25(OH)D deficient and deficient compared to participants with sufficient concentrations were 2.22 (95% CI: 1.02-4.83) and 1.69 (95% CI: 1.06-2.69). In multivariate adjusted penalized smoothing spline plots, the risk of all-cause dementia and Alzheimer disease markedly increased below a threshold of 50 nmol/L. Our results confirm that vitamin D deficiency is associated with a substantially increased risk of all-cause dementia and Alzheimer disease. This adds to the ongoing debate about the role of vitamin D in nonskeletal conditions. © 2014 American Academy of Neurology.
Chang, Y S; Chang, C C; Chen, Y H; Chen, W S; Chen, J H
2017-10-01
Objectives Patients with systemic lupus erythematosus are considered vulnerable to infective endocarditis, and prophylactic antibiotics are recommended before an invasive dental procedure. However, the evidence is insufficient. This nationwide population-based study evaluated the risk and related factors of infective endocarditis in systemic lupus erythematosus. Methods We identified 12,102 systemic lupus erythematosus patients from the National Health Insurance Research Database, and compared the incidence rate of infective endocarditis with that among 48,408 non-systemic lupus erythematosus controls. A Cox multivariable proportional hazards model was employed to evaluate the risk of infective endocarditis in the systemic lupus erythematosus cohort. Results After a mean follow-up of more than six years, the systemic lupus erythematosus cohort had a significantly higher incidence rate of infective endocarditis than the control cohort (42.58 vs 4.32 per 100,000 person-years, incidence rate ratio = 9.86, p < 0.001). The systemic lupus erythematosus cohort aged 60 years and older had a lower risk (adjusted hazard ratio 11.64) than the younger-than-60-years systemic lupus erythematosus cohort (adjusted hazard ratio 15.82). Cox multivariate proportional hazards analysis revealed that heart disease (hazard ratio = 5.71, p < 0.001), chronic kidney disease (hazard ratio = 2.98, p = 0.034), receiving a dental procedure within 30 days (hazard ratio = 36.80, p < 0.001), and intravenous steroid therapy within 30 days (hazard ratio = 39.59, p < 0.001) were independent risk factors for infective endocarditis in systemic lupus erythematosus patients. Conclusions A higher risk of infective endocarditis was observed in systemic lupus erythematosus patients. Risk factors for infective endocarditis in the systemic lupus erythematosus cohort included heart disease, chronic kidney disease, intravenous steroid therapy within 30 days, and an invasive dental procedure within 30 days.
Hansson, Lotta; Asklid, Anna; Diels, Joris; Eketorp-Sylvan, Sandra; Repits, Johanna; Søltoft, Frans; Jäger, Ulrich; Österborg, Anders
2017-10-01
This study explored the relative efficacy of ibrutinib versus previous standard-of-care treatments in relapsed/refractory patients with chronic lymphocytic leukaemia (CLL), using multivariate regression modelling to adjust for baseline prognostic factors. Individual patient data were collected from an observational Stockholm cohort of consecutive patients (n = 144) diagnosed with CLL between 2002 and 2013 who had received at least second-line treatment. Data were compared with results of the RESONATE clinical trial. A multivariate Cox proportional hazards regression model was used to estimate the hazard ratio (HR) of ibrutinib versus previous standard of care. The adjusted HR of ibrutinib versus the previous standard-of-care cohort was 0.15 (p < 0.0001) for progression-free survival (PFS) and 0.36 (p < 0.0001) for overall survival (OS). A similar difference was also observed when patients treated late in the period (2012 onwards) were compared separately. Multivariate analysis showed that later line of therapy, male gender, older age and poor performance status were significant independent risk factors for worse PFS and OS. Our results suggest that PFS and OS with ibrutinib in the RESONATE study were significantly longer than with previous standard-of-care regimens used in second or later lines in routine healthcare. The approach used, which must be interpreted with caution, compares patient-level data from a clinical trial with outcomes observed in daily clinical practice and may complement results from randomised trials or provide preliminary wider comparative information until phase 3 data exist.
Health Literacy, Cognitive Abilities, and Mortality Among Elderly Persons
Wolf, Michael S.; Feinglass, Joseph; Thompson, Jason A.
2008-01-01
Background Low health literacy and low cognitive abilities both predict mortality, but no study has jointly examined these relationships. Methods We conducted a prospective cohort study of 3,260 community-dwelling adults age 65 and older. Participants were interviewed in 1997 and administered the Short Test of Functional Health Literacy in Adults and the Mini-Mental State Examination. Mortality was determined using the National Death Index through 2003. Measurements and Main Results In multivariate models with only literacy (not cognition), the adjusted hazard ratio was 1.50 (95% confidence interval [CI] 1.24–1.81) for inadequate versus adequate literacy. In multivariate models without literacy, delayed recall of 3 items and the ability to serially subtract numbers were associated with higher mortality (e.g., adjusted hazard ratios [AHR] 1.74 [95% CI 1.30–2.34] for recall of zero versus 3 items, and 1.32 [95% CI 1.09–1.60] for 0–2 vs 5 correct subtractions). In multivariate analysis with both literacy and cognition, the AHRs for the cognition items were similar, but the AHR for inadequate literacy decreased to 1.27 (95% CI 1.03–1.57). Conclusions Both health literacy and cognitive abilities independently predict mortality. Interventions to improve patient knowledge and self-management skills should consider both the reading level and cognitive demands of the materials. PMID:18330654
Gutiérrez, Orlando M; Irvin, Marguerite R; Chaudhary, Ninad S; Cushman, Mary; Zakai, Neil A; David, Victor A; Limou, Sophie; Pamir, Nathalie; Reiner, Alex P; Naik, Rakhi P; Sale, Michele M; Safford, Monika M; Hyacinth, Hyacinth I; Judd, Suzanne E; Kopp, Jeffrey B; Winkler, Cheryl A
2018-06-01
APOL1 renal risk variants are strongly associated with chronic kidney disease in Black adults, but reported associations with cardiovascular disease (CVD) have been conflicting. We examined associations of APOL1 with incident coronary heart disease (n=323), ischemic stroke (n=331), and the composite CVD outcome (n=500) in 10 605 Black participants of the REGARDS study (Reasons for Geographic and Racial Differences in Stroke). Primary analyses compared individuals with APOL1 high-risk genotypes to APOL1 low-risk genotypes in Cox proportional hazards models adjusted for CVD risk factors and African ancestry. APOL1 high-risk participants were younger and more likely to have albuminuria at baseline than APOL1 low-risk participants. The risk of incident stroke, coronary heart disease, or the composite CVD end point did not significantly differ by APOL1 genotype status in multivariable models. The association of APOL1 genotype with incident composite CVD differed by diabetes mellitus status (P for interaction=0.004). In those without diabetes mellitus, APOL1 high-risk genotypes were associated with greater risk of incident composite CVD (hazard ratio, 1.67; 95% confidence interval, 1.12-2.47) compared with APOL1 low-risk genotypes in multivariable adjusted models. This latter association was driven by ischemic strokes (hazard ratio, 2.32; 95% confidence interval, 1.33-4.07), in particular those related to small vessel disease (hazard ratio, 5.10; 95% confidence interval, 1.55-16.56). There was no statistically significant association of APOL1 genotypes with incident CVD in subjects with diabetes mellitus. The APOL1 high-risk genotype was associated with higher stroke risk in individuals without, but not those with, chronic kidney disease in fully adjusted models. APOL1 high-risk status is associated with CVD events in community-dwelling Black adults without diabetes mellitus. © 2018 American Heart Association, Inc.
Fröhlich, Hanna; Zhao, Jingting; Täger, Tobias; Cebola, Rita; Schellberg, Dieter; Katus, Hugo A; Grundtvig, Morten; Hole, Torstein; Atar, Dan; Agewall, Stefan; Frankenstein, Lutz
2015-09-01
β-Blockers exert a prognostic benefit in the treatment of chronic heart failure, but their pharmacological properties vary. The only substantial comparative trial to date, the Carvedilol or Metoprolol European Trial, compared carvedilol with short-acting metoprolol tartrate at different dose equivalents. We therefore addressed the relative efficacy of equal doses of carvedilol and metoprolol succinate on survival in multicenter hospital outpatients with chronic heart failure. Four thousand sixteen patients with stable systolic chronic heart failure who were using either carvedilol or metoprolol succinate were identified in the Norwegian Heart Failure Registry and the Heart Failure Registry of the University of Heidelberg, Germany. Patients were individually matched on both the dose equivalents and the respective propensity scores for β-blocker treatment. During 17 672 patient-years of follow-up, 304 (27.2%) patients died in the carvedilol group and 1066 (36.8%) in the metoprolol group. In a univariable analysis of the general sample, metoprolol therapy was associated with higher mortality compared with carvedilol therapy (hazard ratio, 1.49; 95% confidence interval, 1.31-1.69; P<0.001). This difference was not seen after multivariable adjustment (hazard ratio, 0.93; 95% confidence interval, 0.57-1.50; P=0.75), after adjustment for propensity score and dose equivalents (hazard ratio, 1.06; 95% confidence interval, 0.94-1.20; P=0.36), or in the propensity and dose equivalent-matched sample (hazard ratio, 1.00; 95% confidence interval, 0.82-1.23; P=0.99). These results were essentially unchanged for all prespecified subgroups. In outpatients with chronic heart failure, no conclusive association between all-cause mortality and treatment with carvedilol or metoprolol succinate was observed after either multivariable adjustment or multilevel propensity score matching. © 2015 American Heart Association, Inc.
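As an editor's illustrative aside (not taken from the registry study above), the kind of dose-equivalent and propensity-score matching it describes could be sketched roughly as follows. The column names (carvedilol, dose_equiv, the covariate list) and the 0.05 caliper are hypothetical assumptions; the study's actual matching algorithm may have differed.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_on_propensity(df, covars, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score,
    exact on beta-blocker dose equivalents (hypothetical column names)."""
    ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["carvedilol"])
    df = df.assign(ps=ps_model.predict_proba(df[covars])[:, 1])

    treated = df[df["carvedilol"] == 1]
    controls = df[df["carvedilol"] == 0]
    used, pairs = set(), []
    for i, row in treated.iterrows():
        pool = controls[(controls["dose_equiv"] == row["dose_equiv"])
                        & (~controls.index.isin(used))]
        if pool.empty:
            continue
        j = (pool["ps"] - row["ps"]).abs().idxmin()          # closest propensity score
        if abs(pool.loc[j, "ps"] - row["ps"]) <= caliper:    # accept only within caliper
            pairs.append((i, j))
            used.add(j)
    return pairs

# usage sketch (hypothetical covariates):
# pairs = match_on_propensity(df, ["age", "sex", "nyha_class", "lvef", "dose_equiv"])
```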
Shih, H-J; Kao, M-C; Tsai, P-S; Fan, Y-C; Huang, C-J
2017-09-01
Clinical observations have indicated an increased risk of developing prostate cancer in gout patients. Chronic inflammation is postulated to be one crucial mechanism for prostate carcinogenesis. Allopurinol, a widely used antigout agent, possesses potent anti-inflammatory capacity. We elucidated whether allopurinol decreases the risk of prostate cancer in gout patients. We analyzed data retrieved from the Taiwan National Health Insurance Database between January 2000 and December 2012. Patients diagnosed with gout during the study period with no history of prostate cancer and who had never used allopurinol were selected. Four allopurinol use cohorts (allopurinol use >365 days, 181-365 days, 91-180 days, and 31-90 days) and one cohort without allopurinol use (allopurinol use (No)) were included. The study end point was the diagnosis of new-onset prostate cancer. Multivariable Cox proportional hazards regression and propensity score-adjusted Cox regression models were used to estimate the association between the risk of prostate cancer and allopurinol treatment in gout patients after adjusting for potential confounders. A total of 25 770 gout patients (aged between 40 and 100 years) were included. Multivariable Cox regression analyses revealed that the risk of developing prostate cancer in the allopurinol use (>365 days) cohort was significantly lower than in the allopurinol use (No) cohort (adjusted hazard ratio (HR)=0.64, 95% confidence interval (CI)=0.45-0.9, P=0.011). After propensity score adjustment, the trend remained the same (adjusted HR=0.66, 95% CI=0.46-0.93, P=0.019). Long-term (more than 1 year) allopurinol use may be associated with a decreased risk of prostate cancer in gout patients.
Reproductive Factors and Incidence of Heart Failure Hospitalization in the Women’s Health Initiative
Hall, Philip S.; Nah, Gregory; Howard, Barbara V.; Lewis, Cora E.; Allison, Matthew A.; Sarto, Gloria E.; Waring, Molly E.; Jacobson, Lisette T.; Manson, JoAnn E.; Klein, Liviu; Parikh, Nisha I.
2017-01-01
BACKGROUND Reproductive factors reflective of endogenous sex hormone exposure might have an effect on cardiac remodeling and the development of heart failure (HF). OBJECTIVES This study examined the association between key reproductive factors and the incidence of HF. METHODS Women from a cohort of the Women’s Health Initiative were systematically evaluated for the incidence of HF hospitalization from study enrollment through 2014. Reproductive factors (number of live births, age at first pregnancy, and total reproductive duration [time from menarche to menopause]) were self-reported at study baseline in 1993 to 1998. We employed Cox proportional hazards regression analysis in age- and multivariable-adjusted models. RESULTS Among 28,516 women, with an average age of 62.7 ± 7.1 years at baseline, 1,494 (5.2%) had an adjudicated incident HF hospitalization during an average follow-up of 13.1 years. After adjusting for covariates, total reproductive duration in years was inversely associated with incident HF: hazard ratios (HRs) of 0.99 per year (95% confidence interval [CI]: 0.98 to 0.99 per year) and 0.95 per 5 years (95% CI: 0.91 to 0.99 per 5 years). Conversely, early age at first pregnancy and nulliparity were significantly associated with incident HF in age-adjusted models, but not after multivariable adjustment. Notably, nulliparity was associated with incident HF with preserved ejection fraction in the fully adjusted model (HR: 2.75; 95% CI: 1.16 to 6.52). CONCLUSIONS In postmenopausal women, shorter total reproductive duration was associated with higher risk of incident HF, and nulliparity was associated with higher risk for incident HF with preserved ejection fraction. Whether exposure to endogenous sex hormones underlies this relationship should be investigated in future studies. PMID:28521890
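A brief note on the two hazard ratios quoted above: in a Cox model, rescaling a continuous covariate simply scales its coefficient, so a per-5-year hazard ratio equals the per-year hazard ratio raised to the fifth power. The snippet below only checks that arithmetic against the reported values; it is not taken from the study.

```python
import math

hr_per_year = 0.99                    # reported HR per year of reproductive duration
beta = math.log(hr_per_year)          # implied Cox coefficient on the per-year scale
hr_per_5_years = math.exp(5 * beta)   # equivalent to hr_per_year ** 5
print(round(hr_per_5_years, 2))       # 0.95, matching the reported per-5-year estimate
```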
Incompletely treated malignancies of the major salivary gland: Toward evidence-based care.
Tam, Samantha; Sandulache, Vlad C; Metwalli, Kareem A; Rock, Crosby D; Eraj, Salman A; Sheu, Tommy; El-Naggar, Adel K; Fuller, Clifton D; Weber, Randal S; Lai, Stephen Y
2018-05-07
Unexpected malignancy is common in major salivary gland tumors because of variability in workup, creating challenging treatment decisions. The purpose of this study was to define treatment-related outcomes for patients with incompletely treated major salivary gland tumors. A retrospective cohort study was completed of patients with incompletely treated major salivary gland tumors. Tumor burden at presentation was established and treatment categorized. A Cox proportional hazards model was used to determine predictors of survival and failure. Of the 440 included patients, those with gross residual or metastatic disease had worse overall survival (OS; P < .001). Presentation status was an independent predictor of OS on multivariate analysis (gross residual disease: adjusted hazard ratio [aHR] 2.55, 95% confidence interval [CI] 1.20-5.30; metastatic disease: aHR 9.53, 95% CI 3.04-27.06). Failure to achieve gross total resection during initial surgery resulted in worse OS. Adequate preoperative planning is required for initial surgical management to optimize tumor control and survival. © 2018 Wiley Periodicals, Inc.
Pfaendler, Krista S; Chang, Jenny; Ziogas, Argyrios; Bristow, Robert E; Penner, Kristine R
2018-05-01
To evaluate the association of sociodemographic and hospital characteristics with adherence to National Comprehensive Cancer Network treatment guidelines for stage IB-IIA cervical cancer and to analyze the relationship between adherent care and survival. This is a retrospective population-based cohort study of patients with stage IB-IIA invasive cervical cancer reported to the California Cancer Registry from January 1, 1995, through December 31, 2009. Adherence to National Comprehensive Cancer Network guideline care was defined by year- and stage-appropriate surgical procedures, radiation, and chemotherapy. Multivariate logistic regression, Kaplan-Meier estimate, and Cox proportional hazard models were used to examine associations between patient, tumor, and treatment characteristics and National Comprehensive Cancer Network guideline adherence and cervical cancer-specific 5-year survival. A total of 6,063 patients were identified. Forty-seven percent received National Comprehensive Cancer Network guideline-adherent care, and 18.8% were treated in high-volume centers (20 or more patients/year). On multivariate analysis, lowest socioeconomic status (adjusted odds ratio [OR] 0.69, 95% CI 0.57-0.84), low-middle socioeconomic status (adjusted OR 0.76, 95% CI 0.64-0.92), and Charlson-Deyo comorbidity score 1 or higher (adjusted OR 0.78, 95% CI 0.69-0.89) were patient characteristics associated with receipt of nonguideline care. Receiving adherent care was less common in low-volume centers (45.9%) than in high-volume centers (50.9%) (effect size 0.90, 95% CI 0.84-0.96). Death from cervical cancer was more common in the nonadherent group (13.3%) than in the adherent group (8.6%) (effect size 1.55, 95% CI 1.34-1.80). Black race (adjusted hazard ratio 1.56, 95% CI 1.08-2.27), Medicaid payer status (adjusted hazard ratio 1.47, 95% CI 1.15-1.87), and Charlson-Deyo comorbidity score 1 or higher (adjusted hazard ratio 2.07, 95% CI 1.68-2.56) were all associated with increased risk of dying from cervical cancer. Among patients with early-stage cervical cancer, National Comprehensive Cancer Network guideline-nonadherent care was independently associated with increased cervical cancer-specific mortality along with black race and Medicaid payer status. Nonadherence was more prevalent in patients with older age, lower socioeconomic status, and receipt of care in low-volume centers. Attention should be paid to increase guideline adherence.
Association of Modality with Mortality among Canadian Aboriginals
Hemmelgarn, Brenda; Rigatto, Claudio; Komenda, Paul; Yeates, Karen; Promislow, Steven; Mojica, Julie; Tangri, Navdeep
2012-01-01
Summary Background and objectives Previous studies have shown that Aboriginals and Caucasians experience similar outcomes on dialysis in Canada. Using the Canadian Organ Replacement Registry, this study examined whether dialysis modality (peritoneal or hemodialysis) impacted mortality in Aboriginal patients. Design, setting, participants, & measurements This study identified 31,576 adult patients (hemodialysis: Aboriginal=1839, Caucasian=21,430; peritoneal dialysis: Aboriginal=554, Caucasian=6769) who initiated dialysis between January of 2000 and December of 2009. Aboriginal status was identified by self-report. Dialysis modality was determined 90 days after dialysis initiation. Multivariate Cox proportional hazards and competing risk models were constructed to determine the association between race and mortality by dialysis modality. Results During the study period, 939 (51.1%) Aboriginals and 12,798 (53.3%) Caucasians initiating hemodialysis died, whereas 166 (30.0%) and 2037 (30.1%), respectively, initiating peritoneal dialysis died. Compared with Caucasians, Aboriginals on hemodialysis had a comparable risk of mortality (adjusted hazard ratio=1.04, 95% confidence interval=0.96–1.11, P=0.37). However, on peritoneal dialysis, Aboriginals experienced a higher risk of mortality (adjusted hazard ratio=1.36, 95% confidence interval=1.13–1.62, P=0.001) and technique failure (adjusted hazard ratio=1.29, 95% confidence interval=1.03–1.60, P=0.03) than Caucasians. The risk of technique failure varied by patient age, with younger Aboriginals (<50 years old) more likely to develop technique failure than Caucasians (adjusted hazard ratio=1.76, 95% confidence interval=1.23–2.52, P=0.002). Conclusions Aboriginals on peritoneal dialysis experience higher mortality and technique failure relative to Caucasians. Reasons for this race disparity in peritoneal dialysis outcomes are unclear. PMID:22997343
Conen, David; Arendacká, Barbora; Röver, Christian; Bergau, Leonard; Munoz, Pascal; Wijers, Sofieke; Sticherling, Christian; Zabel, Markus; Friede, Tim
2016-01-01
Some but not all prior studies have shown that women receiving a primary prophylactic implantable cardioverter defibrillator (ICD) have a lower risk of death and appropriate shocks than men. To evaluate the effect of gender on the risk of appropriate shock, all-cause mortality and inappropriate shock in contemporary studies of patients receiving a primary prophylactic ICD. PubMed, LIVIVO, Cochrane CENTRAL between 2010 and 2016. Studies providing at least 1 gender-specific risk estimate for the outcomes of interest. Abstracts were screened independently for potentially eligible studies for inclusion. Thereby each abstract was reviewed by at least two authors. Out of 680 abstracts retained by our search strategy, 20 studies including 46'657 patients had gender-specific information on at least one of the relevant endpoints. Mean age across the individual studies varied between 58 and 69 years. The proportion of women enrolled ranged from 10% to 30%. Across 6 available studies, women had a significantly lower risk of first appropriate shock compared with men (pooled multivariable adjusted hazard ratio 0.62 (95% CI [0.44; 0.88]). Across 14 studies reporting multivariable adjusted gender-specific hazard ratio estimates for all-cause mortality, women had a lower risk of death than men (pooled hazard ratio 0.75 (95% CI [0.66; 0.86]). There was no statistically significant difference for the incidence of first inappropriate shocks (3 studies, pooled hazard ratio 0.99 (95% CI [0.56; 1.73]). Individual patient data were not available for most studies. In this large contemporary meta-analysis, women had a significantly lower risk of appropriate shocks and death than men, but a similar risk of inappropriate shocks. These data may help to select patients who benefit from primary prophylactic ICD implantation.
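To make the pooling step concrete, a minimal fixed-effect sketch of inverse-variance pooling of study-level adjusted hazard ratios is shown below. The meta-analysis above reports pooled multivariable-adjusted estimates, but whether a fixed- or random-effects model was used is not stated here, and the per-study numbers in the example are hypothetical.

```python
import numpy as np

def pool_hazard_ratios(hrs, lowers, uppers, z=1.96):
    """Fixed-effect inverse-variance pooling of log hazard ratios.
    hrs, lowers, uppers: per-study HR point estimates and 95% CI limits."""
    log_hr = np.log(hrs)
    se = (np.log(uppers) - np.log(lowers)) / (2 * z)   # SE recovered from the CI width
    w = 1.0 / se**2                                    # inverse-variance weights
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - z * pooled_se, pooled + z * pooled_se])
    return np.exp(pooled), tuple(ci)

# hypothetical per-study adjusted HRs for women vs men
hr, ci = pool_hazard_ratios([0.55, 0.70, 0.62], [0.35, 0.50, 0.40], [0.86, 0.98, 0.96])
print(round(hr, 2), [round(x, 2) for x in ci])
```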
Budhathoki, Sanjeev; Hidaka, Akihisa; Sawada, Norie; Tanaka-Mizuno, Sachiko; Kuchiba, Aya; Charvat, Hadrien; Goto, Atsushi; Kojima, Satoshi; Sudo, Natsuki; Shimazu, Taichi; Sasazuki, Shizuka; Inoue, Manami; Tsugane, Shoichiro; Iwasaki, Motoki
2018-01-01
Abstract Objective To evaluate the association between pre-diagnostic circulating vitamin D concentration and the subsequent risk of overall and site specific cancer in a large cohort study. Design Nested case-cohort study within the Japan Public Health Center-based Prospective Study cohort. Setting Nine public health centre areas across Japan. Participants 3301 incident cases of cancer and 4044 randomly selected subcohort participants. Exposure Plasma concentration of 25-hydroxyvitamin D measured by enzyme immunoassay. Participants were divided into quarters based on the sex and season specific distribution of 25-hydroxyvitamin D among subcohorts. Weighted Cox proportional hazard models were used to calculate the multivariable adjusted hazard ratios for overall and site specific cancer across categories of 25-hydroxyvitamin D concentration, with the lowest quarter as the reference. Main outcome measure Incidence of overall or site specific cancer. Results Plasma 25-hydroxyvitamin D concentration was inversely associated with the risk of total cancer, with multivariable adjusted hazard ratios for the second to fourth quarters compared with the lowest quarter of 0.81 (95% confidence interval 0.70 to 0.94), 0.75 (0.65 to 0.87), and 0.78 (0.67 to 0.91), respectively (P for trend=0.001). Among the findings for cancers at specific sites, an inverse association was found for liver cancer, with corresponding hazard ratios of 0.70 (0.44 to 1.13), 0.65 (0.40 to 1.06), and 0.45 (0.26 to 0.79) (P for trend=0.006). A sensitivity analysis showed that alternately removing cases of cancer at one specific site from total cancer cases did not substantially change the overall hazard ratios. Conclusions In this large prospective study, higher vitamin D concentration was associated with lower risk of total cancer. These findings support the hypothesis that vitamin D has protective effects against cancers at many sites. PMID:29514781
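The quartile-based modelling described above can be sketched roughly as follows, assuming a pandas DataFrame with hypothetical column names (25-hydroxyvitamin D concentration, sex, season, follow-up time, cancer indicator, and covariates). For simplicity the sketch uses an unweighted Cox fit from the lifelines package; the study itself used weighted Cox models appropriate to the case-cohort design, which would additionally require sampling weights (for example passed via CoxPHFitter's weights_col argument).

```python
import pandas as pd
from lifelines import CoxPHFitter

def fit_quartile_cox(df):
    """Sex- and season-specific quarters of 25(OH)D, then a Cox model with
    the lowest quarter as the reference (simplified, unweighted sketch)."""
    df = df.copy()
    df["vitd_q"] = (df.groupby(["sex", "season"])["vitd25oh"]
                      .transform(lambda x: pd.qcut(x, 4, labels=["q1", "q2", "q3", "q4"])))
    design = pd.get_dummies(df, columns=["vitd_q"], drop_first=True, dtype=float)  # q1 = reference
    cols = ["time", "cancer", "age", "bmi", "smoker",
            "vitd_q_q2", "vitd_q_q3", "vitd_q_q4"]
    cph = CoxPHFitter()
    cph.fit(design[cols], duration_col="time", event_col="cancer")
    return cph.hazard_ratios_   # adjusted HRs for q2-q4 versus q1

# usage sketch: print(fit_quartile_cox(cohort_df))
```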
Swords, Douglas S; Mulvihill, Sean J; Skarda, David E; Finlayson, Samuel R G; Stoddard, Gregory J; Ott, Mark J; Firpo, Matthew A; Scaife, Courtney L
2017-07-11
To (1) evaluate rates of surgery for clinical stage I-II pancreatic ductal adenocarcinoma (PDAC), (2) identify predictors of not undergoing surgery, (3) quantify the degree to which patient- and hospital-level factors explain differences in hospital surgery rates, and (4) evaluate the association between adjusted hospital-specific surgery rates and overall survival (OS) of patients treated at different hospitals. Curative-intent surgery for potentially resectable PDAC is underutilized in the United States. Retrospective cohort study of patients ≤85 years with clinical stage I-II PDAC in the 2004 to 2014 National Cancer Database. Mixed effects multivariable models were used to characterize hospital-level variation across quintiles of hospital surgery rates. Multivariable Cox proportional hazards models were used to estimate the effect of adjusted hospital surgery rates on OS. Of 58,553 patients without contraindications or refusal of surgery, 63.8% underwent surgery, and the rate decreased from 2299/3528 (65.2%) in 2004 to 4412/7092 (62.2%) in 2014 (P < 0.001). Adjusted hospital rates of surgery varied 6-fold (11.4%-70.9%). Patients treated at hospitals with higher rates of surgery had better unadjusted OS (median OS 10.2, 13.3, 14.2, 16.5, and 18.4 months in quintiles 1-5, respectively, P < 0.001, log-rank). Treatment at hospitals in lower surgery rate quintiles 1-3 was independently associated with mortality [Hazard ratio (HR) 1.10 (1.01, 1.21), HR 1.08 (1.02, 1.15), and HR 1.09 (1.04, 1.14) for quintiles 1-3, respectively, compared with quintile 5] after adjusting for patient factors, hospital type, and hospital volume. Quality improvement efforts are needed to help hospitals with low rates of surgery ensure that their patients have access to appropriate surgery.
Dehesh, Tania; Zare, Najaf; Ayatollahi, Seyyed Mohammad Taghi
2015-01-01
The univariate meta-analysis (UM) procedure, which provides a single overall result for one coefficient at a time, has become increasingly popular, but neglecting the other concomitant covariates in the models leads to a loss of efficiency in estimating the treatment effect. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least squares (MGLS) method as a multivariate meta-analysis approach. We evaluated the efficiency of four new approaches, namely zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC), with respect to estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazard model coefficients in a simulation study. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that the MMC approach was the most accurate procedure compared with the EC, CC, and ZC procedures. The precision ranking of the four approaches across all of the above settings was MMC ≥ EC ≥ CC ≥ ZC. This study highlights the advantages of MGLS meta-analysis over the UM approach. The results suggest using the MMC procedure to overcome the lack of information needed for a complete covariance matrix of the coefficients.
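To illustrate the kind of multivariate pooling the abstract refers to, the sketch below combines per-study Cox coefficient vectors by multivariate generalized least squares under the simplest of the four approaches named above (zero between-coefficient correlation, ZC), i.e. using only the coefficient variances. It is an illustration under that assumption, not the authors' implementation, and the input numbers are hypothetical.

```python
import numpy as np

def mgls_pool(betas, ses):
    """Multivariate GLS pooling of Cox coefficient vectors across studies
    under the zero-correlation (ZC) approximation: each study's covariance
    matrix is taken as diagonal, built from the reported standard errors.

    betas : (k_studies, p) array of coefficient vectors
    ses   : (k_studies, p) array of their standard errors
    Returns the pooled coefficient vector and its covariance matrix."""
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    p = betas.shape[1]
    inv_sum = np.zeros((p, p))
    weighted = np.zeros(p)
    for b, s in zip(betas, ses):
        s_inv = np.diag(1.0 / s**2)      # S_i^{-1} under the ZC approximation
        inv_sum += s_inv
        weighted += s_inv @ b
    cov = np.linalg.inv(inv_sum)         # covariance of the pooled estimate
    return cov @ weighted, cov

# hypothetical example: two studies, two covariates (e.g. treatment and age)
beta_hat, cov_hat = mgls_pool([[0.40, 0.02], [0.55, 0.03]],
                              [[0.10, 0.01], [0.15, 0.02]])
print(beta_hat, np.sqrt(np.diag(cov_hat)))
```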
Lobato, Robert L; White, William D; Mathew, Joseph P; Newman, Mark F; Smith, Peter K; McCants, Charles B; Alexander, John H; Podgoreanu, Mihai V
2011-09-13
We tested the hypothesis that genetic variation in thrombotic and inflammatory pathways is independently associated with long-term mortality after coronary artery bypass graft (CABG) surgery. Two separate cohorts of patients undergoing CABG surgery at a single institution were examined, and all-cause mortality between 30 days and 5 years after the index CABG was ascertained from the National Death Index. In a discovery cohort of 1018 patients, a panel of 90 single-nucleotide polymorphisms (SNPs) in 49 candidate genes was tested with Cox proportional hazard models to identify clinical and genomic multivariate predictors of incident death. After adjustment for multiple comparisons and clinical predictors of mortality, the homozygote minor allele of a common variant in the thrombomodulin (THBD) gene (rs1042579) was independently associated with significantly increased risk of all-cause mortality (hazard ratio, 2.26; 95% CI, 1.31 to 3.92; P=0.003). Six tag SNPs in the THBD gene, 1 of which (rs3176123) is in complete linkage disequilibrium with rs1042579, were then assessed in an independent validation cohort of 930 patients. After multivariate adjustment for the clinical predictors identified in the discovery cohort and multiple testing, the homozygote minor allele of rs3176123 independently predicted all-cause mortality (hazard ratio, 3.6; 95% CI, 1.67 to 7.78; P=0.001). In 2 independent cardiac surgery cohorts, linked common allelic variants in the THBD gene are independently associated with increased long-term mortality risk after CABG and significantly improve the classification ability of traditional postoperative mortality prediction models.
A prospective study of caffeine intake and risk of incident tinnitus.
Glicksman, Jordan T; Curhan, Sharon G; Curhan, Gary C
2014-08-01
Caffeine is a commonly consumed substance that has been thought to play a role in the development of tinnitus, but prospective data are lacking. We prospectively evaluated the association between caffeine intake and self-reported tinnitus in a female cohort. Participants were 65,085 women in the Nurses' Health Study II, aged 30 to 44 years and without tinnitus at baseline in 1991, who completed questionnaires about lifestyle and medical history every 2 years and food frequency questionnaires every 4 years. Information on self-reported tinnitus and date of onset was obtained from the 2009 questionnaire, with cases defined as those reporting experiencing symptoms "a few days/week" or "daily." Multivariable adjusted hazard ratios were calculated using Cox proportional hazards regression models. At baseline, the mean age of the cohort was 36.3 years and the mean caffeine intake was 242.3 mg/d. After 18 years of follow-up, 5289 incident cases of tinnitus were reported. There was a significant inverse association between caffeine intake and the incidence of tinnitus. Compared with women with caffeine intake less than 150 mg/d (150 mg corresponds to ∼ one 8-ounce cup of coffee), the multivariable adjusted hazard ratios were 0.85 (95% confidence interval, 0.76-0.95) for those who consumed 450 to 599 mg/d and 0.79 (0.68-0.91) for those who consumed 600 mg/d or more. In this prospective study, higher caffeine intake was associated with a lower risk of incident tinnitus in women. Copyright © 2014 Elsevier Inc. All rights reserved.
Hart, Jaime E; Bertrand, Kimberly A; DuPre, Natalie; James, Peter; Vieira, Verónica M; VoPham, Trang; Mittleman, Maggie R; Tamimi, Rulla M; Laden, Francine
2018-03-27
Findings from a recent prospective cohort study in California suggested increased risk of breast cancer associated with higher exposure to certain carcinogenic and estrogen-disrupting hazardous air pollutants (HAPs). However, to date, no nationwide studies have evaluated these possible associations. Our objective was to examine the impacts of mammary carcinogen and estrogen-disrupting HAPs on risk of invasive breast cancer in a nationwide cohort. We assigned HAPs from the US Environmental Protection Agency's 2002 National Air Toxics Assessment to 109,239 members of the nationwide, prospective Nurses' Health Study II (NHSII). Risks of overall invasive, estrogen receptor (ER)-positive (ER+), and ER-negative (ER-) breast cancer with increasing quartiles of exposure were assessed in time-varying multivariable proportional hazards models, adjusted for traditional breast cancer risk factors. A total of 3321 invasive cases occurred (2160 ER+, 558 ER-) during follow-up (1989-2011). Overall, there was no consistent pattern of elevated breast cancer risk across the HAPs. Suggestive elevations were seen only with increasing 1,2-dibromo-3-chloropropane exposures (multivariable adjusted HR of overall breast cancer = 1.12, 95% CI: 0.98-1.29; ER+ breast cancer HR = 1.09; 95% CI: 0.92, 1.30; ER- breast cancer HR = 1.14; 95% CI: 0.81, 1.61; each in the top exposure quartile compared to the lowest). Exposures to HAPs during adulthood were not consistently associated with an increased risk of overall or estrogen receptor subtypes of invasive breast cancer in this nationwide cohort of women.
Hilscher, Moira; Enders, Felicity B; Carey, Elizabeth J; Lindor, Keith D; Tabibian, James H
2016-01-01
Introduction. Recent studies suggest that serum alkaline phosphatase may represent a prognostic biomarker in patients with primary sclerosing cholangitis. However, this association remains poorly understood. Therefore, the aim of this study was to investigate the prognostic significance and clinical correlates of alkaline phosphatase normalization in primary sclerosing cholangitis. This was a retrospective cohort study of patients with a new diagnosis of primary sclerosing cholangitis made at an academic medical center. The primary endpoint was time to hepatobiliaryneoplasia, liver transplantation, or liver-related death. Secondary endpoints included occurrence of and time to alkaline phosphatase normalization. Patients who did and did not achieve normalization were compared with respect to clinical characteristics and endpoint-free survival, and the association between normalization and the primary endpoint was assessed with univariate and multivariate Cox proportional-hazards analyses. Eighty six patients were included in the study, with a total of 755 patient-years of follow-up. Thirty-eight patients (44%) experienced alkaline phosphatase normalization within 12 months of diagnosis. Alkaline phosphatase normalization was associated with longer primary endpoint-free survival (p = 0.0032) and decreased risk of requiring liver transplantation (p = 0.033). Persistent normalization was associated with even fewer adverse endpoints as well as longer survival. In multivariate analyses, alkaline phosphatase normalization (adjusted hazard ratio 0.21, p = 0.012) and baseline bilirubin (adjusted hazard ratio 4.87, p = 0.029) were the only significant predictors of primary endpoint-free survival. Alkaline phosphatase normalization, particularly if persistent, represents a robust biomarker of improved long-term survival and decreased risk of requiring liver transplantation in patients with primary sclerosing cholangitis.
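The group comparison reported above (endpoint-free survival in normalizers versus non-normalizers) is the kind of analysis usually done with Kaplan-Meier curves and a log-rank test. A hedged sketch with lifelines, using hypothetical column names, is shown below.

    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    df = pd.read_csv("psc_cohort.csv")    # hypothetical: years, endpoint (0/1), alp_normalized (0/1)
    norm = df["alp_normalized"] == 1

    kmf = KaplanMeierFitter()
    ax = kmf.fit(df.loc[norm, "years"], df.loc[norm, "endpoint"], label="ALP normalized").plot_survival_function()
    kmf.fit(df.loc[~norm, "years"], df.loc[~norm, "endpoint"], label="ALP not normalized").plot_survival_function(ax=ax)

    res = logrank_test(df.loc[norm, "years"], df.loc[~norm, "years"],
                       event_observed_A=df.loc[norm, "endpoint"],
                       event_observed_B=df.loc[~norm, "endpoint"])
    print(res.p_value)   # unadjusted comparison of endpoint-free survival between groups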
Lucas, John T., E-mail: jolucas@wakehealth.edu; Colmer, Henry G.; White, Lance
Purpose: To estimate the hazard for neurologic (central nervous system, CNS) and nonneurologic (non-CNS) death associated with patient, treatment, and systemic disease status in patients receiving stereotactic radiosurgery after whole-brain radiation therapy (WBRT) failure, using a competing risk model. Patients and Methods: Of 757 patients, 293 experienced recurrence or new metastasis following WBRT. Univariate Cox proportional hazards regression identified covariates for consideration in the multivariate model. Competing risks multivariable regression was performed to estimate the adjusted hazard ratio (aHR) and 95% confidence interval (CI) for both CNS and non-CNS death after adjusting for patient, disease, and treatment factors. The resultant model was converted into an online calculator for ease of clinical use. Results: The cumulative incidence of CNS and non-CNS death at 6 and 12 months was 20.6% and 21.6%, and 34.4% and 35%, respectively. Patients with melanoma histology (relative to breast) (aHR 2.7, 95% CI 1.5-5.0), brainstem location (aHR 2.1, 95% CI 1.3-3.5), and number of metastases (aHR 1.09, 95% CI 1.04-1.2) had increased aHR for CNS death. Progressive systemic disease (aHR 0.55, 95% CI 0.4-0.8) and increasing lowest margin dose (aHR 0.97, 95% CI 0.9-0.99) were protective against CNS death. Patients with lung histology (aHR 1.3, 95% CI 1.1-1.9) and progressive systemic disease (aHR 2.14, 95% CI 1.5-3.0) had increased aHR for non-CNS death. Conclusion: Our nomogram provides individual estimates of neurologic death after salvage stereotactic radiosurgery for patients who have failed prior WBRT, based on histology, neuroanatomical location, age, lowest margin dose, and number of metastases after adjusting for their competing risk of death from other causes.
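The study above treats CNS and non-CNS death as competing events. As a limited illustration only, the nonparametric cumulative-incidence step of such an analysis can be sketched with the Aalen-Johansen estimator in lifelines; the covariate-adjusted subdistribution-hazards regression used in the paper would need a dedicated implementation (for example R's cmprsk package), and the event coding here is hypothetical.

    import pandas as pd
    from lifelines import AalenJohansenFitter

    df = pd.read_csv("srs_salvage.csv")   # hypothetical: months, event (0=censored, 1=CNS death, 2=non-CNS death)

    ajf = AalenJohansenFitter()
    ajf.fit(df["months"], df["event"], event_of_interest=1)
    print(ajf.cumulative_density_.tail())   # cumulative incidence of CNS death, accounting for the competing event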
Hsich, Eileen M; Blackstone, Eugene H; Thuita, Lucy; McNamara, Dennis M; Rogers, Joseph G; Ishwaran, Hemant; Schold, Jesse D
2017-06-01
There are sex differences in mortality while awaiting heart transplantation, and the reason remains unclear. We included all adults in the Scientific Registry of Transplant Recipients placed on the heart transplant active waitlist from 2004 to 2015. The primary end point was all-cause mortality. Multivariable Cox proportional hazards models were performed to evaluate survival by United Network for Organ Sharing (UNOS) status at the time of listing. Random survival forest was used to identify sex interactions for the competing risk of death and transplantation. There were 33 069 patients (25% women) awaiting heart transplantation. This cohort included 7681 UNOS status 1A (26% women), 13 027 UNOS status 1B (25% women), and 12 361 UNOS status 2 (26% women). During a median follow-up of 4.3 months, 1351 women and 4052 men died. After adjusting for >20 risk factors, female sex was associated with a significant risk of death among UNOS status 1A (adjusted hazard ratio, 1.14; 95% confidence interval, 1.01-1.29) and UNOS status 1B (adjusted hazard ratio, 1.17; 95% confidence interval, 1.05-1.30). In contrast, female sex was significantly protective for time to death among UNOS status 2 (adjusted hazard ratio, 0.85; 95% confidence interval, 0.76-0.95). Sex differences in probability of transplantation were present for every UNOS status, and >20 sex interactions were identified for mortality and transplantation. When stratified by initial UNOS status, women had a higher mortality than men as UNOS status 1 and a lower mortality as UNOS status 2. With >20 sex interactions for mortality and transplantation, further evaluation is warranted to form a more equitable allocation system. © 2017 American Heart Association, Inc.
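The random survival forest mentioned above is a machine-learning analogue of the Cox model that can surface interactions such as sex-by-covariate effects. A hedged sketch with scikit-survival, using hypothetical predictors, is given here; it is not the registry analysis itself.

    import pandas as pd
    from sksurv.ensemble import RandomSurvivalForest
    from sksurv.util import Surv

    df = pd.read_csv("waitlist.csv")   # hypothetical: months, died, plus numeric predictors
    X = df[["female", "age", "creatinine", "bilirubin", "inotrope_support"]]
    y = Surv.from_arrays(event=df["died"].astype(bool), time=df["months"])

    rsf = RandomSurvivalForest(n_estimators=500, min_samples_leaf=15, random_state=0)
    rsf.fit(X, y)
    print(rsf.score(X, y))   # Harrell's concordance index (here evaluated on the training data)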
Pancreatic β-Cell Function and Prognosis of Nondiabetic Patients With Ischemic Stroke.
Pan, Yuesong; Chen, Weiqi; Jing, Jing; Zheng, Huaguang; Jia, Qian; Li, Hao; Zhao, Xingquan; Liu, Liping; Wang, Yongjun; He, Yan; Wang, Yilong
2017-11-01
Pancreatic β-cell dysfunction is an important factor in the development of type 2 diabetes mellitus. This study aimed to estimate the association between β-cell dysfunction and prognosis of nondiabetic patients with ischemic stroke. Patients with ischemic stroke without a history of diabetes mellitus in the ACROSS-China (Abnormal Glucose Regulation in Patients with Acute Stroke across China) registry were included. Disposition index was estimated as computer-based model of homeostatic model assessment 2-β%/homeostatic model assessment 2-insulin resistance based on fasting C-peptide level. Outcomes included stroke recurrence, all-cause death, and dependency (modified Rankin Scale, 3-5) at 12 months after onset. Among 1171 patients, 37.2% were women with a mean age of 62.4 years. At 12 months, 167 (14.8%) patients had recurrent stroke, 110 (9.4%) died, and 184 (16.0%) had a dependency. The first quartile of the disposition index was associated with an increased risk of stroke recurrence (adjusted hazard ratio, 3.57; 95% confidence interval, 2.13-5.99) and dependency (adjusted hazard ratio, 2.30; 95% confidence interval, 1.21-4.38); both the first and second quartiles of the disposition index were associated with an increased risk of death (adjusted hazard ratio, 5.09; 95% confidence interval, 2.51-10.33; adjusted hazard ratio, 2.42; 95% confidence interval, 1.17-5.03) compared with the fourth quartile. Using a multivariable regression model with restricted cubic spline, we observed an L-shaped association between the disposition index and the risk of each end point. In this large-scale registry, β-cell dysfunction was associated with an increased risk of 12-month poor prognosis in nondiabetic patients with ischemic stroke. © 2017 American Heart Association, Inc.
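The L-shaped dose-response reported above comes from a Cox model with a restricted cubic spline term for the disposition index. A minimal sketch, assuming patsy for the natural cubic spline basis and lifelines for the Cox fit (all variable names hypothetical), follows.

    import pandas as pd
    import patsy
    from lifelines import CoxPHFitter

    df = pd.read_csv("stroke_registry.csv")   # hypothetical: months, recurrence, disposition_index, covariates

    # Natural cubic spline basis for the disposition index (intercept column dropped)
    spline = patsy.dmatrix("cr(disposition_index, df=4) - 1", data=df, return_type="dataframe")
    model_df = pd.concat([df[["months", "recurrence", "age", "nihss"]], spline], axis=1)

    cph = CoxPHFitter()
    cph.fit(model_df, duration_col="months", event_col="recurrence")
    cph.print_summary()   # spline coefficients describe the shape of the dose-response curve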
Ovarian Conservation and Overall Survival in Young Women With Early-Stage Cervical Cancer.
Matsuo, Koji; Machida, Hiroko; Shoupe, Donna; Melamed, Alexander; Muderspach, Laila I; Roman, Lynda D; Wright, Jason D
2017-01-01
To identify predictors of ovarian conservation at hysterectomy and to examine the association of ovarian conservation and survival of young women with early-stage cervical cancer. This is a retrospective cohort study using the Surveillance, Epidemiology, and End Results Program to identify hysterectomy-based surgically treated patients with stage I cervical cancer diagnosed between 1983 and 2012 (N=16,511). Multivariable models were used to identify independent factors associated with ovarian conservation. Among the subgroup of 9,419 women younger than 50 years of age with stage I disease, survival outcomes and causes of death were examined for 3,908 (41.5%) women who underwent ovarian conservation at hysterectomy without radiotherapy. On multivariable analysis, age younger than 50 years, stage IA disease, and squamous histology were independent factors associated with ovarian conservation (all, P<.001). Among 5,526 women younger than 50 years of age with stage IA disease who underwent hysterectomy without radiotherapy, overall survival was significantly higher in patients undergoing ovarian conservation than in those undergoing oophorectomy (20-year rate, 93.5% compared with 86.8%, P<.001); cervical cancer-specific survival was similar between the patients who underwent ovarian conservation and those who underwent oophorectomy (98.8% compared with 97.8%, P=.12). On multivariable analysis, ovarian conservation remained an independent prognostic factor for improved overall survival (adjusted hazard ratio 0.63, 95% confidence interval [CI] 0.49-0.82, P=.001) and was independently associated with lower cumulative risks of death resulting from cardiovascular disease (20-year cumulative rate, 1.2% compared with 3.3%, adjusted hazard ratio 0.47, 95% CI 0.26-0.86, P=.014) and other chronic disease (0.5% compared with 1.4%, adjusted hazard ratio 0.24, 95% CI 0.09-0.65, P=.005) compared with oophorectomy. Both cervical cancer-specific survival (20-year rate, 93.1% compared with 92.0%, P=.37) and overall survival (86.7% compared with 84.6%, P=.12) were similar between ovarian conservation and oophorectomy among 3,893 women younger than 50 years of age with stage IB disease who underwent hysterectomy without radiotherapy. Among young women with stage IA cervical cancer, ovarian conservation at hysterectomy is associated with decreased all-cause mortality including death resulting from cardiovascular disease and other chronic diseases.
Tumin, Dmitry; McConnell, Patrick I; Galantowicz, Mark; Tobias, Joseph D; Hayes, Don
2017-02-01
Young adult heart transplantation (HTx) recipients experience high mortality risk attributed to increased nonadherence to immunosuppressive medication in this age window. This study sought to test whether a high-risk age window in HTx recipients persisted in the absence of reported nonadherence. Heart transplantation recipients aged 2 to 40 years, transplanted between October 1999 and January 2007, were identified in the United Network for Organ Sharing database. Multivariable survival analysis was used to estimate influences of age at transplantation and attained posttransplant age on mortality hazard among patients stratified by center report of nonadherence to immunosuppression that compromised recovery. Three thousand eighty-one HTx recipients were included, with univariate analysis demonstrating peak hazards of mortality and reported nonadherence among 567 patients transplanted between ages 17 and 24 years. Multivariable analysis adjusting for reported nonadherence demonstrated lower mortality among patients transplanted at younger (hazards ratio, 0.813; 95% confidence interval, 0.663-0.997; P = 0.047) or older (hazards ratio, 0.835; 95% confidence interval, 0.701-0.994; P = 0.042) ages. Peak mortality hazard at ages 17 to 24 years was confirmed in the subgroup of patients with no nonadherence reported during follow-up. This result was replicated using attained age after HTx as the time metric, with younger and older ages predicting improved survival in the absence of reported nonadherence. Late adolescence and young adulthood coincide with greater mortality hazard and greater chances of nonadherence to immunosuppressive medication after HTx, but the elevation of mortality hazard in this age range persists in the absence of reported nonadherence. Other causes of the high-risk age window for post-HTx mortality should be demonstrated to identify opportunities for intervention.
Determinants of hazardous drinking among black South African men who have sex with men.
Knox, Justin; Reddy, Vasu; Lane, Tim; Lovasi, Gina; Hasin, Deborah; Sandfort, Theo
2017-11-01
There is a known heavy burden of hazardous drinking and its associated health risks among black South African MSM; however, no study to date has identified risk factors for hazardous drinking among this nor any other African MSM population. A cross-sectional survey was conducted among 480 black South African MSM recruited using respondent-driven sampling. All analyses were adjusted using an RDS II estimator. Multivariable logistic regression was used to assess the relationship between demographic characteristics, psychosocial factors, behavioral attributes and hazardous drinking. More than half of the men (62%, 95%CI=56%-68%) screened positive as hazardous drinkers. In multivariable analyses, living in a township (versus the city of Pretoria) (aOR=1.9, 95%CI=1.2-3.1, p<.01), more gender dysphoria (aOR=1.4, 95%CI=1.0-1.8, p=.03), having ever received money or other incentives in return for sex (aOR=2.4, 95%CI=1.3-4.3, p<.01), having been sexually abused as a child (aOR=2.6, 95%CI=1.1-6.4, p=.03), having anxiety (aOR=5.4, 95%CI=1.2-24.3, p=.03), and social network drinking behavior (aOR=5.4, 95%CI=1.2-24.3, p=.03) were positively associated with hazardous drinking. Being sexually attracted only to men (aOR=0.3, 95%CI=0.1-0.8, p=.01) was negatively associated with hazardous drinking. Hazardous drinking is highly prevalent among black South African MSM. Multiple indicators of social vulnerability were identified as independent determinants of hazardous drinking. These findings are of heightened concern because these health problems often work synergistically to increase risk of HIV infection and should be taken into consideration by efforts aimed at reducing hazardous drinking among this critical population. Copyright © 2017 Elsevier B.V. All rights reserved.
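The adjusted odds ratios above come from multivariable logistic regression. A simplified sketch with statsmodels is shown below; the variable names are hypothetical and the RDS-II weighting used in the paper is omitted here.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("msm_survey.csv")   # hypothetical survey extract
    fit = smf.logit("hazardous_drinking ~ township + gender_dysphoria + transactional_sex + "
                    "childhood_sexual_abuse + anxiety + network_drinking + attracted_only_to_men",
                    data=df).fit()

    ors = np.exp(fit.params)        # adjusted odds ratios
    ci = np.exp(fit.conf_int())     # 95% confidence intervals on the odds-ratio scale
    print(pd.concat([ors, ci], axis=1))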
Doherty, Sarah M; Jackman, Louise M; Kirwan, John F; Dunne, Deirdre; O'Connor, Kieran G; Rouse, John M
2016-12-01
The incidence of melanoma is rising worldwide. Current Irish guidelines from the National Cancer Control Programme state that suspicious pigmented lesions should not be removed in primary care. There are conflicting guidelines and research advising who should remove possible melanomas. To determine whether initial diagnostic excision biopsy of cutaneous malignant melanoma in primary versus secondary care leads to poorer survival. Analysis of data comprising 7116 cases of cutaneous malignant melanoma from the National Cancer Registry Ireland between January 2002 and December 2011. Single predictor variables were examined by the chi-square or Mann-Whitney U test. The effects of single predictor variables on survival were examined by Cox proportional hazards modelling, and a multivariate Cox model of survival based on excision in a non-hospital setting versus a hospital setting was derived with adjusted and unadjusted hazard ratios. Over a 10-year period, 8.5% of melanomas in Ireland were removed in a non-hospital setting. When comparing melanoma death between the hospital and non-hospital groups, the adjusted hazard ratio was 1.56 (95% CI: 1.08-2.26; P = .02), indicating a non-inferior outcome for the melanoma cases initially treated in the non-hospital group, after adjustment for significant covariates. This study suggests that initial excision biopsy carried out in general practice does not lead to a poorer outcome.
Spoendlin, Julia; Layton, J Bradley; Mundkur, Mallika; Meier, Christian; Jick, Susan S; Meier, Christoph R
2016-12-01
Case reports and pharmacovigilance data reported cases of tendon ruptures in statin users, but evidence from observational studies is scarce and inconclusive. We aimed to assess the association between new statin use and tendon rupture. We performed a propensity score (PS)-matched sequential cohort study, using data from the Clinical Practice Research Datalink. Patients aged ≥45 years with at least one new statin prescription between 1995 and 2014 were PS-matched within 2-year entry blocks to patients without a statin prescription during the block. We followed patients until they had a recorded Achilles or biceps tendon rupture, completed 5 years of follow-up, or were censored for change in exposure status or another censoring criterion. We calculated hazard ratios (HRs) with 95 % confidence intervals (CIs), applying Cox proportional hazard analyses in the overall cohort (crude and multivariable) and in the PS-matched cohort. We performed subgroup analyses by sex, age, treatment duration, and statin dose. We observed a crude HR of 1.32 (95 % CI 1.21-1.44) in the overall cohort, which attenuated after multivariable adjustment (HR 1.02, 95 % CI 0.92-1.12) and after PS-matching (HR 0.95, 95 % CI 0.84-1.08). Crude HRs were higher in women than in men, but remained around null in both sexes after multivariable adjustment and PS-matching. Subgroup analyses by age, treatment duration, and statin dose revealed null results across all subgroups. The results of this cohort study suggest that statin use does not increase the risk of tendon rupture, irrespective of gender, age, statin dose, or treatment duration.
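The propensity-score matching described above can be sketched in two steps: model the probability of starting a statin from baseline covariates, then match each new user to the non-user with the closest score. The sketch below uses scikit-learn with hypothetical column names and does not reproduce the paper's sequential 2-year block design.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    df = pd.read_csv("cprd_block.csv")   # hypothetical: one entry block of the cohort
    X = df[["age", "sex", "bmi", "diabetes", "corticosteroid_use"]].to_numpy()
    treated = df["new_statin_user"].to_numpy()

    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

    cases = np.where(treated == 1)[0]
    controls = np.where(treated == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[cases].reshape(-1, 1))
    matched_controls = controls[idx.ravel()]   # indices of 1:1 matched non-users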
Bhandari, Simran K; Shi, Jiaxiao; Molnar, Miklos Z; Rasgon, Scott A; Derose, Stephen F; Kovesdy, Csaba P; Calhoun, David A; Kalantar-Zadeh, Kamyar; Jacobsen, Steven J; Sim, John J
2016-11-01
We directly compared sleep apnoea (SA) rates and risk of cardiovascular and mortality outcomes among SA patients with resistant hypertension (RH) and non-RH within a large diverse hypertension population. A retrospective cohort study between 1 January 2006 and 31 December 2010 among hypertensive adults (age ≥ 18 years) was performed within an integrated health system. Rates of SA in RH and non-RH were determined. Multivariable logistic regression analyses were used to calculate OR for SA. Cox proportional hazard modelling was used to estimate hazard ratios (HRs) for cardiovascular and mortality outcomes between SA in RH versus SA in non-RH adjusting for age, gender, race, BMI, chronic kidney disease and other comorbidities. SA was identified in 33 682 (7.2%) from 470 386 hypertensive individuals. SA in RH accounted for 5806 (9.6%) compared to SA in non-RH 27 876 individuals (6.8%). Multivariable OR (95% CI) for SA was 1.16 (1.12, 1.19), 3.57 (3.47, 3.66) and 2.20 (2.15, 2.25) for RH versus non-RH, BMI ≥ 30, and males, respectively. Compared to SA in non-RH individuals, SA in RH had a multivariable adjusted HR (95% CI) of 1.24 (1.13, 1.36), 1.43 (1.28, 1.61), 0.98 (0.85, 1.12) and 1.04 (0.95, 1.14) for ischaemic heart event (IHE), congestive heart failure (CHF), stroke and mortality, respectively. We observed a modest increase in likelihood for SA among RH compared to non-RH patients. Risks for IHE and CHF were higher for SA in RH compared to SA in non-RH patients; however, there were no differences in risk for stroke and mortality. © 2016 Asian Pacific Society of Respirology.
d'Errico, Angelo; Ardito, Chiara; Leombruni, Roberto
2016-01-01
The aim of the study was to identify work organization features and workplace hazards associated with sickness presenteeism (SP) among European workers. The study was conducted on data from the European Working Conditions Survey 2010 and included a study population of 30,279 employees. The relationship between work-related factors and SP was assessed through Poisson multivariate robust regression models, adjusting for significant (P < 0.05) individual and work-related characteristics. SP for at least 2 days in the previous year was reported by 35% of the workers. In the fully adjusted model, several psychosocial (decision authority, skill discretion, reward, abuse; psychological, cognitive, and emotional demand) and organizational factors (shift work, working with clients, long work hours) were positively associated with SP, whereas job insecurity and exposure to physical factors (lifting or moving people, vibration) decreased SP risk. Our results support the importance of work-related factors, especially psychosocial exposures and organizational features, in determining workers' SP. © 2015 Wiley Periodicals, Inc.
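Robust ("modified") Poisson regression of this kind yields prevalence ratios for a binary outcome. A minimal statsmodels sketch, with hypothetical variable names standing in for the survey items, is given below.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("ewcs2010.csv")   # hypothetical extract of the survey
    fit = smf.glm("presenteeism ~ decision_authority + skill_discretion + reward + emotional_demand + "
                  "shift_work + client_work + long_hours + job_insecurity",
                  data=df, family=sm.families.Poisson()).fit(cov_type="HC1")

    print(np.exp(fit.params))   # prevalence ratios from the robust Poisson model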
Navar, Ann Marie; Gallup, Dianne S; Lokhnygina, Yuliya; Green, Jennifer B; McGuire, Darren K; Armstrong, Paul W; Buse, John B; Engel, Samuel S; Lachin, John M; Standl, Eberhard; Van de Werf, Frans; Holman, Rury R; Peterson, Eric D
2017-11-01
Systolic blood pressure (SBP) treatment targets for adults with diabetes mellitus remain unclear. SBP levels among 12 275 adults with diabetes mellitus, prior cardiovascular disease, and treated hypertension were evaluated in the TECOS (Trial Evaluating Cardiovascular Outcomes With Sitagliptin) randomized trial of sitagliptin versus placebo. The association between baseline SBP and recurrent cardiovascular disease was evaluated using multivariable Cox proportional hazards modeling with restricted cubic splines, adjusting for clinical characteristics. Kaplan-Meier curves by baseline SBP were created to assess time to cardiovascular disease and 2 potential hypotension-related adverse events: worsening kidney function and fractures. The association between time-updated SBP and outcomes was examined using multivariable Cox proportional hazards models. Overall, 42.2% of adults with diabetes mellitus, cardiovascular disease, and hypertension had an SBP ≥140 mm Hg. The association between SBP and cardiovascular disease risk was U shaped, with a nadir ≈130 mm Hg. When the analysis was restricted to those with baseline SBP of 110 to 150 mm Hg, the adjusted association between SBP and cardiovascular disease risk was flat (hazard ratio per 10-mm Hg increase, 0.96; 95% confidence interval, 0.91-1.02). There was no association between SBP and risk of fracture. Above 150 mm Hg, higher SBP was associated with increasing risk of worsening kidney function (hazard ratio per 10-mm Hg increase, 1.10; 95% confidence interval, 1.02-1.18). Many patients with diabetes mellitus have uncontrolled hypertension. The U-shaped association between SBP and cardiovascular disease events was largely driven by those with very high or low SBP, with no difference in cardiovascular disease risk between 110 and 150 mm Hg. Lower SBP was not associated with higher risks of fractures or worsening kidney function. © 2017 American Heart Association, Inc.
Varma, Niraj; Mittal, Suneet; Prillinger, Julie B; Snell, Jeff; Dalal, Nirav; Piccini, Jonathan P
2017-05-10
Whether outcomes differ between sexes following treatment with pacemakers (PM), implantable cardioverter defibrillators, and cardiac resynchronization therapy (CRT) devices is unclear. Consecutive US patients with newly implanted PM, implantable cardioverter defibrillators, and CRT devices from a large remote monitoring database between 2008 and 2011 were included in this observational cohort study. Sex-specific all-cause survival postimplant was compared within each device type using a multivariable Cox proportional hazards model, stratified on age and adjusted for remote monitoring utilization and ZIP-based socioeconomic variables. A total of 269 471 patients were assessed over a median 2.9 [interquartile range, 2.2, 3.6] years. Unadjusted mortality rates (MR; deaths/100 000 patient-years) were similar between women versus men receiving PMs (n=115 076, 55% male; MR 4193 versus MR 4256, respectively; adjusted hazard ratio, 0.87; 95% CI, 0.84-0.90; P <0.001) and implantable cardioverter defibrillators (n=85 014, 74% male; MR 4417 versus MR 4479, respectively; adjusted hazard ratio, 0.98; 95% CI, 0.93-1.02; P =0.244). In contrast, survival was superior in women receiving CRT defibrillators (n=61 475, 72% male; MR 5270 versus male MR 7175; adjusted hazard ratio, 0.73; 95% CI, 0.70-0.76; P <0.001) and also CRT pacemakers (n=7906, 57% male; MR 5383 versus male MR 7625, adjusted hazard ratio, 0.69; 95% CI, 0.61-0.78; P <0.001). This relative difference increased with time. These results were unaffected by age or remote monitoring utilization. Women accounted for less than 30% of high-voltage implants and fewer than half of low-voltage implants in a large, nation-wide cohort. Survival for women and men receiving implantable cardioverter defibrillators and PMs was similar, but dramatically greater for women receiving both defibrillator- and PM-based CRT. © 2017 The Authors and St. Jude Medical. Published on behalf of the American Heart Association, Inc., by Wiley.
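The device-registry analysis above stratifies the Cox model on age, letting each age band keep its own baseline hazard while estimating a common sex effect. A short sketch with lifelines, using hypothetical columns, illustrates the idea.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("device_registry.csv")   # hypothetical: years, died, female, rm_use, zip_income, age_band

    cph = CoxPHFitter()
    cph.fit(df[["years", "died", "female", "rm_use", "zip_income", "age_band"]],
            duration_col="years", event_col="died", strata=["age_band"])
    print(cph.hazard_ratios_["female"])   # sex hazard ratio with a separate baseline hazard per age band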
Akter, Shamima; Kashino, Ikuko; Goto, Atsushi; Mizoue, Tetsuya; Noda, Mitsuhiko; Sasazuki, Shizuka; Sawada, Norie; Tsugane, Shoichiro
2016-01-01
Objective To examine the association between adherence to the Japanese Food Guide Spinning Top and total and cause specific mortality. Design Large scale population based prospective cohort study in Japan with follow-up for a median of 15 years. Setting 11 public health centre areas across Japan. Participants 36 624 men and 42 970 women aged 45-75 who had no history of cancer, stroke, ischaemic heart disease, or chronic liver disease. Main outcome measures Deaths and causes of death identified with the residential registry and death certificates. Results Higher scores on the food guide (better adherence) were associated with lower total mortality; the multivariable adjusted hazard ratios (95% confidence interval) of total mortality for the lowest through highest scores were 1.00, 0.92 (0.87 to 0.97), 0.88 (0.83 to 0.93), and 0.85 (0.79 to 0.91) (P<0.001 for trend) and the multivariable adjusted hazard ratio associated with a 10 point increase in food guide scores was 0.93 (0.91 to 0.95; P<0.001 for trend). This score was inversely associated with mortality from cardiovascular disease (hazard ratio associated with a 10 point increase 0.93, 0.89 to 0.98; P=0.005 for trend) and particularly from cerebrovascular disease (0.89, 0.82 to 0.95; P=0.002 for trend). There was some evidence, though not significant, of an inverse association for cancer mortality (0.96, 0.93 to 1.00; P=0.053 for trend). Conclusion Closer adherence to Japanese dietary guidelines was associated with a lower risk of total mortality and mortality from cardiovascular disease, particularly from cerebrovascular disease, in Japanese adults. PMID:27005903
McAuley, Paul A; Keteyian, Steven J; Brawner, Clinton A; Dardari, Zeina A; Al Rifai, Mahmoud; Ehrman, Jonathan K; Al-Mallah, Mouaz H; Whelton, Seamus P; Blaha, Michael J
2018-05-03
To assess the influence of exercise capacity and body mass index (BMI) on 10-year mortality in patients with heart failure (HF) and to synthesize these results with those of previous studies. This large biracial sample included 774 men and women (mean age, 60±13 years; 372 [48%] black) with a baseline diagnosis of HF from the Henry Ford Exercise Testing (FIT) Project. All patients completed a symptom-limited maximal treadmill stress test from January 1, 1991, through May 31, 2009. Patients were grouped by World Health Organization BMI categories for Kaplan-Meier survival analyses and stratified by exercise capacity (<4 and ≥4 metabolic equivalents [METs] of task). Associations of BMI and exercise capacity with all-cause mortality were assessed using multivariable-adjusted Cox proportional hazards models. During a mean follow-up of 10.1±4.6 years, 380 patients (49%) died. Kaplan-Meier survival plots revealed a significant positive association between BMI category and survival for exercise capacity less than 4 METs (log-rank, P=.05), but not greater than or equal to 4 METs (P=.76). In the multivariable-adjusted models, exercise capacity (per 1 MET) was inversely associated, but BMI was not associated, with all-cause mortality (hazard ratio, 0.89; 95% CI, 0.85-0.94; P<.001 and hazard ratio, 0.99; 95% CI, 0.97-1.01; P=.16, respectively). Maximal exercise capacity modified the relationship between BMI and long-term survival in patients with HF, upholding the presence of an exercise capacity-obesity paradox dichotomy as observed over the short-term in previous studies. Copyright © 2018 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
Panotopoulos, Joannis; Posch, Florian; Funovics, Philipp T; Willegger, Madeleine; Scharrer, Anke; Lamm, Wolfgang; Brodowicz, Thomas; Windhager, Reinhard; Ay, Cihan
2016-03-01
Low serum albumin levels and impaired kidney function have been associated with decreased survival in patients with a variety of cancer types. In a retrospective cohort study, we analyzed 84 patients with liposarcoma treated from May 1994 to October 2011. Uni- and multivariable Cox proportional hazard models and competing risk analyses were performed to evaluate the association of putative biomarkers with disease-specific and overall survival. The median age of the study population was 51.7 (range 19.6-83.8) years. In multivariable analysis adjusted for AJCC tumor stage, serum creatinine was highly associated with disease-specific survival (Subdistribution Hazard ratio (SHR) per 1 mg/dl increase = 2.94; 95%CI 1.39-6.23; p = 0.005). High albumin was associated with improved overall and disease-specific survival (Hazard Ratio (HR) per 10 units increase = 0.50; 95%CI 0.26-0.95; p = 0.033 and SHR = 0.64; 95%CI 0.42-1.00; p = 0.049). The serum albumin-creatinine-ratio was also associated with both overall and disease-specific survival after adjusting for AJCC tumor stage (HR = 0.95; 95%CI 0.92-0.99; p = 0.011 and SHR = 0.96; 95%CI 0.93-0.99; p = 0.08). Our study provides evidence for a tumor-stage-independent association between higher creatinine and lower albumin with worse disease-specific survival. Low albumin and a high albumin-creatinine-ratio independently predict poor overall survival. Our work identified novel biomarkers for the prognosis of patients with liposarcoma. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
Kershaw, Kiarri N.; Osypuk, Theresa L.; Do, D. Phuong; De Chavez, Peter J.; Roux, Ana V. Diez
2014-01-01
Background Previous research suggests neighborhood-level racial/ethnic residential segregation is linked to health, but it has not been studied prospectively in relation to cardiovascular disease (CVD). Methods and Results Participants were 1,595 non-Hispanic Black, 2,345 non-Hispanic White, and 1,289 Hispanic adults from the Multi-Ethnic Study of Atherosclerosis free of CVD at baseline (ages 45-84). Own-group racial/ethnic residential segregation was assessed using the Gi∗ statistic, a measure of how the neighborhood racial/ethnic composition deviates from surrounding counties’ racial/ethnic composition. Multivariable Cox proportional hazards modeling was used to estimate hazard ratios (HR) for incident CVD (first definite angina, probable angina followed by revascularization, myocardial infarction, resuscitated cardiac arrest, CHD death, stroke, or stroke death) over 10.2 median years of follow-up. Among Blacks, each standard deviation increase in Black segregation was associated with a 12% higher hazard of developing CVD after adjusting for demographics (95% Confidence Interval (CI): 1.02, 1.22). This association persisted after adjustment for neighborhood-level characteristics, individual socioeconomic position, and CVD risk factors (HR: 1.12; 95% CI: 1.02, 1.23). For Whites, higher White segregation was associated with lower CVD risk after adjusting for demographics (HR: 0.88; 95% CI: 0.81, 0.96), but not after further adjustment for neighborhood characteristics. Segregation was not associated with CVD risk among Hispanics. Similar results were obtained after adjusting for time-varying segregation and covariates. Conclusions The association of residential segregation with cardiovascular risk varies according to race/ethnicity. Further work is needed to better characterize the individual- and neighborhood-level pathways linking segregation to CVD risk. PMID:25447044
Intake of Fiber and Nuts during Adolescence and Incidence of Proliferative Benign Breast Disease
Su, Xuefen; Tamimi, Rulla M.; Collins, Laura C.; Baer, Heather J.; Cho, Eunyoung; Sampson, Laura; Willett, Walter C.; Schnitt, Stuart J.; Connolly, James L.; Rosner, Bernard A.; Colditz, Graham A.
2011-01-01
Objective We examined the association between adolescent fiber intake and proliferative BBD, a marker of increased breast cancer risk, in the Nurses' Health Study II. Methods Among 29,480 women who completed a high school diet questionnaire in 1998, 682 proliferative BBD cases were identified and confirmed by centralized pathology review between 1991 and 2001. Multivariate-adjusted Cox proportional hazards regression was used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs). Results Women in the highest quintile of adolescent fiber intake had a 25% lower risk of proliferative BBD (multivariate HR (95% CI): 0.75 (0.59, 0.96), p-trend = 0.01) than women in the lowest quintile. High school intake of nuts and apples was also related to significantly reduced BBD risk. Women consuming ≥2 servings of nuts/week had a 36% lower risk (multivariate HR (95% CI): 0.64 (0.48, 0.85), p-trend < 0.01) than women consuming <1 serving/month. Results were essentially the same when the analysis was restricted to prospective cases (n = 142) diagnosed after return of the high school diet questionnaire. Conclusions These findings support the hypothesis that dietary intake of fiber and nuts during adolescence influences subsequent risk of breast disease and may suggest a viable means for breast cancer prevention. PMID:20229245
Budhathoki, Sanjeev; Hidaka, Akihisa; Yamaji, Taiki; Sawada, Norie; Tanaka-Mizuno, Sachiko; Kuchiba, Aya; Charvat, Hadrien; Goto, Atsushi; Kojima, Satoshi; Sudo, Natsuki; Shimazu, Taichi; Sasazuki, Shizuka; Inoue, Manami; Tsugane, Shoichiro; Iwasaki, Motoki
2018-03-07
To evaluate the association between pre-diagnostic circulating vitamin D concentration and the subsequent risk of overall and site specific cancer in a large cohort study. Nested case-cohort study within the Japan Public Health Center-based Prospective Study cohort. Nine public health centre areas across Japan. 3301 incident cases of cancer and 4044 randomly selected subcohort participants. Plasma concentration of 25-hydroxyvitamin D measured by enzyme immunoassay. Participants were divided into quarters based on the sex and season specific distribution of 25-hydroxyvitamin D among subcohorts. Weighted Cox proportional hazard models were used to calculate the multivariable adjusted hazard ratios for overall and site specific cancer across categories of 25-hydroxyvitamin D concentration, with the lowest quarter as the reference. Incidence of overall or site specific cancer. Plasma 25-hydroxyvitamin D concentration was inversely associated with the risk of total cancer, with multivariable adjusted hazard ratios for the second to fourth quarters compared with the lowest quarter of 0.81 (95% confidence interval 0.70 to 0.94), 0.75 (0.65 to 0.87), and 0.78 (0.67 to 0.91), respectively (P for trend=0.001). Among the findings for cancers at specific sites, an inverse association was found for liver cancer, with corresponding hazard ratios of 0.70 (0.44 to 1.13), 0.65 (0.40 to 1.06), and 0.45 (0.26 to 0.79) (P for trend=0.006). A sensitivity analysis showed that alternately removing cases of cancer at one specific site from total cancer cases did not substantially change the overall hazard ratios. In this large prospective study, higher vitamin D concentration was associated with lower risk of total cancer. These findings support the hypothesis that vitamin D has protective effects against cancers at many sites. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
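The case-cohort design above is typically analyzed with a weighted Cox model, in which subcohort members carry inverse-sampling weights and robust (sandwich) standard errors are used. A hedged sketch with lifelines follows; the weight construction and column names are hypothetical.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("case_cohort.csv")   # hypothetical: years, cancer, vitd_quarter, age, smoking, weight

    cph = CoxPHFitter()
    cph.fit(df[["years", "cancer", "vitd_quarter", "age", "smoking", "weight"]],
            duration_col="years", event_col="cancer", weights_col="weight", robust=True)
    cph.print_summary()   # hazard ratios across vitamin D quarters with robust standard errors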
A Concept-Wide Association Study of Clinical Notes to Discover New Predictors of Kidney Failure.
Singh, Karandeep; Betensky, Rebecca A; Wright, Adam; Curhan, Gary C; Bates, David W; Waikar, Sushrut S
2016-12-07
Identifying predictors of kidney disease progression is critical toward the development of strategies to prevent kidney failure. Clinical notes provide a unique opportunity for big data approaches to identify novel risk factors for disease. We used natural language processing tools to extract concepts from the preceding year's clinical notes among patients newly referred to a tertiary care center's outpatient nephrology clinics and retrospectively evaluated these concepts as predictors for the subsequent development of ESRD using proportional subdistribution hazards (competing risk) regression. The primary outcome was time to ESRD, accounting for a competing risk of death. We identified predictors from univariate and multivariate (adjusting for Tangri linear predictor) models using a 5% threshold for false discovery rate (q value <0.05). We included all patients seen by an adult outpatient nephrologist between January 1, 2004 and June 18, 2014 and excluded patients seen only by transplant nephrology, with preexisting ESRD, with fewer than five clinical notes, with no follow-up, or with no baseline creatinine values. Among the 4013 patients selected in the final study cohort, we identified 960 concepts in the unadjusted analysis and 885 concepts in the adjusted analysis. Novel predictors identified included high-dose ascorbic acid (adjusted hazard ratio, 5.48; 95% confidence interval, 2.80 to 10.70; q<0.001) and fast food (adjusted hazard ratio, 4.34; 95% confidence interval, 2.55 to 7.40; q<0.001). Novel predictors of human disease may be identified using an unbiased approach to analyze text from the electronic health record. Copyright © 2016 by the American Society of Nephrology.
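The 5% false-discovery-rate threshold described above corresponds to a Benjamini-Hochberg correction applied across the per-concept p-values. A small sketch with statsmodels, assuming a hypothetical results table, shows the step.

    import pandas as pd
    from statsmodels.stats.multitest import multipletests

    results = pd.read_csv("concept_models.csv")   # hypothetical: one row per concept with columns concept, hr, p
    reject, qvals, _, _ = multipletests(results["p"], alpha=0.05, method="fdr_bh")
    results["q"] = qvals
    print(results.loc[reject, ["concept", "hr", "q"]])   # concepts surviving a 5% false discovery rate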
Pigmentation Traits, Sun Exposure, and Risk of Incident Vitiligo in Women.
Dunlap, Rachel; Wu, Shaowei; Wilmer, Erin; Cho, Eunyoung; Li, Wen-Qing; Lajevardi, Newsha; Qureshi, Abrar
2017-06-01
Vitiligo is the most common cutaneous depigmentation disorder worldwide, yet little is known about specific risk factors for disease development. Using data from the Nurses' Health Study, a prospective cohort study of 51,337 white women, we examined the associations between (i) pigmentary traits and (ii) reactions to sun exposure and risk of incident vitiligo. Nurses' Health Study participants responded to a question about clinician-diagnosed vitiligo and year of diagnosis (2001 or before, 2002-2005, 2006-2009, 2010-2011, or 2012+). We used Cox proportional hazards regression models to estimate the multivariate-adjusted hazard ratios and 95% confidence intervals of incident vitiligo associated with exposure variables, adjusting for potential confounders. We documented 271 cases of incident vitiligo over 835,594 person-years. Vitiligo risk was higher in women who had at least one mole larger than 3 mm in diameter on their left arms (hazard ratio = 1.37, 95% confidence interval = 1.02-1.83). Additionally, vitiligo risk was higher among women with better tanning ability (hazard ratio = 2.59, 95% confidence interval = 1.21-5.54) and in women who experienced at least one blistering sunburn (hazard ratio = 2.17, 95% confidence interval = 1.15-4.10). In this study, upper extremity moles, a higher ability to achieve a tan, and history of a blistering sunburn were associated with a higher risk of developing vitiligo in a population of white women. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Prognostic Utility of Novel Biomarkers of Cardiovascular Stress: The Framingham Heart Study
Wang, Thomas J.; Wollert, Kai C.; Larson, Martin G.; Coglianese, Erin; McCabe, Elizabeth L.; Cheng, Susan; Ho, Jennifer E.; Fradley, Michael G.; Ghorbani, Anahita; Xanthakis, Vanessa; Kempf, Tibor; Benjamin, Emelia J.; Levy, Daniel; Vasan, Ramachandran S.; Januzzi, James L.
2013-01-01
Background Biomarkers for predicting cardiovascular events in community-based populations have not consistently added information to standard risk factors. A limitation of many previously studied biomarkers is their lack of cardiovascular specificity. Methods and Results To determine the prognostic value of 3 novel biomarkers induced by cardiovascular stress, we measured soluble ST2, growth differentiation factor-15, and high-sensitivity troponin I in 3,428 participants (mean age 59, 53% women) in the Framingham Heart Study. We performed multivariable-adjusted proportional hazards models to assess the individual and combined ability of the biomarkers to predict adverse outcomes. We also constructed a “multimarker” score composed of the 3 biomarkers, in addition to B-type natriuretic peptide and high-sensitivity C-reactive protein. During a mean follow-up of 11.3 years, there were 488 deaths, 336 major cardiovascular events, 162 heart failure events, and 142 coronary events. In multivariable-adjusted models, the 3 new biomarkers were associated with each endpoint (p<0.001) except for coronary events. Individuals with multimarker scores in the highest quartile had a 3-fold risk of death (adjusted hazard ratio, 3.2, 95% CI, 2.2–4.7; p<0.001), 6-fold risk of heart failure (6.2, 95% CI, 2.6–14.8; p<0.001), and 2-fold risk of cardiovascular events (1.9, 95% CI, 1.3–2.7; p=0.001). Addition of the multimarker score to clinical variables led to significant increases in the c-statistic (p=0.007 or lower) and net reclassification improvement (p=0.001 or lower). Conclusions Multiple biomarkers of cardiovascular stress are detectable in ambulatory individuals, and add prognostic value to standard risk factors for predicting death, overall cardiovascular events, and heart failure. PMID:22907935
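The c-statistic comparison reported above asks whether adding the multimarker score improves discrimination over standard risk factors. A simplified sketch of that comparison with lifelines (Harrell's concordance from two nested Cox models, hypothetical column names) is shown here; it does not reproduce the reclassification analysis.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("biomarker_cohort.csv")   # hypothetical analysis file

    base_cols = ["years", "died", "age", "sbp", "diabetes", "smoking"]
    base = CoxPHFitter().fit(df[base_cols], duration_col="years", event_col="died")
    full = CoxPHFitter().fit(df[base_cols + ["multimarker_score"]], duration_col="years", event_col="died")

    print(base.concordance_index_, full.concordance_index_)   # c-statistic without vs with the multimarker score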
Use of Tanning Beds and Incidence of Skin Cancer
Zhang, Mingfeng; Qureshi, Abrar A.; Geller, Alan C.; Frazier, Lindsay; Hunter, David J.; Han, Jiali
2012-01-01
Purpose We sought to evaluate the risk effect of tanning bed use on skin cancers among teenage and young adults. We also expected to determine whether a dose-response relationship was evident. Patients and Methods We observed 73,494 female nurses for 20 years (from 1989 to 2009) in a large and well-characterized cohort in the United States and investigated whether frequency of tanning bed use during high school/college and at ages 25 to 35 years were associated with a risk of basal cell carcinoma (BCC), squamous cell carcinoma (SCC), and melanoma. We used Cox proportional hazards models and carefully adjusted for host risk factors, ultraviolet index of residence, and sun exposure behaviors at a young age. Results During follow-up, 5,506 nurses were diagnosed with BCC, 403 with SCC, and 349 with melanoma. The multivariable-adjusted hazard ratio (HR) of skin cancer for an incremental increase in use of tanning beds of four times per year during both periods was 1.15 (95% CI, 1.11 to 1.19; P < .001) for BCC, 1.15 (95% CI, 1.01 to 1.31; P = .03) for SCC, and 1.11 (95% CI, 0.97 to 1.27; P = .13) for melanoma. Compared with tanning bed use at ages 25 to 35 years, we found a significantly higher risk of BCC for use during high school/college (multivariable-adjusted HR for use more than six times per year compared with no use was 1.73 during high school/college v 1.28 at ages 25 to 35 years; P for heterogeneity < .001). Conclusion Our data provide evidence for a dose-response relationship between tanning bed use and the risk of skin cancers, especially BCC, and the association is stronger for patients with a younger age at exposure. PMID:22370316
Yaffe, Kristine; Falvey, Cherie M; Hamilton, Nathan; Harris, Tamara B; Simonsick, Eleanor M; Strotmeyer, Elsa S; Shorr, Ronald I; Metti, Andrea; Schwartz, Ann V
2013-07-22
Hypoglycemia commonly occurs in patients with diabetes mellitus (DM) and may negatively influence cognitive performance. Cognitive impairment in turn can compromise DM management and lead to hypoglycemia. To prospectively evaluate the association between hypoglycemia and dementia in a biracial cohort of older adults with DM. Prospective population-based study. We studied 783 older adults with DM (mean age, 74.0 years; 47.0% of black race/ethnicity; and 47.6% female) who were participating in the prospective population-based Health, Aging, and Body Composition Study beginning in 1997 and who had baseline Modified Mini-Mental State Examination scores of 80 or higher. Dementia diagnosis was determined during the follow-up period from hospital records indicating an admission associated with dementia or the use of prescribed dementia medications. Hypoglycemic events were determined during the follow-up period by hospital records. During the 12-year follow-up period, 61 participants (7.8%) had a reported hypoglycemic event, and 148 (18.9%) developed dementia. Those who experienced a hypoglycemic event had a 2-fold increased risk for developing dementia compared with those who did not have a hypoglycemic event (34.4% vs 17.6%, P < .001; multivariate-adjusted hazard ratio, 2.1; 95% CI, 1.0-4.4). Similarly, older adults with DM who developed dementia had a greater risk for having a subsequent hypoglycemic event compared with participants who did not develop dementia (14.2% vs 6.3%, P < .001; multivariate-adjusted hazard ratio, 3.1; 95% CI, 1.5-6.6). Further adjustment for stroke, hypertension, myocardial infarction, and cognitive change scores produced similar results. Among older adults with DM, there seems to be a bidirectional association between hypoglycemia and dementia.
Sharma, Sonam; Bekelman, Justin; Lin, Alexander; Lukens, J Nicholas; Roman, Benjamin R; Mitra, Nandita; Swisher-McClure, Samuel
2016-05-01
We examined practice patterns using the National Cancer Data Base (NCDB) to determine risk factors for prolonged diagnosis to treatment interval (DTI) and survival outcomes in patients receiving chemoradiation for oropharyngeal squamous cell carcinoma (OPSCC). We identified 6606 NCDB patients with Stage III-IV OPSCC receiving chemoradiation from 2003 to 2006. We determined risk factors for prolonged DTI (>30 days) using univariate and multivariable logistic regression models. We examined overall survival (OS) using Kaplan-Meier and multivariable Cox proportional hazards models. 3586 (54.3%) patients had prolonged DTI. Race, IMRT, insurance status, and high volume facilities were significant risk factors for prolonged DTI. Patients with prolonged DTI had inferior OS compared to DTI ≤30 days (Hazard Ratio (HR)=1.12, 95% CI 1.04-1.20, p=0.005). For every week increase in DTI there was a 2.2% (95% CI 1.1-3.3%, p<0.001) increase in risk of death. Patients receiving IMRT, treatment at academic, or high-volume facilities were more likely to experience prolonged DTI (High vs. Low volume: 61.5% vs. 51.8%, adjusted OR 1.38, 95% CI 1.21-1.58; Academic vs. Community: 59.5% vs. 50.6%, adjusted OR 1.26, 95% CI 1.13-1.42; non-IMRT vs. IMRT: 53.4% vs. 56.5%; adjusted OR 1.17, 95% CI 1.04-1.31). Our results suggest that prolonged DTI has a significant impact on survival outcomes. We observed disparities in DTI by socioeconomic factors. However, facility level factors such as academic affiliation, high volume, and IMRT also increased risk of DTI. These findings should be considered in developing efficient pathways to mitigate adverse effects of prolonged DTI. Copyright © 2016 Elsevier Ltd. All rights reserved.
Warner, Erica T; Carapinha, René; Weber, Griffin M; Hill, Emorcia V; Reede, Joan Y
2017-10-01
To determine whether there were gender differences in likelihood of receiving a first National Institutes of Health (NIH) R01 award among 5445 instructors and assistant professors at Harvard Medical School (HMS). Data on R01 award principal investigators were obtained from NIH ExPORTER and linked with faculty data. Using Cox proportional hazard regression, we examined the association of gender with receipt of first R01 award between 2008 and 2015 accounting for demographics, research productivity metrics, and professional characteristics. Compared to males, females had fewer publications, lower h-index, smaller coauthor networks and were less likely to be assistant professors (p < 0.0001). Four hundred and thirteen of 5445 faculty (7.6%) received their first R01 award during the study period. There was no gender difference in receipt of R01 awards in age-adjusted (hazard ratio [HR]: 0.87, 95% confidence interval [CI]: 0.70-1.08) or multivariable-adjusted models (HR: 1.07, 95% CI: 0.86-1.34). Compared to white males, there was a nonsignificant 10%, 18%, and 30% lower rate of R01 receipt among white, Asian or Pacific Islander, and underrepresented minority females, respectively. These differences were eliminated in the multivariable-adjusted model. Network reach, age, HMS start year, h-index, academic rank, previous K award, terminal degree, and HMS training were all significant predictors of receiving an R01 award. A relatively small proportion of HMS junior faculty obtained their first NIH R01 award during the study period. There was no significant gender difference in likelihood of award. However, we are unable to distinguish faculty that never applied from those who applied and were not successful.
von Ruesten, A; Illner, A-K; Buijsse, B; Heidemann, C; Boeing, H
2010-11-01
The German food pyramid was set up to foster and communicate healthy food choices. The adherence to recommendations of the food pyramid was translated into an index (German Food Pyramid Index (GFPI)) by scoring the ratio of consumed and recommended daily servings of eight food groups, wherein higher scores indicated greater adherence. The GFPI was calculated for 23 531 subjects who participated in the European Prospective Investigation into Cancer and Nutrition-Potsdam study and were recruited between 1994 and 1998. Associations between quintiles of GFPI scores and risk of incident cardiovascular diseases (CVD), type-2 diabetes (T2D) and cancer were evaluated using Cox proportional hazard regression models. During 183 740 person-years of follow-up, 363 incident cases of CVD (myocardial infarction or stroke), 837 incident cases of T2D and 844 incident cases of cancer occurred. The GFPI was inversely related to CVD risk in men (multivariable-adjusted hazard ratio (HR) for highest versus lowest quintiles=0.56; 95% confidence interval (CI): 0.34-0.94) but not in women (HR=1.39; 95% CI: 0.76-2.55). No association between GFPI and cancer was observed. An inverse relation between GFPI and T2D (men: HR=0.71 (0.52-0.97); women: HR=0.69 (0.50-0.96)) in age-adjusted models was substantially attenuated after multivariable adjustments, particularly by body mass index (BMI) (men: HR=0.94 (0.69-1.30); women: HR=1.09 (0.77-1.54)). The same was observed for overall major chronic disease risk (CVD, T2D and total cancer). Adherence to the German food pyramid recommendations is not associated with a decreased risk of chronic diseases when considering BMI as a confounder, except for CVD in men.
Mortensen, Eric M.; Rello, Jordi; Brody, Jennifer; Anzueto, Antonio
2010-01-01
Background: Limited data are available on the impact of time to ICU admission and outcomes for patients with severe community acquired pneumonia (CAP). Our objective was to examine the association of time to ICU admission and 30-day mortality in patients with severe CAP. Methods: A retrospective cohort study of 161 ICU subjects with CAP (by International Classification of Diseases, 9th edition, codes) was conducted over a 3-year period at two tertiary teaching hospitals. Timing of the ICU admission was dichotomized into early ICU admission (EICUA, direct admission or within 24 h) and late ICU admission (LICUA, ≥ day 2). A multivariable analysis using Cox proportional hazard model was created with the primary outcome of 30-day mortality (dependent measure) and the American Thoracic Society (ATS) severity adjustment criteria and time to ICU admission as the independent measures. Results: Eighty-eight percent (n = 142) were EICUA patients compared with 12% (n = 19) LICUA patients. Groups were similar with respect to age, gender, comorbidities, clinical parameters, CAP-related process of care measures, and need for mechanical ventilation. LICUA patients had lower rates of ATS severity criteria at presentation (26.3% vs 53.5%; P = .03). LICUA patients (47.4%) had a higher 30-day mortality compared with EICUA (23.2%) patients (P = .02), which remained after adjusting in the multivariable analysis (hazard ratio 2.6; 95% CI, 1.2-5.5; P = .02). Conclusion: Patients with severe CAP with a late ICU admission have increased 30-day mortality after adjustment for illness severity. Further research should evaluate the risk factors associated and their impact on clinical outcomes in patients admitted late to the ICU. PMID:19880910
A prospective cohort study of stroke mortality and arsenic in drinking water in Bangladeshi adults
2014-01-01
Background Arsenic in drinking water causes increased coronary artery disease (CAD) and death from CAD, but its association with stroke is not known. Methods Prospective cohort study with arsenic exposure measured in well water at baseline. A total of 61,074 men and women aged 18 years or older as of January 2003 were enrolled in 2003. The cohort was actively followed for an average of 7 years (421,754 person-years) through December 2010. Based on arsenic concentration, the population was categorized into three groups, and stroke mortality in each group was compared with that in the referent group. Hazard ratios (HRs) and 95% confidence intervals (CIs) for stroke mortality in relation to arsenic exposure were estimated by Cox proportional hazards models with adjustment for potential confounders. Results A total of 1033 people died from stroke during the follow-up period, accounting for 23% of the total deaths. Multivariable adjusted HRs (95% CI) for stroke for well water arsenic concentrations <10, 10-49, and ≥50 μg/L were 1.0 (reference), 1.20 (0.92 to 1.57), and 1.35 (1.04 to 1.75) respectively (Ptrend=0.00058). For men, multivariable adjusted HRs (95% CI) for well water arsenic concentrations <10, 10-49, and ≥50 μg/L were 1.0 (reference), 1.12 (0.78 to 1.60), and 1.07 (0.75 to 1.51) respectively (Ptrend=0.45), and for women 1.0 (reference), 1.31 (0.87 to 1.98), and 1.72 (1.15 to 2.57) respectively (Ptrend=0.00004). Conclusion The results suggest that arsenic exposure was associated with increased stroke mortality risk in this population, and that the association was stronger in women than in men. PMID:24548416
Chen, Ruoling; McKevitt, Christopher; Rudd, Anthony G; Wolfe, Charles D A
2014-01-01
Previous findings of the association between socioeconomic deprivation (SED) and survival after stroke are inconsistent, and long-term survival has been investigated less often. We assessed the associations in a multi-ethnic population in England. We examined data from 4398 patients (3103 whites, 932 blacks, and 253 Asians/others) with first-ever stroke, collected by a population-based stroke register in South London from 1995 to 2011. SED was measured using the Carstairs index score (the higher the score, the more deprived). It was analyzed in multivariate Cox regression models in relation to survival after stroke. During the 17-year follow-up, 2754 patients died. The quartile data of the Carstairs score showed no significant association of SED with survival, except for black Caribbeans and Africans. Black patients in the fourth (most deprived) SED quartile had a multivariate adjusted hazard ratio of 1.76 (95% confidence interval, 1.06-2.94) for 3-month mortality and 1.54 (1.00-2.37) for 1-year mortality. After adjustment for acute stroke care provisions, these were no longer significant. However, the sextile data of the Carstairs score showed a consistent association of SED with survival after stroke; among all patients, those in the sixth (most deprived) sextile had a fully adjusted hazard ratio of 1.23 (1.05-1.44) for 3-month mortality and 1.13 (1.01-1.25) for 17-year mortality. There is a weak but significant association of SED with reduced survival after stroke in England. SED in blacks may have a stronger impact on short-term survival when compared with white patients. Further efforts are required to achieve equality in survival among patients with stroke of different socioeconomic groups.
Trial by fire: a multivariate examination of the relation between job tenure and work injuries.
Breslin, F C; Smith, P
2006-01-01
This study examined the relation between months on the job and lost-time claim rates, with a particular focus on age-related differences. Workers' compensation records and labour force survey data were used to compute claim rates per 1000 full-time equivalents. To adjust for potential confounding, multivariate analyses included age, sex, occupation, and industry, as well as job tenure, as predictors of claim rates. At any age, claim rates declined as time on the job increased. For example, workers in the first month on the job were over four times more likely to have a lost-time claim than workers with over one year in their current job. The job tenure-injury associations were stronger among males, the goods industry, manual occupations, and older adult workers. The present results suggest that all worker subgroups examined show increased risk when new on the job. Recommendations for improving this situation include earlier training, starting workers in low-hazard conditions, reducing job turnover rates in firms, and improved monitoring of the hazard exposures that new workers encounter.
Prognostic Significance of POLE Proofreading Mutations in Endometrial Cancer
Church, David N.; Stelloo, Ellen; Nout, Remi A.; Valtcheva, Nadejda; Depreeuw, Jeroen; ter Haar, Natalja; Noske, Aurelia; Amant, Frederic; Wild, Peter J.; Lambrechts, Diether; Jürgenliemk-Schulz, Ina M.; Jobsen, Jan J.; Smit, Vincent T. H. B. M.; Creutzberg, Carien L.; Bosse, Tjalling
2015-01-01
Background: Current risk stratification in endometrial cancer (EC) results in frequent over- and underuse of adjuvant therapy, and may be improved by novel biomarkers. We examined whether POLE proofreading mutations, recently reported in about 7% of ECs, predict prognosis. Methods: We performed targeted POLE sequencing in ECs from the PORTEC-1 and -2 trials (n = 788), and analyzed clinical outcome according to POLE status. We combined these results with those from three additional series (n = 628) by meta-analysis to generate multivariable-adjusted, pooled hazard ratios (HRs) for recurrence-free survival (RFS) and cancer-specific survival (CSS) of POLE-mutant ECs. All statistical tests were two-sided. Results: POLE mutations were detected in 48 of 788 (6.1%) ECs from PORTEC-1 and -2 and were associated with high tumor grade (P < .001). Women with POLE-mutant ECs had fewer recurrences (6.2% vs 14.1%) and EC deaths (2.3% vs 9.7%), though, in the total PORTEC cohort, differences in RFS and CSS were not statistically significant (multivariable-adjusted HR = 0.43, 95% CI = 0.13 to 1.37, P = .15; HR = 0.19, 95% CI = 0.03 to 1.44, P = .11, respectively). However, of 109 grade 3 tumors, 0 of 15 POLE-mutant ECs recurred, compared with 29 of 94 (30.9%) POLE wild-type cancers, reflected in statistically significantly greater RFS (multivariable-adjusted HR = 0.11, 95% CI = 0.001 to 0.84, P = .03). In the additional series, there were no EC-related events in any of 33 POLE-mutant ECs, resulting in a multivariable-adjusted, pooled HR of 0.33 for RFS (95% CI = 0.12 to 0.91, P = .03) and 0.26 for CSS (95% CI = 0.06 to 1.08, P = .06). Conclusion: POLE proofreading mutations predict favorable EC prognosis, independently of other clinicopathological variables, with the greatest effect seen in high-grade tumors. This novel biomarker may help to reduce overtreatment in EC. PMID:25505230
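The pooled estimates above combine study-specific multivariable-adjusted hazard ratios by meta-analysis. A minimal sketch of fixed-effect, inverse-variance pooling on the log hazard ratio scale is given below; the input HRs and confidence intervals are invented for illustration and are not the PORTEC or validation-series values.

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling of hazard ratios
# on the log scale, as used when combining study-specific HRs by meta-analysis.
# The HRs and confidence intervals below are made-up illustrative numbers.
import numpy as np

hrs     = np.array([0.45, 0.30, 0.25])   # study-specific hazard ratios
ci_low  = np.array([0.15, 0.08, 0.05])
ci_high = np.array([1.35, 1.10, 1.20])

log_hr = np.log(hrs)
# Standard error recovered from the 95% CI width on the log scale
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1.0 / se**2                           # inverse-variance weights

pooled_log_hr = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
pooled_hr = np.exp(pooled_log_hr)
ci = np.exp(pooled_log_hr + np.array([-1.96, 1.96]) * pooled_se)
print(f"pooled HR = {pooled_hr:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```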
Weight Cycling and Cancer Incidence in a Large Prospective US Cohort
Stevens, Victoria L.; Jacobs, Eric J.; Patel, Alpa V.; Sun, Juzhong; McCullough, Marjorie L.; Campbell, Peter T.; Gapstur, Susan M.
2015-01-01
Weight cycling, which consists of repeated cycles of intentional weight loss and regain, is common among individuals who try to lose weight. Some evidence suggests that weight cycling may affect biological processes that could contribute to carcinogenesis, but whether it is associated with cancer risk is unclear. Using 62,792 men and 69,520 women enrolled in the Cancer Prevention Study II Nutrition Cohort in 1992, we examined the association between weight cycling and cancer incidence. Weight cycles were defined by using baseline questions that asked the number of times ≥10 pounds (4.54 kg) was purposely lost and later regained. Multivariable-adjusted hazard ratios and 95% confidence intervals for all cancer and 15 individual cancers were estimated by using Cox proportional hazards regression. During up to 17 years of follow-up, 15,333 men and 9,984 women developed cancer. Weight cycling was not associated with overall risk of cancer in men (hazard ratio = 0.96, 95% confidence interval: 0.83, 1.11 for ≥20 cycles vs. no weight cycles) or women (hazard ratio = 0.96, 95% confidence interval: 0.86, 1.08) in models that adjusted for body mass index and other covariates. Weight cycling was also not associated with any individual cancer investigated. These results suggest that weight cycling, independent of body weight, is unlikely to influence subsequent cancer risk. PMID:26209523
Low Bone Density and Bisphosphonate Use and the Risk of Kidney Stones.
Prochaska, Megan; Taylor, Eric; Vaidya, Anand; Curhan, Gary
2017-08-07
Previous studies have demonstrated lower bone density in patients with kidney stones, but no longitudinal studies have evaluated kidney stone risk in individuals with low bone density. Small studies with short follow-up reported reduced 24-hour urine calcium excretion with bisphosphonate use. We examined history of low bone density and bisphosphonate use and the risk of incident kidney stone as well as the association with 24-hour calcium excretion. We conducted a prospective analysis of 96,092 women in the Nurses' Health Study II. We used Cox proportional hazards models to adjust for age, body mass index, thiazide use, fluid intake, supplemental calcium use, and dietary factors. We also conducted a cross-sectional analysis of 2294 participants using multivariable linear regression to compare 24-hour urinary calcium excretion between participants with and without a history of low bone density, and among 458 participants with low bone density, with and without bisphosphonate use. We identified 2564 incident stones during 1,179,860 person-years of follow-up. The multivariable adjusted relative risk for an incident kidney stone for participants with history of low bone density compared with participants without was 1.39 (95% confidence interval [95% CI], 1.20 to 1.62). Among participants with low bone density, the multivariable adjusted relative risk for an incident kidney stone for bisphosphonate users was 0.68 (95% CI, 0.48 to 0.98). In the cross-sectional analysis of 24-hour urine calcium excretion, the multivariable adjusted mean difference in 24-hour calcium was 10 mg/d (95% CI, 1 to 19) higher for participants with history of low bone density. However, among participants with history of low bone density, there was no association between bisphosphonate use and 24-hour calcium with multivariable adjusted mean difference in 24-hour calcium of -2 mg/d (95% CI, -25 to 20). Low bone density is an independent risk factor for incident kidney stone and is associated with higher 24-hour urine calcium excretion. Among participants with low bone density, bisphosphonate use was associated with lower risk of incident kidney stone but was not independently associated with 24-hour urine calcium excretion. Copyright © 2017 by the American Society of Nephrology.
Muscular strength and incident hypertension in normotensive and prehypertensive men.
Maslow, Andréa L; Sui, Xuemei; Colabianchi, Natalie; Hussey, Jim; Blair, Steven N
2010-02-01
The protective effects of cardiorespiratory fitness (CRF) on hypertension (HTN) are well known; however, the association between muscular strength and incidence of HTN has yet to be examined. This study evaluated the strength-HTN association with and without accounting for CRF. Participants were 4147 men (age = 20-82 yr) in the Aerobics Center Longitudinal Study for whom an age-specific composite muscular strength score was computed from measures of a one-repetition maximal leg press and a one-repetition maximal bench press. CRF was quantified by maximal treadmill exercise test time in minutes. Cox proportional hazards regression analysis was used to estimate hazard ratios (HR) and 95% confidence intervals of incident HTN events according to exposure categories. During a mean follow-up of 19 yr, there were 503 incident HTN cases. Multivariable-adjusted (excluding CRF) HRs of HTN in normotensive men comparing the middle- and high-strength thirds to the lowest third were not significant at 1.17 and 0.84, respectively. Multivariable-adjusted (excluding CRF) HRs of HTN in baseline prehypertensive men comparing the middle- and high-strength thirds to the lowest third were significant at 0.73 and 0.72 (P = 0.01 each), respectively. The association between muscular strength and incidence of HTN in baseline prehypertensive men was no longer significant after control for CRF (P = 0.26). The study indicated that middle and high levels of muscular strength were associated with a reduced risk of HTN in prehypertensive men only. However, this relationship was no longer significant after controlling for CRF.
Boudreau, Robert M; Hanlon, Joseph T; Roumani, Yazan F; Studenski, Stephanie A; Ruby, Christine M; Wright, Rollin M; Hilmer, Sarah N; Shorr, Ronald I; Bauer, Douglas C; Simonsick, Eleanor M; Newman, Anne B
2009-10-01
To evaluate whether CNS medication use in older adults was associated with a higher risk of future incident mobility limitation. This 5-year longitudinal cohort study included 3055 participants from the Health, Aging and Body Composition (Health ABC) study who were well-functioning at baseline. CNS medication use (benzodiazepine and opioid receptor agonists, antipsychotics, and antidepressants) was determined yearly (except year 4) during in-home or in-clinic interviews. Summated standardized daily doses (low, medium, and high) and duration of CNS drug use were computed. Incident mobility limitation was operationalized as two consecutive self-reports of having any difficulty walking 1/4 mile or climbing 10 steps without resting every 6 months after baseline. Multivariable Cox proportional hazards analyses were conducted adjusting for demographics, health behaviors, health status, and common indications for CNS medications. Each year at least 13.9% of participants used a CNS medication. By year 6, 49% overall had developed incident mobility limitation. In multivariable models, CNS medication users compared with never users showed a higher risk for incident mobility limitation (adjusted hazard ratio (Adj. HR) 1.28; 95% confidence interval (CI) 1.12-1.47). Similar findings of increased risk were seen in analyses examining dose- and duration-response relationships. CNS medication use is independently associated with an increased risk of future incident mobility limitation in community-dwelling elderly. Further studies are needed to determine the impact of reducing CNS medication exposure on mobility problems. 2009 John Wiley & Sons, Ltd.
Hsiao, Yi-Han; Chen, Yung-Tai; Tseng, Ching-Ming; Wu, Li-An; Perng, Diahn-Warng; Chen, Yuh-Min; Chen, Tzeng-Ji; Chang, Shi-Chuan; Chou, Kun-Ta
2017-10-01
Sleep disorders are common non-motor symptoms in patients with Parkinson's disease. Our study aims to explore the relationship between non-apnea sleep disorders and future Parkinson's disease. This is a cohort study using a nationwide database. The participants were recruited from the Taiwan National Health Insurance Research Database between 2000 and 2003. A total of 91 273 adult patients who had non-apnea sleep disorders without pre-existing Parkinson's disease were enrolled. An age-, gender-, income-, urbanization- and Charlson comorbidity index score-matched control cohort consisting of 91 273 participants was selected for comparison. The two cohorts were followed for the occurrence of Parkinson's disease, death or until the end of 2010, whichever came first. The Kaplan-Meier analyses revealed patients with non-apnea sleep disorders tended to develop Parkinson's disease (log-rank test, P < 0.001). After a multivariate adjustment in a Cox regression model, non-apnea sleep disorders was an independent risk factor for the development of Parkinson's disease [crude hazard ratio: 1.63, 95% confidence interval (CI): 1.54-1.73, P < 0.001; adjusted hazard ratio: 1.18, 95% CI: 1.11-1.26, P < 0.001]. In the subgroup analysis, patients with chronic insomnia (lasting more than 3 months) had the greatest risk (crude hazard ratio: 2.91, 95% CI: 2.59-3.26, P < 0.001; adjusted hazard ratio: 1.37, 95% CI: 1.21-1.55, P < 0.001). In conclusion, this study revealed that non-apnea sleep disorders, especially chronic insomnia, are associated with a higher risk for future Parkinson's disease. © 2017 European Sleep Research Society.
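The survival comparison described above (Kaplan-Meier analysis with a log-rank test, followed by Cox regression) can be sketched as follows with the lifelines library; the synthetic data, effect sizes, and labels are assumptions for illustration only, not the Taiwan National Health Insurance analysis.

```python
# Minimal sketch of a Kaplan-Meier / log-rank comparison between an exposed
# cohort and matched controls, using lifelines on synthetic data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
n = 1000
exposed = rng.integers(0, 2, n).astype(bool)           # non-apnea sleep disorder cohort (illustrative)
time = rng.exponential(8, n)                           # years to Parkinson's disease or censoring
event = rng.random(n) < np.where(exposed, 0.15, 0.10)  # True = incident Parkinson's disease

kmf_exp = KaplanMeierFitter()
kmf_exp.fit(time[exposed], event[exposed], label="sleep disorder")
kmf_ctrl = KaplanMeierFitter()
kmf_ctrl.fit(time[~exposed], event[~exposed], label="matched control")

# Event-free probability at 8 years in each group, plus the log-rank p-value
print(kmf_exp.predict(8.0), kmf_ctrl.predict(8.0))
result = logrank_test(time[exposed], time[~exposed],
                      event_observed_A=event[exposed], event_observed_B=event[~exposed])
print(result.p_value)
```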
Lagergren, Emily; Kempe, Kelly; Craven, Timothy E; Kornegay, Susan T; Hurie, Justin B; Garg, Nitin; Velazquez-Ramirez, Gabriela; Edwards, Matthew S; Corriere, Matthew A
2017-10-01
Outcome disparities associated with lower extremity bypass (LEB) for peripheral artery disease (PAD) have been identified but are poorly understood. Marital status may affect outcomes through factors related to health risk behaviors, adherence, and access to care but has not been characterized as a predictor of surgical outcomes and is often omitted from administrative data sets. We evaluated associations between marital status and vein graft patency following LEB using multivariable models adjusting for established risk factors. Consecutive patients undergoing autogenous LEB for PAD were identified and analyzed. Survival analysis and Cox proportional hazards models were used to evaluate patency stratified by marital status (married versus single, divorced, or widow[er]) adjusting for demographic, comorbidity, and anatomic factors in multivariable models. Seventy-three participants who underwent 79 autogenous vein LEB had complete data and were analyzed. Forty-three patients (58.9%) were married, and 30 (41.1%) were unmarried. Compared with unmarried patients, married patients were older at the time of their bypass procedure (67.3 ± 10.8 years vs. 62.2 ± 10.6 years; P = 0.05). Married patients also had a lower prevalence of female gender (11.6% vs. 33.3%; P = 0.02). Diabetes, hypertension, hyperlipidemia, and smoking were common among both married and unmarried patients. Minimum great saphenous vein conduit diameters were larger in married versus unmarried patients (2.82 ± 0.57 mm vs. 2.52 ± 0.65 mm; P = 0.04). Twenty-four-month primary patency was 66% for married versus 38% for unmarried patients. In a multivariable proportional hazards model adjusting for proximal and distal graft inflow/outflow, medications, gender, age, race, smoking, diabetes, and minimum vein graft diameter, married status was associated with superior primary patency (hazard ratio [HR] = 0.33; 95% confidence limits [0.11, 0.99]; P = 0.05); other predictive covariates included preoperative antiplatelet therapy (HR = 0.27; 95% confidence limits [0.10, 0.74]; P = 0.01) and diabetes (HR = 2.56; 95% confidence limits [0.93-7.04]; P = 0.07). Marital status is associated with vein graft patency following LEB. Further investigation into the mechanistic explanation for improved patency among married patients may provide insight into social or behavioral factors influencing other disparities associated with LEB outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.
Lebwohl, Benjamin; Cao, Yin; Zong, Geng; Hu, Frank B; Green, Peter H R; Neugut, Alfred I; Rimm, Eric B; Sampson, Laura; Dougherty, Lauren W; Giovannucci, Edward; Willett, Walter C; Sun, Qi; Chan, Andrew T
2017-05-02
Objective To examine the association of long term intake of gluten with the development of incident coronary heart disease. Design Prospective cohort study. Setting and participants 64 714 women in the Nurses' Health Study and 45 303 men in the Health Professionals Follow-up Study without a history of coronary heart disease who completed a 131 item semiquantitative food frequency questionnaire in 1986 that was updated every four years through 2010. Exposure Consumption of gluten, estimated from food frequency questionnaires. Main outcome measure Development of coronary heart disease (fatal or non-fatal myocardial infarction). Results During 26 years of follow-up encompassing 2 273 931 person years, 2431 women and 4098 men developed coronary heart disease. Compared with participants in the lowest fifth of gluten intake, who had a coronary heart disease incidence rate of 352 per 100 000 person years, those in the highest fifth had a rate of 277 events per 100 000 person years, leading to an unadjusted rate difference of 75 (95% confidence interval 51 to 98) fewer cases of coronary heart disease per 100 000 person years. After adjustment for known risk factors, participants in the highest fifth of estimated gluten intake had a multivariable hazard ratio for coronary heart disease of 0.95 (95% confidence interval 0.88 to 1.02; P for trend=0.29). After additional adjustment for intake of whole grains (leaving the remaining variance of gluten corresponding to refined grains), the multivariate hazard ratio was 1.00 (0.92 to 1.09; P for trend=0.77). In contrast, after additional adjustment for intake of refined grains (leaving the variance of gluten intake correlating with whole grain intake), estimated gluten consumption was associated with a lower risk of coronary heart disease (multivariate hazard ratio 0.85, 0.77 to 0.93; P for trend=0.002). Conclusion Long term dietary intake of gluten was not associated with risk of coronary heart disease. However, the avoidance of gluten may result in reduced consumption of beneficial whole grains, which may affect cardiovascular risk. The promotion of gluten-free diets among people without celiac disease should not be encouraged. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Parental intermittent claudication as risk factor for claudication in adults.
Prushik, Scott G; Farber, Alik; Gona, Philimon; Shrader, Peter; Pencina, Michael J; D'Agostino, Ralph B; Murabito, Joanne M
2012-03-01
Little is known about the familial aggregation of intermittent claudication (IC). Our objective was to examine whether parental IC increased the risk of IC in adult offspring, independent of the established cardiovascular risk factors. We evaluated the Offspring Cohort Participants of the Framingham Heart Study who were ≥30 years old, cardiovascular disease free, and had both parents enrolled in the Framingham Heart Study (n = 2,970 unique participants, 53% women). Pooled proportional hazards regression analysis was used to examine whether the 12-year risk of incident IC in offspring participants was associated with parental IC, adjusting for age, gender, diabetes, smoking, systolic blood pressure, total cholesterol, high-density lipoprotein cholesterol, and antihypertensive and lipid treatment. Of the 909 person-examinations in the parental IC history group and 5,397 person-examinations in the no-parental IC history group, there were 101 incident IC events (29 with parental IC history and 72 without a parental IC history) during follow-up. The age- and gender-adjusted 12-year cumulative incidence rate per 1,000 person-years was 5.08 (95% confidence interval [CI] 2.74 to 7.33) and 2.34 (95% CI 1.46 to 3.19) in participants with and without a parental IC history. A parental history of IC significantly increased the risk of incident IC in the offspring (multivariable adjusted hazard ratio 1.81, 95% CI 1.14 to 2.88). The hazard ratio was unchanged, with an adjustment for the occurrence of cardiovascular disease (hazard ratio 1.83, 95% CI 1.15 to 2.91). In conclusion, IC in parents increases the risk of IC in adult offspring, independent of the established risk factors. These data suggest a genetic component of peripheral artery disease and support future research into genetic causes. Copyright © 2012 Elsevier Inc. All rights reserved.
van Londen, Marco; Aarts, Brigitte M; Deetman, Petronella E; van der Weijden, Jessica; Eisenga, Michele F; Navis, Gerjan; Bakker, Stephan J L; de Borst, Martin H
2017-08-07
Hypophosphatemia is common in the first year after kidney transplantation, but its clinical implications are unclear. We investigated the relationship between the severity of post-transplant hypophosphatemia and mortality or death-censored graft failure in a large cohort of renal transplant recipients with long-term follow-up. We performed a longitudinal cohort study in 957 renal transplant recipients who were transplanted between 1993 and 2008 at a single center. We used a large real-life dataset containing 28,178 phosphate measurements (median of 27 [first to third quartiles, 23-34] serial measurements per patient) and selected the lowest intraindividual phosphate level during the first year after transplantation. The primary outcomes were all-cause mortality, cardiovascular mortality, and death-censored graft failure. The median (interquartile range) intraindividual lowest phosphate level was 1.58 (1.30-1.95) mg/dl, and it was reached at 33 (21-51) days post-transplant. eGFR was the main correlate of the lowest serum phosphate level (model R2=0.32). During 9 (5-12) years of follow-up, 181 (19%) patients developed graft failure, and 295 (35%) patients died, of which 94 (32%) deaths were due to cardiovascular disease. In multivariable Cox regression analysis, more severe hypophosphatemia was associated with a lower risk of death-censored graft failure (fully adjusted hazard ratio, 0.61; 95% confidence interval, 0.43 to 0.88 per 1 mg/dl lower serum phosphate) and cardiovascular mortality (fully adjusted hazard ratio, 0.37; 95% confidence interval, 0.22 to 0.62) but not noncardiovascular mortality (fully adjusted hazard ratio, 1.33; 95% confidence interval, 0.9 to 1.96) or all-cause mortality (fully adjusted hazard ratio, 1.15; 95% confidence interval, 0.81 to 1.61). Post-transplant hypophosphatemia develops early after transplantation. These data connect post-transplant hypophosphatemia with favorable long-term graft and patient outcomes. Copyright © 2017 by the American Society of Nephrology.
Zhang, Kun; Gao, Baoshan; Wang, Yuantao; Wang, Gang; Wang, Weigang; Zhu, Yaxiang; Yao, Liyu; Gu, Yiming; Chen, Mo; Zhou, Honglan; Fu, Yaowen
2015-01-01
Since the association between serum uric acid and kidney transplant graft outcome remains disputed, we sought to evaluate the predictive value of uric acid level for graft survival/function and the factors that affect uric acid over time. A consecutive cohort of 573 recipients transplanted between January 2008 and December 2011 was recruited. Data and laboratory values of interest were collected at 1, 3, 6, 12, 24 and 36 months post-transplant for analysis. Cox proportional hazards models and multiple regression equations were built to adjust for possible confounding variables as appropriate. Mean follow-up was 41.86 ± 15.49 months. Uric acid level was negatively associated with eGFR at each time point after adjustment for age, body mass index and male gender (standardized β ranging from -0.15 to -0.30, all P<0.001). Male gender, low eGFR, high triglyceride (TG) levels, use of CSA, diuretics or RAS inhibitors, at least one episode of acute rejection, and diabetes were associated with a higher mean uric acid level. Hyperuricemia was a significant independent predictor of pure graft failure after adjustment (hazard ratio=4.01, 95% CI: 1.25-12.91, P=0.02), but it was no longer an independent risk factor for graft loss after adjustment. Interestingly, a higher triglyceride level was associated with a greater likelihood of graft loss (hazard ratio=1.442 per millimole per liter increase, 95% CI: 1.008-2.061, P=0.045) and death (hazard ratio=1.717, 95% CI: 1.105-2.665, P=0.016). The results of our study suggest that an elevated post-transplant serum uric acid level is an independent predictor of long-term graft survival and graft function. Together with the impact of high TG levels on poor outcomes, further investigation of therapeutic implications is needed. PMID:26208103
Bolland, Mark J; Grey, Andrew; Gamble, Greg D; Reid, Ian R
2015-01-01
Observational studies (OS) and randomized controlled trials (RCTs) often report discordant results. In the Women's Health Initiative Calcium and Vitamin D (WHI CaD) RCT, women were randomly assigned to CaD or placebo, but were permitted to use personal calcium and vitamin D supplements, creating a unique opportunity to compare results from randomized and observational analyses within the same study. WHI CaD was a 7-year RCT of 1 g calcium/400 IU vitamin D daily in 36,282 post-menopausal women. We assessed the effects of CaD on cardiovascular events, death, cancer and fracture in a randomized design, comparing CaD with placebo in the 43% of women not using personal calcium or vitamin D supplements, and in an observational design, comparing women in the placebo group (44%) using personal calcium and vitamin D supplements with non-users. Incidence was assessed using Cox proportional hazards models, and results from the two study designs were deemed concordant if the absolute difference in hazard ratios was ≤0.15. We also compared results from WHI CaD to those from the WHI Observational Study (WHI OS), which used similar methodology for analyses and recruited from the same population. In WHI CaD, for myocardial infarction and stroke, results of unadjusted and 6/8 covariate-controlled observational analyses (age-adjusted, multivariate-adjusted, propensity-adjusted, propensity-matched) were not concordant with the randomized design results. For death, hip and total fracture, colorectal and total cancer, unadjusted and covariate-controlled observational results were concordant with randomized results. For breast cancer, unadjusted and age-adjusted observational results were concordant with randomized results, but only 1/3 of other covariate-controlled observational results were concordant with randomized results. Multivariate-adjusted results from WHI OS were concordant with randomized WHI CaD results for only 4/8 endpoints. Results of randomized analyses in WHI CaD were concordant with observational analyses for 5/8 endpoints in WHI CaD and 4/8 endpoints in WHI OS.
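The concordance rule used above is simple arithmetic on the two hazard ratios; a small sketch, with invented example values, is given below.

```python
# Minimal sketch of the concordance rule described above: randomized and
# observational hazard ratios are called concordant when the absolute
# difference between them is at most 0.15. The example values are invented.
endpoints = {
    "myocardial infarction": (1.05, 0.85),  # (randomized HR, observational HR), illustrative
    "hip fracture": (0.92, 0.88),
}
for name, (hr_rct, hr_obs) in endpoints.items():
    diff = abs(hr_rct - hr_obs)
    print(f"{name}: |{hr_rct} - {hr_obs}| = {diff:.2f}, concordant = {diff <= 0.15}")
```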
Correlates of household seismic hazard adjustment adoption.
Lindell, M K; Whitney, D J
2000-02-01
This study examined the relationships of self-reported adoption of 12 seismic hazard adjustments (pre-impact actions to reduce danger to persons and property) with respondents' demographic characteristics, perceived risk, perceived hazard knowledge, perceived protection responsibility, and perceived attributes of the hazard adjustments. Consistent with theoretical predictions, perceived attributes of the hazard adjustments differentiated among the adjustments and had stronger correlations with adoption than any of the other predictors. These results identify the adjustments and attributes that emergency managers should address to have the greatest impact on improving household adjustment to earthquake hazard.
Tanaka, Shin-ichiro; Yasuda, Tomoyuki; Ishida, Tatsuro; Fujioka, Yoshio; Tsujino, Takeshi; Miki, Tetsuo; Hirata, Ken-ichi
2013-05-01
Lecithin:cholesterol acyltransferase (LCAT) is thought to be important in reverse cholesterol transport. However, its association with coronary heart disease (CHD) and sudden death is controversial. We prospectively studied 1927 individuals from the general population. Serum concentrations of apolipoprotein A-I, A-II, B, C-II, C-III, E, and LCAT activity measured as a serum cholesterol esterification rate were evaluated. We documented 61 events of CHD and sudden death during 10.9 years of follow-up. After adjustment for age and sex, LCAT activity was significantly associated with the risk of CHD and sudden death (hazard ratio, 3.02; 95% confidence interval, 1.49-6.12; P=0.002). In multivariate analysis adjusted for age, sex, current smoking status, history of diabetes mellitus, body mass index, systolic blood pressure, serum total cholesterol, and serum high-density lipoprotein cholesterol concentrations, the hazard ratio of LCAT activity for the risk of CHD and sudden death remained significant (hazard ratio, 3.07; 95% confidence interval, 1.35-7.01; P=0.008). However, when it was analyzed for men and women separately, this association remained significant only in women. Increased LCAT activity measured as a serum cholesterol esterification rate was a risk for CHD and sudden death in a Japanese general population.
Chronic Pancreatitis Correlates With Increased Risk of Cerebrovascular Disease
Wong, Tuck-Siu; Liao, Kuan-Fu; Lin, Chi-Ming; Lin, Cheng-Li; Chen, Wen-Chi; Lai, Shih-Wei
2016-01-01
Abstract The aim of this study is to explore whether there is a relationship between chronic pancreatitis and cerebrovascular disease in Taiwan. Using the claims data of the Taiwan National Health Insurance Program, we identified 16,672 subjects aged 20 to 84 years with a new diagnosis of chronic pancreatitis from 2000 to 2010 as the chronic pancreatitis group. We randomly selected 65,877 subjects aged 20 to 84 years without chronic pancreatitis as the nonchronic pancreatitis group. Both groups were matched by sex, age, comorbidities, and the index year of diagnosing chronic pancreatitis. The incidence of cerebrovascular disease at the end of 2011 was measured. The multivariable Cox proportional hazards regression model was used to measure the hazard ratio (HR) and 95% confidence interval (CI) for cerebrovascular disease risk associated with chronic pancreatitis and other comorbidities. The overall incidence of cerebrovascular disease was 1.24-fold greater in the chronic pancreatitis group than that in the nonchronic pancreatitis group (14.2 vs. 11.5 per 1000 person-years, 95% CI = 1.19–1.30). After controlling for confounding factors, the adjusted HR of cerebrovascular disease was 1.27 (95% CI = 1.19–1.36) for the chronic pancreatitis group as compared with the nonchronic pancreatitis group. Woman (adjusted HR = 1.41, 95% CI = 1.31–1.51), age (every 1 year, HR = 1.04, 95% CI = 1.04–1.05), atrial fibrillation (adjusted HR = 1.23, 95% CI = 1.02–1.48), chronic kidney disease (adjusted HR = 1.48, 95% CI = 1.31–1.67), chronic obstructive pulmonary disease (adjusted HR = 1.27, 95% CI = 1.16–1.40), diabetes mellitus (adjusted HR = 1.82, 95% CI = 1.72–1.92), hypertension (adjusted HR = 1.66, 95% CI = 1.56–1.76), and peripheral atherosclerosis (adjusted HR = 1.26, 95% CI = 1.06–1.51) were other factors significantly associated with cerebrovascular disease. Chronic pancreatitis is associated with increased hazard of subsequent cerebrovascular disease. PMID:27082563
A Concept–Wide Association Study of Clinical Notes to Discover New Predictors of Kidney Failure
Betensky, Rebecca A.; Wright, Adam; Curhan, Gary C.; Bates, David W.; Waikar, Sushrut S.
2016-01-01
Background and objectives Identifying predictors of kidney disease progression is critical toward the development of strategies to prevent kidney failure. Clinical notes provide a unique opportunity for big data approaches to identify novel risk factors for disease. Design, setting, participants, & measurements We used natural language processing tools to extract concepts from the preceding year’s clinical notes among patients newly referred to a tertiary care center’s outpatient nephrology clinics and retrospectively evaluated these concepts as predictors for the subsequent development of ESRD using proportional subdistribution hazards (competing risk) regression. The primary outcome was time to ESRD, accounting for a competing risk of death. We identified predictors from univariate and multivariate (adjusting for Tangri linear predictor) models using a 5% threshold for false discovery rate (q value <0.05). We included all patients seen by an adult outpatient nephrologist between January 1, 2004 and June 18, 2014 and excluded patients seen only by transplant nephrology, with preexisting ESRD, with fewer than five clinical notes, with no follow-up, or with no baseline creatinine values. Results Among the 4013 patients selected in the final study cohort, we identified 960 concepts in the unadjusted analysis and 885 concepts in the adjusted analysis. Novel predictors identified included high–dose ascorbic acid (adjusted hazard ratio, 5.48; 95% confidence interval, 2.80 to 10.70; q<0.001) and fast food (adjusted hazard ratio, 4.34; 95% confidence interval, 2.55 to 7.40; q<0.001). Conclusions Novel predictors of human disease may be identified using an unbiased approach to analyze text from the electronic health record. PMID:27927892
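The study above screens hundreds of extracted concepts and retains those with a false discovery rate below 5%. A minimal sketch of Benjamini-Hochberg q-value selection follows; it is not the study's actual pipeline, and the concept names and p-values are invented for illustration.

```python
# Minimal sketch (illustrative only): converting per-concept p-values into
# Benjamini-Hochberg q-values and keeping concepts with q < 0.05.
import numpy as np
from statsmodels.stats.multitest import multipletests

concepts = ["high-dose ascorbic acid", "fast food", "concept_3", "concept_4"]
p_values = np.array([1e-6, 5e-6, 0.03, 0.40])   # hypothetical model p-values

reject, q_values, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
for name, q, keep in zip(concepts, q_values, reject):
    print(f"{name}: q = {q:.2g}, selected = {keep}")
```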
Epstein, Noam U; Guo, Rong; Farlow, Martin R; Singh, Jaswinder P; Fisher, Morris
2014-02-01
Falls are common in the elderly, especially in those with cognitive impairment. The elderly are often treated with several medications, which may have both beneficial and deleterious effects. The use and type of medication in Alzheimer's disease (AD) patients and association with falls is limited. We examined the association between falls and medication use in the Alzheimer's Disease Neuroimaging Initiative (ADNI). Diagnosis, demographics, medication use, apolipoprotein E4 allele status and functional activity level at baseline were gathered for 810 participants enrolled in the ADNI, including healthy controls and subjects with mild cognitive impairment or Alzheimer's. Reports detailing adverse event falls were tabulated. Baseline characteristics were compared between subjects with and without one or more falls. Cox proportional hazards models were conducted to evaluate the association between subject characteristics and hazard of the first fall. Age (p < 0.0001), Functional Activities Questionnaire (p = 0.035), Beers List (p = 0.0477) and medications for treating cognitive symptoms of Alzheimer's (p = 0.0019) were associated with hazard of fall in the univariate model. In the final multivariate model, after adjusting for covariates, Alzheimer's medication use (p = 0.0005) was associated with hazard of fall. Medication was changed by the clinician after an adverse fall event in 9% of the falls. About 7% of the falls were reported as serious adverse events and 6% were reported to be severe. We found a significant association between the use of symptomatic medication treating cognitive symptoms in AD and hazard of fall after adjusting for age and Beers List medication use. Additional pharmacovigilance of the association between falls and Alzheimer's medication use is warranted.
Ishikawa, Joji; Ishikawa, Shizukiyo; Kario, Kazuomi
2015-03-01
We attempted to evaluate whether subjects who exhibit prolonged corrected QT (QTc) interval (≥440 ms in men and ≥460 ms in women) on ECG, with and without ECG-diagnosed left ventricular hypertrophy (ECG-LVH; Cornell product, ≥244 mV×ms), are at increased risk of stroke. Among the 10 643 subjects, there were a total of 375 stroke events during the follow-up period (128.7±28.1 months; 114 142 person-years). The subjects with prolonged QTc interval (hazard ratio, 2.13; 95% confidence interval, 1.22-3.73) had an increased risk of stroke even after adjustment for ECG-LVH (hazard ratio, 1.71; 95% confidence interval, 1.22-2.40). When we stratified the subjects into those with neither a prolonged QTc interval nor ECG-LVH, those with a prolonged QTc interval but without ECG-LVH, and those with ECG-LVH, multivariate-adjusted Cox proportional hazards analysis demonstrated that the subjects with prolonged QTc intervals but not ECG-LVH (1.2% of all subjects; incidence, 10.7%; hazard ratio, 2.70, 95% confidence interval, 1.48-4.94) and those with ECG-LVH (incidence, 7.9%; hazard ratio, 1.83; 95% confidence interval, 1.31-2.57) had an increased risk of stroke events, compared with those with neither a prolonged QTc interval nor ECG-LVH. In conclusion, prolonged QTc interval was associated with stroke risk even among patients without ECG-LVH in the general population. © 2014 American Heart Association, Inc.
Farré, Núria; Aranyó, Júlia; Enjuanes, Cristina; Verdú-Rotellar, José María; Ruiz, Sonia; Gonzalez-Robledo, Gina; Meroño, Oona; de Ramon, Marta; Moliner, Pedro; Bruguera, Jordi; Comin-Colet, Josep
2015-02-15
Obese patients with chronic heart failure (HF) have better outcomes than their lean counterparts, although little is known about the pathophysiology of this obesity paradox. Our aim was to evaluate the hypothesis that patients with chronic HF and obesity (defined as body mass index (BMI) ≥30 kg/m2) may have an attenuated neurohormonal activation in comparison with non-obese patients. The present study is a post-hoc analysis of a cohort of 742 chronic HF patients from a single-center study evaluating sympathetic activation by measuring baseline levels of norepinephrine (NE). Obesity was present in 33% of patients. Higher BMI and obesity were significantly associated with lower NE levels in multivariable linear regression models adjusted for covariates (p<0.001). Addition of NE to multivariate Cox proportional hazards models attenuated the prognostic impact of BMI on outcomes. Finally, when we explored the prognostic impact of raised NE levels (>70th percentile) in separate analyses of obese and non-obese patients, we found that in both groups NE remained a significant independent predictor of poorer outcomes, despite the lower NE levels in patients with chronic HF and obesity: all-cause mortality hazard ratio=2.37 (95% confidence interval, 1.14-4.94) and hazard ratio=1.59 (95% confidence interval, 1.05-2.4) in obese and non-obese patients respectively; and cardiovascular mortality hazard ratio=3.08 (95% confidence interval, 1.05-9.01) in obese patients and hazard ratio=2.08 (95% confidence interval, 1.42-3.05) in non-obese patients. Patients with chronic HF and obesity have significantly lower sympathetic activation. This finding may partially explain the obesity paradox described in chronic HF patients. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Risk of fall-related injury in people with lower limb amputations: A prospective cohort study.
Wong, Christopher Kevin; Chihuri, Stanford T; Li, Guohua
2016-01-01
To assess fall-related injury risk and risk factors in people with lower limb amputation. Prospective longitudinal cohort with follow-up every 6 months for up to 41 months. Community-dwelling adults with lower limb amputations of any etiology and level recruited from support groups and prosthetic clinics. Demographic and clinical characteristics were obtained by self-reported questionnaire and telephone or in-person follow-up. Fall-related injury incidence requiring medical care per person-month and the adjusted hazard ratio of fall-related injury were calculated using multivariable proportional hazards regression modeling. A total of 41 subjects, with 782 follow-up person-months in total, had 11 fall-related injury incidents (14.1/1,000 person-months). During follow-up, 56.1% of subjects reported falling and 26.8% reported fall-related injury. Multivariable proportional hazards modeling showed that women were nearly 6 times as likely as men to experience fall-related injury and people of non-white race were 13 times as likely as people of white race to experience fall-related injury. The final predictive model also included vascular amputation and age. Risk of fall-related injury requiring medical care in people with lower limb amputation appears to be higher than in older adult inpatients. Intervention programs to prevent fall-related injury in people with lower limb amputation should target women and racial minorities.
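The incidence figure quoted above (11 injury events over 782 person-months, i.e. 14.1 per 1,000 person-months) is a simple person-time rate; the sketch below reproduces that arithmetic and adds an exact Poisson confidence interval, which is an added illustration rather than a number reported by the study.

```python
# Minimal sketch of a person-time incidence rate with an exact (Poisson) 95% CI.
# Event and person-time counts are those quoted in the abstract above; the CI is illustrative.
from scipy.stats import chi2

events = 11
person_months = 782

rate = events / person_months * 1000
# Exact Poisson 95% CI for the event count, scaled to a rate per 1,000 person-months
lower = chi2.ppf(0.025, 2 * events) / 2 / person_months * 1000
upper = chi2.ppf(0.975, 2 * (events + 1)) / 2 / person_months * 1000
print(f"{rate:.1f} per 1,000 person-months (95% CI {lower:.1f} to {upper:.1f})")
```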
Williams, Jessica N.; Rai, Ashish; Lipscomb, Joseph; Koff, Jean L.; Nastoupil, Loretta J.; Flowers, Christopher R.
2015-01-01
Background Although rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP) is considered standard therapy for diffuse large B-cell lymphoma (DLBCL), patterns of use and the impact of R-CHOP on survival in patients >80 years are less clear. Methods We used the Surveillance, Epidemiology, and End Results (SEER)-Medicare database to characterize presentation, treatment, and survival patterns in DLBCL patients diagnosed from 2002–2009. Chi-squared tests compared characteristics and initial treatments of DLBCL patients >80 years and ≤80 years. Multivariable logistic regression models examined factors associated with treatment selection in patients >80 years; standard and propensity score-adjusted multivariable Cox proportional hazards models examined relationships between treatment regimen, treatment duration, and survival. Results Among 4,635 patients with DLBCL, 1,156 (25%) were >80 years. Patients >80 were less likely to receive R-CHOP and more likely to be observed or receive rituximab, cyclophosphamide, vincristine, and prednisone (R-CVP); both p<0.0001. Marital status, stage, disease site, performance status, radiation therapy, and growth factor support were associated with initial R-CHOP in patients >80. In propensity score-matched multivariable Cox proportional hazards models examining relationships between treatment regimen and survival, R-CHOP was the only regimen associated with improved OS (hazard ratio (HR) = 0.45, 95% confidence interval (CI) = 0.33–0.62) and LRS (HR=0.58, 95% CI 0.38–0.88). Conclusions Although DLBCL patients >80 years were less likely to receive R-CHOP, this regimen conferred the longest survival and should be considered for this population. Further studies are needed to characterize the impact of DLBCL treatment on quality of life in this age group. PMID:25675909
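The propensity score-adjusted Cox models described above can be sketched roughly as follows: estimate each patient's probability of receiving the regimen from baseline covariates, then include that score alongside the treatment indicator in the survival model. The code below is an illustrative sketch on synthetic data, not the SEER-Medicare analysis; all variable names are assumptions.

```python
# Minimal sketch (illustrative only): a propensity score estimated by logistic
# regression, used as an adjustment covariate in a Cox model for survival.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1500
df = pd.DataFrame({
    "age": rng.normal(84, 3, n),
    "stage": rng.integers(1, 5, n).astype(float),
    "poor_performance": rng.integers(0, 2, n).astype(float),
    "treated": rng.integers(0, 2, n).astype(float),  # 1 = received the regimen (hypothetical)
    "months": rng.exponential(24, n),
    "died": rng.integers(0, 2, n),
})

# Step 1: propensity of treatment given baseline covariates
covars = ["age", "stage", "poor_performance"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["treated"])
df["propensity"] = ps_model.predict_proba(df[covars])[:, 1]

# Step 2: Cox model for overall survival, adjusting for the propensity score
cph = CoxPHFitter()
cph.fit(df[["months", "died", "treated", "propensity"]],
        duration_col="months", event_col="died")
cph.print_summary()  # exp(coef) for "treated" is the propensity-adjusted HR
```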
Nuotio, M; Tuominen, P; Luukkaala, T
2016-03-01
We examined the association of nutritional status, as measured by the Mini-Nutritional Assessment Short Form (MNA-SF), with changes in mobility, institutionalization and death after hip fracture. Population-based prospective data were collected on 472 of 693 consecutive hip fracture patients aged 65 years and over between January 2010 and December 2012. Declined versus same or improved mobility level, institutionalization and death during the 4-month follow-up were the outcomes. Age, gender, American Society of Anesthesiologists scores, pre-fracture diagnosis of a memory disorder, mobility level, living arrangements and MNA-SF scores at baseline were the independent variables. Age-adjusted and multivariate logistic regression and Cox proportional hazards models were conducted. At baseline, 41 (9%) patients were malnourished and 200 (42%) patients were at risk of malnutrition according to the MNA-SF. During the follow-up, 90 (19%) had died. In the multivariate Cox proportional hazards model, malnutrition (hazard ratio 2.16; 95% confidence interval (CI) 1.07-4.34) was associated with mortality. In the multivariate binary logistic regression analyses, risk of malnutrition (odds ratio (OR) 2.42; 95% CI 1.25-4.66) and malnutrition (OR 6.10; 95% CI 2.01-18.5) predicted institutionalization. Risk of malnutrition (OR 2.03; 95% CI 1.24-3.31) was associated with decline in mobility level. Malnutrition or risk of malnutrition as measured by the MNA-SF was an independent predictor of negative outcomes after hip fracture. Patients classified as being at risk of malnutrition by the MNA-SF may constitute a patient population with mild-to-moderate malnutrition and may require specific attention when nutritional interventions are designed after hip fracture.
Chiu, Hsien-Yi; Wang, I-Ting; Huang, Weng-Foung; Tsai, Yi-Wen; Shiu, Ming-Neng; Tsai, Tsen-Fang
2017-05-01
Avascular necrosis (AVN) and psoriasis have some pathogenic mechanisms and associated conditions in common. To examine the association between psoriasis and AVN. This study used data from the Taiwan National Health Insurance Research Database for the period 2004-2006 and identified 28,268 patients with psoriasis, who were then matched for age and sex with 113,072 controls without psoriasis from the Taiwan Longitudinal Health Insurance Database 2000. Multivariate Cox proportional hazards models were used for the analysis. The unadjusted risk of AVN was significantly higher for patients with psoriasis than for controls (hazard ratio [HR] 2.29) and remained significant after adjustment for other risk factors (adjusted HR 1.96; 95% confidence interval 1.62-2.38). The risk for AVN increased in relation to psoriasis severity and was higher for patients with psoriasis and arthritis than for patients without arthritis. The adjusted HRs were higher for male patients than for female patients and for patients younger than 30 years compared with older patients. We lacked information on daily tobacco use, alcohol consumption, and physical activity. The risk for AVN increased with the disease severity of psoriasis. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
Gálvez-Buccollini, Juan A; Paz-Soldán, Valerie A; Herrera, Phabiola M; DeLea, Suzanne; Gilman, Robert H
2009-06-01
To estimate the effect of sex-related alcohol expectancies (SRAE) on hazardous drinking prevalence and examine gender differences in reporting SRAE. Trained research assistants administered part of a questionnaire to 393 men and 400 women between 18 and 30 years old from a peri-urban shantytown in Lima, Peru. The remaining questions were self-administered. Two measuring instruments, one testing for hazardous drinking and one for SRAE, were used. Multivariate data analysis was performed using logistic regression. Based on odds ratios adjusted for socio-demographic variables (age, marital status, education, and employment status) (n = 793), men with one or two SRAE and men with three or more SRAE were 2.3 (95% confidence interval (CI) = 1.4-3.8; p = 0.001) and 3.9 (95% CI = 2.1-7.3; p < 0.001) times more likely than men with no SRAE, respectively, to be hazardous drinkers. Reporting of SRAE was significantly higher in men than in women. In a shantytown in Lima, SRAE is associated with hazardous drinking among men, but not among women, and reporting of SRAE differs by gender.
Diagnosis of pernicious anemia and the risk of pancreatic cancer.
Shah, Pari; Rhim, Andrew D; Haynes, Kevin; Hwang, Wei-Ting; Yang, Yu-Xiao
2014-04-01
A number of studies have demonstrated a trophic effect of gastrin on pancreatic cancer cells in vitro. Pernicious anemia (PA) is a clinical condition characterized by chronic hypergastrinemia. The aim of this study was to determine if PA is a risk factor for pancreatic cancer. This was a retrospective cohort study using The Health Improvement Network database, which contains comprehensive health information on 7.5 million patients in the United Kingdom from 1993 to 2009. All patients with PA in the study cohort were identified and constituted the exposed group. Each exposed patient was matched on practice site, sex, and age with up to 4 unexposed patients without PA. The outcome was incident pancreatic cancer. The hazard ratio and 95% confidence intervals were estimated using multivariable Cox regression analysis. We identified 15,324 patients with PA and 55,094 unexposed patients. Mean follow-up time was similar between groups (exposed 4.31 [SD, 3.38] years, unexposed 4.63 [SD, 3.44] years). The multivariable adjusted hazard ratio for pancreatic cancer associated with PA was 1.16 (95% confidence interval, 0.77-1.76; P = 0.47). There is no significant association between PA and the risk of pancreatic cancer.
Allopurinol use and the risk of acute cardiovascular events in patients with gout and diabetes.
Singh, Jasvinder A; Ramachandaran, Rekha; Yu, Shaohua; Curtis, Jeffrey R
2017-03-14
Few studies, if any, have examined cardiovascular outcomes in patients with diabetes and gout. Both diabetes and gout are risk factors for cardiovascular disease. The objective of this study was to examine the effect of allopurinol on the risk of incident acute cardiovascular events in patients with gout and diabetes. We used the 2007-2010 Multi-Payer Claims Database (MPCD) that linked health plan data from national commercial and governmental insurances, representing beneficiaries with United Healthcare, Medicare, or Medicaid coverage. In patients with gout and diabetes, we assessed current allopurinol use, defined as a new filled prescription for allopurinol, as the main predictor of interest. Our outcome of interest was the occurrence of the first incident hospitalized myocardial infarction (MI) or stroke (composite acute cardiovascular event), after which observations were censored. We employed multivariable-adjusted Cox proportional hazards models that simultaneously adjusted for patient demographics, cardiovascular risk factors and other medical comorbidities. We calculated hazard ratios [HR] (95% confidence intervals [CI]) for incident composite (MI or stroke) acute cardiovascular events. We performed sensitivity analyses that additionally adjusted for the presence of immune diseases and colchicine use, as potential confounders. There were 2,053,185 person days (5621.3 person years) of current allopurinol use and 1,671,583 person days (4576.5 person years) of prior allopurinol use. There were 158 incident MIs or strokes in current and 151 in prior allopurinol users, respectively. Compared with prior allopurinol users, current allopurinol users had a significantly lower adjusted hazard of incident acute cardiovascular events (incident stroke or MI), with an HR of 0.67 (95% CI, 0.53, 0.84). Sensitivity analyses, additionally adjusted for immune diseases or colchicine use, confirmed this association. Current allopurinol use was associated with a lower risk of acute cardiovascular events in patients with gout and diabetes. The underlying mechanisms for this potential cardio-protective effect of allopurinol need further exploration.
Schnabel, Renate B.; Yin, Xiaoyan; Gona, Philimon; Larson, Martin G.; Beiser, Alexa S.; McManus, David D.; Newton-Cheh, Christopher; Lubitz, Steven A.; Magnani, Jared W.; Ellinor, Patrick T.; Seshadri, Sudha; Wolf, Philip A.; Vasan, Ramachandran S.; Benjamin, Emelia J.; Levy, Daniel
2015-01-01
Background Comprehensive long-term data on atrial fibrillation trends in men and women are scant. Methods We investigated trends in atrial fibrillation incidence, prevalence, and risk factors, and in stroke and mortality following its onset in Framingham Heart Study participants (n=9511) from 1958 to 2007. To accommodate sex differences in atrial fibrillation risk factors and disease manifestations, sex-stratified analyses were performed. Findings During 50 years of observation (202,417 person-years), there were 1,544 new-onset atrial fibrillation cases (46.8% women). We observed about a fourfold increase in the age-adjusted prevalence and more than a tripling in age-adjusted incidence of atrial fibrillation (prevalence 20.4 versus 96.2 per 1000 person-years in men; 13.7 versus 49.4 in women; incidence rates in first versus last decade 3.7 versus 13.4 per 1000 person-years in men; 2.5 versus 8.6 in women, p trend <0.0001). For atrial fibrillation diagnosed by ECG during routine Framingham examinations, age-adjusted prevalence increased (12.6 versus 25.7 per 1000 person-years in men; 8.1 versus 11.8 in women, p trend <0.0001). The age-adjusted incidence increased, but did not achieve statistical significance. Although the prevalence of most risk factors changed over time, their associated hazards for atrial fibrillation changed little. Multivariable-adjusted proportional hazards models revealed a 73.5% decline in stroke and a 25.4% decline in mortality following atrial fibrillation onset (p trend = 0.0001 and p trend = 0.003, respectively). Interpretation Our data suggest that observed trends of increased incidence of atrial fibrillation in the community were partially due to enhanced surveillance. Stroke occurrence and mortality following atrial fibrillation onset declined over the decades, and prevalence increased approximately fourfold. The hazards for atrial fibrillation risk factors remained fairly constant. Our data indicate a need for measures to enhance early detection of atrial fibrillation through increased awareness coupled with targeted screening programs, and risk factor-specific prevention. PMID:25960110
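The prevalence and incidence figures above are reported per 1000 person-years by decade and sex. A minimal sketch of that rate calculation is shown below on a hypothetical person-time table; it produces crude rates only, whereas the study's figures are additionally age-adjusted, a step omitted here. Column names are illustrative assumptions.

```python
# Minimal sketch: crude incidence per 1000 person-years by decade and sex (hypothetical data layout).
import pandas as pd

obs = pd.read_csv("person_time.csv")  # assumed columns: decade, sex, person_years, new_af_cases

rates = (
    obs.groupby(["decade", "sex"], as_index=False)
       .agg(person_years=("person_years", "sum"), cases=("new_af_cases", "sum"))
)
rates["incidence_per_1000_py"] = 1000 * rates["cases"] / rates["person_years"]
print(rates)
```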
Zhang, X; Giovannucci, E L; Wu, K; Smith-Warner, S A; Fuchs, C S; Pollak, M; Willett, W C; Ma, J
2012-01-01
Background: Laboratory studies suggest a possible role of magnesium intake in colorectal carcinogenesis but epidemiological evidence is inconclusive. Methods: We tested the magnesium–colorectal cancer hypothesis in the Nurses' Health Study, in which 85 924 women free of cancer in 1980 were followed until June 2008. Cox proportional hazards regression models were used to estimate multivariable relative risks (MV RRs, 95% confidence intervals). Results: In the age-adjusted model, magnesium intake was significantly inversely associated with colorectal cancer risk; the RRs from lowest to highest decile of total magnesium intake were 1.0 (ref), 0.93, 0.81, 0.72, 0.74, 0.77, 0.72, 0.75, 0.80, and 0.67 (Ptrend<0.001). However, in the MV model adjusted for known dietary and non-dietary risk factors for colorectal cancer, the association was significantly attenuated; the MV RRs were 1.0 (ref), 0.96, 0.85, 0.78, 0.82, 0.86, 0.84, 0.91, 1.02, and 0.93 (Ptrend=0.77). Similarly, magnesium intake was significantly inversely associated with concentrations of plasma C-peptide in the age-adjusted model (Ptrend=0.002) but not in the multivariate-adjusted model (Ptrend=0.61). Results did not differ by subsite, nor were they modified by calcium intake or body mass index. Conclusion: These prospective results do not support an independent association of magnesium intake with either colorectal cancer risk or plasma C-peptide levels in women. PMID:22415230
Muscular Strength and Incident Hypertension in Normotensive and Prehypertensive Men
Maslow, Andréa L.; Sui, Xuemei; Colabianchi, Natalie; Hussey, Jim; Blair, Steven N.
2009-01-01
The protective effects of cardiorespiratory fitness (CRF) on hypertension (HTN) are well known; however, the association between muscular strength and incidence of HTN has yet to be examined. Purpose This study evaluated the strength-HTN association with and without accounting for CRF. Methods Participants were 4147 men (20–82 years) in the Aerobics Center Longitudinal Study for whom an age-specific composite muscular strength score was computed from measures of a 1-repetition maximal leg press and a 1-repetition maximal bench press. CRF was quantified by maximal treadmill exercise test time in minutes. Cox proportional hazards regression analysis was used to estimate hazard ratios (HRs) and 95% confidence intervals of incident HTN events according to exposure categories. Results During a mean follow-up of 19 years, there were 503 incident HTN cases. Multivariable-adjusted (excluding CRF) HRs of hypertension in normotensive men comparing the middle and high strength thirds to the lowest third were not significant at 1.17 and 0.84, respectively. Multivariable-adjusted (excluding CRF) HRs of hypertension in baseline prehypertensive men comparing the middle and high strength thirds to the lowest third were significant at 0.73 and 0.72 (p=.01 each), respectively. The association between muscular strength and incidence of HTN in baseline prehypertensive men was no longer significant after control for CRF (p=.26). Conclusions The study indicated that middle and high levels of muscular strength were associated with a reduced risk of HTN in prehypertensive men only. However, this relationship was no longer significant after controlling for CRF. PMID:19927030
Low bone mineral density and risk of incident fracture in HIV-infected adults.
Battalora, Linda; Buchacz, Kate; Armon, Carl; Overton, Edgar T; Hammer, John; Patel, Pragna; Chmiel, Joan S; Wood, Kathy; Bush, Timothy J; Spear, John R; Brooks, John T; Young, Benjamin
2016-01-01
Prevalence rates of low bone mineral density (BMD) and bone fractures are higher among HIV-infected adults compared with the general United States (US) population, but the relationship between BMD and incident fractures in HIV-infected persons has not been well described. Dual energy X-ray absorptiometry (DXA) results of the femoral neck of the hip and clinical data were obtained prospectively during 2004-2012 from participants in two HIV cohort studies. Low BMD was defined by a T-score in the interval >-2.5 to <-1.0 (osteopenia) or ≤-2.5 (osteoporosis). We analysed the association of low BMD with risk of subsequent incident fractures, adjusted for sociodemographics, other risk factors and covariables, using multivariable proportional hazards regression. Among 1,006 participants analysed (median age 43 years [IQR 36-49], 83% male, 67% non-Hispanic white, median CD4(+) T-cell count 461 cells/mm(3) [IQR 311-658]), 36% (n=358) had osteopenia and 4% (n=37) osteoporosis; 67 had a prior fracture documented. During 4,068 person-years of observation after DXA scanning, 85 incident fractures occurred, predominantly rib/sternum (n=18), hand (n=14), foot (n=13) and wrist (n=11). In multivariable analyses, osteoporosis (adjusted hazard ratio [aHR] 4.02, 95% CI 2.02, 8.01) and current/prior tobacco use (aHR 1.59, 95% CI 1.02, 2.50) were associated with incident fracture. In this large sample of HIV-infected adults in the US, low baseline BMD was significantly associated with elevated risk of incident fracture. There is potential value of DXA screening in this population.
SOX9 expression predicts relapse of stage II colon cancer patients.
Marcker Espersen, Maiken Lise; Linnemann, Dorte; Christensen, Ib Jarle; Alamili, Mahdi; Troelsen, Jesper T; Høgdall, Estrid
2016-06-01
The aim of this study was to investigate whether the protein expression of sex-determining region Y-box 9 (SOX9) in primary tumors could predict relapse of stage II colon cancer patients. One hundred forty-four patients with stage II primary colon cancer were retrospectively enrolled in the study. SOX9 expression was evaluated by immunohistochemistry, and mismatch repair status was assessed by both immunohistochemistry and promoter hypermethylation assay. High SOX9 expression at the invasive front was significantly associated with lower risk of relapse when including the SOX9 expression as a continuous variable (from low to high expression) in univariate (hazard ratio [HR], 0.73; 95% confidence interval [CI], 0.56-0.94; P = .01) and multivariate Cox proportional hazards analyses (HR, 0.75; 95% CI, 0.58-0.96; P = .02), adjusting for mismatch repair deficiency and histopathologic risk factors. Conversely, low SOX9 expression at the invasive front was significantly associated with high risk of relapse, when including SOX9 expression as a dichotomous variable, in univariate (HR, 2.32; 95% CI, 1.14-4.69; P = .02) and multivariate analyses (HR, 2.32; 95% CI, 1.14-4.69; P = .02), adjusting for histopathologic risk factors and mismatch repair deficiency. In conclusion, high levels of SOX9 in primary stage II colon tumors predict low risk of relapse, whereas low levels of SOX9 predict high risk of relapse. SOX9 may have an important value as a biomarker when evaluating risk of relapse for personalized treatment. Copyright © 2016 Elsevier Inc. All rights reserved.
Shimazu, T; Wakai, K; Tamakoshi, A; Tsuji, I; Tanaka, K; Matsuo, K; Nagata, C; Mizoue, T; Inoue, M; Tsugane, S; Sasazuki, S
2014-06-01
Prospective evidence is inconsistent regarding the association between vegetable/fruit intake and the risk of gastric cancer. In an analysis of original data from four population-based prospective cohort studies encompassing 191 232 participants, we used Cox proportional hazards regression to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) of gastric cancer incidence according to vegetable and fruit intake and conducted a meta-analysis of HRs derived from each study. During 2 094 428 person-years of follow-up, 2995 gastric cancer cases were identified. After adjustment for potential confounders, we found a marginally significant decrease in gastric cancer risk in relation to total vegetable intake but not total fruit intake: the multivariate-adjusted HR (95% CI; P for trend) for the highest versus the lowest quintile of total vegetable intake was 0.89 (0.77-1.03; P for trend = 0.13) among men and 0.83 (0.67-1.03; P for trend = 0.40) among women. For distal gastric cancer, the multivariate HR for the highest quintile of total vegetable intake was 0.78 (0.63-0.97; P for trend = 0.02) among men. This pooled analysis of data from large prospective studies in Japan suggests that vegetable intake reduces gastric cancer risk, especially the risk of distal gastric cancer among men. © The Author 2014. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Exercise Decreases and Smoking Increases Bladder Cancer Mortality.
Liss, Michael A; White, Martha; Natarajan, Loki; Parsons, J Kellogg
2017-06-01
The aim of this study was to investigate the modifiable lifestyle factors of smoking, exercise, and obesity in relation to bladder cancer mortality. We used mortality-linked data from the National Health Information Survey from 1998 through 2006. The primary outcome was bladder cancer-specific mortality. The primary exposures were self-reported smoking status (never vs. former vs. current smoker), self-reported exercise (dichotomized as "did no exercise" vs. "light, moderate, or vigorous exercise in ≥ 10-minute bouts"), and body mass index. We utilized multivariable adjusted Cox proportional hazards regression models, with delayed entry to account for age at survey interview. Complete data were available on 222,163 participants, of whom 96,715 (44%) were men and 146,014 (66%) were non-Hispanic whites; we identified 83 bladder cancer-specific deaths. In multivariate analyses, individuals who reported any exercise were 47% less likely (adjusted hazard ratio [HRadj], 0.53; 95% confidence interval [CI], 0.29-0.96; P = .038) to die of bladder cancer than those who reported no exercise. Compared with never-smokers, current (HRadj, 4.24; 95% CI, 1.89-9.65; P = .001) and former (HRadj, 2.95; 95% CI, 1.50-5.79; P = .002) smokers were 4 and 3 times more likely, respectively, to die of bladder cancer. There were no significant associations of body mass index with bladder cancer mortality. Exercise decreases and current smoking increases the risk of bladder cancer-specific mortality. These data suggest that exercise and smoking cessation interventions may reduce bladder cancer death. Published by Elsevier Inc.
Farr, Amanda M; Sheehan, John J; Davis, Brian M; Smith, David M
2016-01-01
Adherence and persistence to antidiabetes medications are important to control blood glucose levels among individuals with type 2 diabetes mellitus (T2D). The objective of this study was to compare adherence and persistence over a 12-month period between patients initiating saxagliptin and patients initiating linagliptin, two dipeptidyl peptidase-4 inhibitors. This retrospective cohort study was conducted in the MarketScan® Commercial and Medicare Supplemental claims databases. Patients with T2D initiating saxagliptin or linagliptin between January 1, 2009, and June 30, 2013, were selected. Patients were required to be at least 18 years old and have 12 months of continuous enrollment prior to and following initiation. Adherence and persistence to the initiated medication were measured over the 12 months after initiation using outpatient pharmacy claims. Patients were considered adherent if the proportion of days covered was ≥0.80. Patients were considered nonpersistent (or to have discontinued) if there was a gap of >60 days without initiated medication on hand. Multivariable logistic regression and multivariable Cox proportional hazard models were fit to compare adherence and persistence, respectively, between the two cohorts. There were 21,599 saxagliptin initiators (mean age 55 years; 53% male) and 5,786 linagliptin initiators (mean age 57 years; 54% male) included in the study sample. Over the 12-month follow-up, 46% of saxagliptin initiators and 42% of linagliptin initiators were considered adherent, and 47% of saxagliptin initiators and 51% of linagliptin initiators discontinued their initiated medication. After controlling for patient characteristics, saxagliptin initiation was associated with significantly greater odds of being adherent (adjusted odds ratio = 1.212, 95% CI 1.140-1.289) and significantly lower hazards of discontinuation (adjusted hazard ratio = 0.887, 95% CI 0.850-0.926) compared with linagliptin initiation. Compared with patients with T2D who initiated linagliptin, patients with T2D who initiated saxagliptin had significantly better adherence and persistence.
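The adherence and persistence definitions above (proportion of days covered ≥0.80 over 12 months; discontinuation on a gap of more than 60 days without medication on hand) can be computed from pharmacy fill dates and days supplied. The sketch below is one illustrative implementation on a hypothetical claims table, not the study's actual algorithm; all column names are assumptions.

```python
# Minimal sketch of the adherence (PDC >= 0.80) and persistence (> 60-day gap) measures
# described above, on a hypothetical claims table; not the study's actual algorithm.
import pandas as pd

claims = pd.read_csv("rx_claims.csv", parse_dates=["fill_date"])  # patient_id, fill_date, days_supply
FOLLOW_UP_DAYS = 365
GAP_DAYS = 60

def pdc_and_persistence(fills: pd.DataFrame) -> pd.Series:
    fills = fills.sort_values("fill_date")
    start = fills["fill_date"].iloc[0]                 # index date = first fill
    covered = set()
    for _, row in fills.iterrows():
        day0 = (row["fill_date"] - start).days
        covered.update(range(day0, min(day0 + int(row["days_supply"]), FOLLOW_UP_DAYS)))
    pdc = len(covered) / FOLLOW_UP_DAYS                # proportion of days covered
    # discontinuation: any run of more than GAP_DAYS uncovered days, including the tail of follow-up
    prev, discontinued = -1, False
    for d in sorted(covered) + [FOLLOW_UP_DAYS]:
        if d - prev - 1 > GAP_DAYS:
            discontinued = True
            break
        prev = d
    return pd.Series({"adherent": pdc >= 0.80, "discontinued": discontinued})

results = claims.groupby("patient_id").apply(pdc_and_persistence)
print(results.mean())  # proportion adherent and proportion discontinued
```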
Kurotani, Kayo; Akter, Shamima; Kashino, Ikuko; Goto, Atsushi; Mizoue, Tetsuya; Noda, Mitsuhiko; Sasazuki, Shizuka; Sawada, Norie; Tsugane, Shoichiro
2016-03-22
To examine the association between adherence to the Japanese Food Guide Spinning Top and total and cause specific mortality. Large scale population based prospective cohort study in Japan with follow-up for a median of 15 years. 11 public health centre areas across Japan. 36,624 men and 42,970 women aged 45-75 who had no history of cancer, stroke, ischaemic heart disease, or chronic liver disease. Deaths and causes of death identified with the residential registry and death certificates. Higher scores on the food guide (better adherence) were associated with lower total mortality; the multivariable adjusted hazard ratios (95% confidence interval) of total mortality for the lowest through highest scores were 1.00, 0.92 (0.87 to 0.97), 0.88 (0.83 to 0.93), and 0.85 (0.79 to 0.91) (P<0.001 for trend) and the multivariable adjusted hazard ratio associated with a 10 point increase in food guide scores was 0.93 (0.91 to 0.95; P<0.001 for trend). This score was inversely associated with mortality from cardiovascular disease (hazard ratio associated with a 10 point increase 0.93, 0.89 to 0.98; P=0.005 for trend) and particularly from cerebrovascular disease (0.89, 0.82 to 0.95; P=0.002 for trend). There was some evidence, though not significant, of an inverse association for cancer mortality (0.96, 0.93 to 1.00; P=0.053 for trend). Closer adherence to Japanese dietary guidelines was associated with a lower risk of total mortality and mortality from cardiovascular disease, particularly from cerebrovascular disease, in Japanese adults. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Nadeau-Fredette, Annie-Claire; Hawley, Carmel M.; Pascoe, Elaine M.; Chan, Christopher T.; Clayton, Philip A.; Polkinghorne, Kevan R.; Boudville, Neil; Leblanc, Martine
2015-01-01
Background and objectives Home dialysis is often recognized as a first-choice therapy for patients initiating dialysis. However, studies comparing clinical outcomes between peritoneal dialysis and home hemodialysis have been very limited. Design, setting, participants, & measurements This Australia and New Zealand Dialysis and Transplantation Registry study assessed all Australian and New Zealand adult patients receiving home dialysis on day 90 after initiation of RRT between 2000 and 2012. The primary outcome was overall survival. The secondary outcomes were on-treatment survival, patient and technique survival, and death-censored technique survival. All results were adjusted with three prespecified models: multivariable Cox proportional hazards model (main model), propensity score quintile–stratified model, and propensity score–matched model. Results The study included 10,710 patients on incident peritoneal dialysis and 706 patients on incident home hemodialysis. Treatment with home hemodialysis was associated with better patient survival than treatment with peritoneal dialysis (5-year survival: 85% versus 44%, respectively; log-rank P<0.001). Using multivariable Cox proportional hazards analysis, home hemodialysis was associated with superior patient survival (hazard ratio for overall death, 0.47; 95% confidence interval, 0.38 to 0.59) as well as better on-treatment survival (hazard ratio for on-treatment death, 0.34; 95% confidence interval, 0.26 to 0.45), composite patient and technique survival (hazard ratio for death or technique failure, 0.34; 95% confidence interval, 0.29 to 0.40), and death-censored technique survival (hazard ratio for technique failure, 0.34; 95% confidence interval, 0.28 to 0.41). Similar results were obtained with the propensity score models as well as sensitivity analyses using competing risks models and different definitions for technique failure and lag period after modality switch, during which events were attributed to the initial modality. Conclusions Home hemodialysis was associated with superior patient and technique survival compared with peritoneal dialysis. PMID:26068181
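The prespecified propensity score quintile-stratified model described above combines a treatment-assignment model with a stratified Cox fit. A minimal sketch of that idea follows; the estimator choices, data file, and column names (home_hd, time_years, died, and the confounder list) are hypothetical and do not reproduce the registry analysis.

```python
# Minimal sketch of a propensity score quintile-stratified survival comparison
# (hypothetical data, columns, and confounder list; covariates assumed numeric/encoded).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("dialysis_cohort.csv")
covars = ["age", "sex", "diabetes", "bmi"]  # assumed confounders

# 1. Propensity of receiving home hemodialysis (home_hd = 1) rather than peritoneal dialysis
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["home_hd"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

# 2. Split into propensity score quintiles and fit a Cox model stratified on quintile
df["ps_quintile"] = pd.qcut(df["ps"], 5, labels=False)
cph = CoxPHFitter()
cph.fit(
    df[["time_years", "died", "home_hd", "ps_quintile"]],
    duration_col="time_years",
    event_col="died",
    strata=["ps_quintile"],  # separate baseline hazard per propensity quintile
)
cph.print_summary()  # exp(coef) for home_hd is the quintile-stratified hazard ratio
```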
Association between GFR Estimated by Multiple Methods at Dialysis Commencement and Patient Survival
Wong, Muh Geot; Pollock, Carol A.; Cooper, Bruce A.; Branley, Pauline; Collins, John F.; Craig, Jonathan C.; Kesselhut, Joan; Luxton, Grant; Pilmore, Andrew; Harris, David C.
2014-01-01
Background and objectives The Initiating Dialysis Early and Late study showed that planned early or late initiation of dialysis, based on the Cockcroft and Gault estimation of GFR, was associated with identical clinical outcomes. This study examined the association of all-cause mortality with estimated GFR at dialysis commencement, which was determined using multiple formulas. Design, setting, participants, & measurements Initiating Dialysis Early and Late trial participants were stratified into tertiles according to the estimated GFR measured by Cockcroft and Gault, Modification of Diet in Renal Disease, or Chronic Kidney Disease-Epidemiology Collaboration formula at dialysis commencement. Patient survival was determined using multivariable Cox proportional hazards regression. Results Only Initiating Dialysis Early and Late trial participants who commenced on dialysis were included in this study (n=768). A total of 275 patients died during the study. After adjustment for age, sex, racial origin, body mass index, diabetes, and cardiovascular disease, no significant differences in survival were observed between estimated GFR tertiles determined by Cockcroft and Gault (lowest tertile adjusted hazard ratio, 1.11; 95% confidence interval, 0.82 to 1.49; middle tertile hazard ratio, 1.29; 95% confidence interval, 0.96 to 1.74; highest tertile reference), Modification of Diet in Renal Disease (lowest tertile hazard ratio, 0.88; 95% confidence interval, 0.63 to 1.24; middle tertile hazard ratio, 1.20; 95% confidence interval, 0.90 to 1.61; highest tertile reference), and Chronic Kidney Disease-Epidemiology Collaboration equations (lowest tertile hazard ratio, 0.93; 95% confidence interval, 0.67 to 1.27; middle tertile hazard ratio, 1.15; 95% confidence interval, 0.86 to 1.54; highest tertile reference). Conclusion Estimated GFR at dialysis commencement was not significantly associated with patient survival, regardless of the formula used. However, a clinically important association cannot be excluded, because observed confidence intervals were wide. PMID:24178976
Association between divorce and risks for acute myocardial infarction.
Dupre, Matthew E; George, Linda K; Liu, Guangya; Peterson, Eric D
2015-05-01
Divorce is a major life stressor that can have economic, emotional, and physical health consequences. However, the cumulative association between divorce and risks for acute myocardial infarction (AMI) is unknown. This study investigated the association between lifetime exposure to divorce and the incidence of AMI in US adults. We used nationally representative data from a prospective cohort of ever-married adults aged 45 to 80 years (n=15,827) who were followed biennially from 1992 to 2010. Approximately 14% of men and 19% of women were divorced at baseline and more than one third of the cohort had ≥1 divorce in their lifetime. In 200,524 person-years of follow-up, 8% (n=1211) of the cohort had an AMI and age-specific rates of AMI were consistently higher in those who were divorced compared with those who were continuously married (P<0.05). Results from competing-risk hazard models showed that AMI risks were significantly higher in women who had 1 divorce (hazard ratio, 1.24; 95% confidence interval, 1.01-1.55), ≥2 divorces (hazard ratio, 1.77; 95% confidence interval, 1.30-2.41), and among the remarried (hazard ratio, 1.35; 95% confidence interval, 1.07-1.70) compared with continuously married women after adjusting for multiple risk factors. Multivariable-adjusted risks were elevated only in men with a history of ≥2 divorces (hazard ratio, 1.30; 95% confidence interval, 1.02-1.66) compared with continuously married men. Men who remarried had no significant risk for AMI. Interaction terms for sex were not statistically significant. Divorce is a significant risk factor for AMI. The risks associated with multiple divorces are especially high in women and are not reduced with remarriage. © 2015 American Heart Association, Inc.
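The analysis above relies on competing-risk hazard models, since death without AMI precludes observing an AMI. As a minimal illustration of the competing-risk idea only (not the authors' regression model), the sketch below estimates the cumulative incidence of AMI by group with an Aalen-Johansen estimator; the data file, event coding, and group variable are hypothetical.

```python
# Minimal sketch of the competing-risk idea: Aalen-Johansen cumulative incidence of AMI,
# treating death without AMI as a competing event (hypothetical file, coding, and groups).
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("marital_cohort.csv")
# assumed coding: event = 0 censored, 1 = AMI (event of interest), 2 = death without AMI

for group, sub in df.groupby("divorce_history"):
    ajf = AalenJohansenFitter()
    ajf.fit(sub["years"], sub["event"], event_of_interest=1)
    # cumulative incidence of AMI at the end of follow-up, accounting for competing death
    print(group, ajf.cumulative_density_.iloc[-1, 0])
```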
Wang, Yujie; Tuomilehto, Jaakko; Jousilahti, Pekka; Antikainen, Riitta; Mähönen, Markku; Katzmarzyk, Peter T; Hu, Gang
2010-09-28
The purpose of this study was to examine the association of different levels of occupational, commuting, and leisure-time physical activity and heart failure (HF) risk. The role of different types of physical activity in explaining the risk of HF is not properly established. Study cohorts included 28,334 Finnish men and 29,874 women who were 25 to 74 years of age and free of HF at baseline. Baseline measurement of different types of physical activity was used to predict incident HF. During a mean follow-up of 18.4 years, HF developed in 1,868 men and 1,640 women. The multivariate adjusted (age; smoking; education; alcohol consumption; body mass index; systolic blood pressure; total cholesterol; history of myocardial infarction, valvular heart disease, diabetes, lung disease, and use of antihypertensive drugs; and other types of physical activity) hazard ratios of HF associated with light, moderate, and active occupational activity were 1.00, 0.90, and 0.83 (p = 0.005, for trend) for men and 1.00, 0.80, and 0.92 (p = 0.007, for trend) for women, respectively. The multivariate adjusted hazard ratios of HF associated with low, moderate, and high leisure-time physical activity were 1.00, 0.83, and 0.65 (p < 0.001, for trend) for men and 1.00, 0.84, and 0.75 (p < 0.001, for trend) for women, respectively. Active commuting had a significant inverse association with HF risk in women, but not in men, before adjustment for occupational and leisure-time physical activity. The joint effects of any 2 types of physical activity on HF risk were even greater. Moderate and high levels of occupational or leisure-time physical activity are associated with a reduced risk of HF. Copyright © 2010 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Fractures in women treated with raloxifene or alendronate: a retrospective database analysis
2013-01-01
Background Raloxifene and alendronate are anti-resorptive therapies approved for the prevention and treatment of postmenopausal osteoporosis. Raloxifene is also indicated to reduce the risk of invasive breast cancer in postmenopausal women with osteoporosis and in postmenopausal women at high risk of invasive breast cancer. A definitive study comparing the fracture effectiveness and rate of breast cancer for raloxifene and alendronate has not been published. The purpose of this retrospective cohort study was to evaluate fracture and breast cancer rates among patients treated with raloxifene or alendronate. Methods Females ≥45 years who initiated raloxifene or alendronate in the 1998–2006 Truven Health Analytics MarketScan® Databases, had continuous enrollment 12 months prior to and at least 12 months after the index date, and had a treatment medication possession ratio ≥80% were included in this study. Rates of vertebral and nonvertebral fractures and breast cancer during 1, 3, 5, 6, 7, and 8 years of treatment with raloxifene or alendronate were evaluated. Fracture rates were adjusted for potential treatment bias using inverse probability of treatment weights. Multivariate hazard ratios were estimated for vertebral and nonvertebral fractures. Results Raloxifene patients had statistically significantly lower rates of vertebral fractures in 1, 3, 5, and 7 years and of nonvertebral fractures in 1 and 5 years. There were no statistically significant differences in the adjusted fracture rates between the raloxifene and alendronate cohorts, except in the 3-year nonvertebral fracture rates, where raloxifene was higher. Multivariate hazard ratios of raloxifene versus alendronate cohorts were not significantly different for vertebral and nonvertebral fracture in 1, 3, 5, 6, 7, and 8 years. Unweighted and weighted breast cancer rates were lower among raloxifene recipients. Conclusions Patients treated with alendronate and raloxifene had similar adjusted fracture rates in up to 8 years of adherent treatment, and raloxifene patients had lower breast cancer rates. PMID:23521803
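Inverse probability of treatment weighting, as used above to adjust fracture rates for treatment selection, reweights each cohort by the inverse of its estimated probability of receiving the treatment it actually received. The sketch below shows a stabilized-weight version on hypothetical data; the propensity covariates and column names are placeholders, not the study's specification.

```python
# Minimal sketch of stabilized inverse probability of treatment weights (hypothetical
# columns: raloxifene 0/1, fracture 0/1, person_years, plus assumed propensity covariates).
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("osteo_cohort.csv")
covars = ["age", "prior_fracture", "glucocorticoid_use", "comorbidity_score"]

ps = LogisticRegression(max_iter=1000).fit(df[covars], df["raloxifene"]).predict_proba(df[covars])[:, 1]
p_treat = df["raloxifene"].mean()

# Stabilized weight: marginal treatment probability divided by individual propensity
df["iptw"] = df["raloxifene"] * p_treat / ps + (1 - df["raloxifene"]) * (1 - p_treat) / (1 - ps)

def weighted_rate(g):
    # weighted fractures per 1000 weighted person-years
    return 1000 * (g["fracture"] * g["iptw"]).sum() / (g["person_years"] * g["iptw"]).sum()

print(df.groupby("raloxifene").apply(weighted_rate))
```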
Singal, Amit G; Mittal, Sahil; Yerokun, Olutola A; Ahn, Chul; Marrero, Jorge A; Yopp, Adam C; Parikh, Neehar D; Scaglione, Steve J
2017-09-01
Professional societies recommend hepatocellular carcinoma screening in patients with cirrhosis, but high-quality data evaluating its effectiveness in improving early tumor detection and survival in "real world" clinical practice are needed. We aimed to characterize the association between hepatocellular carcinoma screening and early tumor detection, curative treatment, and overall survival among patients with cirrhosis. We performed a retrospective cohort study of patients diagnosed with hepatocellular carcinoma between June 2012 and May 2013 at 4 health systems in the US. Patients were categorized into the screening group if hepatocellular carcinoma was detected by imaging performed for screening purposes. Generalized linear models and multivariate Cox regression with frailty adjustment were used to compare early detection, curative treatment, and survival between screen-detected and non-screen-detected patients. Among 374 hepatocellular carcinoma patients, 42% (n = 157) were detected by screening. Screen-detected patients had a significantly higher proportion of early tumors (Barcelona Clinic Liver Cancer stage A 63.1% vs 36.4%, P <.001) and were more likely to undergo curative treatment (31% vs 13%, P = .02). Hepatocellular carcinoma screening was significantly associated with improved survival in multivariate analysis (hazard ratio 0.41; 95% confidence interval, 0.26-0.65) after adjusting for patient demographics, Child-Pugh class, and performance status. Median survival of screen-detected patients was 14.6 months, compared with 6.0 months for non-screen-detected patients, with the difference remaining significant after adjusting for lead-time bias (hazard ratio 0.59, 95% confidence interval, 0.37-0.93). Hepatocellular carcinoma screening is associated with increased early tumor detection and improved survival; however, a minority of hepatocellular carcinoma patients are detected by screening. Interventions to increase screening use in patients with cirrhosis may help curb hepatocellular carcinoma mortality rates. Copyright © 2017 Elsevier Inc. All rights reserved.
Baumgartner, Christine; Fan, Dongjie; Fang, Margaret C; Singer, Daniel E; Witt, Daniel M; Schmelzer, John R; Williams, Marc S; Gurwitz, Jerry H; Sung, Sue Hee; Go, Alan S
2018-04-14
Anxiety and depression are associated with worse outcomes in several cardiovascular conditions, but it is unclear whether they affect outcomes in atrial fibrillation (AF). In a large diverse population of adults with AF, we evaluated the association of diagnosed anxiety and/or depression with stroke and bleeding outcomes. The Cardiovascular Research Network WAVE (Community-Based Control and Persistence of Warfarin Therapy and Associated Rates and Predictors of Adverse Clinical Events in Atrial Fibrillation and Venous Thromboembolism) Study included adults with AF newly starting warfarin between 2004 and 2007 within 5 health delivery systems in the United States. Diagnosed anxiety and depression and other patient characteristics were identified from electronic health records. We identified stroke and bleeding outcomes from hospitalization databases using validated International Classification of Diseases, Ninth Revision (ICD-9), codes. We used multivariable Cox regression to assess the relation between anxiety and/or depression and outcomes after adjustment for stroke and bleeding risk factors. In 25 570 adults with AF initiating warfarin, 490 had an ischemic stroke or intracranial hemorrhage (1.52 events per 100 person-years). In multivariable analyses, diagnosed anxiety was associated with a higher adjusted rate of combined ischemic stroke and intracranial hemorrhage (hazard ratio, 1.52; 95% confidence interval, 1.01-2.28). Results were not materially changed after additional adjustment for patient-level percentage of time in therapeutic anticoagulation range on warfarin (hazard ratio, 1.56; 95% confidence interval, 1.03-2.36). In contrast, neither isolated depression nor combined depression and anxiety was significantly associated with outcomes. Diagnosed anxiety was independently associated with an increased risk of combined ischemic stroke and intracranial hemorrhage in adults with AF initiating warfarin that was not explained by differences in risk factors or achieved anticoagulation quality. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Munoz Mendoza, Jair; Isakova, Tamara; Cai, Xuan; Bayes, Liz Y; Faul, Christian; Scialla, Julia J; Lash, James P; Chen, Jing; He, Jiang; Navaneethan, Sankar; Negrea, Lavinia; Rosas, Sylvia E; Kretzler, Matthias; Nessel, Lisa; Xie, Dawei; Anderson, Amanda Hyre; Raj, Dominic S; Wolf, Myles
2017-03-01
Inflammation is a consequence of chronic kidney disease (CKD) and is associated with adverse outcomes in many clinical settings. Inflammation stimulates production of fibroblast growth factor 23 (FGF23), high levels of which are independently associated with mortality in CKD. Few large-scale prospective studies have examined inflammation and mortality in patients with CKD, and none tested the interrelationships among inflammation, FGF23, and risk of death. Therefore, we conducted a prospective investigation of 3875 participants in the Chronic Renal Insufficiency Cohort (CRIC) study with CKD stages 2 to 4 to test the associations of baseline plasma interleukin-6, high-sensitivity C-reactive protein, and FGF23 levels with all-cause mortality, censoring at the onset of end-stage renal disease. During a median follow-up of 6.9 years, 550 participants died (20.5/1000 person-years) prior to end-stage renal disease. In separate multivariable-adjusted analyses, higher levels of interleukin-6 (hazard ratio per one standard deviation increase of natural log-transformed levels) 1.35 (95% confidence interval, 1.25-1.46), C-reactive protein 1.28 (1.16-1.40), and FGF23 1.45 (1.32-1.60) were each independently associated with increased risk of death. With further adjustment for FGF23, the risks of death associated with interleukin-6 and C-reactive protein were minimally attenuated. Compared to participants in the lowest quartiles of inflammation and FGF23, the multivariable-adjusted hazard ratio of death among those in the highest quartiles of both biomarkers was 4.38 (2.65-7.23) for interleukin-6 and FGF23, and 5.54 (3.04-10.09) for C-reactive protein and FGF23. Thus, elevated levels of interleukin-6, C-reactive protein, and FGF23 are independent risk factors for mortality in CKD. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
Egg consumption and risk of type 2 diabetes among African Americans: The Jackson Heart Study
Djoussé, Luc; Petrone, Andrew B; Hickson, DeMarc A; Talegawkar, Sameera A.; Dubbert, Patricia M; Taylor, Herman; Tucker, Katherine L
2015-01-01
Background and Aims Type 2 diabetes (DM) disproportionally affects African Americans. Data on the association between egg consumption and risk of DM are sparse. We sought to examine whether egg consumption is associated with the prevalence and incidence of DM among African Americans. Methods We analyzed baseline data from 4,568 participants of the Jackson Heart Study. Egg consumption was obtained using a food frequency questionnaire designed for this population. We used generalized estimating equations to calculate adjusted prevalence ratios of DM and Cox regression to estimate hazard ratios of DM with corresponding 95% confidence intervals (CI). Results The average age was 55±13 years and 64% of subjects were women. The median frequency of egg consumption was 2/week for men and 1/week for women. The prevalence of DM was 22% overall (21% of men and 23% of women). Multivariable adjusted prevalence ratios [PR (95% CI)] for DM were: 1.00 (ref), 1.14 (0.90–1.44), 1.33 (1.04–1.70), 1.33 (1.06–1.68), 1.26 (0.99–1.61), and 1.52 (1.17–1.97) for egg consumption of <1/month, 1–3/month, 1/week, 2/week, 3–4/week, and 5+/week, respectively, p for linear trend 0.0024. Corresponding multivariable adjusted hazard ratios were 1.00 (ref), 0.88 (0.65–1.19), 0.94 (0.68–1.30), 0.91 (0.66–1.25), 1.11 (0.81–1.52), and 1.17 (0.81–1.70), respectively, during a mean follow-up of 7.3 years (p for linear trend 0.22). Conclusions While egg consumption was positively associated with prevalent DM, prospective analysis did not show an association of egg intake with incidence of DM among African Americans. PMID:25971658
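Adjusted prevalence ratios like those above are commonly obtained from a generalized estimating equation with a Poisson family, log link, and robust variance. The sketch below illustrates that general approach; the data file, clustering variable, and covariates are hypothetical and do not reproduce the study's exact specification.

```python
# Minimal sketch: adjusted prevalence ratios via GEE with a Poisson family and log link
# (robust variance). Data file, clustering variable, and covariates are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("jhs_baseline.csv")

model = smf.gee(
    "diabetes ~ C(egg_freq_cat) + age + C(sex) + bmi",
    groups="family_id",                      # assumed clustering unit
    data=df,
    family=sm.families.Poisson(),            # log link; exp(coef) = prevalence ratio
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

print(np.exp(model.params))      # adjusted prevalence ratios
print(np.exp(model.conf_int()))  # corresponding 95% confidence limits
```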
Jespersen, Lasse; Abildstrom, Steen Z; Hvelplund, Anders; Madsen, Jan K; Galatius, Soren; Pedersen, Frants; Hojberg, Soren; Prescott, Eva
2014-01-01
To evaluate risk of hospitalization due to cardiovascular disease (CVD) and repeat coronary angiography (CAG) in stable angina pectoris (SAP) with no obstructive coronary artery disease (CAD) versus obstructive CAD, and asymptomatic reference individuals. We followed 11,223 patients with no prior CVD having a first-time CAG in 1998-2009 due to SAP symptoms and 5,695 asymptomatic reference individuals from the Copenhagen City Heart Study through registry linkage for 7.8 years (median). In recurrent event survival analysis, patients with SAP had 3-4-fold higher risk of hospitalization for CVD irrespective of CAG findings and cardiovascular comorbidity. Multivariable adjusted hazard ratios (95% CI) were 3.0 (2.5-3.5) for patients with angiographically normal coronary arteries, 3.9 (3.3-4.6) for angiographically diffuse non-obstructive CAD, and 3.6-4.1 (range) for 1-3-vessel disease (all P<0.001). Mean accumulated hospitalization time was 3.5 (3.0-4.0) days/10 years of follow-up in reference individuals and 4.5 (3.8-5.2)/7.0 (5.4-8.6)/6.7 (5.2-8.1)/6.1 (5.2-7.4)/8.6 (6.6-10.7) in patients with angiographically normal coronary arteries/angiographically diffuse non-obstructive CAD/1-, 2-, and 3-vessel disease, respectively (all P<0.05, age-adjusted). SAP symptoms predicted repeat CAG, with multivariable adjusted hazard ratios of 2.3 (1.9-2.9) for patients with angiographically normal coronary arteries, 5.5 (4.4-6.8) for angiographically diffuse non-obstructive CAD, and 6.6-9.4 (range) for obstructive CAD (all P<0.001). Patients with SAP symptoms and angiographically normal coronary arteries or angiographically diffuse non-obstructive CAD suffer from considerably greater CVD burdens in terms of hospitalization for CVD and repeat CAG compared with asymptomatic reference individuals, even after adjustment for cardiac risk factors and exclusion of cardiovascular comorbidity as the cause. Contrary to common perception, excluding obstructive CAD by CAG in such patients does not ensure a benign cardiovascular prognosis.
Lindell, Michael K; Arlikatti, Sudha; Prater, Carla S
2009-08-01
This study examined respondents' self-reported adoption of 16 hazard adjustments (preimpact actions to reduce danger to persons and property), their perceptions of those adjustments' attributes, and the correlations of those perceived attributes with respondents' demographic characteristics. The sample comprised 561 randomly selected residents from three cities in Southern California prone to high seismic risk and three cities from Western Washington prone to moderate seismic risks. The results show that the hazard adjustment perceptions were defined by hazard-related attributes and resource-related attributes. More significantly, the respondents had a significant degree of consensus in their ratings of those attributes and used them to differentiate among the hazard adjustments, as indicated by statistically significant differences among the hazard adjustment profiles. Finally, there were many significant correlations between respondents' demographic characteristics and the perceived characteristics of hazard adjustments, but there were few consistent patterns among these correlations.
Psychosocial stress as a risk factor for sepsis: a population-based cohort study.
Ojard, Connor; Donnelly, John P; Safford, Monika M; Griffin, Russell; Wang, Henry E
2015-01-01
To characterize the relationship between stress and future risk of sepsis. We also evaluated the role of depression in this relationship. We used population-based data on 30,183 participants in the Reasons for Geographic and Racial Differences in Stroke cohort, characterizing stress using the Perceived Stress Scale (PSS) and depressive symptoms using the Center for Epidemiologic Studies Depression Scale (CES-D). We identified incident sepsis events as hospitalizations for a serious infection with the presence of at least two systemic inflammatory response syndrome criteria. We assessed associations between PSS and incidence of sepsis for 1 and 10 years of follow-up, adjusting for demographics and chronic medical conditions and assessing the role of health behaviors and CES-D in these relationships. From 2003 to 2012, 1500 participants experienced an episode of sepsis. Mean PSS and CES-D scores were 3.2 (2.9) and 1.2 (2.1). PSS was associated with increased 1-year adjusted incidence of sepsis (hazard ratio [HR] = 1.21 per PSS standard deviation, 95% confidence interval = 1.06-1.38); multivariable adjustment for health behaviors and CES-D did not change this association (1.20, 1.03-1.39). PSS was also associated with increased 10-year adjusted incidence of sepsis (HR = 1.07 per PSS standard deviation; 95% confidence interval = 1.02-1.13). Multivariable adjustment showed that health behaviors did not affect this long-term association, whereas the addition of CES-D reduced the association between PSS and sepsis during 10-year follow-up (HR = 1.04, 0.98-1.11). Increased stress was associated with higher 1-year adjusted incidence of sepsis, even after accounting for depressive symptoms. The association between stress and 10-year adjusted incidence of sepsis was also significant, but this association was reduced when adjusting for depressive symptoms. Reduction of stress may limit short-term sepsis risk.
Coffee and tea consumption and the risk of Parkinson's disease.
Hu, Gang; Bidel, Siamak; Jousilahti, Pekka; Antikainen, Riitta; Tuomilehto, Jaakko
2007-11-15
Several prospective studies have assessed the association between coffee consumption and Parkinson's disease (PD) risk, but the results are inconsistent. We examined the association of coffee and tea consumption with the risk of incident PD among 29,335 Finnish subjects aged 25 to 74 years without a history of PD at baseline. During a mean follow-up of 12.9 years, 102 men and 98 women developed incident PD. The multivariate-adjusted (age, body mass index, systolic blood pressure, total cholesterol, education, leisure-time physical activity, smoking, alcohol and tea consumption, and history of diabetes) hazard ratios (HRs) of PD associated with the amount of coffee consumed daily (0, 1-4, and ≥5 cups) were 1.00, 0.55, and 0.41 (P for trend = 0.063) in men, 1.00, 0.50, and 0.39 (P for trend = 0.073) in women, and 1.00, 0.53, and 0.40 (P for trend = 0.005) in men and women combined (adjusted also for sex), respectively. In both sexes combined, the multivariate-adjusted HR of PD for subjects drinking ≥3 cups of tea daily compared with tea nondrinkers was 0.41 (95% CI 0.20-0.83). These results suggest that coffee drinking is associated with a lower risk of PD, and that more tea drinking is also associated with a lower risk of PD. © 2007 Movement Disorder Society.
Gastroduodenal Ulcers and ABO Blood Group: the Japan Nurses' Health Study (JNHS).
Alkebsi, Lobna; Ideno, Yuki; Lee, Jung-Su; Suzuki, Shosuke; Nakajima-Shimada, Junko; Ohnishi, Hiroshi; Sato, Yasunori; Hayashi, Kunihiko
2018-01-05
Although several studies have shown that blood type O is associated with increased risk of peptic ulcer, few studies have investigated these associations in Japan. We sought to investigate the association between the ABO blood group and risk of gastroduodenal ulcers (GDU) using combined analysis of both retrospective and prospective data from a large cohort study of Japanese women, the Japan Nurses' Health Study (JNHS; n = 15,019). The impact of the ABO blood group on GDU risk was examined using Cox regression analysis to estimate hazard ratios (HRs) and 95% confidence intervals (CI), with adjustment for potential confounders. Compared with women with non-O blood types (A, B, and AB), women with blood type O had a significantly increased risk of GDU from birth (multivariable-adjusted HR 1.18; 95% CI, 1.04-1.34). Moreover, the highest cumulative incidence of GDU was observed in women born pre-1956 with blood type O. In a subgroup analysis stratified by birth year (pre-1956 or post-1955), the multivariable-adjusted HR of women with blood type O was 1.22 (95% CI, 1.00-1.49) and 1.15 (95% CI, 0.98-1.35) in the pre-1956 and post-1955 groups, respectively. In this large, combined, ambispective cohort study of Japanese women, older women with blood type O had a higher risk of developing GDU than those with other blood types.
Gray, Shelly L.; Boudreau, Robert M.; Newman, Anne B.; Studenski, Stephanie A.; Shorr, Ronald I; Bauer, Douglas C.; Simonsick, Eleanor M.; Hanlon, Joseph T
2012-01-01
Objective Angiotensin-converting enzyme (ACE) inhibitors and statin medications have been proposed as potential agents to prevent or delay physical disability; yet limited research has evaluated whether such use in older community dwelling adults is associated with a lower risk of incident mobility limitation. Design Longitudinal cohort study Setting Health, Aging and Body Composition (Health ABC) Participants 3055 participants who were well functioning at baseline (e.g., no mobility limitations). Measurements Summated standardized daily doses (low, medium and high) and duration of ACE inhibitor and statin use was computed. Mobility limitation (two consecutive self-reports of having any difficulty walking 1/4 mile or climbing 10 steps without resting) was assessed every 6 months after baseline. Multivariable Cox proportional hazard analyses were conducted adjusting for demographics, health status, and health behaviors. Results At baseline, ACE inhibitors and statins were used by 15.2% and 12.9%, respectively and both increased to over 25% by year 6. Over 6.5 years of follow-up, 49.8% had developed mobility limitation. In separate multivariable models, neither ACE inhibitor (multivariate hazard ratio [HR] 0.95; 95% confidence interval [CI] 0.82–1.09) nor statin use (multivariate HR 1.02; 95% CI 0.87–1.17) was associated with a lower risk for mobility limitation. Similar findings were seen in analyses examining dose- and duration-response relationships and sensitivity analyses restricted to those with hypertension. Conclusions These findings indicate that ACE inhibitors and statins widely prescribed to treat hypertension and hypercholesterolemia, respectively do not lower risk of mobility limitation, an important life quality indicator. PMID:22092102
Khachatryan, Naira; Medeiros, Felipe A.; Sharpsten, Lucie; Bowd, Christopher; Sample, Pamela A.; Liebmann, Jeffrey M.; Girkin, Christopher A.; Weinreb, Robert N.; Miki, Atsuya; Hammel, Na’ama; Zangwill, Linda M.
2015-01-01
Purpose To evaluate racial differences in the development of visual field (VF) damage in glaucoma suspects. Design Prospective, observational cohort study. Methods Six hundred thirty-six eyes from 357 glaucoma suspects with normal VF at baseline were included from the multicenter African Descent and Glaucoma Evaluation Study (ADAGES). Racial differences in the development of VF damage were examined using multivariable Cox proportional hazards models. Results Thirty-one (25.4%) of 122 African descent participants and 47 (20.0%) of 235 European descent participants developed VF damage (p=0.078). In multivariable analysis, worse baseline VF mean deviation, higher mean arterial pressure during follow-up, and a race × mean intraocular pressure (IOP) interaction term were significantly associated with the development of VF damage, suggesting that racial differences in the risk of VF damage varied by IOP. At higher mean IOP levels, race was predictive of the development of VF damage even after adjusting for potentially confounding factors. At mean IOPs during follow-up of 22, 24 and 26 mmHg, multivariable hazard ratios (95% CI) for the development of VF damage in African descent compared to European descent subjects were 2.03 (1.15–3.57), 2.71 (1.39–5.29), and 3.61 (1.61–8.08), respectively. However, at lower mean IOP levels (below 22 mmHg) during follow-up, African descent was not predictive of the development of VF damage. Conclusion In this cohort of glaucoma suspects with similar access to treatment, multivariate analysis revealed that at higher mean IOP during follow-up, individuals of African descent were more likely to develop VF damage than individuals of European descent. PMID:25597839
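The IOP-specific hazard ratios above follow from a Cox model that contains a race main effect and a race × mean IOP interaction: the log hazard ratio for race at a given IOP equals the race coefficient plus the interaction coefficient multiplied by that IOP. The sketch below illustrates the arithmetic with placeholder coefficients; the numbers are not the study's estimates.

```python
# Minimal sketch: hazard ratio for race evaluated at specific mean IOP values,
# from a Cox model with a race main effect and a race-by-IOP interaction.
# The two coefficients below are hypothetical placeholders.
import numpy as np

beta_race = 0.10          # log-HR for African vs European descent when mean IOP = 0 (placeholder)
beta_interaction = 0.03   # additional log-HR per mmHg of mean IOP (placeholder)

for iop in (22, 24, 26):
    hr = np.exp(beta_race + beta_interaction * iop)
    print(f"HR for African vs European descent at mean IOP {iop} mmHg: {hr:.2f}")
```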
Gray, Shelly L; Boudreau, Robert M; Newman, Anne B; Studenski, Stephanie A; Shorr, Ronald I; Bauer, Douglas C; Simonsick, Eleanor M; Hanlon, Joseph T
2011-12-01
To evaluate whether the use of angiotensin-converting enzyme (ACE) inhibitors and statins is associated with a lower risk of incident mobility limitation in older community dwelling adults. Longitudinal cohort study. Health, Aging and Body Composition (Health ABC) study. Three thousand fifty-five participants who were well functioning at baseline (no mobility limitations). Summated standardized daily doses (low, medium, high) and duration of ACE inhibitor and statin use were computed. Mobility limitation (two consecutive self-reports of having any difficulty walking one-quarter of a mile or climbing 10 steps without resting) was assessed every 6 months after baseline. Multivariable Cox proportional hazards analyses were conducted, adjusting for demographics, health status, and health behaviors. At baseline, 15.2% used ACE inhibitors and 12.9% used statins; use of both was greater than 25% by Year 6. Over 6.5 years of follow-up, 49.8% had developed mobility limitation. In separate multivariable models, neither ACE inhibitor (multivariate hazard ratio (HR) = 0.95, 95% confidence interval (CI) = 0.82-1.09) nor statin use (multivariate HR = 1.02, 95% CI = 0.87-1.17) was associated with lower risk of mobility limitation. Similar findings were seen in analyses examining dose-response and duration-response relationships and a sensitivity analysis restricted to those with hypertension. ACE inhibitors and statins widely prescribed to treat hypertension and hypercholesterolemia, respectively, do not lower risk of mobility limitation, an important indicator of quality of life. © 2011, Copyright the Authors Journal compilation © 2011, The American Geriatrics Society.
Bae, Woong Jin; Choi, Jin Bong; Moon, Hyong Woo; Park, Young Hyun; Cho, Hyuk Jin; Hong, Sung-Hoo; Lee, Ji Youl; Kim, Sae Woong; Han, Kyung-Do; Ha, U-Syn
2018-01-01
To examine the association between obesity and urothelial cancer, we used representative data from the National Health Insurance System (NHIS). Participants included 826,170 men aged 20 years and older who underwent at least one health examination between 2004 and 2008; people aged <20 years and women were excluded. We used multivariate adjusted Cox regression analysis to examine the association between urothelial cancer and body mass index (BMI), expressed as hazard ratios (HRs) with 95% confidence intervals (CIs). Age- and multivariable-adjusted HRs for urothelial cancer were estimated across BMI strata. Men with a higher BMI were more likely to develop urothelial cancer, independent of other variables. In the population with diabetes, there was a considerable increasing trend in the risk of urothelial cancer in the overweight and obese group compared with the group with the same BMI but without diabetes. This population-based study showed evidence of an association between obesity and the development of urothelial cancer, with the presence of diabetes increasing the risk of urothelial cancer. Additionally, the higher the BMI, the higher the risk of urothelial cancer.
Psychological attitudes and risk of breast cancer in Japan: a prospective study.
Wakai, Kenji; Kojima, Masayo; Nishio, Kazuko; Suzuki, Sadao; Niwa, Yoshimitsu; Lin, Yingsong; Kondo, Takaaki; Yatsuya, Hiroshi; Tamakoshi, Koji; Yamamoto, Akio; Tokudome, Shinkan; Toyoshima, Hideaki; Tamakoshi, Akiko
2007-04-01
To examine the association between psychological factors and the risk of breast cancer prospectively in a non-Western population. Data from the Japan Collaborative Cohort (JACC) study were analyzed. From 1988 to 1990, 34,497 women aged 40-79 years completed a questionnaire on medical, lifestyle and psychosocial factors. Rate ratios (RRs) for their responses were computed by fitting proportional hazards models. During the mean follow-up period of 7.5 years, 149 breast cancer cases were documented. Those individuals who possessed "ikigai" (a Japanese term meaning something that makes one's life worth living) showed a significantly lower risk of breast cancer (multivariate-adjusted RR=0.66; 95% confidence interval [CI]=0.47-0.94). Those who perceived themselves as able to make decisions quickly also had a lower risk of breast cancer (multivariate-adjusted RR=0.56; 95% CI=0.36-0.87). The other factors investigated, including ease of anger arousal and self-perceived stress of daily life, were not associated with breast cancer risk. Although further studies will be necessary to verify these findings, our results suggest that having "ikigai" and being decisive decrease an individual's subsequent risk of breast cancer.
Physically and psychologically hazardous jobs and mental health in Thailand
Yiengprugsawan, Vasoontara; Strazdins, Lyndall; Lim, Lynette L.-Y.; Kelly, Matthew; Seubsman, Sam-ang; Sleigh, Adrian C.
2015-01-01
This paper investigates associations between hazardous jobs, mental health and wellbeing among Thai adults. In 2005, 87 134 distance-learning students from Sukhothai Thammathirat Open University completed a self-administered questionnaire; at the 2009 follow-up 60 569 again participated. Job characteristics were reported in 2005, psychological distress and life satisfaction were reported in both 2005 and 2009. We derived two composite variables grading psychologically and physically hazardous jobs and reported adjusted odds ratios (AOR) from multivariate logistic regressions. Analyses focused on cohort members in paid work: the total was 62 332 at 2005 baseline and 41 671 at 2009 follow-up. Cross-sectional AORs linking psychologically hazardous jobs to psychological distress ranged from 1.52 (one hazard) to 4.48 (four hazards) for males and a corresponding 1.34–3.76 for females. Similarly AORs for physically hazardous jobs were 1.75 (one hazard) to 2.76 (four or more hazards) for males and 1.70–3.19 for females. A similar magnitude of associations was found between psychologically adverse jobs and low life satisfaction (AORs of 1.34–4.34 among males and 1.18–3.63 among females). Longitudinal analyses confirm these cross-sectional relationships. Thus, significant dose–response associations were found linking hazardous job exposures in 2005 to mental health and wellbeing in 2009. The health impacts of psychologically and physically hazardous jobs in developed, Western countries are equally evident in transitioning Southeast Asian countries such as Thailand. Regulation and monitoring of work conditions will become increasingly important to the health and wellbeing of the Thai workforce. PMID:24218225
Citrus consumption and incident dementia in elderly Japanese: the Ohsaki Cohort 2006 Study.
Zhang, Shu; Tomata, Yasutake; Sugiyama, Kemmyo; Sugawara, Yumi; Tsuji, Ichiro
2017-04-01
Although some experimental biological studies have indicated that citrus may have preventive effects against cognitive impairment, no cohort study has yet examined the relationship between citrus consumption and incident dementia. In a baseline survey, we collected data on daily citrus intake (categorised as ≤2, 3-4 times/week or almost every day) and consumption of other foods using a FFQ, and used a self-reported questionnaire to collect data on other covariates. Data on incident dementia were retrieved from the Japanese Long-term Care Insurance database. A multivariate-adjusted Cox model was used to estimate the hazard ratios (HR) and 95 % CI for incident dementia according to citrus consumption. Among 13 373 participants, the 5·7-year incidence of dementia was 8·6 %. In comparison with participants who consumed citrus ≤2 times/week, the multivariate-adjusted HR for incident dementia among those did so 3-4 times/week and almost every day was 0·92 (95 % CI 0·80, 1·07) and 0·86 (95 % CI 0·73, 1·01), respectively (P trend=0·065). The inverse association persisted after excluding participants whose dementia events had occurred in the first 2 years of follow-up. The multivariate HR was 1·00 (reference) for ≤2 times/week, 0·82 (95 % CI 0·69, 0·98) for 3-4 times/week and 0·77 (95 % CI 0·64, 0·93) for almost every day (P trend=0·006). The present findings suggest that frequent citrus consumption was associated with a lower risk of incident dementia, even after adjustment for possible confounding factors.
Menopause and Risk of Kidney Stones.
Prochaska, Megan; Taylor, Eric N; Curhan, Gary
2018-05-03
Metabolic changes due to menopause may alter urine composition and kidney stone risk but results from prior work on this association have been mixed. We examined menopause and risk of incident kidney stones and changes in 24-hour urine composition in the Nurses' Health Study II. We conducted a prospective analysis of 108,639 Nurses' Health Study II participants who provided information on menopause and kidney stones. We used multivariate adjusted Cox proportional hazards models. We also analyzed 24-hour urine collections from 658 participants who performed a collection while pre-menopausal and a repeat collection after menopause. During 22 years of follow-up, there were 3,456 incident kidney stones. The multivariate adjusted relative risk for an incident kidney stone for post-menopausal participants compared with pre-menopause was 1.27 (95% CI 1.08 to 1.46). In a stratified analysis, compared with pre-menopause, the multivariate adjusted relative risk of natural menopause was 1.27 (95% CI 1.09 to 1.48) and surgically induced menopause was 1.43 (95% CI 1.19 to 1.73). Among 74,505 post-menopausal participants, there were 1,041 incident stone events. Compared with no hormone therapy use, neither current nor past use was significantly associated with kidney stone risk. Compared with pre-menopause, the post-menopausal urine collections had lower mean calcium, citrate, phosphorus, and uric acid, and higher mean volume. Post-menopausal status is associated with higher risk of incident kidney stone. Natural and surgical menopause are each independently associated with higher risk. There are small but significant differences in urine composition between pre- and post-menopausal urine collections. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Single Marital Status and Infectious Mortality in Women With Cervical Cancer in the United States.
Machida, Hiroko; Eckhardt, Sarah E; Castaneda, Antonio V; Blake, Erin A; Pham, Huyen Q; Roman, Lynda D; Matsuo, Koji
2017-10-01
Unmarried status, including single marital status, is associated with increased mortality in women with malignancy. Infectious disease accounts for a significant proportion of mortality in patients with malignancy. Here, we examined the association between single marital status and infectious mortality in cervical cancer. This is a retrospective observational study examining 86,555 women with invasive cervical cancer identified in the Surveillance, Epidemiology, and End Results Program between 1973 and 2013. Characteristics of 18,324 single women were compared with 38,713 married women in multivariable binary logistic regression models. Propensity score matching was performed to examine cumulative risk of all-cause and infectious mortality between the 2 groups. Single marital status was significantly associated with young age, black/Hispanic ethnicity, Western US residence, uninsured status, high-grade tumor, squamous histology, and advanced-stage disease on multivariable analysis (all, P < 0.05). In a prematched model, single marital status was significantly associated with increased cumulative risk of all-cause mortality (5-year rate: 32.9% vs 29.7%, P < 0.001) and infectious mortality (0.5% vs 0.3%, P < 0.001) compared with married status. After propensity score matching, single marital status remained an independent prognostic factor for increased cumulative risk of all-cause mortality (adjusted hazards ratio [HR], 1.15; 95% confidence interval [CI], 1.11-1.20; P < 0.001) and of infectious mortality on multivariable analysis (adjusted HR, 1.71; 95% CI, 1.27-2.32; P < 0.001). In a sensitivity analysis for stage I disease, single marital status remained significantly associated with increased risk of infectious mortality after propensity score matching (adjusted HR, 2.24; 95% CI, 1.34-3.73; P = 0.002). Single marital status was associated with increased infectious mortality in women with invasive cervical cancer.
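The propensity score matching step referred to above can be illustrated with a short, hedged sketch. The covariates, column names, and simulated data below are placeholders chosen for illustration; they do not reproduce the SEER variables, and the matching is done 1:1 with replacement for brevity.

# Hedged sketch: 1:1 nearest-neighbour matching on a logistic propensity score.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "single": rng.integers(0, 2, n),   # exposure: single (1) vs married (0)
    "age": rng.normal(50.0, 12.0, n),
    "stage": rng.integers(1, 5, n),
    "grade": rng.integers(1, 4, n),
})

X = df[["age", "stage", "grade"]]
ps_model = LogisticRegression(max_iter=1000).fit(X, df["single"])
df["ps"] = ps_model.predict_proba(X)[:, 1]   # estimated probability of being single

single = df[df["single"] == 1]
married = df[df["single"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(married[["ps"]])
_, idx = nn.kneighbors(single[["ps"]])       # closest married match for each single woman
matched = pd.concat([single, married.iloc[idx.ravel()]])  # matched cohort for outcome models

Outcome contrasts (for example, cumulative all-cause or infectious mortality) would then be estimated within the matched cohort rather than the full sample.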
The Multidisciplinary Swallowing Team Approach Decreases Pneumonia Onset in Acute Stroke Patients.
Aoki, Shiro; Hosomi, Naohisa; Hirayama, Junko; Nakamori, Masahiro; Yoshikawa, Mineka; Nezu, Tomohisa; Kubo, Satoshi; Nagano, Yuka; Nagao, Akiko; Yamane, Naoya; Nishikawa, Yuichi; Takamoto, Megumi; Ueno, Hiroki; Ochi, Kazuhide; Maruyama, Hirofumi; Yamamoto, Hiromi; Matsumoto, Masayasu
2016-01-01
Dysphagia occurs in acute stroke patients at high rates, and many of these patients develop aspiration pneumonia. Team approaches that draw on the cooperation and the specialized knowledge and skills of various professionals can improve the quality of medical care. In our hospital, a multidisciplinary participatory swallowing team was organized. The aim of this study was to clarify the influence of this team approach on dysphagia by comparing the rates of pneumonia in acute stroke patients prior to and after team organization. All consecutive acute stroke patients who were admitted to our hospital between April 2009 and March 2014 were registered. We analyzed the difference in the rate of pneumonia onset between the periods before team organization (prior period) and after team organization (post period). Univariate and multivariate analyses were performed using a Cox proportional hazards model to determine the predictors of pneumonia. We recruited 132 acute stroke patients from the prior period and 173 patients from the post period. Pneumonia onset was less frequent in the post period than in the prior period (6.9% vs. 15.9%, respectively; p = 0.01). In a multivariate analysis using a Cox proportional hazards model, the swallowing team approach was associated with lower pneumonia onset, independent of the National Institutes of Health Stroke Scale score on admission (adjusted hazard ratio 0.41, 95% confidence interval 0.19-0.84, p = 0.02). The multidisciplinary participatory swallowing team effectively decreased pneumonia onset in acute stroke patients.
Kerr, Stephen J; Rowett, Debra S; Sayer, Geoffrey P; Whicker, Susan D; Saltman, Deborah C; Mant, Andrea
2011-01-01
AIM To determine hazard ratios for all-cause mortality in elderly Australian veterans taking COX-2 selective and non-selective NSAIDs. METHODS Patient cohorts were constructed from claims databases (1997 to 2007) for veterans and dependants with full treatment entitlement irrespective of military service. Patients were grouped by initial exposure: celecoxib, rofecoxib, meloxicam, diclofenac, non-selective NSAID. A reference group was constructed of patients receiving glaucoma/hypothyroid medications and none of the study medications. Univariate and multivariate analyses were performed using Cox proportional hazards regression models. Hazard ratios (HR) and 95% confidence intervals (CI) were estimated for each exposure group against the reference group. The final model was adjusted for age, gender and co-prescription as a surrogate for cardiovascular risk. Patients were censored if the gap in supply of the study prescription exceeded 30 days or if another study medication was initiated. The outcome measure in all analyses was death. RESULTS Hazard ratios and 95% CIs, adjusted for age, gender and cardiovascular risk, for each group relative to the reference group were: celecoxib 1.39 (1.25, 1.55), diclofenac 1.44 (1.28, 1.62), meloxicam 1.49 (1.25, 1.78), rofecoxib 1.58 (1.39, 1.79), non-selective NSAIDs 1.76 (1.59, 1.94). CONCLUSIONS In this large cohort of Australian veterans exposed to COX-2 selective and non-selective NSAIDs, there was a significantly increased mortality risk for those exposed to either COX-2-selective or non-selective NSAIDs relative to those exposed to unrelated (glaucoma/hypothyroid) medications. PMID:21276041
Kato, Koki; Fukuda, Haruhisa
2017-11-01
To quantify the difference between adjusted costs for home-based palliative care and hospital-based palliative care in terminally ill cancer patients. We carried out a case-control study of home-care patients (cases) who had died at home between January 2009 and December 2013, and hospital-care patients (controls) who had died at a hospital between April 2008 and December 2013. Data on patient characteristics were obtained from insurance claims data and medical records. We identified the determinants of home care using a multivariate logistic regression analysis. Cox proportional hazards analysis was used to examine treatment duration in both types of care, and a generalized linear model was used to estimate the reduction in treatment costs associated with home care. The case and control groups comprised 48 and 99 patients, respectively. Home care was associated with one or more person(s) living with the patient (adjusted OR 6.54, 95% CI 1.18-36.05), required assistance for activities of daily living (adjusted OR 3.61, 95% CI 1.12-10.51), non-use of oxygen inhalation therapy (adjusted OR 12.75, 95% CI 3.53-46.02), oral or suppository opioid use (adjusted OR 5.74, 95% CI 1.11-29.54) and transdermal patch opioid use (adjusted OR 8.30, 95% CI 1.97-34.93). The adjusted hazard ratio of home care for treatment duration was not significant (adjusted HR 0.95, 95% CI 0.59-1.53). However, home care was significantly associated with a reduction of $7523 (95% CI $7093-7991, P = 0.015) in treatment costs. Despite similar treatment durations between the groups, treatment costs were substantially lower in the home-care group. These findings might inform the policymaking process for improving the home-care support system. Geriatr Gerontol Int 2017; 17: 2247-2254. © 2017 Japan Geriatrics Society.
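A generalized linear model for skewed cost data, of the kind referenced above, is often specified with a gamma family and log link. The sketch below is illustrative only: the data are simulated, the variable names are hypothetical, and it assumes a reasonably recent statsmodels in which the capitalized Log link class is available.

# Hedged sketch: gamma GLM with a log link for right-skewed treatment costs.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 147
df = pd.DataFrame({
    "home_care": rng.integers(0, 2, n),        # 1 = home-based palliative care
    "age": rng.normal(75.0, 9.0, n),
    "adl_dependent": rng.integers(0, 2, n),
    "cost_usd": rng.gamma(2.0, 8000.0, n),     # simulated skewed cost outcome
})

glm = smf.glm("cost_usd ~ home_care + age + adl_dependent", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(glm.params["home_care"])                 # log-cost ratio for home care
print(np.exp(glm.params["home_care"]))         # multiplicative effect on mean cost

On the log-link scale, exponentiating the home_care coefficient gives the multiplicative effect on mean cost; a dollar difference like the one reported above would come from contrasting predicted costs under the two care settings.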
Factors associated with reporting results for pulmonary clinical trials in ClinicalTrials.gov.
Riley, Isaretta L; Boulware, L Ebony; Sun, Jie-Lena; Chiswell, Karen; Que, Loretta G; Kraft, Monica; Todd, Jamie L; Palmer, Scott M; Anderson, Monique L
2018-02-01
Background/aims The Food and Drug Administration Amendments Act mandates that applicable clinical trials report basic summary results to the ClinicalTrials.gov database within 1 year of trial completion or termination. We aimed to determine the proportion of pulmonary trials reporting basic summary results to ClinicalTrials.gov and assess factors associated with reporting. Methods We identified pulmonary clinical trials subject to the Food and Drug Administration Amendments Act (called highly likely applicable clinical trials) that were completed or terminated between 2008 and 2012 and reported results by September 2013. We estimated the cumulative percentage of applicable clinical trials reporting results by pulmonary disease category. Multivariable Cox regression modeling identified characteristics independently associated with results reporting. Results Of 1450 pulmonary highly likely applicable clinical trials, 380 (26%) examined respiratory neoplasms, 238 (16%) asthma, 175 (12%) chronic obstructive pulmonary disease, and 657 (45%) other respiratory diseases. Most (75%) were pharmaceutical highly likely applicable clinical trials and 71% were industry-funded. Approximately 15% of highly likely applicable clinical trials reported results within 1 year of trial completion, while 55% reported results over the 5-year study period. Earlier phase highly likely applicable clinical trials were less likely to report results compared to phase 4 highly likely applicable clinical trials: phases 1/2 and 2, adjusted hazard ratio 0.41 (95% confidence interval: 0.31-0.54); phases 2/3 and 3, adjusted hazard ratio 0.55 (95% confidence interval: 0.42-0.72); and phase not applicable, adjusted hazard ratio 0.43 (95% confidence interval: 0.29-0.63). Pulmonary highly likely applicable clinical trials without Food and Drug Administration oversight were less likely to report results compared with those with oversight (adjusted hazard ratio 0.65 (95% confidence interval: 0.51-0.83)). Conclusion Only 15% of pulmonary highly likely applicable clinical trials report basic summary results to ClinicalTrials.gov within 1 year of trial completion. Strategies to improve reporting are needed within the pulmonary community.
Yeh, Hsin-Chih; Jan, Hau-Chern; Wu, Wen-Jeng; Li, Ching-Chia; Li, Wei-Ming; Ke, Hung-Lung; Huang, Shu-Pin; Liu, Chia-Chu; Lee, Yung-Chin; Yang, Sheau-Fang; Liang, Peir-In; Huang, Chun-Nung
2015-01-01
To investigate the impact of preoperative hydronephrosis and flank pain on the prognosis of patients with upper tract urothelial carcinoma. In total, 472 patients with upper tract urothelial carcinoma managed by radical nephroureterectomy were included from the Kaohsiung Medical University Hospital Healthcare System. Clinicopathological data were collected retrospectively for analysis. The significance of hydronephrosis (especially when combined with flank pain) and other relevant factors for overall and cancer-specific survival was evaluated. Of the 472 patients, 292 (62%) had preoperative hydronephrosis and 121 (26%) presented with flank pain. Preoperative hydronephrosis was significantly associated with age, hematuria, flank pain, tumor location, and pathological tumor stage. Concurrent presence of hydronephrosis and flank pain was a significant predictor of non-organ-confined disease (multivariate-adjusted hazard ratio = 2.10, P = 0.025). Kaplan-Meier analysis showed significantly poorer overall and cancer-specific survival in patients with preoperative hydronephrosis (P = 0.005 and P = 0.026, respectively) and in patients with flank pain (P < 0.001 and P = 0.001, respectively) than in those without. However, only simultaneous hydronephrosis and flank pain independently predicted adverse outcome in multivariate Cox proportional hazards models (hazard ratio = 1.98, P = 0.016 for overall survival and hazard ratio = 1.87, P = 0.036 for cancer-specific survival). In addition, concurrent presence of hydronephrosis and flank pain was also significantly predictive of worse survival in patients with high-grade or muscle-invasive disease. Notably, there was no difference in survival between patients with hydronephrosis but without flank pain and those without hydronephrosis. Concurrent preoperative presence of hydronephrosis and flank pain predicted non-organ-confined status of upper tract urothelial carcinoma. When accompanied by flank pain, hydronephrosis represented an independent predictor of worse outcome in patients with upper tract urothelial carcinoma.
Herpes zoster could be an early manifestation of undiagnosed human immunodeficiency virus infection.
Lai, Shih-Wei; Lin, Cheng-Li; Liao, Kuan-Fu; Chen, Wen-Chi
2016-05-01
No formal epidemiological research based on systematic analysis has focused on the relationship between herpes zoster and human immunodeficiency virus (HIV) infection in Taiwan. Our aim was to explore whether herpes zoster is an early manifestation of undiagnosed HIV infection in Taiwan. This was a retrospective cohort study using the database of the Taiwan National Health Insurance Program. A total of 35,892 individuals aged ≤ 84 years with newly diagnosed herpes zoster from 1998 to 2010 were assigned to the herpes zoster group, whereas 143,568 sex-matched and age-matched, randomly selected individuals without herpes zoster served as the non-herpes zoster group. The incidence of HIV diagnosis at the end of 2011 was estimated in both groups. A multivariable Cox proportional hazards regression model was used to estimate the hazard ratio and 95% confidence interval (CI) for risk of HIV diagnosis associated with herpes zoster and other comorbidities, including drug dependence and venereal diseases. The overall incidence of HIV diagnosis was 4.19-fold greater in the herpes zoster group than in the non-herpes zoster group (3.33 per 10,000 person-years vs. 0.80 per 10,000 person-years, 95% CI 4.04-4.35). The multivariable Cox proportional hazards regression analysis revealed that the adjusted hazard ratio of HIV diagnosis was 4.37 (95% CI 3.10-6.15) for individuals with herpes zoster and without comorbidities, as compared with individuals without herpes zoster and without comorbidities. Herpes zoster is associated with HIV diagnosis. Patients with risk behaviors for HIV infection should receive regular surveillance for undiagnosed HIV infection when they present with herpes zoster. Copyright © 2015. Published by Elsevier B.V.
Ahrén-Moonga, Jennie; Silverwood, Richard; Klinteberg, Britt Af; Koupil, Ilona
2009-09-01
Eating disorders are a leading cause of disease burden among young women. This study investigated associations of social characteristics of parents and grandparents, sibling position, and school performance with incidence of eating disorders. The authors studied Swedish females born in 1952-1989 (n = 13,376), third-generation descendants of a cohort born in Uppsala in 1915-1929. Data on grandparental and parental social characteristics, sibling position, school grades, hospitalizations, emigrations, and deaths were obtained by register linkages. Associations with incidence of hospitalization for eating disorders were studied with multivariable Cox regression, adjusted for age and study period. Overall incidence of hospitalization for eating disorders was 32.0/100,000 person-years. Women with more highly educated parents and maternal grandparents were at higher risk (hazard ratio for maternal grandmother with higher education relative to elementary education = 6.5, 95% confidence interval: 2.2, 19.3, adjusted for parental education). Independent of family social characteristics, women with the highest school grades had a higher risk of eating disorders (hazard ratio = 7.7, 95% confidence interval: 2.5, 24.1 for high compared with low grades in Swedish, adjusted for parental education). Thus, higher parental and grandparental education and higher school grades may increase risk of hospitalization for eating disorders in female offspring, possibly because of high internal and external demands.
Obesity Paradox: Comparison of Heart Failure Patients With and Without Comorbid Diabetes.
Lee, Kyoung Suk; Moser, Debra K; Lennie, Terry A; Pelter, Michele M; Nesbitt, Thomas; Southard, Jeffrey A; Dracup, Kathleen
2017-03-01
Diabetes is a common comorbid condition in patients with heart failure and is strongly associated with poor outcomes. Patients with heart failure who have diabetes are more likely to be obese than are those without diabetes. Obesity is positively associated with survival in patients with heart failure, but how comorbid diabetes influences the relationship between obesity and favorable prognosis is unclear. To explore whether the relationship between body mass index and survival differs between patients with heart failure who do or do not have diabetes. The sample consisted of 560 ambulatory patients with heart failure (mean age, 66 years; mean body mass index, 32; diabetes, 41%). The association between body mass index and all-cause mortality was examined by using multivariate Cox proportional hazards regression after adjustment for covariates. In patients without diabetes, higher body mass index was associated with a lower risk for all-cause mortality after adjustment for covariates (hazard ratio, 0.952; 95% CI, 0.909-0.998). In patients with diabetes, body mass index was not predictive of all-cause death after adjustment for covariates. Obesity conferred a survival benefit in heart failure patients without comorbid diabetes but not in those with comorbid diabetes. The mechanisms underlying the difference in the relationship between obesity and survival due to the presence of diabetes in patients with heart failure need to be elucidated. ©2017 American Association of Critical-Care Nurses.
Religion and mortality among the community-dwelling elderly.
Oman, D; Reed, D
1998-01-01
OBJECTIVES: This study analyzed the prospective association between attending religious services and all-cause mortality to determine whether the association is explainable by 6 confounding factors: demographics, health status, physical functioning, health habits, social functioning and support, and psychological state. METHODS: The association between self-reported religious attendance and subsequent mortality over 5 years for 1931 older residents of Marin County, California, was examined by proportional hazards regression. Interaction terms of religion with social support were used to explore whether other forms of social support could substitute for religion and diminish its protective effect. RESULTS: Persons who attended religious services had lower mortality than those who did not (age- and sex-adjusted relative hazard [RH] = 0.64; 95% confidence interval [CI] = 0.52, 0.78). Multivariate adjustment reduced this relationship only slightly (RH = 0.76; 95% CI = 0.62, 0.94), primarily by including physical functioning and social support. Contrary to hypothesis, religious attendance tended to be slightly more protective for those with high social support. CONCLUSIONS: Lower mortality rates for those who attend religious services are only partly explained by the 6 possible confounders listed above. Psychodynamic and other explanations need further investigation. PMID:9772846
A prospective study of sudden unexpected infant death after reported maltreatment.
Putnam-Hornstein, Emily; Schneiderman, Janet U; Cleves, Mario A; Magruder, Joseph; Krous, Henry F
2014-01-01
To examine whether infants reported for maltreatment face a heightened risk of sudden infant death syndrome (SIDS) and other leading causes of sudden unexpected infant death (SUID). Linked birth and infant death records for all children born in California between 1999 and 2006 were matched to administrative child protection data. Infants were prospectively followed from birth through death or 1 year of age. A report of maltreatment was modeled as a time-varying covariate; risk factors at birth were included as baseline covariates. Multivariable competing risk survival models were used to estimate the adjusted relative hazard of postneonatal SIDS and other SUID. A previous maltreatment report emerged as a significant predictor of SIDS and other SUID. After adjusting for baseline risk factors, the rate of SIDS was more than 3 times as great among infants reported for possible maltreatment (hazard ratio: 3.22; 95% CI: 2.66, 3.89). Infants reported to child protective services have a heightened risk of SIDS and other SUID. Targeted services and improved communication between child protective services and the pediatric health care community may enhance infant well-being and reduce risk of death. Copyright © 2014 Mosby, Inc. All rights reserved.
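Modeling the maltreatment report as a time-varying covariate implies a long-format dataset in which each infant contributes one row per exposure interval. The sketch below illustrates the general start/stop encoding with lifelines' CoxTimeVaryingFitter; the simulated data and column names are hypothetical and do not reproduce the California birth cohort or the authors' competing-risk specification.

# Hedged sketch: a maltreatment report as a time-varying covariate in a Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(2)
rows = []
for i in range(1500):
    t_end = rng.uniform(0.05, 1.0)            # follow-up ends at death or age 1 year
    event = rng.random() < 0.05               # 1 = SUID during follow-up
    t_report = rng.uniform(0.0, 2.0)          # time of a possible maltreatment report
    if rng.random() < 0.2 and t_report < t_end:
        rows.append((i, 0.0, t_report, 0, 0))             # interval before the report
        rows.append((i, t_report, t_end, 1, int(event)))  # interval after the report
    else:
        rows.append((i, 0.0, t_end, 0, int(event)))       # never reported during follow-up
long_df = pd.DataFrame(rows, columns=["id", "start", "stop", "reported", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for "reported" plays the role of the adjusted HR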
Myocardial Injury in Patients With Sepsis and Its Association With Long-Term Outcome.
Frencken, Jos F; Donker, Dirk W; Spitoni, Cristian; Koster-Brouwer, Marlies E; Soliman, Ivo W; Ong, David S Y; Horn, Janneke; van der Poll, Tom; van Klei, Wilton A; Bonten, Marc J M; Cremer, Olaf L
2018-02-01
Sepsis is frequently complicated by the release of cardiac troponin, but the clinical significance of this myocardial injury remains unclear. We studied the associations between troponin release during sepsis and 1-year outcomes. We enrolled consecutive patients with sepsis in 2 Dutch intensive care units between 2011 and 2013. Subjects with a clinically apparent cause of troponin release were excluded. High-sensitivity cardiac troponin I (hs-cTnI) concentration in plasma was measured daily during the first 4 intensive care unit days, and multivariable Cox regression analysis was used to model its association with 1-year mortality while adjusting for confounding. In addition, we studied cardiovascular morbidity occurring during the first year after hospital discharge. Among 1258 patients presenting with sepsis, 1124 (89%) were eligible for study inclusion. Hs-cTnI concentrations were elevated in 673 (60%) subjects on day 1, and 755 (67%) ever had elevated levels in the first 4 days. Cox regression analysis revealed that high hs-cTnI concentrations were associated with increased death rates during the first 14 days (adjusted hazard ratio, 1.72; 95% confidence interval, 1.14-2.59 and hazard ratio, 1.70; 95% confidence interval, 1.10-2.62 for hs-cTnI concentrations of 100-500 and >500 ng/L, respectively) but not thereafter. Furthermore, elevated hs-cTnI levels were associated with the development of cardiovascular disease among 200 hospital survivors who were analyzed for this end point (adjusted subdistribution hazard ratio, 1.25; 95% confidence interval, 1.04-1.50). Myocardial injury occurs in the majority of patients with sepsis and is independently associated with early, but not late, mortality, as well as with postdischarge cardiovascular morbidity. © 2018 American Heart Association, Inc.
Saeed, Mohammed J; Olsen, Margaret A; Powderly, William G; Presti, Rachel M
2017-01-01
To investigate the association of diabetes with risk of decompensated cirrhosis in patients with chronic hepatitis C (CHC). Direct-acting antivirals are highly effective in treating CHC but very expensive. CHC patients at high risk of progression to symptomatic liver disease may benefit most from early treatment. We conducted a retrospective cohort study using the 2006 to 2013 Truven Health Analytics MarketScan Commercial Claims and Encounters database including inpatient, outpatient, and pharmacy claims from private insurers. CHC and cirrhosis were identified using ICD-9-CM diagnosis codes; baseline diabetes was identified by diagnosis codes or antidiabetic medications. CHC patients were followed to identify decompensated cirrhosis. Multivariable Cox proportional hazards regression was used to model the risk of decompensated cirrhosis by baseline cirrhosis. There were 75,805 CHC patients with median 1.9 years follow-up. A total of 10,317 (13.6%) of the CHC population had diabetes. The rates of decompensated cirrhosis per 1000 person-years were: 185.5 for persons with baseline cirrhosis and diabetes, 119.8 for persons with cirrhosis and no diabetes, 35.3 for persons with no cirrhosis and diabetes, and 17.1 for persons with no cirrhosis and no diabetes. Diabetes was associated with increased risk of decompensated cirrhosis in persons with baseline cirrhosis (adjusted hazard ratio=1.4; 95% confidence interval, 1.3-1.6) and in persons without baseline cirrhosis (adjusted hazard ratio=1.9; 95% confidence interval, 1.7-2.1). In a privately insured US population with CHC, the adjusted risk of decompensated cirrhosis was higher in diabetic compared with nondiabetic patients. Diabetes status should be included in prioritization of antiviral treatment.
Xiao, Roy; Miller, Jacob A; Zafirau, William J; Gorodeski, Eiran Z; Young, James B
2018-04-01
As healthcare costs rise, home health care represents an opportunity to reduce preventable adverse events and costs following hospital discharge. No studies have investigated the utility of home health care within the context of a large and diverse patient population. A retrospective cohort study was conducted between 1/1/2013 and 6/30/2015 at a single tertiary care institution to assess healthcare utilization after discharge with home health care. Control patients discharged with "self-care" were matched by propensity score to home health care patients. The primary outcome was total healthcare costs in the 365-day post-discharge period. Secondary outcomes included follow-up readmission and death. Multivariable linear and Cox proportional hazards regression were used to adjust for covariates. Among 64,541 total patients, 11,266 controls were matched to 6,363 home health care patients across 11 disease-based Institutes. During the 365-day post-discharge period, home health care was associated with a mean unadjusted savings of $15,233 per patient, or $6,433 after adjusting for covariates (p < 0.0001). Home health care independently decreased the hazard of follow-up readmission (HR 0.82, p < 0.0001) and death (HR 0.80, p < 0.0001). Subgroup analyses revealed that home health care most benefited patients discharged from the Digestive Disease (death HR 0.72, p < 0.01), Heart & Vascular (adjusted savings of $11,453, p < 0.0001), Medicine (readmission HR 0.71, p < 0.0001), and Neurological (readmission HR 0.67, p < 0.0001) Institutes. Discharge with home health care was associated with significant reduction in healthcare utilization and decreased hazard of readmission and death. These data inform development of value-based care plans. Copyright © 2018 Elsevier Inc. All rights reserved.
Miller, P Elliott; Zhao, Di; Frazier-Wood, Alexis C; Michos, Erin D; Averill, Michelle; Sandfort, Veit; Burke, Gregory L; Polak, Joseph F; Lima, Joao A C; Post, Wendy S; Blumenthal, Roger S; Guallar, Eliseo; Martin, Seth S
2017-02-01
Coffee and tea are 2 of the most commonly consumed beverages in the world. The association of coffee and tea intake with coronary artery calcium and major adverse cardiovascular events remains uncertain. We examined 6508 ethnically diverse participants with available coffee and tea data from the Multi-Ethnic Study of Atherosclerosis. Intake for each was classified as never, occasional (<1 cup per day), and regular (≥1 cup per day). A coronary artery calcium progression ratio was derived from mixed-effects regression models using ln(calcium score + 1) as the outcome, with coefficients exponentiated to reflect the coronary artery calcium progression ratio versus the reference. Cox proportional hazards analyses were used to evaluate the association between beverage intake and incident cardiovascular events. Over a median follow-up of 5.3 years for coronary artery calcium and 11.1 years for cardiovascular events, participants who regularly drank tea (≥1 cup per day) had a slower progression of coronary artery calcium compared with never drinkers after multivariable adjustment. This correlated with a statistically significant lower incidence of cardiovascular events for ≥1 cup per day tea drinkers (adjusted hazard ratio 0.71; 95% confidence interval 0.53-0.95). Compared with never coffee drinkers, regular coffee intake (≥1 cup per day) was not statistically associated with coronary artery calcium progression or cardiovascular events (adjusted hazard ratio 0.97; 95% confidence interval 0.78-1.20). Caffeine intake was marginally inversely associated with coronary artery calcium progression. Moderate tea drinkers had slower progression of coronary artery calcium and reduced risk for cardiovascular events. Future research is needed to understand the potentially protective nature of moderate tea intake. Published by Elsevier Inc.
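The log-transformed mixed-effects progression model described above can be sketched as follows. This is a hedged illustration with simulated data; the variable names (time, tea, log_cac) are placeholders and the specification is a simplified stand-in for the MESA analysis.

# Hedged sketch: mixed-effects model of ln(CAC + 1), with exponentiated slopes
# interpreted as progression ratios relative to the reference group.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, visits = 500, 3
ids = np.repeat(np.arange(n), visits)
time = np.tile(np.arange(visits), n) * 2.5            # years since the baseline scan
tea = np.repeat(rng.integers(0, 2, n), visits)        # regular tea drinker (>=1 cup/day)
log_cac = 0.5 + 0.30 * time - 0.05 * time * tea + rng.normal(0.0, 1.0, n * visits)
data = pd.DataFrame({"id": ids, "time": time, "tea": tea, "log_cac": log_cac})

model = smf.mixedlm("log_cac ~ time * tea", data, groups=data["id"]).fit()
print(np.exp(model.fe_params))   # exponentiated fixed-effect slopes ~ progression ratios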
Importance of Abnormal Chloride Homeostasis in Stable Chronic Heart Failure.
Grodin, Justin L; Verbrugge, Frederik H; Ellis, Stephen G; Mullens, Wilfried; Testani, Jeffrey M; Tang, W H Wilson
2016-01-01
The aim of this analysis was to determine the long-term prognostic value of lower serum chloride in patients with stable chronic heart failure. Electrolyte abnormalities are prevalent in patients with chronic heart failure. Little is known regarding the prognostic implications of lower serum chloride. Serum chloride was measured in 1673 consecutively consented stable patients with a history of heart failure undergoing elective diagnostic coronary angiography. All patients were followed for 5-year all-cause mortality, and survival models were adjusted for variables that confounded the chloride-risk relationship. The average chloride level was 102 ± 4 mEq/L. Over 6772 person-years of follow-up, there were 547 deaths. Lower chloride (per standard deviation decrease) was associated with a higher adjusted risk of mortality (hazard ratio 1.29, 95% confidence interval 1.12-1.49; P < 0.001). Chloride levels net-reclassified risk in 10.4% (P = 0.03) when added to a multivariable model (with a resultant C-statistic of 0.70), in which sodium levels were not prognostic (P = 0.30). In comparison to those with above first quartile chloride (≥ 101 mEq/L) and sodium (≥ 138 meq/L), subjects with first quartile chloride had a higher adjusted mortality risk, whether they had first quartile sodium (hazard ratio 1.35, 95% confidence interval 1.08-1.69; P = 0.008) or higher (hazard ratio 1.43, 95% confidence interval 1.12-1.85; P = 0.005). However, subjects with first quartile sodium but above first quartile chloride had no association with mortality (P = 0.67). Lower serum chloride levels are independently and incrementally associated with increased mortality risk in patients with chronic heart failure. A better understanding of the biological role of serum chloride is warranted. © 2015 American Heart Association, Inc.
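The discrimination statistic quoted above (a C-statistic of 0.70) and the per-SD hazard ratio can both be read off a fitted Cox model; the hedged sketch below uses simulated data and placeholder column names rather than the study cohort.

# Hedged sketch: fit the survival model, then inspect its hazard ratios and
# Harrell's C-statistic (the model-discrimination measure cited above).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1673
df = pd.DataFrame({
    "chloride": rng.normal(102.0, 4.0, n),
    "sodium": rng.normal(139.0, 3.0, n),
    "age": rng.normal(65.0, 10.0, n),
    "years": rng.exponential(4.0, n),
    "died": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
print(cph.hazard_ratios_)        # per-unit HRs; rescale chloride by its SD for a per-SD HR
print(cph.concordance_index_)    # Harrell's C-statistic of the fitted model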
Screening frequency and atypical cells and the prediction of cervical cancer risk.
Chen, Yun-Yuan; You, San-Lin; Koong, Shin-Lan; Liu, Jessica; Chen, Chi-An; Chen, Chien-Jen
2014-05-01
To evaluate the screening efficacy and the importance of atypical squamous cells and atypical glandular cells in predicting subsequent cervical cancer risk. This national cohort study in Taiwan analyzed associations between Pap test screening frequency and findings in 1995-2000 and subsequent risk of squamous cell carcinoma and adenocarcinoma after 2002. Women aged 30 years or older in 1995 without a cervical cancer history were included. Multivariate-adjusted hazard ratios and their 95% confidence intervals (CIs) were assessed using Cox regression analysis. During a total follow-up of 31,693,980 person-years in 2002-2008, 9,471 squamous cell carcinoma and 1,455 adenocarcinoma cases were newly diagnosed, resulting in 2,067 deaths. The risk of developing and dying from squamous cell carcinoma decreased significantly with increasing attendance frequency between 1995 and 2000 (all P values for trend <.001). Women who attended more than three screenings in 1995-2000 had a 0.69-fold incidence and a 0.35-fold mortality of adenocarcinoma, respectively, compared with women who never attended any screenings. Abnormal cytologic findings were significant predictors of the incidence and mortality of cervical cancers. The adjusted hazard ratio (95% CI) of developing squamous cell carcinoma was 29.94 (22.83-39.25) for atypical squamous cells, cannot exclude high-grade squamous intraepithelial lesions, and the adjusted hazard ratio (95% CI) of developing adenocarcinoma was 49.43 (36.49-66.97) for atypical glandular cells. Significant reductions in cervical adenocarcinoma occurred in women who attended three or more annual screenings over 6 years. High-grade atypical squamous cells and atypical glandular cells are important predictors of subsequent squamous cell carcinoma and adenocarcinoma, respectively. LEVEL OF EVIDENCE: II.
Assessment of frailty in aged dogs.
Hua, Julie; Hoummady, Sara; Muller, Claude; Pouchelon, Jean-Louis; Blondot, Marc; Gilbert, Caroline; Desquilbet, Loic
2016-12-01
OBJECTIVE To define a frailty-related phenotype-a clinical syndrome associated with the aging process in humans-in aged dogs and to investigate its association with time to death. ANIMALS 116 aged guide dogs. PROCEDURES Dogs underwent a clinical geriatric assessment (CGA) and were followed to either time of death or the study cutoff date. A 5-component clinical definition of a frailty phenotype was derived from clinical items included in a geriatric health evaluation scoresheet completed by veterinarians during the CGA. Univariate (via Kaplan-Meier curves) and multivariate (via Cox proportional hazards models) survival analyses were used to investigate associations of the 5 CGA components with time to death. RESULTS 76 dogs died, and the median time from CGA to death was 4.4 years. Independent of age at the time of CGA, dogs that had ≥ 2 of the 5 components (n = 10) were more likely to die during the follow-up period, compared with those that had 1 or no components (adjusted hazard ratio, 3.9 [95% confidence interval, 1.4 to 10.9]). After further adjustments for subclinical or clinical diseases and routine biomarkers, the adjusted hazard ratio remained significant. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that signs of frailty appeared to be a risk factor for death in dogs. The concept of frailty in dogs requires further development. IMPACT FOR HUMAN MEDICINE The concept of frailty, as defined for humans, seems transposable to dogs. Given that they share humans' environments and develop several age-related diseases similar to those in humans, dogs may be useful for the study of environmental or age-related risk factors for frailty in humans.
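The univariate Kaplan-Meier comparison described in the procedures can be illustrated with lifelines, as in the hedged sketch below; the frailty scoring and data are simulated placeholders, not the CGA dataset.

# Hedged sketch: Kaplan-Meier curves and a log-rank test by frailty-phenotype count.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(9)
n = 116
df = pd.DataFrame({
    "frail_components": rng.integers(0, 6, n),   # number of CGA frailty items present (0-5)
    "years_to_death": rng.exponential(4.4, n),
    "died": rng.integers(0, 2, n),
})
frail = df["frail_components"] >= 2

kmf = KaplanMeierFitter()
for label, mask in [("frail (>=2 items)", frail), ("non-frail", ~frail)]:
    kmf.fit(df.loc[mask, "years_to_death"], df.loc[mask, "died"], label=label)
    print(label, kmf.median_survival_time_)

result = logrank_test(df.loc[frail, "years_to_death"], df.loc[~frail, "years_to_death"],
                      df.loc[frail, "died"], df.loc[~frail, "died"])
print(result.p_value)   # univariate comparison; the Cox model then adds age adjustment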
Parlett, Lauren E.; Bowman, Joseph D.; van Wijngaarden, Edwin
2015-01-01
Objective Epidemiologic evidence for the association between electromagnetic fields and amyotrophic lateral sclerosis, the most common form of motor neuron disease (MND), has been inconclusive. We evaluated the association between electromagnetic fields and MND among workers in occupations potentially exposed to magnetic fields. Methods MND mortality (ICD-9 335.2) was examined in the National Longitudinal Mortality Study using multivariable proportional hazards models. Occupational exposure to magnetic fields was determined on the basis of a population-based job-exposure matrix. Age at entry, education, race, sex, and income were considered for inclusion as covariates. Results After adjusting for age, sex, and education, there were no increased risks of MND mortality in relation to potential magnetic field exposure, with hazard ratios around the null in all magnetic field exposure quartiles. Conclusions Our study does not provide evidence for an association between magnetic field exposure and MND mortality. PMID:22076040
Stefler, Denes; Pikhart, Hynek; Kubinova, Ruzena; Pajak, Andrzej; Stepaniak, Urszula; Malyutina, Sofia; Simonova, Galina; Peasey, Anne; Marmot, Michael G; Bobak, Martin
2016-03-01
It is estimated that the disease burden due to low fruit and vegetable consumption is higher in Central and Eastern Europe (CEE) and the former Soviet Union (FSU) than in any other part of the world. However, no large-scale studies have yet investigated the association between fruit and vegetable (F&V) intake and mortality in these regions. The Health, Alcohol and Psychosocial Factors in Eastern Europe (HAPIEE) study is a prospective cohort study with participants recruited from the Czech Republic, Poland and Russia. Dietary data were collected using a food frequency questionnaire. Mortality data were ascertained through linkage with death registers. Multivariable-adjusted hazard ratios were calculated by Cox regression models. Among 19,333 disease-free participants at baseline, 1314 died over the mean follow-up of 7.1 years. After multivariable adjustment, we found a statistically significant inverse association between cohort-specific quartiles of F&V intake and stroke mortality: the highest vs lowest quartile hazard ratio (HR) was 0.52 (95% confidence interval (CI): 0.28-0.98). For total mortality, a significant interaction (p = 0.008) between F&V intake and smoking was found. The associations were statistically significant in smokers, with HR 0.70 (0.53-0.91, p for trend: 0.011) for total mortality, and 0.62 (0.40-0.97, p for trend: 0.037) for cardiovascular disease (CVD) mortality. The association appeared to be mediated by blood pressure, and F&V intake explained a considerable proportion of the mortality differences between the Czech and Russian cohorts. Our results suggest that increasing F&V intake may reduce CVD mortality in CEE and FSU, particularly among smokers and hypertensive individuals. © The European Society of Cardiology 2015.
Subclinical Hypothyroidism and Risk for Incident Myocardial Infarction Among Postmenopausal Women
LeGrys, Vicky A.; Funk, Michele Jonsson; Lorenz, Carol E.; Giri, Ayush; Jackson, Rebecca D.; Manson, JoAnn E.; Schectman, Robin; Edwards, Todd L.; Heiss, Gerardo
2013-01-01
Context: Subclinical hypothyroidism (SCH) has been associated with an increased risk for cardiovascular disease. However, few studies have specifically examined the association between SCH and myocardial infarction (MI), and the relationship is poorly understood. Objectives: The purpose of this study was to evaluate incident MI risk in relation to SCH and severities of SCH among postmenopausal women. Methods: We used a population-based nested case-cohort design within the Women's Health Initiative observational study to examine the association between SCH and incident first-time MI risk among postmenopausal women in the United States. SCH was assessed using blood specimens collected at baseline. Participants with normal free T4 levels and thyrotropin levels of greater than 4.68 up to 6.99 mU/L, or of 7.00 mU/L or greater, were defined as having mild SCH or moderate/severe SCH, respectively. MI cases were centrally adjudicated by trained Women's Health Initiative staff. The primary analysis included 736 incident MI cases and 2927 randomly selected subcohort members. Multivariable-adjusted Cox proportional hazard models were used to assess MI risk in relation to SCH. Results: Compared with euthyroid participants, the multivariable adjusted hazard ratio (HR) for participants with any SCH was 1.05 [95% confidence interval (CI) 0.77–1.44]. HRs for participants with mild SCH, moderate/severe SCH, and moderate/severe SCH with the presence of antithyroid peroxidase antibodies (TPOAb) were 0.99 (95% CI 0.67–1.46), 1.19 (95% CI 0.72–1.96), and 0.90 (95% CI 0.47–1.74), respectively. Conclusion: We did not find evidence to suggest that SCH is associated with increased MI risk among a population of predominantly older postmenopausal women with no prior history of MI. PMID:23539723
Castro-Quezada, Itandehui; Sánchez-Villegas, Almudena; Martínez-González, Miguel Á; Salas-Salvadó, Jordi; Corella, Dolores; Estruch, Ramón; Schröder, Helmut; Álvarez-Pérez, Jacqueline; Ruiz-López, María D; Artacho, Reyes; Ros, Emilio; Bulló, Mónica; Sorli, Jose V; Fitó, Montserrat; Ruiz-Gutiérrez, Valentina; Toledo, Estefanía; Buil-Cosiales, Pilar; García Rodríguez, Antonio; Lapetra, José; Pintó, Xavier; Salaverría, Itziar; Tur, Josep A; Romaguera, Dora; Tresserra-Rimbau, Anna; Serra-Majem, Lluís
2016-11-01
The objective of this study was to evaluate the prospective associations between dietary glycemic index (GI) and glycemic load (GL) and the risk for invasive breast cancer incidence in postmenopausal women at high cardiovascular disease (CVD) risk. This study was conducted within the framework of the PREvención con DIeta MEDiterránea (PREDIMED) study, a nutritional intervention trial for primary cardiovascular prevention. We included 4010 women aged between 60 and 80 years who were initially free from breast cancer but at high risk for CVD disease. Dietary information was collected using a validated 137-item food frequency questionnaire. We assigned GI values using the International Tables of GI and GL values. Cases were ascertained through yearly consultation of medical records and through consultation of the National Death Index. Only cases confirmed by results from cytology tests or histological evaluation were included. We estimated multivariable-adjusted hazard ratios for invasive breast cancer risk across tertiles of energy-adjusted dietary GI/GL using Cox regression models. We repeated our analyses using yearly repeated measures of GI/GL intakes. No associations were found between baseline dietary GI/GL and invasive breast cancer incidence. The multivariable hazard ratio and 95% confidence interval (CI) for the top tertile of dietary GI was 1.02 (95% CI: 0.42-2.46) and for dietary GL was 1.00 (95% CI: 0.44-2.30) when compared with the bottom tertile. Repeated-measures analyses yielded similar results. In sensitivity analyses, no significant associations were observed for women with obesity or diabetes. Dietary GI and GL did not appear to be associated with an increased risk for invasive breast cancer in postmenopausal women at high CVD risk.
Goto, Atsushi; Goto, Maki; Terauchi, Yasuo; Yamaguchi, Naohito; Noda, Mitsuhiko
2016-03-09
It remains unclear whether severe hypoglycemia is associated with cardiovascular disease (CVD) in Asian populations with type 2 diabetes (T2D). Furthermore, no study in Japan, where the prescription patterns differ from those in other countries, has examined this association. We retrospectively included 58 223 patients (18-74 years old) with T2D. First, we examined the potential predictors of severe hypoglycemia. Then, we investigated the association between severe hypoglycemia and CVD risk. Finally, we performed an updated systematic review and meta-analysis to incorporate our findings and recently published studies into the previous systematic review and meta-analysis. During 134 597 person-years of cumulative observation, 128 persons experienced severe hypoglycemia and 550 developed CVD events. In a multivariate Cox proportional hazard model, severe hypoglycemia was strongly and positively associated with the risk of CVD (multivariate-adjusted hazard ratio, 3.39; 95% CI, 1.25-9.18). In a propensity score-matched cohort that had similar baseline characteristics for patients with severe hypoglycemia and those without, severe hypoglycemia was more strongly associated with the risk of CVD. An updated systematic review and meta-analysis that included 10 studies found that severe hypoglycemia was associated with an ≈2-fold increased risk of CVD (pooled relative risk, 1.91; 95% CI, 1.69-2.15). Our results suggest that severe hypoglycemia is strongly associated with an increased risk of CVD in Japanese patients with T2D, further supporting the notion that avoiding severe hypoglycemia may be important in preventing CVD in this patient population. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
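The pooled relative risk reported for the meta-analysis is typically obtained by inverse-variance weighting of the study-level estimates on the log scale. The short sketch below illustrates the fixed-effect version of that calculation with made-up inputs; the numbers are placeholders, not the ten studies actually pooled.

# Hedged sketch: fixed-effect inverse-variance pooling of log relative risks.
import numpy as np

rr = np.array([2.1, 1.8, 1.6, 2.4, 1.9])           # study relative risks (hypothetical)
ci_upper = np.array([3.0, 2.6, 2.3, 3.9, 2.7])     # matching upper 95% CI bounds (hypothetical)

log_rr = np.log(rr)
se = (np.log(ci_upper) - log_rr) / 1.96            # standard errors recovered from CI width
w = 1.0 / se ** 2                                  # inverse-variance weights
pooled_log = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
pooled_rr = np.exp(pooled_log)
pooled_ci = np.exp(pooled_log + np.array([-1.96, 1.96]) * pooled_se)
print(pooled_rr, pooled_ci)                        # pooled RR with its 95% CI

A random-effects pooling would additionally estimate a between-study variance and widen the weights accordingly.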
Nguyen, Paul L., E-mail: pnguyen@LROC.harvard.ed; Department of Radiation Oncology, Dana Farber Cancer Institute and Brigham and Women's Hospital, Boston, MA; Chen, Ming-Hui
2010-02-01
Purpose: The U.S. Preventive Services Task Force has recommended against screening men over 75 for prostate cancer. We examined whether older healthy men could benefit from aggressive prostate cancer treatment. Methods and Materials: 206 men with intermediate to high risk localized prostate cancer randomized to 70 Gy of radiation (RT) or RT plus 6 months of androgen suppression therapy (RT+AST) constituted the study cohort. Within subgroups stratified by Adult Comorbidity Evaluation-27 comorbidity score and age, Cox multivariable analysis was used to determine whether treatment with RT+AST as compared with RT was associated with a decreased risk of death. Results: Among healthy men (i.e., with mild or no comorbidity), 78 were older than the median age of 72.4 years, and in this subgroup, RT+AST was associated with a significantly lower risk of death on multivariable analysis (adjusted hazard ratio = 0.36, 95% CI = 0.13-0.98, p = 0.046), with significantly lower 8-year mortality estimates of 16.5% vs. 41.4% (p = 0.011). Conversely, among men with moderate or severe comorbidity, 24 were older than the median age of 73, and in this subgroup, treatment with RT+AST was associated with a higher risk of death (adjusted hazard ratio = 5.2 (1.3-20.2), p = 0.018). Conclusion: In older men with mild or no comorbidity, treatment with RT+AST was associated with improved survival compared with treatment with RT alone, suggesting that healthy older men may derive the same benefits from prostate cancer treatment as younger men. We therefore suggest that prostate cancer screening recommendations should not be based on strict age cutoffs alone but should also take into account comorbidity.
Castleberry, A W; Güller, U; Tarantino, I; Berry, M F; Brügger, L; Warschkow, R; Cerny, T; Mantyh, C R; Candinas, D; Worni, M
2014-06-01
Recently, multiple clinical trials have demonstrated improved outcomes in patients with metastatic colorectal cancer. This study investigated if the improved survival is race dependent. Overall and cancer-specific survival of 77,490 White and Black patients with metastatic colorectal cancer from the 1988-2008 Surveillance Epidemiology and End Results registry were compared using unadjusted and multivariable adjusted Cox proportional hazard regression as well as competing risk analyses. Median age was 69 years, 47.4 % were female and 86.0 % White. Median survival was 11 months overall, with an overall increase from 8 to 14 months between 1988 and 2008. Overall survival increased from 8 to 14 months for White, and from 6 to 13 months for Black patients. After multivariable adjustment, the following parameters were associated with better survival: White, female, younger, better educated and married patients, patients with higher income and living in urban areas, patients with rectosigmoid junction and rectal cancer, undergoing cancer-directed surgery, having well/moderately differentiated, and N0 tumors (p < 0.05 for all covariates). Discrepancies in overall survival based on race did not change significantly over time; however, there was a significant decrease of cancer-specific survival discrepancies over time between White and Black patients with a hazard ratio of 0.995 (95 % confidence interval 0.991-1.000) per year (p = 0.03). A clinically relevant overall survival increase was found from 1988 to 2008 in this population-based analysis for both White and Black patients with metastatic colorectal cancer. Although both White and Black patients benefitted from this improvement, a slight discrepancy between the two groups remained.
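Competing-risk analyses of cancer-specific survival are often implemented as cause-specific Cox models, in which deaths from other causes are treated as censoring; cumulative-incidence approaches are an alternative. The sketch below shows the cause-specific variant on simulated data; the variable names are placeholders and the abstract does not specify the authors' exact implementation.

# Hedged sketch: a cause-specific Cox model for cancer death, censoring other-cause deaths.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 4000
df = pd.DataFrame({
    "black": rng.integers(0, 2, n),
    "age": rng.normal(69.0, 12.0, n),
    "months": rng.exponential(11.0, n),
    # cause: 0 = alive/censored, 1 = cancer death, 2 = death from another cause
    "cause": rng.choice([0, 1, 2], size=n, p=[0.3, 0.5, 0.2]),
})
df["cancer_death"] = (df["cause"] == 1).astype(int)   # competing deaths become censored

csh = CoxPHFitter().fit(df[["black", "age", "months", "cancer_death"]],
                        duration_col="months", event_col="cancer_death")
csh.print_summary()   # cause-specific HR for race, adjusted for age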
Siontis, Konstantinos C.; Geske, Jeffrey B.; Ong, Kevin; Nishimura, Rick A.; Ommen, Steve R.; Gersh, Bernard J.
2014-01-01
Background Atrial fibrillation (AF) is a common sequela of hypertrophic cardiomyopathy (HCM), but evidence on its prevalence, risk factors, and effect on mortality is sparse. We sought to evaluate the prevalence of AF, identify clinical and echocardiographic correlates, and assess its effect on mortality in a large high‐risk HCM population. Methods and Results We identified HCM patients who underwent evaluation at our institution from 1975 to 2012. AF was defined by known history (either chronic or paroxysmal), electrocardiogram, or Holter monitoring at index visit. We examined clinical and echocardiographic variables in association with AF. The effect of AF on overall and cause‐specific mortality was evaluated with multivariate Cox proportional hazards models. Of 3673 patients with HCM, 650 (18%) had AF. Patients with AF were older and more symptomatic (P<0.001). AF was less common among patients with obstructive HCM phenotype and was associated with larger left atria, higher E/e’ ratios, and worse cardiopulmonary exercise tolerance (all P values<0.001). During median (interquartile range) follow‐up of 4.1 (0.2 to 10) years, 1069 (29%) patients died. Patients with AF had worse survival compared to those without AF (P<0.001). In multivariate analysis adjusted for established risk factors of mortality in HCM, the hazard ratio (95% confidence interval) for the effect of AF on overall mortality was 1.48 (1.27 to 1.71). AF did not have an effect on sudden or nonsudden cardiac death. Conclusions In this large referral HCM population, approximately 1 in 5 patients had AF. AF was a strong predictor of mortality, even after adjustment for established risk factors. PMID:24965028
Olszewski, Adam J., E-mail: adam_olszewski@brown.edu; Desai, Amrita
2014-03-01
Purpose: To determine the factors associated with the use of radiation therapy and associated survival outcomes in early-stage marginal zone lymphoma of the mucosa-associated lymphoid tissue (MALT). Methods and Materials: We extracted data on adult patients with stage I/II MALT lymphoma diagnoses between 1998 and 2010 recorded in the Surveillance, Epidemiology, and End Results (SEER) database. We studied factors associated with radiation therapy administration in a logistic regression model and described the cumulative incidence of lymphoma-related death (LRD) according to receipt of the treatment. The association of radiation therapy with survival was explored in multivariate models with adjustment for immortal time bias. Results: Of the 7774 identified patients, 36% received radiation therapy as part of the initial course of treatment. Older patients; black or Hispanic men; white, Hispanic, and black women; and socioeconomically disadvantaged and underinsured patients had a significantly lower chance of receiving radiation therapy. Radiation therapy administration was associated with a lower chance of LRD in most sites. In cutaneous, ocular, and salivary MALT lymphomas, the 5-year estimate of LRD after radiation therapy was 0%. The association of radiation therapy with overall survival in different lymphoma sites was heterogeneous, and statistically significant in cutaneous (hazard ratio 0.45, P=.009) and ocular (hazard ratio 0.47, P<.0001) locations after multivariate adjustment. Conclusions: Demographic factors are associated with the use of radiation therapy in MALT lymphoma. Clinicians should be sensitive to those disparities because the administration of radiation therapy may be associated with improved survival, particularly in cutaneous and ocular lymphomas.
2014-01-01
Introduction Current practice in the delivery of caloric intake (DCI) in patients with severe acute kidney injury (AKI) receiving renal replacement therapy (RRT) is unknown. We aimed to describe calorie administration in patients enrolled in the Randomized Evaluation of Normal vs. Augmented Level of Replacement Therapy (RENAL) study and to assess the association between DCI and clinical outcomes. Methods We performed a secondary analysis in 1456 patients from the RENAL trial. We measured the dose and evolution of DCI during treatment and analyzed its association with major clinical outcomes using multivariable logistic regression, Cox proportional hazards models, and time adjusted models. Results Overall, mean DCI during treatment in ICU was low at only 10.9 ± 9 Kcal/kg/day for non-survivors and 11 ± 9 Kcal/kg/day for survivors. Among patients with a lower DCI (below the median) 334 of 729 (45.8%) had died at 90-days after randomization compared with 316 of 727 (43.3%) patients with a higher DCI (above the median) (P = 0.34). On multivariable logistic regression analysis, mean DCI carried an odds ratio of 0.95 (95% confidence interval (CI): 0.91-1.00; P = 0.06) per 100 Kcal increase for 90-day mortality. DCI was not associated with significant differences in renal replacement (RRT) free days, mechanical ventilation free days, ICU free days and hospital free days. These findings remained essentially unaltered after time adjusted analysis and Cox proportional hazards modeling. Conclusions In the RENAL study, mean DCI was low. Within the limits of such low caloric intake, greater DCI was not associated with improved clinical outcomes. Trial registration ClinicalTrials.gov number, NCT00221013 PMID:24629036
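The odds ratio per 100-kcal increase quoted above corresponds to rescaling the exposure before fitting the logistic model, as in this hedged sketch; the simulated data and the column names (dci_kcal, apache, died_90d) are hypothetical stand-ins for the RENAL variables.

# Hedged sketch: logistic regression with the exposure rescaled so that one unit = 100 kcal.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 1456
df = pd.DataFrame({
    "dci_kcal": rng.gamma(10.0, 80.0, n),       # mean daily caloric intake (kcal/day)
    "apache": rng.normal(22.0, 6.0, n),         # illness-severity covariate (placeholder)
    "died_90d": rng.integers(0, 2, n),          # 90-day mortality indicator
})
df["dci_100"] = df["dci_kcal"] / 100.0          # one model unit = 100 kcal/day

fit = smf.logit("died_90d ~ dci_100 + apache", data=df).fit(disp=False)
print(np.exp(fit.params["dci_100"]))            # odds ratio per 100-kcal increase
print(np.exp(fit.conf_int().loc["dci_100"]))    # its 95% confidence interval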
Dorner, T E; Alexanderson, K; Svedberg, P; Ropponen, A; Stein, K V; Mittendorfer-Rutz, E
2015-10-01
The aim of this study was to investigate the associations of sickness absence due to back pain or depressive episode with future all-cause and diagnosis-specific disability pension, adjusting for comorbidity and socio-demographics, for all individuals and stratified by sex. In total, 4,823,069 individuals aged 16-64 years, living in Sweden at the end of 2004, not on old-age or disability pension in 2005 and without ongoing sickness absence at the turn of 2004/2005 formed the study population. Crude and adjusted hazard ratios (HRs) for all-cause and diagnosis-specific disability pension (2006-2010) in relation to diagnosis-specific sickness absence with sickness benefits paid by the Social Insurance Agency were estimated using Cox regression. In the fully adjusted (socio-demographics and comorbidity) model, the HR for all-cause disability pension was 7.52 (7.25-7.52) in individuals with an incident sick-leave spell due to back pain, compared to individuals without sickness absence in 2005. The fully adjusted (multivariate) HRs for diagnosis-specific disability pension were 23.87 (22.75-25.04) for musculoskeletal diagnoses, 2.49 (2.27-2.73) for mental diagnoses, and 3.44 (3.17-3.75) for all other diagnoses. In individuals with an incident sick-leave spell due to a depressive episode in 2005, the multivariate adjusted HR for all-cause disability pension was 12.87 (12.42-13.35), while the multivariate HRs for diagnosis-specific disability pension were 4.39 (3.89-4.96) for musculoskeletal diagnoses, 25.32 (24.29-26.38) for mental diagnoses, and 3.44 (3.09-3.82) for all other somatic diagnoses. Men who were sickness absent due to a depressive episode had a higher HR for disability pension compared to women. Results indicate that sickness absence due to a depressive episode or back pain is a strong risk factor for a future disability pension due to mental, musculoskeletal or other somatic diagnoses. © 2015 European Pain Federation - EFIC®
Confounder summary scores when comparing the effects of multiple drug exposures.
Cadarette, Suzanne M; Gagne, Joshua J; Solomon, Daniel H; Katz, Jeffrey N; Stürmer, Til
2010-01-01
Little information is available comparing methods to adjust for confounding when considering multiple drug exposures. We compared three analytic strategies to control for confounding based on measured variables: conventional multivariable adjustment, exposure propensity scores (EPS), and disease risk scores (DRS). Each method was applied to a dataset (2000-2006) recently used to examine the comparative effectiveness of four drugs. The relative effectiveness of risedronate, nasal calcitonin, and raloxifene in preventing non-vertebral fracture was compared, in each case, with that of alendronate. EPSs were derived both by using multinomial logistic regression (single model EPS) and by three separate logistic regression models (separate model EPS). DRSs were derived and event rates compared using Cox proportional hazards models. DRSs derived among the entire cohort (full cohort DRS) were compared to DRSs derived only among the referent alendronate users (unexposed cohort DRS). Less than 8% deviation from the base estimate (conventional multivariable) was observed applying the single model EPS, separate model EPS, or full cohort DRS. Applying the unexposed cohort DRS when background risk for fracture differed between comparison drug exposure cohorts resulted in -7 to +13% deviation from our base estimate. With sufficient numbers of exposed patients and outcomes, conventional multivariable adjustment, EPS, or full cohort DRS may be used to adjust for confounding to compare the effects of multiple drug exposures. However, our data also suggest that the unexposed cohort DRS may be problematic when background risks differ between referent and exposed groups. Further empirical and simulation studies will help to clarify the generalizability of our findings.
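The two confounder summary scores compared above can be sketched in a few lines. The following is a simplified, hypothetical example (synthetic data; lifelines and scikit-learn) of a single-model exposure propensity score from multinomial logistic regression and a full-cohort disease risk score carried into the outcome model; it is not the authors' code, and the full-cohort DRS here omits exposure terms for brevity.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
# Hypothetical new-user cohort: four drugs, alendronate as the referent exposure.
df = pd.DataFrame({
    "drug": rng.choice(["alendronate", "risedronate", "calcitonin", "raloxifene"], n),
    "age": rng.normal(74, 8, n),
    "prior_fracture": rng.binomial(1, 0.2, n).astype(float),
    "glucocorticoid_use": rng.binomial(1, 0.1, n).astype(float),
    "time": rng.exponential(3.0, n),        # years of follow-up
    "fracture": rng.binomial(1, 0.15, n),   # non-vertebral fracture indicator
})
confounders = ["age", "prior_fracture", "glucocorticoid_use"]

# 1) Single-model exposure propensity scores: one multinomial model for all four drugs.
eps_model = LogisticRegression(max_iter=1000).fit(df[confounders], df["drug"])
eps = pd.DataFrame(eps_model.predict_proba(df[confounders]),
                   columns=eps_model.classes_, index=df.index)
# (eps would analogously enter the outcome model, e.g. as deciles or matching variables.)

# 2) Full-cohort disease risk score: model fracture risk from confounders in everyone
#    and keep the linear predictor as a single summary covariate.
drs_fit = CoxPHFitter().fit(df[confounders + ["time", "fracture"]],
                            duration_col="time", event_col="fracture")
df["drs"] = drs_fit.predict_log_partial_hazard(df)

# 3) Outcome model: each drug vs alendronate, adjusted for the disease risk score.
design = pd.get_dummies(df[["time", "fracture", "drs", "drug"]],
                        columns=["drug"], dtype=float)
design = design.drop(columns=["drug_alendronate"])   # alendronate is the referent
CoxPHFitter().fit(design, duration_col="time", event_col="fracture").print_summary()
```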
Hayes, Don; Kopp, Benjamin T; Tobias, Joseph D; Woodley, Frederick W; Mansour, Heidi M; Tumin, Dmitry; Kirkby, Stephen E
2015-12-01
Survival in non-cystic fibrosis (CF) bronchiectasis is not well studied. The United Network for Organ Sharing database was queried from 1987 to 2013 to compare survival in adult patients with non-CF bronchiectasis to patients with CF listed for lung transplantation (LTx). Each subject was tracked from waitlist entry date until death or censoring to determine survival differences between the two groups. Of 2112 listed lung transplant candidates with bronchiectasis (180 non-CF, 1932 CF), 1617 were used for univariate Cox and Kaplan-Meier survival function analysis, 1173 for multivariate Cox models, and 182 for matched-pairs analysis based on propensity scores. Compared to CF, patients with non-CF bronchiectasis had a significantly lower mortality by univariate Cox analysis (HR 0.565; 95 % CI 0.424, 0.754; p < 0.001). Adjusting for potential confounders, multivariate Cox models identified a significant reduction in risk for death associated with non-CF bronchiectasis among lung transplant candidates (HR 0.684; 95 % CI 0.475, 0.985; p = 0.041). Results were consistent in multivariate models adjusting for pulmonary hypertension and forced expiratory volume in one second. Non-CF bronchiectasis with advanced lung disease was associated with a significantly lower mortality hazard compared to CF bronchiectasis on the waitlist for LTx. Separate referral and listing criteria for LTx in non-CF and CF populations should be considered.
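A propensity-score matched-pairs comparison of the kind used above can be sketched as follows; the greedy 1:1 nearest-neighbour matching and the covariates (age, fev1_pct, pulm_htn) are illustrative assumptions, not the study's actual matching algorithm or variables.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1500
# Hypothetical waitlist cohort: non_cf = 1 for non-CF bronchiectasis, 0 for CF.
df = pd.DataFrame({
    "non_cf": rng.binomial(1, 0.1, n),
    "age": rng.normal(40, 15, n),
    "fev1_pct": rng.normal(30, 10, n),
    "pulm_htn": rng.binomial(1, 0.3, n).astype(float),
    "time": rng.exponential(2.0, n),      # years from listing
    "death": rng.binomial(1, 0.4, n),
})
covs = ["age", "fev1_pct", "pulm_htn"]

# Propensity score: probability of being a non-CF candidate given covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covs], df["non_cf"])
df["ps"] = ps_model.predict_proba(df[covs])[:, 1]

# Greedy 1:1 nearest-neighbour matching on the propensity score, without replacement.
treated = df[df["non_cf"] == 1]
controls = df[df["non_cf"] == 0].copy()
matched_idx = []
for idx, row in treated.iterrows():
    j = (controls["ps"] - row["ps"]).abs().idxmin()
    matched_idx.extend([idx, j])
    controls = controls.drop(index=j)
matched = df.loc[matched_idx]

# Cox model on the matched sample: hazard ratio for non-CF vs CF.
CoxPHFitter().fit(matched[["time", "death", "non_cf"]],
                  duration_col="time", event_col="death").print_summary()
```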
Ryu, Seungho; Chang, Yoosoo; Zhang, Yiyi; Woo, Hee-Yeon; Kwon, Min-Jung; Park, Hyosoon; Lee, Kyu-Beck; Son, Hee Jung; Cho, Juhee; Guallar, Eliseo
2014-01-01
Background The association between serum bilirubin levels and incident chronic kidney disease (CKD) in the general population is unknown. We aimed to examine the association between serum bilirubin concentration (total, direct, and indirect) and the risk of incident CKD. Methods and Findings Longitudinal cohort study of 12,823 Korean male workers 30 to 59 years old without CKD or proteinuria at baseline participating in a medical health checkup program at a large worksite. Study participants were followed for incident CKD from 2002 through 2011. Glomerular filtration rate (eGFR) was estimated using the CKD-EPI equation. CKD was defined as eGFR <60 mL/min per 1.73 m2. Parametric Cox models and pooled logistic regression models were used to estimate adjusted hazard ratios for incident CKD. We observed 238 incident cases of CKD during 70,515.8 person-years of follow-up. In age-adjusted models, the hazard ratios for CKD comparing quartiles 2–4 vs. quartile 1 of serum direct bilirubin were 0.93 (95% CI 0.67–1.28), 0.88 (0.60–1.27) and 0.60 (0.42–0.88), respectively. In multivariable models, the adjusted hazard ratio for CKD comparing the highest to the lowest quartile of serum direct bilirubin levels was 0.60 (95% CI 0.41–0.87; P trend = 0.01). Neither serum total nor indirect bilirubin levels were significantly associated with the incidence of CKD. Conclusions Higher serum direct bilirubin levels were significantly associated with a lower risk of developing CKD, even after adjusting for a variety of cardiometabolic parameters. Further research is needed to elucidate the mechanisms underlying this association and to establish the role of serum direct bilirubin as a marker for CKD risk. PMID:24586219
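A quartile-based Cox analysis with a test for trend, of the kind reported above, can be set up along these lines; all data and column names are synthetic placeholders rather than the cohort's variables.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "direct_bili": rng.gamma(2.0, 0.1, n),     # mg/dL, hypothetical exposure
    "age": rng.normal(42, 7, n),
    "time": rng.exponential(8.0, n),           # years of follow-up
    "ckd": rng.binomial(1, 0.02, n),           # incident CKD indicator
})

# Quartile indicators with Q1 as the reference category.
df["bili_q"] = pd.qcut(df["direct_bili"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
design = pd.get_dummies(df[["time", "ckd", "age", "bili_q"]],
                        columns=["bili_q"], drop_first=True, dtype=float)
CoxPHFitter().fit(design, duration_col="time", event_col="ckd").print_summary()  # HRs Q2-Q4 vs Q1

# P for trend: refit with the quartile entered as a single ordinal term.
df["bili_q_ord"] = df["bili_q"].cat.codes.astype(float)
trend = CoxPHFitter().fit(df[["time", "ckd", "age", "bili_q_ord"]],
                          duration_col="time", event_col="ckd")
print("p-trend:", trend.summary.loc["bili_q_ord", "p"])
```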
Shams, Tanzila; Auchus, Alexander P; Oparil, Suzanne; Wright, Clinton B; Wright, Jackson; Furlan, Anthony J; Sila, Cathy A; Davis, Barry R; Pressel, Sara; Yamal, Jose-Miguel; Einhorn, Paula T; Lerner, Alan J
2017-11-01
The visual analogue scale is a self-reported, validated tool to measure quality of life (QoL). Our purpose was to determine whether baseline QoL predicted strokes in the ALLHAT study (Antihypertensive and Lipid Lowering Treatment to Prevent Heart Attack Trial) and evaluate determinants of poststroke change in QoL. In the ALLHAT study, among the 33 357 patients randomized to treatment arms, 1525 experienced strokes; 1202 (79%) strokes were nonfatal. This study cohort includes 32 318 (97%) subjects who completed the baseline visual analogue scale QoL estimate. QoL was measured on a visual analogue scale and adjusted using a Torrance transformation (transformed QoL [TQoL]). Kaplan-Meier curves and adjusted proportional hazards analyses were used to estimate the effect of TQoL on the risk of stroke, on a continuous scale (0-1) and by quartiles (≤0.81, >0.81 to ≤0.89, >0.89 to ≤0.95, >0.95). We analyzed the change from baseline to first poststroke TQoL using adjusted linear regression. After adjusting for multiple stroke risk factors, the hazard ratio for stroke events for baseline TQoL was 0.93 (95% confidence interval, 0.89-0.98) per 0.1 U increase. The lowest baseline TQoL quartile had a 20% increased stroke risk (hazard ratio=1.20 [95% confidence interval, 1.00-1.44]) compared with the reference highest quartile TQoL. Poststroke TQoL change was significant within all treatment groups (P≤0.001). Multivariate regression analysis revealed that baseline TQoL was the strongest predictor of poststroke TQoL, with similar results for the untransformed QoL. The lowest baseline TQoL quartile had a 20% higher stroke risk than the highest quartile. Baseline TQoL was the only factor that predicted poststroke change in TQoL. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00000542. © 2017 American Heart Association, Inc.
Investigation of risk factors for mortality in aged guide dogs: A retrospective cohort study.
Hoummady, S; Hua, J; Muller, C; Pouchelon, J L; Blondot, M; Gilbert, C; Desquilbet, L
2016-09-15
The overall median lifespan of domestic dogs has been estimated at 9-12 years, but little is known about risk factors for mortality in aged and a priori healthy dogs. The objective of this retrospective cohort study was to determine which characteristics are associated with mortality in aged and a priori healthy guide dogs, based on 116 guide dogs followed from a systematic geriatric examination at the age of 8-10 years. A geriatric grid collected the clinical data, and usual biological parameters were measured at the time of examination. Univariate (Kaplan-Meier estimates) and multivariable (Cox proportional hazards model) survival analyses were used to assess the associations with time to all-cause death. The majority of dogs were Golden Retrievers (n=48) and Labrador Retrievers (n=27). Median age at geriatric examination was 8.9 years. A total of 76 dogs died during follow-up, leading to a median survival time from geriatric examination of 4.4 years. After adjustment for demographic and biological variables, an increased alanine aminotransferase level (adjusted hazard ratio [adjusted HR], 6.2; 95% confidence interval [95% CI], 2.0-19.0; P<0.01), presenting skin nodules (adjusted HR, 1.9; 95% CI, 1.0-3.4; P=0.04), and not being a Labrador Retriever (adjusted HR, 3.3; 95% CI, 1.4-10; P<0.01) were independently associated with a shorter time to death. This study documents independent associations of alanine aminotransferase level, skin nodules and breed with mortality in aged guide dogs. These results may be useful for preventive medical care when conducting a geriatric examination in working dogs. Copyright © 2016 Elsevier B.V. All rights reserved.
Disparities in breast cancer surgery delay: the lingering effect of race.
Sheppard, Vanessa B; Oppong, Bridget A; Hampton, Regina; Snead, Felicia; Horton, Sara; Hirpa, Fikru; Brathwaite, Echo J; Makambi, Kepher; Onyewu, S; Boisvert, Marc; Willey, Shawna
2015-09-01
Delays to surgical breast cancer treatment of 90 days or more may be associated with greater stage migration. We investigated racial disparities in time to receiving first surgical treatment in breast cancer patients. Insured black (56 %) and white (44 %) women with primary breast cancer completed telephone interviews regarding psychosocial (e.g., self-efficacy) and health care factors (e.g., communication). Clinical data were extracted from medical charts. Time to surgery was measured as the days between diagnosis and definitive surgical treatment. We also examined delays of more than 90 days. Unadjusted hazard ratios (HRs) examined univariate relationships between delay outcomes and covariates. Cox proportional hazard models were used for multivariate analyses. Mean time to surgery was higher in blacks (mean 47 days) than whites (mean 33 days; p = .001). Black women were less likely to receive therapy before 90 days compared to white women after adjustment for covariates (HR .58; 95 % confidence interval .44, .78). Health care process factors were nonsignificant in multivariate models. Women with shorter delay reported Internet use (vs. not) and underwent breast-conserving surgery (vs. mastectomy) (p < .01). Prolonged delays to definitive breast cancer surgery persist among black women. Because the 90-day interval has been associated with poorer outcomes, interventions to address delay are needed.
Kikuya, Masahiro; Staessen, Jan A; Ohkubo, Takayoshi; Thijs, Lutgarde; Metoki, Hirohito; Asayama, Kei; Obara, Taku; Inoue, Ryusuke; Li, Yan; Dolan, Eamon; Hoshi, Haruhisa; Hashimoto, Junichiro; Totsune, Kazuhito; Satoh, Hiroshi; Wang, Ji-Guang; O'Brien, Eoin; Imai, Yutaka
2007-04-01
Ambulatory arterial stiffness index (AASI) and pulse pressure (PP) are indexes of arterial stiffness and can be computed from 24-hour blood pressure recordings. We investigated the prognostic value of AASI and PP in relation to fatal outcomes. In 1542 Ohasama residents (baseline age, 40 to 93 years; 63.4% women), we applied Cox regression to relate mortality to AASI and PP while adjusting for sex, age, BMI, 24-hour MAP, smoking and drinking habits, diabetes mellitus, and a history of cardiovascular disease. During 13.3 years (median), 126 cardiovascular and 63 stroke deaths occurred. The sex- and age-standardized incidence rates of cardiovascular and stroke mortality across quartiles were U-shaped for AASI and J-shaped for PP. Across quartiles, the multivariate-adjusted hazard ratios for cardiovascular and stroke death significantly deviated from those in the whole population in a U-shaped fashion for AASI, whereas for PP, none of the HRs departed from the overall risk. The hazard ratios for cardiovascular mortality across ascending AASI quartiles were 1.40 (P=0.04), 0.82 (P=0.25), 0.64 (P=0.01), and 1.35 (P=0.03). Additional adjustment of AASI for PP and sensitivity analyses by sex, excluding patients on antihypertensive treatment or with a history of cardiovascular disease, or censoring deaths occurring within 2 years of enrollment, produced confirmatory results. In a Japanese population, AASI predicted cardiovascular and stroke mortality over and beyond PP and other risk factors, whereas in adjusted analyses, PP did not carry any prognostic information.
A method for analyzing clustered interval-censored data based on Cox's model.
Kor, Chew-Teng; Cheng, Kuang-Fu; Chen, Yi-Hau
2013-02-28
Methods for analyzing interval-censored data are well established. Unfortunately, these methods are inappropriate for studies with correlated data. In this paper, we focus on developing a method for analyzing clustered interval-censored data. Our method is based on Cox's proportional hazards model with a piecewise-constant baseline hazard function. The correlation structure of the data can be modeled by using Clayton's copula or an independence working model with proper adjustment of the covariance estimation. We establish estimating equations for the regression parameters and baseline hazards (and a copula parameter) simultaneously. Simulation results confirm that the point estimators follow a multivariate normal distribution, and our proposed variance estimators are reliable. In particular, we found that the approach with the independence working model performed well even when the true correlation model was derived from Clayton's copula. We applied our method to a family-based cohort study of pandemic H1N1 influenza in Taiwan during 2009-2010. Using the proposed method, we investigate the impact of vaccination and family contacts on the incidence of pH1N1 influenza. Copyright © 2012 John Wiley & Sons, Ltd.
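For orientation, the likelihood such estimating equations are built around can be written compactly. The sketch below gives one subject's contribution under a proportional hazards model with a piecewise-constant baseline hazard and an interval-censored event time; the notation (cut points a_k, piecewise hazards lambda_k) is ours, and the clustered extension, in which within-cluster dependence is induced by Clayton's copula, is omitted.

```latex
% Subject i has covariates x_i and an event time known only to lie in (L_i, R_i];
% right-censored subjects take R_i = \infty. The baseline hazard equals \lambda_k
% on the cut interval (a_{k-1}, a_k].
\[
  S(t \mid x_i) = \exp\!\left\{-e^{\beta^{\top} x_i}\,\Lambda_0(t)\right\},
  \qquad
  \Lambda_0(t) = \sum_{k} \lambda_k \,\bigl|\,(a_{k-1}, a_k] \cap (0, t]\,\bigr|,
\]
\[
  L_i(\beta, \lambda) = S(L_i \mid x_i) - S(R_i \mid x_i).
\]
```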
Physically and psychologically hazardous jobs and mental health in Thailand.
Yiengprugsawan, Vasoontara; Strazdins, Lyndall; Lim, Lynette L-Y; Kelly, Matthew; Seubsman, Sam-ang; Sleigh, Adrian C
2015-09-01
This paper investigates associations between hazardous jobs, mental health and wellbeing among Thai adults. In 2005, 87 134 distance-learning students from Sukhothai Thammathirat Open University completed a self-administered questionnaire; at the 2009 follow-up 60 569 again participated. Job characteristics were reported in 2005, psychological distress and life satisfaction were reported in both 2005 and 2009. We derived two composite variables grading psychologically and physically hazardous jobs and reported adjusted odds ratios (AOR) from multivariate logistic regressions. Analyses focused on cohort members in paid work: the total was 62 332 at 2005 baseline and 41 671 at 2009 follow-up. Cross-sectional AORs linking psychologically hazardous jobs to psychological distress ranged from 1.52 (one hazard) to 4.48 (four hazards) for males and a corresponding 1.34-3.76 for females. Similarly AORs for physically hazardous jobs were 1.75 (one hazard) to 2.76 (four or more hazards) for males and 1.70-3.19 for females. A similar magnitude of associations was found between psychologically adverse jobs and low life satisfaction (AORs of 1.34-4.34 among males and 1.18-3.63 among females). Longitudinal analyses confirm these cross-sectional relationships. Thus, significant dose-response associations were found linking hazardous job exposures in 2005 to mental health and wellbeing in 2009. The health impacts of psychologically and physically hazardous jobs in developed, Western countries are equally evident in transitioning Southeast Asian countries such as Thailand. Regulation and monitoring of work conditions will become increasingly important to the health and wellbeing of the Thai workforce. © The Author 2013. Published by Oxford University Press.
Chen, James X.; Rose, Steven; White, Sarah B.
Purpose: The purpose of the study was to evaluate prognostic factors for survival outcomes following embolotherapy for neuroendocrine tumor (NET) liver metastases. Materials and Methods: This was a multicenter retrospective study of 155 patients (60 years mean age, 57 % male) with NET liver metastases from pancreas (n = 71), gut (n = 68), lung (n = 8), or other/unknown (n = 8) primary sites treated with conventional transarterial chemoembolization (TACE, n = 50), transarterial radioembolization (TARE, n = 64), or transarterial embolization (TAE, n = 41) between 2004 and 2015. Patient-, tumor-, and treatment-related factors were evaluated for prognostic effect on hepatic progression-free survival (HPFS) and overall survival (OS) using unadjusted and propensity score-weighted univariate and multivariate Cox proportional hazards models. Results: Median HPFS and OS were 18.5 and 125.1 months for G1 (n = 75), 12.2 and 33.9 months for G2 (n = 60), and 4.9 and 9.3 months for G3 tumors (n = 20), respectively (p < 0.05). Tumor burden >50 % hepatic volume demonstrated 5.5- and 26.8-month shorter median HPFS and OS, respectively, versus burden ≤50 % (p < 0.05). There were no significant differences in HPFS or OS between gut or pancreas primaries. In multivariate HPFS analysis, there were no significant differences among embolotherapy modalities. In multivariate OS analysis, TARE had a higher hazard ratio than TACE (unadjusted Cox model: HR 2.1, p = 0.02; propensity score adjusted model: HR 1.8, p = 0.11), while TAE did not differ significantly from TACE. Conclusion: Higher tumor grade and tumor burden prognosticated shorter HPFS and OS. TARE had a higher hazard ratio for OS than TACE. There were no significant differences in HPFS among embolotherapy modalities.
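Propensity score-weighted Cox models for a three-arm comparison like TACE vs TARE vs TAE can be sketched with stabilized inverse-probability-of-treatment weights; the example below uses synthetic data and assumed covariate names (grade, burden_gt50) and is not the authors' analysis.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 600
df = pd.DataFrame({
    "therapy": rng.choice(["TACE", "TARE", "TAE"], n),
    "grade": rng.integers(1, 4, n).astype(float),            # tumor grade 1-3
    "burden_gt50": rng.binomial(1, 0.3, n).astype(float),    # >50% hepatic tumor burden
    "time": rng.exponential(24.0, n),                        # months to death or censoring
    "death": rng.binomial(1, 0.6, n),
})
covs = ["grade", "burden_gt50"]

# Multinomial propensity model for the three-level treatment.
ps_model = LogisticRegression(max_iter=1000).fit(df[covs], df["therapy"])
probs = pd.DataFrame(ps_model.predict_proba(df[covs]),
                     columns=ps_model.classes_, index=df.index)
p_received = probs.to_numpy()[np.arange(n), probs.columns.get_indexer(df["therapy"])]

# Stabilized inverse-probability-of-treatment weights.
marginal = df["therapy"].value_counts(normalize=True)
df["iptw"] = df["therapy"].map(marginal).to_numpy() / p_received

# Weighted Cox model with TACE as the referent; robust (sandwich) standard errors.
design = pd.get_dummies(df[["time", "death", "therapy"]], columns=["therapy"], dtype=float)
design = design.drop(columns=["therapy_TACE"])
design["iptw"] = df["iptw"]
cph = CoxPHFitter().fit(design, duration_col="time", event_col="death",
                        weights_col="iptw", robust=True)
cph.print_summary()   # hazard ratios for TARE and TAE vs TACE
```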
Chatterjee, Ranee; Yeh, Hsin-Chieh; Shafi, Tariq; Selvin, Elizabeth; Anderson, Cheryl; Pankow, James S.; Miller, Edgar; Brancati, Frederick
2012-01-01
Background Serum potassium levels affect insulin secretion by pancreatic beta-cells, and hypokalemia related to diuretic use has been associated with dysglycemia. We hypothesized that adults with lower serum potassium levels and lower dietary potassium intake are at higher risk for incident diabetes, independent of diuretic use. Methods We analyzed data from 12,209 participants in the Atherosclerosis Risk in Communities (ARIC) Study, an ongoing prospective cohort study that began in 1986, with 9 years of in-person follow-up and 17 years of telephone follow-up. Using multivariate Cox proportional hazards models, we estimated the relative hazard (RH) of incident diabetes associated with baseline serum potassium levels. Results During 9 years of in-person follow-up, 1475 participants developed incident diabetes. In multivariate analyses, we found an inverse association between serum potassium and risk of incident diabetes. Compared to those with a high-normal serum potassium (5.0-5.5 mEq/L), adults with serum potassium levels of <4.0, 4.0-<4.5, and 4.5-<5.0 mEq/L had adjusted relative hazards (RHs) (95% CI) of incident diabetes of 1.64 (1.29-2.08), 1.64 (1.34-2.01), and 1.39 (1.14-1.71), respectively. An increased risk persisted during an additional 8 years of telephone follow-up based on self-report, with RHs of 1.2-1.3 for those with a serum potassium less than 5.0 mEq/L. Dietary potassium intake was significantly associated with risk of incident diabetes in unadjusted models but not in multivariate models. Conclusions Serum potassium is an independent predictor of incident diabetes in this cohort. Further study is needed to determine if modification of serum potassium could reduce the subsequent risk of diabetes. PMID:20975023
Green space and mortality following ischemic stroke.
Wilker, Elissa H; Wu, Chih-Da; McNeely, Eileen; Mostofsky, Elizabeth; Spengler, John; Wellenius, Gregory A; Mittleman, Murray A
2014-08-01
Residential proximity to green space has been associated with physical and mental health benefits, but whether green space is associated with post-stroke survival has not been studied. Patients ≥ 21 years of age admitted to the Beth Israel Deaconess Medical Center (BIDMC) between 1999 and 2008 with acute ischemic stroke were identified. Demographics, presenting symptoms, medical history and imaging results were abstracted from medical records at the time of hospitalization for stroke onset. Addresses were linked to average Normalized Difference Vegetation Index, distance to roadways with more than 10,000 cars/day, and US census block group. Deaths were identified through June 2012 using the Social Security Death Index. There were 929 deaths among 1645 patients with complete data (median follow-up: 5 years). In multivariable Cox models adjusted for indicators of medical history, demographic and socioeconomic factors, the hazard ratio for patients living in locations in the highest quartile of green space compared to the lowest quartile was 0.78 (95% confidence interval: 0.63-0.97) (p-trend = 0.009). This association remained statistically significant after adjustment for residential proximity to a high traffic road. Residential proximity to green space is associated with higher survival rates after ischemic stroke in multivariable adjusted models. Further work is necessary to elucidate the underlying mechanisms for this association, and to better understand the exposure-response relationships and susceptibility factors that may contribute to higher mortality in low green space areas. Copyright © 2014 Elsevier Inc. All rights reserved.
Gastroduodenal Ulcers and ABO Blood Group: the Japan Nurses’ Health Study (JNHS)
Ideno, Yuki; Lee, Jung-Su; Suzuki, Shosuke; Nakajima-Shimada, Junko; Ohnishi, Hiroshi; Sato, Yasunori; Hayashi, Kunihiko
2018-01-01
Background Although several studies have shown that blood type O is associated with increased risk of peptic ulcer, few studies have investigated these associations in Japan. We sought to investigate the association between the ABO blood group and risk of gastroduodenal ulcers (GDU) using combined analysis of both retrospective and prospective data from a large cohort study of Japanese women, the Japan Nurses’ Health Study (JNHS; n = 15,019). Methods The impact of the ABO blood group on GDU risk was examined using Cox regression analysis to estimate hazard ratios (HRs) and 95% confidence intervals (CI), with adjustment for potential confounders. Results Compared with women with non-O blood types (A, B, and AB), women with blood type O had a significantly increased risk of GDU from birth (multivariable-adjusted HR 1.18; 95% CI, 1.04–1.34). Moreover, the highest cumulative incidence of GDU was observed in women born pre-1956 with blood type O. In a subgroup analysis stratified by birth year (pre-1956 or post-1955), the multivariable-adjusted HR of women with blood type O was 1.22 (95% CI, 1.00–1.49) and 1.15 (95% CI, 0.98–1.35) in the pre-1956 and post-1955 groups, respectively. Conclusion In this large, combined, ambispective cohort study of Japanese women, older women with blood type O had a higher risk of developing GDU than those with other blood types. PMID:29093357
Matsuo, Koji; Machida, Hiroko; Shoupe, Donna; Melamed, Alexander; Muderspach, Laila I; Roman, Lynda D; Wright, Jason D
2016-10-01
To characterize contributing factors for ovarian conservation during surgical treatment for endometrial cancer and to examine the association of ovarian conservation with survival of young women with early-stage, low-grade tumors. This was a population-based study using the Surveillance, Epidemiology, and End Results program to identify surgically treated stage I type I (grade 1-2 endometrioid histology) endometrial cancer cases diagnosed between 1983 and 2012 (N=86,005). Multivariable models were used to identify independent factors for ovarian conservation. Survival outcomes and cause of death were examined for women aged younger than 50 years with stage I type I endometrial cancer who underwent ovarian conservation (1,242 among 12,860 women [9.7%]). On multivariable analysis, age younger than 50 years, grade 1 endometrioid histology, and tumor size 2.0 cm or less were noted to be independent factors for ovarian conservation (all, P<.001). For 9,110 women aged younger than 50 years with stage I grade 1 tumors, cause-specific survival was similar between ovarian conservation and oophorectomy cases (20-year rates 98.9% compared with 97.7%, P=.31), whereas overall survival was significantly higher in ovarian conservation cases than oophorectomy cases (88.8% compared with 82.0%, P=.011). On multivariable analysis, ovarian conservation remained an independent prognostic factor for improved overall survival (adjusted hazard ratio 0.73, 95% confidence interval [CI] 0.54-0.98, P=.036) and was independently associated with a lower cumulative risk of death resulting from cardiovascular disease compared with oophorectomy (20-year rates, 2.3% compared with 3.7%, adjusted hazard ratio 0.40, 95% CI 0.17-0.91, P=.029). In contrast, cause-specific survival (20-year rates 94.6% compared with 96.1%, P=.68) and overall survival (81.0% compared with 80.6%, P=.91) were similar between ovarian conservation and oophorectomy among 3,750 women aged younger than 50 years with stage I grade 2 tumors. Ovarian conservation is performed in less than 10% of young women with stage I type I endometrial cancer. Ovarian conservation is associated with decreased mortality in young women with stage I grade 1 tumors.
Mid-arm muscle circumference as a significant predictor of all-cause mortality in male individuals
Wu, Li-Wei; Lin, Yuan-Yung; Kao, Tung-Wei; Lin, Chien-Ming; Liaw, Fang-Yih; Wang, Chung-Ching; Peng, Tao-Chun; Chen, Wei-Liang
2017-01-01
Background Emerging evidence indicates that mid-arm muscle circumference (MAMC) is an anthropometric indicator that reflects health and nutritional status, but its usefulness for predicting all-cause mortality in United States individuals remains uncertain. Methods and findings We investigated the association between MAMC and all-cause mortality in the US general population, in a population-based longitudinal study of 6,769 participants aged 40 to 90 years in the third National Health and Nutrition Examination Survey (NHANES III) conducted by the National Center for Health Statistics of the Centers for Disease Control and Prevention. All participants were divided into two groups based on gender (male and female); each group was then divided into three subgroups depending on their MAMC level. The tertiles were as follows: T1 (18 to <27.3), T2 (27.3 to <29.6), T3 (29.6 to ≤40.0) cm in the male group and T1 (15 to <22.3), T2 (22.3 to <24.6), T3 (24.6 to ≤44.0) cm in the female group. Multivariable Cox regression analyses and Kaplan–Meier survival probabilities were used to relate all-cause mortality risk to MAMC level. For all-cause mortality in male participants, multivariable adjusted hazard ratios (HRs) were 0.83 (95% confidence interval (CI): 0.69–0.98; p = 0.033) for MAMC of 27.3–29.6 cm compared with 18–27.3 cm, and 0.76 (95% CI: 0.61–0.95; p = 0.018) for MAMC of 29.6–40 cm compared with 18–27.3 cm. For all-cause mortality in female participants, multivariable adjusted HRs were 0.84 (95% CI: 0.69–1.02; p = 0.075) for MAMC of 22.3–24.6 cm compared with 15–22.3 cm, and 0.94 (95% CI: 0.75–1.17; p = 0.583) for MAMC of 24.6–44 cm compared with 15–22.3 cm. Conclusion These results support that a lower MAMC is associated with a higher mortality risk in male individuals. PMID:28196081
Survival in Adult Lung Transplant Recipients Receiving Pediatric Versus Adult Donor Allografts.
Hayes, Don; Whitson, Bryan A; Ghadiali, Samir N; Lloyd, Eric A; Tobias, Joseph D; Mansour, Heidi M; Black, Sylvester M
2015-10-01
Recent evidence showed that pediatric donor lungs increased rates of allograft failure in adult lung transplant recipients; however, the influence on survival is unclear. The United Network for Organ Sharing (UNOS) database was queried from 2005 to 2013 for adult lung transplant recipients (≥18 years) to assess survival differences among donor age categories (<18 years, 18 to 29 years, 30 to 59 years, ≥60 years). Of 12,297 adult lung transplants, 12,209 were used for univariate Cox models and Kaplan-Meier (KM) analysis and 11,602 for multivariate Cox models. A total of 1,187 adult recipients received pediatric donor lungs compared with 11,110 receiving adult donor organs. Univariate and multivariate Cox models found no difference in survival between donor ages 0 to 17 and donor ages 18 to 29, whereas donor ages 60 and older were significantly associated with increased mortality hazard, relative to the modal category of donor ages 30 to 59 (adjusted hazard ratio = 1.381; 95% confidence interval = 1.188 to 1.606; p < 0.001). Interactions between recipient and donor age range found that the oldest donor age range was negatively associated with survival among middle-aged (30 to 59) and older (≥60) lung transplant recipients. Pediatric donor lung allografts were not negatively associated with survival in adult lung transplant recipients; however, the oldest donor age range was associated with increased mortality hazard for adult lung transplant recipients. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Appendectomy correlates with increased risk of pyogenic liver abscess
Liao, Kuan-Fu; Lai, Shih-Wei; Lin, Cheng-Li; Chien, Sou-Hsin
2016-01-01
Little is known about the association between appendectomy and pyogenic liver abscess. The objective of this study was to investigate the association between appendectomy and the risk of pyogenic liver abscess in Taiwan. This population-based retrospective cohort study was conducted using the hospitalization dataset of the Taiwan National Health Insurance Program. There were 212,530 subjects aged 20 to 84 years with newly performed appendectomy from 1998 to 2010 as the appendectomy group, and 850,099 randomly selected subjects without appendectomy as the nonappendectomy group. The appendectomy and nonappendectomy groups were matched for sex, age, comorbidities, and index year of appendectomy. The incidence of pyogenic liver abscess at the end of 2011 was estimated in both groups. The multivariable Cox proportional hazards regression model was applied to investigate the hazard ratio (HR) and 95% confidence interval (CI) for the risk of pyogenic liver abscess associated with appendectomy and other comorbidities including alcoholism, biliary stone, chronic kidney disease, chronic liver diseases, and diabetes mellitus. The overall incidence of pyogenic liver abscess was 1.73-fold greater in the appendectomy group than in the nonappendectomy group (3.85 vs 2.22 per 10,000 person-years, 95% CI 1.71, 1.76). The multivariable regression analysis disclosed that the adjusted HR of pyogenic liver abscess was 1.77 for the appendectomy group (95% CI 1.59, 1.97), when compared with the nonappendectomy group. Appendectomy is associated with an increased hazard of pyogenic liver abscess. Further studies remain necessary to confirm our findings. PMID:27368018
Perreault, Sylvie; de Denus, Simon; White, Michel; White-Guay, Brian; Bouvier, Michel; Dorais, Marc; Dubé, Marie-Pierre; Rouleau, Jean-Lucien; Tardif, Jean-Claude; Jenna, Sarah; Haibe-Kains, Benjamin; Leduc, Richard; Deblois, Denis
2017-01-01
The long-term use of β-blockers has been shown to improve clinical outcomes among patients with heart failure (HF). However, a lack of data persists in assessing whether carvedilol or bisoprolol are superior to metoprolol tartrate in clinical practice. We endeavored to compare the effectiveness of β-blockers among older adults following a primary hospital admission for HF. We conducted a cohort study using Quebec administrative databases to identify patients who were using the β-blockers carvedilol, bisoprolol, or metoprolol tartrate after the diagnosis of HF. We characterized the patients by the type of β-blocker prescribed at discharge of their first HF hospitalization. An adjusted multivariate Cox proportional hazards model was used to compare the primary outcome of all-cause mortality. We also conducted analyses by matching for a propensity score for initiation of β-blocker therapy and assessed the effect on the primary outcome. Among 3197 patients with HF with a median follow-up of 2.8 years, the crude annual mortality rates (per 100 person-years) were 16, 14.9, and 17.7 for metoprolol tartrate, carvedilol, and bisoprolol, respectively. Adjusted hazard ratios of carvedilol (hazard ratio 0.92; 0.78-1.09) and bisoprolol (hazard ratio 1.04; 0.93-1.16) were not significantly different from that of metoprolol tartrate in improving survival. After matching for propensity score, carvedilol and bisoprolol showed no additional benefit with respect to all-cause mortality compared with metoprolol tartrate. Our evidence suggests no differential effect of β-blockers on all-cause mortality among older adults with HF. Copyright © 2016 John Wiley & Sons, Ltd.
Nakamura, Shunichi; Kato, Koji; Yoshida, Asuka; Fukuma, Nagaharu; Okumura, Yasuyuki; Ito, Hiroto; Mizuno, Kyoichi
2013-05-15
Although attention has recently been focused on the role of psychosocial factors in patients with cardiovascular disease (CVD), the factors that have the greatest influence on prognosis have not yet been elucidated. The aim of this study was to evaluate the effects of depression, anxiety, and anger on the prognosis of patients with CVD. Four hundred fourteen consecutive patients hospitalized with CVD were prospectively enrolled. Depression was evaluated using the Patient Health Questionnaire, anxiety using the Generalized Anxiety Disorder Questionnaire, and anger using the Spielberger Trait Anger Scale. Cox proportional-hazards regression was used to examine the individual effects of depression, anxiety, and anger on a combined primary end point of cardiac death or cardiac hospitalization and on a combined secondary end point of all-cause death or hospitalization during follow-up (median 14.2 months). Multivariate analysis showed that depression was a significant risk factor for cardiovascular hospitalization or death after adjusting for cardiac risk factors and other psychosocial factors (hazard ratio 2.62, p = 0.02), whereas anxiety was not significantly associated with cardiovascular hospitalization or death after adjustment (hazard ratio 2.35, p = 0.10). Anger was associated with a low rate of cardiovascular hospitalization or death (hazard ratio 0.34, p <0.01). In conclusion, depression in hospitalized patients with CVD is a stronger independent risk factor for adverse cardiac events than either anxiety or anger. Anger may help prevent adverse outcomes. Routine screening for depression should therefore be performed in patients with CVD, and the potential effects of anger in clinical practice should be reconsidered. Copyright © 2013 Elsevier Inc. All rights reserved.
Signorovitch, J E; Macaulay, D; Diener, M; Yan, Y; Wu, E Q; Gruenberger, J-B; Frier, B M
2013-01-01
Aims To assess associations between hypoglycaemia and risk of accidents resulting in hospital visits among people with type 2 diabetes receiving antidiabetes drugs without insulin. Methods People with type 2 diabetes who were not treated with insulin were identified from a US-based employer claims database (1998–2010). Following initiation of an antidiabetes drug, the occurrence of accidents resulting in hospital visits was compared between people with, and without, claims for hypoglycaemia using multivariable Cox proportional hazard models adjusted for demographics, comorbidities, prior treatments and prior medical service use. Additional analyses were stratified by age 65 years or older. Results A total of N = 5582 people with claims for hypoglycaemia and N = 27 910 with no such claims were included. Accidents resulting in hospital visits occurred in 5.5 and 2.8% of people with, and without, hypoglycaemia, respectively. After adjusting for baseline characteristics, hypoglycaemia was associated with significantly increased hazards for any accident [hazard ratio (HR) 1.39, 95% CI 1.21–1.59, p < 0.001], accidental falls (HR 1.36, 95% CI 1.13–1.65, p < 0.001) and motor vehicle accidents (HR 1.82, 95% CI 1.18–2.80, p = 0.007). In age-stratified analyses, hypoglycaemia was associated with greater hazards of driving-related accidents in people younger than age 65 and falls in people aged 65 or older. Conclusions In people with type 2 diabetes receiving antidiabetes drugs without insulin, hypoglycaemia was associated with a significantly higher risk of accidents resulting in hospital visits, including accidents related to driving and falls. PMID:23121373
Ade, Carl J; Broxterman, Ryan M; Charvat, Jacqueline M; Barstow, Thomas J
2017-08-07
It is unknown whether the astronaut occupation or exposure to microgravity influences the risk of long-term cardiovascular disease (CVD). This study explored the effects of being a career National Aeronautics and Space Administration (NASA) astronaut on the risk for clinical CVD end points. During the Longitudinal Study of Astronaut Health, data were collected on 310 NASA astronauts and 981 nonastronaut NASA employees. The nonastronauts were matched to the astronauts on age, sex, and body mass index, to evaluate acute and chronic morbidity and mortality. The primary outcomes were composites of clinical CVD end points (myocardial infarction, congestive heart failure, stroke, and coronary artery bypass surgery) or coronary artery disease (CAD) end points (myocardial infarction and coronary artery bypass surgery). Of the astronauts, 5.2% had a clinical CVD end point and 2.9% had a CAD end point compared with the nonastronaut comparisons with 4.7% and 3.1% having CVD and CAD end points, respectively. In the multivariate models adjusted for traditional risk factors, astronauts had a similar risk of CVD compared with nonastronauts (adjusted hazard ratio, 1.08; 95% CI, 0.60-1.93; P=0.80). Risk of a CAD end point was similar between groups (hazard ratio, 0.97; CI, 0.45-2.08; P=0.93). In astronauts with early spaceflight experience, the risks of CVD (hazard ratio, 0.80; CI, 0.25-2.56; P=0.71) and CAD (hazard ratio, 1.23; CI, 0.27-5.61; P=0.79) compared with astronauts with no experience were not different. These findings suggest that being an astronaut is not associated with increased long-term risk of CVD development. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Dhruva, Sanket S; Huang, Chenxi; Spatz, Erica S; Coppi, Andreas C; Warner, Frederick; Li, Shu-Xia; Lin, Haiqun; Xu, Xiao; Furberg, Curt D; Davis, Barry R; Pressel, Sara L; Coifman, Ronald R; Krumholz, Harlan M
2017-07-01
Randomized trials of hypertension have seldom examined heterogeneity in response to treatments over time and the implications for cardiovascular outcomes. Understanding this heterogeneity, however, is a necessary step toward personalizing antihypertensive therapy. We applied trajectory-based modeling to data on 39 763 study participants of the ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) to identify distinct patterns of systolic blood pressure (SBP) response to randomized medications during the first 6 months of the trial. Two trajectory patterns were identified: immediate responders (85.5%), on average, had a decreasing SBP, whereas nonimmediate responders (14.5%), on average, had an initially increasing SBP followed by a decrease. Compared with those randomized to chlorthalidone, participants randomized to amlodipine (odds ratio, 1.20; 95% confidence interval [CI], 1.10-1.31), lisinopril (odds ratio, 1.88; 95% CI, 1.73-2.03), and doxazosin (odds ratio, 1.65; 95% CI, 1.52-1.78) had higher adjusted odds ratios associated with being a nonimmediate responder (versus immediate responder). After multivariable adjustment, nonimmediate responders had a higher hazard ratio of stroke (hazard ratio, 1.49; 95% CI, 1.21-1.84), combined cardiovascular disease (hazard ratio, 1.21; 95% CI, 1.11-1.31), and heart failure (hazard ratio, 1.48; 95% CI, 1.24-1.78) during follow-up between 6 months and 2 years. The SBP response trajectories provided superior discrimination for predicting downstream adverse cardiovascular events than classification based on difference in SBP between the first 2 measurements, SBP at 6 months, and average SBP during the first 6 months. Our findings demonstrate heterogeneity in response to antihypertensive therapies and show that chlorthalidone is associated with more favorable initial response than the other medications. © 2017 American Heart Association, Inc.
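Trajectory-based modeling of early blood-pressure response can be approximated crudely by summarizing each participant's first months of SBP readings with an intercept and slope and then fitting a two-component mixture model; the sketch below does exactly that on synthetic data. The trial's actual method was a dedicated latent-class growth (trajectory) model, so this is only a stand-in to convey the idea.

```python
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Hypothetical long-format data: repeated SBP readings per participant over 6 months.
n, visits = 1000, np.array([0, 1, 3, 6])          # months since randomization
long = pd.DataFrame({
    "id": np.repeat(np.arange(n), len(visits)),
    "month": np.tile(visits, n),
})
long["sbp"] = 150 - 2.0 * long["month"] + rng.normal(0, 8, len(long))

# Summarize each participant's early response with a least-squares intercept and slope.
def fit_line(g):
    slope, intercept = np.polyfit(g["month"], g["sbp"], 1)
    return pd.Series({"intercept": intercept, "slope": slope})

feats = long.groupby("id")[["month", "sbp"]].apply(fit_line)

# Two latent response patterns, a crude stand-in for the trial's trajectory model.
gmm = GaussianMixture(n_components=2, random_state=0).fit(feats)
feats["group"] = gmm.predict(feats[["intercept", "slope"]])
print(feats.groupby("group")["slope"].describe())
```

The resulting group label could then be entered into a Cox model for downstream events, as in the analysis above.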
Skin autofluorescence and all-cause mortality in stage 3 CKD.
Fraser, Simon D S; Roderick, Paul J; McIntyre, Natasha J; Harris, Scott; McIntyre, Christopher W; Fluck, Richard J; Taal, Maarten W
2014-08-07
Novel markers may help to improve risk prediction in CKD. One potential candidate is tissue advanced glycation end product accumulation, a marker of cumulative metabolic stress, which can be assessed by a simple noninvasive measurement of skin autofluorescence. Skin autofluorescence correlates with higher risk of cardiovascular events and mortality in people with diabetes or people requiring renal replacement therapy, but its role in earlier CKD has not been studied. A prospective cohort of 1741 people with CKD stage 3 was recruited from primary care between August 2008 and March 2010. Participants underwent medical history taking, clinical assessment, blood and urine sampling for biochemistry, and measurement of skin autofluorescence. Kaplan-Meier plots and multivariate Cox proportional hazards models were used to investigate associations between skin autofluorescence (categorized in quartiles) and all-cause mortality. In total, 1707 participants had skin autofluorescence measured; 170 (10%) participants died after a median of 3.6 years of follow-up. The most common cause of death was cardiovascular disease (41%). Higher skin autofluorescence was associated significantly with poorer survival (all-cause mortality, P<0.001) on Kaplan-Meier analysis. Univariate and age/sex-adjusted Cox proportional hazards models showed that the highest quartile of skin autofluorescence was associated with all-cause mortality (hazard ratio, 2.64; 95% confidence interval, 1.71 to 4.08; P<0.001 and hazard ratio, 1.84; 95% confidence interval, 1.18 to 2.86; P=0.003, respectively, compared with the lowest quartile). This association was not maintained after additional adjustment to include cardiovascular disease, diabetes, smoking, body mass index, eGFR, albuminuria, and hemoglobin. Skin autofluorescence was not independently associated with all-cause mortality in this study. Additional research is needed to clarify whether it has a role in risk prediction in CKD. Copyright © 2014 by the American Society of Nephrology.
Sim, John J.; Bhandari, Simran K.; Shi, Jiaxiao; Reynolds, Kristi; Calhoun, David A.; Kalantar-Zadeh, Kamyar; Jacobsen, Steven J.
2015-01-01
We sought to compare the risk of end stage renal disease (ESRD), ischemic heart event (IHE), congestive heart failure (CHF), cerebrovascular accident (CVA), and all-cause mortality among 470,386 individuals with resistant and nonresistant hypertension (non-RH). Resistant hypertension (60,327 individuals) was sub-categorized into 2 groups: 23,104 patients with cRH (controlled on 4 or more medicines) and 37,223 patients with uRH (uncontrolled on 3 or more medicines) in a 5-year retrospective cohort study. Cox proportional hazard modeling was used to estimate hazard ratios adjusting for age, gender, race, body mass index, chronic kidney disease (CKD), and co-morbidities. Resistant hypertension (cRH and uRH), compared to non-RH, had multivariable adjusted hazard ratios (95% confidence intervals) of 1.32 (1.27–1.37), 1.24 (1.20–1.28), 1.46 (1.40–1.52), 1.14 (1.10–1.19), and 1.06 (1.03–1.08) for ESRD, IHE, CHF, CVA, and mortality, respectively. Comparison of uRH to cRH had hazard ratios of 1.25 (1.18–1.33), 1.04 (0.99–1.10), 0.94 (0.89–1.01), 1.23 (1.14–1.31), and 1.01 (0.97–1.05) for ESRD, IHE, CHF, CVA, and mortality, respectively. Males and Hispanics had greater risk for ESRD within all 3 cohorts. Resistant hypertension had greater risk for ESRD, IHE, CHF, CVA, and mortality. The risks of ESRD and CVA were 25% and 23% greater, respectively, in uRH compared to cRH, supporting the linkage between blood pressure and both outcomes. PMID:25945406
Hepatitis B and C virus infection and diabetes mellitus: A cohort study.
Hong, Yun Soo; Chang, Yoosoo; Ryu, Seungho; Cainzos-Achirica, Miguel; Kwon, Min-Jung; Zhang, Yiyi; Choi, Yuni; Ahn, Jiin; Rampal, Sanjay; Zhao, Di; Pastor-Barriuso, Roberto; Lazo, Mariana; Shin, Hocheol; Cho, Juhee; Guallar, Eliseo
2017-07-04
The role of hepatitis virus infection in glucose homeostasis is uncertain. We examined the associations between hepatitis B virus (HBV) or hepatitis C virus (HCV) infection and the development of diabetes in a cohort (N = 439,708) of asymptomatic participants in health screening examinations. In cross-sectional analyses, the multivariable-adjusted odds ratio for prevalent diabetes comparing hepatitis B surface antigen (HBsAg) (+) to HBsAg (-) participants was 1.17 (95% CI 1.06-1.31; P = 0.003). The corresponding odds ratio comparing hepatitis C antibodies (HCV Ab) (+) to HCV Ab (-) participants was 1.43 (95% CI 1.01-2.02, P = 0.043). In prospective analyses, the multivariable-adjusted hazard ratio for incident diabetes comparing HBsAg (+) to HBsAg (-) participants was 1.23 (95% CI 1.08-1.41; P = 0.007). The number of incident cases of diabetes among HCV Ab (+) participants (10 cases) was too small to reliably estimate the prospective association between HCV infection and diabetes. In this large population at low risk of diabetes, HBV and HCV infections were associated with diabetes prevalence and HBV infection with the risk of incident diabetes. Our studies add evidence suggesting that diabetes is an additional metabolic complication of HBV and HCV infection.
Coffee and caffeine intake and the risk of ovarian cancer
Lueth, Natalie A.; Anderson, Kristin E.; Harnack, Lisa J.; Fulkerson, Jayne A.; Robien, Kim
2008-01-01
Laboratory data suggests that caffeine or some components of coffee may cause DNA mutations and inhibit tumor suppressor mechanisms, leading to neoplastic growth. However, coffee consumption has not been clearly implicated in the etiology of human post-menopausal ovarian cancer. This study evaluated the relationship of coffee and caffeine intake with risk of epithelial ovarian cancer in a prospective cohort study of 29,060 postmenopausal women. The participants completed a mailed questionnaire that assessed diet and health history and were followed for ovarian cancer incidence from 1986 to 2004. Age-adjusted and multivariate-adjusted hazard ratios were calculated for four exposure variables: caffeinated coffee, decaffeinated coffee, total coffee and total caffeine to assess whether or not coffee or caffeine influences the risk of ovarian cancer. An increased risk was observed in the multivariate model for women who reported drinking five or more cups/day of caffeinated coffee compared to women who reported drinking none (HR=1.81, 95% CI: 1.10-2.95). Decaffeinated coffee, total coffee and caffeine were not statistically significantly associated with ovarian cancer incidence. Our results suggest that a component of coffee other than caffeine, or in combination with caffeine, may be associated with increased risk of ovarian cancer in postmenopausal women who drink five or more cups of coffee a day. PMID:18704717
Fondell, Elinor; O'Reilly, Éilis J.; Fitzgerald, Kathryn C.; Falcone, Guido J.; Kolonel, Laurence N.; Park, Yikyung; Gapstur, Susan M.; Ascherio, Alberto
2015-01-01
Objective Caffeine is thought to be neuroprotective by antagonizing the adenosine A2A receptors in the brain and thereby protecting motor neurons from excitotoxicity. We examined the association between consumption of caffeine, coffee and tea and risk of Amyotrophic Lateral Sclerosis (ALS). Methods Longitudinal analyses based on over 1 010 000 men and women in 5 large cohort studies [the Nurses’ Health Study, the Health Professionals Follow-up Study, the Cancer Prevention Study II Nutrition Cohort, the Multiethnic Cohort Study, and the National Institutes of Health – AARP Diet and Health Study]. Cohort-specific multivariable-adjusted risk ratios (RR) and 95% confidence intervals (CI) estimates of ALS incidence or death was estimated by Cox proportional hazards regression and pooled using random-effects models. Results A total of 1279 cases of ALS were documented during a mean of 18 years of follow-up. Caffeine intake was not associated with ALS risk; the pooled multivariable-adjusted RR comparing the highest to the lowest quintile of intake was 0.96 (95% CI 0.81-1.16). Similarly, neither coffee nor tea was associated with ALS risk. Conclusion The results of this large study do not support associations of caffeine or caffeinated beverages with ALS risk. PMID:25822002
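The pooling step described above (cohort-specific estimates combined using random-effects models) can be reproduced in miniature with a DerSimonian-Laird estimator; the five risk ratios and confidence intervals below are made-up numbers used only to show the arithmetic, not the study's results.

```python
import numpy as np

# Hypothetical cohort-specific risk ratios and 95% CIs (highest vs lowest caffeine quintile).
rr = np.array([0.90, 1.05, 0.85, 1.10, 0.95])
lo = np.array([0.65, 0.80, 0.55, 0.75, 0.60])
hi = np.array([1.25, 1.38, 1.31, 1.61, 1.50])

# Work on the log scale; recover each cohort's variance from its CI width.
y = np.log(rr)
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)
w_fixed = 1 / se**2

# DerSimonian-Laird estimate of the between-cohort variance (tau^2).
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
q = np.sum(w_fixed * (y - y_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects weights and pooled risk ratio.
w_re = 1 / (se**2 + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"Pooled RR {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f})")
```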
Body mass index and infectious disease mortality in midlife in a cohort of 2.3 million adolescents.
Twig, G; Geva, N; Levine, H; Derazne, E; Goldberger, N; Haklai, Z; Leiba, A; Kark, J D
2017-10-30
Obesity was linked to altered immunity, but also to favorable outcomes among patients with infectious disease (ID) in some settings. We assessed the association between adolescent body mass index (BMI) and ID mortality. BMI of 2 294 139 Israeli adolescents (60% men; age 17.4±0.3 years) was measured between 1967 and 2010. The outcome, obtained by linkage with official national records, was death due to ID as the underlying cause. Multivariable Cox proportional hazards models were applied. During 42 297 007 person-years of follow-up (median 18.4 years), there were 689 deaths from ID (mean age 44.1±10.5 years). Adjusted hazard ratios (HR) were 1.039 (1.011-1.068) and 1.146 (1.099-1.194) among men and women, respectively, per unit increment in BMI (P for sex interaction=4.4 × 10(-5)). Adjusted hazard ratios among men were 1.2 (1.0-1.5), 1.9 (1.4-2.5) and 2.5 (1.5-4.2) for those with high-normal BMI (22.0-24.9 kg m(-2)), overweight and obese, respectively, compared with the 18.5≤BMI<22 kg m(-2) reference group, and 1.7 (1.1-2.6), 2.6 (1.6-4.3) and 6.6 (3.3-13.1) among women, respectively. The increased risk among underweight (<18.5 kg m(-2)) boys was attenuated when the study sample was restricted to those with unimpaired health at baseline. A multivariable spline model indicated a minimum risk for total ID mortality at 20.7 and 18.0 kg m(-2) for men and women, respectively, with significantly increased risk seen above adolescent BMI values of 23.6 and 24.0 kg m(-2), respectively. The association with BMI was particularly evident for bacterial infections (predominantly sepsis), airways and central nervous system infections (63% of the ID deaths). Adolescent overweight and obesity were strongly associated with ID mortality, especially of bacterial origin and among women. International Journal of Obesity advance online publication, 26 December 2017; doi:10.1038/ijo.2017.263.
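A spline-based search for the BMI of minimum risk, similar in spirit to the multivariable spline model mentioned above, can be sketched by building a B-spline basis for BMI, fitting a Cox model, and scanning the predicted log partial hazard over a BMI grid; the data, covariates, and spline settings below are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from patsy import dmatrix, build_design_matrices
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 5000
df = pd.DataFrame({
    "bmi": rng.normal(22, 3, n).clip(15, 40),
    "age": rng.normal(17.4, 0.3, n),
    "time": rng.exponential(18.0, n),        # years of follow-up
    "id_death": rng.binomial(1, 0.02, n),    # infectious-disease death indicator
})

# Cubic B-spline basis so the log hazard can be non-linear in BMI.
spline = dmatrix("bs(bmi, df=4) - 1", data=df, return_type="dataframe")
design = pd.concat([df[["time", "id_death", "age"]], spline], axis=1)
cph = CoxPHFitter().fit(design, duration_col="time", event_col="id_death")

# Scan the predicted log partial hazard over a BMI grid (age held at its mean)
# and report the BMI at which predicted risk is lowest.
grid = pd.DataFrame({"bmi": np.linspace(16, 35, 200), "age": df["age"].mean()})
grid_spline = build_design_matrices([spline.design_info], grid,
                                    return_type="dataframe")[0]
grid_design = pd.concat([grid[["age"]], grid_spline], axis=1)
lp = cph.predict_log_partial_hazard(grid_design)
best = grid["bmi"].iloc[int(np.argmin(np.asarray(lp).ravel()))]
print(f"BMI at minimum predicted hazard: {best:.1f}")
```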
Takashima, N; Turin, T C; Matsui, K; Rumana, N; Nakamura, Y; Kadota, A; Saito, Y; Sugihara, H; Morita, Y; Ichikawa, M; Hirose, K; Kawakani, K; Hamajima, N; Miura, K; Ueshima, H; Kita, Y
2014-05-01
Brachial-ankle pulse wave velocity (baPWV) is a non-invasive measure of arterial stiffness obtained using an automated system. Although baPWV has been widely used as a non-invasive marker for evaluation of arterial stiffness, evidence for the prognostic value of baPWV in the general population is scarce. In this study, we assessed the association between baPWV and future cardiovascular disease (CVD) incidence in a Japanese population. From 2002 to 2009, baPWV was measured in a total of 4164 men and women without a history of CVD, and they were followed up until the end of 2009 with a median follow-up period of 6.5 years. Hazard ratios (HRs) for CVD incidence according to baPWV levels were calculated using a Cox proportional hazards model adjusted for potential confounding factors, including seated or supine blood pressure (BP). During the follow-up period, we observed 40 incident cases of CVD. In the multivariable-adjusted model, baPWV as a continuous variable was not significantly associated with future CVD risk after adjustment for supine BP. However, compared with the lower baPWV category (<18 m s(-1)), higher baPWV (≥18.0 m s(-1)) was significantly associated with an increased CVD risk (HR: 2.70, 95% confidence interval: 1.18-6.19). Higher baPWV (≥18.0 m s(-1)) may be an independent predictor of future CVD events in the general Japanese population.
Kaplovitch, Eric; Gomes, Tara; Camacho, Ximena; Dhalla, Irfan A.; Mamdani, Muhammad M.; Juurlink, David N.
2015-01-01
Background The use of opioids for noncancer pain is widespread, and more than 16,000 people die of opioid-related causes in the United States annually. The patients at greatest risk of death are those receiving high doses of opioids. Whether sex influences the risk of dose escalation or opioid-related mortality is unknown. Methods and Findings We conducted a cohort study using healthcare records of 32,499 individuals aged 15 to 64 who commenced chronic opioid therapy for noncancer pain between April 1, 1997 and December 31, 2010 in Ontario, Canada. Patients were followed from their first opioid prescription until discontinuation of therapy, death from any cause, or the end of the study period. Among patients receiving chronic opioid therapy, 589 (1.8%) escalated to high-dose therapy and 59 (0.2%) died of opioid-related causes while on treatment. After multivariable adjustment, men were more likely than women to escalate to high-dose opioid therapy (adjusted hazard ratio 1.44; 95% confidence interval 1.21 to 1.70) and twice as likely to die of opioid-related causes (adjusted hazard ratio 2.04; 95% confidence interval 1.18 to 3.53). These associations were maintained in a secondary analysis of 285,520 individuals receiving any opioid regardless of the duration of therapy. Conclusions Men are at higher risk than women for escalation to high-dose opioid therapy and death from opioid-related causes. Both outcomes were more common than anticipated. PMID:26291716
Betts, Keith A; Griffith, Jenny; Ganguli, Arijit; Li, Nanxin; Douglas, Kevin; Wu, Eric Q
2016-05-01
To assess the economic outcomes and treatment patterns among patients with rheumatoid arthritis (RA) who used 1, 2, or 3 or more conventional synthetic disease-modifying antirheumatic drugs (DMARDs) before receiving a biologic therapy. Adult patients with ≥2 RA diagnoses (International Classification of Diseases, Ninth Revision, Clinical Modification [ICD-9-CM] codes 714.xx) on different dates, ≥1 claim for a conventional synthetic DMARD, and ≥1 claim for a biologic DMARD were identified from a large commercial claims database. The initiation date of the first biologic DMARD was defined as the index date. Based on the number of distinct conventional synthetic DMARDs initiated between the first RA diagnosis and the index date, patients were classified into 3 cohorts: those who used 1, 2, or 3 or more conventional synthetic DMARDs. Baseline characteristics were measured 6 months preindex date and compared between the 3 cohorts. All-cause health care costs (in 2014 US$) were compared during the follow-up period (12 months postbiologic initiation) using multivariable gamma models adjusting for baseline characteristics. Time to discontinuation of the index biologic DMARD and time to switching to a new DMARD were compared using multivariable Cox proportional hazards models. The 1, 2, and 3 or more conventional synthetic DMARD cohorts included 6215; 3227; and 976 patients, respectively. At baseline, patients in the 3 or more conventional synthetic DMARD cohort had the least severe RA, as indicated by the lowest claims-based index for RA severity score (1 vs 2 vs 3 or more = 6.1 vs 5.9 vs 5.8). During the study period, there was a significant association between number of conventional synthetic DMARDs and higher all-cause total health care costs (adjusted mean difference, 1 vs 2: $772; P < 0.001; 2 vs 3 or more: $2390; P < 0.001). The all-cause medical and pharmacy costs were also significantly higher with the increasing number of conventional synthetic DMARDs. Patients who cycled more conventional synthetic DMARDs were also more likely to switch treatment after biologic initiation (1 vs 2: adjusted hazard ratio = 0.89; P = 0.005; 2 vs 3 or more: adjusted hazard ratio = 0.89; P = 0.087). There were no differences in index biologic discontinuation between the 3 cohorts. Patients with RA who cycled more conventional synthetic DMARDs had increased economic burden in the 12 months following biologic initiation and were more likely to switch therapy. These results highlight the importance of timely switching to biologic DMARDs for the treatment of RA. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.
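For the cost comparison described above, a gamma GLM with a log link is a common choice for right-skewed cost data; the sketch below is illustrative only, with hypothetical column names and an abbreviated covariate set, not the authors' model.

```python
# Sketch: adjusted all-cause cost comparison across csDMARD-count cohorts
# using a gamma GLM with log link. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("ra_costs.csv")  # assumed: total_cost, csdmard_group (1/2/3), age, female, cci

fit = smf.glm("total_cost ~ C(csdmard_group) + age + female + cci",
              data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()

# Model-based (recycled-prediction) adjusted mean cost for each cohort
for g in (1, 2, 3):
    counterfactual = df.assign(csdmard_group=g)
    print(g, round(fit.predict(counterfactual).mean(), 2))
```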
Cardiovascular Disease After Aromatase Inhibitor Use.
Haque, Reina; Shi, Jiaxiao; Schottinger, Joanne E; Chung, Joanie; Avila, Chantal; Amundsen, Britta; Xu, Xiaoqing; Barac, Ana; Chlebowski, Rowan T
2016-12-01
Cardiovascular disease (CVD) is an important cause of death in older patients with breast cancer. However, limited information exists on the long-term effect of aromatase inhibitor (AI) use on CVD risk in breast cancer survivors. To this point, no other population-based studies have been able to adjust for CVD risk factors or cardiovascular medications. To determine the long-term influence of adjuvant endocrine therapies on CVD in a cohort of postmenopausal breast cancer survivors in analyses that accounted for major CVD risk factors, medication use, chemotherapy, and radiotherapy. A retrospective cohort of postmenopausal women with breast cancer diagnosed from January 1, 1991, to December 31, 2010, and followed up through December 31, 2011 (maximum, 21 years [72 886 person-years]), was evaluated using records from a managed care organization with nearly 20 community hospitals in California. A total of 13 273 postmenopausal women with hormone receptor-positive breast cancer without prior CVD were included. Cardiovascular disease incidence was compared across endocrine therapy categories. Information on demographics, comorbidity, medication use, and CVD risk was captured from electronic health records. Multivariate Cox proportional hazards models using time-dependent endocrine drug use variables and propensity scores were conducted. Data analysis was conducted from September 15, 2014, to February 1, 2016. Women were grouped by endocrine therapy status (tamoxifen citrate only, AI only, both, or neither). Person-year rates of CVD for each therapy group. During 72 886 person-years in 13 273 women (mean [SD] age, 66.8 [8.1] years) with follow-up through 2011, we observed 3711 CVD events. In multivariable analyses (reported as hazard ratio [95% CI]), AI-only users had a similar risk of cardiac ischemia (myocardial infarction and angina) (adjusted, 0.97 [0.78-1.22]) and stroke (adjusted, 0.97 [0.70-1.33]) as tamoxifen-only users (reference). However, we found an increased risk of other CVD (dysrhythmia, valvular dysfunction, and pericarditis) in women who used AIs only (adjusted, 1.29 [1.11-1.50]) or sequentially after tamoxifen (1.26 [1.09-1.45]) vs tamoxifen (reference), as well as in nonhormone users (1.18 [1.02-1.35]). The risk of the most serious cardiovascular events (cardiac ischemia or stroke) was not elevated in AI-only users compared with tamoxifen users. The finding that other CVD events combined were greater in AI users requires further study.
Dietary Inflammatory Index and Incidence of Cardiovascular Disease in the PREDIMED Study.
Garcia-Arellano, Ana; Ramallal, Raul; Ruiz-Canela, Miguel; Salas-Salvadó, Jordi; Corella, Dolores; Shivappa, Nitin; Schröder, Helmut; Hébert, James R; Ros, Emilio; Gómez-Garcia, Enrique; Estruch, Ramon; Lapetra, José; Arós, Fernando; Fiol, Miquel; Serra-Majem, Lluis; Pintó, Xavier; Babio, Nancy; González, José I; Fitó, Montse; Martínez, J Alfredo; Martínez-González, Miguel A
2015-05-29
Previous studies have reported an association between a more pro-inflammatory diet profile and various chronic metabolic diseases. The Dietary Inflammatory Index (DII) was used to assess the inflammatory potential of nutrients and foods in the context of a dietary pattern. We prospectively examined the association between the DII and the incidence of cardiovascular disease (CVD: myocardial infarction, stroke or cardiovascular death) in the PREDIMED (Prevención con Dieta Mediterránea) study including 7216 high-risk participants. The DII was computed based on a validated 137-item food frequency questionnaire. Multivariate-adjusted hazard ratios (HR) and 95% confidence intervals of CVD risk were computed across quartiles of the DII where the lowest (most anti-inflammatory) quartile is the referent. Risk increased across the quartiles (i.e., with increasing inflammatory potential): HR(quartile2) = 1.42 (95%CI = 0.97-2.09); HR(quartile3) = 1.85 (1.27-2.71); and HR(quartile4) = 1.73 (1.15-2.60). When the DII was fit as a continuous variable, the multivariable-adjusted hazard ratio for each additional standard deviation of the DII was 1.22 (1.06-1.40). Our results provide direct prospective evidence that a pro-inflammatory diet is associated with a higher risk of cardiovascular clinical events.
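A schematic version of the quartile-based and per-SD hazard ratio analysis described here, with hypothetical column names and without the PREDIMED covariate set.

```python
# Sketch: hazard ratios across quartiles of a dietary score (lowest quartile
# as referent) and per 1-SD increment. Column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("dii_cohort.csv")  # assumed: dii, years, cvd_event, age, male

df["dii_q"] = pd.qcut(df["dii"], 4, labels=["q1", "q2", "q3", "q4"])
quartiles = pd.get_dummies(df["dii_q"], prefix="dii", drop_first=True).astype(float)
df["dii_per_sd"] = (df["dii"] - df["dii"].mean()) / df["dii"].std()

base = df[["years", "cvd_event", "age", "male"]]

cph_q = CoxPHFitter().fit(pd.concat([base, quartiles], axis=1),
                          duration_col="years", event_col="cvd_event")
print(cph_q.summary["exp(coef)"])          # HRs for q2-q4 vs q1

cph_sd = CoxPHFitter().fit(pd.concat([base, df[["dii_per_sd"]]], axis=1),
                           duration_col="years", event_col="cvd_event")
print(cph_sd.summary.loc["dii_per_sd", "exp(coef)"])  # HR per 1-SD increase
```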
Maternal exposure to heatwave and preterm birth in Brisbane, Australia.
Wang, J; Williams, G; Guo, Y; Pan, X; Tong, S
2013-12-01
To quantify the short-term effects of maternal exposure to heatwave on preterm birth. An ecological study. A population-based study in Brisbane, Australia. All pregnant women who had a spontaneous singleton live birth in Brisbane between November and March in 2000-2010 were studied. Daily data on pregnancy outcomes, meteorological factors, and ambient air pollutants were obtained. The Cox proportional hazards regression model with time-dependent variables was used to examine the short-term impact of heatwave on preterm birth. A series of cut-off temperatures and durations were used to define heatwave. Multivariable analyses were also performed to adjust for socio-economic factors, demographic factors, meteorological factors, and ambient air pollutants. Spontaneous preterm births. The adjusted hazard ratios (HRs) ranged from 1.13 (95% CI 1.03-1.24) to 2.00 (95% CI 1.37-2.91) across the different heatwave definitions, after controlling for demographic, socio-economic, and meteorological factors, and air pollutants. Heatwave was significantly associated with preterm birth, and the associations were robust across heatwave definitions. Threshold temperature, rather than duration, appeared more likely to influence the assessment of birth-relevant heatwaves. The findings of this study may have significant public health implications as climate change progresses. © 2013 RCOG.
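Because results here hinge on how a heatwave is defined, the sketch below shows one way to flag exposure days from a daily maximum-temperature series; the cut-off (95th percentile) and minimum duration (3 days) are illustrative choices, not the definitions used in the study, and the file/column names are hypothetical.

```python
# Sketch: flag days on which a heatwave of >= 3 consecutive hot days
# (tmax at or above the 95th percentile) is in progress. Illustrative only.
import pandas as pd

tmax = (pd.read_csv("brisbane_daily_tmax.csv", parse_dates=["date"])
          .set_index("date")["tmax"])

hot = tmax >= tmax.quantile(0.95)
# Length of the run of consecutive hot days ending on each date
run = hot.groupby((~hot).cumsum()).cumsum()
heatwave_day = hot & (run >= 3)   # True from the 3rd consecutive hot day onward

print(int(heatwave_day.sum()), "heatwave days under this definition")
```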
Feng, Tom; Howard, Lauren E; Vidal, Adriana C; Moreira, Daniel M; Castro-Santamaria, Ramiro; Andriole, Gerald L; Freedland, Stephen J
2017-02-01
To determine if cholesterol is a risk factor for the development of lower urinary tract symptoms in asymptomatic men. A post-hoc analysis of the Reduction by Dutasteride of Prostate Cancer Events (REDUCE) study was carried out in 2323 men with baseline International Prostate Symptom Score <8 and not taking benign prostatic hyperplasia or cholesterol medications. Cox proportional hazards models were used to test the association between cholesterol, high-density lipoprotein, low-density lipoprotein and the cholesterol : high-density lipoprotein ratio with incident lower urinary tract symptoms, defined as first report of medical treatment, surgery or two reports of an International Prostate Symptom Score >14. A total of 253 men (10.9%) developed incident lower urinary tract symptoms. On crude analysis, higher high-density lipoprotein was associated with a decreased lower urinary tract symptoms risk (hazard ratio 0.89, P = 0.024), whereas total cholesterol and low-density lipoprotein showed no association. After multivariable adjustment, the association between high-density lipoprotein and incident lower urinary tract symptoms remained significant (hazard ratio 0.89, P = 0.044), whereas no association was observed for low-density lipoprotein (P = 0.611). There was a trend for higher cholesterol to be linked with higher lower urinary tract symptoms risk, though this was not statistically significant (hazard ratio 1.04, P = 0.054). A higher cholesterol : high-density lipoprotein ratio was associated with increased lower urinary tract symptoms risk on crude (hazard ratio 1.11, P = 0.016) and adjusted models (hazard ratio 1.12, P = 0.012). Among asymptomatic men participating in the REDUCE study, higher cholesterol was associated with increased incident lower urinary tract symptoms risk, though the association was not significant. A higher cholesterol : high-density lipoprotein ratio was associated with increased incident lower urinary tract symptoms, whereas higher high-density lipoprotein was protective. These findings suggest dyslipidemia might play a role in lower urinary tract symptoms progression. © 2016 The Japanese Urological Association.
Cardiovascular reactivity patterns and pathways to hypertension: a multivariate cluster analysis.
Brindle, R C; Ginty, A T; Jones, A; Phillips, A C; Roseboom, T J; Carroll, D; Painter, R C; de Rooij, S R
2016-12-01
Substantial evidence links exaggerated mental stress-induced blood pressure reactivity to future hypertension, but the results for heart rate reactivity are less clear. For this reason, a multivariate cluster analysis was carried out to examine the relationship between heart rate and blood pressure reactivity patterns and hypertension in a large prospective cohort (age range 55-60 years). Four clusters emerged with statistically different systolic and diastolic blood pressure and heart rate reactivity patterns. Cluster 1 was characterised by a relatively exaggerated blood pressure and heart rate response, while the blood pressure and heart rate responses of cluster 2 were relatively modest and in line with the sample mean. Cluster 3 was characterised by blunted cardiovascular stress reactivity across all variables and cluster 4, by an exaggerated blood pressure response and modest heart rate response. Membership in cluster 4 conferred an increased risk of hypertension at 5-year follow-up (hazard ratio=2.98 (95% CI: 1.50-5.90), P<0.01) that survived adjustment for a host of potential confounding variables. These results suggest that cardiac reactivity plays a potentially important role in the link between blood pressure reactivity and hypertension and support the use of multivariate approaches to stress psychophysiology.
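A compact sketch of the cluster-then-survival workflow described above, using k-means on standardized reactivity measures and a Cox model on cluster membership; all variable names are hypothetical and the study's own clustering algorithm and covariates may differ.

```python
# Sketch: k-means clustering (k = 4) of standardized reactivity measures,
# then a Cox model with cluster membership. Variable names are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from lifelines import CoxPHFitter

df = pd.read_csv("reactivity.csv")
# assumed: sbp_react, dbp_react, hr_react, years, hypertension, age, male

X = StandardScaler().fit_transform(df[["sbp_react", "dbp_react", "hr_react"]])
df["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

dummies = pd.get_dummies(df["cluster"], prefix="cluster", drop_first=True).astype(float)
model_df = pd.concat([df[["years", "hypertension", "age", "male"]], dummies], axis=1)

cph = CoxPHFitter().fit(model_df, duration_col="years", event_col="hypertension")
print(cph.summary["exp(coef)"])   # hazard of incident hypertension by cluster
```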
Shirani, Afsaneh; Zhao, Yinshan; Karim, Mohammad Ehsanul; Evans, Charity; Kingwell, Elaine; van der Kop, Mia L; Oger, Joel; Gustafson, Paul; Petkau, John; Tremlett, Helen
2012-07-18
Interferon beta is widely prescribed to treat multiple sclerosis (MS); however, its relationship with disability progression has yet to be established. To investigate the association between interferon beta exposure and disability progression in patients with relapsing-remitting MS. Retrospective cohort study based on prospectively collected data (1985-2008) from British Columbia, Canada. Patients with relapsing-remitting MS treated with interferon beta (n = 868) were compared with untreated contemporary (n = 829) and historical (n = 959) cohorts. The main outcome measure was time from interferon beta treatment eligibility (baseline) to a confirmed and sustained score of 6 (requiring a cane to walk 100 m; confirmed at >150 days with no measurable improvement) on the Expanded Disability Status Scale (EDSS) (range, 0-10, with higher scores indicating higher disability). A multivariable Cox regression model with interferon beta treatment included as a time-varying covariate was used to assess the hazard of disease progression associated with interferon beta treatment. Analyses also included propensity score adjustment to address confounding by indication. The median active follow-up times (first to last EDSS measurement) were as follows: for the interferon beta-treated cohort, 5.1 years (interquartile range [IQR], 3.0-7.0 years); for the contemporary control cohort, 4.0 years (IQR, 2.1-6.4 years); and for the historical control cohort, 10.8 years (IQR, 6.3-14.7 years). The observed outcome rates for reaching a sustained EDSS score of 6 were 10.8%, 5.3%, and 23.1% in the 3 cohorts, respectively. After adjustment for potential baseline confounders (sex, age, disease duration, and EDSS score), exposure to interferon beta was not associated with a statistically significant difference in the hazard of reaching an EDSS score of 6 when either the contemporary control cohort (hazard ratio, 1.30; 95% CI, 0.92-1.83; P = .14) or the historical control cohort (hazard ratio, 0.77; 95% CI, 0.58-1.02; P = .07) were considered. Further adjustment for comorbidities and socioeconomic status, where possible, did not change interpretations, and propensity score adjustment did not substantially change the results. Among patients with relapsing-remitting MS, administration of interferon beta was not associated with a reduction in progression of disability.
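A minimal sketch of a Cox model with treatment as a time-varying covariate, using long-format (start, stop] intervals as lifelines expects; the file and column names are hypothetical and the adjustment set is abbreviated relative to the study.

```python
# Sketch: Cox regression with a time-varying treatment indicator.
# One row per patient-interval; on_ifnb switches to 1 when treatment starts.
# All file/column names are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.read_csv("ms_long_format.csv")
# assumed columns: id, start, stop, reached_edss6, on_ifnb, age, female, baseline_edss

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop",
        event_col="reached_edss6")
ctv.print_summary()   # hazard ratio for on_ifnb, adjusted for the other columns
```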
Okura, Mika; Ogita, Mihoko; Yamamoto, Miki; Nakai, Toshimi; Numata, Tomoko; Arai, Hidenori
2017-06-01
Kyphosis, which tends to progress with age, and the ability to eat firm foods with the back teeth (chewing ability) each have a strong influence on both the physical and mental condition of older people. Thus, this study aimed to examine whether the combination of kyphosis and chewing disorders was associated with mortality or the need for care under the new long-term care insurance (LTCI) service requirement, over 3 years in community-dwelling older Japanese adults. A prospective cohort study. We analyzed the cohort data for older adults (65 years or older) from a prospective study in Kami town. The response rate was 94.3%, and we followed 5094 older individuals for 3 years. Thus, we analyzed 5083 older adults using multiple imputation to manage missing data. The outcomes were mortality or new certifications for LTCI services in a 3-year period. We developed 3 groups by asking 2 self-reported questions on both "no kyphosis" and "good chewing ability." The groups were no kyphosis and good chewing ability (GG), kyphosis and poor chewing ability (BB), and kyphosis and good chewing ability or no kyphosis and poor chewing ability (GB/BG). The prevalence of BB, BG/GB, and GG were 8.9%, 40.3%, and 50.8%, respectively, in our survey. During the 3-year follow-up period, 5.2% (n = 262) died and 13.9% (n = 708) individuals were newly certified as needing LTCI services. As determined by multivariate analyses, BG/GB older adults (adjusted hazard ratio: 1.3 [95% CI 1.1-1.6]) and BB older adults (adjusted hazard ratio: 2.0 [95% CI 1.5-2.4]) had a significantly higher risk of needing LTCI services than GG older adults. Similarly, BG/GB older adults (adjusted hazard ratio: 1.5 [95% CI 1.1-2.0]) and BB older adults (adjusted hazard ratio: 2.3 [95% CI 1.5-3.3]) had a significantly higher risk of mortality than GG older adults did. The presence of kyphosis or poor chewing ability was related to mortality and new certifications for LTCI services, and we found an additive effect of these 2 factors related to frailty. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
Alternatives for using multivariate regression to adjust prospective payment rates
Sheingold, Steven H.
1990-01-01
Multivariate regression analysis has been used in structuring three of the adjustments to Medicare's prospective payment rates. Because the indirect-teaching adjustment, the disproportionate-share adjustment, and the adjustment for large cities are responsible for distributing approximately $3 billion in payments each year, the specification of regression models for these adjustments is of critical importance. In this article, the application of regression for adjusting Medicare's prospective rates is discussed, and the implications that differing specifications could have for these adjustments are demonstrated. PMID:10113271
Lean body mass and risk of incident atrial fibrillation in post-menopausal women
Azarbal, Farnaz; Stefanick, Marcia L.; Assimes, Themistocles L.; Manson, JoAnn E.; Bea, Jennifer W.; Li, Wenjun; Hlatky, Mark A.; Larson, Joseph C.; LeBlanc, Erin S.; Albert, Christine M.; Nassir, Rami; Martin, Lisa W.; Perez, Marco V.
2016-01-01
Aims High body mass index (BMI) is a risk factor for atrial fibrillation (AF). The aim of this study was to determine whether lean body mass (LBM) predicts AF. Methods and results The Women's Health Initiative is a study of post-menopausal women aged 50–79 enrolled at 40 US centres from 1994 to 1998. A subset of 11 393 participants at three centres underwent dual-energy X-ray absorptiometry. Baseline demographics and clinical histories were recorded. Incident AF was identified using hospitalization records and diagnostic codes from Medicare claims. A multivariable Cox hazard regression model adjusted for demographic and clinical risk factors was used to evaluate associations between components of body composition and AF risk. After exclusion for prevalent AF or incomplete data, 8832 participants with an average age of 63.3 years remained for analysis. Over the 11.6 years of average follow-up time, 1035 women developed incident AF. After covariate adjustment, all measures of LBM were independently associated with higher rates of AF: total LBM [hazard ratio (HR) 1.24 per 5 kg increase, 95% confidence intervals (CI) 1.14–1.34], central LBM (HR 1.51 per 5 kg increase, 95% CI 1.31–1.74), and peripheral LBM (HR 1.39 per 5 kg increase, 95% CI 1.19–1.63). The association between total LBM and AF remained significant after adjustment for total fat mass (HR 1.22 per 5 kg increase, 95% CI 1.13–1.31). Conclusion Greater LBM is a strong independent risk factor for AF. After adjusting for obesity-related risk factors, the risk of AF conferred by higher BMI is primarily driven by the association between LBM and AF. PMID:26371115
Chen, Weiqi; Pan, Yuesong; Jing, Jing; Zhao, Xingquan; Liu, Liping; Meng, Xia; Wang, Yilong; Wang, Yongjun
2017-06-01
We aimed to determine the risk conferred by metabolic syndrome (METS) and diabetes mellitus (DM) to recurrent stroke in patients with minor ischemic stroke or transient ischemic attack from the CHANCE (Clopidogrel in High-risk patients with Acute Non-disabling Cerebrovascular Events) trial. In total, 3044 patients were included. Patients were stratified into 4 groups: neither, METS only, DM only, or both. METS was defined using the Chinese Diabetes Society (CDS) and International Diabetes Foundation (IDF) definitions. The primary outcome was new stroke (including ischemic and hemorrhagic) at 90 days. A multivariable Cox regression model was used to assess the relationship of METS and DM status to the risk of recurrent stroke adjusted for potential covariates. Using the CDS criteria of METS, 53.2%, 17.2%, 19.8%, and 9.8% of patients were diagnosed as neither, METS only, DM only, and both, respectively. After 90 days of follow-up, there were 299 new strokes (293 ischemic, 6 hemorrhagic). Patients with DM only (16.1% versus 6.8%; adjusted hazard ratio 2.50, 95% CI 1.89-3.39) and both (17.1% versus 6.8%; adjusted hazard ratio 2.76, 95% CI 1.98-3.86) had significantly increased rates of recurrent stroke. No interaction effect of antiplatelet therapy by different METS or DM status for the risk of recurrent stroke (P=0.82 for interaction in the fully adjusted model of CDS) was observed. Using the METS (IDF) criteria demonstrated similar results. Concurrent METS and DM was associated with an increased risk of recurrent stroke in patients with minor stroke and transient ischemic attack. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Khan, Abigail May; Lubitz, Steven A.; Sullivan, Lisa M.; Sun, Jenny X.; Levy, Daniel; Vasan, Ramachandran S.; Magnani, Jared W.; Ellinor, Patrick T.; Benjamin, Emelia J.; Wang, Thomas J.
2012-01-01
Background Low serum magnesium has been linked to increased risk of atrial fibrillation (AF) following cardiac surgery. It is unknown whether hypomagnesemia predisposes to AF in the community. Methods and Results We studied 3,530 participants (mean age, 44 years; 52% women) from the Framingham Offspring Study who attended a routine examination, and were free of AF and cardiovascular disease. We used Cox proportional hazard regression analysis to examine the association between serum magnesium at baseline and risk of incident AF. Analyses were adjusted for conventional AF risk factors, use of antihypertensive medications, and serum potassium. During up to 20 years of follow-up, 228 participants developed AF. Mean serum magnesium was 1.88 mg/dl. The age- and sex-adjusted incidence rate of AF was 9.4 per 1,000 person-years (95% confidence interval, 6.7 to 11.9) in the lowest quartile of serum magnesium (≤1.77 mg/dl), compared with 6.3 per 1,000 person-years (95% confidence interval, 4.1 to 8.4) in the highest quartile (≥1.99 mg/dl). In multivariable-adjusted models, individuals in the lowest quartile of serum magnesium were approximately 50% more likely to develop AF (adjusted hazard ratio, 1.52, 1.00 to 2.31; P=0.05), compared with those in the upper quartiles. Results were similar after excluding individuals on diuretics. Conclusion Low serum magnesium is moderately associated with the development of AF in individuals without cardiovascular disease. Because hypomagnesemia is common in the general population, a link with AF may have potential clinical implications. Further studies are warranted to confirm our findings and elucidate the underlying mechanisms. PMID:23172839
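The crude version of the person-year incidence rates quoted above can be computed directly, as in this sketch with hypothetical column names; age- and sex-adjustment, as reported in the study, would additionally require standardization.

```python
# Sketch: crude AF incidence per 1,000 person-years by quartile of serum
# magnesium (the study reports age- and sex-adjusted rates). Names hypothetical.
import pandas as pd

df = pd.read_csv("offspring_cohort.csv")  # assumed: magnesium, followup_years, incident_af

df["mg_q"] = pd.qcut(df["magnesium"], 4, labels=["q1", "q2", "q3", "q4"])
rates = (df.groupby("mg_q")["incident_af"].sum()
         / df.groupby("mg_q")["followup_years"].sum()) * 1000
print(rates.round(1))
```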
A Prospective Investigation of Coffee Drinking and Bladder Cancer Incidence in the United States.
Loftfield, Erikka; Freedman, Neal D; Inoue-Choi, Maki; Graubard, Barry I; Sinha, Rashmi
2017-09-01
In 1991, coffee was classified as a group 2B carcinogen, possibly carcinogenic to humans, based on limited epidemiologic evidence of a positive association with bladder cancer. In 2016, the International Agency for Research on Cancer downgraded this classification due to lack of evidence from prospective studies particularly for never smokers. Baseline coffee drinking was assessed with a food frequency questionnaire in the NIH-AARP prospective cohort study. Among 469,047 US adults, who were cancer free at baseline, 6,012 bladder cancer cases (5,088 men and 924 women) were identified during >6.3 million person-years of follow-up. Multivariable-adjusted Cox proportional hazards models were used to estimate hazard ratios (HR) and 95% confidence intervals (CI), with non-coffee drinkers as the reference group. Coffee drinking was positively associated with bladder cancer in models adjusted for age and sex (HR for ≥4 cups/d relative to coffee nondrinkers = 1.91, 95% CI = 1.70, 2.14; P trend < 0.0001). However, the association was substantially attenuated after adjustment for cigarette smoking and other potential confounders (HR for ≥4 cups/d relative to coffee nondrinkers = 1.18, 95% CI = 1.05, 1.33; P trend = 0.0007). Associations were further attenuated after additional adjustment for lifetime smoking patterns among the majority of the cohort with this available data (P trend = 0.16). There was no evidence of an association among never smokers (P trend = 0.84). Positive associations between coffee drinking and bladder cancer among ever smokers but not never smokers suggest that residual confounding from imperfect measurement of smoking or unmeasured risk factors may be an explanation for our positive findings.
Dalager-Pedersen, Michael; Søgaard, Mette; Schønheyder, Henrik Carl; Nielsen, Henrik; Thomsen, Reimar Wernich
2014-04-01
Infections may trigger acute cardiovascular events, but the risk after community-acquired bacteremia is unknown. We assessed the risk for acute myocardial infarction and ischemic stroke within 1 year of community-acquired bacteremia. This population-based cohort study was conducted in Northern Denmark. We included 4389 hospitalized medical patients with positive blood cultures obtained on the day of admission. Patients hospitalized with bacteremia were matched with up to 10 general population controls and up to 5 acutely admitted nonbacteremic controls, matched on age, sex, and calendar time. All incident events of myocardial infarction and stroke during the following 365 days were ascertained from population-based healthcare databases. Multivariable regression analyses were used to assess relative risks with 95% confidence intervals (CIs) for myocardial infarction and stroke among bacteremia patients and their controls. The risk for myocardial infarction or stroke was greatly increased within 30 days of community-acquired bacteremia: 3.6% versus 0.2% among population controls (adjusted relative risk, 20.86; 95% CI, 15.38-28.29) and 1.7% among hospitalized controls (adjusted relative risk, 2.18; 95% CI, 1.80-2.65). The risks for myocardial infarction or stroke remained modestly increased from 31 to 180 days after bacteremia in comparison with population controls (adjusted hazard ratio, 1.64; 95% CI, 1.18-2.27), but not versus hospitalized controls (adjusted hazard ratio, 0.95; 95% CI, 0.69-1.32). No differences in cardiovascular risk were seen after >6 months. Increased 30-day risks were consistently found for a variety of etiologic agents and infectious foci. Community-acquired bacteremia is associated with increased short-term risk of myocardial infarction and stroke.
Non-alcoholic fatty liver disease and the development of reflux esophagitis: A cohort study.
Min, Yang Won; Kim, Youngha; Gwak, Geum-Youn; Gu, Seonhye; Kang, Danbee; Cho, Soo Jin; Guallar, Eliseo; Cho, Juhee; Sinn, Dong Hyun
2018-05-01
Non-alcoholic fatty liver disease (NAFLD), a hepatic manifestation of the metabolic syndrome, is associated with gastroesophageal reflux disease in cross-sectional studies, but a prospective association has not been evaluated. The current study aimed to determine whether NAFLD increases the risk of incident reflux esophagitis in a large cohort study. We conducted a cohort study of 34 063 men and women without reflux esophagitis or other upper gastrointestinal disease at baseline who underwent health checkup examinations between January 2003 and December 2013. Fatty liver was diagnosed by ultrasound based on standard criteria. Reflux esophagitis was defined by the presence of at least grade A mucosal break on esophagogastroduodenoscopy. The prevalence of NAFLD at baseline was 33.2%. During 153 520.2 person-years of follow-up, the cumulative incidences of reflux esophagitis for participants without and with NAFLD were 9.6% and 13.8%, respectively (P < 0.001). The age-adjusted and sex-adjusted hazard ratio for the risk of reflux esophagitis development in participants with NAFLD compared with those without NAFLD was 1.15 (95% confidence interval 1.07-1.23; P < 0.001). However, this association disappeared after adjusting for body mass index and other metabolic factors (hazard ratio 1.01, 95% confidence interval 0.94-1.09; P = 0.79). Similarly, in multivariable-adjusted models, there was no significant association between NAFLD severity and the risk of developing reflux esophagitis. Non-alcoholic fatty liver disease is not independently associated with the risk of the development of reflux esophagitis, but rather, reflux esophagitis is primarily the consequence of increased body mass index commonly associated with NAFLD. © 2017 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
Fink, Stephen P.; Yamauchi, Mai; Nishihara, Reiko; Jung, Seungyoun; Kuchiba, Aya; Wu, Kana; Cho, Eunyoung; Giovannucci, Edward; Fuchs, Charles S.; Ogino, Shuji; Markowitz, Sanford D.; Chan, Andrew T.
2014-01-01
Aspirin use reduces the risk of colorectal neoplasia, at least in part, through inhibition of prostaglandin-endoperoxide synthase 2 (PTGS2, cyclooxygenase 2)-related pathways. Hydroxyprostaglandin dehydrogenase 15-(NAD) (15-PGDH, HPGD) is downregulated in colorectal cancers and functions as a metabolic antagonist of PTGS2. We hypothesized that the effect of aspirin may be antagonized by low 15-PGDH expression in the normal colon. In the Nurses’ Health Study and the Health Professionals Follow-up Study, we collected data on aspirin use and other risk factors every two years and followed up participants for diagnoses of colorectal cancer. Duplication-method, multivariable-adjusted, cause-specific Cox proportional hazards regression for competing-risks data was used to compute hazard ratios (HRs) for incident colorectal cancer according to 15-PGDH mRNA expression level measured in normal mucosa from colorectal cancer resections. Among 127,865 participants, we documented 270 colorectal cancer cases that developed during 3,166,880 person-years of follow-up and from which we could assess 15-PGDH expression. Compared with nonuse, regular aspirin use was associated with lower risk of colorectal cancer that developed within a background of colonic mucosa with high 15-PGDH expression (multivariable HR=0.49; 95% CI, 0.34–0.71), but not with low 15-PGDH expression (multivariable HR=0.90; 95% CI, 0.63–1.27) (P for heterogeneity=0.018). Regular aspirin use was associated with lower incidence of colorectal cancers arising in association with high 15-PGDH expression, but not with low 15-PGDH expression in normal colon mucosa. This suggests that 15-PGDH expression level in normal colon mucosa may serve as a biomarker which may predict stronger benefit from aspirin chemoprevention. PMID:24760190
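A simplified stand-in for the cause-specific, competing-risks analysis described above: in a cause-specific Cox model, events of the competing type are censored at their occurrence. Column names are hypothetical and the duplication-method machinery for subtype-specific outcomes is omitted.

```python
# Sketch: cause-specific Cox model -- colorectal cancers arising in high
# 15-PGDH mucosa are events; competing outcomes are censored. Hypothetical names.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("chemoprevention_cohort.csv")
# assumed: years, event_type (0 = censored, 1 = CRC with high 15-PGDH,
#          2 = competing event), regular_aspirin, age, male, bmi

df["crc_high_pgdh"] = (df["event_type"] == 1).astype(int)

cph = CoxPHFitter()
cph.fit(df[["years", "crc_high_pgdh", "regular_aspirin", "age", "male", "bmi"]],
        duration_col="years", event_col="crc_high_pgdh")
print(cph.summary.loc["regular_aspirin",
                      ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```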
Marrie, Thomas J; Minhas-Sandhu, Jasjeet K; Majumdar, Sumit R
2017-01-01
Objective To determine the attributable risk of community acquired pneumonia on incidence of heart failure throughout the age range of affected patients and severity of the infection. Design Cohort study. Setting Six hospitals and seven emergency departments in Edmonton, Alberta, Canada, 2000-02. Participants 4988 adults with community acquired pneumonia and no history of heart failure were prospectively recruited and matched on age, sex, and setting of treatment (inpatient or outpatient) with up to five adults without pneumonia (controls) or prevalent heart failure (n=23 060). Main outcome measures Risk of hospital admission for incident heart failure or a combined endpoint of heart failure or death up to 2012, evaluated using multivariable Cox proportional hazards analyses. Results The average age of participants was 55 years, 2649 (53.1%) were men, and 63.4% were managed as outpatients. Over a median of 9.9 years (interquartile range 5.9-10.6), 11.9% (n=592) of patients with pneumonia had incident heart failure compared with 7.4% (n=1712) of controls (adjusted hazard ratio 1.61, 95% confidence interval 1.44 to 1.81). Patients with pneumonia aged 65 or less had the lowest absolute increase (but greatest relative risk) of heart failure compared with controls (4.8% v 2.2%; adjusted hazard ratio 1.98, 95% confidence interval 1.5 to 2.53), whereas patients with pneumonia aged more than 65 years had the highest absolute increase (but lowest relative risk) of heart failure (24.8% v 18.9%; adjusted hazard ratio 1.55, 1.36 to 1.77). Results were consistent in the short term (90 days) and intermediate term (one year) and whether patients were treated in hospital or as outpatients. Conclusion Our results show that community acquired pneumonia substantially increases the risk of heart failure across the age and severity range of cases. This should be considered when formulating post-discharge care plans and preventive strategies, and assessing downstream episodes of dyspnoea. PMID:28193610
Antoine, Clemence; Benfari, Giovanni; Michelena, Hector I; Malouf, Joseph F; Nkomo, Vuyisile T; Thapa, Prabin; Enriquez-Sarano, Maurice
2018-05-31
Background Echocardiographic quantitation of degenerative mitral regurgitation (DMR) is recommended whenever possible in clinical guidelines but is criticized, and its scalability to routine clinical practice is doubted. We hypothesized that echocardiographic DMR quantitation, performed in routine clinical practice by multiple practitioners, independently predicts long-term survival and thus is essential to DMR management. Methods We included patients diagnosed with isolated mitral valve prolapse between 2003 and 2011 and any degree of MR quantified by any physician/sonographer in routine clinical practice. Clinical/echocardiographic data acquired at diagnosis were retrieved electronically. The endpoint was mortality under medical treatment, analyzed by the Kaplan-Meier method and proportional hazards models. Results The cohort included 3914 patients (55% male) aged 62±17 years, with left ventricular ejection fraction (LVEF) 63±8% and routinely measured effective regurgitant orifice area (EROA) 19 [0-40] mm². During follow-up (6.7±3.1 years), 696 patients died under medical management and 1263 underwent mitral surgery. In multivariate analysis, routinely measured EROA was associated with mortality (adjusted hazard ratio 1.19 [1.13-1.24] per 10 mm², p<0.0001) independently of LVEF, end-systolic diameter, symptoms, and age/comorbidities. The association between routinely measured EROA and mortality persisted with competing-risk modeling (adjusted hazard ratio 1.15 [1.10-1.20] per 10 mm², p<0.0001), in patients without guideline-based Class I/II surgical triggers (adjusted hazard ratio 1.19 [1.10-1.28] per 10 mm², p<0.0001), and in all subgroups examined (all p<0.01). Spline curve analysis showed that, compared with general-population mortality, excess mortality appears with moderate DMR (EROA ≥20 mm²), becomes notable at EROA ≥30 mm², and increases steadily with higher EROA levels beyond the 40 mm² threshold. Conclusions Echocardiographic DMR quantitation is scalable to routine practice and is independently associated with clinical outcome. Routinely measured EROA is strongly associated with long-term survival under medical treatment. Excess mortality vs. the general population appears in the "moderate" DMR range and increases steadily with higher EROA. Hence, individual EROA values should be integrated into therapeutic considerations, in addition to categorical DMR grading.
Griffiths, Robert I; Lalla, Deepa; Herbert, Robert J; Doan, Justin F; Brammer, Melissa G; Danese, Mark D
2011-01-01
We used Surveillance, Epidemiology, and End Results-Medicare data (2000-2006) to describe treatment and survival in women diagnosed with metastatic breast cancer (MBC) who received trastuzumab. There were 610 patients with a mean age of 74 years. Overall, 32% received trastuzumab alone and 47% received trastuzumab plus a taxane. In multivariate analysis, trastuzumab plus chemotherapy was associated with a lower adjusted cancer mortality rate (Hazard Ratio [HR] 0.54; 95% Confidence Interval [CI] 0.39-0.74; p < .001) than trastuzumab alone among patients who received trastuzumab as part of first-line therapy. Adding chemotherapy to first-line trastuzumab for metastatic breast cancer is associated with improved cancer survival. PMID:21929325
Lifestyle and the risk of diabetes mellitus in a Japanese population.
Tatsumi, Yukako; Ohno, Yuko; Morimoto, Akiko; Nishigaki, Yoshio; Mizuno, Shoichi; Watanabe, Shaw
2013-06-01
The objective was to examine the association between lifestyle and risk for diabetes. For an average of 9.9 years, this study prospectively followed a cohort of 7,211 (2,524 men and 4,687 women) community residents aged 30-69 years without diabetes at a health check-up conducted between April 1990 and March 1992, until diabetes was confirmed or until the end of 2006. The subjects were divided into 6 groups according to their total scores of Breslow's lifestyle index (1-2, 3, 4, 5, 6 and 7 points). The association between lifestyle and diabetes incidence was investigated using Cox proportional hazards regression models. The results showed that the multivariate-adjusted hazard ratios were 0.45 in subjects who scored 5 points, 0.39 in subjects who scored 6 points, and 0.31 in subjects who scored 7 points, compared with subjects who scored 1-2 points. These data indicate that healthy behaviors reduce the incidence of diabetes.
Farnacio, Yvonne; Pratt, Michael E; Marshall, Elizabeth G; Graber, Judith M
2017-10-01
Psychosocial hazards in the workplace may adversely impact occupational and general health, including injury risk. Among 16,417 adult workers in the 2010 National Health Interview Survey Occupational Health Supplement, weighted prevalence estimates were calculated for work-related injuries (WRI) and any injuries. The association between injury and psychosocial occupational hazards (job insecurity, work-family imbalance, hostile work environment) was assessed adjusting for sociodemographic and occupational factors. WRI prevalence was 0.65% (n = 99); any injury prevalence was 2.46% (n = 427). In multivariable models job insecurity, work-family imbalance, and hostile work environment were each positively associated with WRI prevalence (odds ratio [OR]: 1.60, 95% CI: 0.97-2.65; OR: 1.69, 95% CI 0.96-2.89; and 2.01, 95% CI 0.94-4.33, respectively). Stressful working conditions may contribute to injuries. There is need for ongoing surveillance of occupational psychosocial risk factors and further study of their relationship with injury.
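A rough sketch of the weighted logistic model behind odds ratios of this kind, with hypothetical variable names; it treats the survey weights as frequency weights, whereas a full design-based analysis of NHIS strata and PSUs would need dedicated survey software.

```python
# Sketch: weighted logistic regression of work-related injury on psychosocial
# hazards. freq_weights is a simplification of proper survey-design variance
# estimation. All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("nhis_ohs.csv")
# assumed: wri (0/1), job_insecurity, work_family_imbalance, hostile_env,
#          age, female, sample_weight

X = sm.add_constant(df[["job_insecurity", "work_family_imbalance",
                        "hostile_env", "age", "female"]])
fit = sm.GLM(df["wri"], X, family=sm.families.Binomial(),
             freq_weights=df["sample_weight"]).fit()
print(np.exp(fit.params))   # odds ratios
```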
NASA Astrophysics Data System (ADS)
André, C.; Monfort, D.; Bouzit, M.; Vinchon, C.
2013-08-01
There are a number of methodological issues involved in assessing damage caused by natural hazards. The first is the lack of data, due to the rarity of events and the widely different circumstances in which they occur. Thus, historical data, albeit scarce, should not be neglected when seeking to build ex-ante risk management models. This article analyses the input of insurance data for two recent severe coastal storm events, to examine what causal relationships may exist between hazard characteristics and the level of damage incurred by residential buildings. To do so, data was collected at two levels: from lists of about 4000 damage records, 358 loss adjustment reports were consulted, constituting a detailed damage database. The results show that for flooded residential buildings, over 75% of reconstruction costs are associated with interior elements, with damage to structural components remaining very localised and negligible. Further analysis revealed a high scatter between costs and water depth, suggesting that uncertainty remains high in drawing up damage functions with insurance data alone. Due to the paper format of the loss adjustment reports, and the lack of harmonisation between their contents, the collection stage called for a considerable amount of work. For future events, establishing a standardised process for archiving damage information could significantly contribute to the production of such empirical damage functions. Nevertheless, complementary sources of data on hazards and asset vulnerability parameters will definitely still be necessary for damage modelling; multivariate approaches, crossing insurance data with external material, should also be investigated more deeply.
Witberg, Guy; Regev, Ehud; Chen, Shmuel; Assali, Abbid; Barbash, Israel M; Planer, David; Vaknin-Assa, Hana; Guetta, Victor; Vukasinovic, Vojislav; Orvin, Katia; Danenberg, Haim D; Segev, Amit; Kornowski, Ran
2017-07-24
The study sought to examine the effect of coronary artery disease (CAD) on mortality in patients undergoing transcatheter aortic valve replacement (TAVR). CAD is common in the TAVR population. However, there are conflicting data on the prognostic significance of CAD and its treatment in this population. The authors analyzed 1,270 consecutive patients with severe aortic stenosis (AS) undergoing TAVR at 3 Israeli centers. They investigated the association of CAD severity (no CAD, nonsevere CAD [i.e., SYNTAX score (SS) <22], severe CAD [SS >22]) and revascularization completeness ("reasonable" incomplete revascularization [ICR] [i.e., residual SS <8]; ICR [residual SS >8]) with all-cause mortality following TAVR using a Cox proportional hazards model adjusted for multiple prognostic variables. Of the 1,270 patients, 817 (64%) had no CAD, 331 (26%) had nonsevere CAD, and 122 (10%) had severe CAD. Over a median follow-up of 1.9 years, 311 (24.5%) patients died. Mortality was higher in the severe CAD and the ICR groups, but not in the nonsevere CAD or "reasonable" ICR groups, versus no CAD. After multivariate adjustment, both severe CAD (hazard ratio: 2.091; p = 0.017) and ICR (hazard ratio: 1.720; p = 0.031) were associated with increased mortality. Only severe CAD was associated with increased mortality post-TAVR. More complete revascularization pre-TAVR may attenuate the association of severe CAD and mortality. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Williams, Robert C; Opelz, Gerhard; McGarvey, Chelsea J; Weil, E Jennifer; Chakkera, Harini A
2016-05-01
Since the early days of kidney transplantation, there has been active debate about the role of human leucocyte antigen (HLA) matching in kidney allograft survival. Recent studies have reported diminishing importance of HLA matching, which have, in turn, been challenged by reports that suggest the continuing importance of these loci. Given the controversies, we examined the effect of HLA compatibility on kidney allograft survival by studying all first adult kidney transplants in the United States from a deceased donor. Using the United Network for Organ Sharing data, we identified first deceased donor kidney transplants between October 1, 1987, and December 31, 2013. Recipients were classified by their number of HLA mismatches. Cox multivariate regression analyses adjusting for recipient and donor transplant characteristics were performed to determine the impact of HLA compatibility on kidney allograft survival. The study cohort included 189 141 first adult kidney-alone transplants, with a total of 994 558 years of kidney allograft follow-up time. Analyses adjusted for recipient and donor characteristics demonstrated a 13% higher risk (hazard ratio, 1.13; 95% confidence interval, 1.06-1.21) with 1 mismatch and a 64% higher risk (hazard ratio, 1.64; 95% confidence interval, 1.56-1.73) with 6 mismatches. Dividing the mismatch categories into 27 ordered permutations, and testing their 57 within-mismatch-category differences, demonstrated that all but 1 were equal, independent of locus. Hazard ratios rose in a significant linear relationship with the number of HLA mismatches, and HLA mismatch continues to affect allograft survival even during recent periods of increasing success in renal transplantation.
Ndumbi, Patricia; Gillis, Jennifer; Raboud, Janet M; Cooper, Curtis; Hogg, Robert S; Montaner, Julio S G; Burchell, Ann N; Loutfy, Mona R; Machouf, Nima; Klein, Marina B; Tsoukas, Chris M
2013-11-28
We investigated the probability of transitioning in or out of the CD3⁺ T-cell homeostatic range during antiretroviral therapy, and we assessed the clinical impact of lost T-cell homeostasis (TCH) on AIDS-defining illnesses (ADIs) or death. Within the Canadian Observational Cohort (CANOC), we studied 4463 antiretroviral therapy (ART)-naive HIV-positive patients initiating combination ART (cART) between 2000 and 2010. CD3⁺ trajectories were estimated using a four-state Markov model. CD3⁺ T-cell percentage states were classified as follows: very low (<50%), low (50-64%), normal (65-85%), and high (>85%). Covariates associated with transitioning between states were examined. The association between CD3⁺ T-cell percentage states and time to ADI/death from cART initiation was determined using Cox proportional hazards models. A total of 4463 patients were followed for a median of 3 years. Two thousand, five hundred and eight (56%) patients never transitioned from their baseline CD3⁺ T-cell percentage state; 85% of these had normal TCH. In multivariable analysis, individuals with time-updated low CD4⁺ cell count, time-updated detectable viral load, older age, and hepatitis C virus (HCV) coinfection were less likely to maintain TCH. In the multivariable proportional hazards model, both very low and high CD3⁺ T-cell percentages were associated with increased risk of ADI/death [adjusted hazard ratio=1.91 (95% confidence interval, CI: 1.27-2.89) and hazard ratio=1.49 (95% CI: 1.13-1.96), respectively]. Patients with very low or high CD3⁺ T-cell percentages are at risk for ADIs/death. To our knowledge, this is the first study linking altered TCH and morbidity/mortality in cART-treated HIV-positive patients.
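As a rough analogue of the four-state model described above (the study fit a continuous-time multi-state Markov model, which typically requires dedicated software such as the R msm package), the sketch below tabulates empirical one-visit transition probabilities between CD3⁺ states; file and column names are hypothetical.

```python
# Sketch: empirical one-step transition probabilities between CD3+ T-cell
# percentage states from consecutive visits. A rough, discrete-time analogue
# of the study's Markov model. Column names are hypothetical.
import pandas as pd

visits = (pd.read_csv("cd3_visits.csv", parse_dates=["visit_date"])
            .sort_values(["patient_id", "visit_date"]))
# assumed columns: patient_id, visit_date, cd3_state (very_low/low/normal/high)

visits["next_state"] = visits.groupby("patient_id")["cd3_state"].shift(-1)
pairs = visits.dropna(subset=["next_state"])

transitions = pd.crosstab(pairs["cd3_state"], pairs["next_state"], normalize="index")
print(transitions.round(3))
```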
Yeh, Hsin-Chih; Jan, Hau-Chern; Wu, Wen-Jeng; Li, Ching-Chia; Li, Wei-Ming; Ke, Hung-Lung; Huang, Shu-Pin; Liu, Chia-Chu; Lee, Yung-Chin; Yang, Sheau-Fang; Liang, Peir-In; Huang, Chun-Nung
2015-01-01
Objectives To investigate the impact of preoperative hydronephrosis and flank pain on prognosis of patients with upper tract urothelial carcinoma. Methods In total, 472 patients with upper tract urothelial carcinoma managed by radical nephroureterectomy were included from Kaohsiung Medical University Hospital Healthcare System. Clinicopathological data were collected retrospectively for analysis. The significance of hydronephrosis, especially when combined with flank pain, and other relevant factors on overall and cancer-specific survival was evaluated. Results Of the 472 patients, 292 (62%) had preoperative hydronephrosis and 121 (26%) presented with flank pain. Preoperative hydronephrosis was significantly associated with age, hematuria, flank pain, tumor location, and pathological tumor stage. Concurrent presence of hydronephrosis and flank pain was a significant predictor of non-organ-confined disease (multivariate-adjusted hazard ratio = 2.10, P = 0.025). Kaplan-Meier analysis showed significantly poorer overall and cancer-specific survival in patients with preoperative hydronephrosis (P = 0.005 and P = 0.026, respectively) and in patients with flank pain (P < 0.001 and P = 0.001, respectively) than those without. However, only simultaneous hydronephrosis and flank pain independently predicted adverse outcome (hazard ratio = 1.98, P = 0.016 for overall survival and hazard ratio = 1.87, P = 0.036 for cancer-specific survival, respectively) in multivariate Cox proportional hazards models. In addition, concurrent presence of hydronephrosis and flank pain was also significantly predictive of worse survival in patients with high-grade or muscle-invasive disease. Notably, there was no difference in survival between patients with hydronephrosis but devoid of flank pain and those without hydronephrosis. Conclusion Concurrent preoperative presence of hydronephrosis and flank pain predicted non-organ-confined status of upper tract urothelial carcinoma. When accompanied with flank pain, hydronephrosis represented an independent predictor for worse outcome in patients with upper tract urothelial carcinoma. PMID:26469704
Unger, Erin D; Dubin, Ruth F; Deo, Rajat; Daruwalla, Vistasp; Friedman, Julie L; Medina, Crystal; Beussink, Lauren; Freed, Benjamin H; Shah, Sanjiv J
2016-01-01
Chronic kidney disease (CKD) is associated with worse outcomes in heart failure with preserved ejection fraction (HFpEF). Whether this association is due to the effect of CKD on intrinsic abnormalities in cardiac function is unknown. We hypothesized that CKD is independently associated with worse cardiac mechanics in HFpEF. We prospectively studied 299 patients enrolled in the Northwestern University HFpEF Program. Using the creatinine-based CKD-Epi equation to calculate estimated glomerular filtration rate (eGFR), study participants were analysed by CKD status (using eGFR <60 mL/min/1.73 m² to denote CKD). Indices of cardiac mechanics (longitudinal strain parameters) were measured using speckle-tracking echocardiography. Using multivariable-adjusted linear and Cox regression analyses, we determined the association between CKD and echocardiographic parameters and clinical outcomes (cardiovascular hospitalization or death). Of 299 study participants, 48% had CKD. CKD (dichotomous variable) and reduced eGFR (continuous variable) were both associated with worse cardiac mechanics indices, including left atrial (LA) reservoir strain, LV longitudinal strain, and right ventricular free wall strain, even after adjusting for potential confounders, including co-morbidities, EF, and volume status. For example, for each 1-SD decrease in eGFR, LA reservoir strain was 3.52% units lower (P < 0.0001) after multivariable adjustment. Reduced eGFR was also associated with worse outcomes [adjusted hazard ratio (HR) 1.28, 95% confidence interval (CI) 1.01-1.61 per 1-SD decrease in eGFR; P = 0.039]. The association was attenuated after adjustment for indices of cardiac mechanics (P = 0.064). In HFpEF, CKD is independently associated with worse cardiac mechanics, which may explain why HFpEF patients with CKD have worse outcomes. NCT01030991. © 2015 The Authors European Journal of Heart Failure © 2015 European Society of Cardiology.
Effect of HCV, HIV and coinfection in kidney transplant recipients: mate kidney analyses.
Xia, Y; Friedmann, P; Yaffe, H; Phair, J; Gupta, A; Kayler, L K
2014-09-01
Reports of kidney transplantation (KTX) in recipients with hepatitis C virus (HCV+), human immunodeficiency virus (HIV+) or coinfection often do not provide adequate adjustment for donor risk factors. We evaluated paired deceased-donor kidneys (derived from the same donor transplanted to different recipients) in which one kidney was transplanted into a patient with viral infection (HCV+, n = 1700; HIV+, n = 243) and the other transplanted into a recipient without infection (HCV- n = 1700; HIV- n = 243) using Scientific Registry of Transplant Recipients data between 2000 and 2013. On multivariable analysis (adjusted for recipient risk factors), HCV+ conferred increased risks of death-censored graft survival (DCGS) (adjusted hazard ratio [aHR] 1.24, 95% confidence interval [CI] 1.04-1.47) and patient survival (aHR 1.24, 95% CI 1.06-1.45) compared with HCV-. HIV+ conferred similar DCGS (aHR 0.85, 95% CI 0.48-1.51) and patient survival (aHR 0.80, 95% CI 0.39-1.64) compared with HIV-. HCV coinfection was a significant independent risk factor for DCGS (aHR 2.33; 95% CI 1.06, 5.12) and patient survival (aHR 2.88; 95% CI 1.35, 6.12). On multivariable analysis, 1-year acute rejection was not associated with HCV+, HIV+ or coinfection. Whereas KTX in HIV+ recipients were associated with similar outcomes relative to noninfected recipients, HCV monoinfection and, to a greater extent, coinfection were associated with poor patient and graft survival. © Copyright 2014 The American Society of Transplantation and the American Society of Transplant Surgeons.
Martinez-Aguilar, Esther; Orbe, Josune; Fernández-Montero, Alejandro; Fernández-Alonso, Sebastián; Rodríguez, Jose A; Fernández-Alonso, Leopoldo; Páramo, Jose A; Roncal, Carmen
2017-11-01
The prognosis of patients with peripheral arterial disease (PAD) is characterized by an exceptionally high risk for myocardial infarction, ischemic stroke, and death; however, studies in search of new prognostic biomarkers in PAD are scarce. Even though low levels of high-density lipoprotein cholesterol (HDL-C) have been associated with higher risk of cardiovascular (CV) complications and death in different atherosclerotic diseases, recent epidemiologic studies have challenged its prognostic utility. The aim of this study was to test the predictive value of HDL-C as a risk factor for ischemic events or death in symptomatic PAD patients. Clinical and demographic parameters of 254 symptomatic PAD patients were recorded. Amputation, ischemic coronary disease, cerebrovascular disease, and all-cause mortality were recorded during a mean follow-up of 2.7 years. Multivariate analyses showed that disease severity (critical limb ischemia) was significantly reduced in patients with normal HDL-C levels compared with the group with low HDL-C levels (multivariate analysis odds ratio, 0.09; 95% confidence interval [CI], 0.03-0.24). A decreased risk for mortality (hazard ratio, 0.46; 95% CI, 0.21-0.99) and major adverse CV events (hazard ratio, 0.38; 95% CI, 0.16-0.86) was also found in patients with normal vs reduced levels of HDL-C in both Cox proportional hazards models and Kaplan-Meier estimates, after adjustment for confounding factors. Reduced HDL-C levels were significantly associated with higher risk for development of CV complications as well as with mortality in PAD patients. These findings highlight the usefulness of this simple test for early identification of PAD patients at high risk for development of major CV events. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Wang, Haibo; Brown, Katherine S.; Wang, Guixiang; Ding, Guowei; Zang, Chunpeng; Wang, Junjie; Reilly, Kathleen H.; Chen, Helen; Wang, Ning
2012-01-01
Background Drug use and sex work have had facilitative roles in the transmission of HIV/AIDS in China. Stopping drug use among sex workers may help to control the growth of the HIV/AIDS epidemic among Chinese sex workers. Methods From March 2006 to November 2009, female sex workers (FSW) in Kaiyuan City, Yunnan, China were recruited into an open cohort study. Participants were interviewed and tested for drug use and HIV/sexually transmitted infection (STI) prevalence. Follow-up surveys were conducted every six months. Multivariate Cox proportional hazards regression model with time dependent variables was used to measure the associations between independent variables and drug initiation. Results During the course of the study, 66 (8.8%) FSWs initiated drug use yielding an overall incidence of 6.0 per 100 person years (PY) (95% confidence interval [CI], 4.67–7.58). In the multivariate Cox proportional hazards regression model, being HIV-positive and aware of positive serostatus (adjusted hazard ratio [AHR] 2.6, 95% CI 1.24–5.55), age at initiation of commercial sex work <20 years (AHR 1.8, 95% CI 1.12–3.01), and working in a high-risk establishment (AHR 1.9, 95% CI 1.14–3.04) were associated with illicit drug initiation. Conclusions Being HIV-positive and aware of positive serostatus was the most salient predictor for the initiation of illicit drug use. Interventions offering sources of education, treatment, support, and counseling to HIV-positive FSWs need to be implemented in order to help promote self-efficacy and safe behaviors among this group of high-risk women. PMID:21402453
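For readers unfamiliar with Cox regression using time-dependent covariates, the following sketch shows the usual long-format (start, stop] data layout and a fit with lifelines' CoxTimeVaryingFitter. The data are simulated and all variable names are hypothetical stand-ins for the covariates described above, not the study's dataset.

```python
# Hedged sketch: Cox model with a time-dependent covariate (awareness of
# HIV-positive status) over 6-monthly follow-up intervals. Simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(1)
rows = []
for pid in range(250):
    entry_age_lt20 = int(rng.random() < 0.3)   # baseline (fixed) covariate
    high_risk_venue = int(rng.random() < 0.4)
    hiv_aware, t, event = 0, 0.0, 0
    while t < 3.0 and not event:               # 6-monthly intervals up to 3 years
        stop = t + 0.5
        event = int(rng.random() < 0.03 + 0.03 * hiv_aware)
        rows.append({"id": pid, "start": t, "stop": stop,
                     "hiv_aware": hiv_aware,
                     "entry_age_lt20": entry_age_lt20,
                     "high_risk_venue": high_risk_venue,
                     "drug_initiation": event})
        # time-dependent covariate: awareness of status can switch on later
        hiv_aware = max(hiv_aware, int(rng.random() < 0.05))
        t = stop
long_df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="drug_initiation",
        start_col="start", stop_col="stop")
ctv.print_summary()
```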
Hayashi, Kanna; Milloy, Michael-John; Wood, Evan; Dong, Huiru; Montaner, Julio SG; Kerr, Thomas
2014-01-01
Introduction While HIV/AIDS remains an important cause of death among people who inject drugs (PWID), the potential mortality burden attributable to hepatitis C virus (HCV) infection among this population is of increasing concern. Therefore, we sought to identify trends in and predictors of liver-related mortality among PWID. Methods Data were derived from prospective cohorts of PWID in Vancouver, Canada, between 1996 and 2011. Cohort data were linked to the provincial vital statistics database to ascertain mortality rates and causes of death. Multivariate Cox proportional hazards regression was used to examine the relationship between HCV infection and time to liver-related death. A sub-analysis examined the effect of HIV/HCV co-infection. Results and discussion In total, 2,279 PWID participated in this study, with 1,921 (84.3%) having seroconverted to anti-HCV prior to baseline assessments and 124 (5.4%) during follow-up. The liver-related mortality rate was 2.1 (95% confidence interval [CI]: 1.5–3.0) deaths per 1,000 person-years and was stable over time. In multivariate analyses, HCV seropositivity was not significantly associated with liver-related mortality (adjusted relative hazard [ARH]: 0.45; 95% CI: 0.15–1.37), but HIV seropositivity was (ARH: 2.67; 95% CI: 1.27–5.63). In sub-analysis, HIV/HCV co-infection had a 2.53 (95% CI: 1.18–5.46) times hazard of liver-related death compared with HCV mono-infection. Conclusions In this study, HCV seropositivity did not predict liver-related mortality while HIV seropositivity did. The findings highlight the critical role of HIV mono- and co-infection rather than HCV infection in contributing to liver-related mortality among PWID in this setting. PMID:25391765
Ko, Naomi Y; Snyder, Frederick R; Raich, Peter C; Paskett, Electra D; Dudley, Donald J; Lee, Ji-Hyun; Levine, Paul H; Freund, Karen M
2016-09-01
Patient navigation was developed to address barriers to timely care and reduce cancer disparities. The current study explored navigation and racial and ethnic differences in time to the diagnostic resolution of a cancer screening abnormality. The authors conducted an analysis of the multisite Patient Navigation Research Program. Participants with an abnormal cancer screening test were allocated to either navigation or control. The unadjusted median time to resolution was calculated for each racial and ethnic group by navigation and control. Multivariable Cox proportional hazards models were fit, adjusting for sex, age, cancer abnormality type, and health insurance and stratifying by center of care. Among a sample of 7514 participants, 29% were non-Hispanic white, 43% were Hispanic, and 28% were black. In the control group, black individuals were found to have a longer median time to diagnostic resolution (108 days) compared with non-Hispanic white individuals (65 days) or Hispanic individuals (68 days) (P<.0001). In the navigated groups, black individuals had a reduction in the median time to diagnostic resolution (97 days) (P<.0001). In the multivariable models, among controls, black race was found to be associated with an increased delay to diagnostic resolution (hazard ratio, 0.77; 95% confidence interval, 0.69-0.84) compared with non-Hispanic white individuals, which was reduced in the navigated arm (hazard ratio, 0.85; 95% confidence interval, 0.77-0.94). Patient navigation appears to have the greatest impact among black patients, who had the greatest delays in care. Cancer 2016;122:2715-2722. © 2016 American Cancer Society.
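A stratified Cox model of the type described (covariate adjustment plus stratification by center, so each center keeps its own baseline hazard) can be sketched as follows; the dataset and column names are simulated placeholders, not the Patient Navigation Research Program data.

```python
# Hedged sketch: Cox model for time to diagnostic resolution, adjusted for
# covariates and stratified by center of care. Simulated, illustrative data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "days_to_resolution": rng.exponential(90, n),
    "resolved": rng.integers(0, 2, n),
    "black": rng.integers(0, 2, n),
    "hispanic": rng.integers(0, 2, n),
    "navigated": rng.integers(0, 2, n),
    "age": rng.normal(50, 12, n),
    "insured": rng.integers(0, 2, n),
    "center": rng.integers(0, 10, n),   # stratification variable
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_resolution", event_col="resolved",
        strata=["center"])
# Note: an HR < 1 here means a lower "hazard" of resolving, i.e. a longer
# median time to diagnostic resolution.
print(cph.hazard_ratios_)
```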
Wang, G; Xu, W G; Li, F; Su, K; Li, N; Lü, Z Y; Feng, X S; Wei, L P; Chen, H D; Chen, Y H; Guo, L W; Cui, H; Yang, W J; Li, Z F; Ren, J S; Wu, S L; Shi, J F; Dai, M; He, J
2017-10-31
Objective: To investigate whether elevated levels of high sensitivity C-Reactive Protein (hsCRP) and neutrophil (NE) at baseline are associated with an increased risk of colorectal cancer in the Kailuan male cohort. Methods: Since May 2006, males from the Kailuan cohort were included in this study. Information on demographics, medical history, anthropometry, hsCRP and NE was collected at baseline for all subjects. Multivariable Cox proportional hazards regression models were used to calculate hazard ratios (HR) for the association between baseline hsCRP and NE and colorectal cancer risk. Results: By December 31, 2015, a total of 73 869 participants were enrolled in this study. During the follow-up, 336 incident colorectal cancer cases were identified. All participants were divided into three groups according to the level of hsCRP (<1 mg/L, 1-3 mg/L and >3 mg/L). The cumulative incidences of colorectal cancer were 456/10⁵, 510/10⁵ and 746/10⁵ in these 3 groups, respectively (χ²=10.79, P=0.005). Compared with participants with lower hsCRP levels (<1 mg/L), individuals with the highest hsCRP levels (>3 mg/L) had a significantly increased risk of colorectal cancer (HR=1.38, 95% CI: 1.05-1.81, P=0.020) after adjusting for age, gender, smoking, drinking, BMI, diabetes and income. Furthermore, subjects were divided into two groups according to the level of NE (≤4.08×10⁹/L and >4.08×10⁹/L). Multivariable Cox proportional hazards regression models indicated no statistically significant association between NE and colorectal cancer. Conclusions: Elevated levels of hsCRP at baseline might increase the risk of colorectal cancer in males.
Prognostic implications of mutation-specific QTc standard deviation in congenital long QT syndrome.
Mathias, Andrew; Moss, Arthur J; Lopes, Coeli M; Barsheshet, Alon; McNitt, Scott; Zareba, Wojciech; Robinson, Jennifer L; Locati, Emanuela H; Ackerman, Michael J; Benhorin, Jesaia; Kaufman, Elizabeth S; Platonov, Pyotr G; Qi, Ming; Shimizu, Wataru; Towbin, Jeffrey A; Michael Vincent, G; Wilde, Arthur A M; Zhang, Li; Goldenberg, Ilan
2013-05-01
Individual corrected QT interval (QTc) may vary widely among carriers of the same long QT syndrome (LQTS) mutation. Currently, neither the mechanism nor the implications of this variable penetrance are well understood. We hypothesized that the assessment of QTc variance in patients with congenital LQTS who carry the same mutation provides incremental prognostic information beyond the patient-specific QTc. The study population comprised 1206 patients with LQTS carrying 95 different mutations, each present in ≥5 individuals. Multivariate Cox proportional hazards regression analysis was used to assess the effect of mutation-specific standard deviation of QTc (QTcSD) on the risk of cardiac events (comprising syncope, aborted cardiac arrest, and sudden cardiac death) from birth through age 40 years in the total population and by genotype. Assessment of mutation-specific QTcSD showed large differences among carriers of the same mutations (median QTcSD 45 ms). Multivariate analysis showed that each 20 ms increment in QTcSD was associated with a significant 33% (P = .002) increase in the risk of cardiac events after adjustment for the patient-specific QTc duration and the family effect on QTc. The risk associated with QTcSD was pronounced among patients with long QT syndrome type 1 (hazard ratio 1.55 per 20 ms increment; P<.001), whereas among patients with long QT syndrome type 2, the risk associated with QTcSD was not statistically significant (hazard ratio 0.99; P = .95; P value for QTcSD-by-genotype interaction = .002). Our findings suggest that mutations with a wider variation in QTc duration are associated with increased risk of cardiac events. These findings appear to be genotype-specific, with a pronounced effect among patients with the long QT syndrome type 1 genotype. Copyright © 2013. Published by Elsevier Inc.
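As background on how a per-increment hazard ratio of this kind follows from a Cox coefficient (generic Cox-model algebra, not specific to this study):

```latex
% Generic Cox-model algebra: with \hat{\beta} the estimated log-hazard
% coefficient for QTcSD measured per 1 ms,
\mathrm{HR}_{+20\,\mathrm{ms}} = e^{20\hat{\beta}}, \qquad
e^{20\hat{\beta}} = 1.33 \;\Longrightarrow\;
\hat{\beta} = \frac{\ln 1.33}{20} \approx 0.0143 \text{ per ms}.
```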
Eguchi, Kazuo; Pickering, Thomas G; Schwartz, Joseph E; Hoshide, Satoshi; Ishikawa, Joji; Ishikawa, Shizukiyo; Shimada, Kazuyuki; Kario, Kazuomi
2008-11-10
It is not known whether short duration of sleep is a predictor of future cardiovascular events in patients with hypertension. To test the hypothesis that short duration of sleep is independently associated with incident cardiovascular diseases (CVD), we performed ambulatory blood pressure (BP) monitoring in 1255 subjects with hypertension (mean [SD] age, 70.4 [9.9] years) and followed them for a mean period of 50 (23) months. Short sleep duration was defined as less than 7.5 hours (20th percentile). Multivariable Cox hazard models predicting CVD events were used to estimate the adjusted hazard ratio and 95% confidence interval (CI) for short sleep duration. A riser pattern was defined when mean nighttime systolic BP exceeded daytime systolic BP. The end point was a cardiovascular event: stroke, fatal or nonfatal myocardial infarction (MI), and sudden cardiac death. In multivariable analyses, short duration of sleep (<7.5 hours) was associated with incident CVD (hazard ratio [HR], 1.68; 95% CI, 1.06-2.66; P = .03). A synergistic interaction was observed between short sleep duration and the riser pattern (P = .09). When subjects were classified according to their sleep time and a riser vs nonriser pattern, the group with shorter sleep duration plus the riser pattern had a substantially and significantly higher incidence of CVD than the group with predominant normal sleep duration plus the nonriser pattern (HR, 4.43; 95% CI, 2.09-9.39; P < .001), independent of covariates. Short duration of sleep is associated with incident CVD risk, and the combination of the riser pattern and short duration of sleep is most strongly predictive of future CVD, independent of ambulatory BP levels. Physicians should inquire about sleep duration in the risk assessment of patients with hypertension.
Jansen, Mona Dverdal; Knudsen, Gun Peggy; Myhre, Ronny; Høiseth, Gudrun; Mørland, Jørg; Næss, Øyvind; Tambs, Kristian; Magnus, Per
2014-05-01
Single nucleotide polymorphisms (SNPs) in loci 1p13 and 9p21 have previously been found to be associated with incident coronary heart disease (CHD). This study aimed to investigate whether these SNPs show associations with fatal CHD in a population-based cohort study after adjustment for socioeconomic- and lifestyle-related CHD risk factors not commonly included in genetic association studies. Using the population-based Cohort of Norway (CONOR), a nested case-cohort study was set up and DNA from 2,953 subjects (829 cases and 2,124 non-cases) were genotyped. The association with fatal CHD was estimated for four SNPs, three from locus 1p13 and one from locus 9p21. Multivariable Cox regression was used to estimate unstratified and gender-stratified hazard ratios while adjusting for major CHD risk factors. The associations between three SNPs from locus 1p13 and non-HDL cholesterol levels were also estimated. Men homozygous for the risk alleles on rs1333049 (9p21) and rs14000 (1p13) were found to have significantly increased hazard ratios in crude and adjusted models, and the hazard ratios remained statistically significant when both genders were analyzed together. Adjustment for additional socioeconomic- and lifestyle-related CHD risk factors influenced the association estimates only slightly. No significant associations were observed between the other two SNPs in loci 1p13 (rs599839 and rs646776) and CHD mortality in either gender. Both rs599839 and rs646776 showed significant, gradual increases in non-HDL cholesterol levels with increasing number of risk alleles. This study confirms the association between 9p21 (rs1333049) and fatal CHD in a Norwegian population-based cohort. The effect was not influenced by several socioeconomic- and lifestyle-related risk factors. Our results show that 1p13 (rs14000) may also be associated with fatal CHD. SNPs at 1p13 (rs599839 and rs646776) were associated with non-HDL cholesterol levels.
Weinberg, Ido; Gona, Philimon; O'Donnell, Christopher J; Jaff, Michael R; Murabito, Joanne M
2014-03-01
An increased interarm systolic blood pressure difference is an easily determined physical examination finding. The relationship between interarm systolic blood pressure difference and risk of future cardiovascular disease is uncertain. We described the prevalence and risk factor correlates of interarm systolic blood pressure difference in the Framingham Heart Study (FHS) original and offspring cohorts and examined the association between interarm systolic blood pressure difference and incident cardiovascular disease and all-cause mortality. An increased interarm systolic blood pressure difference was defined as ≥ 10 mm Hg using the average of initial and repeat blood pressure measurements obtained in both arms. Participants were followed through 2010 for incident cardiovascular disease events. Multivariable Cox proportional hazards regression analyses were performed to investigate the effect of interarm systolic blood pressure difference on incident cardiovascular disease. We examined 3390 (56.3% female) participants aged 40 years and older, free of cardiovascular disease at baseline, mean age of 61.1 years, who attended a FHS examination between 1991 and 1994 (original cohort) and from 1995 to 1998 (offspring cohort). The mean absolute interarm systolic blood pressure difference was 4.6 mm Hg (range 0-78). Increased interarm systolic blood pressure difference was present in 317 (9.4%) participants. The median follow-up time was 13.3 years, during which time 598 participants (17.6%) experienced a first cardiovascular event, including 83 (26.2%) participants with interarm systolic blood pressure difference ≥ 10 mm Hg. Compared with those with normal interarm systolic blood pressure difference, participants with an elevated interarm systolic blood pressure difference were older (63.0 years vs 60.9 years), had a greater prevalence of diabetes mellitus (13.3% vs 7.5%), higher systolic blood pressure (136.3 mm Hg vs 129.3 mm Hg), and a higher total cholesterol level (212.1 mg/dL vs 206.5 mg/dL). Interarm systolic blood pressure difference was associated with a significantly increased hazard of incident cardiovascular events in the multivariable adjusted model (hazard ratio 1.38; 95% CI, 1.09-1.75). For each 1-SD-unit increase in absolute interarm systolic blood pressure difference, the hazard ratio for incident cardiovascular events was 1.07 (95% CI, 1.00-1.14) in the fully adjusted model. There was no such association with mortality (hazard ratio 1.02; 95% CI 0.76-1.38). In this community-based cohort, an interarm systolic blood pressure difference is common and associated with a significantly increased risk for future cardiovascular events, even when the absolute difference in arm systolic blood pressure is modest. These findings support research to expand clinical use of this simple measurement. Copyright © 2014 Elsevier Inc. All rights reserved.
Gunderson, Erica P; Hurston, Shanta R; Ning, Xian; Lo, Joan C; Crites, Yvonne; Walton, David; Dewey, Kathryn G; Azevedo, Robert A; Young, Stephen; Fox, Gary; Elmasian, Cathie C; Salvador, Nora; Lum, Michael; Sternfeld, Barbara; Quesenberry, Charles P
2015-12-15
Lactation improves glucose metabolism, but its role in preventing type 2 diabetes mellitus (DM) after gestational diabetes mellitus (GDM) remains uncertain. To evaluate lactation and the 2-year incidence of DM after GDM pregnancy. Prospective, observational cohort of women with recent GDM. (ClinicalTrials.gov: NCT01967030). Integrated health care system. 1035 women diagnosed with GDM who delivered singletons at 35 weeks' gestation or later and enrolled in the Study of Women, Infant Feeding and Type 2 Diabetes After GDM Pregnancy from 2008 to 2011. Three in-person research examinations from 6 to 9 weeks after delivery (baseline) and annual follow-up for 2 years that included 2-hour, 75-g oral glucose tolerance testing; anthropometry; and interviews. Multivariable Weibull regression models evaluated independent associations of lactation measures with incident DM adjusted for potential confounders. Of 1010 women without diabetes at baseline, 959 (95%) were evaluated up to 2 years later; 113 (11.8%) developed incident DM. There were graded inverse associations for lactation intensity at baseline with incident DM and adjusted hazard ratios of 0.64, 0.54, and 0.46 for mostly formula or mixed/inconsistent, mostly lactation, and exclusive lactation versus exclusive formula feeding, respectively (P trend = 0.016). Time-dependent lactation duration showed graded inverse associations with incident DM and adjusted hazard ratios of 0.55, 0.50, and 0.43 for greater than 2 to 5 months, greater than 5 to 10 months, and greater than 10 months, respectively, versus 0 to 2 months (P trend = 0.007). Weight change slightly attenuated hazard ratios. Randomized design is not feasible or desirable for clinical studies of lactation. Higher lactation intensity and longer duration were independently associated with lower 2-year incidences of DM after GDM pregnancy. Lactation may prevent DM after GDM delivery. National Institute of Child Health and Human Development.
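Weibull survival regression of the kind mentioned above can be sketched with lifelines' WeibullAFTFitter, noting that this fitter uses an accelerated-failure-time parameterization rather than reporting hazard ratios directly. The data and column names below are simulated placeholders, not the study's dataset.

```python
# Hedged sketch: multivariable Weibull survival regression for time to
# incident diabetes after a GDM pregnancy. Simulated, illustrative data.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(3)
n = 959
df = pd.DataFrame({
    "years_to_dm_or_censor": rng.weibull(1.2, n) * 2.0 + 0.01,
    "incident_dm": rng.integers(0, 2, n),
    "exclusive_lactation": rng.integers(0, 2, n),
    "mostly_lactation": rng.integers(0, 2, n),
    "age": rng.normal(33, 5, n),
    "prepreg_bmi": rng.normal(30, 6, n),
})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="years_to_dm_or_censor", event_col="incident_dm")
aft.print_summary()   # coefficients are on the log-time (AFT) scale
```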
Swords, Douglas S; Zhang, Chong; Presson, Angela P; Firpo, Matthew A; Mulvihill, Sean J; Scaife, Courtney L
2018-04-01
Time-to-surgery from cancer diagnosis has increased in the United States. We aimed to determine the association between time-to-surgery and oncologic outcomes in patients with resectable pancreatic ductal adenocarcinoma undergoing upfront surgery. The 2004-2012 National Cancer Database was reviewed for patients undergoing curative-intent surgery without neoadjuvant therapy for clinical stage I-II pancreatic ductal adenocarcinoma. A multivariable Cox model with restricted cubic splines was used to define time-to-surgery as short (1-14 days), medium (15-42), and long (43-120). Overall survival was examined using Cox shared frailty models. Secondary outcomes were examined using mixed-effects logistic regression models. Of 16,763 patients, time-to-surgery was short in 34.4%, medium in 51.6%, and long in 14.0%. More short time-to-surgery patients were young, privately insured, healthy, and treated at low-volume hospitals. Adjusted hazards of mortality were lower for medium (hazard ratio 0.94, 95% confidence interval, 0.90, 0.97) and long time-to-surgery (hazard ratio 0.91, 95% confidence interval, 0.86, 0.96) than short. There were no differences in adjusted odds of node positivity, clinical to pathologic upstaging, being unresectable or stage IV at exploration, and positive margins. Medium time-to-surgery patients had higher adjusted odds (odds ratio 1.11, 95% confidence interval, 1.03, 1.20) of receiving an adequate lymphadenectomy than short. Ninety-day mortality was lower in medium (odds ratio 0.75, 95% confidence interval, 0.65, 0.85) and long time-to-surgery (odds ratio 0.72, 95% confidence interval, 0.60, 0.88) than short. In this observational analysis, short time-to-surgery was associated with slightly shorter overall survival and higher perioperative mortality. These results may suggest that delays for medical optimization and referral to high volume surgeons are safe. Published by Elsevier Inc.
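One common way to let the data suggest cutpoints such as short/medium/long is to enter the exposure as a restricted (natural) cubic spline in the Cox model and inspect the fitted log-hazard curve. The sketch below uses a patsy natural-spline basis with lifelines; the data and names are simulated placeholders, not the NCDB analysis.

```python
# Hedged sketch: Cox model with a restricted cubic spline for time-to-surgery.
import numpy as np
import pandas as pd
from patsy import dmatrix
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "tts_days": rng.integers(1, 121, n),   # days from diagnosis to surgery
    "os_months": rng.exponential(20, n),
    "died": rng.integers(0, 2, n),
    "age": rng.normal(67, 10, n),
})

# Natural (restricted) cubic spline basis for time-to-surgery, 4 df
spline = dmatrix("cr(tts_days, df=4) - 1", df, return_type="dataframe")
spline.columns = [f"tts_spline_{i}" for i in range(spline.shape[1])]
model_df = pd.concat([df[["os_months", "died", "age"]], spline], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="os_months", event_col="died")
cph.print_summary()
# Plotting the fitted log partial hazard against tts_days (age held fixed)
# is then one way to motivate cutpoints such as 1-14, 15-42, and 43-120 days.
```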
Terada, Tasuku; Forhan, Mary; Norris, Colleen M; Qiu, Weiyu; Padwal, Raj; Sharma, Arya M; Nagendran, Jayan; Johnson, Jeffrey A
2017-04-14
The association between obesity and mortality risks following coronary revascularization is not clear. We examined the associations of BMI (kg/m²) with short-, intermediate-, and long-term mortality following coronary artery bypass graft surgery (CABG) and percutaneous coronary intervention (PCI) in patients with different coronary anatomy risks and diabetes mellitus status. Data from the Alberta Provincial Project for Outcomes Assessment in Coronary Heart Disease (APPROACH) registry were analyzed. Using normal BMI (18.5-24.9) as a reference, multivariable-adjusted hazard ratios for all-cause mortality within 6 months, 1 year, 5 years, and 10 years were individually calculated for CABG and PCI with 4 prespecified BMI categories: overweight (25.0-29.9), obese class I (30.0-34.9), obese class II (35.0-39.9), and obese class III (≥40.0). The analyses were repeated after stratifying for coronary risks and diabetes mellitus status. The cohorts included 7560 and 30 258 patients for CABG and PCI, respectively. Following PCI, overall mortality was lower in patients with overweight and obese class I compared to those with normal BMI; however, 5- and 10-year mortality rates were significantly higher in patients with obese class III with high-risk coronary anatomy, which was primarily driven by higher mortality rates in patients without diabetes mellitus (5-year adjusted hazard ratio, 1.78 [95% CI, 1.11-2.85] and 10-year adjusted hazard ratio, 1.57 [95% CI, 1.02-2.43]). Following CABG, overweight was associated with lower mortality risks compared with normal BMI. Overweight was associated with lower mortality following CABG and PCI. Greater long-term mortality in patients with obese class III following PCI, especially in those with high-risk coronary anatomy without diabetes mellitus, warrants further investigation. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Cui, Yi; Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo; Song, Jie
Purpose: To identify prognostic biomarkers in pancreatic cancer using high-throughput quantitative image analysis. Methods and Materials: In this institutional review board–approved study, we retrospectively analyzed images and outcomes for 139 locally advanced pancreatic cancer patients treated with stereotactic body radiation therapy (SBRT). The overall population was split into a training cohort (n=90) and a validation cohort (n=49) according to the time of treatment. We extracted quantitative imaging characteristics from pre-SBRT ¹⁸F-fluorodeoxyglucose positron emission tomography, including statistical, morphologic, and texture features. A Cox proportional hazard regression model was built to predict overall survival (OS) in the training cohort using 162 robust image features. To avoid over-fitting, we applied the elastic net to obtain a sparse set of image features, whose linear combination constitutes a prognostic imaging signature. Univariate and multivariate Cox regression analyses were used to evaluate the association with OS, and concordance index (CI) was used to evaluate the survival prediction accuracy. Results: The prognostic imaging signature included 7 features characterizing different tumor phenotypes, including shape, intensity, and texture. On the validation cohort, univariate analysis showed that this prognostic signature was significantly associated with OS (P=.002, hazard ratio 2.74), which improved upon conventional imaging predictors including tumor volume, maximum standardized uptake value, and total lesion glycolysis (P=.018-.028, hazard ratio 1.51-1.57). On multivariate analysis, the proposed signature was the only significant prognostic index (P=.037, hazard ratio 3.72) when adjusted for conventional imaging and clinical factors (P=.123-.870, hazard ratio 0.53-1.30). In terms of CI, the proposed signature scored 0.66 and was significantly better than competing prognostic indices (CI 0.48-0.64, Wilcoxon rank sum test P<1e-6). Conclusion: Quantitative analysis identified novel ¹⁸F-fluorodeoxyglucose positron emission tomography image features that showed improved prognostic value over conventional imaging metrics. If validated in large, prospective cohorts, the new prognostic signature might be used to identify patients for individualized risk-adaptive therapy.
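An elastic-net-penalized Cox fit over a block of candidate image features, followed by a concordance-index check, could look roughly like the sketch below; lifelines supports an elastic-net penalty through its penalizer and l1_ratio arguments. The features are simulated and the penalty settings are illustrative; in practice both would be tuned by cross-validation.

```python
# Hedged sketch: elastic-net-penalized Cox model over simulated image features.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n_train, n_features = 90, 40               # illustrative sizes only
X = pd.DataFrame(rng.normal(size=(n_train, n_features)),
                 columns=[f"img_feat_{i}" for i in range(n_features)])
X["os_months"] = rng.exponential(15, n_train)
X["death"] = rng.integers(0, 2, n_train)

# Elastic-net penalty: `penalizer` sets the overall strength and `l1_ratio`
# the L1/L2 mix (1.0 = lasso-like, 0.0 = ridge).
cph = CoxPHFitter(penalizer=0.5, l1_ratio=0.5)
cph.fit(X, duration_col="os_months", event_col="death")

print("Largest-magnitude coefficients (candidate signature features):")
print(cph.params_.abs().sort_values(ascending=False).head(7))
print("Training concordance index:", cph.concordance_index_)
```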
Preemptive Deceased Donor Kidney Transplantation: Considerations of Equity and Utility
Chen, B. Po-Han; Coresh, Josef; Segev, Dorry L.
2013-01-01
Summary Background and objectives There exists gross disparity in national deceased donor kidney transplant availability and practice: waiting times exceed 6 years in some regions, but some patients receive kidneys before they require dialysis. This study aimed to quantify and characterize preemptive deceased donor kidney transplant recipients and compare their outcomes with patients transplanted shortly after dialysis initiation. Design, setting, participants, & measurements Using the Scientific Registry of Transplant Recipients database, first-time adult deceased donor kidney transplant recipients between 1995 and 2011 were classified as preemptive, early (on dialysis≤1 year), or late recipients. Random effects logistic regression and multivariate Cox proportional hazards regression were used to identify characteristics of preemptive deceased donor kidney transplant and evaluate survival in preemptive and early recipients, respectively. Results Preemptive recipients were 9.0% of the total recipient population. Patients with private insurance (adjusted odds ratio=3.15, 95% confidence interval=3.01–3.29, P<0.001), previous (nonkidney) transplant (adjusted odds ratio=1.94, 95% confidence interval=1.67–2.26, P<0.001), and zero-antigen mismatch (adjusted odds ratio=1.45, 95% confidence interval=1.37–1.54, P<0.001; Caucasians only) were more likely to receive preemptive deceased donor kidney transplant, even after accounting for center-level clustering. African Americans were less likely to receive preemptive deceased donor kidney transplant (adjusted odds ratio=0.44, 95% confidence interval=0.41–0.47, P<0.001). Overall, patients transplanted preemptively had similar survival compared with patients transplanted within 1 year after initiating dialysis (adjusted hazard ratio=1.06, 95% confidence interval=0.99–1.12, P=0.07). Conclusions Preemptive deceased donor kidney transplant occurs most often among Caucasians with private insurance, and survival is fairly similar to survival of recipients on dialysis for <1 year. PMID:23371953
Tait, Christopher A; L'Abbé, Mary R; Smith, Peter M; Rosella, Laura C
2018-01-01
A pervasive and persistent finding is the health disadvantage experienced by those in food insecure households. While clear associations have been identified between food insecurity and diabetes risk factors, less is known about the relationship between food insecurity and incident type 2 diabetes. The objective of this study is to investigate the association between household food insecurity and the future development of type 2 diabetes. We used data from Ontario adult respondents to the 2004 Canadian Community Health Survey, linked to health administrative data (n = 4,739). Food insecurity was assessed with the Household Food Security Survey Module and incident type 2 diabetes cases were identified by the Ontario Diabetes Database. Multivariable adjusted Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) for type 2 diabetes as a function of food insecurity. Canadians in food insecure households had more than 2 times the risk of developing type 2 diabetes compared to those in food secure households [HR = 2.40, 95% CI = 1.17-4.94]. Additional adjustment for BMI attenuated the association between food insecurity and type 2 diabetes [HR = 2.08, 95% CI = 0.99, 4.36]. Our findings indicate that food insecurity is independently associated with increased diabetes risk, even after adjustment for a broad set of measured confounders. Examining diabetes risk from a broader perspective, including a comprehensive understanding of socioeconomic and biological pathways is paramount for informing policies and interventions aimed at mitigating the future burden of type 2 diabetes.
Zhang, Mingfeng; Zhang, Xuehong; Qureshi, Abrar A.; Eliassen, A. Heather; Hankinson, Susan E.; Han, Jiali
2014-01-01
Background Cutaneous nevi are suggested to be hormone-related. We hypothesized that the number of cutaneous nevi might be a phenotypic marker of plasma hormone levels and predict subsequent breast cancer risk. Methods and Findings We followed 74,523 female nurses for 24 y (1986–2010) in the Nurses' Health Study and estimated the relative risk of breast cancer according to the number of cutaneous nevi. We adjusted for the known breast cancer risk factors in the models. During follow-up, a total of 5,483 invasive breast cancer cases were diagnosed. Compared to women with no nevi, women with more cutaneous nevi had higher risks of breast cancer (multivariable-adjusted hazard ratio, 1.04, 95% confidence interval [CI], 0.98–1.10 for 1–5 nevi; 1.15, 95% CI, 1.00–1.31 for 6–14 nevi, and 1.35, 95% CI, 1.04–1.74 for 15 or more nevi; p for continuous trend = 0.003). Over 24 y of follow-up, the absolute risk of developing breast cancer increased from 8.48% for women without cutaneous nevi to 8.82% (95% CI, 8.31%–9.33%) for women with 1–5 nevi, 9.75% (95% CI, 8.48%–11.11%) for women with 6–14 nevi, and 11.4% (95% CI, 8.82%–14.76%) for women with 15 or more nevi. The number of cutaneous nevi was associated with increased risk of breast cancer only among estrogen receptor (ER)–positive tumors (multivariable-adjusted hazard ratio per five nevi, 1.09, 95% CI, 1.02–1.16 for ER+/progesterone receptor [PR]–positive tumors; 1.08, 95% CI, 0.94–1.24 for ER+/PR− tumors; and 0.99, 95% CI, 0.86–1.15 for ER−/PR− tumors). Additionally, we tested plasma hormone levels according to the number of cutaneous nevi among a subgroup of postmenopausal women without postmenopausal hormone use (n = 611). Postmenopausal women with six or more nevi had a 45.5% higher level of free estradiol and a 47.4% higher level of free testosterone compared to those with no nevi (p for trend = 0.001 for both). Among a subgroup of 362 breast cancer cases and 611 matched controls with plasma hormone measurements, the multivariable-adjusted odds ratio for every five nevi attenuated from 1.25 (95% CI, 0.89–1.74) to 1.16 (95% CI, 0.83–1.64) after adjusting for plasma hormone levels. Key limitations in this study are that cutaneous nevi were self-counted in our cohort and that the study was conducted in white individuals, and thus the findings do not necessarily apply to other populations. Conclusions Our results suggest that the number of cutaneous nevi may reflect plasma hormone levels and predict breast cancer risk independently of previously known factors. PMID:24915186
Lane, Whitney O; Nussbaum, Daniel P; Sun, Zhifei; Blazer, Dan G
2018-05-25
Although surgery remains the cornerstone of gastric cancer therapy, radiation therapy (RT) is increasingly being employed to optimize outcomes. We sought to assess outcomes following the use of RT for the treatment of gastric adenocarcinoma. Using the National Cancer Data Base (NCDB) from 1998 to 2012, all patients with resected gastric adenocarcinoma were identified. Patients were stratified into four groups based on preoperative therapy: RT alone, chemotherapy only, chemoradiotherapy (CRT), and no preoperative therapy. Overall survival was estimated using a multivariate Cox proportional hazards model. Adjusted secondary outcomes included margin positivity, lymph node (LN) harvest, length of stay (LOS), 30-day readmission, and mortality. A total of 10,019 patients met study criteria. In the unadjusted analysis, patients undergoing CRT compared to chemotherapy alone had fewer positive margins (7.9% vs 15.9%; P < 0.001) and a higher rate of negative LNs (54.6% vs 37.7%; P < 0.001), with reduced LN retrieval (mean: 13.5 vs 19.6; P < 0.01). After multivariate adjustment, there was no survival benefit to any preoperative therapy; however, preoperative RT/CRT remained associated with decreased LN retrieval. The results support previous reports on preoperative RT resulting in decreased margin positivity. This study highlights the need to reconsider practice guidelines regarding appropriate lymphadenectomy in the setting of preoperative RT, given reduced LN retrieval. © 2018 Wiley Periodicals, Inc.
Cole, Stephen R.; Hudgens, Michael G.; Tien, Phyllis C.; Anastos, Kathryn; Kingsley, Lawrence; Chmiel, Joan S.; Jacobson, Lisa P.
2012-01-01
To estimate the association of antiretroviral therapy initiation with incident acquired immunodeficiency syndrome (AIDS) or death while accounting for time-varying confounding in a cost-efficient manner, the authors combined a case-cohort study design with inverse probability-weighted estimation of a marginal structural Cox proportional hazards model. A total of 950 adults who were positive for human immunodeficiency virus type 1 were followed in 2 US cohort studies between 1995 and 2007. In the full cohort, 211 AIDS cases or deaths occurred during 4,456 person-years. In an illustrative 20% random subcohort of 190 participants, 41 AIDS cases or deaths occurred during 861 person-years. Accounting for measured confounders and determinants of dropout by inverse probability weighting, the full cohort hazard ratio was 0.41 (95% confidence interval: 0.26, 0.65) and the case-cohort hazard ratio was 0.47 (95% confidence interval: 0.26, 0.83). Standard multivariable-adjusted hazard ratios were closer to the null, regardless of study design. The precision lost with the case-cohort design was modest given the cost savings. Results from Monte Carlo simulations demonstrated that the proposed approach yields approximately unbiased estimates of the hazard ratio with appropriate confidence interval coverage. Marginal structural model analysis of case-cohort study designs provides a cost-efficient design coupled with an accurate analytic method for research settings in which there is time-varying confounding. PMID:22302074
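The core of a marginal structural Cox analysis — inverse-probability-of-treatment weights from a propensity model, followed by a weighted Cox fit for the treatment effect — can be sketched as below. This omits the study's case-cohort sampling weights and time-updated weighting; all data and variable names are simulated placeholders.

```python
# Hedged sketch: stabilized IPT weights plus a weighted Cox model.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 950
cd4 = rng.normal(350, 150, n)                       # measured confounder
p_treat = 1.0 / (1.0 + np.exp(-(-1.0 + 0.004 * (300.0 - cd4))))
art = (rng.random(n) < p_treat).astype(int)         # treatment indicator
df = pd.DataFrame({
    "art": art,
    "cd4": cd4,
    "time": rng.exponential(5.0, n),
    "aids_or_death": rng.integers(0, 2, n),
})

# Propensity of treatment given the measured confounder(s)
ps_model = LogisticRegression(max_iter=1000).fit(df[["cd4"]], df["art"])
ps = ps_model.predict_proba(df[["cd4"]])[:, 1]

# Stabilized inverse-probability-of-treatment weights
p_art = df["art"].mean()
df["ipw"] = np.where(df["art"] == 1, p_art / ps, (1 - p_art) / (1 - ps))

# Weighted Cox model with treatment as the only covariate (marginal effect)
msm = CoxPHFitter()
msm.fit(df[["art", "time", "aids_or_death", "ipw"]],
        duration_col="time", event_col="aids_or_death",
        weights_col="ipw", robust=True)
print("Marginal HR for treatment initiation:", msm.hazard_ratios_["art"])
```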
Fujihara, Kazuya; Sugawara, Ayumi; Heianza, Yoriko; Sairenchi, Toshimi; Irie, Fujiko; Iso, Hiroyasu; Doi, Mikio; Shimano, Hitoshi; Watanabe, Hiroshi; Sone, Hirohito; Ota, Hitoshi
2014-01-01
The levels of lipids, especially triglycerides (TG), and obesity are associated with diabetes mellitus (DM). Although typically measured in fasting individuals, non-fasting lipid measurements play an important role in predicting future DM. This study compared the predictive efficacy of lipid variables according to the fasting status and body mass index (BMI) category. Data were collected for 39,196 nondiabetic men and 87,980 nondiabetic women 40-79 years of age who underwent health checkups in Ibaraki Prefecture, Japan, in 1993 and were followed through 2007. The hazard ratios (HRs) for DM in relation to sex, the fasting status and BMI were estimated using a Cox proportional hazards model. A total of 8,867 participants, 4,012 men and 4,855 women, developed DM during a mean follow-up of 5.5 years. TG was found to be an independent predictor of incident DM in both fasting and non-fasting men and non-fasting women. The multivariable-adjusted HR for DM according to the TG quartile (Q) 4 vs. Q1 was 1.18 (95% confidence interval (CI): 1.05, 1.34) in the non-fasting men with a normal BMI (18.5-24.9). This trend was also observed in the non-fasting women with a normal BMI. That is, the multivariable-adjusted HRs for DM for TG Q2, Q3 and Q4 compared with Q1 were 1.07 (95% CI: 0.94, 1.23), 1.17 (95% CI: 1.03, 1.34) and 1.48 (95% CI: 1.30, 1.69), respectively. The fasting and non-fasting TG levels in men and non-fasting TG levels in women are predictive of future DM among those with a normal BMI. Clinicians must pay attention to those individuals at high risk for DM.
Farvid, Maryam S; Malekshah, Akbar F; Pourshams, Akram; Poustchi, Hossein; Sepanlou, Sadaf G; Sharafkhah, Maryam; Khoshnia, Masoud; Farvid, Mojtaba; Abnet, Christian C; Kamangar, Farin; Dawsey, Sanford M; Brennan, Paul; Pharoah, Paul D; Boffetta, Paolo; Willett, Walter C; Malekzadeh, Reza
2017-02-01
Dietary protein comes from foods with greatly different compositions that may not relate equally with mortality risk. Few cohort studies from non-Western countries have examined the association between various dietary protein sources and cause-specific mortality. Therefore, the associations between dietary protein sources and all-cause, cardiovascular disease, and cancer mortality were evaluated in the Golestan Cohort Study in Iran. Among 42,403 men and women who completed a dietary questionnaire at baseline, 3,291 deaths were documented during 11 years of follow-up (2004-2015). Cox proportional hazards models estimated age-adjusted and multivariate-adjusted hazard ratios (HRs) and 95% CIs for all-cause and disease-specific mortality in relation to dietary protein sources. Data were analyzed from 2015 to 2016. Comparing the highest versus the lowest quartile, egg consumption was associated with lower all-cause mortality risk (HR=0.88, 95% CI=0.79, 0.97, p-trend=0.03). In multivariate analysis, the highest versus the lowest quartile of fish consumption was associated with reduced risk of total cancer (HR=0.79, 95% CI=0.64, 0.98, p-trend=0.03) and gastrointestinal cancer (HR=0.75, 95% CI=0.56, 1.00, p-trend=0.02) mortality. The highest versus the lowest quintile of legume consumption was associated with reduced total cancer (HR=0.72, 95% CI=0.58, 0.89, p-trend=0.004), gastrointestinal cancer (HR=0.76, 95% CI=0.58, 1.01, p-trend=0.05), and other cancer (HR=0.66, 95% CI=0.47, 0.93, p-trend=0.04) mortality. Significant associations between total red meat and poultry intake and all-cause, cardiovascular disease, or cancer mortality rate were not observed among all participants. These findings support an association of higher fish and legume consumption with lower cancer mortality, and higher egg consumption with lower all-cause mortality. Copyright © 2016 American Journal of Preventive Medicine. All rights reserved.
Ibrahim, Fowzia; Lorente-Cánovas, Beatriz; Doré, Caroline J; Bosworth, Ailsa; Ma, Margaret H; Galloway, James B; Cope, Andrew P; Pande, Ira; Walker, David; Scott, David L
2017-11-01
RA patients receiving TNF inhibitors (TNFi) usually maintain their initial doses. The aim of the Optimizing Treatment with Tumour Necrosis Factor Inhibitors in Rheumatoid Arthritis trial was to evaluate whether tapering TNFi doses causes loss of clinical response. We enrolled RA patients receiving etanercept or adalimumab and a DMARD with DAS28 under 3.2 for over 3 months. Initially (months 0-6) patients were randomized to control (constant TNFi) or two experimental groups (tapering TNFi by 33 or 66%). Subsequently (months 6-12) control subjects were randomized to taper TNFi by 33 or 66%. Disease flares (DAS28 increasing ⩾0.6 with at least one additional swollen joint) were the primary outcome. Two hundred and forty-four patients were screened, 103 randomized and 97 treated. In months 0-6 there were 8/50 (16%) flares in controls, 3/26 (12%) with 33% tapering and 6/21 (29%) with 66% tapering. Multivariate Cox analysis showed time to flare was unchanged with 33% tapering but was reduced with 66% tapering compared with controls (adjusted hazard ratio 2.81, 95% CI: 0.99, 7.94; P = 0.051). Analysing all tapered patients after controls were re-randomized (months 6-12) showed differences between groups: there were 6/48 (13%) flares with 33% tapering and 14/39 (36%) with 66% tapering. Multivariate Cox analysis showed 66% tapering reduced time to flare (adjusted hazard ratio 3.47, 95% CI: 1.26, 9.58; P = 0.016). Tapering TNFi by 33% has no impact on disease flares and appears practical in patients in sustained remission and low disease activity states. EudraCT, https://www.clinicaltrialsregister.eu, 2010-020738-24; ISRCTN registry, https://www.isrctn.com, 28955701. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Rheumatology.
Bongers, Mathilda L.; Hogervorst, Janneke G. F.; Schouten, Leo J.; Goldbohm, R. Alexandra; Schouten, Harry C.; van den Brandt, Piet A.
2012-01-01
Background Acrylamide, a probable human carcinogen, is present in many everyday foods. Since the finding of its presence in foods in 2002, epidemiological studies have found some suggestive associations between dietary acrylamide exposure and the risk of various cancers. The aim of this prospective study is to investigate for the first time the association between dietary acrylamide intake and the risk of several histological subtypes of lymphatic malignancies. Methods The Netherlands Cohort Study on diet and cancer includes 120,852 men and women followed-up since September 1986. The number of person years at risk was estimated by using a random sample of participants from the total cohort that was chosen at baseline (n = 5,000). Acrylamide intake was estimated from a food frequency questionnaire combined with acrylamide data for Dutch foods. Hazard ratios (HRs) were calculated for acrylamide intake as a continuous variable as well as in categories (quintiles and tertiles), for men and women separately and for never-smokers, using multivariable-adjusted Cox proportional hazards models. Results After 16.3 years of follow-up, 1,233 microscopically confirmed cases of lymphatic malignancies were available for multivariable-adjusted analysis. For multiple myeloma and follicular lymphoma, HRs for men were 1.14 (95% CI: 1.01, 1.27) and 1.28 (95% CI: 1.03, 1.61) per 10 µg acrylamide/day increment, respectively. For never-smoking men, the HR for multiple myeloma was 1.98 (95% CI: 1.38, 2.85). No associations were observed for women. Conclusion We found indications that acrylamide may increase the risk of multiple myeloma and follicular lymphoma in men. This is the first epidemiological study to investigate the association between dietary acrylamide intake and the risk of lymphatic malignancies, and more research into these observed associations is warranted. PMID:22723843
Pouessel, Damien; Bastuji-Garin, Sylvie; Houédé, Nadine; Vordos, Dimitri; Loriot, Yohann; Chevreau, Christine; Sevin, Emmanuel; Beuzeboc, Philippe; Taille, Alexandre de la; Le Thuaut, Aurélie; Allory, Yves; Culine, Stéphane
2017-02-01
In the past decade, adjuvant chemotherapy (AC) after radical cystectomy (RC) was preferred worldwide for patients with muscle-invasive urothelial bladder cancer. In this study we aimed to determine the outcome of patients who received AC and evaluated prognostic factors associated with survival. We retrospectively analyzed 226 consecutive patients treated in 6 academic hospitals between 2000 and 2009. Multivariate Cox proportional hazards regression models, adjusted for center, were used to estimate adjusted hazard ratios (HRs) with 95% confidence intervals. The median age was 62.4 (range, 35-82) years. Patients had pT3/pT4 and/or pN+ disease in 180 (79.6%) and 168 patients (74.3%), respectively. Median lymph node (LN) density was 25% (range, 3.1-100). Median time between RC and AC was 61.5 (range, 18-162) days. Gemcitabine with cisplatin, gemcitabine with carboplatin, and MVAC (methotrexate, vinblastine, doxorubicin, and cisplatin) regimens were delivered in 161 (71.2%), 49 (21.7%), and 12 (5.3%) patients, respectively. The median number of cycles was 4 (range, 1-6). Thirteen patients (5.7%) with LN metastases also received adjuvant pelvic radiotherapy (ART). After a median follow-up of 4.2 years, 5-year overall survival (OS) was 40.7%. In multivariate analysis, pT ≥3 stage (HR, 1.73; P = .05), LN density >50% (HR, 1.94; P = .03), and number of AC cycles <4 (HR, 4.26; P = .001) were adverse prognostic factors for OS. ART (HR, 0.30; P = .05) tended to provide a survival benefit. Classical prognostic features associated with survival are not modified by the use of AC. Patients who derived benefit from AC had a low LN density and received at least 4 cycles of treatment. Copyright © 2016 Elsevier Inc. All rights reserved.
Croome, Kristopher P; Lee, David D; Keaveny, Andrew P; Taner, C Burcin
2016-12-01
Published reports describing the national experience with liver grafts from donation after cardiac death (DCD) donors have resulted in reservations with their widespread utilization. The present study aimed to investigate if temporal improvements in outcomes have been observed on a national level and to determine if donor and recipient selection have been modified in a fashion consistent with published data on DCD use in liver transplantation (LT). Patients undergoing DCD LT between 2003 and 2014 were obtained from the United Network of Organ Sharing Standard Transplant Analysis and Research file and divided into 3 equal eras based on the date of DCD LT: era 1 (2003-2006), era 2 (2007-2010), and era 3 (2011-2014). Improvement in graft survival was seen between era 1 and era 2 (P = 0.001) and between era 2 and era 3 (P < 0.001). Concurrently, an increase in the proportion of patients with hepatocellular carcinoma and a decrease in critically ill patients, retransplant recipients, donor age, warm ischemia time greater than 30 minutes and cold ischemic time also occurred over the same period. On multivariate analysis, significant predictors of graft survival included: recipient age, biologic MELD score, recipient on ventilator, recipient hepatitis C virus + serology, donor age and cold ischemic time. In addition, even after adjustment for all of the aforementioned variables, both era 2 (hazard ratio, 0.81; confidence interval, 0.69-0.94; P = 0.007), and era 3 (hazard ratio, 0.61; confidence interval, 0.5-0.73; P < 0.001) had a protective effect compared to era 1. The national outcomes for DCD LT have improved over the last 12 years. This change was associated with modifications in both recipient and donor selection. Furthermore, an era effect was observed, even after adjustment for all recipient and donor variables on multivariate analysis.
Yadav, Siddhartha; Yadav, Dhiraj; Zakalik, Dana
2017-07-01
Squamous cell carcinoma of the breast accounts for less than 0.1% of all breast cancers. The purpose of this study is to describe the epidemiology and survival of this rare malignancy. Data were extracted from the National Cancer Institute's Surveillance, Epidemiology and End Results Registry to identify women diagnosed with squamous cell carcinoma of the breast between 1998 and 2013. SEER*Stat 8.3.1 was used to calculate age-adjusted incidence, age-wise distribution, and annual percentage change in incidence. Kaplan-Meier curves were plotted for survival analysis. Univariate and multivariate Cox proportional hazards regression models were used to determine predictors of survival. A total of 445 cases of squamous cell carcinoma of the breast were diagnosed during the study period. The median age at diagnosis was 67 years. The overall age-adjusted incidence between 1998 and 2013 was 0.62 per 1,000,000 per year, and the incidence has been on a decline. Approximately half of the tumors were poorly differentiated. Stage II was the most common stage at presentation. The majority of cases were negative for expression of estrogen and progesterone receptors. One-third of the cases underwent breast conservation surgery, while more than half underwent mastectomy (unilateral or bilateral). Approximately one-third of cases received radiation treatment. The 1-year and 5-year cause-specific survival was 81.6% and 63.5%, respectively. Excluding patients with metastasis or unknown stage at presentation, in the multivariate Cox proportional hazards model, older age at diagnosis and higher tumor stage (T3 or T4) or nodal stage at presentation were significant predictors of poor survival. Our study describes the unique characteristics of squamous cell carcinoma of the breast and demonstrates that it is an aggressive tumor with poor survival. Older age and higher tumor or nodal stages at presentation were independent predictors of poor survival for loco-regional stages.
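Kaplan-Meier curves by subgroup together with a multivariable Cox model, as described above, can be produced along the following lines; the SEER-like columns and data are simulated placeholders, not the registry extract.

```python
# Hedged sketch: Kaplan-Meier curves by stage group plus a Cox model.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(7)
n = 445
df = pd.DataFrame({
    "months": rng.exponential(40, n),
    "death_from_cancer": rng.integers(0, 2, n),
    "age_dx": rng.normal(67, 12, n),
    "t3_or_t4": rng.integers(0, 2, n),
    "node_positive": rng.integers(0, 2, n),
})

# Kaplan-Meier curves by tumor stage group
ax = plt.subplot(111)
kmf = KaplanMeierFitter()
for label, grp in df.groupby("t3_or_t4"):
    kmf.fit(grp["months"], event_observed=grp["death_from_cancer"],
            label=f"T3/T4 = {label}")
    kmf.plot_survival_function(ax=ax)
ax.set_xlabel("Months since diagnosis")

# Multivariable Cox model for cause-specific survival
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death_from_cancer")
cph.print_summary()
```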
Ananthakrishnan, Ashwin N; Khalili, Hamed; Konijeti, Gauree G; Higuchi, Leslie M; de Silva, Punyanganie; Korzenik, Joshua R; Fuchs, Charles S; Willett, Walter C; Richter, James M; Chan, Andrew T
2013-11-01
Increased intake of dietary fiber has been proposed to reduce the risk of inflammatory bowel disease (Crohn's disease [CD] and ulcerative colitis [UC]). However, few prospective studies have examined associations between long-term intake of dietary fiber and risk of incident CD or UC. We collected and analyzed data from 170,776 women who participated in the Nurses' Health Study and were followed up over 26 years, contributing 3,317,425 person-years. Dietary information was prospectively ascertained via administration of a validated semiquantitative food frequency questionnaire every 4 years. Self-reported CD and UC were confirmed through review of medical records. Cox proportional hazards models, adjusting for potential confounders, were used to calculate hazard ratios (HRs). We confirmed 269 incident cases of CD (incidence, 8/100,000 person-years) and 338 cases of UC (incidence, 10/100,000 person-years). Compared with the lowest quintile of energy-adjusted cumulative average intake of dietary fiber, intake of the highest quintile (median of 24.3 g/day) was associated with a 40% reduction in risk of CD (multivariate HR for CD, 0.59; 95% confidence interval, 0.39-0.90). This apparent reduction appeared to be greatest for fiber derived from fruits; fiber from cereals, whole grains, or legumes did not modify risk. In contrast, neither total intake of dietary fiber (multivariate HR, 0.82; 95% confidence interval, 0.58-1.17) nor intake of fiber from specific sources appeared to be significantly associated with risk of UC. Based on data from the Nurses' Health Study, long-term intake of dietary fiber, particularly from fruit, is associated with lower risk of CD but not UC. Further studies are needed to determine the mechanisms that mediate this association. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
Ban, J; Takao, Y; Okuno, Y; Mori, Y; Asada, H; Yamanishi, K; Iso, H
2017-04-01
Few studies have examined the impact of cigarette smoking on the risk for herpes zoster. The Shozu Herpes Zoster (SHEZ) Study is a community-based prospective cohort study over 3 years in Japan aiming to clarify the incidence and the predictive and immunological factors for herpes zoster. We investigated the associations of smoking status with past history and incidence of herpes zoster. A total of 12 351 participants provided valid information on smoking status and past history of herpes zoster at the baseline survey. Smoking status was classified into three categories (current, former, never smoker), and if currently smoking, the number of cigarettes consumed per day was recorded. The participants were under active surveillance for first-ever incident herpes zoster for 3 years. We used a logistic regression model for the cross-sectional study on the association between smoking status and past history of herpes zoster, and a Cox proportional hazards regression model for the cohort study on the association with risk of incidence. The multivariable adjusted odds ratios (95% CI) of past history of herpes zoster for current vs. never smokers were 0·67 (0·54-0·80) for total subjects, 0·72 (0·56-0·93) for men and 0·65 (0·44-0·96) for women. The multivariable adjusted hazard ratios (95% CI) of incident herpes zoster for current vs. never smokers were 0·52 (0·33-0·81) for total subjects, 0·49 (0·29-0·83) for men and 0·52 (0·19-1·39) for women. Smoking status was inversely associated with the prevalence and incidence of herpes zoster in the general population of men and women aged ⩾50 years.
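As an illustration of how multivariable-adjusted odds ratios for the cross-sectional part of such an analysis are typically obtained, the sketch below fits a logistic regression with statsmodels and exponentiates the coefficients; the data and column names are simulated placeholders, not the SHEZ dataset.

```python
# Hedged sketch: logistic regression giving adjusted odds ratios with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 12351
smoke = rng.choice(["never", "former", "current"], size=n, p=[0.5, 0.3, 0.2])
df = pd.DataFrame({
    "smoke": smoke,
    "age": rng.normal(65, 9, n),
    "male": rng.integers(0, 2, n),
    "past_hz": rng.integers(0, 2, n),
})

# Dummy-code smoking with "never" as the reference category
dummies = pd.get_dummies(df["smoke"])[["former", "current"]].astype(float)
X = sm.add_constant(pd.concat([dummies, df[["age", "male"]]], axis=1))

fit = sm.Logit(df["past_hz"], X).fit(disp=0)
odds_ratios = np.exp(fit.params)           # multivariable-adjusted odds ratios
ci = np.exp(fit.conf_int())                # 95% confidence intervals
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```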
Combined effect of alcohol consumption and lifestyle behaviors on risk of type 2 diabetes.
Joosten, Michel M; Grobbee, Diederick E; van der A, Daphne L; Verschuren, W M Monique; Hendriks, Henk F J; Beulens, Joline W J
2010-06-01
It has been suggested that the inverse association between alcohol and type 2 diabetes could be explained by moderate drinkers' healthier lifestyles. We studied whether moderate alcohol consumption is associated with a lower risk of type 2 diabetes in adults with combined low-risk lifestyle behaviors. We prospectively examined 35,625 adults of the Dutch European Prospective Investigation into Cancer and Nutrition (EPIC-NL) cohort aged 20-70 y, who were free of diabetes, cardiovascular disease, and cancer at baseline (1993-1997). In addition to moderate alcohol consumption (women: 5.0-14.9 g/d; men: 5.0-29.9 g/d), we defined low-risk categories of 4 lifestyle behaviors: optimal weight [body mass index (in kg/m²) <25], physically active (≥30 min of physical activity/d), current nonsmoker, and a healthy diet [upper 2 quintiles of the Dietary Approaches to Stop Hypertension (DASH) diet]. During a median of 10.3 y, we identified 796 incident cases of type 2 diabetes. Compared with teetotalers, hazard ratios of moderate alcohol consumers for risk of type 2 diabetes in low-risk lifestyle strata after multivariable adjustments were 0.35 (95% CI: 0.17, 0.72) when of a normal weight, 0.65 (95% CI: 0.46, 0.91) when physically active, 0.54 (95% CI: 0.41, 0.71) when nonsmoking, and 0.57 (95% CI: 0.39, 0.84) when consuming a healthy diet. When ≥3 low-risk lifestyle behaviors were combined, the hazard ratio for incidence of type 2 diabetes in moderate alcohol consumers after multivariable adjustments was 0.56 (95% CI: 0.32, 1.00). In subjects already at lower risk of type 2 diabetes on the basis of multiple low-risk lifestyle behaviors, moderate alcohol consumption was associated with an approximately 40% lower risk compared with abstention.
Formica, Richard N; Barrantes, Fidel; Asch, William S; Bia, Margaret J; Coca, Steven; Kalyesubula, Robert; McCloskey, Barbara; Leary, Tucker; Arvelakis, Antonios; Kulkarni, Sanjay
2012-08-01
Waiting time for a kidney transplant is calculated from the date the patient is placed on the UNOS (United Network for Organ Sharing) waitlist to the date the patient undergoes transplant. Time from transplant evaluation to listing represents unaccounted waiting time, potentially resulting in longer dialysis exposure for some patients with prolonged evaluation times. There are established disparities demonstrating that groups of patients take longer to be placed on the waitlist and thus have less access to kidney transplant. Quality improvement report. 905 patients from a university-based hospital were evaluated for kidney transplant candidacy, and analysis was performed from July 1, 2004, to January 31, 2010. A 1-day centralized work-up was implemented on July 1, 2007, whereby the transplant center coordinated the necessary tests needed to fulfill minimal listing criteria. Time from evaluation to UNOS listing was compared between the 2 cohorts. Multivariable Cox proportional hazards models were created to assess the relative hazards of waitlist placement comparing 1-day versus conventional work-up and were adjusted for age, sex, race, and education. Of 905 patients analyzed, 378 underwent conventional evaluation and 527 underwent a 1-day center-coordinated evaluation. Median time to listing in the 1-day center-coordinated evaluation compared with conventional was significantly less (46 vs 226 days, P < 0.001). On multivariable analysis controlling for age, sex, and education level, the 1-day in-center group was 3 times more likely to place patients on the wait list (adjusted HR, 3.08; 95% CI, 2.64-3.59). Listing time was significantly decreased across race, sex, education, and ethnicity. Single center, retrospective. Variables that may influence transplant practitioners, such as comorbid conditions or functional status, were not assessed. A 1-day center-coordinated pretransplant work-up model significantly decreased time to listing for kidney transplant. Copyright © 2012 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
All-cause mortality in asymptomatic persons with extensive Agatston scores above 1000.
Patel, Jaideep; Blaha, Michael J; McEvoy, John W; Qadir, Sadia; Tota-Maharaj, Rajesh; Shaw, Leslee J; Rumberger, John A; Callister, Tracy Q; Berman, Daniel S; Min, James K; Raggi, Paolo; Agatston, Arthur A; Blumenthal, Roger S; Budoff, Matthew J; Nasir, Khurram
2014-01-01
Risk assessment in the extensive calcified plaque phenotype has been limited by small sample size. We studied all-cause mortality rates among asymptomatic patients with markedly elevated Agatston scores > 1000. We studied a clinical cohort of 44,052 asymptomatic patients referred for coronary calcium scans. Mean follow-up was 5.6 years (range, 1-13 years). All-cause mortality rates were calculated after stratifying by Agatston score (0, 1-1000, 1001-1500, 1501-2000, and >2000). A multivariable Cox regression model adjusting for self-reported traditional risk factors was created to assess the relative mortality hazard of Agatston scores 1001 to 1500, 1501 to 2000, and >2000. With the use of post-estimation modeling, we assessed for the presence of an upper threshold of risk with high Agatston scores. A total of 1593 patients (4% of total population) had Agatston score > 1000. There was a continuous graded decrease in estimated 10-year survival across increasing Agatston score, continuing when Agatston score > 1000 (Agatston score 1001-1500, 78%; Agatston score 1501-2000, 74%; Agatston score > 2000, 51%). After multivariable adjustment, Agatston scores 1001 to 1500, 1501 to 2000, and >2000 were associated with an 8.05-, 7.45-, and 13.26-fold greater mortality risk, respectively, than for Agatston score of 0. Compared with Agatston score 1001 to 1500, Agatston score 1501 to 2000 had a similar all-cause mortality risk, whereas Agatston score > 2000 had an increased relative risk (Agatston score 1501-2000: hazard ratio [HR], 1.01 [95% CI, 0.67-1.51]; Agatston score > 2000: HR, 1.79 [95% CI, 1.30-2.46]). Graphical assessment of the predicted survival model suggests no upper threshold for risk associated with calcified plaque in coronary arteries. Increasing calcified plaque in coronary arteries continues to predict a graded decrease in survival among patients with extensive Agatston score > 1000 with no apparent upper threshold. Published by Elsevier Inc.
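A hedged sketch of how a Cox model with categorical Agatston strata and a reference group of 0 can be set up with the lifelines package; the data frame, column names, and covariates below are invented and do not reproduce the study's analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# hypothetical frame: follow-up in years, death indicator, Agatston stratum,
# and a couple of self-reported risk factors used for adjustment
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "years": rng.exponential(6, n),
    "died": rng.binomial(1, 0.15, n),
    "agatston_cat": rng.choice(["0", "1-1000", "1001-1500", "1501-2000", ">2000"], n),
    "age": rng.uniform(40, 80, n),
    "smoker": rng.binomial(1, 0.2, n),
})

# dummy-code the score strata; the alphabetically first category ("0") is
# dropped and therefore serves as the reference group
design = pd.get_dummies(df, columns=["agatston_cat"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(design, duration_col="years", event_col="died")
print(np.exp(cph.params_))   # adjusted hazard ratios vs. the reference stratum
```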
Exercise Decreases and Smoking Increases Bladder Cancer Mortality
Liss, Michael A.; White, Martha; Natarajan, Loki; Parsons, J. Kellogg
2018-01-01
Modifiable lifestyle factors play an important role in the development and outcomes of solid tumors. Whereas smoking has been attributed to bladder cancer and cessation leads to better outcomes, we show that exercise may provide similar benefits regarding bladder cancer mortality. Background The aim of this study was to investigate the associations of the modifiable lifestyle factors smoking, exercise, and obesity with bladder cancer mortality. Patients and Methods We used mortality-linked data from the National Health Information Survey from 1998 through 2006. The primary outcome was bladder cancer-specific mortality. The primary exposures were self-reported smoking status (never vs. former vs. current smoker), self-reported exercise (dichotomized as “did no exercise” vs. “light, moderate, or vigorous exercise in ≥ 10-minute bouts”), and body mass index. We utilized multivariable adjusted Cox proportional hazards regression models, with delayed entry to account for age at survey interview. Results Complete data were available on 222,163 participants, of whom 96,715 (44%) were men and 146,014 (66%) were non-Hispanic whites, and among whom we identified 83 bladder cancer-specific deaths. In multivariate analyses, individuals who reported any exercise were 47% less likely (adjusted hazard ratio [HRadj], 0.53; 95% confidence interval [CI], 0.29–0.96; P = .038) to die of bladder cancer than those who reported no exercise. Compared with never-smokers, current (HRadj, 4.24; 95% CI, 1.89–9.65; P = .001) and former (HRadj, 2.95; 95% CI, 1.50–5.79; P = .002) smokers were 4 and 3 times more likely, respectively, to die of bladder cancer. There were no significant associations of body mass index with bladder cancer mortality. Conclusion Exercise decreases and current smoking increases the risk of bladder cancer-specific mortality. These data suggest that exercise and smoking cessation interventions may reduce bladder cancer death. PMID:28007367
Planer, David; Mehran, Roxana; Witzenbichler, Bernhard; Guagliumi, Giulio; Peruga, Jan Z; Brodie, Bruce R; Dudek, Dariusz; Möckel, Martin; Reyes, Selene Leon; Stone, Gregg W
2011-10-15
Measurement of left ventricular end-diastolic pressure (LVEDP) is readily obtainable in patients with ST-segment elevation myocardial infarction (STEMI) undergoing primary percutaneous coronary intervention (PCI). However, the prognostic utility of LVEDP during primary PCI has never been studied. LVEDP was measured in 2,797 patients during primary PCI in the Harmonizing Outcomes with RevascularIZatiON and Stents in Acute Myocardial Infarction (HORIZONS-AMI) trial. Outcomes were assessed at 30 days and 2 years stratified by medians of LVEDP. Multivariable analysis was performed to determine whether LVEDP was an independent determinant of adverse outcomes. The median (interquartile range) for LVEDP was 18 mm Hg (12 to 24). For patients with LVEDP >18 mm Hg versus those with ≤18 mm Hg, hazard ratios (95% confidence intervals) for death and death or reinfarction at 30 days were 2.00 (1.20 to 3.33, p = 0.007) and 1.84 (1.24 to 2.73, p = 0.002), respectively, and at 2 years were 1.57 (1.12 to 2.21, p = 0.009) and 1.45 (1.14 to 1.85, p = 0.002), respectively. Patients in the highest quartile of LVEDP (≥24 mm Hg) were at the greatest risk of mortality. Only a weak correlation was present between LVEDP and left ventricular ejection fraction (LVEF; R² = 0.03, p <0.01). By multivariable analysis, increased LVEDP was an independent predictor of death or reinfarction at 2 years (hazard ratio 1.20, 95% confidence interval 1.02 to 1.42, p = 0.03) even after adjustment for baseline LVEF. In conclusion, baseline increased LVEDP is an independent predictor of adverse outcomes in patients with STEMI undergoing primary PCI even after adjustment for baseline LVEF. Patients with LVEDP ≥24 mm Hg are at the greatest risk for early and late mortality. Copyright © 2011 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garsa, Adam A.; Badiyan, Shahed N.; DeWees, Todd
2014-10-01
Purpose: To evaluate local control rates and predictors of individual tumor local control for brain metastases from non-small cell lung cancer (NSCLC) treated with stereotactic radiosurgery (SRS). Methods and Materials: Between June 1998 and May 2011, 401 brain metastases in 228 patients were treated with Gamma Knife single-fraction SRS. Local failure was defined as an increase in lesion size after SRS. Local control was estimated using the Kaplan-Meier method. The Cox proportional hazards model was used for univariate and multivariate analysis. Receiver operating characteristic analysis was used to identify an optimal cutpoint for conformality index relative to local control. A P value <.05 was considered statistically significant. Results: Median age was 60 years (range, 27-84 years). There were 66 cerebellar metastases (16%) and 335 supratentorial metastases (84%). The median prescription dose was 20 Gy (range, 14-24 Gy). Median overall survival from time of SRS was 12.1 months. The estimated local control at 12 months was 74%. On multivariate analysis, cerebellar location (hazard ratio [HR] 1.94, P=.009), larger tumor volume (HR 1.09, P<.001), and lower conformality (HR 0.700, P=.044) were significant independent predictors of local failure. Conformality index cutpoints of 1.4-1.9 were predictive of local control, whereas a cutpoint of 1.75 was the most predictive (P=.001). The adjusted Kaplan-Meier 1-year local control for conformality index ≥1.75 was 84% versus 69% for conformality index <1.75, controlling for tumor volume and location. The 1-year adjusted local control for cerebellar lesions was 60%, compared with 77% for supratentorial lesions, controlling for tumor volume and conformality index. Conclusions: Cerebellar tumor location, lower conformality index, and larger tumor volume were significant independent predictors of local failure after SRS for brain metastases from NSCLC. These results warrant further investigation in a prospective setting.
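The abstract does not define its conformality index or the exact ROC procedure; a common conformity index is the prescription isodose volume divided by the target volume, and one standard way to pick a cutpoint is Youden's J on the ROC curve. The sketch below illustrates only the cutpoint search, with invented per-lesion values.

```python
import numpy as np
from sklearn.metrics import roc_curve

# hypothetical per-lesion data: conformality index and a 1/0 local-failure flag
ci = np.array([1.2, 1.5, 1.6, 1.75, 1.9, 2.1, 1.4, 1.8, 2.4, 1.3])
failed = np.array([1, 1, 0, 0, 0, 0, 1, 0, 0, 1])

# lower conformality was associated with failure, so use score = -CI so that
# higher scores indicate higher failure risk
fpr, tpr, thresholds = roc_curve(failed, -ci)
youden = tpr - fpr                         # Youden's J at each threshold
best = -thresholds[np.argmax(youden)]      # undo the sign flip
print(f"optimal conformality cutpoint with these toy data: {best:.2f}")
```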
Shamriz, Oded; Leiba, Merav; Levine, Hagai; Derazne, Estela; Keinan-Boker, Lital; Kark, Jeremy D
2017-04-01
Studies evaluating adolescent risk factors for developing acute myeloid leukemia (AML) are virtually nonexistent. We assessed adolescent predictors of AML in adults, with a main focus on adolescent BMI. The study included 2,310,922 16-19-year-old Jewish Israeli adolescents (mean age 17.3 ± 0.4, 59.5% male), called up for an obligatory health examination. Sociodemographic and health data, including measured weight and height, were gathered. Body mass index (BMI) was examined both as a continuous variable and grouped according to the World Health Organization (WHO) classification and US-CDC percentiles. Bone-marrow-biopsy-verified AML cases diagnosed up to 31 December 2012 were identified by linkage to the Israel national cancer registry. Multivariable-adjusted Cox proportional-hazards models were used to model time to diagnosis. During 47 million person years of follow-up, 568 AML cases were identified (crude incidence rate 1.21/100,000 person years). There was a multivariable-adjusted hazard ratio (HR) of 1.041 (95% CI 1.015-1.068, p = 0.002) per unit BMI. The association was evident in those of Middle Eastern, North African, and European origin. A graded association was evident across the overweight and obese WHO grouping. With the US-CDC grouping, excess risk was evident in overweight but not in obese adolescents, although a test for trend in percentiles was significant (p = 0.004). Borderline associations were noted for origin (p = 0.065; higher in the predominantly Ashkenazi European origin), sex (higher in women: HR = 1.24, 95% CI 0.99-1.55), and stature (HR = 1.013, 95% CI 1.000-1.026, per cm). Higher BMI in adolescence was associated with increased AML incidence in adulthood in this multiethnic population.
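The per-unit hazard ratio from a log-linear Cox model scales multiplicatively, so the reported 1.041 per BMI unit can be read off for any BMI difference; the worked example below is generic arithmetic, not a reanalysis of the study.

```latex
% Hazard ratio for a k-unit higher BMI under the fitted log-linear Cox model:
\[
\mathrm{HR}(k\ \text{BMI units}) = e^{k\beta} = \bigl(e^{\beta}\bigr)^{k} = 1.041^{k},
\qquad
\mathrm{HR}(5) \approx 1.041^{5} \approx 1.22 .
\]
```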
Utilization of Body Contouring Procedures Following Weight Loss Surgery: A Study of 37,806 Patients.
Altieri, Maria S; Yang, Jie; Park, Jihye; Novikov, David; Kang, Lijuan; Spaniolas, Konstantinos; Bates, Andrew; Talamini, Mark; Pryor, Aurora
2017-11-01
Bariatric surgery has substantial health benefits; however, some patients desire body contouring (BC) procedures following rapid weight loss. There is a paucity of data regarding the true rate of BC following bariatric procedures. The purpose of our study is to examine the utilization of two common procedures, abdominoplasty and panniculectomy, following bariatric surgery in New York State. The SPARCS longitudinal administrative database was used to identify bariatric procedures by using ICD-9 and CPT codes between 2004 and 2010. Procedures included sleeve gastrectomy, Roux-en-Y gastric bypass, and laparoscopic adjustable gastric banding. Using a unique patient identifier, we tracked those patients who subsequently underwent either abdominoplasty or panniculectomy with at least a 4-year follow-up (until 2014). A multivariable Cox proportional hazards model was used to evaluate predictors of follow-up BC surgery. 37,806 patients underwent bariatric surgery between 2004 and 2010. Only 5.58% (n = 2112) of these patients subsequently had a BC procedure, with 143 of them (6.8%) having >1 plastic surgery. The average time to plastic surgery after band, bypass, or sleeve was 1134.83 ± 671.09, 984.70 ± 570.53, and 903.02 ± 497.31 days, respectively (P < 0.0001). In the multivariable Cox proportional hazards model, female patients, sleeve gastrectomy (SG) patients, patients with Medicare or Medicaid, and patients in either the lowest (<20th percentile) or highest (>80th percentile) yearly income groups were more likely to have plastic surgery after adjusting for age, race/ethnicity, comorbidities, and complications (P values < 0.0001). This study shows that plastic surgery is completed by only 6% of patients following bariatric procedures. As insurance and income are associated with pursuing surgery, improved access may increase the number of patients who are able to undergo these reconstructive procedures.
Zell, Jason A.; Lin, Bruce S.; Ziogas, Argyrios; Anton-Culver, Hoda
2012-01-01
Background: Dietary arginine and meat consumption are implicated in colorectal cancer (CRC) progression via polyamine-dependent processes. Polymorphism in the polyamine-regulatory gene, ornithine decarboxylase 1 (Odc1, rs2302615), is prognostic for CRC-specific mortality. Here, we examined joint effects of meat consumption and Odc1 polymorphism on CRC-specific mortality. Materials and Methods: The analytic cohort comprised 329 incident stage I-III CRC cases diagnosed in 1994-1996, with follow-up through March 2008. Odc1 genotyping was conducted using primers that amplify a 172-bp fragment containing the polymorphic base at +316. Dietary questionnaires were administered at cohort entry. Multivariate Cox proportional hazards regression analysis for CRC-specific mortality was stratified by tumor, node, metastasis (TNM) stage, and adjusted for clinically relevant variables, plus meat consumption (as a continuous variable, i.e., the number of medium-sized servings/week), Odc1 genotype, and a term representing the meat consumption and Odc1 genotype interaction. The primary outcome was the interaction of Odc1 and meat intake on CRC-specific mortality, as assessed by departures from multiplicative joint effects. Results: Odc1 genotype distribution was 51% GG, 49% GA/AA. In the multivariate model, there was a significant interaction between meat consumption and Odc1 genotype, P-int = 0.01. Among Odc1 GA/AA CRC cases in meat consumption Quartiles 1-3, increased mortality risk was observed when compared to GG cases (adjusted hazard ratio (HR) = 7.06 [95% CI 2.34-21.28]) – a difference not found among cases in the highest dietary meat consumption Quartile 4. Conclusions: Effects of meat consumption on CRC-specific mortality risk differ based on genetic polymorphism at Odc1. These results provide further evidence that polyamine metabolism and its modulation by dietary factors such as meat may have relevance to CRC outcomes. PMID:23233821
Lorente-Cánovas, Beatriz; Doré, Caroline J; Bosworth, Ailsa; Ma, Margaret H; Galloway, James B; Cope, Andrew P; Pande, Ira; Walker, David; Scott, David L
2017-01-01
Abstract Objectives RA patients receiving TNF inhibitors (TNFi) usually maintain their initial doses. The aim of the Optimizing Treatment with Tumour Necrosis Factor Inhibitors in Rheumatoid Arthritis trial was to evaluate whether tapering TNFi doses causes loss of clinical response. Methods We enrolled RA patients receiving etanercept or adalimumab and a DMARD with DAS28 under 3.2 for over 3 months. Initially (months 0–6) patients were randomized to control (constant TNFi) or two experimental groups (tapering TNFi by 33 or 66%). Subsequently (months 6–12) control subjects were randomized to taper TNFi by 33 or 66%. Disease flares (DAS28 increasing ⩾0.6 with at least one additional swollen joint) were the primary outcome. Results Two hundred and forty-four patients were screened, 103 randomized and 97 treated. In months 0–6 there were 8/50 (16%) flares in controls, 3/26 (12%) with 33% tapering and 6/21 (29%) with 66% tapering. Multivariate Cox analysis showed time to flare was unchanged with 33% tapering but was reduced with 66% tapering compared with controls (adjusted hazard ratio 2.81, 95% CI: 0.99, 7.94; P = 0.051). Analysing all tapered patients after controls were re-randomized (months 6–12) showed differences between groups: there were 6/48 (13%) flares with 33% tapering and 14/39 (36%) with 66% tapering. Multivariate Cox analysis showed 66% tapering reduced time to flare (adjusted hazard ratio 3.47, 95% CI: 1.26, 9.58; P = 0.016). Conclusion Tapering TNFi by 33% has no impact on disease flares and appears practical in patients in sustained remission and low disease activity states. Trial registration EudraCT, https://www.clinicaltrialsregister.eu, 2010-020738-24; ISRCTN registry, https://www.isrctn.com, 28955701 PMID:28968858
Modin, Daniel; Sengeløv, Morten; Jørgensen, Peter Godsk; Bruun, Niels Eske; Olsen, Flemming Javier; Dons, Maria; Fritz Hansen, Thomas; Jensen, Jan Skov; Biering-Sørensen, Tor
2018-04-01
Quantification of systolic function in patients with atrial fibrillation (AF) is challenging. A novel approach, based on RR interval correction, to counteract the varying heart cycle lengths in AF has recently been proposed. Whether this method is superior in patients with systolic heart failure (HFrEF) with AF remains unknown. This study investigates the prognostic value of RR interval-corrected peak global longitudinal strain {GLSc = GLS/[RR^(1/2)]} in relation to all-cause mortality in HFrEF patients displaying AF during echocardiographic examination. Echocardiograms from 151 patients with HFrEF and AF during examination were analysed offline. Peak global longitudinal strain (GLS) was averaged from 18 myocardial segments obtained from three apical views. GLS was indexed with the square root of the RR interval {GLSc = GLS/[RR^(1/2)]}. Endpoint was all-cause mortality. During a median follow-up of 2.7 years, 40 patients (26.5%) died. Neither uncorrected GLS (P = 0.056) nor left ventricular ejection fraction (P = 0.053) was significantly associated with all-cause mortality. After RR^(1/2) indexation, GLSc became a significant predictor of all-cause mortality (hazard ratio 1.16, 95% confidence interval 1.02-1.22, P = 0.014, per %/s^(1/2) decrease). GLSc remained an independent predictor of mortality after multivariable adjustment (age, sex, mean heart rate, mean arterial blood pressure, left atrial volume index, and E/e') (hazard ratio 1.17, 95% confidence interval 1.05-1.31, P = 0.005 per %/s^(1/2) decrease). Decreasing {GLSc = GLS/[RR^(1/2)]}, but not uncorrected GLS nor left ventricular ejection fraction, was significantly associated with increased risk of all-cause mortality in HFrEF patients with AF and remained an independent predictor after multivariable adjustment. © 2017 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.
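A minimal sketch of the RR-interval correction stated in the abstract, GLSc = GLS divided by the square root of the preceding RR interval; the numerical example is invented.

```python
import math

def rr_corrected_gls(gls_percent: float, rr_interval_s: float) -> float:
    """Index peak global longitudinal strain by the square root of the
    preceding RR interval, as described in the abstract: GLSc = GLS / sqrt(RR)."""
    return gls_percent / math.sqrt(rr_interval_s)

# invented example: GLS of -9.5% on a beat preceded by a 0.64 s RR interval
print(rr_corrected_gls(-9.5, 0.64))   # -11.875, i.e. about -11.9 %/s^(1/2)
```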
Liu, Huikun; Zhang, Shuang; Wang, Leishen; Leng, Junhong; Li, Weiqin; Li, Nan; Li, Min; Qiao, Yijuan; Tian, Huiguang; Tuomilehto, Jaakko; Yang, Xilin; Yu, Zhijie; Hu, Gang
2016-02-01
Very few studies have assessed the association of fasting and 2h glucose, and HbA1c during pregnancy with postpartum diabetes risk among women with prior gestational diabetes mellitus (GDM). We assessed the association of fasting glucose, 2h glucose and HbA1c at 26-30 gestational weeks with postpartum diabetes risk among women with prior GDM. A cohort study in 1263 GDM women at 1-5 years after delivery was performed. Cox proportional hazards regression models were used to evaluate the association of fasting and 2h plasma glucose, and HbA1c at 26-30 gestational weeks with the risk of diabetes at postpartum. The multivariable-adjusted (age, pre-pregnancy body mass index, weight gain during pregnancy, current body mass index, family history of diabetes, marital status, education, family income, smoking status, passive smoking, leisure-time physical activity, alcohol drinking, and intake of energy, saturated fat, and dietary fiber) hazard ratios of postpartum diabetes were 1.61 (95% confidence interval [CI]: 1.36-1.91) for each 1 mmol/l increase in fasting glucose during pregnancy, 1.63 (95% CI: 1.45-1.84) for each 1 mmol/l increase in 2h glucose during pregnancy, and 2.11 (95% CI: 1.50-2.97) for each 1 unit (%) increase in HbA1c during pregnancy. When fasting glucose, 2h glucose and HbA1c during pregnancy were entered into the multivariable-adjusted model simultaneously, 2h glucose and HbA1c, but not fasting glucose, remained significant and positive predictors of postpartum diabetes. For women with prior GDM, 2h plasma glucose and HbA1c during pregnancy are independent predictors of postpartum diabetes, but fasting plasma glucose during pregnancy is not. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Helbig, A Katharina; Stöckl, Doris; Heier, Margit; Ladwig, Karl-Heinz; Meisinger, Christa
2015-01-01
To examine the relationship between symptoms of insomnia and sleep duration and incident total (non-fatal plus fatal) strokes, non-fatal strokes, and fatal strokes in a large cohort of men and women from the general population in Germany. In four population-based MONICA (monitoring trends and determinants in cardiovascular disease)/KORA (Cooperative Health Research in the Region of Augsburg) surveys conducted between 1984 and 2001, 17,604 men and women (aged 25 to 74 years) were asked about issues like sleep, health behavior, and medical history. In subsequent surveys and mortality follow-ups, incident stroke cases (cerebral hemorrhage, ischemic stroke, transient ischemic attack, unknown stroke type) were gathered prospectively until 2009. Sex-specific hazard ratios (HR) and their 95% confidence intervals (CI) were estimated using sequential Cox proportional hazards regression models. During a mean follow-up of 14 years, 917 strokes (710 non-fatal strokes and 207 fatal strokes) were observed. Trouble falling asleep and difficulty staying asleep were not significantly related to any incident stroke outcome in either sex in the multivariable models. Among men, the HR for the association between short (≤5 hours) and long (≥10 hours) daily sleep duration and total strokes were 1.44 (95% CI: 1.01-2.06) and 1.63 (95% CI: 1.16-2.29), after adjustment for basic confounding variables. As for non-fatal strokes and fatal strokes, in the analyses adjusted for age, survey, education, physical activity, alcohol consumption, smoking habits, body mass index, hypertension, diabetes, and dyslipidemia, the increased risks persisted, albeit somewhat attenuated, but no longer remained significant. Among women, in the multivariable analyses the quantity of sleep was also not related to any stroke outcome. In the present study, symptoms of insomnia and exceptional sleep duration were not significantly predictive of incident total strokes, non-fatal strokes, and fatal strokes in either sex.
Chang, Susan M; Barker, Fred G
2005-11-01
Social factors influence cancer treatment choices, potentially affecting patient survival. In the current study, the authors studied the interrelations between marital status, treatment received, and survival in patients with glioblastoma multiforme (GM), using population-based data. The data source was the Surveillance, Epidemiology, and End Results (SEER) Public Use Database, 1988-2001, 2004 release, all registries. Multivariate logistic, ordinal, and Cox regression analyses adjusted for demographic and clinical variables were used. Of 10,987 patients with GM, 67% were married, 31% were unmarried, and 2% were of unknown marital status. Tumors were slightly larger at the time of diagnosis in unmarried patients (49% of unmarried patients had tumors larger than 45 mm vs. 45% of married patients; P = 0.004, multivariate analysis). Unmarried patients were less likely to undergo surgical resection (vs. biopsy; 75% of unmarried patients vs. 78% of married patients) and were less likely to receive postoperative radiation therapy (RT) (70% of unmarried patients vs. 79% of married patients). On multivariate analysis, the odds ratio (OR) for resection (vs. biopsy) in unmarried patients was 0.88 (95% confidence interval [95% CI], 0.79-0.98; P = 0.02), and the OR for RT in unmarried patients was 0.69 (95% CI, 0.62-0.77; P < 0.001). Unmarried patients more often refused both surgical resection and RT. Unmarried patients who underwent surgical resection and RT were found to have a shorter survival than similarly treated married patients (hazard ratio for unmarried patients, 1.10; P = 0.003). Unmarried patients with GM presented with larger tumors, were less likely to undergo both surgical resection and postoperative RT, and had a shorter survival after diagnosis when compared with married patients, even after adjustment for treatment and other prognostic factors. (c) 2005 American Cancer Society.
Localized Scleroderma, Systemic Sclerosis and Cardiovascular Risk: A Danish Nationwide Cohort Study.
Hesselvig, Jeanette Halskou; Kofoed, Kristian; Wu, Jashin J; Dreyer, Lene; Gislason, Gunnar; Ahlehoff, Ole
2018-03-13
Recent findings indicate that patients with systemic sclerosis have an increased risk of cardiovascular disease. To determine whether patients with systemic sclerosis or localized scleroderma are at increased risk of cardiovascular disease, a cohort study of the entire Danish population aged ≥ 18 and ≤ 100 years was conducted, followed from 1997 to 2011 by individual-level linkage of nationwide registries. Multivariable adjusted Cox regression models were used to estimate the hazard ratios (HRs) for a composite cardiovascular disease endpoint. A total of 697 patients with localized scleroderma and 1,962 patients with systemic sclerosis were identified and compared with 5,428,380 people in the reference population. In systemic sclerosis, the adjusted HR was 2.22 (95% confidence interval 1.99-2.48). No association was seen between patients with localized scleroderma and cardiovascular disease. In conclusion, systemic sclerosis is a significant cardiovascular disease risk factor, while patients with localized scleroderma are not at increased risk of cardiovascular disease.
Kunutsor, Setor K; Laukkanen, Tanjaniina; Laukkanen, Jari A
2017-10-01
Cardiorespiratory fitness (CRF), an index of cardiac and respiratory functioning, is strongly associated with a reduced risk of adverse health outcomes. We aimed to assess the prospective association of CRF with the risk of respiratory diseases (defined as chronic obstructive pulmonary disease, pneumonia, or asthma). Cardiorespiratory fitness, as measured by maximal oxygen uptake, was assessed in 1974 middle-aged men. During a median follow-up of 25.7 years, 382 hospital diagnosed respiratory diseases were recorded. Cardiorespiratory fitness was linearly associated with risk of respiratory diseases. In analysis adjusted for several established and potential risk factors, the hazard ratio (HR) (95% CI) for respiratory diseases was 0.63 (0.45-0.88), when comparing extreme quartiles of CRF levels. The corresponding multivariate adjusted HR (95% CI) for pneumonia was 0.67 (0.48-0.95). Our findings indicate a graded inverse and independent association between CRF and the future risk of respiratory diseases in a general male Caucasian population.
Sirolimus use and incidence of venous thromboembolism in cardiac transplant recipients.
Thibodeau, Jennifer T; Mishkin, Joseph D; Patel, Parag C; Kaiser, Patricia A; Ayers, Colby R; Mammen, Pradeep P A; Markham, David W; Ring, W Steves; Peltz, Matthias; Drazner, Mark H
2012-01-01
Sirolimus is an immunosuppressive agent increasingly used in cardiac transplant recipients in the setting of allograft vasculopathy or worsening renal function. Recently, sirolimus has been associated with increased risk of venous thromboembolism (VTE) in lung transplant recipients. To investigate whether this association is also present in cardiac transplant recipients, we retrospectively reviewed the charts of 67 cardiac transplant recipients whose immunosuppressive regimen included sirolimus and 134 matched cardiac transplant recipients whose regimen did not include sirolimus. Rates of VTE were compared. Multivariable Cox proportional hazards models tested the association of sirolimus use with VTE. A higher incidence of VTE was seen in patients treated with vs. without sirolimus (8/67 [12%] vs. 9/134 [7%], log-rank statistic: 4.66, p=0.03). Lower body mass index (BMI) and total cholesterol levels were also associated with VTE (p<0.05). The association of sirolimus with VTE persisted when adjusting for BMI (hazard ratio [95% confidence interval]: 2.96 [1.13, 7.75], p=0.03) but not when adjusting for total cholesterol (p=0.08). These data suggest that sirolimus is associated with an increased risk of VTE in cardiac transplant recipients, a risk possibly mediated through comorbid conditions. Larger, more conclusive studies are needed. Until such studies are completed, a heightened level of awareness for VTE in cardiac transplant recipients treated with sirolimus appears warranted. © 2012 John Wiley & Sons A/S.
Budhathoki, S; Iwasaki, M; Sawada, N; Yamaji, T; Shimazu, T; Sasazuki, S; Inoue, M; Tsugane, S
2015-02-01
Compared with western populations, the consumption of soy foods among Japanese is very high and the incidence of endometrial cancer very low. We evaluated the association of soy food and isoflavone intake with endometrial cancer risk in Japanese women. Prospective cohort study. Ten public health centre areas in Japan. Forty nine thousand one hundred and twenty-one women of age 45-74 years who responded to a 5-year follow-up survey questionnaire. Intakes of soy foods as well as other covariates were assessed in 1995-1998 by a self-administered food frequency questionnaire. Cox proportional hazards regression models were used to estimate hazard ratios (HR) and 95% confidence intervals (CI). Incidence of endometrial cancer. During an average of 12.1 years of follow up, 112 newly diagnosed endometrial cancer cases were identified. Energy-adjusted intakes of soy food and isoflavone were not associated with the risk of endometrial cancer. The multivariate-adjusted HR per 25 g/day increase in the intake of soy food was 1.02 (95% CI 0.94-1.10), and the corresponding value for isoflavone intake per 15 mg/day was 1.01 (95% CI 0.84-1.22). In this population-based prospective cohort study of Japanese women, we observed no evidence of a protective association between soy food or isoflavone intake and endometrial cancer risk. © 2014 Royal College of Obstetricians and Gynaecologists.
Physical activity, sedentary behavior and all-cause mortality among blacks and whites with diabetes.
Glenn, Kimberly R; Slaughter, James C; Fowke, Jay H; Buchowski, Maciej S; Matthews, Charles E; Signorello, Lisa B; Blot, William J; Lipworth, Loren
2015-09-01
The study objective was to examine the role of physical activity (PA) and sedentary time (ST) on mortality risk among a population of low-income adults with diabetes. Black (n = 11,137) and white (n = 4508) men and women with diabetes from the Southern Community Cohort Study self-reported total PA levels and total ST. Participants were categorized into quartiles of total PA and total ST. Hazard ratios (HRs) and 95% confidence intervals (CIs) for subsequent mortality risk were estimated from Cox proportional hazards analysis with adjustment for potential confounders. During follow-up, 2370 participants died. The multivariable risk of mortality was lower among participants in the highest quartile of PA compared with those in the lowest quartile (HR, 0.64; 95% CI: 0.57-0.73). Mortality risk was significantly increased among participants in the highest compared with the lowest quartile of ST after adjusting for PA (HR, 1.21; 95% CI: 1.08-1.37). Across sex and race groups, similar trends of decreasing mortality with rising PA and increasing mortality with rising ST were observed. Although causality cannot be established from these observational data, the current findings suggest that increasing PA and decreasing ST may help extend survival among individuals with diabetes irrespective of race and sex. Copyright © 2015 Elsevier Inc. All rights reserved.
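Exposure quartiles like those used here can be formed directly from the continuous measures; the sketch below uses pandas.qcut with invented activity and sitting-time columns, and Q1 (the lowest exposure) as the reference group.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "pa_met_h_week": rng.gamma(2.0, 10.0, 2000),   # hypothetical total activity
    "sit_h_day": rng.uniform(2, 14, 2000),         # hypothetical sitting time
})

# quartile categories; Q1 = lowest exposure, used as the reference group
df["pa_q"] = pd.qcut(df["pa_met_h_week"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
df["st_q"] = pd.qcut(df["sit_h_day"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df["pa_q"].value_counts().sort_index())
```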
Primary surgery versus primary radiation-based treatment for locally advanced oropharyngeal cancer.
Kamran, Sophia C; Qureshi, Muhammad M; Jalisi, Scharukh; Salama, Andrew; Grillone, Gregory; Truong, Minh Tam
2018-06-01
Randomized data comparing surgery to radiation for locally advanced oropharyngeal cancer (OPC) are lacking. This study evaluated practice patterns and overall survival outcomes from the National Cancer Database. A total of 22,676 patients with stage III to IV, locally advanced OPC were treated between 2004 and 2013 with primary chemoradiation (CRT) or surgery with adjuvant radiotherapy with or without chemotherapy (aRT ± CT). Survival rates were estimated using the Kaplan-Meier method. Crude and adjusted hazard ratios (HR) were computed using Cox regression modeling. Median follow-up was 40.7 months; 8,555 and 14,121 patients received surgery with aRT ± CT and CRT, respectively. Corresponding 3-year survival was 85.4% and 72.6% (P < 0.0001). On multivariate analysis, adjusting for age, gender, race, insurance status, median income, percentage with no high-school degree, Charlson-Deyo score, clinical tumor and node stage, tumor grade, facility type, treatment at > 1 facility, and human papillomavirus (HPV) status, surgery with aRT ± CT had a reduced hazard of death (HR, 0.79; 95% confidence interval, 0.69-0.91; P = 0.001). Primary surgery with aRT ± CT for locally advanced OPC has improved survival compared to primary radiation-based treatment, even when stratified by HPV status. 2c. Laryngoscope, 128:1353-1364, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
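A hedged sketch of a Kaplan-Meier estimate of 3-year survival for one treatment group using the lifelines package; the follow-up times and event indicators below are simulated, not the registry data.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
# hypothetical follow-up (months) and death indicator for one treatment group
time_months = rng.exponential(80, 300)
died = rng.binomial(1, 0.4, 300)

kmf = KaplanMeierFitter()
kmf.fit(time_months, event_observed=died, label="surgery + aRT")
print(kmf.survival_function_at_times(36))   # Kaplan-Meier 3-year survival
```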
Liu, Feng-Cheng; Huang, Wen-Yen; Lin, Te-Yu; Shen, Chih-Hao; Chou, Yu-Ching; Lin, Cheng-Li; Lin, Kuen-Tze; Kao, Chia-Hung
2015-11-01
The effects of the inflammatory mediators involved in systemic lupus erythematosus (SLE) on subsequent Parkinson disease have been reported, but no relevant studies have focused on the association between the 2 diseases. This nationwide population-based study evaluated the risk of Parkinson disease in patients with SLE. We identified 12,817 patients in the Taiwan National Health Insurance database diagnosed with SLE between 2000 and 2010 and compared the incidence rate of Parkinson disease among these patients with that among 51,268 randomly selected age- and sex-matched non-SLE patients. A Cox multivariable proportional-hazards model was used to evaluate the risk factors of Parkinson disease in the SLE cohort. We observed an inverse association between a diagnosis of SLE and the risk of subsequent Parkinson disease, with the crude hazard ratio (HR) being 0.60 (95% confidence interval 0.45-0.79) and adjusted HR being 0.68 (95% confidence interval 0.51-0.90). The cumulative incidence of Parkinson disease was 0.83% lower in the SLE cohort than in the non-SLE cohort. The adjusted HR of Parkinson disease decreased as the follow-up duration increased and was decreased among older lupus patients with comorbidity. We determined that patients with SLE had a decreased risk of subsequent Parkinson disease. Further research is required to elucidate the underlying mechanism.
Regional variations in breast cancer among california teachers.
Reynolds, Peggy; Hurley, Susan; Goldberg, Debbie E; Anton-Culver, Hoda; Bernstein, Leslie; Deapen, Dennis; Horn-Ross, Pamela L; Peel, David; Pinder, Richard; Ross, Ronald K; West, Dee; Wright, William E; Ziogas, Argyrios
2004-11-01
Observed regional differences in breast cancer incidence could provide valuable clues to the etiology of this disease. The pattern of historically higher breast cancer rates among residents of California's San Francisco Bay and Southern Coastal areas is evident in the disease experience among members of the California Teachers Study. This large cohort study has followed female professional school employees for cancer incidence since 1995 and has collected extensive information on breast cancer risk factors. Between 1996 and 1999, invasive breast cancer was diagnosed in 1562 of the 115,611 cohort members who could be geocoded to a California address in 1995 and who had no previous breast cancer diagnosis. Adjusted hazard rate ratios (HRs) were estimated through multivariate Cox proportional hazards modeling. Rates were higher for cohort members in the San Francisco Bay area (HR = 1.22; 95% confidence interval = 1.06-1.40) and Southern Coastal area (1.16; 1.04-1.30) compared with those in the rest of California. The distributions of variables representing socioeconomic status, urbanization, and personal risk factors were consistent with higher risks for cohort members residing in the San Francisco Bay and Southern Coastal areas. Adjustment for these factors, however, did not explain regional differences in incidence, resulting in HRs that remained elevated for these 2 areas. Regional differences in breast cancer incidence in this large, well-defined cohort are not easily explained by known risk factors.
Accelerated death rate in population-based cohort of persons with traumatic brain injury.
Selassie, Anbesaw W; Cao, Yue; Church, Elizabeth C; Saunders, Lee L; Krause, James
2014-01-01
To determine the influence of preexisting heart, liver, kidney, cancer, stroke, and mental health problems and examine the influence of low socioeconomic status on mortality after discharge from acute care facilities for individuals with traumatic brain injury. Population-based retrospective cohort study of 33695 persons discharged from acute care hospital with traumatic brain injury in South Carolina, 1999-2010. Days elapsing from the dates of injury to death established the survival time (T). Data were censored at the 145th month. Multivariable Cox regression was used to examine the independent effect of the variables on death. Age-adjusted cumulative probability of death for each chronic disease of interest was plotted. By the 70th month of follow-up, rate of death was accelerated from 10-fold for heart diseases to 2.5-fold for mental health problems. Adjusted hazard ratios for diseases of the heart (2.13), liver-renal (3.25), cancer (2.64), neurological diseases and stroke (2.07), diabetes (1.89), hypertension (1.43), and mental health problems (1.59) were highly significant (each with P < .001). Compared with persons with private insurance, the hazard ratio was significantly elevated with Medicaid (1.67), Medicare (1.54), and uninsured (1.27) (each with P < .001). Specific chronic diseases strongly influenced postdischarge mortality after traumatic brain injury. Low socioeconomic status as measured by the type of insurance elevated the risk of death.
Charlson comorbidity index as a predictor of periodontal disease in elderly participants
2018-01-01
Purpose This study investigated the validity of the Charlson comorbidity index (CCI) as a predictor of periodontal disease (PD) over a 12-year period. Methods Nationwide representative samples of 149,785 adults aged ≥60 years with PD (International Classification of Disease, 10th revision [ICD-10], K052–K056) were derived from the National Health Insurance Service-Elderly Cohort during 2002–2013. The degree of comorbidity was measured using the CCI (grade 0–6), including 17 diseases weighted on the basis of their association with mortality, and data were analyzed using multivariate Cox proportional-hazards regression in order to investigate the associations of comorbid diseases (CDs) with PD. Results The multivariate Cox regression analysis with adjustment for sociodemographic factors (sex, age, household income, insurance status, residence area, and health status) and CDs (acute myocardial infarction, congestive heart failure, peripheral vascular disease, cerebral vascular accident, dementia, pulmonary disease, connective tissue disorders, peptic ulcer, liver disease, diabetes, diabetes complications, paraplegia, renal disease, cancer, metastatic cancer, severe liver disease, and human immunodeficiency virus [HIV]) showed that the CCI in elderly comorbid participants was significantly and positively correlated with the presence of PD (grade 1: hazard ratio [HR], 1.11; P<0.001; grade ≥2: HR, 1.12, P<0.001). Conclusions We demonstrated that a higher CCI was a significant predictor of greater risk for PD in the South Korean elderly population. PMID:29770238
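A sketch of how a Charlson score can be computed by summing condition weights, using the classic 1987 Charlson weights for the 17 conditions listed; the study's exact ICD-10 mapping and grading may differ, so treat the weights and names below as illustrative.

```python
# Classic Charlson (1987) condition weights; the cohort study's exact
# operationalisation may differ, so this is an illustrative sketch only.
CHARLSON_WEIGHTS = {
    "myocardial_infarction": 1, "congestive_heart_failure": 1,
    "peripheral_vascular_disease": 1, "cerebrovascular_disease": 1,
    "dementia": 1, "chronic_pulmonary_disease": 1,
    "connective_tissue_disease": 1, "peptic_ulcer": 1,
    "mild_liver_disease": 1, "diabetes": 1,
    "diabetes_with_complications": 2, "hemiplegia_or_paraplegia": 2,
    "renal_disease": 2, "any_malignancy": 2,
    "moderate_severe_liver_disease": 3,
    "metastatic_solid_tumour": 6, "hiv_aids": 6,
}

def charlson_index(conditions):
    """Sum the weights of a participant's recorded comorbid conditions."""
    return sum(CHARLSON_WEIGHTS[c] for c in conditions)

print(charlson_index(["diabetes", "renal_disease"]))             # 1 + 2 = 3
print(charlson_index(["metastatic_solid_tumour", "dementia"]))   # 6 + 1 = 7
```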
Racial differences in colorectal cancer survival at a safety net hospital.
Tapan, Umit; Lee, Shin Yin; Weinberg, Janice; Kolachalama, Vijaya B; Francis, Jean; Charlot, Marjory; Hartshorn, Kevan; Chitalia, Vipul
2017-08-01
While racial disparities in colorectal cancer survival have previously been studied, whether this disparity exists in patients with metastatic colorectal cancer receiving care at safety net hospitals (and therefore of similar socioeconomic status) is poorly understood. We examined racial differences in survival in a cohort of patients with stage IV colorectal cancer treated at the largest safety net hospital in the New England region, which serves a population with a majority (65%) of non-Caucasian patients. Data were extracted from the hospital's electronic medical record. Survival differences among different racial and ethnic groups were examined graphically using Kaplan-Meier analysis. A univariate Cox proportional hazards model and a multivariable adjusted model were generated. Black patients had significantly lower overall survival compared to White patients, with median overall survival of 1.9 years and 2.5 years, respectively. In a multivariate analysis, Black race posed a significant hazard (HR 1.70, CI 1.01-2.90, p=0.0467) for death. Though response to therapy emerged as a strong predictor of survival (HR=0.4, CI=0.2-0.7, p=0.0021), it was comparable between Blacks and Whites. Despite presumed equal access to healthcare and socioeconomic status within a safety-net hospital system, our results reinforce findings from previous studies showing lower colorectal cancer survival in Black patients, and also point to the importance of investigating other factors such as genetic and pathologic differences. Copyright © 2017 Elsevier Ltd. All rights reserved.
Dembe, A; Erickson, J; Delbos, R; Banks, S
2005-01-01
Aims: To analyse the impact of overtime and extended working hours on the risk of occupational injuries and illnesses among a nationally representative sample of working adults from the United States. Methods: Responses from 10 793 Americans participating in the National Longitudinal Survey of Youth (NLSY) were used to evaluate workers' job histories, work schedules, and occurrence of occupational injury and illness between 1987 and 2000. A total of 110 236 job records were analysed, encompassing 89 729 person-years of accumulated working time. Aggregated incidence rates in each of five exposure categories were calculated for each NLSY survey period. Multivariate analytical techniques were used to estimate the relative risk of long working hours per day, extended hours per week, long commute times, and overtime schedules on reporting a work related injury or illness, after adjusting for age, gender, occupation, industry, and region. Results: After adjusting for those factors, working in jobs with overtime schedules was associated with a 61% higher injury hazard rate compared to jobs without overtime. Working at least 12 hours per day was associated with a 37% increased hazard rate and working at least 60 hours per week was associated with a 23% increased hazard rate. A strong dose-response effect was observed, with the injury rate (per 100 accumulated worker-years in a particular schedule) increasing in correspondence to the number of hours per day (or per week) in the workers' customary schedule. Conclusions: Results suggest that job schedules with long working hours are not more risky merely because they are concentrated in inherently hazardous industries or occupations, or because people working long hours spend more total time "at risk" for a work injury. Strategies to prevent work injuries should consider changes in scheduling practices, job redesign, and health protection programmes for people working in jobs involving overtime and extended hours. PMID:16109814
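The injury rates here are expressed per 100 accumulated worker-years, i.e., events divided by worker-years, times 100. The counts below are invented for illustration and are not the study's data.

```python
def rate_per_100_worker_years(injuries: int, worker_years: float) -> float:
    """Crude incidence rate per 100 accumulated worker-years of exposure."""
    return 100.0 * injuries / worker_years

# invented counts for two schedule categories
print(rate_per_100_worker_years(250, 4_500.0))     # overtime schedules -> about 5.6
print(rate_per_100_worker_years(2_800, 85_000.0))  # no overtime        -> about 3.3
```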
ERIC Educational Resources Information Center
Siman-Tov, Ayelet; Kaniel, Shlomo
2011-01-01
The research validates a multivariate model that predicts parental adjustment to coping successfully with an autistic child. The model comprises four elements: parental stress, parental resources, parental adjustment and the child's autism symptoms. 176 parents of children aged between 6 to 16 diagnosed with PDD answered several questionnaires…
Consumption of ultra-processed foods and cancer risk: results from NutriNet-Santé prospective cohort
Fiolet, Thibault; Sellem, Laury; Kesse-Guyot, Emmanuelle; Allès, Benjamin; Méjean, Caroline; Deschasaux, Mélanie; Fassier, Philippine; Latino-Martel, Paule; Beslay, Marie; Hercberg, Serge; Lavalette, Céline; Monteiro, Carlos A; Julia, Chantal; Touvier, Mathilde
2018-01-01
Abstract Objective To assess the prospective associations between consumption of ultra-processed food and risk of cancer. Design Population based cohort study. Setting and participants 104 980 participants aged at least 18 years (median age 42.8 years) from the French NutriNet-Santé cohort (2009-17). Dietary intakes were collected using repeated 24 hour dietary records, designed to register participants’ usual consumption for 3300 different food items. These were categorised according to their degree of processing by the NOVA classification. Main outcome measures Associations between ultra-processed food intake and risk of overall, breast, prostate, and colorectal cancer assessed by multivariable Cox proportional hazard models adjusted for known risk factors. Results Ultra-processed food intake was associated with higher overall cancer risk (n=2228 cases; hazard ratio for a 10% increment in the proportion of ultra-processed food in the diet 1.12 (95% confidence interval 1.06 to 1.18); P for trend<0.001) and breast cancer risk (n=739 cases; hazard ratio 1.11 (1.02 to 1.22); P for trend=0.02). These results remained statistically significant after adjustment for several markers of the nutritional quality of the diet (lipid, sodium, and carbohydrate intakes and/or a Western pattern derived by principal component analysis). Conclusions In this large prospective study, a 10% increase in the proportion of ultra-processed foods in the diet was associated with a significant increase of greater than 10% in risks of overall and breast cancer. Further studies are needed to better understand the relative effect of the various dimensions of processing (nutritional composition, food additives, contact materials, and neoformed contaminants) in these associations. Study registration Clinicaltrials.gov NCT03335644. PMID:29444771
Baek, Tae-Hwa; Lee, Hae-Young; Lim, Nam-Kyoo; Park, Hyun-Young
2015-09-03
Hypertension is a leading cause of cardiovascular events. We examined whether there was a gender difference in the association between socioeconomic status (SES), measured by education and income, and hypertension incidence. Data for 2596 men and 2686 women aged 40-69 years without hypertension at baseline from the Korean Genome and Epidemiology Study (KoGES) were analyzed. Participants had two follow-up examinations over 4 years, and were classified into three categories by self-reported educational attainment: ≥ 10 years, 7-9 years, and 0-6 years, and monthly household income (×10,000 Korean Won): ≥ 200, 100-199, and <100. The association between SES and incident hypertension was examined by Cox's proportional hazard regression analyses. Adjusting for conventional risk factors, compared with the high education group (reference), the hazard ratios (95% confidence interval) for incident hypertension across the education categories were 1.54 (1.16-2.06) and 1.80 (1.36-2.38) in women and 1.15 (0.92-1.43) and 1.08 (0.84-1.38) in men. Women with low household income were more likely to develop hypertension than those with high household income, and incident hypertension showed an inverse association with household income level in women: multivariate adjusted hazard ratios were 1.00 (reference), 1.10 (0.83-1.45), and 1.63 (0.75-2.16). Men with medium income were less likely to have hypertension compared with those with high income (0.76, 0.61-0.90). Educational level and economic status had stronger impacts on hypertension in Korean women than in men. Thus, a stratified approach for women of low socioeconomic status, especially those with low educational attainment, is needed for the prevention of hypertension.
Lim, Jiyeon; Lee, Yunhee; Shin, Sangah; Lee, Hwi-Won; Kim, Claire E; Lee, Jong-Koo; Lee, Sang-Ah; Kang, Daehee
2018-06-01
Diet quality scores or indices, based on dietary guidelines, are used to summarize dietary intake into a single numeric variable. The aim of this study was to examine the association between the modified diet quality index for Koreans (DQI-K) and mortality among Health Examinees-Gem (HEXA-G) study participants. The DQI-K was modified from the original diet quality index. A total of 134,547 participants (45,207 men and 89,340 women) from the HEXA-G study (2004 and 2013) were included. The DQI-K is based on eight components: 1) daily protein intake, 2) percent of energy from fat, 3) percent of energy from saturated fat, 4) daily cholesterol intake, 5) daily whole-grain intake, 6) daily fruit intake, 7) daily vegetable intake, and 8) daily sodium intake. The association between all-cause mortality and the DQI-K was examined using Cox proportional hazard regression models. Hazard ratios and confidence intervals were estimated after adjusting for age, gender, income, smoking status, alcohol drinking, body mass index, and total energy intake. The total DQI-K score was calculated by summing the scores of the eight components (range 0-9). In the multivariable adjusted models, with good diet quality (score 0-4) as a reference, poor diet quality (score 5-9) was associated with an increased risk of all-cause mortality (hazard ratios = 1.23, 95% confidence intervals = 1.06-1.43). Moreover, a one-unit increase in DQI-K score resulted in a 6% higher mortality risk. A poor diet quality DQI-K score was associated with an increased risk of mortality. The DQI-K in the present study may be used to assess the diet quality of Korean adults.
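The index construction described above amounts to scoring each component against a cutoff and summing. The sketch below shows that pattern on invented data; the 0/1 scoring and the column names are placeholders rather than the actual DQI-K criteria (the published score ranges 0-9, so at least one component must contribute more than one point).

import pandas as pd

# Hypothetical component scores for three participants: 1 = component meets
# the guideline, 0 = does not. These are illustrative stand-ins only.
intake = pd.DataFrame({
    "protein_ok":     [1, 0, 1],
    "fat_ok":         [1, 0, 0],
    "sat_fat_ok":     [1, 1, 0],
    "cholesterol_ok": [1, 0, 0],
    "whole_grain_ok": [0, 0, 1],
    "fruit_ok":       [1, 0, 1],
    "vegetable_ok":   [1, 1, 1],
    "sodium_ok":      [0, 0, 0],
})

# Higher scores indicate poorer quality here: count the components NOT met.
intake["dqi_k"] = (1 - intake).sum(axis=1)

# Dichotomise as in the abstract: 0-4 = good diet quality, 5-9 = poor.
intake["poor_quality"] = (intake["dqi_k"] >= 5).astype(int)
print(intake[["dqi_k", "poor_quality"]])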
Loutfy, Mona R; Genebat, Miguel; Moore, David; Raboud, Janet; Chan, Keith; Antoniou, Tony; Milan, David; Shen, Anya; Klein, Marina B; Cooper, Curtis; Machouf, Nima; Rourke, Sean B; Rachlis, Anita; Tsoukas, Chris; Montaner, Julio S G; Walmsley, Sharon L; Smieja, Marek; Bayoumi, Ahmed; Mills, Edward; Hogg, Robert S
2010-12-01
To determine the long-term impact of immunologic discordance (viral load <50 copies/mL and CD4+ count <=200 cells/mm3) in antiretroviral-naive patients initiating combination antiretroviral therapy (cART). Our analysis included antiretroviral-naive individuals from a population-based Canadian Observational Cohort that initiated cART after January 1, 2000, and achieved virologic suppression. Multivariable Cox proportional hazards regression was used to examine the association between 1-year and 2-year immunologic discordance and time to death from all-causes. Correlates of immunologic discordance were assessed with logistic regression. Immunologic discordance was observed in 19.9% (404 of 2028) and 10.2% (176 of 1721) of individuals at 1 and 2 years after cART initiation, respectively. Two-year immunologic discordance was associated with an increased risk of death [adjusted hazard ratio = 2.69; 95% confidence interval (CI): 1.26 to 5.78]. One-year immunologic discordance was not associated with death (adjusted hazard ratio = 1.12; 95% CI: 0.54 to 2.30). Two-year immunologic discordance was associated with older age (aOR per decade = 1.23; 95% CI: 1.03 to 1.48), male gender (aOR = 1.86; 95% CI: 1.09 to 3.16), injection drug use (aOR = 2.75; 95% CI: 1.81 to 4.17), and lower baseline CD4+ count (aOR per 100 cells = 0.24; 95% CI: 0.19 to 0.31) and viral load (aOR per log10 copies/mL = 0.46; 95% CI: 0.33 to 0.65). Immunologic discordance after 2 years of cART in antiretroviral-naive individuals was significantly associated with an increased risk of mortality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, Sabine, E-mail: muellers@neuropeds.ucsf.edu; Fullerton, Heather J.; Stratton, Kayla
Purpose: To test the hypotheses that (1) the increased risk of stroke conferred by childhood cranial radiation therapy (CRT) persists into adulthood; and (2) atherosclerotic risk factors further increase the stroke risk in cancer survivors. Methods and Materials: The Childhood Cancer Survivor Study is a multi-institutional retrospective cohort study of 14,358 5-year survivors of childhood cancer and 4023 randomly selected sibling controls with longitudinal follow-up. Age-adjusted incidence rates of self-reported late-occurring (≥5 years after diagnosis) first stroke were calculated. Multivariable Cox proportional hazards models were used to identify independent stroke predictors. Results: During a mean follow-up of 23.3 years, 292 survivors reported a late-occurring stroke. The age-adjusted stroke rate per 100,000 person-years was 77 (95% confidence interval [CI] 62-96), compared with 9.3 (95% CI 4-23) for siblings. Treatment with CRT increased stroke risk in a dose-dependent manner: hazard ratio 5.9 (95% CI 3.5-9.9) for 30-49 Gy CRT and 11.0 (7.4-17.0) for 50+ Gy CRT. The cumulative stroke incidence in survivors treated with 50+ Gy CRT was 1.1% (95% CI 0.4-1.8%) at 10 years after diagnosis and 12% (95% CI 8.9-15.0%) at 30 years. Hypertension increased stroke hazard by 4-fold (95% CI 2.8-5.5) and in black survivors by 16-fold (95% CI 6.9-36.6). Conclusion: Young adult pediatric cancer survivors have an increased stroke risk that is associated with CRT in a dose-dependent manner. Atherosclerotic risk factors enhanced this risk and should be treated aggressively.
An Update on Mortality in the U.S. Astronaut Corps: 1959-2009
NASA Technical Reports Server (NTRS)
Amirian, E.; Clark, April; Halm, Melissa; Hartnett, Heather
2009-01-01
Although it has now been over 50 years since mankind first ventured into space, the long-term health impacts of human space flight remain largely unknown. Identifying factors that affect survival and prognosis among those who participate in space flight is vitally important, as the era of commercial space flight approaches and NASA prepares for missions to Mars. The Longitudinal Study of Astronaut Health is a prospective study designed to examine trends in astronaut morbidity and mortality. The purpose of this analysis was to describe and explore predictors of overall and cause-specific mortality among individuals selected for the U.S. astronaut corps. All U.S. astronauts (n=321), regardless of flight status, were included in this analysis. Death certificate searches were conducted to ascertain vital status and cause of death through April 2009. Data were collected from medical records and lifestyle questionnaires. Multivariable Cox regression modeling was used to calculate the mortality hazard associated with embarking on space flight, adjusted for sex, race, and age at selection. Between 1959 and 2009, there were 39 (12.1%) deaths. Of these deaths, 18 (46.2%) were due to occupational accidents; 7 (17.9%) were due to other accidents; 6 (15.4%) were attributable to cancer; 6 (15.4%) resulted from cardiovascular/circulatory diseases; and 2 (5.1%) were from other causes. Participation in space flight did not significantly increase mortality hazard over time (adjusted hazard ratio=0.57; 95% confidence interval=0.26-1.26). Because our results are based on a small sample size, future research that includes payload specialists, other space flight participants, and international crew members is warranted to maximize statistical power.
Sleep duration and risk of stroke mortality among Chinese adults: Singapore Chinese health study.
Pan, An; De Silva, Deidre Anne; Yuan, Jian-Min; Koh, Woon-Puay
2014-06-01
Prospective relation between sleep duration and stroke risk is less studied, particularly in Asians. We examined the association between sleep duration and stroke mortality among Chinese adults. The Singapore Chinese Health Study is a population-based cohort of 63 257 Chinese adults aged 45 to 74 years enrolled during 1993 through 1998. Sleep duration at baseline was assessed via in-person interview, and death information during follow-up was ascertained via record linkage with the death registry up to December 31, 2011. Cox proportional hazard models were used to calculate hazard ratios with adjustment for other comorbidities and lifestyle risk factors of stroke mortality. During 926 752 person-years of follow-up, we documented 1381 stroke deaths (322 from hemorrhagic and 1059 from ischemic or nonspecified strokes). Compared with individuals with 7 hours per day of sleep, the multivariate-adjusted hazard ratio (95% confidence interval) of total stroke mortality was 1.25 (1.05-1.50) for ≤5 hours per day (short duration), 1.01 (0.87-1.18) for 6 hours per day, 1.09 (0.95-1.26) for 8 hours per day, and 1.54 (1.28-1.85) for ≥9 hours per day (long duration). The increased risk of stroke death with short (1.54; 1.16-2.03) and long durations of sleep (1.95; 1.48-2.57) was seen among subjects with a history of hypertension, but not in those without hypertension. These findings were limited to risk of death from ischemic or nonspecified stroke, but not observed for hemorrhagic stroke. Both short and long sleep durations are associated with increased risk of stroke mortality in a Chinese population, particularly among those with a history of hypertension. © 2014 American Heart Association, Inc.
Mortality among African American women with sarcoidosis: Data from the Black Women’s Health Study
Tukey, Melissa H.; Berman, Jeffrey S.; Boggs, Deborah A; White, Laura F.; Rosenberg, Lynn; Cozier, Yvette C.
2013-01-01
Rationale Sarcoidosis is a chronic systemic granulomatous disease of unknown etiology that disproportionately affects black females. Few studies have specifically addressed causes of death in this population. Objectives To assess rates and causes of death among women with sarcoidosis in a prospective cohort study of U.S. black women. Methods The Black Women’s Health Study is a follow-up study of 59,000 U.S. black women aged 21–69 (median age 38) at entry in 1995. Data on demographic and lifestyle factors and medical conditions, including sarcoidosis, were obtained through biennial questionnaires. Deaths and causes of death from 1995 through 2009 among study subjects were identified from National Death Index data. Measurements We assessed mortality rates among women with and without a history of sarcoidosis. Poisson regression models were used to estimate age-adjusted mortality rates. Cox proportional-hazards models were used to estimate hazard ratios for mortality and 95% confidence intervals (95% CI). Main Results A total of 121 deaths occurred among 1,192 women with a history of sarcoidosis and 2813 deaths among women without the diagnosis. Mortality was greater at every age among women with sarcoidosis and the overall multivariable-adjusted hazard ratio was 2.44 (95% CI 2.03–2.93, p<0.0001). Of the deaths among women with sarcoidosis, 24.7% were directly attributable to sarcoidosis. Conclusions In the Black Women’s Health Study, women with sarcoidosis were more than twice as likely to die as women without the disease, with many of the deaths directly attributable to sarcoidosis. Sarcoidosis is an important cause of premature death among black women with the disease. PMID:24071884
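Age-adjusted mortality rates of the kind estimated above are commonly obtained from a Poisson model for death counts with log person-years as an offset. A small sketch with statsmodels on invented grouped data (the age bands, counts, and person-years are hypothetical, not the Black Women's Health Study data):

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical grouped data: deaths and person-years by age band and
# sarcoidosis history.
grouped = pd.DataFrame({
    "age_band":     ["<40", "40-54", "55+", "<40", "40-54", "55+"],
    "sarcoidosis":  [1, 1, 1, 0, 0, 0],
    "deaths":       [10, 45, 66, 150, 900, 1763],
    "person_years": [4000, 7000, 5000, 200000, 350000, 250000],
})

# Poisson model for death counts with log person-years as offset;
# exp(coef) for sarcoidosis is the age-adjusted mortality rate ratio.
model = smf.glm(
    "deaths ~ sarcoidosis + C(age_band)",
    data=grouped,
    family=sm.families.Poisson(),
    offset=np.log(grouped["person_years"]),
).fit()
print(model.summary())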
Sleep-disordered Breathing and Cancer Mortality
Peppard, Paul E.; Young, Terry; Finn, Laurel; Hla, Khin Mae; Farré, Ramon
2012-01-01
Rationale: Sleep-disordered breathing (SDB) has been associated with total and cardiovascular mortality, but an association with cancer mortality has not been studied. Results from in vitro and animal studies suggest that intermittent hypoxia promotes cancer tumor growth. Objectives: The goal of the present study was to examine whether SDB is associated with cancer mortality in a community-based sample. Methods: We used 22-year mortality follow-up data from the Wisconsin Sleep Cohort sample (n = 1,522). SDB was assessed at baseline with full polysomnography. SDB was categorized using the apnea-hypopnea index (AHI) and the hypoxemia index (percent sleep time below 90% oxyhemoglobin saturation). The hazards of cancer mortality across levels of SDB severity were compared using crude and multivariate analyses. Measurements and Main Results: Adjusting for age, sex, body mass index, and smoking, SDB was associated with total and cancer mortality in a dose–response fashion. Compared with normal subjects, the adjusted relative hazards of cancer mortality were 1.1 (95% confidence interval [CI], 0.5–2.7) for mild SDB (AHI, 5–14.9), 2.0 (95% CI, 0.7–5.5) for moderate SDB (AHI, 15–29.9), and 4.8 (95% CI, 1.7–13.2) for severe SDB (AHI ≥ 30) (P-trend = 0.0052). For categories of increasing severity of the hypoxemia index, the corresponding relative hazards were 1.6 (95% CI, 0.6–4.4), 2.9 (95% CI, 0.9–9.8), and 8.6 (95% CI, 2.6–28.7). Conclusions: Our study suggests that baseline SDB is associated with increased cancer mortality in a community-based sample. Future studies that replicate our findings and look at the association between sleep apnea and survival after cancer diagnosis are needed. PMID:22610391
Can echocardiographic findings predict falls in older persons?
van der Velde, Nathalie; Stricker, Bruno H Ch; Roelandt, Jos R T C; Ten Cate, Folkert J; van der Cammen, Tischa J M
2007-07-25
The European and American guidelines state the need for echocardiography in patients with syncope. 50% of older adults with syncope present with a fall. Nonetheless, up to now no data have been published addressing echocardiographic abnormalities in older fallers. In order to determine the association between echocardiographic abnormalities and falls in older adults, we performed a prospective cohort study, in which 215 new consecutive referrals (age 77.4, SD 6.0) of a geriatric outpatient clinic of a Dutch university hospital were included. During the previous year, 139 had experienced a fall. At baseline, all patients underwent routine two-dimensional and Doppler echocardiography. Falls were recorded during a three-month follow-up. Multivariate adjustment for confounders was performed with a Cox proportional hazards model. 55 patients (26%) fell at least once during follow-up. The adjusted hazard ratio of a fall during follow-up was 1.35 (95% CI, 1.08-1.71) for pulmonary hypertension, 1.66 (95% CI, 1.01 to 2.89) for mitral regurgitation, 2.41 (95% CI, 1.32 to 4.37) for tricuspid regurgitation and 1.76 (95% CI, 1.03 to 3.01) for pulmonary regurgitation. For aortic regurgitation the risk of a fall was also increased, but non-significantly (hazard ratio, 1.57 [95% CI, 0.85 to 2.92]). Trend analysis of the severity of the different regurgitations showed a significant relationship for mitral, tricuspid and pulmonary valve regurgitation and pulmonary hypertension. Echo (Doppler) cardiography can be useful in order to identify risk indicators for falling. Presence of pulmonary hypertension or regurgitation of mitral, tricuspid or pulmonary valves was associated with a higher fall risk. Our study indicates that the diagnostic work-up for falls in older adults might be improved by adding an echo (Doppler) cardiogram in selected groups.
Ferrante, Lauren E; Murphy, Terrence E; Gahbauer, Evelyne A; Leo-Summers, Linda S; Pisani, Margaret A; Gill, Thomas M
2018-05-01
Cognitive impairment is common among older adults, yet little is known about the association of pre-intensive care unit cognitive status with outcomes relevant to older adults maintaining independence after a critical illness. To evaluate whether pre-intensive care unit cognitive status is associated with post-intensive care unit disability, new nursing home admission, and mortality after a critical illness among older adults. In this prospective cohort study, 754 persons aged 70 years or more were monitored from March 1998 to December 2013 with monthly assessments of disability. Cognitive status was assessed every 18 months, using the Mini-Mental State Examination (range, 0-30), with scores classified as 28 or higher (cognitively intact), 24-27 (minimal impairment), and less than 24 (moderate impairment). The primary outcome was disability count (range, 0-13), assessed monthly over 6 months after an intensive care unit stay. The secondary outcomes were incident nursing home admission and time to death after intensive care unit admission. The analytic sample included 391 intensive care unit admissions. The mean age was 83.5 years. The prevalence of moderate impairment, minimal impairment, and intact cognition (the comparison group) was 17.3, 46.2, and 36.5%, respectively. In the multivariable analysis, moderate impairment was associated with nearly a 20% increase in disability over the 6-month follow-up period (adjusted relative risk, 1.19; 95% confidence interval, 1.04-1.36), and minimal impairment was associated with a 16% increase in post-intensive care unit disability (adjusted relative risk, 1.16; 95% confidence interval, 1.02-1.32). Moderate impairment was associated with more than double the likelihood of a new nursing home admission (adjusted odds ratio, 2.37; 95% confidence interval, 1.01-5.55). Survival differed significantly across the three cognitive groups (log-rank P = 0.002), but neither moderate impairment (adjusted hazard ratio, 1.19; 95% confidence interval, 0.65-2.19) nor minimal impairment (adjusted hazard ratio, 1.00; 95% confidence interval, 0.61-1.62) was significantly associated with mortality in the multivariable analysis. Among older adults, any impairment (even minimal) in pre-intensive care unit cognitive status was associated with an increase in post-intensive care unit disability over the 6 months after a critical illness; moderate cognitive impairment doubled the likelihood of a new nursing home admission. Pre-intensive care unit cognitive impairment was not associated with mortality from intensive care unit admission through 6 months of follow-up. Pre-intensive care unit cognitive status may provide prognostic information about the likelihood of older adults maintaining independence after a critical illness.
Association of the Timing of Pregnancy With Survival in Women With Breast Cancer
Iqbal, Javaid; Amir, Eitan; Rochon, Paula A.; Giannakeas, Vasily; Sun, Ping
2017-01-01
Importance Increasing numbers of women experience pregnancy around the time of, or after, a diagnosis of breast cancer. Understanding the effect of pregnancy on survival in women with breast cancer will help in the counseling and treatment of these women. Objective To compare the overall survival of women diagnosed with breast cancer during pregnancy or in the postpartum period with that of women who had breast cancer but did not become pregnant. Design, Setting, and Participants This population-based, retrospective cohort study linked health administrative databases in Ontario, Canada, comprising 7553 women aged 20 to 45 years at the time of diagnosis with invasive breast cancer, from January 1, 2003, to December 31, 2014. Exposures Any pregnancy in the period from 5 years before, until 5 years after, the index date of the diagnosis of breast cancer. Women were classified into the following 4 exposure groups: no pregnancy (the referent), pregnancy before breast cancer, pregnancy-associated breast cancer, and pregnancy following breast cancer. Main Outcomes and Measures Five-year actuarial survival rates for all exposure groups, age-adjusted and multivariable hazard ratios [HRs] of pregnancy for overall survival for all exposure groups, and time-dependent hazard ratios for women with pregnancy following breast cancer. Results Among the 7553 women in the study (mean age at diagnosis, 39.1 years; median, 40 years; range, 20-44 years), the 5-year actuarial survival rate was 87.5% (95% CI, 86.5%-88.4%) for women with no pregnancy, 85.3% (95% CI, 82.8%-87.8%) for women with pregnancy before breast cancer (age-adjusted hazard ratio, 1.03; 95% CI, 0.85-1.27; P = .73), and 82.1% (95% CI, 78.3%-85.9%) for women with pregnancy-associated breast cancer (age-adjusted hazard ratio, 1.18; 95% CI, 0.91-1.53; P = .20). The 5-year actuarial survival rate was 96.7% (95% CI, 94.1%-99.3%) for women who had pregnancy 6 months or more after diagnosis of breast cancer, vs 87.5% (95% CI, 86.5%-88.4%) for women with no pregnancy (age-adjusted HR, 0.22; 95% CI, 0.10-0.49; P < .001). Conclusions and Relevance Pregnancy did not adversely affect survival in women with breast cancer. For breast cancer survivors who wish to conceive, the risk of death is lowest if pregnancy occurs 6 months or more after diagnosis. PMID:28278319
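The time-dependent hazard ratios mentioned above are typically obtained by letting the pregnancy exposure switch on at the time it occurs, so that pre-pregnancy follow-up is not counted as exposed. A minimal sketch of that setup with the lifelines CoxTimeVaryingFitter on invented long-format data (ids, intervals, and outcomes are hypothetical; the published model was more elaborate):

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: each row is an interval (start, stop] during
# which pregnancy status is constant. Subject 1 becomes pregnant at 2 years,
# so her follow-up is split into an unexposed and an exposed interval.
long_df = pd.DataFrame({
    "id":       [1,   1,   2,   3,   3,   4,   5,   5,   6],
    "start":    [0.0, 2.0, 0.0, 0.0, 1.0, 0.0, 0.0, 3.0, 0.0],
    "stop":     [2.0, 5.0, 4.0, 1.0, 3.5, 2.5, 3.0, 6.0, 4.5],
    "pregnant": [0,   1,   0,   0,   1,   0,   0,   1,   0],
    "death":    [0,   0,   1,   0,   1,   1,   0,   0,   0],
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop",
        event_col="death")
ctv.print_summary()  # exp(coef) for 'pregnant' is the hazard ratio while exposed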
Risk of Colchicine-Associated Myopathy in Gout: Influence of Concomitant Use of Statin.
Kwon, Oh Chan; Hong, Seokchan; Ghang, Byeongzu; Kim, Yong-Gil; Lee, Chang-Keun; Yoo, Bin
2017-05-01
The purpose of this study was to investigate the risk of myopathy when statins are coadministered with colchicine in patients with gout. In gout patients who received colchicine with or without statin, clinical data collected included medications and history of hypertension, chronic kidney disease, and liver cirrhosis. Myopathy was defined as the presence of muscle symptoms with elevated creatine kinase or myoglobin. Multivariate analysis was performed to identify risk factors for myopathy. Inverse probability of treatment weighting (IPTW)-adjusted analysis was used to evaluate the influence of concomitant colchicine and statin use on myopathy. Of 674 patients, 486 received colchicine alone and 188 also received statin. The incidence of myopathy was not significantly higher in those on both drugs than in those on colchicine alone (2.7% vs 1.4%, P = .330). On multivariate analysis, chronic kidney disease (hazard ratio [HR] 29.056; 95% confidence interval [CI], 4.387-192.450; P <.001), liver cirrhosis (HR 10.676; 95% CI, 1.279-89.126; P = .029), higher colchicine dose (HR 20.960; 95% CI, 1.835-239.481; P = .014), and concomitant CYP3A4 inhibitor (HR 12.027; 95% CI, 2.743-52.725; P = .001) were associated with increased risk of myopathy. Concomitant use of statins, however, was not, even after adjusting for confounders (HR 1.123; 95% CI, 0.262-4.814; P = .875; IPTW-adjusted HR 0.321; 95% CI, 0.077-1.345; P = .120). Concomitant use of statin and colchicine was not associated with increased risk of myopathy. Thus, concomitant use of statin with colchicine seems to be safe from myotoxicity in gout patients. Copyright © 2017 Elsevier Inc. All rights reserved.
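The IPTW adjustment used above weights each patient by the inverse of the probability of the treatment actually received, estimated from a propensity model, and then fits a weighted outcome model. A minimal sketch on invented data (column names, confounders, and values are hypothetical; the published analysis used more covariates):

import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# Hypothetical patient-level data: 'statin' = statin co-prescription, with
# follow-up time, myopathy indicator, and two confounders.
df = pd.DataFrame({
    "statin":   [1, 0, 1, 0, 1, 0, 1, 0, 0, 1],
    "ckd":      [1, 0, 0, 0, 1, 1, 0, 0, 1, 0],
    "age":      [70, 55, 62, 48, 75, 66, 58, 51, 69, 60],
    "time":     [1.0, 2.5, 3.0, 4.0, 0.8, 2.0, 3.5, 4.2, 1.5, 2.8],
    "myopathy": [1,   0,   0,   0,   1,   0,   0,   0,   1,   0],
})

# 1) Propensity score: probability of receiving a statin given confounders.
ps_model = LogisticRegression().fit(df[["ckd", "age"]], df["statin"])
ps = ps_model.predict_proba(df[["ckd", "age"]])[:, 1]

# 2) Inverse probability of treatment weights (unstabilised).
df["iptw"] = df["statin"] / ps + (1 - df["statin"]) / (1 - ps)

# 3) Weighted Cox model; robust=True requests sandwich standard errors.
cph = CoxPHFitter()
cph.fit(df[["time", "myopathy", "statin", "iptw"]],
        duration_col="time", event_col="myopathy",
        weights_col="iptw", robust=True)
cph.print_summary()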
Sakurai, M; Nakamura, K; Miura, K; Takamura, T; Yoshita, K; Nagasawa, S Y; Morikawa, Y; Ishizaki, M; Kido, T; Naruse, Y; Suwazono, Y; Sasaki, S; Nakagawa, H
2014-02-01
This cohort study investigated the association between sugar-sweetened beverage (SSB) and diet soda consumption and the incidence of type 2 diabetes in Japanese men. The participants were 2,037 employees of a factory in Japan. We measured consumption of SSB and diet soda using a self-administered diet history questionnaire. The incidence of diabetes was determined in annual medical examinations over a 7-year period. Hazard ratios (HRs) with 95 % confidence intervals (CIs) for diabetes were estimated after adjusting for age, body mass index, family history, and dietary and other lifestyle factors. During the study, 170 participants developed diabetes. The crude incidence rates (/1,000 person-years) across participants who were rare/never SSB consumers, <1 serving/week, ≥ 1 serving/week and <1 serving/day, and ≥ 1 serving/day were 15.5, 12.7, 14.9, and 17.4, respectively. The multivariate-adjusted HR compared to rare/never SSB consumers was 1.35 (95 % CI 0.80-2.27) for participants who consumed ≥ 1 serving/day SSB. Diet soda consumption was significantly associated with the incident risk of diabetes (P for trend = 0.013), and multivariate-adjusted HRs compared to rare/never diet soda consumers were 1.05 (0.62-1.78) and 1.70 (1.13-2.55), respectively, for participants who consumed <1 serving/week and ≥ 1 serving/week. Consumption of diet soda was significantly associated with an increased risk for diabetes in Japanese men. Diet soda is not always effective at preventing type 2 diabetes even though it is a zero-calorie drink.
Eguaras, Sonia; Bes-Rastrollo, Maira; Ruiz-Canela, Miguel; Carlos, Silvia; de la Rosa, Pedro; Martínez-González, Miguel A
2017-05-01
The Mediterranean diet (MedDiet) may mitigate the adverse effects of obesity on the incidence of type 2 diabetes mellitus (T2DM). We assessed this hypothesis in a cohort of 18 225 participants initially free of diabetes (mean age: 38 years, 61 % women). A validated semi-quantitative 136-item FFQ was used to assess dietary intake and to build a 0-9 score of adherence to MedDiet. After a median of 9·5-year follow-up, 136 incident cases of T2DM were confirmed during 173 591 person-years follow-up. When MedDiet adherence was low (≤4 points), the multivariable-adjusted hazard ratios (HR) were 4·07 (95 % CI 1·58, 10·50) for participants with BMI 25-29·99 kg/m2 and 17·70 (95 % CI 6·29, 49·78) for participants with BMI ≥30 kg/m2 (v. those with BMI <25 kg/m2). When MedDiet adherence was higher (>4 points), these multivariable-adjusted HR were 3·13 (95 % CI 1·63, 6·01) and 10·70 (95 % CI 4·98, 22·99) for BMI 25-30 and ≥30 kg/m2, respectively. The P value for the interaction was statistically significant (P=0·002). When we assessed both variables (BMI and MedDiet) as continuous, the P value for their interaction product-term was marginally significant (P=0·051) in fully adjusted models. This effect modification was not explained by weight changes during follow-up. Our results suggest that the MedDiet may attenuate the adverse effects of obesity on the risk of T2DM.
Marquis-Gravel, Guillaume; Matteau, Alexis; Potter, Brian J; Gobeil, François; Noiseux, Nicolas; Stevens, Louis-Mathieu; Mansour, Samer
2017-01-01
Background The place of drug-eluting balloons (DEB) in the treatment of in-stent restenosis (ISR) is not well-defined, particularly in a population of all-comers with acute coronary syndromes (ACS). Objective Compare the clinical outcomes of DEB with second-generation drug-eluting stents (DES) for the treatment of ISR in a real-world population with a high proportion of ACS. Methods A retrospective analysis of consecutive patients with ISR treated with a DEB compared to patients treated with a second-generation DES was performed. The primary endpoint was a composite of major adverse cardiovascular events (MACE: all-cause death, non-fatal myocardial infarction, and target lesion revascularization). Comparisons were performed using Cox proportional hazards multivariate adjustment and Kaplan-Meier analysis with log-rank. Results The cohort included 91 patients treated with a DEB and 89 patients treated with a DES (74% ACS). Median follow-up was 26 months. MACE occurred in 33 patients (36%) in the DEB group, compared to 17 patients (19%) in the DES group (p log-rank = 0.02). After multivariate adjustment, there was no significant difference between the groups (HR for DEB = 1.45 [95%CI: 0.75-2.83]; p = 0.27). Mortality rates at 1 year were 11% with DEB, and 3% with DES (p = 0.04; adjusted HR = 2.85 [95%CI: 0.98-8.32]; p = 0.06). Conclusion In a population with a high proportion of ACS, a non-significant numerical signal towards increased rates of MACE with DEB compared to second-generation DES for the treatment of ISR was observed, mainly driven by a higher mortality rate. An adequately-powered randomized controlled trial is necessary to confirm these findings. PMID:28977052
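A Kaplan-Meier comparison with a log-rank test, as used above, can be sketched as follows with the lifelines package; the follow-up times, event indicators, and group labels are invented for illustration only.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up data (months) for two treatment groups; 'mace' marks
# whether a major adverse cardiovascular event occurred before censoring.
df = pd.DataFrame({
    "group":  ["DEB"]*6 + ["DES"]*6,
    "months": [3, 12, 26, 8, 30, 15, 26, 24, 30, 18, 27, 30],
    "mace":   [1,  1,  0, 1,  0,  1,  0,  1,  0,  0,  1,  0],
})

# Event-free survival curve for each group.
kmf = KaplanMeierFitter()
for name, grp in df.groupby("group"):
    kmf.fit(grp["months"], event_observed=grp["mace"], label=name)
    print(kmf.survival_function_)

# Log-rank test comparing the two curves.
deb = df[df["group"] == "DEB"]
des = df[df["group"] == "DES"]
result = logrank_test(deb["months"], des["months"],
                      event_observed_A=deb["mace"], event_observed_B=des["mace"])
print(result.p_value)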
Redaniel, Maria Theresa; Laudico, Adriano; Mirasol-Lumague, Maria Rica; Gondos, Adam; Uy, Gemma Leonora; Toral, Jean Ann; Benavides, Doris; Brenner, Hermann
2009-08-01
Few studies have assessed and compared cervical cancer survival between developed and developing countries, or between ethnic groups within a country. Fewer still have addressed how much of the international or interracial survival differences can be attributed to ethnicity or health care. To determine the role of ethnicity and health care, 5-year survival of patients with cervical cancer was compared between patients in the Philippines and Filipino-Americans, who have the same ethnicity, and between Filipino-Americans and Caucasians, who have the same health care system. Cervical cancer databases from the Manila and Rizal Cancer Registries and Surveillance, Epidemiology, and End Results 13 were used. Age-adjusted 5-year survival estimates were computed and compared between the three patient groups. Using Cox proportional hazards modeling, potential determinants of survival differences were examined. Overall 5-year relative survival was similar in Filipino-Americans (68.8%) and Caucasians (66.6%), but was lower for Philippine residents (42.9%). Although late stage at diagnosis explained a large proportion of the survival differences between Philippine residents and Filipino-Americans, excess mortality prevailed after adjustment for stage, age, and morphology in multivariate analysis [relative risk (RR), 2.07; 95% confidence interval (CI), 1.68-2.55]. Excess mortality decreased, but persisted, when treatments were included in the multivariate models (RR, 1.78; 95% CI, 1.41-2.23). A moderate, marginally significant excess mortality was found among Caucasians compared with Filipino-Americans (adjusted RR, 1.22; 95% CI, 1.01-1.47). The differences in cervical cancer survival between patients in the Philippines and in the United States highlight the importance of enhanced health care and access to diagnostic and treatment facilities in the Philippines.
Maisonneuve, Patrick; Shivappa, Nitin; Hébert, James R; Bellomi, Massimo; Rampinelli, Cristiano; Bertolotti, Raffaella; Spaggiari, Lorenzo; Palli, Domenico; Veronesi, Giulia; Gnagnarella, Patrizia
2016-04-01
To test whether the inflammatory potential of diet, as measured using the dietary inflammatory index (DII), is associated with risk of lung cancer or other respiratory conditions and to compare results obtained with those based on the aMED score, an established dietary index that measures adherence to the traditional Mediterranean diet. In 4336 heavy smokers enrolled in a prospective, non-randomized lung cancer screening program, we measured participants' diets at baseline using a self-administered food frequency questionnaire from which dietary scores were calculated. Cox proportional hazards and logistic regression models were used to assess association between the dietary indices and lung cancer diagnosed during annual screening, and other respiratory outcomes that were recorded at baseline, respectively. In multivariable analysis, adjusted for baseline lung cancer risk (estimated from age, sex, smoking history, and asbestos exposure) and total energy, both DII and aMED scores were associated with dyspnoea (p trend = 0.046 and 0.02, respectively) and radiological evidence of emphysema (p trend = 0.0002 and 0.02). After mutual adjustment of the two dietary scores, only the association between DII and radiological evidence of emphysema (Q4 vs. Q1, OR 1.30, 95 % CI 1.01-1.67, p trend = 0.012) remained statistically significant. At univariate analysis, both DII and aMED were associated with lung cancer risk, but in fully adjusted multivariate analysis, only the association with aMED remained statistically significant (p trend = 0.04). Among heavy smokers, a pro-inflammatory diet, as indicated by increasing DII score, is associated with dyspnoea and radiological evidence of emphysema. A traditional Mediterranean diet, which is associated with a lower DII, may lower lung cancer risk.
Unsafe sexual behaviour associated with hazardous alcohol use among street-involved youth
Fairbairn, Nadia; Wood, Evan; Dong, Huiru; Kerr, Thomas; DeBeck, Kora
2016-01-01
While risky sexual behaviours related to illicit drug use among street youth have been explored, the impacts of alcohol use have received less attention. This longitudinal study examined hazardous alcohol use among a population of street-involved youth, with particular attention to sexual and drug-related risk behaviours. Data were derived from the At-Risk Youth Study, a prospective cohort of street-involved youth in Vancouver, Canada. The outcome of interest was hazardous alcohol use defined by the US National Institute on Alcohol Abuse and Alcoholism. We used generalized estimating equations (GEEs) analyses to identify factors associated with hazardous alcohol use. Between 2005 and 2014, 1149 drug-using youth were recruited and 629 (55%) reported hazardous alcohol use in the previous 6 months during study follow-up. In multivariable GEE analyses, unprotected sex (adjusted odds ratio [AOR] = 1.28, 95% confidence interval [95% CI] = 1.12–1.46) and homelessness (AOR = 1.35, 95% CI = 1.19–1.54) were independently associated with hazardous alcohol use (all p < .001). Older age (AOR = 0.95, 95% CI = 0.92–0.99), Caucasian ethnicity (AOR = 0.74, 95% CI = 0.61–0.90), daily heroin use (AOR = 0.53, 95% CI = 0.42– 0.67), daily crack cocaine smoking (AOR = 0.73, 95% CI = 0.59–0.91), and daily crystal methamphetamine use (AOR = 0.52, 95% CI = 0.42–0.64) were negatively associated with hazardous alcohol use (all p < .05). In sub-analysis, consistent dose–response patterns were observed between levels of alcohol use and unprotected sex, homelessness, and daily heroin injection. In sum, hazardous alcohol use was positively associated with unsafe sexual behaviour and negatively associated with high-intensity drug use. Interventions to address hazardous alcohol use should be central to HIV prevention efforts for street-involved youth. PMID:27539676
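The GEE analysis described above fits a logistic model while accounting for the correlation among repeated interviews of the same participant via a working correlation structure. A minimal sketch with statsmodels on invented panel data (participant ids, variables, and values are hypothetical):

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical repeated-measures data: each youth ('pid') contributes several
# semi-annual interviews; the outcome is hazardous drinking at that interview.
panel = pd.DataFrame({
    "pid":               [1, 1, 1, 2, 2, 3, 3, 3, 4, 4],
    "hazardous_alcohol": [1, 1, 0, 0, 0, 1, 0, 1, 0, 1],
    "unprotected_sex":   [1, 0, 0, 0, 1, 1, 0, 1, 0, 1],
    "homeless":          [1, 1, 0, 0, 0, 1, 1, 0, 0, 1],
    "age":               [19, 19, 20, 22, 23, 18, 19, 19, 24, 25],
})

# Logistic GEE with an exchangeable working correlation to account for
# repeated observations within the same participant.
model = smf.gee(
    "hazardous_alcohol ~ unprotected_sex + homeless + age",
    groups="pid",
    data=panel,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(model.summary())  # exponentiated coefficients give the odds ratios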
Association of Race with Mortality and Cardiovascular Events in a Large Cohort of US Veterans
Kovesdy, Csaba P.; Norris, Keith C.; Boulware, L. Ebony; Lu, Jun L.; Ma, Jennie Z.; Streja, Elani; Molnar, Miklos Z.; Kalantar-Zadeh, Kamyar
2015-01-01
Background In the general population African-Americans experience higher mortality than their white peers, attributed, in part, to their lower socio-economic status, reduced access to care and possibly intrinsic biologic factors. A notable exception are patients with kidney disease, among whom African-Americans experience lower mortality. It is unclear if similar differences affecting outcomes exist in patients with no kidney disease but with similar access to health care. Methods and Results We compared all-cause mortality, incident coronary heart disease (CHD) and incident ischemic stroke using multivariable adjusted Cox models in a nationwide cohort of 547,441 African-American and 2,525,525 white patients with baseline estimated glomerular filtration rate (eGFR) ≥60 ml/min/1.73m2 receiving care from the US Veterans Health Administration. In parallel analyses we compared outcomes in African-American vs. white individuals in the National Health and Nutrition Examination Survey 1999–2004 (NHANES). After multivariable adjustments in veterans, African-American race was associated with 24% lower all-cause mortality (adjusted hazard ratio (aHR), 95% confidence interval (CI): 0.76, 0.75–0.77, p<0.001) and 37% lower incidence of CHD (aHR, 95%CI: 0.63, 0.62–0.65, p<0.001), but similar incidence of ischemic stroke (aHR, 95%CI: 0.99, 0.97–1.01, p=0.3). African-American race was associated with a 42% higher adjusted mortality among individuals with eGFR≥60 ml/min/1.73m2 in NHANES (aHR, 95%CI: 1.42 (1.09–1.87)). Conclusions African-American veterans with normal eGFR have lower all-cause mortality and incidence of CHD, and similar incidence of ischemic stroke. These associations are in contrast with the higher mortality experienced by African-American individuals in the general US population. PMID:26384521
Coffee Intake and Incidence of Erectile Dysfunction.
Lopez, David S; Liu, Lydia; Rimm, Eric B; Tsilidis, Konstantinos K; de Oliveira Otto, Marcia; Wang, Run; Canfield, Steven; Giovannucci, Edward
2018-05-01
Coffee intake is suggested to have a positive impact on chronic diseases, yet its role in urological diseases such as erectile dysfunction (ED) remains unclear. We investigated the association of coffee intake with incidence of ED in a prospective analysis of 21,403 men aged 40-75 years in the Health Professionals Follow-Up Study. Total, regular, and decaffeinated coffee intakes were self-reported on food frequency questionnaires. ED was assessed by means of questionnaires in 2000, 2004, and 2008. Multivariable adjusted Cox proportional hazards models were used to compute hazard ratios for incident ED (n = 7,298 cases). No significant association with incident ED was identified when comparing the highest (≥4 cups/day) with the lowest (0 cups/day) categories of total (hazard ratio (HR) = 1.00, 95% confidence interval (CI): 0.90, 1.11) and regular coffee intakes (HR = 1.00, 95% CI: 0.89, 1.13). When comparing the highest category with the lowest category of decaffeinated coffee intake, we found a 37% increased risk of ED (HR = 1.37, 95% CI: 1.08, 1.73), with a significant trend (P trend = 0.02). Stratified analyses also showed an association among current smokers (P trend = 0.005). Overall, long-term coffee intake was not associated with risk of ED in this prospective cohort study.
Acupuncture Therapy and Incidence of Depression After Stroke.
Lu, Chung-Yen; Huang, Hsin-Chia; Chang, Hen-Hong; Yang, Tsung-Hsien; Chang, Chee-Jen; Chang, Su-Wei; Chen, Pei-Chun
2017-06-01
We investigated whether use of acupuncture within a 3-month poststroke period after hospital discharge is associated with reduced risk of depression. This cohort study included 16 046 patients aged ≥18 years with an initial hospitalization for stroke during 2000 and 2012 in the claims database of a universal health insurance program. Patients who had received acupuncture therapies within 3 months of discharge were defined as acupuncture users (n=1714). All patients were followed up for incidence of depression until the end of 2013. We assessed the association between use of acupuncture and incidence of depression using Cox proportional hazards models in all subjects and in propensity score-matched samples consisting of 1714 pairs of users and nonusers. During the follow-up period, the incidence of depression per 1000 person-years was 11.1 and 9.7 in users and nonusers, respectively. Neither multivariable-adjusted Cox models (hazard ratio, 1.04; 95% confidence interval, 0.84-1.29) nor the propensity score-matching model (hazard ratio, 1.06; 95% confidence interval, 0.79-1.42) revealed an association between use of acupuncture and incidence of depression. In patients admitted to hospital for stroke, acupuncture therapy within 3 months after discharge was not associated with subsequent incidence of depression. © 2017 American Heart Association, Inc.
Brenowitz, Willa D; Kukull, Walter A; Beresford, Shirley A A; Monsell, Sarah E; Williams, Emily C
2014-01-01
Social relationships are hypothesized to prevent or slow cognitive decline. We sought to evaluate associations between social relationships and mild cognitive impairment (MCI). Participants from the National Alzheimer's Coordinating Center database who were cognitively normal, aged 55 and older at baseline, and had at least 2 in-person visits (n=5335) were included. Multivariable Cox proportional hazard models evaluated the association between 4 social relationships at baseline (marital status, living situation, having children, and having siblings) and risk of developing MCI (on the basis of clinician diagnosis following established criteria). Primary models were adjusted for baseline demographics. Participants were followed, on average, for 3.2 years; 15.2% were diagnosed with MCI. Compared with married participants, risk of MCI was significantly lower for widowed participants (hazard ratio: 0.87; 95% confidence interval: 0.76, 0.99) but not for divorced/separated or never-married participants. Compared with living with a spouse/partner, risk of MCI was significantly higher for living with others (hazard ratio: 1.35; 95% confidence interval: 1.03, 1.77) but not for living alone. Risk of MCI was not associated with having children or having siblings. These results did not consistently identify social relationships as a strong risk factor for, or independent clinical predictor of, MCI.
Connelly, Margery A; Gruppen, Eke G; Wolak-Dinsmore, Justyna; Matyus, Steven P; Riphagen, Ineke J; Shalaurova, Irina; Bakker, Stephan J L; Otvos, James D; Dullaart, Robin P F
2016-01-15
GlycA is a recently developed glycoprotein biomarker of systemic inflammation that may be predictive of incident type 2 diabetes mellitus (T2DM). Analytical performance of the GlycA test, measured on the Vantera® Clinical Analyzer, was evaluated. To test its prospective association with T2DM, GlycA was measured in 4524 individuals from the PREVEND study and a survival analysis was performed with a mean follow-up period of 7.3y. Imprecision for the GlycA test ranged from 1.3-2.3% and linearity was established between 150 and 1588μmol/l. During the follow-up period, 220 new T2DM cases were ascertained. In analyses adjusted for relevant covariates, GlycA was associated with incident T2DM; hazard ratio (HR) for the highest vs. lowest quartile 1.77 [95% Confidence Interval (CI): 1.10-2.86, P=0.01], whereas the association of high sensitivity C-reactive protein (hsCRP) with T2DM was not significant. GlycA remained associated with incident T2DM after additional adjustment for hsCRP; HR 1.71 [1.00-2.92, P=0.04]. A multivariable adjusted analysis of dichotomized subgroups showed that the hazard for incident T2DM was highest in the subgroup with high GlycA and low hsCRP (P=0.03). The performance characteristics of the GlycA test reveal that it is suitable for clinical applications, including assessment of the risk of future T2DM. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Pulmonary function and adverse cardiovascular outcomes: Can cardiac function explain the link?
Burroughs Peña, Melissa S; Dunning, Allison; Schulte, Phillip J; Durheim, Michael T; Kussin, Peter; Checkley, William; Velazquez, Eric J
2016-12-01
The complex interaction between pulmonary function, cardiac function and adverse cardiovascular events has only been partially described. We sought to describe the association between pulmonary function with left heart structure and function, all-cause mortality and incident cardiovascular hospitalization. This study is a retrospective analysis of patients evaluated in a single tertiary care medical center. We used multivariable linear regression analyses to examine the relationship between FVC and FEV1 with left ventricular ejection fraction (LVEF), left ventricular internal dimension in systole and diastole (LVIDS, LVIDD) and left atrial diameter, adjusting for baseline characteristics, right ventricular function and lung hyperinflation. We also used Cox proportional hazards models to examine the relationship between FVC and FEV1 with all-cause mortality and cardiac hospitalization. A total of 1807 patients were included in this analysis with a median age of 61 years and 50% were female. Decreased FVC and FEV1 were both associated with decreased LVEF. In individuals with FVC less than 2.75 L, decreased FVC was associated with increased all-cause mortality after adjusting for left and right heart echocardiographic variables (hazard ratio [HR] 0.49, 95% CI 0.29, 0.82, respectively). Decreased FVC was associated with increased cardiac hospitalization after adjusting for left heart size (HR 0.80, 95% CI 0.67, 0.96), even in patients with normal LVEF (HR 0.75, 95% CI 0.57, 0.97). In a tertiary care center reduced pulmonary function was associated with adverse cardiovascular events, a relationship that is not fully explained by left heart remodeling or right heart dysfunction. Copyright © 2016 Elsevier Ltd. All rights reserved.
Simpson, Scot H; Lin, Mu; Eurich, Dean T
2017-08-01
Inducement programs can promote customer loyalty; however, the clinical effects of these programs are unknown. To examine relationships among inducement program use, medication adherence, and health outcomes. Alberta residents with ≥ 1 physician visit for diabetes or hypertension between April 2008 and March 2014 were eligible for this study and included if they were new statin users and alive at least 455 days after the first statin dispensation. Group assignment was based on whether all statin dispensations in the first year were obtained from pharmacies with or without inducement programs. Discontinuation was defined as no statin dispensations between 275 and 455 days after the first statin dispensation. Acute coronary syndrome (ACS) hospitalizations or deaths were identified between 456 days and 3 years after the first statin dispensation. Multivariable regression analyses were conducted to examine relationships among inducement program use, discontinuation, and ACS events. Among the 159 998 new statin users, mean age was 60.2 (±13.7) years and 67 534 (42%) were women. Statin discontinuation occurred in 22 455 (28.9%) of 77 803 inducement group participants and 25 816 (31.4%) of 82 195 noninducement group participants (adjusted odds ratio = 0.88; 95% CI = 0.86-0.90). Risk of an ACS event was similar between groups (adjusted hazard ratio = 1.00; 95% CI 0.92-1.08); however, discontinuing statin therapy was associated with a higher risk of an ACS event (adjusted hazard ratio = 1.27; 95% CI = 1.16-1.39). Inducement programs are associated with better adherence and not directly associated with risk of health outcomes.
Survival after acute hemodialysis in Pennsylvania, 2005-2007: a retrospective cohort study.
Ramer, Sarah J; Cohen, Elan D; Chang, Chung-Chou H; Unruh, Mark L; Barnato, Amber E
2014-01-01
Little is known about acute hemodialysis in the US. Here we describe predictors of receipt of acute hemodialysis in one state and estimate the marginal impact of acute hemodialysis on survival after accounting for confounding due to illness severity. This is a retrospective cohort study of acute-care hospitalizations in Pennsylvania from October 2005 to December 2007 using data from the Pennsylvania Health Care Cost Containment Council. Exposure variable is acute hemodialysis; dependent variable is survival following acute hemodialysis. We used multivariable logistic regression to determine propensity to receive acute hemodialysis and then, for a Cox proportional hazards model, matched acute hemodialysis and non-acute hemodialysis patients 1∶5 on this propensity. In 2,131,248 admissions of adults without end-stage renal disease, there were 6,657 instances of acute hemodialysis. In analyses adjusted for predicted probability of death upon admission plus other covariates and stratified on age, being male, black, and insured were independent predictors of receipt of acute hemodialysis. One-year post-admission mortality was 43% for those receiving acute hemodialysis, compared to 13% among those not receiving acute hemodialysis. After matching on propensity to receive acute hemodialysis and adjusting for predicted probability of death upon admission, patients who received acute hemodialysis had a higher risk of death than patients who did not over at least 1 year of follow-up (hazard ratio 1·82, 95% confidence interval 1·68-1·97). In a populous US state, receipt of acute hemodialysis varied by age, sex, race, and insurance status even after adjustment for illness severity. In a comparison of patients with similar propensity to receive acute hemodialysis, those who did receive it were less likely to survive than those who did not. These findings raise questions about reasons for lack of benefit.
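The 1:5 propensity-score matching described above can be sketched as follows: estimate each admission's propensity to receive acute hemodialysis, then pair every treated admission with the five controls closest on that score. The data below are simulated, and this toy version matches with replacement and without a caliper, which may differ from the study's algorithm.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Hypothetical admissions data: 'dialysis' flags receipt of acute hemodialysis,
# and 'severity' / 'age' stand in for the illness-severity covariates.
n = 300
df = pd.DataFrame({
    "severity": rng.normal(size=n),
    "age": rng.integers(30, 90, size=n),
})
df["dialysis"] = (rng.random(n) < 1 / (1 + np.exp(-(0.8 * df["severity"] - 1.5)))).astype(int)

# 1) Propensity to receive acute hemodialysis.
df["ps"] = LogisticRegression().fit(
    df[["severity", "age"]], df["dialysis"]
).predict_proba(df[["severity", "age"]])[:, 1]

treated = df[df["dialysis"] == 1]
controls = df[df["dialysis"] == 0]

# 2) For each treated admission, find the 5 controls closest in propensity.
nn = NearestNeighbors(n_neighbors=5).fit(controls[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = controls.iloc[idx.ravel()]
print(len(treated), "treated matched to", len(matched_controls), "control records")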
Meijide, Héctor; Pértega, Sonia; Rodríguez-Osorio, Iria; Castro-Iglesias, Ángeles; Baliñas, Josefa; Rodríguez-Martínez, Guillermo; Mena, Álvaro; Poveda, Eva
2017-05-15
Cancer is a growing problem in persons living with HIV infection (PLWH) and hepatitis C virus (HCV) coinfection could play an additional role in carcinogenesis. Herein, all cancers in an HIV-mono and HIV/HCV-coinfected cohort were evaluated and compared to identify any differences between these two populations. A retrospective cohort study was conducted including all cancers in PLWH between 1993 and 2014. Cancers were classified in two groups: AIDS-defining cancer (ADC) and non-AIDS-defining cancer (NADC). Cancer incidence rates were calculated and compared with that observed in the Spanish general population (GLOBOCAN, 2012), computing the standardized incidence ratios (SIRs). A competing risk approach was used to estimate the probability of cancer after HIV diagnosis. Cumulative incidence in HIV-monoinfected and HIV/HCV-coinfected patients was also compared using multivariable analysis. A total of 185 patients (117 HIV-monoinfected and 68 HIV/HCV) developed cancer in the 26 580 patient-years cohort, with an incidence rate of 696 cancers per 100 000 person-years, higher than in the general population (SIR = 3.8). The incidence rate of NADC in HIV/HCV-coinfected patients was 415.0 (SIR = 3.4), significantly higher than in monoinfected (377.3; SIR = 1.8). After adjustments, HIV/HCV-coinfected patients had a higher cumulative incidence of NADC than HIV-monoinfected (adjusted hazard ratio = 1.80), even when excluding hepatocellular carcinomas (adjusted hazard ratio = 1.26). PLWH have a higher incidence of NADC than the general population and HCV-coinfection is associated with a higher incidence of NADC. These data justify the need for prevention strategies in these two populations and the importance of eradicating HCV.
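A standardized incidence ratio such as the SIR of 3.8 quoted above is simply observed cases divided by the cases expected if general-population rates applied to the cohort's person-time, summed over age strata. In the sketch below, the total person-years and case count echo the abstract, but the age split and reference rates are invented, so the resulting SIR is illustrative only.

import pandas as pd

# Hypothetical indirect standardisation: cohort person-years by age band and
# illustrative reference (general-population) cancer incidence rates.
strata = pd.DataFrame({
    "age_band":       ["20-39", "40-59", "60+"],
    "person_years":   [9000, 12000, 5580],
    "observed_cases": [20, 85, 80],
    "reference_rate": [0.0005, 0.0030, 0.0090],   # cases per person-year
})

expected = (strata["person_years"] * strata["reference_rate"]).sum()
observed = strata["observed_cases"].sum()
sir = observed / expected
print(f"Observed = {observed}, expected = {expected:.1f}, SIR = {sir:.2f}")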
McKenzie, Nicole; Williams, Teresa A; Ho, Kwok M; Inoue, Madoka; Bailey, Paul; Celenza, Antonio; Fatovich, Daniel; Jenkins, Ian; Finn, Judith
2018-05-02
To compare survival outcomes of adults with out-of-hospital cardiac arrest (OHCA) of medical aetiology directly transported to a percutaneous-coronary-intervention capable (PCI-capable) hospital (direct transport) with patients transferred to a PCI-capable hospital via another hospital without PCI services available (indirect transport) by emergency medical services (EMS). This retrospective cohort study used the St John Ambulance Western Australia OHCA Database and medical chart review. We included OHCA patients (≥18 years) admitted to any one of five PCI-capable hospitals in Perth between January 2012 and December 2015. Survival to hospital discharge (STHD) and survival up to 12-months after OHCA were compared between the direct and indirect transport groups using multivariable logistic and Cox-proportional hazards regression, respectively, while adjusting for so-called "Utstein variables" and other potential confounders. Of the 509 included patients, 404 (79.4%) were directly transported to a PCI-capable hospital and 105 (20.6%) transferred via another hospital to a PCI-capable hospital; 274/509 (53.8%) patients STHD and 253/509 (49.7%) survived to 12-months after OHCA. Direct transport patients were twice as likely to STHD (adjusted odds ratio 1.97, 95% confidence interval [CI] 1.13-3.43) than those transferred via another hospital. Indirect transport was also associated with a possible increased risk of death, up to 12-months, compared to direct transport (adjusted hazard ratio 1.36, 95% CI 1.00-1.84). Direct transport to a PCI-capable hospital for post-resuscitation care is associated with a survival advantage for adults with OHCA of medical aetiology. This has implications for EMS transport protocols for patients with OHCA. Copyright © 2018 Elsevier B.V. All rights reserved.
Kim, Gyuri; Lee, Seung-Eun; Jun, Ji Eun; Lee, You-Bin; Ahn, Jiyeon; Bae, Ji Cheol; Jin, Sang-Man; Hur, Kyu Yeon; Jee, Jae Hwan; Lee, Moon-Kyu; Kim, Jae Hyeon
2018-02-05
Skeletal muscle mass was negatively associated with metabolic syndrome prevalence in previous cross-sectional studies. The aim of this study was to investigate the impact of baseline skeletal muscle mass and changes in skeletal muscle mass over time on the development of metabolic syndrome in a large population-based 7-year cohort study. A total of 14,830 and 11,639 individuals who underwent health examinations at the Health Promotion Center at Samsung Medical Center, Seoul, Korea were included in the analyses of baseline skeletal muscle mass and those changes from baseline over 1 year, respectively. Skeletal muscle mass was estimated by bioelectrical impedance analysis and was presented as a skeletal muscle mass index (SMI), a body weight-adjusted appendicular skeletal muscle mass value. Using Cox regression models, hazard ratio for developing metabolic syndrome associated with SMI values at baseline or changes of SMI over a year was analyzed. During 7 years of follow-up, 20.1% of subjects developed metabolic syndrome. Compared to the lowest sex-specific SMI tertile at baseline, the highest sex-specific SMI tertile showed a significant inverse association with metabolic syndrome risk (adjusted hazard ratio [AHR] = 0.61, 95% confidence interval [CI] 0.54-0.68). Furthermore, compared with SMI changes < 0% over a year, multivariate-AHRs for metabolic syndrome development were 0.87 (95% CI 0.78-0.97) for 0-1% changes and 0.67 (0.56-0.79) for > 1% changes in SMI over 1 year after additionally adjusting for baseline SMI and glycometabolic parameters. An increase in relative skeletal muscle mass over time has a potential preventive effect on developing metabolic syndrome, independently of baseline skeletal muscle mass and glycometabolic parameters.
Huth, Cornelia; Thorand, Barbara; Baumert, Jens; Kruse, Johannes; Emeny, Rebecca Thwing; Schneider, Andrea; Meisinger, Christa; Ladwig, Karl-Heinz
2014-09-01
To examine whether job strain is associated with an increased risk of subsequent Type 2 diabetes mellitus (T2DM) development in a population-based study of men and women. Data were derived from the prospective MONICA/KORA Augsburg study. We investigated 5337 working participants aged 29 to 66 years without diabetes at one of the three baseline surveys. Job strain was measured by the Karasek job content questionnaire. High job strain was defined by the quadrant approach, where high job demands combined with low job control were classified as high job strain. Continuous job strain (the quotient of job demands divided by job control) was additionally analyzed as a sensitivity analysis. Hazard ratios (HRs) were estimated using multivariable Cox proportional hazards models with adjustment for age, sex, survey, socioeconomic and life-style variables, parental history of diabetes, and body mass index. During a median follow-up of 12.7 years, 291 incident cases of T2DM were observed. The participants with high job strain at baseline had a 45% higher fully adjusted risk of developing T2DM than did those with low job strain (HR = 1.45 [95% confidence interval = 1.00-2.10], p = .048). On the continuous scale, a 1-standard-deviation increase in job strain corresponded to a 12% increase in the fully adjusted T2DM risk (HR = 1.12 [95% confidence interval = 1.00-1.25], p = .045). Men and women who experience high job strain are at higher risk for developing T2DM independently of traditional risk factors. Preventive strategies to combat the globally increasing T2DM epidemic should take into consideration the adverse effects of high strain in the work environment.
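For illustration, the two job-strain definitions described above (the quadrant approach and the continuous demand/control quotient) might be coded roughly as follows. The median-split cut-points, column names, and score values are assumptions, since the abstract does not detail the Karasek scale scoring.

```python
import pandas as pd

# Hypothetical Karasek job content questionnaire scores.
df = pd.DataFrame({
    "job_demands": [12, 15, 9, 14, 11, 16],
    "job_control": [60, 45, 70, 40, 55, 42],
})

# Quadrant approach: high strain = high demands AND low control
# (split here at the median, an assumption for illustration only).
demand_cut = df["job_demands"].median()
control_cut = df["job_control"].median()
df["high_strain"] = (df["job_demands"] > demand_cut) & (df["job_control"] < control_cut)

# Continuous job strain: demands divided by control, standardised so that a
# 1-standard-deviation increase matches the per-SD hazard ratio reported above.
df["strain_ratio"] = df["job_demands"] / df["job_control"]
df["strain_ratio_z"] = (df["strain_ratio"] - df["strain_ratio"].mean()) / df["strain_ratio"].std()
print(df)
```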
Chaiteerakij, Roongruedee; Chattieng, Piyanat; Choi, Jonggi; Pinchareon, Nutcha; Thanapirom, Kessirin; Geratikornsupuk, Nopavut
Evidence supporting the benefit of hepatocellular carcinoma (HCC) surveillance in reducing mortality is not well established. The effect of HCC surveillance in reducing mortality was assessed by an inverse probability of treatment weighting (IPTW)-based analysis that controlled for the inherent bias and confounders of observational studies. This retrospective cohort study was conducted on 446 patients diagnosed with HCC between 2007 and 2013 at a major referral center. Surveillance was defined as having at least 1 ultrasound test within a year before HCC diagnosis. The primary outcome was survival, estimated using the Kaplan-Meier method with lead-time bias adjustment and compared using the log-rank test. Hazard ratio (HR) and 95% confidence interval (CI) were computed using conventional Cox and weighted Cox proportional hazards analysis with IPTW adjustment. Of the 446 patients, 103 (23.1%) were diagnosed with HCC through surveillance. The surveillance group had more patients with Barcelona-Clinic Liver Cancer stage A (80.6% vs. 33.8%, P < 0.0001), more patients eligible for potentially curative treatment (73.8% vs. 44.9%, P < 0.0001), and longer median survival (49.6 vs. 15.9 months, P < 0.0001). By conventional multivariate Cox analysis, the HR (95% CI) of surveillance was 0.63 (0.45-0.87), P = 0.005. The estimated effect of surveillance remained similar in the IPTW-adjusted Cox analysis (HR: 0.57; 95% CI: 0.43-0.76, P < 0.001). HCC surveillance by ultrasound is associated with a 37% reduction in mortality. Even though surveillance is recommended in all guidelines, in practice it is underutilized. Interventions are needed to increase surveillance rates and improve HCC outcomes.
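A rough sketch of an IPTW-weighted Cox analysis of this kind is given below. It is a generic illustration, not the authors' analysis: the propensity-score covariates, column names, and use of stabilised weights are assumptions, the lead-time adjustment is omitted, and the scikit-learn/lifelines APIs simply stand in for whatever software was actually used.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

def iptw_cox(df: pd.DataFrame, covariates: list) -> CoxPHFitter:
    """Weighted Cox model for the marginal effect of surveillance on survival."""
    # 1. Propensity score: probability of having been under surveillance,
    #    given baseline covariates (hypothetical, e.g. age, cirrhosis, AFP).
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["surveillance"])
    ps = ps_model.predict_proba(df[covariates])[:, 1]

    # 2. Stabilised inverse-probability-of-treatment weights.
    p_treated = df["surveillance"].mean()
    df = df.copy()
    df["iptw"] = df["surveillance"] * p_treated / ps + (1 - df["surveillance"]) * (1 - p_treated) / (1 - ps)

    # 3. Weighted Cox model with robust (sandwich) standard errors.
    cph = CoxPHFitter()
    cph.fit(df[["time_months", "death", "surveillance", "iptw"]],
            duration_col="time_months", event_col="death",
            weights_col="iptw", robust=True)
    return cph

# Usage (hypothetical columns): iptw_cox(cohort, ["age", "male", "cirrhosis", "afp_log"])
```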
Cirillo, Piera M.; Terry, Mary Beth; Krigbaum, Nickilou Y.; Flom, Julie D.; Cohn, Barbara A.
2013-01-01
Background: Elevated levels of the pesticide DDT (dichlorodiphenyltrichloroethane) have been positively associated with blood pressure and hypertension in studies among adults. Accumulating epidemiologic and toxicologic evidence suggests that hypertension during adulthood may also be affected by earlier life and possibly the prenatal environment. Objectives: We assessed whether prenatal exposure to the pesticide DDT increases risk of adult hypertension. Methods: We examined concentrations of DDT (p,p´- and o,p´-) and its metabolite p,p´-DDE (dichlorodiphenyldichloroethylene) in prenatal serum samples from a subset of women (n = 527) who had participated in the prospective Child Health and Development Studies birth cohort in the San Francisco Bay area while they were pregnant between 1959 and 1967. We surveyed daughters 39–47 years of age by telephone interview from 2005 to 2008 to obtain information on self-reported physician-diagnosed hypertension and use of hypertensive medication. We used multivariable regression analysis of time to hypertension based on the Cox proportional hazards model to estimate relative rates for the association between prenatal DDT exposures and hypertension treated with medication in adulthood, with adjustment for potential confounding by maternal, early-life, and adult exposures. Results: Prenatal p,p´-DDT exposure was associated with hypertension [adjusted hazard ratio (aHR) = 3.6; 95% CI: 1.8, 7.2 and aHR = 2.5; 95% CI: 1.2, 5.3 for middle and high tertiles of p,p´-DDT relative to the lowest tertile, respectively]. These associations between p,p´-DDT and hypertension were robust to adjustment for independent hypertension risk factors as well as sensitivity analyses. Conclusions: These findings suggest that the association between DDT exposure and hypertension may have its origins early in development. PMID:23591545
Deo, Rajat; Katz, Ronit; Shlipak, Michael G.; Sotoodehnia, Nona; Psaty, Bruce M.; Sarnak, Mark J.; Fried, Linda F.; Chonchol, Michel; de Boer, Ian H.; Enquobahrie, Daniel; Siscovick, David; Kestenbaum, Bryan
2012-01-01
Recent studies have demonstrated greater risks of cardiovascular events and mortality among persons who have lower 25-hydroxyvitamin D (25-OHD) and higher parathyroid hormone (PTH) levels. We sought to evaluate the association between markers of mineral metabolism and sudden cardiac death (SCD) among the 2,312 participants from the Cardiovascular Health Study who were free of clinical cardiovascular disease at baseline. We estimated associations of baseline 25-OHD and PTH concentrations individually and in combination with SCD using Cox proportional hazards models after adjustment for demographics, cardiovascular risk factors, and kidney function. During a median follow-up of 14 years, there were 73 adjudicated SCD events. The annual incidence of SCD was greater among subjects who had lower 25-OHD concentrations: 2 events per 10,000 for 25-OHD ≥ 20 ng/ml and 4 events per 10,000 for 25-OHD < 20 ng/ml. Similarly, SCD incidence was greater among subjects who had higher PTH concentrations: 2 events per 10,000 for PTH ≤ 65 pg/ml and 4 events per 10,000 for PTH > 65 pg/ml. Multivariate adjustment attenuated associations of 25-OHD and PTH with SCD. Finally, 267 participants (11.7% of the cohort) had high PTH and low 25-OHD concentrations. This combination was associated with a more than 2-fold risk of SCD after adjustment (hazard ratio 2.19, 95% confidence interval 1.17, 4.10, p=0.017) compared to participants with normal levels of PTH and 25-OHD. The combination of lower 25-OHD and higher PTH concentrations appears to be associated independently with SCD risk among older adults without cardiovascular disease. PMID:22068871
Johnson, Linda S B; Persson, Anders P; Wollmer, Per; Juul-Möller, Steen; Juhlin, Tord; Engström, Gunnar
2018-02-13
Atrial fibrillation (AF) is defined as an irregular supraventricular tachycardia (SVT) without p waves, with duration >30 seconds. Whether AF characteristics during short SVT episodes predict AF and stroke is not known. The purpose of this study was to determine whether irregularity and lack of p waves, alone or in combination, during short SVT episodes increase the risk of incident AF and ischemic stroke. The population-based Malmö Diet and Cancer study includes 24-hour ECG screening of 377 AF-free individuals (mean age 64.5 years; 43% men) who were prospectively followed for >13 years. There were 65 AF events and 25 ischemic stroke events during follow-up. Subjects with an SVT episode ≥5 beats were identified, and the longest SVT episode was assessed for irregularity and lack of p waves. The association between SVT classification and AF and stroke was assessed using multivariable adjusted Cox regression. The incidence of AF increased with increasing abnormality of the SVTs. The risk-factor adjusted hazard ratio for AF was 4.95 (95% confidence interval 2.06-11.9; P <.0001) for those with short irregular SVTs (<70 beats) without p waves. The incidence of ischemic stroke was highest in the group with regular SVT episodes without p waves (hazard ratio 14.2; 95% confidence interval 3.76-57.6; P <.0001, adjusted for age and sex). Characteristics of short SVT episodes detected at 24-hour ECG screening are associated with incident AF and ischemic stroke. Short irregular SVTs without p waves likely represent early stages of AF or atrial myopathy. Twenty-four-hour ECG could identify subjects suitable for primary prevention efforts. Copyright © 2018 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
Feldman, Candace H.; Hiraki, Linda T.; Winkelmayer, Wolfgang C.; Marty, Francisco M.; Franklin, Jessica M.; Kim, Seoyoung C.; Costenbader, Karen H.
2015-01-01
Objective While serious infections are significant causes of morbidity and mortality in systemic lupus erythematosus (SLE), the epidemiology in a nationwide cohort of SLE and lupus nephritis (LN) patients has not been examined. Methods Using the Medicaid Analytic eXtract (MAX) database, 2000-2006, we identified patients 18-64 years with SLE and a subset with LN. We ascertained hospitalized serious infections using validated algorithms, and 30-day mortality rates. We used Poisson regression to calculate infection incidence rates (IR), and multivariable Cox proportional hazards models to calculate hazard ratios (HR) for first infection, adjusted for sociodemographics, medication use, and a SLE-specific risk adjustment index. Results We identified 33,565 patients with SLE and 7,113 with LN. There were 9,078 serious infections in 5,078 SLE patients and 3,494 infections in 1,825 LN patients. The infection IR per 100 person-years was 10.8 in SLE and 23.9 in LN. In adjusted models, in SLE, we observed increased risks of infection among males compared to females (HR 1.33, 95% CI 1.20-1.47), in Blacks compared to Whites (HR 1.14, 95% CI 1.06-1.21), and glucocorticoid users (HR 1.51, 95% CI 1.43-1.61) and immunosuppressive users (HR 1.11, 95% CI 1.03-1.20) compared with non-users. Hydroxychloroquine users had a reduced risk of infection compared to non-users (HR 0.73, 95% CI 0.68-0.77). The 30-day mortality rate per 1,000 person-years among those hospitalized with infections was 21.4 in SLE and 38.7 in LN. Conclusion In this diverse, nationwide cohort of SLE patients, we observed a substantial burden of serious infections with many subsequent deaths. PMID:25772621
Left Atrial Enlargement and Stroke Recurrence: The Northern Manhattan Stroke Study
Yaghi, Shadi; Moon, Yeseon P.; Mora-McLaughlin, Consuelo; Willey, Joshua Z.; Cheung, Ken; Tullio, Marco R. Di; Homma, Shunichi; Kamel, Hooman; Sacco, Ralph L.; Elkind, Mitchell S. V.
2015-01-01
Background and purpose While left atrial enlargement (LAE) increases incident stroke risk, the association with recurrent stroke is less clear. Our aim was to determine the association of LAE with recurrent stroke most likely related to embolism (cryptogenic and cardioembolic), and all ischemic stroke recurrences. Methods We followed 655 first ischemic stroke patients in the Northern Manhattan Stroke Study for up to 5 years. LA size from 2-D echocardiography was categorized as normal (52.7%), mild LAE (31.6%), and moderate-severe LAE (15.7%). We used Cox proportional hazard models to calculate the hazard ratios and 95% confidence intervals (HR, 95%CI) for the association of LA size and LAE with recurrent cryptogenic/cardioembolic and total recurrent ischemic stroke. Results LA size was available in 529 (81%) patients. Mean age at enrollment was 69±13 years; 45.8% were male, 54.0% Hispanic, and 18.5% had atrial fibrillation. Over a median of 4 years there were 65 recurrent ischemic strokes (29 were cardioembolic or cryptogenic). In multivariable models adjusted for confounders including atrial fibrillation and heart failure, moderate-severe LAE compared to normal LA size was associated with greater risk of recurrent cardioembolic/cryptogenic stroke (adjusted HR 2.83, 95% CI 1.03-7.81), but not total ischemic stroke (adjusted HR 1.06, 95% CI, 0.48-2.30). Mild LAE was not associated with recurrent stroke. Conclusion Moderate to severe LAE was an independent marker of recurrent cardioembolic or cryptogenic stroke in a multiethnic cohort of ischemic stroke patients. Further research is needed to determine whether anticoagulant use may reduce risk of recurrence in ischemic stroke patients with moderate to severe LAE. PMID:25908460
Kabat, Geoffrey C; Kim, Mimi Y; Lane, Dorothy S; Zaslavsky, Oleg; Ho, Gloria Y F; Luo, Juhua; Nicholson, Wanda K; Chlebowski, Rowan T; Barrington, Wendy E; Vitolins, Mara Z; Lin, Xiaochen; Liu, Simin; Rohan, Thomas E
2018-05-01
Limited evidence suggests that hyperinsulinemia may contribute to the risk of breast, endometrial, and, possibly, ovarian cancer. The aim of this study was to assess the association of serum glucose and insulin with risk of these cancers in postmenopausal women, while taking into account potential confounding and modifying factors. We studied 21 103 women with fasting baseline insulin and glucose measurements in a subsample of the Women's Health Initiative. The subsample was composed of four studies within Women's Health Initiative with different selection and sampling strategies. Over a mean of 14.7 years of follow-up, 1185 breast cancer cases, 156 endometrial cancer cases, and 130 ovarian cancer cases were diagnosed. We used Cox proportional hazards models to estimate hazard ratios (HRs) and 95% confidence intervals (95% CIs) by quartile of glucose or insulin. Serum insulin was positively associated with breast cancer risk (multivariable-adjusted HR for highest vs. lowest quartile 1.41, 95% CI: 1.16-1.72, Ptrend<0.0003), and glucose and insulin were associated with roughly a doubling of endometrial cancer risk (for glucose: HR: 2.00, 95% CI: 1.20-3.35, Ptrend=0.01; for insulin: HR: 2.39, 95% CI: 1.32-4.33, Ptrend=0.008). These associations remained unchanged or were slightly attenuated after mutual adjustment, adjustment for serum lipids, and assessment of possible reverse causation. Glucose and insulin showed no association with ovarian cancer. Our findings provide support for a role of insulin-related pathways in the etiology of cancers of the breast and endometrium. However, because of the unrepresentative nature of the sample, our results need confirmation in other populations.
Socioeconomic Status As a Risk Factor for Unintended Pregnancy in the Contraceptive CHOICE Project.
Iseyemi, Abigail; Zhao, Qiuhong; McNicholas, Colleen; Peipert, Jeffrey F
2017-09-01
To evaluate the association of low socioeconomic status as an independent risk factor for unintended pregnancy. We performed a secondary analysis of data from the Contraceptive CHOICE project. Between 2007 and 2011, 9,256 participants were recruited and followed for up to 3 years. The primary outcome of interest was unintended pregnancy; the primary exposure variable was low socioeconomic status, defined as self-report of either receiving public assistance or having difficulty paying for basic necessities. Four contraceptive groups were evaluated: 1) long-acting reversible contraceptive method (hormonal or copper intrauterine device or subdermal implant); 2) depot medroxyprogesterone acetate injection; 3) oral contraceptive pills, a transdermal patch, or a vaginal ring; or 4) other or no method. Confounders were adjusted for in the multivariable Cox proportional hazard model to estimate the effect of socioeconomic status on risk of unintended pregnancy. Participants with low socioeconomic status experienced 515 unintended pregnancies during 14,001 women-years of follow-up (3.68/100 women-years; 95% CI 3.37-4.01) compared with 200 unintended pregnancies during 10,296 women-years (1.94/100 women-years; 95% CI 1.68-2.23) among participants without low socioeconomic status. Women with low socioeconomic status were more likely to have an unintended pregnancy (unadjusted hazard ratio [HR] 1.8, 95% CI 1.5-2.2). After adjusting for age, education level, insurance status, and history of unintended pregnancy, low socioeconomic status was associated with an increased risk of unintended pregnancy (adjusted HR 1.4, 95% CI 1.1-1.7). Despite the removal of cost barriers, low socioeconomic status is associated with a higher incidence of unintended pregnancy.
Wolak, Talya; Shoham-Vardi, Ilana; Sergienko, Ruslan; Sheiner, Eyal
2016-02-01
This study aims to examine whether renal function during pregnancy can serve as a surrogate marker for the risk of developing atherosclerotic-related morbidity. A case-control study, including women who gave birth at a tertiary referral medical centre during 2000-2012. This population was divided into cases of women who were subsequently hospitalized for atherosclerotic morbidity during the study period and age-matched controls. From the study population, we retrieved two groups: the creatinine (Cr) group: women who had at least one Cr measurement (4945 women) and the urea group: women who had at least one urea measurement (4932 women) during their pregnancies. In the Cr and urea groups, there were 572 and 571 cases and 4373 and 4361 controls, respectively. The mean follow-up period in the Cr and urea groups was 61.7 ± 37.0 and 57.3 ± 36.0 months, respectively. Cox proportional hazards models (controlling for confounders: gestational hypertension, gestational diabetes, obesity, maternal age, creatinine level (for urea), and gestational week) were used to estimate the adjusted hazard ratios (HR) for hospitalizations. A significant association was documented between renal function during pregnancy and long-term atherosclerotic morbidity. Multivariate analysis showed that a Cr level during pregnancy of ≥89 μmol/L was associated with a significantly increased risk of hospitalization due to cardiovascular (CVS) events (adjusted HR = 2.91, CI 1.37-6.19, P = 0.005) and that a urea level ≤7 mmol/L was independently associated with a reduced risk of CVS hospitalization (adjusted HR = 0.62, CI 0.57-0.86, P = 0.001). Renal function abnormality during pregnancy may reveal occult predisposition to atherosclerotic morbidity years after childbirth. © 2015 Asian Pacific Society of Nephrology.
Dewing, Sarah; Tomlinson, Mark; le Roux, Ingrid M.; Chopra, Mickey; Tsai, Alexander C.
2013-01-01
Background Although the public health impacts of food insecurity and depression on both maternal and child health are extensive, no studies have investigated the associations between food insecurity and postnatal depression or suicidality. Methods We interviewed 249 women three months after they had given birth and assessed food insecurity, postnatal depression symptom severity, suicide risk, and hazardous drinking. Multivariable Poisson regression models with robust standard errors were used to estimate the impact of food insecurity on psychosocial outcomes. Results Food insecurity, probable depression, and hazardous drinking were highly prevalent and co-occurring. More than half of the women (149 [59.8%]) were severely food insecure, 79 (31.7%) women met screening criteria for probable depression, and 39 (15.7%) women met screening criteria for hazardous drinking. Nineteen (7.6%) women had significant suicidality, of whom 7 (2.8%) were classified as high risk. Each additional point on the food insecurity scale was associated with increased risks of probable depression (adjusted risk ratio [ARR], 1.05; 95% CI, 1.02–1.07), hazardous drinking (ARR, 1.04; 95% CI, 1.00–1.09), and suicidality (ARR, 1.12; 95% CI, 1.02–1.23). Evaluated at the means of the covariates, these estimated associations were large in magnitude. Limitations The study is limited by lack of data on formal DSM-IV diagnoses of major depressive disorder, potential sample selection bias, and inability to assess the causal impact of food insecurity. Conclusion Food insecurity is strongly associated with postnatal depression, hazardous drinking, and suicidality. Programmes promoting food security for new mothers may enhance overall psychological well-being in addition to improving nutritional status. PMID:23707034
Dewing, Sarah; Tomlinson, Mark; le Roux, Ingrid M; Chopra, Mickey; Tsai, Alexander C
2013-09-05
Although the public health impacts of food insecurity and depression on both maternal and child health are extensive, no studies have investigated the associations between food insecurity and postnatal depression or suicidality. We interviewed 249 women three months after they had given birth and assessed food insecurity, postnatal depression symptom severity, suicide risk, and hazardous drinking. Multivariable Poisson regression models with robust standard errors were used to estimate the impact of food insecurity on psychosocial outcomes. Food insecurity, probable depression, and hazardous drinking were highly prevalent and co-occurring. More than half of the women (149 [59.8%]) were severely food insecure, 79 (31.7%) women met screening criteria for probable depression, and 39 (15.7%) women met screening criteria for hazardous drinking. Nineteen (7.6%) women had significant suicidality, of whom 7 (2.8%) were classified as high risk. Each additional point on the food insecurity scale was associated with increased risks of probable depression (adjusted risk ratio [ARR], 1.05; 95% CI, 1.02-1.07), hazardous drinking (ARR, 1.04; 95% CI, 1.00-1.09), and suicidality (ARR, 1.12; 95% CI, 1.02-1.23). Evaluated at the means of the covariates, these estimated associations were large in magnitude. The study is limited by lack of data on formal DSM-IV diagnoses of major depressive disorder, potential sample selection bias, and inability to assess the causal impact of food insecurity. Food insecurity is strongly associated with postnatal depression, hazardous drinking, and suicidality. Programmes promoting food security for new mothers may enhance overall psychological well-being in addition to improving nutritional status. Copyright © 2013 Elsevier B.V. All rights reserved.
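The "Poisson regression with robust standard errors" approach used in the two records above is the standard modified-Poisson device for estimating risk ratios for common binary outcomes. A minimal sketch follows; the covariates, synthetic data, and statsmodels implementation are illustrative assumptions, not the authors' code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in data: one row per woman interviewed post partum.
rng = np.random.default_rng(0)
n = 250
df = pd.DataFrame({
    "food_insecurity_score": rng.integers(0, 28, n),  # illustrative scale range
    "age": rng.integers(18, 45, n),
})
risk = (0.15 + 0.005 * df["food_insecurity_score"]).clip(0, 1)
df["probable_depression"] = rng.binomial(1, risk)

# Modified Poisson regression: a Poisson GLM for a binary outcome with
# heteroskedasticity-robust (sandwich) standard errors.
model = smf.glm(
    "probable_depression ~ food_insecurity_score + age",
    data=df, family=sm.families.Poisson(),
).fit(cov_type="HC1")

# Exponentiated coefficients are adjusted risk ratios, e.g. per one-point
# increase on the food insecurity scale.
print(np.exp(model.params).round(3))
```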
Brasky, Theodore M.; Sponholtz, Todd R.; Palmer, Julie R.; Rosenberg, Lynn; Ruiz-Narváez, Edward A.; Wise, Lauren A.
2016-01-01
Dietary long-chain (LC) ω-3 polyunsaturated fatty acids (PUFAs), which derive primarily from intakes of fatty fish, are thought to inhibit inflammation and de novo estrogen synthesis. This study prospectively examined the associations of dietary LC ω-3 PUFAs and fish with endometrial cancer risk in 47,602 African-American women living in the United States, aged 21–69 years at baseline in 1995, and followed them until 2013 (n = 282 cases). Multivariable-adjusted Cox regression models estimated hazard ratios and 95% confidence intervals for associations of LC ω-3 PUFA (quintiled) and fish (quartiled) intake with endometrial cancer risk, overall and by body mass index (BMI; weight (kg)/height (m)2). The hazard ratio for quintile 5 of total dietary LC ω-3 PUFAs versus quintile 1 was 0.79 (95% confidence interval (CI): 0.51, 1.24); there was no linear trend. Hazard ratios for the association were smaller among normal-weight women (BMI <25: hazard ratio (HR) = 0.53, 95% CI: 0.18, 1.58) than among overweight/obese women (BMI ≥25: HR = 0.88, 95% CI: 0.54, 1.43), but these differences were not statistically significant. Fish intake was also not associated with risk (quartile 4 vs. quartile 1: HR = 0.86, 95% CI: 0.56, 1.31). Again hazard ratios were smaller among normal-weight women (HR = 0.65) than among overweight/obese women (HR = 0.94). While compatible with no association, the hazard ratios observed among leaner African-American women are similar to those from recent prospective studies conducted in predominantly white populations. PMID:26755676
Association of Low-Dose Aspirin and Survival of Women With Endometrial Cancer.
Matsuo, Koji; Cahoon, Sigita S; Yoshihara, Kosuke; Shida, Masako; Kakuda, Mamoru; Adachi, Sosuke; Moeini, Aida; Machida, Hiroko; Garcia-Sayre, Jocelyn; Ueda, Yutaka; Enomoto, Takayuki; Mikami, Mikio; Roman, Lynda D; Sood, Anil K
2016-07-01
To examine the survival outcomes in women with endometrial cancer who were taking low-dose aspirin (81-100 mg/d). A multicenter retrospective study was conducted examining patients with stage I-IV endometrial cancer who underwent hysterectomy-based surgical staging between January 2000 and December 2013 (N=1,687). Patient demographics, medical comorbidities, medication types, tumor characteristics, and treatment patterns were correlated to survival outcomes. A Cox proportional hazard regression model was used to estimate adjusted hazard ratio for disease-free and disease-specific overall survival. One hundred fifty-eight patients (9.4%, 95% confidence interval [CI] 8.8-11.9) were taking low-dose aspirin. Median follow-up time for the study cohort was 31.5 months. One hundred twenty-seven patients (7.5%) died of endometrial cancer. Low-dose aspirin use was significantly correlated with concurrent obesity, hypertension, diabetes mellitus, and hypercholesterolemia (all P<.001). Low-dose aspirin users were more likely to take other antihypertensive, antiglycemic, and anticholesterol agents (all P<.05). Low-dose aspirin use was not associated with histologic subtype, tumor grade, nodal metastasis, or cancer stage (all P>.05). On multivariable analysis, low-dose aspirin use remained an independent prognostic factor associated with an improved 5-year disease-free survival rate (90.6% compared with 80.9%, adjusted hazard ratio 0.46, 95% CI 0.25-0.86, P=.014) and disease-specific overall survival rate (96.4% compared with 87.3%, adjusted hazard ratio 0.23, 95% CI 0.08-0.64, P=.005). The increased survival effect noted with low-dose aspirin use was greatest in patients whose age was younger than 60 years (5-year disease-free survival rates, 93.9% compared with 84.0%, P=.013), body mass index was 30 or greater (92.2% compared with 81.4%, P=.027), who had type I cancer (96.5% compared with 88.6%, P=.029), and who received postoperative whole pelvic radiotherapy (88.2% compared with 61.5%, P=.014). These four factors remained significant for disease-specific overall survival (all P<.05). Our results suggest that low-dose aspirin use is associated with improved survival outcomes in women with endometrial cancer, especially in those who are young, obese, with low-grade disease, and who receive postoperative radiotherapy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Che Mingxin; DeSilvio, Michelle; Pollack, Alan
2007-11-15
Purpose: The goal of this study was to verify the significance of p53 as a prognostic factor in Radiation Therapy Oncology Group 9202, which compared short-term androgen deprivation (STAD) with radiation therapy (RT) to long-term androgen deprivation + RT in men with locally advanced prostate cancer (Pca). Methods and Materials: Tumor tissue was sufficient for p53 analysis in 777 cases. p53 status was determined by immunohistochemistry. Abnormal p53 expression was defined as 20% or more tumor cells with positive nuclei. Univariate and multivariate Cox proportional hazards models were used to evaluate the relationships of p53 status to patient outcomes. Results: Abnormal p53 was detected in 168 of 777 (21.6%) cases, and was significantly associated with cause-specific mortality (adjusted hazard ratio [HR] = 1.89; 95% confidence interval (CI) 1.14-3.14; p = 0.014) and distant metastasis (adjusted HR = 1.72; 95% CI 1.13-2.62; p = 0.013). When patients were divided into subgroups according to assigned treatment, only the subgroup of patients who underwent STAD + RT showed significant correlation between p53 status and cause-specific mortality (adjusted HR = 2.43; 95% CI = 1.32-4.49; p = 0.0044). When patients were divided into subgroups according to p53 status, only the subgroup of patients with abnormal p53 showed significant association between assigned treatment and cause-specific mortality (adjusted HR = 3.81; 95% CI 1.40-10.37; p = 0.0087). Conclusions: Abnormal p53 is a significant prognostic factor for patients with prostate cancer who undergo short-term androgen deprivation and radiotherapy. Long-term androgen deprivation may significantly improve the cause-specific survival for those with abnormal p53.
Lean body mass and risk of incident atrial fibrillation in post-menopausal women.
Azarbal, Farnaz; Stefanick, Marcia L; Assimes, Themistocles L; Manson, JoAnn E; Bea, Jennifer W; Li, Wenjun; Hlatky, Mark A; Larson, Joseph C; LeBlanc, Erin S; Albert, Christine M; Nassir, Rami; Martin, Lisa W; Perez, Marco V
2016-05-21
High body mass index (BMI) is a risk factor for atrial fibrillation (AF). The aim of this study was to determine whether lean body mass (LBM) predicts AF. The Women's Health Initiative is a study of post-menopausal women aged 50-79 enrolled at 40 US centres from 1994 to 1998. A subset of 11 393 participants at three centres underwent dual-energy X-ray absorptiometry. Baseline demographics and clinical histories were recorded. Incident AF was identified using hospitalization records and diagnostic codes from Medicare claims. A multivariable Cox hazard regression model adjusted for demographic and clinical risk factors was used to evaluate associations between components of body composition and AF risk. After exclusion for prevalent AF or incomplete data, 8832 participants with an average age of 63.3 years remained for analysis. Over the 11.6 years of average follow-up time, 1035 women developed incident AF. After covariate adjustment, all measures of LBM were independently associated with higher rates of AF: total LBM [hazard ratio (HR) 1.24 per 5 kg increase, 95% confidence intervals (CI) 1.14-1.34], central LBM (HR 1.51 per 5 kg increase, 95% CI 1.31-1.74), and peripheral LBM (HR 1.39 per 5 kg increase, 95% CI 1.19-1.63). The association between total LBM and AF remained significant after adjustment for total fat mass (HR 1.22 per 5 kg increase, 95% CI 1.13-1.31). Greater LBM is a strong independent risk factor for AF. After adjusting for obesity-related risk factors, the risk of AF conferred by higher BMI is primarily driven by the association between LBM and AF. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
Yates, Tom A.; Tomlinson, Laurie A.; Bhaskaran, Krishnan; Langan, Sinead; Thomas, Sara; Smeeth, Liam
2017-01-01
Background Recent in vitro and animal studies have found the proton pump inhibitor (PPI) lansoprazole to be highly active against Mycobacterium tuberculosis. Omeprazole and pantoprazole have no activity. There is no evidence that, in clinical practice, lansoprazole can treat or prevent incident tuberculosis (TB) disease. Methods and findings We studied a cohort of new users of lansoprazole, omeprazole, or pantoprazole from the United Kingdom Clinical Practice Research Datalink to determine whether lansoprazole users have a lower incidence of TB disease than omeprazole or pantoprazole users. Negative control outcomes of myocardial infarction (MI) and herpes zoster were also studied. Multivariable Cox proportional hazards regression was used to adjust for potential confounding by a wide range of factors. We identified 527,364 lansoprazole initiators and 923,500 omeprazole or pantoprazole initiators. Lansoprazole users had a lower rate of TB disease (n = 86; 10.0 cases per 100,000 person years; 95% confidence interval 8.1–12.4) than omeprazole or pantoprazole users (n = 193; 15.3 cases per 100,000 person years; 95% confidence interval 13.3–17.7), with an adjusted hazard ratio (HR) of 0.68 (0.52–0.89). No association was found with MI (adjusted HR 1.04; 95% confidence interval 1.00–1.08) or herpes zoster (adjusted HR 1.03; 95% confidence interval 1.00–1.06). Limitations of this study are that we could not determine whether TB disease was due to reactivation of latent infection or a result of recent transmission, nor could we determine whether lansoprazole would have a beneficial effect if given to people presenting with TB disease. Conclusions In this study, use of the commonly prescribed and cheaply available PPI lansoprazole was associated with reduced incidence of TB disease. Given the serious problem of drug resistance and the adverse side effect profiles of many TB drugs, further investigation of lansoprazole as a potential antituberculosis agent is warranted. PMID:29161254
Jing, Jing; Pan, Yuesong; Zhao, Xingquan; Zheng, Huaguang; Jia, Qian; Mi, Donghua; Chen, Weiqi; Li, Hao; Liu, Liping; Wang, Chunxue; He, Yan; Wang, David; Wang, Yilong; Wang, Yongjun
2017-04-01
Insulin resistance was common in patients with stroke. This study investigated the association between insulin resistance and outcomes in nondiabetic patients with first-ever acute ischemic stroke. Patients with ischemic stroke without history of diabetes mellitus in the ACROSS-China registry (Abnormal Glucose Regulation in Patients With Acute Stroke Across China) were included. Insulin resistance was defined as a homeostasis model assessment of insulin resistance (HOMA-IR) index in the top quartile (Q4). HOMA-IR was calculated as fasting insulin (μU/mL)×fasting glucose (mmol/L)/22.5. Multivariable logistic regression or Cox regression was performed to estimate the association between HOMA-IR and 1-year prognosis (mortality, stroke recurrence, poor functional outcome [modified Rankin scale score 3-6], and dependence [modified Rankin scale score 3-5]). Among the 1245 patients with acute ischemic stroke enrolled in this study, the median HOMA-IR was 1.9 (interquartile range, 1.1-3.1). Compared with patients without insulin resistance, those with insulin resistance had a higher risk of mortality (adjusted hazard ratio, 1.68; 95% confidence interval, 1.12-2.53; P = 0.01), stroke recurrence (adjusted hazard ratio, 1.57; 95% confidence interval, 1.12-2.19; P = 0.008), and poor outcome (adjusted odds ratio, 1.42; 95% confidence interval, 1.03-1.95; P = 0.03), but not dependence, after adjustment for potential confounders. Higher HOMA-IR quartile categories were associated with a higher risk of 1-year death, stroke recurrence, and poor outcome (P for trend = 0.005, 0.005, and 0.001, respectively). Insulin resistance was associated with an increased risk of death, stroke recurrence, and poor outcome but not dependence in nondiabetic patients with acute ischemic stroke. © 2017 American Heart Association, Inc.
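The HOMA-IR formula quoted above translates directly into code. The unit assumptions (fasting insulin in μU/mL, glucose in mmol/L) and the divisor of 22.5 come from the abstract; the example values are invented, and the cut-off of 3.1 simply reflects the reported upper-quartile boundary in this cohort.

```python
import numpy as np

def homa_ir(fasting_insulin_uU_ml, fasting_glucose_mmol_l):
    """HOMA-IR = fasting insulin (uU/mL) x fasting glucose (mmol/L) / 22.5."""
    return fasting_insulin_uU_ml * fasting_glucose_mmol_l / 22.5

# Illustrative values only.
insulin = np.array([6.0, 12.5, 20.0, 9.0])   # uU/mL
glucose = np.array([5.1, 5.8, 6.4, 5.5])     # mmol/L
scores = homa_ir(insulin, glucose)

# Insulin resistance was defined as the top cohort quartile of HOMA-IR;
# in this cohort the upper-quartile boundary was roughly 3.1.
insulin_resistant = scores > 3.1
print(np.round(scores, 2), insulin_resistant)
```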
Potentially modifiable factors contributing to sepsis-associated encephalopathy.
Sonneville, Romain; de Montmollin, Etienne; Poujade, Julien; Garrouste-Orgeas, Maïté; Souweine, Bertrand; Darmon, Michael; Mariotte, Eric; Argaud, Laurent; Barbier, François; Goldgran-Toledano, Dany; Marcotte, Guillaume; Dumenil, Anne-Sylvie; Jamali, Samir; Lacave, Guillaume; Ruckly, Stéphane; Mourvillier, Bruno; Timsit, Jean-François
2017-08-01
Identifying modifiable factors for sepsis-associated encephalopathy may help improve patient care and outcomes. We conducted a retrospective analysis of a prospective multicenter database. Sepsis-associated encephalopathy (SAE) was defined by a score on the Glasgow coma scale (GCS) <15 or when features of delirium were noted. Potentially modifiable risk factors for SAE at ICU admission and its impact on mortality were investigated using multivariate logistic regression analysis and Cox proportional hazard modeling, respectively. We included 2513 patients with sepsis at ICU admission, of whom 1341 (53%) had sepsis-associated encephalopathy. After adjusting for baseline characteristics, site of infection, and type of admission, the following factors remained independently associated with sepsis-associated encephalopathy: acute renal failure [adjusted odds ratio (aOR) = 1.41, 95% confidence interval (CI) 1.19-1.67], hypoglycemia <3 mmol/l (aOR = 2.66, 95% CI 1.27-5.59), hyperglycemia >10 mmol/l (aOR = 1.37, 95% CI 1.09-1.72), hypercapnia >45 mmHg (aOR = 1.91, 95% CI 1.53-2.38), hypernatremia >145 mmol/l (aOR = 2.30, 95% CI 1.48-3.57), and S. aureus (aOR = 1.54, 95% CI 1.05-2.25). Sepsis-associated encephalopathy was associated with higher mortality, higher use of ICU resources, and longer hospital stay. After adjusting for age, comorbidities, year of admission, and non-neurological SOFA score, even mild alteration of mental status (i.e., a score on the GCS of 13-14) remained independently associated with mortality (adjusted hazard ratio = 1.38, 95% CI 1.09-1.76). Acute renal failure and common metabolic disturbances represent potentially modifiable factors contributing to sepsis-associated encephalopathy. However, a true causal relationship has yet to be demonstrated. Our study confirms the prognostic significance of mild alteration of mental status in patients with sepsis.
Huang, Henry D; Waks, Jonathan W; Steinhaus, Daniel A; Zimetbaum, Peter
2016-07-01
Dofetilide is a class III antiarrhythmic drug approved for the treatment of atrial fibrillation (AF). Dofetilide-induced corrected QT (QTc) interval prolongation is a surrogate for the degree of drug effect, but the relationships between drug-induced QTc interval prolongation, pharmacological cardioversion (PCV), and freedom from recurrent AF are unclear. The purpose of this study was to assess associations between QTc interval change during dofetilide initiation and PCV and long-term AF recurrence. We performed retrospective analyses of a prospective cohort of patients with AF admitted for dofetilide initiation between 2001 and 2014. Clinical characteristics and electrocardiographic variables were assessed. We evaluated outcomes of successful PCV in patients with persistent AF and time to recurrence of AF in patients with paroxysmal and persistent AF. During the study, 243 patients with persistent AF and 176 patients with paroxysmal AF initiated dofetilide. PCV occurred in 93/243 (41.7%) patients with persistent AF. After multivariable adjustment, QTc interval change was associated with PCV (adjusted odds ratio 1.21; P = .003 per 10-ms QTc increase). Inhospital QTc interval change was associated with long-term freedom from AF in patients with persistent AF (adjusted hazard ratio 0.92; P = .011 at 4 years per 10-ms QTc increase), but not in patients with paroxysmal AF. In patients with persistent AF, PCV was also associated with long-term freedom from recurrent AF (adjusted hazard ratio 0.62; P = .009 at 4 years). The magnitude of QTc interval prolongation during dofetilide initiation is an independent predictor of successful PCV and long-term freedom from arrhythmia in patients with persistent AF. QTc interval change had no association with AF recurrence in patients with paroxysmal AF, suggesting that different mechanisms of arrhythmogenesis may be operant in different AF types. Copyright © 2016 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
Steinberg, Benjamin A; Moghbeli, Nazanin; Buros, Jacqueline; Ruda, Mikhail; Parkhomenko, Alexander; Raju, B Soma; García-Castillo, Armando; Janion, Marianna; Nicolau, José C; Fox, Keith A A; Morrow, David A; Gibson, C Michael; Antman, Elliott M
2007-07-01
Outcomes in patients with ST-elevation myocardial infarction (STEMI) differ between those in clinical trials and those in routine practice, as well as across different regions. We hypothesized that adjustment for baseline risk would minimize such variations. The Enoxaparin and Thrombolysis Reperfusion for Acute Myocardial Infarction Treatment-Thrombolysis In Myocardial Infarction (ExTRACT-TIMI) 25 registry was an observational study of patients with STEMI presenting to hospitals participating in the ExTRACT-TIMI 25 randomized clinical trial. Consecutive patients with STEMI who were not enrolled in the trial were entered into the registry. Demographics, in-hospital therapies, and in-hospital events were collected. Baseline risk was assessed using the TIMI Risk Index for STEMI. To adjust for differences among the countries from which the patients presented, the gross national income per annum per capita (GNI) was used. A total of 3726 patients were registered from 109 sites in 25 countries. Patients in the registry had a higher baseline risk than those in the trial; they had more extensive prior cardiac histories and more comorbidities. Unadjusted in-hospital mortality was higher in the registry (8.3%) than in the trial (6.6%) (hazard ratio, 1.30; P < .001); however, after adjusting for the TIMI Risk Index, mortality was similar (adjusted hazard ratio, 1.00; P = .97). The GNI was not significantly predictive of in-hospital mortality in the multivariable model of the registry. Patients in the registry had higher mortality than those in the trial. This difference could be explained by the higher baseline risk of patients in the registry. After adjusting for baseline risk, the GNI of the country in which the patient presented did not contribute to predicting in-hospital mortality.
Migraine and risk of stroke: a national population-based twin study.
Lantz, Maria; Sieurin, Johanna; Sjölander, Arvid; Waldenlind, Elisabet; Sjöstrand, Christina; Wirdefeldt, Karin
2017-10-01
Numerous studies have indicated an increased risk for stroke in patients with migraine, especially migraine with aura; however, many studies used self-reported migraine and only a few controlled for familial factors. We aimed to investigate migraine as a risk factor for stroke in a Swedish population-based twin cohort, and whether familial factors contribute to an increased risk. The study population included twins without prior cerebrovascular disease who answered a headache questionnaire during 1998 and 2002 for twins born 1935-58 and during 2005-06 for twins born between 1959 and 1985. Migraine with and without aura and probable migraine were defined by an algorithm mapping on to clinical diagnostic criteria according to the International Classification of Headache Disorders. Stroke diagnoses were obtained from the national patient and cause of death registers. Twins were followed longitudinally, by linkage of national registers, from date of interview until date of first stroke, death, or end of study on 31 Dec 2014. In total, 8635 twins had any migrainous headache, of whom 3553 had migraine with aura and 5082 had non-aura migrainous headache (including migraine without aura and probable migraine), and 44 769 twins had no migraine. During a mean follow-up time of 11.9 years we observed 1297 incident cases of stroke. The Cox proportional hazards model with attained age as the underlying time scale was used to estimate hazard ratios with 95% confidence intervals for stroke, including ischaemic and haemorrhagic subtypes, related to migraine with aura, non-aura migrainous headache, and any migrainous headache. Analyses were adjusted for gender and cardiovascular risk factors. Where appropriate, within-pair analyses were performed to control for confounding by familial factors. The age- and gender-adjusted hazard ratio for stroke related to migraine with aura was 1.27 (95% confidence interval 1.00-1.62), P = 0.05, and 1.07 (95% confidence interval 0.91-1.26), P = 0.39 related to any migrainous headache. Multivariable adjusted analyses showed similar results. When stratified by gender and attained age of ≤50 or >50 years, the estimated hazard ratio for stroke was higher in twins younger than 50 years and in females, although these differences were not statistically significant. In the within-pair analysis, the hazard ratio for stroke related to migraine with aura was attenuated [hazard ratio 1.09 (95% confidence interval 0.81-1.46), P = 0.59]. In conclusion, we observed no increased stroke risk related to migraine overall but there was a modestly increased risk for stroke related to migraine with aura, and within-pair analyses suggested that familial factors might contribute to this association. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Weinberg, Ido; Gona, Philimon; O’Donnell, Christopher J.; Jaff, Michael R.; Murabito, Joanne M.
2014-01-01
Background An increased inter-arm systolic blood pressure difference is an easily determined physical examination finding. The relationship between inter-arm systolic blood pressure difference and risk of future cardiovascular disease is uncertain. We described the prevalence and risk factor correlates of inter-arm systolic blood pressure difference in the Framingham Heart Study (FHS) original and offspring cohorts and examined the association between inter-arm systolic blood pressure difference and incident cardiovascular disease and all-cause mortality. Methods An increased inter-arm systolic blood pressure difference was defined as ≥10 mmHg using the average of initial and repeat blood pressure measurements obtained in both arms. Participants were followed through 2010 for incident cardiovascular disease events. Multivariable Cox proportional hazards regression analyses were performed to investigate the effect of inter-arm systolic blood pressure difference on incident cardiovascular disease. Results We examined 3,390 (56.3% female) participants aged 40 years and older, free of cardiovascular disease at baseline, mean age of 61.1 years, who attended an FHS examination between 1991 and 1994 (original cohort) and from 1995 to 1998 (offspring cohort). The mean absolute inter-arm systolic blood pressure difference was 4.6 mmHg (range 0 to 78). Increased inter-arm systolic blood pressure difference was present in 317 (9.4%) participants. The median follow-up time was 13.3 years, during which time 598 participants (17.6%) experienced a first cardiovascular event, including 83 (26.2%) participants with inter-arm systolic blood pressure difference ≥10 mmHg. Compared to those with normal inter-arm systolic blood pressure difference, participants with an elevated inter-arm systolic blood pressure difference were older (63.0 years vs. 60.9 years), had a greater prevalence of diabetes mellitus (13.3% vs. 7.5%), higher systolic blood pressure (136.3 mmHg vs. 129.3 mmHg), and a higher total cholesterol level (212.1 mg/dL vs. 206.5 mg/dL). Inter-arm systolic blood pressure difference was associated with a significantly increased hazard of incident cardiovascular events in the multivariable adjusted model (hazard ratio 1.38, 95% CI, 1.09 to 1.75). For each 1-standard deviation unit increase in absolute inter-arm systolic blood pressure difference, the hazard ratio for incident cardiovascular events was 1.07 (CI, 1.00 to 1.14) in the fully-adjusted model. There was no such association with mortality (hazard ratio 1.02, 95% CI 0.76 to 1.38). Conclusions In this community-based cohort, an inter-arm systolic blood pressure difference is common and associated with a significantly increased risk for future cardiovascular events, even when the absolute difference in arm systolic blood pressure is modest. These findings support research to expand clinical use of this simple measurement. PMID:24287007
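The exposure definition above is straightforward to compute: average the initial and repeat systolic readings within each arm, take the absolute between-arm difference, and flag values of 10 mmHg or more. The sketch below is illustrative only; the column names and readings are assumptions.

```python
import pandas as pd

# Hypothetical paired readings (initial and repeat) for each arm, in mmHg.
df = pd.DataFrame({
    "sbp_right_1": [128, 142, 118], "sbp_right_2": [124, 146, 120],
    "sbp_left_1":  [122, 128, 117], "sbp_left_2":  [120, 132, 119],
})

# Per-arm averages of the initial and repeat measurements.
right = df[["sbp_right_1", "sbp_right_2"]].mean(axis=1)
left = df[["sbp_left_1", "sbp_left_2"]].mean(axis=1)

# Absolute inter-arm systolic difference; >=10 mmHg defines the increased-difference group.
df["interarm_diff"] = (right - left).abs()
df["increased_diff"] = df["interarm_diff"] >= 10
print(df[["interarm_diff", "increased_diff"]])
```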
Nonesterified fatty acids and risk of sudden cardiac death in older adults.
Djoussé, Luc; Biggs, Mary L; Ix, Joachim H; Kizer, Jorge R; Lemaitre, Rozenn N; Sotoodehnia, Nona; Zieman, Susan J; Mozaffarian, Dariush; Tracy, Russell P; Mukamal, Kenneth J; Siscovick, David S
2012-04-01
Although nonesterified fatty acids (NEFA) have been positively associated with coronary heart disease risk factors, limited and inconsistent data are available on the relation between NEFA and sudden cardiac death. Using a prospective design, we studied 4657 older men and women (mean age, 75 years) from the Cardiovascular Health Study (1992-2006) to evaluate the association between plasma NEFA and the risk of sudden cardiac death in older adults. Plasma concentrations of NEFA were measured using established enzymatic methods, and sudden death was adjudicated using medical records, death certificates, proxy interview, and autopsy reports. We used Cox proportional hazard models to estimate multivariable-adjusted relative risks. During a median follow-up of 10.0 years, 221 new cases of sudden cardiac death occurred. In a multivariable model adjusting for age, sex, race, clinic site, alcohol intake, smoking, prevalent coronary heart disease and heart failure, and self-reported health status, relative risks (95% confidence interval) for sudden cardiac death were 1.0 (ref), 1.15 (0.81-1.64), 1.06 (0.72-1.55), and 0.91 (0.60-1.38) across consecutive quartiles of NEFA concentration. In secondary analyses restricted to the first 5 years of follow-up, we also did not observe a statistically significant association between plasma NEFA and sudden cardiac death. Our data do not provide evidence for an association between plasma NEFA measured late in life and the risk of sudden cardiac death in older adults.
Choi, Andy I; Weekley, Cristin C; Chen, Shu-Cheng; Li, Suying; Tamura, Manjula Kurella; Norris, Keith C; Shlipak, Michael G
2011-08-01
Recent reports have suggested a close relationship between education and health, including mortality, in the United States. In this observational cohort study, we studied 61,457 participants enrolled in a national health screening initiative, the National Kidney Foundation's Kidney Early Evaluation Program (KEEP). The predictor was self-reported educational attainment; the outcomes were chronic diseases (hypertension, diabetes, cardiovascular disease, reduced kidney function, and albuminuria) and mortality. We evaluated cross-sectional associations between self-reported educational attainment and the chronic diseases listed using logistic regression models adjusted for demographics, access to care, behaviors, and comorbid conditions. The association of educational attainment with survival was determined using multivariable Cox proportional hazards regression. Higher educational attainment was associated with a lower prevalence of each of the chronic conditions listed. In multivariable models, compared with persons not completing high school, college graduates had a lower risk of each chronic condition, ranging from 11% lower odds of decreased kidney function to 37% lower odds of cardiovascular disease. During a mean follow-up of 3.9 (median, 3.7) years, 2,384 (4%) deaths occurred. In the fully adjusted Cox model, those who had completed college had 24% lower mortality compared with participants who had completed at least some high school. Lack of income data does not allow us to disentangle the independent effects of education from income. In this diverse contemporary cohort, higher educational attainment was associated independently with a lower prevalence of chronic diseases and short-term mortality in all age and race/ethnicity groups. Published by Elsevier Inc.
Cheng, Susan; Gupta, Deepak K; Claggett, Brian; Sharrett, A Richey; Shah, Amil M; Skali, Hicham; Takeuchi, Madoka; Ni, Hanyu; Solomon, Scott D
2013-09-01
Elevation in blood pressure (BP) increases risk for all cardiovascular events. Nevertheless, the extent to which different indices of BP elevation may be associated to varying degrees with different cardiovascular outcomes remains unclear. We studied 13340 participants (aged 54 ± 6 years, 56% women and 27% black) of the Atherosclerosis Risk in Communities Study who were free of baseline cardiovascular disease. We used Cox proportional hazards models to compare the relative contributions of systolic BP, diastolic BP, pulse pressure, and mean arterial pressure to risk for coronary heart disease, heart failure, stroke, and all-cause mortality. For each multivariable-adjusted model, the largest area under the receiver-operating curve (AUC) and smallest -2 log-likelihood values were used to identify BP measures with the greatest contribution to risk prediction for each outcome. A total of 2095 coronary heart disease events, 1669 heart failure events, 771 stroke events, and 3016 deaths occurred during 18 ± 5 years of follow-up. In multivariable analyses adjusting for traditional cardiovascular risk factors, the BP measures with the greatest risk contributions were the following: systolic BP for coronary heart disease (AUC=0.74); pulse pressure for heart failure (AUC=0.79); systolic BP for stroke (AUC=0.74); and pulse pressure for all-cause mortality (AUC=0.74). With few exceptions, results were similar in analyses stratified by age, sex, and race. Our data indicate that distinct BP components contribute variably to risk for different cardiovascular outcomes.
Cheng, Susan; Gupta, Deepak K.; Claggett, Brian; Sharrett, A. Richey; Shah, Amil M.; Skali, Hicham; Takeuchi, Madoka; Ni, Hanyu; Solomon, Scott D.
2013-01-01
Elevation in blood pressure (BP) increases risk for all cardiovascular events. Nevertheless, the extent to which different indices of BP elevation may be associated to varying degrees with different cardiovascular outcomes remains unclear. We studied 13,340 participants (aged 54±6 years, 56% women, 27% black) of the Atherosclerosis Risk in Communities Study who were free of baseline cardiovascular disease. We used Cox proportional hazards models to compare the relative contributions of systolic (SBP), diastolic (DBP), pulse pressure (PP), and mean arterial pressure (MAP) to risk for coronary heart disease (CHD), heart failure (HF), stroke, and all-cause mortality. For each multivariable-adjusted model, the largest area under the receiver-operating curve (AUC) and smallest -2 log likelihood values were used to identify BP measures with the greatest contribution to risk prediction for each outcome. A total of 2095 CHD events, 1669 HF events, 771 stroke events, and 3016 deaths occurred during up to 18±5 years of follow up. In multivariable analyses adjusting for traditional cardiovascular risk factors, the BP measures with the greatest risk contributions were: SBP for CHD (AUC=0.74); PP for HF (AUC=0.79), SBP for stroke (AUC=0.74), and PP for all-cause mortality (AUC=0.74). With few exceptions, results were similar in analyses stratified by age, sex, and race. Our data indicate that distinct BP components contribute variably to risk for different cardiovascular outcomes. PMID:23876475
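The derived blood-pressure indices compared in the two records above follow standard definitions (pulse pressure = SBP - DBP; mean arterial pressure is approximately DBP + PP/3), and the model-comparison idea can be sketched by fitting one Cox model per index and inspecting simple fit and discrimination summaries. This is a generic illustration with assumed column names and a reduced covariate set, not the authors' analysis; the paper's ROC-based AUC comparison is approximated here by the concordance index and -2 log-likelihood that lifelines reports.

```python
import pandas as pd
from lifelines import CoxPHFitter

def add_derived_bp(df: pd.DataFrame) -> pd.DataFrame:
    """Add pulse pressure and mean arterial pressure to a table with sbp/dbp columns."""
    out = df.copy()
    out["pp"] = out["sbp"] - out["dbp"]        # pulse pressure
    out["map"] = out["dbp"] + out["pp"] / 3.0  # mean arterial pressure (approximation)
    return out

def compare_bp_indices(df: pd.DataFrame, indices=("sbp", "dbp", "pp", "map")) -> pd.DataFrame:
    """Fit one Cox model per BP index (adjusted here only for age) and summarise fit."""
    results = {}
    for bp in indices:
        cph = CoxPHFitter()
        cph.fit(df[["followup_years", "event", bp, "age"]],
                duration_col="followup_years", event_col="event")
        results[bp] = {"minus2_logL": -2 * cph.log_likelihood_,
                       "c_index": cph.concordance_index_}
    return pd.DataFrame(results).T

# Usage (hypothetical): compare_bp_indices(add_derived_bp(cohort)) for each outcome of interest.
```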
Helzner, E P.; Scarmeas, N; Cosentino, S; Tang, M X.; Schupf, N; Stern, Y
2008-01-01
Objective: To describe factors associated with survival in Alzheimer disease (AD) in a multiethnic, population-based longitudinal study. Methods: AD cases were identified in the Washington Heights Inwood Columbia Aging Project, a longitudinal, community-based study of cognitive aging in Northern Manhattan. The sample comprised 323 participants who were initially dementia-free but developed AD during study follow-up (incident cases). Participants were followed for an average of 4.1 (up to 12.6) years. Possible factors associated with shorter lifespan were assessed using Cox proportional hazards models with attained age as the time to event (time from birth to death or last follow-up). In subanalyses, median postdiagnosis survival durations were estimated using postdiagnosis study follow-up as the timescale. Results: The mortality rate was 10.7 per 100 person-years. Mortality rates were higher among those diagnosed at older ages, and among Hispanics compared to non-Hispanic whites. The median lifespan of the entire sample was 92.2 years (95% CI: 90.3, 94.1). In a multivariable-adjusted Cox model, history of diabetes and history of hypertension were independently associated with a shorter lifespan. No differences in lifespan were seen by race/ethnicity after multivariable adjustment. The median postdiagnosis survival duration was 3.7 years among non-Hispanic whites, 4.8 years among African Americans, and 7.6 years among Hispanics. Conclusion: Factors influencing survival in Alzheimer disease include race/ethnicity and comorbid diabetes and hypertension. GLOSSARY AD = Alzheimer disease; NDI = National Death Index; WHICAP = Washington Heights Inwood Columbia Aging Project. PMID:18981370
Hamer, Mark; Batty, G David; Stamatakis, Emmanuel; Kivimaki, Mika
2010-12-01
Common mental disorders, such as anxiety and depression, are risk factors for mortality among cardiac patients, although this topic has gained little attention in individuals with hypertension. We examined the combined effects of hypertension and common mental disorder on mortality in participants with both treated and untreated hypertension. In a representative, prospective study of 31 495 adults (aged 52.5 ± 12.5 years, 45.7% men) we measured baseline levels of common mental disorder using the 12-item General Health Questionnaire (GHQ-12) and collected data on blood pressure, history of hypertension diagnosis, and medication use. High blood pressure (systolic/diastolic >140/90 mmHg) in study members with an existing diagnosis of hypertension indicated uncontrolled hypertension and, in undiagnosed individuals, untreated hypertension. There were 3200 deaths from all causes [943 cardiovascular disease (CVD)] over 8.4 years follow-up. As expected, the risk of CVD was elevated in participants with controlled [multivariate hazard ratio = 1.63, 95% confidence interval (CI) 1.26-2.12] and uncontrolled (multivariate hazard ratio = 1.57, 95% CI 1.08-2.27) hypertension compared with normotensive participants. Common mental disorder (GHQ-12 score of ≥4) was also associated with CVD death (multivariate hazard ratio = 1.60, 95% CI 1.35-1.90). The risk of CVD death was highest in participants with both diagnosed hypertension and common mental disorder, especially in study members with controlled (multivariate hazard ratio = 2.32, 95% CI 1.70-3.17) hypertension but also in uncontrolled hypertension (multivariate hazard ratio = 1.90, 95% CI 1.18-3.05). The combined effect of common mental disorder was also apparent in participants with undiagnosed (untreated) hypertension, especially for all-cause mortality. These findings suggest that the association of hypertension with total and CVD mortality is stronger when combined with common mental disorder.
NASA Astrophysics Data System (ADS)
Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.
2016-10-01
The structure of multi-variant physical and mathematical models of a control system is presented, together with its application to the adjustment of automatic control systems (ACS) for production facilities, illustrated with the example of a coal processing plant.
Ren, J S; Freedman, N D; Kamangar, F; Dawsey, S M; Hollenbeck, A R; Schatzkin, A; Abnet, C C
2010-07-01
The authors investigated the relationship between hot tea, iced tea, coffee and carbonated soft drinks consumption and upper gastrointestinal tract cancer risk in the NIH-AARP Study. During 2,584,953 person-years of follow-up on 481,563 subjects, 392 oral cavity, 178 pharynx, 307 larynx, 231 gastric cardia, 224 gastric non-cardia cancer, 123 oesophageal squamous cell carcinoma (ESCC) and 305 oesophageal adenocarcinoma (EADC) cases were accrued. Hazard ratios (HRs) and 95% confidence intervals (95% CIs) were calculated by multivariate-adjusted Cox regression. Compared to non-drinking, the hazard ratio for hot tea intake of ≥1 cup/day was 0.37 (95% CI: 0.20, 0.70) for pharyngeal cancer. The authors also observed a significant association between coffee drinking and risk of gastric cardia cancer (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 1.57 (95% CI: 1.03, 2.39)), and an inverse association between coffee drinking and EADC for the cases occurring in the last 3 years of follow-up (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 0.54 (95% CI: 0.31, 0.92)), but no association in earlier follow-up. In summary, hot tea intake was inversely associated with pharyngeal cancer, and coffee was directly associated with gastric cardia cancer, but was inversely associated with EADC during some follow-up periods. Published by Elsevier Ltd.
Ren, JS; Freedman, ND; Kamangar, F; Dawsey, SM; Hollenbeck, AR; Schatzkin, A; Abnet, CC
2010-01-01
The authors investigated the relationship between hot tea, iced tea, coffee and carbonated soft drinks consumption and upper gastrointestinal tract cancer risk in the NIH-AARP Study. During 2,584,953 person-years of follow-up on 481,563 subjects, 392 oral cavity, 178 pharynx, 307 larynx, 231 gastric cardia, 224 gastric noncardia cancer, 123 esophageal squamous cell carcinoma (ESCC) and 305 esophageal adenocarcinoma (EADC) cases were accrued. Hazard ratios (HRs) and 95% confidence intervals (95% CIs) were calculated by multivariate-adjusted Cox regression. Compared to non-drinking, the hazard ratio for hot tea intake of ≥1 cup/day was 0.37 (95%CI: 0.20, 0.70) for pharyngeal cancer. The authors also observed a significant association between coffee drinking and risk of gastric cardia cancer (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 1.57 (95%CI: 1.03, 2.39)), and an inverse association between coffee drinking and EADC for the cases occurring in the last three years of follow-up (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 0.54 (95%CI: 0.31, 0.92)), but no association in earlier follow-up. In summary, hot tea intake was inversely associated with pharyngeal cancer, and coffee was directly associated with gastric cardia cancer, but was inversely associated with EADC during some follow-up periods. PMID:20395127
Migraine and risk of cardiovascular diseases: Danish population based matched cohort study
Szépligeti, Szimonetta Komjáthiné; Holland-Bill, Louise; Ehrenstein, Vera; Horváth-Puhó, Erzsébet; Henderson, Victor W; Sørensen, Henrik Toft
2018-01-01
Objective: To examine the risks of myocardial infarction, stroke (ischaemic and haemorrhagic), peripheral artery disease, venous thromboembolism, atrial fibrillation or atrial flutter, and heart failure in patients with migraine and in a general population comparison cohort. Design: Nationwide, population based cohort study. Setting: All Danish hospitals and hospital outpatient clinics from 1995 to 2013. Participants: 51 032 patients with migraine and 510 320 people from the general population matched on age, sex, and calendar year. Main outcome measures: Comorbidity adjusted hazard ratios of cardiovascular outcomes based on Cox regression analysis. Results: Higher absolute risks were observed among patients with incident migraine than in the general population across most outcomes and follow-up periods. After 19 years of follow-up, the cumulative incidences per 1000 people for the migraine cohort compared with the general population were 25 v 17 for myocardial infarction, 45 v 25 for ischaemic stroke, 11 v 6 for haemorrhagic stroke, 13 v 11 for peripheral artery disease, 27 v 18 for venous thromboembolism, 47 v 34 for atrial fibrillation or atrial flutter, and 19 v 18 for heart failure. Correspondingly, migraine was positively associated with myocardial infarction (adjusted hazard ratio 1.49, 95% confidence interval 1.36 to 1.64), ischaemic stroke (2.26, 2.11 to 2.41), and haemorrhagic stroke (1.94, 1.68 to 2.23), as well as venous thromboembolism (1.59, 1.45 to 1.74) and atrial fibrillation or atrial flutter (1.25, 1.16 to 1.36). No meaningful association was found with peripheral artery disease (adjusted hazard ratio 1.12, 0.96 to 1.30) or heart failure (1.04, 0.93 to 1.16). The associations, particularly for stroke outcomes, were stronger during the short term (0-1 years) after diagnosis than the long term (up to 19 years), in patients with aura than in those without aura, and in women than in men. In a subcohort of patients, the associations persisted after additional multivariable adjustment for body mass index and smoking. Conclusions: Migraine was associated with increased risks of myocardial infarction, ischaemic stroke, haemorrhagic stroke, venous thromboembolism, and atrial fibrillation or atrial flutter. Migraine may be an important risk factor for most cardiovascular diseases. PMID:29386181
Migraine and risk of cardiovascular diseases: Danish population based matched cohort study.
Adelborg, Kasper; Szépligeti, Szimonetta Komjáthiné; Holland-Bill, Louise; Ehrenstein, Vera; Horváth-Puhó, Erzsébet; Henderson, Victor W; Sørensen, Henrik Toft
2018-01-31
To examine the risks of myocardial infarction, stroke (ischaemic and haemorrhagic), peripheral artery disease, venous thromboembolism, atrial fibrillation or atrial flutter, and heart failure in patients with migraine and in a general population comparison cohort. Nationwide, population based cohort study. All Danish hospitals and hospital outpatient clinics from 1995 to 2013. 51 032 patients with migraine and 510 320 people from the general population matched on age, sex, and calendar year. Comorbidity adjusted hazard ratios of cardiovascular outcomes based on Cox regression analysis. Higher absolute risks were observed among patients with incident migraine than in the general population across most outcomes and follow-up periods. After 19 years of follow-up, the cumulative incidences per 1000 people for the migraine cohort compared with the general population were 25 v 17 for myocardial infarction, 45 v 25 for ischaemic stroke, 11 v 6 for haemorrhagic stroke, 13 v 11 for peripheral artery disease, 27 v 18 for venous thromboembolism, 47 v 34 for atrial fibrillation or atrial flutter, and 19 v 18 for heart failure. Correspondingly, migraine was positively associated with myocardial infarction (adjusted hazard ratio 1.49, 95% confidence interval 1.36 to 1.64), ischaemic stroke (2.26, 2.11 to 2.41), and haemorrhagic stroke (1.94, 1.68 to 2.23), as well as venous thromboembolism (1.59, 1.45 to 1.74) and atrial fibrillation or atrial flutter (1.25, 1.16 to 1.36). No meaningful association was found with peripheral artery disease (adjusted hazard ratio 1.12, 0.96 to 1.30) or heart failure (1.04, 0.93 to 1.16). The associations, particularly for stroke outcomes, were stronger during the short term (0-1 years) after diagnosis than the long term (up to 19 years), in patients with aura than in those without aura, and in women than in men. In a subcohort of patients, the associations persisted after additional multivariable adjustment for body mass index and smoking. Migraine was associated with increased risks of myocardial infarction, ischaemic stroke, haemorrhagic stroke, venous thromboembolism, and atrial fibrillation or atrial flutter. Migraine may be an important risk factor for most cardiovascular diseases.
Tsao, Connie W.; Gona, Philimon; Salton, Carol; Murabito, Joanne M.; Oyama, Noriko; Danias, Peter G.; O’Donnell, Christopher J.; Manning, Warren J.; Yeon, Susan B.
2011-01-01
We aimed to determine the relationships between resting left ventricular (LV) wall motion abnormalities (WMAs), aortic plaque, and peripheral artery disease (PAD) in a community cohort. A total of 1726 Framingham Heart Study Offspring Cohort participants (806 males, 65 ± 9 years) underwent cardiovascular magnetic resonance with quantification of aortic plaque volume and assessment of regional LV systolic function. Claudication, lower extremity revascularization, and ankle-brachial index (ABI) were recorded at Examination 7. WMAs were associated with greater aortic plaque burden, decreased ABI, and claudication in age- and sex-adjusted analyses (all p<0.001), which were not significant after adjustment for cardiovascular risk factors. In age- and sex-adjusted analyses, both the presence (p<0.001) and volume of aortic plaque were associated with decreased ABI (p<0.001). After multivariable adjustment, ABI ≤ 0.9 or prior revascularization was associated with a three-fold odds of aortic plaque (p=0.0083). Plaque volume significantly increased with decreasing ABI in multivariable-adjusted analyses (p<0.0001). In this free-living population, associations of WMAs with aortic plaque burden and clinical measures of PAD were attenuated after adjustment for coronary heart disease risk factors. Aortic plaque volume and ABI remained strongly negatively correlated after multivariable adjustment. Our findings suggest that the association between coronary heart disease and non-coronary atherosclerosis is explained by cardiovascular risk factors. Aortic atherosclerosis and PAD remain strongly associated after multivariable adjustment, suggesting shared mechanisms beyond those captured by traditional risk factors. PMID:21708875
Geri, Guillaume; Dumas, Florence; Chenevier-Gobeaux, Camille; Bouglé, Adrien; Daviaud, Fabrice; Morichau-Beauchant, Tristan; Jouven, Xavier; Mira, Jean-Paul; Pène, Frédéric; Empana, Jean-Philippe; Cariou, Alain
2015-02-01
The availability of circulating biomarkers that helps to identify early out-of-hospital cardiac arrest survivors who are at increased risk of long-term mortality remains challenging. Our aim was to prospectively study the association between copeptin and 1-year mortality in patients with out-of-hospital cardiac arrest admitted in a tertiary cardiac arrest center. Retrospective monocenter study. Tertiary cardiac arrest center in Paris, France. Copeptin was assessed at admission and day 3. Pre- and intrahospital factors associated with 1-year mortality were analyzed by multivariate Cox proportional analysis. None. Two hundred ninety-eight consecutive out-of-hospital cardiac arrest patients (70.3% male; median age, 60.2 yr [49.9-71.4]) were admitted in a tertiary cardiac arrest center in Paris (France). After multivariate analysis, higher admission copeptin was associated with 1-year mortality with a threshold effect (hazard ratio(5th vs 1st quintile) = 1.64; 95% CI, 1.05-2.58; p = 0.03). Day 3 copeptin was associated with 1-year mortality in a dose-dependent manner (hazard ratio(2nd vs 1st quintile) = 1.87; 95% CI, 1.00-3.49; p = 0.05; hazard ratio(3rd vs 1st quintile) = 1.92; 95% CI, 1.02-3.64; p = 0.04; hazard ratio(4th vs 1st quintile) = 2.12; 95% CI, 1.14-3.93; p = 0.02; and hazard ratio(5th vs 1st quintile) = 2.75; 95% CI, 1.47-5.15; p < 0.01; p for trend < 0.01). For both admission and day 3 copeptin, association with 1-year mortality existed for out-of-hospital cardiac arrest of cardiac origin only (p for interaction = 0.05 and < 0.01, respectively). When admission and day 3 copeptin were mutually adjusted, only day 3 copeptin remained associated with 1-year mortality in a dose-dependent manner (p for trend = 0.01). High levels of copeptin were associated with 1-year mortality independently from prehospital and intrahospital risk factors, especially in out-of-hospital cardiac arrest of cardiac origin. Day 3 copeptin was superior to admission copeptin: this could permit identification of out-of-hospital cardiac arrest survivors at increased risk of mortality and allow for close observation of such patients.
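The quintile-based Cox analysis described above can be sketched as follows (not the authors' code): the biomarker is cut into quintiles, indicator variables are created with the lowest quintile as the reference category, and the Cox model returns hazard ratios for each higher quintile. The lifelines package and the simulated data below are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
copeptin = pd.Series(rng.lognormal(3.0, 1.0, n))         # hypothetical biomarker values
latent = rng.exponential(0.8, n)                         # latent time to death (years)
time = np.minimum(latent, 1.0)                           # administrative censoring at 1 year
event = (latent <= 1.0).astype(int)

df = pd.DataFrame({"time": time, "event": event})
quintile = pd.qcut(copeptin, 5, labels=["Q1", "Q2", "Q3", "Q4", "Q5"])
dummies = pd.get_dummies(quintile, drop_first=True).astype(float)   # Q1 is the reference
model_df = pd.concat([df, dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)   # HR for Q2..Q5 relative to the lowest quintile
```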
Comparative effectiveness from a single-arm trial and real-world data: alectinib versus ceritinib.
Davies, Jessica; Martinec, Michael; Delmar, Paul; Coudert, Mathieu; Bordogna, Walter; Golding, Sophie; Martina, Reynaldo; Crane, Gracy
2018-06-26
To compare the overall survival of anaplastic lymphoma kinase-positive non-small-cell lung cancer patients who received alectinib with those who received ceritinib. Two treatment arms (alectinib [n = 183] and ceritinib [n = 67]) were extracted from clinical trials and an electronic health record database, respectively. Propensity scores were applied to balance baseline characteristics. Kaplan-Meier analyses and multivariate Cox regression were conducted. After propensity score adjustment, baseline characteristics were balanced. Alectinib had a prolonged median overall survival (alectinib = 24.3 months and ceritinib = 15.6 months) and lower risk of death (hazard ratio: 0.65; 95% CI: 0.48-0.88). Alectinib was associated with prolonged overall survival versus ceritinib, which is consistent with efficacy evidence from clinical trials.
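The comparison above rests on propensity score adjustment followed by survival modelling. A hedged sketch of one common implementation, inverse probability of treatment weighting followed by a weighted Cox model, is shown below; this is not necessarily the exact weighting scheme used in the study, and the covariates, scikit-learn/lifelines usage, and simulated data are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 250
age = rng.normal(60, 10, n)
ecog = rng.integers(0, 3, n).astype(float)
treated = rng.integers(0, 2, n)                 # 1 = alectinib-like arm (hypothetical)
latent = rng.exponential(20, n)                 # latent overall survival (months)
time = np.minimum(latent, 36.0)                 # censoring at 36 months
event = (latent <= 36.0).astype(int)

# Propensity score: modelled probability of treatment given baseline covariates.
X = np.column_stack([age, ecog])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
weights = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))   # IPTW weights

df = pd.DataFrame({"time": time, "event": event, "treated": treated, "w": weights})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event", weights_col="w", robust=True)
print(cph.hazard_ratios_)   # weighted hazard ratio for the treated vs comparator arm
```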
Chatterjee, Ranee; Yeh, Hsin-Chieh; Shafi, Tariq; Selvin, Elizabeth; Anderson, Cheryl; Pankow, James S; Miller, Edgar; Brancati, Frederick
2010-10-25
Serum potassium levels affect insulin secretion by pancreatic β-cells, and hypokalemia associated with diuretic use has been associated with dysglycemia. We hypothesized that adults with lower serum potassium levels and lower dietary potassium intake are at higher risk for incident diabetes mellitus (DM), independent of diuretic use. We analyzed data from 12 209 participants from the Atherosclerosis Risk in Communities (ARIC) Study, an ongoing prospective cohort study, beginning in 1986, with 9 years of in-person follow-up and 17 years of telephone follow-up. Using multivariate Cox proportional hazard models, we estimated the hazard ratio (HR) of incident DM associated with baseline serum potassium levels. During 9 years of in-person follow-up, 1475 participants developed incident DM. In multivariate analyses, we found an inverse association between serum potassium and risk of incident DM. Compared with those with a high-normal serum potassium level (5.0-5.5 mEq/L), adults with serum potassium levels lower than 4.0 mEq/L, 4.0 to lower than 4.5 mEq/L, and 4.5 to lower than 5.0 mEq/L had an adjusted HR (95% confidence interval [CI]) of incident DM of 1.64 (95% CI, 1.29-2.08), 1.64 (95% CI, 1.34-2.01), and 1.39 (95% CI, 1.14-1.71), respectively. An increased risk persisted during an additional 8 years of telephone follow-up based on self-report with HRs of 1.2 to 1.3 for those with a serum potassium level lower than 5.0 mEq/L. Dietary potassium intake was significantly associated with risk of incident DM in unadjusted models but not in multivariate models. Serum potassium level is an independent predictor of incident DM in this cohort. Further study is needed to determine if modification of serum potassium could reduce the subsequent risk of DM.
Cho, Yeong-Ah; Ryu, Kyoung-A.; Yoo, Min-Kyong; Kim, Seong-Ah; Ha, Kyungho; Kim, Jeongseon; Cho, Yoon Hee; Shin, Sangah; Joung, Hyojee
2017-01-01
Markedly increased red meat consumption is a cancer risk factor, while dietary flavonoids may help prevent the disease. The purpose of this study was to investigate the associations of red meat and flavonoid consumption with cancer risk, based on data from 8024 subjects, drawn from the 2004–2008 Cancer Screening Examination Cohort of the Korean National Cancer Center. Hazard ratios (HRs) were obtained by using a Cox proportional hazard model. During the mean follow-up period of 10.1 years, 443 cases were newly diagnosed with cancer. After adjusting for age, there was a significant correlation between cancer risk and the daily intake of ≥43 g of red meat per day (HR 1.31; 95% CI 1.01, 1.71; p = 0.045), and total flavonoid intake tended to decrease cancer risk (HR 0.70; 95% CI 0.49, 0.99; highest vs. lowest quartile; p-trend = 0.073) in men. Following multivariable adjustment, there were no statistically significant associations between flavonoid intake and overall cancer risk in individuals with high levels of red meat intake. Men with low daily red meat intake exhibited an inverse association between flavonoid consumption and cancer incidence (HR 0.41; 95% CI 0.21, 0.80; highest vs. lowest; p-trend = 0.017). Additional research is necessary to clarify the effects of flavonoid consumption on specific cancer incidence, relative to daily red meat intake. PMID:28841199
Kim, So Young; Wie, Gyung-Ah; Cho, Yeong-Ah; Kang, Hyun-Hee; Ryu, Kyoung-A; Yoo, Min-Kyong; Jun, Shinyoung; Kim, Seong-Ah; Ha, Kyungho; Kim, Jeongseon; Cho, Yoon Hee; Shin, Sangah; Joung, Hyojee
2017-08-25
Markedly increased red meat consumption is a cancer risk factor, while dietary flavonoids may help prevent the disease. The purpose of this study was to investigate the associations of red meat and flavonoid consumption with cancer risk, based on data from 8024 subjects, drawn from the 2004-2008 Cancer Screening Examination Cohort of the Korean National Cancer Center. Hazard ratios (HRs) were obtained by using a Cox proportional hazard model. During the mean follow-up period of 10.1 years, 443 cases were newly diagnosed with cancer. After adjusting for age, there was a significant correlation between cancer risk and the daily intake of ≥43 g of red meat per day (HR 1.31; 95% CI 1.01, 1.71; p = 0.045), and total flavonoid intake tended to decrease cancer risk (HR 0.70; 95% CI 0.49, 0.99; highest vs. lowest quartile; p -trend = 0.073) in men. Following multivariable adjustment, there were no statistically significant associations between flavonoid intake and overall cancer risk in individuals with high levels of red meat intake. Men with low daily red meat intake exhibited an inverse association between flavonoid consumption and cancer incidence (HR 0.41; 95% CI 0.21, 0.80; highest vs. lowest; p -trend = 0.017). Additional research is necessary to clarify the effects of flavonoid consumption on specific cancer incidence, relative to daily red meat intake.
Bergquist, John R; Ivanics, Tommy; Shubert, Christopher R; Habermann, Elizabeth B; Smoot, Rory L; Kendrick, Michael L; Nagorney, David M; Farnell, Michael B; Truty, Mark J
2017-06-01
Adjuvant chemotherapy improves survival after curative intent resection for localized pancreatic adenocarcinoma (PDAC). Given the differences in perioperative morbidity, we hypothesized that patients undergoing distal partial pancreatectomy (DPP) would receive adjuvant therapy more often than those undergoing pancreatoduodenectomy (PD). The National Cancer Data Base (2004-2012) identified patients with localized PDAC undergoing DPP and PD, excluding neoadjuvant cases, and factors associated with receipt of adjuvant therapy were identified. Overall survival (OS) was analyzed using multivariable Cox proportional hazards regression. Overall, 13,501 patients were included (DPP, n = 1933; PD, n = 11,568). Prognostic characteristics were similar, except DPP patients had fewer N1 lesions, less often positive margins, more minimally invasive resections, and shorter hospital stay. The proportion of patients not receiving adjuvant chemotherapy was equivalent (DPP 33.7%, PD 32.0%; p = 0.148). The type of procedure was not independently associated with adjuvant chemotherapy (hazard ratio 0.96, 95% confidence interval 0.90-1.02; p = 0.150), and patients receiving adjuvant chemotherapy had improved unadjusted and adjusted OS compared with surgery alone. The type of resection did not predict adjusted mortality (p = 0.870). Receipt of adjuvant chemotherapy did not vary by type of resection but improved survival independent of procedure performed. Factors other than type of resection appear to be driving the nationwide rates of post-resection adjuvant chemotherapy in localized PDAC.
Lee, Da-Young; Lee, Mi-Yeon; Sung, Ki-Chul
2018-06-01
This paper investigated the impact of A Body Shape Index (ABSI) on the risk of all-cause mortality compared with the impact of waist circumference (WC) and body mass index (BMI). This paper reviewed data from 213,569 Korean adults who participated in health checkups between 2002 and 2012 at Kangbuk Samsung Hospital in Seoul, Korea. A multivariate Cox proportional hazards analysis was performed on the BMI, WC, and ABSI z score continuous variables as well as quintiles. During 1,168,668.7 person-years, 1,107 deaths occurred. As continuous variables, a significant positive relationship with the risk of all-cause death was found only in ABSI z scores after adjustment for age, sex, current smoking, alcohol consumption, regular exercise, presence of diabetes or hypertension, and history of cardiovascular diseases. In Cox analysis of quintiles, quintile 5 of the ABSI z score showed significantly increased hazard ratios (HRs) for mortality risk (HR [95% CI] was 1.32 [1.05-1.66]), whereas the risk for all-cause mortality decreased in quintiles 3 through 5 of BMI and WC compared with their first quintiles after adjusting for several confounders. This study showed that the predictive value of ABSI for mortality risk was strong for a sample of young Asian participants and that it was more useful than BMI or WC. © 2018 The Obesity Society.
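A Body Shape Index is derived from waist circumference, height, and BMI; assuming the usual definition introduced by Krakauer and Krakauer (ABSI = WC / (BMI^(2/3) · height^(1/2)), with WC and height in metres), a small sketch of the calculation and of the z score used as a continuous predictor is shown below. The toy values are illustrative only, and in practice z scores are often computed within age and sex strata rather than over the whole sample.

```python
import numpy as np

def absi(waist_m, height_m, weight_kg):
    """A Body Shape Index: WC / (BMI^(2/3) * height^(1/2)), WC and height in metres."""
    bmi = weight_kg / height_m ** 2
    return waist_m / (bmi ** (2 / 3) * np.sqrt(height_m))

# Hypothetical measurements for four people.
waist = np.array([0.80, 0.95, 1.02, 0.88])     # metres
height = np.array([1.60, 1.75, 1.68, 1.72])    # metres
weight = np.array([55.0, 82.0, 90.0, 70.0])    # kilograms

a = absi(waist, height, weight)
absi_z = (a - a.mean()) / a.std(ddof=1)        # sample z score used as the model covariate
print(absi_z)
```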
Al-Shahi Salman, Rustam; White, Philip M; Counsell, Carl E; du Plessis, Johann; van Beijnum, Janneke; Josephson, Colin B; Wilkinson, Tim; Wedderburn, Catherine J; Chandy, Zoe; St George, E Jerome; Sellar, Robin J; Warlow, Charles P
Whether conservative management is superior to interventional treatment for unruptured brain arteriovenous malformations (bAVMs) is uncertain because of the shortage of long-term comparative data. To compare the long-term outcomes of conservative management vs intervention for unruptured bAVM. Population-based inception cohort study of 204 residents of Scotland aged 16 years or older who were first diagnosed as having an unruptured bAVM during 1999-2003 or 2006-2010 and followed up prospectively for 12 years. Conservative management (no intervention) vs intervention (any endovascular embolization, neurosurgical excision, or stereotactic radiosurgery alone or in combination). Cox regression analyses, with multivariable adjustment for prognostic factors and baseline imbalances if hazards were proportional, to compare rates of the primary outcome (death or sustained morbidity of any cause by Oxford Handicap Scale [OHS] score ≥2 for ≥2 successive years [0 = no symptoms and 6 = death]) and the secondary outcome (nonfatal symptomatic stroke or death due to bAVM, associated arterial aneurysm, or intervention). Of 204 patients, 103 underwent intervention. Those who underwent intervention were younger, more likely to have presented with seizure, and less likely to have large bAVMs than patients managed conservatively. During a median follow-up of 6.9 years (94% completeness), the rate of progression to the primary outcome was lower with conservative management during the first 4 years of follow-up (36 vs 39 events; 9.5 vs 9.8 per 100 person-years; adjusted hazard ratio, 0.59; 95% CI, 0.35-0.99), but rates were similar thereafter. The rate of the secondary outcome was lower with conservative management during 12 years of follow-up (14 vs 38 events; 1.6 vs 3.3 per 100 person-years; adjusted hazard ratio, 0.37; 95% CI, 0.19-0.72). Among patients aged 16 years or older diagnosed as having unruptured bAVM, use of conservative management compared with intervention was associated with better clinical outcomes for up to 12 years. Longer follow-up is required to understand whether this association persists.
Phobic anxiety symptom scores and incidence of type 2 diabetes in US men and women.
Farvid, Maryam S; Qi, Lu; Hu, Frank B; Kawachi, Ichiro; Okereke, Olivia I; Kubzansky, Laura D; Willett, Walter C
2014-02-01
Emotional stress may be a risk factor for type 2 diabetes (T2D), but the relation between phobic anxiety symptoms and risk of T2D is uncertain. To evaluate prospectively the association between phobic anxiety symptoms and incident T2D in three cohorts of US men and women. We followed 30,791 men in the Health Professionals Follow-up Study (HPFS) (1988-2008), 68,904 women in the Nurses' Health Study (NHS) (1988-2008), and 79,960 women in the Nurses' Health Study II (NHS II) (1993-2011). Phobic anxiety symptoms were measured with the 8-item Crown-Crisp index (CCI), administered at baseline and updated in 2004 for NHS, in 2005 for NHS II, and in 2000 for HPFS. Incident T2D was confirmed by a validated supplementary questionnaire. We used Cox proportional hazards analysis to evaluate associations with incident T2D. During 3,099,651 person-years of follow-up, we documented 12,831 incident T2D cases. In multivariate Cox proportional-hazards models with adjustment for major lifestyle and dietary risk factors, the hazard ratios (HRs) of T2D across categories of increasing levels of CCI (scores 2 to <3, 3 to <4, 4 to <6, ≥6), compared with a score of <2, were increased significantly by 6%, 10%, 10% and 13% (P-trend=0.001) for NHS; and by 19%, 11%, 21%, and 29% (P-trend<0.0001) for NHS II. Each score increment in CCI was associated with 2% higher risk of T2D in NHS (HR, 1.02; 95% confidence interval: 1.01-1.03) and 4% higher risk of T2D in NHS II (HR, 1.04; 95% confidence interval: 1.02-1.05). Further adjustment for depression did not change the results. In HPFS, the association between CCI and T2D was not significant after adjusting for lifestyle variables. Our results suggest that higher phobic anxiety symptoms are associated with an increased risk of T2D in women. Copyright © 2013 Elsevier Inc. All rights reserved.
Kowalkowski, Marc A; Day, Rena S; Du, Xianglin L; Chan, Wenyaw; Chiao, Elizabeth Y
2014-10-01
Research suggests that cumulative measurement of HIV exposure is associated with mortality, AIDS, and AIDS-defining malignancies. However, the relationship between cumulative HIV and non-AIDS-defining malignancies (NADMs) remains unclear. The aim of this study was to evaluate the effect of different HIV measures on NADM hazard among HIV-infected male veterans. We performed a retrospective cohort study using Veterans Affairs HIV Clinical Case Registry data from 1985 to 2010. We analyzed the relationship between HIV exposure (recent HIV RNA, % undetectable HIV RNA, and HIV copy-years viremia) and NADM. To evaluate the effect of HIV, we calculated hazard ratios for 3 common virally associated NADM [ie, hepatocarcinoma (HCC), Hodgkin lymphoma (HL), and squamous cell carcinoma of the anus (SCCA)] in multivariable Cox regression models. Among 31,576 HIV-infected male veterans, 383 HCC, 211 HL, and 373 SCCA cases were identified. In multivariable regression models, cross-sectional HIV measurement was not associated with NADM. However, compared with <20% undetectable HIV, individuals with ≥80% had decreased HL [adjusted hazard ratio (aHR) = 0.62; 95% confidence interval (CI): 0.37 to 1.02] and SCCA (aHR = 0.64; 95% CI: 0.44 to 0.93). Conversely, each log10 increase in HIV copy-years was associated with elevated HL (aHR = 1.22; 95% CI: 1.06 to 1.40) and SCCA (aHR = 1.36; 95% CI: 1.21 to 1.52). Model fit was best with HIV copy-years. Cumulative HIV was not associated with HCC. Cumulative HIV was associated with certain virally associated NADM (ie, HL and SCCA), independent of measured covariates. Findings underline the importance of early treatment initiation and durable medication adherence to reduce cumulative HIV burden. Future research should prioritize how to best apply cumulative HIV measures in screening for these cancers.
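HIV copy-years viremia, the cumulative exposure measure referred to above, is the time-weighted area under the viral-load curve over follow-up, which is then usually analysed on the log10 scale. A minimal sketch of the calculation with the trapezoidal rule is shown below; the measurement times and viral loads are hypothetical.

```python
import numpy as np

def copy_years_viremia(times_years, viral_load_copies_ml):
    """Time-weighted area under the viral-load curve (copy-years/mL),
    approximated with the trapezoidal rule."""
    t = np.asarray(times_years, dtype=float)
    v = np.asarray(viral_load_copies_ml, dtype=float)
    return float(np.sum((v[1:] + v[:-1]) / 2 * np.diff(t)))

# Hypothetical patient: viral-load measurements at 0, 0.5, 1.5 and 3 years of follow-up.
t = [0.0, 0.5, 1.5, 3.0]
vl = [100_000, 20_000, 400, 50]            # copies/mL
cyv = copy_years_viremia(t, vl)
print(cyv, np.log10(cyv))                  # log10 copy-years is the unit used in the models
```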
Shivakoti, Rupak; Gupte, Nikhil; Yang, Wei-Teng; Mwelase, Noluthando; Kanyama, Cecilia; Tang, Alice M.; Pillay, Sandy; Samaneka, Wadzanai; Riviere, Cynthia; Berendes, Sima; Lama, Javier R.; Cardoso, Sandra W.; Sugandhavesa, Patcharaphan; Semba, Richard D.; Christian, Parul; Campbell, Thomas B.; Gupta, Amita
2014-01-01
A case-cohort study, within a multi-country trial of antiretroviral therapy (ART) efficacy (Prospective Evaluation of Antiretrovirals in Resource Limited Settings (PEARLS)), was conducted to determine if pre-ART serum selenium deficiency is independently associated with human immunodeficiency virus (HIV) disease progression after ART initiation. Cases were HIV-1 infected adults with either clinical failure (incident World Health Organization (WHO) stage 3, 4 or death by 96 weeks) or virologic failure by 24 months. Risk factors for serum selenium deficiency (<85 μg/L) pre-ART and its association with outcomes were examined. Median serum selenium concentration was 82.04 μg/L (interquartile range (IQR): 57.28–99.89) and the prevalence of serum selenium deficiency was 53%, varying widely by country from 0% to 100%. In multivariable models, risk factors for serum selenium deficiency were country, previous tuberculosis, anemia, and elevated C-reactive protein. Serum selenium deficiency was not associated with either clinical failure or virologic failure in multivariable models. However, relative to people in the third quartile (74.86–95.10 μg/L) of serum selenium, we observed increased hazards (adjusted hazard ratio (HR): 3.50; 95% confidence interval (CI): 1.30–9.42) of clinical failure but not virologic failure for people in the highest quartile. If future studies confirm this relationship of high serum selenium with increased clinical failure, a cautious approach to selenium supplementation might be needed, especially in HIV-infected populations with sufficient or unknown levels of selenium. PMID:25401501
Periodontal disease, tooth loss and colorectal cancer risk: Results from the Nurses' Health Study.
Momen-Heravi, Fatemeh; Babic, Ana; Tworoger, Shelley S; Zhang, Libin; Wu, Kana; Smith-Warner, Stephanie A; Ogino, Shuji; Chan, Andrew T; Meyerhardt, Jeffrey; Giovannucci, Edward; Fuchs, Charles; Cho, Eunyoung; Michaud, Dominique S; Stampfer, Meir J; Yu, Yau-Hua; Kim, David; Zhang, Xuehong
2017-02-01
Periodontal diseases including tooth loss might increase systemic inflammation, lead to immune dysregulation and alter gut microbiota, thereby possibly influencing colorectal carcinogenesis. Few epidemiological studies have examined the association between periodontal diseases and colorectal cancer (CRC) risk. We collected information on the periodontal disease (defined as history of periodontal bone loss) and number of natural teeth in the Nurses' Health Study. A total of 77,443 women were followed since 1992. We used Cox proportional hazard models to calculate multivariable hazard ratios (HRs) and 95% confidence intervals (95% CIs) after adjustment for smoking and other known risk factors for CRC. We documented 1,165 incident CRC through 2010. Compared to women with 25-32 teeth, the multivariable HR (95% CI) for CRC for women with <17 teeth was 1.20 (1.04-1.39). With regard to tumor site, the HRs (95% CIs) for the same comparison were 1.23 (1.01-1.51) for proximal colon cancer, 1.03 (0.76-1.38) for distal colon cancer and 1.48 (1.07-2.05) for rectal cancer. In addition, compared to those without periodontal disease, HRs for CRC were 0.91 (95% CI 0.74-1.12) for periodontal disease, and 1.22 (95% CI 0.91-1.63) when limited to moderate to severe periodontal disease. The results were not modified by smoking status, body mass index or alcohol consumption. Women with fewer teeth, possibly moderate or severe periodontal disease, might be at a modest increased risk of developing CRC, suggesting a potential role of oral health in colorectal carcinogenesis. © 2016 UICC.
Night shift work and incident diabetes among U.S. black women
Vimalananda, Varsha G.; Palmer, Julie R.; Gerlovin, Hanna; Wise, Lauren A.; Rosenzweig, James L.; Rosenberg, Lynn; Narváez, Edward A. Ruiz
2015-01-01
Aims: To assess shift work in relation to incident type 2 diabetes among African American women. Methods: In the Black Women's Health Study (BWHS), an ongoing prospective cohort study, we followed 28,041 participants for incident diabetes during 2005-2013. They answered questions in 2005 about having worked the night shift. We estimated hazard ratios (HR) and 95% confidence intervals (CI) for incident diabetes using Cox proportional hazards models. The basic multivariable model included age, time period, family history of diabetes, education, and neighborhood SES. In further models, we controlled for lifestyle factors and body mass index (BMI). Results: Over the 8 years of follow-up, there were 1,786 incident diabetes cases. Relative to never having worked the night shift, HRs (95% CI) of diabetes were 1.17 (1.04, 1.31) for 1-2 years of night shift work, 1.23 (1.06, 1.41) for 3-9 years, and 1.42 (1.19, 1.70) for ≥ 10 years (P-trend < 0.0001). The monotonic positive association between night shift work and type 2 diabetes remained after multivariable adjustment (P-trend = 0.02). The association did not vary by obesity status, but was stronger in women aged < 50 years. Conclusions: Long durations of shift work were associated with an increased risk of type 2 diabetes. The association was only partially explained by lifestyle factors and BMI. A better understanding of the mechanisms by which shift work may affect risk of diabetes is needed in view of the high prevalence of shift work among U.S. workers. PMID:25586362
Takase, Hiroyuki; Sugiura, Tonomori; Murai, Shunsuke; Yamashita, Sumiyo; Ohte, Nobuyuki; Dohi, Yasuaki
2017-08-01
Increased carotid intima-media thickness (IMT) in individuals without hypertension might indicate other factors promoting the atherosclerotic process that are often simultaneously clustered in individuals. The present study tested the hypothesis that carotid IMT predicts new onset of hypertension in normotensive subjects. A total of 867 participants were enrolled from our yearly physical checkup program and their carotid IMT was measured. After a baseline examination, the subjects were followed up for a median of 1091 days with the endpoint being the development of hypertension. At baseline, the carotid IMT value was 0.75 ± 0.16 mm. Hypertension developed in 184 subjects during the follow-up (76.9/1000 person-years). The incidence of hypertension was increased across the tertiles of the carotid IMT value (39.6, 70.0, and 134.5/1000 person-years in the first, second, and third tertiles, respectively, P < .001 by log-rank test). Multivariate Cox-hazard analysis after adjustment identified carotid IMT, taken as a continuous variable, as a significant predictor of new-onset hypertension (hazard ratio = 7.08, 95% confidence interval = 3.06-15.39). Furthermore, multivariate linear regression analyses indicated a significant correlation between the carotid IMT at baseline and yearly increases in systolic blood pressure during the follow-up period (β = 0.189, P < .001). Carotid IMT is an independent predictor of hypertension onset in normotensive subjects. The findings also suggested a close association between increased carotid IMT and blood pressure.
Takase, Hiroyuki; Sugiura, Tonomori; Murai, Shunsuke; Yamashita, Sumiyo; Ohte, Nobuyuki; Dohi, Yasuaki
2017-01-01
Abstract Increased carotid intima-media thickness (IMT) in individuals without hypertension might indicate other factors promoting the atherosclerotic process that are often simultaneously clustered in individuals. The present study tested the hypothesis that carotid IMT predicts new onset of hypertension in the normotensive subjects. A total of 867 participants were enrolled from our yearly physical checkup program and their carotid IMT was measured. After a baseline examination, the subjects were followed up for a median of 1091 days with the endpoint being the development of hypertension. At baseline, the carotid IMT value was 0.75 ± 0.16 mm. Hypertension developed in 184 subjects during the follow-up (76.9/1000 person-years). The incidence of hypertension was increased across the tertiles of the carotid IMT value (39.6, 70.0, and 134.5/1000 person-years in the first, second, and third tertiles, respectively, P < .001 by log-rank test). Multivariate Cox-hazard analysis after adjustment identified carotid IMT, taken as a continuous variable, as a significant predictor of new-onset hypertension (hazard ratio = 7.08, 95% confidence interval = 3.06–15.39). Furthermore, multivariate linear regression analyses indicated a significant correlation between the carotid IMT at baseline and yearly increases in systolic blood pressure during the follow-up period (β = 0.189, P < .001). Carotid IMT is an independent predictor of hypertension onset in normotensive subjects. The findings also suggested a close association between increased carotid IMT and blood pressure. PMID:28767608
Takase, Hiroyuki; Sugiura, Tomonori; Kimura, Genjiro; Ohte, Nobuyuki; Dohi, Yasuaki
2015-01-01
Background: Although there is a close relationship between dietary sodium and hypertension, the concept that persons with relatively high dietary sodium are at increased risk of developing hypertension compared with those with relatively low dietary sodium has not been studied intensively in a cohort. Methods and Results: We conducted an observational study to investigate whether dietary sodium intake predicts future blood pressure and the onset of hypertension in the general population. Individual sodium intake was estimated by calculating 24-hour urinary sodium excretion from spot urine in 4523 normotensive participants who visited our hospital for a health checkup. After a baseline examination, they were followed for a median of 1143 days, with the end point being development of hypertension. During the follow-up period, hypertension developed in 1027 participants (22.7%). The risk of developing hypertension was higher in those with higher rather than lower sodium intake (hazard ratio 1.25, 95% CI 1.04 to 1.50). In multivariate Cox proportional hazards regression analysis, baseline sodium intake and the yearly change in sodium intake during the follow-up period (as continuous variables) correlated with the incidence of hypertension. Furthermore, both the yearly increase in sodium intake and baseline sodium intake showed significant correlations with the yearly increase in systolic blood pressure in multivariate regression analysis after adjustment for possible risk factors. Conclusions: Both relatively high levels of dietary sodium intake and gradual increases in dietary sodium are associated with future increases in blood pressure and the incidence of hypertension in the Japanese general population. PMID:26224048
Lucas, Michel; O'Reilly, Eilis J; Pan, An; Mirzaei, Fariba; Willett, Walter C; Okereke, Olivia I; Ascherio, Alberto
2014-07-01
To evaluate the association between coffee and caffeine consumption and suicide risk in three large-scale cohorts of US men and women. We accessed data from 43,599 men enrolled in the Health Professionals Follow-up Study (HPFS, 1988-2008), 73,820 women in the Nurses' Health Study (NHS, 1992-2008), and 91,005 women in the NHS II (1993-2007). Consumption of caffeine, coffee, and decaffeinated coffee was assessed every 4 years by validated food-frequency questionnaires. Deaths from suicide were determined by physician review of death certificates. Multivariate-adjusted relative risks (RRs) were estimated with Cox proportional hazards models. Cohort-specific RRs were pooled using random-effects models. We documented 277 deaths from suicide. Compared to those consuming ≤ 1 cup/week of caffeinated coffee (< 8 oz/237 ml), the pooled multivariate RR (95% confidence interval [CI]) of suicide was 0.55 (0.38-0.78) for those consuming 2-3 cups/day and 0.47 (0.27-0.81) for those consuming ≥ 4 cups/day (P trend < 0.001). The pooled multivariate RR (95% CI) for suicide was 0.75 (0.63-0.90) for each increment of 2 cups/day of caffeinated coffee and 0.77 (0.63-0.93) for each increment of 300 mg/day of caffeine. These results from three large cohorts support an association between caffeine consumption and lower risk of suicide.
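Pooling cohort-specific relative risks with a random-effects model, as above, is commonly done with the DerSimonian-Laird estimator. The sketch below recovers standard errors from published 95% CIs and pools the log relative risks; the input RRs are hypothetical and the method is a standard one, not necessarily the exact procedure used by the authors.

```python
import numpy as np

def pool_random_effects(rr, lo, hi):
    """DerSimonian-Laird random-effects pooling of relative risks given their 95% CIs."""
    y = np.log(rr)                                   # cohort-specific log RRs
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)      # SE recovered from the CI width
    w = 1 / se ** 2                                  # fixed-effect (inverse-variance) weights
    mu_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fixed) ** 2)              # heterogeneity statistic
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1 / (se ** 2 + tau2)                    # random-effects weights
    mu = np.sum(w_star * y) / np.sum(w_star)
    se_mu = np.sqrt(1 / np.sum(w_star))
    return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu)

# Hypothetical cohort-specific RRs per 2 cups/day, for illustration only.
print(pool_random_effects([0.78, 0.70, 0.80], [0.60, 0.52, 0.61], [1.01, 0.94, 1.05]))
```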
Bernstein, Elana J; Bathon, Joan M; Lederer, David J
2018-05-01
Pulmonary arterial hypertension (PAH) is a major cause of morbidity and mortality in adults with systemic autoimmune rheumatic diseases (ARDs). The aim of this study was to determine whether adults with ARDs and PAH on right-sided heart catheterization (ARD-PAH) have increased mortality following lung transplantation compared with those with PAH not due to an ARD. We conducted a retrospective cohort study of 93 adults with ARD-PAH and 222 adults with PAH who underwent lung transplantation in the USA between 4 May 2005 and 9 March 2015 using data from the United Network for Organ Sharing. We examined associations between diagnosis and survival after lung transplantation using stratified Cox models adjusted for potential confounding recipient factors. Among adults undergoing lung transplantation in the USA, we did not detect a difference in the multivariable-adjusted mortality rate between those with ARD-PAH and those with PAH [hazard ratio 0.75 (95% CI 0.47, 1.19)]. The presence of an ARD was not associated with increased mortality after lung transplantation in adults with PAH.
Glioblastoma multiforme (GBM) in the elderly: initial treatment strategy and overall survival.
Glaser, Scott M; Dohopolski, Michael J; Balasubramani, Goundappa K; Flickinger, John C; Beriwal, Sushil
2017-08-01
The EORTC trial which solidified the role of external beam radiotherapy (EBRT) plus temozolomide (TMZ) in the management of GBM excluded patients over age 70. Randomized studies of elderly patients showed that hypofractionated EBRT (HFRT) alone or TMZ alone was at least equivalent to conventionally fractionated EBRT (CFRT) alone. We sought to investigate the practice patterns and survival in elderly patients with GBM. We identified patients age 65-90 in the National Cancer Data Base (NCDB) with histologically confirmed GBM from 1998 to 2012 and known chemotherapy and radiotherapy status. We analyzed factors predicting treatment with EBRT alone vs. EBRT plus concurrent single-agent chemotherapy (CRT) using multivariable logistic regression. Similarly, within the EBRT alone cohort we compared CFRT (54-65 Gy at 1.7-2.1 Gy/fraction) to HFRT (34-60 Gy at 2.5-5 Gy/fraction). Multivariable Cox proportional hazards model (MVA) with propensity score adjustment was used to compare survival. A total of 38,862 patients were included. Initial treatments for 1998 versus 2012 were: EBRT alone = 50 versus 10%; CRT = 6 versus 50%; chemo alone = 1.6% (70% single-agent) versus 3.2% (94% single-agent). Among EBRT alone patients, use of HFRT (compared to CFRT) increased from 13 to 41%. Numerous factors predictive for utilization of CRT over EBRT alone and for HFRT over CFRT were identified. Median survival and 1-year overall survival were higher in the CRT versus EBRT alone group at 8.6 months vs. 5.1 months and 36.0 versus 15.7% (p < 0.0005 by log-rank, multivariable HR 0.65 [95% CI = 0.61-0.68, p < 0.0005], multivariable HR with propensity adjustment 0.66 [95% CI = 0.63-0.70, p < 0.0005]). For elderly GBM patients in the United States, CRT is the most common initial treatment and appears to offer a survival advantage over EBRT alone. Adoption of hypofractionation has increased over time but continues to be low.
Stinchcombe, Thomas E; Zhang, Ying; Vokes, Everett E; Schiller, Joan H; Bradley, Jeffrey D; Kelly, Karen; Curran, Walter J; Schild, Steven E; Movsas, Benjamin; Clamon, Gerald; Govindan, Ramaswamy; Blumenschein, George R; Socinski, Mark A; Ready, Neal E; Akerley, Wallace L; Cohen, Harvey J; Pang, Herbert H; Wang, Xiaofei
2017-09-01
Purpose: Concurrent chemoradiotherapy is standard treatment for patients with stage III non-small-cell lung cancer. Elderly patients may experience increased rates of adverse events (AEs) or less benefit from concurrent chemoradiotherapy. Patients and Methods: Individual patient data were collected from 16 phase II or III trials conducted by US National Cancer Institute-supported cooperative groups of concurrent chemoradiotherapy alone or with consolidation or induction chemotherapy for stage III non-small-cell lung cancer from 1990 to 2012. Overall survival (OS), progression-free survival, and AEs were compared between patients aged ≥ 70 years (elderly) and those younger than 70 years (younger). Unadjusted and adjusted hazard ratios (HRs) for survival time and CIs were estimated by single-predictor and multivariable frailty Cox models. Unadjusted and adjusted odds ratios (ORs) for AEs and CIs were obtained from single-predictor and multivariable generalized linear mixed-effects models. Results: A total of 2,768 patients were classified as younger and 832 as elderly. In unadjusted and multivariable models, elderly patients had worse OS (HR, 1.20; 95% CI, 1.09 to 1.31 and HR, 1.17; 95% CI, 1.07 to 1.29, respectively). In unadjusted and multivariable models, elderly and younger patients had similar progression-free survival (HR, 1.01; 95% CI, 0.93 to 1.10 and HR, 1.00; 95% CI, 0.91 to 1.09, respectively). Elderly patients had a higher rate of grade ≥ 3 AEs in unadjusted and multivariable models (OR, 1.35; 95% CI, 1.07 to 1.70 and OR, 1.38; 95% CI, 1.10 to 1.74, respectively). Grade 5 AEs were significantly higher in elderly compared with younger patients (9% v 4%; P < .01). Fewer elderly compared with younger patients completed treatment (47% v 57%; P < .01), and more discontinued treatment because of AEs (20% v 13%; P < .01), died during treatment (7.8% v 2.9%; P < .01), and refused further treatment (5.8% v 3.9%; P = .02). Conclusion: Elderly patients in concurrent chemoradiotherapy trials experienced worse OS, more toxicity, and had a higher rate of death during treatment than younger patients.
Rotating Night-Shift Work and Lung Cancer Risk Among Female Nurses in the United States
Schernhammer, Eva S.; Feskanich, Diane; Liang, Geyu; Han, Jiali
2013-01-01
The risk of lung cancer among night-shift workers is unknown. Over 20 years of follow-up (1988–2008), we documented 1,455 incident lung cancers among 78,612 women in the Nurses' Health Study. To examine the relationship between rotating night-shift work and lung cancer risk, we used multivariate Cox proportional hazard models adjusted for detailed smoking characteristics and other risk factors. We observed a 28% increased risk of lung cancer among women with 15 or more years spent working rotating night shifts (multivariate relative risk (RR) = 1.28, 95% confidence interval (CI): 1.07, 1.53; Ptrend = 0.03) compared with women who did not work any night shifts. This association was strongest for small-cell lung carcinomas (multivariate RR = 1.56, 95% CI: 0.99, 2.47; Ptrend = 0.03) and was not observed for adenocarcinomas of the lung (multivariate RR = 0.91, 95% CI: 0.67, 1.24; Ptrend = 0.40). Further, the increased risk associated with 15 or more years of rotating night-shift work was limited to current smokers (RR = 1.61, 95% CI: 1.21, 2.13; Ptrend < 0.001), with no association seen in nonsmokers (Pinteraction = 0.03). These results suggest that there are modestly increased risks of lung cancer associated with extended periods of night-shift work among smokers but not among nonsmokers. Though it is possible that this observation was residually confounded by smoking, our findings could also provide evidence of circadian disruption as a “second hit” in the etiology of smoking-related lung tumors. PMID:24049158
Protein carbamylation predicts mortality in ESRD.
Koeth, Robert A; Kalantar-Zadeh, Kamyar; Wang, Zeneng; Fu, Xiaoming; Tang, W H Wilson; Hazen, Stanley L
2013-04-01
Traditional risk factors fail to explain the increased risk for cardiovascular morbidity and mortality in ESRD. Cyanate, a reactive electrophilic species in equilibrium with urea, posttranslationally modifies proteins through a process called carbamylation, which promotes atherosclerosis. The plasma level of protein-bound homocitrulline (PBHCit), which results from carbamylation, predicts major adverse cardiac events in patients with normal renal function, but whether this relationship is similar in ESRD is unknown. We quantified serum PBHCit in a cohort of 347 patients undergoing maintenance hemodialysis with 5 years of follow-up. Kaplan-Meier analyses revealed a significant association between elevated PBHCit and death (log-rank P<0.01). After adjustment for patient characteristics, laboratory values, and comorbid conditions, the risk for death among patients with PBHCit values in the highest tertile was more than double the risk among patients with values in the middle tertile (adjusted hazard ratio [HR], 2.4; 95% confidence interval [CI], 1.5-3.9) or the lowest tertile (adjusted HR, 2.3; 95% CI, 1.5-3.7). Including PBHCit significantly improved the multivariable model, with a net reclassification index of 14% (P<0.01). In summary, serum PBHCit, a footprint of protein carbamylation, predicts increased cardiovascular risk in patients with ESRD, supporting a mechanistic link among uremia, inflammation, and atherosclerosis.
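The net reclassification index cited above quantifies how often adding a marker moves individuals into more appropriate risk categories. A small sketch of the categorical NRI (proportion reclassified upward minus downward among events, plus downward minus upward among non-events) on toy data is shown below; it illustrates the general definition rather than the exact implementation used in the study.

```python
import numpy as np

def categorical_nri(old_cat, new_cat, event):
    """Categorical net reclassification index.
    old_cat/new_cat: integer risk categories without and with the new marker."""
    old_cat, new_cat, event = map(np.asarray, (old_cat, new_cat, event))
    up, down = new_cat > old_cat, new_cat < old_cat
    ev, ne = event == 1, event == 0
    nri_events = up[ev].mean() - down[ev].mean()        # events should move up
    nri_nonevents = down[ne].mean() - up[ne].mean()     # non-events should move down
    return nri_events + nri_nonevents

# Toy example: 0/1/2 = low/intermediate/high predicted risk.
old = [0, 1, 1, 2, 0, 1, 2, 0]
new = [1, 1, 2, 2, 0, 0, 2, 0]
died = [1, 0, 1, 1, 0, 0, 1, 0]
print(categorical_nri(old, new, died))
```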
Hruby, Adela; Bulathsinhala, Lakmini; McKinnon, Craig J.; Hill, Owen T.; Montain, Scott J.; Young, Andrew J.; Smith, Tracey J.
2017-01-01
Individuals entering US Army service are generally young and healthy, but many are overweight, which may impact cardiometabolic risk despite physical activity and fitness requirements. This analysis examines the association between Soldiers’ BMI at accession and incident cardiometabolic risk factors (CRF) using longitudinal data from 731,014 Soldiers (17.0% female; age: 21.6 [3.9] years; BMI: 24.7 [3.8] kg/m2) who were assessed at Army accession, 2001–2011. CRF were defined as incident diagnoses through 2011, by ICD-9 code, of metabolic syndrome, glucose/insulin disorder, hypertension, dyslipidemia, or overweight/obesity (in those not initially overweight/obese). Multivariable-adjusted proportional hazards models were used to estimate hazard ratios (HR) and 95% confidence intervals (CI) between BMI categories at accession and CRF. At accession, 2.4% of Soldiers were underweight (BMI <18.5 kg/m2), 53.5% were normal weight (18.5−<25), 34.2% were overweight (25−<30), and 10.0% were obese (≥30). Mean age at CRF diagnosis ranged from 24 to 29 years, with generally low CRF incidence: 228 with metabolic syndrome, 3,880 with a glucose/insulin disorder, 26,373 with hypertension, and 13,404 with dyslipidemia. Of the Soldiers who were not overweight or obese at accession, 5,361 were eventually diagnosed as overweight or obese. Relative to Soldiers who were normal weight at accession, those who were overweight or obese, respectively, had significantly higher risk of developing each CRF after multivariable adjustment (HR [95% CI]: metabolic syndrome: 4.13 [2.87–5.94], 13.36 [9.00–19.83]; glucose/insulin disorder: 1.39 [1.30–1.50], 2.76 [2.52–3.04]; hypertension: 1.85 [1.80–1.90], 3.31 [3.20–3.42]; dyslipidemia: 1.81 [1.75–1.89], 3.19 [3.04–3.35]). Risk of hypertension, dyslipidemia, and overweight/obesity in initially underweight Soldiers was 40%, 31%, and 79% lower, respectively, versus normal-weight Soldiers. BMI in early adulthood has important implications for cardiometabolic health, even within young, physically active populations. PMID:28095509
Ascencio-Montiel, Iván de Jesús; Kumate-Rodríguez, Jesús; Borja-Aburto, Víctor Hugo; Fernández-Garate, José Esteban; Konik-Comonfort, Selene; Macías-Pérez, Oliver; Campos-Hernández, Ángel; Rodríguez-Vázquez, Héctor; López-Roldán, Verónica Miriam; Zitle-García, Edgar Jesús; Solís-Cruz, María Del Carmen; Velázquez-Ramírez, Ismael; Aguilar-Jiménez, Miriam; Villa-Caballero, Leonel; Cisneros-González, Nelly
2016-09-01
Permanent occupational disability, which prevents economically active individuals from performing their usual work, is one of the most severe consequences of diabetes. Survival rates and worker compensation expenses have not previously been examined among Mexican workers. We aimed to describe the worker compensation expenses derived from pension payments and also to examine the survival rates and characteristics associated with all-cause mortality, in a cohort of 34,014 Mexican workers with permanent occupational disability caused by diabetes during the years 2000-2013 at the Mexican Institute of Social Security. A cross-sectional analysis was conducted using national administrative records data from the entire country, regarding permanent occupational disability medical certification, pension payment and vital status. Survival rates were estimated using the Kaplan-Meier method. A multivariate Cox proportional hazards model was used to estimate adjusted hazard ratios (HR) and 95% confidence intervals (95% CI) in order to assess the cohort characteristics and all-cause mortality risk. Total expenses derived from pension payments for the period were accounted for in U.S. dollars (USD, 2013). There were 12,917 deaths in 142,725.1 person-years. Median survival time was 7.26 years. After multivariate-adjusted analysis, males (HR, 1.39; 95% CI, 1.29-1.50), agricultural, forestry, and fishery workers (HR, 1.41; 95% CI, 1.15-1.73) and renal complications (HR, 3.49; 95% CI, 3.18-3.83) had the highest association with all-cause mortality. The all-period expenses derived from pension payments amounted to $777.78 million USD (2013), and showed a sustained increment: from $58.28 million USD in 2000 to $111.62 million USD in 2013 (percentage increase of 91.5%). Mexican workers with permanent occupational disability caused by diabetes had a median survival of 7.26 years, and those with renal complications showed the lowest survival in the cohort. Expenses derived from pension payments amounted to $777 million USD and showed an important increase from 2000 to 2013.
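The survival quantities reported above come from standard estimators. Below is a minimal sketch, assuming the Python lifelines package and simulated follow-up data, of the Kaplan-Meier median survival time and a crude mortality rate per 100 person-years.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 500
death = rng.exponential(9, n)                  # latent time from disability to death (years)
censor = rng.uniform(1, 14, n)                 # administrative censoring time (years)
time_years = np.minimum(death, censor)         # observed follow-up
died = (death <= censor).astype(int)           # 1 = death observed, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(time_years, event_observed=died)
print("median survival (years):", kmf.median_survival_time_)

# Crude all-cause mortality rate per 100 person-years.
person_years = time_years.sum()
rate = died.sum() / person_years * 100
print("deaths per 100 person-years:", round(rate, 2))
```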
Wu, Jason H Y; Lemaitre, Rozenn N; Imamura, Fumiaki; King, Irena B; Song, Xiaoling; Spiegelman, Donna; Siscovick, David S; Mozaffarian, Dariush
2011-08-01
De novo lipogenesis (DNL) is an endogenous pathway whereby carbohydrates and proteins are converted to fatty acids. DNL could affect coronary heart disease (CHD) or sudden cardiac arrest (SCA) via generation of specific fatty acids. Whether these fatty acids are prospectively associated with SCA or other CHD events is unknown. The objective was to investigate the relations of 4 fatty acids in the DNL pathway-palmitic acid (16:0), palmitoleic acid (16:1n-7), 7-hexadecenoic acid (16:1n-9), and cis-vaccenic acid (18:1n-7)-with incident CHD, including fatal CHD, nonfatal myocardial infarction (NFMI), and SCA. A community-based prospective study was conducted in 2890 men and women aged ≥65 y, who were free of known CHD at baseline and who were followed from 1992 to 2006. Cardiovascular disease risk factors and plasma phospholipid fatty acids were measured at baseline by using standardized methods. Incident CHD was ascertained prospectively and was centrally adjudicated by using medical records. Risk was assessed by using multivariable-adjusted Cox proportional hazards. During 29,835 person-years of follow-up, 631 CHD and 71 SCA events occurred. Both 18:1n-7 and 16:1n-9 were associated with a higher risk of SCA [multivariable-adjusted hazard ratio (95% CI) for the interquintile range: 7.63 (2.58, 22.6) for 18:1n-7 and 2.30 (1.16, 4.55) for 16:1n-9] but not of total CHD, fatal CHD, or NFMI. In secondary analyses censored to mid-follow-up (7 y) to minimize the effects of changes in concentrations over time, 16:1n-9 was also associated with a significantly higher risk of total CHD (2.11; 1.76, 2.54), including a higher risk of CHD death, NFMI, and SCA; 16:0 and 16:1n-7 were not associated with clinical CHD outcomes. Higher plasma phospholipid 18:1n-7 and 16:1n-9 concentrations were prospectively associated with an elevated risk of SCA but not of other CHD events, except in secondary analyses.
Tsao, Connie W; Gona, Philimon; Salton, Carol; Murabito, Joanne M; Oyama, Noriko; Danias, Peter G; O'Donnell, Christopher J; Manning, Warren J; Yeon, Susan B
2011-08-01
We aimed to determine the relationships between resting left ventricular (LV) wall motion abnormalities (WMAs), aortic plaque, and peripheral artery disease (PAD) in a community cohort. A total of 1726 Framingham Heart Study Offspring Cohort participants (806 males, 65 ± 9 years) underwent cardiovascular magnetic resonance with quantification of aortic plaque volume and assessment of regional left ventricular systolic function. Claudication, lower extremity revascularization, and ankle-brachial index (ABI) were recorded at the most contemporaneous examination visit. WMAs were associated with greater aortic plaque burden, decreased ABI, and claudication in age- and sex-adjusted analyses (all p < 0.001), associations that were no longer significant after adjustment for cardiovascular risk factors. In age- and sex-adjusted analyses, both the presence (p < 0.001) and volume of aortic plaque were associated with decreased ABI (p < 0.001). After multivariable adjustment, an ABI ≤ 0.9 or prior revascularization was associated with threefold higher odds of aortic plaque (p = 0.0083). Plaque volume significantly increased with decreasing ABI in multivariable-adjusted analyses (p < 0.0001). In this free-living population, associations of WMAs with aortic plaque burden and clinical measures of PAD were attenuated after adjustment for coronary heart disease risk factors. Aortic plaque volume and ABI remained strongly negatively correlated after multivariable adjustment. Our findings suggest that the association between coronary heart disease and non-coronary atherosclerosis is explained by cardiovascular risk factors. Aortic atherosclerosis and PAD remain strongly associated after multivariable adjustment, suggesting shared mechanisms beyond those captured by traditional risk factors.
Olsen, Anne-Marie Schjerning; Fosbøl, Emil L; Lindhardsen, Jesper; Folke, Fredrik; Charlot, Mette; Selmer, Christian; Bjerring Olesen, Jonas; Lamberts, Morten; Ruwald, Martin H; Køber, Lars; Hansen, Peter R; Torp-Pedersen, Christian; Gislason, Gunnar H
2012-10-16
The cardiovascular risk after the first myocardial infarction (MI) declines rapidly during the first year. We analyzed whether the cardiovascular risk associated with use of nonsteroidal anti-inflammatory drugs (NSAIDs) varied with the time elapsed following first-time MI. We identified patients aged 30 years or older admitted with first-time MI in 1997 to 2009 and subsequent NSAID use by individual-level linkage of nationwide registries of hospitalization and drug dispensing from pharmacies in Denmark. We calculated the incidence rates of death and a composite end point of coronary death or nonfatal recurrent MI associated with NSAID use in 1-year time intervals up to 5 years after inclusion and analyzed risk by using multivariable adjusted time-dependent Cox proportional hazards models. Of the 99 187 patients included, 43 608 (44%) were prescribed NSAIDs after the index MI. There were 36 747 deaths and 28 693 coronary deaths or nonfatal recurrent MIs during the 5 years of follow-up. Relative to noncurrent treatment with NSAIDs, the use of any NSAID in the years following MI was persistently associated with an increased risk of death (hazard ratio, 1.59 [95% confidence interval, 1.49-1.69] after 1 year and hazard ratio, 1.63 [95% confidence interval, 1.52-1.74] after 5 years) and of coronary death or nonfatal recurrent MI (hazard ratio, 1.30 [95% confidence interval, 1.22-1.39] and hazard ratio, 1.41 [95% confidence interval, 1.28-1.55], respectively). The use of NSAIDs is associated with persistently increased coronary risk regardless of time elapsed after first-time MI. We advise long-term caution in the use of NSAIDs for patients after MI.
Devillier, Raynier; Dalle, Jean-Hugues; Kulasekararaj, Austin; D'aveni, Maud; Clément, Laurence; Chybicka, Alicja; Vigouroux, Stéphane; Chevallier, Patrice; Koh, Mickey; Bertrand, Yves; Michallet, Mauricette; Zecca, Marco; Yakoub-Agha, Ibrahim; Cahn, Jean-Yves; Ljungman, Per; Bernard, Marc; Loiseau, Pascale; Dubois, Valérie; Maury, Sébastien; Socié, Gérard; Dufour, Carlo; Peffault de Latour, Regis
2016-07-01
Unrelated allogeneic transplantation for severe aplastic anemia is a treatment option after immunosuppressive treatment failure in the absence of a matched sibling donor. Age, delay between disease diagnosis and transplantation, and HLA matching are the key factors in transplantation decisions, but their combined impact on patient outcomes remains unclear. Using the French Society of Bone Marrow Transplantation and Cell Therapies registry, we analyzed all consecutive patients (n=139) who underwent a first allogeneic transplantation for idiopathic severe aplastic anemia from an unrelated donor between 2000 and 2012. In an adjusted multivariate model, age over 30 years (Hazard Ratio=2.39; P=0.011), time from diagnosis to transplantation over 12 months (Hazard Ratio=2.18; P=0.027) and the use of a 9/10 mismatched unrelated donor (Hazard Ratio=2.14; P=0.036) were independent risk factors that significantly worsened overall survival. Accordingly, we built a predictive score using these three parameters, considering patients at low (zero or one risk factor, n=94) or high (two or three risk factors, n=45) risk. High-risk patients had significantly shorter survival (Hazard Ratio=3.04; P<0.001). The score was then confirmed on an independent cohort of 296 patients from the European Group for Blood and Marrow Transplantation database, with shorter survival in patients with at least 2 risk factors (Hazard Ratio=2.13; P=0.005). In conclusion, a simple score using age, transplantation timing and HLA matching would appear useful to help physicians in the daily care of patients with severe aplastic anemia. Copyright© Ferrata Storti Foundation.
A multivariate copula-based framework for dealing with hazard scenarios and failure probabilities
NASA Astrophysics Data System (ADS)
Salvadori, G.; Durante, F.; De Michele, C.; Bernardi, M.; Petrella, L.
2016-05-01
This paper is methodological in nature and deals with the foundations of Risk Assessment. Several international guidelines have recently recommended selecting appropriate/relevant Hazard Scenarios in order to tame the consequences of (extreme) natural phenomena. In particular, the scenarios should be multivariate, i.e., they should take into account the fact that several variables, generally not independent, may be of interest. In this work, it is shown how a Hazard Scenario can be identified in terms of (i) a specific geometry and (ii) a suitable probability level. Several scenarios, as well as a Structural approach, are presented, and due comparisons are carried out. In addition, it is shown how the Hazard Scenario approach illustrated here is well suited to cope with the notion of Failure Probability, a tool traditionally used for design and risk assessment in engineering practice. All the results outlined throughout the work are based on Copula Theory, which turns out to be a fundamental theoretical apparatus for multivariate risk assessment: formulas for the calculation of the probability of Hazard Scenarios in the general multidimensional case (d≥2) are derived, and noteworthy analytical relationships among the probabilities of occurrence of Hazard Scenarios are presented. In addition, the Extreme Value and Archimedean special cases are dealt with, relationships between dependence ordering and scenario levels are studied, and a counter-example concerning Tail Dependence is shown. Suitable indications for the practical application of the techniques outlined in the work are given, and two case studies illustrate the procedures discussed in the paper.
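As a hedged numerical sketch of the kind of calculation the paper formalizes (not the authors' code), the probabilities of bivariate "OR" and "AND" Hazard Scenarios can be written directly in terms of a copula C and the marginal probability levels u = F_X(x) and v = F_Y(y); the example below uses a Gumbel (Archimedean, Extreme Value) copula with an illustrative dependence parameter:

    import numpy as np

    def gumbel_copula(u, v, theta):
        """Gumbel copula C(u, v) with dependence parameter theta >= 1."""
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    # Marginal non-exceedance probabilities of the two hazard variables
    # (e.g., flood peak and volume) at a chosen design level; illustrative values.
    u, v, theta = 0.98, 0.95, 2.0
    C = gumbel_copula(u, v, theta)

    p_or = 1.0 - C               # "OR" scenario: at least one variable exceeds its threshold
    p_and = 1.0 - u - v + C      # "AND" scenario: both variables exceed their thresholds

    print(f"P(X > x or Y > y)  = {p_or:.4f}")
    print(f"P(X > x and Y > y) = {p_and:.4f}")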
McCullough, Marjorie L; Jacobs, Eric J; Shah, Roma; Campbell, Peter T; Wang, Ying; Hartman, Terryl J; Gapstur, Susan M
2018-01-01
Prospective cohort studies suggest that red and processed meat consumption is associated with increased risk of pancreatic cancer among men, but not women. However, evidence is limited, and less evidence exists for other types of meat. Cox proportional hazards regression was used to estimate multivariable-adjusted hazard ratios (HR) for the association of meat consumption, by type, with pancreatic cancer risk among 138,266 men and women in the Cancer Prevention Study-II Nutrition Cohort. Diet was assessed at baseline in 1992, and 10 years earlier, at enrollment into the parent CPS-II mortality cohort. 1,156 pancreatic cancers were verified through 2013. Red meat, processed meat, and fish intake at baseline were not associated with pancreatic cancer risk. However, for long-term red and processed meat consumption (highest quartiles in 1982 and 1992, vs. lowest quartiles), risk appeared different in men [hazard ratio (HR) 1.32, 95% confidence interval (CI) 0.90, 1.95] and women (HR 0.72, 95% CI 0.47, 1.10, p heterogeneity by sex = 0.05). Poultry consumption in 1992 was associated with increased pancreatic cancer risk (HR 1.27, 95% CI 1.04, 1.55, p trend = 0.01, top vs. bottom quintile). The associations of meat consumption with pancreatic cancer risk remain unclear and further research, particularly of long-term intake, is warranted.
Is Genetic Background Important in Lung Cancer Survival?
Lindström, Linda S.; Hall, Per; Hartman, Mikael; Wiklund, Fredrik; Czene, Kamila
2009-01-01
Background: In lung cancer, patient survival is poor, with wide variation in survival within each stage of disease. The aim of this study was to investigate the familial concordance in lung cancer survival by means of analyses of pairs with different degrees of familial relationship. Methods: Our population-based Swedish family database included three million families and over 58 100 lung cancer patients. We modelled proband (parent, sibling, spouse) survival using a multivariate proportional hazards (Cox) model adjusting for possible confounders of survival. Subsequently, survival in the proband's relative (child, sibling, spouse) was analysed with a Cox model. Findings: Using Cox modelling with 5 years of follow-up, we noted a decreased hazard ratio for death in children with good parental survival (Hazard Ratio [HR] = 0.71, 95% CI = 0.51 to 0.99), compared to those with poor parental survival. For siblings, a very strong protective effect was also seen (HR = 0.14, 95% CI = 0.030 to 0.65). Finally, no correlation in survival was found in spouses. Interpretation: Our findings suggest that genetic factors are important in lung cancer survival. In a clinical setting, information on prognosis in a relative may be vital in foreseeing the survival of an individual newly diagnosed with lung cancer. Future molecular studies enhancing the understanding of the underlying mechanisms and pathways are needed. PMID:19478952
Marshall, B.D.L.; Grafstein, E.; Buxton, J.A.; Qi, J.; Wood, E.; Shoveller, J.A.; Kerr, T.
2011-01-01
Objectives: Methamphetamine (MA) use has been associated with health problems that commonly present in the emergency department (ED). This study sought to determine whether frequent MA injection was a risk factor for ED utilization among street-involved youth. Study design: Prospective cohort study. Methods: Data were derived from a street-involved youth cohort known as the ‘At Risk Youth Study’. Behavioural data including MA use were linked to ED records at a major inner-city hospital. Kaplan-Meier and Cox proportional hazards methods were used to determine the risk factors for ED utilization. Results: Between September 2005 and January 2007, 427 eligible participants were enrolled, among whom the median age was 21 (interquartile range 19–23) years and 154 (36.1%) were female. Within 1 year, 163 (38.2%) visited the ED, resulting in an incidence density of 53.7 per 100 person-years. ED utilization was significantly higher among frequent (i.e. ≥daily) MA injectors (log-rank P=0.004). In multivariate analysis, frequent MA injection was associated with an increased hazard of ED utilization (adjusted hazard ratio=1.84, 95% confidence interval 1.04–3.25; P=0.036). Conclusions: Street-involved youth who frequently inject MA appear to be at increased risk of ED utilization. The integration of MA-specific addiction treatment services within emergency care settings for high-risk youth is recommended. PMID:22133669
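A hedged sketch of the two quantities reported above, the crude incidence density and the log-rank comparison between frequent and non-frequent MA injectors (illustrative only; the file and the columns person_years, ed_visit and frequent_ma are hypothetical):

    import pandas as pd
    from lifelines.statistics import logrank_test

    df = pd.read_csv("arys_cohort.csv")  # assumed columns: person_years, ed_visit, frequent_ma

    # Crude incidence density of ED utilization per 100 person-years.
    rate = 100.0 * df["ed_visit"].sum() / df["person_years"].sum()
    print(f"Incidence density: {rate:.1f} per 100 person-years")

    # Log-rank test comparing time to first ED visit by MA-injection frequency.
    g1 = df[df["frequent_ma"] == 1]
    g0 = df[df["frequent_ma"] == 0]
    result = logrank_test(
        g1["person_years"], g0["person_years"],
        event_observed_A=g1["ed_visit"], event_observed_B=g0["ed_visit"],
    )
    print("Log-rank p-value:", result.p_value)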
Alcohol consumption and risk of cardiovascular disease among hypertensive women.
Bos, Sarah; Grobbee, Diederick E; Boer, Jolanda M A; Verschuren, W Monique; Beulens, Joline W J
2010-02-01
This study investigated the relation between alcohol consumption and the risk of cardiovascular disease (CVD) among 10,530 hypertensive women from the EPIC-NL cohort. Alcohol consumption was assessed using a validated food-frequency questionnaire and participants were followed for occurrence of CVD. During 9.4 years of follow-up, we documented 580 coronary heart disease (CHD) events and 254 strokes, 165 of which were ischemic. An inverse association (Ptrend=0.009) between alcohol consumption and risk of CHD was observed, with a multivariate-adjusted hazard ratio of 0.72 (95% confidence interval: 0.52-1.01) for those consuming 70-139.9 g alcohol/week compared to lifetime abstainers. Of the different beverages, only red wine consumption was associated with a reduced risk of CHD. A U-shaped relation (P=0.08) was observed for total stroke, with a hazard ratio of 0.65 (0.44-0.95) for consumption of 5-69.9 g alcohol/week compared with lifetime abstainers. Similar results were observed for ischemic stroke, with a hazard ratio of 0.56 (0.35-0.89) for consumption of 5-69.9 g alcohol/week. We conclude that moderate alcohol consumption is associated with a reduced risk of CHD among hypertensive women. Light alcohol consumption tended to be related to a lower risk of stroke. Current guidelines for alcohol consumption in the general population also apply to hypertensive women.
Ma, Lijun; Langefeld, Carl D.; Comeau, Mary E.; Bonomo, Jason A.; Rocco, Michael V.; Burkart, John M.; Divers, Jasmin; Palmer, Nicholette D.; Hicks, Pamela J.; Bowden, Donald W.; Lea, Janice P.; Krisher, Jenna O.; Clay, Margo J.; Freedman, Barry I.
2016-01-01
Relative to European Americans, evidence supports that African Americans with end-stage renal disease (ESRD) survive longer on dialysis. Renal-risk variants in the apolipoprotein L1 gene (APOL1), associated with non-diabetic nephropathy and less subclinical atherosclerosis, may contribute to dialysis outcomes. Here, APOL1 renal-risk variants were assessed for association with dialytic survival in 450 diabetic and 275 non-diabetic African American hemodialysis patients from Wake Forest and Emory School of Medicine outpatient facilities. Outcomes were provided by the ESRD Network 6-Southeastern Kidney Council Standardized Information Management System. Dates of death, receipt of a kidney transplant, and loss to follow-up were recorded. Outcomes were censored at the date of transplantation or through July 1, 2015. Multivariable Cox proportional hazards models were computed separately in patients with non-diabetic and diabetic ESRD, adjusting for the covariates age, gender, comorbidities, ancestry, and presence of an arteriovenous fistula or graft at dialysis initiation. In non-diabetic ESRD, patients with two (vs. zero/one) APOL1 renal-risk variants had significantly longer dialysis survival (hazard ratio 0.57); a pattern not observed in patients with diabetes-associated ESRD (hazard ratio 1.29). Thus, two APOL1 renal-risk variants are associated with longer dialysis survival in African Americans without diabetes, potentially relating to presence of renal-limited disease or less atherosclerosis. PMID:27157696
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sole, Claudio V., E-mail: csole@iram.cl; School of Medicine, Complutense University, Madrid; Calvo, Felipe A.
Purpose: To assess long-term outcomes and toxicity of intraoperative electron-beam radiation therapy (IOERT) in the management of pediatric patients with Ewing sarcomas (EWS) and rhabdomyosarcomas (RMS). Methods and Materials: Seventy-one sarcoma (EWS n=37, 52%; RMS n=34, 48%) patients underwent IOERT for primary (n=46, 65%) or locally recurrent sarcomas (n=25, 35%) from May 1983 to November 2012. Local control (LC), overall survival (OS), and disease-free survival were estimated using Kaplan-Meier methods. For survival outcomes, potential associations were assessed in univariate and multivariate analyses using the Cox proportional hazards model. Results: After a median follow-up of 72 months (range, 4-310 months), 10-year LC, disease-free survival, and OS were 74%, 57%, and 68%, respectively. In multivariate analysis after adjustment for other covariates, disease status (P=.04 and P=.05) and R1 margin status (P<.01 and P=.04) remained significantly associated with LC and OS. Nine patients (13%) reported severe chronic toxicity events (all grade 3). Conclusions: A multimodal IOERT-containing approach is a well-tolerated component of treatment for pediatric EWS and RMS patients, allowing reduction or substitution of external beam radiation exposure while maintaining high local control rates.
Primary Surgery vs Radiotherapy for Early Stage Oral Cavity Cancer.
Ellis, Mark A; Graboyes, Evan M; Wahlquist, Amy E; Neskey, David M; Kaczmar, John M; Schopper, Heather K; Sharma, Anand K; Morgan, Patrick F; Nguyen, Shaun A; Day, Terry A
2018-04-01
Objective: The goal of this study is to determine the effect of primary surgery vs radiotherapy (RT) on overall survival (OS) in patients with early stage oral cavity squamous cell carcinoma (OCSCC). In addition, this study attempts to identify factors associated with receiving primary RT. Study Design: Retrospective cohort study. Setting: National Cancer Database (NCDB, 2004-2013). Subjects and Methods: Patients with early stage I to II OCSCC were identified in the NCDB from 2004 to 2013. Kaplan-Meier estimates of survival, Cox regression analysis, and propensity score matching were used to examine differences in OS between primary surgery and primary RT. Multivariable logistic regression analysis was performed to identify factors associated with primary RT. Results: Of the 20,779 patients included in the study, 95.4% (19,823 patients) underwent primary surgery and 4.6% (956 patients) underwent primary RT. After adjusting for covariates, primary RT was associated with an increased risk of mortality (adjusted hazard ratio [aHR], 1.97; 99% confidence interval [CI], 1.74-2.22). On multivariable analysis, factors associated with primary RT included age ≥70 years, black race, Medicaid or Medicare insurance, no insurance, oral cavity subsite other than tongue, clinical stage II disease, low-volume treatment facilities, and earlier treatment year. Conclusion: Primary RT for early stage OCSCC is associated with increased mortality. Approximately 5% of patients receive primary RT; however, this percentage is decreasing. Patients at highest risk for receiving primary RT include those who are elderly, black, publicly insured, and treated at low-volume facilities.
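The survival comparison above combines Cox regression with propensity score matching. A minimal, hedged sketch of 1:1 nearest-neighbor matching on the propensity score (illustrative only, not the study's specification; the file, the treatment column primary_rt, and the covariate list are hypothetical, and matching is done with replacement for simplicity):

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    df = pd.read_csv("ncdb_early_ocscc.csv")
    covars = ["age", "female", "stage_ii", "comorbidity_score"]  # hypothetical covariates

    # 1. Estimate the propensity score: probability of receiving primary RT.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["primary_rt"])
    df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

    # 2. Match each primary-RT patient to the nearest surgery patient on the score.
    treated = df[df["primary_rt"] == 1]
    control = df[df["primary_rt"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # 3. The matched cohort can then be passed to a Cox model of overall survival.
    print(matched.groupby("primary_rt")["ps"].describe())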
Novel immunological and nutritional-based prognostic index for gastric cancer.
Sun, Kai-Yu; Xu, Jian-Bo; Chen, Shu-Ling; Yuan, Yu-Jie; Wu, Hui; Peng, Jian-Jun; Chen, Chuang-Qi; Guo, Pi; Hao, Yuan-Tao; He, Yu-Long
2015-05-21
To assess the prognostic significance of immunological and nutritional-based indices, including the prognostic nutritional index (PNI), neutrophil-lymphocyte ratio (NLR), and platelet-lymphocyte ratio, in gastric cancer. We retrospectively reviewed 632 gastric cancer patients who underwent gastrectomy between 1998 and 2008. Areas under the receiver operating characteristic curve were calculated to compare the predictive ability of the indices, together with the sensitivity, specificity, and agreement rate. Univariate and multivariate analyses were performed to identify risk factors for overall survival (OS). Propensity score analysis was performed to adjust variables to control for selection bias. Each index could predict OS in gastric cancer patients in univariate analysis, but only PNI had independent prognostic significance in multivariate analysis before and after adjustment with propensity scoring (hazard ratio, 1.668; 95% confidence interval: 1.368-2.035). In subgroup analysis, a low PNI predicted a significantly shorter OS in patients with stage II-III disease (P = 0.019, P < 0.001), T3-T4 tumors (P < 0.001), or lymph node metastasis (P < 0.001). The Canton score, a combination of PNI, NLR, and platelet count, was a better indicator for OS than PNI, with the largest area under the curve for 12-, 36-, and 60-mo OS and overall OS (P = 0.022, P = 0.030, P < 0.001, and P = 0.024, respectively). The maximum sensitivity, specificity, and agreement rate of the Canton score for predicting prognosis were 84.6%, 34.9%, and 70.1%, respectively. PNI is an independent prognostic factor for OS in gastric cancer. The Canton score can be a novel preoperative prognostic index in gastric cancer.
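A hedged sketch of how such immunological and nutritional indices are typically computed and compared by ROC analysis (illustrative only; the widely used Onodera formula for PNI is assumed rather than quoted from the paper, and the file and column names are hypothetical):

    import pandas as pd
    from sklearn.metrics import roc_auc_score

    df = pd.read_csv("gastric_cohort.csv")

    # Commonly used definitions (assumed):
    # PNI = 10 * albumin (g/dL) + 0.005 * total lymphocyte count (/mm^3)
    df["PNI"] = 10 * df["albumin_g_dl"] + 0.005 * df["lymphocytes_per_mm3"]
    df["NLR"] = df["neutrophils_per_mm3"] / df["lymphocytes_per_mm3"]
    df["PLR"] = df["platelets_per_mm3"] / df["lymphocytes_per_mm3"]

    # Compare discrimination for, e.g., death within 60 months (binary outcome).
    for index in ["PNI", "NLR", "PLR"]:
        score = -df[index] if index == "PNI" else df[index]  # low PNI = worse prognosis
        auc = roc_auc_score(df["death_60mo"], score)
        print(f"{index}: AUC = {auc:.3f}")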
Shehabi, Yahya; Bellomo, Rinaldo; Kadiman, Suhaini; Ti, Lian Kah; Howe, Belinda; Reade, Michael C; Khoo, Tien Meng; Alias, Anita; Wong, Yu-Lin; Mukhopadhyay, Amartya; McArthur, Colin; Seppelt, Ian; Webb, Steven A; Green, Maja; Bailey, Michael J
2018-06-01
In the absence of a universal definition of light or deep sedation, the level of sedation that conveys favorable outcomes is unknown. We quantified the relationship between escalating intensity of sedation in the first 48 hours of mechanical ventilation and 180-day survival, time to extubation, and delirium. We used harmonized data from prospective, multicenter, international longitudinal cohort studies conducted in a diverse mix of ICUs, including critically ill patients expected to be ventilated for longer than 24 hours. Richmond Agitation Sedation Scale and pain were assessed every 4 hours. Delirium and mobilization were assessed daily using the Confusion Assessment Method for the ICU and a standardized mobility assessment, respectively. Sedation intensity was assessed using a Sedation Index, calculated as the sum of negative Richmond Agitation Sedation Scale measurements divided by the total number of assessments. We used multivariable Cox proportional hazards models to adjust for relevant covariates. We performed subgroup and sensitivity analyses accounting for immortal time bias using the same variables within 120 and 168 hours. The main outcome was 180-day survival. We assessed 703 patients in 42 ICUs with a mean (SD) Acute Physiology and Chronic Health Evaluation II score of 22.2 (8.5); 180-day mortality was 32.3% (227 patients). The median (interquartile range) ventilation time was 4.54 days (2.47-8.43 d). Delirium occurred in 273 (38.8%) patients. Sedation intensity, in an escalating dose-dependent relationship, independently predicted an increased risk of death (hazard ratio [95% CI], 1.29 [1.15-1.46]; p < 0.001), delirium (hazard ratio [95% CI], 1.25 [1.10-1.43]; p = 0.001), and a reduced chance of early extubation (hazard ratio [95% CI], 0.80 [0.73-0.87]; p < 0.001). Agitation level independently predicted subsequent delirium (hazard ratio [95% CI], 1.25 [1.04-1.49]; p = 0.02). Delirium or mobilization episodes within 168 hours, adjusted for sedation intensity, were not associated with survival. Sedation intensity independently, in an ascending relationship, predicted an increased risk of death, delirium, and delayed time to extubation. These observations suggest that keeping the sedation level equivalent to a Richmond Agitation Sedation Scale score of 0 is a clinically desirable goal.
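The Sedation Index defined above (sum of the negative Richmond Agitation Sedation Scale measurements divided by the total number of assessments) is straightforward to compute; a minimal sketch, with an invented list of 4-hourly RASS scores standing in for real assessments and the sign convention following the abstract's wording literally:

    # Illustrative 4-hourly RASS assessments over the first 48 hours for one patient.
    rass = [-3, -4, -2, -1, 0, -1, -2, -3, 0, 1, -1, -2]

    # Sedation Index: sum of the negative RASS values divided by the total number
    # of assessments (per the definition in the abstract); more negative values
    # indicate deeper sedation.
    sedation_index = sum(score for score in rass if score < 0) / len(rass)
    print(f"Sedation Index: {sedation_index:.2f}")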
Herpes zoster correlates with pyogenic liver abscesses in Taiwan.
Mei-Ling, Shen; Kuan-Fu, Liao; Sung-Mao, Tsai; Cheng-Li, Lin Ms; Shih-Wei, Lai
2016-12-01
The purpose of the paper was to explore the relationship between herpes zoster and pyogenic liver abscesses in Taiwan. This was a nationwide cohort study. Using the database of the Taiwan National Health Insurance Program, 33,049 subjects aged 20-84 years newly diagnosed with herpes zoster from 1998 to 2010 were selected as the herpes zoster group, and 131,707 randomly selected subjects without herpes zoster formed the non-herpes zoster group. Both groups were matched by sex, age, other comorbidities, and the index year of the herpes zoster diagnosis. The incidence of pyogenic liver abscesses at the end of 2011 was then estimated. A multivariable Cox proportional hazards regression model was used to estimate the hazard ratio and 95% confidence interval for pyogenic liver abscesses associated with herpes zoster and other comorbidities. The overall incidence rate was 1.38-fold higher in the herpes zoster group than in the non-herpes zoster group (4.47 vs. 3.25 per 10000 person-years, 95% confidence interval 1.32, 1.44). After controlling for potential confounding factors, the adjusted hazard ratio of pyogenic liver abscesses was 1.34 in the herpes zoster group (95% confidence interval 1.05, 1.72) when compared with the non-herpes zoster group. Male sex, age, and the presence of biliary stones, chronic kidney disease, chronic liver disease, cancer, and diabetes mellitus were also significantly associated with pyogenic liver abscesses. Herpes zoster is associated with an increased hazard of developing pyogenic liver abscesses.
Hansen, Morten; Nyby, Sebastian; Eifer Møller, Jacob; Videbæk, Lars; Kassem, Moustapha; Barington, Torben; Thayssen, Per; Diederichsen, Axel Cosmus Pyndt
2014-01-01
Seven years ago, the DanCell study was carried out to test the hypothesis of improvement in left ventricular ejection fraction (LVEF) following repeated intracoronary injections of autologous bone marrow-derived stem cells (BMSCs) in patients suffering from chronic ischemic heart failure. In this post hoc analysis, the long-term effect of therapy is assessed. 32 patients [mean age 61 (SD ± 9), 81% males] with systolic dysfunction (LVEF 33 ± 9%) received two repeated intracoronary infusions (4 months apart) of autologous BMSCs (1,533 ± 765 × 10(6) BMSCs including 23 ± 11 × 10(6) CD34(+) cells and 14 ± 7 × 10(6) CD133(+) cells). Patients were followed for 7 years and deaths were recorded. During follow-up, 10 patients died (31%). In univariate regression analysis, the total number of BMSCs, CD34(+) cell count and CD133(+) cell count did not significantly correlate with survival (hazard ratio: 0.999, 95% CI: 0.998-1.000, p = 0.24; hazard ratio: 0.94, 95% CI: 0.88-1.01, p = 0.10, and hazard ratio: 0.96, 95% CI: 0.87-1.07, p = 0.47, respectively). After adjustment for baseline variables in multivariate regression analysis, the CD34(+) cell count was significantly associated with survival (hazard ratio: 0.90, 95% CI: 0.82-1.00, p = 0.04). Intracoronary injections of a high number of CD34(+) cells may have a beneficial effect on chronic ischemic heart failure in terms of long-term survival.
Willey, Joshua; Gardener, Hannah; Cespedes, Sandino; Cheung, Ying K; Sacco, Ralph L; Elkind, Mitchell S V
2017-11-01
There is growing evidence that increased dietary sodium (Na) intake increases the risk of vascular diseases, including stroke, at least in part via an increase in blood pressure. Higher dietary potassium (K), seen with increased intake of fruits and vegetables, is associated with lower blood pressure. The goal of this study was to determine the association of the dietary Na:K ratio with risk of stroke in a multiethnic urban population. Stroke-free participants from the Northern Manhattan Study, a population-based cohort study of stroke incidence, were followed up for incident stroke. Baseline food frequency questionnaires were analyzed for Na and K intake. We estimated the hazard ratios and 95% confidence intervals for the association of Na:K with incident total stroke using multivariable Cox proportional hazards models. Among 2570 participants with dietary data (mean age, 69±10 years; 64% women; 21% white; 55% Hispanic; 24% black), the mean Na:K ratio was 1.22±0.43. Over a mean follow-up of 12 years, there were 274 strokes. In adjusted models, a higher Na:K ratio was associated with increased risk for stroke (hazard ratio, 1.6; 95% confidence interval, 1.2-2.1) and specifically ischemic stroke (hazard ratio, 1.6; 95% confidence interval, 1.2-2.1). The Na:K ratio is an independent predictor of stroke risk. Further studies are required to understand the joint effect of Na and K intake on risk of cardiovascular disease. © 2017 American Heart Association, Inc.
Kim, Seung Han; Chun, Hoon Jai; Yoo, In Kyung; Lee, Jae Min; Nam, Seung Joo; Choi, Hyuk Soon; Kim, Eun Sun; Keum, Bora; Seo, Yeon Seok; Jeen, Yoon Tae; Lee, Hong Sik; Um, Soon Ho; Kim, Chang Duck
2015-08-14
To investigate the predictive factors of self-expandable metallic stent patency after stent placement in patients with inoperable malignant gastroduodenal obstruction. A total of 116 patients underwent stent placements for inoperable malignant gastroduodenal obstruction at a tertiary academic center. Clinical success was defined as acceptable decompression of the obstructive lesion within the malignant gastroduodenal neoplasm. We evaluated patient comorbidities and clinical statuses using the World Health Organization's scoring system and categorized patient responses to chemotherapy using the Response Evaluation Criteria in Solid Tumors criteria. We analyzed the relationships between possible predictive factors and stent patency. Self-expandable metallic stent placement was technically successful in all patients (100%), and the clinical success rate was 84.2%. In a multivariate Cox proportional hazards model, carcinoembryonic antigen (CEA) levels were correlated with a reduction in stent patency [P = 0.006; adjusted hazard ratio (aHR) = 2.92, 95%CI: 1.36-6.25]. Palliative chemotherapy was statistically associated with an increase in stent patency (P = 0.009; aHR = 0.27, 95%CI: 0.10-0.72). CEA levels can easily be measured at the time of stent placement and may help clinicians to predict stent patency and determine the appropriate stent procedure.
Frederick, Wayne A I; Ames, Sarah; Downing, Stephanie R; Oyetunji, Tolulope A; Chang, David C; Leffall, Lasalle D
2010-06-01
Randomized clinical trials have not shown survival differences between breast cancer patients who undergo breast-conserving surgery and those who undergo modified radical mastectomy (MRM). Recent studies, however, have suggested that these randomized clinical trial findings may not be representative of the entire population or the nature of current patient care. A retrospective analysis of female invasive breast cancer patients who underwent surgery in the Surveillance, Epidemiology, and End Results database (1990-2003) was performed. Survival was compared amongst women who underwent partial mastectomy, partial mastectomy plus radiation (PMR), or MRM. Cox proportional hazards regressions were used to investigate the impact of method of treatment upon survival, after adjusting for patient and tumor characteristics. A total of 218,043 patients, mean age 62 years, were identified. MRM accounted for 51.5 per cent of the study population whereas PMR accounted for 34.9 per cent. On multivariate analysis, a significant improvement in patient survival was observed with PMR compared with MRM (hazard ratio = 0.71, 95% confidence interval = 0.67-0.74, P < 0.001). This population-based study suggests that there is a survival benefit for women undergoing PMR in the treatment of breast cancer.
Ochoa-Gondar, Olga; Vila-Corcoles, Angel; Ansa, Xavier; Rodriguez-Blanco, T; Salsench, Elisabeth; de Diego, Cinta; Raga, Xavier; Gomez, Frederic; Valdivieso, Empar; Fuentes, Cruzma; Palacios, Laura
2008-04-07
A prospective cohort study evaluating the clinical effectiveness of the 23-valent pneumococcal polysaccharide vaccine was conducted among 1298 Spanish older adults with chronic respiratory diseases (bronchitis, emphysema or asthma) who were followed between 2002 and 2005. Main outcomes were all-cause community-acquired pneumonia (CAP) and 30-day mortality from CAP. The association between vaccination and the risk of each outcome was evaluated by multivariable Cox proportional-hazards models adjusted for age and comorbidity. Pneumococcal vaccination did not significantly alter the risk of overall CAP (hazard ratio [HR]: 0.77; 95% confidence interval [CI]: 0.56-1.07) or 30-day mortality from CAP (HR: 0.87; 95% CI: 0.33-2.28). However, a borderline significant reduction of 30% in the risk of all-cause hospitalisation for CAP was observed among vaccinated subjects (HR: 0.70; 95% CI: 0.48-1.00; p=0.052). The effectiveness of the vaccine on the combined endpoint of pneumococcal and unknown organism infections reached 34% (HR: 0.66; 95% CI: 0.43-1.01; p=0.059). Although our findings suggest moderate benefits from the vaccination, the evidence of clinical effectiveness appears limited.
Olsen, Morten; Hjortdal, Vibeke E; Mortensen, Laust H; Christensen, Thomas D; Sørensen, Henrik T; Pedersen, Lars
2011-04-01
Congenital heart defect patients may experience neurodevelopmental impairment. We investigated their educational attainments from basic schooling to higher education. Using administrative databases, we identified all Danish patients with a cardiac defect diagnosis born from 1 January, 1977 to 1 January, 1991 and alive at age 13 years. As a comparison cohort, we randomly sampled 10 persons per patient. We obtained information on educational attainment from Denmark's Database for Labour Market Research. The study population was followed until achievement of educational levels, death, emigration, or 1 January, 2006. We estimated the hazard ratio of attaining given educational levels, conditional on completing preceding levels, using discrete-time Cox regression and adjusting for socio-economic factors. Analyses were repeated for a sub-cohort of patients and controls born at term and without extracardiac defects or chromosomal anomalies. We identified 2986 patients. Their probability of completing compulsory basic schooling was approximately 10% lower than that of control individuals (adjusted hazard ratio = 0.79; 95% confidence interval: 0.75-0.82). Their subsequent probability of completing secondary school was lower than that of the controls, both for all patients (adjusted hazard ratio = 0.74; 95% confidence interval: 0.69-0.80) and for the sub-cohort (adjusted hazard ratio = 0.80; 95% confidence interval: 0.73-0.86). The probability of attaining a higher degree, conditional on completion of youth education, was affected both for all patients (adjusted hazard ratio = 0.88; 95% confidence interval: 0.76-1.01) and for the sub-cohort (adjusted hazard ratio = 0.92; 95% confidence interval: 0.79-1.07). The probability of educational attainment was reduced among long-term congenital heart defect survivors.
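The analysis above uses discrete-time Cox regression for conditional educational attainment. A hedged sketch of the standard way such a model is fitted, by expanding the data to one row per person per year at risk and running a binary regression on the person-period file (illustrative only; all file and column names are hypothetical, and a complementary log-log link rather than the logit shown here would give the grouped proportional-hazards version):

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("education_cohort.csv")  # assumed: id, years_observed, completed, chd_patient

    # Expand to person-period format: one row per person per year at risk.
    rows = []
    for _, r in df.iterrows():
        for year in range(1, int(r["years_observed"]) + 1):
            rows.append({
                "id": r["id"],
                "year": year,
                "chd_patient": r["chd_patient"],
                # The event can only occur in the final observed period.
                "event": int(bool(r["completed"]) and year == int(r["years_observed"])),
            })
    pp = pd.DataFrame(rows)

    # Discrete-time hazard model with period effects.
    model = smf.logit("event ~ chd_patient + C(year)", data=pp).fit(disp=False)
    print(model.summary())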
Hicks, Caitlin W; Canner, Joseph K; Mathioudakis, Nestoras; Sherman, Ronald; Malas, Mahmoud B; Black, James H; Abularrage, Christopher J
2018-04-02
Previous studies have reported correlation between the Wound, Ischemia, and foot Infection (WIfI) classification system and wound healing time on unadjusted analyses. However, in the only multivariable analysis to date, WIfI stage was not predictive of wound healing. Our aim was to examine the association between WIfI classification and wound healing after risk adjustment in patients with diabetic foot ulcers (DFUs) treated in a multidisciplinary setting. All patients presenting to our multidisciplinary DFU clinic from June 2012 to July 2017 were enrolled in a prospective database. A Cox proportional hazards model accounting for patients' sociodemographics, comorbidities, medication profiles, and wound characteristics was used to assess the association between WIfI classification and likelihood of wound healing at 1 year. There were 310 DFU patients enrolled (mean age, 59.0 ± 0.7 years; 60.3% male; 60.0% black) with 709 wounds, including 32.4% WIfI stage 1, 19.9% stage 2, 25.2% stage 3, and 22.4% stage 4. Mean wound healing time increased with increasing WIfI stage (stage 1, 96.9 ± 8.3 days; stage 4, 195.1 ± 10.6 days; P < .001). Likelihood of wound healing at 1 year was 94.1% ± 2.0% for stage 1 wounds vs 67.4% ± 4.4% for stage 4 (P < .001). After risk adjustment, increasing WIfI stage was independently associated with poor wound healing (stage 4 vs stage 1: hazard ratio, [HR] 0.44; 95% confidence interval, 0.33-0.59). Peripheral artery disease (HR, 0.73), increasing wound area (HR, 0.99 per square centimeter), and longer time from wound onset to first assessment (HR, 0.97 per month) also decreased the likelihood of wound healing, whereas use of clopidogrel was protective (HR, 1.39; all, P ≤ .04). The top three predictors of poor wound healing were WIfI stage 4 (z score, -5.40), increasing wound area (z score, -3.14), and WIfI stage 3 (z score, -3.11), respectively. Among patients with DFU, the WIfI classification system predicts wound healing at 1 year in both crude and risk-adjusted analyses. This is the first study to validate the WIfI score as an independent predictor of wound healing using multivariable analysis. Copyright © 2018 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Patel, Ravi M; Knezevic, Andrea; Shenvi, Neeta; Hinkes, Michael; Keene, Sarah; Roback, John D; Easley, Kirk A; Josephson, Cassandra D
2016-03-01
Data regarding the contribution of red blood cell (RBC) transfusion and anemia to necrotizing enterocolitis (NEC) are conflicting. These associations have not been prospectively evaluated, accounting for repeated, time-varying exposures. To determine the relationship between RBC transfusion, severe anemia, and NEC. In a secondary, prospective, multicenter observational cohort study from January 2010 to February 2014, very low-birth-weight (VLBW, ≤1500 g) infants, within 5 days of birth, were enrolled at 3 level III neonatal intensive care units in Atlanta, Georgia. Two hospitals were academically affiliated and 1 was a community hospital. Infants received follow-up until 90 days, hospital discharge, transfer to a non-study-affiliated hospital, or death (whichever came first). Multivariable competing-risks Cox regression was used, including adjustment for birth weight, center, breastfeeding, illness severity, and duration of initial antibiotic treatment, to evaluate the association between RBC transfusion, severe anemia, and NEC. The primary exposure was RBC transfusion. The secondary exposure was severe anemia, defined a priori as a hemoglobin level of 8 g/dL or less. Both exposures were evaluated as time-varying covariates at weekly intervals. Necrotizing enterocolitis, defined as Bell stage 2 or greater by preplanned adjudication. Mortality was evaluated as a competing risk. Of 600 VLBW infants enrolled, 598 were evaluated. Forty-four (7.4%) infants developed NEC. Thirty-two (5.4%) infants died (all cause). Fifty-three percent of infants (319) received a total of 1430 RBC transfusion exposures. The unadjusted cumulative incidence of NEC at week 8 among RBC transfusion-exposed infants was 9.9% (95% CI, 6.9%-14.2%) vs 4.6% (95% CI, 2.6%-8.0%) among those who were unexposed. In multivariable analysis, RBC transfusion in a given week was not significantly related to the rate of NEC (adjusted cause-specific hazard ratio, 0.44 [95% CI, 0.17-1.12]; P = .09). Based on evaluation of 4565 longitudinal measurements of hemoglobin (median, 7 per infant), the rate of NEC was significantly increased among VLBW infants with severe anemia in a given week compared with those who did not have severe anemia (adjusted cause-specific hazard ratio, 5.99 [95% CI, 2.00-18.0]; P = .001). Among VLBW infants, severe anemia, but not RBC transfusion, was associated with an increased risk of NEC. Further studies are needed to evaluate whether preventing severe anemia is more important than minimizing RBC transfusion.
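The exposures here (transfusion and severe anemia) enter the model as time-varying covariates updated at weekly intervals, with mortality handled as a competing risk through cause-specific hazards. A hedged sketch of the time-varying part using the lifelines CoxTimeVaryingFitter (illustrative only; it expects a long-format table with one row per infant-week, the file and column names are hypothetical, and censoring at the competing event is used to approximate the cause-specific hazard):

    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # Long-format data: one row per infant per week at risk, with start/stop times,
    # time-varying exposures, and an event flag for NEC in that interval.
    long_df = pd.read_csv("vlbw_weekly.csv")
    # assumed columns: infant_id, start_day, stop_day, transfused_this_week,
    #                  severe_anemia (hemoglobin <= 8 g/dL), birth_weight, nec_event

    ctv = CoxTimeVaryingFitter()
    ctv.fit(
        long_df[["infant_id", "start_day", "stop_day", "transfused_this_week",
                 "severe_anemia", "birth_weight", "nec_event"]],
        id_col="infant_id",
        start_col="start_day",
        stop_col="stop_day",
        event_col="nec_event",
    )
    ctv.print_summary()  # cause-specific hazard ratios for the time-varying exposures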
A new multivariate zero-adjusted Poisson model with applications to biomedicine.
Liu, Yin; Tian, Guo-Liang; Tang, Man-Lai; Yuen, Kam Chuen
2018-05-25
Although advances have recently been made in modeling multivariate count data, existing models still have several limitations: (i) the multivariate Poisson log-normal model (Aitchison and Ho) cannot be used to fit multivariate count data with excess zero-vectors; (ii) the multivariate zero-inflated Poisson (ZIP) distribution (Li et al., 1999) cannot be used to model zero-truncated/deflated count data and is difficult to apply to high-dimensional cases; (iii) the Type I multivariate zero-adjusted Poisson (ZAP) distribution (Tian et al., 2017) can only model multivariate count data with a special correlation structure in which the random components are all positively or all negatively correlated. In this paper, we first introduce a new multivariate ZAP distribution, based on a multivariate Poisson distribution, which allows a more flexible dependency structure between components, in which some correlation coefficients can be positive while others are negative. We then develop its key distributional properties and provide efficient statistical inference methods for the multivariate ZAP model with or without covariates. Two real data examples in biomedicine are used to illustrate the proposed methods. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
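For contrast with the multivariate ZAP model proposed in the paper, a hedged sketch of fitting a plain univariate zero-inflated Poisson regression with statsmodels (this is not the authors' model or code; it only illustrates how excess zeros are handled in the simplest one-dimensional case, using simulated data):

    import numpy as np
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(1)
    n = 1000
    x = rng.normal(size=n)

    # Simulate zero-inflated Poisson counts: structural zeros with probability 0.3,
    # otherwise Poisson with log-mean 0.5 + 0.8 * x.
    structural_zero = rng.random(n) < 0.3
    y = np.where(structural_zero, 0, rng.poisson(np.exp(0.5 + 0.8 * x)))

    exog = np.column_stack([np.ones(n), x])  # count-model design matrix with intercept
    model = ZeroInflatedPoisson(y, exog, exog_infl=np.ones((n, 1)), inflation="logit")
    result = model.fit(maxiter=200, disp=False)
    print(result.summary())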
Tobias, Deirdre K; Hu, Frank B; Chavarro, Jorge; Rosner, Bernard; Mozaffarian, Dariush; Zhang, Cuilin
2012-11-12
Type 2 diabetes mellitus (T2DM) has reached epidemic proportions. Women with gestational diabetes mellitus (GDM) are at high risk for T2DM after pregnancy. Adherence to healthful dietary patterns has been inversely associated with T2DM in the general population; however, whether these dietary patterns are associated with progression to T2DM among a susceptible population is unknown. Four thousand four hundred thirteen participants from the Nurses' Health Study II cohort with prior GDM were followed up from 1991 to 2005. We derived the alternate Mediterranean diet (aMED), Dietary Approaches to Stop Hypertension (DASH), and alternate Healthy Eating Index (aHEI) dietary pattern adherence scores from a post-GDM validated food-frequency questionnaire, with cumulative average updating every 4 years. Multivariable Cox proportional hazards models estimated the relative risk (hazard ratios) and 95% confidence intervals. We observed 491 cases of incident T2DM during 52 743 person-years. All 3 patterns were inversely associated with T2DM risk with adjustment for age, total calorie intake, age at first birth, parity, ethnicity, parental diabetes, oral contraceptive use, menopause, and smoking. When we compared participants with the highest adherence (quartile 4) vs lowest (quartile 1), the aMED pattern was associated with 40% lower risk of T2DM (hazard ratio, 0.60 [95% CI, 0.44-0.82; P=.002]); the DASH pattern, with 46% lower risk (0.54 [0.39-0.73; P<.001]); and the aHEI pattern, with 57% lower risk (0.43 [0.31-0.59; P<.001]). Adjustment for body mass index moderately attenuated these findings. Adherence to healthful dietary patterns is associated with lower T2DM risk among women with a history of GDM. The inverse associations are partly mediated by body mass index.
The Association of Ursodeoxycholic Acid Use With Colorectal Cancer Risk
Huang, Wen-Kuan; Hsu, Hung-Chih; Liu, Jia-Rou; Yang, Tsai-Sheng; Chen, Jen-Shi; Chang, John Wen-Cheng; Lin, Yung-Chang; Yu, Kuang-Hui; Kuo, Chang-Fu; See, Lai-Chu
2016-01-01
Data from preclinical studies suggest that ursodeoxycholic acid (UDCA) has a chemopreventive effect on colorectal cancer (CRC) development, but no large observational study has examined this possibility. The aim of this study was to investigate the association of UDCA use with CRC risk in a nationwide population-based cohort. This nationwide population-based cohort study used data from the Taiwan National Health Insurance Research Database for the period from 2000 through 2010. This study included data from 7119 Taiwanese adults who received ≥28 cumulative defined daily doses (cDDDs) of UDCA and 14,238 patients who did not receive UDCA (<28 cDDDs). UDCA nonusers were matched 1:2 for age, sex, enrollment date, and presence of chronic liver disease, viral hepatitis, cholelithiasis, and alcoholic liver disease. The 2 cohorts were followed until December 31, 2010, or the occurrence of CRC. Cox proportional hazards regression with a robust sandwich variance estimator, which accommodates the matched design, was used to examine the association between UDCA use and CRC risk. During 109,312 person-years of follow-up (median, 5 years), 121 patients had newly diagnosed CRC: 28 UDCA users (76.7 per 100,000 person-years) and 93 nonusers (127.7 per 100,000 person-years) (log-rank test, P = 0.0169). After multivariate adjustment for age, UDCA use was associated with a reduced risk of CRC (hazard ratio, 0.60; 95% confidence interval [CI], 0.39–0.92). The adjusted hazard ratios were 0.55 (95% CI, 0.35–0.89), 0.89 (95% CI, 0.36–2.20), and 0.63 (95% CI, 0.16–2.53) for patients with 28 to 180, 181 to 365, and >365 cDDDs, respectively, relative to nonusers. UDCA use was associated with reduced risk of CRC in a cohort mainly comprising patients with chronic liver diseases. However, further studies are needed to determine the optimal dosage of UDCA. PMID:26986110
Whole Grain Consumption and Risk of Ischemic Stroke: Results From 2 Prospective Cohort Studies.
Juan, Juan; Liu, Gang; Willett, Walter C; Hu, Frank B; Rexrode, Kathryn M; Sun, Qi
2017-12-01
Higher intake of whole grains may exert cardiometabolic benefits, although findings on stroke risk are inconclusive. The potentially differential effects of individual whole grain foods on ischemic stroke have not been examined. We analyzed whole grain consumption in relation to ischemic stroke among 71 750 women from the Nurses' Health Study and 42 823 men from the Health Professionals Follow-up Study who were free of cardiovascular disease, diabetes mellitus, and cancer at baseline (1984 and 1986, respectively) through 2010 using a Cox proportional hazards model. Validated semiquantitative food frequency questionnaires were used to assess consumption of whole grain intake, including whole grain cold breakfast cereal, dark bread, oatmeal, brown rice, popcorn, bran, and germ. Self-reported incident cases of ischemic stroke were confirmed through medical record review. During 2 820 128 person-years of follow-up in the 2 cohorts, 2458 cases of ischemic stroke were identified and confirmed. Intake of total whole grains was not associated with risk of ischemic stroke after adjustment for covariates: the pooled hazard ratio (95% confidence interval) comparing extreme intake levels was 1.04 (0.91-1.19). However, intake of whole grain cold breakfast cereal and total bran was inversely associated with ischemic stroke after multivariate adjustment: the pooled hazard ratios (95% confidence intervals) were 0.88 (0.80-0.96; P trend =0.008) and 0.89 (0.79-1.00; P trend =0.004), respectively. Other whole grain foods were not associated with a lower risk of ischemic stroke. Although overall consumption of whole grains was not associated with lower risk of ischemic stroke, greater consumption of whole grain cold breakfast cereal and bran was significantly associated with a lower risk of ischemic stroke. More studies are needed to replicate these associations between individual whole grain foods and risk of ischemic stroke among other populations. © 2017 American Heart Association, Inc.
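The hazard ratios above are pooled across the two cohorts (Nurses' Health Study and Health Professionals Follow-up Study). A hedged sketch of the usual fixed-effect, inverse-variance pooling of cohort-specific log hazard ratios (illustrative numbers only, not the study's estimates):

    import numpy as np

    # Illustrative cohort-specific hazard ratios and 95% CIs (invented values).
    hrs = np.array([0.90, 0.86])
    lower = np.array([0.78, 0.72])
    upper = np.array([1.04, 1.03])

    log_hr = np.log(hrs)
    se = (np.log(upper) - np.log(lower)) / (2 * 1.96)  # SE recovered from the CI width

    w = 1 / se**2                                      # inverse-variance weights
    pooled_log_hr = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1 / np.sum(w))

    pooled_hr = np.exp(pooled_log_hr)
    ci = np.exp(pooled_log_hr + np.array([-1.96, 1.96]) * pooled_se)
    print(f"Pooled HR: {pooled_hr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")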
Bagai, Akshay; Peterson, Eric D; Honeycutt, Emily; Effron, Mark B; Cohen, David J; Goodman, Shaun G; Anstrom, Kevin J; Gupta, Anjan; Messenger, John C; Wang, Tracy Y
2015-12-01
While randomized clinical trials have compared clopidogrel with higher potency adenosine diphosphate (ADP) receptor inhibitors among patients with acute myocardial infarction, little is known about the frequency, effectiveness and safety of switching between ADP receptor inhibitors in routine clinical practice. We studied 11,999 myocardial infarction patients treated with percutaneous coronary intervention at 230 hospitals from April 2010 to October 2012 in the TRANSLATE-ACS study. Multivariable Cox regression was used to compare six-month post-discharge risks of major adverse cardiovascular events (MACE: death, myocardial infarction, stroke, or unplanned revascularization) and Global Utilization of Streptokinase and t-PA for Occluded Coronary Arteries (GUSTO)-defined bleeding between in-hospital ADP receptor inhibitor switching versus continuation of the initially selected therapy. Among 8715 patients treated initially with clopidogrel, 994 (11.4%) were switched to prasugrel or ticagrelor; switching occurred primarily after percutaneous coronary intervention (60.9%) and at the time of hospital discharge (26.7%). Among 3284 patients treated initially with prasugrel or ticagrelor, 448 (13.6%) were switched to clopidogrel; 48.2% of switches occurred after percutaneous coronary intervention and 48.0% at hospital discharge. Switching to prasugrel or ticagrelor was not associated with increased bleeding when compared with continuation on clopidogrel (2.7% vs. 3.3%, adjusted hazard ratio 0.96, 95% confidence interval 0.64-1.42, p=0.82). Switching from prasugrel or ticagrelor to clopidogrel was not associated with increased MACE (8.9% vs. 7.7%, adjusted hazard ratio 1.06, 95% confidence interval 0.75-1.49, p=0.76) when compared with continuation on the higher potency agent. In-hospital ADP receptor inhibitor switching occurs in more than one in 10 myocardial infarction patients in contemporary practice. In this observational study, ADP receptor inhibitor switching does not appear to be significantly associated with increased hazard of MACE or bleeding. © The European Society of Cardiology 2014.
Camacho-Barcia, María L; Bulló, Mònica; Garcia-Gavilán, Jesús F; Ruiz-Canela, Miguel; Corella, Dolores; Estruch, Ramón; Fitó, Montserrat; García-Layana, Alfredo; Arós, Fernando; Fiol, Miquel; Lapetra, José; Serra-Majem, Lluis; Pintó, Xavier; García-Arellano, Ana; Vinyoles, Ernest; Sorli, José Vicente; Salas-Salvadó, Jordi
2017-06-01
Cataract, one of the most frequent causes of blindness in developed countries, is strongly associated with aging. The exact mechanisms underlying cataract formation are still unclear, but growing evidence suggests a potential role of inflammatory and oxidative processes. Therefore, antioxidant and anti-inflammatory factors of the diet, such as vitamin K1, could play a protective role. To examine the association between dietary vitamin K1 intake and the risk of incident cataracts in an elderly Mediterranean population. A prospective analysis was conducted in 5860 participants from the Prevención con Dieta Mediterránea Study, a randomized clinical trial conducted between 2003 and 2011. Participants were community-dwelling men (44.2%) and women (55.8%), and the mean (SD) age was 66.3 (6.1) years. Dietary vitamin K1 intake was evaluated using a validated food frequency questionnaire. The time to the cataract event was calculated as the time from recruitment to the date of cataract surgery, the last follow-up visit, death, or the end of the study. Hazard ratios and 95% CIs for cataract incidence were estimated with a multivariable Cox proportional hazards model. Participants were community-dwelling men (44.2%; n = 868) and women (55.8%; n = 1086), and the mean (SD) age was 66.3 (6.1) years. After a median of 5.6 years of follow-up, we documented a total of 768 new cataracts. Participants in the highest tertile of dietary vitamin K1 intake had a lower risk of cataracts than those in the lowest tertile (hazard ratio, 0.71; 95% CI, 0.58-0.88; P = .002), after adjusting for potential confounders. High intake of dietary vitamin K1 was associated with a reduced risk of cataracts in an elderly Mediterranean population, even after adjusting for other potential confounders. isrctn.org: ISRCTN35739639.
Kubo, Sachimi; Kitamura, Akihiko; Imano, Hironori; Cui, Renzhe; Yamagishi, Kazumasa; Umesawa, Mitsumasa; Muraki, Isao; Kiyama, Masahiko; Okada, Takeo
2016-01-01
Aim: It is important to explore predictive markers other than conventional cardiovascular risk factors for early detection and treatment of chronic kidney disease (CKD), a major risk factor for end-stage renal failure. We hypothesized that serum albumin and high-sensitivity C-reactive protein (hs-CRP) would be independent markers, and examined their associations with the risk of CKD. Methods: We examined the associations of serum albumin and hs-CRP levels with the risk of incident CKD in 2535 Japanese adults aged 40–69 years without CKD at baseline during a median 9.0-year follow-up, after adjustment for known cardiovascular risk factors. Results: During the follow-up period, 367 cases of CKD developed. In multivariable analyses adjusted for known risk factors, the CKD hazard ratios (95% confidence intervals) for the highest versus lowest quartiles of serum albumin levels were 0.69 (0.40–1.17) for men and 0.42 (0.28–0.64) for women. Corresponding values for hs-CRP were 0.95 (0.54–1.67) for men and 1.85 (1.25–2.75) for women. The association of combined serum albumin and hs-CRP with the risk of CKD was examined for women. The hazard ratio was 1.72 (1.17–2.54) for low versus higher albumin levels at lower hs-CRP levels, but no such association was observed at high hs-CRP levels. The hazard ratio was 1.96 (1.44–2.66) for high versus lower hs-CRP levels at higher serum albumin levels, but no such association was observed at low serum albumin levels. Conclusion: Both low serum albumin and high hs-CRP levels were predictive of CKD for women. PMID:26911856
Vitamin D deficiency and incident stroke risk in community-living black and white adults.
Judd, Suzanne E; Morgan, Charity J; Panwar, Bhupesh; Howard, Virginia J; Wadley, Virginia G; Jenny, Nancy S; Kissela, Brett M; Gutiérrez, Orlando M
2016-01-01
Black individuals are at greater risk of stroke and vitamin D deficiency than white individuals. Epidemiologic studies have shown that low 25-hydroxyvitamin D concentrations are associated with increased risk of stroke, but these studies had limited representation of black individuals. We examined the association of 25-hydroxyvitamin D with incident stroke in the Reasons for Geographic and Racial Differences in Stroke (REGARDS) study, a cohort of black and white adults ≥45 years of age. Using a case-cohort study design, plasma 25-hydroxyvitamin D was measured in 610 participants who developed incident stroke (cases) and in 937 stroke-free individuals from a stratified cohort random sample of REGARDS participants (comparison cohort). In multivariable models adjusted for socio-demographic factors, co-morbidities and laboratory values including parathyroid hormone, lower 25-hydroxyvitamin D concentrations were associated with higher risk of stroke (25-hydroxyvitamin D >30 ng/mL reference; 25-hydroxyvitamin D concentrations 20-30 ng/mL, hazard ratio 1.33, 95% confidence interval (95% CI) 0.89,1.96; 25-hydroxyvitamin D <20 ng/mL, hazard ratio 1.85, 95% CI 1.17, 2.93). There were no statistically significant differences in the association of lower 25-hydroxyvitamin D with higher risk of stroke in black vs. white participants in fully adjusted models (hazard ratio comparing lowest vs. highest 25-hydroxyvitamin D category 2.62, 95% CI 1.18, 5.83 in blacks vs. 1.64, 95% CI 0.83, 3.24 in whites, P(interaction) = 0.82). The associations were qualitatively unchanged when restricted to ischemic or hemorrhagic stroke subtypes or when using race-specific cut-offs for 25-hydroxyvitamin D categories. Vitamin D deficiency is a risk factor for incident stroke and the strength of this association does not appear to differ by race. © 2016 World Stroke Organization.
Shah, Ravi; Heydari, Bobak; Coelho-Filho, Otavio; Murthy, Venkatesh L; Abbasi, Siddique; Feng, Jiazhuo H; Pencina, Michael; Neilan, Tomas G; Meadows, Judith L; Francis, Sanjeev; Blankstein, Ron; Steigner, Michael; di Carli, Marcelo; Jerosch-Herold, Michael; Kwong, Raymond Y
2013-08-06
A recent large-scale clinical trial found that an initial invasive strategy does not improve cardiac outcomes beyond optimized medical therapy in patients with stable coronary artery disease. Novel methods to stratify at-risk patients may refine therapeutic decisions to improve outcomes. In a cohort of 815 consecutive patients referred for evaluation of myocardial ischemia, we determined the net reclassification improvement of the risk of cardiac death or nonfatal myocardial infarction (major adverse cardiac events) incremental to clinical risk models, using guideline-based low (<1%), moderate (1% to 3%), and high (>3%) annual risk categories. In the whole cohort, inducible ischemia demonstrated a strong association with major adverse cardiac events (hazard ratio=14.66; P<0.0001) with low negative event rates of major adverse cardiac events and cardiac death (0.6% and 0.4%, respectively). This prognostic robustness was maintained in patients with previous coronary artery disease (hazard ratio=8.17; P<0.0001; 1.3% and 0.6%, respectively). Adding inducible ischemia to the multivariable clinical risk model (adjusted for age and previous coronary artery disease) improved discrimination of major adverse cardiac events (C statistic, 0.81-0.86; P=0.04; adjusted hazard ratio=7.37; P<0.0001) and reclassified 91.5% of patients at moderate pretest risk (65.7% to low risk; 25.8% to high risk) with corresponding changes in the observed event rates (0.3%/y and 4.9%/y for low and high risk posttest, respectively). Categorical net reclassification index was 0.229 (95% confidence interval, 0.063-0.391). Continuous net reclassification improvement was 1.11 (95% confidence interval, 0.81-1.39). Stress cardiac magnetic resonance imaging effectively reclassifies patient risk beyond standard clinical variables, specifically in patients at moderate to high pretest clinical risk and in patients with previous coronary artery disease. http://www.clinicaltrials.gov. Unique identifier: NCT01821924.
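For readers unfamiliar with the categorical net reclassification index reported above, the following Python sketch shows one common way to compute it from pre-test and post-test risk categories together with observed events. The category coding and the toy data are hypothetical illustrations, not values from the study.

```python
import numpy as np

def categorical_nri(pre_cat, post_cat, event):
    """Categorical net reclassification improvement.

    pre_cat, post_cat: ordinal risk categories (e.g. 0=low, 1=moderate, 2=high).
    event: 1 if the subject experienced the outcome, 0 otherwise.
    NRI = [P(up|event) - P(down|event)] + [P(down|non-event) - P(up|non-event)]
    """
    pre, post, ev = map(np.asarray, (pre_cat, post_cat, event))
    up, down = post > pre, post < pre
    nri_events = up[ev == 1].mean() - down[ev == 1].mean()
    nri_nonevents = down[ev == 0].mean() - up[ev == 0].mean()
    return nri_events + nri_nonevents

# Hypothetical example: 0 = low (<1%/y), 1 = moderate (1-3%/y), 2 = high (>3%/y)
pre = [1, 1, 1, 1, 0, 2, 1, 1]
post = [0, 2, 0, 0, 0, 2, 2, 0]
ev = [0, 1, 0, 0, 0, 1, 1, 0]
print(round(categorical_nri(pre, post, ev), 3))
```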
Adjusted variable plots for Cox's proportional hazards regression model.
Hall, C B; Zeger, S L; Bandeen-Roche, K J
1996-01-01
Adjusted variable plots are useful in linear regression for outlier detection and for qualitative evaluation of the fit of a model. In this paper, we extend adjusted variable plots to Cox's proportional hazards model for possibly censored survival data. We propose three different plots: a risk level adjusted variable (RLAV) plot in which each observation in each risk set appears, a subject level adjusted variable (SLAV) plot in which each subject is represented by one point, and an event level adjusted variable (ELAV) plot in which the entire risk set at each failure event is represented by a single point. The latter two plots are derived from the RLAV by combining multiple points. In each plot, the regression coefficient and standard error from a Cox proportional hazards regression are obtained by a simple linear regression through the origin fitted to the coordinates of the pictured points. The plots are illustrated with a reanalysis of a dataset of 65 patients with multiple myeloma.
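The abstract's key computational claim is that the Cox regression coefficient and standard error can be recovered from a simple linear regression through the origin fitted to the plotted coordinates. The Python sketch below performs such a through-origin least-squares fit; the coordinates are hypothetical stand-ins for the adjusted-variable points, and the ordinary least-squares standard error shown here is only a rough approximation of what the paper derives.

```python
import numpy as np

def through_origin_fit(x, y):
    """Least-squares line through the origin: slope and its standard error.

    x and y stand in for the adjusted-variable coordinates described in the
    abstract (hypothetical values here).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope = np.sum(x * y) / np.sum(x * x)
    resid = y - slope * x
    sigma2 = np.sum(resid ** 2) / (len(x) - 1)  # one parameter estimated
    se = np.sqrt(sigma2 / np.sum(x * x))
    return slope, se

# Hypothetical adjusted-variable coordinates
x = [-1.2, -0.4, 0.1, 0.6, 1.1, 1.8]
y = [-0.9, -0.2, 0.0, 0.5, 0.7, 1.4]
b, se = through_origin_fit(x, y)
print(f"slope={b:.3f}, se={se:.3f}")
```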
Donor characteristics and risk of hepatocellular carcinoma recurrence after liver transplantation.
Orci, L A; Berney, T; Majno, P E; Lacotte, S; Oldani, G; Morel, P; Mentha, G; Toso, C
2015-09-01
To date, studies assessing the risk of post-transplant hepatocellular carcinoma (HCC) recurrence have focused on tumour characteristics. This study investigated the impact of donor characteristics and graft quality on post-transplant HCC recurrence. Using the Scientific Registry of Transplant Recipients, patients with HCC who received a liver transplant between 2004 and 2011 were included, and post-transplant HCC recurrence was assessed. A multivariable competing risk regression model was fitted, adjusting for confounders such as recipient sex, age, tumour volume, α-fetoprotein, time on the waiting list and transplant centre. A total of 9724 liver transplant recipients were included. Patients receiving a graft procured from a donor older than 60 years (adjusted hazard ratio (HR) 1.38, 95 per cent c.i. 1.10 to 1.73; P = 0.006), a donor with a history of diabetes (adjusted HR 1.43, 1.11 to 1.83; P = 0.006) and a donor with a body mass index of 35 kg/m(2) or more (adjusted HR 1.36, 1.04 to 1.77; P = 0.023) had an increased rate of post-transplant HCC recurrence. In 3007 patients with documented steatosis, severe graft steatosis (more than 60 per cent) was also linked to an increased risk of recurrence (adjusted HR 1.65, 1.03 to 2.64; P = 0.037). Recipients of organs from donation after cardiac death donors with prolonged warm ischaemia had higher recurrence rates (adjusted HR 4.26, 1.20 to 15.1; P = 0.025). Donor-related factors such as donor age, body mass index, diabetes and steatosis are associated with an increased rate of HCC recurrence after liver transplantation. © 2015 BJS Society Ltd Published by John Wiley & Sons Ltd.
Pan, An; Sun, Qi; Bernstein, Adam M; Manson, JoAnn E; Willett, Walter C; Hu, Frank B
2013-07-22
Red meat consumption has been consistently associated with an increased risk of type 2 diabetes mellitus (T2DM). However, whether changes in red meat intake are related to subsequent T2DM risk remains unknown. To evaluate the association between changes in red meat consumption during a 4-year period and subsequent 4-year risk of T2DM in US adults. Three prospective cohort studies in US men and women. We followed up 26,357 men in the Health Professionals Follow-up Study (1986-2006), 48,709 women in the Nurses' Health Study (1986-2006), and 74,077 women in the Nurses' Health Study II (1991-2007). Diet was assessed by validated food frequency questionnaires and updated every 4 years. Time-dependent Cox proportional hazards regression models were used to calculate hazard ratios with adjustment for age, family history, race, marital status, initial red meat consumption, smoking status, and initial and changes in other lifestyle factors (physical activity, alcohol intake, total energy intake, and diet quality). Results across cohorts were pooled by an inverse variance-weighted, fixed-effect meta-analysis. Incident T2DM cases validated by supplementary questionnaires. During 1,965,824 person-years of follow-up, we documented 7540 incident T2DM cases. In the multivariate-adjusted models, increasing red meat intake during a 4-year interval was associated with an elevated risk of T2DM during the subsequent 4 years in each cohort (all P < .001 for trend). Compared with the reference group of no change in red meat intake, increasing red meat intake of more than 0.50 servings per day was associated with a 48% (pooled hazard ratio, 1.48; 95% CI, 1.37-1.59) elevated risk in the subsequent 4-year period, and the association was modestly attenuated after further adjustment for initial body mass index and concurrent weight gain (1.30; 95% CI, 1.21-1.41). Reducing red meat consumption by more than 0.50 servings per day from baseline to the first 4 years of follow-up was associated with a 14% (pooled hazard ratio, 0.86; 95% CI, 0.80-0.93) lower risk during the subsequent entire follow-up through 2006 or 2007. Increasing red meat consumption over time is associated with an elevated subsequent risk of T2DM, and the association is partly mediated by body weight. Our results add further evidence that limiting red meat consumption over time confers benefits for T2DM prevention.
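The analysis above pools cohort-specific hazard ratios with an inverse variance-weighted, fixed-effect meta-analysis. A minimal Python sketch of that pooling step follows; the per-cohort hazard ratios and confidence intervals are made-up placeholders, not the study's actual cohort-specific estimates.

```python
import numpy as np

def pool_fixed_effect(hrs, ci_lowers, ci_uppers):
    """Inverse-variance-weighted fixed-effect pooling of hazard ratios.

    Per-cohort SEs of log(HR) are recovered from 95% CIs as
    (log(upper) - log(lower)) / (2 * 1.96); weights are 1 / SE^2.
    """
    log_hr = np.log(hrs)
    se = (np.log(ci_uppers) - np.log(ci_lowers)) / (2 * 1.96)
    w = 1.0 / se ** 2
    pooled = np.sum(w * log_hr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    return np.exp(pooled), np.exp(lo), np.exp(hi)

# Hypothetical cohort-specific HRs (not the study's per-cohort estimates)
print(pool_fixed_effect([1.50, 1.45, 1.52], [1.30, 1.28, 1.35], [1.73, 1.64, 1.71]))
```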
Kopecky, Chantal; Genser, Bernd; Drechsler, Christiane; Krane, Vera; Kaltenecker, Christopher C.; Hengstschläger, Markus; März, Winfried; Wanner, Christoph; Säemann, Marcus D.
2015-01-01
Background and objectives Impairment of HDL function has been associated with cardiovascular events in patients with kidney failure. The protein composition of HDLs is altered in these patients, presumably compromising the cardioprotective effects of HDLs. This post hoc study assessed the relation of distinct HDL-bound proteins with cardiovascular outcomes in a dialysis population. Design, setting, participants, & measurements The concentrations of HDL-associated serum amyloid A (SAA) and surfactant protein B (SP-B) were measured in 1152 patients with type 2 diabetes mellitus on hemodialysis participating in The German Diabetes Dialysis Study who were randomly assigned to double-blind treatment of 20 mg atorvastatin daily or matching placebo. The association of SAA(HDL) and SP-B(HDL) with cardiovascular outcomes was assessed in multivariate regression models adjusted for known clinical risk factors. Results High concentrations of SAA(HDL) were significantly and positively associated with the risk of cardiac events (hazard ratio per 1 SD higher, 1.09; 95% confidence interval, 1.01 to 1.19). High concentrations of SP-B(HDL) were significantly associated with all-cause mortality (hazard ratio per 1 SD higher, 1.10; 95% confidence interval, 1.02 to 1.19). Adjustment for HDL cholesterol did not affect these associations. Conclusions In patients with diabetes on hemodialysis, SAA(HDL) and SP-B(HDL) were related to cardiac events and all-cause mortality, respectively, and they were independent of HDL cholesterol. These findings indicate that a remodeling of the HDL proteome was associated with a higher risk for cardiovascular events and mortality in patients with ESRD. PMID:25424990
Carbohydrate nutrition and inflammatory disease mortality in older adults.
Buyken, Anette E; Flood, Victoria; Empson, Marianne; Rochtchina, Elena; Barclay, Alan W; Brand-Miller, Jennie; Mitchell, Paul
2010-09-01
Several studies suggest that carbohydrate nutrition is related to oxidative stress and inflammatory markers. We examined whether dietary glycemic index (GI), dietary fiber, and carbohydrate-containing food groups were associated with the mortality attributable to noncardiovascular, noncancer inflammatory disease in an older Australian cohort. Analysis included 1490 postmenopausal women and 1245 men aged ≥49 y at baseline (1992-1994) from a population-based cohort who completed a validated food-frequency questionnaire. Cox proportional hazards ratios were calculated both for death from diseases in which inflammation or oxidative stress was a predominant contributor and for cardiovascular mortality. Over a 13-y period, 84 women and 86 men died of inflammatory diseases. Women in the highest GI tertile had a 2.9-fold increased risk of inflammatory death compared with women in the lowest GI tertile [multivariate hazard ratio in energy-adjusted tertile 3 (tertile 1 as reference): 2.89; 95% CI: 1.52, 5.51; P for trend: 0.0006, adjusted for age, smoking, diabetes, and alcohol and fiber consumption]. Increasing intakes of foods high in refined sugars or refined starches (P = 0.04) and decreasing intakes of bread and cereals (P = 0.008) or vegetables other than potatoes (P = 0.007) also independently predicted a greater risk, with subjects' GI partly explaining these associations. In men, only an increased consumption of fruit fiber (P = 0.005) and fruit (P = 0.04) conferred an independent decrease in risk of inflammatory death. No associations were observed with cardiovascular mortality. These data provide new epidemiologic evidence of a potentially important link between GI and inflammatory disease mortality among older women.
Effects of Statin Intensity and Adherence on the Long-Term Prognosis After Acute Ischemic Stroke.
Kim, Jinkwon; Lee, Hye Sun; Nam, Chung Mo; Heo, Ji Hoe
2017-10-01
Statin is an established treatment for secondary prevention after ischemic stroke. However, the effects of statin intensity and adherence on the long-term prognosis after acute stroke are not well known. This retrospective cohort study using a nationwide health insurance claim data in South Korea included patients admitted with acute ischemic stroke between 2002 and 2012. Statin adherence and intensity were determined from the prescription data for a period of 1 year after the index stroke. The primary outcome was a composite of recurrent stroke, myocardial infarction, and all-cause mortality. We performed multivariate Cox proportional regression analyses. We included 8001 patients with acute ischemic stroke. During the mean follow-up period of 4.69±2.72 years, 2284 patients developed a primary outcome. Compared with patients with no statin, adjusted hazard ratios (95% confidence interval) were 0.74 (0.64-0.84) for good adherence, 0.93 (0.79-1.09) for intermediate adherence, and 1.07 (0.95-1.20) for poor adherence to statin. Among the 1712 patients with good adherence, risk of adverse events was lower in patients with high-intensity statin (adjusted hazard ratio [95% confidence interval], 0.48 [0.24-0.96]) compared with those with low-intensity statin. Neither good adherence nor high intensity of statin was associated with an increased risk of hemorrhagic stroke. After acute ischemic stroke, high-intensity statin therapy with good adherence was significantly associated with a lower risk of adverse events. © 2017 American Heart Association, Inc.
Lung cancer incidence and survival among HIV-infected and uninfected women and men.
Hessol, Nancy A; Martínez-Maza, Otoniel; Levine, Alexandra M; Morris, Alison; Margolick, Joseph B; Cohen, Mardge H; Jacobson, Lisa P; Seaberg, Eric C
2015-06-19
To determine the lung cancer incidence and survival time among HIV-infected and uninfected women and men. Two longitudinal studies of HIV infection in the United States. Data from 2549 women in the Women's Interagency HIV Study (WIHS) and 4274 men in the Multicenter AIDS Cohort Study (MACS), all with a history of cigarette smoking, were analyzed. Lung cancer incidence rates and incidence rate ratios were calculated using Poisson regression analyses. Survival time was assessed using Kaplan-Meier and Cox proportional-hazard analyses. Thirty-seven women and 23 men developed lung cancer (46 HIV-infected and 14 HIV-uninfected) during study follow-up. In multivariable analyses, the factors independently associated with a higher lung cancer incidence rate were older age, less education, 10 or more pack-years of smoking, and a prior diagnosis of AIDS pneumonia (vs. HIV-uninfected women). In an adjusted Cox model that allowed different hazard functions for each cohort, a history of injection drug use was associated with shorter survival, and a lung cancer diagnosis after 2001 was associated with longer survival. In an adjusted Cox model restricted to HIV-infected participants, a nadir CD4 lymphocyte cell count of less than 200 was associated with shorter survival time. Our data suggest that pulmonary damage and inflammation associated with HIV infection may be causative for the increased risk of lung cancer. Encouraging and assisting younger HIV-infected smokers to quit and to sustain cessation of smoking is imperative to reduce the lung cancer burden in this population.
Senthanar, S; Kristman, V L; Hogg-Johnson, S
2015-07-01
Northern Ontario, Canada has a larger elderly population, more resource-based employment, and limited access to physicians and specialists compared to southern Ontario. Given these important differences, it is possible that work disability rates will vary between the two Ontario jurisdictions. To determine the association between time lost due to workplace injuries and illnesses occurring in northern vs southern Ontario and work disability duration from 2006 to 2011. The study base included all lost-time claims approved by the Workplace Safety and Insurance Board in Ontario, Canada for workplace injury or illness compensation occurring between January 1, 2006 and December 31, 2011. All eligible participants had to be 18 years of age or older at the time of making the claim, and participants were excluded if any of the three variables used to determine location (claimant home postal code, workplace geographical code, and WSIB firm location) was missing. Multivariable proportional hazards regression models were used to estimate hazard ratios and 95% confidence intervals adjusted for sex, age, occupation, part of body, and nature of injury relating Ontario geographical location to compensated time off work. A total of 156 453 lost-time claims were approved over the study period. Injured and ill workers from northern Ontario were 16% less likely to return to work than those from southern Ontario. Adjustment for potential confounding factors had no effect. The disability duration in northern Ontario is longer than that in southern Ontario. Future research should focus on assessing the relevant factors associated with this observation to identify opportunities for intervention.
Yiu, Andrew; Van Hemelrijck, Mieke; Garmo, Hans; Holmberg, Lars; Malmström, Håkan; Lambe, Mats; Hammar, Niklas; Walldius, Göran; Jungner, Ingmar; Wulaningsih, Wahyu
2017-06-27
Serum uric acid has been suggested to be associated with cancer risk. We aimed to study the association between serum uric acid and cancer incidence in a large Swedish cohort. A positive association was found between uric acid levels and overall cancer risk, and results were similar with adjustment for glucose, triglycerides and BMI. Hazard ratio (HR) for overall cancer for the 4th quartile of uric acid compared to the 1st was 1.08 (95% CI: 1.05-1.11) in men and 1.12 (1.09 - 1.16) in women. Site-specific analysis showed a positive association between uric acid and risk of colorectal, hepatobiliary, kidney, non-melanoma skin, and other cancers in men and of head and neck and other cancers in women. An inverse association was observed for pulmonary and central nervous system (CNS) cancers in men and breast, lymphatic and haematological, and CNS malignancies in women. We included 493,281 persons aged 20 years and older who had a measurement of serum uric acid and were cancer-free at baseline in the AMORIS study. Multivariable Cox proportional hazards regression was used to investigate sex-specific quartiles of serum uric acid in relation to cancer risk in men and women. Analysis was further adjusted for serum glucose, triglycerides and, where available, BMI. Site-specific analysis was performed for major cancers. Altered uric acid levels were associated with risk of overall and some specific cancers, further indicating the potential role of uric acid metabolism in carcinogenesis.
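A small illustration of the sex-specific quartile step described above is given below in Python with pandas; the data, column names and units are assumptions for demonstration only, not AMORIS values.

```python
import numpy as np
import pandas as pd

# Hypothetical data standing in for the cohort described above; the column
# names and uric acid distribution are assumptions, not AMORIS values.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "sex": rng.choice(["M", "F"], size=1000),
    "uric_acid": rng.normal(300.0, 60.0, size=1000),  # illustrative units
})

# Sex-specific quartiles of serum uric acid (1 = lowest, 4 = highest)
df["ua_quartile"] = (
    df.groupby("sex")["uric_acid"]
      .transform(lambda s: pd.qcut(s, 4, labels=False) + 1)
)
print(df.groupby(["sex", "ua_quartile"]).size())
```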
Distelmaier, Klaus; Wiedemann, Dominik; Binder, Christina; Haberl, Thomas; Zimpfer, Daniel; Heinz, Gottfried; Koinig, Herbert; Felli, Alessia; Steinlechner, Barbara; Niessner, Alexander; Laufer, Günther; Lang, Irene M; Goliasch, Georg
2018-06-01
The overall therapeutic goal of venoarterial extracorporeal membrane oxygenation (ECMO) in patients with postcardiotomy shock is bridging to myocardial recovery. However, in patients with irreversible myocardial damage prolonged ECMO treatment would cause a delay or even withholding of further permanent potentially life-saving therapeutic options. We therefore assessed the prognostic effect of duration of ECMO support on survival in adult patients after cardiovascular surgery. We enrolled into our single-center registry a total of 354 patients who underwent venoarterial ECMO support after cardiovascular surgery at a university-affiliated tertiary care center. Through a median follow-up period of 45 months (interquartile range, 20-81 months), 245 patients (69%) died. We observed an increase in mortality with increasing duration of ECMO support. The association between increased duration of ECMO support and mortality persisted in patients who survived ECMO support with a crude hazard ratio of 1.96 (95% confidence interval, 1.40-2.74; P < .001) for 2-year mortality compared with the third tertile and the second tertile of ECMO duration. This effect was even more pronounced after multivariate adjustment using a bootstrap-selected confounder model with an adjusted hazard ratio of 2.30 (95% confidence interval, 1.52-3.48; P < .001) for 2-year long-term mortality. Prolonged venoarterial ECMO support is associated with poor outcome in adult patients after cardiovascular surgery. Our data suggest reevaluation of therapeutic strategies after 7 days of ECMO support because mortality disproportionally increases afterward. Copyright © 2018 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Okamura, Tomonori; Kokubo, Yoshihiro; Watanabe, Makoto; Higashiyama, Aya; Ono, Yuu; Nishimura, Kunihiro; Okayama, Akira; Miyamoto, Yoshihiro
2011-07-01
Recently, several major organizations have proposed a unified definition for the metabolic syndrome (MetS), which should be evaluated in multiethnic groups. The effect of MetS on the incidence of cardiovascular disease needs to be assessed after adjusting for serum low density lipoprotein cholesterol (LDLC), a major risk factor for atherosclerotic diseases. This is especially necessary in Asian populations with a low incidence of coronary artery disease (CAD). We conducted a 13-year prospective study of 4939 Japanese living in an urban area. The MetS was defined using a unified classification that included cut-off points for waist circumference in Asians. The multivariable adjusted hazard ratios (HRs) of MetS for CAD and stroke were calculated using a Cox proportional hazards model adjusted for LDLC and other potential confounding factors. During the follow-up period, there were 155 cases of CAD and 204 of stroke, including 118 cerebral infarctions. In participants under 65 years old, the multivariable HRs of MetS for CAD were 1.21 (95% C.I., 0.64-2.28) in men and 4.44 (95% C.I., 1.73-11.4) in women; the HRs for ischemic stroke were 3.24 (95% C.I., 1.55-6.77) in men and 3.99 (95% C.I., 1.34-11.8) in women. In participants aged 65 years old and over, MetS only showed a significant association with CAD in men (HR 1.89, 95% C.I., 1.11-3.21). Serum LDLC was associated with increased risk of CAD in men irrespective of age group; however, it was not associated with CAD in women. There was no association between serum LDLC and ischemic stroke in any group stratified by sex and the age of 65. These results indicate that the new uniform MetS definition is useful for detecting high-risk individuals, especially in the middle-aged population. However, continuous screening for hypercholesterolemia is necessary to prevent CAD, especially in men, even in Asian countries such as Japan. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Johnson, M Austin; Borgman, Matthew A; Cannon, Jeremy W; Kuppermann, Nathan; Neff, Lucas P
2018-05-01
In adults with traumatic brain injuries (TBI), hypotension and hypertension at presentation are associated with mortality. The effect of age-adjusted blood pressure in children with TBI has been insufficiently studied. We sought to determine if age-adjusted hypertension in children with severe TBI is associated with mortality. This was a retrospective analysis of the Department of Defense Trauma Registry (DoDTR) between 2001 and 2013. We included for analysis patients <18 years with severe TBI defined as Abbreviated Injury Severity (AIS) scores of the head ≥3. We defined hypertension as moderate for systolic blood pressures (SBP) between the 95th and 99th percentile for age and gender and severe if greater than the 99th percentile. Hypotension was defined as SBP <90 mmHg for children >10 years or <70 mmHg + (2 × age) for children ≤10 years. We performed multivariable logistic regression and Cox regression to determine if BP categories were associated with mortality. Of 4,990 children included in the DoDTR, 740 met criteria for analysis. Fifty patients (6.8%) were hypotensive upon arrival to the ED, 385 (52.0%) were normotensive, 115 (15.5%) had moderate hypertension, and 190 (25.7%) had severe hypertension. When compared to normotensive patients, moderate and severe hypertension patients had similar Injury Severity Scores, similar AIS head scores, and similar frequencies of neurosurgical procedures. Multivariable logistic regression demonstrated that hypotension (odds ratio [OR] 2.85, 95% confidence interval [CI] 1.26-6.47) and severe hypertension (OR 2.58, 95% CI 1.32-5.03) were associated with increased 24-hour mortality. Neither hypotension (hazard ratio [HR] 1.52, 95% CI 0.74-3.11) nor severe hypertension (HR 1.65, 95% CI 0.65-2.30) was associated with time to mortality. Pediatric age-adjusted hypertension is frequent after severe TBI. Severe hypertension is strongly associated with 24-hour mortality. Pediatric age-adjusted blood pressure needs to be further evaluated as a critical marker of early mortality.
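The blood pressure definitions in this abstract can be written directly as a small classification rule. The Python sketch below encodes them; the 95th and 99th percentile thresholds come from external age- and sex-specific reference tables, so they are passed in as assumed inputs, and the example values are purely illustrative.

```python
def classify_sbp(age_years, sbp, p95, p99):
    """Classify systolic blood pressure using the definitions in the abstract.

    p95 and p99 are the age- and sex-specific 95th and 99th percentile SBP
    thresholds taken from external reference tables (assumed inputs here).
    """
    hypotension_cutoff = 90 if age_years > 10 else 70 + 2 * age_years
    if sbp < hypotension_cutoff:
        return "hypotension"
    if sbp > p99:
        return "severe hypertension"
    if p95 <= sbp <= p99:
        return "moderate hypertension"
    return "normotension"

# Hypothetical values: a 6-year-old with SBP 125 mmHg and reference cut-offs 115/123
print(classify_sbp(6, 125, p95=115, p99=123))  # -> severe hypertension
```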
Okayama, Akira; Okuda, Nagako; Miura, Katsuyuki; Okamura, Tomonori; Hayakawa, Takehito; Akasaka, Hiroshi; Ohnishi, Hirofumi; Saitoh, Shigeyuki; Arai, Yusuke; Kiyohara, Yutaka; Takashima, Naoyuki; Yoshita, Katsushi; Fujiyoshi, Akira; Zaid, Maryam; Ohkubo, Takayoshi; Ueshima, Hirotsugu
2016-01-01
Objectives To evaluate the impact of dietary sodium and potassium (Na–K) ratio on mortality from total and subtypes of stroke, cardiovascular disease (CVD) and all causes, using 24-year follow-up data of a representative sample of the Japanese population. Setting Prospective cohort study. Participants In the 1980 National Cardiovascular Survey, participants were followed for 24 years (NIPPON DATA80, National Integrated Project for Prospective Observation of Non-communicable Disease And its Trends in the Aged). Men and women aged 30–79 years without hypertensive treatment, history of stroke or acute myocardial infarction (n=8283) were divided into quintiles according to dietary Na–K ratio assessed by a 3-day weighing dietary record at baseline. Age-adjusted and multivariable-adjusted HRs were calculated using the Mantel-Haenszel method and Cox proportional hazards model. Primary outcome measures Mortality from total and subtypes of stroke, CVD and all causes. Results A total of 1938 deaths from all causes were observed over 176 926 person-years. Na–K ratio was significantly and non-linearly related to mortality from all stroke (p=0.002), CVD (p=0.005) and total mortality (p=0.001). For stroke subtypes, mortality from haemorrhagic stroke was positively related to Na–K ratio (p=0.024). Similar relationships were observed for men and women. The observed relationships remained significant after adjustment for other risk factors. Quadratic non-linear multivariable-adjusted HRs (95% CI) in the highest quintile versus the lowest quintile of Na–K ratio were 1.42 (1.07 to 1.90) for ischaemic stroke, 1.57 (1.05 to 2.34) for haemorrhagic stroke, 1.43 (1.17 to 1.76) for all stroke, 1.39 (1.20 to 1.61) for CVD and 1.16 (1.06 to 1.27) for all-cause mortality. Conclusions Dietary Na–K ratio assessed by a 3-day weighing dietary record was a significant risk factor for mortality from haemorrhagic stroke, all stroke, CVD and all causes among a Japanese population. PMID:27412107
Akinkuolie, Akintunde O; Glynn, Robert J; Padmanabhan, Latha; Ridker, Paul M; Mora, Samia
2016-07-13
GlycA, a novel protein glycan biomarker of N-acetyl side chains of acute-phase proteins, was recently associated with incident cardiovascular disease (CVD) in healthy women. Whether GlycA predicts CVD events in the setting of statin therapy in men and women without CVD but with evidence of chronic inflammation is unknown. In the Justification for the Use of Statins in Prevention: an Intervention Trial Evaluating Rosuvastatin (JUPITER) trial (NCT00239681), participants with low-density lipoprotein cholesterol <130 mg/dL and high-sensitivity C-reactive protein (hsCRP) ≥2 mg/L were randomized to rosuvastatin 20 mg/day or placebo. GlycA was quantified by nuclear magnetic resonance spectroscopy in 12 527 participants before randomization and in 10 039 participants at 1 year. A total of 310 first primary CVD events occurred during maximum follow-up of 5.0 years (median, 1.9). GlycA changed minimally after 1 year on study treatment: 6.8% and 4.7% decrease in the rosuvastatin and placebo groups, respectively. Overall, baseline GlycA levels were associated with increased risk of CVD: multivariable-adjusted hazard ratio (HR) per SD increment, 1.20 (95% CI, 1.08-1.34; P=0.0006). After additionally adjusting for hsCRP, this was slightly attenuated (HR, 1.18; 95% CI, 1.04-1.35; P=0.01). On-treatment GlycA levels were also associated with CVD; corresponding multivariable-adjusted HRs per SD before and after additionally adjusting for hsCRP: 1.27 (95% CI, 1.13-1.42; P<0.0001) and 1.24 (95% CI, 1.07-1.44; P=0.004), respectively. Tests for heterogeneity by treatment arm were not significant (P for interaction, >0.20). In the JUPITER trial, increased levels of GlycA were associated with an increased risk of CVD events independent of traditional risk factors and hsCRP. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00239681. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Cooking Methods for Red Meats and Risk of Type 2 Diabetes: A Prospective Study of U.S. Women.
Liu, Gang; Zong, Geng; Hu, Frank B; Willett, Walter C; Eisenberg, David M; Sun, Qi
2017-08-01
This study examined different cooking methods for red meats in relation to type 2 diabetes (T2D) risk among U.S. women who consumed red meats regularly (≥2 servings/week). We monitored 59,033 women (1986-2012) aged 30-55 years and free of diabetes, cardiovascular disease, and cancer at baseline when information on frequency of different cooking methods for red meats, including broiling, barbequing, roasting, pan-frying, and stewing/boiling, was collected. During 1.24 million person-years of follow-up, we documented 6,206 incident cases of T2D. After multivariate adjustment including red meat cooking methods, total red meat and processed red meat intake were both associated with a monotonically increased T2D risk (both P trend <0.05). After multivariate adjustment including total red meat intake, a higher frequency of broiling, barbequing, and roasting red meats was each independently associated with a higher T2D risk. When comparing ≥2 times/week with <1 time/month, the hazard ratios (HRs) and 95% CI of T2D were 1.29 (1.19, 1.40; P trend <0.001) for broiling, 1.23 (1.11, 1.38; P trend <0.001) for barbequing, and 1.11 (1.01, 1.23; P trend = 0.14) for roasting. In contrast, the frequency of stewing/boiling red meats was not associated with T2D risk, and an inverse association was observed for pan-frying frequency and T2D risk. The results remained similar after cooking methods were further mutually adjusted. Independent of total red meat consumption, high-temperature and/or open-flame cooking methods for red meats, especially broiling and barbequing, may further increase diabetes risk among regular meat eaters. © 2017 by the American Diabetes Association.
Zumsteg, Zachary S; Luu, Michael; Yoshida, Emi J; Kim, Sungjin; Tighiouart, Mourad; David, John M; Shiao, Stephen L; Mita, Alain C; Scher, Kevin S; Sherman, Eric J; Lee, Nancy Y; Ho, Allen S
2017-12-01
There is increasing evidence that primary tumor ablation can improve survival for some cancer patients with distant metastases. This may be particularly applicable to head and neck squamous cell carcinoma (HNSCC) because of its tropism for locoregional progression. This study included patients with metastatic HNSCC undergoing systemic therapy identified in the National Cancer Data Base. High-intensity local treatment was defined as radiation doses ≥ 60 Gy or oncologic resection of the primary tumor. Multivariate Cox regression, propensity score matching, landmark analysis, and subgroup analysis were performed to account for imbalances in covariates, including adjustments for the number and location of metastatic sites in the subset of patients with this information available. In all, 3269 patients were included (median follow-up, 51.5 months). Patients undergoing systemic therapy with local treatment had improved survival in comparison with patients receiving systemic therapy alone in propensity score-matched cohorts (2-year overall survival, 34.2% vs 20.6%; P < .001). Improved survival was associated only with patients receiving high-intensity local treatment, whereas those receiving lower-intensity local treatment had survival similar to that of patients receiving systemic therapy without local treatment. The impact of high-intensity local therapy was time-dependent, with a stronger impact within the first 6 months after the diagnosis (adjusted hazard ratio [AHR], 0.255; 95% confidence interval [CI], 0.210-0.309; P < .001) in comparison with more than 6 months after the diagnosis (AHR, 0.622; 95% CI, 0.561-0.689; P < .001) in the multivariate analysis. A benefit was seen in all subgroups, in landmark analyses of 1-, 2-, and 3-year survivors, and when adjusting for the number and location of metastatic sites. Aggressive local treatment warrants prospective evaluation for select patients with metastatic HNSCC. Cancer 2017;123:4583-4593. © 2017 American Cancer Society.
Tang, Xiao-Fang; Song, Ying; Xu, Jing-Jing; Ma, Yuan-Liang; Zhang, Jia-Hui; Yao, Yi; He, Chen; Wang, Huan-Huan; Jiang, Ping; Jiang, Lin; Liu, Ru; Gao, Zhan; Zhao, Xue-Yan; Qiao, Shu-Bin; Xu, Bo; Yang, Yue-Jin; Gao, Run-Lin; Yuan, Jin-Qing
2018-02-01
To determine whether there is a difference in 2-year prognosis among patients across the spectrum of coronary artery disease undergoing percutaneous coronary intervention (PCI). We analyzed all consecutive patients undergoing PCI at a single center from 1/1-12/31/2013. Clinical presentations were compared between sexes according to baseline clinical, angiographic, and procedural characteristics and 2-year (mean 730 ± 30-day) outcomes. We grouped 10 724 consecutive patients based on sex and clinical presentation. Among patients with ST-elevation myocardial infarction (STEMI), rates of all-cause death (6.7% vs 1.4%) and cardiac death (3.8% vs 1.1%) were significantly higher in women than in men (P < 0.05), but these rates did not differ between men and women with stable coronary artery disease (SCAD) and non-ST-elevation acute coronary syndrome (NSTE-ACS). The incidence of major bleeding was greater in women than in men only among those presenting with ACS. After multivariable adjustment, female sex was not an independent predictor of outcomes in STEMI (hazard ratio [HR] for all-cause death: 1.33, 95% confidence interval [CI]: 0.52-3.38; P = 0.55; HR for cardiac death: 0.69, 95% CI: 0.23-2.09, P = 0.51), but was still an independent predictor of bleeding in STEMI (HR: 3.53, 95% CI: 1.26-9.91, P = 0.017). Among STEMI patients, women had worse 2-year mortality after PCI therapy, but female sex was not an independent predictor of mortality after adjustment for baseline characteristics. In STEMI patients, women were at higher bleeding risk than men after PCI, even after multivariable adjustment. © 2017, Wiley Periodicals, Inc.
Association of flavonoid-rich foods and flavonoids with risk of all-cause mortality.
Ivey, Kerry L; Jensen, Majken K; Hodgson, Jonathan M; Eliassen, A Heather; Cassidy, Aedín; Rimm, Eric B
2017-05-01
Flavonoids are bioactive compounds found in foods such as tea, red wine, fruits and vegetables. Higher intakes of specific flavonoids, and flavonoid-rich foods, have been linked to reduced mortality from specific vascular diseases and cancers. However, the importance of flavonoid-rich foods, and flavonoids, in preventing all-cause mortality remains uncertain. As such, we examined the association of intake of flavonoid-rich foods and flavonoids with subsequent mortality among 93 145 young and middle-aged women in the Nurses' Health Study II. During 1 838 946 person-years of follow-up, 1808 participants died. When compared with non-consumers, frequent consumers of red wine, tea, peppers, blueberries and strawberries were at reduced risk of all-cause mortality (P<0·05), with the strongest associations observed for red wine and tea; multivariable-adjusted hazard ratios 0·60 (95 % CI 0·49, 0·74) and 0·73 (95 % CI 0·65, 0·83), respectively. Conversely, frequent grapefruit consumers were at increased risk of all-cause mortality, compared with their non-grapefruit consuming counterparts (P<0·05). When compared with those in the lowest consumption quintile, participants in the highest quintile of total-flavonoid intake were at reduced risk of all-cause mortality in the age-adjusted model; 0·81 (95 % CI 0·71, 0·93). However, this association was attenuated following multivariable adjustment; 0·92 (95 % CI 0·80, 1·06). Similar results were observed for consumption of flavan-3-ols, proanthocyanidins and anthocyanins. Flavonols, flavanones and flavones were not associated with all-cause mortality in any model. Despite null associations at the compound level and select foods, higher consumption of red wine, tea, peppers, blueberries and strawberries, was associated with reduced risk of total and cause-specific mortality. These findings support the rationale for making food-based dietary recommendations.
Long term mortality in critically ill burn survivors.
Nitzschke, Stephanie; Offodile, Anaeze C; Cauley, Ryan P; Frankel, Jason E; Beam, Andrew; Elias, Kevin M; Gibbons, Fiona K; Salim, Ali; Christopher, Kenneth B
2017-09-01
Little is known about long-term survival risk factors in critically ill burn patients who survive hospitalization. We hypothesized that patients with major burns who survive hospitalization would have favorable long-term outcomes. We performed a two-center observational cohort study in 365 critically ill adult burn patients who survived to hospital discharge. The exposure of interest was major burn, defined a priori as >20% total body surface area burned [TBSA]. The modified Baux score was determined by age + %TBSA + 17 × (inhalational injury). The primary outcome was all-cause 5-year mortality based on the US Social Security Administration Death Master File. Adjusted associations were estimated through fitting of multivariable logistic regression models. Our final model included adjustment for inhalational injury, presence of 3rd degree burn, gender and the acute organ failure score, a validated ICU risk-prediction score derived from age, ethnicity, surgery vs. medical patient type, comorbidity, sepsis and acute organ failure covariates. Time-to-event analysis was performed using Cox proportional hazard regression. Of the cohort patients studied, 76% were male, 29% were nonwhite, 14% were over 65, 32% had TBSA >20%, and 45% had inhalational injury. The mean age was 45, 92% had 2nd degree burns, 60% had 3rd degree burns, 21% received vasopressors, and 26% had sepsis. The mean TBSA was 20.1%. The mean modified Baux score was 72.8. The 5-year post-discharge mortality rate was 9.0%. The 30-day hospital readmission rate was 4%. Patients with major burns were significantly younger (41 vs. 47 years), had a significantly higher modified Baux score (89 vs. 62), and had significantly higher comorbidity, acute organ failure, inhalational injury and sepsis (all P<0.05). There were no differences in gender and the acute organ failure score between major and non-major burns. In the multivariable logistic regression model, major burn was associated with a 3-fold decreased odds of 5-year post-discharge mortality compared to patients with TBSA <20% [OR=0.29 (95% CI 0.11-0.78; P=0.014)]. The adjusted model showed good discrimination [AUC 0.81 (95% CI 0.74-0.89)] and calibration (Hosmer-Lemeshow χ2 P=0.67). Cox proportional hazard multivariable regression modeling, adjusting for inhalational injury, presence of 3rd degree burn, gender and the acute organ failure score, showed that major burn was predictive of lower mortality following hospital admission [HR=0.34 (95% CI 0.15-0.76; P=0.009)]. The modified Baux score was not predictive for mortality following hospital discharge [OR for 5-year post-discharge mortality=1.00 (95% CI 0.99-1.02; P=0.74); HR for post-discharge mortality=1.00 (95% CI 0.99-1.02; P=0.55)]. Critically ill patients with major burns who survive to hospital discharge have decreased 5-year mortality compared to those with less severe burns. Burn ICU patients who survive to hospital discharge are younger with fewer comorbidities. The observed relationship is likely due to the relatively higher physiological reserve present in those who survive a Burn ICU course, which may provide for a survival advantage during recovery after major burn. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.
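Because the abstract gives the modified Baux score as an explicit formula (age + %TBSA + 17 × inhalational injury), a one-line Python helper illustrates the calculation; the example values below are only illustrative, not patient data from the study.

```python
def modified_baux(age_years, tbsa_percent, inhalation_injury):
    """Modified Baux score as defined in the abstract:
    age + %TBSA + 17 x (inhalational injury present)."""
    return age_years + tbsa_percent + 17 * int(bool(inhalation_injury))

# Illustrative values on the same order as the cohort means reported above
print(modified_baux(45, 20.1, True))  # 82.1
```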
Effectiveness of oral antibiotics for definitive therapy of Gram-negative bloodstream infections.
Kutob, Leila F; Justo, Julie Ann; Bookstaver, P Brandon; Kohn, Joseph; Albrecht, Helmut; Al-Hasan, Majdi N
2016-11-01
There is paucity of data evaluating intravenous-to-oral antibiotic switch options for Gram-negative bloodstream infections (BSIs). This retrospective cohort study examined the effectiveness of oral antibiotics for definitive treatment of Gram-negative BSI. Patients with Gram-negative BSI hospitalised for <14 days at Palmetto Health Hospitals in Columbia, SC, from 1 January 2010 through 31 December 2013 and discharged on oral antibiotics were included in this study. The cohort was stratified into three groups based on bioavailability of oral antibiotics prescribed (high, ≥95%; moderate, 75-94%; and low, <75%). Kaplan-Meier analysis and multivariate Cox proportional hazards regression were used to examine treatment failure. Among the 362 patients, high, moderate and low bioavailability oral antibiotics were prescribed to 106, 179 and 77 patients, respectively, for definitive therapy of Gram-negative BSI. Mean patient age was 63 years, 217 (59.9%) were women and 254 (70.2%) had a urinary source of infection. Treatment failure rates were 2%, 12% and 14% in patients receiving oral antibiotics with high, moderate and low bioavailability, respectively (P = 0.02). Risk of treatment failure in the multivariate Cox model was higher in patients receiving antibiotics with moderate [adjusted hazard ratio (aHR) = 5.9, 95% CI 1.6-38.5; P = 0.005] and low bioavailability (aHR = 7.7, 95% CI 1.9-51.5; P = 0.003) compared with those receiving oral antimicrobial agents with high bioavailability. These data demonstrate the effectiveness of oral antibiotics with high bioavailability for definitive therapy of Gram-negative BSI. Risk of treatment failure increases as bioavailability of the oral regimen declines. Copyright © 2016 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
Yamada, Keiko; Iso, Hiroyasu; Cui, Renzhe; Tamakoshi, Akiko
2017-05-01
This study aimed to examine the association between recurrent pregnancy loss and the risk of cardiovascular disease mortality. We identified 54,652 women who were pregnant during the Japan Collaborative Cohort Study. These women were 40-79 years at the date of cohort entry between 1988 and 1990. Participants received municipal health screening examinations and completed self-administered questionnaires. The cause of death was confirmed by annual or biannual follow-up surveys for a median of 18 years. The exposure was the number of pregnancy losses. The outcome was mortality from total cardiovascular disease and its subtypes according to the International Classification of Diseases, 10th Revision. Adjustment variables included age, number of deliveries, education, body mass index, physical activity, smoking status, and drinking status. Kaplan-Meier survival curves were used to estimate the cumulative mortality. The number of pregnancy losses tended to be inversely associated with the risk of mortality from total stroke, intracerebral hemorrhage, and total cardiovascular disease. The multivariable hazard ratio of total cardiovascular disease for ≥2 pregnancy losses versus no pregnancy loss was 0.84 (95% confidence interval, 0.74-0.95). A 2-fold excess risk of mortality from ischemic stroke associated with ≥2 pregnancy losses was observed in women aged 40-59 years, with a multivariable hazard ratio of 2.19 (95% confidence interval, 1.06-4.49), but not in older women. Recurrent pregnancy loss tends to be associated with a lower risk of mortality from cardiovascular disease at 40-79 years. Younger women have an excess risk of ischemic stroke mortality associated with recurrent pregnancy loss. Copyright © 2017. Published by Elsevier Inc.
Takase, Hiroyuki; Sugiura, Tomonori; Kimura, Genjiro; Ohte, Nobuyuki; Dohi, Yasuaki
2015-07-29
Although there is a close relationship between dietary sodium and hypertension, the concept that persons with relatively high dietary sodium are at increased risk of developing hypertension compared with those with relatively low dietary sodium has not been studied intensively in a cohort. We conducted an observational study to investigate whether dietary sodium intake predicts future blood pressure and the onset of hypertension in the general population. Individual sodium intake was estimated by calculating 24-hour urinary sodium excretion from spot urine in 4523 normotensive participants who visited our hospital for a health checkup. After a baseline examination, they were followed for a median of 1143 days, with the end point being development of hypertension. During the follow-up period, hypertension developed in 1027 participants (22.7%). The risk of developing hypertension was higher in those with higher rather than lower sodium intake (hazard ratio 1.25, 95% CI 1.04 to 1.50). In multivariate Cox proportional hazards regression analysis, baseline sodium intake and the yearly change in sodium intake during the follow-up period (as continuous variables) correlated with the incidence of hypertension. Furthermore, both the yearly increase in sodium intake and baseline sodium intake showed significant correlations with the yearly increase in systolic blood pressure in multivariate regression analysis after adjustment for possible risk factors. Both relatively high levels of dietary sodium intake and gradual increases in dietary sodium are associated with future increases in blood pressure and the incidence of hypertension in the Japanese general population. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Periodontal disease, tooth loss, and colorectal cancer risk: results from the Nurses’ Health Study
Momen-Heravi, Fatemeh; Babic, Ana; Tworoger, Shelley S.; Zhang, Libin; Wu, Kana; Smith-Warner, Stephanie A.; Ogino, Shuji; Chan, Andrew T.; Meyerhardt, Jeffrey; Giovannucci, Edward; Fuchs, Charles; Cho, Eunyoung; Michaud, Dominique S.; Stampfer, Meir J.; Yu, Yau-Hua; Kim, David; Zhang, Xuehong
2016-01-01
Periodontal diseases including tooth loss might increase systemic inflammation, lead to immune dysregulation, and alter gut microbiota, thereby possibly influencing colorectal carcinogenesis. Few epidemiological studies have examined the association between periodontal diseases and colorectal cancer (CRC) risk. We collected information on the periodontal disease (defined as history of periodontal bone loss) and number of natural teeth in the Nurses’ Health Study. A total of 77,443 women were followed since 1992. We used Cox proportional hazard models to calculate multivariable hazard ratios (HRs) and 95% confidence intervals (95% CIs) after adjustment for smoking and other known risk factors for CRC. We documented 1,165 incident CRC through 2010. Compared to women with 25–32 teeth, the multivariable HR (95% CI) for CRC for women with < 17 teeth was 1.20 (1.04–1.39). With regard to tumor site, the HRs (95% CIs) for the same comparison were 1.23 (1.01–1.51) for proximal colon cancer, 1.03 (0.76–1.38) for distal colon cancer, and 1.48 (1.07–2.05) for rectal cancer. Additionally, compared to those without periodontal disease, HRs for CRC were 0.91 (95% CI 0.74–1.12) for periodontal disease, and 1.22 (95% CI 0.91–1.63) when limited to moderate to severe periodontal disease. The results were not modified by smoking status, body mass index, or alcohol consumption. Women with fewer teeth, possibly moderate or severe periodontal disease, might be at a modest increased risk of developing CRC, suggesting a potential role of oral health in colorectal carcinogenesis. PMID:27778343
Yang, Biru; Hallmark, Camden J; Huang, Jamie S; Wolverton, Marcia L; McNeese-Ward, Marlene; Arafat, Raouf R
2013-12-01
This population-based study assessed the characteristics, timing, and risk of syphilis diagnoses among HIV-infected males in Houston, Texas. A retrospective cohort of males newly diagnosed as having HIV between January 2000 and December 2002 was constructed using HIV surveillance data. These individuals were cross-referenced to sexually transmitted disease surveillance data to ascertain early syphilis diagnoses for the subsequent 10 years. Multivariable Cox regression was used to identify risk factors for syphilis diagnosis while controlling for the effects of covariates. Approximately 6% of the HIV-infected male cohort received early syphilis diagnoses during a 10-year period. Of these comorbid individuals, 40.8% received an incident syphilis diagnosis 5 years or more after their HIV diagnosis. Men who have sex with men (MSM) transmission risk was associated with significantly increased hazard of having a syphilis diagnosis in multivariable analysis (adjusted hazard ratio [HR] of a syphilis diagnosis, 5.24; 95% confidence interval, 3.41-8.05). Compared with men who were older than 40 years at HIV diagnosis, those 13 to 19 years old were 4.06 (2.18-7.55) times more likely to obtain a syphilis diagnosis. The HRs of having an HIV-syphilis comorbidity decreased as age increased. Compared with whites, non-Hispanic African Americans had 1.59 (1.11-2.26) times increased risk of having a subsequent syphilis diagnosis. Risk-stratified HRs showed that MSM had an increased risk of contracting syphilis in all race/ethnicity and age groups. This study suggests that HIV-positive African Americans, youth, and MSM had increased risk of having a subsequent syphilis diagnosis. Targeting these groups with STI prevention messaging may be beneficial to reducing comorbidity.
Haddad, Ahmed Q; Jiang, Lai; Cadeddu, Jeffrey A; Lotan, Yair; Gahan, Jeffrey C; Hynan, Linda S; Gupta, Neil; Raj, Ganesh V; Sagalowsky, Arthur I; Margulis, Vitaly
2015-12-01
To evaluate the association of statin use and preoperative serum lipid parameters with oncologic outcomes following surgery for renal cell carcinoma. A total of 850 patients who underwent surgery for localized renal cell carcinoma at our institution from 2000 to 2012 were included. Use of statins, preoperative serum lipid profile, and comprehensive clinicopathologic features were retrospectively recorded. Kaplan-Meier analysis and multivariate Cox proportional hazards model were employed to compare survival outcomes. There were 342 statin users and 508 non-users. Median follow-up was 25.0 months. Statin users were older, had greater body mass index, and had worse performance status than non-users. Tumor pathologic characteristics were balanced between groups. Five-year recurrence free survival (RFS) was 77.9% for non-users compared with 87.6% for statin users (P = .004). After adjustment for clinicopathologic variables, statin use was independently associated with improved RFS (hazard ratio [HR] 0.54, 95% confidence interval [CI] 0.33-0.86, P = .011) and overall survival (HR 0.45, 95%CI 0.28-0.71, P = .001). In patients with available serum lipid parameters (n = 193), 5-year RFS was 83.8% for patients with triglycerides <250 mg/dL compared with 33.3% for those with triglycerides >250 mg/dL (P <.0001). Elevated serum triglycerides (>250 mg/dL) was independently associated with worse RFS (HR 2.69, 95%CI 1.22-5.93, P = .015) on multivariate analysis. Statin use was independently associated with improved survival, whereas elevated serum triglyceride levels correlated with worse oncologic outcomes in this cohort. These findings warrant validation in prospective studies. Copyright © 2015 Elsevier Inc. All rights reserved.
Citrus Consumption and Risk of Cutaneous Malignant Melanoma
Wu, Shaowei; Han, Jiali; Feskanich, Diane; Cho, Eunyoung; Stampfer, Meir J.; Willett, Walter C.; Qureshi, Abrar A.
2015-01-01
Purpose Citrus products are widely consumed foods that are rich in psoralens and furocoumarins, a group of naturally occurring chemicals with potential photocarcinogenic properties. We prospectively evaluated the risk of cutaneous malignant melanoma associated with citrus consumption. Methods A total of 63,810 women in the Nurses' Health Study (1984 to 2010) and 41,622 men in the Health Professionals Follow-Up Study (1986 to 2010) were included. Dietary information was repeatedly assessed every 2 to 4 years during follow-up. Incident melanoma cases were identified through self-report and confirmed by pathologic records. Results Over 24 to 26 years of follow-up, we documented 1,840 incident melanomas. After adjustment for other risk factors, the pooled multivariable hazard ratios for melanoma were 1.00 for overall citrus consumption < twice per week (reference), 1.10 (95% CI, 0.94 to 1.30) for two to four times per week, 1.26 (95% CI, 1.08 to 1.47) for five to six times per week, 1.27 (95% CI, 1.09 to 1.49) for once to 1.5 times per day, and 1.36 (95% CI, 1.14 to 1.63) for ≥ 1.6 times per day (Ptrend < .001). Among individual citrus products, grapefruit showed the most apparent association with risk of melanoma, which was independent of other lifestyle and dietary factors. The pooled multivariable hazard ratio for melanoma comparing the extreme consumption categories of grapefruit (≥ three times per week v never) was 1.41 (95% CI, 1.10 to 1.82; Ptrend < .001). Conclusion Citrus consumption was associated with an increased risk of malignant melanoma in two cohorts of women and men. Nevertheless, further investigation is needed to confirm our findings and explore related health implications. PMID:26124488
Palatini, Paolo; Fania, Claudio; Mos, Lucio; Garavelli, Guido; Mazzer, Adriano; Cozzio, Susanna; Saladini, Francesca; Casiglia, Edoardo
2016-06-01
Controversy still exists about the long-term cardiovascular effects of coffee consumption in hypertension. The predictive capacity of coffee use for cardiovascular events (CVEs) was investigated in 1204 participants from the HARVEST, a prospective cohort study of non-diabetic subjects aged 18-45 years, screened for stage 1 hypertension. Subjects were grouped into three categories of coffee drinking: non-drinkers (none), moderate drinkers (1 to 3 cups/day), and heavy drinkers (4 or more cups/day). Multivariate Cox proportional hazards models were developed adjusting for possible confounding variables and risk factors. During a median follow-up of 12.6 years, 60 participants developed CVEs. CVEs were more common among coffee drinkers than abstainers (abstainers, 2.2%; moderate drinkers, 7.0%; heavy drinkers, 14.0%; p for trend=0.0003). In a multivariable Cox regression model, coffee use was a significant predictor of CVE in both coffee categories, with a hazard ratio of 2.8 (95% CI, 1.0-7.9) in moderate coffee drinkers and of 4.5 (1.4-14.2) in heavy drinkers compared to abstainers. After inclusion of change in body weight (p=ns), incident hypertension (p=0.027), and presence of diabetes/prediabetes (p=ns) at follow-up end, the association with CVE was attenuated but remained significant in heavy coffee drinkers (HR, 95% CI, 3.4, 1.04-11.3). These data show that coffee consumption increases the risk of CVE in a linear fashion in hypertension. This association may be explained in part by the association between coffee and development of hypertension. Hypertensive patients should be discouraged from drinking coffee. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Lafeuille, Marie-Hélène; Grittner, Amanda Melina; Fortier, Jonathan; Muser, Erik; Fasteneau, John; Duh, Mei Sheng; Lefebvre, Patrick
2015-03-01
Comparative data on rehospitalization patterns and associated institutional costs after inpatient treatment with paliperidone palmitate or oral antipsychotic therapy are reported. A retrospective cohort study was conducted using discharge and billing records from a large hospital database. Selected clinical and cost outcomes were compared in a cohort of adult patients who received the long-acting antipsychotic paliperidone palmitate during a schizophrenia-related index hospital stay and a cohort of patients who received oral antipsychotic therapy during their index admission. Inverse probability-of-treatment weights based on propensity scores were used to reduce confounding. Rates of all-cause and schizophrenia-related rehospitalization and emergency room (ER) use in the two cohorts over periods of up to 12 months were analyzed using a multivariate Cox proportional hazard model. Institutional costs for the evaluated postdischarge events were compared via multivariate linear regression analysis. In the first 12 months after index hospital discharge, the risk of all-cause rehospitalization and ER use was significantly lower in the paliperidone palmitate cohort than in the oral antipsychotic cohort (hazard ratio, 0.61; 95% confidence interval [CI], 0.59-0.63; p < 0.0001); institutional costs during the first 6 months after discharge were significantly lower in the paliperidone palmitate cohort than in the comparator group (adjusted mean monthly cost difference -$404; 95% CI, -$781 to -$148; p < 0.0001). The use of paliperidone palmitate therapy during patients' index hospital admission for schizophrenia was associated with a reduced risk of hospital readmission or ER use and lower postdischarge institutional costs. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
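The weighting approach mentioned above, propensity scores turned into inverse probability-of-treatment weights and then a weighted survival model, can be sketched as below. Everything here, from the covariates (age, prior_admits) to the outcome columns, is synthetic and only illustrates the general recipe, not the study's actual model.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "age": rng.integers(18, 65, n).astype(float),
    "prior_admits": rng.poisson(1.5, n).astype(float),
    "paliperidone": rng.binomial(1, 0.4, n),
    "days_to_readmit": rng.exponential(180, n).round() + 1,
    "readmitted": rng.binomial(1, 0.5, n),
})

# 1) Propensity score: probability of receiving the treatment given baseline covariates.
ps_model = LogisticRegression().fit(df[["age", "prior_admits"]], df["paliperidone"])
ps = ps_model.predict_proba(df[["age", "prior_admits"]])[:, 1]

# 2) Stabilized inverse probability-of-treatment weights.
p_treat = df["paliperidone"].mean()
df["iptw"] = np.where(df["paliperidone"] == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

# 3) Weighted Cox model; robust=True requests sandwich standard errors.
cph = CoxPHFitter()
cph.fit(df[["days_to_readmit", "readmitted", "paliperidone", "iptw"]],
        duration_col="days_to_readmit", event_col="readmitted",
        weights_col="iptw", robust=True)
cph.print_summary()
```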
DNA mismatch repair gene polymorphisms affect survival in pancreatic cancer.
Dong, Xiaoqun; Li, Yanan; Hess, Kenneth R; Abbruzzese, James L; Li, Donghui
2011-01-01
DNA mismatch repair (MMR) maintains genomic stability and mediates cellular response to DNA damage. We aim to demonstrate whether MMR genetic variants affect overall survival (OS) in pancreatic cancer. Using the Sequenom method in genomic DNA, we retrospectively genotyped 102 single-nucleotide polymorphisms (SNPs) of 13 MMR genes from 706 patients with pancreatic adenocarcinoma seen at The University of Texas MD Anderson Cancer Center. Association between genotype and OS was evaluated using multivariable Cox proportional hazard regression models. At a false discovery rate of 1% (p ≤ .0015), 15 SNPs of EXO1, MLH1, MSH2, MSH3, MSH6, PMS2, PMS2L3, TP73, and TREX1 in patients with localized disease (n = 333) and 6 SNPs of MSH3, MSH6, and TP73 in patients with locally advanced or metastatic disease (n = 373) were significantly associated with OS. In multivariable Cox proportional hazard regression models, SNPs of EXO1, MSH2, MSH3, PMS2L3, and TP73 in patients with localized disease, MSH2, MSH3, MSH6, and TP73 in patients with locally advanced or metastatic disease, and EXO1, MGMT, MSH2, MSH3, MSH6, PMS2L3, and TP73 in all patients remained significant predictors for OS (p ≤ .0015) after adjusting for all clinical predictors and all SNPs with p ≤ .0015 in single-locus analysis. Sixteen haplotypes of EXO1, MLH1, MSH2, MSH3, MSH6, PMS2, PMS2L3, RECQL, TP73, and TREX1 significantly correlated with OS in all patients (p ≤ .001). MMR gene variants may have potential value as prognostic markers for OS in pancreatic cancer patients.
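The 1% false-discovery-rate screen used above (significance declared at p ≤ .0015 across many SNPs) is a multiple-testing correction; one common way to implement such a screen is the Benjamini-Hochberg procedure, shown here on made-up p-values rather than the study's own.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Made-up per-SNP p-values from single-locus survival models.
pvals = np.array([0.0004, 0.0012, 0.03, 0.20, 0.0009, 0.45, 0.001, 0.07])

# Benjamini-Hochberg control of the false discovery rate at 1%.
reject, p_adj, _, _ = multipletests(pvals, alpha=0.01, method="fdr_bh")
for p, padj, keep in zip(pvals, p_adj, reject):
    print(f"raw p={p:.4f}  BH-adjusted p={padj:.4f}  kept at FDR 1%: {keep}")
```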
Quantitative survival impact of composite treatment delays in head and neck cancer.
Ho, Allen S; Kim, Sungjin; Tighiouart, Mourad; Mita, Alain; Scher, Kevin S; Epstein, Joel B; Laury, Anna; Prasad, Ravi; Ali, Nabilah; Patio, Chrysanta; St-Clair, Jon Mallen; Zumsteg, Zachary S
2018-05-09
Multidisciplinary management of head and neck cancer (HNC) must reconcile increasingly sophisticated subspecialty care with timeliness of care. Prior studies examined the individual effects of delays in diagnosis-to-treatment interval, postoperative interval, and radiation interval but did not consider them collectively. The objective of the current study was to investigate the combined impact of these interwoven intervals on patients with HNC. Patients with HNC who underwent curative-intent surgery with radiation were identified in the National Cancer Database between 2004 and 2013. Multivariable models were constructed using restricted cubic splines to determine nonlinear relations with overall survival. Overall, 15,064 patients were evaluated. After adjustment for covariates, only prolonged postoperative interval (P < .001) and radiation interval (P < .001) independently predicted for worse outcomes, whereas the association of diagnosis-to-treatment interval with survival disappeared. By using multivariable restricted cubic spline functions, increasing postoperative interval did not affect mortality until 40 days after surgery, and each day of delay beyond this increased the risk of mortality until 70 days after surgery (hazard ratio, 1.14; 95% confidence interval, 1.01-1.28; P = .029). For radiation interval, mortality escalated continuously with each additional day of delay, plateauing at 55 days (hazard ratio, 1.25; 95% confidence interval, 1.11-1.41; P < .001). Delays beyond these change points were not associated with further survival decrements. Increasing delays in postoperative and radiation intervals are associated independently with an escalating risk of mortality that plateaus beyond certain thresholds. Delays in initiating therapy, conversely, are eclipsed in importance when appraised in conjunction with the entire treatment course. Such findings may redirect focus to streamlining those intervals that are most sensitive to delays when considering survival burden. Cancer 2018. © 2018 American Cancer Society.
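The nonlinear delay-survival relations above were modeled with restricted cubic splines inside a Cox model. The sketch below approximates that idea using patsy's natural cubic spline basis (cr) for the spline terms; the data and column names are fabricated and the knot placement is left at patsy's defaults.

```python
import numpy as np
import pandas as pd
from patsy import dmatrix
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "postop_interval_days": rng.integers(20, 120, n).astype(float),
    "months_followed": rng.exponential(36, n).round(1) + 1,
    "died": rng.binomial(1, 0.4, n),
})

# Natural cubic spline basis (4 df) for the delay; "- 1" drops the intercept column.
spline = dmatrix("cr(postop_interval_days, df=4) - 1", df, return_type="dataframe")
spline.columns = [f"delay_s{i}" for i in range(spline.shape[1])]

# Cox model on the spline terms; nonlinearity in the delay-mortality relation
# is captured jointly by these columns.
cph = CoxPHFitter()
cph.fit(pd.concat([df[["months_followed", "died"]], spline], axis=1),
        duration_col="months_followed", event_col="died")
cph.print_summary()
```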
Eguchi, Kazuo; Pickering, Thomas G.; Schwartz, Joseph E.; Hoshide, Satoshi; Ishikawa, Joji; Ishikawa, Shizukiyo; Shimada, Kazuyuki; Kario, Kazuomi
2013-01-01
Context It is not known whether short duration of sleep is a predictor of future cardiovascular events in hypertensive patients. Objective To test the hypothesis that short duration of sleep is independently associated with incident cardiovascular diseases (CVD). Design, Setting, and Participants We performed ambulatory BP monitoring (ABPM) in 1255 subjects with hypertension (mean age: 70.4±9.9 years), who were followed for an average of 50±23 months. Short sleep duration was defined as <7.5 hrs (20th percentile). Multivariable Cox hazard models predicting CVD events were used to estimate the adjusted hazard ratio (HR) and 95% CI for short sleep duration. A riser pattern was defined when average nighttime SBP exceeded daytime SBP. Main Outcome Measures The end point was cardiovascular events: stroke, fatal or non-fatal myocardial infarction (MI), and sudden cardiac death. Results In multivariable analyses, short duration of sleep (<7.5 hrs) was associated with incident CVD (HR=1.68; 1.06–2.66, P=.03). A synergistic interaction was observed between short sleep duration and the riser pattern (P=.089). When subjects were categorized on the basis of their sleep time and riser/non-riser patterns, the shorter sleep+riser group had a substantially and significantly higher incidence of CVD than the predominant normal sleep+non-riser group (HR=4.43; 2.09–9.39, P<0.001), independent of covariates. Conclusions Short duration of sleep is associated with incident CVD risk, and the combination of a riser pattern and short sleep duration is most strongly predictive of future CVD, independent of ambulatory BP levels. Physicians should inquire about sleep duration in the risk assessment of hypertensive patients. PMID:19001199
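Testing a combination effect such as short sleep plus a riser pattern usually comes down to a product (interaction) term in the Cox model. A synthetic sketch, with invented variable names and data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 250
df = pd.DataFrame({
    "short_sleep": rng.binomial(1, 0.3, n),
    "riser": rng.binomial(1, 0.4, n),
    "months": rng.exponential(50, n).round(1) + 1,
    "cvd_event": rng.binomial(1, 0.3, n),
})
# Product term carries the short-sleep-by-riser interaction (synergy) test.
df["short_sleep_x_riser"] = df["short_sleep"] * df["riser"]

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="cvd_event")
cph.print_summary()
```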
Chipollini, Juan; Abel, E Jason; Peyton, Charles C; Boulware, David C; Karam, Jose A; Margulis, Vitaly; Master, Viraj A; Zargar-Shoshtari, Kamran; Matin, Surena F; Sexton, Wade J; Raman, Jay D; Wood, Christopher G; Spiess, Philippe E
2018-04-01
To determine the therapeutic value of lymph node dissection (LND) during cytoreductive nephrectomy (CN) and assess predictors of cancer-specific survival (CSS) in metastatic renal-cell carcinoma. We identified 293 consecutive patients treated with CN at 4 academic institutions from March 2000 to May 2015. LND was performed in 187 patients (63.8%). CSS was estimated by the Kaplan-Meier method for the entire cohort and for a propensity score-matched cohort. Cox proportional hazards regression was used to evaluate CSS in a multivariate model and in an inverse probability weighting-adjusted model for patients who underwent dissection. Median follow-up was 12.6 months (interquartile range, 4.47, 30.3), and median survival was 15.9 months. Of the 293 patients, 187 (63.8%) underwent LND. One hundred six patients had nodal involvement (pN+) with a median CSS of 11.3 months (95% confidence interval [CI], 6.6, 15.9) versus 24.2 months (95% confidence interval, 14.1, 34.3) for pN- patients (log-rank P = .002). The hazard ratio for LND was 1.325 (95% CI, 1.002, 1.75) for the whole cohort and 1.024 (95% CI, 0.682, 1.537) in the propensity score-matched cohort. Multivariate analysis revealed that number of positive lymph nodes (P < .001) was a significant predictor of worse CSS. For patients with metastatic renal-cell carcinoma undergoing CN with lymphadenectomy, the number of nodes positive was predictive of survival at short-term follow-up. However, nonstandardized lymphadenectomy only provided prognostic information without therapeutic benefit. Prospective studies with standardized templates are required to further ascertain the therapeutic value of LND. Copyright © 2017 Elsevier Inc. All rights reserved.
Epidermal growth factor receptor pathway polymorphisms and the prognosis of hepatocellular carcinoma
Wang, Wenjia; Ma, Xiao-Pin; Shi, Zhuqing; Zhang, Pengyin; Ding, Dong-Lin; Huang, Hui-Xing; Saiyin, Hexi Ge; Chen, Tao-Yang; Lu, Pei-Xin; Wang, Neng-Jin; Yu, Hongjie; Sun, Jielin; Zheng, S Lilly; Yu, Long; Xu, Jianfeng; Jiang, De-Ke
2015-01-01
The EGFR signaling pathway is important in the control of vital processes in the carcinogenesis of hepatocellular carcinoma (HCC), including cell survival, cell cycle progression, tumor invasion and angiogenesis. In the current study, we aimed to assess whether genetic variants in the genes of the EGFR signaling pathway are associated with the prognosis of HCC. We genotyped 36 single nucleotide polymorphisms (SNPs) in four core genes (EGF, EGFR, VEGF, and VEGFR2) by using DNA from blood samples of 363 HCC patients with surgical resection. The associations between genotypes and overall survival (OS) and disease-free survival (DFS) were estimated using the Kaplan-Meier method. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated for the multivariate survival analyses by Cox proportional hazards regression models, adjusting for age, gender, family history, HBsAg and AFP. We found that five SNPs in the VEGFR2 gene were significantly associated with clinical outcomes of HCC patients. Among them, four SNPs (rs7692791, rs2305948, rs13109660, rs6838752) were associated with OS (p=0.035, 0.038, 0.029 and 0.028, respectively), and two SNPs (rs7692791 and rs2034965) were associated with DFS (p=0.039 and 0.017, respectively). Particularly, the rs7692791 TT genotype was associated with both reduced OS (p=0.037) and DFS (p=0.043). However, only the AA genotype of rs2034965 showed an independent effect on DFS (p=0.009) in the multivariate analysis. None of the other 31 polymorphisms or 9 haplotypes derived from the four genes was significantly associated with OS or DFS. Our results illustrated the potential use of VEGFR2 polymorphisms as prognostic markers for HCC patients. PMID:25628948
Slopnick, Emily A; Kim, Simon P; Kiechle, Jonathan E; Gonzalez, Christopher M; Zhu, Hui; Abouassaly, Robert
2016-10-01
To evaluate racial disparities in the diagnosis and treatment of penile cancer among a contemporary series of men from a large diverse national data base. Using the 1998-2012 National Cancer Data Base, all men with squamous cell carcinoma (SCC) were stratified by race and ethnicity. Demographic and disease characteristics were compared between groups. Likelihood of undergoing surgery and type of surgery were compared among patients with nonmetastatic disease. Factors influencing disease stage and treatment type were analyzed with univariate and multivariable logistic regressions. Overall survival was examined with Kaplan-Meier and adjusted Cox proportional hazard models. We identified 12,090 men with penile SCC with median age 66 years (range 18-90). The distribution of patients was as follows: 76.8% Caucasian, 10.2% African American (AA), 8.7% Hispanic. On multivariable analysis, Hispanic men are more likely to present with high-risk (≥T1G3) penile SCC (odds ratio [OR] 1.6; confidence interval [CI] 1.20-2.00; P = .001) and tend to undergo penectomy rather than penile-sparing surgery (OR 1.46; CI 1.15-1.85; P = .002) for equal stage SCC compared to Caucasian patients, whereas AA men are less likely to undergo surgery of any type (OR 0.67; CI 0.51-0.87; P = .003) and have higher mortality rates than Caucasian patients (hazard ratio 1.25; CI 1.10-1.42; P < .001). Hispanic men with penile SCC are more likely to present with high-risk disease and undergo more aggressive treatment than Caucasian patients but have comparable survival. AA men are less likely to undergo surgical management of their disease and have higher mortality rates. Copyright © 2016 Elsevier Inc. All rights reserved.
Undernutrition as independent predictor of early mortality in elderly cancer patients.
Martucci, Renata B; Barbosa, Mariana V; D'Almeida, Cristiane A; Rodrigues, Viviane D; Bergmann, Anke; de Pinho, Nivaldo B; Thuler, Luiz Claudio S
2017-02-01
The aim of this study was to evaluate the 1-y survival of elderly patients with cancer and the association between undernutrition and mortality. This was a cohort study with elderly patients ages ≥65 y admitted between September and October 2014. A nutritionist performed a Mini Nutritional Assessment-Short Form (MNA-SF) assessment during 48 h of hospital admission and collected data about potential confounding variables (comorbidities, stage of cancer, treatment in the previous 3 mo, and reason for hospitalization). Vital status was determined from the medical records or public records office. Overall survival was estimated using the Kaplan-Meier method. Cox regression was performed to estimate unadjusted hazard ratios. Variables with P < 0.20 by univariate analysis were selected for multivariate analysis. P < 0.05 was considered statistically significant. Of the 136 patients (mean age, 73.1 y; 52.2% men), 29.4%, 41.2%, and 29.4% were classified as normal, at risk for undernutrition, and undernutrition, respectively, according to the MNA-SF. The mortality rate was 31.6% after 12 mo. One-year mortality was higher among the undernourished patients, followed by patients at risk for undernutrition. After adjustment for confounding variables, the multivariate regression Cox model showed that being undernourished according to the MNA-SF increased the risk for death at 1 y (hazard ratio, 5.59; 95% confidence interval, 1.8-17.3; P < 0.001). The results showed that the MNA-SF can be a useful tool in identifying elderly patients at higher risk for 1-y mortality. Copyright © 2016 Elsevier Inc. All rights reserved.
Multiple barriers delay care among women with abnormal cancer screening despite patient navigation.
Ramachandran, Ambili; Freund, Karen M; Bak, Sharon M; Heeren, Timothy C; Chen, Clara A; Battaglia, Tracy A
2015-01-01
While there is widespread dissemination of patient navigation programs in an effort to reduce delays in cancer care, little is known about the impact of barriers to care on timely outcomes. We conducted a secondary analysis of the Boston Patient Navigation Research Program (PNRP) to examine the effect that the presence of barriers had on time to diagnostic resolution of abnormal breast or cervical cancer screening tests. We used multivariable Cox proportional hazards regression with time to diagnostic resolution as the outcome to examine the effect of the number of barriers, controlling for demographic covariates and clustered by patients' primary navigator. There were 1481 women who received navigation; mean age was 39 years; 32% were White, 27% Black, and 31% Hispanic; 28% had private health insurance; and 38% did not speak English. Overall, half (n=745, 50%) had documentation of one or more barriers to care. Women with barriers were more likely to be older, non-White, non-English language speakers, and on public or no health insurance compared with women without barriers. In multivariable analyses, we found less timely diagnostic resolution as the number of barriers increased (one barrier, adjusted hazard ratio [aHR] 0.81 [95% CI 0.56-1.17], p=0.26; two barriers, aHR 0.55 [95% CI 0.37-0.81], p=0.0025; three or more barriers, aHR 0.31 [95% CI 0.21-0.46], p<0.0001). Within a patient navigation program proven to reduce delays in care, we found that navigated patients with documented barriers to care experience less timely resolution of abnormal cancer screening tests.
Momen-Heravi, Fatemeh; Masugi, Yohei; Qian, Zhi Rong; Nishihara, Reiko; Liu, Li; Smith-Warner, Stephanie A; Keum, NaNa; Zhang, Lanjing; Tchrakian, Nairi; Nowak, Jonathan A; Yang, Wanshui; Ma, Yanan; Bowden, Michaela; da Silva, Annacarolina; Wang, Molin; Fuchs, Charles S; Meyerhardt, Jeffrey A; Ng, Kimmie; Wu, Kana; Giovannucci, Edward; Ogino, Shuji; Zhang, Xuehong
2017-12-15
Although experimental evidence suggests calcium-sensing receptor (CASR) as a tumor-suppressor, the prognostic role of tumor CASR expression in colorectal carcinoma remains unclear. We hypothesized that higher tumor CASR expression might be associated with improved survival among colorectal cancer patients. We evaluated tumor expression levels of CASR by immunohistochemistry in 809 incident colorectal cancer patients within the Nurses' Health Study and the Health Professionals Follow-up Study. We used Cox proportional hazards regression models to estimate multivariable hazard ratio (HR) for the association of tumor CASR expression with colorectal cancer-specific and all-cause mortality. We adjusted for potential confounders including tumor biomarkers such as microsatellite instability, CpG island methylator phenotype, LINE-1 methylation level, expressions of PTGS2, VDR and CTNNB1 and mutations of KRAS, BRAF and PIK3CA. There were 240 colorectal cancer-specific deaths and 427 all-cause deaths. The median follow-up of censored patients was 10.8 years (interquartile range: 7.2, 15.1). Compared with patients with no or weak expression of CASR, the multivariable HRs for colorectal cancer-specific mortality were 0.80 [95% confidence interval (CI): 0.55-1.16] in patients with moderate CASR expression and 0.50 (95% CI: 0.32-0.79) in patients with intense CASR expression (p-trend = 0.003). The corresponding HRs for overall mortality were 0.85 (0.64-1.13) and 0.81 (0.58-1.12), respectively. Higher tumor CASR expression was associated with a lower risk of colorectal cancer-specific mortality. This finding needs further confirmation and if confirmed, may lead to better understanding of the role of CASR in colorectal cancer progression. © 2017 UICC.
Optimal Utilization of Donor Grafts With Extended Criteria
Cameron, Andrew M.; Ghobrial, R Mark; Yersiz, Hasan; Farmer, Douglas G.; Lipshutz, Gerald S.; Gordon, Sherilyn A.; Zimmerman, Michael; Hong, Johnny; Collins, Thomas E.; Gornbein, Jeffery; Amersi, Farin; Weaver, Michael; Cao, Carlos; Chen, Tony; Hiatt, Jonathan R.; Busuttil, Ronald W.
2006-01-01
Objective: Severely limited organ resources mandate maximum utilization of donor allografts for orthotopic liver transplantation (OLT). This work aimed to identify factors that impact survival outcomes for extended criteria donors (ECD) and developed an ECD scoring system to facilitate graft-recipient matching and optimize utilization of ECDs. Methods: Retrospective analysis of over 1000 primary adult OLTs at UCLA. Extended criteria (EC) considered included donor age (>55 years), donor hospital stay (>5 days), cold ischemia time (>10 hours), and warm ischemia time (>40 minutes). One point was assigned for each extended criterion. Cox proportional hazard regression model was used for multivariate analysis. Results: Of 1153 allografts considered in the study, 568 organs exhibited no extended criteria (0 score), while 429, 135 and 21 donor allografts exhibited an EC score of 1, 2 and 3, respectively. Overall 1-year patient survival rates were 88%, 82%, 77% and 48% for recipients with EC scores of 0, 1, 2 and 3 respectively (P < 0.001). Adjusting for recipient age and urgency at the time of transplantation, multivariate analysis identified ascending mortality risk ratios of 1.4 and 1.8 for EC scores of 1 and 2, respectively, compared with a score of 0 (P < 0.01). In contrast, an EC score of 3 was associated with a mortality risk ratio of 4.5 (P < 0.001). Further, advanced recipient age linearly increased the death hazard ratio, while an urgent recipient status increased the risk ratio of death by 50%. Conclusions: Extended criteria donors can be scored using readily available parameters. Optimizing perioperative variables and matching ECD allografts to appropriately selected recipients are crucial to maintain acceptable outcomes and represent a preferable alternative to both high waiting list mortality and to a potentially futile transplant that utilizes an ECD for a critically ill recipient. PMID:16772778
Serum prognostic biomarkers in head and neck cancer patients.
Lin, Ho-Sheng; Siddiq, Fauzia; Talwar, Harvinder S; Chen, Wei; Voichita, Calin; Draghici, Sorin; Jeyapalan, Gerald; Chatterjee, Madhumita; Fribley, Andrew; Yoo, George H; Sethi, Seema; Kim, Harold; Sukari, Ammar; Folbe, Adam J; Tainsky, Michael A
2014-08-01
A reliable estimate of survival is important as it may impact treatment choice. The objective of this study is to identify serum autoantibody biomarkers that can be used to improve prognostication for patients affected with head and neck squamous cell carcinoma (HNSCC). Prospective cohort study. A panel of 130 serum biomarkers, previously selected for cancer detection using microarray-based serological profiling and specialized bioinformatics, were evaluated for their potential as prognostic biomarkers in a cohort of 119 HNSCC patients followed for up to 12.7 years. A biomarker was considered positive if its reactivity to the particular patient's serum was greater than one standard deviation above the mean reactivity to sera from the other 118 patients, using a leave-one-out cross-validation model. Survival curves were estimated according to the Kaplan-Meier method, and statistically significant differences in survival were examined using the log rank test. Independent prognostic biomarkers were identified following analysis using multivariate Cox proportional hazards models. Poor overall survival was associated with African Americans (hazard ratio [HR] for death = 2.61; 95% confidence interval [CI]: 1.58-4.33; P = .000), advanced stage (HR = 2.79; 95% CI: 1.40-5.57; P = .004), and recurrent disease (HR = 6.66; 95% CI: 2.54-17.44; P = .000). On multivariable Cox analysis adjusted for covariates (race and stage), six of the 130 markers evaluated were found to be independent prognosticators of overall survival. The results shown here are promising and demonstrate the potential use of serum biomarkers for prognostication in HNSCC patients. Further clinical trials to include larger samples of patients across multiple centers may be warranted. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
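The positivity rule described above, where a marker is positive when its reactivity exceeds the mean plus one standard deviation of the other patients' sera under leave-one-out cross-validation, is simple to express directly; the reactivity values below are invented.

```python
import numpy as np

# Invented reactivity values for one biomarker across ten patients' sera.
reactivity = np.array([0.8, 1.1, 0.9, 2.7, 1.0, 1.2, 0.7, 3.1, 1.0, 0.95])

positive = []
for i, x in enumerate(reactivity):
    others = np.delete(reactivity, i)                  # leave patient i out
    threshold = others.mean() + others.std(ddof=1)     # mean + 1 SD of the rest
    positive.append(x > threshold)

print(list(zip(reactivity.tolist(), positive)))
```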
Vatandoust, Sina; Price, Timothy J; Ullah, Shahid; Roy, Amitesh C; Beeke, Carole; Young, Joanne P; Townsend, Amanda; Padbury, Robert; Roder, David; Karapetis, Christos S
2016-03-01
Colorectal cancer (CRC) is a common malignancy. There is growing evidence that CRC incidence is increasing in the younger population. There is controversy surrounding the prognosis of young patients with CRC. In this study we reviewed Australian patients with metastatic CRC (mCRC) who were younger than 40 years of age at the time of diagnosis of metastatic disease. To our knowledge this is the first study to focus on this age group with mCRC. This was a retrospective study using data from the South Australian Metastatic Colorectal Cancer database. We compared patient and disease characteristics, management approaches, and outcomes for age groups < 40 and ≥ 40. A multivariate Cox proportional hazards model was fitted to compare the survival outcomes (death from all causes) between the 2 groups. From 3318 patients, 46 (1.4%) were younger than 40 years of age. In a comparison of patients in the younger than 40-year-old group with the older group, a greater proportion had synchronous metastatic disease (80.4% vs. 64.4%, respectively; P = .04) and disease originating from the left colon (71.7% vs. 61.7%, respectively; P = .035); also a larger proportion in the younger than 40-year-old group received chemotherapy compared with the older group (82.6% vs. 58.7%, respectively; P < .01). In the adjusted multivariate model, survival was not significantly different between the 2 groups (hazard ratio, 0.81; 95% confidence interval, 0.56-1.16; log rank P = .25). Young-onset mCRC patients, when defined as aged younger than 40 years, have equivalent survival compared with their older counterparts. This is despite differences in disease characteristics and management approach between the 2 groups. Copyright © 2016 Elsevier Inc. All rights reserved.
Lucas, Michel; O’Reilly, Eilis J.; Pan, An; Mirzaei, Fariba; Willett, Walter C.; Okereke, Olivia I.; Ascherio, Alberto
2014-01-01
Objective To evaluate the association between coffee and caffeine consumption and suicide risk in three large-scale cohorts of U.S. men and women. Methods We accessed data of 43,599 men enrolled in the Health Professionals Follow-up Study (HPFS, 1988–2008), 73,820 women in the Nurses’ Health Study (NHS, 1992–2008), and 91,005 women in the NHS II (1993–2007). Consumption of caffeine, coffee, and decaffeinated coffee, was assessed every four years by validated food-frequency questionnaires. Deaths from suicide were determined by physician review of death certificates. Multivariate adjusted relative risks (RRs) were estimated with Cox proportional hazard models. Cohort specific RRs were pooled using random-effect models. Results We documented 277 deaths from suicide. Compared to those consuming ≤1 cup/week of caffeinated coffee (≤8 oz/237 ml), the pooled multivariate RR (95% confidence interval [CI]) of suicide was 0.55 (0.38–0.78) for those consuming 2–3 cups/day and 0.47 (0.27–0.81) for those consuming ≥4 cups/day (P trend <0.001). The pooled multivariate RR (95% CI) for suicide was 0.75 (0.63–0.90) for each increment of 2 cups/day of caffeinated coffee and 0.77 (0.63–0.93) for each increment of 300 mg/day of caffeine. Conclusions These results from three large cohorts support an association between caffeine consumption and lower risk of suicide. PMID:23819683
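Pooling cohort-specific relative risks with a random-effects model, as mentioned above, is often done on the log scale with DerSimonian-Laird weights. A worked sketch with invented cohort estimates follows; real analyses would take the log-RRs and standard errors from each cohort's own model.

```python
import numpy as np

# Invented cohort-specific relative risks and standard errors of their logs.
log_rr = np.log(np.array([0.55, 0.62, 0.70]))
se = np.array([0.20, 0.18, 0.25])

# DerSimonian-Laird estimate of the between-cohort variance tau^2.
w_fixed = 1 / se**2
mu_fixed = np.average(log_rr, weights=w_fixed)
q = np.sum(w_fixed * (log_rr - mu_fixed)**2)
c = w_fixed.sum() - (w_fixed**2).sum() / w_fixed.sum()
tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)

# Random-effects pooled estimate and 95% CI.
w_re = 1 / (se**2 + tau2)
pooled = np.average(log_rr, weights=w_re)
pooled_se = np.sqrt(1 / w_re.sum())
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```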
Tanno, Kozo; Sakata, Kiyomi; Ohsawa, Masaki; Onoda, Toshiyuki; Itai, Kazuyoshi; Yaegashi, Yumi; Tamakoshi, Akiko
2009-07-01
To determine whether presence of ikigai as a positive psychological factor is associated with decreased risks for all-cause and cause-specific mortality among middle-aged and elderly Japanese men and women. From 1988 to 1990, a total of 30,155 men and 43,117 women aged 40 to 79 years completed a lifestyle questionnaire including a question about ikigai. Mortality follow-up was available for a mean of 12.5 years and was classified as having occurred in the first 5 years or the subsequent follow-up period. Associations between ikigai and all-cause and cause-specific mortality were assessed using a Cox's regression model. Multivariate hazard ratios (HRs) were adjusted for age, body mass index, drinking and smoking status, physical activity, sleep duration, education, occupation, marital status, perceived mental stress, and medical history. During the follow-up period, 10,021 deaths were recorded. Men and women with ikigai had decreased risks of mortality from all causes in the long-term follow-up period; multivariate HRs (95% confidence intervals, CIs) were 0.85 (0.80-0.90) for men and 0.93 (0.86-1.00) for women. The risk of cardiovascular mortality was reduced in men with ikigai; the multivariate HR (95% CI) was 0.86 (0.76-0.97). Furthermore, men and women with ikigai had a decreased risk for mortality from external causes; multivariate HRs (95% CIs) were 0.74 (0.59-0.93) for men and 0.67 (0.51-0.88) for women. The findings suggest that a positive psychological factor such as ikigai is associated with longevity among Japanese people.
Turner, Melanie; Barber, Mark; Dodds, Hazel; Dennis, Martin; Langhorne, Peter; Macleod, Mary Joan
2015-03-01
Randomised trials indicate that stroke unit care reduces morbidity and mortality after stroke. Similar results have been seen in observational studies but many have not corrected for selection bias or independent predictors of outcome. We evaluated the effect of stroke unit compared with general ward care on outcomes after stroke in Scotland, adjusting for case mix by incorporating the six simple variables (SSV) model, also taking into account selection bias and stroke subtype. We used routine data from National Scottish datasets for acute stroke patients admitted between 2005 and 2011. Patients who died within 3 days of admission were excluded from analysis. The main outcome measures were survival and discharge home. Multivariable logistic regression was used to estimate the OR for survival, and adjustment was made for the effect of the SSV model and for early mortality. A Cox proportional hazards model was used to estimate the hazard of death within 365 days. There were 41 692 index stroke events; 79% were admitted to a stroke unit at some point during their hospital stay and 21% were cared for in a general ward. Using the SSV model, we obtained an area under the receiver operating characteristic curve of 0.82 (SE 0.002) for mortality at 6 months. The adjusted OR for survival at 7 days was 3.11 (95% CI 2.71 to 3.56) and at 1 year 1.43 (95% CI 1.34 to 1.54), while the adjusted OR for being discharged home was 1.19 (95% CI 1.11 to 1.28) for stroke unit care. In routine practice, stroke unit admission is associated with a greater likelihood of discharge home and with lower mortality up to 1 year, after correcting for known independent predictors of outcome, and excluding early non-modifiable mortality. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
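The 0.82 discrimination figure reported above for the six-simple-variables model is an area under the receiver operating characteristic curve; computing such an AUC from predicted risks and observed outcomes is straightforward, illustrated here on synthetic data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 500
died_6mo = rng.binomial(1, 0.25, n)
# Synthetic predicted risks that are loosely correlated with the outcome.
predicted_risk = np.clip(0.25 + 0.3 * (died_6mo - 0.25) + rng.normal(0, 0.15, n), 0, 1)

print(f"AUC = {roc_auc_score(died_6mo, predicted_risk):.2f}")
```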
Cukierman-Yaffe, Tali; Kasher-Meron, Michal; Fruchter, Eyal; Gerstein, Hertzel C; Afek, Arnon; Derazne, Estela; Tzur, Dorit; Karasik, Avraham; Twig, Gilad
2015-12-01
Although dysglycemia is a risk factor for cognitive decline, it is unknown whether cognitive performance among young and apparently healthy adults affects the risk for impaired fasting glucose (IFG). This study aimed to characterize the relationship between cognitive function and the risk for IFG among young adults. This was a retrospective cohort study utilizing data collected at pre-military recruitment assessments with information collected at the screening center of the Israeli Army Medical Corps. Normoglycemic adults (n = 17,348; free of IFG and diabetes; mean age 31.0 ± 5.6 y; 87% men) from the Metabolic Lifestyle and Nutrition Assessment in Young Adults (MELANY) cohort were included, with data regarding their General Intelligence Score (GIS), a comprehensive measure of cognitive function, at age 17 y. Fasting plasma glucose was assessed every 3-5 y at scheduled visits. Cox proportional hazards models were applied. The main outcome of the study was incident IFG (≥100 mg/dL and <126 mg/dL) at scheduled visits. During a median follow-up of 6.6 y, 1478 cases of IFG were recorded (1402 men). After adjustment for age and sex, participants in the lowest GIS category had a 1.9-fold greater risk for incident IFG compared with those in the highest GIS category. In multivariable analysis adjusted for age, sex, body mass index, fasting plasma glucose, family history of diabetes, country of origin, socioeconomic status, education, physical activity, smoking status, alcohol consumption, breakfast consumption, triglyceride level, and white blood cell count, the risk for IFG was nearly doubled in the lowest GIS category compared with the highest GIS category (hazard ratio, 1.8; 95% confidence interval, 1.4-2.3; P < .001). These results persisted when GIS was treated as a continuous variable and when the model was also adjusted for body mass index at the end of follow-up. This study demonstrates that lower cognitive function in late adolescence is independently associated with an elevated risk of IFG in both men and women.
Prediagnostic Plasma 25-Hydroxyvitamin D and Pancreatic Cancer Survival
Yuan, Chen; Qian, Zhi Rong; Babic, Ana; Morales-Oyarvide, Vicente; Rubinson, Douglas A.; Kraft, Peter; Ng, Kimmie; Bao, Ying; Giovannucci, Edward L.; Ogino, Shuji; Stampfer, Meir J.; Gaziano, John Michael; Sesso, Howard D.; Buring, Julie E.; Cochrane, Barbara B.; Chlebowski, Rowan T.; Snetselaar, Linda G.; Manson, JoAnn E.; Fuchs, Charles S.
2016-01-01
Purpose Although vitamin D inhibits pancreatic cancer proliferation in laboratory models, the association of plasma 25-hydroxyvitamin D [25(OH)D] with patient survival is largely unexplored. Patients and Methods We analyzed survival among 493 patients from five prospective US cohorts who were diagnosed with pancreatic cancer from 1984 to 2008. We estimated hazard ratios (HRs) for death by plasma level of 25(OH)D (insufficient, < 20 ng/mL; relative insufficiency, 20 to < 30 ng/mL; sufficient ≥ 30 ng/mL) by using Cox proportional hazards regression models adjusted for age, cohort, race and ethnicity, smoking, diagnosis year, stage, and blood collection month. We also evaluated 30 tagging single-nucleotide polymorphisms in the vitamin D receptor gene, requiring P < .002 (0.05 divided by 30 genotyped variants) for statistical significance. Results Mean prediagnostic plasma level of 25(OH)D was 24.6 ng/mL, and 165 patients (33%) were vitamin D insufficient. Compared with patients with insufficient levels, multivariable-adjusted HRs for death were 0.79 (95% CI, 0.48 to 1.29) for patients with relative insufficiency and 0.66 (95% CI, 0.49 to 0.90) for patients with sufficient levels (P trend = .01). These results were unchanged after further adjustment for body mass index and history of diabetes (P trend = .02). The association was strongest among patients with blood collected within 5 years of diagnosis, with an HR of 0.58 (95% CI, 0.35 to 0.98) comparing patients with sufficient to patients with insufficient 25(OH)D levels. No single-nucleotide polymorphism at the vitamin D receptor gene met our corrected significance threshold of P < .002; rs7299460 was most strongly associated with survival (HR per minor allele, 0.80; 95% CI, 0.68 to 0.95; P = .01). Conclusion We observed longer overall survival in patients with pancreatic cancer who had sufficient prediagnostic plasma levels of 25(OH)D. PMID:27325858
Feldman, Candace H; Hiraki, Linda T; Winkelmayer, Wolfgang C; Marty, Francisco M; Franklin, Jessica M; Kim, Seoyoung C; Costenbader, Karen H
2015-06-01
To examine the epidemiology of serious infections, a significant cause of morbidity and mortality in systemic lupus erythematosus (SLE), in a nationwide cohort of SLE and lupus nephritis (LN) patients. Using the Medicaid Analytic eXtract database for the years 2000-2006, we identified patients ages 18-64 years who had SLE and the subset who had LN. We ascertained cases of serious hospitalized infections using validated algorithms, and we determined 30-day mortality rates. Poisson regression was used to calculate infection incidence rates and multivariable Cox proportional hazards models were used to calculate hazard ratios (HRs) for the first infection, adjusted for sociodemographic variables, medication use, and an SLE-specific risk adjustment index. We identified 33,565 patients with SLE, 7,113 of whom had LN. There were 9,078 serious infections in 5,078 SLE patients and 3,494 infections in 1,825 LN patients. The infection incidence rate per 100 person-years was 10.8 in the SLE cohort and 23.9 in the LN subcohort. In adjusted models for the SLE cohort, we observed increased risks of infection in men as compared to women (HR 1.33 [95% confidence interval (95% CI) 1.20-1.47]), in blacks as compared to whites (HR 1.14 [95% CI 1.06-1.21]), and in users of glucocorticoids (HR 1.51 [95% CI 1.43-1.61]) and immunosuppressive drugs (HR 1.11 [95% CI 1.03-1.20]) as compared to never users. Hydroxychloroquine users had a reduced risk of infection as compared to never users (HR 0.73 [95% CI 0.68-0.77]). The 30-day mortality rate per 1,000 person-years among those hospitalized with infections was 21.4 in the SLE cohort and 38.6 in the LN subcohort. In this diverse, nationwide cohort of SLE patients, we observed a substantial burden of serious infections with many subsequent deaths, particularly among those with LN. © 2015, American College of Rheumatology.
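Incidence rates per 100 person-years, and rate ratios between cohorts, can be obtained from a Poisson model with a person-time offset, as in the abstract above. In the sketch below the infection counts echo the abstract, but the person-years are rough back-calculations supplied only to make the example run.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Infection counts from the abstract; person-years are approximate, for illustration only.
df = pd.DataFrame({
    "group": ["SLE", "LN"],
    "infections": [9078, 3494],
    "person_years": [84000, 14600],
})

# Crude incidence rates per 100 person-years.
print((df["infections"] / df["person_years"] * 100).round(1))

# Poisson model with a log person-time offset; exp(coef) gives the LN vs SLE rate ratio.
X = sm.add_constant((df["group"] == "LN").astype(float))
fit = sm.GLM(df["infections"], X, family=sm.families.Poisson(),
             offset=np.log(df["person_years"])).fit()
print(np.exp(fit.params))
```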
Survival after Acute Hemodialysis in Pennsylvania, 2005–2007: A Retrospective Cohort Study
Ramer, Sarah J.; Cohen, Elan D.; Chang, Chung-Chou H.; Unruh, Mark L.; Barnato, Amber E.
2014-01-01
Background Little is known about acute hemodialysis in the US. Here we describe predictors of receipt of acute hemodialysis in one state and estimate the marginal impact of acute hemodialysis on survival after accounting for confounding due to illness severity. Materials and Methods This is a retrospective cohort study of acute-care hospitalizations in Pennsylvania from October 2005 to December 2007 using data from the Pennsylvania Health Care Cost Containment Council. Exposure variable is acute hemodialysis; dependent variable is survival following acute hemodialysis. We used multivariable logistic regression to determine propensity to receive acute hemodialysis and then, for a Cox proportional hazards model, matched acute hemodialysis and non-acute hemodialysis patients 1:5 on this propensity. Results In 2,131,248 admissions of adults without end-stage renal disease, there were 6,657 instances of acute hemodialysis. In analyses adjusted for predicted probability of death upon admission plus other covariates and stratified on age, being male, black, and insured were independent predictors of receipt of acute hemodialysis. One-year post-admission mortality was 43% for those receiving acute hemodialysis, compared to 13% among those not receiving acute hemodialysis. After matching on propensity to receive acute hemodialysis and adjusting for predicted probability of death upon admission, patients who received acute hemodialysis had a higher risk of death than patients who did not over at least 1 year of follow-up (hazard ratio 1.82, 95% confidence interval 1.68-1.97). Conclusions In a populous US state, receipt of acute hemodialysis varied by age, sex, race, and insurance status even after adjustment for illness severity. In a comparison of patients with similar propensity to receive acute hemodialysis, those who did receive it were less likely to survive than those who did not. These findings raise questions about reasons for lack of benefit. PMID:25141028
Schoormans, Dounya; van de Poll-Franse, Lonneke; Vissers, Pauline; van Herk-Sukel, Myrthe P P; Pedersen, Susanne S; Rottmann, Nina; Horsbøl, Trine; Dalton, Susanne; Denollet, Johan
2017-11-01
To examine the associations between pharmaceutically treated anxiety and depression present in the year prior to breast cancer diagnosis and the risk of incident cardiovascular disease (CVD), while controlling for traditional cardiovascular risk factors and clinical characteristics in a population-based observational study. Adult 1-year breast cancer survivors (n = 7227), diagnosed between 01-01-1999 and 12-31-2010, with no history of CVD, were selected from the Netherlands Cancer Registry. Drug dispensing data were derived from the PHARMO Database Network and used as proxy for CVD, anxiety, and depression. By multivariable Cox regression analysis, we examined the risk associated with pharmaceutically treated anxiety and depression for developing CVD after cancer diagnosis, adjusting for age, pharmaceutically treated hypertension, hypercholesterolemia, and diabetes mellitus in the year prior to cancer diagnosis, tumor stage, and cancer treatment. During the 13-year follow-up period, 193 (3%) breast cancer survivors developed CVD. Women pharmaceutically treated for anxiety in the year prior to their cancer diagnosis had a 48% increased hazard for CVD [HR = 1.48; 95% CI 1.05-1.08] after full adjustment. This association was restricted to breast cancer survivors who were 65 years or younger. Depression was not associated with CVD risk [HR = 0.89; 95% CI 0.52-1.53]. Older age [HR = 1.06; 95% CI 1.05-1.08], hypertension [HR = 1.80; 95% CI 1.32-2.46], and hypercholesterolemia [HR = 1.63; 95% CI 1.15-2.33] were associated with an increased hazard for incident CVD, whereas hormone therapy [HR = 0.59; 95% CI 0.42-0.83] was protective. Anxiety present in the year prior to breast cancer diagnosis increases the risk of incident CVD in 1-year breast cancer survivors, after adjustment for depression, traditional cardiovascular risk factors, and clinical characteristics.
Kuwahara, Keisuke; Honda, Toru; Nakagawa, Tohru; Yamamoto, Shuichiro; Hayashi, Takeshi; Mizoue, Tetsuya
2018-02-05
Data on the effect of physical activity intensity on depression is scarce. We investigated the prospective association between intensity of leisure-time exercise and risk of depressive symptoms among Japanese workers. The participants were 29,052 employees (24,653 men and 4,399 women) aged 20 to 64 years without psychiatric disease including depressive symptoms at health checkup in 2006-2007 and were followed up until 2014-2015. Details of leisure-time exercise were ascertained via a questionnaire. Depressive states were assessed using a 13-item questionnaire. Multivariable-adjusted hazard ratio of depressive symptoms was estimated using Cox regression analysis. During a mean follow-up of 5.8 years with 168,203 person-years, 6,847 workers developed depressive symptoms. Compared with workers who engaged in no exercise during leisure-time (0 MET-hours per week), hazard ratios (95% confidence intervals) associated with >0 to <7.5, 7.5 to <15.0, and ≥15.0 MET-hours of leisure-time exercise were 0.88 (0.82-0.94), 0.85 (0.76-0.94), and 0.78 (0.68-0.88) among workers who engaged in moderate-intensity exercise alone; 0.93 (0.82-1.06), 0.82 (0.68-0.98), and 0.83 (0.71-0.98) among workers who engaged in vigorous-intensity exercise alone; and 0.96 (0.80-1.15), 0.80 (0.67-0.95), and 0.76 (0.66-0.87) among workers who engaged in both moderate- and vigorous-intensity exercise with adjustment for age, sex, lifestyles, work-related and socioeconomic factors, and body mass index. Additional adjustment for baseline depression score attenuated the inverse association, especially among those who engaged in moderate-intensity exercise alone. The results suggest that vigorous-intensity exercise alone or vigorous-intensity combined with moderate-intensity exercise would prevent depressive symptoms among Japanese workers.
Nanda, Akash; Chen, M.-H.; Moran, Brian J.
2010-05-01
Purpose: To identify clinical factors associated with prostate cancer-specific mortality (PCSM), adjusting for comorbidity, in elderly men with intermediate-risk prostate cancer treated with brachytherapy alone or in conjunction with external beam radiation therapy. Methods and Materials: The study cohort comprised 1,978 men of median age 71 (interquartile range, 66-75) years with intermediate-risk disease (Gleason score 7, prostate-specific antigen (PSA) 20 ng/mL or less, tumor category T2c or less). Fine and Gray's multivariable competing risks regression was used to assess whether prevalent cardiovascular disease (CVD), age, treatment, year of brachytherapy, PSA level, or tumor category was associated with the risk of PCSM. Results: After a median follow-up of 3.2 (interquartile range, 1.7-5.4) years, the presence of CVD was significantly associated with a decreased risk of PCSM (adjusted hazard ratio, 0.20; 95% CI 0.04-0.99; p = 0.05), whereas an increasing PSA level was significantly associated with an increased risk of PCSM (adjusted hazard ratio 1.14; 95% CI 1.02-1.27; p = 0.02). In the absence of CVD, cumulative incidence estimates of PCSM were higher (p = 0.03) in men with PSA levels above as compared with the median PSA level (7.3 ng/mL) or less; however, in the setting of CVD there was no difference (p = 0.27) in these estimates stratified by the median PSA level (6.9 ng/mL). Conclusions: In elderly men with intermediate-risk prostate cancer, CVD status is a negative predictor of PCSM and affects the prognostic capacity of pretreatment PSA level. These observations support the potential utility of prerandomization stratification by comorbidity to more accurately assess prognostic factors and treatment effects within this population.
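The study above uses Fine and Gray's competing-risks regression, which the lifelines library does not implement; as a related illustration only, the sketch below estimates the cumulative incidence of the event of interest in the presence of a competing event with lifelines' Aalen-Johansen estimator, on synthetic data.

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(4)
n = 400
time_yr = rng.exponential(8, n)
# Event codes: 0 = censored, 1 = prostate-cancer death, 2 = death from other causes.
event = rng.choice([0, 1, 2], size=n, p=[0.6, 0.1, 0.3])

ajf = AalenJohansenFitter()
ajf.fit(time_yr, event, event_of_interest=1)
print(ajf.cumulative_density_.tail())  # cumulative incidence of the event of interest
```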
Berton, Giuseppe; Cordiano, Rocco; Cavuto, Fiorella; Bagato, Francesco; Pellegrinet, Marco; Cati, Arianna
2016-10-01
We investigated the gender-based differences in the association between heart failure (HF) during acute coronary syndrome (ACS) and post-discharge, long-term cardiovascular (CV) mortality. The present study included 557 patients enrolled in three intensive coronary care units and discharged alive. HF during ACS was evaluated by Killip class and left ventricular ejection fraction (LVEF). Interaction between gender and HF after 15 years of follow-up was studied using Cox models including a formal interaction term. Median age was 67 (interquartile range [IQR], 59-75) years, 29% were females, 37% had non-ST elevation myocardial infarction and 32% Killip class >1, and median LVEF was 53% (IQR 46-61). All but five patients were followed up to 15 years, representing 5332 person-years. Of these, 40.2% died of CV-related causes. Crude CV mortality rate was higher among women (52.2%) than men (35.3%; P<0.0001). At a univariable level, a negative interaction between female gender and Killip class for CV mortality was found [hazard ratio (HR)=0.51 (0.34-0.77), P=0.002]. In five multivariable models after controlling for age, main CV risk factors, clinical features, post-discharge medical treatment, and mechanical coronary reperfusion, the interaction was significant across all models [HR=0.63 (0.42-0.95), P=0.02 in the fully adjusted model]. LVEF showed no significant hazard associated with female gender on univariable analysis [HR=1.4 (0.9-2.0), P=0.11] but did so in all adjusted models [HR=1.7 (1.2-2.5), P=0.005 in the fully adjusted model]. Gender is a consistent, independent effect modifier in the association between HF and long-term CV mortality after ACS. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
44 CFR 62.21 - Claims adjustment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program SALE OF INSURANCE AND ADJUSTMENT OF CLAIMS Claims Adjustment, Claims Appeals, and Judicial Review § 62.21 Claims adjustment. (a) In...
Girotra, Saket; Kitzman, Dalane W.; Kop, Willem J.; Stein, Phyllis K.; Gottdiener, John S.; Mukamal, Kenneth J.
2012-01-01
OBJECTIVES To determine the relationship between heart rate response during low-grade physical exertion (six-minute walk) and mortality and adverse cardiovascular outcomes in the elderly. METHODS Participants in the Cardiovascular Health Study, who completed a six-minute walk test, were included. We used delta heart rate (difference between post-walk heart rate and resting heart rate) as a measure of chronotropic response and examined its association with 1) all-cause mortality and 2) incident coronary heart disease (CHD) event, using multivariable Cox regression models. RESULTS We included 2224 participants (mean age 77±4 years; 60% women, 85% white). The average delta heart rate was 26 beats/min. Participants in the lowest tertile of delta heart rate (<20 beats/min) had higher risk-adjusted mortality (hazard ratio [HR] 1.18; 95% confidence interval [CI] 1.00, 1.40) and incident CHD (HR 1.37; 95% CI 1.05, 1.78) compared to subjects in the highest tertile (≥30 beats/min), with a significant linear trend across tertiles (P for trend <0.05 for both outcomes). This relationship was not significant after adjustment for distance walked. CONCLUSION Impaired chronotropic response during the six-minute walk test was associated with an increased risk of mortality and incident CHD among the elderly. This association was attenuated after adjusting for distance walked. PMID:22722364
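Grouping a continuous measure into tertiles and estimating hazard ratios against a reference tertile, as in the delta-heart-rate analysis above, can be sketched as follows; the data are simulated and the highest tertile is used as the reference.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 300
df = pd.DataFrame({
    "delta_hr": rng.normal(26, 8, n),
    "years": rng.exponential(9, n).round(2) + 0.1,
    "died": rng.binomial(1, 0.3, n),
})

# Form tertiles of delta heart rate and dummy-code them, highest tertile as the reference.
df["tertile"] = pd.qcut(df["delta_hr"], 3, labels=["low", "mid", "high"])
dummies = pd.get_dummies(df["tertile"], prefix="dhr").astype(float).drop(columns="dhr_high")

cph = CoxPHFitter()
cph.fit(pd.concat([df[["years", "died"]], dummies], axis=1),
        duration_col="years", event_col="died")
cph.print_summary()  # HRs for the low and mid tertiles versus the highest
```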
Smoking and hemorrhagic stroke mortality in a prospective cohort study of older Chinese.
Xu, Lin; Schooling, Catherine Mary; Chan, Wai Man; Lee, Siu Yin; Leung, Gabriel M; Lam, Tai Hing
2013-08-01
Hemorrhagic stroke is more common in non-Western settings and does not always share risk factors with other cardiovascular diseases. The association of smoking with hemorrhagic stroke subtypes has not been established. We examined the association of cigarette smoking with hemorrhagic stroke, by subtype (intracerebral hemorrhage and subarachnoid hemorrhage), in a large cohort of older Chinese from Hong Kong. Multivariable Cox regression analysis was used to assess the adjusted associations of smoking at baseline with death from hemorrhagic stroke and its subtypes, using a population-based prospective cohort of 66 820 Chinese aged>65 years enrolled from July 1998 to December 2001 at all the 18 Elderly Health Centers of the Hong Kong Government Department of Health and followed until May 31, 2012. After follow-up for an average of 10.9 years (SD=3.1), 648 deaths from hemorrhagic stroke had occurred, of which 530 (82%) were intracerebral hemorrhage. Current smoking was associated with a higher risk of hemorrhagic stroke (hazard ratio, 2.19; 95% confidence interval, 1.49-3.22), intracerebral hemorrhage (1.94; 1.25-3.01), and subarachnoid hemorrhage (3.58; 1.62-7.94), adjusted for age, sex, education, public assistance, housing type, monthly expenditure, alcohol use, and exercise. Further adjustment for hypertension and body mass index slightly changed the estimates. Smoking is strongly associated with hemorrhagic stroke mortality, particularly for subarachnoid hemorrhage.
Lee, Jae-Hong; Kweon, Helen Hye-In; Choi, Jung-Kyu; Kim, Young-Taek; Choi, Seong-Ho
2017-01-01
The incidence of prostate cancer (PC) accompanying periodontal disease (PD) is anticipated to increase due to population aging. The aim of this study was to determine the association between PD and PC using data in the National Health Insurance Service-Health Examinee Cohort (NHIS-HEC). A random stratified sample of 187,934 South Koreans was collected from the NHIS database from 2002 to 2013. We assessed the relationship between PD and PC while adjusting for potential confounding factors (sex, age, household income, insurance status, residence area, hypertension, diabetes mellitus, cerebral infarction, angina pectoris, myocardial infarction, smoking status, alcohol intake, and regular exercise). The overall incidence of PC with PD among those aged 40 years and older was 0.28% (n = 531). In the multivariate Cox proportional-hazard regression analysis with adjustment for confounding factors, PD was associated with a 14% higher risk of PC (HR = 1.14, 95% CI = 1.01-1.31, P = 0.042). The findings of this study suggest that PD is significantly and positively associated with PC. Further studies are required to identify the mechanisms underlying the links between PD and PC. PMID:28928887
Ambrosy, Andrew P; Bhatt, Ankeet S; Stebbins, Amanda L; Wruck, Lisa M; Fudim, Marat; Greene, Stephen J; Kraus, William E; O'Connor, Christopher M; Piña, Ileana L; Whellan, David J; Mentz, Robert J
2018-05-01
Despite more than 200 years of clinical experience and a pivotal trial, recently published research has called into question the safety and efficacy of digoxin therapy in heart failure (HF). HF-ACTION (ClinicalTrials.gov Number: NCT00047437) enrolled 2331 outpatients with HF and an ejection fraction ≤35% between April 2003 and February 2007 and randomized them to aerobic exercise training versus usual care. Patients were grouped according to prevalent digoxin status at baseline. The association between digoxin therapy and outcomes was assessed using Cox proportional hazard and inverse-probability weighted (IPW) regression models adjusted for demographics, medical history, medications, laboratory values, quality of life, and exercise parameters. The prevalence of digoxin therapy decreased from 52% during the first 6 months of enrollment to 35% at the end of the HF-ACTION trial (P <0.0001). Study participants were 59 ± 13 years of age, 72% were male, and approximately half had an ischemic etiology of HF. Patients receiving digoxin at baseline tended to be younger and were more likely to report New York Heart Association functional class III/IV symptoms (rather than class II) compared to those not receiving digoxin. Patients taking digoxin had worse baseline exercise capacity as measured by peak VO2 and 6-min walk test and greater impairments in health status as reflected by the Kansas City Cardiomyopathy Questionnaire. The association between digoxin and the risk of death or hospitalization differed depending on whether Cox proportional hazard (hazard ratio [HR] 1.03, 95% confidence interval [CI] 0.92-1.16; P = .62) or IPW regression models (HR 1.08, 95% CI 1.00-1.17; P = .057) were used to adjust for potential confounders. Although digoxin use was associated with high-risk clinical features, the association between digoxin therapy and outcomes was dependent on the statistical methods used for multivariable adjustment. Clinical equipoise exists and additional prospective research is required to clarify the role of digoxin in contemporary clinical practice including its effects on functional capacity, quality of life, and long-term outcomes. Copyright © 2018. Published by Elsevier Inc.
Patient survival and surgical re-intervention predictors for intracapsular hip fractures.
González Quevedo, David; Mariño, Iskandar Tamimi; Sánchez Siles, Juan Manuel; Escribano, Esther Romero; Granero Molina, Esther Judith; Enrique, David Bautista; Smoljanović, Tomislav; Pareja, Francisco Villanueva
2017-08-01
Choosing between total hip replacement (THR) and partial hip replacement (PHR) for patients with intracapsular hip fractures is often based on subjective factors. Predicting the survival of these patients and risk of surgical re-intervention is essential to select the most adequate implant. We conducted a retrospective cohort study on mortality of patients over 70 years with intracapsular hip fractures who were treated between January 2010 and December 2013, with either PHR or THR. Patients' information was retrieved from our local computerized database. The age-adjusted Charlson comorbidity index (ACCI) and American Society of Anesthesiologists (ASA) score were calculated for all patients. The patients were followed for 2 years after surgery. Survival and surgical re-intervention rates were compared between the two groups using a multivariate Cox proportional hazard model. A total of 356 individuals were included in this study. At 2 years of follow-up, 221 (74.4%) of the patients with an ACCI score ≤7 were still alive, in contrast to only 20 (29.0%) of those with an ACCI score >7. In addition, 201 (76.2%) of the patients with an ASA score ≤3 were still alive after 2 years, compared to 30 (32.6%) of individuals with an ASA score >3. Patients with an ACCI score >7 and an ASA score >3 had a significant increase in all-cause 2-year mortality (adjusted hazard ratio of 3.2, 95% CI 2.2-4.6; and 3.12, 95% CI 2.2-4.5, respectively). Patients with an ASA score >3 had a borderline-significant increase in the re-intervention risk (adjusted hazard ratio 2.2, 95% CI 1.0-5.1). The sensitivity, specificity, positive predictive value, and negative predictive value of the ACCI in predicting 2-year mortality were 39.2%, 91.1%, 71%, and 74.4%, respectively. The sensitivity, specificity, positive predictive value, and negative predictive value of the ASA score in predicting 2-year mortality were 49.6%, 79.1%, 67.4%, and 76.1%, respectively. Both the ACCI and ASA scales were able to predict the 2-year survival of patients with intracapsular hip fractures. The ASA scale was also able to predict the risk of re-intervention in these patients. Copyright © 2017 Elsevier Ltd. All rights reserved.
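The test characteristics quoted above follow directly from a 2x2 table of score threshold (for example, ACCI > 7) against observed 2-year mortality. A minimal sketch, with purely illustrative cell counts rather than the study data:

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 classification table.
def diagnostic_stats(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # died and flagged high-risk
        "specificity": tn / (tn + fp),   # survived and flagged low-risk
        "ppv": tp / (tp + fp),           # flagged high-risk who actually died
        "npv": tn / (tn + fn),           # flagged low-risk who actually survived
    }

print(diagnostic_stats(tp=49, fp=20, fn=76, tn=211))  # placeholder counts only
```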
Lai, Shih-Wei; Lin, Cheng-Li; Liao, Kuan-Fu
2017-08-01
Very little is known about the association between glaucoma and Parkinson's disease in the elderly. The objective of this study was to determine whether glaucoma is associated with Parkinson's disease in older people in Taiwan. A retrospective cohort study was conducted to analyze the Taiwan National Health Insurance Program database from 2000 to 2010. We included 4330 subjects aged 65 years or older with newly diagnosed glaucoma as the glaucoma group, and 17,000 randomly selected subjects without a glaucoma diagnosis as the non-glaucoma group. Both groups were matched for sex, age, other comorbidities, and index year of glaucoma diagnosis. The incidence of Parkinson's disease at the end of 2011 was measured. A multivariable Cox proportional hazard regression model was used to measure the hazard ratio and 95% confidence interval for Parkinson's disease associated with glaucoma. The overall incidence of Parkinson's disease was 1.28-fold higher in the glaucoma group than in the non-glaucoma group (7.73 vs. 6.02 per 1000 person-years; 95% confidence interval 1.18, 1.40). After controlling for potential confounding factors, the adjusted hazard ratio of Parkinson's disease was 1.23 for the glaucoma group (95% confidence interval 1.05, 1.46), compared with the non-glaucoma group. Glaucoma in older people was associated with a small but statistically significant increase in the risk of Parkinson's disease. Whether glaucoma may be a non-motor feature of Parkinson's disease in older people requires further research to confirm.
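The crude comparison reported above (events per 1,000 person-years in each group and their ratio) can be computed as below. The event and person-year counts in the example are placeholders, not the study's figures.

```python
# Crude incidence per 1,000 person-years and the crude rate ratio.
def incidence_per_1000_py(events: int, person_years: float) -> float:
    return 1000.0 * events / person_years

glaucoma_rate = incidence_per_1000_py(events=130, person_years=16_800)   # placeholder
control_rate = incidence_per_1000_py(events=400, person_years=66_400)    # placeholder
print(glaucoma_rate, control_rate, glaucoma_rate / control_rate)          # crude rate ratio
```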
Ma, Lijun; Langefeld, Carl D; Comeau, Mary E; Bonomo, Jason A; Rocco, Michael V; Burkart, John M; Divers, Jasmin; Palmer, Nicholette D; Hicks, Pamela J; Bowden, Donald W; Lea, Janice P; Krisher, Jenna O; Clay, Margo J; Freedman, Barry I
2016-08-01
Relative to European Americans, evidence supports that African Americans with end-stage renal disease (ESRD) survive longer on dialysis. Renal-risk variants in the apolipoprotein L1 gene (APOL1), associated with nondiabetic nephropathy and less subclinical atherosclerosis, may contribute to dialysis outcomes. Here, APOL1 renal-risk variants were assessed for association with dialytic survival in 450 diabetic and 275 nondiabetic African American hemodialysis patients from Wake Forest and Emory School of Medicine outpatient facilities. Outcomes were provided by the ESRD Network 6-Southeastern Kidney Council Standardized Information Management System. Dates of death, receipt of a kidney transplant, and loss to follow-up were recorded. Outcomes were censored at the date of transplantation or through 1 July 2015. Multivariable Cox proportional hazards models were computed separately in patients with nondiabetic and diabetic ESRD, adjusting for the covariates age, gender, comorbidities, ancestry, and presence of an arteriovenous fistula or graft at dialysis initiation. In nondiabetic ESRD, patients with 2 (vs. 0/1) APOL1 renal-risk variants had significantly longer dialysis survival (hazard ratio 0.57), a pattern not observed in patients with diabetes-associated ESRD (hazard ratio 1.29). Thus, 2 APOL1 renal-risk variants are associated with longer dialysis survival in African Americans without diabetes, potentially relating to presence of renal-limited disease or less atherosclerosis. Copyright © 2016 International Society of Nephrology. All rights reserved.
Argos, Maria; Kalra, Tara; Pierce, Brandon L.; Chen, Yu; Parvez, Faruque; Islam, Tariqul; Ahmed, Alauddin; Hasan, Rabiul; Hasan, Khaled; Sarwar, Golam; Levy, Diane; Slavkovich, Vesna; Graziano, Joseph H.; Rathouz, Paul J.; Ahsan, Habibul
2011-01-01
Elevated concentrations of arsenic in groundwater pose a public health threat to millions of people worldwide. The authors aimed to evaluate the association between arsenic exposure and skin lesion incidence among participants in the Health Effects of Arsenic Longitudinal Study (HEALS). The analyses used data on 10,182 adults free of skin lesions at baseline through the third biennial follow-up of the cohort (2000–2009). Discrete-time hazard regression models were used to estimate hazard ratios and 95% confidence intervals for incident skin lesions. Multivariate-adjusted hazard ratios for incident skin lesions comparing 10.1–50.0, 50.1–100.0, 100.1–200.0, and ≥200.1 μg/L with ≤10.0 μg/L of well water arsenic exposure were 1.17 (95% confidence interval (CI): 0.92, 1.49), 1.69 (95% CI: 1.33, 2.14), 1.97 (95% CI: 1.58, 2.46), and 2.98 (95% CI: 2.40, 3.71), respectively (Ptrend = 0.0001). Results were similar for the other measures of arsenic exposure, and the increased risks remained unchanged with changes in exposure in recent years. Dose-dependent associations were more pronounced in females, but the incidence of skin lesions was greater in males and older individuals. Chronic arsenic exposure from drinking water was associated with increased incidence of skin lesions, even at low levels of arsenic exposure (<100 μg/L). PMID:21576319
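A discrete-time hazard model of the kind described above is usually fitted by expanding the cohort to one row per participant per follow-up interval and modeling the event indicator with logistic regression. The sketch below is illustrative; the file and column names (id, visit, skin_lesion, arsenic_cat, age, sex) are hypothetical.

```python
# Person-period (discrete-time) hazard regression via logistic regression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

pp = pd.read_csv("heals_person_period.csv")  # one row per participant-visit

# skin_lesion = 1 only in the interval where a lesion is first detected;
# a participant contributes no further rows after the event or censoring.
model = smf.logit(
    "skin_lesion ~ C(arsenic_cat, Treatment(reference='low')) + C(visit) + age + C(sex)",
    data=pp,
).fit(disp=0)

print(np.exp(model.params))  # odds ratios approximate the discrete-time hazard ratios
```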
An epigenetic signature of adhesion molecules predicts poor prognosis of ovarian cancer patients
Chang, Ping-Ying; Liao, Yu-Ping; Wang, Hui-Chen; Chen, Yu-Chih; Huang, Rui-Lan; Wang, Yu-Chi; Yuan, Chiou-Chung; Lai, Hung-Cheng
2017-01-01
DNA methylation is a promising biomarker for cancer. The epigenetic effects of cell adhesion molecules may affect the therapeutic outcome, and the present study examined their effects on survival in ovarian cancer. We integrated methylomics and genomics datasets in The Cancer Genome Atlas (n = 391) and identified 106 highly methylated adhesion-related genes in ovarian cancer tissues. Univariate analysis revealed that the methylation status of eight genes was related to progression-free survival. In multivariate Cox regression analysis, four highly methylated genes (CD97, CTNNA1, DLC1, HAPLN2) and three genes (LAMA4, LPP, MFAP4) with low methylation were significantly associated with poor progression-free survival. Low methylation of VTN was an independent poor prognostic factor for overall survival after adjustment for age and stage. Carrying any two of the CTNNA1, DLC1, or MFAP4 markers was significantly associated with poor progression-free survival (hazard ratio: 1.59; 95% confidence interval: 1.23, 2.05). This prognostic methylation signature was validated in a methylomics dataset generated in our lab (n = 37, hazard ratio: 16.64; 95% confidence interval: 2.68, 103.14) and in another from the Australian Ovarian Cancer Study (n = 91, hazard ratio: 2.43; 95% confidence interval: 1.11, 5.36). The epigenetics of cell adhesion molecules is related to ovarian cancer prognosis. A more comprehensive methylomics of cell adhesion molecules is needed and may advance personalized treatment with adhesion molecule-related drugs. PMID:28881822
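The two-stage strategy described above (univariate screening of each methylation marker followed by a multivariate Cox model) can be sketched as follows. This is a schematic under stated assumptions: the file and column names are hypothetical, methylation status is assumed coded 0/1, and the screening threshold is illustrative.

```python
# Univariate Cox screening of markers, then a multivariate model on survivors
# of the screen plus clinical covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("tcga_ov_methylation.csv")    # hypothetical analysis file
markers = ["CD97", "CTNNA1", "DLC1", "HAPLN2", "LAMA4", "LPP", "MFAP4", "VTN"]

selected = []
for gene in markers:
    uni = CoxPHFitter().fit(df[[gene, "pfs_months", "progressed"]],
                            duration_col="pfs_months", event_col="progressed")
    if uni.summary.loc[gene, "p"] < 0.05:      # keep markers passing the screen
        selected.append(gene)

multi = CoxPHFitter().fit(df[selected + ["age", "stage", "pfs_months", "progressed"]],
                          duration_col="pfs_months", event_col="progressed")
multi.print_summary()
```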
Lin, Gen-Min; Colangelo, Laura A.; Lloyd-Jones, Donald M.; Redline, Susan; Yeboah, Joseph; Heckbert, Susan R.; Nazarian, Saman; Alonso, Alvaro; Bluemke, David A.; Punjabi, Naresh M.; Szklo, Moyses; Liu, Kiang
2015-01-01
The association between sleep apnea and atrial fibrillation (AF) has not been examined in a multiethnic adult population in prospective community-based studies. We prospectively (2000–2011) investigated the associations of physician-diagnosed sleep apnea (PDSA), which is considered more severe sleep apnea, and self-reported habitual snoring without PDSA (HS), a surrogate for mild sleep apnea, with incident AF in white, black, and Hispanic participants in the Multi-Ethnic Study of Atherosclerosis (MESA) who were free of clinical cardiovascular disease at baseline (2000–2002). Cox proportional hazards models were used to assess the associations, with adjustment for socioeconomic status, traditional vascular disease risk factors, race/ethnicity, body mass index, diabetes, chronic kidney disease, alcohol intake, and lipid-lowering therapy. Out of 4,395 respondents to a sleep questionnaire administered in MESA, 181 reported PDSA, 1,086 reported HS, and 3,128 reported neither HS nor PDSA (unaffected). Over an average 8.5-year follow-up period, 212 AF events were identified. As compared with unaffected participants, PDSA was associated with incident AF in the multivariable analysis, but HS was not (PDSA: hazard ratio = 1.76, 95% confidence interval: 1.03, 3.02; HS: hazard ratio = 1.02, 95% confidence interval: 0.72, 1.44). PDSA, a marker of more severe sleep apnea, was associated with higher risk of incident AF in this analysis of MESA data. PMID:25977516
Diet Quality and Colorectal Cancer Risk in the Women's Health Initiative Observational Study
Vargas, Ashley J.; Neuhouser, Marian L.; George, Stephanie M.; Thomson, Cynthia A.; Ho, Gloria Y. F.; Rohan, Thomas E.; Kato, Ikuko; Nassir, Rami; Hou, Lifang; Manson, JoAnn E.
2016-01-01
Abstract Diet quality index scores on Healthy Eating Index 2010 (HEI-2010), Alternative HEI-2010, alternative Mediterranean Diet Index, and the Dietary Approaches to Stop Hypertension (DASH) index have been inversely associated with all-cause and cancer-specific death. This study assessed the association between these scores and colorectal cancer (CRC) incidence as well as CRC-specific mortality in the Women's Health Initiative Observational Study (1993–2012), a US study of postmenopausal women. During an average of 12.4 years of follow-up, there were 938 cases of CRC and 238 CRC-specific deaths. We estimated multivariate hazard ratios and 95% confidence intervals for relationships between quintiles of diet scores (from baseline food frequency questionnaires) and outcomes. HEI-2010 score (hazard ratios were 0.81, 0.77, and 0.73 with P values of 0.04, 0.01, and <0.01 for quintiles 3–5 vs. quintile 1, respectively) and DASH score (hazard ratios were 0.72, 0.74, and 0.78 with P values of <0.01, <0.01, and 0.03 for quintiles 3–5 vs. quintile 1, respectively), but not other diet scores, were associated with a lower risk of CRC in adjusted models. No diet scores were significantly associated with CRC-specific mortality. Closer adherence to HEI-2010 and DASH dietary recommendations was inversely associated with risk of CRC in this large cohort of postmenopausal women. PMID:27267948
HIV drug therapy duration; a Swedish real world nationwide cohort study on InfCareHIV 2009-2014.
Häggblom, Amanda; Lindbäck, Stefan; Gisslén, Magnus; Flamholc, Leo; Hejdeman, Bo; Palmborg, Andreas; Leval, Amy; Herweijer, Eva; Valgardsson, Sverrir; Svedhem, Veronica
2017-01-01
As HIV infection requires lifelong treatment, studying drug therapy duration and factors influencing treatment durability is crucial. The Swedish database InfCareHIV includes high quality data from more than 99% of all patients diagnosed with HIV infection in Sweden and provides a unique opportunity to examine outcomes in a nationwide real world cohort. Adult patients who started a new therapy, defined as a new 3rd agent (all antiretrovirals that are not N[t]RTIs), during 2009-2014 were included if the agent had more than 100 observations among treatment-naive or treatment-experienced patients. Dolutegravir was excluded due to its short follow-up period. Multivariate Cox proportional hazards models were used to estimate hazard ratios for treatment discontinuation. Among treatment-naïve patients, 2541 patients started 2583 treatment episodes with a 3rd agent. Efavirenz was most commonly used (n = 1096), followed by darunavir (n = 504), atazanavir (n = 386), lopinavir (n = 292), rilpivirine (n = 156) and raltegravir (n = 149). In comparison with efavirenz, patients on rilpivirine were least likely to discontinue treatment (adjusted HR 0.33; 95% CI 0.20-0.54, p<0.001), while patients on lopinavir were most likely to discontinue treatment (adjusted HR 2.80; 95% CI 2.30-3.40, p<0.001). Raltegravir was also associated with early treatment discontinuation (adjusted HR 1.47; 95% CI 1.12-1.92, p = 0.005). The adjusted HRs for atazanavir and darunavir were not significantly different from efavirenz. Among treatment-experienced patients, 2991 patients started 4552 treatment episodes with a 3rd agent. Darunavir was most commonly used (n = 1285), followed by atazanavir (n = 806), efavirenz (n = 694), raltegravir (n = 622), rilpivirine (n = 592), lopinavir (n = 291) and etravirine (n = 262). Compared to darunavir, all other drugs except rilpivirine (HR 0.66; 95% CI 0.52-0.83, p<0.001) had a higher risk of discontinuation in the multivariate-adjusted analyses: atazanavir (HR 1.71; 95% CI 1.48-1.97, p<0.001), efavirenz (HR 1.86; 95% CI 1.59-2.17, p<0.001), raltegravir (HR 1.35; 95% CI 1.15-1.58, p<0.001), lopinavir (HR 3.58; 95% CI 3.02-4.25, p<0.001) and etravirine (HR 1.61; 95% CI 1.31-1.98, p<0.001). Besides the choice of 3rd agent, certain baseline patient characteristics were also independently associated with differences in treatment duration. In naive patients, the presence of an AIDS-defining diagnosis and the use of a backbone other than TDF/FTC or ABC/3TC increased the risk of early treatment discontinuation. In treatment-experienced patients, a detectable plasma viral load at the time of switch or being highly treatment experienced increased the risk of early treatment discontinuation. Treatment durability depends on several factors, including patient characteristics and ART guidelines. The choice of 3rd agent has a strong impact, and significant differences in treatment duration exist between drugs.
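A discontinuation analysis of this kind, with the 3rd agent entered as a categorical covariate and efavirenz as the reference level, might be sketched as below. All file, column, and category names are hypothetical; the data are assumed to contain one row per treatment episode.

```python
# Cox model for time to regimen discontinuation with a categorical drug term.
import pandas as pd
from lifelines import CoxPHFitter

episodes = pd.read_csv("infcarehiv_episodes.csv")   # one row per treatment episode

# Dummy-code the 3rd agent with efavirenz as the (omitted) reference category.
drug_dummies = pd.get_dummies(episodes["third_agent"], prefix="drug", dtype=float)
drug_dummies = drug_dummies.drop(columns=["drug_efavirenz"])

data = pd.concat(
    [episodes[["months_on_regimen", "discontinued", "age",
               "aids_diagnosis", "backbone_tdf_ftc"]],
     drug_dummies],
    axis=1,
)

cph = CoxPHFitter().fit(data, duration_col="months_on_regimen", event_col="discontinued")
cph.print_summary()   # exp(coef) for each drug dummy = adjusted HR vs efavirenz
```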
Clemens, Michael S; Stewart, Ian J; Sosnov, Jonathan A; Howard, Jeffrey T; Belenkiy, Slava M; Sine, Christy R; Henderson, Jonathan L; Buel, Allison R; Batchinsky, Andriy I; Cancio, Leopoldo C; Chung, Kevin K
2016-10-01
To evaluate the association between acute respiratory distress syndrome and acute kidney injury with respect to their contributions to mortality in critically ill patients. Retrospective analysis of consecutive adult burn patients requiring mechanical ventilation. A 16-bed burn ICU at a tertiary military teaching hospital. Adult patients more than 18 years old requiring mechanical ventilation during their initial admission to our burn ICU from January 1, 2003, to December 31, 2011. None. A total of 830 patients were included, of whom 48.2% had acute kidney injury (n = 400). These patients had a 73% increased risk of developing acute respiratory distress syndrome after controlling for age, gender, total body surface area burned, and inhalation injury (hazard ratio, 1.73; 95% CI, 1.18-2.54; p = 0.005). In a reciprocal multivariate analysis, acute respiratory distress syndrome (n = 299; 36%) demonstrated a strong trend toward developing acute kidney injury (hazard ratio, 1.39; 95% CI, 0.99-1.95; p = 0.05). Overall in-hospital mortality was 24% (n = 198). After adjusting for the aforementioned confounders, both acute kidney injury (hazard ratio, 3.73; 95% CI, 2.39-5.82; p < 0.001) and acute respiratory distress syndrome (hazard ratio, 2.16; 95% CI, 1.58-2.94; p < 0.001) significantly contributed to mortality. Age, total body surface area burned, and inhalation injury were also significantly associated with increased mortality. Acute kidney injury increases the risk of acute respiratory distress syndrome in mechanically ventilated burn patients, whereas acute respiratory distress syndrome similarly demonstrates a strong trend toward the development of acute kidney injury. Acute kidney injury and acute respiratory distress syndrome are both independent risk factors for subsequent death. Future research should examine this interplay for possible early interventions.
Mortality in former Olympic athletes: retrospective cohort analysis
Zwiers, R; Zantvoord, F W A; van Bodegom, D; van der Ouderaa, F J G; Westendorp, R G J
2012-01-01
Objective To assess the mortality risk in subsequent years (adjusted for year of birth, nationality, and sex) of former Olympic athletes from disciplines with different levels of exercise intensity. Design Retrospective cohort study. Setting Former Olympic athletes. Participants 9889 athletes (with a known age at death) who participated in the Olympic Games between 1896 and 1936, representing 43 types of disciplines with different levels of cardiovascular, static, and dynamic intensity exercise; high or low risk of bodily collision; and different levels of physical contact. Main outcome measure All cause mortality. Results Hazard ratios for mortality among athletes from disciplines with moderate cardiovascular intensity (1.01, 95% confidence interval 0.96 to 1.07) or high cardiovascular intensity (0.98, 0.92 to 1.04) were similar to those in athletes from disciplines with low cardiovascular intensity. The underlying static and dynamic components in exercise intensity showed similar non-significant results. Increased mortality was seen among athletes from disciplines with a high risk of bodily collision (hazard ratio 1.11, 1.06 to 1.15) and with high levels of physical contact (1.16, 1.11 to 1.22). In a multivariate analysis, the effect of high cardiovascular intensity remained similar (hazard ratio 1.05, 0.89 to 1.25); the increased mortality associated with high physical contact persisted (hazard ratio 1.13, 1.06 to 1.21), but that for bodily collision became non-significant (1.03, 0.98 to 1.09) as a consequence of its close relation with physical contact. Conclusions Among former Olympic athletes, engagement in disciplines with high intensity exercise did not bring a survival benefit compared with disciplines with low intensity exercise. Those who engaged in disciplines with high levels of physical contact had higher mortality than other Olympians later in life. PMID:23241269
Seth, Arjun; Mossavar-Rahmani, Yasmin; Kamensky, Victor; Silver, Brian; Lakshminarayan, Kamakshi; Prentice, Ross; Van Horn, Linda; Wassertheil-Smoller, Sylvia
2014-10-01
Dietary potassium has been associated with lower risk of stroke, but there are little data on dietary potassium effects on different stroke subtypes or in older women with hypertension and nonhypertension. The study population consisted of 90 137 postmenopausal women aged 50 to 79 at enrollment, free of stroke history at baseline, followed up prospectively for an average of 11 years. Outcome variables were total, ischemic, and hemorrhagic stroke, and all-cause mortality. Incidence was compared across quartiles of dietary potassium intake, and hazard ratios were obtained from Cox proportional hazards models after adjusting for potential confounding variables, and in women with hypertension and nonhypertension separately. Mean dietary potassium intake was 2611 mg/d. Highest quartile of potassium intake was associated with lower incidence of ischemic and hemorrhagic stroke and total mortality. Multivariate analyses comparing highest to lowest quartile of potassium intake indicated a hazard ratio of 0.90 (95% confidence interval, 0.85-0.95) for all-cause mortality, 0.88 (95% confidence interval, 0.79-0.98) for all stroke, and 0.84 (95% confidence interval, 0.74-0.96) for ischemic stroke. The effect on ischemic stroke was more apparent in women with nonhypertension among whom there was a 27% lower risk with hazard ratio of 0.73 (95% confidence interval, 0.60-0.88), interaction P<0.10. There was no association with hemorrhagic stroke. High potassium intake is associated with a lower risk of all stroke and ischemic stroke, as well as all-cause mortality in older women, particularly those who are not hypertensive. © 2014 American Heart Association, Inc.
Welfare state regimes and gender inequalities in the exposure to work-related psychosocial hazards.
Campos-Serna, Javier; Ronda-Pérez, Elena; Moen, Bente E; Artazcoz, Lucia; Benavides, Fernando G
2013-01-01
Gender inequalities in the exposure to work-related psychosocial hazards are well established. However, little is known about how welfare state regimes influence these inequalities. To examine the relationship between welfare state regimes and gender inequalities in the exposure to work-related psychosocial hazards in Europe, considering occupational social class. We used a sample of 27,465 workers from 28 European countries. Dependent variables were high strain, iso-strain, and effort-reward imbalance, and the independent variable was gender. We calculated the prevalence and prevalence ratio separately for each welfare state regime and occupational social class, using multivariate logistic regression models. More female than male managers/professionals were exposed to high strain, iso-strain, and effort-reward imbalance in Scandinavian [adjusted prevalence ratio (aPR) = 2·26; 95% confidence interval (95% CI): 1·87-2·75; 2·12: 1·72-2·61; 1·41: 1·15-1·74; respectively] and Continental regimes (1·43: 1·23-1·54; 1·51: 1·23-1·84; 1·40: 1·17-1·67); and to high strain and iso-strain in Anglo-Saxon (1·92: 1·40-2·63; 1·85: 1·30-2·64; respectively), Southern (1·43: 1·14-1·79; 1·60: 1·18-2·18), and Eastern regimes (1·56: 1·35-1·81; 1·53: 1·28-1·83). Gender inequalities in the exposure to work-related psychosocial hazards were not lower in those welfare state regimes with higher levels of universal social protection policies.
Falkstedt, Daniel; Wolff, Valerie; Allebeck, Peter; Hemmingsson, Tomas; Danielsson, Anna-Karin
2017-02-01
Current knowledge on cannabis use in relation to stroke is based almost exclusively on clinical reports. By using a population-based cohort, we aimed to find out whether there was an association between cannabis use and early-onset stroke, when accounting for the use of tobacco and alcohol. The cohort comprised 49 321 Swedish men, born between 1949 and 1951, who were conscripted into compulsory military service between the ages of 18 and 20. All men answered 2 detailed questionnaires at conscription and were subject to examinations of physical aptitude, psychological functioning, and medical status. Information on stroke events up to ≈60 years of age was obtained from national databases; this includes strokes experienced before 45 years of age. No associations between cannabis use in young adulthood and strokes experienced ≤45 years of age or beyond were found in multivariable models: cannabis use >50 times, hazard ratios=0.93 (95% confidence interval [CI], 0.34-2.57) and 0.95 (95% CI, 0.59-1.53). Although an almost doubled risk of ischemic stroke was observed in those with cannabis use >50 times, this risk was attenuated when adjusted for tobacco usage: hazard ratio=1.47 (95% CI, 0.83-2.56). Smoking ≥20 cigarettes per day was clearly associated both with strokes before 45 years of age, hazard ratio=5.04 (95% CI, 2.80-9.06), and with strokes throughout the follow-up, hazard ratio=2.15 (95% CI, 1.61-2.88). We found no evident association between cannabis use in young adulthood and stroke, including strokes before 45 years of age. Tobacco smoking, however, showed a clear, dose-response-shaped association with stroke. © 2016 American Heart Association, Inc.
Palatini, Paolo; Reboldi, Gianpaolo; Beilin, Lawrence J; Eguchi, Kazuo; Imai, Yutaka; Kario, Kazuomi; Ohkubo, Takayoshi; Pierdomenico, Sante D; Saladini, Francesca; Schwartz, Joseph E; Wing, Lindon; Verdecchia, Paolo
2013-09-30
Data from prospective cohort studies regarding the association between ambulatory heart rate (HR) and cardiovascular events (CVE) are conflicting. To investigate whether ambulatory HR predicts CVE in hypertension, we performed 24-hour ambulatory blood pressure and HR monitoring in 7600 hypertensive patients aged 52 ± 16 years from Italy, U.S.A., Japan, and Australia, included in the 'ABP-International' registry. All were untreated at baseline examination. Standardized hazard ratios for ambulatory HRs were computed, stratifying for cohort, and adjusting for age, gender, blood pressure, smoking, diabetes, serum total cholesterol and serum creatinine. During a median follow-up of 5.0 years, there were 639 fatal and nonfatal CVE. In a multivariable Cox model, night-time HR predicted fatal combined with nonfatal CVE more closely than 24h HR (p=0.007 and p=0.03, respectively). Daytime HR and the night:day HR ratio were not associated with CVE (p=0.07 and p=0.18, respectively). The hazard ratio of the fatal combined with nonfatal CVE for a 10-beats/min increment of the night-time HR was 1.13 (95% CI, 1.04-1.22). This relationship remained significant when subjects taking beta-blockers during the follow-up (hazard ratio, 1.15; 95% CI, 1.05-1.25) or subjects who had an event within 5 years after enrollment (hazard ratio, 1.23; 95% CI, 1.05-1.45) were excluded from analysis. At variance with previous data obtained from general populations, ambulatory HR added to the risk stratification for fatal combined with nonfatal CVE in the hypertensive patients from the ABP-International study. Night-time HR was a better predictor of CVE than daytime HR. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
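A hazard ratio expressed "per 10-beats/min increment" is typically obtained by rescaling the continuous predictor before fitting so that one model unit equals 10 bpm; stratifying the baseline hazard by cohort mirrors the "stratifying for cohort" step described above. The sketch below is illustrative only, with hypothetical file and column names.

```python
# Cox model with a rescaled continuous predictor and cohort-stratified baseline hazard.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("abp_international.csv")
df["night_hr_per10"] = df["night_hr_bpm"] / 10.0   # 1 model unit = 10 beats/min

cph = CoxPHFitter()
cph.fit(
    df[["night_hr_per10", "age", "male", "sbp_24h", "smoker", "diabetes",
        "total_chol", "creatinine", "cohort", "followup_years", "cv_event"]],
    duration_col="followup_years",
    event_col="cv_event",
    strata=["cohort"],                 # separate baseline hazard per cohort
)
print(cph.hazard_ratios_["night_hr_per10"])  # HR per 10-bpm higher night-time HR
```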
Social networks and mortality based on the Komo-Ise cohort study in Japan.
Iwasaki, Motoki; Otani, Tetsuya; Sunaga, Rumiko; Miyazaki, Hiroko; Xiao, Liu; Wang, Naren; Yosiaki, Sasazawa; Suzuki, Shosuke
2002-12-01
No prospective studies have examined the association between social networks and all-cause and cause-specific mortality among middle-aged Japanese. The study of varied populations may contribute to clarifying the robustness of the observed effects of social networks and extend their generalizability. To clarify the association between social networks and mortality among middle-aged and elderly Japanese, a community-based prospective study, the Komo-Ise Study, was conducted in two areas of Gunma Prefecture, Japan. A total of 11 565 subjects aged 40-69 years at baseline in 1993 completed a self-administered questionnaire. During the 7-year follow-up period, 335 men and 155 women died and the relative risk (RR) of each social network item was estimated by the Cox proportional hazard model. Single women had significantly increased risks of all-cause (multivariate RR = 2.2), and all circulatory system disease (age-area adjusted RR = 2.6) mortality. Men who did not participate in hobbies, club activities, or community groups had significantly higher multivariate RR for all-cause (RR = 1.5), all circulatory system disease (RR = 1.6) and non-cancer and non-circulatory system disease (RR = 2.3) mortality. Urban women who rarely or never met close relatives had significantly elevated risks of all-cause (RR = 2.4), all cancer (RR = 2.6), and non-cancer and non-circulatory system disease (RR = 2.7) mortality after adjustment for established risk factors. This study provides evidence that social networks are an important predictor of mortality risk for middle-aged and elderly Japanese men and women. Lack of participation, for men, and being single and lack of meeting close relatives, for women, were independent risk factors for mortality.
Maan, Abhishek; Jorgensen, Neal W; Mansour, Moussa; Dudley, Samuel; Jenny, Nancy S; Defilippi, Christopher; Szklo, Moyses; Alonso, Alvaro; Refaat, Marwan M; Ruskin, Jeremy; Heckbert, Susan R; Heist, E Kevin
2016-12-01
During atrial fibrillation (AF), a high rate of myocyte activation causes cellular stress and initiates the process of atrial remodeling, which further promotes persistence of AF. Although heat shock proteins (HSPs) have been shown to prevent atrial remodeling and suppress the occurrence of AF in cellular and animal experimental models, increased levels of HSP-60 have been observed in patients with postoperative AF, likely reflecting a response to cellular stress. To better understand the role of HSP-60 in relation to AF, we examined the association of HSP-60 levels with the future development of AF in the Multi-Ethnic Study of Atherosclerosis (MESA). MESA is a cohort study that recruited 6,814 participants aged 45-84 years and free of known cardiovascular disease at baseline (2000-2002) from six field centers. We investigated 983 participants, selected at random from the total cohort, who had HSP-60 measured and were free of AF at baseline. We tested the association of HSP-60 levels with the incidence of AF using multivariate Cox models after adjustment for demographics, clinical characteristics, and biomarkers. During an average of 10.6 years of follow-up, 77 participants developed AF. We did not observe a significant association between the log-transformed HSP-60 levels and development of AF on either unadjusted or multivariate analysis (adjusted hazard ratio: 1.02 per unit difference in ln[HSP-60, ng/mL]; 95% confidence interval: 0.77-1.34). Contrary to the findings from the preclinical studies, which demonstrated an important role of HSP-60 in the pathogenesis of AF, we did not observe a significant association between HSP-60 and occurrence of AF. © 2016 Wiley Periodicals, Inc.
Meat consumption, heterocyclic amines and colorectal cancer risk: the Multiethnic Cohort Study.
Ollberding, Nicholas J; Wilkens, Lynne R; Henderson, Brian E; Kolonel, Laurence N; Le Marchand, Loïc
2012-10-01
Greater consumption of red and processed meat has been associated with an increased risk of colorectal cancer in several recent meta-analyses. Heterocyclic amines (HCAs) have been hypothesized to underlie this association. In this prospective analysis conducted within the Multiethnic Cohort Study, we examined whether greater consumption of total, red or processed meat was associated with the risk of colorectal cancer among 165,717 participants who completed a detailed food frequency questionnaire at baseline. In addition, we examined whether greater estimated intake of HCAs was associated with the risk of colorectal cancer among 131,763 participants who completed a follow-up questionnaire that included a meat-cooking module. A total of 3,404 and 1,757 invasive colorectal cancers were identified from baseline to the end of follow-up and from the date of administration of the meat-cooking module to the end of follow-up, respectively. Proportional hazard models were used to estimate basic and multivariable-adjusted relative risks (RRs) and 95% confidence intervals for colorectal cancer associated with dietary exposures. In multivariable models, no association with the risk of colorectal cancer was detected for density-adjusted total meat (RR(Q5 vs. Q1) = 0.93 [0.83-1.05]), red meat (RR = 1.02 [0.91-1.16]) or processed meat intake (RR = 1.06 [0.94-1.19]) or for total (RR = 0.90 [0.76-1.05]) or specific HCA intake whether comparing quintiles of dietary exposure or using continuous variables. Although our results do not support a role for meat or for HCAs from meat in the etiology of colorectal cancer, we cannot rule out the possibility of a modest effect. Copyright © 2012 UICC.
Perlstein, Todd S; Weuve, Jennifer; Pfeffer, Marc A; Beckman, Joshua A
2011-01-01
Background The red cell distribution width (RDW), an automated measure of red blood cell size heterogeneity (e.g. anisocytosis) that is largely overlooked, is a newly recognized risk marker in patients with established cardiovascular disease (CVD). It is unknown whether RDW is associated with mortality in the general population, or whether this association is specific to CVD. Methods We examined the association of RDW with all-cause mortality, as well as cardiovascular, cancer, and chronic lower respiratory disease mortality among 15,852 adult participants in The Third National Health and Nutrition Examination Survey (1988–1994), a nationally representative sample of the United States population. Mortality status was obtained by matching to the National Death Index, with follow-up through December 31, 2000. Results Estimated mortality rates increased 5-fold from the lowest to highest quintile of RDW after accounting for age, and 2-fold after multivariable adjustment (each Ptrend < 0.001). A 1- standard deviation increment in RDW (0.98) was associated with a 23% greater risk of all-cause mortality (hazard ratio (HR) 1.23, 95% confidence interval (CI) 1.18–1.28) after multivariable adjustment. RDW was also associated with risk of death due to cardiovascular disease (HR 1.22, 95% CI 1.14–1.31), cancer (HR 1.28, 95% CI 1.21–1.36), and chronic lower respiratory disease (HR 1.32, 95% CI 1.17–1.49). Conclusions Higher RDW was associated with increased mortality risk in this large, community-based sample, an association not specific to CVD. Study of anisocytosis may therefore yield novel pathophysiological insights, and measurement of RDW may contribute to risk assessment. PMID:19307522
Hu, Gang; Tuomilehto, Jaakko; Borodulin, Katja; Jousilahti, Pekka
2007-02-01
To determine joint associations of different kinds of physical activity and the Framingham risk score (FRS) with the 10-year risk of coronary heart disease (CHD) events. Study cohorts included 41 053 Finnish participants aged 25-64 years without a history of CHD or stroke. The multivariable-adjusted 10-year hazard ratios (HRs) of coronary events associated with low, moderate, and high occupational physical activity were 1.00, 0.66, and 0.74 (Ptrend<0.001) for men, and 1.00, 0.53, and 0.58 (Ptrend<0.001) for women, respectively. The multivariable-adjusted 10-year HRs of coronary events associated with low, moderate, and high leisure-time physical activity were 1.00, 0.97, and 0.66 (Ptrend=0.002) for men, and 1.00, 0.74, and 0.54 (Ptrend=0.003) for women, respectively. Active commuting had a significant inverse association with 10-year risk of coronary events in women only. The FRS predicted 10-year risk of coronary events among both men and women. The protective effects of occupational, commuting, or leisure-time physical activity were consistent in subjects with a very low (<6%), low (6-9%), intermediate (10-19%), or high (≥20%) risk based on the FRS. Moderate or high levels of occupational or leisure-time physical activity among both men and women, and daily walking or cycling to and from work among women, are associated with a reduced 10-year risk of CHD events. These favourable effects of physical activity on CHD risk are observed at all levels of CHD risk based on FRS assessment.
Biological and Behavioral Risks for Incident Chlamydia trachomatis Infection in a Prospective Cohort
Hwang, Loris Y.; Ma, Yifei; Moscicki, Anna-Barbara
2014-01-01
Objective To identify biological and behavioral risks for incident Chlamydia trachomatis among a prospective cohort of young women followed frequently. Methods Our cohort of 629 women from two outpatient sites was seen every 4 months (October 2000 through April 2012) for behavioral interviews and infection testing. C trachomatis was tested annually, and anytime patients reported symptoms or possible exposure using commercial nucleic acid amplification tests. Analyses excluded baseline prevalent C trachomatis infections. Risk factors for incident C trachomatis were assessed using Cox proportional hazards models. Significant risks (p<0.10) from bivariate models were entered in a multivariate model, adjusted for four covariates chosen a priori (age, race or ethnicity, condom use, study site). Backwards step-wise elimination produced a final parsimonious model retaining significant variables (p<0.05) and the four adjustment variables. Results The 629 women attended 9,594 total visits. Median follow-up time was 6.9 years (interquartile range 3.2-9.8), during which 97 (15%) women had incident C trachomatis . In the final multivariate model, incident C trachomatis was independently associated with HPV at the preceding visit (p<0.01), smoking (p=0.02), and weekly use of substances besides alcohol and marijuana (p<0.01) since prior visit. Among 207 women with available colpophotographs (1,742 visits), cervical ectopy was not a significant risk factor (p range=0.16-0.39 for ectopy as continuous and ordinal variables). Conclusion Novel risks for C trachomatis include preceding HPV, smoking, and substance use, which may reflect both biological and behavioral mechanisms of risk, such as immune modulation, higher-risk sexual networks, or both. Improved understanding of the biological bases for C trachomatis risk would inform our strategies for C trachomatis control. PMID:25437724
Dai, Jun; Krasnow, Ruth E.; Reed, Terry
2018-01-01
It is unknown whether influences of midlife whole diet on the long-term CHD mortality risk are independent of genetic and common environmental factors or familial predisposition. We addressed this question prospectively using data from the National Heart, Lung, and Blood Institute Twin Study. We included 910 male twins who were middle-aged and had usual diet assessed with nutritionist-administered, crosschecked dietary history interview at baseline (1969–1973). Moderation-quantified healthy diet (MQHD), a dietary pattern, was created to evaluate a whole diet. Primary outcome was time-to-CHD death. Hazard ratios (HR) were estimated using frailty survival model. Known CHD risk factors were controlled. During the follow-up of 40 years through 31 December 2009, 113 CHD deaths, 198 total cardiovascular deaths and 610 all-cause deaths occurred. In the entire cohort, the multivariable-adjusted HR for the overall association (equivalent to a general population association) was 0·76 (95 % CI 0·66, 0·88) per 10-unit increment in the MQHD score for CHD, and the multivariable-adjusted HR for a twin with a MQHD score ten units higher than his co-twin brother was 0·79 (95 % CI 0·64, 0·96, P = 0·02) for CHD independent of familial predisposition. Similar results were found for a slightly more food-specified alternative moderation-quantified healthy diet (aMQHD). The between-pair association (reflecting familial influence) was significant for CHD for both MQHD and aMQHD. It is concluded that associations of MQHD and aMQHD with a lower long-term CHD mortality risk are both nutritionally and familially affected, supporting their use for dietary planning to prevent CHD mortality. PMID:27188259
The relationship of age-adjusted Charlson comorbidity index and diurnal variation of blood pressure.
Kalaycı, Belma; Erten, Yunus Turgay; Akgün, Tunahan; Karabag, Turgut; Kokturk, Furuzan
2018-03-05
The Charlson Comorbidity Index (CCI) is a scoring system used to predict prognosis and mortality. It exhibits better utility when combined with age, as the age-adjusted Charlson Comorbidity Index (ACCI). The aim of this study was to evaluate the relationship between ACCI and diurnal variation of blood pressure parameters in hypertensive patients and normotensive patients. We enrolled 236 patients. All patients underwent 24-h ambulatory blood pressure monitoring (ABPM) for evaluation of dipper or non-dipper pattern. We examined the correlation between ACCI and dipper or non-dipper pattern and other ABPM parameters. To further investigate the role of these parameters in predicting survival, a multivariate analysis using the Cox proportional hazard model was performed. Of all study patients, 167 were in the hypertensive group (87 with non-dipper status) and 69 were in the normotensive group (41 with non-dipper status). We found a significant negative correlation between ACCI and 24-h diastolic blood pressure (DBP), awake DBP, awake mean blood pressure (MBP), 24-h MBP, and awake systolic blood pressure (SBP). The night decrease ratio of blood pressure also had a negative correlation with ACCI (p = 0.003, r = -0.233). We also found a relationship between the non-dipper pattern and ACCI in the hypertensive patients (p = 0.050). In multivariate Cox analysis, sleep MBP, like ACCI, was related to mortality (p = 0.023, HR = 1.086, 95% CI 1.012-1.165). In conclusion, ACCI was statistically significantly higher in non-dipper hypertensive patients than in dipper hypertensive patients, and ACCI had a negative correlation with blood pressure. Sleep MBP may predict mortality.
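The dipper versus non-dipper classification used above is conventionally based on the nocturnal fall in systolic pressure relative to the awake level, usually with a 10% cut-off; the study's exact definition is not restated here, so the sketch below is illustrative rather than a reproduction of the authors' method.

```python
# Night decrease ratio and a conventional 10% dipping threshold.
def night_decrease_ratio(awake_sbp: float, sleep_sbp: float) -> float:
    return (awake_sbp - sleep_sbp) / awake_sbp

def is_dipper(awake_sbp: float, sleep_sbp: float, threshold: float = 0.10) -> bool:
    return night_decrease_ratio(awake_sbp, sleep_sbp) >= threshold

print(is_dipper(awake_sbp=138, sleep_sbp=119))  # True  -> dipper (fall ~13.8%)
print(is_dipper(awake_sbp=142, sleep_sbp=136))  # False -> non-dipper (fall ~4.2%)
```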
Matata, Bashir M; Shaw, Matthew; Grayson, Antony D; McShane, James; Lucy, John; Fisher, Michael; Jackson, Mark
2016-02-01
There is strong evidence to suggest that social deprivation is linked to health inequalities. In the UK, concerns have been raised regarding disparities in the outcomes of acute cardiac services within the National Health Service (NHS). This study explored whether differences exist in (a) elective hospital presentation time (b) indicators of severity and disease burden and (c) treatment outcomes (hospital stay and mortality) on the basis of the index of multiple deprivation (IMD) status. This study was a retrospective analysis of data from NHS databases for 13,758 patients that had undergone cardiac revascularisation interventions at the Liverpool Heart and Chest Hospital between April 2007-March 2012. The data was analysed by descriptive, univariate and multivariate statistics to explore the association between the IMD quintiles (Q1-Q5) and revascularisation type, elective presentation time, hospital length of stay and mortality. Univariate analysis indicated that there were significant differences between patients from the most deprived areas (Q5) compared with patients from the least deprived areas (Q1), these included admission volumes, time before presentation to hospital and proportion of non-elective cases. After risk-adjustments, percutaneous coronary intervention patients from Q5 compared with Q1 had significantly greater length of hospital stay and risk of in-hospital major acute cardiovascular events. After multivariate adjustment for baseline risk factors, patients from Q5 were associated with significantly worse five-year survival as compared with Q1 (hazard ratio (HR) 1.52, 95% confidence interval (CI): 1.36-1.71; p < 0.001). In conclusion, there is evidence to suggest that inequalities in cardiac revascularisation choices and outcomes in the UK may be associated with social deprivation. © The European Society of Cardiology 2015.
Menopause and postmenopausal hormone therapy and risk of hearing loss.
Curhan, Sharon G; Eliassen, A Heather; Eavey, Roland D; Wang, Molin; Lin, Brian M; Curhan, Gary C
2017-09-01
Menopause may be a risk factor for hearing loss, and postmenopausal hormone therapy (HT) has been proposed to slow hearing decline; however, there are no large prospective studies. We prospectively examined the independent relations between menopause and postmenopausal HT and risk of self-reported hearing loss. Prospective cohort study among 80,972 women in the Nurses' Health Study II, baseline age 27 to 44 years, followed from 1991 to 2013. Baseline and updated information was obtained from detailed validated biennial questionnaires. Cox proportional-hazards regression models were used to examine independent associations between menopausal status and postmenopausal HT and risk of hearing loss. After 1,410,928 person-years of follow-up, 18,558 cases of hearing loss were reported. There was no significant overall association between menopausal status, natural or surgical, and risk of hearing loss. Older age at natural menopause was associated with higher risk. The multivariable-adjusted relative risk of hearing loss among women who underwent natural menopause at age 50+ years compared with those aged less than 50 years was 1.10 (95% confidence interval [CI] 1.03, 1.17). Among postmenopausal women, oral HT (estrogen therapy or estrogen plus progestogen therapy) was associated with higher risk of hearing loss, and longer duration of use was associated with higher risk (P trend < 0.001). Compared with women who never used HT, the multivariable-adjusted relative risk of hearing loss among women who used oral HT for 5 to 9.9 years was 1.15 (95% CI 1.06, 1.24) and for 10+ years was 1.21 (95% CI 1.07, 1.37). Older age at menopause and longer duration of postmenopausal HT are associated with higher risk of hearing loss.
Muraleedharan, Vakkat; Marsh, Hazel; Kapoor, Dheeraj; Channer, Kevin S; Jones, T Hugh
2013-12-01
Men with type 2 diabetes are known to have a high prevalence of testosterone deficiency. No long-term data are available regarding testosterone and mortality in men with type 2 diabetes or any effect of testosterone replacement therapy (TRT). We report a 6-year follow-up study to examine the effect of baseline testosterone and TRT on all-cause mortality in men with type 2 diabetes and low testosterone. A total of 581 men with type 2 diabetes who had testosterone levels measured between 2002 and 2005 were followed up for a mean period of 5.8±1.3 (S.D.) years. Mortality rates were compared between men with total testosterone >10.4 nmol/l (300 ng/dl; n=343) and those with testosterone ≤10.4 nmol/l (n=238). The effect of TRT (as per normal clinical practice: 85.9% testosterone gel and 14.1% intramuscular testosterone undecanoate) was assessed retrospectively within the low testosterone group. Mortality was increased in the low testosterone group (17.2%) compared with the normal testosterone group (9%; P=0.003) when controlled for covariates. In the Cox regression model, the multivariate-adjusted hazard ratio (HR) for decreased survival was 2.02 (P=0.009, 95% CI 1.2-3.4). TRT (mean duration 41.6±20.7 months; n=64) was associated with a reduced mortality of 8.4% compared with 19.2% (P=0.002) in the untreated group (n=174). The multivariate-adjusted HR for decreased survival in the untreated group was 2.3 (95% CI 1.3-3.9, P=0.004). Low testosterone levels predict an increase in all-cause mortality during long-term follow-up. Testosterone replacement may improve survival in hypogonadal men with type 2 diabetes.
Tsao, Connie W; Gona, Philimon N; Salton, Carol J; Chuang, Michael L; Levy, Daniel; Manning, Warren J; O’Donnell, Christopher J
2015-01-01
Background Elevated left ventricular mass index (LVMI) and concentric left ventricular (LV) remodeling are related to adverse cardiovascular disease (CVD) events. The predictive utility of LV concentric remodeling and LV mass in the prediction of CVD events is not well characterized. Methods and Results Framingham Heart Study Offspring Cohort members without prevalent CVD (n=1715, 50% men, aged 65±9 years) underwent cardiovascular magnetic resonance for LVMI and geometry (2002–2006) and were prospectively followed for incident CVD (myocardial infarction, coronary insufficiency, heart failure, stroke) or CVD death. Over 13 808 person-years of follow-up (median 8.4, range 0.0 to 10.5 years), 85 CVD events occurred. In multivariable-adjusted proportional hazards regression models, each 10-g/m2 increment in LVMI and each 0.1 unit in relative wall thickness was associated with 33% and 59% increased risk for CVD, respectively (P=0.004 and P=0.009, respectively). The association between LV mass/LV end-diastolic volume and incident CVD was borderline significant (P=0.053). Multivariable-adjusted risk reclassification models showed a modest improvement in CVD risk prediction with the incorporation of cardiovascular magnetic resonance LVMI and measures of LV concentricity (C-statistic 0.71 [95% CI 0.65 to 0.78] for the model with traditional risk factors only, improved to 0.74 [95% CI 0.68 to 0.80] for the risk factor model additionally including LVMI and relative wall thickness). Conclusions Among adults free of prevalent CVD in the community, greater LVMI and LV concentric hypertrophy are associated with a marked increase in adverse incident CVD events. The potential benefit of aggressive primary prevention to modify LV mass and geometry in these adults requires further investigation. PMID:26374295
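The discrimination comparison reported above (a C-statistic for a risk-factor-only model versus one adding LVMI and relative wall thickness) can be sketched with Harrell's C-index as below. File and column names are hypothetical, and the in-sample C-index shown is only a stand-in for the study's formal reclassification analysis.

```python
# Harrell's C-index for nested Cox models, with and without CMR measures.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("fhs_offspring_cmr.csv")        # hypothetical analysis file
base_covs = ["age", "male", "sbp", "total_chol", "hdl", "diabetes", "smoker"]
cmr_covs = base_covs + ["lvmi", "relative_wall_thickness"]

def fitted_c_index(covs):
    cph = CoxPHFitter().fit(df[covs + ["followup_years", "cvd_event"]],
                            duration_col="followup_years", event_col="cvd_event")
    return cph.concordance_index_                # Harrell's C on the fitted data

print(fitted_c_index(base_covs), fitted_c_index(cmr_covs))
```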
Kieneker, Lyanne M; Gansevoort, Ron T; Mukamal, Kenneth J; de Boer, Rudolf A; Navis, Gerjan; Bakker, Stephan J L; Joosten, Michel M
2014-10-01
Previous prospective cohort studies on the association between potassium intake and risk of hypertension have almost exclusively relied on self-reported dietary data, whereas repeated 24-hour urine excretions, as estimate of dietary uptake, may provide a more objective and quantitative estimate of this association. Risk of hypertension (defined as blood pressure ≥140/90 mm Hg or initiation of blood pressure-lowering drugs) was prospectively studied in 5511 normotensive subjects aged 28 to 75 years not using blood pressure-lowering drugs at baseline of the Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. Potassium excretion was measured in two 24-hour urine specimens at baseline (1997-1998) and midway during follow-up (2001-2003). Baseline median potassium excretion was 70 mmol/24 h (interquartile range, 57-85 mmol/24 h), which corresponds to a dietary potassium intake of ≈91 mmol/24 h. During a median follow-up of 7.6 years (interquartile range, 5.0-9.3 years), 1172 subjects developed hypertension. The lowest sex-specific tertile of potassium excretion (men: <68 mmol/24 h; women: <58 mmol/24 h) had an increased risk of hypertension after multivariable adjustment (hazard ratio, 1.20; 95% confidence interval, 1.05-1.37), compared with the upper 2 tertiles (Pnonlinearity=0.008). The proportion of hypertension attributable to low potassium excretion was 6.2% (95% confidence interval, 1.7%-10.9%). No association was found between the sodium to potassium excretion ratio and risk of hypertension after multivariable adjustment. Low urinary potassium excretion was associated with an increased risk of developing hypertension. Dietary strategies to increase potassium intake to the recommended level of 90 mmol/d may have the potential to reduce the incidence of hypertension. © 2014 American Heart Association, Inc.
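The quoted attributable proportion is roughly reproduced by Levin's population attributable fraction under simple assumptions: an exposure prevalence of one third (the lowest tertile) and an adjusted hazard ratio of 1.20. The authors' exact calculation (and its confidence interval) is not shown here, so the snippet below is only an illustration of the formula.

```python
# Levin's population attributable fraction: PAF = p(RR-1) / (1 + p(RR-1)).
def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    return prevalence * (relative_risk - 1) / (1 + prevalence * (relative_risk - 1))

print(attributable_fraction(prevalence=1 / 3, relative_risk=1.20))  # ~0.062, i.e. ~6.2%
```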
Nagayoshi, Mako; Lutsey, Pamela L.; Benkeser, David; Wassel, Christina L.; Folsom, Aaron R.; Shahar, Eyal; Iso, Hiroyasu; Allison, Matthew A.; Criqui, Michael H.; Redline, Susan
2016-01-01
Background and aims Numerous biological pathways linking sleep disturbances to atherosclerosis have been identified, such as insulin resistance, inflammation, hypertension, and endothelial dysfunction. Yet, the association of sleep apnea and sleep duration with peripheral artery disease (PAD) is not well characterized. Methods We evaluated the cross-sectional association between objectively measured sleep and prevalent PAD in 1,844 participants (mean age 68 years) who in 2010–2013 had in-home polysomnography, 7-day wrist actigraphy and ankle-brachial index (ABI) measurements. We also evaluated the relation between self-reported diagnosed sleep apnea and PAD incidence in 5,365 participants followed from 2000 to 2012. PAD was defined as ABI<0.90. Results In cross-sectional analyses, severe sleep apnea [apnea-hypopnea index (AHI) ≥30 vs. AHI <5] was associated with greater prevalent PAD only among black participants [multivariate adjusted prevalence ratio (95% CI): 2.29 (1.07–4.89); p-interaction = 0.05]. In the full sample, short and long sleep durations were also associated with a 2-fold higher prevalence of PAD compared with sleeping 7 h/night. In longitudinal analyses, participants with self-reported diagnosed sleep apnea were at higher risk of incident PAD [multivariable adjusted hazard ratio (95% CI): 1.93 (1.05–3.53)], with no evidence of interaction by race/ethnicity. Conclusions These findings support a significant association between sleep apnea and prevalent and incident PAD, with evidence for stronger associations between objectively measured sleep apnea and cross-sectional PAD in blacks. In addition, short and long sleep durations were associated with PAD. These results identify sleep disturbances as a potential risk factor for PAD. PMID:27423537
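Adjusted prevalence ratios of the kind quoted above are commonly estimated with a modified Poisson (log-link) model with robust variance and an interaction term; the sketch below shows the general form in Python using statsmodels on simulated placeholder data, and is not the study's model or dataset.

```python
# Sketch of a modified Poisson model (log link, robust "sandwich" errors) for prevalence
# ratios of PAD by severe sleep apnea, with an apnea-by-race interaction. Simulated data;
# column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1844
df = pd.DataFrame({
    "severe_osa": rng.integers(0, 2, n),   # AHI >= 30 vs AHI < 5, collapsed to 0/1 here
    "black": rng.integers(0, 2, n),
    "age": rng.normal(68, 9, n),
})
logit_p = -2.5 + 0.3 * df["severe_osa"] + 0.5 * df["severe_osa"] * df["black"]
df["pad"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.glm("pad ~ severe_osa * black + age", data=df,
                family=sm.families.Poisson())
fit = model.fit(cov_type="HC1")            # robust standard errors
print(np.exp(fit.params))                  # prevalence ratios
print(np.exp(fit.conf_int()))              # 95% CIs on the ratio scale
```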
Kwon, H-M; Moon, Y-J; Jung, K-W; Jun, I-G; Song, J-G; Hwang, G-S
2018-05-01
The connection between renal dysfunction and cardiovascular dysfunction has been shown consistently. In patients with liver cirrhosis, renal dysfunction correlates closely with prognosis after liver transplantation (LT); therefore, precise renal assessment is mandatory. Cystatin C, a sensitive biomarker of renal function, has shown superiority over the classical biomarker creatinine in detecting mild renal dysfunction. In this study, we aimed to compare cystatin C and creatinine for predicting 30-day major cardiovascular events (MACE) and all-cause mortality in LT recipients with normal serum creatinine levels. Between May 2010 and October 2015, 1181 LT recipients (mean Model for End-stage Liver Disease score 12.1) with pretransplantation creatinine level ≤1.4 mg/dL were divided into tertiles according to each renal biomarker. The 30-day MACE was a composite of troponin I >0.2 ng/mL, arrhythmia, congestive heart failure, death, and cerebrovascular events. The highest tertile of cystatin C (≥0.95 mg/L) was associated with a higher risk of 30-day MACE (odds ratio: 1.62; 95% confidence interval: 1.07 to 2.48) and a higher risk of death (hazard ratio: 1.96; 95% confidence interval: 1.04 to 3.67) than the lowest tertile (<0.74 mg/L) after multivariate adjustment. In contrast, the highest tertile of creatinine was associated with neither a higher MACE rate nor a worse survival rate compared with the lowest tertile (both nonsignificant after multivariate adjustment). Pretransplantation cystatin C is therefore superior to creatinine for predicting MACE and all-cause mortality in LT recipients with normal creatinine and may support risk stratification that creatinine alone cannot provide. Copyright © 2018 Elsevier Inc. All rights reserved.
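For readers less familiar with the tertile comparison used here, the following minimal sketch (pandas plus statsmodels, simulated data) shows how a biomarker can be cut into tertiles and the top-versus-bottom adjusted odds ratio estimated; cut points, covariates, and event rates are placeholders, not the transplant cohort's values.

```python
# Minimal sketch, assuming simulated data: cystatin-C-like biomarker cut into tertiles,
# with a logistic model for 30-day MACE comparing T2 and T3 against the lowest tertile T1.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1181
df = pd.DataFrame({"cystatin_c": rng.lognormal(np.log(0.84), 0.2, n),
                   "age": rng.normal(54, 9, n)})
df["mace30"] = rng.binomial(1, 0.10 + 0.05 * (df["cystatin_c"] > 0.95))

df["tertile"] = pd.qcut(df["cystatin_c"], 3, labels=["T1", "T2", "T3"])

fit = smf.logit("mace30 ~ C(tertile, Treatment('T1')) + age", data=df).fit(disp=False)
print(np.exp(fit.params))       # odds ratios versus the lowest tertile
print(np.exp(fit.conf_int()))   # 95% confidence intervals
```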
Dai, Jun; Krasnow, Ruth E; Reed, Terry
2016-07-01
It is unknown whether influences of midlife whole diet on long-term CHD mortality risk are independent of genetic and common environmental factors (familial predisposition). We addressed this question prospectively using data from the National Heart, Lung, and Blood Institute Twin Study. We included 910 male twins who were middle-aged and had usual diet assessed with a nutritionist-administered, cross-checked dietary history interview at baseline (1969-1973). The moderation-quantified healthy diet (MQHD), a dietary pattern, was created to evaluate a whole diet. The primary outcome was time to CHD death. Hazard ratios (HR) were estimated using a frailty survival model. Known CHD risk factors were controlled. During the follow-up of 40 years through 31 December 2009, 113 CHD deaths, 198 total cardiovascular deaths and 610 all-cause deaths occurred. In the entire cohort, the multivariable-adjusted HR for the overall association (equivalent to a general population association) was 0.76 (95% CI 0.66, 0.88) per 10-unit increment in the MQHD score for CHD, and the multivariable-adjusted HR for a twin with a MQHD score ten units higher than his co-twin brother was 0.79 (95% CI 0.64, 0.96; P=0.02) for CHD, independent of familial predisposition. Similar results were found for a slightly more food-specified alternative moderation-quantified healthy diet (aMQHD). The between-pair association (reflecting familial influence) was significant for CHD for both MQHD and aMQHD. It is concluded that associations of MQHD and aMQHD with a lower long-term CHD mortality risk are both nutritionally and familially affected, supporting their use for dietary planning to prevent CHD mortality.
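The frailty survival model used for twin data is not reproduced here; as an approximation, co-twin designs are often analyzed by splitting the exposure into a pair mean (between-pair, familial component) and a within-pair deviation (co-twin contrast), with standard errors clustered on twin pair. The sketch below shows that decomposition with lifelines on simulated data; it is an illustration of the idea, not the study's analysis.

```python
# Hedged sketch of a between-/within-pair decomposition for a co-twin design, as an
# approximation to a frailty survival model. All data are simulated; "diet10" stands in
# for the MQHD score expressed per 10 units.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n_pairs = 455
df = pd.DataFrame({"pair": np.repeat(np.arange(n_pairs), 2),
                   "diet10": rng.normal(60, 10, 2 * n_pairs) / 10})
df["diet10_pair_mean"] = df.groupby("pair")["diet10"].transform("mean")  # between-pair
df["diet10_within"] = df["diet10"] - df["diet10_pair_mean"]              # within-pair

# Simulated time to CHD death over 40 years of follow-up (higher score = protective)
scale = 60 * np.exp(0.25 * (df["diet10"] - 6))
df["time"] = np.minimum(rng.exponential(scale), 40)
df["chd_death"] = (df["time"] < 40).astype(int)

cph = CoxPHFitter()
cph.fit(df[["time", "chd_death", "diet10_pair_mean", "diet10_within", "pair"]],
        duration_col="time", event_col="chd_death", cluster_col="pair")
# exp(coef) for diet10_within approximates the within-pair (co-twin) HR per 10 units
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```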
Li, H; Yang, G; Xiang, Y-B; Zhang, X; Zheng, W; Gao, Y-T; Shu, X-O
2013-06-01
The objective was to evaluate the association of body size and fat distribution with the risk of colorectal cancer (CRC) in Chinese men and women. This was a population-based, prospective cohort study. The analysis included 134,255 Chinese adults enrolled in the Shanghai Women's Health Study and the Shanghai Men's Health Study, with an average follow-up of 11.0 and 5.5 years, respectively. Waist circumference (WC), body mass index (BMI) and waist-to-hip ratio (WHR) were measured by trained interviewers at baseline. Multivariable Cox models were used to calculate adjusted hazard ratios (HRs) for incident CRC. A total of 935 incident CRC cases were identified. Both general adiposity (measured by BMI) and central adiposity (measured by WHR and WC) were significantly associated with an increased risk of colon cancer in men but not in women. Multivariable-adjusted HRs for colon cancer in men in the highest compared with the lowest quintiles were 2.15 (95% confidence interval (CI): 1.35-3.43; P for trend=0.0006) for BMI, 1.97 (95% CI: 1.19-3.24; P for trend=0.0004) for WHR and 2.00 (95% CI: 1.21-3.29; P for trend=0.0002) for WC. The BMI-associated risk was attenuated in analyses stratified by WHR, whereas the WHR-associated risk remained significant in the high BMI stratum (HR for comparison of extreme tertiles of WHR: 3.38, 95% CI: 1.47-7.75; P for trend=0.0002). None of these anthropometric measures were significantly associated with rectal cancer. Obesity, particularly central obesity, was associated with an increased risk of colon cancer in men.
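The "P for trend" reported for quintiles is conventionally obtained by assigning each participant the median value of his or her quintile and entering that single term in the Cox model; the short sketch below illustrates that convention on simulated data and is not the cohort's actual code or variables.

```python
# Sketch, assuming simulated data: P for trend across BMI quintiles via the
# median-of-quintile term in a Cox proportional hazards model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({"bmi": rng.normal(23.5, 3.0, n), "age": rng.normal(55, 9, n)})
df["time"] = np.minimum(rng.exponential(200 * np.exp(-0.05 * (df["bmi"] - 23.5))), 11)
df["colon_ca"] = (df["time"] < 11).astype(int)

df["bmi_q"] = pd.qcut(df["bmi"], 5, labels=False)                    # quintile index 0-4
df["bmi_q_median"] = df.groupby("bmi_q")["bmi"].transform("median")  # quintile median

cph = CoxPHFitter().fit(df[["time", "colon_ca", "bmi_q_median", "age"]],
                        duration_col="time", event_col="colon_ca")
# The Wald p-value for bmi_q_median serves as the P for trend across quintiles
print(cph.summary.loc["bmi_q_median", ["exp(coef)", "p"]])
```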
Robbins, Anthony S; Pavluck, Alexandre L; Fedewa, Stacey A; Chen, Amy Y; Ward, Elizabeth M
2009-08-01
Previous analyses have found that insurance status is a strong predictor of survival among patients with colorectal cancer aged 18 to 64 years. We investigated whether differences in comorbidity level may account in part for the association between insurance status and survival. We used 2003 to 2005 data from the National Cancer Data Base, a national hospital-based cancer registry, to examine the relationship between baseline characteristics and overall survival at 1 year among 64,304 white and black patients with colorectal cancer. In race-specific analyses, we used Cox proportional hazards models to assess 1-year survival by insurance status, controlling first for age, stage, facility type, and neighborhood education level and income, and then further controlling for comorbidity level. RESULTS: Comorbidity level was lowest among those with private insurance, higher for those who were uninsured or insured by Medicaid, and highest for those insured by Medicare. Survival at 1 year was significantly poorer for patients without private insurance, even after adjusting for important covariates. In these multivariate models, risk of death at 1 year was approximately 50% to 90% higher for white and black patients without private insurance. Further adjustment for number of comorbidities had only a modest impact on the association between insurance status and survival. In multivariate analyses, patients with ≥3 comorbid conditions had approximately 40% to 50% higher risk of death at 1 year. CONCLUSION: Among white and black patients aged 18 to 64 years, differences in comorbidity level do not account for the association between insurance status and survival in patients with colorectal cancer.
Bromfield, Samantha G; Ngameni, Cedric-Anthony; Colantonio, Lisandro D; Bowling, C Barrett; Shimbo, Daichi; Reynolds, Kristi; Safford, Monika M; Banach, Maciej; Toth, Peter P; Muntner, Paul
2017-08-01
Antihypertensive medication use and low systolic blood pressure (BP) and diastolic BP have been associated with an increased risk of falls in some studies. Many older adults have indicators of frailty, which may increase their risk for falls. We contrasted the association of systolic BP, diastolic BP, number of antihypertensive medication classes taken, and indicators of frailty with risk for serious fall injuries among 5236 REGARDS (Reasons for Geographic and Racial Differences in Stroke) study participants ≥65 years taking antihypertensive medication at baseline with Medicare fee-for-service coverage. Systolic BP and diastolic BP were measured, and the antihypertensive medication classes being taken were assessed through a pill bottle review during a study visit. Indicators of frailty included low body mass index, cognitive impairment, depressive symptoms, exhaustion, impaired mobility, and history of falls. Serious fall injuries were defined as fall-related fractures, brain injuries, or joint dislocations using Medicare claims through December 31, 2014. Over a median of 6.4 years, 802 (15.3%) participants had a serious fall injury. The multivariable-adjusted hazard ratios for a serious fall injury among participants with 1, 2, or ≥3 indicators of frailty versus no frailty indicators were 1.18 (95% confidence interval, 0.99-1.40), 1.49 (95% confidence interval, 1.19-1.87), and 2.04 (95% confidence interval, 1.56-2.67), respectively. Systolic BP, diastolic BP, and the number of antihypertensive medication classes being taken at baseline were not associated with risk for serious fall injuries after multivariable adjustment. In conclusion, indicators of frailty, but not BP or number of antihypertensive medication classes, were associated with increased risk for serious fall injuries among older adults taking antihypertensive medication. © 2017 American Heart Association, Inc.
Wan, Ke; Zhao, Jianxun; Huang, Hao; Zhang, Qing; Chen, Xi; Zeng, Zhi; Zhang, Li; Chen, Yucheng
2015-01-01
Aims High triglycerides (TG) and low high-density lipoprotein cholesterol (HDL-C) are cardiovascular risk factors. A positive correlation between an elevated TG/HDL-C ratio and all-cause mortality and cardiovascular events exists in women. However, the utility of the TG to HDL-C ratio for prognostic prediction in patients with acute coronary syndrome (ACS) is unknown. Methods Fasting lipid profiles, detailed demographic data, and clinical data were obtained at baseline from 416 patients with ACS after coronary revascularization. Subjects were stratified into tertiles of TG/HDL-C. We constructed multivariate Cox proportional hazards models for all-cause mortality over a median follow-up of 3 years, using the log TG/HDL-C ratio as a predictor variable together with traditional cardiovascular risk factors. We also constructed a logistic regression model for major adverse cardiovascular events (MACEs) to assess whether the TG/HDL-C ratio is a risk factor. Results The subjects' mean age was 64 ± 11 years; 54.5% were hypertensive, 21.8% diabetic, and 61.0% current or prior smokers. The TG/HDL-C ratio ranged from 0.27 to 14.33. During the follow-up period, there were 43 deaths. In multivariate Cox models, after adjusting for age, smoking, hypertension, diabetes, and severity of angiographic coronary disease, patients in the highest TG/HDL-C tertile had a 5.32-fold increased risk of mortality compared with the lowest tertile. After adjusting for conventional coronary heart disease risk factors in the logistic regression model, the TG/HDL-C ratio was associated with MACEs. Conclusion The TG to HDL-C ratio is a powerful independent predictor of all-cause mortality and a risk factor for cardiovascular events. PMID:25880982
Zuo, Hui; Ueland, Per M; Eussen, Simone J P M; Tell, Grethe S; Vollset, Stein E; Nygård, Ottar; Midttun, Øivind; Meyer, Klaus; Ulvik, Arve
2015-06-15
Dietary intake and/or circulating concentrations of vitamin B6 have been associated with risk of cancer, but results are inconsistent and mechanisms uncertain. Pyridoxal 5'-phosphate (PLP) is the most commonly used marker of B6 status. We recently proposed the ratio 3-hydroxykynurenine/xanthurenic acid (HK/XA) as an indicator of functional vitamin B6 status, and the 4-pyridoxic acid (PA)/(pyridoxal (PL) + PLP) ratio (PAr) as a marker of vitamin B6 catabolism during inflammation. We compared plasma PLP, HK/XA and PAr as predictors of cancer incidence in a prospective community-based cohort in Norway. This study included 6,539 adults without known cancer at baseline (1998-99) from the Hordaland Health Study (HUSK). HR and 95% CI were calculated for the risk of overall and site-specific cancers using multivariate Cox proportional hazards regression with adjustment for potential confounders. After a median follow-up time of 11.9 years, 963 cancer cases (501 men and 462 women) were identified. Multivariate-adjusted Cox regression showed no significant relation of plasma PLP or HK/XA with risk of incident cancer. In contrast, PAr was significantly associated with risk of cancer, with HR (95% CI) = 1.31 (1.12-1.52) per two standard deviation (SD) increment (p < 0.01). Further analysis showed that PAr was a particularly strong predictor of lung cancer, with HR (95% CI) = 2.46 (1.49-4.05) per two SD increment (p < 0.01). The present results indicate that associations of vitamin B6 with cancer may be related to increased catabolism of vitamin B6, in particular for lung cancer, where inflammation may be heavily involved in carcinogenesis. © 2014 The Authors. Published by Wiley Periodicals, Inc. on behalf of UICC.
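Hazard ratios expressed "per two standard deviations", as reported for PAr, are usually obtained by rescaling the predictor before model fitting; the sketch below shows the idea with simulated data and hypothetical variable names (it is not the HUSK analysis).

```python
# Minimal sketch, assuming simulated data: divide the biomarker by 2*SD so that
# exp(coef) is the hazard ratio per two-standard-deviation increment.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 6539
df = pd.DataFrame({"par": rng.lognormal(0.0, 0.4, n), "age": rng.normal(58, 9, n)})
df["par_2sd"] = df["par"] / (2 * df["par"].std())

df["time"] = np.minimum(rng.exponential(150 * np.exp(-0.3 * df["par_2sd"])), 12)
df["cancer"] = (df["time"] < 12).astype(int)

cph = CoxPHFitter().fit(df[["time", "cancer", "par_2sd", "age"]],
                        duration_col="time", event_col="cancer")
print(cph.summary.loc["par_2sd",
                      ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```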
Sharpe, J Danielle; Zhou, Zhi; Escobar-Viera, César G; Morano, Jamie P; Lucero, Robert J; Ibañez, Gladys E; Hart, Mark; Cook, Christa L; Cook, Robert L
2018-01-02
Alcohol consumption at hazardous levels is more prevalent among persons living with the human immunodeficiency virus (HIV; PLWH) and is associated with poor health outcomes in this population. Although PLWH are receptive to using technology to manage health issues, it is unknown whether a cell phone app to self-manage alcohol use would be acceptable among PLWH who drink. The objectives of this study were to determine factors associated with interest in an app to self-manage drinking and to identify differences in baseline mobile technology use among PLWH by drinking level. The study population included 757 PLWH recruited from 2014 to 2016 into the Florida Cohort, an ongoing cohort study investigating the utilization of health services and HIV care outcomes among PLWH. Participants completed a questionnaire examining demographics, substance use, mobile technology use, and other health behaviors. Multivariable logistic regression was used to identify factors significantly associated with interest in an app to self-manage drinking. We also determined whether mobile technology use varied by drinking level. Of the sample, 40% of persons who drink at hazardous levels, 34% of persons who drink at nonhazardous levels, and 19% of persons who do not drink were interested in a self-management app for alcohol use. Multivariable logistic regression analysis indicated that nonhazardous drinking (adjusted odds ratio [AOR] = 1.78; 95% confidence interval [CI]: 1.10-2.88) and hazardous drinking (AOR = 2.58; 95% CI: 1.60-4.16) were associated with interest, controlling for age, gender, education, and drug use. Regarding mobile technology use, most of the sample reported smartphone ownership (56%), text messaging (89%), and use of at least one cell phone app (69%). Regardless of drinking level, overall mobile technology use among PLWH was moderate, whereas PLWH who consumed alcohol expressed greater interest in a cell phone app to self-manage alcohol use. This indicates that many PLWH who drink would be interested in and prepared for a mobile technology-based intervention to reduce alcohol consumption.
Grover, Surbhi; Swisher-McClure, Samuel; Mitra, Nandita
2015-07-01
Purpose: To examine practice patterns and compare survival outcomes between total laryngectomy (TL) and larynx preservation chemoradiation (LP-CRT) in the setting of T4a larynx cancer, using a large national cancer registry. Methods and Materials: Using the National Cancer Database, we identified 969 patients from 2003 to 2006 with T4a squamous cell larynx cancer receiving definitive treatment with either initial TL plus adjuvant therapy or LP-CRT. Univariate and multivariable logistic regression were used to assess predictors of undergoing surgery. Survival outcomes were compared using Kaplan-Meier and propensity score–adjusted and inverse probability of treatment–weighted Cox proportional hazards methods. Sensitivity analyses were performed to account for unmeasured confounders. Results: A total of 616 patients (64%) received LP-CRT, and 353 (36%) received TL. On multivariable logistic regression, patients with advanced nodal disease were less likely to receive TL (N2 vs N0, 26.6% vs 43.4%, odds ratio [OR] 0.52, 95% confidence interval [CI] 0.37-0.73; N3 vs N0, 19.1% vs 43.4%, OR 0.23, 95% CI 0.07-0.77), whereas patients treated in high case-volume facilities were more likely to receive TL (46.1% vs 31.5%, OR 1.78, 95% CI 1.27-2.48). Median survival for TL versus LP-CRT was 61 versus 39 months (P<.001). After controlling for potential confounders, LP-CRT had inferior overall survival compared with TL (hazard ratio 1.31, 95% CI 1.10-1.57), and with the inverse probability of treatment–weighted model (hazard ratio 1.25, 95% CI 1.05-1.49). This survival difference was shown to be robust on additional sensitivity analyses. Conclusions: Most patients with T4a larynx cancer receive LP-CRT, despite guidelines suggesting TL as the preferred initial approach. Patients receiving LP-CRT had more advanced nodal disease and worse overall survival. Previous studies of (non-T4a) locally advanced larynx cancer showing no difference in survival between LP-CRT and TL may not apply to T4a disease, and patients should be counseled accordingly.
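The inverse probability of treatment–weighted comparison described above can be sketched as a logistic propensity model followed by a weighted Cox regression; the Python example below (statsmodels plus lifelines, simulated data, hypothetical covariates) illustrates the general recipe, including stabilized weights and robust standard errors, rather than the registry analysis itself.

```python
# Hedged sketch of IPTW: logistic propensity score for receiving TL, stabilized weights,
# and a weighted Cox model for overall survival. Simulated stand-in data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 969
df = pd.DataFrame({"age": rng.normal(62, 10, n),
                   "n_stage": rng.integers(0, 4, n),   # 0=N0 ... 3=N3 (hypothetical coding)
                   "tl": rng.integers(0, 2, n)})       # 1 = total laryngectomy
df["time"] = np.minimum(rng.exponential(50 * np.exp(0.3 * (1 - df["tl"]))), 60)
df["death"] = (df["time"] < 60).astype(int)

# Propensity of receiving TL given measured confounders, then stabilized weights
ps = smf.logit("tl ~ age + C(n_stage)", data=df).fit(disp=False).predict(df)
p_treated = df["tl"].mean()
df["iptw"] = np.where(df["tl"] == 1, p_treated / ps, (1 - p_treated) / (1 - ps))

cph = CoxPHFitter()
cph.fit(df[["time", "death", "tl", "iptw"]], duration_col="time", event_col="death",
        weights_col="iptw", robust=True)   # robust SEs are advisable with weights
print(cph.summary.loc["tl", ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```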
Comparative Robustness of Recent Methods for Analyzing Multivariate Repeated Measures Designs
Seco, Guillermo Vallejo; Gras, Jaime Arnau; Garcia, Manuel Ato
2007-01-01
This study evaluated the robustness of two recent methods for analyzing multivariate repeated measures when the assumptions of covariance homogeneity and multivariate normality are violated. Specifically, the authors' work compares the performance of the modified Brown-Forsythe (MBF) procedure and the mixed-model procedure adjusted by the…
Turner, Caitlin M; Coffin, Phillip; Santos, Deirdre; Huffaker, Shannon; Matheson, Tim; Euren, Jason; DeMartini, Anna; Rowe, Chris; Batki, Steven; Santos, Glenn-Milo
2017-04-01
Ecological momentary assessments (EMA) are data collection approaches that characterize behaviors in real time. However, EMA is underutilized in alcohol and substance use research among men who have sex with men (MSM). The aim of this analysis is to explore the correlates of engagement in EMA text messages among substance-using MSM in San Francisco. The present analysis uses data collected from the Project iN pilot study (n=30). Over a two-month period, participants received and responded to daily EMA text messages inquiring about their study medication, alcohol, and methamphetamine use. Baseline characteristics including demographics, alcohol use, and substance use were examined as potential correlates of engagement in EMA text messages in logistic regression and proportional hazards models. Participants had a 74% response rate to EMA text messages over the study period. MSM of color had significantly lower adjusted odds of responding to EMA texts 80% of the time or more, compared to white MSM (adjusted odds ratio=0.05, 95% CI=0.01-0.38). College-educated MSM had a lower adjusted hazard of week-long discontinuation in EMA texts (adjusted hazard ratio=0.12, 95% CI=0.02-0.63). Older MSM had a higher adjusted hazard of week-long discontinuation in EMA texts (adjusted hazard ratio=1.15, 95% CI=1.01-1.31). Differences in engagement with EMA text prompts were discovered for MSM with different racial/ethnic backgrounds, ages, and education levels. Substance use variables were not correlated with engagement in text messages, suggesting that EMA may be a useful research tool among actively substance-using MSM in San Francisco. Published by Elsevier Inc.
Nochioka, Kotaro; Biering-Sørensen, Tor; Hansen, Kim Wadt; Sørensen, Rikke; Pedersen, Sune; Jørgensen, Peter Godsk; Iversen, Allan; Shimokawa, Hiroaki; Jeger, Raban; Kaiser, Christoph; Pfisterer, Matthias; Galatius, Søren
2017-12-01
Rheumatologic disorders are characterised by inflammation and an increased risk of coronary artery disease (CAD). However, the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing percutaneous coronary intervention (PCI) is unknown. Thus, we aimed to examine the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing PCI. A post-hoc analysis was performed in 4605 patients (age: 63.3 ± 11.0 years; male: 76.6%) with ST-segment elevation myocardial infarction (STEMI; n = 1396), non-STEMI (n = 1541), and stable CAD (n = 1668) from the all-comer stent trials, the BAsel Stent Kosten-Effektivitäts Trial-PROspective Validation Examination (BASKET-PROVE) I and II trials. We evaluated the association between rheumatologic disorders and 2-year major adverse cardiac events (MACEs: cardiac death, nonfatal myocardial infarction (MI), and target vessel revascularisation (TVR)) by Cox regression analysis. Patients with rheumatologic disorders (n = 197) were older, more often female, had a higher prevalence of renal disease, multi-vessel coronary disease, and bifurcation lesions, and had longer total stent lengths. During the 2-year follow-up, the MACE rate was 8.6% in the total cohort. After adjustment for potential confounders, rheumatologic disorders were associated with MACEs in the total cohort (adjusted hazard ratio: 1.55; 95% confidence interval (CI): 1.04-2.31), driven by the STEMI subgroup (adjusted hazard ratio: 2.38; 95% CI: 1.26-4.51). In all patients, rheumatologic disorders were associated with all-cause death (adjusted hazard ratio: 2.05; 95% CI: 1.14-3.70), cardiac death (adjusted hazard ratio: 2.63; 95% CI: 1.27-5.43), and non-fatal MI (adjusted hazard ratio: 2.64; 95% CI: 1.36-5.13), but not with TVR (adjusted hazard ratio: 0.81; 95% CI: 0.41-1.58). The presence of rheumatologic disorders appears to be independently associated with worse outcomes in CAD patients undergoing PCI. This calls for further studies and focus on this high-risk group of patients following PCI.
FABP4 and Cardiovascular Events in Peripheral Arterial Disease.
Höbaus, Clemens; Herz, Carsten Thilo; Pesau, Gerfried; Wrba, Thomas; Koppensteiner, Renate; Schernthaner, Gerit-Holger
2018-05-01
Fatty acid-binding protein 4 (FABP4) is a possible biomarker of atherosclerosis. We evaluated FABP4 levels, for the first time, in patients with peripheral artery disease (PAD) and the possible association between baseline FABP4 levels and cardiovascular events over time. Patients (n = 327; mean age 69 ± 10 years) with stable PAD were enrolled in this study. Serum FABP4 was measured by bead-based multiplex assay. Cardiovascular events were analyzed by FABP4 tertiles using Kaplan-Meier and Cox regression analyses after 5 years. Serum FABP4 levels showed a significant association with the classical 3-point major adverse cardiovascular event (MACE) end point (including death, nonlethal myocardial infarction, or nonfatal stroke) in patients with PAD (P = .038). A standard deviation increase of FABP4 resulted in a hazard ratio (HR) of 1.33 (95% confidence interval [95% CI]: 1.03-1.71) for MACE. This association increased (HR: 1.47, 95% CI: 1.03-1.71) after multivariable adjustment (P = .020). Additionally, in multivariable linear regression analysis, FABP4 was linked to estimated glomerular filtration rate (P < .001), gender (P = .005), fasting triglycerides (P = .048), and body mass index (P < .001). Circulating FABP4 may be a useful additional biomarker to evaluate patients with stable PAD at risk of major cardiovascular complications.
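As a generic illustration of the Kaplan-Meier-by-tertile analysis mentioned above, the sketch below groups a simulated biomarker into tertiles, fits survival curves, and runs a log-rank test with lifelines; the values are placeholders, not the FABP4 data.

```python
# Sketch, assuming simulated data: Kaplan-Meier estimates by biomarker tertile and a
# log-rank test across the three groups.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(7)
n = 327
df = pd.DataFrame({"fabp4": rng.lognormal(3.0, 0.5, n)})
df["tertile"] = pd.qcut(df["fabp4"], 3, labels=["T1", "T2", "T3"])
df["time"] = np.minimum(rng.exponential(10 * np.exp(-0.5 * (df["tertile"] == "T3"))), 5)
df["mace"] = (df["time"] < 5).astype(int)

kmf = KaplanMeierFitter()
for label, grp in df.groupby("tertile", observed=True):
    kmf.fit(grp["time"], grp["mace"], label=str(label))
    print(label, "event-free survival at 5 years:",
          round(float(kmf.survival_function_.iloc[-1, 0]), 3))

test = multivariate_logrank_test(df["time"], df["tertile"], df["mace"])
print("Log-rank p-value across tertiles:", round(test.p_value, 4))
```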
Corella, Dolores; Ramírez-Sabio, Judith B; Coltell, Oscar; Ortega-Azorín, Carolina; Estruch, Ramón; Martínez-González, Miguel A; Salas-Salvadó, Jordi; Sorlí, José V; Castañer, Olga; Arós, Fernando; Garcia-Corte, Franscisco J; Serra-Majem, Lluís; Gómez-Gracia, Enrique; Fiol, Miquel; Pintó, Xavier; Saez, Guillermo T; Toledo, Estefanía; Basora, Josep; Fitó, Montserrat; Cofán, Montserrat; Ros, Emilio; Ordovas, Jose M
2018-04-01
Oxidatively induced DNA damage, an important factor in cancer etiology, is repaired by oxyguanine glycosylase 1 (OGG1). The lower repair capacity genotype (homozygote Cys326Cys) in the OGG1-rs1052133 (Ser326Cys) polymorphism has been associated with cancer risk. However, no information is available in relation to cancer mortality, other causes of death, and modulation by diet. Our aim was to evaluate the association of the OGG1-rs1052133 with total, cancer, and cardiovascular disease (CVD) mortality and to analyze its modulation by the Mediterranean diet, focusing especially on total vegetable intake as one of the main characteristics of this diet. This was a secondary analysis of the PREDIMED (Prevención con Dieta Mediterránea) trial, a randomized, controlled trial conducted in Spain from 2003 to 2010. Study participants (n=7,170) were at high risk for CVD and were aged 55 to 80 years. Participants were randomly allocated to two groups with a Mediterranean diet intervention or a control diet. Vegetable intake was measured at baseline. Main outcomes were all-cause, cancer, and CVD mortality after a median follow-up of 4.8 years. Multivariable-adjusted Cox regression models were fitted. Three hundred eighteen deaths were detected (cancer, n=127; CVD, n=81; and other, n=110). Cys326Cys individuals (prevalence 4.2%) presented higher total mortality rates than Ser326-carriers (P=0.009). The multivariable-adjusted hazard ratio for Cys326Cys vs Ser326-carriers was 1.69 (95% CI 1.09 to 2.62; P=0.018). This association was greater for CVD mortality (P=0.001). No relationship was detected for cancer mortality in the whole population (hazard ratio 1.07; 95% CI 0.47 to 2.45; P=0.867), but a significant age interaction (P=0.048) was observed, as Cys326Cys was associated with cancer mortality in participants <66.5 years (P=0.029). Recessive effects limited our ability to investigate Cys326Cys×diet interactions for cancer mortality. No statistically significant interactions with the Mediterranean diet intervention were found for total or CVD mortality. However, a significant protective interaction with vegetable intake was found for CVD mortality (hazard ratio for interaction per standard deviation 0.42; 95% CI 0.18 to 0.98; P=0.046). In this population, the Cys326Cys-OGG1 genotype was associated with all-cause mortality, mainly CVD rather than cancer mortality. Additional studies are needed to provide further evidence on its dietary modulation. Copyright © 2018 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
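The vegetable-intake result above corresponds to a genotype-by-diet product term in a Cox model, with the interaction hazard ratio expressed per standard deviation of intake; the sketch below shows that construction on simulated data with hypothetical column names and does not reproduce the PREDIMED analysis.

```python
# Illustrative sketch, simulated data: Cys326Cys indicator, vegetable intake per SD,
# and their product term; exp(coef) of the product term is the interaction HR per SD.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 7170
df = pd.DataFrame({"cys_cys": rng.binomial(1, 0.042, n),
                   "veg_sd": rng.normal(0, 1, n),        # vegetable intake, per SD
                   "age": rng.normal(67, 6, n)})
df["cys_x_veg"] = df["cys_cys"] * df["veg_sd"]

lin_pred = 0.5 * df["cys_cys"] - 0.8 * df["cys_x_veg"]
df["time"] = np.minimum(rng.exponential(100 * np.exp(-lin_pred)), 4.8)
df["cvd_death"] = (df["time"] < 4.8).astype(int)

cph = CoxPHFitter().fit(
    df[["time", "cvd_death", "cys_cys", "veg_sd", "cys_x_veg", "age"]],
    duration_col="time", event_col="cvd_death")
print(cph.summary.loc["cys_x_veg",
                      ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```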
Winkfield, Karen M; Chen, Ming-Hui; Dosoretz, Daniel E; Salenius, Sharon A; Katin, Michael; Ross, Rudi; D'Amico, Anthony V
2011-11-15
We investigated whether race was associated with risk of death following brachytherapy-based treatment for localized prostate cancer, adjusting for age, cardiovascular comorbidity, treatment, and established prostate cancer prognostic factors. The study cohort was composed of 5,360 men with clinical stage T1-3N0M0 prostate cancer who underwent brachytherapy-based treatment at 20 centers within the 21st Century Oncology consortium. Cox regression multivariable analysis was used to evaluate the risk of death in African-American and Hispanic men compared to that in Caucasian men, adjusting for age, pretreatment prostate-specific antigen (PSA) level, Gleason score, clinical T stage, year and type of treatment, median income, and cardiovascular comorbidities. After a median follow-up of 3 years, there were 673 deaths. African-American and Hispanic races were significantly associated with an increased risk of all-cause mortality (ACM) (adjusted hazard ratio, 1.77 and 1.79; 95% confidence intervals, 1.3-2.5 and 1.2-2.7; p < 0.001 and p = 0.005, respectively). Other factors significantly associated with an increased risk of death included age (p < 0.001), Gleason score of 8 to 10 (p = 0.04), year of brachytherapy (p < 0.001), and history of myocardial infarction treated with stent or coronary artery bypass graft (p < 0.001). After adjustment for prostate cancer prognostic factors, age, income level, and revascularized cardiovascular comorbidities, African-American and Hispanic races were associated with higher ACM in men with prostate cancer. Additional causative factors need to be identified. Copyright © 2011 Elsevier Inc. All rights reserved.
Shue, Bing; Damle, Rachelle N; Flahive, Julie; Kalish, Jeffrey A; Stone, David H; Patel, Virendra I; Schanzer, Andres; Baril, Donald T
2015-08-01
Angiography remains the gold standard imaging modality before infrainguinal bypass. Computed tomography angiography (CTA) and magnetic resonance angiography (MRA) have emerged as noninvasive alternatives for preoperative imaging. We sought to examine contemporary trends in the utilization of CTA and MRA as isolated imaging modalities before infrainguinal bypass and to compare outcomes following infrainguinal bypass in patients who underwent CTA or MRA versus those who underwent conventional arteriography. Patients undergoing infrainguinal bypass within the Vascular Study Group of New England were identified (2003-2012). Patients were stratified by preoperative imaging modality: CTA/MRA alone or conventional angiography. Trends in utilization of these modalities were examined and demographics of these groups were compared. Primary end points included primary patency, secondary patency, and major adverse limb events (MALE) at 1 year as determined by Kaplan-Meier analysis. Multivariable Cox proportional hazards models were constructed to evaluate the effect of imaging modality on primary patency, secondary patency, and MALE after adjusting for confounders. In 3123 infrainguinal bypasses, CTA/MRA alone was used in 462 cases (15%) and angiography was used in 2661 cases (85%). Use of CTA/MRA alone increased over time, with 52 (11%) bypasses performed between 2003 and 2005, 189 (41%) bypasses performed between 2006 and 2009, and 221 (48%) bypasses performed between 2010 and 2012 (P < 0.001). Patients with CTA/MRA alone, compared with patients with angiography, more frequently underwent bypass for claudication (33% vs. 26%, P = 0.001) or acute limb ischemia (13% vs. 5%, P < 0.0001), more frequently had prosthetic conduits (39% vs. 30%, P = 0.001), and less frequently had tibial/pedal targets (32% vs. 40%, P = 0.002). After adjusting for these and other confounders, multivariable analysis demonstrated that the use of CTA/MRA alone was not associated with a significant difference in 1 year primary patency (hazard ratio [HR] 0.95, 95% confidence interval [CI] 0.78-1.16), secondary patency (HR 1.30, 95% CI 0.99-1.72), or MALE (HR 1.08, 95% CI 0.89-1.32). CTA and MRA are being increasingly used as the sole preoperative imaging modality before infrainguinal bypass. This shift in practice patterns appears to have no measurable effect on outcomes at 1 year. Copyright © 2015 Elsevier Inc. All rights reserved.
Hara, Azusa; Yang, Wen-Yi; Petit, Thibault; Zhang, Zhen-Yu; Gu, Yu-Mei; Wei, Fang-Fei; Jacobs, Lotte; Odili, Augustine N; Thijs, Lutgarde; Nawrot, Tim S; Staessen, Jan A
2016-02-01
Whether environmental exposure to nephrotoxic agents that potentially interfere with calcium homeostasis, such as lead and cadmium, contributes to the incidence of nephrolithiasis needs further clarification. We investigated the relation between nephrolithiasis incidence and environmental lead and cadmium exposure in a general population. In 1302 participants randomly recruited from a Flemish population (50.9% women; mean age, 47.9 years), we obtained baseline measurements (1985-2005) of blood lead (BPb), blood cadmium (BCd), 24-h urinary cadmium (UCd) and covariables. We monitored the incidence of kidney stones until October 6, 2014. We used Cox regression to calculate multivariable-adjusted hazard ratios for nephrolithiasis. At baseline, geometric mean BPb, BCd and UCd were 0.29 µmol/L, 9.0 nmol/L, and 8.5 nmol per 24 h, respectively. Over 11.5 years (median), nephrolithiasis occurred in 40 people. Contrasting the lowest and highest tertiles of the distributions, the sex- and age-standardized rates of nephrolithiasis, expressed as events per 1000 person-years, were 0.68 vs. 3.36 (p=0.0016) for BPb, 1.80 vs. 3.28 (p=0.11) for BCd, and 1.65 vs. 2.95 (p=0.28) for UCd. In continuous analysis, with adjustments applied for sex, age, serum magnesium, and 24-h urinary volume and calcium, the hazard ratios expressing the risk associated with a doubling of the exposure biomarkers were 1.35 (p=0.015) for BPb, 1.13 (p=0.22) for BCd, and 1.23 (p=0.070) for UCd. In conclusion, our results suggest that environmental lead exposure is a risk factor for nephrolithiasis in the general population. Copyright © 2015 Elsevier Inc. All rights reserved.
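Hazard ratios "per doubling" of an exposure, as reported here for the biomarkers, are conventionally obtained by log2-transforming the exposure before fitting, so that exp(coef) corresponds to a twofold higher concentration; the sketch below illustrates this with simulated values and is not the Flemish cohort analysis.

```python
# Minimal sketch, assuming simulated data: log2-transform blood lead so the Cox model's
# exp(coef) is the hazard ratio per doubling of the exposure.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 1302
df = pd.DataFrame({"bpb": rng.lognormal(np.log(0.29), 0.6, n),   # blood lead, umol/L
                   "age": rng.normal(48, 15, n)})
df["log2_bpb"] = np.log2(df["bpb"])

df["time"] = np.minimum(rng.exponential(100 * np.exp(-0.3 * df["log2_bpb"])), 11.5)
df["stone"] = (df["time"] < 11.5).astype(int)

cph = CoxPHFitter().fit(df[["time", "stone", "log2_bpb", "age"]],
                        duration_col="time", event_col="stone")
# exp(coef) for log2_bpb is the hazard ratio per doubling of blood lead
print(cph.summary.loc["log2_bpb",
                      ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```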